Nick 11
Hello

User-agent: *
Crawl-Delay: 20

Rankings have dropped. I checked /robots.txt and the above is what I get. I've also attached a screenshot of the error message from Google Search Console.

Any advice appreciated. I have been on to my web host's chat support, who say to contact a techie.

Thanks

1. Unless you have a burning need to control crawl speed, remove that section. The only time I have ever used it was back in the days when Yahoo Search was still a thing. Today, most searchbots don't pay attention to that directive anyway, including Google. Let the important searchbots set their own crawl rate and block the others (although blocking in robots.txt is also ignored by many bots). See the sketch after point 2 for what a stripped-down file could look like.

2. Is that your complete robots.txt file? That's not what the error is complaining about. What are your site settings for sitemaps, and how was the sitemap submitted to Google?
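For illustration only, here is a minimal robots.txt sketch that drops the Crawl-Delay directive and allows all bots; the sitemap URL is a placeholder and would need to match whatever your site or plugin actually generates:

User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap.xml

An empty Disallow line allows everything. If you genuinely need to block specific bots, add a separate User-agent block for each of them rather than relying on a blanket Crawl-Delay.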