User-agent: * Crawl-Delay: 20 Robots.TXT File

Nick 11

Member
Joined: Mar 29, 2020
Messages: 8
Hello

User-agent: *
Crawl-Delay: 20

Rankings have dropped. I checked /robots.txt and the above is all I get. I've also attached a screenshot of the error message from Google Search Console.

Any advice appreciated. I have been onto my web host's chat support, who say to contact a techie.

Thanks

Attachment: Capture.JPG

djbaxter

Administrator
Joined: Jun 28, 2012
Messages: 3,225
1. Unless you have a burning need to control crawl speed, remove that section. The only time I have ever used it was back in the days when Yahoo Search was still a thing. Today, most searchbots don't pay attention to that directive anyway, including Google. Let the important searchbots set their own crawl rate and block the others (although blocking in robots.txt is also ignored by many bots). There's a rough example below.

2. Is that your complete robots.txt file? That's not what the error is complaining about. What are your site settings for sitemaps, and how was the sitemap submitted to Google?
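For illustration only, a trimmed robots.txt along the lines of point 1 might look something like this. The sitemap URL and the blocked bot name are placeholders, not taken from your site:

User-agent: *
Disallow:

User-agent: SomeBadBot
Disallow: /

Sitemap: https://www.example.com/sitemap.xml

The empty Disallow under User-agent: * allows everything for well-behaved crawlers, the second block disallows one specific bot entirely, and the Sitemap line points crawlers at your sitemap (you still submit the sitemap separately in Google Search Console).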
