Does Using Fetch As Google and Requesting Indexing Have Any SEO Benefits?


First off, I just want to say this is an awesome forum! I recently joined as I am working as an SEO Specialist at a local web design firm.

This is my first SEO role where I work primarily with local businesses. Even though I am new to local SEO, I have worked for several companies before this and have also had my own business.

When I started working here four months ago, I noticed the SEO Specialists here all use the Fetch as Google and Request Indexing functions in the old version of Google Search Console. I've never heard of anybody doing this to optimize a website, other than maybe when a new site is launched or when there are new pages that need to be indexed in the search engines.

We spend about two hours per month optimizing each client's website. Part of this optimization includes submitting all of the main URLs of the client's site to Fetch as Google in Google Search Console and requesting indexing for those URLs. This is typically done every month for most clients, even if there have not been any major changes.

To me, this seems like a big waste of time. Even with new pages, I believe Google is perfectly capable of finding them on its own these days.

The owner seems to think using Fetch As Google for the URLs of a website has some benefits. What do you think?
 

Yan Gilbert

Administrator
I ask for a crawl if a page is new, or after I make any change to a page.

Smaller sites may not get crawled as often, so if I update a page, even if the change is minor, I will submit it through Search Console. That way I have a better idea of whether my change made a difference, because the timing of the recrawl will be closer to the change.

Asking for a recrawl of a page that had no change whatsoever seems pointless.

Btw, you can submit a link through the new Search Console dashboard as well, although it won't store a dated list like the old version does.

 

brettmandoes

Forum Member
That tool is primarily for diagnostic purposes to see if there's anything technically wrong with your page, not for indexation.

Add timestamps to your sitemaps. Whenever a change is made, the date of the change is stamped next to the URL in the sitemap (the lastmod field). We've found that has an impact on new changes getting picked up, and it's completely automated.
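
To make that concrete, here's a minimal sketch (my own illustration, not something from the post) of writing a sitemap where each URL carries a <lastmod> date. The URLs and dates are placeholders; in practice the dates would come from wherever your CMS records page changes, so the whole thing stays automated.

```python
# Minimal sketch: generate a sitemap with a <lastmod> date per URL.
# The pages dict below is placeholder data for illustration only.
from datetime import date
from xml.etree import ElementTree as ET

# page URL -> date the page was last modified (in practice, pulled from the CMS)
pages = {
    "https://example.com/": date(2019, 1, 7),
    "https://example.com/services/": date(2018, 12, 20),
    "https://example.com/contact/": date(2018, 11, 2),
}

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
urlset = ET.Element("urlset", xmlns=NS)
for loc, modified in pages.items():
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc
    ET.SubElement(url, "lastmod").text = modified.isoformat()  # W3C date format

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```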

When we tested indexation improvements by submitting via Fetch (about 2.5 years ago), there was no difference in the rate at which pages were reindexed compared to the control group, for which we did nothing and let Google reindex at its own pace.
 

Cherie Dickey

Local Search Expert
I agree that it's a waste of time to submit to index when there is nothing new to index.

The fetch and render is for diagnostics, I agree. It's helpful to see how Google sees a page. But I do also see value in submitting new content/content updates to index outside of that.

I'm interested in hearing more about your tests though... were the sites you tested larger, or linked to from other larger sites, so they might have a chance of being crawled more often? Or were they more similar to a small local business site that might not have those benefits?
 

brettmandoes

Forum Member
We had a network of 150 local sites (all legitimate sites) that suffered a crawl error due to a botched CDN implementation, and Google began deindexing the sites. We implemented a fix and split the sites into two cohorts: for half we submitted their URLs for reindexation, and the other half were the control, for which we didn't submit anything. Google reindexed them at the same rate. Site size was small, about 200-300 pages each, all local home service companies.
 

JoshuaMackens

Local Search Expert
I think you're asking if submitting a fetch increases your rank on Google. It does not, based on what we know.
 
I've noticed that when using the old GSC "Fetch as Google" tool (i.e. in Google Search Console, use "Mobile" and make sure you click the button for "Request indexing"), you can very quickly (within minutes) get Google to start showing the updated title tags/meta descriptions. This is my go-to tool when I optimize title tags/meta descriptions on a site.
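
Along those lines, a quick sanity check before hitting "Request indexing" is to confirm the updated title tag and meta description are actually live on the page. This is just a rough standard-library sketch of my own (not a tool mentioned in the thread), and the URL at the bottom is a placeholder.

```python
# Minimal sketch: print the live <title> and meta description of a page
# so you can confirm your edits are published before requesting indexing.
from html.parser import HTMLParser
from urllib.request import urlopen


class HeadTagParser(HTMLParser):
    """Collects the <title> text and the meta description of a page."""

    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.description = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
        elif tag == "meta":
            attrs = dict(attrs)
            if (attrs.get("name") or "").lower() == "description":
                self.description = attrs.get("content") or ""

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data


def check_head(url):
    parser = HeadTagParser()
    with urlopen(url) as response:
        parser.feed(response.read().decode("utf-8", errors="replace"))
    print(f"{url}\n  title: {parser.title.strip()}\n  description: {parser.description}")


if __name__ == "__main__":
    # Placeholder URL; swap in the page you just optimized.
    check_head("https://example.com/services/")
```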

 
