Can We Trust the Major Aggregators? What's the Best Way to Propagate NAP Data to IYPs?


russofford

Forum Member
Joined
Jul 25, 2012
Messages
130
Likes
25
Hi Folks,

In your experience, what is the best way to have 'corrected' NAP (Name/Address/Phone Number) info propagated to the various IYP (Internet Yellow Page) websites?

I am already familiar with Localeze, InfoGroup, Acxiom and UBL (which supposedly feeds InfoGroup and Acxiom directly) as well as Yext (which seems to be the only way to update certain IYP sites.) I am also familiar with claiming profiles and verifying them via mail or phone.

What I am wondering is a preferred process or order in which to best go about the NAP propagation process which would 1. minimize duplicate listings and 2. correct the NAP info on partner sites the fastest.

My purpose for asking this question is mostly to avoid making a client's NAP listings 'worse' by relying solely on the major data aggregators to propagate the NAP data.

For example, if incorrect NAP listings already exist, can we blindly trust that the updated NAP (with the data aggregators) will properly propagate... or will it just cause more duplicate listings? The process is 'supposed' to propagate and 'correct/update' listings on their partner sites... but I fear that instead of correcting the info, it often causes a 'new' listing along side the old incorrect listing.

I have so many clients with 2 or 3 or even 6+ duplicate listings on a single IYP site that I would like to minimize any possible damage from trying to 'fix' a NAP in the first place.

Thanks for your suggestions.

Russ
 

Linda Buquet

Moderator
Moderator
Joined
Jun 28, 2012
Messages
14,433
Likes
4,282
Hi Russ,

I don't deal with citation clean up issues much, but here's another thread that just started that may have some good tips.

Tackling Local Citations - Where To Start?

I'll Tweet this thread and see if we can get a discussion going.

Then hopefully some other pros will weigh in and give you some answers too.
 

Colan Nielsen

Administrator
Administrator
Joined
Jul 19, 2012
Messages
2,887
Likes
870
Hi Russ,

One thing I can say for sure is that you can't rely exclusively on cleaning up the data providers to get NAP issues cleared up. We have done a couple tests with Localeze and are currently testing out ACXIOM. We have pretty much concluded that cleaning up the data providers will not solve any duplicate issues on partner sites. We have also noticed that fixing the data providers does not guarantee that the partner sites will update accordingly.

Awesome post! This is a subject that I think deserves a lot more attention.
 
Joined
Jul 19, 2012
Messages
113
Likes
69
Russ,

Colan is perfectly correct here in saying that relying on the data aggregators will not help much (or not at all). The main reason is the process through which these sites (Localeze, ExpressUpdate, Acxiom) syndicate data across their "partner network". What happens normally is:

1) You submit your listing to the aggregator
2) The aggregator provides the submitted data to the "partners"
3) Partners take the data
4) Partners offer the data for public usage on their properties in whatever way they want.

Same process happens when you update a listing on some aggregator:

1) You submit your listing's updates to the aggregator
2) The aggregator provides the submitted updated data to the "partners"
3) Partners take the data
4) Partners offer the data for public usage on their properties in whatever way they want.

It all depends on the technology that the "partners" possess. Obviously, Google possesses very high quality technology but is still unable to combat duplicates 100%. Most of the other sites don't have that type of technology (technology that would merge duplicate records, or would update existing data when it recognizes newly arriving data as a more up-to-date record for the same entity). That is why fixing the aggregators' listings does practically nothing but create duplicates (in about 95+% of cases).

I've explained why Yext is also not a good fit for working your way through inconsistent data + duplicates in this article:
Why Yext Might Not Be the Best Fit for Your Business | Local Search Marketing Blog by NGS
 

russofford

Forum Member
Joined
Jul 25, 2012
Messages
130
Likes
25
Thanks Colan & Nyagoslav.


In my experience using the 3 major data aggregators to 'correct' data, I have seen 'some' proper propagation. I was pleasantly surprised to see some of the NAP data corrected, but only on a couple of YellowBook websites.


Some corrections, however, caused a 'broken' link to the originating profile on a specific IYP site. For example, if a CitySearch listing was deleted, its 'partner' sites might not know that it no longer exists and still link to it as its source.


This seems to be the case for sites such as MerchantCircle (which may have a link that says 'Get more information about this merchant', which points to CitySearch) and MojoPages, where they show 'Provided by: CitySearch' at the bottom of the listing. It seems that MojoPages syndicates its content (along with its brand/look) to many cheap / small-time local directories too.

Nyagoslav, you said
"... fixing the aggregators' listings is doing practically nothing but creating duplicates (in about 95+% of the cases)."
Are you saying that 95+% of the time there is bound to be a duplicate listing created 'somewhere' among the plethora of IYP sites... or that 95+% of IYP sites will create duplicate listings when correcting the aggregator data? hehe ;)

If, over 95% of the time, fixing data with a data aggregator causes yet more new / duplicate listings to be created on any given IYP site, would you suggest a different or more optimal way to go about 'correcting' bad NAP data for the best propagation scenario? ... Or is it just a matter of KNOWING that this corrective action WILL create duplicates... and being prepared to do the work of duplicate cleanup?

I am starting to think that the 3 major aggregators are best for propagating 'new' business listings where none previously existed.

However, I have seen cases where the data among the aggregators is conflicting (a completely different address on Acxiom vs. Localeze/InfoGroup)... and this would obviously NEED to be corrected at the source. For example, I saw the edit history of a Google Map Maker POI showing this company's address switch back and forth many times over the last few years. It is just too bad that doing so would likely create duplicate listings somewhere downstream.

Sincerely,

Russ Offord
 

russofford

Forum Member
Joined
Jul 25, 2012
Messages
130
Likes
25
So, several months and many UBL.org listing creations later... I have found that it is HIGHLY LIKELY that creating a profile on UBL will create duplicate listings, especially for an established business with a healthy number of existing IYP listings.

In many cases now, when I have checked the listing status in the UBL dashboard, about 1/2 of the listings (sometimes more, sometimes less) are newly created duplicates. I can tell because I've done a before/after inventory and you can also tell because of the particular wording used in the original UBL listing.
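The before/after inventory approach Russ describes can be sketched in a few lines. This is a minimal illustration only; the site names, URLs, and the `new_listings` helper are all hypothetical, not part of any UBL tooling:

```python
# Hypothetical before/after listing inventory diff: record the listing URLs
# you can find for a client on each IYP site before submitting to an
# aggregator, re-inventory a few weeks later, and flag anything new.

before = {
    "citysquares.com": ["http://citysquares.com/b/acme-plumbing-123"],
    "judysbook.com": ["http://judysbook.com/biz/acme-plumbing-456"],
}
after = {
    "citysquares.com": [
        "http://citysquares.com/b/acme-plumbing-123",
        "http://citysquares.com/b/acme-plumbing-789",  # newly created duplicate
    ],
    "judysbook.com": ["http://judysbook.com/biz/acme-plumbing-456"],
}

def new_listings(before, after):
    """Return URLs present after submission but not before (likely duplicates)."""
    found = {}
    for site, urls in after.items():
        extra = sorted(set(urls) - set(before.get(site, [])))
        if extra:
            found[site] = extra
    return found

print(new_listings(before, after))
# → {'citysquares.com': ['http://citysquares.com/b/acme-plumbing-789']}
```

Anything this diff flags is a candidate for the duplicate-removal legwork discussed later in the thread.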

I have also found that categories get messed up, and sometimes phone numbers and/or website URLs are not propagated into the new listings.

When you add multiple categories, services, products, or keywords, they are often imported with the | pipe separator and either look bad or mess up how the category systems are used on a particular IYP site. (Oftentimes the 'best' category is not chosen/propagated.)

For example in one case:

  • CitySquares - duplicate - did not import the phone number for several clients
  • JudysBook - duplicate - no website listed, wrong category
  • ShowMeLocal - duplicate - shown as 'verified' (by UBL, much harder to delete now), categories are not well formed
  • ChamberOfCommerce.com - duplicate
  • LocalDatabase - duplicate


In my experience, out of 11 unique listings submitted, Yelp and Yahoo Local have been the only 'publishers' that return an 'existing link' result, so UBL doesn't create a duplicate for them when that happens.

The UBL.org service has only ever directly reported to have created listings on the following sites (this list is combined results from all of my listings):

AboutUs.org
ChamberOfCommerce.com
CitySquares.com
Company.com
FourSquare.com
JudysBook.com
LocalDatabase.com
ShowMeLocal.com
Tyloon.com
USCity.net
Yahoo Local
Yelp.com


So, if you still dare to create a new NAP listing on UBL.org after reading this... you will want to double check your listing inventory for duplicates on each of those websites (at a minimum) before and a few weeks after submitting the listing. Then be prepared for several hours of work contacting the websites/webmasters for the duplicate removal procedure!

For a newer company without many pre-existing NAP citations, UBL.org is a great way to get a good base of new listings created quickly. I have also used UBL in order to avoid needing to send Acxiom official paperwork to prove a new company's legitimate existence. (As apparently UBL feeds that aggregator anyway?!?)

Good luck! ;)

I hope to hear back from others who have used UBL.org... what has your experience been with listing propagation from the major aggregators/publishers/pushers?

Russ
 
Joined
Feb 26, 2013
Messages
23
Likes
12
This thread, in essence, is "local SEO". Knowing exactly how to do this yields a large portion of local SEO results.

Most businesses have built up domain authority over the years with domain age and associations, press coverage, etc. The NAP consistency part of the algorithm is where most businesses are lacking.

After almost 5 years of trials and tribulations, as of right now it seems the best way to create consistency is to claim all the bad listings in nearly all the directories and change them manually, one by one.

There is no short cut. It's just a tedious manual process.

The data aggregators "push" data out, but it's not a two-way street. The "partners", as Phil says, do not "push" data back. That is, the aggregators do not seemingly "pull" the data that is already out there. There seems to be a very small handful of websites that actually change the data when they receive the aggregators' information, but it's a very small amount (maybe fewer than 10 that I've seen).

The other problem is time lag. The aggregators only push out data to their partners every 30-90 days depending on the aggregator, then it might take a few weeks if not months for the data to update into the "partners" system, then it might take 1-4 weeks for that directory to get crawled by Google or Bing. You're looking at a good 3-6 months for any data coming from aggregators to change at all.
 
Joined
Jul 19, 2012
Messages
113
Likes
69
@Russ:

Are you saying that in 95+% of the time there is bound to be a duplicate listing created 'somewhere' among the plethora IYP sites... or that 95+% of IYP sites will create duplicate listings when correcting the aggregator data? hehe ;)
Sorry, I somehow missed this the last time you posted it. What I'm saying is that in more than 95% of cases, duplicates (at least a few) will be created when you submit via an aggregator (LocalEze, ExpressUpdate, UBL, Yext). There is only a very tiny percentage of cases where the business is brand-new and doesn't have any footprint, so duplicates won't be created then.

I haven't personally used UBL's service, because I have a very long track record of seeing exactly what you are seeing with clients of mine that had previously used UBL. And unfortunately, it is not just UBL but all the rest as well. The reason is what I mentioned in my previous post - this is not really a problem of the aggregators so much as of the business directories themselves. The aggregators simply submit the business data, and the directories are expected to figure out what is what. Unfortunately, most of them are technologically unsophisticated in this sense, so they fail, in some cases miserably. The worst of all sites are Merchant Circle, Citysquares, and Get Fave, because they receive business information from the largest number of data providers. Furthermore, they do not really have a clear mechanism for removing duplicates other than claiming those and making sure they are accurate. I've had less than 10% success in removing reported duplicates from Merchant Circle, and 0% with the other two mentioned sites.

I concur with what Jacob said - it just has to be done manually. That's what I've always recommended, long before I had myself such a service.
 

russofford

Forum Member
Joined
Jul 25, 2012
Messages
130
Likes
25
The worst of all sites are Merchant Circle, Citysquares, and Get Fave, because they receive business information from the largest number of data providers. Furthermore, they do not really have a clear mechanism for removing duplicates other than claiming those and making sure they are accurate. I've had less than 10% success in removing reported duplicates from Merchant Circle, and 0% with the other two mentioned sites.
Hi Nyagoslav,

I just wanted to let you know that recently I have had great success in getting CitySquares duplicate listings removed by simply emailing info@citysquares.com. I have normally received an email from Allison Crane within an hour or two, the same day as the request. (Her email signature shows she is a Customer Service Manager, and her email address sends from the Sightly.com domain.) The CitySquares listings are deleted and return a 404 afterward.

I have also had good luck lately with MerchantCircle by using their online Duplicate Listing Removal Form. I don't think they reply to let you know the dupes were removed, so you have to revisit the profile URLs a few days later to check whether they were 301 redirected to the corresponding city page.

I can't comment on Get Fave, because I have not attempted listing removal with them recently (that I've paid attention to) :)

Sincerely,

Russ
 
Joined
Mar 28, 2013
Messages
8
Likes
10
Hi, this is Damian from UBL. I wanted to respond to these ultimately very helpful comments on our submission process and the state of the union for local search publishers in general. At UBL we are very sensitive to issues such as Russ describes, and we work carefully with publishers to prevent duplication of listings and to encourage them to pass along as much of the high quality listing information we receive from clients as possible. Nyagoslav is quite right that not all publishers have foolproof methods for preventing duplication and other issues. Our goal is to submit clean listing information wherever we can and simultaneously to promote within the publisher community the importance of improving listing content for the benefit of businesses and consumers. We intend to take Russ's detailed feedback right to the publishers and use it along with our own quality testing in our ongoing effort to improve our own services and the quality of local data in publication.

Russ, if you would like to follow up further with the service issues you've described I would be happy to help.

Thanks,
Damian
 

russofford

Forum Member
Joined
Jul 25, 2012
Messages
130
Likes
25
Re: Can We Trust the Major Aggregators? What's the Best Way to Propagate NAP Data to IYPs?

Thank you for the input, Damian... and also, thank you for the personal reply to my support rant/request via email.

I would absolutely love your service, use it more and recommend it... if it didn't cause duplicate listings for my clients that already have an existing IYP presence.

Like I've said before, I realize that it is not necessarily 'your fault' that dupes are published; it's due in part to the IYPs' methods of receiving/publishing data... and in part to the way the whole industry works.

Would you offer us some information on how the aggregators/data providers/publishers work and how they might better prevent publicizing duplicates?

How do some IYP websites already check against / prevent duplicates? What mechanisms might be available to help other IYP companies to avoid propagating dupes?

Do they cross reference company names / phone numbers that are provided with what is in their existing database? I suppose the issue would be 'how would they know' when to accept and when to reject a new potential listing... especially when a minor variation of the business name, or an alternative phone number was received/scraped.

I also assume that it is easier for an IYP to not create a dupe only when the NAP they receive from you exactly matches the NAP they may already have on file. However, I have seen some dupes on some IYP websites that are identical in name/phone/address, too.
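The exact-match case Russ is describing — a directory only recognizing an incoming listing as the same business when the NAP matches what it has on file — can be illustrated with a simple normalization step. This is a hypothetical sketch with made-up data, not any IYP's actual matching logic:

```python
import re

def normalize_nap(name, address, phone):
    """Normalize a Name/Address/Phone triple for exact-match comparison:
    lowercase, strip punctuation, and reduce the phone to its last 10 digits."""
    norm = lambda s: re.sub(r"[^a-z0-9 ]", "", s.lower()).strip()
    digits = re.sub(r"\D", "", phone)[-10:]
    return (norm(name), norm(address), digits)

# Two listings with trivial formatting differences collapse to the same key...
a = normalize_nap("Acme Plumbing, Inc.", "123 Main St.", "(555) 123-4567")
b = normalize_nap("ACME Plumbing Inc",  "123 Main St",  "555-123-4567")
print(a == b)  # → True

# ...but a name variation or alternate phone number still looks "new".
c = normalize_nap("Acme Plumbing & Heating", "123 Main St", "555-999-0000")
print(a == c)  # → False
```

This also shows why Russ's caveat holds: normalization only catches formatting variants, so a minor name variation or an alternative tracking number sails right past an exact-match check and becomes a duplicate.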

Thanks,

Russ
 
Joined
Mar 28, 2013
Messages
8
Likes
10
Hi Russ,

It's not easy to give a definitive answer to your questions. As Nyagoslav said quite correctly, there are companies with the resources of a Google and then there are others that are much smaller. To Google I would add Yahoo, Bing, Yelp, YP, and a few others where there is (or should be) a high sensitivity to issues like duplication coupled with the means to do something about it. But again, even Google doesn't get it right all the time.

Part of the difficulty is exactly as you state. If the NAP changes significantly, you may have only a 60% match to the other published listing. Through automated, large scale processes, you have to make a reasonable decision about whether a 60% match is or is not the same business. There may be a tendency to tolerate more duplication simply in order to avoid deleting legitimate businesses from your database.
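Damian's "60% match" scenario can be sketched with a generic string-similarity ratio. The threshold, the data, and the merge-or-create decision below are illustrative assumptions only, not UBL's or any publisher's real algorithm:

```python
from difflib import SequenceMatcher

def nap_similarity(a, b):
    """Crude overall similarity between two flattened NAP strings (0.0-1.0)."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

# Illustrative threshold: treat anything at or above 60% as "same business".
THRESHOLD = 0.6

existing = "Acme Plumbing Inc, 123 Main St, Springfield, (555) 123-4567"
incoming = "Acme Plumbing & Heating, 500 Oak Ave, Springfield, (555) 987-0000"

score = nap_similarity(existing, incoming)
if score >= THRESHOLD:
    print(f"merge/update existing listing (score {score:.2f})")
else:
    print(f"create new listing (score {score:.2f})")
```

The tension Damian describes lives entirely in that threshold: set it low and you merge distinct businesses (deleting legitimate listings); set it high and you tolerate duplicates. Either error is invisible until someone audits the directory.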

Another response to your questions has to do with more of an attention shift we would like to see in the industry. The agenda of publishers has not always been tightly focused on providing the best channel for merchants to fix listing problems thoroughly and consistently. This forum is testament to that and our mission is to advocate for you guys and for merchants in any way we can. We think that although it does not always show up in an obvious way on a revenue statement, quality local data benefits publishers too.

So it's a technical challenge and a matter of getting more attention focused on quality. For years local listings have been thought of in the industry as a commodity but I do think that is changing -- think of the reaction to Apple Maps last year.

Again, Russ, we really do appreciate your feedback and want to do everything we can to keep you as a customer of our service.

Best,
Damian
 

Linda Buquet

Moderator
Moderator
Joined
Jun 28, 2012
Messages
14,433
Likes
4,282
Hey Damian, thanks so much for taking the time to weigh in and provide more clarity!

Since you don't have a sig, I changed your title so folks know who you are. :)
 
