Tracking SERPs instead of just keywords provides a lot of benefits and is really helpful for audits, consulting, and troubleshooting penalties and updates. I'm working on an article with a few different helpful methods for tracking SERPs, but wanted to go ahead and share this killer new tool that just recently launched a beta version.

It's called SERPWoo: https://www.serpwoo.com

Basically you load in keywords and it tracks the top 20 results for each keyword daily with stats on each domain/ranking page. It has customizable notification settings and lots of other features. I've been using it for a little over a month and haven't even checked out all the features yet, and since it launched they have been adding features about once a week. It's also good for tracking and getting notifications for business/brand names.

You can do a lot with this thing and it gives some great new ways to look at data for you other data geeks out there. I'm interested to hear what other uses people might get out of this. Also, the developers are taking suggestions and feedback and have already implemented some people's requests, which is always great with new tools.
 
Thanks Broland. I keep getting an error on load for multiple browsers...able to access via internal pages though
 
Hi All,

I'm not sure what the rules are, so I sincerely apologize in advance if I may be breaking/bending one of them. I'm Carter, one of the founders of SERPWoo. First I want to thank OP for the shoutout! We are working to bring a great tool to users and love feedback if anyone has any on how we can improve.

A couple of things. First, on the particular day Broland created this thread, I accidentally added the wrong homepage file to the site, which caused the homepage to not render correctly. All of the other pages were fine and the general system was fine; it was just the homepage. We caught the error within an hour and fixed it. These things happen in beta stages. I apologize for the confusion.

Second, one thing I know this audience is going to want is local SEO tracking (Google Maps), so we have it on our roadmap to include down the road. Right now our focus is to get the universal overview working perfectly, then bring a unique perspective to local SEO. Our target audience is small businesses, so we definitely need the local search perspective to be top notch.

If anyone has any questions, feedback, or comments, please don't hesitate to make your voice known to me or my team on here, on Twitter, or in the feedback box on the interface. We want to provide a tool that you not only use, but that is essential to your success, so any critical feedback that makes sense will be considered.

Thanks again for the mention Broland, and I'm already starting to love browsing this forum. I've been looking for a community that hyper-focuses on local search and SEO for a while.

- CCarter
 
Hi CCarter,

Thanks for joining and for weighing in.

What you did is just fine. If someone brings up your tool it's OK to elaborate or clarify.
(We just don't allow self promotion so you can't start a thread to talk about your tool or link to your own site.)

HOWEVER, when you do the local update, if you want to run it by me and I think it would be helpful to members, I can pre-approve it and you can tell us about it. We'd need to connect via email so you can clear it with me and I can tell you how to post it.
 
Hey good to see you here Carter. That's great news about adding local results in the future. There is a HUGE need for it right now... Google just recently broke all my custom local SERP tracking tools with the removal of the places search...:(
 
I'm starting to test it out and play with it this morning. It looks like you can customize projects to edit a certain number of keywords and remove some of the default keywords. I'm interested to see how accurate it is.

@CCarter - Are you guys planning on developing an API for this, or is it a standalone product? (hint: I like APIs for tools :p)
 
Any ideas on possible API integration possibilities? I may contact support directly to ask this...
 
Sorry about the late reply. You can delete all the default keywords and projects and do your own.

Accuracy - one thing to note is that we currently neutralize localized/personalized searches. So far we've been hitting the mark better than anything we've tested against, but overall our main focus is not rank tracking, but rather helping SEOs understand what's going on in their SERPs and letting customers get a good overview of the SERPs for ORM control, if that makes sense. As long as we're 90% accurate we're fine, since those domains are the ones that are really ranking in the SERPs - and since we aren't a rank tracker, we enjoy some flexibility in how we do things.

We just launched in June, so things are moving rather quickly. At the moment we don't have an API integration, but we DO have that on our to-do list. We are still in beta, and plan on coming out very soon with options for more keywords and features - and the API down the line. :)
 
I have to admit this is way over my head and I don't have time to dig in...

But I think it could shed some light on Pigeon.

I don't want to link out to a black hat forum so you need to add www and fix .net to go to read this.
VERY involved!

blackhatunderground DOT net/forum/blackhat-seo-marketing-and-traffic-generation/serp-analysis-sep-3rd-2014/

CCarter can you summarize here what you are trying to explain there since it all has to do with local? Not the part about you and competition but just what the SERPs are doing.
 
I have to admit this is way over my head and I don't have time to dig in... But I think it could shed some light on Pigeon.

CCarter can you summarize here what you are trying to explain there since it all has to do with local? Not the part about you and competition but just what the SERPs are doing.

Yeah, it's a bit involved since it draws upon several factors from previous conversations and analysis.

First let me go into monitoring and the volatility metrics for a background. We monitor the volatility of each keyword and display them for users to see in the main interface when looking at the SERPs.

The volatility score is calculated as 1 movement up or down equaling 1%, within the top 10 results of a keyword (even though we monitor the top 20 to 28, the volatility metric only takes the top 10 into account). So if the URL in the #1 position moved to #10, that would be a 9% movement - and vice versa, if the #10 URL moved to #1, that would be a 9% movement. That score would be if only 1 URL moved compared to the previous day's positions. As more URLs move up and down for that keyword, the total volatility number increases: if #1 moves to #10 that's 9%, and then if #2 moves to #8 that's 6%. If nothing else moved in that day's comparison to the previous day, the total for that example would be 15% (9% + 6%).

Now if ANYTHING drops out of the top 10, that's automatically the MAX of 10% - so if #1 goes below the #10 spot (#11 to infinity), that's a total of 10%. We keep it simple - if ALL top 10 results drop out of the top 10, that would equate to 100% volatility.

I normally don't like seeing anything above 20% - unless a major update is happening, because that would mean lots of strong movement. If you see 7-day averages of 20% or more, that would be the equivalent of two URLs in the top 10 dropping out of the top 10 EVERY DAY for 7 days - that's really incredible in my opinion.
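To make the scoring concrete, here is a minimal sketch of the volatility calculation as described above (1% per position moved within the top 10, a 10% cap for any URL that drops out, 100% max). The function name and data shapes are my own illustration, not SERPWoo's actual code:

```python
def volatility(prev_top10, curr_ranking):
    """Compare yesterday's top-10 URLs against today's ranking.

    Each position a previous top-10 URL moved counts as 1%; a URL
    that falls below #10 (or disappears) counts the full 10%.
    All ten dropping out therefore equals 100% volatility.
    """
    score = 0
    for old_pos, url in enumerate(prev_top10, start=1):
        if url in curr_ranking:
            new_pos = curr_ranking.index(url) + 1
        else:
            new_pos = None
        if new_pos is None or new_pos > 10:
            score += 10  # dropped out of the top 10: max penalty
        else:
            score += abs(new_pos - old_pos)  # 1% per position moved
    return min(score, 100)
```

For example, swapping #1 and #2 yields 2%, while a completely new top 10 yields the full 100%.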

To give a comparison, the keyword "Garcinia Cambogia" has a running 28-day average of 13% volatility - that's a manually monitored keyword. "Cheap Viagra" has a 28-day average of 17% - that's a highly spammy and volatile niche. "Replica Gucci" is the most competitive niche I've seen in my life, but has an average of only 13% volatility. For a visual, here is a screenshot of the last 300 days of 'replica gucci'. All that white area is URLs which are no longer ranking:


replica_1.jpg


To see the URLs from the white area, set the filter to "All URLs within the View Range" - you'll see there are so many URLs that the chart is completely unreadable (the reason for the focus date):


replica_2.jpg


So in that date range, there have been tons of URLs ranking and then disappearing - lots of activity, spammers and blackhats fighting one another - yet averaging less than 20%. That means when I say above 20% is high, that's REALLY high movement for the top 10, since even the most volatile niches aren't seeing averages above 20%.


So we have a global volatility metric - soon available to all users, where we are able to see the volatility of all the SERPs we are monitoring in SERPWoo. Here is a screenshot of today's:


volatility.jpg


It's worth noting Moz uses the same methodology, except they only monitor 1,000 selected, private keywords for their "Mozcast". Here is a screenshot of today's Mozcast:


mozcast.jpg


Algoroo.com is another Google monitoring service that monitors an "unknown" number of keywords - and does not state HOW they come up with their metrics - but here is their screenshot from today:


algoroo.jpg


Unlike us, though, both Mozcast and Algoroo don't display today's data. Whether that's good or bad is still to be determined, since we have a running progress chart.


So that ends the "rounding out the whole picture" of volatility. Now, to come to what I was talking about at BHU (blackhatunderground dot net): we implemented volatility metrics for each project, so users can group keywords together that are for clients, or related, or whatever, and see the volatility for that set of data, like so:


projects.jpg


^^ If you look closely, you'll notice something is way off when comparing the Celebrities and Christmas Holiday projects to these "pseudo local terms". That's what the BHU post was focusing on, because those terms are completely out of control. Now, looking at the original Global Volatility screenshot of SERPWoo, you'll notice this update started on August 25th (I'm not sure if it's Google Pigeon related, but it's movement seen in SERPWoo, Mozcast, and Algoroo around the 25th to the 28th - SW just seems to be picking up more sensitivity, possibly due to our userbase, or because we calculate based on all data versus a selected set like the other guys).


8/25 is when these "pseudo local terms" all started going crazy. First, what are "pseudo local terms"? They are terms that are entered nationally or globally, but should ALWAYS, 100% of the time, really return local results. "Pseudo local terms" include localized and specialized repairmen - think air conditioning, generator repair, plumbers, garage doors, windshield repair. This also includes specialized attorney keyword phrases - think injury or ticket attorneys - and other regional services like cellphone service, internet service, phone service, etc.


Now it needs to be pointed out that the reason we use "pseudo" is because no location-based word is inputted by users when searching. If someone in Colorado is looking for phone service, they aren't typing in "Denver phone service" or "Colorado phone service" - they are simply typing in "phone service": generic and broad.


Whenever someone types in these "pseudo local terms" in incognito mode - and this is where you have to pay close attention to what I am saying - Google will at times return localized results in incognito mode, even with pws=0 and all those crazy local-neutralizing variables, DURING AN UPDATE.
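For reference, a depersonalized query of the kind described above is usually built by appending parameters like pws=0 (disable personalized results) to the search URL. This is a rough sketch of how such a URL might be assembled; the exact parameter set (and Google's handling of it) changes over time, so treat the values here as illustrative:

```python
from urllib.parse import urlencode


def neutralized_query(keyword, num=20):
    """Build a Google results URL with personalization disabled.

    pws=0 historically disables personalized results, gl pins the
    country; num requests more results per page. These parameters
    are assumptions based on publicly known query strings, not an
    official API.
    """
    params = {"q": keyword, "num": num, "pws": 0, "gl": "us"}
    return "https://www.google.com/search?" + urlencode(params)
```

As the post notes, even with these parameters set, localized results can still slip through during an update - which is exactly the anomaly being tracked.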


Now, when SW pulls data from Google, we use a variety of IPs and double- and triple-check rankings in a given pull for accuracy. So if we see something off, the system will check and re-check just to make sure those are indeed the results. "Pseudo local terms" are really special outlier terms if you think about it from a macro level: they should always return regional results so the user has the best experience, but we as a crawler tell Google not to localize - and it complies, except when these updates happen.
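The re-checking described above can be thought of as a consensus step: pull the same SERP from several IPs and only trust a ranking that a majority of pulls agree on. This is a minimal sketch of that idea under my own assumptions - SERPWoo's actual validation logic isn't public:

```python
from collections import Counter

def confirmed_ranking(pulls):
    """Return the ranking seen by a majority of pulls, else None.

    pulls is a list of result lists, one per IP/fetch. A None result
    signals the system to re-pull before trusting the data.
    """
    tallies = Counter(tuple(p) for p in pulls)
    ranking, count = tallies.most_common(1)[0]
    if count > len(pulls) / 2:
        return list(ranking)
    return None  # no majority: re-check
```

With this kind of check in place, a localized result that still shows up consistently across IPs is a genuine signal rather than a one-off fluke.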


Here are the 4 terms I'm monitoring in my "pseudo local terms" project:


rca1.jpg


--


lip1.jpg


--


pi1.jpg


--


rodg1.jpg


Here you are seeing those spikes occur. There is the regular data, or as I called them, "the old folks", and then the new areas (white areas, because those URLs are not ranking on the set focus date - today, Sept 12th, 2014), where the results were including some localized URLs even though we specified not to.


Those spikes are so disruptive they are causing volatility up to 90%+ - that's serious. If you were a website in those rankings you'd see a constant flux of traffic day by day. HOWEVER, remember that since these "pseudo local terms" are in a special category, websites are getting regional searches already, so they will not necessarily see a flux in their traffic in THIS special case. The flux is happening in what Google is returning, but since "pseudo local terms" really do return regional results 99% of the time, webmasters won't see it. This is one of those outliers of the algo which we are able to see, but it doesn't really affect a webmaster at their level, since they get regional traffic anyway and searchers are using personalized results. We are seeing an anomaly where we can detect when a Google update is occurring across the map, because of this one 'glitch'.


I think the anomaly is due to databases fighting with one another: there is missing data that has to be accounted for when pulling the results, so they put in data based on your IP address. Now, you'll notice that over the last two days the spikes have been leveling off in all 4 examples. This is the first time I've seen this, and it could indicate that the major update that started on 8/25/2014 is finally coming to an end - or has ended, and Google has turned off the "switch" for the update.


What's important to understand is, if these were not "pseudo local terms", and your website was ranking nationally and your SERPs were seeing this, you'd be getting traffic one day, then nothing the next - a constant pendulum of back and forth, which is crazy for any business. Looking at the volatility pie charts for the individual keywords, you are able to see the crazy averages and movements that NO website owner wants to see. That's why I say anything above 20% is a really out-of-control SERP, and you might as well figure something else out for traffic, because there is no chance. But since the "pseudo local terms" are regionalized, webmasters aren't seeing that flux.


So bottom line, it appears that we can see when Google is pushing an update through their system via this crack/anomaly in their settings. This is a great thing for us and our users.


Another thing we talked about in that BHU post was the ability to filter: if people want to see what is ranking, and why, at the moment, you can filter the results accordingly. So if you think higher PR is a factor, or lower Alexa, or backlinks or social signals, you can filter with our pre-determined filters (soon to be customizable). Here is an example of the very first "pseudo local term" filtered for URLs WITH social signals:


yes_social.jpg


This shows there is only 1 URL ranking for the term with social signals, and even that is simply 1 LinkedIn share. So the majority of the ranking URLs do not have ANY social signals whatsoever:


no_social.jpg


This might be just an anomaly, and users are able to make their own determination whether social signals make a difference for their niche, keyword, or industry. In this case, social signals aren't acting as a ranking factor for any of the ranking URLs except the one with the single LinkedIn share.
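The with/without-social-signals split shown in the two screenshots amounts to a simple partition over the ranking data. A minimal sketch, assuming a hypothetical record shape (the field names are illustrative, not SERPWoo's API):

```python
def split_by_social(results):
    """Partition ranking URLs into those with any social signals
    and those with none.

    Each result is assumed to look like:
        {"url": "...", "social": {"linkedin": 1, "twitter": 0, ...}}
    """
    with_social, without = [], []
    for r in results:
        total = sum(r.get("social", {}).values())
        (with_social if total > 0 else without).append(r)
    return with_social, without
```

Running this over the SERP above would put the lone LinkedIn URL in the first bucket and everything else in the second, which is the comparison the screenshots illustrate.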


That probably was a lot to take in, but a quick summary: monitoring "pseudo local terms" allows us to see when a Google update is REALLY occurring, and in the aftermath you can use various filters to dig deep into the winners and losers to see what makes a difference, what doesn't, and how you can move forward with your own SEO campaign.
 
Whoa Carter, that post was epic! Are you a wizard??? :p (Was just checking out your MOE site, it appears you are many things!)

It's a lot to take in for someone not used to your tools the way you are. But very interesting.
Thanks so much for sharing!

I don't monitor pseudo local terms. I should but I just haven't. Prior to Pigeon I could just never figure that one out, whereas I had the blended algo nailed most of the time. But now with Pigeon nothing makes sense, so I have not even been checking KWs with a Geo modifier. Maybe I'll start now though.

Let us know if you see anything new or changing in Google Local land!
 
Whoa Carter, that post was epic! Are you a wizard??? :p (Was just checking out your MOE site, it appears you are many things!)

It's a lot to take in for someone not used to your tools the way you are. But very interesting.
Thanks so much for sharing!

I don't monitor pseudo local terms. I should but I just haven't. Prior to Pigeon I could just never figure that one out, whereas I had the blended algo nailed most of the time. But now with Pigeon nothing makes sense, so I have not even been checking KWs with a Geo modifier. Maybe I'll start now though.

Let us know if you see anything new or changing in Google Local land!

LOL, I wish I was a wizard. I'm just a marketer who knows how to program. I focus on overall marketing, online and offline, with SEO being one tool in the toolbox to generate traffic. I'm a big advocate of Traffic Leaks for generating a tremendous amount of targeted traffic. A traffic leak is what it sounds like: going to a location where your target audience is, studying it for a while, then jumping into the swing of things and slowly building your authority and driving traffic toward your own project.


Once marketers truly understand that the WHOLE internet is a potential traffic source, not just Google, you can go to any platform, forum, social network and jump into the swing of things without spamming. If you provide useful information to your target audience, they'll come looking for you - and when someone discovers something by themselves versus it being pushed upon them, they are more likely to trust and buy it.


A week back one dude came to me and stated he'd only been getting about 1-3 visitors a day from SEO since June. I told him bluntly: clearly SEO is not working for you, so why not traffic leak? I sent him a link to my post about it. He was a little hesitant and stated he had tried reddit in the past, but only got 150 visitors from that traffic leak - from a one-time post. When I put it into perspective that those 150 visitors to his news site in that single day were more than he'd gotten from SEO in 3 months, the light came on.


I then told him to try different platforms, different forums, different blogs, and monitor which traffic is the best - lower bounce rate, more stickiness, more newsletter sign-ups - and keep trying. Exactly 7 days after we started chatting he's generating 1,000+ visitors a day from traffic leaks, completely disregarding SEO - his daily stream of returning visitors is steadily increasing and he's broken the Google chains. That's what I try to teach - I'm setting up a site about traffic leaks in the near future to discuss all possible techniques without people having to go to Wickedfire to see my post about it.


That's a quick background on me, I tell folks SEO is one tool in the toolbox, and any website's traffic can be YOUR traffic if done correctly, without being a spammer.


---


Today's follow-up - here is a keyword which wasn't part of the original 4 above, but I recalled we were tracking it for 2 months before SERPWoo opened its doors, and it shows the same pattern. Input "Auto Insurance" into your SW projects and check it out:


auto_insurance.jpg


^^ Same spikes we saw, starting on Aug 25, 2014 - notice the term is really supposed to return regional results for the best user experience. Now we see leveling off and no more crazy spikes as the update dies down. (I revealed this one since we've been tracking it for a long time internally, and to give you guys an understanding of what we're seeing.) As well, none of the other 4 pseudo local terms have been spiking in the last 6 days, so that's a pretty good sign things are over. What do the results look like - any changes? It's so much data, it's hard to tell at the moment, but I'm going to continue digging into this.


But I think we've got a way to really see whether a Google update is going on or not. With "Auto Insurance" you can see similar spikes on past days around times Google updates have been known to be rolling out. We're just starting to scratch the surface of what's possible. :)
 
I'm a big advocate of Traffic Leaks for generating a tremendous amount of targeted traffic. A traffic leak is what it sounds like: going to a location where your target audience is, studying it for a while, then jumping into the swing of things and slowly building your authority and driving traffic toward your own project.

Once marketers truly understand that the WHOLE internet is a potential traffic source, not just Google, you can go to any platform, forum, social network and jump into the swing of things without spamming. If you provide useful information to your target audience, they'll come looking for you - and when someone discovers something by themselves versus it being pushed upon them, they are more likely to trust and buy it.

Awesome, thanks for all that info Carter!

I recommend building authority and getting active on forums or social networks as a way for consultants to pull qualified leads to them instead of prospecting too.
 
Just finally set up a demo account Christine.

I don't work on clients any more - just help the industry. So excited to see how your tool can help me with SERP analysis. There are a couple keywords I use for tracking packs, so I just added them to your tool.

Have a feeling I'll be asking you to do a dedicated post/video more targeted to direct local queries. I've mainly only seen you talk about pseudo localized queries. Anxious to expose this to more of the local search community, but I'm still trying to get my mind around it because it's so different than what we are used to.
 
Just finally set up a demo account Christine.

I don't work on clients any more - just help the industry. So excited to see how your tool can help me with SERP analysis. There are a couple keywords I use for tracking packs, so I just added them to your tool.

Have a feeling I'll be asking you to do a dedicated post/video more targeted to direct local queries. I've mainly only seen you talk about pseudo localized queries. Anxious to expose this to more of the local search community, but I'm still trying to get my mind around it because it's so different than what we are used to.

Thanks, we're working on bringing local map packs and other local SEO solutions to SERPWoo in the coming months, since we plan on having local businesses as one of our key audiences. I'm here if you need anything, and I would love to do a post/video targeted to direct local queries.

Just a side note, that's not me in the videos... :)
 
Local Ranking and Map Packs are here!!

local-seo-pizza-chicago-map-tracking.jpg

Let me know if you have any questions or requests, we're starting down our local SEO route to start servicing SEOs that target this area. Hit me up if you need anything, I'm always available!
 