Call Tracking Can Be a Major Issue for Local SEO if Used Improperly.

Here’s Why:
Call tracking systems use JavaScript to display numbers dynamically depending on where the visitor came from. This can yield extremely useful data, but the problem is that if any of those numbers get read and indexed, they can wreak havoc on your NAP consistency. When that happens, your rankings and (more unfortunately) your ability to drive business through local search results suffer.
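To make the mechanism concrete, dynamic number insertion typically works along these lines. This is an illustrative sketch, not any vendor's actual code; the referrer rules and phone numbers are made up:

```javascript
// Hypothetical tracking numbers keyed by traffic source.
// The "default" entry is the real, published business number.
const TRACKING_NUMBERS = {
  google:   '555-0101', // visitors arriving from Google
  facebook: '555-0102', // visitors arriving from Facebook
  default:  '555-0100'  // everyone else sees the real number
};

// Pick a number based on the referring URL.
function pickTrackingNumber(referrer) {
  if (/google\./.test(referrer))   return TRACKING_NUMBERS.google;
  if (/facebook\./.test(referrer)) return TRACKING_NUMBERS.facebook;
  return TRACKING_NUMBERS.default;
}

// On a real page, the script would then overwrite the phone number
// in the DOM, e.g.:
//   document.querySelector('.phone').textContent =
//     pickTrackingNumber(document.referrer);
```

The NAP problem comes from that last step: if Google renders the page after the swap, it can index a tracking number instead of the real one.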

This didn't use to be a problem, but now it is (from Google's Webmaster Central Blog):

In 1998 when our servers were running in Susan Wojcicki’s garage, we didn’t really have to worry about JavaScript or CSS. They weren’t used much, or, JavaScript was used to make page elements… blink! A lot has changed since then. The web is full of rich, dynamic, amazing websites that make heavy use of JavaScript. Today, we’ll talk about our capability to render richer websites — meaning we see your content more like modern Web browsers, include the external resources, execute JavaScript and apply CSS.

Traditionally, we were only looking at the raw textual content that we’d get in the HTTP response body and didn’t really interpret what a typical browser running JavaScript would see. When pages that have valuable content rendered by JavaScript started showing up, we weren’t able to let searchers know about it, which is a sad outcome for both searchers and webmasters.

In order to solve this problem, we decided to try to understand pages by executing JavaScript. It’s hard to do that at the scale of the current web, but we decided that it’s worth it. We have been gradually improving how we do this for some time. In the past few months, our indexing system has been rendering a substantial number of web pages more like an average user’s browser with JavaScript turned on.

Sometimes things don’t go perfectly during rendering, which may negatively impact search results for your site. Here are a few potential issues, and – where possible – how you can help prevent them from occurring:

- If resources like JavaScript or CSS in separate files are blocked (say, with robots.txt) so that Googlebot can’t retrieve them, our indexing systems won’t be able to see your site like an average user. We recommend allowing Googlebot to retrieve JavaScript and CSS so that your content can be indexed better. This is especially important for mobile websites, where external resources like CSS and JavaScript help our algorithms understand that the pages are optimized for mobile.
- If your web server is unable to handle the volume of crawl requests for resources, it may have a negative impact on our capability to render your pages. If you’d like to ensure that your pages can be rendered by Google, make sure your servers are able to handle crawl requests for resources.
- It’s always a good idea to have your site degrade gracefully. This will help users enjoy your content even if their browser doesn’t have compatible JavaScript implementations. It will also help visitors with JavaScript disabled or off, as well as search engines that can’t execute JavaScript yet.
- Sometimes the JavaScript may be too complex or arcane for us to execute, in which case we can’t render the page fully and accurately.
- Some JavaScript removes content from the page rather than adding, which prevents us from indexing the content.

To make things easier to debug, we’re currently working on a tool for helping webmasters better understand how Google renders their site. We look forward to making it available for you in the coming days in Webmaster Tools.
Posted by Erik Hendriks and Michael Xu, Software Engineers, and Kazushi Nagayama, Webmaster Trends Analyst

The tool mentioned in the last sentence of that blockquote is, I believe, the PageSpeed Insights tool. Here is what it shows for a client’s site currently using call tracking:
[Screenshot: PageSpeed Insights results for a site using call tracking]

The words “eliminate” and “remove” appear in the instructions for fixing this problem. The very first item listed to be removed: the call tracking JavaScript.

What Does this Mean for Local SEO?

Essentially, Google has been actively trying to read (and index accordingly) the information that JavaScript hides and displays, and now provides a tool that renders what is being displayed. This means the phone number you are trying to display or hide is (or will be) read and associated with your business. Being the least imperfect is the goal when it comes to NAP consistency, and this is obviously not the least imperfect.

Solution: Do Not Use Call Tracking Companies

(Sorry, call tracking companies.)
Here is the good news: there is a way around this issue, and it can actually give you more accurate data than any call tracking system could ever collect:

Integrate your CRM with Google Analytics.

Integrating your CRM (where you collect all the information about incoming calls anyway) with your analytics account makes it much easier to weed out non-client calls from the data you use to decide how to gain more clients. The result is more accurate data and better decisions.

The other way this data becomes more accurate is that asking callers how they found you yields a far more insightful look into the touchpoint that influenced them most (the one they actually remember) than the JavaScript ever could. By design, a JavaScript system tracks either the first or the last interaction. That can be useful, but if there were two other touchpoints in between, I would rather know which of the four was most memorable to the person contacting me, not just what a bot can track.

A little bad news (for some): the only CRM I have been able to find that integrates with Google Analytics is Salesforce.

I know there are other CRMs actively working to integrate with Google Analytics, but I have not found any others that can actually do it. Do you know of any that can integrate with GA?

March 26, 2015 | Local SEO
