6 Issues SEO Experts Just Love to Deal With – And a Couple They Hate

When it comes to search engine optimization, you have to take care of a wide spectrum of tasks. There are the big projects that take weeks or even months to complete. There are the medium-sized tasks that require your undivided attention. And then there are the low-hanging fruit – SEO issues that are easy to fix but bring big rewards.

Naturally, all SEO experts love the low-hanging SEO efforts that can improve their sites dramatically.

On the other hand, there are the problems that drive SEOs crazy. They do everything that they’re supposed to do, stay away from black hat SEO techniques, and comply with Google’s rules. However, in spite of their efforts, they just can’t seem to get enough traffic (or the right kind) to their sites.

But, we’ll deal with those issues later. First, let’s take a look at six common SEO problems that are easy to fix but pay off big time.


Improper Robots.txt Configuration

A common issue that can cause a lot of problems is a simple slash symbol (“/”) placed improperly in the robots.txt file. Even a professional can sometimes make this mistake, but most of the time the “offender” is a small business owner who built the website with little SEO knowledge.

How much harm can a simple “/” do, you might ask? Well, an improperly configured robots.txt file can prevent search engines from indexing your website. In other words, this minor problem can jeopardize your site’s organic traffic.

The good news is that you can correct the robots.txt file with any text editor, such as Notepad. Go to sitename.com/robots.txt and look for a rule that disallows everything: a “User-agent: *” line followed by “Disallow: /”. Talk to your SEO or developer before making any changes.
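To illustrate the difference that one character makes, here is a robots.txt that blocks every crawler next to one that allows them all (sitename.com stands in for your own domain):

```txt
# BAD: the lone "/" tells every crawler to stay out of the entire site
User-agent: *
Disallow: /

# GOOD: an empty Disallow value places no restrictions on crawlers
User-agent: *
Disallow:
```

If your file shows the first form and you did not put it there deliberately (for example, on a staging site), that is very likely the traffic killer.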



Bad Sitemaps

Sitemaps help Google and other search engines understand your website. Due to their ability to tell web crawlers what your site is about and avoid any confusion, they play a vital role in your search engine optimization strategy.

More often than not, small business owners forget to update their sitemaps when they make changes to their sites. Since search engine spiders are not brought up to date with these changes, they have difficulty crawling and understanding your site. Hence, you can expect to see a dip in your traffic.

One of the most effective ways to find these types of issues is with Google Search Console (formerly Google Webmaster Tools). Look for problems like a missing XML sitemap location in the robots.txt file, an outdated version of the sitemap still being live, or multiple versions of the sitemap co-existing.
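For reference, the sitemap location can be declared in robots.txt with a single line, so crawlers find it without guessing (the URL below is a placeholder for your own):

```txt
Sitemap: https://www.sitename.com/sitemap.xml
```

You can also submit the same URL directly in Google Search Console to confirm it is being read correctly.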

Bad Redirects

Redirects are an excellent way to manage and control dead pages. However, if they are not used correctly, they can turn into a loop, sending search engines and people to a dead end.

If you move a page or your entire website to a different URL, a 301 redirect is the best way to ensure that users and search engines are directed to the correct page. A redirect that is not properly set up can cause a lot of damage. For example, a chain of redirects can signal to crawlers that your site’s design or coding contains an error and that it’s best avoided. This issue can have a dramatic impact on your organic traffic.
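As a small sketch, on an Apache server a permanent redirect for a moved page can be declared in the .htaccess file (the paths and domain here are placeholders):

```apache
# 301 tells browsers and crawlers the move is permanent,
# so visitors and link equity both flow to the new URL
Redirect 301 /old-page.html https://www.sitename.com/new-page.html
```

Keep redirects pointing directly at the final destination; chaining old redirect through old redirect is exactly how loops and dead ends creep in.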

Shady Link Building

Link building can help website owners see a significant gain in their organic traffic, but it comes with a bit of a risk if they don’t know what they’re doing. The last Penguin update changed the way Google evaluates the quality of links, making it clear that if you use black hat SEO methods, the consequences will be almost immediate.

Although not necessarily a low-hanging fruit, link building is one of the most effective ways to build domain reputation, drive organic traffic to your site, and grow your business steadily. Just make sure that the backlinks come from reputable, relevant sources in your niche and appear natural.

Multiple Versions of Homepage

People don’t really care if you have multiple URLs pointing to the same homepage (for example: sitename.com, www.sitename.com, www.sitename.com/home.html, and so on). Search engines, on the other hand, do care, and this configuration can affect your organic traffic.

In most cases, Google will decide which version to index, but it might also index a variety of your URL versions, which can lead to a lot of confusion.

The simplest way to fix this problem is to centralize the correct version (usually www.sitename.com) by creating 301 redirects for the other variations. That way, you consolidate link equity instead of spreading it between multiple versions of your homepage.
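On an Apache server, for instance, the bare domain can be redirected to the www version with a mod_rewrite rule in .htaccess (sitename.com is a placeholder for your own domain):

```apache
RewriteEngine On
# Match requests for the bare domain (case-insensitive)
RewriteCond %{HTTP_HOST} ^sitename\.com$ [NC]
# Permanently redirect to the www version, preserving the requested path
RewriteRule ^(.*)$ https://www.sitename.com/$1 [R=301,L]
```

The same pattern works in reverse if you prefer the non-www version – the point is to pick one and redirect everything else to it.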

Speed Issues

Internet users are extremely impatient. 40% of them will leave a page if it takes more than three seconds to load. What’s worse is that 44% of them will tell a friend about the poor experience they had.

Page speed plays a key role not only in your users’ experience but in your rankings as well. Google has included page load time in its mobile-friendly algorithm, meaning that it favors sites that load fast. You can do a lot of things to improve your site’s speed, such as optimizing images and videos, trimming JavaScript, and cutting back on other bells and whistles.
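Two common quick wins can be sketched in plain HTML: keeping non-critical scripts from blocking the first render, and delaying offscreen images until they are needed (the file names below are placeholders):

```html
<!-- "defer" downloads the script in parallel and runs it
     only after the page has been parsed -->
<script src="analytics.js" defer></script>

<!-- "loading=lazy" postpones fetching this image until the
     visitor scrolls near it -->
<img src="gallery-photo.jpg" alt="Product gallery photo" loading="lazy">
```

These are starting points, not a full performance audit – compression, caching, and hosting all matter too.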

Make sure to also use the Google PageSpeed tools to test and improve your site’s load time.

As mentioned at the beginning of this article, there are the problems that SEO professionals love to deal with, and then there are the issues that everybody loathes.

Here are just two of them.

Spammy Referral Traffic

Not all traffic is good traffic. The problem with spam websites is that the visitors they send have nothing to do with your site. Moreover, in most cases they’re not even real people but bots that hit your site and bounce back after a fraction of a second. When fake traffic starts appearing in your Analytics reports, it can make your data inaccurate.

One of the most frustrating things about spam traffic is that it’s ongoing. You block one ghost referral from showing up in your future Analytics data and, before you know it, another spammy site starts sending bogus traffic your way.
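For the spam bots that actually reach your server, one stopgap is to refuse requests carrying a known spam referrer. A minimal Apache .htaccess sketch (spamdomain.com is a placeholder for whatever domain shows up in your reports):

```apache
RewriteEngine On
# Match requests whose referrer contains the spam domain (case-insensitive)
RewriteCond %{HTTP_REFERER} spamdomain\.com [NC]
# Refuse the request with 403 Forbidden
RewriteRule .* - [F]
```

Note that this only stops crawler-based spam. “Ghost” referrals never actually visit your server – they inject data straight into Analytics – so those have to be excluded with filters inside Google Analytics itself.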

Sudden Dips in Traffic

A dip in traffic doesn’t necessarily mean trouble. During the holiday season, on weekends or during the summer months you can expect to experience low traffic rates. But, if your traffic loss is sudden and steep, then that’s a red flag.

The annoying thing about sudden dips in traffic is that they’re hard to pin down. Anything from new analytics software that doesn’t work well with your Google Analytics setup to a new algorithm update can cause them.

The best protection against sudden dips in traffic is to stay alert and monitor your analytics regularly.


SEO is not a one-and-done type of deal. It needs constant monitoring and adjusting. Encountering several small SEO issues is not a big deal; most professionals are used to them and even take joy in discovering and fixing them. But if you ignore these low-hanging-fruit problems, they’ll snowball and damage your site in the long run.

Kostas Chiotis is a content marketing expert and the founder of IrisSignals.com. He’s also a blogger outreach specialist who has helped numerous businesses build their reputation, boost their traffic and, ultimately, increase their bottom lines.

Connect with him on LinkedIn, Twitter @IrisSignals, and Facebook.