Getting high rankings on competitive keyword terms is a large part of what search engine optimization (SEO) aims to accomplish. However, just because a website has a good ranking doesn't necessarily mean it will generate a click, deliver engagement, or produce a sale.
Unfortunately, many simple details are overlooked, or never considered at all, and they ultimately undermine otherwise very solid SEO initiatives.
The following are 10 common issues that seem to arise when companies are optimizing their websites.
1. Treating social media as an independent exercise
Your audience is going to spend three times as much time on Facebook as it does on Google. Social media (particularly Facebook) cannot be abandoned or left alone in a world with real-time search results. You know who thinks social media must be optimized for search? Google, that's who. So does Bing. Activity on your Facebook page will translate directly to more page real estate in search results for your brand. With Open Graph, you almost have to go out of your way to silo these practices, so why not align your digital activity with how your users spend their time online?
In this case, Specialized has more than 18,000 fans on Facebook but no updates to its Fan Page in more than four months. As a result, other websites related to the keyword "Specialized" are getting page one real estate on Google when the brand, Specialized, could be capturing multiple positions.
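For what it's worth, wiring a page into the Open Graph takes only a handful of tags plus a "Like" button. Here's a minimal sketch (the URLs and property values are hypothetical placeholders, not Specialized's actual markup):

```html
<!-- Hypothetical Open Graph tags in the page <head>; all URLs are placeholders -->
<meta property="og:title" content="Specialized Bicycles" />
<meta property="og:type" content="company" />
<meta property="og:url" content="http://www.example.com/" />
<meta property="og:image" content="http://www.example.com/images/logo.jpg" />

<!-- A Facebook "Like" button embedded via iframe, pointing at the same URL -->
<iframe src="http://www.facebook.com/plugins/like.php?href=http%3A%2F%2Fwww.example.com%2F"
        scrolling="no" frameborder="0"
        style="border:none; width:450px; height:35px;"></iframe>
```

With tags like these in place, a "Like" on Facebook is tied to the page itself, which is exactly what lets social activity and search listings reinforce each other.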
2. Too-broad navigation palette syndrome (TBNPS)
Getting that coveted No. 1 spot for a competitive term is one thing, but getting the full click-through rate value of that listing is another. Look at this site:
Auto Parts Warehouse actually ranks No. 1 on Google for "Auto Parts." But while other websites on this results page feature sitelinks (separate links to interior pages presented beneath the main listing), Auto Parts Warehouse does not. And every bit of real estate on page one matters. The problem here is that, according to my spider, there are 651 links on the homepage. At Resolution Media, we generally discourage more than 100 links on any single page, let alone the homepage. No wonder search engines aren't identifying key access points on this site!
Granted, there's no silver bullet in this regard, but sorting the homepage by brand, product type, and year, and having sub-navigation therein, could create a much more powerful and enticing user experience right in the search result.
3. Failing to plan (and planning to fail) for traffic fluctuation
So let's say you crack the first page for a high-volume query. Your title is enticing, your description reads like a dream, and the URL even has some keywords in it. Everything lines up to drive the maximum click-through rate. But what happens if it works so well that Google starts sending more traffic to your site than the site is equipped to handle? I'm talking about network stability. What can happen is something like this:
The highlighted area in this chart represents a brand's launch in five new markets with offline marketing for the site and an aggressive paid search campaign. A tax on your server/network can slow down your site, which limits its search optimization potential because page load times impact rank (on Google). Additionally, you're seriously hindering the experience for users on smartphones and other mobile devices who may not have the luxury of a WiFi connection.
I've always advocated for keeping IT/Server Admins in the loop with your traffic forecasts. The more they know about the potential SEO volume, the better they can ensure the site is properly load-balanced, the CMS isn't overly taxing, and the network connectivity is equipped to accommodate both users and search engine spiders.
4. Ignoring non-web-page elements of search (namely video)
YouTube is now the second-largest search engine behind Google. Brands should plan on consumers looking for their logos, commercials, and store locations, and optimize accordingly. For example, if somebody types "Harley Davidson Commercial" into Google, video results appear, but none of them comes from a Harley Davidson Brand Channel, and the blended results don't feature official Harley pages.
In this particular search result, an official Harley page is not shown until below the fold and, even then, it has nothing to do with commercials. This is a great opportunity for Harley to capture a particularly brand-engaged consumer, and there are almost 15,000 queries per month for "Harley Davidson Commercial." Furthermore, with the advent of Facebook's Open Graph, Harley could seed pages with "Like" buttons to further evangelize the channel and its brand.
5. Failing to optimize for the website's goals
Perhaps I'm being presumptuous in assuming I know what everyone's goals are, but if a site has a transactional goal (we'll say producing sales), then that should be apparent from the outset. In the hosting realm (a competitive search space), you'll see that Hivelocity is a very transactional site with four price-point offers above the fold on the homepage:
However, the meta descriptions that inform the organic search listing have no call to action. In fact, calls-to-action in this search result aren't very apparent at all:
If you aren't reflecting the goals and purpose of your site (especially in a highly competitive search result), you're losing out to those that do.
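The snippet searchers see is usually drawn from the meta description, so that's where the call to action belongs. A hypothetical example for a transactional hosting page (the copy here is illustrative, not Hivelocity's actual tag):

```html
<!-- Hypothetical meta description carrying an explicit call to action -->
<meta name="description"
      content="Fully managed dedicated servers with 24/7 support. Compare four plans and order online in minutes." />
```

Engines won't always use this text verbatim, but when they do, a listing that asks for the click can outperform one that merely describes the page.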
6. Mischaracterizing who your competitors are
This happens a lot with emerging technologies and markets. I'm talking about that time when a new technology emerges and the world becomes rife with sellers and "let me, the expert, explain what this is" authority websites. In these cases, it's important to trust the engines to separate the wheat from the chaff. They use aggregate query intent data to decide how best to show results. Evaluating the balance of informational versus transactional websites in a search result can give you plenty of insight. Take a look at the search results for "3G Wireless." As we can see, there's a host of informational data (Wikipedia), local data, and retailers.
If you're the marketing manager at Cricket, your online competitors aren't necessarily Sprint or Clear or transactional websites but, rather, a healthy set of tech information sites. This should inform the nature of the content presented for pages trying to rank on this term. However, on the Cricket site, the content on exactly what 3G is and why you need it is fairly thin:
Adding a simple FAQ or speed comparisons could help this page rank for 3G informational queries.
7. Running a paid search campaign independent of SEO rankings and visibility
Many studies have shown that the more page real estate a brand can occupy for a given search query, the better. In this example, see how the description in this natural link trails off at the end:
Assuming the copy in the paid ad is the best performing message, that 85,000 number should be repeated in the natural search description. It would surely stick out to searchers on this page. Also, perhaps for the term "best hotel prices," Hotels.com should bid up to a higher position for its paid ad since the natural ranking already covers lower-fold real estate. This way Hotels.com could have coverage over the whole page wherever the searcher's eye should dart.
8. Implementing a CMS without applying SEO best practices
Content Management Systems (CMS) have become increasingly effective at producing SEO-friendly pages. However, certain methods of implementation can still wreak havoc on SEO best practices right out of the gate. Speaking from personal experience, most of these issues are best addressed when deciding which CMS works best for your company. Out of the box, many CMS platforms leave you a leg behind the crowd that publishes unique titles and descriptions for every page. Additionally, if you go to market with an un-optimized CMS, your files can end up buried several levels below the root domain. This makes it difficult to establish a content hierarchy, as the "homepage" isn't actually "home" but rather a few sub-directories deep.
In this case, the Borders.com CMS redirects the user to a directory three levels away from the domain (/online/store/home). Architectural issues such as this make it difficult for engines to understand what the main pages are on the site.
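If re-architecting the CMS isn't feasible, the deep "home" URL can at least be mapped back to the root. Here's a sketch for Apache's mod_rewrite (assuming an .htaccess context; the path mirrors the Borders example, but the rules themselves are hypothetical, not Borders' actual configuration):

```apache
# Hypothetical .htaccess sketch: serve the CMS "home" content at the root
# instead of bouncing visitors to /online/store/home
RewriteEngine On
RewriteRule ^$ /online/store/home [L]

# 301 any direct requests for the deep URL back to the root,
# so engines consolidate signals on the domain itself
RewriteCond %{THE_REQUEST} \s/online/store/home[\s?]
RewriteRule ^online/store/home$ / [R=301,L]
```

The internal rewrite keeps "home" at the domain root for both users and spiders, while the 301 folds any equity the deep URL has accumulated back into the root.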
9. Lacking a robots.txt file
One of the most bone-headed oversights I see is old versions of pages being crawled because there is no robots.txt file telling search engines to ignore them. This hurts how efficiently a spider can get through the site, and a robots.txt file is such a simple thing to implement.
For example, Coca-Cola's multilingual site has no directory guidance for spiders due to the lack of a robots.txt file. This will impede the crawl-rate and make it difficult for Coca-Cola to get specific pages ranked for relevant queries (e.g., Diet Coke page for "Diet Coke" queries).
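A robots.txt file is just a plain-text file served from the site root. A minimal sketch (the directory names here are hypothetical, not Coca-Cola's actual structure) looks like this:

```text
# Hypothetical robots.txt, served at http://www.example.com/robots.txt
User-agent: *
Disallow: /old-site/      # retired page versions engines should skip
Disallow: /search/        # internal search results

Sitemap: http://www.example.com/sitemap.xml
```

The Disallow lines keep spiders out of stale or low-value sections, and the Sitemap line hands them an explicit map of the pages you do want crawled — which helps a sprawling multilingual site surface the right page for the right query.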
10. Ignoring how the public perceives your brand
Disclaimer: I do not have a vendetta against Coca-Cola. In fact, as I write this, I have a Cherry Coke Zero by my side! It turns out Coca-Cola is called Coke by 90 percent of the population (don't take it from me, see what Google says). This wouldn't matter if it weren't for the fact that Coca-Cola now serves up two distinct user experiences to searchers. Observe the differences in user experience in Google (when logged out).
The latter is clearly more dominated by Coca-Cola-branded sites. One of the videos even originates from an official Coca-Cola channel. The company could alleviate this situation by optimizing images and videos to both Coca-Cola and Coke in a consistent manner to help engines understand the relationship between the two better.
It's hard enough to get a first-page ranking in the organic search results. Don't screw it up by making one of the 10 mistakes I've pointed out here. Capture the full value you deserve from SEO and win the hearts and minds of your customers, not to mention your bosses!