If you're like most online merchants, you've got an eye on your SEO strategy and the tactics involved in gaining higher organic rankings and driving more targeted, free traffic to your site. If you've been paying attention to Google lately, you've noticed a few changes that have had, or may have, an impact on your business. I'm here to help you make sense of these recent changes and offer a long-term SEO strategy that'll stand the test of time -- or at least stand the test of Google algorithm changes and, in the case of Hummingbird, complete overhauls.
A rundown of this year's Google changes and their effects
Penguin 2.0 -- May 22, 2013: This was primarily a link-quality algorithm update that took into account the incoming link profile of any given website. Google was looking for obvious link manipulation -- for example, websites clearly trying to build incoming links with very specific anchor text, well beyond what a natural-looking incoming link profile would resemble. Typically, this anchor text consisted of the most valuable business terms for that website (e.g. LCD HDTVs).
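To make the anchor-text idea concrete, here's a minimal sketch of how you might audit your own link profile for over-optimization. The link data, the `anchor_text_distribution` helper, and the 50 percent threshold are all hypothetical illustrations, not anything Google has published:

```python
from collections import Counter

def anchor_text_distribution(links):
    """Given (source_url, anchor_text) pairs, return each anchor text's
    share of the total incoming-link profile."""
    counts = Counter(anchor for _, anchor in links)
    total = sum(counts.values())
    return {anchor: count / total for anchor, count in counts.items()}

# Hypothetical link profile: a natural one mixes brand names, bare URLs,
# and generic anchors; a manipulated one is dominated by one money term.
links = [
    ("blog-a.example", "LCD HDTVs"),
    ("forum-b.example", "LCD HDTVs"),
    ("news-c.example", "LCD HDTVs"),
    ("blog-d.example", "Acme Electronics"),
    ("blog-e.example", "www.acme-electronics.example"),
]

dist = anchor_text_distribution(links)
# If a single commercial anchor dominates (say, over half the profile),
# that's the kind of pattern Penguin was designed to catch.
if dist.get("LCD HDTVs", 0) > 0.5:
    print("Warning: anchor profile looks over-optimized")
```

The exact cutoff doesn't matter; the point is that a profile skewed toward one exact-match commercial phrase looks nothing like how people link naturally.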
This update also targeted link webspam in general, an ongoing Google battle for years now, and obvious advertorials that pass PageRank. In other words, if you have a lot of very low-quality incoming links and are associated with spammy links (e.g. automated systems that build low-quality links), you've probably seen ranking losses. Widget bait and site-wide links fit this bill. Likewise, if you've been paying for links or paying for written endorsements on other websites with links that pass PageRank, you've probably seen ranking losses.
Even if you haven't seen losses and are still using these tactics, I encourage you to stop. Google has become increasingly sophisticated in its detection and filtering technology. As long as SEO is important to your business, don't play with these matches.
Updates to Google's Webmaster Guidelines -- ongoing: These tend to slip under most people's radar, so I recommend checking Google's guidelines after every update and returning to them to guide your strategy. It's also good to check in on Bing's guidelines. These updates detail the best practices Google recommends to help it find, crawl, and better understand the content and products on your site. Under the quality guidelines there is a section on link schemes, where Google outlines linking practices that violate its guidelines; it's a useful reference for identifying the likely causes of penalties like Penguin 2.0 and 2.1, and you can click through for more information.
Google Webmaster Tools reporting on webspam-connected accounts -- August 2013: Google has been making efforts to notify site owners of manual webspam actions in GWT -- links Google has identified as unnatural and other forms of webspam, including content problems. Log in to your GWT account and navigate to Search Traffic -- Manual Actions. If you receive a manual action message, realize you are on Google's radar, and this is your opportunity to clean your content and/or incoming link profile of harmful, unnatural links. This additional help isn't all-inclusive at this point, and it might never be; I tend to think of it as Google's way to subdue the outcries for direct, specific answers about content and link penalties. The manual webspam message also offers a link to submit a request for review -- a reconsideration request. Regarding unnatural links, I have seen Google respond to reconsideration requests sent through GWT with actual sample URLs of links outside its quality guidelines. If your reconsideration request is detailed enough and you can provide proof of your work to remove unnatural links, Google may be more inclined to offer sample URLs.
If you find yourself in a situation where you think you have been penalized and yet no notices have been brought forth through GWT, there are solutions available to help you identify the likely cause.
100 percent term (not provided) -- most people heard about it on Sept. 23, 2013: Ugh…SEOs saw this one coming, but it's critical for every website owner and inbound marketer to know that all searches through Google are now 100 percent encrypted. Previously, you had to be signed in to your Google account to perform secure searches; now it no longer matters. Anyone performing a search on Google is SSL protected, 100 percent of the time.
As a marketer and business owner, this means all keyword-level data for organic traffic will be reported as 100 percent term (not provided) through analytics platforms like Google Analytics. This greatly disrupts the foundation on which most SEOs report keyword performance and removes direct access to specific keyword details. It blinds keyword data, and in this data-driven age, data blinding is never a positive thing.
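If you want to measure how much of your own keyword visibility has disappeared, a quick sketch like the one below works on any keyword/visits export from an analytics platform. The `not_provided_share` helper and the sample rows are hypothetical illustrations:

```python
def not_provided_share(keyword_rows):
    """keyword_rows: list of (keyword, visits) tuples exported from an
    analytics platform. Returns the fraction of organic visits whose
    keyword is reported as '(not provided)'."""
    total = sum(visits for _, visits in keyword_rows)
    hidden = sum(visits for kw, visits in keyword_rows if kw == "(not provided)")
    return hidden / total if total else 0.0

# Hypothetical export: post-encryption, nearly every row collapses
# into the single '(not provided)' bucket.
rows = [("(not provided)", 9400), ("lcd hdtv reviews", 350), ("acme tv", 250)]
print(f"{not_provided_share(rows):.0%} of organic visits have no keyword data")
# → 94% of organic visits have no keyword data
```

Tracking this ratio over time shows exactly when the encryption change hit your reporting and how little keyword-level data remains.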
Hummingbird -- Sept. 2013: SEOs often talk about the concept of short-tail and long-tail search queries. The idea is to separate search users into different groups to help better understand search intent and to identify, very broadly, where a particular user is in the purchase funnel. Typically, someone using three or fewer terms in a search query is near the top of the purchase funnel. With four or more terms, you probably have someone searching for something specific and closer to making a purchase. This is important because it helps shape SEO strategy and the content and information architecture of an e-commerce website. It's also flawed, because there is a lot of grey area: the line of separation is really an illusion, and theoretically a single web page of content -- or a well-produced product page, for that matter -- could provide the best result for a related set of search queries, regardless of term length.
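The term-count heuristic above can be sketched in a few lines. The `funnel_stage` name, the cutoff, and the labels are illustrative only -- this is the rough rule of thumb SEOs use, not a Google rule, and as noted it breaks down in the grey areas:

```python
def funnel_stage(query):
    """Rough heuristic: three or fewer terms suggests a user near the
    top of the purchase funnel; four or more suggests a more specific,
    purchase-ready search. Cutoff and labels are illustrative only."""
    if len(query.split()) <= 3:
        return "short-tail (top of funnel)"
    return "long-tail (closer to purchase)"

print(funnel_stage("lcd hdtv"))
# → short-tail (top of funnel)
print(funnel_stage("best 42 inch lcd hdtv under 500"))
# → long-tail (closer to purchase)
```

Hummingbird is precisely Google's acknowledgment that intent can't be reduced to a term count like this; it tries to read the context of the whole query instead.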
It's all about understanding search intent. It's a contextual problem, and Google faces this challenge as well. Hummingbird was very much a search engine infrastructure upgrade, not an algorithm update. It is Google's attempt to improve its search results for the aforementioned long-tail user and to address this contextual challenge. Google reports long-tail and extreme long-tail queries as by far the largest proportion of searches. It's trying to better understand the context of these searches and to match web content -- not just keywords in copy -- to provide better, more relevant, and more accurate results. This could mean several things:
- Certain content may now appear for a greater variety of search results, no matter the length of the search query.
- Less effective, low quality content will get fewer organic impressions.
- Content that is created for the sole purpose of ranking for a particular search query may also see fewer organic impressions because it lacks depth and context.
Penguin 2.1 -- Oct. 4, 2013: Matt Cutts, head of Google's webspam team, reported this as a minor tweak, but it is nonetheless relevant. Some reports suggest certain websites penalized back in May by Penguin 2.0 saw a recovery; others report increased ranking losses. As with all Penguin-related ranking changes, this one targeted link spam and other manipulative linking practices.
What I said before bears repeating: Google has become increasingly sophisticated in its detection and filtering technology, so as long as SEO is important to your business, don't play with these matches (i.e. manipulative link schemes).
The relationship between all of the above
If you take a step back and look at the picture Google is painting, one thing is clear: an SEO strategy built on tactics that directly manipulate search results is becoming less effective. At the rate Google is churning out updates, a year from now these practices may be completely ineffective and obsolete. This is exactly what Google wants. It wants inbound marketers and business owners to shift their primary focus away from Google and manipulative link and content schemes, and to concentrate that energy on each business's target market -- creating the best products, services, and content possible. Those websites will eventually rise to the top of Google's organic search results, while poorer-quality websites and brands become less competitive. The primary reason: people naturally link to and share useful content and products they truly love. The average web user is not going to link to, buy from, or promote a brand they don't trust or don't use.
This shows Google's commitment to the semantic web and its desire to truly understand content and conversations the way people understand one another, communicate with one another, and share things online naturally. We've seen this for years now with developments like the Knowledge Graph, support for Schema.org, and even the development of G+ as a legitimate social platform.
Stay tuned, as we will be discussing how to create a long-term SEO strategy for e-commerce business success.