By Kent Lewis, Anvil Media
I first discovered the power of incorporating keywords into websites to boost rankings in 1996. By adding a desirable keyword to the meta keywords tag and repeating it frequently in the page text, I ranked clients’ websites on popular search engines like WebCrawler and Lycos. It wasn’t simple, however: I was optimizing client websites for 14 search engines, each with a different algorithm. While much has changed since Google introduced PageRank, many of the fundamentals I refined in the ’90s still apply.
I originally shared my philosophy on search engine optimization (SEO) a decade ago with The 3 C’s of SEO. In my article, I identified three primary areas of focus for successful SEO: content, code and credibility. While social media has influenced Google’s algorithm, the 3 C’s still apply today. In this article, I will outline the most common (and timeless) mistakes and misconceptions about SEO.
One of the most formidable challenges in SEO is the creation of fresh, unique, relevant and compelling content. Creating copy, photography, video or other forms of content is both time-consuming and costly. Developing content so remarkable that it is worth sharing is even more challenging. This is a key issue, as inbound links still drive organic rankings in search. There are a few common mistakes marketers make when attempting SEO without adult supervision, most of which center on volume, quality and relevance.
For starters, marketers often mistake quantity of content for quality. They may hire staff or pay vendors to produce high volumes of generic, unprofessional or generally low-quality content, including short blog posts, fact-free articles and poor-quality images and video. While less common today, the use of exact-match domains (EMDs) with thin, flat topical microsites persists among marketers unfamiliar with Google’s 2011 Panda update.
When creating new content for a website, too many marketers fail to keep voice search in mind. With the rapid adoption of Siri, Amazon Alexa and Google Home, more searches than ever are conducted by voice, so queries tend to be phrased as questions instead of 3-5 topical keywords. A related oversight by some marketers is the failure to localize content for different countries, languages or business locations. The internet is both global and local, and Google rewards brands that understand this fact.
Even if marketers can create truly compelling content, some are still working from outdated strategies, including keyword-stuffing website pages. This is a particularly prevalent issue with blog posts and articles that are over-optimized beyond what would be considered natural or authentic. Conversely, overly aggressive publication-type websites may inundate pages with advertising or affiliate links to maximize revenue, at the expense of the user experience and rankings.
While most website visitors are not familiar with HTML or the content management systems (CMS) used to create and manage websites, Google cares a good deal about the source code underlying them. There are a handful of mistakes marketers make when building websites with SEO in mind. First and foremost, the code must be clean, fast, responsive and W3C/ADA-compliant. Far too many marketers are sold CMS platforms with bloated code and limited SEO functionality that hurt rankings for a variety of reasons.
The most common problem with CMS platforms is that the sheer volume of code can push down the content Google wants to rank. Bloated code also slows down the website, which degrades the user experience. Google recognizes and rewards exceptional user experiences, which includes designing with mobile users in mind. Mobile-friendly websites, built with responsive design, render pages differently depending on screen size.
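One rough way to gauge how much a page’s markup outweighs its actual content is a text-to-code ratio check. The sketch below is illustrative, not a tool the article endorses; it uses only Python’s standard-library HTML parser, and the sample page is made up.

```python
from html.parser import HTMLParser


class TextRatioParser(HTMLParser):
    """Counts visible text characters, skipping <script>/<style> contents."""

    def __init__(self):
        super().__init__()
        self.text_chars = 0
        self._skip_depth = 0  # >0 while inside script/style tags

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if not self._skip_depth:
            self.text_chars += len(data.strip())


def text_to_code_ratio(html: str) -> float:
    """Visible text characters divided by total page size in characters."""
    parser = TextRatioParser()
    parser.feed(html)
    return parser.text_chars / max(len(html), 1)


# Hypothetical page: mostly markup and styling, very little readable text.
page = ("<html><head><style>body{margin:0}</style></head>"
        "<body><p>Hello world</p></body></html>")
print(f"text-to-code ratio: {text_to_code_ratio(page):.2f}")
```

A page dominated by CMS boilerplate will score close to zero, which is a useful prompt to ask the vendor what all that extra markup is doing for the reader.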
While I haven’t seen code trickery like cloaking for years, some marketers are still committed to black- and grey-hat SEO, meaning they are willing to bend or break rules for short-term gains. Unfortunately, Google eventually catches up, and once penalized, websites may not recover for months, if ever. Along the same lines, many sites are not properly secured, which Yahoo! recently demonstrated can be quite costly. While we’re on the topic of trickery, it should be noted that Google does not like duplicate content, whether intended or not.
The final area of oversight relating to code best practices revolves around keywords and structured data. Believe it or not, despite a proactive focus on SEO, many marketers fail to incorporate keywords where they matter most: title and header tags. Additional keyword-optimization opportunities include meta descriptions, ALT tags, anchor text and file names. Lastly, some marketers fail to properly utilize schema markup or rich snippets to help Google and other engines understand content and context. This is particularly useful for ecommerce product pages and location-based information like addresses and phone numbers.
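Schema markup for location-based information is typically embedded as JSON-LD inside a `<script type="application/ld+json">` tag. The helper below is a minimal sketch of generating a schema.org LocalBusiness block; the business name, address and phone number are invented for illustration.

```python
import json


def local_business_jsonld(name: str, street: str, city: str,
                          region: str, postal_code: str, phone: str) -> str:
    """Build a schema.org LocalBusiness JSON-LD payload for a web page."""
    data = {
        "@context": "https://schema.org",
        "@type": "LocalBusiness",
        "name": name,
        "telephone": phone,
        "address": {
            "@type": "PostalAddress",
            "streetAddress": street,
            "addressLocality": city,
            "addressRegion": region,
            "postalCode": postal_code,
        },
    }
    return json.dumps(data, indent=2)


# Fictional business details, used only to show the output shape.
print(local_business_jsonld("Example Roasters", "123 Main St",
                            "Portland", "OR", "97201", "+1-503-555-0100"))
```

Dropping the resulting JSON into the page head gives Google an unambiguous, machine-readable version of the address and phone number it would otherwise have to infer from the visible copy.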
While the need for quality content and clean code has not changed in the last 20 years of search engine marketing, credibility factors have changed dramatically. Since Google came on the scene in 1998, with an innovative algorithm that focused on the hub-and-spoke model of authority, SEO professionals have put a good deal of effort into securing inbound links. Unfortunately, too many marketers have forsaken quality links (from popular and reputable websites) for quantity (typically lower quality websites with questionable domain authority).
We’ve known for years that quality trumps quantity when it comes to inbound links. Yet some marketers ignore that insight and continue to purchase links from high-domain-authority websites, or even create or buy into link farms, a practice that has been out of vogue for nearly a decade but still holds allure for desperate marketers.
While links continue to be a major focus for SEO professionals, there has been discussion around the weighting of inbound links in Google’s algorithm. Recent research unveiled by Stone Temple Consulting at the SEMpdx Engage Conference indicates that links are still a significant factor in the ranking algorithm. The vote of confidence an inbound link (or citation) provides a website is still a key factor and should weigh heavily in marketing efforts.
One area that marketers continue to debate is the impact of SEO initiatives on graphic design, copywriting and coding. In the early days of Internet marketing, I would argue with my interactive agency counterparts when SEO best practices appeared to conflict with design best practices. That issue has largely resolved itself as Google has become more sophisticated, focusing on the user experience and high-value content. As a result, sites that are beautifully designed with unique content and artful coding tend to outrank sites designed solely for SEO and not the user.
Credibility covers a host of elements, many of which are unknown or misunderstood by unwashed marketers. One example is domain history, which includes the age of the domain and when it expires. Google likes old domains that expire many years from now, so stop auto-renewing annually and renew for 5-10 years at a time. Domain authority, available for free via Moz, indicates how likely a site is to rank for unbranded terms; a strong domain authority is over 50 out of 100. Off-site factors, including the quality and quantity of inbound links, weigh heavily in the Open Site Explorer domain authority score.
Most marketers understand the importance of social signals in search results, but typically focus only on sharing or securing links and mentions via Facebook and Twitter. Although Google+ has over 400 million active users, many marketers fail to understand the importance of a single user: Google itself. Google+ provides marketers a back channel to Google, informing the engine what active users consider important content.
A clear majority of businesses have a formal address; regardless, every business should claim and optimize its Google My Business local listing. This is particularly important for retail locations. Far too often, marketers overlook claiming and optimizing local listings, including Facebook, Yelp and other third-party feed providers for maps and business directories. Another factor related to local listings is reviews. Business reviews can make or break a business and must be monitored and managed. Marketers are particularly bad about ignoring negative reviews and failing to secure a meaningful number of positive ones, which directly impact revenue, per recent research: adding one additional star in the 5-star economy adds 9 percent to top-line revenue.
Learn from the mistakes of others and follow best practices when optimizing your website. I’ve included a few helpful SEO resources below to ensure you are up to speed on the latest SEO strategies, tactics and tools.