
SEO's unexpected consequences: More lucrative traffic

By Brandt Dainow

Search engine optimization can have unexpected consequences for a website. In theory, SEO is a set of techniques for tuning a website in order to improve the site's search engine listings; better listings mean more traffic, and more traffic translates into more business. In general, this works. However, I have discovered that it is possible to use SEO to improve the quality of search engine traffic without necessarily increasing the volume. I stumbled upon this by accident, so I don't have fancy theories about search algorithms to explain the whys and wherefores -- all I can do is offer my experience as a case study and let you draw your own conclusions.


One of my clients is London's City Cruises PLC, which runs a chain of tour boats and floating restaurants on the River Thames. One of the restaurant boats, the R.S. Hispaniola, has its own website. With 69 million results in Google for "London restaurant," competition for relevant listings is ferocious. However, having been one of the first restaurants in London with its own website, and with a continuous SEO program now approaching its 10th year, the site does extremely well in the search engines. A few years ago, it started doing a little too well.


Like many restaurant sites, the Hispaniola's contains sample menus, such as a lunch menu, dinner menu, Christmas menus, and so forth. A key USP (unique selling point) of the Hispaniola restaurant is that it has one of the best wine collections on the Thames, so there's even a detailed wine menu on the website, aimed at enticing wine buffs (and French tourists) to the restaurant.


All was going very well until about three years ago, when the conversion rate began to fall and the bounce rate began to rise. Investigation revealed a rise in people coming to the website searching for menu ideas -- suggestions for what to prepare for a Christmas dinner or recipe combinations for a dinner party. There are many websites specializing in this type of information, frequently recipe sites that include the recipes for the items listed in the menu. In other words, looking for menu ideas is a way people search for complementary recipes. The Hispaniola website, being better search-optimized than most menu and recipe sites and containing multiple menus, was outranking them for searches such as "Spanish menu" or "menu for Christmas dinner." Within a few months, more than 80 percent of the traffic coming from the search engines was what we came to call "menu" traffic.


However, the aim of the Hispaniola website is to gain bookings, so a conversion for http://www.hispaniola.co.uk/ is either a booking or an inquiry. Needless to say, people who came looking for a menu they could cook themselves weren't overly interested in booking a table; as soon as they arrived, it was obvious to them this was not the sort of site they were looking for, and they would leave. The bounce rate for menu traffic was around 85 percent, and the conversion rate was similarly low. In fact, I was surprised to see any conversions at all.


This wasn't a major problem -- it was easy to see what was going on, and easy enough to remove menu traffic from the web analytics data. We could continue to improve the site's conversion ratio (converting visitors into bookings), and the search listings that were used by our target market continued to improve. We simply accepted that there was a little "fog" around the stats that had to be removed each month in order to get a clear picture of what our target audience was doing.


Another key point in this case study is that the Hispaniola website is edited with Adobe Contribute. This is a simple, but excellent, content editing system for websites that don't need a database-driven CMS. But there is a minor downside to Contribute -- over time the code it generates can become messy. As formatting is changed, then changed again, Contribute can accumulate a change "history." For example, you can find HTML code like this in Contribute pages: <b><b><b></b></b></b>. This translates as "turn bold on, turn bold on, turn bold on, turn bold off, turn bold off, turn bold off"; in other words, "do nothing." This occurs because when you remove formatting, Contribute sometimes decides that, instead of removing the offending formatting command, it will add another command to counteract it.


In contrast, search engines like clean code. After a decade of being edited with Contribute, the amount of "rubbish" code had reached the stage where we needed to remove it in order to maintain the site's domination of relevant search listings. This is a fairly straightforward task, and was accomplished with a minimum of fuss.
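For readers who like to see the idea in code, here is a minimal sketch of that kind of cleanup. It is an illustration of the principle, not the actual process we used on the Hispaniola site, and it assumes Python with the BeautifulSoup 4 library installed.

    # Collapse nested duplicate formatting tags and drop empty ones,
    # e.g. <b><b><b></b></b></b> disappears and <i><i>x</i></i> becomes <i>x</i>.
    from bs4 import BeautifulSoup

    def clean_formatting(html, tag_names=("b", "i", "strong", "em")):
        soup = BeautifulSoup(html, "html.parser")
        changed = True
        while changed:
            changed = False
            for tag in soup.find_all(list(tag_names)):
                if tag.parent is not None and tag.parent.name == tag.name:
                    tag.unwrap()      # strip the redundant inner tag, keep its contents
                    changed = True
                    break
                if not tag.contents:
                    tag.decompose()   # remove a tag that wraps nothing at all
                    changed = True
                    break
        return str(soup)

    messy = "<p><b><b><b></b></b></b>Fine wines on the <i><i>Thames</i></i></p>"
    print(clean_formatting(messy))
    # -> <p>Fine wines on the <i>Thames</i></p>

Restarting the scan after every change is slow but safe, since the parsed element list goes stale once the tree has been modified.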


Within a few weeks the conversion rate had unexpectedly tripled. Search listings for targeted phrases did not change, nor did the total amount of search engine traffic. What did happen was that the menu traffic vanished almost overnight. Search engines had delisted the site for "menu" phrases. However, instead of a corresponding drop in search engine traffic, it continued at the same level. Obviously something had happened to the mix of visitors the search engines were providing.


The only genuine "demographic" anyone has for search engine traffic is the keywords people used to locate the website's listing. Keyword lists have incredibly long tails -- often 95 percent of the search phrases will have been used only once or twice. However you can analyze referring keywords for patterns by looking for critical words within the search phrases. For example, we looked at the number of visitors arriving from searches that include the words "London" or "restaurant." You can further trawl these samples for additional word inclusions, if you need to. Our analysis revealed that, while the total volume of searches had remained reasonably level, the number of relevant searches had increased and the number of irrelevant ("menu") searches had decreased. In effect, irrelevant menu traffic had been replaced by relevant "restaurant traffic." Keyword analysis showed that the search engines were matching the site against a wider range of relevant terms and a smaller number of irrelevant terms than they had previously.


This demonstrated that removing the rubbish code had enabled the search engines to get a more accurate assessment of the site's content -- improving the quality of search traffic dramatically. We went from 20 percent relevant traffic to 100 percent, while keeping the total volume at the same level. Bounce rates dropped through the floor as a result, and conversions went sky-high, tripling daily bookings within a few weeks.


There has always been a strand in SEO thinking that Google, in particular, didn't really like HTML, and that the ideal web page for Google would be a notepad file -- pure text and nothing else. Whether that is true or not, what the Hispaniola's case does demonstrate is that excess code reduces a search engine's ability to assess a site's content.


One might argue that the code we removed was "rubbish" and that "good" code doesn't get in the way. However, most code really says nothing about the content of a page and can't be used by a search engine at all. You may get the occasional exception, such as a heading tag that signals the text it wraps is important, but the bulk of a page's markup is simply noise the search engine has to filter out.

The big lesson is that SEO techniques aren't just for chasing No. 1 for some phrase in a search engine. SEO techniques enable search engines to get a more accurate picture of a website's content. Here the gain is not higher listings, but wider listings. Wider listings mean being listed for a broader range of (relevant) terms and thereby found by more (relevant) people.


You don't have to run an SEO campaign to benefit from SEO techniques. When an SEO specialist takes on a site for the first time, they undertake an optimization process: going through the site's code and tuning it for search engines as much as possible. It's reasonable to do this as a one-time exercise. Search engine optimization techniques are not just about more traffic; they can improve the quality of your traffic as well.


It's worth talking a little about the logic of search engine optimization and how it works. The most important thing to remember about SEO is that it is based on guesswork. Despite the claims of many "experts," the only SEO person who could really understand Google's ranking system would be someone who'd worked for Google on that ranking system. The logic behind a search engine's ranking is each search engine's most precious intellectual property, the single thing that distinguishes it from its competitors. Microsoft's own research for the launch of Bing showed that users have no brand loyalty to any given search engine -- provided the quality of the results is just as good. Search engines, therefore, guard their ranking systems closely. Though I have no proof, my suspicion is that Google's "ranking algorithm" consists of hundreds of individual rules, combined with thousands of manual tweaks. Furthermore, members of Google's ranking staff have told me at least one rule is modified every 48 hours, so whatever anyone knows can go out of date very quickly.


SEO work, at its best, is based on analysis of results and guesses as to the logic underlying them. However, we should remember that, since Google adds manual exceptions for each country and industry, there is a limit to how much a general knowledge of Google's central underlying logic (if there is any) can help in any specific case. The most successful SEO operatives I have met can't really explain why they do certain things; they just claim to have a "feel" for the way Google tends to do things. However, even for them, it's still a case of try something and see what happens. SEO work is always unpredictable.


This can make selecting someone to tune a website for search engines problematic. However, you don't need to go that far. If you look closely through Google's (limited) published guidelines for what it looks for when ranking sites, you'll discover that a site that meets disability access guidelines possesses almost everything Google is looking for. If your site meets the Web Content Accessibility Guidelines (WCAG) 1.0 or, even better, WCAG 2.0, your code meets the needs of Google. There are many free online WCAG tests that will tell you what needs changing, so that step is relatively easy.
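To give a flavour of what such a test checks, here is one of the simplest WCAG requirements, a text alternative for every image, expressed as a Python sketch. The file names are made up, and real validators check far more than this.

    # Flag every <img> that lacks alt text, one basic WCAG requirement.
    from bs4 import BeautifulSoup

    def images_missing_alt(html):
        soup = BeautifulSoup(html, "html.parser")
        return [img for img in soup.find_all("img") if not img.get("alt")]

    page = '<img src="hispaniola.jpg"><img src="wine.jpg" alt="The wine cellar">'
    for img in images_missing_alt(page):
        print("Missing alt text:", img.get("src"))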


Next, the site code needs to be evaluated to see how it can be trimmed. In my experience, any tool that evaluates your website's speed should help. Such tools provide useful suggestions for minimizing code, which is always a key area for improving site speed. For example, they will point to where multiple CSS or JavaScript files can be combined. Any coder should then be able to go through the site looking for ways to get code out of the page without damaging functionality.
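Combining stylesheets, for instance, can be as simple as concatenating the files and updating the pages to reference the merged one. Here is a sketch with invented file names; a production build would also minify the result.

    # Merge several CSS files into one to cut HTTP requests and page weight.
    from pathlib import Path

    def combine_css(files, output="combined.css"):
        merged = "\n".join(Path(f).read_text() for f in files)
        Path(output).write_text(merged)
        print(f"Wrote {output} ({len(merged)} bytes) from {len(files)} files")

    combine_css(["base.css", "layout.css", "menu.css"])
    # The pages' <link> tags would then point at combined.css instead of three files.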


Finally, ensure the content of the site contains a wide variety of relevant phrases.
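A quick way to audit this is to check which of your target phrases actually appear in a page's visible text. The phrases and sample page below are illustrative only.

    # Report which target phrases appear in the visible text of a page.
    from bs4 import BeautifulSoup

    TARGETS = ["thames dinner cruise", "restaurant boat", "private dining london"]

    def phrase_coverage(html):
        text = BeautifulSoup(html, "html.parser").get_text(" ").lower()
        return {phrase: phrase in text for phrase in TARGETS}

    sample = "<p>The R.S. Hispaniola is a restaurant boat moored on the Thames.</p>"
    for phrase, found in phrase_coverage(sample).items():
        print(("FOUND" if found else "MISSING") + ":", phrase)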


Getting the search engines to have an accurate view of your site is a good way to improve your bottom line without additional marketing spend. A more accurate profile of your content translates directly into more quality traffic. SEO is not just about getting ranked No. 1; it's also about ensuring your site gets the widest profile it can -- being found for the widest variety of relevant search terms. This translates directly into more business.


Brandt Dainow is an independent web analyst with companies throughout the English-speaking world and in the EU.



