
A Redesign Worthy of Google De-listing

By Jamie Roche

Those of us who rely on free listings on search portals like Google, Yahoo! or MSN have a problem. We want to create great, relevant and compelling pages for our prospects and our customers, and we want those pages to be listed by the search engines in order to get the content into the right people's hands. Unfortunately, we have limited and often contradictory information about what helps move us up the list, and about what might get us removed altogether, so the process of changing content in order to make it better is a risky proposition.

What the search engines tell us
Despite their best efforts, the portals regularly send our natural traffic to pages that are not the ones that we would choose. While we may like to change these natural search landing pages to improve the prospect experience, we are rightly fearful: If we change the page, will we lose our ranking? If we optimize the page for natural search traffic, will we get de-listed? 

It's difficult to know, because although the search portals share information about how to rank well in natural search results, that information is scarce (Google publishes a short set of guidelines) and some of the recommendations are contradictory.

For example, Google tells us that "if fancy features such as JavaScript, cookies, session IDs, frames, DHTML or Flash keep you from seeing all of your site in a text browser, then search engine spiders may have trouble crawling your site." But it also recommends that we "make pages for users, not for search engines." So the very things we're using to create a better experience for the end user (as Google recommends), such as JavaScript, Ajax, Flash and other features, are the things that may be causing trouble with the search engines.

At Search Engine Strategies in London a few weeks ago, a couple of Google folks reinforced this message when they suggested we build the site for our users. "Create the best experience for them that you can. The most important thing is that the landing page and the site are relevant and useful," they said. "Treat the spiders the same as any first-time visitor. Do not show one page to spiders and another to 'real' visitors or you may get in trouble."

And finally, "If your page has problems, we notify you and there are ways to get the issue ironed out." However, it was not clear whether they would notify you before de-listing you or after, and whether your page would be restored immediately or only after a formal appeal and review process. The conversation ended before I could gather this information.

What the experts tell us, and what we believe
An endless number of third parties fill in the blanks with their interpretations about how the search results are determined. A quick search on Google for "landing page optimization tips" returns 339 results, for example.

Then there is the "common knowledge." Most people believe the following:

  1. That the appearance and placement of keywords or phrases on your page can increase ranking

  2. That the number and quality of sites that link to your site, especially when the link includes your keyword or phrase, might be the most important factor

  3. That there is special magic that SEO firms know, including submitting the site or pages to indexes, as well as setting up meta tags, image tags and other hidden elements on the page to get the spiders to connect your page with your keywords or phrases

  4. And, perhaps the most ubiquitous belief of all: if your page ranks well, don't mess with it.

Despite our best efforts, we may lose rank anyway
Unfortunately, even if you don't mess with your page, your rank could change. That's because the rules that the search engines use to rank sites change as they discover newer, and presumably better, ways to rank results. A page that ranked well one day might drop to the third or fourth page the next, or get removed altogether. 

The result is that we live in a state of fear about changing well-ranked pages, while knowing that even if we don't change them we could lose rank anyway.

So when we realize that we want to change our natural search landing pages because they don't provide the best user experience, we wonder whether it might make sense to experiment with changes despite our fears.

Although Google suggests that we not present different content to search engines than we display to users (a practice commonly referred to as "cloaking"), some of us consider showing content on the page that reinforces the word or term the prospect searched for on Google, using an automated "keyword repeater" (sketched below). Others of us wonder whether we can make the page better in general simply by using a little "fancy stuff" that provides an improved experience for our visitors.
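As a rough illustration of how such a keyword repeater might work, the sketch below reads the visitor's search phrase out of the referring URL and echoes it into a slot on the page. This is a hypothetical sketch, not Google-sanctioned practice: the "q" parameter matches the referrer format the major engines use, but the element ID is invented for the example, and a spider arriving with no referrer simply sees the default copy.

```typescript
// Hypothetical "keyword repeater": echo the search phrase from the
// referring search-engine URL into a headline slot on the landing page.
// Assumes the engine passes the query in a "q" parameter and that the
// page has an element with id "search-echo" (both illustrative).
function repeatSearchKeyword(): void {
  try {
    const referrer = document.referrer;
    if (!referrer) return; // direct visits and spiders keep the default copy

    const query = new URL(referrer).searchParams.get("q");
    if (!query) return;

    const slot = document.getElementById("search-echo");
    if (slot) {
      // textContent (not innerHTML) so the query can't inject markup
      slot.textContent = `Results for "${query}"`;
    }
  } catch {
    // Malformed referrer: leave the default copy in place
  }
}

repeatSearchKeyword();
```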


So what can we do?
First, let me repeat: Doing the following may get you de-listed from Google. Still, I believe there are times when experimenting with organic landing pages, in the name of improving the user experience, is justified.

Here are a few things it might be worth trying:

1. Create a new page (one that you believe offers a better experience) and send 50 percent of your natural search traffic to it. Send the other 50 percent to the version of the page that the spiders picked. Measure the conversion impact of the new page. If the new page beats the old one by a lot, have it reviewed by SEO experts, implement their recommendations, give it the same name as the old page and hope for the best. 

Keep in mind that there is a line here. It's one thing if 50 percent of your traffic sees the same page as the spiders and 50 percent sees the new page. If 99 percent see the new page and 1 percent see the same as the spider, that's probably another story. You have to pick the level of risk you are willing to take.

(Note: This tactic is probably a pretty safe bet. Google has its own page tester, and it uses JavaScript. It would be pretty risky for Google to penalize you for doing what it encourages others to do.)
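For the curious, here is a minimal sketch of what that 50/50 split might look like on the client side, in the spirit of Google's own JavaScript-based page tester. The alternate URL and cookie name are assumptions for illustration; measuring the conversion impact per version is left to whatever analytics tool you use.

```typescript
// Minimal 50/50 split sketch for a natural-search landing page.
// "/landing-b.html" and the cookie name are illustrative assumptions.
const VARIANT_COOKIE = "lp_variant";
const ALTERNATE_URL = "/landing-b.html"; // hypothetical new page

function readCookie(name: string): string | null {
  const match = document.cookie.match(new RegExp(`(?:^|; )${name}=([^;]*)`));
  return match ? decodeURIComponent(match[1]) : null;
}

function assignVariant(): "a" | "b" {
  // Reuse a prior assignment so a returning visitor sees a consistent page
  const existing = readCookie(VARIANT_COOKIE);
  if (existing === "a" || existing === "b") return existing;

  const variant = Math.random() < 0.5 ? "a" : "b";
  document.cookie =
    `${VARIANT_COOKIE}=${variant}; path=/; max-age=${60 * 60 * 24 * 30}`;
  return variant;
}

// Spiders keep no cookies and typically don't run scripts, so they stay
// on the original page; half of real visitors hop to the new one.
if (assignVariant() === "b" && location.pathname !== ALTERNATE_URL) {
  location.replace(ALTERNATE_URL);
}
```

Storing the assignment in a cookie keeps each visitor on one version, which keeps the before-and-after comparison clean.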

2. Change a small amount of content on the page to create a clearly better user experience, but leave the bulk of the page unchanged.

The problem with many natural search landing pages is that they are highly specific and do not contain some of the more general content that lives on the home page. You might, for example, add a line or two of text at the top of the landing page that introduces the company or talks about your returns or security policy.

This tactic is riskier. If the content change is minor and clearly improves the user experience, then you may be able to justify the move to Google if the company questions the tactic. If the content fundamentally changes the page, there might be trouble.

3. Change the content to be more relevant to second-time visitors. Since the spiders do not accept cookies, they always see first-time visitor content.

It is fair to assume that if a person is manually checking whether the spiders see the same page as real people (highly unlikely, given Google's preference for automated systems), they will also see the first-time content. Only a minority of your visitors from the search engines will be visiting for the second time, but I believe this is a pretty safe way of creating a better user experience without getting on the wrong side of the search engines.
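A sketch of that returning-visitor idea, under the same caveats: the first visit sets a cookie, and later visits swap in copy aimed at people who already know the company. Spiders keep no cookies, so they always take the first-visit branch and see exactly what a first-time human sees. The cookie and element names here are made up for the example.

```typescript
// Show second-visit copy to returning visitors only. Spiders keep no
// cookies, so they always land in the first-visit branch and see the
// same page a first-time human sees. Names are illustrative.
const RETURN_COOKIE = "seen_before";

function hasVisitedBefore(): boolean {
  return document.cookie
    .split("; ")
    .some(c => c.startsWith(`${RETURN_COOKIE}=`));
}

if (hasVisitedBefore()) {
  // Swap the intro for copy that skips the who-we-are basics
  const intro = document.getElementById("intro");
  if (intro) {
    intro.textContent = "Welcome back. Pick up where you left off.";
  }
} else {
  // First visit (humans and spiders alike): mark it, leave the page as-is
  document.cookie =
    `${RETURN_COOKIE}=1; path=/; max-age=${60 * 60 * 24 * 365}`;
}
```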

The net is this: The idea of creating or changing a site specifically for the spiders has always been, and will continue to be, counter-productive. The search engines perform a needed service, and this is not about how we can get around them.

On the other hand, we need to create great, relevant and compelling pages for our prospects and our customers, and as the search industry expands, the search engines will need to begin to take into account the fact that relevant content means changing content. In fact, as Google and the others move toward offering personalized search results, it will soon become virtually impossible to have a consistent ranking, since personalized results mean that everyone will see different listings. Then, perhaps, it will become easier and less risky to experiment with changing content on organic search landing pages.

In the meantime, it may be worth the risk of losing a bit of rank in the name of a better user experience.

Jamie Roche is CEO of OTTO Digital and president and co-founder of Offermatica, a provider of on-demand marketing services, including testing and landing page optimization, that allow marketers to maximize revenue from their online advertising spend.
