
The grand conspiracy of SEO agencies

By Brandt Dainow

Search engine optimization isn't always about getting better listings in search engines. More frequently, SEO is quietly rescuing second-rate websites. Most websites are second-rate. They are built by second-rate coders who are either lazy, barely competent, stupidly arrogant, or a combination of all three. Many web designers fall into this category. Their sites often look great while being almost incomprehensible to search engines. These sites have been coded without any regard for best practice in HTML coding and without any recognition that every website has two audiences -- humans and search engines. It's possible to build a site that works properly, looks great to humans, but that a search engine can barely process. Most sites are like that.






Search engine optimization is only about outranking your competitors when some of your competitor websites have decent coding. Since the quality of coding on most websites is truly awful, your site can usually outrank the competition merely by being coded properly. Link building, developing authoritative status, and all the other talked-about SEO techniques are usually only required if your site has quality competition. Most of the time, all that is required to stand out from the ocean of second-rate coding that surrounds us is that your site be coded with some care and attention by people who know how to code properly.

Why most sites are second-rate


The key causes of the aforementioned problem are that HTML is fairly simple and web browsers are extremely forgiving.


HTML is a very easy coding system to learn. All you need to do is memorize a few dozen tags, and you can create pleasing and effective websites. This enables many people to teach themselves HTML and become web designers. Unfortunately, most of them don't learn it properly or have any real understanding of what they're trying to achieve. Most of them think building websites is about creating stuff that looks good, but it's more subtle than that. Web design is really about building things that look good on someone else's device, not your own, and that are useful. There are literally hundreds of ways you can code HTML to create the same visual effect, but many of them will produce a slow, unresponsive site that drives people crazy, or render the site impenetrable to search engines.


Furthermore, even if you make mistakes in the coding, browsers will bend over backwards to handle your error. This means you can learn HTML incompletely and make mistakes in your coding without ever knowing because the browser is constantly compensating for your errors.

Gross neglect of speed


The most common failing of the second-rate coder is bloatware -- vast quantities of overly long and complex code where a few efficient lines would do the same task. It's as if Charles Dickens was their model: "Why use two lines of code where 20 will accomplish the same task?" Sometimes it looks like these coders are being paid by the line, and sometimes it looks like they're just trying to make work for themselves to avoid doing anything more useful.


However, I suspect what is really happening is they're just grabbing the first solution that springs to mind and never raising themselves to the level of asking, "Can I do this better? Is there a more efficient way of coding this?" It's far easier to simply trot out code like a donkey with your brain in neutral.


Bloatware has a number of negative consequences. First, it makes the pages slower to download and harder for the browser to process. Both download and processing time are important factors in search engines' assessments of a site because they want to send people to faster sites. As mobile computing grows, this will become a bigger and bigger issue.


Speed seems to have been forgotten by the web design industry around the time broadband arose. Prior to that, in the 1990s, everyone was very aware that web pages took time to download and bore that in mind when designing websites. Speed was so central to design that major development tools like Dreamweaver kept a running total of download time in the status bar as you coded so that you could see the impact of your changes on the site's speed. Designers didn't like casting aside their lovely creations because they were too slow, but they accepted the commercial realities of the world they inhabited and learned to compromise between appearance and performance.


With the rise of broadband, the web design industry simply forgot about speed to the point of ridiculousness. These days, designers throw in multiple calls from browser to server during page rendering. They call down web fonts, third-party components in iframes, and so on. The consequence is that many websites are slower now, over high-speed broadband, than they were when we were all running 56K dial-up connections.


While designers might have forgotten about speed, users haven't. There's a direct connection between website speed and the site's appeal. Sites that render in under five seconds are four times more likely to get a conversion than sites that take longer. This situation is even worse in the mobile market. In mobile, the critical time span is only two seconds, and you will get 10 times more mobile conversions if you meet this limit. Since search engines want to send people to sites that people like, search engines reserve the higher rankings for faster sites.


Most sites can be dramatically sped up with no visual changes, simply by re-coding for speed. Many sites use multiple JavaScript functions, called from multiple JavaScript files. It doesn't take more than a few minutes to combine them all into a single file. That alone can double the speed of a website. The same is true of CSS files, which provide stylistic information. While web fonts can look great, they are slow because they have to be downloaded from the web. Even Google warns designers about this and recommends no more than one web font per website.
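To make that concrete, here is a minimal sketch (the file names are invented for illustration) of what the consolidation looks like in a page's head:

    <!-- Before: six separate HTTP requests just for scripts and stylesheets -->
    <script src="menu.js"></script>
    <script src="slider.js"></script>
    <script src="forms.js"></script>
    <link rel="stylesheet" href="layout.css">
    <link rel="stylesheet" href="colors.css">
    <link rel="stylesheet" href="fonts.css">

    <!-- After: the same code merged into one file of each type, so only two requests -->
    <script src="site.js"></script>
    <link rel="stylesheet" href="site.css">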


Most search engine optimizers know all this. This is why a good SEO company will want to "search optimize" your site's code. Its salespeople might tell you they're doing fancy stuff based on a detailed knowledge of Google's search algorithms, but what the SEO techie is really doing is nothing more than basic code optimization for speed. This is something every web designer used to do automatically. It remains something customers could reasonably expect from every web designer today as a basic part of the service. But they don't get it, so SEO companies charge for it.

The CSS problem


Bloatware is even more common in second-rate CSS coding. CSS is the system used to tell a browser what visual appearance page elements should have. Most page elements, such as paragraphs and headings, have a default appearance, but it's pretty horrible. CSS is used to add extra style commands to do things like change margins, create borders, specify fonts and colors, and so forth. Tricks like sliding menus or transparent backgrounds can also be accomplished with CSS.


To give an element a specific appearance with CSS, you have three choices. The most basic method is to create a new default appearance, which will apply to that type of element everywhere. I can, for example, specify that all headings will be 20 point and red. The downside is that this will apply everywhere, without exception. It is therefore more usual to create a "class" for that element. So I could create a red/20pt class for headings, and then apply it only to the exceptional headings that I wanted to give that appearance, leaving the rest unchanged. Thirdly, I could also create an "id," which has pretty much the same effect as a class.
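In sketch form, with invented class and id names, those three options for a red, 20-point heading look like this:

    <style>
      /* Option 1: change the default -- every h2 on the site becomes red and 20pt */
      h2 { color: red; font-size: 20pt; }

      /* Option 2: a class -- applied only where chosen, reusable any number of times */
      .alert-heading { color: red; font-size: 20pt; }

      /* Option 3: an id -- the same visual effect, but meant for a single element */
      #page-title { color: red; font-size: 20pt; }
    </style>

    <h2>Styled by the new default</h2>
    <h2 class="alert-heading">Styled by the class</h2>
    <h2 id="page-title">Styled by the id</h2>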


The standard says you use classes when you want to use the same formatting over and over, and ids when you only want to use that formatting once. Browsers are programmed to hold these objects in memory differently as a result, tuning the handling of classes for repeated use and treating ids as one-time disposable objects. However, many second-rate coders will use multiple ids with identical formatting when they should be using just one class.


While this might seem like a pedantic distinction, it has a huge impact on performance. Each CSS definition has to be downloaded by the browser and individually processed, so the more of them you have, the bigger your files, the slower the download time, and the longer it will take the browser to process the page before it can display it. It is therefore obvious you don't want any more CSS definitions than necessary. Yet website after website is bloated with multiple identical CSS definitions. Many pages contain a unique id for every element, yet all those different ids do exactly the same thing. I have seen web pages with literally hundreds of different ids, all designed to create exactly the same appearance. Each of them had to be individually processed, taking time, when one class could have done the same job in less than 1 percent of the time.
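The pattern looks something like this (names invented); the second version produces exactly the same appearance with one definition instead of hundreds:

    <!-- Second-rate: a unique id per element, each repeating identical rules -->
    <style>
      #para-1 { font-family: Arial; font-size: 12pt; color: #333333; }
      #para-2 { font-family: Arial; font-size: 12pt; color: #333333; }
      #para-3 { font-family: Arial; font-size: 12pt; color: #333333; }
      /* ...and so on, sometimes for hundreds of elements */
    </style>
    <p id="para-1">First paragraph</p>
    <p id="para-2">Second paragraph</p>

    <!-- Better: one class, defined once and reused everywhere -->
    <style>
      .body-text { font-family: Arial; font-size: 12pt; color: #333333; }
    </style>
    <p class="body-text">First paragraph</p>
    <p class="body-text">Second paragraph</p>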


What sort of idiot would code multiple copies of exactly the same thing? Didn't they realize the stupidity of what they were doing while they copy-pasted the same commands over and over? No -- they were simply second-rate coders doing a second-rate job. They weren't thinking about what they were doing at all; they were just doing it. If there were enough coders to meet the demand, we could do the world a favor and throw these people out into the street where they belong. But there's a shortage of coders, so SEO agencies pick up easy cash cleaning up the mess second-rate coders leave behind.

Lazy structure


CSS misuse gets worse when it comes to creating a proper "meta structure." The meta structure of a document is its major functional components. Web pages have headings and paragraphs, and often lists and tables. Search engines need to know which is which. A heading tells the search engine something about the paragraph underneath it and is clearly a different type of content from that paragraph. Search engines need to know whether copy is a heading or a paragraph so they can understand the structure of the page and how the different bits of copy relate to each other.


However, many web designers don't use paragraphs or headings at all. Instead of using the correct <p> and <h1> to <h6> tags, designed two decades ago for just this purpose, they use <div>. A <div> is just code for a block of space on screen. It could be a paragraph, a call-out box, a block of images, the entire page, or almost anything else. Designers use it because it has no default appearance, so it can be used to easily create whatever style they want.


There's nothing wrong with using divs -- unless they are being used instead of paragraph and heading tags. With CSS, it is possible to make a set of divs that create the visual appearance of headings and paragraphs, so the design works fine for humans. However, if your page has no headings or paragraphs, search engines won't be able to tell what's what in your page. If you have competitor websites that are doing HTML properly -- using heading and paragraph tags to guide the search engines -- they will outrank you simply because search engines will be more certain of their understanding of the content. As a result, much SEO work is simply replacing <div> tags with <p> and heading tags. Easy money, courtesy of second-rate coders.
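A simplified before-and-after of that replacement work (class names invented); the CSS and the visual result can stay exactly the same:

    <!-- Before: divs styled to look like a heading and a paragraph -->
    <div class="heading-style">Our services</div>
    <div class="body-style">We build websites for small businesses.</div>

    <!-- After: the same appearance, but the structure is now explicit to search engines -->
    <h2 class="heading-style">Our services</h2>
    <p class="body-style">We build websites for small businesses.</p>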


Not all divs are bad. Divs are essential for creating visual appearance around a block of paragraphs, or for things like rounded corners in borders. However, even here we see bloatware. Lazy designers are those who don't think, but instead simply write vast amounts of inefficient code.


A common bloatware technique is to use one div for one aspect of the appearance, such as a border style, then another inside it for the font formatting, then another inside that for the line spacing, then another inside that for a background color, and so forth. Some pages end up with 10 or 15 divs nested inside each other when one or two could have done exactly the same job. They might even repeat this overloaded structure on every single paragraph. Multiple divs like this are very complicated for the browser to process because CSS commands can override other CSS commands. This means multiple divs have to be cross-referenced with each other to determine the final appearance. This makes a noticeable impact on speed and can even overload some browsers completely so that they can't display properly at all. Good coders try to minimize the number of nested divs they deploy. Second-rate coders simply never think it through to this level.
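In sketch form (class names invented), the bloated version and the efficient version:

    <!-- Bloated: one div per style decision, nested four deep -->
    <div class="box-border">
      <div class="box-font">
        <div class="box-spacing">
          <div class="box-background">
            Some copy.
          </div>
        </div>
      </div>
    </div>

    <!-- Efficient: one div, one class that combines all four rules -->
    <div class="box">
      Some copy.
    </div>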

Ignorance on the server level


Second-rate work at the server level also provides SEO agencies with some easy money. There is a (small) set of "status codes" that web servers use to indicate the status of web pages to browsers and search engines when they ask for them. The most common code is 200, which means "everything is OK." After sending out a 200 status code, the server will follow up with the requested file.


The next most common one is 404, which means "can't find the file, server's running fine, no idea what's wrong." A 404 is not an appropriate code for a server to send out when a webpage has been removed from the site, moved, or renamed. A 410 is the correct code to say "permanently deleted," and there are codes in the 300 range for different types of file relocation or renaming. Yet most websites serve a 404 for pages that have been deleted, renamed, or moved. This drives search engines nuts. Getting a 404 is like your website saying, "Gee dude, I'm working perfectly, I think, but duh, don't know, like wow, I can't find the file, this is like, you know, check it out, something's not right, but -- hey -- don't ask me what's wrong 'cause I'm OK. Bye." In other words, 404 is a good way of saying, "My website's a moron."
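As an illustrative sketch (exact headers vary from server to server, and the notes in parentheses are annotations rather than part of the response), these are the status lines a search engine should see in each situation:

    HTTP/1.1 200 OK                  (page found; content follows)
    HTTP/1.1 301 Moved Permanently   (page has a new address, given in a Location header)
    HTTP/1.1 410 Gone                (page deliberately and permanently removed)
    HTTP/1.1 404 Not Found           (server cannot find the file and cannot say why)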


The 404 code is not a strategy -- it's the absence of thought by server admins, usually under the "guidance" of thoughtless web designers (who were the people who removed, renamed, or moved the page in the first place). If Google asks for a page and gets a 404, it has to come back and ask for the page again. If the page has been intentionally removed, it could waste six months repeatedly asking for the page. Multiply this over billions of websites, and it adds up to a significant cost for Google, a cost that could have been avoided if people merely ran their websites according to the standard they're getting paid to use. As a result, Google promotes websites that use status codes as they were designed and downgrades the listings of sites that just moronically 404 everything. So a good SEO agency can take more of your money for simply setting up a proper HTTP status code regime -- a task your web designers should have done as part of normal operations.


The problem with SEO agencies


You won't hear any of this from most SEO agencies. Most SEO agencies survive via their working relationship with design agencies. Customers usually seek out design agencies, relying on the designers to provide the SEO services. Sometimes these are delivered by in-house staff, sometimes via an outside SEO specialist.


If an SEO agency gets called in by the design agency, they're not going to damage that relationship by telling the client much of the fee is simply because the agency did a second-rate coding job. The SEO agency keeps its opinions to itself. Even if the design agency's management could handle being told, the SEO agency won't do it. Firstly, the SEO staff have to liaise directly with the coders. It makes for difficult working relationships if your liaison knows you've told his boss that you think his work is sub-standard. Secondly, it rarely does any good. Most design agencies lack suitable technical management structures to address this issue. Furthermore, most web design team leaders are just as second-rate as their staff. Coding quality is usually cultural to a business. In a good design agency, all staff at all levels will code well (after all it's not hard), while all the work coming out of a second-rate agency will be badly coded, no matter who codes it.


It's worse if the SEO staff are inside the design agency. There's no way they can tell the client much of the fee is simply for covering over second-rate coding by their co-workers. In most cases the SEO people won't even warn their own management because, for the reasons just listed, it won't make any difference, it won't increase the company's bottom line, nor will it be welcomed by managers who (usually) can't do anything about it.


How to fix this massive problem


If you can cut HTML code, you'll know whether you are guilty of these second-rate practices yourself. If you are, have a little respect for your craft and learn to code HTML properly.


If you're a customer, not a coder, firstly you have my sympathy. If you're unsure whether your sites are the work of second-rate coding, open the code up (a simple "view source" in most browsers). Do you see every paragraph starting with a <p> tag and every heading starting with a heading tag (<h1>, <h2>, etc.)? Or do you see line after line of div tags? This is not rocket science. If you can't find <p> or heading tags in your pages, you've got a second-rate website.


Remember: You can't tell a second-rate coding job from the visual appearance of the site. A site can look great and still be second-rate. It's just like a house. I could create a building with fancy windows and ornate architectural features -- something that looked great. But if it was made of rotten timbers, bad wiring, and leaky plumbing, it would still be a second-rate house -- albeit a good-looking second-rate house. A website can be just like that: a second-rate mess that looks great.


If you have a second-rate website, you're not ready for SEO. Your initial SEO expenditures will just get consumed bringing your site's code up to scratch. Don't join in the SEO conspiracy of "let's take money for doing what those second-rate web designers were supposed to do in the first place." Get rid of your second-rate code before you take your site to the search engines. The search engines will punish you if you don't.


Brandt Dainow is the CEO of ThinkMetrics.



Brandt is an independent web analyst, researcher and academic.  As a web analyst, he specialises in building bespoke (or customised) web analytic reporting systems.  This can range from building a customised report format to creating an...


Comments


Commenter: Adam Hutchins

2013, November 14

"What sort of idiot would code multiple copies of exactly the same thing?", "If there were enough coders to meet the demand, we could do the world a favor and throw these people out into the street where they belong.”, "They are built by second-rate coders who are either lazy, barely competent, stupidly arrogant, or a combination of all three. Many web designers fall into this category.”

Wow. Angry little man, aren't you? Arrogant academic. You should know better than to use "most" and other overstatements when making a point. BTW, stylistically it's best to use "first", "second", etc. rather than the superfluous, "firstly", "secondly", etc. I question the professional advice of someone who views everyone but himself, a moron.

Commenter: Al Loise

2013, November 11

"Link building, developing authoritative status, and all the other talked-about SEO techniques are usually only required if your site has quality competition. Most of the time, all that is required to stand out from the ocean of second-rate coding that surrounds us is that your site be coded with some care and attention by people who know how to code properly."-This is perhaps the most laughable statement in an article of exaggerations and hyperbole. I'm curious what are these verticals he is referring to that don't require link building, developing authoritative status etc? He's sitting on a gold mine!

Commenter: Roberta Oyakawa

2013, November 11

Being someone who has built many top quality, search engine optimized websites with extreme attention to detail, performance, error-free and well-commented code, and efficient CSS, I certainly understand the premise of your article and believe that some of what you characterize is going on out there. However, I find it hard to get past your delivery. Your harsh and sweeping characterizations of most web developers as lazy, stupid, barely competent, arrogant, second-rate, moronic, grossly-neglectful, thoughtless, idiotic donkeys with their brains in neutral comes across as contrived for the sake of getting attention. Maybe instead of scouring your thesaurus for every insulting adjective you could find, you should have put some time into brushing up on your html. Uppercase tags haven't been seen since pre-HTML 4.

Commenter: Obi-Wan Plunkett

2013, November 11

"Search engine optimization is only about outranking your competitors when some of your competitor websites have decent coding. Since the quality of coding on most websites is truly awful, your site can usually outrank the competition merely by being coded properly. Link building, developing authoritative status, and all the other talked-about SEO techniques are usually only required if your site has quality competition. Most of the time, all that is required to stand out from the ocean of second-rate coding that surrounds us is that your site be coded with some care and attention by people who know how to code properly."

So true.. however.. not ALL agencies are like this.. some have SEO departments that work with the UX department, Program/Project Manager, Copywriting and the Creative department.

Website redesign = Client: "WOW we have a website that gets search traffic now"

Every single time.. Brand Stabilization and Classification.. then Competition. =)

p.s. it's not a scam.. sometimes the lead SEO is the person making sure the budget isn't wasting the clients money.. problem is, too many people don't get it.. but it is that simple..

The 4 C's - Classification, Customers, Clarity, Content


"SEO isn't a project, it's a process.." (Google it)

Commenter: Nick Stamoulis

2013, November 11

While some SEO agencies might be "in cahoots" with the design firm not all of us are! We have an in-house designer/developer because we know how much the design and back-end of a website can affect it's SEO performance. If a site has some serious design issues that are going to hinder our work I want the client to know so they don't start blaming me when things aren't going according to plan!

Commenter: Tom Pick

2013, November 11

Hi Brandt - thoughtful and detailed post as always, but I must take exception to your basic premise that "Most of the time, all that is required to stand out from the ocean of second-rate coding that surrounds us is that your site be coded with some care and attention by people who know how to code properly." Between greater awareness of SEO principles and Google reducing the amount of space it devotes to organic results (so it can sell more ads), SEO has become more difficult, and vital. Solid SEO is a mixture of art and science. Writing clean code is a big part of the science, and you've done a bang-up job of explaining that here. But a designer can develop the overall page structure, understand the various audiences for a website and their different information needs, do keyword research, write copy that appeals both to humans and search engines, develop social authority for the content, etc. The art of SEO is equally important.