
What Not to Do When Analyzing Your Site

Brandt Dainow

You should not analyze which search engines are sending you traffic. You should not analyze what paths people take through your website. You should not analyze the average duration of a visit, or the average number of pages your visitors read. None of these things will help you in the slightest, and in some cases they will actively mislead you.


Averages confuse
Average statistics for websites are commonly given for duration (the amount of time people spend on your site), and for average page views (the number of pages people read during a visit). This information is extremely misleading and will probably cause you to make incorrect decisions. The problem with an average is that it can be horribly skewed by the extremely high durations (like the person who spends three hours on your site because they are trying to copy the design) or -- more commonly -- extremely low numbers. 


Two types of people visit your site: people who glance at your home page and leave very quickly because the site is not what they are looking for ("scanning visitors"), and people who are interested in what your site offers and spend time in it ("committed readers"). Averaging all the visitors to your site into one number mixes scanning visitors in with committed readers, which tells you nothing. Most people do not appreciate this fusion and assume the numbers describe committed readers. In reality, blending the two into one figure drags down the amount of time people believe their committed readers are spending on the site. This has been going on for so long, and is so widespread, that most web designers believe people spend four to six minutes on a website, when in fact most committed readers spend much longer. This is especially true when visitors are looking to spend serious money, such as on a travel or loan site.


The reason you need to separate scanning visitors from committed readers is that they call for different kinds of management within the site. Improving scanning-visitor performance is about how well the site sells itself to new arrivals as a place to stay. Improving committed-reader performance is about how well the site then sells your products or services as something to buy.


It is therefore absolutely essential that you separate out committed readers from scanning visitors before you calculate average duration and average page views. This will typically show you that those people who do spend time on the site are spending twice as long as you had previously imagined. 


I strongly recommend you think about what constitutes a scanning visitor. It's not just someone who only looks at one page. If someone views three pages in 25 seconds then leaves, it's pretty obvious they just scanned those pages looking for something but didn't find it. You need to decide -- for your site -- how long it should take a visitor to work out what the site has to offer.
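To make the point concrete, here is a minimal sketch of segmenting visits before averaging. The 30-second cutoff and the visit records are illustrative assumptions -- derive your own threshold from your site, as discussed above.

```python
# Segment visits into scanning visitors and committed readers, then
# compare the blended average duration with the committed-only average.
visits = [
    {"duration_sec": 8,   "pages": 1},   # scanner: bounced off the home page
    {"duration_sec": 25,  "pages": 3},   # scanner: flicked through three pages
    {"duration_sec": 540, "pages": 7},   # committed reader
    {"duration_sec": 720, "pages": 12},  # committed reader
]

SCAN_THRESHOLD_SEC = 30  # assumed cutoff; tune for your own content

def split_visits(visits, threshold=SCAN_THRESHOLD_SEC):
    """Separate scanning visitors from committed readers by time on site."""
    scanners = [v for v in visits if v["duration_sec"] < threshold]
    committed = [v for v in visits if v["duration_sec"] >= threshold]
    return scanners, committed

def avg_duration(visits):
    """Mean time on site in seconds; 0.0 for an empty segment."""
    return sum(v["duration_sec"] for v in visits) / len(visits) if visits else 0.0

scanners, committed = split_visits(visits)
print(f"blended average:   {avg_duration(visits):.0f}s")    # misleadingly low
print(f"committed average: {avg_duration(committed):.0f}s") # what readers really do
```

On this toy data the blended average is roughly half the committed-reader average, which is exactly the distortion described above.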


What is interesting about committed readers is how long they spend on the site, and whether there is a difference between committed readers who convert and committed readers who do not. It is not uncommon to find that the committed readers who spend the longest on your site are the ones least likely to convert. This usually indicates that they were committed to the idea of buying from you but were unable to find what they were looking for, even after a detailed examination of your site. If you believe you had what they were looking for, then you probably have a navigation problem. It is thus important to cross-reference the average duration for committed readers with the number of pages they read, and to cross-reference this with where they came from, what search phrase they used, and so forth. Once you get down to this level of analysis, you may find it better to do some form of quartile analysis and contrast committed readers who take your target action with committed readers who do not, on many different levels.
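A first cut at that comparison can be sketched as follows. The reader records and the "converted" flag (marking the target action) are invented for illustration:

```python
# Contrast committed readers who converted with those who did not,
# on average duration and average pages read.
readers = [
    {"duration_sec": 300,  "pages": 5,  "converted": True},
    {"duration_sec": 420,  "pages": 6,  "converted": True},
    {"duration_sec": 900,  "pages": 20, "converted": False},  # searched hard, gave up
    {"duration_sec": 1100, "pages": 24, "converted": False},
]

def segment_stats(readers, converted):
    """Average duration and page count for one conversion segment."""
    group = [r for r in readers if r["converted"] == converted]
    n = len(group)
    return {
        "avg_duration": sum(r["duration_sec"] for r in group) / n,
        "avg_pages": sum(r["pages"] for r in group) / n,
    }

buyers = segment_stats(readers, True)
non_buyers = segment_stats(readers, False)
# Long, page-heavy visits that end without a conversion often point to a
# navigation problem rather than a lack of interest.
```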


Search engines don't matter
Most of the time it doesn't matter which search engine somebody came from. The only time this is important is when you are spending money on search engine optimization or PPC advertising and need to analyze what you are getting for those investments.


What is much more important when analyzing the sales performance of your site is the phrase people use in their searches. In order to analyze your phrases easily, you need this information amalgamated across all search engines. What you want to know is what percentage of the people who searched for "buy a widget online" bounced when they hit your site, and what they landed on when using that phrase. The difference in sales performance between different search engines is practically non-existent. The difference in sales performance between different search phrases can be astronomical, even when all of those phrases are relevant to your site. I have seen conversion rates from PPC advertising range from 0.5 percent to 50 percent on the same site. If you've got a spread that wide you want to know about it, and it really doesn't matter which search engine these people saw the ads in.
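The amalgamation step is simple: collapse the engine dimension and rate each phrase on its own. The referral records and field names below are assumptions, standing in for whatever your stats package exports:

```python
# Conversion rate per search phrase, amalgamated across all engines.
referrals = [
    {"engine": "google", "phrase": "buy a widget online", "converted": True},
    {"engine": "bing",   "phrase": "buy a widget online", "converted": True},
    {"engine": "google", "phrase": "what is a widget",    "converted": False},
    {"engine": "yahoo",  "phrase": "what is a widget",    "converted": False},
    {"engine": "google", "phrase": "what is a widget",    "converted": True},
]

def conversion_by_phrase(referrals):
    """Ignore the engine field; return {phrase: conversion rate}."""
    totals, wins = {}, {}
    for r in referrals:
        phrase = r["phrase"]
        totals[phrase] = totals.get(phrase, 0) + 1
        wins[phrase] = wins.get(phrase, 0) + (1 if r["converted"] else 0)
    return {p: wins[p] / totals[p] for p in totals}

rates = conversion_by_phrase(referrals)
# Buying phrases and browsing phrases can sit at opposite ends of the spread.
```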



Paths don't matter
Many stats packages will give you an analysis of the most common navigation paths through your website. Unless you've got an incredibly restrictive navigation structure, this is a complete waste of time. You will be lucky if the most common path people take through your site accounts for more than one or two percent of all visits. If you bothered to list every distinct navigation path taken through your site, you would probably find almost as many paths as visitors. One of the things that underpins the nature of the web is the ability of hyperlinks to let people follow information associatively, in a way that mirrors their thinking. This is what makes the web so great, so fundamentally different from printed material.


The only time you want somebody to follow a linear path is when they absolutely have to. Such occasions occur around forms. Typically if somebody wishes to buy something, or sign up for something, or get a quotation, they have to go through a series of forms. The reason it's a series is that no designer wants to present a potential customer with all the questions they need to answer on a single form -- that form would be intimidating. Thus we lead people through a series of successive forms. This is exactly when you need to analyze a page-by-page path. But what you need to know is clickthrough rate, not what path people took -- there is only the one path. What you need to know here is how many people on page A followed through to page B, and how many of those followed through to page C, and so on. If you're not happy with the clickthrough rate on one of these pages, you need to cross-reference success and failure here with the amount of time spent on this page and previous behavior.


This is not to say that you do not wish to analyze some things regarding how people moved through your site. But you don't need to look at this at a super-detailed level. On most sites, all the pages in a given section are the same design and function. There is usually a logical progression from one section to the next. For example, people may move from a listing of product categories to category pages that list individual products and then to those individual product pages. In such a scheme they will probably bounce back and forth between these different levels in your hierarchy. Trying to follow these individual paths is confusing.


What you are really concerned with is how well each level in the hierarchy performs. This means you need to look at overall stats for each level, not specific paths. If you're happy at that level, then you move on to looking at how individual pages in a particular level vary from the norm for that type. For example: is there a particular product listing page that has a significantly lower or higher clickthrough rate than the rest? If so, what is different about that page from its fellows? If its performance is lower, that difference is something to eliminate. If its performance is higher, that difference is something to emulate on its fellow pages. You need to amalgamate all the visitors across all of the individual paths in order to draw conclusions about the effectiveness of your design. Looking at individual paths in this respect will only obscure things.
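Spotting a page that varies from the norm for its level can be sketched like this. The page names, clickthrough rates, and the 10-point tolerance are all illustrative assumptions:

```python
# Flag pages whose clickthrough rate deviates from the average for
# their level in the site hierarchy.
category_pages = {"garden": 0.34, "kitchen": 0.31, "office": 0.12, "toys": 0.35}

def outliers(pages, tolerance=0.10):
    """Return pages whose rate differs from the level average by more
    than `tolerance` (an assumed cutoff -- tune it to your site)."""
    norm = sum(pages.values()) / len(pages)
    return {name: rate for name, rate in pages.items()
            if abs(rate - norm) > tolerance}

flagged = outliers(category_pages)
# "office" stands out: work out what differs on that page, then either
# eliminate the difference or emulate it, depending on direction.
```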


Conclusions
Just because a stats package is giving you a metric doesn't mean it's of any use to you. You need to know when to drill down on common stats, such as average duration, and when to look at things differently. The secret is to decide, in advance, what it is you want to improve in your site. You need to be able to formulate this in business terms. Don't look at your stats until you know what you're looking for, or you risk being misled or confused. If you're looking at how effective your web pages are as sales tools, you don't want to count people who bounced from your site and never saw those pages. In this respect it doesn't matter what search engine they came from. What matters is what they were looking for and whether you gave it to them. The core of web analysis is a simple 1-2-3:



  1. Do I have what people want?

  2. Did I show it to them?

  3. Did they buy it?



Brandt Dainow is CEO of Think Metrics, creator of the InSite Web reporting system.
