Each web analytics package measures pages and visitors a little differently, and it is no secret that these programs report different numbers. The reason lies in their different measurement methods. How far apart are the results? Which figures do particular packages under- or over-report, and why? These questions are answered by a careful comparison of seven programs: Clicktracks, Google Analytics, IndexTools, Unica Affinium NetInsight, WebSideStory HBX Analytics, Omniture SiteCatalyst, and WebTrends.
The packages were compared as part of the independent 2007 Analytics Shoot Out test conducted by Stone Temple Consulting. Besides the raw statistics, the study also evaluated ease of installation and use, the core functionality of each package, its ability to solve practical problems, its unique strengths and weaknesses, its underlying technology, and how well it met users' needs.
Seven web analytics packages were tested on four sites. Here are the main results.
Unique visitors

All of the web analyzers reported different traffic numbers for each site. The gap is usually 7–13%, but sometimes reaches 20%. According to the testers, the reasons are both technological and methodological: different cookie types, different definitions of the daily audience (the last 24 hours versus a calendar day), and, of course, different package configurations.
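The "last 24 hours versus calendar day" distinction alone can change the unique-visitor count. A minimal sketch of the two counting rules, with invented visitor IDs and timestamps purely for illustration:

```python
from datetime import datetime, timedelta

# Hypothetical hit log: (visitor_id, timestamp) pairs.
hits = [
    ("a", datetime(2007, 6, 1, 23, 30)),  # late on June 1
    ("b", datetime(2007, 6, 2, 0, 15)),   # shortly after midnight
]

def uniques_calendar_day(hits, day):
    """Unique visitors whose hits fall on the given calendar date."""
    return {visitor for visitor, ts in hits if ts.date() == day}

def uniques_last_24h(hits, now):
    """Unique visitors seen in the rolling 24 hours ending at `now`."""
    cutoff = now - timedelta(hours=24)
    return {visitor for visitor, ts in hits if cutoff < ts <= now}

# Calendar day June 2 sees only visitor "b"; a rolling 24-hour
# window ending at noon on June 2 also catches "a" from the
# previous evening, so the two rules report 1 and 2 uniques.
```

Two analyzers applying these two rules to identical traffic would disagree even before cookie handling or configuration differences come into play.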
HBX Analytics usually produced the lowest visitor counts of all the programs. On the two sites with heavy advertising, the most "generous" packages were Clicktracks and Google Analytics, possibly because they account for advertising traffic best. On one other site, WebTrends came first by a wide margin of 20%; however, since that was the only site on which WebTrends was tested, it cannot be concluded that the program always reports inflated numbers.
Page views

Discrepancies in this metric are much smaller than for unique visitors, which makes sense: cookies and related settings play no role here, and the only question is whether the tracking script fired or not. However, when comparing not overall site statistics but page views for individual visitor segments, the same discrepancies tied to visitor counting reappear. For example, one program may define a "session" as a period with no more than 30 minutes of inactivity, while another may define it quite differently. The gap between the packages' figures can then reach hundreds of percent.
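The session-definition point above can be made concrete. A minimal sketch of sessionization by inactivity timeout, with the timestamps invented for the example; note how changing only the timeout changes the session count for the very same hits:

```python
from datetime import datetime, timedelta

def count_sessions(timestamps, timeout_minutes=30):
    """Group one visitor's hit timestamps into sessions: any gap
    longer than `timeout_minutes` starts a new session."""
    timestamps = sorted(timestamps)
    if not timestamps:
        return 0
    sessions = 1
    gap = timedelta(minutes=timeout_minutes)
    for prev, cur in zip(timestamps, timestamps[1:]):
        if cur - prev > gap:
            sessions += 1
    return sessions

# One visitor's hits: gaps of 20 and 45 minutes.
hits = [
    datetime(2007, 6, 1, 10, 0),
    datetime(2007, 6, 1, 10, 20),
    datetime(2007, 6, 1, 11, 5),
]

# A 30-minute timeout splits this into 2 sessions (the 45-minute
# gap exceeds it); a 60-minute timeout yields a single session.
```

Any per-session metric, such as page views per session, inherits this disagreement, which is one way the "hundreds of percent" divergence can arise.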
A detailed report with full test results will be published in July 2007.
via Stone Temple Consulting