Stonetemple's team implemented 7 web analytics solutions simultaneously and compared the results. Here is one of the comparison tables:
The team found the following error sources:
1. One user visiting from more than one computer would likely be counted as multiple visitors.
2. Multiple users sharing one computer would likely be counted as a single visitor.
3. Counting of questionable sessions
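The first two error sources are both side effects of cookie-based visitor counting: the tool counts browser cookies, not people. A minimal sketch of the overcounting case, assuming a simple cookie-per-machine model (all names and data here are illustrative, not from the study):

```python
# Sketch: cookie-based "unique visitors" vs. actual people.
# Each browser/computer stores its own cookie, so one person using
# two machines shows up as two visitors. All data is made up.

def count_unique_visitors(hits):
    """Count distinct cookie IDs, as a typical analytics tool would."""
    return len({hit["cookie_id"] for hit in hits})

def count_actual_people(hits):
    """Ground truth an analytics tool normally cannot see."""
    return len({hit["user"] for hit in hits})

hits = [
    {"user": "alice", "cookie_id": "home-pc-1"},  # Alice at home
    {"user": "alice", "cookie_id": "work-pc-7"},  # same Alice at work
    {"user": "bob",   "cookie_id": "bob-pc-2"},   # Bob on his own PC
]

print(count_unique_visitors(hits))  # 3 -- cookie count
print(count_actual_people(hits))    # 2 -- Alice was double-counted
```

The second error source works in the opposite direction: several people sharing one computer (and therefore one cookie) collapse into a single "visitor", so the two effects can partially mask each other in aggregate numbers.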
At the end of part 1 of the study, the authors note:
"Should we be alarmed at this level of variance in the results? Not really, but it is important to understand that these sources of error exist, and it's important to understand how to deal with them."
I really think this study is great, and it is not alarming for a web analyst(!), because even if the numbers are 30-40% lower than those of other solutions, he/she can still do trending, A/B testing, or work on KPIs.
Having said that, I also believe that the authors of this study protected the different vendors too much. In my opinion it is not acceptable that anybody outside the web analyst "community" has to deal with numbers that could be up to 30-40% off. This study should be used to bring the top 10 vendors to the table and figure out how they can solve the issues that are in their hands (e.g. how to count questionable sessions). Otherwise the entire web analytics industry could keep on losing its credibility.