Monday, February 27, 2006

Google Analytics and Privacy: A Quick International Comparison

As we know, privacy laws vary throughout the world. Companies doing business internationally need to pay attention to them to stay on top of the appropriate country-specific requirements.

As an example, take a look at the one-paragraph Privacy section of the Google Analytics US Terms of Service, and compare it to the four-paragraph Privacy section in the TOS for the UK. The common language covers the fact that they collect information, using a cookie as an identifier, and that they do not want site operators to send them any personally identifiable information. A good idea, to be sure.

There are a few differences worth noting:
  1. They include a sample statement to be included in the privacy policy of the site using GA. They call out that the servers collecting the data are stored in the United States, and that they are collecting data for reports and for "providing other services relating to website activity and internet usage". Hmmm. They also note that they "will not associate your IP address with any other data held by Google".
  2. They have a short paragraph noting that Google may review your website to make sure you've posted an appropriate policy.
  3. Finally, they note that they will retain all of the data.
I wonder why they don't have the same TOS for the US? It all seems reasonable and clear (except that "providing other services" stuff...). The sample privacy statement is straightforward, and it makes sense that they would want to follow up to see if folks are actually including a privacy statement. And, we definitely know they are collecting and keeping the data...all part of the circle of analytics.

Oh, and perhaps Xavier can come out of retirement to confirm that the French version does indeed look like the UK version (it looks the same to me...but of course, I can't read French).

Filed in:

Saturday, February 18, 2006

More Analysis on Competing on Analytics

Until recently, I hadn't received many comments on this blog. The Web Analytics Forum that EricP set up provides a terrific format for discussions in the industry (and thank you to Robbin for the nice comment!).

However, two folks wrote some nice comments on my Competing on Analytics post, and I wanted to make note of them for you to check out. Zach Gemignani is with the Juice Analytics team, and Aurélie Pols is with the fine WebAnalytics.be European Dream Team in Belgium.

Both comments are very thoughtful and well articulated. Zach wrote a good follow-up post as well, providing a deeper critique of the research.

Thanks for the notes...keep them coming!


Wednesday, February 15, 2006

Measuring the Impact of Measure Map

I've been pondering the reasons for the Measure Map acquisition by Google. As we know, Google purchased Urchin back in March of 2005, then re-launched it in November as Google Analytics. It's a terrific analytics offering. So, why buy another analytics tool?

There are two compelling reasons. First, there's Jeffrey Veen and team. Clearly, they are doing some very slick, and very interesting stuff. Google is lucky to have them on board.

Second, I believe it's not about web analytics (or even blog analytics), it's about buzz analytics. Measure Map has two important analytics dimensions that distinguish them from other analytics tools: Links Out and Comments.

Links Out is a pretty easy one for web analytics vendors to solve. It requires some additional JavaScript, and some additional data collection. But Measure Map noted that it was a must-have feature from the start. Why? Because they get the fact that your blog is one link in a larger conversation chain. The more you understand the chain, the more you can manipulate it to your advantage.
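To make the idea concrete, here's a minimal sketch of the core check an outbound-link tracker has to make before logging a hit (my own illustration, not Measure Map's actual code; the function name is hypothetical):

```javascript
// Decide whether a clicked link leaves the current site -- the core test
// an outbound-link ("Links Out") tracker runs before logging a hit.
function isOutboundLink(href, currentHost) {
  var url;
  try {
    // Resolve relative links against the current site
    url = new URL(href, "http://" + currentHost + "/");
  } catch (e) {
    return false; // not a navigable URL
  }
  // Only http(s) destinations count as outbound page views
  if (url.protocol !== "http:" && url.protocol !== "https:") {
    return false;
  }
  return url.hostname !== currentHost;
}
```

A real tracker would attach this check to click events and fire a small tracking request (an image beacon, say) before the browser navigates away.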

Analyzing Comments is a more difficult proposition. It requires a better understanding of how blogs are structured, and how to extract information. But how cool is this...it gives you the ability to start to articulate new social networking dimensions, including "conversation index" values (frequency, volume, etc.), and a better understanding of those who are commenting (more than just paying attention).
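As a toy illustration of what a "conversation index" might look like (this is my own made-up metric, not anything Measure Map has published), consider comments per post over a period:

```javascript
// Toy "conversation index": average number of comments per post over a
// period. Higher values suggest the blog is generating more discussion.
function conversationIndex(posts) {
  if (posts.length === 0) {
    return 0; // no posts, no conversation
  }
  var totalComments = posts.reduce(function (sum, post) {
    return sum + post.comments;
  }, 0);
  return totalComments / posts.length;
}
```

A richer version could weight comment frequency, unique commenters, and back-and-forth threads, which is exactly the kind of social-networking dimension this data opens up.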

One further bit of analysis on the importance of this acquisition. I'm a huge fan of memeorandum, which has featured the conversation around this acquisition as the top meme for the past day. Now, this analysis may not be totally fair, as there are many variables involved, but take a look at the discussion as of this morning. Compare this to when Google Analytics launched. GA was a big deal at the time (remember that Urchin was a released product too), but it wasn't generating the same level of buzz (and by the next day, it wasn't even a major item on memeorandum).

It's the buzz. This will be fun to watch as it matures.

Tuesday, February 14, 2006

Google and Measure Map

Measure Map has joined the Google team today, which is fantastic news for them, and great news for the analytics industry in general. MM has been very focused on delivering smart and simple analytics to bloggers. I've enjoyed using their tool thus far, and I hope they continue to innovate (and deliver on that API!).

Congratulations to Jeffrey Veen and the team at Adaptive Path.


Super Bowl Analysis: Two Views

There has been a lot of analysis of this year's Super Bowl...and I'm sure there was some analysis of the actual game as well. Those who watch the advertising world seemed disappointed with the ads this year. TiVo has emerged as the analysis authority on the event/game, noting in their press release that:
  • Ameriquest ran two ads that were the favorites among the TiVo rewind/replay crowd
  • TiVo viewers hit replay 30 times on average
  • And, "A controversial touchdown call in the second quarter spurred almost as much replay and rewind activity as the most popular commercials"
What I found most interesting is that they used data from a sampling of 10,000 households, out of 4 million subscribers, to determine this information. That's a pretty small percentage, but it's not clear how many TiVo viewers tuned into the game, so it might be statistically more meaningful than it looks.

But I'm curious why they need to sample the data in order to pull these statistics. I believe they have full information on all of the activity of their subscribers...so why sample? Are there statistical reasons for taking an assumed-complete data set and choosing to throw out a certain "random" percentage in order to make sense of the data?
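For perspective, a simple random sample of 10,000 is actually quite powerful: the worst-case 95% margin of error for a proportion is about ±1%. A quick back-of-the-envelope sketch using the standard formula (my own illustration):

```javascript
// Worst-case 95% margin of error for a proportion estimated from a
// simple random sample of size n (p = 0.5 maximizes the variance).
function marginOfError(n) {
  var p = 0.5;  // worst-case proportion
  var z = 1.96; // z-score for 95% confidence
  return z * Math.sqrt((p * (1 - p)) / n);
}
```

marginOfError(10000) comes out just under 0.01, i.e. about ±1 percentage point. So sampling may simply be cheaper than crunching every subscriber's clickstream, with little loss of precision.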

Another way of looking at the game is to analyze a core human need: wings. That's right, if you're going to have a Super Bowl party, you gotta have some wings. Need a recipe? Search for one, and you'll possibly end up at the best wing site on the internet: cluckbucket.com. As it turns out, many people did just that during the week leading up to the big game. Scott "The Cluckbucket" Roth notes on his blog the following terrific analysis of the week prior to the game, compared to the week before:
  • Number of "Unique Visitors" - Increased 142% (from 586 to 1,422)
  • Number of "Pages Viewed" - Increased 180% (from 1,661 to 4,647)
  • Number of "Visitors who visited more than once" - Increased 120%
  • Number of "New Visitors" to the site - Increased 153% (from 522 to 1,325)
  • Number of "Returning Visitors" to the site - Increased 70% (from 54 to 92)
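Those period-over-period figures can be sanity-checked with a one-line helper (a sketch of my own; rounding can differ by a point from the quoted numbers):

```javascript
// Percent increase between two period totals, rounded to the nearest
// whole percent (the same way the figures above are quoted).
function percentIncrease(before, after) {
  return Math.round(((after - before) / before) * 100);
}
```

For example, pages viewed going from 1,661 to 4,647 works out to a 180% increase, matching the quoted figure.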
He also provides some notes regarding his incoming search traffic.
"As usual, the big three search engines were responsible for referring the bulk of the traffic throughout the week; Google - 463 visits, Yahoo - 327 visits, and MSN - 312 visits. It's interesting to note that I do a lot better on what I consider core search phrases (like "chicken wings" and "chicken wing recipes") on Yahoo and MSN, but my phrases referred from Google seem to be more sporadic. Overall, "chicken wing recipes" was responsible for sending the most traffic to my site, to a tune of 129 visits (mostly from Yahoo and MSN where I perform in the top 5 listings)."

That's pretty impressive considering he hadn't done any additional advertising or outreach. Nice work Scott. I've got to try one of these recipes! Mmmmmm....wings.

Thanks Clay!


Monday, February 13, 2006

Research: Competing on Analytics

The fine folks at Babson College have a research report out entitled "Competing on Analytics". You can get a copy via the Harvard Business School, or via the authors' blog.

The report does not focus on web analytics specifically, but it does provide a few good reference points that apply in our industry. They describe five stages of "analytical competition" into which a company can fall, the highest being "analytical competitors". These are the companies who "have embarked upon analytical competition as a primary dimension of strategy". They are clearly using analytics in all areas of their business, and analytics are driven from the top down as part of the culture of managing their business.

There is an interesting section on "predictive modeling and optimization techniques" that discusses the value of analytics when testing ideas and concepts (think multivariate testing). They note that "Capital One, for example, conducts more than 30,000 experiments a year with different credit card interest rates, incentives, direct mail packaging and other parameters...".

The authors make one argument that isn't completely clear to me. They suggest that "if analytics are to be a company's basis for competition, and if they are to be broadly adopted across the firm, it makes more sense to manage them at an enterprise level." I understand the need to keep costs as low as possible by re-using technology where possible (hardware and software standards for example). That can only be done effectively through an enterprise approach. However, I don't believe analytics must be managed at the enterprise level to make this happen. Perhaps I'm not fully grasping their argument, but I don't think centralized IT organizations can really manage the rapid development and innovation required to help internal organizations succeed.

Any comments on this subject?


Wednesday, February 08, 2006

Web Analytics Testing: Excluding Your Own Traffic

Fiddler is an amazing tool. Written by the very generous Eric Lawrence (in his spare time), it allows you to see all HTTP requests and responses passing through your PC (via WinINet). This is very powerful, as it gives you a single tool to watch traffic to/from IE, Firefox, and even your web analytics widget.

Not only can you watch traffic, but you can modify it as well. This is where it gets really cool, and very useful for testing functionality on your site. To illustrate this, let's say you were using Google Analytics, Measure Map and WebTrends to analyze the data from your site (purely hypothetical, I assure you ;-), and you wanted to browse your site without sending hits to those tools. Oh, and you really didn't want iTunes tracking your usage either. Here's what you'd do.

1) Install Fiddler
2) Select Rules -> Customize Rules...
3) In the editor, find the section "OnBeforeRequest"
4) In that section of the code, you'd add the following:

// Don't send Measure Map requests for site XYZ (XYZ is a number)
if (oSession.url.indexOf("/XYZ")>-1 &&
    oSession.url.indexOf("tracker.measuremap")>-1) {
  oSession.oRequest.FailSession(403, "Measure Map tracking", "Measure Map tracking");
}

// Don't send GA requests for site NNNNN-N
if (oSession.url.indexOf("NNNNN-N")>-1) {
  oSession.oRequest.FailSession(403, "GA Tracking", "GA Tracking");
}

// Don't send WebTrends requests for DCSID
if (oSession.url.indexOf("DCSID")>-1) {
  oSession.oRequest.FailSession(403, "WebTrends", "WebTrends");
}

// Don't send tracking information to Apple (actually 2o7.net) when using iTunes
if (oSession.host.indexOf("metrics.apple")>-1) {
  oSession.oRequest.FailSession(403, "Apple tracking", "Apple tracking");
}

Drop me a comment or email if you have any questions on this. It's a very powerful tool that provides a lot of additional functionality as well. Thanks EricL!


Thursday, February 02, 2006

Google Analytics - Site Overlay

Google has updated their Google Analytics service, adding back in their Site Overlay feature. Unlike the overlay features from other vendors, it operates within a frame of the reporting UI. From what I can see, four measures are included: Clicks, Clicks %, G1/Clicks, and Avg. Score. I don't have any goals set up, so I don't have a ton of data to review at this time.

Here's a screenshot of the site overlay frame using data for this blog:

From what I've seen so far, they've made the tool relatively simple and limited in scope. Per their help, it's currently limited to "static pages with unique links to content located elsewhere on the website". They also note that the overlay is "not currently able to work with the following types of content:
  • Javascript links
  • CSS content
  • Flash navigation
  • downloadable files (.pdf)
  • outbound links
  • frames
  • auto redirects."
As I'm checking out the service, I'm intrigued by something. As you navigate inside the overlay window, your browser isn't actually requesting content from your site directly, but rather, the data is coming through Google's systems - I'm assuming they're acting as a proxy for the content. I had to test it to make sure the site wasn't coming from a cache...and it's not...interesting.

