Friday, December 22, 2006

SEM and Analytics Example

This is an interesting example of how important it is to understand your analytics data to make sense of unusual activity. From the PepperjamBlog:

"One of our clients spends about $2,500 per day on Yahoo Search Marketing. In this particular instance, the client is opted into the Yahoo Content Network and spends about 20% of their daily budget or $500 per day in content. One of our senior SEM’s was reviewing daily click activity and noticed that the account, which the previous day was funded with $10K was offline and the entire available balance depleted. After further investigation we found that two particular keywords within the content network accounted for almost the entire $10K previous day deposit in less than 12 hours (rough estimate of time)."


Monday, December 18, 2006

Microsoft's New Homepage - and AJAX Analytics

After a lot of very hard work by many folks up in Redmond, Microsoft has released their new homepage design. From their FAQ:

"On December 14 we introduced a new home page for Microsoft.com. The new page incorporates months of research, testing, customer feedback, and refinements. We hope the new page makes it easier to find what you’re looking for on Microsoft.com, and that you find new items of interest along the way."
The new site is very sophisticated, as is the push for analytics behind the scenes. If you'd like to see how sites are leveraging analytics to track AJAX navigation, use your favorite HTTP debugger (Fiddler is still my fav - thanks Eric L) and watch the analytics data sent to the Microsoft team for analysis (via m.webtrends.com). Specifically, note how the AJAX menu (reached through the menu in the upper-right part of the homepage) tracks each "event" (mostly clicks in this case) inside the menu, including which "view" (upper right) you choose.

Also note how each AJAX event in the menu references navigation structure ("ngn" parameters), and utilizes an event handler parameter WT.dl to define the type of event. These are a couple of examples of methods for tracking more sophisticated Web 2.0 events to provide rich analytics on the backend.
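To make the idea concrete, here is a minimal sketch of how a client-side beacon like this could be assembled. The parameter names WT.dl and "ngn" come from the post above, but the endpoint path, the event-type codes, and the ngn value are all my own assumptions for illustration, not the actual values Microsoft or WebTrends use.

```python
from urllib.parse import urlencode

# Hypothetical event-type codes for the WT.dl parameter; the real
# values used in the Microsoft implementation may differ.
EVENT_TYPES = {"pageview": "0", "ajax": "21", "mouseover": "60"}

def build_tracking_hit(base_url, page, event_type, extra=None):
    """Assemble the query string for a client-side tracking beacon.

    base_url is the collection endpoint (e.g. an image request sent
    to m.webtrends.com); extra carries additional parameters such as
    the "ngn" navigation-structure values mentioned in the post.
    """
    params = {"dcsuri": page, "WT.dl": EVENT_TYPES[event_type]}
    if extra:
        params.update(extra)
    return base_url + "?" + urlencode(params)

hit = build_tracking_hit(
    "http://m.webtrends.com/dcs.gif",      # hypothetical endpoint path
    "/en/us/default.aspx",
    "ajax",
    extra={"ngn": "siteguide.products"},   # hypothetical ngn value
)
print(hit)
```

The key point is simply that every Ajax event sends one extra parameter declaring what kind of event it was, so the reporting side can separate it from a pageview.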

Congratulations to the team at Microsoft!

Thanks William...nice work!


Monday, December 04, 2006

WebTrends and ClickShift = WebTrends Dynamic Search

We announced this morning that WebTrends and ClickShift have joined forces. From the release:
"The acquisition will strengthen the company's existing product family, WebTrends Marketing Lab, with the addition of WebTrends Dynamic Search™"
This is excellent news for both companies. ClickShift has created an excellent product that fits perfectly with the expanding WebTrends Marketing Lab environment.

Congratulations and welcome to John, Leo, and the rest of the ClickShift team!!!


Wednesday, November 08, 2006

Internet Broadcasting Notes Online Traffic for U.S. Elections

...more good use of technology in politics this week!

Internet Broadcasting leveraged WebTrends Analytics™ On Demand to provide real-time analytics across its network of 79 sites. According to data from WebTrends, Internet Broadcasting’s Election Day traffic produced:

* 3.5 million unique visitors – the company’s highest day in history
* 4.8 million page views to the Politics section – a record total nearly three times larger than 2004 Election Day traffic
* 24.8 million page views – nearly 50 percent higher than the 2004 Presidential Election Day
* 50-150 percent traffic spikes above typical page view levels for top sites


Monday, November 06, 2006

Technology And Elections

How about an election technology discussion that doesn't include Diebold in it? :-)

Here's something very cool. Last week, Bonnie Bogle noted on the N-TEN blog that the Institute On Money In State Politics (followthemoney.org) released a public-facing set of APIs for accessing their incredibly rich database. Since it's election week, I thought I'd put up a web widget showing "Contributions By Party And Status For Oregon" which you can see in my sidebar, and below:

Now, how is that for a great use of technology!

Don't forget to vote tomorrow (or if you're in Oregon, by tomorrow!)!


Friday, November 03, 2006

Measure Map Insight - Interview with Jeff Veen

If you haven't already done so, check out Brian Oberkirch's interview (mp3) with Jeff Veen (who is now with Google, formerly Adaptive Path). Jeff put together the very cool Measure Map analytics package.

About 1/3 of the way into the interview, Jeff digs into some of the history of building Measure Map. As a blogger, he essentially couldn't find a tool to help him. He notes:
"I was blogging all the time...none of us really had a sense of how well we were doing"

As is the case with many of us, he wanted to know:
"What is my connection to the rest of the community out there"..."What effect am I having"

So, unable to find a tool (or more specifically, the right tool), he set out to build one, with the assumption that with blogging tools like Blogger, MovableType, WordPress, etc.:
"You could help about 90% of the bloggers by creating a tool that integrated into just 4 or 5 products"

For sure, Measure Map is a different kind of a tool. It's specifically designed for bloggers. It doesn't even use typical jargon to describe activity on your site. Jeff notes:
"The big irony of Measure Map is that it's a web analytics tool that makes no mention of pageviews whatsoever."

And the interface is designed around blog posts, rather than URLs, etc.
"One of my first ideas when we first started talking about Measure Map was the idea that every post on your site had a corresponding page on...Measure Map"

Where the product really shines is with its visualizations. It provides simple, friendly visual representations of the data via a clean interface spiced with Flash and Ajax.
"I didn't care what we used...sometimes it made sense to use flash for a relatively complex series of animations or interactions, but the flash would then talk to the javascript, back and forth...and most people can't really tell as they're navigating through the product which parts are Flash and which parts are Ajax"

One of the things Measure Map did early on that I liked was to create feeds for the data to easily retrieve it. I use the feeds for the widget I created that pulls Feedburner and Measure Map data together.

Can you tell I'm a Measure Map fan? Thanks to Marko for pointing me to the interview.


Tuesday, October 31, 2006

Webcast - Competing On Analytics

Professor Tom Davenport of Babson College will be speaking today (10am PST) on a topic he's written on previously in his Competing on Analytics paper. The subtitle for the presentation is, "Move Faster, Accomplish More, and Avoid Mistakes by Learning From The Best". Could be interesting.

The Juice Analytics folks will be covering the event.


Tuesday, October 17, 2006

Tracking Ajax

As we know, many Web 2.0 implementations require some new tricks from our analytics packages. As Eric P and others have written, the old-guard measure of the "hit" (or something like it) may be making a comeback as we ponder how (and what) to measure in this more dynamic world.

Enter Ajax
One of the more interesting Web 2.0 application types to tackle is Ajax implementations. Built on the idea that you don't have to refresh the entire page to present updated information to the visitor, Ajax is really gaining some steam in the industry. And rightfully so...it's really quite powerful, very efficient, and is a great example of a "rich" user experience. The basic idea is that instead of a link rendering a new page request, a link only fetches a smaller amount of content that is displayed back to the visitor quickly. To confuse things a bit, this content can come in a variety of formats (XML, HTML, JSON, etc.), and can be requested a number of different ways (XMLHttpRequest, IFrame, etc.). From an analytics perspective, it doesn't really matter. What's important is that the visitor has taken some action, and we likely want to know what the action was (and sometimes we want to know what was inside the response too...I'll get to that in another post).

A major issue we face then is how to treat pages vs. the (many) potential requests that follow on the page. These secondary (and subsequent) requests are not pageviews in the traditional sense. So, what do we call these non-pageviews? What do we track, and how? Of course, it depends on your needs, but let's dive into an example to see how others are thinking about it.
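One simple way to frame the problem: given a stream of tracked requests, decide which count as traditional pageviews and which are the secondary, non-pageview events. A minimal sketch, with a hypothetical hit stream and invented event labels:

```python
from collections import Counter

# Hypothetical hit stream: each tuple is (url, event_type). Only
# "pageview" events count toward traditional pageview totals; the
# rest are the secondary Ajax/mouseover requests discussed above.
hits = [
    ("/", "pageview"),
    ("/menu/latest-releases", "mouseover"),
    ("/siteguide/products", "ajax"),
    ("/siteguide/windows", "ajax"),
    ("/windows/vista", "pageview"),
]

counts = Counter(event for _, event in hits)
pageviews = counts["pageview"]
secondary = sum(n for event, n in counts.items() if event != "pageview")
print(pageviews, secondary)  # 2 pageviews, 3 secondary requests
```

Whatever the labels end up being called, the essential move is the same: keep the pageview count honest by routing everything else into its own buckets.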

Example: Microsoft
One of the more important properties on the internet is working on a new, soon-to-be-released Ajax facelift. The very smart folks over at Microsoft have a major redesign in the works that is very impressive, and it offers us a very rich example of Ajax from which to learn. If you go to the current Microsoft Homepage, you'll see an example of Ajax in the tab interface about two-thirds of the way down the page (in the center). If you hover over the "Latest releases" menu heading, you'll see some new text and a new image appear. The images are dynamically returned to the browser in an Ajax call. Side note: if you stay hovered over one of the menu headings for a second, a tracking hit is sent...but only once per menu item to avoid overkill on mouse-hover tracking.

However, if you want to find out more about Windows Vista, you might click on the upper left navigation "Windows" link, then click "Windows Vista". Three pageviews altogether.

Now try this. Go to the new Microsoft "preview" site, which features some new Ajax look-and-feel. Now if you want to find out about Windows Vista, you might use the "Microsoft Site Guide" menu on the upper right, selecting "Products & Related Technologies". This makes an Ajax request, but instead of refreshing the page, presents you with a new pop-up window over the top of the page. Now select "Windows", and the window is updated with more dynamic information. Now select "Windows Vista" and you are sent to a new page. Two pageviews with two extra requests.

You could argue that the pop-up menu looks like a page, and you might want to track it as a pageview. But you could also argue that, since you were still technically on the homepage, it should be tracked as a separate type of request. The third click, which simply rendered new data in the menu, was certainly different from a full page request.

Tracking Ajax Requests
In order to track these requests differently, we have defined different event "types" via a new parameter. In this case, the parameter is WT.dl, with values to describe a pageview or an ajax request (or a mouseover, or an RSS feed view, or a "start" of a video...you get the idea). Within the analytics tool (WebTrends - no great surprise there ;-), we simply leverage the powerful analytics reporting (or new Marketing Lab Warehouse) to track these events independently (or together if needed). Pageviews remain as pageviews. Other requests are defined appropriately and tracked as needed to provide accurate reporting.
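On the analysis side, a log processor can map the WT.dl values back to event names before reporting. A sketch, using the same hypothetical code table as above (the actual WebTrends values may differ); hits that arrive without a WT.dl parameter default to a traditional pageview:

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical WT.dl code table; the real values may differ.
DL_CODES = {"0": "pageview", "21": "ajax", "60": "mouseover", "40": "rss"}

def classify_hit(request_url):
    """Return the event type for one tracking request, defaulting
    any hit without a WT.dl parameter to a traditional pageview."""
    qs = parse_qs(urlparse(request_url).query)
    code = qs.get("WT.dl", ["0"])[0]
    return DL_CODES.get(code, "unknown")

print(classify_hit("http://m.webtrends.com/dcs.gif?dcsuri=/&WT.dl=21"))
print(classify_hit("http://m.webtrends.com/dcs.gif?dcsuri=/"))
```

This keeps pageviews as pageviews, while every other request type can be reported independently or rolled together as needed.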

This type of tracking leverages the client to send the request to the analytics tool (I'll call it Asynchronous Client Tracking of AJAX, or ACTAjax - ok, this is why I'm not in Marketing :-) There are other methods for collecting data, though, which might become necessary when you don't want the client to send the data (security, privacy, or a scenario that just doesn't allow it), or when the client isn't a traditional browser (tracking API requests, RSS feeds, some mobile devices, etc.). For these reasons, you may also want to have other collection strategies ready (server-side requests, traditional logfile analysis, etc.). I'll talk more about these types of requests in a subsequent post.

More Info
An excellent team of speakers is putting together a series of great presentations for our upcoming Marketing Performance In Action conference in Orlando (Oct 24-25). Our own Clay Moore will be speaking with Brant Barton (co-founder and VP of Business Development at Bazaarvoice) about tracking Web 2.0 technologies (Optimizing Your Web 2.0 Programs with WebTrends) in their session on Wednesday.

I'll be at the MPIA conference as well - if you're going to be there, please drop me a line (elbpdx @ gmail) and let's be sure to connect!

Otherwise, if you have any best practices, or other excellent ideas...feel free to leave a comment here!

Filed in: analytics, ajax


powered by performancing firefox

Thursday, October 05, 2006

Non-Profit Startup Analysis

I wanted to share the final results of an entrepreneurial non-profit effort to benefit Schoolhouse Supplies here in Portland. As I mentioned previously, the general idea was to offer school supplies online so parents could avoid some of the tedious back-to-school shopping in the fall, and benefit a great organization in the process.

What made this effort different than other online stores is:
1) The proceeds benefit Schoolhouse Supplies - a non-profit "free" store for teachers
2) The list of supplies was customized, based on the exact needs of the teachers
3) The supplies were distributed (by volunteers) directly to the child's classroom
4) The online store was built by volunteers (many thanks to Nate M and David M!)
5) The online store featured some .NET, a little JAX (we didn't have time for the Asynchronous part, but it was close, and we'll get it for next year ;-), the fabulous salesforce.com as a backend, AuctionPay for credit card processing, and of course, WebTrends for some slick analytics.

Our original goal this year was to pilot the program at one elementary school and one middle school. It became obvious very early on that the middle school wasn't going to fit with our model. This program works well for elementary schools because most of the supplies are "shared" in the classroom. That is, the students bring the supplies, and they are all combined together, and the teacher uses them throughout the year as needed. In middle school the students hang on to their supplies (in lockers and VERY heavy backpacks!). So, we stuck with one elementary school.

The elementary school we chose has a population of less than 500 students. For this pilot year, I was hoping we'd get a total of 50 orders so we could figure out the flow of the online application, and whether the logistics of delivering the supplies worked. I was very anxious to hear the feedback from the parents on the idea.

Ok, so, how'd we do?
8 Weeks Online Accepting Orders
129 Total Supplies Sets Ordered
$3,482 in Orders
85 Total Families Ordered
8 Middle School Students Volunteered 4+ Hours Each to Distribute Supplies

In short, it was a great success. We surpassed all of our goals, and got some absolutely terrific feedback from parents and teachers. Sweet!

Of course, I've got to include some analytics! Over the 8 weeks, we had a total of 305 visits. One of the lessons learned is that parents are generally not thinking about school supplies over the summer. We tend to wait until the last week or two before school starts to worry about supplies. We had a deadline to buy the supplies, so we had to cut off the ordering early. A couple of parents sent out an email blast to other parents near the end, and we saw a nice uptick the last few weeks.

The ordering process had essentially 6 steps after the Welcome page: parents had to choose the class the child was entering; enter the student's name; review the "cart" (if a parent had more than one child in the school, this is where they could add their other child(ren)); enter checkout information; verify/approve the order; then receive confirmation. Of the 305 total visits, 86 converted. 28.2% conversion! Very nice.
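Using the figures above, the conversion math (and the average order size, from the totals earlier in the post) checks out:

```python
# Figures from the post: total visits, converted visits, and revenue.
visits = 305
orders = 86
revenue = 3482  # dollars

conversion_rate = orders / visits * 100   # percent of visits converting
avg_order_value = revenue / orders        # dollars per converted visit

print(round(conversion_rate, 1))  # 28.2
print(round(avg_order_value, 2))  # 40.49
```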

We did not attempt to optimize the site for search engines as we weren't focused in this area at all. We assumed that only individuals who heard about the site from our limited communications and word of mouth (buzz!) would visit the site. But of course, a few folks found us via searches, so it's good to capture what was on their mind (search phrase) when they arrived for planning for next year.

All in all, it was a great pilot. It was a terrific win for teachers, parents, and Schoolhouse Supplies (and therefore a win for many students in need). Many pieces had to come together to make this work well, and thanks to many folks (Nate, David, Nick, Courtney, Liz, Kara, Simone and her amigos) for giving this effort its positive spirit! Also, thanks to WebTrends and the Salesforce Foundation for offering free services for this non-profit startup.

Our current plan is to roll this out to more schools next year and figure out some scaling issues (mostly on the distribution side...the technology is already set to scale world-wide! :-) After next year, I anticipate it can grow very quickly.

Filed in: analytics, salesforce.com, nonprofit, web+analytics


Wednesday, September 27, 2006

URL Best Practices

The folks over at SEOmoz have a smart post noting best practices for URLs. The guidelines they outline are spot on for overall usability and are obviously helpful with regard to SEO.

The list includes the following categories:
Describe Your Content
Keep it Short
Static is the Way & the Light
Descriptives are Better than Numbers
Keywords Never Hurt
Subdomains Aren't the Answer
Fewer Folders
Hyphens Separate Best
Stick with Conventions
Don't be Case Sensitive
Don't Append Extraneous Data
The suggestions are also very helpful from an analytics perspective. Building a structure that is easy to read and navigate is also going to be helpful for defining reporting needs based on specific segments, content groups, scenarios, paths, etc. It's great advice all around.
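Several of the guidelines (short, descriptive, hyphen-separated, lowercase, no extraneous data) can be captured in a simple slug routine. A sketch of the idea, not the SEOmoz recommendation verbatim:

```python
import re

def make_slug(title, max_words=5):
    """Build a short, lowercase, hyphen-separated URL path segment:
    descriptive words only, capped at max_words, no punctuation or
    extraneous parameters."""
    words = re.findall(r"[a-z0-9]+", title.lower())
    return "-".join(words[:max_words])

print(make_slug("URL Best Practices for SEO & Analytics"))
# url-best-practices-for-seo
```

A URL built this way is easy for visitors to read, and just as easy to slice into content groups and path reports later.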

Filed in: analytics, search


Tuesday, September 19, 2006

Search Results and User Behavior

ClickZ ran an article yesterday on a recent Google "experiment" to tie ad position to user behavior. This puts a new twist on analyzing search referrer traffic as there are more variables thrown into the mix. The article notes:
"The Google quality score uses text ad relevance, historical keyword performance, landing page quality and other factors to determine ad placement. Yahoo's new Panama search ad platform takes into consideration similar factors to rank text ads."
I wonder to what extent Google (and Yahoo) will pass along the "other factors" data to its customers (in the referrer, or in the ad/campaign performance data).

Filed in: analytics, search


Wednesday, September 13, 2006

Anti-Spyware Cookie Detection

Check out this very interesting study on the current status of Cookies Detected by Anti-Spyware Programs conducted by Ben Edelman for the folks over at Clicks2Customers.

This is important information that marketing folks have been following for a while now.  There's plenty of research on the topic of cookie deletion and blocking (several good links in the article).  I think perhaps we're all pretty tired of talking about it!  But there continue to be new tools introduced and existing tools updated to further "help" our visitors detect, block and delete "harmful" cookies, including analytics cookies.

The article notes some options in the conclusion, suggesting:
"Could shorter cookie durations address ad systems' needs, while reducing user privacy concerns? If most conversions occur within days of an ad impression, a far longer cookie duration may be unnecessary and needlessly privacy-invasive. Similarly, it seems separating cookies into advertiser-specific chunks -- either first-party cookies, or path-specific third-party cookies -- might blunt many privacy concerns, while preserving the tracking many advertisers consider most important."
The article also includes a "revenue loss calculator" that approximates affiliate/merchant lost revenue "as a function of the advertiser's conversion speed and the percent of the advertiser's cookies that are removed".  Clever.
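The shorter-duration idea from the quote is straightforward to sketch: if most conversions happen within, say, 30 days, a first-party analytics cookie can expire then rather than years out. The header below is illustrative only; real analytics tools pick their own cookie names and lifetimes.

```python
from datetime import datetime, timedelta, timezone

def analytics_cookie(name, value, days=30):
    """Build a Set-Cookie header for a first-party cookie that
    expires after `days` instead of persisting for years, per the
    shorter-duration suggestion in the article."""
    expires = datetime.now(timezone.utc) + timedelta(days=days)
    return (f"{name}={value}; "
            f"Expires={expires.strftime('%a, %d %b %Y %H:%M:%S GMT')}; "
            f"Max-Age={days * 86400}; Path=/")

header = analytics_cookie("visitor_id", "abc123", days=30)
print(header)
```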

Filed in: analytics


Tuesday, September 12, 2006

Busy but Good Dashboard

The Dashboard Spy does such a great job of digging up interesting visualizations. I really like the recent post showing the Airline Executive Dashboard. It's a great use of sparklines, white space, and limited color to get a lot of detail across. It even won an award...very nice.

Filed in: analytics


Friday, September 08, 2006

Mobile devices coming (fast!)

MSNBC has a good article on some of the cool goodness to be found on mobile devices in Japan.  From the article:
"Thanks to early investments in high-speed mobile networks, Japan’s cellular telephone industry is about a year and a half ahead of America’s. Everywhere you look, it shows."
Europe is also ahead of the U.S. in this regard with great innovation at Nokia, Sony Ericsson, Orange, BT, and others.

Your visitors are reaching your sites using wireless devices with increasing frequency. And we're just getting started! Having a plan for dealing with browsers on wireless devices is a tremendous amount of work in itself, let alone handling the variables involved in correctly supporting JavaScript, cookies, source IP addresses, latency issues, and other potential hazards for analytics.

Are there any interesting challenges out there that folks might be facing with this evolving technology?

Filed in: analytics

Monday, August 28, 2006

Analytics and Results

Two weeks ago Google Analytics opened up their doors to anyone who wants an account. Last week the GA blog noted how to link your AdWords account to your GA account.

Peter Da Vanzo over at the SEO/Marketing News & Opinion blog notes that Google needs to be more transparent about how they'll use the information they are collecting. Definitely a good point.

They've made it quite easy to link this information together. Why? One way to slice this is to suggest that it's a decent approach to being a good business partner. They are asking you to tell them what your goals are - and then trying to help you reach those goals with their tools. And reporting back as to whether you've attained your goals. That sounds good.

Yet, some wonder: Should one company be responsible for both setting prices they charge you and reporting your revenue results? What happens when they know that you're over-achieving your goals? What happens when they discover that your AdWords ads are working better than your ads from a competitor of theirs?

Of course, all of this is more likely about stretching the business model for pricing ads. Once they can move away from CPC to cost-per-action pricing for the masses, it will leapfrog them out ahead of their competition once again.

Eric P started a great thread on defining a new standard for measuring Web 2.0. Our analytics industry may want to make sure "conversions" and "actions" are well-defined standards too, so we can successfully and consistently report ad (or other campaign) performance information for our customers.

Filed in: analytics, google



Friday, August 04, 2006

IAB to Define a Click

The IAB announced the formation of an industry-wide Click Measurement Working Group to define what a "click" is.  As Business Week notes:
"There's a lot at stake. Click fraud—using software or low-cost workers to repeatedly click on banner ads in order to artificially inflate the success of an ad campaign—cost advertisers just shy of $1 billion last year"
It sounds like a good idea to me, although I still think charging for ads on a cost-per-action basis is a better long-term solution. Forget about a simple click from a search page - let's talk about what that visitor did (the action) after clicking through to the site. That's where the value is.
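A toy comparison shows why cost-per-action blunts click fraud: fraudulent clicks are billed under CPC but cost the advertiser nothing under CPA, since they never convert. All the numbers below are invented for illustration.

```python
def cpc_cost(clicks, price_per_click):
    """Under CPC every click is billed, fraudulent or not."""
    return clicks * price_per_click

def cpa_cost(clicks, fraud_rate, conversion_rate, price_per_action):
    """Under CPA only real conversions are billed; fraudulent
    clicks never convert, so they cost the advertiser nothing."""
    real_clicks = clicks * (1 - fraud_rate)
    return real_clicks * conversion_rate * price_per_action

clicks, fraud = 10_000, 0.15   # 15% fraudulent clicks (invented figure)
cpc = cpc_cost(clicks, 0.50)             # every click billed at $0.50
cpa = cpa_cost(clicks, fraud, 0.02, 25.0)  # only real conversions billed
print(cpc, cpa)
```

Under CPC the advertiser pays for the 1,500 fraudulent clicks; under CPA those clicks simply vanish from the bill.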

Filed in: analytics


Monday, July 31, 2006

Analyzing StumbleUpon

My friend and colleague Martin Waugh dropped by the other day to say that he had noticed a new Digg shadow effect. His fantastic photography site, LiquidSculpture, gets dugg every now and then. This, of course, brings a lot of traffic to his site, but shortly afterward he now sees a lot of traffic from StumbleUpon as well. And it turns out, it is a lot of traffic!

StumbleUpon is a social networking service. You sign up for an account, install an extension in Firefox (it's recently become available as a plugin for IE too), select some preferences (what you're interested in stumbling upon), and you're off stumbling.

It's actually very cool. You add to the social network by tagging sites or pages you come across. Like del.icio.us or other tagging tools, you can add your own tags to pages. But the real power is when you hit the "I like it!" button when you come across something you like, thus adding more substantial data to the network.

Now, when you want to just stumble upon something, you click the Stumble! button. Think of it as a feeling-lucky-reverse-search. You've already given your topic(s) of interest (say, Web Development), and StumbleUpon now takes you to a site you might be interested in seeing. The cool part is that you can Stumble upon a subset of a topic by also selecting a filter. So, I selected Web Development as a topic, then web-analytics as a filter (tag), and quickly stumbled upon Marshall's WebMetricsGuru, Pat's Conversion Rater, and others. I went to a few other WA blogs and hit "I like it!" so they would appear as well.

How do you know if someone has stumbled across your site? There are a few answers, but I'll just note the most frequent now. The primary use of the tool is through the Stumble! button. When someone visits your site using that button, you'll see one of the following referrers:
- http://www.stumbleupon.com/refer.php, or
- http://www.stumbleupon.com/refer.html

If an individual goes back to the StumbleUpon site to view the pages they've tagged (or those of another Stumbler), then selects a site from the list of tagged sites, the visit will show up with a referrer based on their account name, like this:
- http://elbpdx.stumbleupon.com/

If an individual looks at a list of sites that all have the same tag, then selects a site, the referrer comes through with the tag name in it:
- http://www.stumbleupon.com/tag/web-analytics/
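The three referrer patterns above are easy to classify automatically when processing logs. A sketch (the pattern matching is my own, not anything documented by StumbleUpon):

```python
import re
from urllib.parse import urlparse

def classify_stumble_referrer(referrer):
    """Bucket a StumbleUpon referrer into the three cases described
    above: the Stumble! button, a user's tagged-site list, or a tag
    page. Returns None for non-StumbleUpon referrers."""
    parsed = urlparse(referrer)
    host, path = parsed.netloc.lower(), parsed.path
    if not host.endswith("stumbleupon.com"):
        return None
    if path in ("/refer.php", "/refer.html"):
        return "stumble-button"
    m = re.match(r"/tag/([^/]+)/?", path)
    if m:
        return ("tag", m.group(1))
    if host != "www.stumbleupon.com":
        return ("user", host.split(".")[0])
    return "other"

print(classify_stumble_referrer("http://www.stumbleupon.com/refer.php"))
print(classify_stumble_referrer("http://elbpdx.stumbleupon.com/"))
print(classify_stumble_referrer("http://www.stumbleupon.com/tag/web-analytics/"))
```

Grouping the traffic this way separates pure serendipity (the button) from visitors who deliberately sought out a tag or another Stumbler's list.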

Note that there's a default setting that likely throws analytics off a bit as folks use the Stumble! button. In the toolbar configuration, there's a setting to "Prefetch Stumbles". In my testing, the browser would actually prefetch a few sites at a time, sites that I may not end up visiting. However, if I do visit them, they are downloaded again (except for cached objects), and a second view is recorded on the site. It would be nice if they noted in the request that it is a prefetch (ala Firefox).

One final note. There is a commerce side to StumbleUpon via "advertising". It's not your typical advertising in that the customer will "stumble" directly to a landing page on your site. I've not spent much time with it, but you can set up "campaigns" with them. As they ask Stumblers to note their location and age as part of the signup/networking process, they do have the ability to target campaigns pretty well (given that folks are entering accurate information, of course). It currently costs $0.05 per visitor.

Filed in: analytics, stumbleupon


Wednesday, July 26, 2006

Quick Visit to OSCON

I made a quick visit over to OSCON today. We're lucky to have such a cool gathering just a short light-rail ride away from downtown. It was interesting to see who was there - Sun, Microsoft, HP, Intel, etc. You know, the usual open source crowd ;-)

I did get a chance to meet blogger Jeremy Zawodny, who was hanging out with a few Yahoo's at their booth. Nice folks.

I also got a chance to chat with the Amazon team. They have many cool projects in the works!

As it's an O'Reilly event, the Make folks were also there with some kits and magazines. I love their stuff.

Friday, July 14, 2006

Traffic "Patterns" from useit.com

A few good nuggets from the useit.com site this week.  Traffic patterns using statistical measurements.  The data on referring sites caught my attention:
Zipf distribution of incoming referrals from other sites, sorted by traffic rank
The data follows a predictable pattern except for the #1 referrer: Google.  They note:
"The chart's one obvious exception to the theory is that the site that referred the most visitors accounted for many more visits than predicted. Google (depicted as an extra-large dot) referred 257,040 visitors; in theory, it should have referred only 52,479.

Google is five times as popular as the theory predicts, but this phenomenon could fade as other search engines catch up. Only time will tell."
I suppose something will come along eventually to change the dramatic influence of Google, but for now it sure seems like the pattern will continue for awhile.

The next comment is spot on as well:
"Also, while Google is disproportionally important, when taken together, the other 35,631 referring sites accounted for 35% more traffic. Clearly, it's not a good idea to focus only on #1. "
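For reference, under a pure Zipf model a referrer at rank k gets a share of visits proportional to 1/k, normalized by the harmonic sum over all referrers. A sketch of that theory (my own normalization; the article's fitted prediction may be computed differently):

```python
def zipf_share(rank, n_sites):
    """Fraction of referred visits a Zipf model assigns to the
    referrer at a given rank, among n_sites referring sites."""
    harmonic = sum(1.0 / k for k in range(1, n_sites + 1))
    return (1.0 / rank) / harmonic

# With roughly 35,632 referring sites, the top referrer should get
# only a single-digit percentage of referred traffic under Zipf,
# which is why Google's actual share stands out so sharply.
share = zipf_share(1, 35_632)
print(round(share * 100, 1))  # roughly 9 (percent)
```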
Filed in: analytics


Tuesday, July 11, 2006

In Boston

I'm attending the Microsoft World Wide Partner Conference in lovely Boston. WebTrends is a Gold Certified Microsoft partner, and we are very lucky to have them as both a customer and a very strategic partner.

Off-topic: I went for a run early this morning along the river, speed-touring the local sights (BU, Harvard and MIT). Fantastic views...and such great history.

Friday, July 07, 2006

Spokes And No Hub

Pat McCarthy makes a good point about the current pain of having so many specialized service providers - who are also in the business of providing analytics. As he notes:
It’s not the fault of the companies, I commend Feedburner and Wufoo for providing specialized analytics, I just don’t like having to go so many places to get it and learn so many interfaces.
And I agree with his idea of someone coming up with a tool that serves as the hub between these services. Perhaps the very smart Juice Analytics team can come up with some creative ideas? Avinash clearly has some great presentation ideas to contribute (read his post if you haven't already - good stuff).

Now the fun begins.  Who's in?

Filed in: analytics


Thursday, June 22, 2006

Cost Per Action and Analytics

Wired has the scoop. Google's creative team has come up with a very smart new method of getting paid: cost per "action" (CPA). From the article:

The new cost-per-action, or CPA, network will pay only if net users perform a specific action such as making a purchase or generating a sales lead

This is a decent attack on click fraud. Presumably Google doesn't get paid unless a specific goal or conversion happens in this scenario. Or maybe they just get paid more when an "action" does happen? The pricing model will be very interesting for this new approach. Is there a concept of an auction price? Are some goals weighted higher than others?

And how will one quantify the various "actions"? Well, I guess you need some sort of analytics to do that. Will it matter which analytics package you choose?

This one will be very fun to watch unfold.

Filed in: analytics, google


Tuesday, June 13, 2006

A Non-Profit "Startup" - Schoolhouse Supplies Online

Hi everyone, I'd love to get your feedback on a new application a few of us have put together to benefit a local non-profit organization.  I had an idea last fall when buying school supplies for my children. I realized:
  1. I was spending a lot of time looking for exactly the right supplies requested by our teachers
  2. Other parents were also spending a lot of time looking around for the right supplies
  3. The money I was about to spend at the big box store I was in was essentially going to leave my community
  4. There had to be a better way
Why not build an online application, designed specifically for parents buying school supplies?  The application would list the exact supplies requested by the teachers, per grade, at the school.  It would make life so much easier for parents, and for the teachers.  And while we're at it, why not figure out a way to ultimately benefit the schools through this tool?

I ran this idea by a friend of mine, Nick Viele, who is the Executive Director of a fabulous local non-profit organization called Schoolhouse Supplies.  Their mission is to serve "classrooms in need by operating a volunteer-run free store for teachers, which is stocked with supplies donated by the community."  They have provided over $6M worth of school supplies to Portland area students, and have been very creative and thoughtful about how to continue to be an important resource for our community.  We thought this idea might be a perfect fit.

I then chatted with a couple of very talented individuals at WebTrends (who also happen to be terrific developers) who agreed to volunteer their time to help build the application.

So we put together Schoolhouse Supplies Online (as of this post, it's still in beta, so there are a few known issues).  It's a relatively straight-forward application as you will see.  The basic idea is as follows:
  1. Teachers create a supplies list for next year which we enter into the application.
  2. Parents then shop online for the supplies they need for their child(ren).
  3. The supplies are delivered (through volunteers of course!) directly to the school.
Pretty easy, yes?  As a really fun bonus, we added a couple of tools into the mix.  As you may know, Salesforce.com provides their very slick service FREE to non-profit organizations.  This includes their API's.  So, since we've done a lot of fantastic work with Salesforce.com already, we decided to build this application with Salesforce.com as a back-end.  For credit card processing, we're leveraging the very flexible API of another great Portland-based technology company, AuctionPay.

So check it out and let me know what you think.  Any and all feedback is welcome.  We clearly have a few things to tighten up (at the moment, our final confirmation email is broken), but please let me know via comments or email (elbpdx at gmail) what you think.  We hope to go live at the end of this week!  Thanks!

Filed in: analytics, salesforce.com, nonprofit

Technorati Tags: , ,

Friday, June 09, 2006

Google Is Making Analytics Easier

Google announced a new Firefox extension today.  It synchronizes your browser settings between computers, including saved passwords, history, bookmarks, and....wait for it...persistent cookies.

A common issue plaguing web analytics accuracy is tracking a unique visitor between different computers at home, and/or at work.  Unless you require a visitor to register and/or login to your web property, you generally cannot identify the same visitor who moves from computer to computer, nor can you accurately track those visitors who have recently rebuilt their computer, or have purchased a new system, or are borrowing a computer, etc.

This extension, which even "remembers which tabs and windows you had open", may provide new levels of accuracy previously only available to a small number of sites.  Granted, it's only available currently for Firefox customers (and only those who are savvy enough to add extensions from Google, and then use it properly).  But it's definitely an intriguing opportunity.

Plenty has been written about unique visitor tracking in the analytics world.  Leave it to Google to shake things up a bit.

Some threads (from Techmeme):
Filed in: analytics, google

Technorati Tags: ,

Tuesday, June 06, 2006

Feed Conversion: Geffen.com

Feedburner announced today that Geffen Records is using their services.  This is a nice feather in Feedburner's cap, and a good use of feeds for providing updated consumer content (in this case, artist specific content like news, tour dates, etc.).  They plan on using FeedFlare and the FeedBurner Ad Network (FAN).  Cool stuff.

In Feedburner's blog post on this subject, they note this interesting conversion data:
Geffen's early trials proved that feeds were the marketing tool that garnered the highest conversions. Feed subscribers were four times more likely to take action (e.g. download wallpaper, play audio/video clips, sign up for a message board, etc.) than those reached through more traditional methods. Recognizing the growing audience that will no doubt follow the launch of the next generation of browsers, Geffen wanted to lay the groundwork for a company-wide embrace of feeds to ensure they're able to leverage this new medium.
Four times more likely to take action.  Impressive.

Filed in: analytics, feedburner

Technorati Tags: ,

Wednesday, May 31, 2006

WebTrends ODBC Driver and Yahoo! Maps

The WebTrends ODBC driver is a great feature of the WebTrends Marketing Lab in my humble (and biased) opinion. It's pretty powerful, especially when pulling data from the WebTrends On Demand service. So I thought I'd update my WebTrends-Yahoo! Maps mashup to take advantage of the flexibility of the ODBC driver. Check it out with your own data!

Here's how the U.S. and Canada traffic (visits) looks for insideanalytics.blogspot.com:

Feel free to download this spreadsheet to use if you'd like to map your own data. It has a few macros built in to allow you to refresh the data from WebTrends, and upload it to Yahoo's slick mapping service. I've been testing with WebTrends On Demand data, but it should work fine for those of you using WebTrends software too.

The general idea is as follows. You install the WebTrends ODBC driver, and create a data source called "WTODGeo" to match what this spreadsheet is using as a default. The WTODGeo data source connects to the profile you want to use. Then you enter the Time Period you want to use in the spreadsheet and refresh the spreadsheet with your data. Then map it using the macro!

There are more specific instructions in the spreadsheet.
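For those more comfortable in code than in Excel, the same pull can be sketched in Python. This is a hypothetical sketch only: the DSN name "WTODGeo" matches the spreadsheet's default, but the table and column names are illustrative guesses, not the driver's actual schema.

```python
# Hypothetical sketch of querying the WebTrends ODBC driver from Python
# instead of Excel. The DSN name "WTODGeo" matches the spreadsheet's
# default; the table and column names are illustrative guesses, NOT the
# driver's actual schema.

def build_geo_query(start_date: str, end_date: str) -> str:
    """Build an illustrative SQL query for visits by geographic region."""
    return (
        "SELECT Country, Region, Visits "
        "FROM GeographicRegions "  # hypothetical table name
        f"WHERE TimePeriod BETWEEN '{start_date}' AND '{end_date}'"
    )

# With the driver installed, the query could be run via the third-party
# pyodbc module (commented out to keep this sketch stdlib-only):
#   import pyodbc
#   rows = pyodbc.connect("DSN=WTODGeo").cursor().execute(
#       build_geo_query("2006-05-01", "2006-05-31")).fetchall()

print(build_geo_query("2006-05-01", "2006-05-31"))
```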

I'm using the Yahoo! maps "simple" api for this only because it's VERY easy to use, and gets the general point across without much fuss. Their Flash and AJAX api's are definitely worth a look too. If someone would like to make this map even better...be my guest!

This isn't a WebTrends supported spreadsheet. Just my own creation. Use at your own risk. If you have trouble with it, or have any questions or feedback, feel free to drop me a line at elbpdx @ gmail

Quick Update: Note that even if you do not use WebTrends, you can still download the spreadsheet and use the data I have inside the spreadsheet to see the Yahoo! map. After downloading the spreadsheet, just skip all the other steps and use the macro to map the current data. It's worth it to see how easy Yahoo! maps is to use!

Filed in: analytics, visualization, map, mashup

Technorati Tags: , , ,

InfoWeek on Low-Cost Analytics Packages

InformationWeek has a review of five low-cost analytics packages.  Short reviews on each, highlighting some key features.

The last page has a blurb on cookies (the issue that just won't die!), noting:
"Many experts feel that the whole "cookie hype" has largely been due to media coverage. "Cookies are the lowest concern of any of type of spyware," says Joe Telafici, director of operations for McAfee Avert Labs."
- and -
"The controversy resulted in some spyware sweepers picking up these types of cookies -- and, as a result, it's probable that a lot of users deleted them. So if the cookies are dumped, how useful are the analytics products? Not very."
Filed in: analytics

Technorati Tags:

Internet Advertising Hits $3.9B

The IAB and PWC released a report noting that internet advertising revenues reached a new record of $3.9B in Q1 of 2006. That's 38% growth year-over-year. Impressive.

Source: PwC/IAB Internet Advertising Revenue Report (www.iab.net)

Everyone in the valley with the "Please God, give me just one more bubble" bumper stickers must be dancing in the streets!

Filed in: advertising

Technorati Tags:

Sunday, May 28, 2006

Site Visualization Tool

Check out this slick graphical view of a web site...I ran a few sites through the tool - my fav is the view of boingboing.net...here's a part of the view of insideanalytics.blogspot.com:

Folks are taking screenshots of sites and posting them to Flickr here. Cool!

Found via Blogoscoped.

Filed in: visualization

Monday, May 22, 2006

Thinking about the industry

Xavier Casanova has a smart post today about the changing analytics industry. I like this quote:
"it seems to me that the amount of free, uncontrolled buzz has skyrocketed on over the last 2-3 years"
It's very true. External buzz is really only measured by referrer traffic in today's analytics tools, and yet it's spawning new analytics efforts parallel to the web analytics world (Feedburner, BuzzMetrics, etc.).  As Xavier says, "think about it".

Filed in: ,

Thursday, May 18, 2006

Verisign and OpenID

Could Verisign bring a web2.0 identity effort into the mainstream? They have announced that they are now an OpenID (wiki) Personal Identity Provider (PIP). Verisign is definitely one of the names that comes to mind when thinking of privacy and security, so this does have good potential.

OpenID is a relatively straight-forward identity mechanism that LiveJournal, MovableType/TypePad and others have adopted. You sign up for an Identity (which is a URL) with an Identity Provider (like Verisign), then use that Identity at sites that use the OpenID mechanism. From an analytics perspective, this is nirvana...a universal visitor identifier.

I signed up for the Verisign PIP. It was a fairly painless signup process. Ironically enough, however, especially for the leader in the encryption world, the signup process (including entering a password) is NOT done via SSL. I'm sure they'll fix that soon...

Filed in:

Wednesday, May 17, 2006

Del.icio.us, APIs and SSL

One of the ways to get a technology service running quickly is to put security concerns on the back burner. Either it's too expensive (in terms of engineering resources), or too complex, or just an oversight. I'd like to see all new services articulate their position on basic security issues (SSL, authentication, cookies, etc.) to give us technology early adopters a better sense for what their approach will be moving forward.

At the very least though, every service should run over SSL. This is super easy to set up, and requires very little engineering effort. There are some systems architecture decisions to be made, including how to host the SSL certificate (preferably on the load balancer). And of course, there is some cost (performance, CPU time) associated with encryption, which could add some load. But otherwise, this is a slam-dunk easy win.
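The "every service should run over SSL" rule is easy to enforce on the client side too. Here's a tiny sketch of such a guard; the API URL is purely illustrative, not any service's documented endpoint.

```python
# A tiny client-side guard in the spirit of "every service should run over
# SSL": refuse to send anything to an API endpoint unless the URL uses
# HTTPS. The endpoint URL below is illustrative only.
from urllib.parse import urlparse

def require_https(api_url: str) -> str:
    """Return the URL unchanged if it uses HTTPS; otherwise raise."""
    if urlparse(api_url).scheme != "https":
        raise ValueError(f"refusing to use non-SSL endpoint: {api_url}")
    return api_url

require_https("https://api.example.com/v1/posts/recent")   # passes through
# require_https("http://api.example.com/v1/posts/recent")  # would raise
```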

Del.icio.us noted yesterday in their blog that they have changed their APIs to support (in fact, require) SSL. Good move. Again, IMHO they should have done this from the start. Since their APIs are so heavily used, many other software vendors are now scrambling to fix their applications that use those APIs (they've been given 6 months to make the change before del.icio.us shuts down the old site). And now all of us users need to update our apps that leverage those APIs. In my case, I will have to update Attensa, Performancing, the Firefox/Del.icio.us extension, and that widget I haven't finished yet anyway (ok, that's not as big of a deal ;-).

Filed in: ,

Feedburner Ads

Pat McCarthy notes a couple of challenges he sees with Feedburner's Ad Network. I agree with his comments.

I have written about Feedburner a few times. They've clearly got very smart people solving problems we don't know we have yet. What I like about the new Ad service (FAN) approach is that they are using meta data to help drive the ad content. I like the concept of integrating ads into other information (eg, web pages, etc.) presented as microcontent. As Fred Wilson notes, "FeedBurner continues to lead the way in monetizing microchunked content".

Filed in: , ,

Thursday, May 11, 2006

First Google Gadgets for Analytics

The folks over at Performancing have the first Google Gadget (GG) for web analytics that I've come across. GG is one of Google's latest desktop tools, and it's pretty slick - definitely a direct attack on one of my favorite tools - the very popular Yahoo! Widgets.

Here's a snapshot of the Performancing gadget:
I've been using Performancing to measure traffic on this blog for a couple of months now. My favorite feature is their RSS feed of analytics data. You can see the data for my site here:

I find that I don't visit the Performancing site much since I'm subscribed to the RSS feed of the data. Reading analytics data via a feed is a bit awkward, though: the data is updated several times throughout the day, each update is picked up by my RSS reader, and if I don't get to the feed for a couple of days, that adds up to a lot of repetitive data.
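One way a reader (or a quick script) could tame that repetition is to deduplicate items by their GUID. A minimal sketch, using made-up item data:

```python
# Sketch of the deduplication an RSS reader could apply to a frequently
# republished analytics feed: keep only the first copy of each item,
# keyed on its GUID (or link, when no GUID is present). The item dicts
# below are made up for illustration.

def dedupe_items(items):
    seen = set()
    unique = []
    for item in items:
        key = item.get("guid") or item.get("link")
        if key not in seen:
            seen.add(key)
            unique.append(item)
    return unique

updates = [
    {"guid": "stats-2006-05-11-am", "title": "Morning stats"},
    {"guid": "stats-2006-05-11-am", "title": "Morning stats"},  # repeated update
    {"guid": "stats-2006-05-11-pm", "title": "Evening stats"},
]
print([i["title"] for i in dedupe_items(updates)])  # → ['Morning stats', 'Evening stats']
```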

Congrats to the Performancing team for being first to the punch. Nice job! I still plan on giving my Yahoo! Widgets a facelift and a few more features in the near future.

Update: In looking at this again, it looks like the GG is actually built by a third-party (geniosity.co.za)...but it works fine...cool stuff.

Filed in:

Thursday, April 27, 2006

Apache and SSL

This is an odd piece of data. Netcraft reported yesterday that:
Apache has overtaken Microsoft as the leading developer of secure web servers. Apache now runs on 44.0% of secure web sites, compared to 43.8% for Microsoft.
I don't doubt that there are many more sites using Apache web servers vs. Microsoft's IIS. But so many medium-to-large sites prefer to host SSL certificates on more efficient load balancers that it seems incomplete to discuss only the web server side of things.

When discussing architecture options with customers and colleagues, I always recommend using load balancers to offload SSL traffic. The maintenance of the certificates is much easier as there are fewer systems to update. Load balancers are generally more efficient, especially with hardware accelerators. Plus the entire environment is easier to scale as needed.

Filed in: operations

Tuesday, April 25, 2006

PageRank Visualization

Neat idea from the folks over at iWEBTOOL. An overlay showing your site along with PageRank data. They don't give a lot of information about how they are calculating the PageRank information, but it's a clever implementation of an overlay.

Check out EricP's site with the overlay. Pretty cool, eh?

Filed in: visualization

UI Extremes

Dion Hinchcliffe has an interesting piece on the extremes of UI design. His observations contrast the visual slickness of current AJAX-enabled applications with the no-nonsense command line interface of your favorite search engine.

I like this line in particular:
Do you use Google's Search more than any other software application? I do, and probably most others too. So what happened to the GUI?

Filed in: visualization

Wednesday, April 12, 2006

Microsoft and Niall Kennedy: Feeds Platform

Niall Kennedy, a veteran of the RSS and search community announced he is going to work for Microsoft. This is a great move for Microsoft, as they pick up someone who is a true "feeds as a platform" industry influencer.

In his announcement, Niall casually mentions that Live.com will be the new homepage for IE7 and Vista. That's very big news, and demonstrates a fundamental shift in priorities for Microsoft. Richard McManus does a nice job of articulating the importance of this move, noting:

1) It'll be the biggest mass market use of RSS technologies since Yahoo put feeds into MyYahoo back in September 2004. This will be a huge boost for RSS.

2) It will mean Live.com replaces MSN as the IE homepage - which as LiveSide pointed out will mean "pushing Windows Live Search at the expense of MSN ad revenue." If this isn't confirmation that Microsoft sees Google as its number 1 competitor/threat now, I don't know what is…

3) It will mean the world of gadgets (aka widgets or modules) and web services will go mainstream. Forget the current lot of boring clock and weather gadgets, the real power of these mini-apps is in their ability to integrate devices and media — think of the upcoming tv recommendations gadget, which talks to your Media Center box in order to program tv shows.

Sweet. Content provided by the default behavior of IE and Vista will be extensible by design. This is not MSN with some preferences...it's dramatically more customizable to include content (microcontent) from any and all feed sources.

Tuesday, April 11, 2006

Feed Analytics: Case Study #1

I'd like to explore the current state of feed analytics in mainstream media (MSM), looking at it from the outside in. I do not have access to the analytics for any of these media sites - I'm only taking a best guess here based on the data collected as a consumer of these feeds. If anyone reading this has any further information, or clarifications, I'd love to hear from you.

First up is Newsweek, which is a MSNBC property, hosted on MSN. One of the reasons I've chosen Newsweek first is that they are one of the MSM sites with feeds hosted by FeedBurner, and is one of the first media sites to leverage FeedBurner's FeedFlare tools. FeedFlare gives you the ability to add footers to each feed, providing a little extra functionality to each article. It's a brilliant tool. So, what is Newsweek using FeedFlare for? Take a look at one of their feeds (they have 40 of them!)...and you'll see they are including:
  • Email this (essentially a "mailto" link)
  • Technorati (if there are technorati links to the article, this flare appears - and links to a technorati search for the article)
  • Add to del.icio.us (posts this article to your del.icio.us bookmarks)
How does FeedBurner do all of this dynamically for each article within each feed? Each of the FeedFlare items is an image of the text with a dynamic URL that gives FeedBurner the ability to track which FeedFlare link was clicked (to give clickthrough statistics). They've expanded this service recently - and when they allow for dynamic flares, then things will really get interesting!

Another thing you'll note when you visit a feed is the Newsweek logo in the upper right. This is a standard feed image for RSS. Every time the feed is read (given that the RSS reader reads images), the image is called from:
  • http://msnbcmedia.msn.com/i/msnbc/Sections/Newsweek/NWProjects/NW_RSS/rss_logo.gif
Newsweek could obviously analyze the traffic to that image request. That would give them information about how often the feed is viewed (feed impression). Of course, FeedBurner will give them that stat as well.

Many of the articles have images. Each image in the feed is pulled from MSN too, so you could analyze those images to see when articles are viewed (article impression). Again...FeedBurner will provide that stat too.

If you do decide to click on an article, here's what happens:
  1. The actual link is to feeds.newsweek.com, which is a CNAME to feedburner.com
  2. Which redirects to a specific article ID in FeedBurner (example: http://feeds.feedburner.com/newsweek/TechnologyScience?m=55)
  3. This redirects to the Newsweek site for the article with "/from/RSS/" tacked on the end of the URL which can be picked up in their analytics package
It's a great implementation of feeds and of FeedBurner. They have branded the feeds (the URL is to feeds.newsweek.com), and embellished the feeds (via FeedFlare) to best take advantage of this maturing technology.
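On the analytics side, that "/from/RSS/" suffix is easy to pick out of raw server logs. A rough sketch of the counting step (the log lines are made up for illustration):

```python
# Sketch of picking up the "/from/RSS/" suffix on the analytics side:
# scan web server log lines (common log format) and count requests whose
# path carries the suffix. The sample log lines are made up.

def count_feed_referred(log_lines):
    hits = 0
    for line in log_lines:
        parts = line.split()
        # parts[6] is the request path in common log format
        if len(parts) > 6 and parts[6].rstrip("/").endswith("/from/RSS"):
            hits += 1
    return hits

logs = [
    '1.2.3.4 - - [11/Apr/2006:10:00:00 -0700] "GET /id/12345/site/newsweek/from/RSS/ HTTP/1.1" 200 5120',
    '1.2.3.4 - - [11/Apr/2006:10:01:00 -0700] "GET /id/12345/site/newsweek/ HTTP/1.1" 200 5120',
]
print(count_feed_referred(logs))  # → 1
```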

To summarize the analytics, I believe they have the following impressive set of analytics data available:
  • Feed impressions (via FeedBurner)
  • Article impressions (via FeedBurner)
  • Article clickthroughs (via FeedBurner and via the URL modification)
  • Various other feed stats that FeedBurner provide, including:
    • Types of feed readers used
    • Number of subscribers to the feed
    • How often a feed article is emailed, searched for on Technorati and bookmarked into Del.icio.us
Additional information that requires some work to tie together:
  • Returning visitors to the feed - they would need to analyze the traffic to the Newsweek feed image, and tie the visitors back into their analytics package
  • Returning visitors to individual articles - where there are images in the article, they can analyze the traffic to those images. Where they don't have an image in the article, there is no reference back to Newsweek for that visitor.

Tuesday, April 04, 2006

Analytics Challenges with Emerging Technologies

Great info from Marketing Sherpa posted in this article from eMarketer last week. Their results show where marketers would like to experiment with advertising...the top 3: mobile devices, video and RSS. Each presents interesting challenges in the analytics world.

Also note that a whopping 40% of those who responded to the survey said they would be adding RSS feeds this year and 35% would be adding in-house blogs. This is the year of transition to microcontent.

As with any emerging technology, new analytics methods are constantly being defined and refined to meet these changing needs. Your success in building the right analytics for your site requires time for proper planning and testing. I encourage you to leverage the professional services team or consulting team of your analytics vendor to help ensure your results meet your expectations. And don't forget to have fun with it!

Technorati Tags: , ,

Monday, April 03, 2006

I Knew It Was Rigged

I never realized how much Amazon was in control of things...even the Final Four...I just received this email from Amazon:

NYTimes Site Update

The NYTimes updated their web presence last night. I like the wider page structure they've moved to in this redesign, as well as the great use of whitespace. They've updated some features, and added a few. The "most popular" areas within the sections (for example, check out the Technology section, lower-right) include email and blog references. There's a new section for "most blogged" stories from the site, which is updated hourly. This is pretty slick...I wonder what technology (or service) they are using to determine which articles bloggers are linking to?

Khoi Vinh, the Design Director for the site has a post describing the launch. Anil Dash has a good overview as well, with some helpful extra reference.

As a techie/privacy side-note, they have an odd mix of P3P statements on the site that they may want to address. They set a cookie when you first visit the site, but they don't send a P3P with the cookie (which means that IE users with their privacy setting to "low" and above are not setting their cookie). And there are two CNAME'd "first party" domains referenced, each setting their own cookies, and both contain different P3P statements.
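That P3P gap is easy to spot programmatically. A small sketch of the check, using illustrative header values (not the actual NYTimes headers), with a function name of my own invention:

```python
# Illustrative sketch of the check described above: given a response's
# headers, flag any Set-Cookie sent without an accompanying compact P3P
# policy (which, as noted, certain IE privacy settings will silently
# drop). Header values are made up; the function name is hypothetical.

def cookie_without_p3p(headers):
    """Return True if the response sets a cookie but sends no P3P header."""
    names = {name.lower() for name, _ in headers}
    return "set-cookie" in names and "p3p" not in names

with_p3p = [("Set-Cookie", "RMID=abc123"), ("P3P", 'CP="CAO DSP COR"')]
without = [("Set-Cookie", "RMID=abc123")]
print(cookie_without_p3p(with_p3p), cookie_without_p3p(without))  # → False True
```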

Filed in: , ,

Tuesday, March 21, 2006


Richard MacManus has started an intriguing series on Microcontent Design. This first post is a general overview of ideas and terms, where he does a good job of quickly articulating definitions around:
  • Data sources and formats (xml, RSS)
  • Current endorsements (Yahoo=Microsoft=RSS while Google=Atom)
  • Structured blogging (publishing content in formats readable by aggregators)
  • Microformats (built on XML to add metadata to define objects)
So, if you're providing microcontent to the world, how will you measure the usage of it? What data will be important for you to analyze? How will you define successful uses of your data?

It's probably safe to say that it will not be measured in terms we use today!

Filed in: analytics

Word to the Wise

A good article from Business Week that Winthrop Hayes was kind enough to post on the Yahoo board.

There has been so much discussion about cookies over the past year+ in our industry. The article boils it all down to this sound bite:
"Concerns about privacy are leading an estimated 10% of Web surfers to erase their Internet cookies on a regular basis."

All of that research, and the world gets "10%". Oh well...at least the concept is simplified for the readership of Business Week to understand.

The article goes on to say, "The race is on to find new ways to track customer behavior." This will be a good race indeed!

Another great quote from the article, "What's more, in display advertising, even the more concrete metric of clicks is questionable. "Click measurement has been abused," says Greg Stuart, president of the Interactive Advertising Bureau in New York, an industry group. "There's no relationship between clicks and brand

Monday, March 20, 2006

WebTrends Marketing Lab Buzz Update

Since my previous post on March 4th, there have been quite a few new articles written about the new WebTrends Marketing Lab. I have been watching the buzz via Technorati, IceRocket and others, so I thought I'd note what I've found. There are, of course, other sites who can help track the buzz...I like these two as a starting point.

This is more of a list than an analysis of the buzz, but it's a reference point anyway. The major sites I've seen thus far on IceRocket include:

"blog" search:
WebTrends Expands Beyond Website Analytics Into ... InternetWeek (371 links from 248 blogs)
WebTrends Expands Beyond Website Analytics Business Intelligence Pipeline (199 links from 69 blogs)
WebTrends Expands Into Marketing Performance Measurement Systems Management Pipeline (164 links from 88 blogs)

"web" search:
destinationCRM.com: Come Up to the Marketing Lab
Web Host News | WebTrends Launches Marketing Lab
WebTrends to Launch Data Warehouse Product

"news" search:
Firms get better online marketing analysis
WebTrends Unveils WebTrends Marketing
WebTrends Expands Beyond Website Analytics
WebTrends expands into online marketing metrics

Technorati notes a few different items:
WebTrends Launches Marketing Lab In Web Host Industry Review | Fin...
10 Quick Wins for Email Marketing In iMedia Connection: Connecting ...
WebTrends To Release Marketing Lab By Pat McCarthy in Conversion Rater
How WebTrends will be positionned after the... By Rene Dechamps Otamendi and Aurelie Pols in WebAnalytics.be Blog
Launch Update: WebTrends Marketing Lab Webcast By Rene Dechamps Otamendi and Aurelie Pols in WebAnalytics.be Blog

Filed in: analytics, webtrends, buzz

Sunday, March 19, 2006

Changing the Recipe at Delicious: Private Bookmarks

Del.icio.us launched (beta) their "private saving" feature tonight. This allows you to bookmark an item with a new "do not share" flag to keep items from your public bookmarks.

This change is somewhat controversial as some folks don't want to see the social networking power of Delicious reduced by offering a means to make bookmarks private. The folks at Delicious are obviously aware of this, and will be monitoring the effects...as Joshua notes in the announcement, "we will be watching how this feature impacts the community".

I see it as an opportunity to increase the social network fabric by giving individuals another reason to use the tool, or use it more frequently. Many folks will use it to store bookmarks of important but not public information, which hopefully serves to increase the overall usage of Delicious in the process.

Filed publicly in:

Friday, March 17, 2006

MapSurface Has A Great Product!

Glenn Jones has created a couple of very slick tools. I have installed his MapSurface tool this week, and you're welcome to check it out. Visit my site and press Alt-X. You'll see the MapSurface widget pop up to give you some general stats.

First, note that you can move it around on the page. Cool, eh? Now click on the "map" link to see an overlay of link data on the page. And if you'd like to see more, you guessed it, click on "more". Go ahead...I'll wait.

Impressive, yes? I'm very impressed. I like this approach: Build a useful tool, that's simple enough to use, and leverages very cool technology. Oh, and while you're at it, deliver a new way of thinking about analytics to the industry. Nice!

For us web2.0 geeks, he's not actually leveraging AJAX, but rather he's using "a combination of on-demand script loading and JSON data transfers." He's taken the time to give us a great write-up of some of the technology, which includes a reference to the very helpful Yahoo developer page on this topic.

Thanks for the great tool Glenn! Well done!

Filed in:

Tuesday, March 14, 2006

Bitty Browser for Web Analytics

Bitty Browser is a slick idea. It provides a more interactive experience with lists on your site. I created an OPML file of various Web Analytics sites and embedded the file into the Bitty Browser.
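For the curious, an OPML file like that can be assembled with just the standard library. The feed list below is a made-up sample, not my actual list:

```python
# Building an OPML file of web analytics feeds using only the standard
# library. The feed list is a made-up sample, not my actual OPML file.
import xml.etree.ElementTree as ET

feeds = [
    ("Inside Analytics", "http://insideanalytics.blogspot.com/atom.xml"),
    ("Example Analytics Blog", "http://example.com/feed.xml"),
]

root = ET.Element("opml", version="1.1")
head = ET.SubElement(root, "head")
ET.SubElement(head, "title").text = "Web Analytics Sites"
body = ET.SubElement(root, "body")
for title, url in feeds:
    # each feed becomes an <outline> entry in the OPML body
    ET.SubElement(body, "outline", type="rss", text=title, xmlUrl=url)

print(ET.tostring(root, encoding="unicode"))
```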

As it is stuffed into the right hand side of my site it's a little awkward. If I could control it a bit further, I'd make the font a little smaller, and get rid of a few extra spaces. But for the most part, it's a very cool idea.

For those of you who only read my feed, you'll need to go to the site to appreciate this one.

Filed in:

Vast – Open API Search Tool

An interesting new search service, and business approach. Built as an open API source of data - ready for the mining. From TechCrunch:


The key to Vast is that they have done an excellent job of aggregating the long tail, as well as the top sites, and they want developers to now 'steal this site' (as it says on their homepage). Vast is making all this information available through an API which developers and site owners can use to integrate with other services or to build their own services with. The licensing terms for the API are as liberal as they can get - they just ask that you don't do anything illegal and that you attribute Vast as the source of your data. Developers can now build their own implementation of the world's largest auto search site in a matter of days and do what they like with it (place ads on it, etc.). In fact, the Vast.com site itself is an implementation of this API that took only a couple of weeks to write.

Filed in:

Wednesday, March 08, 2006

Buzz 101

Andy Beal, with Fortune Interactive, has penned a great resource guide for monitoring your "Online Reputation". He offers some helpful hints on getting started, including:
How to track?

• If possible, monitor hourly as early action is crucial.

• Create custom RSS feeds based on keyword searches: Feedster.com, Technorati.com, IceRocket.com, Google.com/blogsearch, Blogpulse.com, MSN Spaces, Yahoo! News, Google News, MSN News and PubSub.

• Filter all feeds into one RSS Reader for easy and time-efficient monitoring. Options include: Newsgator.com, Bloglines.com, Google Reader or Pluck.com.

• Sign up for Google and Yahoo email alerts using your desired keywords (http://alerts.yahoo.com/ and www.google.com/alerts).
That's quite a list! Even if you did some of it though, along with some of the other items he suggests, you'd be headed in the right direction.

And, I would add Attensa Outlook to the options in the RSS Reader category.
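The "filter all feeds into one" step can even be scripted. A minimal sketch that merges items from several keyword-search RSS feeds, using tiny sample XML rather than real Technorati or Feedster output:

```python
# Sketch of the "filter all feeds into one" step: parse several
# keyword-search RSS feeds with the standard library and merge their
# items into one list. The feed XML below is a minimal made-up sample.
import xml.etree.ElementTree as ET

def merge_feed_items(feed_xml_strings):
    items = []
    for xml_text in feed_xml_strings:
        channel = ET.fromstring(xml_text).find("channel")
        for item in channel.findall("item"):
            items.append((item.findtext("title"), item.findtext("link")))
    return items

technorati = """<rss version="2.0"><channel><title>keyword: acme</title>
<item><title>Acme mentioned</title><link>http://example.com/1</link></item>
</channel></rss>"""
feedster = """<rss version="2.0"><channel><title>keyword: acme</title>
<item><title>Another mention</title><link>http://example.com/2</link></item>
</channel></rss>"""

print(merge_feed_items([technorati, feedster]))
```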

Finally, I would also recommend looking at this transcript from an online Q&A session with Steve Rubel. This piece of the industry is clearly in its infancy.

Filed in:

