A Little Taste of X Change
Wednesday May 25th 2011, 8:40 pm
Filed under: Gary Angel
By Gary Angel
One of the highlights of the X Change Web Analytics Conference is the main dinner that caps off the first full day of the Conference. In the past, we’ve done a spectacular five-course tasting menu at the Ritz Carlton, a wine pairing at the St. Regis, and, my favorite by far, last year’s event at the Monterey Bay Aquarium. This year, we’re going Spanish-themed, with an outdoor “Tapas” dinner on the Coronado terrace with a stunning view across the bay to the San Diego skyline.
Tapas, if you’re not familiar with Spanish Cuisine, is the word for “small plates.” It’s a style of dining where you get to sample lots of dishes all with very different tastes.
I think it’s an appropriate word too, for today’s post, where I want to give you just a quick taste of the many different topics we have in store for you at X Change. As I wrote in my last X Change post, the Conference packs an amazing informational punch for a small, intimate experience. With 60+ unique breakouts, each a full 2 hours long, X Change has far more deep informational content than ANY other analytics Conference no matter how large.
We’ve pretty much nailed down the Huddle topics for X Change 2011 and it looks delicious indeed!
It’s always interesting to see how themes develop as I talk to Huddle Leaders and they pick their topics. Some of the top themes this year include Big Data and Analytics Warehousing (even more than last year), Attribution (this is hot everywhere right now and people actually seem to be doing real work on it), and integration – especially with Social and VoC data.
Deep analytics topics are also getting more traction this year – a fact which gladdens my heart. So I thought today I’d cover some of the deep analytics topics that are being offered. I can’t hope to describe every topic (even within a category), but I’m going to give a light sampler to give you a sense of what the full meal will deliver!
Tom Betts of the Financial Times is going to lead a discussion of Paywall Analytics (a topic I’ve blogged on before) and another on Predictive Analytics for Anonymous Traffic. Paywall Analytics is media-specific, but for any media site, I expect this Huddle to be de rigueur.
There is so much to Paywall Analytics: understanding the cost of the wall, tracking what happens when a user hits the wall, seeing how the wall both leads to and prevents conversion, and continuously optimizing the wall. It’s an incredibly rich and fascinating playground for behavioral analysis and, of course, it happens to be vitally important to the business bottom line. In few areas of analytics are good answers both so important and so readily accessible if you measure and use your data wisely.
Predictive Analytics for Anonymous Traffic, on the other hand, is a topic rich in value for almost every Website. Anonymity is the single most important factor distinguishing Digital Analytics from traditional BI. Understanding how to deal with and use anonymous data is the essential first step in an effective transition to digital analytics.
Did I mention hot topics? Predictive Analytics is certainly one. But one of the drawbacks to throwing around Predictive Analytics is how intimidating it can seem. A common first reaction: “Ohhh – that sounds hard.” Kiele Cauble of American Airlines is leading a Huddle focused on exactly that reaction. It’s called “Predictive Analytics – the Low Hanging Fruit” and it’s all about finding the easy opportunities for Predictive Analytics. It should be a great discussion both for those looking to get started with more advanced techniques and for those looking for cherry-picking opportunities they might have missed. It will be a perfect place to bring your thinking cap and some of your own best examples of quick wins with more advanced analysis.
When you talk analytics you are inevitably talking Segmentation (as my current series highlights). Michelle Lambert of QVC will be leading a discussion on ways to improve Advanced Segmentation and Retargeting. It’s a topic that is (or should be) at the heart of almost every Digital Marketer’s thinking, and it’s also an area where there are an infinite number of approaches to try and learn from. In no area should we be able to learn more from the example of others – good segmentations are often highly adaptable and there are so many different and valuable approaches that we should all be able to bring some good thinking to the table.
Michelle will also be covering analytics for SEO. I love this topic. SEO is a critical traffic driver for most Web properties. But SEO traffic invariably performs and behaves differently than any other traffic segment. There are a host of techniques for studying and optimizing SEO traffic – techniques that are as important as, or even more important than, the SEO techniques for getting traffic to your site in the first place. No Website with significant SEO traffic should fail to explore the special analytic and optimization challenges this segment presents – and every site should be able to bring some of its unique insights and problems to the discussion.
James Robinson of the NY Times will be leading a discussion of one of the central topics in today’s Digital Analytics: Shifting to Visitor-Level Analytics. This is a huge deal in media, but I think it’s fair to say that it’s a huge deal everywhere. Web analytics tools have done a remarkably poor job of supporting the single most important level of analysis in marketing – visitor-level measurement. My recent series is also, in many respects, about this exact problem. James’ Huddle will cover all the key elements of getting to Visitor-centric analysis – from access to the data, to segmentation, to visitor-level analytic techniques. If you’re not working to get to Visitor-Level analytics, you should be. If you are, here’s a chance to swap ideas on how best to drive visitor-level analytics in the digital realm.
One of the beauties of X Change is that you can get topics of surprisingly broad interest that just wouldn’t happen anywhere else. I have in mind Chris Johannessen of BarclayCard and his Huddle on “Merchantainment.” What’s Merchantainment? It’s the challenge of creating entertaining and engaging content on Websites whose business isn’t ad impressions but selling product or generating leads. With Website experiences getting constantly richer, commercial Websites of every stripe (and media information sites too) have to answer a fundamental question – just how far do they / should they go to entertain their customers? Do customers want or need entertainment? Will it create engagement or distract from sales? Chances are, your site has some level of Merchantainment – and you’d probably like to be able to answer those questions definitively. If so, you should bring your best song-and-dance and your thinking cap to this deep-dive into the measurement of engaging experiences on commercial Websites.
I’m always frustrated at the dearth of Conference discussion about real analytics. At X Change 2011, that’s just not going to be an issue. The all-conversational format of X Change and the sophistication of the attendees create a unique opportunity to really learn about topics at a deep level – making it the perfect venue for sharing ideas about the right kinds of analysis and the right approaches for tackling them.
In my next couple of posts, I’m going to give you a sample of the many Integration topics slated for this year’s X Change as well as the HUGE amount (how appropriate) of big-data discussion on tap. Meanwhile, I’m working on the next post in my continuing series on Digital Analytics and Database Marketing – how to physically construct a Two-Tiered Segmentation. It’s another pivotal post in that series and I hope it will provide a deep understanding of how to translate a conceptual segmentation scheme into a set of physical filters in your Web analytics solution.
It’s a great time to Register for X Change!
Son of Nedstat: comScore’s Digital Analytix vs. SiteCatalyst, Google Analytics, Webtrends et al
By Phil Kemelor
I started following Nedstat when I wrote the first CMSWatch Web Analytics Report in 2007. Based in Amsterdam, the company never got much attention outside of Europe, even though it was the largest analytics vendor outside of North America. Over the years, the company built a loyal following of clients, driven by excellent, localized service, fair pricing, and innovative features – video analytics, third-party integration, and an emphasis on “on the fly” segmentation – introduced either before or at the same time as its North American counterparts.
I thought Nedstat’s purchase by comScore last summer would make for a potentially interesting option in a market that was starting to witness a lack of choice in enterprise analytics solutions. I also liked the idea that an audience measurement company was getting into analytics and could understand the complementary nature of behavioral analytics to its core business. From an acquisition perspective, it seemed to make sense.
So, with this as background, Semphonic was invited to attend a three-day partner training at comScore’s Reston, VA headquarters to dive deep into the tool. My colleagues Chris Meares, Ryan Praskievicz, and Royce Fung joined me, and I asked them to share their takeaways on the “son of Nedstat” that is now comScore Digital Analytix (DA). I was really impressed by the comScore team’s interest in trainee feedback and its commitment to acting on a range of recommendations to further improve the offering. This is a long post, but I think you’ll find it a very thorough analysis of where DA stands today. We’ll hear from Chris and Royce today, and Ryan later this week. I think Chris sums up the general feeling in the class in his opening…
From Chris Meares, Senior Consultant, @chris_meares:
The first thing everyone asks me is “How does it compare to Omniture (Adobe) or Google Analytics?” What I am going to tell you is how it doesn’t compare to Omniture or Google Analytics, and it can be summed up in three words: un-aggregated, segmentation, and integration. Obviously, the tools mentioned above do all, or some, of these things too, but none do them as well as DA. Here is a brief overview of each term and how DA differs from its competitors.
Digital Analytix (DA) stores all of the data it receives from your site in un-aggregated form, which means you can slice and dice it any way you like; you are not held to the constraints of how it was collected, as you are with other tools. It also means there is no data sampling, which is seen daily in Google Analytics and occasionally in Omniture Discover. Omniture does have a data warehousing feature, but the lag time on running reports through it is immense, whereas DA lets you run queries in real time with impressive speed. Of course, DA doesn’t have a client base in the US yet, and it will be interesting to see whether the lag time increases once it begins collecting enormous amounts of data. The other advantage of un-aggregated data comes into play when doing campaign attribution. DA allows you to set up different attribution models – first click, last click, linear increasing/decreasing, equal share to all, etc. – on the same sets of data at the same time, so you can see the differences between attribution models in real time. This is something that cannot be done in other web analytics solutions.
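The side-by-side attribution Chris describes can be pictured with a short Python sketch. To be clear, this is my own illustration of the idea, not DA’s interface or API; the model names and weighting choices are assumptions, and I implement only the “increasing” variant of the linear model.

```python
# Illustrative only: several attribution models computed side by side
# over the same un-aggregated, ordered list of campaign touches.

def attribute(touches, model):
    """Distribute 1.0 units of conversion credit across an ordered
    list of campaign touches, according to the chosen model."""
    n = len(touches)
    if model == "first_click":
        weights = [1.0] + [0.0] * (n - 1)
    elif model == "last_click":
        weights = [0.0] * (n - 1) + [1.0]
    elif model == "equal_share":
        weights = [1.0 / n] * n
    elif model == "linear_increasing":
        total = n * (n + 1) / 2          # 1 + 2 + ... + n
        weights = [(i + 1) / total for i in range(n)]
    else:
        raise ValueError(f"unknown model: {model}")
    return dict(zip(touches, weights))

touches = ["email", "display", "search"]
for model in ["first_click", "last_click", "equal_share", "linear_increasing"]:
    print(model, attribute(touches, model))
```

The point of keeping the data un-aggregated is visible here: adding a new model is just another weighting function over the same raw rows, with no re-collection or re-processing of the data required.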
Digital Analytix (DA) has the best segmentation capabilities I have seen in a web analytics solution. You can segment, in real time, on any of your data points to create reports and do deep-dive analysis. You can also have as many custom variables as you see fit; you are not limited to a specific number, either per page or per site. Because segmentation is included “out of the box” with the DA package and works in real time, it has an advantage over both of the previously mentioned competitors. It is one of the most flexible segmentation tools I have seen in the market.
The last area that sets Digital Analytix (DA) apart at the moment is the integration feature included with the solution. DA can integrate with your email service provider, your voice of customer tool, your multivariate testing tool, and Google AdSense. It can also integrate directly with your backend database, either via Data Lookup (a SQL database hosted by DA) or through a data merge (data uploaded nightly), which lets you easily combine offline and online data.
Segmentation is available on all of the integrated pieces because it is all part of one interface. DA will also integrate with comScore’s audience measurement business, which will allow clients to see demographic data directly in their analytics solution. This seems to be a huge hook for DA, but it was not actually shown during the training, so I am not quite sure how it will work.
Final Takeaway from Chris
There are a few concerns with Digital Analytix. The user interface is slightly clunky and remains a work in progress. It currently feels more focused on the power user or full-time analyst; occasional users may find it difficult to get the reporting they need without some heavy training. Like any enterprise-level implementation, it will take extensive time with the client to work out business needs and set out a road map for correct data capture. If DA can bring its user interface up to the level of Webtrends or even SiteCatalyst, then I believe it can be a major player in the marketplace.
From Royce Fung, Consultant, email@example.com:
While not necessarily groundbreaking as a whole, the tool is clearly geared toward those willing to invest time and resources in their web analytics tool and those who seek to drive their web strategy from a deeper understanding of user behavior on their website.
DA is striving for the high end of enterprise web analytics tools and is unabashed at saying it aims to wrest away Omniture’s dominance at the top end of the spectrum. After seeing the tool and getting some insight into its development path, I believe, in time, it has more than a chance at doing this.
The browser-based tool provides a range of metrics on par with the other top-tier analytics tools. Report Builder allows for building and saving customized reports, and there is an easily accessible way to create custom calculated metrics from a mathematical formula. Definitions tend to be very precise in DA. This forces users to select between, and therefore recognize, nuances such as whether something takes place within a visit-based scope or an event-based scope. Ultimately this is a good thing – it certainly raises the bar for web metrics reporting – though I can envision it being initially frustrating to many.
The Reporting Interface is perhaps the biggest drawback of the tool. For reports to be displayed, they must be put into what most users would think of as a dashboard. The interface tends to require users to work within sub-menus, making the overall experience somewhat tedious. For example, to add a report to the Report display, users enter a rather small dedicated interface that requires a fair bit of navigation before producing even the simplest report. From the Report display, several clicks are required to produce a simple visits report, even with the default parameters, compared to just one click in SiteCatalyst. Editing reports suffers from the same problem.
DirectView is a useful tool comparable to Omniture’s ClickMap. Some of the common limitations apply, such as reporting on Flash-based elements and separating multiple identical links on a page.
Segmentation is very robust and can be applied to all reports at once or to individual reports. Segments can also be applied in DirectView – a really nice feature, and part of a common theme of how well everything is tied together and integrated – though we did not see this demonstrated. The ability to apply segments live at the top level of the Reports display, to individual reports, or to individual metrics is very impressive.
Again, there are some shortfalls with the interface, but overall it is a comparatively strong feature.
A lot of thought has clearly been put into integrating data sources – not all that surprising, coming from comScore, owner of a treasure trove of data. DA places heavy emphasis on letting clients integrate data from other sources, whether from comScore’s other products, third-party sources, or the client’s own internal data.
By leveraging its RING integration, clients can, for example, not just track the success of an email campaign but also segment that population in DA to create a specific target audience for a follow-up campaign. The integration with the email provider allows for specifying that target audience in the form of an email list.
On the subject of campaigns, comScore’s expertise in market intelligence and the media industry clearly shows through in the DA product. Much thought has been given to areas such as campaigns and the concept of engagement, and DA clients can make use of this, particularly with the eCommerce add-on module. How conversions are credited to campaigns can be adjusted very precisely (again, scope comes into play, as well as other definition rules). Clients can also weight the content contributing to engagement on the site by assigning an engagement score to individual pages. As of now, point values must be set in the page tags (that is, they cannot be configured in an interface), but this is slated to be addressed by the end of the year.
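The page-level point values Royce mentions suggest a simple mechanic, sketched here in Python. The page names and point values are invented for illustration, and this is my own rendering of the idea, not DA’s actual tagging syntax (which wasn’t shown in training): each page carries a weight, and a visit’s engagement score is the sum over the pages it viewed.

```python
# Hypothetical illustration of page-weighted engagement scoring.
# Page names and point values are invented for the example.
page_points = {
    "/home": 1,
    "/product-video": 5,
    "/reviews": 3,
    "/cart": 2,
}

def engagement_score(pages_viewed):
    """Sum the configured point value of each page in a visit;
    untagged pages contribute nothing."""
    return sum(page_points.get(p, 0) for p in pages_viewed)

visit = ["/home", "/product-video", "/reviews", "/untagged-page"]
print(engagement_score(visit))  # 1 + 5 + 3 + 0 = 9
```

Moving the point values out of the tags and into an interface, as planned, would mean the weights could be re-tuned without re-deploying page code.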
Video tracking can be finely tuned with the StreamSense add-on module. Standard DA includes much of the basic milestone-based measurement that I suspect many would find sufficient; finer-grained interaction with a video player can be tracked with StreamSense.
Final Takeaway from Royce
Overall, the tool is built with precision and deep measuring capability in mind. It also attempts to get the most out of its data integration capabilities, and from that standpoint succeeds in being not simply a web analyst’s tool but a broader business tool. An existing client base, many with the UDM tag already in place, will be a huge initial plus for comScore in getting customers on board with DA.
Public Sector (.Gov) Sites and Two-Tiered Segmentation
Wednesday May 18th 2011, 8:46 pm
Filed under: Gary Angel
By Gary Angel
Greetings from Seoul! Perhaps it’s a bit ironic, particularly for someone who rarely travels overseas, that I find myself writing about public sector sites while abroad. There is nothing like a 12-hour flight to produce a flow of words, however, and I’ve just finished off a post for Clicktale’s May Madness. So, as promised, my topic today is the application of Semphonic’s Two-Tiered Digital Segmentation to the Public Sector.
Why Public Sector? First, because Semphonic’s Public Sector practice has grown dramatically in the last few years. It’s become so important to us that we actually went through the process of GSA certification (about as painful from a corporate perspective as giving birth). I also think it’s a fascinating practice area.
Public Sector sites face challenges different from, and in some ways more severe than, those (already challenging enough) endured by the private sector. Government sites have struggled with the basics of measurement. Fettered by extraordinarily restrictive visitor tracking rules (thankfully now somewhat relaxed), they have largely lacked the basic mechanics of segmentation. That, however, is really just a small part of a larger problem. Public Sector sites have struggled mightily with the most basic problem in Web analytics: deciding what success means on their Website.
There are no transactions on Public Sector sites; there are no leads; there are no ad impressions on every page view. In such a situation, what is meant by success?
Most Public Sector sites have chosen to answer this question outside of behavioral analytics. They’ve come to rely on Voice of Customer measures of satisfaction and basic measures of Reach as captured in Visits to the Website.
It’s far from a terrible solution. Satisfaction measures embodied in opinion research provide an excellent measure of overall site success, a means of comparing sites, and even a method of isolating success by task. It’s no wonder that ForeSee has become the de facto standard for measuring Public Sector sites, and that VoC adoption in the public sector is at least on par with the private sector even while Web analytics practice has severely lagged.
There are, however, drawbacks to this extreme reliance on Opinion Research. Pure VoC isn’t very fine-grained. As a way of identifying site problems and opportunities it’s rather poor. As a means of site targeting it’s useless. It’s not that VoC isn’t a powerful and appropriate tool; it’s just that giving up on behavioral analysis on the Web is like giving up wine when you’re visiting Napa: wrong place, wrong time.
I believe that our Two-Tiered Segmentation provides a behavioral approach to measuring success on Public Sector sites.
The idea behind Two-Tiered Segmentation is simple: proper segmentation in the Digital world requires two dimensions. The first is the classic dimension of Visitor Type; the second is Visit Type – what the Visitor is trying to accomplish. Almost every aspect of Web analytics can be framed within this simple Two-Tiered Segmentation of “Who” and “Why”.
In the past few posts, I’ve showed how this scheme can be used in Financial Services, Hospitality, and Media. It’s our belief that almost every type of Web site can and should be analyzed within this type of framework – including Public Sector sites.
The search for good measures of success on Public Sector sites has always been defeated by the very problem Two-Tiered Segmentation is designed to address. Brute-force site-wide measures like “views per visitor” or “average visit time” can’t be applied consistently across different use-cases. Increasing views per visitor on engaging content is great; increasing views per visitor on the way to a form is not so ideal. For a Website with both functions, all you get from the site-wide metric is noise.
A Two-Tiered Segmentation solves the problem. It parses every visit to a Website into a specific type of Visit by a specific audience. In doing so, it creates a context for metrics that is both powerful and comparable.
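A minimal sketch of what that means in practice (my own Python illustration, with invented segment labels and page counts): classify each visit on both tiers, then compute any metric within a (Visitor Type, Visit Type) cell rather than site-wide.

```python
from collections import defaultdict

# Invented sample data: each visit already classified on both tiers.
visits = [
    {"visitor": "Health Professional", "visit": "Research Topic", "pages": 12},
    {"visitor": "Health Professional", "visit": "Find Known Form", "pages": 3},
    {"visitor": "Student",             "visit": "Research Topic", "pages": 7},
    {"visitor": "Student",             "visit": "Research Topic", "pages": 9},
]

# Group the metric (pages per visit) inside each two-tiered cell.
cells = defaultdict(list)
for v in visits:
    cells[(v["visitor"], v["visit"])].append(v["pages"])

# Report the average within each cell, never across cells.
for cell, pages in sorted(cells.items()):
    print(cell, sum(pages) / len(pages))
```

The same metric that is pure noise site-wide becomes comparable once it is reported per cell – a Student "Research Topic" visit is only ever benchmarked against other Student "Research Topic" visits.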
Probably no vertical is more diverse than the Public Sector. I cannot hope to create a Two-Tiered Segmentation that is completely representative. Instead, I’m just going to assume a Public Service website for example purposes. The Visitor Type Segmentation might look something like this:
In this case, I’ve assumed the site has a particular target audience in addition to some common additional communities (Professionals, Educators, and Students). If a Website is designed to reach a particular target audience, it’s essential to try to identify whether your visits fall into that group or not, and to make sure your Reporting and Analysis reflect that. Being successful with the wrong audience is no success.
Unlike my previous segmentations, a Public Sector site will not typically have a “value” dimension to sub-segment the audience. Some of the Visitor Type Segmentations I suggested for media will sometimes apply (Local/National/International, Access Channel, and Site Relationship). Site Relationship, in particular, is often a good Public Sector Visitor Segmentation:
Public Sector sites have often failed to incorporate Visitor Segmentation into their approach just as Private Sector sites have usually missed the boat on Visit Type. It’s a mistake either way. Using only Visit Type simply isn’t enough to achieve clarity in your metrics and KPIs.
Suppose, for example, that I wanted to measure Engagement on the site. No matter what measure I create or how complicated I make it, the standard of success for a Health Professional is going to be different from that for a Student. The two types of visitors arrive at the site with different needs, different interests, and different content that will satisfy them. Capturing success in a single Engagement metric will necessarily confuse the issue – no matter what measure of engagement you use. There is no one metric of Engagement that can possibly measure engagement accurately across two different populations.
What’s more, if I don’t segment, my view of site success will become dependent on changes in my customer mix. If Students register as more engaged than Health Professionals, then an increase in Student visitors relative to Health Professionals will indicate improved site-wide Engagement when, in fact, no such improvement exists.
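That mix-shift effect is easy to see with invented numbers: hold each segment’s engagement constant and change only the visitor mix, and the site-wide average still moves.

```python
def sitewide_avg(mix):
    """Weighted site-wide engagement for {segment: (visits, avg_engagement)}."""
    total_visits = sum(v for v, _ in mix.values())
    return sum(v * e for v, e in mix.values()) / total_visits

# Per-segment engagement is identical in both periods; only the mix shifts.
before = {"Students": (200, 8.0), "Health Professionals": (800, 4.0)}
after  = {"Students": (500, 8.0), "Health Professionals": (500, 4.0)}

print(sitewide_avg(before))  # 4.8
print(sitewide_avg(after))   # 6.0 – apparent "improvement" from mix alone
```

Neither segment got any more engaged, yet the unsegmented KPI rose 25% – exactly the phantom improvement described above.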
Engagement, like any other KPI, simply cannot be applied site-wide without creating unacceptable levels of noise.
In common practice, Visit Type is probably more congenial to Public Sector analysts. The concept of Use-Cases is widely understood and, what is more, ties in well with existing VoC practice.
Here’s a sample Visit Type Segmentation for our hypothetical Public Service Site:
One interesting aspect of this Visit Type segmentation is the split of “Finding a Form” into two Use-Cases. We find that almost all Form sessions fall into one of two basic patterns. In one, the visitor arrives knowing the form needed and searches for that form. In the other, the visitor arrives knowing they need a form but unaware of the appropriate Form Name / Identifier. In one sense, both visits have the same success (downloading a form). However, the level of success and the supporting metrics (time to find / satisfaction) are often quite different.
These are very common Use-Cases on Public Sector sites (and not just Public Sector sites). Remember: when a segmentation scheme would conflate two populations with significantly different performance, you’ll always want to at least consider a further sub-segmentation.
It’s within the matrix of Visitor Type and Visit Type that truly interesting KPIs, site analytics, comparative benchmarking, and testing opportunities all exist. Of these, I’ve talked a great deal about KPIs, testing and analysis. So I want to emphasize the idea of comparative benchmarking. Behavioral benchmarking of Public Sector sites would be a huge advantage to all concerned. With a consistent Two-Tiered Segmentation across multiple sites, that type of benchmarking would be possible. Given specific use-cases (Visit Types) and generalized audience types, it should be possible to create a real benchmark for success across multiple web properties. I think that’s a compelling prospect for Public Sector analytics and one I would love to tackle someday.
This is the fourth Two-Tiered Segmentation matrix I’ve demonstrated. I think that’s enough (maybe more than enough) to convey the general idea. In my next post, I’m going to tackle the art of actually building a segmentation in a Web analytics tool.