By Gary Angel
Late last year I worked with Celebrus Technologies on a white paper entitled “The Future of Digital Measurement and Personalization.” I know that sounds a little bit grand, but I promise, it’s not the sort of empty, for-pay puffery you may have come to expect in our industry. First, I’m deeply concerned with the issues and arguments covered by the white paper: the right way to build a digital measurement infrastructure. So much so, in fact, that I’m working on a second, Semphonic-only white paper that extends the Celebrus Technologies effort with our research into the creation of an online data model for the warehouse. Second, I’m convinced that the Celebrus solution ought to be getting significant interest and traction in the marketplace. If you’re an enterprise committed to advanced digital measurement, you’d be remiss not to at least be considering Celebrus. Third, and I suppose this is what really matters, I believe the white paper makes an important case for a specific direction when it comes to building a digital measurement infrastructure.
My intention is to write two posts related to the white paper – posts that largely mirror its structure. In this first post, I intend to lay out some of the significant challenges in creating a really good digital measurement infrastructure. In the second post, I’ll explain why the Celebrus solution solves some of the biggest of those challenges and is a compelling direction for any organization wanting to build a measurement infrastructure that can support more advanced customer analytics, segmentation, and real-time personalization.
While I hope to convey a good sense of the white paper in these two posts, I also intend to keep both rather short – precisely because the white paper exists and it is, after all, free. Here, then, is a little sample – one I hope is tantalizing…
All last year I wrote about the evolution of Web into Customer analytics. If evolution in the natural world comes in fits and sudden dramatic starts, the same is surely true of technical advances. In 2011 we saw a dramatic acceleration in the use of data warehousing technologies to provide Customer Analytics even as traditional Web analytics tools worked hard to re-platform themselves to support deeper, faster, and better analysis at the customer level.
If you believe that data warehousing is the future of your digital measurement platform, then you should be thinking carefully about the nature of your measurement infrastructure. The vast majority of digital data warehouses that Semphonic worked with last year were sourced via a data-feed from a Web analytics solution. That’s an obvious choice since the work to create a good collection system via Web analytics tags has already been largely accomplished at most enterprises. But is it really the right direction?
In the white paper, I cover four areas of key concern when thinking about digital measurement infrastructure: Governance, Robust Data Collection, Data Model, and Real-time capability. In each case, sourcing from your existing Web analytics system presents real problems.
The problems of Web analytics tagging governance have become well-documented. Indeed, the problems have created a whole new industry of Tag Management Systems (TMS) that create a new layer of abstraction between the measurement system and the Website. I’m a big believer in TMS. Ensighten, Tealium, Tagman and Omniture’s new solution each have their respective advantages and disadvantages, but all provide significantly better governance than out-of-the-box Web analytics tagging. On the other hand, none of them entirely solve the single biggest problem in creating a digital measurement infrastructure – the necessity to pre-plan your information capture when designing a tag. Governance is improved because you move the tag creation function from IT to measurement. This is all to the good, but it doesn’t change the underlying dynamic. Someone still has to do all of the customization, and the customization has to happen before the measurement takes place. So a TMS only changes one piece of the problem. Measurement still has to be carefully planned. Customization still has to take place at the page level. You still lose anything you didn’t collect. That’s simply not ideal. What’s more, a TMS introduces an additional cost into the system – sometimes a fairly significant one. You’re paying more to collect the same information. If you’re committed to a Web analytics tool for digital customer analysis, then a TMS is the right way to go. Believe me, the improvement in governance is worth the cost. If you’re focused on the warehouse, however, a TMS may not be the best option.
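To make the abstraction concrete, here is a minimal sketch of the pattern a TMS introduces: the page publishes one generic data layer, and mapping rules (owned by the measurement team rather than IT) translate it into vendor-specific payloads. All names and mappings below are hypothetical, invented purely for illustration – no specific TMS product works exactly this way.

```python
# Hypothetical sketch of the abstraction layer a TMS provides: the page
# publishes one generic "data layer", and mapping rules translate it into
# vendor-specific tag payloads. Names here are invented for illustration.

# A generic data layer the page would publish once.
data_layer = {
    "page_name": "checkout/step2",
    "product_id": "SKU-1234",
    "visitor_id": "abc-123",
}

# Mapping rules from generic keys to each vendor's parameter names.
# The point: swapping or changing a vendor tag means editing this mapping,
# not re-instrumenting every page.
VENDOR_MAPPINGS = {
    "vendor_a": {"page_name": "pageName", "product_id": "prodId"},
    "vendor_b": {"page_name": "pn", "visitor_id": "vid"},
}

def build_payloads(layer, mappings):
    """Translate the single data layer into one payload per vendor tag."""
    payloads = {}
    for vendor, mapping in mappings.items():
        payloads[vendor] = {
            vendor_key: layer[generic_key]
            for generic_key, vendor_key in mapping.items()
            if generic_key in layer
        }
    return payloads

payloads = build_payloads(data_layer, VENDOR_MAPPINGS)
print(payloads["vendor_a"])  # {'pageName': 'checkout/step2', 'prodId': 'SKU-1234'}
```

Note that the sketch also shows the limitation described above: anything the page never put into the data layer simply isn’t there to map – the capture still has to be planned in advance.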
This problem of pre-planned measurement and governance is all of a piece with the next issue – robust data collection. Web analytics tags are largely page-based. They have to be manually added to individual links, and they require specific (and often rather arduous) customization to capture intra-page actions. Scrolls, internal search, DHTML, Ajax, and forms are all completely mysterious to the traditional one-per-page Web analytics tag. Not only does this fundamental limitation make governance and tag creation far more difficult, it limits the collection of many of the most interesting customer remarketing data points. Did a customer start a form? Did they fill in a field? Did they scroll? Which link did a customer click? It would be nice to get all of this seamlessly, without effort, and without customization.
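The contrast between the two collection styles can be sketched as data: the same visit captured as one pre-planned page record versus a stream of interaction events. The record shapes below are hypothetical – they stand in for whatever a real collection system would emit – but they show why only the event stream can answer a question like “did the customer start the form?”

```python
# Illustrative sketch (hypothetical record shapes): the same page view
# captured two ways. A one-per-page tag keeps only what was pre-planned;
# an event-level stream keeps every interaction.

# What actually happened during one page view.
events = [
    {"type": "page_view",   "page": "/apply"},
    {"type": "field_focus", "field": "email"},
    {"type": "field_fill",  "field": "email"},
    {"type": "scroll",      "depth_pct": 75},
    {"type": "link_click",  "href": "/terms"},
]

# A traditional page tag reduces the same visit to its pre-planned fields.
page_tag_record = {"page": "/apply"}  # scrolls, fields, clicks are lost

def form_started(event_stream):
    """Only the event stream can answer 'did the customer start the form?'"""
    return any(e["type"] == "field_focus" for e in event_stream)

print(form_started(events))  # True; the page-tag record cannot say either way
```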
Of course, collecting information is the fundamental purpose of a digital measurement infrastructure. Nevertheless, how that information gets organized turns out to be a huge challenge. No issue in warehousing digital data has proven more difficult than creating a good data model of digital data. For all the attention paid to “big-data” technologies, it’s my belief that poor modeling cripples many more efforts than does query speed. You may not think of the digital data model as a part of a collection infrastructure, but you’d be mistaken. True, the data model implicit in a typical Web analytics data feed is so poor as to be hardly a model at all; it’s built for interchange, not usage. A really good collection infrastructure will house data in a good data model, and the two are largely inseparable.
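To give a feel for what “built for usage” might mean, here is one possible shape for an event-level model, sketched with an in-memory SQLite database. The table and column names are my own invention, not from the white paper; the point is simply that events hang off sessions and sessions off visitors, so customer-level questions become simple joins rather than gymnastics against a flat interchange feed.

```python
import sqlite3

# Illustrative only: one hypothetical shape for an event-level digital
# data model. Events belong to sessions, sessions belong to visitors,
# instead of everything arriving as one flat feed.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE visitor (visitor_id TEXT PRIMARY KEY);
    CREATE TABLE session (
        session_id TEXT PRIMARY KEY,
        visitor_id TEXT REFERENCES visitor(visitor_id),
        started_at TEXT
    );
    CREATE TABLE event (
        event_id   INTEGER PRIMARY KEY,
        session_id TEXT REFERENCES session(session_id),
        event_type TEXT,   -- page_view, field_fill, scroll, ...
        detail     TEXT
    );
""")
conn.execute("INSERT INTO visitor VALUES ('v1')")
conn.execute("INSERT INTO session VALUES ('s1', 'v1', '2012-01-15T10:00')")
conn.execute("INSERT INTO event (session_id, event_type, detail) "
             "VALUES ('s1', 'page_view', '/apply')")

# A customer-level question is now a straightforward join.
row = conn.execute("""
    SELECT v.visitor_id, e.event_type, e.detail
    FROM visitor v JOIN session s ON s.visitor_id = v.visitor_id
                   JOIN event e   ON e.session_id = s.session_id
""").fetchone()
print(row)  # ('v1', 'page_view', '/apply')
```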
Finally, there is the issue of real-time. Few aspects of digital measurement are as poorly understood and misrepresented as the need for and function of real-time measurement. Traditional tag-based Web analytics tools were often sold on the premise that real-time data collection and reporting was a significant and important advantage. In general, that’s simply not true. Very few business decisions and very few analytic tasks can or should be tackled in real-time. There are a few verticals and a few problem sets where true real-time reporting is essential. For most of our clients, it’s simply unnecessary. So the 24-hour delay inherent in sourcing your data warehouse from a Web analytics data feed should be no big deal, right?
Wrong. Because if you want to use your digital data to drive personalization or re-marketing, time suddenly becomes much more important. Real-time decision-making is fundamental to good personalization and re-marketing because nothing, NOTHING, is more important than what a customer just did. While some people believe that this type of real-time decision-making is the sole purview of black-box tools, I strongly disagree. The best personalization opportunities will come from the integration of customer data with real-time data using rule-based, analyst-driven optimizations.
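A rule-based, analyst-driven approach can be sketched very simply: each rule combines the stored customer profile with the most recent real-time event, and an analyst – not a black box – authors and orders the rules. The profile fields, rules, and action names below are all hypothetical, invented to show the shape of the idea.

```python
# Hypothetical sketch of rule-based, analyst-driven personalization:
# each rule combines warehoused customer data with the latest real-time
# event. All names and rules here are invented for illustration.

profile = {"segment": "high_value", "abandoned_cart": True}   # from the warehouse
latest_event = {"type": "page_view", "page": "/pricing"}      # real-time

# Analyst-authored rules, evaluated in priority order; first match wins.
RULES = [
    (lambda p, e: p["abandoned_cart"] and e["page"] == "/pricing",
     "show_cart_reminder_with_discount"),
    (lambda p, e: p["segment"] == "high_value",
     "show_premium_offer"),
]

def decide(profile, event, rules, default="show_generic_content"):
    """Return the first action whose rule matches profile + live event."""
    for condition, action in rules:
        if condition(profile, event):
            return action
    return default

print(decide(profile, latest_event, RULES))  # show_cart_reminder_with_discount
```

The design point is that the decision depends on both inputs at once: strip out the real-time event and the sketch degrades to static segmentation, which is exactly the limitation a 24-hour data feed imposes.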
If I’m right, then sourcing your data warehouse from a Web analytics solution will effectively preclude you from advanced personalization and remarketing. Even if that means little to you right now, I suggest that building your warehouse on a technology stack with such a critical limitation is short-sighted.
In short, if you’re committed to the warehouse as your solution for digital customer analytics and optimization, there are some pretty compelling reasons why sourcing that warehouse from your Web analytics solution isn’t ideal. It looks easy, and it may be the right direction for a pilot, but it places some potentially crippling limitations on the long-term capabilities of your program.
In my next post, I’ll show how the Celebrus solution – a digital measurement infrastructure built specifically for the data warehouse – solves some of these challenges and explain why I think it ought to be getting serious attention from any enterprise focused on digital data warehousing.
Or you can get the whole white paper here….