Update on the Evolution of Comscore MMX 360
It’s been nearly seven months since Comscore first announced the introduction of MMX 360, our new panel-centric Unified Measurement of digital audiences. The premise behind this initiative was to bring the digital media industry a solution that integrates two complementary data sources: server-side web analytics, which do a good job of measuring total page views (provided the traffic is properly filtered for non-user-requested activity and counted correctly as one beacon per page), and panel-based audience measurement, which provides insight into the behavior of individual people, as opposed to cookies or machines. The response to this initiative has been overwhelmingly positive, as evidenced by the high level of participation among top publishers – approximately 75% of the top 50 publishers in the U.S. are either fully reportable under the new methodology or in the process of becoming so – as well as the reaction we’ve received from agencies and other industry stakeholders. The new methodology becomes even more important when one considers the evolution of digital media, including the emergence of new channels for media consumption (mobile devices, tablets, e-readers, etc.) and the increasing fragmentation of the content landscape.
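To make those two hygiene steps concrete, here is a minimal sketch of server-side page-view counting. It is our illustration, not Comscore’s implementation: the record fields, the bot signatures, and the idea of a shared page_view_id per page load are all assumptions for the example.

```python
# Minimal sketch of the two hygiene steps described above (our illustration,
# not Comscore's implementation): filter non-user-requested traffic and count
# one beacon per page load. Field names and bot signatures are assumptions.

BOT_SIGNATURES = ("bot", "spider", "crawler")  # assumed user-agent markers

def count_page_views(beacons):
    """Count page views from raw beacon records.

    Each beacon is assumed to carry a 'user_agent' string and a
    'page_view_id' shared by every beacon fired from the same page load.
    """
    seen_pages = set()
    for beacon in beacons:
        ua = beacon["user_agent"].lower()
        # Step 1: drop non-user-requested traffic such as bots and spiders.
        if any(sig in ua for sig in BOT_SIGNATURES):
            continue
        # Step 2: count each page load once, however many beacons it fired.
        seen_pages.add(beacon["page_view_id"])
    return len(seen_pages)

if __name__ == "__main__":
    beacons = [
        {"user_agent": "Mozilla/5.0", "page_view_id": "p1"},
        {"user_agent": "Mozilla/5.0", "page_view_id": "p1"},  # duplicate beacon
        {"user_agent": "Googlebot/2.1", "page_view_id": "p2"},  # crawler, filtered
        {"user_agent": "Mozilla/5.0", "page_view_id": "p3"},
    ]
    print(count_page_views(beacons))  # -> 2
```

The point of the second step is that a single page load can fire several beacons (redirects, retries, multiple tags), so counting raw beacons rather than page loads would inflate the page-view total.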
In short, our industry requires a digital media measurement infrastructure equipped to handle the realities of the next decade and beyond, and Comscore has risen to the challenge. As with any major undertaking, we have encountered and overcome some hurdles along the way. But in the course of these seven months, we have learned an extraordinary amount, and we believe that knowledge is enabling us to vault our industry years into the future.
Through this undertaking, we’ve gained a new understanding of the highly complex and fragmented digital media landscape. To say that measuring this environment is complicated would be a severe understatement. But we have rapidly unearthed many of its intricacies and nuances, and accounting for them leads to a more accurate and harmonious measurement landscape.
- One of the earliest – and perhaps most obvious – findings along the way is ample evidence of the foibles of server-side analytics for measuring the number of unique visitors (i.e., people) who visit a site. Owing to several inflationary factors, including cookie deletion and rejection, bot and spider traffic, and site visitation from multiple locations, we’ve found clear and direct evidence that web site servers routinely overstate actual people counts by a factor of two or more (see the sketch following this list). In particular, the inflationary impact of cookie deletion is consistent with independent research from a variety of other research companies, including Forrester, Belden, Jupiter and Nielsen. The inflation in server data has also become apparent to the IAB (see its Audience Reach Measurement Guidelines) and to academics such as Max Fomitchev, an assistant professor of Computer Science & Engineering at Pennsylvania State University, who conducted an exhaustive study and concluded: “Cookies are about just as inaccurate in estimating unique visitors as unique network addresses. This is the new and unrealized fact in the industry that has a direct impact on Internet advertising as currently reported unique visitor / core audience size numbers tend to overestimate the true audience size by a large factor (7-30, depending on the visitation frequency and the sampling period).”
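To see how cookie deletion alone can produce inflation of this magnitude, consider a toy simulation; it is our illustration, not a model from Comscore or Fomitchev. Each person is assumed to visit once per period, and some fraction of them delete their cookies between periods, reappearing under fresh cookie IDs. The 30% per-period deletion rate and four-period window below are illustrative assumptions, not figures from the studies cited above.

```python
import random

# Toy simulation of cookie-deletion inflation (illustrative assumptions only):
# each person visits the site once per period; between periods, a fixed
# fraction delete their cookies and so arrive with a brand-new cookie ID.

def simulate_cookie_inflation(people=10_000, periods=4, deletion_rate=0.3, seed=1):
    random.seed(seed)
    next_cookie_id = 0
    current_cookie = {}
    for person in range(people):
        current_cookie[person] = next_cookie_id  # initial cookie per person
        next_cookie_id += 1

    cookies_seen = set()  # what a server counting "unique visitors" sees
    for _ in range(periods):
        for person in range(people):
            cookies_seen.add(current_cookie[person])  # one visit this period
            if random.random() < deletion_rate:
                current_cookie[person] = next_cookie_id  # cookie deleted
                next_cookie_id += 1

    # Inflation factor: unique cookies counted vs. true number of people.
    return len(cookies_seen) / people

if __name__ == "__main__":
    print(f"Inflation factor over 4 periods: {simulate_cookie_inflation():.2f}x")
```

Under these assumptions, the expected number of cookies per person is 1 + 0.3 × 3 ≈ 1.9, so a server counting unique cookies would report nearly twice the true audience – before bot traffic or multi-location visitation add further inflation.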