‘Measuring’ The Difference At Cannes This Year

As the dust continues to settle post-Cannes, and people across the media, technology, and advertising industries start to dive back into work, one takeaway from this year’s festival seems clear: discussions on TV data and measurement were notably subdued compared to recent years. Just 12 months ago, these topics dominated every stage, yacht, and beach bungalow, with industry players hashing out challenges, and at times even setting down their rosé to point fingers at one another’s solutions for their perceived shortcomings.

Yet this year, the volume and tone were different. Was it fatigue with the topic, the absence of a leading industry voice, or the lack of measurable progress or consensus among buyers and sellers? Perhaps AI discussions overshadowed the debate on measurement. Or have we simply accepted the status quo? While the reason for the shift might not be entirely clear, the lack of substantive discussion was noticed by many.

As someone who has been in the measurement space for more than a quarter-century, I believe the focus on the currency battle has been an unfortunate distraction. Perhaps there’s now a collective realization that we might be missing the forest for the trees. The sense on the ground this year shifted to a greater awareness that the battle for currency supremacy was, at its core, triggered by industry-wide frustration that Nielsen’s measurement products have not kept pace with the rapidly changing TV landscape.

One of Nielsen’s responses has been that currencies, all currencies, must be stable, and therefore any change must be carefully considered and given time to take hold. To be honest, I understand that answer on the surface. Consider, for example, the damage a volatile US dollar would cause to foreign trade and the US economy. But the need for a bigger panel has been obvious for twenty years, and set-top box (STB) data has been available at scale for fifteen.

Today’s media environment is very different from just a few years ago, and just as we have seen in many other industries where an innovation vacuum occurs, Nielsen’s lack of pace has opened the door for companies to propose new ideas. These alternatives in the currency race (alt-currencies, as they are now known) made big promises about big data and gave buyers and sellers the impression that things would change quickly. However, after roughly a decade of trying, here we (still) are. (Note that Comscore just celebrated its 25th anniversary; yes, time does fly when you are having fun.)

Following numerous meetings at Cannes with agencies, brands, media firms, and "alt" providers, the pressing question emerges: why haven't these alternatives fulfilled their promises, and more importantly, where do we go from here?

Two insights emerge:

  1. Not All Data Is The Same

    Big data is different from panel data (master of the obvious), and for a risk-averse sector, changing horses mid-ride could be perceived as, well, risky (remember, $65 billion worth of TV advertising is still being bought and sold, and while that number continues to shrink, it is still a large sum). And big data sourced from legacy smart TV manufacturers has issues. The devices know if the TV is on or off and, in some cases, what is on the screen. But because manufacturers don’t have a first-party relationship with their customers, they don’t know if the device is primary, secondary, or tertiary. They don’t know if someone is actually watching, and they don’t have verified demographic data. In short, big data does not yet deliver the information the industry has come to rely on.

  2. Currency vs. Analytics

    The second variable hamstringing innovation in the media research space is the fact that currency use cases are different from analytics use cases. “Big data” is the perfect fuel for deterministic attribution or accurate share-of-voice reporting, which are analytics—not currency. For us to move forward, we must accept that currency and analytics can survive separately and take a “horses for courses” approach.

Until a cost-effective way is developed to build and maintain a panel large enough to accurately represent the fragmented viewing behavior of 120 million US households, the future of TV data and measurement will remain in flux, and woefully behind the insights available in mobile and digital.

The challenges we face are known and readily addressable. We know intuitively that the panel size must be (a lot) bigger than 40,000. We also know that the reporting must include verified data about who is watching - not just that the TV is turned on, but how many (if any) actual humans were in front of the content and exposed to advertising. And we need this data to be not only stable and predictable (not likely to be withdrawn from the market due to an acquisition or other business shift) but also future-proofed for a respectful, transparent, privacy-first world. Each of these is solvable in a future where the consumer is part of the equation.

While Cannes may have seen muted discussions this year, 2025 promises to be a transformational year in TV data and measurement. Expect it to once again dominate conversations up and down the Croisette next summer.

Bob Ivins

Bob Ivins, a longtime industry innovator and thought leader, currently leads data strategy and measurement for Telly. Prior to that, he held senior roles at Comcast, Yahoo! Europe, Nielsen, and Comscore.
