Making attention work

Attention is an exciting new metric, but taking advantage of it is not that simple. Belle Cartwright, Global Tech Acceleration Lead at EssenceMediacom, busts some common myths around attention measurement.

Attention is designed to move brands beyond simply knowing that their ads appeared on a page, towards proof that an ad has been seen by a real human.

That makes it extremely attractive and over the last few years it’s built up enough momentum to be heralded as the new “it” measurement solution.

The goal is to understand whether a human has seen an ad for long enough to process the information contained within the ad. It’s sometimes seen as a step on from viewability, but it’s much more than that.

The ultimate benefit is to inform media planning and optimisation to drive better value outcomes.

Attention can do this in three key ways:

1. Raise the floor:

Attention metrics can be used to remove poor-performing inventory that is not looked at by a human. This helps us avoid inventory that appears viewable but rarely delivers business results.

2. Inform planning and optimisation:

Attention can act as a predictive, real-time signal for harder-to-measure business impacts such as brand lift and absolute sales lift, once correlation has been proven. Where there is a clear link, planners can use attention benchmarks and cost metrics to find the sweet spot between the minimum level of attention needed to deliver lift and the maximum amount that’s worth paying (a simple sketch of this trade-off follows the list).

3. Drive creative x media breakthroughs:

As this evolves, it will allow us to design a custom measure based on whether we need to a) “surprise” consumers with our messaging and break the norm of what a person may believe about a brand, or b) “reinforce” a message by fitting into their existing understanding and belief systems for the brand. These scenarios use both media and creative attention measures.
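To make the first two ideas concrete, the sketch below is a hypothetical illustration only: the placements, benchmarks and costs are invented and this is not any vendor’s or agency’s actual workflow. It shows how an attention floor can screen out low-attention inventory, and how cost per attentive second can then point to the sweet spot between attention and price.

```python
# Hypothetical sketch: using an attention floor and cost-per-attentive-second
# ranking in planning. All placements, benchmarks and prices are invented.

from dataclasses import dataclass

@dataclass
class Placement:
    name: str
    cpm: float                 # cost per thousand impressions
    attentive_seconds: float   # assumed average attentive seconds per impression

placements = [
    Placement("site_a_banner", cpm=2.50, attentive_seconds=0.4),
    Placement("site_b_video", cpm=9.00, attentive_seconds=2.8),
    Placement("site_c_native", cpm=4.20, attentive_seconds=1.6),
]

# Assumed minimum attentive seconds needed to deliver lift (illustrative).
ATTENTION_FLOOR = 1.0

# 1. Raise the floor: drop inventory that rarely earns any real attention.
eligible = [p for p in placements if p.attentive_seconds >= ATTENTION_FLOOR]

# 2. Find the sweet spot: rank what remains by cost per 1,000 attentive
#    seconds (CPM divided by attentive seconds per impression), so we do not
#    overpay for attention beyond what is needed.
ranked = sorted(eligible, key=lambda p: p.cpm / p.attentive_seconds)

for p in ranked:
    cost = p.cpm / p.attentive_seconds
    print(f"{p.name}: {cost:.2f} per 1,000 attentive seconds")
```

In practice, the floor and the acceptable cost ceiling would come from benchmarks that have been shown to correlate with the lift measure the campaign cares about.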

How to measure attention

Before we can take advantage of these opportunities, however, we need accurate measures for attention.

There are currently two basic types of attention metric: the first seeks to create an enhanced version of viewability as a proxy for attention, while the second incorporates eye-tracking data to “scientifically” measure attention.

Most attention vendors use a mix-and-match approach to create their attention metrics, combining page/domain context, ad context, eye tracking and modelling to produce two key types of metric: attentive seconds and attention scores.

The former measures “time in view”: how long someone actually looks at an ad. It is primarily based on panel observations from eye-tracking data, correlated with page/domain and ad-context data (e.g., the percentage of screen an ad takes up) and then modelled out to scale the metric.

The latter seeks to create a score per ad format using an algorithm trained on multiple media datasets. The score can be generated in near real time using live tags and ad-level media performance reports that capture the formats being run, page position, clutter, size of the ad on screen and other inventory-based data.
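As a rough illustration of the “modelled out to scale” step (not any vendor’s actual methodology; the features and numbers below are invented), a simple regression can relate context signals observed in an eye-tracking panel to measured attentive seconds, and then predict attentive seconds for impressions where only the context signals are available:

```python
# Hypothetical sketch: scale panel-measured attentive seconds to untagged
# impressions with a simple linear model. Features and values are invented.

import numpy as np

# Panel observations: [share of screen covered, seconds in view]
panel_features = np.array([
    [0.10, 2.0],
    [0.25, 5.0],
    [0.50, 8.0],
    [0.80, 12.0],
])
# Attentive seconds measured by eye tracking for the same impressions.
panel_attentive_seconds = np.array([0.3, 1.1, 2.4, 4.0])

# Fit ordinary least squares with an intercept term.
X = np.hstack([panel_features, np.ones((len(panel_features), 1))])
coefs, *_ = np.linalg.lstsq(X, panel_attentive_seconds, rcond=None)

# Predict attentive seconds for a new impression where we only know the
# context signals, not the panel measurement.
new_impression = np.array([0.35, 6.0, 1.0])  # features plus intercept term
predicted = new_impression @ coefs
print(f"Predicted attentive seconds: {predicted:.2f}")
```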

The idea behind a score, rather than attentive seconds, is to provide comparability and a cost basis for buying attention across rich media, static and video formats of various lengths.
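For example, assuming a 0–100 attention score (the figures and the “attention-adjusted CPM” framing below are illustrative assumptions, not an industry standard), the score can be folded into cost to compare formats that are otherwise hard to compare directly:

```python
# Hypothetical comparison: fold a 0-100 attention score into CPM to get a
# cost per 1,000 "fully attentive" impressions. All figures are invented.

formats = {
    "static_banner": {"cpm": 2.00, "attention_score": 20},
    "rich_media":    {"cpm": 6.00, "attention_score": 55},
    "15s_video":     {"cpm": 12.00, "attention_score": 70},
}

for name, f in formats.items():
    # Lower values mean the format buys attention more efficiently at its
    # current price.
    attention_adjusted_cpm = f["cpm"] / (f["attention_score"] / 100)
    print(f"{name}: attention-adjusted CPM = {attention_adjusted_cpm:.2f}")
```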

A range of metrics

Having different types of attention measurement also enables us to select the right one for each use case and desired business outcome. For example, attentive seconds are incredibly useful for understanding the value of video inventory, while attention scores provide significant value in planning.

There are still challenges, however, with the lack of metric consistency. For example, not all vendors use eye-tracking panel study validation and many don’t consider visual clutter. What matters in such cases is our ability to prove that these metrics deliver positive outcomes for our clients.

Many vendors are also limited to programmatically bought inventory with almost none of the walled gardens allowing active tagging of their inventory sets. Even without walled garden data, however, attention is hugely exciting and provides an opportunity to move closer to a measure that is more predictive of value outcomes for advertisers.

The bottom line, however, is that attention is still an evolving metric despite the hype.

Best practice for advertisers is to test which metrics and partners work for their specific needs, working with vendors to help evolve their methodologies, while taking into account key limitations on scaling attention across inventory sets and markets.

Article first published by Performance Marketing World, April 10th 2024.