Guest Commentary: Frequency-Capping Linear TV
"We are reducing non-productive spend by leveraging our scale better, duplicating work less and getting more copy on air. This allows us to hold our overall media spend and remain fully competitive"
These comments from Unilever's Chief Financial Officer Jean-Marc Huet during his company's quarterly earnings call last month represent the aspirations, if not outright demands, of virtually all large marketers.
Every marketer needs to pursue efficiency in media spending, especially in light of double-digit rates of inflation for television advertising. To that end, eliminating duplication is increasingly important.
However, if such efforts relate only to the processes associated with "non-working" spending (such as creative production, media planning and buying operations), it will be difficult to save more than a few percentage points of total expenditures. "Working" costs associated with actual advertising inventory usually account for more than 85% of a campaign's budget, and this is where duplication adds significant cost.
A marketer may develop a campaign intending to reach a defined target audience a certain number of times over the course of its flight. But if some of the target audience sees a commercial in heavy rotation while others in the target seldom do, exposure has been needlessly duplicated among some viewers while others were under-served.
On the web, the solution to this problem is called "frequency-capping" and has become standard practice. In television advertising, however, it is virtually non-existent, largely because of the limitations of conventional television planning tools and data sets.
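For readers more familiar with digital media, the mechanics are worth spelling out: an ad server keeps a running exposure count per user and declines to serve once a cap is reached. The Python sketch below is a minimal, hypothetical illustration of that logic; the cap value, counter and function names are invented for this example.

    from collections import defaultdict

    FREQUENCY_CAP = 6  # hypothetical per-user exposure limit for the campaign

    exposure_counts = defaultdict(int)  # user_id -> ads already served

    def should_serve(user_id):
        # Serve the ad only while the user remains under the campaign cap.
        return exposure_counts[user_id] < FREQUENCY_CAP

    def record_exposure(user_id):
        exposure_counts[user_id] += 1

    # The same user requests ads ten times; delivery stops at the cap.
    for _ in range(10):
        if should_serve("user-123"):
            record_exposure("user-123")

    print(exposure_counts["user-123"])  # prints 6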
To put this in practical terms, we recently ran an analysis of three similar campaigns for marketers with directly comparable products. In each case the marketer reached approximately 80% of its target audience an average of six times during the campaign. Such results undoubtedly met the media plans' goals and were likely deemed successes.
But the data sets we manage showed that in each case approximately 30% of the target audience was exposed to the campaign's commercials more than 10 times, approximately 20% was exposed between five and nine times, and approximately 30% was exposed between one and four times. Nearly 20% of the target audience was not reached at all during each of these campaigns.
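As a rough illustration of this kind of respondent-level analysis, the sketch below computes reach, average frequency and an exposure distribution from a hypothetical set of per-viewer exposure counts. The data is invented to echo the shape of the distribution described above; it is not drawn from the actual campaigns.

    from collections import Counter

    # One exposure count per target viewer; shaped to echo the distribution
    # described above (30% heavy, 20% medium, 30% light, 20% unreached).
    counts = [11] * 300 + [6] * 200 + [2] * 300 + [0] * 200

    reached = [c for c in counts if c >= 1]
    reach_pct = 100 * len(reached) / len(counts)
    avg_frequency = sum(reached) / len(reached)
    print(f"Reach: {reach_pct:.0f}% at an average frequency of {avg_frequency:.1f}")

    def band(count):
        # Band boundaries mirror the buckets above; a count of exactly 10
        # would need its own rule, but this sample data avoids it.
        if count > 10:
            return "more than 10"
        if count >= 5:
            return "5 to 9"
        return "1 to 4"

    distribution = Counter(band(c) for c in reached)
    for label, n in distribution.items():
        print(f"Exposed {label} times: {100 * n / len(counts):.0f}% of target")
    print(f"Not reached: {100 * counts.count(0) / len(counts):.0f}% of target")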
By applying predictive algorithms to the inventory included in a buying plan, frequency could have been balanced far more evenly. It is possible to identify the programming likely to contain high concentrations of audiences already exposed to a commercial several times. Budgets intended for those duplicative units of programming can instead be allocated to units containing large concentrations of audiences who have been under-exposed to the campaign. Accomplishing this requires reliable set-top box or other respondent-level data sets paired with reliable predictive algorithms, and those technologies are now available from a number of providers, including Simulmedia.
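To make the reallocation step concrete, here is a deliberately simplified sketch assuming each inventory unit carries a model-predicted share of impressions landing on under-exposed target viewers. The unit names, costs, predictions and threshold are all hypothetical.

    from dataclasses import dataclass

    @dataclass
    class InventoryUnit:
        name: str
        cost: float
        # Model-predicted share of this unit's target impressions that would
        # land on viewers still below the desired frequency.
        underexposed_share: float

    DUPLICATION_THRESHOLD = 0.40  # below this, a unit mostly re-hits saturated viewers

    plan = [
        InventoryUnit("primetime drama", 50_000, 0.15),
        InventoryUnit("late-night movie", 20_000, 0.55),
        InventoryUnit("weekend sports block", 35_000, 0.30),
        InventoryUnit("daytime game show", 15_000, 0.70),
    ]

    # Budget freed from duplicative units...
    freed = sum(u.cost for u in plan if u.underexposed_share < DUPLICATION_THRESHOLD)
    keepers = [u for u in plan if u.underexposed_share >= DUPLICATION_THRESHOLD]

    # ...is redistributed in proportion to each remaining unit's predicted
    # ability to reach under-exposed viewers.
    total_share = sum(u.underexposed_share for u in keepers)
    for u in keepers:
        u.cost += freed * u.underexposed_share / total_share
        print(f"{u.name}: revised budget ${u.cost:,.0f}")

A production system would of course optimize across far more units and constraints simultaneously; the threshold-and-redistribute step above is meant only to show the direction of the reallocation.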
Frequency-capping is conceptually simple and has become a standard approach on the web. Not surprisingly, we have seen the concept work on television with our clients, and we expect adoption to keep growing. Given the competitive nature of marketers and the continuing pressure they face to optimize their budgets, it is only a matter of time before frequency-capping becomes as ubiquitous among television advertisers as it is among those using the internet today.
Brian Wieser is the Chief Marketing Officer of Simulmedia.