Retail Trading Areas & The Post-Cookie World

 

Today there’s a big… huge… enormous elephant in the room of advertising and marketing: the death of cookie-based digital marketing.  The scant discussion of the topic comes down to one simple reason: no one really knows how to solve the problem.  The most popular answer tossed around by the ad agency illuminati seems to be to ditch bottom-up digital attribution models and return to relying only on media mix modeling for planning and execution.

 

Really?  The answer is to go back to the standard of the analog, TV-dominant era?  The fact that Facebook does not share sufficient data to make this plausible is completely ignored… unless you’re planning to drop Facebook and Instagram entirely from your plans.

 

Let’s take a step back for some perspective.

 

Back in the good old days, the advent of TV as the dominant marketing medium pushed ad agencies to research, plan, purchase, and evaluate media plans at a macro (national) level. A highly homogeneous society watched the same portfolio of content on three dominant networks.  There was an audit process for the odd occasion when a national show under-delivered in a few local markets, but those were the exceptions that proved the rule.

 

In that world, ratings and demographics were everything:

·        Ratings generated by Nielsen from a small sample of households, projected to reflect the entire nation

·        Demographic breakdowns of program audiences that let marketers narrow their selection of content for ad placement, keeping TV investment affordable

 

By the end of that era (circa 2011), cable/satellite fragmentation of TV audiences had killed the golden goose: it had become impossible to achieve affordable reach above 50%, while massively excessive ad frequency was delivered in over-saturated commercial pods.

Then along came streaming and digital advertising, with their new definition of an “impression” (one second of screen time with at least 50% of the ad’s pixels in view) and “cookies” that enabled unique user identification and attribution.
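For reference, that impression threshold can be expressed directly in code. The sketch below is a minimal illustration of the 50%-of-pixels / one-second check, assuming simple exposure records; the field names are hypothetical, not any ad server’s actual schema.

```python
# Minimal sketch: flagging a digital "impression" as viewable under the
# 50%-of-pixels / one-second display threshold described above.
# ExposureRecord fields are illustrative, not any ad server's actual schema.

from dataclasses import dataclass

@dataclass
class ExposureRecord:
    pct_pixels_in_view: float   # 0.0-1.0, share of the ad's pixels on screen
    seconds_in_view: float      # longest continuous time at that visibility

def is_viewable_impression(rec: ExposureRecord) -> bool:
    """True if the exposure meets the 50%-of-pixels / one-second threshold."""
    return rec.pct_pixels_in_view >= 0.5 and rec.seconds_in_view >= 1.0

# 60% of pixels for 1.2 seconds counts; 40% for 3 seconds does not.
print(is_viewable_impression(ExposureRecord(0.6, 1.2)))  # True
print(is_viewable_impression(ExposureRecord(0.4, 3.0)))  # False
```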

 

Cookies also allowed the industry to double down on the practice of nationally based planning and execution.  This fueled a massive expansion of ad tonnage: digital impressions were cheap, clients wanted ever-lower CPMs, and tonnage meant more fees for the buying agencies.

 

Just a couple of problems with all of this.

 

1)     Massive digital content fragmentation among a much more diverse consumer universe has resulted in a non-monolithic marketplace. Content consumption in individual TV markets can vary considerably from the national average, and consumption within communities can vary from their own market’s average. National programmatic buys have no formal audit process to ensure equal delivery among markets, a critical issue, especially for multi-unit retailers.

2)     Cookies came with a price: they pushed the boundaries of the nascent or non-existent PII rules of consumer and government oversight agencies, until the inevitable pushback came.

 

Future solutions depend on our ability to recognize the primacy of the retail trading area.  Understanding consumers through the prism of the trading areas where they live and work is the key to finding effective solutions.  Daily consumer behavior (migration, retail traffic, content consumption) is closely tied to three factors specific to each trading area: density, local competitors, and demographic composition.

 

Because of this, behavioral segmentation, based on what consumers are actually doing every day in trading areas across the country, becomes the primary research and planning tool.  Other segmentation analyses (demographic, psychographic) are secondary, engaged only after the geographic, behaviorally qualified audiences are confirmed.
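To make that ordering concrete, here is a minimal sketch of behavior-first audience building: devices are qualified by what they actually did inside a trading area, and demographic attributes are layered on only afterward. The visit records, thresholds, and demographic join are illustrative assumptions, not any platform’s actual data model.

```python
# Sketch of behavior-first segmentation: qualify devices by observed behavior
# inside a trading area first, then apply secondary (demographic) filters.
# The visit records, thresholds, and demographic table are illustrative.

from collections import Counter

visits = [
    # (device_id, trading_area, venue_category)
    ("d1", "TA_042", "grocery"),
    ("d1", "TA_042", "grocery"),
    ("d2", "TA_042", "qsr"),
    ("d3", "TA_007", "grocery"),
]

def behaviorally_qualified(visits, trading_area, category, min_visits=2):
    """Devices with at least min_visits visits to `category` venues in the area."""
    counts = Counter(
        dev for dev, area, cat in visits if area == trading_area and cat == category
    )
    return {dev for dev, n in counts.items() if n >= min_visits}

# Primary step: the geographic, behaviorally qualified audience.
audience = behaviorally_qualified(visits, "TA_042", "grocery")

# Secondary step: demographic attributes are joined only to the already
# qualified audience, never used as the starting filter.
demographics = {"d1": {"income_band": "mid"}, "d2": {"income_band": "high"}}
refined = {dev for dev in audience if demographics.get(dev, {}).get("income_band") == "mid"}

print(audience, refined)  # {'d1'} {'d1'}
```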

 

How can this local consumer behavior be measured, while still adhering to stricter PII guidelines?  

The key is to treat the smartphone as a consumer avatar: capturing and analyzing the device’s behavior as a stand-in for the consumer, without ingesting any privacy or identification data.
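One common way to implement the avatar idea is to replace the raw device identifier with a salted one-way hash before any analysis, so behavior can be aggregated per device without retaining PII. Below is a minimal sketch, assuming a simple SHA-256 pseudonymization step; the salt handling and ping fields are illustrative, not any vendor’s actual pipeline.

```python
# Sketch: pseudonymize a device identifier before analysis, so location pings
# can be aggregated per "avatar" without retaining the raw ID or other PII.
# The salt handling and ping fields are illustrative assumptions.

import hashlib

SALT = b"rotate-me-regularly"   # in practice, a secret salt that is rotated

def device_avatar(raw_device_id: str) -> str:
    """One-way, salted hash of the raw identifier; the raw ID is never stored."""
    return hashlib.sha256(SALT + raw_device_id.encode()).hexdigest()

ping = {
    "device_id": "AEBE52E7-03EE-455A-B3C4-E57283966239",  # hypothetical ad ID
    "lat": 33.7490,
    "lon": -84.3880,
    "ts": "2024-05-01T12:30:00Z",
}

anonymized_ping = {**ping, "device_id": device_avatar(ping["device_id"])}
print(anonymized_ping["device_id"][:16], "...")
```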

 

Two companies that are using anonymized device data for client solutions are Placer.ai and Intuizi.  Both platforms focus on device behavior within retail trading areas, but there are critical differences:

 

>Placer.ai follows a “Nielsen” methodology: its universe consists of ~30 million devices (~7% of the 400 million active U.S. devices), with data analyzed and projected to specific trading areas.  A sample of that size likely does not capture enough geographic variance to ensure projectability within any individual trading area.  Placer.ai provides consumer insights data only.

 

>Intuizi’s database comprises ~380 million devices (~95% of the total), with the ability to drill down to any specific trading area for analysis of non-projected, real-time data.  Intuizi can provide consumer insights data and analysis, along with the ability to activate marketing plans through the client’s agency team or directly through Intuizi’s DMP partners.
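The practical difference between the two approaches can be shown with a toy calculation. The sketch below is not a description of either vendor’s methodology; it only illustrates why projecting a ~7% national panel with a single national scaling factor can misestimate an individual trading area when local panel penetration differs from the national average, while a near-census count is used more or less directly.

```python
# Toy illustration (not either vendor's methodology): projecting from a
# national panel vs. counting a near-census device set in one trading area.

NATIONAL_DEVICES = 400_000_000
PANEL_DEVICES = 30_000_000                        # ~7.5% national panel
NATIONAL_FACTOR = NATIONAL_DEVICES / PANEL_DEVICES

# Hypothetical ground truth: 100,000 devices visited a store in this area.
true_visitors = 100_000

# Case A: panel penetration in this area happens to be 5%, not the national
# ~7.5%, so scaling the panel count by the national factor underestimates it.
panel_observed = int(true_visitors * 0.05)
projected_estimate = panel_observed * NATIONAL_FACTOR     # ~66,700

# Case B: a near-census dataset (~95% of devices) is counted almost directly.
census_observed = int(true_visitors * 0.95)
census_estimate = census_observed / 0.95                  # ~100,000

print(round(projected_estimate), round(census_estimate))
```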

 

The means to solve for the post-cookie world are out there, but they will require CMOs and agencies to think differently from the conventional wisdom of the cookie-myopic first decade of digital.

 

Or, to coin a phrase, we need to act globally by analyzing locally.

 

 
