data Archives - Digital Content Next (Tue, 07 Oct 2025)

As media content spending grows, AI and data drive strategy
https://digitalcontentnext.org/blog/2025/10/07/as-media-content-spending-grows-ai-and-data-drive-strategy/
Tue, 07 Oct 2025

The post As media content spending grows, AI and data drive strategy appeared first on Digital Content Next.

Hybrid media models that blend traditional content with user-generated material, along with strategic partnerships and advanced data analytics, are emerging as essential strategies for success in today’s media landscape. According to new research by KPMG LLP, The Future of Content Spend and Business Models in Media, individual creators and streaming platforms are gaining influence, and AI tools are becoming increasingly important for both data analysis and content creation.

Content spending by top streaming platforms

The 12 leading content players studied for the report, which include Disney, Amazon, Paramount, Netflix, Comcast, and YouTube, spent about $210 billion on content in 2024. This represents a 10% compound annual growth rate since 2020. Comcast led the way in 2024 with $37 billion in spending, followed by YouTube at $32 billion, Disney at $28 billion, and Amazon at $20 billion. The top 12 spenders were primarily U.S.-based platforms and media companies.

Among the big takeaways:

  • Investments in live sporting events continue to rise, while investment in scripted and reality programming has slowed.
  • The rising popularity of free streaming platforms such as PlutoTV and Tubi is poised to accelerate content expansion.
  • The future will rely on blending traditional, high-budget film and television series with nimbler user-generated content.
[Chart: how content gets to the screen, from studios and TV to streaming platforms, and its impact on content spending and monetization]

The rising impact of user-generated content

User-generated content, enabled by social media platforms, has become an essential part of the media content landscape, and one that increasingly overlaps with traditional studio TV and film models. A few takeaways:

  • The rapid expansion of user-generated content is outpacing other content categories and that trend is expected to continue.
  • User-generated content has become its own genre. As such, rather than replacing traditional TV and film material, it has become a critical part of a hybrid media model.
  • The line will continue to blur between traditional studio models of financing content and social media and streaming platforms that enable individuals to profit from content through ads, sponsorships, and memberships.
  • Competition for influential individual content creators is likely to intensify, requiring innovative collaboration and partnership strategies.

Partnerships at home and abroad

To remain fully competitive, major media companies will need to partner with individual content creators, as well as other media entities, technology companies, and telecommunications outlets. It’s also essential to interact effectively with global markets.

International audiences prefer lower price points and ad-supported structures. They also gravitate towards local content, which can mean “localization” of exported U.S. material to suit international markets. To be competitive in the global marketplace, media companies need to tailor their content and services to include flexible pricing and audience customization.

AI and data analytics influence strategic content spending

Data is the key to gaining insights and making decisions that drive return on investment. The ability to leverage consumer data to enhance personalization and target content investment wisely will be critical going forward. AI utilization will be integral to this process, with AI tools increasingly relied upon to automate, enhance, and extract insights from data.

AI isn’t just playing a role in data analytics, however; it’s also impacting content. This report lists “Choose-your-own adventure narratives, automated local dialogue, and ultra-low-cost formats” among the content AI could generate. However, the authors opine that, due to the importance of human talent and fandom, AI will augment rather than take over the content production process.

Smart choices for content spending

As media continues to evolve, content leaders face pivotal choices. The blending of studios, platforms, and creators, alongside the growth of ad-supported streaming and AI-powered personalization, is changing how content is made, shared, and monetized.

To stay competitive, leaders need to adopt flexible business models, invest wisely across formats, and connect directly with audiences. Success will hinge on spending smarter by leveraging data, technology, and partnerships to grow new value in the shifting media ecosystem.

Calibrated Signal: audience data yield without audience data leakage
https://digitalcontentnext.org/blog/2025/09/08/calibrated-signal-audience-data-yield-without-audience-data-leakage/
Mon, 08 Sep 2025

The post Calibrated Signal: audience data yield without audience data leakage appeared first on Digital Content Next.

There isn’t a premium content company on the planet that isn’t feeling the combined effects of several significant shifts reshaping how publishers reach and monetize audiences. Google’s AI Overviews now answers more user queries directly on the search results page, reducing click-throughs to original content and reporting. Brand safety systems have long filtered out legitimate journalism when stories touch on sensitive topics, further shrinking the monetizable audience on the open Internet. And since Apple began tightening the screws on third-party identity, high-value audience budgets have been steadily consolidating inside walled gardens, where targeting and measurement are easy, and results are immediate.

It’s tempting to frame publisher revenue as a traffic problem or an audience targeting problem. In reality, it’s a control problem. Walled gardens win because they own the most useful audience data, control the environment in which it’s applied, and make performance measurement effortless for buyers. The open Internet, by contrast, has allowed too much of its most valuable audience signal to slip away. Publishers have an incredibly valuable asset, which they have failed to surface to advertisers in usable ways without giving away the farm. The result is a performance gap that premium publishers can’t afford to ignore.

The way forward isn’t to hand over more raw data in the hopes of winning a larger share of your clients’ ad budgets. It’s to calibrate the signal you share: offer enough to make your media competitive on targeting, measurement, and optimization, while keeping control of the underlying data.

At Symitri, we call this approach Calibrated Signal.

Defining Calibrated Signal

Calibrated Signal is the deliberate practice of exposing just enough information about your audience and content to drive advertiser performance, without revealing personal identifiers or relinquishing ownership of the data. It’s a middle ground between “lock it all down” and “let it all out,” designed to satisfy privacy requirements, protect competitive advantage and give buyers the clarity they need to plan, measure and optimize with confidence. Publishers who calibrate signal stay in control of their valuable audience data while giving buyers everything they need to justify premium CPMs.

This isn’t theoretical. Calibrated Signal can take the form of aggregated intent cohorts, recency-based engagement tiers, deterministic privacy-safe audiences, affinity-oriented contextual signals and privacy-safe conversion reporting. Each element is designed for one purpose: to make it easy for an advertiser to plan, buy, measure and optimize against your audience as effectively – or more effectively – than they can inside a walled garden.
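As a concrete illustration of the first of those elements, aggregated intent cohorts, here is a minimal sketch in Python. The cohort names and the k=50 suppression threshold are illustrative assumptions, not Symitri's actual implementation; the point is that buyers see only cohort labels and sizes, never user identifiers:

```python
# Minimal sketch of aggregated intent cohorts with a privacy threshold.
# Cohort names and the threshold value are illustrative assumptions.
K_MIN = 50  # cohorts smaller than this are suppressed, not exposed

def build_intent_cohorts(events, k_min=K_MIN):
    """Aggregate per-user events into buyer-facing intent cohorts.

    `events` is an iterable of (user_id, intent_label) pairs. Only
    cohort labels and sizes are returned; user_ids never leave.
    """
    cohort_users = {}
    for user_id, intent in events:
        cohort_users.setdefault(intent, set()).add(user_id)
    return {
        intent: len(users)
        for intent, users in cohort_users.items()
        if len(users) >= k_min  # suppress cohorts too small to be privacy-safe
    }

# 120 distinct users show auto intent; only 7 show a niche interest.
events = [(f"u{i}", "auto_intenders") for i in range(120)] + \
         [(f"u{i}", "niche_topic") for i in range(7)]
cohorts = build_intent_cohorts(events)
# The niche cohort falls below the threshold and is suppressed.
```

The same pattern extends to recency-based engagement tiers: bucket users internally, then expose only the bucket labels and counts that clear the threshold.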

A publisher playbook for Calibrated Signal

Premium content publishers are at an advantage because they tend to attract repeat, engaged audiences. This regularity is a built-in differentiator that distinguishes premium publishers from the rest of the transactional open Internet. For publishers in this category, creating value from Calibrated Signal requires a modest effort and investment that will yield significant returns.

1. Own your data spine

Start with consented, first-party data and a durable identity framework you control. If you can motivate your audience to log in, that’s gold. But even without explicit authentication, commercial identity solutions can help you uniquely recognize repeat visitors in a privacy-compliant fashion. Whatever your approach, standardize the way you capture and categorize signal – content taxonomies, intent cohorts, engagement tiers – so it maps cleanly to advertiser needs. Share the schema (not the identifiers) so buyers or their technology partners can act on it without taking custody of the underlying data.
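To make "share the schema, not the identifiers" concrete, here is a hypothetical sketch. The field names and tier values are invented for illustration; the idea is that buyers receive the shape and vocabulary of your signal while identifiers stay behind your wall:

```python
# Hypothetical buyer-facing schema: the taxonomy, cohorts, and tiers a
# buyer can plan against. All names here are illustrative assumptions.
SIGNAL_SCHEMA = {
    "content_taxonomy": ["news/politics", "sports/nfl", "finance/markets"],
    "intent_cohorts": ["auto_intenders", "travel_planners", "lapsed_subscribers"],
    "engagement_tiers": ["daily", "weekly", "occasional"],  # recency-based
}

def to_buyer_signal(user_record):
    """Map an internal user record to schema-level signal only.

    The returned dict carries the cohort, tier, and taxonomy labels a
    buyer can target against, and no identifiers of any kind.
    """
    return {
        "intent_cohort": user_record["cohort"],
        "engagement_tier": user_record["tier"],
        "content_affinity": user_record["top_section"],
    }

internal = {"user_id": "hash-abc123", "cohort": "travel_planners",
            "tier": "weekly", "top_section": "news/politics"}
exposed = to_buyer_signal(internal)
# `exposed` contains no "user_id"; only schema-level labels leave the publisher.
```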

2. Federate signal to create reach

Even the largest premium publishers rarely have enough audience scale to fulfill most campaign objectives on their own. That’s why Calibrated Signal works best when it can be activated across a network of trusted publishers with a consistent taxonomy and governance model. When buyers can plan, target and measure against a unified signal set that spans multiple properties – without the underlying audience data ever leaving publisher control – it becomes a true alternative to walled-garden scale.

3. Measure results in real time

Performance budgets follow proof. Programmatic performance budgets follow proof in real time.

While closed-loop measurement and optimization are standard in walled gardens, achieving real-time performance on the open Internet has become far more challenging. Why? Because underlying audience identity signals have largely disappeared from publisher inventory. Premium publishers need to own the work of restoring the signals that make fast, consistent outcome validation possible across trusted inventory – without giving away the farm. When that happens, the results will speak for themselves. Research strongly suggests that the premium open Internet will easily match – and likely exceed – platform benchmarks.

4. Give buyers the confidence to trust what they’re buying

All this focus on audience data and measurement would be meaningless without the valuable, high-quality inventory to which Calibrated Signal is attached. Buyers need to know they’re reaching real audiences in brand-safe environments that have a high propensity to perform. That means eliminating MFA inventory, ensuring direct, cost-efficient supply paths and consistently operating with verifiable supply-chain transparency. With these quality promises – and a consistent level of Calibrated Signal – buyers are likely to shift their big budget bets from the walled gardens to the trusted and transparent environments that premium open Internet publishers can uniquely offer.

5. Anchor everything in outcomes

Structure every campaign with clear KPIs – incremental reach, CPA improvement, conversions and lift – and fixed timelines. Make it easy for buyers to prove value quickly and justify scaling spend.

Why this is different from legacy data deals

Legacy audience-data deals often handed over audiences with limited controls or offered broad contextual targeting with no performance accountability. Calibrated Signal is different in three ways:

  • Value is intentional: you share only what improves outcomes, nothing more.
  • Measurement is integral: buyers don’t just trust; they test and see results.
  • Privacy is by design: you always own and control your valuable consumer data assets.

This reframes the open Internet’s so-called “signal loss.” No individual publisher needs the same identity resolution scale as a walled garden. Instead, you and your premium open Internet publishing peers need consistent, privacy-safe signals that outperform on cost per outcome.

A premium publisher call to action

Traffic patterns will continue to shift as AI captures more queries on the results page. Brand safety systems will still misclassify legitimate journalism. Don’t fight these headwinds by creating more commodity inventory or compromising with looser data practices. Win with Calibrated Signal that gives you the power to stay in control of your data while proving you can generate outcomes for advertisers that matter. When you do, brands will discover what many already suspect – that the open Internet can outperform walled gardens when premium publishers make signal a product, not a by-product.

Capitalize on first party data (like platforms do)
https://digitalcontentnext.org/blog/2025/08/21/capitalize-on-first-party-data-like-platforms-do/
Thu, 21 Aug 2025

The post Capitalize on first party data (like platforms do) appeared first on Digital Content Next.

Two Thanksgivings ago, Amazon Prime streamed its first Black Friday NFL game, part of its groundbreaking exclusive deal for Thursday Night Football broadcasts.

Pundits from the broadcast and streaming industries were watching to see whether Amazon’s servers could handle the load (no problem) or whether its production standards were up to snuff (mostly).

The thing that caught my eye?

The QR codes embedded alongside some traditional 30-second commercials. They explicitly invited viewers to whip out their phones and get early access to pre-holiday deals.

On Amazon, of course. Without ever taking an eye off the game.

In a single moment, Amazon crystallized the changes happening in advertising. They clearly demonstrated how first-party data owned by the tech platforms trumps the decades of experience of those of us in ad-supported media industries.

Buy direct

If those QR codes don’t make your blood run cold – well, they should. They’re one more sign of the need for those of us in traditional media to get serious about understanding our audience members as individuals and build direct relationships with them.

“Amazon’s real advantage is the depth of their first-party data,” says Bill Day, former senior vice president of the media research firm Magid Associates. “Prime Video accounts are directly tied into your Prime shopping account – a seamless integration from video exposure all the way to point of sale activation for endemic categories.”

In other words: Amazon knows who its audience members are (including credit-card numbers), where they live, what they buy. And they can connect them to a one-click purchase system.

Viewers are buying

Up next, Day says: Moving beyond endemic categories to local advertisers.

Amazon even has a fancy product name – “Interactive Video Ads” – a combination of QR codes, remote-control cues and other triggers to allow viewers to research and buy advertisers’ products without leaving the stream.

Last Thanksgiving, Amazon sold out its Black Friday game inventory four months ahead of the games. Even the Super Bowl rarely sells out that early.

Jay Marine, Amazon’s global head of sports, pitched the IVAs at this year’s upfronts, telling the Hollywood Reporter, “What we’re able to do, which excites advertisers, is deliver that live event scale, combined with the digital insights, combined with the Amazon shopping capabilities.” He pointed out that “a customer can go from watching the game to seeing an interactive advertisement that they can one-click ‘buy,’ and it’s showing up at their door in a couple hours. … I think we’re really positioned to deliver something that they can’t find in the rest of the market.”

(He doesn’t think. He knows. Can your media outlet do that? Didn’t think so.)

Individual knowledge, and data, is power

It’s not just Amazon, either. Every member of the so-called FAANG club knows more about individual audience members than we do. Meta understands consumers’ content consumption on Facebook and Instagram. They also have deep insight into audience members’ other browsing through their tracking pixel.

Alphabet can build a dossier based on a user’s Google Chrome browser history and YouTube viewing. Then it can deploy that data through the chunks of the programmatic ad ecosystem that it owns.

Apple has data from iPhones, its podcast platform and, increasingly, Apple TV+. Netflix knows what, when and where you watch video.

Meanwhile, too many of us talk in broad demographic strokes: “We’re No. 1 with women 25-54!”

Know your audience now

Regardless of our industry of origin – newspaper, broadcast, digital native – media companies are severely handicapped compared to the tech platforms, which have decades of first-party data and the expertise to use it effectively.

So what do we do?

Start with a serious commitment to gathering more of your own first-party data – from digital log files, from active outreach like quizzes and newsletters, and through value exchanges with your audiences.

We should think, too, about how we might revisit and refresh ideas the industry considered, and rejected, decades ago.

One of my favorites, from Ye Olden Days of the ‘90s: The legacy newspaper industry formed a consortium, the New Century Network, to build collective tools in the then-emerging technologies of the internet. One of those proposed answers was a universal registration system (what today we would call single sign-on attached to a data lake) that would aggregate identity and usage data across the major news sites of the day.

My mentor Owen Youngman – then a senior executive at Tribune Co., later a professor of digital media at Northwestern – loudly advocated for that system (and probably crafted most of the pitch). It was basically laughed out of the room by companies who believed their classified-ads monopoly was divinely granted and immortal. So, why worry about first-party data?

Oops.

Within a decade, Google effectively built out that model, using tools like Gmail to get users into its ecosystem, then adding data from search and Chrome to build a dominant position in the programmatic ecosystem.

De-FAANGing the internet

There are glimmers of hope, though: The European Union is aggressively confronting the FAANG companies’ data practices; even the U.S. Justice Department is showing some renewed vigor toward antitrust enforcement.

However, we can’t complacently hope that regulators will solve every problem, though those actions may open a small window of opportunity. It’s time to figure out whether your media organization is building the data and skills it needs to make the most of it.


About the author

Tom Davidson is the Bellisario professor of practice in media innovation at Penn State University. He was a longtime reporter turned media executive and product developer at Tribune Co., PBS and Gannett.

How Data Collaboration Platforms are transforming publishing
https://digitalcontentnext.org/blog/2024/12/09/how-data-collaboration-platforms-are-transforming-publishing/
Mon, 09 Dec 2024

The post How Data Collaboration Platforms are transforming publishing appeared first on Digital Content Next.

As digital publishers seek to expand their revenue streams, strengthen their data capabilities, and keep pace with evolving adtech, data collaboration platforms (DCPs) have emerged as powerful allies. According to recent market research, DCPs benefit not only marketers and agencies but also publishers. They equip publishers to enhance their first-party data, gain valuable audience insights, and offer advertisers high-quality data—all while maintaining a competitive edge in the open web vs. walled-garden debate. Here, we explore how DCPs can transform digital publishing in an increasingly data-driven world.

Data Collaboration Platforms as a key solution for publishers

For publishers, data collaboration platforms are becoming invaluable. DCPs unify data silos, expand reach, and provide deeper consumer insights, making them essential tools in today’s data landscape. Publishers with strong first-party data can offer advertisers unique, high-value data and insights through collaboration, further enhancing the advertiser’s ability to reach relevant audiences.

Recent research Lotame conducted, fielded by Cint, shows that 44% of marketers and 40% of agencies currently use collaboration technology, not including clean rooms. Publishers have an opportunity here to build strategic partnerships with advertisers because they can help them fill gaps in audience data, improve targeting precision, and create unique insights unavailable elsewhere. By leveraging DCPs, publishers can maximize the potential of their data assets: facilitating more accurate cross-screen measurement, bolstering audience targeting, and deepening advertiser understanding of reader interests.

Shift in programmatic spending between open web and walled gardens

One of the report’s most surprising findings is the near-equal division of ad spend between walled gardens and the open web. While it may seem that walled gardens dominate the advertising market, 50% of marketers plan to spend equal amounts on open web and walled gardens, which amounts to 25-49% of their budgets. This balanced spend is promising for independent publishers, as it reflects a renewed interest in high-quality inventory and a move away from a reliance on walled-garden platforms.

As advertisers diversify their spending, publishers on the open web have a chance to attract higher ad spend by offering valuable, targeted inventory that rivals walled gardens. DCPs play a critical role here, allowing publishers to better target and measure their audiences, making the open web a more viable option for advertisers looking to connect with audiences outside of walled gardens.

Adoption of new technologies among publishers

Investment in data technology is at an all-time high, with data collaboration platforms leading as a priority. While traditional query clean rooms are being phased out by some—25% of marketers and agencies plan to retire these due to high costs and limited scale—data collaboration platforms are gaining popularity. They offer a flexible, scalable approach to data integration and collaboration, without the technical challenges or expenses associated with clean rooms.

For publishers, this shift toward DCPs offers a streamlined and cost-effective way to handle collaboration. With DCPs, publishers can more easily collaborate with brands and other entities without the high barriers or technical limitations of clean rooms, enhancing data fluidity and maximizing revenue potential. By adopting DCPs, publishers can stay technologically agile, ready to meet evolving advertiser needs without compromising on data quality or scalability.

Budget priorities and transparency needs in Real-Time Bidding (RTB)

Transparency and ease of doing business are critical to open web spending. According to the research, the top priority for marketers when considering increased spend on the open web is the ease of curating deals with publishers on high-quality inventory matched with data—a priority shared by agencies. Publishers can capitalize on this need by simplifying deal structures and ensuring transparent, high-quality inventory offerings.

By addressing these transparency needs, publishers can make it easier for advertisers to allocate budgets to the open web. DCPs help publishers align their inventory with advertiser data needs in real-time, facilitating seamless, data-driven deals. This transparency can enhance trust between publishers and advertisers, making the open web a more attractive alternative to walled gardens for RTB ad dollars.

Benefits of DCPs beyond targeting: attribution and personalization

Beyond improved targeting, DCPs offer benefits in personalization and attribution. This aligns well with publishers’ goals to provide a valuable user experience.

For publishers, these capabilities translate into higher engagement and monetization potential. By tapping into the personalization and audience segmentation features of DCPs, publishers can not only serve relevant content but also ensure that their audience experience is enhanced, leading to longer engagement times and better monetization. These capabilities resonate with the needs of modern advertisers, allowing publishers to meet demands for high-quality advertising in a competitive digital ecosystem.

There is a promising landscape for publishers seeking to stay competitive and capitalize on new data technologies. By investing in DCPs, publishers can address limitations in first-party data, build stronger relationships with advertisers, and offer transparent, high-quality inventory on the open web. As the balance of ad spend shifts toward a more open ecosystem, DCPs offer publishers a path to optimizing programmatic revenue and delivering a personalized, high-performing audience experience. With these tools, publishers can thrive as critical players in the evolving ad landscape.

How publishers can use performance marketing to boost growth
https://digitalcontentnext.org/blog/2024/12/02/how-publishers-can-use-performance-marketing-to-boost-growth/
Mon, 02 Dec 2024

The post How publishers can use performance marketing to boost growth appeared first on Digital Content Next.

The digital publishing industry is navigating a complex transformation. Organic traffic, once a cornerstone of success, is steadily declining due to changes in search engine algorithms, the rise of AI-driven content, and shifts in consumer behavior. To address this challenge, a growing trend we’ve seen among publishers, especially those with premium content and audiences, is to adopt performance marketing—a highly data-driven approach—both to acquire new readers and to enhance the performance of sponsored content.

This shift represents a departure from the traditional reliance on organic reach. Performance marketing requires a results-oriented strategy, leveraging insights and analytics to optimize campaigns and allocate budgets effectively. Its rise signifies an evolution in how publisher leaders may want to think about growth and monetization in today’s competitive landscape.

Reasons to consider embracing performance marketing

  1. Addressing declines in organic traffic: Search engine algorithm updates and the growing dominance of AI-generated content are reducing the visibility of publishers’ articles. Performance marketing allows publishers to regain control over how their content reaches audiences by targeting specific demographics and optimizing spend for measurable results.
  2. Effectively promoting sponsored content: Brands now expect measurable outcomes from their partnerships with publishers. Performance marketing enables publishers to deliver targeted sponsored content to the right audiences, increasing engagement and ROI for brand partners.
  3. Enhancing reader acquisition strategies: Through precise targeting and retargeting, performance marketing helps publishers attract subscribers who are more likely to engage deeply with their content, driving sustained revenue growth.

Key data-driven methods to adopt

Adopting performance marketing requires publishers to implement robust, data-driven techniques to maximize results. Some key methods include:

  1. Incrementality testing: Measure the actual impact of campaigns by strategically turning them on and off and observing the effect on conversions, providing a clearer picture of whether advertising efforts are driving measurable outcomes.
  2. Advanced attribution models: Traditional attribution models, such as last-click or platform-based metrics, often overestimate the contribution of specific channels. Exploring alternative methods like multi-touch attribution or data-driven attribution to capture the full customer journey can help ensure a more accurate representation of campaign performance.
  3. Leveraging first-party data: First-party data has become a critical asset for publishers. By encouraging readers to share their email addresses or engage with content through subscriptions, publishers build a more direct relationship with their audience. This data is instrumental in creating more personalized marketing campaigns and refining targeting strategies.
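The incrementality testing described in step 1 reduces, in its simplest form, to comparing conversion rates between an exposed group and a matched holdout. This sketch (illustrative numbers, not a full experimental design with significance testing) shows the core arithmetic:

```python
def incremental_lift(test_conversions, test_size, holdout_conversions, holdout_size):
    """Estimate incremental conversion lift from an on/off experiment.

    Compares the conversion rate of users exposed to the campaign (test)
    against a matched holdout group that saw no ads.
    """
    test_rate = test_conversions / test_size
    holdout_rate = holdout_conversions / holdout_size
    incremental_rate = test_rate - holdout_rate  # conversions the campaign added
    lift_pct = 100.0 * incremental_rate / holdout_rate if holdout_rate else float("inf")
    return incremental_rate, lift_pct

# Illustrative numbers: 300 conversions from 10,000 exposed users vs.
# 200 conversions from a 10,000-user holdout.
inc_rate, lift = incremental_lift(300, 10_000, 200, 10_000)
# Roughly one extra conversion per 100 exposed users: about a 50% lift.
```

A real deployment would add confidence intervals and run long enough to capture delayed conversions; the point is that "turning campaigns on and off" yields a directly interpretable number.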

Adopting strategies for success

For publishers looking to integrate performance marketing into their strategies, the following steps can serve as a roadmap:

  1. Build a data infrastructure: Invest in tools and platforms that enable robust data collection and analytics. These may include customer data platforms (CDPs) and marketing automation systems to streamline targeting and reporting. Breaking down data silos and creating a unified, single source of truth shared across teams is essential for effective collaboration and visibility.
  2. Experiment and optimize: Start with small-scale campaigns and use insights from incrementality testing and attribution analysis to optimize future efforts. Treat campaigns as iterative processes, constantly refining strategies based on performance metrics.
  3. Focus on first-party data: Launch initiatives to collect first-party data, such as newsletters, exclusive content offerings, or gated access. Use this data to segment audiences and deliver highly targeted campaigns.
  4. Balance acquisition and retention: As subscription bases grow, retention strategies become as important as acquisition efforts. Provide an exceptional user experience and foster ongoing engagement through personalized content recommendations and value-driven communication.

Looking ahead: building sustainable digital businesses

While there are no silver bullets, digital publishers that embrace performance marketing and data-driven methodologies can position themselves for long-term success. By breaking down data silos, diversifying marketing channels, focusing on first-party data strategies, and continually refining user experiences, publishers can create sustainable business models while meeting the evolving needs of their audiences.

In an industry that demands constant adaptation, performance marketing offers an actionable path forward—one rooted in measurable impact, audience insights, and a deeper connection with readers.


About the Author

Ju-kay Kwek is a leader in creating enterprise-scale data analytics products. Before co-founding Switchboard Software, Ju-kay launched Google BigQuery and was a founding product executive for Google Cloud Platform. Ju-kay uses his expertise in media and audience data to help companies like Spotify, Target, DISH, and Dotdash Meredith to accelerate their revenue.  

The post How publishers can use performance marketing to boost growth appeared first on Digital Content Next.

How publishers can avoid the high cost of inaccurate data https://digitalcontentnext.org/blog/2024/10/22/how-publishers-can-avoid-the-high-cost-of-inaccurate-data/ Tue, 22 Oct 2024 11:02:00 +0000 https://digitalcontentnext.org/?p=43919 Every year, poor data quality costs organizations an average $12.9 million, according to Gartner. These companies are actively looking for ways to eliminate that waste and the market has responded....

The post How publishers can avoid the high cost of inaccurate data appeared first on Digital Content Next.

Every year, poor data quality costs organizations an average of $12.9 million, according to Gartner. These companies are actively looking for ways to eliminate that waste, and the market has responded. Gartner also reports that by 2025, 90% of data quality technology buying decisions will prioritize ease of use, automation, operational efficiency, and interoperability as the critical factors for mitigating the data quality problem.

Data accuracy is the lifeblood for digital publishers

Losing revenue due to inaccurate data is particularly painful for digital publishers and media companies, whose business models largely depend on their ability to leverage high-quality data to deliver outcomes for advertisers. Fresh, accurate data is the foundation for effective segmentation, targeting, and business intelligence. It’s essential for optimizing content, improving user experiences, and ultimately maximizing revenue.

Inaccurate, poor-quality, or stale data can lead to missed opportunities and a loss of credibility among ad partners. These costly mistakes show up in multiple ways, including:

  • Misaligned content strategies: Targeting the wrong audience can result in wasted resources and low engagement.
  • Ineffective advertising campaigns: Poor data can lead to targeting errors, resulting in lower click-through rates and conversions.
  • Poor user experience: Inaccurate data can lead to personalized recommendations that are irrelevant or, worse, drive users away from your content.

Data accuracy is directly tied to revenue

Inaccurate data can have a direct and significant impact on a media company’s bottom line. Decreased ad revenue from targeting errors can result in lower ad impressions and clicks. A poor user experience due to inaccurate data can lead to subscriber churn. Without an ability to harness fresh data, publishers may not be able to quickly capitalize on the news cycle by jumping on trends and creating new opportunities to serve advertisers. Publishers will also have challenges measuring the ROI of their campaigns and reporting results to advertisers. 

Know the signs of data accuracy issues

Even publishers who think their data is accurate know that data quality can deteriorate over time due to various factors, including human error, system failures, and changes in data sources. It’s well worth the time and effort to conduct periodic checks to make sure your data pipeline is running smoothly and your data is as accurate as possible.

Here are five signs to watch for:

1. Inconsistent or conflicting data

One of the most common signs of data accuracy problems is inconsistencies or conflicts between different data sources. For example, you may find discrepancies between data from your first-party systems, analytics tools, and ad platforms. These inconsistencies can make it difficult to get a clear and accurate picture of your audience, campaigns, and performance.
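A cross-source reconciliation check can surface such discrepancies automatically. This is a minimal sketch; the 5% tolerance, the source names, and the choice of first-party data as the baseline are all illustrative assumptions:

```python
TOLERANCE = 0.05  # flag metrics that disagree by more than 5%

def reconcile(metric, by_source):
    """Compare one metric reported by several systems and flag outliers."""
    baseline = by_source["first_party"]  # treat first-party data as ground truth
    flags = []
    for source, value in by_source.items():
        drift = abs(value - baseline) / baseline
        if drift > TOLERANCE:
            flags.append((metric, source, round(drift, 3)))
    return flags

# Hypothetical impression counts from three systems for the same campaign.
counts = {"first_party": 100_000, "analytics": 98_500, "ad_platform": 88_000}
print(reconcile("impressions", counts))
```

Running a check like this on a schedule turns "we noticed the numbers don’t match" into an alert that fires before an advertiser sees the discrepancy.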

2. Missing or incomplete data

Another red flag is missing or incomplete data. This can occur due to data collection errors, system failures, or changes in data ingestion methods. 

3. Outdated data

Data can become outdated over time, particularly in rapidly changing industries like media. Using outdated data can lead to inaccurate insights, ineffective targeting, and wasted resources.

4. Data quality issues

These issues can arise due to errors such as gaps, inconsistencies, problems with validity, latency, or a lack of data normalization across systems.  

5. Lack of data governance & reliability

Without proper policies and procedures in place to manage data, it can become fragmented, inconsistent, and unreliable. In addition, media companies may be working with dozens of first-party and third-party systems that organize data differently. A single source of truth is essential to truly optimize campaigns. 
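Several of the warning signs above (missing data, outdated data, validity problems) can be caught with lightweight automated checks in the pipeline. A minimal sketch, where the field names and the 48-hour freshness threshold are illustrative assumptions rather than any standard:

```python
from datetime import datetime, timedelta, timezone

REQUIRED_FIELDS = {"user_id", "pageviews", "updated_at"}
MAX_AGE = timedelta(hours=48)  # records older than this count as "stale"

def audit_record(record, now=None):
    """Return a list of data-quality issues found in one record."""
    now = now or datetime.now(timezone.utc)
    issues = []
    # Sign 2: missing or incomplete data.
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        issues.append(f"missing fields: {sorted(missing)}")
    # Sign 3: outdated data.
    updated = record.get("updated_at")
    if updated and now - updated > MAX_AGE:
        issues.append("stale: last updated more than 48h ago")
    # Sign 4: validity problems, e.g. impossible metric values.
    if record.get("pageviews", 0) < 0:
        issues.append("invalid: negative pageviews")
    return issues

now = datetime(2024, 10, 22, tzinfo=timezone.utc)
bad = {"user_id": "u1", "pageviews": -3,
       "updated_at": datetime(2024, 10, 1, tzinfo=timezone.utc)}
print(audit_record(bad, now=now))
```

Checks like these don’t replace data governance, but they make the periodic audits described above cheap enough to run on every load.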

Limited engineering resources exacerbate data quality problems

Despite the critical importance of data accuracy, some media companies simply don’t have the engineering resources to ensure their data is consistently accurate. With millions of pieces of data constantly arriving from dozens of sources, managing that data at scale all day, every day becomes a monumental, ongoing challenge. 

Traditional data engineering approaches often fall short in meeting the specific needs of publishers. These challenges typically fall into some combination of these categories: 

  • Data Silos: Data is often pulled directly from content platforms and dumped into data warehouses without proper structuring or enrichment, making it difficult to use immediately.
  • Manual Reporting: Some publishers still rely on manual methods like exporting data to Google Sheets for reporting, which is time-consuming and error-prone.
  • First-Party Data Integration: Integrating all first-party data securely and efficiently into the data pipeline is painful. 
  • Prioritization: Revenue teams often struggle to get their data engineering needs prioritized. In many cases, data engineers simply don’t want to perform the mundane tasks associated with maintaining and updating APIs. Revenue teams are then forced to create DIY workarounds that are time-consuming, frustrating, and less effective than hoped.

Selecting a data operations platform

The current demand for skilled data engineers far exceeds supply, making it difficult to allocate sufficient time and expertise to data quality initiatives. Media companies that adopt no-code or low-code data engineering tools to automate data pipelines and workflows can drastically reduce the need for extensive engineering expertise.

Here are some qualities to look for and questions to ask when selecting a data operations platform to overcome your data quality issues:

  • Automation and Speed: Look for platforms that offer automated data pipelines and ETL processes that can streamline data management and reduce manual effort.
  • Scalability: Make sure the platform can handle large volumes of data and scale as your needs grow.
  • High-Quality Data: Find out how the platform ensures reliable and trustworthy data. Does it consolidate data from various sources (e.g., first-party data, agency data, and data from different channels)?
  • Domain Expertise: Is the platform backed by a team of experts in digital publishing who can ensure that data is handled and transformed correctly for optimal analysis?
  • Customization: Can the platform be customized to meet your specific requirements and business logic?
  • Comprehensive Measurement: Are your essential metrics built in for effective campaign evaluation?
  • Privacy and Security: What kinds of security measures have been taken to protect your sensitive data and ensure adherence to data privacy rules and regulations? 

Publishers and media companies simply cannot ignore the impact that inaccurate data has on their bottom line. By investing in data operations platforms, quality control processes, and governance, you can unlock the full potential of your data and drive sustainable growth for your business.


About the author

Manny Balbin, a seasoned veteran with over 15 years in digital media and advertising, currently shapes vision and strategy for BI products at Switchboard Software. Switchboard’s data engineering automation platform aggregates disparate data at scale in real-time for better business decisions. Prior to Switchboard, Manny led Product, Ad Technology, and Revenue Operations at Freestar, PMC, and Quantcast.

Why adtech loves fragmentation–and why publishers shouldn’t  https://digitalcontentnext.org/blog/2024/09/23/why-adtech-loves-fragmentation-and-why-publishers-shouldnt/ Mon, 23 Sep 2024 11:22:00 +0000 https://digitalcontentnext.org/?p=43752 The advertising landscape is fracturing due to third-party signal loss, causing targeting scarcity for publishers and advertisers. Half of users have “disappeared” from digital advertising by using browsers and devices...

The post Why adtech loves fragmentation–and why publishers shouldn’t  appeared first on Digital Content Next.

The advertising landscape is fracturing due to third-party signal loss, causing targeting scarcity for publishers and advertisers. Half of users have “disappeared” from digital advertising by using browsers and devices that limit tracking and preserve privacy. 

While Chrome supports cookies, 40% of users modify privacy settings or browse incognito to block ad tracking. We’ve seen this reflected in our publishers’ audiences, where 70% of the internet is now invisible to adtech, leaving only 30% addressability. Chrome’s ‘reject all’ buttons in Europe and the proposed American Privacy Rights Act of 2024, or APRA, will further reduce data visibility and increase the signal loss we see today.

As businesses seek ways to address signal loss, solutions flood the market that are short-term workarounds that lack interoperability. The result is a fragmented, chaotic adtech ecosystem.  

Publishers have not experienced signal loss because they have first-party signals. This means publishers can append signals to 100% of impressions, whether an ID for users who authenticate or a cohort based on known attributes and behaviors collected on the user. The challenge for publishers is to build signals that the buy side will want to buy. This requires navigating fragmentation and having the tools to connect data and collaborate with the ecosystem on their signals.
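The "ID or cohort" fallback described here can be pictured as a small routing function. This is a hedged sketch: the cohort rules, attribute names, and signal format are invented for illustration, not any vendor’s API:

```python
def signal_for_impression(user):
    """Attach a signal to an impression: an authenticated ID when one
    exists, otherwise a cohort derived from known attributes, so that
    every impression carries *some* addressable signal."""
    if user.get("auth_id"):
        return {"type": "id", "value": user["auth_id"]}
    # Unauthenticated: bucket the user by observed reading behavior.
    interests = set(user.get("articles_read_topics", []))
    if "finance" in interests:
        cohort = "finance-readers"
    elif "sports" in interests:
        cohort = "sports-readers"
    else:
        cohort = "general"
    return {"type": "cohort", "value": cohort}

print(signal_for_impression({"auth_id": "abc123"}))
print(signal_for_impression({"articles_read_topics": ["sports", "tv"]}))
```

The design point is the guaranteed fallback: no branch returns nothing, which is what lets a publisher claim a signal on 100% of impressions.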

Connecting disparate data

According to a recent AdMonsters Publisher Pulse survey, 50% of publishers see monetizing audience data as a significant opportunity for growth, and 33% plan to leverage audience data to create new revenue streams.

Consolidating their data assets is critical for publishers, as it offers an opportunity to leverage many more dimensions to build the audiences that advertisers want to buy. Publishers can augment the first-party signals they collect across their digital properties with additional attributes such as demographic, preference, and intent data. However, today’s fragmented data ecosystem makes it challenging to harness the full potential of this information. Data often sits in silos across different environments and teams, and consolidation requires data engineers, complex workflows, and significant time investment.

Greater connectivity creates seamless, integrated systems that efficiently combine all data assets. This empowers publishers to activate their data without the burden of engineering-heavy processes.

Greater connectivity not only simplifies data management but also enhances its effectiveness. By consolidating various data sources, publishers can build a richer, more comprehensive profile of their audiences, allowing for better targeting and more personalized advertising experiences. This, in turn, creates new revenue opportunities and strengthens relationships with advertisers.

Collaboration reimagined

The challenges don’t end with data consolidation. Retargeting, once a cornerstone of performance marketing, is losing its effectiveness as signal loss impacts the scalability of traditional tactics. Advertisers are now struggling to achieve scale and performance, with all buyers competing for the same shrinking 30% of the internet’s inventory. This lack of scale and poor performance leaves many legacy retargeting strategies falling flat.

Data collaboration holds the potential to solve these challenges, but the current state of the Data Clean Room (DCR) space presents two significant hurdles:

1. Fragmentation and operational complexity

The DCR space is highly fragmented, making data collaboration cumbersome and resource-intensive. Moving data between systems requires complex operations, lengthy data processing agreements (DPAs), and heavy legal and operational overhead. This complexity not only delays campaign execution but also reduces return on ad spend (ROAS). Advertisers and publishers are left with inefficient processes that yield minimal performance gains.

2. The wrong signal to activate against

For companies that manage to navigate these hurdles, the results often don’t justify the effort if the signal being activated is a matched record, since match rates between data sets are typically low. However, a seed of matched records is a great asset to help a publisher identify which of their signals might have the highest propensity to hit advertiser KPIs or the highest affinity toward the brand.
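The arithmetic behind this point is simple set overlap. A minimal sketch (the ID sets and segment names are made up) showing both why the direct match rate is too small to activate against and how the matched seed can instead rank a publisher’s own segments:

```python
def match_rate(advertiser_ids, publisher_ids):
    """Return the matched seed and the share of advertiser records it covers."""
    matched = advertiser_ids & publisher_ids
    return matched, len(matched) / len(advertiser_ids)

# Hypothetical hashed-ID sets; only three IDs overlap.
advertiser = {f"a{i}" for i in range(1000)} | {"m1", "m2", "m3"}
publisher = {f"p{i}" for i in range(5000)} | {"m1", "m2", "m3"}

seed, rate = match_rate(advertiser, publisher)
print(f"direct match rate: {rate:.1%}")  # too small to activate on its own

# Instead, score the publisher's segments by how concentrated the seed is
# within each one; high-affinity segments extend reach beyond the match.
segments = {"finance-readers": {"m1", "m2", "p7"}, "sports-readers": {"p1", "p2"}}
affinity = {name: len(ids & seed) / len(ids) for name, ids in segments.items()}
print(max(affinity, key=affinity.get))
```

In other words, the seed is used for modeling, not targeting: the segment it lights up can be activated at full publisher scale.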

The industry must move towards streamlined connectivity and more efficient data collaboration to overcome these challenges. By creating unified environments where data can flow seamlessly between publishers, advertisers, and platforms, we can reduce the operational burden. This would allow for more effective use of first-party data, greater scale in retargeting efforts, and better performance across the board.

Embrace collaboration & connectivity 

This reimagined approach to data collaboration and connectivity allows publishers to unlock the full potential of their data, creating better consumer experiences, more opportunities for growth, and a stronger, more transparent ecosystem where data works for everyone.

Media companies need a privacy and data management roadmap https://digitalcontentnext.org/blog/2024/08/05/media-companies-need-a-privacy-and-data-management-roadmap/ Mon, 05 Aug 2024 11:27:00 +0000 https://digitalcontentnext.org/?p=43295 If you find it challenging to keep up with changes in privacy standards and regulation, you’re not alone. Twenty states have passed comprehensive consumer privacy laws, and more are expected...

The post Media companies need a privacy and data management roadmap appeared first on Digital Content Next.

If you find it challenging to keep up with changes in privacy standards and regulation, you’re not alone. Twenty states have passed comprehensive consumer privacy laws, and more are expected to follow. Since a national privacy law does not yet exist, media companies must navigate a patchwork of state and international regulations.

Google’s recent announcement that it will no longer deprecate third-party cookies likely won’t diminish the need for publishers, advertisers, and ad tech companies to continue to focus on privacy and protecting consumer data. Google says it will give users the option to consent to tracking, so some leaders believe this could still lead to deprecation in practice.

With the privacy landscape seemingly always in flux, where do media companies begin? Here’s an overview of what publishers need to know to plan for a compliant future.

Background: Where to start understanding privacy issues

As the digital advertising landscape has evolved, consumers have become increasingly concerned with how their data is collected, used and stored. These concerns led to calls for legislation and industry standards to protect their data and guide businesses on best practices.

The California Consumer Privacy Act (CCPA) was the first state law passed to address this. Key provisions included consumers’ right to know how their data is being used, and the choice to delete or opt out of data collection.

CCPA became a blueprint for other states to follow. But without national legislation, the result has been an assortment of laws that differ across jurisdictions. As advertisers and media companies attempt to navigate these regulations, the industry has recognized the need for solutions.

In March, the IAB released its State of Data 2024 report, which shares feedback from brands, agencies and publishers about current data practices and where the industry is headed. In the study, 95% of respondents expected continued legislation and signal loss this year and beyond. Because of these challenges, companies must significantly change their practices.

Most publishers are taking a state-by-state approach. Brand marketers are opting for a one-size-fits-all strategy aimed at the highest common denominator. Angelina Eng, IAB’s vice president of measurement, addressability, and data center, explained that as organizations navigate a world with greater signal loss, they need to think holistically about the data they’re collecting.

“Companies need to ask themselves who they’re sharing data with, how to activate and whether they have consumer consent. Our research found that nearly 70% of consumers are willing to share their personal data to support advertising overall, and nearly three in four consumers understand that sharing their data enables websites/apps to know more about them in order to serve personalized ads,” Eng said. “We need to provide consumers with guidance and education around the value of allowing advertisers to leverage some data points and provide ads relevant to consumers, which would in turn allow us to measure performance.”

Media industry compliance solutions are in the works

Several industry initiatives are being developed to streamline compliance and make it easier for businesses to implement industry standards.

The IAB Multi-State Privacy Agreement (MSPA) is a framework designed to help companies from all corners of the industry comply with various state-level privacy regulations. It ensures that companies can efficiently manage compliance across jurisdictions.

The Global Privacy Platform works with the MSPA to transmit consumer preferences across jurisdictions, ensuring compliance with privacy laws such as GDPR and state regulations.

Another element of the IAB’s privacy solutions portfolio is its Diligence Platform. This data privacy platform includes standardized privacy diligence questions for different segments of the advertising industry to help streamline the evaluation process and improve compliance efficiency.

Media companies need a roadmap to navigate privacy

As the industry experiences changes including signal loss and new privacy regulations, organizations must build strategies that allow them to continue to leverage first-party data while remaining privacy compliant.

There are new tools to help publishers better understand and navigate these challenges. ThinkMedium, a consulting firm founded by ad tech veteran Dennis Buchheim, recently released its Publisher Readiness Playbook, which outlines the context, questions, and steps publishers need to understand their preparedness for ongoing data- and identity-related shifts.

“Part of the challenge with privacy regulations is that they go beyond laws. It’s the platform policies that in many ways are having a huge impact on the industry,” Buchheim said.

Buchheim added that the number of policies publishers must be aware of can be overwhelming.

“The breadth of regulations and policies is tremendous. You really must understand what applies to you and what doesn’t. You can comply with the strictest interpretation, have a more bespoke plan, or take a blanket approach. Making very conscious decisions like these requires having a good understanding of what’s happening in the industry. We believe the Playbook can help provide some of that understanding.”

Media companies also can gain a better understanding of their level of compliance by participating in an industry certification program. These programs measure companies against current industry standards and can reveal compliance gaps that lead to process improvements.

While keeping up with privacy changes can seem daunting, solutions and guidance exist to help media companies navigate these complexities. Devising a plan and seeking help from industry resources can help media companies remain in good standing with advertisers, consumers and the law.

Oracle’s ad tech exit is a wake-up call for data agility https://digitalcontentnext.org/blog/2024/07/31/oracles-ad-tech-exit-is-a-wake-up-call-for-data-agility/ Wed, 31 Jul 2024 12:27:00 +0000 https://digitalcontentnext.org/?p=43265 Oracle’s announcement that they are shutting down their ad tech division came as a surprise to their clients as well the industry at large. For media companies heavily reliant on...

The post Oracle’s ad tech exit is a wake-up call for data agility appeared first on Digital Content Next.

Oracle’s announcement that it is shutting down its ad tech division came as a surprise to its clients as well as to the industry at large. For media companies heavily reliant on Oracle’s data solutions, it served as a stark reminder: the ability to adapt to unforeseen changes is paramount in today’s data-driven landscape.

This isn’t just a concern for large enterprise companies. Media companies of all sizes need to prioritize data agility. But what exactly is data agility, and why is it so crucial for success in the media world?

Data agility: adaptability in the face of change

Data agility goes beyond simply having a lot of data. It’s about the ability to access, analyze, and leverage your data quickly and efficiently in a constantly changing environment. It’s the freedom to pivot strategies, swap data sources, and adjust workflows with minimal disruption when faced with unexpected challenges.

Since media businesses rely on their data as the backbone of their revenue generation strategy, it’s even more important for them to think about the agility of their data. Agile data can bring many benefits, including:

  • Enhanced Transparency: With readily available, accurate data, media companies gain a clearer picture of audiences, campaign performance, and overall business health.
  • Informed Decision Making: Agile data empowers media companies to react swiftly to market trends and audience preferences. Accessing and gaining real-time insights from trusted data across the organization can inform campaign optimization, content strategy, and resource allocation.
  • Competitive Advantage: Data agility enables media companies to adapt to changing trends and audience behaviors, respond effectively to competitor moves, and maintain a competitive edge.
  • Streamlined Collaboration: When data is readily accessible and easily shared across departments, collaboration flourishes. Agile data fosters better communication between marketing, sales, and finance teams, leading to a more unified approach.
  • Resource Allocation: Even the most efficient data operations and engineering teams often struggle to field dozens of daily requests from across the company. Executives who prioritize data agility as part of their budgeting and staffing process can help relieve the burden placed on their teams.
  • Future-Proofing Your Business: Data agility allows media companies to be prepared for the unexpected – whether it’s a new technology disruption, a regulatory shift, or a change in audience preferences. Companies that can adapt their data strategy and infrastructure to navigate these changes can avoid derailing their plans.

Building a foundation of data agility

So, how can media companies cultivate data agility? The first step is to invest in a flexible data platform that scales with your needs and offers seamless integration from all the sources that are important to run the business. Choose a data platform that allows non-technical teams to ingest and customize their data with proprietary business rules so they can perform independent analysis. Also consider cloud-based solutions for scalability and flexibility.
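The "proprietary business rules" idea can be pictured as a small chain of transforms applied at ingestion, each simple enough for a non-engineering team to own. A sketch under assumed field names and an invented example rule, not any specific platform’s API:

```python
# Each rule is a plain function a revenue team could own and edit.
def normalize_channel(row):
    row["channel"] = row.get("channel", "unknown").strip().lower()
    return row

def classify_premium(row):
    # Hypothetical business rule: direct-sold video counts as premium.
    row["premium"] = row["channel"] == "direct" and row.get("format") == "video"
    return row

RULES = [normalize_channel, classify_premium]

def ingest(rows):
    """Apply every business rule to each incoming row, in order."""
    for row in rows:
        for rule in RULES:
            row = rule(row)
        yield row

out = list(ingest([{"channel": " Direct ", "format": "video"}]))
print(out[0]["premium"])
```

Keeping rules as an ordered list is the agility lever: swapping a data source or changing a definition means editing one function, not rebuilding the pipeline.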

Next, make data governance a top priority. Establish clear policies and procedures around data collection, access, and security. This helps maintain data integrity and compliance with internal and external regulations. 

When strong systems are in place, work on developing a data-driven culture where data is readily available across the organization. Provide training and resources to empower even non-engineering members of the team with the skills to understand, analyze, and utilize data effectively.

When both your data platform and teams are up and running, don’t give in to the temptation to “set it and forget it.” Embrace continuous improvement by regularly evaluating the data strategy and infrastructure. Be prepared to adapt your approach as your needs and the media landscape evolve.

A proactive approach always wins

The Oracle situation demonstrates that even established giants can make decisions that disrupt their partners. That’s why taking a proactive approach to data management is essential. Here are a few things to look for and some questions to ask when selecting a data operations platform:

  • Data Source Agnostic: Can the platform easily integrate with data sources across different parts of the business to avoid creating silos? Can it swap in alternative sources, if needed, to minimize workflow disruption?
  • Streamlined Data Management: How does the platform reduce complexity for your team? What is the process for data ingestion, transformation, and analysis? 
  • Domain Expertise: Are they the right strategic partner? Do they bring relevant domain expertise and best practices into your organization? Can they provide the proper levels of support? 
  • Modular Solutions: Does the platform have modules that are tailored to your specific use case such as campaign analytics, inventory forecasting, and revenue insights? 

Change is inevitable, but preparation is a choice

Thousands of years ago, an ancient Greek philosopher observed that change is the only constant. But change doesn’t have to derail your business. No matter which platform you choose, be sure to plan for the unexpected. By prioritizing data agility, media companies can build a resilient foundation that is adaptable, responsive, and empowers data-driven decision-making in the face of any future change.


About the Author

Ju-kay Kwek is a leader in creating enterprise-scale data analytics products. Before co-founding Switchboard Software, Ju-kay launched Google BigQuery and was a founding product executive for Google Cloud Platform. Ju-kay uses his expertise in media and audience data to help companies like Spotify, Target, DISH, and Dotdash Meredith.

Four media companies on tackling signal loss with data collaboration https://digitalcontentnext.org/blog/2024/06/24/four-media-companies-on-tackling-signal-loss-with-data-collaboration/ Mon, 24 Jun 2024 11:28:00 +0000 https://digitalcontentnext.org/?p=42992 Signal loss makes it increasingly difficult for advertisers to run campaigns across the open web. Traditional methods for prospecting and direct response are particularly impacted, with only 30% of the...

The post Four media companies on tackling signal loss with data collaboration appeared first on Digital Content Next.

Signal loss makes it increasingly difficult for advertisers to run campaigns across the open web. Traditional methods for prospecting and direct response are particularly impacted, with only 30% of the open web currently being addressable.

This change places advertisers in a challenging position, making it difficult to reach their target audience and maintain brand equity, especially on the open web. However, publishers and broadcasters are uniquely positioned to assist advertisers in navigating these issues, as they have not experienced signal loss.

Media companies have highly engaged audiences and access to a growing variety and volume of behavioral and contextual data points, which are essential for effective audience modeling. Essentially, publishers are the key to achieving 100% addressability and the future of targeting on the open web. However, to maximize the value of these insights, they need tools that foster collaboration and provide advertisers with clarity amidst the chaos.

To gain insights into how publishers are strategically addressing these issues, Permutive gathered four customers and publishing leaders who are reimagining data collaboration. We asked them where they see opportunities and how they are solving the challenges that arise.

Our panel included Stephanie Mazzamaro, VP, Addressability & Premium Programmatic at The Arena Group, Michael Nuzzo, SVP Data Solutions at Hearst Magazines, Josh Peters, Global Head of Commercial Data Strategy and Programmatic Operations at The Washington Post, and Bethany Hillman, Vice President, Data and Advertising Operations at TelevisaUnivision. 

Here are the four key insights from that discussion: 

1. Solving for signal loss: What advertisers want

The panel resoundingly indicated that advertisers are looking to publishers to solve signal loss across the open web and the addressability data gap caused by privacy regulations and third-party cookie deprecation. Josh Peters at The Washington Post emphasized the complexity of these advertiser requests. He said they are seeing varied inquiries about data access and standards and noted the challenge of advocating for better solutions beyond standard industry offerings. He said: “They want the IAB standard. We have to make the case that we actually have something better.” 

Stephanie Mazzamaro at The Arena Group stressed the need for standardizing signals. She explained that one of the big initiatives and challenges for the publisher this year is making its signals standard yet still unique. “How do we still create that special sauce and still provide differentiators in the marketplace?” she mused. On the issue of creating standardized audiences – and echoing Mazzamaro’s challenge – Michael Nuzzo at Hearst Magazines said: “Having a single person be one thing, at any given time, is kind of an impossible task.”

Publishers know their audience, and, through the right tech, can connect the dots and provide insights into audiences that advertisers might not realize. Nuzzo believes it’s important to understand who users are at the right time in a given contextual space and expand beyond existing user bases. 

He said that “it’s something that advertisers and agencies understand really well: If someone’s reading about dog food, I should serve them a dog food ad. But we also know, through a taxonomy, that people who are interested in dog food are often outdoor runners because they run with their dogs. And so we open up new audiences, and we’re not just pitching these people into a single segment.”   

2. Metrics for success: Moving away from clicks  

The Arena Group has put a lot of resources into launching “as many IDs as possible” to find its North Star, and has started shifting from page views to addressability metrics, focusing on user engagement and “stickiness.” Highlighting the role of Permutive’s new identity hub in streamlining these efforts, Mazzamaro said her team has been busy finding ways to use contextual data with addressable audiences. They are also focused on making those audiences stickier and have created an internal scorecard to measure them.

Amid the furor over made-for-advertising (MFA) sites and external companies deciding publishers’ premium status, there is a renewed focus on quality and on equipping premium publishers to tell their own story through insights, which can be used at every stage of the sales cycle. The Washington Post, for example, is moving toward quality over quantity, emphasizing time spent and exposure. It is also using clean room interactions for better post-campaign analytics and insights. 

Peters told the audience the publisher is being more precise with its actions and feedback to advertisers. For example, if an advertiser spends money in one area, Washington Post can point out another area with better performance and time on site, suggesting they focus more there. He said: “That’s what the advertisers are looking for and that’s where the dollars are going to end up.” 

The challenge here is different for broadcasters, given their proximity to end users in the app environment and the need to serve both linear and digital buyers. TelevisaUnivision is packaging digital metrics with traditional video and CTV platforms, aiming to provide comprehensive audience insights and drive better market adoption. “I’m pulling those linear buyers through,” explained Bethany Hillman at TelevisaUnivision. “Not just saying you have to purchase video and big-screen, but giving them the full concept of all the digital metrics; we’re not just looking at households. Pulling that market along has been the most important piece for us.” 

3. Identity management: Connecting disparate data  

TelevisaUnivision and The Arena Group both see identity management as an important part of their strategies, particularly the need for consolidation and easier data access to drive forward resource allocation and storytelling. Referencing Permutive’s Collaboration and Connectivity products, Bethany Hillman at TelevisaUnivision said: “We have offline pieces. I’m calling APIs to get data in. I’m trying to find coverage in different places. For me, the consolidation is going to be huge.” 
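The consolidation Hillman describes — offline pieces, API-fetched IDs, coverage found in different places — amounts to merging per-source identity records under one first-party key. A minimal sketch, with invented field names (not Permutive’s or TelevisaUnivision’s actual schema):

```python
def consolidate(records):
    """Merge per-source identity records that share a first-party ID.
    Later records overwrite earlier ones on key collisions."""
    merged = {}
    for record in records:
        pid = record["first_party_id"]
        profile = merged.setdefault(
            pid, {"first_party_id": pid, "ids": {}, "traits": {}}
        )
        profile["ids"].update(record.get("ids", {}))        # e.g. email hash, CTV ID
        profile["traits"].update(record.get("traits", {}))  # e.g. interests
    return merged
```

The design choice worth noting is the single join key: once every source resolves to the same first-party ID, coverage questions (“which users have a CTV ID? which only an email hash?”) become simple lookups rather than cross-system reconciliation.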

Nuzzo at Hearst stressed the responsibility of managing identity data, including consent management and internal collaboration to maximize data utility. 

It’s important to be transparent about that data, too. Hearst integrates identity and data with media activations, determining value through CPM uplift. Washington Post has developed a dashboard to track audience transactions and contextual performance, providing transparent and actionable insights company-wide.
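Valuing identity data through CPM uplift, as Hearst does, reduces to comparing the price addressable inventory commands against a non-addressable baseline. A minimal sketch (the function and inputs are illustrative, not Hearst’s actual model):

```python
def cpm_uplift(addressable_cpm: float, baseline_cpm: float) -> float:
    """Relative CPM uplift from addressability: 0.25 means +25%."""
    if baseline_cpm <= 0:
        raise ValueError("baseline CPM must be positive")
    return (addressable_cpm - baseline_cpm) / baseline_cpm
```

For example, if addressable impressions clear at a $10 CPM against an $8 run-of-site baseline, the uplift is 0.25 — a 25% premium attributable to the identity and data layer.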

4. Clean room: Scaled activation versus single solution 

The Washington Post and The Arena Group both discussed the challenges of adopting and implementing clean room technologies, calling for clear next steps and collaboration to fully leverage these tools. Mazzamaro said “it’s a checkbox” for agencies when they ask whether publishers can access clean rooms: the reaction is always positive when a publisher says yes, but the agency then runs something else that does not require a clean room. “Working through adoption is really hard as an industry,” explained Mazzamaro. 

TelevisaUnivision highlighted a partnership with Home Depot to illustrate the potential of clean rooms, but Hillman said that “it’s an oxymoron that clean rooms equal data collaboration.” She said: “What I’ve seen so far is you load your data in, and you get a match rate… where am I collaborating on data? My big hope for clean rooms as we stand them up is that we see the activation, not just from a one-to-one match perspective, but really to see that growth scale and to collaborate for the first time. So what data can I bring to the table to help that person expand their consumers?”  
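The one-to-one matching Hillman critiques typically works by both parties hashing a shared key, such as a normalized email address, and measuring the overlap. A simplified sketch under that assumption — real clean rooms add privacy controls such as minimum aggregation thresholds, which are omitted here:

```python
import hashlib

def hashed(emails):
    """Normalize (trim, lowercase) then SHA-256 hash each email."""
    return {hashlib.sha256(e.strip().lower().encode()).hexdigest()
            for e in emails}

def match_rate(advertiser_emails, publisher_emails):
    """Fraction of the advertiser's list found in the publisher's audience."""
    a, p = hashed(advertiser_emails), hashed(publisher_emails)
    return len(a & p) / len(a) if a else 0.0
```

This is exactly the “load your data in, get a match rate” ceiling Hillman describes: the computation tells each side how much they overlap, but by itself does nothing to expand either party’s audience beyond that intersection.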

The critical role of data collaboration

In the face of signal loss, publishers and broadcasters play a vital role in addressing advertisers’ needs for audience reach on the open web. As our panelists have discussed, advertisers are coming to them with requests because they possess highly engaged audiences and access to a wealth of data points. 

By leveraging data collaboration strategies and connectivity tools, publishers can advance the industry by consolidating disparate data for effective identity management and fully realizing the potential of clean rooms for scaled activation and data collaboration. These strategies will enable the media industry to continue providing effective audience targeting and drive greater success in an evolving digital landscape.

The post Four media companies on tackling signal loss with data collaboration appeared first on Digital Content Next.
