supporter Archives – Digital Content Next Official Website – Thu, 23 Apr 2026

How publishers rebuild audience ties as search falls
https://digitalcontentnext.org/blog/2026/04/29/how-publishers-rebuild-audience-ties-as-search-falls/ – Wed, 29 Apr 2026

Data shows that publishers are already experiencing steep traffic losses: Business Insider is down 55% in organic search traffic since 2022, with Forbes and HuffPost close behind at roughly 50%. In the 12 months following Google’s AI Overviews launch, organic traffic to publisher websites fell from 2.3 billion to under 1.7 billion monthly visits — more than 600 million lost visits in under a year. When Google’s AI answer resolves the query on the results page, the publisher never sees the user, and Google is resolving more queries that way each quarter.

The implicit deal publishers had with search – make good content, earn rankings, convert traffic – no longer holds. The publishers in the best position today recognized early that this wasn’t a temporary dip and started planning for referrals to keep declining.

Search was always a rented audience

Search was always someone else’s distribution channel. Google’s incentives lined up with publishers for a 15-year stretch, and most of the industry built acquisition strategies on that alignment. The alignment is over.

It’s a familiar pattern. Social played out the same way. Facebook referral traffic peaked around 2016 and has fallen unevenly since. Any publisher whose acquisition engine depended on organic social reach has already been through a version of what’s happening with search now.

Owned channels are what’s left. The publishers who built them early are ahead and everyone else is catching up.

From traffic intelligence to relationship intelligence

According to Parse.ly data from the publisher network it works with (more than 400 sites generating 15B+ pageviews a month), the pattern is consistent. The publishers whose audience base has held up are the ones that started investing in direct and newsletter channels years before the search decline forced the issue. The ones that didn’t are trying to build that muscle now, during the decline, which is a much harder job.

Most publisher analytics, including ours, grew up in an era when the publisher’s job was to understand what search traffic did once it arrived. Which articles held attention. Which converted. Which didn’t. That’s content intelligence, and it was the right problem to solve when traffic was abundant and external.

The new problem is different. How does a reader move from a first visit to a repeat visit to a loyal relationship? What content earns the second visit? Which acquisition sources produce readers who stay? When should the newsletter signup appear, and to whom?

That’s a different type of analysis – one that we call relationship intelligence.

Three diagnostic questions

The starting point is a traffic-mix audit – not to confirm assumptions, but to see where things actually stand. Most publishers are surprised by what they find. Three questions cut to the heart of the picture quickly:

  1. What percentage of your traffic is direct or newsletter-driven today, compared to 12 and 24 months ago? If that figure is flat or shrinking while search declines, owned audience isn’t developing fast enough to offset the loss.
  2. Which pieces of content drive newsletter signups or repeat direct visits, as opposed to the ones that get the highest raw pageviews? These are often different articles, and conversion performance tends to be under-examined in editorial reviews.
  3. Where did your most loyal subscribers originally come from, and what was the first piece of yours they engaged with? The acquisition path that produces a long-term subscriber is probably the most underused signal in publisher analytics.
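The first of these questions can be answered with a quick script against exported analytics data. Below is a minimal sketch; the channel names and session counts are hypothetical, and real figures would come from your analytics platform’s referrer breakdown.

```python
# Hypothetical monthly sessions by acquisition channel at three points in time.
# Real numbers would come from an analytics export (sessions by referrer class).
snapshots = {
    "24 months ago": {"search": 900_000, "social": 150_000, "direct": 200_000, "newsletter": 50_000},
    "12 months ago": {"search": 700_000, "social": 120_000, "direct": 230_000, "newsletter": 70_000},
    "today":         {"search": 450_000, "social": 100_000, "direct": 260_000, "newsletter": 90_000},
}

# Channels the publisher owns outright, with no intermediary algorithm.
OWNED = {"direct", "newsletter"}

for label, mix in snapshots.items():
    total = sum(mix.values())
    owned = sum(v for channel, v in mix.items() if channel in OWNED)
    print(f"{label}: owned share {owned / total:.1%} ({owned:,} of {total:,} sessions)")
```

Note that owned share can rise simply because search collapses faster than owned channels grow, so it is worth tracking the absolute direct and newsletter session counts alongside the percentage.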

What owned relationships produce

Direct traffic converts to paid subscriptions at a higher rate than search-referred traffic. A reader typing in your URL or clicking through from your newsletter already has a relationship with your site. A search visitor often doesn’t.

Newsletters are the most concrete example. Publishers sent 28 billion emails in 2025 to over 255 million readers, with average open rates above 41%. There’s no intermediary algorithm between the publisher and the inbox, which is the whole point. The Financial Times now gets more than 70% of its subscriber traffic through its mobile app. That traffic doesn’t move if Google changes a ranking signal next quarter.

What’s missing: the audience connection

What’s missing in most publisher analytics today isn’t more pageview data – it’s relationship intelligence. The acquisition path that produces a long-term subscriber. The content that earns a second visit. The newsletter signup that started a ten-year reader relationship.

A reader who found you through a newsletter, opens your app a few times a week, and subscribed because they trust your coverage on a specific beat is not a reader Google or an AI assistant can reassign. That’s a different audience than the one search was providing for most of the last decade. It’s a much more valuable audience. And relationship intelligence is how you build it.


About the author

Bob Ralian is Head of Unified Analytics at Automattic, including Parse.ly, the content analytics platform for enterprise publishers. His team works with publishers to make sense of their audience data: what’s working, what isn’t, and what to do about it.

The post How publishers rebuild audience ties as search falls appeared first on Digital Content Next.

The new publisher challenge
https://digitalcontentnext.org/blog/2026/04/13/the-new-publisher-challenge/ – Mon, 13 Apr 2026

Publishers have been under the gun for 25 years. The transition to the digital age forced media companies to adapt again and again to evolving consumer habits and changing technology. The trend has been cumulative, each staff reduction making it harder to maintain the talent needed to survive the next round of change. Now publishers must contend with the continued dominance of the big platforms and sudden, dramatic declines in their own traffic driven by AI-powered search.

Publishers understand what they’re up against. They’ve done the math. They know they need to engage audiences across social video, YouTube, audio platforms, and emerging AI interfaces, environments where discovery is driven by algorithms, not direct visits. Every day they work to balance maximizing short-term revenue while maintaining the user experience that builds and keeps an audience over time.

But execution is hard. And it’s getting harder.

Operating across more platforms and environments requires people and know-how that most publishers no longer have. Short-staffed teams can’t juggle dozens of disconnected tech vendors. Data doesn’t flow where it needs to flow. And the operational debt from years of patching together point solutions is making it harder to move fast.

Create. Transform. Distribute. Engage. Monetize.

To compete in a market now dominated by platforms, creators, and AI-driven discovery, publishers need to reorganize their operations around a clear set of functions: creating content, transforming it for different environments, distributing it effectively, driving engagement, and monetizing it across channels.

Create

Every successful content creator – from influencers to the best-known media brands – has a secret sauce: a unique style or point of view. Many are rightly concerned that unconstrained use of AI will commoditize quality content, or that a torrent of AI slop will drown out the good stuff. This is why publishers have to be maniacal about quality and authenticity to create real consumer engagement.

Transform

But how do you scale that quality content across today’s fragmented consumer landscape? Here is where AI finds purpose. It turns out that AI is really good at taking content and adapting it to different environments and formats. With some expertise and guidance, it can maintain brand standards of quality, trust, and authenticity across many surfaces.

Today, the words “publisher” and “website” cannot be synonymous. Content has to be created to meet consumers where they live. That includes the social platforms, where the publisher’s goal of driving traffic back to its own website runs against the platforms’ imperative to keep the audience within their walled gardens. The content then has to do double duty: yes, drive traffic, but also maximize monetization programs that encourage customer engagement even when the audience experiences your content outside of your site or app.

Distribute

Once you’ve got content that’s tailored and transformed, the next problem is getting it everywhere it needs to be really fast. You cannot brute-force this. There aren’t enough hours in the day and there aren’t enough people on your team.

The market for consumer attention shifts constantly. The lifespan of a piece of content is finite: hours or days for news and longer for evergreen, explanatory, or enthusiast content. You need a real-time feedback loop telling you what’s still relevant, what’s gaining traction, and what’s already dead. Without that, you’re flying blind. And a piece of content that could have driven real revenue at hour two is worthless by hour six.

Speed isn’t a nice-to-have. It’s the whole game.

Engage

The goal of distribution is to drive engagement, because engagement drives revenue. The challenge is that the best format to engage with you may not be the same as what’s needed to engage with me. Some in your audience will prefer long-form video, some will prefer audio, some still prefer reading, and others will opt for short-form video. Other consumers will respond to more interactive experiences, like community boards, polls, quizzes, and games. Getting that right, at scale, for each individual is the engagement opportunity. And the publishers who solve it are rewarded with more content consumed, more time spent, and more frequent repeat visits.

Monetize

Let’s be honest about something. The platforms were not designed to make publishers rich. They were designed to keep audiences inside their walls, and they’re very good at it. For years, the monetization math outside your owned-and-operated properties was ugly, and most publishers knew it.

But something has shifted. Not because the platforms suddenly became generous. But because the pressure on them to attract and retain quality professional content has forced them to open doors they used to keep firmly shut.

YouTube now offers monetization models that generate real revenue for creators who treat their channel like a full-scale media business, not an afterthought. Its dynamic ad insertion tools give serious content owners the ability to operate more like TV networks, swapping sponsored segments in and out, extending the lifespan of sponsorships, and unlocking new monetization opportunities within existing content. Last year, Facebook made meaningful changes to its creator program, and publishers who wrote it off are quietly revisiting that math.

None of this is a windfall. The platforms will always take their cut. But “the platforms take a cut” and “there’s real revenue to be captured” are not mutually exclusive statements. The publishers extracting value from these channels aren’t doing it because the platforms are benevolent. They’re doing it because they’ve built the operational infrastructure to move fast, transform content for each environment, and actually work the monetization programs available to them.

That’s the opportunity. 

Shifting Thinking

Most publishers know they need expertise to help them extract value from their content wherever it is experienced, and they are looking at the current moment with a clear-eyed determination to capture as much of that value as they can. They are adapting to the current circumstances and seeking out new partners who help them succeed. The partners who can consolidate data flows, simplify workflows, and harness AI to automate processes will be their best friends.

The post The new publisher challenge appeared first on Digital Content Next.

Turning AI content usage into revenue
https://digitalcontentnext.org/blog/2026/04/06/turning-ai-content-usage-into-revenue/ – Mon, 06 Apr 2026

As AI systems increasingly access digital content, publishers are entering a new commercial reality. Content is being consumed in ways that often sit outside traditional channels such as search, social, or direct audience relationships. While the industry has made progress on scraping detection, permissions, and licensing negotiations, a core challenge remains unresolved: how to consistently turn AI usage into measurable, recurring revenue.

Most publishers now accept that AI licensing will become part of future business models. The challenge is operational. Converting content usage into revenue requires infrastructure that connects traffic signals, pricing frameworks, and payment workflows into a cohesive system.

AI licensing is moving from policy to execution

Early conversations about AI and publishing focused on access rights, attribution, and platform accountability. Those debates still matter. But publishers are now entering a more practical phase of the market centered on execution.

This shift requires answering several basic questions:

  • Who is using the content?
  • What are they allowed to do with it?
  • What is that usage worth?
  • How does the publisher get paid?

Today, these answers are often scattered across tools and teams. Analytics platforms may identify bot activity. Legal teams negotiate licensing terms. Commercial teams structure agreements. Finance teams handle billing and reporting. Without integrated workflows, AI monetization strategies remain fragmented and difficult to scale.

The need for usage-based AI monetization infrastructure

For AI licensing to become a durable revenue stream, publishers will need systems built around usage-based economics. In practice, this means enabling workflows that can:

Identify and classify AI traffic.

Publishers need visibility into how AI systems interact with content, including frequency of access, depth of engagement, and types of material consumed.
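A first pass at this visibility can be as simple as matching request user-agent strings against known AI crawler tokens. In this sketch, the crawler tokens are real published user-agent names, but the category labels are assumptions for illustration; production systems typically combine user-agent matching with IP verification.

```python
# Known AI crawler user-agent tokens (a non-exhaustive sample), mapped to an
# assumed usage category for reporting purposes.
AI_CRAWLERS = {
    "GPTBot": "training",
    "ClaudeBot": "training",
    "CCBot": "training",
    "PerplexityBot": "retrieval",
}

def classify(user_agent: str) -> str:
    """Return an AI-usage category for a request, or 'human/other'."""
    for token, category in AI_CRAWLERS.items():
        if token in user_agent:
            return f"ai:{category}"
    return "human/other"

print(classify("Mozilla/5.0 (compatible; GPTBot/1.2; +https://openai.com/gptbot)"))  # ai:training
print(classify("Mozilla/5.0 (Windows NT 10.0; Win64; x64)"))  # human/other
```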

Apply flexible licensing models.

AI agreements are unlikely to follow a single template. Some will involve flat-fee partnerships, while others will rely on usage-based pricing or dataset licensing. Infrastructure must support experimentation without requiring new operational processes for every deal.

Convert usage signals into billable events.

Operationalizing AI monetization requires translating content access into economic transactions. This includes assigning rate cards, tracking consumption, and generating revenue statements that support negotiation, compliance, and financial reporting.

Settle payments and route revenue.

Once pricing is applied, publishers need systems that can manage invoicing, revenue allocation, and partner payouts across multiple licensing structures.
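To make the rate-card step concrete, here is a minimal sketch of turning usage records into billable totals per licensee. The event shape, category names, and per-access rates are all hypothetical; a production system would draw them from measurement data and contract terms.

```python
from collections import defaultdict

# Hypothetical rate card: price per content access, by category (USD).
RATE_CARD = {"news": 0.004, "archive": 0.001, "realtime": 0.010}

# Hypothetical usage log: (licensee, content category, number of accesses).
usage_events = [
    ("ai-platform-a", "news", 120_000),
    ("ai-platform-a", "archive", 500_000),
    ("ai-platform-b", "realtime", 40_000),
]

def bill(events, rates):
    """Aggregate usage events into a billable total per licensee."""
    totals = defaultdict(float)
    for licensee, category, accesses in events:
        totals[licensee] += accesses * rates[category]
    return dict(totals)

invoices = bill(usage_events, RATE_CARD)
for licensee, amount in invoices.items():
    print(f"{licensee}: ${amount:,.2f}")
```

The point of the structure is that new deals become new rows in the rate card rather than new operational processes, which is the scalability the section above calls for.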

Emerging solutions are beginning to address parts of this workflow by bringing usage measurement, pricing logic, and settlement processes into a unified environment. The goal is not to replace existing systems, but to create an operational layer that allows publishers to run AI licensing as an ongoing business function rather than a series of bespoke agreements.

Flexibility will define the next phase of AI monetization

The AI market is evolving quickly, and publishers will need optionality. Direct licensing agreements, collective negotiations, and marketplace models may all coexist. Some organizations will prioritize strategic partnerships with major AI platforms. Others will focus on monetizing specialized datasets, archives, or real-time information.

Infrastructure that supports experimentation will be essential. Publishers must be able to test pricing models, analyze usage patterns, and refine commercial strategies without rebuilding workflows each time the market shifts. This mirrors earlier transitions in digital publishing, where scalable advertising and subscription technology enabled new revenue streams to grow.

AI content monetization will only become meaningful if publishers move from fragmented signals to repeatable revenue systems. Visibility into AI usage is the starting point. The real opportunity lies in building the infrastructure that makes licensing measurable, manageable, and financially actionable, turning content consumption into a predictable commercial engine.

The post Turning AI content usage into revenue appeared first on Digital Content Next.

Why pre-sales determines how well revenue will scale
https://digitalcontentnext.org/blog/2026/03/30/why-pre-sales-determines-how-well-revenue-will-scale/ – Mon, 30 Mar 2026

Pre-sales in advertising operations shapes how quickly revenue converts and how reliably it holds through execution. Effectively managed, it becomes a determining factor in how efficiently revenue can be generated and sustained.

Before a deal ever reaches order management, it moves through pricing validation, proposal construction, revisions, and internal approvals. That process shapes how quickly deals close, how accurate they are, and how well they hold up in execution.

In many media organizations, this stage is still managed through manual coordination across systems and teams. Unfortunately, this can make pre-sales a structural tax on revenue capacity rather than a marginal annoyance.

Our survey of 500 media professionals revealed that teams pour significant effort into pricing validation, proposal revisions, and approvals. 77% of respondents reported recurring pricing or deadline errors, and 44% said those mistakes derail work entirely. However, deals still close, which makes the system feel like it works.

The work no one sees: advertising pre-sales

It’s the normalization of that friction – 92% describe themselves as satisfied with their tools despite the risk – that tells a different story. The friction doesn’t show up in one place; it’s built into the day-to-day work of advertising pre-sales.

It shows up as repeated, low-value work across the process. Teams check pricing across multiple systems that don’t fully align. They rebuild proposals as inputs change or feedback comes in. Approvals move through email threads where context is incomplete or buried. Details are re-entered or reformatted as work passes between teams.

In interviews with media leaders, teams consistently described working across CRM platforms, shared drives, spreadsheets, and email to assemble deals. Information is fragmented, and finding the right version is often part of the effort itself. As one leader put it, “email is my CRM.”

Taken together, this creates a system where progress depends on continuous coordination rather than a defined, structured flow of work.

Why it feels like existing processes work

If this level of effort is built into pre-sales, why hasn’t it been addressed?

The answer is simple. The system still produces results.

Deals move forward. Revenue comes in. From the outside, the system appears to work.

But what’s hidden is the level of effort required to sustain that performance. Over time, that effort becomes part of the operating rhythm. It’s expected, absorbed, and rarely measured directly.

This is where perception starts to diverge from reality. Organizations report high levels of satisfaction with their current tools, even as manual errors and rework remain common.

Success is measured by whether revenue comes in, not by the cost or effort it takes to produce it. As long as deals continue to move forward, the underlying inefficiency remains largely invisible.

Where ad deals actually slow down

That gap in perception also shapes how delays are understood. When deals lose momentum in pre-sales, the instinct is often to attribute it to sales execution or responsiveness. In practice, the causes are overwhelmingly operational.

The survey data reinforces this: 32% of respondents cited client input delays, while 22% pointed to data and system issues and 21% to stakeholder coordination.

Each step depends on inputs from other systems and teams. When those inputs fall out of sync, progress stops and work must be rebuilt to reflect the latest information. Because that reconciliation is constant, delays tend to repeat rather than resolve, directly affecting time-to-revenue and the predictability of pipeline conversion.

Where scale starts to break

This model holds at lower volumes but becomes difficult to sustain as deal flow increases. More deals introduce more revisions, dependencies, and coordination across teams, and the workload grows with that complexity instead of being absorbed by the system.

As volume rises, inconsistencies become harder to contain, delays increase, and execution risk rises. At that point, the constraint is no longer demand; it is the organization’s ability to convert that demand into revenue efficiently.

Why this is an operating model problem

Advertising pre-sales is not managed as a system. It operates as a series of disconnected tasks. Information moves across email, spreadsheets, and multiple platforms, where it is gathered, reconciled, and updated by hand. There is no mechanism to keep pricing, proposals, and approvals aligned as deals evolve.

When inputs change, work has to be rebuilt. When approvals stall, teams compensate. The process holds together through effort rather than design, which makes revenue capacity a function of how much coordination teams can absorb.

Some organizations are starting to restructure this as an orchestrated workflow, where pricing, proposals, and approvals remain synchronized as deals change, reducing the need to rebuild work at each step.

What changes when pre-sales doesn’t break

When pre-sales is structured as a connected, orchestrated workflow, the nature of the work shifts. Instead of being rebuilt at each step, work progresses with continuity. Changes stay aligned as deals evolve, rather than triggering rework across systems and teams.

Coordination doesn’t disappear, but it becomes part of the process rather than something teams have to manage manually. Dependencies are handled within the workflow, not across disconnected tools and handoffs.

As deal volume increases, that difference shows up in how revenue moves. Deals progress with fewer interruptions, timelines become more predictable, and execution holds more consistently against what was sold.

Because advertising pre-sales defines the terms of the deal, it ultimately defines the quality of the revenue itself – how quickly it converts, how reliably it delivers, and how much effort is required to sustain it. When the process depends on coordination, growth requires more effort to keep pace. When the system maintains alignment, revenue grows with far less friction.

The post Why pre-sales determines how well revenue will scale appeared first on Digital Content Next.

Trusted content classification fuels advertiser spend on news
https://digitalcontentnext.org/blog/2026/03/23/trusted-content-classification-fuels-investment-in-news/ – Mon, 23 Mar 2026

Advertiser investment in news depends on buyers having clear, reliable control over how their suitability preferences – meaning the content they consider appropriate for their brands – are applied. As new brand suitability tools enter the market, that control is becoming less consistent. The result is growing misalignment between how content is classified and how advertisers actually assess risk.

Content classification approaches that are accurate, consistent, policy-driven and well-aligned with the expectations of advertisers and the broader industry are the foundation on which trust and confidence are built between advertisers and publishers. But as the industry evolves, a new set of unproven brand suitability vendors may be complicating advertisers’ ability to invest confidently in news. 

Evidence suggests these new tools and solutions do not adequately consider advertisers’ perspectives on risk and suitability, and they classify news content in ways that are not aligned with buyers’ needs and expectations. This misalignment risks damaging news publishers’ relationships with advertisers and weakening long-term revenue.

Data backs this up. A recent DoubleVerify (DV) survey collected feedback from 25+ advertisers across different industry verticals on content classified as “low risk” by new and unproven tools across three major news publishers’ sites. Although a tiny portion of marketers – just 1% – view all types of news content as unsuitable for their advertising, in 92% of cases respondents considered this “low risk” content unsuitable and said they would actively avoid it.

Consumer sentiment shows a similar trend. A DV survey of 295 U.S. consumers found that 26%, on average, would think less of a brand or respect it less if it appeared alongside the content categorized as “low risk” by these new vendors.

Maintaining advertiser trust

It’s tempting to believe that classifying large portions of news content as “low risk” might open up advertiser spending and drive additional revenue for publishers. However, in practice, the inverse is likely true. Classifications that do not reflect advertisers’ views could damage their ability to invest confidently in news or alienate them from news publishers entirely.

Many publishers understand that accurate and consistent classifications that meet advertisers’ expectations ultimately help them build trust and confidence with their partners. Brands in some verticals are particularly sensitive, publishers say, such as those selling luxury goods or operating in regulated industries.

Reliable, flexible, and customizable tools are therefore critical for maximizing revenue for news publishers: they enable advertisers to avoid only the content they deem unsuitable for their brands and campaign strategies. This approach is validated and supported by the advertiser community, including Wayne Blodwell, Global SVP Programmatic at Assembly Global.

Blodwell said: “Trustworthy and accurate content classification is critical for enabling our clients to invest their budgets confidently — particularly in news environments. Reliable and consistent classification technology enables us to maximize our investment with news publishers while ensuring our clients’ varied suitability needs and preferences are met.”

Growing advertiser investment in news

Research continues to show that investing in news is smart business for advertisers, and that those avoiding it altogether are missing out on powerful opportunities to engage with valuable, high-performing audiences.

DV’s News Accelerator initiative has seen firsthand that advertisers feel most confident investing in news environments when they have transparency, flexibility, and control over the specific types of news content they align with. DV data also reveals that advertising on news content now drives 16% more engagement than non-news content, and advertising alongside news content typically outperforms other digital channels, according to marketers.

Ensuring content classification is accurate, reliable and scalable requires continued alignment across advertisers, publishers and technology providers. Without it, confidence in news environments will erode, along with the investment that depends on it.


About the author

Jack Marshall is the Head of News for DV. With nearly two decades of experience in digital media journalism and publishing, Jack leads DV’s efforts in the news sector, including DV’s News Accelerator initiative. The DV News Accelerator aims to align DV’s product innovation with the needs of the news industry, and encourage advertiser spending on news and journalism. 

The post Trusted content classification fuels advertiser spend on news appeared first on Digital Content Next.

Speed vs. accuracy: Journalism’s ethical balancing act
https://digitalcontentnext.org/blog/2026/03/16/speed-vs-accuracy-journalisms-ethical-balancing-act/ – Mon, 16 Mar 2026

The pressure to publish first has always existed in journalism. What has changed is the pace at which decisions are made.

In today’s digital-first newsrooms, journalists often report live, publish updates in real time, and interact directly with audiences as stories unfold. The result is tension between speed and accuracy. It is no longer just a professional challenge but, increasingly, an ethical one shaped by the systems and workflows that define real-time journalism.

Our latest research with student and early-career journalists, drawing on interviews and survey responses, highlights how strongly this concern is felt. Many young reporters say the expectation to publish quickly, correct later, and keep the feed moving can feel like pressure to take risks. When verification occurs after publication rather than before, accuracy becomes reactive instead of foundational.

For media executives, this shift raises an important question: how can news organizations deliver the speed audiences expect while protecting the credibility that sustains trust? Addressing that question requires more than reminding journalists to “be careful.” It requires rethinking the systems, workflows, and newsroom culture that shape real-time journalism.

The ethical pressure of real-time news

Live blogs, rolling coverage, push notifications, and social platforms mean that each new detail can reach audiences within seconds. This immediacy is powerful, enabling newsrooms to inform the public almost in real time. But once information is published, it spreads quickly across platforms and communities, often far beyond a newsroom’s control. Even when updates or corrections are issued later, there is no guarantee they will reach the same audiences. The original version can continue to circulate long after corrections have been made.

For younger journalists working inside these workflows, the ethical stakes feel high. They are often operating at the intersection of reporting, publishing, and audience interaction. In some cases, they are expected to monitor live feeds, write updates, verify information, and respond to audience questions simultaneously.

The intention behind these workflows is understandable. Audiences expect immediacy, competitors publish in real time, and the news cycle moves quickly. But when newsroom systems reward velocity above all else, they risk signaling that speed matters more than judgment.

That perception matters. Trust depends on the belief that news organizations prioritize accuracy even when it slows them down. If journalists feel pushed to publish unverified information, that trust becomes harder to sustain.

When technology accelerates publishing but not verification

Digital publishing tools have transformed how breaking news is reported. They allow reporters to update stories instantly, provide minute-by-minute coverage, and keep audiences informed as events unfold.

Used well, these tools strengthen journalism. They enable transparency, allow corrections to be made quickly, and give audiences a clearer view of what is known and what is still developing.

The problem arises when technology rewards speed without supporting the editorial decisions behind it. Real-time publishing environments can encourage constant updates, even when information is incomplete. If newsroom dashboards or performance metrics emphasize update frequency or time-to-publish above all else, journalists may feel pressure to move forward before verification is complete.

Media executives should consider whether their tools and metrics reinforce the right priorities. Do workflows allow time for verification? Do editors have clear visibility on updates before they go live? Are journalists encouraged to label uncertain information clearly rather than present it as confirmed?

Technology cannot replace editorial judgment, but it can either strengthen or weaken it.

Credibility built through transparency

Accuracy is not only about getting facts right the first time. It is also about how news organizations respond when information changes.

In live coverage, new details often emerge that challenge earlier assumptions. Responsible reporting means correcting inaccuracies quickly and clearly. It also means explaining those corrections so audiences understand what changed and why.

This transparency is essential for maintaining credibility. Audiences are often more forgiving of evolving information than of silence or defensiveness when mistakes occur.

The same principle applies to audience engagement. Today’s journalists frequently interact directly with readers through comment sections and social platforms. These conversations can build trust when handled well, but they can also spread confusion or misinformation if inaccurate claims are left unaddressed. When false information appears in comment threads or audience discussions, correcting it promptly helps prevent those claims from spreading further.

Newsrooms should be prepared for this reality. That preparation includes setting clear community guidelines, assigning responsibility for monitoring conversations, and ensuring journalists are supported when responding in fast-moving environments.

Responding quickly matters, but so does responding carefully.

Building systems that support ethical speed

The core challenge facing digital newsrooms is not whether to move quickly. Speed is part of modern journalism, and audiences expect it. The challenge is ensuring it does not weaken the editorial standards that define the profession.

Meeting that challenge starts with clear expectations. Verification is not optional, even under pressure. When information is uncertain, the responsible approach is to say so.

It also requires practical support. Editors, producers, and audience teams should work together so reporters are not juggling every responsibility alone during live coverage. When someone is responsible for monitoring comments or verifying incoming information, the reporter covering the story can focus on accurate updates.

Training also matters, particularly for younger journalists who are starting their careers in live, digital news environments rather than traditional reporting structures. They need guidance not only on how to publish quickly but also on when to pause.

Finally, newsroom leaders must reinforce that credibility remains the industry’s real competitive advantage. Speed may capture attention in the moment, but trust determines whether audiences return tomorrow.

Accuracy sustains trust

The modern newsroom operates in an environment defined by constant updates and immediate audience response. That reality is unlikely to change. What can change is how organizations balance the demands of speed with the responsibility of accuracy.

Journalism has always required difficult judgment calls. In digital reporting, those decisions simply happen faster and in public view. The goal is not to slow down the news cycle, but to ensure that the systems behind it protect the principles journalism depends on.

Speed may capture attention. Trust depends on whether the systems behind the newsroom protect accuracy when the pressure to publish is highest.

Retention over reach: the strategic reset behind publisher apps https://digitalcontentnext.org/blog/2026/03/09/retention-over-reach-the-strategic-reset-behind-publisher-apps/ Mon, 09 Mar 2026 11:24:00 +0000 https://digitalcontentnext.org/?p=46924 Is this round two of apps? That was the question Jonny Kaldor, CEO of Pugpig, posed on stage at Arc XP Connect NYC. After years dominated by platform distribution, algorithmic...

The post Retention over reach: the strategic reset behind publisher apps appeared first on Digital Content Next.

Is this round two of apps?

That was the question Jonny Kaldor, CEO of Pugpig, posed on stage at Arc XP Connect NYC.

After years dominated by platform distribution, algorithmic volatility, and pageview economics, publishers are once again sharpening their focus on direct relationships. In that recalibration, the mobile app is stepping back into the spotlight, not as a companion product, but as a core offering.

On stage with Kaldor were Ariscielle Novicio, CTO & SVP of Product & Digital Strategy of the New York Post, Kathy Colafemina, VP of Program Management of The Boston Globe, and James Cooney, VP of Product Engineering of Condé Nast. Their organizations differ widely in brand, audience, and business model. Yet one theme surfaced repeatedly: apps may represent a smaller slice of total reach, but they consistently deliver the most engaged, most loyal, and often the most valuable users.

Here’s what that shift means for publishers thinking seriously about mobile strategy in 2026 and beyond.

The most engaged audience is already there

Novicio was unequivocal about the value of the New York Post’s app users: “The most engaged users that we have — and we know this from a lot of research and studies — is with our app.”

These readers don’t just visit. They return multiple times a day. “I’m actually really impressed at how often they come,” she added.

Cooney described a similar pattern at Condé Nast. While apps are not the primary scale driver across its portfolio, “they’re the most engaged audience and they’re paying audiences” on key brands. The Vogue app in particular delivers standout engagement levels.

The takeaway isn’t that apps replace the web. The web remains essential for scale and discovery. But when it comes to direct, habitual relationships, the app environment performs differently.

Retention is the core function

When asked whether apps are retention or acquisition tools, Cooney was direct: “It’s a retention tool.” That clarity shapes how success is measured. At Condé Nast, repeat usage over time is a leading signal: if a user is still visiting multiple times per month by month three, churn likelihood drops significantly.

Retention isn’t measured by downloads alone. Teams look at weekly users, DAU/MAU ratios, and sustained engagement across months. Habituation is the goal.
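
To make the habituation framing concrete, here is a small sketch of how a DAU/MAU stickiness ratio can be computed from a visit log. The user IDs, dates, and visit patterns are entirely hypothetical, invented for illustration; they are not data from any publisher named in this piece.

```python
from datetime import date

# Hypothetical visit log: (user_id, visit_date) pairs over a 30-day window.
visits = (
    [("u1", date(2026, 3, d)) for d in range(1, 31)]     # daily habit user
    + [("u2", date(2026, 3, d)) for d in (3, 10, 17, 24)]  # weekly user
    + [("u3", date(2026, 3, 5))]                           # one-off visitor
)

# Monthly active users: anyone seen at least once in the window.
mau = {user for user, _ in visits}

# Average daily active users across every day of the window,
# including days with zero visits.
window = [date(2026, 3, d) for d in range(1, 31)]
daily_counts = [len({u for u, d in visits if d == day}) for day in window]
avg_dau = sum(daily_counts) / len(window)

# DAU/MAU ratio, a common proxy for habit strength.
stickiness = avg_dau / len(mau)
print(f"MAU={len(mau)}, avg DAU={avg_dau:.2f}, DAU/MAU={stickiness:.2f}")
```

A single daily-habit user lifts the ratio far more than several occasional visitors, which is why teams tracking habituation watch DAU/MAU alongside raw download or MAU counts.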

The Boston Globe’s strategy reinforces this approach. Colafemina explained that the Globe rebuilt its app in 2024 specifically as a retention product. It is embedded into subscriber onboarding immediately after purchase, making the app part of the reader’s daily routine from the start. Today, more than 40% of Globe subscribers access content through the app.

For them, the app is a mechanism for reinforcing loyalty from day one.

From chasing pageviews to building depth

The renewed focus on apps reflects a broader strategic shift.

“In earlier discussions, we were chasing page views,” Novicio said. “Now we still want to maintain scale, but we’re focused on future-proofing the business by building for loyalty and depth.”

That evolution has shifted product priorities within the app experience. Instead of optimizing primarily for raw reach, the team now looks closely at engagement metrics such as daily active sessions per user, repeat visits per day, and screen views per session. As Novicio explained, the focus is on how often people return each day and how deeply they engage once they’re there.

While depth can be more difficult to earn than traffic, it’s also more durable once established.

Monetization is catching up to engagement

Historically, apps were framed primarily as subscription vehicles. Advertising often lagged behind web performance. But that is changing.

At the New York Post, Novicio described plans for premium areas within the app dedicated to direct sales and high-value sponsorships, alongside expanded e-commerce integration. “I’m focused on the CPMs,” she said, underscoring the importance of monetization and her own goal of achieving web and app CPM parity. Investments in data infrastructure are enabling stronger signal delivery across all audience types. “We’re applying that to 100% of the audience that comes to us,” Novicio explained, referencing anonymous, registered, and opted-in users alike.

At Condé Nast, advertising conversations are similarly accelerating. Rather than replicating every piece of web-based ad logic inside native apps, Cooney described a more focused approach: start with advertiser use cases and design purpose-built solutions.

There is also a competitive lens. Internal comparisons increasingly point to app-first platforms like TikTok and Instagram. Advertisers expect immersive, high-quality environments. Publisher apps must meet that experiential standard while preserving editorial integrity.

Exclusivity drives action

While retention remains the primary function, apps can generate acquisition spikes when they host exclusive experiences.

Cooney pointed to Vogue’s app-only Nicki Minaj group chat as an example. The interactive event was accessible solely inside the app and became “one of the biggest single day drivers of downloads and new starts that we had had.”

Exclusivity created urgency and urgency drove downloads.

The lesson isn’t about locking content behind arbitrary walls. It’s about designing experiences that feel inherently mobile, and valuable enough to justify the download.

Competing on experience, not volume

Vertical video and swipe-based storytelling also surfaced as key areas of experimentation. Publishers recognize they cannot match the output scale of social platforms. The objective is different.

“We’re trying to compete in the sexiness of the experience,” Novicio said. “But as far as content, we still truly want to be ourselves.”

That means adopting intuitive interaction patterns — seamless swiping, strong visual storytelling, easy sharing — without sacrificing brand voice.

That is something apps offer that social platforms cannot: full control over the environment, the data, and the relationship.

A strategic reset

So, is this round two of apps?

In many ways, yes. But it is not a repeat of 2010 enthusiasm. It is a more disciplined, data-informed reset.

The web will continue to play a critical role in driving reach, and social platforms will play a role in discovery. What emerged from this conversation, however, is that the app holds a distinct and increasingly strategic position within that ecosystem. As Novicio, Colafemina, and Cooney underscored, the app environment is uniquely suited to cultivating habit, strengthening loyalty, and generating monetizable engagement within a publisher’s own infrastructure.

For publishers shaping their mobile strategy in 2026 and beyond, the conversation has moved forward. The real opportunity lies in building apps with clear purpose, cross-functional alignment, and a long-term view of audience value.

Media sellers face a performance reset in 2026 https://digitalcontentnext.org/blog/2026/03/02/media-sellers-face-a-performance-reset-in-2026/ Mon, 02 Mar 2026 12:41:00 +0000 https://digitalcontentnext.org/?p=46880 After years of volatility—shifting buyer expectations, uneven ad spend, and constant platform change—this year is shaping up to be a defining one for media sellers. Unlike previous cycles, the uncertainty...

The post Media sellers face a performance reset in 2026 appeared first on Digital Content Next.

After years of volatility—shifting buyer expectations, uneven ad spend, and constant platform change—this year is shaping up to be a defining one for media sellers. Unlike previous cycles, the uncertainty has given way to clearer buyer behavior. Advertisers are no longer experimenting. They’re standardizing how they plan, evaluate, and invest. The question for media sellers isn’t whether demand will return, but who will earn it.

Based on what we’re seeing across the market, and reinforced by data shared in MediaRadar’s State of the Industry: 2026 Advertising Predictions webinar, let’s dive into three trends we’re predicting will shape how buyers allocate budgets in 2026 and how publishers must evolve to capture share.

1. Buyers are planning around outcomes, not environments

In 2026, advertisers are entering the market with fewer experimental dollars and clearer performance mandates. According to EMarketer data, U.S. media spend is projected to grow from $622B in 2025 to more than $838B by 2028. But that growth is flowing disproportionately to channels and partners that can demonstrate impact.
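
As a quick back-of-the-envelope check on those figures, the implied compound annual growth rate can be computed directly, assuming the $622B and $838B values are endpoints three years apart (2025 to 2028):

```python
# eMarketer figures cited above; the three-year gap is an assumption
# based on the 2025 and 2028 endpoints.
start, end, years = 622e9, 838e9, 3
cagr = (end / start) ** (1 / years) - 1
print(f"Implied compound annual growth: {cagr:.1%}")  # about 10.4% per year
```

Roughly 10% annual growth is healthy, but as the article notes, that growth is not evenly distributed across channels.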

At the same time, open web display advertising continues to lose ground. MediaRadar data shows open web display spend flattening and declining year-over-year through 2024 and 2025, as budgets move toward higher-impact video and direct, curated buys.

Buyers are now planning backward from outcomes (awareness lift, site traffic, and performance signals). They are also asking sellers to prove how inventory, formats, and creative directly contribute to those goals.

  • Budgets are consolidating with fewer partners as buyers look to simplify execution and measurement. 
  • Packages need to align to use cases (launches, seasonal moments, competitive conquesting, etc.) rather than impressions alone. 
  • Performance benchmarks and historical proof points are increasingly required in RFPs.

Publishers that can clearly connect their offerings to outcomes, and support that story with data, are earning larger, more strategic commitments.

2. Creative is becoming the primary lever for differentiation

As addressability narrows, creative has emerged as the primary driver of performance. There’s a clear shift in where attention and budgets are going. Programmatic video ad spend alone is expected to approach $150B by 2027, according to EMarketer data, and CTV is no longer treated as an incremental reach-only channel.

Across industries, CTV ad spend is growing aggressively. For example, according to EMarketer and MediaRadar data, automotive CTV spend is projected to grow from $3.1B in 2025 to $5.2B by 2028, while CPG is expected to nearly double from $2.6B to $4.9B over the same period. 

These gains are being driven not just by audience reach, but by creative formats that move people. Simply put, message-level performance is shaping buying decisions. Celebrity-led advertising increased 42% year-over-year, rising from 9.8% of total ad spend in 2024 to 13.9% in 2025 — a signal that advertisers are leaning into creative that builds trust and emotional connection.

The implications are clear:

  • Creative effectiveness is increasingly used as a proxy for media effectiveness.
  • Category-level creative insights (tone, format, spokesperson strategy) strengthen both upfront and scatter conversations. 
  • High-impact and custom units perform best when informed by performance data, not intuition.

The most effective sellers in 2026 aren’t just selling space. They’re helping advertisers tell engaging stories and make smarter creative decisions before campaigns go live.

3. First-party data needs to be activated, not just collected

Nearly every publisher has invested in first-party data, but one point is clear: possession is no longer enough. Activation is what buyers value.

As programmatic buying shifts away from the open exchange, control is consolidating. Today, according to data from the ANA via Marketing Dive, 59% of programmatic ad spend flows through private marketplaces versus 41% through the open marketplace. And when CTV is included, that split widens to 66% private versus 34% open.

At the same time, discovery behavior is changing. According to EMarketer, AI-driven search is projected to grow from just 1% of total search ad spend today to 13.6% by 2029, compressing the path from intent to action and reducing the role of the traditional click altogether. In this environment, vague audience claims quickly lose credibility.

  • Audience segments must be clearly defined, transparent, and tied to real outcomes.
  • First-party data should support planning, creative strategy, and optimization, not live in isolation. 
  • Context, creative, and data must work together to drive measurable performance.

Publishers that can explain how their data enhances effectiveness—rather than simply replacing targeting—will stand out in an increasingly competitive market.

What media sellers need to do now

The opportunity in 2026 is real, but so is the competition. To protect and grow share, media sellers should focus on three priorities:

  • Lead with outcomes, supported by proof
  • Use creative intelligence to guide advertiser strategy
  • Turn first-party data into a clear, buyer-friendly value story

2026 is not about chasing the next new thing. It’s about execution. Media sellers who adapt their approach now won’t just keep pace with buyers; they’ll help define how the market moves next.

MediaRadar will continue exploring these shifts in our State of the Industry webinar series. Our next session, Video Everywhere—Winning in the New Era of CTV, takes place on March 19. It will dive deeper into how streaming, CTV, and digital video are converging — and what that means for media sellers navigating a rapidly evolving video landscape.

About the author

Fan Shi Blackwell is the VP of Strategic Partnerships at MediaRadar, where she focuses on building high-impact alliances across AdTech, programmatic, retail media, data, and e-commerce. With over 15 years of experience, she helps platforms, brands, and agencies turn marketing intelligence into measurable growth, profitability, and ROI through omni-channel strategy, data-driven decisioning, and performance optimization.

From scale to signal: Why cleaner publisher environments gain value https://digitalcontentnext.org/blog/2026/02/02/from-scale-to-signal-why-cleaner-publisher-environments-gain-value/ Mon, 02 Feb 2026 12:26:00 +0000 https://digitalcontentnext.org/?p=46725 Programmatic advertising has long been essential to publisher revenue. Unfortunately, it has also posed a challenge to user experience.  However, the longstanding trade-off between ad density and revenue is shifting...

The post From scale to signal: Why cleaner publisher environments gain value appeared first on Digital Content Next.

Programmatic advertising has long been essential to publisher revenue. Unfortunately, it has also posed a challenge to user experience. 

However, the longstanding trade-off between ad density and revenue is shifting as improved buyer-side signals enable the market to distinguish—and reward—higher-performing publisher environments. As outcome-based buying becomes more prevalent, programmatic markets are getting better at recognizing performance and pricing publisher quality accordingly. 

For years, more ads often meant more money, but also slower page load times, diminished user experience, and growing tension with other revenue drivers. Publishers understood the trade-off, but lacked the tools to measure its long-term impact or determine whether the market would ever reward restraint rather than sheer volume.

Advertising only works well when it functions for all sides of the market:

  • For readers, ads must not overwhelm or degrade the experience.
  • For advertisers, ads must appear in trusted environments where they are seen and effective.
  • For publishers, ads must generate predictable, sustainable revenue without eroding audience perceptions.

Aligning those incentives hasn’t been easy. 

New analysis suggests that this dynamic is finally beginning to change. As buyer-side signals improve and outcome-based buying becomes more prevalent, the market is increasingly able to distinguish low-performing, high-density environments from cleaner, higher-performing ones—and price them differently over time.

This is important for publishers making strategic decisions about their user experience, and ultimately, long-term business health. 

Why the market didn’t reward quality before

For much of programmatic’s evolution, ad buyers lacked the inputs needed to consistently pay more for better environments.

Signals tied to outcomes were limited, and viewability and brand safety focused on avoiding the worst placements rather than rewarding the best ones. Attention and engagement metrics were either unavailable or unevenly applied.

Publishers faced their own constraints. Reducing ad density almost always produced short-term revenue declines, reinforcing a bias toward volume even when long-term performance might improve.

The result was widespread commoditization.

What long-term testing reveals

At Raptive, we’ve spent the past year evaluating ad density changes using longer-term, site-level cohort testing. With this testing, a different pattern emerges: once sufficient performance data accumulates, you can observe how pricing evolves. The critical variable is time. Short-term tests often obscure these effects; only extended observation allows pricing signals to fully adjust.

Across multiple publisher cohorts, cleaner pages demonstrate notable CPM resilience. These effects appear across both mobile and desktop environments. The key takeaway is not that fewer ads automatically mean more revenue in the short term. It is that the market is increasingly able to recognize environments where ads are more likely to perform, and then price those impressions accordingly.

The shift in the broader ad market

Several broader programmatic advertising market developments help explain why this is happening now:

  1. Buyer-side data has improved. 
  2. Attention and engagement signals are more widely integrated into buying decisions.
  3. Outcome-based buying models continue to gain traction.

Together, these changes reduce noise in the system and increase the market’s confidence in performance signals tied to cleaner environments. Also, cleaner pages reduce noise. They improve load times. They increase the likelihood that ads are seen and engaged with. All of these signals compound, and a healthier feedback loop begins to form between publishers and buyers.

This evolution reframes how ad density should be evaluated. Density is no longer just a tactical lever for increasing revenue. It is a strategic decision that influences how the market perceives and values inventory over time.

Publishers also train the market through their choices. High-density environments teach buyers to expect commoditized performance, and high-quality environments teach buyers to bid accordingly.

Cleaner publisher ad layouts are the future

We’re just beginning to see what the future could look like, and a long way off from this becoming the standard. But the good news is that programmatic markets are slowly moving beyond pure scale economics. Page experience and outcomes are finally becoming meaningful inputs into pricing decisions.

For premium publishers, this creates an opportunity to reclaim value lost during years of commoditization, but only through deliberate, strategic choices. Cleaner environments are not just a user-experience improvement; they actively shape how the market learns to value inventory.

Programmatic markets are not automatically fair, but they are increasingly teachable. Publishers that prioritize performance-oriented environments train buyers to recognize—and bid for—quality. For an industry that has long argued quality matters, the market is finally beginning to respond.

The high cost of ad tech friction: Why publishers must go direct https://digitalcontentnext.org/blog/2026/01/26/the-high-cost-of-ad-tech-friction-why-publishers-must-go-direct/ Mon, 26 Jan 2026 12:28:00 +0000 https://digitalcontentnext.org/?p=46679 Digital media executives have operated for nearly a decade within a paradoxical market structure. To achieve scale, the industry accepted opacity. Publishers plugged into a complex programmatic ad supply chain...

The post The high cost of ad tech friction: Why publishers must go direct appeared first on Digital Content Next.

Digital media executives have operated for nearly a decade within a paradoxical market structure. To achieve scale, the industry accepted opacity. Publishers plugged into a complex programmatic ad supply chain and conceded that a significant percentage of advertiser spend would vanish into the “ad tech tax.” The prevailing logic suggested that volume would eventually compensate for the erosion of margin.

That calculation no longer balances.

As the industry approaches 2026, the era of accepting opaque infrastructure has ended. For premium publishers, the definition of a modern media stack is shifting from broad connectivity to radical directness. Revenue durability now depends on ruthlessly rationalizing the supply path. Control, data, and economics must remain with the content creators.

The high cost of intermediary bloat

The systemic critique of the current programmatic environment is well-documented, yet the inefficiencies persist. Despite years of discourse regarding Supply Path Optimization (SPO), the chain remains cluttered with intermediaries. Many of these vendors function primarily through arbitrage rather than value addition.

For media executives, the issue extends beyond fees. It centers on the misalignment of incentives. When a supply chain involves multiple hops, reselling, and bid duplication, technology partners often optimize for their own volume and take rates. They rarely prioritize the publisher’s yield or the advertiser’s working media.

This complexity acts as a shield. It obscures where value leaks and complicates the auditing of revenue streams. In 2026, transparency serves as an architectural requirement rather than a sales talking point. If a publisher cannot trace a dollar from the DSP to their bank account without losing 30 to 50 percent to friction, the infrastructure has failed.
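
The friction math compounds multiplicatively across hops, which is why long chains erode so much value. Here is a minimal sketch, using hypothetical per-hop take rates chosen only for illustration (the hop names and percentages are assumptions, not actual vendor fees):

```python
# Hypothetical per-hop take rates along one supply path.
# Each intermediary takes its cut of whatever remains after the previous hop.
hops = {
    "DSP fee": 0.12,
    "SSP fee": 0.15,
    "reseller margin": 0.10,
    "data/verification": 0.08,
}

dollar = 1.00
for name, rate in hops.items():
    dollar *= (1 - rate)

print(f"Publisher receives ${dollar:.2f} of each advertiser dollar")
print(f"Friction cost: {1 - dollar:.0%}")
```

Even with no single fee above 15%, four modest hops leave the publisher with roughly 62 cents on the dollar, squarely inside the 30-to-50-percent friction band described above. Removing one hop recovers value disproportionately, which is the economic case for supply path rationalization.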

Redefining modern media infrastructure

Publishers must demand directness from the ad supply chain in the coming year. Moving toward a direct-to-publisher model represents a strategic reclamation of economic power rather than a simple technical adjustment. A direct infrastructure removes the cost of unnecessary middlemen. This ensures a higher percentage of advertiser spend reaches working media. When the path clears, yield increases naturally because the friction costs disappear.

Directness also maps to control. When publishers rely heavily on third-party legacy tech stacks, they become beholden to product roadmaps that do not prioritize their specific needs. By demanding infrastructure that allows for direct connections, media executives regain crucial operational capabilities.

  • Dictating terms. Executives can structure distinct commercial agreements that remain undiluted by third-party revenue shares. This clarity allows for more accurate forecasting and P&L management.
  • Protecting data. Direct paths limit the leakage of first-party data signals to unauthorized resellers. This security becomes paramount as privacy regulations tighten globally.
  • Accelerating innovation. Publishers can deploy new ad formats or privacy-preserving technologies immediately. They no longer need to wait for a massive intermediary to update legacy code.

The sustainability and efficiency imperative

A secondary argument for direct infrastructure has emerged as a critical business driver: Sustainability. The digital advertising industry generates a massive carbon footprint. This is driven largely by the sheer computing power required to process billions of bid requests, many of which are duplicative. In a convoluted supply chain, a single impression opportunity might generate dozens of calls to different servers. This consumes energy at every hop, only to result in a single ad serving.

This is the very definition of waste.

Brands and agencies now face pressure to report on Scope 3 emissions. Consequently, they are scrutinizing the carbon cost of their media buys. A direct connection offers an inherently greener alternative. It reduces the computational load significantly by eliminating reselling and secondary auctions.

For the publisher, this presents a dual advantage. A streamlined, direct supply chain generates higher profitability by capturing more working media. Simultaneously, it appeals to the growing number of eco-conscious buy-side partners. Efficiency serves as both a responsibility and a competitive differentiator.

The cooperative path forward

The shift away from opaque intermediaries toward transparent, direct connections is already underway. Industry bodies and specialized cooperatives are re-platforming to prioritize these direct specifications. However, the technology only functions as well as the demand for it.

Media executives must stop viewing the tech tax as a fixed cost of doing business. It is a solvable inefficiency. The goal for 2026 involves building a supply chain where value derives from the quality of the audience and the content. It should not depend on the complexity of the pipe used to reach them.

We must close the gap between the advertiser’s dollar and the publisher’s pocket. The technology to do so exists. The imperative now relies on the will to implement it.
