IAB Tech Lab unveils plans to bolster publisher monetization in the AI era

By Ronan Shields  •  June 4, 2025  •

IAB Tech Lab has used its annual summit to announce two major initiatives aimed at modernizing digital advertising infrastructure and content governance, as well as addressing some of the fundamental challenges that generative AI and LLMs pose to content monetization.

The first of the two initiatives, the LLM Content Ingest API Initiative, addresses publishers’ concerns about AI agents and large language models, as well as AI-driven search summaries that reduce publisher traffic. The second, the Containerization Project, is geared toward the development and maintenance of programmatic infrastructure (see more below).

  • The LLM Content Ingest API initiative proposes a technical framework to help publishers and brands control how their content is accessed, monetized, and represented by AI systems, aiming to address traffic and revenue losses caused by generative AI. These APIs can then be used to control LLMs’ access to publisher content, with the two parties then able to agree on monetization models.
  • The Containerization Project introduces standardized container technology for OpenRTB to streamline ad tech deployment, improve scalability, and reduce latency across the programmatic supply chain. It responds to pressures that have made the current foundation difficult to evolve: the proliferation of specialized bid enrichment and evaluation partners, mounting scaling challenges (especially around live events), fragmented systems, and uneven performance.

IAB Tech Lab is inviting publishers, brands, LLM platforms, and AI agent developers to provide feedback on the proposals, with a workshop for the LLM Content Ingest API scheme planned for next month. Elsewhere, the Tech Lab Containerization Project Working Group is responsible for leading the separate effort, with representatives there also requesting feedback on the initiative.

The two-pronged announcement marks the fruition of the standards body’s earlier pledge to release up to 31 new or updated specifications this year, with the efforts targeting sub-sectors of the industry, including CTV, conversion tracking, and curation.

However, it is the rising tide of generative AI and LLMs that has proven the most fundamentally concerning shift in recent years, with the number of related job losses in 2025 alone a striking concern. IAB Tech Lab CEO Anthony Katsur discussed this and the latest initiatives with Digiday ahead of this week’s IAB Tech Lab Summit, detailing his belief that every publisher should ink licensing deals with LLMs, and how brands, too, need to protect themselves amid the “contextual soup.”

When quizzed on mass layoffs across the sector, Katsur also gave recommendations on how individuals can future-proof their careers in this new paradigm of the internet economy. 

The conversation below has been lightly edited for brevity and clarity. 

Many publishers are wary of the latest era of the internet, with layoffs taking place across the industry. How will the latest initiative help?

Some publishers are starting to do content licensing deals with the LLMs, and every publisher should do a content licensing deal with their LLMs, full stop.

Every publisher should know that every LLM is crawling your content, so do a licensing deal, stop the bleed, and get paid for your content. Any LLM that is effectively ransacking publisher content without paying for it: that’s intellectual property theft, in my opinion.

The challenge, though, is we don’t believe that crawling is a feasible long-term approach. By introducing a standardized set of APIs [the LLM Content Ingest API], we can get the industry to lock arms and shut down the crawling, block them at the IP level. Then we can create an open-source, standardized API that gives structure to this content, and that structure does a number of things.

For example, you can now create a gateway that allows access to the LLMs reflecting the business terms of a contract that you sign with them. The thing is, publishers have different tiers of content: your archival, always-on content, then there’s your day-to-day, and the same [monetization or paywall model] should exist for the LLMs.

There’s a logging component to the API, so now you can audit the crawls and make sure you’re invoicing correctly and getting paid appropriately for your content. And then fourth, and I would argue maybe the most important, is the tokenization of the content so it demonstrates a source of truth. The issue with LLMs, while promising, is that they’re still nascent in their development, and they are prone to [making factual] mistakes.
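The flow Katsur describes (a contract-aware gateway, tiered content, crawl logging for invoicing, and a content token as a source of truth) can be sketched as follows. Everything here is illustrative: the function names, fields, and rates are assumptions for the example, not part of any published IAB Tech Lab specification.

```python
import hashlib
from datetime import datetime, timezone

# Hypothetical publisher-side gateway for the proposed LLM Content Ingest API.
# Business terms agreed with each LLM platform: which content tiers it may
# access, and the per-article rate used for invoicing.
CONTRACTS = {
    "llm-platform-a": {"tiers": {"archival", "daily"}, "rate_usd": 0.002},
}

ARTICLES = {
    "article-123": {"tier": "daily", "body": "Full article text..."},
}

ACCESS_LOG = []  # audit trail used to verify invoices against actual crawls


def fetch_content(llm_id: str, article_id: str) -> dict:
    """Serve content only under contract terms, log the access for billing,
    and attach a content hash as a verifiable source-of-truth token."""
    contract = CONTRACTS.get(llm_id)
    article = ARTICLES[article_id]
    if contract is None or article["tier"] not in contract["tiers"]:
        return {"error": "access denied: no contract covers this tier"}

    ACCESS_LOG.append({
        "llm": llm_id,
        "article": article_id,
        "billed_usd": contract["rate_usd"],
        "at": datetime.now(timezone.utc).isoformat(),
    })
    # The hash lets an LLM later prove its answer quotes the real article.
    token = hashlib.sha256(article["body"].encode()).hexdigest()
    return {"body": article["body"], "source_token": token}
```

In this sketch the token is a plain content hash; a production scheme would presumably involve signed tokens and per-contract authentication, details the workshop next month is meant to flesh out.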

Can you explain more about the need for the Containerization Project at this time? 

Containerization is arguably the biggest development in programmatic since OpenRTB. In today’s server-to-server architecture, OpenRTB is a [meta-protocol]: an HTTP request that makes a wide-area network call, so even if a DSP and SSP are in the same data center, it isn’t necessarily smart enough to know to stay in that data center.

The beauty of containerization is you can leverage the gRPC protocol and protocol buffers to make a containerized version of the RTB protocol. So, what we’re doing is we’re taking that 300-to-500 milliseconds and potentially shrinking it down to 50-to-100 milliseconds… and what you can do with that [saved] time is a lot.

The connection between DSP and SSP will open and close much faster, or you can keep the connection open and just keep streaming new requests through it, which works great for scaling live events [opening programmatic up to new content types such as live sports].
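Part of the latency win Katsur cites comes from swapping text-based JSON for a compact, schema-driven binary encoding. A minimal sketch of that idea, using only Python’s standard library: the `struct` module stands in for protocol buffers here, and the bid-request fields are invented for the example rather than taken from the OpenRTB spec.

```python
import json
import struct

# The same minimal bid request, encoded two ways.
bid_request = {"id": 123456, "width": 300, "height": 250, "floor_cpm": 1.25}

# Text encoding: field names and punctuation travel with every request.
json_bytes = json.dumps(bid_request).encode()

# Binary encoding with a fixed schema known to both sides, so only values
# travel: three unsigned 32-bit ints and one 32-bit float, little-endian.
packed = struct.pack(
    "<IIIf",
    bid_request["id"],
    bid_request["width"],
    bid_request["height"],
    bid_request["floor_cpm"],
)

print(len(json_bytes), len(packed))  # the binary form is several times smaller
```

Smaller payloads are only one ingredient; gRPC also keeps a single HTTP/2 connection open and multiplexes streamed requests over it, which is the “keep streaming new requests through it” behavior described above.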

A lot of people in the industry have recently lost their jobs, with AI developments often cited as the cause. What advice would you have for them?

Agentic AI, or purpose-driven AI that is not so task-driven, is the one that really comes into play in terms of media buying and optimization, and combating fraud. They’ll be able to spot patterns in the supply chain, performance patterns, or creative optimization [opportunities].

I think my advice to anyone in our ecosystem is to learn and become an expert in working with those tools. It’s early days, but I think those who embrace this can get an edge from a learning-curve perspective.
