In the session titled ‘Can AI be regulated? What public policy can’t do to address AI risks’, Renard Jenkins, President of SMPTE, and Francois Lavoir, Senior EU Policy Advisor for the EBU, debated responses to the new risks created by the rapid development of AI, and how issues such as the proliferation of AI-powered disinformation have been testing the limits of what the law can accomplish.
Legal protections for artists and content owners have lagged behind the pace of generative AI (genAI) development the world over, but regulators are tightening the noose.
“We want to keep control,” Lavoir told the audience at the AI Tech Stage session. With the AI Act in force since August 2024, the European Union has led the way in protecting citizens against the risk of data misuse and harm, he claimed.
“The regulations provide more transparency about how data is used in the training of AI systems and its output so we can use that knowledge to prevent AI providers from using it,” he said.
Some genAI tool developers offer creators the chance to opt out of being included in training models. “But you need to know who is using your content in order to opt out in the first place,” said Lavoir. “Once you have the power to say ‘yes’ or ‘no’ to whether your data is used then you have the power to negotiate remuneration.”
For the public service broadcasters represented by the EBU, financial compensation is not the only concern. Visibility into whether and how a broadcaster’s brand and content have been manipulated in final output is imperative.
Lavoir said: “We want to have a comprehensive discussion with AI providers that gives our members levers to negotiate something in return for use of their content.”
The EU is also keeping a close eye on the progress of a controversial AI safety bill passed by the California State Assembly earlier this month. While it contains data protection safeguards in common with the AI Act, it also goes further, building in potential civil liability of up to 10% of training costs, capped at $100m.
“We are following developments closely because it will have a significant impact on what we do,” said Lavoir.