
AI Act: high-risk AI systems need more nuance

09 September 2022

The EBU supports the objectives of the AI Act. The risk-based approach taken by the European Commission and the graduated levels of obligations seem well suited to regulate such a complex area.

Public Service Media use AI systems in their delivery of news and services to citizens. In the EBU-led digital news project ‘A European Perspective’, EBU Members are using AI-enabled recommendation and translation tools (Peach and EuroVox) to deliver real-time news across borders and in the local language. These tools are extensively supervised and reviewed by editorial teams. This is just one example of the innovative use of AI by Public Service Media.

Within the AI Act, the EBU and its Members have identified areas of concern in the category of high-risk AI systems. The broad scope of this category could threaten the legitimate use of AI systems in the media sector. Specifically, our concerns are:

  1. AI systems intended to produce complex text or to generate or manipulate image, audio or video content should not be classified as high-risk by default.
    In the case of Public Service Media, these systems are rigorously reviewed by editors, who also take responsibility for the results produced.
  2. The systematic classification as high-risk of all AI systems intended to be used for the biometric categorisation and identification of natural persons is too broad.
    This classification risks capturing benign AI systems developed or used by media organisations. Public Service Media may face an undue burden for systems that suggest metadata tags for archiving purposes, use facial/voice recognition to attribute content to celebrities or politicians, or identify monuments.
  3. Protection for the right to freedom of expression and the right to freedom of the arts and sciences should be maintained and further clarified.
    As many audiovisual productions use AI for visual effects, the application of transparency rules must be flexible enough to avoid damaging the user experience. The Regulation must take into account the wide range of possible situations, for example the fact that viewers can start watching audiovisual content midway (making it difficult to pinpoint the first interaction or exposure).

