
YouTube extends access to deepfake detection tool to celebrities and artistic agencies

YouTube is extending its AI-powered likeness detection tool to the entire entertainment industry, opening access for the first time to celebrities and talent agencies.

The expansion comes six months after YouTube began rolling out the detection tool in October, initially limited to a select group of creators with YouTube channels.

On Tuesday, April 21, the company announced that talent agencies, management firms, and the celebrities they represent can now sign up, whether or not they have a YouTube channel. The expansion was developed with the support of talent agencies and management firms, including Creative Artists Agency (CAA), United Talent Agency (UTA), WME, and Untitled Management.

YouTube had previously announced, in December 2024, that it had enlisted CAA talent to help build the AI-powered deepfake detection tool. The tool scans newly uploaded YouTube videos and uses likeness detection to flag content that may contain the face of a registered creator.

Participants must provide a government-issued ID and a short selfie video for identity verification and to create a facial likeness model. The verification process takes up to five days. Once registered, participants can authorize agents, managers, or other representatives to review flagged content without undergoing individual verification.

The platform stores likeness models and identity information for up to three years from a registered individual's last login, unless they withdraw consent or delete their account.

Currently, the tool detects only visual matches of registered creators' faces; YouTube plans to expand it to audio detection in the near future.

The AI-powered likeness detection tool is an extension of YouTube's privacy tools aimed at addressing deepfakes. In July 2024, YouTube updated its privacy policies to let users request the removal of AI-generated content that simulates their appearance or voice.

YouTube noted that the tool is still experimental and encouraged participants to report any missed detections through the platform.

In 2023, YouTube announced the development of a system for music partners to request removal of content imitating an artist’s unique singing or rapping voice.

The music industry has been cracking down on deepfakes, with Sony Music requesting the removal of over 135,000 songs created by fraudsters using AI to impersonate artists on its roster.

Dennis Kooker, President of Global Digital Business and U.S. Sales at Sony Music Entertainment, told the BBC that deepfakes directly harm legitimate artists' commercial interests and reputations.

Meanwhile, Spotify recently tested a new feature allowing artists to review and approve eligible releases before they go live, aiming to protect against deepfakes and incorrect AI attributions.