Senate bill aims to protect artists' work from AI training

A new bipartisan Senate bill seeks to protect artists by making it illegal to train AI models on their work without permission, reinforcing creators' rights over their content.

Jul 15, 2024 - 10:59

A newly introduced bipartisan bill in the U.S. Senate aims to make it illegal to train AI models on artists' work without their explicit permission. Named the Content Origin Protection and Integrity from Edited and Deepfaked Media Act (COPIED Act), the legislation would give companies a two-year period to develop tools that let creators attach provenance and watermark data to their content. Training AI models on material protected in this way would then violate the law.

The bill is designed to empower artists, journalists, musicians, and other creators to safeguard their work against unauthorized AI usage and to pursue legal action against violators. The National Institute of Standards and Technology (NIST) would be responsible for establishing the necessary standards for content protection.

Spearheaded by Senate Commerce Committee Chair Maria Cantwell (D-WA), fellow committee member Marsha Blackburn (R-TN), and AI Working Group member Martin Heinrich (D-NM), the COPIED Act has received support from various industry organizations, including Hollywood’s SAG-AFTRA, the Recording Industry Association of America (RIAA), and the News/Media Alliance.

This is not the first legislative effort aimed at regulating AI's impact on creative industries. In June, Senator Ted Cruz (R-TX) introduced the Take It Down Act, which would require social media platforms to remove deepfake pornography. A growing number of federal and state lawmakers are also proposing AI-related bills or advocating for stricter regulations.

In October, President Biden signed an executive order directing the development of AI safety and security standards and requiring AI developers to disclose key details, such as safety test results, before launching their technologies.

What sets the COPIED Act apart is that it would be the first federal legislation to specifically restrict how AI developers can train their models. Compliance may require training on smaller datasets made up of unwatermarked and public-domain material. While this could affect the quality of AI-generated content, it represents a significant step toward protecting copyright and preventing the unauthorized use of creative works.

As the debate around AI ethics and copyright continues, the COPIED Act could set a significant precedent for future legislation, emphasizing the need for a balanced approach that fosters innovation while protecting creators' rights. The bill reflects growing concern over AI's rapid advancement and its implications for intellectual property, underscoring the need for ongoing discussion and policies that safeguard the creative community.