Senators Cantwell, Blackburn, and Heinrich Introduce the Content Origin Protection and Integrity from Edited and Deepfaked Media Act (COPIED Act), Giving Artists New Tools to Protect Against Deepfakes
“Deepfakes pose an existential threat to our culture and society, making it hard to believe what we see and hear and leaving individual creators vulnerable as tech companies use our art without consent while AI-generated content leads to confusion about what is real. Requiring transparency is a meaningful step that will help protect us all – ensuring that nonconsensual, harmful content can be removed quickly and providing a clear origin when our life’s work has been used.” – Dr. Moiya McTier, Human Artistry Campaign Senior Advisor
With widespread creative community support from organizations including the Artist Rights Alliance, SAG-AFTRA, the Recording Academy, RIAA, NMPA, NSAI, and more, the bill would set new federal transparency guidelines for marking, authenticating, and detecting AI-generated content; protect journalists, actors, and artists against AI-driven theft; and hold violators accountable for abuses.

- Creates Transparency Standards: Requires the National Institute of Standards and Technology (NIST) to develop guidelines and standards for content provenance information, watermarking, and synthetic content detection. These standards will make it possible to identify whether content has been generated or manipulated by AI, as well as where AI-generated content originated. The bill also directs NIST to develop cybersecurity measures to prevent tampering with provenance information and watermarks on AI content.

- Puts Journalists, Artists, and Musicians in Control of Their Content: Requires providers of AI tools used to generate creative or journalistic content to allow owners of that content to attach provenance information to it, and prohibits its removal. The bill prohibits the unauthorized use of content with provenance information to train AI models or generate AI content. These measures give content owners (journalists, newspapers, artists, songwriters, and others) the ability to protect their work and set the terms of use for their content, including compensation.

- Gives Individuals a Right to Sue Violators: Authorizes the Federal Trade Commission (FTC) and state attorneys general to enforce the bill’s requirements. It also gives newspapers, broadcasters, artists, and other content owners the right to bring suit in court against platforms or others who use their content without permission.

- Prohibits Tampering with or Disabling AI Provenance Information: Currently, no law prohibits removing, disabling, or tampering with content provenance information. The bill prohibits anyone, including internet platforms, search engines, and social media companies, from interfering with content provenance information in these ways.
