New bipartisan bill would require online identification, labeling of AI-generated videos and audio
Date: 2025-04-16
WASHINGTON (AP) — Bipartisan legislation introduced in the House Thursday would require the identification and labeling of online images, videos and audio generated using artificial intelligence, the latest effort to rein in rapidly developing technologies that, if misused, could easily deceive and mislead.
So-called deepfakes created by artificial intelligence can be difficult or even impossible to distinguish from the real thing. AI has already been used to mimic President Joe Biden’s voice, exploit the likenesses of celebrities and impersonate world leaders, prompting fears it could lead to greater misinformation, sexual exploitation, consumer scams and a widespread loss of trust.
Key provisions in the legislation would require AI developers to identify content created using their products with digital watermarks or metadata, similar to how photo metadata records the location, time and settings of a picture. Online platforms like TikTok, YouTube or Facebook would then be required to label the content in a way that would notify users. Final details of the proposed rules would be crafted by the Federal Trade Commission based on input from the National Institute of Standards and Technology, a small agency within the U.S. Department of Commerce.
Violators of the proposed rule would be subject to civil lawsuits.
“We’ve seen so many examples already, whether it’s voice manipulation or a video deepfake. I think the American people deserve to know whether something is a deepfake or not,” said Rep. Anna Eshoo, a Democrat who represents part of California’s Silicon Valley. Eshoo co-sponsored the bill with Republican Rep. Neal Dunn of Florida. “To me, the whole issue of deepfakes stands out like a sore thumb. It needs to be addressed, and in my view the sooner we do it the better.”
If passed, the bill would complement voluntary commitments by tech companies as well as an executive order on AI signed by Biden last fall that directed NIST and other federal agencies to set guidelines for AI products. That order also required AI developers to submit information about their product’s risks.
Eshoo’s bill is one of a few proposals put forward to address concerns about the risks posed by AI, worries shared by members of both parties. Many say they support regulation that would protect citizens while also ensuring that a rapidly growing field can continue to develop in ways that benefit a long list of industries like health care and education.
The bill will now be considered by lawmakers, who likely won’t be able to pass any meaningful rules for AI in time for them to take effect before the 2024 election.
“The rise of innovation in the world of artificial intelligence is exciting; however, it has potential to do some major harm if left in the wrong hands,” Dunn said in a statement announcing the legislation. Requiring the identification of deepfakes, he said, is a “simple safeguard” that would benefit consumers, children and national security.
Several organizations that have advocated for greater safeguards on AI said the bill introduced Thursday represented progress. So did some AI developers, like Margaret Mitchell, chief AI ethics scientist at Hugging Face, which has created a ChatGPT rival called Bloom. Mitchell said the bill’s focus on embedding identifiers in AI content — known as watermarking — will “help the public gain control over the role of generated content in our society.”
“We are entering a world where it is becoming unclear which content is created by AI systems, and impossible to know where different AI-generated content came from,” she said.