US President Donald Trump has signed a bill criminalizing nonconsensual artificial intelligence-generated deepfake porn, which also requires websites to take down any illicit images within 48 hours.
Trump signed the bill into law on May 19. Known as the TAKE IT DOWN Act, its name is an acronym for Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks.
The bill, backed by first lady Melania Trump, makes it a federal crime to publish, or threaten to publish, nonconsensual intimate images, including deepfakes, of adults or minors with the intent to harm or harass them. Penalties range from fines to prison.
Websites, online services, and apps must remove illegal content within 48 hours and establish a takedown process.
Trump said in remarks given at the White House Rose Garden and posted to the social media platform Truth Social that the bill also covers “forgeries generated by artificial intelligence,” commonly known as deepfakes.
Melania Trump had directly lobbied lawmakers to support the bill, and said in a statement that the law is a “national victory.”
“Artificial Intelligence and social media are the digital candy of the next generation — sweet, addictive, and engineered to impact the cognitive development of our children,” she said.
“But unlike sugar, these new technologies can be weaponized, shape beliefs, and sadly, affect emotions and even be deadly,” she added.
Senators Ted Cruz and Amy Klobuchar introduced the bill in June 2024, and it passed both houses of Congress in April of this year.
US the latest to ban explicit deepfakes
There has been a growing number of cases where deepfakes are used for harmful purposes. One of the more high-profile incidents saw deepfake-generated illicit images of pop star Taylor Swift quickly spread across X in January 2024.
X briefly banned searches for Taylor Swift’s name in response, while lawmakers pushed for legislation criminalizing the production of deepfake images.
Associated: AI scammers are now impersonating US government bigwigs, says FBI
Other countries, such as the UK, have already made sharing deepfake pornography illegal, in the UK’s case as part of its Online Safety Act in 2023.
A 2023 report from security startup Security Hero found that the vast majority of deepfakes posted online are pornographic, and that 99% of the individuals targeted by such content are women.
Magazine: Deepfake AI ‘gang’ drains $11M OKX account, Zipmex zapped by SEC: Asia Express