2025-01-07 14:21:56 | Engadget

Bad actors have created deepfakes to imitate celebrity endorsements, President Biden and employers. But one of the most heinous uses is making sexually explicit deepfakes of real people. Now the UK government is taking additional steps to deter their creation, introducing new criminal offenses for producing or sharing sexually explicit deepfakes. Currently, only sharing such deepfakes is an offense under UK law.

"With these new measures, we're sending an unequivocal message: creating or sharing these vile images is not only unacceptable but criminal," said Baroness Margaret Beryl Jones, minister for the future digital economy and online safety. "Tech companies need to step up too. Platforms hosting this content will face tougher scrutiny and significant penalties."

The new offenses will be proposed in parliament under the Government's Crime and Policing Bill. A similar measure was proposed in April 2024 by the previous UK government under former Prime Minister Rishi Sunak. However, it only covered cases in which a person created the deepfake to "cause alarm, humiliation or distress to the victim," creating a loophole for perpetrators to argue their case. The law never progressed, as Sunak called a general election just one month later. Notably, the new measure covers only adults, as it is already illegal to create or share any sexually explicit images of children.

The UK government has also announced its intention to make it a criminal offense to take intimate photos or video without consent. Additional offenses would cover cases in which the material was created without consent and to cause alarm, distress or humiliation, or for the sexual gratification of the perpetrator or another person. A person convicted of one of these offenses could face up to two years in custody.

The US has attempted to create protective measures for individuals impacted by deepfakes. In 2024, the Senate passed the Disrupt Explicit Forged Images and Non-Consensual Edits Act (DEFIANCE Act), which would allow victims of sexually explicit deepfakes to sue the perpetrators. It would give an individual 10 years to sue for up to $150,000, or $250,000 if the deepfake relates to attempted sexual assault, stalking or harassment. However, its fate is unclear, having sat in limbo in the House of Representatives since last July.

This article originally appeared on Engadget at https://www.engadget.com/new-uk-law-would-criminalize-creating-sexually-explicit-deepfakes-132155132.html?src=rss

