2025-01-07 14:21:56 | Engadget

Bad actors have created deepfakes to imitate celebrity endorsements, President Biden and employers. But one of the most heinous uses is making sexually explicit deepfakes of real people. Now, the UK government is taking additional steps to deter their creation, introducing new criminal offenses for producing or sharing sexually explicit deepfakes. Currently, only sharing such deepfakes is an offense under UK law.

"With these new measures, we're sending an unequivocal message: creating or sharing these vile images is not only unacceptable but criminal," said Baroness Margaret Beryl Jones, minister for the future digital economy and online safety. "Tech companies need to step up too: platforms hosting this content will face tougher scrutiny and significant penalties."

The new offenses will be proposed in parliament under the Government's Crime and Policing Bill. A similar measure was proposed in April 2024 by the previous UK government under former Prime Minister Rishi Sunak. However, it covered only cases in which a person created the deepfake to "cause alarm, humiliation or distress to the victim," creating a loophole for perpetrators to argue their case. The law never progressed, as Sunak called a general election just one month later. Notably, the new measure covers only adults, as it is already illegal to create or share any sexually explicit images of children.

The UK government has also announced its intention to make it a criminal offense to take intimate photos or video without consent. Additional offenses would cover whether the material was created without consent and to cause alarm, distress or humiliation, or for the sexual gratification of the perpetrator or another person. A person charged with one of these offenses can face up to two years in custody.

The US has attempted to create helpful measures for individuals impacted by deepfakes.
In 2024, the Senate passed the Disrupt Explicit Forged Images and Non-Consensual Edits Act (DEFIANCE Act), which would allow victims of sexually explicit deepfakes to sue the perpetrators. It would give an individual 10 years to sue for up to $150,000, or $250,000 if the case relates to attempted sexual assault, stalking or harassment. However, its fate is unclear: it has sat in limbo in the House of Representatives since last July. This article originally appeared on Engadget at https://www.engadget.com/new-uk-law-would-criminalize-creating-sexually-explicit-deepfakes-132155132.html?src=rss

