
2025-01-07 14:21:56 | Engadget

Bad actors have created deepfakes to imitate celebrity endorsements, President Biden and employers. But one of the most heinous uses is making sexually explicit deepfakes of real people. Now, the UK government is taking additional steps to deter their creation, introducing new criminal offenses for producing or sharing sexually explicit deepfakes. Currently, only sharing such deepfakes is an offense under UK law.

"With these new measures, we're sending an unequivocal message: creating or sharing these vile images is not only unacceptable but criminal," said Baroness Margaret Beryl Jones, minister for the future digital economy and online safety. "Tech companies need to step up too: platforms hosting this content will face tougher scrutiny and significant penalties."

The new offenses will be proposed in parliament under the Government's Crime and Policing Bill. A similar measure was proposed in April 2024 by the previous UK government under former Prime Minister Rishi Sunak. However, it only covered cases in which a person created the deepfake to "cause alarm, humiliation or distress to the victim," creating a loophole for perpetrators to argue their case. The law never progressed, as Sunak called a general election just one month later. Notably, the new measure covers only adults, as it is already illegal to create or share any sexually explicit images of children.

The UK government has also announced its intention to make it a criminal offense to take intimate photos or video without consent. Additional offenses would apply where the images were taken without consent and with intent to cause alarm, distress or humiliation, or for the sexual gratification of the perpetrator or another person. A person charged with one of these offenses can face up to two years in custody.

The US has attempted to create helpful measures for individuals impacted by deepfakes. In 2024, the Senate passed the Disrupt Explicit Forged Images and Non-Consensual Edits Act (DEFIANCE Act), which would allow victims of sexually explicit deepfakes to sue the perpetrators. It would give individuals 10 years to sue for up to $150,000, or $250,000 if the deepfake relates to attempted sexual assault, stalking or harassment. However, its fate is unclear, having sat in limbo in the House of Representatives since last July.

This article originally appeared on Engadget at https://www.engadget.com/new-uk-law-would-criminalize-creating-sexually-explicit-deepfakes-132155132.html?src=rss


Category: Marketing and Advertising

 
