2025-01-07 14:21:56| Engadget

Bad actors have created deepfakes to imitate celebrity endorsements, President Biden and employers. But one of the most heinous uses is making sexually explicit deepfakes of real people. Now, the UK government is taking additional steps to deter their creation, introducing new criminal offenses for producing or sharing sexually explicit deepfakes. Under current UK law, only sharing such deepfakes is an offense.

"With these new measures, we're sending an unequivocal message: creating or sharing these vile images is not only unacceptable but criminal," said Baroness Margaret Beryl Jones, minister for the future digital economy and online safety. "Tech companies need to step up too: platforms hosting this content will face tougher scrutiny and significant penalties."

The new offenses will be proposed in parliament under the Government's Crime and Policing Bill. A similar measure was proposed in April 2024 by the previous UK government under former Prime Minister Rishi Sunak. However, it only covered cases in which a person created the deepfake to "cause alarm, humiliation or distress to the victim," creating a loophole for perpetrators to argue their case. The law never progressed, as Sunak called a general election just one month later. Notably, the new measure covers only adults, as it is already illegal to create or share any sexually explicit images of children.

The UK government has also announced its intention to make it a criminal offense to take intimate photos or video without consent. Additional offenses would apply where the material was created without consent and to cause alarm, distress, humiliation or sexual gratification for the perpetrator or another person. A person convicted of one of these offenses can face up to two years in custody.

The US has attempted to create helpful measures for individuals impacted by deepfakes.
In 2024, the Senate passed the Disrupt Explicit Forged Images and Non-Consensual Edits Act (DEFIANCE Act), which would allow victims of sexually explicit deepfakes to sue the perpetrators. It would give the individual 10 years to sue for up to $150,000, or $250,000 if the case relates to attempted sexual assault, stalking or harassment. However, its fate is unclear, having sat in limbo in the House of Representatives since last July. This article originally appeared on Engadget at https://www.engadget.com/new-uk-law-would-criminalize-creating-sexually-explicit-deepfakes-132155132.html?src=rss