2024-11-14 23:34:47 | Engadget

On Thursday, the European Union published its first draft of a Code of Practice for general purpose AI (GPAI) models. The document, which won't be finalized until May, lays out guidelines for managing risks and gives companies a blueprint for complying and avoiding hefty penalties. The EU's AI Act came into force on August 1, but it left room to nail down the specifics of GPAI regulations down the road. This draft (via TechCrunch) is the first attempt to clarify what's expected of those more advanced models, giving stakeholders time to submit feedback and refine the rules before they kick in.

GPAIs are those trained with a total computing power of over 10^25 FLOPs. Companies expected to fall under the EU's guidelines include OpenAI, Google, Meta, Anthropic and Mistral, but that list could grow.

The document addresses several core areas for GPAI makers: transparency, copyright compliance, risk assessment, and technical and governance risk mitigation. The 36-page draft covers a lot of ground (and will likely balloon much more before it's finalized), but several highlights stand out.

The code emphasizes transparency in AI development and requires AI companies to provide information about the web crawlers they used to train their models, a key concern for copyright holders and creators. The risk assessment section aims to prevent cyber offenses, widespread discrimination and loss of control over AI (the "it's gone rogue" sentient moment in a million bad sci-fi movies).

AI makers are expected to adopt a Safety and Security Framework (SSF) to break down their risk management policies and mitigate risks in proportion to how systemic they are. The rules also cover technical areas like protecting model data, providing failsafe access controls and continually reassessing their effectiveness. Finally, the governance section strives for accountability within the companies themselves, requiring ongoing risk assessment and bringing in outside experts where needed.

As with the EU's other tech-related regulations, companies that run afoul of the AI Act can expect steep penalties. They can be fined up to €35 million (currently $36.8 million) or up to seven percent of their global annual turnover, whichever is higher.

Stakeholders are invited to submit feedback through the dedicated Futurium platform by November 28 to help refine the next draft. The rules are expected to be finalized by May 1, 2025.

This article originally appeared on Engadget at https://www.engadget.com/ai/the-eu-publishes-the-first-draft-of-regulatory-guidance-for-general-purpose-ai-models-223447394.html?src=rss
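The two figures cited above, the 10^25 FLOP compute threshold and the "€35 million or seven percent of turnover, whichever is higher" penalty ceiling, can be sanity-checked with a few lines of arithmetic. The sketch below is illustrative only and is not part of the EU guidance: it estimates training compute with the common "6 × parameters × training tokens" rule of thumb (an assumption, not something the Code specifies), and all model and turnover figures are hypothetical.

```python
# Illustrative only: rough checks against two figures cited in the draft.
# The 6 * N * D training-compute heuristic is a common approximation and is
# NOT part of the EU's Code of Practice; all numbers below are hypothetical.

GPAI_FLOP_THRESHOLD = 1e25          # compute threshold cited for GPAI models
FINE_FLOOR_EUR = 35_000_000         # fixed fine ceiling in euros
FINE_TURNOVER_SHARE = 0.07          # seven percent of global annual turnover


def estimated_training_flops(parameters: float, training_tokens: float) -> float:
    """Approximate total training compute with the 6 * N * D rule of thumb."""
    return 6.0 * parameters * training_tokens


def exceeds_gpai_threshold(parameters: float, training_tokens: float) -> bool:
    """True if the estimated training compute crosses the 1e25 FLOP mark."""
    return estimated_training_flops(parameters, training_tokens) >= GPAI_FLOP_THRESHOLD


def max_fine_eur(global_annual_turnover_eur: float) -> float:
    """Penalty ceiling: EUR 35M or 7% of global turnover, whichever is higher."""
    return max(FINE_FLOOR_EUR, FINE_TURNOVER_SHARE * global_annual_turnover_eur)


if __name__ == "__main__":
    # Hypothetical model: 400 billion parameters trained on 12 trillion tokens.
    flops = estimated_training_flops(400e9, 12e12)
    print(f"Estimated training compute: {flops:.2e} FLOPs")
    print(f"Over 1e25 FLOP threshold: {exceeds_gpai_threshold(400e9, 12e12)}")

    # Hypothetical company with EUR 2 billion in global annual turnover.
    print(f"Maximum fine: EUR {max_fine_eur(2e9):,.0f}")
```

With these made-up numbers the estimate comes out to roughly 2.9e25 FLOPs, which would put the model over the threshold, and the fine cap would be the turnover-based figure (€140 million) rather than the €35 million floor.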


Category: Marketing and Advertising

 
