2024-11-14 23:34:47 | Engadget

On Thursday, the European Union published its first draft of a Code of Practice for general purpose AI (GPAI) models. The document, which won't be finalized until May, lays out guidelines for managing risks and gives companies a blueprint for complying and avoiding hefty penalties. The EU's AI Act came into force on August 1, but it left room to nail down the specifics of GPAI regulation down the road. This draft (via TechCrunch) is the first attempt to clarify what's expected of those more advanced models, giving stakeholders time to submit feedback and refine the rules before they kick in.

GPAIs are those trained with a total computing power of over 10²⁵ FLOPs. Companies expected to fall under the EU's guidelines include OpenAI, Google, Meta, Anthropic and Mistral, but that list could grow.

The document addresses several core areas for GPAI makers: transparency, copyright compliance, risk assessment, and technical / governance risk mitigation. This 36-page draft covers a lot of ground (and will likely balloon much more before it's finalized), but several highlights stand out.

The code emphasizes transparency in AI development and requires AI companies to provide information about the web crawlers they used to train their models, a key concern for copyright holders and creators. The risk assessment section aims to prevent cyber offenses, widespread discrimination and loss of control over AI (the "it's gone rogue" sentient moment in a million bad sci-fi movies).

AI makers are expected to adopt a Safety and Security Framework (SSF) to break down their risk management policies and mitigate them in proportion to their systemic risks. The rules also cover technical areas like protecting model data, providing failsafe access controls and continually reassessing their effectiveness. Finally, the governance section strives for accountability within the companies themselves, requiring ongoing risk assessment and bringing in outside experts where needed.
Like the EU's other tech-related regulations, the AI Act carries steep penalties for companies that run afoul of it. They can be fined up to €35 million (currently $36.8 million) or up to seven percent of their global annual revenue, whichever is higher. Stakeholders are invited to submit feedback through the dedicated Futurium platform by November 28 to help refine the next draft. The rules are expected to be finalized by May 1, 2025.

This article originally appeared on Engadget at https://www.engadget.com/ai/the-eu-publishes-the-first-draft-of-regulatory-guidance-for-general-purpose-ai-models-223447394.html?src=rss
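The penalty rule described above ("whichever is higher") can be sketched as a one-line calculation. This is a minimal illustration of the stated formula only; the turnover figures below are made up for the example and are not real company data.

```python
def max_fine_eur(annual_revenue_eur: float) -> float:
    """Maximum possible AI Act fine: the greater of a flat EUR 35 million
    or 7% of global annual revenue, per the rule described above."""
    return max(35_000_000, 0.07 * annual_revenue_eur)

# Illustrative figures: a company with EUR 2 billion in revenue is exposed
# to the 7% term (~EUR 140 million), while a company with EUR 100 million
# in revenue falls back to the flat EUR 35 million floor.
print(max_fine_eur(2_000_000_000))
print(max_fine_eur(100_000_000))
```

In other words, the flat €35 million floor only binds for companies with global annual revenue below €500 million; above that, the seven percent term dominates.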

