It's an understatement to say that cell phones have evolved in the 30-plus years since they entered mainstream society. But despite everything our smartphones offer, they'll never hit the same way those early models did. Now there's an opportunity to step back in time, thanks to the new digital Nokia Design Archive, which shares sketches, photos, interviews and videos spanning the mid-1990s to 2017.

Aalto University, in Nokia's home country of Finland, is responsible for the Nokia Design Archive. Its team of researchers curated 700 entries and included a repository with another 20,000 items and 959GB of born-digital files. The never-before-seen content from Nokia, which released its first GSM hand-portable phone in 1992, doesn't disappoint. Anyone feeling extra nerdy (ahem, me) can even read through presentations with mood boards and concept designs. The ensuing nostalgia dive provides not only an ode to the classic Nokia devices (and their very '90s styling), but also an interesting look into how technology evolves.

"In the early ages of Nokia, there was a genuine wish to understand people, how they live, what makes them tick. Now we're at a similar point of societal transformation with AI. Nobody has concretised what it is yet, but we need to get people thinking about what could be," said lead researcher Professor Anna Valtonen in a release. The Archive reveals how designers made visions concrete so that they could be properly explored long before they became reality. It reminds us that we do have agency and can shape our world, by revealing the work of many people who did just that.

The Design Archive looks a bit like a word graph floating through space, with topics including Mobile Games and Gaming, which provides an overview of the iconic Snake game's creation, and Phones, Fashion and Accessories. The free platform offers four topic filters: products, aesthetics, design process and design strategy.
Plus, you can zero in on specific years for a better look at your favorite model's era. The team hopes to continue adding content as the project develops. This article originally appeared on Engadget at https://www.engadget.com/mobile/the-nokia-design-archive-has-20-plus-years-of-never-before-seen-images-sketches-and-strategy-150044971.html?src=rss
Category:
Marketing and Advertising
Axios is expanding its local newsletter presence from 30 to 34 cities. The catch? OpenAI is funding it. In its continued pretense of benefiting newsrooms, OpenAI has partnered with Axios in a three-year deal to cover Pittsburgh, Pennsylvania; Kansas City, Missouri; Boulder, Colorado; and Huntsville, Alabama. What does OpenAI get in exchange for its funding? Oh, just the ability to use Axios content to answer users' questions.

Like the nearly 20 newsrooms OpenAI has already partnered with, Axios seems to have forgotten that the scorpion did end up stinging the frog. Instead, we have this starry-eyed statement from Axios co-founder and CEO Jim VandeHei: "We launched Axios Local nearly four years ago with the bold goal of bringing local news to communities across the country. OpenAI's investment allows us to continue our expansion and aid us in bringing essential local news to deserving audiences."

Axios will be able to use OpenAI's technology to create its own AI-powered systems and products. However, VandeHei issued a memo to employees stating that the aforementioned technology won't be used for reporting stories (sure, because no newsroom has ever laid people off in favor of AI before... oh wait). The Axios announcement does, however, point out that The New York Times is currently suing both OpenAI and Microsoft for copyright infringement, so maybe there's some awareness of what it's entering into.

This article originally appeared on Engadget at https://www.engadget.com/ai/axios-partners-with-openai-forgetting-the-scorpion-stung-the-frog-144242204.html?src=rss
The rise of AI NPCs has felt like a looming threat for years, as if developers couldn't wait to dump human writers and offload NPC conversations to generative AI models. At CES 2025, NVIDIA made it plainly clear the technology was right around the corner. PUBG developer Krafton, for instance, plans to use NVIDIA's ACE (Avatar Cloud Engine) to power AI companions, which will assist and banter with you during matches. And Krafton isn't stopping there: it's also using ACE in its life simulation title InZOI to make characters smarter and generate objects.

While the use of generative AI in games seems almost inevitable, as the medium has always toyed with new methods for making enemies and NPCs seem smarter and more realistic, seeing several NVIDIA ACE demos back-to-back made me genuinely sick to my stomach. This wasn't just slightly smarter enemy AI: ACE can craft entire conversations out of thin air, simulate voices and try to give NPCs a sense of personality. It's also doing that work locally on your PC, powered by NVIDIA's RTX GPUs. But while all of that might sound cool on paper, I hated almost every second I saw the AI NPCs in action.

TiGames' ZooPunk is a prime example: It relies on NVIDIA ACE to generate dialog, a virtual voice and lip syncing for an NPC named Buck. But as you can see in the video above, Buck sounds like a stilted robot with a slight country accent. If he's supposed to have some sort of relationship with the main character, you couldn't tell from the performance.

I think my visceral aversion to NVIDIA's ACE-powered AI comes down to this: There's simply nothing compelling about it. No joy, no warmth, no humanity. Every ACE AI character feels like a developer cutting corners in the worst way possible, as if you're seeing their contempt for the audience manifested as a boring NPC. I'd much rather scroll through some on-screen text; at least then I wouldn't have to hold conversations with uncanny robot voices.
During NVIDIA's Editor's Day at CES, a gathering for media to learn more about the new RTX 5000-series GPUs and their related technology, I was also underwhelmed by a demo of PUBG's AI Ally. Its responses were akin to what you'd hear from a pre-recorded phone tree. The Ally also failed to find a gun when the player asked, which could have been a deadly mistake on a crowded map. At one point, the PUBG companion spent around 15 seconds attacking enemies while the demo player was shouting for it to get into a car. What good is an AI helper if it plays like a noob?

Poke around NVIDIA's YouTube channel and you'll find other disappointing ACE examples, like the basic speaking animations in the MMO World of Jade Dynasty (above) and Alien: Rogue Incursion. I'm sure many devs would love to skip the chore of developing decent lip syncing technology, or adopting someone else's, but for these games, leaning on AI just looks awful.

To be clear, I don't think NVIDIA's AI efforts are all pointless. I've loved seeing DLSS get steadily better over the years, and I'm intrigued to see how DLSS 4's multi-frame generation could improve 4K and ray-tracing performance in demanding games. The company's neural shader technology also seems compelling, in particular its ability to apply a realistic sheen to materials like silk, or evoke the slight translucency you'd see in skin. These aren't enormous visual leaps, but they could help deliver a better sense of immersion.

Now, I'm sure some AI boosters will say that the technology will only get better from here, and that at some undefinable point in the future it could approach the quality of human ingenuity. Maybe. But I'm personally tired of being sold on AI fantasies when we know the key to great writing and performances is giving human talent the time and resources to refine their craft.
And on a certain level, I think I'll always feel like the director Hayao Miyazaki, who described an early example of an AI-generated CG creature as "an affront to life itself." AI, like any new technology, is a tool that can be deployed in many ways. For things like graphics and gameplay (like the intelligent enemies in F.E.A.R. and The Last of Us), it makes sense. But when it comes to communicating with NPCs, writing their dialog and crafting their performances, I've grown to appreciate human effort more than anything else. Replacing that with lifeless AI doesn't seem like a step forward in any way.

This article originally appeared on Engadget at https://www.engadget.com/gaming/pc/nvidias-ai-npcs-are-a-nightmare-140313701.html?src=rss