AMD's CES 2025 presser is nearly upon us, and rumors are swirling about new graphics cards, CPUs and more. (Follow Engadget's CES 2025 liveblog for real-time updates from the show.) The company regularly uses CES to promote upcoming chips, and this year should be no different. To that end, it unveiled the AI-centric Ryzen 8000G desktop chips at CES 2024.

What to expect at the AMD CES 2025 press conference

Rumors have been flying for weeks about AMD's probable CES 2025 announcements. It's a safe bet that the company will reveal its new RX 9070 XT graphics cards. These will likely be based on the new RDNA 4 architecture and should serve as a strong mid-range GPU option.

It's also probable that AMD will finally announce its long-awaited next-gen Ryzen X3D desktop CPUs. The company typically sticks to laptop components during CES, but it'll likely break tradition to unveil these desktop chips.

It's been rumored that the company will even present the Strix Halo mobile chip. This one is expected to bring a 40 compute unit GPU onto a single die alongside the CPU. That could translate to smaller and lighter gaming laptops without sacrificing power.

Finally, some folks have been reporting that AMD will unveil a new gaming handheld CPU that could be a direct follow-up to the Ryzen Z1 Extreme, which currently powers handhelds like the Asus ROG Ally X and the Lenovo Legion Go.

AMD CES 2025 livestream

You can watch the AMD CES press conference as it happens below. The feed will start Monday, January 6 at 2PM ET. Still to come on CES press day: Samsung, Sony and NVIDIA (among others).
Unlike some of the robots we've seen at CES 2025, Mi-Mo doesn't have a face, but it still looks a little familiar thanks to its resemblance to the iconic Pixar lamp. Mi-Mo is still just a prototype, but there are some interesting ideas behind the unusual-looking robot walking around the show floor.

The creation of Japanese firm Jizai, Mi-Mo is described by the company as a general-purpose AI robot that thinks and acts on its own. It has a built-in camera and microphones, which allow it to move around and respond to voice prompts and commands, and it runs on multiple large language models that enable its voice and image recognition capabilities.

When we saw it, Mi-Mo didn't show many signs of being autonomous. It mostly shimmied around the show floor and waved at people passing by, which was honestly kind of cute. Jizai's Yuji Oshima told me that the company envisions it as being useful for some childcare tasks, like reminding children to do their homework and then watching over them to make sure they actually complete it. (Jizai's website notes the company is also interested in using robotics for elder care.)

(Embedded tweet from Karissa Bell, @karissabe, January 6, 2025: "This is Mi-Mo, a 'general purpose AI robot' that looks kind of like the Pixar lamp on top of a small table." pic.twitter.com/yTHq8Smnoz)

But Mi-Mo wasn't created only to be a caretaker robot. Oshima said it's meant to be an open platform for developers, researchers and others to find their own ways to use the robot. Jizai also intends for it to be somewhat modular, so people can customize Mi-Mo with bespoke software, additional sensors or other hardware attachments. Jizai plans to make Mi-Mo available as a developer kit later this year and has opened a waitlist where interested parties can sign up for updates.
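Jizai hasn't published the developer kit yet, so there's no real API to show. As a rough, hypothetical sketch of the pattern described above (voice input routed through a language model to drive a robot), here's what a constrained command loop could look like; the action names and the `llm` callable are invented for illustration and are not Jizai's SDK.

```python
# Hypothetical sketch of an LLM-driven command loop for a robot like Mi-Mo.
# Nothing here is Jizai's actual SDK: the action set and the `llm` callable
# (any text-in, text-out model client) are invented for illustration.
ACTIONS = {"wave", "turn_to_speaker", "remind_homework", "idle"}

def choose_action(llm, transcript: str) -> str:
    """Map a transcribed voice command onto one action from a fixed,
    safe allowlist instead of executing free-form model output."""
    prompt = (
        "You control a small lamp-shaped robot. Reply with exactly one of "
        f"{sorted(ACTIONS)} for this request: {transcript!r}"
    )
    reply = llm(prompt).strip()
    return reply if reply in ACTIONS else "idle"  # fail closed on odd output

# Example with a stand-in model that always answers "wave":
print(choose_action(lambda p: "wave", "say hi to the visitors"))  # -> wave
```

Constraining the model to an allowlist like this is a common pattern when an LLM is allowed to drive hardware: the model picks from known-safe behaviors rather than emitting arbitrary commands.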
Some of the best tech we see at CES feels pulled straight from sci-fi. Yesterday at CES 2025, I tested out Neural Lab's AirTouch technology, which lets you interact with a display using hand gestures alone, exactly what movies like Minority Report and Iron Man promised.

Of course, plenty of companies have delivered on varying forms of gesture control. Microsoft's Kinect is an early example, while the Apple Watch's double tap feature and the Vision Pro's pinch gestures are just two of many current iterations. But I was impressed with how well AirTouch delivered, and, unlike most gesture technology out there, it requires no special equipment, just a standard webcam, and works with a wide range of devices. Neural Lab's software is compatible with tablets, computers and really any device running Android 11 or later, Windows 10 or later, or Linux.

The technology was developed with accessibility in mind after one of the founders had trouble keeping in touch with their parents overseas because navigating video conferencing programs was just too difficult for the older generation. The Neural Lab representative I spoke with added that his parents preferred using an iPad to a computer/mouse/keyboard combo because touch controls are so much more intuitive. With AirTouch, they can use their TV much like they do a tablet. Beyond accessibility, there are plenty of commercial applications too, such as letting surgeons manipulate MRI scans without touching anything, or a more commonplace scenario like moving through slides in a presentation.

AirTouch tracks 3D hand movements and uses eye gaze to recognize intent, allowing it to ignore extraneous gestures. It currently supports nine gestures, and customization allows users to program up to 15.

I tried out two demonstrations: a 3D screen with an animated image of a tree frog and a monitor displaying a webpage in a browser. On the 3D screen, holding up one finger dropped a pinecone on the frog's head, two fingers dropped an acorn, a thumbs up spun the frog around on its leaf perch and a quiet coyote gesture turned it back. It took me all of 15 seconds to learn and use the four gestures, and soon I was raining down acorns on the poor frog like some ill-tempered squirrel.

It was nearly as easy (though not quite as fun) to control the screen displaying the web browser. Moving my hand around dragged the cursor across the screen, and pinching took the place of clicking. I was able to scroll around on a streaming site, pick something to play, pause it and start it back up again within seconds of learning the hand movements. There were a few instances where my movements didn't do what I'd hoped, but after a few tries I started to get the hang of the controls.

AirTouch is available now as a $30-per-month subscription for individuals (and $300 monthly for companies). Neural Lab says it takes just five minutes to install the software on any compatible device.
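Neural Lab hasn't said how AirTouch works under the hood, but the webcam-only idea is easy to prototype with open source tools. Here's a minimal sketch, assuming the freely available MediaPipe Hands model: it pulls 3D hand landmarks from a plain webcam and maps two simple gestures, finger counting and AirTouch-style pinch-to-click. The landmark indices, thresholds and printed actions are my own illustration, not Neural Lab's algorithm.

```python
# Minimal webcam-only gesture sketch using MediaPipe Hands (not AirTouch).
# pip install opencv-python mediapipe
import math
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands

def count_extended_fingers(lm) -> int:
    """Count non-thumb fingers whose tip sits above its middle (PIP) joint.
    Landmark y grows downward, so 'extended' means tip.y < pip.y."""
    tips, pips = (8, 12, 16, 20), (6, 10, 14, 18)
    return sum(lm[t].y < lm[p].y for t, p in zip(tips, pips))

def is_pinching(lm, threshold=0.05) -> bool:
    """Treat thumb tip (4) close to index tip (8) as a 'click'.
    Coordinates are normalized to [0, 1]; threshold is a guess."""
    return math.hypot(lm[4].x - lm[8].x, lm[4].y - lm[8].y) < threshold

cap = cv2.VideoCapture(0)
with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.7) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB; OpenCV captures BGR.
        result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if result.multi_hand_landmarks:
            lm = result.multi_hand_landmarks[0].landmark
            if is_pinching(lm):
                print("pinch -> click")
            else:
                print(f"{count_extended_fingers(lm)} finger(s) extended")
        cv2.imshow("hand tracking", frame)
        if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
            break
cap.release()
cv2.destroyAllWindows()
```

Even this toy version shows why the real product needs an intent signal like eye gaze: without one, every stray hand movement in front of the camera registers as a gesture.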