Lumus got a major boost in brand recognition when one of its waveguides was selected for use in the Meta Ray-Ban Display glasses. But that already feels like old tech, because at CES 2026 the company brought some of its latest components to the show, and based on what I saw, they seem poised to seriously elevate the optical quality of the next wave of high-end smartglasses.

When the Meta Ray-Ban Display glasses came out, they wowed users as one of a handful of smartglasses (and still one of few) to feature a full-color in-lens display with at least a 20-degree field of view. But going by the specs on Lumus' newest waveguides, we're set for a major upgrade in future capabilities.

If you look closely, you can see where light from the waveguide propagates into one of the smartglasses' lenses. (Sam Rutherford for Engadget)

The first model I tried featured Lumus' optimized Z-30 waveguides, which not only offer a much wider 30-degree FOV, but are also 30 percent lighter and 40 percent thinner than previous generations. On top of that, Lumus says they are more power efficient, with the waveguides capable of hitting more than 8,000 nits per watt. This is a big deal because smartglasses are currently quite limited by the size of the batteries they can use, especially if you want to make them small and light enough to wear all day.

When I tried them on, I was dazzled by both the brightness and sharpness of the Z-30s, despite them being limited to 720 x 720 resolution. Not only did the increase in FOV feel much larger than 10 degrees, colors were very rich, including white, which is often one of the most difficult shades to reproduce properly.

I had to take a photo of one of Lumus' non-functioning smartglasses with the company's 70-degree FOV waveguide, because two of the three working units had already broken, and the last one I used was being held together by tape. (Sam Rutherford for Engadget)

However, even after seeing how good that first model was, I was totally unprepared for Lumus' 70-degree FOV waveguides. I was able to view some videos and a handful of test images, and I was completely blown away by how much area they covered. It was basically the entire center portion of the lens, with only small unused areas around the corners. And while I did notice some pincushion distortion along the sides of the waveguide's display, a Lumus representative told me that it will be possible to correct for that in final retail units. But make no mistake, these waveguides undoubtedly produced some of the sharpest, brightest and best-looking optics I've seen from any smartglasses, retail model or prototype. It almost made me question how much wider a FOV these types of gadgets really need, though to be clear, I don't think we've hit the point of diminishing returns yet.

This is one of Lumus' thinnest waveguides, measuring in at just 0.8mm. (Sam Rutherford for Engadget)

Other advantages of Lumus' geometric reflective waveguides include better overall efficiency than their refractive counterparts, along with the ability to optically bond the displays to smartglasses' lenses. That means, unlike a lot of rivals, Lumus' waveguides can be paired with transition lenses instead of needing to resort to clip-on sunglass attachments when you go outside. Lumus also claims its design simplifies the manufacturing process, resulting in thinner waveguides (as thin as 0.8mm) and generally higher yields. Unfortunately, taking high-quality photos of content from smartglasses displays is incredibly challenging, especially when you're using extremely delicate prototypes, so you'll just have to take my word for it for now.
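To put that efficiency claim in perspective, here's a rough back-of-the-envelope calculation based on Lumus' stated figure of more than 8,000 nits per watt. The power budgets below are hypothetical examples for illustration, not published specs for any product:

```python
# Illustrative arithmetic only: Lumus claims its waveguides can hit
# more than 8,000 nits per watt. The power budgets below are
# hypothetical assumptions, not specs from Lumus or any device maker.
NITS_PER_WATT = 8_000

def brightness_nits(power_watts: float) -> float:
    """Estimated display brightness for a given optical power budget."""
    return NITS_PER_WATT * power_watts

for milliwatts in (50, 125, 250):
    print(f"{milliwatts} mW -> {brightness_nits(milliwatts / 1000):,.0f} nits")
```

Even a modest 125 mW budget would, by this figure, yield roughly 1,000 nits, which helps explain why efficiency matters so much for all-day wearables with tiny batteries.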
But with Lumus in the process of ramping up production of its new waveguides with help from partners including Quanta and SCHOTT, it feels like there will be a ton of smartglasses makers clamoring for these components as momentum continues to build around the industry's pick for the next big thing.

This article originally appeared on Engadget at https://www.engadget.com/wearables/lumus-brought-a-massively-wider-fov-to-smartglasses-at-ces-2026-233245949.html?src=rss
When Meta first announced its display-enabled smart glasses last year, it teased a handwriting feature that allows users to send messages by tracing letters with their hands. Now, the company is starting to roll it out, with people enrolled in its early access program getting it first. I got a chance to try the feature at CES, and it made me want to start wearing my Meta Ray-Ban Display glasses more often.

When I reviewed the glasses last year, I wrote about how one of my favorite things about the neural band is that it reduced my reliance on voice commands. I've always felt a bit self-conscious speaking to my glasses in public. Until now, replying to messages on the display glasses has generally required voice dictation or generic preset replies. But handwriting means that you can finally send custom messages and replies somewhat discreetly.

Sitting at a table wearing the Meta Ray-Ban Display glasses and neural band, I was able to quickly write a message just by drawing the letters on the table in front of me. It wasn't perfect (it misread a capital "I" as an "H"), but it was surprisingly intuitive. I was able to quickly trace out a short sentence and even correct a typo (a swipe from left to right will let you add a space, while a swipe from right to left deletes the last character).

Alongside handwriting, Meta also announced a new teleprompter feature. Copy and paste a bunch of text (it supports up to 16,000 characters, roughly a half-hour's worth of speech) and you can beam it onto the glasses' display. If you've ever used a teleprompter, Meta's version works a bit differently in that the text doesn't automatically scroll while you speak. Instead, the text is displayed on individual cards you manually swipe through. The company told me it originally tested a scrolling version, but in early tests people said they preferred to be in control of when the words appeared in front of them.
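The editing gestures described above amount to a tiny text-buffer state machine. This sketch models that behavior; the event names and buffer model are my own invention for illustration, not anything from Meta's software:

```python
# Toy model of the editing gestures described in the article:
# a left-to-right swipe inserts a space, a right-to-left swipe
# deletes the last character, and anything else is treated as a
# recognized handwritten character. Event names are hypothetical.
def apply_event(buffer: str, event: str) -> str:
    if event == "swipe_right":   # left-to-right: add a space
        return buffer + " "
    if event == "swipe_left":    # right-to-left: delete last character
        return buffer[:-1]
    return buffer + event        # a recognized character

text = ""
for ev in ["H", "i", "swipe_right", "t", "h", "e", "r", "x", "swipe_left", "e"]:
    text = apply_event(text, ev)
print(text)  # "Hi there" -- the stray "x" was swiped away
```

The nice property of this scheme is that corrections never require looking at a keyboard: every edit is a single directional gesture.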
Teleprompter is starting to roll out now, though Meta says it could take some time before everyone is able to access it. The updates are among the first major additions Meta has made to its display glasses since launching them late last year, and a sign that, as with its other smart glasses, the company plans to keep them fresh with new features. Elsewhere at CES, the company announced some interesting new plans for the device's neural band and said it was delaying a planned international rollout of the device.

This article originally appeared on Engadget at https://www.engadget.com/wearables/handwriting-is-my-new-favorite-way-to-text-with-the-meta-ray-ban-display-glasses-213744708.html?src=rss
While wave upon wave of smartglasses and face-based wearables crash on the shores of CES, traditional glasses really haven't changed much over the hundreds of years we've been using them. The last innovation, arguably, was progressive multifocals, which blended near and farsighted lenses, and that was back in the 1950s. So it makes sense that autofocusing glasses maker IXI thinks it's time to modernize glasses. After recently announcing a 22-gram (0.7-ounce) prototype frame, the startup is here in Las Vegas to show off working prototypes of its lenses, a key component of its autofocus glasses, which could be a game-changer.

IXI's glasses are designed for age-related farsightedness, a condition that affects many, if not most, people over 45. They combine cameraless eye tracking with liquid crystal lenses that automatically activate when the glasses detect the user's focus shifting. This means that, instead of having two separate prescriptions, as in multifocal or bifocal lenses, IXI's lenses automatically switch between them. Crucially, like most modern smartglasses, the frames themselves are lightweight and look like just another pair of normal glasses.

(Mat Smith for Engadget)

With a row of prototype frames and lenses laid out in front of him, CEO and co-founder Niko Eiden explained the technology, which can be separated into two parts. First, the IXI glasses track the movement of your eyes using a system of LEDs and photodiodes dotted around the edges of where the lenses sit. The LEDs bounce invisible infrared light off the eyes and the photodiodes measure the reflection, detecting the subtle movements of each eye and how both eyes converge when focusing on something close. Using infrared with just a "handful of analog channels" takes far less power than the millions of pixels and 60-times-per-second processing required by camera-based systems. IXI's system not only tracks eye movements, but also blinking and gaze direction, while consuming only 4 milliwatts of power.
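The core trigger described above is vergence: when both eyes rotate inward past some angle, the wearer is probably focusing on something near, so the near-vision correction can switch on. This toy sketch illustrates that logic; the angles, threshold and function names are invented for illustration, since IXI's actual signal processing isn't public:

```python
# Toy model of vergence-based focus detection. When the combined
# inward rotation of both eyes exceeds a threshold, the liquid
# crystal layer would activate the near-vision prescription.
# The threshold and angle values are hypothetical illustrations,
# not IXI's actual parameters.
NEAR_FOCUS_THRESHOLD_DEG = 4.0  # hypothetical vergence threshold

def vergence_angle(left_eye_deg: float, right_eye_deg: float) -> float:
    """Total inward rotation of the two eyes (positive = converging)."""
    return left_eye_deg + right_eye_deg

def near_correction_active(left_eye_deg: float, right_eye_deg: float) -> bool:
    """True when the eyes converge enough to suggest near focus."""
    return vergence_angle(left_eye_deg, right_eye_deg) > NEAR_FOCUS_THRESHOLD_DEG

print(near_correction_active(1.0, 1.0))  # distant gaze: eyes nearly parallel
print(near_correction_active(3.5, 3.2))  # reading distance: strong convergence
```

A binocular cue like this is one plausible reason the system needs only a handful of analog channels rather than full camera frames: the decision reduces to comparing a couple of angle estimates against a threshold.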
(Mat Smith for Engadget)

Most of the technology, including memory, sensors, driving electronics and the eye tracker, sits in the front frame of the glasses and the part of the arms closest to the hinges. The IXI prototype apparently uses batteries similar in size to those found in AirPods, which gives some sense of the size and weight of the tech involved. The charging port is integrated into the hinge of the glasses' left arm. Naturally, this does mean they can't be worn while charging. IXI says that a single charge should cover a whole day's usage.

The prototype frames I saw this week appeared to be roughly the same weight as my traditional chunky specs. And while these are early iterations, IXI's first frames wouldn't look out of place in a lineup of spectacle options. The team has also refined the nose pieces and glasses arms to accommodate different face shapes. Apparently, when testing expanded from Finland to the UK, British faces were ...different. A little harsh when talking to me, a Brit.

Eiden pulled out some prototype lenses, made up of layers of liquid crystal and a transparent ITO (indium tin oxide) conductive layer. This combination is still incredibly thin, and it was amazing to watch the layers switch almost instantly into a prescription lens. It seemed almost magical. As they're so thin, they can be easily integrated into lenses with existing prescriptions, and they can also provide cylindrical correction for astigmatism.

Autofocus lenses could eliminate the need for multiple pairs of glasses, such as bifocals and progressives. Even if the glasses were to run out of power, they'd still function as a pair of traditional specs with your standard prescription, just lacking the near-focus boost. IXI's sensor sensitivity can also offer insight into other health conditions: it can detect dry eyes, estimate attentiveness and even, by tracking where you're looking, monitor posture and neck movement.
According to Eiden, blink rate changes with focus, daydreaming and anxiety, and all of that generates data that can be shown in the companion app.

(Mat Smith for Engadget)

Hypothetically, the product could even adapt prescriptions dynamically, going beyond the simple vision correction of Gen 1. For example, it could offer stronger corrections as your eyes get fatigued through the day.

IXI appears to be putting the pieces in place to make these glasses a reality. It still needs to obtain the necessary medical certifications to sell its glasses and get its production pipeline in order. It has already partnered with Swiss lens-maker Optiswiss for manufacturing. Eiden says the final product will be positioned as a high-end luxury glasses option, sold through existing opticians. The company hopes to finally launch its first pair sometime next year.

This article originally appeared on Engadget at https://www.engadget.com/wearables/ixis-autofocusing-lenses-multifocal-glasses-ces-2026-212608427.html?src=rss