
This ring lets you whisper to your phone, because sometimes we need to use our inside voices

If there’s a problem with the world of ambient computing we’re all expected to live in, it’s that you can’t really be discreet. Most commands to your voice assistant of choice have to be spoken a little louder than you would speak to another person. That’s the societal ill VTouch, a South Korean company, has chosen to tackle with its WHSP Ring. It’s a ring with a proximity sensor and microphone that activates when you raise it to your mouth. So when you want to talk to your assistant, you can simply mutter toward your knuckle and have it understand you.

The idea is that you’ll use VTouch’s own app, which will offer a number of AI assistants to aid you. In the demo shown to me, these included an AI Art Curator and even a digital Psychiatrist, all of which can be accessed from your phone. You can also interact with your smart home, setting all of the usual parameters from the comfort of your hushed tones. The ring will last for around a day and a half on a single charge, while the charging case you can carry with it extends that to up to nine days. Any responses will, of course, be pumped through your headphones, enabling you to “chat” with your assistant. There’s even a button that, when tapped five times in quick succession, will alert your contacts of an emergency and record ambient sound.

VTouch is planning to launch the WHSP Ring on Kickstarter in the near future, but there’s no word yet on how much it will cost or when you can get your hands on one.

We're reporting live from CES 2024 in Las Vegas from January 6-12. Keep up with all the latest news from the show here.

This article originally appeared on Engadget at https://www.engadget.com/this-ring-lets-you-whisper-to-your-phone-because-sometimes-we-need-to-use-our-inside-voices-002529204.html?src=rss

Rabbit R1 is an adorable AI-powered assistant co-designed by Teenage Engineering

Yes, you probably already have a virtual assistant in your pocket on your phone. Heck, if you're reading Engadget, I'm willing to bet you've got at least one smart speaker floating around your home as well that you can ask to complete basic tasks. But a new startup called Rabbit seems to think these are less than ideal implementations of AI (if you can really call Siri and Alexa that). It envisions a world where you trade apps for conversation and, rather than a distracting device shoving icons in your face, you interact with what amounts to a walkie-talkie for an AI.

The R1 is the first device to be launched by Rabbit, and it's an objectively adorable little square in an endearingly bright shade of orange. Even if you're not sold on the necessity of a dedicated gadget for a virtual assistant, it's hard to deny the aesthetic appeal, which comes courtesy of the design gurus at Teenage Engineering. It features a small 2.88-inch touchscreen, an analog scroll wheel, two mics, a speaker and a "360 degree rotational eye," which is just a fancy name for a camera you can spin to face toward you or out through the back of the handset.

The primary way you interact with the R1, though, is by pressing and holding the "Push-to-Talk" button. This tells Rabbit OS to start listening. A heavily stylized and disembodied rabbit head bobs slowly as you ask your question or give it a task, and then it quickly gets to work. Want to book an Uber? Need a recipe to use up the leftovers in your fridge? Wondering who sampled The Isley Brothers' "That Lady"? (The answer is Beastie Boys, Basement Jaxx and Kendrick Lamar, FTR.) The R1 seems pretty capable of handling those tasks, at least in the controlled video demo.

Rabbit OS is able to tackle those tasks using what it calls the Large Action Model (LAM). This is what founder and CEO Jesse Lyu pitches as the company's major innovation. It's designed to take actions on interfaces, rather than through APIs or apps. In short, it can be trained to carry out almost any task that can be accomplished through a user interface. It's sort of like a fancy version of a macro.

As a way of demonstrating its capabilities, Lyu teaches the R1 how to generate an image using Midjourney via Discord. Once Lyu walks through the process, with Rabbit OS recording his actions, it's able to repeat the task when asked.
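Rabbit hasn't shared how the LAM works under the hood, but the record-and-replay idea it describes can be sketched roughly like this. Everything below — the class names, action types and Discord steps — is invented for illustration and isn't Rabbit's actual API.

```python
# A minimal record-and-replay sketch: capture a user's UI steps once,
# then repeat them later with new parameters (e.g. a fresh image prompt).
from dataclasses import dataclass
from typing import Callable

@dataclass
class UIAction:
    kind: str          # e.g. "click", "type"
    target: str        # a UI element identifier, e.g. "prompt_box"
    payload: str = ""  # text to type, if any

class MacroRecorder:
    def __init__(self) -> None:
        self.steps: list[UIAction] = []

    def record(self, action: UIAction) -> None:
        self.steps.append(action)

    def replay(self, perform: Callable[[UIAction], None], **params: str) -> None:
        # Substitute task-specific parameters into the recorded steps
        # before performing each one.
        for step in self.steps:
            payload = step.payload.format(**params) if step.payload else ""
            perform(UIAction(step.kind, step.target, payload))

# "Teaching" phase: one walkthrough of the workflow is captured.
macro = MacroRecorder()
macro.record(UIAction("click", "discord_channel"))
macro.record(UIAction("type", "prompt_box", "/imagine {prompt}"))
macro.record(UIAction("click", "send_button"))

# Later, the same workflow runs again with a new prompt.
macro.replay(lambda action: print(action), prompt="a rabbit in an orange raincoat")
```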

The rotating camera faces up into the body by default, acting as a sort of privacy shutter, only turning its sensor toward its target when summoned. It can do the usual tricks like identifying people or things in the real world (within reason, at least). But it's the way it interacts with the AI that is sure to pique people's interest. In the demo, Lyu points the R1 at a full refrigerator and asks it to suggest a recipe that's "low in calories" based on its contents.

Rabbit

Of course, there are still a lot of unanswered questions about the R1. How is the battery life? The company claims it's "all day," but what does that really mean? And will the average user be able to train it easily? At least we know a few things, though. We know it costs $199 and that it's available for preorder now, with an expected ship date sometime in March or April.

We're reporting live from CES 2024 in Las Vegas from January 6-12. Keep up with all the latest news from the show here.

This article originally appeared on Engadget at https://www.engadget.com/rabbit-r1-is-an-adorable-ai-powered-assistant-co-designed-by-teenage-engineering-001051537.html?src=rss

The Spacetop is a laptop that really wants to swap your screen for AR glasses

Right now there are a bunch of companies trying to figure out new and better ways to work on the go. Lenovo made a laptop with two displays and a detachable keyboard to help give owners additional screen space without too much added bulk. And there are headsets from Meta, Apple and others that offer a way to create a completely virtual workspace without the need for a tethered PC. But with the Spacetop, startup Sightful has come up with an in-between solution that uses the bottom half of a laptop but, instead of a traditional display, attaches it to a pair of AR glasses.

In theory, the glasses provide a 100-inch virtual display that can hold more windows than you could ever fit on a traditional laptop screen. And with the Spacetop being powered by Android, you get a familiar working environment too. Instead of controllers or hand gestures, there’s a typical keyboard and touchpad for writing, browsing the web or anything else you might need to do. But after trying one out, while I like the idea, I’m not so sure about Sightful’s execution.

The system is powered by a Qualcomm Snapdragon XR2 chip (the same processor Meta used in the Quest 2 from 2020), just 8GB of RAM and 256GB of storage. It feels smooth, though I wouldn't necessarily call it fast. I didn’t notice much lag when dragging windows around or typing, but I didn’t get to see how it handles a ton of open apps or anything more demanding than a web browser. Sightful has also come up with some simple shortcuts for doing things like re-centering the screen (just press the left and right Shift keys at the same time).

The issue I ran into is that, while Sightful developed the base of the laptop itself and put its logo on the side of the glasses, the bundled AR eyewear attached to the PC is actually off-the-shelf specs from Xreal (in this case the Xreal Lights). So while Xreal glasses are somewhat sharp, their narrow field of view (especially vertically) and somewhat small sweet spot left a lot to be desired when it came to actually using that 100-inch virtual display.

Photo by Sam Rutherford/Engadget

Additionally, the Spacetop’s keys felt spongy and its touchpad was lackluster too. It’s a far cry from the more precise haptic surfaces you’d get on a premium Mac or Windows PC. And when you combine all this with a starting price of $2,000, I just don’t see the appeal. For people who already have a laptop, I’d argue you’d be much better off getting a Meta Quest 3, which costs $500, sports a much newer chip and can sync with your laptop to create its own version of a virtual desktop. Plus then you’d have a headset that’s way better at playing games, watching movies and more.

Photo by Sam Rutherford/Engadget

There’s a reason why Meta, Apple and others have sunk billions of dollars into making headsets and glasses with finely tuned optics. And I’m sure someday (maybe even sometime in the next few years), off-the-shelf AR glasses will make some pretty big advancements. But between its high price, dated specs, and a big but not great-looking virtual display powered by two-year-old AR glasses, the Spacetop doesn’t quite deliver on the promise of revolutionizing the common laptop.

We're reporting live from CES 2024 in Las Vegas from January 6-12. Keep up with all the latest news from the show here.

This article originally appeared on Engadget at https://www.engadget.com/the-spacetop-is-a-laptop-that-really-wants-to-swap-your-screen-for-ar-glasses-ces-2024-233638523.html?src=rss

OrCam Hear hands-on: A surprisingly effective voice isolation platform for those with hearing loss

Imagine being at a crowded convention or noisy bar and trying to have a conversation with someone across from you. It's tough enough for people without hearing loss to focus on what the person is saying, let alone those with hearing loss. Assistive technology company OrCam has rolled into CES 2024 with a host of new products, including a set of devices and an iPhone app designed to help those with hearing loss deal with auditory overload. The platform is called OrCam Hear, and after a quick hands-on at the show in Las Vegas, I'm pleasantly surprised.

OrCam Hear consists of a pair of earbuds and a dongle that plugs into any phone, and you'll use the app to control who you want to listen to. The system listens to voices for a few seconds (via the dongle) and uses AI to create speaker profiles for each person that then allows you to "selectively isolate specific voices even in noisy environments." This targets the issue sometimes known as the "cocktail party problem" that's a challenge for hearing aids.
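OrCam hasn't published the details of how that isolation works, but the speaker-profile idea it describes might look something like the toy sketch below. The embedding function here is a crude stand-in for a trained speaker encoder, and all of the names and thresholds are invented for illustration.

```python
# Toy illustration of profile-based voice selection: enroll one embedding per
# speaker, then keep only audio frames that match a speaker the listener has
# dragged into the "ring"; everything else is muted.
import numpy as np

def embed(frame: np.ndarray) -> np.ndarray:
    # Placeholder "voice print": a normalized magnitude spectrum.
    spectrum = np.abs(np.fft.rfft(frame))
    return spectrum / (np.linalg.norm(spectrum) + 1e-9)

def similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b))

def isolate(frames, enrolled_profiles, selected_ids, threshold=0.8):
    """Pass frames that match a selected speaker profile; mute the rest."""
    output = []
    for frame in frames:
        e = embed(frame)
        keep = any(
            similarity(e, enrolled_profiles[speaker_id]) >= threshold
            for speaker_id in selected_ids
        )
        output.append(frame if keep else np.zeros_like(frame))
    return output
```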

During a demo, my editor Terrence O'Brien and I spoke to two people whose voice profiles were already set up in the app. We stood around a table with Terrence on my right and the two company spokespeople across from us, about five feet away. I put the earbuds in (after they were sanitized), and the noise around me immediately sounded a little less loud and a lot more muffled.

Photo by Terrence O'Brien / Engadget

I looked at everyone around me and, though I could see their lips moving, I couldn't hear anyone speaking. After OrCam's reps used the app to drag a floating circle into the ring surrounding me, I started to hear the person diagonally across from me talk. And though the executive next to him was also moving his mouth, I could still only hear the voice of the person selected. Only after we moved the other speaker's icon into the ring did I start to hear them.

What impressed me more, though, was how the system handled relatively new participants like Terrence. He didn't have a profile set up in the app, and I initially couldn't hear him at all. A few seconds into the demo, though, a new circle appeared with a gray icon indicating a new "Anonymous" person had been recognized. When we dragged that into the ring, I was suddenly able to hear Terrence. This was all the more impressive because Terrence was wearing a fairly thick mask, which would have made him hard to understand anyway. Yet I was able to clearly make out what he was saying.

The OrCam Hear isn't perfect, of course. I was still able to hear the speakers directly as they talked, and the audio playing through the earbuds was slightly delayed, so there was a small echo. But people who have hearing loss, for whom this product is designed, aren't likely to experience that. There was also some audio distortion when the selected speakers were talking, but not so much that it impeded my comprehension.

OrCam said that the Hear platform is "currently in a technology preview phase and is expected to be shipped later in the year." Hopefully, that gives the company time to iron out quirks and make the app available on both iOS and Android, so that the assistive tech can be truly inclusive and accessible to more people.

We're reporting live from CES 2024 in Las Vegas from January 6-12. Keep up with all the latest news from the show here.

This article originally appeared on Engadget at https://www.engadget.com/orcam-hear-hands-on-a-surprisingly-effective-voice-isolation-platform-for-those-with-hearing-loss-230243953.html?src=rss

Watch Qualcomm's CES 2024 keynote on its highly anticipated AI-powered chip

Qualcomm is ringing in the new year at CES 2024 in Las Vegas with some updates to the chips that power virtual and mixed-reality headsets. The keynote, which will detail more about what’s new for its anticipated AI-powered chip, will happen on January 10 at 5pm ET. It can be streamed on Qualcomm’s website or directly on the CES keynote page.

What to expect

There might be some information divulged about Meta and Qualcomm’s chip collaboration and how it could improve functionality on new-generation VR headsets. Qualcomm has said that the technology has been engineered into a single-chip architecture that allows it to support smaller and sleeker headsets.

The AI-integrated chip technology will not only make it easier to track a headset user’s hands, controllers and eyes during use, it will also deliver better display resolution per eye. The Snapdragon XR2+ Gen 2 chip will offer up to 4.3K by 4.3K resolution per eye, as well as upgraded raw image processing and full-color display.

Cristiano Amon, the CEO of Qualcomm, is also going to focus on generative artificial intelligence and how Snapdragon platforms will integrate AI on devices from smartphones to PC gaming systems. Amon’s keynote aligns with the company’s broader push to build AI into its hardware offerings, which we saw in its recently revealed mobile chipset for Android devices.

We're reporting live from CES 2024 in Las Vegas from January 6-12. Keep up with all the latest news from the show here.

This article originally appeared on Engadget at https://www.engadget.com/watch-qualcomms-ces-2024-keynote-on-its-highly-anticipated-ai-powered-chip-220039557.html?src=rss

The Morning After: Sony's mixed reality headset breaks cover at CES 2024

Sony’s big press event at CES 2024 didn’t reveal much for the first half, retreading the company’s entertainment successes in TV, film and music. Then, out of the blue, it revealed an as-yet-unnamed mixed reality headset, with almost anime-looking controllers.

Sony

While there are some design similarities, this isn’t a VR headset à la PSVR. This is for “spatial content creation.” The headset is powered by Qualcomm’s Snapdragon XR2+ Gen 2, announced just as CES began. This means it’s a self-contained device that doesn’t require a computer. Sony CEO Kenichiro Yoshida said the 4K OLED microdisplays on the headset would offer a “crisp viewing experience” and “intuitive interaction for 3D design.”

The headset has a pair of controllers. One is described as a “ring controller” for manipulating objects and the other as a “pointing controller” for... pointing. Sony envisions creators being able to craft 3D models in real time with them.

It all seems like a more creative interpretation of Microsoft’s HoloLens. We haven’t yet seen the headset in person, though. Hopefully, we’ll get more details from Sony’s booth, here in Las Vegas.

Oh, and the company drove its incoming Afeela EV on stage with a PlayStation controller. That’s CES. For all the latest CES news, find all our stories right here.

— Mat Smith

The biggest stories you might have missed

The MSI Claw is the first gaming handheld built on Intel’s Core Ultra chips

Watch Sony’s CES 2024 keynote in under 6 minutes

Lockly’s Visage smart lock can unlock doors by scanning your face

Formlabs shows up at CES 2024 with more realistic 3D-printed teeth

You can get these reports delivered daily direct to your inbox. Subscribe right here!

Lots of refreshed laptops landed at CES 2024

Intel’s updated chips need to go somewhere.

Engadget

Yesterday, Intel revealed its entire 14th-generation CPU family, which includes powerful HX series chips, like the 24-core i9-14900HX, as well as new mainstream desktop CPUs. That means, of course, lots of new laptops. We’ve got impressions and reports on new ASUS, Alienware, Acer, Lenovo and Razer computers, but I’d point you toward the weirdest PC we’ve seen so far: the Lenovo ThinkBook Plus Gen 5. As pictured here, the bottom is a Windows laptop deck, and the display is a 14-inch Android tablet. You can use the tablet as a standalone Android device, a wireless monitor for the laptop base or a Wacom-like drawing display.

Continue reading.

Samsung’s Ballie robot ball showed up at CES 2024 with a built-in projector

And a yellow new look.

Samsung

Samsung showed off a robot named Ballie, which has a projector built in. Interestingly, though, according to a report from The Washington Post, Samsung said the robotic sphere will actually be available for sale within the year.

We first saw an early iteration of Ballie in 2020, touted as a household assistant and potential fitness assistant, with such sophisticated skills as opening smart curtains and turning on the TV. But four years later, it’s a little different. It’s now “bowling-ball-size” and has a spatial LiDAR sensor and a 1080p projector. The latter has two lenses and allows the robot to display movies, video calls and “greetings” on its surrounding surfaces.

Continue reading.


Mercedes-AMG and will.i.am try to turn cars into DJs at CES 2024

Oh no. will.he.is.

Mercedes-AMG and will.i.am are collaborating on a new sound system for cars, called MBUX SOUND DRIVE (all caps, apparently). It pulls data from the car’s sensors and uses it to control a specially deconstructed music file. Start the car and you hear a music track’s bed looping in the background; accelerate to a low speed and it’ll add some bass reverb to the song. On top of that, moving the steering wheel gets you extra effects or the chorus loop kicking in. It’s only when you open the car up on a clear highway that the main music and lyrics start. Daniel Cooper tested it out, here in Las Vegas.
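Mercedes hasn’t shared exactly how that mapping works, but the idea of driving telemetry gating musical stems can be sketched roughly like this. The stem names and thresholds below are invented for illustration.

```python
# A minimal sketch of the "deconstructed music file" idea: the track is split
# into stems, and driving telemetry decides which stems play and how loudly.
from dataclasses import dataclass

@dataclass
class Telemetry:
    speed_kmh: float
    steering_angle_deg: float

def stem_mix(t: Telemetry) -> dict[str, float]:
    """Return a gain (0.0 - 1.0) per stem for the current driving state."""
    mix = {"bed_loop": 1.0, "bass": 0.0, "chorus_fx": 0.0, "lead_vocals": 0.0}
    if t.speed_kmh > 10:                 # pulling away: bring in the low end
        mix["bass"] = min(1.0, t.speed_kmh / 60)
    if abs(t.steering_angle_deg) > 15:   # steering input: layer in chorus effects
        mix["chorus_fx"] = 0.8
    if t.speed_kmh > 90:                 # open highway: full track with vocals
        mix["lead_vocals"] = 1.0
    return mix

print(stem_mix(Telemetry(speed_kmh=120, steering_angle_deg=3)))
```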

Continue reading.

This audio mask left our reviews editor speechless

It makes public phone calls private.

Skyted’s Silent Mask launched its Kickstarter campaign today at CES 2024. It’s a noise-reducing wearable that would allow you to speak freely about confidential information anywhere, without worrying about people around you hearing. It’s already broken its $8,800 goal many times over. While the noise reduction tech has its limits, the idea is it’ll offer a degree of confidentiality to voice calls in busy or quiet public spaces.

Continue reading.

This article originally appeared on Engadget at https://www.engadget.com/the-morning-after-sonys-mixed-reality-headset-breaks-cover-at-ces-2024-181019926.html?src=rss

The ASUS Zenbook Duo is a stunning dual-screen laptop with seemingly no compromises

Between the Zenbook 17 Fold, Project Precog and previous Zenbook Duo machines, it feels like ASUS has been working towards building a true dual-screen laptop for ages. And now at CES 2024, that time has come with the arrival of the simply named Zenbook Duo.

Similar to Lenovo’s Yoga Book 9i, the Zenbook Duo features two separate screens and a detachable Bluetooth keyboard that can be stashed inside the system for traveling. The difference is that ASUS’ OLED panels look even better, as they are slightly larger at 14 inches, while also offering a 3K resolution, 120Hz refresh rate and stylus support. Plus, with a starting price of $1,500, it costs $500 less than the Yoga Book and not that much more than your average high-end ultraportable.

That alone would be enough to make it interesting, but what really elevates the Zenbook Duo is the polish ASUS has put into making it look and function just like a regular clamshell. Measuring just 0.78 inches thick and weighing 3.64 pounds (including its removable keyboard), it’s only a touch larger and heavier than a typical 14-inch notebook. And it doesn’t give up anything in terms of performance, with an Intel Core Ultra 9 185H CPU, up to 32GB of RAM and a 1TB SSD inside. The same thing goes for connectivity, where you get two Thunderbolt 4 ports, a USB-A 3.2 port, a 3.5mm audio jack and even a full-size HDMI 2.1 socket.

Photo by Sam Rutherford/Engadget

But the best thing about the Zenbook Duo is its fit and finish. Even though what I got to mess around with was a pre-production model, it felt incredibly sturdy. When packed up with its keyboard sandwiched between the two screens, there are no gaps or wasted space. And despite its super shallow dimensions, the detachable keyboard features full backlighting and more than enough travel to make sure your fingers don’t get sore while typing.

ASUS also added something I wish Lenovo had included on the Yoga Book 9i: a built-in kickstand. Mounting one on the bottom of the laptop sidesteps the need for a separate folding cover, which streamlines the process every time you have to pack up and go. The small downside is that the orientation of the kickstand promotes a stacked setup with one display on top of the other, though I’m not that bothered since that’s my preference anyway. Technically, you can use the Zenbook Duo in a vertical, side-by-side mode, but because there’s no way to adjust the kickstand in this position, it’s not quite as flexible.

Photo by Sam Rutherford/Engadget

As you’d expect from a system like this, you can use it as a standard clamshell with the Zenbook Duo featuring hidden magnets that allow the keyboard to snap neatly in place. However, when you have more room to work with, you can place the keyboard in front of the system and instantly double your screen space. There’s also a responsive virtual keyboard and touchpad you can use in a pinch along with handy widgets for the news, weather and monitoring system performance. And when you need to top up the removable keyboard’s battery, there are some thoughtfully placed pogo pins that allow it to trickle charge while it’s nestled inside the system.

Surprisingly, when it comes to battery life, ASUS managed to fit a 75 Whr power pack in the Zenbook Duo, which is slightly larger than what’s available in the new Zephyrus G14 (73 Whr), and that’s a gaming machine with way fewer moving parts. And ASUS claims the Zenbook Duo adheres to MIL-STD 810H testing standards, so it should be pretty durable too.

Photo by Sam Rutherford/Engadget

Last year, the Yoga Book 9i felt like a revelation; Lenovo combined all the right components needed to create an appealing dual-screen laptop. But ASUS has optimized that template even further by adding ports, moving to bigger and better screens, including a built-in kickstand and making it even more portable. And then there’s that surprisingly affordable $1,500 starting price. I’ve been waiting for years to switch over to a dual-screen laptop, and the Zenbook Duo might actually convince me to finally make that jump.

The ASUS Zenbook Duo is expected to go on sale sometime later in Q1 2024.

We're reporting live from CES 2024 in Las Vegas from January 6-12. Keep up with all the latest news from the show here.

This article originally appeared on Engadget at https://www.engadget.com/the-asus-zenbook-duo-is-a-stunning-dual-screen-laptop-with-seemingly-no-compromises-180017370.html?src=rss

LG TVs will soon be Matter-compatible Google Home hubs

Google is expanding its smart home integration at CES 2024. The company said Tuesday that, in the future, LG TVs and some Google TV (and other Android TV) products will work as Google Home hubs. Considering Google’s support for the Matter smart home standard, the move could make it easier for customers to set up and control their smart homes without buying a Nest device.

“In the future, LG TVs and select Google TV and other Android TV OS devices will act as hubs for Google Home,” Google Android VP Sameer Samat wrote in today’s announcement blog post. “So if you have a Nest Hub, Nest Mini or compatible TV, it’s easy to add Matter devices to your home network and locally control them with the Google Home app.”

The announcement closely aligns with a comment teased by Google’s Eric Kay during LG’s CES 2024 press conference. “LG TVs will act as hubs for Google Home where you can easily set up and control any Matter device,” Kay said. “You’ll be able to see, control, and manage both LG and Google Home devices right from the TV or the ThinQ app. These features will roll out later this year.”

The eventual move will give smart home customers more options to set up and control a Google Home setup — including for Matter devices. Currently, you need a Google Nest device to do that.

We're reporting live from CES 2024 in Las Vegas from January 6-12. Keep up with all the latest news from the show here.

This article originally appeared on Engadget at https://www.engadget.com/lg-tvs-will-soon-be-matter-compatible-google-home-hubs-180015856.html?src=rss

Google apps are coming to select Ford, Nissan and Lincoln vehicles in 2024

Google has teamed up with more automakers to offer vehicles that come pre-installed with Google apps, the company revealed today at CES 2024 in Las Vegas. Nissan, Ford and Lincoln are rolling out select models with built-in Google Maps, Assistant and Play Store — among other applications — this year, while Porsche is expected to follow suit in 2025. They're the latest additions to the growing list of auto brands embracing tighter Google integration, which includes Honda, Volvo, Polestar, Chevrolet, GMC, Cadillac and Renault.

The company has also announced new features for cars with built-in Google apps. One of those features, rolling out today, is the ability to send trips users have planned on their Android or iOS Google Maps app to their cars. That way, they'll no longer need to re-enter multi-stop trips on their car's Google Maps after they've already plotted them meticulously on their phones. In addition, Chrome is making its way to select Polestar and Volvo cars today as part of a beta release, allowing users to browse websites and even access their bookmarks while they're parked. The browser will be available for more cars later this year.

Google is also adding PBS KIDS and Crunchyroll to its list of apps for vehicles to give users and their kids access to more entertainment content. And to give drivers a quick way to keep an eye on changing weather conditions, Google's lineup of built-in apps for cars now includes The Weather Channel's. It will provide users with hourly forecasts, as well as alerts and a "Trip View" radar on their dashboard, so they no longer have to check their phones. Finally, Google has announced that it's expanding its digital car keys' availability to select Volvo cars soon, allowing owners to unlock, lock and even start their cars with their Android phones.

We're reporting live from CES 2024 in Las Vegas from January 6-12. Keep up with all the latest news from the show here.

This article originally appeared on Engadget at https://www.engadget.com/google-apps-are-coming-to-select-ford-nissan-and-lincoln-vehicles-in-2024-180007640.html?src=rss

Android Auto will soon let Google Maps see EV battery levels and tell you where to charge your vehicle

Much of the auto news out of CES 2024 has focused on EVs, and Google is no exception. The company has announced that Android Auto will share real-time battery updates with Google Maps. The update should initially roll out to the Ford Mustang Mach-E and F-150 Lightning in the coming months before expanding to other Android Auto-compatible EVs in the future.

With this new feature, Google Maps will suggest charging stations along your route without you having to pull over and search. It should also update you on the car's battery level once you arrive at a destination (something your EV should already do) and provide an estimate of how long it will take to charge. The latter is more novel and could help when you're in a rush.

Google also announced the expansion of Google built-in to additional car brands, such as Nissan and Ford, after first launching in Honda's 2023 Accord sedan's high-end Touring trim. The Chrome browser is now rolling out in beta to select models from Volvo and Polestar (of which Volvo is a part-owner) and should be more widely available later this year. Plus, Google built-in now has The Weather Channel app to track the forecast as you travel.

We're reporting live from CES 2024 in Las Vegas from January 6-12. Keep up with all the latest news from the show here.

This article originally appeared on Engadget at https://www.engadget.com/android-auto-will-soon-let-google-maps-see-ev-battery-levels-and-tell-you-where-to-charge-your-vehicle-180006872.html?src=rss