
The Humane AI Pin is the solution to none of technology's problems

I’ve found myself at a loss for words when trying to explain the Humane AI Pin to my friends. The best description so far is that it’s a combination of a wearable Siri button with a camera and built-in projector that beams onto your palm. But each time I start explaining that, I get so caught up in pointing out its problems that I never really get to fully detail what the AI Pin can do. Or is meant to do, anyway.

Yet, words are crucial to the Humane AI experience. Your primary mode of interacting with the pin is through voice, accompanied by touch and gestures. Without speaking, your options are severely limited. The company describes the device as your “second brain,” but the combination of holding out my hand to see the projected screen, waving it around to navigate the interface and tapping my chest and waiting for an answer all just made me look really stupid. When I remember that I was actually eager to spend $700 of my own money to get a Humane AI Pin, not to mention shell out the required $24 a month for the AI and the company’s 4G service riding on T-Mobile’s network, I feel even sillier.

What is the Humane AI Pin?

In the company’s own words, the Humane AI Pin is the “first wearable device and software platform built to harness the full power of artificial intelligence.” If that doesn’t clear it up, well, I can’t blame you.

There are basically two parts to the device: the Pin and its magnetic attachment. The Pin is the main piece, which houses a touch-sensitive panel on its face, with a projector, camera, mic and speakers lining its top edge. It’s about the same size as an Apple Watch Ultra 2, both measuring about 44mm (1.73 inches) across. The Humane wearable is slightly squatter, though, with its 47.5mm (1.87 inches) height compared to the Watch Ultra’s 49mm (1.92 inches). It’s also half the weight of Apple’s smartwatch, at 34.2 grams (1.2 ounces).

The top of the AI Pin is slightly thicker than the bottom, since it has to contain extra sensors and indicator lights, but it’s still about the same depth as the Watch Ultra 2. Snap on a magnetic attachment, and you add about 8mm (0.31 inches). There are a few accessories available, with the most useful being the included battery booster. You’ll get two battery boosters in the “complete system” when you buy the Humane AI Pin, as well as a charging cradle and case. The booster helps clip the AI Pin to your clothes while adding some extra hours of life to the device (in theory, anyway). It also brings an extra 20 grams (0.7 ounces) with it, but even including that the AI Pin is still 10 grams (0.35 ounces) lighter than the Watch Ultra 2.

That weight (or lack thereof) is important, since anything too heavy would drag down on your clothes, which would not only be uncomfortable but also block the Pin’s projector from functioning properly. If you're wearing it with a thinner fabric, by the way, you’ll have to use the latch accessory instead of the booster, which is a $40 plastic tile that provides no additional power. You can also get the stainless steel clip that Humane sells for $50 to stick it onto heavier materials or belts and backpacks. Whichever accessory you choose, though, you’ll place it on the underside of your garment and stick the Pin on the outside to connect the pieces.

Hayato Huseman for Engadget

How the AI Pin works

But you might not want to place the AI Pin on a bag, since you need to tap it to ask a question or pull up the projected screen. Every interaction with the device begins with touching it; there is no wake word, so having it out of reach sucks.

Tap and hold on the touchpad, ask a question, then let go and wait a few seconds for the AI to answer. You can hold out your palm to read what it said, bringing your hand closer to and further from your chest to toggle through elements. To jump through individual cards and buttons, you’ll have to tilt your palm up or down, which can get in the way of seeing what’s on display. But more on that in a bit.

There are some built-in gestures offering shortcuts to functions like taking a picture or video or controlling music playback. Double tapping the Pin with two fingers will snap a shot, while double-tapping and holding at the end will trigger a 15-second video. Swiping up or down adjusts the device or Bluetooth headphone volume while the assistant is talking or when music is playing, too.

Cherlynn Low for Engadget

Each person who orders the Humane AI Pin will have to set up an account and go through onboarding on the website before the company will ship out their unit. Part of this process includes signing into your Google or Apple accounts to port over contacts, as well as watching a video that walks you through those gestures I described. Your Pin will arrive already linked to your account with its eSIM and phone number sorted. This likely simplifies things so users won’t have to fiddle with tedious steps like installing a SIM card or signing into their profiles. It felt a bit strange, but it’s a good thing because, as I’ll explain in a bit, trying to enter a password on the AI Pin is a real pain.

Talking to the Humane AI Pin

The easiest way to interact with the AI Pin is by talking to it. It’s supposed to feel natural, like you’re talking to a friend or assistant, and you shouldn’t have to feel forced when asking it for help. Unfortunately, that just wasn’t the case in my testing.

When the AI Pin did understand me and answer correctly, it usually took a few seconds to reply, in which time I could have already gotten the same results on my phone. For a few things, like adding items to my shopping list or converting Canadian dollars to USD, it performed adequately. But “adequate” seems to be the best case scenario.

Sometimes the answers were too long or irrelevant. When I asked “Should I watch Dream Scenario,” it said “Dream Scenario is a 2023 comedy/fantasy film featuring Nicolas Cage, with positive ratings on IMDb, Rotten Tomatoes and Metacritic. It’s available for streaming on platforms like YouTube, Hulu and Amazon Prime Video. If you enjoy comedy and fantasy genres, it may be worth watching.”

Setting aside the fact that the “answer” to my query came after a lot of preamble I found unnecessary, I also just didn’t find the recommendation satisfying. It wasn’t giving me a straight answer, which is understandable, but ultimately none of what it said felt different from scanning the top results of a Google search. I would have gleaned more info had I looked the film up on my phone, since I’d be able to see the actual Rotten Tomatoes and Metacritic scores.

To be fair, the AI Pin was smart enough to understand follow-ups like “How about The Witch” without needing me to repeat my original question. But it’s 2024; we’re way past assistants that need so much hand-holding.

We’re also past the days of needing to word our requests in specific ways for AI to understand us. Though Humane has said you can speak to the pin “naturally,” there are some instances when that just didn’t work. First, it occasionally misheard me, even in my quiet living room. When I asked “Would I like YouTuber Danny Gonzalez,” it thought I said “would I like YouTube do I need Gonzalez” and responded “It’s unclear if you would like Dulce Gonzalez as the content of their videos and channels is not specified.”

When I repeated myself by carefully saying “I meant Danny Gonzalez,” the AI Pin spouted back facts about the YouTuber’s life and work, but did not answer my original question.

That’s not as bad as the fact that when I tried to get the Pin to describe what was in front of me, it simply would not. Humane has a Vision feature in beta that’s meant to let the AI Pin use its camera to see and analyze things in view, but when I tried to get it to look at my messy kitchen island, nothing happened. I’d ask “What’s in front of me” or “What am I holding out in front of you” or “Describe what’s in front of me,” which is how I’d phrase this request naturally. I tried so many variations of this, including “What am I looking at” and “Is there an octopus in front of me,” to no avail. I even took a photo and asked “can you describe what’s in that picture.”

Every time, I was told “Your AI Pin is not sure what you’re referring to” or “This question is not related to AI Pin” or, in the case where I first took a picture, “Your AI Pin is unable to analyze images or describe them.” I was confused why this wasn’t working even after I double checked that I had opted in and enabled the feature, and finally realized after checking the reviewers' guide that I had to use prompts that started with the word “Look.”

Look, maybe everyone else would have instinctively used that phrasing. But if you’re like me and didn’t, you’ll probably give up and never use this feature again. Even after I learned how to properly phrase my Vision requests, they were still clunky as hell. It was never as easy as “Look for my socks” but required two-part sentences like “Look at my room and tell me if there are boots in it” or “Look at this thing and tell me how to use it.”

When I worded things just right, results were fairly impressive. It confirmed there was a “Lysol can on the top shelf of the shelving unit” and a “purple octopus on top of the brown cabinet.” I held out a cheek highlighter and asked what to do with it. The AI Pin accurately told me “The Carry On 2 cream by BYBI Beauty can be used to add a natural glow to skin,” among other things, although it never explicitly told me to apply it to my face. I asked it where an object I was holding came from, and it just said “The image is of a hand holding a bag of mini eggs. The bag is yellow with a purple label that says ‘mini eggs.’” Again, it didn't answer my actual question.

Humane’s AI, which is powered by a mix of OpenAI’s recent versions of GPT and other sources including its own models, just doesn’t feel fully baked. It’s like a robot pretending to be sentient — capable of indicating it sort of knows what I’m asking, but incapable of delivering a direct answer.

My issues with the AI Pin’s language model and features don’t end there. Sometimes it just refuses to do what I ask of it, like restart or shut down. Other times it does something entirely unexpected. When I said “Send a text message to Julian Chokkattu,” who’s a friend and fellow AI Pin reviewer over at Wired, I thought I’d be asked what I wanted to tell him. Instead, the device simply said OK and told me it sent the words “Hey Julian, just checking in. How's your day going?” to Chokkattu. I've never said anything like that to him in our years of friendship, but I guess technically the AI Pin did do what I asked.

Hayato Huseman for Engadget

Using the Humane AI Pin’s projector display

If only voice interactions were the worst thing about the Humane AI Pin, but the list of problems only starts there. I was most intrigued by the company’s “pioneering Laser Ink display” that projects green rays onto your palm, as well as the gestures that enabled interaction with “onscreen” elements. But my initial wonder quickly gave way to frustration and a dull ache in my shoulder. It might be tiring to hold up your phone to scroll through Instagram, but at least you can set that down on a table and continue browsing. With the AI Pin, if your arm is not up, you’re not seeing anything.

Then there’s the fact that it’s a pretty small canvas. I would see about seven lines of text each time, with about one to three words on each row depending on the length. This meant I had to hold my hand up even longer so I could wait for notifications to finish scrolling through. I also have a smaller palm than some other reviewers I saw while testing the AI Pin. Julian over at Wired has a larger hand and I was downright jealous when I saw he was able to fit the entire projection onto his palm, whereas the contents of my display would spill over onto my fingers, making things hard to read.

It’s not just those of us afflicted with tiny palms that will find the AI Pin tricky to see. Step outside and you’ll have a hard time reading the faint projection. Even on a cloudy, rainy day in New York City, I could barely make out the words on my hands.

When you can read what’s on the screen, interacting with it might make you want to rip your eyes out. Like I said, you’ll have to move your palm closer to and further from your chest to select the right cards to enter your passcode. It’s a bit like dialing a rotary phone, with cards for individual digits from 0 to 9. Go further away to reach the higher numbers and the backspace button, and come back in for the smaller ones.

This gesture is smart in theory but it’s very sensitive. There’s a very small range of usable space since there is only so far your hand can go, so the distance between each digit is fairly small. One wrong move and you’ll accidentally select something you didn’t want and have to go all the way out to delete it. To top it all off, moving my arm around while doing that causes the Pin to flop about, meaning the screen shakes on my palm, too. On average, unlocking my Pin, which involves entering a four-digit passcode, took me about five seconds.

On its own, this doesn’t sound so bad, but bear in mind that you’ll have to re-enter this each time you disconnect the Pin from the booster, latch or clip. It’s currently springtime in New York, which means I’m putting on and taking off my jacket over and over again. Every time I go inside or out, I move the Pin to a different layer and have to look like a confused long-sighted tourist reading my palm at various distances. It’s not fun.

Of course, you can turn off the setting that requires password entry each time you remove the Pin, but that’s simply not great for security.

Though Humane says “privacy and transparency are paramount with AI Pin,” by its very nature the device isn’t suitable for performing confidential tasks unless you’re alone. You don’t want to dictate a sensitive message to your accountant or partner in public, nor might you want to speak your Wi-Fi password out loud.

The latter is one of two input methods for setting up an internet connection, by the way. If you choose not to spell your Wi-Fi key out loud, you can go to the Humane website and type in your network name (you spell it out yourself; there’s no list of available networks to pick from) and password to generate a QR code for the Pin to scan. Having to verbally relay alphanumeric characters to the Pin is not ideal, and though the QR code technically works, it just involves too much effort. It’s like giving someone a spork when they asked for a knife and fork: good enough to get by, but not a perfect replacement.

Cherlynn Low for Engadget

The Humane AI Pin’s speaker

Since communicating through speech is the easiest means of using the Pin, you’ll need to be verbal and have hearing. If you choose not to raise your hand to read the AI Pin’s responses, you’ll have to listen for it. The good news is, the onboard speaker is usually loud enough for most environments, and I only struggled to hear it on NYC streets with heavy traffic passing by. I never attempted to talk to it on the subway, however, nor did I obnoxiously play music from the device while I was outside.

In my office and gym, though, I did get the AI Pin to play some songs. The music sounded fine — I didn’t get thumping bass or particularly crisp vocals, but I could hear instruments and crooners easily. Compared to my iPhone 15 Pro Max, it’s a bit tinny, as expected, but not drastically worse.

The problem is there are, once again, some caveats. The most important of these is that at the moment, you can only use Tidal’s paid streaming service with the Pin. You’ll get 90 days free with your purchase, and then have to pay $11 a month (on top of the $24 you already give to Humane) to continue streaming tunes from your Pin. Humane hasn’t said whether other music services will eventually be supported, so unless you’re already on Tidal, listening to music from the Pin might just not be worth the price. Annoyingly, Tidal also doesn’t have the extensive library that competing providers do, so I couldn’t even play Beyoncé’s latest album or much of Taylor Swift’s discography (although remixes of her songs were available).

Though Humane has described its “personic speaker” as being able to create a “bubble of sound,” that “bubble” certainly has a permeable membrane. People around you will definitely hear what you’re playing, so unless you’re trying to start a dance party, it might be too disruptive to use the AI Pin for music without pairing Bluetooth headphones. You’ll also probably get better sound quality from Bose, Beats or AirPods anyway.

The Humane AI Pin camera experience

I’ll admit it — a large part of why I was excited for the AI Pin is its onboard camera. My love for taking photos is well-documented, and with the Pin, snapping a shot is supposed to be as easy as double-tapping its face with two fingers. I was even ready to put up with subpar pictures from its 13-megapixel sensor for the ability to quickly capture a scene without having to first whip out my phone.

Sadly, the Humane AI Pin was simply too slow and feverish to deliver on that premise. I frequently ran into times when, after taking a bunch of photos and holding my palm up to see how each snap turned out, the device would get uncomfortably warm. At least twice in my testing, the Pin just shouted “Your AI Pin is too warm and needs to cool down” before shutting down.

A sample image from the Humane AI Pin.
Cherlynn Low for Engadget

Even when it’s running normally, using the AI Pin’s camera is slow. I’d double tap it and then have to stand still for at least three seconds before it would take the shot. I appreciate that there’s audio and visual feedback through the flashing green lights and the sound of a shutter clicking when the camera is going, so both you and people around know you’re recording. But it’s also a reminder of how long I need to wait — the “shutter” sound will need to go off thrice before the image is saved.

I took photos and videos in various situations under different lighting conditions, from a birthday dinner in a dimly lit restaurant to a beautiful park on a cloudy day. I recorded some workout footage in my building’s gym with large windows, and in general anything taken with adequate light looked good enough to post. The videos might make viewers a little motion sick, since the camera was clipped to my sports bra and moved around with me, but that’s tolerable.

In dark environments, though, forget about it. Even my Nokia E7 from 2012 delivered clearer pictures, most likely because I could hold it steady while framing a shot. The photos of my friends at dinner were so grainy, one person even seemed translucent. To my knowledge, that buddy is not a ghost, either.

A sample image from the Humane AI Pin.
Cherlynn Low for Engadget

To its credit, Humane’s camera has a generous 120-degree field of view, meaning you’ll capture just about anything in front of you. When you’re not sure if you’ve gotten your subject in the picture, you can hold up your palm after taking the shot, and the projector will beam a monochromatic preview so you can verify. It’s not really for you to admire your skilled composition or level of detail, and more just to see that you did indeed manage to get the receipt in view before moving on.

Cosmos OS on the Humane AI Pin

When it comes time to retrieve those pictures off the AI Pin, you’ll just need to navigate to humane.center in any browser and sign in. There, you’ll find your photos and videos under “Captures,” your notes, recently played music and calls, as well as every interaction you’ve had with the assistant. That last one made recalling every weird exchange with the AI Pin for this review very easy.

You’ll have to make sure the AI Pin is connected to Wi-Fi and power, and that it’s at least 50 percent charged, before full-resolution photos and videos will upload to the dashboard. Until then, you can still scroll through previews in a gallery, though you can’t download or share them.

The web portal is fairly rudimentary, with large square tiles serving as cards for sections like “Captures,” “Notes” and “My Data.” Going through them just shows you things you’ve saved or asked the Pin to remember, like a friend’s favorite color or their birthday. Importantly, there isn’t an area for you to view your text messages, so if you wanted to type out a reply from your laptop instead of dictating to the Pin, sorry, you can’t. The only way to view messages is by putting on the Pin, pulling up the screen and navigating the onboard menus to find them.

Hayato Huseman for Engadget

That brings me to what you see on the AI Pin’s visual interface. If you’ve raised your palm right after asking it something, you’ll see your answer in text form. But if you had brought up your hand after unlocking or tapping the device, you’ll see its barebones home screen. This contains three main elements — a clock widget in the middle, the word “Nearby” in a bubble at the top and notifications at the bottom. Tilting your palm scrolls through these, and you can pinch your index finger and thumb together to select things.

Push your hand further back and you’ll bring up a menu with five circles that will lead you to messages, phone, settings, camera and media player. You’ll need to tilt your palm to scroll through these, but because they’re laid out in a ring, it’s not as straightforward as simply aiming up or down. Trying to get the right target here was one of the greatest challenges I encountered while testing the AI Pin. I was rarely able to land on the right option on my first attempt. That, along with the fact that you have to put on the Pin (and unlock it), made it so difficult to see messages that I eventually just gave up looking at texts I received.

The Humane AI Pin overheating, in use and battery life

One reason I sometimes took off the AI Pin is that it would frequently get too warm and need to “cool down.” Once I removed it, I would not feel the urge to put it back on. I did wear it a lot in the first few days I had it, typically from 7:45AM when I headed out to the gym till evening, depending on what I was up to. Usually at about 3PM, after taking a lot of pictures and video, I would be told my AI Pin’s battery was running low, and I’d need to swap out the battery booster. This didn’t seem to work sometimes, with the Pin dying before it could get enough power through the accessory. At first it appeared the device simply wouldn’t detect the booster, but I later learned it’s just slow and can take up to five minutes to recognize a newly attached booster.

When I wore the AI Pin to my friend (and fellow reviewer) Michael Fisher’s birthday party just hours after unboxing it, I had it clipped to my tank top just hovering above my heart. Because it was so close to the edge of my shirt, I would accidentally brush past it a few times when reaching for a drink or resting my chin on my palm à la The Thinker. Normally, I wouldn’t have noticed the Pin, but as it was running so hot, I felt burned every time my skin came into contact with its chrome edges. The touchpad also grew warm with use, and the battery booster resting against my chest got noticeably toasty (though it never actually left a mark).

Hayato Huseman for Engadget

Part of the reason the AI Pin ran so hot is likely that there’s not a lot of room for the heat generated by its octa-core Snapdragon processor to dissipate. I had also been using it near constantly to show my companions the pictures I had taken, and Humane has said its laser projector is “designed for brief interactions (up to six to nine minutes), not prolonged usage” and that it had “intentionally set conservative thermal limits for this first release that may cause it to need to cool down.” The company added that it not only plans to “improve uninterrupted run time in our next software release,” but also that it’s “working to improve overall thermal performance in the next software release.”

There are other things I need Humane to address via software updates ASAP. The fact that its AI sometimes decides not to do what I ask, like telling me “Your AI Pin is already running smoothly, no need to restart” when I asked it to restart, is not only surprising but limiting. There are no hardware buttons to turn the Pin on or off, and the only other way to trigger a restart is to pull up the dreaded screen, painstakingly go to the menu, hopefully land on Settings and find the Power option. By which point, if the Pin hasn’t shut down, my arm will have.

A lot of my interactions with the AI Pin also felt like problems I encountered with earlier versions of Siri, Alexa and the Google Assistant. The overly wordy answers, for example, or the pronounced two or three-second delay before a response, are all reminiscent of the early 2010s. When I asked the AI Pin to “remember that I parked my car right here,” it just saved a note saying “Your car is parked right here,” with no GPS information and no way to navigate back. So I guess I parked my car on a sticky note.

To be clear, that’s not something that Humane ever said the AI Pin can do, but it feels like such an easy thing to offer, especially since the device does have onboard GPS. Google’s made entire lines of bags and Levi’s jackets that serve the very purpose of dropping pins to revisit places later. If your product is meant to be smart and revolutionary, it should at least be able to do what its competitors already can, not to mention offer features they don’t.


One singular thing that the AI Pin actually manages to do competently is act as an interpreter. After you ask it to “translate to [x language],” you’ll have to hold down two fingers while you talk, let go and it will read out what you said in the relevant tongue. I tried talking to myself in English and Mandarin, and was frankly impressed with not only the accuracy of the translation and general vocal expressiveness, but also at how fast responses came through. You don’t even need to specify the language the speaker is using. As long as you’ve set the target language, the person talking in Mandarin will be translated to English and the words said in English will be read out in Mandarin.

It’s worth considering that using the AI Pin is a nightmare for anyone who gets self-conscious. I’m pretty thick-skinned, but even I tried to hide the fact that I had a strange gadget with a camera pinned to my person. Luckily, I didn’t get any obvious stares or confrontations, but I heard from my fellow reviewers that they did. And as much as I like the idea of a second brain I can wear and offload little notes and reminders to, nothing that the AI Pin does well is actually executed better than a smartphone.

Wrap-up

Not only is the Humane AI Pin slow, finicky and barely even smart, using it made me look pretty dumb. In a few days of testing, I went from being excited to show it off to my friends to not having any reason to wear it.

Humane’s vision was ambitious, and the laser projector initially felt like a marvel. At first glance, it looked and felt like a refined product. But it just seems like at every turn, the company had to come up with solutions to problems it created. No screen or keyboard to enter your Wi-Fi password? No worries, use your phone or laptop to generate a QR code. Want to play music? Here you go, a 90-day subscription to Tidal, but you can only play music on that service.

The company promises to make software updates that could improve some issues, and the few tweaks my unit received during this review did make some things (like music playback) work better. The problem is that as it stands, the AI Pin doesn’t do enough to justify its $700 and $24-a-month price, and I simply cannot recommend anyone spend this much money for the one or two things it does adequately. 

Maybe in time, the AI Pin will be worth revisiting, but it’s hard to imagine why anyone would need a screenless AI wearable when so many devices exist today that you can use to talk to an assistant. From speakers and phones to smartwatches and cars, the world is full of useful AI access points that allow you to ditch a screen. Humane says it’s committed to a “future where AI seamlessly integrates into every aspect of our lives and enhances our daily experiences.” 

After testing the company’s AI Pin, that future feels pretty far away.

This article originally appeared on Engadget at https://www.engadget.com/the-humane-ai-pin-is-the-solution-to-none-of-technologys-problems-120002469.html?src=rss

Fly Me To The Moon trailer plays right into Apollo 11 conspiracy theorists' hands

Fly Me To The Moon is an upcoming comedy-drama from Columbia Pictures and Apple that goes behind the scenes of NASA trying to improve its image while preparing for the Apollo 11 mission to the Moon. A trailer makes it seem like a lighthearted, fun time at the movies, though conspiracy theorists may have a field day with one of the key plot points.

Scarlett Johansson plays Kelly Jones, a PR expert who NASA brings in to improve public perception ahead of the launch. Along with butting heads with launch director Cole Davis (Channing Tatum) and turning the crew into global celebrities, Kelly is handed a particularly difficult task: to secretly create a fake version of the Moon landing, just in case the mission goes sideways. 

The rest of the cast, which includes Woody Harrelson, looks solid too. For one thing, the delightful Jim Rash (Community) plays the very much not Stanley Kubrick director of the phony Moon landing. The movie's director is Greg Berlanti, who was behind Love, Simon and a string of DC Comics TV shows.

Fly Me To The Moon will arrive in theaters on July 14, almost 55 years to the day after Apollo 11 launched.

This article originally appeared on Engadget at https://www.engadget.com/fly-me-to-the-moon-trailer-plays-right-into-apollo-11-conspiracy-theorists-hands-174547851.html?src=rss

Hatsune Miku in Crypt of the Necrodancer feels like the perfect crossover

Crypt of the Necrodancer just won’t die — and that’s a good thing. The nearly decade-old roguelike rhythm game received new content on Thursday, bringing virtual pop star Hatsune Miku into the fold as a playable character.

Developer Brace Yourself Games says Hatsune Miku is one of the more challenging characters in the game. She can move in all eight directions and takes out foes by boogying her way through groups of enemies. The developer’s press release explains, “She doesn’t have a shovel like most characters, so she must use her dance-like dash attack to break through walls instead.” Hell yeah.

She has a “Sing!” ability — entirely new to the game — that charms nearby enemies. When one of these charmed foes strikes Miku, she heals instead of losing her health. Brace Yourself Games says it even reskinned all of the game’s armors as official Miku outfits, so you can put on new threads as you shimmy and groove your way through legions of ghosts and skeletons.

Photo by Mat Smith / Engadget

If you aren’t familiar, Hatsune Miku is one of the world’s biggest virtual pop stars. She’s a perpetual 16-year-old because she’s the personification of a “Vocaloid,” software that synthesizes pre-recorded vocals to simulate human singing. The avatar has sold out 14,000-seat arenas, collaborated with Pharrell Williams and opened for Lady Gaga. She wasn’t the first digital celebrity, but she may be the most famous.

The Hatsune Miku DLC for Crypt of the Necrodancer is available now for $1.99 on the PlayStation Store and PC via Steam. The content arrives a little later on Switch — on April 13. Check out her moves in the trailer below.

This article originally appeared on Engadget at https://www.engadget.com/hatsune-miku-in-crypt-of-the-necrodancer-feels-like-the-perfect-crossover-203138973.html?src=rss

Prepare for more red pill memes: a fifth Matrix movie is happening

There’s another Matrix movie in the works. Warner Bros. just greenlit a fifth installment of the saga, as reported by Deadline. However, neither Lana Wachowski nor Lilly Wachowski will be handling directing duties. That honor falls to Drew Goddard, who adapted The Martian into a screenplay and directed the criminally underrated Cabin in the Woods. He's also writing the script.

Goddard cut his teeth writing episodes of Buffy the Vampire Slayer, Angel and Lost, among others — you could say he knows his way around genre content. Lana Wachowski will be on board as an executive producer, so there will be some input from one of the franchise’s original creators.

There’s no word as to what the film will be about, but Warner Bros. says that Goddard came to the company with a “new idea that we all believe would be an incredible way to continue the Matrix world.” Goddard added that the original films inspire him on a daily basis and that he is “beyond grateful for the chance to tell stories” in that world.

Warner Bros. is also being cagey as to which, if any, cast members would be returning. The original trilogy featured Keanu Reeves, Carrie-Anne Moss, Laurence Fishburne, Hugo Weaving and Jada Pinkett Smith. Most of these actors returned for 2021’s The Matrix Resurrections, with one story-based exception.

Speaking of The Matrix Resurrections, it received mixed reviews from both critics and audiences. We loved the film, going as far as to call it brilliant, but admitted that it wasn’t for everyone. That’s par for the course with this franchise. Every single Matrix movie beyond the first one is divisive. We’ll have to wait and see what Goddard brings to the table.

He’s also writing a film adaptation based on another novel by The Martian scribe Andy Weir. Project Hail Mary will be directed by Phil Lord and Christopher Miller and will star Ryan Gosling as an astronaut trying to save the planet from a star-eating microbe.

This article originally appeared on Engadget at https://www.engadget.com/prepare-for-more-red-pill-memes-a-fifth-matrix-movie-is-happening-184811691.html?src=rss

The Pirate Queen interview: How Singer Studios and Lucy Liu brought forgotten history to life

I had a favorite version of Mulan growing up (Anita Yuen in the 1998 Taiwanese TV series). I obsessed over Chinese period TV series like Legend of the Condor Heroes, My Fair Princess and The Book and the Sword. I consider myself fairly well-versed in Chinese historical figures, especially those represented in ‘90s and 2000s entertainment in Asia. So when I found out that a UK-based studio had made a VR game called The Pirate Queen based on a forgotten female leader who was prolific in the South China Sea, I was shocked. How had I never heard of her? How had the Asian film and TV industry never covered her?

I got to play a bit of the game this week, which was released on the Meta Quest store and Steam on March 7th. The titular character Cheng Shih is voiced by actor Lucy Liu, who also executive produced this version of the game with UK-based Singer Studios’ CEO and founder Eloise Singer. Liu and Singer sat with me for an interview discussing The Pirate Queen, Cheng Shih, VR’s strengths and the importance of cultural and historical accuracy in games and films.

Cheng Shih, which translates to “Madam Cheng” or “Mrs Cheng,” was born Shi Yang. After she married the pirate Cheng Yi (usually romanized as Zheng Yi), she became known as Cheng Yi Sao, which translates to “wife of Cheng Yi.” Together they led the Guangdong Pirate Confederation in the 1800s. Upon her husband’s death in 1807, she took over the reins and went on to become what South China Morning Post described as “history’s greatest pirate.”

Singer Studios

How did Singer Studios learn about Cheng Shih and decide to build a game (and upcoming franchise including a film, podcast and graphic novels) around her? According to Singer, it was through word of mouth. “It was a friend of mine who first told me the story,” Singer said. “She said, ‘Did you know that the most famous pirate in history was a woman?’”

Cheng Shih had been loosely referenced in various films and games before this, like the character Mistress Ching in the 2007 film Pirates of the Caribbean: At World’s End and Jing Lang in Assassin’s Creed IV: Black Flag. As Singer pointed out, Cheng Shih had also appeared in a recent episode of Doctor Who.

Singer said that her team started developing the project as a film at the end of 2018. But the pandemic disrupted their plans, causing Singer to adapt it into a game. A short version of The Pirate Queen later debuted at Raindance Film Festival, and shortly after, Meta came onboard and provided funding to complete development of the game. Liu was then approached when the full version was ready and about to make its appearance at Tribeca Film Festival 2023.

“The rest is history,” Liu said, “But not forgotten history.” She said Cheng Shih was never really recognized for being the most powerful pirate. “It seems so crazy that in the 19th century, this woman who started as a courtesan would then rise to power and then have this fleet of pirates that she commanded,” Liu added. She went on to talk about how Cheng Shih was ahead of the time and also represented “a bit of an underdog story.” For the full 15-minute interview, you can watch the video in this article or listen to this week’s episode of The Engadget Podcast and learn more about Liu and Singer’s thoughts on VR and technology over the last 20 years.

Capturing the historical and cultural details of Cheng Shih’s life was paramount to Liu and Singer. They said the team had to create women’s hands from scratch to be represented from the player’s perspective in VR, and a dialect coach was hired to help Liu nail the pronunciation for the Cantonese words that Cheng Shih speaks in the game. Though I’m not completely certain whether Cheng Shih spoke Mandarin or Cantonese, the latter seems like the more accurate choice, given that it’s the lingua franca of the Guangdong region.

Singer Studios

All that added to the immersiveness of The Pirate Queen, in which players find themselves in an atmospheric maritime environment. The Meta Quest 3’s controllers served as my hands in the game, and I rowed boats, climbed rope ladders and picked up items with relative ease. Some of the mechanics, especially the use of “teleportation” to move around, were a little clunky, but after about five minutes I got used to how things worked. You point the left controller, push the joystick when you’ve chosen a spot, and the scene changes around you. This probably minimizes the possibility of nausea, since you’re never watching your surroundings slide past while standing still. It’s also pretty typical of VR games, so those who have experience playing in headsets will likely be familiar with the movement.

You can still walk around and explore, of course. I scrutinized the corners of rooms, inspected the insides of cabinets and more, while hunting for keys that would unlock boxes containing clues. A lot of this is pretty standard for a puzzle or room escape game, which is what I used to play the most in my teens. But I was particularly taken by sequences like rowing a boat across the sea and climbing up a rope ladder, both of which caused me to break a mild sweat. Inside Cheng Shih’s cabin, I lit a joss stick and placed it in an incense holder — an action I repeated every week at my grandfather’s altar when I was growing up. It felt so realistic that I tried to wave the joss stick to put out the flame and could almost smell the smoke.

It’s these types of activities that make VR games great vehicles for education and empathy. “We didn’t want to have these combat elements that traditional VR games do have,” Singer said, adding that it was one of the challenges in creating The Pirate Queen.

“It’s nice to see and to learn and be part of that, as opposed to ‘Let’s turn to page 48,’” Liu said. “That’s not as exciting as doing something and being actively part of something.” When you play as a historical character in a game, and one that’s as immersive as a VR game, “you’re living that person’s life or that moment in time,” Liu added.

While The Pirate Queen is currently only available on Quest devices, Singer said there are plans to bring it to “as many headsets as we possibly can.” Singer Studios also said it is “extending The Pirate Queen franchise beyond VR into a graphic novel, film and television series.”

This article originally appeared on Engadget at https://www.engadget.com/the-pirate-queen-interview-how-singer-studios-and-lucy-liu-brought-forgotten-history-to-life-160007029.html?src=rss

Engadget Podcast: The NY Auto Show and a chat with Lucy Liu

This week, it’s all about cars and Lucy Liu in VR. Devindra chats with Senior Writer Sam Rutherford about his visit to the New York International Auto Show, where he saw the Polestar 4, a unique new EV without a rear window. Also, Cherlynn pops in to chat with Lucy Liu about her new VR game, The Pirate Queen. We also explore the issues around Florida’s bill banning young kids from social media sites, and Sam tells us why he likes Netflix’s Avatar: The Last Airbender adaptation.


Listen below or subscribe on your podcast app of choice. If you've got suggestions or topics you'd like covered on the show, be sure to email us or drop a note in the comments! And be sure to check out our other podcast, Engadget News!

Topics

  • Sam Rutherford on what’s new in EVs and car tech from the New York Auto Show – 0:57

  • Cherlynn Low interviews Lucy Liu about her new VR game The Pirate Queen – 34:39

  • Florida Governor signs bill banning young children from social media – 54:55

  • Intel confirms Copilot will eventually run locally – 58:33

  • There’s finally a version of Chrome that runs well on ARM-based Windows machines – 1:02:43

  • Canadian researchers have created a camera that takes 156.3 trillion frames per second – 1:05:06

  • Working on – 1:07:08

  • Pop culture picks – 1:12:44

Subscribe!

Credits 

Hosts: Devindra Hardawar and Sam Rutherford
Guests: Cherlynn Low and Lucy Liu
Producer: Ben Ellman
Music: Dale North and Terrence O'Brien

This article originally appeared on Engadget at https://www.engadget.com/engadget-podcast-ny-auto-show-lucy-liu-123047921.html?src=rss


Getty flags another British royal family photo for being digitally altered

Getty has flagged another photo captured by the Princess of Wales as digitally altered. The image, released back in 2022, features Queen Elizabeth II surrounded by her grandchildren and great-grandchildren. "Getty Images is undertaking a review of handout images and in accordance with its editorial policy is placing an editor's note on images where the source has suggested they could be digitally enhanced," a spokesperson told CNN. This comes on the heels of a recent controversy, where a photo of Kate Middleton was revealed to be doctored.

The publication found 19 alterations in the photo that most people likely wouldn't notice unless they zoom in very closely and examine every pattern. It found a few misalignments in the subjects' clothing, random floating artifacts, cloned hair strands and heads that looked like they were pasted in from another photo due to the difference in lighting. Kate, or whoever edited the picture, might have simply been looking to create the best version of it possible, but agencies like Getty only allow minimal editing for the photos in their library to avoid spreading misinformation. 

Today would have been Her Late Majesty Queen Elizabeth’s 97th birthday.

This photograph - showing her with some of her grandchildren and great grandchildren - was taken at Balmoral last summer.

📸 The Princess pic.twitter.com/1FOU4Ne5DX

— The Prince and Princess of Wales (@KensingtonRoyal) April 21, 2023

The princess' absence from public events since Christmas last year has, as you might have expected, spawned all kinds of conspiracy theories. It even gave rise to a whole Wikipedia article entitled "Where is Kate?" because people around the world are apparently that invested in the British monarchy and can't quite believe that she'd undergone abdominal surgery. 

In the midst of it all, William and Kate's social media accounts posted the aforementioned doctored photo of the Princess of Wales with her children on Mother's Day in the UK. But when the Associated Press and other news agencies pulled the photo because they found that it had been edited, those conspiracy theories became even more outlandish. The wildest claim we've heard so far is that the video of her out shopping with the Prince of Wales wasn't her at all but a body double. Or a clone, apparently, because that's the way it goes on the internet.

This article originally appeared on Engadget at https://www.engadget.com/getty-flags-another-british-royal-family-photo-for-being-digitally-altered-121856385.html?src=rss

NVIDIA's GPUs powered the AI revolution. Its new Blackwell chips are up to 30 times faster

In less than two years, NVIDIA’s H100 chips, which are used by nearly every AI company in the world to train large language models that power services like ChatGPT, made it one of the world’s most valuable companies. On Monday, NVIDIA announced a next-generation platform called Blackwell, whose chips are between seven and 30 times faster than the H100 and use 25 times less power.

“Blackwell GPUs are the engine to power this new Industrial Revolution,” said NVIDIA CEO Jensen Huang at the company’s annual GTC event in San Jose, which was attended by thousands of developers and which some compared to a Taylor Swift concert. “Generative AI is the defining technology of our time. Working with the most dynamic companies in the world, we will realize the promise of AI for every industry,” Huang added in a press release.

NVIDIA’s Blackwell chips are named in honor of David Harold Blackwell, a mathematician who specialized in game theory and statistics. NVIDIA claims that Blackwell is the world’s most powerful chip. It offers a significant performance upgrade to AI companies, with speeds of 20 petaflops compared to the H100’s 4 petaflops. Much of this speed is made possible thanks to the 208 billion transistors in Blackwell chips, compared to 80 billion in the H100. To achieve this, NVIDIA connected two large chip dies that can talk to each other at speeds up to 10 terabytes per second.

In a sign of just how dependent our modern AI revolution is on NVIDIA’s chips, the company’s press release includes testimonials from eight top executives who collectively lead companies worth trillions of dollars. They include OpenAI CEO Sam Altman, Microsoft CEO Satya Nadella, Alphabet CEO Sundar Pichai, Meta CEO Mark Zuckerberg, Google DeepMind CEO Demis Hassabis, Oracle chairman Larry Ellison, Dell CEO Michael Dell, and Tesla CEO Elon Musk.

“There is currently nothing better than NVIDIA hardware for AI,” Musk says in the statement. "Blackwell offers massive performance leaps, and will accelerate our ability to deliver leading-edge models. We’re excited to continue working with NVIDIA to enhance AI compute,” Altman says.

NVIDIA did not disclose how much Blackwell chips would cost. Its H100 chips currently run between $25,000 and $40,000 per chip, according to CNBC, and entire systems powered by these chips can cost as much as $200,000.

Despite their costs, NVIDIA’s chips are in high demand. Last year, delivery wait times were as high as 11 months. And having access to NVIDIA’s AI chips is increasingly seen as a status symbol for tech companies looking to attract AI talent. Earlier this year, Zuckerberg touted the company’s efforts to build “a massive amount of infrastructure” to power Meta’s AI efforts. “At the end of this year,” Zuckerberg wrote, “we will have ~350k Nvidia H100s — and overall ~600k H100 equivalents of compute if you include other GPUs.”

This article originally appeared on Engadget at https://www.engadget.com/nvidias-gpus-powered-the-ai-revolution-its-new-blackwell-chips-are-up-to-30-times-faster-001059577.html?src=rss

The Morning After: TikTok bans and Airbnb cams

The biggest story this week was TikTok and the US government going at it again, with the House voting in favor of a bill that could force TikTok's parent company to sell to a US owner or face getting banned outright. Don't worry, though; your elected officials didn't waste the chance to embarrass themselves, as usual. Meanwhile, Mike Tyson comes out of retirement to box for Netflix. He'll face off against Jake Paul, which I feel is best represented by this Punch-Out! tweet.

This week's stories:

✅🕣⛔️ House passes bill that could ban TikTok

🥊🥊😵 The real fight isn't Tyson vs. Paul — it's Netflix vs. its livestreaming infrastructure

📹🏨 Airbnb to hosts: please stop filming the guests

And read this:

To celebrate this website's 20th anniversary, we're looking back at the products and services that have changed the industry since Engadget's inception on March 2, 2004. I've also been here for over half of its existence. Horrifying. I'd share my not-great first hands-on video for the site, but the footage only lives on through Russian content scrapers. What a shame.

All of the stories live here, but I suggest starting with our stories on how streaming video changed the internet and the game-changer that was (and is) Bluetooth audio.

This article originally appeared on Engadget at https://www.engadget.com/the-morning-after-tiktok-bans-and-airbnb-cams-150046760.html?src=rss