We made it, folks! The end of the year is upon us, so Cherlynn, Devindra and UK Bureau Chief Mat Smith have gathered to break down the high and low points for the tech world. And for the first time, we duke it out to decide the first ever Engadget Podcast awards for the best and worst tech of the year. Let’s just hope things look brighter in 2023.
Listen below or subscribe on your podcast app of choice. If you've got suggestions or topics you'd like covered on the show, be sure to email us or drop a note in the comments! And be sure to check out our other podcasts, the Morning After and Engadget News!
Devindra, Cherlynn and Mat’s personal best / worst of the year – 27:07
Engadget Podcast Official Best / Worst of the Year – 47:36
FTX founder Sam Bankman-Fried arrested in the Bahamas – 1:00:09
Dell’s Concept Luna laptop can be dismantled in seconds – 1:02:41
Google’s smart home devices now support the Matter standard – 1:10:25
Working on – 1:13:22
Pop culture picks – 1:15:00
Credits
Hosts: Cherlynn Low and Devindra Hardawar
Guest: Mat Smith
Producer: Ben Ellman
Music: Dale North and Terrence O'Brien
Livestream producer: Julio Barrientos
Graphic artists: Luke Brooks and Brian Oh
Last year, Dell intrigued us with Concept Luna, its attempt at making a sustainable laptop with fewer screws, using components that are easier to upgrade and recycle. It felt like a breath of fresh air compared to ultraportables that trade repairability for thinness. This year, Dell is pushing the concept even further. Its latest Luna device can be fully disassembled in around 30 seconds using just a push-pin tool and a bit of elbow grease. There aren't any cables or screws to worry about.
How can Dell achieve this sorcery? By developing a completely modular design, wherein every component can be snapped into place without much fuss. And it's not just marketing hype: As you can see in the video above, it doesn't take much effort for a Dell representative to deftly disassemble a Luna device. After unlocking the keyboard with a pin tool, he removed two speaker units, the battery, a CPU fan and a slim motherboard. The display was a cinch to remove as well, after unlocking the laptop's center bezel.
When it's all put together, the new Concept Luna looks like one of Dell's 13-inch laptops (more a Latitude than a slim XPS, to be clear). You'd have no idea there was a genuine revolution going on under the hood. Dell's sustainability angle is a lot clearer this time around. Whereas the previous concept still required a bit of technical maneuvering, it wouldn't take much for a general user to get under the hood of a fully modular laptop like this. It's about as hard as ejecting a SIM card.
The new Luna laptop also has room for a CPU fan, allowing it to house more powerful processors. Additionally, Dell worked with a micro-factory while developing Luna, allowing the company to automate the ordeal of assembling and tearing Luna devices apart. That process also involves testing individual components — after all, it's easy to imagine some parts of a computer getting far more use than others. If you primarily use your laptop on a desk with an external keyboard, its built-in keyboard probably has a lot of life left.
"By marrying Luna’s sustainable design with intelligent telemetry and robotic automation, we’ve created something with the potential to trigger a seismic shift in the industry and drive circularity at scale," Glen Robson, CTO for Dell Technologies’ Client Solutions Group, said in a blog post. "A single sustainable device is one thing, but the real opportunity is the potential impact on millions of tech devices sold each year, and optimizing the materials in those devices for future reuse, refurbishment or recycling. "
While it's unlikely we'll see a Luna-like consumer laptop anytime soon, its mere existence could influence the way Dell designs future systems. The company is also pushing its sustainability initiatives in a variety of other ways, for example by dramatically reducing packaging waste, or exploring recycled materials for some PC cases. When it comes to true DIY repairability, Dell already has some competition from Framework (which just unveiled a DIY Chromebook). Still, it's nice to see one of the world's biggest PC makers taking sustainability seriously.
Avatar: The Way of Water is a triumph. As a sequel to the highest-grossing film ever, which was criticized for its formulaic story (and the surprisingly small ripple it had on pop culture), the new movie is a genuine surprise. It's a sweeping epic that reflects on the nature of families, our relationship to the natural world and humanity's endless thirst for violence and plunder. Fans of the original film often had to make excuses for writer and director James Cameron's stilted script, but that's no longer the case for The Way of Water, thanks to additional help from Amanda Silver and Rick Jaffa (who both worked on the recent criminally under-loved Planet of the Apes trilogy).
Perhaps most impressive, though, is that James Cameron has managed to craft the best high frame rate (HFR) movie yet. Certain scenes play back at 48 frames per second, giving them a smoother and more realistic sheen compared to the standard 24fps. That leads to 3D action scenes that feel incredibly immersive — at times HFR can make you forget that the lush alien wildlife on Pandora isn't real.
Unlike the handful of high frame rate movies we've already seen – The Hobbit trilogy, as well as Ang Lee's Gemini Man and Billy Lynn's Long Halftime Walk – the Avatar sequel deploys the technology in a unique way. Rather than using HFR throughout the entire movie, Cameron relies on it for major action sequences, while slower dialog scenes appear as if they're running at 24fps. To do that, the entire film actually runs at 48fps, while the calmer scenes use doubled frames to trick your brain into seeing them at the typical theatrical frame rate.
If this sounds a bit confusing, your brain may have a similar reaction while watching the film. The Way of Water often jumps from hyper-real HFR to pseudo-24 fps in the same scene — at one point, I counted around a dozen switches in a few minutes. This is a strategy Cameron has been discussing for years. In 2016, he noted that HFR is "a tool, not a format," and later he rejected Ang Lee's attempt at using HFR for Gemini Man's entire runtime.
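For the technically curious, here's a toy sketch of the trick described above. It isn't anything from the film's actual mastering pipeline, just an illustration of how a 48fps container can carry both "true" 48fps action and 24fps-style scenes built from doubled frames.

```python
# Illustration only: a 48fps container can hold both genuine 48fps footage
# and 24fps-style footage made of doubled frames. Frames are represented by
# integer IDs; in a real video they would be images.

def render_one_second(unique_frames, hfr=True):
    """Return the frame sequence a 48fps projector shows for one second."""
    if hfr:
        # Action scenes: 48 distinct frames per second.
        return list(range(unique_frames))      # 0, 1, 2, ... 47
    # Dialog scenes: 24 distinct frames, each shown twice, so the projector
    # still runs at 48fps but motion looks like classic 24fps.
    doubled = []
    for frame in range(unique_frames):
        doubled.extend([frame, frame])         # 0, 0, 1, 1, ... 23, 23
    return doubled

action = render_one_second(48, hfr=True)   # hyper-smooth motion
dialog = render_one_second(24, hfr=False)  # 24 unique frames doubled into 48 slots

assert len(action) == len(dialog) == 48    # both fill one second at 48fps
```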
Cameron's dual-pronged approach to HFR is bound to be controversial. Even for someone who appreciates what the technology has to offer — pristine 3D action scenes with no blurring or strobing — it took me a while to get used to flipping between high frame rate and 24 fps footage. With Gemini Man, my brain got used to the hyper-reality of HFR within 15 minutes. In The Way of Water, I was almost keeping an eye out for when the footage changed.
Despite the distracting format changes, The Way of Water’s high frame rate footage ultimately worked for me. At times, the film appears to be a window into the world of Pandora, with breathtaking shots of lush forests and vast oceans. It makes all of Cameron’s creations, from enormous flying fish-like creatures that you can ride, to alien whales with advanced language, appear as if they’re living and breathing creatures. HFR also works in tandem with the sequel's more modern CG animation, making the Na'vi and their culture feel all the more real.
Over the film’s three-hour-and-twelve-minute runtime, I eventually managed to see what the director was aiming for, even if his ambition exceeded his grasp. (Cameron, who has the highest-grossing [Avatar] and third-highest-grossing [Titanic] films of all time under his belt, and who dove into the Mariana Trench in a self-designed personal submarine, suggests you can use the bathroom anytime you want during The Way of Water. You’ll just catch up the next time you see it in theaters. Baller.)
The re-release of Avatar earlier this month also used a combination of HFR and traditional footage (in addition to brightening the picture and upscaling the film to 4K). But even though that revamp grossed over $70 million on its own, there hasn't been much discussion about how it integrated high frame rate footage. (I saw it on a Regal RPX screen, which offered 3D but no extra frames, sadly.) There's a better chance you'll be able to catch Avatar: The Way of Water exactly how Cameron intended. It'll be screening in 4K, HFR and 3D at all AMC Dolby Cinema locations and select IMAX theaters (single laser screens get everything, some dual-laser screens will only offer 2K 3D with HFR). While you could see it in 2D, why would you?
After suffering through the interminable Hobbit movies in HFR, I figured the technology was mostly a waste of time, yet another money-grab that Hollywood can use to pump up ticket prices. Director Peter Jackson struggled to recreate the magic of his Lord of the Rings trilogy, and amid production issues, he also failed to change the way he shot the Hobbit films to account for HFR. That led to sets that looked like they were ripped from B-grade fantasy movies and costumes that seemingly came from a Spirit Halloween pop-up.
Ang Lee’s more studious attempts at using the technology, especially with the action scenes in Gemini Man, convinced me HFR still had some potential. But even he struggled along the way. Billy Lynn’s Long Halftime Walk is a cinematic curiosity, where HFR makes slow dialog scenes appear distractingly real. Gemini Man was cursed by a messy script and the need to be a big-budget Will Smith blockbuster.
Avatar: The Way of Water benefits from the creative failures of all of the earlier high frame rate films. For many, it’ll be their introduction to this technology, so it’ll be interesting to see how general audiences respond. Video games and hyper-real YouTube action footage have made 60fps footage far more common, so I could see younger audiences, those raised on hundreds of hours of Minecraft and Fortnite, vibing with Cameron’s vision. Everyone else will need more convincing. For me, though, I’m just glad there’s finally a high frame rate film that’s genuinely great, instead of just a technical exercise.
The idea of sitting in front of a massive 55-inch gaming monitor all day sounds like heaven. Being able to twist it into a towering portrait mode? The stuff of my multi-tasking dreams. That's the pitch behind Samsung's 55-inch 4K Odyssey Ark Monitor. As we saw during our first preview, it's a genuinely unique behemoth of a display, one that can easily immerse you in both Microsoft Flight Simulator and towering Excel spreadsheets.
Sure, you could just plug a 55-inch TV into your PC, but without the Ark's extreme curve, it would be too wide to comfortably use as a monitor. You also won't find any 4K TVs with the Ark's blazing-fast 165Hz refresh rate and 1ms response time, let alone its surprisingly solid sound system. The Odyssey Ark stands alone. But is it actually worth $3,500? That depends on whether you can live with its many annoyances (and if you don't think too hard about the price).
Annoyance number one? This thing is a bear to set up. Even with the help of two delivery workers, it took around 20 minutes to get the (very heavy) Ark monitor correctly attached to its (equally heavy) base. The entire unit weighs 91.5 pounds when put together, so be sure to have a sturdy desk at the ready. If that sounds a bit obscene, well, you'd be correct. Samsung's 55-inch QN90B NEO QLED TV weighs almost half as much (48.3 pounds), while LG's 65-inch C2 OLED TV clocks in at 72 pounds. Extreme heft is the unfortunate price you'll have to pay for a rotating monitor stand.
As soon as I sat in front of the Odyssey Ark, I understood why Samsung dared to build it. We've already seen its extra-wide 49-inch gaming monitors in action. And, of course, the company that pushed the limits of phone screens would do the same for PCs. Given Samsung's robust TV business, it makes sense to explore the many other ways it could use 55-inch Mini LED panels. (The TV side is where we also saw Samsung debut screens that could rotate into TikTok-friendly portrait mode.) The Ark may not be entirely practical, but for Samsung it serves as a showcase for many of its display innovations.
Design-wise, the Odyssey Ark resembles Samsung's TVs more than its gaming monitors. It has a sturdy metal base (as it should, given its size), as well as a smooth metal case surrounding the curvy screen. It even comes with Samsung's One Connect breakout box, one of the company's more intriguing TV inventions. It connects to the Ark over a single cable, while the box itself handles power and all of your typical connections (four HDMI 2.1 ports, an optical audio connection, a 3.5mm headphone jack and two USB ports). While it was originally meant for screens you'd be mounting on walls, it's a welcome addition to the Odyssey Ark — nobody wants to push a nearly 100-pound beast around just to get to HDMI ports.
Samsung includes two ways to control the Ark: a simple remote with a directional pad and shortcuts for streaming apps like Netflix, as well as a dial for quickly managing the Ark's many different viewing modes. Both controllers are solar powered, so you'll just need to make sure they get a bit of light to keep running. I'd imagine that could be an issue in windowless offices like mine, but it wouldn't be that tough to place the remotes by a window every few months. On the plus side, they should be able to run indefinitely if you're lucky enough to have some light. (Solar cells can be charged by artificial lighting, but not very efficiently.)
The Ark's odd shape and a few LED lighting strips on the rear are the major signs that it's not a mere TV set. I've been in front of plenty of curved screens before, but nothing this extreme. The sides of the display almost seem like they're trying to embrace you with 4K Mini-LED goodness. It's an effective bit of immersion while you're viewing the Ark in its standard widescreen mode, reminiscent of specialized theaters like LA's Cinerama Dome. And unlike most TVs and monitors, the Ark's large frame allows it to house a six-speaker sound system (four tweeters and two woofers), which delivers the audio punch of a medium-sized soundbar.
The combination of expansive sound and a wonderfully immersive picture makes the Odyssey Ark a truly unique viewing experience. Movies, TV shows, and even trailers felt like they were drawing me into the action, so much so that I barely noticed the slight distortion from the curved sides of the screen. But while the Ark's sweet spot is indeed very sweet, showing off the added brightness of Mini-LED and the expansive color range of its Quantum Dot display, its viewing angle is incredibly limited. Just a few steps off center and you immediately lose color and clarity. The curve giveth, the curve taketh away.
When it comes to software, the Ark is a curious device, sitting somewhere between a smart TV (it has apps for Netflix, YouTube and everything else you'd expect) and a computer monitor. On the PC side, it can reach up to a 165Hz refresh rate, allowing for frenetic gaming at incredibly high framerates. Samsung has baked in multiple ways to take advantage of its massive screen size: Its "Multi View" mode lets you display up to three different apps at once. That can include a single HDMI video input, as well as a device mirrored wirelessly (using AirPlay or the Android equivalent). Additionally, there's a "Flex View" mode that lets you shrink an input down so it doesn't fill up the whole screen.
Now why would you want to do that? I quickly learned that playing fast-paced shooters like Overwatch 2 and Halo Infinite was overwhelming when sitting a few feet away from a 55-inch screen. Sure, I could see more detail, but moving the camera around quickly and trying to track potential enemies was nausea-inducing. I didn't mind sitting close to the screen for slower-paced games like A Plague Tale: Requiem, but for shooters I'd prefer moving the screen further back. Unfortunately, that just wasn't possible in my cramped basement office, so I occasionally used Flex View mode to shrink faster games down to size.
The Ark also features built-in apps for every major game streaming service: Xbox Game Pass Ultimate, GeForce Now, Amazon Luna and yes, even Google’s short-lived Stadia (which dies next month). You can easily pair Bluetooth controllers with the Ark directly, allowing you to treat it like an oversized console. I had no trouble signing into Game Pass and going through a few races in Forza Horizon 5, but as usual, your streaming experience will depend on the quality of your internet.
On the non-gaming front, I learned that moving to a 55-inch screen still required a bit of an adjustment. I typically use a 34-inch ultrawide monitor, which gives me a decent amount of horizontal space without being too overwhelming vertically. But sitting in front of the Odyssey Ark almost felt like sitting directly in front of the monolith from 2001 — there's just so much screen. After 30 minutes or so, I got used to using Slack, Evernote and my many browser tabs on a big screen. But when it came to focused writing and other work, the Ark was overkill. I wrote this review on the Ark during small sessions, but I just couldn't stick with it for too long.
Samsung's unique portrait view, or "Cockpit Mode," was similarly overpowering. It's easy to rotate the Ark between that and its typical landscape mode — you just have to push the screen to the top of the base and push along its left side — but I found the taller view genuinely off-putting. Instead of a warm embrace, it was honestly a bit threatening, as if my body was instinctively worried about the Ark toppling over.
It's easy to make Windows 11 recognize a portrait display, but I didn't find it too useful for my typical work. (Though I'd imagine some Flight Simulator fans may enjoy trying to tweak the game for a genuine cockpit view). Instead, the portrait orientation was better suited to the Ark's Multi View mode, allowing me to play a PC game in a small 31-inch square while I left a YouTube video running in another window on top, and my phone mirrored right above it. It's just a shame that Multi View mode doesn't currently support streaming video apps like Netflix. Also, you can only hear audio from up to two sources at once. (And if this sounds like pure information overload, well, it is.)
For whatever reason, the Ark did an awful job of downscaling my PC's desktop in Multi View mode, even after tweaking the resolution several times. Text was hard to read and images were blurry, as if I was looking at everything through a pair of dirty glasses. Fighting with that feature also made it clear how the Ark often pales in comparison to having a typical multi-monitor setup. It's not that hard to have another monitor sitting beside your standard screen, and that also wouldn't involve any sort of distortion. You're also more free to tweak the way external monitors are positioned and laid out, rather than working within the confines of Samsung's software.
Most importantly, though, a multi-monitor setup would be vastly cheaper than the Odyssey Ark's $3,500 retail price. (It's currently marked down to $2,500 for the holidays, but it's unclear if that will stick.) You could easily pick up a 55-inch OLED TV and a few PC monitors for less than $2,500. Or my personal recommendation: Get Alienware's fantastic QD-OLED ultrawide monitor for $1,299 and an assortment of other screens. Whichever direction you go, it'll be far more practical than having a single 55-inch display on your desk.
As my honeymoon period with the Odyssey Ark faded, I was left with nothing but questions. Where are the DisplayPort or USB-C connections on Samsung's breakout box? Why, exactly, can't it display more than two HDMI connections at once? Is the extreme curve worth losing any sort of off-angle viewing? Really, who is this thing actually for? Will gamers be able to live with its downsides to take advantage of a 55-inch monitor? Can they stomach a $3,500 price? And how many people will have desks sturdy enough to hold this thing?
Anyone buying the Odyssey Ark is basically paying to be a beta tester for Samsung. In general, we recommend against serving as consumer guinea pigs. But if you've been dying to have an impossibly gigantic curved gaming monitor, your wish has finally been granted.
Finally, a Kindle you can write on! This week, we dive into Cherlynn’s review of the Kindle Scribe, Amazon’s first e-reader that can also capture handwritten notes. The hardware is great, but as usual, Amazon’s software feels half-baked. Also, Devindra and Cherlynn discuss the rise of new Twitter alternatives like Hive Social and Post. It looks like many communities are already splintering off to these services, but unfortunately, they can’t yet replicate the magic of Twitter.
Listen below or subscribe on your podcast app of choice. If you've got suggestions or topics you'd like covered on the show, be sure to email us or drop a note in the comments! And be sure to check out our other podcasts, the Morning After and Engadget News!
Rise of the Twitter clones: Hive Social, Post, and Mastodon – 19:28
Amazon will lose $10 billion on its Alexa division this year – 34:12
We’ve got a new trailer for the Super Mario Bros. animated movie – 38:01
Working on – 43:58
Pop culture picks – 45:30
Credits
Hosts: Cherlynn Low and Devindra Hardawar
Producer: Ben Ellman
Music: Dale North and Terrence O'Brien
Livestream producer: Julio Barrientos
Graphic artists: Luke Brooks and Brian Oh
Last year, the Biden administration signed the Secure Equipment Act into law, which aimed to block the authorization of network licenses from several Chinese companies whose hardware has been deemed a national security threat. Today, the FCC announced that it's officially implementing that ruling, which means some future equipment from Huawei, ZTE, Hytera, Hikvision and Dahua won't be authorized for sale in the US. Existing equipment from those companies, which are all listed under the FCC's "Covered List," isn't affected by the law.
“The FCC is committed to protecting our national security by ensuring that untrustworthy communications equipment is not authorized for use within our borders, and we are continuing that work here,” FCC Chairwoman Jessica Rosenworcel said in a statement. “These new rules are an important part of our ongoing actions to protect the American people from national security threats involving telecommunications.”
To be clear, the FCC isn't completely blocking all hardware from these companies. And for some, like Hytera, Hikvision and Dahua, Rosenworcel writes that it's specifically focusing on gear related to "the purpose of public safety, security of government facilities, physical surveillance of critical infrastructure, and other national security purposes." If those companies can show that they're not marketing that equipment for government use — for example, by directing it to consumers instead — they may be able to get authorized by the FCC.
This latest move follows years of conflict between the US and companies closely tied to the Chinese government. That's included placing several notable Chinese companies, including DJI, on the Department of Commerce's "Entity List," which prohibits US firms from selling equipment to them. The FCC is also calling for $5 billion to help US carriers with the massive task of replacing equipment from Huawei and ZTE.
This week, Cherlynn and Devindra chat with Senior Commerce Editor Valentina Palladino about our massive Holiday Gift Guide. If you’re looking for a decent laptop to gift, or maybe some budget gear for yourself, we’ve got you covered! They also dig into the FTX debacle (which got much worse since last week!) and Elon Musk’s ongoing fail whale Twitter acquisition. And on a surprising note, we end up having strong feelings about Amazon’s chat-based virtual healthcare service.
Listen below, or subscribe on your podcast app of choice. If you've got suggestions or topics you'd like covered on the show, be sure to email us or drop a note in the comments! And be sure to check out our other podcasts, the Morning After and Engadget News!
Cherlynn got to try Apple’s SOS satellite text message service – 28:56
Qualcomm announces Snapdragon chips with hardware-accelerated ray tracing – 34:33
Tuvalu turns to the metaverse to save its culture from climate change – 38:38
Meta axes its Portal video chat device – 40:21
FTX continues to collapse as regulator investigations begin – 43:15
Elon Twitter is a mess: your weekly update – 48:36
Working on – 1:02:47
Pop culture picks – 1:05:59
Credits
Hosts: Cherlynn Low and Devindra Hardawar
Guest: Valentina Palladino
Producer: Ben Ellman
Music: Dale North and Terrence O'Brien
Livestream producer: Julio Barrientos
Graphic artists: Luke Brooks and Brian Oh
It looks like Evernote's 2020 redesign wasn't enough to keep it independent. Today, the former darling note-taking app for productivity hounds, which was once valued at $1 billion, announced that it has been purchased by Bending Spoons. If that name sounds unfamiliar to you, you're not alone. It's a Milan-based developer behind mobile apps like the video editor Splice and the AI image editing tool Gemini. They look like well-designed and genuinely useful apps, but they're far from Evernote's once-lofty goal of helping you remember everything.
In a blog post, Evernote CEO Ian Small said the company is currently testing out collaborative editing between multiple users, and it's close to launching beta tests for deep Office 365 calendar integration. While that's good to hear, especially for the few remaining Evernote addicts like this reporter, those are also features that have existed in other platforms for years. It may also be tough to convince friends and colleagues to collaborate on an Evernote document — which may involve signing up for an account and learning a new interface — when Google Docs has made that simple for years.
"While ownership is changing hands, our commitment to keeping your data safe and secure remains as steadfast as ever, and the Evernote you know and love will continue to thrive," Smalls said in the post. "Joining Bending Spoons allows us to take advantage of their proven app expertise and wide range of proprietary technologies."
Terms of the deal weren't disclosed. But the fact that Evernote was purchased by a small app firm, rather than a notable tech giant, may be telling. Evernote raised nearly $300 million during the initial hype cycle around mobile apps. But the company eventually lost focus, branching out to real-world products like a smart notebook made with Moleskine. Its apps were incredibly buggy for years, and it did a poor job of convincing users to actually pay for its product.
Somehow, I stuck with it though. I have over a decade's worth of notes living in Evernote — countless news stories, interviews (with their accompanying audio), reviews and PDFs. My attempts at finding replacements have typically ended in failure (sorry OneNote, I just don't like your editor). This acquisition isn't exactly the death knell for Evernote, but it certainly feels like the end of an era. Will my data be safe under a new owner? Can I rely on fast and accurate synchronization? I'll probably stick around for a bit longer, but all of a sudden, the alternatives are looking a lot more compelling.
Sure, we all want NVIDIA's RTX 4090, but it's tough to stomach its $1,599 starting price (if you can even find it at that price) or its massive power demands. That leaves impatient PC gamers with only one other new NVIDIA option this year: the $1,199 RTX 4080 with 16GB of VRAM. While $400 isn't exactly a huge discount in the world of high-end PC gaming (certainly not as significant as the $899 12GB RTX 4080 that NVIDIA "unlaunched"), it may tempt some gamers.
After all, it's faster than the RTX 3080 Ti that launched at the same price earlier this year, and it works with NVIDIA's powerful new DLSS 3 upscaling technology (which is limited to 4000-series GPUs). If you can live without the bragging rights of having a 4090, the RTX 4080 is a powerful GPU that'll satisfy anyone who wants to game in 4K with ray tracing. For those stuck with lower resolution monitors, though, you're probably better off waiting for the eventual 4070 and 4060 cards, as well as AMD's upcoming RDNA 3 GPUs.
Surprisingly enough, the RTX 4080 Founders Edition we reviewed shares the exact same design as the 4090. They both take up three PCI-e slots, sport massive vapor chambers, and they retain the unique pass-through fan design from NVIDIA's previous GPUs. I was expecting something a bit smaller, to be honest. At least the 4080 only needs three 8-pin PSU cables to function, whereas the 4090 demands four. (Both cards can also be powered by a single PCIe 5.0 PSU cable, but those power supplies are pretty rare at the moment.)
The 4080's power cables also hint at one of its major advantages: It has a 320-watt thermal design power (TDP) rating and requires a 750W PSU, whereas the 4090 has a far more demanding 450W TDP. Unless you already have an 850W power supply, upgrading to the 4090 may involve getting a new unit and rewiring power throughout your entire system. These cards won't always use their maximum power loads, but you'll still need to be ready for the rare moments when they need more juice.
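If you're wondering whether your current power supply clears the bar, a rough back-of-the-envelope estimate looks like the sketch below. The GPU TDPs and PSU figures come from the review; the CPU draw, rest-of-system draw and headroom margin are illustrative assumptions, not NVIDIA's guidance.

```python
# Rough PSU headroom estimate. GPU TDPs are from the review; the CPU and
# "rest of system" figures are illustrative assumptions, not measurements.

def minimum_psu(gpu_tdp_w, cpu_tdp_w=150, rest_of_system_w=100, headroom=0.2):
    """Suggest a PSU wattage with ~20% headroom for transient power spikes."""
    peak_draw = gpu_tdp_w + cpu_tdp_w + rest_of_system_w
    return round(peak_draw * (1 + headroom))

print(minimum_psu(320))  # RTX 4080 (320W TDP) -> 684W, comfortably within a 750W PSU
print(minimum_psu(450))  # RTX 4090 (450W TDP) -> 840W, which is why 850W is the practical floor
```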
While it may look just like the 4090, the RTX 4080 is a dramatically different beast under the hood. It's powered by 9,728 CUDA cores, 16GB of GDDR6X VRAM and offers a base clock speed of 2.21GHz (with boost speeds to 2.51GHz). The 4090, on the other hand, has 16,384 CUDA cores, slightly higher clock speeds and a whopping 24GB of VRAM. Compared to the 3080 Ti, the 4080 wins out with NVIDIA's new Ada Lovelace architecture, significantly faster speeds and 4GB more VRAM. (The 3080 Ti technically has around 500 more CUDA cores, but they're also inherently slower and less efficient than NVIDIA's new platform.)
So what do these numbers mean in practice? The RTX 4080 scored around 3,500 fewer points in 3DMark's TimeSpy Extreme benchmark compared to the 4090. But if that more powerful card didn't exist, the 4080 would be the most capable GPU we've ever reviewed. Its TimeSpy Extreme score was about 50 percent higher than the 3080 Ti, and it reached a comfortable 130fps while playing Halo Infinite in 4K with all of its graphics settings maxed out. Seeing Cyberpunk 2077 hit 74fps in 4K with ultra ray tracing settings (and the help of DLSS 3) nearly brought a tear to my eye.
GPU | 3DMark TimeSpy Extreme | Port Royal (Ray Tracing) | Control | Blender
NVIDIA RTX 4080 | 12,879 | 17,780 / 82 fps | 4K (Native) High RT: 42 fps | 9,310
NVIDIA RTX 4090 | 16,464 | 25,405 / 117.62 fps | 4K (Native) High RT: 107 fps | 12,335
NVIDIA RTX 3080 Ti | 8,683 | 12,948 / 59.95 fps | 4K (Native) Med RT: 43 fps | 5,940
AMD Radeon RX 6800 XT | 7,713 | 9,104 / 42.15 fps | 4K (Native) No RT: 28-40 fps | N/A
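For the curious, here's how those comparisons shake out using the TimeSpy Extreme scores from the table above, nothing more than a quick arithmetic check:

```python
# Sanity-checking the comparisons above with the TimeSpy Extreme scores
# from the table (higher is better).
scores = {"RTX 4080": 12_879, "RTX 4090": 16_464, "RTX 3080 Ti": 8_683}

gap_to_4090 = scores["RTX 4090"] - scores["RTX 4080"]               # 3,585, i.e. "around 3,500 fewer points"
lead_over_3080_ti = scores["RTX 4080"] / scores["RTX 3080 Ti"] - 1  # ~0.48, i.e. "about 50 percent higher"

print(f"Gap to the 4090: {gap_to_4090} points")
print(f"Lead over the 3080 Ti: {lead_over_3080_ti:.0%}")
```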
A word on DLSS 3: It's NVIDIA's latest AI solution that can take lower-quality imagery and upscale it to higher resolutions. But in addition to intelligently sharpening edges and upgrading textures, DLSS 3 can also inject interpolated frames to smooth out 4K gameplay. While I can occasionally spot issues with particularly low quality DLSS upscaling, I didn't notice any unusual framerate hiccups while testing Cyberpunk and A Plague Tale: Requiem with the technology enabled.
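To visualize what "injecting interpolated frames" means in the abstract, here's a deliberately naive sketch. DLSS 3 relies on dedicated optical flow hardware and a neural network, so the crude midpoint blend below is only a stand-in for the general concept of synthesized in-between frames, not NVIDIA's algorithm.

```python
import numpy as np

# Naive illustration of frame generation: insert a blended frame between each
# pair of rendered frames, roughly doubling the output framerate. DLSS 3 does
# something far more sophisticated (motion vectors, optical flow, a neural
# network); this only shows the idea of interpolated in-between frames.

def interpolate_frames(frames):
    """frames: list of HxWx3 float arrays. Returns roughly twice as many frames."""
    output = []
    for current, following in zip(frames, frames[1:]):
        output.append(current)
        output.append((current + following) / 2.0)  # crude midpoint blend
    output.append(frames[-1])
    return output

rendered = [np.random.rand(4, 4, 3) for _ in range(30)]  # stand-in for 30 rendered frames
smoothed = interpolate_frames(rendered)
print(len(rendered), "->", len(smoothed), "frames")      # 30 -> 59
```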
The only real downside to the RTX 4080 is that I can't help but compare it to the 4090. That same Cyberpunk ray tracing benchmark was almost twice as fast on the 4090, reaching an eye-watering 135 fps. It also hit a 40-fps-higher average framerate in the 3DMark Port Royal ray tracing benchmark. Still, these are the sorts of gains only the most dedicated gamers will notice, the exact market for the 4090. When it comes to actual 4K gameplay, even with ray tracing in demanding games like Control, I never felt held back by the RTX 4080.
And if you're looking for more performance, overclocking is always an option. I didn't have a chance to do so myself, but the 4080's thermal performance makes me think there's plenty of room for pushing things harder. It never climbed beyond 61 degrees Celsius during my testing, around 10 degrees cooler than the 4090. That's a testament to NVIDIA's excellent cooling setup (and perhaps partially due to my office being slightly cooler this month).
The real question: Is it worth settling for the 4080 if there's a chance you'll actually be able to buy the 4090 for $1,599? At the moment, most online retailers are selling 4090 cards for well above $2,000. It sounds crazy to say, but given that gulf, the $1,199 card seems like a steal. Of course, who knows how long you'll be able to find the RTX 4080 at its launch price; it likely won't be too long before it creeps toward the 4090's higher price tag.
And if paying more than $1,000 for a video card seems insane to you — and let's be clear, it should — sit tight to see what NVIDIA's future cards look like. We're definitely expecting RTX 4070, 4060 and 4050 cards eventually, but the question is when. (Also, what the heck will NVIDIA do with its planned $899 4080 GPU? Does that become the 4070?) AMD's flagship RDNA 3 GPUs will launch below $1,000, and at the entry level, Intel's new Arc GPUs are surprisingly compelling.
All in all, the RTX 4080 is exactly what I'd want from an RTX 3080 Ti successor. It's faster and has plenty of new features to make it a demonstrable leap from the previous cards. I'm not saying you should be upgrading your 3080 anytime soon, but if you somehow stumble onto $1,199, I wouldn't blame you for being tempted by the 4080.
Magic Leap's glasses were supposed to lead us into the augmented reality era, a world beyond screens where we could interact with digital objects as if they were standing right next to us. Too bad they failed spectacularly. By early 2020, the company had raised nearly $2 billion. But aside from a few flashy demos and wild art projects, there wasn't much of a reason for anyone to buy a $2,295 headset (it reportedly only sold around 6,000 units). Like Google Glass before it, Magic Leap felt like a false start for AR, a solution to a problem that didn't exist.
But the company isn't dead yet. With a new CEO onboard — former Microsoft executive Peggy Johnson — it's aiming for something far more practical: AR for the enterprise. That may seem like a retread of the HoloLens playbook, which has focused on business customers for years, but Magic Leap has a shot at giving Microsoft some serious competition with its second-generation AR glasses.
The $3,299 Magic Leap 2 (ML2), which launched in September, is easier to wear, far more powerful and offers a dramatically larger (and taller) AR field of view than any headset we've seen before. It has the unique ability to dim its display, allowing you to block out light and focus more on virtual objects. And it should be easier for developers to work with, thanks to a new Android-based OS. While it's unclear if the company's new business plan will pay off, ML2 is still a significant achievement, especially now that Meta is also pushing into similar AR-like territory with the $1,500 Quest Pro.
"It's been a long struggle," Magic Leap SVP and head of hardware Kevin Curtis said in an interview with Engadget. "When we came out of ML1, we learned a tremendous amount... Not just technically, but also from a market point of view. So that really was used to set the goals for ML2."
Some of those goals seemed impossible at the time. The company wanted to double the field of view (FOV) — the amount of screen area where you can actually see AR objects — as well as cut the device's volume in half. Those moves would make its sequel headset even more immersive, while also being more comfortable for extended wear. According to Curtis, bumping up the field of view from 50 degrees to 70 degrees with ML1's projector and eyepiece technology would have required wearing something as large as an open hand. That's not exactly doable all day.
Magic Leap spent years exploring existing forms of projection, including laser-scan based systems, uLED arrays and LCoS (liquid crystal on silicon), but found them all lacking. Instead, it developed its own custom architecture, which uses LCoS together with LED RGB light modules and a complex system of concentrators and polarizers to bring images to your eyes. That works together with a new eyepiece design to achieve its lofty 70 degree field of view.
But what does that actually mean? The Magic Leap 1 headset featured a FOV of 50 degrees, which made it seem as if you were viewing AR through a car's cramped rear window. (That was comparable to HoloLens 2's 52 degrees of viewing.) With Magic Leap 2, the company hit a 70 degree FOV by increasing the vertical viewing area, allowing you to see taller objects without moving your head up and down. During my brief demo, it felt more like standing in front of an open doorway.
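To put those numbers in rough perspective, a bit of trigonometry shows how much bigger the AR "window" gets. Treating the FOV as a single planar angle and picking a one-meter viewing distance are simplifications for illustration only:

```python
import math

# Approximate width of the AR "window" at a given distance for a given field
# of view. Treating the FOV as one planar angle and using a 1m distance are
# simplifying assumptions purely for comparison.

def window_size_m(fov_degrees, distance_m=1.0):
    return 2 * distance_m * math.tan(math.radians(fov_degrees) / 2)

print(f"50-degree FOV: ~{window_size_m(50):.2f} m across at 1 m")  # ~0.93 m
print(f"70-degree FOV: ~{window_size_m(70):.2f} m across at 1 m")  # ~1.40 m, roughly 50% more
```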
That's more akin to how you view things in real life, according to Curtis, and it goes a long way towards convincing you the AR objects you're seeing are real. I've tried a wide variety of headsets over the years (including the defunct entry from the startup Meta, which existed long before Facebook's name change), and the Magic Leap 2 is the first one that's delivered a genuine sense of presence. Whether I was viewing a large piece of medical equipment, or an expansive 3D model of downtown San Diego, I had to try hard to see the edges. It was almost aggressively immersive.
The new projection technology also helped Magic Leap achieve its goal of reducing ML2's volume by more than half, leading to a 20 percent weight drop (it clocks in at just 260 grams, slightly more than half a pound). The result is a pair of AR glasses that feel more like, well, glasses. While the original headset looked like a pair of enormous ski goggles, ML2 has flatter lenses and slimmer arms, making you seem less like a bug-eyed dork and more like an engineer or surgeon gearing up for a big project. (It's no wonder Magic Leap gave health startups a head start with access to its new hardware and software.)
All of this custom development will also help Magic Leap deliver better headsets down the line. The company claims its eventual Magic Leap 3 glasses, which have no release date yet, will lose another 50 percent in volume and deliver a larger field of view. The technology can potentially be scaled beyond 80 degrees, allowing you to view a building-sized object unencumbered by any AR boundaries.
As I started demoing the Magic Leap 2 in a brightly lit hotel meeting room, it was mostly what I expected: A more comfortable and higher quality version of its predecessor. But at one point, I hit a button and the screen started to go dark, as if a shadowy cloud was blotting out the sickly fluorescent lights above me. I had flipped on the headset's global dimmer, which darkens the real world to better highlight virtual objects. The result is an almost VR-like experience. The virtual map I was viewing, which showed how first responders were dealing with wildfires in Colorado, all of a sudden looked sharper and more colorful. I wasn't distracted by the boring meeting desk in front of me, or the occasional bystander walking by.
Every AR solution adds light, Curtis explained; what's unique about ML2 is that it can also add black. The dimmer module is another display that sits in front of the headset's eyepiece, allowing it to reduce light across the entire screen, or in specific areas, by a factor of 100. That'll let you use ML2 in brightly lit rooms, or even outside on a sunny day, without making the AR images seem washed out. Developers can also use the dimmer to add shadows to their objects, giving you an added layer of depth in AR.
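As a toy model of what that segmented dimming does (the grid size and the region chosen below are arbitrary; only the factor-of-100 light reduction comes from Magic Leap's description):

```python
import numpy as np

# Toy model of segmented dimming: ambient light passes through a dimmer layer
# whose transmittance can drop to 1/100 in selected regions, so virtual content
# drawn there isn't washed out. The 8x8 grid is an arbitrary choice for the demo.

ambient = np.full((8, 8), 500.0)      # incoming ambient light, arbitrary units
transmittance = np.ones((8, 8))       # 1.0 = fully transparent
transmittance[2:6, 2:6] = 0.01        # dim one region by a factor of 100

seen_through_headset = ambient * transmittance
print(seen_through_headset[0, 0])     # 500.0 -> untouched view of the real world
print(seen_through_headset[4, 4])     # 5.0   -> darkened backdrop for AR content
```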
As Magic Leap was working on making AR more VR-like, Meta was also doubling down on bringing the real world into VR with the Quest Pro. Thanks to new cameras and upgraded hardware, Meta is pitching that headset as a way to bring VR elements into your typical workflow (just imagine viewing VR windows dancing above your laptop's screen). Based on my time with the Quest Pro so far, that's not something I'd actually want much of. The cameras just aren't good enough yet. But it's funny to see Meta tackling a similar problem as Magic Leap from another angle. Somewhere between these two headsets is the ideal balance between the immersion of VR, and the real world integration of AR.
I was so distracted by Magic Leap 2's expanded field of view and dimming capabilities, I barely noticed that its controller felt more ergonomic. And I didn't think much of the headset's computing pack, which can now be worn across your body like a messenger bag. Naturally, it has faster hardware inside (specifically, a quad-core AMD Zen 2 processor and RDNA 2 graphics). But my main takeaway, after years of AR and VR testing, and the seemingly endless drumbeat of metaverse hype from an increasingly desperate Mark Zuckerberg, is that it’s nice to be genuinely surprised by a new headset.
But of course, tech alone won't make a successful product. Magic Leap isn't targeting ML2 at consumers at all; instead, it's being pitched to doctors who may want a bit of AR assistance during surgery, or engineers who would like to pull up schematics when they're standing in front of complex machinery.
"I think it's improved a lot, [Magic Leap is a] different company," said Chief Marketing Officer Daniel Diez, when I asked about the state of Magic Leap today. Amid dismal sales of its first headset, and increasingly dire financials, founder and CEO Rony Abovitz left in 2020. But now, thanks to more than $1 billion in additional funding and a new leader in Peggy Johnson, it has another shot at the AR market.
At the very least, it’s clear the metaverse isn’t a problem Meta can solve on its own. Magic Leap is one of the few established competitors out there, making it a company that’s still worth watching. And if the enterprise play doesn’t work out, there’s a chance a large company like Google (one of its original investors) may have some use for all of this AR tech.