Facebook says it doesn’t want to own the metaverse, just jumpstart it

Here's what Facebook's metaverse isn't: It's not an alternative world to help us escape from our dystopian reality, a la Snow Crash. It won't require VR or AR glasses (at least, not at first). And, most importantly, it's not something Facebook wants to keep to itself. Instead, as Mark Zuckerberg described to media ahead of today's Facebook Connect conference, the company is betting it'll be the next major computing platform after the rise of smartphones and the mobile web.

After we've spent the last decade obsessing over our phones and tablets — learning to stare down and scroll practically as a reflex — the Facebook founder thinks we'll be spending more time looking up at the 3D objects floating around us in the digital realm. Or maybe you'll be following a friend's avatar as they wander around your living room as a hologram. It's basically a digital world layered right on top of the real world, or an "embodied internet," as Zuckerberg describes it.

Before he got into the weeds of his grand new vision, though, Zuckerberg preempted criticism about looking to the future now, as the Facebook Papers paint the company as a mismanaged behemoth that constantly prioritizes profit over safety. While acknowledging the seriousness of the issues the company is facing, and noting that it'll continue to focus on solving them with "industry-leading" investments, Zuckerberg said:

"The reality is is that there's always going to be issues and for some people... they may have the view that there's never really a great time to focus on the future... From my perspective, I think that we're here to create things and we believe that we can do this and that technology can make things better. So we think it's important to to push forward."

Given the extent to which Facebook, and Zuckerberg in particular, have proven to be untrustworthy stewards of social technology, it's almost laughable that the company wants us to buy into its future. But, like the rise of photo sharing and group chat apps, Zuckerberg at least has a good sense of what's coming next. And for all of his talk of turning Facebook into a metaverse company, he's adamant that he doesn't want to build a metaverse that's entirely owned by Facebook. He doesn't think other companies will either. Like the mobile web, he thinks every major technology company will contribute something towards the metaverse. He's just hoping to make Facebook a pioneer.

"Instead of looking at a screen, or today, how we look at the Internet, I think in the future you're going to be in the experiences, and I think that's just a qualitatively different experience," Zuckerberg said. It's not quite virtual reality as we think of it, and it's not just augmented reality. But ultimately, he sees the metaverse as something that'll help to deliver more presence for digital social experiences — the sense of being there, instead of just being trapped in a zoom window. And he expects there to be continuity across devices, so you'll be able to start chatting with friends on your phone and seamlessly join them as a hologram when you slip on AR glasses.

A simulated preview of Horizon Home. (Image: Facebook)

But, of course, the metaverse won't be built in a day. At Facebook Connect today, the company announced several ways it's working to make that future more accessible. For one, Facebook will be transforming the Oculus Quest's Home interface into "Horizon Home," a more fully featured environment where you can invite friends and hang out virtually. Eventually, you'll also be able to build and customize your home space. The Venues app is also becoming "Horizon Venues," where it'll continue to serve as Facebook's prime spot for live virtual events. (The company also says NBA games are coming back to Venues in early November.)

The company is also making a major push for developers: its new Presence Platform offers APIs that'll allow devs to make more inventive VR apps. The Insight SDK will let them take advantage of the Quest 2's cameras to bring the real world into VR; the Interaction SDK opens up the door for more hand-tracking interactions; and the Voice SDK will — you guessed it — let you use your words in more ways.

The Insight SDK, in particular, could reshape what Quest VR experiences look like. It includes Spatial Anchors, which let virtual objects persist across sessions in a space. So if you place a VR pet bunny on your coffee table, it should still be there every time you log back into the app. Additionally, there's a Scene Understanding feature, which helps developers get a better sense of your physical space. A character talking to you in VR could, for example, wander around your living room without bumping into furniture.
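Conceptually, spatial anchors boil down to saving a pose under a stable ID and reloading it on the next launch. Here's a minimal Python sketch of that persistence pattern; the function names and file format are invented for illustration and aren't part of Meta's actual Insight SDK:

```python
# Hypothetical sketch of spatial-anchor persistence -- names are invented
# for illustration, NOT Meta's actual Insight SDK.
import json
import uuid

ANCHOR_FILE = "anchors.json"

def save_anchor(position, label):
    """Store an anchor's pose under a stable ID so it survives restarts."""
    anchor = {"id": str(uuid.uuid4()), "position": position, "label": label}
    try:
        with open(ANCHOR_FILE) as f:
            anchors = json.load(f)
    except FileNotFoundError:
        anchors = []
    anchors.append(anchor)
    with open(ANCHOR_FILE, "w") as f:
        json.dump(anchors, f)
    return anchor["id"]

def load_anchors():
    """On the next session, reload every saved anchor and respawn content."""
    try:
        with open(ANCHOR_FILE) as f:
            return json.load(f)
    except FileNotFoundError:
        return []

# Place a pet bunny on the coffee table once...
save_anchor(position=[0.4, 0.45, -1.2], label="pet_bunny")
# ...and it comes back every time the app relaunches.
for anchor in load_anchors():
    print(f"Respawn '{anchor['label']}' at {anchor['position']}")
```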

When it comes to augmented reality, Facebook also has plenty of upgrades in store for its Spark AR platform. For one, it's planning to launch an iOS app called Polar that'll let people design their own AR effects and objects without any coding. It's aimed at creators, who could use it to build unique 3D signage or makeup effects that their followers can apply. More experienced devs will also be able to create Geo-anchored objects, which are tied to specific locations in the real world, as well as AR effects that track your hands and body. They can also try building AR effects for group video calls in Messenger, something that'll eventually be supported in other apps.

Like Microsoft with HoloLens and HTC with the Vive, Facebook plans to make a bigger push into enterprises with Quest for Business. It's a way for employees to log into Quest 2 headsets with secure work accounts (it's probably not great for your boss to see how often you're playing Beat Saber, after all). Since they're meant for office environments, IT departments will also be able to manage work accounts and specific devices, and integrate their own security features. The key is that it's all going to be accessible on consumer-grade Quest 2 headsets, so Facebook won't have to make entirely new hardware for work environments.

The company plans to take it slow with Quest for Business. It's being tested with a few companies now, and a wider beta is expected next year. At this point, Facebook isn't planning to officially roll it out to every company until 2023. Quest for Business will replace the previous Oculus for Business program, which required a special $799 Quest 2 headset.

Facebook already showed off one way remote meetings could be improved with Horizon Workrooms, and that app is gaining customizable workrooms later this year. And when it comes to productivity, the company is also opening up the Oculus Store to 2D apps like Slack, Dropbox, Instagram and Facebook. You'll be able to dive into those apps right from your Horizon Home screen. It's convenient, but it's also a cheeky way to keep you from taking off your headset just to answer a Slack message.

Not everyone would want to spend a whole workday wearing a VR headset, but it's not hard to imagine how future AR glasses could let you dive into Slack and Office apps just about anywhere. They'll just be 2D projections floating around you, things that nobody else would be able to see. That may seem like science fiction today, but 15 years ago, so did the idea of having a touchscreen-enabled supercomputer in your pocket with blazing fast wireless internet. 

As Zuckerberg sees it, the metaverse will ultimately lead to a more natural relationship with technology. "It's not about you spending more time on screens," he told press before making a hasty retreat. "It's about making the time we spend better and I think you know screens can't really convey the full sense of presence."

Apple lets devs promote in-app events on the App Store

As promised at WWDC earlier this year, Apple today will start letting developers highlight their in-app events on the App Store. You'll need iOS 15 and iPadOS 15 to see the event listings, and they work as you'd expect, allowing you to see seasonal competitions, livestreams and more. It's a pretty straightforward feature, but it's the sort of thing that could help developers get more people to install and use their apps.

Hopefully, developers won't rely on it as a spammy way to rack up engagement. Many mobile gamers would love to know when they can log on for special item drops, for example. And if you encounter a particularly intriguing upcoming event, you can create a notification or calendar reminder for when it begins. You'll also be able to share events to get your friends on board.

ESA will try to fetch data from China's Mars rover with a new method: listening

Next month on Mars, the ESA and China's National Space Administration (CNSA) will try something that's never been attempted before in space: sending data from a rover on a planet's surface to an orbiter it can't receive any messages from. Specifically, China's selfie-taking Zhurong rover, which has been on the Red Planet since May, will try to shoot data over to the ESA's Mars Express orbiter.

As the ESA explains, Zhurong can't actually receive any communications from Mars Express, due to a radio incompatibility. That means it can't hear the hail signal sent from the orbiter, which is typically what a rover waits for before it starts sending out data. Instead, next month the two agencies will attempt a method that's previously only been tested on Earth. During five tests, Zhurong will send a signal blindly into space, and Mars Express will listen for that signal and any potential data.

"If [Mars Express] detects the magic signal, the radio will lock on to it and begin recording any data," ESA's Josh Tapley writes. "At the end of the communication window, the spacecraft will turn to face Earth and relay these data across space the same way it does for other scientific Mars missions. When the data arrive at ESOC, they will be forwarded on to the Zhurong team for processing and analysis."    

It's not unusual for rovers to send data to a foreign orbiter — that's commonly been seen as a smart backup method — but this test opens the door for communication between incompatible systems. That'll be useful if China has any issues with its Tianwen-1 orbiter down the line, or if the US and other countries need help in turn.

Intel's hybrid 12th-gen chips are a major strike against AMD

We've been hearing about Intel's powerful hybrid chips for so long, they've achieved almost mythical status. The idea behind them is intriguing: they feature both performance cores (P-cores) and efficient cores (E-cores) on a single die, giving you chips that can be beefy or a bit more power-conscious, depending on the task. Previously, all of Intel's CPU cores were pretty much the same, which led to the energy-hungry designs we've seen over the last few years.

Now the company is ready to launch those chips, previously codenamed "Alder Lake," as its 12th-gen desktop processors. And maybe, just maybe, it'll be able to steal the spotlight back from AMD and Apple.

In addition to their hybrid configuration, these 12th-gen chips are also the first built on the "Intel 7" process, which is essentially a refined version of the company's 10nm technology. When Intel revised its product roadmap in July with new names, it seemed to just be steering us away from its 7nm delays. But the performance of these 12th-gen chips may be enough to justify the new branding.

Intel is throwing some major numbers around: it says 12th-gen chips are up to 19 percent faster than 11th-gen CPUs overall, and they're twice as fast in the Adobe After Effects Pulse benchmark. When it comes to multithreaded performance (tasks built specifically for more than one core, like video and 3D rendering), the company claims the top-end i9-12900K is 50 percent faster than last year's 11900K while using less power. And even better, it can achieve performance parity while using only around a quarter of the power. Basically, everyone who held off on upgrading over the last few years is in for a treat, as these chips promise to be a big leap forward.

Intel's 12th-gen Core chips pack up to 16 cores on the i9-12900K. That's a combination of eight P-cores and eight E-cores, with a total of 24 processing threads (every P-core counts double, since they support hyper-threading, but the E-cores don't). Given that this is an entirely new way of designing its chips, the company also worked with Microsoft to develop a new Thread Director, which intelligently assigns tasks to the appropriate core. That way you don't have to manually assign a background thread to an E-core, or start mucking about with your settings once you start working on concurrent tasks. (If the hybrid core design seems familiar, it's because ARM has been pushing something similar for the past decade with its big.LITTLE technology on mobile CPUs.)
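The core-to-thread arithmetic is easy to verify yourself:

```python
# i9-12900K topology per Intel: P-cores are hyper-threaded, E-cores aren't.
p_cores, e_cores = 8, 8
total_cores = p_cores + e_cores
total_threads = p_cores * 2 + e_cores
print(f"{total_cores} cores, {total_threads} threads")  # -> 16 cores, 24 threads
```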

Intel claims P-cores can perform up to 28 percent faster than its 10th-gen Comet Lake S chips in single-threaded performance. The E-cores, meanwhile, are just as fast as the 10th-gen hardware on their own. As you'd expect, these chips shine best when you're throwing serious workloads at them. Intel says the i9-12900K can get up to 84 percent higher framerates while playing Mount and Blade II and streaming over OBS, compared to the previous generation chip. Similarly, it's 47 percent faster while multi-tasking with Adobe Lightroom Classic and Premiere Pro.

Intel's figures sound impressive when compared to its own hardware, but the company also noted that its Ryzen benchmarks were run before AMD and Microsoft deployed Windows 11 updates to fix some performance bugs. At the time of its testing, the i9-12900K appeared to have a commanding lead over the Ryzen 5950X in many games, like Troy: A Total War Saga and Grid 2019. But it'll be interesting to see what those numbers look like now. And of course, AMD could easily come back with speedier hardware of its own early next year.

The new 12th-gen chips are also looking towards the future, with support for up to 16 lanes of PCIe 5.0 and DDR5-4800 RAM. Intel's new 600-series chipset will feature PCIe 4.0 support, integrated WiFi 6E, and an updated Direct Media Interface (DMI) that doubles the bandwidth between the chipset and the processor. There's also support for up to four USB 3.2 Gen 2x2 ports, as well as far more USB 3.2 Gen 2 connections in general.

As you'd expect, it'll cost a bit more to jump into Intel's 12th-gen chips. The Core i9-12900K will go for $589, compared to the 11900K's $539 to $549 price range. You could always save $20 or so by getting the "KF" chip without onboard graphics, but in general I'd recommend keeping integrated graphics around in case your GPU gets fried. The more approachable Core i7-12700K, thankfully, hasn't budged from its predecessor's $409 price, while the Core i5-12600K is around $20 more than before if you want onboard graphics.

The real question for Intel is how this new hardware stacks up against what AMD and Apple have coming. Benchmark leaks suggest that the i9-12900K is faster than Apple's M1 Max chip, but that's also a power-sipping laptop part. A faster, desktop-focused chip from Apple would likely leave Intel lagging behind again. Still, this uncertainty is a good thing for the PC industry as a whole. Now we've got several companies producing powerful processors. Their attempts to one-up each other should ultimately be a very good thing for consumers.

MacBook Pro 14-inch and 16-inch review (2021): Apple’s mighty Macs

Apple is finally restoring balance to its portable lineup with the new 14-inch and 16-inch MacBook Pros. If you wanted a big-screen Mac notebook for video editing over the past year, you were stuck paying a premium for outdated Intel and AMD hardware. So, we've been eagerly awaiting an M1 upgrade for the 16-inch MacBook Pro, a machine I called Apple's best laptop ever when it debuted two years ago.

But it's worth remembering that, for all the hype around Apple's M1 chip last year, it was a letdown for creative professionals. It just couldn’t handle the kinds of heavy-duty video editing and 3D rendering that they demanded, in part due to being capped at 16GB of RAM. That made the 13-inch MacBook Pro a bit of an odd duck, since the Air was nearly as fast.

Apple's redesigned MacBook Pros, powered by its new M1 Pro and M1 Max chips, are exactly what media professionals have been waiting for. The processors are far faster than last year's M1, they support up to 64GB of RAM, and both laptops feature XDR display technology borrowed from the iPad Pro. But Apple also looked backwards as it stepped forward, restoring ports and adopting a design that resembles many of its older machines. Just call them PowerBooks, reborn.

What's new

Apple isn’t currently planning to replace the 13-inch model with the MacBook Pro 14. It’s more an expansion of the highest-end model. It can do almost everything the 16-inch model can; it’s just smaller. (The only exception is "High Power Mode," which gives the 16-inch M1 Max version a temporary speed boost.) That's one way I've come to terms with the high $1,999 starting price. The bigger model now starts at $2,499, $100 more than the Intel version.

Both notebooks still look like MacBook Pros, with sleek unibody aluminum cases. But lean in a bit closer and you'll notice some retro flourishes. They're slightly thicker, with more bulbous edges that hearken back to Apple's notebooks from the 2000s. They're also heavier than you'd expect: the 14-inch model comes in at 3.5 pounds, while the 16-inch varies between 4.7 and 4.8 pounds, depending on the chip you choose. That's about half a pound heavier than the last 16-inch MacBook Pro.

All of that heft isn’t for naught, though. In part it allowed Apple to cram in a lot more ports. Joining three Thunderbolt 4 USB-C connections are a full-sized HDMI port, a MagSafe power connection, a high-impedance headphone jack and an SD card reader (cue triumphant horns). Sure, you’ll still need adapters to connect older USB Type-A devices, but at least you can offload photos and video without extra gear. You can still charge the notebooks over USB-C — always useful in a pinch — but the MagSafe connection is less likely to cause accidental falls and you won't have to use a precious USB-C port just to stay powered up.

Looking at the MacBook Pro's screens makes it clear they're anything but retro, though. They feature 14.2-inch and 16.2-inch Liquid Retina XDR displays, respectively. Mini-LED backlighting lets them reach up to 1,600 nits of peak brightness, which is great for HDR content. The screens are a sharp 254 pixels per inch, with a 3,024 by 1,964 resolution on the 14-inch and 3,456 by 2,234 on the 16-inch. Neither is true 4K (the 16-inch comes close), but you'll still be able to work on 4K and 8K video, just at a reduced scale.
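Those density figures check out against the stated resolutions and screen sizes, since pixel density is just the diagonal pixel count divided by the diagonal length in inches:

```python
from math import hypot

# ppi = diagonal pixels / diagonal inches
for name, w, h, diag_in in [("14-inch", 3024, 1964, 14.2),
                            ("16-inch", 3456, 2234, 16.2)]:
    print(f"{name}: {hypot(w, h) / diag_in:.0f} ppi")  # both work out to ~254 ppi
```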

Best of all is that the MacBook Pros support ProMotion, Apple's technology that enables refresh rates up to 120Hz. With that flipped on, scrolling through web pages and documents just felt silky smooth. And after spending hours writing up this review, I definitely noticed that my eyes were less fatigued thanks to the speedy refresh rate. This is becoming more common in the laptop world. Microsoft already beat Apple to the punch by putting a 120Hz screen in the Surface Laptop Studio. ProMotion is also intelligent enough to lower the refresh rate when it makes sense, which goes a long way towards saving battery life.

Really though, you don't have to think about all of the technology going into Apple's Liquid Retina XDR displays. Just know that they look incredible, with eye-watering brightness in sunny HDR scenes and inky black darkness in night shots. These aren't OLED screens, but mini-LEDs get Apple pretty close to that level of contrast. 

Embracing the notch life

Now let's talk about the elephant in the room: that notch in the middle of the screen. Much like the last batch of iPhones, Apple carved out a portion of the display to fit in a camera, in this case a 1080p webcam. At first glance, it's almost laughable that Apple is leaning even more into a design element that everyone hates. But, honestly, the notch isn't a big deal.

Apple wisely pushed the MacOS menu bar around the camera, so it's really just taking up space that would go unused anyway. The menu bar also gets blacked out entirely whenever you put an app or video in fullscreen. You could also use a black wallpaper, which effectively hides the notch.

I'll happily give up a bit of screen real estate, though, if it means Apple can finally squeeze in a decent camera. And judging from the dozens of video calls I've been on over the past week, it's a huge upgrade. There's a clear leap forward in resolution, sharpness and detail compared to my 2017 MacBook Pro. And it definitely looks better than the M1 MacBook Air, which had a few tweaks, but was still stuck at 720p. It would have been nice to see Face ID on the MacBook Pro, though, which would have brought it on par with Windows Hello-equipped PCs. For now, you'll still have to rely on the Touch ID sensor on the power button.

In use

Our 14-inch review model was equipped with an M1 Pro sporting 16 GPU cores, 32GB of RAM and a 1TB SSD. The 16-inch had the 32 GPU core M1 Max, a 2TB SSD and a whopping 64GB of RAM. (This, by the way, marks the first time I've reviewed a laptop with that much RAM.) Both of Apple's new chips also feature 10 CPU cores — for comparison, the M1 had eight CPU cores and eight GPU cores.

Geekbench 5 CPU (single/multi) | Cinebench R23 (single/multi) | Disk speed (top reads/writes)

Apple MacBook Pro (14-inch, 2021): 1,767 / 11,777 | 1,515 / 12,118 | 5.1 GB/s / 5.8 GB/s

Apple MacBook Pro (16-inch, 2021): 1,783 / 12,693 | 1,524 / 12,281 | 5.1 GB/s / 6.2 GB/s

Apple MacBook Pro (13-inch, 2020, Apple M1): 1,696 / 7,174 | 1,492 / 7,467 | 3 GB/s / 3 GB/s

Dell XPS 15 (Intel i7-11800H, RTX 3050 Ti): 1,536 / 7,551 | 1,506 / 9,453 | 2.8 GB/s / 2.6 GB/s

Razer Blade 14 (AMD Ryzen 9 5900HX, RTX 3080): 1,443 / 7,226 | 1,461 / 11,502 | 3 GB/s / 2 GB/s

Just based on specs, I expected to see some wild performance improvements. And the benchmarks didn't disappoint: According to Geekbench 5, both MacBook Pros blew away every Windows PC we reviewed this year by a significant margin. That includes the NUC 11 Extreme powered by Intel's Core i9-11900KB, a high-end desktop CPU! The single-core performance on the M1 Pro and Max was similar to the M1, which is unsurprising. But the multi-core figures were far higher. Another nice plus: Both of these computers are equipped with very fast NVMe SSDs, which will be a huge help when working with large projects.

The GPU-heavy Geekbench 5 Compute score made it clear that Apple hasn't completely surpassed the likes of NVIDIA and AMD, though. The 14-inch MacBook Pro was more than twice as fast as the Surface Pro 8 (running Intel's Xe graphics) and the M1-equipped 13-inch MacBook Pro, while the bigger notebook was on par with the Surface Laptop Studio equipped with an NVIDIA 3050 Ti. These aren't bad scores, but they make it clear that users who need serious power for 3D rendering or data crunching may be better off with PCs equipped with dedicated GPUs.

I don't think those are the people Apple is trying to court, though. Instead, the 14 and 16-inch MacBook Pros seem like an attempt to get back in the good graces of audio and video producers. Apple's new chips will certainly be more than enough for dealing with media. Both computers managed to convert a 4K video clip to 1080p in 34 seconds with Apple's VideoToolbox hardware encoder, which is four seconds slower than the NUC 11 Extreme, but four seconds faster than the XPS 15. They were also among the fastest Cinebench R23 performers we saw this year — only the ASUS ROG Strix G15 gaming laptop bested them.

And before you ask, no, the M1 Pro and M1 Max don't magically turn these computers into gaming rigs. Sure, everything on Apple Arcade runs smoothly, but that was true of the MacBook Air. When I tried to load Borderlands 3, one of the few semi-recent games that actually works on Macs, I just got an unplayable mess running below 30fps. Maybe the guarantee of decent GPUs will encourage more game developers to build for Macs, but more likely they'll just end up making sure their iPhone and iPad games run smoothly.

Surprisingly, I didn’t notice a huge difference in performance when I was running the MacBook Pros on battery. PCs often slow down dramatically whenever they’re disconnected from a socket, but not so with these notebooks. That’s useful if you need to encode something remotely and you’re willing to sacrifice battery life to get it done.

Don’t worry, though: the efficiency of Apple’s ARM-based chip design leads to great battery performance. The 14-inch MacBook Pro lasted 12 hours and 35 minutes in our benchmark, while the 16-inch went for 16 hours and 34 minutes. That’s over five hours longer than the last Intel model.

It’s clear that Apple listened to many of the complaints from Mac fans (and perhaps even lowly reviewers). But really, that’s something the company has been doing more over the years, like when it finally moved away from those awful butterfly keyboards to more tactile Magic Keyboards. By the way, typing on the 14 and 16-inch MacBook Pro remains excellent, and the trackpad is a dream to use, as always.

Apple didn't skimp on the audio front either. Both MacBook Pros sport a six-speaker array, made up of two tweeters and four force-cancelling woofers. Simply put, they sound miraculous. I normally just play a few songs on notebook speakers to confirm how disappointing they are. But these laptops sounded like I was listening to two small bookshelf speakers, with transparent vocals and punchy bass. Yes, I'm as shocked as you are.

The 16-inch MacBook Pro sounds a bit better, since it has room for slightly bigger drivers, but both notebooks are enough for a private jam session. The MacBook Pro's microphones also do a solid job of capturing speech during video calls. I wouldn't use them to record professional music, as Apple suggested you could, but they’re probably fine for a quick podcast session on the road.

Wrap-up

So if you’re in the market for one of these new computers, which should you get? If you’re mostly using it for general productivity tasks, then I’d lean towards the 14-inch model, which is just easier to travel with. It’s a good option for coders and people who may not need a ton of screen real estate. But all of the video and audio producers I’ve talked to were unequivocal: they wanted to go big with the 16-inch model.

The biggest downside of the new MacBook Pros is their high prices — but really, what else do you expect from Apple? The 14-inch model, in particular, will probably give you the vapors if you're comparing it to the 13-inch MacBook Pro. But in the world of gaming and high-end productivity notebooks, Apple's pricing isn't that far off. The Razer Blade 14, for example, is just $200 less than the 14-inch MacBook Pro. Dell's XPS 15 OLED is around $500 cheaper than the cheapest 16-inch MBP, but that's with a CPU that's much slower than Apple's. Once you start speccing that machine up, though, you’ll likely pay close to $2,500.

On the whole, these computers have practically everything we’d want in a powerful notebook. If you're a creative professional with a large budget for a new computer, and you want something that'll genuinely speed up your workflow, the new MacBook Pros are exactly what you need.

‘Dune’ is too big for your TV

The real world just felt too small when I stepped out of Denis Villeneuve's Dune. There weren't any enormous spaceships ready to rocket off to planets in distant galaxies. No Brutalist palaces amid endless desert vistas. No building-sized sandworms roaming about, eager to devour anyone who disturbed them. Just me and traffic on Atlanta's I-285.

This latest Dune adaptation isn't perfect — it's at times emotionally empty, and it's basically set up for a second movie we may never see — but it successfully transported me to the universe Frank Herbert created over half a century ago. The film focuses on half of the novel, telling the story of Paul Atreides (Timothée Chalamet), a sheltered duke's son who moves to the desert planet of Arrakis. It's an important post, since it's the only world that produces the melange, or spice, which powers interstellar travel. But as Paul quickly learns, it's also a dangerous place for his elite family, and it's where he discovers he may be a potential messiah. You know, typical teen boy stuff.

After being wowed by Dune in the theater, I plan to rewatch it at home on HBO Max, where it's also being released today. But I'm certain the experience won't be the same, even on my 120-inch projector screen. This Dune demands to be seen on something even bigger — a place where your very sense of being can be dwarfed. Dune made me feel like Paul Atreides standing in front of a skyscraper-sized sandworm, waiting to be consumed. And I welcomed it.

Of course, it's no simple thing to trek out to the cinema these days, not with coronavirus still raging and fellow theatergoers refusing to take basic safety precautions. (The vaccines are safe. Masks work. Please protect yourself and others.) But if you can manage to safely see it in theaters — perhaps by renting out a private screen with friends — you'll be reminded of what makes that experience so special. I watched it in the second row of a fairly typical multiplex theater, and it still floored me. I can only imagine what it would be like on a full-sized IMAX screen, which can reach up to 98 feet tall.

Dune is at its best when Villeneuve and cinematographer Greig Fraser let you soak in the vistas, the regal-yet-alien costumes and the wealth of background details. It's pure visual world-building. At one point, a character's eyes briefly flash white when he's asked to compute the cost of an imperial envoy's trek through the stars. It's never explained, but you get it. This style of slow burn sci-fi isn't for everyone, but if you enjoyed Arrival or Blade Runner 2049, Villeneuve's previous genre forays, there's a good chance you're primed for this brand of storytelling.

Even before I saw anything on the screen, though, I felt Dune in my gut. As I waited for my screening to begin, an alien voice began speaking out of nowhere, sounding like it came entirely from the theater's subwoofers. It posed a question about the power of drums, but really, it was as if the movie was saying, "Sit up, pay attention, you're not on Earth anymore."

The film's inventive sound design doesn't stop there. Everything you hear — from the roar of spaceships as they take off, to the buzz of dragonfly-like vehicles as they flap their wings, to the sphincter-clenching bellow of the sandworms — is meticulously crafted to make you believe it's all real. Hans Zimmer's score doesn't tread too far from his Gladiator vibe, but does a fine job of making everything sound epic. (And yes, I was blasting it down the highway as I sped back home.)

Don't take my praise for this movie as disrespect towards David Lynch's 1984 Dune. That was a troubled production that's since attained cult status, but it was hampered by meddling producers and a script that tried to cram in the entire novel. Villeneuve's approach is more confident and, as you'd expect, is backed by far more capable visual effects technology. Even though it runs for two hours and 35 minutes, I could have easily given up another three hours to watch the rest of the story.

Unfortunately, there's a chance we won't see that conclusion. Warner Bros. originally agreed to let Villeneuve tell the story in two parts (this movie's title card says "Dune Part 1"), but the follow-up still hasn't been officially greenlit. The director told Variety that his plan to shoot both parts at once was denied — he expects to hear more from the studio once we see how Dune performs in theaters and on HBO Max. Plans for a prequel TV series, Dune: The Sisterhood, are still in the works with Villeneuve attached to produce.

As epic as Dune is, it's a shame that its scope couldn't fit in actors from the Middle East and North Africa (MENA), cultures that Herbert was clearly inspired by. The film almost goes out of its way to diminish any Islamic influence from its story (instead of Jihad, there are references to a crusade). That's particularly egregious when we see the locals of Arrakis, the blue-eyed sand dwellers known as the Fremen, who are often portrayed as noble savages. At least the film begins with the Fremen perspective: Chani, played by Zendaya, wonders aloud who their next oppressors will be.

All of this is to say, if you can make it to the theater to see Dune, you should. You can still capture some of its immensity by watching it up close: Pull a chair right up to your TV, or veg out with a laptop as close to your eyeballs as possible. But Dune is a story that hinges on the power of dreams, so it’s almost fitting that it’s best experienced when it overwhelms your reality.

Engadget Podcast: Apple’s new MacBook Pros, the Pixel 6 and the Surface Duo 2

Techtober continues with a deep dive into Apple’s latest MacBook Pros, powered by the new M1 Pro and M1 Max chips. Cherlynn and Devindra also chat about what’s new with the Pixel 6, and Mr. Mobile himself (Michael Fisher) joins to break down the Surface Duo 2. It turns out Microsoft needed more than a year to fix all of the problems with its dual-screen phone.

Listen below, or subscribe on your podcast app of choice. If you've got suggestions or topics you'd like covered on the show, be sure to email us or drop a note in the comments! And be sure to check out our other podcasts, The Morning After and Engadget News!



Topics

  • Apple’s new MacBooks with M1 Pro and M1 Max – 1:37

  • Google finally details Pixel 6 and Pixel 6 Pro’s specs – 23:22

  • Microsoft’s Surface Duo 2 is inconsistent and buggy – 38:41

  • Facebook may be changing its name – 1:04:05

  • Facebook Portal Go Review – 1:05:05

  • Finally, you can post to Instagram from desktop – 1:06:02

  • Samsung had yet another Unpacked event – 1:06:23

  • Also in events: Razer, DJI – 1:07:35

  • We have a trailer for the Uncharted movie – 1:07:56

  • Mel Brooks is doing History of the World: Part II for Hulu – 1:09:19

  • Fisher-Price made a version of its toy phone that actually makes calls – 1:10:14

  • Working on – 1:11:25

  • Pop culture picks – 1:12:26


Credits
Hosts: Cherlynn Low and Devindra Hardawar
Guests: Michael Fisher
Producer: Ben Ellman
Livestream producers: Julio Barrientos, Luke Brooks
Graphics artists: Luke Brooks, Kyle Maack
Music: Dale North and Terrence O'Brien

Google details the Pixel 6's unique Tensor chip

Google was all too excited to unveil Tensor, its first system-on-a-chip, in August. We knew it would be powering the Pixel 6 and 6 Pro, and much like Apple's A-series mobile chips, it was an attempt at tying together Google's software with some custom-tuned hardware. In particular, Google positioned Tensor as something of an AI powerhouse, giving its new phones better hardware for image processing and voice recognition. Now, we know exactly what makes Tensor tick.

Google's SoC is a 5nm eight-core design broken down into big, medium and small cores. Leading the way are two ARM Cortex-X1 cores running at 2.8GHz. That's notable since Qualcomm's flagship Snapdragon 888 chip, which powers Samsung's Galaxy S21 and many other high-end phones, only has a single X1 core. It'll be interesting to see just how much faster Tensor is in comparison.

Below that, the SoC also features two Cortex A76 cores running at 2.25GHz, as well as four 1.8GHz A55 cores as the "small" bits. Thankfully, Google didn't skimp on graphics: the Tensor also has a Mali-G78 graphics core, which you'll also find on other flagship Android phones. 
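Put together, the CPU cluster Google describes looks like this (a simple summary sketch using the figures above):

```python
# Tensor's three-tier CPU cluster, per Google's disclosed specs.
tensor_cpu = [
    ("big",    "Cortex-X1",  2, 2.80),  # tier, core, count, clock in GHz
    ("medium", "Cortex-A76", 2, 2.25),
    ("small",  "Cortex-A55", 4, 1.80),
]

print(f"{sum(count for _, _, count, _ in tensor_cpu)}-core SoC")  # -> 8-core SoC
for tier, core, count, ghz in tensor_cpu:
    print(f"  {count}x {core} @ {ghz}GHz ({tier})")
```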

All of the talk of custom hardware may bring to mind the ill-fated (but gloriously inventive) Moto X, Google's 2013-era attempt at building a smarter smartphone. It wasn't the most powerful mobile around, but its always-on voice commands were a decent step towards ambient computing, something Google is still focused on today with the Tensor chip. 

The new SoC allows the Pixel 6 to translate videos and messages quickly with its Live Translate feature, and it'll be smarter about recognizing your voice as well. That should be particularly helpful when it comes to using your voice to type, edit and send messages. Overall, Tensor will perform around 80 percent faster than the Pixel 5's chip, according to Google. That's a lofty figure, so we'll definitely be testing the Pixel 6 heavily to confirm those numbers.

Additionally, Google says Tensor also gives the Pixel 6 an extra layer of security. It'll work together with the Titan M2 chip in the phone to protect against malware and other potential attacks. That's a good step forward for Google, and we're hoping to see more security hardware in other Android phones down the line.

At this point, Tensor appears to offer everything we'd want in a new mobile chip: fast speeds and plenty of forward-thinking AI features. It could eventually make the Pixel phones Google's true iPhone equivalent: flagship hardware that dances in concert with a custom mobile chip. (And if Google is truly successful, maybe Tensor could make its way over to devices from other companies.)

Developing...

Catch up on all the latest news from Google's Pixel 6 event!

M1 Pro and M1 Max are Apple's high-end Mac chips

It's been almost a year since Apple unveiled its first custom chip for Macs, the ARM-based M1. As we saw in our review of the latest MacBook Air, MacBook Pro and colorful iMac, the M1 was a marvel, proving to be faster than Intel and AMD's x86 processors while drawing far less power. Now to follow up, Apple is taking a two-pronged approach with M1 Pro and M1 Max, two chips that'll power the company's new MacBook Pro models.

Both chips are 10-core processors, a combination of eight high-performance cores and two high-efficiency units. What separates them are their GPU capabilities: the M1 Pro has up to a 16-core GPU, while the Max tops out at 32 graphics cores. In comparison, last year's M1 was an eight-core chip that maxed out with eight GPU cores.

Based on these specs, power users will see a much bigger performance upgrade by going for a MacBook Pro. Last year's M1-equipped 13-inch MacBook Pro wasn't much faster than the M1 Air — the Pro basically added a fan for more sustained workloads, whereas the Air was miraculously fan-less. That was an odd situation for Apple; it was both a testament to the power of Apple Silicon, and a sign that the company needed to devote more time to its powerful machines. (Don't forget, the 16-inch MacBook Pro was practically forgotten over the past year.)

Developing...

Follow all of the news from Apple’s Mac event right here.

AMD Radeon RX 6600 review: The opposite of future-proof

When AMD announced the Radeon RX 6600 XT a few months ago, it was positioned as the ideal 1080p gaming card, with the potential to offer decent 1,440p performance in certain games. Now there's the lower-tier RX 6600, and the story is pretty much the same — except, you know, worse. The existence of an "XT" card always implied a more mainstream version would arrive eventually. But after testing out the RX 6600 for the past week, I'm still wondering who this card is for.

Of course, that's a tough question to answer when the GPU market is so volatile and card prices vary wildly. AMD says the RX 6600's suggested retail price is $329, compared to $379 for the 6600 XT. But given the global chip shortage and resellers hungry for more GPUs, those prices are purely conceptual. In the real world, the 6600 XT now sells for upwards of $600 (and in some cases close to $800!). The 6600 is also competing against the RTX 3060, which also has an MSRP of $329, but is now selling between $800 and $1,020. So much for budget GPUs.

AMD, a company with a reputation for creating budget-friendly cards that packed a decent punch, probably wanted to stay true to its roots. But unless it can guarantee a price close to MSRP, the RX 6600 just seems out of place in today's gaming landscape. As you'll see in our testing, it's a capable 1080p gaming card. But its ray tracing performance is terrible, and it can't take advantage of NVIDIA's DLSS technology, which uses AI to boost performance.

I'll be honest: I didn't expect much from the RX 6600 from the start. Under the hood, its RDNA 2 architecture is powered by 28 compute units and 1,792 stream processors, a noticeable step down from the 6600 XT's 32 CUs and 2,048 stream processors. There's also a serious speed difference: the cheaper card has a 2,044 MHz game clock and a 2,491 MHz boost, compared to the 6600 XT's 2,359 MHz game clock and 2,589 MHz boost. Both cards have 8GB of GDDR6 RAM, but the 6600's memory bandwidth is 32 GB/s slower at 224 GB/s.
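For context, RDNA 2 packs 64 stream processors into each compute unit, and GDDR6 bandwidth follows directly from memory speed and bus width. Here's a quick back-of-the-envelope check in Python, assuming the cards' published 14 Gbps and 16 Gbps memory speeds on a 128-bit bus:

```python
# Stream processors: RDNA 2 has 64 per compute unit.
SP_PER_CU = 64
for card, cus in [("RX 6600", 28), ("RX 6600 XT", 32)]:
    print(f"{card}: {cus} CUs -> {cus * SP_PER_CU} stream processors")

# Memory bandwidth (GB/s) = data rate (Gbps) * bus width (bits) / 8
for card, gbps in [("RX 6600", 14), ("RX 6600 XT", 16)]:
    print(f"{card}: {gbps * 128 / 8:.0f} GB/s")  # 224 vs. 256, a 32 GB/s gap
```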

3DMark Time Spy | Destiny 2 | Hitman 3 | Port Royal (ray tracing)

AMD Radeon RX 6600: 8,521 | 1080p: 110-120fps; 1440p: 75-85fps | 1080p: 138fps; 1440p: 94fps | 3,846 / 17fps

AMD Radeon RX 6600 XT: 9,872 | 1080p: 130-150fps; 1440p: 85-105fps | 1080p: 146fps; 1440p: 110fps | 4,582 / 32.22fps

AMD Radeon RX 6700 XT: 11,198 | 1440p: 75-100fps; 4K: 50-75fps | N/A | 5,920 / 27.4fps

NVIDIA RTX 3060 Ti: 11,308 | 1440p: 85-110fps; 4K: 45-60fps | N/A | 6,989 / 32.36fps

Given those specs, I predicted the RX 6600 would be a decent 1080p card and not much else. And for the most part, that's what my testing proved: It reached a solid 120fps in Destiny 2 while playing in 1080p with maxed out graphics. Once I pushed the game to 1,440p, though, it fell to 80fps. That pattern held true for pretty much everything I tested. Hitman 3's benchmark reached a respectable 138fps in 1080p with graphics settings cranked to the maximum, but only 94fps in 1,440p.

If you've got an AMD Ryzen 5000 CPU (or some 3000 models), the RX 6600 will be a slight upgrade thanks to Smart Access Memory. That's a feature that basically lets your CPU directly address all of your video card's RAM, and it's something you can't use at all if you've got an Intel CPU. I had SAM enabled on my testing rig, which was powered by a Ryzen 7 5800X and 32GB of RAM, in case you were wondering.

Both the Radeon RX 6600 and 6600 XT had a hard time competing against their NVIDIA counterparts in our benchmarks. The RTX 3060 Ti reached 11,308 in 3DMark Time Spy, whereas the 6600 XT hit 9,872 and the 6600 trailed behind with a score of 8,521. I didn't have an RTX 3060 on-hand to test, but 3DMark's verified benchmarks with similar systems show scores of around 10,000.

Ray tracing was also a lost cause with the 6600 — just flipping on ray-traced reflections in Control slowed the game to a meager 35fps. Without ray tracing, it was at least playable in 1080p, hitting between 60 and 70fps. Now, Control is a notoriously tough game on GPUs, but at least NVIDIA cards let me get decent framerates with ray tracing thanks to DLSS, which uses AI processing to upscale the game from a lower resolution. AMD's alternative, FidelityFX Super Resolution, isn't supported in Control yet. That solution is also cross-compatible with NVIDIA cards, so you won't need a Radeon GPU to take advantage of it — which again makes me wonder: why would you get the RX 6600 instead of the 3060?
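The math behind that kind of upscaling is simple: render fewer pixels internally, then reconstruct the output resolution. Using FSR 1.0's published per-axis scale factors as an example:

```python
# FSR 1.0 per-axis scale factors for each quality mode.
FSR_MODES = {"Ultra Quality": 1.3, "Quality": 1.5, "Balanced": 1.7, "Performance": 2.0}

out_w, out_h = 1920, 1080  # 1080p output
for mode, scale in FSR_MODES.items():
    w, h = round(out_w / scale), round(out_h / scale)
    print(f"{mode}: renders {w}x{h}, upscales to {out_w}x{out_h}")
    # e.g. Quality mode renders 1280x720 internally for a 1080p output
```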

I could see it being a worthwhile card for a very specific gamer: someone who has a small case and doesn't want to upgrade beyond a 450 watt power supply. The RX 6600 has a total board power demand of 132 watts, compared to 170 watts on the RTX 3060. That's a major reason why it runs so cool, reaching only around 70 degrees Celsius under load (the 3060 typically runs between 70 and 75 degrees Celsius when stressed). Still, I can't imagine that someone who wants to shell out $329 for a GPU (and realistically much more) would limit themselves based on a weak power supply.

A good PC upgrade is one that'll last you for years, and, unfortunately, I can't imagine that'll be true of the RX 6600. Solid 1080p performance is a nice feature to have today, but 1,440p monitors are getting cheaper and games are becoming more demanding. Who knows if the 6600 will be able to handle a flagship title in a few years, even at 1080p. And if you end up moving to a 1,440p, ultrawide or 4K screen down the line, you'll need a new GPU as well.

The Radeon RX 6600 could be a decent contender if the GPU market stabilizes and AMD pushes the price below $300. But for now, it’s a misfire that only makes sense if you can’t get your hands on an RTX 3060.