Posts with «author_name|devindra hardawar» label

Roku is finally building its own TVs

Roku TVs will finally live up to their name this year. At CES, the streaming device company announced that it'll be building its own smart TVs for the first time. When the Roku TV program debuted in 2014, it was a way for the company to bring its streaming software into TVs built by partners like TCL and Hisense. But now Roku is debuting its own family of HD and 4K sets ranging from 24 to 75 inches, which are set to arrive in spring.

Value appears to be the key, as the company says the TVs will range from $119 to $999. That should help Roku's partners rest easy — we've seen some sets, like the TCL Series 8, scale into premium $2,000 territory. The company isn't divulging many technical details about these TVs yet, but don't expect them to have some of the nicer features TCL and others are including, like super bright MiniLED panels. Still, Roku's sets may eat into the lower-end offerings from its partners.

Roku

Chris Larson, Roku's VP of retail strategy, tells Engadget that the company isn't trying to directly compete with existing partners; instead, it wants a bit more control over how some Roku TVs are produced. For example, Roku is bundling its voice remotes with all of its new sets, even the cheap HD models (Select Series TVs come with the Roku Voice Remote, while Premium Series sets include the rechargeable Voice Remote Pro). That's something the company couldn't push partners to do, especially when it came to budget TVs.

Down the line, Larson says the new TVs will also bring Roku closer to component suppliers, like the companies behind screen panels and the chips that power smart devices. That could help the company "drive innovation in the TV process." These new Roku TVs will work alongside Roku's existing home wireless speakers and other home theater equipment, just like partner offerings. But the company could potentially cook up some new features that are exclusive to its TVs — or at least, capabilities partners may not want to implement.

NVIDIA RTX 4070 Ti review: 3090 Ti power for $799

NVIDIA's new RTX 40-series GPUs are insanely powerful, but also wildly expensive. That's my big takeaway after reviewing the RTX 4090 and RTX 4080 — sure, they're fast, but who can justify spending over $1,000 on a video card? With the RTX 4070 Ti, which debuted at CES 2023, NVIDIA is offering a slightly more reasonable alternative. Starting at $799, it's still fairly pricey, but at least it's under $1,000. And best of all, it's in many ways better than last year's 3090 Ti, which initially cost a whopping $2,000.

After announcing two RTX 4080 cards a few months ago, NVIDIA surprised us all when it "unlaunched" the $899 12GB model. Given its much lower specs, there were plenty of complaints that it seemed a bit too expensive to be called a 4080. So now we've got the 4070 Ti at $100 less, with the same 7,680 CUDA cores and 12GB of GDDR6X memory that the 4080 was supposed to get. Sometimes, yelling at companies online gets results.

If you've got a small case, the 4070 Ti may also be the first RTX 40-series GPU you can actually use. Both the 4080 and 4090 were triple-slot behemoths that took up a significant chunk of my fairly roomy mid-tower case, whereas the 4070 Ti needs just two slots. It also requires far less energy than either of those cards, since it can run with a 700-watt PSU and has a maximum power draw of 285W. (The 4080 requires a 750W PSU, while the 4090 demands an 850W unit.) NVIDIA says the 4070 Ti uses around 49 percent less power on average than the 3090 Ti.

Given where it sits alongside the RTX 4080, the 4070 Ti performed exactly as I expected. It clocked in around 20 percent slower in 3DMark's TimeSpy Extreme benchmark, as well as the Geekbench 5 Compute test. It was also a full 30 fps slower while playing Halo Infinite in 4K with maxed-out graphics settings. Those numbers may sound disappointing, but I was ecstatic to see them. Sure, it's slower, but the 4070 Ti is keeping up fairly well with a card that's $400 more expensive (and in many cases, far more). That's something to celebrate!

GPU                     | 3DMark TimeSpy Extreme | Port Royal (Ray Tracing) | Cyberpunk          | Blender
NVIDIA RTX 4070 Ti      | 10,624                 | 14,163/66fps             | 4K RT DLSS: 78fps  | 7,247
AMD Radeon RX 7900 XTX  | 12,969                 | 14,696/68fps             | 4K FSR RT: 57fps   | 2,899
AMD Radeon RX 7900 XT   | 11,688                 | 13,247/61fps             | 4K FSR RT: 50fps   | 3,516
NVIDIA RTX 4080         | 12,879                 | 17,780/82fps             | 4K DLSS RT: 84fps  | 9,310
NVIDIA RTX 4090         | 16,464                 | 25,405/117.62fps         | 4K DLSS RT: 135fps | 12,335
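For what it's worth, the "around 20 percent slower" figure is easy to reproduce from the table above. Here's a minimal Python sketch of that arithmetic, using only the published scores (nothing here is NVIDIA tooling):

```python
# Quick sanity check on the relative-performance numbers quoted above,
# using the TimeSpy Extreme and Port Royal scores from the table.
scores = {
    "RTX 4070 Ti": {"TimeSpy Extreme": 10_624, "Port Royal": 14_163},
    "RTX 4080": {"TimeSpy Extreme": 12_879, "Port Royal": 17_780},
}

for test in ("TimeSpy Extreme", "Port Royal"):
    slower = scores["RTX 4070 Ti"][test]
    faster = scores["RTX 4080"][test]
    gap = (faster - slower) / faster * 100
    print(f"{test}: the 4070 Ti trails the 4080 by {gap:.1f} percent")

# TimeSpy Extreme: the 4070 Ti trails the 4080 by 17.5 percent
# Port Royal: the 4070 Ti trails the 4080 by 20.3 percent
```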

NVIDIA's DLSS 3 upscaling technology also proved to be incredibly useful once again. I reached a smooth 78fps in Cyberpunk 2077 while playing in 4K with ray tracing settings set to high. And if you need even more frames, you can always bump down to 1440p with DLSS 3, where I managed to reach 90 fps. The 4070 Ti also blew away the Radeon RX 7900 XT and XTX in Blender's benchmark, though that may have been due to unoptimized drivers on AMD's part.

If you've got a 1,440p monitor running at 120Hz or more, the 4070 Ti is clearly the more sensible purchase in NVIDIA's new family. Halo Infinite hit 165 fps while maxed out at that resolution, and I saw 130 fps in Control with graphics and ray tracing settings cranked up. And, it's worth noting, I didn't encounter any of the odd driver instability that crashed my system multiple times with the Radeon RX 7900 cards. The 4070 Ti was also twice as fast as both of those AMD GPUs in Control while using DLSS 3 and ray tracing in 1440p and 4K.

The ASUS TUF 4070 Ti I reviewed retails for $849, but I ran it at the same stock speeds as other $799 cards. The GPU reached 76C after hours of benchmarking and gaming — that's not as low as the 70C and below temperatures I was seeing on the 4080 and 4090, but those cards also had far more elaborate cooling.

As impressed as I am by the 4070 Ti, every prospective GPU buyer should know that NVIDIA's 30-series GPUs are still great! And while they don't have DLSS 3, they still have excellent DLSS 2 upscaling. Best of all, they're falling in price now that another generation of cards has arrived. You can easily snag 3060s for less than $400, while I've seen 3070s falling below $600 regularly. I'm sure we'll see a 4060 card later this year, but if you're in a rush, don't look down on older hardware.

Devindra Hardawar/Engadget

If anything, the RTX 4070 Ti is awful news for AMD. The Radeon RX 7900 XT and XTX are both faster GPUs in many benchmarks, but once you start enabling ray tracing, they practically crumble. And worst of all, they're $100 and $200 more, respectively. Personally, I'd rather have the power of DLSS and the stability of NVIDIA's hardware and software over the raw speed of those AMD cards.

While I miss the days of “reasonable” video card prices under $500, the 4070 Ti still feels like a dose of sanity. Unless you’re a high-level streamer or pro gamer, there’s little reason to spend four figures on a video card. $799, though? That’s doable. And if anything, it pushes the prices of other hardware down considerably. Even if you don’t buy the 4070 Ti, we should all be thankful it exists.

Alienware's new gaming laptops include an 18-inch beast

Alienware's gaming laptops are getting a new look, and one will even offer a massive 18-inch screen. Dell's gaming brand has been on a roll over the last few years with its bold "Legend" design language, which veered into a very sci-fi aesthetic, as well as its super-thin X-series notebooks. This year, Legend is being refined for a third time with a focus on "form, function and quality." Alienware's new machines still look strikingly different from most other gaming laptops, but now they're easier to hold and open, and, thankfully, there's no more glossy black plastic to be seen.

Perhaps the most striking example of Legend 3.0 is Alienware's new m18, its largest and most powerful gaming laptop yet. This beast sports an 18-inch screen that's 14.5 percent larger than the previous 17-inch model. Clearly, portability isn't the main concern here; power is. The m18 features all of the latest hardware, including Intel's fastest 13th-gen HX mobile processors and NVIDIA's latest GPUs, or you can alternatively configure it with an AMD Ryzen chip and next-gen Radeon graphics.
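As a rough illustration of where a figure like "14.5 percent larger" comes from, here's a small Python sketch that computes panel area from a diagonal and aspect ratio. The specific panel dimensions are my assumptions (an 18-inch 16:10 screen versus a 17.3-inch 16:9 one), not something Dell spells out here:

```python
# Back-of-the-envelope screen-area comparison. Assumed panel specs (not
# stated in the article): m18 = 18-inch 16:10, previous m17 = 17.3-inch 16:9.
from math import hypot

def screen_area(diagonal_in, ratio_w, ratio_h):
    """Area in square inches for a given diagonal and aspect ratio."""
    scale = diagonal_in / hypot(ratio_w, ratio_h)
    return (ratio_w * scale) * (ratio_h * scale)

m18_area = screen_area(18.0, 16, 10)
m17_area = screen_area(17.3, 16, 9)
print(f"m18: {m18_area:.1f} sq in, m17: {m17_area:.1f} sq in, "
      f"difference: {(m18_area / m17_area - 1) * 100:.1f} percent")
# Prints a difference of roughly 14 percent with these assumed dimensions,
# in the same ballpark as Dell's 14.5 percent claim.
```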

While it's just as thick as the previous m17, the 18-inch Alienware also crams in far better cooling, with a vapor chamber that covers the GPU and CPU, seven heat pipes, and four fans for 25 percent better airflow. You'll be able to add an optional Cherry MX mechanical keyboard and up to 9TB of storage, and choose between 165Hz quad-HD and 480Hz 1080p screens. With a total system power of 250 watts, the m18 is clearly meant for the most demanding gamers out there, people who won't mind lugging around a huge notebook. If you were intrigued by Alienware's massive Area 51m, its now-defunct laptop that used replaceable desktop components, the m18 may be a solid option.

Devindra Hardawar/Engadget

For the more sensible players out there, there's the smaller Alienware m16, which features the same design and similar cooling upgrades. That computer can be equipped with 16-inch quad-HD+ (2,560 by 1,600) panels running at 165Hz or 240Hz, or with a 480Hz FHD+ screen. Both m-series laptops feature NVIDIA G-SYNC and AMD FreeSync support, wide color gamuts and Dell's ComfortView Plus technology, which reduces harsh blue light.

If the Alienware m16 and m18 look a bit chunky compared to most other gaming laptops, that's pretty much by design: the m-series has always been the company's line for bigger and more powerful machines. If you want something slimmer and perhaps more fashionable, you'll have to look to the X-series notebooks. Last year we got the Alienware x14, which was impressively thin for all the power it held. This year, the company is introducing the x16, which, you guessed it, sports a 16-inch screen.

Alienware is calling the x16 its design highlight of the year, with a tall 16 by 10 display and a stronger all-metal case. It even manages to fit in a six-speaker sound system, with two upward-firing tweeters and two woofers for solid bass. Under the hood, it's powered by Intel's fastest non-HX 13th-gen CPUs, all the way up to the 14-core 13900HK and NVIDIA's full suite of RTX 4000 GPUs. Alienware says it can also be equipped with AMD's next-gen hardware, but we don't have specifics on that yet.

The x16 looks just as striking as Alienware's previous thin gaming laptops, but the new Legend aesthetic makes it appear more refined. It also has all the bling you'd want, with over 100 micro-LEDs along its rear lighting panel, RGB lighting across all of its keys, and an RGB touchpad that's also 15 percent bigger than the previous x17's. Now I'm still not sure why you'd want RGB lighting on a touchpad, but it sure does look cool. The x16's 6-pound weight may be a problem for some buyers though, especially since Razer's Blade 17 pretty much matches it.

As you'd expect, the Alienware x14 also makes a return this year, and it still looks impressive. It sports a new 14-inch quad-HD+ screen running at 165Hz, and its hardware is less beefy than the 16-inch model's, topping out with one of Intel's 13th-gen Core i7 CPUs and NVIDIA's RTX 4060. That's the price you'll have to pay for such a thin case. Alienware claims it's still the thinnest 14-inch gaming notebook on the market, measuring just 14.5mm.

All of Alienware's new machines will be available in the first quarter with Intel and NVIDIA configurations, while AMD options will arrive in the second quarter. The m18 will start at $2,899 with high-end options, though Alienware says entry-level models starting at $2,099 will arrive later. The m16, meanwhile, will initially run you $2,599, with future configurations starting at $1,899. The Alienware x16 will be the priciest of the bunch, starting at an eye-watering $3,099 for high-end configs and $2,150 for future entry-level options. And last, but not least, you can expect to pay at least $1,799 for the x14.

Choosing any notebook, especially gaming machines, always amounts to weighing potential compromises. At least now Alienware has a machine for practically everyone, from the style conscious to people who want an even more desktop-like experience on the go.

NVIDIA unveils the $799 RTX 4070 Ti

The rumors were true: NVIDIA finally unveiled its latest mid-range GPU, the RTX 4070 Ti. Starting at $799, it's meant to be a slightly more reasonable alternative to NVIDIA's $1,199 RTX 4080 and $1,599 RTX 4090. But yes, it's still pretty costly — gone are the days when "mid-range" video cards were below $500. For the price, though, you get a GPU that can play Cyberpunk 2077 three times as fast as the RTX 3090 Ti in Ray Tracing Overdrive mode (according to NVIDIA, at least).

While the RTX 4080 and 4090 are targeted at 4K gaming, NVIDIA is positioning the RTX 4070 Ti as the pinnacle of 1,440p gaming beyond 120fps. DLSS 3 is a big reason for that — just like with the other 40-series cards, it uses machine learning to generate entire frames, rather than the pixels DLSS 2 created. That means it should be able to deliver better overall framerates, especially when it comes to CPU-bound titles.
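To make that distinction concrete, here's a deliberately naive Python sketch. The real DLSS pipeline uses motion vectors and trained neural networks on dedicated hardware; the repeat and blend operations below are simple stand-ins purely to illustrate the difference between reconstructing pixels and synthesizing whole extra frames:

```python
import numpy as np

def upscale(frame, factor=2):
    # DLSS 2-style idea: render fewer pixels, then reconstruct a larger image
    # (naive nearest-neighbor repeat stands in for the ML reconstruction).
    return np.repeat(np.repeat(frame, factor, axis=0), factor, axis=1)

def generate_intermediate(prev_frame, next_frame):
    # DLSS 3-style idea: synthesize an extra frame between two rendered ones
    # (a 50/50 blend stands in for optical flow plus ML).
    return (prev_frame + next_frame) / 2

rendered = [np.random.rand(4, 4), np.random.rand(4, 4)]   # what the game produced
displayed = [rendered[0], generate_intermediate(*rendered), rendered[1]]
print(upscale(rendered[0]).shape)           # (8, 8): more pixels from the same frame
print(len(rendered), "->", len(displayed))  # 2 -> 3: more frames than the game rendered
```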

As the leaks foretold, the RTX 4070 Ti features 7,680 CUDA cores and 12GB of GDDR6X memory. In comparison, the 4080 sports 9,728 CUDA cores and 16GB of memory, while the 4090 has 16,384 CUDA cores and 24GB of RAM. Since it's supposedly comparable to the 3090 Ti, you can expect the 4070 Ti to handle a bit of 4K gaming, especially with the help of DLSS 3. But really, it seems more like the card gamers with fast 1,440p monitors have been waiting for. 

Developing...

Intel's 13th-gen laptop CPUs offer up to 24 cores

Intel is bringing the power of its 13th-gen desktop CPUs down to laptops — all 24 cores' worth. At CES today, Intel unveiled the Core i9-13980HX, the pinnacle of its mobile lineup. It features 24 cores (a combination of 8 Performance cores and 16 Efficient cores) and a boost speed of a whopping 5.6GHz. It's the continuation of Intel's high-performance HX line, which debuted last year as a way to bring more power to beefier laptops. The company claims the new Core i9 CPU is 11 percent faster than last year's top-end 12900HK when it comes to single-threaded tasks, and it's 49 percent faster for multithreaded work (intensive tasks like encoding video and 3D rendering).

Intel's 13th-gen HX lineup scales all the way down to the Core i5-13450HX, which offers 10 cores (6P, 4E) and up to 4.5GHz boost speeds. Basically, if you're hankering for more performance and don't mind a hit to battery life, there should be an HX chip within your budget. The rest of Intel's 13th-gen lineup looks noteworthy as well. The P-series chips, which are meant for performance ultraportables, will reach up to 14 cores, while the low-power U-series CPUs top out at 10 cores (2P, 8E) with the i7-1365U.

Intel

We weren't too impressed with Intel's previous P-series CPUs on laptops like the XPS 13 Plus — the performance gains seemed negligible for most tasks, while the battery life hit was massive. Hopefully Intel has made some improvements with its new lineup. The company also claims select 13th-gen chips will offer VPU (Vision Processing Unit) AI accelerators, which can help offload tasks like background blurring during video calls. The lack of a VPU was one major downside to the Intel-equipped Surface Pro 9 (and the one major advantage for the Arm model), so it'll be nice to see some sort of AI acceleration this year.

Another pleasant surprise: new low-end chips. Intel quietly killed its Pentium and Celeron branding last year — now we've learned that they've been replaced with new N-series chips, simply dubbed Intel Processor and Intel Core i3. These chips are mainly focused on education and other entry-level computing markets, so they're only equipped with E-cores. Intel says its quad-core N200 chip offers 28 percent better application performance and 64 percent faster graphics than the previous-gen Pentium Silver N6000. Bumping up to the 8-core Core i3-N305 adds an additional 42 percent in application performance and 56 percent faster graphics. Sure, we all want a 24-core laptop, but better low-end chips have the potential to help kids and other users who don't need a boatload of power.
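Stacking those percentages is slightly ambiguous, so here's how the math works out under one reading (my assumption, not Intel's wording): the i3's "additional" gains are measured against the N200, which is itself measured against the N6000.

```python
# Assumed reading: the Core i3-N305's "additional" gains compound on top of
# the N200's gains over the Pentium Silver N6000.
n200_app, n200_gfx = 1.28, 1.64      # +28% apps, +64% graphics vs. the N6000
n305_app = n200_app * 1.42           # an additional 42% in application performance
n305_gfx = n200_gfx * 1.56           # 56% faster graphics

print(f"Core i3-N305 vs. Pentium N6000: {n305_app:.2f}x apps, {n305_gfx:.2f}x graphics")
# -> roughly 1.82x application performance and 2.56x graphics under this reading
```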

Aside from laptops, Intel also rounded out its 13th-gen desktop CPU lineup at CES. The new chips still reach up to 24 cores like the enthusiast-level K-series parts, but they'll "only" go up to 5.6GHz boost speeds, instead of 5.8GHz. The company says they're 11 percent faster in single-threaded performance and up to 34 percent faster when it comes to multi-threaded tasks. The 13th-gen desktop chips will also be compatible with 600 and 700-series motherboards, and they'll work with either DDR5 or DDR4 memory, making them decent upgrades for modern Intel systems.

Dell's Concept Nyx gamepad sure is... something

Last year Dell showed off Concept Nyx, its vision for a server that could let you play games on screens throughout your home. Perhaps you could start a game on your bedroom TV and then continue it in your living room — and if someone else was using that set, you could also share that larger screen. I'll admit, I was far from sold on the idea, especially after Engadget's Cherlynn Low and I were forced to go head-to-head in two separate Rocket League windows on a single TV screen. It looked more like a waste of a perfectly good 65-inch TV than the future of gaming.

Now Dell and Alienware have returned with another Concept Nyx accessory: A truly baffling PC gamepad. Like a cross between Valve's ambitious-yet-flawed Steam Controller, Sony's DualSense and the latest Xbox offering, it sports a trackpad in place of a directional pad, two analog sticks, the usual face and top buttons, and adaptive triggers. There are also two rear shift buttons, as well as dual scroll wheels along the bottom to easily change your settings. And if that's not enough functionality for you, the two top buttons also have capacitive sensing, allowing you to slide your finger slowly across them for different effects.

I can trace my love of gadgets back to the first time I held an NES controller at the age of five, so I was initially intrigued by the Nyx controller. It's certainly leagues ahead of Dell's previous "UFO" pad, which resembled Atari's failed Jaguar controller more than anything modern. The Nyx gamepad feels like a premium device Dell could actually sell, with sturdy build quality and a familiar Xbox-like feel.

The demo gods weren't in Dell's favor during our briefing, unfortunately, so we couldn't play any games with the Nyx controller. Just from holding it, though, it felt somewhat incomplete. Perhaps I'm too used to the idea of directional pads, but I still find them essential, especially when playing Metroidvania games or anything that hearkens back to the classic 2D era. The Nyx's circular trackpad could be fine for some PC games, but I still prefer having the confidence of a real directional pad. If Valve can manage to shove two trackpads alongside a D-pad on the Steam Deck, surely Dell could find some room for a D-pad here.

Dell could be trying to one-up Valve's original Steam Controller, which took a huge risk by prominently featuring two circular trackpads to help replicate the feeling of mouse and keyboard controls. But while that device had its fans, I could never adapt to it. There's a reason why console controllers ultimately settled on a fairly standard design: It just works.

Engadget Podcast: CES 2023 Preview

Can you believe CES is just a week away? For our final episode of 2022, Cherlynn, Devindra and Senior Writer Sam Rutherford dive into their expectations for CES 2023. We’ll definitely hear more from Intel and AMD when it comes to CPUs, as well as AMD and NVIDIA’s latest mobile video cards. But we’re always keeping our eyes out for the weird stuff at the show, like Lenovo’s wild Swiss Army lamp (a combination webcam, face light and USB hub!). And of course, there will likely be tons of news around new TVs, PCs and cars.

Listen below or subscribe on your podcast app of choice. If you've got suggestions or topics you'd like covered on the show, be sure to email us or drop a note in the comments! And be sure to check out our other podcasts, The Morning After and Engadget News!



Topics

  • PC hardware to look forward to – 5:06

  • Phones and mobile at CES – 22:16

  • New TVs and gaming monitors to expect – 28:11

  • Wearables at CES 2023 – 35:38

  • Other news – 42:07

  • Working on – 44:47

  • Pop culture picks – 46:06


Credits
Hosts: Cherlynn Low and Devindra Hardawar
Guest: Sam Rutherford
Producer: Ben Ellman
Music: Dale North and Terrence O'Brien
Livestream producers: Julio Barrientos
Graphic artists: Luke Brooks and Brian Oh

Radeon RX 7900 XTX and XT review: AMD’s ‘reasonable’ stab at 4K gaming

Once again, AMD is ready to take on NVIDIA's latest video cards with powerful alternatives at a lower price. And once again, AMD still lags behind when it comes to ray tracing. That's pretty much the story behind the Radeon RX 7900 XT and 7900 XTX, two confusingly named GPUs meant to be the pinnacle of AMD's new RDNA 3 graphics architecture. At $899 and $999, respectively, these cards are certainly easier to stomach than NVIDIA's $1,199 RTX 4080 and the monstrously expensive $1,599 RTX 4090 (both of which actually sell for far more at most stores).

For the most part, AMD’s new cards deliver solid 4K gaming performance, especially with the help of the company’s FidelityFX Super Resolution (FSR) upscaling technology. It's just a shame that you'll have to live with slower ray tracing performance than the competition. (On the bright side, they offer a major ray tracing upgrade over AMD's last batch of Radeon GPUs.)

So what makes these cards so special? They're the first GPUs built on a chiplet-based design, similar to AMD's latest CPUs. That should allow AMD to tweak its designs easily down the line, making it simpler to scale RDNA 3 down to laptops and lower-end GPUs. The 7900 XTX and XT feature a 5nm compute die and a 6nm memory die connected by a 5.3 TB/s interconnect. Together, that means they can reach up to 61 teraflops of computing power and utilize up to 24GB of GDDR6 RAM.

AMD also claims it beefed up ray tracing performance by 50 percent per compute unit, compared to its previous RDNA 2 architecture. Its video engine has been upgraded with support for AV1 encoding and decoding at up to 8K/60fps. That format isn't widely adopted yet, but it aims to deliver better video compression for 4K and 8K footage compared to existing codecs like H.264.

Devindra Hardawar/Engadget

True to their names, the Radeon RX 7900 XTX and XT aren't very different. The top-end XTX sports 96 compute units, the same number of ray accelerators and clock speeds between 2.3GHz and 2.5GHz. The XT, meanwhile, offers 84 compute and ray tracing units and clocks between 2GHz and 2.4GHz. The higher-end card comes equipped with 24GB of GDDR6 RAM, compared to 20GB on the XT. (Notably, they both offer more memory than the 16GB RTX 4080.)
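Those compute unit counts and clocks also explain the "up to 61 teraflops" figure mentioned earlier. Here's a short Python sketch of the usual peak-FP32 math; the 64 shaders per compute unit and dual-issue FP32 are standard RDNA 3 details rather than numbers given in this piece, so treat it as a back-of-the-envelope check:

```python
# Peak FP32 throughput for the 7900 XTX, under standard RDNA 3 assumptions:
# 64 stream processors per CU, dual-issue FP32, and an FMA counted as 2 ops.
compute_units = 96            # Radeon RX 7900 XTX
sp_per_cu = 64
boost_clock_ghz = 2.5
ops_per_sp_per_clock = 2 * 2  # FMA (2 ops) x dual-issue FP32

tflops = compute_units * sp_per_cu * ops_per_sp_per_clock * boost_clock_ghz / 1_000
print(f"~{tflops:.1f} TFLOPS peak FP32")   # ~61.4 TFLOPS
```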

Given their similarities though, it's unclear why anyone would opt to save $100 for the 7900 XT. If you're willing to spend close to $1,000 on a video card, you might as well go full-tilt and grab as much memory and power as you can. It would have been nice to see something slightly cheaper from AMD, even if it meant delivering a card that's a bit slower than the 7900 XT.

The reference GPUs we're reviewing look and feel like premium components, as we've come to expect from AMD's flagship cards. Most importantly, though, they only take up two slots on your motherboard, whereas the enormous RTX 4080 and 4090 take up three. The 7900 XT and XTX also rely on two 8-pin power connections, so you won't need to string any new PSU cables or cram in a dongle like with NVIDIA's cards. The 7900 XTX requires an 850-watt power supply, thanks to its starting power draw of 355W, while the XT model can work with a 750W PSU. Both cards hovered around 66C under load, which was right between what we saw on the RTX 4080 and 4090.

While I was eager to see how these new GPUs compared to NVIDIA's, I had to go through several rounds of driver and motherboard BIOS updates on my Ryzen 9 7900X system before both cards were stable enough to actually use. That's something I occasionally run into when testing cutting-edge hardware (NVIDIA's cards also required a BIOS update), but there were still issues with AMD's cards even after that. Halo Infinite, for example, refused to launch matches with either card. Sometimes my PC would completely shut down while testing Cyberpunk 2077, which required me to unplug my desktop and reset my BIOS before Windows would boot again.

I've been benching AMD and NVIDIA video cards on this PC, equipped with a premium Corsair 1000W PSU, for the past several months without any stability issues. So it was a surprise to see just how much havoc these GPUs could wreak. I haven't seen other reviews complaining of similar issues, so I'll chalk up my experience to early drivers. AMD just released a new driver that resolves an issue of high power draw during video encoding, so I'm hoping the company is also trying to address the bugs I'm seeing.

GPU                     | 3DMark TimeSpy Extreme | Port Royal (Ray Tracing) | Cyberpunk          | Blender
AMD Radeon RX 7900 XTX  | 12,969                 | 14,696/68fps             | 4K FSR RT: 57fps   | 2,899
AMD Radeon RX 7900 XT   | 11,688                 | 13,247/61fps             | 4K FSR RT: 50fps   | 3,516
NVIDIA RTX 4080         | 12,879                 | 17,780/82fps             | 4K DLSS RT: 74fps  | 9,310
NVIDIA RTX 4090         | 16,464                 | 25,405/117.62fps         | 4K DLSS RT: 135fps | 12,335
AMD Radeon RX 6800 XT   | 7,713                  | 9,104/42.15fps           | N/A                | N/A

When the cards ran smoothly, they proved to be fairly competitive with the RTX 4080. The 7900 XTX was on-par with the 4080 when it came to 3DMark's TimeSpy Extreme benchmark and Geekbench 5's Compute test. The 7900 XT scored 1,000 points lower on TimeSpy Extreme, which was 3,000 points higher than last year's RTX 3080 Ti, but it was bested by that NVIDIA card when it came to Geekbench. Hitman 3 also ran blazingly fast on both cards in 4K, reaching 165fps and 180fps when I flipped on FSR upscaling. Much like NVIDIA's cards, there's little reason to run any game in 4K without the help of advanced upscaling tech.

The performance gulf between AMD and NVIDIA appeared once I started dabbling with ray tracing. The 7900 XTX and XT scored well below the RTX 4080 in the 3DMark Port Royal benchmark (at least they managed to beat the 3080 Ti). I also only saw around 57fps in Cyberpunk 2077 on the Radeon 7900 XTX while playing in 4K with full ray tracing and AMD's FidelityFX Super Resolution technology. Without FSR, that frame rate dipped to an unplayable 25fps. The slower 7900 XT only managed to hit 50fps in 4K with FSR and ray tracing enabled.

Devindra Hardawar/Engadget

Basically, if you're eager to get a video card that reaches well above 60fps in 4K with ray tracing, you'll have to look elsewhere. But if you can live with 1,440p, you'll find more to like: The 7900 XTX reached 130fps in Cyberpunk with ray tracing, FSR and graphics settings maxed, while the 7900 XT hit 114fps. That's almost enough to max out a 120Hz gaming monitor! Personally, I still find 4K gaming to be overrated — 1,440p still looks great, and you may never notice the benefits of pushing more pixels. But I’ll admit that I've been spoiled by NVIDIA's DLSS 3 upscaling technology, which allowed me to hit 74fps in Cyberpunk while playing in 4K with ray tracing. That's as close to gaming heaven as I've ever been.

But there's one thing you'll find with these AMD GPUs that you won't with NVIDIA's: Reasonable street prices. Even after their launch, you can still snag the 7900 XT and XTX at close to retail prices. Many RTX 4080 models, meanwhile, are inching towards $1,500 at online retailers (assuming you can find them in stock at all). Spending close to $1,000 on a video card is still hard to stomach, but at least it makes more sense than going all the way to $1,500.

The Radeon RX 7900 XTX and XT are a solid step forward for AMD, especially when it comes to 4K gaming. But I'm hoping the company can get its driver situation in order, and perhaps eke out better ray tracing performance in the process. Most gamers are still better off waiting for AMD and NVIDIA's next-gen mid-range cards, which are sure to be launching soon. But if you're an avowed AMD fan, you've finally got the high-end upgrade you've been waiting for.

The best shows to binge watch over the holidays in 2022

This year was a bit of a reset for the entertainment industry, with more people returning to theaters and more must-watch TV shows hitting streaming networks. That's a fairly major change from 2021, when many movies hit services like HBO Max on the same day as theaters. But if you've gotten used to catching up on everything on your couch, don't worry — there's still plenty to watch over the holidays. (And be sure to check out our recommendations from last year, which are still good, I swear!)

HBO Max

Station Eleven

Perhaps the best piece of media I've seen this year, Station Eleven is an adaptation of Emily St. John Mandel's novel about a society-collapsing swine flu epidemic. Wait, don't run away! While the series may evoke the worst of our COVID experience at first, it also transforms into a hopeful tale about the power of stories (and pop culture!) and human connection. It delivers something we could all use right now: Hope.

Tuca and Bertie Season 3

After being unceremoniously canceled by Netflix in 2020, the cartoon duo of Tuca and Bertie found a new home on Cartoon Network last year. The third and (unfortunately) final season of the series aired this year, and it remains a delight. Created by Lisa Hanawalt and executive produced by Raphael Bob-Waksberg (BoJack Horseman), it follows a pair of friends as they deal with life, love and simply existing in their 30s.

Also on HBO Max:

The White Lotus (Season 2): Mike White's series on the exploits of privileged resort guests, this time in Sicily instead of Hawaii, remains a delight.

Harley Quinn (Season 3): This show remains one of the best DC series currently airing. Tune in for a comedic and more adult spin on your Batman faves.

Disney+

Andor

Yes, it's another Star Wars show, but Andor ended up being one of the biggest surprises of the year. Created by Tony Gilroy (who helped transform Rogue One into a stellar film), it centers on Cassian Andor (Diego Luna), a small-time thief with a healthy distrust for the Empire. The show follows his journey towards becoming a member of the rebels, and in doing so it also serves as a blueprint for taking down authoritarian systems.

Fire of Love

Katia and Maurice Krafft were a rare couple, two expert volcanologists who were also madly in love. They dedicated their lives to documenting active volcanoes, often by directly confronting lava flows, rock explosions and acid lakes. Fire of Love unearths their original footage to show just how far they went in the name of science. But it also paints a portrait of a truly rare couple, one whose contributions we still owe much to.

Also on Disney+:

Tales of the Jedi: A short animated series that gives us a bit more backstory on Ahsoka Tano and... Count Dooku? If you've been interested in the Star Wars cartoons, but don't want to slog through tons of old episodes, this is a good start.

Bluey: This remains the best kids show on TV. Bluey's latest season is as funny and poignant as ever. It's the rare show that can teach both kids and their parents.

Netflix

Cyberpunk: Edgerunners

Cyberpunk 2077 had a notoriously rocky game launch, but the setting of Night City was always compelling. Edgerunners is an anime spin on that universe, centering on a plucky street kid who finds himself equipped with a military-grade spine implant. You know, typical teenage stuff. Will his newfound power keep him on the wrong side of the law? And will he ever get revenge against the people who ruined his life? The show doesn't do much new, but it features genuinely compelling characters and some of the best animation in recent years.

The Midnight Club

Mike Flanagan can do no wrong. The talent behind Midnight Mass, Doctor Sleep, and the excellent “Haunting of…” horror shows on Netflix has now set his sights on a Christopher Pike adaptation, and the results are glorious. The show, co-created by Leah Fong, follows a group of terminally ill teenagers as they tell spooky stories and explore the supernatural mysteries of their hospice mansion. It's a meditation on the power of storytelling, but also yet another Flanagan exploration of the value of life.

Also on Netflix:

Wednesday: Come to see Tim Burton finally get his shot at The Addams Family, stay for Jenna Ortega's perfectly deadpan performance.

Hulu

The Bear

Can an award-winning chef truly come back home and save his family's beleaguered sandwich shop? Or is he just trying to work through the death of his brother the only way he knows? The Bear captures the energy and madness of kitchen life better than any TV show — forget all the glossy stuff you've seen on Chef's Table. But amid the insanity, it's the story of a found family banding together to mourn and save the place they all love.

The Dropout

What makes Elizabeth Holmes tick? This series, which stars Amanda Seyfried as the notorious Theranos founder, paints a more complete picture of Holmes than the 2019 HBO documentary The Inventor. We see Holmes' early life, as well as her initial connection with Sunny Balwani (Naveen Andrews, perhaps the best TV adaptation glow-up any South Asian man can hope for). After proving her smarts in college, she sets off to build the world's best blood testing machine with Theranos. We all know how that went. When the hype around Theranos starts to fall apart, The Dropout turns into a fascinating portrait of self-deception.

Also on Hulu:

Fleishman is in Trouble: Toby Fleishman is going through a divorce. But as he starts to rebuild his own identity, he also needs to deal with the wreckage of his marriage (and find his missing ex-wife).

Apple TV+

Severance

Taking the idea of work/life balance a step too far, Severance follows a group of people who’ve received a procedure that completely splits their memories between home and office life. The result is two completely separate personalities within the same body, both trapped in their respective cages. Severance is a bit of a slow burn, but it’s a fascinating exploration of corporate control akin to Terry Gilliam’s Brazil. (Be sure to check out our interview with the creator of the show, Dan Erickson, on the Engadget Podcast.)

Pachinko

An adaptation of Min Jin Lee’s 2017 novel, Pachinko follows a Korean family across several generations starting in 1917 and reaching into the late ‘80s. We see a young fish seller fall in love and make her way to Japan as an outsider, while her grandson struggles to maintain his identity in the pressure-filled business world. Pachinko has almost everything you’d want in a family epic: Children struggling to live up to their parents’ standards, forbidden love and the constant threat of generational trauma. Also, it has one of the best opening sequences of the year.

Other things to watch

The Good Fight (Paramount+): Over its six-season span, The Good Fight tackled the insanity of our current social and political environment better than any other TV show. It’s first and foremost a legal procedural, but coming from the minds of Michelle and Robert King, it ends up being so much more.

Gangs of London (AMC+): Now on its second season, Gangs of London is one of the most brutal crime shows on TV. It’s part gangster epic, part martial arts smackdown (it comes from Gareth Evans, director of The Raid films). While the plot becomes increasingly ludicrous, it’s worth a watch just for the sheer ambition of its action sequences, many of which go far harder than anything we’ve seen in American films lately.

Two men allegedly hacked JFK's taxi dispatch system with Russian help

Would you pay a few bucks to skip an interminably long taxi wait line at the airport? That's essentially what Daniel Abayev and Peter Leyman did, according to the DOJ, except they focused on taxi drivers. The two men, both from Queens, have been arrested for hacking into JFK's taxi dispatch system with the help of Russian nationals. Between September 2019 and September 2021, they charged drivers $10 to jump ahead in JFK's taxi queue. Typically, those cars are sent out depending on their order of arrival.

"For years, the defendants’ hacking kept honest cab drivers from being able to pick up fares at JFK in the order in which they arrived," U.S. Attorney Damian Williams said in a statement. "Now, thanks to this Office’s teamwork with the Port Authority, these defendants are facing serious criminal charges for their alleged cybercrimes.”

According to the DOJ's indictment, both men explored a variety of ways to break into JFK's taxi dispatch system, from bribing people to insert a malware-filled flash drive into a computer, to stealing tablets, to logging into the system over Wi-Fi. Abayev at one point messaged one of the Russian hackers: “I know that the Pentagon is being hacked[.] So, can’t we hack the taxi industry[?]”

The pair used chat threads to communicate with drivers, some of whom also had their $10 fee waived if they could recruit others. Abayev and Leyman have been charged with two counts of conspiracy to commit computer intrusion, which each carry a maximum 10-year prison sentence. Their story follows a spate of Russian cyberattacks over the last ten years, including the infamous hack on Florida's voter databases in 2016, a decade-long malware scheme to steal millions, and the theft of NATO data in 2014.