
The ASUS AirVision M1 glasses give you big virtual screens in a travel-friendly package

At CES 2024, ASUS seems to have taken people by surprise with the announcement of its AirVision M1 glasses, with some viewing them as an alternative to Apple’s Vision Pro headset. But I discovered that ASUS’ glasses are much more of a novel alternative to portable monitors than something meant for spatial computing.


The big difference between the AirVision M1 glasses and something like the Vision Pro or even Xreal’s Air 2 Ultras is that they don’t really support anything in the way of interactive AR. Sure, the glasses are able to project your desktop or multiple windows into space, but they need to be tethered to a nearby device and don’t recognize hand gestures or other virtual objects.


Instead, I found that their primary purpose is to give you extra screen space, but without the need to carry around big and bulky portable monitors. Featuring built-in microLED displays with a full HD resolution, the AirVisions can display up to six or seven virtual windows or desktops. You can also choose between a handful of aspect ratios (16:9, 21:9, 32:9 and more), with the glasses’ three degrees of freedom allowing you to either pin those screens in virtual space or have them track your head as you move around.

During my first demo, I used the AirVision M1s while tethered to a laptop, where they behaved almost exactly like a big floating desktop hovering six feet in front of me. At first, the virtual displays were a little blurry, but after a short adjustment period and some time dialing in my IPD (interpupillary distance), I was pleasantly surprised by how sharp everything looked. Compared to something like the Sightful Spacetop, which is billed as the world’s first AR laptop, not only did the AirVisions have a much larger vertical field of view (up to 57 degrees), they also didn’t require any additional special equipment, as the glasses are essentially plug and play. While I didn’t need them, it’s important to note that the glasses come with a pair of nose pads to help ensure a good fit, plus a prescription insert for people who wear glasses.

Once set up, it was pretty easy to create additional virtual workspaces. All I had to do was pull up a small command menu, press a plus sign where I wanted a new window to appear and that’s it. You can also freely adjust the overall size of the virtual display by zooming in or out. And one of the best things about the AirVisions is that using the laptop’s touchpad or typing wasn’t difficult at all. Because you can see through the virtual displays, I simply looked down and focused my eyes where they needed to go. That said, if you become distracted by something in the background, ASUS’ glasses also come with magnetic blinders that clip onto the front and provide a clean black backdrop.

However, my favorite use case was when I tried a different pair of the AirVisions connected to an ROG Ally, where the glasses provided me with a massive virtual screen for gaming. In this way, it’s a lot like wearing a headset such as the Meta Quest 3, but for non-VR games. This is the kind of device I would love to have on a plane, where space is at a premium, especially for something like a portable monitor. That said, I’m not sure I could handle the embarrassment of being a modern-day glasshole, at least not until devices like these become a bit more popular.

But perhaps the biggest difference between the AirVision M1s and Apple’s Vision Pro is price. While ASUS has yet to provide an official figure, a company spokesperson told me that ASUS is targeting around $700, versus $3,000 for Apple’s headset. And when you compare that to the price of a portable monitor, which often goes for between $250 and $400, and offers a lot less screen space, suddenly that price doesn’t seem too ridiculous.

So if you’re on the lookout for an alternative to a travel monitor, keep an eye out for ASUS’ AirVision M1 glasses when they become available sometime in Q3 2024.

We're reporting live from CES 2024 in Las Vegas from January 6-12. Keep up with all the latest news from the show here.

This article originally appeared on Engadget at https://www.engadget.com/the-asus-airvision-m1-glasses-give-you-big-virtual-screens-in-a-travel-friendly-package-234412478.html?src=rss

Instagram's founders are shutting down Artifact, their year-old news app

Artifact, the buzzy news app from Instagram co-founders Kevin Systrom and Mike Krieger, is shutting down less than a year after its launch. In a note on Medium, Systrom said the app’s “core news reading” features would be online through the end of February, but that it would remove commenting and posting abilities immediately.

Besides its famous founding team, the app was known for AI-centric features as well as Reddit-like commenting and posting abilities. The app had won praise from journalists who appreciated reporter-friendly features like dedicated author pages and had been featured prominently in Apple and Google’s app stores.

But after a year of work, it seems Systrom and Krieger encountered many of the same struggles as founders of buzzy news apps before them. “We have built something that a core group of users love, but we have concluded that the market opportunity isn’t big enough to warrant continued investment in this way,” Systrom wrote.

While he didn’t say what he might do next, Systrom’s note hinted that he may at some point take on a new AI-focused project. “I am personally excited to continue building new things, though only time will tell what that might be,” he wrote. “We live in an exciting time where artificial intelligence is changing just about everything we touch, and the opportunities for new ideas seem limitless.”

In the meantime, Artifact fans have a few more weeks to keep checking headlines before the app goes offline for good.

This article originally appeared on Engadget at https://www.engadget.com/instagrams-founders-are-shutting-down-artifact-their-year-old-news-app-233431390.html?src=rss

MIT experts develop AI models that can detect pancreatic cancer early

Researchers at MIT’s CSAIL division, which focuses on computer engineering and AI development, built two machine learning algorithms that can detect pancreatic cancer at a higher rate than current diagnostic standards. Together, the two models form the “PRISM” neural network, which is designed specifically to detect pancreatic ductal adenocarcinoma (PDAC), the most prevalent form of pancreatic cancer.

The current standard PDAC screening criteria catch about 10 percent of cases in patients examined by professionals. In comparison, MIT’s PRISM was able to identify PDAC cases 35 percent of the time.

While using AI in the field of diagnostics is not an entirely new feat, MIT’s PRISM stands out because of how it was developed. The neural network was programmed based on access to diverse sets of real electronic health records from health institutions across the US. It was fed data from over 5 million patients’ electronic health records, which researchers from the team said “surpassed the scale” of information fed to an AI model in this particular area of research. “The model uses routine clinical and lab data to make its predictions, and the diversity of the US population is a significant advancement over other PDAC models, which are usually confined to specific geographic regions like a few healthcare centers in the US,” said Kai Jia, an MIT CSAIL PhD and senior author of the paper.

MIT’s PRISM project started over six years ago. The motivation behind developing an algorithm that can detect PDAC early has a lot to do with the fact that most patients get diagnosed in the later stages of the cancer’s development — specifically about eighty percent are diagnosed far too late.

The AI works by analyzing patient demographics, previous diagnoses, current and previous medications in care plans and lab results. Collectively, the model works to predict the probability of cancer by analyzing electronic health record data in tandem with things like a patient’s age and certain risk factors evident in their lifestyle. Still, PRISM can only help diagnose as many patients as it can actually reach. At the moment, the technology is bound to MIT labs and select patients in the US. The logistical challenge of scaling the AI will involve feeding the algorithm more diverse data sets and perhaps even global health profiles to increase accessibility.

Nonetheless, this isn't MIT’s first stab at developing an AI model that can predict cancer risk. It notably developed a way to train models to predict the risk of breast cancer among women using mammogram records. In that line of research, MIT experts confirmed, the more diverse the data sets, the better the AI gets at diagnosing cancers across diverse races and populations. The continued development of AI models that can predict cancer probability will not only improve outcomes for patients if malignancy is identified earlier, it will also lessen the workload of overworked medical professionals. The market for AI in diagnostics is ripe enough for change that it is piquing the interest of big tech companies like IBM, which attempted to create an AI program that could detect breast cancer a year in advance.

This article originally appeared on Engadget at https://www.engadget.com/mit-experts-develop-ai-models-that-can-detect-pancreatic-cancer-early-222505781.html?src=rss

Nintendo Switch 2 and games to get excited about in 2024 | This week's gaming news

Welcome back to our weekly gaming news roundup. 

January is a magical time in the video game industry. We've just closed out 12 months of marketing hype and shifting production timelines, and the year ahead is filled with the promises of new titles and fresh hardware. During this special month, we can look at the 2024 release calendar with excitement and optimism, before the delays start rolling in. So, let's get to it — these hearts aren't going to break themselves.

This week's stories

Arcane season 2 teaser

You watched Arcane, right? The Netflix series set in the League of Legends universe debuted in late 2021 and it was an instant sensation, starring fan-favorite characters like Jinx, Vi and Caitlyn. The next season is set to come out in November and Riot dropped a one-minute teaser for it last Friday. The trailer has Singed experimenting on himself in a dreary laboratory, while a creature that looks like Warwick hangs above, connected to tubes and IVs. It’s gonna get dark, kids.

If you haven’t watched season one of Arcane, do that now.

This kid beat Tetris

I guess we can all stop playing Tetris. 13-year-old Willis Gibson became the first person to reach the killscreen in the classic NES version of Tetris, 34 years after the game’s debut. Gibson caught the moment on camera and honestly, it gives me goosebumps every time I watch it. The competitive Tetris scene has been steadily growing over the past few years, and players are using a new input technique called rolling that allows them to move pieces faster than ever. If you’re into this kind of thing, I recommend watching Classic Tetris Monthly on Twitch or YouTube.

Promises, promises

Before we get back into all the award shows and livestreams and media events this year, let’s take a look at the video game promises heading into 2024.

There’s nothing official yet, but it looks like Nintendo is preparing to release the Switch 2 in 2024, seven years after the launch of the original Switch, and right in the middle of the PS5 and Xbox Series X console cycle. According to early reports, the Switch 2 will be an iterative hardware update with slightly more processing power and support for DLSS and raytracing. The big news is that Nintendo has finally joined us in the 21st century, and players should be able to transfer their Switch games to the new console without any roadblocks.

Outside of the new Switch, 2024 is all about games. We know how this goes, right — in video games, a release date is really just the first step before a delay, so whatever you’re into, prepare for heartbreak over the next 12 months.

There are two games I’m confident will actually hit the market on their release dates in 2024, and that’s only because they’ve been in development for years and delayed multiple times already. Ubisoft’s open-world pirate simulator Skull and Bones is due out on February 16 for PS5, Xbox and PC, and Final Fantasy VII Rebirth will hit PS5 on February 29. Rebirth looks legit, while Skull and Bones … doesn’t.

Overall, we have a healthy lineup of titles to get excited about in 2024. First, on the mainstream front: 

  • January 18: Prince of Persia: The Lost Crown | Ubisoft Montpellier

  • January 19: The Last of Us Part 2 Remastered | Naughty Dog

  • January 26: Tekken 8 | Bandai Namco Studios, Arika

  • February 2: Suicide Squad: Kill the Justice League | Rocksteady Studios

  • February 2: Persona 3 Reload | P-Studio

  • February 16: Skull and Bones | Ubisoft

  • February 29: Final Fantasy VII Rebirth | Square Enix

  • March 22: Dragon’s Dogma 2 | Capcom

  • March 22: Princess Peach: Showtime! | Nintendo

  • March 22: Rise of the Ronin | Team Ninja

  • 2024: Silent Hill 2 remake | Bloober Team

  • 2024: Star Wars Outlaws | Massive Entertainment

  • 2024: Avowed | Obsidian Entertainment

  • 2024: Senua’s Saga: Hellblade 2 | Ninja Theory

  • 2024: Concord | Firewalk Studios

  • 2024: Paper Mario: The Thousand-Year Door | Nintendo

This isn't a comprehensive list for the year in AAA gaming, but it's a solid start. 

And then there are the games I’m personally looking forward to in 2024. Most of these still have vague release windows — it's as if the developers didn’t want to give a timeframe at all, so they just whispered 2024 to their marketing teams and hoped no one would notice. But I did. I always do. I’m always watching.

Here are the games on my underground radar this year (again, this isn't an exhaustive list because there are so many fantastic games nowadays, but these ones spring to mind):

  • January 16: Home Safety Hotline | Night Signal Entertainment

  • August 20: Black Myth: Wukong | Game Science

  • 2024: Skate Story | Sam Eng

  • 2024: Lorelei and the Laser Eyes | Simogo

  • 2024: Baby Steps | Gabe Cuzzillo, Maxi Boch, Bennett Foddy

  • 2024: The Plucky Squire | All Possible Futures

  • 2024: Mewgenics | Edmund McMillen, Tyler Glaiel

  • 2024: 33 Immortals | Thunder Lotus

  • 2024: Thank Goodness You’re Here! | Coal Supper

  • 2024: Despelote | Julián Cordero, Sebastian Valbuena

  • 2024: Time Flies | Playables, Raphaël Munoz, Michael Frei

  • 2024: Cryptmaster | Paul Hart, Lee Williams, Akupara Games

  • 2024: Hades 2 | Supergiant Games

  • 2024: Hyper Light Breaker | Heart Machine

When any of these titles is inevitably delayed, we can all gather right here and have a good cry. Let us know in the comments what you’re looking forward to this year and why it’s Hollow Knight: Silksong.

Now Playing

I’ve been sticking with local co-op games during these chilly winter months, and now I’ve moved on to Baldur’s Gate 3. I know, I know, everyone is already telling you to play it, but this pitch is strictly for the splitscreen crowd — Baldur’s Gate 3 is a joy to play alongside a loved one, as long as your cleric actually remembers to heal your party every now and then. You know who you are.

This article originally appeared on Engadget at https://www.engadget.com/nintendo-switch-2-and-games-to-get-excited-about-in-2024--this-weeks-gaming-news-211257742.html?src=rss

NASA's new X-59 plane could hit supersonic speeds with minimal sonic boom

NASA’s X-59 Quesst supersonic jet, which is being developed by Lockheed Martin, will have its public debut livestreamed as a showcase of technology designed to keep it quiet in the air. The $247.5 million aircraft, built under NASA’s Quesst (Quiet SuperSonic Technology) mission, will be shown on the livestream dramatically emerging from Lockheed Martin's Skunk Works facility in Palmdale, California. NASA has been on a mission since 2018 to prove that its X-59 can fly over cities without producing the noise pollution of loud sonic booms. The rollout marks an important milestone in the six-year-old project.

The rollout will be streamed on January 12 at 4pm ET on YouTube, as well as on the NASA app and the NASA+ streaming service.

The space agency said it will survey people about the noises they hear from the jet once it begins flying over communities. It did not specify how it would find these people, or how many people it would poll. The data collected will be sent to regulators and used to help propose new rules that limit the use of supersonic jets. The US federal government has blocked all civilian supersonic jets from flying over land for over five decades.

When NASA first announced its quiet supersonic technology project in 2018, administrator Jim Bridenstine said, “This aircraft has the potential to transform aviation in the United States.” While the jet was supposed to first take flight in 2021, the debut today still marks a major milestone in the QueSST mission. By 2027, NASA expects to have more definitive results about how effective the new aircraft technology is at reducing flight noise.

If new laws are eventually passed that permit supersonic aircraft to fly over land, high-speed commercial flights could become a reality. Once NASA and Lockheed Martin finalize development of the aircraft, the agency said it will conduct safety evaluations for about nine months. After enough evidence is shared to prove that the Quesst aircraft can be flown safely, NASA plans to expand its flight tests to cities across the US and collect more information about the noise it produces through additional surveys.

This article originally appeared on Engadget at https://www.engadget.com/nasas-new-x-59-plane-could-hit-supersonic-speeds-with-minimal-sonic-boom-210037676.html?src=rss

EPA scraps plan that would have had it ban mammal testing in favor of computer models

The Environmental Protection Agency has scrapped a plan to phase out mammal testing for studying chemical toxicity, Science reports. In 2019, the regulatory agency vowed to completely phase out animal testing for toxicology studies by 2035 in favor of non-animal “test subjects” programmed into computer models.

The call to challenge the status quo was controversial from the start — not only was it going to impact thousands of studies and experiments, but many scientists argued that computer models were nowhere near ready to replace animals as test subjects. In a letter written by a group of public health officials, the experts urged the EPA’s head, Michael Regan, to reconsider the ban because computational models, in their opinion, were “not yet developed to the point” where they could be relied on for risk assessments.

In order for the new ban to have taken effect, the EPA said there needed to be “scientific confidence” that non-animal models could soundly replace critters like mice and rabbits in labs. Despite the 2035 deadline being put on ice, however, an EPA spokesperson told Science that it would still explore alternatives to animal testing.

The ambitious plan is not entirely a lost cause, though. While the EPA hasn’t made any official statements about how it plans to work toward its original goal, now without a deadline, some studies have shown promise that computational models might effectively reflect the toxicology of certain chemicals during testing. In some instances, these studies suggest, they can even outperform lab rats.

3D developments like lab-grown organoids are also popping up on the research front, using stem cells to create liver mimics that can be tested and evaluated during research the way a human liver would be. Labs are currently working on ways to more effectively develop realistic organs using 3D printers. But it might be a while before 3D printing can consistently be used to assist biologists and pharmacologists in research and drug testing.

This article originally appeared on Engadget at https://www.engadget.com/epa-scraps-plan-that-would-have-had-it-ban-mammal-testing-in-favor-of-computer-models-204540435.html?src=rss

Senators want to know why the SEC’s X account wasn’t secured with MFA

Another lawmaker is pushing the Securities and Exchange Commission for more information about its security practices following the hack of its verified account on X. In a new letter to the agency’s inspector general, Senator Ron Wyden called for an investigation into “the SEC’s apparent failure to follow cybersecurity best practices.”

The letter, which was first reported by Axios, comes days after the SEC’s official X account was taken over in order to post a tweet claiming that spot bitcoin ETFs had been approved by the regulator. The rogue post temporarily juiced the price of bitcoin and forced SEC chair Gary Gensler to chime in from his X account that the approval had not, in fact, happened. (The SEC did approve 11 spot bitcoin ETFs a day later, with Gensler saying in a statement that “bitcoin is primarily a speculative, volatile asset that’s also used for illicit activity.”)

The incident has raised a number of questions about the SEC’s security practices after officials at X said the financial regulator had not been using multi-factor authentication to secure its account. In the letter, Wyden, who chairs the Senate’s finance committee, said it would be "inexcusable" for the agency to not use additional layers of security to lock down its social media accounts.

“Given the obvious potential for market manipulation, if X’s statement is correct, the SEC’s social media accounts should have been secured using industry best practices,” Wyden wrote. “Not only should the agency have enabled MFA, but it should have secured its accounts with phishing-resistant hardware tokens, commonly known as security keys, which are the gold standard for account cybersecurity. The SEC’s failure to follow cybersecurity best practices is inexcusable, particularly given the agency’s new requirements for cybersecurity disclosure.”

Wyden isn’t the only lawmaker who has pushed the SEC for more details about the hack. Senators J. D. Vance and Thom Tillis sent a letter of their own, addressed to Gensler, immediately following the incident. They asked for a briefing about the agency’s security policies and investigation into the hack by January 23.

The SEC didn’t immediately respond to a request for comment. The agency said in an earlier statement that it was working with the FBI and the Inspector General to investigate the matter.

This article originally appeared on Engadget at https://www.engadget.com/senators-want-to-know-why-the-secs-x-account-wasnt-secured-with-mfa-203614701.html?src=rss

Logitech mice, webcams and accessories are up to 25 percent off at Amazon

New year, new... desktop setup? If you're looking for ways to spruce up your desk space without breaking the bank, it's worth taking a peek at a Logitech sale on Amazon that includes discounts on mice, webcams and other accessories. Most of us could do with a webcam upgrade (I know I could given the low-res one built into my laptop), and Logitech's Brio 300 may fit the bill. It's a Full HD 1080p webcam that's on sale for $44.85. That's a 25 percent discount, or just over $15 off the usual price of $60.

The Brio 300 has a privacy shutter, a 70-degree field of view, auto-light correction, an LED activity light, a built-in mono noise-reducing microphone and a USB-C connector. You'll be able to use the Logi Tune app to adjust color and image quality. Those concerned with sustainability may be pleased to learn it's made with 48 percent post-consumer recycled plastic too. The slightly speckled plastics help give the webcam a fresh look.

One other product that caught our eye in the sale is the Pebble 2 M350s mouse. That's on sale for $25, which marks a record low. The wireless mouse usually costs $30. The Pebble 2 is available in black, white or a fetching rose to match the aforementioned webcam. It too is built with at least 58 percent certified post-consumer recycled plastic.

This low-profile mouse has quiet clicking sounds and is highly portable, making it a good fit for those who move around with their laptops. It supports Bluetooth 5.1 and the Logitech Bolt receiver, and it's able to pair with up to three devices (you can switch between them using a button on the base). The middle button is customizable and supports shortcuts. Logitech says the Pebble 2 M350s will run for up to two years before you have to change the battery.

Follow @EngadgetDeals on Twitter and subscribe to the Engadget Deals newsletter for the latest tech deals and buying advice.

This article originally appeared on Engadget at https://www.engadget.com/logitech-mice-webcams-and-accessories-are-up-to-25-percent-off-at-amazon-201429217.html?src=rss

Audio Radar helps gamers with hearing loss 'see' sound effects instead

Audio cues can sometimes be crucial for success in games. Developers frequently design the sound environment for their experiences to be not only rich and immersive, but to also contain hints about approaching enemies or danger. Players who are hard of hearing can miss out on this, and it's not fair for them to be disadvantaged due to a disability. A product called Audio Radar, which launched at CES 2024, can help turn sound signals into visual cues so that gamers with hearing loss can "see the sound," according to its maker, AirDrop Gaming LLC.

The setup is fairly simple. A box plugs into a gaming console to interpret audio output and converts that data into lights. A series of RGB light bars surround the screen, and display different colors depending on the type of sound coming from the respective direction they represent. Put simply, it means that if you're walking around a Minecraft world, like I did at the company's booth on the show floor, you'll see lights of different colors appear on the different bars.

Red lights mean sounds from enemies are in the area adjacent to the corresponding light, while green is for neutral sounds. An onscreen legend also explains what the sounds mean, though that might just be for the modded Minecraft scenario on display at CES. 


I walked around the scene briefly, and could see green lights hovering above a pen of farm animals, while purple lights fluttered in tandem with a dragon flying overhead. I did find it a little confusing, but that is probably due more to the fact that I know very little about Minecraft, and as someone with hearing I might not appreciate the added information as much as someone without.

With an SDK that the company launched at the show, developers will be able to customize the lights and visual feedback to elements in their game so that they have control over what their hard-of-hearing gamers see. In the meantime, Audio Radar is using its own software to detect stereo or surround sound signals to convert to feedback in lights and colors. 

Though the product may seem in its early stages, various major gaming companies appear to have indicated interest in Audio Radar. AirDrop Gaming's CEO Tim Murphy told me that Logitech is "providing support as we further develop our product and design our go-to-market strategy." Also, Microsoft CEO Satya Nadella was spotted at the booth on opening day.

Audio Radar is beginning to ship on a wider level this year, and the company continues to develop products for gamers who are deaf and hard of hearing, among other things. The system works with Xbox, PlayStation and PC.

We're reporting live from CES 2024 in Las Vegas from January 6-12. Keep up with all the latest news from the show here.

This article originally appeared on Engadget at https://www.engadget.com/audio-radar-helps-gamers-with-hearing-loss-see-sound-effects-instead-195001226.html?src=rss

NASA confirms 2023 was the hottest year on record

Want some bad news as a lead up to the weekend? NASA just released its annual global temperature report and, lo and behold, 2023 was the hottest year on record since measurements began back in 1880. Global temperatures last year were approximately 2.1 degrees Fahrenheit (1.2 degrees Celsius) above the average for NASA’s baseline period of 1951 to 1980.

Compared to the 1880s, the planet was 2.5 degrees Fahrenheit warmer in 2023. If you do the math, you’ll find that the vast majority of that increase occurred after NASA’s baseline period. In other words, the past several decades have been the worst of the worst. July of 2023 was the hottest month ever measured, which is a record nobody wanted or asked for but, well, here we are.

“NASA and NOAA’s global temperature report confirms what billions of people around the world experienced last year; we are facing a climate crisis,” said NASA Administrator Bill Nelson. “From extreme heat, to wildfires, to rising sea levels, we can see our Earth is changing.”

NASA’s not burying its head in the sand and pretending this is a natural phenomenon. We did this: Gavin Schmidt, director of the Goddard Institute for Space Studies (GISS), said the temperature shift was primarily caused by “our fossil fuel emissions.”

2023 was not an outlier. The past ten consecutive years have been the warmest on record. To that end, the U.S. National Oceanic and Atmospheric Administration (NOAA) recently reported that 2024 has a one-in-three chance of being even hotter. Yay.

It’s also worth noting that 2023 featured some cooling events that actually worked to lower temperatures a bit, including volcanic aerosols in the atmosphere due to the January 2022 eruption of the Hunga Tonga-Hunga Ha’apai underwater volcano. However, these events couldn’t keep up with constant greenhouse gas emissions and the heating effects of this year’s El Niño weather event.

“We will continue to break records as long as greenhouse gas emissions keep going up,” Schmidt said. “And, unfortunately, we just set a new record for greenhouse gas emissions again this past year.”

The Biden-Harris administration has done a few things to try to slow down our transformation into a Mad Max dystopia. The White House recently launched the U.S. Greenhouse Gas Center to make critical climate data readily available and last year’s Inflation Reduction Act set aside $369 billion for climate and clean energy programs. President Biden has also pledged to bring emission levels to at least 50 percent below 2005 levels by 2030. These are good incremental moves, of that there’s no doubt, but we seem to have sped past “f*ck around” and are careening wildly into “find out.” What was that curse again? Oh yeah. May you live in interesting times.

This article originally appeared on Engadget at https://www.engadget.com/nasa-confirms-2023-was-the-hottest-year-on-record-193626460.html?src=rss