Posts with «natural phenomena» label

The ice caps are melting. Is geoengineering the solution?

Since 1979, Arctic ice has shrunk by 1.35 million square miles; a new JPL study found that ice loss in Greenland is far worse than previously thought; and Antarctic ice is now at the lowest level since records began. The more the caps melt, the faster the remaining ice declines, until we’re faced with a series of catastrophes. The most immediate is sea level rise, which threatens to eradicate whole nations situated on low-lying islands. How do we stop such a problem? While we remedy the longer-term issues around fossil fuel consumption, we may have to buy ourselves more time with geoengineering.

The severity of this situation can’t be stressed enough. Professor John Moore of the Arctic Center, University of Lapland, says that we’re long past the point where emissions reductions alone will be effective. “We are faced with this situation where there’s no pathway to 1.5 [degrees] available through mitigation,” he said. “Things like the ice sheets [melting] and other tipping points will happen regardless.” Likening the Earth’s present situation to a patient bleeding out on the operating table, he added: “we are in this situation where we cannot mitigate ourselves out of the shit.”

Moore is one of the figures behind Frozen Arctic, a report produced by the University of the Arctic and the University of Lapland alongside the UN-backed thinktank GRID-Arendal. It’s a rundown of sixty geoengineering projects that could slow down or reverse polar melting. A team of researchers opted to examine every idea, from those already in place to the ones at the fringes of science. “We wanted to be thorough,” said Moore, “because even the craziest idea might have a nugget of gold in there.” Each approach is given a brief analysis, examining whether it’s feasible on a scientific or practical basis, whether it would be potentially helpful and how much it would cost. The report even went so far as to look at pykrete, a wacky World War Two-era material made by mixing sawdust or paper products into ice, proposed as a way to build artificial glaciers for strategic use.

If you’re curious and don’t have a day or two to read the report yourself, the approaches boil down to a handful of categories. The first is Solar Radiation Management, i.e. making the polar regions more reflective to bounce away more of the sun’s heat. Second, there’s artificial ice generation to compensate for what has already been lost. Third, enormous engineering work to buttress, isolate and protect the remaining ice — like massive undersea walls that act as a barrier against warming seas. Finally, there are measures that nibble at the edges of the problem in terms of effect, but are more viable in the long term, like preventing flora and fauna (and the warmth they radiate) from encroaching on regions meant to remain frozen.

If you’re a climate scientist, the most obvious approach is likely the first, because we’ve seen its effects before. Albedo is the climate-science term for how much sunlight a surface reflects; white ice acts as an enormous reflector, bouncing a lot of the sun’s heat back into space. Ice ages dramatically increase albedo, but there are more recent examples in living memory: In 1991, Mount Pinatubo, a volcano in the Philippines, erupted, spewing an enormous amount of volcanic ash and sulfur dioxide into the atmosphere. (The event also caused a large amount of damage, displaced 200,000 people and claimed the lives of at least 722.) According to NOAA, the material dumped into the atmosphere reflected enough solar heat away from the Earth to cause a temporary global cooling effect of roughly 0.5 degrees Celsius. The devastation of Pinatubo isn’t desirable, nor was the ozone depletion it caused, but that cooling effect could be vital to slowing global warming and polar melting.
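The leverage of albedo is easy to see in the standard zero-dimensional energy-balance model. This is a textbook back-of-the-envelope tool, not something from the Frozen Arctic report: absorbed sunlight scales with (1 − albedo), and the planet’s effective temperature is whatever balances that absorption against thermal emission.

```python
# Zero-dimensional energy-balance model: a planet's effective temperature
# as a function of albedo, the fraction of incoming sunlight it reflects.
SOLAR_CONSTANT = 1361.0       # W/m^2, mean solar irradiance at Earth
STEFAN_BOLTZMANN = 5.670e-8   # W/m^2/K^4

def equilibrium_temp_k(albedo):
    """Temperature at which emitted heat balances absorbed sunlight."""
    absorbed = SOLAR_CONSTANT * (1.0 - albedo) / 4.0  # averaged over the sphere
    return (absorbed / STEFAN_BOLTZMANN) ** 0.25

# Earth's present-day albedo is roughly 0.30; nudging it upward cools things.
for albedo in (0.30, 0.31, 0.32):
    print(f"albedo {albedo:.2f}: {equilibrium_temp_k(albedo):.1f} K")
```

In this toy model, each extra hundredth of albedo shaves a bit under one kelvin off the effective temperature, which is why reflectivity schemes get so much attention despite their side effects.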

It’s possible to replicate this artificially by seeding clouds with chemicals deposited from an airplane or from ground-based generators, a tactic China already uses to promote rain for agriculture and alleviate drought-like conditions. In this context, the clouds would act as a barrier between the sun and the ice caps, bouncing more solar radiation away from the Earth’s surface. Unfortunately, the approach is incredibly expensive and incredibly fussy: the report says it’s only viable when the right clouds are overhead, and the work would require enormous infrastructure to be built nearby. And while there are small shreds of evidence to suggest it might be useful, nothing has been proven as yet.

And then there are the second-order effects, when these approaches spill over into the rest of the global ecosystem. “If you do sunlight reflection methods and you put anything up in the atmosphere, it doesn’t stay where you put it.” That’s the big issue identified by Dr. Phil Williamson, honorary associate professor at the University of East Anglia and a former contributor to the UN’s keystone Intergovernmental Panel on Climate Change reports. His concern is that regional, targeted climate solutions “don’t solve the problem for the whole world,” and that if you’re not tackling climate change on a global scale, then you’re “just accentuating the difference.” With a cold Arctic, but rising temperatures elsewhere, you’re climbing aboard a “climate rollercoaster.”

Second in the ranking of hail-mary climate approaches is to build a freezer of sorts, both to cool down the existing ice and to make more. Sadly, many ideas in this area forget that ice sheets are not big blocks of immovable ice; they shift constantly. Take the idea of drilling down two miles or so into the ice sheet and pumping out the warm water beneath to cool it down: thanks to the constantly moving ice and water, a new site would need to be drilled fairly regularly.

There’s another problem: The report says one project to bore a hole down 2.5km (1.5 miles) burned 450,000 liters of fuel. Not to mention how much energy it would consume to run the heat exchangers or freezers to create fresh ice on such a scale. That's a considerable amount of greenhouse gas pollution for a project meant to undo that exact type of damage. Dumping a layer of artificially-made snow on a mountain may work fine for a ski resort when the powder’s a little thin, but not the whole planet.

As hard as the scientific and engineering battles will be, there’s also a political one that needs addressing. “A lot of people get quasi-religiously upset about putting stuff into the stratosphere,” said Moore. “You’d think they’d get similarly upset about greenhouse gasses.” One strategy under consideration is to inject sulfur dioxide into the stratosphere to replicate the cooling effects observed after major volcanic eruptions, where it would form a haze of sulfate aerosol that blocks some heat from reaching the ice. But if you, like me, have a high school-level knowledge of science, that’s a scary prospect, given that sulfur dioxide resolves to sulfuric acid. Given the microscopic quantities involved, there would be little-to-no impact on the natural world. But the image of acid rain pouring down from the clouds means it’d be a hard sell to an uninformed population.

But if there is a reason for concern, it’s that any unintended consequences could pose a problem in the global political space. “It’s almost like declaring war on the rest of the world if [a nation] goes it alone,” says Phil Williamson, “because any damage or alteration to the global climate system, the country that did it is responsible for all future climatic disasters because the weather isn’t the same.”

Of course, Moore knows the Frozen Arctic report isn’t optimistic about a quick fix; he feels its conclusions should serve as a wake-up call for the planet. “Nobody is going to scale up something for the entire Arctic Ocean overnight,” he said, but this is the time to “find ideas that might be valuable [...] and then put resources into finding out if [those ideas] really are useful.” He added that the short window before a total climate disaster isn’t much of an issue: “engineers can pretty much do anything you ask them to if you put enough resources into it.” Because the alternative is to do nothing, and “every day that we choose to do nothing, we accept more of the damages that are coming.”

This article originally appeared on Engadget at

AI is starting to outperform meteorologists

A machine learning-based weather prediction program developed by DeepMind researchers, called GraphCast, can predict weather variables over a span of 10 days in under one minute. In a report, the scientists highlight that GraphCast outperformed traditional weather prediction technologies on 90 percent of the verification targets tested.

The AI-powered weather prediction program works by taking in “the two most recent states of Earth’s weather,” meaning the variables from the time of the test and from six hours prior. Using that data, GraphCast predicts what the state of the weather will be six hours later, and each forecast can be fed back in as input to push the prediction out in further six-hour steps.
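That feed-the-forecast-back-in pattern is an autoregressive rollout: 40 six-hour steps cover 10 days. The sketch below uses a one-line stand-in for the model, since the real thing is a large graph neural network; only the loop structure is the point here.

```python
def rollout(model, state_prev, state_curr, steps=40):
    """Autoregressive forecast: each six-hour prediction becomes the next
    input. 40 steps of 6 hours = a 10-day forecast."""
    forecasts = []
    for _ in range(steps):
        state_next = model(state_prev, state_curr)  # predict 6 h ahead
        forecasts.append(state_next)
        state_prev, state_curr = state_curr, state_next  # slide the window
    return forecasts

# Toy stand-in "model": linearly extrapolates a single scalar variable.
toy_model = lambda prev, curr: curr + (curr - prev)
print(rollout(toy_model, 10.0, 11.0, steps=3))  # [12.0, 13.0, 14.0]
```

One caveat the loop makes visible: every per-step error is fed back in as input, which is why forecast skill degrades as the horizon stretches toward day 10.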

In practice, the AI has already showcased its applicability in the real world. The tool predicted the landfall of Hurricane Lee in Nova Scotia 10 days before it happened, while the traditional weather prediction systems meteorologists were using at the time lagged behind. Forecasts made by standard weather simulations can take longer because those models have to work through complicated physics and fluid dynamics to make accurate predictions.

Not only does the algorithm outperform traditional forecasting technologies in pace and scale, but GraphCast can also predict severe weather events, including tropical cyclones and waves of extreme temperatures over regions. And because the algorithm can be retrained with recent data, scientists believe the tool will only get better at predicting oscillations in weather patterns that coincide with broader shifts driven by climate change.

Soon, GraphCast, or at least the AI that powers its predictions, might find its way into more mainstream services. According to Wired, Google may be exploring how to integrate GraphCast into its products. The call for better storm modeling has already paved a path for supercomputers in the space: the National Oceanic and Atmospheric Administration (NOAA) says it has been working on models that provide more accurate readings of when severe weather events might occur and, importantly, better intensity forecasts for hurricanes.


Comcast debuts Storm-Ready WiFi device ahead of hurricane season

A storm often evokes a desire to feel safe inside your home, able to communicate with loved ones or emergency personnel — yet electric and landline connections are often the first systems to go down. Comcast is attempting to solve this problem with the release of Storm-Ready WiFi, a connection-backup device the company claims is the first of its kind built by an internet provider. It's powered by Xfinity's 10G Network, supports WiFi 6 and works as a WiFi extender during better weather.

As another reminder of the terrifying impact of climate change, Comcast cites the nationwide increase in storms as the reason this extra device is necessary. Yes, you can file Storm-Ready WiFi under how to be more comfortable as the world burns — extremely dark, to say the least.

The service is designed to seamlessly transfer your connection over to Storm-Ready WiFi in the case of a power outage. Storm-Ready WiFi's battery lasts about four hours at a time (an average power outage in the US lasts about two hours). Of course, it's not much good if your phone or computer runs out of battery, but otherwise, you can work (or better yet, watch all the movies you want) while the sky opens up around you. Storm-Ready WiFi is available to buy now for $7 per month for 36 months, both in-store and online.


Android's earthquake warning system failed in Turkey, according to the BBC

Google's earthquake warning system for Android is supposed to provide notices in time to reach safety, but that might not have happened following the quake in Turkey on February 6th. BBC investigators claim that none of the hundreds of people they talked to in three Turkish cities received an alert before the first tremor hit. Only a "limited number" got an alert for a second tremor, investigators say.

We've asked Google for comment. Product lead Micah Berman tells the BBC millions of people in Turkey received earthquake alerts, although the company hasn't shared data indicating widespread notifications. Google did show a handful of social media posts from people who said they received a warning, but only one was for the first quake. Berman says he doesn't have a "resounding answer" as to why social networks were quiet about alerts, but does note the nature of a quake and the reliability of internet access can affect the system.

The Android Earthquake Alert System uses the accelerometer (that is, motion sensing) in phones to effectively crowdsource warnings. If many phones vibrate at the same time, Google can use the collective data to find the epicenter and magnitude of the quake, automatically sending a warning to people who are likely to feel the brunt of the shaking. While there's no more than a minute's notice, that can be enough time to find cover or evacuate. The technology can theoretically help people in areas where conventional warnings are unavailable.
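Here's a minimal sketch of how that crowd-detection step could work. Everything in it is illustrative: the report format, the thresholds and the centroid-of-earliest-reports heuristic for the epicenter are assumptions for the example, not Google's actual pipeline.

```python
from statistics import mean

def detect_quake(reports, min_reports=50, window_s=2.0):
    """reports: (timestamp_s, lat, lon) tuples from phones that felt shaking.
    If enough phones report within a short window, call it a likely quake and
    estimate the epicenter from the earliest reports, which sit closest to
    the source because the shaking reaches them first."""
    reports = sorted(reports)
    for i, (t0, _, _) in enumerate(reports):
        cluster = [r for r in reports[i:] if r[0] - t0 <= window_s]
        if len(cluster) >= min_reports:
            earliest = cluster[: max(5, len(cluster) // 10)]
            return mean(r[1] for r in earliest), mean(r[2] for r in earliest)
    return None  # too few simultaneous reports to call it a quake
```

A real system would also weight reports by shaking intensity and fit a wave-propagation model to estimate magnitude, but the gating idea — many phones, one small time window — is the same.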

The concern is that the system might have failed during a strong (7.8-magnitude) earthquake. Even if it worked, it's not clear how many people should and do receive warnings in cases like this, not to mention milder incidents. Without more data, it's not certain that Android's quake alerts are reliable substitutes for traditional warnings over radio and TV.


The USGS warning system that knows when rumbling volcanoes will blow their mountain tops

More than 120 volcanic eruptions have occurred in the United States in the 42 years since Mount St. Helens erupted in Washington in 1980, killing 57 people and inflicting over a billion dollars in property damage. While none have been nearly as destructive, even smaller eruptions can impact human activities and economies hundreds of miles away. Altogether, the US Geological Survey (USGS) has identified 161 geologically active volcanoes in 14 states and territories, a third of which constitute “high” or “very high” threats to their surrounding communities, with another 58 volcanoes nationwide classified as undermonitored. The agency operates five volcano observatories to keep an eye on all but the least dangerous as part of the Survey’s Volcano Hazards Program. Worldwide, around 60 volcanoes erupt annually, as Hawaii’s Mauna Loa is doing right now.

Mauna Loa, which had stood dormant for the past 38 years, reawakened late Sunday night for the eighth time since 1843. “Lava flows are not threatening any downslope communities and all indications are that the eruption will remain in the Northeast Rift Zone,” reads Monday’s red alert update from the USGS Hawaiian Volcano Observatory (HVO). “Volcanic gas and possibly fine ash and Pele's Hair may be carried downwind. Residents at risk from Mauna Loa lava flows should review preparedness and refer to Hawai‘i County Civil Defense information for further guidance.” This week’s eruption is decidedly mild compared to 2018’s Kīlauea Volcano event that destroyed 700 homes and launched ash 3,000 meters into the atmosphere, where it disrupted air traffic patterns.

While lava receives the majority of public attention, volcanoes have myriad methods of ruining your week with fire and (literal) brimstone. Volcanic ash can travel miles into the stratosphere before raining back down, where it exacerbates chronic lung diseases like asthma and emphysema; carbon dioxide and hydrogen sulfide collect in low-lying areas, suffocating the unwary; and seismic shifts from the initial explosion can trigger landslides, tsunamis, floods and large-scale power outages.

“Unlike many other natural disasters … volcanic eruptions can be predicted well in advance of their occurrence if adequate in-ground instrumentation is in place that allows earliest detection of unrest, providing the time needed to mitigate the worst of their effects,” David Applegate, USGS associate director for natural hazards, told a House subcommittee in 2017.

As Eos magazine points out, nobody died as a result of the 2018 Kīlauea eruption, in large part due to the efforts of monitors at the HVO. But a 2018 threat assessment found that, of the 18 volcanoes listed as “very high” threat, only three — Mauna Loa, St. Helens and the Long Valley Caldera — were rated as “well monitored” at the time of that eruption.

On the same day that Kīlauea blew its top, the US Senate unanimously passed S.346, establishing the National Volcano Early Warning and Monitoring System (NVEWS). The following March, the House of Representatives passed its version as part of PL 116-9 (S.47), dubbed the John D. Dingell Jr. Conservation, Management, and Recreation Act. Not unlike California’s ShakeAlert earthquake early warning scheme, the NVEWS works to combine and standardize the existing hodgepodge of (often outdated) volcano monitoring hardware operated by both government agencies and academic organizations into a unified system, “to ensure that the most hazardous volcanoes will be properly monitored well in advance of the onset of activity.”


“Improvements to volcano monitoring networks allow the USGS to detect volcanic unrest at the earliest possible stage,” Tom Murray, USGS Volcano Science Center director, said in a 2018 USGS release. “This provides more time to issue forecasts and warnings of hazardous volcanic activity and gives at-risk communities more time to prepare.”

The NVEWS Act, which was sponsored by Senator Lisa Murkowski (R - AK), earmarks $55 million annually between 2019 and 2023 to provide more accurate and timely eruption forecasts by increasing partnerships with local governments and proactively sharing data with the volcano science community. It also seeks to increase staffing and systems — from broadband seismometers, infrasound arrays, and real-time continuous GPS receivers, to streaming webcams, satellite overwatch and volcanic gas sensors — for 24/7 volcano monitoring and establishes a grant system for furthering volcanology research.


The USGS ranks volcanic threats based on the risk they pose to public health and property — essentially, how potentially destructive the volcano is in relation to how many people and things might be impacted when it erupts. The USGS assigns numerical values to 24 hazard and exposure factors for each volcano, which are then combined into an overall threat score divided into five levels (like DEFCONs!). High and Very High threats get the most detailed monitoring coverage, because duh; Moderate-threat volcanoes still receive real-time monitoring but don’t have Tommy Lee Jones standing by to intercede; and Low (and Very Low) get checked on as needed. As of May 2022, when the USGS submitted its second annual NVEWS report to Congress, the agency had spent just under half of the money appropriated for FY 2021, with the funds going to activities like installing a next-gen lahar detection system on Mount Rainier and upgrading the telemetry for more than two dozen observation posts throughout Alaska, Oregon, Washington, California and Hawaii.
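For the curious, the published NVEWS methodology scores hazard and exposure separately, then multiplies the two sums to get the overall threat score. In the sketch below, the factor values and level cutoffs are invented for illustration; only the sum-then-multiply-then-bin shape reflects the real scheme.

```python
def threat_level(hazard_factors, exposure_factors):
    """Combine per-volcano factor scores into one of five threat levels.
    hazard_factors: how destructive an eruption could be (eruption style,
    lahar potential, etc.); exposure_factors: who and what is in the way."""
    score = sum(hazard_factors) * sum(exposure_factors)
    for threshold, level in [(200, "very high"), (100, "high"),
                             (50, "moderate"), (25, "low")]:
        if score >= threshold:
            return score, level
    return score, "very low"

print(threat_level([1, 2, 3, 4, 2], [1, 4, 3, 2, 5, 3]))  # (216, 'very high')
```

Multiplying rather than adding captures the intuition that a ferociously hazardous volcano with nobody nearby, or a mild one looming over a city, both score lower than a dangerous volcano next to a population center.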

Waymo is using its self-driving taxis to create real-time weather maps

Self-driving cars frequently have trouble with poor weather, but Waymo thinks it can overcome these limitations by using its autonomous taxis as weather gauges. The company has revealed that its latest car sensor arrays are creating real-time weather maps to improve ride hailing services in Phoenix and San Francisco. The vehicles measure the raindrops on windows to detect the intensity of conditions like fog or rain.

The technology gives Waymo a much finer-grained view of conditions than it gets from airport weather stations, radar and satellites. It can track the coastal fog as it rolls inland, or drizzle that radar would normally miss. While that's not as important in a dry locale like Phoenix, it can be vital in San Francisco and other cities where the weather can vary wildly between neighborhoods.

There are a number of practical advantages to gathering this data, as you might guess. Waymo is using the info to improve its Driver AI's ability to handle rough weather, including more realistic simulations. The company also believes it can better understand the limits of its cars and set higher requirements for new self-driving systems. The tech also helps Waymo One better serve ride hailing passengers at a given time and place, and gives Waymo Via trucking customers more accurate delivery updates.

The current weather maps have their limitations. They may help in a warm city like San Francisco, where condensation and puddles are usually the greatest problems, but they won't be as useful for navigating snowy climates where merely seeing the lanes can be a challenge. There's also the question of whether or not it's ideal to have cars measure the very conditions that hamper their driving. This isn't necessarily the safest approach.

This could still go a long way toward making Waymo's driverless service more practical, though. Right now, companies like Waymo and Cruise aren't allowed to operate in heavy rain or fog using their California permits — the weather monitoring could help these robotaxi firms serve customers looking for dry rides home.

NYU is building an ultrasonic flood sensor network in New York's Gowanus neighborhood

People made some 760 million trips aboard New York’s subway system last year. Granted, that’s down from around 1.7 billion trips pre-pandemic, but it still far outpaced the next two largest transit systems — DC’s Metro and the Chicago Transit Authority — combined. So when major storms, like last year’s remnants of Hurricane Ida, nor'easters, heavy downpours or swelling tides swamp New York’s low-lying coastal areas and infrastructure, it’s a big deal.


And it’s a deal that’s only getting bigger thanks to climate change. Sea levels around the city have already risen a foot in the last century, with another 8- to 30-inch increase expected by mid-century and up to 75 additional inches by 2100, according to the New York City Panel on Climate Change. To help city planners, emergency responders and everyday citizens alike better prepare for 100-year storms that are increasingly happening every couple of years, researchers from NYU’s Urban Flooding Group have developed a street-level sensor system that can track rising floodwaters in real time.

The city of New York is set atop a series of low lying islands and has been subject to the furies of mid-Atlantic hurricanes throughout its history. In 1821, a hurricane reportedly hit directly over the city, flooding streets and wharves with 13-foot swells rising over the course of just one hour; a subsequent Cat I storm in 1893 then scoured all signs of civilization from Hog Island, and a Cat III passed over Long Island, killing 200 and causing major flooding. Things did not improve with the advent of a storm naming convention. Carol in 1954 also caused citywide floods, Donna in ‘60 brought an 11-foot storm surge with her, and Ida in 2021 saw an unprecedented amount of rainfall and subsequent flooding in the region, killing more than 100 people and causing nearly a billion dollars in damages.


As the NYC Planning Department explains, when it comes to setting building codes, zoning and planning, the city works off of FEMA’s Preliminary Flood Insurance Rate Maps (PFIRMs) to calculate an area’s flood risk. PFIRMs cover the areas where “flood waters are expected to rise during a flood event that has a 1 percent annual chance of occurring,” sometimes called the 100-year floodplain. As of 2016, some 52 million square feet of NYC coastline falls within that categorization, impacting 400,000 residents — more than the entire populations of Cleveland, Tampa or St. Louis. By 2050, that area of effect is expected to double and the probability of 100-year floods occurring could triple, meaning the chance that your home will face significant flooding over the course of a 30-year mortgage would jump from around 26 percent today to nearly 80 percent by mid-century.
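The mortgage math is straightforward compounding of annual chances: the probability of at least one 100-year flood in n years is 1 − (1 − p)^n, assuming independent years. Note that compounding alone turns a tripled annual chance into roughly 60 percent over 30 years, so the near-80 percent projection presumably also reflects the floodplain itself doubling in size.

```python
def cumulative_flood_risk(annual_prob, years=30):
    """Chance of at least one flood over `years`, assuming independent years."""
    return 1.0 - (1.0 - annual_prob) ** years

print(f"{cumulative_flood_risk(0.01):.0%}")  # 1%/yr over 30 years -> 26%
print(f"{cumulative_flood_risk(0.03):.0%}")  # tripled annual chance -> 60%
```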


As such, responding to today’s floods while preparing for worsening events in the future is a critical task for NYC’s administration, requiring coordination between government agencies and NGOs at the local, state and federal levels. FloodNet, a program launched first by NYU and expanded with help from CUNY, operates on the hyperlocal level to provide a street-by-street look at flooding throughout a given neighborhood. The program began with NYU’s Urban Flooding Group.

“We are essentially designing, building and deploying low cost sensors to measure street level flooding,” Dr. Andrea Silverman, environmental engineer and Associate Professor at NYU’s Department of Civil and Urban Engineering, told Engadget. “The idea is that it can provide badly needed quantitative data. Before FloodNet, there was no quantitative data on street level flooding, so people didn't really have a full sense of how often certain locations were flooding — the duration of the floods, the depth, rates of onset and drainage, for example.”


“And these are all pieces of information that are helpful for infrastructure planning, for one, but also for emergency management,” she continued. “So we do have our data available; they send alerts to folks that are interested, like the National Weather Service and emergency management, to help inform their response.”

FloodNet is currently in early development, with just 23 sensor units erected on 8-foot posts throughout the Gowanus neighborhood in Brooklyn, though the team hopes to expand the network to more than 500 units citywide within the next half decade. Each FloodNet sensor is a self-contained, solar-powered system that uses ultrasound as an invisible rangefinder: as flood waters rise, the distance between the street surface and the sensor shrinks, and the difference between that reading and the dry-street baseline shows how much the water level has risen. The NYU team opted for an ultrasound-based solution rather than, say, lidar or radar because ultrasonic hardware is slightly less expensive and provides more focused return data, as well as being more accurate and requiring less maintenance than a basic contact water sensor.
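The measurement at the heart of each unit is just subtraction: water depth is the sensor-to-dry-street baseline minus the current echo distance. A minimal sketch, assuming readings in millimeters and a median filter to reject stray echoes (the function and the filter are illustrative; FloodNet's real firmware lives on its GitHub page and differs in the details):

```python
from statistics import median

def flood_depth_mm(baseline_mm, readings_mm):
    """Standing-water depth under an ultrasonic rangefinder.
    baseline_mm: distance to the dry street, measured at install time.
    readings_mm: a burst of recent range readings; taking the median
    rejects one-off echoes from passing cars or floating debris."""
    current = median(readings_mm)
    return max(0.0, baseline_mm - current)

# Sensor mounted 2,440 mm (8 ft) above the street; echoes now ~2,300 mm.
print(flood_depth_mm(2440.0, [2301.0, 2299.0, 2600.0, 2300.0, 2302.0]))  # 139.0
```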

The data each sensor produces is transmitted wirelessly over LoRa to a gateway hub, which can pull from any sensor within a one-mile radius and push the readings over the internet to FloodNet’s servers. The data is then displayed in real time on the FloodNet homepage.


“The city has invested a lot in predictive models [estimating] where it would flood with a certain amount of rain, or increase in tide,” Silverman said. Sensors won’t have to be installed on every corner to be most effective, she pointed out. There are “certain locations that are more likely to be flood prone because of topology or because of the sewer network or because of proximity to the coast, for example. And so we use those models to try to get a sense of locations where it may be most flood-prone,” as well as reach out to local residents with first-hand knowledge of likely flood areas.

In order to further roll out the program, the sensors will need to undergo a slight redesign, Silverman noted. “The next version of the sensor, we're taking what we've learned from our current version and making it a bit more manufacturable,” she said. “We're in the process of testing that and then we're hoping to start our first manufacturing round, and that's what's going to allow us to expand out”.

FloodNet is an open-source venture, so all of the sensor schematics, firmware, maintenance guides and data are freely available on the team’s GitHub page. “Obviously you need to have some sort of technical know-how to be able to build them — it may not be right now where just anyone could go build a sensor, deploy it and be online immediately, in terms of being able to just generate the data, but we're trying to get there,” Silverman conceded. “Eventually we'd love to get to a place where we can have the designs written up in a way that anyone can approach it.”

Puerto Rico loses power as Hurricane Fiona brings threat of 'catastrophic' flooding

Almost exactly five years after Hurricane Maria left Puerto Rico in the dark, the US territory is once again facing a power crisis. On Sunday, LUMA Energy, the company that operates the island’s electrical grid, announced that all of Puerto Rico had suffered a blackout due to Hurricane Fiona, reports Reuters.

With the storm nearing the island’s southwest coast, the National Hurricane Center warned of “catastrophic” flooding as Fiona began producing winds with recorded speeds of 85 miles per hour. Without even making landfall, the storm left a third of LUMA’s customers without power. On Twitter, Puerto Rico Governor Pedro Pierluisi said the government was working to restore power, but after the events of five years ago, there’s worry there won’t be an easy fix.

#BREAKING All of #PuertoRico plunged into darkness (once again) after another hurricane unleashes its fury on their fragile electrical grid. #Fiona Brings back memories of #Maria 5 years ago

— Derek Van Dam (@VanDamCNN) September 18, 2022

In 2017, Hurricane Maria caused the largest blackout in US history when the Category 5 storm battered Puerto Rico, leaving 3.4 million people without power. The island had only recently begun rebuilding its weakened infrastructure, with blackouts a daily occurrence in some areas. Officials have tried to stress that Hurricane Fiona won’t bring a repeat of 2017. “This is not Maria, this hurricane will not be Maria,” Abner Gomez, the head of public safety and crisis management at LUMA Energy, told CNN before Sunday’s power outage.

Hitting the Books: How hurricanes work

Hurricane season is currently in full swing across the Gulf Coast and Eastern Seaboard. Following a disconcertingly quiet start in June, meteorologists still expect a busier-than-usual stretch before the windy weather (hopefully) winds down at the end of November. One such meteorologist is Matthew Cappucci, who, in his new book, Looking Up: The True Adventures of a Storm-Chasing Weather Nerd, recounts his career as a storm chaser — from childhood obsession to adulthood obsession turned gainful employment. In the excerpt below, Cappucci explains the inner workings of tropical storms.


Excerpted from Looking Up: The True Adventures of a Storm-Chasing Weather Nerd by Matthew Cappucci. Published by Pegasus Books. Copyright © 2022 by Matthew Cappucci. All rights reserved.

Hurricanes are heat engines. They derive their fury from warm ocean waters in the tropics, where sea surface temperatures routinely hover in the mid- to upper-eighties between July and October. Hurricanes and tropical storms fall under the umbrella of tropical cyclones. They can be catastrophic, but they have a purpose—some scholars estimate they’re responsible for as much as 10 percent of the Earth’s annual equator-to-pole heat transport.

Hurricanes are different from mid-latitude systems. So-called extratropical, or nontropical, storms depend upon variations in air temperature and density to form, and feed off of changing winds. Hurricanes require a calm environment with gentle upper-level winds and a nearly uniform temperature field. Ironic as it may sound, the planet’s worst windstorms are born out of an abundance of tranquility.

The first ingredient is a tropical wave, or clump of thunderstorms. Early in hurricane season, tropical waves can spin up on the tail end of cold fronts surging off the East Coast. During the heart of hurricane season in August and September, they commonly materialize off the coast of Africa in the Atlantic’s Main Development Region. By October and November, sneaky homegrown threats can surreptitiously gel in the Gulf of Mexico or Caribbean.

Every individual thunderstorm cell within a tropical wave has an updraft and a downdraft. The downward rush of cool air collapsing out of one cell can suffocate a neighboring cell, spelling its demise. In order for thunderstorms to coexist in close proximity, they must organize. The most efficient way of doing so is through orienting themselves around a common center, with individual cells’ updrafts and downdrafts working in tandem.

When a center forms, a broken band of thunderstorms begins to materialize around it. Warm, moist air rises within those storms, most rapidly as one approaches the broader system’s low-level center. That causes atmospheric pressure to drop, since air is being evacuated and mass removed. From there, the system begins to breathe.

Air moves from high pressure to low pressure. That vacuums air inward toward the center. Because of the Coriolis force, a product of the Earth’s spin, parcels of air take a curved path into the fledgling cyclone’s center. That’s what causes the system to rotate.

Hurricanes spin counterclockwise in the Northern Hemisphere, and clockwise south of the equator. Though the hottest ocean waters in the world are found on the equator, a hurricane could never form there. That’s because the Coriolis force is zero on the equator; there’d be nothing to get a storm to twist.
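The equator argument can be made concrete with the Coriolis parameter, f = 2Ω sin(latitude), where Ω is Earth's rotation rate. The sketch below (an illustration, not anything from the excerpt) shows that f vanishes at the equator and flips sign across it, which is why storms can't spin up there and why they rotate in opposite directions in the two hemispheres:

```python
import math

# Earth's rotation rate in radians per second
OMEGA = 7.292e-5

def coriolis_parameter(lat_deg: float) -> float:
    """Return the Coriolis parameter f = 2 * Omega * sin(latitude), in s^-1."""
    return 2 * OMEGA * math.sin(math.radians(lat_deg))

print(coriolis_parameter(0))    # exactly zero at the equator: nothing to twist a storm
print(coriolis_parameter(15))   # positive (counterclockwise spin) in the Northern Hemisphere
print(coriolis_parameter(-15))  # negative (clockwise spin) in the Southern Hemisphere
```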

As pockets of air from outside the nascent tropical cyclone spiral into the vortex, they expand as barometric pressure decreases. That releases heat into the atmosphere, causing clouds and rain. Ordinarily that would result in a drop in temperature of an air parcel, but because it’s in contact with toasty ocean waters, it maintains a constant temperature; it’s heated at the same rate that it’s losing temperature to its surroundings. As long as a storm is over the open water and sea surface temperatures are sufficiently mild, it can continue to extract oceanic heat content.

Rainfall rates within tropical cyclones can exceed four inches per hour thanks to high precipitation efficiency. Because the entire atmospheric column is saturated, there’s little evaporation to eat away at a raindrop on the way down. As a result, inland freshwater flooding is the number one source of fatalities from tropical cyclones.

The strongest winds are found toward the middle of a tropical storm or hurricane in the eyewall. The greatest pressure gradient, or change of air pressure with distance, is located there. The sharper the gradient, the stronger the winds. That’s because air is rushing down the gradient. Think about skiing — you’ll ski faster if there’s a steeper slope.

When maximum sustained winds surpass 39 mph, the system is designated a tropical storm. Only once winds cross 74 mph is it designated a hurricane. Major hurricanes have winds of 111 mph or greater and correspond to Category 3 strength. A Category 5 contains extreme winds topping 157 mph.
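Those thresholds form a simple ladder, sketched below. The 39, 74, 111, and 157 mph cutoffs come from the text; the intermediate Category 2 and Category 4 thresholds (96 and 130 mph) are from the standard Saffir-Simpson scale, not the excerpt:

```python
def classify(wind_mph: float) -> str:
    """Map maximum sustained winds (mph) to a Saffir-Simpson designation."""
    if wind_mph >= 157:
        return "Category 5"
    if wind_mph >= 130:
        return "Category 4"
    if wind_mph >= 111:
        return "Category 3 (major hurricane)"
    if wind_mph >= 96:
        return "Category 2"
    if wind_mph >= 74:
        return "Category 1"
    if wind_mph >= 39:
        return "Tropical storm"
    return "Tropical depression"

print(classify(40))   # Tropical storm
print(classify(75))   # Category 1
print(classify(160))  # Category 5
```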

Since the winds are derived from air rushing in to fill a void, or deficit of air, the fiercest hurricanes are usually those with the lowest air pressures. The most punishing hurricanes and typhoons may have a minimum central barometric pressure about 90 percent of ambient air pressure outside the storm. That means 10 percent of the atmosphere’s mass is missing.
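To put numbers on that "90 percent" figure: taking standard sea-level pressure (about 1013 hPa) as the ambient value, an assumption for illustration, the central pressure of such a storm works out to roughly 912 hPa:

```python
# Standard sea-level pressure, used here as an illustrative ambient value
AMBIENT_HPA = 1013.25

central = 0.90 * AMBIENT_HPA            # "about 90 percent of ambient"
deficit_pct = (AMBIENT_HPA - central) / AMBIENT_HPA * 100

print(round(central, 1))     # ~911.9 hPa central pressure
print(round(deficit_pct))    # ~10% of the atmosphere's mass missing
```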

Picture stirring your cup of coffee with a teaspoon. You know that dip in the middle of the whirlpool? The deeper the dip, or fluid deficit, the faster the fluid must be spinning. Hurricanes are the same. But what prevents that dip from filling in? Hurricane eyewalls are in cyclostrophic balance.

That means a perfect stasis of forces makes it virtually impossible to “fill in” a storm in steady state. Because of their narrow radius of curvature, parcels of air swirling around the eye experience an incredible outward-directed centrifugal force that exactly equals the inward tug of the pressure gradient force. That leaves them to trace continuous circles.
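Cyclostrophic balance can be written as v²/r = (1/ρ)(dp/dr): the centrifugal acceleration of air circling at speed v and radius r equals the pressure gradient force per unit mass. Solving for v gives a rough wind estimate. The eyewall radius and pressure gradient below are illustrative assumptions, not values from the excerpt:

```python
import math

def cyclostrophic_wind(radius_m: float, dp_dr: float, rho: float = 1.15) -> float:
    """Wind speed (m/s) at which outward centrifugal force exactly balances
    the inward pressure gradient force: v = sqrt((r / rho) * dp/dr).
    dp_dr is the radial pressure gradient in Pa/m; rho is air density in kg/m^3."""
    return math.sqrt(radius_m / rho * dp_dr)

# Assume a 20 km eyewall radius and a 50 hPa pressure drop across it
# (5000 Pa / 20000 m = 0.25 Pa/m) -- roughly major-hurricane territory.
v = cyclostrophic_wind(20_000, 0.25)
print(round(v))  # ~66 m/s, on the order of 150 mph
```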

If you’ve ever experienced a change in altitude, such as flying on an airplane, or even traveling to the top of a skyscraper, you probably noticed your ears popping. That’s because they were adjusting to the drop in air pressure with height. Now imagine all the air below that height vanished. That’s the equivalent air pressure in the eye of a major hurricane. The disparity in air pressure is why a hurricane is, in the words of Buddy the Elf, “sucky. Very sucky.”

Sometimes hurricanes undergo eyewall replacement cycles, which entail an eyewall shriveling and crumbling into the eye while a new eyewall forms around it and contracts, taking the place of its predecessor. This usually results in a dual wind maximum near the storm’s center as well as a brief plateau in intensification.

In addition to the scouring winds found inside the eyewall, tornadoes, tornado-scale vortices, mini swirls, and other poorly understood small-scale wind phenomena can whip around the eye and result in strips of extreme damage. A mini swirl may be only a couple yards wide, but a 70 mph whirlwind moving in a background wind of 100 mph can result in a narrow path of 170 mph demolition. Their existence was first hypothesized following the passage of Category 5 Hurricane Andrew through south Florida in 1992, and modern-day efforts to study hurricane eyewalls using mobile Doppler radar units have shed light on their existence. Within a hurricane’s eye, air sinks and warms, drying out and creating a dearth of cloud cover. It’s not uncommon to see clearing skies or even sunshine. The air is hot and still, an oasis of peace enveloped in a hoop of hell.

There’s such a discontinuity between the raucous winds of the eyewall and deathly stillness of the eye that the atmosphere struggles to transition. The eyes of hurricanes are often filled with mesovortices, or smaller eddies a few miles across, that help flux and dissipate angular momentum into the eye. Sometimes four or five mesovortices can cram into the eye, contorting the eyewall into a clover-like shape. That makes for a period of extraordinary whiplash on the inner edge of the eyewall as alternating clefts of calamitous wind and calm punctuate the eye’s arrival.

Climate change has Seville so hot it's started naming heat waves like hurricanes

The city of Seville is trying something new to raise awareness of climate change and save lives. With oppressive heat waves becoming a fact of life in Europe and other parts of the world, the Spanish metropolis has begun naming them. The first one, Zoe, arrived this week, bringing with it expected daytime highs above 109 degrees Fahrenheit (or 43 degrees Celsius).

As Time points out, there’s no single scientific definition of a heat wave. Most countries use the term to describe periods of temperatures that are higher than the historical and seasonal norms for a particular area. Seville’s new system categorizes those events into three tiers, with names reserved for the most severe ones and an escalating municipal response tied to each level. The city will designate future heat waves in reverse alphabetical order, with Yago, Xenia, Wenceslao and Vega to follow. 

It’s a system akin to ones organizations like the US National Hurricane Center have used for decades to raise awareness of impending tropical storms, tornadoes and hurricanes. The idea is that people are more likely to take a threat seriously and act accordingly when it's given a name. 

"This new method is intended to build awareness of this deadly impact of climate change and ultimately save lives," Kathy Baughman McLeod, director of the Adrienne Arsht-Rockefeller Foundation Resilience Center, the think tank that helped develop Seville’s system, told Euronews. Naming heat waves could also help some people realize that we're not dealing with occasional “freak” weather events anymore: they’re the byproduct of a warming planet.