HBO Max has grown its Podcast Program by leaps and bounds over the past two years, expanding from four shows in 2019 to 25 today, many serving as tie-ins and companion pieces to HBO's various series. But that will soon change, as the company announced on Wednesday that it plans to expand its online audio offerings to include original scripted programming and "look-back" shows as well.
“Our viewers tell us that, more than any other streaming platform, they want to discuss and dissect HBO Max programming with friends and family to extend the emotional experience after finishing an episode,” Joshua Walker, Chief Strategy Officer at HBO Max, said in a statement. He cites Max Podcast fans' collective investment in the existing programming as a driving force for the company's decision.
The new slate of scripted shows will include the highly anticipated Batman: The Audio Adventures starring Jeffrey Wright and premiering this fall, a look-back at Band of Brothers on September 9th (the show's 20th anniversary), and We Stay Looking — a sequel to Issa Rae's Insecure companion show and HBO's first scripted podcast. HBO is also expanding its partnership with Audacy to include titles like Lovecraft Country Radio and The Chernobyl Podcast in the streaming platform's library.
You've all seen the iconic picture of the US astronaut riding gracefully upon his NASA-built MODOK chair. That astronaut was Bruce McCandless II, Houston’s capsule communicator during the moon landing mission, Challenger crew member, and the driving force behind America's ability to conduct operations outside of the stuffy confines of space shuttles and international stations. Without McCandless, there's no guarantee the US would have EVA capabilities today. Wonders All Around, exhaustively researched and written by McCandless's son, Bruce III, explores McCandless the elder's trials and tribulations during NASA's formative years and his laser focus on enabling astronauts to zip through space unencumbered by the mass of their ships.
Greenleaf Book Group
Copyright © 2021 Bruce McCandless III. Published by Greenleaf Book Group Press. Distributed by Greenleaf Book Group. Design and composition by Greenleaf Book Group and Kimberly Lance. Cover design by Greenleaf Book Group, Shaun Venish, and Kimberly Lance. Cover image courtesy of NASA, photographed by Robert L. "Hoot" Gibson.
In his long leaden days of waiting for a spaceflight, my dad found the route to redemption on the back of an aging cartoon character. From the afternoon in December 1966 that he first tried out the Manned Maneuvering Unit in a Martin Marietta simulator, he was hooked on a vision of a gas-propelled jetpack that would allow astronauts to operate outside their spacecraft. This vision had an obvious pop-culture antecedent. In the 1920s a comic-strip character named Buck Rogers — a rock-jawed, All-American World War I veteran — succumbed to the effects of a mysterious gas he encountered while working as a mine inspector. He fell into a deep sleep and woke after five centuries of slumber to a strange new world of spaceships, ray guns, and Asian overlords. Though he initially traveled this new world via an antigravity belt, a device that allowed him and his best gal, Wilma, to leap great distances at a time, Buck eventually acquired a svelte and evidently omnidirectional jetpack. He eventually ventured into space in an adventure called Tiger Men from Mars, and his exploits in the cosmos changed America’s vision of the future forever. Millions followed Buck’s adventures in the funnies, on radio, and in movie serials. Among Buck’s imitators and spiritual heirs are Flash Gordon, Brick Bradford, John Carter of Mars, and Han Solo.
A host of talented men and women spent significant amounts of time and money to wrestle that jetpack out of the funny papers and into the space shuttle. None worked harder, though, than Bruce McCandless and his chief collaborator, an Auburn-educated engineer and Air Force officer named Charles Edward (“Ed”) Whitsett, Jr. Whitsett was a pale, bespectacled individual, mild-mannered but tenacious. He had a head start on my father. He’d been thinking and writing about jetpack technology as early as 1962. In a sense, he was trying to solve a problem that didn’t exist yet: Namely, how could an astronaut venture outside his or her spaceship and perform constructive tasks in an environment with no oxygen, with extreme temperature fluctuations, and in an orbital “free fall” that would leave the spacefarer lolling in the practical equivalent of zero gravity? Alexei Leonov of the Soviet Union and American Ed White had proven that extravehicular activity was possible, that men could survive outside of their space capsule, but basically all they’d done was float. How could a man move from one part of a spaceship to another, or from one spacecraft to another craft, or from a spacecraft to a satellite, in order to make inspections or repairs? None of these needs really existed in the early sixties, when the programs of both nations were still just trying to fire tin cans into low Earth orbit and predict, more or less, where they would come back down. But clearly the needs would eventually arise, and various methods were proposed to address them.
In the mid-sixties, the Air Force assigned Whitsett to NASA to supervise development of the Air Force’s Astronaut Maneuvering Unit. Gene Cernan’s failed test flight of the AMU on Gemini 9 in 1966 — the “space-walk from hell,” as Cernan called it — set the jetpack project back, but it never went away. McCandless, Whitsett, and a NASA engineer named Dave Schultz worked quietly but assiduously to keep the dream alive. They enlarged and improved the AMU all through the latter half of the decade and into the seventies. In the “Forgotten Astronauts” wire story that portrayed him as a washout in 1973, my dad mentioned the reason why he wanted to stay in the manned space program despite not having won a crew assignment on either Apollo or Skylab. “McCandless,” said the article, “has helped develop the M509 experimental maneuvering unit. The Skylab astronauts strap it on like a backpack and propel themselves Buck Rogers-like around the Skylab interior. [He] wants to build a larger operational unit to perform space chores outside the shuttle.” And that’s exactly what he did.
Though the Skylab M509 tests in 1973 and 1974 were a resounding success, resulting in the triumph of the jetpack concept over both rocket boots and the handheld maneuvering unit, Whitsett and McCandless didn’t rest on their laurels. Over the next several years, using whatever time and funding they could scrape together, the team made multiple upgrades — eleven, by one count — to what was now being called the “manned maneuvering unit,” or MMU. The bulbous nitrogen-gas fuel tank of the ASMU was replaced with two streamlined aluminum tanks in the rear of the unit, each of which was wrapped in Kevlar. The number of propulsion nozzles was increased from fourteen to twenty-four, positioned around the jetpack to allow for six-degrees-of-freedom precision maneuvering. Smaller gyroscopes replaced those used on the ASMU, and, as space historian Andrew Chaikin has noted, the ASMU’s “pistol-grip hand controllers, which were tiring to operate in pressurized space suit gloves, were replaced by small T-handles that needed just a nudge of the fingertips.” The MMU’s new arm units were made to be adjustable, to accommodate astronauts of all sizes. Painted white for maximum reflectivity, the unit was built to survive the 500-degree fluctuation in temperatures (from a high of 250 degrees F to a low of minus 250 F!) that an astronaut might encounter in space.
By 1980 the machine weighed in at 326 pounds. Like the AMU and the ASMU before it, the MMU was designed to fit with or “over” the astronaut’s pressure suit. Shuttle astronauts wore a newly designed suit called the Extravehicular Mobility Unit, or EMU, a two-piece marvel of textile engineering made up of fourteen layers of nylon ripstop, Gore-Tex, Kevlar, Mylar, and other substances. Power for the jetpack’s electronics was supplied by two 16.8-volt silver-zinc batteries. Two motion-control handles — the translational hand controller and the rotational hand controller — were mounted on the unit’s left and right armrests, respectively, and a button activated an “attitude-hold mode,” which used motion-sensing gyroscopes to direct the firing of the thrusters to maintain an astronaut’s position in space.
The machine had been tested in every way its designers could imagine. A representative of a local gun club visited Martin Marietta and shot the MMU’s nitrogen fuel tank with a .50 caliber bullet to ascertain whether the tank would explode if pierced. (It didn't.) The jetpack was run through hundreds of hours of simulations. At my father’s urging, a gifted and intense Martin Marietta project manager named Bill Bollendonk subjected the device to space-like conditions in the company’s thermal vacuum facility. The MMU was no longer a “far out” experiment, as Mike Collins once called it. It was now a promising space tool. Unfortunately, for the moment, it was still an unused space tool. American astronauts remained on Earth, as NASA struggled to produce its next-generation orbital workhorse, the space shuttle.
Following the death of a sight-impaired relative, Wataru Chino had no choice but to take action. In response to the tragedy, the Honda EV engineer developed an in-shoe navigation system, dubbed Ashirase (both the name of the product and the name of the company) that allows low-sighted people to use their feet to navigate, rather than cell phones or other visual aids. The tactile navigation system has earned the financial backing of Honda’s Ignition startup incubator program and continues to gain traction.
The Ashirase system has two parts: the dedicated Ashirase navigation app running on the user’s smartphone, and a silicone shoe insert cradling a combined motion sensor and electronic compass. Once the user programs their walking destination into the app, the shoe inserts vibrate in various patterns and tempos — “walk forward” causes vibrations under the balls of the feet, “turn left” buzzes the appropriate side of both feet, and the speed at which the inserts vibrate indicates proximity to the turn or obstacle.
The idea behind the system is to allow users to remain more aware of their surroundings while they walk, using their feet to navigate rather than repeatedly stopping to consult their smartphones or passersby for directions.
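As a rough illustration of the cueing scheme described above — direction mapped to a vibration zone, tempo mapped to distance from the next turn — here's a minimal sketch. The zone names and the tempo formula are illustrative assumptions, not Ashirase's actual firmware logic, which the company hasn't published.

```python
# Hypothetical sketch of a foot-based cueing scheme like Ashirase's:
# each direction selects insole zones, and the vibration interval
# shortens as the user approaches the turn. All names and constants
# here are assumptions for illustration only.

def cue_for(direction: str, meters_to_turn: float) -> dict:
    """Return which insole zones vibrate and the pulse interval in seconds."""
    zones = {
        "forward": ["left_ball", "right_ball"],        # balls of both feet
        "left":    ["left_foot_side", "right_foot_side_left"],
        "right":   ["left_foot_side_right", "right_foot_side"],
    }[direction]
    # Pulse faster as the turn approaches; clamp between 0.2 s and 2.0 s.
    interval = max(0.2, min(2.0, meters_to_turn / 50))
    return {"zones": zones, "interval_s": interval}

print(cue_for("left", 10))     # a close turn produces rapid pulses
print(cue_for("forward", 500)) # a distant waypoint produces slow pulses
```

The clamp is the key design choice: a floor on the interval keeps the cue readable rather than a continuous buzz, while the ceiling keeps a distant waypoint from going silent entirely.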
Ashirase
Currently, the insert prototypes can only be used in low-top sneakers and dress shoes, but Chino already has plans to expand the footwear selection. “We are thinking about [new footwear styles], and the idea is twofold at this moment,” Chino told Engadget through an interpreter. “One is to try to change, modifying the [electronic] device so that the shape can be fitted to other types of shoes.”
“Otherwise,” he continued, “what we can do is to change the yellow parts of this device so that it fits other types of shoes,” noting that the white “puck” part can be disconnected from the flexible yellow insert that sits around the wearer’s foot and houses the various vibration motors that deliver the navigation cues. The system has a reported week-long battery life when used to navigate an average of three hours a day. Initially, the insert will be offered in generic small, medium and large sizes in Japan, but Chino plans to offer more personalized fittings once the product hits the market.
The navigation system is currently a bit limited: because it's based on the Google Maps API rather than an HD map source, it only works where a navigation data signal is available. That means the system may not initially work in indoor areas like malls or hotels — though hiking trails, parks and other public lands should be no problem.
Chino and his team are reportedly looking into incorporating a personal dead-reckoning (PDR) system, Wi-Fi-based positioning or IoT navigation capability to help users make their way through indoor public spaces at a later date. The team also reportedly plans to add public transportation options to the program in the future.
The company plans to release a beta version of the Ashirase system in Japan in October or November of this year. Users will be given free use of the insert and app for one week before being asked for feedback. Following the public beta, Ashirase executives expect the commercialized product to be ready by October 2022 and include a 2,000 to 3,000 yen ($18 to $27) monthly subscription.
Before that can happen, however, the startup is seeking some 200 million yen in additional funding — not including the 70 million yen in equity the Ignition program already provided — in order to scale up to full production.
Following its Q2 earnings call this week, Tesla representatives confirmed previous reports that its commercial EV project, the Semi, will be delayed until 2022. The company cites both the ongoing global processor shortage and its currently limited production capability for the new 4680-style battery cells as contributing to its decision.
On the plus side, Tesla executives also confirmed that development of the highly anticipated Cybertruck continues apace. What's more, they explained that once Model Y production fully ramps up at the new Berlin and Texas plants, Tesla intends to spin up production lines for the Semi. For the full story, watch the video above, and for continuing coverage of all things Tesla, stay tuned to Engadget!
Tesla appears to have shrugged off the production woes it suffered last year during the COVID lockdown, with the company announcing a number of "new and notable records" during its Q2 earnings call on Monday. Not only did Tesla build and ship 200,000 vehicles during the quarter, a 151 percent increase over last year, it also earned $1.1 billion in net income during the same period — a whopping tenfold increase year over year. Overall, revenue grew 98 percent from this time last year thanks in large part to Tesla's increased deliveries, though the company did suffer a "Bitcoin-related impairment" of $23 million during the past quarter.
Additionally, Tesla rolled out 85 MW worth of solar capacity in Q2, a 215 percent increase from last year's 25 MW, and added nearly 1,000 Supercharging stations to its ever-expanding network.
In terms of tech, Tesla's use of radar as part of the vehicle's Full Self Driving system will soon be coming to an end. "After selling over a million vehicles equipped with radar, we have collected enough data to start removing it in some regions," the company wrote in its shareholder deck. "The removal of radar, which is enabled by our collection of a vast dataset of corner cases, allows us to focus on vision and increase the pace of improvement."
The company is also getting closer to switching over to its new 4680 battery cells, having successfully validated the battery tech's "performance and lifetime" at its California-based fabrication facility. With that testing out of the way, Tesla is focusing on "improving the 10 percent of manufacturing processes that currently bottleneck production output," though the company has not yet announced when the battery style changeover will actually take place.
First developed more than 100,000 years ago, clothing is one of humanity’s earliest — and most culturally significant — inventions, providing wearers not just protection from the environment and elements but also signifying social status, membership in a community and their role within that group. As robots increasingly move out of labs, off of factory floors and into our everyday lives, a similar garment revolution could soon be upon us once again, according to a new research study out of New York’s Cornell University.
“We believe that robot clothes present an underutilized opportunity for the field of designing interactive systems,” the team argues in What Robots Need From Clothing, which was submitted to the Designing Interactive Systems Conference 2021. “Clothes can help robots become better robots — by helping them be useful in a new, wider array of contexts, or better adapt and function in the contexts they are already in.”
“I started by looking at how different materials would move on robots and thinking about the readability of that motion — like, what is the robot's intention based on the way materials move on the robot,” Natalie Friedman, a PhD student at Cornell Tech and lead author on the paper, explained to Engadget. “From there, I started thinking about all the different social functions that clothes have for people and how that could influence how the robot is viewed.”
While tomorrow’s robots may wear white button-down dress shirts and black bow ties while serving hors d'oeuvres to party guests, or candy stripes while working as nurses, it’s not simply a matter of tossing human clothing onto a robotic chassis. “What robot clothes are is integrally tied to what robots need from clothing. Robot clothing should analogously fulfill needs robots have, rather than just being human clothes on a robot,” the researchers wrote.
Robo-clothes could take any number of forms, depending on their wearer’s specific function. Robotic firefighters, such as the Thermite from Howe and Howe, might theoretically be issued heat-resistant overcoats akin to what humans wear but embedded with thermochromic ink to provide the robot’s operator an easy visual reference to the area’s ambient temperature or indicate that the robot is in danger of overheating. Conversely, search-and-rescue bots could wear waterproof garments when conducting oceanic operations and then strap on extra-grippy boots when searching for lost hikers in mountainous terrain or survivors of a building collapse.
"I think this work is important to helping engineers and technologists understand the functional importance of aesthetics and signaling in design,” Cornell Tech professor and co-author Wendy Ju, said in a recent blog. “It's not ‘just fashion’ - what the robot wears helps people understand how to interact with it in ways that are critical to safety and task execution."
Overall, the use of swappable attire could lead to more generalized robot designs, since the specific capabilities the clothing provides don't have to be baked into the robot’s construction. “It is more difficult to build a new robot than to build new clothes,” Friedman said. “I think that clothes are going to influence robot design and robot designs are going to influence clothes. Maybe it'll start in one direction — clothes made to fit robots — but, in the future, I think that robots might be built to better fit in clothes.” She notes that SoftBank's Pepper, though recently discontinued, has an online merch store with a wide variety of costumes and outfits for the robot to wear, including outfits designating cultural, national, professional and religious affiliations.
NurPhoto via Getty Images
But clothing on robots isn’t just for their own benefit, it also serves to demystify and humanize these cutting-edge machines in the eyes of the people they’re working with. For example, clothing could help protect a robot’s sense of shame — or rather that of its user.
“The need for wire modesty — to cover up nudity — stems from anthropomorphic priggishness, since robots do not get embarrassed about wires poking out of them,” the researchers wrote. “However, both humanoid and non-humanoid robots have pragmatic reasons to maintain a clean and covered aesthetic, because exposed wires present a real risk to function. Any wire that is pulled out or cut will remove power or signal to a subsystem, and that can be risky to the robot and any people or objects in the environment.”
“I definitely see a future where [when robots] aren't wearing clothes, it might look a little funny,” Friedman added. “I mean we are just mapping our ideas onto robots, right? Robots don’t have consciousness, so they don't feel shame.”
However, putting clothes on robots could also prove problematic, especially if the apparel style has been culturally appropriated. You can bet your bottom dollar that the first cannabis dispensary to dress an automated budtender in Rastafarian garb is going to make headlines — and not the kind that are good for business — same as if you outfitted a Roomba with a Native American headdress. “Hawaiian shirts, for example, used to be a marker of ‘casual Friday’ office attire, but more recently are affiliated with the extremist ‘Boogaloo Boys,’” the researchers wrote.
Despite the potential drawbacks to putting pants on robots, doing so could help make the entire field of research more attractive to a new generation of roboticists. “I like to think about girls in robotics,” Friedman said. “When they're young, I think robotics seems like a really intimidating thing but I see clothes as kind of a way to welcome, you know, the stereotypically feminine... skills that women have. I see clothes as a way to welcome girls into [robotics].”
Social media routinely proves itself a cesspool of racist, bigoted and toxic opinions — and that's just coming from the adults. But for the younger generations that have never lived in an unconnected world, these seemingly unnavigable platforms have proven to be a uniquely potent tool for organizing and empowering themselves to change the real world around them. In Digital for Good, author Richard Culatta walks parents through many of the common pitfalls their kids may face when venturing into the internet wilds and how to best help them navigate these potential problems.
Harvard Business Review
Reprinted by permission of Harvard Business Review Press. Excerpted from Digital For Good: Raising Kids to Thrive in an Online World by Richard Culatta. Copyright 2021 Harvard Business School Publishing Corporation. All rights reserved.
Young Voices Matter
The first step for creating engaged digital citizens is making sure we’re teaching young people that their contributions and opinions matter. I think deep down we all believe this and want it to be true. But there are many elements of our society that are set up to communicate the opposite message. Much of school is designed in a way that tells our kids that they are to apply the skills they are learning some day in their hypothetical future, not now. They are taught to learn math because they will need it to get into college. They are taught to write because it will be an important skill when they get a job. In history, the people they learn about are always adults, not kids. They have little choice or control over the learning experience itself; they are handed a schedule, given assignments (that they didn’t have any input in designing), and told to complete them by a date that they didn’t choose. The message that young voices don’t matter is reinforced by the fact that they can’t vote until they are eighteen. One of the most important tenets of democracy is the idea that everyone has a voice. We teach that to our children, yet we offer very few ways to actually use that voice before they’re no longer kids. Fortunately, the digital world offers a wide set of tools that can help change that narrative. These tools allow youth to have a voice and learn how to make a meaningful impact on their community, family, and in some cases, the world as a whole—right now, not decades down the road.
Just Some Students from Florida
In February 2018, Marjory Stoneman Douglas High School in Parkland, Florida, was in the news worldwide when nineteen-year-old Nikolas Cruz entered the school with a semiautomatic rifle, killing seventeen people and injuring seventeen others. This horrific event became one of the deadliest school shootings in US history. Yet there was a unique ending to this tragic story that set it apart for another reason. In other school shootings, traditional news media and political leaders quickly shape the national conversation around the event. A narrative emerges around what actually happened, with speculation about the causes, who is to blame, and the political responses to justify action (or lack thereof). But in the case of Parkland, it was the students who shaped the national conversation. Frustrated about viewpoints and conclusions from adults that they did not share or agree with, they used their access to social media to reset and redirect the conversation into what has now become one of the most powerful examples of youth engagement ever seen. Within a week of the shooting, the students had appeared on nearly every major news program and had raised more than $3 million in donations to support their cause. Emma González, one of the most recognizable faces of the movement, has over 1.5 million Twitter followers—about twice as many as the National Rifle Association.
Not long after the shooting, I met Diane Wolk-Rogers, a history teacher at Marjory Stoneman Douglas High School. As she explained, nobody could have prepared these students for the horror they faced on that day. But they had been prepared to know how to use technology to make their voices heard. Wolk-Rogers says, “They are armed with incredible communication skills and a sense of citizenship that I find so inspiring.” So when it was time to act, they knew the tools of the trade.
Engaged digital citizens know how to use technology to identify and propose solutions and promote action around causes that are important to them and their communities. Micro-activism is a term used to describe small-scale efforts that, when combined, can bring about significant change. While young people might not be able to vote or run for office, they have a whole range of micro-activism opportunities—all made possible by their participation in the digital world. For youth who have access to social media, micro-activism can be as simple as using their digital platforms to call awareness to issues that matter to them—eradicating racism, protecting our planet, or funding their school, and so on. Most states have a function on their website to submit ideas or feedback directly to the office of the governor. Through sites like Change.org anyone, regardless of age, can submit suggestions to political leaders or private sector entities. You can also add your name in support of other petitions that are gaining momentum. There are many compelling stories of youth who have used Change.org to call attention to issues that matter to them. Examples include a ten-year-old who used the platform to convince Jamba Juice to switch from Styrofoam cups to a more environmentally friendly alternative. Or a seventh grader who used Change.org to successfully petition the Motion Picture Association to change the rating on a movie about school bullying so students in her junior high would be allowed to see it.
Not all acts of micro-activism will immediately result in a desired change. But regardless of the outcome, learning how to impact community issues using digital tools is an important skill to develop in and of itself. The ability to motivate others to act for good in a virtual space will be a significant (if not the significant) determining factor in the effectiveness of future civic leaders. Young people need to practice using tech to make a difference now, if they are going to be prepared to lead our society when they grow up.
Beginning in the first quarter of next year, GM will make its advanced semi-autonomous driving assistant, Super Cruise, available on six more models: the Cadillac Escalade, CT4 and CT5, the Chevrolet Silverado, and the GMC Hummer EV and Sierra.
"We’re excited to expand Super Cruise to even more new models with additional capabilities to provide our customers with even more opportunities to go hands-free,” Mario Maiorana, Super Cruise chief engineer, said in a prepared statement. “The additional Super Cruise-enabled vehicles and new features are an important step toward our goal of enabling hands-free driving 95 percent of time and getting people more comfortable with letting go of the wheel.”
These vehicles will also enjoy a number of features that the current generation of Super Cruise lacks. These include trailering capability, which enables drivers to engage the system even while towing a load behind them; automatic lane change; and an enhanced navigation display, which will highlight Super Cruise-compatible routes and roads along the way to your driving destination. If you already own a Super Cruise-enabled vehicle and want to upgrade to the more advanced system, you're in luck, assuming you bought your GM SUV in 2021, as that is the only model year getting an upgrade. If you bought between 2017, when Super Cruise was first introduced, and 2020, sorry but no dice, you'll have to change your own lanes like a schmuck.
Mercedes-Benz announced its latest step toward electrification on Thursday, asserting that the company will offer BEV versions of its model lineup "in all segments the company serves" by 2022 and that "all newly launched architectures will be electric-only" starting in 2025.
"The EV shift is picking up speed — especially in the luxury segment, where Mercedes-Benz belongs. The tipping point is getting closer and we will be ready as markets switch to electric-only by the end of this decade," Ola Källenius, CEO of Daimler AG and Mercedes-Benz AG, said in a prepared statement. "This step marks a profound reallocation of capital. By managing this faster transformation while safeguarding our profitability targets, we will ensure the enduring success of Mercedes-Benz."
To do so, MBZ plans to invest some €40 billion into BEV technology between 2022 and 2030. What's more, in the 2025 model year, MBZ will introduce a trio of EV-specific architectures: MB.EA for full-size passenger vehicles, AMG.EA for performance EVs geared towards existing AMG customers, and VAN.EA, Mercedes' new line of light commercial EVs and service vans.
Mercedes plans to build and operate eight gigafactories in the coming years to help accommodate the 200 gigawatt-hours of battery production capacity the company anticipates it will need for all the new BEVs it will be making.
The best part of waking up is, of course, hot bean juice in your cup. But, as Dr. Kate "The Chemist" Biberdorf explains in her new book It's Elemental, if you want to consistently enjoy the best cuppa joe you can craft — perfectly caffeinated and not too bitter — a bit of math is necessary. And it's not just coffee. Biberdorf takes readers on a journey through mundane moments of everyday life, illustrating how incredible they actually are — if you stop to examine the chemistry behind them.
Coffee and tea are much more potent sources of caffeine than soda. In one cup of coffee, you are likely to ingest around 100 mg of caffeine, but it can be up to 175 mg with the right coffee beans and technique. The whole process of making coffee beans (and coffee itself) is pretty fascinating if you’ve never given it much thought. For example, espresso makers and percolators get the most caffeine out of lighter roasted beans, but the drip method is the best way to get the most trimethylxanthine from darker beans. However, in general, light and dark roast coffees typically have the same relative number of caffeine molecules in each cup of coffee (excluding espressos).
Let’s look at the roasting processes to determine why that is. When the beans are initially heated, they absorb energy in what we call an endothermic process. However, at around 175°C (347°F), the process suddenly becomes exothermic. This means that the beans have absorbed so much heat that they now radiate the heat back into the atmosphere of the roasting machine. When this happens, the settings have to be adjusted on the equipment, in order to avoid over-roasting the beans (which sometimes results in burnt-tasting coffee). Some roasters will even toggle the beans between the endothermic and exothermic reaction a couple of times, to achieve different flavors.
Over time, roasting coffee beans slowly change from green to yellow, and then to a number of different shades of brown. We refer to the darkness of the bean as its “roast,” where the darker roasted coffee beans are much darker in color than the lighter roasted beans (surprise, surprise). Their color comes from the temperature at which they are roasted. Lighter beans are heated to about 200°C (392°F) and darker roasted beans to about 225–245°C (437–473°F).
But just before the beans start to, for lack of a better word, lightly roast, they go through their first “crack.” This is an audible process that occurs at 196°C (385°F). During this process, the beans absorb heat and double in size. But since the water molecules evaporate out of the bean when under high temperatures, they actually decrease in mass by about 15%.
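That 15% mass loss is easy to put in concrete terms. A quick back-of-the-envelope sketch, treating the 15% figure from the text as a fixed fraction (real losses vary batch to batch):

```python
# First-crack mass loss: beans shed roughly 15% of their mass as water
# evaporates. The 15% figure is from the text; it is an approximation.
def mass_after_first_crack(green_mass_g, loss_fraction=0.15):
    """Estimated bean mass (in grams) after the first crack."""
    return green_mass_g * (1 - loss_fraction)

print(mass_after_first_crack(1000))  # a 1 kg batch drops to about 850 g
```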
After the first crack, the coffee beans are so dry that they stop readily absorbing heat. Instead, all of the thermal energy is now used to caramelize the sugars on the outside of the coffee bean. This means that the heat is used to break the bonds in the sucrose (sugar) into much smaller (and more fragrant) molecules. The lightest roasts—like cinnamon roast and New England roast—are heated just past the first crack before being removed from the coffee roaster.
There is a second crack that occurs during the roast, but at a much higher temperature. At 224°C (435°F), the coffee beans lose their structural integrity, and the bean itself starts to collapse. When this happens, you can usually hear a second “pop.” Dark roasts — like French and Italian roasts — are typically categorized as any beans that have been heated past the second crack. In general, due to the hotter temperatures, darker beans tend to have more of their sugars caramelized, while lighter beans have less. The variation in flavor due to these methods is wild, but it doesn’t really affect how they react in the body — only the taste.
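The temperatures above (first crack at ~196°C, second crack at ~224°C) amount to a simple classification rule. A toy sketch of that rule, using only the thresholds quoted in the text:

```python
# Toy roast classifier built from the temperatures in the text:
# first crack ~196°C, second crack ~224°C. Real roast names also depend
# on time, bean, and roaster judgment, not peak temperature alone.
def roast_category(max_bean_temp_c):
    """Classify a roast by the peak temperature the beans reached."""
    if max_bean_temp_c < 196:
        return "under first crack (not yet a light roast)"
    if max_bean_temp_c < 224:
        return "light to medium roast (past first crack)"
    return "dark roast (past second crack)"

print(roast_category(200))  # light to medium roast (past first crack)
print(roast_category(240))  # dark roast (past second crack)
```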
Once you purchase your perfectly roasted coffee beans, you can do the rest of the chemistry at home. With an inexpensive coffee grinder, you can grind up your coffee beans to a number of different sizes, which will definitely affect the taste of your morning coffee. Small, fine grinds have a lot of surface area, which means the caffeine (and other flavors) can be extracted from the miniaturized coffee beans with ease. However, this can often result in too much caffeine being extracted, which gives the coffee a bitter taste.
On the other hand, coffee beans can be coarsely ground. In this instance, the insides of the coffee beans are not exposed to nearly the same degree as finely ground coffee beans. The resulting coffee can often taste sour—and sometimes even a little salty. But if you partner up the correct size of coffee grounds with the appropriate brewing method, you can make yourself the world’s best cup of coffee.
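The surface-area effect behind fine versus coarse grinds can be made quantitative with a simple model: split a fixed volume of coffee into equal spheres, and total surface area scales inversely with particle radius. The particle sizes below are illustrative assumptions, not grind standards:

```python
import math

# Why fine grinds extract faster: for a fixed volume of coffee split into
# equal spheres of radius r, total surface area scales as 1/r.
# The radii below are rough illustrative values, not official grind sizes.
def total_surface_area(volume_cm3, particle_radius_cm):
    """Total surface area (cm^2) of a volume split into spheres of radius r."""
    n_particles = volume_cm3 / ((4 / 3) * math.pi * particle_radius_cm ** 3)
    return n_particles * 4 * math.pi * particle_radius_cm ** 2

coarse = total_surface_area(100, 0.05)  # ~1 mm diameter particles
fine = total_surface_area(100, 0.01)    # ~0.2 mm diameter particles
print(f"fine grind exposes {fine / coarse:.0f}x the surface area")
```

Halving the particle size doubles the exposed surface, which is why a too-fine grind so easily tips into over-extraction and bitterness.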
The simplest (and easiest) way to brew coffee is to add extremely hot water to coarse coffee grounds. After they have soaked in the water for a few minutes, the liquid can be decanted from the container. This process, called decoction, uses hot water to dissolve the molecules within the coffee beans. Most current methods of coffee brewing utilize some version of decoction, which is what allows us to drink a cup of warm coffee instead of chomping on some roasted beans. However, since this method does not involve a filtration step, this version of coffee — affectionately referred to as cowboy coffee — is prone to having coffee bean floaters. For that reason, it’s usually not the preferred brewing method.
By the way, did you notice that I was avoiding the term boiling? If you’re trying to make a halfway decent cup of coffee, the hot water should never actually be boiled. Instead, the ideal temperature of the water is around 96°C (205°F), which is just below boiling (100°C, 212°F). At 96°C, the molecules that provide the aroma of coffee begin to dissolve. Unfortunately, when the water is just four degrees hotter, the molecules that give coffee a bitter taste dissolve as well. That’s why coffee nerds and baristas are so obsessed with their water temperature. In my house, we even use an electric kettle that allows us to select whatever temperature we want our water to be.
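That narrow window (aromatics dissolving near 96°C, bitter compounds near boiling) can be sketched as a simple threshold check. The cutoffs are the two temperatures from the text; real extraction is of course continuous, not a hard switch:

```python
# A toy check for the brewing window described above. Thresholds come from
# the text (ideal ~96°C, bitter compounds dissolve at ~100°C); treating
# them as hard cutoffs is a simplification.
def brew_water_assessment(temp_c):
    """Classify brew water temperature against the 96-100°C window."""
    if temp_c >= 100:
        return "boiling: bitter compounds dissolve too"
    if temp_c >= 96:
        return "ideal: aromatic compounds dissolve"
    return "too cool: likely under-extracted"

for t in (90, 96, 100):
    print(t, brew_water_assessment(t))
```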
Depending on how strong you like your coffee to taste, you may be partial to the French press or another steeping method. Like cowboy coffee, this technique also soaks the coffee grounds in hot water, but these grounds are a little smaller (coarse versus extra coarse). After a few minutes, a plunger is used to push all of the grounds to the bottom of the device. The remaining liquid above the grounds is now perfectly clear and deliciously tasty. Since the coarse coffee grounds are used in this method, more molecules can dissolve in the coffee solution, providing us with a more intense flavor (compared to cowboy coffee).
Another technique: when hot water is dripped over coffee grounds, the water absorbs the aromatic molecules before dripping into the coffee mug. This process, appropriately called the drip method, can be done manually or with a high-tech machine, like a coffee percolator. But sometimes this technique is used with cold water, which means that the fragrant, aromatic molecules (the ones that give your coffee its distinctive smell) cannot dissolve in the water. The result is called Dutch iced coffee, a drink that is ironically favored in Japan, and takes about two hours to prepare.