
Hitting the Books: Domestication brought about our best fuzzy friends

Nearly 40,000 years ago, humanity had its best idea to date: transform the age's apex predator into a sociable and loyal ally. Though early humans largely muddled through the first few thousand years of the process, the results have been nothing short of revolutionary. The practice of domestication underpins our modern world, without which we wouldn't have dogs or cats or farm animals — or even farms, for that matter. In her latest book, Our Oldest Companions: The Story of the First Dogs, anthropologist and American Association for the Advancement of Science fellow Pat Shipman explores the early days of domestication and how making dogs out of wolves fundamentally altered the course of human history.

Harvard University Press

Excerpted from OUR OLDEST COMPANIONS: THE STORY OF THE FIRST DOGS by PAT SHIPMAN, published by The Belknap Press of Harvard University Press. Copyright © 2021 by the President and Fellows of Harvard College. Used by permission. All rights reserved.


To answer the question of whether the first dog evolved in Asia or in Europe, we need to go back and establish a good working definition of domestication.

“Domestication” has a very specific meaning. The term is derived from the Latin for “dwelling” or “house”: domus. In its broadest sense, domestication is the process of rendering an animal or plant suitable for or amenable to living in the domus, for being a member of, and living intimately with, the family.

Even in this general sense, the precise meaning of domestication is elusive. Are plants domesticated? Certainly some of them are spoken of as domesticated, as needing deliberate care and cultivation, and sometimes fertilization, by humans and, conversely, as having been genetically modified through human selection to have traits considered desirable. I am not talking about the relatively recent process of genetically engineering changes to plants; these modified products, such as soybeans, are known colloquially as GMOs (genetically modified organisms). Selection has been carried out for millennia by hunters, gatherers, foragers, gardeners, farmers, and breeders of various species through old-fashioned means, not in the laboratory. If you want, for example, violets with white stripes, what do you do? You try to nurture the seeds of those that show white stripes and pull up the ones that don’t, until you always get striped ones (if you ever do).

We can understand the general principle of selecting or choosing the most desirable plants — those that yield the most food under particular conditions, for example — but the practice of selection is somewhat paradoxical. The individual plants that produce rich fruits or seeds or tubers are the ones you would most want to eat — and those are the very ones you must save for the next planting season. Which is the most practical strategy? Why did people start saving the best seed? It is an awkward conundrum. As the late Brian Hesse wisely observed in his studies of early domestication, people who are short of food, even starving, do not save food for next season or next year. They simply try to live until next week.

The habit of saving seeds for another day must have arisen in relatively good times, when food was plentiful enough to keep some for the distant future. This implies that the motivation for domestication is not to ensure a stable food supply because undertaking the initial process of domestication makes sense only if you already have enough food. Plant domestication seems to be about improving the plant species in the long run. But you really don’t care if the plant is happy to see you or plays nicely with the children.

What is more, strictly speaking, domesticated plants — crops — do not exactly live with humans or in the home. In fact, because some of them, such as nuts and fruits, grow on trees, and most require sunlight, they could not possibly live indoors. Domesticated plants certainly do not participate in family life in any active way, though their needs and locations may shape the seasonal and daily round of activities and the locations of settlements. They don’t join the family. There is an odd sort of remote intimacy between crops and those who harvest or farm them.

The more you ponder the domestication of plants, the fuzzier the concept of “domesticating” them becomes. The earliest farmers or gardeners did not know enough about the mechanics of reproduction or genetic inheritance to know how to get a particular plant to fertilize some other particular plant and produce bigger corms, or juicier fruits, or non-exploding seed heads (which are easier to harvest), or tubers that were richer in carbohydrates. Domesticating plants was not a matter of learning which individual plants were friendliest or least aggressive toward people. And yet, over time, wisdom accumulated, sometimes accompanied by good luck, and humans did find out how to alter some plants’ genetics to foster a more desirable outcome. This discovery is often spoken of as the Neolithic revolution or the dawn of agriculture. It is generally thought to have occurred around 11,000 years ago. Agriculture as an organized system of growing food transformed at least some people who had traditionally hunted, gathered, and foraged for their daily food — mobile people living off the land — and turned them into more sedentary farmers, tied to fields and villages and dwellings.

The Neolithic revolution was not a win-win proposition at the outset. Several studies have shown that early farming peoples experienced a decline in their general health because they often had monotonous diets based on a very few staple resources. Having a narrower range of staple foods meant that those people were more vulnerable to normal variations in weather, such as too much or too little rain, or too hot or too cold or too short a growing season; and of course there were plant diseases, which spread easily when a whole field is planted with a single species. Growing crops also caused humans to live in more permanent settlements, which exacerbated problems with sanitation, water supply, and human crowd diseases.

Though farming supported more people living in higher densities than hunting and foraging, it also created perfect conditions for the spread of contagious diseases and parasites and for recurrent episodes of starvation in bad years. And then there was warfare. Among nomadic foraging and hunting peoples, disputes are often settled by one group moving away from the other. But clearing and fencing fields, planting and tending crops, and building storage facilities takes a lot of work, so people begin to defend territories — or to raid others’ territories when times are bad and their own crops fail. Excess foods, such as the seeds for next year or the vegetables saved for winter, could be stolen during a raid. Abandoning a cleared or planted field and a store of food is an expensive proposition, much more risky than simply shifting your hunting to another area when game gets scarce or your brother-in-law becomes annoying.

As best we know at present, the domestication of plants began about 11,000 years ago with fig trees, emmer wheat, flax, and peas in the Near East. At about the same time, foxtail millet was domesticated in Asia. How do we know this at all? We know it because of plant remains preserved under special conditions. Seeds can be preserved and sometimes were.

Many edible plants also contain starch grains and phytoliths, microscopic silica structures that are much more resistant to decay than leaves or stems. If found, these can also be used to identify plants that were used in the past; techniques such as radiocarbon dating can tell us when this occurred.

Historically, it was often assumed that plants were domesticated earlier than animals, but modern science shows that this idea is unquestionably mistaken. There is no logical reason why it should be true. The attributes and needs of domesticated crops differ a great deal from those of hunted or gathered foods; knowing how to raise wheat tells you little about how to look after pigs. Like fields, particularly rich hunting grounds could be invaded by others and were worth defending. But many hunters and gatherers or foragers were nomadic and lived in low densities out of necessity. Staying too long in one area depleted the local prey population. Whereas agriculturalists can store crops for the future, hunters cannot store meat for long in temperate or tropical climates, though extreme cold works well to keep meat frozen. Over time, crops are more vulnerable to theft than carcasses.

Domesticating animals involves other issues. Domestic animals are not normally hunted; indeed, they are not always confined and may be free ranging. Still, domestic animals can be moved to a new area much more easily than a planted field, a store of grain, or a pile of tubers, which simply will not get up and walk to a new locale. Such animals may even transport household goods as they are being moved. Moving domestic animals is a very different proposition from moving plant foods.

So why do we use the same word, domesticates, to describe both plant and animal species, and a single word, domestication, to describe the process by which an organism becomes domesticated? I think it is a grave mistake that has been based on outdated ideas and faulty assumptions. I do not believe that a single process is involved. I argue that plant and animal domestication are radically different because the nature of the wild species from which domestication might begin is also radically different. As well as having the inherent genetic variability that causes some individuals to exhibit more desirable traits, animals must also cooperate to some extent if they are to be domesticated. Animals choose domestication, if it is to succeed. Plants do not. Like animals, plants have to have enough genetic variability to be exploited by humans during domestication, but plants do not decide whether or not to grow for humans. Animals must decide whether or not to cooperate.

Facebook is enabling a new generation of touchy-feely robots

Without a sense of touch, Frankenstein’s monster would never have realized that “fire bad” and we would have had an unstoppable reanimated killing machine on our hands. So be thankful for the most underappreciated of your five senses, one that robots may soon themselves enjoy. Facebook announced on Monday that it has developed a suite of tactile technologies that will impart a sense of touch into robots that the mad doctor could never imagine.

But why is Facebook even bothering to look into robotics research at all? “Before I joined Facebook, I was chatting with Mark Zuckerberg, and I asked him, ‘Is there any area related to AI that you think we shouldn't be working on?’” Yann LeCun, Facebook’s chief AI scientist, recalled during a recent press call. “And he said, ‘I can't find any good reason for us to work on robotics,’ so that was the start of our FAIR [Facebook AI Research] research, that we're not going to work on robotics.”

“Then, after a few years,” he continued, “it became clear that a lot of interesting progress in AI work is happening in the context of robotics because this is the nexus of where people in AI research are trying to get to; the full loop of perception, reasoning, planning and action, and then getting feedback from the environment.”

As such, FAIR has centered its tactile technology research on four main areas of study — hardware, simulation, processing and perception. We’ve already seen FAIR’s hardware efforts: the DIGIT, a “low-cost, compact high-resolution tactile sensor” that Facebook first announced in 2020. Unlike conventional tactile sensors, which typically rely on capacitive or resistive methods, DIGIT is actually vision-based.

FAIR

“Inside the sensors there is a camera, there are RGB LEDs placed around the silicone, and then there is a silicone gel,” Facebook AI Research Scientist Roberto Calandra explained. “Whenever we touch the silicone on an object, this is going to create shadows or changes in color cues that are then recorded by the camera. These allow [DIGIT] to have extremely high resolution and extremely high spectral sensitivity while having a device which is mechanically very robust, and very easy and cheap to produce.”

Calandra noted that DIGIT costs about $15 to produce and, being open source hardware, its production schematics are available to universities and research institutions with fabrication capabilities. It’s also available for sale, thanks to a partnership with GelSight, to researchers (and even members of the public) who can’t build their own.
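The reference-differencing idea behind a vision-based tactile sensor can be sketched in a few lines: compare each camera frame against a no-contact reference image and flag contact when the gel's shadows and color shifts exceed a threshold. This is an illustrative toy, not FAIR's actual DIGIT pipeline; the frame size, patch and threshold are invented values.

```python
import numpy as np

def detect_contact(reference: np.ndarray, frame: np.ndarray,
                   threshold: float = 5.0) -> bool:
    """Flag contact when the current frame deviates from a no-contact
    reference image by more than `threshold` (mean absolute pixel change)."""
    diff = np.abs(frame.astype(np.float32) - reference.astype(np.float32))
    return float(diff.mean()) > threshold

# Toy 64x64 RGB frames: a flat reference vs. a frame with a bright "press" patch.
reference = np.full((64, 64, 3), 128, dtype=np.uint8)
pressed = reference.copy()
pressed[24:40, 24:40] = 230  # shadow/color shift where the gel deforms

assert not detect_contact(reference, reference)
assert detect_contact(reference, pressed)
```

A real sensor would also localize and characterize the contact patch, but even this crude difference metric shows why a camera plus deformable gel can stand in for a grid of capacitive elements.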

FAIR

In terms of simulation, which allows ML systems to train in a virtual environment without the need to collect heaps of real-world hardware data (much the same way Waymo has refined its self-driving vehicle systems over the course of 10 billion computer generated miles), FAIR has developed TACTO. This system can generate hundreds of frames of realistic high-resolution touch readings per second as well as simulate vision-based tactile sensors like DIGIT so that researchers don’t have to spend hours upon hours tapping on sensors to create a compendium of real-world training data.
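What does a "simulated touch reading" look like? A minimal sketch, assuming (purely as an illustration, not TACTO's actual contact model) that pressing on a gel sensor produces a roughly Gaussian bright patch centered on the contact point:

```python
import numpy as np

def synth_tactile_frame(h=64, w=64, cx=32, cy=32, radius=8.0, depth=1.0):
    """Render a synthetic tactile frame: a Gaussian 'indentation' centered
    at (cx, cy), mimicking the bright deformation patch a gel sensor sees."""
    ys, xs = np.mgrid[0:h, 0:w]
    r2 = (xs - cx) ** 2 + (ys - cy) ** 2
    return depth * np.exp(-r2 / (2.0 * radius ** 2))  # intensities in [0, 1]

frame = synth_tactile_frame()
assert frame.shape == (64, 64)
assert abs(float(frame[32, 32]) - 1.0) < 1e-9   # peak at the contact point
assert float(frame[0, 0]) < 0.01                # edges nearly untouched
```

Generating thousands of such frames with randomized contact locations and depths is the basic trick that lets a model train on simulated touches instead of hours of hand-collected sensor taps.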

“Today if you want to use reinforcement learning, for example, to train a car to drive itself,” LeCun pointed out, “it would have to be done in a virtual environment because it would have to drive for millions of hours, cause, you know, countless thousands of accidents and destroy itself multiple times before it learns its way around, and even then it probably wouldn't be very reliable. So how is it that humans are capable of learning to drive a car with 10 to 20 hours of practice with hardly any supervision?”

“It's because, by the time we turn 16 or 17, we have a pretty good model of the world,” he continued. We inherently understand the implications of what would happen if we drove a car off a cliff because we’ve had nearly two decades of experience with the concept of gravity as well as that of fucking around and finding out. “So ‘how to get machines to learn that model of the world that allows them to predict events and plan what's going to happen as a consequence of their actions’ is really the crux of the problem here.”

Sensors and simulators are all well and good, assuming you’ve got an advanced Comp Sci degree and a deep understanding of ML training procedures. But many aspiring roboticists don’t have those sorts of qualifications so, in order to broaden the availability of DIGIT and TACTO, FAIR has developed PyTouch — not to be confused with PyTorch. While PyTorch is a machine learning library focused primarily on computer vision and natural language processing applications, PyTouch centers on touch sensing.

“Researchers can simply connect their DIGIT, download a pretrained model and use this as a building block in their robotic application,” Calandra and Facebook AI Hardware Engineer, Mike Lambeta, wrote in a blog published Monday. “For example, to build a controller that grasps objects, researchers could detect whether the controller’s fingers are in contact by downloading a module from PyTouch.”

Most recently, FAIR (in partnership with Carnegie Mellon University) has developed ReSkin, a touch-sensitive “skin” for robots and wearables alike. “This deformable elastomer has micro-magnetic particles in it,” Facebook AI Research Manager, Abhinav Gupta, said. “And then we have electronics — a thin flexible PCB, which is essentially a grid of magnetometers. The sensing technology behind the skin is very simple: if you apply force into it, the elastomer will deform and, as it deforms, it changes the magnetic flux which is read [by] magnetometers.”
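The flux-to-force idea Gupta describes can be sketched with a toy magnetometer grid: a press deforms the elastomer, perturbing the flux read by the nearest magnetometer, and a calibration constant maps the size of that perturbation to a force. The grid size, readings and gain below are all invented for illustration; a real ReSkin would be calibrated against a force gauge.

```python
import numpy as np

# Hypothetical 4x4 magnetometer grid; each sensor reads a 3-axis flux vector.
BASELINE = np.zeros((4, 4, 3))  # flux with no load, captured at rest

def estimate_contact(readings: np.ndarray, gain: float = 2.5):
    """Estimate where and how hard the skin is pressed from flux changes.

    `gain` is an illustrative calibration constant (flux change -> newtons)."""
    delta = np.linalg.norm(readings - BASELINE, axis=-1)   # per-sensor change
    idx = np.unravel_index(np.argmax(delta), delta.shape)  # strongest cell
    force = gain * delta[idx]
    return idx, force

# A press over sensor (1, 2) perturbs that cell's flux the most.
readings = np.zeros((4, 4, 3))
readings[1, 2] = [0.0, 0.3, -0.4]   # flux change with magnitude 0.5
loc, force = estimate_contact(readings)
assert loc == (1, 2)
assert abs(force - 1.25) < 1e-9
```

In practice FAIR trains learned models rather than using a single gain constant, but the pipeline shape is the same: magnetometer deltas in, contact location and force out.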

“A generalized tactile sensing skin like ReSkin will provide a source of rich contact data that could be helpful in advancing AI in a wide range of touch-based tasks including object classification, proprioception, and robotic grasping,” Gupta wrote in a recent FAIR blog. “AI models trained with learned tactile sensing skills will be capable of many types of tasks, including those that require higher sensitivity, such as working in health care settings, or greater dexterity, such as maneuvering small, soft, or sensitive objects.”

Despite being relatively inexpensive to produce — at volumes of 100 units, each costs about $6 to make — ReSkin is surprisingly durable. The 2-3mm-thick material lasts for up to 50,000 touches while generating high-frequency, three-axis tactile signals, retaining a temporal resolution of up to 400Hz and a spatial resolution of 1mm with 90 percent accuracy. Once a swath of ReSkin reaches its usable limit, replacing “the skin is as easy as taking a bandaid off and putting a new bandaid on,” Gupta quipped.

FAIR

Given these properties, FAIR researchers foresee ReSkin being used in a number of applications, including in-hand manipulation, i.e. making sure that a robot gripper doesn’t crush the egg it’s picking up; measuring tactile forces in the field, such as how much force a human hand exerts on the objects it manipulates; and contact localization, essentially teaching robots to recognize what they’re reaching for and how much pressure to apply once they touch it.

As with virtually all of its earlier research, FAIR has open-sourced DIGIT, TACTO, PyTouch and ReSkin in an effort to advance the state of tactile art across the entire field.

Is Big Tech 'greenwashing' its environmental responsibilities ahead of COP26?

COP26, the UN’s climate change conference billed as “the world’s last best chance” to prevent the most disastrous effects of global warming, kicks off in Glasgow on Sunday. Delegates from around the world will convene to hammer out another round of emission reduction targets with a goal of achieving “net zero” greenhouse gas emissions by mid-century and keeping our rapidly heating planet’s temperature rise to a more manageable 1.5 degrees Celsius, rather than the calamitous 2.7-degree bump currently predicted.

With the eyes of the world firmly focused upon humanity’s disastrous planetary stewardship to date and wondering what might be done to rectify our past pollution, leading tech companies in recent weeks have become increasingly vocal in their pledges to reform business operations to help “save the planet.”

Apple, for example, announced in a statement Tuesday the launch of 10 new environmental projects as part of its Power for Impact initiative, said that 175 of its suppliers will switch to renewable energy, and pledged that, by 2030, every device the company sells will have a net-zero climate impact. The company also noted that it has already reduced its carbon emissions by 40 percent over the past five years.

Google, on the other hand, pointed to its goal of achieving net zero emissions “across all of our operations and value chain by 2030,” according to a blog post published on Monday. The company also called out its efforts to assist its partners with reducing their own emissions, such as through the Environmental Insights Explorer (EIE) program which helps cities map their pollution data, air quality and solar power potential. Google also made sure to mention just how sustainable its products actually are for consumers.

Microsoft

Microsoft made even loftier claims on Tuesday, pledging to be “carbon negative by 2030 and by 2050 remove from the environment all the carbon the company has emitted, either directly or by electrical consumption since it was founded in 1975,” before expounding on the rapidly increasing efficiency of its massive data centers.

Amazon, for its part, announced that its $2 billion Climate Pledge Fund investment program has selected three low-carbon startups: Resilient Power, which produces transformer-based EV charging technology; CMC Machinery, an order-specific-sized shipping box manufacturer; and Infinium, which devised “ultra-low carbon fuels that can be used in air transport, marine freight, and heavy truck fleets,” per the company’s blog post.

But do these protestations of environmental progress signify a legitimate effort by Big Tech to clean up its collective act or are they simply more PR spin seeking to offset their bad behavior? Because we’ve seen this sort of behavior before. It’s called greenwashing.

What is Greenwashing?

Merriam-Webster defines greenwashing as “expressions of environmentalist concerns especially as a cover for products, policies, or activities.” The term was coined in 1986 by environmentalist Jay Westerveld in an essay examining the hotel industry’s practice of leaving placards in guest rooms admonishing guests to reuse their towels to help “save the environment.” Back then, people got their news from three places: newspapers, television and radio — the same sources for virtually all advertising at the time. This information availability imbalance created a system wherein corporations could promote themselves in any flattering shade they wished, regardless of their actual actions, with little fear of the public realizing that a deception had even occurred.

The practice of greenwashing in America goes as far back as 1953 — though it wasn’t called as such at the time — when beverage manufacturers launched the Keep America Beautiful campaign, reminding the public to be good environmental stewards and not litter, in what was actually an effort to forestall incoming regulations on the use of disposable containers. Greenwashing metastasized in the 1980s as Big Oil companies ladled out their own laudations while seeking to minimize their liability and culpability in environmental pollution scandals and global warming. These companies even worked actively to prevent the government from passing clean energy laws. But you wouldn’t know it from their television ads.

The spot above is from Chevron’s People Do campaign. It should be noted that many of the programs promoted in that campaign were actually government-mandated actions and that, while this campaign was running, Chevron was repeatedly found in violation of the Clean Air and Water Acts, and was caught dumping oil in wildlife refuges.

Exxon’s actions through the ‘90s were equally abhorrent. The company continually muddied the waters around humanity’s role in climate change, knowing full well how the burning of fossil fuels inflamed the growing crisis.

In 2017, a Harvard study of ExxonMobil’s climate change communications (both internal memos and public-facing advertorial newspaper content) produced between 1977 and 2014 found that while more than 80 percent of internal documents acknowledged that human activity was largely responsible for global warming, just 12 percent of the company’s advertorials did the same.

"Within hours of publishing our study, ExxonMobil responded with ad hominem attacks," Harvard Research Associate Geoffrey Supran, told Client Earth last year. "I was invited by the European Parliament to testify about ExxonMobil's history of climate denial. The day before, they sent a private memo (which has now been leaked) to Members of Parliament to try to discredit me. If these experiences tell us anything, it's that the Exxon tiger hasn't changed its stripes."

Greenwashing in the modern era

Greenwashing remains a widely-used marketing tactic even today — and not just the mealy-mouthed word salads regurgitated by oil executives during a House Oversight Committee hearing this Thursday.

Take bottled water, for instance. Nestle alone has spent millions of ad dollars in recent years in an effort to convince the public that, as it claimed in 2008, “bottled water is the most environmentally responsible consumer product in the world.” This despite the fact that barely 31 percent of plastic water bottles actually get recycled, while the rest end up cluttering landfills and the ocean — scientists estimate that around 8 million metric tons of plastic enter the ocean annually.

And they are far from alone. Coca-Cola came under fire in Australia in 2015 when it rolled out Coke Life, a supposedly lower-sugar variant packaged in a bright green can. Sure, it made consumers feel like they were making a health-conscious purchasing decision, even as health advocates pointed out that “the reduction to 10 teaspoons of sugar in a 600ml bottle made little difference in terms of health impacts.” More recently, the company launched its World Without Waste campaign which, at its essence, pushed consumers to simply recycle more rather than actually adjust the way the company conducts its business.

The fashion industry is a huge contributor to the climate-and ecological emergency, not to mention its impact on the countless workers and communities who are being exploited around the world in order for some to enjoy fast fashion that many treat as disposables. 1/3 pic.twitter.com/pZirCE1uci

— Greta Thunberg (@GretaThunberg) August 8, 2021

In 2013, Tyson Foods was taken to task over its fawning self-portrayal of how it cares for its animals and their well-being, not two years before five employees of a Tyson supplier were charged with 33 counts of criminal animal cruelty for repeatedly kicking and punching pigs. And who can forget Volkswagen, which launched a “Clean Diesel” marketing campaign amid the Dieselgate emissions scandal?

Why Greenwashing works so well

So why do companies insist on greenwashing their operations rather than actually reforming themselves? Because it is far more profitable to simply adjust public perception than to make meaningful reforms. A 2015 Nielsen poll found that 66 percent of respondents would be willing to pay a premium for “environmentally sustainable products,” and among those willing to pay more, over 50 percent were influenced by sustainability factors such as “a company being environmentally friendly (58 percent), and a company being known for its commitment to social value (56 percent).”

It’s also because we, collectively, keep falling for it. Consumers’ desires to help address the climate crisis, especially in the face of barely tepid responses from world governments, primes us to view virtually any action on that account as a positive one. “SDGs [Sustainable Development Goals] and ‘net zero’ have kind of created an opportunity for a lot more greenwashing, because it allows you to describe yourself as a green company when you’re doing a thing that’s fundamentally not green,” Dave Powell, co-presenter of the Sustainababble podcast and the former Head of Environment at the New Economics Foundation, told Client Earth. “You effectively buy your way out of trouble, for example, by promising to plant large numbers of trees.”

"As part of their climate strategies, many companies are relying on voluntary carbon offsetting. However, if not done well, offsetting can result in greenwashing,” Dr. Aoife Brophy Haney, Research Lecturer at the Smith School of Enterprise and the Environment at the University of Oxford, added. “To mitigate this risk, government and society at large should support the use of best practice guidelines, such as the recently released ‘Oxford Principles for Net Zero Aligned Carbon Offsetting’, to help ensure offsetting is done in a rigorous and credible way that ultimately contributes to net zero goals."

And, most importantly, companies continue to engage in greenwashing because there is very little downside to doing so, at least from a regulatory perspective. In the US, the FTC guidelines for environmental marketing claims are only voluntary, though the FTC does retain the right to prosecute outright false or misleading advertisements.

However, cracks in the greenwashing facade may be beginning to show, starting in the financial sector, as regulators’ interest in ESG fund (environmental, social and governance) oversight grows. As Financial News London reported Monday, German asset manager DWS has recently been investigated by both US and German regulatory agencies after a former employee accused the company of fudging the environmental credentials in its 2020 annual report.

“You have to be careful, as there is a big reputational risk,” an unnamed senior executive at a European asset manager told FN London. “We’re not saying we were bulls***ing before, but there’s a recognition now that it’s more complicated.”

“Most have probably been a bit too pushy in marketing their alleged ESG expertise and they are now applying more caution,” Philip Kalus, managing partner at consultancy Accelerando Associates, added. “Some would even say there is panic in the house. Nobody wants to be the next one being accused, but it is an important and overdue wake-up call for the industry.”

That’s not to say that environmental pledges made by Apple, Google, Microsoft or Amazon are meant to intentionally gaslight the public (though Exxon, Shell and Chevron absolutely did). These companies have a vested financial interest in at least appearing as positively as possible to their customers because, frankly, nobody’s going to have time to talk about the slick new features of the Pixel 8 or iOS 15 when we’re in the midst of a global climate meltdown-slash-water war.

Is Google’s “moonshot goal” of operating its data centers and campuses entirely on carbon-free energy by 2030 going to make more than a blip of difference when it comes to mitigating the impacts of climate change? Probably not, definitely not on its own and certainly no more so than Microsoft’s promise to reduce water use in its data centers by 95 percent by 2024 or Apple’s plan to build robots to more effectively recycle old handsets. But these claims do not, in and of themselves, constitute greenwashing. The changes may not be enough to make a noticeable impact at this point, but they are good-faith efforts to do something, anything, to stave off what could well be humanity’s self-inflicted extinction. And given how America’s most recent effort to invest in environmentally responsible energy technologies was single-handedly killed off by the coal-loving senator from West Virginia, these sorts of corporate initiatives may well be the best we’ll see for some time.

We won't have electric airplanes until battery tech improves

Today’s commercial airliners are not exactly fuel efficient. The average 747, for example, burns through a gallon of kerosene-based fuel every second that it flies. And with 8.2 billion people expected to take to the skies annually by 2037, carbon-free alternatives to Jet A-1 will be necessary in order to offset the industry’s impact on global warming. We are nearing the age of electric airplanes.
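To put that gallon-per-second figure in perspective, a quick back-of-the-envelope calculation. The 10-hour flight length and the CO2-per-gallon emissions factor are illustrative assumptions, not figures from this article:

```python
# Putting "a gallon per second" in context: fuel burned and CO2 emitted
# over one long-haul flight.
GALLONS_PER_SECOND = 1      # the figure cited above for an average 747
FLIGHT_HOURS = 10           # assumed length of a long-haul leg
CO2_KG_PER_GALLON = 9.6     # rough emissions factor for jet kerosene

gallons = GALLONS_PER_SECOND * FLIGHT_HOURS * 3600
co2_tonnes = gallons * CO2_KG_PER_GALLON / 1000
print(gallons)      # 36000 gallons for a single flight
print(co2_tonnes)   # roughly 345 metric tons of CO2
```

Multiply that by tens of thousands of flights a day and the industry's appetite for a carbon-free alternative becomes obvious.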

Pioneering researchers, scientists and entrepreneurs have been working on the dream of electrified flight since the latter part of the 19th century, when heavy lead-acid batteries were loaded onto early airships to power their propellers. We’ve also seen a number of, ahem, novel means of powering aircraft in flight over the years, from conductive tethers stretching back down to the ground to solar panels to microwave energy transmission, but it wasn’t until the advent of relatively more power-dense nickel-cadmium (NiCad) battery technology that human-scale, free-flying electric planes became technically feasible.

But even as battery chemistries have evolved and energy densities have risen over the past few decades, today’s state-of-the-art lithium-ion cells pose the same quandary to the aviation industry as they do to the automotive one: how to properly balance the energy-to-weight ratio of their batteries.

“If a jumbo jet were to use today’s batteries, 1.2 million pounds of batteries would be required just to generate the power of the jet engine it would be replacing,” University of Houston Energy Fellow, Emily Pickrell, opined in Forbes earlier this year. “This weight would effectively need an additional eight jet planes just to carry that weight!"

And as Li-ion technology has fully matured, further increases to its energy density have fallen below five percent with each annual iteration, which is why a number of researchers and battery companies are already looking for the next breakthrough battery chemistry — whether that’s sodium-ion (Na-ion), lithium-metal (Li-metal), lithium-sulfur (Li-S) or zinc-air (Zn-air).

Regardless of composition, batteries need to get a whole lot lighter and more energy dense if they’re going to dethrone jet fuel, which, with an energy density of 9.6 kWh/L, is roughly 50 times as energy dense as today’s best Li-ion cells. To be fair, though, due to inefficiencies inherent to internal combustion engines, that advantage drops to around 14 times if you’re comparing equal weights of fuel and batteries.
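Those ratios are simple arithmetic, and can be sanity-checked with a quick back-of-envelope sketch. The figures below are illustrative assumptions (kerosene at roughly 12 kWh/kg, a turbine efficiency around 30 percent), not numbers from this article:

```python
# Back-of-envelope comparison of jet fuel vs. Li-ion batteries by weight.
# All figures here are rough, illustrative assumptions.

JET_FUEL_WH_PER_KG = 12_000   # kerosene-based Jet A-1, ~12 kWh/kg
LI_ION_WH_PER_KG = 250        # in line with the ~260 Wh/kg Model 3 pack cited below
TURBINE_EFFICIENCY = 0.30     # fraction of fuel energy a jet engine converts to useful work
ELECTRIC_EFFICIENCY = 0.95    # electric drivetrains lose comparatively little

raw_ratio = JET_FUEL_WH_PER_KG / LI_ION_WH_PER_KG
effective_ratio = raw_ratio * TURBINE_EFFICIENCY / ELECTRIC_EFFICIENCY

print(f"Raw specific-energy ratio: ~{raw_ratio:.0f}x")
print(f"Efficiency-adjusted ratio: ~{effective_ratio:.0f}x")
```

With those assumptions the raw ratio comes out near the roughly 50x figure, and the efficiency-adjusted ratio lands around 15x, in the neighborhood of the ~14x cited above.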

For example, a Tesla Model 3’s li-ion-based battery boasts an energy density of 260 Wh/kg, while CATL announced earlier this year that it had built a sodium-ion battery with 160 Wh/kg density (though it hopes to get that up to 200 Wh/kg by 2023). Lithium-sulfur batteries have shown the capacity to hold up to 600 Wh/kg, though that technology faces significant longevity hurdles (i.e. the chemistry tends to eat through electrodes) before it can be widely used. Currently, 2- and 4-person small aircraft outfitted with electric power systems typically operate at 250-270 Wh/kg of specific energy, but industry experts expect energy densities will have to hit 350-400 Wh/kg before the electric aviation industry really takes off — something that could happen within the next few years, according to Tesla CEO Elon Musk.

400 Wh/kg *with* high cycle life, produced in volume (not just a lab) is not far. Probably 3 to 4 years.

— Elon Musk (@elonmusk) August 24, 2020

Preventing and mitigating thermal runaway is another critical test for electric aviation. When a battery cell, or even an area within a single cell, malfunctions due to mechanical, thermal, or electrochemical failure, its temperature can rise beyond safe levels, causing the cell to produce lithium off-gasses that make the cell walls bulge and then rupture, releasing the entirety of its energy reserve. When a cell bursts, it can damage and overheat surrounding cells, setting off a cascading failure that results in explosion and fire. When that happens to a Chevy Bolt, the car will likely be a write-off (fingers crossed it didn’t also set your house on fire), but if such a failure were to occur in-flight on an electrified 747, the loss of life would be catastrophic.

To minimize the chances of a full-blown runaway, early detection of cell failures is key. As off-gassing typically occurs minutes before a cell ruptures, a monitoring system that compares readings from sensors positioned close to a Li-ion battery against those collected by a reference sensor further away can alert operators to a failing cell. And to neutralize any gases that have already been released, fire suppression systems armed with inert gas — to prevent the off-gasses from reaching combustible levels when mixed with atmospheric oxygen — can be employed as well. Of course, regular maintenance and robust inspections also help catch cell failures before the situation becomes explosive.
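As a rough illustration of that differential-monitoring idea (this is not any vendor's actual system; the thresholds and sensor values here are invented), a detector only needs to flag a sustained gap between pack-adjacent gas sensors and a faraway reference sensor:

```python
# Illustrative sketch of differential off-gas monitoring: compare sensors near
# the pack against a distant reference sensor and flag a sustained difference
# as possible cell off-gassing. Thresholds and readings are hypothetical.

from statistics import mean

OFFGAS_DELTA_PPM = 50      # difference that warrants an alert (invented value)
SUSTAINED_SAMPLES = 3      # require several consecutive readings to avoid noise

def detect_offgassing(pack_readings_ppm, reference_ppm):
    """Return True if pack sensors read persistently above the reference."""
    consecutive = 0
    for pack_sample, ref_sample in zip(pack_readings_ppm, reference_ppm):
        if mean(pack_sample) - ref_sample > OFFGAS_DELTA_PPM:
            consecutive += 1
            if consecutive >= SUSTAINED_SAMPLES:
                return True
        else:
            consecutive = 0
    return False

# Each pack sample is a list of readings from sensors placed around the cells.
normal = detect_offgassing([[400, 410], [405, 415], [402, 408]], [400, 402, 401])
venting = detect_offgassing([[480, 500], [520, 540], [560, 590]], [400, 402, 401])
print(normal, venting)  # prints: False True
```

Requiring several consecutive elevated samples trades a little detection latency for far fewer false alarms, which matters when the response is dumping inert gas into the battery bay.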

Rolls-Royce

Battery-electric planes will also pose unique challenges in balancing air speed and range, though for Rolls-Royce, it’s not even a question — speed all the way. Over the past few years, Rolls-Royce has been quietly working on Project ACCEL (accelerating the electrification of flight), building a battery-powered racing plane, dubbed the Spirit of Innovation, in an effort to set a new world air speed record.

The record was previously set in 2017 when an electric-powered Extra 330LE, using a Siemens eAircraft-built power plant, notched a 209.7 mph (337.5 kph) top speed over a 3-kilometer-long course. The feat was certified by the World Air Sports Federation (FAI) as the fastest electrically powered flight by an aircraft weighing less than 1,000 kg at takeoff, beating the previous record (set in 2013) by just over 8 mph (13 kph).

In addition to the 3-kilometer record, Rolls-Royce has the opportunity to also set FAI records for a 15km distance and “time to altitude,” basically how quickly the plane can take off and reach a specific height. “It needs to be a significant number,” Rolls-Royce Director of Engineering and Technology – Civil Aerospace, Simon Burr, told Aerosociety. “We’re planning to fly over 300mph. We’ll see how high we can get to.”

Rolls-Royce

For its attempt, Rolls-Royce — which is partnering with UK electric motor manufacturer YASA and start-up Electroflight, which makes bespoke battery systems — has acquired a pair of Sharp Nemesis NXT twin-seat air racers. One has been used for ground testing, while the second will conduct the actual flights. The Nemesis NXT already holds the 3km FAI record with a recorded top speed of 415 mph (667.8 km/h) using a 400 hp Lycoming internal combustion engine.

The Rolls-Royce team has swapped that Lycoming engine out for a trio of YASA 750v electric motors producing around 400kW (530hp) while the fuel tank has been replaced with three independent battery packs.

Rolls-Royce

“The main challenge of electrification is weight,” Rolls-Royce Flight Test Engineer Andy Roberts said during a September media briefing. Not only did the 6,000-cell battery system aboard the Nemesis NXT shift the aircraft’s center of balance, the 450kg pack also doesn’t get lighter in flight the way a conventional fuel load does, which could impact the plane’s performance during the later stages of the run. The batteries are so substantial that Rolls-Royce Chief Test Pilot Phill O’Dell had to lose 2kg of bodyweight to help keep the overall aircraft weight within operating margins.

Thermal runaway is a very real concern for the Rolls-Royce team, as they’ll be pushing these batteries to their absolute limits during the flight. To mitigate this issue, cells are separated by liquid-cooling plates and stored in cork-wrapped fireproof cases (the porous cork material helps diffuse heat). Should a cell overheat to the point of venting off-gasses, the plane is equipped with an inert gas suppression and ventilation system as well.

On September 15th, the Spirit of Innovation made its maiden test flight from the UK Ministry of Defence’s Boscombe Down airfield, flying for 15 minutes. The company hopes to have the Nemesis ready for an official run at the record before the end of this year.

“The first flight of the Spirit of Innovation is a great achievement... We are focused on producing the technology breakthroughs society needs to decarbonize transport across air, land and sea, and capture the economic opportunity of the transition to net zero,” Warren East, Rolls-Royce CEO, said in a statement. “This is not only about breaking a world record; the advanced battery and propulsion technology developed for this programme has exciting applications for the Urban Air Mobility market.”

Rolls-Royce is far from the only company pursuing electric aircraft technology, however much faster its racer may be than the competition. From tiny startups to industry stalwarts — even NASA — companies and governments around the world are racing to develop commercially viable electric aircraft for both passenger flights and cargo hauling.

Guglielmo Mangiapane / reuters

Bye Aerospace, for example, builds electrified 2-seat trainer planes called the eFlyer, similar in function to Diamond Aircraft’s eAircraft. Slovenian aircraft manufacturer Pipistrel has been selling its $140,000 Alpha Electro, the first electric plane to earn FAA certification, since 2018. On the other end of the spectrum, you have aerospace giants like Airbus backing the Air Race E, billed as the world’s first all-electric air race series when it starts up later this year (better get with the times, Red Bull Air Race), and demonstrators like the CityAirbus, a 4-seat eVTOL. These electric vertical takeoff and landing vehicles have become a popular option for fossil fuel-free air travel — see Cadillac’s single-seater concept, the build-it-yourself Jetson Aero, China’s EHang AAV, Uber’s since-abandoned air taxi scheme and Volocopter’s ongoing air taxi program.

Unfortunately, despite all the research into and hype surrounding electrified air travel, many industry experts remain skeptical that we’ll see its widespread adoption for at least a few more decades — at least for large-scale airframes like the Boeing 787 or Airbus A350. Until battery technologies become sufficiently robust, we’ll most likely see eVTOLS restricted to short-hop intracity duties for the foreseeable future, eventually expanding out to inter-city jaunts and regional commuter jets. Still, it beats sitting in traffic.

The 2022 Range Rover will come with both 'mild' and plug-in hybrid powertrains

Land Rover executives unveiled the latest iteration of the company's renowned flagship on Tuesday, showing off a strikingly well-appointed 5th generation SUV that's also surprisingly friendly to the environment, if not your budget.

The company's emphasis on modernism is on full display in the 2022 Range Rover's exterior. A gently sloping roofline contrasted against a rising sill line, along with other classic design details, is joined by state-of-the-art amenities like retractable exterior door handles that help improve the vehicle's aerodynamic performance by nearly 12 percent over previous iterations. The entire vehicle is built on Land Rover's new MLA-Flex architecture, allowing for 11.6 inches of ground clearance and the ability to ford more than 35 inches of water.

Land Rover

An electronic air suspension, which debuted on the Range Rover back in 1992, will keep random road divots from detracting from the drive, while the new Dynamic Response Pro system will electronically negate body roll during high-speed cornering. Coolest of all, the 2022 Range Rover will offer 4-wheel steering, enabling the rear wheels to turn up to 7 degrees to help maintain stability while cornering as well as reduce the Range Rover's low-speed turning radius to rival that of a Honda Civic.

In terms of powertrains, the new Range Rover offers a slew of options. The base models will come standard with a 48V mild-hybrid 3.0L turbocharged I6 turning out 395 hp and 406 ft-lbs of torque. Above that, a 523 hp (553 ft-lbs of torque) 4.4L twin-turbo V8 is available as well. In 2023, Land Rover expects to offer an optional 434 hp plug-in hybrid powertrain capable of traveling up to 62 miles on electric power alone using its 38.2 kWh battery. And, come 2024, Land Rover has announced plans to offer its flagship with an all-electric drivetrain.
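As a quick sanity check on that range claim (assuming, unrealistically, that the full rated capacity is usable, which real packs reserve against), the implied consumption works out to:

```python
# Implied energy consumption of the plug-in hybrid's EV-only mode,
# from the quoted 62-mile range and 38.2 kWh pack. Illustrative only:
# assumes the entire rated capacity is usable.
PACK_KWH = 38.2
EV_RANGE_MILES = 62

consumption_wh_per_mile = PACK_KWH * 1000 / EV_RANGE_MILES
print(f"~{consumption_wh_per_mile:.0f} Wh/mile")  # ~616 Wh/mile
```

That is thirsty next to a typical EV sedan, but plausible for a full-size luxury SUV, especially since such range figures usually come from lenient test cycles.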

But the luxury shown off during Tuesday's livestream comes at a price. A very steep price. The entry-level P400 SE starts at $104,000 ($110,000 if you opt for the seven-seat variant) and climbs to a whopping $163,500 for the P530 First Edition with the long wheelbase. Preorders for the new Range Rover are already open, and deliveries are expected to begin next spring.

Hitting the Books: The genetic fluke that enabled us to drink milk

It may not contain our recommended daily allowance of Vitamin R but milk — or "cow juice" as it's known on the streets — is among the oldest known animal products repurposed for human consumption. Milk has been a staple of our diets for some 8,500 years, but it wasn't until a fortuitous mutation to the human genome that we were able to properly digest that delicious bovine-based beverage. In her latest book, Life as We Made It: How 50,000 Years of Human Innovation Refined — and Redefined — Nature, author Beth Shapiro takes readers on a journey of scientific discovery, explaining how symbiotic relationships between humans and the environment around us have changed — but not always for the better.

Basic Books

Excerpted from Life as We Made It: How 50,000 Years of Human Innovation Refined—and Redefined—Nature by Beth Shapiro. Copyright © 2021. Available from Basic Books, an imprint of Hachette Book Group, Inc.


The first archaeological evidence that people were dairying dates to around 8,500 years ago — 2,000 years after cattle domestication. In Anatolia (present-day eastern Turkey), which is pretty far from the original center of cattle domestication, archaeologists recovered milk fat residues from ceramic pots, indicating that people were processing milk by heating it up. Similar analyses of milk fat proteins in ceramics record the spread of dairying into Europe, which appears to have happened simultaneously with the spread of domestic cattle.

It’s not surprising that people began dairying soon after cattle domestication. Milk is the primary source of sugar, fat, vitamins, and protein for newborn mammals, and as such is evolved expressly to be nutritious. It would not have taken much imagination for a cattle herder to deduce that a cow’s milk would be just as good for him and his family as it was for her calf. The only challenge would have been digesting it—without the lactase persistence mutation, that is.

Because lactase persistence allows people to take advantage of calories from lactose, it also makes sense that the spread of the lactase persistence mutation and the spread of dairying would be tightly linked. If the mutation arose near the start of dairying or was already present in a population that acquired dairying technology, the mutation would have given those who had it an advantage over those who did not. Those with the mutation would, with access to additional resources from milk, more efficiently convert animal protein into more people, and the mutation would increase in frequency.

Curiously, though, ancient DNA has not found the lactase persistence mutation in the genomes of early dairy farmers, and the mutation is at its lowest European frequency today in the precise part of the world where dairying began. The first dairy farmers were not, it seems, drinking milk. Instead, they were processing milk by cooking or fermenting it, making cheeses and sour yogurts to remove the offending indigestible sugars.

If people can consume dairy products without the lactase persistence mutation, there must be some other explanation as to why the mutation is so prevalent today. And lactase persistence is remarkably prevalent. Nearly a third of us have lactase persistence, and at least five different mutations have evolved—all on the same stretch of intron 13 of the MCM6 gene—that make people lactase persistent. In each case, these mutations have gone to high frequency in the populations in which they evolved, indicating that they provide an enormous evolutionary advantage. Is being able to drink milk (in addition to eating cheese and yogurt) sufficient to explain why these mutations have been so important?

The most straightforward hypothesis is that, yes, the benefit of lactase persistence is tied to lactose, the sugar that represents about 30 percent of the calories in milk. Only those who can digest lactose have access to these calories, which may have been crucial calories during famines, droughts, and disease. Milk may also have provided an important source of clean water, which also may have been limited during periods of hardship.

Another hypothesis is that milk drinking provided access to calcium and vitamin D in addition to lactose, the complement of which aids calcium absorption. This might benefit particular populations with limited access to sunlight, as ultraviolet radiation from sun exposure is necessary to stimulate the body’s production of vitamin D. However, while this might explain the high frequency of lactase persistence in places like northern Europe, it cannot explain why populations in relatively sunny climates, such as parts of Africa and the Middle East, also have high frequencies of lactase persistence.

Neither this hypothesis nor the more straightforward hypothesis linked to lactase can explain why lactase persistence is at such low frequency in parts of Central Asia and Mongolia where herding, pastoralism, and dairying have been practiced for millennia. For now, the jury is still out as to why lactase persistence has reached such high frequencies in so many different parts of the world, and why it remains at low frequencies in some regions where dairying is economically and culturally important.

Ancient DNA has shed some light on when and where the lactase persistence mutation arose and spread in Europe. None of the remains from pre-Neolithic archaeological sites—economies that relied on hunting and gathering—have the lactase persistence mutation. None of the ancient Europeans from early farming populations in southern and central Europe (people believed to be descended from farmers spreading into Europe from Anatolia) had the lactase persistence mutation. Instead, the oldest evidence of the lactase persistence mutation in Europe is from a 4,350-year-old individual from central Europe. Around that same time, the mutation is found in a single individual from what is now Sweden and at two sites in northern Spain. While these data are sparse, the timing is coincident with another major cultural upheaval in Europe: the arrival of Asian pastoralists of the Yamnaya culture. Perhaps the Yamnaya brought with them not only horses, wheels, and a new language, but an improved ability to digest milk.

The mystery of lactase persistence in humans highlights the complicated interaction among genes, environment, and culture. The initial increase in frequency of a lactase persistence mutation, regardless of in whom it first arose, may have happened by chance. When the Yamnaya arrived in Europe, for example, they brought disease—specifically plague—that devastated native European populations. When populations are small, genes can drift quickly to higher frequency regardless of what benefit they might provide. If the lactase persistence mutation was already present when plague appeared and populations crashed, the mutation’s initial increase may have happened surreptitiously. When populations recovered, dairying was already widespread and the benefit to those with the mutation would have been immediate. By domesticating cattle and developing dairying technologies, our ancestors created an environment that changed the course of our own evolution.

We continue to live and evolve in this human-constructed niche. In 2018, our global community produced 830 million metric tons (more than 21 billion US gallons) of milk, 82 percent of which was from cattle. The rest comes from a long list of other species that people domesticated within the last 10,000 years. Sheep and goats, which together make up around 3 percent of global milk production, were first farmed for their milk in Europe around the same time as cattle dairying began. Buffaloes were domesticated in the Indus Valley 4,500 years ago and are today the second largest producer of milk next to cattle, producing around 14 percent of the global supply. Camels, which were domesticated in Central Asia 5,000 years ago, produce around 0.3 percent of the world’s milk supply. People also consume milk from horses, which were first milked by people of the Botai culture 5,500 years ago; yaks, which were domesticated in Tibet 4,500 years ago; donkeys, which were domesticated in Arabia or East Africa 6,000 years ago; and reindeer, which are still in the process of being domesticated. But those are just the most common dairy products. Dairy products from more exotic species—moose, elk, red deer, alpacas, llamas—can be purchased and consumed today, and rumor has it that Top Chef ’s Edward Lee is working out how to make pig milk ricotta, should one want to try such a thing.

We can make the steel of tomorrow without the fossil fuels of yesteryear

The modern world has grown around steel bones — everything from tools and home appliances to skyscrapers and airplanes uses the versatile material in its construction. But the process of making steel is a significant contributor to global warming and climate change. Every ton of steel produced in 2018 reportedly generated 1.85 tons of carbon dioxide, accounting for about 7 percent of global CO2 emissions that year. This poses not just environmental challenges for our ever-growing world; it could also impact steel producers’ bottom lines, which is why the industry is developing a “fossil-free” means of making the alloy, one that relies on renewable-sourced hydrogen rather than carbon coke.

Steel is an alloy composed of iron, which in its pure form is relatively soft, and a small amount of introduced carbon, usually about 2 percent of its total weight. The carbon improves the material’s strength and reduces its propensity for fracturing. The process starts by combining iron ore with coking coal and limestone (which removes impurities) in a blast furnace to create pig iron.

That molten pig iron is then poured into a furnace and high-pressure oxygen is introduced via a water-cooled lance. The oxygen chemically reacts with the molten iron to purge impurities — as well as produce significant amounts of carbon monoxide and carbon dioxide. It also forces impurities like silicates and phosphates present in the pig iron to react with limestone flux, trapping them as waste slag. Today, per the World Steel Association, some 1,864 million metric tons of crude steel are produced annually, with China producing the vast majority of it.
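Combining that annual production figure with the 1.85 tons of CO2 per ton of steel cited earlier gives a rough sense of scale (a sketch only; which emissions get counted in such figures varies by source):

```python
# Rough scale of steelmaking's CO2 output, using figures from this piece.
CO2_PER_TON_STEEL = 1.85   # tons of CO2 per ton of steel (2018 figure)
ANNUAL_STEEL_MT = 1_864    # million metric tons of crude steel per year

total_co2_mt = CO2_PER_TON_STEEL * ANNUAL_STEEL_MT
print(f"~{total_co2_mt:,.0f} million tons of CO2 per year")  # ~3,448 million tons
```

That is on the order of 3.4 billion tons a year — consistent, give or take accounting differences, with steel claiming a high-single-digit share of global CO2 emissions.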

While the WSA points out that “in the last 50 years, the steel industry has reduced its energy consumption per tonne of steel produced by 60 percent,” and notes that steel is infinitely recyclable and that “new” steel typically contains around 30 percent recycled steel on average, the traditional methods of iron and steel production are becoming untenable — at least if we want to mitigate their impact on climate change. What’s more, the International Energy Agency estimates that global steel production will grow by a third by 2050, which will only compound the industry’s environmental impacts. That’s where fossil-free steel comes in.

Take HYBRIT (Hydrogen Breakthrough Ironmaking Technology), for example. This process has been developed as a joint venture between three Swedish companies: SSAB, which makes steel, energy company Vattenfall, and LKAB, which mines iron ore. Rather than using coking coal and a blast furnace to convert raw iron ore into metallic iron, the HYBRIT method uses hydrogen generated from renewable energy sources and a technique known as direct reduction, which lowers the amount of oxygen contained within the ore without heating it above the metal’s melting point, to create sponge iron.

HYBRIT

Like pig iron, sponge iron is an intermediary material in the steelmaking process (it’ll get shipped off to SSAB to be turned into steel slabs), but in HYBRIT’s case, its production results in the creation of water vapor rather than carbon dioxide.

“The first fossil-free steel in the world is not only a breakthrough for SSAB, it represents proof that it’s possible to make the transition and significantly reduce the global carbon footprint of the steel industry,” Martin Lindqvist, CEO of SSAB, told reporters in August. “We hope that this will inspire others to also want to speed up the green transition.”

The HYBRIT coalition opened a pilot direct reduction plant in Luleå, Sweden last year and has announced plans to increase production to an industrial scale by 2026. The team claims that eliminating fossil fuels from the steelmaking industry in Sweden could drop the country’s total CO2 emissions by at least 10 percent. However, they are not the only group looking into fossil-free steel production. The H2 Green Steel company has announced its intent to open a large-scale plant in northern Sweden by 2024 and expects to produce 5 million tonnes of the material annually by 2030.

In June, Volvo announced that it would be partnering with SSAB to develop fossil-free steel for use in its products — both passenger cars and industrial machines. Last week, Volvo unveiled the first vehicle to be made with fossil-free steel, an 8-plus-ton load carrier designed to operate within mines. Not only is the load carrier powered by a fully electric drivetrain, it can autonomously navigate across a worksite as well. Granted, only about 3 of the vehicle’s 8 tons were made from fossil-free steel (the drivetrain’s steel components, for example, were made through traditional smelting means), but this marks an important first step toward a carbon-neutral transportation future.

“When we have been talking about ‘fossil free’ in the transport sector, we have been focusing a lot on emissions from the vehicles in use. But it's clear to us and to everyone else that we also need to address the carbon footprint from the production of our vehicles,” Volvo Group’s Chief Technology Officer Lars Stenqvist told Forbes. “That's why it's so important now to team up with everyone in the value chain and collaborate in order to drive out all the fossil fuel also used in the production of components, parts and also running our production facilities.”

Volvo expects the autonomous load carriers to enter real-world operation by next year, though the company concedes that its ability to ramp up production of fossil-free vehicles will depend largely on SSAB’s ability to deliver sufficient quantities of the material.

Tesla posts a wildly profitable Q3 despite difficult car market

Despite a global pandemic and ongoing chip shortage, Tesla continues to make money hand over fist. The company reported on Wednesday that it had a net income of $1.62 billion — five times more than it did this time last year. What's more, Tesla's operating income grew some 54 percent over the past quarter to $2 billion.

Company executives pointed to record-setting sales of both the Model 3 and Model Y — a combined 232,102 units delivered during Q3 2021 — for the explosive earnings growth, though only 9,289 Models X and S were shipped during the same period, a nearly 40 percent drop from Q2 2021 rates.

On the technology front, Tesla continues its FSD City Streets beta rollout and plans to "continue to monitor fleet data closely to help facilitate a smooth rollout," per its quarterly update. 

The company also released a more streamlined iteration of its car companion app that "enables phone key for multiple vehicles simultaneously, allows commands to be sent to the vehicle immediately upon opening the app and integrates the purchase of upgrades, subscriptions and accessories." New features include Disney+ streaming, a scrolling arcade shooter dubbed Sky Force Reloaded, a "car wash mode," and various tweaks to improve the vehicle's cold weather performance. 

Looking ahead, the company expects to achieve a 50 percent average annual growth in vehicle deliveries "over a multi-year horizon" and eventually reach "industry-leading" operating margins.

Stay tuned! The Tesla Q3 investor call starts at 2:30PM PT today; we'll have more details as the event progresses.

developing...   

Egyptian authorities 'detain' robotic artist for 10 days over espionage fears

The robotic artist known as Ai-Da was scheduled to display her artwork alongside the great pyramids of Egypt on Thursday, though the show was nearly called off after both the robot and her human sculptor, Aidan Meller, were detained by Egyptian authorities for a week and a half until officials could be convinced that the artist was not, in fact, a spy.

The incident began when border guards objected to Ai-Da's camera eyes, which she uses in her creative process, and her on-board modem. “I can ditch the modems, but I can’t really gouge her eyes out,” Meller told The Guardian. The robot artist, which was built in 2019, typically travels via specialized cargo case and was held at the border until clearing customs on Wednesday evening, hours before the exhibit was scheduled to begin.

“The British ambassador has been working through the night to get Ai-Da released, but we’re right up to the wire now,” Meller said, just before Ai-Da was sprung from robo-jail. “It’s really stressful.”

Ai-Da is slated to participate in the Forever Is Now exhibit, which runs through November 7th and features a number of leading Egyptian and international artists. The exhibit is presented by Art D’Égypte in conjunction with the Egyptian ministry of antiquities and tourism and the Egyptian ministry of foreign affairs.

“She is an artist robot, let’s be really clear about this. She is not a spy," Meller declared. "People fear robots, I understand that. But the whole situation is ironic, because the goal of Ai-Da was to highlight and warn of the abuse of technological development, and she’s being held because she is technology. Ai-Da would appreciate that irony, I think.”


The Pixel 6's camera will feature larger image sensors and smarter photo editing AI

The Pixel 6 smartphone has finally been unveiled. On Tuesday, Google executives explained what sorts of cameras and image capture systems the new handsets will offer when they go on sale October 28th. 

Both the Pixel 6 and 6 Pro will come equipped with a 50 MP Octa PD Quad Bayer wide camera (the base 6 will additionally feature 7x Super Res Zoom) as well as a 12 MP ultrawide camera. Their new 1/1.3-inch rear sensors reportedly capture up to 150 percent more light than the Pixel 5's. The 6 Pro will also sport a 48 MP telephoto camera with 4x optical and 20x Super Res Zoom functionality. Around front, the base 6 will offer an 8 MP camera while the 6 Pro gets a 12 MP camera.

Both models can capture video in 1080p and 4K (at either 30 or 60 FPS) with their rear cameras, as well as 240 FPS slo-mo. The 6 Pro's front camera can record at 1080p (30 and 60 FPS) or in 4K at 30 FPS. The base 6's front camera, however, can only record 1080p at 30 FPS.

Editing photos should be a much more streamlined process than with past models, thanks to the Pixel 6's Tensor SoC. Users will be able to leverage Magic Eraser, which can quickly and seamlessly remove unwanted objects and even people from the backgrounds of shots. Motion Mode features like Action Pan and Long Exposure, which do exactly what they sound like they do, are available as well.

developing...

Catch up on all the latest news from Google's Pixel 6 event!