Posts with «science» label

This hopping robot with flailing legs could explore asteroids in the future

Over the past two-and-a-half years, a group of students from ETH Zurich has been developing a robot with three spindly legs, designed to hop like an insect in microgravity. That's right — the curious little machine was built for space, specifically for the exploration of small celestial bodies like asteroids and moons. SpaceHopper, as the robot is called, could thus provide us with more information to advance our understanding of life's origin, of the origin of water on our planet and of asteroids as potential sources of valuable resources.

It has no preferred orientation, so it can travel in any direction, and it has nine motors that give it the capability to jump long distances in low-gravity environments. The robot can even self-right after landing, ensuring the safety of any scientific payload it may carry. Since SpaceHopper was made for use on asteroids and moons, which have very little gravity compared to Earth, it has to be tested under similar conditions first. To see whether it would actually work as intended, the students and the European Space Agency recently took the robot on a parabolic flight, which creates a zero-gravity environment when the aircraft freefalls. Apparently, they had no idea if SpaceHopper would be able to move as intended in zero gravity, and seeing that it actually worked was a "massive weight off [their] shoulders."

You can watch SpaceHopper flail about in the test flight below:

This article originally appeared on Engadget at https://www.engadget.com/this-hopping-robot-with-flailing-legs-could-explore-asteroids-in-the-future-120043940.html?src=rss

ESA's Gaia mission discovers the biggest stellar black hole in our galaxy yet

In addition to the supermassive black hole at the center of our galaxy, the Milky Way also serves as home to smaller stellar black holes that form when a massive star collapses. Scientists believe there are 100 million stellar black holes in our galaxy alone, but most of them have yet to be discovered. The ones found so far are, on average, around 10 times the mass of our sun, with the biggest reaching 21 solar masses. Thanks to data collected by the European Space Agency's Gaia mission, though, scientists have discovered a stellar black hole that's 33 times the mass of our sun, the biggest of its kind ever seen in our galaxy. It's also relatively close to our planet, at around 1,926 light-years away.
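To put those masses in perspective, here's a quick back-of-the-envelope calculation (ours, not the article's or the ESA's) of the Schwarzschild radius r_s = 2GM/c², the event-horizon radius for a non-rotating black hole of the masses mentioned above:

```python
# Illustrative only: event-horizon (Schwarzschild) radius r_s = 2GM/c^2
# for black holes of the masses cited in the article.
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8        # speed of light, m/s
M_SUN = 1.989e30   # one solar mass, kg

def schwarzschild_radius_km(solar_masses: float) -> float:
    """Horizon radius in kilometers for a mass given in solar masses."""
    return 2 * G * (solar_masses * M_SUN) / C**2 / 1000

print(f"typical stellar black hole (10 M_sun): {schwarzschild_radius_km(10):.0f} km")
print(f"previous Milky Way record (21 M_sun): {schwarzschild_radius_km(21):.0f} km")
print(f"Gaia BH3 (33 M_sun): {schwarzschild_radius_km(33):.0f} km")
```

Even the record-setting BH3 would have a horizon only about 100 km across in radius — city-sized, despite weighing 33 suns.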

Gaia BH3, as it's now called, was first noticed by a team of ESA scientists poring over data from the mission to look for anything unusual. An old giant star from the nearby Aquila constellation caught their attention with its wobbling, leading to the discovery that it was orbiting a massive black hole. BH3 was hard to find despite being so close — it's now the second closest known black hole to our planet — because it doesn't have celestial bodies close enough that could feed it matter and make it light up in X-ray telescopes. Before its discovery, we'd only found black holes of comparable size in distant galaxies. 

The ESA team used data from ground-based observatories, including the European Southern Observatory, to confirm the size of the newly discovered celestial body. They also published a paper with preliminary findings, ahead of a more detailed one due in 2025, so that their peers could start studying Gaia BH3. For now, what they know is that the star orbiting it has very few elements heavier than hydrogen and helium, and since stellar pairs tend to have similar compositions, the star that collapsed to form BH3 could have had a similar makeup.

Scientists have long believed that metal-poor stars are the ones that can create high-mass black holes when they collapse, because they lose less mass over their lifetimes. In other words, they'd theoretically still have plenty of material left at the time of their death to form a massive black hole. This is apparently the first evidence we've found linking metal-poor stars with massive stellar black holes, and it's also proof that older giant stars developed differently from the newer ones we see in our galaxy.

We'll most likely see more detailed studies about binary systems and stellar black holes that use data from BH3 and its companion star in the future. The ESA believes that BH3's discovery is just the beginning, and it's going to be the focus of more investigations as we seek to unravel the mysteries of the universe.

This article originally appeared on Engadget at https://www.engadget.com/esas-gaia-mission-discovers-the-biggest-stellar-black-hole-in-our-galaxy-yet-085753239.html?src=rss

NASA confirms its space trash pierced Florida man’s roof

On March 8, a piece of space debris plunged through a roof in Naples, FL, ripped through two floors and (fortunately) missed the son of homeowner Alejandro Otero. On Tuesday, NASA confirmed the results of its analysis of the incident. As suspected, it’s a piece of equipment dumped from the International Space Station (ISS) three years ago.

NASA’s investigation of the object at Kennedy Space Center in Cape Canaveral confirmed it was a piece of the EP-9 support equipment used to mount batteries onto a cargo pallet, which the ISS’ robotic arm dropped on March 11, 2021. The haul, made up of discarded nickel-hydrogen batteries, was expected to orbit Earth for between two and four years (it split the difference, lasting almost exactly three) “before burning up harmlessly in the atmosphere,” as NASA predicted at the time. Not quite.
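For the curious, the "almost exactly three years" checks out. A quick sketch using the two dates from the article:

```python
# Sanity check on the debris' time aloft, using dates from the article.
from datetime import date

jettisoned = date(2021, 3, 11)  # pallet released by the ISS robotic arm
reentry = date(2024, 3, 8)      # debris struck the Naples, FL home

days_aloft = (reentry - jettisoned).days
years_aloft = days_aloft / 365.25  # average Gregorian year length

print(f"{days_aloft} days aloft, about {years_aloft:.2f} years")
```

That works out to 1,093 days — three days shy of a full three years in orbit.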

The roof-piercing debris was described as a stanchion from NASA flight support equipment used to mount the batteries onto the cargo pallet. Made of the metal alloy Inconel, the object weighs 1.6 lbs and measures 4 inches tall and 1.6 inches in diameter.

Hello. Looks like one of those pieces missed Ft Myers and landed in my house in Naples.
Tore through the roof and went thru 2 floors. Almost his my son.
Can you please assist with getting NASA to connect with me? I’ve left messages and emails without a response. pic.twitter.com/Yi29f3EwyV

— Alejandro Otero (@Alejandro0tero) March 15, 2024

Otero told Fort Myers CBS affiliate WINK-TV that he was on vacation when his son told him that an object had pierced their roof. “I was shaking,” he said. “I was completely in disbelief. What are the chances of something landing on my house with such force to cause so much damage? I’m super grateful that nobody got hurt.”

NASA says it will investigate the equipment dump’s jettison and re-entry to try to figure out why the object slammed into Otero’s home instead of burning up in the atmosphere. “NASA specialists use engineering models to estimate how objects heat up and break apart during atmospheric re-entry,” the space agency explained in a news release. “These models require detailed input parameters and are regularly updated when debris is found to have survived atmospheric re-entry to the ground.”

Most space junk moves extremely fast, reaching up to 18,000 mph, according to NASA. It explains, “Due to the rate of speed and volume of debris in LEO, current and future space-based services, explorations, and operations pose a safety risk to people and property in space and on Earth.”

This article originally appeared on Engadget at https://www.engadget.com/nasa-confirms-its-space-trash-pierced-florida-mans-roof-204056957.html?src=rss

Physicist Peter Higgs, who predicted 'the God particle', has died at 94

Peter Higgs, the physicist who predicted the Higgs boson particle, has passed away at the age of 94 due to a blood disorder. His work proposing the particle — and showing how it helped give mass to some matter — won him the Nobel Prize in 2013. The Higgs boson is informally referred to as the God particle, after a book by Nobel laureate Leon Lederman.

Higgs came up with the idea in the early 1960s as an attempt to explain why particles have mass in the first place. The research didn’t get any traction in scientific journals at first, primarily because few understood the concept, but it was finally published in 1964. It was just a theory at the time, but it kicked off a nearly 50-year race to prove the Higgs boson particle actually exists.

Scientists hit pay dirt in 2012, thanks to physicists working at the Large Hadron Collider at CERN in Switzerland. It took four years of experiments, but the Higgs boson particle was finally discovered, proving his ideas and adding a major puzzle piece to the corpus of particle physics knowledge known as the Standard Model.

As a matter of fact, modern theoretical physicists have posited the existence of up to five Higgs boson particles that fill up what is now called the Higgs field. Scientists hope to use the Higgs boson to one day find proof for ever-elusive dark matter.

The Royal Swedish Academy of Sciences, which awards the Nobel, wrote about the importance of his discovery ahead of the ceremony in 2013. “Even when the universe seems empty this field is there. Without it, we would not exist, because it is from contact with the field that particles acquire mass.” The Nobel was shared with François Englert, a Belgian theoretical physicist whose work in 1964 contributed to the discovery.

“At the beginning I had no idea whether a discovery would be made in my lifetime,” Higgs once said. He leaves two sons, Chris and Jonny, his daughter-in-law Suzanne and two grandchildren. His former wife Jody, a linguistics professor, died in 2008.

This article originally appeared on Engadget at https://www.engadget.com/physicist-peter-higgs-who-predicted-the-god-particle-has-died-at-94-153635259.html?src=rss

One of these concept lunar vehicles could join NASA’s Artemis V astronauts on the moon

Three companies are vying for the opportunity to send their own lunar vehicle to the moon to support NASA’s upcoming Artemis missions. The space agency announced this week that it’s chosen Intuitive Machines, Lunar Outpost and Venturi Astrolab to develop their lunar terrain vehicles (LTV) in a feasibility study over the next year. After that, only one is expected to be selected for a demonstration mission, in which the vehicle will be completed and sent to the moon for performance and safety tests. NASA is planning to use the LTV starting with the Artemis V crew that’s projected to launch in early 2030.

The LTV that eventually heads to the moon’s south pole needs to function as both a crewed and uncrewed vehicle, serving sometimes as a mode of transportation for astronauts and other times as a remotely operated explorer. NASA says it’ll contract the chosen vehicle for lunar services through 2039, with all the task orders relating to the LTV amounting to a potential value of up to $4.6 billion. The selected company will also be able to use its LTV for commercial activities in its down time.


Intuitive Machines, which will be developing an LTV called the Moon Racer, has already bagged multiple contracts with NASA as part of the Commercial Lunar Payload Services (CLPS) program, and in February launched its first lander, Odysseus, to the moon to achieve the first commercial moon landing. Venturi Astrolab will be developing a vehicle it’s dubbed Flex, while Lunar Outpost will be working on an LTV called Lunar Dawn. All must be able to support a crew of two astronauts and withstand the extreme conditions of the lunar south pole. 

“We will use the LTV to travel to locations we might not otherwise be able to reach on foot, increasing our ability to explore and make new scientific discoveries,” said Jacob Bleacher, chief exploration scientist at NASA.

This article originally appeared on Engadget at https://www.engadget.com/one-of-these-concept-lunar-vehicles-could-join-nasas-artemis-v-astronauts-on-the-moon-202448277.html?src=rss

NASA will be studying the total solar eclipse. Here's how you can help

On Monday, April 8, a total solar eclipse will be visible across a swath of North America, from Mexico’s Pacific coast to the easternmost reaches of Canada. And in those few minutes of daytime darkness, all sorts of interesting phenomena are known to occur — phenomena NASA would like our help measuring.

During a total solar eclipse, temperatures may drop and winds may slow down or change their course. Animals have been observed to behave unusually — you might hear crickets start their evening chatter a few hours early. Even radio communications can be disrupted due to changes in the ionosphere while the sun’s light is blocked. And, the sun’s corona — its outermost atmosphere — will come into view, presenting scientists (and those of us helping them) with a rare opportunity to study this layer that’s normally invisible to the naked eye.

NASA has lots of research efforts planned for the eclipse, and has sponsored a handful of citizen science campaigns that anyone can take part in if they’re in or near the path of totality, or the areas where people on the ground can watch the sun become completely obscured by the moon. The path of totality crosses 13 US states, including parts of Texas, Oklahoma, Arkansas, Missouri, Illinois, Kentucky, Indiana, Ohio, Pennsylvania, New York, Vermont, New Hampshire and Maine. It’s an event of some significance; the next time a total solar eclipse passes over that much of the contiguous US won’t be until 2045.

All you’ll need to join in is equipment you already own, like a smartphone, and a few minutes set aside before the eclipse to go through the training materials.


Help measure the shape of the sun

One such citizen science project is SunSketcher, a concerted effort to measure the true shape of the sun. While the sun is closer to being a perfect sphere than other celestial bodies that have been observed, it’s still technically an oblate spheroid, being a smidge wider along its equator. The SunSketcher team plans to get a more precise measurement by crowd-sourcing observations of Baily's Beads, or the little spots of sunlight that peek out from behind the moon at certain points in the eclipse.

The Baily’s Bead effect is “the last piece of the sun seen before totality and the first to appear after totality,” NASA explained in a blog post. “For a few seconds, these glimmers of light look like beads along the moon’s edge.” They’re visible thanks to the uneven topographical features on the lunar surface.

You’ll need to download the free SunSketcher app, which is available for iOS and Android on the App Store and Google Play Store. Then, a few minutes before totality (the exact time is location-dependent), put your phone on Do Not Disturb, hit “Start” in the app and prop up the phone in a place where it has a good view of the sun. After that, leave it be until the eclipse is over — the app will automatically take pictures of Baily’s Beads as they show up.

There’s a tutorial on the SunSketcher website if you want to familiarize yourself with the process beforehand. When it’s all said and done, the pictures will be uploaded to SunSketcher’s server. They’ll eventually be combined with observations from all over to “create an evolving pattern of beads” that may be able to shed better light on the size and shape of the sun.

The SunSketcher images probably won’t blow you away, so if you’re hoping to get some great pictures of the eclipse, you’ll want to have another camera on hand for that (with the appropriate filters to protect your eyes and the device’s sensors).


Record changes in your surroundings

Eclipse-watchers can also use their smartphones to record the environmental changes that take place when the sun dips behind the moon as part of a challenge run by Global Learning and Observations to Benefit the Environment (Globe). You’ll need an air temperature thermometer as well for this task, and can start logging observations in the days before the eclipse if you feel like being extra thorough.

Temperatures at the surface can, in some cases, drop as much as 10 degrees Fahrenheit during a total solar eclipse, according to NASA. And certain types of clouds have been observed to dissipate during these brief cooldowns, resulting in unexpectedly clear skies in the moments before totality. Data collected with the help of citizen scientists during the 2017 total solar eclipse showed that areas with heavier cloud cover experienced a less extreme drop in surface temperatures.

To participate this time around, download the Globe Observer app from the App Store or Google Play Store, and then open the Globe Eclipse tool from the in-app menu. There, you’ll be able to jot down your temperature measurements and take photos of the sky to record any changes in cloud cover, and make notes about the wind conditions. Plan to dedicate a few hours to this one — NASA asks that you include observations from 1-2 hours before and after the eclipse in addition to what you’ll record during. “You will measure temperature every 5-10 minutes and clouds every 15-30 minutes or whenever you see change,” NASA says.

You can keep using the Globe Observer app for citizen science beyond eclipse day, too. There are programs running all year round for recording observations of things like clouds, land use, mosquito habitats and tree heights. The eclipse tool, though, is only available when there’s an eclipse happening.

Listen to the sounds of wildlife

Observations going back nearly 100 years have added support to the idea that total solar eclipses temporarily throw some animals out of whack. Inspired by a 1935 study that gathered observations on animal behavior during an eclipse three years prior, the Eclipse Soundscapes Project is inviting members of the public to take note of what they hear before, during and after totality, and share their findings.

To be an Observer for the project, it’s recommended that you first sign up on the website and go through the brief training materials so you can get a sense of what type of information the project is looking for. The website also has printable field notes pages you can use to record your observations on eclipse day. You should start taking notes down at least 10 minutes before totality. Only after the eclipse is over will you need to fill out the webform to submit your observations along with your latitude and longitude.

If you happen to have an AudioMoth acoustic monitoring device and a spare microSD card lying around, you can go a step further and record the actual sounds of the environment during the eclipse as a Data Collector. You’ll need to set everything up early — the project says to do it on Saturday, April 6 before noon — and let it record until at least 5PM local time on April 10. At that point, you can turn it off, submit your notes online and mail in the SD card. All of the details for submission can be found on the project’s website.


Take photos of the solar corona

The Eclipse Megamovie 2024 is an initiative designed to study the sun’s corona and plasma plumes from locations in the path of totality, building off of a previous campaign from the 2017 total solar eclipse. It’s already selected a team of 100 Science Team Alpha Recruits (STARs) who underwent training and were given 3D-printed tracking mounts for their cameras to shoot the best possible images. But, the project will still be accepting photo submissions from any enthusiasts who have a DSLR (and a solar filter) and want to participate.

The Photography Guide is pretty exhaustive, so don’t wait until eclipse day to start figuring out your setup. You’ll be able to submit your photos after the eclipse through a form on the website.

However you choose to spend the eclipse, whether you’re collecting data for a citizen science mission or just planning to kick back and observe, make sure you have everything in place well ahead of time. While the partial eclipse phases will last over an hour, totality will be over and done in about 3.5 to 4.5 minutes, depending on where you’re watching from. You wouldn’t want to miss out on some of that time because you were fumbling with your camera.

Totality will start shortly after 11AM local time (2PM ET) for western Mexico, moving northeastward over the subsequent two-or-so hours before exiting land near Newfoundland, Canada around 5:30PM local time. There will still be something to see for people outside the path of totality, too. Most of the US will be treated to a partial eclipse that day. You can find out exactly when the eclipse will be visible from your location with this tool on NASA’s website, along with the percentage of sun coverage you can expect to witness.

This article originally appeared on Engadget at https://www.engadget.com/nasa-will-be-studying-the-total-solar-eclipse-heres-how-you-can-help-140011076.html?src=rss

The Morning After: NASA has to make a time zone for the Moon

The White House has published a policy memo asking NASA to create a new time standard for the Moon by 2026. Coordinated Lunar Time (LTC) will establish an official time reference to help guide future lunar missions. The US, China, Japan, India and Russia have space missions to the Moon planned or completed.

NASA (and the White House) aren’t the only ones trying. The European Space Agency is also trying to make a time zone outside of Earth’s… zone.

Given the Moon’s weaker gravity, time moves slightly faster there. “The same clock we have on Earth would move at a different rate on the Moon,” NASA space communications and navigation chief Kevin Coggins told Reuters.

You saw Interstellar, right? Er, just like that. Exactly like that. No further questions.

— Mat Smith

The biggest stories you might have missed

Meta’s AI image generator struggles to create images of couples of different races

Our favorite cheap smartphone is on sale for $250 right now

OnePlus rolls out its own version of Google’s Magic Eraser

How to watch (and record) the solar eclipse on April 8

You can get these reports delivered daily direct to your inbox. Subscribe right here!

Microsoft may have finally made quantum computing useful

The most error-free quantum solution yet, apparently.

What if we could build a machine working at the quantum level that could tackle complex calculations exponentially faster than a computer limited by classic physics? Despite all the heady dreams of quantum computing and press releases from IBM and Google, it's still a what-if. Microsoft now says it’s developed the most error-free quantum computing system yet, with Quantinuum. It’s not a thing I can condense into a single paragraph. You… saw Interstellar, right?

Continue reading.

Stability AI’s audio generator can now create three-minute ‘songs’

Still not that good, though.

Stability AI just unveiled Stable Audio 2.0, an upgraded version of its music-generation platform. With this system, you can use your own text to create up to three minutes of audio, which is roughly the length of a song. You can hone the results by choosing a genre or even uploading audio to inspire the algo. It’s fun — try it out. Just don’t add vocals, trust me.

Continue reading.

Bloomberg says Apple is developing personal robots now

EVs schmee vees.

Apple, hunting for its next iPhone / Apple Watch / Vision Pro (maybe?), might be trying to get into robots. According to Bloomberg’s Mark Gurman, one area the company is exploring is personal robotics — and it started looking at electric vehicles too. The report says Apple has started working on a mobile robot to follow users around their home and has already developed a table-top device that uses a robot to move a screen around.

Continue reading.

Another Matrix movie is happening.

Not like this.


Whoa.

Continue reading.

This article originally appeared on Engadget at https://www.engadget.com/the-morning-after-nasa-has-to-make-a-time-zone-for-the-moon-111554408.html?src=rss

The White House tells NASA to create a new time zone for the Moon

On Tuesday, The White House published a policy memo directing NASA to create a new time standard for the Moon by 2026. Coordinated Lunar Time (LTC) will establish an official time reference to help guide future lunar missions. It arrives as a 21st-century space race emerges between (at least) the US, China, Japan, India and Russia.

The memo directs NASA to work with the Departments of Commerce, Defense, State, and Transportation to plan a strategy to put LTC into practice by December 31, 2026. International cooperation will also play a role, especially with signees of the Artemis Accords. Established in 2020, they’re a set of common principles between a growing list of (currently) 37 countries that govern space exploration and operating principles. China and Russia are not part of that group.

“As NASA, private companies, and space agencies around the world launch missions to the Moon, Mars, and beyond, it’s important that we establish celestial time standards for safety and accuracy,” OSTP Deputy Director for National Security Steve Welby wrote in a White House press release. “A consistent definition of time among operators in space is critical to successful space situational awareness capabilities, navigation, and communications, all of which are foundational to enable interoperability across the U.S. government and with international partners.”

Einstein’s theories of relativity dictate that time changes relative to speed and gravity. Given the Moon’s weaker gravity (and movement differences between it and Earth), time moves slightly faster there. So an Earth-based clock on the lunar surface would appear to gain an average of 58.7 microseconds per Earth day. As the US and other countries plan Moon missions to research, explore and (eventually) build bases for permanent residence, using a single standard will help them synchronize technology and missions requiring precise timing.
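To get a feel for why that matters, here's a rough illustration (our arithmetic, not NASA's) of how the quoted 58.7 microseconds per day accumulates. For context, GPS-style positioning depends on nanosecond-level clock agreement, so an uncorrected offset this large would quickly become a problem:

```python
# Illustrative: cumulative lunar-vs-Earth clock offset, using the
# article's figure of 58.7 microseconds gained per Earth day.
DRIFT_US_PER_DAY = 58.7  # average offset, microseconds per Earth day

def drift_seconds(days: float) -> float:
    """Total accumulated clock offset in seconds after the given number of days."""
    return DRIFT_US_PER_DAY * 1e-6 * days

print(f"after one year:   {drift_seconds(365.25) * 1e3:.1f} ms")
print(f"after one decade: {drift_seconds(3652.5):.3f} s")
```

Roughly 21 milliseconds per year may sound small, but light travels about 6,400 km in that time — useless for precise navigation without a shared standard like LTC.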

“The same clock that we have on Earth would move at a different rate on the moon,” NASA space communications and navigation chief Kevin Coggins told Reuters. “Think of the atomic clocks at the U.S. Naval Observatory (in Washington). They’re the heartbeat of the nation, synchronizing everything. You’re going to want a heartbeat on the moon.”


The White House wants LTC to coordinate with Coordinated Universal Time (UTC), the standard by which all of Earth’s time zones are measured. Its memo says it wants the new time zone to enable accurate navigation and scientific endeavors. It also wants LTC to maintain resilience if it loses contact with Earth while providing scalability for space environments “beyond the Earth-Moon system.”

NASA’s Artemis program aims to send crewed missions back to the Moon for the first time since the Apollo missions of the 1960s and 70s. The space agency said in January that Artemis 2, which will fly around the Moon with four people onboard, is now set for a September 2025 launch. Artemis 3, which plans to put humans back on the Moon’s surface, is now scheduled for 2026.

In addition to the US, China aims to put astronauts on the Moon before 2030 as the world’s two foremost superpowers take their race to space. Although no other countries have announced crewed missions to the lunar surface, India (which put a module and rover on the Moon’s South Pole last year), Russia (whose mission around the same time didn’t go so well), the United Arab Emirates, Japan, South Korea and private companies have all demonstrated lunar ambitions in recent years.

In addition to enabling further scientific exploration, technological establishment and resource mining, the Moon could serve as a critical stop on the way to Mars. It could test technologies and provide fuel and supply needs for eventual human missions to the Red Planet.

This article originally appeared on Engadget at https://www.engadget.com/the-white-house-tells-nasa-to-create-a-new-time-zone-for-the-moon-193957377.html?src=rss

This camera captures 156.3 trillion frames per second

Scientists have created a blazing-fast scientific camera that shoots images at an encoding rate of 156.3 terahertz (THz) to individual pixels — equivalent to 156.3 trillion frames per second. Dubbed SCARF (swept-coded aperture real-time femtophotography), the research-grade camera could lead to breakthroughs in fields studying micro-events that come and go too quickly for today’s most expensive scientific sensors.
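As a quick sanity check (ours, not the paper's), inverting that encoding rate gives the time window each frame represents, which lands squarely in femtosecond territory — fitting for a technique with "femtophotography" in its name:

```python
# Illustrative: convert SCARF's per-pixel encoding rate into the
# equivalent time interval between frames.
ENCODING_RATE_HZ = 156.3e12  # 156.3 THz, from the article

frame_interval_s = 1 / ENCODING_RATE_HZ
print(f"{frame_interval_s * 1e15:.2f} femtoseconds per frame")
```

That's about 6.4 femtoseconds per frame, the timescale of the laser-ablation and shock-wave events the team wants to capture.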

SCARF has successfully captured ultrafast events like absorption in a semiconductor and the demagnetization of a metal alloy. The research could open new frontiers in areas as diverse as shock wave mechanics or developing more effective medicine.

Leading the research team was Professor Jinyang Liang of Canada’s Institut national de la recherche scientifique (INRS). He’s a globally recognized pioneer in ultrafast photography who built on his breakthroughs from a separate study six years ago. The current research was published in Nature, summarized in a press release from INRS and first reported on by Science Daily.

Professor Liang and his colleagues framed their research as a fresh take on ultrafast cameras. Typically, these systems use a sequential approach: capture frames one at a time and piece them together to observe the objects in motion. But that approach has limitations. “For example, phenomena such as femtosecond laser ablation, shock-wave interaction with living cells, and optical chaos cannot be studied this way,” Liang said.


The new camera builds on Liang’s previous research to upend traditional ultrafast camera logic. “SCARF overcomes these challenges,” INRS communication officer Julie Robert wrote in a statement. “Its imaging modality enables ultrafast sweeping of a static coded aperture while not shearing the ultrafast phenomenon. This provides full-sequence encoding rates of up to 156.3 THz to individual pixels on a camera with a charge-coupled device (CCD). These results can be obtained in a single shot at tunable frame rates and spatial scales in both reflection and transmission modes.”

In extremely simplified terms, that means the camera uses a computational imaging modality to capture spatial information by letting light enter its sensor at slightly different times. Not having to process the spatial data in real time is part of what frees the camera to capture those extremely quick “chirped” laser pulses at up to 156.3 trillion times per second. The images’ raw data can then be processed by a computer algorithm that decodes the time-staggered inputs, transforming each of the trillions of frames into a complete picture.

Remarkably, it did so “using off-the-shelf and passive optical components,” as the paper describes. The team describes SCARF as low-cost with low power consumption and high measurement quality compared to existing techniques.

Although SCARF is focused more on research than consumers, the team is already working with two companies, Axis Photonique and Few-Cycle, to develop commercial versions, presumably for peers at other higher learning or scientific institutions.

For a more technical explanation of the camera and its potential applications, you can view the full paper in Nature.

This article originally appeared on Engadget at https://www.engadget.com/this-camera-captures-1563-trillion-frames-per-second-184651322.html?src=rss

Friends don’t let friends use an AI STI test

Picture the scene: Your date has gone well and you and your partner might sleep together. Like any safe adult, you assume there will be a conversation about STI status and the use of protection. Now imagine how you would feel if they asked to take a photo of your penis and upload it to a website you’ve never heard of. That’s the future of intimacy, as imagined by Calmara, a new service launched by “men’s health” startup HeHealth.


Its press release suggests users take a picture of their partner’s penis so it can be run through a deep learning model for visual signs of sexually-transmitted infections. And while the website suggests users should wear protection, a banner atop the HeHealth sites describes the app as “Your intimate bestie for unprotected sex.” Mixed messages aside, you may notice some major issues with the pitch: That this only covers infections that present visually, and that it’s only designed to work with penises.

But even if that use case applies, you might not feel you can trust its conclusions once you’ve looked at the data. The Calmara website claims its scans are up to 90 percent accurate, saying its AI has been “battle-tested by over 40,000 users.” That figure doesn’t match its press release, which says accuracy reaches 94.4 percent (a figure cited in this NSFW preprint paper submitted a week ago), while its FAQ says the accuracy ranges “from 65 percent to 96 percent across various conditions.” We’ve reached out to the company to learn more about the apparent discrepancy.


It’s not impossible for models to categorize visual information — I reported on how systems like these look at images of cells to aid drug discovery. But there are plenty of reasons as to why visual information isn’t going to be as reliable for an STI test. After all, plenty of conditions don’t have visual symptoms and carriers can often be asymptomatic long after infection. The company admits to this in its FAQ, saying that the app is a “first line of defense, not a full-on fortress.” Not to mention that other factors, like the “lighting, the particular health quirks you’re scouting for and a rainbow of skin tones might tweak those [accuracy] numbers a bit.” Even more alarming, the unpublished paper (which is riddled with typos) admits that a full 40 percent of its training dataset is comprised of "augmented" images, for instance "extracting specific visually recognizable disease patterns from the existing clinical image dataset and layering those patterns on top of images of health (sic) penises."


The Calmara website’s disclaimer says that its tools are for the purpose of “promoting and supporting general wellness and a healthy lifestyle and are not to be used to diagnose, cure, treat, manage or prevent any disease or condition.” Of course, if it really was intended as a general wellness tool, it probably wouldn’t describe itself as “Your intimate bestie for unprotected sex,” would it?

It doesn’t help that this is a system asking users to send pictures of their, or their partner’s, genitalia. Issues around consent and — as writer Ella Dawson raised on Bluesky — age verification don’t seem to have been considered. The company’s promise that the data is locked in a “digital stronghold” lacks specifics about its security approach or how the data it obtains may be shared. But that hasn’t stopped the company from suggesting that it could, in future, be integrated “directly into dating apps.”

Fundamentally, there are so many red flags, so many potential vectors for abuse and so many ways this could give users a false sense of confidence that nobody should try using it.

This article originally appeared on Engadget at https://www.engadget.com/friends-dont-let-friends-use-an-ai-sti-test-162354796.html?src=rss