Posts by Will Shanklin

Researchers use novel method to find a distant exoplanet

Astronomers have discovered a new exoplanet — but this time, the way they found it may be as significant as the discovery itself. Researchers used a breakthrough combination of indirect and direct planetary detection to locate the distant world known as HIP 99770 b. It could inch us closer to finding Earth-like exoplanets among our (distantly) neighboring stars.

Direct imaging is what most casual observers would expect to lie at the heart of exoplanet hunting: using powerful telescopes with advanced optics to capture images of distant planetary bodies. However, direct imaging is most effective for planets orbiting far from their stars; an exoplanet closer to its sun is usually obscured by the star’s bright light, making it difficult to detect or image. (When they’re farther away, there’s greater contrast between the exoplanet’s and the star’s light.)

Meanwhile, indirect detection (precision astrometry) looks for stars that appear to “wobble” — the telltale tug of an (otherwise unseen to us) exoplanet’s gravity on its host star. This method can more easily detect planets orbiting close to their stars, as Earth does the Sun. As a result, indirect methods have yielded over 5,000 exoplanet discoveries, while direct imaging has only captured about 20.
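For the curious, the size of that wobble is easy to estimate: the star and planet orbit their common center of mass, so the star traces a tiny ellipse on the sky. Here’s a minimal sketch in Python — the masses, separation and distance used below are illustrative approximations, not measured values from the study:

```python
# Rough astrometric "wobble" estimate: a star and its planet orbit their
# shared center of mass, so the star traces a tiny ellipse on the sky.
# All input values are illustrative approximations.

M_JUP_IN_MSUN = 9.54e-4  # one Jupiter mass, in solar masses

def wobble_milliarcsec(planet_mjup, star_msun, sep_au, dist_pc):
    """Angular size of the star's reflex orbit, in milliarcseconds.

    The star's orbit around the barycenter has semi-major axis
    a_star = (m_planet / m_star) * a_planet; dividing AU by parsecs
    gives the angle in arcseconds (that's the definition of a parsec).
    """
    a_star_au = (planet_mjup * M_JUP_IN_MSUN / star_msun) * sep_au
    return a_star_au / dist_pc * 1000.0  # arcseconds -> milliarcseconds

# A Jupiter analog seen from 10 parsecs
print(f"Jupiter analog: {wobble_milliarcsec(1, 1.0, 5.2, 10):.2f} mas")
# A HIP 99770 b-like system (~16 Jupiter masses, ~2-solar-mass star,
# ~17 AU separation, a few dozen parsecs away)
print(f"Massive wide-orbit planet: {wobble_milliarcsec(16, 2.0, 17, 40):.2f} mas")
```

A massive planet on a wide orbit produces a wobble of a few milliarcseconds — orders of magnitude easier to spot in astrometric catalogs than a true Jupiter twin, which is part of why heavyweights like HIP 99770 b make good first targets for this approach.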

The international team of researchers, led by Thayne Currie of the National Astronomical Observatory of Japan (NAOJ) and the University of Texas at San Antonio, combined the two methods to discover the new exoplanet. First, they used data from the Hipparcos-Gaia Catalogue of Accelerations — a map tracking the precise positions and motions of nearly two million stars in the Milky Way — to identify the star HIP 99770 as a prime candidate for hosting an exoplanet. Then, they used Japan’s ultra-powerful Subaru telescope (in Mauna Kea, Hawaii) to directly image the newly discovered exoplanet, creatively titled HIP 99770 b.

European Space Agency

The European Space Agency illustration above depicts the exoplanet, which is about 16 times as massive as Jupiter. Despite orbiting more than three times farther from its star than Jupiter does from the Sun, HIP 99770 b receives around the same amount of light, because its host star is about twice as massive (and considerably more luminous) than ours. The researchers say its atmosphere may contain water and carbon monoxide.
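That claim checks out on the back of an envelope: starlight received scales as luminosity over distance squared, and for main-sequence stars a commonly used rough mass–luminosity relation is L ∝ M^3.5. A quick sketch, using approximate illustrative numbers rather than the study’s measured values:

```python
# Sanity-check "about the same light as Jupiter":
# received flux scales as L / d^2, and a rough main-sequence
# mass-luminosity relation is L ~ M^3.5 (solar units).
# All numbers here are approximations for illustration.

def relative_flux(star_mass_msun, orbit_au):
    """Starlight received, relative to Jupiter (Sun = 1 Msun, 5.2 AU)."""
    luminosity = star_mass_msun ** 3.5      # luminosity in solar units
    jupiter_flux = 1.0 / 5.2 ** 2           # the Sun, at Jupiter's distance
    return (luminosity / orbit_au ** 2) / jupiter_flux

# A ~2-solar-mass star at roughly 3.3x Jupiter's orbital distance (~17 AU)
print(f"~{relative_flux(2.0, 17.0):.1f}x Jupiter's insolation")
```

Doubling the star’s mass boosts its luminosity by roughly a factor of ten, which almost exactly cancels the extra distance — hence “around the same amount of light.”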

Astronomers believe the new method combining direct and indirect imaging opens an exciting new door for future discoveries. “It provides a new path forward to discovering more exoplanets, and characterizing them in a far more holistic way than we could do before,” says Currie. Additionally, the group expects Gaia’s upcoming fourth data release, which will contain nearly double the previous release’s data, to make it easier to identify stars wobbling from the gravity of planetary bodies. “The discovery of this planet will spawn dozens of follow-on studies,” Currie added. The team is now studying data from about 50 other stars showing promise for hosting exoplanets.

“This is sort of a test run for the kind of strategy we need to be able to image an earth,” said Currie. “It demonstrates that an indirect method sensitive to a planet’s gravitational pull can tell you where to look and exactly when to look for direct imaging. So I think that’s really exciting.”

This article originally appeared on Engadget at https://www.engadget.com/researchers-use-novel-method-to-find-a-distant-exoplanet-175055335.html?src=rss

Amazon introduces Bedrock, a cloud service for AI-generated text and images

Amazon is joining the generative AI fray. Bedrock is the company’s new API for Amazon Web Services (AWS) that lets developers use and customize AI tools that generate text or images. Think of it as a cloud-based and configurable alternative to OpenAI’s ChatGPT and DALL-E 2 aimed at businesses and developers.

AWS customers can use Bedrock to write, build chatbots, summarize text, classify images and more based on text prompts. It gives its users a choice of Amazon’s Titan foundation model (FM) and several startups’ models, including Anthropic’s Claude (a Google-backed ChatGPT rival from former OpenAI employees), AI21’s Jurassic-2 (a language model specializing in Spanish, French, German, Portuguese, Italian and Dutch) and Stable Diffusion (a popular open-source image generator). Additionally, businesses and developers can customize how the models work based on input — which Amazon says won’t be used for training the models, according to CNBC. That should (theoretically) address a crucial privacy concern for businesses entering sensitive data.

Amazon views the range of AI models on offer as a way of providing flexibility to customers. The company’s description reads, “With Bedrock’s serverless experience, you can get started quickly, privately customize FMs with your own data, and easily integrate and deploy them into your applications using the AWS tools and capabilities you are familiar with (including integrations with Amazon SageMaker ML features like Experiments to test different models and Pipelines to manage your FMs at scale) without having to manage any infrastructure.”

“Most companies want to use these large language models, but the really good ones take billions of dollars to train and many years and most companies don’t want to go through that,” Amazon CEO Andy Jassy told CNBC on Thursday. “So what they want to do is they want to work off of a foundational model that’s big and great already and then have the ability to customize it for their own purposes. And that’s what Bedrock is.”

Amazon says C3.ai, Pegasystems, Accenture and Deloitte are some early businesses lined up to try Bedrock. The company hasn’t yet announced pricing for the AWS toolset, and it’s currently opening access through a waitlist. You can read more and apply for admission at the project’s website.

This article originally appeared on Engadget at https://www.engadget.com/amazon-introduces-bedrock-a-cloud-service-for-ai-generated-text-and-images-204556563.html?src=rss

Researchers used machine learning to improve the first photo of a black hole

Researchers have used machine learning to sharpen a previously released image of a black hole. According to a report published today in The Astrophysical Journal Letters, the refined portrait of the black hole at the center of the galaxy Messier 87, over 53 million light years from Earth, shows a thinner ring of light and matter surrounding its center.

The original images were captured in 2017 by the Event Horizon Telescope (EHT), a network of radio telescopes around Earth that combine to act as a planet-sized super-imaging tool. The initial picture looked like a “fuzzy donut,” as described by NPR, but researchers used a new method called PRIMO to reconstruct a more accurate image. PRIMO is “a novel dictionary-learning-based algorithm” that learns to “recover high-fidelity images even in the presence of sparse coverage” by training on generated simulations of over 30,000 black holes. In other words, it uses machine learning data based on what we know about the universe’s physical laws — and black holes specifically — to produce a better-looking and more accurate shot from the raw data captured in 2017.

Black holes are mysterious and strange regions of space where gravity is so strong that nothing can escape. They form when dying stars collapse under their own gravity, squeezing the star’s mass into a tiny space. The black hole’s outer boundary is called the event horizon, a point of no return where anything that crosses it (whether light, matter or Matthew McConaughey) won’t be coming back.

“What we really do is we learn the correlations between different parts of the image. And so we do this by analyzing tens of thousands of high-resolution images that are created from simulations,” the astrophysicist and author of the paper Lia Medeiros of the Institute for Advanced Study in Princeton, NJ, told NPR. “If you have an image, the pixels close to any given pixel are not completely uncorrelated. It’s not that each pixel is doing completely independent things.”
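The general idea Medeiros describes — learn structure from many simulated images, then use that learned basis to fill in sparse observations — can be sketched in a few lines. This is not the actual PRIMO algorithm; it’s a toy stand-in using synthetic ring images, a principal-component “dictionary” and a least-squares fit, with NumPy only:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 16  # image side length

def ring(radius, width):
    """Synthetic stand-in for a black hole image: a fuzzy ring of emission."""
    y, x = np.mgrid[:N, :N] - (N - 1) / 2
    r = np.hypot(x, y)
    return np.exp(-((r - radius) ** 2) / (2 * width ** 2)).ravel()

# 1) Train: learn a basis (a crude stand-in for PRIMO's learned
#    dictionary) from thousands of simulated images.
train = np.stack([ring(rng.uniform(3, 6), rng.uniform(0.5, 1.5))
                  for _ in range(2000)])
mean = train.mean(axis=0)
_, _, components = np.linalg.svd(train - mean, full_matrices=False)
basis = components[:20].T                 # keep the top 20 components

# 2) Observe: a new image with sparse coverage (most pixels missing).
truth = ring(4.5, 1.0)
observed = rng.random(truth.size) < 0.3   # only ~30% of pixels measured

# 3) Reconstruct: fit dictionary coefficients to the observed pixels
#    alone, then fill in the missing pixels from the learned basis.
coeffs, *_ = np.linalg.lstsq(basis[observed],
                             (truth - mean)[observed], rcond=None)
recon = mean + basis @ coeffs

naive = np.where(observed, truth, mean)   # baseline: plug in the mean
print("learned-basis error:", np.linalg.norm(recon - truth))
print("naive-fill error:   ", np.linalg.norm(naive - truth))
```

Because the pixels of ring-like images are strongly correlated — exactly the point Medeiros makes — a handful of learned components recovers the unmeasured pixels far better than naively filling the gaps.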

The researchers say the new image is consistent with Albert Einstein’s predictions. However, they expect further research in machine learning and telescope hardware to lead to additional revisions. “In 20 years, the image might not be the image I’m showing you today,” said Medeiros. “It might be even better.”

This article originally appeared on Engadget at https://www.engadget.com/researchers-used-machine-learning-to-improve-the-first-photo-of-a-black-hole-170722614.html?src=rss

DJI’s newest drone is a $16K model for pro filmmakers

DJI unveiled its latest high-end drone for professional filmmakers today. The Inspire 3 is a full-frame 8K cinema drone in a “highly portable form factor” that can be yours this summer for a mere $16,499.

The DJI Inspire 3 has a Zenmuse X9-8K Air Gimbal Camera with a wide dynamic color range and compatibility with various lenses. Its camera system has dual native ISO for clean low-light footage while covering over 14 stops of dynamic range to help capture the highlights and shadows of sunrises and sunsets. It also has Tilt Boost and 360-degree Pan structures. Its FPV camera, visual sensors, positioning antennas and storage slot are “seamlessly integrated into the airframe for a minimalist look and modern industrial aesthetics.” It can capture video in various formats, including CinemaDNG and Apple ProRes RAW.

The drone supports RTK-powered Waypoint Pro and omnidirectional sensing for precise flight paths and improved safety. The drone has nine sensors to help detect and avoid obstacles and protect your $16.5K purchase. In addition, you can toggle horizontal, upward and downward obstacle-sensing independently and manually set its obstacle-alert range if the automatic function doesn’t suit your needs. (With active avoidance turned off, the display will still show incoming obstacles and sound an alert if it’s within a set range.) It also has hot-swappable TB51 intelligent dual batteries for up to 28 minutes of flight time and up to 58.4 mph speeds.

DJI

It uses DJI’s O3 Pro transmission and control system with a range of up to around 9.2 miles with one controller and up to 7.5 miles in dual-control mode (where one person pilots the drone and a second pilot controls the gimbal). It includes a first-person view (FPV) camera with an ultra-wide 161-degree field of view and night vision. The pilot’s feed has a latency of 90ms in 1080p / 60 fps mode. Additionally, it supports 4K / 30 fps feeds, although that mode reduces the drone’s range to an estimated 3.1 miles.

The DJI Inspire 3 will be available “by the end of June.” If you’re a pro filmmaker with over $16,000 to spare, you’ll get the Zenmuse X9-8K Air Gimbal Camera, an RC Plus remote controller and other accessories. The company’s DJI Care Pro accidental protection plans are also available for an additional cost.

This article originally appeared on Engadget at https://www.engadget.com/djis-newest-drone-is-a-16k-model-for-pro-filmmakers-130047974.html?src=rss

Summer Games Done Quick 2023 will speed-run Zelda for charity

Summer Games Done Quick (SGDQ) released the full schedule for its return to in-person activities for 2023. The charity speed-running event takes place in Minneapolis from May 28th to June 4th. This year, the organizers added a slew of Zelda runs in honor of the upcoming Tears of the Kingdom, which gamers will spend countless hours exploring beginning next month. Of course, the event will stream live on Twitch for those who can’t make it to Minnesota.

The full schedule starts with a pre-show followed by a Sonic Frontiers run on May 28th and wraps up with a Super Metroid run and an unknown finale on June 3rd. The last day also includes Elden Ring and a blindfolded run of The Legend of Zelda: Breath of the Wild. Sightless speed runs have been a popular GDQ mainstay, with previous years including memorable blindfolded play-throughs of Mike Tyson’s Punch-Out!! and Super Mario 64.

Other Zelda games in the SGDQ 2023 lineup include The Minish Cap (Switch) on May 28th, A Link Between Worlds (3DS) on May 29th, Majora’s Mask (Nintendo 64) on May 31st, Twilight Princess (GameCube) on June 1st and Four Swords (Game Boy Advance) on June 2nd. The Zelda franchise should be foremost in the minds of many gamers during this year’s event, as the latest installment, Tears of the Kingdom, launches on May 12th.

Nintendo

If big-name series aren’t your thing, the 2023 event will include plenty of cult-classic and oddball runs. For example, you can tune into Hobo Cat Adventures on June 1st, Choo-Choo Charles on May 31st and the NES adventure Maniac Mansion on May 30th. (And you won’t want to miss Give Me Toilet Paper! on June 1st.) You can read the full schedule for many more runs, including Hitman 3, GTA: San Andreas and Super Mario Odyssey.

It should be lighthearted fun for a terrific cause, as 100 percent of all donations go to Doctors Without Borders. The event typically raises millions of dollars for the charity, which provides medical and humanitarian care to people in over 70 countries affected by crises like war, natural disasters and epidemics. With the event back in person, attendees must provide proof of COVID-19 vaccination and wear a KN95 / N95 / KF94 mask. You can register to attend on the organization’s website.

This article originally appeared on Engadget at https://www.engadget.com/summer-games-done-quick-2023-will-speed-run-zelda-for-charity-184223865.html?src=rss

You can now stream Peacock shows on Meta Quest VR headsets

You can now watch The Office in VR, as NBC Universal’s Peacock app is now available for the Meta Quest 2 and Meta Quest Pro virtual reality headsets. In addition, the companies are partnering to give new Quest owners a free year of the streaming service.

The app brings content like Poker Face, Vanderpump Rules and (coming April 14th) Cocaine Bear to a giant screen in VR. Of course, live sports, including NFL and Major League Baseball games, are also included. Additionally, the app supports multitasking with multiple screens, and you can resize the content window — stretching all the way up to theater-sized.

The app launch and deal are part of a three-year partnership between Meta and NBC Universal, announced in October. Meta says it will also bring “experiences across a variety of NBCU IP, including Universal Monsters, Halloween Horror Nights and The Office to immersive environments like Horizon Worlds and Avatars Store.” For example, Meta’s Horizon Worlds (the company’s metaverse home base) will let you interact with virtual content from The Office later this year.

As for the deal, if you buy a new Meta Quest 2 or Meta Quest Pro headset between now and April 11, 2024, you can redeem a code for 12 months of Peacock Premium (usually $5 per month). Or, if you bought one of those headsets before April 11th, you’ll receive an offer for three free months. However, Peacock Premium is still ad-supported; you’ll need Peacock Premium Plus, which costs an extra $5 monthly, for a plan with “fewer ads.” Quest owners with eligible accounts (at least 18 years old and living in the US or its territories) can watch for an email with a promo code and redemption link.

This article originally appeared on Engadget at https://www.engadget.com/you-can-now-stream-peacock-shows-on-meta-quest-vr-headsets-171018405.html?src=rss

Judge rejects Elizabeth Holmes’ bid for freedom while awaiting appeal

A federal judge denied Elizabeth Holmes’ motion for release on Monday while she appeals her conviction on four counts of fraud and conspiracy, as reported by The Guardian. As a result, the Theranos founder is scheduled to report to prison on April 27th.

Holmes has appealed her conviction to the federal Ninth Circuit Court of Appeals, citing questions about the “accuracy and reliability” of evidentiary and procedural rulings in the trial. However, US District Court Judge Edward Davila ruled Monday that the appeal didn’t raise a “substantial” question of fact or law. According to the judge, the request didn’t address the underlying wire-fraud charges involving investors, so even if the appeals court agreed with her assertions, it wouldn’t warrant a reversal or new trial (the legal standard for remaining free pending appeal).

However, the judge ruled against prosecutors hoping to brand Holmes as a flight risk after learning that her partner bought her a one-way ticket for a flight to Mexico. Although the judge described the ticket purchase (and failure to cancel it post-conviction) as a “bold move” and “perilously careless oversight,” he gave her the benefit of the doubt, ruling she was “not likely to flee or pose a danger” to the public.

Last November, the Theranos founder was sentenced to over 11 years in prison for defrauding investors after a jury found her guilty last January. Founded in 2003, Theranos claimed to produce a long list of revealing health results using only a single drop of a patient’s blood. The company raised hundreds of millions of dollars from high-profile investors before internal whistleblowers sourced a 2015 Wall Street Journal story revealing that the startup’s underlying technology was bogus. The story has since become a cautionary tale, with podcasts, books and a recent Hulu miniseries cashing in on the one-time Silicon Valley golden child’s downfall.

This article originally appeared on Engadget at https://www.engadget.com/judge-rejects-elizabeth-holmes-bid-for-freedom-while-awaiting-appeal-191042016.html?src=rss

Google TV takes on Roku with over 800 free TV channels

Google TV is becoming more like basic cable. The company announced today it’s adding content from several new providers to make browsing ad-supported live TV channels a central part of the platform. The news comes several months after the company was reportedly negotiating with media companies to add similar content to YouTube.

Starting today, Google TV is adding Free Ad-Supported Streaming Television (FAST) channels from Tubi, Plex and Haystack News to its existing FAST content from Pluto TV. In addition, Google is adding “built-in channels from Google TV that you can watch without even downloading or launching an app.” The company says the service now aggregates over 800 free channels.

FAST is the industry term for ad-supported “linear streaming content,” meaning it’s broadcast at specific times like traditional television. (Think standard afternoon programming on TNT or TBS.) Already embraced by competitors like Roku, FAST channels turn streaming into an experience akin to channel-surfing in the old days — further proving that live TV streaming has essentially become cable sent through a different pipe.

Google says the content will include shows like Westworld (which Warner Bros. Discovery removed from HBO Max), Law & Order: SVU and The Walking Dead. Additionally, it includes news content from NBC, ABC, CBS and Fox. It also has international programming in more than 10 languages, including Spanish, Japanese and Hindi. In addition, the programming is organized in an updated TV guide, which Google says makes browsing easier and faster. The Google TV Live tab will also include content from YouTube TV or Sling TV (if you subscribe), putting all your live TV content in one spot.

Although the Google TV changes begin arriving today, the company says they will roll out “over the coming weeks” (a Google classic), so you may have to wait a bit before trying them. You’ll also need a Google TV device, such as a Chromecast with Google TV or a television from Sony, TCL, Hisense or Philips with Google TV built in. The company says the feature will trickle down to Android TV devices later this year.

This article originally appeared on Engadget at https://www.engadget.com/google-tv-takes-on-roku-with-over-800-free-tv-channels-173805920.html?src=rss

‘Super Bomberman R 2’ delivers level-building and 15 vs. 1 chaos this September

Konami announced today that the latest installment in the long-running Bomberman franchise arrives this September. Super Bomberman R 2, initially revealed last year, marks the series’ 40th anniversary by taking the foundations of its 2017 predecessor and adding level-building and a wacky 15 vs. 1 mode.

For this installment, Konami added Castle mode, featuring “attack vs. defend” gameplay where you try to overtake or protect a fortress. Teams of 15 will try to open all the treasure chests to unlock passages into the castle; the lone keeper tries to keep at least one locked before the game’s end. It looks every bit as chaotic as you’d imagine. In addition, the returning game modes include Standard (classic gameplay), Story mode (a single-player adventure), Battle 64 (battle royale) and Grand Prix (“compete for crystals and knock out other players”).

Meanwhile, the new Stage Editor lets you create and share your Castle mode stages. Following the trend set by franchises like LittleBigPlanet and Mario Maker, Konami envisions a robust community of online creators giving you virtually unlimited content.

Super Bomberman R 2 launches on September 12th (although digital versions arrive a day later) and is available for pre-order now. It will support the Nintendo Switch, PS5, PS4, Xbox Series X / S, Xbox One and Steam.

This article originally appeared on Engadget at https://www.engadget.com/super-bomberman-r-2-delivers-level-building-and-15-vs-1-chaos-this-september-213029904.html?src=rss

YouTube Premium on iOS will soon work with SharePlay

Google announced a feature drop today for YouTube Premium users. Perhaps the most anticipated addition is iOS SharePlay support, which follows the release of Google’s equivalent feature in Meet video calls.

The company says iOS SharePlay support will arrive “in the coming weeks” for YouTube Premium subscribers. Apple launched SharePlay in 2021 in the wake of pandemic lockdowns, allowing people to watch media together through Apple’s video-calling service. However, YouTube is late to the party as a long list of video streaming services — including Disney+, HBO Max, Hulu and many others — have been compatible with SharePlay for months, if not years. (Netflix is a remaining holdout.) Assuming YouTube SharePlay works like Google Meet Live Sharing, only the person setting up the call would need a Google account subscribed to YouTube Premium; other participants wouldn’t.

Google is also adding YouTube video queuing to mobile devices. Premium subscribers can now add new videos to watch next — like they’ve been able to on the web since 2019. The mobile version of the feature initially appeared late last year in beta under the Android app’s “Try new features” section.

Also arriving “in the coming weeks” is enhanced 1080p streaming for YouTube Premium subscribers on iOS. Google acknowledged the feature was under testing in February after a small group of users reported seeing the option. It uses a higher bitrate (YouTube sends more data per second), which should lead to a better-looking picture. During the beta test, Google claimed the quality of standard 1080p streaming would be unaffected, meaning it wouldn’t nerf video quality for free users to drive subscriptions.

Premium subscribers on Android, iOS and the web will also soon see a new feature that lets them easily pick up YouTube videos where they left off on another device. Additionally, the new Smart Downloads feature on mobile will automatically add recommended videos to your library (when connected to WiFi) for offline viewing. Of course, if you don’t want to waste storage, you can turn off the feature in the app’s settings menu.

This article originally appeared on Engadget at https://www.engadget.com/youtube-premium-on-ios-will-soon-work-with-shareplay-143057377.html?src=rss