The Apple Vision Pro is an impressive piece of hardware, and the eye-tracking/hand gesture input combo is fantastic for navigating menus and the like. It’s not so great for gaming. There haven't been many easy ways to connect a third-party controller for playing iPad or cloud games. This is changing, however, as accessory manufacturer 8BitDo just announced Vision Pro compatibility for a number of its controllers.
These accessories are officially supported by Apple, so they should work as soon as you make a Bluetooth connection. No muss and no fuss. All told, eight devices got the Apple seal of approval here. One such gadget is the company’s Ultimate Bluetooth Controller, which we basically called the perfect gamepad for PC.
Other compatible devices include various iterations of the SN30 Pro controller, the Lite 2 and the NES-inspired N30 Pro 2. The integration isn’t just for game controllers, as 8BitDo also announced AVP compatibility for its Retro Mechanical Keyboard. Of course, the Vision Pro works out of the box with most Bluetooth keyboards.
The controller support is pretty big news, though, as media consumption is one of the best parts of the Vision Pro experience. Video games fall squarely in that category. Just about every iPad title works on the device. If playing Cut the Rope on a giant virtual screen doesn’t do it for you, the headset also integrates with Xbox Cloud Gaming and Nvidia GeForce Now for access to AAA titles.
8BitDo announced official controller support for Apple devices last year, though this was primarily for smartphones, tablets and Mac computers. The integration was thanks to new controller firmware and Apple's recent iOS 16.3, iPadOS 16.3, tvOS 16.3 and macOS 13.2 updates. It looks like all of the accessories that work with iPhones and iPads also work with the Vision Pro.
This article originally appeared on Engadget at https://www.engadget.com/apple-vision-pro-owners-now-have-more-decent-controller-options-150055872.html?src=rss
Having a fancy webcam is all well and good, but another thing you might need to seriously upgrade the quality of your video calls and livestreams is a decent key light. It will illuminate your face to help you stand out from the background and help the camera discern your features more clearly. You don’t need to break the bank to get a decent key light either. Logitech’s Litra Beam is currently $10 off at $90. That’s only $5 more than the lowest price we’ve seen for it.
The Litra Beam looks a bit like an LED reading lamp and it would be a fairly stylish addition to many setups. It has a three-way adjustable stand, allowing you to tweak the height, tilt and rotation as needed, while its ability to run on either USB or AC power gives you more placement options.
The device uses TrueSoft tech, which, according to Logitech, provides "balanced, full-spectrum LED light with cinematic color accuracy for a natural, radiant look across all skin tones." A frameless diffuser helps mitigate harsh shadows, according to the company.
You'll be able to adjust the Litra Beam's brightness, color temperature, presets and other settings through the Logitech G Hub desktop app, which also allows you to manage multiple lights at once. In addition, the key light has five physical buttons on the rear for quick switching between brightness and color temperature settings.
This article originally appeared on Engadget at https://www.engadget.com/logitechs-litra-beam-key-light-is-10-percent-off-right-now-141839351.html?src=rss
On Monday, April 8, a total solar eclipse will be visible across a swath of North America, from Mexico’s Pacific coast to the easternmost reaches of Canada. And in those few minutes of daytime darkness, all sorts of interesting phenomena are known to occur — phenomena NASA would like our help measuring.
During a total solar eclipse, temperatures may drop and winds may slow down or change their course. Animals have been observed to behave unusually — you might hear crickets start their evening chatter a few hours early. Even radio communications can be disrupted due to changes in the ionosphere while the sun’s light is blocked. And, the sun’s corona — its outermost atmosphere — will come into view, presenting scientists (and those of us helping them) with a rare opportunity to study this layer that’s normally invisible to the naked eye.
NASA has lots of research efforts planned for the eclipse, and has sponsored a handful of citizen science campaigns that anyone can take part in if they’re in or near the path of totality, or the areas where people on the ground can watch the sun become completely obscured by the moon. The path of totality crosses 13 US states, including parts of Texas, Oklahoma, Arkansas, Missouri, Illinois, Kentucky, Indiana, Ohio, Pennsylvania, New York, Vermont, New Hampshire and Maine. It’s an event of some significance; the next time a total solar eclipse passes over that much of the contiguous US won’t be until 2045.
All you’ll need to join in is equipment you already own, like a smartphone, and a few minutes set aside before the eclipse to go through the training materials.
Help measure the shape of the sun
One such citizen science project is SunSketcher, a concerted effort to measure the true shape of the sun. While the sun is closer to being a perfect sphere than other celestial bodies that have been observed, it’s still technically an oblate spheroid, being a smidge wider along its equator. The SunSketcher team plans to get a more precise measurement by crowd-sourcing observations of Baily's Beads, or the little spots of sunlight that peek out from behind the moon at certain points in the eclipse.
The Baily’s Bead effect is “the last piece of the sun seen before totality and the first to appear after totality,” NASA explained in a blog post. “For a few seconds, these glimmers of light look like beads along the moon’s edge.” They’re visible thanks to the uneven topographical features on the lunar surface.
You’ll need to download the free SunSketcher app, which is available for iOS and Android on the App Store and Google Play Store. Then, a few minutes before totality (the exact time is location-dependent), put your phone on Do Not Disturb, hit “Start” in the app and prop up the phone in a place where it has a good view of the sun. After that, leave it be until the eclipse is over — the app will automatically take pictures of Baily’s Beads as they show up.
There’s a tutorial on the SunSketcher website if you want to familiarize yourself with the process beforehand. When it’s all said and done, the pictures will be uploaded to SunSketcher’s server. They’ll eventually be combined with observations from all over to “create an evolving pattern of beads” that may be able to shed better light on the size and shape of the sun.
The SunSketcher images probably won’t blow you away, so if you’re hoping to get some great pictures of the eclipse, you’ll want to have another camera on hand for that (with the appropriate filters to protect your eyes and the device’s sensors).
Record changes in your surroundings
Eclipse-watchers can also use their smartphones to record the environmental changes that take place when the sun dips behind the moon as part of a challenge run by Global Learning and Observations to Benefit the Environment (Globe). You’ll need an air temperature thermometer as well for this task, and can start logging observations in the days before the eclipse if you feel like being extra thorough.
Temperatures at the surface can, in some cases, drop as much as 10 degrees Fahrenheit during a total solar eclipse, according to NASA. And certain types of clouds have been observed to dissipate during these brief cooldowns, resulting in unexpectedly clear skies in the moments before totality. Data collected with the help of citizen scientists during the 2017 total solar eclipse showed that areas with heavier cloud cover experienced a less extreme drop in surface temperatures.
To participate this time around, download the Globe Observer app from the App Store or Google Play Store, and then open the Globe Eclipse tool from the in-app menu. There, you’ll be able to jot down your temperature measurements and take photos of the sky to record any changes in cloud cover, and make notes about the wind conditions. Plan to dedicate a few hours to this one — NASA asks that you include observations from 1-2 hours before and after the eclipse in addition to what you’ll record during. “You will measure temperature every 5-10 minutes and clouds every 15-30 minutes or whenever you see change,” NASA says.
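The cadence NASA describes maps neatly onto a simple reminder schedule. Here's a hypothetical sketch in Python (the function name and the specific intervals chosen are illustrative midpoints of NASA's suggested ranges, not anything the Globe Observer app prescribes):

```python
def observation_schedule(start_min, end_min, temp_every=10, cloud_every=30):
    """Return (minute, task) reminders covering the observation window.

    Minutes are measured relative to totality (t=0); temperature is
    logged every `temp_every` minutes, clouds every `cloud_every`.
    """
    reminders = []
    for t in range(start_min, end_min + 1):
        offset = t - start_min
        if offset % temp_every == 0:
            reminders.append((t, "temperature"))
        if offset % cloud_every == 0:
            reminders.append((t, "clouds"))
    return reminders

# A four-hour window: two hours either side of totality at t=0
plan = observation_schedule(-120, 120)
```

Following this plan, a two-hours-before to two-hours-after window works out to 25 temperature readings and nine cloud observations, which gives a sense of the commitment involved.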
You can keep using the Globe Observer app for citizen science beyond eclipse day, too. There are programs running all year round for recording observations of things like clouds, land use, mosquito habitats and tree heights. The eclipse tool, though, is only available when there’s an eclipse happening.
Listen to the sounds of wildlife
Observations going back nearly 100 years have added support to the idea that total solar eclipses temporarily throw some animals out of whack. Inspired by a 1935 study that gathered observations on animal behavior during an eclipse three years prior, the Eclipse Soundscapes Project is inviting members of the public to take note of what they hear before, during and after totality, and share their findings.
To be an Observer for the project, it’s recommended that you first sign up on the website and go through the brief training materials so you can get a sense of what type of information the project is looking for. The website also has printable field notes pages you can use to record your observations on eclipse day. You should start taking notes at least 10 minutes before totality. Only after the eclipse is over will you need to fill out the webform to submit your observations along with your latitude and longitude.
If you happen to have an AudioMoth acoustic monitoring device and a spare microSD card lying around, you can go a step further and record the actual sounds of the environment during the eclipse as a Data Collector. You’ll need to set everything up early — the project says to do it on Saturday, April 6 before noon — and let it record until at least 5PM local time on April 10. At that point, you can turn it off, submit your notes online and mail in the SD card. All of the details for submission can be found on the project’s website.
Take photos of the solar corona
The Eclipse Megamovie 2024 is an initiative designed to study the sun’s corona and plasma plumes from locations in the path of totality, building off of a previous campaign from the 2017 total solar eclipse. It’s already selected a team of 100 Science Team Alpha Recruits (STARs) who underwent training and were given 3D-printed tracking mounts for their cameras to shoot the best possible images. But, the project will still be accepting photo submissions from any enthusiasts who have a DSLR (and a solar filter) and want to participate.
The Photography Guide is pretty exhaustive, so don’t wait until eclipse day to start figuring out your setup. You’ll be able to submit your photos after the eclipse through a form on the website.
However you choose to spend the eclipse, whether you’re collecting data for a citizen science mission or just planning to kick back and observe, make sure you have everything in place well ahead of time. While the partial eclipse phases will last over an hour, totality will be over and done in about 3.5-4.5 minutes depending on where you’re watching from. You wouldn’t want to miss out on some of that time because you were fumbling with your camera.
Totality will start shortly after 11AM local time (2PM ET) for western Mexico, moving northeastward over the subsequent two-or-so hours before exiting land near Newfoundland, Canada around 5:30PM local time. There will still be something to see for people outside the path of totality, too. Most of the US will be treated to a partial eclipse that day. You can find out exactly when the eclipse will be visible from your location with this tool on NASA’s website, along with the percentage of sun coverage you can expect to witness.
This article originally appeared on Engadget at https://www.engadget.com/nasa-will-be-studying-the-total-solar-eclipse-heres-how-you-can-help-140011076.html?src=rss
Google has gone from being the go-to search engine to something people are paying to avoid entirely. This week, Cherlynn and Devindra chat with 404 Media co-founder Jason Koebler about his experience moving away from Google and towards Kagi, a $10 a month search engine without ads or data tracking. Funny enough, Kagi is still relying on Google’s index, so it’s a lot like using that site before the onslaught of ads, sponsored posts and AI results. Also, we discuss the company’s lies around Chrome’s incognito mode, as well as the news that it would be deleting user data collected in that mode. (Be sure to check out the 404 Media podcast too!)
Listen below or subscribe on your podcast app of choice. If you've got suggestions or topics you'd like covered on the show, be sure to email us or drop a note in the comments! And be sure to check out our other podcast, Engadget News!
Topics
Why Jason Koebler moved from Google to Kagi's paid search engine – 0:45
Google says it will destroy data collected from users using Incognito mode – 15:01
Gurman report: Apple is working on personal home robots – 24:55
Amazon just walked out on its self check-out tech – 30:43
FCC set to vote to restore Net Neutrality – 43:00
Apple adds Spatial Personas to make the Vision Pro experience less lonely – 45:09
Proposed California state law would give tech workers the “right to disconnect” – 47:17
Tekken director responds to fighting game fans’ request for a Waffle House stage – 49:57
AI models' use of individuals' work without permission (or compensation) is nothing new, with entities like The New York Times and Getty Images initiating lawsuits against AI creators alongside artists and writers. In March, OpenAI CTO Mira Murati contributed to the ongoing uncertainty, telling The Wall Street Journal she wasn't sure if Sora, the company's new text-to-video AI tool, takes data from YouTube, Instagram or Facebook posts. Now, YouTube's CEO Neal Mohan has responded with a clear warning to OpenAI that using its videos to teach Sora would be a "clear violation" of the platform's terms of use.
In an interview with Bloomberg Originals host Emily Chang, Mohan stated, "From a creator's perspective, when a creator uploads their hard work to our platform, they have certain expectations. One of those expectations is that the terms of service is going to be abided by. It does not allow for things like transcripts or video bits to be downloaded, and that is a clear violation of our terms of service. Those are the rules of the road in terms of content on our platform."
A lot of uncertainty and controversy still surrounds how OpenAI trains Sora, along with ChatGPT and DALL-E, with The Wall Street Journal recently reporting the company plans to use YouTube video transcriptions to train GPT-5. On the other hand, OpenAI competitor Google is apparently respecting the rules — at least when it comes to YouTube (which it owns). Google's AI model Gemini requires similar data to learn, but Mohan claims it only uses certain videos, depending on the permissions given in each creator's licensing contract.
This article originally appeared on Engadget at https://www.engadget.com/youtube-ceo-warns-openai-that-training-models-on-its-videos-is-against-the-rules-121547513.html?src=rss
Roku already serves ads through its platform, but it's also apparently exploring the idea of showing you ads while you're using third-party devices connected to its TVs. Based on a recent patent filing unearthed by Lowpass, the company is looking to develop a system or a method "for ad insertion by a display device coupled to a media device via a high-definition media interface (HDMI) connection." That means if you've connected another streaming device or console — say, an Apple TV, a Chromecast or a PlayStation — to a Roku TV via HDMI, the company would still be able to serve you advertisements.
In particular, Roku is hoping to show you commercials while whatever you're watching or playing on the third-party device attached to it is on pause. In its patent, it described several methods for detecting whether the show or game on screen is paused, such as receiving a pause signal from the remote control, detecting a pause icon, determining that the image on screen hasn't changed across several video frames or picking up a silent audio signal over the HDMI connection.
If it works as intended, those ads wouldn't impact your viewing or playing experience (much), assuming you're truly stepping away or doing something else in the meantime. While you'd probably prefer those experiences to be free of ads altogether, Roku is at least looking to make sure that it's serving you relevant ads. It could analyze frozen video or audio frames and use automatic content recognition (ACR) technology to identify what's on screen. Or it could analyze metadata to show ads connected to what you're playing or watching. It could also serve commercials based on what third-party device is attached to your TV.
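The frame-comparison method in particular is easy to illustrate. Here's a minimal, hypothetical Python sketch of that idea — not Roku's actual implementation; the function, its parameters and the representation of frames as raw byte strings are all invented for illustration:

```python
def looks_paused(frames, window=30, tolerance=0):
    """Return True if the last `window` frames are (near-)identical,
    which suggests the content on screen is paused.

    `frames` is a list of byte strings, one per captured video frame;
    `tolerance` is how many bytes may differ before two frames are
    considered different.
    """
    if len(frames) < window:
        return False  # not enough history to decide
    recent = frames[-window:]
    reference = recent[0]
    # A paused screen keeps producing (near-)identical frames.
    return all(
        sum(a != b for a, b in zip(reference, frame)) <= tolerance
        for frame in recent[1:]
    )
```

In practice a system like the one the patent describes would presumably combine a heuristic like this with the other signals (pause icon detection, remote-control events, silent audio) to avoid false positives on static scenes.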
As Lowpass notes, the company could have conjured the idea because manufacturers typically don't make a lot of money from hardware sales. For the fiscal year of 2023, Roku lost $44 million on smart TVs. Similarly, Samsung's visual display and digital appliances division posted $37.5 million in operating losses for last year's fourth quarter. Meanwhile, ads and services generated $1.6 billion in profit for Roku. This idea could potentially make it more money... if the prospect of watching commercials while your show or game is paused doesn't turn you off buying a Roku TV, of course. This is just a patent at this point, though, and Roku may very well end up scrapping it and not implementing it at all.
This article originally appeared on Engadget at https://www.engadget.com/roku-looks-into-serving-you-ads-on-whatever-you-plug-into-its-tvs-120016754.html?src=rss
A new Carbon Majors Database report, which examines carbon dioxide emissions, found that just 57 companies were responsible for 80 percent of the global carbon dioxide emissions between 2016 and 2022. ExxonMobil, which topped the list of United States companies, contributed 1.4 percent of all global carbon dioxide emissions, despite having net zero emissions targets.
Nearly 200 parties adopted the 2015 Paris Agreement, committing to reduce greenhouse gas emissions. However, 58 of the 100 state- and investor-owned companies in the Carbon Majors Database have since increased their production.
The International Energy Agency found coal consumption increased by eight percent over the seven years to 8.3 billion tons — a record high. State-owned Coal India is one of the top three carbon dioxide producers. Russia’s state-owned energy company Gazprom and state-owned oil firm Saudi Aramco rounded out the group.
YouTube is hyping its exclusive Coachella streaming coverage, which starts next week. The headlining feature is the platform’s multiview experience (already familiar to sports fans) — but who wants to watch up to four stages simultaneously, with audio for only one of them? It’s… a music festival. Coachella runs from April 12 to 14 and April 19 to 21.
Finally, after a reveal at CES, the 2024 edition of the Razer Blade 18 arrives for $3,099. The base system has an i9-14900HX processor, 32GB of RAM, 1TB of SSD storage, Wi-Fi 7, a triple-fan cooling system and a six-speaker array with THX spatial audio support. You can equip the laptop with up to an NVIDIA GeForce RTX 4090 (the base model has a 4070 graphics card). In what Razer claims is a first for a laptop, there’s Thunderbolt 5 connectivity, but only if you opt for a 4080 or 4090 GPU.
Eight offices in Santa Clara, California were affected by the layoffs.
Over 700 people at Apple have recently lost their jobs, mostly from offices in Santa Clara. The location that dealt with the company’s electric vehicle projects has lost 371 people. There may not be enough space at that new home robot project.
This article originally appeared on Engadget at https://www.engadget.com/the-morning-after-80-percent-of-global-carbon-dioxide-emissions-comes-from-just-57-companies-111514748.html?src=rss
Say goodbye to your best friend's neighbor's great aunt's Disney+ account. Disney CEO Bob Iger said in an interview with CNBC that the streamer is cracking down on password sharing worldwide this summer. The company enacted the same restrictions for Canadian subscribers last fall.
The move is hardly a surprise, as Disney's CFO Hugh Johnston shared the plan during an earnings call in February. "Paid sharing is an opportunity for us. It's one that our competitor is obviously taking advantage of, and one that sits in front of us. We've got some very specific actions that we're taking in the next couple of months." Disney-owned Hulu started its own crackdown on password sharing on March 14, and both streamers' terms of service explicitly ban people from using other customers' login information (though its latest announcement indicates Disney is now ready to actually enforce it).
Streamers across the board are restricting password sharing, and it seems to be working — for them, not us. According to analytics firm Antenna, Netflix's United States signups increased by 102 percent during the first four days after the rule went into effect, compared to the 60 days prior. There were an average of 73,000 new signups daily, far outpacing cancellations. Max will also start restricting sharing this year, fully cracking down in 2025.
Disney+ will start its clampdown in some countries come June, expanding to a second wave of countries in September. It's unclear as of now which group the US is in, but Disney will likely provide a breakdown when the dates get closer. Disney+ currently costs $8 monthly with ads and $14 monthly for ad-free viewing.
This article originally appeared on Engadget at https://www.engadget.com/disney-is-also-cracking-down-on-password-sharing-103010857.html?src=rss
Over the years, Engadget has been the target of a common SEO scam, wherein someone claims ownership of an image and demands a link back to a particular website. A lot of other websites would tell you the same thing, but now the scammers are making their fake DMCA takedown notices and threats of legal action look more legit with the help of easily accessible AI tools.
According to a report by 404 Media, the publisher of the website Tedium received a "copyright infringement notice" via email from a law firm called Commonwealth Legal last week. Like older, similar attempts at duping the recipient, the sender said they're reaching out "in relation to an image" connected to their client. In this case, the sender demanded the addition of a "visible and clickable link" to a website called "tech4gods" underneath the photo that was allegedly stolen.
Since Tedium actually used a photo from a royalty-free provider, the publisher looked into the demand, found the law firm's website and, upon closer inspection, realized that the images of its lawyers were generated by AI. As 404 Media notes, the images of the lawyers had the vacant look in the eyes that's commonly seen in photos created by AI tools. If you do a reverse image search on them, you'll get results from a website with the URL generated.photos, which uses artificial intelligence to make "unique, worry-free model photos... from scratch." The publisher also found that the law firm's listed address, supposedly on the fourth floor of a building, points to a one-floor structure on Google Street View. The owner of tech4gods said he had nothing to do with the scam but admitted that he used to buy backlinks for his website.
This is but one example of how bad actors can use AI tools to fool and scam people, and we have to be more vigilant as instances like this will just likely keep on growing. Reverse image search engines are your friend, but they may not be infallible and may not always help. Deepfakes, for instance, have become a big problem in recent years, as bad actors continue to use them to create convincing videos and audio not just to scam people, but also to spread misinformation online.
This article originally appeared on Engadget at https://www.engadget.com/an-old-seo-scam-has-a-new-ai-generated-face-100045758.html?src=rss
Over 700 people at Apple have recently lost their jobs, according to the latest WARN report posted by the Employment Development Department of California (EDD). Most of the people who were laid off worked at Apple's offices in Santa Clara, with 371 of them coming from the company location that primarily dealt with the company's now-defunct electric vehicle project. Under California law, companies are required to file a report with the EDD for each location affected by layoffs under the Worker Adjustment and Retraining Notification (WARN) program.
Eight Apple locations in Santa Clara were hit by layoffs, including the main car office; another worked on the company's in-house MicroLED display project, which was reportedly scrapped in March due to costs and technical difficulties. The company was hoping to produce its own screens for iPhones, Macs and its smartwatches, but that clearly isn't happening anytime soon.
Apple's original car ambitions were to build a fully autonomous vehicle without pedals or a steering wheel, until it decided to develop an electric vehicle instead. A previous Bloomberg report said Apple canceled the initiative, internally called "Project Titan," after investing billions of dollars and a decade into it. The employees who were developing the vehicle were given the chance to transfer to Apple's other divisions, including its teams that are reportedly working on artificial intelligence and home robotics. But based on Apple's WARN report, it wasn't able to re-integrate everyone into the company.
Apple is believed to be in the very early stages of developing personal robotics for people's homes. One of the machines that's currently a work-in-progress is a robot that follows people around, while the other is a table-top device that uses a robot to move a display around, according to another Bloomberg report. The company's work on personal robotics is part of its efforts, which also include the Vision Pro, to find new sources of revenue.
This article originally appeared on Engadget at https://www.engadget.com/apple-cuts-over-700-jobs-following-its-car-and-display-project-closures-061524777.html?src=rss