Adobe's 'Photoshop on the web' service is now available to all Creative Cloud subscribers

Users will be able to edit, share and receive feedback on their Photoshop projects from anywhere on the web, Adobe announced Wednesday, regardless of whether an Adobe product is installed on their PC or tablet. The company is bringing its Photoshop on the web service out of beta and incorporating a few handy new AI features as well.

Adobe first introduced a feature-svelte online version of the popular Photoshop app in December 2021. Originally, users could share their PSD files, but only if the recipient also had a copy of Photoshop or Illustrator on their computer. That changed with the introduction of Creative Cloud, which allowed for sharing without the need for a local install. The beta version of Photoshop on the web took that concept a step further by incorporating basic editing tools into the web UI geared toward "minor tweaks and quick edits" — the easy sort of stuff that took less time to fix than the program took to boot. The production version released Wednesday does all that and more.

"With this release we are starting with a focus on the needs of creators who are new to Photoshop with a streamlined user experience," Adobe VP Pam Clark wrote in a blog post. "We have brought the majority of the most commonly used Photoshop tools to the web and have streamlined the user experience, to make it easier for newer users to navigate the app."

Users will also be able to experiment with two new AI-driven tools, generative fill and generative expand. As their names imply, these will "allow you to add, expand, or remove content from your images non-destructively, while magically matching perspective, lighting, and style of your image," Clark wrote. The features were first released as part of Firefly for the desktop edition of Photoshop.

The Contextual Taskbar is also migrating over from the desktop. This on-screen menu will observe your workflow and suggest relevant next steps. But for all the new features to play with, a number of existing tools have yet to make the jump to the web, including the patch and pen tools, smart object support and the polygonal lasso, though the company insists they will be added in future updates.

This article originally appeared on Engadget at https://www.engadget.com/adobes-photoshop-on-the-web-service-is-now-available-to-all-creative-cloud-subscribers-210034891.html?src=rss

Looks like NVIDIA got raided by French antitrust authorities

At dawn on Wednesday, French antitrust authorities conducted a surprise raid on a company in the country that specializes in graphics cards — and according to The Wall Street Journal and Challenges business magazine, that company was NVIDIA. We reached out to NVIDIA for clarification and a spokesperson declined to comment. Here's what we know for sure:

The French Competition Authority conducted a surprise raid early Wednesday morning on "a company suspected of having implemented anticompetitive practices in the graphics cards sector," according to a brief press release from the regulator. The raid was tied to a larger investigation into the health of the cloud computing market, with a focus on identifying whether new companies were being unfairly squeezed out by larger, existing ones. The results of that investigation were published in June and they centered on three "hyperscalers," Amazon Web Services, Google Cloud and Microsoft Azure. 

The results read, in part, "The likelihood of a new operator being able to gain market share rapidly appears limited, excluding companies who are already powerful in other digital markets." NVIDIA is not mentioned in the original cloud investigation.

NVIDIA has seen significant financial success this year amid the AI boom. Its AI chips and data centers are in high demand, and the company crushed its most recent earnings expectations, pulling in $13.51 billion in the second quarter of 2023, compared with $6.7 billion in the same quarter of 2022.

As the French Competition Authority noted, a raid does not mean the targeted company is guilty of anticompetitive practices — but it's a confident step from the regulatory body.

This article originally appeared on Engadget at https://www.engadget.com/looks-like-nvidia-got-raided-by-french-antitrust-authorities-205809329.html?src=rss

Google will let publishers hide their content from its insatiable AI

Google has announced a new control in its robots.txt indexing file that lets publishers decide whether their content will "help improve Bard and Vertex AI generative APIs, including future generations of models that power those products." The control is a crawler token called Google-Extended, and publishers can disallow it in their site's robots.txt file to tell Google not to use their content for those two services. In its announcement, the company's vice president of "Trust" Danielle Romain said it's "heard from web publishers that they want greater choice and control over how their content is used for emerging generative AI use cases."
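Since Google-Extended is a standard robots.txt user-agent token, opting out works the same way as blocking any other crawler. A minimal site-wide opt-out entry, per Google's documentation, looks like this:

```text
# Block Google's generative-AI training token site-wide.
# Regular Googlebot search crawling is unaffected, since
# Google-Extended is a separate, standalone token.
User-agent: Google-Extended
Disallow: /
```

Publishers who only want to shield part of a site can scope the `Disallow` path accordingly instead of blocking the root.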

Romain added that Google-Extended "is an important step in providing transparency and control that we believe all providers of AI models should make available." As generative AI chatbots grow in prevalence and become more deeply integrated into search results, the way content is digested by things like Bard and Bing AI has been of concern to publishers. 

While those systems may cite their sources, they do aggregate information that originates from different websites and present it to the users within the conversation. This might drastically reduce the amount of traffic going to individual outlets, which would then significantly impact things like ad revenue and entire business models.

Romain points out that "as AI applications expand, web publishers will face the increasing complexity of managing different uses at scale." This year has seen an explosion in the development of tools based on generative AI, and with search being such a huge way people discover content, the state of the internet looks set to undergo a huge shift. Google's addition of this control is not only timely, but indicates it's thinking about the way its products will impact the web.

This article originally appeared on Engadget at https://www.engadget.com/google-will-let-publishers-hide-their-content-from-its-insatiable-ai-202015557.html?src=rss

Google opens its AI-generated search experience to teens

Google is opening its AI-powered search experience to teens. In addition, the company’s Search Generative Experience (SGE) is adding new context pages to shed light on generated responses and individual web links within answers.

The company is opening its search-based AI tool to US teenagers between 13 and 17. Google says it received “particularly positive feedback” from 18- to 24-year-olds who tested SGE, which influenced its decision. (Younger people being more open to AI isn’t exactly a shock, given older adults’ tendency to be more suspicious of new technologies.) SGE has been available as part of Google Search Labs since late May.

Google says it has added safeguards to prevent inappropriate or harmful content based on its research with experts in teen development. “For example, we’ve put stronger guardrails in place for outputs related to illegal or age-gated substances or bullying, among other issues,” the company wrote on Thursday. Google says it will continue to gather feedback and work with specialists to fine-tune SGE for teens.

Starting today, the company is also adding an “About this result” tool to SGE responses, helping users understand how the AI settled on its answers. Soon, it will also produce “About this result” responses for individual URLs within AI-generated answers “so people can understand more about the web pages that back up the information in AI-powered overviews.”

To help newcomers understand generative AI, Google has published an AI Literacy Guide, serving as a welcome manual to SGE and other AI projects like Bard. It includes tips, FAQs and discussions about its capabilities and limitations.

Finally, Google says it’s making “targeted improvements” to AI-powered results that are false or offensive. It’s rolling out an update to train the AI model to better detect “hallucinations” or inappropriate content. (Chatbots spreading misinformation has been an issue from the get-go.) The company is also working on using large language models to “critique” their first draft responses and rewrite them with quality and safety in mind.

“Generative AI can help younger people ask questions they couldn’t typically get answered by a search engine and pose follow-up questions to help them dig deeper,” the company wrote. “As we introduce this new technology to teens, we want to strike the right balance in creating opportunities for them to benefit from all it has to offer, while also prioritizing safety and meeting their developmental needs.”

This article originally appeared on Engadget at https://www.engadget.com/google-opens-its-ai-generated-search-experience-to-teens-201357386.html?src=rss

Meta’s metaverse is getting an AI makeover

Meta’s Connect keynote felt different this year, and not just because it marked the return of an in-person event. It’s been nearly two years since Mark Zuckerberg used Connect to announce that Facebook was changing its name to Meta and reorienting the entire company around the metaverse.

But at this year’s event, it felt almost as if Zuckerberg was trying to avoid saying the word “metaverse.” While he did utter the word a couple of times, he spent much more time talking up Meta’s new AI features, many of which will be available on Instagram and Facebook and other non-metaverse apps. Horizon Worlds, the company’s signature metaverse experience that was highlighted at last year’s Connect, was barely mentioned.

That may not be particularly surprising if you’ve been following the company’s metaverse journey lately. Meta has lost so much money on the metaverse, its own investors have questioned it. And Zuckerberg has been mercilessly mocked for trying to hype seemingly minor metaverse features like low-res graphics or avatars with legs.

AI, on the other hand, is much more exciting. The rise of large language models has fueled a huge amount of interest from investors and consumers alike. Services like OpenAI’s ChatGPT, Snap’s MyAI and Midjourney have made the technology accessible — and understandable — to millions.

Given all that, it’s not surprising that Zuckerberg and Meta used much of Connect — once known solely as a virtual reality conference — to talk about the company’s new generative AI tools. And there was a lot to talk about: the company introduced Meta AI, a generative AI assistant, which can answer questions and take on the personality of dozens of characters; AI-powered image editing for Instagram; and tools that will enable developers, creators and businesses to make their own AI-powered bots. AI will even play a prominent role in the company’s new hardware, the Meta Quest 3 and the Ray-Ban Meta smart glasses, both of which will ship with the Meta AI assistant.

But that doesn't mean the company is giving up on the metaverse. Zuckerberg has said the two are very much linked, and has previously tried to dispel the notion that Meta’s current focus on AI has somehow supplanted its metaverse investments. “A narrative has developed that we're moving away from focusing on the metaverse vision,” Zuckerberg said in April. “We've been focusing on both AI and the metaverse for years now, and we will continue to focus on both.”

But at Connect he offered a somewhat different pitch for the metaverse than he has in the past. Over the last two years, Zuckerberg spent a lot of time emphasizing socializing and working in VR environments, and the importance of avatars. This year, he pitched an AI-centric metaverse.

"Pretty soon, I think we're going to be at a point where you're going to be there physically with some of your friends, and others will be there digitally as avatars, as holograms, and they'll feel just as present as everyone else. Or, you know, you'll walk into a meeting and you'll sit down at a table, and there will be people who are there physically and people who are there digitally as holograms. But also sitting around the table with you are gonna be a bunch of AIs who are embodied as holograms, who are helping you get different stuff done too. So I mean, this is just a quick glimpse of the future and how these ideas of the physical and digital come together into this idea that we call the metaverse."

Notably, the addition of AI assistants could also make “the metaverse” a lot more useful. One of the more intriguing features previewed during Connect was the Meta AI-powered search capability in the Ray-Ban Meta smart glasses. The Google Lens-like feature would enable wearers to “show” the AI what they are seeing through the glasses and ask questions about it, like asking Meta AI to identify a monument or translate text.

It’s not hard to imagine users coming up with their own use cases for AI assistants in Meta’s virtual worlds, either. Angela Fan, a research scientist with Meta AI, says generative AI will change the type of experiences people have in the metaverse. “It’s almost like a new angle on it,” Fan tells Engadget. “When you're hanging out with friends, for example, you might also have an AI looped in to help you with tasks. It’s the same kind of foundation, but brought to life with the AIs that will do things in addition to some of the friends that you hang out with in the metaverse.”

For now, it’s not entirely clear just how long it will be before these new AI experiences reach the metaverse. The company said the new “multi-modal” search capabilities would be arriving on its smart glasses sometime next year. And it didn’t give a timeframe for when the new “embodied” AI assistants could be available for metaverse hangouts.

It’s also not yet clear if the new wave of AI assistants will be popular enough to fuel a renewed interest in the metaverse to begin with. Meta previously tried to make (non-AI) chatbots a thing in 2016 and the effort fell flat. And even though generative AI makes the latest generation of bots much more powerful, the company has plenty of competition in the space. But by putting its AI into its other apps now, Meta has a much better chance at reaching its billions of users. And that could lay important groundwork for its vision for an AI-centric metaverse.

This article originally appeared on Engadget at https://www.engadget.com/metas-metaverse-is-getting-an-ai-makeover-194004996.html?src=rss

A food delivery robot's footage led to a criminal conviction in LA

Footage captured by a food delivery robot in Los Angeles was used to arrest and convict two people after a failed attempt to steal it off the street earlier this year, according to 404 Media. Serve Robotics, which works with Uber Eats for last-mile deliveries in the area, shared videos of the incident with the Los Angeles Police Department both proactively and after a subpoena. Serve previously met with LAPD to “open a line of communication” between the two ahead of any potential troubles, emails obtained by 404 also show.

The news comes at a time when public wariness around the technology is already high, with concerns about just how much the robots are recording and where that footage ultimately goes. Serve Robotics CEO Ali Kashani boasted about the resulting convictions on social media, tweeting, “Some genius once tried to steal one of our robots… It didn’t end well (for them).” In a follow-up blog post, Kashani takes a softer stance, attempting to explain how the company balances its approach to involving law enforcement with its responsibility to the public and fostering trust.

The company’s principles, according to Kashani, include “not using robots for surveillance or other purposes that violate the public’s sense of privacy,” and not putting unnecessary strain on public resources by calling in the police “to address every minor incident of robot vandalism.” In this case, in which the police were immediately notified and arrests were made, the robot got away on its own and was, as Kashani describes it, “unharmed.” The company turned in all relevant footage before deleting it.

The emails I got show:

- The robots are filming
- The footage is sometimes saved
- The footage can be proactively given to cops
- The footage can, separately, be subpoenaed
- Serve Robotics, which delivers for UberEats, has a "dialog/partnership" with LAPD pic.twitter.com/5p4V8KpVFo

— Jason Koebler (@jason_koebler) September 28, 2023

It remains unclear how long Serve keeps its robots’ recordings under normal circumstances, and its vagueness around the videos' potential use doesn’t inspire much confidence. In a statement to 404 Media, Serve’s head of communications, Aduke Thelwell, said it is the company’s policy to “regularly delete camera feed unless otherwise required, and to comply with subpoena requests.”

This article originally appeared on Engadget at https://www.engadget.com/a-food-delivery-robots-footage-led-to-a-criminal-conviction-in-la-190854339.html?src=rss

These flying origami-inspired robots change shape in mid-air

Scientists at the University of Washington have developed flying robots that change shape in mid-air, all without batteries, as described in a study published in the journal Science Robotics. These miniature Transformers snap into a folded position during flight to stabilize descent. They weigh just 400 milligrams and feature an onboard battery-free actuator complete with a solar power-harvesting circuit.

Here’s how they work. These robots actually mimic the flight of different leaf types in mid-air once they’re dropped from a drone at an approximate height of 130 feet. The origami-inspired design allows them to transform quickly from an unfolded to a folded state, a process that takes just 25 milliseconds. This transformation allows for different descent trajectories, with the unfolded position floating around on the breeze and the folded one falling more directly. Small robots are nothing new, but this is the first solar-powered microflier that allows for control over the descent, thanks to an onboard pressure sensor to estimate altitude, an onboard timer and a simple Bluetooth receiver.

As for the why of it all, the lil baby Starscreams can be equipped with a wide variety of sensors to make surveys as they soar around the sky, so in theory they could gauge temperature, humidity and air quality conditions, among other types of data. Produced at scale, this would be a highly cost-effective way to keep tabs on atmospheric conditions.

The current design only allows them to transition in one direction, from the tumbling state to the falling state, but researchers can control multiple microfliers at the same time, making them disperse upon launch to cover a wider area. They’re working on perfecting the reverse transition to allow the robots to transform back from the falling position to the folded position, which should better allow the microfliers to make precise landings even in turbulent wind.

It’s good to see new robots that don’t resemble a Dr. Who death machine or a headless dog with a thirst for blood. Let’s hear it for innovation! In the meantime, the University of Washington researchers will have plenty of funds to further develop this microflier concept, thanks to grants from the National Science Foundation, NASA and the Google fellowship program, among others.

This article originally appeared on Engadget at https://www.engadget.com/these-flying-origami-inspired-robots-change-shape-in-mid-air-184653938.html?src=rss

Honda's first all-electric vehicle has 300-mile range and starts in the 'upper $40,000s'

Honda has revealed more details about its all-electric Prologue SUV. The EV will have a listed range of 300 miles when the first deliveries arrive in early 2024. The automaker says the vehicle’s MSRP is “expected to start in the upper $40,000s” before subtracting any available incentives or tax credits.

The Prologue’s pricing puts its entry point well above rivals like the Volkswagen ID.4, Hyundai Ioniq 5 and Mustang Mach-E — all of which start at around $40,000. Meanwhile, the range on Honda’s EV is much shorter than the Ioniq 6’s 361 miles and even lags slightly behind the Hummer EV (314 miles).

The Prologue is built on the GM Ultium EV architecture, the same platform as GM electrics like the Chevy Blazer EV. Honda’s new model has an 85-kWh battery pack in both rear-drive and all-wheel-drive models. However, note that the 300-mile estimated range only applies to the 2WD variant, and we don’t yet know what range to expect from the AWD one. The automaker only lists motor performance stats for the AWD version, which generates an estimated 288 hp and 333 lb-ft of torque.

Elsewhere, the Honda Prologue has a 121.8-inch wheelbase, five inches longer than the Honda Pilot’s. The SUV seats five people and has 136.9 cubic feet of interior space. The vehicle supports wireless Apple CarPlay and Android Auto as standard, and it includes an 11-inch instrument display with an 11.3-inch HD infotainment screen.

Honda will give buyers several charging packages to choose from at purchase. First, they can get an 11.5kW home charging station with a $100 charging credit and a $500 installation credit. Second, they can opt for a 7.6kW portable charging kit, a $300 public charging credit and a $250 installation credit. Alternatively, customers can go with $750 in public charging credits.

This article originally appeared on Engadget at https://www.engadget.com/hondas-first-all-electric-vehicle-has-300-mile-range-and-starts-in-the-upper-40000s-183146226.html?src=rss

Cities: Skylines II will hit PS5 and Xbox Series X/S in spring 2024

Looks like Cities: Skylines II is truly going to be colossal. The console release of Cities: Skylines II has been delayed to spring 2024, and the game's minimum and recommended PC specs are now slightly more demanding. The PC version of the game will still land on October 24, 2023, as originally planned.

Anyone who pre-ordered the game on PlayStation 5 or Xbox Series X/S should automatically receive a refund through those platforms. Developer Colossal Order is shutting down pre-orders of the console version for now. Cities: Skylines II is still coming to PC Game Pass on October 24, and it'll hit Xbox Game Pass next spring, alongside the console release.

The new minimum and recommended PC specs aren't wildly different than the original figures, but there are two notable changes. The recommended specs now call for an AMD Ryzen 7 5800X processor, rather than a Ryzen 5 5600X. The minimum specs start with an NVIDIA GeForce GTX 970 graphics card (or AMD equivalent), an upgrade from the original GTX 780.

Colossal Order explained that the console delay and the PC update stem from the same root cause: This game is bigger and requires more optimization than they first thought. Here's how the studio explained the adjustment to the game's PC specs:

"Cities: Skylines II is a next-generation title and therefore has certain hardware requirements. The recommended specs were set when the game was still in development. After having done extensive testing with different hardware we made the decision to update the minimum/recommended specs for a better player experience."

It's been a rough year for PC games in general. The diversity baked into the PC market has always been a challenge for developers, but ninth-generation console hardware is now outstripping many common PC setups, leading to ambitious games with lots of bugs on PC. With today's Cities: Skylines II news, it seems like Colossal Order is attempting to insulate itself from this phenomenon.

Developers have also had a tough time offering parity between the Xbox Series X — the most technically powerful console this generation — and the Series S, Microsoft's cheaper and less powerful option. Microsoft requires all games to launch with the same features on both consoles, and this has led to a handful of delays, dropped features, and at least one accidental PS5 console exclusive. Microsoft recently allowed Baldur's Gate 3 creator Larian Studios to bend these rules, but the requirement remains in place generally.

Colossal Order has an FAQ about the changes to Cities: Skylines II right here.

This article originally appeared on Engadget at https://www.engadget.com/cities-skylines-ii-will-hit-ps5-and-xbox-series-xs-in-spring-2024-173416073.html?src=rss