Uber, DoorDash and Grubhub won’t be able to get out of paying minimum wage to their New York City delivery workers after all, following a judge’s decision to reject their bid to skirt the city’s new law. The law, which had been on hold due to the companies’ lawsuit, aims to secure better wage protections for app-based workers. Once it takes effect, third-party delivery providers will have to pay delivery workers a minimum wage of roughly $18 per hour before tips, and keep up with the yearly increases, Reuters reports.
The amount, which will increase April 1 of every year, is slightly higher than the city’s standard minimum wage, taking into account the additional expenses gig workers face. At the moment, food delivery workers make an estimated $7-$11 per hour on average.
New York Acting Supreme Court Justice Nicholas Moyne put the law on pause back in July, when the three companies and a smaller delivery service, Relay Delivery, sued the city, arguing that the raised rates would have a negative impact on their services. With Moyne’s latest decision, the law will now move forward. While Uber, DoorDash and Grubhub will have to comply once it takes effect, Relay will be given more time to renegotiate its contracts with restaurants, according to Reuters.
The move makes NYC the first US city to require a minimum wage for app-based deliveries, and others are likely to follow suit. The city previously pushed ride-hailing apps to raise their minimum rates for drivers, forcing Uber and Lyft to raise their per-mile rates by just over 5 percent in 2022.
This article originally appeared on Engadget at https://www.engadget.com/uber-grubhub-and-doordash-must-pay-nyc-delivery-workers-an-18-minimum-wage-213145847.html?src=rss
Bethesda appears to have shadow-dropped (intentionally or not) a new mobile game set in a familiar universe. The Elder Scrolls: Castles is a building management game reminiscent of Fallout Shelter. The title, first spotted by Reddit user u/tracteurman (via GamesRadar), is available for Android but not iOS.
The Elder Scrolls: Castles tasks you with controlling a castle — and your dynasty. “Oversee your subjects as the years come and go, families grow, and new rulers take the throne,” the game’s Play Store description reads. According to the listing, each real-life day covers a year in the game world.
Its gameplay involves familiar staples of building management sims: customize the castle, add and expand rooms, decorate, place monuments and assign workers to stations. In addition, you can create heroes to embark on “epic quests” to battle against classic Elder Scrolls foes.
The Play Store listing’s “What’s new” section says, “Welcome to early access,” which at least suggests the game may have been intended as a closed beta. However, it’s downloadable and playable by anyone with a Play Store account at the time of publication. We reached out to Bethesda to find out whether the game’s silent publication was intentional, and we’ll update this article if we find out more.
The Elder Scrolls: Castles is free in the Google Play Store. It’s rated “Teen” for violence and suggestive themes.
This article originally appeared on Engadget at https://www.engadget.com/the-elder-scrolls-castles-is-like-fallout-shelter-but-skyrim-212404049.html?src=rss
Fortnite maker Epic Games is laying off 16 percent of its staff — or about 830 employees. CEO Tim Sweeney said in an open letter sent to employees that Epic Games has been spending “way more money” than it earns. “We concluded that layoffs are the only way,” he wrote, “and that doing them now and on this scale will stabilize our finances.”
For those impacted by the layoffs, the company says it will offer a severance package that includes six months’ base pay and healthcare. Epic Games is also offering to accelerate employees’ stock option vesting schedules through 2024, while giving them two additional years to exercise the options. About two-thirds of the layoffs affected teams outside of core development.
Sweeney wrote that Epic had been making an effort to reduce costs by not only freezing hiring but also by cutting spending on things like marketing and events. And while the metaverse is still in a conceptual phase, Sweeney said he wants the company to focus on developing infrastructure for its games to exist in the metaverse ecosystem. For example, Epic teamed up with LEGO to build an “immersive digital experience” for kids.
Epic also said it is divesting Bandcamp, an online music platform it acquired in mid-2022; it's coordinating a sale to Songtradr, a music licensing platform. SuperAwesome, a kid-friendly developer Epic acquired back in 2020, is being broken apart and partially spun out as well. Its advertising business will become an independent company, while the Kids Web Services segment and the parent verification and consent management toolsets will remain part of Epic.
While these moves to cut spending may help Epic Games stave off pressure from investors like Tencent and Sony, its flagship game Fortnite remains banned from Apple’s App Store and Google’s Play Store, which will continue to impact its bottom line. That’s not to mention the $520 million in penalties it has incurred from the FTC and its ongoing effort to have the Supreme Court overturn a ruling that cleared Apple of antitrust violations.
Still, Sweeney says Epic's "prospects for the future are strong," thanks to Fortnite and the Unreal Engine. Next week, the company will be hosting Unreal Fest, and while some products and initiatives will continue to land on schedule, Sweeney says some may fall behind due to restrictions on resources. “We’re ok with the schedule tradeoff if it means holding on to our ability to achieve our goals, get to the other side of profitability and become a leading metaverse company,” he said in the memo.
The company says it will not cut any funding for its core businesses, and it will continue to invest in games, including first-party Fortnite development as well as the Fortnite creator ecosystem.
This article originally appeared on Engadget at https://www.engadget.com/epic-games-is-laying-off-16-percent-of-its-workforce-and-selling-bandcamp-211830580.html?src=rss
Users will be able to edit, share and receive feedback on their Photoshop projects from anywhere on the web, Adobe announced Wednesday, regardless of whether an Adobe product is installed on their PC or tablet. The company is bringing its Photoshop on the web service out of beta and incorporating a few handy new AI features as well.
Adobe first introduced a pared-down online version of the popular Photoshop app in December 2021. Originally, users could share their PSD files, but only if the recipient also had a copy of Photoshop or Illustrator on their computer. That changed with the introduction of Creative Cloud, which allowed for sharing without the need for a local install. The beta version of Photoshop on the web took that concept a step further by incorporating basic editing tools into the web UI geared toward "minor tweaks and quick edits" — the easy sort of stuff that took less time to fix than the program took to boot. The production version released Wednesday does all that and more.
"With this release we are starting with a focus on the needs of creators who are new to Photoshop with a streamlined user experience," Adobe VP Pam Clark wrote in a blog post. "We have brought the majority of the most commonly used Photoshop tools to the web and have streamlined the user experience, to make it easier for newer users to navigate the app."
Users will also be able to experiment with two new AI-driven tools, generative fill and generative expand. As their names imply, these will "allow you to add, expand, or remove content from your images non-destructively, while magically matching perspective, lighting, and style of your image," Clark wrote. The features were first released as part of Firefly for the desktop edition of Photoshop.
The Contextual Taskbar is also migrating over from the desktop. This on-screen menu will observe your workflow and suggest relevant next steps. But for all the new features to play with, a number of existing tools have yet to make the jump to the web, including the patch and pen tools, smart object support and the polygonal lasso, though the company insists they will be added in future updates.
This article originally appeared on Engadget at https://www.engadget.com/adobes-photoshop-on-the-web-service-is-now-available-to-all-creative-cloud-subscribers-210034891.html?src=rss
At dawn on Wednesday, French antitrust authorities conducted a surprise raid on a company in the country that specializes in graphics cards — and according to The Wall Street Journal and Challenges business magazine, that company was NVIDIA. We reached out to NVIDIA for clarification and a spokesperson declined to comment. Here's what we know for sure:
The French Competition Authority conducted a surprise raid early Wednesday morning on "a company suspected of having implemented anticompetitive practices in the graphics cards sector," according to a brief press release from the regulator. The raid was tied to a larger investigation into the health of the cloud computing market, with a focus on identifying whether new companies were being unfairly squeezed out by larger, existing ones. The results of that investigation were published in June and they centered on three "hyperscalers," Amazon Web Services, Google Cloud and Microsoft Azure.
The results read, in part, "The likelihood of a new operator being able to gain market share rapidly appears limited, excluding companies who are already powerful in other digital markets." NVIDIA is not mentioned in the original cloud investigation.
NVIDIA has seen significant financial success this year amid the AI boom. NVIDIA's AI chips and data centers are in high demand, and the company crushed its most recent earnings expectations, pulling in $13.51 billion in revenue in the second quarter of 2023, compared with $6.7 billion in the same quarter a year earlier.
As the French Competition Authority noted, a raid does not mean the targeted company is guilty of anticompetitive practices — but it's a confident step from the regulatory body.
This article originally appeared on Engadget at https://www.engadget.com/looks-like-nvidia-got-raided-by-french-antitrust-authorities-205809329.html?src=rss
Google has announced a new control in its robots.txt indexing file that lets publishers decide whether their content will "help improve Bard and Vertex AI generative APIs, including future generations of models that power those products." The control is a crawler token called Google-Extended, and publishers can add it to their site's robots.txt file to tell Google not to use their content for those two APIs. In its announcement, the company's vice president of Trust, Danielle Romain, said it's "heard from web publishers that they want greater choice and control over how their content is used for emerging generative AI use cases."
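Blocking Google-Extended uses the same syntax as any other robots.txt rule. A publisher who wants to opt an entire site out of Bard and Vertex AI training, for example, would add a block like this to the robots.txt file at the site's root (the rule shown is a generic illustration, not any specific site's configuration):

```
User-agent: Google-Extended
Disallow: /
```

Because Google-Extended is evaluated as its own user agent token, rules for regular Googlebot are unaffected, so blocking it shouldn't change how pages are crawled, indexed or ranked in Search.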
Romain added that Google-Extended "is an important step in providing transparency and control that we believe all providers of AI models should make available." As generative AI chatbots grow in prevalence and become more deeply integrated into search results, the way content is digested by things like Bard and Bing AI has been of concern to publishers.
While those systems may cite their sources, they do aggregate information that originates from different websites and present it to the users within the conversation. This might drastically reduce the amount of traffic going to individual outlets, which would then significantly impact things like ad revenue and entire business models.
Romain points out that "as AI applications expand, web publishers will face the increasing complexity of managing different uses at scale." This year has seen an explosion in the development of tools based on generative AI, and with search being such a huge way people discover content, the state of the internet looks set to undergo a huge shift. Google's addition of this control is not only timely, but indicates it's thinking about the way its products will impact the web.
This article originally appeared on Engadget at https://www.engadget.com/google-will-let-publishers-hide-their-content-from-its-insatiable-ai-202015557.html?src=rss
Google is opening its AI-powered search experience to teens. In addition, the company’s Search Generative Experience (SGE) is adding new context pages to shed light on generated responses and individual web links within answers.
The company is opening its search-based AI tool to US teenagers between 13 and 17. Google says it received “particularly positive feedback” from 18- to 24-year-olds who tested SGE, which influenced its decision. (Younger people being more open to AI isn’t exactly a shock, given older adults’ tendency to be more suspicious of new technologies.) SGE has been available as part of Google Search Labs since late May.
Google says it has added safeguards to prevent inappropriate or harmful content based on its research with experts in teen development. “For example, we’ve put stronger guardrails in place for outputs related to illegal or age-gated substances or bullying, among other issues,” the company wrote on Thursday. Google says it will continue to gather feedback and work with specialists to fine-tune SGE for teens.
Starting today, the company is also adding an “About this result” tool to SGE responses, helping users understand how the AI settled on its answers. Soon, it will also produce “About this result” responses for individual URLs within AI-generated answers “so people can understand more about the web pages that back up the information in AI-powered overviews.”
To help newcomers understand generative AI, Google has published an AI Literacy Guide, serving as a welcome manual to SGE and other AI projects like Bard. It includes tips, FAQs and discussions about its capabilities and limitations.
Finally, Google says it’s making “targeted improvements” to AI-powered results that are false or offensive. It’s rolling out an update to train the AI model to better detect “hallucinations” or inappropriate content. (Chatbots spreading misinformation has been an issue from the get-go.) The company is also working on using large language models to “critique” their first draft responses and rewrite them with quality and safety in mind.
“Generative AI can help younger people ask questions they couldn’t typically get answered by a search engine and pose follow-up questions to help them dig deeper,” the company wrote. “As we introduce this new technology to teens, we want to strike the right balance in creating opportunities for them to benefit from all it has to offer, while also prioritizing safety and meeting their developmental needs.”
This article originally appeared on Engadget at https://www.engadget.com/google-opens-its-ai-generated-search-experience-to-teens-201357386.html?src=rss
Meta’s Connect keynote felt different this year, and not just because it marked the return of an in-person event. It’s been nearly two years since Mark Zuckerberg used Connect to announce that Facebook was changing its name to Meta and reorienting the entire company around the metaverse.
But at this year’s event, it felt almost as if Zuckerberg was trying to avoid saying the word “metaverse.” While he did utter the word a couple of times, he spent much more time talking up Meta’s new AI features, many of which will be available on Instagram and Facebook and other non-metaverse apps. Horizon Worlds, the company’s signature metaverse experience that was highlighted at last year’s Connect, was barely mentioned.
That may not be particularly surprising if you’ve been following the company’s metaverse journey lately. Meta has lost so much money on the metaverse, its own investors have questioned it. And Zuckerberg has been mercilessly mocked for trying to hype seemingly minor metaverse features like low-res graphics or avatars with legs.
AI, on the other hand, is much more exciting. The rise of large language models has fueled a huge amount of interest from investors and consumers alike. Services like OpenAI’s ChatGPT, Snap’s MyAI and Midjourney have made the technology accessible — and understandable — to millions.
Given all that, it’s not surprising that Zuckerberg and Meta used much of Connect — once known solely as a virtual reality conference — to talk about the company’s new generative AI tools. And there was a lot to talk about: the company introduced Meta AI, a generative AI assistant, which can answer questions and take on the personality of dozens of characters; AI-powered image editing for Instagram; and tools that will enable developers, creators and businesses to make their own AI-powered bots. AI will even play a prominent role in the company’s new hardware, the Meta Quest 3 and the Ray-Ban Meta smart glasses, both of which will ship with the Meta AI assistant.
But that doesn't mean the company is giving up on the metaverse. Zuckerberg has said the two are very much linked, and has previously tried to dispel the notion that Meta’s current focus on AI has somehow supplanted its metaverse investments. “A narrative has developed that we're moving away from focusing on the metaverse vision,” Zuckerberg said in April. “We've been focusing on both AI and the metaverse for years now, and we will continue to focus on both.”
But at Connect he offered a somewhat different pitch for the metaverse than he has in the past. Over the last two years, Zuckerberg spent a lot of time emphasizing socializing and working in VR environments, and the importance of avatars. This year, he pitched an AI-centric metaverse.
"Pretty soon, I think we're going to be at a point where you're going to be there physically with some of your friends, and others will be there digitally as avatars, as holograms, and they'll feel just as present as everyone else. Or, you know, you'll walk into a meeting and you'll sit down at a table, and there will be people who are there physically and people who are there digitally as holograms. But also sitting around the table with you are gonna be a bunch of AIs who are embodied as holograms, who are helping you get different stuff done too. So, I mean, this is just a quick glimpse of the future and how these ideas of the physical and digital come together into this idea that we call the metaverse."
Notably, the addition of AI assistants could also make “the metaverse” a lot more useful. One of the more intriguing features previewed during Connect was the Meta AI-powered search capability in the Ray-Ban Meta smart glasses. The Google Lens-like feature would enable wearers to “show” the AI things they are seeing through the glasses and ask questions about them, like asking Meta AI to identify a monument or translate text.
It’s not hard to imagine users coming up with their own use cases for AI assistants in Meta’s virtual worlds, either. Angela Fan, a research scientist with Meta AI, says generative AI will change the type of experiences people have in the metaverse. “It’s almost like a new angle on it,” Fan tells Engadget. “When you're hanging out with friends, for example, you might also have an AI looped in to help you with tasks. It’s the same kind of foundation, but brought to life with the AIs that will do things in addition to some of the friends that you hang out with in the metaverse.”
For now, it’s not entirely clear just how long it will be before these new AI experiences reach the metaverse. The company said the new “multi-modal” search capabilities would be arriving on its smart glasses sometime next year. And it didn’t give a timeframe for when the new “embodied” AI assistants could be available for metaverse hangouts.
It’s also not yet clear if the new wave of AI assistants will be popular enough to fuel a renewed interest in the metaverse to begin with. Meta previously tried to make (non-AI) chatbots a thing in 2016 and the effort fell flat. And even though generative AI makes the latest generation of bots much more powerful, the company has plenty of competition in the space. But by putting its AI into its other apps now, Meta has a much better chance at reaching its billions of users. And that could lay important groundwork for its vision for an AI-centric metaverse.
This article originally appeared on Engadget at https://www.engadget.com/metas-metaverse-is-getting-an-ai-makeover-194004996.html?src=rss
Footage captured by a food delivery robot in Los Angeles was used to arrest and convict two people after a failed attempt to steal it off the street earlier this year, according to 404 Media. Serve Robotics, which works with Uber Eats for last-mile deliveries in the area, shared videos of the incident with the Los Angeles Police Department both proactively and after a subpoena. Serve previously met with the LAPD to “open a line of communication” between the two ahead of any potential trouble, emails obtained by 404 also show.
It comes at a time when public wariness around the technology is already high, with concerns about just how much the robots are recording and where that footage ultimately goes. Serve Robotics CEO Ali Kashani boasted about the resulting convictions on social media, tweeting, “Some genius once tried to steal one of our robots… It didn’t end well (for them).” In a follow-up blog post, Kashani takes a softer stance, attempting to explain how the company balances its approach to involving law enforcement with its responsibility to the public and fostering trust.
The company’s principles, according to Kashani, include “not using robots for surveillance or other purposes that violate the public’s sense of privacy” and not putting unnecessary strain on public resources by calling in the police “to address every minor incident of robot vandalism.” In this case, police were immediately notified and arrests were made, even though the robot got away on its own and was, as Kashani describes it, “unharmed.” The company turned in all relevant footage before deleting it.
The emails I got show:
- The robots are filming
- The footage is sometimes saved
- The footage can be proactively given to cops
- The footage can, separately, be subpoenaed
- Serve Robotics, which delivers for UberEats, has a "dialog/partnership" with LAPD
It remains unclear how long Serve keeps its robots’ recordings under normal circumstances, and its vagueness around the videos' potential use doesn’t inspire much confidence. In a statement to 404 Media, Serve’s head of communications, Aduke Thelwell, said it is the company’s policy to “regularly delete camera feed unless otherwise required, and to comply with subpoena requests.”
This article originally appeared on Engadget at https://www.engadget.com/a-food-delivery-robots-footage-led-to-a-criminal-conviction-in-la-190854339.html?src=rss
Scientists at the University of Washington have developed flying robots that change shape in mid-air, all without batteries, as described in the journal Science Robotics. These miniature Transformers snap into a folded position during flight to stabilize their descent. They weigh just 400 milligrams and feature an onboard, battery-free actuator complete with a solar power-harvesting circuit.
Here’s how they work. Dropped from a drone at a height of about 130 feet, the robots mimic the flight of different leaf types in mid-air. The origami-inspired design allows them to transform from an unfolded to a folded state in just 25 milliseconds. That transformation changes the descent trajectory: the unfolded position floats around on the breeze, while the folded one falls more directly. Small robots are nothing new, but this is the first solar-powered microflier that allows control over the descent, thanks to an onboard pressure sensor to estimate altitude, an onboard timer and a simple Bluetooth receiver.
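That control scheme lends itself to a compact sketch. The researchers haven't published their firmware here, so everything below is illustrative: the function names and thresholds are hypothetical, and the altitude estimate uses the standard barometric formula. It simply shows how a pressure reading, an onboard timer and a radio command could each trigger the fold.

```python
# Illustrative sketch only: how a microflier might decide when to snap from
# the unfolded (tumbling) state to the folded (falling) state. Names and
# thresholds are hypothetical, not taken from the published design.

def pressure_to_altitude_m(pressure_pa: float, sea_level_pa: float = 101_325.0) -> float:
    """Estimate altitude from barometric pressure (standard ISA approximation)."""
    return 44_330.0 * (1.0 - (pressure_pa / sea_level_pa) ** (1.0 / 5.255))

def should_fold(pressure_pa: float, elapsed_ms: int, radio_cmd: "str | None",
                fold_altitude_m: float = 20.0, timeout_ms: int = 30_000) -> bool:
    """Return True once any of the three triggers says it's time to fold."""
    if radio_cmd == "fold":        # explicit Bluetooth command from the operator
        return True
    if elapsed_ms >= timeout_ms:   # onboard timer as a fallback trigger
        return True
    # Otherwise, fold once the estimated altitude drops below the threshold
    return pressure_to_altitude_m(pressure_pa) <= fold_altitude_m
```

On the real device a decision like this would run continuously during descent; committing to the 25-millisecond fold then switches the craft from drifting on the breeze to falling directly toward its target.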
As for the why of it all, the lil baby Starscreams can be equipped with a wide variety of sensors to make surveys as they soar around the sky, so in theory they could gauge temperature, humidity and air quality conditions, among other types of data. Produced at scale, this would be a highly cost-effective way to keep tabs on atmospheric conditions.
The current design only allows them to transition in one direction, from the tumbling state to the falling state, but researchers can control multiple microfliers at the same time, making them disperse upon launch to cover a wider area. They’re working on perfecting the reverse transition, from the folded, falling state back to the unfolded, tumbling one, which should better allow the microfliers to make precise landings even in turbulent wind.
It’s good to see new robots that don’t resemble a Doctor Who death machine or a headless dog with a thirst for blood. Let’s hear it for innovation! In the meantime, the University of Washington researchers will have plenty of funds to further develop this microflier concept, thanks to grants from the National Science Foundation, NASA and the Google fellowship program, among others.
This article originally appeared on Engadget at https://www.engadget.com/these-flying-origami-inspired-robots-change-shape-in-mid-air-184653938.html?src=rss