Posts with «software» label

Threads is testing real-time search results

Meta’s Threads app is often described as the company’s competitor to X. But Threads users, especially those once active on Twitter, are often quick to point out that Meta’s app is not yet a great source for real-time information. The app’s “for you” algorithm often surfaces days-old posts alongside fresh ones, and its recently introduced trending topics feature only shows five topics at a time. But for those holding out hope that the app may eventually become more useful for real-time information, Meta’s latest test may be good news.

The app is testing a new search feature that will allow users to filter results by recency, according to a screenshot shared by Threads user Daniel Rodriguez. Threads’ top exec, Adam Mosseri, confirmed the change. “We’re starting to test this with a small number of people so it’s easier to find relevant search results in real time,” Mosseri wrote.

That may sound like a relatively minor tweak, but the lack of chronological search has long been frustrating for Threads users looking to find news or commentary about current events. And while sorting by “recent” posts isn’t the same as a chronological search, it should help surface posts about breaking news and other timely topics.

Just how useful the feature is, though, will depend on whether Meta makes the filter available for all topics on the platform. The screenshot showed a recency option for “NBA Threads,” a community Mosseri has gone out of his way to encourage in the app. But Mosseri has been considerably less enthusiastic about other timely topics, saying last year he didn’t want to encourage “hard news.” Elsewhere, Threads has angered some users by removing political content from recommendations and blocking search results for topics it deems “potentially sensitive,” like vaccines and COVID-19, even if the posts don’t violate its rules.

This article originally appeared on Engadget at https://www.engadget.com/threads-is-testing-real-time-search-results-234857960.html?src=rss

Adobe previews AI object addition and removal for Premiere Pro

Last year Adobe launched Firefly, its latest generative AI model building on its previous Sensei AI, and now the company is showing how it'll be used in its video editing app, Premiere Pro. In an early sneak, it demonstrated a few key features arriving later this year, including Object Addition & Removal, Generative Extend and Text to Video.

The new features will likely be popular, as video cleanup is a common (and painful) task. The first feature, Generative Extend, addresses a problem editors face on nearly every edit: clips that are too short. "Seamlessly add frames to make clips longer, so it's easier to perfectly time edits and add smooth transitions," Adobe states. It does that by using AI to create extra media, helping cover an edit or transition.


Another common issue is junk you don't want in a shot that can be tricky to remove, or adding things you do want. Premiere Pro's Object Addition & Removal addresses that, again using Firefly's generative AI. "Simply select and track objects, then replace them. Remove unwanted items, change an actor’s wardrobe or quickly add set dressings such as a painting or photorealistic flowers on a desk," Adobe writes.

Adobe shows a couple of examples, adding a pile of Firefly-generated diamonds to a briefcase via a text prompt. It also removes an ugly utility box, changes a watch face and adds a tie to a character's costume.


The company also showed off a way to import custom AI models. One, called Pika, was shown powering Generative Extend, while another (Sora from OpenAI) can automatically generate B-roll (video shots). The latter is bound to be controversial, as it could potentially wipe out thousands of jobs, but it is still "currently in early research," Adobe said in the video. The company notes that it will add "content credentials" to such shots, so you can see what was generated by AI, including which company's model was behind it.

A similar feature is also available in "Text to Video," letting you generate entirely new footage directly within the app. "Simply type text into a prompt or upload reference images. These clips can be used to ideate and create storyboards, or to create B-roll for augmenting live action footage," Adobe said. The company appears to be commercializing this feature quickly, considering that generative AI video first appeared just a few months ago.

Those features will arrive later this year, but Adobe is also introducing updates for all users in May. Those include interactive fade handles to make transitions easier, an Essential Sound badge with audio category tagging ("AI automatically tags audio clips as dialogue, music, sound effects or ambience, and adds a new icon so editors get one-click, instant access to the right controls for the job"), effect badges and redesigned waveforms in the timeline.

This article originally appeared on Engadget at https://www.engadget.com/adobe-previews-ai-object-addition-and-removal-for-premiere-pro-130034494.html?src=rss

Blackmagic's DaVinci Resolve 19 arrives with AI-powered motion tracking and color grading

Blackmagic Design released its annual NAB 2024 update and announced over a dozen new products, including a new version of its popular DaVinci Resolve editing suite. Other key products include the Micro Color Panel for DaVinci Resolve on iPad, a 17K 65mm camera and the Pyxis 6K cube camera.

DaVinci Resolve 19

DaVinci Resolve has become a popular option for editors who don't want to pay a monthly subscription for Adobe's Premiere Pro, and is arguably more powerful in some ways. The latest version 19 takes a page from its rival, though, with a bunch of new AI-powered features for effects, color, editing, audio and more. 

DaVinci Resolve 19 'Color Slice' tool (Blackmagic Design)

Starting with the Edit module, a new feature lets you edit clips using text instead of video. Transcribing clips opens a window showing text detected from multiple speakers, letting you remove sections, search through text and more. Other features include a new trim window, a fixed playhead (reducing zooming and scrolling), a window that makes changing audio attributes faster, and more.

The Color tool introduces "Color Slice," a way to adjust an image based on six vectors (red, green, blue, yellow, cyan and magenta) along with a special skin tone slider. For instance, you can adjust any of those specific colors, easily changing the levels of saturation and hues, while seeing and adjusting the underlying key. The dedicated skin slider will no doubt make it attractive for quick skin tone adjustments. 

DaVinci Resolve 19 IntelliTrack (Blackmagic Design)

Another key feature in Color is "IntelliTrack," an AI point tracker powered by a neural engine that lets you quickly select points to track for effects or image stabilization. Blackmagic also added a new Lightroom-like, AI-powered noise reduction system that quickly removes digital noise or film grain from images with no user tweaking required.

"Film Look Creator" is a new module that opens up color grading possibilities with over 60 filmic parameters. It looks fairly easy to use, as you can start with a preset (default 65mm, cinematic, bleach bypass, nostalgic) and then tweak parameters to taste. Another new trick is "Defocus Background," letting users simulate a shallow depth of focus via masking in a realistic way (unlike smartphones), while Face Refinement tracks faces so editors tweak brightness, colors, detail and more. 

The Fusion FX editor adds some new tools that ease 3D object manipulation, and on the audio (Fairlight) side, BMD introduced "Dialogue Separator FX" to separate dialogue, background or ambience. DaVinci Resolve 19 is now in open beta for everyone to try, with no word yet on a date for the full release. As usual, the Studio version costs $295 and the main version is free.

Micro Color Panel


BMD's DaVinci Resolve for iPad proved to be a popular option for editors on the go, and now the company has introduced a dedicated control surface with the new Micro Color Panel. It'll offer editors control that goes well beyond the already decent Pencil and multitouch input, while keeping a relatively low profile at 7.18 x 14.33 inches. 

A slot at the top front lets you slide in your iPad, and from there you can connect via Bluetooth or USB-C. The company promises a "professional" feel to the controls, which consist of three weighted trackballs, 12 control dials and 27 buttons. With those, you can perform edits, tweak parameters like shadows, hues and highlights, and even do wipes and other effects.

"The old DaVinci Resolve Micro Panel model has been popular with customers wanting a compact grading panel, but we wanted to design an even more portable and affordable solution," said Blackmagic Design President Grant Petty. It's now on pre-order for $509

Pyxis 6K camera


Blackmagic Design is following rivals like RED, Sony and Panasonic with a new box-style camera, the Pyxis 6K full-frame camera. The idea is that you start with the basic brain (controls, display, CFexpress media and sensor), then use side plates or mounting screws to attach accessories like handles, microphones and SSDs. It's also available with Blackmagic's URSA Cine EVF (electronic viewfinder), which adds $2,000 to the price.

Its specs are very similar to the Blackmagic Cinema Camera 6K I tested late last year. The native resolution is 24 megapixels (6K) on a full-frame 36 x 24mm sensor that allows for up to 13 stops of dynamic range with dual native ISO up to 25,600. It can record 12-bit Blackmagic RAW (BRAW) directly to CFexpress Type B cards or an SSD.

It also supports direct streaming to YouTube, Facebook, Twitch and others via RTMP and SRT, either over Ethernet or using a cellular connection. Since streaming is built into the camera, customers can see stream status and data rate directly in the viewfinder or LCD. The Pyxis 6K arrives in June for $2,995 with three mounts (Canon EF, Leica L and Arri PL).

Blackmagic URSA Cine 12K and 17K


Along with the Pyxis, Blackmagic introduced a pair of cinema cameras, the URSA Cine 12K and 17K models. Yes, those numbers represent the resolution of the two cameras: the first offers a full-frame 36 x 24mm sensor with 12K resolution (12,288 x 6,480, 17:9) at up to a fairly incredible 100 fps. The second features a 65mm sensor (50.8 x 23.3mm) with 17,520 x 8,040 resolution, offering up to 16 stops of dynamic range.

Both models will come with features like built-in ND filters, an optical low pass filter and BMD's latest gen 5.0 color science. The URSA Cine 12K will come with 8TB of internal storage, or you can use your own CFexpress media. Other features include live streaming, a high-resolution EVF, V-battery support, wireless Bluetooth camera control and more. The URSA Cine 12K model is on pre-order for $14,995 (with a copy of DaVinci Resolve), with April availability. The URSA Cine 17K is under development, with no pricing or release yet announced. 

This article originally appeared on Engadget at https://www.engadget.com/blackmagics-davinci-resolve-19-arrives-with-ai-powered-motion-tracking-and-color-grading-090013746.html?src=rss

Microsoft's Windows 11 beta testers may start seeing ads in the Start menu

Microsoft is exploring the idea of putting ads in your Windows 11 Start menu. To be specific, it's looking to place advertisements for apps you can find in the Microsoft Store in the menu's recommended section. I can hear you sighing in defeat if you've used Windows 10 extensively before — the older OS serves ads in the Start menu as well, and they're also for apps you can download. At the moment, Microsoft will only show ads in this version if you're in the US and a Windows Insider in the Beta Channel. You won't be seeing them if you're not a beta tester or if you're using a device managed by an organization.

Further, you can disable the advertisements altogether. To do so, just go to Personalization under Settings and then toggle off "Show recommendations for tips, app promotions, and more" in the Start section. Like any other Microsoft experiment, it may never reach a wider rollout, but you may want to remember the aforementioned steps, since the company does have a history of incorporating ads into its desktop platforms. Last year, Microsoft also deployed experimental promo spots for services like OneDrive in the menu that pops up when you click on your profile photo.
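If you'd rather script the change than click through Settings, here's a minimal Python sketch of the same toggle using the standard winreg module. To be clear, the Start_IrisRecommendations value name and its 0/1 meaning are assumptions drawn from community write-ups rather than anything Microsoft documents, so the Settings switch described above remains the supported route.

```python
# Minimal sketch: flip the Start menu "recommendations" toggle via the registry
# using Python's built-in winreg module (Windows only, current user).
# Assumption: the Start_IrisRecommendations DWORD backs this Settings switch,
# with 0 = hide recommendations/promoted apps and 1 = show them.
import winreg

KEY_PATH = r"Software\Microsoft\Windows\CurrentVersion\Explorer\Advanced"

with winreg.OpenKey(winreg.HKEY_CURRENT_USER, KEY_PATH, 0, winreg.KEY_SET_VALUE) as key:
    winreg.SetValueEx(key, "Start_IrisRecommendations", 0, winreg.REG_DWORD, 0)

print("Done; sign out and back in (or restart Explorer) for the change to take effect.")
```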


This article originally appeared on Engadget at https://www.engadget.com/microsofts-windows-11-beta-testers-may-start-seeing-ads-in-the-start-menu-032358394.html?src=rss

Epic wants to blow the Google Play Store wide open

Back in December, Epic Games won an antitrust case against Google. A jury found that Google held an illegal monopoly on in-app billing and app distribution on Android devices, and that it engaged in anticompetitive practices with certain gaming companies and device manufacturers.

At the time, it was unclear what Epic actually won as the remedies had not been determined. The Fortnite maker has now submitted a proposed permanent injunction against Google detailing what it wants. In short, Epic wants the Play Store to be almost wide open.

The injunction is based on three core points, Epic noted in a blog post. First, Epic believes that Google has to let users download apps from wherever they want without it getting in the way. It says people should be able to add apps to Android devices in much the same way they can on a computer — from any app store or the web.

Epic wants to block Google from scaring people off from downloading apps from the web (though it's okay with letting Google block malware). It also wants to stop the company from working with carriers and phone manufacturers to limit the options consumers have for downloading apps. Among other things, Epic wants restrictions on pre-installed app stores to be outlawed. So, if the injunction is approved, we might see Android phones pre-installed with an Epic Games Store app in the future.

Second, Epic argues that Google has to allow developers and users the freedom to choose how they offer and pay for in-app purchases, "free from anticompetitive fees and restrictions." It asserts that Google has to let developers include links from their apps to websites, where they might be able to offer discounts, as they'd bypass Google's cut of in-app payments facilitated through the Play Store.

Epic kicked off its legal battle with Google (and Apple) in 2020 by pointing out to Fortnite mobile players that they could save money by buying the V-Bucks currency directly from Epic. Under the proposed injunction, Google would be barred from blocking alternative payment options through compliance programs like User Choice Billing.

The third aim of Epic's proposed injunction is to block Google from retaliating against it (or any app or developer) for taking on app store practices. "Google has a history of malicious compliance and has attempted to circumvent legislation and regulation meant to reign in their anti-competitive control over Android devices," Epic wrote. "Our proposed injunction seeks to block Google from repeating past bad-faith tactics and open up Android devices to competition and choice for all developers and consumers."

The injunction has more details about Epic's demands, including for Google to untangle its products and services (such as Android APIs) from the Play Store. For a period of six years, Epic wants Google to allow third-party app stores onto the Play Store without fees, and for them to have access to the Play Store's library of apps. That would also mean allowing the third-party app stores to handle updates for Play Store apps. Epic wants Google to appoint a compliance committee to ensure it's abiding by the injunction too.

We may not have to wait too long to find out just how many of Epic's requests the court rubber-stamps. Google will respond to the proposal by May 2, and a hearing on the injunction is set for May 23.

Google is having to make many similar changes in the European Union due to the bloc's Digital Markets Act. However, Google parent Alphabet and Apple are already under investigation over concerns that they're not freely allowing developers to bypass the Play Store and App Store.

Meanwhile, as a result of the DMA, Epic plans to release a mobile app store on iOS and Android in the EU later this year. It's also still battling Apple over third-party payments in the US.

This article originally appeared on Engadget at https://www.engadget.com/epic-wants-to-blow-the-google-play-store-wide-open-202411585.html?src=rss

X won’t let users hide their blue checks anymore

X will no longer allow users to hide their blue checks, regardless of whether or not they pay for Premium. On Thursday, the app began notifying users that “the hide your checkmark feature of X Premium is going away soon.”

BREAKING: #X seems to be removing the ability to hide the verification checkmark! pic.twitter.com/1Kn2OU4puj

— Nima Owji (@nima_owji) April 11, 2024

The change comes shortly after X unexpectedly began adding blue checks to the accounts of “influential” users with at least 2,500 premium-subscriber followers. While Elon Musk suggested the change was meant to be a perk, some of his critics — including formerly verified users — were less than pleased with the blue badge appearing on their accounts, lest others suspect them of actually paying for a subscription.

This article originally appeared on Engadget at https://www.engadget.com/x-wont-let-users-hide-their-blue-checks-anymore-222938703.html?src=rss

Instagram's status update feature is coming to user profiles

Instagram’s status update feature, Notes, will soon be more prominent in the app. Up until now, Notes have only been visible from Instagram’s inbox, but the brief updates will soon also be visible directly on users’ profiles.

The change should increase the visibility of the feature and give people a new place to interact with their friends’ updates. (Instagram added reply functionality to Notes back in December.) The app is also experimenting with “prompts” for Notes, which will allow users to share questions for their friends to answer in their updates, much like the collaborative “add yours” templates for Stories.

Notes are similar to Stories in that the updates only stick around for 24 hours, though they are only visible to mutual followers, so they aren’t meant to be as widely shared as a typical grid or Stories post. The latest updates are another sign of how Meta has used the feature, first introduced in 2022, to encourage users to post more often for smaller, more curated groups of friends.

Separately, the app is also adding a new “cutouts” feature, which allows users to make stickers out of objects in their photos, much like the iOS sticker feature. On Instagram, these stickers can be shared in Stories or in a Reel. Cutouts can also be made from other users’ public posts, effectively giving people a new way to remix content from others (Instagram’s help page notes that users can disable this feature if they prefer that their content not be reused).

This article originally appeared on Engadget at https://www.engadget.com/instagrams-status-update-feature-is-coming-to-user-profiles-182621692.html?src=rss

Arturia stuffed almost all of its software emulations into this new keyboard

Arturia just released a new standalone synthesizer called the AstroLab. This 61-key stage keyboard is basically the company’s Analog Lab software in hardware form, which makes it perfect for live performances. The synth boasts ten dedicated sound engines and access to 35 virtual instruments, including the vast majority of the emulations found with the iconic V Collection. It also costs $2,000.

You could recreate this on the cheap by just buying some software instruments and a MIDI controller, but this is a stage keyboard. In other words, it has been designed with live performance in mind. The casing is durable and built to withstand the rigors of touring, and there are plenty of nifty sound design tools that should come in handy when gigging.

There are 12 insert FX options, with four control knobs, and the ability to loop any sound by up to 32 bars. The instrument even captures the MIDI, so people can easily swap out to another instrument and play the same part. The multitimbral feature allows players to set a split point along the keyboard, to make it easy to pull up two instruments at the same time. This is a big deal when playing live, as you never know how long a keyboard will take to load a preset.

If you want to get people dancing to the sound of a robot voice singing “around the world” over and over until 5 AM, AstroLab keyboards ship with a vocoder and a port to plug in a microphone. Of course, the synthesizer features the usual accouterments like mod wheels, an arpeggiator and various chord scale options. Finally, there’s an affiliated mobile app, AstroLab Connect, that lets users organize their presets and download new sounds from the store. The keyboard is available now through Arturia and various retailers.

This article originally appeared on Engadget at https://www.engadget.com/arturia-stuffed-almost-all-of-its-software-emulations-into-this-new-keyboard-190542557.html?src=rss

You can now lie down while using a Meta Quest 3 headset

Meta is rolling out the latest update for Meta Quest and, as always, there are some handy features. From now on, whenever you're livestreaming to the Meta Quest app, the broadcast will continue when you take the headset off. That should help avoid interruptions. There are some Quest 3-specific upgrades too, including the ability to use an external mic via the USB-C port, along with resolution and image quality improvements for the passthrough mixed reality feature.

That's not all, though. Quest 3 users will be able to take advantage of an experimental feature that allows them to make use of the headset while supine. If you enable the Use Apps While Lying Down option from the Experimental section of the Settings, you'll simply need to hold the menu button to reset your view when you lie down.

As such, you should be able to kick back and relax into immersive media and gaming experiences without having to keep your head upright. Turning your head to see what's going on elsewhere in the environment might be a bit more of a chore though.

Elsewhere, it'll now be easier to meet up with friends in Horizon Worlds, if any of your friends actually use that app. Whenever a buddy is in a public world with their location turned on, you can join them from the People app in the universal menu.

This article originally appeared on Engadget at https://www.engadget.com/you-can-now-lie-down-while-using-a-meta-quest-3-headset-164556039.html?src=rss

Google Gemini chatbots are coming to a customer service interaction near you

More and more companies are choosing to deploy AI-powered chatbots to deal with basic customer service inquiries. At the ongoing Google Cloud Next conference in Las Vegas, the company has revealed the Gemini-powered chatbots its partners are working on, some of which you could end up interacting with. Best Buy, for instance, is using Google's technology to build virtual assistants that can help you troubleshoot product issues and reschedule order deliveries. IHG Hotels & Resorts is working on another that can help you plan a vacation in its mobile app, while Mercedes-Benz is using Gemini to improve its own smart sales assistant.

Security company ADT is also building an agent that can help you set up your home security system. And if you happen to be a radiologist, you may end up interacting with Bayer's Gemini-powered apps for diagnosis assistance. Meanwhile, other partners are using Gemini to create experiences that aren't quite customer-facing: Cintas, Discover and Verizon are using generative AI capabilities in different ways to help their customer service personnel find information more quickly and easily. 

Google has launched the Vertex AI Agent Builder, as well, which it says will help developers "easily build and deploy enterprise-ready gen AI experiences" like OpenAI's GPTs and Microsoft's Copilot Studio. The builder will provide developers with a set of tools they can use for their projects, including a no-code console that can understand natural language and build AI agents based on Gemini in minutes. Vertex AI has more advanced tools for more complex projects, of course, but their common goal is to simplify the creation and maintenance of personalized AI chatbots and experiences.
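For a sense of what sits underneath these agents, here's a minimal sketch of calling Gemini through the Vertex AI Python SDK. The project ID, region, model name and sample prompt below are placeholders of our own, not anything from Google's announcement; Agent Builder layers its no-code console and grounding tools on top of calls like these.

```python
# Minimal sketch: a toy customer-service exchange with Gemini via the
# Vertex AI Python SDK (pip install google-cloud-aiplatform).
# The project ID, region and model name are placeholder assumptions.
import vertexai
from vertexai.generative_models import GenerativeModel

vertexai.init(project="your-gcp-project", location="us-central1")

model = GenerativeModel("gemini-1.0-pro")  # assumed model name; use whatever your project has enabled
chat = model.start_chat()

# Loosely mirrors the retail-style assistants described above.
reply = chat.send_message("My order hasn't arrived yet. Can I reschedule the delivery?")
print(reply.text)
```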

At the same event, Google also announced its new AI-powered video generator for Workspace, as well as its first ARM-based CPU specifically made for data centers. By launching the latter, it's taking on Amazon, which has been using its Graviton processor to power its cloud network over the past few years. 

This article originally appeared on Engadget at https://www.engadget.com/google-gemini-chatbots-are-coming-to-a-customer-service-interaction-near-you-120035393.html?src=rss