Samsung may have found a way to strike a hefty blow to the United States’ burgeoning right to repair movement. It has approached the International Trade Commission (ITC), requesting an investigation into the importation of third-party OLED displays for independent repair stores. And if the ITC finds in Samsung’s favor it would, in the words of Louis Rossmann (who published the text of the complaint), “fire a kill shot on the entire repair industry.”
Put simply, Samsung’s complaint says that it creates AMOLED displays for mobile devices, and that those displays are covered by a number of patents. But factories in China (and elsewhere) are, according to Samsung, churning out similar screens that infringe on those patents. These screens are often imported by third-party repair businesses in the US as a cheaper alternative to buying authorized parts directly from, in this case, Samsung.
Several businesses are named in Samsung’s complaint, including MobileSentrix, Injured Gadgets and DFW Cellphone & Parts. Many offer wholesale parts and equipment to other repair companies, as well as their own over-the-counter repair service. Samsung wants the ITC to issue orders blocking the importation of these replacement display parts at the border. It has also requested that the named companies be ordered to stop importing, selling or using the products in question.
Now, Samsung is well within its rights to protect its intellectual property, even if it’s going about it in a roundabout way. Rather than addressing the infringing factories directly by seeking remedy where those businesses operate, it’s opting instead to block imports into the US. Given the cavalier manner in which foreign IP is treated in some parts of the world, it may be easier to go after the customers than to attack the suppliers. Samsung’s lawyers had not responded to our requests for comment at the time of publication.
On January 4th, 2023, the ITC announced that it would open an investigation into the import activity under Section 337 of the Tariff Act of 1930. This gives the ITC broad latitude to examine whether the importation of a product into the US would harm a business operating here. That covers both the infringement of registered patents and the “misappropriation of trade secrets.” The remedies on offer include a prohibition on further imports as well as a blanket ban on further attempts to acquire the hardware.
The ITC has become a useful tool in corporate America’s arsenal when looking to avoid a drawn-out courtroom battle. Law firm Mayer Brown’s report on Section 337 explains that companies use the Commission because it offers a “highly accelerated procedure” and “powerful remedies” that are “not available in federal courts.”
If Samsung’s request is successful, it could prevent large volumes of third-party OLED displays from being imported to the US. This would have consequences for the small and medium-sized repair businesses that have grown up around repairing broken smartphone screens. It would also funnel significantly more people toward Samsung’s network of authorized service centers.
Few individuals are willing to speak on the record concerning the present state of Android device repair for fear of souring already-strained supplier relationships. We heard from multiple sources that the perpetually under-fire third-party Apple repair ecosystem is luxurious compared to its Android equivalent. One individual, who asked not to be named, said it was often difficult to source replacement parts for Android handsets, which regularly cost more than those for equivalent Apple products.
Another said that standalone Android repair businesses often struggle to stay afloat since they have to charge higher prices for display replacement. And many customers, when shown the potential cost, prefer to ditch their device and replace it outright. (We noted, too, that on Samsung’s US cracked display support page, the first option in the list is to upgrade your phone rather than opting for a screen replacement.)
In its case to the ITC, Samsung says that it has “sufficient manufacturing capacity” to “assure demand is met for OLED displays as replacement,” which are “supplied through authorized channels.” We could not contact anyone inside Samsung’s authorized repair channels for comment, but one independent repairer who claimed knowledge of the situation said that wasn’t necessarily the case. They believe that Samsung repairers often face long wait times for replacement parts, and that the company often can’t fulfill demand quickly enough.
The Repair Association and US Public Interest Research Group issued a joint submission to the ITC on January 12th, which was shared with Engadget. It said Samsung was behaving in a manner contrary to the US’ present push to reduce the proliferation of e-waste. They added the move was likely anti-competitive and designed to box out independent repair technicians. And that, if Samsung is concerned about patent infringement, it should seek to negotiate with the infringing factories directly or propose “fair and reasonable” licensing terms.
When contacted, the ITC said that it did not comment on ongoing matters, and it will likely be some time before we learn its decision. Rossmann, in a YouTube video posted to his channel, added that this may not just affect Samsung displays, but any OLED display supplied by Samsung, which includes a number of displays for iOS devices, given that Samsung Display reportedly supplies 70 percent of all iPhone screens. If the ITC interprets this in the broadest possible terms, the right to repair movement may be in for a long battle.
We’ve known for a while that Sony planned to bring PlayStation franchises to mobile platforms, but we were hoping for something with a unique hook. Instead, Sony has partnered with the independent developer and publisher Exient (Lemmings, Planet 53) on a mobile game starring LittleBigPlanet’s Sackboy. Ultimate Sackboy is an auto-running game for Android and iOS, launching globally on February 21st.
The title follows a well-worn formula: control a cute auto-running mascot, jumping and swerving lanes to avoid obstacles while snagging power-ups. Like Super Mario Run and other genre standards, you’ll play with your phone in portrait orientation. The plot revolves around the crocheted hero competing in the Ultimate Games, which we imagine as an Olympics for semi-retired video game mascots living in an artisan-crafted world. Unsurprisingly, the game’s Google Play listing mentions ads and in-app purchases, consistent with the trailer’s emphasis on acquiring costumes and cosmetics.
Although we’d love to see publishers like Sony bring something more unique to their phone-based spinoffs, an auto-runner starring a beloved mascot ticks the boxes publishers prioritize on mobile: maximum micro-transaction potential with minimal investment in unique gameplay.
The public launch will follow the game’s closed betas in Australia, Canada, Ireland, Netherlands, New Zealand, Philippines, Singapore, South Africa, Turkey and Malta. You can sign up to pre-register on Google Play, and this page will notify you once it’s available on iOS (it will have iPhone and iPad versions).
A few big names in the smart home space, iRobot and Shark in particular, have jumped on the robot-vacuum-and-mop bandwagon as of late. The two companies recently came out with their first 2-in-1 devices, and now you can pick up Shark's at its best price yet. The Shark AI Ultra robot vacuum and mop is 36 percent off at Amazon right now, bringing it down to $450, which is less than it was during the holiday shopping season last year. If you're an iRobot fan, the Combo j7+ is also on sale, but it's much more expensive at $899.
It's important to note that we at Engadget have not had the chance to test Shark's new machine yet, but we have had great experiences with all of the Shark robo-vacs we've tried to this point. Shark devices make appearances in both of our robot vacuum guides, with the standard AI Ultra vacuum taking one of the top spots on our list of overall favorites. The new 2-in-1 device seems to take a lot of notes from the standard model: you're getting a disk-like robot vacuum along with a bagless, self-emptying base into which the machine will dump the contents of its dustbin after every job.
The big difference here is the included water reservoir and the washable, reusable mopping pads that come with the 2-in-1 machine. It'll employ those when cleaning hardwood floors using a sonic mopping technique that supposedly scrubs floors up to 100 times per minute. If you have a mix of carpet, hardwood, tile and other flooring in your home, a 2-in-1 device like Shark's will make it more convenient to clean all of those surfaces in one go.
In addition to that new feature, this Shark robot vacuum has improved suction power, flexible silicon "fins" on its underside that help pick up more dirt and debris, obstacle avoidance and smart home mapping. Like most other robot vacuums, you can set cleaning schedules within Shark's companion app, which we think will be easy to use for tech-savvy users and newbies alike. And we especially like that its base is bagless — that means you don't have to buy proprietary trash bags to fill it with like you do with some competitors.
Shark's device joins a number of other robo-vacs on sale right now. As we mentioned previously, iRobot's Combo j7+ is $200 off right now, plus you can get the Roomba s9+ for $200 off as well or the much more affordable Roomba 694 for only $179.
"It is Getty Images’ position that Stability AI unlawfully copied and processed millions of images protected by copyright and the associated metadata owned or represented by Getty Images absent a license to benefit Stability AI’s commercial interests and to the detriment of the content creators," Getty Images wrote in a press statement released Tuesday. "Getty Images believes artificial intelligence has the potential to stimulate creative endeavors."
"Getty Images provided licenses to leading technology innovators for purposes related to training artificial intelligence systems in a manner that respects personal and intellectual property rights," the company continued. "Stability AI did not seek any such license from Getty Images and instead, we believe, chose to ignore viable licensing options and long‑standing legal protections in pursuit of their stand‑alone commercial interests."
The details of the lawsuit have not been made public, though Getty Images CEO Craig Peters told The Verge that the charges would include copyright violations and site TOS violations like web scraping. Furthermore, Peters explained that the company is not seeking monetary damages in this case so much as it is hoping to establish a favorable precedent for future litigation.
Text-to-image generation tools like Stable Diffusion, Dall-E and Midjourney don't create the artwork that they produce in the same way people do — there is no imagination from which these ideas can spring forth. Like other generative AI, these tools are trained to do what they do using massive databases of annotated images — think, hundreds of thousands of frog pictures labelled "frog" used to teach a computer algorithm what a frog looks like.
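The annotated-dataset idea above can be sketched as a toy collection of image–label pairs. Everything below is purely illustrative — it is not how Stable Diffusion's actual training pipeline works, and the data is made up:

```python
# Toy illustration of the image-label pairs used to train text-to-image
# models. Real systems use hundreds of millions of (image, caption) pairs;
# the captions and pixel data here are invented for demonstration.

# Each training example pairs pixel data with a text annotation.
dataset = [
    {"caption": "a green frog on a lily pad", "pixels": [[0, 128, 0]] * 4},
    {"caption": "a brown frog in mud",        "pixels": [[96, 64, 32]] * 4},
]

def captions_mentioning(word, data):
    """Collect the annotations that teach the model a given concept."""
    return [ex["caption"] for ex in data if word in ex["caption"]]

# The model never 'imagines' a frog; it statistically associates the
# word with pixel patterns seen across many labelled examples.
print(captions_mentioning("frog", dataset))
```

The scale is the point: a handful of examples teaches nothing, which is why training sets are scraped by the hundreds of millions.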
And why go through the trouble of assembling and annotating a database of your own when there's an entire internet's worth of content there for the taking? AI firms like Clearview and Voyager Labs have already tried, and have been repeatedly fined for scraping image data from the public web and social media sites. An independent study conducted last August concluded that a notable portion of Stable Diffusion's data was likely pulled directly from the Getty Images site, in part as evidenced by the art tool's habit of recreating the Getty watermark.
You'd think that, being the oldest name in the smart lighting world, Philips would have the best app on the market. More than a decade of iterative improvements and a mature hardware world would see the app rise proudly above its competitors. Sadly for me, and every other Hue user, the company seems to have fallen asleep behind the wheel.
(Yes: I know that Philips Lighting rebranded itself as Signify, but let’s not confuse matters here.)
I picked up a Hue starter kit and some additional Lux bulbs back in 2013, and was very impressed with the setup for at least ten minutes. It very quickly became one of those gadgets that only really got used to show the power of your smart home to visitors. And they rather quickly tired of my ability to change my living room lights from white to purple, and back again. In fact, I mostly used the bulbs as glorified dimmer switches, which wasn’t enough to justify the high cost of the initial investment.
At some point, the app started insisting I replace the v1 (round) Bridge with the v2 (square) model. I bristled: already feeling aggrieved that Hue was all mouth and no trousers, I resented having to pay when the existing system worked perfectly well. Especially since I could have used that money to buy more Hue bulbs and lock myself further into Philips’ ecosystem.
No tears were shed when the Bridge eventually got smashed by one (or both) of my kids when I was out of the room. I decided, in a tiny flurry of COVID-19 lockdown-induced Marie Kondo-ing, that I’d toss the box into the trash and be done with it. After all, it was broken, and changing the color of my bulbs did not spark the joy I was expecting, not to mention the fact that Philips loves to charge a lot of cash to sync your lighting to a movie playing on your TV.
Last month, my wife asked me why we weren't able to use Hue any more, and I explained the situation. She asked how much it would cost to fix it, and found a sealed, unused second-generation Bridge on Facebook Marketplace for half the retail price. So we snapped it up, obviously making the usual security checks about buying secondhand IoT gear before plugging it into our network.
That was, however, when the troubles began, since you can’t just sign in to your existing Hue account, hook it up to the new Bridge, and be done. Nobody at Philips seems to have imagined that it might be worthwhile building out the ability to revive an account tied to a dead bridge. In fact, there’s no way to connect anything without a fresh login, and the bulbs themselves are tied to the old one. The app also doesn’t provide any way to hard reset a bulb, or in fact do anything beyond leave you staring at a splash screen.
For about half an hour, I wondered if I’d just wasted my cash on a new Bridge without ever getting things working again. I felt the sort of frustration and powerlessness that comes when you’re locked and bolted out of a building at 2am in an unfamiliar city and your phone’s out of charge. My login wouldn’t work, because my Bridge wasn’t connected to the internet, and a new login wouldn’t even acknowledge the presence of the expensive hardware all over my house. My hands got very itchy.
This is the kicker: I’m not the first person to learn how bad Philips’ software development is, because there’s a whole army of third-party Hue apps out there. Much in the same way that charity is an indictment of the state, the depth and breadth of Hue apps available is a massive critique of Philips’ lackluster app development. Philips is paid to do this, and yet its app offers no function for fixing what must be a fairly common problem.
I opted to use Hue Lights, one of many independent apps that offered the ability to hard reset a bulb. All I had to do was bring each bulb close to the bridge (you’ll need a lamp handy), turn it on, and hard reset each unit individually. Then I could reconnect them to the new bridge and, as if by magic, could then start using them with the official Hue app. Not that, I’ll be honest, I really want to. Because this third-party, very simple app has more power than the official Philips app and it’s easier to use. If you haven’t tried it, I heartily recommend that you do. At least until Philips gets its act together.
Samsung expanded its self-repair program for Galaxy devices today, adding the latest flagship smartphones and, for the first time, PCs. As you may remember, the initiative is a team-up with iFixit, which provides tools and online self-repair guides.
Starting today, you can order repair kits for the 15-inch models of the Galaxy Book Pro and Galaxy Book Pro 360. Supported PC repairs include the display, battery, touchpad, case (front and rear), power key with fingerprint reader, and rubber foot. Additionally, Samsung added the Galaxy S22, S22+ and S22 Ultra kits. It supports repairs for the display assemblies, rear glass and charging ports for those phones.
The newly supported models join the program’s initial lineup of the Galaxy S20, Galaxy S21 and Galaxy Tab S7+. The new kits still include a free return label to help you send used parts to Samsung for recycling. All the new kits are available starting today.
Apple’s Self Service Repair program
Apple
While Apple's program covers more components (including cameras and SIM trays), it also requires you to rent or buy a separate toolkit and talk with someone on the phone to complete the process. With Samsung's kit, you only need to buy the part and follow the instructions.
Samsung frames its self-repair program as being about convenience and the environment — and it can be beneficial for both of those things. But the elephant in the room is Right to Repair legislation on federal and state levels. New York and Massachusetts have passed laws mandating self-repair programs, while the White House has also pushed for it. In 2021, President Biden ordered the FTC to tackle “unfair anti-competitive restrictions on third-party repair or self-repair of items” in the farming and technology industries. So although Samsung’s and Apple’s programs are good for consumers, it’s a stretch to think this would happen without the threat of government legislation.
Uber is expanding its electric car rentals to Europe. The ridesharing service has expanded its deal with Hertz to provide up to 25,000 EVs to European capital cities by 2025, including those from Polestar and Tesla. The rollout will begin in London this month, and will reach hubs like Amsterdam and Paris as soon as 2023. Rates and other details will be available in "due course," Uber says.
The companies first teamed up in late 2021, when Hertz pledged to offer up to 50,000 Tesla rental EVs to Uber drivers in the US. Last spring, Hertz said it would add as many as 65,000 Polestar EVs to its fleet within five years. The rental car agency claims its Uber partnership in North America has been successful — almost 50,000 drivers have rented Tesla vehicles so far.
Both firms see the European expansion as key to furthering their goals. Uber says it plans to be a "zero-emissions platform" in London by 2025, and completely electric in Europe and North America by 2030. Hertz, meanwhile, has set out to offer "one of the largest" EV fleets worldwide.
Neither brand may have much choice, however. The UK and European Union intend to ban sales of new fossil fuel cars by 2035, and the EU agreement also demands cutting new car emissions by 55 percent from 2030. Many automakers operating in Europe, like Ford and Volvo, expect to drop combustion engines by 2030. Uber and Hertz will have to adopt EVs in the next several years, and these rentals could help ease the transition for drivers who can't justify buying the technology at this stage.
Sony and Canon are locked in a pitched battle for the full-frame mirrorless camera market, and Canon’s latest salvo is the $2,500 EOS R6 II. It’s not just a key rival to Sony’s like-priced 33-megapixel A7 IV, but gives Canon the opportunity to rectify overheating flaws in the otherwise excellent EOS R6.
The new 24-megapixel sensor promises more resolution and image quality than the 20-megapixel R6. It also offers faster shooting speeds, improved 4K video specs, an improved viewfinder and more. The competition in this category is getting tough, though. Panasonic also recently announced the $2,000 Lumix S5II and $2,200 S5IIX, its first cameras with phase-detect hybrid autofocus.
I saw the R6 II late last year in prototype form, but I’ve now got my hands on the final version. Can it keep up with the competition, and are the overheating issues solved? I tried it in a variety of shooting situations to find out.
Body and handling
Canon has experimented with the controls of past cameras, introducing things like a touch bar, but users didn’t like it. Fortunately, the R6 II uses Canon’s tried and tested form factor, with buttons, dials and the joystick right where you’d expect to find them. The grip is big, comfortable and covered in a rubber-like material, giving a sure hold with no discomfort even after a day’s use.
There are a few welcome changes over the R6, though. The power switch is now on the right for easier access, with a “lock” setting that prevents accidental control activation (you can specify which controls to lock out).
Canon also introduced a dedicated photo and video switch. Flipping it changes all the settings for each button, as well as the main and quick menus. If you flip from photos to video, though, it uses whatever is set on the mode dial (M, S, A, P, etc.), so you have to remember to change that. All other settings, though, remain separate.
As before, it has a fully-articulating 1.62-million dot display that makes the R6 II useful for vlogging, selfies, etc. And Canon has updated the EVF from 2.36-million to 3.67-million dots, matching the A7 IV and getting rid of one of my biggest complaints about the original R6. It’s not quite as sharp as the 5.76-million dot EVF on the X-H2S, for instance, but it’s relatively sharp and fast with a 120 fps refresh rate.
Where the R6 had a single fast UHS-II card slot and a slower UHS-I slot, the R6 II now has two UHS-II slots. Unlike the A7 IV or Panasonic GH6, though, it lacks any kind of CFexpress card slot, which affects burst speeds and video capture options.
It uses the same LP-E6NH battery as before, but endurance is up significantly from 510 shots max on the R6 to nearly 760 on the R6 II. I’ve taken well over 2,000 shots in a day (with a mix of electronic and mechanical shutter), and shot video for nearly two hours.
Naturally, it has microphone and headphone ports, along with a “next-generation” 21-pin digital interface at the hot shoe (Canon has shown images with the Tascam XLR2d-C audio interface and its newly launched Speedlite EL-5). Sadly, it has a micro HDMI port instead of a full-sized one. That’s unfortunate considering the RAW video output, as micro HDMI cables and ports tend to be fragile and finicky.
In terms of connectivity, you can run the camera over USB-C using USB Power Delivery. It also offers Bluetooth 5 and 5GHz WiFi, and you can use it directly as a PC or Mac webcam over USB-C, thanks to the industry-standard UVC and UAC video and audio drivers built into Windows and macOS.
Performance
Steve Dent/Engadget
As I saw in San Diego while shooting sports, the R6 II is fast. It can fire bursts at 12 fps with the mechanical shutter, which is already a touch faster than the A7 IV. However, switching to electronic mode brings that pace up to a frenzied 40 fps, making it the sportiest full-frame camera in this price category by far.
Using the electronic shutter means you’ll capture fewer shots, though (it also impacts quality, but more on that shortly). You can get about 75 compressed RAW/JPEG frames before the buffer fills, and fewer with uncompressed RAW. In mechanical shutter mode, by contrast, you can shoot around 1,000 compressed RAW/JPEG frames before it stops, or about 140 uncompressed RAW photos.
Speaking of the buffer, an interesting new feature is the Pro Capture mode. If you activate that setting and half press the shutter button, it will continuously record and store several seconds worth of photos in the buffer. Then, when you full-press the shutter button, you’ll capture a few seconds of action that occurred right before you did so. The idea, of course, is that if you weren’t quite quick enough, you’ll still get a shot.
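Pro Capture behaves like a rolling pre-shutter buffer: frames stream in during the half-press, the oldest are discarded, and the full press saves whatever is left. Here's a minimal sketch of that idea with made-up frame rates and buffer depths (Canon hasn't published the feature's internals):

```python
from collections import deque

# Sketch of a pre-shutter ring buffer like the R6 II's Pro Capture mode.
# The constants are illustrative assumptions, not Canon's actual values.
FPS = 40                      # electronic-shutter burst rate
PRE_SECONDS = 0.5             # how far back the half-press buffer reaches

class PreCaptureBuffer:
    def __init__(self, fps=FPS, seconds=PRE_SECONDS):
        # deque with maxlen silently discards the oldest frame when full
        self.frames = deque(maxlen=int(fps * seconds))

    def half_press(self, frame):
        """While half-pressed, keep recording into the rolling buffer."""
        self.frames.append(frame)

    def full_press(self):
        """On full press, save everything captured just before the click."""
        saved = list(self.frames)
        self.frames.clear()
        return saved

buf = PreCaptureBuffer()
for i in range(100):          # 100 frames arrive during the half-press
    buf.half_press(f"frame-{i}")
shots = buf.full_press()
print(len(shots), shots[0])   # only the newest 20 frames survive
```

The `maxlen` deque is doing the work here: it caps memory while guaranteeing the saved frames are always the most recent ones before the shutter press.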
Rolling shutter is well controlled, about half that of the original R6 and significantly less than the A7 IV, as you can see in tests performed by Gerald Undone. I’d hesitate to use it for fast-moving sports at full-frame, but it’s very minimal in cropped 1.6x mode.
The Dual Pixel autofocus on the R6 II is also quicker and more reliable than the R6. Using it in single-point mode with no face/eye detection, it could keep up with the 40 fps burst speeds, missing just the odd shot. In 12 fps mechanical shutter mode, I rarely had a shot out of focus. In this aspect, it’s nearly on par with the EOS R3, which uses a stacked sensor.
Steve Dent/Engadget
There are 4,897 focus detect positions for photos and 4,067 for video, with up to 100 percent coverage depending on the lens. That means you can track subjects even at the edge of the frame. Selecting a subject is relatively easy using the multi-controller joystick or touchscreen.
On top of face and eyes, the R6 II can detect people’s bodies, plus animals and vehicles, including motorcycles, cars, trains and horses. It also comes with a new auto-select mode that lets the AI choose the subject type. It can also track user-selected subjects not in those categories.
While it can occasionally get confused by the background, the R6 II is good at locking onto human faces and eyes. It’s a bit less dependable for animals and other subjects. Tracking fast-moving subjects works well, though I had to dive into the settings to boost speeds for quicker movement. Touch to track works well if the subject is well defined, but isn’t as reliable as face tracking.
In general, autofocus is excellent and second only to Sony. As mentioned, I shot thousands of images per day at Canon’s shooting sessions (on a prototype camera), and most of them were in focus, with very little fiddling required on my part.
Image Quality
The new 24-megapixel sensor (neither backside-illuminated nor stacked) is the biggest improvement in this camera. Images are of course sharper than the R6’s, and Canon has also boosted low-light sensitivity and dynamic range, improving overall image quality.
JPEGs have good levels of detail without excessive sharpening. Color accuracy is good and skin tones more pleasing than other cameras I’ve tried recently. If you want to boost quality a bit but not shoot RAWs, you can also shoot using the 10-bit HEIF (high efficiency image file) format, which offers a wider color range and less likelihood of banding.
With an ISO range of 100-102400 (50 to 204800 expanded), the R6 II is actually better than the original R6 in low light, despite the extra resolution. I had no qualms about shooting at ISO 12800 using some light noise reduction, and even ISO 25600 images were usable if I exposed correctly. Anything above that was of limited use, however.
RAW images retain extra detail, especially in shadows. That makes images easier to edit should you underexpose them. It has perhaps a bit less dynamic range than Sony or Nikon full-frame cameras, but it’s still very good. Beware that dynamic range drops in electronic shutter mode, though, as the R6 II shifts from 14-bit to 12-bit capture – so that extra speed does come at a slight loss in dynamic range.
Video
The EOS R6 II offers supersampled, full-frame 4K video all the way up to 60 fps. By contrast, the A7 IV and Panasonic’s S5 II both crop 60p video. Much like the A7 IV, 10-bit quality is available only in C-Log3 mode, with 8-bit in the regular video modes. That’s too bad, as regular 10-bit video provides extra headroom in shadows and highlights, without the hassle of applying LUTs or doing other color correction. All resolutions are available in 1.6X crop mode, with just a slight loss in sharpness.
You can do super slow mo in 1080p at up to 180fps, though the footage is barely usable. It’s more acceptable at 120fps, which still slows the action way down. And finally, you can shoot up to 6K in 12-bit ProRes RAW to an external Atomos Ninja V+ recorder. That delivers the best quality and easiest-to-edit video, if you don’t mind the hassle.
With the original R6, heating issues were a showstopper for many. You could shoot no more than 40 minutes of video at 4K 30fps, or 30 minutes at 60p. On top of that, you had to wait at least 10 minutes for it to cool down, and then you could only shoot for another 10 minutes or so.
Fortunately, those problems are largely gone. I shot supersampled 4K 30p video for nearly two hours, until the battery died, with no heating issues. In 60p supersampled mode, Canon says you can shoot for up to 50 minutes, and the camera recovers from a cooldown more quickly than the original R6 did.
Steve Dent/Engadget
Those numbers are actually conservative, as I was able to shoot 4K 60p for over an hour (albeit, in 50 degree F temperatures). If you start and stop 4K 60p capture, there are no problems. If you really need continuous 4K 60p video, get another camera, but otherwise overheating issues are largely gone.
Quality is excellent, with sharper video than the competition at 4K 60p. Dynamic range in C-Log3 mode isn’t quite as good as Sony’s A7 IV or the Panasonic S5 II, though. Much of that is lost in shadows, so it’s better to slightly overexpose than underexpose when shooting C-Log3. It’s nothing you’d notice for regular non-log video, though.
Low-light video is good at ISOs up to 6400 and you can get away with 12800 if you’re careful with exposure. If not, boosting shadows can create some serious noise. Still, it’s one of the better full-frame cameras in low light, making it useful for things like concerts or plays.
One unfortunate omission compared to rivals is the lack of easy-to-edit intra-frame (All-Intra or ProRes) codecs. That makes it pretty much mandatory to convert to ProRes or another format before editing, as even fast editing systems struggle with Long GOP footage. Sony’s A7 IV, meanwhile, supports All-Intra capture at up to 600 megabits per second, which is one reason it has a CFexpress Type A card slot.
Video autofocus is a strong point for Canon. With single-point autofocus for run and gun shooting, interviews and the like, I rarely had out-of-focus shots. Human face and eye tracking is incredibly reliable for video. It stays locked on the subject and keeps them in focus as they move, though again, Sony’s A7 IV is slightly quicker.
Steve Dent/Engadget
As with photos, it also offers reliable animal and vehicle tracking, with the same “auto” mode that lets the camera’s AI choose the subject type. Overall, the R6 II is another reliable Canon camera in terms of video autofocus – something I think is really important for most video shooters, especially vloggers or documentary filmmakers.
Canon beats all rivals in rolling shutter. It’s noticeably better than on the Sony A7 IV, even in fully-downsampled mode. In 1.6 crop mode, it’s barely detectable, even if you whip the camera around. Like bad autofocus, excessive rolling shutter can ruin shots, so for me this is another key feature.
In-body stabilization is fine for stationary handheld shots or small movements. Anything more can be jerky, even in enhanced digital IS mode, however. The R6 II is about the same as the A7 IV in this regard, but Panasonic’s new S5II has massively improved stabilization designed for video and looks like it will beat both cameras.
Finally, Canon has introduced a digital focus breathing feature, much like Sony has on the A7 IV. This allows you to “rack” focus from one subject to another without either changing in size, by essentially using digital zoom to counteract the optical zoom. It works well, but only with a handful of lenses for now.
Wrap-up
Canon’s $2,500 EOS R6 II is a formidable hybrid mirrorless camera, with fast shooting speeds, accurate autofocus and strong video capabilities. The overheating issues have largely been fixed, unless you really need to shoot continuous 4K 60p full-frame video. The main downside is a lack of dynamic range compared to rivals.
Sony’s $2,500 A7 IV has more resolution but slower shooting speeds, particularly in electronic mode. Rolling shutter is a more serious issue on that camera as well. On the plus side, it offers slightly better image quality and autofocus that’s a touch faster.
Panasonic’s $2,000 S5 II has slightly better video specs, but it remains to be seen if its autofocus can keep up. The S5 IIX, coming in May, looks like a better mirrorless camera for video and, at $2,200, it’s still less expensive than the R6 II. And finally, if you’re willing to drop down to an APS-C sensor, Fujifilm’s $2,500 X-H2S has a stacked sensor and better video chops, but slightly inferior autofocus. If you shoot photos and video in equal measure, I’d choose the EOS R6 II over all of those models.
Samsung is continuing its "more pixels is better" mantra with the launch of its latest 200-megapixel (MP) sensor. The ISOCELL HP2 is a relatively large (for a smartphone) Type 1/1.3 sensor (around 12mm diagonally) with a pixel pitch of 0.6 micrometers (μm) — in between the 200-megapixel HP1 and HP3 sensors. It offers more light gathering than past sensors along with new HDR features and will likely be used in Samsung's upcoming Galaxy S23 Ultra smartphone.
The HP2 uses something Samsung calls Dual Vertical Transfer Gate (D-VTG) technology, which adds a second transfer gate to each pixel, “boosting the pixel’s full-well capacity by more than 33 percent,” the company wrote. That means each pixel can hold more charge before saturating, reducing overexposure and improving color reproduction in bright conditions.
As before, it can transform into either a 1.2μm 50MP or a 2.4μm 12.5MP sensor by binning either four or 16 neighboring pixels, allowing for better performance in low light. It can shoot up to 8K 30 fps video in 50MP mode (up from 24 fps on the Galaxy S22) to minimize cropping while still allowing for sharp video. It also uses something called Smart-ISO Pro to capture 12.5MP HDR images and 4K HDR video at up to 60 fps. And as before, every pixel can be used for focusing, allowing for quick autofocus even in low-light situations.
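The binning arithmetic behind those modes is straightforward to sketch: combining an n x n block of pixels divides the resolution by n² and multiplies the effective pixel pitch by n. Here’s an illustrative snippet using the figures above (the function name is ours, not any Samsung API):

```python
def binned_mode(megapixels: float, pitch_um: float, n: int):
    """Return (effective megapixels, effective pitch in μm) for n x n pixel binning."""
    return megapixels / (n * n), pitch_um * n

# Native sensor: 200MP at a 0.6μm pixel pitch
print(binned_mode(200, 0.6, 2))  # 4-pixel binning  -> (50.0, 1.2)
print(binned_mode(200, 0.6, 4))  # 16-pixel binning -> (12.5, 2.4)
```

Binning trades resolution for light-gathering area per effective pixel, which is why the 12.5MP mode performs best in low light.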
Samsung's Galaxy S22 Ultra "only" had a 108MP sensor, with the 200MP HP1 chip used in other devices like the Motorola Edge 30 Ultra and Xiaomi 12T Pro. However, rumor has it that the S23 Ultra will have a 200MP sensor, and the HP2 fits the bill as it's already in mass production, Samsung said. The Galaxy S23 is set to be revealed in just two weeks on February 1st, 2023.
Twitter appears to have deliberately cut off third-party clients from accessing its API. Since Thursday evening, many of the most popular apps that let users browse Twitter without going through the company’s own software, including Tweetbot and Twitterrific, haven’t worked, with no official communication from Twitter. On Sunday, a report from The Information featured messages from Twitter’s internal Slack channels suggesting the company is aware of the outage – and is likely the cause of it. “Third-party app suspensions are intentional,” reads one message in a channel the company’s engineers use to fix service disruptions.
In the last 24 hours, Tweetbot briefly came back online, but then fizzled out again. Does Twitter really want us to reinstall its app or use it in our web browsers? Will this work, or will more of us just turn off Twitter entirely?
– Mat Smith
The Morning After isn’t just a newsletter – it’s also a daily podcast. Get our daily audio briefings, Monday through Friday, by subscribing right here.
Samsung will make the phones official on February 1st.
Nieuwe Mobile posted leaked images of the upcoming Samsung Galaxy S23 Ultra and S23 Plus. The alleged renders show the camera placement, colors and design of the new flagships, which Samsung is set to announce next month. There are, once again, lots of camera sensors. The Ultra’s back appears flatter than its predecessor’s and has five camera sensors. Additionally, three of the S23 Ultra’s lenses (likely the primary, ultra-wide and 10x telephoto cameras) are bigger than the others. According to rumors, the Ultra will have a 200-megapixel main camera, a first for the Galaxy lineup. (Standard and Plus models should have 50-megapixel cameras.)
The livestreaming tool lets you read your chat without looking distracted.
NVIDIA has updated its Broadcast software with a beta Eye Contact feature that, like Apple's FaceTime, fixes your gaze to keep it focused on your camera. It preserves your blinks and eye color and will even transition between digital and real eyes when you look far enough off-center. It’s not perfect: The developers caution there are "millions" of potential eye-color and lighting scenarios they can't test. It could, however, reduce the awkwardness of your first scripted video.
The legislation targets plates, cutlery, trays and polystyrene cups, among other items.
England will ban businesses from selling and offering a variety of single-use plastics, including plates and cutlery, by the end of the year, the UK’s Department for Environment, Food and Rural Affairs announced on Saturday. In addition to some plastics, the ban will cover single-use trays and certain types of polystyrene cups and food containers but will exempt plates, trays and bowls included with supermarket ready-meals – the government intends to target those through a separate plan. The government will begin enforcing the legislation in October 2023.