Posts with «robots» label

Facebook is enabling a new generation of touchy-feely robots

Without a sense of touch, Frankenstein’s monster would never have realized that “fire bad” and we would have had an unstoppable reanimated killing machine on our hands. So be thankful for the most underappreciated of your five senses, one that robots may soon themselves enjoy. Facebook announced on Monday that it has developed a suite of tactile technologies that will impart a sense of touch into robots that the mad doctor could never imagine.

But why is Facebook even bothering to look into robotics research at all? “Before I joined Facebook, I was chatting with Mark Zuckerberg, and I asked him, ‘Is there any area related to AI that you think we shouldn't be working on?’” Yann LeCun, Facebook’s chief AI scientist, recalled during a recent press call. “And he said, ‘I can't find any good reason for us to work on robotics,’ so that was the start of our FAIR [Facebook AI Research] research, that we're not going to work on robotics.”

“Then, after a few years,” he continued, “it became clear that a lot of interesting progress in AI work is happening in the context of robotics because this is the nexus of where people in AI research are trying to get to; the full loop of perception, reasoning, planning and action, and then getting feedback from the environment.”

As such, FAIR has centered its tactile technology research on four main areas of study — hardware, simulation, processing and perception. We’ve already seen FAIR’s hardware efforts: the DIGIT, a “low-cost, compact high-resolution tactile sensor” that Facebook first announced in 2020. Unlike conventional tactile sensors, which typically rely on capacitive or resistive methods, DIGIT is actually vision-based.

FAIR

“Inside the sensors there is a camera, there are RGB LEDs placed around the silicone, and then there is a silicone gel,” Facebook AI research scientist Roberto Calandra explained. “Whenever we touch the silicone on an object, this is going to create shadows or changes in color cues that are then recorded by the camera. These allow [DIGIT] to have extremely high resolution and extremely high spectral sensitivity while having a device which is mechanically very robust, and very easy and cheap to produce.”
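
To make the idea concrete, here's a minimal sketch (in NumPy, with entirely synthetic data, not FAIR's actual pipeline) of how a vision-based tactile sensor turns camera frames into contact information: compare each frame against a reference image of the untouched gel and threshold the difference.

```python
import numpy as np

# Minimal sketch of the vision-based tactile idea behind DIGIT (not FAIR's
# actual pipeline): a camera watches an illuminated gel, and contact shows
# up as pixel changes relative to an untouched reference frame.

def contact_mask(reference, touched, threshold=25):
    """Return a boolean mask of pixels that changed enough to indicate contact."""
    diff = np.abs(touched.astype(int) - reference.astype(int))
    return diff.max(axis=-1) > threshold  # max change across RGB channels

# Synthetic example: a flat gray gel image, then a press that darkens a patch
reference = np.full((64, 64, 3), 128, dtype=np.uint8)
touched = reference.copy()
touched[20:30, 20:30] = 60  # shadow cast by an indenting object

mask = contact_mask(reference, touched)
```

From a mask like this, higher-level code could estimate where and how firmly the sensor is being pressed.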

Calandra noted that DIGIT costs about $15 to produce and, being open source hardware, its production schematics are available to universities and research institutions with fabrication capabilities. It’s also available for sale, thanks to a partnership with GelSight, to researchers (and even members of the public) who can’t build their own.

FAIR

In terms of simulation, which allows ML systems to train in a virtual environment without the need to collect heaps of real-world hardware data (much the same way Waymo has refined its self-driving vehicle systems over the course of 10 billion computer-generated miles), FAIR has developed TACTO. This system can generate hundreds of frames of realistic, high-resolution touch readings per second and can simulate vision-based tactile sensors like DIGIT, so that researchers don't have to spend hours upon hours tapping on sensors to create a compendium of real-world training data.
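
The flavor of what such a simulator produces can be sketched in a few lines. This toy renderer is not the real TACTO API, just an illustration of generating a synthetic tactile image from a simulated press.

```python
import numpy as np

# Toy tactile-frame renderer in the spirit of a simulator like TACTO
# (this is NOT the real TACTO API): given how far a spherical indenter
# presses into the gel, produce a synthetic sensor image by shading the
# resulting height map.

def render_touch(size=64, press_depth=1.0, radius=10.0):
    """Return a (size, size) float image of a simulated spherical press."""
    y, x = np.mgrid[0:size, 0:size]
    r2 = (x - size / 2) ** 2 + (y - size / 2) ** 2
    height = press_depth * np.exp(-r2 / (2 * radius ** 2))  # gel deformation
    gy, gx = np.gradient(height)                            # surface slope
    shading = np.clip(0.5 + 2.0 * gx, 0.0, 1.0)             # fake side lighting
    return shading

frame = render_touch(press_depth=1.5)
```

A real simulator renders frames like this fast enough to stand in for hours of physical data collection.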

“Today if you want to use reinforcement learning, for example, to train a car to drive itself,” LeCun pointed out, “it would have to be done in a virtual environment because it would have to drive for millions of hours, cause, you know, countless thousands of accidents and destroy itself multiple times before it learns its way around, and even then it probably wouldn't be very reliable. So how is it that humans are capable of learning to drive a car with 10 to 20 hours of practice with hardly any supervision?”

“It's because, by the time we turn 16 or 17, we have a pretty good model of the world,” he continued. We inherently understand the implications of what would happen if we drove a car off a cliff because we’ve had nearly two decades of experience with the concept of gravity as well as that of fucking around and finding out. “So ‘how to get machines to learn that model of the world that allows them to predict events and plan what's going to happen as a consequence of their actions’ is really the crux of the problem here.”

Sensors and simulators are all well and good, assuming you’ve got an advanced comp-sci degree and a deep understanding of ML training procedures. But many aspiring roboticists don’t have those sorts of qualifications, so in order to broaden the availability of DIGIT and TACTO, FAIR has developed PyTouch — not to be confused with PyTorch. Where PyTorch is a machine learning library focused primarily on computer vision and natural language processing, PyTouch centers on touch-sensing applications.

“Researchers can simply connect their DIGIT, download a pretrained model and use this as a building block in their robotic application,” Calandra and Facebook AI hardware engineer Mike Lambeta wrote in a blog post published Monday. “For example, to build a controller that grasps objects, researchers could detect whether the controller’s fingers are in contact by downloading a module from PyTouch.”
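
That building-block workflow might look something like the following sketch. The class name and threshold logic here are hypothetical stand-ins, not PyTouch's actual interface.

```python
import numpy as np

# Sketch of the "building block" pattern described above: load a pretrained
# touch-detection module and query it per gripper frame. The class and
# method names are hypothetical stand-ins, not PyTouch's real API.

class TouchDetect:
    """Stand-in for a pretrained touch classifier over DIGIT frames."""

    def __init__(self, reference_frame, threshold=20.0):
        self.reference = reference_frame.astype(float)
        self.threshold = threshold

    def __call__(self, frame):
        # Contact if the frame differs enough, on average,
        # from the no-touch reference image
        return bool(np.abs(frame.astype(float) - self.reference).mean() > self.threshold)

# A grasp controller would poll this while closing the gripper's fingers
reference = np.full((64, 64), 128.0)
is_touching = TouchDetect(reference)

pressed = reference.copy()
pressed[16:48, 16:48] = 40.0  # a contact patch darkens part of the image
```

The point of the library design is exactly this: the roboticist writes the control loop and treats touch detection as a downloaded black box.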

Most recently, FAIR (in partnership with Carnegie Mellon University) has developed ReSkin, a touch-sensitive “skin” for robots and wearables alike. “This deformable elastomer has micro-magnetic particles in it,” Facebook AI Research Manager, Abhinav Gupta, said. “And then we have electronics — a thin flexible PCB, which is essentially a grid of magnetometers. The sensing technology behind the skin is very simple: if you apply force into it, the elastomer will deform and, as it deforms, it changes the magnetic flux which is read [by] magnetometers.”
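
The sensing principle Gupta describes can be sketched as a toy model (not FAIR's actual ReSkin code): each magnetometer in a grid reports a 3-axis flux vector, and the sensor whose reading shifts most from its baseline localizes the press.

```python
import numpy as np

# Toy model of ReSkin's sensing principle: pressing the elastomer moves the
# embedded magnetic particles, perturbing the flux read by nearby
# magnetometers. The magnetometer with the largest change from baseline
# tells you roughly where the contact happened.

def localize_contact(baseline, reading):
    """Return (row, col) of the magnetometer with the largest flux change."""
    change = np.linalg.norm(reading - baseline, axis=-1)  # per-sensor magnitude
    return np.unravel_index(np.argmax(change), change.shape)

# 5x5 magnetometer grid, each with a 3-axis flux reading
baseline = np.zeros((5, 5, 3))
reading = baseline.copy()
reading[3, 1] = [0.2, -0.1, 0.4]  # flux perturbed near a press at (3, 1)

where = localize_contact(baseline, reading)
```

Interpolating between neighboring magnetometers is how a real grid of sparse sensors achieves sub-grid spatial resolution.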

“A generalized tactile sensing skin like ReSkin will provide a source of rich contact data that could be helpful in advancing AI in a wide range of touch-based tasks including object classification, proprioception, and robotic grasping,” Gupta wrote in a recent FAIR blog. “AI models trained with learned tactile sensing skills will be capable of many types of tasks, including those that require higher sensitivity, such as working in health care settings, or greater dexterity, such as maneuvering small, soft, or sensitive objects.”

Despite being relatively inexpensive to produce — about $6 per unit when manufactured in batches of 100 — ReSkin is surprisingly durable. The 2-3mm-thick material lasts for up to 50,000 touches while generating high-frequency, 3-axis tactile signals, retaining a temporal resolution of up to 400Hz and a spatial resolution of 1mm with 90 percent accuracy. Once a swath of ReSkin reaches its usable limit, replacing “the skin is as easy as taking a bandaid off and putting a new bandaid on,” Gupta quipped.

FAIR

Given these properties, FAIR researchers foresee ReSkin being used in a number of applications, including in-hand manipulation (i.e., making sure that robot gripper doesn’t crush the egg it’s picking up); measuring tactile forces in the field, such as how much force a human hand exerts on the objects it manipulates; and contact localization, essentially teaching robots to recognize what they’re reaching for and how much pressure to apply once they touch it.

As with virtually all of its earlier research, FAIR has open-sourced DIGIT, TACTO, PyTouch and ReSkin in an effort to advance the state of tactile art across the entire field.

iRobot's poop-detecting Roomba j7+ vacuum is cheaper than ever right now

iRobot's latest Roomba that can detect obstacles — including pet poop — along its cleaning journey is cheaper than ever right now. Both Amazon and Wellbots have the Roomba j7 and j7+ for $150 less, so you can grab them for $499 and $699, respectively. Both robots are the same, but you'll get the clean base with the j7+ model, allowing you to set and forget the robot and only empty the clean base about once every 60 days.

Buy Roomba j7 at Amazon - $499
Buy Roomba j7+ at Amazon - $699
Buy Roomba j7 at Wellbots - $499
Buy Roomba j7+ at Wellbots - $699

The j7 series builds upon the Roomba i7 robots with a more powerful camera, better sensors and more processing power. The AI-driven computer vision technology allows the device to detect obstacles and move around them as it cleans, and you can label those obstacles as permanent (in the case of a chair or another piece of furniture) or temporary. Not only does this mean the j7 robots should better navigate around things like piles of clothes and charging cords, but they can also detect a robot vacuum's arch-nemesis: pet poop. iRobot even has a Pet Owner Official Promise (yes, P.O.O.P.), which states that you'll get a new robot vacuum if your j7+ runs into poop in the first year of ownership.

Aside from that, the j7 series takes advantage of iRobot's improved mobile app, which lets you schedule cleanings and set routine triggers. You can also label rooms in your home after the robot has created a map, so you can better direct it to a specific room when you only need a quick clean.

While the clean base included in the j7+ package isn't necessary, it takes the convenience level up a notch. Instead of emptying your robot's bin after every job, the j7+ will automatically empty its contents into the clean base when it's done. You then only have to worry about emptying the base once every two months.

Follow @EngadgetDeals on Twitter for the latest tech deals and buying advice.

Shark's self-emptying robot vacuum is nearly half off today

Robot vacuums can be expensive, and if you want one with a base that automatically collects dirt, it'll cost you even more. But now you can get one of Shark's most advanced models for the lowest price we've seen. The Shark IQ RV1001AE robot vacuum with clean base is 47 percent off, bringing it down to $319. That's more than $280 off its normal price and the lowest we've seen it — not to mention that also makes it even cheaper than most iRobot devices that come with clean bases, even when they've been on sale.

Buy Shark IQ RV1001AE at Amazon - $319

We've been fans of Shark's robot vacuums since one of its cheaper models made it into our budget robot vacuum guide. Originally priced at $600, the RV1001AE is one of Shark's higher-end robo-vacs, with powerful suction and a self-cleaning brush roll. The latter ensures that hair won't get tangled in the brush, making this vacuum a good pick for pet owners. It also uses Shark's IQ navigation to map out your home as it cleans, so you can tell it to clean only the living room or the kitchen from the mobile app. And if you don't want to bother with manual cleanings, you can set a schedule in the mobile app, too, and let the robot do all the work.

This robot vacuum also cleans itself — it'll automatically empty its bin into the clean base that comes with it. Yes, you'll have to empty the base once every month or so, but that's much more convenient than having to remember to dump out the robot's contents every time. Also, Shark's clean base is bagless, so you don't have to buy proprietary liners like you do with other robots that come with similar bases.

If you can live without the clean base, a few other Shark vacuums — both robot and not — are on sale right now, too. Of note is the Shark IQ AV970, which is down to $250. That's $150 off its normal price and close to its all-time low of $230. It has many of the same features as the RV1001AE, including a self-cleaning brush roll, IQ navigation and Alexa and Google Assistant voice commands. Aside from the clean base, you'll have to forgo home mapping with this model, but you'll get an extra-large dust bin, which allows the robot to suck up more dirt and debris before you need to empty it.

Buy Shark IQ AV970 at Amazon - $250


Egyptian authorities 'detain' robotic artist for 10 days over espionage fears

The robotic artist known as Ai-Da was scheduled to display her artwork alongside the great pyramids of Egypt on Thursday, though the show was nearly called off after both the robot and her human sculptor, Aidan Meller, were detained by Egyptian authorities for a week and a half until they could confirm that the artist wasn't actually a spy.

The incident began when border guards objected to Ai-Da's camera eyes, which she uses in her creative process, and her on-board modem. “I can ditch the modems, but I can’t really gouge her eyes out,” Meller told The Guardian. The robot artist, which was built in 2019, typically travels via specialized cargo case and was held at the border until clearing customs on Wednesday evening, hours before the exhibit was scheduled to begin.

“The British ambassador has been working through the night to get Ai-Da released, but we’re right up to the wire now,” Meller said, just before Ai-Da was sprung from robo-jail. “It’s really stressful.”

Ai-Da is slated to participate in the Forever is Now exhibit, which runs through November 7th and features a number of leading Egyptian and international artists. The show is being presented by Art D’Égypte in conjunction with the Egyptian ministry of antiquities and tourism and the Egyptian ministry of foreign affairs.

“She is an artist robot, let’s be really clear about this. She is not a spy," Meller declared. "People fear robots, I understand that. But the whole situation is ironic, because the goal of Ai-Da was to highlight and warn of the abuse of technological development, and she’s being held because she is technology. Ai-Da would appreciate that irony, I think.”

  

Raspberry Pi's Build HAT helps students build LEGO robots

Raspberry Pi has launched a new product that makes it easier to build robots out of LEGO components. The Build HAT (or Hardware Attached on Top), as it is called, is an add-on device that plugs into the Pi's 40-pin GPIO header. It was specifically designed to make it easy to use Pi hardware to control up to four LEGO Technic motors and sensors from the toy company's Education Spike kits. Those sets are meant as a STEAM (Science, Technology, Engineering, the Arts and Mathematics) learning tool for young students. The HAT also works with motors and sensors from the Mindstorms Robot Inventor kit.

In addition to the Build HAT itself, the company has created a Python library that can help students build prototypes using a Raspberry Pi and LEGO components. Plus, Raspberry Pi designed a $15 power supply for the HAT that can also power the motors and sensors attached to it. The Build HAT will set buyers back $25 each, and it works with all 40-pin GPIO Raspberry Pi boards, including the Raspberry Pi 4 and Raspberry Pi Zero. 
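
The Python library keeps the programming model simple. A minimal motor example might look like the sketch below; the import is guarded by a small stand-in class so the logic can run without a HAT attached, and exact behavior on real hardware may differ.

```python
# Minimal sketch of driving a LEGO Technic motor through the Build HAT with
# Raspberry Pi's `buildhat` Python library. The import falls back to a tiny
# stand-in so the control logic can be exercised on a machine without the
# HAT; real hardware behavior may differ.
try:
    from buildhat import Motor
except ImportError:
    class Motor:
        """Stand-in for buildhat.Motor when no HAT is present."""
        def __init__(self, port):
            self.port, self.position = port, 0

        def run_for_degrees(self, degrees, speed=50):
            self.position += degrees  # pretend the shaft turned

# Spin the motor on port A one full turn, as a robot wheel or joint might
motor = Motor('A')
motor.run_for_degrees(360)
```

Students wire the motor to one of the HAT's four ports, address it by port letter and call simple methods like this rather than dealing with GPIO registers directly.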

Those who want to make sure that their LEGO components will work with the HAT can also check out Raspberry Pi's handy list of compatible components. Finally, those who need a bit of help getting started can follow one of Pi's project guides, which include a DIY game controller, a robot buggy that can be controlled via Bluetooth and a robotic face.

Ghost Robotics strapped a gun to its robot dog

Boston Dynamics, the company most commonly associated with robot dogs, prohibits the weaponization of its Spot devices. That's not the case for all robot dog manufacturers, however. One of them, Ghost Robotics, showed off a version of its Q-UGV device that many will have been dreading. It's a robot dog with a gun attached to it.

Ghost Robotics has made robot dogs for the military, and it displayed this deadly model at the Association of the United States Army’s 2021 annual conference in Washington DC this week. A company called Sword International built the "special purpose unmanned rifle" (or SPUR) module. According to The Verge, it has a thermal camera for nighttime operation, an effective range of 1.2km (just under three quarters of a mile) and a 30x optical zoom.

Latest lethality 6.5 #creedmoor sniper payload from @SWORDINT. Check out the latest partner payloads @AUSAorg Wash DC. Keeping US and allied #sof #warfighter equipped with the latest innovations. @USSOCOM #defense #defence #NationalSecurity #drone #robotics pic.twitter.com/Dvk6OvL3Bu

— Ghost Robotics (@Ghost_Robotics) October 11, 2021

"Due to its highly capable sensors the SPUR can operate in a magnitude of conditions, both day and night," a blurb on Sword's website reads. "The Sword Defense Systems SPUR is the future of unmanned weapon systems, and that future is now."

It's unclear how autonomous a SPUR-equipped Q-UGV will be in the field, as Popular Science notes. It remains to be seen whether a human operator will guide the robot to an otherwise hard-to-reach position and manually aim and take shots (which seems more likely), or if the robot will handle things entirely by itself. Either way, it's an unsettling prospect, and that's before we get to the possibility of enemy hackers taking control of these machines.

As if a robot dog with a gun attached wasn't dystopian enough, Ghost Robotics tweeted about a Q-UGV with a different kind of payload: a Lockheed Martin drone and a Digital Force Technologies recon sensor. Sniper robot dogs. Flying robot spy dogs. The future's looking just peachy, isn't it?

Check it out... robot dog w/ wings... New payload with @LockheedMartin Indago #drone and Digital Force Technologies recon sensor for a broad range of #warfighter capabilities @ausaorg #ausa2021. #defense #defence #qugv #specialforces pic.twitter.com/AxuNs3r8PI

— Ghost Robotics (@Ghost_Robotics) October 13, 2021

Honda announces plans to build electric VTOLs and telepresence robots

Honda builds much more than cars and trucks — power equipment, solar cells, industrial robotics, alternative fuel engines and even aircraft are all part of the company's production capacity. On Thursday, Honda announced that it is working to further expand its manufacturing portfolio to include Avatar-style remote telepresence robots and electric VTOLs for inter- and intracity commutes before turning its ambitions to building a fuel-cell driven power generation system for the lunar surface. 

For its eVTOL, Honda plans to leverage not only the lithium battery technology it's developed for its EV and PHEV vehicles but also a gas turbine hybrid power unit to give the future aircraft enough range to handle regional inter-city flights as well. Honda foresees air taxis as a ubiquitous part of tomorrow's transportation landscape, seamlessly integrating with both autonomous ground vehicles and traditional airliners (though they could soon be flown by robots as well). Obviously, the program is still very much in the early research phase and will likely remain so until at least the second half of this decade. The company anticipates having prototype units available for testing and certification by the 2030s and a full commercial rollout sometime around 2040. 

Honda will have plenty of competition if and when it does get its eVTOLs off the ground. Cadillac showed off its single-seater aircar earlier this year, while Joby (in partnership with NASA) already has full-scale mockups flying. In June, Slovakian transportation startup Klein Vision flew from Nitra to the Bratislava airport in its inaugural inter-city flight — and then drove home after the event. But building a fleet of flying taxis is no easy feat — just ask Bell helicopters — and we're sure to see more companies drop out of the sector before eVTOLs become commonplace.

Carlo Allegri / reuters

Honda reps also discussed the company's future robotics aspirations during a media briefing on Wednesday. The company envisions a future where people are unencumbered by space and time, where telepresence robots have visual and tactile acuity rivalling that of humans. Rather than hopping on a plane to inspect remote factory floors or attend product demonstrations in person, tomorrow's workers may simply don VR headsets and step into the body of an on-site humanoid robot. 

The company announced that it wants its Avatar Robot — a newly refined iteration of the Asimo (above) — put into practical use in the 2030s and will conduct technology demonstration testing by the end of Q1, 2024 in order to meet that goal. But before that happens Honda reps noted that the company has work to do downsizing the robot's hand hardware and improving its grasping dexterity.

JAXA/Honda

Honda also has big plans for its space ventures including working on ways to adapt its existing fuel cell and high differential pressure water electrolysis technologies to work on the lunar surface as part of a circulative renewable energy system.

This system would use electricity gathered from renewable energy sources (like solar) to break the molecular bonds of liquid water, resulting in hydrogen and oxygen. Those two elements would then be run through Honda's fuel cell to generate electricity and supply the lunar habitats with oxygen for breathing and hydrogen for rocket fuel.
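
The chemistry here is ordinary water splitting (2 H2O → 2 H2 + O2). A back-of-the-envelope calculation (illustrative numbers only, not Honda's system design) shows how much fuel and breathable oxygen a given mass of water yields.

```python
# Back-of-the-envelope stoichiometry for the electrolysis step:
# 2 H2O -> 2 H2 + O2. Illustrative only, not Honda's actual system design.

M_H2O = 18.015  # molar mass of water, g/mol
M_H2 = 2.016    # molar mass of hydrogen gas, g/mol
M_O2 = 31.998   # molar mass of oxygen gas, g/mol

def electrolyze(water_grams):
    """Return (hydrogen_grams, oxygen_grams) from splitting the given water mass."""
    mol_water = water_grams / M_H2O
    h2 = mol_water * M_H2          # 1 mol H2 per mol H2O
    o2 = (mol_water / 2) * M_O2    # 1 mol O2 per 2 mol H2O
    return h2, o2

h2, o2 = electrolyze(1000.0)  # split 1 kg of water
```

Roughly 11 percent of the water's mass comes back as hydrogen fuel and 89 percent as oxygen, which is why water is such an attractive in-situ resource for lunar habitats.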

The company also hopes to utilize the more nimble Avatar hands it's developing as manipulators on a fleet of remote-controlled lunar rovers, which would perform tasks on the lunar surface rather than subject astronauts to the moon's many dangers. Honda has partnered with the Japan Aerospace Exploration Agency (JAXA) and began joint research into both of these systems in June.

Hyundai puts Boston Dynamics' Spot robot to work as a factory safety inspector

Boston Dynamics’ Spot has found itself a new job, and thankfully this time it doesn’t involve a potential battlefield role. Hyundai has started testing the robot at a Kia manufacturing plant in South Korea where it will be one of the tools the company uses to ensure the facility is safe for workers. The pilot represents the first public collaboration between the two companies since Hyundai acquired a majority stake in Boston Dynamics this past June.

You’ll notice the Spot featured in the video Hyundai released looks different from the robot we’ve seen in past clips. That’s because the automaker’s Robotics Lab outfitted it with what is essentially a backpack that features a host of enhancements, including a thermal camera, LiDAR and more powerful computing resources for handling additional AI tasks. The “AI Processing Service Unit” allows Spot to detect people, monitor temperatures and check for fire hazards. Additionally, a secure webpage allows factory personnel to monitor the robot remotely, and take over control if they want to inspect an area of the facility more closely.

According to Hyundai, the pilot will help it assess the effectiveness of Spot as a late-night security patrol robot before it goes on to deploy it at additional industrial sites. Automation, manufacturing and construction applications align with what the automaker said was its grand plan for Boston Dynamics when it bought the company.

Boston Dynamics' Spot robot has learned to replan its routes

Boston Dynamics' Spot dog is learning some new behaviors that will help the robot adapt to the real world. The company has delivered a Release 3.0 update that helps Spot do its jobs without human intervention. Most notably, it can dynamically replan routes — the robot's inspection will go smoothly even if someone inadvertently left a forklift in the way.

The upgraded Spot can also handle human-free scheduled missions, and it's smart enough to automatically plan routes when you choose the actions you want to perform. The robot will help you notice when something's amiss, too. It uses scene recognition to capture photos at the same angle every time, and human inspectors can conduct live reviews of changes Spot notices with computer vision, such as gauge readings and heat changes.

These updates won't mean much if you don't have the $74,500-plus to spend on a Spot of your own. They do show how Boston Dynamics' signature canine is evolving, though, and illustrate just how robots like this can help in real life — they're increasingly useful for tasks where it would be impractical (or just a hassle) for humans to step in.

iRobot's latest Roomba can detect pet poop (and if it fails, you'll get a new one)

Over the past two decades, iRobot has steadily evolved its Roombas from being fairly dumb robotic dirt suckers to devices that are smart enough to unload their own bins. Now with the $849 Roomba j7+, the company is ready to take on its greatest challenge yet: Pet poop. It's iRobot's first vacuum that can recognize and avoid obstacles, like cables or a pile of clothes, in real-time. And for pet owners, that could finally be reason to adopt a robot vacuum.

After all, you can't exactly trust your bot to clean up while you're away if they could run into surprises from your furry friends. That's a disaster that could lead to poop being spread around your home, not to mention gumming up your expensive bot. To alleviate that concern, iRobot is making a Pet Owner Official Promise (yes, P.O.O.P.): If your j7+ runs into poop within your first year of ownership, the company will replace your vacuum. That should go a long way towards making pet parents feel more comfortable with a Roomba. (It would be nice to see that offer extended beyond just one year, though.)

While the j7+ is technically the smartest Roomba yet, it's not the company's most powerful cleaner. That honor still belongs to the $1,299 s9+. This new model is basically the Roomba i7+ with a more powerful camera, better sensors and far more processing power. It can also automatically empty its bin into a redesigned Clean Base, which is shorter and sleeker than the previous models we've seen. Now you should be able to tuck it into an unobtrusive corner, or under a table, instead of dedicating floor space to a tall iRobot monolith.

With its "PrecisionVision Navigation" — iRobot's marketing term for AI-driven computer vision — the j7+ can detect specific objects, as well as alert you to obstacles in the iRobot app after a cleaning job. You can label them as permanent or temporary obstructions, which helps the vacuum learn how to deal with similar issues in the future. If there's a pile of cords that will always be in one corner, the j7+ will just stop cleaning around that area for good. But if it's just a headphone cord that you've dropped onto the floor, the robot can give that area another go on future jobs. And since it can actually see and interpret your rooms, the j7+ will also be able to clean more gently along walls and furniture.

As it's relying on computer vision, iRobot had to train new models to help the j7+ recognize objects from floor level (there aren't too many other devices with a camera down there). At this point, CEO Colin Angle tells us that it can recognize a pair of corded headphones on the ground, but eventually it'll handle shoes and socks as well. When it comes to recognizing pet poop, the company captured photos using playdough models, as well as images from employees, to build what's surely one of the most unique machine learning models around.

iRobot plans to bring the j7+'s sensors to future models, Angle says, but it wanted to introduce them in something more people could buy. As much as I like the pricey s9+, it's not a wise purchase when there are cheaper self-cleaning Roombas around.

iRobot

The j7+ is powered by iRobot's new Genius 3.0 software, which will also roll out to the rest of the company's connected vacuums. It builds on the features introduced last year — a better mobile app, smarter scheduling and routine triggers — by adding cleaning time estimates, as well as the ability to automatically clean while you're away. The new OS smarts will also let Roombas automatically suggest room labels as they map out your home. And if you send your intrepid bot to clean one room, it'll be able to move through your home quietly until it reaches the work zone.

While I haven't tried out the j7+ yet, it's clear that iRobot is targeting a persistent issue with robot vacuums: trust. Early Roombas required plenty of babysitting, otherwise they could easily get stuck or jammed. These days, I habitually clear out my floors before I start a vacuum run, because even newer models can get into trouble. If iRobot can actually develop a vacuum bot that can deal with obstacles on its own, it may finally have the ideal device for people who hate cleaning. At the very least, it'd be nice to have something I can trust to avoid my cat's poop.