Posts with «robots» label

Hitting the Books: The case against tomorrow's robots looking like people

Who wouldn't want an AI-driven robot sidekick: a little mechanical pal, trustworthy and supportive — the perfect teammate? But should such an automaton be invented, would it really be your teammate, an equal partner in your adventurous endeavors? Or would it simply be a tool, albeit a wildly advanced one measured against today's standards? In the excerpt below from Human-Centered AI, author and professor emeritus at the University of Maryland, Ben Shneiderman, examines the pitfalls of our innate desire to humanize the mechanical constructs we build and how we shortchange their continued development by doing so.

Oxford University Press

Excerpted from Human-Centered AI by Ben Shneiderman. Published by Oxford University Press. Copyright © 2021 by Ben Shneiderman. All rights reserved.


Teammates and Tele-bots

A common theme in designs for robots and advanced technologies is that human–human interaction is a good model for human–robot interaction, and that emotional attachment to embodied robots is an asset. Many designers never consider alternatives, believing that the way people communicate with each other, coordinate activities, and form teams is the only model for design. The repeated missteps stemming from this assumption do not deter others who believe that this time will be different, that the technology is now more advanced, and that their approach is novel.

Numerous psychological studies by Clifford Nass and his team at Stanford University showed that when computers are designed to be like humans, users respond and engage in socially appropriate ways. Nass’s fallacy might be described as this: since many people are willing to respond socially to robots, it is appropriate and desirable to design robots to be social or human-like.

However, what Nass and colleagues did not consider was whether other designs, which were not social or human-like, might lead to superior performance. Getting beyond the human teammate idea may increase the likelihood that designers will take advantage of unique computer features, including sophisticated algorithms, huge databases, superhuman sensors, information abundant displays, and powerful effectors. I was pleased to find that in later work with grad student Victoria Groom, Nass wrote: “Simply put, robots fail as teammates.” They elaborated: “Characterizing robots as teammates indicates that robots are capable of fulfilling a human role and encourages humans to treat robots as human teammates. When expectations go unmet, a negative response is unavoidable.”

Lionel Robert of the University of Michigan cautions that human-like robots can lead to three problems: mistaken usage based on emotional attachment to the systems, false expectations of robot responsibility, and incorrect beliefs about appropriate use of robots. Still, a majority of researchers believe that robot teammates and social robots are inevitable. That belief pervades the human–robot interaction research community which “rarely conceptualized robots as tools or infrastructure and has instead theorized robots predominantly as peers, communication partners or teammates.”

Psychologist Gary Klein and his colleagues clarify ten realistic challenges to making machines behave as effectively as human teammates. The challenges include making machines that are predictable, controllable, and able to negotiate with people about goals. The authors suggest that their challenges are meant to stimulate research and also “as cautionary tales about the ways that technology can disrupt rather than support coordination.” A perfect teammate, buddy, assistant, or sidekick sounds appealing, but can designers deliver on this image or will users be misled, deceived, and disappointed? Can users have the control inherent in a tele-bot while benefiting from the helpfulness suggested by the teammate metaphor?

My objection is that human teammates, partners, and collaborators are very different from computers. Instead of these terms, I prefer to use tele-bots to suggest human controlled devices. I believe that it is helpful to remember that “computers are not people and people are not computers.”


Margaret Boden, a long-term researcher on creativity and AI at the University of Sussex, makes an alternate but equally strong statement: “Robots are simply not people.” I think the differences between people and computers include the following:

Responsibility: Computers are not responsible participants, neither legally nor morally. They are never liable or accountable. They are a different category from humans. This continues to be true in all legal systems and I think it will remain so. Margaret Boden continues with a straightforward principle: “Humans, not robots, are responsible agents.” This principle is especially true in the military, where chain of command and responsibility are taken seriously. Pilots of advanced fighter jets with ample automation still think of themselves as in control of the plane and responsible for their successful missions, even though they must adhere to their commander’s orders and the rules of engagement. Astronauts rejected designs of early Mercury capsules which had no window to eyeball the re-entry if they had to do it manually — they wanted to be in control when necessary, yet responsive to Mission Control’s rules. Neil Armstrong landed the Lunar Module on the Moon—he was in charge, even though there was ample automation. The Lunar Module was not his partner. The Mars Rovers are not teammates; they are advanced automation with an excellent integration of human tele-operation with high levels of automatic operation.

It is instructive that the US Air Force shifted from using the term unmanned autonomous/aerial vehicles (UAVs) to remotely piloted vehicles (RPVs) so as to clarify responsibility. Many of these pilots work from a US Air Force base in Nevada to operate drones flying in distant locations on military missions that often have deadly outcomes. They are responsible for what they do and suffer psychological trauma akin to what happens to pilots flying aircraft in war zones. The Canadian Government has a rich set of knowledge requirements that candidates must have to be granted a license to operate a remotely piloted aircraft system (RPAS). Designers and marketers of commercial products and services recognize that they and their organizations are the responsible parties; they are morally accountable and legally liable. Commercial activity is further shaped by independent oversight mechanisms, such as government regulation, industry voluntary standards, and insurance requirements.

Distinctive capabilities: Computers have distinctive capabilities of sophisticated algorithms, huge databases, superhuman sensors, information-abundant displays, and powerful effectors. To buy into the metaphor of “teammate” seems to encourage designers to emulate human abilities rather than take advantage of the distinctive capabilities of computers. One robot rescue design team described their project to interpret the robot’s video images through natural language text messages to the operators. The messages described what the robot was “seeing” when a video or photo could deliver much more detailed information more rapidly. Why settle for human-like designs when designs that make full use of distinctive computer capabilities would be more effective?

Designers who pursue advanced technologies can find creative ways to empower people so that they are astonishingly more effective—that’s what familiar supertools have done: microscopes, telescopes, bulldozers, ships, and planes. Empowering people is what digital technologies have also done, through cameras, Google Maps, web search, and other widely used applications. Cameras, copy machines, cars, dishwashers, pacemakers, and heating, ventilation, and air conditioning systems (HVAC) are not usually described as teammates—they are supertools or active appliances that amplify, augment, empower, and enhance people.

Human creativity: The human operators are the creative force — for discovery, innovation, art, music, etc. Scientific papers are always authored by people, even when powerful computers, telescopes, and the Large Hadron Collider are used. Artworks and music compositions are credited to humans, even if rich composition technologies are heavily used. The human qualities such as passion, empathy, humility, and intuition that are often described in studies of creativity are not readily matched by computers. Another aspect of creativity is to give human users of computer systems the ability to fix, personalize, and extend the design for themselves or to provide feedback to developers for them to make improvements for all users. The continuous improvement of supertools, tele-bots, and other technologies depends on human input about problems and suggestions for new features. Those who promote the teammate metaphor are often led down the path of making human-like designs, which have a long history of appealing robots, but succeed only as entertainment, crash test dummies, and medical mannequins. I don’t think this will change. There are better designs than human-like rescue robots, bomb disposal devices, or pipe inspectors. In many cases four-wheeled or treaded vehicles are typical, usually tele-operated by a human controller.

The da Vinci surgical robot is not a teammate. It is a well-designed tele-bot that enables surgeons to perform precise actions in difficult-to-reach small body cavities (Figure 14.1, above). As Lewis Mumford reminds designers, successful technologies diverge from human forms. Intuitive Surgical, the developer of the da Vinci systems for cardiac, colorectal, urological, and other surgeries, makes clear that “Robots don’t perform surgery. Your surgeon performs surgery with Da Vinci by using instruments that he or she guides via a console.”

Many robotic devices have a high degree of tele-operation, in which an operator controls activities, even though there is a high degree of automation. For example, drones are tele-bots, even though they have the capacity to automatically hover or orbit at a fixed altitude, return to their take-off point, or follow a series of operator-chosen GPS waypoints. The NASA Mars Rover vehicles also have a rich mixture of tele-operated features and independent movement capabilities, guided by sensors to detect obstacles or precipices, with plans to avoid them. The control centers at NASA’s Jet Propulsion Labs have dozens of operators who control various systems on the Rovers, even when they are hundreds of millions of miles away. It is another excellent example of combining high levels of human control and high levels of automation.
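To make that division of labor concrete, here is a minimal sketch, not from the book and not any vendor's real autopilot API, of a tele-bot loop in which the operator supplies the goals (GPS waypoints, a home point) while the machine handles moment-to-moment execution. Every name and behavior in it is invented for illustration.

```python
# Illustrative sketch only: a tele-bot loop where a human operator supplies the
# goals (waypoints) and automation handles the low-level execution.
# All names and behaviors here are hypothetical, not any real autopilot API.

from dataclasses import dataclass


@dataclass
class Waypoint:
    x: float
    y: float


def obstacle_ahead(position, target):
    """Stand-in for onboard sensing (sonar/LiDAR); always reports clear here."""
    return False


def step_toward(position, target, speed=1.0):
    """Move a fixed distance toward the target; trivial kinematics for the sketch."""
    dx, dy = target.x - position.x, target.y - position.y
    dist = (dx ** 2 + dy ** 2) ** 0.5
    if dist <= speed:
        return Waypoint(target.x, target.y)
    return Waypoint(position.x + speed * dx / dist, position.y + speed * dy / dist)


def fly_mission(operator_waypoints, home):
    """The operator stays in charge of *what* to do; automation handles *how*."""
    position = home
    for wp in operator_waypoints:          # goals chosen by the human operator
        while (position.x, position.y) != (wp.x, wp.y):
            if obstacle_ahead(position, wp):
                continue                   # hold position (hover) until clear
            position = step_toward(position, wp)
    return step_toward(position, home, speed=1e9)  # automatic return-to-home


if __name__ == "__main__":
    mission = [Waypoint(0, 5), Waypoint(5, 5)]
    print(fly_mission(mission, home=Waypoint(0, 0)))
```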

Terms like tele-bots and telepresence suggest alternative design possibilities. These instruments enable remote operation and more careful control of devices, such as when tele-pathologists control a remote microscope to study tissue samples. Combined designs take limited, yet mature and proven features of teammate models and embed them in devices that augment humans by direct or tele-operated controls.

Another way that computers can be seen as teammates is by providing information from huge databases and superhuman sensors. When the results of sophisticated algorithms are displayed on information-abundant displays, such as in three-dimensional medical echocardiograms with false color to indicate blood flow volume, clinicians can be more confident in making cardiac treatment decisions. Similarly, users of Bloomberg Terminals for financial data see their computers as enabling them to make bolder choices in buying stocks or rebalancing mutual fund retirement portfolios (Figure 14.2, below). The Bloomberg Terminal uses a specialized keyboard and one or more large displays, with multiple windows typically arranged by users to be spatially stable so they know where to find what they need. With tiled, rather than overlapped, windows users can quickly find what they want without rearranging windows or scrolling. The voluminous data needed for a decision is easily visible and clicking in one window produces relevant information in other windows. More than 300,000 users pay $20,000 per year to have this supertool on their desks.


In summary, the persistence of the teammate metaphor shows that it holds appeal for many designers and users. While users should feel fine about describing their computers as teammates, designers who harness the distinctive features of computers, such as sophisticated algorithms, huge databases, superhuman sensors, information-abundant displays, and powerful effectors, may produce more effective tele-bots that are appreciated by users as supertools.

A burger-flipping robot may be coming to a White Castle near you

You can count burger-flipping robots as one pandemic innovation that's here to stay. White Castle announced today that it will be bringing Flippy 2, a robot chef that can essentially perform the same tasks as a team of fry cooks, to 100 more locations this year. This amounts to roughly a third of White Castle restaurants nationwide, so it's likely Flippy will become a permanent addition to the burger chain's workforce.

Last fall the burger chain first teamed up with Miso Robotics, the makers of Flippy, to launch a pilot program at one of its Chicagoland locations. The company then unveiled Flippy 2, the latest iteration of the chef robot, back in November. Now it appears that the robot chef is ready for prime time.

It's unlikely the average White Castle patron will notice the new robots when they first arrive. Since Flippy 2 is designed to perform "back-of-the-house" kitchen functions, that's where you'll find them. The robot uses AI to identify the type of food (such as burgers, chicken fingers, or fries), then picks it up, cooks it, places it in its own fry basket and finally moves it to a hot holding area.
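Based only on that public description, Flippy 2's cook cycle can be pictured as a short staged pipeline. The sketch below is a speculative illustration of that flow; the function names, food classes and cook times are invented and have nothing to do with Miso Robotics' actual software.

```python
# Hypothetical sketch of a Flippy-style cook cycle, based only on the public
# description: identify the food, pick it up, cook it, basket it, then move it
# to a hot holding area. Every name and number below is made up for illustration.

COOK_SECONDS = {"burger": 120, "chicken_finger": 210, "fries": 180}  # invented values


def identify(item_image):
    """Stand-in for the vision model; here we simply trust a label on the 'image'."""
    return item_image["label"]


def cook_cycle(item_image):
    """Return the ordered steps the robot would run for one item."""
    food = identify(item_image)
    if food not in COOK_SECONDS:
        raise ValueError(f"unknown food type: {food}")
    return [
        f"pick up {food}",
        f"cook {food} for {COOK_SECONDS[food]} s",
        f"place {food} in fry basket",
        f"move {food} to hot holding area",
    ]


if __name__ == "__main__":
    for step in cook_cycle({"label": "fries"}):
        print(step)
```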

But it’s possible your White Castle order may come out slightly faster than usual. Flippy 2 can dole out 60 baskets of food an hour, according to Miso Robotics’ website, or roughly 300 burgers a day. Unlike the earlier model, Flippy 2 can operate entirely without human intervention.

Miso Robotics is primarily crowdfunded; the company has raised more than $50 million from more than 15,000 investors and is currently in its Series E funding round.

It's no secret we're in the middle of an AI-enabled cooking revolution. Miso Robotics has come out with a wing-making version of Flippy, which Buffalo Wild Wings is currently testing, according to CNBC. Since robots minimize human contact in the kitchen, they're also used as a pandemic safety measure. Beijing is even using robot chefs to feed athletes and guests in its closed Winter Olympics bubble. It's likely we'll see more robotic innovations popping up in the restaurant industry, especially given recent labor shortages.

Coco's restaurant delivery bots are headed to more warm-weather cities

Coco, a company that offers food deliveries by remote-controlled robot, has expanded beyond its home base of Los Angeles for the first time. The service is now available in Austin as it commences a nationwide rollout. Coco plans to bring its robots to Dallas, Houston and Miami in the next few months.

The company says its service, which debuted in 2020, now has hundreds of delivery robots on the streets of LA, covering all of the city's major neighborhoods. Coco claims to reduce costs and deliver food to customers 30 percent faster than traditional methods, with an on-time delivery rate of 97 percent. It partnered with 10 Austin restaurants and chains at the outset, and will offer deliveries in the South Lamar, South Congress, South Austin, Downtown, Northside, North Loop and Domain neighborhoods from the jump.

Other robot delivery services — such as Yandex, Serve Robotics (a former division of Postmates) and Nuro — have adopted the self-driving approach. Coco's robots, on the other hand, are controlled by employees who work from home.

Robot performs complex 'keyhole' intestinal surgery on pigs without human aid

A robot has successfully performed "keyhole" intestinal surgery on pigs without any aid from humans, according to a study from Johns Hopkins University (published in Science Robotics). What's more, the Smart Tissue Autonomous Robot (STAR) handled the tricky procedure "significantly better" than human doctors. The breakthrough marks a significant step towards automated surgery that could one day help "democratize" patient care, the researchers said.

Laparoscopic or keyhole surgery requires surgeons to manipulate and stitch intestines and other organs through tiny incisions, a technique that requires high levels of skill and has little margin for error. The team chose to do "intestinal anastomosis" (joining two ends of an intestine), a particularly challenging keyhole procedure.  

Soft tissue surgery in general is hard for robots because the tissue moves and deforms unpredictably. To deal with that, the STAR robot was equipped with specialized suturing tools and state-of-the-art imaging systems that could deliver extremely accurate visualizations.

Johns Hopkins

Specifically, it had a "structural light–based three-dimensional endoscope and machine learning–based tracking algorithm" to guide the robot. "We believe an advanced three-dimensional machine vision system is essential in making intelligent surgical robots smarter and safer," said Johns Hopkins professor Jin Kang. On top of that, STAR is the first robotic system that can "plan, adapt and execute a surgical plan in soft tissue with minimal human intervention," said first author Hamed Saeidi. Using all that technology, the STAR robot successfully performed the procedure in four animals.
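The "plan, adapt and execute" phrasing suggests a closed loop in which the imaging system re-measures the tissue between stitches and the remaining plan is updated before each motion. The sketch below is only a schematic reading of that idea, not the STAR team's code; the tracking, planning and suturing functions are placeholders invented for illustration.

```python
# Schematic sketch of a plan-adapt-execute suturing loop, loosely following the
# published description of STAR (3D imaging guides the plan, which is adapted as
# tissue moves). All functions here are placeholders invented for illustration.

import random


def image_tissue():
    """Placeholder for the 3D endoscope + tracking: returns a small apparent tissue shift (mm)."""
    return random.uniform(-0.5, 0.5)


def plan_stitches(n_stitches):
    """Initial plan: evenly spaced stitch positions (mm) along the anastomosis line."""
    return [2.0 * i for i in range(n_stitches)]


def adapt(plan, observed_shift):
    """Shift every remaining stitch target by the tissue motion seen in imaging."""
    return [target + observed_shift for target in plan]


def execute_stitch(target):
    print(f"placing stitch at {target:.2f} mm")


def run_anastomosis(n_stitches=6):
    plan = plan_stitches(n_stitches)
    while plan:
        plan = adapt(plan, image_tissue())   # re-image and adapt before each stitch
        execute_stitch(plan.pop(0))          # execute the next planned stitch


if __name__ == "__main__":
    run_anastomosis()
```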

Laparoscopic surgery is minimally invasive compared to regular surgery, which helps ensure better patient outcomes. However, because it takes so long to master, there's a relatively small pool of doctors able to do it.

"Robotic anastomosis is one way to ensure that surgical tasks that require high precision and repeatability can be performed with more accuracy and precision in every patient independent of surgeon skill," said senior author Axel Krieger from John Hopkins. "We hypothesize that this will result in a democratized surgical approach to patient care with more predictable and consistent patient outcomes."

iRobot's Roomba 694 is back down to a record low of $179

A robot vacuum can help you stick to that New Year's resolution you made to keep your home a bit more tidy — and it helps that you don't have to spend a fortune to get one of these gadgets anymore. There are many more budget-friendly robot vacuums available today than there were even just a couple of years ago, and iRobot's Roomba 694 is one of the better ones we've tried. Normally priced at $274, the affordable robo-vac is even cheaper right now on Amazon where it's $95 off and down to $179. That's the same price we saw during the Black Friday shopping season last year, so if you missed the gadget when it was previously on sale, you have another chance to grab it now.

Buy Roomba 694 at Amazon - $179

This is one of iRobot's entry-level vacuums with a three-button design, mobile app connectivity and the ability to clean both hard and carpeted surfaces well. It earned a spot in our budget robot vacuum guide for those reasons — not only does it do a good job puttering around your home, sucking up dirt and debris along the way, but we also like that you can control it using the on-device buttons or the companion mobile app. iRobot's app is pretty straightforward, so even if you're a newbie to the world of autonomous cleaning robots, it shouldn't be difficult to figure out. The app also lets you set cleaning schedules, which tell the Roomba to automatically clean on certain days of the week and at specific times. The Roomba 694 is also compatible with Alexa and the Google Assistant, so you can control it with voice commands, too.

The Roomba 694 is a great option if you want to introduce a robo-vac into your home without dropping too much money. iRobot also has a number of more advanced machines if you're looking to invest in a vacuum with more power and smarts. The new Roomba j7+ is on sale for $599 right now, which is $250 off and the best price we've seen. It has 10x the suction power of the 694 plus PrecisionVision Navigation with obstacle avoidance, the latter of which is the reason why iRobot dubbed the machine its "poop-detecting" robot. It also comes with a clean base, into which the robo-vac automatically empties its bin after every job so you don't have to.

Similarly, the Roomba s9+ is also $250 off and down to $849. It has 40x the suction of a standard Roomba, a design that can more easily clean in room corners and an included clean base. While it's probably overkill for most people, it's the model to get if you want one of the highest-end robot vacuums around.

Buy Roomba j7+ at Amazon - $599
Buy Roomba s9+ at Amazon - $849

Follow @EngadgetDeals on Twitter for the latest tech deals and buying advice.

The Roomba j7+ poop-detecting robot vacuum is $250 off right now

If you made the resolution to tidy up more regularly in 2022, a robot vacuum can help with that. And for those that hate cleaning, investing in a robot vacuum with self-emptying functionality can make it so you rarely have to interact with the machine. Two of iRobot's higher-end models with clean bases are on sale at Wellbots right now when you use the code ENGADGET250 at checkout — both the Roomba s9+ and the Roomba j7+ will be $250 off, bringing them down to $850 and $600, respectively. Those are great deals, especially considering the prices are better than we saw during Cyber Monday at the end of last year.

Buy Roomba j7+ at Wellbots - $600
Buy Roomba s9+ at Wellbots - $850

The Roomba j7+ is the latest robo-vac from iRobot and it has new AI-driven computer vision technology that helps it detect objects and move around them as it cleans. It's thanks to this feature that the company calls the j7 a series of "poop-detecting" devices, because they should be able to successfully avoid a robot vacuum's arch nemesis — your pet's accidents. iRobot's Pet Owner Official Promise (or P.O.O.P. for short) ensures that you'll get a new vacuum if the robot fails to avoid a run-in with poop during your first year of ownership.

Otherwise, the j7 series sits right under the s9 series in iRobot's lineup, meaning it has a number of advanced features like 10x the suction power of a standard Roomba, dual multi-surface rubber brushes and Imprint Smart Mapping, the latter of which lets you direct the vacuum to clean only certain rooms. The "plus" part of the j7+ refers to the clean base, or an extended part of the dock into which the vacuum will empty its debris after every job. So instead of emptying the dustbin yourself after every cleaning, you'll only have to empty the base about once every two months. Combine that with the smart controls in the iRobot mobile app and you may only have to interact with the Roomba every so often — the app lets you do things like remotely start the device, set cleaning schedules and more.

The s9+ is the most advanced device that iRobot makes and it has a few differences from the j7 series. It has 40x the suction power of iRobot's standard series of vacuums and a design that helps it clean corners better. It also has a 3D sensor that helps it detect and clean around objects, although the technology is slightly different than that in the j7 series. Both are compatible with Amazon's Alexa and the Google Assistant, too, so you can control the robo-vacs with voice commands if you wish.

Follow @EngadgetDeals on Twitter for the latest tech deals and buying advice.

LG is bringing its CLOi service robot to the US

Don't be shocked if a robot serves you the next time you eat out or go on vacation. LG is bringing its CLOi ServeBot to the US, giving hotels, restaurants and stores a semi-autonomous machine that can ferry up to 66lbs of food and other cargo across a busy space. While humans have to pre-program maps and set destinations, the bot can use a 3D camera, LiDAR and sensors to dodge people and detect when someone has removed an item from a tray.

Like other CLOi robots, the ServeBot uses its 9.2-inch touchscreen to both put on a friendly face for guests and take input. It's not fast at 2.2 MPH, but its 11 hours of continuous use should be enough for a long workday.

This isn't the first LG robot to cross to the US. The Korean tech firm brought its disinfecting UV-C robot to the US in 2021. The timing might be apt, however. American companies are grappling with the combination of labor shortages (particularly in service roles) and an evolving pandemic that adds risk to waiting tables or helming a hotel's front desk. ServeBot won't completely replace human workers, but it might lessen the sting of staffing shortfalls and reduce exposure for workers who frequently have to deal with the public.

Watch Hyundai's CES 2022 robot show in under 6 minutes

Many companies at CES 2022 have been focused on products you can find on shelves, but Hyundai came to the show with nothing less than a grand vision of the future. The company used its presentation to outline a "metamobility" strategy where robots augment humanity's capabilities — to the point where you could even reconfigure whole rooms, or use a robot as a stand-in while you navigate the metaverse at home.

Boston Dynamics' robots also played a large part in the event, and Hyundai was keen to discuss everything from exoskeletons through to digital twins for machinery. It's a lot to take in, we know. Thankfully, you can learn about those and more through our six-minute supercut.

Follow all of the latest news from CES 2022 right here!

Yukai Engineering's cute stuffed animal robot will nibble on your finger

It wouldn’t be CES season without at least a couple of offbeat robots showing up. Yukai Engineering, the maker of the Qoobo robotic cat tail pillow, has revealed a soft robot that nibbles on a user’s fingertip. The company hopes the "somewhat pleasing sensation" will brighten up your day.

Amagami Ham Ham has an algorithm called a “Hamgorithm” that selects one of two dozen nibbling patterns, so you’ll never be sure exactly what you’ll feel when you shove your digit into the robot’s maw. Yukai designed the patterns — which include Tasting Ham, Massaging Ham and Suction Ham — to replicate the feeling of a baby or pet nibbling on one’s finger.

Yukai Corporation

“Amagami” means “soft biting” and “ham” means “bite” in Japanese. Yukai based the look of the robot on a character from Liv Heart Corporation’s Nemu Nemu stuffed animal series. There’ll be a couple of finger-munching models to choose from: Yuzu (Calico Cat) and Kotaro (Shiba Inu).

“Most people like the nibbling sensation but know they need to teach their children or pets to stop it, because kids and animals will otherwise bite them with full force eventually," said Yukai Engineering CMO Tsubasa Tominaga, who invented the robot at a hackathon earlier this year. "Amagami Ham Ham is a robot that frees humankind from the conundrum of whether ‘to pursue or not to pursue’ the forbidden pleasure.”

Pricing hasn't been determined, but Yukai and Liv Heart plan to run a crowdfunding campaign in the spring. In the meantime, those braving CES can check out Amagami Ham Ham at the show, and perhaps leave Yukai's booth with a slightly more tender finger.

Among the other devices Yukai will show off at CES is Bocco Emo. The company has updated the original Bocco robot to act as a smart medical device. Yukai says hospitals in Japan are using it to monitor patients' vitals (via connected sensors like pulse oximeters and thermometers) and notify nurses about a patient's condition.

During a pilot period, Bocco Emo was used to inform patients' families about how they're doing. It can also communicate with patients using sound effects, facial expressions and gestures while they wait for a nurse to arrive.

Hyundai's MobED robot can elevate its wheels to navigate uneven surfaces

Spot and the rest of the Boston Dynamics family may get all the attention, but Hyundai has a robotics division separate from the firm it acquired earlier in the year. This week, the automaker unveiled the Mobile Eccentric Droid or MobED, a new mobility platform for taking on even the trickiest surfaces and uneven ground.

The company’s Robotics Lab says it developed MobED to overcome the limitations of existing indoor and service robots. Each of the robot’s wheels features independent power and steering control systems that allow it to rotate in place and move in any direction. It also has an eccentric drive system, hence the name, that allows it to independently adjust the height of each of its wheels. As you can see from the video, that means it can provide a stable platform for something as delicate as glass.  
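One way to picture the eccentric drive is as four independently adjustable wheel heights that keep the deck level as the ground tilts. The toy calculation below illustrates that idea under made-up dimensions; it is not based on Hyundai's actual control software.

```python
# Toy illustration of the body-leveling idea behind an eccentric drive: raise or
# lower each wheel so the deck stays level on tilted ground. The geometry and
# numbers are invented for the example and are not Hyundai's actual design.

import math


def wheel_height_offsets(roll_deg, pitch_deg, half_track_m, half_wheelbase_m):
    """Height each wheel must add (+) or remove (-) to level the deck, in metres."""
    roll, pitch = math.radians(roll_deg), math.radians(pitch_deg)
    offsets = {}
    for name, (lat, lon) in {
        "front_left":  (+half_track_m, +half_wheelbase_m),
        "front_right": (-half_track_m, +half_wheelbase_m),
        "rear_left":   (+half_track_m, -half_wheelbase_m),
        "rear_right":  (-half_track_m, -half_wheelbase_m),
    }.items():
        # Ground height under this wheel relative to the body centre; the wheel
        # extends by the opposite amount to cancel it out.
        ground = lat * math.tan(roll) + lon * math.tan(pitch)
        offsets[name] = -ground
    return offsets


if __name__ == "__main__":
    # Roughly MobED-sized footprint (~0.6 m x 0.65 m) on a gentle 5-degree side slope.
    for wheel, dz in wheel_height_offsets(5, 0, 0.30, 0.33).items():
        print(f"{wheel}: {dz:+.3f} m")
```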

MobED can expand its wheelbase up to about 25 inches when it needs as much stability as possible. It can also contract to about 17 inches when it finds itself in more complex environments. All told, the platform is approximately 26 inches long, 23 inches wide and 13 inches tall. A 2 kWh battery allows the robot to drive for approximately four hours on a single charge.

Outside of working as a service robot, Hyundai envisions MobED helping out in places like the movie industry where film crews could mount their equipment to the platform. It could also be used for deliveries and other purposes where stability is essential. We’ll get a chance to see more of the robot when Hyundai demos it at CES 2022 next month. However, the company hasn't said if it plans to commercialize MobED.