Posts with «augmented reality» label
As robotics advances, the future could well involve humans and automated elements working together as a team. The question then becomes: how do you design such an interaction? A team of researchers from Purdue University attempts to provide an answer with their GhostAR system.
The setup records human movements for playback later in augmented reality, while a robotic partner is programmed to work around a “ghost” avatar. This enables a user to plan out how to collaborate with the robot and work out kinks before actually performing a task.
GhostAR’s hardware includes an Oculus Rift headset and IR LED tracking, along with actual robots used in development. Simulation hardware consists of a six-axis Tinkerkit Braccio robot, as well as an Arduino-controlled omni-wheel base that can mount either a robot arm or a camera as needed.
More information on the project can be found in the team’s research paper.
With GhostAR, whatever plan a user makes with the ghost form of the robot while wearing an augmented reality head mount is communicated to the real robot through a cloud connection – allowing both the user and the robot to know what the other is doing as they perform a task.
The system also allows the user to plan a task directly in time and space, without any programming knowledge.
First, the user acts out the human part of the task to be completed with a robot. The system then captures the human’s behavior and displays it to the user as an avatar ghost, representing the user’s presence in time and space.
Using the human ghost as a time-space reference, the user programs the robot via its own ghost to match up with the human’s role. The user and robot then perform the task as their ghosts did.
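The record-then-replay workflow above can be sketched in a few lines. This is a minimal illustration, not Purdue's actual implementation; the pose format and class names are invented for the example:

```python
from dataclasses import dataclass

@dataclass
class Pose:
    t: float  # seconds since recording started
    x: float  # position in meters
    y: float
    z: float

class GhostRecorder:
    """Records a sequence of timestamped poses, then serves them back
    as a 'ghost' reference track for programming the robot's role."""

    def __init__(self):
        self.track = []

    def record(self, t, x, y, z):
        self.track.append(Pose(t, x, y, z))

    def pose_at(self, t):
        """Return the last recorded pose at or before time t
        (a real system would interpolate between samples)."""
        best = self.track[0]
        for p in self.track:
            if p.t <= t:
                best = p
        return best
```

The robot's ghost could then be keyed off the same timeline, so both tracks stay synchronized when the task is performed for real.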
Those familiar with the Dragon Ball Z franchise will recognize the head-mounted Scouter computer often seen adorning character faces. As part of his Goku costume, Marcin Poblocki made an impressive replica, featuring a see-through lens that shows the “strength” of the person he’s looking at, based on a distance measurement taken using a VL53L0X sensor.
An Arduino Nano provides processing power for the headset, and light from a small OLED display is reflected on the lens for AR-style viewing.
It’s not exactly a perfect copy, but it is a working device. Inspired by Google’s virtual glasses, I made a virtual distance sensor.
I used an Arduino Nano, an OLED screen, and a laser distance sensor. The laser sensor takes readings (not calibrated yet) and displays the number on the OLED screen. A Perspex mirror reflects the image (at 45 degrees) onto the lens (taken from cheap Google Cardboard virtual glasses), and it’s then projected onto a clear Perspex screen.
So you will still see everything, but in the clear Perspex you will also see the distance to the object you’re looking at. On the OLED screen I typed ‘Power’ instead of distance, because that’s what this device is supposed to measure in DBZ.
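To give a flavor of the quoted logic, here is a hedged sketch of how a range reading might be turned into a “power level” for the display. The mapping and the 9000 cap are invented for this example (a nod to the meme); the actual device simply shows the raw VL53L0X reading:

```python
def power_from_distance(distance_mm, max_range_mm=2000):
    """Map a VL53L0X-style range reading (millimeters) to a playful
    'power level': closer targets read as stronger. The VL53L0X
    typically ranges up to about 2 meters, so out-of-range readings
    are clamped before mapping."""
    d = min(max(distance_mm, 0), max_range_mm)
    return int(9000 * (1 - d / max_range_mm))
```

On the Arduino side the equivalent arithmetic would run in `loop()` between reading the sensor and refreshing the OLED.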
Print files as well as code and the circuit diagram needed to hook this head-mounted device up are available on Thingiverse. For those that don’t have a DBZ costume in their immediate future, the concept could be expanded to a wide variety of other sci-fi and real world applications.
Chances are good that a fair number of us have been roped into “one of those” projects before. You know the type: vague specs, limited budget, and of course they need it yesterday. But you know 3D-printers and Raspberduinos and whatnot; surely you can wizard something together quickly. Pretty please?
He might not have been quite that constrained, but when [Sean Hodgins] got tapped to help a friend out with an unusual project, rapid prototyping skills helped him create this GPS-enabled faux-walkie talkie audio player. It’s an unusual device with an unusual purpose: a comedic walking tour of Vancouver “haunted houses” where his friend’s funny ghost stories are prompted by location. The hardware to support this is based around [Sean]’s useful HCC module, an Arduino-compatible development board. With a GPS module for localization and a VS1053 codec, SD card reader, and a small power amp for the audio end, the device can recognize when the user is within 50 meters of a location and play the right audio clip. The housing is a 3D-printed replica of an old toy walkie-talkie, complete with non-functional rubber ducky antenna.
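The location trigger described above boils down to a geofence check: compute the great-circle distance between the current GPS fix and each stop, and fire the clip within 50 meters. A minimal sketch, with hypothetical coordinates and clip names (the real tour stops aren't listed in the post):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS fixes."""
    R = 6371000.0  # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

STOPS = [
    # (lat, lon, clip) -- hypothetical downtown-Vancouver coordinates
    (49.2827, -123.1207, "story_01.mp3"),
]

def clip_for_fix(lat, lon, radius_m=50):
    """Return the audio clip for the first stop within radius_m, or None."""
    for slat, slon, clip in STOPS:
        if haversine_m(lat, lon, slat, slon) <= radius_m:
            return clip
    return None
```

A real build would also debounce the trigger so the same clip doesn't restart on every GPS update while the walker lingers at a stop.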
[Sean]’s build looks great and does the job, although we don’t get to hear any of the funny stuff in the video below; guess we’ll have to head up to BC for that. That it only took two weeks start to finish is impressive, but watch out – once they know you’re a wizard, they’ll keep coming back.
The Unity engine has been around since Apple started using Intel chips, and has made quite a splash in the gaming world. Unity allows developers to create 2D and 3D games, but there are some other interesting applications of this gaming engine as well. For example, [matthewhallberg] used it to build a robot that can map rooms in 3D.
The impetus for this project was a robotics company that used a series of robots around their business. The robots navigate using computer vision, but couldn’t map the rooms from scratch. They hired [matthewhallberg] to tackle this problem, and this robot is a preliminary result. Using the Unity engine and an iPhone, the robot can perform in one of three modes. The first is a user-controlled mode, the second is object following, and the third is 3D mapping.
The robot seems fairly easy to construct and carries only an iPhone, a NodeMCU, some motors, and a battery. Most of the computational work is done remotely, with the robot simply receiving its movement commands from another computer. There’s a lot going on software-wise, with plenty of toolkits and packages to install and get talking to one another, but the video below does a good job of showing what you’ll need and how it all fits together. If that’s all too much, there are other robots that can get you started in the world of computer vision and mapping.
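Since the robot only receives movement commands from an off-board computer, the on-robot side can stay tiny. The wire format below is purely hypothetical, just to illustrate the split between remote brains and simple motor commands:

```python
def parse_command(packet: str):
    """Parse a minimal 'VERB [speed]' motor command of the kind an
    off-board computer might send to the NodeMCU over Wi-Fi.
    This protocol is invented for illustration."""
    parts = packet.strip().split()
    verb = parts[0].upper()
    if verb in ("FORWARD", "BACK", "LEFT", "RIGHT"):
        # default to half speed; clamp to an 8-bit PWM range
        speed = int(parts[1]) if len(parts) > 1 else 128
        return (verb, max(0, min(speed, 255)))
    if verb == "STOP":
        return ("STOP", 0)
    raise ValueError(f"unknown command: {packet!r}")
```

The mapping-side computer would emit these commands based on whichever of the three modes (manual, following, or mapping) is active.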
As part of a recent Microsoft HoloLens hackathon in San Francisco, Maker Ian Sterling developed a new app that interacts with your smart home via augmented reality. The proof of concept, dubbed “IoTxMR,” allows a user to simply glance at a gadget and control it through gestures.
As you can see in the video below, IoTxMR enables Sterling to connect various Android and Arduino-based devices with the HoloLens to create a customized interdependent network. It also features a mixed reality experience called “virtual zen mode,” complete with calming sounds and light orbs in his surrounding environment.
During a recent interview with Digital Trends, Sterling revealed:
The primary goal of the app is to provide a 3D spatial UI for cross-platform devices — Android Music Player app and Arduino-controlled fan and light — and to interact with them using gaze and gesture control.
The connectivity between Arduino and a mixed reality device is something which holds a huge amount of creative opportunity for developers to create some very exciting applications — be it [Internet of Things], robotics, or other sensor data visualization. Besides this, our app features some fun ways to connect devices. Our demo featured a connection between a music player and a light in order to set a certain mood in your home.
Although just a demo, IoTxMR does highlight the endless possibilities that AR platforms like HoloLens offer in the not-too-distant future.
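Under the hood, an app like this ultimately maps a gaze-selected device to a network message. The registry, hostnames, and payload below are invented for illustration; the actual IoTxMR transport isn't documented in the post:

```python
import json

DEVICES = {
    # Hypothetical registry mapping gaze-selectable anchors to the
    # Arduino-side endpoints that listen for them.
    "fan":   {"host": "192.168.1.40", "state": False},
    "light": {"host": "192.168.1.41", "state": False},
}

def toggle(device_id):
    """Flip a device's state and return the JSON payload that would be
    sent to its listener (the network call itself is omitted here)."""
    dev = DEVICES[device_id]
    dev["state"] = not dev["state"]
    return json.dumps({"id": device_id, "on": dev["state"]})
```

Linking two devices, as in the music-player-plus-light demo, then amounts to forwarding one device's state change as a toggle on the other.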
Arduino user Jubeso submitted an Instructable to our blog explaining, in 10 steps, how to build an input device for gaming.
The Gravity Touch Bluetooth glove is specifically designed to interact with augmented reality glasses like Google Glass, Meta, and the Moverio BT, or with VR headsets like the Oculus Rift, Samsung Gear VR, vrAse, and Durovis Dive:
These new products are amazing, and they need new types of input devices. This Instructable will describe how to build your own “Gravity Touch Bluetooth glove,” and I will also give you some tips on building your own Durovis Dive VR headset so that you can enjoy full mobile VR. Because this glove will be of most use for VR games, I have created a Unity3D plugin for Android that handles the communication between your app and the glove. This means you will be able to use your Gravity Touch glove to interact with your Unity3D VR game.
The Arduino code and the Java class I wrote to handle the communication between the glove and the Android device will also be available, so that you will be able to adapt them to your needs.
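Glove-to-phone links like this typically stream small serial frames from the Arduino over Bluetooth. The frame format below is an assumption made for illustration (the Instructable defines its own protocol), showing how orientation and per-finger touch data might be unpacked on the receiving side:

```python
def parse_glove_frame(line: str):
    """Parse one serial frame from the glove. Assumed format
    (hypothetical): 'yaw,pitch,roll,touch_bitmask', where the
    bitmask packs five finger-contact flags into one integer."""
    yaw, pitch, roll, touch = line.strip().split(",")
    return {
        "yaw": float(yaw),
        "pitch": float(pitch),
        "roll": float(roll),
        "fingers": [bool(int(touch) >> i & 1) for i in range(5)],
    }
```

A Unity3D plugin would perform the same unpacking in C#, feeding the orientation into the camera or a virtual hand and the finger flags into game input events.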
Sphero's hooked up with a new whip, albeit a retro-fitted one. Skylar, a Junior Developer at Orbotix, modded an old RC car with an Arduino board, H-bridge and a few trackball parts, enabling the remote control ball to serve as its brain. Just in time too -- there's only so much fun you can have getting the little orb stuck behind the filing cabinets. Still, it's certainly a leap beyond purposing it to pull an iPhone-toting chariot.
Sean Buckley contributed to this post.
Sphero goes modular, spins out for a drive (video) originally appeared on Engadget on Mon, 19 Mar 2012 02:36:00 EST.