Posts with «vr» label

ElastImpact brings a bit more realism to VR

If you’ve ever used a VR system and thought that what was really missing was the feeling of being hit in the face, then a team of researchers at National Taiwan University may have just the solution.

ElastImpact takes the form of a head-mounted display with two impact drivers mounted on either side of the head, roughly level with the eyes, for normal (straight-on) impacts, and a third that rotates around the front of the face for side blows.

Each impact driver first stretches an elastic band using a gearmotor, then releases it with a micro servo when an impact is required. The system is controlled by an Arduino Mega, along with a pair of TB6612FNG motor drivers. 

Impact is a common effect in both daily life and virtual reality (VR) experiences, e.g., being punched, hit or bumped. Impact force is instantly produced, which is distinct from other force feedback, e.g., push and pull. We propose ElastImpact to provide 2.5D instant impact on a head-mounted display (HMD) for realistic and versatile VR experiences. ElastImpact consists of three impact devices, also called impactors. Each impactor blocks an elastic band with a mechanical brake using a servo motor and extending it using a DC motor to store the impact power. When releasing the brake, it provides impact instantly. Two impactors are affixed on both sides of the head and connected with the HMD to provide the normal direction impact toward the face (i.e., 0.5D in z-axis). The other impactor is connected with a proxy collider in a barrel in front of the HMD and rotated by a DC motor in the tangential plane of the face to provide 2D impact (i.e., xy-plane). By performing a just-noticeable difference (JND) study, we realize users’ impact force perception distinguishability on the heads in the normal direction and tangential plane, separately. Based on the results, we combine normal and tangential impact as 2.5D impact, and performed a VR experience study to verify that the proposed 2.5D impact significantly enhances realism.
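The paper stops at the hardware description, but the control idea is simple enough to sketch: wind the gearmotor to stretch the band while the servo brake holds it, then flick the brake open when the game calls for a hit. Something along these lines would do it on a Mega with a TB6612FNG, with the caveat that the pins, angles, and serial commands below are our own placeholders, not the team's firmware.

// Minimal single-impactor sketch (illustrative only; pins, angles and timings
// are assumptions, not the ElastImpact team's actual firmware).
#include <Servo.h>

const int AIN1 = 7;      // TB6612FNG direction pin 1
const int AIN2 = 8;      // TB6612FNG direction pin 2
const int PWMA = 9;      // TB6612FNG PWM (speed) pin
const int STBY = 10;     // TB6612FNG standby pin
const int BRAKE_PIN = 6; // micro servo acting as the mechanical brake

const int BRAKE_LOCKED = 20;    // servo angle that blocks the elastic band
const int BRAKE_RELEASED = 90;  // servo angle that lets it snap forward

Servo brake;

void setup() {
  pinMode(AIN1, OUTPUT);
  pinMode(AIN2, OUTPUT);
  pinMode(PWMA, OUTPUT);
  pinMode(STBY, OUTPUT);
  digitalWrite(STBY, HIGH);
  brake.attach(BRAKE_PIN);
  brake.write(BRAKE_LOCKED);
  Serial.begin(115200);
}

// Wind the gearmotor to stretch the elastic band while the brake holds it.
void chargeImpactor(int ms) {
  digitalWrite(AIN1, HIGH);
  digitalWrite(AIN2, LOW);
  analogWrite(PWMA, 255);
  delay(ms);
  analogWrite(PWMA, 0);
}

// Release the brake so the stored elastic energy delivers the impact.
void fireImpactor() {
  brake.write(BRAKE_RELEASED);
  delay(100);
  brake.write(BRAKE_LOCKED);
}

void loop() {
  // A host (e.g. the VR application) could send 'c' to charge and 'f' to fire.
  if (Serial.available()) {
    char cmd = Serial.read();
    if (cmd == 'c') chargeImpactor(1500);
    if (cmd == 'f') fireImpactor();
  }
}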

VR boxing robot actually punches back

VR environments are meant to be immersive, but if you’ve ever thought that what was missing was actually being pummeled by robotic fists, then James Bruton’s newest project could be just the thing.

Bruton recently teamed up with students from Portsmouth University to build a robot that operates in the real world and coordinates its movements with a virtual setting displayed on its human opponent’s headset.

The robot itself is controlled by an Arduino Mega, and features a differential (tank) drive with encoders for feedback. Shoulders can tilt from left to right, and the actual punching motion is handled by pneumatic actuators built from modified bicycle pumps. Robo-fists are covered by boxing gloves to keep humans relatively safe, and flesh-based competitors are given a small shield and sword-bat with which to fight back!

I worked on this project with final year degree students in Computer Games Technology at Portsmouth University CCI faculty. The robot hardware is controlled over a serial interface, and the team built a VR game which controls the robot, so when you get hit in VR you get hit in real life! The robot is tracked back into VR with Vive trackers so it stays in sync.
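The serial interface Bruton mentions is the interesting bit from a software point of view: the VR game only has to send short commands, and the Mega does the rest. The real firmware is in his repository; the command letters, pins, and valve timing below are just a hypothetical sketch of that link.

// Hypothetical serial command handler for the boxing robot's Arduino Mega.
// Command letters, pins, and the pneumatic valve wiring are assumptions for
// illustration; the real firmware lives in James Bruton's repository.

const int LEFT_PWM = 5;     // left drive motor PWM
const int RIGHT_PWM = 6;    // right drive motor PWM
const int LEFT_VALVE = 22;  // solenoid valve for the left pneumatic "fist"
const int RIGHT_VALVE = 23; // solenoid valve for the right pneumatic "fist"

void setup() {
  pinMode(LEFT_VALVE, OUTPUT);
  pinMode(RIGHT_VALVE, OUTPUT);
  Serial.begin(115200);  // link to the PC running the VR game
}

void punch(int valvePin) {
  digitalWrite(valvePin, HIGH);  // extend the bicycle-pump actuator
  delay(150);
  digitalWrite(valvePin, LOW);   // retract
}

void loop() {
  if (!Serial.available()) return;
  switch (Serial.read()) {
    case 'F': analogWrite(LEFT_PWM, 180); analogWrite(RIGHT_PWM, 180); break; // drive forward
    case 'S': analogWrite(LEFT_PWM, 0);   analogWrite(RIGHT_PWM, 0);   break; // stop
    case 'L': punch(LEFT_VALVE);  break;  // left punch
    case 'R': punch(RIGHT_VALVE); break;  // right punch
  }
}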

Arduino Blog 15 May 23:25

FacePush adds extra realism to your VR experience

Haptic feedback is something commonly used with handheld controllers and the like. However, in a virtual reality environment, it could also be used with the other interface surface attached to your body: the VR headset itself.

That’s the idea behind FacePush, which employs an Arduino Uno-powered pulley system to place tension on the straps of an HTC Vive headset. The wearer feels a corresponding pushing force through the headset, creating yet another way to help immerse users in a virtual world.
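The firmware side of a rig like this is minimal: the VR application reports how hard the push should be, and the Uno winds the pulley accordingly. A rough sketch of that loop, with made-up pins and a made-up one-byte protocol rather than anything from the paper:

// Rough FacePush-style strap tensioner (illustrative; pins and protocol are
// assumptions, not taken from the paper).

const int MOTOR_PWM = 9;   // PWM pin to the pulley motor driver
const int MOTOR_DIR = 8;   // direction pin: HIGH = wind (tighten straps)

void setup() {
  pinMode(MOTOR_PWM, OUTPUT);
  pinMode(MOTOR_DIR, OUTPUT);
  Serial.begin(115200);  // the VR application sends a force value 0-255
}

void loop() {
  if (Serial.available()) {
    int force = Serial.read();          // 0 = slack, 255 = maximum pull
    digitalWrite(MOTOR_DIR, force > 0); // wind when force is requested, unwind otherwise
    analogWrite(MOTOR_PWM, force);      // stronger pull = more perceived push on the face
  }
}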

Applications tried so far include a boxing game, a dive simulator, and 360-degree guidance. You can check it out in a short demo below, and read more about it in the full research paper here.

Computer gesture control via webcam and Arduino

While touchscreens are nice, wouldn’t it be even better if you could simply wave your hand at your computer to get it to do what you want? That’s the idea behind this Iron Man-inspired gesture control device by B. Aswinth Raj.

The DIY system uses an Arduino Nano mounted to a disposable glove, along with Hall effect sensors, a magnet attached to the thumb, and a Bluetooth module. This smart glove uses the finger-mounted sensors as left and right mouse buttons, and has a blue circle in the middle of the palm that the computer can track via a webcam and a Processing sketch to generate a cursor position.
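The write-up linked below has Raj's actual code; as a flavour of the glove side, here's a stripped-down sketch that reads two digital-output Hall effect switches and reports clicks over a serial Bluetooth module. The pins and the single-character protocol are placeholders.

// Stripped-down glove sketch: two digital-output Hall effect sensors act as
// mouse buttons, and clicks are sent to the PC over a serial Bluetooth module.
// Pin choices and the single-character protocol are illustrative, not the
// project's actual code (see the linked write-up for that).
#include <SoftwareSerial.h>

const int LEFT_HALL = 2;    // Hall sensor under the index finger
const int RIGHT_HALL = 3;   // Hall sensor under the middle finger

SoftwareSerial bt(10, 11);  // RX, TX to an HC-05-style Bluetooth module

void setup() {
  pinMode(LEFT_HALL, INPUT_PULLUP);
  pinMode(RIGHT_HALL, INPUT_PULLUP);
  bt.begin(9600);
}

void loop() {
  // Most digital Hall switches pull their output LOW when the thumb magnet is near.
  if (digitalRead(LEFT_HALL) == LOW)  { bt.write('L'); delay(200); }  // left click
  if (digitalRead(RIGHT_HALL) == LOW) { bt.write('R'); delay(200); }  // right click
}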

You can see it demonstrated in the video below, drawing a stick man literally by hand, and also controlling an LED on the Nano. Check out this write-up for code and more info on the build!

Arduino Blog 31 May 21:41

Revealed: Homebrew Controller Working in Steam VR

[Florian] has been putting a lot of work into VR controllers that can be used without interfering with a regular mouse + keyboard combination, and his most recent work has opened the door to successfully emulating a Vive VR controller in Steam VR. He uses custom Arduino-based hardware on the hand and a Leap Motion controller, fusing the data in software.

We’ve seen [Florian]’s work before in successfully combining a Leap Motion with additional hardware sensors. The idea is to compensate for the fact that the Leap Motion sensor is not very good at detecting some types of movement, such as tilting a fist towards or away from yourself — a movement similar to aiming a gun up or down. At the same time, an important goal is for any added hardware to leave fingers and hands free.
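The repository mentioned below has the real firmware and FreePIE glue; the gist of the added hardware, though, is an orientation sensor on the hand streaming the data the Leap Motion can't see. A bare-bones illustration of that idea, assuming an MPU-6050-style IMU on the I2C bus (this is not [Florian]'s code):

// Bare-bones hand-orientation streamer: reads raw accel/gyro values from an
// MPU-6050 over I2C and prints them for a host-side script to fuse with the
// Leap Motion data. An illustration of the idea, not [Florian]'s firmware.
#include <Wire.h>

const int MPU_ADDR = 0x68;

// Read two consecutive bytes from the IMU and combine them into a signed 16-bit value.
int16_t read16() {
  int16_t hi = Wire.read();
  int16_t lo = Wire.read();
  return (hi << 8) | lo;
}

void setup() {
  Wire.begin();
  Serial.begin(115200);
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x6B);  // PWR_MGMT_1 register
  Wire.write(0);     // wake the sensor from sleep
  Wire.endTransmission(true);
}

void loop() {
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x3B);  // ACCEL_XOUT_H: start of the accel, temp and gyro registers
  Wire.endTransmission(false);
  Wire.requestFrom(MPU_ADDR, 14, true);

  int16_t ax = read16(), ay = read16(), az = read16();
  read16();                                   // skip temperature
  int16_t gx = read16(), gy = read16(), gz = read16();

  // The host-side script (e.g. FreePIE) turns this into controller orientation.
  Serial.print(ax); Serial.print(',');
  Serial.print(ay); Serial.print(',');
  Serial.print(az); Serial.print(',');
  Serial.print(gx); Serial.print(',');
  Serial.print(gy); Serial.print(',');
  Serial.println(gz);
  delay(10);
}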

[Florian]’s DIY VR hand controls emulate the HTC Vive controllers in Valve’s Steam VR Tracking with a software chain that works with his custom hardware. His DIY controller doesn’t need to be actively held because by design it grips the hand, leaving fingers free to do other tasks like typing or gesturing.

Last time we saw [Florian]’s work, development was still heavy and there wasn’t any source code shared, but there’s now a git repository for the project with everything you’d need to join the fun. He adds that “I see a lot of people with Wii nunchucks looking to do this. With a few edits to my FreePIE script, they should easily be able to enable whatever buttons/orientation data they want.”

We have DIY hardware emulating Vive controllers in software, and we’ve seen interfacing to the Vive’s Lighthouse hardware with DIY electronics. There’s a lot of hacking around going on in this area, and it’s exciting to see what comes next.


Filed under: Arduino Hacks, Virtual Reality

‘Duinos and VR Environments

At the Atmel booth at Maker Faire, they were showing off a few very cool bits and baubles. We’ve got a post on the WiFi shield in the works, but the most impressive person at the booth was [Quin]. He has a company, he’s already shipping products, and he has a few projects already in the works. What were you doing at 13?

[Quin]’s Qduino Mini is your basic Arduino-compatible board with a LiPo charging circuit. There’s also a ‘fuel gauge’ of sorts for the battery. The project will be hitting Kickstarter sometime next month, and we’ll probably put that up in a links post.

Oh, [Quin] was also rocking some awesome kicks at the Faire. Atmel, I’m trying to give you money for these shoes, but you’re not taking it.

[Sophie] had a really cool installation at the Faire, and notably something that was first featured on hackaday.io. Basically, it’s a virtual reality Segway, built with an Oculus, a Leap Motion, a Wobbleboard, and an Android phone, that allows you to cruise on everyone’s favorite barely-cool balancing scooter through a virtual landscape.

This project was a collaboration between [Sophie], [Takafumi Ide], [Adelle Lin], and [Martha Hipley]. The virtual landscape was built in Unity; it’s displayed on the Oculus, controlled with an accelerometer on a phone, and takes input from a Leap Motion. There are destructible and interactable things in the environment that can be pushed around with the Leap Motion, and thanks to the helmet-mounted fans, you can feel the wind in your hair as you cruise over the landscape on your hovering Segway-like vehicle. This is really one of the best VR projects we’ve ever seen.
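The wind effect is the sort of thing that maps naturally onto a single PWM output: the faster you ride in Unity, the harder the helmet fans spin. The post doesn't say how the fans are actually driven, so treat the following as a hypothetical sketch of that mapping rather than the installation's code.

// Hypothetical wind-feedback sketch: the game streams the rider's virtual
// speed as a single byte over serial, and the helmet fans are driven with a
// matching PWM duty cycle through a MOSFET.

const int FAN_PWM = 9;  // PWM pin to a MOSFET switching the helmet fans

void setup() {
  pinMode(FAN_PWM, OUTPUT);
  Serial.begin(115200);
}

void loop() {
  if (Serial.available()) {
    int speed = Serial.read();     // 0 = standing still, 255 = full tilt
    analogWrite(FAN_PWM, speed);   // more virtual speed, more wind in your hair
  }
}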


Filed under: misc hacks
Hack a Day 28 Sep 03:00

Here's how you make your own 3D-printed virtual reality goggles

So you couldn't get your hands on a nice virtual reality headset like the Oculus Rift, but you'd still like something a little fancier than a cardboard display. Are you out of luck? Not if Noe Ruiz has anything to say about it. He has posted instructions at Adafruit for do-it-yourself 3D-printed goggles that can be used either for VR or as a simple wearable screen. The design mates an Arduino Micro microcontroller board with a display, a motion sensor and lenses; the 3D printing both adds a level of polish and lets you tailor the fit to your cranium. This definitely isn't the cheapest project (about $231 in parts) or the easiest, but it will give you head-tracking VR without having to wait for Oculus, Samsung or Sony to put out finished devices of their own. If you're up to the challenge, you'll find everything you need at the source link.
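Head tracking on a little board like the Micro boils down to polling the motion sensor and handing orientation to the host, and the Micro's native USB makes mouse emulation a cheap way to try the idea in a desktop demo. Ruiz's guide ships its own firmware for its own parts; the sketch below is only an illustration of the principle, assuming an MPU-6050-style IMU.

// Illustrative head-tracking sketch for an ATmega32u4 board such as the Micro:
// gyro rates from an MPU-6050-style IMU become USB mouse movement, a simple
// way to pan the view in a desktop demo. Not the Adafruit guide's firmware.
#include <Wire.h>
#include <Mouse.h>

const int MPU_ADDR = 0x68;

// Read two consecutive bytes from the IMU and combine them into a signed 16-bit value.
int16_t read16() {
  int16_t hi = Wire.read();
  int16_t lo = Wire.read();
  return (hi << 8) | lo;
}

void setup() {
  Wire.begin();
  Mouse.begin();
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x6B);  // PWR_MGMT_1: wake the IMU from sleep
  Wire.write(0);
  Wire.endTransmission(true);
}

void loop() {
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x43);  // GYRO_XOUT_H: start of the gyro registers
  Wire.endTransmission(false);
  Wire.requestFrom(MPU_ADDR, 6, true);

  int16_t gx = read16();
  int16_t gy = read16();
  read16();  // ignore the z axis in this simple demo

  // Scale raw gyro rates down to small mouse deltas so the view pans as the head turns.
  Mouse.move(gy / 512, gx / 512, 0);
  delay(10);
}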

[Image credit: Noe Ruiz]

Filed under: Displays, Wearables


Source: Adafruit