Posts with «interaction design» label

Smile! This plant wants to take a selfie with you

The Selfie Plant is an interactive installation that takes pictures of itself with an Arduino Yún and uploads them to Facebook through the Graph API. It was developed by a group of students at the Copenhagen Institute of Interaction Design during “The Secret Life of Objects” course, taught by Joshua Noble and Simone Rebaudengo of the Arduino.cc team. The final prototype was on display at the class exhibition to observe how the audience interacted with it, and the results are on Facebook.

The Selfie Plant is an attempt to provoke some thoughts about this genre of expression. It expresses itself in the form of nice-looking selfies, which it takes according to its mood, the weather or the occasion. It mimics human behaviour, striking its best pose and adjusting the camera angle to take the perfect selfie.

In the documentation on GitHub you can find all the details of the project, which consists of an Arduino Yún controlling two servo motors that adjust the positions of the plant and the camera stick, and a Python script (facebook.py) that communicates with Facebook’s Graph API to post the captured photos on the plant’s Facebook profile. You’ll also need an LED matrix, a breadboard and a 5 V battery.
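The project’s facebook.py handles the actual upload; as a rough sketch, posting a photo to the Graph API amounts to a POST to the page’s /photos edge with an access token and a caption. The page id, token and API version below are placeholders, not the project’s real values:

```python
# Sketch of a Graph API photo post, in the spirit of the project's
# facebook.py. The page id, token and version are placeholders.
GRAPH_URL = "https://graph.facebook.com/v2.0"

def build_photo_post(page_id, access_token, caption):
    """Return the endpoint and form fields for posting a photo.
    The image itself would be attached as multipart data ("source")."""
    endpoint = "{}/{}/photos".format(GRAPH_URL, page_id)
    fields = {
        "access_token": access_token,
        "caption": caption,
    }
    return endpoint, fields

endpoint, fields = build_photo_post("selfieplant", "TOKEN", "Feeling sunny today")
print(endpoint)  # https://graph.facebook.com/v2.0/selfieplant/photos
```

The real script would then send this request with an HTTP library and attach the freshly captured image.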

Here’s a preview of the diagram:

 

Experiencing the solar flux with an interactive installation

Dmitry Morozov shared with us a new interactive installation called Solarman, presented at the Polytech Museum in Moscow in 2014. It’s a work he created with Julia Borovaya and Edward Rakhmanov using 64 ultra-bright LEDs, a 12-channel sound system and 8 electrical nerve stimulation electrodes, all controlled by an Arduino Mega:

Data on the power of the X-radiation flux from the Sun is received in real time from the GOES-15 satellite, which tracks solar activity. It is converted into streams of sound, light and electric discharges, allowing a spectator to experience in a more intense and tangible way the influence of the main luminary of the solar system.

The data, measured in watts per square meter, arrives once per minute. A computer algorithm transforms it into sound waves distributed across 12 channels in the space. The radiation power directly controls the pitch of the tones and the spectral changes in the sound, and the speed at which the sound moves through the space also depends on these parameters. Light is generated by algorithmically transforming the X-ray emission into a physical model of light particles, which also drives the muscle stimulators in the chair to produce weak electric discharges.
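As a rough illustration of the first step, here is how a once-per-minute flux reading could be mapped onto pitch. GOES X-ray flux spans several orders of magnitude (roughly 1e-8 W/m² for the quiet Sun up to 1e-3 W/m² for strong flares), so a logarithmic mapping is a natural assumption; the frequency range here is illustrative, not the installation’s actual values:

```python
import math

# Hypothetical mapping from GOES-15 X-ray flux (W/m^2) to a tone
# frequency. The flux bounds follow the usual solar-flare scale;
# the audible range A2..A6 is an assumption for illustration.
FLUX_MIN, FLUX_MAX = 1e-8, 1e-3
FREQ_MIN, FREQ_MAX = 110.0, 1760.0

def flux_to_frequency(flux):
    # Clamp, then interpolate linearly in log10 space.
    flux = min(max(flux, FLUX_MIN), FLUX_MAX)
    t = (math.log10(flux) - math.log10(FLUX_MIN)) / (
        math.log10(FLUX_MAX) - math.log10(FLUX_MIN))
    return FREQ_MIN + t * (FREQ_MAX - FREQ_MIN)

print(round(flux_to_frequency(1e-8)))  # 110: quiet Sun -> lowest tone
print(round(flux_to_frequency(1e-3)))  # 1760: strong flare -> highest tone
```

The same normalized value could drive the light particles and the stimulator intensity in parallel.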

Check the video below to see the power of the sun:

Arduino Blog 09 Dec 19:57

Experience sound multi-sensorially with Ocho Tonos

Some of you may have noticed that words like rhythm, texture and pattern can describe fabrics as well as sound. Focused on building an interface as a whole, using mostly textiles, OCHO TONOS invites the user to interact through touch and experience sound in a multi-sensorial way. Ocho Tonos is an interactive installation by the EJTech duo (Esteban de la Torre and Judit Eszter Kárpáti), whom I met last July during the eTextile Summer Camp while they were working on this experimental textile interface for tactile/sonic interaction by means of tangibles:

Exploring the relation between sound and textile and experimenting with the boundaries of our senses whilst changing the way we perceive fabric, surfaces and their manifestation as sound. Recontextualizing our tactile interaction with textile acting as an interface, where each element triggers, affects and modifies the generated sound’s properties. Creating a soundscape through sensor technology enticing audiophiles to interact and explore with reactive textile elements. The nexus of the body, the senses and technology.
OCHO TONOS is a symbiosis of the unique hand-crafted traditional textile techniques and the immaterial digital media.

Thanks to an Arduino Mega ADK, all inputs from the user’s touch on the soft sensors are digitized, then parsed and filtered through Max/MSP in order to control the generation of a soundscape in Ableton Live.
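The exact parsing and filtering live in the Max/MSP patch, but the idea can be sketched in a few lines: raw soft-sensor readings are jittery, so some smoothing (here a simple moving average, with an illustrative window size) is typically applied before the values are allowed to drive sound:

```python
from collections import deque

# A sketch of the kind of filtering raw soft-sensor readings might
# get before reaching the sound engine. The window size is
# illustrative, not the project's actual value.
class SensorSmoother:
    def __init__(self, window=4):
        self.samples = deque(maxlen=window)

    def update(self, raw):
        """Add a raw reading and return the moving average."""
        self.samples.append(raw)
        return sum(self.samples) / len(self.samples)

smooth = SensorSmoother(window=4)
readings = [0, 0, 100, 100, 100, 100]  # a touch beginning at the third sample
out = [smooth.update(r) for r in readings]
print(out[-1])  # 100.0 once the window is full of touch samples
```

In the installation the smoothed values would be forwarded (e.g. over serial) to Max/MSP, which maps them onto the soundscape parameters in Ableton Live.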

Ocho Tonos was chopped, spiced and cooked at Kitchen Budapest. The sounds used are samples from the working machinery at TextielLab.

Yes, The Drink Up Fountain is talking to you!

The Drink Up Fountain is a project created in September 2013 by the YesYesNo interactive studio in collaboration with PHA Honorary Chair First Lady Michelle Obama, dedicated to encouraging people to drink more water more often: “You are what you drink, and when you drink water you drink up!”

The fountain runs on an Arduino Mega and

dispenses entertaining greetings and compliments intended to entice the drinker to continue sipping. When a drinker’s lips touch the water, the fountain “talks,” completing a circuit and activating speakers. When the drinker pulls his or her head away and stops drinking, the circuit breaks and the fountain stops talking. With hidden cameras set up, Drink Up caught unsuspecting individuals using the fountain in New York City’s Brooklyn Bridge Park.
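The underlying logic is simple: the drinker’s lips close a circuit through the water, and the fountain speaks only while that circuit is closed. A minimal sketch of that talk/silent behaviour (the contact readings are simulated here; on the Arduino Mega this would poll an input pin):

```python
# Sketch of the fountain's lips-to-water contact logic: the fountain
# "talks" while the circuit is closed and goes silent when it breaks.
# Contact samples are simulated booleans standing in for pin reads.
def fountain_states(contact_samples):
    """Yield 'talking' while the circuit is closed, 'silent' otherwise."""
    for closed in contact_samples:
        yield "talking" if closed else "silent"

samples = [False, True, True, False]   # a sip begins, then the drinker pulls away
print(list(fountain_states(samples)))  # ['silent', 'talking', 'talking', 'silent']
```

On the real hardware the “talking” state would trigger audio playback through the hidden speakers instead of printing.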


Take a look at the video to see how the fountain interacts with people:

Put out a candle with the power of your mind

“Trataka” by Alessio Chierico is an interactive installation controlled by an Arduino and based on a brain-computer interface. It was exhibited at Ars Electronica last week. When a visitor fully relaxes and focuses, the candle magically goes out:

Trataka is a Sanskrit term meaning “to look” or “to gaze”, and it refers to a meditation technique. The practice consists of concentrating the attention on a small object, or more commonly on a flame. In meditation, this technique is used to stimulate the ajna chakra, a point located at the level of the brain. According to the Hindu tradition, this chakra is one of the six main centers of vital energy, and it is considered the eye of intuition and intellect.

The installation is built around a brain-computer interface that detects brain waves and derives parameters such as the level of attention. Wearing this device, the user is invited to concentrate on a flame placed in front of them. The detected attention level controls an air flow located under the flame: a higher level corresponds to a more intense air flow. The interaction is designed to engage the user in raising their attention in order to put the flame out, which happens when the highest level of attention is reached and the air flow becomes strong enough to extinguish the flame.
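A minimal sketch of that mapping, assuming a NeuroSky-style attention value on a 0–100 scale and an Arduino PWM output of 0–255 (both assumptions for illustration, not documented values for this piece):

```python
# Hypothetical attention-to-airflow mapping for a Trataka-like setup.
# Consumer EEG headsets commonly report "attention" as 0-100, and
# Arduino analogWrite takes a PWM duty of 0-255.
EXTINGUISH_LEVEL = 100  # the flame goes out only at the highest attention

def attention_to_pwm(attention):
    """Scale a 0-100 attention value to a 0-255 PWM duty for the fan."""
    attention = min(max(attention, 0), 100)
    return attention * 255 // 100

def flame_is_out(attention):
    return attention >= EXTINGUISH_LEVEL

print(attention_to_pwm(50))  # 127: moderate focus, gentle airflow
print(flame_is_out(100))     # True: full focus extinguishes the flame
```

In the actual installation the headset would stream attention values continuously, and the PWM value would drive the fan under the candle.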

 

Creating colourful clouds of light

Arduino user SicLeung is part of Do Interactive, an interactive design team based in Hong Kong. He sent us a video about his experimental installation at the Hong Kong Polytechnic University School of Design, which explores unusual ways of activating light:

Wearable soundscape from Canada

 

I’m reblogging this interesting wearable project from Core77 because I’d like to highlight its use of the Arduino LilyPad board:

Bio Circuit stems from our concern for ethical design and the creation of media-based interactions that reveal human interdependence with the environment. With each beat of the heart, Bio Circuit connects the wearer with the inner workings of their body.

It was created at Emily Carr University by industrial design student Dana Ramler and MAA student Holly Schmidt, and provides a form of biofeedback, using data from the wearer’s heart rate to determine what “sounds” they hear through the speaker embedded in the collar of the garment. Here’s the schematic of the technology:

 

 

Have a look at the video below to see how it works and don’t miss BioCircuit Project page on Dana’s Portfolio:

 

 

Turing and interaction at the Science Museum in London

Codebreaker is an exhibition that opened last year at the Science Museum in London, celebrating the centenary of the birth of computing pioneer Alan Turing.

Hirsch&Mann were commissioned to create a “series of exhibits which demonstrated and recognized the progress in computing while at the same time representing a spirit of engineering and innovation”.

They created three installations, each demonstrating a programming principle:

LOOPING: A spinning rotor with LEDs on it, creating persistence-of-vision (POV) patterns, all controlled by 30 arcade-style illuminated switches.

CONDITIONALS: A version of Wolfram’s cellular automata – the user could choose the result of the child node once the parent node’s conditions were met.

VARIABLES: A mechanical tree whose branch angles were controlled by sliders on the console. Slider 1 controlled the single angle at the base of the tree, slider 2 the next 2 angles, slider 3 the next 4, and slider 4 the final 8 angles.
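The VARIABLES mapping doubles at each level (1, 2, 4, 8 angles), mirroring how the tree doubles its branches. A sketch of which angle indices each slider would drive, under that doubling assumption:

```python
# Sketch of the VARIABLES console mapping: slider n drives 2^(n-1)
# branch angles, so sliders 1..4 cover 1 + 2 + 4 + 8 = 15 angles.
def angles_for_slider(n):
    """Return the angle indices controlled by slider n (1-based)."""
    start = 2 ** (n - 1) - 1   # angles at shallower levels: 1 + 2 + ... = 2^(n-1) - 1
    count = 2 ** (n - 1)
    return list(range(start, start + count))

print(angles_for_slider(1))  # [0]: the single base angle
print(angles_for_slider(4))  # [7, 8, 9, 10, 11, 12, 13, 14]: the outermost level
```

Each slider position would then be mapped onto servo angles for its level of the tree.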

Each installation has a lightbox that is revealed as soon as you press the BIG GLOWING button on the console. The lightbox shows simplified pseudo-code and essentially allows people to “step into” the code: each line currently running is highlighted, and you then see the result on the installation.

The whole point of these installations was to show how far we have come since Turing’s time, standing on his shoulders.

If you have the chance to visit the exhibition (it’s free!) or watch the video below you will see that at the center of each console there is an Arduino UNO.

 

Skube, a tangible radio

Skube is a music player that allows you to discover and share music.

There are two modes, Playlist and Discovery. Playlist plays the tracks on your Skube, while Discovery looks for tracks similar to the ones on your Skube so you can discover new music that still fits your taste. When Skubes are connected together, they act as one player that shuffles between all the playlists. You can control the system as a whole using any Skube.

The interface is designed to be intuitive and tangible. Flipping the Skube changes the modes, tapping will play or skip songs and flipping a Skube on its front face will turn it off.
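Those tangible controls amount to a small state machine. A sketch, with illustrative event names standing in for the real accelerometer readings:

```python
# Sketch of Skube's tangible interface as a state machine: flipping
# toggles Playlist/Discovery, placing it face-down powers it off,
# and a tap skips. Event names here are illustrative.
class Skube:
    def __init__(self):
        self.mode = "playlist"
        self.on = True

    def handle(self, event):
        if event == "flip":
            self.mode = "discovery" if self.mode == "playlist" else "playlist"
        elif event == "face_down":
            self.on = False
        elif event == "tap" and self.on:
            return "skip"

s = Skube()
s.handle("flip")
print(s.mode)           # discovery
print(s.handle("tap"))  # skip
```

In the real device the events would come from orientation and tap sensing, and the resulting commands would be sent over the XBee network so any Skube can control the whole group.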

The Skube is a fully functional device, not just a concept. It uses a combination of an Arduino, Max/MSP and an XBee wireless network.

This project was made by Andrew Nip, Ruben van der Vleuten, Malthe Borch, and Andrew Spitz. It was part of the Tangible User Interface module at CIID run by Vinay Venkatraman, David Cuartielles, Richard Shed, and Tomek Ness.

You can read the details and see the inner workings of the Skube here.

Via: [Create Digital Music]

 

Arduino Blog 20 Sep 10:21