Posts with «interaction design» label

A kinetic installation becomes a hyper-sensorial landscape

Interactive kinetic installations are always incredible to see in action, but they become even more awesome when they're part of a performance. That's the case with Infinite Delta, the result of Boris Chimp 504 and Alma d'Arame's artistic residency in Montemor-o-Novo, Portugal.

Using Arduino boards, they built a physical structure composed of triangular planes that swing back and forth like pendulums, driven by a series of servo motors. Light is projected onto the moving structures, creating patterns that are then reflected onto a nearby wall. Infinite Delta also modifies its shape in response to the movement and sound of the audience.
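The pendulum-like servo motion described above can be sketched as a simple simulation. The parameter names and values here are assumptions for illustration; the project's actual firmware isn't public.

```python
import math

def pendulum_angle(t, center=90.0, amplitude=45.0, period=4.0):
    """Servo angle in degrees for a pendulum-like swing at time t (seconds).

    The servo oscillates sinusoidally around its center position,
    the way the triangular planes swing back and forth.
    """
    return center + amplitude * math.sin(2 * math.pi * t / period)

# At t=0 the plane hangs centred (90 degrees); a quarter period later it
# reaches the top of its swing (135 degrees), then sweeps back through 45.
```

On real hardware, this angle would be written to the servo on each loop iteration, and the amplitude or period could be modulated by the audience's movement and sound.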

In Euclidean geometry, any three non-collinear points form a unique triangle and determine a unique plane. In quantum physics, meanwhile, string theory proposes that fundamental particles may behave like strings, and that the universe containing all matter is infinite. In this “multiverse,” our universe would be just one of many parallel universes. What would happen, then, if we multiplied triangles infinitely? Could we gain access to those parallel universes?

The performers augmented the physical world by overlaying the digital world onto it, producing a new, alternative, magical and hyper-sensorial landscape.

A DIY interactive book that uses digital gestures

Digital and craft maker lab Tazas recently worked with a group of master's students on an interactive book prototype that reflects on how gestures like swiping have become as natural as shaking hands. Digital Gestures is a metaphor for the human body's physiological senses, identifying 10 actions inherent to our daily interactions with technology: drag and drop, spread and squeeze, swipe, double tap, scroll, zoom, rotate, draw, press, and press and hold.

The project was brought to life using four basic electronic components and some digital fabrication: a web server (VPS), an AtHeart Blend Micro Bluetooth module linking the objects to the elements hosted on the server, an iPod Touch as the viewing medium, and conductive ink. All the elements are arranged on a laser-cut wooden base, while the iPod digitally decodes the printed pages filed on its left.

To play, the viewer places an illustrated page on the support and touches a specific key point printed in conductive ink. Touching it lets the viewer interact with the screen and understand the illustrated gesture. This experimental reflection raises many questions about how the machine conditions us when we accept these precepts without questioning their function. What will become of our so-called 'everyday' gestures? Will our near-real behavioral habits be upset? These are answers that only use will provide.
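On the software side, each conductive key point presumably maps to one of the ten catalogued gestures. A minimal sketch of that lookup (the pin-to-gesture assignments are invented; the project doesn't document its wiring):

```python
# Hypothetical mapping from conductive key points (as sensed via the
# Blend Micro's pins) to the ten gestures catalogued by Digital Gestures.
GESTURES = [
    "drag and drop", "spread and squeeze", "swipe", "double tap", "scroll",
    "zoom", "rotate", "draw", "press", "press and hold",
]

def gesture_for_pin(pin):
    """Return the gesture animation to show when a key point is touched."""
    if not 0 <= pin < len(GESTURES):
        raise ValueError(f"no gesture wired to pin {pin}")
    return GESTURES[pin]
```

When a touch is detected, the matching animation would then be played on the iPod's screen.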

You can see how it magically works below!


A community-made, Arduino-powered interactive town map

A group of students from Farmington, Connecticut partnered with artist Balam Soto and master teachers Earl Procko and Jim Corrigan to create a community-based sculpture project that allows people to explore the sights, sounds and history of their town through new media.

The installation runs on an Arduino Uno and XBee, and comprises two panels that act as viewing screens for multiple visual projections. Visitors can interact with the display and manipulate the images using 24 buttons placed on the physical map. Plus, they are encouraged to record and add their own stories and memories of Farmington to the ever-growing multimedia library.
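With 24 buttons reporting over a wireless XBee link, the receiving side needs to decode which button was pressed. A minimal sketch of such a decoder, assuming a simple invented text protocol (the project's actual wire format isn't documented):

```python
NUM_BUTTONS = 24

def parse_button_message(line):
    """Decode a message like b'BTN:07\n' sent from the Arduino over XBee.

    The 'BTN:<index>' format is an assumption for illustration;
    it returns the zero-based index of the pressed map button.
    """
    text = line.decode("ascii").strip()
    if not text.startswith("BTN:"):
        raise ValueError(f"unexpected message: {text!r}")
    index = int(text[4:])
    if not 0 <= index < NUM_BUTTONS:
        raise ValueError(f"button index out of range: {index}")
    return index
```

The decoded index would then select which projection or recorded story to play on the panels.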

Permanently exhibited in Farmington’s public library, the Farmington Map Project was also the opportunity to introduce the students to physical computing, digital fabrication, woodworking, Arduino programming, and to the potential that Makerspaces have to offer for bringing ideas to life.

The project was created with the support of an Arts in Education Mini-Grant, funded by the Connecticut State Department of Education, the Department of Economic and Community Development, the Connecticut Office of the Arts, and the Connecticut Association of Schools, as well as by Farmington High School's Fine and Applied Arts department.

Interested? Check it out on Hackster.

Using Arduino with VVVV is now easier than ever

VVVV is an open-source software toolkit that helps interaction designers and artists handle large media environments with physical interfaces, real-time motion graphics, audio, and video, and that can interact with many users simultaneously.

The cool thing is that you can control Arduino and Genuino boards from VVVV by uploading a Firmata sketch and then playing with the input and output pins.

What’s more, the team recently released a brand new set of nodes able to talk to your Arduinos. With this implementation you can:

  • Just plug a DigitalWrite (Firmata), AnalogWrite (Firmata) or ServoWrite (Firmata) node to the Arduino node (or concatenate them together) to set the pins of the Board.
  • Connect DigitalRead (Firmata) and AnalogRead (Firmata) nodes to get the values from the Board’s pins.
  • Use the Sysex Messages output to receive different ‘Sysex Messages’ sent back by the Arduino Board. Some Sysex decoders are already there (see StringDecoder (Firmata), CapabilityDecoder (Firmata)). Sending custom ‘Sysex Messages’ is easy as well.
  • Your board is not listed in the NodeBrowser? The Arduino nodes are easily adaptable to other controllers running Firmata. Hello, Teensy…

Easier than ever before:

  • no need to supply a spread for all 20 pins and then SetSlice some of them to particular values.
  • no need to define the ‘PinMode’ for each pin.
  • no need to define which pins should report their values back.
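Under the hood, these nodes exchange small byte messages defined by the Firmata protocol. As an illustration of what a pin-mode command and an AnalogWrite look like on the wire (command bytes per the Firmata protocol specification; the helper functions themselves are just a sketch):

```python
# Firmata command bytes, per the Firmata protocol specification.
SET_PIN_MODE = 0xF4
ANALOG_MESSAGE = 0xE0  # OR-ed with the pin number
PIN_MODE_PWM = 0x03

def set_pin_mode(pin, mode):
    """Encode a SET_PIN_MODE command for the given pin."""
    return bytes([SET_PIN_MODE, pin, mode])

def analog_write(pin, value):
    """Encode an analog (PWM) write: the value is split into
    two 7-bit data bytes, least-significant byte first."""
    return bytes([ANALOG_MESSAGE | (pin & 0x0F), value & 0x7F, (value >> 7) & 0x7F])

# Dimming an LED on pin 9 to half brightness would send:
#   set_pin_mode(9, PIN_MODE_PWM) followed by analog_write(9, 128)
```

VVVV's Firmata nodes generate and parse exactly this kind of message for you, which is what makes the new workflow so much lighter.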

Intrigued? Take a look at the details and discussion on VVVV blog!

This project makes eating alone a more entertaining experience

Food Screening is an Arduino-based project inspired by the act of watching films while eating alone, conceived especially for people living on their own abroad. The installation, developed by visual communicator Fongyee Ng in collaboration with Han-gyeol Lee, uses light and distance sensors to create an interaction with each food item, which triggers a snippet from a film that mimics the sound effects of consuming the meal, making eating alone a more entertaining experience.
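The trigger logic can be imagined roughly like this: when a food item is lifted off its light sensor, the reading jumps and the matching film snippet plays. The threshold value and clip names below are assumptions for illustration:

```python
# Light-sensor reading above which we assume the item has been lifted
# (an assumed calibration value, not from the project).
LIFT_THRESHOLD = 300

def snippet_to_play(item, light_level, clips):
    """Return the film snippet for an item that was just picked up,
    or None if the item is still resting on its sensor."""
    if light_level > LIFT_THRESHOLD:
        return clips.get(item)
    return None
```

Each food item on the table would have its own sensor and its own associated snippet, so picking up different items scores the meal with different film sounds.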

Explore your weekly calendar through a tangible interface

A group of students (Kate Twomey, Leila Byron, Daan Weijers, Luuk Rombouts) at the Copenhagen Institute of Interaction Design explored the creation of a tangible user interface that displays a personal calendar's meetings without using a screen.
The installation, called Timely, uses Temboo, the Google Calendar API, and a Genuino MKR1000 to pull all of the upcoming week's events and display each of them through the rotation of a laser-cut base and its red strings:

The visual forecast creates awareness, while capacitive sensors in Timely make it easy to adjust busy days by simply grabbing all three prongs of the chosen day. Timely will then distribute your time more evenly throughout the day by rescheduling events and meetings, automatically notifying attendees if needed.
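One plausible way to turn calendar events into physical rotation is to map the week onto a full turn of the base. Timely's actual mapping isn't documented, so the scheme below is only a sketch:

```python
from datetime import datetime, timedelta

def rotation_for_event(week_start, event_start):
    """Map an event's start time to a base rotation: 360 degrees spans one week.

    This linear time-to-angle mapping is an assumption, not Timely's
    documented behaviour.
    """
    elapsed = (event_start - week_start).total_seconds()
    week = timedelta(days=7).total_seconds()
    if not 0 <= elapsed < week:
        raise ValueError("event is outside the displayed week")
    return 360.0 * elapsed / week
```

With events pulled from the Google Calendar API via Temboo, the MKR1000 would step the base to each computed angle in turn.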

Explore relations between nations daily with News Globus

News Globus is an unusual physical interface that piques people's curiosity and invites them to explore the world through news stories connecting different places. It was designed by Bjorn Karmann, Charlie Gedeon, Mikio Kiura, and Sena Partal, who wired 20 regions to a Genuino board inside the sphere. When two regions are connected with the jack, the Genuino selects a country at random from each region and queries the NY Times API for news containing both locations. A web server then selects a story and converts the headline and byline to an MP3 file, which is played either through the headphone jack or the speaker at the base of the globe:
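The country-picking step can be sketched like this. Only two of the 20 wired regions are shown, and the region names and country lists are invented for illustration:

```python
import random

# Two of the 20 wired regions (region names and country lists are assumptions):
REGIONS = {
    "scandinavia": ["Denmark", "Norway", "Sweden"],
    "east_asia": ["Japan", "South Korea", "China"],
}

def pick_query(region_a, region_b, rng=random):
    """When two regions are connected with the jack, pick one country at
    random from each and return the search string for the NY Times API."""
    a = rng.choice(REGIONS[region_a])
    b = rng.choice(REGIONS[region_b])
    return f"{a} {b}"
```

The server would feed this query to the article search API, pick a story from the results, and synthesize its headline and byline to speech.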

The shape of the globe is an interesting artifact from the past, here combined with modern technologies and online services. Instead of letting people hear the news of one place at a time, the audio jacks evoke the metaphor of the switchboard operator, encouraging people to discover surprising connections between places near and far.

Check the video to see it working:

The project was developed during the Interaction Design Programme at CIID with the help of Massimo Banzi and Dario Buzzini.

Arduino Blog 06 Apr 17:50

Moodbox makes you play with emotions for perfect ambience

Moodbox is a musical device created by a group of students (Iskra Uscumlic, Cyrus Kamath, Luca Mustacchi, Dario Loerke) to explore how we might set the mood in a studio space through music. They created it using Genuino Uno during the Interaction Design Programme at CIID with the help of Massimo Banzi and Dario Buzzini.

Moodbox enables you to set the perfect ambience and trigger different emotions:

Taking inspiration from the classic bar jukebox and its ability to influence the mood, we recognized that music and the atmosphere created by it are inextricably linked. When selecting a song to play in a social setting there is always a sense of negotiation involved. The person choosing has to consider the environment, the people around them, the current mood and the one they would like to create.

With this in mind we set out to explore new opportunities for interaction in the communal space, using the environment of the studio as the setting.


As you can see in the video above, you can adjust the vibe of any space using four scales of emotions, from love to kill, serious to fun, chill to hype, and dreamy to focus:

Emotions may be combined and fine-tuned with retro-style rotary knobs to dial in feelings and get the perfect song choice. Like a jukebox, songs are queued once the selection is made. To provide visual feedback, lights also respond to the changes in mood, enhancing the overall influence on the space.

Moodbox connects to your iTunes account, with the rotary knobs sending information via a Genuino Uno and JavaScript to sort through a playlist. Our custom emotional algorithm then picks the right song and triggers mood lighting depending on the selection.
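The students describe their song picker only as a "custom emotional algorithm," but a nearest-neighbour match over the four knob axes is one simple way such a selection could work. The field names and mood tags below are assumptions:

```python
def pick_song(knobs, playlist):
    """Pick the song whose mood vector is closest to the four knob readings.

    Axes (each 0.0-1.0): love-kill, serious-fun, chill-hype, dreamy-focus.
    A nearest-neighbour match is an assumption; the actual algorithm
    isn't documented.
    """
    def dist(tags):
        return sum((k - t) ** 2 for k, t in zip(knobs, tags))
    return min(playlist, key=lambda song: dist(song["mood"]))
```

Each song in the playlist would carry a pre-assigned four-value mood vector, and turning the knobs simply moves the target point the picker matches against.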

Discover what sound is made of with Sound Blocks

Sound Blocks is a tool to teach children and adults what sound is made of. The project was shortlisted in the Expression category of the IxDA Interaction Awards and was developed by John Ferreira, Alejandra Molina, and Andreas Refsgaard at CIID using Arduino.

The device lets people learn how, with a few parameters, it's possible to create new sounds and even imitate real-world sounds. Users can control the waveform, sound decay, wavelength, and volume of three channels, all mixed together:
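The four parameters per channel, and the mixing of three channels, can be modelled directly. This is a simplified synthesis model for illustration, not Sound Blocks' actual code:

```python
import math

def channel(t, waveform, wavelength, volume, decay):
    """Sample one channel at time t: a decaying square or sine wave.

    An exponential decay envelope is an assumed simplification of the
    device's 'sound decay' control.
    """
    phase = (t % wavelength) / wavelength
    if waveform == "square":
        sample = 1.0 if phase < 0.5 else -1.0
    else:  # sine
        sample = math.sin(2 * math.pi * phase)
    return volume * sample * math.exp(-decay * t)

def mix(t, channels):
    """Sum the channels, as the device mixes its three channels together."""
    return sum(channel(t, **c) for c in channels)
```

Changing any one block (waveform, wavelength, volume, or decay) audibly reshapes the output, which is exactly the lesson the device teaches.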

Sound Blocks was first and foremost created as a tool to experiment with sound; it is playful and engaging.

Watch the video interview to discover more about the project and hear some noise:

Trojan 77: a gamified simulation of the Trojan virus

Trojan 77 is a gamified simulation of the Trojan virus running on an Arduino Uno. A Trojan is malware designed to provide unauthorised remote access to a user's computer, among other harmful possibilities, and this prototype was designed to be exhibited at a technology museum to show the virus's most important effects. Inspired by the tilting labyrinth game, the prototype simulates a few key effects of the Trojan virus, like passwords leaking out and files being deleted, culminating in a system crash.

Trojan 77 was created by a team of Physical Computing students (Dhrux Saxena, Gunes Kantaroglu, Liliana Lambriev, Karan Chaitanya Mudgal) at CIID:

The idea of designing something analog to explain a digital construct was an exciting challenge to undertake. The way that computer viruses operate can be very complicated and hard to explain without overloading people with detailed information. Making this information visual via animated projections helped to communicate the effects in a fun and memorable way.

Trojan 77 moved through several prototyping stages. Initially, the wooden structure was built, followed by the maze. The structure as a whole became functional with the addition of Arduino and Processing: two servo motors controlled by a joystick enabled the tilt, while the ball's movement triggered distinct light sensors, which in turn triggered events in a Processing sketch mapped onto the maze.
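The sensor-to-event step can be sketched as a simple lookup: when the ball shadows a light sensor, its reading drops and the corresponding staged effect fires in the projection. The sensor indices, threshold, and event names below are illustrative:

```python
# Which staged effect each light sensor triggers in the projection
# (the mapping and names are assumptions, not the project's actual code):
EVENTS = {
    0: "leak_passwords",
    1: "delete_files",
    2: "system_crash",
}

def fire_event(sensor_readings, threshold=200, events=EVENTS):
    """Return the events triggered when the ball shadows a light sensor.

    A reading below the threshold means the ball is covering that sensor.
    """
    return [events[i] for i, v in enumerate(sensor_readings)
            if v < threshold and i in events]
```

In the installation, the Processing sketch would listen for these triggers over serial and play the matching animation mapped onto the maze.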

The students also created a great video documentary to explain the project, with a style inspired by the work of Charles and Ray Eames: