Interact With Love

Project Details:
Interactive Installation (Music & Motion)
Location:
ION Art gallery, ION Orchard
Potential Uses:
Creating engaging advertising models

I’ve always been fascinated with interactive spaces, but not the kind where something screams at you to engage with it. I’m fascinated by spaces that react to you without any obvious input from you: spaces that engage you without asking you to engage. I’ve always believed that the “Aha!” moment, when you realize that what you’re doing is directly tied to what you’re experiencing, is an awesome feeling. The joy of discovery.

I mentioned it in passing to my friend Peter Draw, who happened to have an exhibition space at ION Art gallery a month later for one of his new works, “First, Love”. He was kind enough to offer me a space for my experiments, and thus began my first ever personal interactive project.

I’ve also always wanted to use the Kinect in a real-world situation, to see how it would fare. And since I had free rein over what I wanted to do, I could finally incorporate music and visuals in a public project.

My initial idea was to make an ambient soundscape out of the movements of visitors to the gallery, inspired by step sequencers, specifically the Monome and Tenori-on. The plan was to map people’s positions as they moved through the gallery onto a virtual grid; the visitors would essentially become my Monome. The software would then sweep through the grid, creating dynamic, ever-changing music based directly on the movement of people in the gallery. The ambient music would be a truly user-created experience: the result of all the visitors’ collective contributions.
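To make the sequencer idea concrete, here is a minimal sketch in Python (purely illustrative; the actual installation was built with Processing and Max/MSP) of how tracked floor positions could be quantized onto a virtual grid and swept by a playhead, Monome-style. The room dimensions, grid resolution, and position format are all my assumptions, not values from the project.

```python
# Hypothetical sketch: map visitor positions onto a step-sequencer grid,
# then read out which rows should sound at the current playhead column.

ROOM_W, ROOM_D = 8.0, 6.0   # gallery floor in metres (assumed)
COLS, ROWS = 16, 8          # sequencer grid resolution (assumed)

def to_cell(x, z):
    """Quantize a visitor's floor position (metres) to a grid cell."""
    col = min(int(x / ROOM_W * COLS), COLS - 1)
    row = min(int(z / ROOM_D * ROWS), ROWS - 1)
    return col, row

def step(positions, playhead):
    """Return the rows to trigger for the current playhead column."""
    occupied = {to_cell(x, z) for x, z in positions}
    return sorted(row for col, row in occupied if col == playhead)

# Example: two visitors standing in the gallery
visitors = [(1.0, 3.0), (1.2, 0.5)]
print(step(visitors, playhead=2))  # → [0, 4]
```

Each sweep of the playhead would then trigger a note per occupied row, so the music literally follows the crowd.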

With every idea, there are always a lot of changes and compromises to be made.
As the exhibition would be frequented mainly by children and the young at heart, I had to tweak my idea to be more interactive and less subliminal. I made use of the Kinect’s sensors and algorithms to detect people and turn them into “blobs”, which I could then use as objects interacting with a virtual environment on screen.
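For illustration, here is a rough sketch of what the “blob” step amounts to: threshold the Kinect depth image into a binary mask of people, then group connected pixels into blobs whose centroids can drive on-screen objects. This is a toy flood-fill in Python under my own assumptions, not the actual Kinect library code the project used.

```python
# Toy connected-components pass over a binary "people" mask
# (1 = foreground pixel after depth thresholding, assumed).

from collections import deque

def find_blobs(mask):
    """Group adjacent foreground pixels into blobs; return their centroids."""
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]
    centroids = []
    for y in range(h):
        for x in range(w):
            if mask[y][x] and not seen[y][x]:
                queue, pixels = deque([(x, y)]), []
                seen[y][x] = True
                while queue:
                    cx, cy = queue.popleft()
                    pixels.append((cx, cy))
                    for nx, ny in ((cx+1,cy),(cx-1,cy),(cx,cy+1),(cx,cy-1)):
                        if 0 <= nx < w and 0 <= ny < h and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            queue.append((nx, ny))
                # Centroid of the blob: a usable interaction point on screen
                centroids.append((sum(p[0] for p in pixels) / len(pixels),
                                  sum(p[1] for p in pixels) / len(pixels)))
    return centroids

mask = [
    [0, 1, 1, 0, 0],
    [0, 1, 1, 0, 1],
    [0, 0, 0, 0, 1],
]
print(find_blobs(mask))  # → [(1.5, 0.5), (4.0, 1.5)]
```

In the real installation, the Kinect libraries do this heavy lifting; the centroids are what the virtual balloons collide with.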

I then had to tie my installation in with the rest of the exhibition. The main exhibit, “First, Love”, consists of 10,000 balloons, reminding us that there is always good to be found around us. I thus decided to call my installation “Interact With Love”. Its purpose would be to encourage us to ignore the darkness in life by collecting colourful balloons as they fall on screen. The colours collected would trigger different samples of music, creating the ambience of the exhibition. My hope was for people to live in the moment, play with the installation, and indirectly share their happiness with everyone in the gallery through their music.

I got the help of my friend Marcus, a talented sound engineer (among his many amazing skills), to create a looping sample with the instruments split into separate tracks.
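As a sketch of how those split tracks might be wired to the balloons (the colour names, track names, and timings here are hypothetical; the real audio engine was built in Max/MSP): each balloon colour maps to one stem of the loop, and catching a balloon keeps that stem audible for a few loop cycles before it decays back out.

```python
# Hypothetical colour-to-stem mapping: catching a balloon refreshes a
# countdown on its track; tracks with a nonzero countdown stay audible.

COLOUR_TO_TRACK = {"red": "drums", "blue": "bass", "yellow": "pads", "green": "melody"}
HOLD_STEPS = 4  # loop cycles a track stays audible after a catch (assumed)

def update_mix(levels, caught_colours):
    """Advance one loop cycle: decay every track, refresh tracks just caught."""
    next_levels = {track: max(0, ttl - 1) for track, ttl in levels.items()}
    for colour in caught_colours:
        track = COLOUR_TO_TRACK.get(colour)
        if track:
            next_levels[track] = HOLD_STEPS
    return next_levels

levels = {t: 0 for t in COLOUR_TO_TRACK.values()}
levels = update_mix(levels, ["red", "blue"])  # two balloons caught this cycle
print(sorted(t for t, ttl in levels.items() if ttl > 0))  # → ['bass', 'drums']
```

The appeal of this design is that the mix is never authored by any one person: whatever is playing at a given moment reflects what everyone in the room has been catching.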

I then started learning Max/MSP (something I’ve always wanted to do as well) to create my audio engine, and Processing to work with the Kinect sensor. The learning curve for Max/MSP was quite steep in my opinion, as you can’t really bring a traditional programming mindset into it. But once I got the hang of it, it was a joy to work with! Processing was easier to pick up as I was already quite familiar with Java; I just needed to understand its libraries and I was up and running. The toughest part was working with the hacks for the Kinect, as the documentation was not as complete as I would have liked. Over the course of the project, I learnt about many limitations of the various Kinect libraries out on the web, as well as the limitations of the Kinect sensor itself. Let’s just say I’m looking forward to the new Kinect that comes with the Xbox One.

But all in all, it was a very effective field test. I was able to gather some metrics (I called my interactions “Emotions Shared”) that would be really useful for advertisers looking to measure a screen’s actual visibility and engagement. It was also heartwarming to see people engaged and deriving pleasure from such a simple experience.

Finally, some numbers:
83,242 interactions recorded over the 7 day event
A daily average of approximately 1,300 people detected by the sensor

(Note: The sensor detects people moving within its field of view, while the interactions are actual engagements with the installation.)