The Reactable: A Writeup

The Reactable is a musical instrument that has taken the electronic music world by storm since its launch around 2006. It is a Tabletop User Interface that detects finger touches, swipe gestures, and objects with specially designed markers attached to them, and uses this information to synthesize music. The wide variety of objects can be broadly classified into sound objects and control objects. The sound objects are responsible for generating tones and playing back recorded samples, whereas the control objects modify the sound by filtering or otherwise transforming it. You can also plug in MIDI controllers and electronic instruments like a guitar to make the sound more organic.
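
Conceptually, you can think of these two object families as nodes in a tiny audio graph. Below is a minimal, hypothetical Python sketch of that split; the class names, the sine oscillator, and the one-pole low-pass filter are my own illustration, not the Reactable's actual internals:

```python
import math

SAMPLE_RATE = 44100  # samples per second

class SoundObject:
    """Generates sound (here: a plain sine oscillator)."""
    def __init__(self, freq_hz):
        self.freq_hz = freq_hz

    def render(self, n_samples):
        return [math.sin(2 * math.pi * self.freq_hz * i / SAMPLE_RATE)
                for i in range(n_samples)]

class ControlObject:
    """Modifies sound (here: a one-pole low-pass filter)."""
    def __init__(self, alpha):
        self.alpha = alpha  # 0..1, smaller = stronger smoothing

    def process(self, samples):
        out, prev = [], 0.0
        for s in samples:
            prev += self.alpha * (s - prev)
            out.append(prev)
        return out

# A tiny "patch": a tone generator feeding a filter.
tone = SoundObject(freq_hz=440)
lowpass = ControlObject(alpha=0.1)
signal = lowpass.process(tone.render(1024))
print(f"rendered {len(signal)} filtered samples")
```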

One can at times hear people taking a jibe at electronic musicians with remarks like, “A computer is not a musical instrument”. I feel that is because when you see an electronic musician performing live, he is mostly staring at a computer screen and rotating or sliding a controller. There is no connection with the audience. The audience is clueless about what a particular button on his controller does or what exactly the musician is doing at any instant of time. Contrast that with a conventional instrumentalist, say a guitarist, whose gestures you can read. The process of making music and the output are connected so intimately that we know how each pluck or string bend alters the music produced. Music generated through a computer lacks that intimate physical connection. A press of a button is captured by the machine, followed by layers of processing that generate a tone. The whole process takes place inside a closed box whose inner workings have been abstracted to a level where they make little sense to someone new to it.

The Reactable, with its splendid visual feedback on the tabletop surface, gets rid of that problem. Every object you place on the surface gets connected to other objects depending on its proximity to them. The connection is made visible by a line drawn between the two objects on the display. Not only is the musician given feedback on his actions, the audience can also see how each of his actions alters the soundscape. The visual feedback also changes the way a newbie approaches the instrument. Unlike the traditional hardware and software used for electronic music synthesis, the Reactable's interface is neither intimidating nor cluttered. The direct mapping between the input and the output on the screen blends the mental model with the actual model seamlessly.
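
To make the proximity idea concrete, here is a rough, hypothetical sketch of how objects on the surface might get linked to neighbours within some radius; the coordinates, names, and threshold are my assumptions, and the real Reactable's connection rules are considerably more elaborate:

```python
import math

# Each tabletop object: a name and its (x, y) position on the surface.
objects = {
    "oscillator": (0.20, 0.30),
    "filter":     (0.25, 0.35),
    "output":     (0.50, 0.50),
}

CONNECT_RADIUS = 0.15  # assumed: objects closer than this get linked

def connections(objs, radius):
    """Return pairs of objects close enough to be drawn as connected."""
    names = list(objs)
    links = []
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            ax, ay = objs[a]
            bx, by = objs[b]
            if math.hypot(ax - bx, ay - by) <= radius:
                links.append((a, b))
    return links

# The visual feedback would draw a line for each returned pair.
print(connections(objects, CONNECT_RADIUS))  # [('oscillator', 'filter')]
```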



[Images: a traditional hardware interface compared with the Reactable's user interface]

Unlike most multi-touch applications, which are rectangular, the Reactable is circular, which brings to the forefront the collaborative character so finely ingrained in its design. One can approach it from any side and start interacting with it to make music.

The objects carry large icons that make them pretty much self-explanatory, and even when they don't, one only needs to place them on the surface to find out what they do. By design, the Reactable has been made in such a way that there is no wrong way to use it. Simple, intuitive gestures like rotating the objects and swiping fingers across the table make the learning curve easy and playful to climb, and the whole process enjoyable. It's more about the music and less about the underlying technology, exactly how it should be.
I think the reason this product is so great and has found wide acceptance in the international electronic music circuit is its design process. Most interaction design projects take up a technology and try to put it to some use. The team behind the Reactable set out with the aim of making a state-of-the-art music-making beast and thought about the technology only later. Most of the computer vision technology was built by them as the project progressed. And boy, have they done a neat job!

Paper Published at an International Conference

Continuing the series of surprises this week, a research paper that I wrote on my BTech project got published at the International Conference on Computer and Software Modeling. Again, I am not quite sure how great or reputed this conference is, but what the heck, it opens my publications list.

About the Paper:
Authors: Denny George, Binita Desai and Keyur Sorathia
Title: Dhvani: An Open Source Multi-touch Modular Synthesizer

If nothing else, this makes me feel like writing a proper research paper now!