The earliest use of electric media to make art can be found in music, which has a long tradition of performance and of a precise, intimate relationship between human and technology. In traditional musical instruments, the technology is visible in the instrument itself and the sound-generation process determines its design; in instruments that use electricity, this relationship is often far less clear.
For a new blog entry, we had to read a paper on the topic mentioned above by Bert Bongers. The paper describes aspects of physical interfaces in the electronic arts. It is split into two parts: the first describes the interactions that can take place in the electronic arts from a Human Factors point of view, while the second is more practical, explaining sensor technologies and the categories that make physical interaction possible.
The idea of creating a multi-touch interface of controls for keyboardists on a widespread device such as an iPad is very promising. Until now, keyboardists in a live performance had to use one hand to alter their sound by changing the state of a knob, button, fader, etc. If multiple parameters had to be changed at the same time during a live performance, this was impossible with just one hand, given the physical distance between one control on the keyboard and another. With this multi-touch application, keyboardists can alter multiple parameters of their sound using just one hand, with gestures, swipes and finger count, in the most intuitive way possible. Tests conducted by the development team in cooperation with professional keyboardists show that the application becomes intuitive after a small amount of practice. The keyboardist sets up the gestures and parameters beforehand and then applies them during the live performance. With this application, the live performance can come closer to the sound that could be produced in the studio.
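The preset idea described above can be sketched in a few lines. This is a minimal, hypothetical illustration, not the paper's actual implementation: the names (`Gesture`, `apply_gesture`, the parameter names and 0–127 value range) are my assumptions, chosen to resemble common synth/MIDI conventions.

```python
# Hypothetical sketch: a keyboardist configures gestures before the show,
# and each gesture can move several sound parameters at once, which a
# single hand on physical knobs could not do. Illustrative names only.
from dataclasses import dataclass

@dataclass(frozen=True)
class Gesture:
    kind: str      # e.g. "swipe_up", "swipe_down"
    fingers: int   # finger count distinguishes mappings

# Preset dictionary set up by the keyboardist beforehand.
PRESET = {
    Gesture("swipe_up", 2):   {"cutoff": +10, "resonance": +5},
    Gesture("swipe_down", 2): {"cutoff": -10, "resonance": -5},
    Gesture("swipe_up", 3):   {"reverb_mix": +8, "delay_feedback": +4},
}

def apply_gesture(params, gesture):
    """Return a new parameter dict with the gesture's deltas applied, clamped to 0-127."""
    updated = dict(params)
    for name, delta in PRESET.get(gesture, {}).items():
        updated[name] = max(0, min(127, updated.get(name, 64) + delta))
    return updated

state = {"cutoff": 64, "resonance": 64}
state = apply_gesture(state, Gesture("swipe_up", 2))
print(state)  # {'cutoff': 74, 'resonance': 69}
```

The point of the sketch is the one-to-many mapping: a single two-finger swipe updates cutoff and resonance together, something that would otherwise need two hands on two physical controls.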
Imagine improvising and making music just by drawing! Draw a melody and it will be played. This is what researchers from Japan and Spain wanted to create with JamSketch. Behind the scenes, some processing works out what you want to play, so that someone with no prior musical knowledge can enjoy making music with JamSketch.
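To get a feel for the draw-to-melody idea, here is a minimal sketch of one possible approach. This is my assumption, not JamSketch's actual algorithm: sample the height of the drawn stroke and quantize each sample to the nearest note of a pentatonic scale, so any curve yields notes that sound pleasant together.

```python
# Minimal sketch (assumed approach, not JamSketch's real algorithm):
# map the y-coordinates of a drawn stroke to MIDI notes on a scale.
C_MAJOR_PENTATONIC = [60, 62, 64, 67, 69, 72]  # MIDI note numbers

def stroke_to_melody(ys, y_min=0.0, y_max=1.0, scale=C_MAJOR_PENTATONIC):
    """Map drawn y-coordinates (higher = higher pitch) to notes on the scale."""
    notes = []
    for y in ys:
        # Normalize the height to 0..1, then pick the closest scale degree.
        t = (y - y_min) / (y_max - y_min)
        notes.append(scale[round(t * (len(scale) - 1))])
    return notes

# A rising stroke produces a rising melody:
print(stroke_to_melody([0.0, 0.2, 0.5, 0.8, 1.0]))
```

Quantizing to a scale is what lets someone with no musical knowledge draw freely without producing dissonant notes.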
IllumiWear is a novel eTextile prototype that uses fiber optic cables as interactive input and visual output.
When I first read about the concept of IllumiWear, I was amazed by the idea: it opens up a new way of interacting with computers and interfaces.
The idea is that fiber optic cables are separated into bundles and then woven like a basket into a bendable glowing fabric. By attaching light-emitting diodes to one side of these bundles and photodiode light-intensity sensors to the other, the loss of light intensity can be measured when the fabric is bent.
The sensing technique of IllumiWear is not only able to discriminate between discrete touches, slight bends, and harsh bends, but can also recover the location of the deformation. In this way, this computational fabric prototype uses its intrinsic means of visual output (light) as a tool for interactive input.
To implement the prototype, the authors divide all the optic fibers evenly into twenty bundles, which are then woven like a basket. This fabrication process forms a ten-by-ten mesh.
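The sensing principle described above can be sketched in code. This is a hedged illustration under my own assumptions, not the authors' implementation: I assume each of the twenty bundles reports one photodiode reading, that the fractional intensity loss classifies the deformation, and that the crossing of the most attenuated row and column bundle locates it on the 10x10 mesh. The threshold values are purely illustrative.

```python
# Assumed sensing sketch for a 10x10 woven mesh of fiber optic bundles:
# intensity loss per bundle -> deformation type and location.
# Thresholds are made up for illustration, not taken from the paper.

def classify(loss, touch_thr=0.10, slight_thr=0.35, harsh_thr=0.60):
    """Classify a bundle's fractional light-intensity loss."""
    if loss < touch_thr:
        return "none"
    if loss < slight_thr:
        return "touch"
    if loss < harsh_thr:
        return "slight bend"
    return "harsh bend"

def locate(row_losses, col_losses):
    """Locate the deformation at the crossing of the most attenuated
    horizontal (row) and vertical (column) bundle."""
    row = max(range(len(row_losses)), key=lambda i: row_losses[i])
    col = max(range(len(col_losses)), key=lambda j: col_losses[j])
    return row, col

# Ten horizontal and ten vertical bundles; simulate a harsh bend at (3, 7):
rows = [0.05] * 10
cols = [0.05] * 10
rows[3], cols[7] = 0.7, 0.7
print(locate(rows, cols), classify(max(rows + cols)))  # (3, 7) harsh bend
```

Because every woven crossing is covered by exactly one row bundle and one column bundle, a single attenuated pair is enough to pin down where the fabric was deformed.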
Finally, they conducted four experiments based on this motivation, among them color variation-based input.
Thus, IllumiWear opens up great opportunities for use. It is able to recognise the difference between different types of touches and bends, and can also recover the exact location of these deformations.
I chose this paper because I personally really enjoy going to concerts and live performances, and the idea of having the audience interact with the performer immediately caught my interest. For this project an AR web application was developed; web-based, because every participant can easily access it without downloading anything. "Border" allows performers to record their movement, which is then replayed in the AR environment. Gesture tracking with a Kinect was mainly used to control the sound. The performer is able to switch between a selection mode, where a virtual instrument can be chosen, and a play mode. The audience could access these videos via the web application at five AR markers on stage, each big enough (1 m x 1 m) that even people in the back could use them.
IllumiWear is an eTextile prototype that uses fiber optics as interactive input and visual output. Fibre optic cables are combined into bundles and then woven to create a bendable glowing fabric. By connecting light-emitting diodes to one side of the fibre optic cables and light sensors to the other, the loss of light intensity can be measured when the fabric is bent.
With this technology, IllumiWear is able to distinguish between a touch, a slight bend and a harsh bend, and can also recover the exact location of these deformations.
I chose the paper "Vrengt: A Shared Body-Machine Instrument for Music-Dance Performance" by Çağrı Erdem, Katja Henriksen Schia and Alexander Refsum Jensenius, because I am interested in the topic of interaction between dancers and musicians.
For this blog entry, I've taken a look at the paper by Samuel Thompson Parke-Wolfe, Hugo Scurto and Rebecca Fiebrink on Sound Control: Supporting Custom Musical Interface Design for Children with Disabilities. I was immediately drawn to this topic, as I would like to work on the design of an application for children on the autistic spectrum as part of my master's thesis.