Physical Interfaces in the Electronic Arts

The following paper by Bert Bongers compactly describes

  • the mindset needed when designing physical interfaces
  • the variety of sensors and their areas of application

I found the paper's introduction particularly exciting: in it, the author describes two approaches to a physical interface in the electronic arts:

  • Human factors: essentially asking the question: how does it feel?
  • Sensor categorisation: dealing with the technical side and asking: how does it work?

Physical Interfaces in the Electronic Arts – Bert Bongers

The starting point for using electric media to make art can be found in music, which has a long tradition of performance and of a precise, intimate relationship between human and technology. Whereas in traditional musical instruments the technology is reflected in the instrument itself and the sound-generation process determines the instrument's design, this relationship is often less clear in instruments that use electricity.

Physical Interfaces in the Electronic Arts: Interaction Theory and Interfacing Techniques for Real-time Performance

For a new blog entry, we had to read a paper on the topic mentioned above by Bert Bongers. The paper describes aspects of physical interfaces in the electronic arts. It is split into two parts: one describes the interactions that can take place in the electronic arts from a Human Factors point of view, while the other is more practical, explaining sensor technologies and the categories that make physical interaction possible.

Live keyboard performance sound manipulation using a Multi Touch Interface

The idea of creating a multi-touch interface of controls for keyboardists on a widespread medium such as an iPad is very promising. Until now, a keyboardist in a live performance had to use one hand to alter the sound they produce by changing the state of a knob, button, fader, etc. If multiple parameters had to be changed at the same time during a live performance, this was impossible with just one hand, given the physical distance between one control on the keyboard and another. With this multi-touch application, keyboardists can alter multiple parameters of their sound with just one hand, using gestures, swipes and finger count, in the most intuitive way possible. Tests conducted by the developer team in cooperation with professional keyboardists show that the application becomes intuitive after a small amount of practice. The keyboardist sets up gestures and parameter assignments beforehand, and these are then applied during the live performance; a sketch of such a mapping follows below. With this application, a live performance can come closer to reproducing the sound that could be produced in the studio.
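
To make the gesture-setup idea concrete, here is a minimal Python sketch of how such a mapping might look. The paper does not publish code, so the parameter names, the scaling factor and send_to_synth are hypothetical placeholders.

```python
# Minimal sketch of the gesture-to-parameter mapping idea; parameter
# names, scaling and send_to_synth are hypothetical, not the authors' code.

PARAMETER_MAP = {
    1: "filter_cutoff",   # one-finger swipe
    2: "reverb_mix",      # two-finger swipe
    3: "delay_feedback",  # three-finger swipe
}

state = {"filter_cutoff": 0.5, "reverb_mix": 0.2, "delay_feedback": 0.1}

def send_to_synth(param: str, value: float) -> None:
    # Stand-in for a real MIDI/OSC message to the keyboard or synth.
    print(f"{param} -> {value:.2f}")

def on_swipe(finger_count: int, delta_y: float) -> None:
    """One hand, one swipe: the finger count selects the parameter,
    the vertical swipe distance changes its value."""
    param = PARAMETER_MAP.get(finger_count)
    if param is None:
        return  # unmapped gesture
    # Scale the swipe to a parameter change and clamp to [0, 1].
    state[param] = min(1.0, max(0.0, state[param] + delta_y * 0.01))
    send_to_synth(param, state[param])
```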

Source: https://www.nime.org/proceedings/2013/nime2013_275.pdf

IllumiWear: A Fiber-Optic eTextile for MultiMedia Interactions

https://www.nime.org/proceedings/2019/nime2019_paper088.pdf

IllumiWear is a novel eTextile prototype that uses fiber optic cables as interactive input and visual output.

Having considered the concept of IllumiWear, I was amazed by this idea. It opens a new opportunity for interaction with computers and interfaces.

The idea is that fiber optic cables are separated into bundles and then woven like a basket into a bendable glowing fabric. By attaching light-emitting diodes to one side of these bundles and photodiode light-intensity sensors to the other, the loss of light intensity can be measured when the fabric is bent.

The sensing technique of IllumiWear is not only able to discriminate between discrete touch, slight bends, and harsh bends, but can also recover the location of the deformation (a classification sketch follows below). In this way, this computational fabric prototype uses its intrinsic means of visual output (light) as a tool for interactive input.
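
As an illustration of how such a classification might work, here is a minimal Python sketch that maps the light-intensity loss of one bundle to the three deformation classes. The readings are assumed to be normalized to [0, 1], and the threshold values are invented for illustration; they are not taken from the paper.

```python
# Minimal sketch: classify the deformation of one fiber bundle from the
# drop in light intensity measured by its photodiode. Thresholds are
# invented for illustration, NOT values from the paper.

TOUCH_LOSS = 0.05        # above this: a finger resting on the fabric
SLIGHT_BEND_LOSS = 0.15  # above this: gentle curvature
HARSH_BEND_LOSS = 0.35   # above this: strong curvature

def classify_deformation(baseline: float, reading: float) -> str:
    """Return the deformation class for one bundle's intensity reading."""
    loss = baseline - reading
    if loss >= HARSH_BEND_LOSS:
        return "harsh bend"
    if loss >= SLIGHT_BEND_LOSS:
        return "slight bend"
    if loss >= TOUCH_LOSS:
        return "touch"
    return "no deformation"
```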

Figure 1: Overview of IllumiWear interaction techniques: (A) multitouch input similar to a MIDI keyboard; (B) varying-pressure touch input for broadening tangible expressiveness; (C) tangible bending and deformation.

Hardware Implementation

To implement the prototype, the authors divide all the optic fibers evenly into twenty bundles, which are then woven like a basket. This fabrication process forms a ten-by-ten mesh (see the localization sketch below).
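
Given that layout, one plausible way to recover where the fabric was deformed is to compare each bundle's reading against its baseline and pick the dimmest row/column pair. The following Python sketch is my own illustration under that assumption, not the authors' code.

```python
# With ten row bundles and ten column bundles, a deformation attenuates
# the light of one row and one column, so the pair with the largest
# loss approximates its position on the ten-by-ten mesh.

def locate_deformation(row_baselines: list[float], row_readings: list[float],
                       col_baselines: list[float], col_readings: list[float]) -> tuple[int, int]:
    """Return the (row, column) intersection with the greatest light loss."""
    row_loss = [b - r for b, r in zip(row_baselines, row_readings)]
    col_loss = [b - r for b, r in zip(col_baselines, col_readings)]
    return row_loss.index(max(row_loss)), col_loss.index(max(col_loss))
```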

(left) IllumiWear prototype implementation. (right) IllumiWear prototype in the dark.

Finally, the authors conducted four experiments based on this motivation:

  • Sensing Micro-bend
  • Sensing Macro-bend
  • Intersections
  • Color variation-based input

Conclusion

IllumiWear thus opens up great opportunities for use. It is able to recognise the difference between different types of touch and bends, and it can also recover the exact location of these deformations.

It supports at least five interaction modes:

  • location-based touch input
  • pressure sensitivity
  • sliding gesture
  • bend input and tangible deformation
  • input based on changes in light and color

Border: A Live Performance Based on Web AR and a Gesture-Controlled Virtual Instrument

For this blog entry I decided to read the paper "Border: A Live Performance Based on Web AR and a Gesture-Controlled Virtual Instrument" by K. Nishida et al.

https://www.nime.org/proceedings/2019/nime2019_paper009.pdf

I chose this paper because I personally really enjoy going to concerts and live performances, and the idea of having the audience interact with the performer immediately caught my interest. For this project, a web-based AR application was developed: web-based, because every participant can easily access it without downloading anything. "Border" allows performers to record their movement, which is then replayed in the AR environment. To control the sound, gesture tracking with a Kinect was mainly used. The performer can switch between a selection mode, in which a virtual instrument is chosen, and a play mode (a rough sketch of this mode logic follows below). The audience could access the recordings via the web application at five AR markers on stage, which were large enough (1 m x 1 m) that even people in the back could use them.
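
The paper does not publish its implementation, but the two-mode logic could look roughly like the following Python sketch. The gesture names, the instrument mapping and the BorderController class are hypothetical placeholders for whatever the Kinect pipeline actually reports.

```python
# Toy model of the performer's two modes (hypothetical, not from the paper).

SELECTION, PLAY = "selection", "play"

class BorderController:
    def __init__(self):
        self.mode = SELECTION
        self.instrument = None

    def on_gesture(self, gesture: str, hand_position: tuple[float, float, float]):
        # A dedicated gesture toggles between selection mode and play mode.
        if gesture == "switch_mode":
            self.mode = PLAY if self.mode == SELECTION else SELECTION
        elif self.mode == SELECTION and gesture == "point":
            self.instrument = self.pick_instrument(hand_position)
        elif self.mode == PLAY and self.instrument is not None:
            self.play(self.instrument, hand_position)

    def pick_instrument(self, pos):
        # Placeholder: map the horizontal hand position to one of three instruments.
        return "virtual_instrument_%d" % (int(pos[0] * 3) % 3)

    def play(self, instrument, pos):
        # Placeholder: map the hand height to a sound parameter and emit it.
        print(instrument, "->", round(pos[1], 2))
```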

NIME 2019 – IllumiWear: A Fiber-Optic eTextile for MultiMedia Interactions

In this blog entry, I've taken a look at the paper by Josh Urban Davis of the Department of Computer Science at Dartmouth College.

http://www.nime.org/proceedings/2019/nime2019_paper088.pdf

IllumiWear is an eTextile prototype that uses fiber optics as interactive input and visual output. Fibre optic cables are combined into bundles and then woven to create a bendable, glowing fabric. By connecting light-emitting diodes to one side of the fibre optic cables and light sensors to the other, the loss of light intensity can be measured when the fabric is bent.

With this technology, IllumiWear is able to recognise the difference between a touch, slight bends and harsh bends, and it can also recover the exact location of these deformations.

Supporting Custom Musical Interface Design for Children with Disabilities

For this blog entry, I've taken a look at the paper by Samuel Thompson Parke-Wolfe, Hugo Scurto and Rebecca Fiebrink on Sound Control: Supporting Custom Musical Interface Design for Children with Disabilities. I was immediately drawn to this topic, as I would like to work on the design of an application for children on the autism spectrum as part of my master's thesis.