Session 12 - Instrument Control
From the advent of electronic music, and even from as early as the organ console and other remotely operated instruments, much of the design and research of new musical interfaces has focused on abstracting the "controller" from the "synthesizer" and then investigating how best to interface these two classes of hardware with the player. Yet many of the striking lessons from our musical history, and from our experiences with intimate, expressive sonic objects, lie in the blurred boundaries between player, controller, and sound producer. The violin, wind instruments, and certainly the human voice all blur these boundaries, both in the design and manufacture of the "instrument" and in control and expression on those instruments.
New digital musical instruments designed for professional and trained musicians can be quite complex and challenging, offering in return a great deal of creative freedom and many control possibilities to their players. On the other hand, instruments designed for amateur musicians, or for audiences in interactive sound installations, tend to be quite simple, often trying to give their users the illusion of control and interaction while still producing 'satisfactory' output. Understandably, these two classes of instruments are often mutually exclusive: musicians quickly become bored with 'popular' tools, while casual users get lost with sophisticated ones. But wouldn't it be possible to design instruments that appeal to both groups: systems in which the basic principles of operation are easy to deduce, while, at the same time, sophisticated expression is possible and mastery is progressively attainable? In this paper we show how visual feedback can greatly increase the intuitiveness of an interactive music system, making complex principles understandable. Although visual feedback is arguably not very important for playing traditional instruments, digital instruments should use to their advantage anything that can broaden the communication channel with the player. After a general survey of several systems that have exploited these feedback capabilities, we present some of the new designs and implementations being carried out in that direction by our research group.
In order to build a successful interface, it is preferable to employ a metaphor with which the end user of the artefact is familiar. This application aims to provide users with an expressive virtual musical instrument based on the traditional Bodhran. It is not designed to simulate the Bodhran exactly, but to provide an instrument which can be played in a similar way and which creates a recognisable Bodhran-like sound: a virtual Bodhran, the Vodhran. This instrument is intended as an extension of the original, allowing for additional playing techniques and styles which could not be accomplished with the real instrument.
About 40 units of the Max Mathews radio baton (RB) have been produced to date. It has usually been used as an orchestra-conducting system, as an interactive music composition controller driven by typical percussionist gestures, and as a controller for sound synthesis models. In the framework of the EU-funded Sounding Object project, the RB has found new application scenarios. Three applications were based on this controller, made possible by changing the gesture controls: instead of the default batons, a new radio transmitter that fits on the fingertips was developed. This new transmitter allows musicians to interact through hand gestures, and it can also be fitted to different devices.
Scratching is a popular way of making music that turns the DJ into a musician. Scratching is normally done with a vinyl record, a turntable and a mixer. Vinyl manipulation comprises a number of specialized techniques that were analysed in a previous study. The present study has two main objectives: first, to better understand and model turntable scratching as performed by DJs; second, to design a gesture controller for physical sound models, i.e. models of friction sounds. We attached sensors to a DJ equipment set-up and asked a DJ to perform typical scratch gestures, both in isolation and in a musical context, i.e. as in a real performance. He was also asked to play with different emotions: sad, angry, happy and fearful. A model of the techniques used by the DJ was built from an analysis of the collected data and implemented in Pd. The Radio Baton, with specially adapted gesture controllers, has been used to control the model, and the system has been tested by professional DJs in concerts.
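As a minimal sketch of the kind of gesture-to-model mapping the abstract describes (the actual analysis and Pd implementation are not reproduced here; the function names, the sign-change segmentation, and the velocity-to-friction mapping are all illustrative assumptions):

```python
def angular_velocity(angles, dt):
    """Finite-difference angular velocity from sampled record angles (rad, s)."""
    return [(a1 - a0) / dt for a0, a1 in zip(angles, angles[1:])]

def segment_strokes(velocity, threshold=0.1):
    """Split a scratch gesture into forward (+1) and backward (-1) strokes.

    A stroke boundary is a sign change in angular velocity; samples below
    |threshold| are treated as the record being held still. The threshold
    value is an arbitrary illustrative choice.
    """
    strokes = []
    current_sign = 0
    for v in velocity:
        sign = 0 if abs(v) < threshold else (1 if v > 0 else -1)
        if sign != 0 and sign != current_sign:
            strokes.append(sign)
        current_sign = sign
    return strokes

def friction_controls(v, normal_force=1.0):
    """Map angular velocity to a (sliding speed, normal force) control pair
    for a friction sound model; the mapping shown is illustrative only."""
    return (abs(v), normal_force)
```

On synthetic data for one back-and-forth "baby scratch" (record angle ramping up, then down), `segment_strokes` yields one forward and one backward stroke, which is the kind of segmentation a gesture model could be built on.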
Although the incorporation of computer technology in electronic musical instruments frees musicians from physical constraints and enables them to push back the creative boundaries within which they work, musicians are still searching for virtual instruments that come closer to their physical counterparts in terms of playing experience. This paper describes a physical modelling music synthesis system that enables 'virtual instruments' to be controlled in real time via a force-feedback joystick and a force-feedback mouse, which provide the user with gestural controllers incorporating tactile feedback. Virtual instruments are set up via a graphical user interface in a highly intuitive manner: users design them by interacting directly with their physical shape and structure, in terms of the physical properties of basic objects such as strings, membranes and solids, which can be interconnected to form complex structures. Acoustic excitation can be applied at any point mass via virtual bowing, plucking, striking, or an external sound source. Virtual microphones can be placed at any point mass to deliver the acoustic output. The organic nature of the resulting acoustic output will be illustrated through demonstration.
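The core ideas of such mass-spring physical models can be sketched in a few lines: point masses coupled by springs, a pluck applied as an initial displacement at one mass, and a virtual microphone that simply records the displacement of another mass over time. This toy one-dimensional string is not the paper's system; the parameter values and the explicit damped update scheme are illustrative assumptions.

```python
def simulate_plucked_string(n_masses=20, steps=200, k=0.5, damping=0.999,
                            pluck_at=5, mic_at=12):
    """Explicit update of a 1-D mass-spring string with fixed ends.

    Each interior mass feels a spring force proportional to the discrete
    Laplacian of the displacements (its two neighbours pulling it back);
    a per-step damping factor makes the vibration decay. The 'virtual
    microphone' is the recorded displacement of the mass at mic_at.
    """
    pos = [0.0] * n_masses
    vel = [0.0] * n_masses
    pos[pluck_at] = 1.0  # pluck: an initial displacement at one point mass
    output = []
    for _ in range(steps):
        # accumulate spring forces before moving any mass
        for i in range(1, n_masses - 1):  # end masses stay fixed
            force = k * (pos[i - 1] - 2 * pos[i] + pos[i + 1])
            vel[i] = damping * (vel[i] + force)
        for i in range(1, n_masses - 1):
            pos[i] += vel[i]
        output.append(pos[mic_at])
    return output
```

With k = 0.5 this explicit scheme is stable, and the disturbance started at the plucked mass propagates along the string and registers at the microphone mass within a few dozen steps. Membranes and solids extend the same idea to 2-D and 3-D neighbour topologies.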