
Session 12 - Instrument Control

Remutualizing the Instrument: Co-design of Synthesis Algorithms and Controllers
P Cook
Princeton University, Computer Science (also Music), Princeton, NJ, United States

From the advent of electronic music, and even from as early as the organ console and other remotely manipulated instruments, much of the design and research of new musical interfaces has focused on abstracting the "controller" from the "synthesizer" and then investigating how best to interface those two classes of hardware with the player. Yet many of the striking lessons from our musical history and experiences with intimate, expressive sonic objects lie in the blurred boundaries between player, controller, and sound-producer. The violin, wind instruments, and certainly the human voice all blur these boundaries, both in the design and manufacture of the "instrument" and in control and expression on those instruments.
This paper and presentation will look at some of the issues, and recent projects, in the concurrent design of controllers and computer synthesis algorithms. Specific cases will be described and demonstrated where the traditional engineering approach of building a controller (a box), and connecting it to a synthesizer (another box) would never have yielded the final product that resulted from the tightly coupled development of a complete musical system all at the same time. Examples will be given of where a discovery within the synthesis algorithm development suggested a new control metaphor, and where a control component suggested a new aspect of the synthesis algorithm. A number of "complete" new instruments will be shown.

Interactive music systems for everyone: Exploring visual feedback as a way for creating more intuitive, productive and learnable instruments
S Jordà
Pompeu Fabra, Music Technology Group, Barcelona, Spain

New digital musical instruments designed for professional and trained musicians can be quite complex and challenging, offering in return a great amount of creative freedom and control possibilities to their players. On the other hand, instruments designed for amateur musicians or for audiences in interactive sound installations tend to be quite simple, often trying to give their users the illusion of control and interaction while still producing 'satisfactory' output. Logically, these two classes of instruments are often mutually exclusive: musicians quickly become bored with 'popular' tools, while casual users get lost with sophisticated ones. But would it not be possible to design instruments that appeal to both sectors: systems whose basic principles of operation are easy to deduce, while, at the same time, sophisticated expression is possible and mastery is progressively attainable? In this paper we will show how visual feedback can greatly increase the intuitiveness of an interactive music system, making complex principles understandable. Although visual feedback is arguably not very important for playing traditional instruments, digital instruments should use to their advantage anything that can broaden the communication channel with their players. After a general survey of several systems that have exploited these feedback capabilities, we will show some of the new designs and implementations being carried out in this direction by our research group.

Controlling the virtual bodhran - the vodhran
R Bresin¹, S Dahl¹, M Marshall², M Rath³, B Moynihan²
¹KTH, TMH, Stockholm, Sweden; ²University of Limerick, Interaction design centre, Limerick, Ireland; ³University of Verona, Dipartimento di Informatica, Verona, Italy

A successful interface benefits from employing a metaphor with which the end user of the artefact is already familiar. This application aims to provide users with an expressive virtual musical instrument based on the traditional Bodhran. It is not designed to simulate the Bodhran exactly, but to provide an instrument that can be played in a similar way and that creates a recognisable Bodhran-like sound: a virtual Bodhran, the Vodhran. This instrument is intended as an extension of the original, allowing additional playing techniques and styles that could not be accomplished with the real instrument.
In two experiments the same sound model implementing the Bodhran was used. In the first experiment the sound model was controlled both with data obtained from measurements of a drum player's gestures and with a traditional playing approach using two different controllers. The Radio Baton and the ddrum (a drum pad) were used as interfaces for this experiment; both were played with a traditional Bodhran double beater. In the second experiment a sensor-based controller was used to translate the player's gestures into sound model controls.
The sound generation mechanism for the Vodhran is based on the modal description of the drum and a robust numerical solution of a nonlinear stick-membrane interaction model. The model includes different forms of a player's interference.
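As a rough illustration of the modal approach described above, a drum hit can be sketched as a sum of exponentially damped sinusoids, one per membrane mode. The mode frequencies, decay times, and amplitudes below are illustrative placeholders, not the Vodhran's actual parameters, and the nonlinear stick-membrane interaction is omitted:

```python
import numpy as np

def modal_drum(freqs, decays, amps, sr=44100, dur=0.5):
    """Sum of exponentially damped sinusoids: one per membrane mode.

    freqs  : modal frequencies in Hz
    decays : decay time constants in seconds
    amps   : relative modal amplitudes (shaped by the strike position)
    """
    t = np.arange(int(sr * dur)) / sr
    out = np.zeros_like(t)
    for f, tau, a in zip(freqs, decays, amps):
        out += a * np.exp(-t / tau) * np.sin(2 * np.pi * f * t)
    return out

# Illustrative mode set for a small frame drum (placeholder values)
y = modal_drum(freqs=[146.0, 232.0, 312.0],
               decays=[0.30, 0.20, 0.12],
               amps=[1.0, 0.6, 0.35])
```

In a playable instrument, a controller such as the Radio Baton would set the strike amplitude and the modal weights at each detected hit.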

The Radio Baton as configurable musical instrument and controller
R Bresin, K F Hansen, S Dahl
KTH, TMH, Stockholm, Sweden

To date, about 40 units of the Max Mathews Radio Baton (RB) have been produced. It has usually been applied as an orchestra conducting system, as an interactive music composition controller using typical percussionist gestures, and as a controller for sound synthesis models. In the framework of the EU-funded Sounding Object project, the RB has found new application scenarios. Three applications were based on this controller, made possible by changing the gesture controls: instead of the default batons, a new radio sender that fits the fingertips was developed. This new radio sender allows musicians to interact using hand gestures and can also be fitted to different devices.
A Pd model of DJ scratching techniques (submitted to SMAC03) was controlled with the RB and the fingertip radio sender. This controller gives DJs direct control of sampled sounds while maintaining hand gestures similar to those used on vinyl.
The sound model of a Bodhran (submitted to SMAC03) was controlled with a traditional playing approach. The RB was played with a traditional Bodhran double beater with one fingertip radio sender at each end. This allowed detection of the beater position on the RB surface, the surface corresponding to the membrane in the sound model.
In a third application the fingertip controller was used to move a virtual ball rolling along the elastic surface of a box placed over the surface of the RB.
The DJ console and the virtual bodhran were played in concerts.

DJ scratching performance techniques: Analysis and synthesis
K F Hansen, R Bresin
KTH, TMH, Stockholm, Sweden

Scratching is a popular way of making music, turning the DJ into a musician. Normally scratching is done using a vinyl record, a turntable and a mixer. Vinyl manipulation is built up from a number of specialized techniques that have been analysed in a previous study. The present study has two main objectives. The first is to better understand and model turntable scratching as performed by DJs. The second is to design a gesture controller for physical sound models, i.e. models for friction sounds. We attached sensors to a DJ equipment setup. Then a DJ was asked to perform typical scratch gestures both in isolation and in a musical context, i.e. as in a real performance. He was also asked to play with different emotions: sad, angry, happy and fearful. A model of the techniques used by the DJ was built based on the analysis of the collected data. The model has been implemented in Pd. The Radio Baton, with specially adapted gesture controllers, has been used for controlling the model. The system has been tested by professional DJs in concerts.
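A minimal sketch of the core of such a scratching model, namely variable-speed sample playback driven by a hand-speed curve, might look like this in Python. The speed values and the 'baby scratch' gesture are hypothetical, and the published model implemented in Pd is considerably richer (technique models, friction sounds, emotion-dependent timing):

```python
import numpy as np

def scratch(sample, speed_curve):
    """Variable-speed playback: integrate the hand-speed curve into a
    record read position, then read the sample with linear interpolation."""
    pos = np.clip(np.cumsum(speed_curve), 0, len(sample) - 2)
    i = pos.astype(int)
    frac = pos - i
    return (1 - frac) * sample[i] + frac * sample[i + 1]

# Hypothetical 'baby scratch': push the record forward, then pull it back
sr = 44100
sample = np.sin(2 * np.pi * 440 * np.arange(sr) / sr)  # stand-in for a vinyl sample
speed = np.concatenate((np.full(sr // 4, 1.5),         # forward at 1.5x speed
                        np.full(sr // 4, -1.5)))       # backward at 1.5x speed
out = scratch(sample, speed)
```

In a gesture-controlled setup, `speed_curve` would come from the sensed hand movement rather than being predefined.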

Gesture-tactile controlled physical modelling music synthesis
D M Howard, S Rimell
University of York, Electronics, York, United Kingdom

Although the incorporation of computer technology into electronic musical instruments frees musicians from many physical constraints and enables them to push back the creative boundaries within which they work, musicians are still searching for virtual instruments that come closer to their physical counterparts in terms of playing experience. This paper describes a physical modelling music synthesis system that enables 'virtual instruments' to be controlled in real time via a force-feedback joystick and a force-feedback mouse, which provide the user with gestural controllers incorporating tactile feedback. Virtual instruments are set up via a graphical user interface in a highly intuitive manner: users design them by interacting directly with their physical shape and structure, in terms of the physical properties of basic objects such as strings, membranes and solids, which can be interconnected to form complex structures. Acoustic excitation can be applied at any point mass via virtual bowing, plucking, striking, or from an external sound source. Virtual microphones can be placed at any point mass to deliver the acoustic output. The organic nature of the resulting acoustic output will be illustrated through demonstration.
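A minimal sketch of the underlying mass-spring idea (not the authors' implementation): a chain of point masses joined by springs with fixed ends, 'plucked' by displacing one mass, with a 'virtual microphone' reading the displacement at another. The stiffness, damping, and step count are illustrative values chosen for numerical stability:

```python
import numpy as np

def pluck_string(n=40, k=1e7, damping=20.0, steps=2000, dt=1/44100):
    """Symplectic-Euler simulation of a chain of point masses joined by
    springs, with both ends fixed -- a minimal 'virtual string'."""
    y = np.zeros(n)            # mass displacements
    v = np.zeros(n)            # mass velocities
    y[n // 4] = 1.0            # 'pluck': displace one mass
    out = np.empty(steps)
    for step in range(steps):
        left = np.concatenate(([0.0], y[:-1]))   # neighbour to the left (fixed end = 0)
        right = np.concatenate((y[1:], [0.0]))   # neighbour to the right (fixed end = 0)
        accel = k * (left - 2 * y + right) - damping * v
        v += accel * dt        # update velocities first (symplectic Euler)
        y += v * dt            # then positions, using the new velocities
        out[step] = y[n // 2]  # 'virtual microphone' at the midpoint
    return out

sig = pluck_string()
```

Membranes and solids extend the same idea to 2-D and 3-D grids of masses, and bowing or striking amounts to injecting force at chosen masses instead of setting an initial displacement.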
