

# Extension tessitura: pros and cons series
Multiagent systems can be used in a myriad of musical applications, including electro-acoustic composition, automatic musical accompaniment, and the study of emergent musical societies. Previous works in this field were usually concerned with solving very specific musical problems and focused on symbolic processing, which limited their widespread use, especially when audio exchange and spatial information were needed. To address this shortcoming, Ensemble, a generic framework for building musical multiagent systems, was implemented, based on a previously defined taxonomy and architecture. The present paper discusses some implementation details and framework features, including event exchange between agents, agent motion in a virtual world, realistic 3D sound propagation simulation, and interfacing with other systems such as Pd and audio processing libraries. A musical application based on Steve Reich's Clapping Music was conceived and implemented using the framework as a case study to validate the aforementioned features. Finally, we discuss some performance results, the corresponding implementation challenges, and the solutions we adopted to address these issues.
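
The Clapping Music case study lends itself to a compact illustration. The sketch below is not Ensemble code and does not use its API; it is a minimal standalone Java example, with hypothetical names, of the rhythmic logic such a case study exercises: one agent repeats Reich's 12-pulse pattern while a second agent plays the same pattern rotated by one additional pulse per section until the two realign.

```java
// Minimal standalone sketch of the rhythmic logic behind a Clapping Music
// case study. This is NOT the Ensemble API; class and method names are
// hypothetical and chosen only for illustration.
public class ClappingMusicSketch {

    // Reich's 12-pulse basic pattern: 1 = clap, 0 = rest ("xxx.xx.x.xx.").
    static final int[] BASE = {1, 1, 1, 0, 1, 1, 0, 1, 0, 1, 1, 0};

    // Player 2's pattern in section k is the base pattern rotated by k pulses.
    static int[] rotatedPattern(int shift) {
        int n = BASE.length;
        int[] out = new int[n];
        for (int i = 0; i < n; i++) {
            out[i] = BASE[(i + shift) % n];
        }
        return out;
    }

    public static void main(String[] args) {
        // 13 sections: shifts 0..12; at shift 12 the two players are in unison again.
        for (int shift = 0; shift <= 12; shift++) {
            StringBuilder line = new StringBuilder("section " + (shift + 1) + "  player 2: ");
            for (int pulse : rotatedPattern(shift)) {
                line.append(pulse == 1 ? 'x' : '.');
            }
            System.out.println(line);
        }
    }
}
```

In an agent-based realization of the piece, each of the two patterns would drive a separate agent whose claps are emitted as sound events into the shared virtual environment described above.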

We report on a series of workshops with musicians and robotics engineers aimed at studying how human and machine improvisation can be explored through interdisciplinary design research.

This paper presents a computational model for simulating the behavior of a jazz bass player during live performance. In jazz performance there is a strikingly large gap between the music actually played and the instructions given in a chord grid. Making use of several artificial intelligence paradigms, we try to bridge this gap by considering two critical points: the experience acquired by musicians and its use with respect to the evolving context of live improvisation. In previous papers we have presented a model of an improvising bass player from an AI perspective, emphasizing the underlying problem-solving method. Born in the early forties, the cybernetic movement makes reference to three main conceptual constituents: input/output, feedback, and information. Since we focus here on the interaction model, our starting point is a formulation of the model in terms of behavioral (or abstract) automata theory.
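
The paper's concrete formalization is not reproduced here, but the following Java skeleton suggests what an automata-theoretic (behavioral) formulation of the interaction model can look like: a Mealy machine that, on each beat, maps its current state and an input (for instance, the current chord-grid symbol and what the agent hears) to an output note and a successor state. All type and method names are assumptions made for illustration.

```java
// Generic Mealy-machine skeleton illustrating an automata-theoretic ("behavioral")
// view of the interaction model: on each beat the agent reads an input, emits an
// output, and moves to a new internal state. Names are hypothetical and do not
// reproduce the paper's concrete formalization.
import java.util.function.BiFunction;

public class MealyAutomaton<S, I, O> {
    private S state;
    private final BiFunction<S, I, S> transition; // delta: S x I -> S
    private final BiFunction<S, I, O> output;     // lambda: S x I -> O

    public MealyAutomaton(S initial, BiFunction<S, I, S> transition, BiFunction<S, I, O> output) {
        this.state = initial;
        this.transition = transition;
        this.output = output;
    }

    // Consume one input symbol, produce one output symbol, and update the state.
    public O step(I input) {
        O out = output.apply(state, input);
        state = transition.apply(state, input);
        return out;
    }

    public static void main(String[] args) {
        // Toy instantiation: state and input are pitch classes; this "bass player"
        // simply plays the root of the incoming chord, standing in for the
        // experience-based functions the paper describes.
        MealyAutomaton<Integer, Integer, Integer> bass =
                new MealyAutomaton<>(0, (state, chordRoot) -> chordRoot, (state, chordRoot) -> chordRoot);
        int[] chordGrid = {0, 5, 7, 0}; // C, F, G, C
        for (int chord : chordGrid) {
            System.out.println("chord root " + chord + " -> plays " + bass.step(chord));
        }
    }
}
```

In such a formulation, the musician's acquired experience would be encoded in the transition and output functions, while the evolving context of the improvisation lives in the state.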
