Project Details


    ScreenPlay is a unique and innovative interactive computer music system (ICMS) designed to captivate users of all levels of musical and technological proficiency. The system incorporates aspects of the three main approaches to ICMS design: transformative, generative, and sequenced (Rowe, 1994). Each of these design frameworks prioritises the affordance of control over a specific aspect of the system's musical output while ignoring the many other musical parameters over which influence can be exerted; as a result, each is often limited in the demographics to which it caters. Sequenced systems (Incredibox (So Far So Good, 2011–present); Patatap (Brandel, 2012–present; Brandel, 2015)) are usually tailored towards a lone user and allow for the full orchestration and arrangement of system-specific or pre-existing compositions, but are often devoid of computer influence over the system's musical output. Transformative and generative systems (NodeBeat (Sandler, Windle & Muller, 2011–present); Bloom (Eno & Chilvers, 2008)) rely upon an underlying algorithmic framework to generate appropriate musical responses to user input and are better suited than sequenced systems to facilitating interaction between multiple users. However, ICMSs based on the transformative or generative design models are often melodically and harmonically simplistic, incorporating only a few different parts/lines, and offer the user(s) limited influence over the musical output. All of the aforementioned design models are hampered by considerable stylistic constraints.
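Generative systems of the kind described above are frequently built on simple probabilistic models of a user's input. As a purely illustrative sketch (the training phrase and function names below are my own invention and do not come from any of the systems cited), a first-order Markov chain over MIDI pitches might look like:

```python
import random
from collections import defaultdict

def train(notes):
    """Build a first-order Markov transition table from a note sequence."""
    table = defaultdict(list)
    for a, b in zip(notes, notes[1:]):
        table[a].append(b)  # record every observed successor of note a
    return table

def generate(table, start, length, rng=random.Random(0)):
    """Walk the table to produce a new sequence echoing the input's style."""
    out = [start]
    for _ in range(length - 1):
        successors = table.get(out[-1])
        if not successors:          # dead end: restart from the seed note
            successors = [start]
        out.append(rng.choice(successors))
    return out

# Hypothetical training phrase (MIDI note numbers, a C major fragment)
phrase = [60, 62, 64, 62, 60, 64, 65, 64, 62, 60]
table = train(phrase)
print(generate(table, 60, 8))
```

Every note the chain emits follows a transition actually observed in the user's phrase, which is why such output sounds stylistically related to the input while still diverging from it.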
ScreenPlay seeks to combat this exclusivity of focus by encapsulating and evolving the fundamental principles behind the three system design models in a novel approach to ICMS design, and by introducing new and unique concepts to human-computer interaction (HCI) in music: a bespoke topic-theory-inspired transformative algorithm, applied alongside Markovian generative algorithms, which breaks routine in collaborative improvisatory performance and generates new musical ideas in composition by providing new and additional dimensions of expressivity. Topic theory, which was particularly prevalent during the Classical and Romantic periods, is a compositional tool whereby the composer employs specific musical identifiers – known as topics – in order to evoke certain emotional responses and cultural/contextual associations in the minds of the audience (Monelle, 2000). Specifically, the topic-theory-inspired transformative algorithm transforms the user's musical input either melodically or texturally/timbrally through the application of four "topical oppositions": Joy—Lament, Open—Close, Light—Dark, and Stability—Destruction. The impact of both the Joy—Lament and Light—Dark transformations is directly informed by specific topics, with the basis of Joy being found in the fanfare topic, Lament in the pianto, Light in the hunt, and Dark in the nocturnal (Monelle, 2000; 2006). The effect of the Open—Close transformation draws upon Denis Smalley's theory of spectromorphology – in particular, his four qualifiers of spectral space: emptiness—plenitude, diffuseness—concentration, streams—interstices, and overlap—crossover (1997) – and is designed to mimic the effects of increased and decreased proximity to a sound source, as well as the size of the space in which it is sounding.
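One way to picture a topical opposition is as a single continuous control between two poles. The sketch below is purely illustrative and is not ScreenPlay's actual algorithm: the scale choices, function names, and blending rule are all my own assumptions, loosely motivated by the fanfare (major, triadic) and pianto (minor, sighing) associations described above.

```python
# Illustrative sketch: a topical opposition as one continuous control.
# -1.0 is the "Lament" pole, +1.0 the "Joy" pole. The scales below are
# invented for illustration, not ScreenPlay's actual mapping.

JOY_SCALE = [0, 2, 4, 5, 7, 9, 11]      # major scale degrees (fanfare-like)
LAMENT_SCALE = [0, 2, 3, 5, 7, 8, 10]   # natural minor (pianto-like)

def snap(pitch, scale, root=60):
    """Snap a MIDI pitch to the nearest scale tone relative to the root."""
    degree = (pitch - root) % 12
    octave = (pitch - root) // 12
    nearest = min(scale, key=lambda d: abs(d - degree))
    return root + octave * 12 + nearest

def apply_opposition(pitches, amount):
    """amount in [-1, 1]: negative pulls toward Lament, positive toward
    Joy; its magnitude sets the fraction of notes transformed (a crude
    stand-in for a continuous blend between the two poles)."""
    scale = JOY_SCALE if amount >= 0 else LAMENT_SCALE
    n = round(abs(amount) * len(pitches))
    return [snap(p, scale) if i < n else p for i, p in enumerate(pitches)]

print(apply_opposition([60, 61, 64, 66], -1.0))  # fully at the Lament pole
```

The design point the sketch makes is that a pair of opposed topics collapses many musical parameters into one intuitive axis, which is what allows a GUI to expose it as a single control.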
The Stability—Destruction transformation draws upon my own interpretation, as well as the literal interpretation, of a destructive effect upon sound through the use of real-time granulation. In essence, ScreenPlay's application of topic theory reverses the traditional roles of music and meaning: textual descriptors presented to the user(s) via the graphical user interface (GUI) describe the audible effects of the various topical-opposition transformations. The final novel inclusion in ScreenPlay's design is its capability to operate both as a multi-user-and-computer collaborative, improvisatory interactive performance system, capable of hosting up to sixteen users, each of whom is afforded control via a dedicated touchscreen-based GUI over a single instrument/sound in the system's musical output, and as a single-user-and-computer studio compositional tool for Ableton Live that affords the user direct control over up to sixteen individual elements from a single instance of the GUI. This flexibility of application is made possible by the implementation of carefully refined perceived affordances (Norman, 2004) in the touchscreen-based GUI, which result in a seamlessly intuitive interactive experience regardless of the manner in which ScreenPlay is being used. Two-way communication between interfaces is exhibited when ScreenPlay is running in multi-mode, so that changes made to global parameters by one user are reflected in the interfaces of the others; when running in single-mode, the GUI updates to reflect the status of the currently selected part.
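Granulation rebuilds a sound from many short, overlapping or re-ordered fragments ("grains"), which is what gives it its destructive character. The toy offline version below conveys the idea only; the parameter values and function name are invented for illustration, and a real-time granulator would of course operate on a live audio stream with windowed, overlapping grains.

```python
import random

def granulate(signal, grain_len=4, jitter=8, rng=random.Random(1)):
    """Rebuild a signal from grains whose read positions are randomly
    offset, loosening the original sample order (a toy, offline
    granulator; parameters are illustrative)."""
    out = []
    for start in range(0, len(signal), grain_len):
        # jitter the read position around the nominal grain start
        src = start + rng.randint(-jitter, jitter)
        src = max(0, min(src, len(signal) - grain_len))
        out.extend(signal[src:src + grain_len])
    return out

samples = list(range(32))  # stand-in for an audio buffer
print(granulate(samples))
```

Small jitter values leave the source largely recognisable; large values tip the result toward the "Destruction" pole by scattering the original ordering.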
This two-way communication between Ableton Live and the touchscreen GUI extends further still when in single-mode: the suite of Max for Live MIDI Devices that constitutes ScreenPlay's underlying computational framework includes parameter controls mirroring those displayed on the touchscreen GUI. This better supports the integration of ScreenPlay into the existing compositional/performative setups of practising electronic/computer musicians by affording the user the choice of controlling the transformative/generative algorithms either via the touchscreen GUI or directly from Ableton Live. While the rejuvenation and appropriation of topic theory within contemporary electronic/computer music is ScreenPlay's most significant achievement, it is the combination of all the novel aspects of its design that results in an ICMS affording users of any level of musical/technological proficiency the ability to easily and efficiently create electronic/computer music of any style/genre in symbiosis with other users and/or the computer.
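The two-way communication described above amounts to broadcasting each parameter change to every connected interface except the one that made it, and pushing the current state to any interface that joins. The sketch below models this pattern only; the class and method names are my own and do not reflect ScreenPlay's actual Max for Live implementation.

```python
class ParameterHub:
    """Toy model of two-way parameter sync: any client's change to a
    global parameter is pushed to every other registered interface."""

    def __init__(self):
        self.params = {}
        self.clients = []  # each client is a dict mirroring one GUI's state

    def register(self, client):
        self.clients.append(client)
        client.update(self.params)  # a new interface reflects current state

    def set_param(self, sender, name, value):
        self.params[name] = value
        for client in self.clients:
            if client is not sender:  # don't echo the change to its sender
                client[name] = value

hub = ParameterHub()
gui_a, gui_b = {}, {}
hub.register(gui_a)
hub.register(gui_b)
hub.set_param(gui_a, "tempo", 120)
print(gui_b["tempo"])  # the change made on one GUI appears on the other
```

Excluding the sender from the broadcast avoids redundant (and potentially looping) updates, which is the standard design choice when mirroring state across multiple control surfaces.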
    Effective start/end date: 1/10/13 – 1/10/20


    • Human-Computer Interaction
    • Electronic Music
    • Topic Theory
    • Ableton Live
    • Max/MSP
    • Max for Live

