Andrew Johnston is a researcher and interaction/software designer based in Sydney, Australia. His work focuses on the design of systems that support experimental, exploratory approaches to interaction, and the experiences and practices of the people who use them.
He has qualifications in music (B. Arts) and computing (Master of IT) and a PhD combining the two. As a musician he has performed professionally with the Melbourne and Sydney Symphony Orchestras, among many other ensembles. His PhD research, completed in 2009, involved the creation of interactive software for use by expert musicians in live performance.
Andrew has had a long and productive collaboration with Stalker Theatre, a Sydney-based dance and physical theatre company. With PhD researcher Andrew Bluff he developed large-scale interactive projections for the productions 'Encoded' and 'Creature: A Retelling of Dot and the Kangaroo', which have toured nationally and internationally to the Netherlands, South Korea, the United Kingdom and Hong Kong. Work describing these productions and the relationships between interactive technologies and creative practice has been published at SIGGRAPH, in Leonardo and Digital Creativity, and at the New Interfaces for Musical Expression and Movement & Computing conferences, among other venues.
Andrew is Research Director of the UTS Animal Logic Academy, a unique, professionally-equipped studio focusing on the creative application and design of digital technologies. He also co-directs the Creativity and Cognition Studios, an interdisciplinary research group working at the intersection of performance, art and technology.
Can supervise: YES
Interaction design, Interactive media, Agile software development methods, Software development.
Samaras, E & Johnston, A 2019, 'Off-Lining to Tape Is Not Archiving: Why We Need Real Archiving to Support Media Archaeology and Ensure Our Visual Effects Legacy Thrives', Leonardo, vol. 52, no. 4, pp. 374-380.
This paper examines digital asset archiving and preservation practice in the visual effects (VFX) industry. The authors briefly summarize media archaeology theory and provide an overview of how VFX studios presently archive project assets and records, based on case study and interview research conducted with expert VFX practitioners from leading international studios. In addition, the authors propose that current practice could be improved by adopting archival science methods, including digital preservation practices. Doing so will support media archaeology studies of digital cultures over time and ensure that the legacy of VFX creative and technical production thrives for future generations.
Samaras, E & Johnston, A 2018, 'Fleeting Film: Using Story to Seek Archival Permanence in the Transitory and Globalized Digital Visual Effects Industry', Preservation, Digital Technology and Culture, vol. 47, no. 1, pp. 12-22.
Archiving is a long-standing vocation, founded on principles such as provenance, original order, truth, evidence, preservation and permanence. It is a far cry from the visual spectacle and movable feast of film visual effects (VFX): a transitory and globalized industry of disposable firms, ever-advancing technologies and a roving workforce which crafts digital animations and seamless effects for the big screen. In this paper we use the concept of "story" as a premise to bring together the seemingly different vocations of archival science and film VFX. Through an exploration of digital film production and archival practice in the context of storytelling, we aim to highlight the need for archivists to work with the VFX industry to ensure evidence of this culturally significant aspect of filmmaking and cinema discourse is preserved into the future. We also argue that archives are more than collections of historical evidence. Archives are story, and archivists are storytellers.
This paper discusses Creature:Interactions (2015), a large-scale mixed-reality artwork created by the authors that incorporates immersive 360° stereoscopic visuals, interactive technology, and live actor facilitation. The work uses physical simulations to promote an expressive full-bodied interaction as children explore the landscapes and creatures of Ethel C. Pedley's ecologically focused children's novel, Dot and the Kangaroo. The immersive visuals provide a social playspace for up to 90 people and have produced "phantom" sensations of temperature and touch in certain participants.
Walsh, L, Bluff, A & Johnston, A 2017, 'Water, image, gesture and sound: composing and performing an interactive audiovisual work', Digital Creativity, vol. 28, no. 3, pp. 177-195.
Performing and composing for an interactive audiovisual system presents many challenges to the performer. Working with visual, sonic and gestural components requires new skills and new ways of thinking about performance. However, there are few studies that focus on performer experience with interactive systems. We present the work Blue Space for oboe and interactive audiovisual system, highlighting the evolving process of the collaborative development of the work. We consider how musical and technical demands interact in this process, and outline the challenges of performing with interactive systems. Using the development of Blue Space as a self-reflective case study, we examine the role of gestures in interactive audiovisual works and identify new modes of performance.
This paper considers the relationship between design, practice and research in the area of New Interfaces for Musical Expression (NIME). The author argues that NIME practitioner-researchers should embrace the instability and dynamism inherent in digital musical interactions in order to explore and document the evolving processes of musical expression.
This paper describes an interactive dance/physical theatre work entitled Encoded, which made use of motion capture techniques and real-time fluid simulations to create systems intended to support, stimulate and augment live performance. Preliminary findings from a qualitative study of performers’ experiences with the system raise a number of issues, including the challenges of creating theatrical meaning with interactive systems, using Contact Improvisation as a metaphor for engaging creative systems, and the impact that large-scale projections can have on performers’ engagement.
Ngo, H, Chuang, Y, Guo, W, Ho, D, Pham, N, Johnston, AJ, Lim, RP & Listowski, A 2009, 'Resident's strategy survey on a new end use of recycled water in Australia', Desalination and Water Treatment, vol. 11, no. 1-3, pp. 93-97.
The concept of using recycled water in washing machines was introduced as a new end use. As there is a noticeable lack of social research on public perceptions of this application, a residents' strategy survey was carried out in selected Sydney suburbs with demographically significant differences in gender, age, education, and property style and ownership. The survey indicates that the majority of the community considers the use of recycled water in washing machines indispensable in view of continuing drought and the associated water shortages. Given safety assurance and demonstration, recycled water for washing machines attracted a considerable proportion of positive responses. The community generally understands that recycled water is the more environmentally friendly option, although from a cleanliness and public health point of view, higher quality water is required for reuse in washing machines. Moreover, residents consider that a small point-of-use pre-treatment unit, installed before recycled water enters the washing machine, might assure quality and safety. The survey also shows that the major concerns for residents in using recycled water for washing machines are public health, water cleanliness and washing machine durability.
Johnston, AJ, Candy, L & Edmonds, EA 2008, 'Designing and evaluating virtual musical instruments: facilitating conversational user interaction', Design Studies, vol. 29, no. 6, pp. 556-571.
This paper is concerned with the design of interactive virtual musical instruments. An interaction design strategy which uses on-screen objects that respond to user actions in physically realistic ways is described. This approach allows musicians to 'play' the virtual instruments using the sound of their familiar acoustic instruments. An investigation of user experience identified three modes of interaction that characterise the musicians' approach to the virtual instruments: instrumental, ornamental and conversational. When using the virtual instruments in instrumental mode, musicians prioritise detailed control; in ornamental mode, they surrender detailed control to the software and allow it to transform their sound; in conversational mode, the musicians allow the virtual instrument to 'talk back', helping to shape the musical direction of performance much as a human playing partner might. Finding a balance between controllability and complexity emerged as a key issue in facilitating 'conversational' interaction.
Johnston, A & Bluff, A 2018, 'Collaborative Creation in Interactive Theatre' in Candy, L, Edmonds, E & Poltronieri, F (eds), Explorations in Art and Technology, Springer, London, pp. 341-351.
This article describes the collaboration between two digital artist/researchers from the Creativity and Cognition Studios at the University of Technology Sydney and the Australian physical theatre company Stalker Theatre. This collaboration has been under way since 2011 and has resulted in the creation of five major works that have toured internationally.
Johnston, AJ 2014, 'Keeping Research in Tune with Practice' in Candy, L & Ferguson, S (eds), Interactive Experience in the Digital Age, Springer, Switzerland, pp. 49-62.
With contributions from artists, scientists, curators, entrepreneurs and designers engaged in the creative arts, this book is an invaluable resource for both researchers and practitioners, working in this emerging field.
Johnston, AJ 2011, 'Almost Tangible Musical Interfaces' in Candy, L & Edmonds, E (eds), Interacting Art, Research and the Creative Practitioner, Libri Publishing, Faringdon, Oxfordshire, U.K, pp. 211-224.
Primarily I see myself as a musician. Certainly I'm a researcher too, but my research is with and for musicians and is inextricably bound up in the practice of performing. Research questions arise through the execution of, reflection upon and examination of performance. This is because I'd like the findings of my research, and the performance works which are at the core of the work, to be interesting and relevant to other musicians and composers. As someone who put a lot of work into trying to be a good musician for quite some time, this feels like a natural way to operate. I understand how musicians think, I speak the language and I respect their skill and dedication. I don't for a moment propose that the practice-based research approach I will describe in this chapter is the one and only way to conduct research in this area. It is, however, an approach that has worked for me. It has enabled me to continue to work creatively, while also engaging in research grounded in an epistemology of practice.
Bluff, A & Johnston, A 2019, 'Devising Interactive Theatre: Trajectories of Production with Complex Bespoke Technologies', Proceedings of the 2019 Designing Interactive Systems Conference - DIS '19, Designing Interactive Systems 2019, ACM Press, San Diego, CA, USA, pp. 279-289.
Abu Ul Fazal, M, Karim, S, Ferguson, S & Johnston, A 2019, 'Vinfomize: A framework for multiple voice-based information communication', ACM International Conference Proceeding Series, pp. 143-147.
In this paper, we discuss investigations conducted with 10 visually challenged users (VCUs) and 8 sighted users (SUs) that aimed to determine users' experience of, interest in and expectations of concurrent information communication systems. In the first study, we concurrently played two voice-based streams in continuous form in both ears; in the second study, we concurrently communicated one stream continuously in one ear and three news headlines as interval-based short interruptions in the other ear. We first report the participants' experience qualitatively and then, based on the feedback received from the users, propose a framework that may help in developing systems that communicate multiple voice-based information streams to users. It is expected that applying this framework to information systems that provide multiple concurrent communication will deliver a better user experience, subject to users' contextual and perceptual needs and limitations.
Abu Ul Fazal, M, Ferguson, S & Johnston, A 2018, 'Investigating concurrent speech-based designs for information communication', ACM International Conference Proceeding Series, Audio Mostly on Sound in Immersion and Emotion, ACM, Wrexham, United Kingdom, pp. 1-8.
Speech-based information is usually communicated to users in a sequential manner, but users are capable of obtaining information from multiple voices concurrently. This implies that the sequential approach possibly under-utilizes human perception capabilities and restricts users from performing optimally in an immersive environment. This paper reports on an experiment that aimed to test different speech-based designs for concurrent information communication. Two audio streams from two types of content were played concurrently to 34 users, in either continuous or intermittent form, with the manipulation of a variety of spatial configurations (i.e. Diotic, Diotic-Monotic, and Dichotic). In total, 12 concurrent speech-based design configurations were tested with each user. The results showed that concurrent speech-based information designs involving intermittent form and spatial difference between information streams produce comprehensibility equal to the level achieved in sequential information communication.
Schmidt, S, Ehrenbrink, P, Weiss, B, Voigt-Antons, JN, Kojic, T, Johnston, A & Moller, S 2018, 'Impact of Virtual Environments on Motivation and Engagement during Exergames', 2018 10th International Conference on Quality of Multimedia Experience, QoMEX 2018, International Conference on Quality of Multimedia Experience, Italy.
Video games and sport are an essential part of the lives of millions of people. With recent advances in immersive virtual reality devices such as the HTC Vive, Oculus Rift, or PlayStation VR, the use of virtual environments (VE) for exergames is becoming more and more popular. An exergame combines a physical activity with video game elements by tracking body movements or reactions of the user, attempting to engage users in a more enjoyable system. In this paper, we present the results of a subjective experiment carried out with the aim of comparing different kinds of virtual environments with each other. A rowing ergometer, connected either to a virtual reality system using a head-mounted display (HMD) or to a CAVE environment, was used as an exergame device. While for rowing experts, fitness and performance improvements are of major interest, we wanted to focus on the motivation and engagement of non-professionals. By means of a series of questionnaires and a follow-up interview, the Quality of Experience of participants using the system was assessed. Measurements include concepts such as flow, presence, video quality and well-being. Results show significant advantages of the HMD as well as of the CAVE compared to a system without a VE for the overall quality, system feedback, and flow. While the CAVE and HMD systems mainly differed in their autotelic experience, the HMD was favored by the majority of participants due to a superior feeling of presence.
Ul Fazal, MA, Ferguson, S, Karim, MS & Johnston, A 2018, 'Concurrent voice-based multiple information communication: A study report of profile-based users' interaction', 145th Audio Engineering Society International Convention, AES 2018, Audio Engineering Society International Convention, Audio Engineering Society, New York, New York.
This paper reports a study conducted with 10 blind and 8 sighted participants using a prototype system for communicating multiple information streams concurrently, using two methods of presentation. In the first method, the prototype played two continuous voice-based articles diotically, differing by voice gender and content. In the second method, the prototype communicated one continuous article in a female voice and three headlines as interval-based short interruptions in a male voice, dichotically. In this investigation the continuous method proved more effective in communicating multiple information streams than the interval-based interruption method, and users with at least a tertiary qualification performed better in comprehending the multiple concurrent streams than non-tertiary-qualified users.
Bluff, A & Johnston, A 2017, 'Storytelling with Interactive Physical Theatre: A case study of Dot and the Kangaroo', Proceedings of the 4th International Conference on Movement Computing, International Conference on Movement Computing, ACM, London, United Kingdom, pp. 1-8.
This paper examines the way movement-based interactive visuals were used as a storytelling device in the physical theatre production of Creature: Dot and the Kangaroo. A number of performers and artists involved in the production were interviewed and their perceptions of the interactive technology were contrasted with a similar study of abstract dance. The animated backgrounds and interactive animal graphics projected onto the stage were found to reduce the density of the script by conveying the location of the action and the spirit of the characters, reducing the need for these to be spoken. Peak moments of the show were identified by those interviewed, and a scene analysis revealed that the most successful scenes featured more integrated storytelling, where the interaction between performers and the digital projections portrayed a key narrative message.
Murray-Leslie, A & Johnston, A 2017, 'The Liberation of the Feet: Demaking the High Heeled Shoe For Theatrical Audio-Visual Expression', Proceedings of the International Conference on New Interfaces for Musical Expression, New Interfaces for Musical Expression, Aalborg University Copenhagen, Copenhagen, Denmark, pp. 296-301.
This paper describes a series of fashionable sounding shoe and foot based appendages made between 2007 and 2017. The research attempts to demake the physical high-heeled shoe through the iterative design and fabrication of new foot-based musical instruments. This process of demaking also changes the usual purpose of shoes and the stereotypes associated with high-heeled shoe wear. Through turning high-heeled shoes into wearable musical instruments for theatrical audio-visual expressivity, we ask why so many musical instruments are made for the hands and not the feet. With this creative work we explore ways to redress the imbalance and consider what a genuinely "foot based" expressivity could be.
Holland, S, McPherson, A, Mackay, W, Wanderley, M, Gurevich, M, Mudd, T, O'Modhrain, S, Wilkie, K, Malloch, J, Garcia, J & Johnston, A 2016, 'Music and HCI workshop', Conference on Human Factors in Computing Systems - Proceedings, pp. 3339-3346.
Music is an evolutionarily deep-rooted, abstract, real-time, complex, non-verbal, social activity. Consequently, interaction design in music can be a valuable source of challenges and new ideas for HCI. This workshop will reflect on the latest research in Music and HCI (Music Interaction for short), with the aim of strengthening the dialogue between the Music Interaction community and the wider HCI community. We will explore recent ideas from Music Interaction that may contribute new perspectives to general HCI practice, and conversely, recent HCI research in non-musical domains with implications for Music Interaction. We will also identify any concerns of Music Interaction that may require unique approaches. Contributors engaged in research in any area of Music Interaction or HCI who would like to contribute to a sustained widening of the dialogue between the distinctive concerns of the Music Interaction community and the wider HCI community will be welcome.
Johnston, A & Pickrell, M 2016, 'Designing for technicians working in the field: 8 usability heuristics for mobile application design', Proceedings of the 28th Australian Conference on Computer-Human Interaction, Australian Computer Human Interaction Conference, ACM, Launceston, Tasmania, Australia, pp. 494-498.
Bluff, AJ & Johnston, A 2015, 'Remote Control of Complex Interactive Art Installations', Proceedings of the 2015 ACM SIGCHI Conference on Creativity and Cognition, Conference on Creativity and Cognition, ACM, Glasgow, UK, pp. 197-200.
Movement-based interactive artworks can instantly engage audiences by reacting to physical motion in ways consistent with real-world physics. Sustaining this engagement, however, requires constant alteration of both the output and interaction aesthetics. Mobile devices (such as the iPad or iPhone) can be used to control the often-overwhelming plethora of parameters found in many interactive systems. The effect that mobile control of these parameters has on the inception, refinement and live performance of two separate artworks is examined. An open-source dynamic remote control system is being developed to further facilitate the creative development and performance of interactive artworks, as demonstrated by these case studies.
Berry, R, Edmonds, E & Johnston, AJ 2015, 'Unfinished Business: Some Reflections On Adopting A Practice-Based Approach To Technological Research As An Artist', Proceedings of the Annual Conference of the Australasian Computer Music Association, Australasian Computer Music Conference, The Australasian Computer Music Association, Sydney, pp. 13-18.
This paper reflects upon aspects of my experience as an artist moving into research and my attempts to reconcile the two areas of activity and interest. It describes several tabletop augmented reality music systems in the context of my experience as an artist working in technology research environments. It is my intention to show how the relationship between creative practice and technological research has changed for me over time and how I have come to embrace a practice-based approach to research where creative practice takes a central and crucial role.
Ilsar, A & Johnston, A 2015, 'Choreography in the Mapping of New Instruments', Proceedings of the 2015 ACM SIGCHI Conference on Creativity and Cognition, ACM Creativity and Cognition, ACM, Glasgow, UK, pp. 161-164.
This paper discusses the use of choreography in mapping sound to movement in the field of new instrument design. Using the analogy of the drum kit player utilising all four limbs in a similar fashion to a dancer, we investigate the notion of mapping movement to prerecorded sound in that order, as opposed to sound mapped to movement. In this way the mapping process becomes a type of "choreography", where a particular piece of music is learnt to be played as the mapping is determined. We outline three main factors which must be balanced within the mapping process. We present findings from the development of a new gestural interface for electronic percussionists and several collaborations that this interface has been used in.
Johnston, AJ 2015, 'Conceptualising Interaction in Live Performance: Reflections on 'Encoded'', Proceedings of the 2nd International Workshop on Movement and Computing, Movement and Computing (MOCO), ACM, Vancouver, Canada, pp. 60-67.
This paper presents a detailed examination of experiences of the creative team responsible for the direction, choreography, interaction design and performance of a dance and physical theatre work, Encoded. Interviews, observations and reflection on personal experience have made visible a range of different perspectives on the design, use and creative exploration of the interactive systems that were created for the work. The work itself, and in particular the use of interactive systems, was overall considered to be successful and coherent, even while participants' approaches and concerns were often markedly different. A trajectory of creative development in which exploratory improvisation and iterative design gradually became 'locked down' in preparation for final performance and touring is described.
Johnston, AJ & Bluff, A 2014, 'Creative Control of Granular Synthesis Using Fluid Simulation & Motion Tracking', Proceedings of the 2014 International Workshop on Movement and Computing, International Workshop on Movement and Computing, ACM, Paris, France, pp. 150-153.
This paper describes the development of an audio-visual performance system which applies `reality based interaction' techniques. The real-time gestures and sounds of a musician playing an acoustic instrument are tracked and translated into forces which act on a fluid simulation. The simulation is visualised and also sonified using granular synthesis. Several strategies for linking live performance, fluid behaviour and generated sounds and visuals are discussed.
Tan, CT, Johnston, A, Bluff, A, Ferguson, S & Ballard, KJ 2014, 'Retrogaming as visual feedback for speech therapy', Proceeding SA'14 SIGGRAPH Asia 2014 Mobile Graphics and Interactive Applications, International Conference on Computer Graphics and Interactive Techniques in Australasia and Southeast Asia, ACM, Shenzhen Convention & Exhibition Center.
A key problem in speech therapy is the motivation of patients in repetitive vocalization tasks. One important task is the vocalization of vowels. We present a novel solution by incorporating formant speech analysis into retro games to enable intrinsic motivation in performing the vocalization tasks in a fun and accessible manner. The visuals in the retro games also provide a simple and instantaneous feedback mechanism to the patients' vocalization performance. We developed an accurate and efficient formant recognition system to continuously recognize vowel vocalizations in real time. We implemented the system into two games, Speech Invaders and Yak-man, published on the iOS App Store in order to perform an initial public trial. We present the development to inform like-minded researchers who wish to incorporate real-time speech recognition in serious games.
Tan, CT, Johnston, AJ, Bluff, A, Ferguson, S & Ballard, KJ 2014, 'Speech invaders & yak-man: retrogames for speech therapy', Proceeding SA '14 SIGGRAPH Asia 2014 Mobile Graphics and Interactive Applications, SIGGRAPH Asia 2014 Mobile Graphics and Interactive Applications, Shenzhen Convention & Exhibition Center.
Speech therapy is used for the treatment of speech disorders and commonly involves a patient attending clinical sessions with a speech pathologist, as well as performing prescribed practice exercises at home [Ruggero et al. 2012]. Clinical sessions are very effective, as the speech pathologist can carefully guide and monitor the patient's speech exercises, but they are also costly and time-consuming. However, the less expensive and more convenient home practice component is often not as effective, as it is hard to maintain sufficient motivation to perform the rigid repetitive exercises.
Ferguson, SJ, Johnston, A & Murray-Leslie, A 2014, 'Methodologies with fashion acoustics Live on Stage!', Proceedings of the International Conference on New Interfaces for Musical Expression, New Interfaces for Musical Expression, Creativity and Cognition Workshop, Goldsmiths, University of London, pp. 1-4.
Johnston, AJ 2014, 'Some Opportunities for Practice-Based Research for NIME', Proceedings of the Practice-Based Research Workshop at NIME 2014, Practice-Based Research Workshop at NIME 2014, Goldsmiths, University of London, pp. 1-3.
Johnston, AJ, Ilsar, A & Havryliv, M 2014, 'Evaluating the Performance of a New Gestural Instrument Within an Ensemble', Proceedings of the International Conference on New Interfaces for Musical Expression, New Interfaces for Musical Expression, Creativity and Cognition Workshop, Goldsmiths, University of London, pp. 339-342.
This paper discusses one particular mapping for a new gestural instrument called the AirSticks. This mapping was designed to be used for improvised or rehearsed duos and restricts the performer to utilising only the sound source of one other musician playing an acoustic instrument. Several pieces with different musicians were performed and documented, the musicians were observed, and interviews with them were transcribed. In this paper we examine the thoughts of these musicians to gain a better understanding of how to design effective ensemble instruments of this type.
Berry, RA 2013, 'Representational systems with tangible and graphical elements', 2013 IEEE International Symposium on Mixed and Augmented Reality, IEEE International Symposium on Mixed and Augmented Reality, IEEE, Adelaide, Australia, pp. 1-4.
This research centres on the development of a number of prototype interactive systems, each of which uses a tangible means of representing and manipulating musical elements in musical composition. Data gathered through collaborative prototyping and user studies is analysed using grounded theory methods. The resultant contribution to knowledge includes theory, design criteria and guidelines specific to tangible representations of music. This knowledge will be useful for the future design of systems that use tangible representations, particularly for making music. The prototypes themselves also serve as a form of knowledge and as creative works.
Ferguson, S, Johnston, AJ & Martin, AG 2013, 'A corpus-based method for controlling guitar feedback', Proceedings of the International Conference on New Interfaces for Musical Expression, New Interfaces for Musical Expression, Korea Advanced Institute of Science and Technology, Daejeon & Seoul, Korea Republic, pp. 541-546.
The use of feedback created by electric guitars and amplifiers is problematic in musical settings. For example, it is difficult for a performer to accurately obtain specific pitch and loudness qualities. This is due to the complex relationship between these quantities and other variables such as the string being fretted and the positions and orientations of the guitar and amplifier. This research investigates corpus-based methods for controlling the level and pitch of the feedback produced by a guitar and amplifier. A guitar-amplifier feedback system was built in which the feedback is manipulated using (i) a simple automatic gain control system, and (ii) a band-pass filter placed in the signal path. A corpus of sounds was created by recording the sound produced for various combinations of the parameters controlling these two components. Each sound in the corpus was analysed so that the control parameter values required to obtain particular sound qualities can be recalled in the manner of concatenative sound synthesis. As a demonstration, a recorded musical target phrase is recreated on the feedback system.
Ilsar, A, Havryliv, M & Johnston, AJ 2013, 'The AirSticks: a new interface for electronic percussion', Proceedings of the Sound and Music Computing Conference 2013, Sound and Music Computing Conference, KTH Royal Institute of Technology, Stockholm, Sweden, pp. 220-226.View/Download from: UTS OPUS
This paper documents the early development of a new interface for electronic percussionists. The interface is designed to allow the composition, improvisation and performance of live percussive electronic music using hand, finger, foot and head movements captured by various controllers. This paper provides a background to the field of electronic percussion, outlines the artistic motivations behind the project, and describes the technical nature of the work completed so far. This includes the development of software, the combination of existing controllers and sensors, and an example mapping of movement to sound.
Johnston, AJ 2013, 'Fluid Simulation as Full Body Audio-Visual Instrument', Proceedings of the International Conference on New Interfaces for Musical Expression, New Interfaces for Musical Expression, Korea Advanced Institute of Science and Technology, Daejeon & Seoul, Republic of Korea, pp. 132-135.View/Download from: UTS OPUS
This paper describes an audio-visual performance system based on real-time fluid simulation. The aim is to provide a rich environment for works which blur the boundaries between dance and instrumental performance, and between sound and visuals, while maintaining transparency for audiences and new performers.
Tan, C, Johnston, AJ, Ballard, KJ, Ferguson, S & Perera-Schulz, D 2013, 'sPeAK-MAN: towards popular gameplay for speech therapy', Proceedings of 9th Australasian Conference on Interactive Entertainment IE'13, Interactive Entertainment, ACM, Melbourne, VIC, Australia, pp. 1-4.View/Download from: UTS OPUS or Publisher's site
Current speech therapy treatments are not easily accessible to the general public due to cost and demand. Therapy sessions are also laborious, and maintaining patient motivation is hard. We propose using popular games and speech recognition technology to deliver speech therapy in an individualised and accessible manner. sPeAK-MAN is a Pac-Man-like game whose core gameplay mechanic incorporates vocalisation of words drawn from a pool commonly used in clinical speech therapy sessions. Beyond improving engagement, sPeAK-MAN aims to provide real-time feedback on patients' vocalisation performance. It also serves as an initial prototype demonstrating the possibilities of adapting familiar, popular gameplay (instead of building a game from scratch) for rehabilitation purposes.
Ferguson, S, Johnston, AJ, Ballard, KJ, Tan, C & Perera-Schulz, D 2012, 'Visual feedback of acoustic data for speech therapy: model and design parameters', Proceedings of the 7th Audio Mostly Conference: A Conference on Interaction with Sound, Audio Mostly Conference: A Conference on Interaction with Sound, ACM, Corfu, Greece, pp. 135-140.View/Download from: UTS OPUS or Publisher's site
Feedback, usually of a verbal nature, is important for speech therapy sessions. Some disadvantages exist, however, with traditional methods of speech therapy, and visual feedback of acoustic data is a useful alternative that can complement typical clinical sessions. Visual feedback has been investigated before, and in this paper we propose several new prototypes. From these prototypes we develop an iterative model of analysing the design of feedback systems by examining the feedback process. From this iterative model, we then extract methods to inform the design of visual feedback systems for speech therapy.
Johnston, AJ 2012, 'Conversational Interaction in Interactive Dance Works', Live Interfaces: Performance, Art, Music, Live Interfaces, Interdisciplinary Centre for Scientific Research in Music, Leeds, UK, pp. 1-4.View/Download from: UTS OPUS
This paper describes an ongoing project to develop an interactive dance/physical theatre work entitled Encoded. The focus is on the use of motion capture and real-time fluid simulation to create systems that we hope performers and audiences find stimulating and engaging. Preliminary findings from a qualitative study of performers' experiences with the system raise a number of issues, including the challenges of creating theatrical meaning with interactive systems, the use of Contact Improvisation as a metaphor for engaging creative systems, and the impact that large-scale projections can have on performers' engagement.
Johnston, AJ 2011, 'Beyond Evaluation: Linking Practice and Theory in New Musical Interface Design', NIME 2011: Proceedings of the International Conference on New Interfaces for Musical Expression, New Interfaces for Musical Expression, Department of Musicology, University of Oslo and Norwegian Academy of Music, Oslo, Norway, pp. 280-283.View/Download from: UTS OPUS
Johnston, AJ & Clarkson, D 2011, 'Designing for Conversational Interaction with Interactive Dance Works', Proceedings of the Workshop: The Body In Design, The Australasian Computer Human Interaction Conference (OzCHI), Australian Computer Human Interaction Conference, Interaction Design and Work Practice Laboratory (IDWoP), Canberra, pp. 13-16.View/Download from: UTS OPUS
In this paper we describe ongoing work which explores the physicality of human-computer interaction in dance works. The use of physical simulations in the interface to connect with performers' and audiences' lived experience of the physical world is discussed. Drawing on past work with musicians, we argue that this approach is effective in encouraging creative, 'conversational' interactions in live performance.
Tan, C & Johnston, AJ 2011, 'Towards a Non-Disruptive, Practical and Objective Automated Playtesting Process', Proceedings of The Artificial Intelligence and Interactive Digital Entertainment Conference - Workshops at the Seventh Artificial Intelligence and Interactive Digital Entertainment Conference, Artificial Intelligence in the Game Design Process, AAAI Press, Stanford, USA, pp. 25-28.View/Download from: UTS OPUS
Playtesting is the primary process that allows a game designer to assess game quality. Current playtesting methods are often intrusive to play, involve much manual labour, and may not capture players' true feedback. This paper aims to alleviate these shortcomings by presenting the position that state-of-the-art artificial intelligence techniques can construct automated playtesting systems that supplement, or to a certain extent even substitute for, this process. Several potential research directions are proposed in this theme. A work-in-progress report is also included to demonstrate the conceptual feasibility of this research area.
Johnston, AJ & Humberstone, J 2010, 'Elective Music Students' Experiences with Jam2Jam', 7th Australasian Conference on Interactive Entertainment, Australasian Conference on Interactive Entertainment, Massey University, College of Creative Arts, Institute of Communication Design, Wellington, New Zealand, pp. 8-15.View/Download from: UTS OPUS
This paper presents findings from a trial of the interactive music software Jam2Jam in a classroom music setting. Jam2Jam allows musical novices to control generative music in real time. Its interface enables users to control multiple audio-visual parameters with a single gesture, an approach intended to facilitate complex, conversational interaction. Examination of students' experiences with Jam2Jam indicates that students find it attractive and that it has considerable potential. However, a number of issues for improvement are identified, particularly a need for increased transparency of operation. Extensions to Jam2Jam which would enable students to incorporate more of their own material into the music and visuals they create during jam sessions are also proposed.
Johnston, AJ & Johnson, C 2010, 'Extreme Programming in the University', Proceedings of Annual International Conference on Computer Science Education: Innovation and Technology (CSEIT 2010), Annual International Conference on Computer Science Education: Innovation and Technology, Global Science and Technology Forum, Phuket, Thailand, pp. 3-8.View/Download from: UTS OPUS
This paper summarises our experiences teaching Extreme Programming to undergraduate students over a period of 8 years. We describe an approach in which students learn about the Extreme Programming (XP) method by using it on real software development projects. This experiential learning technique has been effective in helping students understand how XP works in practice, and has helped them develop the skills to reflect on their current approaches to software development and critically evaluate agile methods. Problems, including a steep learning curve for some XP practices and difficulties scheduling pair-programming time in a university environment, are also identified.
Johnston, AJ, Beilharz, KA, Chen, Y & Ferguson, S 2010, 'Proceedings of the 2010 Conference on New Interfaces for Musical Expression (NIME 2010)', Proceedings of the 2010 Conference on New Interfaces for Musical Expression (NIME 2010), University of Technology Sydney, Sydney, Australia.
Johnston, AJ, Candy, L & Edmonds, EA 2009, 'Designing for Conversational Interaction', Proceedings of New Interfaces for Musical Expression (NIME), New Interfaces for Musical Expression, Carnegie Mellon University, Pittsburgh, USA, pp. 207-212.View/Download from: UTS OPUS
In this paper we describe an interaction framework which classifies musicians' interactions with virtual musical instruments into three modes: instrumental, ornamental and conversational. We argue that conversational interactions are the most difficult to design for, but also the most interesting. To illustrate our approach to designing for conversational interactions we describe the performance work Partial Reflections 3 for two clarinets and interactive software. This software uses simulated physical models to create a virtual sound sculpture which both responds to and produces sounds and visuals.
Tavoukjian, A & Johnston, AJ 2009, 'Neural Networks and Evolutionary Algorithms in Music Composition', Proceedings of the Australasian Computer Music Conference, Australasian Computer Music Conference, Australasian Computer Music Association, Queensland University of Technology, Brisbane, Australia, pp. 72-78.
In this paper we describe a system developed to generate short musical phrases in the same style as a set of training melodies. The system uses an ensemble of neural networks to rate the similarity of a generated musical phrase to a set of human composed phrases. Given this rating a genetic algorithm is used to effectively 'search' for other similar phrases. Preliminary evaluations indicate that the proposed approach shows promise. A limitation of the system is the relatively simple representation of musical phrases employed and the processing time required to use this technique on longer phrases.
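The rate-then-search loop described above can be sketched as follows. This is an illustrative sketch only: a simple distance-based similarity function stands in for the paper's neural-network ensemble, and all names, parameters and the phrase representation are assumptions.

```python
import random

# Sketch of a genetic algorithm 'searching' for phrases that a rater
# scores as similar to a reference style. The rater here is a toy
# stand-in for the paper's neural-network ensemble.

TARGET = [60, 62, 64, 65, 67, 65, 64, 62]  # stand-in "style" phrase (MIDI pitches)

def fitness(phrase):
    """Higher is better: negative pitch distance to the reference phrase."""
    return -sum(abs(a - b) for a, b in zip(phrase, TARGET))

def mutate(phrase, rate=0.3):
    """Nudge some pitches up or down by a semitone."""
    return [p + random.choice([-1, 0, 1]) if random.random() < rate else p
            for p in phrase]

def evolve(generations=200, pop_size=30, seed=0):
    random.seed(seed)
    pop = [[random.randint(55, 72) for _ in TARGET] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[:pop_size // 2]                      # truncation selection
        pop = survivors + [mutate(random.choice(survivors))  # offspring
                           for _ in survivors]
    return max(pop, key=fitness)

best = evolve()
```

As the abstract notes, runtime grows quickly with phrase length, since each generation requires the rater to score every candidate phrase.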
Smith, G & Johnston, AJ 2008, 'Interactive Software for Guitar Learning', Sound : Space - Proceedings of the Australasian Computer Music Conference, Australasian Computer Music Conference, Australasian Computer Music Association, Sydney, Australia, pp. 69-77.View/Download from: UTS OPUS
In this paper we present software designed to help address problems encountered by beginning guitarists, using interactive software to find effective solutions to enhance the learning process. Software can be utilised to improve a player's ability to hear mistakes in their performance, as well as to create a fun and entertaining learning environment to motivate the player to practise. A software prototype has been developed, which served as a basis for usability testing, to highlight the usefulness of various methods of feedback and provide a way forward in developing valuable software for guitar tuition.
Johnston, AJ & Marks, B 2007, 'Partial reflections: interactive virtual instruments controlled by sound', Proceedings of the 6th ACM SIGCHI conference on Creativity & cognition, ACM Creativity and Cognition, Association for Computing Machinery, Washington, DC, USA, pp. 257-258.View/Download from: Publisher's site
In this paper we describe two interactive virtual musical instruments that are controlled by sound. These instruments are based on virtual physical models that can be pushed and prodded by making sounds into a microphone. These models provide a mapping between acoustic sounds and computer generated sounds and visuals.
Johnston, AJ, Marks, B & Candy, L 2007, 'Sound Controlled Musical Instruments Based On Physical Models', Proceedings of the 2007 International Computer Music Conference, International Computer Music Conference, International Computer Music Association, Copenhagen, Denmark, pp. 232-239.View/Download from: UTS OPUS
This paper describes three simple virtual musical instruments that use physical models to map between live sound and computer-generated audio and video. The intention is that this approach will provide musicians with an intuitively understandable environment that facilitates musical expression and exploration. Musicians' live sound exerts 'forces' on simple mass-spring physical models, which move around in response and produce sound. Preliminary findings from a study of musicians' experiences using the software indicate that musicians find the software easy to understand and interact with, and are drawn to software with more complex interaction, even though this complexity can reduce the feeling of direct control.
Johnston, AJ, Marks, B & Edmonds, EA 2006, 'Charmed circle - an interactive toy for musicians', ACM International Conference on Digital Interactive Media in Entertainment and Arts, ACM International Conference on Digital Interactive Media in Entertainment and Arts, Research Publishing Services, Bangkok, Thailand, pp. 1-7.View/Download from: UTS OPUS
Johnston, AJ, Marks, B, Candy, L & Edmonds, EA 2006, 'Partial reflections: interactive environments for musical exploration', ENGAGE: Interaction, Art and Audience Experience, ENGAGE: Interaction, Art and Audience Experience, Creativity and Cognition Studios Press, Sydney, Australia, pp. 100-109.View/Download from: UTS OPUS
This paper describes an ongoing project to develop interactive environments for musicians that encourage musical exploration. A process for developing software such as this, where requirements are highly dynamic and unclear, is outlined, and two musical compositions and associated interactive environments entitled 'Partial Reflections' are described.
Johnston, AJ, Amitani, S & Edmonds, EA 2005, 'Amplifying Reflective Thinking in musical Performance', Creativity and Cognition Proceedings 2005, ACM Creativity and Cognition, ACM Press, London, UK, pp. 166-175.View/Download from: UTS OPUS
Johnston, AJ, Marks, B & Edmonds, EA 2005, 'An artistic approach to designing visualisations to aid instrumental music learning', Cognition and Exploratory Learning in Digital Age (CELDA 2005) Proceedings, Cognition and Exploratory Learning in Digital Age, IADIS Press, Porto, Portugal, pp. 175-182.View/Download from: UTS OPUS
Johnston, AJ, Marks, B & Edmonds, EA 2005, ''Spheres of Influence': An Interactive Musical Work', Proceedings of the second Australasian conference on Interactive entertainment, Interactive Entertainment, Creativity and Cognition Studios Press, Sydney, Australia, pp. 97-103.View/Download from: UTS OPUS
In this paper we describe the development of an interactive artwork which incorporates both a musical composition and software which provides a visual and aural accompaniment. The system uses physical modeling to implement a type of virtual 'sonic sculpture' which responds to musical input in a way which appears naturalistic. This work forms part of a larger project to use art to explore the potential of computers to develop interactive tools which support the development of creative musical skills.
Weakley, AJ, Johnston, AJ, Edmonds, EA & Turner, GA 2005, 'Creative Collaboration: Communication Translation and Generation in the Development of a Computer-based Artwork', HCI International 2005 - 11th International Conference on Human-Computer Interaction, International Conference on Human-Computer Interaction, Lawrence Erlbaum Assoc, Las Vegas, Nevada, pp. 1-9.
Johnston, AJ 2004, 'Creativity, Music and Computers: Guidelines for Computer-Based Instrumental Music Support Tools', Managing New Wave Information Systems: Enterprise, Government and Society - Proceedings of the 15th Australasian Conference on Information Systems (ACIS2004), Australasian Conference on Information Systems, University of Tasmania, Tasmania, Australia, pp. 1-9.View/Download from: UTS OPUS
Johnston, AJ & Edmonds, EA 2004, 'Creativity, Music and Computers: Guidelines for Computer-Based Instrumental Music Support Tools', Proceedings of the Australasian Conference of Information Systems, Australasian Conference of Information Systems, University of Tasmania, Hobart, AUS, Hobart, TAS, Australia, pp. 2-11.
This paper examines requirements for computer-based tools intended to support creative development in musicians. Approaches to instrumental music pedagogy are presented and implications for those seeking to support musical skill development with computers are discussed. A pedagogical philosophy based on the 'natural learning process' is combined with recommendations from creativity researchers to build a set of suggested features and guidelines for developing instrumental music support tools which facilitate creative development. A prototype application illustrating our approach is described.
Johnston, AJ & Edmonds, EA 2005, 'Towards a Framework of Requirements for Music Learning Support Tools', Innovations Through Information Technology: 2004 Information Resources Management Association International Conference, International Conference on Information Resources Management, Idea Group Publishing, New Orleans, LA, USA, pp. 643-646.View/Download from: UTS OPUS
Johnston, A & Johnson, CS 2003, 'Extreme programming: A more musical approach to software development?', Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), pp. 325-327.
This paper considers the relationship between software development as it is typically practised, the Extreme Programming methodology, and the learning and working environment of those involved in a creative art: music. In particular we emphasise how pair programming can facilitate an increase in the overall skill level of individuals and teams, and relate this to musicians' development of models of excellence through ensemble playing. Consideration is also given to the psychology of music performance and its relevance to the pursuit of excellence in software development.
ENCODED is an immersive aerial dance performance and installation that uses the latest interactive technologies to build a projected digital environment that responds to the movements of the performers.
This audio-visual work for acoustic instruments and interactive software uses simple models of physical structures to mediate between acoustic sounds and computer-generated sound and visuals. Phil Slater (trumpet) and Jason Noble (clarinet) use their acoustic instruments to playfully interact with a physically modelled virtual sound sculpture which is projected onto the screen. The musicians use sounds produced on their acoustic instruments to reach into the virtual world and grasp, push and hit the sculpture. In response the structure glows, spins, bounces around and generates its own sounds. The pitch and timbre of the live acoustic sounds are captured and transformed by the virtual sculpture, which sings back in its own way. Each individual object (or mass) in the physical model is linked to a synthesis engine which uses additive and subtractive synthesis techniques to produce a wide range of sonic textures. The oscillator frequencies of the synthesis engines are set by the sounds played by the acoustic musicians, and the volume of sound produced is controlled by the movement of the masses. The effect is that the sound sculpture produces evocative sounds clearly linked to the sonic gestures of the performers and the movement of the on-screen sculpture. During performance the physical structure and characteristics of the sculpture are altered: links between masses are cut, the spring tension of the links is altered, and damping is ramped up and down. Thus, while transparency of operation is maintained, the complexity of the interaction between the acoustic and electronic performers and the sound sculpture itself leads to rich, conversational musical interactions.
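The mass-to-oscillator mapping described above can be sketched in miniature. This is a hedged illustration of the general technique (a damped mass-spring element driving an oscillator's gain, with frequency taken from the analysed acoustic input), not the work's actual code; every name and parameter value is an assumption.

```python
# Minimal sketch: one mass in a damped spring model drives one
# oscillator. An acoustic onset 'pushes' the mass; the analysed
# acoustic pitch sets the oscillator frequency; the mass's motion
# sets the oscillator amplitude. All names/values are illustrative.

class MassOscillator:
    def __init__(self, stiffness=20.0, damping=0.5):
        self.pos, self.vel = 0.0, 0.0
        self.stiffness, self.damping = stiffness, damping
        self.freq = 440.0  # overwritten by the analysed live pitch

    def excite(self, force):
        """A detected sonic gesture pushes the virtual mass."""
        self.vel += force

    def step(self, dt=0.01):
        """Damped spring update (explicit Euler integration)."""
        accel = -self.stiffness * self.pos - self.damping * self.vel
        self.vel += accel * dt
        self.pos += self.vel * dt

    def amplitude(self):
        """Oscillator gain tracks how much the mass is moving."""
        return min(1.0, abs(self.vel))

m = MassOscillator()
m.freq = 329.6      # e.g. an analysed clarinet pitch (E4)
m.excite(2.0)       # acoustic onset pushes the mass
for _ in range(5):
    m.step()
```

In the full work, many such masses are linked by springs whose tension and damping are altered live, which is what produces the evolving, conversational response.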