
Associate Professor Andrew Johnston

Biography

Dr. Andrew Johnston is a researcher and interaction designer who specialises in the development of software for use in creative contexts.

He has qualifications in music (B. Arts) and computing (Master of IT) and a PhD combining the two. As a musician he has performed professionally with the Melbourne and Sydney Symphony Orchestras, among many other ensembles.

His PhD research, completed in 2009, involved the creation of interactive software for use by expert musicians in live performance.

Performances featuring his software have been presented at venues such as the Sydney Opera House, the Sound Lounge (Sydney) and the Queensland Conservatorium among others. Recordings have been broadcast on ABC FM.

Andrew is co-director of the Creativity and Cognition Studios, an interdisciplinary research group working at the intersection of creativity and technology.

Course Director, Master of Animation and Visualisation, UTS Animal Logic Academy
Core Member, HCTD - Human Centred Technology Design
BA (Hons) (Melb), GradDipIT (UTS), MIT (UTS), PhD
 
Phone
+61 2 9514 4497

Research Interests

I am interested in using technology as a medium for creative expression, particularly in sonic/musical contexts.

In my research I develop software and interaction strategies for use in live performance. I then conduct studies which examine these systems in use and the experiences of performers and audiences, with the aim of improving understanding of creative work and refining technical and artistic strategies.

In particular I am focussed on developing interaction strategies and interfaces for audio-visual expression. My background in music performance and teaching has led me to investigate ways of using computers to support the creative arts, with an emphasis on the development of interactive virtual musical instruments for practicing musicians.

Can supervise: Yes
Registered at Level 2.

Multimedia, Agile software development methods, Software development.

Chapters

Johnston, A.J. 2014, 'Keeping Research in Tune with Practice' in Candy, L. & Ferguson, S. (eds), Interactive Experience in the Digital Age, Springer, pp. 49-62.
View/Download from: UTS OPUS or Publisher's site
With contributions from artists, scientists, curators, entrepreneurs and designers engaged in the creative arts, this book is an invaluable resource for both researchers and practitioners working in this emerging field.
Johnston, A.J. 2011, 'Almost Tangible Musical Interfaces' in Candy, L. & Edmonds, E. (eds), Interacting Art, Research and the Creative Practitioner, Libri Publishing, Faringdon, Oxfordshire, U.K, pp. 211-224.
View/Download from: UTS OPUS
Primarily I see myself as a musician. Certainly I'm a researcher too, but my research is with and for musicians and is inextricably bound up in the practice of performing. Research questions arise through the execution of, reflection upon and examination of performance. This is because I'd like the findings of my research, and the performance works which are at the core of the work, to be interesting and relevant to other musicians and composers. As someone who put a lot of work into trying to be a good musician for quite some time, this feels like a natural way to operate. I understand how musicians think, I speak the language and I respect their skill and dedication. I don't for a moment propose that the practice-based research approach I will describe in this chapter is the one and only way to conduct research in this area. It is, however, an approach that has worked for me. It has enabled me to continue to work creatively, while also engaging in research grounded in an epistemology of practice.

Conferences

Bluff, A.J. & Johnston, A. 2015, 'Remote Control of Complex Interactive Art Installations', Proceedings of the 2015 ACM SIGCHI Conference on Creativity and Cognition, ACM SIGCHI Conference on Creativity and Cognition, ACM, Glasgow, UK, pp. 197-200.
View/Download from: Publisher's site
Movement-based interactive artworks are capable of instantly engaging audiences by reacting to physical motion in a manner consistent with real-world physics. Sustaining this engagement, however, requires constant alteration of both the output and interaction aesthetics. Mobile devices (such as the iPad or iPhone) can be used to control the often-overwhelming plethora of parameters found in many interactive systems. The effect that mobile control of these parameters has on the inception, refinement and live performance of two separate artworks is examined. An open-source dynamic remote control system is being developed to further facilitate the creative development and performance of interactive artworks, as demonstrated by these case studies.
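The general pattern described here (streaming named parameter changes from a handheld device to the machine rendering the work) is commonly built on OSC over UDP. As a loose illustration only, not the authors' system, a minimal Python client using the python-osc library might look like the following; the host address and parameter paths are invented:

```python
# Hypothetical sketch: push parameter changes from a touch controller
# to the render machine over OSC. Addresses and values are invented.
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("192.168.1.10", 9000)   # assumed render host/port
client.send_message("/fluid/viscosity", 0.42)    # hypothetical parameter path
client.send_message("/particles/count", 5000)    # tweakable during rehearsal
```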
Ilsar, A. & Johnston, A. 2015, 'Choreography in the Mapping of New Instruments', Proceedings of the 2015 ACM SIGCHI Conference on Creativity and Cognition, ACM Creativity and Cognition Conference, ACM, Glasgow, UK, pp. 161-164.
View/Download from: UTS OPUS or Publisher's site
This paper discusses the use of choreography in mapping sound to movement in the field of new instrument design. Using the analogy of the drum kit player utilising all four limbs in a similar fashion to a dancer, we investigate the notion of mapping movement to prerecorded sound in that order, as opposed to sound mapped to movement. In this way the mapping process becomes a type of "choreography", where a particular piece of music is learnt to be played as the mapping is determined. We outline three main factors which must be balanced within the mapping process. We present findings from the development of a new gestural interface for electronic percussionists and several collaborations that this interface has been used in.
Johnston, A.J. 2015, 'Conceptualising Interaction in Live Performance: Reflections on 'Encoded'', Proceedings of the 2nd International Workshop on Movement and Computing, Movement and Computing (MOCO), ACM, Vancouver, Canada, pp. 60-67.
View/Download from: UTS OPUS or Publisher's site
This paper presents a detailed examination of experiences of the creative team responsible for the direction, choreography, interaction design and performance of a dance and physical theatre work, Encoded. Interviews, observations and reflection on personal experience have made visible a range of different perspectives on the design, use and creative exploration of the interactive systems that were created for the work. The work itself, and in particular the use of interactive systems, was overall considered to be successful and coherent, even while participants' approaches and concerns were often markedly different. A trajectory of creative development in which exploratory improvisation and iterative design gradually became 'locked down' in preparation for final performance and touring is described.
Berry, R., Edmonds, E. & Johnston, A.J. 2015, 'Unfinished Business: Some Reflections On Adopting A Practice-Based Approach To Technological Research As An Artist', Proceedings of the Annual Conference of the Australasian Computer Music Association, Australasian Computer Music Conference, The Australasian Computer Music Association, Sydney, pp. 13-18.
This paper reflects upon aspects of my experience as an artist moving into research and my attempts to reconcile the two areas of activity and interest. It describes several tabletop augmented reality music systems in the context of my experience as an artist working in technology research environments. It is my intention to show how the relationship between creative practice and technological research has changed for me over time and how I have come to embrace a practice-based approach to research where creative practice takes a central and crucial role.
Johnston, A.J. & Bluff, A. 2014, 'Creative Control of Granular Synthesis Using Fluid Simulation & Motion Tracking', Proceedings of the 2014 International Workshop on Movement and Computing, International Workshop on Movement and Computing, ACM, Paris France, pp. 150-153.
View/Download from: UTS OPUS or Publisher's site
This paper describes the development of an audio-visual performance system which applies 'reality-based interaction' techniques. The real-time gestures and sounds of a musician playing an acoustic instrument are tracked and translated into forces which act on a fluid simulation. The simulation is visualised and also sonified using granular synthesis. Several strategies for linking live performance, fluid behaviour and generated sounds and visuals are discussed.
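As a rough sketch of the sonification stage only (the fluid solver and motion tracking are beyond a short example), the fragment below shows granular synthesis in which an assumed 'activity' signal, standing in for fluid energy, controls grain density; all names and numbers are illustrative:

```python
import numpy as np

SR = 44100
source = np.sin(2 * np.pi * 330 * np.arange(SR) / SR)   # stand-in for a recording
activity = np.linspace(0.0, 1.0, 4 * SR)                # assumed fluid 'energy' signal
out = np.zeros(activity.size)
grain_len = 2048
window = np.hanning(grain_len)

rng = np.random.default_rng(0)
t = 0
while t < out.size - grain_len:
    start = rng.integers(0, source.size - grain_len)    # pick a random grain
    out[t:t + grain_len] += source[start:start + grain_len] * window
    hop = int(grain_len * (1.2 - activity[t]))          # denser grains when active
    t += max(hop, 64)
```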
Ferguson, S.J., Johnston, A. & Murray-Leslie, A. 2014, 'Methodologies with fashion acoustics Live on Stage!', 14th International Conference on New Interfaces for Musical Expression, Goldsmiths, University of London, pp. 1-4.
View/Download from: UTS OPUS
Johnston, A.J., Ilsar, A. & Havryliv, M. 2014, 'Evaluating the Performance of a New Gestural Instrument Within an Ensemble', 14th International Conference on New Interfaces for Musical Expression, Goldsmiths University of London, pp. 339-342.
View/Download from: UTS OPUS
This paper discusses one particular mapping for a new gestural instrument called the AirSticks. This mapping was designed to be used for improvised or rehearsed duos and restricts the performer to only utilising the sound source of one other musician playing an acoustic instrument. Several pieces with different musicians were performed and documented, musicians were observed and interviews with these musicians were transcribed. In this paper we will examine the thoughts of these musicians to gather a better understanding of how to design effective ensemble instruments of this type.
Johnston, A.J. 2014, 'Some Opportunities for Practice-Based Research for NIME', Proceedings of the Practice-Based Research Workshop at NIME 2014, Practice-Based Research Workshop at NIME 2014, Goldsmiths, University of London, pp. 1-3.
View/Download from: UTS OPUS
Tan, C.T., Johnston, A., Bluff, A., Ferguson, S. & Ballard, K.J. 2014, 'Retrogaming as visual feedback for speech therapy', Proceedings SA '14 SIGGRAPH Asia 2014 Mobile Graphics and Interactive Applications, SIGGRAPH Asia 2014 Mobile Graphics and Interactive Applications, ACM, Shenzhen Convention & Exhibition Center.
View/Download from: UTS OPUS or Publisher's site
A key problem in speech therapy is the motivation of patients in repetitive vocalization tasks. One important task is the vocalization of vowels. We present a novel solution by incorporating formant speech analysis into retro games to enable intrinsic motivation in performing the vocalization tasks in a fun and accessible manner. The visuals in the retro games also provide a simple and instantaneous feedback mechanism to the patients' vocalization performance. We developed an accurate and efficient formant recognition system to continuously recognize vowel vocalizations in real time. We implemented the system into two games, Speech Invaders and Yak-man, published on the iOS App Store in order to perform an initial public trial. We present the development to inform like-minded researchers who wish to incorporate real-time speech recognition in serious games.
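The abstract names formant analysis without specifying the algorithm; one standard real-time approach is linear prediction (LPC), sketched below in Python as a plausible method rather than the published system:

```python
import numpy as np
from scipy.linalg import solve_toeplitz
from scipy.signal import lfilter

def formants(frame, sr, order=12):
    """Estimate formant frequency candidates (Hz) for one audio frame via LPC."""
    frame = lfilter([1.0, -0.97], [1.0], frame) * np.hamming(len(frame))  # pre-emphasis + window
    r = np.correlate(frame, frame, mode='full')[len(frame) - 1:]          # autocorrelation
    a = solve_toeplitz(r[:order], r[1:order + 1])                         # Yule-Walker LPC coefficients
    roots = np.roots(np.concatenate(([1.0], -a)))                         # poles of the LPC filter
    roots = roots[np.imag(roots) > 0]                                     # one root per conjugate pair
    freqs = np.angle(roots) * sr / (2 * np.pi)
    return np.sort(freqs[freqs > 90])                                     # F1, F2, ... candidates
```

The first two formants (F1, F2) roughly separate sustained vowels such as /a/ and /i/, which is what allows a game to respond to which vowel a patient is producing.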
Tan, C.T., Johnston, A.J., Bluff, A., Ferguson, S. & Ballard, K.J. 2014, 'Speech invaders & yak-man: retrogames for speech therapy', Proceedings SA '14 SIGGRAPH Asia 2014 Mobile Graphics and Interactive Applications, SIGGRAPH Asia 2014 Mobile Graphics and Interactive Applications, Shenzhen Convention & Exhibition Center.
View/Download from: UTS OPUS or Publisher's site
Speech therapy is used for the treatment of speech disorders and commonly involves a patient attending clinical sessions with a speech pathologist, as well as performing prescribed practice exercises at home [Ruggero et al. 2012]. Clinical sessions are very effective -- the speech pathologist can carefully guide and monitor the patient's speech exercises -- but they are also costly and time-consuming. However, the more inexpensive and convenient home practice component is often not as effective, as it is hard to maintain sufficient motivation to perform the rigid repetitive exercises.
Ilsar, A., Havryliv, M. & Johnston, A.J. 2013, 'The AirSticks: a new interface for electronic percussion', Proceedings of the Sound and Music Computing Conference 2013, Sound and Music Computing Conference, KTH Royal Institute of Technology Stockholm, Stockholm, Sweden, pp. 220-226.
View/Download from: UTS OPUS
This paper documents the early developments of a new interface for electronic percussionists. The interface is designed to allow the composition, improvisation and performance of live percussive electronic music using hand, finger, foot and head movements captured by various controllers. This paper provides a background to the field of electronic percussion, outlines the artistic motivations behind the project, and describes the technical nature of the work completed so far. This includes the development of software, the combination of existing controllers and sensors, and an example mapping of movement to sound.
Ferguson, S., Johnston, A.J. & Martin, A.G. 2013, 'A corpus-based method for controlling guitar feedback', Proceedings of the International Conference on New Interfaces for Musical Expression, Korea Advance Institute of Science and Technology, Daejeon & Seoul, Korea Republic, pp. 541-546.
View/Download from: UTS OPUS
The use of feedback created by electric guitars and amplifiers is problematic in musical settings. For example, it is difficult for a performer to accurately obtain specific pitch and loudness qualities. This is due to the complex relationship between these quantities and other variables such as the string being fretted and the positions and orientations of the guitar and amplifier. This research investigates corpus-based methods for controlling the level and pitch of the feedback produced by a guitar and amplifier. A guitar-amplifier feedback system was built in which the feedback is manipulated using (i) a simple automatic gain control system, and (ii) a band-pass filter placed in the signal path. A corpus of sounds was created by recording the sound produced for various combinations of the parameters controlling these two components. Each sound in the corpus was analysed so that the control parameter values required to obtain particular sound qualities can be recalled in the manner of concatenative sound synthesis. As a demonstration, a recorded musical target phrase is recreated on the feedback system.
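A minimal sketch of the recall step, with invented numbers: each corpus entry pairs the control settings (gain, filter centre frequency) with the analysed sound they produced (pitch, loudness), and a requested target is matched to its nearest entry in the manner of concatenative synthesis:

```python
import numpy as np

# Hypothetical corpus: control parameters (gain, filter centre frequency in Hz)
# paired with the analysed sound they produced (pitch in Hz, loudness in dB).
params = np.array([[0.2, 200.0], [0.5, 450.0], [0.8, 880.0]])
features = np.array([[196.0, -18.0], [440.0, -12.0], [870.0, -9.0]])

def recall(target_pitch_hz, target_loudness_db):
    """Return the control settings whose recorded output is nearest the target."""
    target = np.array([target_pitch_hz, target_loudness_db])
    scale = features.std(axis=0)                  # normalise feature dimensions
    dists = np.linalg.norm((features - target) / scale, axis=1)
    return params[np.argmin(dists)]

gain, centre_freq = recall(440.0, -12.0)          # selects the middle entry
```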
Tan, C., Johnston, A.J., Ballard, K.J., Ferguson, S. & Perera-Schulz, D. 2013, 'sPeAK-MAN: towards popular gameplay for speech therapy', Proceedings of 9th Australasian Conference on Interactive Entertainment IE'13, Australasian Conference on Interactive Entertainment, ACM, Melbourne, VIC, Australia, pp. 1-4.
View/Download from: UTS OPUS or Publisher's site
Current speech therapy treatments are not easily accessible to the general public due to cost and demand. Therapy sessions are also laborious and maintaining motivation of patients is hard. We propose using popular games and speech recognition technology for speech therapy in an individualised and accessible manner. sPeAK-MAN is a Pac-Man-like game with a core gameplay mechanic that incorporates vocalisation of words generated from a pool commonly used in clinical speech therapy sessions. Other than improving engagement, sPeAK-MAN aims to provide real-time feedback on the vocalisation performance of patients. It also serves as an initial prototype to demonstrate the possibilities of using familiar popular gameplay (instead of building one from scratch) for rehabilitation purposes.
Johnston, A.J. 2013, 'Fluid Simulation as Full Body Audio-Visual Instrument', Proceedings of the International Conference on New Interfaces for Musical Expression, International Conference on New Interfaces for Musical Expression, Korea Advance Institute of Science and Technology, Daejeon & Seoul, Korea Republic, pp. 132-135.
View/Download from: UTS OPUS
This paper describes an audio-visual performance system based on real-time fluid simulation. The aim is to provide a rich environment for works which blur the boundaries between dance and instrumental performance, and between sound and visuals, while maintaining transparency for audiences and new performers.
Ferguson, S., Johnston, A.J., Ballard, K.J., Tan, C. & Perera-Schulz, D. 2012, 'Visual feedback of acoustic data for speech therapy: model and design parameters', Proceedings of the 7th Audio Mostly Conference: A Conference on Interaction with Sound, Audio Mostly, ACM, Corfu, Greece, pp. 135-140.
View/Download from: UTS OPUS
Feedback, usually of a verbal nature, is important for speech therapy sessions. Some disadvantages exist, however, with traditional methods of speech therapy, and visual feedback of acoustic data is a useful alternative that can be used to complement typical clinical sessions. Visual feedback has been investigated before, and in this paper we propose several new prototypes. From these prototypes we develop an iterative model of analysing the design of feedback systems by examining the feedback process. From this iterative model, we then extract methods to inform the design of visual feedback systems for speech therapy.
Johnston, A.J. 2012, 'Conversational Interaction in Interactive Dance Works', Live Interfaces: Performance, Art, Music, Interdisciplinary Centre for Scientific Research in Music, Leeds, UK, pp. 1-4.
View/Download from: UTS OPUS
This paper describes an ongoing project to develop an interactive dance/physical theatre work entitled Encoded. The focus is on the use of motion capture and real-time fluid simulation to create systems that we hope performers and audiences find stimulating and engaging. Preliminary findings from a qualitative study of performers' experiences with the system raise a number of issues, including the challenges of creating theatrical meaning with interactive systems, using Contact Improvisation as a metaphor for engaging creative systems, and the impact that large-scale projections can have on performers' engagement.
Tan, C. & Johnston, A.J. 2011, 'Towards a Non-Disruptive, Practical and Objective Automated Playtesting Process', Proceedings of The Artificial Intelligence and Interactive Digital Entertainment Conference - Workshops at the Seventh Artificial Intelligence and Interactive Digital Entertainment Conference, Artificial Intelligence and Interactive Digital Entertainment Conference, AAAI Press, Stanford, USA, pp. 25-28.
View/Download from: UTS OPUS
Playtesting is the primary process that allows a game designer to assess game quality. Current playtesting methods are often intrusive to play, involve much manual labour, and might not even portray the player's true feedback. This paper aims to alleviate these shortcomings by presenting the position that state-of-the-art artificial intelligence techniques can construct automated playtesting systems that supplement or even substitute this process to a certain extent. Several potential research directions are proposed in this theme. A work-in-progress report is also included to demonstrate the conceptual feasibility of this research area.
Johnston, A.J. & CLARKSON, D. 2011, 'Designing for Conversational Interaction with Interactive Dance Works', Proceedings of the Workshop: The Body In Design, The Australasian Computer Human Interaction Conference (OzCHI), Interaction Design and Work Practice Laboratory (IDWoP), Canberra, pp. 13-16.
View/Download from: UTS OPUS
In this paper we describe ongoing work which explores the physicality of human-computer interaction in dance works. The use of physical simulations in the interface to connect with the performers' and audiences' lived experience of the physical world is discussed. Drawing on past work with musicians, we argue that this approach is effective in encouraging creative, 'conversational' interactions in live performance.
Johnston, A.J. 2011, 'Beyond Evaluation: Linking Practice and Theory in New Musical Interface Design', NIME 2011: Proceedings of the International Conference on New Interfaces for Musical Expression, Department of Musicology, University of Oslo Norwegian Academy of Music, Oslo, Norway, pp. 280-283.
View/Download from: UTS OPUS
Johnston, A.J. & Johnson, C. 2010, 'Extreme Programming in the University', Proceedings of Annual International Conference on Computer Science Education: Innovation and Technology (CSEIT 2010), Annual International Conference on Computer Science Education: Innovation and Technology, Global Science and Technology Forum, Phuket, Thailand, pp. 3-8.
View/Download from: UTS OPUS
This paper summarises our experiences teaching Extreme Programming to undergraduate students over a period of 8 years. We describe an approach in which students learn about the Extreme Programming (XP) method by using it on real software development projects. This experiential learning technique has been effective in helping students understand how XP works in practice and has helped them to develop the skills to reflect on their current approaches to software development and critically evaluate agile methods. Problems, including a steep learning curve for some XP practices and difficulties scheduling pair-programming time in a university environment, are also identified.
Johnston, A.J., Beilharz, K.A., Chen, Y. & Ferguson, S. 2010, 'Proceedings of the 2010 Conference on New Interfaces for Musical Expression (NIME 2010)', Proceedings of the 2010 Conference on New Interfaces for Musical Expression (NIME 2010), University of Technology Sydney, Sydney, Australia.
Johnston, A.J. & Humberstone, J. 2010, 'Elective Music Students Experiences with Jam2Jam', 7th Australian Conference on Interactive Entertainment, Australian Conference on Interactive Entertainment, Massey University, College of Creative Arts, Institute of Communication Design, Wellington, New Zealand, pp. 8-15.
View/Download from: UTS OPUS
This paper presents findings from a trial of the interactive music software Jam2Jam in a classroom music setting. Jam2Jam is software which allows musical novices to control generative music in real time. It has an interface which enables users to control multiple audio-visual parameters with a single gesture, an approach intended to facilitate complex, conversational interaction. Examination of students' experiences with Jam2Jam indicates that students find Jam2Jam attractive and that it has considerable potential. However, a number of issues for improvement, particularly a need for increased transparency of operation, are identified. Extensions to Jam2Jam which would enable students to incorporate more of their own material into the music and visuals they create during jam sessions are also proposed.
Johnston, A.J., Candy, L. & Edmonds, E.A. 2009, 'Designing for Conversational Interaction', Proceedings of New Interfaces for Musical Expression (NIME), New Interfaces for Musical Expression, Carnegie Mellon University, Pittsburgh, USA, pp. 207-212.
View/Download from: UTS OPUS
In this paper we describe an interaction framework which classifies musicians' interactions with virtual musical instruments into three modes: instrumental, ornamental and conversational. We argue that conversational interactions are the most difficult to design for, but also the most interesting. To illustrate our approach to designing for conversational interactions we describe the performance work Partial Reflections 3 for two clarinets and interactive software. This software uses simulated physical models to create a virtual sound sculpture which both responds to and produces sounds and visuals.
Tavoukjian, A. & Johnston, A.J. 2009, 'Neural Networks and Evolutionary Algorithms in Music Composition', Proceedings of the Australasian Computer Music Conference, Australasian Computer Music Conference, Australasian Computer Music Association, Queensland University of Technology, Brisbane, Australia, pp. 72-78.
In this paper we describe a system developed to generate short musical phrases in the same style as a set of training melodies. The system uses an ensemble of neural networks to rate the similarity of a generated musical phrase to a set of human composed phrases. Given this rating a genetic algorithm is used to effectively 'search' for other similar phrases. Preliminary evaluations indicate that the proposed approach shows promise. A limitation of the system is the relatively simple representation of musical phrases employed and the processing time required to use this technique on longer phrases.
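The search loop can be sketched as follows; the ensemble of neural-network raters is replaced by a toy interval-profile similarity so the example stays self-contained, making this an illustration of the genetic-algorithm scaffolding rather than the published system:

```python
import random

TRAINING = [[60, 62, 64, 65, 67], [67, 65, 64, 62, 60]]   # toy MIDI training phrases

def interval_profile(phrase):
    return sorted(b - a for a, b in zip(phrase, phrase[1:]))

def fitness(phrase):
    # Stand-in for the ensemble of neural networks: similarity of the
    # phrase's interval profile to the nearest training phrase.
    return -min(sum(abs(x - y) for x, y in zip(interval_profile(phrase),
                                               interval_profile(t)))
                for t in TRAINING)

def mutate(phrase):
    p = phrase[:]
    p[random.randrange(len(p))] += random.choice([-2, -1, 1, 2])
    return p

def crossover(a, b):
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

population = [[random.randint(55, 72) for _ in range(5)] for _ in range(30)]
for _ in range(100):
    population.sort(key=fitness, reverse=True)    # rank by rated similarity
    parents = population[:10]
    population = parents + [mutate(crossover(random.choice(parents),
                                             random.choice(parents)))
                            for _ in range(20)]

print(max(population, key=fitness))               # best phrase found
```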
Smith, G. & Johnston, A.J. 2008, 'Interactive Software for Guitar Learning', Sound : Space - Proceedings of the Australasian Computer Music Conference, Australasian Computer Music Conference, Australasian Computer Music Association, Sydney, Australia, pp. 69-77.
View/Download from: UTS OPUS
In this paper we present software designed to help address problems encountered by beginning guitarists, using interactive software to find effective solutions to enhance the learning process. Software can be utilised to improve a player's ability to hear mistakes in their performance, as well as to create a fun and entertaining learning environment to motivate the player to practice. A software prototype has been developed, which served as a basis for usability testing, to highlight the usefulness of various methods of feedback and provide a way forward in developing valuable software for guitar tuition.
Johnston, A.J., Marks, B. & Candy, L. 2007, 'Sound Controlled Musical Instruments Based On Physical Models', Proceedings of the 2007 International Computer Music Conference, International Computer Music Conference, International Computer Music Association, Copenhagen, Denmark, pp. 232-239.
View/Download from: UTS OPUS
This paper describes three simple virtual musical instruments that use physical models to map between live sound and computer generated audio and video. The intention is that this approach will provide musicians with an intuitively understandable environment that facilitates musical expression and exploration. Musicians' live sound exerts 'forces' on simple mass-spring physical models, which move around in response and produce sound. Preliminary findings from a study of musicians' experiences using the software indicate that musicians find the software easy to understand and interact with, and are drawn to software with more complex interaction, even though this complexity can reduce the feeling of direct control.
Johnston, A.J. & Marks, B. 2007, 'Partial reflections: interactive virtual instruments controlled by sound', Proceedings of the 6th ACM SIGCHI conference on Creativity & cognition, ACM Creativity and Cognition, Association for Computing Machinery, Washington, DC, USA, pp. 257-258.
View/Download from: Publisher's site
In this paper we describe two interactive virtual musical instruments that are controlled by sound. These instruments are based on virtual physical models that can be pushed and prodded by making sounds into a microphone. These models provide a mapping between acoustic sounds and computer generated sounds and visuals.
Johnston, A.J., Marks, B., Candy, L. & Edmonds, E.A. 2006, 'Partial reflections: interactive environments for musical exploration', ENGAGE: Interaction, Art and Audience Experience, ENGAGE: Interaction, Art and Audience Experience, Creativity and Cognition Studios Press, Sydney, Australia, pp. 100-109.
View/Download from: UTS OPUS
This paper describes an ongoing project to develop interactive environments for musicians that encourage musical exploration. A process of developing software such as this, where requirements are highly dynamic and unclear, is outlined, and two musical compositions and associated interactive environments entitled 'Partial Reflections' are described.
Johnston, A.J., Marks, B. & Edmonds, E.A. 2006, 'Charmed circle - an interactive toy for musicians', ACM International Conference on Digital Interactive Media in Entertainment and Arts, ACM International Conference on Digital Interactive Media in Entertainment and Arts, Research Publishing Services, Bangkok, Thailand, pp. 1-7.
View/Download from: UTS OPUS
Johnston, A.J., Amitani, S. & Edmonds, E.A. 2005, 'Amplifying Reflective Thinking in musical Performance', Creativity and Cognition Proceedings 2005, ACM Creativity and Cognition, ACM Press, London, UK, pp. 166-175.
View/Download from: UTS OPUS
Johnston, A.J., Marks, B. & Edmonds, E.A. 2005, ''Spheres of Influence' : An Interactive Musical Work', Proceedings of the second Australasian conference on Interactive entertainment, Interactive Entertainment, Creativity Cognition Studios Press, Sydney, Australia, pp. 97-103.
View/Download from: UTS OPUS
In this paper we describe the development of an interactive artwork which incorporates both a musical composition and software which provides a visual and aural accompaniment. The system uses physical modeling to implement a type of virtual 'sonic sculpture' which responds to musical input in a way which appears naturalistic. This work forms part of a larger project to use art to explore the potential of computers to develop interactive tools which support the development of creative musical skills.
Johnston, A.J., Marks, B. & Edmonds, E.A. 2005, 'An artistic approach to designing visualisations to aid instrumental music learning', Cognition and Exploratory Learning in Digital Age (CELDA 2005) Proceedings, Cognition and Exploratory Learning in Digital Age, IADIS Press, Porto, Portugal, pp. 175-182.
View/Download from: UTS OPUS
Weakley, A.J., Johnston, A.J., Edmonds, E.A. & Turner, G.A. 2005, 'Creative Collaboration: Communication Translation and Generation in the Development of a Computer-based Artwork', HCI International 2005 - 11th International Conference on Human-Computer Interaction, International Conference on Human-Computer Interaction, Lawrence Erlbaum Assoc, Las Vegas, Nevada, pp. 1-9.
Johnston, A.J. & Edmonds, E.A. 2005, 'Towards a Framework of Requirements for Music Learning Support Tools', Innovations Through Information Technology: 2004 Information Resources Management Association International Conference, International Conference on Information Resources Management, Idea Group Publishing, New Orleans, LA, USA, pp. 643-646.
View/Download from: UTS OPUS
Johnston, A.J. 2004, 'Creativity, Music and Computers: Guidelines for Computer-Based Instrumental Music Support Tools', Managing New Wave Information Systems: Enterprise, Government and Society - Proceedings of the 15th Australasian Conference on Information Systems (ACIS2004), Australasian Conference on Information Systems, University of Tasmania, Tasmania, Australia, pp. 1-9.
View/Download from: UTS OPUS
Johnston, A.J. & Edmonds, E.A. 2004, 'Creativity, Music and Computers: Guidelines for Computer-Based Instrumental Music Support Tools', Proceedings of the Australasian Conference of Information Systems, Australasian Conference of Information Systems, University of Tasmania, Hobart, AUS, Hobart, TAS, Australia, pp. 2-11.
This paper examines requirements for computer-based tools intended to support creative development in musicians. Approaches to instrumental music pedagogy are presented and implications for those seeking to support musical skill development with computers are discussed. A pedagogical philosophy based on the 'natural learning process' is combined with recommendations from creativity researchers to build a set of suggested features and guidelines for developing instrumental music support tools which facilitate creative development. A prototype application illustrating our approach is described.

Journal articles

Johnston, A. 2015, 'Conversational Interaction in Interactive Dance Works', Leonardo, vol. 48, no. 3, pp. 296-297.
View/Download from: Publisher's site
Ngo, H., Chuang, Y., Guo, W., Ho, D., Pham, N., Johnston, A.J., Lim, R.P. & Listowski, A. 2009, 'Resident's strategy survey on a new end use of recycled water in Australia', Desalination and Water Treatment, vol. 11, no. 1-3, pp. 93-97.
View/Download from: UTS OPUS or Publisher's site
The concept of using recycled water for washing machines was introduced as a new end use. As there is a noticeable lack of social research on the general public's perceptions of this application, a residents' strategy survey was carried out in selected Sydney suburbs with demographically significant differences in gender, age, education, and property style and ownership. The survey indicates that the majority of the community considers the use of recycled water for washing machines indispensable in view of continuing drought and the associated water shortages. Given safety assurance and demonstration, recycled water for washing machines attracted a considerable proportion of positive responses. Respondents generally understand that recycled water is the more environmentally friendly option, although from a cleanliness and public health point of view higher quality water is required for reuse in washing machines. Moreover, residents consider that a small point-of-use pre-treatment unit before recycled water enters the washing machine might assure quality and safety. The survey also shows that the major concerns for residents using recycled water in washing machines are public health, water cleanliness and washing machine durability.
Johnston, A.J., Candy, L. & Edmonds, E.A. 2008, 'Designing and evaluating virtual musical instruments: facilitating conversational user interaction', Design Studies, vol. 29, no. 6, pp. 556-571.
View/Download from: UTS OPUS or Publisher's site
This paper is concerned with the design of interactive virtual musical instruments. An interaction design strategy which uses on-screen objects that respond to user actions in physically realistic ways is described. This approach allows musicians to 'play' the virtual instruments using the sound of their familiar acoustic instruments. An investigation of user experience identified three modes of interaction that characterise the musicians' approach to the virtual instruments: instrumental, ornamental and conversational. When using the virtual instruments in instrumental mode, musicians prioritise detailed control; in ornamental mode, they surrender detailed control to the software and allow it to transform their sound; in conversational mode, the musicians allow the virtual instrument to 'talk back', helping to shape the musical direction of performance much as a human playing partner might. Finding a balance between controllability and complexity emerged as a key issue in facilitating 'conversational' interaction.
Johnston, A.J. & Marks, B. 2007, ''Partial Reflections'', Leonardo, vol. 40, no. 5, pp. 510-511.
View/Download from: UTS OPUS or Publisher's site

Non traditional outputs

Johnston, A.J. 2013, 'Pixel Mountain', Stalker Theatre, Gwacheon Festival, Hi Seoul, Korea.
View/Download from: UTS OPUS
Johnston, A.J., CLARKSON, D., Rolandi, A., Clarkson, S., Selwyn-Norton, P., Kennard, P., Dalziel, A. & Richards, K. 2012, 'Encoded', Carriageworks, Newtown, NSW.
View/Download from: UTS OPUS
ENCODED is an immersive aerial dance performance and installation that uses the latest interactive technologies to build a projected digital environment that responds to the movements of the performers.
Johnston, A.J. 2010, 'Touching Dialogue', Eugene Goossens Concert Hall, ABC Centre Ultimo.
View/Download from: UTS OPUS
This audio-visual work for acoustic instruments and interactive software uses simple models of physical structures to mediate between acoustic sounds and computer generated sound and visuals. Phil Slater (trumpet) and Jason Noble (clarinet) use their acoustic instruments to playfully interact with a physically modelled virtual sound sculpture which is projected onto the screen. The musicians use sounds produced on their acoustic instruments to reach in to the virtual world and grasp, push and hit the sculpture. In response the structure glows, spins, bounces around and generates its own sounds.

The pitch and timbre of the live acoustic sounds are captured and transformed by the virtual sculpture, which sings back in its own way. Each individual object (or mass) in the physical model is linked to a synthesis engine which uses additive and subtractive synthesis techniques to produce a wide range of sonic textures. The frequencies of the oscillators in the synthesis engines are set by the sounds played by the acoustic musicians, and the volume of sound produced is controlled by the movement of the masses. The effect is that the sound sculpture produces evocative sounds clearly linked to the sonic gestures of the performers and the movement of the on-screen sculpture.

During performance the physical structure and characteristics of the sculpture are altered. Links between masses are cut, the spring tension of the links is altered and damping is ramped up and down. Thus, while transparency of operation is maintained, the complexity of the interaction between the acoustic and electronic performers and the sound sculpture itself leads to rich, conversational musical interactions.
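As a loose sketch of this mapping, with invented parameters: a small ring of masses and springs is given an impulsive push, and each mass drives one sine partial whose loudness follows the mass's motion. In the work itself the excitation and oscillator frequencies come from the analysed live sound; here they are fixed constants:

```python
import numpy as np

N, SR, DUR = 8, 44100, 2.0                 # masses, sample rate, seconds
stiffness, damping, dt = 120.0, 0.9995, 1.0 / SR

pos = np.zeros(N)
vel = np.zeros(N)
freqs = 220.0 * np.arange(1, N + 1)        # assumed partials; in the work these
                                           # would follow the pitch of the live sound
phase = np.zeros(N)
out = np.zeros(int(SR * DUR))

vel[0] = 5.0                               # an impulsive 'push' on one mass
for n in range(out.size):
    # Spring forces from neighbouring masses (arranged in a ring).
    force = stiffness * (np.roll(pos, 1) + np.roll(pos, -1) - 2 * pos)
    vel = (vel + force * dt) * damping     # integrate with light damping
    pos += vel * dt
    phase += 2 * np.pi * freqs / SR
    out[n] = np.sum(np.abs(vel) * np.sin(phase))   # mass motion sets loudness
```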