Andrew is a new media interaction artist and researcher specialising in technology to enhance live musical performance, physical theatre and interactive art installations. He seamlessly blends software engineering, sound design and digital projection to create mixed-reality live synesthetic audio-visual events. Andrew recently completed his doctoral research at the UTS Creativity and Cognition Studios, producing his PhD thesis, Interactive Art, Immersive Technology and Live Performance. Andrew is currently employed as a post-doctoral research fellow at the flagship UTS Animal Logic Academy, where master's and research students develop their film-based CGI and animation skills and explore how these skills can be coupled with robotics, 3D printing, 3D scanners, CAVEs, motion capture, AR and VR to create innovative and captivating experiences.
In addition to his interactive live performance work, Andrew has developed the mobile drumming apps DrumStudio and RoboDrummer and received the prestigious App Art Award for his innovative iOS app, Mobile Phone Orchestra.
Andrew has also worked as a professional software developer on many cutting-edge sound, lighting and video applications.
Can supervise: YES
This paper discusses Creature:Interactions (2015), a large-scale mixed-reality artwork created by the authors that incorporates immersive 360° stereoscopic visuals, interactive technology, and live actor facilitation. The work uses physical simulations to promote an expressive full-bodied interaction as children explore the landscapes and creatures of Ethel C. Pedley's ecologically focused children's novel, Dot and the Kangaroo. The immersive visuals provide a social playspace for up to 90 people and have produced "phantom" sensations of temperature and touch in certain participants.
Walsh, L, Bluff, A & Johnston, A 2017, 'Water, image, gesture and sound: composing and performing an interactive audiovisual work', Digital Creativity, vol. 28, no. 3, pp. 177-195.
Performing and composing for an interactive audiovisual system presents many challenges to the performer. Working with visual, sonic and gestural components requires new skills and new ways of thinking about performance. However, there are few studies that focus on performer experience with interactive systems. We present the work Blue Space for oboe and interactive audiovisual system, highlighting the evolving process of the collaborative development of the work. We consider how musical and technical demands interact in this process, and outline the challenges of performing with interactive systems. Using the development of Blue Space as a self-reflective case study, we examine the role of gestures in interactive audiovisual works and identify new modes of performance.
Bluff, A & Johnston, A 2017, 'Storytelling with Interactive Physical Theatre: A case study of Dot and the Kangaroo', Proceedings of the 4th International Conference on Movement Computing, International Conference on Movement Computing, ACM, London, United Kingdom, pp. 1-8.
This paper examines the way movement-based interactive visuals were used as a storytelling device in the physical theatre production of Creature: Dot and the Kangaroo. A number of performers and artists involved in the production were interviewed, and their perceptions of the interactive technology have been contrasted against a similar study into abstract dance. The animated backgrounds and interactive animal graphics projected onto the stage were found to reduce the density of the script by conveying the location of the action and the spirit of the characters, reducing the need for this to be spoken. Peak moments of the show were identified by those interviewed, and a scene analysis revealed that the most successful scenes featured more integrated storytelling, in which the interaction between performers and the digital projections portrayed a key narrative message.
Ilsar, A & Bluff, A 2015, ''AirStorm,' A New Piece for AirSticks and Storm: Gestural Audio-Visual for Electronic Percussionists', Proceedings of the 2015 ACM SIGCHI Conference on Creativity and Cognition, ACM SIGCHI Conference on Creativity and Cognition, ACM, Glasgow, UK, pp. 389-390.
'AirStorm' is a short, semi-improvised 10-minute piece for solo AirSticks and physical model visualisation, performed by Alon Ilsar and Andrew Bluff respectively. It is made up of a drum synth, drum samples, other selected samples and room feedback, triggered and manipulated by Ilsar on this newly built interface for electronic percussionists. The piece displays some of the capabilities of the AirSticks, along with Ilsar's dedication to practising and composing for this new interface. 'AirStorm' is based around the conference's theme of 'Computers, Arts and Data' through the choice of samples and the ways they are played. The movement data from Ilsar's AirSticks is processed in real time by Bluff's physics-based visualisation engine, Storm. Particles are pushed around a virtual 3D world in response to the movements of the AirSticks, and rigid-body collision adds a sense of real-world authenticity and complexity. The system responds to drum hits and movements of the AirSticks with a combination of different visual and physical effects. The real-time visualisations exemplify the movement and sonic complexity of Ilsar's AirSticks performance, providing a visually stimulating and highly synesthetic element to the piece.
Bluff, AJ & Johnston, A 2015, 'Remote Control of Complex Interactive Art Installations', Proceedings of the 2015 ACM SIGCHI Conference on Creativity and Cognition, Conference on Creativity and Cognition, ACM, Glasgow, UK, pp. 197-200.
Movement-based interactive artworks are capable of instantly engaging audiences by reacting to physical motion in a manner consistent with real-world physics. Sustaining this engagement, however, requires constant alteration of both the output and interaction aesthetics. Mobile devices (such as the iPad or iPhone) can be used to control the often-overwhelming plethora of parameters found in many interactive systems. The effect that mobile control of these parameters has on the inception, refinement and live performance of two separate artworks is examined. An open-source dynamic remote control system is being developed to further facilitate the creative development and performance of interactive artworks, as demonstrated by these case studies.
Johnston, AJ & Bluff, A 2014, 'Creative Control of Granular Synthesis Using Fluid Simulation & Motion Tracking', Proceedings of the 2014 International Workshop on Movement and Computing, International Workshop on Movement and Computing, ACM, Paris, France, pp. 150-153.
This paper describes the development of an audio-visual performance system which applies 'reality-based interaction' techniques. The real-time gestures and sounds of a musician playing an acoustic instrument are tracked and translated into forces which act on a fluid simulation. The simulation is visualised and also sonified using granular synthesis. Several strategies for linking live performance, fluid behaviour and generated sounds and visuals are discussed.
Tan, CT, Johnston, A, Bluff, A, Ferguson, S & Ballard, KJ 2014, 'Retrogaming as visual feedback for speech therapy', Proceedings of SA '14 SIGGRAPH Asia 2014 Mobile Graphics and Interactive Applications, International Conference on Computer Graphics and Interactive Techniques in Australasia and Southeast Asia, ACM, Shenzhen Convention & Exhibition Center.
A key problem in speech therapy is the motivation of patients in repetitive vocalization tasks. One important task is the vocalization of vowels. We present a novel solution by incorporating formant speech analysis into retro games to enable intrinsic motivation in performing the vocalization tasks in a fun and accessible manner. The visuals in the retro games also provide a simple and instantaneous feedback mechanism on the patients' vocalization performance. We developed an accurate and efficient formant recognition system to continuously recognize vowel vocalizations in real time. We implemented the system in two games, Speech Invaders and Yak-man, published on the iOS App Store to perform an initial public trial. We present the development to inform like-minded researchers who wish to incorporate real-time speech recognition in serious games.
Tan, CT, Johnston, AJ, Bluff, A, Ferguson, S & Ballard, KJ 2014, 'Speech invaders & yak-man: retrogames for speech therapy', Proceedings of SA '14 SIGGRAPH Asia 2014 Mobile Graphics and Interactive Applications, SIGGRAPH Asia 2014 Mobile Graphics and Interactive Applications, Shenzhen Convention & Exhibition Center.
Speech therapy is used for the treatment of speech disorders and commonly involves a patient attending clinical sessions with a speech pathologist, as well as performing prescribed practice exercises at home [Ruggero et al. 2012]. Clinical sessions are very effective -- the speech pathologist can carefully guide and monitor the patient's speech exercises -- but they are also costly and time-consuming. However, the more inexpensive and convenient home practice component is often not as effective, as it is hard to maintain sufficient motivation to perform the rigid repetitive exercises.
Ilsar, A, Bluff, AJ, Hughes, M & Krass, D 2017, 'The Hour', Sydney Conservatorium of Music.