• Posted on 25 Aug 2023
  • 3-minute read

UTS has been awarded a contract with Defence to continue developing the advanced brain-computer interface.

Professor Francesca Iacopi is an expert in nano-electronics and materials. Photo: Andy Roberts

UTS has been awarded a contract with Defence to continue developing the advanced brain-computer interface (BCI) developed by Distinguished Professor Chin-Teng Lin, Professor Francesca Iacopi and Dr Thomas Do.

This contract will see the team evolve the BCI over a period of 30 months to enhance speed, accuracy and reliability, and to take the system from technology readiness level 4 (TRL4) to a fully functioning prototype at TRL6.

Professor Iacopi will further enhance the electrical and mechanical performance of the graphene sensors and demonstrate the scalability of the process. Micropattern designs will be developed to optimise the sensors to be worn on hairy regions of the scalp.

“We are really excited to continue working on the BCI. Our graphene sensors are ideal for use due to their durability, low skin contact resistance and anti-corrosion properties, so it is great to see them applied in demanding environments like those required by Defence,” said Professor Iacopi.  

“The sensors are very thin and comfortable to wear and allow users to move around freely in a wide variety of challenging operating environments and outside laboratories,” she added. 

On the electronics and software front, Distinguished Professor Lin’s team will continue to adapt the BCI for use with various augmented and virtual reality displays, and improve the electronic circuits and algorithms to reduce command completion time.

The team will also introduce AI-based adaptive human autonomy teaming to improve the understanding and trust between users and autonomous robots.


Descriptive transcript

♪♪

Today we've conducted a successful demonstration with the University of Technology Sydney.

This has been a four-way collaboration supported through the Defence Innovation Hub and our partnership with the Defence Science and Technology Group, DSTG.

This four-way collaboration focused on how we could create a brain–robotic interface that will allow a soldier, rather than operating an autonomous system with a command console, to operate the system using brain signals.

What was created was a headset that a soldier could use, based on the HoloLens 2 model, with an AI decoder via a Raspberry Pi that would translate brain signals into explainable instructions.

What's so exciting about this technology is it has the opportunity to be used with a number of different autonomous systems.

We conducted two short demonstrations today.

The first demonstration showed a soldier operator commanding an autonomous system, the Vision 60 Ghost Robot, to a series of waypoints along the ground, and this was conducted successfully.

The potential of the project is actually very broad. At its core, it's translating brain waves into zeros and ones, and that can be implemented into a number of different systems.

It just so happens that in this particular instance, we're translating it into control for a robot.

The second demonstration was having a soldier operator acting in the role of a section commander and providing directions to both the Ghost Robot and other members of their fire team.

They conducted a simulated patrol clearance of a number of buildings here at the Majura Range Urban Operations Centre.

During the demonstration, we showcased the future capabilities of HAI technology, where soldiers were able to use our system to interact with robots in the training field.

This technology enables me not only to control the Ghost Robot and monitor its video feed, but also to stay situationally aware of my surroundings and my team, so I can control all movements of that battlefield clearance.

This is very much an idea about what might be possible in the future.

We're really excited to see where the technology might go and to work with our stakeholders to develop a wide range of use cases in order to better determine how we can support the military practitioner and the warfighter.


“Taking our immersive BCI from TRL4 to TRL6 will be a major step to controlling robots, machinery and computer systems by thought, and making physical input mechanisms such as consoles, touch screens and keyboards redundant,” Professor Lin said. 

“The human-autonomous teaming allows people to use the BCI through two modalities – ‘in-the-loop’, controlling a robot or system with explicit commands, and ‘on-the-loop’, only intervening in a robot’s automated actions when the user notices problems, for example in complex or rapidly evolving situations,” he added. 
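The distinction between the two modalities can be sketched in code. The snippet below is an illustrative model only, not the team's implementation; all names (`Mode`, `next_action`, the example commands) are hypothetical:

```python
from enum import Enum


class Mode(Enum):
    IN_THE_LOOP = "in-the-loop"  # user issues every command explicitly
    ON_THE_LOOP = "on-the-loop"  # robot acts autonomously; user only intervenes


def next_action(mode, user_command, autonomous_plan, user_intervened):
    """Pick the robot's next action under the two teaming modalities.

    In-the-loop: every action comes from an explicit user command.
    On-the-loop: the robot follows its own plan unless the user
    notices a problem and intervenes.
    """
    if mode is Mode.IN_THE_LOOP:
        return user_command
    # On-the-loop: the autonomous plan runs until the user intervenes.
    return user_command if user_intervened else autonomous_plan
```

For example, `next_action(Mode.ON_THE_LOOP, "halt", "patrol", user_intervened=False)` lets the robot continue its patrol, while the same call with `user_intervened=True` overrides it with the user's "halt" command.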

Once development of the BCI is completed, it can be tailored for a wide range of purposes and sectors.

For example, in the disability and medical sectors it could control and operate prosthetic limbs, wheelchairs and monitoring equipment. In industry it could operate collaborative robots to perform repetitive and dangerous tasks in hazardous or physically restricted environments.

 
