
CT Lin:

Brain-computer interfaces are transforming the way humans and machines interact.

UTS is at the forefront of innovation in this area. In a project funded by the Defence Innovation Hub, Professor Francesca Iacopi and I are combining our expertise in developing wearable and wireless devices that measure brainwaves, and new nanomaterials, to create a powerful brain-computer interface system to control and operate vehicles, machines and other devices.

Francesca Iacopi:

Key components of our brain-computer interface system are the epitaxial graphene sensors, which are in contact with the scalp.

Graphene is an atomically thin film that is highly compatible with biological tissues and is one of the most conductive materials.

That makes it the perfect material for BCI sensors. Existing brain sensors have limitations: they often wear out, corrode or delaminate on contact with sweat, and become less effective at measuring brain signals.

We have overcome those limitations by growing the graphene on a silicon carbide on silicon substrate. That combines the best of graphene with the best of silicon technology, creating a biosensor that is very resilient and robust to use.

They are dry sensors that can be used and reused over prolonged periods and in harsh conditions. We can reliably collect the biopotentials of groups of neurons in the brain from outside the skull, hence in a non-invasive way, enhancing their potential for use in brain-machine interfaces.

Nguyen Tien Thong Do:

Today, people interface with computers using keyboards, mice, touch screens and so on.

These could be replaced by an interface that works directly from the brain. Our brain-computer interface system consists of miniature brainwave decoders paired with sensors.

We use AI to reduce the noise that has limited brain-computer interface performance until now.

We also use AI to decode commands from the brain signals while the person is performing normal, hands-free activity.

This has allowed us to issue multiple commands within seconds, which outperforms existing technology.

Currently, people use touch to control computers, appliances, vehicles and machinery. In the future, sensors embedded in what people wear will be able to detect brain signals, so people can control things using thought.

CT Lin:

Our collaboration has allowed Francesca and me to develop brain-computer interface technologies that have enormous potential for applications in the medical industry and the disability sector.

Francesca Iacopi:

Whether it is for operating a wheelchair, working on a computer without using hands or directing machinery in an adverse or dangerous situation, there are many potential applications for our system.

Imagine driving a car, just like you do now, and using the steering wheel to direct where the car goes. Now, imagine driving a car using nothing more than your mind. Hands-free control of autonomous systems is fast becoming a reality – and a UTS research project is leading the way.

A collaboration between Professor Francesca Iacopi and Distinguished Professor Chin-Teng Lin in the Faculty of Engineering and IT, the work has produced a novel brain-computer interface that uses human brainwaves to control machines.

The research, which is funded by $1.2 million from the Defence Innovation Hub, has three main components: a biosensor to detect electrical signals from the brain, a circuit to amplify the signals, and an artificial intelligence decoder to translate them into instructions – stop, turn right, turn left – that the machine can understand. 
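To make that three-stage flow concrete, here is a minimal Python sketch of how such a sense-amplify-decode pipeline could be wired together. It is an illustrative assumption only, not the project's implementation: the function names (read_sensor, amplify, decode_command), the gain value and the toy decoder are invented for the example, and a real system would use a trained AI model rather than a simple signal statistic.

import random

COMMANDS = ["stop", "turn_left", "turn_right"]  # example command set named in the article

def read_sensor(n_samples=256):
    # Stage 1 (hypothetical): return one window of raw scalp-potential samples, in microvolts.
    return [random.gauss(0.0, 10.0) for _ in range(n_samples)]

def amplify(samples, gain=1000.0):
    # Stage 2 (hypothetical): boost the weak biopotentials before digitisation.
    return [s * gain for s in samples]

def decode_command(samples):
    # Stage 3 (hypothetical): stand-in for the AI decoder that maps a signal window to a command.
    energy = sum(s * s for s in samples) / len(samples)
    return COMMANDS[int(energy) % len(COMMANDS)]

if __name__ == "__main__":
    window = amplify(read_sensor())
    print("decoded command:", decode_command(window))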

“I see this technology as the next generation of human-computer interfaces,” says Distinguished Professor Lin.

Our technology can translate the brain’s electrical signals into a format that can be caught directly by a machine or a robot, and the robot will follow the commands.

- Distinguished Professor CT Lin

Next step in carbon-based biosensing

Professor Francesca Iacopi. Photo: Andy Roberts

Professor Iacopi, a leading researcher in nanotechnology, is leading the development of the biosensor, which is worn on the head. The sensor is made of epitaxial graphene – essentially multiple layers of very thin, very strong carbon – grown directly onto a silicon carbide on silicon substrate.

The result is a novel biosensor that overcomes three major challenges of graphene-based biosensing: corrosion, durability, and skin contact resistance, where non-optimal contact between the sensor and skin impedes the detection of electrical signals from the brain.

“We’ve been able to combine the best of graphene, which is very biocompatible and very conductive, with the best of silicon technology, which makes our biosensor very resilient and robust to use,” says Professor Iacopi.

Using brainwaves to control machines

Distinguished Professor CT Lin. Photo: Toby Burrows

Distinguished Professor Lin is working on the amplification circuit and the AI brain decoding technology. He and his team have achieved two major breakthroughs in their work so far.

The first was figuring out how to minimise the noise created by the body or the surrounding environment so that the technology can be used in real-world settings. The second was increasing the number of commands that the decoder can deliver within a fixed period of time.

“Current BCI technology can issue only two or three commands such as to turn left or right or go forward,” Professor Lin says.

“Our technology can issue at least nine commands in two seconds. This means we have nine different kinds of commands and the operator can select one from those nine within that time period.”
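A common way to put such figures in perspective is an idealised information transfer rate: with N equally likely commands and one selection every T seconds, the rate is log2(N)/T bits per second. The short sketch below applies that formula to the numbers quoted above; the assumptions of perfect accuracy, equally likely commands and a two-second window for the older technology are ours, for illustration only.

import math

def ideal_bitrate(n_commands, window_seconds):
    # Idealised information transfer rate, assuming every selection is correct and equally likely.
    return math.log2(n_commands) / window_seconds

# Two or three distinct commands, assuming the same two-second selection window for comparison.
print(f"3 commands / 2 s: {ideal_bitrate(3, 2.0):.2f} bits per second")  # ~0.79
# Nine distinct commands, one selectable within two seconds, as quoted for this system.
print(f"9 commands / 2 s: {ideal_bitrate(9, 2.0):.2f} bits per second")  # ~1.58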

Revolutionising human-computer interactions

Together, the researchers have produced a prototype brain-computer interface that has huge potential for application across multiple industries.

In a defence context, the technology offers the ability for the Australian Army to explore how soldiers interact with robotic systems during tactical missions. At present, soldiers must look at a screen and use their hands to operate robotic platforms, when they could be looking up and able to support their team differently.

Lieutenant Colonel Kate Tollenaar is the project lead with the Australian Army.

“We are exploring how a soldier can operate an autonomous system – in this example, a quadruped robot – with brain signals, which means the soldier can keep their hands on their weapon, potentially enhancing their performance as highlighted in the Army Robotic and Autonomous Systems Strategy,” she says.

“This brain-computer interface explores how a robotic platform responds in a real-world environment to support operations by a soldier.”

Elsewhere, the technology could offer significant value in industrial, medical and disability sectors.

Research team