SUMMARY OF SPRING SCHOOL DAY 5
BRAIN 🧠 ASSESSMENT & THE UNICORN 🦄 BRAIN INTERFACE
More than 3,800 people from all around the world signed up for the virtual BCI & Neurotechnology Spring School. During these 5 days, we answered more than 1,400 questions online. This makes it the biggest BCI event ever! Needless to say, this large turnout is partly a result of the global COVID-19 pandemic, as conferences and events were suddenly cancelled. But it also reflects the growing interest in brain-computer interfaces. We were delighted to host participants from all over the world. Thanks for being part of it!
This is the summary of the fifth day about Brain Assessment and the Unicorn Brain Interface.
Tomasz Rutkowski from RIKEN AIP & The University of Tokyo in Japan spoke about Simple visual, tactile and auditory BCIs for robotics and VR/AR.
Tomek uses motor imagery neurofeedback to train subjects to rapidly learn BCI control. He also uses auditory feedback to extract music imagery.
Tomek and his team also worked on a passive BCI to detect dementia in patients. In his study, 60-70% of the patients had Alzheimer’s and 10-20% had vascular dementia. To detect Alzheimer’s disease, his team used EEG, because the P300 response is a very good marker. Such a passive P300 paradigm can also be used to monitor elderly patients at home on a daily basis. For detecting vascular dementia, Tomek’s team used fNIRS.
In this case, the group uses the Montreal Cognitive Assessment (MoCA) test to clinically describe the dementia.
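To give a rough idea of why the P300 works as a marker: the response can be quantified as the peak of the trial-averaged EEG in a post-stimulus window. The following is a minimal sketch with made-up epoch data and illustrative parameters; it is not taken from Tomek's actual pipeline.

```python
import numpy as np

def p300_amplitude(epochs, fs, window=(0.25, 0.5)):
    """Peak of the trial-averaged EEG in a post-stimulus window.

    epochs : array (n_trials, n_samples), stimulus-locked EEG at a
             single channel (e.g. Pz), in microvolts -- hypothetical.
    fs     : sampling rate in Hz.
    window : search window in seconds; 250-500 ms is a common, but
             here purely illustrative, choice for the P300.
    """
    erp = epochs.mean(axis=0)                      # average over trials
    start, stop = (int(t * fs) for t in window)
    return erp[start:stop].max()

# Toy usage: 40 noisy trials with a synthetic P300-like bump at 300 ms
fs = 256
t = np.arange(int(0.8 * fs)) / fs                  # 0.8 s epochs
trials = np.random.randn(40, t.size) * 2.0         # background "EEG" noise
trials += 5.0 * np.exp(-((t - 0.3) ** 2) / (2 * 0.05 ** 2))  # the bump
amp = p300_amplitude(trials, fs)                   # clearly above noise level
```

Averaging over trials is what makes this "passive" monitoring feasible: the bump survives averaging while uncorrelated background activity cancels out.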
The second presentation was given by Christoph Guger about BR41N.IO – The Brain-Computer Interface Designers’ Hackathon.
BR41N.IO is a hackathon series that has been running since 2017. Every year, there are about 4-6 hackathons attached to conferences or festivals such as the IEEE SMC Conference or the Ars Electronica Festival in Austria.
Within 24 hours, programmers, developers, biomedical engineers, robotics engineers, artists, neuroscientists and therapists come together to develop new BCI applications in teams of 3-5 people.
✔️BCI control for a hand orthosis
✔️BCI controlled drones or robots
✔️BCI driven painting robots, (audio)visuals, etc…
After 24 hours, each team gives a final presentation of its project, and an international jury evaluates the work. This is a very interactive environment to meet other people and supervisors, and to kick-start your BCI developments and research. It is also a great opportunity to test BCI technologies like EEG, fNIRS or electrical stimulators with EEG.
The third topic was the Unicorn Brain Interface – the new wearable EEG headset, presented by Martin Walchshofer.
The Unicorn Hybrid Black has 8 hybrid EEG electrodes, which means they can be used either dry or wet. This special electrode technology was invented and internationally patented by g.tec.
The huge advantage of dry electrodes is that they can be mounted quickly. For many other applications, e.g. in neuroscience, data robustness is key, and therefore gel can be applied.
Then, Martin presented the Unicorn Speller software, which offers a painting application and a device control application, e.g. for controlling a Sphero robot. The Unicorn Hybrid Black includes the Unicorn Suite, which offers a C API, a .NET API, and a Recorder for free. Additionally, the Python API and the MATLAB Simulink Interface can be purchased.
Afterwards, Martin Walchshofer demonstrated how to assemble the EEG electrodes correctly and how to check whether the EEG data quality is good or artifacts are present.
The Unicorn Suite contains an EEG quality indicator that reacts very quickly to eye blinks, muscle artifacts, or movement artifacts, and informs the user about EEG data quality right away. Further programming tools such as the UDP interface, LSL or Arduino can be used to send BCI messages to remote computers.
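The UDP route is the simplest of those options to illustrate: the decoded BCI output is sent as a datagram to a listening computer. The address and the plain-text message format below are assumptions for illustration only; the Unicorn documentation defines the actual protocol.

```python
import socket

def send_bci_message(sock, label, addr):
    """Send one decoded BCI result, e.g. 'YES' or 'NO', as UTF-8 text
    in a single UDP datagram (hypothetical message format)."""
    sock.sendto(label.encode("utf-8"), addr)

# Toy usage: fire-and-forget datagram to a (hypothetical) remote listener
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_bci_message(sock, "YES", ("127.0.0.1", 1000))  # address is illustrative
sock.close()
```

Because UDP is connectionless, the sender does not block on the receiver; that makes it a convenient transport for low-latency BCI triggers, at the cost of no delivery guarantee.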
Later, Slobodan Tanackovic talked about TMS with simultaneous EEG recording.
Slobodan showed how Transcranial Magnetic Stimulation (TMS) is applied to the human brain while the EEG is co-registered. In this setup, TMS pulses overlay the EEG recordings from biosignal amplifiers such as g.HIamp, g.USBamp and g.Nautilus with active and passive electrodes.
Usually, a TMS pulse produces an artifact that is visible for 2-6 milliseconds, depending on the type of EEG system. Clean EEG data is important when you analyze short-lasting evoked potentials. For BCI applications, even a longer artifact is no problem, because it can simply be blanked out.
Furthermore, Slobodan explained different offline processing principles to reduce the artifact in the EEG data.
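One very simple blanking strategy, shown here for illustration only, is to replace the short artifact window after each pulse with a linear interpolation between the clean neighbouring samples. This sketch is our own assumption of how such blanking could look; it is not one of the offline methods Slobodan presented.

```python
import numpy as np

def blank_tms_artifact(eeg, pulse_idx, fs, blank_ms=6.0):
    """Linearly interpolate over the artifact window after each TMS pulse.

    eeg       : 1-D single-channel EEG signal
    pulse_idx : sample indices of the TMS pulses (each index >= 1)
    fs        : sampling rate in Hz
    blank_ms  : artifact duration to remove, e.g. 2-6 ms
    """
    out = np.asarray(eeg, dtype=float).copy()
    n = int(round(blank_ms * fs / 1000.0))         # samples to blank
    for p in pulse_idx:
        stop = min(p + n, out.size - 1)
        # replace the window with a straight line between clean neighbours
        out[p:stop] = np.interp(np.arange(p, stop),
                                [p - 1, stop],
                                [out[p - 1], out[stop]])
    return out

# Toy usage: a 10 Hz sine with a huge simulated TMS spike at sample 500
fs = 1200
sig = np.sin(2 * np.pi * 10 * np.arange(fs) / fs)
sig[500:507] += 200.0                              # simulated artifact
clean = blank_tms_artifact(sig, [500], fs, blank_ms=6)
```

For evoked-potential analysis this would be too crude, which is exactly why dedicated offline artifact-reduction methods exist; for BCI feature extraction, simply removing the contaminated samples is often sufficient.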
Christoph Guger gave an Introduction to brain assessment with BCIs. He explained the different assessment methods used for patients with disorders of consciousness (DOC) and demonstrated a BCI technology to assess command following in DOC patients. This is done with
✔️auditory P300 paradigms
✔️vibro-tactile P300 paradigms
✔️motor imagery paradigms
The BCI accuracy can be used as a marker to determine whether a patient can follow commands. In other words, the BCI accuracy shows whether a patient is able to understand communication. This marker can also be used to detect fluctuations in the awareness of DOC patients. If the command following task works successfully with a patient, the BCI system can then be used for simple YES/NO communication.
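The accuracy marker itself is straightforward: it is the fraction of instructed trials that the BCI decoded correctly, judged against the 50% chance level of a two-class task. A minimal sketch with made-up trial labels (the trial counts and any clinical threshold are illustrative, not from the talk):

```python
def command_following_accuracy(predicted, expected):
    """Fraction of instructed trials where the decoded answer matched
    the instruction. Both arguments are equal-length label sequences."""
    hits = sum(p == e for p, e in zip(predicted, expected))
    return hits / len(expected)

# Toy run: 10 instructed trials, 9 decoded correctly
expected = ["YES", "NO"] * 5
predicted = ["YES", "NO"] * 4 + ["YES", "YES"]    # one error on the last trial
acc = command_following_accuracy(predicted, expected)
```

An accuracy well above the 50% chance level suggests the patient understood and followed the instructions; accuracies near chance are uninformative rather than proof of absent awareness, since vigilance fluctuates in DOC patients.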
Rehabilitation concepts are nowadays especially important in the therapy and care of DOC patients. For these reasons, the use of tDCS, Functional Electrical Stimulation and vibro-tactile BCI protocols is becoming more and more important.
Finally, Katrin Mayr gave a demonstration of the BCI-based mindBEAGLE system. During her talk, she explained:
- how to assemble the active EEG electrodes on patients
- how to run the auditory and vibro-tactile BCI experiments for command following assessment
- how to ask patients questions that can be answered with the help of mindBEAGLE
- how to use the motor imagery BCI paradigm for command following assessment and for answering questions