University of Otago researchers used NeSI’s Mahuika cluster to train a machine learning model that can help predict and detect cybersickness. The analysis software can be easily incorporated into headsets sold by VR companies. Image: Wren Handman, Pixabay

AI software learns, tracks and predicts cybersickness in virtual reality users

“It would take me a day to train one machine learning model [on my desktop], but I'd train something like 56 models in a day on Mahuika.”

The Challenge:
Cybersickness is a debilitating condition that afflicts virtual reality headset users. It makes virtual reality (VR) training less efficient and VR gaming less fun. To date, there is no automated tool to predict or detect cybersickness within headsets. 

The Solution:
University of Otago researchers used NeSI’s Mahuika supercomputer to train a machine learning model that analysed multichannel EEG and ECG data for signs of cybersickness.

The Outcome:
Researchers developed a model that uses single-channel EEG data to predict and detect cybersickness. The analysis software can be easily incorporated into headsets sold by VR companies.

 

Virtual reality (VR) is an emerging technology finding applications in training, gaming and data visualisation. A December 2022 report from IDC's Worldwide Quarterly Augmented and Virtual Reality Headset Tracker predicted that global shipments of virtual and augmented reality headsets would climb by 31.5% in 2023.

But cybersickness has slowed the rising interest in VR. Cybersickness is a debilitating condition that affects some VR users, causing symptoms including nausea, dizziness and difficulty tracking moving images.

There are no automated tools to predict or detect cybersickness in VR users. This means users cannot know if they are suffering from cybersickness until the symptoms have already begun.

Associate Professor Yusuf Ozgur Cakmak and PhD student Alexander Hui Xiang Yang of the University of Otago have used machine learning models to identify EEG markers of cybersickness that VR headsets could track.

“Apple, Microsoft, and other companies are adopting VR, but there is a problem with ongoing use of VR headsets that human physiology can't yet overcome,” said Yusuf.

Yusuf and Alexander’s earlier research had identified variations in heart rate and brain activity that correlated to cybersickness. But their goal was to make an automated tool that they could include in a VR headset. The pair chose to use electroencephalogram (EEG) data to monitor cybersickness.

“We collaborated with Prof Nikola Kasabov from the Auckland University of Technology. To make it translational, we will likely use EEG data only. And much more importantly, we will use single-channel EEG, which we can easily embed into VR headsets,” said Yusuf.

Determining which single-channel EEG is the key marker for cybersickness is also important to guide new non-invasive therapeutic modalities to alleviate cybersickness.

“Before using NeSI, we were using machine learning and multi-channel EEGs. We couldn’t use the entire EEG dataset because it was quite large. We couldn’t process it with ordinary computers or look at different time segments [of brain activity].”

Yusuf and Alexander used NeSI’s Mahuika supercomputer to give their machine learning analysis of the EEG data the processing power it needed. The project used 895 CPU core hours and 5,150 GB-hours of RAM.

“In our architecture, we set up an artificial simulation of the brain. Instead of having billions or trillions of neurons, we just had 1471 neurons. These neurons were spatially orientated, so each model neuron represented a large cluster of real neurons in a brain,” said Alex.
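To illustrate the idea, a spiking model represents a continuous signal such as EEG as discrete spike events before the network processes it. The sketch below shows one common encoding scheme (threshold-based representation) in Python; it is a simplified, hypothetical illustration, not the team’s actual code, and the threshold value is an arbitrary stand-in.

```python
# Minimal sketch: encode a continuous EEG-like signal into spike
# events, the kind of input a spiking neural network consumes.
# Hypothetical illustration only; threshold chosen arbitrarily.

def encode_spikes(signal, threshold=0.5):
    """Emit a +1/-1 spike whenever the signal moves more than
    `threshold` away from the last spike's level; 0 otherwise
    (threshold-based representation)."""
    spikes = []
    baseline = signal[0]
    for sample in signal:
        delta = sample - baseline
        if delta > threshold:
            spikes.append(1)       # excitatory spike
            baseline = sample
        elif delta < -threshold:
            spikes.append(-1)      # inhibitory spike
            baseline = sample
        else:
            spikes.append(0)       # no spike
    return spikes

eeg = [0.0, 0.2, 0.9, 1.1, 0.4, -0.3, -0.2]
print(encode_spikes(eeg))  # → [0, 0, 1, 0, 0, -1, 0]
```

Each model neuron in such an architecture then reacts to the spike trains arriving from the electrode positions it is spatially mapped to.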

These model neurons were the artificial intelligence representations of electrical activity in the brain. Over a given time span, the model neurons output a processed form of the EEG data. An important part of this work was determining the ideal time span for monitoring brain activity that would lead to high accuracy and a lower performance cost.
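The time-span search described above can be sketched as: slice the recording into windows of each candidate length, score each length, and keep the best. The `score_window` function and the window lengths in the usage example below are hypothetical stand-ins for the team’s actual model training and evaluation.

```python
# Sketch of searching for the best monitoring time span.
# `score_window` is a hypothetical placeholder for training and
# evaluating a model on one segment of EEG data.

def windows(signal, span):
    """Non-overlapping segments of `span` samples each."""
    return [signal[i:i + span] for i in range(0, len(signal) - span + 1, span)]

def best_span(signal, candidate_spans, score_window):
    """Return (span, mean_score) for the best-scoring candidate span."""
    best = None
    for span in candidate_spans:
        segs = windows(signal, span)
        if not segs:
            continue
        mean = sum(score_window(s) for s in segs) / len(segs)
        if best is None or mean > best[1]:
            best = (span, mean)
    return best
```

Because every candidate span requires retraining and rescoring, the search multiplies the computational cost, which is what made HPC resources valuable here.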

“So, while you're sitting down, we can record your brain activity. And then we can tell you you're going to feel sick if you play this video. But we can also identify sickness as you're [using the headset],” said Alex.

Running these machine learning simulations at different time spans was computationally expensive. The team optimised their code for HPC use. On their standard computers, the pair could only process small time segments, while NeSI allowed them to test the entire span of measured data.

“[The consultancy] NeSI’s utility opened my eyes to the possibility of splitting and speeding up my workflows. We had a dialogue about what we wanted and how to split it up into tiny parallel tasks that we could execute on the NeSI platform to get all our outputs,” said Alex.

Alex also attended NeSI’s training sessions to learn how to port the team’s source code to Mahuika and repackage it so the supercomputer could run it.

“We first needed to understand our growing data needs. We consulted with Murray Cadzow, who is part of NeSI’s Research Reference Group, and found that there was a linear growth between our data and processing needs. It was a big help figuring this out, because then we didn't waste time trying to modify the code for speedy training, we just needed to figure out how to package this in a way that multiple computers could handle with HPC,” said Alex.
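Splitting a workload into many small, independent tasks can be sketched with Python’s standard library; on Mahuika this kind of fan-out would typically go through the Slurm scheduler instead, and `train_model` below is a hypothetical placeholder, not the study’s code.

```python
# Sketch: farm out independent training tasks (one per time segment
# and time-span combination) across worker processes. On an HPC
# system this would usually be a scheduler array job; here we use
# the standard library for illustration. `train_model` is a
# hypothetical placeholder returning a dummy result.
from concurrent.futures import ProcessPoolExecutor

def train_model(task):
    """Placeholder: 'train' one model, return (segment, span, score)."""
    segment, span = task
    return (segment, span, segment + span)  # dummy score

def run_all(tasks, workers=4):
    """Run all tasks in parallel, preserving input order."""
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(train_model, tasks))

if __name__ == "__main__":
    tasks = [(seg, span) for seg in range(8) for span in (10, 30, 60)]
    results = run_all(tasks)
    print(len(results))  # 24 independent models, trained in parallel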

The neural network dynamically updates with new information to learn more about cybersickness. It predicted cybersickness using single-channel EEG from the F7 frontal-lobe electrode with 85.9% accuracy. It could also detect cybersickness as it occurred, using EEG from two regions of the brain, with 76.6% accuracy.

The first region was the area under the Cz channel, involved in motor processing and planning, which also acts as a hub for other functionally connected areas of the brain involved in cybersickness. The second was the Fp1 channel associated with the prefrontal cortex, responsible for attention to detail and decision making.

“NeSI allowed us to generate a greater understanding about the condition of cybersickness, and the susceptibility to cybersickness by analysing all timeframes. We were able to find the most accurate measurement timeframe because of NeSI’s computational capabilities,” said Alex.

“It would take me a day to train one machine learning model [on my desktop], but I'd train something like 56 models in a day on Mahuika.”

Manufacturers could apply this research to commercial VR headsets used in medical, aerospace, military and trade training. Monitoring the health of trainees would make VR training more efficient and less unpleasant for those prone to cybersickness.

“The next stage is to take the key findings from this research and develop methods of suppression for VR activity that causes cybersickness. We’re looking for commercial applications and potential partners now.”

Article and open access link:
Yang, A.H.X., Kasabov, N.K. & Cakmak, Y.O. Prediction and detection of virtual reality induced cybersickness: a spiking neural network approach using spatiotemporal EEG brain data and heart rate variability. Brain Inf. 10, 15 (2023). https://link.springer.com/article/10.1186/s40708-023-00192-w


Do you have an example of how NeSI platforms or support advanced your research? We’re always looking for projects to feature as a case study. Get in touch by emailing support@nesi.org.nz.

 

 
