University of Canterbury Engineering researchers Distinguished Professor Geoff Chase and Lecturer Dr Lui Holder-Pearson have been awarded $1.1 million in funding through a partnership between MBIE and Soul Machines, an AI company in Auckland.
Their project, titled ‘AI-driven Two-Way, Feedback Controlled Emotional Recognition Training for Individuals with Autism Spectrum Disorder’, has been granted $1,105,412.00 (excl. GST). The Canterbury engineering academics will work with European research partners, including a UC Adjunct Professor based at Furtwangen University in Villingen-Schwenningen, Germany.
Autism is a neurodevelopmental condition that affects approximately 93,000 New Zealanders. Autism Spectrum Disorder (ASD) diagnoses are growing at a rate of 5% to 10% per year, due largely to increased diagnosis of high-functioning ASD. It is estimated ASD affects about 1 in 44 children, with boys four times more likely to be diagnosed with autism than girls. It can contribute to socially and economically debilitating cognitive problems, including significant co-morbid depression and anxiety.
“Sometimes called ‘social blindness’, the inability to accurately recognise emotions in other people is common and the only therapy is intensive one-to-one or small-group training,” says Dr Holder-Pearson. “This approach is costly, in short supply, and thus often infrequent.”
However, strong growth in high-functioning ASD diagnosis, particularly in boys, threatens to create a “lost generation” unable to achieve their full potential. The UC researchers recognised the need to significantly increase access to, and the positive outcomes of, emotional recognition training therapy for high-functioning ASD individuals.
“Our proposed solution is a virtualised, two-way, feedback-controlled emotional recognition training therapy combining AI, clinical therapy, and real-time subject physiological/emotion recognition measurements to virtualise 1-to-1 training,” Professor Chase says.
It combines three key elements:
- Hyper-realistic and responsive avatars from Soul Machines Digital DNA Studio able to show detailed emotions
- Computer vision to read subject emotional state, reaction rates in integrated tasks, stress levels (via heart rate, etc.), focus, and attention, incorporating critical subject feedback
- Programmed standard, accepted therapeutic methods (behind the avatar) to respond to measured subject behaviour/actions
Together, these technologies enable a true two-way virtualised therapeutic session; current emotion recognition software has no subject feedback and is only one-way.
“Critically, the AI avatars are not just a display, and subject feedback enables a true form of virtualised one-to-one therapy,” Professor Chase says.
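To make the feedback loop described above more concrete, the sketch below shows one way such a two-way session could be structured in code. It is purely illustrative: the class and function names, the simulated sensing, and the simple rule for adapting expression intensity are assumptions for the sake of the example, not the project's actual therapeutic methods or the Soul Machines API.

```python
# Illustrative sketch of a two-way, feedback-controlled training loop.
# All names here (AvatarStub, SensorStub, run_session) are hypothetical,
# not the project's real software or the Soul Machines Digital DNA Studio API.

from dataclasses import dataclass
import random


@dataclass
class SubjectObservation:
    """Measurements a computer-vision/physiology front end might return."""
    recognised_correctly: bool   # did the subject name the displayed emotion?
    reaction_time_s: float       # time taken to respond
    stress_level: float          # 0 (calm) .. 1 (highly stressed), e.g. from heart rate


class AvatarStub:
    """Stand-in for a responsive avatar that can display a target emotion."""
    def show_emotion(self, emotion: str, intensity: float) -> None:
        print(f"Avatar displays '{emotion}' at intensity {intensity:.2f}")


class SensorStub:
    """Stand-in for camera/physiology sensing; here it simply simulates a subject."""
    def observe(self, intensity: float) -> SubjectObservation:
        # In this toy model, more intense expressions are recognised more often.
        correct = random.random() < 0.4 + 0.5 * intensity
        return SubjectObservation(
            recognised_correctly=correct,
            reaction_time_s=random.uniform(1.0, 4.0),
            stress_level=random.uniform(0.0, 1.0),
        )


def run_session(trials: int = 10) -> None:
    """Feedback controller: adapt expression intensity to the subject's responses.

    The adaptation rule (make cues easier after stress or errors, subtler after
    success) is only a placeholder for clinically accepted therapeutic methods.
    """
    avatar, sensors = AvatarStub(), SensorStub()
    emotions = ["happy", "sad", "angry", "surprised", "fearful"]
    intensity = 1.0  # start with clear, easy-to-read expressions

    for trial in range(trials):
        emotion = random.choice(emotions)
        avatar.show_emotion(emotion, intensity)
        obs = sensors.observe(intensity)

        # Two-way feedback: measured subject state drives the next stimulus.
        if obs.stress_level > 0.8 or not obs.recognised_correctly:
            intensity = min(1.0, intensity + 0.1)   # make the cue easier
        else:
            intensity = max(0.2, intensity - 0.1)   # make the cue subtler

        print(f"trial {trial + 1}: correct={obs.recognised_correctly}, "
              f"stress={obs.stress_level:.2f} -> next intensity {intensity:.2f}")


if __name__ == "__main__":
    run_session()
```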
According to the UC engineering academics, a software-based solution driven by accepted clinical therapeutic methods dramatically increases access and scalability while lowering costs. The overall solution also creates a highly extensible platform for other therapies.