Autism spectrum disorder (ASD) is experienced in vastly different ways. Some autistic children behave, feel and communicate idiosyncratically, which can lead to exclusion or barriers to accessing services. Because our society is often poor at including autistic people, many must adapt or conform to be accepted.
Although methods such as behavioural therapy can equip children with ASD with social and emotional skills, they are often expensive, time-consuming and inaccessible. One promising alternative is neurofeedback training (NFT), a computer-based program that gives real-time feedback to help people regulate their own brain activity, particularly cognitive, emotional and behavioural functions. However, NFT is typically available only in clinics.
Even the most cost-efficient methods, such as teaching emotions through card sorting or slide decks, fail to capture autistic children's attention, underscoring the need for more engaging and immersive activities.
What if we could teach these skills through interactive games instead? Fortunately, Yue Lyu, a PhD student in computer science, is leveraging generative AI (GenAI) and augmented reality (AR) to create personalized, home-based intervention tools. For this transformative work, she has received more than $140,000 in awards and scholarships, including the NSERC Postgraduate Scholarship (Doctoral), the Ontario Graduate Scholarship (OGS) and the Autism Scholars Award. The latter was awarded to only three doctoral students last year.
Yue designed and conducted her project according to inclusive research principles, involving autistic people directly in the research design and engaging caregivers, educators and autistic families throughout the interview and design process.

What if we could teach emotional and social skills in a joyful, playful manner? This is the vision of Yue Lyu (left), a third-year PhD student. Together with her supervisors, Professors Jian Zhao (inset on right) and Keiko Katsuragawa (middle), she is creating AI- and AR-based technology to empower autistic children.
Her work could help close a major educational gap; the Ontario Autism Coalition reports that investments in ASD support could save Ontario taxpayers $1.4 billion. One of her projects is EMooly, a tablet-based game that teaches emotion recognition through interactive storytelling. It actively involves caregivers, who play a key role in an autistic child's development.
When the child and their caregiver first open EMooly, they are prompted to take a photo of a household object and to select a cartoon character and an emotion from EMooly's database. Using AI, EMooly generates a story from these elements: if the users choose a cup, a panda and surprise as their object, character and emotion, they watch a story in which a panda receives a cup and is surprised. The child and their caregiver then take turns practising emotions using their tablet's camera. Next, various emoticons pop up on the screen, overlaid on their surroundings, and the users must identify the emoticon that shows surprise.
Yue led a user study in which 24 autistic children and their caregivers were recruited to use either EMooly or a slide deck, a more traditional method, completing a quiz on emotion recognition before and after using each tool. Notably, the EMooly group performed best: their scores were 15 per cent higher than their pre-quiz scores, while the slide deck group's scores dropped. This difference suggests that EMooly's immersive experience can capture and sustain a user's attention in a way that traditional methods cannot.
A demo of EMooly, which teaches emotion recognition through the power of AR and AI
Similarly, Yue led Eggly, the first mobile game to integrate AR and NFT. It tells a story in which two farmers must work together to collect eggs from a bird. If the user pays attention to the characters' social interactions, they are rewarded: the eggs are laid faster and the bird flies higher. If not, the bird flies slowly and erratically, and the background music fades. The game also incorporates AR elements: users can design their own characters with pen and paper, then scan their drawings with a tablet. Using computer vision, Eggly transforms the 2D drawings into playable characters.
Typically, children with ASD exhibit atypical suppression of the mu rhythm, a brain wave associated with understanding and imitating other people's actions, a process that is core to social learning and interaction, including empathy. Throughout Eggly, the user wears an electroencephalogram (EEG) headband that tracks their brain's attention to social interactions and helps normalize mu rhythm suppression. By focusing on these interactions, users can pick up on social cues and communication.
A demo of Eggly, an innovative game that weaves together interactive storytelling, AR and NFT
Creating customizable games is key, whether by allowing children to select a beloved object in EMooly or to design their own characters in Eggly, because every child's interests and developmental level differ. This approach also lets children create situations based on their own preferences and home routines, helping them become more resilient and adaptable to day-to-day challenges.
Like fellow researchers at Waterloo, Yue is leveraging AI and AR to help autistic children interact, express themselves and communicate better. With the latest funding, she can continue to the next phase of her research plan.
“Both Eggly and EMooly were about recognizing and responding to social and communication needs. My future projects aim to go further, towards systems that learn from each child over time. For example, I’m exploring a customizable story-based video game that gently learns a child’s sensory preferences — like whether they enjoy soft visuals or quiet sound — and adjusts the game world accordingly. The goal is to support long-term engagement and learning, while giving kids a sense of control.”
“I’m deeply honoured. These awards feel like a recognition not just of my work, but of the importance of inclusive design in communities. It gives me motivation to keep going, to amplify the voices of neurodivergent users, and to push the boundaries of the design of what assistive tech can look like,” Yue says.
What’s key to Yue’s research is being open to feedback and co-reflecting with the autism community.
“I was surprised by how small design changes — like adjusting the rhythm of a sound or the intensity of the game — could make a huge difference for kids’ comfort. Also, some children responded better when they felt they were ‘guiding’ the system, rather than being guided.”
“It reminded me that autonomy and agency are just as important as functionality.”
Yue proves that play, learning and computer science can come together to empower children’s emotional and social well-being.
To learn more about the research on which this feature is based, please see the following research publications:
Yue Lyu, Di Liu, Pengcheng An, Xin Tong, Huan Zhang, Keiko Katsuragawa, Jian Zhao. EMooly: Supporting Autistic Children in Collaborative Social-Emotional Learning with Caregiver Participation through Interactive AI-infused and AR Activities. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 8(4), (2024).
Yue Lyu, Pengcheng An, Yage Xiao, Zibo Zhang, Huan Zhang, Keiko Katsuragawa, Jian Zhao. Eggly: Designing Mobile Augmented Reality Neurofeedback Training Games for Children with Autism Spectrum Disorder. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 7(2), (2023).