Le Petit Care

November 2024 ~ May 2025

🏆️ ACM CHI 2025 SDC Finalist (Top 4)

Le Petit Care: A Child-Attuned Design for Personalized ADHD Symptom Management Through AI-powered Extended Reality

Abstract

Le Petit Care is an AI-powered Extended Reality (XR) solution designed to complement pharmacological treatment for ADHD and address the limitations of existing digital therapeutics. The project aims to help children alleviate negative self-perception and guilt often experienced during diagnosis and treatment, while enabling them to manage ADHD symptoms more effectively and in a personalized way. To achieve this, we conducted expert interviews with digital healthcare specialists, child development professors, and psychiatrists, alongside interdisciplinary literature reviews, in order to establish a practical and evidence-based design approach. The system closely reflects the physiological and behavioral characteristics unique to children with ADHD, integrating them naturally into a user-centered design and storytelling framework. As a result, Le Petit Care is designed to quantitatively screen ADHD symptoms by analyzing multidimensional data—such as head movement, gaze tracking, behavior, and voice—through an AI model within a child-friendly virtual content environment based on DSM-5 diagnostic criteria. Based on these analyses, the system then provides each child with personalized behavioral development training.

Introduction

This project was independently initiated and led by me from the second semester of my junior year to the first semester of my senior year, without any affiliation or support from a research lab or company. I developed and pitched an idea leveraging sensing and AI technologies in a cross-campus student club, built a team around it, and led a multidisciplinary collaboration that included students from medicine, design, biology, and other backgrounds. Le Petit Care was submitted to the Student Design Competition 2025, an official program of the ACM CHI (Conference on Human Factors in Computing Systems)—the world’s premier venue in Human-Computer Interaction research. The project advanced through the preliminary round to the top 12 teams and was later selected as a Finalist (Top 4) following an on-site presentation at CHI 2025 in Yokohama, Japan, receiving an official certificate from ACM SIGCHI. This competition brings together undergraduate and graduate students worldwide to propose innovative, human-centered technologies and service designs, evaluating both the originality of research design and its social contribution—making it a program of significant academic and practical value.

BACKGROUND

Attention-Deficit/Hyperactivity Disorder (ADHD) is a neurodevelopmental disorder that primarily manifests during childhood, leading to deteriorated mental health, academic difficulties, and a substantial societal burden. According to statistical data, the global prevalence of ADHD among children aged 3 to 12 years is estimated at approximately 7.6% (95% confidence interval: 6.1–9.4%). An analysis conducted in Europe estimated the annual societal cost of ADHD at around USD 42.5 billion, comprising USD 13.6 billion in education-related costs, USD 7.9 billion in healthcare and mental health services, and USD 21.1 billion in costs associated with crime and delinquency. Therefore, appropriate therapeutic intervention should not merely aim to alleviate symptoms but also seek to enhance overall mental health and emotional well-being, while promoting the recovery of daily functional abilities. In this context, implementing timely and effective interventions during childhood is of critical importance.


Medication is widely regarded as the most effective treatment for ADHD, yet many children experience guilt and psychological distress during diagnosis and therapy. Since these issues cannot be solved by medication alone, personalized intervention strategies are essential. Approaches such as play therapy, social skills training, and cognitive behavioral therapy have been introduced, but their effectiveness is often limited by narrow symptom focus, low accessibility, and difficulty in objective evaluation. Digital Therapeutics (DTx), evidence-based software designed to treat or manage medical conditions, offer a promising alternative thanks to their accessibility and use of personalized data. However, current ADHD DTx solutions often fail to capture diverse symptoms or maintain user engagement.

Advances in emerging technologies, including AI, are opening new possibilities for overcoming the limitations of traditional approaches. At the same time, user-centered design that reflects children's psychological and emotional needs has proven highly effective in such contexts. A notable example is Stanford Design School’s redesign of the MRI experience for children who feel extreme anxiety in enclosed spaces. By applying immersive themes such as “underwater exploration,” “space travel,” and “pirate adventure,” the team successfully reduced fear and transformed the MRI into a positive and engaging experience. Building on this insight, our work deeply examines the unique needs of children with ADHD and focuses on addressing them holistically. Through extensive research and iterative design, we derived key insights and refined our concept. Leveraging a multidisciplinary approach—spanning medicine, computer science, visual design, and biology—we propose a new framework that goes beyond existing digital therapeutics and offers a more personalized, empathetic, and effective intervention experience.

RESEARCH

Based on problem definition and preliminary research, we planned a personalized, AI-driven ADHD solution that leverages XR sensing capabilities and multimodal data. To gather diverse perspectives, we conducted in-depth interviews with a child development professor, a psychiatrist, and the CEO of a digital healthcare investment firm. Insights from these interviews guided the direction of our planning and design.


The child development professor highlighted that children with ADHD struggle to selectively focus on important auditory information and often experience difficulty with working memory, sequence retention, and delayed gratification. These cognitive characteristics lead to inconsistent task performance, which is frequently misinterpreted as “intentional misbehavior.” ADHD rarely appears alone and often co-occurs with conditions such as oppositional defiant disorder, tic disorders, dyslexia, depression, and anxiety. As a result, children may miss auditory cues, fail to follow rules, and face social isolation, reinforced by repeated scolding that lowers self-esteem. For system design, the professor emphasized minimizing unnecessary stimuli to ensure accurate assessment and recommended structured personalization to account for large individual differences by age and symptom severity.


The digital therapeutics investor noted that prior AI-based ADHD screening studies often lack clinical rigor. They advised training models on real game-based performance data from clinically diagnosed ADHD patients versus control groups, and clearly defining screening accuracy metrics, even in student competitions. They expressed skepticism toward XR in healthcare, explaining that wearable interventions must provide value that outweighs discomfort. They referenced three VR-based DTx teams that struggled because they could not clearly justify the necessity of the technology. On data security, they noted that de-identified information with user consent and HIPAA-compliant cloud services is sufficient, and emphasized prioritizing practical value and verifiable utility over technical complexity—demonstrating clear benefit to children with ADHD and their families above all.


The psychiatry resident suggested additional XR-based screening approaches. First, to distinguish attention deficits from cognitive delays, they recommended measuring not only errors but also consecutive correct responses and error-free task completion. Sensitivity identifies true ADHD cases, while specificity excludes non-ADHD cases; although sensitivity is critical to avoid missing at-risk children, including correct-response metrics can improve specificity by ruling out cognitive impairment. Second, they recommended incorporating impulsivity measures—capturing actions where children behave without considering context. For example, the system could track how often a child removes the device or analyze physical behavior (e.g., body movement or voice patterns) rather than relying solely on in-game metrics.
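To make the screening-accuracy framing concrete, here is a minimal sketch of how sensitivity and specificity could be computed from labeled screening outcomes; the function and the example data are illustrative only and not taken from our system.

```python
# Minimal sketch: sensitivity and specificity of a binary ADHD screening output.
# Labels and predictions are assumed to be 1 (ADHD) or 0 (non-ADHD); all names
# here are illustrative, not taken from our implementation.

def screening_accuracy(y_true, y_pred):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)

    sensitivity = tp / (tp + fn) if (tp + fn) else 0.0  # catch at-risk children
    specificity = tn / (tn + fp) if (tn + fp) else 0.0  # rule out non-ADHD cases
    return sensitivity, specificity

# Example with six clinically labeled participants vs. the model's screening output.
print(screening_accuracy([1, 1, 1, 0, 0, 0], [1, 1, 0, 0, 0, 1]))  # approximately (0.67, 0.67)
```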


Based on insights gained from expert interviews, we derived three core design foundations: User Utility, Cognitive Needs, and Data-Driven Personalization. From a user utility perspective, we identified the importance of emotional engagement through storytelling and gamification to sustain participation among children with ADHD. In terms of cognitive needs, we established visual design principles tailored to the neurobiological characteristics of ADHD. For data-driven personalization, we pursued a quantitative assessment framework that integrates XR-based sensing and AI technologies. To overcome the limitations of the traditional ADHD-RS, which relies on subjective reports from parents and teachers, our approach leverages sensor data from XR devices—including gaze tracking, head movement, and voice input—to capture subtle behavioral patterns that conventional methods may miss. These multimodal signals are then aligned with DSM-5 diagnostic criteria to deliver personalized interventions for each child.



DESIGN

Through expert interviews and a literature review, we confirmed that existing ADHD digital therapeutics show limitations in addressing children's behavioral and cognitive needs. Prior studies have demonstrated that storytelling and gamification significantly enhance engagement and therapeutic outcomes. In particular, systems that integrate fairy-tale narratives with gradual difficulty adjustments have shown positive effects on motivation, sustained attention, and continuous participation, establishing them as effective design strategies for optimizing ADHD management. A child development professor emphasized that children with ADHD lose interest quickly if initial scenes are not sufficiently engaging or appropriately challenging, and highlighted that reducing social stigma and negative self-perception is essential for emotional development and continued treatment adherence. These insights informed a core design principle: fostering emotional connection with story content enhances sustained engagement and therapeutic efficacy.




We also established visual design principles grounded in the neurobiological and behavioral characteristics of children with ADHD. Research shows that due to lower dopamine levels, they require more cognitive resources to process blue hues, while colors in the mid- to long-wavelength range (e.g., red, green, orange) yield higher processing efficiency. Accordingly, we prioritized these long-wavelength colors in the UI and content modeling to reduce visual fatigue. In addition, considering children’s limited reading ability and the risk of motion sickness in XR environments, we minimized text and incorporated intuitive graphic elements. Because children with ADHD tend to lose interest quickly if the initial scenes are not engaging due to short attention spans and susceptibility to distraction, we designed the introduction to be visually captivating to effectively capture and sustain attention.
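As a loose illustration of this color principle, the snippet below sketches a palette check that flags blue-dominant hues; the palette values and the hue band are assumptions made for the example, not settings from our Unity project.

```python
# Hypothetical palette check: prefer mid- to long-wavelength hues (reds, oranges,
# greens) and flag blue-dominant colors. RGB values and the hue band are assumptions.
import colorsys

PALETTE = {
    "primary": (0.93, 0.45, 0.20),  # warm orange
    "accent": (0.80, 0.20, 0.25),   # red
    "scene": (0.35, 0.65, 0.30),    # green
}

def is_long_wavelength(rgb, blue_band=(0.5, 0.75)):
    """Return False if the color's hue falls in the (assumed) blue band."""
    hue, _, _ = colorsys.rgb_to_hsv(*rgb)
    return not (blue_band[0] <= hue <= blue_band[1])

assert all(is_long_wavelength(color) for color in PALETTE.values())
```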





If user data can be quantitatively analyzed based on clear criteria, it becomes possible to develop personalized management programs for ADHD. In psychiatry, where biological markers are limited, diagnostic tools primarily rely on symptom and behavioral assessments. The ADHD Rating Scale (ADHD-RS), a DSM-5–based 18-item questionnaire, provides a reliable standard; however, it depends on subjective reports from parents and teachers. To address this limitation, technologies such as the Continuous Performance Test (CPT) have been used as complementary methods to quantify symptoms while minimizing environmental influences. Recent studies further leverage eye-tracking and AI to detect linguistic and movement patterns that are difficult to capture through traditional methods. Notably, XR technologies create immersive 3D environments that enable observation and measurement of real behaviors in simulated contexts, while embedded sensors capture, process, and transmit multidimensional data—such as head movement, hand gestures, gaze, and voice input—in real time. By combining XR-based data collection with ADHD-RS and AI-driven screening techniques, it becomes possible to design more comprehensive and personalized ADHD symptom-management solutions.
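To picture how XR event streams could complement ADHD-RS with CPT-style indicators, here is a minimal sketch that derives omission errors, commission errors, and reaction-time variability from a simplified trial log; the trial format and field layout are assumptions for the example.

```python
# Sketch: CPT-style indicators from a simplified XR trial log.
# Each trial is (is_target, responded, reaction_time_s); the format is assumed.
from statistics import mean, pstdev

def cpt_metrics(trials):
    targets = [t for t in trials if t[0]]
    non_targets = [t for t in trials if not t[0]]

    omissions = sum(1 for t in targets if not t[1])    # missed targets -> inattention
    commissions = sum(1 for t in non_targets if t[1])  # false alarms -> impulsivity
    rts = [t[2] for t in targets if t[1]]

    return {
        "omission_rate": omissions / len(targets) if targets else 0.0,
        "commission_rate": commissions / len(non_targets) if non_targets else 0.0,
        "rt_mean": mean(rts) if rts else None,
        "rt_variability": pstdev(rts) if len(rts) > 1 else None,
    }

# Example: four target trials and two non-target trials.
print(cpt_metrics([(True, True, 0.62), (True, False, 0.0), (True, True, 0.48),
                   (True, True, 0.90), (False, True, 0.30), (False, False, 0.0)]))
```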



Based on this design approach, we implemented a program inspired by The Little Prince (Le Petit Prince). Drawing from its themes of connection and self-discovery, we created storytelling-driven XR content that helps children with ADHD reduce negative self-perception and rebuild self-efficacy. The experience follows a journey in which the child travels across stars with the Little Prince and completes various missions. Through interactions such as conversing with the Little Prince, caring for the rose, and collecting flight fuel, the child is naturally assessed on key cognitive and social abilities, including social skills, attention, and working memory. All interaction data is collected in real time through a single hardware device and analyzed by an AI model, which then delivers personalized developmental training content tailored to each child’s profile.

IMPLEMENTATION

The screening phase consists of three immersive scenarios designed to evaluate ADHD-related behavioral indicators through natural and playful interactions. In Step 1, “Conversation with the Little Prince,” the system analyzes speech patterns, gaze, and body movements using voice recognition and tracking to assess social skills and impulse control. In Step 2, “Caring for the Rose,” children perform nurturing tasks—such as watering the rose or protecting it from sandstorms—to measure self-regulation and attention while observing their responses to random distractions. Finally, in Step 3, “Collecting Airplane Fuel,” players gather symbolic resources like hope, courage, and starlight in sequence, evaluating working memory and executive function. All interaction data are collected and analyzed in real time through the XR device, and the AI model screens symptoms against DSM-5 ADHD indicators, providing personalized behavioral development training content accordingly.
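A minimal sketch of how each scenario's indicators might be captured as one structured record per session is shown below; the field names and metrics are hypothetical stand-ins for the indicators described above, not our actual data model.

```python
# Hypothetical per-session screening record covering the three scenarios.
# Field names are illustrative stand-ins for the indicators described above.
from dataclasses import dataclass, field

@dataclass
class ConversationMetrics:                 # Step 1: social skills, impulse control
    interruptions: int = 0                 # times the child talks over the character
    gaze_on_character_ratio: float = 0.0   # share of time (0..1) looking at the Prince
    mean_response_delay_s: float = 0.0

@dataclass
class RoseCareMetrics:                     # Step 2: self-regulation, attention
    task_completion_ratio: float = 0.0
    distraction_reactions: int = 0         # gaze/head shifts toward injected distractions

@dataclass
class FuelCollectionMetrics:               # Step 3: working memory, executive function
    correct_sequence_length: int = 0
    sequence_errors: int = 0

@dataclass
class ScreeningSession:
    child_id: str
    conversation: ConversationMetrics = field(default_factory=ConversationMetrics)
    rose_care: RoseCareMetrics = field(default_factory=RoseCareMetrics)
    fuel: FuelCollectionMetrics = field(default_factory=FuelCollectionMetrics)
```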


The screening pipeline is structured as XR device → local module → AI model → server → database. The XR device captures multimodal data in real time — including voice, gaze, and head and hand movements — while the local module performs initial preprocessing such as noise removal, synchronization, and coordinate normalization. The refined data are then analyzed by the AI model, which applies DSM-5 and ADHD-RS–based algorithms to detect patterns related to attention, hyperactivity, and impulsivity. The resulting AI outputs and metadata are transmitted to the central server and stored in a database linked to each user’s profile and historical records. This integrated system enables the generation of personalized training content that reflects both individual screening data and the unique characteristics of each child.
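As an illustration of the local-module preprocessing step, the sketch below time-aligns sensor streams onto a shared grid and normalizes head positions to a session-relative origin; the sample format and fixed resampling step are simplifying assumptions, not our production code.

```python
# Sketch of the local preprocessing module: time alignment plus coordinate
# normalization. The sample format {"t": seconds, "x": ..., "y": ..., "z": ...}
# and the fixed resampling step are simplifying assumptions.

def synchronize(streams, step=0.02):
    """Resample every stream onto a shared time grid (nearest sample per tick)."""
    start = max(s[0]["t"] for s in streams.values())
    end = min(s[-1]["t"] for s in streams.values())
    grid, aligned = [], {name: [] for name in streams}
    t = start
    while t <= end:
        grid.append(round(t, 3))
        for name, samples in streams.items():
            aligned[name].append(min(samples, key=lambda s: abs(s["t"] - t)))
        t += step
    return grid, aligned

def normalize_head(samples):
    """Express head positions relative to the first sample (session origin)."""
    ox, oy, oz = samples[0]["x"], samples[0]["y"], samples[0]["z"]
    return [{**s, "x": s["x"] - ox, "y": s["y"] - oy, "z": s["z"] - oz} for s in samples]
```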


The Behavioral Development Phase is personalized based on each child’s screening results and focuses on six core cognitive and emotional domains: attention, working memory, executive function, impulse control, behavioral regulation, and emotional regulation. Each domain is trained naturally through storytelling-based missions inspired by The Little Prince’s journey. For example, on the King’s Planet, children play an emotion-matching game to understand and empathize with others, strengthening emotional regulation and social awareness, while on the Merchant’s Planet, they trade items in a memorized order to enhance working memory and executive function. Beyond simple cognitive training, this phase integrates the narrative of understanding and growth found in The Little Prince, helping children develop a positive self- and social perception while fostering emotional stability and self-efficacy.
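To show how screening results could drive mission selection across the six domains, here is a hypothetical selection rule; the domain scores, the threshold, and every planet-to-mission pairing beyond the King's and Merchant's Planets described above are assumptions made for the sketch.

```python
# Hypothetical mapping from screened domains to training missions.
# Scores are assumed to lie in 0..1 (lower = weaker); the threshold and the
# missions not described in the text above are illustrative placeholders.

MISSIONS = {
    "emotional_regulation": "King's Planet: emotion-matching game",
    "working_memory": "Merchant's Planet: ordered trading game",
    "executive_function": "Merchant's Planet: ordered trading game",
    "attention": "Lamplighter's Planet: timed lighting task",        # assumed mission
    "impulse_control": "Geographer's Planet: wait-and-report task",  # assumed mission
    "behavioral_regulation": "Rose garden: routine care task",       # assumed mission
}

def plan_training(domain_scores, threshold=0.6, max_missions=3):
    """Pick missions for the weakest domains that fall below the threshold."""
    weak = sorted((score, domain) for domain, score in domain_scores.items() if score < threshold)
    return [MISSIONS[domain] for _, domain in weak[:max_missions]]

print(plan_training({"attention": 0.4, "working_memory": 0.5, "impulse_control": 0.8,
                     "executive_function": 0.7, "emotional_regulation": 0.65,
                     "behavioral_regulation": 0.9}))
```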


Le Petit Care was developed with Unity (C#), Python, and MariaDB, and prototyped on METALENSE 2 AR glasses. Key scenarios—including conversations with the Little Prince, caring for the rose, and collecting flight fuel—run in an MR environment, where gaze, head and hand movement, and voice data are captured and analyzed in real time to generate performance reports. By leveraging mixed reality, the system minimizes the risk of children colliding with real-world objects while keeping them connected to their physical surroundings, enabling safe yet immersive task participation in a seamlessly blended physical-virtual environment.
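As a small example of how session results might be persisted from the Python side, the snippet below writes one screening record to MariaDB using the widely used mysql-connector-python client; the table, columns, and connection details are placeholders rather than our actual schema.

```python
# Sketch: persisting one screening session to MariaDB from Python.
# Requires the mysql-connector-python package; the table, columns, and
# connection details below are placeholders, not our actual schema.
import json
import mysql.connector

def save_session(child_id, metrics):
    conn = mysql.connector.connect(
        host="localhost", user="lepetit", password="***", database="lepetit_care"
    )
    try:
        cur = conn.cursor()
        cur.execute(
            "INSERT INTO screening_sessions (child_id, metrics_json) VALUES (%s, %s)",
            (child_id, json.dumps(metrics)),
        )
        conn.commit()
    finally:
        conn.close()
```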


USER STUDY

A user study was conducted with seven participants to collect qualitative feedback on interaction design and immersion, along with an expert interview with a psychiatrist to evaluate project validity. The usability test revealed that the voice dialogue system’s excessive length and repetition were major issues. Participants noted that “the dialogue was too long and monotonous, making it difficult to stay focused,” especially for younger users who were likely to lose interest. In the Rose care mission, the lack of clear objectives and insufficient visual feedback made interactions unintuitive. Overall, improvements were needed in dialogue conciseness, visual cues, and interaction clarity. The psychiatrist suggested measuring both correct and incorrect responses to improve diagnostic accuracy and recommended incorporating context-aware behavioral metrics for a more balanced assessment of impulsivity. Additionally, technical enhancements were advised to address limitations in lower-body and hand-movement tracking.

CONCLUSION

While medication-based approaches for ADHD are effective in alleviating symptoms, they fall short in addressing critical issues such as identity confusion, social stigma, and psychological burden. This project introduces a new approach that combines data-driven screening with personalized content designed to strengthen self-efficacy. Our goal was to prioritize practical value and user benefit, helping children with ADHD manage their condition effectively, enhance self-awareness, and ultimately improve their quality of life. In terms of model validity, prior studies using data from more than 2,000 children demonstrated significant improvements in AI diagnostic accuracy, suggesting strong potential for ongoing refinement of our approach. Furthermore, integrating additional algorithms or outside-in tracking systems to enhance data quality could further increase diagnostic precision. As research advances toward a clearer understanding of the neurobiological mechanisms of ADHD, AI-based screening systems hold the potential not only to complement but eventually even replace traditional diagnostic methods.



OUTCOME

Under the supervision of my HCI lab professor at Kyung Hee University, I wrote an eight-page paper and produced a five-minute introduction video and poster for submission. The project passed the preliminary review with an acceptance rate of 23.4%, advancing to the finals. In May 2025, I attended CHI 2025 in Yokohama, Japan, where I presented the poster and underwent on-site evaluation, competing with undergraduate and graduate teams from institutions such as Georgia Tech, UCLA, University College London, and Shanghai Jiao Tong University. Based on the judges’ evaluations—including industry professionals from Google and IBM and faculty from Stanford University and Carnegie Mellon University—our team was selected as one of the top 4 finalists, and I delivered an oral presentation on stage two days later.




MATERIALS

This project has been published as an Extended Abstract in the ACM Digital Library. To access the Full Paper, please click the image on the right to be redirected to the publication page. The paper is available for free under Open Access.


