This project aims to develop virtual simulations of human passengers and cabin environments. The virtual passengers combine state-of-the-art conversational capabilities with custom visuals and animations to closely mimic real humans.
To better understand how passengers respond to in-flight incidents and how such incidents affect passenger comfort and wellness, the team proposes the use of virtual passengers to simulate a variety of complicated passenger requests and incidents.
Expertise, infrastructure, and relevant background work at InteractAI will be drawn upon to add value to the project.
Passenger care requires an appropriate understanding of how passengers might feel or respond when placed in a variety of situations. As an airplane cabin is a highly confined environment where passengers sit in close proximity to one another, it is important to study how passenger-to-passenger interactions, behaviour, and seating placement might affect another passenger's comfort and wellness.
Through this study, the team proposes an augmented reality environment in which virtual passengers, equipped with conversational capabilities and encompassing a range of possible behavioural types, are placed around the cabin in either a controlled or randomised manner (see the sketch below).
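As an illustration of the two placement modes, the following minimal Python sketch assigns virtual passengers to cabin seats either according to a fixed plan or at random. The behaviour taxonomy, seat labels, and function names here are hypothetical placeholders, not the project's actual design.

```python
import random
from dataclasses import dataclass

# Hypothetical behavioural types; the real taxonomy would come from the study design.
BEHAVIOUR_TYPES = ["calm", "talkative", "restless", "demanding"]

@dataclass
class VirtualPassenger:
    seat: str
    behaviour: str

def controlled_placement(plan):
    """Place passengers according to a fixed seat -> behaviour plan."""
    return [VirtualPassenger(seat, behaviour) for seat, behaviour in plan.items()]

def randomised_placement(seats, n, seed=None):
    """Assign n passengers to randomly chosen seats with random behaviour types."""
    rng = random.Random(seed)  # seeded for reproducible trials
    chosen = rng.sample(seats, n)
    return [VirtualPassenger(s, rng.choice(BEHAVIOUR_TYPES)) for s in chosen]

# Example: a small cabin grid with rows 1-5 and seats A-D.
seats = [f"{row}{col}" for row in range(1, 6) for col in "ABCD"]
print(randomised_placement(seats, 4, seed=42))
print(controlled_placement({"3A": "demanding", "3B": "calm"}))
```

Seeding the random placement would let the same randomised cabin configuration be replayed across participants, keeping controlled and randomised trials comparable.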
Dealing with demanding requests is paramount for all crew members in order to maximise passengers' satisfaction. This study can identify the variables and settings under which passengers experience the highest levels of discomfort and, from these findings, suggest optimised approaches to reducing or eliminating that discomfort.
Our collaborative study with the Yong Loo Lin School of Medicine revealed the benefits of using conversationally capable virtual humans for understanding human-computer interaction and user learning. However, we identified a lack of research on recognising emotions, specifically empathy, from users. Recognising empathy is crucial for nuanced conversations and user skill development. While emotion recognition is extensively studied, empathy recognition, particularly in augmented reality human-computer interaction, remains understudied. Our project aims to address this gap by leveraging multimodal learning to train deep learning models for empathy recognition, surpassing conventional methods that rely on a single data domain or modality for classification (see the sketch below).
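As a rough illustration of the multimodal direction, the PyTorch sketch below shows a minimal late-fusion classifier over two feature streams, such as audio and text embeddings. The feature dimensions, class count, and fusion strategy are assumptions made for illustration, not the project's finalised architecture.

```python
import torch
import torch.nn as nn

class EmpathyFusionNet(nn.Module):
    """Late-fusion empathy classifier over two modalities (illustrative dimensions)."""
    def __init__(self, audio_dim=128, text_dim=256, n_classes=3):
        super().__init__()
        # Per-modality encoders project each feature stream to a shared size.
        self.audio_enc = nn.Sequential(nn.Linear(audio_dim, 64), nn.ReLU())
        self.text_enc = nn.Sequential(nn.Linear(text_dim, 64), nn.ReLU())
        # Concatenated representation feeds a small classification head.
        self.head = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, n_classes))

    def forward(self, audio_feats, text_feats):
        fused = torch.cat([self.audio_enc(audio_feats), self.text_enc(text_feats)], dim=-1)
        return self.head(fused)

# Smoke test: random tensors stand in for real extracted embeddings.
model = EmpathyFusionNet()
logits = model(torch.randn(8, 128), torch.randn(8, 256))
print(logits.shape)  # torch.Size([8, 3])
```

The point of the fusion step is that the classifier sees both modalities jointly, unlike conventional single-modality methods that classify from one data domain alone.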
Additionally, we aim to demonstrate that optimised frameworks and deep learning techniques can streamline the creation of realistic virtual humans, reducing resource requirements and enabling widespread adoption across mainstream platforms.
The SIA-NUS Digital Aviation Corporate Laboratory is part of NUS's Institute of Operations Research and Analytics.