Sensing Humans (Bio/Brain/Face/Movement/VR)

See Related Papers and Related Projects:
- AI-based Exploration of Art and Design
- AI Cognitive Creativity
- AI Affective Virtual Human
- XR Avatars: Edu, Coaches, Health

Researchers: Steve DiPaola, Meehae Song

About: Our lab has extensive experience using different sensing technologies to better understand and incorporate human intent and interaction. These include eye tracking; facial emotion recognition (DiPaola et al., 2013); gesture, body, and hand tracking; bio-sensing with heart rate and EDA (Song & DiPaola, 2015); and brain waves (BCI). We use them both for human understanding and for more human-centred interaction in affective generative systems, where sensing can gauge the viewer's reception of the generated graphics (still, video, VR).

The Research: Emotional facial tracking using a camera and AI software. Motion, gesture, and body tracking using overhead cameras and the MS Kinect. Hand tracking via our own data gloves and the Leap Motion Controller. Eye tracking via our Pupil eye tracker. Bio-sensing (heart rate and EDA) via our Empatica E4 watch. Brain waves via Muse and other systems.

Setup and Results: Below are some examples of our tracking systems. All our 2D, 3D, and VR systems have an abstraction layer with software modules supporting several advanced input technologies, such as emotion tracking, motion tracking, and bio-sensors.
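The abstraction-layer idea above can be sketched in a few lines: each device is wrapped in a module that exposes the same interface, and the 2D/3D/VR scene polls the layer rather than the devices themselves. This is a minimal illustrative sketch, not the lab's actual implementation; the class names, the 0-1 normalization, and the example heart-rate bounds are all assumptions.

```python
from abc import ABC, abstractmethod
from typing import Dict


class SensorModule(ABC):
    """Common interface every input-device adapter implements (assumed design)."""

    @abstractmethod
    def poll(self) -> Dict[str, float]:
        """Return the latest readings, normalized to 0.0-1.0 per channel."""


class HeartRateModule(SensorModule):
    """Example adapter for a wrist-worn heart-rate stream (hypothetical bounds)."""

    def __init__(self, resting_bpm: float = 60.0, max_bpm: float = 180.0):
        self.resting_bpm = resting_bpm
        self.max_bpm = max_bpm
        self._bpm = resting_bpm

    def feed(self, bpm: float) -> None:
        # In a real system this would be driven by the device's data stream.
        self._bpm = bpm

    def poll(self) -> Dict[str, float]:
        level = (self._bpm - self.resting_bpm) / (self.max_bpm - self.resting_bpm)
        return {"heart_rate": max(0.0, min(1.0, level))}


class InputLayer:
    """The scene talks to this layer, never to individual devices."""

    def __init__(self) -> None:
        self.modules: Dict[str, SensorModule] = {}

    def register(self, name: str, module: SensorModule) -> None:
        self.modules[name] = module

    def poll_all(self) -> Dict[str, float]:
        readings: Dict[str, float] = {}
        for module in self.modules.values():
            readings.update(module.poll())
        return readings
```

Swapping a Kinect for an overhead camera, or a Muse for another BCI, then only means registering a different module; the rendering side is unchanged.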


See samples of this work in the videos below:


Video 1: DiPaola in our lab, demonstrating brain and heart-rate sensing for health, where you can control and visualize your heart and feelings, moving from the universe, to flocking birds, to your own beating heart, in support of mental health.

Video 2: A mental health counsellor uses our system to create (dream of) a happy place and, via brain and breathing control, brings the VR patient to the happy, calm place she has created with her mind and breathing. For mental health: the outer sphere expands and contracts with her breathing in and out, and the flock of birds is driven by her brain waves (alpha waves here).
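The two mappings described in that caption (breathing → outer sphere, alpha power → bird flock) can be sketched as simple transfer functions. This is a hypothetical illustration of the idea, not the lab's code; the function names, the sinusoidal breath model, and the cohesion range are assumptions.

```python
import math


def breath_to_sphere_scale(phase: float, base: float = 1.0, amplitude: float = 0.25) -> float:
    """Map a breathing phase (0..1 over one inhale/exhale cycle) to the
    outer sphere's scale: the sphere grows on inhale, shrinks on exhale.
    (Assumed sinusoidal model of the breath cycle.)"""
    return base + amplitude * math.sin(2.0 * math.pi * phase)


def alpha_to_flock_cohesion(alpha_power: float, lo: float = 0.2, hi: float = 1.0) -> float:
    """Map normalized alpha-band power (0..1, higher = calmer) to flock
    cohesion: a calmer mind pulls the birds into a tighter flock.
    (Hypothetical linear mapping; range lo..hi is illustrative.)"""
    alpha_power = max(0.0, min(1.0, alpha_power))
    return lo + (hi - lo) * alpha_power
```

Each frame, the renderer would evaluate these against the latest sensor readings and feed the results to the sphere's transform and the flocking simulation's cohesion weight.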


More studies with heart rate (Empatica watch) and breathing via our systems and generated graphics:

Video 3

Video 4

Video 5

Video 6


Video 7: Breath-controlled art.

Video 8: Emotional facial recognition combined with movement/placement recognition and hand and finger tracking, where our AI-aware avatar responds.






—— PAPERS: Sensing Humans (Bio/Brain/Face/Movement/VR) ——

Sound and Visual Entrainment in Affective Bio-Responsive Multi-User VR Interactives

by Song M; DiPaola S – Conference – 5 pages
Affective Computing and Intelligent Interaction Workshops and Demos (ACIIW) (2023)

Multimodal Entrainment in Bio-Responsive Multi-User VR Interactives

by Song M; DiPaola S – Conference – 5 pages. Late Breaking Paper
ACM International Conference on Multimodal Interaction (ICMI) (2023)

Combining Bio-Sensing, Multimodal Control, and Artificial Intelligence for Bio-Responsive Interactives

by DiPaola S; Song M – Conference – Workshop: Modeling Socio-Emotional and Cognitive Processes. 5 pages
ACM International Conference on Multimodal Interaction (ICMI) (2023)

Social Prescribing Across the Lifespan with Virtual Humans

by Yalcin ON; Moreno S; DiPaola S – Conference – No. 56. pp. 1-3
ACM International Conference on Intelligent Virtual Agents (IVA) (2020)

Framework for a bio-responsive VR for interactive real-time environments and interactives

by Song M; DiPaola S – Conference – British Computer Society. pp. 377-384
Electronic Visualisation and the Arts (EVA) (2017)