Human-Human Interaction
Our lab investigates the intricate, and often unpredictable, dynamics of human-to-human interaction. A cornerstone of this work is our dual eye-tracking system, i+i (Thorsson et al., 2024, Scientific Reports), which captures and analyzes the gaze patterns of two interacting participants in real time. We are currently extending the system by coupling it with biometric measures, such as heart rate and skin conductance, to examine the physiological changes that occur during interactions. This multimodal approach provides a richer understanding of joint attention, mutual gaze, turn-taking, and the emotional and physiological states that underpin social coordination. Our research in this area is ongoing, with the goal of uncovering new insights into human connection and developing innovative applications for education, clinical interventions, and beyond.
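One core analysis in dual eye-tracking is detecting mutual gaze, i.e. intervals where both participants look at each other at the same time. The sketch below is purely illustrative and assumes nothing about the actual i+i implementation: the `GazeSample` structure, the `on_partner` flag, and the 200 ms minimum duration are hypothetical placeholders.

```python
from dataclasses import dataclass

@dataclass
class GazeSample:
    t: float           # timestamp in seconds
    on_partner: bool   # is this participant looking at the other?

def mutual_gaze_episodes(a, b, min_duration=0.2):
    """Return (start, end) intervals where both participants look at each
    other for at least `min_duration` seconds. `a` and `b` are
    time-aligned lists of GazeSample, one per participant."""
    episodes = []
    start = None
    for sa, sb in zip(a, b):
        if sa.on_partner and sb.on_partner:
            if start is None:
                start = sa.t        # mutual gaze begins
        elif start is not None:
            if sa.t - start >= min_duration:
                episodes.append((start, sa.t))
            start = None            # mutual gaze ended
    # close an episode still open at the end of the recording
    if start is not None and a and a[-1].t - start >= min_duration:
        episodes.append((start, a[-1].t))
    return episodes
```

Episodes extracted this way can then be time-aligned with physiological streams (heart rate, skin conductance) sampled on the same clock.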

How we work with Affective Computing
At DICE Lab, we take both a theoretical and an applied approach to multimodal affect detection. First, we seek to understand which modalities, signals, and data can accurately capture a user's affective state, particularly in niche demographics (e.g. elderly populations, or populations with cognitive impairments) or in specific contexts (e.g. riding as a passenger in an automated vehicle, or interacting with a social robot). Second, we seek to understand how these affective signals can drive real-time adaptation of such technologies (socially assistive robots, autonomous vehicles) in ways that best serve positive outcomes of the human-technology interaction, such as motivation to interact (with cognitive training or a robot companion) or trust in automated systems (such as an automated vehicle).
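The second strand above amounts to a closed loop: estimate an affective state from multimodal signals, then adapt the technology's behaviour accordingly. The following is a deliberately minimal, rule-based sketch, not the lab's actual pipeline; the thresholds, state labels, and the 1-10 difficulty scale are all hypothetical, and a real system would use a trained multimodal classifier in place of `estimate_affect`.

```python
def estimate_affect(heart_rate, skin_conductance, gaze_on_task):
    """Toy rule-based affect estimate from three signals.
    heart_rate in bpm, skin_conductance in microsiemens,
    gaze_on_task as the fraction of time spent looking at the task.
    Thresholds are arbitrary placeholders for illustration."""
    if heart_rate > 100 and skin_conductance > 8.0:
        return "stressed"
    if gaze_on_task < 0.3:
        return "disengaged"
    return "engaged"

def adapt_difficulty(level, affect):
    """Nudge a 1-10 task difficulty level toward the user's state:
    ease off when stressed, add challenge when disengaged."""
    if affect == "stressed":
        return max(1, level - 1)
    if affect == "disengaged":
        return min(10, level + 1)
    return level
```

In a running system, this loop would execute continuously: each new window of sensor data yields a fresh affect estimate, which in turn adjusts the robot's behaviour or the training task's parameters.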
Current Projects

Socially Affective Robots for Digitized Cognitive Interventions (SARP-DCI)
This project investigates how a social robot partner that adapts its behaviours and interactions to a human partner's affective and emotional states affects the long-term viability (acceptance, adherence, motivation, efficacy) of digitized cognitive training therapy. By combining interactions with a social robot and a cognitive training task, the project seeks to determine: (1) the optimal combination of multimodal signals for capturing a user's affective/emotional states, particularly when interacting with these tools; (2) the effects of real-time adaptation of these technologies (robot interactions, characteristics of training tasks) based on those affective states on various behavioural and performance measures; and (3) the viability of this approach in pre-clinical populations with Mild Cognitive Impairment.