
Human-Human Interaction

Our lab investigates the intricate, and often unpredictable, dynamics of human-to-human interaction. A cornerstone of this work is our dual eye-tracking system, i+i (Thorsson et al., 2024), which captures and analyzes the gaze patterns of two interacting participants in real time. We are currently extending the system and coupling it with biometric measures, such as heart rate and skin conductance, to examine the physiological changes that occur during interactions. This multimodal approach promises a richer understanding of joint attention, mutual gaze, turn-taking, and the emotional and physiological states that underpin social coordination. Our research in this area is ongoing, with the goal of uncovering new insights into human connection and developing applications for education, clinical interventions, and beyond.
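To make this kind of analysis concrete, here is a minimal, hypothetical sketch of a windowed gaze-synchrony measure for two participants. It assumes each gaze stream is a NumPy array of on-screen coordinates sampled at a common rate; the function name and parameters are illustrative only and are not part of the i+i system's actual interface.

```python
# Illustrative sketch only: windowed gaze-synchrony estimate for two
# participants. Each gaze stream is assumed to be a (T, 2) array of
# on-screen (x, y) coordinates sampled at a common rate. All names
# here are hypothetical, not the i+i system's actual API.
import numpy as np

def gaze_synchrony(gaze_a: np.ndarray, gaze_b: np.ndarray,
                   win: int = 120, step: int = 30) -> np.ndarray:
    """Pearson correlation of gaze trajectories in sliding windows."""
    scores = []
    for start in range(0, len(gaze_a) - win + 1, step):
        a = gaze_a[start:start + win].ravel()
        b = gaze_b[start:start + win].ravel()
        # Correlate the flattened (x, y) trajectories of both participants;
        # values near 1 indicate tightly coupled gaze within this window.
        scores.append(np.corrcoef(a, b)[0, 1])
    return np.array(scores)

# Example with synthetic data: participant B loosely follows participant A.
rng = np.random.default_rng(0)
a = rng.normal(size=(600, 2)).cumsum(axis=0)
b = a + rng.normal(scale=5.0, size=a.shape)
print(gaze_synchrony(a, b).round(2))
```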


How we work with Affective Computing

At DICE Lab, we take both a theoretical and an applied approach to studying multimodal affective detection. First, we seek to understand which modalities, signals, and data can accurately capture a user’s affective state, particularly in under-studied demographics (e.g., elderly populations or populations with cognitive impairments) or in specific contexts (e.g., riding as a passenger in an automated vehicle, or interacting with a social robot). Second, we examine how these affective signals can drive the real-time adaptation of such technologies (socially assistive robots, autonomous vehicles) in ways that best serve positive outcomes of human-technology interaction, such as motivation to engage (with cognitive training or a robot companion) or trust in automated systems (such as an automated vehicle).
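As an illustration of what combining modalities might look like in code, the following is a hedged sketch of simple confidence-weighted late fusion. All names, weights, and values are assumptions made for the example, not the lab's actual pipeline.

```python
# Hedged sketch of late fusion across modalities, not the lab's actual
# pipeline: each modality yields a normalized arousal estimate in [0, 1],
# and a confidence weight reflects how reliable that signal is for the
# current user group or context. All names here are illustrative.
from dataclasses import dataclass

@dataclass
class ModalityReading:
    name: str        # e.g. "heart_rate", "skin_conductance", "gaze"
    arousal: float   # normalized estimate in [0, 1]
    weight: float    # per-context reliability weight

def fuse_affect(readings: list[ModalityReading]) -> float:
    """Confidence-weighted average of per-modality arousal estimates."""
    total = sum(r.weight for r in readings)
    if total == 0:
        raise ValueError("no usable modality readings")
    return sum(r.arousal * r.weight for r in readings) / total

# E.g., gaze might be down-weighted for users who rarely fixate the screen.
estimate = fuse_affect([
    ModalityReading("heart_rate", 0.7, 1.0),
    ModalityReading("skin_conductance", 0.6, 0.8),
    ModalityReading("gaze", 0.3, 0.4),
])
print(f"fused arousal: {estimate:.2f}")
```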

Current Projects


Socially Affective Robots for Digitized Cognitive Interventions (SARP-DCI)

This project investigates how a social robot partner that adapts its behaviours and interactions to a human partner’s affective and emotional states affects the long-term viability (acceptance, adherence, motivation, efficacy) of digitised cognitive training therapy. Combining interactions with a social robot with a cognitive training task, the project asks: (1) which combination of multimodal signals best captures a user’s affective and emotional states, particularly when interacting with these tools; (2) how real-time adaptation of these technologies (robot interactions, characteristics of training tasks) based on those states affects behavioural and performance measures; and (3) whether this approach is viable in pre-clinical populations with Mild Cognitive Impairment.
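To illustrate the general idea of affect-driven adaptation, here is a minimal sketch of a threshold-based adaptation loop. The thresholds, actions, and function names are hypothetical placeholders rather than the project's actual policy.

```python
# Minimal sketch of the kind of real-time adaptation loop described above,
# assuming a fused affect estimate is already available. Thresholds and
# actions are hypothetical placeholders, not the project's actual policy.
def adapt_session(arousal: float, task_difficulty: int) -> tuple[int, str]:
    """Map the current affect estimate to a task and robot adjustment."""
    if arousal < 0.3:
        # Signs of disengagement: ease the task and have the robot encourage.
        return max(task_difficulty - 1, 1), "offer_encouragement"
    if arousal > 0.8:
        # Signs of overload: simplify and let the robot de-escalate.
        return max(task_difficulty - 1, 1), "suggest_break"
    # Engaged and comfortable: gradually raise the challenge.
    return task_difficulty + 1, "acknowledge_progress"

difficulty, robot_action = adapt_session(arousal=0.55, task_difficulty=3)
print(difficulty, robot_action)
```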

Publications

A novel end-to-end dual-camera system for eye gaze synchrony assessment in face-to-face interaction

Thorsson, M., Galazka, M. A., Åsberg Johnels, J., & Hadjikhani, N. (2024). Attention, Perception, & Psychophysics.

Funding bodies

This project has received funding from the European Union’s Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement No 945380.

© 2025 DICE Lab, University of Gothenburg
