
Human-Human Interaction

Our lab investigates the intricate, and often unpredictable, dynamics of human-to-human interaction, particularly when AI and digital technology are involved. A cornerstone of this work is our dual eye-tracking system, i+i (Thorsson et al., 2024), which captures and analyzes the gaze patterns of two interacting participants in real time. We are currently extending the system by coupling it with biometric measures, such as heart rate and skin conductance, to examine the physiological changes that occur during interaction. This multimodal approach will provide a richer understanding of joint attention, mutual gaze, turn-taking, and the emotional and physiological states that underpin social coordination. Our research in this area is ongoing, with the goal of uncovering new insights into human connection and developing innovative applications for education, clinical interventions, and beyond.

i+i.png

Investigating Human-Human Interaction with Digital Technology

At DICE Lab, we take both theoretical and applied approaches to studying multimodal affect detection and human-human interaction. First, we seek to understand which modalities, signals, and data can accurately capture a user’s affective and cognitive state, particularly in under-studied groups (e.g., children, older adults, and people with cognitive impairments). Second, we examine how human-human interaction and affective/cognitive signal detection apply in new human-technology contexts (e.g., socially assistive robots, autonomous vehicles) in ways that promote positive outcomes, such as motivation to interact (with cognitive training or a robot companion) or trust in automated systems (such as an autonomous vehicle).

We also lead the IEEE Developmental Psychology task force: 

https://www.developmentpsychologytaskforce.com/

Current Projects

i+i.png

i+i project: a collaboration between DICE Lab and the University of Warwick (2025+)

This project investigates dyadic interactions, focusing on mutual eye contact in relation to heightened levels of autistic traits. Analysis is enabled by a custom-built dual-camera, deep-learning-based eye-tracking system (developed by Max Thorsson; see image above), which uses video input with timestamp-based alignment. This technology offers high ecological validity for evaluating human-human interaction, as the eye tracker is non-wearable and monitor-free.
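To give a flavor of what timestamp-based alignment involves, the sketch below pairs samples from two independently clocked gaze streams by nearest timestamp. It is a minimal illustration only, not the i+i system's actual pipeline; the function name, data layout, and tolerance value are all hypothetical.

```python
# Minimal sketch: align two independently sampled gaze streams by timestamp.
# Data format (t, x, y) and the 20 ms tolerance are illustrative assumptions,
# not details of the i+i system.

def align_streams(a, b, tol=0.02):
    """Pair each sample in `a` (list of (t, x, y) tuples, time-sorted) with
    the nearest-in-time sample in `b`, keeping only pairs whose timestamps
    differ by at most `tol` seconds."""
    pairs = []
    j = 0
    for t, x, y in a:
        # Advance j while the next b sample is at least as close in time.
        while j + 1 < len(b) and abs(b[j + 1][0] - t) <= abs(b[j][0] - t):
            j += 1
        if abs(b[j][0] - t) <= tol:
            pairs.append(((t, x, y), b[j]))
    return pairs

# Two toy 50 Hz streams whose clocks are offset by 5 ms.
stream_a = [(i * 0.020, 0.5, 0.5) for i in range(5)]
stream_b = [(i * 0.020 + 0.005, 0.4, 0.6) for i in range(5)]
print(len(align_streams(stream_a, stream_b)))  # prints 5
```

Once the two streams are on a common timeline, frame-by-frame comparisons (such as detecting moments of mutual gaze) become straightforward.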

The work is partly sponsored and enabled by the EUTOPIA alliance (https://eutopia-university.eu/). 

hands.png

Dialogic reading at scale and up close (2026)

This project will explore how early language interventions can be made more effective and accessible through digital technology. Building on the effective method of Dialogic Reading (DR), in which parents engage children in dialogue while reading, the research will evaluate a newly developed mobile app designed to guide and support families whose children show signs of language delay. The app, currently under development, will provide parents with short instructional videos, book materials, and reminders to support everyday reading routines. In the first study, 300 families will participate in a large-scale trial examining the app’s impact on children’s language development, parent engagement, and well-being. A second, more detailed study will investigate how biobehavioral synchrony between parent and child during shared reading contributes to language learning. We hope these studies will help shape future early intervention programs and advance our understanding of the social and biobehavioral foundations of language development. For more information, see:

https://vasilikimylo.wixsite.com/dialogical-reading
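As a rough illustration of how biobehavioral synchrony can be quantified, one common approach is to correlate two physiological time series (such as parent and child heart rate) across a range of lags and take the strongest match. The sketch below is illustrative only and not the study's actual analysis; the function names, lag range, and toy values are assumptions.

```python
# Illustrative only: quantifying synchrony between two physiological time
# series (e.g., parent and child heart rate in bpm) as the best Pearson
# correlation over a small range of lags. All values are toy data.

def pearson(x, y):
    """Pearson correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x) ** 0.5
    vy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (vx * vy)

def lagged_sync(x, y, max_lag=3):
    """Best correlation between x and y over shifts of y by -max_lag..max_lag."""
    best = -1.0
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            xs, ys = x[lag:], y[:len(y) - lag]
        else:
            xs, ys = x[:len(x) + lag], y[-lag:]
        best = max(best, pearson(xs, ys))
    return best

parent = [70, 72, 75, 74, 73, 71, 70, 72]
child = [68, 70, 73, 72, 71, 69, 68, 70]  # tracks the parent closely
print(round(lagged_sync(parent, child), 2))  # prints 1.0
```

In practice, analyses of this kind are typically run on windowed segments of much longer recordings, but the core idea (lagged correlation between the two partners' signals) is the same.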

Featured Publications

A novel end-to-end dual-camera system for eye gaze synchrony assessment in face-to-face interaction

Thorsson, M., Galazka, M. A., Åsberg Johnels, J., & Hadjikhani, N. (2024). Attention, Perception, & Psychophysics.

Funding bodies

Funding has predominantly come from EUTOPIA and Promobilia Stiftelsen.

© 2025 DICE Lab, University of Gothenburg
