Human Vehicle Interaction
Our group works on biometric/biobehavioural evaluation and modelling of driver behaviour in vehicles, with a specific focus on self-driving cars (SAE levels 3-5) in simulation. We seek to develop insights into patterns of biometric activity and behaviour that correlate with particular affective states. For example, we are interested in how facial expression, galvanic skin response, eye movements, driver pose and distance from the wheel, and braking and steering behaviour correlate with feelings of stress, comfort/discomfort, or intoxication when driving, which might provide cues to driver support systems.
We work extensively with driving simulators and adhere to ISO and ASAM standards when specifying simulated scenarios.
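As a minimal illustration of the kind of analysis described above, the sketch below correlates one biometric signal with self-reported stress ratings. All data here are synthetic and the variable names (GSR means, stress ratings) are hypothetical, not recordings from our simulator studies.

```python
# Minimal sketch: correlating a biometric signal with self-reported stress.
# The data are synthetic and only illustrate the analysis pattern.
import numpy as np

rng = np.random.default_rng(0)

# Simulated per-trial features: self-reported stress ratings (1-10) and
# mean galvanic skin response (GSR) for 50 simulator trials.
stress = rng.integers(1, 11, size=50).astype(float)
gsr = 0.5 * stress + rng.normal(0, 1.0, size=50)  # GSR loosely tracks stress here

# Pearson correlation between the biometric signal and the affective rating.
r = np.corrcoef(gsr, stress)[0, 1]
print(f"Pearson r between GSR and stress: {r:.2f}")
```

In practice, per-trial features would be extracted from time-series sensor data and correlated (or modelled) against affective labels obtained from questionnaires or annotation.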


How we investigate Human Vehicle Interaction
At DICE Lab we investigate scenarios in car simulator setups to assess how humans (drivers, passengers) respond affectively and behaviourally in potentially dangerous driving situations, with the aim of using such information to automate corrective responses in self-driving cars. The current target is cars at around SAE levels 3-5. We (with Robert Lowe at RISE – Research Institutes of Sweden, Human-Centred AI Unit) are also researching how to use Explainable AI (XAI) within the Machine Learning development life cycle for autonomous vehicles.
Current projects include:
- I-AIMS2 (see below), funded by VINNOVA (2025-2027)
- DIME (see below), funded by VINNOVA (2025-2027)
- SAFEXPLAIN (see below), research carried out at RISE (2022-2025 Nov)
- ROAD-MASTER (https://projektdatabas.vti.se/bib/5329), Robert Lowe (PI), research carried out at RISE, funded by Trafikverket (2025-2027 Dec)
- ExAffect (Explainable AI Cognitive-Affective Driver Interface), funded by Transport Area of Advance, Chalmers (Jan 2026-Dec 2026)
Featured Projects

I-AIMS1&2 (Impairment-Aware Intelligent Mobility System): 2023 (Nov)-2024 (Oct) + 2025 (May)-2027 (April)
I-AIMS investigates how to enhance the safety and well-being of drivers through i) monitoring, and ii) regulating impaired cognitive and emotional states affected by driving scenarios and/or driver state. The University of Gothenburg (including personnel at DICE Lab) is collaborating with Smart Eye AB in using Smart Eye's Driver Monitoring System (DMS), focusing on eye-tracking cameras for sensing, e.g., stress levels and cognitive load. We are investigating how a large language model (LLM) can regulate affective state following feedback from the DMS.
In the longer term, we plan to develop the LLM interface to the DMS by allowing algorithmic supervision of the LLM output that is task- and context-appropriate, and to incorporate embodied feedback in the form of a small (robot) head unit. This unit will display emotional expressions and have two degrees of freedom for interacting with drivers at opportune (safe) moments. For more information, see: https://www.gu.se/en/research/i-aims2-impairment-aware-intelligent-mobility-systems-2
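The DMS-to-LLM regulation loop described above can be sketched as a simple gating policy. The function, thresholds, and intervention labels below are illustrative assumptions, not the project's actual implementation; in the real system the chosen intervention would drive a supervised LLM prompt.

```python
# Hedged sketch of a DMS -> LLM intervention gate.
# Thresholds and labels are hypothetical illustrations.

def choose_intervention(stress_level: float, cognitive_load: float) -> str:
    """Map DMS estimates (assumed 0-1 scales) to an intervention category
    that would then select/constrain a prompt for the LLM interface."""
    if stress_level > 0.7 and cognitive_load > 0.7:
        # Driver is overloaded: stay silent until a safe moment.
        return "defer"
    if stress_level > 0.7:
        return "calming_dialogue"
    if cognitive_load > 0.7:
        return "reduce_information"
    return "no_action"

# Example DMS readings (hypothetical):
print(choose_intervention(0.8, 0.3))  # -> calming_dialogue
print(choose_intervention(0.8, 0.9))  # -> defer
```

Gating the LLM behind such a supervisory policy is one way to keep interactions task- and context-appropriate, as the paragraph above describes.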

ROAD-MASTER (Real-world On-road Assessment of Drivers – Methods And Systems for Technology Evaluation and Review): 2025 (June)-2026 (Dec.)
The aim of the Real-world On-road Assessment of Drivers – Methods And Systems for Technology Evaluation and Review (ROAD-MASTER) project is to establish a method for evaluating the quality, usability, and appropriate application of systems designed to automatically evaluate driver competency. A clear evaluation method is critical both for procurement and for determining the proper application of such systems. We will demonstrate and test the method in practice on a use-case AI tracking and analytics system.
The project is a collaboration between RISE (coordinator) and QTPIE (see https://www.mirmi.tum.de/en/mirmi/news/article/times-best-inventions-2025-start-up-qtpie-featured-with-ai-driving-test/). Driver simulation and real-world tests will help establish the operational boundaries of QTPIE's AI tool.

DIME (Driver Impairment Multi-modal Evaluation): 2025 (Aug)-2027 (Aug)
The Driver Impairment Multimodal Evaluation (DIME) project addresses the challenge of identifying intoxication, particularly when drivers actively attempt to conceal it, by creating an advanced multi-modal system designed for commercial deployment by 2028. To this end, Smart Eye will leverage deep learning techniques to extend their Driver Monitoring System to combine: i) in-cabin driver behaviour; ii) intoxication-related speech patterns; and iii) biobehavioural measures.
For more information see: https://www.vinnova.se/en/p/driver-impairment-multimodal-evaluation-dime/.
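One common way to combine several modalities like those listed above is late fusion: each modality produces its own score, and the scores are merged. The sketch below shows this pattern with made-up weights and scores; the actual DIME system uses deep learning and is not reproduced here.

```python
# Illustrative late-fusion sketch for combining three modality scores.
# Weights and inputs are hypothetical; the real system is deep-learning based.

def fuse(behaviour: float, speech: float, bio: float,
         weights=(0.4, 0.3, 0.3)) -> float:
    """Combine per-modality intoxication probabilities (0-1) into one score."""
    scores = (behaviour, speech, bio)
    return sum(w * s for w, s in zip(weights, scores))

risk = fuse(behaviour=0.9, speech=0.6, bio=0.7)
print(f"fused intoxication score: {risk:.2f}")  # 0.4*0.9 + 0.3*0.6 + 0.3*0.7 = 0.75
```

A key motivation for multi-modal fusion in this setting is robustness: a driver concealing one cue (e.g. speech) may still be detected through the remaining modalities.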

SAFEXPLAIN (Safe and Explainable Critical Embedded Systems Based on AI): 2022-2025
The aim of the project is to address the functional safety requirements of Critical Autonomous AI-Based Systems (CAIS) by making the black-box algorithms (typically Deep Learning based) used, or to be used, within software products for autonomous vehicles more transparent (explainable, traceable). The focus of RISE AB (within which Robert Lowe of DICE Lab works) is to identify and deploy Explainable AI (XAI) algorithms at different stages of the Machine Learning lifecycle, rendering the black-box algorithms more transparent for the various users involved (e.g. developers, operational users). Read more about the project here
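To give a concrete flavour of model-agnostic XAI of the kind that can be applied post hoc in the ML lifecycle, the sketch below computes permutation importance: how much a model's error grows when one input feature is shuffled. The model and data are synthetic; this is not the SAFEXPLAIN toolchain.

```python
# Minimal model-agnostic XAI illustration: permutation feature importance.
# Model and data are synthetic stand-ins, not project artefacts.
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))              # three input features
y = 2.0 * X[:, 0] + 0.1 * X[:, 2]          # feature 0 matters most, feature 1 not at all

def model(X):
    """A stand-in 'trained' model (here it matches the data exactly)."""
    return 2.0 * X[:, 0] + 0.1 * X[:, 2]

def mse(y_true, y_pred):
    return float(np.mean((y_true - y_pred) ** 2))

base = mse(y, model(X))
importance = []
for j in range(X.shape[1]):
    Xp = X.copy()
    Xp[:, j] = rng.permutation(Xp[:, j])   # destroy feature j's information
    importance.append(mse(y, model(Xp)) - base)  # error increase = importance

print([round(v, 3) for v in importance])
```

Such per-feature explanations are one example of the transparency that different users (developers, operational users) may need at different lifecycle stages.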
Featured Publications
Evaluating Biometric and Behavioral Markers of Intoxication in Drivers: A Pilot Study
Ravandi, B.S., Fransson, M., Fabricius, V., Vandeleene, N., Francois, C., Lowe, R. CHItaly '25: Proceedings of the 16th Biannual Conference of the Italian SIGCHI Chapter, 2025