31 July 2025
Whether in a conversation with friends, a team discussion, or a theatrical performance – we are constantly tuning into one another. This synchrony manifests not only in our behavior, but also deep within our bodies: for instance, when our heart rates subconsciously begin to align. These “heart echoes” are potentially measurable – and they can provide insights into relationship quality, stress levels, and how we collaborate. But how can we reliably measure this subtle interplay? A promising new method developed by New Zealand researcher Roydon Goldsack challenges the status quo – and introduces an innovative, AI-driven approach that goes beyond traditional correlation techniques.
The heart does not beat in isolation. In social interactions, our bodies and emotions adjust to one another in surprising ways – through synchronized breathing, shared facial expressions, or even aligned heart rhythms. In psychology, such physiological synchrony is seen as an indicator of closeness, empathy, or tension between individuals. It occurs in friendships, in romantic relationships, and even between strangers – and it influences both emotional and practical outcomes, such as cooperation or relationship satisfaction.
But how exactly can this synchrony be measured objectively? Conventional methods, like time-lagged cross-correlation, offer only a limited view. They capture only linear relationships and fail to reflect the complexity of dynamic, nonlinear human interactions.
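To make the conventional approach concrete, here is a minimal sketch of time-lagged cross-correlation on two synthetic "heart rate" series (the data and the delay are invented for illustration; this is a generic textbook technique, not any specific tool from the research described here):

```python
import numpy as np

def lagged_cross_correlation(x, y, max_lag):
    """Pearson correlation between x and y at each lag in [-max_lag, max_lag].

    A positive lag pairs x[t] with y[t + lag], i.e. x leading y.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    corrs = {}
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            a, b = x[: len(x) - lag], y[lag:]
        else:
            a, b = x[-lag:], y[: len(y) + lag]
        corrs[lag] = float(np.corrcoef(a, b)[0, 1])
    return corrs

# Toy example: y follows x with a 3-sample delay plus measurement noise.
rng = np.random.default_rng(0)
x = 70 + np.cumsum(rng.normal(0, 0.5, 500))  # slowly drifting signal
y = np.empty_like(x)
y[3:] = x[:-3]
y[:3] = x[0]
y += rng.normal(0, 0.2, 500)

corrs = lagged_cross_correlation(x, y, max_lag=10)
best_lag = max(corrs, key=corrs.get)
print(best_lag)  # → 3, the lag at which linear coupling is strongest
```

This works well for delayed *linear* coupling, but if y depended on x in a nonlinear way, the correlation at every lag could be near zero even though the two signals are tightly linked.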
This is where Roydon Goldsack’s research comes in. A PhD candidate at the School of Engineering and Computer Science at Te Herenga Waka – Victoria University of Wellington, Goldsack draws on psychology, statistics, and computer science to develop a new measurement technique. Their approach is based on the concept of mutual information from information theory, which quantifies the amount of shared information between two physiological signals – for example, between the heart rates of two interacting individuals – regardless of whether their relationship is linear or nonlinear.
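The following sketch shows why mutual information is attractive here. It uses a deliberately simple plug-in (histogram) estimator – not Goldsack's method – on a purely nonlinear, invented relationship where Pearson correlation sees almost nothing but mutual information does not:

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Plug-in (histogram) estimate of I(X;Y) in bits.

    Bin both signals, form the joint distribution, and compute
    I(X;Y) = sum_{x,y} p(x,y) * log2( p(x,y) / (p(x) p(y)) ).
    """
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of X, shape (bins, 1)
    py = pxy.sum(axis=0, keepdims=True)   # marginal of Y, shape (1, bins)
    nz = pxy > 0                          # avoid log(0) on empty cells
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

rng = np.random.default_rng(1)
x = rng.normal(0, 1, 5000)
y = x ** 2 + rng.normal(0, 0.1, 5000)  # nonlinear coupling: correlation ≈ 0

r = np.corrcoef(x, y)[0, 1]
mi = mutual_information(x, y)
# r is close to zero; mi is clearly positive, exposing the dependence.
```

Simple histogram estimators like this one are biased and sensitive to the bin count, which is part of why mutual information estimation for real physiological time series is an open research problem rather than a solved recipe.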
Goldsack’s work attempts to break the problem of mutual information estimation for time series into smaller, more tractable pieces. The idea is that by modelling simpler aspects of the signal, one can build a better overall picture of the shared information between sequences. The result is a sophisticated bridge between machine learning and emotion-focused interaction research, with wide potential applications in healthcare, education, team diagnostics, and even human-robot interaction.
Goldsack’s work is part of the interdisciplinary Wiri Project, which explores new models of embodied emotion at the crossroads of design, theatre, psychology, and computer science. They are supported by the Deep Learning Research Group (led by Prof. Dr. W. Bastiaan Kleijn) and the Affective and Criminal Neuroscience Lab (led by Prof. Dr. Hedwig Eisenbarth).
In July, Roydon Goldsack is visiting the Causality Group at the RC Trust and TU Dortmund University at the invitation of Prof. Dr. Alexander Marx, Professor at the Faculty of Statistics and Chair of Causality. During their stay, they are sharing their work with researchers and students, presenting current findings, and engaging in interdisciplinary exchange within and beyond the group.
Their visit illustrates the value of research that connects computer science, psychology, and society. It not only introduces new models – it raises new questions: What does it mean to be “in sync” in the digital age? And how can we better understand our everyday interactions?