
A University of Maryland expert in computational linguistics has secured seed funding for an interdisciplinary project that explores real-time social interactions—and the brain processes that drive them.
Philip Resnik, a professor of linguistics with an appointment in the University of Maryland Institute for Advanced Computer Studies, is co-principal investigator of a $70,000 grant awarded by the UMD Brain and Behavior Institute (BBI).
One of only three projects selected for BBI seed funding in 2024, the project aims to create an integrated computational framework that will reveal how the brain learns, updates and communicates in real time with a social partner, Resnik says.
“The idea of being ‘in sync’ with others is vital in contexts like education, ideological discussions, or workplace interactions involving neurodiverse individuals,” he explains. “This project connects brain activity with the ability to predict and align with another person’s mental state, paving the way for new insights into human connection and understanding.”
The research team, which includes Elizabeth Redcay, a professor of psychology and the lead PI on the project; Caroline Charpentier, an assistant professor of psychology; and Rachel Romeo, an assistant professor of human development and quantitative methodology, will develop the computational framework by analyzing features of multimodal interactions recorded from 30 pairs of adults as they participate in communication tasks.
These features include speech, facial expressions and body gestures captured by cameras, as well as brain activity recorded with functional magnetic resonance imaging (fMRI) and functional near-infrared spectroscopy (fNIRS) devices. The team will then integrate and analyze that data using advanced Bayesian models, which update beliefs based on new evidence, as well as game-theoretic models, which explore strategic decision-making.
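By way of illustration only, and not drawn from the project's actual models, the short Python sketch below shows the basic mechanics of the Bayesian updating described above: a listener's belief about a partner's intended meaning shifts as each new communicative cue arrives. The candidate meanings, cues and likelihood values are invented for the example.

```python
# Minimal sketch of Bayesian belief updating (illustrative only): an observer
# tracks a probability distribution over a partner's intended meaning and
# revises it with each new multimodal cue. All states, cues, and numbers
# below are hypothetical, chosen purely to demonstrate the update rule.

import numpy as np

MEANINGS = ["agree", "disagree", "uncertain"]   # hypothetical hidden states

# P(cue | meaning): rows are meanings, columns are cues (nod, frown, pause)
LIKELIHOOD = np.array([
    [0.7, 0.1, 0.2],   # agree
    [0.1, 0.7, 0.2],   # disagree
    [0.3, 0.3, 0.4],   # uncertain
])
CUES = {"nod": 0, "frown": 1, "pause": 2}

def update_belief(prior, cue):
    """Bayes' rule: posterior is proportional to likelihood times prior."""
    posterior = LIKELIHOOD[:, CUES[cue]] * prior
    return posterior / posterior.sum()

belief = np.full(len(MEANINGS), 1 / len(MEANINGS))   # start from a uniform prior
for cue in ["nod", "pause", "nod"]:                  # toy stream of observed cues
    belief = update_belief(belief, cue)
    print(cue, dict(zip(MEANINGS, belief.round(3))))
```

In this toy version, repeated "nod" cues concentrate the belief on "agree"; the project's framework would instead estimate such beliefs from the recorded speech, gesture and brain-imaging data.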
“Social interaction is essential to human development, health and well-being, yet traditional methods have struggled to capture its complexity,” says Redcay. “This project uses advanced computational models and neurocognitive tools to uncover how the brain learns, adapts, and communicates in real time.”
Ultimately, Resnik believes, the BBI researchers will address a critical gap in computational neuroscience—understanding the complexities of real-time conversations. These interactions involve interpreting emotions, intentions and social dynamics, he adds, areas that have been largely unexplored compared to how the brain processes language during reading or listening.
Resnik, who is director of the Computational Linguistics and Information Processing (CLIP) Laboratory, says that the BBI team’s preliminary results will support larger grant proposals to the National Institute of Mental Health and the National Science Foundation in late 2025 and early 2026.
—Story by Melissa Brachfeld, UMIACS communications group