Toward Automated Detection of Biased Social Signals from the Content of Clinical Conversations

AMIA 2024

Abstract

Implicit bias can impede patient-provider interactions and lead to inequities in care. Raising awareness is key to reducing such bias, but its manifestations in the social dynamics of patient-provider communication are difficult to detect. In this study, we used automated speech recognition (ASR) and natural language processing (NLP) to identify social signals in patient-provider interactions. We built an automated pipeline to predict social signals from audio recordings of 782 primary care visits; it achieved 90.1% average accuracy across codes and exhibited fairness in its predictions for white and non-white patients. Applying this pipeline, we identified statistically significant differences in provider communication behavior toward white versus non-white patients. In particular, providers expressed more patient-centered behaviors toward white patients, including more warmth, engagement, and attentiveness. Our study underscores the potential of automated tools for identifying subtle communication signals that may be linked to bias and impact healthcare quality and equity.
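
To make the kind of pipeline described above concrete, the sketch below transcribes visit audio with an off-the-shelf ASR model and trains a simple text classifier to predict one binary social-signal code, then compares accuracy across patient groups. This is an illustrative assumption-laden sketch, not the authors' actual system: the `whisper` model, the `warmth` label, the `annotations.csv` file, the `patient_group` column, and the bag-of-words classifier are all placeholders chosen for illustration.

```python
# Illustrative sketch only: transcribe visit audio with ASR, then predict a
# social-signal code from the transcript text. Model choices, label names,
# and file paths are assumptions, not the paper's actual pipeline.
import glob

import pandas as pd
import whisper  # openai-whisper, an off-the-shelf ASR model
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

# 1. Transcribe each recorded visit (hypothetical directory layout).
asr = whisper.load_model("base")
audio_files = sorted(glob.glob("visits/*.wav"))
transcripts = [asr.transcribe(path)["text"] for path in audio_files]

# 2. Hypothetical human annotations keyed by audio file path: one binary
#    social-signal code per visit (e.g., 1 = provider expressed warmth)
#    plus the patient's group.
codes = pd.read_csv("annotations.csv").set_index("audio_file")
labels = codes.loc[audio_files, "warmth"].tolist()
groups = codes.loc[audio_files, "patient_group"].tolist()  # e.g., "white" / "non-white"

# 3. Train a simple bag-of-words classifier on the transcripts.
X_train, X_test, y_train, y_test, g_train, g_test = train_test_split(
    transcripts, labels, groups, test_size=0.2, random_state=0
)
clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                    LogisticRegression(max_iter=1000))
clf.fit(X_train, y_train)

# 4. Report overall accuracy and a simple per-group fairness check.
preds = clf.predict(X_test)
print("overall accuracy:", accuracy_score(y_test, preds))
for group in sorted(set(g_test)):
    idx = [i for i, g in enumerate(g_test) if g == group]
    print(group, "accuracy:",
          accuracy_score([y_test[i] for i in idx], preds[idx]))
```

In practice, each social-signal code would get its own classifier (or a multi-label model), and the per-group comparison above stands in for the fairness evaluation and statistical testing reported in the paper.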

Publication
Proceedings of the AMIA Annual Symposium 2024
Manas Bedmutha
Ph.D. Student

Manas is currently developing social signal processing tools and devices to better understand healthcare interactions.

Nadir Weibel
Professor of Computer Science and Engineering
