From Signal to Noise: Discernment and Debility in the Making of American Vocal Biomarker AI
Beth Semel
How might AI reshape the sensory norms and moral economies of discernment in American mental healthcare?
Since 2015, the US has witnessed a rise in mental healthcare-related vocal biomarker AI (VBAI), machine listening technologies trained to identify supposedly objective and/or biological indicators of mental distress conveyed in the sounds of the voice. The technologists, entrepreneurs, and clinical researchers invested in this subfield suggest that AI can unlock more accurate techniques for screening patients, superseding the need to rely on a clinical worker’s interpretive acumen alone to parse the ill from the well.
This talk draws from ethnographic fieldwork with several VBAI makers to show that bringing this promise of enhanced clinical precision to fruition also involves sensing certain features of mental distress out of technoscientific and bureaucratic legitimacy, generating modes of algorithmic legibility and illegibility side by side throughout the technology development pipeline. To illustrate these dynamics in practice, I focus on the “data workers” (Miceli and Posada 2023) tasked with processing the datasets of “mentally distressed voices” essential to the creation of VBAI, a task that requires them to make value-laden determinations about which aspects of mental distress, including their own, ought to count as computationally tractable signal versus unquantifiable, discardable noise.
This lecture will be held both online & in person. You are welcome to join us either in South Hall or via Zoom.
Speaker
Beth Semel
Beth Semel is an assistant professor of anthropology at Princeton University. Situated at the intersection of linguistic and medical anthropology and STS (science and technology studies), she studies the sensory politics and technopolitics of American mental healthcare in an era in which AI is called upon to manage increasingly broad arenas of human life.
Her current project ethnographically traces the interpretive and auditory practices and labor underpinning machine listening technologies designed to evaluate and track people experiencing mental distress. Research from this project has been published in Science, Technology & Human Values and The AI Now Institute’s “A New Lexicon of AI” series, and will appear in a forthcoming issue of Current Anthropology and in the edited volume Technocreep and the Politics of Things Not Seen (Duke University Press, May 2025). She received her Ph.D. in History, Anthropology, and Science, Technology, and Society from MIT.