Entropy and Sentiment: the Anna Karenina Principle in Patient Experience Data


The so-called “Anna Karenina Principle” – named for the first line of Tolstoy’s novel (“All happy families are alike; each unhappy family is unhappy in its own way.”) – states that failure can occur in any number of ways, while success tends to be relatively uniform.

This principle has been observed in fields as disparate as evolutionary biology and market research. In this talk, the principle is applied to an NLP task via the intersection of topic and sentiment in patient experience comment data, both to illustrate the trend and to propose an approach for measuring it. The talk also discusses consequences of the observation: class imbalances that affect sentiment accuracy, and the challenges those imbalances pose for effective healthcare interventions.
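One way to quantify the trend the abstract describes is to compare the Shannon entropy of the topic distribution within each sentiment class: if the Anna Karenina Principle holds, negative comments should spread across many topics (higher entropy) while positive comments cluster on a few (lower entropy). The sketch below is illustrative only and is not the speaker's method; the topic labels are hypothetical.

```python
import math
from collections import Counter

def topic_entropy(topic_labels):
    """Shannon entropy (in bits) of a list of per-comment topic labels.

    Higher entropy means comments are spread over more topics more evenly,
    which is what the Anna Karenina Principle predicts for negative feedback.
    """
    counts = Counter(topic_labels)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Hypothetical topic labels for comments in each sentiment class.
negative_topics = ["wait time", "billing", "communication", "parking", "noise"]
positive_topics = ["staff", "staff", "staff", "care", "care"]

print(topic_entropy(negative_topics))  # many distinct topics -> higher entropy
print(topic_entropy(positive_topics))  # few dominant topics  -> lower entropy
```

Comparing the two values by sentiment class gives a single number per class, which makes the topic-diversity gap between unhappy and happy patients directly measurable.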

About the speaker
Zach Childers

Manager, NLP and Machine Learning at Press Ganey

Zach Childers has worked for over five years in Natural Language Processing in health care, first at NarrativeDx and then, following its acquisition, at Press Ganey.

Prior to that he was trained as a linguist, working primarily on formal syntax, lexical semantics, and sentiment in English.

He is the lead inventor on patent 11,003,849, covering technologies for valid dependency parsing and sentiment domain identification. Recent publications include “Using AI to Understand the Patient Voice During the Covid-19 Pandemic” in the New England Journal of Medicine – Catalyst and “Understanding Unhappy Patients Makes Hospitals Better for Everyone” in the Harvard Business Review.

NLP-Summit

When

Sessions: April 5th – 6th 2022
Trainings: April 12th – 15th 2022

Contact

nlpsummit@johnsnowlabs.com

Presented by

John Snow Labs