Audiovisual data differentiate schizophrenia, bipolar disorders
(HealthDay)—Audiovisual data can differentiate between schizophrenia spectrum disorders and bipolar disorder, according to a study published in the January issue of JMIR Mental Health.
Michael L. Birnbaum, M.D., from the Zucker Hillside Hospital in Glen Oaks, New York, and colleagues examined whether reliable inferences of psychiatric signs, symptoms, and diagnoses can be extracted from audiovisual patterns. Their analysis used audiovisual data from 89 participants: 41 individuals with schizophrenia spectrum disorders, 21 with bipolar disorder, and 27 healthy volunteers.
Machine learning models were developed based on acoustic and facial movement features extracted from participant interviews. Model performance was assessed using the area under the receiver operating characteristic curve (AUROC) in fivefold cross-validation.
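For readers curious about how such an evaluation is typically set up, the sketch below illustrates fivefold cross-validated AUROC on aggregated face and voice features. It is a minimal sketch, not the authors' pipeline: the function name, the logistic-regression classifier, the label coding, and the precomputed feature matrices are all assumptions.

```python
# Minimal sketch of fivefold cross-validated AUROC on aggregated features.
# Illustrative only; the study's actual models and features are not specified here.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler


def evaluate_auroc(face_features, voice_features, labels, seed=0):
    """Fivefold cross-validated AUROC for aggregated face + voice features.

    face_features, voice_features: (n_participants, n_features) arrays (hypothetical).
    labels: 1 = schizophrenia spectrum disorder, 0 = bipolar disorder (assumed coding).
    """
    # Aggregate the two modalities into one feature matrix per participant.
    X = np.hstack([face_features, voice_features])
    model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
    cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=seed)
    scores = cross_val_score(model, X, labels, cv=cv, scoring="roc_auc")
    return scores.mean()
```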
The researchers found that when face and voice features were aggregated, the model successfully differentiated between schizophrenia spectrum disorders and bipolar disorder (AUROC, 0.73). For men, the strongest signal came from facial action units, including the cheek-raising and chin-raising muscles (AUROCs, 0.64 and 0.74, respectively). For women, the strongest signal came from vocal features, including energy in the 1 to 4 kHz frequency band and spectral harmonicity (AUROCs, 0.80 and 0.78, respectively).
For both men and women, the lip corner-pulling muscle signal discriminated between diagnoses (AUROCs, 0.61 and 0.62, respectively). Certain psychiatric signs and symptoms were successfully inferred, including blunted affect, avolition, lack of vocal inflection, asociality, and worthlessness (AUROCs, 0.81, 0.72, 0.71, 0.63, and 0.61, respectively).
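One simple way to obtain a per-feature AUROC like those reported above is to use the feature value itself as a ranking score; the study's actual per-feature models may well differ, so the sketch below is only an assumed illustration, reusing the label coding from the previous example.

```python
# Illustrative per-feature AUROC: score a single feature (e.g., mean lip
# corner-pulling action-unit intensity) directly against the diagnosis labels.
from sklearn.metrics import roc_auc_score


def single_feature_auroc(feature_values, labels):
    """AUROC of one feature used directly as a ranking score.

    feature_values: (n_participants,) array, e.g., mean action-unit intensity.
    labels: binary diagnosis labels (assumed coding as in the previous sketch).
    """
    auc = roc_auc_score(labels, feature_values)
    # A feature can be informative in either direction; report the larger side.
    return max(auc, 1.0 - auc)
```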