The basic idea is this: Soldiers will interact with a computer simulation of a therapist, which has been programmed to encourage the interaction (with well-timed "uh-huhs", etc.). While the soldier is speaking to the sim-therapist, the computer records their body movements, eye gaze, vocal tone, etc. Then, the sample of the soldier's "body language" is compared to a database of body language prototypes for various patient groups (e.g., combat traumatized, depressed, suicidal, psychotic) and a normal prototype. Soldiers whose body language is similar to one of the patient prototypes are "red-flagged" for follow-up with a human clinician.
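The screening step described above amounts to a nearest-prototype comparison. Here is a minimal sketch of that idea; the feature values, group profiles, and distance measure are all invented for illustration and are not details of the actual system.

```python
import math

# Hypothetical average "body language" profiles per group (values invented):
# e.g., (smile rate, downward-gaze rate, speech-pause rate).
PROTOTYPES = {
    "normal":             [0.60, 0.20, 0.10],
    "combat_traumatized": [0.30, 0.55, 0.40],
    "depressed":          [0.25, 0.50, 0.65],
    "suicidal":           [0.20, 0.60, 0.70],
}

def euclidean(a, b):
    """Plain Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def screen(sample):
    """Return the closest prototype group and red-flag anything non-normal."""
    nearest = min(PROTOTYPES, key=lambda g: euclidean(sample, PROTOTYPES[g]))
    return nearest, nearest != "normal"

group, flagged = screen([0.28, 0.52, 0.60])
print(group, flagged)  # → depressed True: flagged for clinician follow-up
```

A real system would use far richer features (the article mentions about 60) and statistically calibrated prototypes, but the flag-and-refer logic is the same.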
It is a very interesting idea, but only as a screening device (before the soldier has seen a clinician), or as a backstop for incompetent mental health clinicians. The suggestion that this technique will yield an "objective measure" (like a blood sample) to inform diagnosis is poppycock. It could possibly distinguish distressed from nondistressed people, but no better than those people could do themselves. What would be great is if it could discern suicidal intent in non-disclosing people. If it could, it could be used prior to discharge from psychiatric hospitals. But I won't wait up nights for those findings to come in.
Under the wide screen where Ellie's image sits, there are three devices. A video camera tracks facial expressions of the person sitting opposite. A movement sensor — Microsoft Kinect — tracks the person's gestures, fidgeting and other movements. A microphone records every inflection and tone in his or her voice. The point, Rizzo explains, is to analyze in almost microscopic detail the way people talk and move — to read their body language.
"We can look at the position of the head, the eye gaze," Rizzo says. Does the head tilt? Does it lean forward? Is it static and fixed?" In fact, Ellie tracks and analyzes around 60 different features — various body and facial movements, and different aspects of the voice.
The theory of all this is that a detailed analysis of those movements and vocal features can give us new insights into people who are struggling with emotional issues. The body, face and voice express things that words sometimes obscure.
"You know, people are in a constant state of impression management," Rizzo says. "They've got their true self and the self that they want to project to the world. And we know that the body displays things that sometimes people try to keep contained."
So, as Ellie gets the person in front of her to ruminate about when they were happy and when they were sad, the machines below her screen take measurements, cataloging how much the person smiles and for how long, how often they touch their head.
Morency says the machines record 30 measurements per second, or "about 1,800 measurements per minute." Literally every wince, pause and verbal stumble is captured and later analyzed.
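Morency's arithmetic (30 measurements per second, about 1,800 per minute) can be sanity-checked with a toy capture loop; the frame contents here are a placeholder assumption, since the real system records dozens of tracked features per snapshot.

```python
# Toy illustration of the sampling rate Morency describes: 30 snapshots
# per second, accumulated over a one-minute session.
SAMPLES_PER_SECOND = 30
SESSION_SECONDS = 60

frames = []
for second in range(SESSION_SECONDS):
    for tick in range(SAMPLES_PER_SECOND):
        # Each frame would hold the ~60 tracked features;
        # here it is just a timestamp in seconds.
        frames.append(second + tick / SAMPLES_PER_SECOND)

print(len(frames))  # → 1800 measurements per minute, as quoted
```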
Ellie was originally commissioned by the U.S. Department of Defense. After the deployments in Iraq and Afghanistan, the military was seeing a lot of suicides and wanted to find a way to help military therapists stop them. Soldiers don't always like to confess that they're having problems, but maybe their bodies would say what their words wouldn't.
This is why Ellie is being programmed to produce a report after each of her sessions — it's a kind of visual representation of the 60 different movements she tracks.
If the person's physical behaviors are similar to someone who's depressed, then the person will be flagged.
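That flagging step could be as simple as a distance threshold against a stored profile. The sketch below assumes a hand-built "depressed" profile and an arbitrary cutoff; both are invented here, and a real system would calibrate them against clinical data.

```python
import math

# Invented average profile for depressed patients:
# (smile rate, downward-gaze rate, speech-pause rate).
DEPRESSED_PROFILE = (0.10, 0.70, 0.55)
FLAG_THRESHOLD = 0.15  # hypothetical cutoff, not a calibrated value

def should_flag(measurements):
    """Flag the session if it falls within the threshold of the profile."""
    return math.dist(measurements, DEPRESSED_PROFILE) < FLAG_THRESHOLD

print(should_flag((0.12, 0.68, 0.50)))  # → True: close to the profile
print(should_flag((0.55, 0.25, 0.15)))  # → False: far from it
```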
The idea here is not for Ellie to actually diagnose people and replace trained therapists. She's just there to offer insight to therapists, Morency says, by providing some objective measurements.
"Think about it as a blood sample," he says. "You send a blood sample to the lab and you get the result. The [people] doing the diagnosis [are] still the clinicians, but they use these objective measures to make the diagnosis."
But Skip Rizzo, the psychologist working on Ellie, genuinely believes these technologies will eventually change the field of mental health. One of the central problems with humans, he says, is that they bring their own biases to whatever they encounter, and those biases often make it hard for them to see what's directly in front of them.
"You can get training to be a health care provider or psychologist," he says, "and try to put those things on hold and be very objective. But it's still a challenge. It's always going to be biased by experience. What computers [like Ellie] offer is the ability to look at massive amounts of data and begin to look at patterns, and that, I think, far outstrips the mere mortal brain."
This summer, Ellie is being tested. She's scheduled to sit down with dozens of veterans from Iraq and Afghanistan.
She'll ask them about their lives, encourage them to open up.
Then, silently, Ellie will measure their answers.