BOSTON – Oftentimes a proper diagnosis relies not so much on what a patient says as on what he or she shows.
That’s the premise behind a new wave of startups and entrepreneurs looking to make an impact in healthcare. They’re developing mobile technology designed to analyze emotions, studying vocal and visual cues as well as physiological factors.
The idea is to pick up signs that a patient might not recognize or be willing to express.
“It’s the dawn of time for that particular technology,” says Joseph Kvedar, MD, founder and director of the Center for Connected Health, part of Partners HealthCare. “There’s so much sensitivity to the role that mental health plays in our healthcare.”
“It’s absolutely brand-new, very much experimental,” adds Meghan Searl, PhD, a psychologist with Brigham & Women’s Hospital in Boston, who sees a future for the technology in detecting depression and mood swings. “It’s a very nuanced and complex field, so a lot of validation work has to be done.”
Kvedar says the technology looks to address what he calls “the psychology of connected health,” measuring moods much as a nurse would take one’s vital signs. Mobile and wireless technology would certainly help, he says, if the argument could be made that a psychological sensor is as reliable as a blood-pressure cuff or blood-glucose monitor.
Mental health disorders rank among the top health problems worldwide in terms of cost to society. Depression affects 16 percent of adults, or 32 million people, and is significantly more common among those with chronic conditions: recent studies find that between 40 percent and 60 percent of people diagnosed with a chronic condition also suffer from depression.
Those same studies indicate that as many as 85 percent of people diagnosed with a chronic condition aren’t correctly diagnosed with depression, and that fewer than one-fourth of individuals experiencing depression receive appropriate treatment.
One of the companies trying to solve that dilemma is Cogito, based in Charlestown, Mass., which has been working since 2008 to develop “Honest Signals” technology that measures patient engagement, either through phone conversations or face-to-face encounters. Company CEO Joshua Feast says the idea was first launched in advertising circles to measure how people react to sales pitches or commercials, and is now making its way into population health management.
“We’re co-developing systems that basically will analyze relevant interactions between an organization and its patients,” he says. “What we’re really trying to do here is have more successful interactions.”
Feast is quick to point out that the technology can’t read people’s minds and isn’t meant to be used as a lie detector. “What this technology can do is replicate the observations of an observer,” he says. “You’re focusing on how people speak and interact, not what people say.”
Technology like that being developed by Cogito focuses on vocal cues in phone conversations or visual signals in face-to-face meetings. Other systems are being designed as mobile sensors, worn by patients, that monitor physiological responses to situations.
Feast sees this technology being used in many different scenarios, from diagnosing mental health disorders and PTSD to helping healthcare providers spot stress in employees and prevent burnout.
Another company on the horizon is Waltham, Mass.-based Affectiva, whose technology includes Q, a wearable biosensor, and Affdex, which uses a webcam to read facial expressions.
Like Cogito, Affectiva got its start at the Massachusetts Institute of Technology, having been co-founded by MIT professor Rosalind W. Picard. A group led by Picard, who is also director of the Affective Computing Research Group at MIT's Media Lab, had developed sensors that measure the electrical conductance of the skin, an indicator of the state of the wearer's sympathetic nervous system. The company raised $5.7 million in Series B funding in 2011 to commercialize those sensors, and is now exploring how they can be used on epilepsy patients – possibly to detect seizures before they occur.
Earlier this month, the company secured a $500,000 National Science Foundation grant to further develop Affdex, its cloud-based emotion measurement platform using facial expression recognition. While much of the company's work with Affdex has been in advertising (including crowdsourcing projects at the 2011 Cannes Lions International Festival of Creativity and the 2012 Super Bowl), officials see the technology eventually being used to help people with autism spectrum disorders and those who have difficulty reading faces in real-time conversation.
Most experts agree the technology has to be validated through studies before it can be applied to healthcare.
Among the questions to be answered, Searl points out: Can emotion-sensing technology be deployed without a patient’s approval? “It’s sort of like taking a person’s blood without their consent,” she says.
This past April, Cogito was awarded a contract from the University of Southern California’s Institute for Creative Technologies (ICT) to contribute to the Defense Advanced Research Projects Agency’s (DARPA’s) Detection and Computational Analysis of Psychological Signals Program. The company’s Social Signal Processing (SSP) platform is being integrated into telehealth interactions to allow clinicians to assess psychological stress, depression and engagement among U.S. military personnel, veterans and their families.
“Incorporating Cogito’s technology into the telehealth dashboard will give remote care providers an objective, secure tool to supplement their skills and intuition in assessing patients’ behavioral health status, engagement and rapport during each interaction,” Feast said in a press release announcing the contract. “We are honored to be working with ICT to support our U.S. Service members experiencing depression, post-traumatic stress disorder, or other psychological health concerns.”