M. Ehsan Hoque develops digital helpers that teach social skills

Humanlike assistants give a conversational boost to job applicants, people with developmental disorders

“In the future, we’ll all have digital, personalized assistants,” says M. Ehsan Hoque.

J. Adam Fenster / University of Rochester

M. Ehsan Hoque, 35
Computer scientist
University of Rochester

A growing band of digital characters that converse, read faces and track body language is helping humans to communicate better with one another. While virtual helpers that perform practical tasks, such as dealing with customer service issues, are becoming ubiquitous, computer scientist M. Ehsan Hoque is at the forefront of a more emotionally savvy movement. He and his team at the University of Rochester in New York create software for digital agents that recognize when a person is succeeding or failing in specific types of social interactions. Data from face-to-face conversations and feedback from professional counselors and interviewers with relevant expertise inform this breed of computer advisers.

One of Hoque’s digital helpers grooms people to be better public speakers. With words on a screen, this attentive app notes, for example, how many times in a practice talk a person says “um,” gestures inappropriately or awkwardly shifts vocal tone. With the help of Google Glass, the app even offers useful reminders during actual speeches. Another computerized helper, this one in the form of an avatar, helps people hone their job interviewing skills, flagging long-winded responses or inconsistent eye contact in practice interviews. In the works are computerized conversation coaches that can improve speech and communication skills among people with developmental conditions such as autism and mediate business meetings in ways that encourage everyone to participate in decision making.
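To give a concrete flavor of the simplest of those measurements, here is a minimal sketch of counting filler words in a practice-talk transcript. It is not Hoque's software; the filler list, function name and example text are hypothetical.

```python
# Minimal sketch of filler-word counting in a practice-talk transcript.
# Not Hoque's system: the filler list and names here are hypothetical.
import re
from collections import Counter

FILLERS = {"um", "uh", "like"}

def count_fillers(transcript: str) -> Counter:
    """Count common filler words in a lowercased, tokenized transcript."""
    words = re.findall(r"[a-z']+", transcript.lower())
    return Counter(w for w in words if w in FILLERS)

print(count_fillers("Um, so the results were, uh, um, like, promising."))
# Counter({'um': 2, 'uh': 1, 'like': 1})
```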

“There has been some progress in artificial intelligence, but not much in developing emotional aspects of AI,” Hoque says. “We’re just cracking through the surface at this point.”

The U.S. Department of Defense and the U.S. Army have taken notice. With their financial support, Hoque is developing avatars that collaborate with humans to solve complex problems, and digital observers that monitor body language to detect when people are lying.

AVATAR AIDE Emotionally savvy avatars developed by M. Ehsan Hoque and his team can help people improve their speaking and interviewing skills. M. Ali et al/6th Affective Computing and Intelligent Interaction (ACII) 2015

This is heady stuff for a 35-year-old who earned a doctoral degree just four years ago. Hoque, who was born in Bangladesh and immigrated to the United States as a teenager, did his graduate work with the MIT Media Lab’s Affective Computing research group. In the 1990s, the group’s director, Rosalind Picard, helped launch the field of “affective computing,” which focuses on the study and development of computers and robots that recognize, interpret and simulate human emotions.

Hoque’s approach puts a service spin on affective computing. As a grad student, he developed software he dubbed MACH, short for My Automated Conversation coacH. This system simulates face-to-face conversations with a computer-generated, 3-D man or woman that sees, hears and makes decisions while conversing with a real-life partner. Digital analyses of a human partner’s speech and nonverbal behavior inform the avatar’s responses during a session. A simulated coach may, for instance, let a user know if smiles during an interview look forced or are mistimed. After a session, users see a video of the interaction accompanied by displays of how well or poorly they did on various interaction skills, such as keeping eye contact and nodding at appropriate times.
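As a rough illustration of that kind of post-session summary (this is not the MACH implementation; the behavior labels and scoring below are hypothetical), per-frame behavior logs might be reduced to per-skill fractions like this:

```python
# Illustrative sketch, not the MACH implementation: turn logged per-frame
# nonverbal behavior into per-skill feedback after a practice session.
# Field names are hypothetical.
from dataclasses import dataclass

@dataclass
class Frame:
    eye_contact: bool  # gaze estimated to be on the interviewer
    smiling: bool
    nodding: bool

def summarize(frames: list[Frame]) -> dict[str, float]:
    """Fraction of the session during which each behavior was observed."""
    n = max(len(frames), 1)
    return {
        "eye contact": round(sum(f.eye_contact for f in frames) / n, 2),
        "smiling": round(sum(f.smiling for f in frames) / n, 2),
        "nodding": round(sum(f.nodding for f in frames) / n, 2),
    }

session = [Frame(True, False, False), Frame(True, True, False), Frame(False, True, True)]
print(summarize(session))  # {'eye contact': 0.67, 'smiling': 0.67, 'nodding': 0.33}
```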

MACH got its start in trials that taught MIT undergraduates how to conduct themselves during interviews with career counselors. First, Hoque analyzed smiles and other behaviors that either helped or hurt the impressions job candidates left on experienced counselors in mock interviews. In a series of follow-up studies, his team developed an automated system that recognized impression-enhancing behaviors during simulated interviews. That pilot version of MACH was then put to the test. Women, but not men, who received MACH training and got feedback from the digital coach while watching videos of their initial interviews with a counselor displayed substantial improvement in follow-up interviews. MACH trainees who watched interview videos but got no feedback showed minimal improvement. Testing with larger groups of men and women is under way.

As he developed MACH, Hoque consulted MIT sociologist and clinical psychologist Sherry Turkle. That was a bold move, since Turkle has warned for 30 years that, despite its pluses, digital culture discourages person-to-person connections. Social robots, in particular, represent a way for people to escape the challenges of forging authentic relationships, Turkle contends.

But she came away impressed with Hoque, whose goals she calls refreshingly modest and transparent. “His avatars will be helpers and facilitators,” she says, “not companions, friends, therapists and pretend people.”

Hoque’s approach grew out of personal experience. He is the primary caregiver for his 16-year-old brother, Eshteher, who has Down syndrome and does not speak. Eshteher can make sounds to refer to certain things, such as food, and has limited use of sign language. “I’ve spent a lot of time with him and can read what he’s experiencing, like when he’s frustrated or repentant,” Hoque says.

So it’s not surprising that Hoque’s next-generation MACH, dubbed LISSA for Live Interactive Social Skill Assistance, is an avatar that conducts flexible, “getting acquainted” conversations while providing feedback on users’ eye contact, speaking volume, smiling and body movements via flashing icons.
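For illustration only (this is not LISSA's code; the window size and threshold are made up), one such flashing-icon cue can be as simple as averaging recent audio loudness and showing a reminder when it drops too low:

```python
# Hypothetical sketch of one flashing-icon cue, not LISSA's actual code:
# show a "speak up" icon when recent speaking volume falls below a threshold.
from collections import deque

WINDOW = 30          # number of recent loudness samples to average
VOLUME_FLOOR = 0.2   # made-up threshold on normalized loudness (0 to 1)

recent: deque[float] = deque(maxlen=WINDOW)

def on_volume_sample(loudness: float) -> bool:
    """Return True when the 'speak up' icon should be shown."""
    recent.append(loudness)
    return sum(recent) / len(recent) < VOLUME_FLOOR
```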

LISSA has shown promise in preliminary tests aimed at improving the conversational chops of college students attending speed-dating sessions and individuals with autism spectrum disorders. Hoque plans to expand this technology for use with people suffering from social phobia and post-traumatic stress disorder. He’s also working on an avatar that trains doctors to communicate clearly and compassionately with patients being treated for life-threatening cancers.

Hoque’s work on emotionally perceptive avatars may eventually transform the young industry of digital assistants, currently limited to voices-in-a-box such as Apple’s Siri and Microsoft’s Cortana, says cognitive scientist Mary Czerwinski, a principal researcher at Microsoft Research Lab in Redmond, Wash. Avatar research “could lead to more natural, personable digital assistants,” Czerwinski predicts. Hoque agrees.

“In the future, we’ll all have digital, personalized assistants,” he says. If he gets his way, emotionally attuned helpers will make us more social and less isolated. That’s something to applaud — if we can manage to put down our smartphones.

Bruce Bower has written about the behavioral sciences for Science News since 1984. He writes about psychology, anthropology, archaeology and mental health issues.
