What if your laptop knew how you felt?
Researchers train computers to 'read' emotions, which could help with teaching, security, people with autism – and cranky users.
Together with computer scientists, Cohn is working to get machines to read AUs and then describe which muscles moved and how. Such applications could do what Cohn did when he studied a videotape of a criminal who professed to be distraught about the murder of several family members and tried to pin the blame on someone else. Cohn watched attentively and saw no genuine sadness reflected in the woman's face. Sadness is a combination of AUs that is difficult to perform voluntarily: pulling down the corners of your lips while bringing your eyebrows together and raising them. What the subject did was raise her cheeks to simulate the lip curl. Her brows stayed smooth.
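The distinction Cohn drew can be sketched as a simple rule over detected AUs. This is a toy illustration of the article's description only, not Cohn's actual method; the AU numbers follow the standard Facial Action Coding System (AU1 inner brow raiser, AU4 brow lowerer, AU15 lip corner depressor, AU6 cheek raiser), and the function name is hypothetical.

```python
def classify_sadness(aus: set[int]) -> str:
    """Roughly label a sadness display from the set of active AU numbers.

    A toy rule based on the article: genuine sadness combines raised,
    drawn-together brows (AU1 + AU4) with depressed lip corners (AU15);
    a cheek raise (AU6) with smooth brows can mimic the downturned mouth.
    """
    genuine_brow = 1 in aus and 4 in aus   # brows raised and drawn together
    lip_corners_down = 15 in aus           # true lip-corner depression
    cheek_raise = 6 in aus                 # can simulate the lip curl

    if genuine_brow and lip_corners_down:
        return "consistent with genuine sadness"
    if cheek_raise and not genuine_brow:
        return "possible feigned sadness (raised cheeks, smooth brows)"
    return "inconclusive"

print(classify_sadness({1, 4, 15}))  # consistent with genuine sadness
print(classify_sadness({6}))         # possible feigned sadness (raised cheeks, smooth brows)
```

A real system would score AU intensities from video frames rather than a yes/no set, but the logic of the check is the same.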
Researchers interviewed for this story concur that emotion recognition appeals to the security industry, which could use it in lie detection, identification, and expression reading. The best results are still obtained in controlled settings with proper lighting and a good positioning of the face. An image from a security camera wouldn't give the software much to work with.
Picard says there is peril in working with "fake data" if this technology is used in security. Machines may be able to read fear, but fear is not necessarily an indicator of bad intentions. Conversely, sudden elation after a period of depression can be an indicator of suicidal intent.
Tim Bickmore wants to teach computers to make small talk.
A graduate of Rosalind Picard's group at the Massachusetts Institute of Technology and now a computer-science professor at Northeastern University in Boston, Mr. Bickmore studies how human relationships develop over time – how conversation grows less formal and draws more on shared history – so that computers can do the same. Bickmore has created Laura, one in a series of "relational agents," computer programs that adapt to a user's emotional state, engage in small talk, and even remember information from previous conversations.
Bickmore tested Laura in a two-month-long study to promote walking among patients at the Boston Medical Center geriatrics clinic. Bickmore divided 21 people into two groups. Both groups received a pedometer and a brochure. One group was also asked to interact daily with Laura through a touch-screen computer.
In "her" robotic voice, Laura would first try to gauge the emotional state of the person. The response would prompt a change in Laura's facial features, tone, and the things she'd say. She eventually would present the results of that day's walk along with future goals, but she'd also find time for social conversation.
If the person told Laura they were going to walk in the park with a friend, the next day Laura would ask if the friend would join them again. Laura's software also had some humor built into it: If a user asked Laura where she lived, she'd reply: "I just live in this little box." Over the course of the trial, Laura became familiar with people's taste in food and movies.
At the end of 60 days, researchers found that people who talked to Laura walked twice as many steps as those in the control group. One reason could be that Laura successfully established a bond with patients. Bickmore says that when Laura appeared on the screen to greet users, they would often wave back and say, "Hi, Laura."
Bickmore hopes to build more real-time applications and have relational agents act as advisers. "Do I go for the cookie or the salad?" someone on a diet might ask Laura, who would respond accordingly.
• Read more about relational agents and their use on Tim Bickmore's site: http://www.ccs.neu.edu/home/bickmore/