Wearable artificial intelligence system that can understand the tone of a conversation

Researchers at MIT's CSAIL and the Institute of Medical Engineering and Science (IMES) are working on a wearable artificial intelligence system that can predict whether a conversation is happy, sad, or neutral based on a person's speech patterns and vital signs. “Imagine if, at the end of a conversation, you could go back and look at the moments when the people around you felt more anxious,” says Tuka Alhanai, an MIT graduate student who wrote the paper together with PhD candidate Mohammad Ghassemi; it will be presented at the conference of the Association for the Advancement of Artificial Intelligence (AAAI) in San Francisco. “Our work is a step in this direction, suggesting that we may not be far from a world where people can have an AI ‘social coach’ right in their pocket.”

While someone is speaking, the system can analyze audio, text transcripts, and vital signs to determine the overall tone of the conversation with 83% accuracy. Using deep learning techniques, the system can also provide a “sentiment score” for specific five-second segments of a conversation. The researchers argue that the system's performance would be further improved if multiple participants in a conversation used their smartwatches simultaneously, generating more data for the algorithms to analyze. At the same time, they say that user privacy and the protection of personal data were taken into account during the system's development, with Alhanai noting that a commercial version would clearly need protocols for obtaining the consent of everyone involved in a conversation.
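To make the idea of per-segment scoring concrete, here is a minimal sketch of labeling a conversation in five-second windows. The `Segment` fields, the linear scorer, and all weights and thresholds are illustrative assumptions standing in for the researchers' actual deep-learning model, which is not described in detail here.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Segment:
    """Hypothetical multimodal features for one 5-second window.

    The real system fuses audio features, transcript text, and vital
    signs; these three fields are simplified stand-ins.
    """
    pitch_var: float      # audio: pitch variability in the window
    positive_words: int   # text: count of positive words in the transcript
    heart_rate: float     # physiology: beats per minute

def sentiment_score(seg: Segment) -> float:
    # Toy linear scorer in place of the deep network; weights are invented.
    return (0.5 * seg.pitch_var
            + 0.3 * seg.positive_words
            - 0.02 * (seg.heart_rate - 70.0))

def label_conversation(segments: List[Segment]) -> List[str]:
    # One tone label per 5-second window of the conversation.
    labels = []
    for seg in segments:
        s = sentiment_score(seg)
        if s > 0.5:
            labels.append("positive")
        elif s < -0.5:
            labels.append("negative")
        else:
            labels.append("neutral")
    return labels

# A 15-second conversation: three consecutive 5-second windows.
windows = [Segment(1.2, 3, 72), Segment(0.1, 0, 95), Segment(0.4, 1, 74)]
print(label_conversation(windows))  # → ['positive', 'neutral', 'neutral']
```

A real system would replace the hand-picked weights with a model trained on labeled conversations, but the windowing structure, one score per short segment, is the part this sketch illustrates.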