A good deal of attention is being given to emotion-detection systems that use machine learning and deep learning to identify the emotion a person is experiencing from their facial expressions, the words they use, and the sound of their voice. Many of these systems are remarkably accurate, but they share a limitation: the person must either speak while experiencing an emotion or show it on their face. Emotions that never reach the face or the voice remain hidden. Now, a research group at MIT's Computer Science and Artificial Intelligence Lab (CSAIL) has built a system called EQ-Radio that identifies emotions from wireless signals reflected off the body, which carry traces of physiological signals such as heartbeat and breathing, whether or not a person is speaking or showing emotion on their face.
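To make the idea concrete, here is a toy sketch (not EQ-Radio's actual method) of the final classification step: once a system has extracted physiological features such as heart rate and breathing rate, a simple classifier can map them to an emotion label. The centroids, feature values, and emotion labels below are illustrative assumptions, not data from the MIT work.

```python
# Toy sketch: classify emotion from physiological features that an
# RF-based system might extract, e.g. heart rate (beats/min) and
# breathing rate (breaths/min). All numbers and labels here are
# hypothetical, for illustration only.
import math

# Hypothetical per-emotion centroids: (heart_rate, breathing_rate)
CENTROIDS = {
    "calm":    (62.0, 12.0),
    "excited": (95.0, 20.0),
    "sad":     (70.0, 10.0),
}

def classify(heart_rate: float, breathing_rate: float) -> str:
    """Return the emotion whose centroid is nearest to the measured features."""
    return min(
        CENTROIDS,
        key=lambda e: math.dist(CENTROIDS[e], (heart_rate, breathing_rate)),
    )

print(classify(60, 12))   # nearest to the "calm" centroid
```

A real system would of course learn its decision boundaries from labeled data and use far richer features, but the sketch shows why no camera or microphone is needed: the inputs are physiological measurements, not images or audio.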