- Joseph Weizenbaum's ELIZA used a simple pattern-matching technique that led some users to believe it genuinely understood their feelings, which sparked early interest in machines that could engage with human emotion (a minimal pattern-matching sketch follows this list).
- Rosalind Picard's research established affective computing: the ability of machines to recognize, interpret, and simulate human emotion. Her work laid the foundation for emotion in AI.
- Paul Ekman's Facial Action Coding System (FACS) lets AI infer emotions by analyzing facial micro-expressions. FACS is now a core technique for emotion detection in security, marketing, and healthcare (see the action-unit sketch after this list). Outside source: https://www.paulekman.com/facial-action-coding-system/
- Affectiva introduced AI systems that analyze drivers' and passengers' emotions in real time using facial and vocal cues. The goal is to reduce accidents by detecting fatigue, frustration, or distraction and prompting alerts (an illustrative alerting sketch follows this list).
- AI-powered therapy apps and robots will become standard tools in mental health care. Their emotion recognition and conversational capabilities will let them respond empathetically and track wellbeing over time (a simple tracking sketch appears after this list).
- AI will evolve to genuinely experience emotions by using neuromorphic computing to replicate the feelings and chemical processes of the human brain. These systems will form relationships with humans based on deep emotional understanding, which may blur the line between machine and human: an AI might share a human's joy or sadness.
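A minimal sketch of the kind of pattern matching ELIZA relied on; the rules and responses below are illustrative examples, not Weizenbaum's original DOCTOR script.

```python
import re

# Illustrative ELIZA-style rules: a regex pattern paired with a response template.
# These are simplified examples, not the original DOCTOR script.
RULES = [
    (re.compile(r"i feel (.*)", re.I), "Why do you feel {0}?"),
    (re.compile(r"i am (.*)", re.I), "How long have you been {0}?"),
    (re.compile(r"my (.*)", re.I), "Tell me more about your {0}."),
]
DEFAULT = "Please go on."

def respond(user_input: str) -> str:
    """Return the first matching reflected response, or a generic prompt."""
    for pattern, template in RULES:
        match = pattern.search(user_input)
        if match:
            return template.format(match.group(1).rstrip(".!?"))
    return DEFAULT

print(respond("I feel sad about my job"))  # -> "Why do you feel sad about my job?"
```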
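As a rough illustration of FACS-based emotion detection, the sketch below maps combinations of facial action units (AUs) to basic emotions. The mapping and threshold are simplified assumptions; a real system would first extract AUs and their intensities from video.

```python
# Simplified, assumed mapping from FACS action-unit (AU) combinations to basic
# emotions; real FACS-based systems use many more AUs, intensities, and timing.
EMOTION_AU_PATTERNS = {
    "happiness": {6, 12},        # cheek raiser + lip corner puller
    "surprise":  {1, 2, 5, 26},  # brow raisers + upper lid raiser + jaw drop
    "anger":     {4, 5, 7, 23},  # brow lowerer + lid tightener + lip tightener
    "sadness":   {1, 4, 15},     # inner brow raiser + brow lowerer + lip corner depressor
}

def classify_emotion(detected_aus: set[int]) -> str:
    """Return the emotion whose AU pattern best overlaps the detected AUs."""
    best, best_score = "neutral", 0.0
    for emotion, pattern in EMOTION_AU_PATTERNS.items():
        score = len(pattern & detected_aus) / len(pattern)
        if score > best_score:
            best, best_score = emotion, score
    return best if best_score >= 0.5 else "neutral"

# Example: AUs 6 and 12 detected in a frame -> classified as happiness.
print(classify_emotion({6, 12}))
```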
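The in-cabin alerting logic could look something like the sketch below. The `CabinReading` fields, score ranges, and thresholds are hypothetical placeholders for illustration, not Affectiva's actual SDK.

```python
from dataclasses import dataclass

# Hypothetical per-frame output of an in-cabin emotion/attention model;
# field names and 0.0-1.0 score ranges are assumptions for illustration.
@dataclass
class CabinReading:
    drowsiness: float
    frustration: float
    distraction: float

# Assumed alert thresholds; a production system would tune these empirically.
THRESHOLDS = {"drowsiness": 0.7, "frustration": 0.8, "distraction": 0.6}

def check_alerts(reading: CabinReading) -> list[str]:
    """Return an alert message for every cue that exceeds its threshold."""
    alerts = []
    for cue, limit in THRESHOLDS.items():
        level = getattr(reading, cue)
        if level >= limit:
            alerts.append(f"Alert: {cue} level {level:.2f} exceeds {limit}")
    return alerts

# Example: a drowsy but otherwise calm driver triggers a single alert.
for message in check_alerts(CabinReading(drowsiness=0.82, frustration=0.1, distraction=0.4)):
    print(message)
```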
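One simple way such an app might track wellbeing over time is to log a mood score per session and report a rolling average, as in this hypothetical sketch (the 0-10 scale and window size are assumptions).

```python
from collections import deque
from statistics import mean

class WellbeingTracker:
    """Logs per-session mood scores (assumed 0-10 scale) and reports a recent trend."""

    def __init__(self, window: int = 7):
        self.scores = deque(maxlen=window)  # keep only the most recent sessions

    def log_session(self, mood_score: float) -> None:
        self.scores.append(mood_score)

    def rolling_average(self) -> float:
        return mean(self.scores) if self.scores else 0.0

tracker = WellbeingTracker()
for score in [4.0, 5.5, 6.0, 7.0]:  # scores from four hypothetical sessions
    tracker.log_session(score)
print(f"Average mood over recent sessions: {tracker.rolling_average():.1f}")
```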