Affective Computing Timeline | Conner McCracken

  • Affective Computing is born

    Affective computing, the idea that machines should be capable of recognizing and responding to human emotions, was originally proposed in 1995 by Rosalind Picard, a researcher at the Massachusetts Institute of Technology (MIT). Her book of the same name cemented the proposition, establishing the foundations for a research area at the intersection of computer science, psychology, and neuroscience. https://www.historyofinformation.com/detail.php?id=5043
  • Facial Expression recognition software

    Building on the Facial Action Coding System (FACS), engineers began creating predictors of emotion from facial expressions, one of the first real-world applications of affective computing as a field. Later systems such as Affectiva built on this foundation to analyze emotional responses as they occurred in real time, and some of these tools have been applied in advertising to gauge viewers' reactions. A rule-based sketch of the idea follows below. https://www.clickworker.com/customer-blog/affective-computing/
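    To make the FACS approach concrete, here is a minimal rule-based sketch in Python. The AU-to-emotion mappings follow widely cited EMFACS-style heuristics (e.g., AU6 + AU12 for happiness); the set of active action units is assumed to come from an upstream face-analysis model, which is hypothetical here rather than any specific product's API.

        # Rule-based FACS emotion predictor (illustrative sketch).
        EMOTION_RULES = {
            "happiness": {6, 12},        # cheek raiser + lip corner puller
            "sadness":   {1, 4, 15},     # inner brow raiser + brow lowerer + lip corner depressor
            "surprise":  {1, 2, 5, 26},  # brow raisers + upper lid raiser + jaw drop
            "anger":     {4, 5, 7, 23},  # brow lowerer + lid/lip tighteners
        }

        def predict_emotion(active_aus):
            """Return the emotion whose required action units best match."""
            best, best_score = "neutral", 0.0
            for emotion, required in EMOTION_RULES.items():
                score = len(required & active_aus) / len(required)
                if score > best_score:
                    best, best_score = emotion, score
            return best if best_score >= 0.5 else "neutral"

        print(predict_emotion({6, 12}))  # -> happiness

    Real systems replace these hand-written rules with classifiers trained on labeled video, but the mapping from facial action units to emotions is the same underlying idea.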
  • Emotion sensing wearable device

    Researchers developed wearables such as the Empatica E4 wristband, which carries electrodes and optical sensors for monitoring signals such as skin conductance and heart rate as proxies for emotional arousal. These devices also found substantial applications in health, including the monitoring of epileptic seizures. A sketch of how such signals might be combined appears below.
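    As a rough illustration, the snippet below combines skin-conductance (EDA) and heart-rate readings into a single arousal estimate. The baselines and equal weighting are hypothetical choices made for this sketch, not parameters of the E4 or of any published algorithm.

        # Crude arousal index from wearable EDA and heart-rate streams.
        from statistics import mean

        def arousal_score(eda_us, hr_bpm, eda_baseline=2.0, hr_baseline=70.0):
            """Return a 0..1 arousal estimate from deviations above baseline."""
            eda_rise = max(0.0, mean(eda_us) - eda_baseline) / eda_baseline
            hr_rise = max(0.0, mean(hr_bpm) - hr_baseline) / hr_baseline
            # Equal weighting is an arbitrary choice for illustration.
            return min(1.0, 0.5 * eda_rise + 0.5 * hr_rise)

        print(arousal_score([4.1, 4.3, 4.0], [95, 98, 96]))  # elevated readings -> high score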
  • Power of AI

    WaveNet, a deep neural network model from DeepMind, advanced AI speech synthesis by generating raw audio one sample at a time, enabling synthetic speech to carry natural intonation and emotional nuance. In contrast to prior text-to-speech solutions that stitched together recorded fragments, WaveNet learned to model the waveform directly with stacks of dilated convolutions. The technology helped applications such as Google Assistant advance and improved accessibility services. A stripped-down sketch of the architecture follows. https://neurosciencenews.com/affective-computing-ai-emotion-25668/
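    The PyTorch sketch below captures WaveNet's central mechanism: dilated causal convolutions whose receptive field doubles each layer, so each predicted sample depends only on past samples. The layer count and channel sizes are illustrative; the real model adds gated activations, skip connections, and text conditioning.

        # Tiny dilated-causal-convolution stack in the spirit of WaveNet.
        import torch
        import torch.nn as nn

        class CausalConv1d(nn.Module):
            def __init__(self, channels, dilation):
                super().__init__()
                self.pad = dilation  # left-pad so outputs never see future samples
                self.conv = nn.Conv1d(channels, channels, kernel_size=2, dilation=dilation)

            def forward(self, x):
                return self.conv(nn.functional.pad(x, (self.pad, 0)))

        class TinyWaveNet(nn.Module):
            def __init__(self, channels=32, layers=6):
                super().__init__()
                # Dilations 1, 2, 4, ... double per layer, widening the receptive field.
                self.stack = nn.ModuleList(CausalConv1d(channels, 2 ** i) for i in range(layers))
                self.inp = nn.Conv1d(1, channels, kernel_size=1)
                self.out = nn.Conv1d(channels, 256, kernel_size=1)  # 8-bit mu-law classes

            def forward(self, wave):                   # (batch, 1, time)
                h = self.inp(wave)
                for layer in self.stack:
                    h = torch.relu(layer(h)) + h       # simplified residual connection
                return self.out(h)                     # per-sample class logits

        print(TinyWaveNet()(torch.randn(1, 1, 1024)).shape)  # torch.Size([1, 256, 1024])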
  • Emotionally responsive future tools

    In 2025 and beyond, courses and e-learning platforms are expected to incorporate affective artificial intelligence that detects students' emotions and adapts lesson difficulty in parallel. For example, if a student gets stuck, the system may offer tips or simplify the material; a virtual math tutor might pause when it senses signs of frustration and resume the lesson once the student has recovered. A sketch of this adaptation loop is shown below.
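    The decision logic behind such a tutor can be quite small once an affect detector supplies a score. In the hypothetical sketch below, frustration is a 0..1 value assumed to come from an upstream detector (facial expression, keystroke dynamics, or similar), and the thresholds are invented for illustration.

        # Adaptive-tutoring decision loop driven by a frustration estimate.
        def next_action(frustration, difficulty):
            """Decide the tutor's response; returns (action, new difficulty)."""
            if frustration > 0.7:
                return "pause and reassure the student", max(1, difficulty - 1)
            if frustration > 0.4:
                return "offer a hint", difficulty
            return "continue the lesson", difficulty + 1  # student is coping; step up

        print(next_action(0.8, 3))  # ('pause and reassure the student', 2)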
  • Future of AI being everywhere

    It is expected that within the next 100 years people will be able to broadcast emotional signals in real time and even share them through brain-computer interfaces (BCIs). Imagine being able to 'transmit' comfort to another person experiencing apprehension, almost like a form of ESP. Decentralised, this innovation could transform therapy and relationships, art and creativity, but it could also invite privacy violations and falsified emotions. https://spj.science.org/doi/10.34133/icomputing.0076