Affective Computing Timeline

  • Early Theoretical Foundations

    This period marked the inception of concepts surrounding emotional intelligence in machines. Theorists and computer scientists began pondering the possibility of machines that could understand and emulate human emotions.
  • Rosalind Picard's "Affective Computing"

    Picard's book laid the theoretical foundation for affective computing, proposing that computers could be programmed to recognize, interpret, and process human emotions.
  • The Galvactivator

    The Galvactivator, developed by Rosalind Picard and her team, was a wearable, glove-like device that measured skin conductance, a physiological indicator of emotional arousal, and fed that signal back to the wearer. In one application, users playing computer games saw the glove's light glow brighter as their skin conductance rose. The device marked a significant step in affective computing, showing that physiological changes could be linked to emotional states, as sketched below.
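
    As a rough illustration of this feedback loop (not the original device's firmware), the Python sketch below maps a skin-conductance reading to a brightness level; the conductance bounds and the linear mapping are assumptions for demonstration only.

        def conductance_to_brightness(microsiemens, baseline=2.0, ceiling=20.0):
            """Linearly map a skin-conductance reading (microsiemens) to a
            0.0-1.0 brightness level; baseline and ceiling are assumed bounds."""
            level = (microsiemens - baseline) / (ceiling - baseline)
            return max(0.0, min(1.0, level))  # clamp to the displayable range

        # Higher arousal (higher conductance) -> brighter glow
        for reading in (1.5, 5.0, 12.0, 25.0):
            print(f"{reading:5.1f} uS -> brightness "
                  f"{conductance_to_brightness(reading):.2f}")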
  • Clippy

    Clippy, the Microsoft Office Assistant, was not an affective computing tool, but it represents an early attempt at creating interactive AI. Its design aimed to make software more user-friendly, although it lacked emotional intelligence capabilities.
  • Affectiva

    Affectiva, a spin-off from MIT Media Lab, developed emotion recognition software that analyzes facial cues and expressions. Their technology, used in various sectors, demonstrated the practical application of emotion AI in understanding human sentiments.
  • Jibo

    Jibo was a pioneering social robot designed to interact with humans on an emotional level. Known as the "world's first family robot," Jibo used facial recognition and natural language processing to engage with users, making it a significant development in emotionally intelligent machines. Its ability to respond to social cues and display 'emotional' reactions represented a notable advancement in affective computing and robotics.
  • Emotient

    Emotient, acquired by Apple, specialized in facial expression analysis using AI. Their technology could read and interpret multiple facial expressions, offering insights into user emotions, a breakthrough in emotion detection and analysis. (https://www.wsj.com/articles/apple-buys-artificial-intelligence-startup-emotient-1452188715)
  • Breakthrough In Emotion AI

    An article from MIT Technology Review discusses a breakthrough in emotion AI: algorithms that interpret human emotions more accurately by analyzing patterns in both voice and facial expressions. The article highlights the technology's potential for patient monitoring in healthcare and as an empathy tool in customer service; a rough sketch of the multimodal idea follows. https://www.technologyreview.com/2020/09/24/1008876/how-close-is-ai-to-decoding-our-emotions/
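
    As an illustration only (not the system described in the article), the sketch below combines emotion probabilities from two hypothetical models, one for voice and one for facial expression, via weighted late fusion; the label set, weights, and example probabilities are assumptions.

        LABELS = ("happy", "sad", "angry", "neutral")  # assumed label set

        def fuse(voice_probs, face_probs, voice_weight=0.4):
            """Weighted average of two probability distributions over LABELS."""
            face_weight = 1.0 - voice_weight
            fused = [voice_weight * v + face_weight * f
                     for v, f in zip(voice_probs, face_probs)]
            total = sum(fused)
            return [p / total for p in fused]  # renormalize

        voice = [0.10, 0.60, 0.10, 0.20]  # stand-in speech-emotion model output
        face = [0.20, 0.40, 0.05, 0.35]   # stand-in facial-expression model output
        probs = fuse(voice, face)
        print(max(zip(probs, LABELS)))    # most likely fused emotion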
  • Augmented Reality Contact Lenses (Near Future Prediction)

    In the 2030s, AI assistants might evolve to understand complex human emotions, providing more personalized and empathetic interactions. They could seamlessly integrate into daily life, offering support in mental health, education, and personal assistance using augmented reality technologies. Imagine AR contact lenses that not only project information but also read and respond to the wearer's emotional state, enhancing communication and personal interaction.
  • Brain-Computer Interfaces (Long-term Future Prediction)

    A century from now, affective computing could culminate in brain-computer interfaces that engage directly with human emotional states. Such systems might allow people to share emotional experiences and sensations, potentially revolutionizing communication and empathy.