Affective Computing (AI) Timeline

  • The Beginnings of Artificial Intelligence

    The term "artificial intelligence" was coined in 1956 at a conference held at Dartmouth College. Researchers and scientists realized they needed a name for this growing field, and the conference is considered the true start of artificial intelligence. The photo included shows a plaque at the site where the conference was held; it explains who was involved and how they created the field of AI.
    Here is a video describing what AI is.
    YouTube
  • The First AI Programs

    During this time period, the first AI programs were created. In 1956, Logic Theorist was made, a program meant to prove mathematical theorems and mimic the human brain. In 1957, General Problem Solver arrived on the scene, tackling problems in chess, geometry, and word puzzles. LISP, the first AI programming language, was created by John McCarthy in 1958.
    Instructional Design
  • Moore's Law

    Moore's Law is a trend observed by Gordon Moore: roughly every two years, the number of transistors that can be placed on a circuit doubles. This led to more developed and advanced technology and, among other innovations, helped advance AI, allowing technology to become smaller, faster, and more affordable. In recent years, Moore's Law appears to be slowing down.
    Here is a video describing Moore's Law.
    YouTube
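    The doubling described above can be written as simple arithmetic. Here is a minimal Python sketch, using Intel's first microprocessor from 1971 (roughly 2,300 transistors) as a baseline; the projection is only the idealized trend, not a physical law.

```python
# Moore's Law as arithmetic: transistor counts doubling every two years.
# The 1971 baseline (~2,300 transistors) is real; the rest is the idealized trend.

def transistors(year, base_year=1971, base_count=2300):
    """Project a transistor count assuming a doubling every two years."""
    return base_count * 2 ** ((year - base_year) / 2)

for y in (1971, 1981, 1991, 2001):
    print(y, round(transistors(y)))
```

    Each decade multiplies the count by 2^5 = 32, which is why the curve outpaces almost any other technology trend.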
  • The Sentograph

    The Sentograph, a form of affective computing created by Manfred Clynes, was one of the first machines that could "measure" emotions. It worked by measuring changes in finger pressure waveforms as the user pressed an immovable button. Using algorithms and patterns, Clynes believed that certain pressure waveforms indicated certain emotions. The Sentograph stimulated the interest of Rosalind Picard, another researcher who did extensive work in the field and is discussed later in the timeline.
  • Facial Action Coding System

    Originally created by Carl-Herman Hjortsjö and later adopted by Paul Ekman, this system identifies emotions from facial expressions. It assigns every facial muscle movement an action unit (AU), which lets the system process a whole face by breaking it down into individual AUs and categorizing the combination by the emotion assigned to it. As with any such technology, the computer needs a logical algorithm to read emotions.
    YouTube
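    A toy Python sketch of the idea: the AU-to-emotion pairings below follow commonly cited FACS/EMFACS combinations, but a real system scores AU intensities from video rather than using a lookup table.

```python
# Toy FACS illustration: facial muscle movements are coded as numbered
# action units (AUs), and combinations of AUs map to emotions.

EMOTION_BY_AUS = {
    frozenset({6, 12}): "happiness",       # cheek raiser + lip corner puller
    frozenset({1, 4, 15}): "sadness",      # inner brow raiser + brow lowerer + lip corner depressor
    frozenset({1, 2, 5, 26}): "surprise",  # brow raisers + upper lid raiser + jaw drop
}

def classify(active_aus):
    """Return the emotion whose AU pattern matches the detected action units."""
    return EMOTION_BY_AUS.get(frozenset(active_aus), "unknown")

print(classify([6, 12]))  # happiness
```

    Breaking the face into AUs first, then mapping combinations to emotions, is exactly the "logical algorithm" the entry describes.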
  • Clippit!

    Clippit, otherwise known as Clippy, was a feature in Microsoft Office that would pop up and offer help to the user. Clippit made suggestions based on whatever the user was doing. Although meant to be helpful, it proved annoying because it popped up continually and could not understand the user's emotions, which caused the frustration. The feature was retired in 2001.
    Video of Clippit in action
    YouTube
  • Sojourner Rover

    The Sojourner rover landed on Mars in 1997, the first rover to operate on another planet. It was sent to Mars carrying science instruments. Originally expected to last only 7 days, Sojourner spent 83 days exploring Mars' terrain, taking photos, and collecting measurements. The rover served as an example and model for all future Mars rovers.
    NASA
  • Kismet

    Kismet, built at MIT by Cynthia Breazeal in the late 1990s, was the first social robot. Made of metal, it had large eyes, red lips, and movable ears. Kismet could track objects and make comments about them, speak to the user, and convey nonverbal communication such as blinking and head nods.
    Video of Kismet in action
    YouTube
  • Squeezable Mouse

    In response to Clippit's unpopularity, Picard created the squeezable mouse, a device that could detect a user's frustration by analyzing how much pressure the user exerted on the mouse: more pressure indicated more frustration. When frustration was detected, Clippit could pop up and offer help at just those moments. Although this technology solved the problem of Clippit not detecting frustration, it came too late to save Clippit.
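    The pressure-to-frustration logic might be sketched like this; the threshold and window size are invented for illustration, not taken from Picard's actual design.

```python
# Hypothetical squeezable-mouse logic: sustained high grip pressure is read
# as frustration, which would trigger an offer of help.

def is_frustrated(pressure_samples, threshold=0.7, min_run=3):
    """Flag frustration when pressure stays above threshold for min_run samples."""
    run = 0
    for p in pressure_samples:
        run = run + 1 if p > threshold else 0
        if run >= min_run:
            return True
    return False

print(is_frustrated([0.2, 0.3, 0.25, 0.4]))   # False (relaxed grip)
print(is_frustrated([0.5, 0.8, 0.9, 0.85]))   # True  (sustained squeeze)
```

    Requiring a sustained run of high readings, rather than a single spike, keeps an ordinary firm click from being misread as frustration.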
  • Galvactivator Transforming Video Games

    The galvactivator is a glove-like device that measures the skin's conductance. It is connected to LED lights, so when the user's arousal level increases, the lights shine brighter. The device was hooked up to the video game Quake so that when the player's arousal spiked (from shock, for example), the in-game character would jump back in surprise. Through biosensors, the machine could detect the player's emotions.
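    The conductance-to-brightness mapping could look roughly like this; the calibration range here is an assumption, not the device's actual values.

```python
# Sketch of the galvactivator mapping: higher skin conductance (arousal)
# drives a brighter LED. The 1-10 microsiemens range is an assumed calibration.

def led_brightness(conductance, low=1.0, high=10.0):
    """Map skin conductance (microsiemens) linearly onto LED brightness 0-255."""
    level = (conductance - low) / (high - low)
    level = max(0.0, min(1.0, level))  # clamp to the calibrated range
    return round(level * 255)

print(led_brightness(1.0))   # 0   (calm)
print(led_brightness(10.0))  # 255 (startled)
```

    Clamping keeps noisy sensor readings outside the calibrated range from producing out-of-range brightness values.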
  • MindReader

    MindReader, a pair of glasses with a camera and LEDs, was created by Rana el Kaliouby. The device watched the listener and told the wearer whether that person was engaged, neutral, or bored via a green, amber, or red LED. She hoped it would be a great device for people with autism, who can struggle to read facial expressions.
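    The traffic-light feedback reduces to a simple mapping; the engagement score and its cutoffs below are hypothetical, standing in for whatever the glasses' camera software computed.

```python
# Sketch of MindReader-style feedback: an engagement estimate in [0, 1]
# becomes a traffic-light LED color for the wearer. Cutoffs are invented.

def led_color(engagement_score):
    """Map an engagement score in [0, 1] to the glasses' LED color."""
    if engagement_score >= 0.6:
        return "green"  # listener is engaged
    if engagement_score >= 0.3:
        return "amber"  # listener is neutral
    return "red"        # listener is bored

print(led_color(0.8))  # green
print(led_color(0.1))  # red
```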
  • iCalm Wristband

    The iCalm wristband was a wearable device that tracked the user's autonomic processes, such as heart rate and electrodermal activity. This helped users monitor their health and make improvements. The wristband was designed to help with sleep, weight monitoring, stress, and more.
  • Pepper

    Pepper is the first commercially available social robot for the public to buy. A four-foot-tall humanoid, Pepper can speak, make jokes, and interpret a person's mood. More than 2,000 companies have used Pepper as an assistant to greet guests as they enter, and it can speak up to 15 languages. Pepper has a touch screen, microphones, and LEDs, giving users multiple ways to interact with it, and it can navigate independently.
    Aldebaran
  • Jibo

    Jibo was billed as the world's first social robot for the home, and it ran the most successful robot crowdfunding campaign to date, raising $35 million in a year. In 2019, rumors spread that Jibo's programming would be discontinued. They turned out to be false, but many owners mourned, with some debating whether to bury their robot because they saw it as a family member.
    The Verge
  • Patents Become Involved

    Emotient, an artificial intelligence company, won a patent for its technique of gathering and labeling facial images; its technology could process up to 100,000 facial images a day. Many in the field worry that overuse of patents will stifle the field's development. Emotient was bought by Apple in January 2016.
    Reuters
  • Robot Refrigerator

    This robot refrigerator will recommend meals based on what food is inside it, what nutrients the user has been consuming, the user's current physical health, and what the user has been craving. It will track recent meals so the user isn't repeatedly eating the same dishes, and it will flag foods that are close to spoiling so there is less food waste.
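    The recommendation logic described above could be sketched like this; all recipe and inventory data here are invented, since the concept is speculative.

```python
# Sketch of a fridge recommender: suggest meals that can be made from what
# is on hand, skip recently eaten dishes, and prefer recipes that use
# soon-to-spoil items. Every name below is made up for illustration.

def recommend(recipes, stock, recent, expiring):
    """Rank makeable, non-repeated meals; expiring ingredients score higher."""
    candidates = []
    for name, ingredients in recipes.items():
        if name in recent or not set(ingredients) <= set(stock):
            continue  # skip repeats and meals we lack ingredients for
        score = sum(1 for i in ingredients if i in expiring)
        candidates.append((score, name))
    return [name for score, name in sorted(candidates, reverse=True)]

recipes = {
    "omelette": ["eggs", "cheese"],
    "salad": ["lettuce", "tomato"],
    "pasta": ["pasta", "tomato"],
}
stock = ["eggs", "cheese", "lettuce", "tomato", "pasta"]
print(recommend(recipes, stock, recent=["pasta"], expiring=["lettuce"]))
# -> ['salad', 'omelette']  (salad first: it uses the expiring lettuce)
```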
  • AI Daydreaming

    AI daydreaming works by placing a chip inside the user's brain. The chip can read the person's emotions, desires, wants, and needs. To daydream, the user simply turns the chip on by thinking about it. The chip then projects whatever the user wants to daydream about, which appears to unfold in front of and around them. It is a new immersive experience for exploring different ideas and realities.