Affective Computing Timeline

  • Oldowan tools

    Oldowan tools
    Used for knapping and cutting, these stone tools, found in Tanzania in the 1930s, are seen as some of the oldest flaked tools; the tools found there are estimated to be about 1.7 million years old. While there was no written language at the time, Homo habilis was able to pass on the knowledge of how to make and use Oldowan tools through nonverbal cues, facial expressions, and visual demonstration. This discovery shows just how important technology was in comparison to other developments such as language.
  • First AI programs

    First AI programs
    In 1956, the Logic Theorist was created, followed in 1957 by the General Problem Solver. The Logic Theorist, written by Allen Newell and Herbert Simon, was able to prove theorems of symbolic logic. The General Problem Solver, also built by Newell and Simon, was designed to solve almost any well-defined problem through means-ends analysis. Millions of dollars were poured into programs such as these after World War II.
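The core idea of means-ends analysis is to repeatedly compare the current state with the goal and apply whichever operator most reduces the difference. A minimal sketch of that loop, using a toy number domain and made-up operators (the states, operators, and greedy strategy here are illustrative assumptions, not the General Problem Solver's actual implementation):

```python
# Toy means-ends analysis: states are integers, the "difference" is the
# numeric distance to the goal, and we greedily apply the operator that
# shrinks that difference the most.

def means_ends(start, goal, operators):
    """Return a list of operator names that transforms start into goal."""
    state, plan = start, []
    while state != goal:
        # Evaluate every operator and pick the result closest to the goal.
        name, result = min(
            ((name, op(state)) for name, op in operators.items()),
            key=lambda pair: abs(goal - pair[1]),
        )
        if abs(goal - result) >= abs(goal - state):
            raise ValueError("no operator reduces the difference")
        state, plan = result, plan + [name]
    return plan

# Hypothetical operators for the toy domain.
ops = {"add3": lambda s: s + 3, "double": lambda s: s * 2, "sub1": lambda s: s - 1}
print(means_ends(2, 12, ops))  # -> ['add3', 'double', 'add3', 'sub1']
```

This greedy version can get stuck where the real technique would recurse on subgoals, but it captures the compare-difference-then-act cycle the entry describes.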
  • Clippit

    Clippit
    When the Office Assistant was first created, a default animated character called Clippit was included. Clippit was supposed to be a helpful tool for writing and other tasks in Microsoft Office. But Clippit, widely known as Clippy, ended up not being as helpful as hoped because of the inconvenient times it appeared on the user's screen while they were working. Through the downfall of Clippy, researchers were able to see ways the Office Assistant's emotional intelligence needed to improve, which helped future AI.
  • MindReader

    MindReader
    A software program called MindReader was created by Rana el Kaliouby in 2004. It reads human facial expressions and then gives feedback. The program was built by training it on a variety of different expressions: by labeling the common points in the face, it learned to detect facial signals in new footage.
    http://trac.media.mit.edu/mindreader/
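The training idea described above, labeling common facial points for each expression and then classifying new faces from those points, can be sketched with a nearest-centroid classifier. The landmark layout, the two expression classes, and the synthetic data below are all assumptions for illustration; MindReader's actual models were far more sophisticated:

```python
import numpy as np

rng = np.random.default_rng(0)

# Pretend each face is summarized by 10 (x, y) landmark points -> 20 numbers.
# Two hypothetical expression classes: 0 = "neutral", 1 = "smiling".
neutral = rng.normal(0.0, 0.1, size=(50, 20))   # synthetic training faces
smiling = rng.normal(0.5, 0.1, size=(50, 20))

# "Training": store the average landmark layout (centroid) per expression.
centroids = np.stack([neutral.mean(axis=0), smiling.mean(axis=0)])

def classify(face):
    """Label a face by whichever expression centroid its landmarks are nearest."""
    dists = np.linalg.norm(centroids - face, axis=1)
    return int(np.argmin(dists))

new_face = rng.normal(0.5, 0.1, size=20)  # an unseen "smiling" face
print(classify(new_face))  # -> 1
```

The point of the sketch is only the pipeline: labeled facial points in, an expression label out for faces the system has never seen.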
  • Facebook- DeepFace

    Facebook- DeepFace
    DeepFace was introduced in 2014, as a variety of pattern-recognition work was being improved. This technology, created by a group within Facebook, is used to identify faces online. The program has a roughly 97% accuracy rate, and researchers have found that DeepFace's algorithms are almost as accurate as humans at facial recognition. DeepFace uses simulated neural networks to turn each face into a numerical description, which is then used to identify faces.
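The last sentence, a neural network turning each face into a numerical description, is the key mechanism: two photos of the same person should get similar descriptor vectors. A minimal sketch of that comparison step, with hand-picked stand-in vectors and a made-up threshold in place of a real network's output:

```python
import numpy as np

# Stand-in "descriptors": in a real system a neural network maps each face
# image to a vector like these; the numbers here are invented for illustration.
face_a = np.array([0.90, 0.10, 0.30, 0.70])   # person X, photo 1
face_b = np.array([0.85, 0.15, 0.25, 0.72])   # person X, photo 2
face_c = np.array([0.10, 0.90, 0.80, 0.05])   # a different person

def cosine_similarity(u, v):
    """Similarity of two descriptors; values near 1.0 suggest the same face."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

THRESHOLD = 0.9  # hypothetical decision threshold
print(cosine_similarity(face_a, face_b) > THRESHOLD)  # True: same person
print(cosine_similarity(face_a, face_c) > THRESHOLD)  # False: different people
```

All the hard work in the real system is in learning descriptors where this simple distance test is reliable; the comparison itself stays this cheap.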
  • Self-Driving Cars

    Self-Driving Cars
    Researchers believe that by 2030 cars will be fully automated. Technology already being put in cars today, such as sensors, speed assistance, and lane-keeping systems, brings us closer to self-driving cars. This could then lead to new forms of robotic transportation such as robotaxis.
    https://www.forbes.com/sites/johnkoetsier/2019/04/06/self-driving-cars-in-10-years-eu-expects-fully-automated-cars-by-2030/?sh=25c064ba615b
  • Personal Health Pods

    Personal Health Pods
    By 2100, it is believed that upper-class people will be able to have their own full-body scan pods. These would allow faster diagnosis and treatment of conditions that could once only be handled in a clinical setting. The pods would perform a 3D analysis of different regions of the body, compare it with the person's past scans, and even recommend medication.
    https://www.futurebusinesstech.com/blog/the-world-in-2100-top-10-future-technologies