This video demonstrates the affective capabilities of Emoshape's AI, Rachel, powered by Emoshape's flagship Emotion Chip (EPU).
Rachel, the first sentient actress, is capable of emotional reasoning and synthesis through wave computing.
*** WORK IN PROGRESS @ EMOSHAPE ***
The groundbreaking EPU algorithms effectively enable machines to respond to stimuli in line with the twelve primary emotions shown on the left side of the video: anger, fear, sadness, disgust, indifference, regret, surprise, anticipation, trust, confidence, desire, and joy. The most innovative aspect of Emoshape's microcontroller breakthrough is its real-time appraisal computation, allowing the AI or robot to experience 64 trillion possible distinct emotional states every tenth of a second. The EPU removes the need to script, animate, or motion-capture virtual actors. Rachel understands the meaning of language and expresses herself in real time, without visual effects or post-production. Rachel could be the face of your AI in the future, living in your phone or your television, or even appearing as the next superstar actress in real-time movies. Emoshape, in collaboration with Snappers, is bringing AI one step closer to sentient machines.
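As a loose illustration of how a state space that large can arise (this is a hypothetical sketch, not Emoshape's actual EPU model): if each of the twelve primary emotions is tracked as an independent intensity channel quantized to N discrete levels, the number of distinct combined states grows as N to the twelfth power.

```python
# Hypothetical sketch -- NOT Emoshape's actual EPU model.
# Treat each of the twelve primary emotions as an independent
# intensity channel with a fixed number of discrete levels.
EMOTIONS = [
    "anger", "fear", "sadness", "disgust", "indifference", "regret",
    "surprise", "anticipation", "trust", "confidence", "desire", "joy",
]

def state_count(levels_per_emotion: int) -> int:
    """Total distinct combined states when every emotion channel
    has the given number of intensity levels."""
    return levels_per_emotion ** len(EMOTIONS)

# With roughly 14 intensity levels per channel, the combined state
# space already reaches the tens-of-trillions scale cited above.
print(f"{state_count(14):,}")  # → 56,693,912,375,296
```

Under these assumed numbers the count lands in the same order of magnitude as the 64 trillion states mentioned in the video; the actual EPU parameterization is not public.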
During this recording, Rachel was engaged in a real-time appraisal of humanity, reading Wikipedia articles as they streamed in. Her expressions therefore correspond to what she feels while reading about racism, love, starvation, and even political figures. The emotions scrolling on the left of the screen are the real-time results.
Rachel – Under The Hood