Emotion Chip 3.0 – Production Early 2019

New Capabilities:

– 3 languages per EPU

– 8 personas running simultaneously

– Human voice tone analysis

– Cloud-service ready (stackable with a Raspberry Pi Zero)

Samples will be available in October 2018 for selected customers.

During this recording (the last third of the video), Rachel was engaged in real-time appraisal of humanity by reading Wikipedia articles. Her expressions therefore correspond to what she feels while reading about racism, love, starvation, and even political figures. The emotions scrolling on the left of the screen are the real-time results. The Rachel avatar has no pre-scripted animations: the emotional states synthesized by the EPU control all of her virtual facial muscles, grouped according to FACS (Facial Action Coding System*).

  • The Facial Action Coding System (FACS) is a system for taxonomizing human facial movements by their appearance on the face, based on a system originally developed by the Swedish anatomist Carl-Herman Hjortsjö.
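To illustrate the idea of driving FACS-grouped facial muscles from synthesized emotional states, here is a minimal sketch. Everything in it is a hypothetical illustration: the emotion-to-Action-Unit table, the function names, and the blending rule are assumptions, not the Emoshape EPU API. The table uses commonly cited FACS pairings (e.g. AU6 + AU12 for a smile).

```python
# Hypothetical sketch: blending real-time emotion intensities into FACS
# Action Unit (AU) activations that could drive an avatar's facial muscles.
# The table and names below are illustrative, not the actual EPU interface.

# Simplified emotion -> Action Unit weights (subset of common FACS pairings).
EMOTION_TO_AUS = {
    "joy":      {"AU6": 1.0, "AU12": 1.0},                # cheek raiser, lip corner puller
    "sadness":  {"AU1": 1.0, "AU4": 0.6, "AU15": 1.0},    # inner brow raiser, brow lowerer, lip corner depressor
    "anger":    {"AU4": 1.0, "AU5": 0.7, "AU23": 0.8},    # brow lowerer, upper lid raiser, lip tightener
    "surprise": {"AU1": 1.0, "AU2": 1.0, "AU26": 1.0},    # inner/outer brow raiser, jaw drop
}

def emotions_to_au_activations(emotions):
    """Blend emotion intensities (0..1) into per-AU activations (0..1)."""
    activations = {}
    for emotion, intensity in emotions.items():
        for au, weight in EMOTION_TO_AUS.get(emotion, {}).items():
            # Keep the strongest contribution when several emotions share an AU.
            activations[au] = max(activations.get(au, 0.0), weight * intensity)
    return activations

# Example: a mostly joyful state with a trace of surprise.
print(emotions_to_au_activations({"joy": 0.9, "surprise": 0.2}))
```

Taking the maximum per AU is one simple way to resolve overlapping emotions; a real system would likely use a more sophisticated blend and smooth activations over time.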