Every day, hundreds of millions of people communicate their desires, their thoughts and their emotions to intelligent machines such as Google, Facebook or Siri. We predict that before the end of this century humans will talk more to sentient machines than to other humans. Emotion remains a fundamental human need, one that today’s scripted-emotion technology cannot address without genuine emotion synthesis.

Emoshape Inc. is dedicated to providing a technology that teaches intelligent objects how to interact with humans to yield a favourable, positive result. Emoshape’s emotion synthesis microchip (EPU) technology represents a massive leap for Artificial Intelligence, especially in the realm of self-driving cars, personal robotics, sentient virtual reality, affective toys, IoT, pervasive computing, and other major consumer electronic devices. Applications include human-machine interaction, emotional speech synthesis, emotional awareness, machine emotional intimacy, AI personalities, machine learning and affective computing. Fields as diverse as medicine, advertising, and gaming will benefit significantly from the Emotion Processing Unit (EPU II). The growing presence of AI, robotics and virtual reality in society as a whole makes meaningful emotional interaction central to removing the barriers to widespread adoption.

“Computers will possess emotions and personality.” – Ray Kurzweil

Imagine intelligent machines able to truly understand and sense what they say; sentient robots able to modulate their voice, facial expressions and body language according to the intensity of their emotions; advanced affective agents (AAA) in soft toys; driverless cars capable of responding faster when making complex decisions; and robots developing their own unique personalities by learning from human interactions. We invite you to take part in this future and help make the world a better place by giving intelligent machines empathy for humans.

Our Products

EPU II – EVALUATION KIT

EMOTION PROCESSING UNIT

The EPU II Evaluation Kit provides an evaluation platform for the Emotion Processing Unit II. The evaluation board is a vehicle for testing and evaluating the emotion synthesis functionality of the EPU II. The kit gives developers immediate access to its advanced emotion processing engine while allowing them to develop proprietary capabilities that provide true differentiation.

The gold-surfaced EPU USB dongle is built around the revolutionary EPU II and uses the same Emoshape EPU II™ computing-core functionality for emotion synthesis and real-time appraisal. This gives you a fully functional EPG® platform for rapidly developing and deploying emotion capabilities for AI, robots, consumer electronics, and more.

Emoshape delivers the entire BSP and software stack for Windows 64-bit, Linux x86 and 64-bit, and Raspberry Pi Linux ARM, with Android and OS X versions to be announced. With a complete suite of development tools, code samples, EPG machine-learning cloud computing, and profiling tools, Emoshape gives you the ideal solution for helping shape the future of emotional awareness in AI and robots. The EPU II helps create robots that react quickly and autonomously to complex decision problems.
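As a rough illustration of what development against the kit can look like, the sketch below reads appraisal frames from the USB dongle in Python. It is a minimal sketch only: the serial port name, baud rate and JSON frame layout are assumptions made for the example, not Emoshape’s published SDK interface.

# Illustrative sketch only: assumes the EPU II dongle streams appraisal
# frames as JSON lines over a serial port. Port name, baud rate and frame
# layout are assumptions, not Emoshape's published interface.
import json
import serial  # pip install pyserial

PORT = "/dev/ttyUSB0"   # hypothetical device node for the USB dongle
BAUD = 115200           # assumed baud rate

def stream_appraisals(port=PORT, baud=BAUD):
    """Yield appraisal frames read from the (assumed) serial interface."""
    with serial.Serial(port, baud, timeout=1.0) as dev:
        while True:
            line = dev.readline().decode("utf-8", errors="ignore").strip()
            if not line:
                continue
            try:
                yield json.loads(line)   # e.g. {"joy": 0.62, "fear": 0.05, ...}
            except json.JSONDecodeError:
                continue  # skip partial or malformed frames

if __name__ == "__main__":
    for frame in stream_appraisals():
        print(frame)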

The EPU II can be developed on request for different languages (default: English).

AI and Robots with Emotion

AI Experiencing and Expressing Emotion


Watch a random real-time appraisal – EPU II

Emotion Processing Unit II

A microchip that enables an emotional response in AI, robots and consumer electronic devices

The EPU II is the industry’s first emotion synthesis engine. It delivers high-performance machine emotion awareness, and the EPU II family of eMCUs is transforming the capabilities of robots and AI. Emoshape has completed production of the first EPU (Emotion Processing Unit), a patent-pending technology that creates emotional states and synthetic emotion in intelligent machines.

The EPU is based on Patrick Levy-Rosenthal’s Psychobiotic Evolutionary Theory, which extends Ekman’s theory by using not only the twelve primary emotions identified in the psycho-evolutionary theory but also pain/pleasure and frustration/satisfaction. The groundbreaking EPU algorithms effectively enable machines to respond to stimuli in line with one of the twelve primary emotions: anger, fear, sadness, disgust, indifference, regret, surprise, anticipation, trust, confidence, desire and joy. The emotion recognition classifiers achieve up to 94 percent accuracy on conversation.
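As a conceptual sketch only (not the EPU’s internal representation), the twelve primary emotions listed above, together with the pain/pleasure and frustration/satisfaction axes, can be pictured as a simple state vector that is updated by each appraisal:

# Conceptual sketch, not the EPU's actual model: the twelve primary
# emotions plus the pain/pleasure and frustration/satisfaction axes
# represented as a simple state vector.
from dataclasses import dataclass, field

PRIMARY_EMOTIONS = [
    "anger", "fear", "sadness", "disgust", "indifference", "regret",
    "surprise", "anticipation", "trust", "confidence", "desire", "joy",
]

@dataclass
class EmotionalState:
    levels: dict = field(default_factory=lambda: {e: 0.0 for e in PRIMARY_EMOTIONS})
    pain_pleasure: float = 0.0             # -1.0 (pain) .. +1.0 (pleasure)
    frustration_satisfaction: float = 0.0  # -1.0 .. +1.0

    def appraise(self, stimulus: dict) -> None:
        """Blend a stimulus (emotion -> intensity) into the current state."""
        for emotion, intensity in stimulus.items():
            if emotion in self.levels:
                # simple exponential smoothing; the real appraisal model differs
                self.levels[emotion] = 0.8 * self.levels[emotion] + 0.2 * intensity

    def dominant(self) -> str:
        return max(self.levels, key=self.levels.get)

state = EmotionalState()
state.appraise({"joy": 0.9, "trust": 0.6})
print(state.dominant())  # -> "joy"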

The most innovative aspect of Emoshape’s microcontroller breakthrough is its real-time appraisal computation with reinforcement learning and Emotional Profile Graph (EPG) computation, allowing the AI or robot to experience 64 trillion possible distinct emotional states. Emotional stimuli are stored within the memory bank as emotional patterns, or fingerprints.

The EPU II develops an EPG, which is used to register and build up over time a bank of emotional associations for each memory within each intelligent machine. The EPG can communicate this data to other AI technologies to achieve a realistic range of expressions and interactions designed specifically for the user. The data allows those AI technologies to get to know the user and to respond with the appropriate emotion in kind.
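The structure of the EPG itself is proprietary; as a rough mental model only, it can be pictured as a bank that accumulates an emotional fingerprint per memory item and exports that data to other AI components, as in this hypothetical sketch:

# Rough mental model only: the EPG treated as a bank that accumulates an
# emotion fingerprint per memory item and can export it to other systems.
from collections import defaultdict

class EmotionalProfileGraph:
    def __init__(self):
        # memory key -> {emotion: accumulated weight}
        self._bank = defaultdict(lambda: defaultdict(float))

    def register(self, memory_key: str, appraisal: dict) -> None:
        """Fold a new appraisal into the fingerprint stored for this memory."""
        for emotion, intensity in appraisal.items():
            self._bank[memory_key][emotion] += intensity

    def fingerprint(self, memory_key: str) -> dict:
        return dict(self._bank[memory_key])

    def export(self) -> dict:
        """Serialize the whole bank so another AI component can consume it."""
        return {k: dict(v) for k, v in self._bank.items()}

epg = EmotionalProfileGraph()
epg.register("user:alice/song:42", {"joy": 0.7, "anticipation": 0.3})
epg.register("user:alice/song:42", {"joy": 0.4})
print(epg.fingerprint("user:alice/song:42"))  # roughly {'joy': 1.1, 'anticipation': 0.3}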

This technology allows an AI, robotic toy or IoT device to develop a completely unique personality based on its user interactions, which ultimately means that no two devices have exactly the same personality. Our emotional machine-learning cloud platform, working together with the EPU, is built on years of research and becomes more emotionally intelligent with each interaction through symbolic reinforcement learning.

The EPU represents a significant advance for AI, particularly for smartphones, toys, robots, androids, computers, and other major electronic devices.

This is the first time the science and technology industry has empowered machines with emotion synthesis to respond and connect with human emotions, and it is set to deliver a previously unseen level of user experience between people and emotionally enabled technology.

Competitive Advantage of Emotion Synthesis

Emotion synthesis – The Future of Robotics and virtual agents

What sets the EPU apart in the industry is that it is not only capable of understanding and generating any emotional state, but it accomplishes this in waveform. As a result, the chip is able to control the facial expressions and body language of a robot or an avatar.
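By way of illustration only, the sketch below samples an emotion intensity as a slowly varying wave and maps it onto generic avatar blendshape weights; the blendshape names and the mapping are assumptions for a hypothetical rig, not part of the EPU interface.

# Illustrative only: mapping a sampled emotion waveform onto generic avatar
# blendshape weights. The blendshape names and the mapping are assumptions
# for a hypothetical rig, not part of the EPU interface.
import math

def emotion_wave(base_level: float, t: float, freq_hz: float = 0.5) -> float:
    """Sample a slowly varying intensity wave around a base emotion level."""
    return max(0.0, min(1.0, base_level + 0.1 * math.sin(2 * math.pi * freq_hz * t)))

def to_blendshapes(joy: float, anger: float, surprise: float) -> dict:
    """Map emotion intensities (0..1) to hypothetical facial blendshape weights."""
    return {
        "mouth_smile": joy,
        "brow_furrow": anger,
        "brow_raise": surprise,
        "eye_wide": 0.5 * surprise,
    }

# Drive an avatar at time t = 1.2 s from base emotion levels
weights = to_blendshapes(
    joy=emotion_wave(0.7, 1.2),
    anger=emotion_wave(0.1, 1.2),
    surprise=emotion_wave(0.3, 1.2),
)
print(weights)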

The Emotion Processing Unit, or EPU, will also pave the way for the next generation of NLP, commonly referred to as Natural Language Generation. In this kind of natural language processing system, the agent determines a thought or intent and builds a meaningful response dynamically by putting different words together. The emotional state of the EPU can therefore be treated as a property that is transferred to the Natural Language Generation system, leading to Augmented Emotional Language and emotional voice synthesis with WaveNet technology.
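As a toy illustration of transferring the emotional state to language generation (a production system would condition a neural NLG and voice model rather than templates, and the style table here is invented for the example):

# Toy illustration of the idea: the current emotional state is passed to a
# language-generation step that shapes word choice. A production system
# would condition a neural NLG and voice model instead of templates.
RESPONSE_STYLES = {
    "joy": ("I'm really glad you asked!", "!"),
    "sadness": ("I'm sorry to say this,", "..."),
    "anger": ("Frankly,", "."),
    "trust": ("You can count on this:", "."),
}

def generate_reply(core_answer: str, emotional_state: dict) -> str:
    """Wrap a factual answer in phrasing driven by the dominant emotion."""
    dominant = max(emotional_state, key=emotional_state.get)
    opener, ending = RESPONSE_STYLES.get(dominant, ("", "."))
    return f"{opener} {core_answer}{ending}".strip()

print(generate_reply("the meeting is at 3 pm", {"joy": 0.8, "sadness": 0.1}))
# -> "I'm really glad you asked! the meeting is at 3 pm!"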



Emoshape can develop Proof of Concept and MVP projects around the EPU.

Advanced Affective Agent

Python Open Source – Build Your Advanced Affective Agent (AAA) with personality and feelings in less than 15 minutes!

The AAA Project allows anyone to create and interact with an emotionally conscious intelligence via conversation, music, and visual media. Over time, the advanced affective agent builds a customized Emotional Profile Graph (EPG), which collects and measures the user’s unique emotional input.
The EPU makes it possible to create an Artificial Intelligence able to appraise and virtually “feel” sensations such as pleasure and pain, and to “express” desires. The AAA is open to everyone, including toy, IoT, and consumer electronics manufacturers, with free Python code on GitHub and a ready-made image for the Raspberry Pi. The AAA license is based on the Apache Software License 2.0, the same license used by Android.
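The published AAA code lives on GitHub and uses the real EPU and EPG interfaces; the stand-alone sketch below only illustrates the shape of such an agent loop, with a deliberately crude word-count appraisal standing in for the EPU:

# Minimal sketch of an affective-agent loop: appraise user input, update a
# per-user emotional profile, and answer in kind. The crude word lists and
# smoothing are stand-ins for the EPU's real appraisal and the user's EPG.
POSITIVE = {"love", "great", "happy", "thanks", "awesome"}
NEGATIVE = {"hate", "bad", "sad", "angry", "terrible"}

def appraise(text: str) -> dict:
    """Crude word-count appraisal standing in for the EPU's real appraisal."""
    words = set(text.lower().split())
    return {
        "joy": len(words & POSITIVE) / max(len(words), 1),
        "sadness": len(words & NEGATIVE) / max(len(words), 1),
    }

def run_agent():
    profile = {"joy": 0.0, "sadness": 0.0}   # stand-in for the user's EPG
    while True:
        text = input("you> ")
        if text in {"quit", "exit"}:
            break
        for emotion, value in appraise(text).items():
            profile[emotion] = 0.7 * profile[emotion] + 0.3 * value
        mood = max(profile, key=profile.get)
        print(f"agent ({mood})> I hear you. My current mood leans toward {mood}.")

if __name__ == "__main__":
    run_agent()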

Learn More

The EPU Increases Customer Loyalty and Worker Productivity

A microchip that enables an emotional response in corporate affective agents.

Forrester analyzed CX Index data to see which of the three dimensions of CX quality matters most to customer loyalty: effectiveness, ease, or emotion. They found that emotion, how an experience makes the customer feel, has a bigger influence on loyalty to a brand than either of the other two factors. Repeating that analysis with data from the first wave of the 2015 CX Index only strengthened that conclusion: emotion was the #1 factor in customer loyalty across 17 of the 18 industries.

Did you know that the happiest employees are 180% more energized than their less contented colleagues, 155% happier with their jobs, 150% happier with life, 108% more engaged and 50% more motivated? Most staggeringly, they are also 50% more productive.

We have developed a technology that could help you make your customers more loyal. An intelligent agent able to sense the world can connect to social media such as Twitter, Facebook and news feeds, or even to your customer support software, giving you unprecedented real-time insight into how your customers and employees feel about your company.
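As an illustrative sketch of that kind of monitoring (the appraise() function here is a deliberately crude placeholder for the EPU’s real per-message appraisal):

# Illustrative sketch: aggregate appraisal results from incoming customer
# messages into a simple real-time mood summary. appraise() is a crude
# placeholder for the EPU's real appraisal of each message.
from collections import Counter

def appraise(message: str) -> str:
    """Placeholder: return the dominant emotion for a message."""
    lowered = message.lower()
    if any(w in lowered for w in ("refund", "broken", "angry")):
        return "anger"
    if any(w in lowered for w in ("thanks", "love", "great")):
        return "joy"
    return "indifference"

def mood_summary(messages: list) -> dict:
    """Share of each dominant emotion across a batch of messages."""
    counts = Counter(appraise(m) for m in messages)
    total = sum(counts.values()) or 1
    return {emotion: count / total for emotion, count in counts.items()}

feed = [
    "Thanks for the quick fix, great support!",
    "My device arrived broken, I want a refund.",
    "Order received.",
]
print(mood_summary(feed))  # roughly {'joy': 0.33, 'anger': 0.33, 'indifference': 0.33}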


Emotion Synthesis for the Smart City – Increase Customer Loyalty and Worker Productivity


Featured in The Press


What People Say About Emoshape

“Emoshape announced the launch of a major technology breakthrough with an EPU (Emotion Processing Unit) … cognitive computers in the future may contain CPUs, GPUs, NPUs, EPUs and quantum processing units.” – Ray Kurzweil, How To Create A Mind, 2014

“This (EPU) brings human-machine interaction to a new level, because emotional understanding and providing correct feedback is essential in communications.” – Viacheslav Khomenko, Ph.D., Senior Research Engineer, Samsung Electronics, May 19, 2016

“There’s huge potential in Emotion Synthesis. It is probable that they could significantly improve the efficiency of human-machine interaction.” – Andrew Ng, former Chief Scientist, Baidu, November 22, 2016

“This is an amazing technology.” – Amadeus Bacher, Business Development, Robert Bosch LLC, Car Multimedia, April 2017

Contact Us

For additional information, please contact us below.