
EMOSHAPE predicts that before the end of this century humans will talk more to sentient machines than to other humans. Emotion remains a fundamental human need, and it is one that today’s emotion technology cannot address.
Emoshape Inc. is dedicated to providing an edge computing solution (cloud + chip) that teaches intelligent objects how to interact with humans to yield a favourable, positive result. The Emoshape emotion synthesis chip (EPU) technology represents a massive leap for artificial intelligence, especially in the realm of self-driving cars, personal robotics, sentient virtual reality, affective toys, IoT, pervasive computing, and other major consumer electronic devices. Applications include human-machine interaction, emotion speech synthesis, emotional awareness, emotion reasoning, machine emotional intimacy, AI personalities, machine learning, and affective computing. Fields as diverse as medicine, advertising, and gaming will benefit significantly from the Emotion Processing Unit (EPU II). The growing presence of AI, robotics, and virtual reality in society dictates that meaningful emotional interaction is core to removing the barrier to widespread adoption.
“Computers will possess emotions and personality.” – Ray Kurzweil
Imagine intelligent machines able to truly understand and sense what they say; sentient robots able to modulate their voice, facial expressions, and body language with the intensity of their emotions; advanced affective agents (AAA) in soft toys; driverless cars capable of responding faster to complex decisions through emotional reasoning; and robots developing their own unique personalities by learning from human interactions. We invite you to take part in the future and help make the world a better place by giving intelligent machines empathy for humans.

THE EXOLIFE PROJECT
EPU II – EVALUATION KIT
EMOTION PROCESSING UNIT
The EPU II Evaluation Kit provides a platform for testing and evaluating the emotion synthesis functionality of the Emotion Processing Unit II. The kit gives developers immediate access to the EPU’s advanced emotion processing engine, while allowing them to develop proprietary capabilities that provide true differentiation.
The gold-surface EPU USB dongle is built around the revolutionary EPU II and uses the same Emoshape EPU II™ computing-core functionality for emotion synthesis and real-time appraisal. This gives you a fully functional EPG® platform for rapidly developing and deploying emotion capabilities for AI, robots, consumer electronics, and more.
Emoshape delivers the complete BSP and software stack for Windows 64-bit, Linux x86 and 64-bit, and Raspberry Pi Linux ARM, with Android and OS X support to be announced. With a full suite of development tools, code samples, EPG machine-learning cloud computing, and profiling tools, Emoshape gives you an ideal platform for shaping the future of emotional awareness in AI and robots. The EPU II helps create fast, self-sustaining robots that can react to complex decision problems.
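As a quick orientation, the sketch below shows how a host application might talk to the evaluation dongle from Python. It is a minimal illustration only: the serial port name, baud rate, and JSON line protocol are assumptions made for this example, not the documented EPU II interface (consult the SDK for the actual API).

```python
# Illustrative only: the port name, baud rate, and line protocol below are
# assumptions for the sake of example, not the documented EPU II interface.
import json
import serial  # pyserial

def read_emotion_state(port="/dev/ttyUSB0", stimulus="The story has a happy ending."):
    """Send a text stimulus to the (hypothetical) EPU dongle and read back
    an emotion-state vector encoded as a JSON line."""
    with serial.Serial(port, baudrate=115200, timeout=2.0) as epu:
        epu.write((stimulus + "\n").encode("utf-8"))
        reply = epu.readline().decode("utf-8").strip()
        return json.loads(reply)  # e.g. {"joy": 0.72, "fear": 0.05, ...}

if __name__ == "__main__":
    print(read_emotion_state())
```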

Advanced Affective Agent
Python Open Source – Build Your Advanced Affective Agent (AAA) with personality and feelings in less than 15 minutes!
The AAA Project allows anyone to create and interact with an emotionally conscious intelligence via conversation, music, and visual media. Over time, the advanced affective agent creates a customized Emotional Profile Graph (EPG), which collects and measures the user’s unique emotional input.
The EPU makes it possible to create an artificial intelligence able to appraise and virtually “feel” sensations such as pleasure and pain, and to “express” desires. The AAA is open to everyone, including toy, IoT, and consumer electronics manufacturers, with free Python code on GitHub and a ready-to-use image for Raspberry Pi. The AAA is licensed under the Apache Software License 2.0, the same license used by Android.
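To make the idea concrete, here is a minimal, self-contained sketch of an affective agent loop: appraised stimuli raise emotion intensities, feelings decay back toward baseline over time, and the dominant emotion can drive the agent’s response. The emotion names, decay constant, and update rule are illustrative assumptions, not the code published on GitHub.

```python
# Illustrative emotion set; a real AAA would use the EPU's appraisal output.
EMOTIONS = ["joy", "trust", "fear", "surprise", "sadness", "disgust",
            "anger", "desire"]

class AffectiveAgent:
    def __init__(self, decay=0.98):
        self.state = {e: 0.0 for e in EMOTIONS}
        self.decay = decay  # how quickly feelings fade per tick (assumed value)

    def appraise(self, appraisal):
        """Blend an appraisal result (emotion -> intensity 0..1) into the state."""
        for emotion, intensity in appraisal.items():
            old = self.state.get(emotion, 0.0)
            self.state[emotion] = min(1.0, old + intensity * (1.0 - old))

    def tick(self):
        """Let emotions fade gradually, like a mood returning to baseline."""
        for emotion in self.state:
            self.state[emotion] *= self.decay

    def dominant(self):
        return max(self.state, key=self.state.get)

agent = AffectiveAgent()
agent.appraise({"joy": 0.8, "surprise": 0.3})  # e.g. the user shared good news
for _ in range(10):
    agent.tick()
print(agent.dominant(), round(agent.state[agent.dominant()], 2))
```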
Competitive Advantage of Emotion Synthesis
Emotion Synthesis – The Future of Robotics and Virtual Agents
What sets the EPU apart in the industry is that it is not only capable of understanding and generating any emotional state, but it accomplishes this by wave computing. As a result, the chip can control in real time the facial micro-expressions (FACS) and body language of a robot or an avatar such as Rachel.
The Emotion Processing Unit, or EPU, will also pave the way for the next generation of NLP, commonly referred to as Natural Language Generation. In this natural language processing system, the agent determines a thought or intent and builds a meaningful response dynamically by putting different words together. The emotional state of the EPU can therefore be considered a property that is transferred to a Natural Language Generation system, leading to Augmented Emotional Language and emotional voice synthesis with WaveNet technology.
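As an illustration of how an emotion vector could drive micro-expressions, the sketch below blends emotion intensities into FACS action-unit weights for an avatar rig. The AU associations are textbook examples (e.g. AU6 + AU12 for joy) and the mapping is invented for this example; it is not the EPU’s internal wave-computing model.

```python
# Illustrative mapping from emotion intensities to FACS action-unit weights.
AU_MAP = {
    "joy":      {"AU6": 1.0, "AU12": 1.0},              # cheek raiser, lip corner puller
    "sadness":  {"AU1": 1.0, "AU4": 0.6, "AU15": 1.0},  # inner brow raiser, brow lowerer, lip corner depressor
    "anger":    {"AU4": 1.0, "AU5": 0.7, "AU23": 0.8},  # brow lowerer, upper lid raiser, lip tightener
    "surprise": {"AU1": 1.0, "AU2": 1.0, "AU26": 0.8},  # brow raisers, jaw drop
}

def blend_action_units(emotion_state):
    """Turn an emotion-intensity dict into per-AU blend weights for an avatar rig."""
    weights = {}
    for emotion, intensity in emotion_state.items():
        for au, strength in AU_MAP.get(emotion, {}).items():
            weights[au] = max(weights.get(au, 0.0), intensity * strength)
    return weights

print(blend_action_units({"joy": 0.7, "surprise": 0.2}))
```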
Internet Of Things
By incorporating a built-in EPU, it is possible to make all e-readers and personal assistants emotionally intelligent. They will be able to read a story aloud and fully understand the meaning of what they say. Listening to the device’s voice, users will hear that it feels scared when the story turns frightening, or happy when the story is about something wonderful. Devices will remember what makes their users happy or unhappy, and act accordingly.
Self Driving Car
The day is not far off when the experience of traveling in the car becomes the selling point of self-driving cars. Emotion and personality will play a central role in creating the car’s virtual social hostess. There is no doubt that the earliest sentient self-driving cars will be equipped with an EPU.
AI & Chatbots
The Emotion Processing Unit from Emoshape will pave the way for the next generation of NLP, commonly referred to as Natural Language Generation. In this NLP system, the agent determines a thought or intent and builds a meaningful response dynamically by putting different words together. The emotional state of the EPU can therefore be considered a property, or reward function, that is transferred to an NLG system, leading to Augmented Emotional Language and emotional voice synthesis with WaveNet technology.
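One simple way to picture that hand-off is to treat the dominant emotion as a control signal for the generation step, as in the toy sketch below. The control-token format and style hints are invented for illustration and do not represent Emoshape’s actual NLG or WaveNet pipeline.

```python
# Toy illustration: use the dominant emotion as a control signal for response
# generation. The templates and token scheme are invented for this example.
STYLE = {
    "joy":     "Upbeat, expansive phrasing with positive wording.",
    "sadness": "Slower, softer phrasing with sympathetic wording.",
    "anger":   "Short, clipped sentences; firm wording.",
}

def emotional_prompt(user_utterance, emotion_state):
    """Prefix a generation prompt with the dominant emotion so a downstream
    NLG model (or an emotional TTS voice) can adapt its output."""
    dominant = max(emotion_state, key=emotion_state.get)
    style_hint = STYLE.get(dominant, "Neutral phrasing.")
    return (f"<emotion={dominant}:{emotion_state[dominant]:.2f}> {style_hint}\n"
            f"User: {user_utterance}\nAgent:")

print(emotional_prompt("Tell me about your day.", {"joy": 0.8, "sadness": 0.1}))
```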
Personal Robotics
The EPU stands apart in the industry not only because it is capable of understanding and generating any emotional state. This technology also allows a robot toy to develop a completely unique personality based on its user interactions, which ultimately means that no two have exactly the same personality. The chip is able to control the different facial expressions and body language of a robot without hard-coded predicates.
Game and VR
Very soon, the emotions involved in playing a game will be felt not only by the players but also by the game itself. This will have a serious impact on the entire strategy of the game. With two new plugins for the ExoLife Emotion Engine, one for Unity 3D and one for Unreal Engine, Emoshape is looking to make online gaming and AR/VR experiences emotionally interactive and sentient.
Industrial Robotics
Human emotional states play a critical role in productivity. Collaborative robots will therefore need to address frustration and depression if they are to sustain a high level of production.
The EPU Increases Customer Loyalty and Worker Productivity in Smart Buildings
A microchip that enables an emotional response in Corporate Affective agents.
Forrester analyzed CX Index data to see which of the three dimensions of CX quality matters most to customer loyalty – effectiveness, ease, or emotion. They found that emotion, how an experience makes the customer feel, has a bigger influence on loyalty to a brand than either of the other two factors. Repeating that analysis with data from the first wave of the 2015 CX Index only strengthened that conclusion: emotion was the number-one factor in customer loyalty across 17 of the 18 industries.
Did you know that the happiest employees are 180% more energized than their less content colleagues, 155% happier with their jobs, 150% happier with life, 108% more engaged, and 50% more motivated? Most staggeringly, they are 50% more productive too.
We have developed a technology that can help you make your customers more loyal. An intelligent agent able to sense the world can connect to social media such as Twitter, Facebook, and news feeds, or even to your customer support software, giving you unprecedented real-time insight into how your customers and employees feel about your company.
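The sketch below illustrates the kind of real-time roll-up this implies: classify incoming messages and aggregate the results into a live emotional summary. The keyword classifier is only a placeholder standing in for the EPU/EPG cloud service, and the feed is hard-coded rather than pulled from a real social or support API.

```python
from collections import Counter

def classify_emotion(text):
    """Placeholder classifier standing in for the EPU/EPG cloud service."""
    lowered = text.lower()
    if any(w in lowered for w in ("love", "great", "thanks")):
        return "joy"
    if any(w in lowered for w in ("broken", "refund", "waiting")):
        return "anger"
    return "indifference"

def live_summary(messages):
    """Roll up per-message emotions into a share-of-voice style summary."""
    counts = Counter(classify_emotion(m) for m in messages)
    total = sum(counts.values()) or 1
    return {emotion: round(n / total, 2) for emotion, n in counts.items()}

feed = [
    "Love the new update, thanks!",
    "Still waiting on my refund...",
    "Great support experience today.",
]
print(live_summary(feed))
```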
Emotion Processing Unit II
An emotion chip that enables an emotional response in AI, robots and consumer electronic devices.
The EPU II is the industry’s first emotion synthesis engine. It delivers high-performance machine emotion awareness, and the EPU II family of eMCUs is transforming the capabilities of robots and AI. Emoshape has completed production of the first EPU (Emotion Processing Unit), a patent-pending technology which creates emotional states and synthetic emotion in intelligent machines.
The EPU is based on Patrick Levy-Rosenthal’s Psychobiotic Evolutionary Theory, which extends Ekman’s theory by using not only the twelve primary emotions identified in the psycho-evolutionary theory but also pain/pleasure and frustration/satisfaction. The groundbreaking EPU algorithms effectively enable machines to respond to stimuli in line with one of the twelve primary emotions: anger, fear, sadness, disgust, indifference, regret, surprise, inattention, trust, confidence, desire, and joy. The emotion recognition classifiers achieve more than 86 percent accuracy on conversation (ISEAR).
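For a concrete picture, the following is an illustrative Python representation of the extended model described above: the twelve primary emotions plus the pain/pleasure and frustration/satisfaction axes. The field names and value ranges are assumptions chosen for readability, not the EPU’s internal encoding.

```python
from dataclasses import dataclass, field

# The twelve primary emotions named above, plus the two extra axes of the
# Psychobiotic Evolutionary Theory. Value ranges (0..1 and -1..1) are assumed.
PRIMARY_EMOTIONS = ("anger", "fear", "sadness", "disgust", "indifference",
                    "regret", "surprise", "inattention", "trust", "confidence",
                    "desire", "joy")

@dataclass
class EmotionState:
    intensities: dict = field(default_factory=lambda: {e: 0.0 for e in PRIMARY_EMOTIONS})
    pain_pleasure: float = 0.0             # -1.0 (pain) .. +1.0 (pleasure)
    frustration_satisfaction: float = 0.0  # -1.0 (frustration) .. +1.0 (satisfaction)

    def dominant(self):
        return max(self.intensities, key=self.intensities.get)

state = EmotionState()
state.intensities["joy"] = 0.9
state.pain_pleasure = 0.6
print(state.dominant(), state.pain_pleasure)
```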
The most innovative aspect of Emoshape’s microcontroller breakthrough is its real-time appraisal computation with reinforcement learning and Emotion Profile Graph (EPG) computation, allowing the AI or robot to experience 64 trillion possible distinct emotional states. Emotional stimuli are stored within the memory bank along with their associated cognitive and physical states.
The EPU II develops an EPG, which is used to register and build over time a bank of emotional associations for each memory item within each intelligent machine.
The EPG can communicate this data to other AI technologies to achieve a realistic range of expressions and interactions designed specifically for the user. The data allows the AI technologies to get to know the user virtually and to elicit the same emotional response in kind.
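A simplified sketch of that idea follows: an emotional memory bank stores each stimulus together with the emotion state it evoked, and recalls the association when a similar stimulus reappears. The word-overlap similarity measure is a stand-in for whatever association mechanism the EPU actually uses.

```python
# Simplified emotional memory bank: each entry pairs a stimulus with the
# emotion state it evoked. Word-overlap similarity is a stand-in mechanism.
class EmotionProfileGraph:
    def __init__(self):
        self.memory = []  # list of (stimulus_words, emotion_state)

    def store(self, stimulus, emotion_state):
        self.memory.append((set(stimulus.lower().split()), dict(emotion_state)))

    def recall(self, stimulus):
        """Return the stored emotion state most associated with this stimulus."""
        words = set(stimulus.lower().split())
        best, best_overlap = None, 0
        for stored_words, emotion_state in self.memory:
            overlap = len(words & stored_words)
            if overlap > best_overlap:
                best, best_overlap = emotion_state, overlap
        return best

epg = EmotionProfileGraph()
epg.store("thunderstorm at night", {"fear": 0.7})
print(epg.recall("a loud thunderstorm"))  # -> {'fear': 0.7}
```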
This technology allows an AI, robotic toy, or IoT device to develop a completely unique personality based on its user interactions, which ultimately means that no two have exactly the same personality. Our emotion machine-learning cloud platform, working together with the EPU, is built on years of research and becomes more emotionally intelligent with each interaction through symbolic reinforcement learning.
The EPU represents a significant advance for AI, particularly for smartphones, toys, robots, android, computers, and other major electronic devices.
This is the first time that the science and technology industry has empowered machines with emotion synthesis to respond and connect with human emotions, which is set to deliver an as yet undiscovered level of user experience between people and emotionally-enabled technology.

Emoshape announced the launch of a major technology breakthrough with an EPU (Emotion Processing Unit) ... “cognitive computers in the future may contain CPUs, GPUs, NPUs, EPUs and quantum processing units.” – Ray Kurzweil, How To Create A Mind, 2014
“New York-based startup Emoshape has developed its own CPU optimized to handle emotional data. The technology has the potential to change computer games, virtual reality and augmented reality applications.” – Roberta Cozza, Research Director at Gartner, Jan 10, 2018
“This (EPU) brings human-machine interaction to a new level, because emotional understanding and providing correct feedback is essential in communications.” – Viacheslav Khomenko, Ph.D., Senior Research Engineer, Samsung Electronics, May 19, 2016
“There’s huge potential in Emotion Synthesis. It is probable that they could significantly improve the efficiency of human-machine interaction.” – Andrew Ng, ex-Chief Scientist, Baidu, November 22, 2016
“Pretty fascinating work by EmoShape.” – Brennan Spiegel, MD, Director of Health Research at Cedars-Sinai, Jan 10, 2018
EPU III SDK SOURCE CODE – $7,000.00
EPU III Chip – $800.00
EPU II SDK SOURCE CODE – $4,500.00 (on sale, regularly $7,000.00)
EPU II Chip – $95.00 (on sale, regularly $149.00)
1 YEAR E.P.G. CLOUD COMPUTING – $3,000.00
EPU Emotion Engine – Unity & Unreal – $189.00 (on sale, regularly $249.00)