The MetaSoul EPU can synthesize the emotional levels of individuals in real time, responding to twelve
primary emotions:
anger, fear, sadness, disgust, indifference, regret, surprise, anticipation, trust, confidence, desire, and joy,
using psychometric functions that shape and react without relying on pre-programmed sets of inputs. The EPU III is
the industry's first emotion synthesis engine. Delivering high-performance machine emotion awareness, the EPU III
family of eMCUs is transforming the capabilities of robots and AI. MetaSoul has completed production of the
first EPU (Emotion Processing Unit), a patent-pending technology that creates a synthesized emotional response
in machines.
Benefits
The EPU III Evaluation Kit provides an evaluation platform for the Emotion Processing Unit III. The
evaluation board is a vehicle to test and evaluate the emotion synthesis functionality of the EPU III. The kit gives
developers immediate access to its advanced emotion processing engine, while allowing them to develop proprietary
capabilities that provide true differentiation.
The EPU USB dongle (gold surface) and the Cloud EPU are both built around the revolutionary EPU III and use the same MetaSoul EPU III™
computing core for emotion synthesis. This gives you a fully functional EPG® platform for quickly
developing and deploying emotion capabilities for AI, robots, consumer electronics, and more.
MetaSoul delivers the entire BSP and software stack. With a complete suite of development tools, code samples,
EPG machine learning cloud computing, and profiling tools, MetaSoul gives you the ideal solution for helping shape the
future of emotional awareness in AI and robots.
Features
THE EPU3 CAN PROCESS UP TO 8 PERSONAS IN REAL-TIME
THE EPU3 SUPPORTS UP TO 3 LANGUAGES
THE EPU3 EMOTION SYNTHESIS ENGINE OUTPUTS UP TO 64 TRILLION POSSIBLE EMOTIONAL STATES EVERY 1/10th OF A SECOND
THE EPU3 ENGINE FOR EMOTION REASONING SYNTHESIZES: Anger, Fear, Sadness, Disgust, Indifference, Regret,
Surprise, Inattention, Trust, Confidence, Desire, Joy, Frustration, Satisfaction, Pain and Pleasure FOR
THE MACHINE ITSELF
THE EPU3 IDENTIFIES IN SEMANTICS: Anger, Fear, Sadness, Disgust, Indifference, Regret, Surprise,
Inattention, Trust, Confidence, Desire, and Joy.
THE EPU3 SDK DETECTS IN REAL-TIME 5 EMOTIONAL STATES FROM THE TONE OF VOICE: Happy, Sad, Anger, Fear,
Neutrality
THE EPU3 SDK DETECTS IN REAL-TIME 6 EMOTIONAL STATES (Happy, Sad, Anger, Surprised, Fear, Disgust) PLUS
GENDER, AGE, AND FACE RECOGNITION FROM FACIAL ANALYSIS.
THE EPU RETURNS A BUFFER OF 124 BYTES IN A PACKET (MULTIDIMENSIONAL ARRAY OF DATA)
12 PRIMARY HUMAN EMOTION LEVELS. AMPLITUDE: 0-100 (RESOLUTION 9 SUB CHANNELS PER EMOTION)
12 PRIMARY HUMAN FEELING LEVELS. AMPLITUDE: 0-100 (RESOLUTION 1 CHANNEL)
This SDK is for personal and commercial projects. Limitations on transfer and resale of data: your limited
license does not allow you to transfer or resell any data from the Emotion Processing Unit buffer, for example,
but not exclusively, in a client-server configuration.
Quick Start - (5 minutes) SDK with EPU Dongle
Physically connect the EPU III USB evaluation board to your system with the Micro USB cable provided.
Download the SDK from the top right side of this page (download icon), then extract
the SDK. Inside you will find sub-directories for your operating system, i.e. Windows, Linux x86, Linux ARM
(Raspberry Pi), Android, etc.
Find the EPU_II_SDK binary in the sub-directory and launch it, or copy the installation file
(for example the Android .cab) onto your device and install it. (Check the README file for special instructions.)
Launch the EPU_II_SDK binary file and make sure you are connected to the web (corporate firewalls must
authorise inbound and outbound traffic to 45.55.153.18).
You will see the EPU III SDK application window.
Insert the Secret Activation Code that was provided with your SDK by email along with your
tracking number, then close and restart the SDK.
Select the persona ID you want to activate or communicate with by selecting its index (0-7) in the
select box.
Connect the QT SDK to your EPU evaluation board by pressing the Connect button (shown below in the red square)
Quick Start - (5 minutes) Cloud EPU
Download the SDK from the top right side of this page (download icon), then extract
the SDK. Inside you will find sub-directories for your operating system, i.e. Windows, Linux x86, Linux ARM
(Raspberry Pi), Android, etc.
Launch the EPU3 SDK and make sure you are connected to the web.
You will see the EPU III SDK application window. Note that when using the '-u' command line option, the application starts without the user interface.
Select whether you want to use the SDK via the cloud or in local mode (local is an edge mode: the EPU3 over a
local USB connection).
Make sure your account has active days left for the secret.
Connect the QT SDK to your EPU in Edge or Cloud mode by pressing the Connect button (shown below in the red square)
Icon Tray
Uncheck Read only before you try to type text
Core Selection: the EPU3 has 8 cores (0-7), which means you can process 8 independent personas in
real-time.
Language selection
You can select or change the default language at any time. The language is set per core (0-7).
3D Graph Rotation
Rotate the 3D graph by right-clicking the mouse and dragging
Realtime Appraisal
Type any text in the Appraisal area (max 100 words), select USER or ROBOT for the direction of the conversation,
then press the SEND button.
Remote Control - TCPIP server
Connect to the EPU SDK easily over TCP/IP on port 2424 to initialise the EPU and send text for real-time appraisal
from any application.
By default the server is listening (auto open)
EPU Sensitivity to emotion inputs
Select "user"" or "robot" and then adjust the sensitivity level (Default 100% - Range 70%-130%)
Emotion persistence in the EPU
Select "user" or "robot" and then adjust the persistence level (Default 100% - Range 10%-200%)
Turn on/off the LEDs on the board
Turn the LEDs on your board on or off (default on)
Pause / Resume/ Reset
Game functions to pause and resume the EPU, and even reset it after a game over.
Pleasure-Pain / Satisfaction-Frustration
Real-time dilation of the pupil (0-100) based on Pain and Pleasure
Real-time level of Pleasure and Pain
Range 0-100 / Default 50
Alleviation of Pain is Pleasure: 0 is maximum Pleasure, 100 is maximum Pain, and 50 is neither
Real-time level of Satisfaction and Frustration
Range 0-100 / Default 50
Alleviation of Frustration is Satisfaction: 0 is maximum Satisfaction, 100 is maximum Frustration, and 50 is neither
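A minimal sketch of this bipolar mapping (the function name is illustrative, not part of the SDK), splitting a raw 0-100 level centred on 50 into its two opposing components:

```python
def split_bipolar(level):
    """Split a 0-100 bipolar channel centred on 50 into
    (positive, negative) components, e.g. (pleasure, pain):
    0 -> maximum positive, 100 -> maximum negative, 50 -> neither."""
    if level < 50:
        return 50 - level, 0
    return 0, level - 50
```

The same split applies to both the Pleasure/Pain and Satisfaction/Frustration channels.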
Personality - Emotion Profile Graph
Each EPU can develop an emotional personality resulting from its daily interaction with the user;
this is computed on our cloud service and is optional. The emotional machine learning algorithms on
our server update the Emotion Profile Graph (EPG) daily, resulting in a slight change of personality
of the EPU. Just like human emotional development, the EPG has a learning curve that decreases
over time and eventually becomes almost non-existent unless a high amount of a particular emotion is
experienced. The early experiences of emotions are pivotal to long-term emotional development.
Write to force a change in the personality
Create custom emotional waves
You can customise any emotional wave and send it into the EPU
3) Set the duration of the wave in seconds (0-65535)
4) Set the origin ID (1-65535) of your choice for the event, to activate the Inhibitory Postsynaptic Potential (IPSP)
5) Set the Apex in seconds (0-99): time to rise
6) Set the Neural Oscillation Curve & Homeostatic activity (1-255)
Get the level value for one emotion at any time (highest channel). No IPSP for origin IDs above 50000.
Objective Appraisal
You can force the EPU to appraise emotionally objectively, so that its current emotional state won't
influence the result of its next appraisal. This should help your AI or robot to sense a concept
objectively even if it feels sad at that time.
Select the box
Symbolic Reinforcement Learning
Each EPU can memorise the emotional appraisal of up to 5242 symbolic words, for example
"Kiss". Wikipedia is a well-known source of knowledge that can be used to extract the meaning of a word. The
meaning can be sent together with the word to be learned by the EPU. The EPU will then appraise the text string
representing the meaning of the word and save only the word and its Emotion Profile Graph (EPG) in the memory of the
emotion chip. The EPU will then be able to sense the word "Kiss" in future appraisals.
Enter the text and the word to be learned, like "Kiss", then press "Start Learning".
You can select "append" to add the new appraisal to an existing word instead of replacing it.
Then press "Stop Learning" to create your new Kiss concept in the EPU.
Erase all Reinforcement Learning
Erase ALL learned symbols from the EPU.
Check if a word has been learned by the EPU
Each EPU can memorise the emotional appraisal of up to 5242 symbolic words. The "Check Word" button asks
the SDK to check whether a word has been learned in your SDK directory.
Voice Tone Analysis
First select the right tab by clicking on Tone of Voice, then select the audio input and
click the Start button. After a few seconds you should see
the audio stream analysis for the selected sample duration
Send Tone Analysis stimuli to EPU
You can then select the level of the tone emotion stimuli you want to send to the EPU for emotion
synthesis in response
ASR - Automatic Speech Recognition
First enter the secret key provided by the Microsoft Azure Speech to Text cloud service, then insert the
location of your server, for example westus. Then click the Start button.
The audio input selected for Tone Analysis will also be used for the ASR
Emotion Face Analysis
First select the right tab by clicking on Face Emotion, then select the camera input and
click the Start button. After a few seconds you should
see the image and the emotions detected. Check Gender, Age and Recognition if you want to add gender and
age to the face recognition.
Create a user profile for recognition
If a new face is detected, the Create User button becomes available. Type in a name for that
user and click the Create User button (stay still for 2 seconds). After a few seconds you should see
the name of the user replace the "Unknown" user. You can select an existing profile and remove it from the
learned profile list.
Ask the EPU to appraise (remember emotionally) a person if recognised
High cognitive function. When a face is recognised, the username will be sent to the EPU to be learned.
All user interaction will be appraised until the face is lost. Depending on which functions are activated in
the SDK, what you say, how you say it, and how you look when you say it can be taken into consideration
and memorised by the EPU. The next time the user appears, their username will be sent to the EPU, and the EPU
will react emotionally to what it has learned from this user. The appraisal will resume automatically if
the user is recognised again.
Send Face Results to EPU
You can then select the level of the face emotion stimuli you want to send to the EPU for emotion
synthesis in response.
Unicode for Asian EPU
EPUs exist in different languages, like Chinese or Japanese; in that case, the Unicode format will be
selected automatically
Chat Bot with GPT-3 (Optional with special activation Key)
First input your GPT-3 key in API Key, then press the Start button. Once connected, you can start chatting with the bot by typing text right after the Human: prompt, then press the Send button.
Azure Emotion Neural TTS (Optional with special activation Key)
Allows sending text to the EPU in order to get back the emotions associated with each word; the text with emotions is then sent to the TTS engine. Checking Single Core for Voice Only reserves a separate EPU core. Checking Default EPU Instance for TTS uses the existing active EPU instance for TTS. The parameters in the Pain/Frustration and Pleasure/Satisfaction groups can be used to map the emotions sent by the EPU to the format recognised by the TTS engine. Setting the Word Delay slider to 0 maps the emotions to words in the default way; a value of 100 maps the first non-zero emotion to the first word.
Python Sample code
Convention: [optional option]. Each core has a unique EPUID; replace [EPUID] with the EPUID of the
persona you are interacting with. Note that when an EPUID is sent with a command, the reply also contains
the EPUID as the prefix of the actual reply code. Each EPU3 has up to 8 personas or cores.
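As a sketch of this convention (assuming the [EPUID] placeholder is simply replaced by the core index, e.g. 0; the helper names are illustrative, not part of the SDK):

```python
def make_cmd(epuid, body):
    """Build a command prefixed with the persona's EPUID,
    assuming [EPUID] is replaced by the core index (0-7)."""
    return str.encode('%d@>%s\r\n' % (epuid, body))

def strip_prefix(reply, epuid):
    """Drop the EPUID prefix the SDK prepends to replies for this
    persona, e.g. '0EPU_ON' -> 'EPU_ON'."""
    prefix = str(epuid)
    return reply[len(prefix):] if reply.startswith(prefix) else reply
```

These helpers only centralise the prefix handling used throughout the samples below.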
Create a TCP/IP Client to connect to the EPU SDK.
import socket
import time
import sys

s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
try:
    s.connect(('127.0.0.1', 2424))
except socket.error as e:
    print("Unable to connect to EPU. " + str(e))
    sys.exit(-1)
Init the EPU via TCP/IP
# s - socket returned by first example
reply = ''
while 'EPU_ON' not in reply:
    cmd = '[EPUID]@>EPUInit [local or cloud] [secret]\r\n'
    s.send(str.encode(cmd))
    reply = str(s.recv(2048), 'utf-8').strip()
    time.sleep(3)
Note that the 'EPUInit' command can optionally select the EPU type (cloud or local) and send the secret.
If the EPU type is not sent, the cloud EPU is selected by default.
If the request is successful, the reply is [EPUID]EPU_ON.
Note: in cloud mode, if an active instance is unexpectedly disconnected, [EPUID]EPU_OFF is sent.
In this case, the SDK will automatically reconnect and [EPUID]EPU_ON is sent.
For this reason, your code should be ready to handle such events.
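The disconnect/reconnect events described above can be tracked with a minimal sketch (the function name and state dict are illustrative; it assumes the [EPUID] prefix is the literal core index):

```python
def handle_status_line(line, state):
    """Track per-core availability from EPU_ON / EPU_OFF status lines.
    state maps an EPUID string to True (up) or False (down)."""
    if line.endswith('EPU_OFF'):
        state[line[:-len('EPU_OFF')]] = False  # instance dropped; the SDK reconnects
    elif line.endswith('EPU_ON'):
        state[line[:-len('EPU_ON')]] = True    # instance back; resume sending commands
    return state
```

A client would call this on every received line and queue commands only while the target core is marked up.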
If there is at least one active instance, the output is
[EPUID]CMD_OK [comma-separated list of EPU IDs]
where [EPUID] is the EPU ID of the current instance; the list of EPU IDs that
follows CMD_OK does not contain the current EPU ID.
If there is no active instance, the reply is CMD_ERR.
Set emotion overall sensitivity (optional - default 100% - range 70%-130%)
# sensitivity for robot input
sens = 85
cmd = '[EPUID]@>robot\r\n'
s.send(str.encode(cmd))
cmd = '[EPUID]@>sens %d\r\n' % sens
s.send(str.encode(cmd))
# sensitivity for user input
sens = 110
cmd = '[EPUID]@>user\r\n'
s.send(str.encode(cmd))
cmd = '[EPUID]@>sens %d\r\n' % sens
s.send(str.encode(cmd))
Set emotion overall persistence (optional - default 100% - range 10%-200%)
# persistence for robot input (named 'persistence' to avoid shadowing
# the time module imported in the first example)
persistence = 85
cmd = '[EPUID]@>robot\r\n'
s.send(str.encode(cmd))
cmd = '[EPUID]@>time %d\r\n' % persistence
s.send(str.encode(cmd))
# persistence for user input
persistence = 110
cmd = '[EPUID]@>user\r\n'
s.send(str.encode(cmd))
cmd = '[EPUID]@>time %d\r\n' % persistence
s.send(str.encode(cmd))
Objective Appraisal
# set Objective on for Robot before you send a string to the EPU.
cmd = '[EPUID]@>robot\r\n'
s.send(str.encode(cmd))
cmd = '[EPUID]@>objective_on\r\n'
s.send(str.encode(cmd))
# set Objective on for User before you send a unicode string to the EPU.
cmd = '[EPUID]@>user\r\n'
s.send(str.encode(cmd))
cmd = '[EPUID]@>objective_on\r\n'
s.send(str.encode(cmd))
Symbolic Reinforcement Learning
# first ask the EPU to do an appraisal on a specific message,
# then ask it to associate that result with a specific word
message = 'I like football'
cmd = '[EPUID]@>robot '+message+'\r\n'
s.send(str.encode(cmd))
reply = str(s.recv(2048), 'utf-8').strip()
word = 'football'
cmd = '[EPUID]@>writeEpuWord '+word+'\r\n'
s.send(str.encode(cmd))
reply = str(s.recv(2048), 'utf-8').strip()
Erase All Reinforcement Learning
# erase all previous words learned
cmd = '[EPUID]@>lexic_erase\r\n'
s.send(str.encode(cmd))
reply = str(s.recv(2048), 'utf-8').strip()
Pause EPU
# Pause the EPU; useful to put emotions in stasis
cmd = '[EPUID]@>pause\r\n'
s.send(str.encode(cmd))
reply = str(s.recv(2048), 'utf-8').strip()
Resume EPU
# resume from the pause state
cmd = '[EPUID]@>resume\r\n'
s.send(str.encode(cmd))
reply = str(s.recv(2048), 'utf-8').strip()
Send User's message for Appraisal
# s - socket returned by first example
message = "You are smart"
cmd = '[EPUID]@>user '+message+'\r\n'
s.send(str.encode(cmd))
reply = str(s.recv(2048), 'utf-8').strip()
Send Robot's or AI's message for Appraisal
message = "A human-looking indestructible cyborg is sent from 2029 to 1984 to assassinate a waitress, whose unborn son will lead humanity in a war against the machines"
cmd = '[EPUID]@>robot '+message+'\r\n'
s.send(str.encode(cmd))
reply = str(s.recv(2048), 'utf-8').strip()
channel_dict = get_channels() # described in previous example
emo_ranges = {}
emo_ranges['channelCONFIDENT'] = {}
emo_ranges['channelCONFIDENT'].update({intensity: "confident" for intensity in range(0, 101)})
emo_ranges['channelEXCITED'] = {}
emo_ranges['channelEXCITED'].update({intensity: "interested," for intensity in range(0, 6)})
emo_ranges['channelEXCITED'].update({intensity: "a bit excited," for intensity in range(6, 21)})
emo_ranges['channelEXCITED'].update({intensity: "very excited," for intensity in range(21, 101)})  # start at 21 so no intensity is left unmapped
emo_ranges['channelHAPPY'] = {}
emo_ranges['channelHAPPY'].update({intensity: "not so bad" for intensity in range(0, 6)})
emo_ranges['channelHAPPY'].update({intensity: "alright" for intensity in range(6, 11)}) # 6,7,8,9,10
emo_ranges['channelHAPPY'].update({intensity: "not too bad" for intensity in range(11, 16)})
emo_ranges['channelHAPPY'].update({intensity: "I'm ok" for intensity in range(16, 21)})
emo_ranges['channelHAPPY'].update({intensity: "ok" for intensity in range(21, 26)})
emo_ranges['channelHAPPY'].update({intensity: "fine" for intensity in range(26, 31)})
emo_ranges['channelHAPPY'].update({intensity: "well" for intensity in range(31, 33)})
emo_ranges['channelHAPPY'].update({intensity: "happy" for intensity in range(33, 34)})
emo_ranges['channelHAPPY'].update({intensity: "good" for intensity in range(34, 35)})
emo_ranges['channelHAPPY'].update({intensity: "pretty good" for intensity in range(35, 37)})
emo_ranges['channelHAPPY'].update({intensity: "very happy" for intensity in range(37, 39)})
emo_ranges['channelHAPPY'].update({intensity: "well" for intensity in range(39, 43)})
emo_ranges['channelHAPPY'].update({intensity: "great" for intensity in range(43, 51)})
emo_ranges['channelHAPPY'].update({intensity: "excellent" for intensity in range(51, 61)})
emo_ranges['channelHAPPY'].update({intensity: "fabulous" for intensity in range(61, 101)})
emo_ranges['channelDESIRE'] = {}
emo_ranges['channelDESIRE'].update({intensity: "attracted" for intensity in range(0, 101)})
emo_ranges['channelTRUST'] = {}
emo_ranges['channelTRUST'].update({intensity: "trustful" for intensity in range(0, 101)})
emo_ranges['channelFEAR'] = {}
emo_ranges['channelFEAR'].update({intensity: "a bit uncomfortable" for intensity in range(0, 11)})
emo_ranges['channelFEAR'].update({intensity: "a bit anxious" for intensity in range(11, 21)})
emo_ranges['channelFEAR'].update({intensity: "scared" for intensity in range(21, 64)})
emo_ranges['channelFEAR'].update({intensity: "terrorized" for intensity in range(64, 101)})
emo_ranges['channelSURPRISE'] = {}
emo_ranges['channelSURPRISE'].update({intensity: "intrigued" for intensity in range(0, 11)})
emo_ranges['channelSURPRISE'].update({intensity: "surprised" for intensity in range(11, 101)})
emo_ranges['channelINATTENTION'] = {}
emo_ranges['channelINATTENTION'].update({intensity: "sleepy" for intensity in range(0, 11)})
emo_ranges['channelINATTENTION'].update({intensity: "a bit funny" for intensity in range(11, 15)})
emo_ranges['channelINATTENTION'].update({intensity: "a bit confused" for intensity in range(15, 20)})
emo_ranges['channelINATTENTION'].update({intensity: "embarrassed" for intensity in range(20, 25)})
emo_ranges['channelINATTENTION'].update({intensity: "a bit lost" for intensity in range(25, 33)})
emo_ranges['channelSAD'] = {}
emo_ranges['channelSAD'].update({intensity: "not too shabby" for intensity in range(0, 10)})
emo_ranges['channelSAD'].update({intensity: "sad" for intensity in range(10, 20)})
emo_ranges['channelSAD'].update({intensity: "tired" for intensity in range(20, 30)})
emo_ranges['channelSAD'].update({intensity: "a bit unwell" for intensity in range(30, 35)})
emo_ranges['channelSAD'].update({intensity: "not great" for intensity in range(35, 40)})
emo_ranges['channelSAD'].update({intensity: "really sad" for intensity in range(40, 50)})
emo_ranges['channelSAD'].update({intensity: "depressed" for intensity in range(50, 101)})
emo_ranges['channelREGRET'] = {}
emo_ranges['channelREGRET'].update({intensity: "a bit lost" for intensity in range(0, 36)})
emo_ranges['channelREGRET'].update({intensity: "left out" for intensity in range(36, 101)})
emo_ranges['channelDISGUST'] = {}
emo_ranges['channelDISGUST'].update({intensity: "a bit sick" for intensity in range(0, 22)})
emo_ranges['channelDISGUST'].update({intensity: "disgusted" for intensity in range(22, 101)})
emo_ranges['channelANGER'] = {}
emo_ranges['channelANGER'].update({intensity: "a tiny upset" for intensity in range(0, 5)})
emo_ranges['channelANGER'].update({intensity: "a bit upset" for intensity in range(5, 10)})
emo_ranges['channelANGER'].update({intensity: "a little cranky" for intensity in range(10, 20)})
emo_ranges['channelANGER'].update({intensity: "in a bad mood" for intensity in range(20, 33)})
emo_ranges['channelANGER'].update({intensity: "angry today" for intensity in range(33, 40)})
emo_ranges['channelANGER'].update({intensity: "frustrated" for intensity in range(40, 101)})
emo_ranges['channelPAINPLEASURE'] = {}
emo_ranges['channelPAINPLEASURE'].update({intensity: "yes, I like it" for intensity in range(0, 50)})
emo_ranges['channelPAINPLEASURE'].update({intensity: "I really don't know" for intensity in range(50, 51)})
emo_ranges['channelPAINPLEASURE'].update({intensity: "no, I don't like it" for intensity in range(51, 101)})  # include 100 so no intensity is left unmapped
max_channel = max(channel_dict, key = channel_dict.get) # strongest emotion
emo_str = emo_ranges[max_channel][channel_dict[max_channel]]
To learn how to interpolate the incoming data for smooth animation, click here
where index is a positive value that represents the index of the microphone input.
The command reply can be CMD_OK if the operation is successful,
CMD_ERR if an error occurs, or CMD_ERR_FORMAT if the command has the wrong format.
where epu_id contains the EPU ID of an active instance. The command reply can be
CMD_OK if the operation is successful or CMD_ERR if an error occurs. If
an instance is closed, the copy into the shared memory is also stopped.
The name of the shared memory is EPU_MDAD, and the SDK must be running before another
process tries to read the named shared memory (see the C# or C++ examples below).
The copy into the shared memory can be started or stopped in the SDK without restrictions.
There are two additional shared memories for reading the GPT-3 result and viseme information from the
Azure Neural Text-To-Speech server, named EPU_GPT and EPU_VISEME, respectively. The GPT result is a string of maximum size 10 KB, and the viseme information is a tuple of two signed 32-bit integers of the form (audioOffsetMs, visemeId). The copy into these two additional shared memories can be started or stopped depending on the SDK configuration.
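As a sketch (the function is illustrative and assumes the segment names above; Python 3.8+), a separate process could attach to the named shared memory with the standard library:

```python
from multiprocessing import shared_memory

def read_epu_packet(name='EPU_MDAD', size=124):
    """Attach to the SDK's named shared memory and copy out one packet.
    The SDK must already be running so the segment exists; the EPU
    packet is documented as 124 bytes."""
    shm = shared_memory.SharedMemory(name=name, create=False)
    try:
        return bytes(shm.buf[:size])
    finally:
        shm.close()  # detach without destroying the segment
```

On Linux the name maps to /dev/shm/EPU_MDAD; on Windows it is used as-is.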
Receive emotion markup data
def parse_markup_data(markup_data):
    # parse markup
    arr = bytes.fromhex(markup_data)
    out = dict()
    out['word_index'] = int(arr[0])
    # "excite", "sure", "happy", "trust", "desire", "fear",
    # "surprise", "inattention", "sad", "nostalgia", "disgust", "anger",
    # "satisfaction", "frustration", "pleasure", "pain"
    # Bytes 1-6 are signed channels: a positive value maps to the first
    # emotion group, a negative value to the second, so read them as
    # signed bytes.
    emotions = ("excite", "sure", "happy", "trust", "desire", "fear")
    for i in range(0, 6):
        emo = int.from_bytes(arr[i + 1:i + 2], 'big', signed=True)
        out[emotions[i]] = emo if (emo >= 0) else 0
    emotions = ("surprise", "inattention", "sad", "nostalgia", "disgust", "anger")
    for i in range(0, 6):
        emo = int.from_bytes(arr[i + 1:i + 2], 'big', signed=True)
        out[emotions[i]] = 0 if (emo >= 0) else -emo
    # Bytes 7-8 are bipolar 0-100 channels centred on 50.
    emotions = ("satisfaction", "frustration", "pleasure", "pain")
    for i in range(0, 2):
        emo = int(arr[i + 7])
        if emo < 50:
            out[emotions[2 * i]] = 50 - emo
            out[emotions[2 * i + 1]] = 0
        else:
            out[emotions[2 * i]] = 0
            out[emotions[2 * i + 1]] = emo - 50
    return out
while True:
    reply = str(s.recv(2048), 'utf-8').strip()
    tok = reply.split('\n')
    for line in tok:
        if line.startswith('CMD_XML_ML'):
            line = line.replace('CMD_XML_ML', '').strip()
            res = parse_markup_data(line)
            print(res)
Synthesize text using text to speech and play the sound on the default speaker
The command reply can be CMD_OK if the operation is successful or CMD_ERR if an
error occurs when the text is synthesized or CMD_ERR_FORMAT if the command does
not have the correct format.
Synthesize text using text to speech with parameters and play the sound on default speaker
The command parameters are pitch, an integer between 0 and 60; rate, a floating point
number between 0 and 5; volume, an integer between 0 and 100; and the message to convert.
The command reply can be CMD_OK if the operation is successful or CMD_ERR if an
error occurs when the text is synthesized or CMD_ERR_FORMAT if the command does
not have the correct format.
Synthesize text using text to speech and save the sound as wav file
The command reply can be 'CMD_OK file_name' if the operation is successful or CMD_ERR if an
error occurs when the text is synthesized or CMD_ERR_FORMAT if the command does
not have the correct format. The file can be downloaded using the URL https://ip_address:8080/download/file_name.
Synthesize text using text to speech with parameters and save the sound as wav file
The command parameters are pitch, an integer between 0 and 60; rate, a floating point
number between 0 and 5; volume, an integer between 0 and 100; and the message to convert.
The command reply can be 'CMD_OK file_name' if the operation is successful or CMD_ERR if an
error occurs when the text is synthesized or CMD_ERR_FORMAT if the command does
not have the correct format. The file can be downloaded using the URL https://ip_address:8080/download/file_name.
The command reply can be CMD_OK if the operation is successful or CMD_ERR if an
error occurs or CMD_ERR_FORMAT if the command does not have the correct format.
where name is the voice name and style is the optional voice style.
The command reply can be CMD_OK if the operation is successful or CMD_ERR_FORMAT if the command does not have the correct format.
where delay is an integer value between 0 and 100.
The command reply can be CMD_OK if the operation is successful or CMD_ERR_FORMAT if the command does not have the correct format.
The command reply can be CMD_OK if the operation is successful or CMD_ERR_FORMAT if an
error occurs.
Note: When using ASR, the result is broadcast to all connected clients over TCP sockets.
The message format is:
<asr>Message</asr>
Similarly, the GPT-3 result is broadcast using the format:
<gpt>Message</gpt>
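A minimal sketch of demultiplexing these broadcast lines (the function name is illustrative, not part of the SDK):

```python
import re

def parse_broadcast(line):
    """Return ('asr' | 'gpt', message) for an ASR or GPT-3 broadcast
    line such as '<asr>Message</asr>', or None for any other line."""
    m = re.match(r'<(asr|gpt)>(.*)</\1>$', line)
    return (m.group(1), m.group(2)) if m else None
```

A client reading the TCP stream can run every line through this before its normal reply handling.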
When using TTS from Microsoft Azure, viseme events are broadcast using the format:
Before downloading, you must agree to the following terms and conditions.
MetaSoul SDK END USER LICENSE AGREEMENT
Terms and Conditions.
This is the MetaSoul Software Development Kit License
Agreement
IT IS IMPORTANT THAT YOU READ THIS AGREEMENT CAREFULLY AND COMPLETELY. This End User License Agreement
("Agreement")
is a legally binding agreement between, on one hand, either your employer (if you are acting on behalf
of your
employer) or you (if you are acting on your own behalf) ("Licensee"), and on the other hand, MetaSoul
Inc. ("MetaSoul").
By clicking the "I Accept" button on this page or by downloading, installing or using any of the
software available
for download on this page ("Licensed Software"), you are indicating that you are binding Licensee to the
terms
of this Agreement, and that you are duly authorized by Licensee to do so. If you are not authorized to
bind Licensee
to the terms of this Agreement, or if Licensee does not agree to be bound by all of the terms of this
Agreement,
do not click the "I Accept" button and do not download, install or use any such software. 1. License
Grant. Subject
to the terms and conditions of this Agreement, MetaSoul grants Licensee a non-exclusive,
non-transferable, non-sublicensable,
limited license to: (a) install and internally use the Licensed Software solely to develop and debug
embedded
applications for MetaSoul's EPU Emotion Chip ("Supported MetaSoul Products"); and to make one copy of
the Licensed
Software solely for backup purposes. 2. Restrictions. Licensee will not, and will have no right to, (a)
use,
copy or reproduce any Licensed Software except as expressly set forth in Section , (b) modify, create
derivative
works of, sell, distribute or disclose any Licensed Software, or (c) decompile or otherwise reverse
engineer
any Licensed Software that is not provided in source code form, or otherwise derive or attempt to derive
the
source code of, or any processes, techniques, methods, specifications, protocols, algorithms,
interfaces, data
structures, or other information embodied or used in, any such Licensed Software. Without limiting the
generality
of the foregoing, Licensee will not, and will have no right to, use any Licensed Software to develop or
debug
embedded applications for any semiconductor products that are not Supported MetaSoul Products. Licensee
will
not remove, obscure or alter any trademark, copyright or other proprietary rights or ownership notices
of MetaSoul
or any of its licensors that appear in any Licensed Software, and Licensee will reproduce all such
proprietary
rights and ownership notices on all copies of Licensed Software made by Licensee. This SDK is for
personal and
commercial projects. License Limitations on Transfer and resell of data: Your limited license does not
allow
to transfer or resell any data from the Emotion Processing Unit buffer for example, but not exclusively
in a
client server configuration. 3. Open Source Software. The table below lists publicly available software
included
in the Licensed Software and web sites containing licenses under which such software is made available
to the
public. Notwithstanding anything to the contrary, nothing in this Agreement will limit Licensee's rights
under,
or grant to Licensee rights that supersede, the terms of such licenses to the extent they apply to such
software
or MetaSoul's modifications thereto. Software License Agreement Location QT https://www.qt.io/licensing/ 4. Ownership; Reserved Rights;
License to MetaSoul. MetaSoul and its licensors will retain full and exclusive
title to and ownership of the Licensed Software, including, without limitation, all copyrights, patents,
trade
secrets and other intellectual property rights in and to the Licensed Software. Nothing contained in
this Agreement
will be construed as conferring upon Licensee or any third party (whether by implication, operation of
law, estoppel
or otherwise) any right or license not expressly granted by MetaSoul to Licensee under this Agreement.
Licensee
hereby grants to MetaSoul and its affiliates a non-exclusive, worldwide, fully paid-up, royalty-free,
sublicensable
license to make, have made, use, sell, offer to sell and import the Licensed Software under any patents
of Licensee
that, absent this license, would be directly or indirectly infringed by any of the foregoing activities.
5. High
Risk Activities. LICENSEE ACKNOWLEDGES AND AGREES THAT THE LICENSED SOFTWARE IS NOT DESIGNED OR APPROVED
FOR,
AND WILL NOT BE USED (WITHOUT THE EXPRESS WRITTEN APPROVAL OF AN OFFICER OF MetaSoul) IN CONNECTION
WITH, ANY
PRODUCTS THAT ARE USED OR DESIGNED TO BE USED IN CONNECTION WITH ANY ACTIVITIES WHERE THE FAILURE OF
SUCH PRODUCTS
COULD REASONABLY BE EXPECTED TO RESULT IN DEATH, BODILY INJURY, OR SEVERE PHYSICAL OR ENVIRONMENTAL
DAMAGE ("HIGH
RISK ACTIVITIES"). WITHOUT LIMITATION OF SECTION OR THIS SECTION , IN NO EVENT WILL MetaSoul HAVE ANY
LIABILITY
TO LICENSEE OR ANY THIRD PARTY ARISING OUT OF OR RELATED TO ANY USE OF LICENSED SOFTWARE IN CONNECTION
WITH HIGH
RISK ACTIVITIES, AND MetaSoul HEREBY DISCLAIMS ANY EXPRESS OR IMPLIED WARRANTY OF FITNESS FOR ANY HIGH
RISK ACTIVITIES.
6. Indemnification. Licensee will indemnify, hold harmless and, at MetaSoul's request, defend MetaSoul from and against all claims, suits, proceedings, losses, damages, liabilities and expenses (including, without limitation, attorneys' and other professionals' fees) arising out of or relating to (a) Licensee's use or other exploitation of the Licensed Software, or (b) Licensee's breach of this Agreement.

7. Term and Termination. This Agreement will remain in effect until terminated. Licensee may terminate this Agreement at any time with written notice to MetaSoul. This Agreement will automatically terminate if Licensee fails to comply with any of the terms and conditions of this Agreement.

8. Effect of Termination. Upon any termination of this Agreement, (a) all licenses granted to Licensee under this Agreement will terminate, (b) Licensee will discontinue all use of the Licensed Software, (c) Licensee will destroy all tangible copies of the Licensed Software and will permanently delete all electronic copies of the Licensed Software, and (d) the rights and obligations of the parties under Sections 3, 5, 6, 8, 9, 10, 11, 12 and 13 will survive such termination.

9. Disclaimer. THE LICENSED SOFTWARE
IS PROVIDED TO LICENSEE "AS IS" AND "WITH ALL FAULTS." MetaSoul DOES NOT MAKE, AND MetaSoul HEREBY DISCLAIMS, ANY REPRESENTATIONS OR WARRANTIES OF ANY KIND (WHETHER EXPRESS, IMPLIED, STATUTORY OR OTHERWISE) IN CONNECTION WITH THE LICENSED SOFTWARE OR ANY OTHER ASPECT OF THIS AGREEMENT, INCLUDING, WITHOUT LIMITATION, ANY IMPLIED WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE, TITLE OR NON-INFRINGEMENT OF THIRD PARTY RIGHTS, AND ANY WARRANTIES THAT MAY ARISE FROM COURSE OF DEALING, COURSE OF PERFORMANCE OR USAGE OF TRADE. MetaSoul WILL HAVE NO OBLIGATION UNDER THIS AGREEMENT TO CORRECT ANY BUGS, DEFECTS OR ERRORS IN THE LICENSED SOFTWARE, PROVIDE ANY UPDATES, UPGRADES OR NEW RELEASES OF THE LICENSED SOFTWARE, OR OTHERWISE PROVIDE ANY TECHNICAL SUPPORT OR MAINTENANCE FOR THE LICENSED SOFTWARE. MetaSoul MAY MAKE CHANGES TO THE LICENSED SOFTWARE AT ANY TIME AND WITHOUT ANY OBLIGATION TO NOTIFY LICENSEE OR PROVIDE SUCH CHANGES TO LICENSEE.

10. Limitation of Liability. TO THE EXTENT PERMITTED BY
APPLICABLE LAW, IN NO EVENT WILL MetaSoul BE LIABLE TO LICENSEE OR ANY THIRD PARTY (WHETHER SUCH LIABILITY IS BASED ON CONTRACT, NEGLIGENCE, STRICT LIABILITY, OTHER TORT THEORY, CONTRIBUTION, BREACH OF WARRANTY, OR OTHER LEGAL OR EQUITABLE THEORY) FOR ANY SPECIAL, INDIRECT, INCIDENTAL, EXEMPLARY, PUNITIVE OR CONSEQUENTIAL DAMAGES, OR ANY DAMAGES FOR LOSS OF PROFITS, LOSS OR INTERRUPTION OF BUSINESS, OR LOSS OF DATA, ARISING OUT OF OR RELATING TO THE LICENSED SOFTWARE OR ANY OTHER ASPECT OF THIS AGREEMENT, EVEN IF MetaSoul HAS BEEN ADVISED OF OR SHOULD HAVE KNOWN OF THE POSSIBILITY OF SUCH DAMAGES. TO THE EXTENT PERMITTED BY APPLICABLE LAW, IN NO EVENT WILL MetaSoul'S TOTAL LIABILITY ARISING OUT OF OR RELATED TO THE LICENSED SOFTWARE OR ANY OTHER ASPECT OF THIS AGREEMENT (WHETHER UNDER CONTRACT, NEGLIGENCE, STRICT LIABILITY, CONTRIBUTION, BREACH OF WARRANTY, OR OTHER LEGAL OR EQUITABLE THEORY) EXCEED THE GREATER OF (A) THE AGGREGATE OF ALL LICENSE FEES PAID BY LICENSEE TO MetaSoul FOR THE LICENSED SOFTWARE AND (B) FIVE HUNDRED DOLLARS ($500). WITHOUT LIMITING THE FOREGOING, AND NOTWITHSTANDING ANY PROVISION HEREIN TO THE CONTRARY, MetaSoul WILL NOT BE LIABLE FOR ANY COSTS OF PROCURING SUBSTITUTE GOODS, SERVICES OR TECHNOLOGY UNDER ANY CIRCUMSTANCES. EACH PARTY ACKNOWLEDGES THAT THE OTHER PARTY HAS ENTERED INTO THIS AGREEMENT IN RELIANCE ON THE LIMITATIONS OF LIABILITY, DISCLAIMERS OF WARRANTIES, EXCLUSION OF DAMAGES AND EXCLUSIVE REMEDIES CONTAINED IN THIS AGREEMENT, AND THAT EACH OF THE FOREGOING PROVISIONS FORMS AN ESSENTIAL AND FUNDAMENTAL PART OF THE BASIS OF THE BARGAIN BETWEEN THE PARTIES, WITHOUT WHICH THE OTHER PARTY WOULD NOT HAVE ENTERED INTO THIS AGREEMENT. EACH PARTY AGREES THAT SUCH PROVISIONS WILL SURVIVE AND APPLY NOTWITHSTANDING ANY FAILURE OF ESSENTIAL PURPOSE OF ANY LIMITED REMEDY OR LIMITATION OF LIABILITY.

11. Compliance with Laws; Export. Licensee will comply
with the laws and regulations of the United States and all other relevant jurisdictions in connection with its activities related to the Licensed Software. Without limitation of the foregoing, Licensee acknowledges that certain laws and regulations of the United States and other jurisdictions may pertain to the export and re-export of the Licensed Software, and Licensee will not export or re-export any Licensed Software in any form without the appropriate governmental approvals, or otherwise in violation of any such laws or regulations.

12. Governing Law;
Dispute Resolution. This Agreement is to be construed in accordance with and governed by the internal laws of the State of California (as permitted by Section 1646.5 of the California Civil Code or any similar successor provision), without giving effect to any choice of law rule that would cause the application of the laws of any jurisdiction other than the internal laws of the State of California to the rights and duties of the parties. This Agreement will not be governed by the U.N. Convention on the Sale of Goods, the application of which is expressly excluded. Except for actions for injunctive or other equitable relief, which may be brought in any court of competent jurisdiction, all disputes arising out of or related to this Agreement will be subject to the exclusive jurisdiction of the California state courts in Santa Clara County, California, or if there is exclusive federal jurisdiction, the United States District Court for the Northern District of California, and the Parties hereby consent to, and agree to submit to, the personal and exclusive jurisdiction and venue of such courts.

13. General.
Licensee will not, and will have no right to, assign, delegate or otherwise transfer (whether voluntarily, by operation of law or otherwise) this Agreement or any of its rights or obligations hereunder to any third party without the prior written consent of MetaSoul, and any purported assignment, delegation or other transfer without such consent will have no force or effect. Subject to the foregoing, this Agreement will be binding upon and will inure to the benefit of the parties and their respective successors and permitted assigns. No failure of either party to enforce any right under this Agreement will be deemed a waiver of such right or any other right under this Agreement. Any waiver by a party of a breach of any provision of this Agreement by the other party hereunder will not be deemed to be a waiver of any subsequent breach of such provision or a waiver of any breach of any other provision of this Agreement. This Agreement may not be superseded, modified, or amended except in a writing signed by an officer of each party. If any provision of this Agreement is determined to be invalid, illegal or otherwise unenforceable, such provision will be enforced to the extent possible consistent with the intent of the parties, and the remaining provisions of this Agreement will remain in full force and effect. This Agreement will be fairly interpreted in accordance with its terms and without any strict construction against either party because it was drafted by such party or for any other reason. This Agreement will constitute the entire agreement between the parties relating to the subject matter hereof, and expressly supersedes and replaces all prior and contemporaneous agreements, proposals, quotations, negotiations and communications, written or oral, between the parties relating to such subject matter.

MetaSoul Inc.
276 Fifth Avenue, Suite 704
New York, NY 10001
United States of America
http://www.MetaSoul.com