PUBLIC - EPU Documentation Ver. 3.6 - 01/23/2020 - Cloud plugins for Unity3D and UE4.

Getting Started

Welcome

EPU - Unity and Unreal Engine Plugins

The Emoshape EPU synthesizes an individual's emotional levels in real time, responding with one of twelve primary emotions: anger, fear, sadness, disgust, indifference, regret, surprise, anticipation, trust, confidence, desire and joy. It uses psychometric functions that shape and react without relying on pre-programmed sets of inputs. EPU III is the industry's first emotion synthesis engine: it delivers high-performance machine emotion awareness, and the EPU III family of eMCUs is transforming the capabilities of NPCs in video games. The Emoshape EPU is a patented technology that creates a synthesized emotional response for the game industry.

Benefits

  • Cloud EPU Plugins for Unity3D and Unreal Engine provide an evaluation platform for the Emotion Processing Unit III. The evaluation plugins are a vehicle to test and evaluate the emotion synthesis functionality of the Cloud EPU Emotion Engine in games, VR and AR. The kit gives developers immediate access to its advanced emotion processing engine, while allowing them to develop proprietary capabilities that provide true differentiation.
  • Cloud EPU is built around the revolutionary EPU III and uses the same Emoshape EPU III™ computing core functionalities for emotion synthesis. This gives you a fully functional EPG® platform for quickly developing and deploying emotion capabilities for AI, robots, consumer electronics, and more.
  • Emoshape also delivers the entire BSP and software stack. With a complete suite of development tools, code samples, EPG machine learning cloud computing, and profiling tools, Emoshape gives you the ideal solution for helping shape the future of AI and robots' emotional awareness.

Features

  • THE EPU RETURNS A BUFFER OF 123 BYTES IN A PACKET (MULTIDIMENSIONAL ARRAY OF DATA)
  • 12 PRIMARY HUMAN EMOTION LEVELS. AMPLITUDE: 0-100 (RESOLUTION 9 SUB CHANNELS PER EMOTION)
  • 12 PRIMARY HUMAN FEELING LEVELS. AMPLITUDE: 0-100 (RESOLUTION 1 CHANNEL)
  • PULSE SPEED. RANGE: 0-100 (RESOLUTION 1 CHANNEL)
  • PAIN / PLEASURE LEVELS. AMPLITUDE: 0-100 (RESOLUTION 1 CHANNEL)
  • FRUSTRATION / SATISFACTION LEVELS. AMPLITUDE: 0-100 (RESOLUTION 1 CHANNEL)
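
  The sample code later in this document reads that packet as a single comma-separated string, one field per value, with the last field fused to the CMD_OK acknowledgement. Below is a minimal C# sketch of that reading; the offsets follow the Unity and Unreal samples further down and should be treated as an observation, not a formal specification.

    // Sketch only: how the plugin samples in this document read the packet.
    // 12 emotions x (9 sub-channels + 1 feeling) + pulse + pain/pleasure + frustration/satisfaction = 123 values.
    static string ReadPainPleasure(string packet)   // 'packet' = the raw comma-separated buffer string
    {
        string[] values = packet.Split(',');        // one field per value; primary emotions are read at 4, 14, 24, ... 114 in the samples below
        return values[122].Split('C')[0];           // the last field looks like "96CMD_OK" - strip the acknowledgement
    }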

License

This documentation applies to personal and commercial projects. Limitations on transfer and resale of data: your limited license does not allow you to transfer or resell any data from the Emotion Processing Unit buffer, for example (but not exclusively) in a client-server configuration.


Quick Start - (5 minutes) Cloud EPU

  • Download the plugins from the top right side of this page (download icon), then extract all the files from the zip archive. You will find the plugins for Unity and UE4 in two separate archive files, plus an EPU Redistributables installer for each supported OS.
  • Install the setup file (EPU_III_SDK_3D-3.5.0.1-win64-game) corresponding to your OS.
  • The setup installs the Cloud EPU redistributable binaries and automatically starts a local background process, EPU_III_SDK_3D. This application must always be running alongside your plugins in order to connect them to the Emoshape EPU Cloud servers.
  • You will also need your Secret Key, which is generated in your EPU Cloud account.
  • Make sure your account has active days left for the Secret Key.

  • Icon Tray
  • EPU SDK

    Unity 3D (C#) Sample code

      Unity 3D EPU SDK
    • Import Libraries for the Plugin
      // basic libraries we are using to create the plugin 
      using System.Collections;
      using System.Collections.Generic;
      using UnityEngine;
      using System;
      using System.Net;
      using System.Net.Sockets;
      using System.Text;
      
    • Plugin parameters: the basic parameters needed to handle the plugin efficiently; all of the call examples below use some of them
      // the parameters necessary to create the plugin, take care with emoBytes
      Socket sender;
      byte[] bytes = new byte[123];
      byte[] emoBytes = new byte[300];  //Must be larger than 256 because of the emotion buffer: 1 byte per value plus 1 byte per ',' separator, plus the CMD_OK acknowledgement
      public bool talkMode = false; //false = user and true = robot;  
      public string EPUIP = "127.0.0.1"; // In most cases this will be your localhost, so 127.0.0.1, to reach the EPU dongle/SDK process.
    
      [Header("Pre Configured Start")]
      public string userSensibility = "100";
      public string robotSensibility = "100";
      public string userPersistence = "100";
      public string robotPersistence = "100";
      public bool preConnect = false;
      public bool startSDK = false;
      public event EventHandler preConnected;
      public Dictionary<string, string> mEmotions = new Dictionary<string, string>();
      Action _Action;  
    • Secondary Class
        // This is just a brief explanation of the other classes used by the Plugin.
        
        //UDPReceiveOnly.cs
        
        //Methods
        
        //Unity standard lifecycle method. All you need to do is leave this script enabled (it is a required component for the main component) and not call it yourself;
        //if you do want to call it manually, rename it to Init or something similar.
        private void Start()
        
        //Retrieves the UDP packet and transcribes it to a string; you can call this manually if you want to check/update the info at an exact step.
        private void ReceiveData()
        
        //updateScreen is a premade IEnumerator that updates a standard Unity UI Text with the information, for testing and debugging (Unity components cannot be updated from a background thread).
        IEnumerator updateScreen();
        
        //Stops the retrieval of information and the thread, safely closing every component used.
        public void Stop()    
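
        The plugin ships its own UDPReceiveOnly.cs; purely for orientation, here is a minimal sketch of what such a script might look like, assuming it collects candidate IPs from UDP packets on a background thread and exposes the mLocalIPs / retrievedID fields used by the main class further down. The listen port, the discovery mechanism and the field handling are assumptions, not the plugin's actual implementation.

        // ASSUMPTION: a minimal stand-in for the real UDPReceiveOnly.cs shipped with the plugin.
        using System.Collections;
        using System.Collections.Generic;
        using System.Net;
        using System.Net.Sockets;
        using System.Text;
        using System.Threading;
        using UnityEngine;

        public class UDPReceiveOnly : MonoBehaviour
        {
            public int listenPort = 2424;                        // assumed; use the port your EPU SDK announces on
            public List<string> mLocalIPs = new List<string>();  // candidate EPU IPs, read by UDPLoop() below
            public string retrievedID = "";                      // web ID parsed from the packet in the real script
            public UnityEngine.UI.Text debugText;                // optional UI Text used by updateScreen()

            UdpClient client;
            Thread receiveThread;
            volatile bool running;
            string lastPacket = "";

            private void Start()                                 // Unity calls this; just keep the component enabled
            {
                client = new UdpClient(listenPort);
                running = true;
                receiveThread = new Thread(ReceiveData) { IsBackground = true };
                receiveThread.Start();
                StartCoroutine(updateScreen());
            }

            private void ReceiveData()                           // blocking receive loop on the worker thread
            {
                IPEndPoint remote = new IPEndPoint(IPAddress.Any, 0);
                while (running)
                {
                    try
                    {
                        byte[] data = client.Receive(ref remote);
                        lastPacket = Encoding.UTF8.GetString(data);
                        string ip = remote.Address.ToString();
                        if (!mLocalIPs.Contains(ip)) mLocalIPs.Add(ip);
                    }
                    catch (SocketException) { /* socket closed by Stop() */ }
                }
            }

            IEnumerator updateScreen()                           // UI must be touched from the main thread
            {
                while (running)
                {
                    if (debugText != null) debugText.text = lastPacket;
                    yield return new WaitForSeconds(0.5f);
                }
            }

            public void Stop()                                   // stop the thread and close everything safely
            {
                running = false;
                if (client != null) client.Close();
                if (receiveThread != null) receiveThread.Join(500);
            }
        }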
    • Optional Features
            //This is just a brief explanation of the optional features that can be used through the plugin.
            
            //In the main Emoshape class there are 2 extra parameters, audioPath and epuMessage; they are used to have the EPU respond to you with text and audio.
            
            //Usage
            
            //The process is automated by the plugin: the parameters are filled automatically whenever an SDK output contains them. This happens in the
            //OrganizeEpuMessage method inside the main class, in case you want to change something there. 
    
            //audioPath will point to a temporary folder on your system where the audio has been created, so you can use it however you want; the actual implementation
            //of importing and playing the audio is not done yet.
    
            //epuMessage is the EPU's reply to a question sent to the chatbox by an outer system; check the EPU docs for more information on this.
    
            //The parameters can be seen in the screenshot below.
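
            Since the import-and-play step is left open, the sketch below shows one possible way to load the generated file from audioPath and play it. It assumes the file is a WAV on disk and that an AudioSource component sits on the same GameObject; both are assumptions, not part of the plugin.

            // Hedged sketch: load the file written at audioPath and play it.
            IEnumerator PlayEpuAudio(string path)
            {
                // "file://" prefix is needed for local files; switch AudioType if the SDK writes another format.
                using (var req = UnityEngine.Networking.UnityWebRequestMultimedia.GetAudioClip("file://" + path, AudioType.WAV))
                {
                    yield return req.SendWebRequest();
                    if (!string.IsNullOrEmpty(req.error))
                    {
                        Debug.LogWarning("Could not load EPU audio: " + req.error);
                        yield break;
                    }
                    AudioClip clip = UnityEngine.Networking.DownloadHandlerAudioClip.GetContent(req);
                    GetComponent<AudioSource>().PlayOneShot(clip);
                }
            }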
            
              
      Unity EPU Plugin
    • Create a TCPIP Client to connect to the EPU SDK.
      IPHostEntry ipHost = Dns.GetHostEntry(EPUIP);
      IPAddress ipAddr = ipHost.AddressList[0];
      IPEndPoint ipEndPoint = new IPEndPoint(ipAddr, 2424);
      sender = new Socket(ipAddr.AddressFamily, SocketType.Stream, ProtocolType.Tcp);
      sender.Connect(ipEndPoint);    
    • Use the UDP receiver's detected IPs to find and connect to the EPU over TCP
        //This needs to run in an IEnumerator or a thread because it is asynchronous; create a wrapper method so other classes can call it easily
        IEnumerator UDPLoop()
        {
            List<string> mIPs = new List<string>();
            mIPs = GetComponent<UDPReceiveOnly>().mLocalIPs; // UDPReceiveOnly is the helper component described above
            IPAddress ipAddr = IPAddress.Parse(EPUIP);
            IPEndPoint ipEndPoint = new IPEndPoint(ipAddr, 2424);
            
            Debug.Log("Total IPs Detected " + mIPs.Count);
            for (int i = mIPs.Count - 1; i >= 0; i--)
            {                
                sender = new Socket(ipAddr.AddressFamily, SocketType.Stream, ProtocolType.Tcp);              
                UpdateTimeouts(8000);
                Debug.Log("Trying Ip Number " + i);
                yield return new WaitForSeconds(0.5f);
                EPUIP = mIPs[i];
                ipAddr = IPAddress.Parse(EPUIP);
                ipEndPoint = new IPEndPoint(ipAddr, 2424);
                IAsyncResult result = sender.BeginConnect(ipEndPoint, null, null);
                //Connection time limit: if the handshake takes more than 3 seconds, treat it as a bad IP
                yield return new WaitForSeconds(3f);
                if (sender.Connected)
                {
                    isConnected = true;
                    Debug.Log("Connection Completed");
                    PlayerPrefs.SetString("webID", GetComponent<UDPReceiveOnly>().retrievedID); // assumed: use whichever component exposes retrievedID
                    break;
                }
                else
                {
                    sender.Close();
                }
            }
        }    
    • Init the EPU via TCPIP
          public void StartEPU(string method, string key)//Not creating an async method for this because it will most likely be called at the start of the game
          {
              //This is a safety measure so we know if the EPU was already started
              //try{
              byte[] msg = Encoding.UTF8.GetBytes("[EPUID]@>EPUInit " + method + " " + key + " \r\n'");  // method is  [local or cloud] and the [secret] key can be found here: EPU Cloud account.
              sender.Send(msg);
              sender.Receive(bytes);
              Debug.Log(Encoding.UTF8.GetString(bytes));
              //}
              //catch{
    
              //}
          }
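
      For example, right after the socket is connected (the key below is a placeholder; paste the Secret Key from your EPU Cloud account, and pass "local" instead of "cloud" for a local dongle):

      // Usage sketch - placeholder key only.
      StartEPU("cloud", "YOUR-SECRET-KEY");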
         
    • Set emotion overall sensitivity (optional - default 100% - range 70%-130%)
      public void SetSenses(bool robot = false, string value = "100")
      {
          byte[] msg;
    
          if (robot)
          {
              msg = Encoding.UTF8.GetBytes("[EPUID]@>robot\r\n");
          }
          else
          {
              msg = Encoding.UTF8.GetBytes("[EPUID]@>user\r\n");
          }
          sender.Send(msg);
          msg = Encoding.UTF8.GetBytes("[EPUID]@>sens " + value + "\r\n");
          sender.Send(msg); 
      }    
    • Set emotion overall persistence (optional - default 100% - range 10%-200%)
      public void SetPersistence(bool robot = false, string value = "100")
      {
          byte[] msg;
    
          if (robot)
          {
              msg = Encoding.UTF8.GetBytes("[EPUID]@>robot\r\n");
          }
          else
          {
              msg = Encoding.UTF8.GetBytes("[EPUID]@>user\r\n");
          }
          sender.Send(msg);
          msg = Encoding.UTF8.GetBytes("[EPUID]@>time " + value + "\r\n");
          sender.Send(msg);
      }    
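
      One way to apply the pre-configured inspector values from the parameters section (userSensibility, robotSensibility, userPersistence, robotPersistence) is sketched below; call it from wherever your connection logic finishes.

      // Sketch: push the inspector defaults to the EPU once the socket is connected.
      void ApplyPreConfiguredValues()
      {
          SetSenses(false, userSensibility);          // user channel
          SetSenses(true, robotSensibility);          // robot channel
          SetPersistence(false, userPersistence);
          SetPersistence(true, robotPersistence);
      }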
    • Objective Appraisal
      //Turn on Objective Appraisal; this produces a less emotional appraisal and turns itself off after one message
      public void SendObjectiveAppraisal()
      {
          byte[] msg;
    
          if (talkMode)
          {
              msg = Encoding.UTF8.GetBytes("[EPUID]@>robot\r\n");
          }
          else
          {
              msg = Encoding.UTF8.GetBytes("[EPUID]@>user\r\n");
          }
    
          sender.Send(msg);
          msg = Encoding.UTF8.GetBytes("[EPUID]@>objective_on \r\n"); // can send the text at the same time if needed
          sender.Send(msg);
      }    
    • Symbolic Reinforcement Learning Part 1 (Sync and Async)
      //Send a message for the AI to learn from, so the resulting reaction can be bound to a word 
      //E.g. input a text describing something (a kiss, a person profile such as Bill Gates); after it has been processed you can bind the reaction to a word. The text limit is 100 words
      public void SendSimbolicLearning(string message)
      {
          byte[] msg = Encoding.UTF8.GetBytes("[EPUID]@>robot " + message + "\r\n");
          sender.Send(msg);
          sender.Receive(bytes);
          Debug.Log(Encoding.UTF8.GetString(bytes));
    
      }
    
      public void SendSimbolicLearningAsync(string message)
      {
          byte[] msg = Encoding.UTF8.GetBytes("[EPUID]@>robot " + message + "\r\n");
          sender.BeginSend(msg, 0, msg.Length, SocketFlags.None, null, null);
          sender.BeginReceive(bytes, 0, bytes.Length, SocketFlags.None, new AsyncCallback(ReceiveCallback), null);
      }    
    • Symbolic Reinforcement Learning Part 2 (Sync and Async)
      //Bind the emotional reaction from symbolic learning to a word; you have to wait for a reaction before using this
      public void SaveSimbolicLearning(string word)
      {
          byte[] msg = Encoding.UTF8.GetBytes("[EPUID]@>writeEpuWord " + word + "\r\n");
          sender.Send(msg);
          sender.Receive(bytes);
          Debug.Log(Encoding.UTF8.GetString(bytes));
      }
    
      public void SaveSimbolicLearningAsync(string word)
      {
          byte[] msg = Encoding.UTF8.GetBytes("[EPUID]@>writeEpuWord " + word + "\r\n");
          sender.BeginSend(msg, 0, msg.Length, SocketFlags.None, null, null);
          sender.BeginReceive(bytes, 0, bytes.Length, SocketFlags.None, new AsyncCallback(ReceiveCallback), null);
      }    
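
      The two parts are meant to be chained: send the text, give the EPU time to react, then bind the reaction to a word. A sketch of that flow is below; the fixed 3-second wait is only a placeholder, in practice wait until you actually observe the reaction.

      // Sketch of the learn-then-bind flow described above.
      IEnumerator LearnWord(string description, string word)
      {
          SendSimbolicLearning(description);       // part 1: text of up to 100 words
          yield return new WaitForSeconds(3f);     // placeholder wait for the emotional reaction
          SaveSimbolicLearning(word);              // part 2: bind the reaction to the word
      }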
    • Erase All Reinforcement Learning
      //Erase the EPU learned knowledge (words and saved reactions)
      public void EraseLexic()
      {
          byte[] msg = Encoding.UTF8.GetBytes("[EPUID]@>lexic_erase\r\n");
          sender.Send(msg);
      }    
    • Pause EPU
       //Pause the EPU. This is best used when an action can be delayed, e.g. you are talking to an NPC but have unlimited time to respond
      public void PauseEPU(){
          byte[] msg = Encoding.UTF8.GetBytes("[EPUID]@>pause \r\n");
          sender.Send(msg);
      }   
    • Resume EPU
      //Resume the EPU after pausing it.
      public void ResumeEPU(){
          byte[] msg = Encoding.UTF8.GetBytes("[EPUID]@>resume \r\n");
          sender.Send(msg);
      }    
    • Reset EPU
      //Reset the EPU.
      public void ResetEPU(){
          byte[] msg = Encoding.UTF8.GetBytes("[EPUID]@>reset \r\n");
          sender.Send(msg);
      }    
    • Send User's message for Appraisal (Sync and Async)
      //Send a message to the EPU (text limit is 100 words); this will prompt an emotional response
      public void SendUserMessage(string message = "")
      {
          byte[] msg = Encoding.UTF8.GetBytes("[EPUID]@>user " + message + " \r\n");
          sender.Send(msg);
          sender.Receive(bytes);
          Debug.Log(Encoding.UTF8.GetString(bytes));
      }
      
      public void SendUserMessageAsync(string message = "")
      {
          byte[] msg = Encoding.UTF8.GetBytes("[EPUID]@>user " + message + " \r\n");
          sender.BeginSend(msg, 0, msg.Length, SocketFlags.None, null, null);
          sender.BeginReceive(bytes, 0, bytes.Length, SocketFlags.None, new AsyncCallback(ReceiveCallback), null);
      }
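
      For example, hooked up to a dialogue input field (purely illustrative; the handler name is not part of the plugin):

      // Sketch: appraise whatever the player typed, without blocking the frame.
      public void OnPlayerLineSubmitted(string line)
      {
          SendUserMessageAsync(line);
      }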
    • Ask for the BUFFER STRUCTURE MDAD (Emo-Matrix)
      //Retrieve the Emotional Buffer for usage
      public void AskForEmotionalBuffer(Action mAction)   
      {
          byte[] msg = Encoding.UTF8.GetBytes("[EPUID]@>buffer\r\n");
          sender.Send(msg);
          //Note: the emotions arrive as a formatted, comma-separated string, not as raw bytes      
          sender.Receive(emoBytes);
          string[] emotionList = Encoding.UTF8.GetString(emoBytes).Split(',');
    
          UpdateDictionary(emotionList);
          if (mAction != null)
          {
              mAction.Invoke();
          }
    
      }
    
      public void AskForEmotionalBufferAsync(Action mAction)
      {
          byte[] msg = Encoding.UTF8.GetBytes("[EPUID]@>buffer\r\n");
          sender.BeginSend(msg, 0, msg.Length, SocketFlags.None, null, null);
    
          if (mAction != null)
          {
              _Action = mAction;
          }
          sender.BeginReceive(emoBytes, 0, emoBytes.Length, SocketFlags.None, new AsyncCallback(ReceiveCallbackBuffer), null);
      }
    
       private void ReceiveCallbackBuffer(IAsyncResult AR)
      {
          int received = sender.EndReceive(AR);
          if (received <= 0)
              return;
          //Do what you want with the returned bytes[]
          string[] emotionList = Encoding.UTF8.GetString(emoBytes).Split(',');
    
          UpdateDictionary(emotionList);
          _Action.Invoke();
          _Action = null;
      }
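
      The Action argument runs once the buffer has been parsed into mEmotions, for example:

      // Usage sketch: request the buffer and react when mEmotions has been refreshed.
      AskForEmotionalBufferAsync(() => Debug.Log("Happy level: " + mEmotions["channelHAPPY"]));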
    • Example on how to Store the Buffer
      //This is a simple dictionary holding the current values; if you want more control you can use any storage type you like
      void SetUpDictionary()
      {
          mEmotions.Add("channelEXCITED", "0");
          mEmotions.Add("channelCONFIDENT", "0");
          mEmotions.Add("channelHAPPY", "0");
          mEmotions.Add("channelDESIRE", "0");
          mEmotions.Add("channelTRUST", "0");
          mEmotions.Add("channelFEAR", "0");
          mEmotions.Add("channelSURPRISE", "0");
          mEmotions.Add("channelINATTENTION", "0");
          mEmotions.Add("channelSAD", "0");
          mEmotions.Add("channelREGRET", "0");
          mEmotions.Add("channelDISGUST", "50");
          mEmotions.Add("channelANGER", "50");
          mEmotions.Add("channelPAINPLEASURE", "50");
      }
    
      void UpdateDictionary(string[] emotions)
      {
          //this check exists only because of the UI auto-refresh of the emotional buffer; feel free to remove it if you don't need it.
          if (emotions.Length < 123)
          {
              return;
          }
          mEmotions["channelEXCITED"] = emotions[4];
          mEmotions["channelCONFIDENT"] = emotions[14];
          mEmotions["channelHAPPY"] = emotions[24];
          mEmotions["channelDESIRE"] = emotions[34];
          mEmotions["channelTRUST"] = emotions[44];
          mEmotions["channelFEAR"] = emotions[54];
          mEmotions["channelSURPRISE"] = emotions[64];
          mEmotions["channelINATTENTION"] = emotions[74];
          mEmotions["channelSAD"] = emotions[84];
          mEmotions["channelREGRET"] = emotions[94];
          mEmotions["channelDISGUST"] = emotions[104];
          mEmotions["channelANGER"] = emotions[114];
          mEmotions["channelPAINPLEASURE"] = emotions[122].Split('C')[0]; //This is necessary because there isn't a divider from the last number and the CMD_OK message
      }
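
      A simple way to keep the dictionary fresh and drive gameplay from it is sketched below; the Animator parameter name and the 0.25-second polling interval are arbitrary choices, not plugin requirements.

      // Sketch: poll the buffer and map one channel onto an Animator parameter.
      IEnumerator EmotionPollLoop(Animator animator)
      {
          while (isConnected)
          {
              AskForEmotionalBuffer(null);                                    // sync request; fills mEmotions (use the async variant to avoid blocking)
              float happy = float.Parse(mEmotions["channelHAPPY"],
                                        System.Globalization.CultureInfo.InvariantCulture);
              animator.SetFloat("Happiness", happy / 100f);                   // channels range 0-100
              yield return new WaitForSeconds(0.25f);                         // arbitrary polling interval
          }
      }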
    • Send custom emotional wave (Sync and Async)
      //You can inject (send) a wave to generate an emotion
      public void InjectEmotion(string emotion, string level, string duration, string origin_id, string apex, string curve)
      {
          if (emotion == "" || emotion == null)
          {
              return;
          }
    
          byte[] msg = Encoding.UTF8.GetBytes("[EPUID]@>wave " + emotion + "," + level + "," + duration + "," + origin_id + "," + apex + "," + curve + "\r\n");
          sender.Send(msg);
          sender.Receive(bytes);
          Debug.Log(Encoding.UTF8.GetString(bytes));
    
    
      }
    
      public void InjectEmotionAsync(string emotion, string level, string duration, string origin_id, string apex, string curve)
      {
          if (emotion == "" || emotion == null)
          {
              return;
          }
          byte[] msg = Encoding.UTF8.GetBytes("[EPUID]@>wave " + emotion + "," + level + "," + duration + "," + origin_id + "," + apex + "," + curve + "\r\n");
          sender.BeginSend(msg, 0, msg.Length, SocketFlags.None, null, null);
          sender.BeginReceive(bytes, 0, bytes.Length, SocketFlags.None, new AsyncCallback(ReceiveCallback), null);
      }    
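
      For example (all numeric arguments are placeholder strings chosen for illustration; the valid emotion names are listed in the Unreal "Important Information" section further down, and the value ranges are documented with the EPU wave command):

      // Usage sketch with placeholder values only.
      InjectEmotion("happy", "80", "5", "1", "50", "1");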
    • Broadcast a message to all connected clients (Sync and Async)
    //You can send a message to all clients connected to your EPU
    //Send a message to the EPU - text limit is 100 words
      public void BroadcastMessageToAllClients(string message = "")
      {
          byte[] msg = Encoding.UTF8.GetBytes("[EPUID]@>transfer " + message + " <@\r\n");
          sender.Send(msg);
          sender.Receive(bytes);
          Debug.Log(Encoding.UTF8.GetString(bytes));
      }
    
     
      public void BroadcastMessageToAllClientsAsync(string message = "")
      {
          byte[] msg = Encoding.UTF8.GetBytes("[EPUID]@>transfer " + message + " <@\r\n");
          sender.BeginSend(msg, 0, msg.Length, SocketFlags.None, null, null);
          sender.BeginReceive(bytes, 0, bytes.Length, SocketFlags.None, new AsyncCallback(ReceiveCallback), null);
      }   
    • Enable/disable tone of voice detection (Sync and Async)
    public void EnableDisableToneOfVoiceDetection(bool on = false)
    {
      byte[] msg = null;
      if (on)
      {
          msg = Encoding.UTF8.GetBytes("@>tov_on\r\n");
      }
      else
      {
          msg = Encoding.UTF8.GetBytes("@>tov_off\r\n");
      }
      sender.Send(msg);
      sender.Receive(bytes);    
    }
    
    public void EnableDisableToneOfVoiceDetectionAsync(bool on = false)
    {
      byte[] msg = null;
      if (on)
      {
          msg = Encoding.UTF8.GetBytes("@>tov_on\r\n");
      }
      else
      {
          msg = Encoding.UTF8.GetBytes("@>tov_off\r\n");
      }
      sender.BeginSend(msg, 0, msg.Length, SocketFlags.None, null, null);
      sender.BeginReceive(bytes, 0, bytes.Length, SocketFlags.None, new AsyncCallback(ReceiveCallback), null);
    
    }    
    • Enable/disable face tracking (Sync and Async)
    public void EnableDisableFaceTracking(bool on = false)
    {
        byte[] msg = null;
        if (on)
        {
            msg = Encoding.UTF8.GetBytes("@>ft_on\r\n");
        }
        else
        {
            msg = Encoding.UTF8.GetBytes("@>ft_off\r\n");
        }
        sender.Send(msg);
        sender.Receive(bytes);
    }
    
    public void EnableDisableFaceTrackingAsync(bool on = false)
    {
        byte[] msg = null;
        if (on)
        {
            msg = Encoding.UTF8.GetBytes("@>ft_on\r\n");
        }
        else
        {
            msg = Encoding.UTF8.GetBytes("@>ft_off\r\n");
        }
        sender.BeginSend(msg, 0, msg.Length, SocketFlags.None, null, null);
        sender.BeginReceive(bytes, 0, bytes.Length, SocketFlags.None, new AsyncCallback(ReceiveCallback), null);
    
    }    
    • Add/remove user for face tracking (Sync and Async)
    public void AddRemoveUserForFaceTracking(string username, bool add = false)
    {
        if (username == "") return;
        byte[] msg = null;
        if (add)
        {
            msg = Encoding.UTF8.GetBytes("@>ft_user_add " + username + " \r\n");
        }
        else
        {
            msg = Encoding.UTF8.GetBytes("@>ft_user_rm " + username + " \r\n");
        }
        sender.Send(msg);
        sender.Receive(bytes);
    }    
    • Initialize Speech Recognition (Sync and Async)
        public void InitSpeechRecognition(string key, string region)
        {
            byte[] msg = Encoding.UTF8.GetBytes("@>asr_init " + key + " " + region + "\r\n");
            sender.Send(msg);
            sender.Receive(bytes);
        }
    
        public void InitSpeechRecognitionAsync(string key, string region)
        {
            byte[] msg = Encoding.UTF8.GetBytes("@>asr_init " + key + " " + region + "\r\n");
            sender.BeginSend(msg, 0, msg.Length, SocketFlags.None, null, null);
            sender.BeginReceive(bytes, 0, bytes.Length, SocketFlags.None, new AsyncCallback(ReceiveCallback), null);
        }
      
    • Enable/disable Speech Recognition (Sync and Async)
        public void SpeechRecognition(bool on = false)
        {
            byte[] msg = null;
            if (on)
            {
                msg = Encoding.UTF8.GetBytes("@>asr_on\r\n");
            }
            else
            {
                msg = Encoding.UTF8.GetBytes("@>asr_off\r\n");
            }
            sender.Send(msg);
            sender.Receive(bytes);
        }
    
        public void SpeechRecognitionAsync(bool on = false)
        {
            byte[] msg = null;
            if (on)
            {
                msg = Encoding.UTF8.GetBytes("@>asr_on\r\n");
            }
            else
            {
                msg = Encoding.UTF8.GetBytes("@>asr_off\r\n");
            }
            sender.BeginSend(msg, 0, msg.Length, SocketFlags.None, null, null);
            sender.BeginReceive(bytes, 0, bytes.Length, SocketFlags.None, new AsyncCallback(ReceiveCallback), null);
        }
      
    • Request the detected user by face tracking (Sync and Async)
    public void GetCurrentFaceTrackingUser()
    {
        byte[] msg = Encoding.UTF8.GetBytes("@>detected_user\r\n");
        sender.Send(msg);
        sender.Receive(bytes);
        Debug.Log(Encoding.UTF8.GetString(bytes));
    }
        
    public void GetCurrentFaceTrackingUserAsync()
    {
        byte[] msg = Encoding.UTF8.GetBytes("@>detected_user\r\n");
        sender.BeginSend(msg, 0, msg.Length, SocketFlags.None, null, null);
        sender.BeginReceive(bytes, 0, bytes.Length, SocketFlags.None, new AsyncCallback(ReceiveCallback), null);
    }
      
    • Request supported languages (Sync and Async)
    public string GetSupportedLanguages()
      {
          byte[] msg = Encoding.UTF8.GetBytes("[EPUID]@>supported_lang\r\n");
          sender.Send(msg);
          sender.Receive(bytes);
          Debug.Log(Encoding.UTF8.GetString(bytes));
          return Encoding.UTF8.GetString(bytes);
      }
    
      public void GetSupportedLanguagesAsync()
      {
          byte[] msg = Encoding.UTF8.GetBytes("[EPUID]@>supported_lang\r\n");
          sender.BeginSend(msg, 0, msg.Length, SocketFlags.None, null, null);
          sender.BeginReceive(bytes, 0, bytes.Length, SocketFlags.None, new AsyncCallback(ReceiveCallback), null);
      }    
    • Set a language (Sync and Async)
    public void SetLanguage(string language)
      {
          byte[] msg = Encoding.UTF8.GetBytes("[EPUID]@> " + language + " \r\n");
          sender.Send(msg);
          sender.Receive(bytes);
          Debug.Log(Encoding.UTF8.GetString(bytes));
      }
    
      public void SetLanguageAsync(string language)
      {
          byte[] msg = Encoding.UTF8.GetBytes("[EPUID]@>set_lang " + language + " \r\n");
          sender.BeginSend(msg, 0, msg.Length, SocketFlags.None, null, null);
          sender.BeginReceive(bytes, 0, bytes.Length, SocketFlags.None, new AsyncCallback(ReceiveCallback), null);
      }   
    • Generic Async Callback
      //This is an async callback method, in case you want to process something after receiving a response
      private void ReceiveCallback(IAsyncResult AR)
      {
          int received = sender.EndReceive(AR);
          if (received <= 0)
              return;
          //Do what you want with the returned bytes[]
          //        Debug.Log(Encoding.UTF8.GetString(bytes));
      }    
    • Stop Connection from the Socket
      //This will only close the socket connection; the EPU will not be turned off.
      public void StopConnection()
      {
    
          sender.Shutdown(SocketShutdown.Both);
          sender.Close();
      }    
    • Timeout - Utility
        public void UpdateTimeouts(int value)
        {
            sender.SendTimeout = value;
            sender.ReceiveTimeout = value;
        }   

    Unreal Engine 4 (C++ and Blueprints) Sample code

    • Summary
      First of all, this uses the Unreal C++ libraries; they are not the same as the standard C++ libraries, but the approach can easily be replicated with .NET C++ and other variants.
      If you just want the Blueprint info, it is recommended to read the next topic (Important Information) and then feel free to skip the C++ content (which can be applied to any socket system, :D).
      This content uses a tick approach to retrieve data from the socket: you will not get responses instantly, but at a timed interval of your choice (minimum is 0.15f).
      In the future this will probably be expanded with threading or AsyncTask.    
    • Important Information
    
      Emotion Names:
       The default names used to address the emotions inside the emotional map are as follows:
        channelEXCITED, channelCONFIDENT, channelHAPPY, channelDESIRE, channelTRUST, channelFEAR, channelSURPRISE, channelINATTENTION,
        channelSAD, channelREGRET, channelDISGUST, channelANGER and channelPAINPLEASURE 
       Feel free to change them to whatever you like, or change the array completely (for example to a simple integer array).
    
      Wave Command or Inject Emotion Function:
       The default names used to update the correct emotions are as follows:
        excite, sure, happy, desire, trust, fear, surprise, innattention, sad, regret, disgust and anger
      Those, unfortunately, are not changeable; all the other values are strings representing numbers. For more information check the EPU section describing the wave command.
    
      Emotional Buffer:
       It will be a 256-byte stream. The tricky part is that the last segment carries the OK command fused with the value: in string form the last part is a number plus the command, e.g. "96CMD_OK".
       Take care in case you are building your own custom method. 
    • Optional Features
    
          //This is just a brief explanation of the optional features that can be used through the plugin.
            
          //In the main Emoshape blueprint (easy to adjust for the C++ class as well) there are 2 extra parameters, audioPath and epuMessage; they are used to have the EPU respond to you with audio and text.
          
          //Usage
          
          //The process is automated by the plugin: the parameters are filled automatically whenever an SDK output contains them. This happens in the
          //OrganizeEpuMessage method inside the main class, in case you want to change something there. 
    
          //audioPath will point to a temporary folder on your system where the audio has been created, so you can use it however you want; the actual implementation
          //of importing and playing the audio is not done yet.
    
          //epuMessage is the EPU's reply to a question sent to the chatbox by an outer system; check the EPU docs for more information on this.
    
          //The parameters can be seen in the screenshot below. 
      Unreal EPU Plugin
    • Plugins Parameters
      // the parameters necessary to create the plugin
      FSocket* Socket = ISocketSubsystem::Get(PLATFORM_SOCKETSUBSYSTEM)->CreateSocket(NAME_Stream, TEXT("default"), false);    
      const FString address = TEXT("91.124.7.98");
      int32 port = 2424;
      FString msg;
      //Make a Blueprint Node that is editable
      UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = EmoShapePlugin)
      bool talkmode = false;
      //Make a Blueprint Node that is readOnly
      UPROPERTY(EditDefaultsOnly, BlueprintReadOnly, Category = EmoShapePlugin)
      bool connected = false;    
      UPROPERTY(EditDefaultsOnly, BlueprintReadOnly, Category = EmoShapePlugin)
      int32 BytesSent = 0;    
      int32 BufferSize = 300;    
      UPROPERTY(EditDefaultsOnly, BlueprintReadOnly, Category = EmoShapePlugin)
      TArray<uint8> ReceivedData;
      UPROPERTY(EditDefaultsOnly, BlueprintReadOnly, Category = EmoShapePlugin)
      //You can use other container types here; a string map is used for ease of reading
      TMap<FString, FString> mEmotions;  
    • Create a TCPIP Client to connect to the EPU SDK.
      ///EmoShapePlugin.h
    
      //The most important function otherwise you will never be able to receive and send data :D
      UFUNCTION(BlueprintCallable, Category = "EmoShapePlugin")
      void Connect(const FString & ipAddress);
    
      ///EmoShapePlugin.cpp
    
      void UEmoShapePlugin::Connect(const FString& ipAddress) {
          FIPv4Address ip;
          FIPv4Address::Parse(ipAddress, ip);
          UE_LOG(LogTemp, Warning, TEXT("Response was %d"), ip.Value);
          TSharedRef<FInternetAddr> addr = ISocketSubsystem::Get(PLATFORM_SOCKETSUBSYSTEM)->CreateInternetAddr();
          bool isValid;
          addr->SetIp(*ipAddress, isValid);
          addr->SetPort(port);
          Socket->SetSendBufferSize(BufferSize, BufferSize);
          Socket->SetReceiveBufferSize(BufferSize, BufferSize);
          Socket->SetNonBlocking(false);//blocking while connecting so Connect() can complete
          //holds the socket state; this isn't 100% exact (e.g. if the server cuts the connection), but it is a decent indicator
          connected =  Socket->Connect(*addr);    
          Socket->SetNonBlocking(true);//non-blocking afterwards so it does not choke the game thread
          if (connected) {
              UE_LOG(LogTemp, Warning, TEXT("Connected"));
          }
          else {
              UE_LOG(LogTemp, Warning, TEXT("Not Connected"));
          }
      }    
    • Alternative Method to Connect
    ///EmoShapePlugin.h
    
    //This exists for the sole reason of connecting in a try-and-forget fashion with multiple IPs 
    //when you have no knowledge of the correct IP
    UFUNCTION(BlueprintCallable, Category = "EmoShapePlugin")
    void ConnectAlternative(const FString & ipAddress);
    
    ///EmoShapePlugin.cpp
    
    void UEmoShapePlugin::ConnectAlternative(const FString& ipAddress) {
      if (Socket) {
          ISocketSubsystem::Get(PLATFORM_SOCKETSUBSYSTEM)->DestroySocket(Socket);        
          Socket = ISocketSubsystem::Get(PLATFORM_SOCKETSUBSYSTEM)->CreateSocket(NAME_Stream, TEXT("default"), false);
      }
      FIPv4Address ip;
      FIPv4Address::Parse(ipAddress, ip);
      TSharedRef<FInternetAddr> addr = ISocketSubsystem::Get(PLATFORM_SOCKETSUBSYSTEM)->CreateInternetAddr();
      bool isValid;
      addr->SetIp(*ipAddress, isValid);
      addr->SetPort(port);
      Socket->SetSendBufferSize(BufferSize, BufferSize);
      Socket->SetReceiveBufferSize(BufferSize, BufferSize);
      Socket->SetNonBlocking(true);//non-blocking so it does not choke the game thread   
      Socket->Connect(*addr);
      
    }    
    • Init the EPU via TCPIP
      ///EmoShapePlugin.h
    
      //Calling this while the EPU is already active causes processor overhead; try calling something else first and check for an EPU_OFF message
      UFUNCTION(BlueprintCallable, Category = "EmoShapePlugin")
      void StartEPU();
    
      ///EmoShapePlugin.cpp
      
      void UEmoShapePlugin::StartEPU() {    
          msg = "[EPUID]@>EPUInit\r\n";    
          Socket->Send((uint8*)TCHAR_TO_UTF8(*msg), msg.Len(), BytesSent);   // msg.Len(), not sizeof(msg), so the whole command is sent
          UE_LOG(LogTemp, Warning, TEXT("Response was %d"), BytesSent);      
      }   
    • Set emotion overall sensitivity (optional - default 100% - range 70%-130%)
      ///EmoShapePlugin.h
    
      //Set the Sensibility (user and robot messages) of the EPU
      UFUNCTION(BlueprintCallable, Category = "EmoShapePlugin")
      void SetSenses(bool robot, FString value);
    
      ///EmoShapePlugin.cpp
    
      void UEmoShapePlugin::SetSenses(bool robot, FString value) {
          if(robot){
              msg = "[EPUID]@>robot\r\n";        
          }
          else {
              msg = "[EPUID]@>user\r\n";        
          } 
          Socket->Send((uint8*)TCHAR_TO_UTF8(*msg), msg.Len(), BytesSent);
          msg = "[EPUID]@>sens " + value + "\r\n";
          Socket->Send((uint8*)TCHAR_TO_UTF8(*msg), msg.Len(), BytesSent);
      }    
    • Set emotion overall persistence (optional - default 100% - range 10%-200%)
      ///EmoShapePlugin.h
    
      //Set the Persistence (user and robot messages) of the EPU
      UFUNCTION(BlueprintCallable, Category = "EmoShapePlugin")
      void SetPersistence(bool robot, FString value);
    
      ///EmoShapePlugin.cpp
    
      void UEmoShapePlugin::SetPersistence(bool robot, FString value) {
          if (robot) {
              msg = "[EPUID]@>robot\r\n";
          }
          else {
              msg = "[EPUID]@>user\r\n";
          }
          Socket->Send((uint8*)TCHAR_TO_UTF8(*msg), msg.Len(), BytesSent);
          msg = "[EPUID]@>time " + value + "\r\n";
          Socket->Send((uint8*)TCHAR_TO_UTF8(*msg), msg.Len(), BytesSent);    
      }
    • Objective Appraisal
      ///EmoShapePlugin.h
    
      //Set the Objective Appraisal ON (user and robot messages) of the EPU
      UFUNCTION(BlueprintCallable, Category = "EmoShapePlugin")
      void SetObjectiveAppraisal(bool robot);	
    
      ///EmoShapePlugin.cpp
    
      void UEmoShapePlugin::SetObjectiveAppraisal(bool robot) {
          if (robot) {
              msg = "[EPUID]@>robot\r\n";
          }
          else {
              msg = "[EPUID]@>user\r\n";
          }
          Socket->Send((uint8*)TCHAR_TO_UTF8(*msg), msg.Len(), BytesSent);
          msg = "[EPUID]@>objective_on \r\n";
          Socket->Send((uint8*)TCHAR_TO_UTF8(*msg), msg.Len(), BytesSent);
      }    
    • Symbolic Reinforcement Learning Part 1
      ///EmoShapePlugin.h
    
      //The EPU will generate an emotional response from the text and you can save it. (Max of 100 words)
      UFUNCTION(BlueprintCallable, Category = "EmoShapePlugin")
      void SetSimbolicLearning(FString message);
    
      ///EmoShapePlugin.cpp
    
      void UEmoShapePlugin::SetSimbolicLearning(FString message) {
          msg = "[EPUID]@>robot " + message + "\r\n";
          Socket->Send((uint8*)TCHAR_TO_UTF8(*msg), msg.Len(), BytesSent);
      }    
    • Symbolic Reinforcement Learning Part 2
      ///EmoShapePlugin.h
    
      //Save the SimbolicLearning to a word
      UFUNCTION(BlueprintCallable, Category = "EmoShapePlugin")
          void SaveSimbolicLearning(FString word);
    
      ///EmoShapePlugin.cpp
    
      void UEmoShapePlugin::SaveSimbolicLearning(FString word) {
          msg = "[EPUID]@>robot " + word + "\r\n";
          Socket->Send((uint8*)TCHAR_TO_UTF8(*msg), sizeof(msg), BytesSent);
      }    
    • Enable/disable tone of voice detection - C++
       ///EmoShapePlugin.h
     
      
       UFUNCTION(BlueprintCallable, Category = "EmoShapePlugin")
           void EnableDisableToneOfVoiceDetection(bool on);
     
       ///EmoShapePlugin.cpp
     
       void UEmoShapePlugin::EnableDisableToneOfVoiceDetection(bool on = false)
      {   
        if (on)
        {
            msg = "@>tov_on\r\n";
        }
        else
        {
            msg = "@>tov_off\r\n";
        }
        Socket->Send((uint8*)TCHAR_TO_UTF8(*msg), msg.Len(), BytesSent);
    } 
    • Enable/disable face tracking - C++
       ///EmoShapePlugin.h
     
      
       UFUNCTION(BlueprintCallable, Category = "EmoShapePlugin")
           void EnableDisableFaceTracking(bool on);
     
       ///EmoShapePlugin.cpp
     
       void UEmoShapePlugin::EnableDisableFaceTracking(bool on = false)
    {
        if (on)
        {
            msg = "@>ft_on\r\n";
        }
        else
        {
            msg = "@>ft_off\r\n";
        }
        Socket->Send((uint8*)TCHAR_TO_UTF8(*msg), msg.Len(), BytesSent);
    }   
    • Add/remove user for face tracking - C++
       ///EmoShapePlugin.h 
       
       UFUNCTION(BlueprintCallable, Category = "EmoShapePlugin")
           void AddRemoveUserForFaceTracking(FString username, bool add);
     
       ///EmoShapePlugin.cpp
     
       void UEmoShapePlugin::AddRemoveUserForFaceTracking(FString username, bool add = false)
    {
        if (username == "") return;
        if (add)
        {
            msg = "@>ft_user_add " + username + "\r\n";
        }
        else
        {
            msg = "@>ft_user_rm " + username + "\r\n";
        }
        LongLineSend(msg);
    }   
    • Initialize ASR
     ///EmoShapePlugin.h
    
    
     UFUNCTION(BlueprintCallable, Category = "EmoShapePlugin")
         void InitSpeechRecognition(FString key, FString region);
    
     ///EmoShapePlugin.cpp
    
     void UEmoShapePlugin::InitSpeechRecognition(FString key, FString region)
    {
      msg = "@>asr_init " + key + " " + region + "\r\n";
      LongLineSend(msg);
    }   
    • Enable/disable ASR
       ///EmoShapePlugin.h
     
      
       UFUNCTION(BlueprintCallable, Category = "EmoShapePlugin")
           void SetSpeechRecognition(bool on);
     
       ///EmoShapePlugin.cpp
     
       void UEmoShapePlugin::SetSpeechRecognition(bool on = false)
    {
     
        if (on)
        {
            msg = "@>asr_on\r\n";
        }
        else
        {
            msg = "@>asr_off\r\n";
        }
        LongLineSend(msg);
    }   
    • Request the detected user by face tracking - C++
       ///EmoShapePlugin.h
     
       
       UFUNCTION(BlueprintCallable, Category = "EmoShapePlugin")
           void GetCurrentFaceTrackingUser();
     
       ///EmoShapePlugin.cpp
     
       void UEmoShapePlugin::GetCurrentFaceTrackingUser()
       {
           
           msg = "@>detected_user\r\n";
           LongLineSend(msg);
       }   
    • Request supported languages - C++
       ///EmoShapePlugin.h
     
      
       UFUNCTION(BlueprintCallable, Category = "EmoShapePlugin")
           void GetSupportedLanguages();
     
       ///EmoShapePlugin.cpp
     
       void UEmoShapePlugin::GetSupportedLanguages()
    {
        msg = "[EPUID]@>supported_lang\r\n";
        LongLineSend(msg);
    }   
    • Set a language - C++
       ///EmoShapePlugin.h
     
      
       UFUNCTION(BlueprintCallable, Category = "EmoShapePlugin")
           void SetLanguages(FString index);
     
       ///EmoShapePlugin.cpp
     
       void UEmoShapePlugin::SetLanguages(FString index)
    {
        msg = "[EPUID]@> " + index + "\r\n";
        LongLineSend(msg);
    }  
    • Erase All Reinforcement Learning
      ///EmoShapePlugin.h
    
      //Erase all learned words and personality(not sure about this one)
      UFUNCTION(BlueprintCallable, Category = "EmoShapePlugin")
          void EraseLexic();
    
      ///EmoShapePlugin.cpp
    
      void UEmoShapePlugin::EraseLexic() {
          msg = "[EPUID]@>lexic_erase\r\n";
          Socket->Send((uint8*)TCHAR_TO_UTF8(*msg), msg.Len(), BytesSent);
      }    
    • Pause EPU
      ///EmoShapePlugin.h
    
      //E.g. you are talking to an NPC (the EPU) but you have unlimited time to respond
      UFUNCTION(BlueprintCallable, Category = "EmoShapePlugin")
          void PauseEPU();
    
      ///EmoShapePlugin.cpp
    
      void UEmoShapePlugin::PauseEPU() {
          msg = "[EPUID]@>pause\r\n";
          Socket->Send((uint8*)TCHAR_TO_UTF8(*msg), msg.Len(), BytesSent);
      }    
    • Resume EPU
      ///EmoShapePlugin.h
    
      //Same goes here make sure its paused first    
      UFUNCTION(BlueprintCallable, Category = "EmoShapePlugin")
      void ResumeEPU();
    
      ///EmoShapePlugin.cpp 
    
      void UEmoShapePlugin::ResumeEPU() {
          msg = "[EPUID]@>resume\r\n";
          Socket->Send((uint8*)TCHAR_TO_UTF8(*msg), msg.Len(), BytesSent);
      }    
    • Reset EPU
      ///EmoShapePlugin.h
    
      //Reset the EPU   
      UFUNCTION(BlueprintCallable, Category = "EmoShapePlugin")
      void ResetEPU();
    
      ///EmoShapePlugin.cpp 
    
      void UEmoShapePlugin::ResetEPU() {
          msg = "[EPUID]@>reset\r\n";
          Socket->Send((uint8*)TCHAR_TO_UTF8(*msg), msg.Len(), BytesSent);
      }    
    • Send a message for Appraisal
      ///EmoShapePlugin.h
      
      //Send a message for the EPU to process and generate an emotional response
      UFUNCTION(BlueprintCallable, Category = "EmoShapePlugin")
      void SendMessage(FString message);
    
      ///EmoShapePlugin.cpp
    
      void UEmoShapePlugin::SendMessage(FString message) { 
          if (talkmode) {
              msg = "[EPUID]@>robot " + message + "\r\n";
          }
          else {
              msg = "[EPUID]@>user " + message + "\r\n";
          }   
          
          Socket->Send((uint8*)TCHAR_TO_UTF8(*msg), msg.Len(), BytesSent); 
          UE_LOG(LogTemp, Warning, TEXT("My message is %s"), *msg);  
      }    
    • Ask for the BUFFER STRUCTURE MDAD (Emo-Matrix)
      ///EmoShapePlugin.h
      
      //This one is used to ask for the emotional buffer, for later retrieval with ReceiveMessage.
      UFUNCTION(BlueprintCallable, Category = "EmoShapePlugin")
         void AskForEmotionalBuffer();
    
      ///EmoShapePlugin.cpp
          
      void UEmoShapePlugin::AskForEmotionalBuffer() {   
          msg = "[EPUID]@>buffer\r\n";
          Socket->Send((uint8*)TCHAR_TO_UTF8(*msg), msg.Len(), BytesSent);        
      }   
    • Receive Any Message
      ///EmoShapePlugin.h
      
      //This is the receive method; call it whenever you want to receive something.
      //Note that this uses a non-blocking setting, so if you put a .Send() and a .Recv() right after each other it will not return the correct values;
      //a loop is used to grab the receives.
      //Usually you only care about the emotional buffer; anything else is irrelevant to the game, except if the EPU stops working,
      //but even then you will get an EPU_OFF message when trying to get the buffer, so you can act accordingly.
      //An alternative is using threads or AsyncTask, putting this together with another send and running it asynchronously, but that seems like over-engineering for a wait-and-receive.
      UFUNCTION(BlueprintCallable, Category = "EmoShapePlugin")
          void ReceiveMessage();  
    
      ///EmoShapePlugin.cpp
    
      void UEmoShapePlugin::ReceiveMessage() {    
          ReceivedData.Init(0, 512);
          Socket->Recv(ReceivedData.GetData(), ReceivedData.Num(), BytesSent);   
      }    
    • Example on how to Store the Buffer
      //EmoShapePlugin.h
      
      //Convert response bytes to FString
      //Usually you will use the ReceivedData array for this, but it is left open in case someone wants to do something different
      UFUNCTION(BlueprintCallable, Category = "EmoShapePlugin")
          FString UTF8toString(const TArray<uint8>& BinaryArray);
      
      //You have to call ParseIntoArray on the FString returned by UTF8toString    
      UFUNCTION(BlueprintCallable, Category = "EmoShapePlugin")
          void UpdateEmotionTMap(TArray<FString> mArray);
    
      //EmoShapePlugin.cpp
    
      FString UEmoShapePlugin::UTF8toString(const TArray<uint8>& BinaryArray) {
          //This was not written by me and I forgot where I got it, but kudos to the creator
          //Create a string from a byte array!
          const std::string cstr(reinterpret_cast<const char*>(BinaryArray.GetData()), BinaryArray.Num());
    
          //FString can take in the c_str() of a std::string
          return FString(cstr.c_str());
      }
    
      void UEmoShapePlugin::UpdateEmotionTMap(TArray<FString> mArray) {
          //This is to be sure the recv got the emotional buffer and its not something else.
          if (mArray.Num() == 123) {
              mEmotions.Add(TEXT("channelEXCITED"), mArray[4]);
              mEmotions.Add(TEXT("channelCONFIDENT"), mArray[14]);
              mEmotions.Add(TEXT("channelHAPPY"), mArray[24]);
              mEmotions.Add(TEXT("channelDESIRE"), mArray[34]);
              mEmotions.Add(TEXT("channelTRUST"), mArray[44]);
              mEmotions.Add(TEXT("channelFEAR"), mArray[54]);
              mEmotions.Add(TEXT("channelSURPRISE"), mArray[64]);
              mEmotions.Add(TEXT("channelINATTENTION"), mArray[74]);
              mEmotions.Add(TEXT("channelSAD"), mArray[84]);
              mEmotions.Add(TEXT("channelREGRET"), mArray[94]);
              mEmotions.Add(TEXT("channelDISGUST"), mArray[104]);
              mEmotions.Add(TEXT("channelANGER"), mArray[114]);
              TArray<FString> Array;
              mArray[122].ParseIntoArray(Array, TEXT("C"), true);
              mEmotions.Add(TEXT("channelPAINPLEASURE"), Array[0]);            
          }
          else {
              //Console command replies will end up in this branch; handle them however you want if they are needed            
          }        
      }   
    • Send custom emotional wave
      //EmoShapePlugin.h
    
      //Manually inject an emotion; this can be useful to make sure something will trigger within the EPU
      UFUNCTION(BlueprintCallable, Category = "EmoShapePlugin")
          void InjectEmotion(FString emotion, FString level, FString duration, FString origin_id, FString apex, FString curve);
    
      //EmoShapePlugin.cpp 
      
      void UEmoShapePlugin::InjectEmotion(FString emotion, FString level, FString duration, FString origin_id, FString apex, FString curve) {
          //maybe this will be moved to a Format call later, but this way seems clearer for everyone; change it if you want.
          msg = "[EPUID]@>wave " + emotion.ToLower() + "," + level + "," + duration + "," + origin_id + "," + apex + "," + curve + "\r\n";
          Socket->Send((uint8*)TCHAR_TO_UTF8(*msg), msg.Len(), BytesSent);
          UE_LOG(LogTemp, Warning, TEXT("Injection %s"), *msg);
      }   
    • Stop Connection from the Socket
      //EmoShapePlugin.h
    
      UFUNCTION(BlueprintCallable, Category = "EmoShapePlugin")
          void StopConnection();
    
      //EmoShapePlugin.cpp
    
      void UEmoShapePlugin::StopConnection() {
          Socket->Close();
      }    
    • Send a Post to the WebAPI
    //EmoShapePlugin.h
    
    UFUNCTION(BlueprintCallable, Category = "EmoShapePlugin")
      void Post(FRequest_Post LoginCredentials);
    
    //EmoShapePlugin.cpp
    
      void UEmoShapePlugin::Post(FRequest_Post LoginCredentials) {   
        FString ContentJsonString;
        GetJsonStringFromStruct(LoginCredentials, ContentJsonString); 
        TSharedRef<IHttpRequest> Request = PostRequest("application/json", ContentJsonString);    
        Request->OnProcessRequestComplete().BindUObject(this, &UEmoShapePlugin::PostResponse); 
        Send(Request);  
      }    
    • Summary for Blueprints
      I'll show here how to call and use the blueprint nodes made available by the plugin.
    
      Each blueprint section follows this layout:  
      Inputs/Parameters:
        -Name|Type or None
      Outputs:
        -Name|Type or None        
      Image:
          Image HTML link       
    • Plugins Parameters
        //Those Values are inside the EmoShape plugin component
        Parameters:
          //Byte stream received from the buffer, Get Only.
          -ReceivedData|byte[]          
          //Emotion TMap. Get Only
          -mEmotions|Map[]
          //To check the size of the received buffer, Get Only
          -BytesSent|integer        
          //To check if the plugin is connected to the EPU; this does not tell you whether the EPU is running. Get Only
          -Connected|bool  
          //for sending messages only, robot = true, user = false. Get and Set
          -TalkMode|bool      
        
        //This shows how to add the EmoShape plugin within your actor.
    
           
      Unreal EPU Plugin
    • Create a TCPIP Client to connect to the EPU SDK.
        //Connects the socket to the EPU address; do not add the port to the IP input. This will freeze the screen for about 1 second, so use it in a loading screen, etc.
    
        Inputs:
          -IP|String
        Outputs:
          -None        
      
      Unreal EPU Plugin
    • Alternate way to Connect
      //Connect method for try-and-forget attempts until the correct IP is found
    
      Inputs:
        -IP|String
      Outputs:
        -None        
    
      Unreal EPU Plugin
    • Init the EPU via TCPIP
        //Init the EPU in case it is off; this takes a while, so it is best to add a timer or a delay of 3 to 5 seconds.
    
        Inputs:
          -None
        Outputs:
          -None  
      
      Unreal EPU Plugin
    • Set emotion overall sensitivity (optional - default 100% - range 70%-130%)
        //Set the emotional sensibility from robot or user messages.
    
        Inputs:
          -Robot|bool
          -Value|String
        Outputs:
          -None       
         
      Unreal EPU Plugin
    • Set emotion overall persistence (optional - default 100% - range 10%-200%)
        //Set the emotional persistence from robot or user messages.
    
        Inputs:
          -Robot|bool
          -Value|String
        Outputs:
           -None     
      
      Unreal EPU Plugin
    • Objective appraisal
        //Set the objective appraisal on; the next message will be appraised without taking the current emotional levels into account.
    
        Inputs:
          -Robot|bool
        Outputs:
          -None  
      
      Unreal EPU Plugin
    • Symbolic reinforcement learning - Part 1
        //Ask the EPU to analyze a text (max of 100 words) and generate an emotional response from it, so it can be saved under a word for later usage.
    
        Inputs:
          -Message|String
        Outputs:
          -None
       
      Unreal EPU Plugin
    • Symbolic reinforcement learning - Part 2
        //Save the emotional response from the learning process within a word
    
        Inputs:
          -Word|String
        Outputs:
          -None
        
      Unreal EPU Plugin
    • Enable/disable tone of voice detection
        
        Inputs:
          -On|bool
        Outputs:
          -None
        
      Unreal EPU Plugin
    • Enable/disable face tracking
        
        Inputs:
          -On|bool
        Outputs:
          -None
        
      Unreal EPU Plugin
    • Add/remove user for face tracking
       
        Inputs:
          -UserName|String
          -Add|bool
        Outputs:
          -None
        
      Unreal EPU Plugin
    • Enable/disable Speech Recognition
        
    
        Inputs:
          -On|bool
        Outputs:
          -None
        
      Unreal EPU Plugin
    • Request the detected user by face tracking
        
    
        Inputs:
          -None
        Outputs:
          -None
        
      Unreal EPU Plugin
    • Request supported languages
        
        Inputs:
          -None
        Outputs:
          -None
        
      Unreal EPU Plugin
    • Set a language
        //Set the SDK language based on the languages index
    
        Inputs:
          -Index|String
        Outputs:
          -None
        
      Unreal EPU Plugin
    • Erase all reinforcement learning
        //Erase all learned words
    
        Inputs:
          -None
        Outputs:
          -None
        
      Unreal EPU Plugin
    • Pause EPU
        //Pause the EPU; be sure the EPU is running when calling this. This will not generate an EPU_OFF message
    
        Inputs:
          -None
        Outputs:
          -None      
    
      Unreal EPU Plugin
    • Resume EPU
        //Resume the EPU, be sure to have the EPU paused when calling this
    
        Inputs:
          -None
        Outputs:
          -None    
    
      Unreal EPU Plugin
    • Reset EPU
        //Reset the EPU, be sure to have the EPU paused when calling this
    
        Inputs:
          -None
        Outputs:
          -None    
    
      Unreal EPU Plugin
    • Send a message to the EPU
        //Send a message so the EPU can generate an emotional response.
    
        Inputs:
          -Message|String
        Outputs:
          -None
      
      Unreal EPU Plugin
    • Ask for the BUFFER STRUCTURE MDAD (Emo-Matrix)
        //Send a message to the epu to get the emotional buffer
    
        Inputs:
          -None
        Outputs:
          -None     
         
      Unreal EPU Plugin
    • Receive any message
        //This node receives the buffer sent by the EPU and writes it to ReceivedData for later use.
    
        Inputs:
          -None
        Outputs:
          -None     
          
      Unreal EPU Plugin
    • Example on how to store the buffer
        //An example of how to retrieve, convert and store the emotional buffer in an array
    
        //UTF8ToString Node
        Inputs:
          -ReceivedData|byte[]        
        Outputs:
          -result|String 
    
        //ParseIntoArray, this is a default node
        
        Inputs:
          -temp|String       
        Outputs:
          -result|String[]
    
        //UpdateEmotionTMap Node
        
        Inputs:
          -temp|String[]      
        Outputs:
          -None
        
          
      Unreal EPU Plugin
    • Send custom emotional wave
        //Manually inject an emotion; this can be useful to make sure something will trigger within the EPU
    
        Inputs:
          -emotion
          -duration
          -level
          -origin_id
          -apex
          -curve
        Outputs:
          -None     
        
      Unreal EPU Plugin
    • Stop connection from the socket
        //This will close the socket connection only.
    
        Inputs:
          -None
        Outputs:
          -None     
       
      Unreal EPU Plugin
    • Send a Post to the WebAPI
      //By using a struct you can put anything you want into the JSON request message.
    
      Inputs:
        -RequestInfo|FRequest_Post*
      Outputs:
        -None  
        
      *Struct name, you can check it in HttpStruct.h
     
      Unreal EPU Plugin

    This documentation is provided by Emoshape Inc.

    © Copyright Emoshape Inc. All Rights Reserved.