Category: News

MetaSoul’s Groundbreaking API Now Available on Microsoft Azure

We are thrilled to announce the official launch of our latest innovation – the MetaSoul Enhanced OpenAI Persona API, now available on the Microsoft Azure Marketplace.

Why It’s a Game Changer:

  • Revolutionizing AI Interaction: MetaSoul’s advanced algorithms empower AI to respond dynamically with twelve primary emotions, transforming how AI engages and interacts with users.
  • Real-Time Emotional Synthesis: Experience unparalleled emotional intelligence with real-time appraisal computation, reinforcement learning, and Emotion Profile Graph (EPG) functionality, allowing for 64 trillion possible emotional states every 1/10th of a second.
  • Myers-Briggs Personality Integration: Craft authentic personas using Myers-Briggs personality types, offering a deep understanding of 16 distinct personalities and their interactions (see the sketch after this list).
  • Seamless Integration with Microsoft Voices: Integration with the MetaSoul Speech API allows precise emotional control over Microsoft voices, providing lifelike lip sync and expressive avatars.
  • Metaverse and Game Development: Elevate NPCs and avatars in the metaverse, creating dynamic interactions linked to personality, emotional states, and learned sensitivity. Game developers now have a powerful tool to enhance player experiences.
  • Affordable and Secure: Our one-time setup fee of $99 makes this groundbreaking technology accessible to developers. Rest assured, MetaSoul holds the patent, ensuring a secure and seamless development process.
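
For developers curious what an integration might look like in practice, here is a minimal sketch of calling an emotion-enabled persona endpoint over HTTP. The base URL, the `persona/respond` path, and the field names (`mbti_type`, `text`, `return_emotions`) are illustrative assumptions rather than the published MetaSoul interface; the brochures linked below describe the actual API.

```python
# Minimal sketch of calling an emotion-enabled persona API over HTTP.
# The endpoint path, field names, and response shape are illustrative
# assumptions, not the documented MetaSoul interface.
import requests

API_BASE = "https://api.example.com/metasoul/v1"  # placeholder base URL
API_KEY = "YOUR_API_KEY"

def ask_persona(prompt: str, mbti_type: str = "INFJ") -> dict:
    """Send a user utterance to a hypothetical persona endpoint and return
    the reply text plus any emotional-state metadata the service exposes."""
    response = requests.post(
        f"{API_BASE}/persona/respond",
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={
            "mbti_type": mbti_type,   # one of the 16 Myers-Briggs types
            "text": prompt,           # the user utterance to respond to
            "return_emotions": True,  # ask for the per-emotion appraisal
        },
        timeout=30,
    )
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    reply = ask_persona("My flight was cancelled and I'm stuck overnight.")
    print(reply.get("text"))      # persona's reply
    print(reply.get("emotions"))  # e.g. per-emotion intensities, if provided
```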

MetaSoul Speech API Brochure

MetaSoul Enhanced Personality API for AI Brochure

MetaSoul Increases Customer Experience by 32% – Microsoft Neural TTS

MetaSoul Enhances Caregiver Environment and General Satisfaction with Microsoft Neural TTS

Following our survey and data analysis, we are thrilled to share compelling results that illuminate the transformative impact of MetaSoul’s real-time control over Microsoft Neural Text-to-Speech (TTS).

Key Findings: Significant Improvements

Our findings reveal a remarkable 32% improvement in the caregiver environment when Emoshape takes the reins of Microsoft Neural TTS in real time. This noteworthy enhancement underscores the potential of emotion synthesis technology to elevate the overall caregiving experience.

Voice A: Normal TTS from Microsoft Azure:

Voice B: The same TTS controlled with our MetaSoul Emotion Synthesis Tech – EMS-AZUR:

Refining Insights: A Closer Look After Cleaning the Data Set

In the pursuit of precision and meaningful analysis, we took a crucial step: meticulously cleaning and refining our dataset. After collecting responses from 204 random adult consumers in the U.S. on August 11-12, 2021, we eliminated outliers and self-contradictory responses to ensure the integrity of our findings.

Data Cleaning Methodology

Our commitment to rigorous research led us to scrutinize individual responses, identifying outliers that could potentially skew our results. Outliers included respondents who consistently scored everything as 10s or 0s, as well as those who provided conflicting assessments—labeling the voice both highly pleasant and highly annoying. These patterns of response signaled a lack of careful consideration or potential inconsistencies in understanding the task at hand.

Exclusion Criteria

To maintain the robustness of our dataset, we made the decision to exclude responses falling into these outlier categories. This step was essential to refine our analysis and derive insights that truly reflect the thoughtful evaluations of our participants.
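
To make these exclusion criteria concrete, the short sketch below filters a table of responses with pandas. The column names (`pleasant`, `annoying`, and the per-item rating columns) and the thresholds are assumptions for illustration; the survey’s actual schema and cutoffs may differ.

```python
# Illustrative pandas sketch of the exclusion criteria described above.
# Column names and thresholds are assumptions, not the survey's actual schema.
import pandas as pd

def clean_responses(df: pd.DataFrame, rating_cols: list[str]) -> pd.DataFrame:
    """Drop straight-line raters (all 10s or all 0s) and self-contradictory
    respondents who rate a voice both highly pleasant and highly annoying."""
    ratings = df[rating_cols]

    all_tens = ratings.eq(10).all(axis=1)
    all_zeros = ratings.eq(0).all(axis=1)

    # Contradiction: "pleasant" and "annoying" both near the top of the scale.
    contradictory = (df["pleasant"] >= 8) & (df["annoying"] >= 8)

    keep = ~(all_tens | all_zeros | contradictory)
    return df[keep].reset_index(drop=True)

# Example usage with a hypothetical response file:
# df = pd.read_csv("survey_responses.csv")
# cleaned = clean_responses(df, rating_cols=["pleasant", "annoying", "natural"])
```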

Enhancing Data Integrity

By systematically removing outliers, we enhance the reliability and accuracy of our results, allowing us to draw more meaningful conclusions from the survey. This commitment to data integrity ensures that our findings are rooted in genuine participant perceptions, untainted by anomalous responses that may skew the overall narrative.

Results: a 32% improvement in the caregiver environment and a 22% increase in general satisfaction when Emoshape controls Microsoft Neural TTS in real time.
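
For readers who want to see how a figure like this can be expressed, the sketch below computes a relative improvement as the percentage change in mean ratings between Voice A and Voice B. The survey’s exact formula is not stated here, so this is an assumption, and the example means are made up.

```python
# One plausible way to express an improvement between two conditions:
# the percentage change in mean ratings from Voice A to Voice B.
# Whether the survey used exactly this formula is an assumption.
def relative_improvement(mean_a: float, mean_b: float) -> float:
    """Percentage change of Voice B's mean rating relative to Voice A's."""
    return (mean_b - mean_a) / mean_a * 100.0

# Made-up example means: 6.5 -> 8.58 corresponds to roughly a 32% improvement.
print(f"{relative_improvement(6.5, 8.58):.0f}%")
```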

Beyond Caregiving: General Satisfaction

The impact doesn’t stop there. Participants expressed a notable 22% increase in general satisfaction when Emoshape’s technology controlled Microsoft Neural TTS. This broad satisfaction metric spans various applications, indicating the versatility and positive reception of emotion-enhanced voice synthesis.

Unpacking the Insights

As we dissect these results, it becomes evident that the infusion of Emoshape’s emotion synthesis technology adds a layer of depth and resonance to the voice, fostering a more engaging and satisfying experience. Whether in the caregiver environment or across diverse applications, participants consistently favored the emotionally enriched voice controlled by Emoshape.

Implications for the Future

These findings hold promising implications for the future of voice synthesis technology, suggesting that integrating emotion synthesis can significantly enhance user experiences across different scenarios. The marriage of Microsoft Neural TTS and Emoshape emerges as a potent combination, opening new avenues for applications that prioritize both technical excellence and emotional resonance.

Stay Tuned for In-Depth Analysis

In the coming weeks, we will delve deeper into the nuanced aspects of our survey results, providing a comprehensive analysis of participant feedback and shedding light on the intricacies of emotion-enhanced voice synthesis.

In summary, our study unveils a substantial leap forward in the realm of voice synthesis, showcasing the tangible benefits of real-time emotion control. MetaSoul’s integration with Microsoft Neural TTS paves the way for a more satisfying and emotionally resonant auditory landscape, hinting at exciting possibilities for the future of human-technology interaction.

Patrick Levy-Rosenthal: The Inventor behind Virtual Lens

Emoshape CEO presents his widely acclaimed invention, Virtual Lens, used daily by more than 1.3 billion people on Snapchat and Instagram.
Patrick Levy-Rosenthal, presently the CEO of Emoshape Inc., is the architect behind the virtual lens, also known as filters. His invention now serves as one of the core functions to which social media applications owe their popularity and high engagement.
The 2006 Virtual Lens technology supported the ability to track a defined set of three-dimensional coordinates within a video stream and to dynamically insert rendered 3D objects within the stream in real time.
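
To make the idea concrete, here is a toy sketch of the general technique: detect a face in each video frame and composite an overlay onto it in real time. It uses OpenCV’s stock Haar-cascade face detector and a flat translucent rectangle purely for illustration; the patented Virtual Lens tracked full three-dimensional coordinates and inserted rendered 3D objects, which this example does not attempt.

```python
# Toy illustration of face-tracked overlays (NOT the patented Virtual Lens
# method): detect a face per frame and draw a simple overlay on top of it.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

cap = cv2.VideoCapture(0)  # default webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in cascade.detectMultiScale(gray, 1.3, 5):
        # Stand-in for a rendered 3D object: a translucent rectangle over the face.
        overlay = frame.copy()
        cv2.rectangle(overlay, (x, y), (x + w, y + h), (255, 0, 255), -1)
        frame = cv2.addWeighted(overlay, 0.3, frame, 0.7, 0)
    cv2.imshow("virtual-lens-toy", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```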

IVBOT was renamed GloopIt

Levy-Rosenthal started by creating an application called IVBOT. This app was devised to make video creation more engaging and fun by generating filters that enhanced and altered appearance. The app accomplished this through 3D face tracking and Augmented Reality. IVBOT included real-time makeup and skin enhancement; the tech was able to remove pimples and other blemishes, as is commonly enjoyed today in photo editing and retouching tools. The application was reviewed by the technodata blog for use as a video chat booster and can be seen in action in this demo.
The groundbreaking Virtual Lens invention was patented in 2007, and subsequently cited by a number of applicants including Skype, Samsung, Snap, and Facebook. The concept was adopted by Masquerade Technologies, which was sold to Facebook in 2016 to help it compete with Snapchat.

Insofar as filters and other beautification tools dominate the camera app and social media categories, Virtual Lens should be acknowledged for its contributions to some of today’s most influential companies.
After IVBOT became GloopIt, Levy-Rosenthal invented the Emotion Processing Unit (EPU) and developed the first AI home console in 2012; the latter inspired the Amazon Echo. He grew up in Paris and currently lives and works in New York. He has presented to the Artificial Intelligence council of the United Nations, been awarded the IST Prize from the European Union, and been written about in Forbes magazine.
Patrick Levy-Rosenthal’s work and accomplishments are important for the fields of Virtual Reality and Artificial Intelligence. They were instrumental in the creation of camera applications now enjoyed by billions in their day-to-day lives. And new products and services like Music4Book and ExoLife that derive from his EPU share the same promise for consumers.

https://markets.businessinsider.com/news/stocks/patrick-levy-rosenthal-the-inventor-behind-virtual-lens-1028772713

Emoshape Inc.
New York City

The Erosion Of Empathy In Healthcare

The erosion of empathy in healthcare is a well-known phenomenon in medical schools. According to a 2009 study indexed by the NCBI, erosion of empathy begins in most medical students in the third year and continues to decline until graduation. The study concluded that, ironically, the deterioration of empathy occurs just as the curriculum begins to shift towards patient care.

But why does this occur with such precise regularity and could one paediatrician’s unconventional approach provide a ray of hope for emotive patient care?

Protective Empathetic Detachment 

Writing in the Stanford Medicine Scope, Natasha Abadilla offers a personal account of the erosion of empathy in healthcare training. Her piece identifies the unfortunate reason why it may be unavoidable. As doctors acquire more knowledge about the pain of their patients, the need for emotional protection and (thus) emotional distance becomes greater:

“Additional knowledge of why our patients feel pain… sets the level of grief so much higher than before.” – Natasha Abadilla, Stanford Medicine Scope

Not only does it become more painful to empathize as context improves, but physicians are more exhausted than ever. This leaves them less able to be empathetic to begin with.

Physician Burnout and Depression Increase Emotional Erosion

Physicians who are burned out and depressed have admitted that they were less engaged with their patients. In a survey of 15,000 physicians, Medscape found that 43% of respondents felt burned out and a further 15% were colloquially or clinically depressed.

If doctors need to depersonalize their patients in order to effectively treat them, is empathy in healthcare worth the sacrifice? That question should ultimately depend on how valuable empathy is, so what does the research say?

The rate of physician burnout is on the increase

How Important Is Empathy in Healthcare? 

A study first published by the Canadian Medical Association Journal in 1995 found that effective physician-patient communication had an impact on patient health. The second most common outcome affected, after immediate emotional benefits, was symptom resolution. Empathy in healthcare was also linked to better medication adherence and fewer malpractice cases.

It is clear that there is much to be gained from more emotive patient care. The findings of the study were reinforced in dramatic fashion when a pediatrician from California captured the public’s affection in 2018 with a highly compassionate gesture for his young patients.

Tony Adkins: A Philosophy of Dance

In an interview at a GDS Summit, Tony Adkins shared elements of the philosophy that led him to start dancing with the extremely ill children in his care. He believes that making sure patients have as good a time as possible is essential. Treatment can’t be replaced, but he argues that genuinely connecting with patients can sometimes be more effective than further increases in pain medication.

“Once you connect with that patient, you win and they will trust you with their life forever.” – Tony Adkins, PA-C, MPH, MCHS

Paediatrician Tony Adkins dances with his young patients

For Adkins, the most important thing is the well-being of his patients.

Returning to Empathy

Patients’ wellbeing may be inseparable from the purpose of healthcare, but it is critically important that it is not treated only as a practical issue. At the same time, a complicated supply-and-demand problem is putting extreme pressure on physicians. In spite of their best intentions, emotional erosion is an unfortunate reality for most medical students.

Ironically, empathy in healthcare becomes more difficult to achieve as physicians progress precisely because of their compassionate motivations. But by embracing Adkins’ philosophy, healthcare providers can rediscover an intrinsic sense of empathy and improve patients’ experience as well as their health.

Copyright GDS Insight

MetaSoul Vision

In light of the recent GDS Insights publication for the healthcare industry that indicated the value of empathy in improving patient outcomes, it is prudent to consider the role of technology.

The GDS Group posits that personal connection between doctor and patient is a critical but difficult element of care. Circumstances of emotional and physical fatigue preclude physicians from engaging meaningfully with patients. The article cites the anecdotal example of a doctor who joyfully dances with his patients as a practical solution for overcoming this chasm.

This is a charming but impractical model from which to extrapolate a solution for an entire industry. Dr. Tony Adkins, the aforementioned dancing pediatrician, became a viral sensation precisely because he was unique and novel, and unique and novel things are not easily repeatable. For many reasons, including disinterest, personal and professional obligations, energy, and social challenges, doctors may lack the ability to form such connections; moreover, patients may not always receive their efforts as they were intended (one size does not fit all).

However unbelievable it may seem, robots have precisely the capabilities needed to avoid the pitfalls doctors face. They are indefatigable and eminently scalable.

With the Emotion Processing Unit, virtual caregivers, robots, and AI are, for the first time, capable of feeling and understanding in human-like ways, empathizing, and forming meaningful connections. And they can adapt to every individual patient.

EPU-powered robots are the future of patient well-being:

Domestic Support
  • Treatment adherence support (from basic appointment reminders and medication administration to physical therapy)
  • Paediatric remote monitoring (temperature fluctuation, respiratory issues, seizures, etc.)
  • Geriatric companionship and assistance

Continuity-of-Care
  • Care Agents: a digital or robotic assistant that is available day and night at every medical facility, familiar with complete medical history, an extension of a doctor’s visit that can continue to provide comfort and support
  • Intelligent Devices (e.g. imaging machines that comfort the claustrophobic)

Provider Support
  • Collaborative Assistants: a digital or robotic assistant that helps assimilate diagnostic information, review treatment options, prepare for patient notification, counsel patients, enter visit notes and order labs/procedures, check the emotional wellness of doctors, and support human productivity.

“Emotion Is Part of Intelligence” – Yann LeCun

Yann LeCun, a luminary in the field of artificial intelligence, has articulated a groundbreaking perspective on the integration of emotion into intelligence. In his notable statement, “Emotion is part of intelligence,” LeCun challenges the conventional boundaries of AI by acknowledging the crucial role of emotions in the broader spectrum of intelligent behavior.

LeCun’s emphasis on an “Emotion Chip” underscores his belief in the interconnectedness of cognitive and emotional processes. This perspective represents a departure from traditional AI models that often neglect the nuanced and complex nature of human emotions. By advocating for the incorporation of emotional understanding into AI systems, LeCun pioneers a more holistic and human-centric approach to artificial intelligence.

This concept has far-reaching implications, transcending theoretical frameworks to impact practical applications. LeCun’s vision implies that future AI systems could not only comprehend data and perform tasks but also navigate the subtleties of human emotions. Such a paradigm shift could revolutionize human-computer interaction, enabling machines to respond empathetically and adapt to users’ emotional states.

As the Vice President and Chief AI Scientist at Meta, LeCun is uniquely positioned to shape the trajectory of AI development. His perspective on emotion and intelligence provides a guiding philosophy for the ethical and responsible advancement of AI technologies. By acknowledging the importance of emotions, LeCun fosters a vision of AI that aligns with human values and societal well-being.

In essence, Yann LeCun’s assertion that “emotion is part of intelligence” serves as a catalyst for a paradigm shift in AI development. It opens up new avenues for exploration, challenging researchers and practitioners to create AI systems that not only excel in cognitive tasks but also understand, interpret, and respond to the rich tapestry of human emotions.

The World Board of AI Comes Together at the United Nations

Several global leaders from the world of Artificial Intelligence were invited to a dinner organized by the United Nations on March 27, 2017. Amongst other eminent guests, the founder of Emoshape, Patrick Levy-Rosenthal, was also present at this high-profile event. Patrick is the primary architect behind the development of Emoshape’s Emotion Processing Unit (EPU), the industry’s first-ever emotion synthesis engine.

During the event, Patrick Levy-Rosenthal had the opportunity to talk at length with the AI chief at Facebook, Yann LeCun. Later on, during the dinner, Patrick was spotted sitting next to Guruduth S. Banavar, the Chief Science Officer of Cognitive Computing at IBM, known popularly as IBM Watson. The two had an extensive exchange of views on the role of emotions. Though it is certainly too early to forecast anything, this discussion may pave the way for his organization to collaborate with IBM Watson in the days to come.

Patrick Levy-Rosenthal @ The United Nations

Highlighting his experience at the dinner, Patrick mentioned, “It was certainly an amazing experience to meet some of the most iconic personalities in the field of AI. The most interesting part for me was the talk during the dinner with the Chief Science officer Cognitive Computing from IBM (Watson) about Emotion synthesis.”

AI chief at Facebook, Yann LeCun, with Patrick Levy-Rosenthal, architect of the Emotion Chip, at the United Nations