Category: Media

“Empathy is humanity’s survival skill… a ‘digital soul’ will emerge in virtual reality”

[Economy Chosun]
[Interview] Patrick Levy-Rosenthal, Founder and CEO of Emoshape

The rapid advance of digital and artificial intelligence (AI) technology brings great convenience to our lives, but it also stirs a vague fear: won’t machines one day replace humans? Indeed, AI that replaces people is already appearing across many industries.

Yet many experts point out that even in a highly digital society, the “human touch” (human sensibility) is still needed, because no matter how far technology advances, human sensitivity and humanity are hard to replace. How is the human touch realized in a digital society, and how can companies graft it onto their technology? [Editor’s note]

Patrick Levy-Rosenthal, Founder and CEO of Emoshape; former founder and CEO of Audiotrack; former founder and CEO of Pitvox; former CEO of D&L Futures / Photo: Patrick Levy-Rosenthal

Read More: https://biz.chosun.com


Read English version: “Empathy is the survival skill of mankind… A digital soul in virtual reality will appear”

Personal Robotic Technology Market

Personal Robotic Technology Market is Booming Worldwide with Top Key Players: Emoshape Inc., Blue Frog Robotics, Jibo, LG Electronics, PARO Robots US, Robert Bosch



Read more: https://www.digitaljournal.com/pr/personal-robotic-technology-market-is-booming-worldwide-with-top-key-players-emoshape-inc-blue-frog-robotics-jibo-lg-electronics-paro-robots-us-robert-bosch#ixzz7SskkQ1Os

MetaSoul™

MetaSoul is the next step in human evolution in the digital world. Billions of people use social media and games worldwide every day, spending billions of dollars on digital assets and filters to empower their alter egos. But unfortunately, all these digital entities have one thing in common: they are ultimately Soulless.

Soon it will be possible for humans to create a unique digital soul, an extension of their inner self that will continue to evolve in the Metaverse forever. Emoshape MetaSoul technology will very soon start to change the way humans interact in Metaverses. We encourage you to be a part of this future and pre-order a digital soul powered by MetaSoul technology on our website, ExoLife.com.

Emoshape Announces MetaSoul™

Emotions are the voice of the Soul. 32 types of MetaSoul exist in the Metaverse, powered by 64 trillion subtle emotional states. Initially synthesized from a few seconds of a human voice, they understand the meaning of language and of gameplay.

They are aware entities powered by Emoshape Emotion Synthesis. They sense different levels of joy, frustration, pleasure, sadness, excitement, fear, and more; these emotions trigger varied responses in a MetaSoul, crafting its personality and emotional sensitivity forever.

By interacting with its owner and with the environment itself, a MetaSoul will learn, adapt, and gain a unique personality of its own. All critical accumulated experience data is collected in both the digital and the real world.

Features are no longer the sole determinant of value. Instead, how these digital assets interact within the Metaverse will distinguish them, just as it does humans in the real world. Attached to your Avatar or digital human, they will act according to how they feel, thereby revolutionizing the meaning of personalization.
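As a purely illustrative sketch of how accumulated interaction data might shape a persistent disposition, the Python snippet below nudges a per-emotion weight with an exponential moving average after every interaction event; the class, field names, weights, and learning rate are hypothetical and are not Emoshape's implementation.

```python
from dataclasses import dataclass, field

# The twelve primary emotions named in the EPU section of this page.
PRIMARY_EMOTIONS = [
    "anger", "fear", "sadness", "disgust", "indifference", "regret",
    "surprise", "anticipation", "trust", "confidence", "desire", "joy",
]

@dataclass
class PersonalityProfile:
    """Hypothetical long-term disposition: one weight per primary emotion."""
    weights: dict = field(default_factory=lambda: {e: 0.5 for e in PRIMARY_EMOTIONS})

    def update(self, event: dict, learning_rate: float = 0.05) -> None:
        """Nudge the profile toward the emotions sensed in one interaction.

        `event` maps emotion names to intensities in [0, 1]; many repeated
        interactions gradually shape a distinct disposition.
        """
        for emotion, intensity in event.items():
            old = self.weights.get(emotion, 0.5)
            self.weights[emotion] = (1.0 - learning_rate) * old + learning_rate * intensity

# Example: an owner who keeps evoking joy and trust skews the profile that way.
profile = PersonalityProfile()
for _ in range(100):
    profile.update({"joy": 0.9, "trust": 0.8, "fear": 0.1})
print(sorted(profile.weights.items(), key=lambda kv: -kv[1])[:3])
```

Over many interactions the profile drifts toward the emotions an owner most often evokes, which is one simple way a digital asset could end up with a disposition distinct from any other.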

MetaSoul™
Digital Soul for Metaverse

As the popularity of NFTs increases and the demand for more dynamic gameplay grows, emotional integration and learned sensibility will continue to play a pivotal role in this evolution.

While different variants and collections of assets exist in the digital world, they have yet to react based on sensory experiences and interactions. Your Avatars, NPCs, or Digital Art will evolve based on how you train and engage with them, powered by the complexities of human emotion.

For gamers, a MetaSoul attached to an NPC is poised to redefine how players interact with NPCs, paving the way to hyper-realistic emotional responses and characters. Their perception will define them: just as people are affected by those around them, so will your NPC be.

By enabling interactive art to adjust and adapt based on feelings and emotional responses, MetaSoul can transform artworks into one-of-a-kind pieces that flow with their personalities.

MetaSoul keeps its visual and emotional experiences in memory and can learn from its owner. As a result, it can accrue ultra-high value for monetization purposes.

Additionally, it can be enjoyed as an original, dynamic piece found nowhere else.

MetaSoul is the currency of your experience in Metaverses.

Emotion Synthesis Chip for AI and Robots with Patrick Levy-Rosenthal

Adam Torres and Patrick Levy-Rosenthal discuss emotion synthesis for AI and robots.


Show Notes:

Emotion synthesis is making it possible for AI and robots to have another level of communication. In this episode, Adam Torres and Patrick Levy-Rosenthal, Founder & CEO of Emoshape Inc., explore how Emoshape is creating the “heart” of AI and robots.

About Patrick Levy-Rosenthal

A Revolution In Human Emotion Through Artificial Intelligence.

Emoshape Inc. is owned and run by entrepreneur Patrick Levy-Rosenthal, who currently lives in New York and grew up in Paris. He has presented to the Artificial Intelligence council of the United Nations, won the IST Prize from the European Union, and been written about in Forbes magazine. A TEDx speaker, his 2006 worldwide-acclaimed invention, Virtual Lens, is used today by more than 1.3 billion people daily in Snapchat and Instagram. Emoshape is poised to have a massive impact, changing not only the devices we interact with now but many more we will interact with in the future. He moved to NYC to develop his passion and ideas surrounding bio-inspired emotion synthesis. He has studied the relationship between cognition and emotion, the cognitive processes underlying each emotional response, emotion synthesis, and the influence of emotion on decision making. Patrick has developed a new generation of microchip named the EPU (Emotion Processing Unit) for AI and robots, and the world’s first AI that can feel 64 trillion possible states every 1/10th of a second.

Close your eyes and imagine a technology that teaches objects how to interact with humans in order to yield a favorable result. Emoshape technology presents a new leap for artificial intelligence on all fronts, especially in the realm of smartphones, toys, robots, computers, and other major electronic devices, with applications in artificial intelligence, medicine, biometrics, finance, defense, gaming, and advertising.

About Emoshape Inc.

Emoshape Inc. is dedicated to providing a technology that teaches intelligent objects how to interact with humans to yield a favorable, positive result. Emoshape’s emotion synthesis microchip (EPU) technology represents a massive leap for artificial intelligence, especially in the realm of self-driving cars, personal robotics, sentient virtual reality, affective toys, IoT, pervasive computing, and other major consumer electronic devices. Applications including human-machine interaction, emotional speech synthesis, emotional awareness, machine emotional intimacy, AI personalities, machine learning, affective computing, medicine, advertising, and gaming will benefit significantly from the Emotion Processing Unit (EPU II). The growing presence of AI, robotics, and virtual reality in society at large means that meaningful emotional interaction is core to removing the barrier to widespread adoption.

Emotion Processing Unit II

A microchip that enables an emotional response in AI, robots and consumer electronic devices

EPU II is the industry’s first emotion synthesis engine. It delivers high-performance machine emotion awareness, and the EPU II family of eMCUs is transforming the capabilities of robots and AI. Emoshape has completed production of the first EPU (Emotion Processing Unit), a patent-pending technology that creates a synthesized emotional response in machines. The groundbreaking EPU algorithms enable machines to respond to stimuli in line with one of the twelve primary emotions: anger, fear, sadness, disgust, indifference, regret, surprise, anticipation, trust, confidence, desire, and joy. The emotion recognition classifiers achieve up to 98 percent accuracy. The EPU represents a significant advance for AI, particularly for smartphones, toys, robots, androids, computers, and other major electronic devices.
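To make the notion of an emotion gradient concrete, here is a minimal sketch that maps a hypothetical per-emotion stimulus score onto a normalized intensity over the twelve primary emotions listed above; the softmax rule and function names are illustrative assumptions, not the EPU's actual algorithm.

```python
import math

# Twelve primary emotions named in the EPU description above.
PRIMARY_EMOTIONS = (
    "anger", "fear", "sadness", "disgust", "indifference", "regret",
    "surprise", "anticipation", "trust", "confidence", "desire", "joy",
)

def appraise(stimulus_scores):
    """Map raw per-emotion stimulus scores onto a 0..1 gradient via softmax.

    `stimulus_scores` stands in for a hypothetical upstream signal (e.g. from
    language or voice analysis); the output is an intensity gradient over all
    twelve emotions rather than a single discrete label.
    """
    logits = [stimulus_scores.get(e, 0.0) for e in PRIMARY_EMOTIONS]
    peak = max(logits)
    exps = [math.exp(v - peak) for v in logits]
    total = sum(exps)
    return {e: x / total for e, x in zip(PRIMARY_EMOTIONS, exps)}

# Example: a stimulus that reads as mostly joyful with a hint of surprise.
state = appraise({"joy": 2.0, "surprise": 0.7})
dominant = max(state, key=state.get)
print(dominant, round(state[dominant], 2))
```

The point of the gradient view is that any stimulus yields a degree of every emotion at once rather than a single discrete label.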

Emoshape 2020 – New Website Design


An Inventor’s Journey: Emoshape’s Patrick Levy-Rosenthal – Forbes


In the last few decades, we’ve had great inventors who gave us the Internet, E-commerce, the iPhone, Social Media, and other technologies that we can’t live without. Even though we may use these products every day, we don’t often think about the journey that inventors took to bring these products into the marketplace. In reality, inventing the technology is often the beginning. Realizing …

HAPPY NEW YEAR 2020 – EMOSHAPE

Five.. Four.. Three.. Two.. One.. HAPPY NEW YEAR from EMOSHAPE

Emoshape Featured on Forbes – Empathy in Artificial Intelligence

For artificial intelligence to empathize with human emotions, it must have a way of learning about the range of emotions that we experience.

Emoshape is the first company to hold patented technology for emotion synthesis. The emotion chip, or EPU, developed by Emoshape can enable any AI system to understand the range of emotions experienced by humans. At any moment, the EPU can understand 64 trillion possible emotional states every 1/10th of a second. The range of your emotions is mapped onto a gradient where the degree of each emotion can be observed.

Patrick Levy-Rosenthal: The Inventor behind Virtual Lens

Emoshape CEO presents his worldwide-acclaimed invention, Virtual Lens, used today by more than 1.3 billion people daily in Snapchat and Instagram
Patrick Levy-Rosenthal, presently the CEO of Emoshape Inc., is the architect behind the virtual lens, also known as filters. His invention now serves as one of the core functions to which social media applications owe their popularity and high engagement.
The 2006 Virtual Lens technology supported the ability to track a defined set of three-dimensional coordinates within a video stream and to dynamically insert rendered 3D objects within the stream in real time.
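As a loose, present-day illustration of the same idea rather than the Virtual Lens implementation itself, the sketch below assumes the open-source OpenCV and MediaPipe packages: it tracks a facial landmark in a live video stream and draws a marker at the point where a rendered 3D object would be composited.

```python
# Sketch only: real-time landmark tracking with an overlaid marker,
# assuming the `opencv-python` and `mediapipe` packages are installed.
import cv2
import mediapipe as mp

face_mesh = mp.solutions.face_mesh.FaceMesh(max_num_faces=1)
cap = cv2.VideoCapture(0)  # default webcam

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # MediaPipe expects RGB; OpenCV captures BGR.
    results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_face_landmarks:
        lm = results.multi_face_landmarks[0].landmark[1]  # landmark near the nose tip (assumed index)
        h, w, _ = frame.shape
        # Project the normalized landmark into pixel space and draw a marker
        # where a full pipeline would composite a rendered 3D object.
        cv2.circle(frame, (int(lm.x * w), int(lm.y * h)), 8, (0, 255, 0), -1)
    cv2.imshow("virtual-lens sketch", frame)
    if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```

A production lens would estimate full 3D head pose from many landmarks and composite a textured, lit 3D asset instead of a flat marker.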

IVBOT was renamed GloopIt

Levy-Rosenthal started by creating an application called IVBOT. This app was devised to make video creation more engaging and fun by generating filters that enhanced and altered appearance. The app accomplished this through 3D face tracking and Augmented Reality. IVBOT included real-time makeup and skin enhancement; the tech was able to remove pimples and other blemishes, as is commonly enjoyed today on photo editing and retouching tools. The application was reviewed by the technodata blog for use as a video chat booster and can be seen in action in this demo.
The groundbreaking Virtual Lens invention was patented in 2007 and subsequently cited by a number of applicants, including Skype, Samsung, Snap, and Facebook. The concept was adopted by Masquerade Technologies, which was sold to Facebook in 2016 to help it compete with Snapchat.

Insofar as filters and other beautification tools dominate the camera app and social media categories, Virtual Lens should be acknowledged for its contributions to some of today’s most influential companies.
After IVBOT became GloopIt, Levy-Rosenthal invented the Emotion Processing Unit (EPU) and developed the first AI home console in 2012; the latter inspired the Amazon Echo. He currently lives and works in New York, having grown up in Paris. He has presented to the Artificial Intelligence council of the United Nations, been awarded the IST Prize from the European Union, and been written about in Forbes magazine.

Patrick Levy-Rosenthal’s work and accomplishments are important for the fields of Virtual Reality and Artificial Intelligence. They were meaningful in the creation of camera applications now enjoyed by billions in their day-to-day lives. New products and services like Music4Book and ExoLife that derive from his EPU share the same promise for consumers.

https://markets.businessinsider.com/news/stocks/patrick-levy-rosenthal-the-inventor-behind-virtual-lens-1028772713

Emoshape Inc.
New York City

Affective Computing market worldwide is projected to grow by US$110.6 Billion

The Affective Computing market worldwide is projected to grow by US$110.6 Billion, driven by compounded growth of 30.4%. Software, one of the segments analyzed and sized in this study, displays the potential to grow at over 33.2%. The shifting dynamics supporting this growth make it critical for businesses in this space to keep abreast of the changing pulse of the market. Poised to reach over US$61.5 Billion by the year 2025, Software will bring in healthy gains, adding significant momentum to global growth.

Representing the developed world, the United States will maintain a 26.4% growth momentum. Within Europe, which continues to remain an important element in the world economy, Germany will add over US$4.7 Billion to the region’s size and clout in the next 5 to 6 years. Over US$8.9 Billion worth of projected demand in the region will come from Rest of Europe markets. In Japan, Software will reach a market size of US$2.7 Billion by the close of the analysis period. As the world’s second largest economy and the new game changer in global markets, China exhibits the potential to grow at 40.4% over the next couple of years and add approximately US$22.4 Billion in terms of addressable opportunity for the picking by aspiring businesses and their astute leaders.

Presented in visually rich graphics are these and many more need-to-know quantitative data important in ensuring quality of strategy decisions, be it entry into new markets or allocation of resources within a portfolio. Several macroeconomic factors and internal market forces will shape growth and development of demand patterns in emerging countries in Asia-Pacific. All research viewpoints presented are based on validated engagements from influencers in the market, whose opinions supersede all other research methodologies.
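For readers who want to see how a compounded growth rate of this kind plays out year by year, here is a small sketch; the base-year value and horizon are illustrative assumptions, not figures taken from the report.

```python
def compound(base, rate, years):
    """Value of `base` after `years` of compound growth at `rate` per year."""
    return base * (1 + rate) ** years

# Illustrative assumption only: a hypothetical US$28 billion market in 2019
# compounding at 30.4% per year through 2025.
BASE_2019 = 28.0
for year in range(2020, 2026):
    size = compound(BASE_2019, 0.304, year - 2019)
    print(year, f"US${size:.1f}B")
```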

TABLE OF CONTENTS

I. INTRODUCTION, METHODOLOGY & PRODUCT DEFINITIONS

II. EXECUTIVE SUMMARY

1. MARKET OVERVIEW

  • Affective Computing: Entering an Era of Emotional Devices
  • Widening the Application Scope
  • Spectacular Rise on the Cards for Affective Computing Market
  • Review of Market Challenges
  • Opinion Mining: A Significant Challenge in Affective Computing
  • Global Competitor Market Shares
  • Affective Computing Competitor Market Share Scenario Worldwide (in %): 2019

2. FOCUS ON SELECT PLAYERS

3. MARKET TRENDS & DRIVERS

  • Focus Grows on Advanced Computational Devices with High Emotional Quotient
  • Machine Learning, Leveraging the Ability to Power Deep Learning Strategies, Likely to Augment Affective Computing Landscape
  • ‘Emotional’ Technology to Find Broader Adoption in Business Arena
  • Digital Marketing: A High-Growth Area
  • Growing Focus on Neuromarketing, the Marketing Approach Powered by Neuroscience, to Widen Prospects for Affective Computing
  • Affective Computing in E-Commerce – A Long Way to Go
  • Global Retail e-commerce Sales (in US$ Billion): 2016-2025
  • Media & Entertainment Industry Seeks to Leverage Affective Computing in Building Unique Promotion Strategies
  • Market Senses Significant Opportunities Coming its Way in the Automotive Industry
  • Novel Tools Come to Fore in the Automotive Affective Computing Domain
  • Affectiva Rolls Out Affectiva Automotive AI, the Multi-Modal In-Cabin AI Sensing Solution
  • Kia and MIT Media Lab to Develop Real-time Emotion Adaptive Driving (READ) Tool for Automotive Use
  • Role of ‘Emotion AI Systems’ in Personal Devices Transformation
  • Mobile Affective Computing – An Evolving Area of Research
  • Growing Market for Wearables Bodes Well for Affective Computing Market: Global Wearables Shipments (in Billion Units): 2016-2025
  • Proliferation of Smartphones Buoys Development of Affective Computing Technologies
  • Rise in Smartphone Ownership Offers New Avenues for Growth: Number of Smartphone Users Worldwide (in Billion): 2016-2021
  • Smartphone Adoption Worldwide by Region (in %): 2018 & 2025
  • Advances in Affective Computing Technologies Vital to Seamless Human-Robot Interactions
  • Affective Computing Emerges as an Important Tool for e-Learning
  • Global e-Learning Market (in US$ Billion): 2016-2025
  • Affective Computing Buoys Application of AI in Transforming the Healthcare Landscape

4. GLOBAL MARKET PERSPECTIVE

  • TABLE 1: Affective Computing Global Market Estimates and Forecasts in US$ Million by Region/Country: 2018-2025
  • TABLE 2: Affective Computing Market Share Shift across Key Geographies Worldwide: 2019 VS 2025
  • TABLE 3: Software (Component) World Market by Region/Country in US$ Million: 2018 to 2025
  • TABLE 4: Software (Component) Market Share Breakdown of Worldwide Sales by Region/Country: 2019 VS 2025
  • TABLE 5: Hardware (Component) Potential Growth Markets Worldwide in US$ Million: 2018 to 2025
  • TABLE 6: Hardware (Component) Market Sales Breakdown by Region/Country in Percentage: 2019 VS 2025
  • TABLE 7: Market Research (End-Use) Global Market Estimates & Forecasts in US$ Million by Region/Country: 2018-2025
  • TABLE 8: Market Research (End-Use) Market Share Breakdown by Region/Country: 2019 VS 2025
  • TABLE 9: Media & Advertising (End-Use) Demand Potential Worldwide in US$ Million by Region/Country: 2018-2025
  • TABLE 10: Media & Advertising (End-Use) Share Breakdown Review by Region/Country: 2019 VS 2025
  • TABLE 11: Healthcare & Lifesciences (End-Use) Worldwide Latent Demand Forecasts in US$ Million by Region/Country: 2018-2025
  • TABLE 12: Healthcare & Lifesciences (End-Use) Distribution of Global Sales by Region/Country: 2019 VS 2025
  • TABLE 13: Automotive (End-Use) Sales Estimates and Forecasts in US$ Million by Region/Country for the Years 2018 through 2025
  • TABLE 14: Automotive (End-Use) Global Market Share Distribution by Region/Country for 2019 and 2025
  • TABLE 15: Other End-Uses (End-Use) Global Opportunity Assessment in US$ Million by Region/Country: 2018-2025
  • TABLE 16: Other End-Uses (End-Use) Percentage Share Breakdown of Global Sales by Region/Country: 2019 VS 2025

III. MARKET ANALYSIS

  • GEOGRAPHIC MARKET ANALYSIS
  • UNITED STATES
  • Market Facts & Figures
  • TABLE 17: United States Affective Computing Market Estimates and Projections in US$ Million by Component: 2018 to 2025
  • TABLE 18: United States Affective Computing Market Share Breakdown by Component: 2019 VS 2025
  • TABLE 19: United States Affective Computing Latent Demand Forecasts in US$ Million by End-Use: 2018 to 2025
  • TABLE 20: Affective Computing Market Share Breakdown in the United States by End-Use: 2019 VS 2025
  • CANADA
  • TABLE 21: Canadian Affective Computing Market Estimates and Forecasts in US$ Million by Component: 2018 to 2025
  • TABLE 22: Affective Computing Market in Canada: Percentage Share Breakdown of Sales by Component for 2019 and 2025
  • TABLE 23: Canadian Affective Computing Market Quantitative Demand Analysis in US$ Million by End-Use: 2018 to 2025
  • TABLE 24: Canadian Affective Computing Market Share Analysis by End-Use: 2019 VS 2025
  • JAPAN
  • TABLE 25: Japanese Market for Affective Computing: Annual Sales Estimates and Projections in US$ Million by Component for the Period 2018-2025
  • TABLE 26: Japanese Affective Computing Market Share Analysis by Component: 2019 VS 2025
  • TABLE 27: Japanese Demand Estimates and Forecasts for Affective Computing in US$ Million by End-Use: 2018 to 2025
  • TABLE 28: Affective Computing Market Share Shift in Japan by End-Use: 2019 VS 2025
  • CHINA
  • TABLE 29: Chinese Affective Computing Market Growth Prospects in US$ Million by Component for the Period 2018-2025
  • TABLE 30: Chinese Affective Computing Market by Component: Percentage Breakdown of Sales for 2019 and 2025
  • TABLE 31: Chinese Demand for Affective Computing in US$ Million by End-Use: 2018 to 2025
  • TABLE 32: Chinese Affective Computing Market Share Breakdown by End-Use: 2019 VS 2025
  • EUROPE
  • TABLE 33: European Affective Computing Market Demand Scenario in US$ Million by Region/Country: 2018-2025
  • TABLE 34: European Affective Computing Market Share Shift by Region/Country: 2019 VS 2025
  • TABLE 35: European Affective Computing Market Estimates and Forecasts in US$ Million by Component: 2018-2025
  • TABLE 36: European Affective Computing Market Share Breakdown by Component: 2019 VS 2025
  • TABLE 37: European Affective Computing Addressable Market Opportunity in US$ Million by End-Use: 2018-2025
  • TABLE 38: European Affective Computing Market Share Analysis by End-Use: 2019 VS 2025
  • FRANCE
  • TABLE 39: Affective Computing Market in France by Component: Estimates and Projections in US$ Million for the Period 2018-2025
  • TABLE 40: French Affective Computing Market Share Analysis by Component: 2019 VS 2025
  • TABLE 41: Affective Computing Quantitative Demand Analysis in France in US$ Million by End-Use: 2018-2025
  • TABLE 42: French Affective Computing Market Share Analysis: A 7-Year Perspective by End-Use for 2019 and 2025
  • GERMANY
  • TABLE 43: Affective Computing Market in Germany: Recent Past, Current and Future Analysis in US$ Million by Component for the Period 2018-2025
  • TABLE 44: German Affective Computing Market Share Breakdown by Component: 2019 VS 2025
  • TABLE 45: Affective Computing Market in Germany: Annual Sales Estimates and Forecasts in US$ Million by End-Use for the Period 2018-2025
  • TABLE 46: Affective Computing Market Share Distribution in Germany by End-Use: 2019 VS 2025
  • ITALY
  • TABLE 47: Italian Affective Computing Market Growth Prospects in US$ Million by Component for the Period 2018-2025
  • TABLE 48: Italian Affective Computing Market by Component: Percentage Breakdown of Sales for 2019 and 2025
  • TABLE 49: Italian Demand for Affective Computing in US$ Million by End-Use: 2018 to 2025
  • TABLE 50: Italian Affective Computing Market Share Breakdown by End-Use: 2019 VS 2025
  • UNITED KINGDOM
  • TABLE 51: United Kingdom Market for Affective Computing: Annual Sales Estimates and Projections in US$ Million by Component for the Period 2018-2025
  • TABLE 52: United Kingdom Affective Computing Market Share Analysis by Component: 2019 VS 2025
  • TABLE 53: United Kingdom Demand Estimates and Forecasts for Affective Computing in US$ Million by End-Use: 2018 to 2025
  • TABLE 54: Affective Computing Market Share Shift in the United Kingdom by End-Use: 2019 VS 2025
  • REST OF EUROPE
  • TABLE 55: Rest of Europe Affective Computing Market Estimates and Forecasts in US$ Million by Component: 2018-2025
  • TABLE 56: Rest of Europe Affective Computing Market Share Breakdown by Component: 2019 VS 2025
  • TABLE 57: Rest of Europe Affective Computing Addressable Market Opportunity in US$ Million by End-Use: 2018-2025
  • TABLE 58: Rest of Europe Affective Computing Market Share Analysis by End-Use: 2019 VS 2025
  • ASIA-PACIFIC
  • TABLE 59: Affective Computing Market in Asia-Pacific by Component: Estimates and Projections in US$ Million for the Period 2018-2025
  • TABLE 60: Asia-Pacific Affective Computing Market Share Analysis by Component: 2019 VS 2025
  • TABLE 61: Affective Computing Quantitative Demand Analysis in Asia-Pacific in US$ Million by End-Use: 2018-2025
  • TABLE 62: Asia-Pacific Affective Computing Market Share Analysis: A 7-Year Perspective by End-Use for 2019 and 2025
  • REST OF WORLD
  • TABLE 63: Rest of World Affective Computing Market Estimates and Forecasts in US$ Million by Component: 2018 to 2025
  • TABLE 64: Affective Computing Market in Rest of World: Percentage Share Breakdown of Sales by Component for 2019 and 2025
  • TABLE 65: Rest of World Affective Computing Market Quantitative Demand Analysis in US$ Million by End-Use: 2018 to 2025
  • TABLE 66: Rest of World Affective Computing Market Share Analysis by End-Use: 2019 VS 2025

IV. COMPETITION

1. ADMOBILIZE LLC
2. AFFECTIVA
3. AFFECTIVE COMPUTING LLC
4. AUDEERING GMBH
5. APPLE, INC.
6. BEHAVIORAL SIGNAL TECHNOLOGIES, INC.
7. BEYOND VERBAL
8. CLAY AIR INC.
9. COGITO CORPORATION
10. COGNITEC SYSTEMS GMBH
11. CROWD EMOTION LTD.
12. DEEPEYES GMBH
13. DREAM FACE TECHNOLOGIES
14. ELLIPTIC LABORATORIES A/S
15. EMOSHAPE INC.
16. EMOTIBOT TECHNOLOGIES LIMITED
17. EMOTION RESEARCH LAB
18. EMPATH INC.
19. EMPATICA INC.
20. ENTROPIK TECHNOLOGIES PVT. LTD.
21. EYERIS
22. EYESIGHT TECHNOLOGIES LTD.
23. FACE++
24. GESTIGON GMBH
25. GESTURETEK
26. GOOD VIBRATIONS COMPANY B.V.
27. GOOGLE LLC
28. HUMANZYME
29. IBM CORPORATION
30. IFLEXION
31. IMOTIONS A/S
32. INTEL CORPORATION
33. KAIROS AR, INC.
34. LOGIC PURSUITS, LLC (CAPTEMO™)
35. MICROSOFT CORPORATION
36. MIT MEDIA LAB – AFFECTIVE COMPUTING GROUP
37. NEMESYSCO LTD.
38. NOLDUS INFORMATION TECHNOLOGY BV
39. NURALOGIX CORPORATION
40. NUMENTA, INC.
41. NVISO SA
42. OPSIS PTE. LTD.
43. POINTGRAB INC.
44. QUALCOMM, INC.
45. REALEYES OU
46. RECEPTIVITI INC.
47. SENSEON TECH LTD.
48. SENSUM CO.
49. SENTIO SOLUTIONS (FEEL WRISTBAND)
50. SIGHTCORP BV
51. SONY DEPTHSENSING SOLUTIONS SA/NV
52. TAWNY GMBH
53. TOBII AB
54. UAB SKYBIOMETRY
55. VICON MOTION SYSTEMS LIMITED
56. VOKATURI B.V.
57. WRNCH


Copyright MarketResearch

https://www.marketresearch.com/Global-Industry-Analysts-v1039/Affective-Computing-12659122/