
Say What You Really Feel, Cognovi Listens

Remember “2001: A Space Odyssey” and “The Terminator”? When computer scientists and the general public first began to grasp the potential of artificial intelligence, futuristic movies stirred a primal fear that computers would become so smart they could take over the world. Since then, advances in AI have been astounding. Machines now analyze visual data such as scenes and faces, understand natural language, and even learn. But don’t worry about global domination, we’re told: computers are a long way from being self-aware or able to think like humans. Most importantly, they will never have the ability to feel emotions.


Maybe not, but they can read them.


Emotion AI, also known as affective computing or artificial emotional intelligence, originated in the 1990s to bolster earlier attempts to identify universal emotions communicated by facial expressions. That research needed a database of standardized photographic images, and the burgeoning data-mining field was able to supply one. Once machine learning techniques were applied to big data, emotion AI took off. It wasn’t long before other visual, non-verbal communication modes were studied, such as gesture, posture, gait, unconscious body movements, and biophysical signals like flushing and sweating. Auditory cues in speech and voice were also analyzed, including intonation, emphasis, rhythm, and pauses. This research grows more sophisticated each year, and the current wealth of social media offers innumerable sources of images and vocalizations.






