VERN is a patented AI system designed to detect emotions in communication. The software represents a breakthrough in emotion recognition in that it delivers results in real time and analyzes multiple emotional signals simultaneously.
In our “explainer” video, CEO Craig Tucker walks us through VERN AI, an emotion recognition system built on an aggregate of models working simultaneously, allowing machines to understand human emotion and complex speech such as humor. He talks about the state of the industry and the science, who is using VERN, and what it does. Craig explains the problems with sentiment analysis and how emotion recognition comes closer to revealing emotional states, and he answers some of the biggest questions facing the field, including psychology’s replication crisis.
Craig explains how VERN’s model of emotions follows neuroscientific findings on the three affective states: euphoria, dysphoria, and fear. He describes how psychology’s replication crisis makes it impossible to build an NLP model around its findings, rendering other models incomplete at best. He notes that the written word has been found to be a reliable measure of someone’s emotions, and he reviews how VERN’s real-time processing benefits you.
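To make the structure concrete, here is a minimal Python sketch of how analysis results might be organized around those three affective states. The class and field names are hypothetical illustrations, not VERN’s actual data model.

```python
from dataclasses import dataclass
from enum import Enum


class AffectiveState(Enum):
    """The three affective states discussed in the video."""
    EUPHORIA = "euphoria"
    DYSPHORIA = "dysphoria"
    FEAR = "fear"


@dataclass
class EmotionSignal:
    """A single detected emotional signal in a text (hypothetical structure)."""
    state: AffectiveState
    confidence: float       # 0-100 confidence score
    evidence: list[str]     # the words or phrases that triggered the detection
```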
VERN’s CEO answers common objections to emotion recognition and to VERN’s process. He discusses intent and proposes that it should be measured not by the sender but by the receiver. Craig covers the problems with facial recognition and voice inflection systems that claim to detect emotion, and why VERN is grounded more in neuroscience than in psychology. He also discusses prediction error and how it relates to the incongruity signal that VERN is patented to detect.
In the end, Craig brings it all together: how these pieces fit into the model VERN uses to detect emotions. Through contextual clues and frames of reference, moderated by the personal frame, VERN is able to detect emotions.
Craig illustrates the power of VERN’s patented method with a trip to the eye doctor: each lens brings the concept into focus for the end user, much as VERN’s patented personal frames do. He explains where emotions come from and what they’re made of: emotives. Emotives are words and phrases known to impart emotional meaning. VERN tracks these emotives and provides a confidence score from 0 to 100%. Each additional emotive signal adds to the confidence level and to the intensity of the detected emotion.
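To illustrate the idea, here is a minimal, hypothetical Python sketch of emotive matching with an additive, capped confidence score, moderated by a personal-frame weight. The lexicon, weights, and scoring rule are assumptions for illustration only, not VERN’s patented method.

```python
# Hypothetical emotive lexicon: words and phrases assumed to carry emotional
# meaning, each mapped to an affective state and a base confidence weight.
EMOTIVE_LEXICON = {
    "thrilled": ("euphoria", 40),
    "can't wait": ("euphoria", 30),
    "miserable": ("dysphoria", 45),
    "terrified": ("fear", 50),
    "worried": ("fear", 25),
}


def score_emotives(text: str, personal_frame_weight: float = 1.0) -> dict[str, int]:
    """Return a 0-100 confidence score per affective state.

    Each matched emotive adds its weight to the running score for its state,
    so additional signals raise both confidence and intensity; scores are
    capped at 100. The personal_frame_weight parameter stands in for the
    idea of moderating raw signals by a personal frame of reference.
    """
    lowered = text.lower()
    scores: dict[str, int] = {}
    for emotive, (state, weight) in EMOTIVE_LEXICON.items():
        if emotive in lowered:
            adjusted = int(weight * personal_frame_weight)
            scores[state] = min(100, scores.get(state, 0) + adjusted)
    return scores


print(score_emotives("I'm thrilled and can't wait, though a bit worried."))
# {'euphoria': 70, 'fear': 25}
```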
We finish by inviting you to join the Virtual Emotion Resource Network and help us discover new emotions, and perhaps reach a consensus on something that’s universal to all of us: our feelings.