What's next for the attention economy and emotion AI?
by Nick Warner


Matt Celuszak, Founder & CEO at Element Human, appeared on the Insight Platforms podcast to discuss: What's next for the attention economy and emotion AI? The strengths and weaknesses of facial coding data; introducing measurement for the influencer economy; the importance of great co-founders. From stingrays and hot coals (listen to find out) to building one of the leading companies in facial coding, emotion AI and behavioral learning models, Matt Celuszak of Element Human has had quite a journey. Matt's always astute and engaging, and in this interview we get into the issues above and more.

Join Henry Piney and Matt Celuszak for a 55-minute tour covering a wide gamut of topics, including: 

  • Strengths and weaknesses of facial coding data
  • Evaluating the intersection of claimed and behavioral data sets
  • Understanding effectiveness of content for the BBC - rating vs reaction
  • Using sensor data to create measurement for the influencer economy
  • Measuring what’s valued, rather than just valuing what you can measure
  • How to derive granular human understanding from big data sets
  • How privacy works in relation to sensor data
  • How behavioral learning models work
  • Lessons from building a business from scratch.


Full Transcript: 

[00:00:00]

Matt Celuszak: Emotions, emotions happen before people think. So if you ask somebody how they feel, you're actually asking them how they think they feel because whatever they felt has already happened. Being able to get landmarks on a face and be able to look at unique features on a face, that works. That side works.

I'm going to separate facial coding from emotion AI, because emotion is a judgment call as to whether somebody is feeling happy or sad, or it's a probability as to whether somebody's feeling happy or sad. And that's where a lot of these models fall down. And I would be careful and reticent as a user to be putting a lot of weight behind the emotional labels.

Henry Piney: So facial coding data in the insights space has been around for quite some time. In simplistic terms, it's a technique that involves reading the signals, or landmarks, on a respondent's face to understand what [00:01:00] they're paying attention to and the reaction elicited. It's widely used by a variety of companies, often to gauge reaction to advertising content.

Many brands and agencies have bought into the theory that facial coding data will give you a true, unfiltered reaction to content, understanding System 1-type reactions based on immediate or emotional rather than stated response. However, there's also been some skepticism around this type of data too, not least around a number of key questions.

How accurate is it? To what extent can you really categorize emotion? Are self-selected samples really engaging with the content? Questions around consumer privacy, and a whole range of other issues. Well, Matt Celuszak addresses all those topics and many other very relevant points, including how Element Human, the company where he's the founder and CEO, takes a considered approach to this type of data, the use of behavioral learning models, and how human sensor data can empower the creator economy.

I must give a disclaimer in that I'm on the board of Element Human, so I have some existing, ongoing perspectives and [00:02:00] viewpoints. That said, I hope you'll agree that we have a really good, in-depth discussion around where facial coding data and emotion AI may go next, and the impact on insights and marketing.

So, onto the interview. So Matt, it's good to track you down wherever you are in the world today; the west coast of Canada I've caught you on, I believe.

Matt Celuszak: Yeah. Thanks Henry for having me here.

Henry Piney: Not at all. It's very, very good to have you on. Now. I wanted to start off with a traditional icebreaker, which is something from your deepest, darkest past.

No, it doesn't have to be deepest, darkest past. It's just something that most people wouldn't know about you. Something that might be a little bit surprising.

Matt Celuszak: Okay. Well, I don't think you'd actually find it anywhere written anywhere, but I was actually stung by a stingray and almost lost my leg down in Costa Rica.

Henry Piney: Wow. How long ago did that happen?

Matt Celuszak: Oh, it was in my 20s.

Henry Piney: Ah, so... two years ago?

Matt Celuszak: Yeah, I wish it was two years ago. No, my other leg reminded me just recently how old I am, with an Achilles rupture. So, yeah, [00:03:00] unfortunately, being active certainly gets you out there, but getting out there certainly gets you exposed.

So, yeah, no, I guess that's something that most people wouldn't know, but I spend a lot of time in the ocean and I really enjoy the ocean. People know about my fishing, some of them know about my kayaking, but a big part of it is just getting out there and being in the wild, and a stingray stumbled across my leg and I almost lost that thing.

Henry Piney: So what are the consequences of a stingray bite then? I mean, clearly it does more than just really hurt, if you almost lost your leg.

Matt Celuszak: Yeah. So, I mean, it really hurts; not the most painful thing, but close. The challenge that you have is that the poison goes in and it starts to create necrosis.

And unfortunately it hit my lymph system, so the whole leg kind of started to swell up after three or four days. I couldn't really figure it out, and we just had to absolutely nuke myself with antibiotics and try to recover. We ended up getting it back. It didn't help that we were [00:04:00] in kind of remote Costa Rica at the time.

And so at the time they said to put heat on it, or at least the doctor did, and the people there that I was out with, a bunch of local guides, just heard 'hot'. And so they put these burning hot coals on the back of my leg and I got third-degree burns.

So just to make it even better, that complicated the whole issue. It's the first time I've ever bitten down on a stick to handle the pain. It was quite interesting, anyway.

Henry Piney: Oh, well, that was probably very good preparation for getting into the startup journey and the startup world: burning hot coals, biting down on sticks, extreme pain. But we'll get to it.

So we'll maybe get into your background a little bit, which is interesting in itself, but can we just start with Element Human and its various iterations: why did you found the business and what were you looking to do?

Matt Celuszak: Yeah, [00:05:00] so my co-founder Diego and I founded the business based on one simple principle.

We were realizing that people were starting to spend more time interacting with technology in the outside world than they were with human beings. And I think we can all say that even our interactions with humans were going through technology, but technology was still in a very binary stage. I mean, we founded this in 2013.

So this is still when people in the marketing industry were all like, ooh, mobile phones are going to take over. And so we had heard about artificial intelligence and machine learning, and we thought, well, if people have cameras and microphones, and all this really rich data that we get out of, say, a focus group or out of one-to-one interviews contextualizes everything, could we create a nice feedback loop that complements this big data?

Big data is a very broad data set, but it's not very granular. And so we wanted to work on generating human behavior signals and emotional signals [00:06:00] out of sensor data. And we just believed that sensor data offered this kind of body-language feedback loop that helped us really unpack why and how people are doing things, rather than just what they're doing.

So that's kind of where we got started.

Henry Piney: Matt, when you were saying sensor data, we're kind of diving into the nitty-gritty of it now. I mean, obviously it was always survey-based, I believe, was it? In terms of Element Human's solution. Oh, it wasn't. Okay.

Matt Celuszak: No, no, no. So our first iteration was actually an API where you could just send a video to us and we would translate what the expressions and emotional signals were.

And that was using early-day support vector machine models. So it was really good. And then we teamed up with the University of Nottingham. And then it was the BBC who asked us to bake it into a survey to demonstrate the value, because what we were learning is that people would express differently than they would claim.

So we would have the expression, and then we'd have what they claim, and then those two would contradict, and where they [00:07:00] contradicted ended up being where the insight was. That actually ended up being the leading indicator as to whether a piece of content would perform or not at the BBC, and that's where we were incubated.

And that's when we ended up putting it into a survey. The fourth, sorry, the third generation of the tool was actually an embeddable. So it was, again, being able to flip on a webcam, do eye tracking, emotion detection and implicit testing, and as a researcher you could just plug that into a survey. And then it wasn't until this fourth generation that we ended up taking over the survey, the sample, all that stuff, because we realized quality control for rich data was important. So part of that is trying to design out the human variability enough while creating a natural enough human experience. And yes, owning the survey allows you to do that.

Henry Piney: Yeah, got it. It's really interesting. I mean, going back to that point that you made around the difference between claimed data and behavioral data, [00:08:00] you may have used a different phrase; I forget exactly what it was. But how did you start to discern, you said the truth was somewhere in between, or the intersection gave the truth, but how did you start to discern where the truth, in inverted commas, would lie?

Matt Celuszak: The first kind of big project where it became quite clear that there was a whole different type of signal you can get from interacting with people and sensor devices was when we had people rate trailers for the BBC in multiple markets, and then we had them react to those trailers, or just watch them naturally, and see how they reacted.

It ended up creating kind of a 2x2, where you'd have a rating that was high or low and a reaction that was large or small. And we could very clearly see that there were people who would rate high and react high, and that was high engagement, true engagement. There were people who would rate high and react low. [00:09:00]

Those were people who fill out surveys and want to say that they would watch that show because their friends do, but actually, when you went deeper, they didn't know much about the show itself and they actually wouldn't watch it. Then, when they would rate low and react low, that was just not a very good piece of content for them.

It's not a good audience fit. And then the final one, the really interesting one, was where they would rate low but react high. Where the rating was low and the reaction was high, those were always the shows that outperformed the prime-time average. And so we found out that we could actually anticipate which shows in which markets would beat the prime-time average.

And so that helped the BBC price their inventory for those markets and sell in. We called them the guilty pleasures: the shows that you would never admit that you would watch, but when they're on, you're never going to turn them off. So, you know, the Jerry Springers, the talk TV shows, [00:10:00] some of those ones where you wouldn't want your friends knowing that you actually watch this, but when you watch it, you love it.
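The rating-versus-reaction grid described above maps onto a simple classification. The sketch below is a minimal illustration, not Element Human's implementation; the normalised scores, cut-offs and quadrant names are assumptions chosen to mirror the conversation.

```python
def quadrant(rating: float, reaction: float,
             rating_cutoff: float = 0.5, reaction_cutoff: float = 0.5) -> str:
    """Place a viewer in the 2x2 described above.

    rating   -- claimed score (e.g. a survey rating normalised to 0..1)
    reaction -- observed expressional response (normalised to 0..1)
    The cut-off values are illustrative assumptions.
    """
    rated_high = rating >= rating_cutoff
    reacted_high = reaction >= reaction_cutoff
    if rated_high and reacted_high:
        return "true engagement"        # rate high, react high
    if rated_high and not reacted_high:
        return "social desirability"    # say they'd watch it, but don't respond
    if not rated_high and reacted_high:
        return "guilty pleasure"        # the quadrant that beat the prime-time average
    return "poor audience fit"          # rate low, react low


print(quadrant(rating=0.3, reaction=0.9))  # -> "guilty pleasure"
```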

Henry Piney: Yeah, well, I think we're all aware of all sorts of guilty pleasures.

Matt Celuszak: We're all aware of which ones those are from the press.

Henry Piney: But that is interesting, in that it wasn't really one or the other. It genuinely was an intersection: by putting the two core data sources together, you get a much more nuanced picture as to what the reaction is.

Kind of interesting. Matt, I realized I know what Element Human does very well, but you should probably actually describe what Element Human does for people who are less familiar with the business.

Matt Celuszak: Yeah, sure. So, in its current articulation, Element Human is very much about taking sensor data and learning how people behave or react to any interaction they have with a device today.

That articulation is in ads and ad measurement and media measurement. The media [00:11:00] measurement industry, or the media industry in particular, particularly during COVID, hit this new bubbling-up influencer and creator economy, which has over 200 million creators conversing with 4.2 billion people, 365 days a year.

It's a whole different form of marketing. It offers the largest untapped market in the world, and there's absolutely no way to measure across platforms at the moment. The old traditional TV metrics are really hard to apply because it's a non-linear format; there's a lot of user interaction control.

The media formats are changing or expanding. So how do you compare a Snapchat with a Facebook media buy? And how do you know whether your content's better on YouTube or Instagram, YouTube Shorts or Instagram Reels? Those kinds of common questions became real challenges: A, for the platforms to prove their value, their unique value and their qualitative value; but also B, for the media buyers; and then C, for the creators themselves. So how do I stand out on [00:12:00] these platforms? How do I make money from this? In the early days, there were very few creators and a lot of ad dollars available for experimentation. Now, it's the total opposite.

Creators are a dime a dozen now. There's a lot of content being pushed, and people are kind of tuning out these feeds; you've heard of doom scrolling and that type of stuff. So attention's really important. The question for a creator is: how do I capture attention and how do I get them to stay with me?

That also ladders up to the platform level, and then further ladders up to the brand performance level. So we're working with pioneers in this space, Whaler, Twitch, and the BBC, to understand what really good quality content is, content that's going to grab people's attention in the context of the environment.

And how does that allow each of those types of stakeholders, creators, brands, agencies, or platforms, to articulate their unique proposition and the value of that proposition in a measurable way? So instead of measuring clicks, likes, and [00:13:00] shares, which are okay for audience growth but not great for brand performance, instead of valuing what's measured, which is what the whole programmatic industry is kind of based off of, it's flipping it on its head to measure what's valued. And I think that's where sensor data is really interesting, because it's a super rich data source where surveys can't give you that context.

You get this really rich data source and it creates a machine-learning feedback loop where you can learn it against whatever metric matters. So, first and foremost, it's an ad testing tool for campaign measurement. It's been used by over 200 brands; we've road-tested this for the last four years.

We've got over 30 billion data points and benchmarks across four big platforms. It's pretty cool from that perspective, but it's got a long way to go too. The question is, what else can body language teach us? What else can we learn it against? It's a hyper-contextual source of information.

And I think, just to put this in [00:14:00] context: for every like, a thumbs up that you get on a YouTube video, we're getting over 10,000 data points. So for every single data point you're getting on a YouTube video from a like perspective, we're getting over 10,000.

Henry Piney: And Matt, we'll get into that in a second, as to what you mean by those 10,000 data points and exactly what you're measuring. But playing it back a little bit:

it sounds like there are multiple propositions. You've got, I guess, the old kind of insights proposition that's been around for quite a long time: a survey gauging the effectiveness of your advertising from a creative perspective; you're making the advertising and you want to make it better. That's a use case.

But it also sounds like there's a really interesting use case too, in terms of areas in the digital world, particularly the influencer world, where there isn't really a lot of measurement at the moment.

Matt Celuszak: There's no measurement, to be honest. And the measurement that does exist is usually cost prohibitive. [00:15:00]

So our goal with the creator economy is to create something that is super fast, fast enough for influencers; reliable enough for brands and for platforms and agencies; but also cost effective so that it could scale a lot, and accessible to these influencers who might actually be pre-revenue in their own personal journeys.

And so how do you attract brands in a language that they speak, but in a way that's useful to you, so you have a feedback loop and it's worth asking your followers to review your content?

Henry Piney: So the use case on that side of things is one of: fine, you get your followers to review the content, you show it to them, you get the feedback in a survey environment, and also using facial coding and potentially other metrics, you could make better content. But then if a brand is looking to buy from, say, me as an influencer, then I can actually start to give them some [00:16:00] validated metrics around recall or purchase intent or some type of brand uplift.

Matt Celuszak: Yeah, exactly. So it's brand uplift, brand recall, full-funnel analysis. It provides all that.

Henry Piney: And on that side of it, could you just talk through in simple terms, for people who aren't familiar, how does the methodology work at the moment? It's mainly a survey-based methodology, but what happens, just at a very high level, if you're a respondent and you go to that survey, and then what does the client get at the end of it?

Matt Celuszak: Yeah, so for the client, we handle the sample; our partner sends it, and they're one of the main industry providers for participants. You basically just go into the platform, you upload your creative, upload a few brand goals and objectives, and then you select the audience you want to go to.

Then a link gets sent out to our sample [00:17:00] partner. Some people, like Twitch, bring their own samples, so some of them plug us into their own customer databases and communities. Then the link goes out, they agree, they consent to do it, they turn on their webcam. That's part of it.

It is entirely privacy-based, and we'll get into privacy in a second, but they turn on their webcam, and as they're filling out the survey they're being observed, and then they get to the point where they're popped into a mock feed or they're exposed to the content, if it's just a simple creative test.

And then we also do test and control. So there is a group of a hundred people who will act as the control group, to be able to compare all the results and see what the differences are. That's what creates our benchmarks. That's what allows us to do uplift and recall and awareness, consideration, all that stuff.

And then you can see how you actually compare, either to a gen-pop population or to a more targeted control population, if you want. For a brand, there are essentially three products: create, measure, learn. From a create perspective, you understand where [00:18:00] people are paying attention, what's catching their attention, which objects, scenes and elements inside the creative are capturing their attention.

You get to understand where the emotional key moments are: what are the parts of the creative that are actually driving their engagement, continuous engagement, within the creative, and you can get some emotional context as well, some of the labeling around happy, sad, that type of expression that's being reviewed. Then, on measure, it's very much your traditional brand metrics, funnel analysis, all that type of stuff. We also have implicit traits, so you can understand: is it authentic? Are you trustworthy? Are you sustainable? Whatever your brand goal is, are you actually getting there? And that's a subconscious kind of yes/no.

And then the final bit, of course, is learning. So once you have all that data across a bunch of your data sets, you can then learn your own bespoke metrics for you as a customer, or you can start to pipe in, say, your sales data and machine-learn against that.
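To make the test-and-control step concrete: an uplift figure on a yes/no brand metric is just the difference between the exposed group's rate and the control group's rate. This is a generic sketch under assumed numbers, not Element Human's actual calculation.

```python
def brand_uplift(exposed_yes: int, exposed_n: int,
                 control_yes: int, control_n: int) -> float:
    """Percentage-point uplift on a yes/no brand metric (e.g. aided recall).

    exposed_* -- respondents who saw the creative inside the mock feed
    control_* -- matched respondents who did not see it
    """
    exposed_rate = exposed_yes / exposed_n
    control_rate = control_yes / control_n
    return (exposed_rate - control_rate) * 100


# Hypothetical numbers: 46% recall among the exposed group versus 31% in a
# 100-person control group gives a 15-point uplift.
print(brand_uplift(exposed_yes=92, exposed_n=200, control_yes=31, control_n=100))
```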

Henry Piney: Got it. [00:19:00] Got it. And to ask the million dollar question, hopefully, maybe literally the million dollar question.

So, facial coding data and eye tracking: does it really work? You know, insight agencies have looked at this for a long time, and I think in some cases have had trouble with it, in terms of shaky cameras, self-selected respondent groups, problems with beards, whatever the issues may be. So I know that you're going to say it's really, really improving, but what's your honest evaluation at the moment of the strengths and weaknesses of facial coding data?

Matt Celuszak: Yeah, so facial coding data, or facial recognition data, frankly, has been around for yonks. And does it work? I mean, some governments are using it to do automated border control.

Henry Piney: Okay, yeah, true. Yeah.

Matt Celuszak: So, quite frankly, the facial [00:20:00] coding stuff works, and the computer vision side behind facial coding and, more specifically, facial recognition works. Being able to get landmarks on a face and to look at unique features on a face: that works. That side works. But I'm going to separate facial coding from emotion AI, because emotion is a judgment call as to whether somebody is feeling happy or sad, or it's a probability as to whether somebody's feeling happy or sad.

And that's where a lot of these models fall down. I would be careful and reticent, as a user, to be putting a lot of weight behind the emotional labels. What the emotional labels are doing is mathematically taking a probability of somebody, say, smiling and saying, well, we have labeled this smile as happiness.

And that is highly dependent on two things. One is your training dataset, so who you've trained against. If you've only trained against non-bearded people, bearded people are probably going to fool the system, because of the way that [00:21:00] supervised machine learning works. The other challenge that you have is the lack of universal agreement on what emotions are and how to label them.

There's a reason why body language goes largely unlabeled, even in human constructs. The word love and the emotion love in English is summarized in one word; in Sanskrit, it's 96 words. So there are 96 different variations that you can get in Sanskrit to describe the same emotional feel. You and I would have entirely different perspectives of what happiness means to each of us.

And I think the biggest misnomer has actually been less in the technology itself and more in the theory of emotional modeling. And so that's what I would caution any user on in this space, because of course it's always improving, because the technology is always improving.

Mathematically, you can get facial points off a face pretty accurately. You can deal with the human variability of movement now [00:22:00] with high-quality webcams. If you tested this four years ago, you would have been experiencing human variability from mobile webcam movement. Today, that is pretty much gone, because it's pretty easy to lock onto a face that's moving while the camera is moving, two moving objects going back and forth. So that stuff's pretty much gone. The racial issues that were happening, a lot of those are gone now because of the way that cameras have improved. But the part where I would still say be very careful is the labels that you're using, and your interpretation as a user of those labels.

So we offer the labels, but we offer the standard Ekman labels, and we only offer them in the create function, because they're only useful for saying: were you intending to get people to smile here? If so, then yes, they are smiling; that's what we can tell you. Whereas if we're saying, were you intending to make people disgusted here?

Well, we did a trial on disgust, and just the different ways that people express and react to disgust: we had Bear Grylls [00:23:00] eating an exploding bug, and people would grimace, people would laugh, people would cover their face, people would turn away from the screen, people would lean into the screen.

Everybody had a very different reaction, so you can't really model that. What you can model is that there's a lot of expressional movement going on, and therefore that's a key moment. And so the key-moment stuff is very solid, very expressional; emotional interpretation is very psychological. The one thing I would say, and this is what a lot of people get wrong, is that they think that emotions don't work.

The reality is that you're not going to get a better emotional read anywhere else, at least one that is objectively validated, because a psychologist will give you their view, and that opinion in a focus group is just too small a sample. And the ability to truly anticipate emotion, depending on how you ground-truth it, is just so difficult that even a 50 percent accuracy rate on [00:24:00] emotion, or an 80 percent accuracy, depending on which one you're trying to detect, is still a lot better than the 20 or 30 percent that you're going to get from a human. So where I would say it has been massively misused in the market research industry is that people think emotions and facial coding are there to scale up, to quantify a qualitative element. And it's not: it is to scale qual, not to enrich quant, and I think you have to be really careful about that.
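A small illustration of the separation Matt keeps drawing: landmark detection is the robust computer-vision layer, while an emotion label is a threshold applied to a model probability. The probabilities and threshold below are hypothetical.

```python
def label_emotions(emotion_probs: dict, threshold: float = 0.6) -> list:
    """Turn model probabilities into labels -- the judgment-call layer.

    The probabilities come from a classifier trained on somebody's labelled
    faces; they inherit every bias of that training set (e.g. few bearded
    faces) and of the labelling scheme itself. The threshold is arbitrary.
    """
    return [name for name, p in emotion_probs.items() if p >= threshold]


# Step 1 (robust computer vision, not shown): landmarks are detected on the face.
# Step 2 (judgment call): hypothetical probabilities a model might emit for one frame.
frame_probs = {"happiness": 0.62, "neutral": 0.30, "disgust": 0.08}

print(label_emotions(frame_probs))  # -> ['happiness'] -- a probability dressed up as a fact
```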

Henry Piney: Yeah, that's a really interesting distinction. So, Matt, what can you quantify then? You can quantify the moments, you can quantify the movements?

Matt Celuszak: Yeah. So what's hyper-accurate at the individual level is whether we've detected a face, whether we have actual landmarks on that face, and all the vectors that go in between.

So all these vectors: you might have anywhere from 256 to 1,000 data points on a face, and then those [00:25:00] ladder up into what we call classifications, so happiness, sadness, and that's looking at certain muscle groups predetermined by an emotional model. Our view in this whole space is that that's actually the wrong way to approach the problem.

Our job is not to go in there and tell you what emotion they're feeling, necessarily. Our job is to see if body language and behaviors are indicative of, or a leading indicator of, a metric that matters to you and your business. So, by having this really rich data on the face, you as Coca-Cola might say, well, when we see brand lift go up by X, that means we've hit our happiness moment.

Okay, then your version of happiness can be machine-learned against that metric, and we will tell you, it turns out vectors 3, 475 and 989 are all strong, predictive indicators of that [00:26:00] performance. And that's a much better mathematical way to do things in the emotion space than trying to label somebody's emotions, because our language is just frankly too limited.

And it's the wrong question. Emotions, emotions happen before people think. So if you ask somebody how they feel, you're actually asking them how they think they feel, because whatever they felt has already happened.

Henry Piney: What, so what did you mean by the vector example you just gave there?

Matt Celuszak: You have a bunch of landmarks on a face and then vectors are the space between the landmarks.

So we track how, how much they move back and forth.

Henry Piney: Okay.

Matt Celuszak: So if you can imagine a face with a face mask that has a bunch of dots on it, we'd be tracking those dots. And as the person moves their face, or moves to and from the camera, we're tracking the distances. And those distances are the things that tell you whether there's expressional movement or not, and whether that expressional movement is a large enough, relevant deviation from the norm.
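The dots-and-distances idea translates directly into code: compute the distances between landmark points in each frame, then flag frames where the overall movement deviates strongly from that person's own baseline. A minimal sketch follows; the z-score threshold and data layout are illustrative assumptions.

```python
import math
from statistics import mean, stdev


def pairwise_distances(landmarks):
    """Distances between every pair of (x, y) landmark points in one frame."""
    dists = []
    for i in range(len(landmarks)):
        for j in range(i + 1, len(landmarks)):
            (x1, y1), (x2, y2) = landmarks[i], landmarks[j]
            dists.append(math.hypot(x2 - x1, y2 - y1))
    return dists


def key_moments(frames, z_threshold=2.0):
    """Indices of frames whose movement deviates from the person's own norm.

    frames      -- list of per-frame landmark lists for one respondent
    z_threshold -- how many standard deviations counts as 'expressional
                   movement' (an illustrative choice, not a published value)
    """
    movement = []
    prev = pairwise_distances(frames[0])
    for frame in frames[1:]:
        cur = pairwise_distances(frame)
        movement.append(mean(abs(a - b) for a, b in zip(cur, prev)))
        prev = cur
    mu, sigma = mean(movement), stdev(movement)
    return [i + 1 for i, m in enumerate(movement) if sigma and (m - mu) / sigma > z_threshold]
```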

Henry Piney: Interesting. And again, drilling kind of quite deep here, [00:27:00] but I like to geek out on this stuff, as you know, what exactly are you tracking? So it's the face you're tracking. It's not broader body movement. It's just the face.

Matt Celuszak: So the way that we built our system, you can have it recognize any part of the body if you wanted it to. If the interaction was broader body movement, great. But television and media, at this point in time, is where a lot of the consumption is for people,

and it's a pretty passive activity, so the face is probably the most expressive and most telling. You'll see hand occlusion from time to time, and hand occlusion is good, and then eye tracking and head gaze: eye tracking and head movement fall within the vectors of the face, so we would track those as well.

Who knows, we might find out that a nose-flare twitch is indicative of, you know, high sales for the toothpaste category. But we just don't know that yet. The key is to get the really rich data first, and then be able to link it to whatever metrics you want.
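Linking rich facial data to "whatever metrics you want" is, at heart, supervised learning: per-respondent vector features on one side, a business outcome on the other, and a model that reports which vectors carry predictive weight. The sketch below uses synthetic data and scikit-learn purely for illustration; it is not Element Human's model.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# X: one row per respondent, one column per facial vector (e.g. landmark-pair
# distances summarised over the exposure). Synthetic here.
n_respondents, n_vectors = 500, 50
X = rng.normal(size=(n_respondents, n_vectors))

# y: did this respondent recall the brand afterwards? A synthetic outcome that
# secretly depends on vectors 3 and 17, standing in for "brand lift moved".
y = (X[:, 3] + 0.8 * X[:, 17] + rng.normal(scale=0.5, size=n_respondents) > 0).astype(int)

model = LogisticRegression(max_iter=1000).fit(X, y)

# Rank vectors by absolute weight: the analogue of "vectors 3, 475 and 989
# turn out to be strong, predictive indicators" in the conversation.
top = np.argsort(np.abs(model.coef_[0]))[::-1][:5]
print("most predictive vectors:", top.tolist())
```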

Henry Piney: Which probably actually links into the next question I was about to ask.

Things [00:28:00] like audio and the strength of their reaction when they're speaking: I imagine companies must be exploring that as well. I mean, there's a big rise in open-ended responses through audio. Is that something you've been looking at as well, or do you feel like there's enough to work on, broadly, with the face?

Matt Celuszak: No, multimodal has always been our end game, and the models that we build are input-agnostic; they need to take on any sensor data. Sensor data to us is anything that has features that can be extracted from an input on a time series. For us, that's sensor data.

The data structures are the same. So then it's like: okay, let's put audio in it and see if that works. Let's put galvanic skin response in it and see if that works. The core product, the one that isn't seen on the website but powers all of this, is what we call the human behavior data lab. [00:29:00]

It's a very ugly, long name, but that's what it is. And it's purely meant for experimentation, because we don't know where these sensors are going to be relevant. What I do like about that approach, though, is that a client can come in and we can say: great, all the survey stuff ticks the box for your measurement; on the learning element, we're going to tell you now what's actually relevant, and then we can fine-tune the data capture around what's relevant.

And so this optimization process allows any business or any creator to really scale up their measurement. And if they want to scale up measurement, then they get that constant feedback loop. And that's how we expect to standardize the industry.

Henry Piney: One of the other big factors, which we touched on very briefly earlier, is privacy. I'd imagine that must be a pretty big consideration within this space.

Matt Celuszak: Yeah. I mean, my early work right at the beginning, we were helping the Digital Catapult feed into GDPR.

So, [00:30:00] coming from market research, even straight out of academia into market research, privacy is kind of hammered into you from an ethics perspective. What I love about facial coding data is that it's not facial recognition. It's built on the same mathematical principles as facial recognition, but it's actually not facial recognition.

So I actually don't need to know who you are to know how you're expressing, because I literally just retrospectively normalize how you're reacting to this environment. I liken it to a comedian on stage. A comedian on stage really has a clear feedback loop: they know when a joke fails and when a joke wins, right?

They don't know the individuals in the audience, but they know if their content's working or not. So that's what I love about this technology. At the individual level, our data agreement as a data controller is with the individual. So no matter how much a client might want to get access to the face videos, nobody has access.

Even I don't have access to [00:31:00] those. Nobody has access to the face videos; it is highly protected. There are two people in our company who can access the face videos, to maintain or to train, and that's it. And it is entirely separated, even by proper server design, from the rest of the ecosystem.

So even if you hacked through the client portal to get the client data, you wouldn't be able to get through to the other side of it. Well, I mean, you might, but you'd have to work hard to get at it.

Henry Piney: Yeah. It's an interesting point you make as well around the fact that there is...

Matt Celuszak: Just one more thing, I apologize. One more thing on the individual side: because our relationship is with the individual, we just said, you control it. We capture enough data that we can give a strong enough result to a business, and I have an acceptable loss. So if people want to delete their data, we give them a code right at the end of the survey that they can just email to privacy, and their record is then deleted forever. [00:32:00]

Henry Piney: Oh, that's a very nice touch. I haven't heard about other companies doing that; I don't know if they are or not, I was just thinking about it. That's an interesting point that you make as well, that in some ways you are not interested in who the individual is at all. In fact, absolutely not interested.

The last thing you want is individual responses. What you want is aggregate responses, correct?

Matt Celuszak: Yeah,

Henry Piney: yeah.

Matt Celuszak: What we need, our focus, is to make the individual stuff, the small, rich data, as accurate as possible, so that the aggregate data on top fulfills its need, but also so that you need fewer and fewer people and you can kind of synthetically fill in the rest.

So the key is then winning the trust of the individual. And in that case, it's giving them full autonomy and power over what they've provided, giving them the ability to delete it right away. And frankly, once we have our metadata, we're not carrying anything that's personally identifiable.

Henry Piney: Okay. Matt, you're continuing my education: BLMs.

Behavioral learning [00:33:00] models. You know, I think I'm just about getting my head around LLMs. So what are behavioral learning models?

Matt Celuszak: Well, it's just, I would argue, the natural extension. Once you've got LLMs in play that can accurately predict what the next word's going to be, or can actually generate the next word and synthesize kind of a thought, then BLMs are the next step.

But I would say language is just relatively limited. Again, I use the love example as a very good example of this, but language is relatively limited; it's a synthesis and a summarization of our thought, whereas body language tends to be much more contextual. So a BLM is very simple: I think of it as the body-language physics engine for the future. It's taking the principles of LLMs and applying them to human behavior and emotion.

Henry Piney: And how does that work in practice? So, [00:34:00] you know, an LLM is looking at a combination of words and it's predicting the next words, and then it optimizes over time. And so for Element Human, for a BLM, what you're looking at, let's just stick with the face, is the physical reaction, and then you're calibrating it against a certain data set? And then...

Matt Celuszak: So, again, calibrating, yeah. You want to calibrate against the metrics that matter. The whole principle of the business is to bring empathy to digital interactions, to the internet and those interactions.

So I would say nothing's ever going to be done in isolation of just the face making an emotional call. What you want to do is make sure you have some sort of feedback loop in there, some sort of pre-cognitive, post-cognitive feedback. So: here's the emotional data set, and then did they click a button?

Did they move through the website? Did they leave the website? Whatever [00:35:00] that interaction might be: did they leave the app, did they click on something? That's the feedback loop. So you want to link it to something that's relevant. You could try to look for a signal that is predictive on its own, right?

But I would argue behaviors are very contextual, and they are contextual to an event that people have thought about and then done. So we're trying to find leading indicators of events that are commonly executed in a digital interaction, right? A survey is one way to do that.

But frankly, one of the bigger use cases I see for this, two in general: one is in UX/UI testing. Every website could put it on, kind of like Hotjar, and, say, 1 percent of people that visit the website could actually flip on their webcam, or they have the Element Human app and they just pop it in.

That's where we'd want to go; again, future scope, not currently. [00:36:00] And then further, if we are accurate enough and we do see enough interesting states, like boredom, depression, things like that, we can get into mental wellness tracking as well. That again would be somebody interacting with their device regularly and being able to say: hey, look, you seem to be excitable after these events,

you seem to be depressed after those events. So it's about what's exciting, what's draining, and being able to optimize your life that way if you wanted to, or teach the machine to actually recognize this stuff so it can say: hey, it's time to stand up and take a break.
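Framed the way Matt frames it, a behavioural learning model is a next-event predictor: given a short window of body-language features, predict the interaction that follows (keep watching, click, scroll away), much as a language model predicts the next token. The sketch below only shows the shape of that training setup; the window size, feature count, event vocabulary and model are all assumed for illustration.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

EVENTS = ["kept_watching", "clicked", "scrolled_away"]  # hypothetical event vocabulary
WINDOW = 30            # frames of body-language history per prediction (assumed)
N_FEATURES = 16        # per-frame features, e.g. summarised landmark distances

rng = np.random.default_rng(1)
frames = rng.normal(size=(5000, N_FEATURES))        # synthetic sensor time series
events = rng.integers(0, len(EVENTS), size=5000)    # synthetic event at each step

# Build (history window -> next event) pairs, the BLM analogue of
# (token context -> next token) in a language model.
X = np.stack([frames[i - WINDOW:i].ravel() for i in range(WINDOW, len(frames))])
y = events[WINDOW:]

model = RandomForestClassifier(n_estimators=50).fit(X, y)
print(EVENTS[model.predict(X[:1])[0]])
```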

Henry Piney: Yeah, obviously that would be the next stage of it.

I mean, you can very much see where that could go though. Yeah, unequivocally, as you said, you're monitoring the behavior, but it is the...

Matt Celuszak: ...interaction. Yeah, it is about the interaction. The interaction today is a survey, and it's a survey with an embedded experience of a feed, like a TikTok feed. And so that's the interaction today, and that's our learning ground to be able to identify these other signals. [00:37:00]

Henry Piney: Is anybody doing that at the moment from a UX perspective? You could see it's a very compelling use case: as you say, just asking a percentage of people to be monitored; you incentivize them to do it, see what they're doing, and then start to track what the emotional reaction is.

Matt Celuszak: I'd say there are three barriers preventing that from mainstream adoption. The first one is technical, in the sense that you have to have the X,Y coordinates from the eye tracking against the X,Y pixel or mouse coordinates of what's on the page and your areas of interest, and then be able to summarize the emotional state for the areas of interest.

That is all assuming that you have a very good probability model of the emotional state, which I don't agree has really been nailed yet. There are two companies out there that I think are doing some really good work in this area. Ours is probably the most raw data set, so we're going to be [00:38:00] building our own model around that stuff. We'll see.

The second barrier to entry is trust. Frankly, I don't know if I would go onto any website, particularly with the way data privacy has been abused in the past. I don't think I would actually even consent to any website unless I had full control of the data that went over there and could retrospectively remove that data, and have full, final and total deletion.

So we think that's an area we can own, and that's a very interesting area; we like that area and we've played a lot there. We've got a pretty good ethics committee for review on that. We've said no to some very big financial-gain opportunities specifically so as not to work in that space.

So: trust, the technology itself, and then the third is variability. It's hard to take one model's assumptions that work for one [00:39:00] website and assume they're going to work for the next website. And that variability also translates into human variability: people use different devices for different types of sites and different types of workflows.

And so I think it's a much broader problem, even more variant, more broad, than marketing. Marketing is kind of that happy medium. It's not as isolated as, say, trying to detect yawning from drivers in autonomous vehicles, or enhanced vehicles, or whatever it is.

That's a much more isolated workflow, but the variability in UX/UI design is almost intractable, I would say.
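The first barrier, lining up eye-tracking coordinates with what is on the page, reduces to a point-in-region lookup: map each gaze sample's X,Y position to an area of interest, then summarise whatever signal you trust per area. A minimal sketch under assumed page regions and a fixed sample rate:

```python
from typing import Optional

# Hypothetical areas of interest on a page, as (left, top, right, bottom) in pixels.
AOIS = {
    "hero_banner": (0, 0, 1280, 300),
    "product_image": (100, 320, 620, 900),
    "buy_button": (900, 760, 1180, 840),
}


def aoi_for(x: float, y: float) -> Optional[str]:
    """Return the area of interest containing a gaze sample, if any."""
    for name, (left, top, right, bottom) in AOIS.items():
        if left <= x <= right and top <= y <= bottom:
            return name
    return None


def dwell_time(gaze_samples, sample_ms: float = 33.0) -> dict:
    """Total milliseconds of gaze per AOI, given (x, y) samples at a fixed rate."""
    totals = {name: 0.0 for name in AOIS}
    for x, y in gaze_samples:
        name = aoi_for(x, y)
        if name:
            totals[name] += sample_ms
    return totals


print(dwell_time([(640, 150), (300, 500), (1000, 800), (1000, 800)]))
```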

Henry Piney: Okay. Yeah, that does make sense, broadly. I'm conscious of time, and I wanted to talk about the business a little bit as well. Not necessarily the detail of your journey, but it would be good just to get a little bit of an overview of the evolution of the business,

and then also what [00:40:00] you've learned along the way. I mean, as an entrepreneur doing this business since whenever, I'm sure there have been ups and downs, and you've probably got all sorts of insights to pass on to others.

Matt Celuszak: Yeah. Wow. Well, let's just say there have been four iterations of the business, and I touched on this before.

First, the algorithms weren't really accessible outside of a university. So the first was plugging those into an API, and that business was great, but there just weren't enough people building products commercially to create enough feedback for us to be able to get the insight. Today that might be a different story.

The second one was something that we packaged up as a demo solution called MIMO. And that one was good for the broadcast industry, but broadcast industry deals, predominantly rights-holder deals, were still being done by handshake agreement. So great, but it wasn't a big enough uplift or certainty [00:41:00] for sales, and there was a lot of resistance from salespeople to having the technology tell them which show to sell for the upside. So we didn't quite nail that product proposition.

The third one, then, was for the market research industry: giving them something that they could embed that met the heavy skepticism, as you talked about. But even for those that would take it, it didn't create a recurring revenue model for us. And they hated recurring revenue models, because they don't want to carry a cost base that they can't guarantee they'll use. And a lot of market research is ad hoc sales; their sales cycles are ad hoc.

So they would use us in their sales decks to win the deals, but then not use us ultimately in delivering the goods. And at each one of those, the change came from staring at insolvency papers: going, okay, we're going to have to put this business under, scale it back and redo it.

The last one, this current product, has been a very [00:42:00] interesting journey, because it was actually going the right way. It had a very strong product-market fit. It was serving the marketers inside the media industry, particularly from the media-owner standpoint, actually, surprisingly, whereas a lot of the attention providers were serving the media agencies for media buying.

I would say on that one, though, COVID hit after the first tranche of funding landed. Obviously everybody got skittish, some tough conversations had to happen, and I had to let go of a really good team. We scaled back; a couple of people stayed through. Then after COVID, the recession threats hit and we were trying to go for Series A funding, and that didn't go through.

So we had to hit another reset button. And at that time, I'd say the biggest thing is: cut deep and cut once. If you ever have to make cuts, and having been through a few rounds of cuts, it sucks. It's a very, very visceral human experience, [00:43:00] but also surround yourself with the people who've got your back,

even if you do fail, and I think that one's real hard. Diego, my co-founder, through and through has had my back, and I'd say if you get lucky enough to find a good partner through business, hold on to that. I hate giving advice, but I'd say the biggest thing I would probably tell myself is: always, always, always, no matter what you do, even if the business has to move in a certain direction, be people-first; just be human about it.

I've listened to a lot of lawyers and followed the legal process, and process does win the day, but God, you can avoid a lot of pain if you're really human about it. And I definitely didn't nail it on a few of those conversations; I messed it up in a number of ways. And that was a big one. Co-founders: love them to death, and make sure you always keep working on being the right fit.

They're there; it's a [00:44:00] marriage, because you're going to need the energy when you don't have it, and they put rocket fuel in the tank when you need it. And then also, really understand your business stage. I didn't understand our business stage in the market. This whole idea of growth at all costs just never really gelled with me, but I tried to live it and I failed at every turn, only to find that when I went back to my core, this is what I believe in,

the moment I did that, everything started to work. It's just really weird; it just started to work. And again, I think that's believing in yourself as a founder and all that type of stuff. I mean, you're told no, you're an idiot, your ideas are dumb, you're selling snake oil, all that type of stuff.

And you're like, well, I still think empathy is really important in a world of the internet, and I still think body language is probably our key identifier of how to empathize.

Henry Piney: Well, thank you, Matt. I know it's [00:45:00] been quite a journey. I also know it's on an upward swing, and I think it will continue to be so.

Now, talking of upward swings, we should jump onto a quick-fire round, but I'm going to invert these questions slightly, because I happen to know that you got married fairly recently. So I don't want to know what you think your best and worst characteristics are; I want to know what your wife thinks your best and worst characteristics are.

Matt Celuszak: So, in anticipation of this, I went and said: I know Henry, he typically asks these kinds of questions, I listen to your podcast. And so I actually had a conversation with her. So the best characteristic, she said, is positivity. She's like, you're sickeningly positive. And she said, even in the middle of the night,

when you can't sleep, you'll turn around and go, yeah, but it's okay, because there are so many interesting things going on. She's like, it's almost to the point that it's annoying, but it really helps; she said your positivity is just infectious. [00:46:00] Resilience: she said, boy, do you know how to take a knock and just

get back up again and keep going. So yeah, she said that. And then the third, she said, is kind of a two-parter; the best and worst is your kindness. You really want to help other people, but you do it, and this is the fourth, which is the worst, at the cost of your own time, and therefore time management.

I am rubbish at time management, absolute rubbish at time management. Thankfully, I've got a great board and I've got great operators around me who are not rubbish at time management. So it's always something that's constant work. So there you go:

positivity, resilience, and I would say kindness on the two fronts, in that I put other people's problems first and I like to solve those before I solve my own, and then time management, which is horrendous.

Henry Piney: Yeah, I think that sounds like a pretty fair summation. It'd also be interesting to ask that question, like, in five years.

Yeah, [00:47:00] I will; I've written down the answers, so we can do that. A couple of final questions. What do you do now that you wish you'd known, say, 10 years ago?

Matt Celuszak: Validate, validate, validate the problem. What does that mean? I just got really excited. I mean, I knew in my gut that there was a problem, but what I didn't do is spend enough time speaking to people in the market.

I went and calculated the other day: I spoke to about 1,600 or 1,700 people in the first five years, and that still wasn't enough. We didn't isolate the problem, and I got caught up in trying to find a way to make this bootstrap; it turned out I was more interested in trying to bootstrap the business than I was in actually solving the problem,

or recognizing what was required to solve the problem. And I didn't really clock that until the last four years. So I could have saved myself probably about six years' [00:48:00] worth of time and effort. Now, we had moving markets; there were things that were just out of our control, and things change.

And, you know, the Mike Tyson phrase: everyone has a plan until they get punched in the face. But spend more time validating. Just take your time. Actually, the rat race isn't there; there are very few products where there's a true race to market. In a lot of cases, if you build it right before you rush it,

you're actually going to end up with more market share in the long term. So I think, A, think long term, and B, just validate. Don't feel pressured to try and raise money or whatever. Just validate, and actually take that signal: failure is actually a good thing; learn from it and keep going.

Henry Piney: So Matt, it sounds like you weren't necessarily rushed in creating the product, and even from hearing you talk, you're incredibly knowledgeable about this space and you know the intricacies of [00:49:00] the product very, very well. It almost sounds like you were assuming there was a problem that needed to be solved, which maybe...

Matt Celuszak: I was assuming the problem was recognized.

I was assuming that it was more obvious than it was, and it just wasn't. And it was actually the second-order, third-order level of the problem that we should have honed in on at the time. The root cause of the problem remains the same, and the root cause is not going away.

I actually interviewed a product manager today, and I think he summarized it very well. He said: you've got all the ingredients for the recipe, but the bread isn't rising. And why is that? And that was very, very real up until about a year and a half, two years ago, when we isolated a problem where we could be a true, differentiated value, not just nudge [00:50:00]

the efficiency a little bit, but really say: here is where behavioral metrics can fundamentally make a visceral difference that the user can feel in their day-to-day lives. And that's what we had not really clocked initially. We thought accessibility was the issue in the first one, with the API.

It wasn't; it was that the real product problem wasn't out there. But we weren't the only ones; our competitors, I mean, Affectiva raised what, 65 million in the first five years of our company, and Realeyes raised 45 million, and frankly, they were dealing with the same problems. The market just wasn't there yet.

So...

Henry Piney: I hear you. The market...

Matt Celuszak: Validate.

Henry Piney: Yeah, I hear you. So, I'm conscious we're at time, so final question: what's your favorite or most impactful book, or a recent book? It could be a piece of media; it doesn't have to be a book.

Matt Celuszak: Yeah. Favorite one? Bill Browder's Red Notice.

Henry Piney: Hang on, what's [00:51:00] that? I've never even heard of that.

Matt Celuszak: Oh, my God. It reads like a spy novel. Bill Browder was a Jewish American entrepreneur who went to Russia during the nationalization and then privatization, the Yeltsin era, and set up and did very well, very well; like, a super smart numbers guy. But then he got caught up in the Putin era, and I'm not going to give away any more.

It is worth it; it literally reads like an international spy thriller. It is so good, but it's all real. And yeah, I have mad respect for that dude. Talk about a real entrepreneur.

Henry Piney: Matt, that sounds like a very good recommendation. I get a few of them; most of them, I have to confess, I don't read, but I will put that one on the list.

Matt Celuszak: This one is like a holiday read. You can actually put it at the fiction level, but it's just nonfiction. It's amazing.

Henry Piney: Amazing. Amazing. Matt, thanks so much. Great [00:52:00] as always to talk to you, and thank you for putting up with my annoying, pernickety questions. But I learned a lot.

Matt Celuszak: No, I appreciate it, Henry.

And I love listening to your podcast, so thanks for having me on.

Henry Piney: So, did you get all that? Matt's a super smart guy; he's very lucid and always thought-provoking. Even though I'm pretty familiar with the business, I actually went back and listened a couple of times, in particular to some of the explanations around behavioral and sensor data and how it may affect the industry.

As mentioned, I'm on Matt's board, so I may be a little biased, but I think he's a pioneer and a visionary in this space, and Element Human is a company that's going places. On that note, in my next interview I'll have Catherine Topp, the CEO of Yabble, one of the leaders in the synthetic and augmented data space, also a business to keep a close eye on.

Thanks as always to Insight Platforms for their support, and see you next time.
