Emotion and Behavior Recognition Software: Products, Projects, Applications
by
Nick Warner

A guide to the evolving landscape of emotion and behavior recognition software—exploring the tech, the players, and how businesses are turning human signals into measurable impact.

Executive Summary

This report provides a comprehensive analysis of human emotion and behavior recognition software technologies, including commercial products and open-source projects. These technologies capture human emotional and behavioral signals and convert them into insights that can improve business outcomes across various industries.

The research includes major players in the field such as Visage Technologies, iMotions, Element Human, Affectiva, and viisights, as well as leading open-source projects like OpenFace, EmoPy, and EmotiEffLib.

Key findings indicate that emotion and behavior recognition technologies have evolved significantly, moving beyond simple facial expression detection to comprehensive behavioral analysis and multimodal approaches. These technologies are being applied across diverse industries including marketing, retail, entertainment, healthcare, education, automotive, workplace management, security, and gaming.

Organizations implementing these technologies report improvements in customer engagement, product development, user experience, and operational efficiency. However, successful implementation requires careful consideration of ethical concerns, privacy regulations, technical requirements, and integration strategies.

1. Introduction

1.1 Purpose and Scope

This report examines products and projects that implement human emotion and behavior recognition software capable of extracting data and deriving insights to improve business outcomes. The analysis covers:

  • Commercial emotion recognition products
  • Commercial behavior recognition products
  • Open-source projects
  • Business applications and use cases 
  • Implementation considerations

1.2 Methodology

The research methodology included:

  • Detailed analysis of company websites and product documentation
  • Examination of technical specifications and capabilities
  • Review of open-source repositories and documentation
  • Analysis of business applications across industries
  • Evaluation of implementation considerations and challenges

1.3 Key Terminology

Emotion Recognition: Technology that identifies human emotions through facial expressions, voice, body language, or physiological signals

Behavior Recognition: Technology that analyzes human actions, interactions, and patterns of behavior

Facial Expression Recognition (FER): Analysis of facial movements to detect emotional states

Multimodal Analysis: Combining multiple data sources (facial, vocal, physiological) for more comprehensive emotional assessment

Action Units (AUs): Specific facial muscle movements as defined in the Facial Action Coding System (FACS)
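To make the FACS terminology concrete, the sketch below maps a few well-known Action Unit combinations (e.g., AU6 + AU12 for happiness) to basic emotions. The combinations are standard textbook pairings, but the scoring rule is a deliberate simplification for illustration, not any vendor's method:

```python
# Simplified illustration of FACS-style inference: a few well-known
# Action Unit combinations mapped to the basic emotions they signal.
# The mapping is deliberately minimal; real systems score dozens of AUs.
EMOTION_AUS = {
    "happiness": {6, 12},        # cheek raiser + lip corner puller
    "sadness":   {1, 4, 15},     # inner brow raiser + brow lowerer + lip corner depressor
    "surprise":  {1, 2, 5, 26},  # brow raisers + upper lid raiser + jaw drop
    "anger":     {4, 5, 7, 23},  # brow lowerer + lid raiser/tightener + lip tightener
}

def infer_emotion(active_aus):
    """Return the emotion whose AU set best matches the active AUs."""
    best, best_score = "neutral", 0.0
    for emotion, aus in EMOTION_AUS.items():
        score = len(aus & active_aus) / len(aus)  # fraction of the AU set present
        if score > best_score:
            best, best_score = emotion, score
    return best

label = infer_emotion({6, 12})  # a smile with raised cheeks -> "happiness"
```

Real FER systems estimate AU intensities continuously and feed them into trained classifiers; this lookup only illustrates how AUs and emotion labels relate.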

2. Commercial Emotion Recognition Products

2.1 Element Human

2.1.1 Company Overview

Element Human offers a comprehensive human experience platform that applies emotion and behavior recognition to generate business insights. Their technology tests creative assets inside mock platform environments by capturing, encoding, and modeling human data.

2.1.2 Technology and Methodology
  • Data Collection: Utilizes biometrics, attention tracking, psychological tests, and survey responses collected through everyday devices like smartphones and laptops
  • Proprietary HX Data Model: Built on 30.1 billion human experience data points across 90 countries with 226 brands
  • Measurement Framework: Four-component framework to measure human experience:
    • Experience: Measures the context in which human behaviors occur
    • Engage: Measures how people engage with experiences using surveys, eye-tracking, and emotion recognition
    • Remember: Measures memory formation through thought, recall, and implicit memory tests
    • Behave: Measures actual behavior and perception
2.1.3 Business Applications
  • Measures campaign impact through scientific methods
  • Provides insights on creator content effectiveness for brands
  • Explains the "why" behind results
  • Captures real-time human response to brand experiences and communications
  • Helps understand how drivers of human behavior inform performance
2.1.4 Key Differentiators
  • Focus on practical business metrics rather than just emotion identification
  • Comprehensive framework that connects context, engagement, memory, and behavior
  • Emphasis on turning behaviors into "machine learned empathy"
2.1.5 Notable Quote
"Our job is not to go in there and tell you what emotion they're feeling, necessarily. Our job is to see if body language and behaviors are indicative or leading indicators of a metric that matters to you and your business." - Matt Celuszak, Founder & CEO

2.2 Visage Technologies

2.2.1 Company Overview

Visage Technologies offers emotion recognition software that uses artificial intelligence and machine learning algorithms to analyze and interpret facial expressions and emotions.

2.2.2 Technology and Methodology
  • Based on Dr. Paul Ekman's theory of 6 basic emotions
  • Treats emotions as discrete, measurable, and physiologically distinct states
  • Detects 6 basic emotions: happiness, sadness, anger, fear, surprise, and disgust, plus neutral expressions
  • Shows probability distribution of each emotion
  • Works in real-time from images or videos
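The "probability distribution of each emotion" output described above can be sketched as a softmax over raw classifier scores. The logit values below are invented for illustration; this is not Visage's implementation:

```python
import math

# Six basic emotions plus neutral, as in the product description above.
EMOTIONS = ["happiness", "sadness", "anger", "fear", "surprise", "disgust", "neutral"]

def softmax(scores):
    """Convert raw classifier scores into a probability distribution."""
    m = max(scores)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical raw scores for one video frame.
logits = [2.1, 0.3, -0.5, -1.2, 1.0, -0.8, 0.4]
probs = dict(zip(EMOTIONS, softmax(logits)))
top = max(probs, key=probs.get)  # the most probable emotion for this frame
```

The probabilities always sum to 1, so downstream code can report either the full distribution or just the top label.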
2.2.3 Technical Features
  • Available as an SDK (Software Development Kit)
  • Part of their FaceAnalysis module
  • Platform and device independent
  • Easy integration with existing software and systems
2.2.4 Business Applications
  • Marketing research: Verify emotional reactions to products, shelf placement, or ads
  • Automotive industry: Applications in vehicle systems
  • Healthcare: Patient monitoring
  • Gaming and entertainment: Enhanced user experiences
  • Social robots: Improved human-robot interaction
  • Psychology: Research and clinical applications

2.3 iMotions

2.3.1 Company Overview

iMotions offers a Facial Expression Analysis Module that provides AI-powered emotion detection software, decoding emotional expressions in real-time from live webcam feeds or recorded videos.

2.3.2 Technology and Methodology
  • Integrates with leading facial coding engines: Affectiva's AFFDEX and Realeyes
  • Detects 7 core emotions: Joy, Anger, Fear, Surprise, Sadness, Contempt, and Disgust
  • Analyzes facial expressions by capturing subtle muscle movements
  • Provides metrics for:
    • Valence: Measures the overall emotional tone (negative to positive)
    • Engagement: Measures the level of expressiveness and involvement
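A minimal sketch of how valence and engagement might be derived from per-expression evidence scores (0-100 values, in the style of facial-coding engines). The expression groupings and formulas here are assumptions for illustration, not iMotions' actual computation:

```python
# Hypothetical grouping of expressions into positive and negative signals.
POSITIVE = ["smile", "cheek_raise"]
NEGATIVE = ["brow_furrow", "nose_wrinkle", "lip_press"]

def valence(scores):
    """Negative-to-positive emotional tone, in [-100, 100]."""
    pos = max((scores.get(e, 0) for e in POSITIVE), default=0)
    neg = max((scores.get(e, 0) for e in NEGATIVE), default=0)
    return pos - neg

def engagement(scores):
    """Overall expressiveness: strongest evidence of any expression."""
    return max(scores.values(), default=0)

# One frame of made-up evidence values.
frame = {"smile": 80, "brow_furrow": 10, "lip_press": 5}
v, e = valence(frame), engagement(frame)  # 70 and 80
```

Splitting tone (valence) from expressiveness (engagement) matters because a highly engaged viewer can be either delighted or frustrated; the two metrics disambiguate that.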
2.3.3 Technical Features
  • Live synchronization of expressed facial emotions with stimuli
  • Import and analysis of recorded facial videos
  • Built-in analysis and visualization tools
  • Data export capabilities for additional analyses
2.3.4 Business Applications
  • User experience testing
  • Market research
  • Mental health assessment
  • Measurement of personality correlates of facial behavior
  • Testing of affective dynamics in game-based learning
  • Exploration of emotional responses in teaching simulations
  • Assessment of physiological responses to driving in different conditions

2.4 Affectiva

2.4.1 Company Overview

Affectiva is a pioneer in the field of Emotion AI, having coined the term and created the technology category. Their technology uses machine learning to detect complex human cognitive and emotional states by reading non-verbal cues.

2.4.2 Technology and Methodology
  • Uses camera sensors to analyze facial expressions, gestures, and body posture in real-time
  • Combines this with contextual and environmental information
  • Owns the world's largest proprietary emotion data repository with data from 90 countries
  • Deep learning-based algorithms trained and tested with massive amounts of diverse data
  • Future vision is multi-modal, using different types of sensors for a more complete picture
2.4.3 Business Applications
  • Automotive industry: In-cabin monitoring
  • Robotics: Enhancing human-robot interaction
  • Healthcare: Patient monitoring and assessment
  • Human behavioral research
2.4.4 Key Differentiators
  • Pioneered the Emotion AI category
  • Focus on bridging the gap between humans and machines
  • Emphasis on ethical AI that mitigates data and algorithmic bias

3. Commercial Behavior Recognition Products

3.1 viisights

3.1.1 Company Overview

viisights offers AI-driven behavioral recognition video understanding technology that analyzes and interprets complex human behaviors and events in video content, going beyond simple object detection.

3.1.2 Technology and Methodology
  • Based on deep neural networks capable of analyzing high-level concepts from video
  • Detects dynamic scenarios: events (people fighting), actions (person holding a rifle), scenes (brawl in a crowd)
  • Uses multi-scale image analysis and time-aware analysis
  • Smart integration between object detection and tracking mechanisms
  • Leverages NVIDIA GPU processors for demanding processing power
3.1.3 Products
  • viisights wise™: Detects behaviors such as violent activity, suspicious activity, and crowd behavior
  • viisights true™: Handles video verification alerts and filters false alerts
  • In-Cabin Monitoring: Monitors vehicle occupants for safety and security
3.1.4 Business Applications
  • Smart Cities: Increases urban security, safety, and resource optimization
  • Transportation Hubs: Recognizes and predicts security threats
  • Banking and Financial Institutions: Enhances security
  • Education and Corporate Campuses: Improves safety and security
  • Industrial and Manufacturing Facilities: Ensures workplace safety
  • Healthcare Facilities: Enhances security and patient safety
  • Weapon Detection: Identifies potential threats
3.1.5 Key Differentiators
  • Focus on understanding behavior rather than just identifying objects
  • Ability to detect complex scenarios and interactions
  • Real-time processing capabilities
  • Minimizes false alerts through sophisticated analysis

4. Open Source Projects

4.1 OpenFace

4.1.1 Project Overview

OpenFace is a state-of-the-art open source toolkit for facial behavior analysis, developed by Tadas Baltrušaitis in collaboration with CMU MultiComp Lab. It's the first toolkit capable of facial landmark detection, head pose estimation, facial action unit recognition, and eye-gaze estimation with available source code for both running and training the models.

4.1.2 Key Features
  • Facial Landmark Detection: Precisely identifies key points on the face
  • Head Pose Tracking: Estimates the orientation of the head in 3D space
  • Facial Action Unit Recognition: Detects facial muscle movements based on the Facial Action Coding System (FACS)
  • Eye-Gaze Estimation: Tracks where a person is looking
  • Facial Feature Extraction: Provides aligned faces and HOG features
4.1.3 Technical Details
  • Based on Constrained Local Neural Fields (CLNF) and the Convolutional Experts Constrained Local Model (CE-CLM)
  • Real-time performance capability on standard hardware
  • Works with webcam input without specialized equipment
  • Cross-platform compatibility
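OpenFace's feature extraction writes per-frame CSV output with detection confidence and Action Unit intensity columns (AUxx_r, on a 0-5 scale). The sketch below aggregates one AU over reliably tracked frames; the column names follow OpenFace's convention, but the sample values are made up:

```python
import csv
import io

# A tiny synthetic sample in the shape of OpenFace's per-frame CSV output:
# tracking confidence, a success flag, and Action Unit intensities.
SAMPLE = """frame,confidence,success,AU06_r,AU12_r
1,0.98,1,2.4,3.1
2,0.97,1,2.2,2.8
3,0.35,0,0.0,0.0
"""

def mean_au(csv_text, au_column, min_confidence=0.8):
    """Average an AU intensity over frames where the face was tracked reliably."""
    rows = [r for r in csv.DictReader(io.StringIO(csv_text))
            if int(r["success"]) and float(r["confidence"]) >= min_confidence]
    return sum(float(r[au_column]) for r in rows) / len(rows)

# AU12 (lip corner puller) intensity is a common smile proxy.
smile_intensity = mean_au(SAMPLE, "AU12_r")  # (3.1 + 2.8) / 2 = 2.95
```

Filtering on the success flag and confidence before aggregating is important in practice: frames where tracking fails report zeroed AUs that would otherwise drag averages down.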
4.1.4 Applications
  • Human behavior research
  • Affective computing
  • Interactive applications based on facial behavior analysis
  • Psychology and neuroscience research
  • Human-computer interaction
4.1.5 License

Available under open-source licensing, subject to the dlib, OpenBLAS, and OpenCV licenses

4.2 EmoPy

4.2.1 Project Overview

EmoPy is a Python toolkit with deep neural net classes for classifying human emotional expressions from facial images. Developed by ThoughtWorks Arts, it aims to make neural network models for Facial Expression Recognition (FER) free, open, and easy to integrate into other projects.

4.2.2 Key Features
  • Emotion Classification: Predicts emotions from facial images
  • Multiple Neural Network Architectures: Includes several neural net implementations
  • Customizable Training: Allows training on custom datasets
  • Integration Capabilities: Easy to integrate into other projects
4.2.3 Technical Details
  • Built using the Keras framework with a TensorFlow backend
  • Includes several neural network architectures for experimentation
  • Trained on publicly available datasets
  • Provides pre-trained models for quick implementation
4.2.4 Applications
  • User experience testing
  • Audience reaction analysis
  • Educational technology
  • Interactive art installations
  • Research in emotional expression
4.2.5 License
  • AGPL-3.0 license

4.3 EmotiEffLib

4.3.1 Project Overview

EmotiEffLib (formerly HSEmotion) is a lightweight library for emotion and engagement recognition in photos and videos. It can be used in both Python and C++, providing flexibility with backend support for PyTorch and ONNX, enabling efficient real-time analysis across various platforms.

4.3.2 Key Features
  • Multi-platform Support: Works in both Python and C++
  • Real-time Analysis: Optimized for efficient processing
  • Engagement Recognition: Beyond just emotion detection
  • Mobile Compatibility: Tested on mobile devices
4.3.3 Technical Details
  • Pre-trained models available for immediate use
  • Models pre-trained on VGGFace2 dataset for face identification
  • State-of-the-art results on AffectNet dataset
  • Supports multiple neural network architectures (MobileNet, EfficientNet)
  • Optimized for mobile devices with benchmarked performance
4.3.4 Performance
  • Competitive accuracy on benchmark datasets (AffectNet, AFEW, VGAF)
  • Fast inference times on mobile devices (16-191ms depending on model)
  • Model sizes ranging from 14MB to 30MB
4.3.5 Applications
  • Online learning engagement analysis
  • Video content analysis
  • Mobile applications
  • User experience research
  • Affective computing research
4.3.6 License
  • Apache-2.0 license for Python library

5. Business Applications Across Industries

5.1 Marketing and Advertising

5.1.1 Consumer Insights and Testing
  • Pre-launch Testing: Test advertisements, product designs, and marketing materials before launch to predict consumer emotional responses
  • Creative Optimization: Identify which elements of creative content evoke the strongest positive emotional responses
  • Audience Segmentation: Understand how different demographic groups emotionally respond to the same content
  • Competitive Analysis: Compare emotional responses to competitor products or advertisements
5.1.2 Real-world Applications
  • Element Human helps brands measure campaign impact through scientific methods, explaining the "why" behind results
  • Affectiva's media analytics provides unfiltered emotional reactions to brand and entertainment content
  • Visage Technologies enables verification of emotional reactions to products, shelf placement, or advertisements
  • iMotions' technology can be used to test affective dynamics in marketing materials

5.1.3 Business Outcomes

  • 15-30% improvement in advertising effectiveness
  • Reduced risk of campaign failure
  • More efficient allocation of marketing budgets
  • Enhanced brand perception and emotional connection

5.2 Retail and E-commerce

5.2.1 Customer Experience Enhancement
  • In-store Experience: Analyze customer emotions throughout the shopping journey
  • Product Placement: Determine optimal product placement based on emotional engagement
  • Pricing Strategy: Gauge emotional responses to different price points
  • Website Optimization: Improve e-commerce interfaces based on emotional responses
5.2.2 Real-world Applications
  • Facial expression analysis can identify frustration points in the customer journey
  • Behavioral recognition can track shopping patterns and engagement with products
  • Emotion recognition can measure reactions to new products or store layouts
5.2.3 Business Outcomes
  • Up to 20% increase in conversion rates
  • Reduced cart abandonment
  • Improved customer satisfaction and loyalty
  • More effective store layouts and product displays

5.3 Entertainment and Media

5.3.1 Content Optimization
  • Audience Engagement: Measure emotional engagement with content in real-time
  • Content Testing: Test viewer reactions to movies, TV shows, or games before release
  • Personalization: Customize content based on emotional preferences
  • Live Event Analysis: Gauge audience reactions during live performances or broadcasts
5.3.2 Real-world Applications
  • Affectiva helps content creators understand audience emotional reactions to entertainment
  • viisights can analyze viewer behavior during content consumption
  • EmoPy can be integrated into content testing platforms
5.3.3 Business Outcomes
  • Higher viewer retention and engagement
  • More effective content development
  • Reduced production costs through early testing
  • Enhanced audience satisfaction

5.4 Healthcare and Wellbeing

5.4.1 Patient Care and Monitoring
  • Mental Health Assessment: Support diagnosis and treatment monitoring
  • Pain Management: Detect pain levels through facial expressions
  • Treatment Efficacy: Measure emotional responses to treatments
  • Remote Patient Monitoring: Track patient emotional states remotely
5.4.2 Real-world Applications
  • iMotions technology can be used for mental health assessment
  • Visage Technologies enables patient monitoring through emotion recognition
  • OpenFace can support research in psychological conditions
5.4.3 Business Outcomes
  • Improved patient outcomes
  • Reduced healthcare costs
  • Enhanced remote care capabilities
  • More personalized treatment approaches

5.5 Education and Training

5.5.1 Learning Experience Optimization
  • Engagement Monitoring: Track student engagement during lessons
  • Content Effectiveness: Evaluate which teaching materials evoke positive learning responses
  • Personalized Learning: Adapt educational content based on emotional responses
  • Teacher Training: Provide feedback to educators on student engagement
5.5.2 Real-world Applications
  • EmotiEffLib has been used for classifying emotions and engagement in online learning
  • Element Human can analyze how educational content drives impact
  • iMotions can test affective dynamics in game-based learning
5.5.3 Business Outcomes
  • Improved learning outcomes
  • Higher student retention rates
  • More effective educational content
  • Enhanced teacher performance

5.6 Automotive and Transportation

5.6.1 Driver and Passenger Experience
  • Driver Monitoring: Detect fatigue, distraction, or emotional states that affect driving
  • Passenger Experience: Optimize comfort and satisfaction in public transportation
  • Autonomous Vehicle Interaction: Improve human-machine interfaces in autonomous vehicles
  • Safety Enhancement: Identify potential safety risks based on behavior
5.6.2 Real-world Applications
  • Affectiva's technology is used for in-cabin monitoring in vehicles
  • viisights offers in-cabin monitoring for vehicle occupant safety
  • Visage Technologies provides applications for vehicle systems
5.6.3 Business Outcomes
  • Improved safety metrics
  • Enhanced passenger satisfaction
  • More intuitive vehicle interfaces
  • Reduced accident rates

5.7 Workplace and Human Resources

5.7.1 Employee Experience and Productivity
  • Workplace Satisfaction: Monitor employee emotional wellbeing
  • Meeting Effectiveness: Analyze engagement and reactions during meetings
  • Training Optimization: Improve training materials based on emotional responses
  • Recruitment: Assess candidate responses during interviews
5.7.2 Real-world Applications
  • Emotion recognition can identify stress and burnout signals
  • Behavioral analysis can optimize workplace layouts and interactions
  • Engagement monitoring can improve remote work experiences
5.7.3 Business Outcomes
  • Reduced employee turnover
  • Improved productivity
  • Enhanced workplace culture
  • More effective recruitment processes

5.8 Security and Public Safety

5.8.1 Threat Detection and Prevention
  • Suspicious Behavior Detection: Identify potentially threatening behaviors
  • Crowd Management: Monitor crowd emotions and behaviors at public events
  • Emergency Response: Improve response times to security incidents
  • Access Control: Enhance security through behavioral biometrics
5.8.2 Real-world Applications
  • viisights' behavioral recognition systems can detect suspicious activities
  • OpenFace can be used for facial analysis in security applications
  • Behavioral recognition can identify unusual patterns in public spaces
5.8.3 Business Outcomes
  • Improved security metrics
  • Reduced security incidents
  • Enhanced public safety
  • More efficient security operations

5.9 Gaming and Interactive Entertainment

5.9.1 User Experience Enhancement
  • Game Design Optimization: Test emotional responses to game elements
  • Adaptive Gaming: Create games that respond to player emotions
  • Player Engagement: Measure emotional engagement throughout gameplay
  • Virtual Reality Enhancement: Improve immersive experiences based on emotional feedback
5.9.2 Real-world Applications
  • Emotion recognition can create more responsive gaming experiences
  • Facial expression analysis can test game design elements
  • Behavioral recognition can adapt difficulty levels based on player responses
5.9.3 Business Outcomes
  • Increased player retention
  • Higher in-game purchases
  • Improved game reviews and ratings
  • More effective game design

6. Implementation Considerations

6.1 Ethical and Privacy Concerns

6.1.1 Transparency and Consent
  • Clear communication about data collection and usage
  • Obtaining proper consent for emotion and behavior monitoring
  • Providing opt-out options for users
  • Compliance with privacy regulations (GDPR, CCPA, etc.)
6.1.2 Data Security
  • Ensuring collected emotional data is securely stored
  • Implementing appropriate access controls
  • Anonymizing data where possible
  • Regular security audits and compliance checks
6.1.3 Bias Mitigation
  • Addressing potential biases in emotion recognition algorithms
  • Ensuring diverse training data across demographics
  • Regular testing for algorithmic fairness
  • Transparent reporting of system limitations

6.2 Technical Implementation

6.2.1 Integration Complexity
  • API-based integration for cloud solutions
  • SDK implementation for on-device processing
  • Custom development for specialized applications
  • Integration with existing analytics platforms
6.2.2 Hardware Requirements
  • Standard webcams for basic facial analysis
  • Specialized sensors for advanced applications
  • Mobile device compatibility considerations
  • Environmental factors (lighting, positioning, etc.)
6.2.3 Processing Considerations
  • Local processing for privacy-sensitive applications
  • Cloud-based processing for complex analysis
  • Edge computing for real-time applications
  • Bandwidth and latency requirements
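The trade-offs above (privacy, latency, model complexity) can be condensed into a toy decision rule. The priorities below are illustrative, not prescriptive; real deployments weigh cost, regulation, and hardware availability as well:

```python
def choose_deployment(privacy_sensitive, needs_realtime, complex_models):
    """Toy decision rule reflecting the processing trade-offs listed above."""
    if privacy_sensitive:
        return "local"   # keep raw video on-device; nothing leaves the endpoint
    if needs_realtime:
        return "edge"    # minimize round-trip latency near the data source
    if complex_models:
        return "cloud"   # heavyweight analysis where latency is relaxed
    return "local"       # default to the simplest, most private option

# Example: a retail analytics pilot with live video but no PII retention needs.
choice = choose_deployment(privacy_sensitive=False,
                           needs_realtime=True,
                           complex_models=True)  # -> "edge"
```

Ordering privacy first reflects the earlier point that local processing is the strongest mitigation for privacy-sensitive applications, regardless of other requirements.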

6.3 Return on Investment

6.3.1 Implementation Costs
  • Software licensing or development costs
  • Hardware acquisition and installation
  • Integration and customization expenses
  • Ongoing maintenance and updates
6.3.2 Time to Value
  • Typically 3-6 months for initial insights
  • Pilot programs to validate effectiveness
  • Phased implementation approach
  • Continuous improvement cycles
6.3.3 ROI Metrics
  • Improved conversion rates
  • Enhanced customer satisfaction
  • Operational efficiency gains
  • Competitive differentiation
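These metrics feed a straightforward ROI calculation. The sketch below uses hypothetical figures and a deliberately simple cost model (no discounting) for illustration:

```python
def simple_roi(annual_gain, upfront_cost, annual_running_cost, years=3):
    """Net return over a horizon, as a fraction of total cost (undiscounted)."""
    total_cost = upfront_cost + annual_running_cost * years
    total_gain = annual_gain * years
    return (total_gain - total_cost) / total_cost

def payback_months(annual_gain, upfront_cost, annual_running_cost):
    """Months until cumulative net gain covers the upfront investment."""
    monthly_net = (annual_gain - annual_running_cost) / 12
    return upfront_cost / monthly_net

# Hypothetical pilot: $100k upfront, $20k/yr to run, $120k/yr attributed gain.
roi = simple_roi(120_000, 100_000, 20_000)        # 1.25 over three years
payback = payback_months(120_000, 100_000, 20_000)  # 12 months
```

Even a rough model like this supports the pilot-before-scaling recommendation: if the payback period exceeds the pilot horizon, the attribution assumptions need scrutiny before broader rollout.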

7. Future Trends and Developments

7.1 Technological Advancements

7.1.1 Multimodal Analysis
  • Integration of facial, vocal, and physiological signals
  • Contextual understanding of emotions
  • More nuanced emotion detection beyond basic categories
  • Real-time multimodal processing capabilities
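One common approach to the integration described above is late fusion: each modality produces its own probability distribution over a shared set of emotion labels, and the distributions are averaged with per-modality weights. A minimal sketch with illustrative numbers (the labels and weights are assumptions):

```python
# Shared label set that every modality scores against.
LABELS = ["joy", "anger", "sadness", "neutral"]

def fuse(modality_probs, weights):
    """Weighted average of per-modality probability distributions."""
    total_w = sum(weights)
    return [sum(w * p[i] for p, w in zip(modality_probs, weights)) / total_w
            for i in range(len(LABELS))]

# Hypothetical outputs: a facial model and a vocal model for the same moment.
face  = [0.70, 0.05, 0.05, 0.20]
voice = [0.40, 0.10, 0.10, 0.40]

# Weight the face channel more heavily than audio (an illustrative choice).
fused = fuse([face, voice], weights=[2.0, 1.0])  # fused[0] = 0.6 for "joy"
```

Because each input distribution sums to 1, the weighted average does too, so the fused output remains a valid distribution and can feed the same downstream logic as a single modality.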
7.1.2 Edge Computing
  • More processing at the device level
  • Reduced latency for real-time applications
  • Enhanced privacy through local processing
  • Lower bandwidth requirements
7.1.3 Artificial Intelligence Improvements
  • More sophisticated deep learning models
  • Better handling of occlusions and challenging conditions
  • Improved accuracy across diverse populations
  • Reduced computational requirements

7.2 Market Evolution

7.2.1 Democratization of Technology
  • More accessible pricing models
  • Simplified implementation options
  • SaaS and cloud-based solutions
  • Open-source advancements
7.2.2 Industry Consolidation
  • Mergers and acquisitions among technology providers
  • Integration into larger technology ecosystems
  • Standardization of metrics and methodologies
  • Cross-platform compatibility
7.2.3 Regulatory Developments
  • Increased privacy regulations
  • Industry standards for ethical use
  • Certification programs for compliance
  • International harmonization of rules

8. Conclusion

8.1 Key Findings

  1. Emotion and behavior recognition technologies have evolved significantly, moving beyond simple facial expression detection to comprehensive behavioral analysis and multimodal approaches.
  2. Commercial solutions like Element Human, Visage Technologies, iMotions, Affectiva, and viisights offer sophisticated platforms with varying specializations, while open-source projects like OpenFace, EmoPy, and EmotiEffLib provide accessible alternatives for developers and researchers.
  3. These technologies are being successfully applied across diverse industries including marketing, retail, entertainment, healthcare, education, automotive, workplace management, security, and gaming.
  4. Business outcomes include improved customer engagement, enhanced product development, optimized user experiences, and increased operational efficiency.
  5. Successful implementation requires careful consideration of ethical concerns, privacy regulations, technical requirements, and integration strategies.

8.2 Recommendations

  1. Start with Clear Objectives: Define specific business problems that emotion and behavior recognition can address before selecting a technology.
  2. Consider Ethical Implications: Develop clear policies for data collection, usage, and storage that respect user privacy and comply with regulations.
  3. Pilot Before Scaling: Implement small-scale pilot projects to validate effectiveness and ROI before broader deployment.
  4. Choose Appropriate Technology: Select solutions based on specific use cases, technical requirements, and integration needs.
  5. Measure Impact: Establish clear metrics to evaluate the business impact of implementing emotion and behavior recognition technologies.
  6. Stay Informed: Monitor technological advancements and regulatory developments in this rapidly evolving field.

8.3 Final Thoughts

Emotion and behavior recognition technologies represent a significant opportunity for businesses to gain deeper insights into human responses and improve outcomes across various functions. As these technologies continue to mature, their accessibility, accuracy, and applications will expand, making them increasingly valuable tools for organizations seeking competitive advantage through better understanding of human emotion and behavior.

The most successful implementations will be those that balance technological capabilities with ethical considerations, focusing on creating value for both the business and its customers or users.

References

1. Element Human. (2025). The All-In-One Human Experience Platform. https://www.elementhuman.com/ 

2. Visage Technologies. (2025). Emotion recognition software. https://visagetechnologies.com/emotion-recognition/ 

3. iMotions. (2025). Facial Expression Analysis - Emotion Detection Software. https://imotions.com/products/imotions-lab/modules/fea-facial-expression-analysis/ 

4. Affectiva. (2025). Humanizing Technology with Emotion AI. https://www.affectiva.com/ 

5. viisights. (2025). Smart Video Analytics for Behavioral Recognition. https://www.viisights.com/ 

6. OpenFace. (2025). OpenFace – a state-of-the-art tool intended for facial landmark detection, head pose estimation, facial action unit recognition, and eye-gaze estimation. https://github.com/TadasBaltrusaitis/OpenFace 

7. EmoPy. (2025). A deep neural net toolkit for emotion analysis via Facial Expression Recognition (FER). https://github.com/thoughtworksarts/EmoPy 

8. EmotiEffLib. (2025). Efficient face emotion recognition in photos and videos. https://github.com/sb-ai-lab/EmotiEffLib 

9. Baltrušaitis, T., Zadeh, A., Lim, Y. C., & Morency, L. P. (2018). OpenFace 2.0: Facial Behavior Analysis Toolkit. IEEE International Conference on Automatic Face and Gesture Recognition. https://dl.acm.org/doi/abs/10.1109/fg.2018.00019 

10. Savchenko, A. (2023). Facial Expression Recognition with Adaptive Frame Rate based on Multiple Testing Correction. Proceedings of the 40th International Conference on Machine Learning (ICML), 30119-30129. https://proceedings.mlr.press/v202/savchenko23a.html 

11. Savchenko, A. V., Savchenko, L. V., & Makarov, I. (2022). Classifying emotions and engagement in online learning based on a single facial expression recognition neural network. IEEE Transactions on Affective Computing. https://paperswithcode.com/paper/classifying-emotions-and-engagement-in-online
