How Advertisers Can Keep Mobile Users Engaged and Reduce Video-Ad Blocking
Abstract: Advertising researchers do not understand fully the impact different advertisement placement and delivery vehicles have on the mobile user’s experience. To better grasp the mobile user’s experience in real time, the authors collected data streams garnered from the brain and body, including visual fixations, heart rate, electroencephalography, skin conductance, and facial affect. The data helped […]
Personality traits affect the influences of intensity perception and emotional responses on hedonic rating and preference rank toward basic taste solutions
Abstract: This study aimed at determining, based on independent predictors of taste intensity and emotional response, whether individual personality traits could affect prediction models of overall liking and preference rank toward basic taste solutions. Sixty-seven participants rated taste intensities (TI) of four basic-taste solutions at both low and high concentrations, and of plain water. Emotional responses […]
Towards Automated Pain Detection in Children using Facial and Electrodermal Activity
Abstract: Accurately determining pain levels in children is difficult, even for trained professionals and parents. Facial activity and electrodermal activity (EDA) provide rich information about pain, and both have been used in automated pain detection. In this paper, we discuss preliminary steps towards fusing models trained on video and EDA features respectively. We demonstrate the […]
Subtle behavioural responses during negative emotion reactivity and down-regulation in bipolar disorder: A facial expression and eye-tracking study
Abstract: Abnormal processing of emotional information and regulation are core trait-related features of bipolar disorder (BD) but evidence from behavioural studies is conflicting. This study aimed to investigate trait-related abnormalities in emotional reactivity and regulation in BD using novel sensitive behavioural measures including facial expressions and eye movements. Fifteen patients with BD in full or […]
Impact of Learner-Centered Affective Dynamics on Metacognitive Judgements and Performance in Advanced Learning Technologies
Abstract: Affect and metacognition play a central role in learning. We examine the relationships between students’ affective state dynamics, metacognitive judgments, and performance during learning with MetaTutorIVH, an advanced learning technology for human biology education. Student emotions were tracked using facial expression recognition embedded within MetaTutorIVH and transitions between emotions theorized to be important to […]
Exploring Preservice Teachers’ Emotional Experiences in an Immersive Virtual Teaching Simulation through Facial Expression Recognition
Abstract: This study investigated preservice teachers’ emotional experiences while interacting within a virtual scenario-based teacher-training system called Simulation for Teaching Enhancement of Authentic Classroom behavior Emulator (SimTEACHER). We created three types of interactions (no interaction, unexpected interaction, and expected interaction) within SimTEACHER and examined the influences of the interaction design on preservice teachers’ emotional responses in […]
Assessment of human driver safety at Dilemma Zones with automated vehicles through a virtual reality environment
Abstract: Ensuring the safety of mixed traffic environments, in which human drivers interact with autonomous vehicles, is an impending challenge. A virtual traffic environment provides a risk-free opportunity to let human drivers interact with autonomous vehicles, indicating how variability in traffic environments and human responses compromises safety. Analyzing the section of road preceding an intersection […]
Using sequence mining to reveal the efficiency in scientific reasoning during STEM learning with a game-based learning environment
Abstract: The goal of this study was to assess how metacognitive monitoring and scientific reasoning impacted the efficiency of game completion during learning with Crystal Island, a game-based learning environment that fosters self-regulated learning and scientific reasoning by having participants solve the mystery of what illness impacted inhabitants of the island. We conducted sequential pattern mining and […]
Emotion recognition for semi-autonomous vehicles framework
Abstract: Humans, in their curiosity, have always wondered how to make machines feel and, at the same time, how a machine can detect emotions. The ability to feel emotions is perhaps one of the tasks that machines cannot replicate. In recent years, this hypothesis has increasingly […]
Visual Attention Mechanisms in Happiness vs. Trustworthiness Processing of Facial Expressions
Abstract: A happy facial expression makes a person look (more) trustworthy. Do perceptions of happiness and trustworthiness rely on the same face regions and visual attention processes? In an eye-tracking study, eye movements and fixations were recorded while participants judged the un/happiness or the un/trustworthiness of dynamic facial expressions in which the eyes and/or the […]
Research Report 2023
An in-depth look at the scientific landscape powered by iMotions software, showcasing groundbreaking research and the impact of our tools across scientific and industrial fields.
Share Your Research
850+ universities worldwide with an iMotions human behavior lab
73 of the top 100 highest ranked universities
710+ published research papers using iMotions
iMotions is used for some of the most interesting human behavior research studies carried out by top researchers around the world. Contact us to have your publication featured here.
The authors of these publications have used iMotions as a software tool within their research.
“Software should be cited on the same basis as any other research product such as a paper or a book; that is, authors should cite the appropriate set of software products just as they cite the appropriate set of papers” (Katz et al., 2020).
We therefore encourage you to cite the use of iMotions where appropriate.
How to cite iMotions
APA
iMotions A/S. (2024). iMotions (Version 10) [Computer software]. Copenhagen, Denmark.
Note: adjust the version and year where relevant.