  • Comparing the Affectiva iMotions Facial Expression Analysis Software with EMG

    Open Access | 28/02/2020 | Göttingen University

    Abstract: People’s faces display emotions, informing others about their affective states. To measure facial displays of emotion, electromyography (EMG) has been widely used, requiring electrodes and technical equipment. More recently, emotion recognition software has been developed that detects emotions from videos. However, its validity and comparability to EMG are unclear. The aim of […]

  • Eye gaze and facial displays of emotion during emotional film clips in remitted patients with bipolar disorder

    Open Access | Peer-Reviewed | 27/02/2020

    Aberrant emotional reactivity is a putative endophenotype for bipolar disorder (BD), but the findings of behavioral studies are often negative due to suboptimal sensitivity of the employed paradigms. This study aimed to investigate whether visual gaze patterns and facial displays of emotion during emotional film clips can reveal subtle behavioral abnormalities in remitted BD patients. Thirty-eight […]

  • A Visual Attentive Model for Discovering Patterns in Eye-Tracking Data—A Proposal in Cultural Heritage

    Open Access | Peer-Reviewed | 24/02/2020 | Università Politecnica delle Marche

    In the Cultural Heritage (CH) context, art galleries and museums employ technological devices to enhance and personalise the museum visit experience. However, the most challenging aspect is determining what the visitor is interested in. In this work, a novel Visual Attentive Model (VAM) learned from eye-tracking data is proposed. In […]

  • Mobile and stationary eye tracking comparison – package design and in-store results

    Open Access | Peer-Reviewed | 11/02/2020 | University of Tartu

    Abstract: Purpose: This paper aims to test the similarity of the results of on-screen eye tracking compared with mobile eye tracking in the context of first fixation location on stimuli. Design/methodology/approach: Three studies were conducted with a total of 117 participants, in which the authors compared both methods: stationary eye tracking and mobile eye tracking. Findings: The studies […]

  • Affect and exertion during incremental physical exercise: Examining changes using automated facial action analysis and experiential self-report

    Open Access | Peer-Reviewed | 11/02/2020 | University of Potsdam

    Abstract: Recent research indicates that affective responses during exercise are an important determinant of future exercise and physical activity. Thus far, these responses have been measured with standardized self-report scales, but this study used biometric software for automated facial action analysis to analyze the changes that occur during physical exercise. A sample of 132 young, […]

  • Willingness to Pay for Rose Attributes: Helping Provide Consumer Orientation to Breeding Programs

    Open Access | Peer-Reviewed | 01/02/2020 | University of Kentucky + 2

    Abstract: The value of floriculture exceeds $5.8 billion in the United States. Environmental challenges, market trends, and diseases complicate breeding priorities. To inform breeders’ and geneticists’ research efforts, we set out to gather consumers’ preferences in the form of willingness to pay (WTP) for different rose attributes in a discrete choice experiment. The responses are modeled in […]

  • Validating Physiological Stress Detection Model Using Cortisol as Stress Bio Marker

    Open Access | 01/02/2020 | University of Kentucky + 2

    Abstract: In this work, we present the validation of a stress detection model using cortisol as the stress biomarker. The proposed model uses two physiological signals, Galvanic Skin Response (GSR) and Photoplethysmogram (PPG), to classify stress into two levels. GSR and PPG signals were collected from a total of 13 participants along with saliva […]

  • The Use of Eye Tracking Technology in Aesthetic Surgery: Analyzing Changes in Facial Attention Following Surgery

    Open Access | Peer-Reviewed | 20/01/2020 | Cleveland Clinic + 2

    The ability to quantitatively analyze how we look at a face, and to determine whether this changes following facial surgery, should be of interest to the plastic surgeon. Eye tracking technology (ETT) records where observers fixate when viewing a facial image, enabling quantitative comparison of pre- and postoperative changes. […]

  • Eye-Tracking the City: Matching the Design of Streetscapes in High-Rise Environments with Users’ Visual Experiences

    Open Access | Peer-Reviewed | 01/01/2020 | Amsterdam University of Applied Sciences (AUAS), Centre of Applied Research Technology

    Abstract: Large cities in the West respond to an ever-increasing shortage of affordable housing by accelerating the process of urban densification. Amsterdam, for instance, aims to increase its housing stock by 10 percent in the next 15 years as its population is expected to grow by 20 percent. As in other cities, it seems inevitable […]

  • I Can See It in Your Face. Affective Valuation of Exercise in More or Less Physically Active Individuals

    Open Access | Peer-Reviewed | 19/12/2019 | University of Potsdam + 2

    Abstract: The purpose of this study was to illustrate that people’s affective valuation of exercise can be identified in their faces. The study was conducted with software for automatic facial expression analysis, and it involved testing the hypothesis that positive or negative affective valuation occurs spontaneously when people are reminded of exercise. We created […]

Share Your Research

850+ universities worldwide with an iMotions human behavior lab 

73 of the top 100 highest-ranked universities

710+ published research papers using iMotions 

The authors of these publications have used iMotions as a software tool within their research.

“Software should be cited on the same basis as any other research product such as a paper or a book; that is, authors should cite the appropriate set of software products just as they cite the appropriate set of papers” (Katz et al., 2020).
