Researchers generally agree that human emotions correspond to the execution of a number of computations by the nervous system, but there is strong disagreement on what these computations are. One highly contentious point is the perception of emotion through facial expressions. That is, which are the emotion signals produced by a sender that are visually recognized by an observer? The overarching goal of our research is to identify these signals and specify the form and dimensions of the computational model of their visual recognition. Our general hypothesis is that the human visual system solves the inverse problem of production. In the first five years of funding, we have studied the hypothesis that consistent and differential facial muscle actions (called action units, or AUs) are a subset of these computations executed by the nervous system. That is, when experiencing the same emotion, identical AUs are used by all people. Additionally, these AU activations are differential between emotions. Thus, our general hypothesis states that the goal of the visual system is to identify which AUs are present in an image of a face. To date, we have completed several computational, behavioral and imaging studies favoring this hypothesis. These studies identified a previously unknown set of emotive signals called compound facial expressions of emotion. However, these results were given by an analysis of posed facial expressions of emotion filmed in controlled conditions in the laboratory. The first specific aim of this renewal is to assess the validity of these results on facial expressions seen outside the laboratory (called "in the wild"). Since the heterogeneity of spontaneous facial expressions observed outside of the lab is larger than that seen in posed expressions, we hypothesize that facial expressions in the wild communicate an even larger number of emotions than those previously identified. These studies, however, address only the computations executed by the nervous system that yield consistent and differential movements of one's facial muscles.
In our second aim, we hypothesize that the computations executed by the nervous system also involve changes in blood flow, e.g., by increasing blood flow in the face when expressing anger. Specifically, we hypothesize that these changes are consistent within and differential between emotions. Thus, our general hypothesis implies that the visual system must identify these facial color changes. We will also test the alternative hypothesis that facial color is used to communicate other emotive variables, e.g., valence and arousal. Our third and final aim will examine the neural mechanisms associated with the computations studied in the previous two aims. Our studies are timely because a lack of understanding of the computations of emotion poses a critical barrier to progress in basic and clinical research. Specifically, researchers are divided on whether emotion categories identified in the laboratory also exist in the wild, there is a poor understanding of which emotion signals are communicated through facial expressions, and it is unclear which neural mechanisms are associated with these computations.

Public Health Relevance

Our computational model and results have the potential to challenge current practices in, and the scientific understanding of, psychopathologies. Diagnosis of psychopathologies primarily relies on a checklist of symptoms and behaviors that has been compiled after years of clinical observation, self-report and statistical analysis. Symptom-based diagnosis typically leads to heterogeneity and reification. The new frontier in the diagnosis of psychopathologies is the development of tools and protocols that can classify disorders based on observable behavior defining known variants of the computational model of neurotypicals. Given the relevance of emotion perception in a large number of psychopathologies, our understanding of the computational model of the visual perception of facial expressions of emotion is essential to overcoming the scientific barrier that currently prevents such a shift in translational and clinical research.

Agency
National Institutes of Health (NIH)
Institute
National Eye Institute (NEI)
Type
Research Project (R01)
Project #
2R01EY020834-06A1
Application #
9428905
Study Section
Special Emphasis Panel (ZRG1)
Program Officer
Wiggs, Cheri
Project Start
2010-09-30
Project End
2019-09-29
Budget Start
2018-09-30
Budget End
2019-09-29
Support Year
6
Fiscal Year
2018
Total Cost
Indirect Cost
Name
Ohio State University
Department
Engineering (All Types)
Type
Biomed Engr/Col Engr/Engr Sta
DUNS #
832127323
City
Columbus
State
OH
Country
United States
Zip Code
43210
Pumarola, Albert; Agudo, Antonio; Martinez, Aleix M et al. (2018) GANimation: Anatomically-aware Facial Animation from a Single Image. Comput Vis ECCV 11214:835-851
Zhao, Ruiqi; Wang, Yan; Martinez, Aleix M (2018) A Simple, Fast and Highly-Accurate Algorithm to Recover 3D Shape from 2D Landmarks on a Single Image. IEEE Trans Pattern Anal Mach Intell 40:3059-3066
Martinez, Aleix M (2017) Computational Models of Face Perception. Curr Dir Psychol Sci 26:263-269
Martinez, Aleix M (2017) Visual perception of facial expressions of emotion. Curr Opin Psychol 17:27-33
Zhao, Ruiqi; Martinez, Aleix M (2016) Labeled Graph Kernel for Behavior Analysis. IEEE Trans Pattern Anal Mach Intell 38:1640-50
Hamsici, Onur C; Martinez, Aleix M (2016) Multiple Ordinal Regression by Maximizing the Sum of Margins. IEEE Trans Neural Netw Learn Syst 27:2072-83
Benitez-Quiroz, C Fabian; Wilbur, Ronnie B; Martinez, Aleix M (2016) The not face: A grammaticalization of facial expressions of emotion. Cognition 150:77-84
Srinivasan, Ramprakash; Golomb, Julie D; Martinez, Aleix M (2016) A Neural Basis of Facial Action Recognition in Humans. J Neurosci 36:4434-42
Du, Shichuan; Martinez, Aleix M (2015) Compound facial expressions of emotion: from basic research to clinical applications. Dialogues Clin Neurosci 17:443-55
Du, Shichuan; Tao, Yong; Martinez, Aleix M (2014) Compound facial expressions of emotion. Proc Natl Acad Sci U S A 111:E1454-62

Showing the most recent 10 out of 25 publications