From Facial Features to Facial Expressions
A. Raouzaiou, K. Karpouzis and S. Kollias
Image, Video and Multimedia Systems Laboratory
National Technical University of Athens
Outline
- The concept of archetypal expressions
- FAPs-based description and estimation of FAPs
- Expression synthesis using profiles
- Synthesis of intermediate emotions
Archetypal Expressions
Also termed universal expressions because they are recognized across cultures
[Figure: the archetypal expressions; source: F. Parke and K. Waters, Computer Facial Animation, A K Peters]
Archetypal Expressions (cont.)
- Description of the archetypal expressions through muscle actions: Action Units (AUs) of FACS
- Translation of facial muscle movements into FAPs, e.g. AU1 = raise_l_i_eyebrow + raise_r_i_eyebrow
- Creation of a FAP vocabulary for every archetypal expression, e.g. sadness: close_t_l_eyelid, close_t_r_eyelid, close_b_l_eyelid, close_b_r_eyelid, raise_l_i_eyebrow, raise_r_i_eyebrow, raise_l_m_eyebrow, raise_r_m_eyebrow, raise_l_o_eyebrow, raise_r_o_eyebrow (see the sketch below)
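As an illustrative sketch (not part of the original slides), the AU-to-FAP translation and the per-expression vocabulary can be held in simple lookup tables; the entries below cover only AU1 and the sadness vocabulary listed above:

```python
# Illustrative subset of a FACS-to-MPEG-4 translation table.
# AU numbers follow FACS; FAP names follow the MPEG-4 specification.
AU_TO_FAPS = {
    1: ["raise_l_i_eyebrow", "raise_r_i_eyebrow"],  # AU1: inner brow raiser
}

# FAP vocabulary of an archetypal expression: the FAPs it may animate.
SADNESS_VOCABULARY = [
    "close_t_l_eyelid", "close_t_r_eyelid",
    "close_b_l_eyelid", "close_b_r_eyelid",
    "raise_l_i_eyebrow", "raise_r_i_eyebrow",
    "raise_l_m_eyebrow", "raise_r_m_eyebrow",
    "raise_l_o_eyebrow", "raise_r_o_eyebrow",
]
```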
FAPs-based description
- Discrete features offer a neat, symbolic representation of expressions
- Not constrained to a specific face model
- Suitable for face cloning applications
- MPEG-4 compatible
- Based on feature points, not complete features
FAPs-based description (cont.)
Two issues should be addressed:
- choice of the FAPs involved in profile formation
- definition of FAP intensities
Expression synthesis
- The choice of FAPs is based on psychological data
- Intensities are derived from images in expression databases
Estimation of FAPs
- FAPs lack a clear quantitative definition
- FAPs can be modeled through the movement of FDP feature points, using distances s(x, y)
- e.g. close_t_r_eyelid (F20) and close_b_r_eyelid (F22): D13 = s(3.2, 3.4), f13 = D13 - D13_NEUTRAL
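A minimal Python sketch of this estimation, assuming feature points arrive as 2-D image coordinates keyed by their FDP point numbers (the helper names are hypothetical; MPEG-4 would additionally normalize the result into FAP units, which is omitted here):

```python
import math

def s(points, a, b):
    """Distance s(a, b) between two FDP feature points, e.g. s(points, "3.2", "3.4")."""
    (xa, ya), (xb, yb) = points[a], points[b]
    return math.hypot(xa - xb, ya - yb)

def estimate_f13(points, neutral_points):
    """Model FAP f13 as the change of the eye-opening distance D13 = s(3.2, 3.4)
    against the neutral face: f13 = D13 - D13_NEUTRAL."""
    return s(points, "3.2", "3.4") - s(neutral_points, "3.2", "3.4")
```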
Sample FAP vocabulary
Sadness: close_t_l_eyelid (F19), close_t_r_eyelid (F20), close_b_l_eyelid (F21), close_b_r_eyelid (F22), raise_l_i_eyebrow (F31), raise_r_i_eyebrow (F32), raise_l_m_eyebrow (F33), raise_r_m_eyebrow (F34), raise_l_o_eyebrow (F35), raise_r_o_eyebrow (F36)
Archetypal Expression Profiles
Profile: a set of FAPs accompanied by the corresponding range of variation
Sample Profiles of Anger
A1: F4 [22, 124], F31 [-131, -25], F32 [-136, -34], F33 [-189, -109], F34 [-183, -105], F35 [-101, -31], F36 [-108, -32], F37 [29, 85], F38 [27, 89]
A2: F19 [-330, -200], F20 [-335, -205], F21 [200, 330], F22 [205, 335], F31 [-200, -80], F32 [-194, -74], F33 [-190, -70], F34 [-190, -70]
A3: F19 [-330, -200], F20 [-335, -205], F21 [200, 330], F22 [205, 335], F31 [-200, -80], F32 [-194, -74], F33 [70, 190], F34 [70, 190]
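As a hedged sketch, a profile can be represented as a mapping from each FAP to its range of variation, from which a concrete expression is sampled; the data below is profile A2 from this slide, while the uniform sampling strategy is an assumption:

```python
import random

# Anger profile A2 from the slide: FAP number -> range of variation.
ANGER_A2 = {
    19: (-330, -200), 20: (-335, -205), 21: (200, 330), 22: (205, 335),
    31: (-200, -80), 32: (-194, -74), 33: (-190, -70), 34: (-190, -70),
}

def sample_expression(profile, rng=random):
    """Pick one concrete FAP value inside each FAP's range of variation."""
    return {fap: rng.uniform(lo, hi) for fap, (lo, hi) in profile.items()}
```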
Emotion representation
Emotions can be approached as points on a plane defined by activation and evaluation
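For illustration only, such a representation can be stored as (activation, evaluation) pairs; the coordinates below are hypothetical placeholders, not values from the slides:

```python
# Hypothetical (activation, evaluation) coordinates, both scaled to [-1, 1];
# actual placement comes from psychological studies, not from these values.
EMOTION_PLANE = {
    "joy":     (0.6,  0.8),
    "anger":   (0.8, -0.6),
    "fear":    (0.7, -0.7),
    "sadness": (-0.4, -0.6),
}
```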
Intermediate Expression Profiles
Same universal emotion category:
- animation of the same FAPs using different intensities, e.g. worry < fear < terror (see the sketch below)
- absence of expert knowledge for the (+, -) quadrant
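One way to realize this grading, sketched under the assumption that an intermediate profile is a scaled copy of the archetypal one (the scale factors are illustrative, not from the slides):

```python
def scale_profile(profile, factor):
    """Derive an intermediate-expression profile by scaling every FAP range
    of an archetypal profile by a positive intensity factor."""
    return {fap: (lo * factor, hi * factor) for fap, (lo, hi) in profile.items()}

# Illustrative grading within the fear category (factors are assumptions):
# worry  = scale_profile(fear_profile, 0.33)
# terror = scale_profile(fear_profile, 1.0)
```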
Intermediate Expression Profiles (cont.)
Different universal emotion categories, in the same evaluation half-plane:
- averaging of the FAPs used in the universal emotions
Intermediate Expression Profiles (cont.)
Different universal emotion categories, e.g. afraid + sad = depressed (see the sketch below)
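A minimal sketch of the averaging step for two profiles; how FAPs that appear in only one vocabulary are handled is an assumption, not stated on the slides:

```python
def merge_profiles(p1, p2):
    """Blend two archetypal profiles into an intermediate one by averaging
    the range bounds of shared FAPs; FAPs unique to one profile are kept
    at half intensity (an assumption)."""
    merged = {}
    for fap in set(p1) | set(p2):
        if fap in p1 and fap in p2:
            (lo1, hi1), (lo2, hi2) = p1[fap], p2[fap]
            merged[fap] = ((lo1 + lo2) / 2, (hi1 + hi2) / 2)
        else:
            lo, hi = p1[fap] if fap in p1 else p2[fap]
            merged[fap] = (lo / 2, hi / 2)
    return merged

# e.g. depressed_profile = merge_profiles(afraid_profile, sad_profile)
```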
Conclusions
- FAPs provide a compact and established means of emotion representation
- Necessary input from psychological and physiological studies
- Universal emotions can be used to synthesize intermediate ones
- Useful for low-bitrate MPEG-4 applications
Extensions: Verification and Evaluation
Initial results:
- acceptable performance for expression grading
- intermediate expressions: better results for the negative evaluation half-plane
- lack of linguistic rules for the (+, -) quadrant
Extensions: Personalized ECAs
- Detected facial feature points can be used to adapt a generic ECA head (FDP FPs)
- Intermediate emotions based on processing real data (FAP extraction)
- Processing real data: the temporal aspect of FAPs