%0 Journal Article
%A Seyedarabi, H.
%A Aghagolzadeh, A.
%A Khanmohammadi, S.
%A Kabir, E.
%T Analysis and Synthesis of Facial Expressions by Feature-Points Tracking and Deformable Model
%J Journal of Iranian Association of Electrical and Electronics Engineers
%V 4
%N 1
%U http://jiaeee.com/article-1-260-en.html
%R
%D 2007
%K Facial feature points, Probabilistic Neural Network, Deformable face model, Facial animation
%X Facial expression recognition is useful for designing new interactive devices that offer new ways for humans to interact with computer systems. In this paper, we develop a facial expression analysis and synthesis system. The analysis part of the system is based on facial features extracted from facial feature points (FFPs) in frontal image sequences. Selected facial feature points are automatically tracked using an improved cross-correlation-based motion-tracking algorithm, and feature vectors extracted from the positions of the FFPs in the first and last frames are used to classify expressions and Action Units (AUs) with a Probabilistic Neural Network (PNN). Compared with similar analysis work, we improve the recognition rate of the analysis system by using the improved motion-tracking algorithm and several new facial features. The synthesis part of the system uses a deformable patch object model that represents the facial features (eyes, eyebrows, and mouth) with small polygons. The coordinates of the vertices of these polygons can be changed based on the FFP positions in the original image. The synthesis part of the system can be used as an on-line personalized facial animator, by tracking FFPs in the original image, or as a generic facial expression animator, by applying a set of parameters for each AU code (off-line animation). The proposed deformable model has a simple structure and uses a small set of control points compared with similar face models.
%> http://jiaeee.com/article-1-260-en.pdf
%P 11-19
%& 11
%!
%9 Research
%L A-10-1-198
%+
%G eng
%@ 2676-5810
%[ 2007