TU Darmstadt, Master Thesis, 2017
Human-computer interaction can be made easier if computers understand a person's emotions. Over the years, research in emotion recognition has mainly focused on facial expressions, voice analysis, and handwriting. Apart from these conventional methods, body movements, body postures, gestures, and the quality of movement can also be used to differentiate basic or fundamental emotions such as happiness, anger, fear, sadness, and surprise. For instance, in the case of fear the body contracts and the muscles tighten, whereas in the case of happiness the muscles are more relaxed and the body tends to occupy more space. Recognizing a person's emotions solely from his or her movements would enable more efficient communication between human and machine.
This master thesis is based on the idea of a machine being able to recognize emotions from the postures and movements of a human. A couch serving as smart furniture, with capacitive proximity sensors integrated into it, was used to detect postures, which were in turn used to predict fundamental emotional states including anxiety, happiness, sadness, relaxation, and being focused/interested. An Android application was developed to predict a person's postures in real time using machine learning classification algorithms.
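The posture-detection step described above can be illustrated with a minimal sketch. The sensor layout, readings, and per-posture centroids below are assumptions for illustration only (the thesis's actual sensor configuration and classifier are not specified here); the sketch classifies a vector of capacitive proximity readings with a simple nearest-centroid rule.

```python
import math

# Hypothetical mean readings of four couch sensors per posture (assumed data,
# not taken from the thesis). Each tuple is one normalized sensor vector.
CENTROIDS = {
    "sitting_upright": (0.9, 0.8, 0.1, 0.1),
    "lying_down":      (0.7, 0.7, 0.8, 0.9),
    "leaning_forward": (0.5, 0.9, 0.2, 0.0),
}

def predict_posture(reading):
    """Return the posture whose centroid is closest to the sensor reading."""
    return min(CENTROIDS, key=lambda p: math.dist(reading, CENTROIDS[p]))

# A reading dominated by the seat/backrest sensors maps to lying_down here.
print(predict_posture((0.8, 0.75, 0.75, 0.95)))
```

In practice a trained classifier (as in the Android application) would replace the hand-written centroids, but the input/output shape of the problem is the same: sensor vector in, posture label out.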
A relation between postures and movements on the one hand and emotions on the other was established and served as a baseline for the prediction of emotions. For the recognition of the aforementioned emotions, the detected movements and postures were analyzed and evaluated using various machine learning classification algorithms. Furthermore, these algorithms were compared with respect to performance, and the one with the best accuracy was chosen.
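The model-selection step above can be sketched as follows. The candidate models, labels, and held-out test set are toy assumptions, not the thesis's actual classifiers or data; the point is only the mechanism of scoring each candidate by accuracy and keeping the best.

```python
def always_sitting(x):
    """Trivial baseline: predicts the majority class regardless of input."""
    return "sitting"

def threshold_rule(x):
    """Toy rule on one assumed sensor channel (index 2 = backrest proximity)."""
    return "lying" if x[2] > 0.5 else "sitting"

# Hypothetical held-out set: (sensor reading, true posture label).
TEST_SET = [
    ((0.9, 0.8, 0.1), "sitting"),
    ((0.7, 0.7, 0.9), "lying"),
    ((0.8, 0.9, 0.2), "sitting"),
    ((0.6, 0.7, 0.8), "lying"),
]

def accuracy(model):
    hits = sum(model(x) == y for x, y in TEST_SET)
    return hits / len(TEST_SET)

candidates = {"baseline": always_sitting, "threshold": threshold_rule}
best = max(candidates, key=lambda name: accuracy(candidates[name]))
print(best, accuracy(candidates[best]))
```

The thesis compares full classification algorithms rather than hand-written rules, but the selection criterion is the same: evaluate each candidate on held-out data and keep the most accurate one.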
This thesis also discusses in depth the various methods that were used to evoke emotions in a human being during the evaluation experiments. After emotions have been successfully evoked and predicted, the results can be used in various smart home applications.