Automatic recognition of emotions is a challenging task and can be performed using single-modal or multimodal inputs. It can enhance the effectiveness of human–machine interaction systems and is a need of the time, as it has applications in various domains. This study discusses experiments carried out to capture and analyze data related to human health. Images of the subjects under study are captured along with their health data, and the degree of presence or absence of each of the seven universally accepted emotions is derived from the images and depicted in the form of an emotion profile. Facial affect recognition is performed on the MIST database, a locally created, context-specific database of images accompanied by health data such as pulse rate and systolic and diastolic blood pressure. Emotions in the MIST database are categorized as neutral, positive, and negative, and an accuracy of 91.38% is obtained. The affect health data are analyzed for deviation in the positive and negative emotion categories with reference to the neutral category. The inference drawn from this analysis is that the deviation observed for negative emotions with respect to the neutral category falls into high deviation ranges for a greater number of subjects than the corresponding deviation for positive emotions.