Because music conveys and evokes feelings, a wealth of research has been performed on music emotion recognition. Research has shown that musical mood is linked to features based on rhythm, timbre, melody, and lyrics; for example, sad music correlates with slow tempo, while happy music is generally faster. However, only limited success has been achieved in training automatic classifiers for the emotions of Hindustani classical music. In this book we have collected a ground-truth data set of 196 raga clips, each tagged with one of two emotions, "happy" or "sad". All recordings were restricted to 30-second segments for uniformity. Various sets of audio features were extracted using standard algorithms, and a musical mood classifier was trained on them. We found that including the pitch-contour probability as one of the features yields a 30% higher mood-recognition accuracy.
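The following is a minimal sketch of such a feature-extraction and classification pipeline, not the specific method used here: the actual feature set, classifier, and toolchain are not detailed above. It assumes librosa and scikit-learn, hypothetical clip paths and labels, and a simple feature vector combining tempo, MFCCs, chroma, and a pitch-contour summary.

import numpy as np
import librosa
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def extract_features(path, include_pitch_contour=True):
    # Load a 30-second mono clip for uniformity across recordings
    y, sr = librosa.load(path, duration=30.0, mono=True)
    feats = []
    # Rhythm: global tempo estimate
    tempo, _ = librosa.beat.beat_track(y=y, sr=sr)
    feats.append(float(np.atleast_1d(tempo).mean()))
    # Timbre: mean MFCCs
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)
    feats.extend(mfcc.mean(axis=1))
    # Melody/harmony: mean chroma
    chroma = librosa.feature.chroma_stft(y=y, sr=sr)
    feats.extend(chroma.mean(axis=1))
    if include_pitch_contour:
        # Pitch-contour summary from probabilistic YIN (voiced frames only)
        f0, voiced_flag, _ = librosa.pyin(y, fmin=librosa.note_to_hz('C2'),
                                          fmax=librosa.note_to_hz('C6'))
        voiced = f0[voiced_flag]
        if voiced.size:
            feats.extend([voiced.mean(), voiced.std()])
        else:
            feats.extend([0.0, 0.0])
    return np.array(feats)

# Hypothetical dataset layout: (clip_path, label) pairs tagged "happy" or "sad"
dataset = [("clips/raga_001.wav", "happy"), ("clips/raga_002.wav", "sad")]  # ... 196 clips
X = np.vstack([extract_features(p) for p, _ in dataset])
y = np.array([1 if lbl == "happy" else 0 for _, lbl in dataset])

# Train and evaluate a simple mood classifier with cross-validation
clf = SVC(kernel="rbf", C=1.0)
scores = cross_val_score(clf, X, y, cv=5)
print("Mean CV accuracy:", scores.mean())

Comparing the cross-validated accuracy with include_pitch_contour set to True versus False is one way to quantify the contribution of the pitch-contour feature reported above.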