Recognizing Emotion using DeepFace
Use DeepFace to analyze the emotion in an image. Pass the image, stored as a NumPy array, to DeepFace's `analyze` function. It returns a list with one dictionary per detected face, containing the percentage score of each emotion.
```python
from deepface import DeepFace

# analyze the image and store the dictionary of emotions in result
result = DeepFace.analyze(img, actions=['emotion'])

# print result
print(result)
```
Output:
[{'emotion': {'angry': 2.9941825391265356e-05, 'disgust': 3.6047339119136895e-10, 'fear': 0.00011003920865101386, 'happy': 97.65191646241146, 'sad': 0.0015582609232700413, 'surprise': 0.0032574247843123716, 'neutral': 2.343132812456228}, 'dominant_emotion': 'happy', 'region': {'x': 325, 'y': 64, 'w': 128, 'h': 128}}]
Extract the emotion with the highest percentage.
```python
# extracting emotion with highest percentage
query = str(max(zip(result[0]['emotion'].values(), result[0]['emotion'].keys()))[1])
print(query)
Output:
happy
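An equivalent, arguably clearer way to get the dominant emotion is `max` with a `key` function over the emotion dictionary. The sketch below hard-codes a sample `result` (values abbreviated from the output above) so it runs without DeepFace installed:

```python
# sample result in the shape returned by DeepFace.analyze (values abbreviated)
result = [{'emotion': {'angry': 2.99e-05, 'disgust': 3.60e-10, 'fear': 1.10e-04,
                       'happy': 97.65, 'sad': 1.56e-03, 'surprise': 3.26e-03,
                       'neutral': 2.34},
           'dominant_emotion': 'happy'}]

# pick the emotion key whose percentage is largest
query = max(result[0]['emotion'], key=result[0]['emotion'].get)
print(query)  # → happy
```

Note that `analyze` also returns this value directly under the `'dominant_emotion'` key, so `result[0]['dominant_emotion']` gives the same answer.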
Emotion Based Music Player – Python Project
In this article, we will discuss how to recommend music based on the dominant expression on someone's face. This is a basic project that uses OpenCV, Matplotlib, DeepFace, and the Spotify API.
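The recommendation step can be sketched as a simple lookup from the detected dominant emotion to a music search query that is later passed to the Spotify API. The mapping below is hypothetical (the genre choices and the `recommend_query` helper are illustrative assumptions, not part of the project's fixed design):

```python
# hypothetical mapping from DeepFace emotion labels to music search queries
emotion_to_query = {
    'happy': 'upbeat pop',
    'sad': 'mellow acoustic',
    'angry': 'heavy metal',
    'fear': 'calming ambient',
    'surprise': 'energetic dance',
    'disgust': 'classic rock',
    'neutral': 'lo-fi chill',
}

def recommend_query(dominant_emotion):
    # fall back to a neutral playlist for any unrecognized label
    return emotion_to_query.get(dominant_emotion, 'lo-fi chill')

print(recommend_query('happy'))  # → upbeat pop
```

The resulting string would then be used as the search term in a Spotify API request, which requires its own authentication and is covered separately.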