Summary

AWARDED 2nd PLACE OVERALL AT SWAMPHACKS

Imagine going through your everyday conversations without being able to read body language, social cues, or emotions.

This is a reality for people with autism.

Unfortunately, this leads to people with high-functioning autism being socially ostracized, from early education to the workforce and beyond. Not because they do not understand emotion, but because they need to be explicitly told how someone is feeling in order to respond accordingly.

Our team set out to address this by building an application that reads another individual's emotions from an image fed to Microsoft's Emotion API.

What we built

We built our back end in R, exposing an API that accepts an image from our web application. We then sent that image to Microsoft's facial recognition API, which returned a JSON response with the relevant emotion scores.
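
A minimal sketch of that back-end call in R, assuming the httr and jsonlite packages; the endpoint URL, the MS_EMOTION_KEY environment variable, and the detect_emotion helper name are assumptions (the Emotion API has since been retired into the Face API, so the exact endpoint and response schema may differ):

```r
library(httr)
library(jsonlite)

# Assumed endpoint for Microsoft's Emotion API (region/path may differ)
endpoint <- "https://westus.api.cognitive.microsoft.com/emotion/v1.0/recognize"

# Subscription key read from the environment (hypothetical variable name)
key <- Sys.getenv("MS_EMOTION_KEY")

detect_emotion <- function(image_path) {
  # POST the raw image bytes with the subscription key header
  resp <- POST(
    endpoint,
    add_headers(
      "Ocp-Apim-Subscription-Key" = key,
      "Content-Type" = "application/octet-stream"
    ),
    body = upload_file(image_path)
  )
  stop_for_status(resp)

  # The API returns a JSON array of detected faces,
  # each with per-emotion confidence scores
  fromJSON(content(resp, as = "text", encoding = "UTF-8"))
}

# Example usage: scores for the first detected face
# detect_emotion("photo.jpg")$scores
```

The web application would simply forward the captured image to this function and relay the decoded scores back to the user.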

Status

No longer maintained.