UTkzhang/eMotion
# eMotion

## Inspiration

**Mobility.** Traditional home assistants are static and gimmicky. We want a home companion that can physically do things for you.

**Compatibility.** Voice commands are nearly universal, so we want a companion device that can aid everyone. Our machine learning models can be trained so that eMotion helps find any object around you.

**Adaptability.** As machine intelligence grows, we want to harness its power to create a helper that learns and improves over time.

## What it does

Through its integration with Google Assistant, eMotion recognizes your voice and acts on your commands. Using an onboard camera and machine learning algorithms, it identifies objects in its surroundings and brings them to you on request. Its tracked drive lets it accompany you anywhere, making it a true smart companion.
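Once the vision model has produced detections, the robot still has to decide which one corresponds to the object you asked for. The sketch below shows one plausible selection step, assuming detections arrive as (label, confidence, bounding box) tuples; the detection format, labels, and threshold are illustrative assumptions, not the project's actual model output.

```python
# Selection step: given detections from an object-detection model,
# pick the most confident match for the object the user requested.
# The Detection format and the 0.5 confidence threshold are assumed
# for illustration; eMotion's real model output may differ.

from typing import NamedTuple, Optional, Tuple, List


class Detection(NamedTuple):
    label: str
    confidence: float
    box: Tuple[int, int, int, int]  # (x, y, width, height) in pixels


def best_match(detections: List[Detection],
               requested: str,
               threshold: float = 0.5) -> Optional[Detection]:
    """Return the most confident detection of the requested label,
    or None if nothing clears the confidence threshold."""
    candidates = [d for d in detections
                  if d.label == requested and d.confidence >= threshold]
    return max(candidates, key=lambda d: d.confidence, default=None)
```

The returned bounding box would then drive the robot's navigation toward the object, or trigger a "couldn't find it" response when `None` comes back.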

## How we built it

Using Google Dialogflow, we integrate with Google Assistant on Android phones or Google Home. Dialogflow then communicates with a DragonBoard 410c running Linaro Linux, which sends serial instructions to the Arduino on the robot.
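The DragonBoard's job in this chain is to translate a fulfilled Dialogflow intent into bytes the Arduino can parse off its serial port. A minimal sketch of that bridge is below; the intent names and the one-byte opcode protocol are illustrative assumptions, not the project's actual wire format.

```python
# Sketch of the DragonBoard-side bridge: map a Dialogflow intent
# (plus an optional parameter, e.g. the requested object) to a
# newline-terminated serial instruction for the robot's Arduino.
# The intent names and opcodes below are hypothetical.

OPCODES = {
    "fetch_object": b"F",  # fetch the named object
    "follow_me":    b"W",  # follow the user
    "stop":         b"S",  # halt all motion
}


def command_bytes(intent: str, parameter: str = "") -> bytes:
    """Build the serial instruction the Arduino would parse."""
    opcode = OPCODES.get(intent)
    if opcode is None:
        raise ValueError(f"unknown intent: {intent}")
    return opcode + parameter.encode("ascii") + b"\n"


# On the actual board the bytes would go out over a serial link,
# e.g. with pyserial (not run here, requires the hardware):
#   import serial
#   with serial.Serial("/dev/ttyUSB0", 9600) as port:
#       port.write(command_bytes("fetch_object", "keys"))
```

Keeping the protocol to one opcode byte plus an ASCII payload keeps the Arduino-side parser trivial: read one byte to dispatch, then read the rest of the line as the argument.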

## What's next for eMotion

As eMotion sees more use, its computer vision models can be trained to recognize more objects. With the help of additional sensors, eMotion will navigate its environment more reliably when searching for what you request.
