Whole Home Voice Control
Contributors: Hendrik Hagala (Research)
Gesture Control for Lights and Audio
Problem
SeaPod residents need a smart, hands-free way to control the systems in their smart home.
Solution
The idea is to develop a user programmable gesture-controlled home system, where people can program their own gestures to control lights, audio, emergency commands and SeaPod steering.
The system has to be smart enough to recognise users and know which commands each user is permitted to issue. For example, the owner might want to give SeaPod steering access to a friend but not to the kids.
Also, the system has to be secure so that there is no possibility of data leakage.
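As a rough sketch of how recognised users and per-command permissions might fit together, here is a minimal Python example; the user names, command names and data structure are illustrative assumptions, not part of this specification.

```python
# Hypothetical sketch of per-user command permissions (illustrative only).
from dataclasses import dataclass, field

COMMANDS = {"lights", "audio", "emergency", "steering"}

@dataclass
class User:
    name: str
    allowed: set = field(default_factory=set)  # subset of COMMANDS this user may trigger

def can_execute(user: User, command: str) -> bool:
    """Return True if the recognised user is permitted to issue this command."""
    return command in COMMANDS and command in user.allowed

owner  = User("owner",  allowed=set(COMMANDS))
friend = User("friend", allowed={"lights", "audio", "steering"})  # steering granted
kid    = User("kid",    allowed={"lights", "audio"})              # steering withheld

assert can_execute(friend, "steering")
assert not can_execute(kid, "steering")
```

In a real system, the gesture recogniser would first identify the user (visual identity) and then check this kind of permission table before dispatching the command.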
Prize
- Get credited as a Project Contributor to the Ocean Builders Project
- Turn this into your own entrepreneurial business venture and we will be your first customers and help bring you media attention and customers
- Get Entrepreneurial Business Coaching to start this as a business
And here are some potential benefits:
- Mass exposure with highly visible project
- Build reputation
- Recognized as an official collaborator and/or on GitHub
- Get noticed
- Product development experience
- Work on projects you are passionate about
- Get your project built and working in the real world
- Participate in interesting work
- Get grants (maybe partner with someone that can help with this or exposure to grant writers)
- Change the world
Industry
There are a few gesture-controlled home systems on the market, such as Single Cue, which lets you control your home media devices with finger movements, and Fibaro Swipe. Unfortunately, their reviews are not very good.
Listed below are some open-source projects that might be helpful for your R&D:
Teachable Machine is a web-based tool that you can use to train a computer to recognize your own images, sounds, & poses.
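If you train a gesture classifier with Teachable Machine and export it as a Keras model, loading it in Python might look roughly like the sketch below. The file names, input size and normalisation are assumptions based on a typical Teachable Machine image-model export and may differ for your project.

```python
# Hypothetical sketch: classifying one webcam frame with a model exported
# from Teachable Machine (Keras .h5 export assumed).
import cv2
import numpy as np
from tensorflow.keras.models import load_model

model = load_model("keras_model.h5")                      # file name assumed from a typical export
labels = [line.strip() for line in open("labels.txt")]    # one class label per line (assumed)

cap = cv2.VideoCapture(0)
ok, frame = cap.read()
cap.release()

if ok:
    img = cv2.resize(frame, (224, 224))                   # typical Teachable Machine input size
    img = (img.astype(np.float32) / 127.5) - 1.0          # typical normalisation to [-1, 1]
    pred = model.predict(img[np.newaxis, ...])
    print("Detected gesture:", labels[int(np.argmax(pred))])
```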
Here you can learn how to make a gesture-controlled Arduino robot using PictoBlox AI’s machine learning feature.
In this video, the developer demonstrates how to build a gesture recognition system and use it to control the media player on the computer. OpenCV + Machine Learning + GUI Automation were used to do this.
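As an illustration of the GUI-automation part of that approach, the sketch below maps an already-recognised gesture label to a media key press using pyautogui. The gesture labels are made up for the example, and the media key names may behave differently depending on the operating system.

```python
# Hypothetical sketch: turning a recognised gesture label into a media action.
# Assumes a separate recognizer (e.g. OpenCV + ML) has already produced `gesture`.
import pyautogui

GESTURE_TO_KEY = {
    "palm_open":  "playpause",   # toggle play/pause
    "swipe_up":   "volumeup",
    "swipe_down": "volumedown",
    "fist":       "volumemute",
}

def handle_gesture(gesture: str) -> None:
    key = GESTURE_TO_KEY.get(gesture)
    if key:
        pyautogui.press(key)     # send the media key to the operating system

handle_gesture("palm_open")      # e.g. pause the media player
```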
Here is a project where Alexa responds to sign language using a webcam and TensorFlow.js.
Here you can read an article about how an open source Google sign language AI turns hand gestures into speech.
In this tutorial (see here), you can learn how to build an American Sign Language translator using computer vision and a machine learning model.
MediaPipe Hands is a high-fidelity hand and finger tracking solution. It employs machine learning (ML) to infer 21 3D landmarks of a hand from just a single frame. MediaPipe also offers other solutions for detecting faces, irises, poses and more.
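A minimal sketch of reading those 21 landmarks from a webcam with the MediaPipe Python solutions API could look like this; the confidence threshold and the choice of landmark to print are arbitrary for the example.

```python
# Minimal sketch: per-frame hand landmarks with MediaPipe Hands.
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands
cap = cv2.VideoCapture(0)

with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.7) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV captures BGR.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            lm = results.multi_hand_landmarks[0].landmark  # 21 normalised (x, y, z) points
            print("index fingertip:", lm[8].x, lm[8].y, lm[8].z)
        cv2.imshow("hands", frame)
        if cv2.waitKey(1) & 0xFF == 27:                    # Esc to quit
            break

cap.release()
cv2.destroyAllWindows()
```

From landmark positions like these, you can derive your own gesture definitions (for example, which fingers are extended) and feed them into the command and permission logic described above.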
Information
Repository
<text>
License Requirement
Open Source: Can be used for private or commercial projects
Software: GNU General Public License (GNU GPL V3) here
Non-Software: Creative Commons (CC BY-SA 4.0) here
Project Areas
- Software (ML, AI, Computer Vision, Neural Networks)
- More?
Objectives
The first part of the project should be research into which gestures to use for particular commands.
The second part of the project is to develop a gesture control system for lights, audio, emergency commands and steering.
Project’s Requirements
Stages and deadlines
Project Start | date
Team Formed | date
Market Research Summary (Report) | date
Project Plan Complete | date
Preliminary Product Design Complete | date
Prototype Development Complete | date
Prototype Evaluation Complete | date
Product Presentation | date
Project Completion | date
Project plan should cover the following:
- stages/milestones of the project (not all stages are shown in the table above)
- activities or tasks in each phase
- task start and end dates
- interdependencies between tasks
Also:
- skills needed
- responsibilities of each team member (identify as many as you can).
Preliminary product design should cover the following:
- production components, raw material
- system block diagram (This diagram specifies each electronic function and how all of the functional components interconnect).
- preliminary Bill of Materials (BOM)
- production cost estimation
- performance
- features
- development feasibility
- manufacturability
Product’s requirements
Feature | Basic | Advanced
Can you control the audio system? | |
Can you control lights? | |
Can you control the TV? | |
Can you control SeaPod steering? | |
Can you control functions for emergencies? | |
Is it user programmable (users can program their own gestures)? | |
Can the system recognize users (visual identity)? | |
Can you select who has access to particular home control functions? | |
Can the user create their own gestures? | |
Is the data sandboxed to protect privacy? | |
Can it recognize gestures at any angle? | |
Can it recognize gestures in the dark? | |
Can it easily understand users' gestures? | |
Is it user friendly to set up? | |
Is it more convenient to use than a remote control? | |
Can I use it in any room? | |
Can I use it in any location in the room? | |
Project video link:
https://www.dropbox.com/s/j44y0z574mt1ohh/VoiceControl.mp4?dl=0
This project is being developed as an open-source project with the following licensing:
- Software: GPL-3.0 - https://www.gnu.org/licenses/gpl-3.0.en.html
- Hardware, Design & other Intellectual Property: CC-BY-SA-4.0 - https://creativecommons.org/licenses/by-sa/4.0/