KinesicMouse

Next-gen hands-free computer access with 50+ input signals


WINNER: Best Mobility Solution

WINNER: Grand Prize


Jason DaSilva wants to access his editing tools and applications on his computer hands-free. Professional filmmaking software is not accessible without precise mouse and keyboard control, and keyboard shortcuts give quick access to commonly used tools, speeding up the editing process.

With the KinesicMouse, Jason gets full access to his editing tools. The KinesicMouse features multiple mouse modes that allow pixel-precise cursor placement; even the tiniest menu buttons can be clicked. Jason can not only operate his mouse, including all the mouse buttons and the wheel, he can also trigger keyboard shortcuts by performing assigned facial expressions.

What is it?

The KinesicMouse is a software application that lets users control a PC completely hands-free. It detects head rotation and facial expressions, which can be mapped to any mouse, keyboard, and joystick input. The KinesicMouse can distinguish between more than 50 different facial movements and expressions.

As an example: the user can place the mouse cursor by rotating the head up and down. Tilting the head sideways is ideal for rotating the mouse wheel up and down. Mouse actions such as left click, right click, double click, and click-and-drag can be assigned directly to facial expressions. The most common expressions are stretching the left or right lip corner, raising the eyebrows, puckering the lips, opening the mouth, and moving the jaw left or right. In total there are more than 50 signals that can be assigned to best fit the user’s needs.
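To make the idea concrete, the expression-to-input assignment can be pictured as a simple lookup table. The following is only an illustrative sketch; the signal names, action names, and dispatch loop are assumptions for explanation, not the KinesicMouse’s actual API:

```python
# Hypothetical sketch: mapping detected facial signals to input actions.
# All names here are illustrative assumptions, not the real KinesicMouse API.

bindings = {
    "lip_corner_left": "mouse_left_click",
    "lip_corner_right": "mouse_right_click",
    "eyebrows_raise": "mouse_double_click",
    "pucker_lips": "mouse_drag_toggle",
    "mouth_open": "key_ctrl_z",   # e.g. an editing shortcut
    "jaw_left": "wheel_up",
    "jaw_right": "wheel_down",
}

def dispatch(detected_signals):
    """Return the input actions triggered by the currently detected signals."""
    return [bindings[s] for s in detected_signals if s in bindings]

# Signals without a binding (here: "smile") are simply ignored.
print(dispatch(["mouth_open", "jaw_left", "smile"]))
```

Because the table is just data, any of the 50+ signals can be bound, rebound, or left unused to fit the individual user.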

How does it work?

All that is required is a compatible 3D camera. Users can choose between Microsoft Kinect and Intel RealSense camera technology. Both are consumer products that can be bought in any electronics store. These sensors are manufactured in high volumes and are very affordable. The KinesicMouse leverages this mass-market hardware and turns these sensors into state-of-the-art assistive technology for computer access.

As soon as the user’s face is visible to the camera, facial tracking starts. No switches, joysticks, or body-attached tracking markers are required. The tracking range is very flexible and allows the user to reposition without having to readjust the equipment.

Anything special?

Where the KinesicMouse shines is its efficiency. Head- or mouth-controlled input devices usually offer only up to two different control signals, and expensive hardware switches are needed to add extra inputs. The KinesicMouse does not need any of that. The large number of detected expressions allows a trained user to assign more than 20 different simultaneous inputs. This is even enough to play demanding video game titles.

Because every disability is unique in its form, the KinesicMouse is very flexible. All settings are highly adjustable, and users can assign only the expressions they are able to perform. If someone is not able to rotate their head, facial expressions can be used for mouse cursor placement instead.

Setups can be saved in specific profiles so that the user can adjust the controls for each application individually. In a photo editing tool you might want to bind certain shortcuts, while in a video game you may want to use the same expression for character movement. All these setups can be saved and loaded on demand.
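The profile mechanism described above can be sketched as saving and loading per-application binding tables. This is a minimal illustration under assumed names and an assumed JSON format; it is not the KinesicMouse’s real profile format:

```python
import json

# Hypothetical sketch: per-application profiles stored as JSON files.
# File names, format, and bindings are assumptions for illustration only.

def save_profile(path, bindings):
    """Persist one application's expression-to-action bindings."""
    with open(path, "w") as f:
        json.dump(bindings, f, indent=2)

def load_profile(path):
    """Load a previously saved profile on demand."""
    with open(path) as f:
        return json.load(f)

# The same expression can mean different things in different applications.
photo_profile = {"eyebrows_raise": "key_ctrl_s", "mouth_open": "key_ctrl_z"}
game_profile = {"mouth_open": "key_w"}

save_profile("photo_editor.json", photo_profile)
save_profile("game.json", game_profile)
```

Switching applications then amounts to loading a different file, which mirrors the save-and-load-on-demand behavior described above.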

Who can actually benefit from this solution?

Basically anyone who is unable to use their hands to control a mouse, keyboard, or touch device. The KinesicMouse is already used by people affected by various disabilities. Examples are stroke, cerebral palsy, Parkinson’s, muscular dystrophy (Duchenne), carpal tunnel syndrome, and spinal cord injury.

Why did you develop this solution?

I am not affected by any condition that might cause my hands to fail. I am a passionate video gamer, and my goal is to create the most efficient and affordable assistive solutions. I started with this vision in mind three years ago. Now I have created two solutions that prove this dream can actually become reality. Winning this challenge would be a huge boost to the development of these tools.

Try it out