Shorthand aims to be a paradigm-shifting way to communicate artfully and logically through a fluid use of gestures, voice, and perhaps eye movement. At its very heart, Shorthand is a project inspired by my grandson, Camden. Camden was born without limbs, a condition known as phocomelia syndrome. He does have partial upper arms, and that is where this project comes into the story. It has been a dream of mine since Camden's birth to see him and those with similar conditions be able to:
- Grow new or regenerate limbs
- Have robot arms and legs!
- Be able to function in this world a little better through the use of artful creativity, technology, and community
This is a project with the heart to give Camden limbs, in a metaphorical sense, by providing a method of arm-control inputs (gestures) that translate to keyboard keystrokes, phonetics, or a number of other language outputs and special gesture-driven functions.
Before I share the original concept, please note that the method of input is still under discussion and may not completely match the concept below. Shorthand is conceptual at this stage, though we have a beginning team made up of programmers, designers, makers, and coordinators. We also have a few robot parts, some know-how, some don't-know-how (yet), and sheer determination. As Camden so often says, "I got this". Well buddy, we got this with you.
So, without further ado, the original concept is outlined below:
Applicability: people with physical disabilities or limited range of movement / artists / designers / musicians / alternative keyboard option / 3D virtual navigation / mouse alternative
Design: An ergonomic and aesthetically pleasing enclosure housing two joysticks with a minimum of five points of movement each (left, right, up, down, press), possibly with additional "function" buttons. The joysticks and buttons connect via a USB interface unit to a Raspberry Pi 3B or greater. Each joystick position, and each simultaneous combination of the two joystick positions, translates to a unique "keyboard key press"; a minimal sketch of this mapping follows below. Buttons can serve as Shift, Caps Lock, or Alt keys, macro combinations, app-launcher keys, etc.
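The chord-to-key idea can be illustrated with a small Python sketch (Python being the planned language, per the requirements below). Everything here is a placeholder: the `CHORDS` table is an arbitrary example layout, and `read_joysticks()` stands in for whatever the USB interface unit actually reports.

```python
# Minimal sketch of the two-joystick "chord" to keystroke mapping.
# The chord table and read_joysticks() are illustrative placeholders;
# real positions would come from the USB / switch interface unit.

CENTER, LEFT, RIGHT, UP, DOWN, PRESS = "center", "left", "right", "up", "down", "press"

# Example chord table: (left-stick position, right-stick position) -> output.
# A full layout would cover the alphabet, digits, and special functions.
CHORDS = {
    (UP, CENTER): "a",
    (UP, UP): "b",
    (UP, RIGHT): "c",
    (LEFT, DOWN): "e",
    (DOWN, DOWN): "SPACE",
    (PRESS, PRESS): "ENTER",
}


def read_joysticks():
    """Placeholder: return the current (left, right) joystick positions."""
    return (UP, CENTER)


def chord_to_key(left_pos, right_pos):
    """Translate a pair of joystick positions into a key name, or None."""
    return CHORDS.get((left_pos, right_pos))


if __name__ == "__main__":
    left, right = read_joysticks()
    key = chord_to_key(left, right)
    if key is not None:
        print(f"Gesture ({left}, {right}) -> key press: {key}")
```

For rough capacity: two five-position sticks allow 5 × 5 = 25 simultaneous combinations (more if a stick left centered also counts as a position), so function buttons acting as shift or layer keys would likely be needed to cover a full keyboard.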
Interface option to switch to voice dictation / control through the existing Google AIY Voice HAT and API.
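As a very rough sketch of how that mode switch might work in software (again an assumption, not a settled design): a dedicated function button could toggle between gesture input and voice dictation, and the main loop would dispatch to whichever mode is active. The `read_gesture()` and `recognize_speech()` functions below are placeholders; the latter would wrap the actual Google AIY speech-to-text call.

```python
# Sketch of switching between gesture input and voice dictation.
# Both input functions are placeholders for the real hardware / AIY calls.

GESTURE_MODE, VOICE_MODE = "gesture", "voice"


def read_gesture():
    """Placeholder: return text produced by the joystick chord mapping."""
    return "a"


def recognize_speech():
    """Placeholder: wrap the Google AIY Voice Kit speech-to-text call."""
    return "hello world"


def handle_input(mode):
    """Dispatch to the input method selected by the current mode."""
    return recognize_speech() if mode == VOICE_MODE else read_gesture()


# A dedicated function button would flip the mode at runtime.
mode = GESTURE_MODE
print(handle_input(mode))   # -> "a"
mode = VOICE_MODE
print(handle_input(mode))   # -> "hello world"
```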
Operating System: Google AIY Raspbian image or suitable Linux distribution
Programming Languages required: Python
Parts:
- Raspberry Pi 3B or newer
- Google AIY Voice Kit
- 2 each 5-position switch joysticks
- 10 each arcade button switches
- 1 USB / switch interface kit
- LED / LCD lighting & display options considered
Equipment / Consumables:
- Soldering equipment
- PPE
- HDMI monitor
- USB Keyboard / Mouse
Useful Repositories
- enclosure-picroft: Mycroft interface for Raspberry Pi environment - https://github.com/MycroftAI/enclosure-picroft
- awesome-raspberry-pi: Curated list of Raspberry Pi repositories - https://github.com/blackout314/awesome-raspberry-pi