xOSC keygloves
In the randform post “Gesture steered Turing machine” I used data gloves, which were made following the instructions of Hannah Perner-Wilson, a member of the gloves project. Being weary of sitting too much at the computer, I had also written in that post that I would like to make more use of body movements, and in particular to include dance-like movements in computer interaction, especially in programming.
Unfortunately, rather shortly after I had written the post, a not so nice medical condition in my vicinity, which was (at least partially) due to too much sitting at the computer, urged me to speed up this project more or less dramatically.
The gesture recognition for my gloves, which were used in the Turing machine example, works, but it is not yet fine-grained and exhaustive enough. So I had to look for an easy, fast, affordable, and at least to some extent workable solution that would ensure a more direct and precise way of steering, like some version of keygloves. To make it short: in the end I made my own version with Tim's help. Again, it's only a start, but still.
But let's look into this a bit. Data or wired gloves, and even keygloves, already have quite a long history.
The Wikipedia article on wired gloves describes this a bit and even contains a link to a survey article on data gloves which itself dates from 1994 (paywall).
A good overview of more current commercial and DIY data gloves can be found at the website of the gloves project.
The gloves project is a collaborative effort to build a commercially viable data glove called the mimu glove. The gloves project was already mentioned in the randform post “Gesture steered Turing machine”. The mimu gloves, as well as the gloves I used in that Turing machine post, use gesture recognition. But as already said in the beginning, the gesture recognition for my gloves works, but it is not yet fine-grained and exhaustive enough. I haven't tried the mimu gloves; I am pretty sure their gesture recognition works way better than that of my DIY gloves. But apart from the fact that I can't really afford the mimu gloves, I am also not sure whether the current version (which I think doesn't include keys?) would be sufficient for typing. They might be more suitable for the fluid gestures needed in musical applications. In a somewhat similar product called gest, the kickstarter page likewise states:
Our typing system is an experimental feature, and to make it great we need some brave folks to help test and improve the system.
And concerning exhaustiveness: according to a blog post at Android Headlines, a tapping device called tap strap, which uses a physical add-on, also offers only 31 gestures. This is of course already something, and it would be more or less sufficient if some of the gestures were used for switching modes (reserving, say, two of the 31 gestures as mode switches would still leave 29 symbols per mode, i.e. 87 symbols across three modes). But switching between different keyboard modes is not as intuitive as direct visual and haptic cues. And apart from that, as of now it is still in development, and even the price seems unclear.
Similar things hold for direct gesture tracking devices via radar, video imaging etc., as they can e.g. be found in Wikipedia's gesture recognition article. But, not least due to the VR applications, big companies are jumping in here, so this might soon be an option. See in particular the paper on a VR controller for the Google Cardboard VR.
A very pronounced and clear type of “gesture recognition” can of course be achieved by using switches, like the buttons on your keyboard. Those can even provide immediate tactile/mechanical feedback, which I consider rather important. And I mean here really rather clear on/off switches, even if “soft switches” sound interesting, like those in this glove project by the cit-ec lab or Google Jacquard's tapping fabrics (which, strictly speaking, seem to work not as a switch via resistance but probably rather via capacitance measurements; by the way, not to be confused with this Jacquard project by Hannah et al). At least on hard backgrounds, “soft switches” may allow for more unusual gestures.
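Just to illustrate the difference (and not as a description of any of the projects above): here is a minimal MicroPython-style sketch for an ESP32-class board, where a mechanical key is a clean digital on/off input, while a resistive “soft switch” has to be thresholded from an analog reading. Pin numbers and the threshold are placeholders.

```python
from machine import Pin, ADC
import time

key = Pin(5, Pin.IN, Pin.PULL_UP)   # mechanical key switch: unambiguous on/off
soft = ADC(Pin(34))                 # fabric sensor in a voltage divider, read as 0-4095

SOFT_THRESHOLD = 2000               # guessed cutoff; a real sensor needs calibration

while True:
    key_pressed = key.value() == 0               # pull-up wiring: pressed pulls the pin low
    soft_pressed = soft.read() < SOFT_THRESHOLD  # assuming pressure lowers the reading
    print(key_pressed, soft_pressed)
    time.sleep_ms(50)                            # crude debounce / polling interval
```

The mechanical switch needs no calibration at all, which is part of why I find it so much clearer as a “gesture”.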
Tim has been using a twiddler for his real-time whiteboard typing during math lectures for quite some years now. But I have rather small hands and didn't feel so comfortable with the twiddler. Moreover, I like the textile nature of gloves. So I was looking for gloves with keys, i.e. keygloves.
Keygloves, as a subcategory of data gloves, have their own history, and one of the earlier affordable DIY keygloves seems to be the keyglove from October 2000 from the EyeTap lab at the University of Toronto, which is also mentioned in this keyglove survey from ten years ago.
The summary at the gloves project also contains a project which makes keygloves, but this seems so far to be a DIY version as well. Similar things seem to hold for a Berlin keyglove project, apparently presented at a hackathon and at a youth-only event. And I needed an affordable solution fast, a couple-of-weekends solution so to say. So finally we ended up making our own fast prototype. See the video above and the description on astlab. I currently have a bad cold; I hope you understand the video.
An important outcome was that custom tailoring was rather key. So this version is handmade, and of course the handicraft work is not even nearly as elaborate as with this piece of cyber fashion, but then it was mainly intended to work.
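For those who wonder what happens on the receiving side: the xOSC board reports its input states as OSC messages over WiFi, so a laptop can map key chords to characters. The following Python sketch (using the python-osc library) is only an illustration; the OSC address /glove/keys and the chord table are hypothetical and not the actual layout of our glove.

```python
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

CHORDS = {                  # hypothetical chord-to-character table
    (1, 0, 0, 0): "e",
    (0, 1, 0, 0): "t",
    (1, 1, 0, 0): "a",
}

def on_keys(address, *states):
    """Map the reported key states (one value per key) to a character."""
    chord = tuple(int(s) for s in states)
    char = CHORDS.get(chord)
    if char:
        print(char, end="", flush=True)

dispatcher = Dispatcher()
dispatcher.map("/glove/keys", on_keys)  # hypothetical address for key-state messages

server = BlockingOSCUDPServer(("0.0.0.0", 9000), dispatcher)
server.serve_forever()
```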
supplement 1 March 2017: Link to a youtube video about Mogli. Citation from the text accompanying the video:
“Mogli was a nice attempt (Doepfer cooperated with the Band Kraftwerk on this) in 1993, to control MIDI via a Nintendo game controller glove. Probably only about 350 were built because the glove went out of Nintendo-production. So here's a rare basic demonstration how it works.”
supplement 19 July 2017: Here an example of the above-mentioned gesture recognition developments. The company MotionSavvy, founded by (former) students at RIT, uses gesture recognition to generate speech, in order to make sign language audible. Here a video with more details. The production of the MotionSavvy tablet (called Uni) had been delayed to summer 2016, and the company is taking preorders as of now.
Another approach from RIT, in which letters are recognized via gesture recognition, is described in the article Gesture Recognition with the Leap Motion Controller. Here:
Students and staff on the RIT campus used the GUI to record their versions of each of 12 gesture types: one finger tap, two finger tap, swipe, wipe, grab, release, pinch, check mark, figure 8, lower case ‘e’, capital ‘E’, and capital ‘F’.
The script letters are here written into the air with the index finger, so in some sense this is a rather script-based approach to finger spelling.
supplement 18 August 2017: Another project translating American Sign Language (ASL) into letters:
The Language of Glove: Wireless gesture decoder with low-power and stretchable hybrid electronics (via adafruit and motherboard)
The decoder uses gesture recognition via bend sensors and an accelerometer, just as in the gloves project's glove. The gestures in the gloves project, however, could be learned and recognized via a neural net. For the decoder, the recognition is adjusted to the 26 signs of ASL, and thus a direct translation (without needing a neural net) was possible. (This learning opportunity, by the way, seems to enlarge the number of recognizable gestures by quite a bit.) The bend sensors for the decoder are self-fabricated.
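A minimal Python sketch of how such a direct translation could look, as I read it: each bend sensor is thresholded to bent/straight, and the resulting binary code is looked up in a fixed table, with no neural net involved. The sensor count, threshold, and example codes are illustrative guesses, not the paper's actual data.

```python
BEND_THRESHOLD = 0.5     # normalized sensor value above which a finger counts as bent

CODE_TO_LETTER = {       # hypothetical code table for a few letters
    (1, 1, 1, 1, 1, 1, 1, 1, 0): "a",
    (0, 0, 0, 0, 0, 0, 0, 0, 0): "b",
}

def decode(readings):
    """Threshold the bend-sensor readings and look the code up directly."""
    code = tuple(int(r > BEND_THRESHOLD) for r in readings)
    return CODE_TO_LETTER.get(code)  # None if the hand shape is no known letter

print(decode([0.9] * 8 + [0.1]))     # -> "a"
```

With a fixed, small alphabet like the 26 ASL signs, such a lookup is cheap and transparent, whereas a learned recognizer only pays off once the gesture set grows or varies between users.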