This is just a feasibility study and not meant for production use. Feel free to use it in your own projects or give me some constructive feedback.
KrikelKrakel (German for scribbling)
Inspired by the detailed article at http://www.usenix.org/events/usenix03/tech/freenix03/full_papers/worth/worth_html/xstroke.html by Carl D. Worth, I began experimenting with stroke recognition on the iPhone. Unfortunately the sources for xstroke are hard to find nowadays and are no longer maintained. I finally did find them, but I did not want to port all that X11 code, so I decided to start from scratch and did the small feasibility study I want to show you here.
The most interesting class to look at is KrikelKrakelView. It inherits from UIView and does all the tracking and recognition. A gesture is recognized when the touches have ended. The area where the touches took place is divided into a 3x3 grid of 9 cells, and the path the finger took is then described by the sequence of cell ids it passes through. Have a look at the article mentioned above for the details.
A gesture can consist of more than one stroke. All strokes of a gesture have to be drawn within a certain time window before the recognition occurs. At the moment GAP_BETWEEN_STROKES is set to 1 second.
One can register as a delegate to get called on different occasions:
- (void) willDrawGesture;
This will be called right inside the touchesBegan:withEvent: method.
- (void) didLearnNewGesture:(NSString*)text;
When a gesture has not been recognized, the user is presented with an alert box where they can enter a letter or some longer text; this method is then called with the entered text.
- (void) didRecognizeGesture:(NSString*)text;
When a gesture has been recognized, this method will be called and the stored letter/text will be delivered in the text parameter.
The learned gestures are stored in the application documents directory under the name "strokes.dict". If there is no such file on first start, the bundled strokes.dict will be used as the initial version.
See a demo video here.