Tablet PC input: recognizing gestures and creating strokes from code

In my previous post on Tablet PC pen input I explored handwriting recognition: how the language-specific recognizers of the Tablet API translate your scribbling into text, listing the possible alternatives and the confidence the recognizer has in each result. The Tablet API has another recognizer built in, one that recognizes gestures. In this post I will explore that a little further. My demo code will try to recognize some specific gestures and draw each one’s idealized form using strokes created in code.


Again I have a simple WinForms form with a panel on it and a private InkCollector object. The collector is created in the form’s constructor and disposed in the form’s Dispose method; for the details on that check my previous post. By default an InkCollector object collects strokes. By setting its CollectionMode property you change this: here the collector is told to only check for gestures. The user can draw on the panel, but after a short lapse of time the strokes will disappear, having been processed by the gesture recognizer. A gesture is a certain movement of the pen. Tapping the digitizer, for instance, is a gesture, but so is drawing a line, a circle or even an arrow. These gestures fall into two categories. First come the system gestures, which include tap, double tap and drag; a full list is in the SystemGesture enumeration. The second group are the application gestures; a full list of these is in the ApplicationGesture enumeration.



In this post I will dive deeper into the latter. A system gesture is always recognized, but if you want your application to recognize an application gesture you must subscribe to it using the InkCollector’s SetGestureStatus method. You can be a lazy coder by passing in ApplicationGesture.AllGestures, but that is a bad habit: recognizing gestures is a costly matter, so please be economical when subscribing. When a gesture is recognized the InkCollector’s Gesture event will fire. Add an event handler to inspect the gestures and do something with them.


using System;
using System.Drawing;
using System.Collections;
using System.ComponentModel;
using System.Windows.Forms;
using System.Data;
using Microsoft.Ink;


namespace GestureSketcher
{
    public class Form1 : System.Windows.Forms.Form
    {
        private System.ComponentModel.Container components = null;
        private System.Windows.Forms.Panel panel1;

        private InkCollector ic;

        public Form1()
        {
            InitializeComponent();
            ic = new InkCollector(panel1);
            ic.CollectionMode = CollectionMode.GestureOnly;
            ic.SetGestureStatus(ApplicationGesture.Scratchout, true);
            ic.SetGestureStatus(ApplicationGesture.Up, true);
            ic.SetGestureStatus(ApplicationGesture.Down, true);
            ic.SetGestureStatus(ApplicationGesture.Left, true);
            ic.SetGestureStatus(ApplicationGesture.Right, true);
            ic.SetGestureStatus(ApplicationGesture.Square, true);
            ic.SetGestureStatus(ApplicationGesture.Circle, true);
            ic.SetGestureStatus(ApplicationGesture.SemiCircleRight, true);
            ic.SetGestureStatus(ApplicationGesture.ChevronUp, true);
            ic.SetGestureStatus(ApplicationGesture.ArrowUp, true);           
            ic.Gesture +=new InkCollectorGestureEventHandler(ic_Gesture);
            ic.Enabled = true;
        }

        /// <summary>
        /// Clean up any resources being used.
        /// </summary>
        protected override void Dispose( bool disposing )
        {
            if( disposing )
            {
                if (components != null)
                {
                    components.Dispose();
                }
                // dispose the collector even when there are no components
                ic.Dispose();
            }
            base.Dispose( disposing );
        }

        #region Windows Form Designer generated code
        private void InitializeComponent()
        {
            // designer code boils down to just panel1
            this.panel1 = new System.Windows.Forms.Panel();
            this.panel1.Dock = System.Windows.Forms.DockStyle.Fill;
            this.Controls.Add(this.panel1);
        }
        #endregion

        /// <summary>
        /// The main entry point for the application.
        /// </summary>
        [STAThread]
        static void Main()
        {
            Application.Run(new Form1());
        }



        private void ic_Gesture(object sender, InkCollectorGestureEventArgs e)
        {
            // Get bounding box of strokes drawn
            Rectangle rcBounds = e.Strokes.GetBoundingBox();

            // Calculate bounding points
            Point rcBoundsTopLeft = new Point(rcBounds.Left, rcBounds.Top);
            Point rcBoundsTopMid = new Point((rcBounds.Left + rcBounds.Right) /2, rcBounds.Top);
            Point rcBoundsTopRight = new Point(rcBounds.Right, rcBounds.Top);   

            Point rcBoundsMidLeft = new Point(rcBounds.Left, (rcBounds.Top + rcBounds.Bottom) /2);
            Point rcBoundsMidRight = new Point(rcBounds.Right, (rcBounds.Top + rcBounds.Bottom)/2);

            Point rcBoundsBottomLeft = new Point(rcBounds.Left, rcBounds.Bottom);
            Point rcBoundsBottomMid = new Point((rcBounds.Left + rcBounds.Right) /2, rcBounds.Bottom);
            Point rcBoundsBottomRight = new Point(rcBounds.Right, rcBounds.Bottom);

            int gesturesRecognized = 0;

            for (int i=0; i< e.Gestures.Length;i++)
            {
                bool curved = false;
                Point[] pts = new Point[0];
                Gesture gest = e.Gestures[i];

                switch(gest.Id)
                {                       
                    case ApplicationGesture.Down :
                        pts = new Point[2];
                        pts[0] = e.Strokes[0].GetPoint(0);
                        pts[1] = new Point(pts[0].X, rcBounds.Bottom);
                        break;
                    case ApplicationGesture.Up :
                        pts = new Point[2];
                        pts[0] = e.Strokes[0].GetPoint(0);
                        pts[1] = new Point(pts[0].X, rcBounds.Top);
                        break;
                    case ApplicationGesture.Left :
                        pts = new Point[2];
                        pts[0] = e.Strokes[0].GetPoint(0);
                        pts[1] = new Point(rcBounds.Left, pts[0].Y);
                        break;
                    case ApplicationGesture.Right :
                        pts = new Point[2];
                        pts[0] = e.Strokes[0].GetPoint(0);
                        pts[1] = new Point(rcBounds.Right, pts[0].Y);
                        break;
                    case ApplicationGesture.Square :
                        pts = new Point[5];
                        pts[0] = rcBoundsTopLeft;
                        pts[1] = rcBoundsTopRight;
                        pts[2] = rcBoundsBottomRight;
                        pts[3] = rcBoundsBottomLeft;
                        pts[4] = rcBoundsTopLeft;
                        break;
                    case ApplicationGesture.ArrowUp :
                        pts = new Point[5];
                        pts[0] = rcBoundsMidLeft;
                        pts[1] = rcBoundsTopMid;
                        pts[2] = rcBoundsBottomMid;
                        pts[3] = rcBoundsTopMid;
                        pts[4] = rcBoundsMidRight;
                        break;
                    case ApplicationGesture.ChevronUp:
                        pts = new Point[3];
                        pts[0] = rcBoundsBottomLeft;
                        pts[1] = rcBoundsTopMid;
                        pts[2] = rcBoundsBottomRight;
                        break;
                    case ApplicationGesture.SemiCircleRight :
                        pts = new Point[3];
                        pts[0] = rcBoundsBottomLeft;
                        pts[1] = rcBoundsTopMid;
                        pts[2] = rcBoundsBottomRight;
                        curved = true;
                        break;
                    case ApplicationGesture.Scratchout :
                    {
                        // erase the last stroke drawn by code; the user's
                        // gesture strokes are still in the collection, so
                        // count back past them
                        int last = ic.Ink.Strokes.Count - e.Strokes.Count - 1;
                        if (last >= 0)
                        {
                            ic.Ink.DeleteStroke(ic.Ink.Strokes[last]);
                            panel1.Invalidate();
                        }
                        break;
                    }

                }

                if (pts.Length > 0)
                {
                    gesturesRecognized++;
                    // Create a stroke to draw the recognized form
                    Stroke str = ic.Ink.CreateStroke(pts);
                    str.DrawingAttributes.FitToCurve = curved;

                    switch(gest.Confidence)
                    {
                        case RecognitionConfidence.Strong :
                            str.DrawingAttributes.Color = Color.Green;
                            break;
                        case RecognitionConfidence.Intermediate :
                            str.DrawingAttributes.Color = Color.Orange;
                            break;
                        case RecognitionConfidence.Poor :
                            str.DrawingAttributes.Color = Color.Red;
                            break;
                    }

                    if (gesturesRecognized == 1)
                    {
                        // Emphasize first recognizer result
                        str.DrawingAttributes.Width = str.DrawingAttributes.Width * 2;
                        str.DrawingAttributes.Height = str.DrawingAttributes.Height * 2;
                    }
                    // Draw stroke on panel
                    ic.Ink.Strokes.Add(str);
                }
            }

        }
    }
}
 

All the fun stuff happens in the ic_Gesture method. In its parameters it receives an InkCollectorGestureEventArgs object, full of interesting information such as the strokes drawn and the gestures recognized. The strokes drawn have a bounding box, the rectangle which encloses them. From its four corner points I calculate the midpoints of the rectangle’s sides. The event arguments also contain a list of gestures. This list is comparable to the list of alternative words you get when recognizing handwriting: it holds the recognizer’s calculated guesses at the gesture. My code loops through these and checks the id of each gesture.


When the strokes have been analyzed by the recognizer they are discarded. The code then creates a new stroke to draw a shape which corresponds to the gesture. You create a stroke out of an array of points, and you get the coordinates of an existing stroke by using the Stroke.GetPoint method. The actual value of these coordinates is not in pixels but in HiMetric units. A tablet digitizer is far more precise than the pixel grid of the screen; by using the HiMetric format this extra precision is preserved.
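When you do need pixel coordinates, for instance to hit-test against controls, the collector’s Renderer can convert between the two spaces. A minimal sketch, assuming the ic and panel1 fields from the demo form, running inside the Gesture handler:

```csharp
// Sketch: converting a point from ink space (HiMetric, 0.01 mm units)
// to pixel coordinates with the Renderer attached to the InkCollector.
using (Graphics g = panel1.CreateGraphics())
{
    Point pt = e.Strokes.GetBoundingBox().Location; // still HiMetric
    ic.Renderer.InkSpaceToPixel(g, ref pt);         // now in pixels
}
```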


The simplest gestures are a straight line up, down, left or right. Drawing a line requires two points. The starting point is copied from the first point of the stroke drawn; the end point gets its coordinates from the stroke’s bounding box. The result is a stroke which draws a perfectly straight horizontal or vertical line. Create a stroke from the array of (these two) points using the Ink.CreateStroke method. The stroke will be drawn by adding it to the Ink object’s Strokes collection. Before doing that I do some decoration by setting properties of the stroke’s DrawingAttributes. Each gesture has a recognition Confidence, which I use to set the color. The first gesture recognized will be drawn with a thicker line by setting Height and Width.


A somewhat more complex gesture is a square. There are several ways to draw it: in one stroke making a lot of turns, or in two strokes, each with just one turn. There is a small lag between the last stroke drawn and the gesture recognizer firing; this lag makes it possible to have a gesture which consists of multiple strokes. To build a square from one stroke you need five points. The top left corner is both the first and the last point, giving a (visually) closed shape. Actually it is one open stroke whose begin and end points overlap.


When it comes to gestures pointing up, the gesture recognizer is going to have a hard time. In the demo I have subscribed to the arrow-up, the chevron-up (which looks like the letter v upside down) and the semicircle-right gesture. Despite its name, the latter is best recognized by drawing the upper half of a circle. When these gestures are recognized, strokes are constructed from the calculated points. A semicircle is a rounded shape: when you set the FitToCurve property of the stroke’s DrawingAttributes, a Bezier curve fitted to the stroke’s points will be drawn. Drawing these kinds of gestures you will see that the gesture recognizer proposes several alternatives, leading to several shapes being drawn. The color of the shapes tells you how confident the recognizer is; even when it is pretty sure it will return alternatives, just like text recognition.



There are loads of things you can do with gestures. As a small example I have subscribed to the Scratchout gesture. It will erase the last stroke drawn by our code. Strokes entered by the user are discarded when the recognizer has finished, but before that they are still in the Strokes collection of the InkCollector’s Ink object. To find the last stroke drawn by code I have to count back past the user’s strokes to get the right index in the collection.


Now I have a very simple sketcher. To expand its possibilities, just subscribe to the other gestures. If you want to respond to system gestures you have to add an event handler to the InkCollector’s SystemGesture event: another method, same idea. The Circle gesture deserves a special note. When it comes to drawing a circle things get hard, as there is no simple way to draw an exact circle with Bezier curves. This is something to address with a custom renderer; something for a new post.
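A minimal sketch of what such a system gesture handler could look like, assuming the ic field from the demo form (the handler name and its body are illustrative):

```csharp
// In the constructor, subscribe to the SystemGesture event:
//   ic.SystemGesture += new InkCollectorSystemGestureEventHandler(ic_SystemGesture);

private void ic_SystemGesture(object sender, InkCollectorSystemGestureEventArgs e)
{
    // e.Id identifies the system gesture that was recognized
    if (e.Id == SystemGesture.Tap)
    {
        // react to a tap here, e.g. select a shape
    }
}
```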


Scribble away!

This entry was posted in Tablet+PC.