#739 – Handling Touch Input at Different Levels

In WPF, there are three different ways that your application can support touch input:

  • Built-in support for touch.  Some elements will automatically respond to touch input.  For example, you can trigger the Click event for a button by touching the button or scroll a ListBox by touching and dragging.
  • Manipulation Events.  User interface elements support a series of manipulation events that let you detect when the user is trying to rotate, scale (zoom) or translate (move) an element.  The touch points from two fingers are automatically mapped to an event with the correct data.  For example, spreading two fingers apart triggers an event that knows you want to zoom in.
  • Raw Touch Events.  You can handle individual events for touch down, up and move actions on an element, for all supported touch points.  For example, you can track the location of 10 fingers touching the screen at the same time.
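As a rough sketch of the lowest level, you might wire up the raw touch events on an element and track each touch point by its device Id. The `canvas` element name below is an illustrative assumption, not something from the post:

```csharp
// Sketch: tracking individual touch points via WPF's raw touch events.
// Assumes a Window whose XAML contains a Canvas named "canvas".
public partial class MainWindow : Window
{
    public MainWindow()
    {
        InitializeComponent();

        canvas.TouchDown += Canvas_TouchDown;
        canvas.TouchMove += Canvas_TouchMove;
        canvas.TouchUp += Canvas_TouchUp;
    }

    private void Canvas_TouchDown(object sender, TouchEventArgs e)
    {
        // Capture the touch so this element keeps receiving
        // events for this touch point, even if the finger
        // moves off the element.
        canvas.CaptureTouch(e.TouchDevice);

        TouchPoint tp = e.GetTouchPoint(canvas);
        Console.WriteLine("Touch {0} down at {1}", e.TouchDevice.Id, tp.Position);
    }

    private void Canvas_TouchMove(object sender, TouchEventArgs e)
    {
        TouchPoint tp = e.GetTouchPoint(canvas);
        Console.WriteLine("Touch {0} moved to {1}", e.TouchDevice.Id, tp.Position);
    }

    private void Canvas_TouchUp(object sender, TouchEventArgs e)
    {
        canvas.ReleaseTouchCapture(e.TouchDevice);
        Console.WriteLine("Touch {0} up", e.TouchDevice.Id);
    }
}
```

Each finger on the screen gets its own `TouchDevice` with a unique `Id`, which is what lets you follow up to the hardware's maximum number of touch points independently.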

#736 – Finding the Maximum Number of Touch Points at Run-time

You can write code that discovers at run-time the number of touch points supported by the hardware that you’re running on.  You do this by calling the Win32 GetSystemMetrics function.

    using System;
    using System.Runtime.InteropServices;

    class Program
    {
        [DllImport("user32.dll")]
        static extern int GetSystemMetrics(int nIndex);

        // Index passed in to GetSystemMetrics() indicates
        // what data we're asking for.
        private const int SM_DIGITIZER = 94;
        private const int SM_MAXIMUMTOUCHES = 95;

        // Masks used to check results from SM_DIGITIZER check
        private const int NID_READY = 0x80;
        private const int NID_MULTI_INPUT = 0x40;

        static void Main(string[] args)
        {
            string info;

            int digitizer = GetSystemMetrics(SM_DIGITIZER);

            if ((digitizer & (NID_READY | NID_MULTI_INPUT)) == (NID_READY | NID_MULTI_INPUT))
            {
                int numTouchPoints = GetSystemMetrics(SM_MAXIMUMTOUCHES);
                info = string.Format("Multitouch ready, {0} inputs supported", numTouchPoints);
            }
            else
                info = "Multitouch not supported";

            Console.WriteLine(info);
            Console.ReadLine();
        }
    }


#731 – The Idea of Multi-Touch

Touch input is the idea of using your finger as an input device by touching a screen.  Multi-touch means that you can touch the screen with more than one finger, with each finger touching a different spot on the screen.

Multi-touch is typically used to track gestures that the user performs with more than one finger.  For example, placing two fingers on the screen and then spreading the fingers apart is interpreted as a “zoom in” gesture.  Moving the two fingers together is interpreted as a “zoom out” gesture.  And placing two fingers on the screen and rotating them both is interpreted as a “rotate” gesture.
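In WPF, these two-finger gestures surface through the manipulation events, which report scale, rotation, and translation deltas rather than raw finger positions. Here is a minimal sketch, assuming a Window containing an element named `image` (the name is illustrative):

```csharp
// Sketch: interpreting pinch/spread and rotate gestures via
// WPF manipulation events.  Assumes the Window's XAML contains
// an element named "image".
public partial class MainWindow : Window
{
    public MainWindow()
    {
        InitializeComponent();

        // Opt the element in to manipulation events and give it
        // a transform we can update as gestures arrive.
        image.IsManipulationEnabled = true;
        image.RenderTransform = new MatrixTransform();
        image.ManipulationDelta += Image_ManipulationDelta;
    }

    private void Image_ManipulationDelta(object sender, ManipulationDeltaEventArgs e)
    {
        var transform = (MatrixTransform)image.RenderTransform;
        Matrix matrix = transform.Matrix;

        // Spreading/pinching two fingers arrives as a scale factor,
        // rotating them as an angle in degrees, and dragging as a
        // translation vector -- no per-finger bookkeeping required.
        ManipulationDelta delta = e.DeltaManipulation;
        Point center = e.ManipulationOrigin;

        matrix.ScaleAt(delta.Scale.X, delta.Scale.Y, center.X, center.Y);
        matrix.RotateAt(delta.Rotation, center.X, center.Y);
        matrix.Translate(delta.Translation.X, delta.Translation.Y);

        transform.Matrix = matrix;
        e.Handled = true;
    }
}
```

Note that WPF does the gesture recognition for you: two fingers spreading apart shows up as `Scale` greater than 1.0, pinching as `Scale` less than 1.0, and rotating as a nonzero `Rotation`.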


Windows 7 and Windows 8 both include support for multi-touch input.