#740 – Set Background of Canvas to Transparent to Receive Touch Events

If you define one or more touch event handlers for a Canvas panel without setting any other properties, you may find that the canvas never receives any touch events.

This happens because a control that inherits from Panel will receive neither touch nor mouse events unless you give its Background property a value; a null background makes the panel invisible to hit testing.  So, to receive touch events for the Canvas, you can simply set its Background property to Transparent.

    <Canvas Name="canvMain" Background="Transparent"
        TouchDown="Canvas_TouchDown" TouchMove="Canvas_TouchMove" TouchUp="Canvas_TouchUp">
    </Canvas>

#739 – Handling Touch Input at Different Levels

In WPF, there are three different ways that your application can support touch input:

  • Built-in support for touch.  Some elements will automatically respond to touch input.  For example, you can trigger the Click event for a button by touching the button or scroll a ListBox by touching and dragging.
  • Manipulation Events.  User interface elements support a series of manipulation events that let you detect when the user is trying to rotate, scale (zoom) or translate (move) an element.  The touch points from two fingers are automatically mapped to an event with the correct data.  For example, spreading two fingers apart triggers an event that knows you want to zoom in.  (See the sketch after this list.)
  • Raw Touch Events.  You can handle individual events for touch down, up and move actions on an element, for all supported touch points.  For example, you can track the location of 10 fingers touching the screen at the same time.
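
As a minimal sketch of the second option, the markup and code below enable manipulation on a Rectangle and apply each reported delta to a MatrixTransform, so that one or two fingers can move, zoom and rotate the element.  The element and handler names are my own, and the code assumes it lives in a Window’s code-behind:

    <Canvas>
        <Rectangle Name="rectSample" Width="200" Height="100" Fill="Blue"
            IsManipulationEnabled="True"
            ManipulationStarting="Rectangle_ManipulationStarting"
            ManipulationDelta="Rectangle_ManipulationDelta">
            <Rectangle.RenderTransform>
                <MatrixTransform/>
            </Rectangle.RenderTransform>
        </Rectangle>
    </Canvas>

    private void Rectangle_ManipulationStarting(object sender, ManipulationStartingEventArgs e)
    {
        // Report manipulation coordinates relative to the window
        e.ManipulationContainer = this;
        e.Handled = true;
    }

    private void Rectangle_ManipulationDelta(object sender, ManipulationDeltaEventArgs e)
    {
        Rectangle rect = (Rectangle)sender;
        MatrixTransform transform = (MatrixTransform)rect.RenderTransform;
        Matrix matrix = transform.Matrix;

        // Apply the rotation, scaling and translation reported for this delta,
        // centered on the manipulation origin
        matrix.RotateAt(e.DeltaManipulation.Rotation,
            e.ManipulationOrigin.X, e.ManipulationOrigin.Y);
        matrix.ScaleAt(e.DeltaManipulation.Scale.X, e.DeltaManipulation.Scale.Y,
            e.ManipulationOrigin.X, e.ManipulationOrigin.Y);
        matrix.Translate(e.DeltaManipulation.Translation.X,
            e.DeltaManipulation.Translation.Y);

        transform.Matrix = matrix;
        e.Handled = true;
    }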

#738 – Sample Code – Drawing and Moving Circles at Touch Points

Here’s some sample code that draws a circle at each touch point when a finger contacts the screen and then moves that circle around as you move your finger.  This is done using the raw touch events: TouchDown, TouchMove and TouchUp.

    <Canvas Name="canvMain" Background="Transparent"
        TouchDown="Canvas_TouchDown" TouchUp="Canvas_TouchUp" TouchMove="Canvas_TouchMove"/>

 

    public partial class MainWindow : Window
    {
        public MainWindow()
        {
            InitializeComponent();
            TouchPositions = new Dictionary<int, Point>();
            TouchEllipses = new Dictionary<int, Ellipse>();
        }

        private const double CircleWidth = 55;
        private Dictionary<int, Point> TouchPositions;
        private Dictionary<int, Ellipse> TouchEllipses;

        private void Canvas_TouchDown(object sender, TouchEventArgs e)
        {
            // Capture the touch so that this canvas continues to receive events
            // for this touch device, even if the finger moves off of the canvas.
            canvMain.CaptureTouch(e.TouchDevice);

            TouchPoint tp = e.GetTouchPoint(null);

            // Draw a circle centered on the touch point and remember both the
            // position and the circle, keyed by the Id of the touch device.
            Ellipse el = AddEllipseAt(canvMain, tp.Position, Brushes.Red);

            TouchPositions.Add(e.TouchDevice.Id, tp.Position);
            TouchEllipses.Add(e.TouchDevice.Id, el);
            e.Handled = true;
        }

        private void Canvas_TouchMove(object sender, TouchEventArgs e)
        {
            TouchPoint tp = e.GetTouchPoint(null);

            // Keep the recorded position current and re-center the circle
            // for this touch device at the new location.
            TouchPositions[e.TouchDevice.Id] = tp.Position;
            Canvas.SetLeft(TouchEllipses[e.TouchDevice.Id], tp.Position.X - (CircleWidth / 2));
            Canvas.SetTop(TouchEllipses[e.TouchDevice.Id], tp.Position.Y - (CircleWidth / 2));
            e.Handled = true;
        }

        private void Canvas_TouchUp(object sender, TouchEventArgs e)
        {
            // Remove the circle for this touch device and stop tracking it.
            TouchPositions.Remove(e.TouchDevice.Id);

            canvMain.Children.Remove(TouchEllipses[e.TouchDevice.Id]);
            TouchEllipses.Remove(e.TouchDevice.Id);

            canvMain.ReleaseTouchCapture(e.TouchDevice);
            e.Handled = true;
        }

        private Ellipse AddEllipseAt(Canvas canv, Point pt, Brush brush)
        {
            Ellipse el = new Ellipse();
            el.Stroke = brush;
            el.Fill = brush;
            el.Width = CircleWidth;
            el.Height = CircleWidth;

            Canvas.SetLeft(el, pt.X - (CircleWidth / 2));
            Canvas.SetTop(el, pt.Y - (CircleWidth / 2));

            canv.Children.Add(el);

            return el;
        }

    }


#737 – Touch Behavior when Maximum Number of Touch Points Reached

When you already have the maximum number of touch points engaged as input devices, additional touches on the screen will be ignored.

For example, assume that your hardware supports a maximum of two touch points and you’re already touching the screen with two fingers.  If you place a third finger on the screen, you will not get a TouchDown event for that third finger.  However, if you leave all three fingers on the screen and then start lifting fingers, the first finger that you lift will not generate a TouchUp event.  The behavior can be summarized as follows:

  • If you’ve already reached the maximum number of touch points, adding fingers will not result in TouchDown events
  • If you currently have more fingers touching the screen than the maximum number of touch points, lifting a finger will not result in a TouchUp event until you’re back down to the maximum number of touch points (see the sketch after this list)
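
One way to observe this behavior is to log the touch device Id as each finger goes down or comes up, for example in handlers on a Canvas whose Background is set to Transparent.  The handler names and the activeTouches field below are placeholders of my own:

    // Requires System.Collections.Generic and System.Diagnostics
    private readonly HashSet<int> activeTouches = new HashSet<int>();

    private void LoggingCanvas_TouchDown(object sender, TouchEventArgs e)
    {
        activeTouches.Add(e.TouchDevice.Id);
        Debug.WriteLine(string.Format("TouchDown, device {0}, {1} active",
            e.TouchDevice.Id, activeTouches.Count));
    }

    private void LoggingCanvas_TouchUp(object sender, TouchEventArgs e)
    {
        activeTouches.Remove(e.TouchDevice.Id);
        Debug.WriteLine(string.Format("TouchUp, device {0}, {1} active",
            e.TouchDevice.Id, activeTouches.Count));
    }

On hardware limited to two touch points, a third finger produces no TouchDown line in the output, and the running count makes it easy to see which lifted fingers never produce a TouchUp.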

#736 – Finding the Maximum Number of Touch Points at Run-time

You can write code that discovers at run-time the number of touch points supported by the hardware that you’re running on.  You do this by calling the Win32 GetSystemMetrics function.

    using System;
    using System.Runtime.InteropServices;

    class Program
    {
        [DllImport("user32.dll")]
        static extern int GetSystemMetrics(int nIndex);

        // Index passed in to GetSystemMetrics() indicates
        // what data we're asking for.
        private const int SM_DIGITIZER = 94;
        private const int SM_MAXIMUMTOUCHES = 95;

        // Bit masks used to interpret the result of the SM_DIGITIZER check
        private const int NID_READY = 0x80;
        private const int NID_MULTI_INPUT = 0x40;

        static void Main(string[] args)
        {
            string info;

            int digitizer = GetSystemMetrics(SM_DIGITIZER);

            if ((digitizer & (NID_READY + NID_MULTI_INPUT)) == NID_READY + NID_MULTI_INPUT)
            {
                int numTouchPoints = GetSystemMetrics(SM_MAXIMUMTOUCHES);
                info = string.Format("Multitouch ready, {0} inputs supported", numTouchPoints);
            }
            else
                info = "Multitouch not supported";

            Console.WriteLine(info);
            Console.ReadLine();
        }
    }


#735 – System Applet Indicates Maximum Number of Touch Points

The number of simultaneous touch points depends on the particular touch hardware that you’re using.  To quickly check what is supported on the machine that you’re using, bring up the System applet in Control Panel.  (Control Panel | System and Security | System, or just type “System” in the Windows 7 or Windows 8 search box.)

The maximum number of simultaneous touch points is listed in the middle of the window, labeled Pen and Touch.


#733 – A Full List of Touch Related Events

Here’s a full list of the UIElement events that you can use when you want to handle touch input.  All of the events listed below are also defined on ContentElement.

All events are bubbling, unless flagged as tunneling.  (A short sketch illustrating the tunneling/bubbling order appears after the lists below.)

Raw touch events:

  • GotTouchCapture – element has captured touch input
  • LostTouchCapture – element has lost touch capture
  • PreviewTouchDown – finger touches element  (tunneling)
  • PreviewTouchMove – finger moving on screen  (tunneling)
  • PreviewTouchUp – finger lifts off screen after moving  (tunneling)
  • TouchDown – finger touches element
  • TouchEnter – finger moves into element from outside
  • TouchLeave – finger moves out of element
  • TouchMove – finger moving on screen
  • TouchUp – finger lifts off screen after moving

Events related to manipulation (gestures):

  • ManipulationBoundaryFeedback – manipulation enters boundary
  • ManipulationCompleted – manipulation on element finishes
  • ManipulationDelta – position changes during manipulation
  • ManipulationInertiaStarting – finger leaves screen during manipulation
  • ManipulationStarted – manipulation on element starts
  • ManipulationStarting – user puts finger on element
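
As a rough illustration of the tunneling/bubbling distinction, the wiring below (a sketch of my own, placed in a window’s constructor and assuming that the touched child element does not mark the event as handled) logs both the preview and the bubbled event.  The window’s PreviewTouchDown fires first, on the way down to the touched element; the window’s TouchDown fires last, after the event has bubbled back up:

    // Debug.WriteLine requires System.Diagnostics
    public MainWindow()
    {
        InitializeComponent();

        // Tunneling: PreviewTouchDown travels from the window down to the touched element
        this.PreviewTouchDown += (s, e) =>
            Debug.WriteLine("Window PreviewTouchDown (fires first)");

        // Bubbling: TouchDown travels from the touched element back up to the window
        this.TouchDown += (s, e) =>
            Debug.WriteLine("Window TouchDown (fires after the element's own TouchDown)");
    }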