#737 – Touch Behavior when Maximum Number of Touch Points Reached

When you already have the maximum number of touch points engaged as input devices, additional touches on the device will be ignored.

For example, assume that your hardware supports a maximum of two touch points and you’re already touching the screen with two fingers.  If you place a third finger on the screen, you will not get a TouchDown event for that third finger.  However, if you leave all three fingers on the screen and then lift a finger, the first finger lifted will not generate a TouchUp event.  The behavior can be summarized as follows (a small sketch for observing it appears after the list):

  • If you’ve already reached the maximum number of touch points, adding more fingers will not result in TouchDown events
  • If you currently have more fingers touching the screen than the maximum number of touch points, lifting a finger will not result in a TouchUp event until you’re back down to the maximum number of touch points
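
Here’s a minimal sketch, not from the original post, that you could use to observe this behavior on your own hardware.  It simply counts TouchDown and TouchUp events; the handler names, and the assumption that they’re wired to a Window’s TouchDown and TouchUp events, are illustrative, and Debug is System.Diagnostics.Debug.  On hardware limited to two touch points, a third finger never increments the count, and the count doesn’t start decreasing until you’re back down to two fingers.

        // Running count of fingers that the system is actually tracking
        private int fingersTracked = 0;

        private void Window_TouchDown(object sender, TouchEventArgs e)
        {
            fingersTracked++;
            Debug.WriteLine("TouchDown, Id={0}, fingers tracked={1}", e.TouchDevice.Id, fingersTracked);
        }

        private void Window_TouchUp(object sender, TouchEventArgs e)
        {
            fingersTracked--;
            Debug.WriteLine("TouchUp, Id={0}, fingers tracked={1}", e.TouchDevice.Id, fingersTracked);
        }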

#736 – Finding the Maximum Number of Touch Points at Run-time

You can write code that discovers at run-time the number of touch points supported by the hardware that you’re running on.  You do this by calling the Win32 GetSystemMetrics function.

    using System;
    using System.Runtime.InteropServices;

    class Program
    {
        [DllImport("user32.dll")]
        static extern int GetSystemMetrics(int nIndex);

        // Index passed in to GetSystemMetrics() indicates
        // what data we're asking for.
        private const int SM_DIGITIZER = 94;
        private const int SM_MAXIMUMTOUCHES = 95;

        // Masks used to check results from SM_DIGITIZER check
        private const int NID_READY = 0x80;
        private const int NID_MULTI_INPUT = 0x40;

        static void Main(string[] args)
        {
            string info;

            int digitizer = GetSystemMetrics(SM_DIGITIZER);

            if ((digitizer & (NID_READY + NID_MULTI_INPUT)) == NID_READY + NID_MULTI_INPUT)
            {
                int numTouchPoints = GetSystemMetrics(SM_MAXIMUMTOUCHES);
                info = string.Format("Multitouch ready, {0} inputs supported", numTouchPoints);
            }
            else
                info = "Multitouch not supported";

            Console.WriteLine(info);
            Console.ReadLine();
        }
    }

736-001

#735 – System Applet Indicates Maximum Number of Touch Points

The number of simultaneous touch points supported depends on the particular touch hardware that you’re using.  To quickly check what is supported on the machine that you’re using, bring up the System applet in Control Panel (Control Panel | System and Security | System, or just type “System” into the search box in Windows 7 or Windows 8).

The maximum number of simultaneous touch points is listed in the middle of the window, labeled Pen and Touch.

735-001

#734 – Recognizing Different Fingers in Touch Event Handlers

When you’re handling low-level touch events in WPF and the user will be using more than one finger at a time on the screen, you’ll want to keep track of which finger is generating a particular touch event.  You can do this using the TouchEventArgs.TouchDevice.Id property.  Each finger touching the screen reports a distinct Id, and when you touch and drag a finger across the screen, the Id remains the same for all events associated with that finger.

Here’s an example.

        private const double CircleWidth = 10;

        // Last known position of each finger, keyed by TouchDevice.Id
        private Dictionary<int, Point> LastPositionDict = new Dictionary<int, Point>();

        private void Canvas_TouchDown(object sender, TouchEventArgs e)
        {
            try
            {
                TouchPoint tp = e.GetTouchPoint(null);

                AddEllipseAt(canvMain, tp.Position, Brushes.Red);

                LastPositionDict.Add(e.TouchDevice.Id, tp.Position);
            }
            catch (Exception xx)
            {
                MessageBox.Show(xx.ToString());
            }
        }

        private void Canvas_TouchMove(object sender, TouchEventArgs e)
        {
            TouchPoint tp = e.GetTouchPoint(null);

            AddLineFromTo(canvMain, LastPositionDict[e.TouchDevice.Id], tp.Position, Brushes.Black);
            LastPositionDict[e.TouchDevice.Id] = tp.Position;
        }

        private void Canvas_TouchUp(object sender, TouchEventArgs e)
        {
            TouchPoint tp = e.GetTouchPoint(null);

            AddEllipseAt(canvMain, tp.Position, Brushes.Blue);
            LastPositionDict.Remove(e.TouchDevice.Id);
        }

Now I can draw with two fingers at the same time:
734-001

#733 – A Full List of Touch Related Events

Here’s a full list of the UIElement events that you can handle when you want to work with touch input.  All of the events listed below are also defined for ContentElement.

All events are bubbling, unless flagged as tunneling.

Raw touch events:

  • GotTouchCapture – element has captured touch input
  • LostTouchCapture – element has lost touch capture
  • PreviewTouchDown – finger touches element (tunneling)
  • PreviewTouchMove – finger moving on screen (tunneling)
  • PreviewTouchUp – finger lifts off screen (tunneling)
  • TouchDown – finger touches element
  • TouchEnter – finger moves into element from outside
  • TouchLeave – finger moves out of element
  • TouchMove – finger moving on screen
  • TouchUp – finger lifts off screen

Events related to manipulation (gestures); a minimal usage sketch follows the list:

  • ManipulationBoundaryFeedback – manipulation enters boundary
  • ManipulationCompleted – manipulation on element finishes
  • ManipulationDelta – position changes during manipulation
  • ManipulationInertiaStarting – finger leaves screen during manipulation
  • ManipulationStarted – manipulation on element starts
  • ManipulationStarting – user puts finger on element
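
As a rough sketch (not from the original post) of how these events fit together, the code-behind below enables manipulation on an element named box (a hypothetical name for an element declared in XAML) in a window assumed to be called MainWindow, and applies the reported translation, scale, and rotation through a MatrixTransform.  Details such as choosing a ManipulationContainer in the ManipulationStarting handler are glossed over here.

        public MainWindow()
        {
            InitializeComponent();

            // "box" is assumed to be an element declared in XAML with x:Name="box"
            box.IsManipulationEnabled = true;
            box.RenderTransform = new MatrixTransform();
            box.ManipulationDelta += Box_ManipulationDelta;
        }

        private void Box_ManipulationDelta(object sender, ManipulationDeltaEventArgs e)
        {
            MatrixTransform transform = (MatrixTransform)box.RenderTransform;
            Matrix m = transform.Matrix;

            // DeltaManipulation reports the change since the previous ManipulationDelta event
            ManipulationDelta d = e.DeltaManipulation;
            m.Translate(d.Translation.X, d.Translation.Y);
            m.ScaleAt(d.Scale.X, d.Scale.Y, e.ManipulationOrigin.X, e.ManipulationOrigin.Y);
            m.RotateAt(d.Rotation, e.ManipulationOrigin.X, e.ManipulationOrigin.Y);

            transform.Matrix = m;
        }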

#732 – Basic Events for Raw Touch Input

WPF includes a set of events for handling raw touch input.  These events are defined for all UIElement, ContentElement, and UIElement3D objects.

The most basic events are:

  • TouchDown – User touches the screen
  • TouchMove – User moves finger across the screen
  • TouchUp – User lifts finger off the screen

Below is a simple example that allows drawing using touch.  We’ve defined event handlers and attached them to a main Canvas element.  A red circle is drawn at the TouchDown point and a blue circle at the TouchUp point.  A continuous line is drawn as the user moves their finger across the screen.

        private const double CircleWidth = 10;
        private Point LastPosition;

        private void Canvas_TouchDown(object sender, TouchEventArgs e)
        {
            try
            {
                TouchPoint tp = e.GetTouchPoint(null);

                AddEllipseAt(canvMain, tp.Position, Brushes.Red);
                LastPosition = tp.Position;
            }
            catch (Exception xx)
            {
                MessageBox.Show(xx.ToString());
            }
        }

        private void Canvas_TouchMove(object sender, TouchEventArgs e)
        {
            TouchPoint tp = e.GetTouchPoint(null);

            AddLineFromTo(canvMain, LastPosition, tp.Position, Brushes.Black);
            LastPosition = tp.Position;
        }

        private void Canvas_TouchUp(object sender, TouchEventArgs e)
        {
            TouchPoint tp = e.GetTouchPoint(null);

            AddEllipseAt(canvMain, tp.Position, Brushes.Blue);
        }

        private void AddEllipseAt(Canvas canv, Point pt, Brush brush)
        {
            Ellipse el = new Ellipse();
            el.Stroke = brush;
            el.Fill = brush;
            el.Width = CircleWidth;
            el.Height = CircleWidth;

            Canvas.SetLeft(el, pt.X - (CircleWidth / 2));
            Canvas.SetTop(el, pt.Y - (CircleWidth / 2));

            canv.Children.Add(el);
        }

        private void AddLineFromTo(Canvas canv, Point from, Point to, Brush brush)
        {
            Line l = new Line();
            l.Stroke = brush;
            l.X1 = from.X;
            l.Y1 = from.Y;
            l.X2 = to.X;
            l.Y2 = to.Y;
            l.StrokeThickness = 2;

            canv.Children.Add(l);
        }
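
The snippet above doesn’t show how the handlers get attached to the Canvas (named canvMain in the code).  One way, sketched below assuming the window class is named MainWindow, is to hook them up in the window’s constructor.  Note that the Canvas needs a non-null Background (e.g. Transparent) for touch hit-testing to work over its empty area.

        public MainWindow()
        {
            InitializeComponent();

            // Assumes XAML along the lines of:
            //   <Canvas Name="canvMain" Background="Transparent"/>
            canvMain.TouchDown += Canvas_TouchDown;
            canvMain.TouchMove += Canvas_TouchMove;
            canvMain.TouchUp += Canvas_TouchUp;
        }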

So when I touch and drag on a touch-enabled device, I get something that looks like this:

732-BasicTouch

#731 – The Idea of Multi-Touch

Touch input is the idea of using your finger as an input device by touching a screen.  Multi-touch means that you can touch the screen with more than one finger, with each finger touching a different spot on the screen.

Multi-touch is typically used to track gestures that the user performs with more than one finger.  For example, placing two fingers on the screen and then spreading the fingers apart is interpreted as a “zoom in” gesture.  Moving the two fingers together is interpreted as a “zoom out” gesture.  And placing two fingers on the screen and rotating them both is interpreted as a “rotate” gesture.

731-001
731-002
731-003
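
To make the zoom example concrete, here’s a small sketch (not from the original post) that interprets two raw touch points as a pinch or spread by watching the distance between them.  The handler names are illustrative, System.Linq and System.Diagnostics are assumed to be imported, and in practice the WPF manipulation events will report this for you directly as a scale factor.

        // Sketch: interpret two raw touch points as a zoom gesture by watching
        // the distance between them
        private Dictionary<int, Point> activeTouches = new Dictionary<int, Point>();
        private double lastPinchDistance;

        private void Element_TouchDown(object sender, TouchEventArgs e)
        {
            activeTouches[e.TouchDevice.Id] = e.GetTouchPoint(null).Position;
        }

        private void Element_TouchMove(object sender, TouchEventArgs e)
        {
            activeTouches[e.TouchDevice.Id] = e.GetTouchPoint(null).Position;

            if (activeTouches.Count == 2)
            {
                List<Point> points = activeTouches.Values.ToList();
                double distance = (points[1] - points[0]).Length;

                if (lastPinchDistance > 0 && distance > lastPinchDistance)
                    Debug.WriteLine("Fingers spreading apart - zoom in");
                else if (lastPinchDistance > 0 && distance < lastPinchDistance)
                    Debug.WriteLine("Fingers moving together - zoom out");

                lastPinchDistance = distance;
            }
        }

        private void Element_TouchUp(object sender, TouchEventArgs e)
        {
            activeTouches.Remove(e.TouchDevice.Id);
            lastPinchDistance = 0;
        }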

Windows 7 and Windows 8 both include support for multi-touch input.


#730 – Use QueryContinueDrag Event to Know When Mouse Button State Changes

The QueryContinueDrag event lets you know when the state of a mouse button changes during a drag-and-drop operation.  It also indicates whether the state of the Shift, Ctrl, or Alt keys changes while dragging.  You wire up an event handler on the control that the drag-and-drop operation originates from.

In the example below, the source control waits for the left mouse button to be released and then clears its content.

        <Label Content="Drag from here" Background="LavenderBlush"
               HorizontalAlignment="Center" Margin="10" Padding="10"
               MouseDown="Label1_MouseDown"
               QueryContinueDrag="Label1_QueryContinueDrag"/>
        <Label Content="To here" Background="SandyBrown" AllowDrop="True"
               HorizontalAlignment="Center" Margin="10" Padding="10"
               Drop="Label2_Drop"/>

        private void Label1_MouseDown(object sender, MouseButtonEventArgs e)
        {
            Label lblFrom = e.Source as Label;

            if (e.LeftButton == MouseButtonState.Pressed)
                DragDrop.DoDragDrop(lblFrom, lblFrom.Content, DragDropEffects.Copy);
        }

        private void Label1_QueryContinueDrag(object sender, QueryContinueDragEventArgs e)
        {
            Label lblFrom = e.Source as Label;

            if (!e.KeyStates.HasFlag(DragDropKeyStates.LeftMouseButton))
                lblFrom.Content = "...";
        }

        private void Label2_Drop(object sender, DragEventArgs e)
        {
            string draggedText = (string)e.Data.GetData(DataFormats.StringFormat);

            Label toLabel = e.Source as Label;
            toLabel.Content = draggedText;
        }

730-001
730-002

#729 – Mouse.GetPosition Doesn’t Work While Dragging

If you are handling the DragOver event during a drag-and-drop operation and you want to find the current mouse position, you need to use DragEventArgs.GetPosition, rather than the static Mouse.GetPosition method.

In the example below, we initiate a drag-and-drop operation in a window and then try reporting the mouse’s position in the window’s DragOver handler.  We try using both methods to get the mouse position, but only the DragEventArgs.GetPosition method works.

<Window x:Class="WpfApplication2.MainWindow"
        xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
        xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
        Title="Application 2" Height="350" Width="325"
        MouseDown="Window_MouseDown"
        AllowDrop="True" DragOver="Window_DragOver">
    <StackPanel>
        <Label Name="lblInfo1" Content="Info 1"/>
        <Label Name="lblInfo2" Content="Info 2"/>
    </StackPanel>
</Window>


        private void Window_MouseDown(object sender, MouseButtonEventArgs e)
        {
            DragDrop.DoDragDrop((DependencyObject)e.Source, "Sample", DragDropEffects.Copy);
        }

        private void Window_DragOver(object sender, DragEventArgs e)
        {
            System.Windows.Point p1 = Mouse.GetPosition(this);
            lblInfo1.Content = string.Format("Mouse.GetPosition: {0}, {1}", p1.X, p1.Y);

            System.Windows.Point p2 = e.GetPosition(this);
            lblInfo2.Content = string.Format("DragEventArgs.GetPosition: {0}, {1}", p2.X, p2.Y);
        }

729-001

#728 – Using the Clipboard to Transfer Other Types of Data

As with drag-and-drop, you can transfer data between two running WPF applications in a variety of formats, using the clipboard.

The full list of data formats that you can specify is available as a set of static fields in the System.Windows.DataFormats class.  Keep in mind that these are just labels used by the two applications to tell each other what data format is being transferred.

Below is an example of transferring some XAML data between two applications using the clipboard.

On the copy side:

        private void btnCopy_Click(object sender, RoutedEventArgs e)
        {
            string xaml = XamlWriter.Save(e.Source);
            DataObject data = new DataObject(DataFormats.Xaml, xaml);

            Clipboard.SetDataObject(data);
        }

On the paste side:

        private void btnPaste_Click(object sender, RoutedEventArgs e)
        {
            IDataObject data = Clipboard.GetDataObject();
            if (data.GetDataPresent(DataFormats.Xaml))
            {
                string xaml = (string)data.GetData(DataFormats.Xaml);
                MessageBox.Show(xaml);
            }
        }

728-001
728-002