#739 – Handling Touch Input at Different Levels
January 22, 2013
In WPF, there are three different ways that your application can support touch input:
- Built-in support for touch. Some elements automatically respond to touch input. For example, you can trigger a button's Click event by tapping the button, or scroll a ListBox by touching it and dragging (first sketch below).
- Manipulation Events. User interface elements support a series of manipulation events that let you detect when the user is trying to rotate, scale (zoom), or translate (move) an element. The touch points from two fingers are automatically mapped to an event with the correct data; for example, spreading two fingers apart triggers an event indicating that the user wants to zoom in (second sketch below).
- Raw Touch Events. You can handle individual events for touch down, up, and move actions on an element, for all supported touch points. For example, you can track the locations of 10 fingers touching the screen at the same time (third sketch below).
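
The first case needs no touch-specific code at all. As a minimal sketch (assuming a Button named myButton, a hypothetical name), an ordinary Click handler fires whether the button is clicked with the mouse or tapped with a finger:

```csharp
// No touch-specific code needed: the same Click handler responds
// to a mouse click or a finger tap.
// "myButton" is a hypothetical Button defined elsewhere.
myButton.Click += (sender, e) =>
{
    MessageBox.Show("Button was clicked or tapped");
};
```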
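Here is a minimal sketch of the manipulation events, assuming a Rectangle named box sitting in a Canvas named canvas (both hypothetical names). Setting IsManipulationEnabled to true causes WPF to raise ManipulationDelta events containing the translation, scale, and rotation already computed from the touch points:

```csharp
// Enable manipulation events; WPF then converts raw touch points
// into higher-level deltas (translation, scale, rotation).
box.IsManipulationEnabled = true;
box.RenderTransform = new MatrixTransform();

box.ManipulationStarting += (sender, e) =>
{
    // Report coordinates relative to the parent canvas
    // ("canvas" is a hypothetical parent element).
    e.ManipulationContainer = canvas;
};

box.ManipulationDelta += (sender, e) =>
{
    // DeltaManipulation carries the change since the last event.
    var delta = e.DeltaManipulation;
    var matrix = ((MatrixTransform)box.RenderTransform).Matrix;

    // Apply a pinch/spread as a zoom centered on the fingers.
    matrix.ScaleAt(delta.Scale.X, delta.Scale.Y,
                   e.ManipulationOrigin.X, e.ManipulationOrigin.Y);

    // Apply a one- or two-finger drag as a move.
    matrix.Translate(delta.Translation.X, delta.Translation.Y);

    box.RenderTransform = new MatrixTransform(matrix);
};
```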
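And a minimal sketch of the raw touch events, again assuming a Canvas named canvas (hypothetical name). Each finger corresponds to a TouchDevice with a unique Id, which is how simultaneous touch points are told apart:

```csharp
canvas.TouchDown += (sender, e) =>
{
    // Capture the touch so this element keeps receiving its events.
    canvas.CaptureTouch(e.TouchDevice);
    TouchPoint tp = e.GetTouchPoint(canvas);
    Console.WriteLine("Finger {0} down at {1}", e.TouchDevice.Id, tp.Position);
};

canvas.TouchMove += (sender, e) =>
{
    TouchPoint tp = e.GetTouchPoint(canvas);
    Console.WriteLine("Finger {0} moved to {1}", e.TouchDevice.Id, tp.Position);
};

canvas.TouchUp += (sender, e) =>
{
    canvas.ReleaseTouchCapture(e.TouchDevice);
    Console.WriteLine("Finger {0} up", e.TouchDevice.Id);
};
```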