Raw Touch Events

Touch events are the multi-touch equivalent of mouse events. In WPF4, we’ve added touch events to UIElement, UIElement3D and ContentElement. Multi-touch events follow the same patterns as mouse events, with TouchDown, TouchMove and TouchUp, their Preview variants, and TouchEnter and TouchLeave. Multi-touch events in WPF are hit-tested and routed through the element tree just like other input events. This is different from the low-level, application-wide touch event support in Silverlight 3; I’ll discuss that later.

(For Microsoft Surface SDK developers, TouchDevice corresponds to the Contact class, and Touch events correspond to Contact events).



Each touch contact (finger) is associated with a different TouchDevice. You can use the instance or the Id of the TouchDevice to keep track of the contact between TouchDown and TouchUp events.

WPF also supports the notion of multiple capture for touch events. When an input device is captured to an element, all input events from that device are routed from that element regardless of whether the input is directly over it; no hit-testing needs to be performed. With multi-touch events, each element can capture multiple TouchDevices, and multiple elements can hold captures simultaneously. This feature allows WPF to support multiple simultaneously active controls, and it is one of the main differences between WPF and the native Win32 touch support.

The following is a simple example of handling multi-touch events. In this example, we place a shape on a canvas under each finger, and move the shapes as the fingers move.


First, we hook up the touch event handlers on the canvas:

<Canvas x:Name="_canvas"
        Background="LightYellow"
        TouchDown="_canvas_TouchDown"
        TouchMove="_canvas_TouchMove"
        TouchUp="_canvas_TouchUp">
</Canvas>


The TouchDown handler creates a shape for the particular touch device, adds the shape to the canvas, moves it under the finger position, and stashes it away in a dictionary for later:

private void _canvas_TouchDown(object sender, TouchEventArgs e)
{
    // New shape for this touch device
    var shape = CreateSomeShape();
    // Get the touch point relative to _canvas coordinates
    var origin = e.GetTouchPoint(_canvas);
    // Move the shape to the touch point
    shape.RenderTransform = new TranslateTransform(
        origin.Position.X - shape.RenderSize.Width / 2,
        origin.Position.Y - shape.RenderSize.Height / 2);
    // Stash away the shape
    _shapes[e.TouchDevice] = shape;
    // Add the shape to the canvas
    _canvas.Children.Add(shape);
    _canvas.InvalidateVisual();
    // Capture the touch device
    _canvas.CaptureTouch(e.TouchDevice);
}

The TouchMove handler retrieves the shape associated with the touch device and moves it under the current finger position:

private void _canvas_TouchMove(object sender, TouchEventArgs e)
{
    if (e.TouchDevice.Captured == _canvas)
    {
        // Retrieve the shape associated with the device
        var shape = _shapes[e.TouchDevice];
        // Get the current touch point
        var origin = e.GetTouchPoint(_canvas);
        // Move the shape to the new position
        shape.RenderTransform = new TranslateTransform(
            origin.Position.X - shape.RenderSize.Width / 2,
            origin.Position.Y - shape.RenderSize.Height / 2);
    }
}

The TouchUp handler removes the associated shape from the canvas, and removes it from the dictionary:

private void _canvas_TouchUp(object sender, TouchEventArgs e)
{
    // Release the captured device
    _canvas.ReleaseTouchCapture(e.TouchDevice);
    // Clean up the shape
    _canvas.Children.Remove(_shapes[e.TouchDevice]);
    _shapes.Remove(e.TouchDevice);
}


Multi-touch in Silverlight 3 and 4

Silverlight 3 also supports multi-touch input on Windows 7. Both Silverlight and WPF allow you to retrieve the raw, window-level multi-touch input via the static Touch.FrameReported event.


GetTouchPoints returns the touch device positions relative to a particular element. Touch Up/Move/Down events are identified by the Action property of the TouchPoint class.

GetPrimaryTouchPoint returns the position of the first finger down in the current multi-touch input sequence.

Calling SuspendMousePromotionUntilTouchUp will prevent the touch devices from raising the equivalent mouse events. This allows panning to be implemented without also invoking mouse interactions like button clicks or listbox selections.
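Putting these APIs together, here is a minimal sketch of a Silverlight FrameReported handler. The `_rootVisual` field and the tracking comments are illustrative assumptions, not part of the Silverlight API:

```csharp
// Hook up once, e.g. in the page constructor:
// Touch.FrameReported += OnTouchFrameReported;

void OnTouchFrameReported(object sender, TouchFrameEventArgs e)
{
    // Position of the first finger in this input sequence,
    // relative to _rootVisual (an assumed root element field).
    TouchPoint primary = e.GetPrimaryTouchPoint(_rootVisual);

    if (primary != null && primary.Action == TouchAction.Down)
    {
        // Keep these touches from being promoted to mouse events,
        // so panning doesn't also click buttons under the fingers.
        e.SuspendMousePromotionUntilTouchUp();
    }

    // All current touch points, relative to _rootVisual.
    foreach (TouchPoint point in e.GetTouchPoints(_rootVisual))
    {
        switch (point.Action)
        {
            case TouchAction.Down:
                // e.g. remember point.TouchDevice.Id and start tracking it
                break;
            case TouchAction.Move:
                // e.g. pan by the delta from the last known position
                break;
            case TouchAction.Up:
                // e.g. stop tracking point.TouchDevice.Id
                break;
        }
    }
}
```

Note that SuspendMousePromotionUntilTouchUp can only be called while handling the Down of the primary touch point, which is why the sketch checks the primary point's Action first.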

To support Pan/Zoom/Rotate gestures in SL, Surface will be providing System.Windows.Input.Manipulations.dll (the .NET4 core manipulation and inertia processors) as a separate download in the future. This assembly is part of .NET4, and it is used to implement the WPF4 manipulation events.
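As a rough sketch of what using the core manipulation processor directly looks like (the ProcessFrame method and the wiring around it are assumptions for illustration; in WPF4 itself you would normally use the Manipulation* events on UIElement instead):

```csharp
using System.Collections.Generic;
using System.Windows.Input.Manipulations;

// Create a processor that recognizes translate, scale and rotate.
var processor = new ManipulationProcessor2D(
    Manipulations2D.Translate | Manipulations2D.Scale | Manipulations2D.Rotate);

processor.Delta += (sender, e) =>
{
    // e.Delta carries the incremental TranslationX/Y, ScaleX/Y and Rotation
    // since the last event; apply them to your element's transform here.
};

// On every touch frame, feed the current contacts to the processor.
// Each Manipulator2D is (id, x, y); the timestamp is in 100-ns ticks.
void ProcessFrame(IEnumerable<Manipulator2D> manipulators, long timestampTicks)
{
    processor.ProcessManipulators(timestampTicks, manipulators);
}
```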

Touch User Experience

There are some subtle differences in the user experience of multiple touches that are not immediately obvious. Take Button as an example. With the mouse, the click event is raised when the mouse button is released over the button. With multiple fingers pressing a button, however, you only want to raise the click event when the LAST finger is lifted from the button. All of the WPF and SL controls are built on mouse events; luckily, with the availability of the Surface SDK for WPF4, you will be able to get the ‘correct’ multi-touch user experience without having to write your own controls.
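To illustrate the ‘last finger up’ rule, here is a minimal sketch of the kind of bookkeeping such a control has to do (the overrides are simplified and are not the actual Surface SDK implementation; a real control would also verify the finger is still over the button):

```csharp
// Inside a hypothetical Button subclass: track the touch devices currently
// captured by this control, and raise a click only when the last one lifts.
private readonly HashSet<TouchDevice> _activeTouches = new HashSet<TouchDevice>();

protected override void OnTouchDown(TouchEventArgs e)
{
    CaptureTouch(e.TouchDevice);       // multiple capture: one per finger
    _activeTouches.Add(e.TouchDevice);
    e.Handled = true;
}

protected override void OnTouchUp(TouchEventArgs e)
{
    ReleaseTouchCapture(e.TouchDevice);
    _activeTouches.Remove(e.TouchDevice);
    if (_activeTouches.Count == 0)
    {
        // Last finger is up: only now is it correct to raise the click.
        OnClick();
    }
    e.Handled = true;
}
```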