
Handling Touch with Mouse Events

Mouse events can be used to detect a simple, one-finger gesture. Mouse event handlers can be quickly added to your app and provide an easy way to get basic touch support.

For more complex gestures, additional code is required to translate raw coordinate changes into the actual gesture type. As you see in later sections of this chapter, there are easier, higher-level APIs for handling more complex gestures.
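To illustrate the kind of additional code required, the following is a minimal sketch of detecting a horizontal swipe using only mouse events. The SwipeView page, the handler names, and the 100-pixel threshold are illustrative assumptions rather than part of the sample app; the handlers are assumed to be wired to an element's MouseLeftButtonDown and MouseLeftButtonUp events in XAML.

using System.Windows;
using System.Windows.Input;
using Microsoft.Phone.Controls;

public partial class SwipeView : PhoneApplicationPage
{
    // Minimum horizontal distance, in pixels, for a movement to be
    // treated as a swipe. The value is an arbitrary choice.
    const double SwipeThreshold = 100;

    Point startPoint;

    public SwipeView()
    {
        InitializeComponent();
    }

    void HandleLeftButtonDown(object sender, MouseButtonEventArgs e)
    {
        // Record where the touch point made contact, relative to the
        // root of the page content.
        startPoint = e.GetPosition(null);
    }

    void HandleLeftButtonUp(object sender, MouseButtonEventArgs e)
    {
        // Compare the release position with the contact position
        // to decide whether the movement qualifies as a swipe.
        Point endPoint = e.GetPosition(null);
        double deltaX = endPoint.X - startPoint.X;

        if (deltaX > SwipeThreshold)
        {
            MessageBox.Show("Swiped right");
        }
        else if (deltaX < -SwipeThreshold)
        {
            MessageBox.Show("Swiped left");
        }
    }
}

Even this simple gesture requires tracking state across several events, which is the burden the higher-level APIs remove.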

In this chapter, the term touch point is used to describe the point of contact between a finger and the device screen.

The UIElement class includes the following mouse events:

MouseEnter: Raised when a touch point enters the bounding area of a UIElement.

MouseLeave: Raised when the user taps and moves his finger outside the bounding area of the UIElement.

MouseLeftButtonDown: Raised when the user touches a UIElement.

MouseLeftButtonUp: Raised when a touch point is removed from the screen while it is over a UIElement (or while a UIElement holds mouse capture).

MouseMove: Raised when the coordinate position of the touch point changes while over a UIElement (or while a UIElement holds mouse capture; mouse capture is demonstrated in the sketch after this list).

MouseWheel: This event is not used on a touch-driven UI.
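Mouse capture, mentioned for the MouseLeftButtonUp and MouseMove events, directs subsequent mouse events to a particular element even after the touch point leaves its bounds, which is essential for drag interactions. The following is a minimal sketch of a Border capturing the mouse for the duration of a drag; the handler names are illustrative assumptions, and the handlers are assumed to be attached to the Border in XAML.

void HandleLeftButtonDown(object sender, MouseButtonEventArgs e)
{
    Border border = (Border)sender;
    // Direct subsequent mouse events to the Border, even if the
    // touch point moves outside its bounding area.
    border.CaptureMouse();
}

void HandleMouseMove(object sender, MouseEventArgs e)
{
    Border border = (Border)sender;
    // Because the Border holds mouse capture, MouseMove continues
    // to be raised while the touch point is outside the Border.
    Point position = e.GetPosition(border);
    // ... update the drag visual using position ...
}

void HandleLeftButtonUp(object sender, MouseButtonEventArgs e)
{
    Border border = (Border)sender;
    // Release capture when the touch point is removed from the screen.
    border.ReleaseMouseCapture();
}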

The following example responds to the MouseLeftButtonDown, MouseLeftButtonUp, and MouseLeave events to change the background color of a Border control. See the following excerpt from the MouseEventsView.xaml file:

<Grid x:Name="ContentPanel" Grid.Row="1" Margin="12,0,12,0">
   <Border
        MouseLeftButtonDown="HandleLeftButtonDown"
        MouseLeftButtonUp="HandleLeftButtonUp"
        MouseLeave="HandleMouseLeave"
        Height="100"
        Width="200"
        Background="{StaticResource PhoneAccentBrush}" />
</Grid>

The event handlers in the code-beside change the color of the Border (see Listing 12.1).

LISTING 12.1. MouseEventsView Class


public partial class MouseEventsView : PhoneApplicationPage
{
    // Brush applied to the Border while a touch point is pressing it.
    readonly SolidColorBrush dragBrush = new SolidColorBrush(Colors.Orange);
    // Brush applied when no touch point is pressing the Border.
    readonly SolidColorBrush normalBrush;

    public MouseEventsView()
    {
        InitializeComponent();

        // Retrieve the accent brush, which is also the Border's
        // initial background in the XAML.
        normalBrush = (SolidColorBrush)Resources["PhoneAccentBrush"];
    }

    void HandleLeftButtonDown(object sender, MouseButtonEventArgs e)
    {
        // A touch point has made contact with the Border; highlight it.
        Border border = (Border)sender;
        border.Background = dragBrush;
    }

    void HandleLeftButtonUp(object sender, MouseButtonEventArgs e)
    {
        // The touch point has been removed; restore the original color.
        Border border = (Border)sender;
        border.Background = normalBrush;
    }

    void HandleMouseLeave(object sender, MouseEventArgs e)
    {
        // The touch point has left the Border's bounds; restore the color.
        Border border = (Border)sender;
        border.Background = normalBrush;
    }
}


The MouseButtonEventArgs class derives from MouseEventArgs, which in turn derives from RoutedEventArgs. The mouse events are routed events, which are discussed in more detail later in this chapter.
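Because the mouse events bubble, a handler can prevent ancestor elements from also receiving an event. The following is a minimal sketch of a variation of the HandleLeftButtonDown handler from Listing 12.1; note that the Handled property is defined on MouseButtonEventArgs rather than on the base MouseEventArgs, and GetPosition returns the touch point's coordinates relative to the specified element.

void HandleLeftButtonDown(object sender, MouseButtonEventArgs e)
{
    // The touch point's coordinates, relative to the element
    // that raised the event.
    Point position = e.GetPosition((UIElement)sender);

    // Mark the event as handled so that ancestor elements, such as
    // the containing Grid, do not also receive it.
    e.Handled = true;
}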

Although mouse events provide a simple way to respond to touch in your app, they do not come equipped to support complex gestures or multitouch, nor do they provide any built-in means for ascertaining more detailed touch information, such as touch velocity. Fortunately, as you see later in the chapter, a number of alternative touch APIs do just that. For now, though, turn your attention to the Touch and TouchPoint classes that represent the low-level touch API.
