Learn about the user interaction platform, the input sources (including touch, touchpad, mouse, pen/stylus, and keyboard), modes (touch keyboard, mouse wheel, pen eraser, and so on), and user interactions supported by Windows, Windows Store apps, and Windows Phone Store apps.
Updates for Windows 8.1: Windows 8.1 introduces a number of updates and improvements to the pointer input APIs. See API changes for Windows 8.1 for more info.
Through guidelines, best practices, and examples, we show you how to take full advantage of the interaction capabilities of Windows to build apps with intuitive, engaging, and immersive user experiences.
See Responding to user interaction (XAML) for apps using C++, C#, or Visual Basic.
See Responding to user interaction (DirectX and C++) for apps using DirectX with C++.
Learn about events with Quickstart: adding HTML controls and handling events, and with Event capture and event bubbling with DOM events.
App features, start to finish: Explore this functionality in more depth as part of our App features, start to finish series.
User experience guidelines:
The platform control libraries (HTML and XAML) provide the full Windows user interaction experience, including standard interactions, animated physics effects, and visual feedback. If you don't need customized interaction support, use these built-in controls.
If the platform controls are not sufficient, these user interaction guidelines can help you provide a compelling and immersive interaction experience that is consistent across input modes. These guidelines are primarily focused on touch input, but they are still relevant for touchpad, mouse, keyboard, and stylus input.
- Guidelines for common user interactions
- Guidelines for cross-slide
- Guidelines for optical zoom and resizing
- Guidelines for panning
- Guidelines for rotation
- Guidelines for Semantic Zoom
- Guidelines for selecting text and images
- Guidelines for targeting
- Guidelines for visual feedback
Samples: See this functionality in action in our app samples.
Design your apps with touch interactions in mind. Touch input is supported by an increasingly large and varied number of devices, and touch interactions are a fundamental aspect of the Windows experience.
Because touch is a primary mode of interaction for users of Windows and Windows Phone, the platforms are optimized for touch input to make your apps responsive, accurate, and easy to use. And don't worry, tried-and-true input modes (such as mouse, pen, and keyboard) are fully supported and functionally consistent with touch (see Gestures, manipulations, and interactions). The speed, accuracy, and tactile feedback that traditional input modes provide are familiar and appealing to many users. These distinctive interaction experiences have not been compromised.
Be creative with the design of the user interaction experience. Support the widest range of capabilities and preferences to appeal to the widest possible audience, and attract more customers to your app.
From built-in controls to full customization, the user interaction platform is based on layers of functionality that progressively add flexibility and power.
Take advantage of the built-in controls to provide a common user interaction experience that works well for the majority of Windows and Windows Phone apps.
The built-in controls are designed from the ground up to be touch-optimized, while providing consistent and engaging interaction experiences across all input modes. They support a comprehensive set of gestures: press and hold, tap, slide, swipe, pinch, stretch, turn. When you couple these with direct manipulations, such as pan, zoom, rotate, drag, and realistic inertia behavior, they enable a compelling and immersive interaction experience that follows best practices consistently across the Windows platform.
For more info on the control libraries, see Adding controls and content.
Tweak the user interaction experience through the pan/scroll and zoom settings of your app views. An app view dictates how a user accesses and manipulates your app and its content. Views also provide behaviors such as inertia, content boundary bounce, and snap points.
Pan and scroll settings dictate how users navigate within a single view, when the content of the view doesn't fit within the viewport. A single view can be, for example, a page of a magazine or book, the folder structure of a computer, a library of documents, or a photo album.
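The snap points mentioned above constrain where a pan settles after the user lifts their finger. As a rough model of that behavior (the platform applies snap points automatically through view settings; the helper name and values below are hypothetical, for illustration only):

```javascript
// Illustrative model of mandatory snap points: when a pan ends, the
// content offset settles on the nearest multiple of the snap interval,
// clamped to the content boundaries. The platform does this for you;
// this sketch only models the behavior.
function settleToSnapPoint(offset, snapInterval, maxOffset) {
  const snapped = Math.round(offset / snapInterval) * snapInterval;
  // Content boundary: never settle outside the scrollable range.
  return Math.min(Math.max(snapped, 0), maxOffset);
}
```

With a 100-pixel snap interval, a pan released at offset 130 settles back to 100, while one released at 170 advances to 200.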
Zoom settings apply to both optical zoom and the SemanticZoom control. Semantic Zoom is a touch-optimized technique for presenting and navigating large sets of related data or content within a single view. It works by using two distinct modes of classification, or zoom levels. This is analogous to panning and scrolling within a single view. Panning and scrolling can be used in conjunction with Semantic Zoom.
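The two classification levels can be pictured as a detail view and a summary view over the same data. This sketch is only a model of the idea; in a real app the SemanticZoom control supplies both views, and the grouping rule here (first letter) is just an assumed example:

```javascript
// Hypothetical model of Semantic Zoom's two zoom levels:
// the zoomed-in level is the items themselves; the zoomed-out level
// shows one entry per group (here, items grouped by first letter).
function zoomedOutView(items) {
  const groups = new Set();
  for (const item of items) {
    groups.add(item.charAt(0).toUpperCase());
  }
  return [...groups].sort();
}
```

Tapping a group in the zoomed-out view would then navigate back into the zoomed-in view at that group's position.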
Use app views and the following Cascading Style Sheets (CSS) properties, Document Object Model (DOM) attributes, and DOM events to modify the pan/scroll and zoom behaviors. This can provide a smoother interaction experience than is possible through the handling of pointer and gesture events.
| API surface | Description |
| --- | --- |
| CSS property: `touch-action` | Specifies whether a given region can be manipulated through panning or zooming. |
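For example, a region that the app manipulates itself through pointer events can opt out of the default panning and zooming behaviors. In Windows 8 (Internet Explorer 10) the property is vendor-prefixed as `-ms-touch-action`; the unprefixed form is also shown here for later versions:

```css
/* Disable default panning/zooming on a custom drawing surface
   so the app receives the raw pointer events instead. */
#drawingSurface {
  -ms-touch-action: none; /* prefixed form (IE10 / Windows 8) */
  touch-action: none;     /* unprefixed form */
}

/* Allow only vertical panning in a list region. */
#listRegion {
  -ms-touch-action: pan-y;
  touch-action: pan-y;
}
```

The element IDs are hypothetical; apply the property to whichever regions your app manipulates directly.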
For more info on app views, see Defining layouts and views.
For more info on panning/scrolling, see Guidelines for panning.
A pointer is a generic input type with a unified event mechanism. It exposes basic info, such as screen position, on the active input source, which can be touch, touchpad, mouse, or pen. Gestures range from simple, static interactions like tapping to more complicated manipulations like zooming, panning, and rotating. For more details, see Gestures, manipulations, and interactions.
Static gesture events are triggered after an interaction is complete. Manipulation gesture events indicate an ongoing interaction. Manipulation gesture events start firing when the user touches the element and continue until the user lifts their finger, or the manipulation is canceled.
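The static/manipulation distinction can be sketched as a simple classification over an interaction's duration and travel distance. The platform's gesture recognizer does this for you; the thresholds and function name below are assumptions for illustration, not platform values:

```javascript
// Hypothetical sketch of how an interaction maps to a gesture class.
const HOLD_THRESHOLD_MS = 500; // assumed press-and-hold duration
const MOVE_THRESHOLD_PX = 10;  // assumed movement slop before a drag begins

function classifyInteraction(durationMs, distancePx) {
  if (distancePx >= MOVE_THRESHOLD_PX) {
    // Manipulation: events fire continuously while the interaction is ongoing.
    return "manipulation";
  }
  // Static gestures fire a single event after the interaction completes.
  return durationMs >= HOLD_THRESHOLD_MS ? "hold" : "tap";
}
```

A quick press with no movement classifies as a tap, a long press as a hold, and any sustained movement as a manipulation.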
Access to the pointer and gesture events enables you to use the Windows 8 Touch interaction design language for:
- Custom controls and feedback visuals
- Custom interactions
Take advantage of the built-in gesture recognition provided through these DOM gesture events:
| Type | DOM gesture events |
| --- | --- |
| Static gesture events | `MSGestureTap`, `MSGestureHold` |
| Manipulation gesture events | `MSGestureStart`, `MSGestureChange`, `MSGestureEnd`, `MSInertiaStart` |
Fully customize and control your app's interaction experience through the Windows Runtime APIs. With these APIs you can enable custom user interactions and handle additional configuration options and hardware capabilities, such as mouse devices with a right button, wheel button, tilt wheel, or X buttons, and pen devices with barrel buttons and eraser tips.
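For example, a pointer event's `buttons` bitmask encodes which of these device buttons are currently pressed, and decoding it is straightforward. The bit values below are from the pointer events model (1 = left/pen tip, 2 = right/pen barrel, 4 = wheel, 8 = X1, 16 = X2, 32 = pen eraser); the helper name is hypothetical:

```javascript
// Decode the `buttons` bitmask carried by pointer events into button names.
function decodeButtons(buttons) {
  const names = [];
  if (buttons & 1) names.push("left");    // left mouse button / pen tip
  if (buttons & 2) names.push("right");   // right mouse button / pen barrel button
  if (buttons & 4) names.push("wheel");   // mouse wheel button
  if (buttons & 8) names.push("x1");      // X1 (back) button
  if (buttons & 16) names.push("x2");     // X2 (forward) button
  if (buttons & 32) names.push("eraser"); // pen eraser tip
  return names;
}
```

A value of 5, for instance, means the left and wheel buttons are held together.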
Before you provide customized interaction experiences through new or modified gestures and manipulations, consider the following:
- Does an existing gesture provide the experience your app requires? Don't provide a custom gesture to zoom or pan when you can simply adapt your app to support or interpret an existing gesture.
- Does the custom gesture warrant the potential inconsistency across apps?
- Does the gesture require specific hardware support, such as number of contacts?
- Is there a control (such as SemanticZoom) that provides the interaction experience you need? If a control can intrinsically handle user input, is a custom gesture or manipulation required at all?
- Does your custom gesture or manipulation result in a smooth and natural interaction experience?
- Does the interaction experience make sense? If the interaction depends on such things as the number of contacts, velocity, time (not recommended), and inertia, ensure that these constraints and dependencies are consistent and discoverable. For example, how users interpret fast and slow can directly affect the functionality of your app and the users' satisfaction with the experience.
- Is the gesture or manipulation affected by the physical abilities of your user? Is it accessible?
- Will an app bar command or some other UI command suffice?
In short, create custom gestures and manipulations only if there is a clear, well-defined requirement and no basic gesture can support your scenario.