Motion and device orientation for complex apps (Windows Runtime apps)
A complex app relies primarily on the Windows Runtime OrientationSensor class when it’s available. This class requires sensor hardware and a firmware implementation that fuses the readings of an accelerometer, a gyrometer, and a magnetometer.
When the OrientationSensor class isn’t available, an app has several fallback options. One is to rely on Euler angles derived from the accelerometer and magnetometer data.
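The fallback can be sketched in platform-neutral terms: pitch and roll come from the accelerometer’s gravity vector, and yaw (heading) comes from the magnetometer after compensating for tilt. The sketch below is illustrative only, not a Windows Runtime API; axis conventions and signs vary by device, so treat the formulas as one common convention rather than a definitive implementation.

```python
import math

def tilt_angles(ax, ay, az):
    """Pitch and roll (radians) from the gravity vector reported by the accelerometer."""
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    roll = math.atan2(ay, az)
    return pitch, roll

def tilt_compensated_yaw(mx, my, mz, pitch, roll):
    """Yaw (heading, radians) from magnetometer readings, compensated for device tilt."""
    # Rotate the measured magnetic field back into the horizontal plane
    # before taking the heading angle.
    xh = mx * math.cos(pitch) + mz * math.sin(pitch)
    yh = (mx * math.sin(roll) * math.sin(pitch)
          + my * math.cos(roll)
          - mz * math.sin(roll) * math.cos(pitch))
    return math.atan2(-yh, xh)
```

With the device lying flat (gravity along +z) and the field pointing along +x, all three angles come out zero, which is a quick sanity check for whichever axis convention a given device uses.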
Quaternion data requires an accelerometer and a magnetometer (or compass) at a minimum. Depending on the app, it may also require a gyrometer to give a “fluid” appearance to changes in view. When all three sensors are used, the fusion draws on a total of nine axes of data (three per sensor). A quaternion is most easily understood as a rotation of a point [x,y,z] about a single arbitrary axis (in contrast to a rotation matrix, which represents rotations around three fixed axes). The mathematics behind quaternions is fairly exotic because it involves the geometric properties of complex and imaginary numbers; however, working with quaternions is simple, and frameworks such as DirectX support them.
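The axis-and-angle interpretation can be made concrete with a few lines of math. The sketch below (illustrative Python, not a Windows Runtime or DirectX API) builds a unit quaternion from an axis and an angle and rotates a point with the standard q·p·q⁻¹ product:

```python
import math

def axis_angle_quaternion(axis, angle):
    """Unit quaternion (w, x, y, z) for a rotation of `angle` radians about `axis`."""
    ux, uy, uz = axis
    n = math.sqrt(ux * ux + uy * uy + uz * uz)
    s = math.sin(angle / 2.0) / n          # normalize the axis as we go
    return (math.cos(angle / 2.0), ux * s, uy * s, uz * s)

def quat_multiply(a, b):
    """Hamilton product of two quaternions."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw * bw - ax * bx - ay * by - az * bz,
            aw * bx + ax * bw + ay * bz - az * by,
            aw * by - ax * bz + ay * bw + az * bx,
            aw * bz + ax * by - ay * bx + az * bw)

def rotate_point(q, p):
    """Rotate point [x, y, z] by unit quaternion q using q * p * q^-1."""
    qc = (q[0], -q[1], -q[2], -q[3])  # conjugate == inverse for unit quaternions
    w, x, y, z = quat_multiply(quat_multiply(q, (0.0,) + tuple(p)), qc)
    return (x, y, z)
```

For example, rotating the point [1, 0, 0] by 90 degrees about the z-axis yields [0, 1, 0], exactly as the equivalent rotation matrix would.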
One example of a complex app is a car-racing game, which uses changes in the inclinometer’s yaw (or alpha) value to steer the car on the track. This sort of game can also use the quaternion data to keep the camera’s (or user’s) view “natural”: for example, if the car makes a sudden or sharp turn, the camera adjusts gradually rather than snapping to the new heading.
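One common way to achieve that gradual camera adjustment is spherical linear interpolation (slerp) between the camera’s current orientation and the target orientation each frame. The following is a minimal, framework-independent sketch (DirectX exposes equivalent helpers), assuming unit quaternions in (w, x, y, z) order:

```python
import math

def slerp(q0, q1, t):
    """Spherical linear interpolation between unit quaternions q0 and q1, t in [0, 1]."""
    dot = sum(a * b for a, b in zip(q0, q1))
    if dot < 0.0:                 # flip one quaternion to take the shorter arc
        q1 = tuple(-c for c in q1)
        dot = -dot
    if dot > 0.9995:              # nearly parallel: fall back to lerp + renormalize
        out = tuple(a + t * (b - a) for a, b in zip(q0, q1))
        n = math.sqrt(sum(c * c for c in out))
        return tuple(c / n for c in out)
    theta = math.acos(dot)
    s0 = math.sin((1.0 - t) * theta) / math.sin(theta)
    s1 = math.sin(t * theta) / math.sin(theta)
    return tuple(a * s0 + b * s1 for a, b in zip(q0, q1))
```

Calling `slerp` each frame with a small `t` (say, 0.1) eases the camera toward the car’s new heading at a constant angular rate instead of jumping there in one step.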
Other examples of complex apps are augmented-reality and first-person shooter games. These apps rely extensively on the quaternion, or the rotation matrix, to track the device’s orientation. In a first-person shooter game, you can emulate a camera mounted on the user’s helmet or head; integrating the quaternion data directly with the camera’s viewpoint produces an immersive user experience.