Testing Windows Embedded Standard 2009 Images
Gordon H. Smith, Windows Embedded MVP
If you are an embedded developer or OEM who creates embedded devices, Windows Embedded Standard 2009 gives you a powerful and flexible toolkit for creating custom operating system images. Your images can range from comprehensive ones that contain many features to specialized ones that include only the technologies necessary to serve a particular solution. Because operating system images vary significantly among OEMs and projects, it is the responsibility of each OEM quality assurance team to validate the stability of those images. This article discusses some considerations to keep in mind when validating the stability of your custom Standard 2009 operating system images.
Testing embedded images poses challenges that developers typically do not encounter with traditional operating system installations. For example, you cannot assume that the embedded image has basic functionality. You must verify that device drivers work correctly, that installation programs run successfully, and that specific embedded feature configurations produce the expected functionality. Additionally, a Standard 2009 image may lack some of the tools that you would typically use to perform the validation. Your device may be remotely administered and lack a graphical user interface, or it may start in a custom shell that limits your ability to view log files or run diagnostic utilities.
You must be creative with your testing approach to overcome image configurations that pose challenges for traditional testing. For a device without a user interface, you may be able to use Remote Desktop or copy log files from the device to another computer. If the device runs a custom shell, you may be able to enter a maintenance mode that provides access to a command prompt for running diagnostic utilities. At a minimum, you should explore the following areas of your Standard 2009 image.
The event log contains informational, warning, and error entries. In custom operating system images, it is important to identify the entries that you must resolve. Because you most likely omitted features when creating your custom operating system image, you may notice event log entries that would be worrisome in a full Windows XP Professional installation yet are excusable in your image. These entries can result from operating system features that detect the absence of other features and are not truly a problem in your particular image. Conversely, you may also notice entries that you would very rarely see in a full Windows XP Professional installation yet that warrant resolution in Standard 2009, such as entries about device drivers or services that are required for your solution but could not be loaded. In short, examine the event log and work toward determining the root cause of entries to your satisfaction. Many of these discovery efforts have been pursued by other OEMs in the past; to learn from their investigations, see the Windows Embedded Standard or XP Embedded newsgroups.
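One way to make this triage repeatable is to keep a list of entries you have already root-caused as benign for your image and filter exported event log data against it. The following sketch illustrates the idea; the entry data and the "known benign" list are fabricated examples, and you should build your own list from investigations on your actual image.

```python
# Sketch: triage event log entries exported from an embedded image.
# The sample entries and the KNOWN_BENIGN set are illustrative
# assumptions, not real diagnoses -- root-cause each entry yourself.

KNOWN_BENIGN = {
    # (source, event ID) pairs explained by intentionally omitted
    # features (hypothetical example entry).
    ("Service Control Manager", 7000),
}

def triage(entries):
    """Return error/warning entries that still need investigation."""
    return [e for e in entries
            if e["type"] in ("Error", "Warning")
            and (e["source"], e["event_id"]) not in KNOWN_BENIGN]

entries = [
    {"source": "Service Control Manager", "event_id": 7000,
     "type": "Error", "message": "Service failed to start (feature omitted)."},
    {"source": "atapi", "event_id": 9, "type": "Error",
     "message": "Device did not respond within the timeout period."},
    {"source": "eventlog", "event_id": 6005, "type": "Information",
     "message": "The Event log service was started."},
]

for e in triage(entries):
    print(f'{e["source"]} ({e["event_id"]}): {e["message"]}')
```

As your investigation progresses, the benign list becomes documentation of your image's expected behavior, which is useful for regression testing later builds.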
The log written by First Boot Agent (FBA) is an excellent resource for determining certain classes of errors. In addition to the errors detailed in Common FBA Log File Errors, you may also check the FBA log to verify the configuration of embedded enabling features, such as the Enhanced Write Filter (EWF).
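If you copy Fbalog.txt off the device, a simple scan can surface the lines worth reading first. The sketch below searches for error indicators and EWF-related lines; the sample log text is fabricated for illustration, so adjust the patterns to the actual format of your image's FBA log.

```python
# Sketch: pull error lines and EWF-related lines out of an FBA log.
# The sample text below is invented for illustration only; inspect
# your own Fbalog.txt to confirm what its lines actually look like.

def scan_fba_log(text):
    errors, ewf_lines = [], []
    for line in text.splitlines():
        lowered = line.lower()
        if "error" in lowered or "failed" in lowered:
            errors.append(line)
        if "ewf" in lowered:
            ewf_lines.append(line)
    return errors, ewf_lines

sample = """\
FBA: Beginning phase 8500
FBA: EWF volume configuration completed
FBA: Installing device CRITICAL\\PNP0303 ... OK
FBA: Error 1722 installing component Example Component
"""
errors, ewf_lines = scan_fba_log(sample)
print(errors)
print(ewf_lines)
```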
You can use the SetupAPI log file to learn about application installation and setup, in addition to device driver installations, as detailed in Troubleshooting Device Installation with the SetupAPI Log File.
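In the Windows XP-era SetupAPI log, error and warning lines carry "#E" and "#W" prefixes, which makes them easy to extract mechanically. The sketch below demonstrates this; the sample log text itself is fabricated.

```python
# Sketch: surface error ("#E") and warning ("#W") lines from a
# setupapi.log capture. The marker convention follows the XP-era
# SetupAPI log format; the sample lines are invented for illustration.
import re

def setupapi_problems(text):
    return [line for line in text.splitlines()
            if re.match(r"#(E|W)\d", line)]

sample = """\
#I022 Found "ACPI\\PNP0303" in C:\\WINDOWS\\inf\\keyboard.inf.
#E358 An unsigned or incorrectly signed file was installed.
#W157 Default installer failed.
"""
for line in setupapi_problems(sample):
    print(line)
```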
Most Standard 2009 operating system images require all devices to be functional and recognized by the operating system. A great resource for verifying the state of devices is Device Manager. You can start Device Manager in several ways, such as from the System Control Panel or by running Devmgmt.msc directly from the command prompt. Within the user interface, incomplete or missing device driver installations are marked with a yellow exclamation mark, such as the Fingerprint Sensor shown here.
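For devices without the Device Manager user interface, you can perform the same check against Plug and Play problem codes, where a nonzero code corresponds to the yellow exclamation mark. The sketch below works on sample data; on a live image you could gather the real codes with a query tool such as WMI's Win32_PnPEntity class, where code 28 means the drivers for the device are not installed.

```python
# Sketch: flag devices whose Plug and Play problem code is nonzero,
# mirroring the yellow exclamation mark in Device Manager. The device
# list here is sample data, not output from a real device query.

def problem_devices(devices):
    return [(name, code) for name, code in devices if code != 0]

devices = [
    ("Standard 101/102-Key Keyboard", 0),
    ("Fingerprint Sensor", 28),  # code 28: drivers are not installed
]

for name, code in problem_devices(devices):
    print(f"{name}: problem code {code}")
```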
It is outside the scope of this article to discuss how to test functionality that is specific to your application. Instead, this article focuses on verifying that the custom Standard 2009 operating system image that you created supports your application. When testing your application on your custom operating system image, pay extra attention to how it works with operating system resources. For example, a common image creation issue is missing fonts. If you notice user interface elements that seem irregularly sized or spaced, the cause may be a system font mismatch. Fortunately, you can easily find most of the static DLL dependencies when you start applications for the first time. What is more insidious is the failure to dynamically load DLLs or other dependencies at run time. Those failures may be very obvious or very subtle, based on how your application code handles the failure condition. You should carefully check those areas of your application and verify the success of that level of interaction between your application and the Standard 2009 operating system.
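A practical guard against both kinds of missing dependency is to maintain a manifest of every file your application needs, including fonts and DLLs it loads dynamically, and check the image against it. The sketch below simulates an image root in a temporary directory; the manifest entries are hypothetical names chosen for illustration.

```python
# Sketch: verify that a manifest of file dependencies (fonts, DLLs,
# and so on) actually exists on the image. The manifest paths are
# hypothetical; build yours from your application's real dependencies,
# including anything it loads dynamically at run time.
import os, tempfile

def missing_dependencies(root, manifest):
    return [rel for rel in manifest
            if not os.path.exists(os.path.join(root, rel))]

# Simulate an image root with one of two expected files present.
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "Windows", "Fonts"))
open(os.path.join(root, "Windows", "Fonts", "tahoma.ttf"), "w").close()

manifest = [
    os.path.join("Windows", "Fonts", "tahoma.ttf"),
    os.path.join("Windows", "System32", "example.dll"),  # hypothetical
]
missing = missing_dependencies(root, manifest)
print(missing)
```

Running the same manifest check on every new image build turns a subtle run-time failure into an obvious build-time one.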
There are two tools, AppVerifier and Driver Verifier, that you may want to use to verify your applications or custom drivers. If your custom operating system image makes it difficult to run these tools, consider running them under a full Windows XP Professional installation to resolve issues before deploying your application to your custom Standard 2009 image.
The third and final area in evaluating the stability of your operating system image is to verify that the specific embedded customizations you included in your image function as expected. Testing some of these embedded customizations can be very simple. If you have enabled your device to start from a CD, USB flash drive, or a remote server, it is fairly easy to determine whether the operating system starts. That leaves four additional primary areas to examine: Write Filters, Hibernate Once Resume Many (HORM), custom shells, and deployment.
Write Filters
There are two ways that you can investigate write filters: configuration testing and feature testing. If you have access to a command prompt, you can run Ewfmgr.exe or Fbwfmgr.exe to confirm the configuration of those features. As mentioned earlier, you can also learn whether the EWF configuration performed during FBA succeeded by inspecting Fbalog.txt.
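If you capture the Ewfmgr.exe output to a file, you can check the overlay state programmatically. The sample output below is only an approximation of the tool's report format, so capture and inspect your device's actual output before relying on a parser like this.

```python
# Sketch: parse captured "ewfmgr c:" output for the overlay state.
# The sample text approximates the tool's report format and is not
# verbatim output -- confirm the format on your own device.

def ewf_state(output):
    for line in output.splitlines():
        parts = line.split()
        if len(parts) >= 2 and parts[0] == "State":
            return parts[1]
    return None

sample = """\
Protected Volume Configuration
  Type            RAM
  State           ENABLED
  Boot Command    NO_CMD
"""
print(ewf_state(sample))  # prints ENABLED for this sample
```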
The straightforward testing approach for write filters is feature testing. After you have verified that the appropriate write filter is enabled, exercise its capabilities. Create conditions that cause writes to occur that should not persist to disk, and verify that restarting the operating system causes those changes to disappear. Conversely, if you have areas that should allow persistence (separate partitions, exclusions in the File-Based Write Filter, or registry data specified in the Registry Filter), verify that changes to that content persist across a restart. If committing changes is part of your solution, verify that capability also.
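You can structure this feature test as a data comparison: record the marker files you wrote before the restart, snapshot what survives afterward, and assert both directions of the expected behavior. The paths and snapshot data below are illustrative.

```python
# Sketch: a data-driven check of write-filter behavior across a
# restart. "changed" maps path -> content written before the restart;
# "after" maps path -> content observed after the restart. The sample
# paths and snapshots are invented for illustration.

def check_write_filter(changed, after, protected, excluded):
    problems = []
    for p in protected:
        if after.get(p) == changed[p]:
            problems.append(f"{p}: write persisted on a protected volume")
    for p in excluded:
        if after.get(p) != changed[p]:
            problems.append(f"{p}: write was lost in an exclusion area")
    return problems

changed = {r"C:\marker.txt": "test", r"C:\Data\marker.txt": "test"}
after   = {r"C:\Data\marker.txt": "test"}  # protected write discarded
problems = check_write_filter(changed, after,
                              protected=[r"C:\marker.txt"],
                              excluded=[r"C:\Data\marker.txt"])
print(problems)  # an empty list means both behaviors were correct
```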
Pay extra attention to the state of write filters in your manufacturing process. Some deployments enable the write filters during the production of units, for example, to guarantee the creation of a unique Security Identifier (SID) per device. Make sure that a shipment-ready unit performs as expected relative to write filter configuration and functionality.
Hibernate Once Resume Many (HORM)
The main purpose for including HORM support in your image is to increase startup speed. What may not be immediately obvious during testing are the scenarios in which you must regenerate your hibernation file in the field. If you intend for a device that includes HORM to apply periodic updates, you must regenerate the hibernation file to keep it in sync with the updated contents of your disk. Some exceptions exist, such as cases where the only updates are made to a volume that is mounted after the operating system starts (and therefore no stale file system metadata is present in the hibernation file).
Custom Shells
Beyond testing the application-specific functionality of a custom shell, test several targeted areas, such as the following:
Shutdown and restart
Is the system built to survive a sudden loss of power (for example, pulling the power cord)? If not, it is the responsibility of the shell to expose to the user a mechanism for a graceful shutdown or restart.
Is that an end-user capability of the system? If so, test it. If not, make sure that you can access that feature from whatever serves as a maintenance mode for your device.
Does your device have a separate mode for maintenance operations? If so, verify that you can enter and exit that mode as required. For example, you may want each user on the system to have his or her own shell. The maintenance user perhaps uses the Windows Explorer shell, whereas the end-user exercises your embedded application as his or her shell. Whatever approach you decide to take for maintenance operations, validate that your approach works. Finding out in the field that your maintenance mode for servicing your image is inaccessible will greatly escalate your servicing costs.
Deployment
Make sure that a shipment-ready unit retains all the features you want present with your device, such as auto logon information, configuration of custom shells, and so on.
Does your device have to regenerate a SID upon first start in the field or does that occur in your manufacturing process? Does your device have to support domain join in the field? If so, verify that joining a domain works as required. For example, if you have a write filter enabled and must support domain join, have you included the Registry Filter to enable the persistence of domain secret keys?
What you have learned
This article described many of the testing issues that are unique to Standard 2009 operating system images. Given the possible combinations of components in Standard 2009 images, testing Standard 2009 deployments can be challenging. Keep a list of the reasons you are using Standard 2009 and of what differentiates your custom Standard 2009 image from a standard deployment of Windows XP Professional, and create your test plans accordingly.