Performance Test Methodologies for Windows CE .NET

Mike Hall and Jason Browne
Microsoft Corporation

July 2002

Applies to:
    Microsoft® Windows® CE .NET 4.1 with Microsoft Platform Builder 4.1

Summary: Learn about performance tests and performance improvements between Windows CE .NET 4.0 and Windows CE .NET 4.1. The paper also describes the rationale behind the performance tests. (16 printed pages)

Contents

Introduction
Scenario Testing
Conclusion
For More Information

Introduction

Each release of Microsoft® Windows® CE .NET typically introduces support for a range of new technologies, additional hardware reference boards, and processors (and therefore new compilers), as well as new tools that aid device bring-up, development, and debugging. Adding support for new technologies can potentially degrade operating system performance; therefore, performance testing must be carried out during the development of Windows CE to ensure that the operating system performs at least as well as it did in previous releases, if not better.

In order to ensure optimal performance of the Windows CE .NET operating system, it is critical that performance testing be executed regularly. Performance tests are run at a high level so that the operating system is tested as a whole. For example, the networking components may perform at near the maximum speed of the networking hardware, but this does not ensure that Microsoft Internet Explorer's use of the network stack will produce optimal results: an application may not set up the stack optimally, or its use of synchronous calls may degrade the user experience.

Performance testing of Windows CE .NET focuses on a number of user scenarios, which together provide overall operating system performance results. For the purposes of this paper, we discuss a subset of the scenario tests, focusing on Remote Desktop Protocol (RDP), Internet Explorer, and multimedia technologies.

Scenario Testing

Windows CE .NET performance tests take a two-pronged approach. In the first approach, the performance of Windows CE .NET 4.1 is compared with the performance of Windows CE .NET 4.0. The output from these tests confirms that changes to the operating system do not negatively impact performance; the results also, of course, highlight performance improvements. The second approach is to compare the user scenarios between Windows CE .NET and Microsoft Windows XP. A number of technologies within Windows CE .NET, such as WinInet, are ported from the desktop. To compare Windows CE .NET performance figures with Windows XP, we run the scenario tests on hardware that is capable of running both Windows CE .NET and Windows XP, which produces like-for-like results.

One of the biggest issues with testing performance of an operating system is the dependency of performance results on the underlying hardware. CPU speed, memory bus speed, and graphics hardware are just some of the variables that have a major impact on performance test results. To minimize these variables, the scenario tests are run on hardware that can support both Windows CE .NET and Windows XP.

The Windows CE .NET images are all configured as Release images.

Client Device (Used as the Windows CE .NET and Windows XP Client)

A Microsoft Windows CE .NET PC-based hardware development platform (CEPC) reference platform with the following configuration is used as the client-side device in the Windows CE .NET and Windows XP scenario tests:

  • 233 MHz Pentium II processor
  • 64 MB memory (Windows CE), 256 MB memory (Windows XP)
  • PCMCIA Socket LPE Ethernet card
  • PCMCIA Cisco 802.11 wireless card
  • ESS1354 audio card
  • TVIA 5000 PCI video card (800 × 600 × 16)

A PCMCIA Socket LPE Ethernet 10 Mbps Network Interface Card (NIC) is used for both the Windows CE .NET and Windows XP scenario tests; tests were also conducted using a PCMCIA Cisco 802.11 wireless card. Both Windows CE .NET and Windows XP support the TVIA 5000 graphics card, which was chosen so that the hardware acceleration capabilities could be used on both operating systems, removing the graphics card as a bottleneck and allowing the performance of the operating system and applications to be tested with minimal impact from the graphics hardware. Most of the display drivers that ship with Windows CE .NET are supplied as samples, in source code form. This gives OEMs the ability to develop drivers, based on the Windows CE .NET samples, that are tailored to the graphics hardware the OEM has selected for their application.

Server Device

The following list shows the server hardware configuration used for RDP, Internet Explorer, and multimedia tests:

  • 1.2 GHz Intel Pentium III processor
  • 512 MB non-ECC SDRAM memory
  • NVIDIA GeForce2 video card (1024 × 768 × 32)
  • ASUS TUV4X Socket 370 motherboard
  • VIA Apollo Pro133T chipset
  • CMI8738 C3DX PCI sound card
  • Intel PRO/100 PCI NIC

Performance Tests

The Windows CE .NET development team has a dedicated performance test team that develops and executes the Windows CE .NET performance tests. Using a dedicated test team allows tests to be run in a controlled and standardized environment and ensures that any performance impact from changes to the operating system is detected.

The Windows CE .NET performance tests can be divided into two types—user-scenario tests and low-level tests. The user-scenario tests are used by the performance test team to ensure that the operating system components are working well together. The low-level tests are used by feature teams to maximize the performance of each component individually. Because each feature team is most knowledgeable about the low-level attributes of their feature, the feature teams are responsible for building and running the low-level tests. This paper describes user-scenario test configuration and results.

The User-Scenario Tests

Each of the scenarios described below has a test that is specifically designed to exercise typical usage of the relevant technology: RDP, Internet Explorer, or multimedia.

Each test scenario requires a client device, based on the CEPC hardware platform described above, running Windows CE .NET 4.1, Windows CE .NET 4.0, or Windows XP; in some cases, tests were also carried out using Windows CE 3.0.

The scenario tests were carried out on a private network, using only the client and server devices. The client and server were both configured to use fixed IP addresses. The following network configuration is used for the RDP, Internet Explorer, and multimedia tests: the server and client devices are connected through a 10 Mbps Ethernet hub.

Figure 1.

Remote Desktop scenario

The RDP tests run the RDP client and measure the time that each step in a typical RDP session takes. This includes starting the RDP client, connecting, logging on, running WinBench, logging off, and disconnecting. The RDP scenario uses a PC running Windows XP Professional as the server. The client device runs either a Windows CE .NET Webpad Enterprise configuration or Windows XP Professional; in both cases the client-side device uses the CEPC hardware configuration listed earlier. The server and client are connected through a 10 Mbps Ethernet hub, and both devices use the PCMCIA Socket LPE Ethernet 10 Mbps Network Interface Card (NIC) for the network connection. The RDP session is set up with 16-bit color depth and bitmap caching disabled.

To measure RDP performance timings, we use an RDP virtual channel. The virtual channel test harness has two components, one running on the server and the other running on the client. The server component sends a message to the client at the start of a test and again when the test is completed, and the time difference between the two is measured. A virtual channel is used because timing on the server alone would produce inaccurate values; we want to capture the time at which the client completes its drawing operations.
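The client-side timing logic can be sketched as follows. This is an illustrative sketch, not the actual test harness: the TestTimer structure and the marker-handling functions are invented for this example, and the portable clock() call stands in for the GetTickCount() call a Windows CE client would use.

```c
#include <time.h>

/* Illustrative sketch of the client-side timing logic: stamp the clock when
   the server's "start" marker arrives on the virtual channel, and compute
   the elapsed milliseconds when the "end" marker arrives. TestTimer and the
   marker handlers are hypothetical names for this sketch. */
typedef struct {
    clock_t start;       /* tick when the start marker arrived */
    long    elapsed_ms;  /* filled in when the end marker arrives */
} TestTimer;

/* Called when the server's start-of-test message arrives. */
void on_start_marker(TestTimer *t)
{
    t->start = clock();
}

/* Called when the server's end-of-test message arrives, after the client
   has completed its drawing operations. */
void on_end_marker(TestTimer *t)
{
    t->elapsed_ms = (long)((clock() - t->start) * 1000 / CLOCKS_PER_SEC);
}
```

In the real harness these handlers would be invoked from the virtual channel DLL's data callback when the start and end messages are received.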

In the Terminal Services section of the Platform SDK, Virtual Channel Client Registration describes the process for creating a virtual channel.

The virtual channel DLL used in RDP testing exposes a callback function that handles the following events: CHANNEL_EVENT_CONNECTED, CHANNEL_EVENT_DISCONNECTED, CHANNEL_EVENT_INITIALIZED, CHANNEL_EVENT_TERMINATED, and CHANNEL_EVENT_V1_CONNECTED. The results of running an RDP session can be captured in this callback function.

The Remote Desktop scenario tests consist of the following client-side steps.

Connect to Host machine (server)

  • The total time from clicking Connect until the program is ready for the user to enter the logon password.

Log in

  • The total time from clicking OK until the system has logged on and is ready to execute applications. No special security measures are used other than password authentication.

Run WinBench

  • WinBench will be run on the server machine and the total execution time in milliseconds will be recorded.
  • The WinBench configuration file is as follows:

        [Config]
        Benchmark=WB99

        [Registration]
        UserName=zxcv
        UserOrg=zxcv
        SerialNumber=

        [Options]
        DemoMode=False
        ExitBenchmark=True
        SystemConfigurationCheck=False

        [Step1]
        Database=c:\ZDBENCH\RESULTS\RESULTS2001.ZTD
        RebootSystem=False
        SuiteSection=Suite1
        StartMessage=10,The batch mode tests will start in 10 seconds.
        EndMessage=10,The batch mode tests have finished.
        DirectDraw Tests:DirectDraw Hardware=0
        DirectDraw Tests:Show Test Name=Yes
        DirectDraw Tests:Full Screen=Yes
        DirectDraw Tests:Clipping Window=No
        DirectDraw Tests:Color Depth=8
        DirectDraw Tests:Screen Width=640
        DirectDraw Tests:Screen Height=480
        DirectDraw Tests:Test Type=0
        DirectDraw Tests:Blt Type=1
        DirectDraw Tests:Scaling Factor=1.0
        DirectDraw Tests:Source In Video=Yes
        DirectDraw Tests:Work In Video=Yes
        DirectDraw Tests:Blt Size Width=64
        DirectDraw Tests:Blt Size Height=64
        Disk Inspection Tests:CPU Utilization/Transfer Rate=4000
        Disk Inspection Tests:CPU Utilization/Block Size=16384
        Disk Inspection Tests:Transfer Rate/Bitmap Path=
        Disk Inspection Tests:Transfer Rate/CSV Path=
        Disk Inspection Tests:Transfer Rate/CSV Blocks Per Point=16
        GDI Tests:User-Supplied Bitmap=
        Graphics WinMarks:Show Record Numbers=No
        Graphics WinMarks:Allow VGA Display Resolution=No
        Graphics WinMarks:Patch GDI on Windows 95=Yes
        Common Properties:Disk Drive=c:\
        Common Properties:CDROM Drive=d:\
        Common Properties:Report CPU Util=No
        Common Properties:Retry On Errors=1
        Common Properties:Halt On Errors=Yes
        Common Properties:Repeat Count=1
        Common Properties:Variance Limit=3
        Common Properties:Training Run Count=1
        Common Properties:Error If Variance Exceeded=Yes
        Common Properties:Fast Inspection Tests=Yes
        Common Properties:Install Once=No
        Common Properties:Repeat Each Test=No
        Common Properties:Training Runs=No
        Common Properties:Reboot Frequency=2
        Common Properties:Defrag Frequency=2
        Common Properties:Reboot Delay=1

        [Suite1]
        GDI/CToS/BitBlt, All ROPs=True
        GDI/CToS/BitBlt, SRCCOPY=True
        GDI/CToS/StretchBlt, All ROPs=True
        GDI/CToS/StretchBlt, SRCCOPY=True
        GDI/MToS/BitBlt, All ROPs=True
        GDI/MToS/BitBlt, SRCCOPY=True
        GDI/MToS/StretchBlt, All ROPs=True
        GDI/MToS/StretchBlt, SRCCOPY=True
        GDI/S/Arc, Circular, Complete=True
        GDI/S/Arc, Circular, Partial=True
        GDI/S/Arc, Elliptical, Complete=True
        GDI/S/Arc, Elliptical, Partial=True
        GDI/S/BltDIBits, 1 bpp, SRCCOPY=True
        GDI/S/BltDIBits, 24 bpp, SRCCOPY=True
        GDI/S/BltDIBits, 4 bpp, SRCCOPY=True
        GDI/S/BltDIBits, 8 bpp, SRCCOPY=True
        GDI/S/Chord, Circular=True
        GDI/S/Chord, Elliptical=True
        GDI/S/Circle=True
        GDI/S/Ellipse=True
        GDI/S/ExtFloodFill, Border=True
        GDI/S/ExtFloodFill, Surface=True
        GDI/S/FillRgn=True
        GDI/S/FloodFill=True
        GDI/S/FrameRgn=True
        GDI/S/GetNearestColor=True
        GDI/S/InvertRgn=True
        GDI/S/Line, Diagonal=True
        GDI/S/Line, Horizontal=True
        GDI/S/Line, Vertical=True
        GDI/S/MoveTo=True
        GDI/S/MoveToEx=True
        GDI/S/PaintRgn=True
        GDI/S/PatBlt, All ROPs=True
        GDI/S/PatBlt, DESTINVERT=True
        GDI/S/PatBlt, PATCOPY=True
        GDI/S/PatBlt, WHITENESS=True
        GDI/S/Pie, Circular=True
        GDI/S/Pie, Elliptical=True
        GDI/S/Polygon, Few-sides=True
        GDI/S/Polygon, Many-sides=True
        GDI/S/Polygon, Trapezoid=True
        GDI/S/Polygon, Triangle=True
        GDI/S/SetDIBitsBlt, 1 bpp=True
        GDI/S/SetDIBitsBlt, 24 bpp=True
        GDI/S/SetDIBitsBlt, 4 bpp=True
        GDI/S/SetDIBitsBlt, 8 bpp=True
        GDI/S/SetDIBitsToDevice, 1 bpp=True
        GDI/S/SetDIBitsToDevice, 24 bpp=True
        GDI/S/SetDIBitsToDevice, 4 bpp=True
        GDI/S/SetDIBitsToDevice, 8 bpp=True
        GDI/S/StretchDIBits, 1 bpp, SRCCOPY=True
        GDI/S/StretchDIBits, 24 bpp, SRCCOPY=True
        GDI/S/StretchDIBits, 4 bpp, SRCCOPY=True
        GDI/S/StretchDIBits, 8 bpp, SRCCOPY=True
        GDI/S/Text, Times Roman 16=True
        GDI/S/Text, Times Roman 16, 45=True
        GDI/S/Text, Times Roman 16, 90=True
        GDI/SToS/BitBlt, All ROPs=True
        GDI/SToS/BitBlt, SRCCOPY=True
        GDI/SToS/StretchBlt, All ROPs=True
        GDI/SToS/StretchBlt, SRCCOPY=True
        USER/S/DrawFocusRect=True
        USER/S/DrawIcon=True
        USER/S/DrawText, Times Roman 16=True
        USER/S/FillRect=True
        USER/S/FrameRect=True
        USER/S/GrayString, Times Roman 16=True
        USER/S/InvertRect=True
        USER/S/ScrollDC=True
        USER/S/TabbedTextOut, Times Roman 16=True
        GDI Playback/Bus/Corel WordPerfect Suite 8=True
        GDI Playback/Bus/Lotus SmartSuite 97=True
        GDI Playback/Bus/Microsoft Office 97=True
  • Note that WinBench runs on the server device. The time taken to complete the WinBench test is affected by the performance of the RDP Client, but the WinBench score is not.

Logout

  • The total time from selecting Logout until the system is ready to log in again will be recorded.

Internet Explorer scenario

The Internet Explorer tests examine a number of typical user scenarios, including page load speed, loading small, medium, and large images, and file load speed. The industry-standard benchmark tool i-Bench is used to verify Internet Explorer performance. The Internet Explorer scenario uses a PC running Microsoft Windows 2000 Advanced Server with IIS 5.0 as the server; the server provides content for the client (test) machines. The client device runs either a Windows CE .NET Webpad Enterprise configuration or Windows XP Professional. In both cases, the client-side device uses the CEPC hardware configuration listed earlier. The server and client are connected through a 10 Mbps Ethernet hub, and both devices use the PCMCIA Socket LPE Ethernet 10 Mbps Network Interface Card (NIC) for the network connection.

To ensure maximum performance, the Internet Explorer 5.5 cache size should be set large enough to completely cache the i-Bench pages when i-Bench runs its cached tests. To ensure this, a value of 4096 KB was used for the cache setting in the Options dialog box of the sample browser (iesample).

The Internet Explorer test timings are conducted using a Browser Helper Object (BHO). A Browser Helper Object is an in-process Component Object Model (COM) component that Internet Explorer loads each time it starts. BHOs run in the same memory context as the browser and can perform any action on the available windows and modules. For example, a BHO can detect the browser's typical events (such as GoBack, GoForward, and DocumentComplete), access and modify the browser's menus and toolbars, create windows to display additional information about the currently viewed page, and install hooks to monitor messages and actions. The article Browser Helper Objects: The Browser the Way You Want It describes how to create BHO components.

The Internet Explorer scenario tests consist of the following steps.

Load Internet Explorer

  • The total time from calling CreateProcess() on Internet Explorer until the application is ready for user input is measured in milliseconds.

Load a single page on a private network (i-Bench.html, the i-Bench welcome page)

  • The total time from pressing ENTER until the Web site is fully loaded will be measured in milliseconds (the server is pinged first).
  • Browse forward.
    • The total time from pressing the Forward button until the Web site is fully loaded will be recorded in milliseconds. A dummy Web site is used for this operation.
  • Browse back.
    • The total time from pressing the Back button until the Web site is fully loaded will be recorded in milliseconds. A dummy Web site is used for this operation.
  • Refresh.
    • The total time from pressing the Refresh button until the Web site is fully loaded will be recorded in milliseconds.

Load a 500 KB .jpg and .gif

  • These files will be loaded from the server machine into the browser and the total download and display time will be recorded in milliseconds.

Load a 4 MB .jpg and .gif

  • These files will be loaded from the server machine into the browser and the total download and display time will be recorded in milliseconds.

Download a small (500 KB), medium (1 MB), and large (4 MB) .txt file

  • These files will be downloaded from the server machine and the total download time to the object store will be recorded in milliseconds.

Download a small (500 KB), medium (1 MB), and large (4 MB) .exe file

  • These files will be downloaded from the server machine and the total download time to the object store will be recorded in milliseconds.

Download a small (500 KB), medium (1 MB), and large (4 MB) uncompressible binary file

  • These files will be downloaded from the server machine and the total download time to the object store will be recorded in milliseconds.

Run i-Bench test

  • The i-Bench 3.0 HTML load pages test is run on the client.

Close Internet Explorer

  • The total time from clicking the Close window button until the application is completely unloaded will be measured in milliseconds.

Multimedia Technologies scenario

The multimedia technologies tests consist of playing a number of multimedia clips at low, medium, and high bandwidth and measuring dropped frames, bandwidth, and CPU utilization. The multimedia scenario uses a server running Windows XP Professional. The server provides streaming Microsoft Windows Media™ audio and video to the client over the network. The client device runs either a Windows CE .NET Webpad Enterprise configuration or Windows XP Professional; in both cases the client-side device uses the CEPC hardware configuration described earlier. The server and client are connected through a 10 Mbps Ethernet hub, and both devices use the PCMCIA Socket LPE Ethernet 10 Mbps Network Interface Card (NIC) for the network connection.

The multimedia server is running Windows Server 2003 with the following installed components: Microsoft Internet Information Server and Microsoft Windows Media Technologies.

The multimedia tests are designed to examine CPU utilization and the number of dropped frames during playback. There are a number of ways in which dropped frames can be calculated. For example, the number of dropped frames can easily be obtained through the Windows Media Player, or through interfaces exposed by the Windows Media ActiveX® control. Specifically, the IAMDroppedFrames interface can provide the number of dropped and non-dropped frames, as well as the frame rate and data rate achieved during playback. A call to GetNumDropped returns the total number of frames dropped since streaming started, and GetNumNotDropped returns the number of frames that were delivered successfully.
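As a worked illustration of how these two counters combine into a dropped-frame percentage, the following sketch uses a hypothetical helper (drop_rate_percent is not part of the IAMDroppedFrames interface; it simply performs the arithmetic on values returned by GetNumDropped and GetNumNotDropped):

```c
/* Hypothetical helper: given the counters returned by
   IAMDroppedFrames::GetNumDropped and GetNumNotDropped, compute the
   percentage of frames dropped during playback. */
double drop_rate_percent(long dropped, long not_dropped)
{
    long total = dropped + not_dropped;
    if (total == 0)
        return 0.0;            /* nothing played yet */
    return 100.0 * (double)dropped / (double)total;
}
```

For example, 25 dropped frames against 75 delivered frames yields a 25% drop rate.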

The multimedia scenario tests consist of the following steps.

Load the Windows Media Player in the Internet Explorer browser

  • The total time from calling CreateProcess() on Internet Explorer until the application is ready for user input will be measured in milliseconds.

Play Windows Media video files at the following bandwidths

  • For each of these bit rates, the frame rate and total number of dropped frames will be recorded:
    • High (300 kbps)
    • Medium (100 kbps)
    • Low (56 kbps)

Play Windows Media Audio and MP3 audio files at the following bit rates

  • For each of these bit rates, the number of dropped packets and CPU utilization will be recorded:
    • High (128 kbps)
    • Medium (32 kbps)
    • Low (8 kbps)
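The bit rates above can be cross-checked against the clip sizes listed in Table 6 at the end of this paper: a clip's approximate duration is its size in bytes times 8, divided by the bit rate. The helper below is illustrative only (clip_seconds is not part of any test harness); applied to the Buddy Holly clips, it shows that each encoding is the same song of roughly 159 seconds.

```c
/* Illustrative helper: approximate clip duration in seconds from the file
   size in bytes and the encoded bit rate in kilobits per second. Ignores
   container overhead, so the result is a slight overestimate. */
double clip_seconds(long bytes, int kbps)
{
    return (double)bytes * 8.0 / ((double)kbps * 1000.0);
}
```

For example, Buddy Holly_128.mp3 (2,550,096 bytes at 128 kbps) works out to about 159 seconds, and Buddy Holly_8.mp3 (159,336 bytes at 8 kbps) to about the same.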

The Results

During the development of Windows CE .NET, scenario test results are examined, and if any performance issues are discovered, bugs are logged with the appropriate feature team. Each feature team is supplied with the performance data so that it can determine both where it stands relative to Windows XP and how it compares with Windows CE .NET 4.0 and, in some cases, Windows CE 3.0. A number of performance issues were identified through scenario testing, resulting in several investigations. The following sections show the product improvements that resulted from this work.

Remote Desktop Protocol (RDP)

The following table shows the comparison of Windows CE .NET 4.0 and Windows CE .NET 4.1. The results show time in seconds.

Table 1.

Test Case—RDP        Windows CE .NET 4.0   Windows CE .NET 4.1
Time to connect      0.50                  0.33
Time to logon        4.26                  1.95
WinBench time        1,967                 1,583
Time to logoff       9.18                  9.78
Time to disconnect   0.57                  0.94

The following table shows the comparison of Windows CE .NET 4.1 and Windows XP. The results show time in seconds.

Table 2.

Test Case—RDP        Windows XP   Windows CE .NET 4.1
Time to connect      0.86         0.33
Time to logon        2.99         1.95
WinBench time        1,527        1,583
Time to logoff       7.90         9.78
Time to disconnect   0.11         0.94

The following list shows the key metrics for RDP performance improvements.

  • WinBench execution time on Windows CE .NET 4.1 was 20% faster than on Windows CE .NET 4.0.
  • WinBench execution time on Windows CE .NET 4.0 was 30% slower than on Windows XP; Windows CE .NET 4.1 is now only 4% slower than Windows XP (256 MB) on the same reference hardware (see the hardware description earlier).
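These percentages follow directly from the WinBench times in Tables 1 and 2. The following arithmetic sketch uses a helper name invented here (percent_change is not part of any test tool):

```c
/* Relative change from one WinBench time to another, as a percentage.
   Negative means the second run was faster; positive means it was slower. */
double percent_change(double from, double to)
{
    return 100.0 * (to - from) / from;
}
```

percent_change(1967, 1583) gives roughly -19.5, the "20% faster" figure for Windows CE .NET 4.1 over 4.0, and percent_change(1527, 1583) gives roughly +3.7, the "4% slower" figure relative to Windows XP.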

Internet Explorer

The following table compares Windows CE .NET 4.1 with Windows CE .NET 4.0, Windows XP, and Windows CE 3.0. The results show time in seconds.

Table 3.

Test Case—Internet Explorer   Windows XP   Windows CE .NET 4.0   Windows CE .NET 4.1   Windows CE 3.0
Load                          3.88         3.14                  3.01
Back                          0.28         0.25                  0.22
Forward                       0.26         0.32                  0.31
Refresh                       0.02         0.01                  0.00
Close                         0.60         0.47                  1.37
Blank                         1.57         0.40                  0.29
S JPG                         1.43         0.67                  0.51
L JPG                         13.67        3.27                  2.66
S GIF                         2.24         0.30                  0.33
L GIF                         1.51         0.86                  0.84
S TXT                         1.16         3.87                  3.83
M TXT                         1.53         8.00                  9.49
L TXT                         6.87         41.59                 44.87
S EXE                         4.07         3.11                  1.65
M EXE                         2.74         4.21                  2.22
L EXE                         6.14         13.73                 5.73
S ZIP                         2.18         2.04                  1.37
M ZIP                         3.83         5.98                  1.55
L ZIP                         7.85         15.44                 7.65
i-Bench Score                 219.39       351.50                298.00                523.00
Total                         277.34       456.02                382.90

The following list shows the key metrics for Internet Explorer performance improvements.

  • i-Bench score: Improved 15% from Windows CE .NET 4.0, 60% faster than Windows CE 3.0
  • Single HTML page load: Improved 19% from Windows CE .NET 4.0, 23% faster than Windows XP
  • File download: Improved about 40% from Windows CE .NET 4.0, comparable to Windows XP
  • Display .jpg (x86 only): Improved 50%+ from Windows CE .NET 4.0, 60-70% faster than Windows XP

Multimedia Technologies

The following graph and table show CPU utilization when a medium-sized Windows Media Audio file is streamed on Windows CE .NET 4.1 and Windows XP Professional. Note that the same hardware is used in both cases: the computer running Windows CE .NET uses 64 MB of RAM; the computer running Windows XP uses 256 MB of RAM. The WPE columns are based on a Webpad Enterprise configuration built from Platform Builder, and the MDX columns are based on a special build of the Media Appliance configuration that makes use of the DirectDraw and Direct Audio components.

Figure 2.

All data is for 100% playback, no lost or recovered packets, unless otherwise noted.

Table 4.

Quality   Type                  Stream/Local   Windows CE .NET 4.0 WPE   Windows CE .NET 4.0 MDX   Windows XP 256 MB   Windows CE .NET 4.1 WPE   Windows CE .NET 4.1 MDX
High      Windows Media Video   Stream         82%                       Dropped Frames            95%                 84%                       70%
Medium    Windows Media Video   Stream         67%                       63%                       58%                 63%                       65%
Low       Windows Media Video   Stream         21%                       23%                       17%                 20%                       18%
High      Windows Media Video   Local          79%                       66%                       92%                 83%                       68%
Medium    Windows Media Video   Local          65%                       61%                       58%                 62%                       63%
Low       Windows Media Video   Local          20%                       22%                       16%                 19%                       17%
High      Windows Media Audio   Stream         18%                       21%                       22%                 19%                       21%
Medium    Windows Media Audio   Stream         15%                       18%                       18%                 15%                       19%
Low       Windows Media Audio   Stream         11%                       14%                       11%                 11%                       10%
High      Windows Media Audio   Local          19%                       22%                       22%                 19%                       22%
Medium    Windows Media Audio   Local          16%                       19%                       17%                 16%                       19%
Low       Windows Media Audio   Local          11%                       14%                       10%                 11%                       10%
High      MP3                   Stream         26%                       28%                       22%                 25%                       28%
Medium    MP3                   Stream         18%                       21%                       15%                 18%                       22%
Low       MP3                   Stream         14%                       14%                       10%                 11%                       10%
High      MP3                   Local          21%                       23%                       20%                 21%                       25%
Medium    MP3                   Local          15%                       18%                       14%                 15%                       19%
Low       MP3                   Local          11%                       13%                       11%                 10%                       9%

The following list shows the performance improvements for Multimedia Technologies.

  • 20% improvement of Windows Media Video playback with overlays using the DMO codec
  • Dramatically improved Windows Media streaming high bit rate video playback

Conclusion

This paper outlines the methodology used to perform specific scenario-based performance tests and also includes the results of these tests against RDP, Internet Explorer, and multimedia technologies. The test results clearly show performance improvements for Windows CE .NET 4.1 compared with Windows CE .NET 4.0.

It is important to note that testing of an operating system or device should occur at a number of levels. The performance of individual components or technologies, while important, should be looked at in the context of the complete operating system or device.

To ensure optimal performance of Windows CE .NET, developers should be aware of the importance of choosing appropriate hardware that can be optimized for the specific device. The graphics hardware can impact the performance of most user scenarios, and should therefore be a major area of focus for hardware selection.

Windows CE .NET ships with a number of tools that can be used to test the performance of your embedded design. These tools can be divided into three distinct groups: verification and performance, informational and utility, and debugging. The following list shows the tools in these groupings:

Verification/Performance

  • Kernel Tracker
  • Remote Call Profiler
  • Remote Performance Monitor

Informational/Utility

  • Remote Zoom-in
  • Remote File Viewer
  • Remote Registry Viewer
  • Remote System Info

Debugging

  • Remote SPY++
  • Remote Heap Walker
  • Remote Process Viewer

For More Information

For the latest information about Windows CE .NET, visit the Microsoft Windows Embedded Web site.

The online documentation and context-sensitive Help included with Windows CE .NET also provide comprehensive background information and instructions for using Windows CE .NET.

To access the online documentation for Windows CE .NET

  1. Start Platform Builder.
  2. On the Help menu, click Contents to view the documentation.

Also see the product documentation for Windows CE .NET in the Embedded Operating System Development section of MSDN.

The following table shows the files that were used in the Internet Explorer file transfer tests.

Table 5.

File         Size in bytes
small.exe    512,784
medium.exe   1,011,757
large.exe    4,188,432
small.zip    346,306
medium.zip   1,918,979
large.zip    5,714,674
small.gif    12,071
large.gif    163,381
small.jpg    21,124
large.jpg    1,004,440
small.txt    512,001
medium.txt   1,024,001
large.txt    4,096,001

The following table shows the files that were used in the multimedia playback tests.

Table 6.

File                  Size in bytes
Buddy Holly_128.mp3   2,550,096
Buddy Holly_128.wma   2,569,812
Buddy Holly_32.mp3    637,488
Buddy Holly_32.wma    654,918
Buddy Holly_8.mp3     159,336
Buddy Holly_8.wma     169,534
Dance_100.wmv         1,887,173
Dance_300.wmv         5,381,935
Dance_56.wmv          850,927

Third-party software used during performance tests:

WinBench 99 is a subsystem-level benchmark that measures the performance of a PC's graphics, disk, and video subsystems in a Windows environment. WinBench 99's tests can run on Microsoft Windows 95, Windows 98, Microsoft Windows NT®, Windows 2000, and Windows Me systems.

i-Bench is a comprehensive, cross-platform benchmark that tests the performance and capability of Web clients as they take on the latest Web technology and features. A Web client is defined as any combination of hardware and software you can use to retrieve content from the Web.
