Results for the Photo Handling Assessment

Applies To: Windows 8.1

This topic explains how to interpret the results of the Photo Handling assessment and how to use those results to identify and resolve common issues that degrade the customer experience during photo handling operations.

The Photo Handling assessment measures the time that it takes to complete common photo-related tasks, such as viewing, searching, and manipulating a set of photos. The purpose of this assessment is to enable comparison of photo handling experiences across computers and to diagnose performance issues that affect the photo handling experience. In interactive tasks such as photo viewing and manipulation, response times are very important for user satisfaction. Therefore, this assessment uses response times to measure the user experience.

The metrics generated by the assessment can be grouped as follows:

  • Top-level metrics, which aggregate the responsiveness metrics in the “Explorer Experience” group.

  • Resource consumption, which shows metrics that are related to consuming resources on the computer during the assessment.

  • Explorer Experience, which shows metrics that are related to actions in File Explorer.

  • Photo Manipulation, which shows metrics that are related to photo manipulation actions.

In this topic:

  • Goals File

  • Metrics

  • Issues

For more information about the system requirements, workloads, and assessment settings, see Photo Handling.

Goals File

You can create custom goals to measure your improvements in the Results View. Goals files are a triage tool that can help you understand how a PC is performing and compare PCs in your business.

For example, the goals for a basic laptop might differ from the goals that you set for a high-end desktop computer, or market expectations might change in such a way that you want the flexibility to define different goals and key requirements as time passes and technology improves.

When a metric value is compared to the goal for that metric, the status is color-coded in the Results View as follows:

  • Light green means that the system has a great user experience and that there are no perceived problems.

  • Yellow means that the user experience is tolerable and you can optimize the system. Review the recommendations and analysis to see what improvements can be made to the system. These can be software changes, configuration changes, or hardware changes.

  • Red means that the system has a poor user experience and that there is significant room for improvement. Review the recommendations and analysis to see the improvements that can be made to the system. These can be software changes, configuration changes, or hardware changes. You might have to consider making tradeoffs to deliver a high-quality Windows experience.

  • No color means that there are no goals defined for the metric.

Note

In the Windows Assessment Toolkit for Windows 8, some assessments include default goals files. The first time you view results using this version of the tools, the default goals file is used. However, you can also define custom goals for Windows 8 the same way that you can for Windows 8.1.

You must set the goals file location and add a goals file to that location before you can use the UI to apply custom goals. After a goals file is selected, it continues to be used for any results that are opened.

Only one goals file can be used at a time. Goals for all assessments are set in a single goals file. The assessment tools will search for goals in the following order:

  1. A custom goals file

  2. Goals that are defined in the results file

  3. Goals that are defined in the assessment manifest

You can use the sample goals file that is provided at %PROGRAMFILES%\Windows Kits\8.1\Assessment and Deployment Kit\Windows Assessment Toolkit\SDK\Samples\Goals to create your own goals file.

Note

You cannot package a goals file with a job, but you can store it on a share for others to use.

Metrics

This section describes the key metrics that the Photo Handling assessment reports, common causes of poor results for these metrics, and common remediations for those issues. This section also identifies the audience that has the most influence on each of these metrics.

This section includes:

  • Responses without Delays

  • Noticeable Delays

  • Unsatisfactory Delays

  • Resource Consumption Metrics

  • Change to Large Icons View

  • Display Context Menu

  • Photo Manipulation Metrics

Responses without Delays

Most applicable to: OEM/ODM, IHV, ISV, IT Professionals, Enthusiasts

This metric gives the percentage of operational responses that meet or exceed the default goal. It indicates the overall responsiveness of interactive actions. With the exception of the Display Search Results metric, all responsiveness metrics in the “Explorer Experience” group are used to calculate this metric.

Analysis and Remediation Steps

This metric is an aggregation of several sub-metrics. Analyze the individual sub-metrics that do not meet their goals, as well as those that are close to the goal. Because these sub-metric values are averages, individual responses can miss the goal even when the averaged value meets it. For example, a sub-metric that averages 90 ms against a 100 ms goal can still include individual responses of 300 ms.

Noticeable Delays

Most applicable to: OEM/ODM, IHV, ISV, IT Professionals, Enthusiasts

This metric gives the percentage of operational responses with moderate delays, based on the predefined goals. It indicates the responsiveness of interactive actions. With the exception of the Display Search Results metric, all responsiveness metrics in the “Explorer Experience” group are used to calculate this metric.

Analysis and Remediation Steps

Because this metric is an aggregation of several sub-metrics, analyze the individual sub-metrics that have noticeable delays and are marked yellow in the results.

Unsatisfactory Delays

Most applicable to: OEM/ODM, IHV, ISV, IT Professionals, Enthusiasts

This metric gives the percentage of operational responses with long delays, based on the predefined goals. It indicates the responsiveness of interactive actions. With the exception of the Display Search Results metric, all responsiveness metrics in the “Explorer Experience” group are used to calculate this metric.

Analysis and Remediation Steps

Because this metric is an aggregation of several sub-metrics, analyze the individual sub-metrics that have unsatisfactory delays and are marked red in the results.
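
The three delay metrics above partition the same set of measured responses: every response either meets the goal, shows a noticeable delay, or shows an unsatisfactory delay. The following minimal C++ sketch illustrates that bucketing. The threshold values and sample data are hypothetical placeholders; the real goals come from the goals file or the assessment manifest.

    // Hypothetical illustration only: the assessment computes these
    // percentages from measured interaction times and goals-file thresholds.
    #include <cstdio>
    #include <vector>

    int main()
    {
        // Measured response times, in milliseconds (sample data).
        std::vector<double> responsesMs = { 40, 95, 120, 480, 1500, 60, 210 };
        const double goalMs = 100.0;            // assumed "no delay" goal
        const double unsatisfactoryMs = 1000.0; // assumed long-delay threshold

        int onGoal = 0, noticeable = 0, unsatisfactory = 0;
        for (double ms : responsesMs)
        {
            if (ms <= goalMs)               ++onGoal;         // Responses without Delays
            else if (ms < unsatisfactoryMs) ++noticeable;     // Noticeable Delays
            else                            ++unsatisfactory; // Unsatisfactory Delays
        }

        const double n = static_cast<double>(responsesMs.size());
        printf("Without delays: %.0f%%\n", 100.0 * onGoal / n);
        printf("Noticeable:     %.0f%%\n", 100.0 * noticeable / n);
        printf("Unsatisfactory: %.0f%%\n", 100.0 * unsatisfactory / n);
        return 0;
    }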

Resource Consumption Metrics

Most applicable to: OEM/ODM, IHV, ISV, IT Professionals, Enthusiasts

These metrics provide information about the consumption of resources on the computer during the assessment and can provide useful pointers to possible issues in the system.

The sub-metrics and remediation steps are outlined in the following table. A short sketch for spot-checking CPU utilization follows the table.

Metric Description Recommendation

Average CPU Utilization

Shows average CPU use, as a percentage, across all processors (cores or CPUs) during the assessment.

Choose the WPA in-depth analysis link to use the WPA Computation graphs: CPU Usage (Sampled) or CPU Usage (Precise).

Hard Fault Count

Shows the total number of hard faults. Hard faults occur when the operating system must retrieve a memory page from disk instead of from the in-memory standby list that the memory manager maintains.

Choose the WPA in-depth analysis link to use the WPA Hard Faults Memory graph.

Average Disk Utilization

Shows average disk use, as a percentage, during the assessment. The disk utilization percentage is computed by comparing the amount of time that the disk spends servicing I/O requests with the time that it spends idle. For example, a disk that services I/O requests for 2 seconds of a 10-second interval reports 20 percent utilization.

Choose the WPA in-depth analysis link to use the WPA Disk Usage graphs.

Long running DPC/ISR

Shows the total time, in microseconds, for deferred procedure calls (DPCs) and interrupt service routines (ISRs) that run longer than 3 milliseconds.

If this metric value is larger than 0, the assessment generates an issue that can be analyzed further in WPA. For more information, see Common In-Depth Analysis Issues.
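
The assessment computes these metrics from the recorded ETW trace. As a rough way to spot-check average CPU utilization outside an assessment run, you can sample the equivalent performance counter. The following C++ sketch is an approximation under that assumption, not the assessment's own mechanism.

    // Spot-check average CPU utilization by sampling a performance counter.
    // This approximates the metric; the assessment itself uses ETW traces.
    // Build: cl /EHsc cpusample.cpp pdh.lib
    #include <windows.h>
    #include <pdh.h>
    #include <cstdio>

    int main()
    {
        PDH_HQUERY query = nullptr;
        PDH_HCOUNTER cpu = nullptr;

        if (PdhOpenQuery(nullptr, 0, &query) != ERROR_SUCCESS)
            return 1;

        // "_Total" averages across all logical processors, matching the
        // scope of the Average CPU Utilization metric.
        if (PdhAddEnglishCounterW(query, L"\\Processor(_Total)\\% Processor Time",
                                  0, &cpu) != ERROR_SUCCESS)
            return 1;

        PdhCollectQueryData(query); // rate counters need a baseline sample

        double sum = 0.0;
        const int samples = 10;     // one sample per second for 10 seconds
        for (int i = 0; i < samples; ++i)
        {
            Sleep(1000);
            PdhCollectQueryData(query);
            PDH_FMT_COUNTERVALUE value;
            if (PdhGetFormattedCounterValue(cpu, PDH_FMT_DOUBLE, nullptr,
                                            &value) == ERROR_SUCCESS)
                sum += value.doubleValue;
        }

        printf("Average CPU utilization: %.1f%%\n", sum / samples);
        PdhCloseQuery(query);
        return 0;
    }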

Change to Large Icons View

Most applicable to: OEM, ISV

This metric measures the average time that it takes to change the view to large icons. This is a responsiveness metric that is measured against specific goals.

Typical Influencing Factors

Changing the icon view of a folder that contains photos involves opening the image files, decoding the images, and rendering the icons. Therefore, factors that influence I/O time and graphics rendering speed are relevant. For example, antivirus software might increase I/O times and slow this operation.

Analysis and Remediation Steps

The Photo Handling assessment analyzes the trace file to generate issues that affect the assessment activities. Open the assessment trace file in WPA and analyze the activity that corresponds to this metric. In addition to reporting issues for long-running ISRs and DPCs, the assessment also reports issues caused by delays (preemption by other threads) that increase the duration of the activity. For more information, see Common In-Depth Analysis Issues.

Similar Metrics

Similar analysis and remediation steps can be applied to the following related metrics.

Metric Description

Change to Medium Icons View

Shows the average time, in milliseconds, to change the view to medium icons.

Change to Extra Large Icons View

Shows the average time, in milliseconds, to change the view to extra-large icons.

Change to Details View

Shows the average time, in milliseconds, to change the view to details.

Change to Sort by Date

Shows the average time, in milliseconds, to change the sort order to sort by date.

Change Sort Order

Shows the average time, in milliseconds, to change the sorting order from ascending to descending.

Scroll Page

Shows the average time, in milliseconds, for scrolling a page in File Explorer (in a folder that contains photos) by pressing the Page Up and Page Down keys.

Navigate

Shows the total time, in milliseconds, for folder navigation, including choosing the Back button.

Display Context Menu

Most applicable to: OEM, ISV, IT Professionals, Enthusiasts

This metric measures the average time, in milliseconds, to display the shortcut menu after a right-click inside a folder in File Explorer. This is a responsiveness metric that is measured against specific goals.

Typical Influencing Factors

The action of displaying a context menu should be responsive (less than 250 milliseconds) even on low-end computers. The main factor that can negatively affect this metric is resource consumption, typically CPU, which can make File Explorer unresponsive.

Analysis and Remediation Steps

The Photo Handling assessment analyzes the trace file to generate issues that affect the assessment activities. The names of the activities that correspond to this metric are Display View Context Sub-Menu and Display Sort Context Sub-Menu. Open the assessment trace file in WPA and analyze those activities.

In addition to analyzing generated issues, you can analyze the CPU usage graphs in WPA to find processes with high CPU utilization. If these processes affect the photo usage activities on the system, consider fixing those programs or, if they are not critical, stopping them during photo usage activities.

Similar Metrics

Similar analysis and remediation steps can be applied to the following related metrics.

Metric Description

Display View Context Sub-Menu

Shows the average time, in milliseconds, to display the View context submenu. This submenu appears in File Explorer after you right-click and then choose View.

Display Sort Context Sub-Menu

Shows the average time, in milliseconds, to display a Sort context submenu. This submenu appears in File Explorer after you right-click and then choose Sort.

Acquire Search Box Focus

Shows the average time, in milliseconds, for the search box to acquire focus after you select it.

Photo Manipulation Metrics

Most applicable to: OEM/ODM, IHV, ISV

This group of metrics reports the time for performing common photo manipulation actions, such as zooming, cropping, and rotating a picture, by using the Windows Imaging Component (WIC) API.

Sub-metrics are described in the following table. A short WIC-based sketch follows the table.

Metric Description

Photo Opening Time

Shows the average time, in milliseconds, to open a single photo.

Photo Saving Time

Shows the average time, in milliseconds, to save changes that were made to a single photo.

Photo Zooming Time

Shows the average time, in milliseconds, to scale a photo to one-quarter of its original size. This corresponds to a zoom operation on the photo.

Photo Cropping Time

Shows the average time, in milliseconds, to crop a photo to one-quarter of its original size.

Photo Rotating Time

Shows the average time, in milliseconds, to rotate a photo 90 degrees clockwise.
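
To make these operations concrete, the following minimal C++ sketch decodes a photo, rotates it 90 degrees clockwise with WIC, and times the work. It is an illustration under stated assumptions, not the assessment's actual workload: the input.jpg file name, the conversion to a 32-bpp pixel format, and the omission of error handling are all placeholders.

    // Minimal sketch: decode a photo, rotate it 90 degrees clockwise with
    // WIC, and time the operation. Error handling is omitted for brevity.
    // Build: cl /EHsc rotate.cpp ole32.lib windowscodecs.lib
    #include <windows.h>
    #include <wincodec.h>
    #include <cstdio>

    int wmain(int argc, wchar_t** argv)
    {
        const wchar_t* path = (argc > 1) ? argv[1] : L"input.jpg"; // assumed input

        CoInitializeEx(nullptr, COINIT_MULTITHREADED);
        IWICImagingFactory* factory = nullptr;
        CoCreateInstance(CLSID_WICImagingFactory, nullptr, CLSCTX_INPROC_SERVER,
                         IID_PPV_ARGS(&factory));

        LARGE_INTEGER freq, t0, t1;
        QueryPerformanceFrequency(&freq);
        QueryPerformanceCounter(&t0);

        // Decode the first frame of the source image.
        IWICBitmapDecoder* decoder = nullptr;
        factory->CreateDecoderFromFilename(path, nullptr, GENERIC_READ,
                                           WICDecodeMetadataCacheOnDemand, &decoder);
        IWICBitmapFrameDecode* frame = nullptr;
        decoder->GetFrame(0, &frame);

        // Rotate 90 degrees clockwise, as the Photo Rotating Time metric describes.
        IWICBitmapFlipRotator* rotator = nullptr;
        factory->CreateBitmapFlipRotator(&rotator);
        rotator->Initialize(frame, WICBitmapTransformRotate90);

        // Convert to a known pixel format so the copy below has a fixed stride.
        IWICFormatConverter* converter = nullptr;
        factory->CreateFormatConverter(&converter);
        converter->Initialize(rotator, GUID_WICPixelFormat32bppPBGRA,
                              WICBitmapDitherTypeNone, nullptr, 0.0,
                              WICBitmapPaletteTypeCustom);

        // WIC transforms are lazy; copying the pixels forces the work to run.
        UINT w = 0, h = 0;
        converter->GetSize(&w, &h);
        UINT stride = w * 4, size = stride * h;
        BYTE* pixels = new BYTE[size];
        converter->CopyPixels(nullptr, stride, size, pixels);

        QueryPerformanceCounter(&t1);
        wprintf(L"Decode + rotate took %.1f ms\n",
                (t1.QuadPart - t0.QuadPart) * 1000.0 / freq.QuadPart);

        delete[] pixels;
        converter->Release(); rotator->Release(); frame->Release();
        decoder->Release(); factory->Release();
        CoUninitialize();
        return 0;
    }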

Typical Influencing Factors

Unlike the responsiveness metrics, these photo manipulation metrics are expected to scale with the hardware configuration of the system, so the assessment does not apply responsiveness goals to these metric values. The durations of these operations can be affected by CPU and disk bottlenecks.

Analysis and Remediation Steps

To analyze issues that are generated for the activity corresponding to a metric, choose the WPA in-depth analysis link in the issue description and analyze any delays that affect the metric. In addition to the delay analysis, you can analyze the CPU and Disk Usage summary tables in WPA to find processes with high resource consumption. If these processes affect the photo usage activities on the system, consider fixing those programs or, if they are not critical, stopping them during photo usage activities.

Issues

This assessment performs advanced issue analysis and provides links to Windows® Performance Analyzer (WPA) to troubleshoot the issues that are identified. In most cases, you can choose the WPA Analysis link to troubleshoot the issues that appear. When WPA opens, additional details about disk activity or CPU activity might be available, depending on the type of issue identified. For more information about in-depth analysis issues and recommendations, see Common In-Depth Analysis Issues.

The assessment reports an exit code of 0x80050006

This error occurs when maintenance tasks have been registered on the PC but have not completed before the assessment runs. This prevents the assessment from running, because maintenance tasks often affect assessment metrics.

To resolve this issue, do one of the following:

  1. Ensure that the computer is connected to a network and is running on AC power. Then manually initiate pending maintenance tasks by running the following command from an elevated command prompt:

    rundll32.exe advapi32.dll,ProcessIdleTasks

  2. Disable regular and idle maintenance tasks, and stop all maintenance tasks before running the assessment.

See Also

Tasks

Create and Run an Energy Efficiency Job

Concepts

Photo Handling
Assessments
File Handling

Other Resources

Windows Assessment Toolkit Technical Reference
Windows Assessments Console Step-by-Step Guide
Windows Performance Toolkit Technical Reference