Privacy and Security
Microsoft User Experience Group
How to Use Privacy and Security Features in Your Application
Guidelines for Setting Up Privacy and Security Defaults
A Word About Tone and Language
Recommendations for Privacy Policies
Four Must-Haves for Your Application
Security breaches, Internet attacks, privacy invasions—they're all daily news events now. And the fact is, the problem won't be solved by software alone. An important part of the solution is to address design decisions that dilute people's sense of security and privacy.
Good handling of privacy and security inspires user confidence, which can lead to an increased use of your product or service. For example, dealing decisively with privacy and security breaches can result in higher trust ratings than before the incident.
This chapter describes our user experience research findings about information privacy and security. The goal in outlining our research is to help you build a user experience that enables your customers to understand and feel confident about a) what the software is doing to protect their information privacy and security, and b) their ability to make informed decisions to protect their privacy and security.
"Information privacy" refers to the user's ability to control when, how, and to what extent information about themselves will be collected, used, and shared with others. "Information security" refers to the ability of businesses and individuals to secure their computers from vulnerabilities and maintain the integrity of the stored information.
Research within Microsoft® has pointed to several key factors affecting end-user understanding of privacy and security, as described here.
- Privacy and security issues can affect user confidence. Privacy and security issues fall into two categories: those with a positive outcome and those with a negative outcome. The resulting user confidence—or lack of it—affects the attitude toward your product and company.
- Privacy and security issues often elicit emotional reactions. Because security and privacy breaches can have a very negative impact on people's lives, people talk in emotional terms about these incidents, using words such as disappointed, unsupported, frustrated, exposed, and violated.
Complicating this, users often say they want their information to be secure and private, but they also often have a fatalistic attitude towards security and privacy. That is, they often feel resigned to accepting security or privacy compromises.
- Users often do not differentiate security from privacy. In fact, they hardly distinguish between these two concepts because they focus on the outcome of an event and its impact on their lives. For example, an individual might not consider a request for a credit card number to be a privacy issue, but if their credit card is stolen and used, their focus turns to mitigating the damages.
- Social systems affect information sharing. When it comes to information privacy—when, how, and to what extent users will allow personal information to be collected, used, and shared with others—users have a collective sense of what information they are willing to share, and which people they will share information with.
Figure 1 Social aspect: who users will share information with (information is shared less readily with entities farther from the center)
- Differences exist between home and business use. User concerns vary depending on the environment—that is, whether the user is at home or at work. Home users are most concerned about their children's safety, computer viruses, and identity theft (someone getting their financial information). Home users are less concerned about the security of their files because they consider the likelihood of someone getting their files to be low—most users do not see themselves as the target of hackers. On the other hand, small business users care most about the security of their files (customer files, personnel files, financial administration) and computer viruses.
- Tradeoffs between security, privacy, and convenience are sometimes inevitable. From the user perspective, increases in security and privacy are frequently accompanied by a reduction in convenience. For example, turning off Microsoft ActiveX® controls and disallowing cookies improves Internet privacy and security, yet it can greatly impair commercial transactions. This presents a dilemma for users that can be hard to resolve through user experience. For this reason, you should try to find solutions that do not require this tradeoff.
For example, turning browser security as high as possible will degrade the browsing experience to a point where most users will be dissatisfied. Similarly, setting user defaults so that no data is sent out can degrade an application to the point where it has little value. Conversely, a user could set browser security low enough to allow for ActiveX controls to download and run without intervention, but this leaves the user open to security breaches. It is this dilemma that user-experience designers must seek to resolve. We must present users with understandable options that allow them to perform their tasks with a minimum of inconvenience.
Figure 2 Emotional aspect: information that users feel comfortable sharing (items farther from the center are shared more freely)
If You Do Only One Thing…
For Microsoft Windows® Code Name "Longhorn," the user experience goal is to make security and privacy "just work." Where a transparent experience is not possible, users should be able to make informed decisions with confidence.
The way in which users are alerted to privacy and security issues will vary depending on the situation and the implications. Here are some things you can do to achieve the goals described above:
- Set up application defaults to be secure and private.
- Frame privacy and security decisions in context.
- Minimize requests for personal information.
- Fully disclose how user information will be used.
- Provide clear and consistent channels for alleviating security and privacy concerns and for recovering from security and privacy breaches.
- Provide easy access to more information, such as a privacy and security policy, in most situations.
Most users do not have sufficient knowledge of the implications of technology to make good decisions about privacy and security. They seldom have a deep understanding of the implications involved and will often trust the application to "do the right thing." Most people do not change the default settings, particularly if they trust the provider.
With regard to information privacy, if your application or feature does track user activity, consider what benefits exist for the user and explain those benefits when asking the user to opt in. Ideally, this opt-in request should be shown at the point where the user could derive immediate benefit from the feature. Users believe their information is secure and private on a computer that:
- Does not allow unauthorized people to have access to their computer and the information and programs stored on it.
- Does not allow viruses and related programs to interfere with or harm their computer and the information and programs stored on it.
- Is adequately backed up.
- Allows them to use the Internet safely without others unintentionally getting to their personal information and credit card information.
- Allows their children to use the computer safely without running into offensive material (junk e-mail, porn pop-ups) or offensive people (chat rooms).
Not all users have the same concerns. For example, warning users that their children may be exposed to adult content has no meaning in a business environment.
Frame Privacy and Security Decisions in Context
Because people often find it difficult to construct rules in advance about who should and should not have access to their data, your application should support in-context decision making where appropriate.
When setting security options up front, users will often make the settings overly secure without thinking about the consequences for future actions. They might forget that they've changed a setting or allowed a certain application to access their data, and therefore be confused when they suffer the consequences, such as not being able to get downloads or view e-mail attachments.
Decisions that a user must make at the time when a security or privacy issue arises should contain the scope that the user needs for decision making. This scope is often missing when the user is removed from the context (for example, during application setup). Users who are forced to make isolated decisions are often overly cautious, which can later hurt their experience with the application and lead to reduced data sharing. Users who are allowed to make decisions in the context of product usage are more likely to understand and accept the value proposition.
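The in-context approach above can be sketched in code. The following is a minimal illustration, not a real Windows API: the `ConsentStore` class and `prompt` callback are assumptions made for the example. The point is that the consent question is deferred until the user invokes the feature, is framed with the concrete benefit and consequence, and is asked at most once.

```python
class ConsentStore:
    """Remembers decisions so the user is asked at most once per feature."""

    def __init__(self):
        self._decisions = {}

    def get(self, feature):
        # None means the user has not been asked yet.
        return self._decisions.get(feature)

    def set(self, feature, allowed):
        self._decisions[feature] = allowed


def share_location(consents, prompt):
    """Ask for consent only when the location feature is actually used,
    framing the question with the benefit and the consequence."""
    decision = consents.get("location")
    if decision is None:
        decision = prompt(
            "Show nearby stores? Your city (not your exact address) will be "
            "sent to the service. You can change this later in Settings."
        )
        consents.set("location", decision)
    return decision
```

Because the prompt appears at the point of use, the user can see the immediate value ("show nearby stores") and the scope of what is shared, rather than facing an abstract setup-time question.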
Minimize Requests for Personal Information
Users are more willing to share information that does not personally identify them than to share information that pinpoints who they are. The following list orders types of information from least sensitive to most sensitive, as users perceive them.
- City, state, postal code, age, gender, and first name are relatively broad categorizations.
- Date of birth allows better focus and thus is more sensitive, especially in conjunction with the items in the first bullet.
- E-mail address is associated with an individual, and thus is sensitive, although as a contact medium users seem happier with this than with physical mail.
- Telephone number and Social Security number are highly targeted, allowing identification of an individual at a location, thus they are seen as very sensitive information.
To summarize, if you ask for personal information, you must also present the user with a strong value proposition. Asking for information that is not directly related to a user benefit breeds mistrust and avoidance.
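One way to act on these tiers is to audit each data-collection form against them before shipping. The sketch below is illustrative only: the numeric tiers and field names are assumptions made for the example, not a standard taxonomy. Fields above the tier that the feature's value proposition routinely justifies should be flagged for an explicit, stronger justification.

```python
# Assumed sensitivity tiers, loosely following the list above
# (1 = broad categorization, 4 = identifies an individual at a location).
SENSITIVITY = {
    "city": 1, "state": 1, "postal_code": 1,
    "age": 1, "gender": 1, "first_name": 1,
    "date_of_birth": 2,
    "email": 3,
    "phone": 4, "ssn": 4,
}


def fields_needing_justification(requested, max_routine_tier=1):
    """Return the requested fields whose sensitivity exceeds what the
    feature's value proposition routinely justifies.

    Unknown fields are treated as most sensitive, erring on the side
    of requiring justification.
    """
    return sorted(
        f for f in requested
        if SENSITIVITY.get(f, 4) > max_routine_tier
    )
```

For example, a store-locator form requesting city, e-mail, and phone would flag e-mail and phone: the locator only needs the city, so the other two fields demand a stronger, clearly stated benefit or should be dropped.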
Note In some jurisdictions, information privacy is regulated by statute, which determines additional requirements (for example, a legal right of subject access).
Fully Disclose How User Information Will Be Used
Users want to know what will happen to the information they provide. Before they are even willing to sign up or pay for a service, users want to know:
- What kind of information does this application or service want from me and why is that information necessary?
- Where is my information going to be stored and who is going to have access to it? Can I control who has access to it?
- What can this application or service do for me and what is the benefit?
If these questions are not answered sufficiently, users will not want to use applications or services that require them to release personal information; or, if they've signed up for a service, they might stop using it.
Another thing to keep in mind: In many cases, users want reassurance that data will not be used for certain purposes rather than being told what it will be used for. The use that users object to most frequently is enabling unknown third parties to contact them; the second most frequent objection is to having their activities tracked. It's important to identify any stumbling blocks that might keep users from adopting a product or service, and to address them with clear statements.
Provide Clear and Consistent Channels for Users to Recover from Privacy and Security Breaches
At any point when a security or privacy breach occurs, users need clear and consistent channels for mitigation and recovery. The goal here is to provide the easiest and fastest route to mitigation, without compromising the user's current state and settings.
When there is a privacy or security breach, the channels for mitigation must be clearly and transparently available to end users. This encompasses everything from ongoing engineering efforts, out-of-band releases, and product update mechanisms, to community presence and executive messages. Regaining user trust involves understanding exactly what the user's issue is, and that the issue might be an emotional rather than a logical one.
Provide Easy Access to More Information in All Situations
Present users with choices, not dilemmas. In other words, ensure that users are able to understand the consequences of choosing a certain option, and that wherever possible there is a trusted way for them to complete their task.
User assistance text should set the boundaries of the application's or feature's functionality. A clear description of the feature set, grounded in terms of the users' task, should be sufficient. Include links to other information that users need to know to secure their systems.
Because trust concerns are often more emotional than logical, the tone of language is an important aspect of presenting privacy and security issues. For example, telling users that by turning off a security feature "some files may be damaged" is less personal and therefore less compelling than "your personal information may be exposed" or "your files could be erased."
Note Be careful when invoking users' emotions in this way—save it for actions with serious consequences.
Incorporate a reassuring tone that leads with a description of how the user's data is being protected. For example, list your third-party certifications and endorsements. Do not hide any uses to which the data may be put, but wherever possible let the user know that this use will not result in the user being personally identified, tracked, or contacted.
When asking for data that users consider "close" to them, explain why it is needed, the purposes it will be put to, and how it will be stored. Reassurances result in users giving more dependable data, which facilitates more usage. Often, a value proposition can be fairly minor, such as, "helps improve subsequent versions of the product." However, the more personally identifiable the information being collected, the greater the value must be for the user to comply.
User interface text should describe the functions that the feature performs, without promising capabilities in absolute terms. For instance, saying that a firewall "keeps you safe online" may be overpromising, because a user's definition of "safe" may be much broader than the type of safety that the firewall actually provides.
- Use clear, nontechnical language in privacy policies.
- Be as concrete as possible. Typical privacy policies are made as broad as possible in order to account for future use, but this doesn't help credibility with users. It might be better to provide more specific, clear policies that you update more frequently (optimally when users first encounter the feature that triggers the change in the policy) than to have one catch-all policy that does not inspire trust.
- Design for understanding and good decisions when the content is skimmed briefly. Most users typically won't read the whole policy, but they will look for incongruities or areas where the policy seems to be deliberately vague. Write the policy with this "skimming for anomalies" behavior in mind by having clear section headings and grouping similar user concerns into each section.
In each of these situations, when users are asked to make a decision regarding their privacy or security, the question should be framed so that they can understand, see the consequences of, and make a valid judgment on the decision. Providing easy access to more detailed information (in the form of Help) is valuable in all situations.
- Be private by default. In general, application settings that could expose user data should be switched off by default. Give users an option to turn these settings on in the context of their application use. In some instances, users expect certain feature behavior that is not private by default but should be maintained for consistency with legacy behavior, such as a most-recently-used (MRU) list of files. However, there should always be a way to turn off these features and clear any history information. A privacy deployment guide should explain the settings required to increase a user's level of privacy.
- Be secure by default. Application settings that could compromise user security should be switched off by default. Make users aware of the implications of changing these settings within the context of using the application and before the changes are committed.
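The two guidelines above can be sketched as a settings object whose defaults keep data in and risk out. This is a minimal illustration under assumed names: `PrivacySettings` and its fields are invented for the example, not an actual Windows configuration API. Note that data-exposing and security-sensitive options default to off, while an expected legacy behavior (the MRU list) stays on but remains switchable and clearable.

```python
from dataclasses import dataclass, field


@dataclass
class PrivacySettings:
    # Private by default: nothing is sent out unless the user opts in.
    send_usage_data: bool = False
    share_documents: bool = False
    # Secure by default: risky capabilities are off until enabled in context.
    run_unsigned_code: bool = False
    # Expected legacy behavior may stay on for consistency, but must be
    # switchable off and its history clearable.
    keep_recent_files: bool = True
    recent_files: list = field(default_factory=list)

    def clear_history(self):
        """Give the user a one-step way to erase tracked history."""
        self.recent_files.clear()
```

A deployment guide can then describe exactly which of these fields to change to raise the privacy level, and the application can surface the opt-in for each "off" setting at the point where the user would benefit from it.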
©2003 Microsoft Corporation. All rights reserved.
The "Longhorn" User Experience Guidelines are produced by the MSX (Microsoft User Experience) group.