
Phase 5: Release

The Release phase is when you ready your software for public consumption and, perhaps more importantly, ready yourself and your team for what happens once your software is in the hands of users. One of the core concepts in the Release phase is planning—mapping out a plan of action should any security or privacy vulnerabilities be discovered in your release—and then being able to execute that response plan after release. To this end, a Final Security Review and privacy review are required prior to release.

On This Page

Public Release Privacy Review
  Privacy Requirements
  Privacy Recommendations
  Resources
Planning
  Security Requirements
  Privacy Requirements
  Resources
Final Security Review and Privacy Review
  The FSR Process
  Possible FSR Outcomes
  Security Requirements
  Privacy Requirements
  Security Recommendations
  Resources
Release to Manufacturing/Release to Web
  Security Requirements
  Privacy Requirements
  Resources

Public Release Privacy Review

Before any public release (including Alpha and Beta test releases), update the appropriate SDL Generic Privacy Questionnaire for any significant privacy changes that were made during implementation verification. Significant changes include changing the style of consent, substantively changing the language of a notice, collecting different data types, and exhibiting new behavior.

Although privacy requirements must be addressed before any public release of code, security requirements need not be addressed before public release. However, you must complete a final security review before final release.

Privacy Requirements

  • Review and update the Privacy Companion form.

    • For a P1 project, your privacy advisor reviews your final SDL Privacy Questionnaire (Appendix C), helps determine whether a privacy disclosure statement is required, and gives final privacy approval for public release.

    • For a P2 project, you need validation by a privacy advisor if any of the following is true:

      • A design review is requested by a privacy advisor.

      • You want confirmation that the design is compliant with privacy standards.

      • You wish to request an exception.

    • For a P3 project, there are no additional privacy requirements.

  • Complete the privacy disclosure.

    • Draft a privacy disclosure statement as advised by the privacy advisor. If your privacy advisor indicates that a privacy disclosure is waived or covered, you do not need to meet this requirement.

    • Work with your privacy advisor and legal representatives to create an approved privacy disclosure.

    • Post the privacy disclosure to the appropriate website before each public release.


Privacy Recommendations

  • Create talking points as suggested by the privacy advisor to use after release to respond to any potential privacy issues.

  • Review deployment guidance for enterprise programs to verify that privacy controls that affect functionality are documented. Conduct a legal review of the deployment guide.

  • Create “quick text” for your support team that addresses likely user questions, and generally foster strong and frequent communication between your development and support teams.


Resources

Planning

Any software can be released with unknown security issues or privacy issues, despite best efforts and intentions. Even programs with no known vulnerabilities at the time of release can be subject to new threats that emerge and might require action. Similarly, privacy advocates might raise privacy concerns after release. You must prepare before release to respond to potential security and privacy incidents. With proper planning, you should be able to address many of the incidents that could occur in the course of normal business operations.


Your team must be prepared for a zero-day exploit of a vulnerability—one for which a security update does not exist. Your team must also be prepared to respond to a software security emergency. If you create an emergency response plan before release, you will save time, money, and frustration when an emergency response is required for either security or privacy reasons.

Security Requirements

  • The project team must provide contact information for people who respond to security incidents. Typically, such responses are handled differently for products and services.

    • Provide information about which existing sustained engineering (SE) team has agreed to be responsible for security incident response for the project. If the project does not have an identified SE team, the project team must create an emergency response plan (ERP) and provide it to the incident response team. This plan must include contact information for three to five engineering resources, three to five marketing resources, and one or two management resources who are the first points of contact when you need to mobilize your team for a response effort. Someone must be available 24 hours a day, seven days a week, and contacts must understand their roles and responsibilities and be able to execute on them when necessary.

    • Identify someone who is responsible for security servicing. All code developed outside the project team (third-party components) must be listed by filename, version, and source (where it came from).

    • You must have an effective security response process for servicing code that has been inherited or reused from other teams. If that code has a vulnerability, the releasing team may have to release a security update even though it did not develop the code.

    • You must also have an effective security response process for servicing code that has been licensed from third parties in either object or source form. For licensed code, you also need to consider contractual requirements regarding which party has rights and obligations to make modifications, associated service level agreements (SLAs), and redistribution rights for any security modifications.

  • Create a documented sustaining model that addresses the need to release immediate patches in response to security vulnerabilities and does not depend entirely on infrequent service packs.

  • Develop a consistent and comprehensible policy for security response for components that are released outside of the regular product release schedule (out-of-band) but that can be used to update or enhance the software after release. For example, Windows must plan a response to security vulnerabilities in a component, such as DirectX, that ships as part of the operating system but that might also be updated independently of the operating system, either directly by the user or by the installation of other products or components.

  • Disable tracing and debugging in ASP.NET applications prior to deployment. Disabling them mitigates the following security vulnerabilities:

    • When tracing is enabled for the page, every browser requesting it also obtains the trace information that contains sensitive data about internal server state and workflow. This information could be security-sensitive.
    • When debugging is enabled for the page, errors happening on the server result in a full set of stack trace data presented to the browser. This data may expose security-sensitive information about the server’s workflow.
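Both settings can be turned off in the application's configuration file. The following is a minimal web.config sketch using standard ASP.NET configuration elements; the customErrors entry goes beyond the two requirements above but is commonly paired with them:

```xml
<configuration>
  <system.web>
    <!-- Do not serve trace output; if tracing is ever needed, restrict it to local requests -->
    <trace enabled="false" localOnly="true" />
    <!-- Compile without debug support so stack trace details are not sent to the browser -->
    <compilation debug="false" />
    <!-- Show generic error pages to remote users instead of raw error details -->
    <customErrors mode="RemoteOnly" />
  </system.web>
</configuration>
```

As a server-wide safeguard, setting `<deployment retail="true" />` in the `<system.web>` section of machine.config forces tracing and debugging off for every application on that server, regardless of individual web.config files.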

Privacy Requirements

  • For P1 and P2 projects, identify the person who will be responsible for responding to all privacy incidents that may occur. Add this person’s e-mail address to the Incident Response section of the SDL Privacy Questionnaire (Appendix C). If this person changes positions or leaves the team, identify a new contact and update all SDL Privacy Questionnaire forms for which that person was listed as the privacy incident response lead.

  • Identify additional development and quality assurance resources on the project team to work on privacy incident response issues. The privacy incident response lead is responsible for defining these resources in the Incident Response section of the SDL Privacy Questionnaire.

  • After release, if a privacy incident occurs, you must be prepared to follow the SDL Privacy Escalation Response Framework (Appendix K), which might include risk assessment, detailed diagnosis, short-term and long-term action planning, and implementation of action plans. Your response might include creating a patch, replying to media inquiries, and reaching out to influential external contacts.


Resources

Final Security Review and Privacy Review

As the end of your software development project approaches, you need to be sure that the software is secure enough to ship. The Final Security Review (FSR) helps determine this. The security team assigned to the project should perform the FSR with help from the product team to ensure that the software complies with all SDL requirements and any additional security requirements identified by the security team (such as penetration testing or additional fuzz testing).


A Final Security Review can last anywhere from a few days to six weeks, depending on the number of issues and the team’s ability to make necessary changes.


It is important to schedule the FSR carefully—that is, you need to allow enough time to address any serious issues that might be found during the review. You also need to allow enough time for a thorough analysis; insufficient time could cause you to make significant changes after the FSR is completed.

The FSR Process

  • Define a due date for all project information that is required to start the FSR. To minimize the likelihood of unexpected delays, plan to conduct an FSR four to six weeks before release to manufacturing (RTM) or release to web (RTW). Your team might need to revalidate specific decisions or change code to fix security issues. The team must understand that additional security work needs to be performed during the FSR.

  • The FSR cannot begin until you have completed the reviews of the security milestones that were required during development. Milestones include in-depth bug reviews, threat model reviews, and running all SDL-mandated tools.

    • Reconvene the development and security leadership teams to review and respond to the questions posed during the FSR process.

    • Review threat models. The security advisor should review the threat models to ensure that all known threats and vulnerabilities are identified and mitigated. Have complete and up-to-date threat models at the time of the review.

    • Review security issues that were deferred or rejected for the current release. The review should ensure that a consistent, minimum security standard was adhered to throughout the development cycle. Teams should already have reviewed all security issues against the criteria that were established for release. If the release does not have a defined security bug bar, your team can use the standard SDL Security Bug Bar.

    • Validate results of all security tools. You should have run these tools before the FSR, but a security advisor might recommend that you also run other tools. If tool results are inaccurate or unacceptable, you might need to rerun some tools.

    • Ensure that you have done all you can to remove vulnerabilities that meet your organization’s severity criteria so that there are no known vulnerabilities. Ultimately, the goal of SDL is to remove security vulnerabilities from products and services. No software release can pass an FSR with known vulnerabilities rated Critical, Important, Moderate, or Low.

  • Submit exception requests to a security advisor for review. If your team cannot meet a specific SDL requirement, you must request an exception. Typically, such a request is made well in advance of the FSR. A security advisor reviews these requests and, if the overall security risk is tolerable, might choose to grant the exception. If the security risk is not acceptable, the security advisor will deny the exception request. It is best to address all exception requests as soon as possible in the development phase of the project.

Possible FSR Outcomes

Possible outcomes of an FSR include:

  • Passed FSR. If all issues identified during the FSR are corrected before RTM/RTW, the security advisor should certify that the project has successfully met all SDL requirements.

  • Passed FSR (with exceptions). If all issues identified during the FSR are corrected before RTM/RTW or the security advisor and team can reach an acceptable compromise about any SDL requirements that the project team was unable to resolve, the security advisor should identify the exceptions and certify that all other aspects of the project have successfully met all SDL requirements.

    • All exceptions and security issues not addressed in the current release should be logged, and then addressed and corrected in the next release.

  • FSR escalation. If a team does not meet all SDL requirements, and the security advisor and the product team cannot reach an acceptable compromise, the security advisor cannot approve the project, and the project cannot be released. Teams must correct whatever SDL requirements they can or escalate to higher management for a decision.

    • Escalations occur when the security advisor determines that a team cannot meet the defined requirements or is in violation of an SDL requirement. Typically, the team has a business justification that prevents them from being compliant with the requirement. In such instances, the security advisor and the team should work together to compose a consolidated escalation report that outlines the issue—including a description of the security or privacy risk and the rationale behind the escalation. This information is typically provided to the business unit executive and the executive with corporate responsibility for security and privacy, to aid decision-making.

    • If a team fails to follow proper FSR procedures—either by an error of omission or by willful neglect—the result is an immediate FSR failure. Examples include:

      • Errors of omission, such as failure to properly document all required information.

      • Specious claims and willful neglect, including:

        • Claims of “Not subject to SDL” contrary to evidence.

        • Claims of “FSR pass” contrary to evidence, and software RTM/RTW without the appropriate signoff.

      Such an incident can have serious downstream consequences and is always escalated immediately to the project team’s executive staff and the executive in charge of security and privacy.

Security Requirements

  • The project team must provide all required information before the scheduled FSR start date. Failure to do so may delay completion of the FSR. If the schedule slips significantly before the FSR begins, contact the assigned security advisor to reschedule.

  • After the FSR is finished, the security advisor either signs off on the project as is or provides a list of required changes.

  • Projects releasing online services or line-of-business (LOB) applications are required to have a security score of B or above to successfully pass the FSR. Both Operations and Product groups are responsible for compliance. A product’s security is managed at many levels. Vulnerabilities, whether in code or at host level, put the entire product (and possibly the environment) at risk.


Privacy Requirements

  • Repeat the privacy review for any open issues that were identified in the pre-release privacy review or for material changes made to the product after the pre-release privacy review. Material changes include modifying the style of consent, substantively revising the language of a notice, collecting different data types, or exhibiting new behavior. If no material changes were made, no additional reviews or approvals are required.

  • After the privacy review is finished, your privacy advisor either signs off on the product as is or provides a list of required changes.


Security Recommendations

  • Ensure the product team is constantly evaluating the severity of security vulnerabilities against the standard that is used during the security push and FSR. Otherwise, a large number of security bugs might be reactivated during the FSR.


Resources

Release to Manufacturing/Release to Web

Software release to manufacturing (RTM) or release to web (RTW) is conditional on completion of the Security Development Lifecycle process as defined in this document. The security advisor assigned to the release must certify that your team has satisfied security requirements. Similarly, for all products that have at least one component with a privacy impact rating of P1, your privacy advisor must certify that your team has satisfied the privacy requirements before the software can be shipped.

Security Requirements

  • To facilitate the debugging of security vulnerability reports and to help tools teams research cases in which automated tools failed to identify security vulnerabilities, all product teams must submit symbols for all publicly released products as part of the release process. This requirement is needed only for RTM/RTW binaries and any post-release binaries that are publicly released to users (such as service packs or updates, among others).

  • Design and implement a sign-off process to ensure security and other policy compliance before you ship. This process should include explicit acknowledgement that the product successfully passed the FSR and was approved for release.
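The sign-off gate described above can be sketched as a simple pre-release check. This is a hypothetical illustration only: the `ReleaseChecklist` structure and its field names are invented for this sketch and are not part of the SDL, but the conditions checked (FSR outcome, logged exceptions, symbol submission, advisor sign-offs) come from the requirements in this section:

```python
from dataclasses import dataclass

@dataclass
class ReleaseChecklist:
    """Hypothetical record of the approvals required before RTM/RTW."""
    fsr_outcome: str = "not started"      # "passed", "passed with exceptions", or "escalated"
    fsr_exceptions_logged: bool = False   # exceptions recorded for the next release
    symbols_submitted: bool = False       # symbols submitted for publicly released binaries
    security_signoff: bool = False        # security advisor certification
    privacy_signoff: bool = False         # privacy advisor certification (P1 components)

def approved_for_release(c: ReleaseChecklist) -> tuple[bool, list[str]]:
    """Return (approved, blocking_issues) for a release candidate."""
    issues = []
    if c.fsr_outcome == "passed with exceptions" and not c.fsr_exceptions_logged:
        issues.append("FSR exceptions must be logged for the next release")
    elif c.fsr_outcome not in ("passed", "passed with exceptions"):
        issues.append(f"FSR outcome is '{c.fsr_outcome}'; release is blocked")
    if not c.symbols_submitted:
        issues.append("symbols not submitted for publicly released binaries")
    if not c.security_signoff:
        issues.append("missing security advisor sign-off")
    if not c.privacy_signoff:
        issues.append("missing privacy advisor sign-off")
    return (not issues, issues)
```

A gate like this makes the "explicit acknowledgement" requirement mechanical: release tooling can refuse to proceed until `approved_for_release` returns an empty issue list.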


Privacy Requirements

  • Design and implement a sign-off process to ensure privacy and other policy compliance before you ship. This process should include explicit acknowledgement that the product successfully passed the FSR and was approved for release.


Resources

N/A

Content Disclaimer

This documentation is not an exhaustive reference on the SDL process as practiced at Microsoft. Additional assurance work may be performed by product teams (but not necessarily documented) at their discretion. As a result, this example should not be considered as the exact process that Microsoft follows to secure all products.

This documentation is provided “as-is.” Information and views expressed in this document, including URL and other Internet website references, may change without notice. You bear the risk of using it.

This documentation does not provide you with any legal rights to any intellectual property in any Microsoft product. You may copy and use this document for your internal, reference purposes.

© 2012 Microsoft Corporation. All rights reserved.

Licensed under Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported
