High Availability and Disaster Recovery Considerations with Windows Azure SQL Database
When migrating an on-premises SQL Server database to Windows Azure SQL Database (SQL Database), a frequently asked question is how to implement a backup and restore strategy to protect data from user mistakes, application errors, hardware failures, data center shutdown due to natural disasters, and other database disasters. Unlike on-premises deployments, SQL Database is designed to mask physical database file management and operations from database administrators. Note that a SQL Database server is a logical server that defines a group of databases. Databases associated with your SQL Database server may reside on separate physical computers in the Microsoft data center. An individual logical database might share the space of a single physical database with another logical database. In a multitenant Windows Azure environment, traditional SQL Server backup and restore tools do not work.
Authors: Kun Cheng, Selcin Turkarslan
Reviewers: Steve Howard, Adrian Bethune
How to help protect your database from failure of individual servers, devices, or network connectivity
Each SQL Database instance has three replicas residing on three different physical machines within a data center: one primary and two secondary replicas. All reads and writes go through the primary replica, and any changes are replicated to the secondary replicas asynchronously.
SQL Database uses a quorum-based commit scheme in which a write must be completed on the primary replica and one secondary replica before the transaction is considered committed. If the hardware fails on the primary replica, the SQL Database fabric detects the failure and fails over to a secondary replica. Therefore, there are always at least two transactionally consistent physical copies of your data in a data center. The three replicas for every Windows Azure SQL Database instance protect your data from failure of individual servers, devices, or network connectivity. In addition to the redundant replicas, the Windows Azure SQL Database fabric maintains a minimum of 14 days of backups taken in five-minute increments for all the databases in the data center. These backups are stored in the data center as a safeguard against simultaneous or catastrophic hardware and system failures.
The SQL Database environment is designed to keep the server available and to maintain the integrity of your data in case of hardware failures. During a failover event, a SQL Database instance may be inaccessible for a brief period. Your application needs retry logic to handle such failover events, but it can use the same connection string to re-establish the connection after the failover to the secondary replica. For more information on how to handle connection-loss errors, see the Connection Management in Windows Azure SQL Database article in the TechNet Wiki.
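The retry pattern described above can be sketched as follows. This is a minimal illustration, not a production data-access layer: the error numbers shown (for example, 40613 for a database that is unavailable during failover) are examples of errors commonly treated as transient, and the convention that a failed operation raises an exception carrying an error_number attribute is an assumption you would adapt to your own data-access library.

```python
import random
import time

# Error numbers commonly treated as transient in SQL Database, for example
# 40197 (service error), 40501 (service busy), 40613 (database unavailable,
# such as during a failover). Extend this set to suit your application.
TRANSIENT_ERRORS = {40197, 40501, 40613}

def execute_with_retry(operation, max_attempts=5, base_delay=1.0):
    """Run `operation` (a zero-argument callable), retrying transient
    failures with exponential backoff. `operation` is expected to raise an
    exception carrying an `error_number` attribute on failure (an
    illustrative convention, not a real library contract)."""
    for attempt in range(1, max_attempts + 1):
        try:
            return operation()
        except Exception as exc:
            number = getattr(exc, "error_number", None)
            if number not in TRANSIENT_ERRORS or attempt == max_attempts:
                raise  # permanent error, or out of retries
            # Back off, then retry using the SAME connection string; after a
            # failover the logical server routes to the new primary replica.
            time.sleep(base_delay * (2 ** (attempt - 1)) + random.uniform(0, 0.2))
```

Because the connection string does not change across a failover, the retry simply re-runs the same operation; only the wait-and-retry policy is application-specific.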
How to help protect your database from unwanted deletions or modifications
User or application error is one of the most common data-loss or corruption scenarios in many software applications. A user or an application could drop a table by mistake or submit a transaction twice. These types of mistakes are hard to control and recover from. You can use the following tools to deal with such problems:
Database Copy
SQL Database Import/Export Service
Bcp and SQL Server Integration Services
Database Copy allows you to create a copy of your database either on the same server or on a different server in the same data center. It is an online, asynchronous, and transactionally consistent operation. Because it is an asynchronous operation, you can issue the copy command and then monitor the progress by querying the sys.dm_database_copies (SQL Database) system view.
To copy a SQL Database instance, your login must be a member of the server-level dbmanager role on the destination server and must be the database owner (DBO) of the source database on the source server. The login must have the same login name and password on both the source and destination SQL Database servers. The frequency at which you copy your database can vary and depends on business needs. To recover from user or application errors, we recommend that you create a daily copy and maintain two or three running copies on a rotating basis by dropping the oldest copy every day after a fresh copy completes.
Note that although we recommend daily copies, you can copy your database more frequently; however, we recommend that you perform the database copy operation no more often than hourly. Each database copy process, although executing independently of all other database copy processes, produces a copy of the database that is transactionally consistent as of the end of the copy process. Each copy counts toward the limit of 150 databases for each SQL Database server and is charged as a separate database. Therefore, copying too frequently can leave you without available databases in your account and paying needlessly for database copies that are nearly identical. For more information, see the Copying Databases in SQL Database topic in the SQL Database MSDN library.
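The daily copy-and-rotate scheme above can be sketched as follows. The T-SQL syntax (CREATE DATABASE ... AS COPY OF and the sys.dm_database_copies view) is documented SQL Database syntax; the helper functions, naming convention, and server/database names are illustrative assumptions for this sketch.

```python
from datetime import date, timedelta

def copy_command(source_server, source_db, day):
    """Build the T-SQL that starts an asynchronous database copy.
    Run it against the master database of the destination server;
    the date-stamped target name is a convention of this sketch."""
    return (f"CREATE DATABASE [{source_db}_{day:%Y%m%d}] "
            f"AS COPY OF [{source_server}].[{source_db}]")

def rotate_drop_command(source_db, day, keep_days=3):
    """Build the T-SQL that drops the copy that has aged out of the
    rotation, so only `keep_days` running copies remain."""
    expired = day - timedelta(days=keep_days)
    return f"DROP DATABASE [{source_db}_{expired:%Y%m%d}]"

# Progress of an in-flight copy can be monitored on the destination server:
MONITOR_QUERY = "SELECT * FROM sys.dm_database_copies"
```

A daily job would issue copy_command for today, poll with MONITOR_QUERY until the copy completes, and only then issue rotate_drop_command, ensuring the oldest copy is dropped after the fresh copy exists.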
In addition to Database Copy, you can use the SQL Database Import/Export Service. This service allows you to import or export both data and schema in a package with the .bacpac extension. The package is in a compressed format containing all SQL Database compatible objects such as tables, views, indexes, constraints, triggers, stored procedures, logins, users, and so on. The service can directly import or export BACPAC files between a SQL Database instance and Windows Azure Blob storage. You can access the Import/Export Service via the Windows Azure Management Portal. If you would like to import or export directly between an on-premises SQL Server and SQL Database without using Windows Azure Blob storage, use the classes provided in the Microsoft.SqlServer.Dac namespace. Similarly, you can use DacIESvcCli.exe in the SQL DAC Examples provided at the CodePlex site.
Unlike Database Copy, the Import/Export Service does not produce a backup that is transactionally consistent. To obtain a consistent export, we recommend that you lock down your database and stop transactions before exporting the data and schema; alternatively, because a database copy is transactionally consistent, you can create a copy and export from the copy instead.
The bulk copy utility (BCP.exe), SQL Server Integration Services (SSIS), and the System.Data.SqlClient.SqlBulkCopy class offer alternatives to the Import/Export Service. SQL Database currently supports BCP, the Bulk Copy API, and SSIS for moving data. You need to create the schema objects in SQL Database before loading the data. Using BCP or SSIS as a bulk copy mechanism lets you control which objects you move from within a database and which data you move from those objects. You can also specify parameters such as batch size, packet size, and number of streams to achieve the best throughput for your network bandwidth and latency.
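As one example of tuning those parameters, a bcp invocation for loading a data file into a SQL Database table might be assembled as below. The bcp flags shown (-S, -U, -P, -b for batch size, -a for packet size, -n for native format) are standard bcp options; the server, database, table, and credentials are placeholders, and suitable values for -b and -a depend on your network bandwidth and latency.

```python
def bcp_import_args(table, data_file, server, database, user, password,
                    batch_size=10000, packet_size=4096):
    """Assemble the argument list for loading `data_file` into a SQL
    Database table with bcp.exe. -b commits rows in batches of
    `batch_size`; -a sets the network packet size in bytes; -n uses
    native data format. All names and credentials are placeholders."""
    return ["bcp", f"{database}.dbo.{table}", "in", data_file,
            "-S", server, "-U", user, "-P", password,
            "-n", "-b", str(batch_size), "-a", str(packet_size)]
```

The resulting list can be passed to subprocess.run on a machine with the SQL Server client tools installed; smaller batch sizes reduce the cost of retrying a failed batch over a high-latency connection, while larger packet sizes can improve throughput on a fast link.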
How to help protect your database from widespread loss of data center facilities
To help protect against the loss of a data center in the event of a disaster, you need to create offsite storage of database backups outside of the data center in which your database application is deployed. To achieve this, we recommend that you use both the Database Copy feature described in the previous section and the SQL Database Import/Export Service.
We recommend that you use the following suggested tools to manage your overall backup and restore strategy:
Implement a backup and restore strategy to handle user and application errors using:
SQL Database Import/Export Service
Bcp or SQL Server Integration Services
Database Copy
Implement an advanced backup and restore strategy to handle widespread loss of data center facilities using:
Import/Export Service to migrate a database copy to one or more secondary data centers and, optionally, to your own on-premises SQL Server.
For more information on backup, restore, and disaster recovery options in Windows Azure, see the Business Continuity in Windows Azure SQL Database and Business Continuity for Windows Azure articles in the MSDN library.