
Data Migration to Windows Azure SQL Database: Tools and Techniques

This document provides guidance on migrating data definition (schema) and data to Windows Azure SQL Database. It is primarily for one-time migration from SQL Server to SQL Database. For on-going data sharing and SQL Database backup, see SQL Data Sync Overview.

Migration Considerations

Microsoft Windows Azure offers several choices for data storage. You may choose to use one or many in your projects.

Windows Azure SQL Database is SQL Server technology delivered as a service on the Windows Azure Platform. Cloud-based SQL Database solutions can provide many benefits, including rapid provisioning, cost-effective scalability, high availability, and reduced management overhead. SQL Database supports the same tooling and development practices used for on-premises SQL Server applications, so it should be a familiar experience for most developers.

The long-term goal for SQL Server and SQL Database is symmetry and parity in both features and capabilities. However, there are currently differences in architecture and implementation that must be addressed when migrating databases to SQL Database and developing SQL Database solutions.

Before migrating a database to SQL Database, it is important to understand when to migrate and the differences between SQL Database and SQL Server.

When to Migrate

There are three main storage offerings on the Windows Azure platform. Windows Azure Storage contains the Table, Blob, and Queue services. When designing a Windows Azure solution, you should evaluate the different options and use the storage mechanism that provides the best performance for each part of the solution.

 

Storage Offering | Purpose | Maximum Size
Windows Azure SQL Database | Relational database management system | 150 GB
Windows Azure Storage: Blob | Durable storage for large binary objects such as video or audio | 200 GB (block blob) or 1 TB (page blob)
Windows Azure Storage: Table | Durable storage for structured data | 100 TB
Windows Azure Storage: Queue | Durable storage for inter-process messages | 100 TB
Local Storage | Per-instance temporary storage | 250 GB to 2 TB

Local storage provides temporary storage for a locally running application instance. A local store is only accessible by the local instance. If the instance is restarted on different hardware, such as in the case of hardware failure or hardware maintenance, data in the local store does not follow the instance. If your application requires durable data, needs to share data between instances, or needs to access data from outside Windows Azure, consider using a Windows Azure Storage account or Windows Azure SQL Database instead.

SQL Database provides data-processing capabilities through queries, transactions and stored procedures that are executed on the server side, and only the results are returned to the application. If you have an application that requires data processing over large data sets, then SQL Database is a good choice. If you have an application that stores and retrieves large datasets but does not require data processing, then Windows Azure Table Storage is a better choice.

Because the SQL Database size limit is currently 150 GB, and SQL Database is much more expensive than Windows Azure Storage, consider moving blob data into Windows Azure Blob Storage. Doing so reduces pressure on the database size limit and reduces operational cost.

For more information, see Data Storage Offerings on the Windows Azure platform.

Compare Windows Azure SQL Database to SQL Server

Similar to SQL Server, SQL Database exposes a tabular data stream (TDS) interface for Transact-SQL-based database access. This allows your database applications to use SQL Database in the same way that they use SQL Server.

Unlike SQL Server administration, SQL Database abstracts the logical administration from the physical administration; you continue to administer databases, logins, users, and roles, but Microsoft administers and configures the physical hardware such as hard drives, servers, and storage. Since Microsoft handles all of the physical administration, there are some differences between SQL Database and SQL Server in terms of administration, provisioning, Transact-SQL support, programming model, and features.

The following list provides a high-level overview of some of the major differences:

  • Database Size

    SQL Database currently offers two editions:

    • Web Edition, in 1 GB and 5 GB sizes.

    • Business Edition, in 10, 20, 30, 40, 50, 100, and 150 GB sizes.

    It is important to check the size of your database and how it fits within the size allowances of SQL Database. If your database is larger than the SQL Database size limit, you must examine whether it can be broken down into smaller databases (that is, sharded), or whether large data can be moved to Windows Azure Blob storage. For more information on database sharding, see Federation: Building Scalable, Elastic, and Multi-tenant Database Solutions with Windows Azure SQL Database.

  • Authentication

    SQL Database supports only SQL authentication. You must consider whether changes are needed to the authentication scheme used by your application. For more information on the security limitations, see Security Guidelines and Limitations.

  • SQL Server Database Version

    SQL Database is based on SQL Server 2008 (compatibility level 100). If you want to migrate SQL Server 2000 or SQL Server 2005 databases to SQL Database, you must make sure they are compatible with SQL Server 2008. The best path is to migrate from SQL Server 2008 to SQL Database, so consider an on-premises upgrade to SQL Server 2008 before you migrate to SQL Database. Here are some great resources to help you with migrating from older versions of SQL Server: Upgrading to SQL Server 2008 R2 and Microsoft SQL Server 2008 Upgrade Advisor.

  • Schema

    SQL Database does not support heaps. All tables must have a clustered index before data can be inserted. For more information on the clustered index requirement, see Inside Windows Azure SQL Database.

  • Transact-SQL Supportability

    Windows Azure SQL Database supports a subset of the Transact-SQL language. You must modify the script to only include supported Transact-SQL statements before you deploy the database to SQL Database. For more information, see Supported Transact-SQL Statements, Partially Supported Transact-SQL Statements, and Unsupported Transact-SQL Statements.

  • The Use Statement

    In SQL Database, the USE statement does not switch between databases. To change databases, you must connect directly to the target database.

  • Pricing

    Pricing for your SQL Database subscription is per database, and it is based on the edition. There are also additional charges for data transfer volume, any time data comes into or out of the data center. You have your choice of running your application code on your own premises and connecting to your SQL Database in the data center, or running your application code in Windows Azure, which is hosted in the same data center as your SQL Database. Running application code in Windows Azure avoids the additional data transfer charges. In either case, you should be aware of the Internet network latency that cannot be mitigated using either model. For more information, see Pricing Overview.

  • Feature Limitations

    Some of the SQL Server features are not currently supported by SQL Database. They include: SQL Agent, Full-text search, Service Broker, backup and restore, Common Language Runtime, and SQL Server Integration Services. For a detailed list, see SQL Server Feature Limitations.
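Because the USE statement cannot switch databases (as noted above), changing databases from client code means opening a new connection with a different connection string. The following is a minimal, hypothetical sketch; the helper function, server, and database names are illustrative, not an official API:

```python
def build_connection_string(server, database, user, password):
    """Build an ADO.NET-style connection string for one SQL Database.

    Because USE cannot switch databases in SQL Database, each database
    you work with needs its own connection built this way.
    """
    return (
        f"Server=tcp:{server}.database.windows.net;"
        f"Database={database};"
        f"User ID={user}@{server};"
        f"Password={password};"
        "Encrypt=True;"
    )

# One connection string per database, instead of a USE statement:
conn_staging = build_connection_string("myserver", "StagingDB", "admin", "pw")
conn_reports = build_connection_string("myserver", "ReportsDB", "admin", "pw")
```

Each string is then passed to your data-access library to open a separate connection per database.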

Connection Handling

A cloud-based database such as SQL Database requires connections over the Internet or other complex networks. Because of this, you should be prepared to handle unexpected connection drops.

SQL Database provides a large-scale multi-tenant database service on shared resources. In order to provide a good experience to all SQL Database customers, your connection to the service may be closed under several conditions.

The following is a list of causes of connection terminations:

  • Network latency

    Latency causes an increase in time required to transfer data to SQL Database. The best way to mitigate this effect is to transfer data using multiple concurrent streams. However, the efficiency of parallelization is capped by the bandwidth of your network.

    SQL Database allows you to create your database in different data centers. Depending on your location and your network connectivity, you will get different network latencies between your location and each of the data centers. To help reduce network latency, select the data center closest to the majority of your users. For information on measuring network latency, see Testing Client Latency to Windows Azure SQL Database.

    Hosting your application code in Windows Azure is beneficial to the performance of your application because it minimizes the network latency associated with your application's data requests to SQL Database.

    Minimizing network round trips can also help with reducing network related problems.

  • Database Failover

    SQL Database replicates multiple redundant copies of your data to multiple physical servers to maintain data availability and business continuity. In the case of hardware failures or upgrades, SQL Database provides automatic failover to optimize availability for your application. Currently, some failover actions result in an abrupt termination of a session.

  • Load Balancing

    The load balancer in SQL Database ensures the optimal usage of the physical servers and services in the data centers. When the CPU utilization, input/output (I/O) latency, or the number of busy workers for a machine exceeds thresholds, SQL Database might terminate the transactions and disconnect the sessions.

  • Throttling

    To ensure that all subscribers receive an appropriate share of resources and that no subscriber monopolizes resources at the expense of other subscribers, SQL Database may close or “throttle” subscriber connections under certain conditions. SQL Database Engine Throttling service continually monitors certain performance thresholds to evaluate the health of the system and may initiate varying levels of throttling to particular subscribers depending on the extent to which these subscribers are impacting system health.

    The following performance thresholds are monitored by SQL Database Engine Throttling:

    • Percentage of the space allocated to a SQL Database physical database that is in use; the soft and hard limit percentages are the same.

    • Percentage of space allocated for SQL Database log files that is in use. Log files are shared between subscribers. The soft and hard limit percentages are different.

    • Milliseconds of delay when writing to a log drive; the soft and hard limit percentages are different.

    • Milliseconds of delay when reading data files; the soft and hard limit percentages are the same.

    • Processor usage; the soft and hard limit percentages are the same.

    • Size of individual databases relative to the maximum size allowed for the database subscription; the soft and hard limit percentages are the same.

    • Total number of workers serving active requests to databases; the soft and hard limit percentages are different. If this threshold is exceeded, the criteria for choosing which databases to block differ from the other thresholds: databases using the highest number of workers are more likely to be throttling candidates than databases experiencing the highest traffic rates.

For more information, see Windows Azure SQL Database Connection Management, and Windows Azure SQL Database Performance and Elasticity Guide.

The best way to handle connection loss is to re-establish the connection and then re-execute the failed commands or query. For more information, see Transient Fault Handling Framework.
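A minimal sketch of this reconnect-and-retry pattern follows; the error type and operation are placeholders for your own data-access code, not the actual Transient Fault Handling Framework:

```python
import time

class TransientConnectionError(Exception):
    """Placeholder for the errors raised when SQL Database drops a connection."""

def execute_with_retry(operation, max_retries=5, base_delay=1.0):
    """Re-execute an operation after a transient failure.

    Uses exponential backoff between attempts so retries do not add to
    the load that may have caused throttling in the first place.
    """
    for attempt in range(max_retries):
        try:
            return operation()
        except TransientConnectionError:
            if attempt == max_retries - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))

# Example: an operation that fails twice with a dropped connection, then succeeds.
calls = {"n": 0}
def flaky_insert():
    calls["n"] += 1
    if calls["n"] < 3:
        raise TransientConnectionError()
    return "committed"

result = execute_with_retry(flaky_insert, base_delay=0.01)
```

In real code, `operation` would reopen the connection and re-run the failed batch, as the paragraph above describes.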

Optimize Databases for Data Import

There are a few things you can do to your database to improve migration performance:

  • Delay the creation of non-clustered indexes, or disable them. Additional indexes created before loading the data can significantly increase the final database size and the time taken to load the same amount of data.

  • Disable triggers and constraint checking. Triggers may fire when a row is inserted into a table, causing rows to be inserted into other tables. These cascading inserts can cause delays, and you may not want them to occur during a bulk load.

  • Bulk import performance is improved if the data being imported is sorted according to the clustered index on the table. For more information, see Controlling the Sort Order When Bulk Importing Data.
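The preparation steps above can be sketched as a small helper that emits the corresponding T-SQL statements; the table, index, and trigger names below are purely illustrative:

```python
def bulk_load_prep_statements(table, nonclustered_indexes, triggers):
    """Generate T-SQL to disable non-clustered indexes and triggers before a
    bulk load, and the statements to restore them afterwards.

    The index and trigger names are whatever actually exists on your table.
    """
    before = [f"ALTER INDEX {ix} ON {table} DISABLE;" for ix in nonclustered_indexes]
    before += [f"ALTER TABLE {table} DISABLE TRIGGER {tr};" for tr in triggers]
    # REBUILD re-enables a disabled index; ENABLE TRIGGER restores triggers.
    after = [f"ALTER INDEX {ix} ON {table} REBUILD;" for ix in nonclustered_indexes]
    after += [f"ALTER TABLE {table} ENABLE TRIGGER {tr};" for tr in triggers]
    return before, after

before, after = bulk_load_prep_statements(
    "dbo.Orders", ["IX_Orders_CustomerId"], ["trg_Orders_Audit"])
```

You would run the `before` statements, perform the bulk load, and then run the `after` statements to rebuild the indexes and re-enable the triggers.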

Transfer Large Data to SQL Database

SQL Server Integration Services (SSIS) and the bcp utility perform well for large data migrations.

When loading large data to SQL Database, it is advisable to split your data into multiple concurrent streams to achieve the best performance.

By default, all the rows in the data file are imported as one batch. To distribute the rows among multiple batches, specify a batch size whenever it is available. If the transaction for any batch fails, only insertions from the current batch are rolled back. Batches already imported by committed transactions are unaffected by a later failure. It is best to test a variety of batch size settings for your particular scenario and environment to find the optimal batch size.
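The batching and multiple-stream advice above can be sketched as follows; `upload_batch` is a placeholder for the real bulk insert (bcp, SSIS, or your own bulk copy code):

```python
from concurrent.futures import ThreadPoolExecutor

def batched(rows, batch_size):
    """Yield successive batches so each one commits as its own transaction;
    a failure rolls back only the current batch, not earlier ones."""
    for i in range(0, len(rows), batch_size):
        yield rows[i:i + batch_size]

def upload_batch(batch):
    # Placeholder for the real insert into SQL Database.
    return len(batch)

rows = list(range(10_000))
with ThreadPoolExecutor(max_workers=4) as pool:
    # Multiple concurrent streams, each carrying whole batches.
    uploaded = sum(pool.map(upload_batch, batched(rows, batch_size=1_000)))
```

The batch size (1,000 here) and number of streams (4) are arbitrary starting points; as the text notes, test a variety of settings for your scenario.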

Choose Migration Tools

There are various tools available for migrating a database to SQL Database. In general, database migration involves schema migration and data migration. Some tools support one of these tasks; others support both. You could even use the Bulk Copy API to author your own customized data upload application.

Migrate from SQL Server

 

Tools | Schema | SQL Database Compatibility Check | Data | Data Transfer Efficiency | Notes
DAC Package | Yes | Yes | No | N/A | Entity containing all database objects, but no data. Full SQL Database support.
DAC BACPAC Import Export | Yes | Yes | Yes | Good | Export/import of DAC plus data with the DAC framework. Service for cloud-only support available. SQL DAC Examples available on CodePlex.
SSMS Generate Scripts Wizard | Yes | Some | Yes | Poor | Has an explicit option for SQL Database script generation. Good for smaller databases.
bcp | No | N/A | Yes | Good | Efficient transfer of data to an existing table. Each bcp command transfers one table.
SQL Database Migration Wizard | Yes | Yes | Yes | Good | Great capabilities, e.g. evaluating trace files. Open source on CodePlex. Not supported by Microsoft.
SQL Server Integration Services | No | N/A | Yes | Good | Most flexibility.
SQL Server Import and Export Wizard | No | N/A | Yes | Good | Simple UI on top of SSIS; also available in SSMS.

Migrate from Other RDBMSs

SQL Server Migration Assistant (SSMA) can be used to migrate databases from Access, MySQL, Oracle, and Sybase to SQL Database.

Microsoft Codename “Data Transfer” can transfer data in CSV or Excel files to SQL Database.

Migrate between SQL Databases

To migrate data from one SQL Database to another, you can use SQL Database copy and SQL Data Sync.

SQL Database supports a database copy feature. This creates a new database in SQL Database that is a transactionally consistent copy of an existing database. To copy a database, you must be connected to the master database of the SQL Database server where the new database will be created, and use the CREATE DATABASE command:

CREATE DATABASE destination_database_name AS COPY OF 
[source_server_name.]source_database_name

The new database can be on the same server or on a different server. The user executing this statement must be in the dbmanager role on the destination server (to create a new database) and must be dbowner in the source database. For more information, see Copying Databases in Windows Azure SQL Database.

SQL Database Data Sync enables creating and scheduling regular synchronizations between SQL Database and either SQL Server or other SQL Databases. For more information, see SQL Data Sync Overview.

Using Migration Tools

Data-tier Application DAC Package

Data-tier Applications (DACs) were introduced in SQL Server 2008 R2, with developer tool support in Visual Studio 2010. They are useful for packaging the schema, code, and configuration of a database for deployment to another server. When a DAC is ready to deploy, it is built into a DAC package (.dacpac), which is a compressed file that contains the DAC definition in XML format. From SQL Server Management Studio, you can export a database schema to a DAC package, and then deploy the package to SQL Database.

Note
The DACPAC format is different from the BACPAC format. The BACPAC format extends the DACPAC format to include a metadata file and JavaScript Object Notation (JSON) encoded table data, in addition to the standard .dacpac file contents. The BACPAC format is discussed in the DAC Import Export section.

You have the option to modify the DAC package using Visual Studio 2010 before deployment. Within the DAC project, you can specify pre-deployment and post-deployment scripts. These are Transact-SQL scripts that can perform any action, including inserting data in the post-deployment scripts. However, it is not recommended to insert a large amount of data using a DAC package.

Installation and Usage

DAC ships with SQL Server 2008 R2. There are two main steps involved in migrating a SQL Server database schema to SQL Database:

  1. Extract a DAC package from a SQL Server database:

    The Extract Data-tier Application Wizard can be used to build a DAC package based on an existing database. The DAC package contains the selected objects from the database, and associated instance-level objects such as the logins that map to database users. 

    Here is a screenshot of opening the wizard:

    Extract Data-tier Application Wizard

    The wizard involves the following main steps:

    1. Set the DAC properties, including DAC application name, version, description, and the package file location.

    2. Validate that all the database objects are supported by a DAC.

    3. Build the package.

    A DAC can only be extracted from a database in SQL Database, or in SQL Server 2005 Service Pack 4 (SP4) or later. You cannot extract a DAC if the database has objects that are not supported in a DAC, or if it has contained users. For more information about the types of objects supported in a DAC, see DAC Support For SQL Server Objects and Versions.

  2. Deploy the DAC package to SQL Database:

    The Deploy Data-tier Application Wizard can be used to deploy a DAC package. You must first connect to the SQL Database server from SQL Server Management Studio. The wizard creates the database if the database doesn’t exist. The wizard deploys the DAC package to the instance of the Database Engine associated with the node you selected in the Object Explorer hierarchy. For example, in the following screenshot, it deploys the package to the SQL Server called maqqarly23.database.windows.net:

    Deploy Data-tier Application Wizard
    Important
    It is a good practice to review the contents of a DAC package before deploying it in production, especially when the deployed package was not developed in your organization. For more information, see Validate a DAC Package.

    The wizard involves the following main steps:

    1. Select the DAC package.

    2. Validate the content of the package.

    3. Configure the database deployment properties, where you specify the SQL Database.

    4. Deploy the package.

Besides using the wizard, you can also use PowerShell with the DacStore.Install() method to migrate the schema to SQL Database.

Resources

Data-tier Application BACPAC Package

A data-tier application (DAC) is a self-contained unit for developing, deploying, and managing data-tier objects. DAC enables data-tier developers and database administrators to package Microsoft SQL Server objects, including database objects and instance objects, into a single entity called a DAC package (.dacpac file). The BACPAC format extends the DACPAC format to include a metadata file and JavaScript Object Notation (JSON)–encoded table data in addition to the standard .dacpac file contents. You can package your SQL Server database into a .bacpac file, and use it to migrate the database to SQL Database.

Note
DACPAC and BACPAC are similar but target very different scenarios. DACPAC is focused on capturing and deploying schema. Its primary use case is deploying to development, testing, and then production environments.

BACPAC is focused on capturing schema and data. It is the logical equivalent of a database backup and cannot be used to upgrade existing databases. BACPAC’s primary use cases are moving a database from one server to another (or to SQL Database) and archiving an existing database in an open format.

The Import and Export Service for SQL Database is currently available as a public CTP. The service can directly import or export BACPAC files between a SQL Database and Windows Azure Blob storage. The Import and Export Service for SQL Database provides some public REST endpoints for the submission of requests.

The Windows Azure Platform Management Portal has an interface for calling the Import and Export Service for SQL Database.

Import and Export Service For SQL Database

SQL Server Management Studio currently does not support exporting a database into a BACPAC file. You can use the DAC API to import and export data instead.

The SQL DAC Examples show how to use the Data-tier Application Framework API to migrate databases from SQL Server to SQL Database. The package provides two command line utilities and their source code:

  • DAC Import and Export Client-side tools can be used to export and import bacpac files.

  • DAC Import and Export Service client can be used to call the Import and Export Service for SQL Database to import and export bacpac files between Windows Azure Blob storage and a SQL Database.

    One way to copy a bacpac file to Windows Azure Blob storage is to use Microsoft Codename “Data Transfer”. For more information, see the Microsoft Codename Data Transfer section.

Note
The ability to import and export data to SQL Database using the Data-tier Application (DAC) framework is currently only available as CodePlex examples. The tools are only supported by the community.

Installation and Usage

This section demonstrates how to use the client tools of the SQL DAC Examples for migrating a database from SQL Server to SQL Database.

The SQL DAC Examples can be downloaded from CodePlex. To run the sample, you must also install Data-Tier Application Framework on your computer.

Before using the tools to migrate a database, you must first create the destination SQL Database. There are two steps involved in using the tool for migration:

  1. Export a SQL Server Database

    Assume a database exists that is running on SQL Server 2008 R2, and that the user has integrated security access to it. The database can be exported to a .bacpac file by calling the sample EXE with the following arguments:

    DacCli.exe -s serverName -d databaseName -f C:\filePath\exportFileName.bacpac -x -e
    
  2. Import the Package to SQL Database

    Once exported, the export file can be imported to a SQL Database with the following arguments:

    DacCli.exe -s serverName.database.windows.net -d databaseName -f C:\filePath\exportFileName.bacpac -i -u userName -p password
    

Resources

Generate Scripts Wizard

The Generate Scripts Wizard can be used to create Transact-SQL scripts for a SQL Server database and/or related objects within the selected database. You can then use the scripts to transfer schema and/or data to SQL Database.

Installation and Usage

The Generate Scripts Wizard is installed with SQL Server 2008 R2. The wizard can be opened from SQL Server Management Studio 2008 R2. The following screenshot shows how to open the wizard:

Generate Scripts Wizard

The wizard involves the following main steps:

  1. Choose objects to export.

  2. Set scripting options. You have the option to save the script to a file, the clipboard, or a new query window, or to publish it to a Web service.

  3. Set advanced scripting options.

    By default, the script is generated for a stand-alone SQL Server instance. To change the configuration, click the Advanced button in the Set Scripting Options dialog, and then set the Script for the database engine type property to SQL Database.

    Advanced Scripting Options

    You can also set the Types of data to script property to one of the following, based on your requirements: Schema only, Data only, or Schema and data.

After the script is created, you have the option to modify it before running it against a SQL Database to transfer the database.

Resources

bcp

The bcp utility is a command-line utility designed for high-performance bulk copy to and from SQL Server or SQL Database. It is not a migration tool: it does not extract or create schema. You must first transfer the schema to SQL Database using one of the schema migration tools.

Note
You can use bcp to back up and restore your data on SQL Database.

Note
The SQL Database Migration Wizard uses bcp.

Installation and Usage

The bcp utility is shipped with SQL Server. The version shipped with SQL Server 2008 R2 is fully supported by SQL Database.

There are two steps involved in using bcp:

  1. Export the data into a data file.

    To export data out of SQL Server database, you can run the following statement at command prompt:

    bcp tableName out C:\filePath\exportFileName.dat -S serverName -T -n -q
    
    The out parameter indicates copying data out of SQL Server. The -n parameter performs the bulk-copy operation using the native database data types of the data. The -q parameter executes the SET QUOTED_IDENTIFIERS ON statement in the connection between the bcp utility and your SQL Server instance.

  2. Import the data file into SQL Database

    To import data into your SQL Database, you must first create the schema in the destination database, and then run the bcp utility from a command prompt:

    bcp tableName in C:\filePath\exportFileName.dat -n -U userName@serverName -S tcp:serverName.database.windows.net -P password -b batchSize
    
    The -b parameter specifies the number of rows per batch of imported data. Each batch is imported and logged as a separate transaction that imports the whole batch before being committed. Identifying the best batch size and using it is a good practice for reducing connection loss to SQL Database during data migration.

The following are some best practices for using bcp to transfer a large amount of data:

  1. Use the -N option to transfer data in native mode so that no data type conversion is needed.

  2. Use the -b option to specify the batch size. By default, all the rows in the data file are imported as one batch. In case of a transaction failure, only insertions from the current batch are rolled back.

  3. Use the -h "TABLOCK, ORDER(…)" option. The TABLOCK hint specifies that a bulk-update table-level lock is acquired for the duration of the bulk load operation; otherwise, a row-level lock is acquired. This can reduce lock contention on the table. The ORDER(…) hint specifies the sort order of the data in the data file. Bulk import performance is improved if the data being imported is sorted according to the clustered index on the table.

You can use the -F and -L options to specify the first and last rows of a flat file for the upload. This is useful for avoiding having to physically split the data file to achieve a multiple-stream upload.
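As a sketch, the first/last row ranges for several parallel bcp invocations over a single data file can be computed like this; the table name and file path in the printed commands are placeholders:

```python
def bcp_stream_ranges(total_rows, streams):
    """Split rows 1..total_rows into contiguous (-F first, -L last) ranges,
    one per concurrent bcp invocation, without splitting the data file."""
    per = total_rows // streams
    ranges = []
    first = 1
    for s in range(streams):
        # Last stream absorbs any remainder rows.
        last = total_rows if s == streams - 1 else first + per - 1
        ranges.append((first, last))
        first = last + 1
    return ranges

# Four concurrent uploads of a 1,000,000-row file:
for first, last in bcp_stream_ranges(1_000_000, 4):
    print(f"bcp dbo.MyTable in data.dat -n -F {first} -L {last} ...")
```

Each printed command would run in its own process, giving multiple concurrent streams over one file.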

Resources

SQL Database Migration Wizard

The SQL Database Migration Wizard is an open-source UI tool that helps migrate SQL Server 2005/2008 databases to SQL Database. In addition to migrating data, it can also be used to identify compatibility issues, fix them where possible, and notify you of all issues it knows about.

The SQL Database Migration Wizard has built-in logic for handling connection loss. It breaks transactions into smaller batches and runs until SQL Database terminates the connection. When the wizard encounters a connection error, it re-establishes a new connection with SQL Database and picks up processing after the last successful command. In the same manner, when using bcp to upload data to SQL Database, the wizard chunks the data into smaller sections and uses retry logic to figure out the last successful record uploaded before the connection was closed. It then has bcp restart the data upload with the next set of records.

Note
The SQL Database Migration Wizard is an open source tool built and supported by the community.

Installation and Usage

The SQL Database Migration Wizard can be downloaded from http://sqlazuremw.codeplex.com. Unzip the package to your local computer, and run SQLAzureMW.exe. Here is a screenshot of the application:

SQL Database Migration Wizard

The wizard involves the following steps:

  1. Select the process you want the wizard to walk you through.

  2. Select the source you want to script.

  3. Select database objects to script.

  4. Generate the script. You have the option to modify the script afterwards.

  5. Enter information for connecting to the target server. You have the option to create the destination SQL Database.

  6. Execute script against destination server.

Resources

SQL Server Integration Services

SQL Server Integration Services (SSIS) can be used to perform a broad range of data migration tasks. It is a powerful tool when operating on multiple heterogeneous data sources and destinations, and it provides support for complex workflow and data transformation between the source and destination. Even though SSIS is not currently supported to run inside SQL Database, you can run it on an on-premises SQL Server 2008 R2 instance to transfer data to Windows Azure SQL Database.

The SSIS Import/Export Wizard lets the user create packages that move data from a single data source to a destination with no transformations. The Wizard can quickly move data from a variety of source types to a variety of destination types, including text files and other SQL Server instances.

Installation and Usage

You must use the SQL Server 2008 R2 version of SSIS to connect to SQL Database.

The ADO.NET adapters provide the required support for SQL Database, including an option to bulk load data specifically for SQL Database. Use the ADO.NET Destination adapter to transfer data to SQL Database; connecting to Windows Azure SQL Database by using OLE DB is not supported.

The following is a screenshot for configuring the ADO.NET Connection to SQL Database:

Configure ADO.NET Connection Manager

Because your package might fail due to throttling or network issues, consider designing your packages so that they can be resumed at the point of failure rather than restarted from the beginning.

When you configure the ADO.NET destination, make sure to use the “Use Bulk Insert when possible” option. That allows you to use bulk load capabilities to improve the transfer performance.

One way to improve performance is to split the source data into multiple files on the file system. In SSIS Designer, you can reference the files using the Flat File source component. Each input file is then connected to an ADO.NET destination component that has the Use Bulk Insert when possible option checked.
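The file-splitting step can be sketched as follows; this is a generic round-robin split, and the output file names are hypothetical:

```python
import csv
import io

def split_rows(rows, parts):
    """Distribute rows round-robin into `parts` buckets, one per flat file,
    so each SSIS Flat File source gets a similar share of the data."""
    buckets = [[] for _ in range(parts)]
    for i, row in enumerate(rows):
        buckets[i % parts].append(row)
    return buckets

rows = [["id", str(i)] for i in range(10)]
buckets = split_rows(rows, 3)

# Each bucket would be written out as its own CSV, e.g. source_0.csv, source_1.csv, ...
files = []
for b in buckets:
    buf = io.StringIO()
    csv.writer(buf).writerows(b)
    files.append(buf.getvalue())
```

In a real migration you would write each bucket to disk and point one Flat File source (and one ADO.NET destination) at each file.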

Resources

SQL Server Import and Export Wizard

The SQL Server Import and Export Wizard offers the simplest method to create a SQL Server Integration Services package for a simple import or export. This wizard configures the connections, source, and destination, and adds any data transformations that are required to let you run the import or export immediately. After a package is created, you have the option to modify the package in SSIS Designer.

The wizard supports the following data sources:

  • .NET Framework Data Provider for ODBC

  • .NET Framework Data Provider for Oracle

  • .NET Framework Data Provider for SQL Server

  • Flat File Source

  • Microsoft OLE DB Provider for Analysis Services 10.0

  • Microsoft OLE DB Provider for Search

  • Microsoft OLE DB Provider for SQL Server

  • SQL Native Client

  • SQL Server Native Client 10.0

The SQL Server Import and Export Wizard can only transfer data. Before using the wizard, you must transfer the schema by using one of the schema migration tools, such as the Generate Scripts Wizard or a DAC package.

Note
On a 64-bit computer, Integration Services installs the 64-bit version of the SQL Server Import and Export Wizard (DTSWizard.exe). However, some data sources, such as Access or Excel, only have a 32-bit provider available. To work with these data sources, you might have to install and run the 32-bit version of the wizard. To install the 32-bit version of the wizard, select either Client Tools or Business Intelligence Development Studio during setup.

Installation and Usage

In SQL Server 2008 R2 or later, the SQL Server Import and Export Wizard supports SQL Database. There are several ways to start the wizard:

  1. On the Start menu, point to All Programs, point to Microsoft SQL Server 2008, and then click Import and Export Data.

  2. In Business Intelligence Development Studio, right-click the SSIS Packages folder from Solution Explorer, and then click SSIS Import and Export Wizard.

  3. In Business Intelligence Development Studio, on the Project menu, click SSIS Import and Export Wizard.

  4. In SQL Server Management Studio, connect to the Database Engine server type, expand Databases, right-click a database, point to Tasks, and then click Import Data or Export Data.

  5. In a command prompt window, run DTSWizard.exe, located in C:\Program Files\Microsoft SQL Server\100\DTS\Binn.

The migration involves the following main steps:

  1. Choose a data source from which to copy data.

  2. Choose a destination to copy the data to.

    To export data to SQL Database, you must choose the .NET Framework Data Provider for SQL Server as the destination:

  3. Specify table copy or query.

  4. Select source objects.

  5. Save and run the package.

After the SQL Server Import and Export Wizard has created the package, you can optionally save the package to run it again later, or to refine and enhance the package in SQL Server Business Intelligence (BI) Development Studio.
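A saved package can also be rerun later from the command line with the dtexec utility. The following Python sketch builds such a command line; the package file name is a hypothetical example, while /F (load a package from the file system) and /CheckPointing are standard dtexec options:

```python
def build_dtexec_command(package_path, checkpointing=True):
    """Build a dtexec command line to rerun a saved .dtsx package.

    /F loads the package from the file system; /CheckPointing on lets a
    failed run resume from its last checkpoint, provided the package
    itself was designed with a checkpoint file.
    """
    cmd = ["dtexec", "/F", package_path]
    if checkpointing:
        cmd += ["/CheckPointing", "on"]
    return cmd
```

Passing the resulting list to subprocess.run would execute the package on a machine where Integration Services is installed.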

Note
If you save the package, you must add the package to an existing Integration Services project before you can change the package or run the package in BI Development Studio.


Microsoft Codename “Data Transfer”

Microsoft Codename “Data Transfer” is a cloud service that lets you transfer data from your computer to a SQL Database or Windows Azure Blob storage. You can upload any data format to Windows Azure Blob storage, and data that is stored in comma-separated value (CSV) or Microsoft Excel format (.xlsx) to a SQL Database. When you upload data to a SQL Database, it is transformed into database tables.

Usage

The Data Transfer service is accessible at https://web.datatransfer.azure.com/. From the home page, you have options to import data and to manage your datasets and stores.

Importing data to SQL Database involves the following steps:

  • Enter your SQL Database credentials.

  • Choose a file to transfer.

  • Analyze the data file and then transfer the data.


SQL Server Migration Assistant

SQL Server Migration Assistant (SSMA) is a family of products that reduce the cost and risk of migrating from Oracle, Sybase, MySQL, and Microsoft Access databases to SQL Database or SQL Server. SSMA automates all aspects of migration, including migration assessment analysis, schema and SQL statement conversion, data migration, and migration testing.

Installation and Usage

SSMA is a Web download. To download the latest version, see the SQL Server Migration Tools product page.

SSMA is installed by using a Windows Installer-based wizard. SSMA is free, but you must download a registration key. After you install and run the application, it prompts you to register and download the registration key.

The migration process of the SSMA for Access involves the following steps:

  1. Create a new migration wizard. Make sure to select SQL Database in the Migrate To box.

  2. Add Access databases.

  3. Select the Access objects to migrate.

  4. Connect to SQL Database.

  5. Link tables. If you want to use your existing Access applications with SQL Database, you can link your original Access tables to the migrated SQL Database tables. Linking modifies your Access database so that your queries, forms, reports, and data access pages use the data in the SQL Database instead of the data in your Access database.

  6. Convert selected objects.

  7. Load converted objects into SQL Database.

  8. Migrate data for selected Access objects.




Build Date:

2013-04-18