
How to: Use bcp to Migrate a Database to Azure SQL Database

Updated: April 9, 2014

You can use the SQL Server bcp utility for high-performance data movement when migrating a SQL Server database to Microsoft Azure SQL Database. You first use bcp to copy the data out of a source table and into a data file. You then run it again to copy the data from the data file into the destination table. bcp moves only data, so it must be used with another process for database schema migration.

Author: Shaun Tinline-Jones

The bcp utility is a command-line utility that is designed for high-performance bulk upload to SQL Server or Microsoft Azure SQL Database. It is not a migration tool. It does not extract or create a schema. You must first transfer the schema to a database in Microsoft Azure SQL Database using one of the schema migration tools, such as the Generate Scripts wizard, or by extracting and deploying a data-tier application (DAC) package. For help identifying a schema migration process, see Choosing Tools to Migrate a Database to Azure SQL Database.

The bcp utility calls the SQL Server bulk copy functionality that is also exposed in the SQL Server application programming interfaces (APIs). Several of the migration tools, such as the Microsoft Azure SQL Database Migration Wizard and DAC BACPACs, also use bulk copy functionality to transfer data.

Use bulk copy best practices to improve performance when copying data into a large destination table. For example:

  • Use the -N option to transfer data in native mode so that no data type conversion is needed.

  • Use the -b option to specify a batch size. Each batch is imported and logged as a separate transaction; by default, all the rows in the data file are imported as one batch. If a transaction fails, only insertions from the current batch are rolled back. Identifying and using the best batch size is a good practice for limiting how much work is lost if the connection to Microsoft Azure SQL Database drops during data migration.

  • Use bcp hints:

    • Use the -h "TABLOCK" hint on the import to specify using a bulk update table-level lock for the duration of the bulk load operation. This reduces lock overhead by using a single table lock instead of a lock for each row.

    • Use the -h "ORDER(…)" hint on the import to declare the sort order of the data file. Bulk import performance is improved if the data being imported is sorted according to the clustered index on the table, as shown in the sketch after this list.

  • For large tables, split the import copy into multiple streams that you can run concurrently. If you bulk copied the data from the source table into a single data file, use the -F firstrow and -L lastrow parameters to specify which part of the data file is processed by each run of bcp.
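
As a sketch of the sort-order and lock hints, the following pair of commands exports a hypothetical dbo.Orders table pre-sorted by its clustered index key (assumed here to be OrderID) using queryout, then declares that order on the import so the clustered index is populated sequentially. All server, database, table, and credential names are placeholders:

    rem Export pre-sorted by the assumed clustered index key (OrderID); names are placeholders
    bcp "SELECT * FROM sourceDb.dbo.Orders ORDER BY OrderID" queryout C:\filePath\Orders.dat -S sourceServer -T -n

    rem Import, declaring the file's sort order and taking a table-level lock
    bcp dbo.Orders in C:\filePath\Orders.dat -n -U userName@serverName -S tcp:serverName.database.windows.net -P password -b 10000 -h "ORDER(OrderID ASC), TABLOCK"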

For more information about bulk copy best practices, see Optimizing Bulk Import Performance.

If you are using IDENTITY to generate the primary keys in a table, use the bcp -E parameter to preserve the keys generated in the source database. Using -E should prevent any foreign key violations during the import, provided no other updates are made to the tables while the import is running. Ensure that no other updates are possible, for example by putting the database in read-only mode.
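
For example, the following sketch makes the source database read-only before the export and preserves identity values on the import; the server, database, and table names are assumptions:

    rem Block further updates to the source (assumed database name: sourceDb)
    sqlcmd -S sourceServer -E -Q "ALTER DATABASE sourceDb SET READ_ONLY WITH ROLLBACK IMMEDIATE;"

    rem -E preserves the identity values stored in the data file
    bcp dbo.Orders in C:\filePath\Orders.dat -n -E -U userName@serverName -S tcp:serverName.database.windows.net -P password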

Note
bcp operates on one table at a time, so it does not maintain transactional integrity across multiple tables when extracting data from a source database. You can address this issue by putting the source database in single-user or read-only mode during an export.


The tables in the destination database must be empty for bulk copy import. You cannot perform multiple bulk copy imports to the same table unless you truncate or delete all the rows inserted by the previous bulk copy.
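
If a previous bulk copy attempt left rows behind, you can empty the table before retrying. Here is a sketch using sqlcmd; the connection details and table name are placeholders:

    rem Empty the assumed destination table before re-running the import
    sqlcmd -S tcp:serverName.database.windows.net -U userName@serverName -P password -d destinationDb -Q "TRUNCATE TABLE dbo.tableName;"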


bcp ships with SQL Server. Install the client utilities from SQL Server 2008 R2 or later versions of SQL Server to get a version of bcp supported for use with Microsoft Azure SQL Database.
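
To confirm which version of bcp is on the path, run it with the -v switch:

    rem Prints the bcp version number
    bcp -v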


There are five steps involved in using bcp to move the data from a table in the source database to the copy of the table in the destination database:

  1. Migrate the schema.

    Use a schema transfer mechanism, such as the Generate Scripts wizard or a DAC BACPAC, to create a copy of the database in Microsoft Azure SQL Database. At the end of the process, all of the tables should have been created in the destination database, but they should not yet contain any data.

  2. Export the data into data files.

    For each table in the source SQL Server database, run a bcp out operation to copy the data from the table to a data file. This is an example of exporting the data from one table into a data file:

    bcp tableName out C:\filePath\exportFileName.dat -S serverName -T -n -q
    
    The out parameter indicates copying data out of SQL Server. The -n parameter performs the bulk-copy operation using the native database data types of the data. The -q parameter executes the SET QUOTED_IDENTIFIER ON statement in the connection between the bcp utility and the instance of the Database Engine.
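
    If the source database has many tables, the per-table export can be scripted. Here is a minimal sketch as a Windows batch file; the table names and server name are placeholders:

    @echo off
    rem Export each table to its own native-format data file (table names are placeholders)
    for %%T in (Customers Orders OrderDetails) do (
        bcp dbo.%%T out C:\filePath\%%T.dat -S serverName -T -n -q
    )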

  3. Perform bulk copy optimizations.

    Make any destination database schema changes needed to speed the performance of copying data into large tables, such as disabling non-clustered indexes, triggers, and constraints.
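
    For example, here is a sketch of the kinds of statements involved, run against the destination with sqlcmd; the index, table, and connection names are assumptions:

    rem Disable a non-clustered index, all triggers, and all foreign key/check constraints on the destination table
    sqlcmd -S tcp:serverName.database.windows.net -U userName@serverName -P password -d destinationDb -Q "ALTER INDEX IX_Orders_CustomerID ON dbo.Orders DISABLE; DISABLE TRIGGER ALL ON dbo.Orders; ALTER TABLE dbo.Orders NOCHECK CONSTRAINT ALL;"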

  4. Import the data file into Azure SQL Database.

    Run the bcp utility for each table in the Microsoft Azure SQL Database destination database, copying the data from the export data file to the table. The following example uses three runs of bcp to copy data into a single table from a data file that has approximately 300,000 rows; each run copies about 100,000 of the rows.

    bcp tableName in C:\filePath\exportFileName.dat -n -U userName@serverName -S tcp:serverName.database.windows.net -P password -b 200 -L 99999 -h "TABLOCK"

    bcp tableName in C:\filePath\exportFileName.dat -n -U userName@serverName -S tcp:serverName.database.windows.net -P password -b 200 -F 100000 -L 199999 -h "TABLOCK"

    bcp tableName in C:\filePath\exportFileName.dat -n -U userName@serverName -S tcp:serverName.database.windows.net -P password -b 200 -F 200000 -h "TABLOCK"
    
    The in parameter indicates copying data into Microsoft Azure SQL Database. The -b parameter specifies the number of rows per batch of imported data. The -L lastrow and -F firstrow parameters are used to specify which part of the data file is processed by each run.

  5. Remove schema optimizations.

    Restore any schema items that were removed to optimize the bulk inserts. For example, enable any non-clustered indexes, triggers, or constraints that were disabled in step 3.
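
    Here is a matching sketch that reverses the step 3 changes (same assumed names as above):

    rem Rebuild the disabled index, re-enable triggers, and revalidate constraints against the imported data
    sqlcmd -S tcp:serverName.database.windows.net -U userName@serverName -P password -d destinationDb -Q "ALTER INDEX IX_Orders_CustomerID ON dbo.Orders REBUILD; ENABLE TRIGGER ALL ON dbo.Orders; ALTER TABLE dbo.Orders WITH CHECK CHECK CONSTRAINT ALL;"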

