Generating a BCP Utility Script for SQL Azure

If you are migrating tables from SQL Server to SQL Azure, one of the easiest ways is to use the bcp utility to draw data out of your SQL Server into a file and then move the data from the file to SQL Azure. However, it can be tedious to write all the bcp utility commands by hand, since the bcp utility requires that you execute a single command for each table, moving one table's worth of data at a time (find out more about how to use the bcp utility with SQL Azure in our earlier blog post). Wouldn't it be nice to move all the tables with a single batch file? This article presents a Transact-SQL script that creates a batch file containing all the bcp utility commands you need to move a whole database.
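
For reference, here is the shape of the command pair the script generates for each table. This is a sketch with placeholder names: yourServer, yourDatabase, yourLogin, and the dbo.Customers table are hypothetical, and the flags mirror the ones used in the script below.

REM Export the hypothetical dbo.Customers table from the local SQL Server
bcp.exe yourDatabase.dbo.Customers out dbo.Customers.bcp -q -h "TABLOCK" -N -CRAW -SyourServer -UyourLogin -PyourPassword

REM Import the file into SQL Azure, preserving IDENTITY values with -E
bcp.exe yourDatabase.dbo.Customers in dbo.Customers.bcp -q -h "TABLOCK" -N -E -CRAW -SyourServer.database.windows.net -UyourLogin@yourServer -PyourPassword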

Preliminaries

Before you run the script below:

  • You need to have already created the database schema on SQL Azure before you move data with the bcp utility. You can do this using the Generate Script Wizard; see our previous blog post for more information.
  • The tables in the SQL Azure destination database should be empty, which means that you shouldn't run the bcp utility batch file twice against the same destination database.
  • The script below runs in SQL Server Management Studio connected to the source SQL Server database. You will need to modify the variables at the top of the script to reflect your source SQL Server and destination SQL Azure server before you execute the script.

The Transact-SQL script generates the commands for a batch file using Transact-SQL PRINT statements. After executing the script, just copy the whole output to a file with a .bat extension. Once you have created the batch file, you can run it from the command line to move your database.
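
If you want to skip the copy-and-paste step, one alternative (a sketch, assuming the script is saved in a file named GenerateBcp.sql, a placeholder name) is to run the script through the sqlcmd utility and redirect its output directly into the batch file:

REM Generate the batch file from the script, then run it
sqlcmd -S yourServer -d yourDatabase -U yourLogin -P yourPassword -i GenerateBcp.sql -o MoveDatabase.bat
MoveDatabase.bat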

Preserving Primary Keys

If you are using IDENTITY to generate the primary keys in your database, the generated bcp utility commands will preserve the numbering of your primary keys by using the -E flag. However, the referential integrity of the foreign keys will not be checked when the rows are inserted. This allows rows to be inserted regardless of the dependencies among the tables: primary key rows do not need to be inserted before the foreign key rows that reference them.
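
Because of this, you may want to verify referential integrity on the destination once the load completes. A minimal sketch, assuming DBCC CHECKCONSTRAINTS is available on your destination database:

-- Run against the SQL Azure destination after the batch file finishes.
-- Returns the rows, if any, that violate foreign key or check constraints;
-- an empty result means the load preserved referential integrity.
DBCC CHECKCONSTRAINTS WITH ALL_CONSTRAINTS;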

Because the primary keys are not regenerated, no constraints should be violated as long as the source database is not written to while the batch file is running. Herein lies the hitch: you need to make sure that your source database is either in read-only mode or that no application is writing data to it. The bcp utility commands are not wrapped inside a single large transaction, and significant time can pass between the execution of the first command in the batch file and the last, which leaves a window for writes that could break referential integrity.
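
One way to guard against writes during the transfer is to put the source database into read-only mode before running the batch file and switch it back afterwards. A minimal sketch, where yourDatabase is a placeholder; run it on the source SQL Server:

-- Before running the batch file: block writes to the source database.
-- ROLLBACK IMMEDIATE disconnects sessions that have open transactions.
ALTER DATABASE yourDatabase SET READ_ONLY WITH ROLLBACK IMMEDIATE;

-- After the batch file completes: allow writes again.
ALTER DATABASE yourDatabase SET READ_WRITE;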

The Script

SET NOCOUNT ON

DECLARE @DestServer nvarchar(max)
DECLARE @DestDatabase nvarchar(max)
DECLARE @DestLogin nvarchar(max)
DECLARE @DestPassword nvarchar(max)

DECLARE @SrcServer nvarchar(max)
DECLARE @SrcDatabase nvarchar(max)
DECLARE @SrcLogin nvarchar(max)
DECLARE @SrcPassword nvarchar(max)

-- SQL Azure Server
SET        @DestServer = 'yourServer.database.windows.net'
SET        @DestDatabase = 'yourDatabase'
SET        @DestLogin = 'yourLogin@yourServer'
SET        @DestPassword = 'yourPassword'

-- SQL Server
SET        @SrcServer = 'yourServer'
SET        @SrcDatabase = 'yourDatabase'
SET        @SrcLogin = 'yourLogin'
SET        @SrcPassword = 'yourPassword'

PRINT 'echo %DATE% %TIME% > bcp.log'
PRINT ''

---------------------------------------------------

DECLARE @Schema nvarchar(max)
DECLARE @Table nvarchar(max)

DECLARE table_cursor CURSOR FOR
    SELECT TABLE_SCHEMA, TABLE_NAME
    FROM INFORMATION_SCHEMA.TABLES
    WHERE TABLE_TYPE = 'BASE TABLE'

OPEN table_cursor;

-- Perform the first fetch.
FETCH NEXT FROM table_cursor INTO @Schema, @Table;

-- Check @@FETCH_STATUS to see if there are any
-- more rows to fetch.
WHILE @@FETCH_STATUS = 0
BEGIN

    PRINT 'REM Download ' + @Schema + '.' + @Table
    PRINT 'echo bcp.exe ' + @SrcDatabase + '.' + @Schema + '.' +
        @Table + ' out ' + 
        @Schema + '.' + @Table + '.bcp -q -h "TABLOCK" -N -CRAW -S' +
        @SrcServer + ' >> bcp.log'
    PRINT 'bcp.exe ' + @SrcDatabase + '.' + @Schema + '.' + @Table + 
        ' out ' + @Schema + '.' + @Table + 
        '.bcp -q -h "TABLOCK" -N -CRAW -S' + @SrcServer + 
        ' -U' + @SrcLogin + ' -P' + @SrcPassword + ' >> bcp.log'
    PRINT ''

    PRINT 'REM Upload ' + @Schema + '.' + @Table
    PRINT 'echo bcp.exe ' + @DestDatabase + '.' + @Schema + '.' + 
        @Table + ' in ' + @Schema + '.' + @Table + 
        '.bcp -q -h "TABLOCK" -N -E -CRAW -S' + @DestServer +
        ' >> bcp.log'
    PRINT 'bcp.exe ' + @DestDatabase + '.' + @Schema + '.' +
         @Table + ' in ' + @Schema + '.' + @Table + 
         '.bcp -q -h "TABLOCK" -N -E -CRAW -S' + @DestServer + 
         ' -U' + @DestLogin + ' -P' + @DestPassword + ' >> bcp.log'
    PRINT ''
    
    -- This is executed as long as the previous fetch succeeds.
    FETCH NEXT FROM table_cursor INTO @Schema, @Table;
END

CLOSE table_cursor;
DEALLOCATE table_cursor;
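
To make the output concrete, here is the fragment the script would print for a single hypothetical dbo.Customers table, using the placeholder server names from the variables above; the finished .bat file simply repeats this pattern for every base table:

echo %DATE% %TIME% > bcp.log

REM Download dbo.Customers
echo bcp.exe yourDatabase.dbo.Customers out dbo.Customers.bcp -q -h "TABLOCK" -N -CRAW -SyourServer >> bcp.log
bcp.exe yourDatabase.dbo.Customers out dbo.Customers.bcp -q -h "TABLOCK" -N -CRAW -SyourServer -UyourLogin -PyourPassword >> bcp.log

REM Upload dbo.Customers
echo bcp.exe yourDatabase.dbo.Customers in dbo.Customers.bcp -q -h "TABLOCK" -N -E -CRAW -SyourServer.database.windows.net >> bcp.log
bcp.exe yourDatabase.dbo.Customers in dbo.Customers.bcp -q -h "TABLOCK" -N -E -CRAW -SyourServer.database.windows.net -UyourLogin@yourServer -PyourPassword >> bcp.log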

Summary

Do you have questions, concerns, comments? Post them below and we will try to address them.

  • Thank you very much!  I was dreading importing all my tables to Azure one at a time when I migrate.  You just made it so easy.

    I assume the batch size remains the default 1000? Is there any way to auto-size the "-b" option based on the table schema size, so we don't get the error you mentioned in this blog entry: blogs.msdn.com/.../bcp-utility-upload-errors-in-sql-azure.aspx

    If not, I suppose we can just hardcode "-b100" in the scripts above.

  • Michael: This script is a starting place and a simple example to highlight a single concept: you can script the creation of the bcp batch file. You will need to modify it to meet your needs.

    Yes, the batch file uses the default batch size of 1000 rows.

    Changing the batch size might be a per-table tuning step after you run your batch file, to make sure your data transfers successfully.
