BCP and SQL Azure

BCP is a great way to back up your SQL Azure data locally, and by modifying your BCP-based backup files you can import data into SQL Azure as well. In this blog post, we will learn:

  • How to export data from tables on a SQL Azure server into a data file by using BCP, and
  • How to use the BCP utility to import new rows from a data file into SQL Azure tables.

What is BCP?

The bcp utility is a command line utility that ships with Microsoft SQL Server. It bulk copies data between SQL Azure (or SQL Server) and a data file in a user-specified format. The bcp utility that ships with SQL Server 2008 R2 is fully supported by SQL Azure.

You can use BCP to back up and restore your data on SQL Azure.

You can import large numbers of new rows into SQL Azure tables or export data out of tables into data files by using bcp.

The bcp utility is not a migration tool. It does not extract or create any schema or format information from/in a data file or your table. This means that if you use bcp to back up your data, you should create a schema or format file separately to record the schema of the table you are backing up. bcp data files do not include any schema or format information, so if a table or view is dropped and you do not have a format file, you may be unable to import the data. The bcp utility has several command line arguments. For more information on the arguments, see SQL Server Books Online documentation. For more information on how to use bcp with views, see Bulk Exporting Data from or Bulk Importing Data to a View.
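
For example, one way to capture a table's layout before backing it up is to let bcp itself generate a format file with the format nul arguments. This is only a sketch; the Customer.fmt output path is an illustration:

bcp AdventureWorksLTAZ2008R2.SalesLT.Customer format nul -c -f C:\Users\user\Documents\Customer.fmt -U username@servername -S tcp:servername.database.windows.net -P password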

Exporting Data out of SQL Azure

Imagine you are a developer or a database administrator. You have a huge set of data in SQL Azure that your boss wants you to back up.

To export data out of your SQL Azure database, you can run the following statement at the Windows command prompt:

bcp AdventureWorksLTAZ2008R2.SalesLT.Customer out C:\Users\user\Documents\GetDataFromSQLAzure.txt -c -U username@servername -S tcp:servername.database.windows.net -P password

This will produce the following output in the command line window:

[Screenshot: bcp export output in the command prompt window]

The following screenshot shows the first 24 rows in the GetDataFromSQLAzure.txt file:

[Screenshot: the first 24 rows of GetDataFromSQLAzure.txt]

Note: The examples provided in this blog post use the AdventureWorksLTAZ2008R2 database. You can download it from SQL Server Database Samples.

Importing Data to SQL Azure

Similarly, you might have existing data in your local database and want to move it to the cloud. You need a very simple way to do this.

To import data into your SQL Azure database, create a data file which has the same schema format as the destination table in your SQL Azure database.

For example, rename the GetDataFromSQLAzure.txt file to MoveDataToSQLAzure.txt, remove all of the existing entries, and add 5 new entries as shown below.

[Screenshot: MoveDataToSQLAzure.txt containing the 5 new entries]

Then, run the following statement at the Windows command prompt:

bcp AdventureWorksLTAZ2008R2.SalesLT.Customer in C:\Users\user\Documents\MoveDataToSQLAzure.txt -c -U username@servername -S tcp:servername.database.windows.net -P password

This will produce the following output in the command line window:

[Screenshot: bcp import output in the command prompt window]
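
Note that with the -c option, bcp assumes a tab character between fields and a newline at the end of each row, so the data file needs to follow that layout. If your file uses different delimiters, you can name them explicitly with the -t (field terminator) and -r (row terminator) arguments; the comma field terminator in this sketch is only an illustration:

bcp AdventureWorksLTAZ2008R2.SalesLT.Customer in C:\Users\user\Documents\MoveDataToSQLAzure.txt -c -t, -U username@servername -S tcp:servername.database.windows.net -P password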

The bcp utility provides several command line arguments. For a list of all the arguments, see SQL Server Books Online documentation.

Here is a brief description of the ones used in these examples, along with a couple of related arguments:

  • database_name.schema.table_name: The database_name specifies the database in which the specified table or view resides. If not specified, this is the default database for the user. table_name is the name of the destination table when importing data into the SQL Azure server (in), and the source table when exporting data from the SQL Azure server (out).
  • in: Copies data from a file into the database table or view.
  • out: Copies data from the database table or view to a file. If you specify an existing file, the file is overwritten.
  • -c: Performs the operation using a character data type.
  • -U: Specifies the login ID used to connect to SQL Azure server. You must append the SQL Azure server name to the login name in the connection string by using the <login>@<servername> notation.
  • -S: Specifies the SQL Azure server to which to connect. The fully qualified name of the SQL Azure server is servername.database.windows.net.
  • -i: Specifies the input file.
  • queryout: Starting with SQL Server 2008 R2, SQL Azure supports the queryout argument as well. When you use the queryout argument, make sure to add the -d database_name argument; otherwise, the bcp utility cannot locate the database to connect to. (See the example after this list.)
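
For illustration, a queryout export against the sample database might look like the following; the query and output file name are just assumptions for this sketch:

bcp "SELECT FirstName, LastName FROM SalesLT.Customer" queryout C:\Users\user\Documents\CustomerNames.txt -c -d AdventureWorksLTAZ2008R2 -U username@servername -S tcp:servername.database.windows.net -P password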

This blog post demonstrated a very simple way to use bcp with SQL Azure. Even if the table in your local database has millions of rows, bcp is still fast and easy to use.
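
If you have several tables to back up, a small .bat file can run bcp once per table. The following is only a sketch; the table list and backup folder are assumptions to adapt to your own database:

@echo off
rem Export each listed SalesLT table to its own character-mode data file
for %%T in (Customer Product SalesOrderHeader) do (
    bcp AdventureWorksLTAZ2008R2.SalesLT.%%T out C:\Backups\%%T.txt -c -U username@servername -S tcp:servername.database.windows.net -P password
)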

Do you have questions, concerns, comments? Post them below and we will try to address them.

  • Using BCP to export moderate amounts of data directly from SQL Azure has been a problem for me. The connection is dropped constantly and I have to keep retrying.

  • To be more specific, when bulk exporting out of azure using BCP, I see a lot of this error:

    SQLState = 08S01, NativeError = 10054

    Error = [Microsoft][SQL Native Client]TCP Provider: An existing connection was forcibly closed by the remote host.

    SQLState = 08S01, NativeError = 10054

    Error = [Microsoft][SQL Native Client]Communication link failure

    It can happen when the connection first opens, and other times it happens after getting about 150,000 out of 170,000 in my largest table.

  • Pete,

    I have been doing a lot of experimenting with BCP options to get the best performance for uploading.  Can you tell me your BCP options/flags?  Have you tried exporting your data ordered, and using the ORDER hint in BCP?  What is your rows/second upload time for tables that work?

    Thanks,

    Wayne

  • Hi Wayne,

    I am just using vanilla bcp options.  I am using the -n flag and everything else is just credentials and server name.  Certainly I am no expert on bcp.

    I just checked back to this thread. I am running a .bat file that exports each table in sequence. Some succeed, others fail partway through, and some just fail to connect. Here are some example successes:

    5872 rows copied.

    Network packet size (bytes): 4096

    Clock Time (ms.) Total     : 4118   Average : (1425.93 rows per sec.)

    450 rows copied.

    Network packet size (bytes): 4096

    Clock Time (ms.) Total     : 109    Average : (4128.44 rows per sec.)

    Here is the failure to connect (amidst other successful exports):

    SQLState = 08S01, NativeError = 10054

    Error = [Microsoft][SQL Native Client]TCP Provider: An existing connection was forcibly closed by the remote host.

    SQLState = 08S01, NativeError = 10054

    Error = [Microsoft][SQL Native Client]Communication link failure

    And here is an export that fails after getting many rows:

    ...

    1000 rows successfully bulk-copied to host-file. Total received: 137000

    SQLState = 08S01, NativeError = 10054

    Error = [Microsoft][SQL Native Client]TCP Provider: An existing connection was forcibly closed by the remote host.

    SQLState = 08S01, NativeError = 10054

    Error = [Microsoft][SQL Native Client]Communication link failure

    SQLState = 08S01, NativeError = 10054

    Error = [Microsoft][SQL Native Client]TCP Provider: An existing connection was forcibly closed by the remote host.

    SQLState = 08S01, NativeError = 10054

    Error = [Microsoft][SQL Native Client]Communication link failure

    137211 rows copied.

    Network packet size (bytes): 4096

    Clock Time (ms.) Total     : 44023  Average : (3116.80 rows per sec.)

  • Probably the issue is that you are trying to upload too much data in a single batch. Try bcp with -b100; this will lower the rows per transaction to 100 instead of the default 1000. A blog article about this is coming up.
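
    For example, reusing the import command from the post (the path and credentials are placeholders):

    bcp AdventureWorksLTAZ2008R2.SalesLT.Customer in C:\Users\user\Documents\MoveDataToSQLAzure.txt -c -b 100 -U username@servername -S tcp:servername.database.windows.net -P password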

  • I tried with a batch size of 100 as you suggested.  I get a connection timeout every time that I try to set the batch size.

    I think the single biggest obstacle to adopting SQL Azure (which I'm trying very hard to do) is our inability to back up our data.  The SSMS export wizard and bcp both are extremely unreliable.  If you guys can provide a clear, tested, reliable way to do this, it will remove the only "dealbreaker."

  • Would this be applicable if applied from SQL Azure to a Cloud drive mounted in blob storage?  I'm imagining there might be a way to backup a SQL Azure DB to a cloud drive.

  • Wayne, I have followed this advice and read all of your follow-on articles.  I still am just pulling my hair out trying to export and import data.  

    Using either SSMS's "Export Data..." wizard or bcp, it is proving very difficult to upload (even with the batch size set to 100). My row sizes are small anyway. One or two tables have 200,000 rows, but they are small in size. I am uploading a total of 11 tables, most of which have fewer than 100 rows. No blobs or binary data. This is just an initialization script for our product to build the base, clean database.

    Is there some other way to get in touch besides these comments??

  • I am trying to copy the data of one table in SQL Server to the same table in SQL Azure, but I am constantly getting the error:

    Could not find number of records copied. File will not be uploaded.

    Can anybody help me out?

  • Couldn't get this to work.  SQL2K8R2, Win2k3.  Have been trying for days to get data uploaded and nothing works, including SSMS, SQLAzureMW, and bcp.exe.  I can connect, just cannot get my rows in.  I finally tweaked some generated SQL to get the tables put together, but still no rows.  275 tables.  Total DB size less than a GB.  DB originally from a SQL2K db.

  • Sean, I feel for you; I know what it is like to work on something for days. Why don't you start a support ticket? They can help. Follow this link: support.microsoft.com/.../default.aspx

  • [7/3/11] I've been banging my head against this same error, "existing connection was forcibly closed".  It always happened at 130000 rows with a batch size of 10000.  I reduced the batch size to 1000, and got the same error, then decreased the batch size to 100 and got a new error:

    Row 131135: The given value of type String from the data source cannot be converted to type int of the specified target column.

    In the source table, a column was nullable, and BCP does not like this.  

    Hope it helps.

  • Hey, I am trying to migrate a database that contains a table with 1.2 million records.

    Exporting the data from the local server to a .dat file worked properly. Later, when importing the data to the SQL Azure database, I get an error:

    SQLState = 08S01, NativeError = 10054

    Error = [Microsoft][SQL Server Native Client 10.0]Communication link failure

    SQLState = 08S01, NativeError = 0

    Error = [Microsoft][SQL Server Native Client 10.0]Communication link failure

    SQLState = 08S01, NativeError = 0

    Error = [Microsoft][SQL Server Native Client 10.0]Communication link failure

    SQLState = 08S01, NativeError = 0

    Error = [Microsoft][SQL Server Native Client 10.0]Communication link failure

    SQLState = 08S01, NativeError = 0

    Error = [Microsoft][SQL Server Native Client 10.0]Communication link failure

    SQLState = 08S01, NativeError = 0

    Error = [Microsoft][SQL Server Native Client 10.0]Communication link failure

    SQLState = 08S01, NativeError = 0

    Error = [Microsoft][SQL Server Native Client 10.0]Communication link failure

    SQLState = 08S01, NativeError = 0

    Error = [Microsoft][SQL Server Native Client 10.0]Communication link failure

    SQLState = 08S01, NativeError = 0

    Error = [Microsoft][SQL Server Native Client 10.0]Communication link failure

    SQLState = 08S01, NativeError = 0

    Error = [Microsoft][SQL Server Native Client 10.0]Communication link failure

    SQLState = 08S01, NativeError = 0

    Error = [Microsoft][SQL Server Native Client 10.0]Communication link failure

    BCP copy in failed

    Can you please help me out? I tried decreasing the batch size too.

  • This comment thread has lost some steam, but I am also having the same issue when uploading 32m rows to Azure.  We get anywhere from 2-9m rows in and the process fails.  This is very frustrating when trying to support an Enterprise level app on SQL Azure.

  • I actually had a similar issue and found that turning up the batch size and packet size helped.  The natural instinct is to turn it down, but switching to settings of -b 10000 -a 16384 (defaults are 1000 rows and 4096 bytes) made the problem go away for me.
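
    For illustration, an import command with those settings might look like this (the table, path, and credentials are placeholders, and -c is assumed for a character-mode data file):

    bcp AdventureWorksLTAZ2008R2.SalesLT.Customer in C:\Users\user\Documents\MoveDataToSQLAzure.txt -c -b 10000 -a 16384 -U username@servername -S tcp:servername.database.windows.net -P password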
