BCP Utility Upload Errors in SQL Azure - SQL Azure Team Blog - Site Home - MSDN Blogs

BCP Utility Upload Errors in SQL Azure



One cause of bcp utility upload errors in SQL Azure is trying to upload too much data in a single batch. Each batch of rows from your table is committed as a single transaction, and SQL Azure places constraints on transactions that can cause the upload to fail when a batch violates them. This article explains how to avoid violating those constraints.

The bcp utility is a command line utility that ships with Microsoft SQL Server. It bulk copies data between SQL Azure (or SQL Server) and a data file in a user-specified format. The bcp utility that ships with SQL Server 2008 R2 is fully supported by SQL Azure. You can find out more about using BCP with SQL Azure in this blog post.

One type of error you can encounter when uploading is:

SQLState = 08S01, NativeError = 10054

Error = [Microsoft][SQL Native Client]TCP Provider: An existing connection was forcibly closed by the remote host.

SQLState = 08S01, NativeError = 10054

Error = [Microsoft][SQL Native Client]Communication link failure

This is an example of SQL Azure closing the connection because a transaction constraint was violated, including the rule that a single transaction must not exceed 1 gigabyte of data.
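As a rough illustration (not from the article, and using made-up numbers), you can estimate a safe batch size by dividing the 1 gigabyte transaction limit by your average row size, leaving some headroom:

```shell
# Sketch with hypothetical numbers: choose a -b value that keeps each
# transaction comfortably under SQL Azure's 1 GB limit.
AVG_ROW_BYTES=524288                      # hypothetical: ~512 KB per row
MAX_TXN_BYTES=$((1024 * 1024 * 1024))     # 1 GB transaction limit
SAFE_BATCH=$((MAX_TXN_BYTES / AVG_ROW_BYTES / 2))  # halve for headroom
echo "suggested -b value: $SAFE_BATCH"
```

With these example numbers the suggestion works out to 1024 rows per batch; plug in your own measured row size.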

The default batch size is 1000 for BCP, which means that 1000 rows are uploaded per transaction. If those rows exceed the transaction constraints, you can get the error above. To reduce the number of rows in a batch, run the bcp utility with the -b flag and specify the number of rows per batch.

bcp AdventureWorksLTAZ2008R2.SalesLT.Customer in C:\Users\user\Documents\MoveDataToSQLAzure.txt -b100 -c -U username@servername -S tcp:servername.database.windows.net -P password

Reducing the row count makes the transfer chattier and slows your rows-per-second transfer rate. However, it might be the only way to get tables with large row sizes into SQL Azure using the bcp utility.
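If you are not sure how small the batch needs to be, one approach (a sketch, not from the article; the database, server, and file names are the placeholders from the example command above) is to walk down a halving schedule of batch sizes and try each until the upload succeeds. The loop below only prints the commands; remove the leading echo to actually run each attempt:

```shell
# Sketch: print bcp commands with a halving batch-size schedule.
# Database, server, and credential values are placeholders, not real endpoints.
BATCH=1000
while [ "$BATCH" -ge 100 ]; do
    echo bcp AdventureWorksLTAZ2008R2.SalesLT.Customer in MoveDataToSQLAzure.txt \
        -b"$BATCH" -c -U username@servername \
        -S tcp:servername.database.windows.net -P password
    BATCH=$((BATCH / 2))    # halve on each attempt: 1000, 500, 250, 125
done
```

In a real script you would run each command and break out of the loop on the first success instead of printing them all.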

For Better Performance

You can increase your performance by increasing the batch size, so that more rows are inserted per transaction. This works only if the data length of each row is small, the opposite of the rows giving you the error above. I have been uploading batches of 10,000 rows with good results, especially for many-to-many tables that have only four columns.
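For example, a larger batch for a narrow junction table might look like the command below (the table, file, and server names are hypothetical; the echo only prints the command rather than running it):

```shell
# Hypothetical example: a 10,000-row batch for a narrow four-column table.
# Table, file, server, and credential values are placeholders.
CMD='bcp MyDb.dbo.ProductCategoryMap in map.txt -b10000 -c -U username@servername -S tcp:servername.database.windows.net -P password'
echo "$CMD"    # prints the command; run it directly once the names are real
```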

Do you have questions, concerns, comments? Post them below and we will try to address them.

  • Hi Wayne,

Thanks for sharing this. I am getting the same error while loading data-cleansing repository data from a flat file with approximately 100K records.

The default option -b1000 gives the error right away. Changing it to -b100 loads the first 200 rows and then fails. Changing it to -b10 loads a total of 180 rows and fails.

    Is there any other way that you can think of?

    Thanks in advance! Appreciate your help.

  • Nachiket: I can guess what the issue is; however, the best idea is to start a support ticket so they can help you. Follow this link: support.microsoft.com/.../default.aspx. They have great tools to help you.

    So here is my question: are all or some of the columns in the table varbinary(max) or varchar(max), and are they holding a large amount of data?
