Using the WebClient.UploadFile method for posting large files will eventually leave you stranded with OutOfMemoryExceptions.


WebClient.UploadFile reads the entire file into memory by default.


Build your own uploader.


One of my customers was using WebClient.UploadFile in a WinForms application to transfer files to a web server. The idea in itself was fine, but when they transferred a couple of large files they'd get OutOfMemoryExceptions. When uploading a 500 MB file the application would need approximately 520 MB of memory, and if you uploaded a few large files one after the other you quickly hit the roof. Running GC.Collect(); after each transfer didn't help. Judging from the number of hits on the internet for this scenario, they weren't the only ones with this problem.

This is the code they were using:

WebClient oWeb = new WebClient();
oWeb.UploadFile("http://localhost/test.aspx", "c:\\");

Okay, so why was this happening?
Well, first of all I wouldn't recommend running GC.Collect(); in any application. A lot has been written on this already, but if you're interested in why, I suggest you look at Rico Mariani's post on the subject. Anyway, for testing purposes we ran the following instead:

GC.Collect();
GC.WaitForPendingFinalizers();
GC.Collect();

And this cleared the memory. So why isn't this a valid solution? Well, like I said, I wouldn't recommend using GC.Collect(); in any application, and why read the entire file into memory when you can stream it? I looked up the UploadFile method, and it seems like it does read the entire file into a byte array before posting. This is fine for smaller files, but in this particular scenario it wasn't too good. So what I did was to write my own uploader:

public static string MyUploader(string strFileToUpload, string strUrl)
{
    string strFileFormName = "file";
    Uri oUri = new Uri(strUrl);
    string strBoundary = "----------" + DateTime.Now.Ticks.ToString("x");

    // The trailing boundary string (note the closing "--" that multipart requires)
    byte[] boundaryBytes = Encoding.ASCII.GetBytes("\r\n--" + strBoundary + "--\r\n");

    // The post message header
    StringBuilder sb = new StringBuilder();
    sb.Append("--");
    sb.Append(strBoundary);
    sb.Append("\r\n");
    sb.Append("Content-Disposition: form-data; name=\"");
    sb.Append(strFileFormName);
    sb.Append("\"; filename=\"");
    sb.Append(Path.GetFileName(strFileToUpload));
    sb.Append("\"");
    sb.Append("\r\n");
    sb.Append("Content-Type: ");
    sb.Append("application/octet-stream");
    sb.Append("\r\n");
    sb.Append("\r\n");
    string strPostHeader = sb.ToString();
    byte[] postHeaderBytes = Encoding.UTF8.GetBytes(strPostHeader);

    // The WebRequest
    HttpWebRequest oWebrequest = (HttpWebRequest)WebRequest.Create(oUri);
    oWebrequest.ContentType = "multipart/form-data; boundary=" + strBoundary;
    oWebrequest.Method = "POST";

    // This is important, otherwise the whole file will be read to memory anyway...
    oWebrequest.AllowWriteStreamBuffering = false;

    // Get a FileStream and set the final properties of the WebRequest
    FileStream oFileStream = new FileStream(strFileToUpload, FileMode.Open, FileAccess.Read);
    long length = postHeaderBytes.Length + oFileStream.Length + boundaryBytes.Length;
    oWebrequest.ContentLength = length;
    Stream oRequestStream = oWebrequest.GetRequestStream();

    // Write the post header
    oRequestStream.Write(postHeaderBytes, 0, postHeaderBytes.Length);

    // Stream the file contents in small pieces (4096 bytes, max)
    byte[] buffer = new Byte[checked((uint)Math.Min(4096, (int)oFileStream.Length))];
    int bytesRead = 0;
    while ((bytesRead = oFileStream.Read(buffer, 0, buffer.Length)) != 0)
        oRequestStream.Write(buffer, 0, bytesRead);

    // Add the trailing boundary
    oRequestStream.Write(boundaryBytes, 0, boundaryBytes.Length);
    WebResponse oWResponse = oWebrequest.GetResponse();
    Stream s = oWResponse.GetResponseStream();
    StreamReader sr = new StreamReader(s);
    String sReturnString = sr.ReadToEnd();

    // Clean up
    oFileStream.Close();
    oRequestStream.Close();
    sr.Close();
    s.Close();

    return sReturnString;
}
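To make the Content-Length arithmetic concrete, here is the shape of the multipart body the uploader assembles: post header, raw file bytes, trailing boundary, and nothing else. A minimal Java sketch (the boundary value, the "test.cab" filename, and the 500-byte payload are illustrative, not from the original post):

```java
import java.nio.charset.StandardCharsets;

public class MultipartLength {
    // Build the same pieces the C# uploader builds and add up their lengths,
    // exactly as the "long length = ..." line does.
    static long contentLength() {
        String boundary = "----------abc123";   // illustrative; the real code derives it from a timestamp
        byte[] fileBytes = new byte[500];       // stands in for the file contents

        // The post message header, mirroring the StringBuilder section
        String header = "--" + boundary + "\r\n"
                + "Content-Disposition: form-data; name=\"file\"; filename=\"test.cab\"\r\n"
                + "Content-Type: application/octet-stream\r\n"
                + "\r\n";
        byte[] headerBytes = header.getBytes(StandardCharsets.UTF_8);

        // The trailing (closing) boundary
        byte[] trailerBytes = ("\r\n--" + boundary + "--\r\n").getBytes(StandardCharsets.US_ASCII);

        // Content-Length must cover header + file + trailer, nothing more
        return headerBytes.length + fileBytes.length + trailerBytes.length;
    }

    public static void main(String[] args) {
        System.out.println(contentLength());
    }
}
```

Because every component's byte length is known up front, the request can declare an exact Content-Length and stream the body instead of buffering it.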



One of the things worth noting is that you need to set oWebrequest.AllowWriteStreamBuffering = false; otherwise the entire file will be read into memory anyway. This is because the default behavior of WebRequest is to buffer the entire request in case it needs to be re-sent due to authentication, connectivity problems, etc. Normally that default is a performance boost, but in this case it's a performance killer.
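The same buffer-by-default behavior exists outside .NET, by the way. Java's HttpURLConnection, for example, also buffers the whole request body for possible retries unless you opt into streaming mode; a minimal sketch of the equivalent opt-out (the URL and body length here are placeholders):

```java
import java.net.HttpURLConnection;
import java.net.URL;

public class NoBuffering {
    // Prepare a POST whose body will be streamed, not buffered.
    // Nothing is sent until an output stream is opened and written to.
    static HttpURLConnection prepare(String url, long bodyLength) throws Exception {
        HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        // Java's equivalent of AllowWriteStreamBuffering = false:
        // with a known Content-Length, stream the body directly to the socket
        // instead of holding it in memory for a potential re-send.
        conn.setFixedLengthStreamingMode(bodyLength);
        return conn;
    }
}
```

The trade-off is the same as in .NET: a streamed request cannot be automatically re-sent after an authentication challenge or a dropped connection.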


So what was the end result?

During my first test runs the application needed as much memory as the file I was trying to upload, and then some: in order to upload a 500 MB .cab file the application needed at least 520 MB. The application using the custom uploader never went above 23 MB.
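That flat memory profile comes straight from the fixed-size buffer: only one 4 KB chunk of the file is in memory at any moment, no matter how big the file is. A minimal Java sketch of the same read/write loop (the one-million-byte array stands in for a large file):

```java
import java.io.*;

public class ChunkedCopy {
    // Copy an input stream to an output stream through a fixed 4096-byte
    // buffer -- the same loop the uploader uses. Memory use stays constant
    // regardless of how much data passes through. Returns bytes copied.
    static long pump(InputStream in, OutputStream out) throws IOException {
        byte[] buffer = new byte[4096];
        long total = 0;
        int bytesRead;
        while ((bytesRead = in.read(buffer)) != -1) {
            out.write(buffer, 0, bytesRead);
            total += bytesRead;
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        byte[] data = new byte[1_000_000]; // stands in for a large file
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        long copied = pump(new ByteArrayInputStream(data), sink);
        System.out.println(copied); // prints 1000000
    }
}
```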

End of transmission

/ Johan