(Updated: added more details about the tests)

Goal

This post compares WCF and WebAPI from a performance perspective.

Service description

I implemented almost the same service in two different ways. Both are accessible via HTTP/REST, and both skip serialization in order to isolate any logic that could skew the performance results.

Both services implement GET and POST requests:

  1. a GET request that returns a string with the current time
  2. a POST request that accepts data and returns the same data back as a stream


I self-hosted both services and ran them in the full Azure emulator.

My own test agent

The first approach was stress testing via my own implementation, very similar to the approach described here: http://www.ducons.com/blog/tests-and-thoughts-on-asynchronous-io-vs-multithreading. I was able to get roughly 4000 requests/sec, but there was a serious drawback: the test agent used more CPU than the server, so the results were not realistic; the whole system ran at 100% CPU. I think it was mainly due to the overhead of TPL and HttpClient.
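To give an idea of the shape of that agent, here is a minimal sketch; the URL, worker count and per-worker request count are placeholders and it is not the exact code I used. Parallel workers loop over HttpClient GETs via TPL and the throughput is computed at the end.

// Minimal load-agent sketch (assumed shape, not the exact code I used).
using System;
using System.Diagnostics;
using System.Net;
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;

class LoadAgent
{
    static void Main()
    {
        RunAsync().Wait();
    }

    static async Task RunAsync()
    {
        const string url = "http://localhost:8082/api/test";  // endpoint under test
        const int workers = 100;
        const int requestsPerWorker = 600;

        // Raise the outgoing connection limit, otherwise the client caps at two connections per host.
        ServicePointManager.DefaultConnectionLimit = workers;

        var client = new HttpClient();
        int completed = 0;
        var stopwatch = Stopwatch.StartNew();

        var tasks = new Task[workers];
        for (int w = 0; w < workers; w++)
        {
            tasks[w] = Task.Run(async () =>
            {
                for (int i = 0; i < requestsPerWorker; i++)
                {
                    // Issue the request and drain the response body.
                    using (var response = await client.GetAsync(url))
                    {
                        await response.Content.ReadAsStringAsync();
                    }
                    Interlocked.Increment(ref completed);
                }
            });
        }

        await Task.WhenAll(tasks);
        stopwatch.Stop();

        Console.WriteLine("{0} requests in {1:N1}s = {2:N0} req/s",
            completed, stopwatch.Elapsed.TotalSeconds,
            completed / stopwatch.Elapsed.TotalSeconds);
    }
}

Even a simple agent like this burns a surprising amount of CPU inside the HttpClient/TPL machinery, which is exactly what skewed my numbers.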

Azure Load testing

This approach wasn't successful either. I created and set up an Azure load test (maximum number of users without any delays), but I could only reach a little over 2000 requests/second.

Apache Benchmark

I knew this tool from the past, and it was worth using again! It is built directly on sockets, which keeps the overhead minimal (compared to HttpClient). There is also a similar benchmark done by Rick Strahl, but I measured slightly different numbers.
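For reference, the ab switches used in the tests below: -n is the total number of requests, -c the number of concurrent clients, -k enables HTTP keep-alive, and -p / -u send the contents of the given file as a POST or PUT body respectively.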



WCF service details

// Implementation of the service; the REST endpoints themselves are declared
// on the ITestService contract (sketched below).
class TestService : ITestService
{
    // GET: returns a plain-text greeting with the current time.
    public Stream TestGet()
    {
        return new MemoryStream(Encoding.UTF8.GetBytes("Hello World. Time is: " + DateTime.Now));
    }

    // POST: reads the request body and echoes it back as a stream.
    public async Task<Stream> TestStream(Stream requestStream)
    {
        using (var reader = new StreamReader(requestStream))
        {
            var body = await reader.ReadToEndAsync();

            return new MemoryStream(Encoding.UTF8.GetBytes(body));
        }
    }
}
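The contract and hosting code are not shown above. Here is a sketch of how ITestService could be exposed over webHttpBinding; the UriTemplates match the URLs used in the tests below, the rest is my assumption and may differ from the real project.

// Assumed shape of the contract and self-hosting code.
[ServiceContract]
public interface ITestService
{
    [OperationContract]
    [WebGet(UriTemplate = "testget")]
    Stream TestGet();

    [OperationContract]
    [WebInvoke(Method = "POST", UriTemplate = "teststream")]
    Task<Stream> TestStream(Stream requestStream);
}

// WebServiceHost adds a webHttpBinding endpoint at the base address,
// so the operations end up at /wcf/test/testget and /wcf/test/teststream.
var host = new WebServiceHost(typeof(TestService), new Uri("http://localhost:8080/wcf/test"));
host.Open();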


WebAPI service details

public class TestController : ApiController
{
    // GET: returns a plain-text greeting with the current time.
    public HttpResponseMessage Get()
    {
        return new HttpResponseMessage() { Content = new StringContent("Hello World. Time is: " + DateTime.Now, Encoding.UTF8, "text/plain") };
    }

    // POST: buffers the request body into a byte array and echoes it back.
    // ByteArrayContent has a known length, so the response is NOT chunked.
    public async Task<HttpResponseMessage> Post(HttpRequestMessage inputMessage)
    {
        var content = await inputMessage.Content.ReadAsByteArrayAsync();
        var response = new HttpResponseMessage { Content = new ByteArrayContent(content) };
        return response;
    }

    // PUT: echoes the request stream back directly. Wrapping it in
    // StreamContent turns the response into CHUNKED transfer mode (see the note below).
    public async Task<HttpResponseMessage> Put(HttpRequestMessage inputMessage)
    {
        var content = await inputMessage.Content.ReadAsStreamAsync();
        var response = new HttpResponseMessage { Content = new StreamContent(content) };
        return response;
    }
}
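The hosting side is not shown either. Here is a minimal self-host sketch using HttpSelfHostServer with the default route matching the /api/test URLs below; the port and route are assumptions based on the test commands.

// Assumed self-hosting setup; details may differ in the real project.
var config = new HttpSelfHostConfiguration("http://localhost:8082");
config.Routes.MapHttpRoute(
    name: "DefaultApi",
    routeTemplate: "api/{controller}/{id}",
    defaults: new { id = RouteParameter.Optional });

using (var server = new HttpSelfHostServer(config))
{
    server.OpenAsync().Wait();
    Console.WriteLine("WebAPI self-host listening on http://localhost:8082, press Enter to stop.");
    Console.ReadLine();
}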

Thanks to feedback from Joakim:

As you can see, the WebAPI service supports both POST and PUT requests. The reason is that StreamContent switches the response to chunked transfer mode (its length is not known up front, so no Content-Length header can be sent), and I wanted to compare the chunked and non-chunked modes.



Performance tests

I ran both services locally on my machine (Dell XPS12, i7, SSD). In each test I issued 60000 requests with the concurrency level set to 100. POST requests sent and received 500 bytes over the network. Here are the commands and results:



GET request to WebAPI

ab -n 60000 -c 100 -k http://localhost:8082/api/test

Server Software:        Microsoft-HTTPAPI/2.0
Server Hostname:        localhost
Server Port:            8082

Document Path:          /api/test
Document Length:        41 bytes

Concurrency Level:      100
Time taken for tests:   8.042 seconds
Complete requests:      60000
Failed requests:        0
Write errors:           0
Keep-Alive requests:    60000
Total transferred:      13860000 bytes
HTML transferred:       2460000 bytes
Requests per second:    7460.40 [#/sec] (mean)
Time per request:       13.404 [ms] (mean)
Time per request:       0.134 [ms] (mean, across all concurrent requests)
Transfer rate:          1682.96 [Kbytes/sec] received



GET request to WCF

ab -n 60000 -c 100 -k http://localhost:8080/wcf/test/testget

Server Software:        Microsoft-HTTPAPI/2.0
Server Hostname:        localhost
Server Port:            8080

Document Path:          /wcf/test/testget
Document Length:        41 bytes

Concurrency Level:      100
Time taken for tests:   7.333 seconds
Complete requests:      60000
Failed requests:        0
Write errors:           0
Keep-Alive requests:    60000
Total transferred:      13800000 bytes
HTML transferred:       2460000 bytes
Requests per second:    8181.73 [#/sec] (mean)
Time per request:       12.222 [ms] (mean)
Time per request:       0.122 [ms] (mean, across all concurrent requests)
Transfer rate:          1837.69 [Kbytes/sec] received




POST request to WebAPI

The POST request responds in NON-CHUNKED transfer mode.

ab -n 60000 -c 100 -p c:\temp\data500.txt -k http://localhost:8082/api/test


Server Software:        Microsoft-HTTPAPI/2.0
Server Hostname:        localhost
Server Port:            8082

Document Path:          /api/test
Document Length:        508 bytes

Concurrency Level:      100
Time taken for tests:   8.729 seconds
Complete requests:      60000
Failed requests:        0
Write errors:           0
Keep-Alive requests:    60000
Total transferred:      39480000 bytes
Total POSTed:           40267000
HTML transferred:       30480000 bytes
Requests per second:    6873.25 [#/sec] (mean)
Time per request:       14.549 [ms] (mean)
Time per request:       0.145 [ms] (mean, across all concurrent requests)
Transfer rate:          4416.60 [Kbytes/sec] received
                        4504.64 kb/s sent
                        8921.24 kb/s total



PUT request to WebAPI

The PUT request responds in CHUNKED transfer mode.

ab -n 60000 -c 100 -u c:\temp\data500.txt -k http://localhost:8082/api/test


Server Software:        Microsoft-HTTPAPI/2.0
Server Hostname:        localhost
Server Port:            8082

Document Path:          /api/test
Document Length:        0 bytes

Concurrency Level:      100
Time taken for tests:   10.079 seconds
Complete requests:      60000
Failed requests:        0
Write errors:           0
Non-2xx responses:      60001
Keep-Alive requests:    60000
Total transferred:      10020167 bytes
Total PUT:              40207569
HTML transferred:       0 bytes
Requests per second:    5953.22 [#/sec] (mean)
Time per request:       16.798 [ms] (mean)
Time per request:       0.168 [ms] (mean, across all concurrent requests)
Transfer rate:          970.90 [Kbytes/sec] received
                        3895.90 kb/s sent
                        4866.81 kb/s total



POST request to WCF

ab -n 60000 -c 100 -p c:\temp\data500.txt -k http://localhost:8080/wcf/test/teststream

Server Software:        Microsoft-HTTPAPI/2.0
Server Hostname:        localhost
Server Port:            8080

Document Path:          /wcf/test/teststream
Document Length:        508 bytes

Concurrency Level:      100
Time taken for tests:   9.365 seconds
Complete requests:      60000
Failed requests:        0
Write errors:           0
Keep-Alive requests:    60000
Total transferred:      41880000 bytes
Total POSTed:           40928100
HTML transferred:       30480000 bytes
Requests per second:    6407.15 [#/sec] (mean)
Time per request:       15.608 [ms] (mean)
Time per request:       0.156 [ms] (mean, across all concurrent requests)
Transfer rate:          4367.37 [Kbytes/sec] received
                        4268.11 kb/s sent
                        8635.48 kb/s total



Summary

The measured results show that WCF is faster than WebAPI for GET requests, but slower for POST requests in non-chunked mode. My results differ a little bit from Rick's: WCF proved to be faster in some scenarios, and I think it all depends on the data payload. My takeaways:

  1. I have a baseline for further service development
  2. both platforms are “fast enough” for me, and I’m going to stay with WebAPI as it looks more appropriate for building a real RESTful service, but I wrote about that in my previous article.