Recently, I’ve been experimenting more with ADO.NET Data Services as a simple way of enabling REST-based access to data that I’m interested in sharing, or using across different projects. In this post, I’m not going to get into setting up an ADO.NET Data Service, as there are plenty of blog posts, docs, and videos to get you started on this cool technology with an awkward name.

Rather, I want to talk about some of the cool things you can do with a data service once you have it set up, and some gotchas I’ve run into in my coding against them.

Getting JSON Data

An ADO.NET Data Service will, by default, return content in the AtomPub format. This makes the data pretty easy to work with in a variety of ways, since Atom is just XML, and it’s a fairly well-understood format.

But many applications today, particularly those that do a lot of work client-side, tend towards the JSON (JavaScript Object Notation) format, since it’s pretty lightweight, and well-supported in most client-side script frameworks like jQuery or ASP.NET AJAX. What I didn’t realize was how super-easy it is to get JSON-formatted data out of an ADO.NET Data Service.

In fact, all it takes to get JSON instead of AtomPub is the addition of one simple request header. You simply add an “Accept” header with a value of “application/json”. That’s it.
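On the wire, there’s nothing exotic about the request. Assuming a hypothetical EventsService.svc endpoint with an ApprovedEvents entity set (the names here are placeholders, not from a real service), the raw HTTP request would look something like this:

GET /EventsService.svc/ApprovedEvents HTTP/1.1
Host: localhost
Accept: application/json

The same request without the Accept header (or with the browser’s default Accept values) gets you the AtomPub feed instead.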

PowerShell as a Test Client

Now, by default, you’re not going to be able to see JSON returned directly in the browser without an add-in or a tool like Fiddler, since browsers send an Accept header that prefers HTML and XML. Since I didn’t want to have to mess with all that, I followed the suggestion in this MSDN Forums thread and used Windows PowerShell (I have V2) for testing my service, in particular for checking the JSON output.

Here’s an example of a script that can test a data service. To run it, save it as a .ps1 file on your machine and execute it from within PowerShell. (Note that you may need to change the execution policy to RemoteSigned to run an unsigned script; be sure you understand the implications of doing so before making the change. You can view a help topic on the subject by executing get-help about_signing at the PowerShell prompt.)

# Return the results from the ApprovedEvents service
# (the base address below is a placeholder -- substitute your own service URI)
$url = "http://localhost/EventsService.svc/ApprovedEvents?`$filter=state%20eq%20'VA'"
$url
$client = new-object net.webclient
# $client.Headers.Add("accept", "application/json")
$s = $client.DownloadString($url)
$s

In the above script, we set $url to the URI of the data service, including a filter that specifies that we only want events from the state of Virginia. We then echo it out (for debugging; more on that in a moment), create an instance of the .NET WebClient class, and call DownloadString on it, passing in the $url variable. Finally, we write the result out to the PowerShell window.

Again, by default, the result will be in AtomPub format. To return JSON instead, simply uncomment (remove the #) the line which adds the “accept” header. Then, the data service will return the results as JSON instead. Simple.
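For reference, the data service wraps its JSON results in a “d” object. With the hypothetical ApprovedEvents set used above (entity and field names here are illustrative, not from a real service), the response looks roughly like this:

{
  "d": [
    {
      "__metadata": {
        "uri": "http://localhost/EventsService.svc/ApprovedEvents(1)",
        "type": "EventsModel.Event"
      },
      "name": "Richmond Code Camp",
      "state": "VA"
    }
  ]
}

Keep that “d” wrapper in mind when you consume the results in client script; your actual array of entities is one level down.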

A Gotcha with PowerShell and ADO.NET Data Services

The gotcha that I encountered with this relates to how you filter results in an ADO.NET Data Service. Note in the URI in the script above that we append a ? to add a querystring, which uses a $filter parameter with a value of state%20eq%20'VA'.

That works fine and dandy in the browser, but in PowerShell, $ is a reserved character: inside a double-quoted string, PowerShell treats $filter as a variable reference, and since no variable by that name exists, it expands to an empty string. PowerShell didn’t give me any errors in the original query; it just appeared to be ignoring my filter parameter, instead always returning all rows. The reason for this is that PowerShell was helpfully removing the “$filter” from my query, which I discovered when I echoed the URI out to the console window.

The solution was to use the backtick (`) escape character, as shown in the script above, to escape the $ character. Once I added that, the filter parameters worked fine, and the data service returned only the records that fit the filter.
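You can see the expansion behavior for yourself at the PowerShell prompt. With no $filter variable defined, the two quoting styles produce different strings:

"$filter=state%20eq%20'VA'"    # $filter expands to nothing: =state%20eq%20'VA'
"`$filter=state%20eq%20'VA'"   # backtick escapes the $: $filter=state%20eq%20'VA'

(Single-quoted strings would also work, since PowerShell doesn’t expand variables inside them, but the backtick lets you keep using double quotes.)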

A Note on Filters in ADO.NET Data Services

One more point to note relating to filters in ADO.NET Data Services (and this applies to Entity Framework and LINQ to SQL as well) is that when you add a filter to your data service URL, that filter is applied as part of the query that is executed on your back-end data store, not after the fact in memory. I wanted to highlight this, as I was initially concerned that I might be dragging down all of the records associated with a given query and filtering them locally. Thankfully, EF guru Julie Lerman was kind enough to set the record straight on the matter and help me understand what was going on under the covers.
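In other words, a filtered request like the one in this post ends up as a server-side query. Conceptually (the exact SQL depends on your provider and schema; the names below are illustrative):

/EventsService.svc/ApprovedEvents?$filter=state eq 'VA'

    -- is translated into something like --

SELECT ... FROM Events WHERE state = 'VA'

Only the matching rows ever leave the database, which matters a great deal once your tables grow beyond a handful of records.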