When I first started using the HealthVault SDK, I wrote some code like this, based on what I had seen before:
HealthRecordSearcher searcher = PersonInfo.SelectedRecord.CreateSearcher();
HealthRecordFilter filter = new HealthRecordFilter(Height.TypeId);
searcher.Filters.Add(filter);

HealthRecordItemCollection items = searcher.GetMatchingItems()[0];
So, what's up with indexing into the result from GetMatchingItems()? Why isn't it simpler?
The answer is that multiple filters can be batched up into a single search, so that all of the queries execute in one request. So, if we want to, we can write the following:
searcher.Filters.Add(new HealthRecordFilter(Height.TypeId));
searcher.Filters.Add(new HealthRecordFilter(Weight.TypeId));

ReadOnlyCollection<HealthRecordItemCollection> results = searcher.GetMatchingItems();
HealthRecordItemCollection heightItems = results[0];
HealthRecordItemCollection weightItems = results[1];
Based on a partner question today, I got a bit interested in what the performance advantages were of batching queries up. So, I wrote a short test application that compared fetching 32 single Height values either serially or batched together.
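In sketch form, the comparison looks something like this. The Stopwatch harness and the loop structure are my illustration of the approach, not the exact test code:

```csharp
// Illustrative sketch of the serial-vs-batched timing test.
// Serial: one round trip to the server per Height query.
Stopwatch watch = Stopwatch.StartNew();
for (int i = 0; i < 32; i++)
{
    HealthRecordSearcher serial = PersonInfo.SelectedRecord.CreateSearcher();
    serial.Filters.Add(new HealthRecordFilter(Height.TypeId));
    serial.GetMatchingItems();
}
Console.WriteLine("Serial:  {0} ms", watch.ElapsedMilliseconds);

// Batched: all 32 filters travel in a single round trip.
watch = Stopwatch.StartNew();
HealthRecordSearcher batched = PersonInfo.SelectedRecord.CreateSearcher();
for (int i = 0; i < 32; i++)
{
    batched.Filters.Add(new HealthRecordFilter(Height.TypeId));
}
batched.GetMatchingItems();
Console.WriteLine("Batched: {0} ms", watch.ElapsedMilliseconds);
```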
Here's what I saw:
This is a pretty impressive result - if you need to fetch 4 different items, it's nearly 4 times faster to batch the fetches into one request than to do them independently. Why is the difference so big?
Well, to do a fetch, the following things have to happen:

1. The client builds and serializes the request.
2. The request travels across the network to the HealthVault servers.
3. The servers execute the query.
4. The response travels back across the network to the client.
5. The client parses the response into objects.
When a filter returns small amounts of data, steps 1, 3, and 5 are pretty fast, but steps 2 and 4 involve network latency, which dominates the elapsed time. Batching pays that latency cost once instead of once per query, and we get a nice speedup.
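The arithmetic behind that speedup can be sketched with a simple model. Both numbers here are made-up assumptions for illustration, not measurements from the test:

```csharp
// Back-of-envelope latency model for n queries.
double latencyMs = 100;    // assumed round-trip network latency
double perQueryMs = 10;    // assumed per-query server + parse time
int n = 4;

double serialMs = n * (latencyMs + perQueryMs);  // pay latency n times
double batchedMs = latencyMs + n * perQueryMs;   // pay latency once

Console.WriteLine("Serial:  {0} ms", serialMs);
Console.WriteLine("Batched: {0} ms", batchedMs);
```

With these assumed numbers, serial comes out to 440 ms against 140 ms batched; as the per-query cost grows relative to the latency, the ratio shrinks.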
We would therefore expect that as we fetch more data in each request, batching would be less useful. Here is some data for fetching 16 items:
Which is pretty much what you would expect.