Disclaimer: Any time I post information about what I’m working on, it represents my current thinking, which is subject to change as I discover more information about the platform. I’d think this would be obvious in a blog, but I just want to remind the reader to always refer to the public documentation for the “official” word.
As I’ve been working on my time sensor sample, I’ve had to put some thought into how to manage data report intervals. In the Sensor Platform in Windows 7, applications that consume the sensor’s data by subscribing to events can specify the frequency of the report events by setting the SENSOR_PROPERTY_CURRENT_REPORT_INTERVAL property. Actually, I use the term “setting” loosely—it’s really more of a request. You see, events for sensors are broadcast to all subscribed clients, so there’s really only one report interval for the sensor at any given time.
This raises some interesting issues.
The sensor class extension provides notifications when client apps connect, disconnect, subscribe to events, and unsubscribe from events. The lifetime of a connected app might look like this:
1. The app connects.
2. The app requests a report interval.
3. The app subscribes to events.
4. The app unsubscribes from events.
5. The app disconnects.
Steps 4 and 5 will always happen in that order, even if the app doesn’t explicitly unsubscribe from events; the platform will ensure that your driver gets the unsubscribe call before the disconnect call.
Since any number of client apps could be connected to the sensor at any given time, and only one report interval exists, how do we determine which report interval to use? Obviously, we could just set a fixed interval and leave it at that, ignoring all the requests. But we have the ability to make the clients happier than this, not to mention the fact that throttling the interval may afford us some power management advantages on the device. We could also simply let the most recently requested interval rule, but this strategy presents its own problems. For example, suppose one app needs data every second, but the most recent interval request was for five-minute intervals? Seems like the first app would experience some major performance issues.
A better strategy would be to keep track of the requested report intervals and choose an interval based on a rule. Typically, it would be best to always choose the shortest interval that is greater than or equal to the minimum interval supported by the device. Doing so would avoid starving applications that require frequent data updates, while other applications could simply choose to ignore some notifications.
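As a sketch of that selection rule, here's a small helper that clamps the shortest requested interval up to the device's minimum. The function name, the device-minimum parameter, and the default fallback are my illustration only; the sample driver itself doesn't include this clamp.

```cpp
#include <cassert>

typedef unsigned long ULONG;

// Choose the effective report interval: the shortest requested interval,
// raised to the shortest interval the hardware actually supports.
// A value of 0 for ulShortestRequested means no client requested an interval.
ULONG ClampToDeviceMinimum(ULONG ulShortestRequested,
                           ULONG ulDeviceMinimum,
                           ULONG ulDefault)
{
    if (ulShortestRequested == 0)
        return ulDefault;                 // nobody asked; use the default

    return (ulShortestRequested < ulDeviceMinimum) ? ulDeviceMinimum
                                                   : ulShortestRequested;
}
```

Apps that requested a longer interval than the one chosen can simply ignore the extra notifications, as noted above.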
Ok, so we have a strategy for choosing the interval. Now we need to understand how to manage the states and transitions. A given app could be in any of the following states:
1. Connected, not subscribed, no report interval requested.
2. Connected, not subscribed, report interval requested.
3. Connected, subscribed, no report interval requested.
4. Connected, subscribed, report interval requested.
To complicate matters a bit, the report interval the app requests could simply be the default interval, which the app would request by specifying zero (0) for the interval value. We’ll uncomplicate this somewhat by agreeing that when the default interval is requested, that request will be treated just like any other interval request. We’ll simply plug in the default interval in place of zero.
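That substitution amounts to a one-line normalization. This helper function is hypothetical (it isn't in the sample), but it captures the convention:

```cpp
#include <cassert>

typedef unsigned long ULONG;

// Hypothetical helper: map a requested interval of zero (meaning
// "use the default") to the driver's actual default, so every stored
// request is a concrete interval value.
ULONG NormalizeRequestedInterval(ULONG ulRequested, ULONG ulDefault)
{
    return (ulRequested == 0) ? ulDefault : ulRequested;
}
```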
Clearly, cases one and two aren’t very interesting and have no impact on the interval we choose. However, for case number two, we’ll at least need to keep track of the connected app, because the app could subscribe to events at any time.
To track the connected clients in the time sensor sample, I created a struct like the following one. Note that there’s no need for a “connected” field because the very fact that I’ve created an instance of the structure implies a connected client.
struct ClientData
{
    BOOL bListening;   // TRUE when client is listening to events.
    ULONG ulInterval;  // Interval requested by client.
};
Each time a client connects, I create an instance of this struct and add it to the map, using the IWDFFile pointer that ISensorDriver::OnClientConnect provides as the key. If the client never requested a report interval, the ulInterval member stays set to zero. If the client requests an interval of zero, I set the ulInterval member to the driver’s default interval, which is 1000 in this case. Otherwise, I set this member to the value that the client requested. When the client disconnects, I use the key to find the struct pointer and free the memory.
I simply set the bListening value based on calls to ISensorDriver::OnClientSubscribeToEvents and ISensorDriver::OnClientUnsubscribeFromEvents—TRUE for the first call, FALSE for the latter. Clients are not subscribed, by default.
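A minimal, self-contained model of this bookkeeping looks like the following. I've substituted a std::map keyed on an opaque pointer for the driver's map keyed on the IWDFFile pointer, and plain typedefs for the Windows types, so this compiles outside a driver; the handler names only mimic the ISensorDriver callbacks described above.

```cpp
#include <cassert>
#include <map>

typedef bool BOOL;            // stand-in for the Windows BOOL
typedef unsigned long ULONG;  // stand-in for the Windows ULONG

struct ClientData
{
    BOOL  bListening;   // true while the client is subscribed to events
    ULONG ulInterval;   // requested interval; 0 means "never requested"
};

static const ULONG g_ulDefaultInterval = 1000;

// Keyed by an opaque per-client handle (the driver uses the IWDFFile pointer).
typedef std::map<const void*, ClientData> ClientMap;

void OnConnect(ClientMap& clients, const void* key)
{
    ClientData data = { false, 0 };  // not subscribed, no interval requested
    clients[key] = data;
}

void OnDisconnect(ClientMap& clients, const void* key)
{
    clients.erase(key);              // frees the per-client record
}

void OnSetInterval(ClientMap& clients, const void* key, ULONG ulRequested)
{
    // A request of zero means "use the default"; store the default instead.
    clients[key].ulInterval = (ulRequested == 0) ? g_ulDefaultInterval
                                                 : ulRequested;
}

void OnSubscribe(ClientMap& clients, const void* key)
{
    clients[key].bListening = true;
}

void OnUnsubscribe(ClientMap& clients, const void* key)
{
    clients[key].bListening = false;
}
```

The disconnect handler simply erases the entry, which mirrors freeing the struct in the driver: a client that isn't in the map is, by definition, not connected.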
Each time a client subscribes to or unsubscribes from events, I look up the client in the map by using the IWDFFile pointer. The code chooses a report interval based on the following algorithm:
1. Walk through the collection of connected clients.
2. If the client is subscribed to events, test whether the client requested a report interval.
3. If the client requested a report interval, test whether it’s shorter than the shortest one found.
4. If no clients requested a report interval, return the default.
5. Otherwise, return the shortest requested interval.
Here’s the code for that function:
ULONG GetNewReportInterval()
{
    int iSize = Clients.GetSize();
    ULONG temp = 0;

    // Find the shortest interval stored in the collection.
    for(int i = 0; i < iSize; i++)
    {
        ClientData* pCurrent = Clients.GetValueAt(i);

        // Choose the shortest interval
        // from the clients that are listening to events
        // and have explicitly requested an interval.
        if(pCurrent->bListening == TRUE &&             // listening to events
           pCurrent->ulInterval != 0 &&                // client app set an interval at some time in the past
           (0 == temp || pCurrent->ulInterval < temp)) // shortest valid interval
        {
            temp = pCurrent->ulInterval;
        }
    }

    return 0 == temp ? g_dwDefaultInterval : temp;
}
The “0 == temp” test in the conditional handles the case where the first valid interval hasn’t been found yet.
You might consider an optimization. When the client app subscribes to events, it’s tempting to simply compare the current report interval to the one the client requested and choose the shorter one. That would avoid having to call GetNewReportInterval and walk the collection of clients, except when a client unsubscribes. The problem with that idea is the case where the current interval equals the default interval.
For example, in the time sensor, what happens when the current interval is 1000 and the code needs to make a comparison? Was the current interval set to 1000 explicitly or by default? Unless we track that information, we don’t know, so the comparison could result in the code choosing the default interval, when the correct choice would be a longer, explicitly requested interval.
Really, several tests would be required here:
- Check for zero-report-interval values. Zero indicates that the subscribing app never requested a value.
- Check whether the current interval equals the default.
- Check why the default interval was set—truly by default or explicitly.
- Choose either the shorter interval or the non-default interval, whichever is appropriate based on the tests.
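To make that bookkeeping concrete, here is a sketch of the state the incremental comparison would need to carry. The bCurrentIsExplicit flag and the function name are hypothetical additions, not part of the sample; they exist precisely to answer the "truly by default or explicitly?" question above.

```cpp
#include <cassert>

typedef bool BOOL;
typedef unsigned long ULONG;

// Extra state the incremental approach would have to track.
struct IntervalState
{
    ULONG ulCurrent;          // interval currently in effect
    BOOL  bCurrentIsExplicit; // true if ulCurrent was explicitly requested,
                              // rather than set by default
};

// Update the state when a client subscribes; ulRequested == 0 means the
// client never requested an interval.
void OnIncrementalSubscribe(IntervalState& state, ULONG ulRequested)
{
    if (ulRequested == 0)
        return;  // client accepts whatever interval is in effect

    if (!state.bCurrentIsExplicit || ulRequested < state.ulCurrent)
    {
        // Either the current interval is only the default, or the new
        // request is shorter; either way the explicit request wins.
        state.ulCurrent = ulRequested;
        state.bCurrentIsExplicit = true;
    }
}
```

Even with this flag, an unsubscribe still forces a full walk of the clients, which is part of why I didn't pursue the optimization.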
For the sample, I decided that the GetNewReportInterval function is the way to go. I like centralizing this decision process and I’m not concerned about optimizing performance in this instance. I just don’t think such a driver will ever have more than a few connected client apps, so finding the correct interval should be a snap. You’ll need to decide the right approach for your sensor driver.