Windows Azure Caching – Performance considerations



Windows Azure Caching can help your application access state, store SQL Azure database data, and serialize/deserialize objects regardless of location. Some of the key benefits of caching are increased application performance, data distribution, and data resiliency. Windows Azure Caching guarantees dedicated memory, is easy to set up and configure from the portal, is automatically distributed, and has a promised uptime and availability (SLA).

When deciding if you should use caching, it's worth considering how your application manages its data. If your application frequently requests data that should be constant across all your application instances (e.g. Windows Azure web/worker roles), then caching is a worthy investment of your time.
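The pattern described above is commonly called cache-aside: check the cache first, and only hit the backing store on a miss. A minimal sketch in Python (the function and names are illustrative assumptions, not the Windows Azure Caching client API, which is .NET-based):

```python
# Cache-aside sketch (illustrative, not the Azure Caching client API):
# check the cache first; on a miss, load from the data store and cache it.

def get_with_cache(key, cache, load_from_store):
    value = cache.get(key)
    if value is None:                 # miss: hit the backing store once
        value = load_from_store(key)
        cache[key] = value            # later reads are served from the cache
    return value

store_reads = []
def load(key):
    store_reads.append(key)           # record each trip to the backing store
    return f"row-for-{key}"

cache = {}
print(get_with_cache("42", cache, load))  # first call reads the store
print(get_with_cache("42", cache, load))  # second call is served from cache
```

With a shared distributed cache, every web/worker role instance that runs this lookup benefits from the first instance's load.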

When considering caching, performance is usually top of mind, and usage will depend on your application's:

  • Number of objects
  • Size of each object
  • Frequency of access of each object
  • Pattern for accessing these objects
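The first two factors translate directly into a memory-footprint estimate. As a rough illustration (the overhead factor and example numbers below are assumptions for the sketch, not Azure-documented values):

```python
# Rough cache-sizing sketch: estimate the memory a workload needs
# from object count and average object size.

def estimated_cache_mb(object_count, avg_object_bytes, overhead_factor=1.3):
    """Estimate cache memory in MB, padded for serialization/index overhead.
    overhead_factor is an assumed fudge factor, not an Azure-documented value."""
    return object_count * avg_object_bytes * overhead_factor / (1024 * 1024)

# e.g. 100,000 session objects of roughly 2 KB each:
print(round(estimated_cache_mb(100_000, 2_048)))  # roughly 254 (MB)
```

An estimate like this helps pick a cache size tier before looking at the pricing calculator.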

Clearly, caching everything would be great, but it involves some cost. If you cache locally, you need the local resources to support your cache size. If you use a distributed cache, you will need to consider network transfer (latency) and bandwidth. With Windows Azure Caching you get a guaranteed cache without the management overhead, but you will need to bear in mind costs (cost of the service and cost of access*). To get a better picture of costs, have a look at the Windows Azure advanced pricing calculator (caching section), noting of course that data transfer out of the data centers (*egress), if required, adds additional costs.

Further, more specific guidance on performance is available here:

Local vs remote?

The Windows Azure Caching service runs on a set of servers that are remote to your application. If your application is within the same data center, you will undoubtedly get better performance. At times it's worth taking this a step further and combining local and remote cached data. Consider data that doesn't change often but is requested frequently by your web application. The quickest way of getting that data (typically a subset of it) is from memory local to the running process.

The Windows Azure Caching client, a proxy that lives within your application, has the ability to cache a subset of the data that resides in the distributed cache servers (the local cache). The client can cache any managed object: no object size limits, no serialization costs for local caching. As soon as you need more memory than your local machine can give you, or you need to externalize the data from your compute tier (either for shared state or for more durability), the minimum price you pay is for the network hop.

So, local caching is great, but now you may have a logistical headache planning how distributed and local caches work together. Windows Azure Caching, whilst close to Windows Server AppFabric Caching, doesn't support all the same features as its server superset. In particular, it's worth noting that there are no notifications for local cache invalidation (DataCacheLocalCacheInvalidationPolicy).
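Without invalidation notifications, local cache entries have to expire on a timer. A minimal sketch of that timeout-based pattern in Python (the class, names, and TTL value are illustrative assumptions, not the Azure Caching API, where the equivalent behaviour is configured on the client):

```python
import time

class LocalCache:
    """Timeout-based local cache: entries expire after ttl_seconds because
    there is no notification-based invalidation from the remote tier."""
    def __init__(self, fetch_remote, ttl_seconds=60):
        self._fetch_remote = fetch_remote   # callable hitting the distributed cache
        self._ttl = ttl_seconds
        self._entries = {}                  # key -> (value, expiry_time)

    def get(self, key):
        value, expiry = self._entries.get(key, (None, 0.0))
        if time.time() < expiry:
            return value                    # fresh local hit: no network hop
        value = self._fetch_remote(key)     # stale or missing: go remote
        self._entries[key] = (value, time.time() + self._ttl)
        return value

# Usage: repeated reads within the TTL stay in-process; stale data is
# possible for up to ttl_seconds, so pick a TTL the application can tolerate.
cache = LocalCache(fetch_remote=lambda k: k.upper(), ttl_seconds=60)
print(cache.get("products"))  # "PRODUCTS"
```

The trade-off is the one the article implies: a longer TTL means fewer network hops but a longer window of potentially stale data.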

The best way of getting to grips with Windows Azure Caching is to try it for yourself. I'd start with the Windows Azure Training Course and the lab example of Caching Data with Windows Azure Caching.
