Why Caching?

As load increases on a web application, we often take a series of steps to improve site performance. Initially, we might put our web application behind a load balancer and add more web instances to better serve the incoming requests. With Windows Azure Web Sites and Cloud Services, adding more instances is as easy as turning a knob to the desired number of instances. The Windows Azure load balancer automatically adjusts to handle the additional instances.

Our next step might be to add more, or perhaps even larger, application servers to our middle / business logic tier. But as traffic continues to increase, the load on our database typically starts to increase as well. Soon the database may become the performance bottleneck, and its responses get slower and slower. As pressure increases, the database may begin to throttle, or connections may time out. The end result for our users: errors and a poor experience.

Different Azure Cache offerings from Microsoft and their supportability

| Offering | Also referred as | Supportability | Migration from Azure Shared Caching or AppFabric Caching |
|---|---|---|---|
| In-Role Cache (http://msdn.microsoft.com/en-us/library/azure/hh914161.aspx) | Dedicated Cache, Co-Located Cache, Role based Cache | | Migration from AppFabric Caching |
| Windows Azure Cache | Azure Shared Cache, Azure AppFabric Cache | Goes out of service on September 1st, 2014 | |
| Azure Cache Service | Azure Managed Cache | Supported (not recommended). Creating a managed cache in the portal is not supported today; you will need to use the Azure PowerShell SDK to create a cache. | Migrate from Shared Caching to Azure Managed Cache Service |
| Azure Redis Cache | | Supported (recommended) | Migrate from Shared Caching to Azure Redis Cache (Preview) |

Why do we still have support for the Azure Managed Cache Service? To support customers who made investments in the preview offering of the Azure Managed Cache Service, who have a dependency on it in their apps, and to give them as much time as they need to move to Azure Redis Cache (Preview).




Usage Guidelines for a Co-located In-Role Cache (ref: link)

 1.    Use diagnostics data to determine the correct percentage of memory to allocate for caching. Include performance counter data on memory and CPU for the virtual machine instances that are running under expected load. The amount of memory available per running role instance is determined by the virtual machine size (VM size) and the memory used by the operating system and other application services running on the role. To understand how to correctly set the Cache Size (%), see Capacity Planning Considerations for In-Role Cache (Azure Cache).

 2.    A co-located topology is not recommended in the following cases:
•	Cache sizes greater than 1.5 GB.
•	Cache clusters with more than 400 caching transactions per second per role instance.
•	Cache clusters with more than 1.2 MB per second of bandwidth used for caching operations per role instance.
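The thresholds above can be turned into a simple diagnostic check against your performance counter data. The sketch below is purely illustrative; the function name and its inputs are assumptions, while the three limits come straight from the guidelines.

```python
# Illustrative check of the co-located guidelines above; the function
# name and its inputs (taken from diagnostics / performance counters)
# are assumptions, the three limits come from the guidelines.
MAX_CACHE_GB = 1.5
MAX_TRANSACTIONS_PER_SEC = 400
MAX_BANDWIDTH_MB_PER_SEC = 1.2

def co_located_ok(cache_size_gb, transactions_per_sec, bandwidth_mb_per_sec):
    """Return True if a co-located topology stays within the guidelines."""
    return (cache_size_gb <= MAX_CACHE_GB
            and transactions_per_sec <= MAX_TRANSACTIONS_PER_SEC
            and bandwidth_mb_per_sec <= MAX_BANDWIDTH_MB_PER_SEC)

print(co_located_ok(1.0, 250, 0.8))  # True: within all three limits
print(co_located_ok(2.0, 250, 0.8))  # False: cache larger than 1.5 GB
```

If any of the three limits is exceeded, the dedicated topology described next is the safer choice.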



Usage Guidelines for a Dedicated Caching Topology (ref: link )

 The following guidelines apply to the dedicated In-Role Cache topology: 

•	In general, a dedicated In-Role Cache role provides the best performance, because it does not share the role's virtual machine with any other application services. It also provides the most flexibility, because you can scale the In-Role Cache role independently. For these reasons, a dedicated topology is the recommended caching architecture. However, there are situations where a co-located topology works well. For more information, see Guidelines for a Co-located Caching Topology.
•   The amount of memory available per running role instance is determined by the virtual machine size (VM size) and the memory used by the operating system and other application services running on the role. To understand how to correctly set the Cache Size (%), see Capacity Planning Considerations for In-Role Cache (Azure Cache). 
•    Do not use a dedicated In-Role Cache role for other code or services.  
•   Only one cache cluster is supported for each cloud service.  
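The Cache Size (%) guidance in both topologies boils down to simple arithmetic: subtract the memory used by the OS and other services from the VM's total memory, and express what remains as a percentage. The sketch below is a rough illustration; the overhead and headroom figures are assumptions, not official numbers, so consult the Capacity Planning article for real sizing.

```python
# Rough sketch of the Cache Size (%) arithmetic: memory available for
# caching is the VM's total memory minus what the OS and other services
# on the role consume. The overhead and headroom figures here are
# illustrative assumptions, not official numbers.
def cache_size_percent(vm_memory_gb, os_and_services_gb, headroom_gb=0.5):
    """Return a Cache Size (%) that leaves room for the OS and headroom."""
    available_gb = vm_memory_gb - os_and_services_gb - headroom_gb
    if available_gb <= 0:
        raise ValueError("VM too small to host a cache")
    return round(100 * available_gb / vm_memory_gb)

# e.g. a 7 GB instance with ~2 GB used by the OS and app services
print(cache_size_percent(7, 2))  # 64
```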



What is Microsoft's recommendation when choosing between the available options? (ref: link)

There are always benefits of one offering over another when selecting a caching method; for example, the Azure Managed Cache Service offers a cache size of up to 150 GB, compared to 26 GB for Azure Redis Cache. That being said, Azure Redis Cache gives customers the ability to use a secure, dedicated Redis cache, managed by Microsoft. With this offer, you get to leverage the rich feature set and ecosystem provided by Redis, and reliable hosting and monitoring from Microsoft.

On May 12, 2014, Microsoft announced the availability of the Azure Redis Cache (Preview). Microsoft recommends that all new developments use Azure Redis Cache. For more information, please refer to this link.

Other features of Azure Redis Cache include:
•	Redis is an advanced key-value store, where keys can contain data structures such as strings, hashes, lists, sets, and sorted sets. Redis supports a set of atomic operations on these data types.
•	Support for transactions, Pub/Sub, Lua scripting, keys with a limited time-to-live, and configuration settings that make Redis behave like a cache.
•	Azure Redis Cache (Preview) leverages Redis authentication and also supports SSL connections to Redis.
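As a rough illustration of the time-to-live behaviour mentioned above, here is a minimal in-process sketch in Python: each key carries an expiry timestamp, and a read past it behaves as a miss. A real application would use the Redis EXPIRE/SETEX commands through a client library rather than this toy class.

```python
import time

# Toy in-process illustration of time-to-live semantics: each key
# carries an expiry timestamp, and a read past it behaves as a miss.
class TtlCache:
    def __init__(self):
        self._store = {}  # key -> (value, expires_at)

    def set(self, key, value, ttl_seconds):
        self._store[key] = (value, time.monotonic() + ttl_seconds)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if time.monotonic() >= expires_at:  # expired: evict, report a miss
            del self._store[key]
            return None
        return value

cache = TtlCache()
cache.set("session:42", "alice", ttl_seconds=0.05)
print(cache.get("session:42"))  # alice
time.sleep(0.1)
print(cache.get("session:42"))  # None
```

Expiring keys automatically is what lets a key-value store "behave like a cache": stale entries disappear without any explicit eviction code on the caller's side.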



Tutorial and Guide for Cache: http://azure.microsoft.com/en-us/documentation/services/cache/ 

Sample Demos

•	Azure Redis Cache
•	Azure Managed Cache
Those are some facts and basic information one must know while working with Azure Caching. This post brings together information from different MSDN articles in one place.

For those who have already started using Azure Cache, I'm sure you will have seen (I have come across this pretty often) that the DataCacheFactory sometimes gets into a bad state where many requests fail even though the server is fine. My next post will have details on handling DataCacheFactory by clearing the DataCacheFactory connection pool.

I hope this will help developers who are starting their Azure Cache journey.


Ganesh Shankaran

Software Development Engineer