Why do you write JavaScript? It's probably for one of two major reasons – Responsiveness or Richness. Richness we know about – it's the flying, sliding, flashing, and sometimes annoying UI features we see JavaScript deliver all over the web.

But Responsiveness is much less tangible – it's about how an application feels to use. Does it feel like it's doing something? Do you know when you've clicked a button? How long do you have to wait for your results? In other words, it's about the client side perceived performance of a site.

An approach that the patterns & practices guys talked about whilst putting together the Web Application Guidance Reference Implementation was Predictive Fetch.

The idea is that perhaps you know 80% or more of users hit "Next" on your Search page, for example – or perform any other kind of navigation or data refresh that occurs on a web page. Predictive Fetch states that we can perform that fetch on behalf of the user, in the background (asynchronously), before they ask us to.

Extending the Progressive Enhancement Sample

For this post I've extended my previous Progressive Enhancement sample, which allows us to page through sets of results, and is enhanced to use jQuery partial rendering. Download the attached sample and have a look. The additional code falls into three steps:

1. Fetch: When the document is ready, we immediately initiate an asynchronous fetch of the search results for the Next and Previous page links.

2. Cache: When each of these fetches completes, we cache the HTML in an array.

3. Render: When the user clicks one of the Next or Previous links, we first check the array for the HTML – and use it if we find it. If we don't, we just let the link do its usual work.
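Stripped of jQuery and the DOM, those three steps can be sketched in plain JavaScript like this (cache, prefetch, fetchPage, navigate and renderRegion are hypothetical names standing in for the sample's real code):

```javascript
// Minimal sketch of the Fetch / Cache / Render steps, framework-free.
// fetchPage and renderRegion are hypothetical stand-ins for the real
// AJAX call and DOM update in the sample.
var cache = {};                       // href -> HTML string

function prefetch(href, fetchPage) {
  if (!cache[href]) {                 // 1. Fetch (only once per href)
    fetchPage(href, function (html) {
      cache[href] = html;             // 2. Cache the returned HTML
    });
  }
}

function navigate(href, renderRegion) {
  var html = cache[href];
  if (html) {                         // 3. Render from cache if present...
    renderRegion(html);
    return false;                     // ...and cancel the normal navigation
  }
  return true;                        // cache miss: let the link work as usual
}
```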


When the page loads and the document has been processed, we execute a bit of jQuery:

$('#partialregion a')
    .live('click', linkClick)
    .preload();


This does almost exactly the same as the Progressive Enhancement sample (except I use ".live" as per Jaime's comment!), in that we add event handlers to the links. But then we also call "preload", a jQuery extension I've written that looks like this:

jQuery.fn.preload = function() {
    return $(this).each(function() {
        var target = this.href;
        if (target) {
            var alreadyCached = pageData[target];
            if (!alreadyCached)
                loadPage(target);
        }
    });
};



All it does is call "loadPage" for each link in the jQuery wrapped set that hasn't already been preloaded, passing in the target href for each. loadPage then uses a bit of jQuery to make a GET request to the server:

function loadPage(target) {
    $.ajax({
        url: target,
        success: function(data) {
            pageData[target] = data;
        },
        method: 'get'
    });
}
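One subtlety of this fetch-then-cache flow: the cache entry only appears once the GET completes, so calling preload twice for the same href before the first response arrives would fire a duplicate request. A framework-free sketch of guarding against that with an in-flight flag (loadPageOnce and get are hypothetical names, not from the sample):

```javascript
// Guard against duplicate requests for a href that is already in flight.
// "get" is a hypothetical stand-in for the real AJAX call.
var pageData = {};   // href -> HTML, as in the post
var pending  = {};   // href -> true while a request is in flight

function loadPageOnce(target, get) {
  if (pageData[target] || pending[target]) return;  // cached or in flight
  pending[target] = true;
  get(target, function (data) {
    pageData[target] = data;      // cache the result...
    delete pending[target];       // ...and clear the in-flight flag
  });
}
```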



The "success" function defined by the jQuery in loadPage simply adds the returned HTML to a JavaScript array named pageData, for use later. It is indexed on the href of the target page – so that might be http://localhost/page/1 or http://localhost/page/2. At the moment this sample never clears the array down – so no matter how many pages the user flicks back and forth through, I store all of them. This might be questionable on a site with lots of pages!
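If unbounded growth is a concern, one option is to cap the cache and evict the least recently used page. A minimal sketch in plain JavaScript (the cap of 10 is an arbitrary illustration, not something the sample does):

```javascript
// A tiny LRU-style cache: at most MAX_PAGES entries, evicting the page
// that was used least recently. Hypothetical sketch, not from the sample.
var MAX_PAGES = 10;
var lruCache = {
  data: {},
  order: [],                                      // hrefs, oldest first
  get: function (href) {
    if (!(href in this.data)) return undefined;
    // move href to the most-recently-used end
    this.order.splice(this.order.indexOf(href), 1);
    this.order.push(href);
    return this.data[href];
  },
  set: function (href, html) {
    if (!(href in this.data)) {
      if (this.order.length >= MAX_PAGES) {
        delete this.data[this.order.shift()];     // evict the oldest entry
      }
      this.order.push(href);
    }
    this.data[href] = html;
  }
};
```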


Finally, when a user clicks on one of the Next or Previous links to display the next page, we execute the linkClick function:

function linkClick(source) {
    var target = source.target.toString();
    var data = pageData[target];
    if (data) {
        $('#partialregion').html(data);
        $('#partialregion a').preload();
        return false;
    }
}

In stages, this code:

1. Determines the destination href for the link that was clicked.

2. Checks the pageData cache array to see if we've already fetched that HTML.

3. If we have, replaces the partial rendering region's content with the cached HTML.

4. Starts Predictive Fetch on any new links in the rendered partial content.

5. Prevents the link's default action (i.e. navigation) from firing.

The Impact on your Application

The net result of this change for the user is near-instantaneous page transitions when they hit Next or Previous... making your application feel hugely responsive. Don't underestimate what a difference this makes!

The impact on the architecture is one of my favourite things – if you're already doing partial rendering or some kind of Progressively Enhanced, script-based user interface, you probably need zero extra server code... this sample has exactly the same server code as my Progressive Enhancement post. All that is needed is some well-written jQuery.

That's my kind of enhancement – focused, lightweight, simple, yet powerful and high impact.

However, Predictive Fetch will almost certainly increase the hits on your server, some of which are potentially not needed (in other words, if the user never visits Page 2 you've retrieved it for no reason). Obviously this needs consideration.
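One way to limit those wasted hits is to prefetch only the links a user is most likely to click – the "Next" link in the 80% example above – rather than every link on the page. A hypothetical sketch (selectPrefetchTargets is my own name, and the likelihood figures would come from your own analytics):

```javascript
// Rank candidate links by how likely the user is to click them, and only
// prefetch the top few. Hypothetical sketch, not from the sample.
function selectPrefetchTargets(links, maxRequests) {
  return links
    .slice()                                                  // don't mutate input
    .sort(function (a, b) { return b.likelihood - a.likelihood; })
    .slice(0, maxRequests)                                    // cap the request count
    .map(function (l) { return l.href; });
}
```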

patterns & practices' Superior Example!

Now that you've seen my simple example and understood it, check out the patterns & practices Reference Implementation. Have a look at the Predictive Fetch explanation in the Responsiveness Patterns topic, which highlights the Search functionality's predictive nature.

What is really interesting about their example is that it doesn't use as simple a case as mine – they rely on a POST rather than a GET, fetch JSON data and use client-side templates to display it, and have many pages of results so don't preload them all; they employ some logic to maximise the benefit whilst avoiding being over-eager in pre-fetching and caching.

To dive into the code check out the ResultsController class' _prefetch method in ~/Content/Scripts/Search/ResultsController.debug.js.

Originally posted by Simon Ince on 22nd April 2010 here: http://blogs.msdn.com/simonince/archive/2010/04/22/client-script-patterns-predictive-fetch.aspx