Engineering Windows 7

Welcome to our blog dedicated to the engineering of Microsoft Windows 7

Continuing our discussion on performance

We've talked some about performance in this blog, and recently many folks have been blogging and writing about the topic as well. We thought it would be a good time to offer some more behind-the-scenes views on how we have been working on and thinking about performance, because it is such an interesting topic for the folks reading this blog. Of course I've been using some pretty low-powered machines lately, so performance is top of mind for me as well. But for fun I am writing this on my early holiday present--my new home machine is a 64-bit all-in-one desktop with a quad-core CPU, discrete graphics, 8GB of memory, and hardware RAID, all running a pretty new build of Windows 7 upgraded as soon as I finished the out-of-box experience. Michael Fortin and I authored this post. --Steven

Our beta isn’t even out the door yet and many are already dusting off their benchmarks and giving them a whirl. As a reminder, we are encouraging folks to hold off on benchmarking our pre-production builds. Yet we’ve come to expect it will happen, and we realize it will lead many to conclude one thing or another, and at the same time we appreciate those of you who take the time to remind folks of the pre-ship status of the code. Nevertheless we’re happy that many are seeing good results thus far. We're not yet as happy as we believe we will be when we finish the product, as we continue to work on all the fundamental capabilities of Windows 7 as well as all the new features folks are excited about.

Writing about performance in this blog is nearly as tricky as measuring it. As we've seen, directional statements are taken further than we might intend, and at the same time there are seemingly infinite ways to measure performance and just as many ways to perceive the same data. Ultimately, performance is something each individual feels is right--whether that means adequate or stellar might vary scenario to scenario, individual to individual. Some of the mail we've received has been clear about performance:

  • Boot-very very fast in all applications ( open-load applications) especially so many simultaneously!!!!! Hence, massive multicore ( quad-octa core cpu) , gpgpu for all!!!!!!!!!!!!
  • This is right time to do this properly, the users want speed, we'll give them speed.
  • i want to be able to run windows 7 extremely fast and still look good graphically on a asus aspire one netbook with these specs-1.5 ghz intel atom processor (single core) 1gb of ram
  • I hope that in addition to improvements in the gui and heart (I hope massive multicore + 64-bit + Directx 11 ..extreme performance, etc) for windows 7, modified the feature Flip 3d In Windows 7!!!!! Try to make a Flip 3D feature, really efficient and sensible in windows 7.
  • With regard to the performance thing, could you look at ways to reduce the penalty of having a lot of fonts installed.
  • From performance, boot up, explorer speed and UI experience , I hope the next version of windows delivers something new and innovating. I was playing with the new UI on the HP TouchPC and I have to say they did a great 1.0 job on the touch interface controls.
  • I do keep my fingers crossed for Windows 7 to be dramatically better in its performance than Windows Vista.
  • The biggest feature I see a lot of people wanting is performance.

You can also see through some of these quotes that performance means something different to different people. As user-interface folks know, perceived performance and actual performance can often be different things. I [Steven] remember when I was writing a portion of the Windows UI for Visual C++: when I benchmarked against Borland C++ at the time, we were definitely faster (measured by seconds). However, the reviews consistently mentioned Borland as being faster and providing feedback in the form of counts of lines compiled flying by. So I coded up a line-count display that flashed a lot of numbers at you while compiling (literally flashy, so it looked like it couldn't keep up). In clock time it actually consumed a non-zero amount of time, so we got "slower", but the reviewers then started giving us credit for being faster. So in this case slower actually got faster.

There's another story from the past that is the flip side of this, which is the scrolling speed in Microsoft Word for DOS (and also Excel for Windows--same dynamic). BillG always pushed hard on visible performance in the "early" days, and scrolling speed was one of those things that never seemed to be fast enough. Well, clever folks worked hard on the problem and subsequently made scrolling too fast--literally to the point that we had to slow it down so you didn't always end up going from page 1 to the end of the document just because you held down the Page Down key. It is great to be fast, but sometimes there is "too much speed".

We have seen the feedback about what to turn off or adjust for better performance. In many ways what we're seeing are folks hoping to find the things that cause the performance to be less than they would like. I had an email conversation with someone recently trying to pinpoint the performance issues on a new laptop. Just by talking it through, it became clear the laptop was pretty "clean" (~40 processes, half the 1GB of RAM free, <5% CPU at idle, etc.), and after a few back-and-forths it became clear it was the internet connection (dial-up) that was actually the biggest bottleneck in the system. Many encourage us to turn off animations, graphics, or even color, as there is a belief that these can be the root of performance problems. We've talked about the registry, disk space utilization, and even color depth as topics where folks see these as potential performance issues.

It is important to consider that performance is inherently a time/space tradeoff (computer science sense, not science fiction sense), and on laptops there is the added dimension of power consumption (or CPU utilization). Given infinite memory, of course many algorithms would be very different than the ones we use. In finite memory, performance is impacted greatly by the overall working set of a scenario. So in many cases when we talk about performance we are just as much talking about reducing the amount of memory consumed as we are talking about the clock time. Some parts of the OS are much more tunable in terms of the memory they use, which then improves the overall performance of the system (because there is less paging). Other parts of the system are much more about the number of instructions executed (because perhaps every operation goes through that code path). We work a great deal on both!
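The time/space tradeoff above can be illustrated with a small, hypothetical sketch (plain Python, nothing Windows-specific): memoizing a function spends memory on cached results to cut clock time, which is the same bargain at toy scale.

```python
from functools import lru_cache
import time

def fib_slow(n):
    # No cache: minimal memory, exponential time.
    return n if n < 2 else fib_slow(n - 1) + fib_slow(n - 2)

@lru_cache(maxsize=None)
def fib_fast(n):
    # Cached: each result is stored once, so time becomes linear,
    # but the working set grows with n.
    return n if n < 2 else fib_fast(n - 1) + fib_fast(n - 2)

start = time.perf_counter()
fib_slow(28)
slow = time.perf_counter() - start

start = time.perf_counter()
fib_fast(28)
fast = time.perf_counter() - start

print(f"no cache: {slow:.4f}s, cached: {fast:.6f}s, "
      f"cache entries: {fib_fast.cache_info().currsize}")
```

With infinite memory you would always cache; in finite memory, the 29 cached entries here stand in for a working set that has to be paid for, which is why OS code is tuned for memory as much as for instruction counts.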

The reality of measuring and improving performance is one where we are focused at several "levels" in Windows 7: micro-benchmarks, specific scenarios, system tuning. Each of these plays a critical role in how we are engineering Windows 7 and while any single one can be measured it is not the case that one can easily conclude the performance of the system from a measurement.

Micro-benchmarks. Micro-benchmarks are the sort of tests that stress a specific subsystem at extreme levels. Often these are areas of the code that are hard to see the performance of during normal usage, as they go by very fast or account for a small percentage of time during overall execution. So tests are designed to stress part of the system. Many parts of the system are subjected to micro-benchmarking, such as the file system, networking, memory management, 2D and 3D graphics, etc. A good example here is the work we do to enable fast file copying. There is a lot of low-level code that accounts for a (very significant) number of conditions when copying files around, and that code is most directly executed through XCOPY in a command window (or an API). Of course the majority of copy operations take place through the Explorer, and along with that comes a progress indicator, a cancellable operation, counting up bytes to copy, etc. All of those have some cost along with the benefit. The goal of micro-benchmarks is to enable us to understand the best possible case and then compare it to the most usable case. Advanced folks always have access to the command line for more power, control, and flexibility. It is tempting to measure the performance of the system by looking at improvements in micro-benchmarks, but time and time again this proves to be inadequate, as routine usage covers a much broader code path and time is spent in many places. For Internet Explorer 8 we did a blog post on performance that went into this type of issue relative to script performance. At the other end of the spectrum, we definitely understand that the performance of micro-benchmarks on some subsystems will be, and should be, carefully measured--the performance of DirectX graphics is an area that gamers rely on, for example. It is worth noting that many micro-benchmarks also depend heavily on a combination of the Windows OS, hardware, and specific drivers.
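As an illustration of the "best possible case vs. most usable case" comparison, here is a minimal, hypothetical micro-benchmark sketch (plain Python, not Windows code): the same copy loop timed bare, and then with the per-chunk progress accounting a UI would need.

```python
import io
import time

def copy_stream(src, dst, chunk=64 * 1024, progress=None):
    """Copy src to dst in chunks, optionally reporting bytes copied."""
    copied = 0
    while True:
        buf = src.read(chunk)
        if not buf:
            return copied
        dst.write(buf)
        copied += len(buf)
        if progress:
            progress(copied)  # UI-style accounting adds per-chunk cost

data = b"x" * (16 * 1024 * 1024)

# "Best possible case": no progress reporting.
src, dst = io.BytesIO(data), io.BytesIO()
t0 = time.perf_counter()
copy_stream(src, dst)
bare = time.perf_counter() - t0

# "Most usable case": count up bytes for a progress indicator.
seen = []
src, dst = io.BytesIO(data), io.BytesIO()
t0 = time.perf_counter()
copy_stream(src, dst, progress=seen.append)
with_ui = time.perf_counter() - t0

print(f"bare: {bare:.4f}s, with progress: {with_ui:.4f}s")
```

The in-memory streams keep the sketch self-contained; a real file-copy micro-benchmark would also have to control for disk caches, drivers, and hardware, as the post notes.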

Specific scenarios. Most people experience the performance of a PC through high-level actions such as booting, standby/resume, and launching common applications. These are topics we have covered in previous posts to some degree. In engineering Windows 7, each team has focused on a set of specific scenarios that are ones we wanted to make better. This type of work should be demonstrable without any elaborate setup or additional tools. This work often involves tuning the code path for the number of instructions executed, looking at the data allocated for the common case, or understanding all the OS APIs called (for example, registry lookups). One example that comes to mind is the work that we have going on to reduce the time to reinsert a USB device. This is particularly noticeable for UFDs (USB flash drives) or memory cards. Windows of course allows the whole subsystem to be plumbed by unique drivers for a specific card reader or UFD; even though most of the time they are the same, we still have to account for the variety in the ecosystem. At the start of the project we looked at a full profile of the code executed when inserting a UFD and worked this scenario end-to-end. Then we systematically worked through each of the "hot spots". Another example along these lines was playback of DVD movies, which involves not only the storage subsystem but the graphics subsystem as well. The neat thing about this scenario is that you also want to optimize for the CPU utilization (which you might not even notice while playing back the movie) as that dictates the power consumption.
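The end-to-end "hot spot" approach can be sketched as follows (a hypothetical illustration; the phase names and sleeps are stand-ins, not the actual Windows code paths):

```python
import time
from contextlib import contextmanager

timings = {}

@contextmanager
def phase(name):
    # Record wall-clock time for one phase of a scenario.
    t0 = time.perf_counter()
    try:
        yield
    finally:
        timings[name] = time.perf_counter() - t0

# Hypothetical "device insert" scenario, phases stubbed with sleeps.
with phase("enumerate"):
    time.sleep(0.01)
with phase("load_driver"):
    time.sleep(0.05)   # the "hot spot" in this toy run
with phase("mount"):
    time.sleep(0.02)

# Sort phases by cost to find where to work first.
for name, secs in sorted(timings.items(), key=lambda kv: -kv[1]):
    print(f"{name:12s} {secs * 1000:6.1f} ms")
```

Profiling the full scenario first, then attacking the most expensive phase, mirrors the workflow described for the UFD-insert work.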

System tuning. A significant amount of performance work falls under the umbrella of system tuning. To ascertain what work we do in this area, we routinely look at the overall performance of the system relative to the same tests on previous builds and previous releases of Windows. We're looking for things that we can do to remove operations that take a lot of time/space/power, or things that have "grown" in one of those dimensions. We have build-to-build testing we do to make sure we do not regress, and of course every developer is responsible for making sure their area improves as well. We left no stone unturned in terms of investigating opportunities to improve. One of the areas many will notice immediately when looking at the pre-beta or beta of Windows 7 is the memory usage (as measured by Task Manager, itself a measurement that can be misunderstood) of the desktop window manager. For Windows 7, a substantial amount of architectural work went into reducing the amount of memory consumed by that subsystem. We did this work while also maintaining compatibility with Windows Vista drivers. We did similar work on the desktop search engine, where we reduced not just the memory footprint, but the I/O footprint as well. One of the most complex areas to work on was the improvements in the taskbar and Start menu. These improvements involved substantial work on critical sections ("blocking" areas of the code), registry I/O, as well as overall code paths. The goal of this work is to make sure these UI elements are always available and feel snappy.
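As a toy illustration of tuning for memory footprint, the following sketch (standard-library Python, with purely hypothetical data) compares the peak allocation of two representations of the same records:

```python
import tracemalloc

def build(compact):
    # Same data, two representations: dicts carry per-row key
    # overhead; tuples share their layout across rows.
    if compact:
        return [(i, i * 2) for i in range(50_000)]
    return [{"id": i, "value": i * 2} for i in range(50_000)]

peaks = {}
for compact in (False, True):
    tracemalloc.start()
    rows = build(compact)
    _, peaks["tuples" if compact else "dicts"] = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    del rows

for label, peak in peaks.items():
    print(f"{label:7s} peak ~ {peak / 1024:7.0f} KiB")
```

The same data-structure discipline, applied across a subsystem like the desktop window manager, is what turns into less paging and better overall performance.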

It is worth noting that there are broad based measures of performance as well that drive the user interface of a selection of applications. These too have their place--they are best used to compare different underlying hardware or drivers with the same version of Windows. The reason for this is that automation itself is often version dependent and because automation happens in a less than natural manner, there can be a tendency to measure these variances rather than any actual perceptible performance changes. The classic example is the code path for drawing a menu drop down--adding some instructions that might make the menu more accessible or more appealing would be impossible to perceive by a human, but an automated system that drives the menu at super human speed would see a change in "performance". In this type of situation the effect of a micro-benchmark is magnified in a manner inconsistent with actual usage patterns. This is just a word of caution on how to consider such measurements.

Given this focus across different types of measurement it is important to understand that the overall goal we have for Windows 7 is for you to experience a system that is as good as you expect it to be. The perception of performance is just as important as specific benchmarks and so we have to look to a broad set of tools as above to make sure we are operating with a complete picture of performance.

In addition to these broad strategies there are some specific tools we've put in place. One of these tools, PerfTrack, takes our use of performance data to the next level and so will play a significant role in the beta. In addition, it is worth reminding folks about the broad set of efforts that go into engineering for performance:

  • We’ve been building out and maintaining a series of runs that measure thousands of little and big things. We’ve been running these before developer check-ins, maintaining performance and responsiveness at a level that those who self-host our builds will find acceptable. These gates have kept the performance and responsiveness of our daily builds at a high enough level that thousands have found it possible to run their main systems on Windows 7 for extended periods of time, doing their normal daily work.
  • We’ve been driving down footprint, reducing our service costs, improving the efficiency of key code paths, refactoring locks to improve scalability, reducing hangs, improving our I/O efficiency and much more. These are scenario driven based on real world execution paths we know from our telemetry to be common.
  • We’ve been partnering closely with the top OEMs, ISVs and IHVs. Our tools have been made public, we’ve held numerous training sessions, and we’ve been focusing heavily on shipping systems in an effort to ensure customers get great performing systems out of the box, with great battery life too.
  • Within the Windows dev team, we’ve placed a simple trace capturing tool on everyone’s desktop. This desktop tool allows each person to run 24x7 with performance tracing enabled. If anything seems slow or sluggish, they can immediately save the last minute-or-so of activity and send it for automated analysis. Additionally, a team of people visually inspect the traces for new issues or issues not yet decipherable by our automation. The traces are incredibly rich and allow us to get to the root of top issues most of the time.
  • For all Pre-Beta, Beta and RTM users, we’ve developed a new form of instrumentation and have used it to instrument over 500 locations in the operating system and inbox applications. This new instrumentation is simple in concept, but revolutionary in result. The tool is called PerfTrack, and it has helped confirm our belief that the client benchmarks aren’t too informative about real user responsiveness issues.
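The always-on trace tool described in the list above keeps only recent activity, which can be sketched as a simple time-windowed ring buffer (a hypothetical illustration, not the actual Windows tracing tool):

```python
import collections
import time

class RingTrace:
    """Keep only recent events so 24x7 tracing stays cheap."""

    def __init__(self, window_seconds=60.0, maxlen=100_000):
        self.window = window_seconds
        self.events = collections.deque(maxlen=maxlen)  # oldest evicted first

    def log(self, event, ts=None):
        self.events.append((time.monotonic() if ts is None else ts, event))

    def snapshot(self, now=None):
        # "Save the last minute-or-so": drop anything older than the window.
        now = time.monotonic() if now is None else now
        return [e for t, e in self.events if now - t <= self.window]

trace = RingTrace(window_seconds=60.0)
trace.log("boot", ts=0.0)
trace.log("app_launch", ts=30.0)
trace.log("slow_menu", ts=70.0)
print(trace.snapshot(now=75.0))   # only events from the last 60 seconds
```

A bounded buffer like this is what makes it practical to run with tracing enabled all day and only "save" when something feels sluggish.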

PerfTrack is a very flexible, low-overhead, dynamically configurable telemetry system. For key scenarios throughout Windows 7, there exist “Start” and “Stop” events that bracket the scenario. Scenarios can be pretty much anything, including common things like opening a file, browsing to a web page, opening the control panel, searching for a document, or booting the computer. Again, there are over 500 instrumented scenarios in Windows 7 for Beta.

Obviously, the time between the Start and Stop events is meant to represent the responsiveness of the scenario, and clearly we’re using our telemetry infrastructure to send these metrics back to us for analysis. PerfTrack’s uniqueness comes not just from what it measures but from the ability to go beyond just observing the occurrence of problematic response times. PerfTrack allows us to “dial up” requests for more information, in the form of traces.
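A minimal sketch of this kind of Start/Stop bracketing might look like the following (a hypothetical API for illustration, not the real PerfTrack):

```python
import time

class ScenarioTracker:
    """Bracket named scenarios with start/stop events and keep samples."""

    def __init__(self):
        self.open = {}       # scenario -> start timestamp
        self.samples = {}    # scenario -> list of elapsed times (seconds)

    def start(self, scenario):
        self.open[scenario] = time.perf_counter()

    def stop(self, scenario):
        elapsed = time.perf_counter() - self.open.pop(scenario)
        self.samples.setdefault(scenario, []).append(elapsed)
        return elapsed

pt = ScenarioTracker()
pt.start("open_control_panel")
time.sleep(0.02)            # stand-in for the real work being measured
elapsed = pt.stop("open_control_panel")
print(f"open_control_panel took {elapsed * 1000:.1f} ms")
```

Collecting elapsed times per named scenario is what lets the distributions described below be built from telemetry.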

Let’s consider the distribution below and, for fun, let's pretend the scenario is opening XYZ. For this scenario, the feature team chose to set some goals for responsiveness. With their chosen goals, green depicts times they considered acceptable, yellow represents times they deemed marginal, and red denotes the poor times. The times are in milliseconds and shown along the X axis. The Hit Count is shown on the Y axis.

[Graph: distribution of response times for the scenario, showing hit counts against the team's green/yellow/red responsiveness goals.]

As can be seen, there are many instances where this scenario took more than 5 seconds to complete. With this kind of a distribution, the performance team would recommend that we “dial up” a request for 100+ traces from systems that have experienced a lengthy open in the past. In our “dialed up” request, we would set a “threshold” time that we thought was interesting. Additionally, we may opt to filter on machines with a certain amount of RAM, a certain class of processor, the presence of a specific driver, or any number of other things. Clients meeting the criteria would then, upon hitting the “Start” event, quickly configure and enable tracing, and potentially send the trace back to us if the “Stop” event occurred after our specified “threshold” of time.
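The goal bands and the "dial up" filter described here can be sketched as follows (the thresholds, machine attributes, and driver name are all made up for illustration):

```python
def bucket(times_ms, green_max=1000, yellow_max=3000):
    # Classify response times against the feature team's chosen goals.
    counts = {"green": 0, "yellow": 0, "red": 0}
    for t in times_ms:
        if t <= green_max:
            counts["green"] += 1
        elif t <= yellow_max:
            counts["yellow"] += 1
        else:
            counts["red"] += 1
    return counts

def should_trace(elapsed_ms, machine, threshold_ms=5000,
                 min_ram_gb=0, required_driver=None):
    # "Dial up" a trace only when the Stop event comes after the
    # threshold AND the machine matches the requested filter.
    if elapsed_ms < threshold_ms:
        return False
    if machine.get("ram_gb", 0) < min_ram_gb:
        return False
    if required_driver and required_driver not in machine.get("drivers", []):
        return False
    return True

print(bucket([200, 900, 1500, 2500, 6000, 8000]))
print(should_trace(6200, {"ram_gb": 2, "drivers": ["fooStor.sys"]},
                   threshold_ms=5000, min_ram_gb=1,
                   required_driver="fooStor.sys"))
```

Gating trace collection on both a response-time threshold and machine attributes keeps the returned traces focused on the configurations actually exhibiting the problem.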

As you might imagine, a good deal of engineering work went into making this end to end telemetry and feedback system work. Teams all across the Windows division have contributed to make this system a reality and I can assure you we’ll never approach performance the same now that we have these capabilities.

As a result of focusing on traces and fixing the very real issues revealed by them, we’ve seen significant improvements in actual responsiveness and have received numerous accolades on Windows 7. Additionally, I’d like to point out that these traces have served to further confirm what we’ve long believed to be the case.

This post provides an overview of the ways we have thought about performance with some specifics about how we measure it throughout the engineering of Windows 7. We believe that throughout the beta we will continue to have great telemetry to help make sure we are achieving our goals and that people perceive Windows 7 to perform well relative to their expectations.

We know many folks will continue to use stop watches, micro-benchmarks, or to drive automated tests. These each have their place in your own analysis and also in our engineering. We thought given all the interest we would talk more about how we measure things and how we're engineering the product.

--Steven and Michael

Comments
  • One thing I'd like to request is for issues around over-grown filecache problems on x64 machines to be looked at.

    It's not uncommon in vista x64 (on a 4gb machine) for the file cache to grow to 4+gb, and cause low memory error issues.  It seems to actively push out memory of active desktop processes (which should never get paged, minimized or not), which then causes thrashing when you return to these.

    In addition, this memory contention causes a really surprising performance hit:  after closing an app that consumed a great deal of memory.  

    In the default state, after closing Eclipse, for example, this will cause the drive to thrash hard for a minute or two.  When consulting Resource Monitor, it's a normal-IO-priority process hitting the pagefile.

    Because of all the disk activity, the desktop is slowed down.  Why would closing an app cause a minute or two of disk-thrashing ever, on a 4gb machine that never uses more than 4gb of memory (not including the horrible file cache)?  

    When doing some research on this, I discovered this post on the 'ntdebugging' blog:

    After reading that and the linked article, I started using Russinovich's CacheSet tool to constantly force the file cache down to 50MB.

    This results in dramatically increased performance (perceived at least) in normal operations, at some minor cost to startup of big apps and login scenarios.  

    But the important thing is that now, with that change, the laptop is well behaved all the time, even after closing eclipse or vstudio, and the disk does precisely ... nothing.  Which is wonderful.

    Something to consider, as excessive drive thrashing due to an overly aggressive file cache on x64 systems seems to be a performance problem, at least for some subset of users such as myself.

  • Good post, but what I really miss is the discussion on what you call 'perceived performance'. There are a lot of settings in Windows that are designed to "help" users but have a HUGE perceived performance cost.

    For example, On the 'System Properties' dialog --> 'Advanced' tab --> Performance Settings window, you can choose the setting 'animate windows when minimizing and maximizing'. I always disable this setting because I like windows to be snappy as you call it and not take ages to fade in / out to / from the taskbar. I think this setting should be disabled by default in Windows 7; it will greatly improve perceived performance for the end user. The 'benefit' of this fading in / out is only limited anyway; users need to discover only once that windows go to the taskbar when minimizing. So please turn this off by default!

  • All-in-one with RAID? Sounds like a Sony... I got a Touchsmart, though the specs are far from the same. But my main machine sounds identical spec wise (not an all-in-one). But it is strange how slow it can feel at times.

    Something like 10% of the time it feels so slow I want to pull my hair out, and the other 90% of the time I never think of performance (meaning that it is good). It can drive me nuts when an application (IE, Outlook, etc.) can't even show its menu when hitting the menu bar (the second click grays out the whole window for a time).

    That and logging in -- every IM app wants to start up and log in immediately, all at the same time. Live Mesh too. Everything. I swear they all use just one CPU core between them.

    Which brings me to my pet peeve -- if I start using an app like IE, and start typing -- one of those startup items gets going and takes the focus away. So I have to click the window again and start typing -- nope. Another did it -- I have to click again on the window, see what letters didn't make it and start typing again -- oh wait, I have to start again -- wait I didn't type anything -- click that window again! No... again! Again! Type real fast before the focus is gone (and doesn't even seem to go to any other window). Oh heck, just go get a coffee until the blasted machine will let me work...

  • @steveandmicheal

    I think this was an excellent post and it definitely gave us a more informative outline of just how serious Microsoft is about giving its customers a faster and more responsive system, perceived or not ;). After looking over some of the pre-beta builds online and reviews, it looks like Windows 7 is heading in the right direction.  Performance is a big deal for some of my customers, with the bigger ones having over 60,000 systems, and most of them holding out on Windows Vista because it just does not perform acceptably on Dell D610, Dell D620 and other older laptop systems. Windows Vista and Outlook 2007 on these systems - forget it! Unfortunately, I think you have a perception problem to overcome in regards to Vista.  I think the performance improvements will go a long way but you also need to make some improvements to the GUI. I know the taskbar is a start but on top of that you need to give users a feeling that Windows 7 is a new and exciting system... A cleaner Windows UI that is not exactly like Windows Vista, along with the continued performance objectives, will make this OS the best since Windows ME ;). You would not believe how many comments I've received from avid PC users reading Build 6956 reviews about how MS finally put a nice little graphic at system boot, even though it's just colored orbs coming together to form a Windows logo. I know... very simple and flashy. However, good looks, great functionality and high performance go a long way. With that said... thank you all for continuing to prove that Microsoft is moving in the right direction with Windows 7.  Keep up all the good work and I look forward to contributing to the beta.

  • Nice post Thank's !!!

    PS. No vacation :-D

  • @Jembe -- you make an interesting point.  I was speaking with one of the many reviewers at our reviewer's workshop and their point used the same setting, but drew the opposite conclusion.  This person said "there are settings like window animation that I know technical people say they turn off for improved performance, but you need to leave those on and add more of them--performance is not just speed, but also the perception of being in control and feedback" [not an exact quote but close].

    I think your point and this point show how even perceived performance has multiple perspectives for the same thing.  


  • umm i was just wondering that what if you could make the fastest version of windows WITH effects(unlike vista).please make windows 7 a success

  • I know that performance is a big thing, and I am well aware that there is much that can be done in Windows to improve performance; but am I missing something when hundreds of people are commenting with stupid demands about maximising speed and having a flashy UI, whilst running on the equivalent of an old ZX Spectrum?!

    People just need to be a little more realistic, that's all I am saying. Maybe that's where Windows 7 excels - it should scale itself down well, so that it runs acceptably on all machines.

  • @ Mr. Steven

    All in one


    quad core

    Discrete Gpu

    by chance in multitouch :D ?


    Last Build of Windows 7 is 7XXXX

    as I read a post by Paul T. ?

  • I believe that for most users, the most visible measure of performance of the operating system is how much and how often they have to wait for foreground tasks to complete. For example, it bothers me when I have to wait for files and folders to show up when saving or opening files from an application or when opening the Explorer, with the entire computer freezing for five or more seconds, but I don't really care whether something like automatic backup or indexing, which run in the background, takes a minute or two longer.

    Another example of a feature that would make the overall experience smoother and more fluent would be the ability of the system to save the state of the desktop in the event of application- or system-forced restart, such as after Windows Update. It would be nice to click restart, go make some coffee and return a couple of minutes later with all applications running as they were before restart, all the files opened and the computer ready to work.

  • This all certainly sounds quite excellent, and it is nice to hear that Microsoft is doing some proper performance enhancement with Windows 7.

    Just a few ideas.  

    1) There needs to be testing on "degraded" and "compromised" systems too, to see what the effect on OS performance will be a few years hence.    Microsoft can't rely, as in the past, on users choosing a new version of Windows to replace an old one that is performing badly.

    2) Performance should not be hit when predictable but infrequent problems occur.  For example, a WiFi network going down, a problem with DNS on a corporate network, sudden network congestion, an accidentally removed flash device and so forth.  

    3) Performance after "hibernation" has always been an issue ("cron-style" processes that decide to trigger for each of the missed minutes) and this needs to be tested both virtually (by clock adjustment) and for real.  

    4) Windows 7 needs to monitor its own performance and make suggestions to the user when problems occur, for example messages about disk usage, defragmentation, even adding extra memory - using real world calculations such as the time, cost and simplicity.

  • sroussey has a very good point there - startup time would _seem_ a lot faster if I didn't keep losing focus.  It's something that seems blindingly dumb in Windows - that my focus can be whipped out from under me while I'm in mid-type.  I've lost count of the number of times I've been typing something, had a dialog come out of nowhere, be dismissed because I'd hit space as part of what I was already typing, and vanish, leaving me with no idea what I'd just agreed to.  Having that happen repeatedly during startup is beyond frustrating.

  • 1. Let's say I replace "logonui" with an edited one and Windows doesn't start. I know this is not common, but why can't Windows understand this exe file is missing and put it back from its backup? The system cannot start over just one missing system file; last week this happened to me with ndis.sys or something like it. I had to restore from a backup for just one file.

    2. There are still small dialog windows like Folder Options and advanced properties, etc.

    Why don't they change these to bigger ones?

  • When I think of Windows performance issues, it's not so much the overall performance or responsiveness in normal conditions that bothers me. It's the 30+ second periods during boot where nothing seems to be happening, no disk activity and task manager shows 0% CPU, and apparently, all the processes starting up are just stumbling over each other, and none of them actually get any closer to loading. It's Explorer's magical single-threadedness. One wrong click on a network or floppy drive, and *every* Explorer window freezes up until some humongous timeout has expired. Or NTFS's seemingly incredible file access time (subjective, and I haven't done any specific testing on this, but doing something which requires touching a lot of files, such as compiling large C++ projects or checking dependencies for same, seems to be significantly slower than under other OSes with other filesystems). And of course, it's the amazingly excruciatingly slow installers and other non-essential software (generally only Microsoft software seems guilty of this, although not actually Windows itself. But Visual Studio, TFS Team Explorer, SQL Server and even service packs all require incredible amounts of temporary disk space and enough CPU time to compute more than a few Folding@Home packets).

    And finally, another very subjective point, is the apparent step backwards in Vista. It seems every action got just a bit slower, because there had to be more time for intricate animations and graphical transitions, which don't really seem to add more of a sense of "control and feedback" as you said above, but simply just inject a few more milliseconds of nothingness between your action and the feedback for it.

    And the apparently overengineered workarounds to "optimize" media playback by throttling network activity, speeding up network copying in specific  cases, by vastly slowing it down in others and other instances which to an outsider just looks like hugely overcomplicated workarounds to avoid fixing the

  • Windows 7 benchmark:

    XP vs Windows 7:

    That's really impressive. I saw many screen shots of Windows 7 build 6956. It seems like just like Vista and 7 server, their boot screens will be the same. I have both Vista and WS 2008 server installed (dual boot) and when either one is booting, I often get confused which one is booting, and only to find out when the welcome screen appears. You guys should put a text or logo of Windows 7 and/or it's server so that we won't be confused which OS is booting.

    And the Explorer of Windows 7 is pretty much unchanged. I am sure you guys have used a Mac. When it comes to ease of use, it will blow Windows away. I was expecting an Explorer with a ribbon UI which would expose the features, esp. the ones that are hidden deep under Explorer like shadow copy etc., right in front of our eyes. The ribbon has made using Office so much fun and easy. Everything I use is just there, just in front of my screen. But only the applications that are rarely used, like Wordpad and Paint, have the ribbon integrated. And as usual, please update that ugly Basic theme, it's still unchanged... Hope you guys will take those things into consideration.
