Welcome to our blog dedicated to the engineering of Microsoft Windows 7
We've talked some about performance in this blog, and recently many folks have been blogging and writing about the topic as well. We thought it would be a good time to offer some more behind-the-scenes views on how we have been working on and thinking about performance, because it is such an interesting topic for the folks reading this blog. Of course I've been using some pretty low-powered machines lately, so performance is top of mind for me as well. But for fun I am writing this on my early holiday present--my new home machine is a 64-bit all-in-one desktop machine with a quad-core CPU, discrete graphics, 8GB of memory, and hardware RAID, all running a pretty new build of Windows 7 upgraded as soon as I finished the out-of-box experience. Michael Fortin and I authored this post. --Steven
Our beta isn't even out the door yet and many are already dusting off their benchmarks and giving them a whirl. As a reminder, we are encouraging folks to hold off on benchmarking our pre-production builds. Yet we've come to expect it will happen, and we realize it will lead many to conclude one thing or another; at the same time, we appreciate those of you who take the time to remind folks of the pre-ship status of the code. Nevertheless we're happy that many are seeing good results thus far. We're not yet as happy as we believe we will be when we finish the product, as we continue to work on all the fundamental capabilities of Windows 7 as well as all the new features folks are excited about.
Writing about performance in this blog is nearly as tricky as measuring it. As we've seen, directional statements are taken further than we might intend, and at the same time there are seemingly infinite ways to measure performance and just as many ways to perceive the same data. Ultimately, performance is something each individual feels is right--whether that means adequate or stellar might vary scenario to scenario, individual to individual. Some of the mail we've received has been clear about performance:
You can also see through some of these quotes that performance means something different to different people. As user-interface folks know, perceived performance and actual performance can often be different things. I [Steven] remember when I was writing a portion of the Windows UI for Visual C++; when I benchmarked against Borland C++ at the time, we were definitely faster (measured in seconds). However, the reviews consistently mentioned Borland as being faster and providing feedback in the form of counts of compiled lines flying by. So I coded up a line-count display that flashed a lot of numbers at you while compiling (literally flashy, so it looked like it couldn't keep up). In clock time it actually consumed a non-zero amount of time, so we got "slower"--but the reviewers then started giving us credit for being faster. So in this case slower actually got faster.
There's another story from the past that is the flip side of this: the scrolling speed in Microsoft Word for DOS (and also Excel for Windows--same dynamic). BillG always pushed hard on visible performance in the "early" days, and scrolling speed was one of those things that never seemed to be fast enough. Well, clever folks worked hard on the problem and subsequently made scrolling too fast--literally to the point that we had to slow it down so you didn't always end up going from page 1 to the end of the document just because you held down the Page Down key. It is great to be fast, but sometimes there is "too much speed".
We have seen the feedback about what to turn off or adjust for better performance. In many ways what we're seeing are folks hoping to find the things that cause the performance to be less than they would like. I had an email conversation with someone recently trying to pinpoint the performance issues on a new laptop. Just by talking it through, it became clear the laptop was pretty "clean" (~40 processes, half of the 1GB of RAM free, <5% CPU at idle, etc.), and after a few back-and-forths it became clear it was the internet connection (dial-up) that was actually the biggest bottleneck in the system. Many encourage us to turn off animations, graphics, or even color, as there is a belief that these can be the root of performance problems. We've talked about the registry, disk space utilization, and even color depth as topics where folks see potential performance issues.
It is important to consider that performance is inherently a time/space tradeoff (computer science sense, not science fiction sense), and on laptops there is the added dimension of power consumption (or CPU utilization). Given infinite memory, of course many algorithms would be very different than the ones we use. In finite memory, performance is impacted greatly by the overall working set of a scenario. So in many cases when we talk about performance we are just as much talking about reducing the amount of memory consumed as we are talking about the clock time. Some parts of the OS are much more tunable in terms of the memory they use, which then improves the overall performance of the system (because there is less paging). Other parts of the system are much more about the number of instructions executed (because perhaps every operation goes through that code path). We work a great deal on both!
The reality of measuring and improving performance is one where we are focused at several "levels" in Windows 7: micro-benchmarks, specific scenarios, system tuning. Each of these plays a critical role in how we are engineering Windows 7 and while any single one can be measured it is not the case that one can easily conclude the performance of the system from a measurement.
Micro-benchmarks. Micro-benchmarks are the sort of tests that stress a specific subsystem at extreme levels. Often these are areas of the code that are hard to see the performance of during normal usage, as they go by very fast or account for a small percentage of time during overall execution. So tests are designed to stress part of the system. Many parts of the system are subjected to micro-benchmarking, such as the file system, networking, memory management, 2D and 3D graphics, etc. A good example here is the work we do to enable fast file copying. There is a lot of low-level code that accounts for a (very significant) number of conditions when copying files around, and that code is most directly executed through XCOPY in a command window (or an API). Of course the majority of copy operations take place through the Explorer, and along with that comes a progress indicator, a cancellable operation, counting up bytes to copy, etc. All of those have some cost along with the benefit. The goal of micro-benchmarks is to enable us to understand the best possible case and then compare it to the most usable case. Advanced folks always have access to the command line for more power, control, and flexibility. It is tempting to measure the performance of the system by looking at improvements in micro-benchmarks, but time and time again this proves to be inadequate, as routine usage covers a much broader code path and time is spent in many places. For Internet Explorer 8 we did a blog post on performance that went into this type of issue relative to script performance. At the other end of the spectrum, we definitely understand that the performance of micro-benchmarks on some subsystems will be, and should be, carefully measured--the performance of DirectX graphics is an area that gamers rely on, for example. It is worth noting that many micro-benchmarks also depend heavily on a combination of the Windows OS, hardware, and specific drivers.
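To make the best-case vs. most-usable-case comparison concrete, here is a minimal sketch of a file-copy micro-benchmark in Python. This is purely illustrative--it is not the harness Windows uses, and the chunk size and file size are arbitrary choices for the example:

```python
import os
import shutil
import tempfile
import time

def timed(fn, *args):
    """Return (seconds, result) for a single call -- a crude micro-benchmark."""
    start = time.perf_counter()
    result = fn(*args)
    return time.perf_counter() - start, result

def copy_plain(src, dst):
    """Best possible case: raw copy with no bookkeeping."""
    shutil.copyfile(src, dst)

def copy_with_progress(src, dst, chunk=64 * 1024):
    """Most usable case: copy in chunks, tracking bytes for a progress UI."""
    copied = 0
    with open(src, "rb") as fin, open(dst, "wb") as fout:
        while True:
            buf = fin.read(chunk)
            if not buf:
                break
            fout.write(buf)
            copied += len(buf)  # progress/cancellation bookkeeping costs a little

# Create a scratch file and compare the two code paths.
with tempfile.TemporaryDirectory() as d:
    src = os.path.join(d, "src.bin")
    with open(src, "wb") as f:
        f.write(os.urandom(4 * 1024 * 1024))  # 4 MB of test data
    t_plain, _ = timed(copy_plain, src, os.path.join(d, "plain.bin"))
    t_prog, _ = timed(copy_with_progress, src, os.path.join(d, "prog.bin"))
    print(f"plain: {t_plain:.4f}s  with progress: {t_prog:.4f}s")
```

Even this toy version shows the shape of the tradeoff: the chunked path is doing exactly the extra bookkeeping a progress indicator needs, which is why the stripped-down path serves as the "best possible case" baseline.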
Specific scenarios. Most people experience the performance of a PC through high-level actions such as booting, standby/resume, and launching common applications. These are topics we have covered in previous posts to some degree. In engineering Windows 7, each team has focused on a set of specific scenarios that we wanted to make better. This type of work should be demonstrable without any elaborate setup or additional tools. It often involves tuning the code path for the number of instructions executed, looking at the data allocated for the common case, or understanding all the OS APIs called (for example, registry lookups). One example that comes to mind is the work we have going on to reduce the time to reinsert a USB device. This is particularly noticeable for UFDs (USB flash drives) or memory cards. Windows of course allows the whole subsystem to be plumbed by unique drivers for a specific card reader or UFD; even if most of the time they are the same, we still have to account for the variety in the ecosystem. At the start of the project we looked at a full profile of the code executed when inserting a UFD and worked this scenario end-to-end. Then, systematically, each of the "hot spots" was worked through. Another example along these lines was playback of DVD movies, which involves not only the storage subsystem but the graphics subsystem as well. The neat thing about this scenario is that you also want to optimize for CPU utilization (which you might not even notice while playing back the movie) as that dictates the power consumption.
System tuning. A significant amount of performance work falls under the umbrella of system tuning. To ascertain what work to do in this area, we routinely look at the overall performance of the system relative to the same tests on previous builds and previous releases of Windows. We're looking for things we can do to remove operations that take a lot of time/space/power, or things that have "grown" in one of those dimensions. We have build-to-build testing to make sure we do not regress, and of course every developer is responsible for making sure their area improves as well. We have left no stone unturned in terms of investigating opportunities to improve. One of the areas many will notice immediately when looking at the pre-beta or beta of Windows 7 is the memory usage (as measured by Task Manager, itself a measurement that can be misunderstood) of the desktop window manager. For Windows 7, a substantial amount of architectural work went into reducing the amount of memory consumed by that subsystem. We did this work while also maintaining compatibility with Windows Vista drivers. We did similar work on the desktop search engine, where we reduced not just the memory footprint but the I/O footprint as well. One of the most complex areas to work on was the improvements in the taskbar and Start menu. These improvements involved substantial work on critical sections ("blocking" areas of the code), registry I/O, as well as overall code paths. The goal of this work is to make sure these UI elements are always available and feel snappy.
It is worth noting that there are broad based measures of performance as well that drive the user interface of a selection of applications. These too have their place--they are best used to compare different underlying hardware or drivers with the same version of Windows. The reason for this is that automation itself is often version dependent and because automation happens in a less than natural manner, there can be a tendency to measure these variances rather than any actual perceptible performance changes. The classic example is the code path for drawing a menu drop down--adding some instructions that might make the menu more accessible or more appealing would be impossible to perceive by a human, but an automated system that drives the menu at super human speed would see a change in "performance". In this type of situation the effect of a micro-benchmark is magnified in a manner inconsistent with actual usage patterns. This is just a word of caution on how to consider such measurements.
Given this focus across different types of measurement it is important to understand that the overall goal we have for Windows 7 is for you to experience a system that is as good as you expect it to be. The perception of performance is just as important as specific benchmarks and so we have to look to a broad set of tools as above to make sure we are operating with a complete picture of performance.
In addition to these broad strategies there are some specific tools we've put in place. One of these tools, PerfTrack, takes our use of performance data to the next level, and so will play a significant role in the beta. In addition, it is worth reminding folks about the broad set of efforts that go into engineering for performance:
PerfTrack is a very flexible, low-overhead, dynamically configurable telemetry system. For key scenarios throughout Windows 7, there exist "Start" and "Stop" events that bracket the scenario. Scenarios can be pretty much anything, including common things like opening a file, browsing to a web page, opening the control panel, searching for a document, or booting the computer. Again, there are over 500 instrumented scenarios in Windows 7 for Beta.
Obviously, the time between the Start and Stop events is meant to represent the responsiveness of the scenario, and clearly we're using our telemetry infrastructure to send these metrics back to us for analysis. PerfTrack's uniqueness comes not just from what it measures but from the ability to go beyond just observing the occurrence of problematic response times. PerfTrack allows us to "dial up" requests for more information, in the form of traces.
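As an illustration only (PerfTrack itself is built on Windows event tracing infrastructure, and the scenario name below is made up), the Start/Stop bracketing idea can be sketched like this:

```python
import time
from contextlib import contextmanager

# In-memory stand-in for the telemetry channel.
measurements = []

@contextmanager
def scenario(name):
    """Emit a 'Start' event, run the bracketed code, then emit a 'Stop' event."""
    start = time.perf_counter()  # the "Start" event
    try:
        yield
    finally:
        elapsed_ms = (time.perf_counter() - start) * 1000.0
        # The "Stop" event carries the elapsed time back for analysis.
        measurements.append((name, elapsed_ms))

# Example: bracket a (simulated) "open control panel" scenario.
with scenario("open_control_panel"):
    time.sleep(0.05)  # stand-in for the real work

name, ms = measurements[-1]
print(f"{name}: {ms:.1f} ms")
```

The key property is that the instrumented code only pays for two timestamps per scenario, which is what keeps this style of telemetry low-overhead.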
Let’s consider the distribution below and, for fun, let's pretend the scenario is opening XYZ. For this scenario, the feature team chose to set some goals for responsiveness. With their chosen goals, green depicts times they considered acceptable, yellow represents times they deemed marginal, and red denotes the poor times. The times are in milliseconds and shown along the X axis. The Hit Count is shown on the Y axis.
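The goal bands a feature team chooses can be sketched as a simple bucketing function over the reported times. The thresholds and sample times below are invented for illustration, not the team's actual goals:

```python
from collections import Counter

# Hypothetical goal bands for the "opening XYZ" scenario, in milliseconds.
ACCEPTABLE_MS = 1000  # green: acceptable
MARGINAL_MS = 5000    # yellow: marginal; anything slower is red (poor)

def band(elapsed_ms):
    """Map one measured time to its goal band."""
    if elapsed_ms <= ACCEPTABLE_MS:
        return "green"
    if elapsed_ms <= MARGINAL_MS:
        return "yellow"
    return "red"

# Simulated telemetry: elapsed times reported from many machines.
samples = [320, 750, 1200, 4100, 5600, 9800, 410, 2300]
histogram = Counter(band(ms) for ms in samples)
print(histogram)  # hit count per band, e.g. 3 green, 3 yellow, 2 red
```

The distribution in the figure is essentially this histogram with much finer time buckets along the X axis.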
As can be seen, there are many instances where this scenario took more than 5 seconds to complete. With this kind of a distribution, the performance team would recommend that we "dial up" a request for 100+ traces from systems that have experienced a lengthy open in the past. In our "dialed up" request, we would set a "threshold" time that we thought was interesting. Additionally, we may opt to filter on machines with a certain amount of RAM, a certain class of processor, the presence of a specific driver, or any number of other things. Clients meeting the criteria would then, upon hitting the "Start" event, quickly configure and enable tracing, and potentially send the trace back to us if the "Stop" event occurred after our specified "threshold" of time.
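The client-side "dial up" decision amounts to a filter: capture and send a trace only when the run exceeds the threshold and the machine matches the requested criteria. A hypothetical sketch (the field names and driver name are invented for illustration):

```python
# A hypothetical "dial up" request describing which machines should trace.
request = {
    "scenario": "open_xyz",
    "threshold_ms": 5000,          # only report runs slower than this
    "min_ram_mb": 1024,            # only machines matching these criteria
    "required_driver": "storahci", # hypothetical driver-presence filter
}

def should_send_trace(machine, elapsed_ms, req):
    """Decide, client-side, whether this run qualifies for a trace upload."""
    return (
        elapsed_ms > req["threshold_ms"]
        and machine["ram_mb"] >= req["min_ram_mb"]
        and req["required_driver"] in machine["drivers"]
    )

machine = {"ram_mb": 2048, "drivers": {"storahci", "nvlddmkm"}}
print(should_send_trace(machine, 6200, request))  # slow run, matching machine: True
print(should_send_trace(machine, 800, request))   # fast run, no trace needed: False
```

Filtering on the client like this is what keeps the volume of uploaded traces manageable while still targeting exactly the problematic configurations.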
As you might imagine, a good deal of engineering work went into making this end to end telemetry and feedback system work. Teams all across the Windows division have contributed to make this system a reality and I can assure you we’ll never approach performance the same now that we have these capabilities.
As a result of focusing on traces and fixing the very real issues revealed by them, we've seen significant improvements in actual responsiveness and have received numerous accolades on Windows 7. Additionally, I'd like to point out that these traces have served to further confirm what we've long believed to be the case.
This post provides an overview of the ways we have thought about performance with some specifics about how we measure it throughout the engineering of Windows 7. We believe that throughout the beta we will continue to have great telemetry to help make sure we are achieving our goals and that people perceive Windows 7 to perform well relative to their expectations.
We know many folks will continue to use stop watches, micro-benchmarks, or to drive automated tests. These each have their place in your own analysis and also in our engineering. We thought given all the interest we would talk more about how we measure things and how we're engineering the product.
--Steven and Michael
"i want to be able to run windows 7 extremely fast and still look good graphically on a asus aspire one netbook with these specs-1.5 ghz intel atom processor (single core) 1gb of ram"
I'm pretty sure Acer will be disappointed if Asus have brought out an Aspire One range :S
There's nothing to hide about DRM though. Several TV-on-demand services require Vista or XP, because the DRM that they require isn't available in Linux or OS X. A perfect example is the BBC's iPlayer. You can stream media on any OS, but they'll only allow downloaded content on Windows.
If we didn't have the DRM infrastructure, they wouldn't allow it.
"I suppose the movie would be faster if they had straight cuts, but that probably wouldn’t make for a very interesting movie."
Well, that illustrates the mistake you are making: an OS is not a movie or a video game. It doesn't have to be 'interesting'. Animations are cool but they are not crucial.
I'm not pushing for my personal preferences to be the defaults in the next release, I'm just saying that this approach is the wrong one.
2/ Systems slowing down over time and programs at startup:
This is one point we have talked about again and again, and that many here would like to see discussed: crapware.
One radical solution would be to totally forbid a program from automatically launching at startup, unless the user has decided so through an MS Windows console, or the software has been installed as a driver through a special driver installer.
3/ joewoodbury on bloat:
Vista's core code (~2GB) is twice as large as XP's core code and 10x the size of a complete Win98 install.
But what is most frightening is the increased level of complexity. I could hardly believe it when I read the explanation of WinSxS: how did MS developers get themselves into something so crazy?
And this is probably just the tip of the iceberg: MS Windows developers can't keep things simple and consistent with the simple idea of using a computer with a keyboard and a mouse. That's why their code is bloated.
Maybe now that the various MS developers gather once a week in front of a whiteboard for a brainstorming, things will be better.
4/ Windows Defender
Definitely it should be installed as an option, just like about everything else.
But I like Windows Defender because it's one of the best anti-virus tools (at least as of 2 years ago) and because I don't want to look for a third party anymore. For simple users who don't need extra-featured security facilities, it's OK.
But radical security would be to totally protect system files: make it impossible for installers, scripts, or software to modify them, and allow installers, scripts, and software to modify things in their allocated folders only. The same principles would apply to the registry.
System files and the registry should be modifiable only through Windows Update, remote maintenance, and UAC-like authorisation (with no way to disable UAC in this case).
Then the ultimate, albeit optional, measure would be that under no circumstances could a system file or registry entry be modified while an internet connection is present or while IE, Office, or Windows Mail is running.
Great Stuff! Great Discussion! I enjoy it!
Merry Xmas to all!
Power users are the ones who would want to disable a feature such as Windows Defender, or even Windows Resource Protection, but casual users probably get more use out of having it enabled, as an effective means of putting an end to spyware of different kinds. Although it would be nice if security features like that came without the cost of decreased performance (a Windows installation with Windows Defender or Windows File Protection disabled in XP seems faster than one with those features enabled).
I don't have anything against protecting content... as long as this is done at the application layer and doesn't affect my functionality. When companies make things like Sony rootkits, there is something wrong. When DRM is implemented at a very low level and it can affect my performance, there is something wrong.
You noted a very good thing. Microsoft seems to have some wrong design rules, and that's why they failed with Vista. I have a feeling that these rules were changed after 2001 (do you remember what happened then?). I agree that we first need a good, stable operating system for securely running applications; later it will be good to think about nice graphic animations or similar things.
I hope again that Microsoft will add one installation mode for normal users and one for power users, who can disable whatever they would disable--and doing it while you are installing, not after the installation, would be much better.
About boot time, I don't think that this point is very important (how often do you boot your machine during a workday? Maybe once?). Also I think that the 24.427 s that Windows needs to boot up is fast enough (yes, I know you say that you will go down to 15 s), and that's much faster than OS X Leopard on my MacPro... OK, now to my problem... I don't think that you can do a lot, but there are your hardware partners.
BIOS... they take so long to start (on my brother's system the BIOS start needs about 10 s)... and that's the biggest problem in system bootup. The BIOS developers need to remove all the bonus features from their systems...
@marcinw: I think Vista was the step to the stable operating system, and Seven is the step to the stable OS with graphic animations and the clear UI...
To all DRM-kills-my-performance-posters:
DRM does not impact performance of your daily operation. PVP and all that stuff only goes into operation when using WMP to play videos. DRM is not a performance killer. This is utter rubbish.
Resource Monitor in Win7 is great. I have only two suggestions:
1. It'd be nice if there was a shortcut for it (similar to CTRL-SHIFT-ESC for taskmgr).
2. There are some processes which represent a whole range of stuff; "System" or "svchost.exe" come to mind. When checking disk access I'm interested in which thread of "System", or which service, caused the access. Couldn't you give me more details there?
I haven't said that everything in Vista is wrong. But the truth is that the market is changing, and the main design principles used to build an OS should be changed too... And we still have a shared registry and thousands of old things... And this is wrong. When we need 2x more code to give users the ability to run their software, this is wrong. I will probably say much more when the beta program starts :)
from Microsoft documents:
"In Windows Vista, the Protected Environment provides process isolation and continually monitors what kernel-mode software is loaded"
and this is only one example. I'm not 100% sure how exactly this is implemented, but the word "CONTINUALLY" still gives a lot to think about... and many people will not like Windows 7 because of it (and not because they want to play illegal content)
Besides, back in the old XP days, if you didn't do anything on your computer, the processor would be sitting at 0%, 1% usage... The computer used to be silent, no hard disk movements... Now, with Vista, the computer's always doing something, the hard disk's struggling and making noises... weird. There's too much working inside... Can it stand still for at least a minute before I leave my desk or decide to get back to working?
One thing that is using idle hard drive time seems to be the SuperFetch feature. That is a feature I would have liked to be able to manage better, and turn off through some GUI.
I've still got the old Vista boot screen in build 6956 as in build
I've got an Nvidia chipset, is that the problem?
please fix, Microsoft!!
marcinw -- please don't just post links to other sites. please try to keep comments fresh and related to the discussion. thank you.
Any ideas yet on how many versions will be released?
I am hoping it will be fewer than previously. I would hate for another one of these:
Can we please, for the sake of our collective sanity, have a moratorium on comments like = "ZOMG, DRM is killing us all".
Without the appropriate DRM infrastructure in place, elementary tasks like playing a DVD, Blu-Ray or other protected online content would simply be impossible. Thus, DRM is necessary and its inclusion in Vista and Seven does not impose an onerous performance overhead.
In fact, I've never seen one empirical study that concludes that DRM is one of the predominant causes of performance degradation in Windows. That's right, not one. Yet, day after day after day, we get unrelenting feigned indignation from anti-Vista, anti-Microsoft zealots falsely proclaiming otherwise.
So, can all those people oblivious to facts please give it a rest and go away. It is tiresome, especially for Mr Sinofsky who has gone to all the effort to post thoughtful and engaging pieces only to have the ensuing comments sullied by infantile, acerebral discussion.
Now that I've had my rant, I'd like to comment specifically on the article.
Firstly, it was very informative. I especially praise the Windows team for focusing on performance tuning in this release, particularly more efficient memory usage.
However, one of the ongoing problems with Windows, since the newly introduced network stack in Vista, is that network file transfers can be abominably slow. As of yet, even with SP1 and subsequent patches, no substantial improvements have been realised in my experience. I really hope that this issue is resolved in the final RTM of Windows 7.
Many people have commented on the issue of performance degradation over time. I can personally affirm that it does occur--however, the situation improved markedly with Vista. The superior memory management, background defrags and other assorted performance tweaks introduced with Vista minimised the threat of performance degradation, which I might add mostly originates from third-party applications infecting the registry and start-up folders.
Given that Windows 7 now has an applet that allows users to configure the notification area applications with significantly more granularity, I am even more excited. I have to say that it's long overdue, but welcomed nonetheless. The only thing I would recommend is that access to this applet is made obvious to average users. Microsoft has a long history of providing feature-packed releases wherein the features are difficult to discover. So, make sure that these important features are discoverable, preferably via the new "Action Center".
As for real vs perceived performance, I would advocate that a reasonable balance between both is important. Using Windows has to be aesthetically pleasing, simple and snappy. A user experience which I feel strikes a pretty good balance is that on the Apple iPhone. Relative to its competitors, the device is running a slower-clocked processor, yet the software maximises real performance and complements it with visual effects that give the impression of snappy performance even where empirical measurement shows otherwise.
While Vista isn't interminably slow, the user experience is occasionally encumbered with an excess of dialogue boxes. As someone mentioned earlier, the simple process of changing TCP/IP settings, or even viewing your IP address, requires several clicks. In XP, you just needed to right click the network status notification icon and click properties or status. The same task requires you to open the network center first and then search around. It's cumbersome and makes basic network administration tedious because the software seems to forcibly encourage use of the "diagnose and repair network" applet, which is not very helpful at times. Needless to say, please bring back the option to view status and properties from the right-click menu of the network status notification icon. It's sorely missed by power users.