Engineering Windows 7

Welcome to our blog dedicated to the engineering of Microsoft Windows 7

Windows 7 -- Approach to System Performance

Many folks have commented and written email about the topic of performance of Windows. The dialog has been wide ranging—folks consistently want performance to improve (of course). As with many topics we will discuss, performance, as absolute and measurable as it might seem, also has a lot of subtlety. There are many elements and many tradeoffs involved in achieving performance that meets everyone’s expectations. We know that even meeting expectations, folks will want even more out of their Windows PCs (and that’s expected). We’ve re-dedicated ourselves to work in this area in Windows 7 (and IE 8). This is a major initiative across each of our feature teams as well as the primary mission of one of our feature teams (Fundamentals). For this post, I just wanted to frame the discussion as we dig into the topic of performance in subsequent posts.  Folks might find this post on IE8 performance relevant along with the beta 2 release of IE 8. 

Performance is made up of many different elements. We could be talking about response time to a specific request. It might mean how much RAM is “typical” or what CPU customers need. We could be talking about the clock time to launch a program. It could mean boot or standby/resume. It could mean watching CPU activity or disk I/O activity (or lack of disk activity). It could mean battery life. It might even mean something as mundane as typical disk footprint after installation. All of these are measures of performance. All of these are systematically tracked during the course of development. We track performance by running a known set of scenarios (there are thousands of these) and developers can run specific scenarios based on exercising more depth or breadth. The following represents some (this is just a partial list) of the metrics we are tracking while developing Windows 7 (a small illustrative measurement sketch follows the list):

  • Memory usage – How much memory a given scenario allocates during a run. As you know, there is a classic tradeoff in time v. space in computer science and we’re not exempt. We see this tradeoff quite a bit in caches where you can use more memory (or disk space) in order to improve performance or to avoid re-computing something.
  • CPU utilization – Clearly, modern microprocessors offer enormous processing power and with the advent of multiple cores we see the opportunity for more parallelism than ever before. Of course these resources are not free so we measure the CPU utilization across benchmark runs as well. In general, the goal should be to keep the CPU utilization low as that improves multi-user scenarios as well as reduces power consumption.
  • Disk I/O – While hard drives have improved substantially in performance we still must do everything we can to minimize the amount that Windows itself does in terms of reading and writing to disk (including paging of course). This is an area receiving special attention for Windows 7 with the advent of solid state storage devices that have dramatically different “characteristics”.
  • Boot, Shutdown, Standby/Resume – All of these are the source of a great deal of focus for Windows 7. We recognize these can never be fast enough. For these topics the collaboration with the PC manufacturers and hardware makers plays a vital role in making sure that the times we see in a lab (or the performance you might see in a “clean install”) are reflected when you buy a new PC.
  • Base system – We do a great deal to measure and tune the base system. By this we mean the resource utilization of the base system before additional software is loaded. This system forms the “platform” that defines what all developers can count on and defines the system requirements for a reasonable experience. A common request here is to kick something out of the base system and then use it “on demand”. This tradeoff is one we work on quite a bit, but we want to be careful to avoid the situation where the vast majority of customers face the “on demand” loading of something which might reduce perceived performance of common scenarios.
  • Disk footprint – While not directly related to runtime performance, many folks see the footprint of the OS as indicative of the perceived performance. We have some specific goals around this metric and will dive into the details soon as well. We’ll also take some time to explain \Windows\WinSxS as it is often the subject of much discussion on technet and msdn! Here rather than runtime tradeoffs we see convenience tradeoffs for things like on-disk device drivers, assistance content, optional Windows components, as well as diagnostics and logging information.
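To make the scenario tracking above a bit more concrete, here is a minimal measurement sketch, not Microsoft's actual test harness: it assumes the third-party psutil package, and run_scenario() is just a placeholder for whatever workload is being measured.

```python
# Minimal sketch of scenario-based performance measurement. Assumes the
# third-party "psutil" package; run_scenario() stands in for the workload
# under test (launching an app, opening a document, and so on).
import time
import psutil

def run_scenario():
    # Placeholder workload; replace with the scenario under test.
    sum(i * i for i in range(1_000_000))

def measure(scenario, runs=5):
    proc = psutil.Process()
    results = []
    for _ in range(runs):
        io_before = psutil.disk_io_counters()
        start = time.perf_counter()
        scenario()
        elapsed = time.perf_counter() - start
        io_after = psutil.disk_io_counters()
        results.append({
            "clock_time_s": round(elapsed, 4),
            "working_set_mb": round(proc.memory_info().rss / 2**20, 1),
            "disk_read_mb": round((io_after.read_bytes - io_before.read_bytes) / 2**20, 2),
            "disk_write_mb": round((io_after.write_bytes - io_before.write_bytes) / 2**20, 2),
        })
    return results

if __name__ == "__main__":
    for i, result in enumerate(measure(run_scenario), 1):
        print(f"run {i}: {result}")
```

In practice each scenario would be run repeatedly on every reference hardware configuration and the resulting distributions compared against the milestone criteria described next.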

We have criteria that we apply at the end of our milestones and before we go to beta and we won’t ship without broadly meeting these criteria. Sometimes these criteria are micro-benchmarks (page faults, processor utilization, working set, gamer frame rates) and other times they are more scenario based and measure time to complete a task (clock time, mouse clicks). We do these measurements on a variety of hardware platforms (32-bit or 64-bit; 1, 2, 4GB of RAM; 5400 to 7200 RPM or solid-state disks; a variety of processors, etc.). Because of the inherent tradeoffs in some architectural approaches, we often introduce conditional code that depends on the type of hardware on which Windows is running.
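As an illustration of that kind of hardware-conditional code, here is a hedged sketch, not actual Windows code: it assumes the third-party psutil package for the memory query, and is_rotational_disk() is a hypothetical stand-in for a real storage-stack query.

```python
# Illustrative sketch of hardware-conditional tuning; not actual Windows code.
# Assumes the third-party "psutil" package. is_rotational_disk() is a
# hypothetical stand-in for whatever the platform exposes about the disk type.
import psutil

def is_rotational_disk():
    # Hypothetical helper; a real implementation would ask the storage driver.
    return True

def choose_tuning():
    total_gb = psutil.virtual_memory().total / 2**30
    return {
        # With plenty of RAM, trade memory for speed by caching more aggressively.
        "cache_budget_mb": 512 if total_gb >= 4 else 128,
        # Aggressive prefetching helps rotational disks; SSDs gain little from it.
        "prefetch": "aggressive" if is_rotational_disk() else "minimal",
    }

if __name__ == "__main__":
    print(choose_tuning())
```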

On the one hand, performance should be straightforward—use less, do less, have less. As long as you have less of everything performance should improve. At the extreme that is certainly the case. But as we have seen from the comments, one person’s must-have is another person’s must-not-have. We see this a lot with what some have called “eye candy”—we get many requests to make the base user interface “more fun” with animations and graphics (“like those found on competing products”) while at the same time some say “get rid of graphics and go back to Windows 2000”. Windows is enormously flexible and provides many ways to tune the experience. We heard lots on this forum about providing specific versions of Windows customized for different audiences, while we also heard quite a bit about the need to reduce the number of versions of Windows. However, there are limits to what we can provide and at the same time provide a reliable “platform” that customers and developers can count on and is robust and manageable for a broad set of customers. But of course within a known context (within your home or within a business running a known set of software) it will always be possible to take advantage of the customization and management tools Windows has to offer to tune the experience. The ability to have choice and control what goes on in your PC is of paramount importance to us and you will see us continue to focus on these attributes with Windows 7.

By far the biggest challenge in delivering a great PC experience relative to performance is that customers keep using their PCs to do more and more things and rightfully expect to do these things on the PC they own by just adding more and more software. While it is definitely the case that Windows itself adds functionality, we work hard to pick features that we believe benefit the broadest set of customers. At the same time, a big part of Windows 7 will be to continue to support choice and control over what takes place in Windows with respect to the software that is provided, what the default handlers are for file types and protocols, and providing a platform that makes it easy for end-users to personalize their computing experience.

Finally, it is worth considering real world versus idealized settings. In order to develop Windows we run our benchmarks in a lab setting that allows us to track specifically the code we add and the impact that has. We also work closely with the PC Manufacturers and assist them in benchmarking their systems as they leave the factory. And for true real-world performance, the Microsoft Customer Experience Improvement Program provides us (anonymous, private, opt-in) data on how machines are really doing. We will refer to this data quite a bit over the next months as it forms a basis for us to talk about how things are really working, rather than using anecdotes or less reliable forms of information.

In our next post we will look at startup and boot performance, and given the interest we will certainly have more to say about the topic of performance.

--Steven

Leave a Comment
  • Thanks for the very complete paper on system performance.  I learned a lot and hope to keep learning as Windows 7 testing approaches.

  • I wrote a long comment, so I split it up into sections; here it goes:

    First I would like to address the fine people who comment on this blog. Comments like “make it better” or “make it faster” aren’t really helpful; they just add to the clutter and make it more difficult to find good ideas in the comments. Also, please read through the comments, and if you find your idea expressed in more than three comments, please don’t repeat it for the thousandth time; just think about something else. Thank you.

    Now a few observations on the raised issues:

    SKU management: this is a marketing task; the engineering department doesn’t have much to do with this decision, I would imagine. It’s not an engineering task, and it concerns me little (since I used XP Pro and now Vista Ultimate), but to set a few things straight:

    There is only one Vista DVD. No matter what you install, there is only ONE Vista DVD; there are no separate Basic, Premium, Business, Enterprise and Ultimate DVDs, there is only ONE DVD. The edition you install depends on the serial number you purchased, and you can upgrade using the same DVD depending on the serial number you enter at setup; the choice of what to install is really made when you decide which license (SKU) you buy. (Tip: if you can’t decide, get a free trial from Microsoft; it contains the different versions, so install the ones you’re interested in and then make an informed decision.)

    As for having only one version, that idea is only good for Vista Ultimate users and the Windows engineering team, because what the different versions do is save customers money by not making them pay for features they don’t use. Why would home users have business and developer features, and what would businesses do with Media Center or Movie Maker? More importantly, why would they pay for them and have them take up space on the drive? By now the people who suggested a customized install, where people would just install the features they need, are thinking I’m trying to prove their point; no. The Vista setup installs over 6GB of OS in 20 to 30 minutes and it does that by just unzipping an OS image and sets up drivers and options on top of that base image, ISV’s, IT pros and users are happy with the simple, mostly unattended and fast setup procedure, by contrast the XP setup installs 2GB in 20-30 minutes. So if you just want a basic Windows without eye candy, use Basic; if you are a typical home user, use Home Premium; and so forth.

  • As for Windows component customization, all we get is a treeview with checkboxes in the Add Windows Components dialog, which takes even me, as a developer, a little time to navigate through; it could be impossible for a simple user to tell what those components mean and whether they should add or remove them. Also, you can’t uninstall PhotoGallery, MovieMaker, DVD Creator, Media Center, etc. So my suggestion is to make a new .cpl with an intuitive interface and guidance for normal users on whether a feature is useful for them or not. I like PhotoGallery, but if I install the Live PhotoGallery I don’t want them both; also, I never use MovieMaker or DVD Creator or Media Center and would like a simple way to uninstall them. Making all the nonessential Windows applications easy to uninstall is a good idea (most people won’t do it, but it should be easy for those who want to).

    Some say “throw out WinSxS”; they don’t remember DLL hell and exactly why it has that name. I do, and if giving up 7.5GB on my drive means never going back to that, then I’m happy, but I think there is still some optimization to be done here. I remember that before WinSxS, applications only had a few conflicting DLLs, which of course made installation impossible, but the key here is only a few conflicting DLLs. Versioning like DirectX versioning (d3dx9_35.dll) or the VC runtime (vcrt80.dll) keeps conflicts to a minimum, and a lot of the time an application will install a newer DLL which is backward compatible with the old one. So why do I need 8 versions of the amd64 and 5 versions of the IA64 VC runtime DLLs, which I don’t use on my 32-bit system and which I instructed Visual Studio not to install (maybe I need to take that up with the VS team)? And why do I need 8 versions of system.servicemodel.dll, which is a managed DLL and should implement the same API throughout, so an older application should work just fine with the newer DLL? I understand having one system.servicemodel.dll for .NET 2.0, one for 3.0 and one for 3.5 (although 3.0 and 3.5 should use some of the same DLLs as 2.0), so I would understand having 3 versions, but why eight? A service pack shouldn’t change the API, and if developers developed against the API and not its implementation quirks, it should be fine to use the newest DLL; most applications work just fine with the newest version, so why burden my system with 8 versions of every .NET DLL on the off chance that maybe an application doesn’t work? I know .NET says it will try to run your application against the newest CLR and then revert to the CLR you were compiling against, but that should be by CLR version (like 2.0, 3.0), not every single service pack and security update. I know that the current implementation assures compatibility, but I’m also sure that a more restrictive implementation would ensure compatibility for 99% of users while reducing disk footprint for everyone. As an idea, maybe you should have a manifest field to specify whether a specific DLL breaks (API) compatibility and should be placed as a copy in WinSxS, or should just replace the older DLL.
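    On the footprint point, it is worth noting that a large share of the reported WinSxS size is hard links to files that also live elsewhere under \Windows, so the number Explorer shows overstates the real disk cost. Below is a minimal, illustrative sketch (not an official tool) of how one could estimate that for a folder by counting each physical file only once via its file ID; it assumes a Python 3 interpreter where os.stat() fills in st_ino and st_nlink, as it does on NTFS.

```python
# Sketch: estimate how much of a directory's apparent size is data shared
# through hard links. Each physical file is counted once by its file ID;
# files with more than one link may also live elsewhere (e.g. System32).
import os
import sys

def scan(root):
    apparent = unique = hard_linked = 0
    seen = set()
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            try:
                st = os.stat(os.path.join(dirpath, name))
            except OSError:
                continue                   # file in use, access denied, etc.
            apparent += st.st_size
            key = (st.st_dev, st.st_ino)
            if key in seen:
                continue                   # another link to a file already seen
            seen.add(key)
            unique += st.st_size
            if st.st_nlink > 1:
                hard_linked += st.st_size  # this data is also linked elsewhere
    return apparent, unique, hard_linked

if __name__ == "__main__":
    root = sys.argv[1] if len(sys.argv) > 1 else r"C:\Windows\WinSxS"
    a, u, h = scan(root)
    gb = 2 ** 30
    print(f"apparent {a / gb:.2f} GB, unique {u / gb:.2f} GB, "
          f"of which {h / gb:.2f} GB is shared via hard links")
```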

  • punio4;

    I totally agree.  I would love it if I just dragged the install to the "Applications" folder and the only thing I had to worry about were the documents it created in my "Home" folder and the files it created in the "Applications" folder.

    Back to the registry but that's a given.  Talk about bootup time improvements.  How much performance would be gained, memory, etc, if the registry wasn't loaded at start?  I don't know for sure that it's fully parsed on startup but I'm not a fan anyway.

    While I think that it's good to allow users to disable certain "Eye Candy", I don't think this should be an excuse to have a non-standard UI.  Open IE, you have one style, open Explorer, you have old windows, open a number of different applications and when you want that eye candy you always end up running to that overzealous old schooler.  When I'm using "Competitor" products, do I ever run into an icon that looks like it was made in 1492 or a window that uses dropdowns designed in OS/2 Warp?

    The great thing that WPF provides is the ability to utilize animation and old school looking forms controls in the same application, while maintaining separate templates for both.  Why not open the templates up for override like WPF does?

    I just want to see a consistent clean user experience in both asynchronous GUI parts and everyday tasks.  I know there are a lot of people out there that like the old Windows UI still but that's why there's the option to disable it.

    Work out two separate explorers, two separate layouts entirely, and allow the user to choose what they want when they install windows.  Don't make people use the new UI and don't make people not use it.  If it means installing old libraries and getting the old Vista/XP experience, let the user install it and don't install the new UI components.  Work on the new UI entirely separate from the "old UI".  Both teams can hash out API's with the business and data layer teams.

  • The biggest enhancement you can make is (as someone already pointed out) to make Windows retain performance even after months of usage. I understand that this is not magically possible, because leftover registry entries, growing registry size due to the growing list of installed apps, and piling log files all slow down Windows, but Microsoft has never attempted to prevent any of these. (Windows Disk Protection as part of the Shared Computer Toolkit/SteadyState and virtualization undo disks are nice attempts.) After several months of usage, the biggest resource consumers on my system are typically the shell, which slows down as shell extensions are added; non plug and play drivers; IE, because of IE add-ins which even when disabled affect IE's performance; and the startup times of all I/O-intensive apps. Maybe MS can add a 'Windows Registry Protection' to SteadyState which *deliberately makes it* lose all changes upon reboot?

  • Another major turnoff I discovered in Vista is that it uses WinSxS not only to store side-by-side assemblies but also to store files protected by Windows Resource Protection, and maintains multiple copies of those files (along with ridiculously long, horrible names) which are updated by hotfixes/service packs. The servicing stack (Package Manager), however it is designed, always performs much, much slower than its previous incarnation (Update.exe). Installing hotfixes is slow, and every hotfix needs to be 'configured' before logon and logoff. Whatever the case may be (poor design or simply another tradeoff), installing updates/hotfixes should not take this kind of approach. Also, the growing footprint of the WinSxS folder is a living example of how Microsoft has no concern about disk space on end users' drives; you've simply assumed modern disk drives are large enough to make disk space a non-issue. I've come to associate Vista with an OS that doesn't get updating itself right. The WinSxS folder doesn't seem to be a tradeoff in either time or space. (And yes, I have read http://blog.tiensivu.com/aaron/archives/1306-Demystifying-the-WinSxS-directory-in-Windows-XP,-Vista-and-Server-20032008.html but WinSxS doesn't perform the same way as in previous OSes; it does something more in Vista). Something seriously needs to be done about WinSxS and the servicing stack; I agree with one of the above posters that it's the worst architectural part of the OS.

  • 'Windows is enormously flexible and provides many ways to tune the experience.' That holds true only for the Windows NT 5.0/5.1 family. Vista isn't flexible or customizable in any way; it takes away all the power and customizability from power users and offers an enormously dumbed-down interface that helps only grandmas and Joe Averages. Microsoft's shell and UI teams did their worst job during the Longhorn project, and the UI is much less productive, disruptive to those already familiar with it, worsened in many cases by the removal of fine-grained configurable settings, and made complicated by long explanations which require a lot of reading before the user takes any action. Some examples of unproductive UI are Windows Explorer's lack of customizability and removal of several old buttons and menus, the idiotic behavior of compulsorily and automatically sorting files, sort-by-any-criteria working in the reverse order, the effort needed to get to the connection list out of the box, the fixed-size tree-style Start menu, the advanced search UI (although search itself works satisfyingly), the 'Default Programs'/file types UI, and URLs! (can you believe it?) in load/save dialog boxes. 'Property sheets', as MS calls them instead of dialogs, aren't productively designed in Vista (although they scale well resolution-wise). I think all the money Microsoft spent on 'user interface R&D' and 'user experience' was completely wasted in Windows Vista.

  • I hope Microsoft will really try to address the performance of the redesigned apps in Vista (e.g. the abominable Disk Defragmenter, Windows Mail performance, the slow-as-ever Windows Media Player), again rewrite the servicing stack in Windows 7 to have the fastest performance, even exceeding that of Update.exe, and make Windows 7's UI highly customizable (TweakUI, where are you?).

    As for the number of editions, you could merge Starter and Home Basic and keep them for new markets. The Starter/Home Basic edition can be power-optimized for laptops, and 3 mainstream editions (Home, Professional (again, merge the Business/Enterprise SKUs) and Ultimate) balance things out. The Home edition can be the media/gamer-oriented one. Another minor aspect is that the features of the OS aren't correctly distributed across the SKUs; for example, for some insane reason, the Business edition doesn't have BitLocker, or the Unix subsystem! (what value does it give me in upgrading if XP Professional can get Services for Unix?). Home Premium doesn't have Fax?, EFS!, Previous Versions?, Complete PC Backup, RDP host/server?, Local Group Policy (at least)!

    Lastly, another change of approach I would like to see in Microsoft's attitude is with respect to what it calls 'feature design change'. I think Microsoft can really carefully watch the market for issues which users have and unanimously agree on *with the current product*, and ship solutions in the form of powertoys, hotfixes or service packs. Waiting for the next release to get some major blunders right, besides bugs, doesn't add value for those who've purchased the current product already and are not happy with it.

  • While re-engineering a lot of other stuff under the hood, could we please see some kind of application bundles (like OS X has), with a folder representing an application, preferably combined with something like a registry root named HKEY_CURRENT_APPLICATION that stores its settings inside that bundle, so we can finally move applications around without breaking their configuration... - or force us developers to use INI files again (a rough sketch of the idea follows this comment).

    Oh, and while you're at it, separating the ui/kernel and making the ui very open to customization would be really fun :)
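    A rough sketch of the "settings travel with the application" idea, using a plain INI file stored next to the program instead of the registry; the section and key names below are purely illustrative.

```python
# Sketch of per-application settings that travel with the program: a plain
# INI file kept next to the executable/script instead of in the registry.
# The section and key names are purely illustrative.
import configparser
import os
import sys

APP_DIR = os.path.dirname(os.path.abspath(sys.argv[0]))
SETTINGS_PATH = os.path.join(APP_DIR, "settings.ini")

def load_settings():
    config = configparser.ConfigParser()
    config.read(SETTINGS_PATH)
    if "ui" not in config:
        config["ui"] = {"theme": "default", "window_width": "800"}
    return config

def save_settings(config):
    with open(SETTINGS_PATH, "w") as f:
        config.write(f)

if __name__ == "__main__":
    settings = load_settings()
    settings["ui"]["window_width"] = "1024"   # change survives moving the folder
    save_settings(settings)
    print(f"settings stored at {SETTINGS_PATH}")
```

    Because everything lives under the application's own folder, copying or moving that folder carries the configuration with it, which is the portability the comment above is asking for.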

  • "The Vista setup installs over 6GB of OS in 20 to 30 minutes and it does that by just unzipping an OS image and sets up drivers and options on top of that base image, ISV’s, IT pros and users are happy with the simple, mostly unattended and fast setup procedure, by contrast the XP setup installs 2GB in 20-30 minutes."

    What is the point of this thinking? That Vista is capable of installing 6GB of mostly useless stuff on your HD, while XP installs only 2GB in the same amount of time? How about if Vista installed the 2GB of data the user needs, and took the 20 minutes for that? Would that be reasonable? Would that be right? I think so.

  • Things to think about for users and developers…

    The thoughts of many users reflected in the posts to this blog seem to skip over the feature trade-offs and the question of how big the performance issues they raise truly are.

    How big are the performance issues?

    The performance issues that ‘still’ exist between Vista and XP, even on a 1GB Pentium system, are quite small, and the performance ‘increases’ are often overlooked. In getting your daily work done, Vista’s performance is better, even if you want to point to a 10-second slower boot.

    A lot of this comes from the ‘artificial’ office performance tests that many news outlets actually used; comparing a ‘scripted’ set of operations is not real performance, and Vista’s user-based optimization systems make these tests even more invalid.

    There is no way for a scripted set of tests to show that, when jumping from Word to CorelDraw in the middle of a thought, Corel will load significantly faster and both it and Word will be far more usable as the user switches his/her work between them and other applications. (Just ending the NT behavior, last present in XP, of paging out an application's memory when it is minimized is enough of a reason to move to Vista, so that memory is not shoved to virtual memory just because a user minimized their current application.)

    How do the features affect performance and are they worth it?

    Someone mentions the defragmenter of Vista as being slower than tools like Contig.exe, but when Vista’s defragmenter for the first time puts files and applications in disk locations that allow them to load faster and also manages free space and other aspects that generic defrag tools don’t manage, is the defragmenter in Vista really so bad, especially when it usually runs when the user isn’t sitting at the computer?

    What about data security as a feature? The additional journaling features of Vista add a slight overhead (although it is not really noticeable in practice); is this not worth the trade-off?

    What about people that complain about the HD light staying on, even though there is no performance loss while this is happening? What if the HD light is staying on because of the background CHKDSK features that ensure the volume is sound, fix problems on the HD, and even keep a failing HD alive by recognizing bad blocks and pre-emptively managing them?

    Someone even mentions Windows Media Player as being slower. However, Windows Media Player in Vista is known for offering the best experience in viewing video/audio content when it comes to stream latency and ensuring the lips and words always match. Is the extra second it takes to load worth surpassing everyone else in sync features? There are also the expanded library features that allow your content to stream to your Xbox or other PCs; is this not worth whatever loss in performance the user notices?

    This is also a debate users seem to ‘keep’ having about ‘themes’ and ‘Glass’ in both XP and Vista. In XP the theme overhead was so tiny that even on a 200MHz Pentium system with 80MB of RAM, our engineers could not find a test that showed any measurable performance loss with it on. Yet people today still talk about the ‘performance loss’ of themes in XP. (Even in RDP sessions, we couldn’t measure enough of a difference to have our server turn it off for TS for our clients.)

    Aero and Glass again are a ‘performance’ argument, but in truth, leaving Glass On results in better performance, even for business users. The only ‘slow down’ we have seen is on a bad driver on a 5200 FX series card that ½ clocked the GPU, and then turning off the transparency was enough to restore any performance loss. (Although fixing the driver to run at native speeds worked even better.)

    Aero and Glass are ‘pretty’, but other than the few ‘MB’ of RAM DWM consumes, is it really slower? The answer is no.

    People assume the Aero ‘composer’ in Vista works like composers of the past and the composers in other OSes like OS X and even Linux’s KDE4. It doesn’t; the memory footprint is very light, as it uses GPU RAM efficiently and, even when that is not enough, shares out system RAM, due to the beauty of the WDDM. It also works more closely with WPF applications, acting more like a true vector-based composer for performance and a low footprint on resources.

    For everyone reading this post, take a few minutes to learn about Aero and Glass and ‘learn’ that turning it off will reduce system performance. And this is from the basic window redraw and tearing all the way to how the Vista composer uses 3D GPU features for font rendering and bitmap decompression and even a few GDI calls.

    Gaming…

    While we are talking features and performance, let’s talk gaming, as this seems to be the best example of stretching both the underlying hardware and the OS the game is running on. Even when Vista was released with ‘new’ drivers from NVidia and ATI, the performance averaged 15% less than XP. On a game that runs at 30fps, that is only about a 5 fps loss.

    Sure, for gamers 5 fps is a lot, but not much considering the new WDDM and the graphics model ATI and NVidia had to meet with their drivers. Move forward to June 07 and this loss started to disappear in many games, and by September of 07 the maturity of the ATI and NVidia drivers finally allowed Vista to ‘keep up’ with XP and start surpassing it.

    Fast forward to July 08, and Vista, especially Vista x64, is outperforming XP in most games, in some by as much as 20%. This also doesn’t account for the faster load times or faster dynamic content loading Vista users were getting with less HD activity. Even MMOs that have zones or dynamic content stopped stuttering to load content and could not only zone users in faster, but were noticeably smoother when the player was entering a new area and loading new content. And this is even in older games like CoX or SWG or WoW.

    As for features for games, sure, SuperFetch in Vista helps by pre-loading content from massive games so users get a more fluid experience. However, there are other features Vista brings to gamers that a lot of people never realize or even try.

    For example, the nature of the WDDM in Vista gives Vista the ability to manage the video card, and this brings almost a fully pre-emptive GPU experience if you are running several games at once, or are running a game in a window ‘with the Aero Glass’ also running perfectly alongside it. Some games even run faster in a window on the Vista desktop than in full screen, because the game is working with the Aero composer through a shared-texture write method. Try this yourself: bring up your game’s FPS counter and watch it barely even move while doing Flip3D. Even load another game and watch as both games keep rather high FPS running on the screen at the same time, thanks to Vista’s control over the GPU threads and the WDDM.

    The next feature often missed when gaming on Vista is the WDDM’s shared memory ability. Sure, this allows the OS to utilize more RAM than your GPU’s VRAM has available, but even on a high-end system this gives users a few more options in a game. For example, in a game where you can change the texture sizes used, if you have a 128MB or 256MB video card, you can turn up your in-game texture sizes to levels that normally only a 512MB card could handle.

    This is because Vista presents part of the system RAM as VRAM and does direct writing via old AGP write technologies. So Vista can identify high performance texture needs, ensure they are in the fastest place possible, and textures that don’t need super performance can reside in system ‘virtualized VRAM’, leaving room for more ‘high end’ textures in your game, without a performance loss.  

    So with shared RAM and Vista’s WDDM, gamers can not only ‘now’ get better performance than XP, but better performance running in a Window, and also use higher level textures in the game than is possible on XP.

    ----

    These are just a few things to think about, and hopefully get people to pay attention to how features and performance not only contrast each other, but sometimes work together in ‘strange’ ways to give both more features and better performance.

    The Net Avenger

    PS – I do not work for Microsoft or with Microsoft in any way, and have no personal or professional interest in Vista or its success. I am a consultant, researcher, and teacher who finds misinformation about technology dangerous as it becomes more politically motivated rather than assessed based on fact.

  • As a developer, I am quite dissatisfied with Windows Vista. Windows 7 should be very fast, not a resource hog like Vista. No eye candy should be installed by default on Business editions of Windows 7. And please give us more reasons to upgrade, more handy features and innovations. And just keep about three editions of Windows 7--Home Premium, Business and Ultimate. And for those who want eye candy (Home Premium and Ultimate), please make more eye-popping animations and stunning graphics for Windows 7 (let it be an OS X killer). We would love to see Office 2007's ribbon integrated in the Windows 7 explorer; that's what I'm expecting. Hopefully Windows 7 will have all those features and many new ones too. Thanks

  • To add to the mix here are the most important points I'd like to see addressed:

    1) A universal updater. As a previous poster touched upon, every application these days seems to have its own updater to automatically download and install bug fixes, new features, etc. The OS should provide a simple mechanism for this, similar to the one included with some Linux distros such as Ubuntu. This would prevent a lot of annoying popups and system tray balloons as well as developers having to 'reinvent the wheel' (a rough sketch of the idea follows this list).

    2) Several previous posters have mentioned there are many MS applications that do the same thing; I agree this should be sorted out. Is there no-one responsible for looking at the 'big picture'? My biggest annoyance in this area is with calendars and contacts. Vista provides perfectly good calendar and contacts functionality but I can't sync this with my PocketPC (which is running a version of Windows); instead I have to install Office which provides another calendar and contacts which will sync with my PPC. Not everyone has/wants to install Office you know!

    3) GUI consistency - Vista looks disjointed because each part uses different GUI paradigms. Again it seems there was no-one looking at the 'big picture'. How can 3rd party developers be expected to produce applications for Vista with a consistent GUI when MS can't even manage this with the OS itself?

    4) Ditch the registry. I realise this would be a huge amount of work but as others have said would help alleviate the gradual system slow-down over time.
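    On point 1, here is a hypothetical sketch of what a shared update check could look like: each application publishes a small version manifest and a single OS-level service polls them, instead of every app shipping its own background updater. The manifest URL and JSON fields are invented for illustration; no such service actually exists.

```python
# Hypothetical sketch of a "universal updater" check: one service compares
# installed versions against per-application version manifests.
# The URL and JSON fields below are invented purely for illustration.
import json
import urllib.request

INSTALLED = {"ExampleApp": "1.2.0"}   # illustrative record of what is installed

def parse_version(text):
    return tuple(int(part) for part in text.split("."))

def check_for_update(app_name, manifest_url):
    with urllib.request.urlopen(manifest_url) as response:
        manifest = json.load(response)    # e.g. {"latest": "1.3.0", "url": "..."}
    if parse_version(manifest["latest"]) > parse_version(INSTALLED[app_name]):
        return manifest["latest"], manifest["url"]   # the service would download here
    return None

if __name__ == "__main__":
    # Hypothetical endpoint used only to show the shape of the call.
    print(check_for_update("ExampleApp", "https://updates.example.com/exampleapp.json"))
```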

  • I have spent much time discussing performance with users of all stripes - love Windows, hate Windows, indifferent to it. Strongly technical, wannabe technical, nontechnical.

    My feeling is that the biggest problem around performance is that people don't really understand how to put it into perspective. Windows actually has a wonderful perf measuring tool (PerfMon), but few people - even technical ones - take the time to learn and understand it. It's a deep topic.

    So I (humbly!) suggest that you set aside a few people to rethink performance visualization and explanation. The Experience Index is too simplistic; PerfMon too complex. TaskMan is often misinterpreted (especially the memory numbers; few people understand the concept of backing store). Maybe some middle ground? I'd love to see some sort of pie graph, or group of pie graphs, which allow a middle-of-the road user to instantly spot which programs are using the most cpu/ram/disk/network. Importantly, these graphs need to be able to show all programs, not just Windows and its services.

    In many 'Windows slow' complaints, the real slowdown is actually some third-party program. Users need to be able to see and quickly understand where the hogs are (a rough sketch of such a view follows below). Over and over we hear people talking about 'bit rot' and 'stuffed registry'; these are mainly illusory pictures painted by people who aren't able to tell what's *really* going on. Give them the tools, and less-technical overview docs, and they'll start to have a better understanding of reality.
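    As a rough sketch of the kind of middle-ground view described above, the snippet below lists the processes currently using the most CPU and memory; it assumes the third-party psutil package and is of course far simpler than a real Task Manager or PerfMon replacement.

```python
# Sketch of a middle-ground resource view: show which processes are using the
# most CPU and memory right now. Assumes the third-party "psutil" package; a
# real tool would also show per-process disk and network and explain the numbers.
import time
import psutil

def top_processes(count=5, sample_seconds=1.0):
    procs = list(psutil.process_iter(["name"]))
    for proc in procs:
        try:
            proc.cpu_percent(None)        # prime the per-process CPU counters
        except psutil.Error:
            pass
    time.sleep(sample_seconds)            # sample usage over a short interval
    rows = []
    for proc in procs:
        try:
            rows.append((proc.cpu_percent(None), proc.memory_info().rss, proc.info["name"]))
        except psutil.Error:
            continue                      # process exited or access was denied
    return sorted(rows, key=lambda r: (r[0], r[1]), reverse=True)[:count]

if __name__ == "__main__":
    for cpu, rss, name in top_processes():
        print(f"{name:30s} cpu={cpu:5.1f}%  working set={rss / 2**20:7.1f} MB")
```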

    Additionally, I was very sad to see BootVis go away. We need that (or something like it - maybe with those piegraph views) back again. We also need a ShutdownVis, so we can see what's causing those 5 minute shutdowns. Startup and shutdown are, as you say, very important to the overall performance perception.

    Thanks for hosting the dialog, Mr. Sinofsky. It's valuable!

  • Nice post, TheNetAvenger! GREAT!
