Geeking out with Windows Deployment Services

(Stick with me on this one – it will take a while but I will bring this back to virtualization at the end of it all.  Oh – and be warned – this is a very long post.)

This weekend I had a challenge to solve:

A while ago I bought myself one of the original EEEPC netbooks (with teeny tiny screen and only a 4GB flash drive).  I have had lots of fun hacking the hardware on this system (added Bluetooth, internal SD card) and hacking software on it as well (installing Vista and then Windows 7).  But now I want to put Windows XP on it so my wife can use it for web browsing / instant messaging.

The problem is that I recently “broke” (as in “pulled apart to use the pieces in other things”) my only USB CD drive.  After a couple of failed attempts at installing Windows XP off of a USB flash stick (I am sure that I got this to work in the past) I could think of two options:

  1. Go out and spend $50 on a new USB CDROM drive.
  2. Spend the whole weekend setting up a Windows Deployment Services server to enable network deployment of operating systems in my house.

I should think that the choice is obvious!

I created a new virtual machine on my Hyper-V server, installed Windows Server 2008 R2 Enterprise Edition, enabled Windows Deployment Services – and fell in love!  Windows Deployment Services is surprisingly easy to set up and very powerful.  I am now kicking myself for not doing this ages ago!

Before I get too far into this post – let me go over some core concepts:

With Windows Vista / Windows Server 2008 and later, Microsoft has moved to a new OS installation model (actually – this model was enabled for Enterprise customers with Windows XP and Windows Server 2003 – but Vista made it mainstream).  This model is one where we use Windows to install Windows.  Specifically, we use a special lightweight version of Windows to install the full version of Windows.

This lightweight version is WinPE (the Windows Preinstallation Environment).  When you boot a Windows Vista or later install DVD you are booting into a version of WinPE.  From there you configure your physical hard disk and lay down the full installation of Windows.

If you look in the “Sources” directory on an installation DVD you will see two .WIM files.  BOOT.WIM is the WinPE image that is booted off of the DVD.  INSTALL.WIM is the Windows image file that is applied to your computer to install Windows.

This concept of having a boot image and an install image is key to understanding how Windows Deployment Services works.
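
As a quick illustration, here is how you can poke at those two files from a machine with the WAIK installed (the drive letter is illustrative – adjust D: to wherever the DVD is mounted):

```shell
rem List the two images shipped in the Sources directory of a Windows DVD
dir D:\sources\*.wim

rem Show the editions stored inside the install image - a single .WIM file
rem can hold several Windows editions side by side
imagex /info D:\sources\install.wim
```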

Now, back to the story.  Before I tackled the “XP on my netbook problem” I thought I would try getting generic “Windows 7 installed over the network” going – as I figured this would be the easier problem to solve.  Was it ever! 

After enabling Windows Deployment Services – I pretty much just put my Windows 7 DVD in the virtual machine and said “use the bits off of that”.  Windows Deployment Services:

  • Copied the appropriate files off of the installation DVD
  • Automatically configured boot images and install images
  • Automatically configured boot menus
  • Automatically configured my DHCP server to point to it for network boot (keep in mind that I am running in a Windows Server 2008 R2 domain with a Windows DHCP server)

All-in-all it took me about an hour to get from “blank virtual machine with no idea what to do” to “kicking off my first network based installation of Windows 7”.
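
For the curious – the same setup can be sketched from the command line with WDSUTIL.  This is a rough equivalent of what the console wizard did for me; the paths and image group name are illustrative:

```shell
rem Initialize the server with a folder to hold boot / install images
wdsutil /Initialize-Server /RemInst:C:\RemoteInstall

rem Import the boot and install images straight off the DVD (mounted as D:)
wdsutil /Add-Image /ImageFile:D:\sources\boot.wim /ImageType:Boot
wdsutil /Add-Image /ImageFile:D:\sources\install.wim /ImageType:Install /ImageGroup:"Windows 7"
```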

I was immediately overwhelmed by the coolness of all of this, got completely distracted from my original task, and spent the next hour getting 32-bit and 64-bit versions of Windows 7, Windows Server 2008 R2, Windows Vista and Windows Server 2008 loaded onto my Windows Deployment Services server.  I am never going to need to look for a DVD again (heck! I am never going to need to locate an .ISO file again!).

Side note – you only need one 32-bit boot image and one 64-bit boot image for any number of install images.  In my case I am using the Windows 7 boot images for all of the above versions of Windows.  One nice benefit of doing this is that since Windows 7 has the Hyper-V integration services built in, I get integrated mouse support no matter what operating system I am installing.

Once I got this all up and running, I turned my attention back to the problem of getting Windows XP deployed with Windows Deployment Services.  It turns out that with Windows XP and Windows Server 2003 the process is the same as if you want to build a Windows Vista or later deployment with your own customizations.  The process looks like this:

  1. Install the version of Windows you want to use on a computer (or in my case – a virtual machine :-).
  2. Apply patches and updates, install applications and perform any customization you want to.
  3. Use Sysprep to prepare the operating system to be redeployed on another physical computer (or virtual machine).
  4. Capture the system in a .WIM file that can then be used as an installation image by Windows Deployment Services.
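
For a Windows Vista or later system, step 3 can be sketched like this (the path is the in-box default; note that Windows XP uses a different Sysprep tool with different switches):

```shell
rem Generalize a Windows Vista / 7 system so it can be redeployed elsewhere
rem (Sysprep ships in-box at this location on Vista and later)
c:\windows\system32\sysprep\sysprep.exe /generalize /oobe /shutdown

rem On Windows XP, Sysprep comes from DEPLOY.CAB on the install CD instead:
rem    sysprep.exe -mini -reseal -shutdown
```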

For this last step there are two options for how to create the .WIM file:

  1. The first option is to use ImageX.exe to make the image, then copy it up to the Windows Deployment Services server and manually register the image on the server.
  2. The second option is to create a capture image on the Windows Deployment Services server (this is as simple as right-clicking on one of your boot images and selecting to create a capture image from it).  Capture images allow you to network boot a Sysprep’d system and have it be automatically captured and uploaded to your server.

I tried this all out – and it worked pretty much just as advertised.
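
The first option can be sketched roughly like this (the image name, paths and image group are all illustrative):

```shell
rem After booting the Sysprep'd machine into WinPE, capture the system drive
rem into a .WIM file (fast compression keeps the capture time down)
imagex /capture C: C:\custom.wim "Windows XP custom" /compress fast

rem Then copy custom.wim over to the Windows Deployment Services server
rem and register it there as an install image:
wdsutil /Add-Image /ImageFile:C:\custom.wim /ImageType:Install /ImageGroup:"Windows XP"
```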

So how does this all tie back to virtualization?

Well, to start with – there is no way I would ever have dedicated an entire physical server to being a Windows Deployment Services server in my house.  Sorry – I just do not have enough systems for that (and I am the only user).  In fact, I have only a single physical server in my house.  It currently runs:

  • Hyper-V (of course)
  • Active Directory
  • DNS
  • DHCP
  • An FTP server
  • Windows Home Server
  • System Center Virtual Machine Manager 2008 R2
  • Windows Deployment Services

As well as a desktop virtual machine, a Windows Deployment Services staging virtual machine, a Windows Deployment Services deployment test virtual machine and a spare installation of Windows Server 2008 R2 for trying random server software on.

But the real benefit of virtualization when using Windows Deployment Services is using a virtual machine to stage your custom installation images.  Apart from the obvious hardware savings – there is a workflow benefit.  If I used a physical computer for staging custom installations my workflow would look like this:

  • Install Windows
  • Apply patches / updates, install applications, apply customizations
  • Sysprep the system
  • Capture the system to the Windows Deployment Services Server

So far so good – but what about a month from now when I want to update my custom image to have the latest patches / updates?  Well – I would need to redeploy my image, and repeat the whole process.

By using a virtual machine, I am able to take a virtual machine snapshot before I Sysprep the system.  Then once the system has been captured by Windows Deployment Services – I apply the snapshot, delete the snapshot and turn the virtual machine off.  It is now in the state that it was in just before I Sysprep’d it.  I can safely leave the virtual machine off – and in a month’s time I can boot it up, update it, snapshot it, Sysprep it, capture it, apply the snapshot and turn the virtual machine off again.  This makes keeping my custom images up-to-date a whole lot easier and quicker than if I was using physical hardware.

Having got my Windows Deployment Services server up and running – there are a couple of advanced configurations that I have set up:

  • By installing the WAIK (Windows Automated Installation Kit) on my Windows Deployment Services server, I have been able to set up a generic WinPE image as a boot image.  This allows me to easily load WinPE in any of my virtual machines / physical computers to do data recovery / system fix-up on any system that I accidentally break.
  • By following these directions I have made Windows Recovery Environment images that I can also use to boot any virtual machines / physical computers to try and fix any problems that occur (unfortunately I have already had to use this once – fortunately it worked perfectly and the system is fully recovered).
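
The first bullet can be sketched roughly as follows, using the Deployment Tools Command Prompt that the WAIK installs (the directory name is illustrative):

```shell
rem Build a generic 32-bit WinPE working directory with the WAIK's copype script
copype.cmd x86 C:\winpe_x86

rem Register the resulting WinPE image as a boot image on the WDS server:
wdsutil /Add-Image /ImageFile:C:\winpe_x86\winpe.wim /ImageType:Boot
```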

Two more configurations that I would like to try are:

  • Setting up Windows Deployment Services to be able to deploy systems that are configured for native virtual hard disk booting (per these instructions).
  • Setting up Windows Deployment Services to be able to deploy Linux.

Fun times!


Comments
  • I agree with you about the huge power of WDS, there is a tool out there to leverage WDS and take all the manual steps you describe above out of the process.  It's a tool called OpsQuick from visionapp.

    It will work with your existing WDS site and be able to inject Sysprep settings and start deployments all from within a single console.

    Have a look :)

    Oh yeah the real kicker is, it's licensed per admin rather than per node.


  • I've been doing something similar with MDT 2010 RC running on Hyper-V VM with a bunch of lab "PC's" running XP and no-OS.  It gives you a lot more with a little more effort.

    Within minutes of completing the install I was deploying Windows 7 onto clean PC's and onto XP machines (while capturing/restoring the userstate) and dropping the drivers into place, i.e. the 2008 R2 integration components.

    And there's no need to learn the syntax of DISM *phew* to sort out driver issues.

    With a tiny bit of work I had a Light Touch process that asked me for credentials at the start and gave me a fully configured build not long after with no more interaction.

    MDT 2010 is a scary idea: BDD was a documentation maze.  But it looks to me like the deployment guys put some effort into giving us scenario based step-by-step documentation.

    We'll be using this MDT server VM in the local Windows 7/Server 2008 R2 launch events kicking off in a few weeks.  And I'll probably do a step-by-step webcast on using MDT next month following some other ones I'm doing for the local Windows user group.

  • Can you sysprep and generalize an XP image using WAIK and deploy it in less time similar to Vista? Or rather put this way, is generalization supported on an XP sysprepped image using WAIK?

  • Anonymous -

    You use standard Sysprep to prepare XP - so nothing changes there.



  • Thanks for this Ben. Staging custom installation images in virtual machines is a great idea. I've been using WDS in a Hyper-V VM and agree with your general assessment. It's excellent. Unfortunately I misunderstood the WDS support for deploying VHDs to mean that WDS would deploy a VHD to hardware as though it were a WIM. When I figured out this was actually just WDS support for deploying native boot VHDs I gave up on the idea of virtualising my custom image and captured a physical build. I completely overlooked this approach. I'll be giving it another go now. Your blog has been a great help to me over the last week.

    One other quick question. You mentioned that you might do a follow-up post about Hyper-V desktop performance work-arounds. Is there anything to add to these? http

    Oh, and do you think we might ever see dual monitor support in the VGA.sys driver, or if there is any other way to achieve that? I would have posted in that thread but comments are now closed.


