How to redistribute the Visual C++ Libraries with your application

Hello again, this is Ben Anderson, SDET on the Visual C++ libraries team. 

One of the most common questions we get from customers on the forums and elsewhere is “My app needs the Visual C++ libraries (CRT, ATL, MFC, OpenMP or some combination thereof) – how do I get them onto my customers’ machines?”  It’s also something we see done in the wild fairly frequently in ways that are, if not incorrect, at least non-optimal.  The help documentation in MSDN is correct, but there is no one-stop-shopping explanation of all your options.  This blog post attempts to explain what to do.  (In case you’re looking for the short answer: almost always, the correct thing to do is to add the Visual C++ redistributable MSMs, or “Merge Modules”, for the libraries you use to your application’s setup.)  I’ve outlined below the various methods of redistributing the Visual C++ library DLLs based on what your deployment story may be.

In most cases, folks deploy their applications using a standard Windows setup.  In these cases, you probably build an .msi file using some toolset (such as a Visual Studio setup project), which is then wrapped in an .exe file by your tool chain.  End users run this .exe file and your application is installed.  If you don’t already have a setup for your application, it’s very easy to create one using a Visual Studio setup project (right-click your solution, then click Add -> New Project… -> Other Project Types -> Setup Project).  You can then right-click your setup project, click Add -> Project Output…, and select Primary Output.  From there you can add Start Menu items and tweak the setup to meet your needs.

In order to redistribute the Visual C++ libraries, all you need to do is include the appropriate .MSM file and its accompanying policy .MSM for each library you need.  If you are creating a setup project as part of your solution as described above, Visual Studio will attempt to detect which libraries you depend on and will add the MSMs as appropriate.  If you are creating your setup project with another tool, or not using the “Add -> Project Output…” option, you will have to add the MSMs for any libraries you need manually.  These merge modules are found in “%ProgramFiles(x86)%\Common Files\Merge Modules”.  For example, on my VS 2005 SP1 system, if I had an x86 MFC app, I would add the following files as merge modules to my setup project:

  1. “C:\Program Files (x86)\Common Files\Merge Modules\Microsoft_VC80_CRT_x86.msm”
  2. “C:\Program Files (x86)\Common Files\Merge Modules\Microsoft_VC80_MFC_x86.msm”
  3. “C:\Program Files (x86)\Common Files\Merge Modules\policy_8_0_Microsoft_VC80_CRT_x86.msm”, and
  4. “C:\Program Files (x86)\Common Files\Merge Modules\policy_8_0_Microsoft_VC80_MFC_x86.msm”

These files are then consumed by your setup tool, and their contents are dropped as part of your MSI onto your users’ systems.  They contain components which install the DLLs and the redirection policies [see footnote 1] for the libraries you select into the Windows Side-by-Side (WinSxS) store.  These components are ref-counted, so every time an app using these MSMs installs, the ref count is incremented, and every time one of these apps uninstalls, it is decremented.  Once the ref count hits zero, the DLLs and policy are uninstalled. 

There are a few cases in which MSM installation may not work for you.  For example, you may need to deploy your app on systems where users have no administrator privileges and so cannot run a setup.  There may also be some other reason you cannot use an MSI to install your application – for instance, users may run your binaries directly from a network share. 

In these cases, you can do an “app-local” deployment, which is sometimes called deploying the DLLs as “private assemblies”.  All you need to do in this case is provide a copy of the DLLs you need, along with their accompanying manifest, in the same directory as every .exe, .dll or .ocx file in your application.  To deploy this way, simply copy the entire contents of the appropriate folder under <Visual Studio install dir>\VC\redist [see footnote 2] into every folder that contains binaries which use those libraries. 
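
For illustration, here is a hypothetical layout for an x86 app that uses only the CRT (MyApp.exe, Helper.exe and the Tools subfolder are made-up names; the manifest and DLL names are simply the contents of the redist x86 Microsoft.VC80.CRT folder):

    MyApp\
        MyApp.exe
        Microsoft.VC80.CRT.manifest
        msvcm80.dll
        msvcp80.dll
        msvcr80.dll
        Tools\
            Helper.exe
            Microsoft.VC80.CRT.manifest
            msvcm80.dll
            msvcp80.dll
            msvcr80.dll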

The advantage of this approach is that you do not need to create an install for your application.  This means you can deploy and run without requiring your users to elevate to administrator privileges.  All your users need to do is copy your application folder onto their systems or run your .exe directly from its current location.  The disadvantage is that you must put a separate copy of the libraries you need in every single directory in which your binaries reside.  For a simple application, this may not be a problem, but for a large app which might have many subdirectories with many tools and DLLs, this is a lot of file duplication. 

Finally, there is one additional scenario for redistributing the Visual C++ library DLLs: ClickOnce deployment.  In this case, ClickOnce will use a custom-built installer package called “VCRedist_<arch>.exe” to install the libraries for you.  DO NOT use the VCRedist_<arch>.exe installer packages for any other purpose. 

The VCRedist packages are simply MSIs built by consuming all of the MSMs from “%ProgramFiles(x86)%\Common Files\Merge Modules”, plus the MSDIA DLL (used for debugging).  However, MSIs are not ref-counted the way the components in the MSMs are, so if you install VCRedist you can never safely uninstall it, because you do not know who else might be using it in addition to your app.  Further, your users cannot uninstall it because they do not know which of their applications may be using it.  Additionally, your users may not realize what it is when they see the entry in Add/Remove Programs.  Imagine a user trying to free up space on their machine, seeing an entry for VCRedist which they do not recognize, uninstalling it, then some time later (maybe months) trying your application again.  It will not work!  Your user will probably not connect the failure with having uninstalled VCRedist at some point in the past, and will either be left broken without a fix or use your support center’s time trying to find out why your app stopped working.  What’s more, it’s very likely that you are not using every single Visual C++ library, so installing the whole of VCRedist is unnecessary.  Alternatively, a poorly written installer for another application which used VCRedist to redistribute the Visual C++ libraries may (incorrectly) uninstall VCRedist when that app uninstalls. 

If for some reason you cannot incorporate the MSMs into the MSI which installs your application, a better option is to use Visual Studio or another tool to build a tiny MSI installing just the MSMs, and only those you require.  Since this MSI is unique to your product and can be named whatever you like, you can uninstall it when your application is removed, and you can name it in such a way that your users recognize it as part of your application and will not uninstall it inappropriately (name it, say, “MyApp Prerequisites”).  By using your own MSI, you also guarantee that no other application which uses the VCRedist package will interfere with your app by incorrectly uninstalling it during that app’s uninstallation.

Again, just to emphasize – do not use VCRedist*.exe unless you are using Click Once to deploy your application.

In addition to all the methods described above for distributing the Visual C++ library DLLs, there is one last option for building your application which does not require you to distribute the DLLs at all: statically linking the libraries in as .lib files instead of dynamically loading them as DLLs.  However, this option only works for native code (it is not supported with /clr), leaves your customers seriously vulnerable to any security holes, and puts a significant burden on you to patch all customer systems should a vulnerability be found in any of the libraries.  You select static linking by using the /MT flag on the cl.exe command line (versus /MD for the DLL versions), or by choosing the corresponding option in your project properties in Visual Studio.  You may wish to use this option when testing early debug builds of your application on test machines before you start working on setup. [See footnote 3] 
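
As a minimal sketch (hello.cpp is just a made-up example; any trivial program will do), the same source builds either way from a Visual Studio 2005 command prompt:

    // hello.cpp
    #include <cstdio>

    int main()
    {
        std::printf("hello\n");
        return 0;
    }

    // Dynamic CRT: hello.exe depends on msvcr80.dll, which you redistribute
    // via the MSMs (or app-local copies) described above and which Windows
    // Update can service centrally:
    //     cl /EHsc /MD hello.cpp
    //
    // Static CRT: the library code is copied into hello.exe itself, so the
    // only way to ship a fixed CRT is to rebuild and redeploy hello.exe:
    //     cl /EHsc /MT hello.cpp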

However, I can think of no scenarios in which this is actually the right thing to do when shipping your product to customers.  Basically, what this approach does is pull the binary code it needs out of the .LIB files at compile time and make it part of your .exe or .dll files.  It increases the size of your application, and there is no way to update the libraries apart from recompiling your application against new .LIBs and redistributing it all over again.  This means that unless you touch every single machine which has installed your application every time a security vulnerability is found in the Visual C++ libraries, and completely reinstall your updated binaries, you will be leaving your customers open to attack.  If instead you use the DLLs, every time a security vulnerability is found in the Visual C++ libraries, Microsoft will install the update centrally into the WinSxS folder via Windows Update and all requests for the DLLs will be redirected to the updated version.  This removes the servicing burden from you, and also lets the user install one small update which covers all their applications instead of replacing every installed .exe and .dll on their system.  Please do not distribute an application built by linking statically against the Visual C++ libraries unless you have a system in place for updating every customer machine and also have a very good reason to do so.  At this time, I can think of no circumstance under which this would be the right thing to do for a shipping application.

Well, hopefully this article has helped you understand how to redistribute the Visual C++ libraries onto your customers’ machines.  If you have additional questions, you can find the documentation for deploying applications built with Visual C++ here:

http://msdn2.microsoft.com/en-us/library/zebw5zk9(VS.80).aspx

 If you still have questions, you can post comments here (I will check back for a few weeks), or you can post your question in the Visual C++ forums here:

http://forums.microsoft.com/MSDN/ShowForum.aspx?ForumID=29&SiteID=1

 Thanks,
Ben Anderson
Visual C++ Libraries Team


[1] The redirection policy always redirects requests for the Visual C++ DLLs to the latest installed versions, even if the application requesting the DLLs has used “app-local” deployment to drop the DLLs as private assemblies – this way, if a security issue comes along, Windows Update can drop fixed DLLs into WinSxS and all affected applications will be fixed.  The Visual C++ team maintains a strong binary compatibility guarantee: applications built against an earlier version of the libraries will work against all later versions, with few exceptions (exploitable usage may be broken to prevent customer machines from being hacked).

[2] Please note that the files in this directory are not updated in QFE patches, and some of the manifest files in this directory were not updated as part of SP1 of Visual Studio 2005.  As a workaround, you can find the appropriate version of the files in the WinSxS directory of your Visual Studio development box by typing “c:\windows\winsxs> dir *VC80*”, identifying the correct directory based on version numbers, then copying the contents of that directory into your application directories instead.

[3] A better option would be to create a simple setup project and include all the Visual C++ MSMs and install this on all your target machines. 

  • Sorry it's been a while, but as I said responses here will get thin now that it's been more than a few weeks.  I recommend checking out the MSDN forums for quicker answers.

    Anyway, I'll hit what I can.  If I miss something, bug me again (maybe post short comments with hints about what I missed;):

    Nick:  We will indeed drop the updated CRT on every machine via Windows Update in the case of a security event.  The security fix I worked on a while back was actually not an exploited issue, but an internally reported issue where a specific use of a rarely used function could lead to a garbage function pointer.  Basically it was an issue that no one would probably ever hit, and that probably no one has even written exploitable code for, so it did not trigger a security event.  Maybe it should have; I'm not sure what the exact policies are.  You're right in your comment, though, that if you want to see the VC8 DLLs on pretty much every machine, one thing you could do would be to report a security issue and get MSRC to issue a patch ;).  We jokingly suggested we code one in to do this and then have someone's mother report it, but of course we would never actually do that.  

    kstreith - it sounds like you might be trying to drop the wrong bits in the x64 case.  I would double check you've got all the manifests correct and you're dropping the right bits for app local.  It should work on every Windows OS XP and above (including 64-bit).  

    Charles - the reason you can't link to msvcrt.dll using 2005 is that msvcrt.dll defines objects and structs differently than 2005 does.  It also uses an older version of the C++ standard.  So, while in theory you could put the 2005 compiler on your path and the VC6 headers and libs on your include and lib paths, the 2005 compiler would likely choke on the VC6 headers.  It might work, I have no idea, but you'd definitely be way out in unsupported land and it would probably be way more trouble than it's worth.  Also, you'd then basically be using VC6 as far as the libraries are concerned.

    I can't discuss planning for VC10, but one of the things we'd like to work on and address is an improved redist story.  It's all early stages now and we haven't prioritized and selected which features will get into VC10 as a whole, but hopefully not too long down the road you'll see a new blog post talking about all the issues that are addressed :).  Some of the things we're thinking about would solve some of the scenarios people here are reporting (in fact, these comments are one of our sources of info about what sorts of things are the most pain).  

    Also, I wholeheartedly agree that 2005 SP1 was a huge beast of a patch and that it was extremely painful to install.  As someone who has to build test machines probably more frequently than anyone pretty much anywhere else, I feel your pain ;).  It will be interesting to see if the situation improves for 2008 SP1.  I guess part of the issue was that the 2005 SP1 patch was huge, but I'm sure there is more work that could have been done to make it more painless.

  • One more:

    Ashish -

    It sounds like you're trying to use .NET somewhere in your component.  I think the installer can self-register your ocx, but if not, regsvr32'ing it with /q should do the trick.

    -Ben

  • First of all, thanks for the explanation of redistribution! Additional documentation on this is more than welcomed.

    As another developer who has switched to static linking whenever possible, I just wanted to point out an annoying thing to watch out for if you statically link moderately complex applications with the MSVCRT:

    Everyone knows not to alloc/dealloc memory and objects across DLL boundaries, especially when using multiple copies of a runtime (e.g. statically linked) or different runtimes.
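
    For instance (a hypothetical sketch; widget.dll, CreateBuffer and FreeBuffer are made-up names):

        // widget.dll is built with /MT, so it carries its own copy of the CRT
        // and therefore its own heap.
        #include <cstdlib>

        extern "C" __declspec(dllexport) char* CreateBuffer(std::size_t n)
        {
            return static_cast<char*>(std::malloc(n));  // allocated on the DLL's CRT heap
        }

        // In the calling .exe, also built with /MT (a second CRT copy):
        //     char* p = CreateBuffer(64);
        //     std::free(p);   // wrong heap -> corruption or a crash
        // Either export a matching FreeBuffer() from widget.dll, or build
        // every module against the shared CRT DLL (/MD) so there is one heap.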

    However, one thing that I've seen very little discussion on is that you should also avoid much of the runtime entirely during callbacks originating from a thread external to your runtime (e.g. DirectShow thread callbacks, third party DLLs, or even your own threaded plugins).  

    One example is the sprintf() family.  If called from an externally created thread, it will allocate locale information for that thread.  Unfortunately, since the thread will be exited from the external module, this will never be cleaned up by the MSVCRT - resulting in a small memory leak per thread that can add up over time.  The same problem exists with the FILE* I/O functions, except you get the added bonus of a leaked critical section as well.  

    These leaks can be avoided by posting messages across thread boundaries (or similar techniques) to ensure the affected functions are only called on module-local threads.  It would be *beautiful* if there were a function to clean this up manually (the allocating code is deep down in the CRT though - near _getptd()).
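
    A minimal sketch of the pattern (OnGraphEvent, OnGraphEventSafe, g_hNotifyWnd and WM_APP_STATUS are made-up names; assume the callbacks run on a thread created by another module, and this module links the CRT statically):

        #include <stdio.h>
        #include <windows.h>

        // Invoked on a thread the external module created; this module's
        // static CRT did not create it and will never see it exit.
        void __stdcall OnGraphEvent(const char* status)
        {
            char buf[128];
            // The first CRT call on this thread allocates per-thread data
            // (locale, errno, ...).  The thread exits in the other module, so
            // the static CRT never frees it -> a small leak per thread.
            sprintf(buf, "graph event: %s", status);
            OutputDebugStringA(buf);
        }

        // Leak-avoiding variant: touch no CRT state on the foreign thread;
        // hand the work to a thread this module owns (the window behind
        // g_hNotifyWnd does the formatting on a module-local thread).
        extern HWND g_hNotifyWnd;
        #define WM_APP_STATUS (WM_APP + 1)

        void __stdcall OnGraphEventSafe(const char* status)
        {
            // assumes the caller keeps 'status' valid until the message is handled
            PostMessageA(g_hNotifyWnd, WM_APP_STATUS, 0, (LPARAM)status);
        }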

    As far as why I've switched to statically linking... I spend a very large percentage of my time tracking down, fixing, or working around bugs in third-party software - including but certainly not limited to Microsoft's.  My recent applications have required as close to 24/7/365 operation as the host OS's will allow, so even extremely minor leaks and rare race conditions are a big problem.  The apps are more industrial than consumer, although I've been less able than an earlier poster to exert much control on locking them down tightly.  

    Bugs I've found and/or reported (e.g. one leak in an ATL hosting container, another in the  IWebBrowser2's use of registry keys for Zones, another in the WMP's COM object, crash-prone race conditions between multiple IWebBrowser2 windows on navigation, etc.) don't seem to hit high priority to be fixed when I face them.

    They have also often been replaced with other problems when subsequent major versions came out.  Granted, these examples aren't in the MSVCRT, but they've made me nervous.

    Once I've fully tested a system of software and fixed or worked around all the issues I could find in the environments it is expected to run in, I like keeping it that way as much as possible.  It'd be particularly brutal to have an app break at multiple customer sites when a patch is pushed out. I'd rather trust the code I've run through thorough tests with myself.  I  wish there was a good (non-license-violating) way to make all other components I might use static as well. Codecs and DMOs, for example, are still a particularly painful problem (why do the new wmvdmo* objects  halt playback prior to the last few frames when accelerated on many modern video cards? That upgrade nailed me).

    Also, at my current workplace, 2 dev machines had silent failures during the install of Visual Studio 2005 and/or Visual Studio 2005 SP1 which caused mismatched merge modules/libs/DLLs. Dynamically-linked apps ran fine on those machines, but watch out when they hit a test machine (thankfully, the problem never hit customer boxes).  Tracking down such errors - especially with the added level of the SxS cache - is a bit painful. Static apps did just fine.

    One last point - size is often mentioned as a reason against statically linking. Aside from the obvious refutation of the size of all of the merge modules vs. the size of a statically linked .exe, I've found that my memory usage is often a bit smaller for statically linked applications.  This may even be true when there are *multiple* modules statically linked.  

    With a 'hello world' WinMain/MsgBox app, a statically linked module uses only about 200K less in memory. However, switching from dynamic to static linking on a large(-ish) Windows CE application lowered my memory footprint by about 2MB.  YMMV here... I still haven't figured out why the difference in that particular case was that big.

    Cheers,

    Mike

  • Thanks a lot :)

  • Michael:

    If you have several processes which all use the CRT (or other libraries), they should be able to share the pages of code they have in common.

    So in that case, dynamic linking will lower memory usage.

  • Recently I was working with one of my customers, who were trying to register a binary which was developed
