How to redistribute the Visual C++ Libraries with your application


Hello again, this is Ben Anderson, SDET on the Visual C++ libraries team. 

One of the most common questions we get from customers on the forums and elsewhere is “My app needs the Visual C++ libraries (CRT, ATL, MFC, OpenMP or some combination thereof) – how do I get them onto my customers’ machines?”  It’s also something we frequently see done in the wild, if not incorrectly, then at least non-optimally.  The help documentation on MSDN is correct, but there is no one-stop explanation of all your options.  This blog post attempts to explain what to do.  (In case you’re looking for the short answer: almost always, the correct way to distribute the Visual C++ libraries is to add the Visual C++ redistributable MSMs, or “merge modules”, for the libraries you use to your application’s setup.)  Below, I’ve outlined the various methods of redistributing the Visual C++ library DLLs based on what your deployment story may be.

In most cases, folks deploy their applications using a standard Windows setup.  You typically build an .msi file with some toolset (such as a Visual Studio setup project), which your tool chain then wraps in an .exe file.  End users run this .exe and your application is installed.  If you don’t already have a setup for your application, it’s very easy to create one using a Visual Studio setup project (right-click your solution, then click Add -> New Project… -> Other Project Types -> Setup Project).  Next, right-click your setup project, click “Add -> Project Output…”, and select Primary Output.  You can then add Start Menu items and tweak your setup to meet your needs.

In order to redistribute the Visual C++ libraries, all you need to do is include the appropriate .MSM file and its accompanying policy .MSM for each library you use.  If you are creating a setup project as part of your solution as described above, Visual Studio will attempt to detect which libraries you depend on and will add the appropriate MSMs.  If you are creating your setup project with another tool, or not using the “Add project output” option, you will have to add the MSMs for any libraries you need manually.  These merge modules are found in “%ProgramFiles(x86)%\Common Files\Merge Modules”.  For example, on my VS 2005 SP1 system, if I had an x86 MFC app, I would add the following files as merge modules to my setup project:

  1. “C:\Program Files (x86)\Common Files\Merge Modules\Microsoft_VC80_CRT_x86.msm”
  2. “C:\Program Files (x86)\Common Files\Merge Modules\Microsoft_VC80_MFC_x86.msm”
  3. “C:\Program Files (x86)\Common Files\Merge Modules\policy_8_0_Microsoft_VC80_CRT_x86.msm”
  4. “C:\Program Files (x86)\Common Files\Merge Modules\policy_8_0_Microsoft_VC80_MFC_x86.msm”

These files are then consumed by your setup tool, and their contents are dropped as part of your MSI onto your users’ systems.  They contain components that install the DLLs and the redirection policies [see footnote 1] for the libraries you select into the Windows Side-by-Side (WinSxS) store.  These components are ref-counted, so every time an application using these MSMs installs, the ref count is incremented, and every time one of those applications uninstalls, it is decremented.  Once the ref count hits zero, the DLLs and policy are uninstalled.
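
If you author your MSI with a tool other than a Visual Studio setup project, consuming the merge modules is still just a matter of referencing the .msm files.  As an illustration only, here is a minimal sketch of what that authoring might look like in WiX v3 (WiX is just one common MSI toolset, not a requirement; the Ids are placeholders and the paths are the VS 2005 SP1 x86 MSMs listed above):

    <!-- Minimal sketch (WiX v3 assumed): merge the VC80 CRT and MFC MSMs,
         plus their policy MSMs, into your own MSI.  Ids are illustrative. -->
    <Directory Id="TARGETDIR" Name="SourceDir">
      <Merge Id="VC80_CRT" Language="0" DiskId="1"
             SourceFile="C:\Program Files (x86)\Common Files\Merge Modules\Microsoft_VC80_CRT_x86.msm" />
      <Merge Id="VC80_MFC" Language="0" DiskId="1"
             SourceFile="C:\Program Files (x86)\Common Files\Merge Modules\Microsoft_VC80_MFC_x86.msm" />
      <Merge Id="VC80_CRT_Policy" Language="0" DiskId="1"
             SourceFile="C:\Program Files (x86)\Common Files\Merge Modules\policy_8_0_Microsoft_VC80_CRT_x86.msm" />
      <Merge Id="VC80_MFC_Policy" Language="0" DiskId="1"
             SourceFile="C:\Program Files (x86)\Common Files\Merge Modules\policy_8_0_Microsoft_VC80_MFC_x86.msm" />
    </Directory>

    <Feature Id="VCRuntimes" Title="Visual C++ Runtimes" Level="1">
      <MergeRef Id="VC80_CRT" />
      <MergeRef Id="VC80_MFC" />
      <MergeRef Id="VC80_CRT_Policy" />
      <MergeRef Id="VC80_MFC_Policy" />
    </Feature>

Whichever toolset you use, the principle is the same: the merge module contents become part of your MSI, and Windows Installer performs the ref counting described above.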

There are a few cases in which MSM installation may not work for you.  For example, you may have to deploy your app on systems where the user has no administrator privileges and so cannot run a setup.  There may also be some other reason you cannot use an MSI to install your application – for instance, users may run your binaries directly from a network share.

In these cases, you can do an “app-local” deployment, which is sometimes called deploying the DLLs as “private assemblies”.  All you need to do is provide a copy of the DLLs you need, and their accompanying manifest, in the same directory as every .exe, .dll or .ocx file in your application.  To deploy this way, simply copy the entire contents of the appropriate folder under <Visual Studio install dir>\VC\redist [see footnote 2] into every folder that contains binaries which use those libraries; a sketch of the resulting layout follows below.
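
As an illustration, here is roughly what an app-local layout looks like for an x86 application that uses only the CRT.  MyApp.exe is a placeholder name, and the manifest below is an abbreviated sketch with an illustrative version number; in practice you should copy the real Microsoft.VC80.CRT.manifest from the redist folder rather than author one by hand, because the exact version and file list matter.

    MyApp\
      MyApp.exe
      msvcr80.dll
      msvcp80.dll
      msvcm80.dll
      Microsoft.VC80.CRT.manifest    (all four copied from VC\redist\x86\Microsoft.VC80.CRT)

    <!-- Abbreviated sketch of Microsoft.VC80.CRT.manifest; version is illustrative -->
    <assembly xmlns="urn:schemas-microsoft-com:asm.v1" manifestVersion="1.0">
      <assemblyIdentity type="win32" name="Microsoft.VC80.CRT"
                        version="8.0.50727.762" processorArchitecture="x86"
                        publicKeyToken="1fc8b3b9a1e18e3b" />
      <file name="msvcr80.dll" />
      <file name="msvcp80.dll" />
      <file name="msvcm80.dll" />
    </assembly>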

The advantage of this approach is that you do not need to create an install for your application.  This means you can deploy and run without requiring your users to elevate to administrator privileges.  All your users need to do is copy your application folder onto their systems or run your .exe directly from its current location.  The disadvantage is that you must put a separate copy of the libraries you need in every single directory in which your binaries reside.  For a simple application, this may not be a problem, but for a large app which might have many subdirectories with many tools and DLLs, this is a lot of file duplication. 

Finally, there is one additional scenario for redistributing the Visual C++ library DLLs: using “Click Once” deployment.  In this case, “Click Once” will use a custom-built installer package called “VCRedist_<arch>.exe” to install the libraries for you.  DO NOT use the VCRedist_<arch>.exe installer packages for any other purpose.

The VCRedist packages are simply MSIs built by consuming all the MSMs from “%ProgramFiles(x86)%\Common Files\Merge Modules” as well as the MSDIA DLL (used for debugging).  However, MSIs are not ref-counted the way the components in the MSMs are, so if you install VCRedist you can never uninstall it, because you do not know who else might be using it in addition to your app.  Nor can your users uninstall it, because they do not know which of their applications may be using it.  Worse, your users may not realize what it is when they see the entry in Add/Remove Programs.  Imagine a user trying to free up space on their machine, seeing the entry for VCRedist which they do not recognize, uninstalling it, then some time later (maybe months) trying your application again.  It will not work!  The user will probably not connect the failure to having uninstalled VCRedist at some point in the past, and will either be left broken without a fix or spend your support center’s time finding out why your app stopped working.  What’s more, it is very likely that you are not using every single Visual C++ library, so installing the whole of VCRedist is unnecessary.  Alternatively, a poorly written installer for another application that used VCRedist to redistribute the Visual C++ libraries may (incorrectly) uninstall VCRedist when that app uninstalls.

If for some reason you cannot incorporate the MSMs into the MSI that installs your application, a better option is to use Visual Studio or another tool to build a tiny MSI that installs just the MSMs you require, and only those.  Since this MSI is unique to your product and can be named whatever you like, you can uninstall it when your application is removed, and you can name it in such a way that your users recognize it as part of your application and will not uninstall it inappropriately (name it, say, “MyApp Prerequisites”).  By using your own MSI, you also guarantee that no other application which uses the VCRedist package will interfere with your app by incorrectly uninstalling it during that app’s uninstallation.  A minimal sketch of such a package appears below.
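
To make that concrete, here is a minimal sketch of what such a “MyApp Prerequisites” package might look like when authored in WiX v3 (again, WiX is just one example toolset; the GUID, names, and the choice of CRT-only MSMs are placeholders – include whichever MSMs your application actually needs):

    <?xml version="1.0" encoding="UTF-8"?>
    <Wix xmlns="http://schemas.microsoft.com/wix/2006/wi">
      <!-- Sketch of a tiny standalone MSI that carries only the merge modules
           your application needs; uninstall it when your application is removed. -->
      <Product Id="*" Name="MyApp Prerequisites" Language="1033"
               Version="1.0.0.0" Manufacturer="My Company"
               UpgradeCode="PUT-A-GUID-HERE">
        <Package InstallerVersion="200" Compressed="yes" />
        <Media Id="1" Cabinet="prereq.cab" EmbedCab="yes" />

        <Directory Id="TARGETDIR" Name="SourceDir">
          <Merge Id="VC80_CRT" Language="0" DiskId="1"
                 SourceFile="C:\Program Files (x86)\Common Files\Merge Modules\Microsoft_VC80_CRT_x86.msm" />
          <Merge Id="VC80_CRT_Policy" Language="0" DiskId="1"
                 SourceFile="C:\Program Files (x86)\Common Files\Merge Modules\policy_8_0_Microsoft_VC80_CRT_x86.msm" />
        </Directory>

        <Feature Id="Prereqs" Title="Visual C++ Runtimes" Level="1">
          <MergeRef Id="VC80_CRT" />
          <MergeRef Id="VC80_CRT_Policy" />
        </Feature>
      </Product>
    </Wix>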

Again, just to emphasize – do not use VCRedist*.exe unless you are using Click Once to deploy your application.

In addition to all the methods of distributing the Visual C++ library DLLs described above, there is one last option for building your application which does not require you to distribute the DLLs at all.  However, this option only works for native-only code (it is not supported with /clr), leaves your customers seriously vulnerable to any security holes, and adds a significant burden on you to patch all customer systems should a vulnerability be found in any of the libraries.  This option is to statically link in the libraries as .lib files instead of dynamically loading them as DLLs.  You do this by using the /MT flag on the cl.exe command line (instead of /MD), or by selecting the appropriate option in your project properties through Visual Studio.  You may wish to use this option when testing early debug builds of your application on test machines before you start working on setup. [See footnote 3]
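
If you use Visual Studio project files rather than the raw command line, the same choice is expressed by the RuntimeLibrary compiler setting.  Below is a minimal sketch of the relevant element from a VS 2005 .vcproj; the surrounding project XML is omitted, and the numeric values shown in the comment are the standard encodings for this setting.

    <!-- Sketch: compiler settings fragment from a VS 2005 .vcproj.
         RuntimeLibrary="2" is /MD (CRT DLL) and "0" is /MT (static CRT);
         the debug equivalents are "3" (/MDd) and "1" (/MTd). -->
    <Tool
        Name="VCCLCompilerTool"
        RuntimeLibrary="2"
    />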

However, I can think of no scenario in which this is actually the right thing to do when shipping your product to customers.  Basically, this approach pulls the needed binary code in from .LIB files at compile time, making it part of your .exe or .dll files.  It increases the size of your application, and there is no way to update the libraries apart from recompiling your application against new .LIBs and redistributing it all over again.  This means that unless you touch every single machine on which your application is installed, every time a security vulnerability is found in the Visual C++ libraries, and completely reinstall your updated binaries, you will be leaving your customers vulnerable to attack.  If instead you use the DLLs, then every time a security vulnerability is found in the Visual C++ libraries, Microsoft will install the update centrally into the WinSxS folder via Windows Update, and all requests for the DLLs will be redirected to the updated version.  This removes the servicing burden from your side and also lets the user install one small update that covers all their applications, instead of replacing every installed .exe and .dll on their system.  Please do not distribute an application built by statically linking against the Visual C++ libraries unless you have a system in place for updating every customer machine and also have a very good reason to do so.  At this time, I can think of no circumstance under which this would be the right thing to do for a shipping application.

Well, hopefully this article has helped you understand how to redistribute the Visual C++ libraries onto your customers’ machines.  If you have additional questions, you can find the documentation for deploying Visual C++-built applications here:

http://msdn2.microsoft.com/en-us/library/zebw5zk9(VS.80).aspx

 If you still have questions, you can post comments here (I will check back for a few weeks), or you can post your question in the Visual C++ forums here:

http://forums.microsoft.com/MSDN/ShowForum.aspx?ForumID=29&SiteID=1

 Thanks,
Ben Anderson
Visual C++ Libraries Team


[1] The redirection policy always redirects requests for the Visual C++ DLLs to the latest installed versions, even if the application requesting the DLLs used “app-local” deployment to drop the DLLs as private assemblies – this way, if a security issue comes along, Windows Update can drop fixed DLLs into WinSxS and all affected applications will be fixed.  The Visual C++ team maintains a strong binary-compatibility guarantee: applications built against an earlier version of the libraries will work against all later versions, with few exceptions (exploitable usage may be broken to prevent customer machines from being hacked).

[2] Please note that the files in this directory are not updated by QFE patches, and some of the manifest files in this directory were not updated as part of SP1 of Visual Studio 2005.  As a workaround, you can find the appropriate version of the files in the WinSxS directory of your Visual Studio development box by running “dir *VC80*” in c:\windows\winsxs, identifying the correct directory based on version numbers, and then copying the contents of that directory into your application directories instead.

[3] A better option would be to create a simple setup project and include all the Visual C++ MSMs and install this on all your target machines. 

  • In response to Dusty:

    While you can't install the MSMs directly, what you can do is build your own setup project which is empty except for consuming the MSMs.  You can call the install "MyApp Prereqs".  This will associate the entry in Add/Remove Programs with your app, and will allow you to uninstall it since you know no one else will depend on it.  Then you can just call that instead of VCRedist in your install process.

  • For some of the reasons already mentioned many times by other readers, I too go for static linking of the CRT; however, OpenMP doesn't support static linking.  Are there any plans to change this?  If not, would the "app-local" scenario work for OpenMP deployment (i.e. vcomp.dll)?

  • App-local does indeed work for vcomp.dll.  It also works for the other DLLs, should you choose to deploy the CRT that way as well.

  • Thank you for your reply :)

    I'll check the internal books to really understand what happens!

    I do see advantages using an "assembly" approach, so you can easily upload fixes via Windows Update.

    But the system has some inconsistencies:

    - If I have an app-local deployment and the shared assemblies are NOT installed, this method doesn't work.  That can happen, especially with new compiler versions; for a long time I didn't have any VC2005 apps installed.

    - If updated shared assemblies are installed, app-local is useless.

    Ok, these are by design. But I wonder if something simpler could be done?

    (or at least provide a fallback to prevent loading errors)

    I will try to reproduce the loading error and file feedback on Connect.

    Good Bye!

  • Ben Anderson wrote:

    "a) more informative error messages on loading errors, b) better accommodate the Win2k app local scenario and c) explore ways of removing the redist burden from the developer."

    Yes, that would be most helpful!!!

    Just two additions that come to my mind:

    d) Better documentation (this should really be a)...)

    e) A tool like "depends.exe" that can tell me - for managed, unmanaged and mixed code - what libraries a given executable would load and from what location (and if possible: why from that location). Today you can get some of that information by using a combination of ProcessExplorer, Depends.exe, ildasm/reflector (and sometimes WinDbg), but it's kind of a puzzle...

    (I will try to reproduce the deployment issue I described and file a bug report as soon as I have time.)

  • "I think redisting via Windows Update is something worth exploring."

    I think this is a much more sane deployment practice.  In this way, you also accomplish your desire to remove the static linking approach to redisting as well.

    It's the best security, deployment, and usability model that could be implemented.

    Otherwise, if that doesn't fly, I think the merge module approach is something that Microsoft's other departments have learned is a Bad Idea (tm) and only causes hardship for both sides of the deal.  Maintenance on those DLLs is miserable: if you need to update the merge modules for any reason, you have to deploy the entire MSI or pray that the modules are authored per small/minor upgrade rules so you can deploy them as a patch.  And if they aren't?  You, as the installation developer, have no recourse :-(

    I think it's worth taking a look at the evolution from the original MSDE 2000 merge-module-style installation to how SQL Server 2005 does it now.  Sure, people complained at first.  But once you understand how you can leverage and take advantage of the functionality provided through the separate redist, you realize how good it is to be able to install it outside of your installation without hardship.

    I really like the direction that the CRTs are moving as far as functionality goes and resolving those pesky security issues.  There's just a little area that I think got kinda overlooked to some extent.  :-)

    I read Heath Stewart's blog every day :)  It's a good one, but I think sometimes it's best to not talk to a setup guy.  Also, he hasn't really touched on the subject much :(

  • We provide two packages: one is simple zip containing the executables and some data files, another is setup created by InnoSetup which copies t

  • Oops, Enter key seems to overreact. I'm starting again :)

    We provide two packages: one is just a zip containing the executables and some data files, the other is a simple setup program created by InnoSetup.  Using MSMs seems like overkill in such a basic scenario.

    Overall, the CRT distribution model is/was the biggest obstacle blocking us from migrating to VC8. Other things, like relative sluggishness of the IDE can be overcome with faster workstations :)

  • For the record, you may have a strong commitment to maintaining binary compatibility, but you don't have a strong track record.

    Over the last twelve months our app was broken twice by binary-incompatible CRT upgrades: one was the SP1 beta CRT, shipped to customers by some unconnected product group (in a non-beta product, as far as I can tell), and one was SP1 itself!

    Let's face it, it's a losing battle.  Give people a way to control the CRT version they link to.  If you don't do it for DLLs, they will link statically.  It's the reasonable thing to do.

    Thomas

  • Hi Thomas - if you were broken, it's probably a bug.  We'd love to see it if you want to file a repro on Connect, or if you need a fix, you can request a QFE so we can see if we can fix the issue for you.  (I assume you're talking about VS2005 DLLs correct?)

    Alternately, if you don't want to go through those processes, you could post the repro or details here and I can take a look and forward it along if necessary.  You probably won't get any action in terms of a fix that way though.

  • Hi Thomas,

    from what I have understood of the whole thing, you can set an Application Config manifest that forces your app to link to a specific version of the CRT (it should override publisher settings).

    Right ?

    :)

    The hands-on-the-baseball-bat order goes {private policy, publisher policy, admin policy}.  Private policy is what’s in your app’s .exe.manifest saying “I want at least version M.”  Publisher policy comes from the component publisher and says “When someone asks for version A through Q, give them version R”; it’s the usual thing you distribute with a QFE/GDR.  Administrator policy says “My line-of-business app is broken with version R, substitute version P.”  It’s called administrator policy because it’s something the local administrator can author into “foo.exe.config”.  Because of the possibility of opting out of security updates, it must appear in the same directory as foo.exe itself.  The implication is that if the admin was too lazy to properly secure foo.exe and its parent directory against writes, it’s irrelevant what “bad” policy can be added: local users can already replace foo.exe itself with something more interesting.

    The key here is that application config is for administrators to create, not for application publishers (even though the name is confusing.)

    Don't ship .exe.config with your app.  Now, your app is free to auto-update itself and pull down .exe.config after install in response to being broken.  This should be done with the consent of the administrator or via a 'knowledge base' article on your site.
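
    To make that concrete, an administrator-authored "foo.exe.config" for the Win32 side-by-side binding of the VC80 CRT looks roughly like the sketch below.  foo.exe and the version numbers are placeholders, and as described above this is a file for the local administrator to create next to the .exe, not something to ship with your app.

        <!-- Sketch of an administrator-authored foo.exe.config (placeholder versions):
             redirect requests that policy would otherwise resolve to version R
             back to a specific known-good version P. -->
        <configuration>
          <windows>
            <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
              <dependentAssembly>
                <assemblyIdentity type="win32" name="Microsoft.VC80.CRT"
                                  processorArchitecture="x86"
                                  publicKeyToken="1fc8b3b9a1e18e3b" />
                <bindingRedirect oldVersion="8.0.50727.762" newVersion="8.0.50608.0" />
              </dependentAssembly>
            </assemblyBinding>
          </windows>
        </configuration>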

  • One of my recent experiments, with a product which might or might not get released this way, has essentially Xcopy deployment and I managed to get some manifests working.  But I don't really need to depend on exactly this version of the CRT and ATL, I just need this version or newer.

    As far as I can tell, if some end user's machine has newer DLLs as a result of their administrator installing some other application that does a full SxS installation, my manifests are going to prevent using the newer ones.  If the newer ones have security fixes then this almost surely isn't what I want.  But I still need my manifests because my app can be Xcopied by a non-admin for personal use without doing an install.

  • Norman - as long as you only use manifests, not .configs (which, as Jon describes, would be bad), then if newer DLLs get installed to WinSxS, your app will still query the system saying "I want version X", and the policy installed into WinSxS along with the new DLLs will redirect your app's request to the newer versions.  So the behavior you describe as optimal will still occur and you should be safe.  Feel free to include manifests in your xcopy deployment method - a policy in the central WinSxS store can redirect you to a newer DLL when one is available.
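
    For completeness, the application manifest in that scenario looks roughly like the sketch below.  Visual Studio normally generates and embeds this for you, and the version shown is only illustrative; it is the baseline your app asks for, which publisher policy in WinSxS can redirect to a newer, fixed version.

        <?xml version="1.0" encoding="UTF-8" standalone="yes"?>
        <!-- Sketch of an application manifest requesting the VC80 CRT;
             policy installed with newer DLLs can redirect this request. -->
        <assembly xmlns="urn:schemas-microsoft-com:asm.v1" manifestVersion="1.0">
          <dependency>
            <dependentAssembly>
              <assemblyIdentity type="win32" name="Microsoft.VC80.CRT"
                                version="8.0.50608.0" processorArchitecture="x86"
                                publicKeyToken="1fc8b3b9a1e18e3b" />
            </dependentAssembly>
          </dependency>
        </assembly>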

  • Ben - your explanation of the assemblies and manifests concepts and implementation helps a lot, but once again it seems that Microsoft has neglected support for dynamic situations.  Currently, I am rebuilding an ISAPI extension provided by a vendor using VC++ Express.  I have had some problems with the code, since it was written a long time ago for a non-Microsoft compiler.  I'm trying to use the EYESAPI tool to test my extension changes (www.genusa.com/isapi/eyesapi.html) and to figure out what problems I have.  EYESAPI lets you debug ISAPI extensions without having to use IIS - a much simpler debugging strategy.  This tool gets the name of a DLL by having the user type the name into the window.  My examination of the assemblies and manifests implementation suggests that tools like this can't be built or supported with MSVC anymore.  Am I wrong?  How would you solve this problem... (bearing in mind that I have no source for this tool and that it does not already dynamically generate a manifest resource inside itself?)  Similar questions arise for loading DLLs from dynamic languages like Perl, Python, Ruby and Smalltalk.  Are these languages going to have to dynamically generate manifests to support loading DLLs in dynamic language code?  That isn't my problem right now, but it looks like it could be down the road.

    Thanks for all of the time you've put into responses to our questions.
