A first-hand look from the .NET engineering teams
This post was authored by Morgan Brown, a Software Development Engineer on the .NET Native team. It is the fifth post in a series of five about Runtime Directives. Please see the first four posts in this series (Dynamic Features in Static Code; Help! I Hit a MissingMetadataException!; Help! I Didn't Hit a MissingMetadataException!; and Making Your Library Great) before reading this post.
The previous posts in this series are about getting your app working with .NET Native, but you don't want just working. You want excellent. We'll focus on how to tune your runtime directives and other dynamic behavior to make your app or library sing. In particular, we'll dig into cutting down app size, which in turn improves memory usage, runtime performance, and build time (that's right, as you optimize, your builds will get faster!).
Let’s start out with something that sounds a little crazy, but will make a big difference: delete your rd.xml file. (Ok, make a backup first.)
Try doing a quick release build and measure your results. Did your resulting binaries get significantly smaller? If not, your app may really use all of the code in its app package, leaving few optimization opportunities available through Runtime Directives. It's much more likely that you saw a big difference, so this can serve as a good baseline: it's the minimum size you could possibly reach (though there's a chance your app no longer works).
The default rd.xml file that gets added to your project when you enable .NET Native was chosen for a balance of compatibility with any app and good performance. The directive included in it says to compile all code found in your app package and include extra reflection information for it. Most apps include .NET libraries in the app package and often, only pieces of those libraries are needed. That default rd.xml tells the compiler to skip optimizing all of that unwanted code away and instead spend megabytes of binary size and tens of seconds compiling it.
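For reference, the default directive looks roughly like the sketch below (the exact file may vary by SDK version):

```xml
<Directives xmlns="http://schemas.microsoft.com/netfx/2013/01/metadata">
  <Application>
    <!-- The *Application* wildcard covers every assembly in the app
         package; Dynamic="Required All" keeps full reflection metadata
         for all of that code, which is what costs binary size and
         compile time. -->
    <Assembly Name="*Application*" Dynamic="Required All" />
  </Application>
</Directives>
```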
So now that you've deleted your rd.xml, your app may be hitting MissingMetadataExceptions. Before diving for that backup copy, let's see if we can make your app work by including only the code you need, not the large amounts you don't. At a high level, that means writing runtime directives that explain to the compiler how reflection in your app works. Here's how you get there:
You probably don't have to start from scratch. If your app is like most, lots of its dynamic behavior really happens in third-party libraries. For some of those, we've included rd.xml files in the SDK that the compiler picks up automatically. If you see MissingMetadataExceptions coming out of calls to libraries, check whether there's a newer version of the library that includes an rd.xml file; if there isn't, tell the author that you want their library to be part of your fast and lightweight .NET Native app, and that they should check out Making Your Library Great.
If you've factored your own code into libraries that rely on reflection, the same article applies. The goal here is for the library code to use reflection directives like GenericParameter and Parameter, which automatically pick up each individual type you use with the library. That way, your code works every time, without you having to write a ton of directives or pick one that includes lots of code you don't need. A good goal is to avoid Namespace and Assembly directives entirely, using just Parameter and GenericParameter directives and maybe a few individual types.
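As a sketch, a directive for a hypothetical serializer library (the library, type, and method names here are invented for illustration) might look like this:

```xml
<Directives xmlns="http://schemas.microsoft.com/netfx/2013/01/metadata">
  <Library Name="MySerializer">
    <Type Name="MySerializer.JsonWriter">
      <Method Name="Write">
        <!-- Whatever type a caller passes as T automatically gets
             serialization metadata -- no per-app Type entries and no
             broad Namespace or Assembly directives needed. -->
        <GenericParameter Name="T" Serialize="Required Public" />
      </Method>
    </Type>
  </Library>
</Directives>
```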
See if there are some places you use reflection that you don’t need to. A few good candidates are:
Replace the C# dynamic keyword with strongly typed code using interfaces or generics. It turns out that behind that simple-looking keyword is a ton of reflection. As a bonus, this will make IL versions of your app faster too.
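For example, a dynamic call site can often be replaced with a small interface. This is a sketch using a hypothetical IShape interface:

```csharp
using System;

// A hypothetical IShape interface replaces a dynamic call site: the call
// is resolved at compile time, so no reflection (and no rd.xml) is needed.
public interface IShape
{
    double Area();
}

public class Square : IShape
{
    public double Side { get; set; }

    // With `dynamic shape = ...; shape.Area();` this same call would go
    // through the DLR and reflection at runtime.
    public double Area() => Side * Side;
}

public static class Program
{
    public static void Main()
    {
        IShape shape = new Square { Side = 3.0 };
        Console.WriteLine(shape.Area());  // prints 9
    }
}
```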
Compiled LINQ expressions are a useful optimization on a runtime with a JIT since they get compiled at runtime. However, on a static runtime like .NET Native, they get interpreted instead of compiled and use lots of reflection too. Instead, think about using handwritten methods that will get compiled up-front. On .NET Native, compiled methods are always faster than interpreted code (and don’t need rd.xml).
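The trade-off might look like this minimal sketch, where both paths compute the same result but only the handwritten method is compiled ahead of time on .NET Native:

```csharp
using System;
using System.Linq.Expressions;

public static class Program
{
    private static int Double(int x) => x * 2;

    public static void Main()
    {
        // On a JIT runtime, Compile() produces native code at runtime;
        // on .NET Native the expression is interpreted and leans on
        // reflection instead.
        Expression<Func<int, int>> expr = x => x * 2;
        Func<int, int> viaExpression = expr.Compile();

        // The handwritten equivalent is always compiled up-front.
        Func<int, int> handwritten = Double;

        Console.WriteLine(viaExpression(21));  // 42
        Console.WriteLine(handwritten(21));    // 42
    }
}
```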
Instead of Type.GetType("MyWellKnownType"), consider typeof(MyWellKnownType). Similarly, when you can, construct delegates directly from methods instead of using MethodInfo.CreateDelegate.
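Side by side, the two swaps look like this (the reflection-based versions are shown in comments for contrast):

```csharp
using System;
using System.Reflection;

public static class Program
{
    private static string Greet() => "hello";

    public static void Main()
    {
        // Reflection-based lookups need metadata kept alive by rd.xml:
        //   Type t = Type.GetType("Program");
        //   var d = (Func<string>)typeof(Program).GetTypeInfo()
        //       .GetDeclaredMethod("Greet")
        //       .CreateDelegate(typeof(Func<string>));

        // The statically bound equivalents need no extra metadata:
        Type t = typeof(Program);
        Func<string> d = Greet;

        Console.WriteLine(t.Name);  // Program
        Console.WriteLine(d());     // hello
    }
}
```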
Lots of apps use an old trick to avoid hard-coding property-name strings for INotifyPropertyChanged: wrapping the property access in a LINQ Expression. You can now use the CallerMemberName attribute instead, which causes the C# compiler to fill in the string automatically without adding any reflection requirements.
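A minimal view-model sketch of the CallerMemberName pattern (the class and property names are invented for illustration):

```csharp
using System;
using System.ComponentModel;
using System.Runtime.CompilerServices;

public class PersonViewModel : INotifyPropertyChanged
{
    private string _name;

    public event PropertyChangedEventHandler PropertyChanged;

    public string Name
    {
        get { return _name; }
        set
        {
            _name = value;
            OnPropertyChanged();  // the compiler supplies "Name"
        }
    }

    // [CallerMemberName] is filled in at compile time, so no LINQ
    // Expression and no reflection metadata are needed.
    private void OnPropertyChanged([CallerMemberName] string propertyName = null)
    {
        PropertyChanged?.Invoke(this, new PropertyChangedEventArgs(propertyName));
    }
}

public static class Program
{
    public static void Main()
    {
        var vm = new PersonViewModel();
        vm.PropertyChanged += (s, e) => Console.WriteLine(e.PropertyName);
        vm.Name = "Morgan";  // prints Name
    }
}
```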
So how do you know if you've done a good job? Getting your app working with more specific directives is a pretty good sign, of course. Aside from your total binary size, there's a simple metric that can help you gauge how well you're cutting down on excess reflection metadata. In your Visual Studio project directory, under obj\(Architecture)\(Debug or Release)\(Name of your project).ilc\intermediate\ILTransformed, there's a file named (Name of your project).reflectionlog.csv. That file lists every type and member that has been enabled for reflection and which degrees were applied to it. You can use the size of that file as a proxy for how much compiled output you're asking for, and you can skim its contents to see what you might be pulling in and whether there are large swaths of things you don't expect.
If you go through all of that and either couldn’t get your app working without the original rd.xml or it really just didn’t get much smaller despite having library code you probably don’t need, we’d love to hear from you. We’re constantly improving the compiler and data helps us come up with new ways to make apps better automatically. Please feel free to leave comments at the end of this post or email us at email@example.com.
The focus on .NET Native is interesting, but do you think the .NET framework could be covered again in these posts?
@Andrew: This is the last post in this deep-dive series about .NET Native's rd.xml. We knew it was a lot of information but we wanted to get it out there for those who need it. We figured posting everything in a single week was the easiest way to go about it.
Expect more posts about the rest of the .NET Framework coming up. By the way, if there's a topic you'd like to hear more about, please ask!
Why is there such a focus on app startup time?
I would rather wait much longer for the app to load if it just ran faster during runtime, as would all my users.
Will these posts and .NET Native ever get into actual, real performance benefits?
I have a game in the top 10 of the windows store that could really use some faster runtime performance.
@Daniel: Both startup time and steady-state runtime are important. Our initial focus was app startup time because the majority of Store apps that we tested are I/O bound. These apps do a lot of work when they start up--drawing the UI, restoring state--then they wait for the user to interact with the app. Steady-state runtime performance is less of an issue when you're waiting for the user to input some data or touch a control.
We are bringing optimizations online now that will help with steady state performance. If you mail us the name of your top 10 Windows Store app we'd be happy to use it as one of the test cases we focus on.
@Andrew Pardoe: good to hear. Thanks
No specific topic I'd like covered; just general things like this (in this order):
(a) New .NET features that would be useful to any programmer;
(b) best practices;
(c) common mistakes people make and the proper way to do things;
(d) little known valuable features of .NET.
I guess some specific topics of interest to me are: garbage collection and multithreading
@Andrew Pardoe [MSFT]
Topics I'd like to hear about...while .NET Native is great stuff (and the .NET core development seems to be nicely active), what about some new developments in regard to WPF? That is, if there are any...looks like there isn't even a blog that covers this topic.
If you are taking requests, then some topics about what you are doing with WPF would be very interesting.
Is there a way to tell the compiler to just include every type etc. if one does not care about executable size at all? Just want a working pre-compiled native executable! That works with dependency injection libraries, Caliburn.Micro etc. All very reflection based.
Topics for discussion. Performance work done in the .NET framework both CPU and GC. Specifically, global locks in WinForms/WPF synchronization contexts (e.g. dispatcher lock()).
Allocations in same e.g. allocations for every property change notification etc. Allocations on WaitHandle.WaitAny etc. All used by us in cases where a LOT of calls are generated per second.
Generally, it would be great if .NET framework code received attention regarding perf and heap allocations. So any work here? Or are BCL/WPF/WinForms libs frozen?
What do you guys think about CUDAfy.NET? (https://cudafy.codeplex.com/)
It's a great way to increase your app's performance by a lot (if your app is doing heavy computations), but its API isn't the best and it uses some ugly tricks/hacks to do its job.
Do you think CUDAfy.NET would work with .NET native?
I would like to have GPGPU programming library in .NET by default and it should be as easy to use as TPL :D
@Andrew, flöle, MEK, HarryDev: Thank you for the suggestions! I'll see if we can't get some posts on best practices and guidance, including feature guidance. This blog is written by the core .NET team so we do tend to focus on core .NET development but we'll talk about including more general guidance along with announcements.
@HarryDev: Might I point you at Vance Morrison's excellent blog for some performance topics? Vance is performance architect for the .NET Framework. Read more here: http://blogs.msdn.com/b/vancem
@Pawel: CUDAfy looks interesting for your apps. I don't know that the Windows Store API exposes GPU programming though, so I don't think we've looked at CUDAfy in the context of .NET Native.
@AndrewPardoe: Yeah, I've pretty much read everything Vance has ever written... publicly ;) So very aware of this. The issues I point to are based on profiling our apps, where we see a lot of allocations and contention that could easily be avoided if the .NET Framework used modern practices and took care to avoid locks and allocations. We also use PerfView and custom TraceSources etc.
@HarryDev: Can you send me a private email? I'd like to hook you up with one of our PMs who will help you collect some PerfView traces for us to inspect. My email is firstname.lastname@YouKnowWhereIWork.com