Alik Levin's

Clarity, Technology, and Solving Problems

March, 2010


    Automating Code Review for Common ASP.NET Performance & Security Anti-Patterns


    In this post I will share with you how to automate code review by searching MSIL for common performance and security anti-patterns.


    You are an application performance/security consultant who’s been asked to review a large application for common security and performance anti-patterns. You are given no time and no source code. What you are given is 250 dll’s. What do you do?

    Reverse Engineer .Net Compiled Binaries

    It is easy to reverse engineer compiled .Net binaries. The .Net SDK ships with ILDASM – a disassembler that dumps the intermediate language (MSIL) out of an assembly into a text file. The resulting file is not really human readable (although there are a few individuals out there who can actually read and understand it), but it is good enough for text searches for several patterns. To perform the text searches one would use the findstr command. Here are a few examples:

    Find strings in resulted MSIL text file:
    Ildasm.exe yourcomponent.dll /text | findstr ldstr

    Find string constants in resulted MSIL text file:
    Ildasm.exe yourcomponent.dll /text | findstr /C:"literal string"

    Find boxing/unboxing in resulted MSIL text file:
    Ildasm.exe yourcomponent.dll /text | findstr box
    Ildasm.exe yourcomponent.dll /text | findstr unbox

    The searches above can help identify either security or performance issues.

    The question here is how to automate the process for 250 dll’s.

    Automating MSIL Search

    The following are the steps we take to automate the search:

    • Step 1 – Create a list of dll’s using the DIR command. First, let’s create an input file that contains the list of the dll’s we need to inspect. For that purpose we use the DIR command with a few switches:

    /A:-D means no directories, files only.
    /S means search subfolders. It is useful if you have dll’s in subfolders.
    /B means bare format, no headers, only file names.

    DIR /A:-D /S /B *.dll >C:\DllsList.txt

    Ok, now we have the text file – DllsList.txt – with the list of dll’s we want to inspect.
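    The same list can be produced with a short script. Here is a rough Python sketch of the DIR command above (the function names are mine, not part of the original post):

```python
import os

def list_dlls(root):
    """Walk root and all subfolders (like DIR /S), skip directories
    (/A:-D), and return bare full paths (/B) of every .dll file."""
    paths = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if name.lower().endswith(".dll"):
                paths.append(os.path.join(dirpath, name))
    return paths

def write_dll_list(root, out_path="DllsList.txt"):
    """Persist the list, one path per line, like DIR ... redirected to a file."""
    with open(out_path, "w") as f:
        f.write("\n".join(list_dlls(root)))
```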

    • Step 2 – Create a vbs file that builds a batch file with the actual commands

    Now we create a vbs file that contains a script to build the batch file. The batch file includes a search command for each dll in the list we just created. In the sample below we create a batch file that will look for strings – ldstr. Substitute the search criteria to suit your needs.

    =====VBS FILE START====

    ildasmFilePath = """C:\Ildasm.exe"""

    assembliesFileList = "DllsList.txt"

    findStrPath = "C:\Windows\System32\findstr.exe"

    Set fso = CreateObject("Scripting.FileSystemObject")

    Set assembliesList = fso.OpenTextFile(assembliesFileList, 1, False)

    Set batchFile = fso.OpenTextFile(assembliesFileList & ".bat", 2, True)

    Do While assembliesList.AtEndOfStream <> True
        assemblyPath = assembliesList.ReadLine
        ' One command per assembly: disassemble, search, redirect matches
        stringToWrite = ildasmFilePath & " """ & assemblyPath & """ /text | findstr ldstr >""" & assemblyPath & ".Strings.txt"""
        batchFile.WriteLine stringToWrite
        'msgbox stringToWrite
    Loop

    assembliesList.Close
    batchFile.Close
    Set assembliesList = Nothing
    Set batchFile = Nothing

    msgbox "DONE"

    =====VBS FILE END====

    The result is a vbs file ready to be used. Double-clicking the vbs file creates a batch file with a similar name. Open the batch file and review its contents – there should be one ildasm | findstr command per dll.
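    For readers who would rather not use VBScript, the batch-file generation step can also be sketched in Python (the ildasm path and the default search pattern are assumptions, mirroring the VBS above):

```python
def build_batch(dll_list_path, batch_path,
                ildasm=r"C:\Ildasm.exe", pattern="ldstr"):
    """Read the list of dll paths and emit one ildasm | findstr
    command per assembly, redirecting matches to a per-dll text file."""
    with open(dll_list_path) as dlls, open(batch_path, "w") as batch:
        for line in dlls:
            dll = line.strip()
            if not dll:
                continue  # skip blank lines in the list
            batch.write(
                '"{0}" "{1}" /text | findstr {2} >"{1}.Strings.txt"\n'
                .format(ildasm, dll, pattern))
```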


    • Step 3 – Run the batch file to dump search results into text files

    Now that we created the batch file – let’s run it. The result is a bunch of text files with the search results in them. All that is left is to inspect the files. For example, in the case of a security review we’d look for the patterns outlined here:

    In the case of a performance review we’d go straight to the source code of the identified dll’s to look for massive boxing/unboxing, which usually happens when we use general purpose collections.

    To make the process of inspection easier and faster, use the instant search available in Outlook 2007 as outlined here:

    Related Books


    ASP.NET Performance: Web Application Gets Slow Periodically – IIS Recycles


    A customer complained that his ASP.NET web application gets slow periodically. It happens at random times: the system just gets slow, then after a few minutes it gets back on track with normal response times.

    One of the reasons for such behavior is an AppPool default recycling policy set in IIS.

    Default AppPool Recycling Policy in IIS

    The default recycling policy for an application pool is 29 hours, in both IIS 6.0 and IIS 7.


    It means that every 1740 minutes the application pool is recycled. 1740 minutes is 29 hours. 29 hours means randomness.
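    The arithmetic behind that randomness is easy to check: because 29 is not a multiple of 24, each recycle lands 5 hours later in the day than the previous one.

```python
# Each recycle happens 29 hours after the previous one, so the
# time-of-day drifts by 29 % 24 = 5 hours on every cycle.
period_minutes = 1740
period_hours = period_minutes // 60                      # 29
recycle_hours = [(n * period_hours) % 24 for n in range(6)]
print(period_hours)   # 29
print(recycle_hours)  # [0, 5, 10, 15, 20, 1] – a different hour each day
```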

    Correlate Event Logs and IIS Logs

    To make sure you are dealing with recycling (there are a few more reasons I will be discussing in the next posts), one needs to correlate IIS recycle events in the Windows Event log with the slowness of the pages in the IIS log.

    Start with the Windows Event log. Go to the System Event Log and filter events with W3SVC as the source. Look for recycle events with ID 1074:

    iis recycling event

    Notice the time the recycle occurred.

    Go to IIS logs and filter out the resources that took significantly long. One way to look into the IIS logs is using Excel as outlined here - Identify ASP.NET, Web Services, And WCF Performance Issues By Examining IIS Logs. Another way is using LogParser as outlined here (via Tess) - Using LogParser 2.2 to Parse IIS Logs and Other Logs.

    Notice that time-taken is pretty lengthy, around 30 seconds, and note when the request occurred. Now look at the times the Event log captured the 1074 events – there is a 2-hour difference. That is because IIS logs events using GMT while the Windows Event log is time-zone sensitive. I live in the GMT+2 time zone, so the times are correlated.
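    A small helper makes the correlation mechanical: shift the IIS (GMT) timestamps by your local offset before matching them against Event log times. A sketch, assuming a GMT+2 offset like mine:

```python
from datetime import datetime, timedelta

def iis_to_local(iis_timestamp, offset_hours=2):
    """Convert an IIS log timestamp (always GMT) into local time so it
    can be matched against Windows Event log entries."""
    t = datetime.strptime(iis_timestamp, "%Y-%m-%d %H:%M:%S")
    return t + timedelta(hours=offset_hours)

# An IIS entry at 08:15 GMT corresponds to a 10:15 local Event log entry.
print(iis_to_local("2010-03-01 08:15:00"))
```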



    Remove the default IIS recycling policy. I once heard someone call it training wheels. If your app is supposed to be stable then you do not need recycling at all – of any kind. Just remove it all. If you are using recycling there are potentially two reasons, actually one – you have a problematic application that misbehaves. It is either under your control to fix it by changing the code, or out of your control when you bought an app that you cannot fix – ask the vendor for a fix. It might also be a good time to review your capacity plan. In any case, recycling is a workaround and not the solution.

    Related Book

    My related posts


    ASP.NET Performance: Get Rid of HTTP 401 and HTTP 304

    Making fewer calls to the IIS web server improves your ASP.NET application’s performance, or more precisely, it improves UI responsiveness or, even more precisely, it improves UX, the User Experience. Better User Experience leads to better adoption.

    Quick Resource Box

    In this post I will share how to improve User Experience by reducing the number of HTTP 401 and HTTP 304 responses.

    The Impact of HTTP 304

    In general, HTTP 304 is returned by the web server when the browser is not sure whether its locally cached copy of a resource is up to date. Imagine this conversation:

    1. Browser: “Hey, IIS web server, I have this GIF file in my cache stored locally. I stored it here since my last visit to your page. Not sure it’s up-to-date. Should I use it? Or, do you have a newer version?”
    2. IIS web server: “HTTP 304. Nope, there is no newer version down here in the data center. Use your locally cached version of the GIF.”
    3. Browser displays the GIF from local cache.

    This one extra roundtrip might look very subtle when there are only a few static elements on the page. When there are many static elements, the User Experience can be severely affected. Below is an extreme example of HTTP 304 responses captured by Fiddler. All the images in the diagram are stored in the local cache, but they are never displayed right away – the browser first consults the server and gets an HTTP 304 before displaying each one:
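    The server side of that conversation reduces to one comparison. A minimal sketch (the function is illustrative, not IIS internals; the header semantics are standard HTTP):

```python
def respond_to_conditional_get(if_modified_since, resource_last_modified):
    """Return the status code a server sends for a conditional GET:
    304 if the cached copy is still current, 200 (with a fresh body)
    otherwise. Dates are ISO strings, so string comparison orders them."""
    if if_modified_since is not None and resource_last_modified <= if_modified_since:
        return 304   # "use your locally cached version"
    return 200       # "here is a newer version"

print(respond_to_conditional_get("2010-03-01", "2010-02-15"))  # 304
print(respond_to_conditional_get("2010-03-01", "2010-03-05"))  # 200
```

    Note that even the 304 answer costs a full network roundtrip – which is exactly why the post recommends an expiration policy that lets the browser skip the question entirely.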


    The Impact of HTTP 401

    HTTP 401 is returned when the browser requests a resource that requires authentication. The fact that the resource requires authentication results in two HTTP requests – the initial request gets an HTTP 401 asking for credentials, and the subsequent request is satisfied with the actual response after the credentials are validated. Something similar to this:


    Now imagine a situation where static resources such as images, JavaScript, and CSS files require authentication. Usually these files are the same for every user and there is no point in requiring authentication/authorization for them, right? If so, then there is no need to waste another HTTP 401 roundtrip on them, right?

    Improve Performance by Partitioning Your Application

    Consider the following solution structure:


    Notice the following four folders: CSS, IMG, JS, and Restricted. The first three contain static style sheets, images, and JavaScript files respectively. The Restricted folder contains all the dynamic ASPX pages that implement your scenarios, referencing the static content from the other three folders when needed. The web.config file looks as follows:

    <?xml version="1.0"?>
    <configuration>
      <system.web>
        <authorization>
          <allow users="*" />
        </authorization>
      </system.web>
      <location path="Restricted">
        <system.web>
          <authorization>
            <deny users="?" />
          </authorization>
        </system.web>
      </location>
    </configuration>

    This configuration achieves the following:

    1. Static files are served from a publicly accessible location; that way we avoid extra HTTP 401 roundtrips.
    2. System administrators are not required to fish for the static files across the solution’s file structure just to set an expiration policy – there are only three folders to set it on, so there is a better chance it will actually happen. This helps reduce extra HTTP 304 roundtrips.
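    The savings can be illustrated with a toy roundtrip counter (folder names taken from the structure above; the counts are illustrative and ignore connection-level details):

```python
def roundtrips(path):
    """Count HTTP roundtrips for a resource: anything under /Restricted
    triggers the 401 challenge/response pair; public static folders are
    served anonymously in a single request."""
    public_prefixes = ("/CSS/", "/IMG/", "/JS/")
    if path.startswith(public_prefixes):
        return 1                    # no 401 challenge for public statics
    if path.startswith("/Restricted/"):
        return 2                    # 401 challenge + authenticated request
    return 1

print(roundtrips("/IMG/logo.gif"))          # 1
print(roundtrips("/Restricted/Home.aspx"))  # 2
```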

    Related Books
