Someone asked me an interesting question the other day, "How do I detect if any users are leeching my FTP site?" That's a great question, and it warrants some explanation and a little LogParser code.
First of all, I should explain the term leeching as it applies to FTP. If you host a public FTP site with a collection of files for downloading, a leech is someone who connects to your site and downloads everything - or almost everything. The term leech is most often used on peer-to-peer (P2P) sites when someone downloads and never uploads, or as Wikipedia appropriately summarizes it, "leeching is taking without giving."
This leads me back to the original question, which was how to detect if someone is leeching your FTP site. The basic pattern for leeching is usually pretty easy to detect – you'll see a large volume of change directory (CWD), directory listing (LIST), and file retrieval (RETR) requests; the pattern will usually be something like the following flow of events:
CWD / LIST / RETR1 / RETR2 / RETRn / CWD / LIST / RETR1 / RETR2 / RETRn / etc.
An excerpt from an IIS W3C log file might look something like the following, and you can easily see the client's activity as it traverses the FTP site and grabs everything; the RETR entries are the file download requests.
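To illustrate the shape of that traffic, here is a hypothetical W3C-style excerpt; the IP address, session ID, paths, and timestamps below are all invented for illustration, not taken from a real log:

```
#Fields: date time c-ip cs-method cs-uri-stem sc-status x-session
2013-06-26 01:05:11 192.0.2.50 CWD /photos 250 1a2b3c
2013-06-26 01:05:12 192.0.2.50 LIST /photos 226 1a2b3c
2013-06-26 01:05:14 192.0.2.50 RETR /photos/img001.jpg 226 1a2b3c
2013-06-26 01:05:17 192.0.2.50 RETR /photos/img002.jpg 226 1a2b3c
2013-06-26 01:05:21 192.0.2.50 RETR /photos/img003.jpg 226 1a2b3c
2013-06-26 01:05:23 192.0.2.50 CWD /music 250 1a2b3c
2013-06-26 01:05:24 192.0.2.50 LIST /music 226 1a2b3c
2013-06-26 01:05:28 192.0.2.50 RETR /music/song001.mp3 226 1a2b3c
2013-06-26 01:05:33 192.0.2.50 RETR /music/song002.mp3 226 1a2b3c
```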
Unless the client is scripting their leeching to hide their activity, the client requests will generally be in the same session, which makes them easier to track. Otherwise, you have to track activity by IP address, but it's still doable.
I've written a lot of content about Microsoft's LogParser utility in the past, and so it should seem logical that I'd find a way to use it for this situation as well. If I were to write a LogParser script to detect leeches, I would probably start with something like the following example:
Logparser.exe "SELECT date,COUNT(*) as downloads,c-ip,x-session FROM *.log WHERE cs-method='RETR' GROUP BY date,c-ip,x-session HAVING COUNT(*) > 100" -i:W3C
An easier view of just the SQL syntax for that LogParser query looks like the following:
SELECT date, COUNT(*) AS downloads, c-ip, x-session
FROM *.log
WHERE cs-method='RETR'
GROUP BY date, c-ip, x-session
HAVING COUNT(*) > 100
In this example, the script is asking LogParser to query all of the FTP logs in a folder and return the date, download count, client IP address, and session ID for every session where the client downloaded more than 100 files in a single day; it groups the records by client IP address, session, and date. By way of explanation, 100 is a number that I chose somewhat arbitrarily, but I would think anyone with more than 100 downloads in a day would probably be a leech for most FTP sites. If you are writing your own LogParser script, make sure that you group the query by date like I did, so people who download 100 files over a couple of years don't show up as leeches. ;-]
When you run the above query, LogParser will give you output that looks like the following:
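The output is a simple table of the columns named in the SELECT clause; the rows below are hypothetical values shown purely to illustrate the layout:

```
date       downloads c-ip       x-session
---------- --------- ---------- --------------------
2013-06-26 250       192.0.2.50 1a2b3c
2013-06-27 187       192.0.2.73 4d5e6f
```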
This enables you to see pretty quickly who the top leeches for your FTP site are, and then you can act accordingly. At the very least, you might want to think about adding the IP addresses of any leeches to your FTP IP Address and Domain Restrictions.
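If you don't have LogParser handy, the same grouping can be sketched in plain Python. This is a rough, illustrative equivalent of the query above, not the method from the post: it assumes IIS W3C FTP logs with a "#Fields:" header naming the date, c-ip, cs-method, and x-session columns, and the path in the usage comment is hypothetical.

```python
# Count RETR requests per (date, client IP, session) across W3C FTP logs
# and flag any group above a threshold - a sketch, not production code.
from collections import Counter
import glob

THRESHOLD = 100  # downloads per day before a session looks like a leech


def count_downloads(paths):
    """Return a Counter keyed by (date, c-ip, x-session) for RETR requests."""
    counts = Counter()
    fields = []
    for path in paths:
        with open(path, encoding="utf-8") as log:
            for line in log:
                line = line.strip()
                if line.startswith("#Fields:"):
                    fields = line.split()[1:]  # column names follow "#Fields:"
                    continue
                if not line or line.startswith("#"):
                    continue  # skip blanks and other comment/header lines
                row = dict(zip(fields, line.split()))
                if row.get("cs-method") == "RETR":
                    counts[(row["date"], row["c-ip"], row["x-session"])] += 1
    return counts


def find_leeches(paths, threshold=THRESHOLD):
    """Filter the counts down to groups that exceed the threshold."""
    return {key: n for key, n in count_downloads(paths).items() if n > threshold}


# Usage (hypothetical log folder):
# leeches = find_leeches(glob.glob(r"C:\inetpub\logs\LogFiles\FTPSVC1\*.log"))
```

Like the LogParser query, this groups by date so that a slow trickle of downloads over months doesn't trip the threshold.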
I hope this helps. ;-]
Nice post, always good to see more content about LogParser.
Do you have any idea what the development situation at Microsoft is for LogParser? There hasn't been any word for a long time as far as I can tell.
Are we ever getting a LogParser 3?
I would personally LOVE to see Microsoft release LogParser 3, but unfortunately there are no plans at this time for an update to LogParser in the near future. :-(