Terry Zink: Security Talk

Discussing Internet security in (mostly) plain English

A comparison of antispam vendors


InfoWorld recently released a report comparing the effectiveness of various spam filters.  It's mostly about on-premise anti-spam appliances.  They touch on hosted solutions but don't go into much detail.  At the end, they do a filter-by-filter comparison.  You can view the results of their study by looking at the pretty image here.

The table contains a very nice-looking comparison.  It has total valid mail, spam percentage (catch rate), false positive rates, and the like, twelve categories in all.  But at the very end, we still have trouble answering the question of which one performed the best.  We can see that Ironport and Barracuda have the lowest catch rates, but Ironport has a pretty good FP rate.  There are a lot of numbers; how can we summarize them?

To do this, let's go back and look at my Relative Performance Index.  Recall that this is a metric I created that combines the catch rate and false positive rate and normalizes the results.  Also recall our definition of spam in the inbox (SITI), a measurement that combines the amount of spam and non-spam that the end user sees in their mailbox.  The results are below:

        Barracuda   Borderware   Ironport   Mirapoint   Proofpoint
RPI         5            1           91          7            7
SITI        7%          10%          12%         8%           7%

        IronMail    Sendio       Symantec   Tumbleweed
RPI         5            3           51         22
SITI        4%           0%          10%        17%

From this table, we can clearly see that Ironport has the best RPI (higher is better).  In fact, they totally crush the competition on this metric thanks to their low FP rate.  So, while Borderware's catch rate was higher, Ironport's low FP rate boosts its Relative Performance.

The numbers change a bit when we look at Spam in the Inbox.  Here, Ironport's lower catch rate hurts the user experience by letting more spam through, but its better FP avoidance improves it.
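To make that trade-off concrete, here is a minimal sketch (Python) of how the two rates might translate into a spam-in-the-inbox style figure.  The exact RPI and SITI formulas come from my earlier posts and aren't repeated here, so the 80/20 spam/ham mix and the calculation below are illustrative assumptions, not the real metric.

    # Minimal sketch: estimate what fraction of the mail in a user's inbox is spam,
    # given a filter's catch rate and false positive rate.
    # ASSUMPTION: the spam/ham mix and this formula are illustrative only;
    # they are not the actual SITI definition.
    def spam_in_inbox(catch_rate, fp_rate, spam_fraction=0.80):
        missed_spam = spam_fraction * (1.0 - catch_rate)           # spam that reaches the inbox
        delivered_ham = (1.0 - spam_fraction) * (1.0 - fp_rate)    # legitimate mail that reaches the inbox
        return missed_spam / (missed_spam + delivered_ham)

    # Two hypothetical filters: one catches more spam but flags more good mail,
    # the other catches slightly less spam but has almost no false positives.
    for name, catch, fp in [("Filter A", 0.98, 0.03), ("Filter B", 0.96, 0.001)]:
        print(f"{name}: {spam_in_inbox(catch, fp):.1%} of inbox mail is spam")

On these made-up numbers, the filter with the near-zero FP rate still shows more spam in the inbox because of its lower catch rate, which is the same tension the table above illustrates.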

It would be better still if we further combined RPI and SITI (or SITI, catch rate and false positive rate).  I will leave that for another post.

Comments

  • Very interesting study, but to be quite honest I think they left a couple of good appliances out of it.

  • Indeed, the software/appliance from Roaring Penguin, called CanIT Pro, is noticeably absent.

    Also, as someone else wrote elsewhere, the catch rate is a little misleading due to how much spam is blocked at SMTP connection time.  Read about that here: http://blog.mailchannels.com/2008/04/why-anti-spam-effectiveness-testing.html

    We use the Ironport C100 here and it has a much higher catch rate than what this InfoWorld report claims.  Here's 30 days' worth of stats on incoming mail (note: the AV result is inaccurate as it reflects mails with password-protected or encrypted content, regardless of whether they were actually infected):

    Stopped by Reputation Filtering   98.9%    6.9M
    Stopped as Invalid Recipients      0.7%    47.8k
    Spam Detected                      0.1%    5,795
    Virus Detected                     0.0%    1
    Stopped by Content Filter          0.0%    0
    Total Threat Messages:            99.7%    7.0M
    Clean Messages                     0.3%    20.6k
    Total Attempted Messages:                  7.0M

    Although the numbers are rounded, you can see that the catch rate performance is clearly better than what TFA states.
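    As a rough check on those figures (the counts are rounded as posted, so this won't reproduce the 99.7% exactly), here's a small Python sketch of the implied block rate and how much of it happens at connection time:

        # Rough check using the rounded counts quoted above.
        total_attempted = 7_000_000      # Total Attempted Messages
        reputation_block = 6_900_000     # Stopped by Reputation Filtering
        invalid_recipients = 47_800      # Stopped as Invalid Recipients
        spam_detected = 5_795            # Spam Detected

        blocked = reputation_block + invalid_recipients + spam_detected
        print(f"Overall block rate: {blocked / total_attempted:.1%}")           # ~99.3% with rounded inputs
        print(f"Blocked at connection time: {reputation_block / blocked:.1%}")  # reputation filtering dominates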
