Larry Osterman's WebLog

Confessions of an Old Fogey

IE Code quality commentary...

  • Comments 47

I just saw this post by Michal Zalewski on BugTraq.  From the post:

It appears that the overall quality of code, and more importantly, the
amount of QA, on various browsers touted as "secure", is not up to par
with MSIE; the type of a test I performed requires no human interaction
and involves nearly no effort. Only MSIE appears to be able to
consistently handle [*] malformed input well, suggesting this is the
only program that underwent rudimentary security QA testing with a
similar fuzz utility.

I'm wondering when Michal's post will show up on slashdot.

Edit: Corrected Michal's name - Sorry about that.

 

  • "I'm wondering when Michael's post will show up on slashdot."

    It's now submitted, let's see what happens.
  • I have managed to crash IE using pretty straightforward JavaScript code, so take these tests with a grain of salt. The tests are 'targeted' at non-IE browsers. If one were so inclined, one could do the exact opposite: write tests targeting IE and comment, 'Non-IE browsers have much better code quality since they do not crash on these tests but IE does'.

    This is not to say that the browsers which crashed on these tests don't have bad code; IE has bad code as well. Given so many IE exploits, one can't say the IE code base is of very high quality. It may still have a plethora of holes that no one knows about yet.
  • Michal wasn't testing the JavaScript interpreter, he was testing the HTML renderer. I'm sure that his results would be different when looking at a different component.

    But in this case, he wasn't testing something that was targeted at non-IE browsers. As far as I know, all browsers CLAIM to handle HTML (if any of the tested browsers don't claim to support HTML, please let me know).

    In this case, he was simply performing a basic security test that should be performed by EVERY test department: Fuzzing the input.

    In other words, he took valid inputs and made them invalid in various ways, and tried to see what would happen when the browser tried to render the HTML.

    Remember - the bad guys don't write valid HTML. They write INVALID HTML. So if all your security testing is done with valid HTML, you're not thinking like a bad guy.

  • I simply won't consider Firefox until they implement a security manager. Currently XPCOM binary extensions have no security model at all. My complaint about IE's security manager is simply that it's pretty hard to say 'open this link in Restricted Sites'.

    However, they're doing better than the Linux kernel - they actually have smoke tests and test plans. I cannot *believe* the amount of praise that Linux gets when it's such an unknown quantity. Lest you suggest I have no experience, I was an active Linux user four years ago in the late 2.0.x/early 2.2.x days, and I clearly recall the regular disk-trashing bugs that appeared in the early 2.2 kernel series.

    The clean-room journalled filesystems are still a joke - your data is safer with ext2 than with ext3 or ReiserFS. If you want a journalled filesystem, go with SGI's XFS or IBM's JFS.

    Linux sites are deluding themselves that there are no problems in the OS. You have literally no way to know whether a new kernel release will work correctly on your system.

    Mozilla/Firefox smoke testing still appears to be post-checkin, though, not pre-checkin as I believe has become common practice at MS. It's not automated, which suggests the software wasn't designed-for-test.

    If you choose Microsoft software (and to a greater or lesser extent commercial software in general), you have to believe that Microsoft have tested the software to the best of their ability, and that the build/test labs that are mentioned genuinely exist, and that Microsoft personnel, and consultants hired by MS, have performed the security reviews they say they have and that they're skilled to do so. With Open Source, at its worst you have to believe that a nebulous collection of unknown people, of unknown size, of unknown skill, review all changes made to the software, typically with no release plan of features that are to be included.

    For serious business purposes, I know which one I choose.
  • A hole is a hole - no matter how it was exploited, through architectural ignorance or otherwise. So your argument that the HTML parser is more significant than other portions of a browser is not valid. IE has bad code despite Microsoft's so-called testing efforts, and non-IE browsers have bad code too (arguably in different places) despite their testing efforts. So what are we so happy about?

    Let's see in how many *days* Mozilla issues a fix, and compare that with the least time it ever took Microsoft to issue a fix for any of the previous exploits.

    A piece of software as complex as a browser is going to have bugs - what matters is how soon they get fixed and how many users they affect. IE is _bad_ in both cases: it affects a hell of a lot more users, and Microsoft was nowhere near quick in issuing fixes for known exploits.
  • Doesnt Matter: No, I'm NOT arguing that the HTML parser is more significant than other components.

    But I AM arguing that if their HTML parser failed this basic test, what will happen to their JavaScript interpreter? What does this say about the methodologies used to test the components that make up their system?

    If extensive regression testing of a fix isn't a criterion, then your time to fix can be far smaller than if you have to run large regression suites.

    I don't have numbers, but I'm wondering how many fixes to Mozilla have to be revised after they were "fixed"? I also don't have numbers for MS products, but I suspect (with no strong evidence) that it's somewhat lower.

  • Of the three reported crashes, one didn't crash for anyone, one was already fixed in dev builds, and I fixed the third one myself - a simple NULL pointer deref. Within _hours_ everything suddenly feels safe, without having to depend hopelessly on the vendor to fix the problems. Isn't this magical compared to what would have happened with a closed-source product?
  • Larry - Extensive regression testing applies only to such things as architectural changes. For instance: previously you allowed an ActiveX control to run if you thought you were in the Local zone. Now someone is able to trick you into believing you are in the Local zone even though you aren't. Then you have to fundamentally change the way you determine what the Local zone is. That's going to break maybe a dozen things that rely on the original buggy behavior, and then yes - you need to regression test it. If it breaks, you need an ugly workaround instead of an elegant fix.

    Why would someone need to worry about regressions in the case of a NULL pointer deref?

    It's an altogether different and easier game with OSS - if an elegant fix breaks something, you can easily and elegantly fix the source of the problem and all the other dependents that rely upon that bug. No ugly workarounds, and no huge amount of regression testing is necessary.

    And I don't understand what you said in your last statement - I haven't heard of Mozilla ever having to fix one of their fixes, but I definitely remember a couple of such things happening with MS fixes.
  • I don't know whether all bug fixes made to Mozilla have been error-free. I do know that on other OSS projects, it has taken several revisions to create a security fix that didn't itself introduce new bugs.

    In *this* case, the fix may have been simple and clean and easy. In other cases, it's not nearly as clear.
  • 10/18/2004 2:38 PM Mike Dimmick

    > I clearly recall the regular disk-trashing
    > bugs that appeared in the early 2.2 kernel
    > series.

    OK, I don't, because I didn't experiment with Linux in those days. I remember Windows 95's disk-trashing bugs from those days, and I remember Windows Server 2003's disk-trashing bug from a few weeks ago. And Windows 2000's disk-trashing bug from a time midway between those two.

    > Linux sites are deluding themselves that
    > there are no problems in the OS.

    I haven't seen that, unless you mean some of the marketing pages on commercial vendors' sites. If you hate the marketing more than I do, you're in luck: you can buy a computer, even a notebook computer, without paying for an unwanted copy of Linux. I've seen lots of sites reporting problems in Linux. My own opinion also is that there are two essential differences between Linux and Windows:
    (1) With Linux you DO get what you paid for (except if you paid for it).
    (2) With Linux if something needs fixing, and if you're a programmer, then you DO have a snowball's chance in hell of fixing it.

    > If you choose Microsoft software [...] you
    > have to believe that Microsoft have tested
    > the software to the best of their ability

    No way. Things as trivial as installing Windows 98 Service Pack 1 (onto Windows 98 first edition), rebooting, and clicking the Start menu; or as trivial as installing the Word 2000 upgrade on an existing Office 97 installation and clicking the Start menu, etc., pretty clearly demonstrate that Microsoft never tested them. Sometimes Microsoft tests the US versions of their products, but the vast majority of their products don't benefit from that.

    And then last weekend I tried installing .NET Framework 1.1 SP1 onto Windows Server 2003. There's a special version of that service pack for Windows 2003, separate from the version for the rest of Microsoft's OSes. And it doesn't even install, it tries to dereference a null pointer during installation. That's quite a reassuring security fix eh?

    This doesn't mean Linux is better, it just means Windows isn't.
  • To summarize - software is hard to get right; _humans_ write software as of now, and thus there is every chance that it is not perfect. But you are better off when you have the source with you: you can fix it by some means if nothing else works out. You don't have to be at anyone's mercy.

    And most importantly, if everything is open and out there, you get elegant fixes instead of mere workarounds, and you have the ability and capacity to correct the design if need be, without having to worry too much about how many other closed things it might break. (The Linux USB API is a good example of this: they changed it three times and fixed all the drivers dependent on it - no ugly workarounds and no bloat.)
  • Woah!

    The teardrop TCP security fix took Microsoft two attempts to get right back in the day (and significantly longer than the equivalent Linux kernel patch).

    Software development in general is beginning to wake up to the needs of security and the basic truth that pretty much any bug can be a security hole.

    It's grossly unfair to state that the open-source world is radically worse than commercial vendors at this - security has taken a back seat for a lot of people everywhere. However, in general, the people crying out at the beginning were much more able to work on open-source projects. If you look at the age of the bounds-checking patches for gcc, or the anti-stack-smashing approaches for the Linux kernel, amongst other things, these were all done before the big recent stink about security.

    I think it's a good thing that Microsoft have 'gotten' security - I think a lot of people underestimate what Microsoft can do, but to state that the open-source world is significantly worse is to do large portions of it a disservice. Having had a quick look at some functions on MSDN, I was very impressed that they have security notes accompanying them (strcat, strtok, sprintf), and I hope that this makes other commercial vendors take notice.

  • Didn't crash Firefox for me:
    Mozilla/5.0 (Windows; U; Windows NT 5.1; rv:1.7.3) Gecko/20040913 Firefox/0.10.1
  • That's quite interesting!

    However: One of the biggest flaws of IE is that it also accepts broken code (HTML etc.) for rendering, and renders it, EVEN when the rules of a technology explicitly say that parsing MUST stop at the first error.

    The fact that IE has always been so forgiving of broken code has contributed greatly to the fact that most of the code on the Interweb is just plain crap in quality.

    I would personally like to drag every IE developer behind the sauna to be put out.
  • I really don't know what is wrong here. Tested on Firefox PR1.0, XP SP1 fully patched, and there is no crashing evidenced, even after several refreshes.

    However, I know that even a 'hardened' IE is nowhere near as safe as Firefox in regard to viruses and spyware. I guess you're not cleaning PCs for a living?