Friday's post about security blogs apparently stirred up a bit of unintended controversy.
When describing Bruce Schneier's blog, I said "I don't agree with a lot of what he says". Apparently this is heresy in some parts, although I don't understand why. Bruce is unquestionably a very, very smart man (and an excellent writer, I simply loved Applied Cryptography), but he's no Chuck Norris :)
On most topics (security architecture, crypto design, threat analysis, etc.), Bruce is remarkable. I find most of what he writes to be insightful.
But Bruce seems to have a complete blind spot when it comes to Microsoft. To my knowledge, even though essentially every other serious security analyst has acknowledged that Microsoft has done a staggering amount of work to improve the security of its products, Bruce still maintains that Microsoft has no clue when it comes to security. That stings.
The #2 hit in a search for "Bruce Schneier Microsoft" is http://searchsecurity.techtarget.com/originalContent/0,289142,sid14_gci1011474,00.html, which includes: "Microsoft is certainly taking it more seriously than three years ago, when they ignored it completely. But they're still not taking security seriously enough for me. They've made some superficial changes in the way they approach security, but they still treat it more like a PR problem than a technical problem." This couldn't be further from the truth. (The #1 hit is Schneier's FAQ about his PPTP analysis, in which he neglected to acknowledge the work Microsoft did to rectify the issues he found after his analysis.)
And then there was this gem (from February of this year): http://www.schneier.com/blog/archives/2007/02/drm_in_windows.html. He took Peter Gutmann's article and accepted it as gospel truth, even though Gutmann had absolutely no factual basis for his speculation - Gutmann hadn't verified a single one of his claims; heck, he hadn't even installed Vista at the time he wrote his paper.
On the basis of one paper from someone who had never even RUN Vista, Schneier leapt to the conclusion that Microsoft had embedded DRM into all levels of the operating system and that was a reason to avoid Vista.
For the following 5 paragraphs, please note: I AM NOT A LAWYER. I AM NOT GIVING A LEGAL OPINION, THESE ARE JUST MY THOUGHTS.
I also believe that he hasn't fully thought out his position on holding companies financially liable for the security holes in their products. At first blush the idea is attractive, but I firmly believe that its consequences would totally destroy the Internet as we know it today.
It's also entirely possible that it would kill the open source movement (talk about unintended consequences). Let's say that there's a security vulnerability found. If the vulnerability is found in a closed source product (or in proprietary code), then the corporation would be the only one that could be held liable for the damages - the individual developer would be protected by the corporate liability shield.
But for open source projects, often there is no such corporate liability shield (I could imagine scenarios where a corporate liability shield might apply, but I don't think they apply in general). So who pays up if a vulnerability is found in an open source project? The only likely target is the individual developer (or developers) who introduced the defect (I suspect that those involved in the distribution that contained the vulnerable code would also be targeted).
This means that it's highly likely that the individual contributors to open source projects would be held personally financially liable for security vulnerabilities they introduce. So to contribute to open source projects, you'd have to have many millions of dollars of personal liability insurance (or run the risk of financial ruin if a mistake is found in your code). That is highly likely to result in a stifling of the open source movement, and there's no easy way to work around it.
It's also likely to decrease the likelihood that a corporation would adopt an OSS solution. Consider the situation where a bank (or major retailer) is worried about having its customer records hacked. Since the bank/retailer is going to be held responsible for its security breaches, then the bank/retailer has to factor that risk when it chooses a vendor for its database solution. If the bank/retailer thinks it can sue the software developer in the event of a breach, and it has two choices for a database vendor (one developed by a bunch of people who don't have any real assets, the other from a company with insurance and assets), it would be crazy to choose the one where there's no one to sue.
Those are a couple of reasons why I disagree with Bruce Schneier on occasion.
Larry, I addressed that point, I thought. Aren't Red Hat, IBM et al convincing counterexamples?
Microsoft considers the liability cost of possible IP infringement in Paint.Net too great to even contemplate, even though there's no hint that any IP is actually infringed.
And yet Red Hat, Novell, IBM, Canonical, and a whole host of other companies ship and support entire operating systems written by developers just as "amateur" as the developers of Paint.Net and with even less "formal" assurance that the IP is clean.
Why does that work for them and not for Microsoft? Who knows. The point is that your claim that "no company would do this" doesn't square with the fact that companies *do*, today, do something that Microsoft considers just as unacceptable for the exact same reasons.
The way I see it, open source vendors act as "liability shields" for their customers. And that applies perfectly well today for IP infringement liability, same as in a hypothetical future for security liability.
I'm not sure it would be the death of open source in a global sense. Even if it [liability for vulnerabilities in OSS] did kill it in the US (which is a very extreme take; "stifle in some circumstances" seems more likely to me), the rest of the world would probably shrug its shoulders, possibly shake its head knowingly, maybe even snigger a little, and then get on with using and writing OSS.
Perhaps litigation laws should be changed?
> I'd argue that the mere fact that these vendors DO ship open source code and DO accept the risk
> of all the potential liability that Microsoft considers so unacceptable, tells you that in fact
> they don't feel that way.
They feel the risk is acceptable *today* with the laws as they currently stand. But if they were 100% liable for all bugs in the code, I don't think there'd be much of a business case for them to accept outside contributions from anybody (unless the liability could then be passed down to them).
The point is, Bruce wants the person who created the bug to be liable (which is fair enough if that person is a corporation), but in the case of OSS, the person who creates the bug cannot reasonably be liable for it.
Larry, thanks for this response. I was one of those who questioned you about this. Whether or not I agree with your response, a clarification was in order, simply because Bruce generally speaks common sense like Seattleites drink espresso.
Stuart: See Dean's response. Right now there is no liability for software security defects, so there's no issue with accepting contributions from non-employees. If there were liability for software security defects, it's not clear that companies like Red Hat would be willing to accept liability for code contributed by volunteers.
See my comment at the beginning about liability w.r.t. volunteer labor - if your car is totaled by a volunteer driving for Meals-on-Wheels (or Habitat for Humanity, or <pick your favorite charity that has driving as a part of its responsibilities>), does that mean that somehow neither Meals-on-Wheels nor the driver is responsible? Currently I don't believe it does, even though the driver was a volunteer working for the organization.
The rules get REALLY murky when dealing with volunteer labor.
> On the basis of one paper from someone who had never even
> RUN Vista, Schneier leapt to the conclusion that Microsoft
> had embedded DRM into all levels of the operating system
Interesting. I had read a whitepaper on Microsoft's own site about protected processes in Vista, and it gave me the same impression that Mr. Gutmann's paper gave Mr. Schneier. At the time, I had not even seen or heard of Mr. Gutmann's paper.
> and that was a reason to avoid Vista.
I didn't draw that conclusion from Microsoft's paper, but I didn't draw the opposite conclusion either.
> Consider the situation where a bank (or major retailer) is
> worried about having its customer records hacked. Since the
> bank/retailer is going to be held responsible for its security
> breaches, then the bank/retailer has to factor that risk when
> it chooses a vendor for its database solution.
Considering that this is already the situation in some countries, I think that Mr. Schneier's opinion would neither increase nor decrease the adoption of OSS. Banks already have to figure out how to protect themselves, at least in some countries.
It made me really comfortable (not) when a bank in the US was sending out spam on behalf of a hacker. (I don't mean a hacker sending out phishing spam that pretends to be from a bank; I mean the bank's own compromised systems sending the spam.)
It made me really comfortable (not) when a nuclear laboratory in Turkey was sending out spam on behalf of hackers - the second time after I'd already reported the first incident to their administrators. It made me even more comfortable (not) when a nuclear laboratory in the US was sending out spam on behalf of hackers, and they were bouncing e-mail reports because they hadn't even registered their domain properly. Yeah, so maybe there were no other breaches in their systems; maybe their bots were only accepting spamming instructions and ignoring instructions to send out copies of secret files. Should I care whether they were running OSS or monopolyware?
> Why should open source development get a pass from liability
> laws when closed source doesn't?
Although the person you asked has answered that, I think there are more answers to consider. One is that the maker publishes their usage limitations for all to see, so that if you want to know what cannot safely be done with the software, you can (theoretically) read them and see. Another is that if you need an adjustment to the software to make it safe for your use, you can (theoretically) make that adjustment.
Why doesn't that apply to liability for IP infringement, though? It's not like the theoretical possible damages for copyright or patent infringements are any less severe than for defect liability. How many billions was the SCO suit for?
Stuart, I suspect it does. And the theoretical damages for defects are WAY more than the damages for IP infringement.
Consider the amount of damage a breach like TJX could cause - millions of customers' credit card numbers siphoned off over the course of years. The liability to individual consumers is small, but that's only because the credit card companies are swallowing the losses. If they had the ability to sue TJX to recover those damages, they would.
Yes, the damages to Microsoft (or any other closed source organization) for a major IP infringement would be large, but they're not even vaguely in the same order of magnitude as the potential liability from a single security defect.
I just can't understand how one can speak about liability regarding anything that applies to software, let alone security issues. According to the EULAs I encounter, all liability is in the court of the user. I mean... from the EULA of my VS2003: "MICROSOFT AND ITS SUPPLIERS PROVIDE THE SOFTWARE AND SUPPORT SERVICES (IF ANY) AS IS AND WITH ALL FAULTS, AND HEREBY DISCLAIM ALL OTHER WARRANTIES yada-yada-yada". I read that as "no warranty wrt how the software works, period".
Therefore, I find the whole liability discussion hypothetical.
There may be software vendors who hold themselves responsible for what their software does, but it surely isn't MS or FOSS as we know them?
How about "liability capped at the price you paid for the software/support, plus legal fees"? Or at some multiple of the price you paid?
It seems like it ought to be possible to create the incentives Schneier wants for companies to produce secure software without making the damages completely unbounded.
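The cap idea above amounts to a trivial formula: damages owed = min(actual damages, multiple × price paid) + legal fees. A minimal sketch, purely to illustrate the proposal (the function name, the default multiple, and the numbers are my own hypothetical assumptions, not any real legal rule):

```python
def capped_liability(actual_damages, price_paid, multiple=3, legal_fees=0):
    """Damages a vendor would owe under a simple price-multiple cap rule.

    The cap limits exposure to `multiple` times the purchase price;
    legal fees (if recoverable) are added on top of the capped amount.
    """
    return min(actual_damages, multiple * price_paid) + legal_fees

# A $500 product implicated in a $1,000,000 breach, capped at 3x price:
print(capped_liability(1_000_000, 500))          # 1500
# A claim below the cap is paid in full:
print(capped_liability(100, 500, legal_fees=50))  # 150
```

The point of the cap is visible in the first call: the vendor's exposure stays bounded by what it was paid, which preserves the incentive to fix bugs without the unbounded downside discussed in the surrounding comments.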
I think damages are legitimate in some cases, but there should be obvious limitations.
First, we all know that many software companies make exaggerated claims about the features their products offer. False claims like that should immediately result in treble damages - a refund of thrice the cost of the software. Should a company hard-sell others on not-yet-implemented features, it should be liable for full reimbursement of all costs a customer incurs in purchasing, accommodating, and replacing the software. That hits the dishonest companies.
Second, if a company develops and delivers a product where ANY feature doesn't work due to a significant bug (one that affects over 10% of users, for instance), the company must provide a patch for that bug or refund the purchase price. That hits the companies that rush products to market too early.
Third, every software company/provider must make both binaries and source code freely available to all former customers when it orphans a product. From the biggest version of Windows down to the tiniest of applications, there should be no leaving a former customer stranded with no way to use their software, retrieve their data, or reinstall. If a business goes under, bankruptcy courts should seize and release this material. If a business goes under with no court supervision, the software assets are automatically deemed orphaned and released to the public domain. If a software product line is shut down, the business is mandated to do the same, and if it fails to within a reasonable time, the release is automatic. No loopholes allowed where the company claims to still support it at an exorbitant price. That hits the careless, the greedy, and the poorly run companies.
yada yada yada
Larry, can you come up with some examples where a company or open source project was found liable for non-malicious code problems? (With that term I'm trying to exclude situations like back doors intentionally and obviously added by a company and/or person to disable the software or allow unauthorized access.)
Dave, I can't, because under current law (as I understand it - IANAL), liability generally doesn't attach for software defects.
But Bruce Schneier (and others) are advocating attaching liability for software defects - holding software developers responsible for their bugs.
J. Simpson: So you can't say that your browser was "designed to be secure from the bottom up", that your database is "Unbreakable", etc?
Stuart: I'm actually not interested in trying to figure out how to make this work. I'm not a lawyer or legislator so nothing I come up with will really have any effect in the real world.
Tuesday, June 19, 2007 3:17 AM by Goran
> According to EULAs I encounter, all liability is in the court of
> the user. I mean... from EULA of my VS2003: "MICROSOFT
> AND ITS SUPPLIERS PROVIDE THE SOFTWARE AND SUPPORT
> SERVICES (IF ANY) AS IS AND WITH ALL FAULTS, AND
> HEREBY DISCLAIM ALL OTHER WARRANTIES yada-yada-yada".
> I read that as "no warranty wrt how the software works,
Yeah, but other parts of those asserted EULAs contain assertions of 90-day warranties, even though no party to those EULAs honours that 90-day warranty. I read that as a fake warranty. Recently I read rumours that some new products have fake 1-year warranties instead of fake 90-day warranties.
Tuesday, June 19, 2007 10:00 AM by Stuart Ballard
> How about "liability capped at the price you paid for the
> software/support, plus legal fees"?
In my case, if it were liability capped at the price I paid for the software, plus transportation expenses for trips to hardware vendors while tracking down the bugs (i.e. no legal fees in my case), that would be fine with me. I think I wouldn't hate a company that honoured its warranties in such a way.
Tuesday, June 19, 2007 11:21 AM by Dave
> examples where a company or open source project was found
> liable for non-malicious code problems?
Does code include firmware? For example firmware in floppy disk drives? For example a case where there weren't even any known examples of data loss but only the theoretical possibility was shown?
Um, maybe no - code doesn't include firmware. Because for firmware the answer was a huge "yes", while for software maybe the answer is still "no".
Schneier's system is also ripe for abuse. How do you assign blame for misconfiguration? Say I want to take down a big, evil software corporation... Well, I install one of their servers (in Europe, preferably) and misconfigure it slightly. Then I get my friends in Russia to hack the server (or I simply let them in), and my lawyer friends and I get on SoftCo's back in a European court. We'd have lawyers trying to decide what proper security is, and then we're done for.
But I am not a lawyer, and really should not be talking about legalisms.