It isn’t really an issue of “is this a secure platform on which to write this code”; it’s more an issue of “does this platform comply with a standard that has been created by somebody else in an attempt to make my life easier by doing all my thinking for me”.

When a standard, certification, accreditation, attestation etc. is created, it’s done to save time and money. You can spend a great many hours, weeks, months poring over code, testing a platform and checking the combination of the platform and code for security holes. But security holes come in so many shapes and sizes.

One of the biggest security problems exists in the last 40-50cm of the connection: the air-gap between the screen and the user’s eyes. For many evil people this is a much easier attack surface than all the technical stuff. As technologists, for the past 15 or so years, we have taught people that whenever they see a web page with a username and password prompt on it, they should fill it in and click Login. To an evil person, this Pavlov’s Dog response is perfect. I’m not aware of a certification or standard that takes this part of the channel into consideration (but if you are, please leave a comment). And yet many systems that exhibit this very serious weakness have every security compliance certification and accreditation you could ever imagine.

So many times compliance stops within the boundaries of the technology, and sometimes the people and processes that surround the management and administration of the technology. A system manager might proudly announce that he has a technology estate that enjoys this accreditation and that certification, and he may live under the impression that he is running a secure system. What he might be running is merely a compliant system.

I used to think about this when I worked for DEC (Digital Equipment Corporation – yes, I’m old), at one time the world’s second biggest computer company, behind IBM. They had a nifty range of mini-computers with purple and pink switches, among them the PDP 11/70.

DEC PDP 11/70

I often wanted to create a Purple accreditation. It would specify the exact Pantone shade of purple the switches had to be, how many switches there should be, their shape and position, and so on. This would have produced Purple Compliance in the case of the PDP 11/70. But would that have made it a good computer? Nope. It would have made it compliant.

If, say, a bank wants to procure a system and the legislation in that country says the system has to be certified to standard x or y, well, they have no choice. All systems that don’t enjoy the certification are automatically rejected. It’s when a team has the freedom to determine their own security that I think a degree of dysfunctional thinking often comes into play. When you make the decisions, you can set the bar as high or as low as you like for security. If you set it high, it will necessarily be more inconvenient for users and will provoke complaints. If you set it very low then the chance of a security breach increases. All system security is about exposing the system to risk, high risk or low risk.

Security accreditations merely try to give a defined set of principles that somebody thinks are important for you to implement. They take away not only the thinking but, I’d argue, to a degree, the responsibility. Maybe this is why teams adopt them even when there is no pressure from, say, legislation, regulators or the law to adopt them. In some big companies there are even departments called “Compliance”. They employ “Compliance Officers”. I often think how awful it must be to be hamstrung by these guys. You are just trying to create software and they are on your back all the time – “is it Purple Compliant?”. They are a bit like “Health and Safety”, the group that stops you from doing your job.

I think we have to avoid using the terms “compliance” and “security” as if they are the same thing. You can easily build an insecure yet compliant system, in the same way you can build a highly secure non-compliant system. I’m not saying compliance is always a bad thing. I’m saying there is a sense that an organisation can’t possibly own and operate a system unless it’s compliant, because otherwise it’ll be insecure. There is also a sense that compliance brings with it a sense of security. Unfortunately, this is often a false sense of security.

No Compliance Officers were harmed or killed in the writing of this blog post.