There are two primary concerns for governments and organisations in Australia when considering taking advantage of Cloud Computing. #1: Security, #2: Data Sovereignty. While these are legitimate concerns, they are largely misunderstood and surrounded by Fear, Uncertainty and Doubt (FUD). We need to address this FUD directly, rather than sticking our collective heads in the sand and waving the FUD flag.
Often when I talk to organisations about cloud computing, or read articles about how someone chose to spend up to 75% more on a project rather than consider cloud computing, security inevitably comes up as the reason they spent more taxpayer money. They believe that a cloud data centre and its systems aren’t as secure as what they have in their own data centres. They feel that their data centre is run better and follows a stricter set of rules than a cloud data centre. They feel that because they run the data centre, it is more secure.
Almost all of the government and large corporate data centres in Australia are not in fact run by the organisations themselves, but by contractors. So they are already letting someone else manage data centre operations and server management. Their control extends only as far as the contract the contractor has signed. Most of these contracts are boilerplate, with guarantees for uptime, issue-management response times, costs for services, and so on. In many cases, these contracts don’t include provisions for the organisation to physically audit or touch the servers to ensure that its rules are being followed. Some even go so far as to specify that the contractor’s staff are the only ones who can touch or audit the machines, and that all reports on their health and compliance must be requested at a cost to the organisation itself. I have seen contracts for government organisations that prohibit anyone except the contractor’s employees from even entering the data centre.
So the idea that the organisation has more control if the data centre is run by “themselves” rather than a cloud provider is a red herring. The control the organisation has over the operations of the data centre stops at the paperwork. Granted, the paperwork may be very diligently crafted, but it usually favours the contractor, not the organisation. Cloud vendors provide very similar assurances and guarantees for the data centres they run. So the management of the data centre is no different whether the organisation runs the data centre or a cloud vendor does. I’d go so far as to say that it would be preferable to trust a cloud vendor who has bet a large portion of their company’s future on making sure that their cloud services are run to perfection, rather than to trust a contracting agency who has the organisation over a barrel and only demonstrates any kind of diligence when it comes time for contract renewal.
How do we manage all of this and ensure that we are dealing with reputable contractors or hosting organisations? We look to international standards for data centre operations, and information management. But since Cloud is relatively new, there aren’t any specific standards for Cloud Service Providers yet. So we fall back on the tried and true methods of IT and use the closest thing we have. In this case, the most relevant certifications are ISO 27001 / 27002 and the SAS 70 Type II attestations. Service and hosting providers have long held to these standards as demonstration that they follow best practice for managing and protecting your critical data.
The Microsoft data centres and associated operations are in fact ISO 27001 certified and have SAS 70 Type II attestations. They have also received FISMA approval. This is the same kind of evidence used to vet local data centre operations. Microsoft has been operating data centres for large-scale 24/7 systems for over 17 years, longer than most internet companies have been in business. There is a lot of experience there in high-availability, large-scale, secure data centre operations.
The golden triangle of data security is Confidentiality, Integrity and Availability (CIA). Ensuring that these tenets are met is critical to proper IT system management. Cloud providers do this by making sure that their systems are highly available and redundant. For example, data stored in applications built on Windows Azure using the Windows Azure Storage mechanisms always has three live redundant copies available at any time. If one of those copies becomes unavailable, the system seamlessly serves from the surviving copies while creating a new third replica. This high-availability scenario is much more reliable than most on-premises systems, where any kind of failure means recovery from backup.
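To make the triple-redundancy idea concrete, here is a toy sketch of the pattern: every write lands on three copies, and a lost copy is immediately rebuilt from a survivor. The class and method names are invented for illustration; this is not the Windows Azure Storage API, just a minimal model of the behaviour described above.

```python
# Toy model of triple redundancy: three live copies of every value,
# with a lost copy rebuilt from a survivor. Names are hypothetical.

class ReplicatedStore:
    """Keeps three live copies of every value; replaces a copy on failure."""

    REPLICA_COUNT = 3

    def __init__(self):
        # Each replica is modelled as a simple dict keyed by blob name.
        self.replicas = [dict() for _ in range(self.REPLICA_COUNT)]

    def write(self, key, value):
        # A write is only acknowledged once all three copies are updated.
        for replica in self.replicas:
            replica[key] = value

    def read(self, key):
        # Serve from the first surviving replica that holds the key.
        for replica in self.replicas:
            if key in replica:
                return replica[key]
        raise KeyError(key)

    def fail_replica(self, index):
        # Simulate losing a replica: drop it and immediately rebuild a
        # fresh copy from a survivor, restoring three live copies.
        survivor = self.replicas[(index + 1) % self.REPLICA_COUNT]
        self.replicas[index] = dict(survivor)

store = ReplicatedStore()
store.write("invoice-42", b"...data...")
store.fail_replica(0)            # one copy lost, rebuilt from a survivor
print(store.read("invoice-42"))  # the data is still available
```

The point of the sketch is the contrast with a single-copy on-premises server: here a failure never forces a restore from backup, because reads simply continue against the remaining live copies.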
Data integrity is provided insofar as a service provider can cover it. This one is tricky, because the application written by the customer is the part responsible for ensuring that the data is processed and written to storage correctly. While the data storage mechanism will faithfully store the data, it can’t control what the application writes. But once the data is in the hands of the storage system, it will be available exactly as written by the application.
Confidentiality is another, and perhaps the most important, aspect. In most public cloud scenarios the systems are what we call multi-tenant. That means that you share physical disks, network cables and CPUs with other customers’ systems. It is the cloud provider’s systems and operations that provide the separation between customers and ensure that one customer’s system cannot access another customer’s data. With Windows Azure, for example, there are mechanisms in place to ensure that this does not happen. There are over 10 layers of security wrapped around any system built on Windows Azure to ensure that it cannot access any resources outside of its own. Microsoft’s Global Foundation Services have published several white papers on how they implement security in Microsoft’s online services.
Now this does not excuse customers from following proper Security Development Lifecycle practices when developing their software. The vast majority of attacks today come through the application layer. If you have written a vulnerable application that runs on your servers, it will still be a vulnerable application when it runs in the cloud. Granted, you are probably better off with it running in the cloud, because only the application will be compromised rather than your entire network.
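The classic example of an application-layer flaw that no hosting environment can fix for you is SQL injection. The sketch below shows the same query written unsafely and safely; the table and function names are invented, and it runs against an in-memory SQLite database purely for illustration.

```python
# Illustrative only: the same application-layer flaw is exploitable
# whether this code runs on-premises or in the cloud.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

def find_user_unsafe(name):
    # Vulnerable: attacker-controlled input is spliced into the SQL text.
    return conn.execute(
        "SELECT role FROM users WHERE name = '%s'" % name).fetchall()

def find_user_safe(name):
    # Correct: a parameterised query keeps input out of the SQL grammar.
    return conn.execute(
        "SELECT role FROM users WHERE name = ?", (name,)).fetchall()

# A classic injection payload dumps every row from the unsafe version
# but matches nothing in the safe one.
payload = "' OR '1'='1"
print(find_user_unsafe(payload))  # leaks all rows
print(find_user_safe(payload))    # []
```

This is exactly the kind of defect SDL practices such as code review and input-validation checklists are meant to catch before deployment, wherever the code ends up running.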
Another very common topic, or excuse, for not taking advantage of Cloud Computing is data sovereignty. While this isn’t much of a problem in countries where the cloud data centres are located, it does present an issue for countries that do not have local data centres. The essence of the problem is a perception that organisations are not allowed to send data outside the geographic boundaries of their home country. In Australia, the perception is that data must stay on the island. There is no law that states data must remain on the island. There are some policies that recommend it, but nothing actually prevents it except in cases involving national secrets.
The most common bit of legislation that is quoted as the reason for the perception is the Privacy Act (in Australia). However, the Privacy Act actually states that trans-border data flows are permissible under appropriate circumstances.
9 Transborder data flows
An organisation in Australia or an external Territory may transfer personal information about an individual to someone (other than the organisation or the individual) who is in a foreign country only if:
(a) the organisation reasonably believes that the recipient of the information is subject to a law, binding scheme or contract which effectively upholds principles for fair handling of the information that are substantially similar to the National Privacy Principles; or
(b) the individual consents to the transfer; or
(c) the transfer is necessary for the performance of a contract between the individual and the organisation, or for the implementation of pre-contractual measures taken in response to the individual's request; or
(d) the transfer is necessary for the conclusion or performance of a contract concluded in the interest of the individual between the organisation and a third party; or
(e) all of the following apply:
(i) the transfer is for the benefit of the individual;
(ii) it is impracticable to obtain the consent of the individual to that transfer;
(iii) if it were practicable to obtain such consent, the individual would be likely to give it; or
(f) the organisation has taken reasonable steps to ensure that the information which it has transferred will not be held, used or disclosed by the recipient of the information inconsistently with the National Privacy Principles.
It is section 9(f) that most commonly applies in Cloud Computing environments. That being said, no sensible cloud provider would recommend letting any data above Highly Protected (Secret) go outside the country. There are some systems and cases where public cloud services simply aren’t practical. This is where data classification strategies are a necessary part of any IT strategy. A proper data classification strategy, combined with a hybrid architecture, can alleviate the concerns around Data Sovereignty.
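A classification strategy ultimately boils down to a gate: given a record’s classification level, may it be placed in an offshore public cloud? Here is a minimal sketch of that gate. The level names loosely follow the Australian scheme mentioned above, but the enum, the threshold, and the function are hypothetical, and the rule is deliberately conservative (only data strictly below the Highly Protected boundary is eligible).

```python
# Hypothetical data-classification gate. Levels loosely follow the
# Australian scheme referenced in the text; names and threshold are
# illustrative, not taken from any real policy framework.
from enum import IntEnum

class Classification(IntEnum):
    UNCLASSIFIED = 0
    PROTECTED = 1
    HIGHLY_PROTECTED = 2  # treated here as the "Secret" boundary

# Conservative reading of the policy above: only data strictly below
# the Highly Protected boundary may go to an offshore public cloud.
OFFSHORE_LIMIT = Classification.HIGHLY_PROTECTED

def may_go_offshore(level: Classification) -> bool:
    return level < OFFSHORE_LIMIT

print(may_go_offshore(Classification.UNCLASSIFIED))      # True
print(may_go_offshore(Classification.HIGHLY_PROTECTED))  # False
```

In practice the gate would sit in the deployment pipeline or the data access layer, so the decision is made by policy rather than by each developer’s judgment.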
In a Hybrid Architecture, some parts of the system are deployed to a public cloud, critical parts are kept on-premises, and the two are connected by a secure encrypted communications link. The diagram below gives a high-level overview of the layout of a Hybrid Architecture.
This allows an organisation to keep necessary data on-premises, while being able to deploy public- or partner-facing applications to the cloud. An additional benefit is that public-facing applications and interfaces no longer have to be kept on the organisation’s infrastructure. This means that you no longer have anonymous users touching machines on your network, and you can close down some of those attack vectors.
Data Sovereignty can be addressed by keeping the master copy of all data locally, while syncing public or non-protected data to cloud storage. You get all the benefits of locally protected data while being able to take advantage of the savings and ease of deployment that Cloud Computing provides.
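The master-copy pattern can be sketched in a few lines: every record lives on-premises, and a sync job pushes only the records classified as public into cloud storage. The record shape, classification labels, and function name are all invented for illustration; a real sync job would also handle updates and deletions.

```python
# Illustrative sketch of the master-copy pattern: the on-premises list
# is authoritative, and only public records are replicated offshore.
# Record shape and names are hypothetical.

def sync_to_cloud(local_records, cloud_store):
    """Copy only public records into the (dict-like) cloud store."""
    for record in local_records:
        if record["classification"] == "public":
            cloud_store[record["id"]] = record["body"]
    return cloud_store

master = [
    {"id": "press-release-1", "classification": "public", "body": "..."},
    {"id": "citizen-record-7", "classification": "protected", "body": "..."},
]

cloud = sync_to_cloud(master, {})
print(sorted(cloud))  # only 'press-release-1' was replicated offshore
```

Because the filter runs before anything leaves the premises, protected data never transits the link at all, which is the property the sovereignty argument depends on.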
While these issues may spring to mind any time someone mentions Cloud Computing, there are answers for them. Microsoft’s Windows Azure can offer a solution to the Data Sovereignty problem through Hybrid Architectures. The GFS-managed data centres are some of the most secure data centres in the world and have a 17-year track record of high availability. More and more, the use of cloud computing will open up new markets, make achieving Gov 2.0/3.0 easier, and above all save money in a tight economy. Maybe you should think ‘Cloud First’.