The Best Whitepaper I’ve Read In Years: Cloud Computing from UC Berkeley

For anyone interested in Cloud Computing, a new report by the UC Berkeley Reliable Adaptive Distributed Systems Laboratory is an absolute must-read. Like my pal Daz, I’ve sat through too many boring whitepapers over the years to get excited easily, but this one stands out as the best I have read in a very long time. It’s genuinely useful, insightful and unbiased.

Above the Clouds walks through cloud computing and discusses Amazon, Google, Microsoft and others. It also does a terrific job of bringing some long-overdue definition to widely overused terms, and exposes many of the metrics behind the economies of scale that cloud computing brings. I particularly liked their explanation of why, even if cloud computing were more expensive over time than on-premises hardware, cloud may still work out cheaper because of the “elasticity and transference of risk”.
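The elasticity point can be sketched with simple arithmetic: pay-as-you-go can beat owned hardware even at a higher hourly rate, because a private datacenter must be provisioned for peak load while average utilization is far lower. A minimal illustration of that comparison (all rates and the utilization figure below are my own hypothetical numbers, not taken from the paper):

```python
# Sketch of the cloud-vs-datacenter tradeoff: the cloud charges a higher
# rate per server-hour, but an owned datacenter pays for idle capacity
# provisioned for peak load. Illustrative numbers only.

def hourly_cost(rate_per_server_hour, servers_in_use, utilization=1.0):
    """Effective cost per hour of useful work.

    For the cloud, utilization is ~1.0 (you pay only for what you use).
    For an owned datacenter, divide by average utilization, since capacity
    sized for peak demand is paid for even when it sits idle.
    """
    return rate_per_server_hour * servers_in_use / utilization

# Hypothetical assumptions:
cloud_rate = 0.10        # $/server-hour, pay-as-you-go
dc_rate = 0.05           # $/server-hour, amortized owned hardware
avg_utilization = 0.3    # datacenter sized for peak, mostly idle

cloud = hourly_cost(cloud_rate, servers_in_use=100)
owned = hourly_cost(dc_rate, servers_in_use=100, utilization=avg_utilization)

print(f"cloud: ${cloud:.2f}/hour")   # $10.00
print(f"owned: ${owned:.2f}/hour")   # $16.67 despite the lower sticker rate
```

With these made-up figures the owned hardware looks cheaper per server-hour, yet costs more per hour of useful work once utilization is factored in, which is the shape of the argument the paper makes.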

They do a great job of answering the questions they set out to:

  • What is Cloud Computing, and how is it different from previous paradigm shifts such as Software as a Service (SaaS)?
  • Why is Cloud Computing poised to take off now, whereas previous attempts have foundered?
  • What does it take to become a Cloud Computing provider, and why would a company consider becoming one?
  • What new opportunities are either enabled by or potential drivers of Cloud Computing?
  • How might we classify current Cloud Computing offerings across a spectrum, and how do the technical and business challenges differ depending on where in the spectrum a particular offering lies?
  • What, if any, are the new economic models enabled by Cloud Computing, and how can a service operator decide whether to move to the cloud or stay in a private datacenter?
  • What are the top 10 obstacles to the success of Cloud Computing—and the corresponding top 10 opportunities available for overcoming the obstacles?
  • What changes should be made to the design of future applications software, infrastructure software, and hardware to match the needs and opportunities of Cloud Computing?

My favourite line from the paper:

Physics tells us it’s easier to ship photons than electrons; that is, it’s cheaper to ship data over fiber optic cables than to ship electricity over high-voltage transmission lines.

Thanks UCB, this is a keeper, and it was great to see Jim Gray referenced too. The update on his work in Table 5 within the document is fascinating.

I can’t remember the last time I read something this long that wasn’t a novel and enjoyed it. It’s now in the inboxes of many of my friends at Microsoft and beyond.

Thanks to Andreas for bringing it to my attention.

  • Steve,

    I'm surprised that you left out the more telling comment in the paper regarding the legality of moving data across national borders. This is something that European companies recognise, and last year Fujitsu Siemens announced that the datacentres it would use for SaaS would not export data from the country of creation.

    This, of course, is the complete opposite of Microsoft's stated policy, as given by Microsoft legal, that Safe Harbour is sufficient to protect non-US customers. As Berkeley point out, it isn't: Safe Harbour has no provisions for protecting data against the Patriot Act in the US or, conversely, the RIP Act in the UK. Other countries also have their own legislation in this area.

    None of this, of course, is new. Ever since companies began putting data into the hands of large hosting companies, this situation has been in play, but oddly enough it gets ignored by many in this industry and by many customers, who need to rethink some of their outsourcing and take more notice of compliance laws.

  • Ian

    Totally understand your points, but on points of law I tend to steer clear. It's not my forte and by its nature is something best left to those who know...

    What I would say is we're definitely not ignoring it.

    I'm intrigued to see how Google deals with this in GAPE, given it's built (I assume) on the same platform as search, which is distributed by nature and therefore makes it hard to pinpoint precisely where your data lives within their network of data centers at any point in time. As a search user I really don't care where my result gets served from, but if I'm using enterprise email, I really do care. The clash of consumer-oriented cloud platforms with enterprise-oriented ones will be a very interesting dynamic.

    Steve

  • Steve

    Believe me, I try to ignore points of law as well but, well, let's not play that game :)

    On a serious note, though, how do you see Cloud computing developing given the problems of data privacy and national differences? For example, if you don't know where the data is backed up, how can you tell a customer the data is "safe"? Although that does, of course, depend on your definition of safe.

    There is also a developer question here as well. The failure of UDDI to deliver approved components from multiple sources that could be added to enterprise software has been embarrassing. Remember all those wonderfully hopeful statements about how accounting groups and others would come together to vet web services and create vibrant marketplaces?

    If we couldn't get UDDI to work with all the hype and pressure, where will the validated components and platforms come from for developers? I've looked through what passes for documentation on Azure (and it is woefully inadequate) and there is little there to show any mechanism by which code signing will be made stronger or how, as a developer, I will be able to use an authorisation and approval authority when working in the Cloud.

    Any chance of prompting the dev team to talk about this?
