With its massive data centers, cloud computing delivers virtually infinite resources, providing the storage capacity and processing power to tackle some of the world’s toughest problems in healthcare, the environment, energy, scientific discovery, and many other fields. As a hub for all data and information, it will enable us to capture, store, index, parse, and recall as much of our day-to-day lives as we choose to share. It will also provide a platform for orchestrating the flow of information and technology across our lives so that we always have instant access to the tools and information we need. Fundamental breakthroughs in massively parallel computing will let us see patterns in data and turn them into actionable intelligence. The influence of the cloud today is undeniable: more than 80 percent of new apps in 2013 will be distributed or deployed on clouds.
The immense number of digital devices in our world is driving an explosion in data. In fact, according to analyst firm IDC, the “volume of digital content will grow to 2.7ZB (1 zettabyte = 1 billion terabytes) in 2012, up 48% from 2011, rocketing toward 8ZB by 2015.” Deep analysis of this vast amount of data is enabling computers to begin to understand the physical world and to behave in a more human way, anticipating our needs and understanding our intentions.
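As a sanity check on those figures, the arithmetic below backs out the implied 2011 volume and the annual growth rate needed to reach 8ZB by 2015 (a rough illustration of the projection, not data from IDC's report):

```python
# Sanity-check of the projected digital-universe figures:
# 2.7 ZB in 2012, up 48% from 2011, heading toward 8 ZB by 2015.
ZB_IN_TB = 1_000_000_000  # 1 zettabyte = 1 billion terabytes

zb_2012 = 2.7
zb_2011 = zb_2012 / 1.48                 # back out the 2011 volume
cagr = (8.0 / zb_2012) ** (1 / 3) - 1    # implied 2012->2015 annual growth

print(f"2011 volume: {zb_2011:.2f} ZB ({zb_2011 * ZB_IN_TB:.2e} TB)")
print(f"Implied annual growth to reach 8 ZB by 2015: {cagr:.0%}")
```

The implied growth rate works out to roughly the same 40-plus percent per year, which is what makes the "rocketing" characterization plausible.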
Social computing has already changed how we create and maintain our connections with others. But the world of social computing remains highly fragmented. The lack of integration creates frustrating disconnects that are inevitable when we are forced to switch between services and applications to stay up to date. Social computing will undergo a dramatic transformation as technology advances make it possible to weave our social lives more deeply and more seamlessly into every aspect of our digital lives, so that information from our social networks can provide insights to guide us in the real world and online. Social networking itself will also change, becoming far more visual and less text-centric.
Our view of what defines a computer is changing as previously “unintelligent” objects are gaining intelligence, becoming connected, and joining the ecosystem of computing. We are entering the era of an “Internet of things” in which almost any object can be connected to the Internet and collect data that contributes to a global web of knowledge. Virtually every type of product is becoming part of the computing ecosystem—from cars, phones, and houses to scales, cameras, power meters, and televisions. Many of the computers you’ll interact with in the future will be in devices that we don’t think of as computers today. In effect, computing is becoming increasingly invisible.
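To make the "Internet of things" idea concrete, here is a hypothetical sketch of how a connected power meter might package a reading for a collection service. The device identifier, field names, and endpoint are all assumptions for illustration, not a real product's API:

```python
# Hypothetical sketch: a "connected" power meter reporting a reading
# as a small JSON document. Field names and IDs are invented examples.
import json
from datetime import datetime, timezone

reading = {
    "device_id": "meter-0042",   # assumed identifier scheme
    "kind": "power_meter",
    "watts": 1250.5,
    "timestamp": datetime.now(timezone.utc).isoformat(),
}

payload = json.dumps(reading)
print(payload)
# In practice the device would POST this payload to a collection
# service, contributing one more data point to the global web of
# knowledge the article describes.
```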
As consumer devices become increasingly powerful, affordable, and prevalent, it is inevitable that they will make their way into the enterprise. Today’s users (in particular, millennials who have never known a time without those devices) want and expect the same kind of technology experiences at work as they have at home, and they don’t want to segregate their work and personal computing across different devices. According to a Unisys study (conducted by IDC), 95% of information workers now use at least one self-purchased device for work. This means greater flexibility for the business and an inevitable extension of working hours and locations, but it also presents new security and management challenges for IT.
Increasingly we will be connected at all times to people, information, services and applications without requiring any specific action on our part. This will liberate the information that we have created ourselves and unlock any information from any source that might be relevant to where we are and what we are trying to accomplish, bringing everything we need together seamlessly in the form that is most useful.
More natural ways to interact with technology are rapidly emerging—multi-touch, voice, vision, gestures, and many more. This means that for the first time, computing will adapt to us and demonstrate some degree of “intelligence.” This trend will see computers shift from being tools to being helpers, performing tasks on our behalf based on an “awareness” of the environments we are in and the context of our actions. Ultimately, this will enable computing interfaces that are far more natural and increasingly simple to use, helping eliminate the learning curve of today’s technology.
A major focus of machine learning research is the design of algorithms that recognize complex patterns and make intelligent decisions based on input data. Applications for machine learning include natural language processing, voice recognition, robotics, search engines, and so forth. These technologies help people get things done faster and more effectively, for example with predictive results in Bing search.
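A minimal sketch of what "recognizing patterns and making decisions based on input data" can mean in code: a nearest-centroid classifier, one of the simplest learning algorithms. The training data and labels below are toy examples invented for illustration:

```python
# Minimal sketch of learning from input data: a nearest-centroid
# classifier. Training computes one mean point per class; prediction
# assigns a new point the label of the closest mean.
from math import dist

def fit(samples, labels):
    """Compute one centroid (mean point) per class label."""
    centroids = {}
    for label in set(labels):
        points = [s for s, l in zip(samples, labels) if l == label]
        centroids[label] = tuple(sum(c) / len(points) for c in zip(*points))
    return centroids

def predict(centroids, point):
    """Assign the label of the nearest centroid."""
    return min(centroids, key=lambda label: dist(centroids[label], point))

# Toy training data: two clusters in a 2-D feature space
X = [(1.0, 1.2), (0.8, 1.0), (5.0, 4.8), (5.2, 5.1)]
y = ["low", "low", "high", "high"]

model = fit(X, y)
print(predict(model, (0.9, 1.1)))  # a point near the "low" cluster
```

Real systems like search ranking or voice recognition use far richer models, but the shape is the same: fit parameters to observed data, then use them to decide about new inputs.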
Is your agency ready for the future? Is your IT platform enabling some of these new trends? As government competes for skills in the marketplace, it will have no choice but to adapt quickly to this changing reality.