I'm skeptical of sweeping claims about changes in computing, but I do think there's been a shift in how we think about its components since the field's inception. When computing entered the mainstream, mainframes processed the information, stored it, and then delivered a "thin" result to a terminal. Computing power - from memory to the processors - was where researchers, developers, hardware companies and administrators spent most of their time.

When the PC revolution took place, computing and memory shifted to both the server and the local workstation. Emphasis on storage became paramount - and we're in that very age now. Terms like "data explosion" and "infoglut" were coined to capture where organizations spent their money and placed the most emphasis. Government regulations even feed this need for greater storage - most companies have to retain e-mails and other information to meet those regulations.

We're entering a new age, or at least a new emphasis in computing. And although we retain the gains we made in processing, memory and storage, the next component to become paramount is the network. Smartphones, tablets, RFID devices, smart tags, geo-enabled devices, and yes, the cloud, all depend on a robust network. And not just wired connections - I read a report yesterday about the work at large colleges to allow hundreds, and in some cases thousands, of students to be in the same room and still use wireless devices such as phones and laptops. Who can forget Steve Jobs's famous request for everyone at his presentation to turn off their phones so that he could finish a demo?

While "cloud" (or utility) computing should be on your radar, do not ignore the network, or the companies that provide it. Don't be surprised if you're asked how much bandwidth the latest product you want to install uses, or for information about your wireless spectrum and channels. The new age is here.