When you think about the future of clinical computing, what images come to mind? In five or ten years, will we still be using desktop PCs, computers on wheels, laptops and Tablet PCs? Will the keyboard and mouse be as prevalent as they are today? What kinds of devices and interfaces will clinical staff be using? How will device and workflow changes impact the design of our healthcare facilities of tomorrow?
Designers, architects and engineers of healthcare facilities must always be thinking about the future. So too must the executives, managers and staff of our hospitals and clinics. It is far too easy to imagine the future as a simple extension of the world we experience today, but that can lead us in entirely the wrong direction. Over the span of my own career, I’ve noted monumental changes in business communication and collaboration: from telephone “while you were out” memos stacked on my desk after lunch to instant messaging, e-mail, voice mail, and web conferencing that keep me continuously connected no matter where I am.
Medical computing started out much as computing did in other industries, with desktop workstations strategically placed around the hospital ward. As laptops gained popularity, they were mounted on wheeled carts to improve workforce mobility and provide longer battery life. Tablet PCs became popular, particularly as new models designed specifically for the healthcare industry became available. More recently, some manufacturers have combined the best attributes of tablets and desktops into a new generation of “computers on wheels.” But what can we expect in the future?
I believe clinical computing may look quite different than it does today. I think we’ll see a transition to much larger displays that take advantage of touch and multi-touch navigation, married to increasingly sophisticated speech recognition solutions for data input. If you’ve seen any of our “future vision” videos, you’ll note that large displays with touch screens are quite prevalent in the scenarios we paint. So too are smaller, pocket-sized devices that communicate wirelessly with their larger-screen cousins whenever the user needs more screen real estate.
You might have had a premonition of this future vision the first time you saw Microsoft Surface, or dare I say, the first time you played with an iPhone. With the launch of Windows 7, you’ll see more and more software applications and computer screens that take advantage of “touch” as a user interface. With Windows 7, if you've got a touch-screen monitor, you can just touch your computer screen for a more direct and natural way to work. Use your fingers to scroll, resize windows, play media, and pan and zoom. Windows 7 also introduces support for new multi-touch technology, so you can control what happens on the screen with more than one finger. For example, you can zoom in on an image by moving two fingers closer together, like you're pinching something, or zoom out by moving two fingers apart. You can rotate an image on the screen by rotating one finger around another, and can right-click by holding one finger on your target and tapping the screen with a second finger.
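Under the hood, the pinch-zoom and two-finger rotate gestures described above reduce to simple geometry on a pair of touch points. Here is a minimal, platform-neutral sketch of that idea in Python (the function names are my own for illustration; this is not the Windows 7 touch API, which delivers gestures through Windows messages):

```python
import math

def _distance(p, q):
    """Euclidean distance between two touch points (x, y)."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def pinch_scale(start_touches, current_touches):
    """Zoom factor implied by a two-finger pinch.

    Each argument is a pair of (x, y) touch points. A result > 1 means the
    fingers moved apart (zoom in); < 1 means they pinched together (zoom out).
    """
    d0 = _distance(*start_touches)
    d1 = _distance(*current_touches)
    return 1.0 if d0 == 0 else d1 / d0

def rotation_angle(start_touches, current_touches):
    """Degrees one finger has rotated around the other since the gesture began."""
    def angle(p, q):
        return math.degrees(math.atan2(q[1] - p[1], q[0] - p[0]))
    return angle(*current_touches) - angle(*start_touches)
```

For example, if two fingers start 10 pixels apart and spread to 20 pixels, `pinch_scale` returns 2.0, and the application would scale the image accordingly; a real gesture recognizer would also debounce jitter and track touch-point identity across frames.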
Now imagine all of that combined in an environment that allows you to move seamlessly from one device to another, being “recognized” by each device as you approach it. Perhaps you’ll be wearing a small, Bluetooth-enabled, bone-conduction microphone. You will touch items on the screen, gesture with your hand, or use your voice to open and close windows. You will ask questions or give directions and the computer will respond. You will use natural hand gestures or touches to navigate and move from one workflow to another. You’ll transition from a large screen display to a portable device in your pocket to another large screen on the wall and pick up your work exactly where you left off each time along the way.
I believe this is the future of clinical computing. In fact, I think we are already seeing this natural evolution in the products and solutions that are hitting the market and certainly in those that will shortly be coming to market.
Bill Crounse, MD, Senior Director, Worldwide Health, Microsoft
Excellent essay! The future sure looks exciting...
Nice post. I think you hit some important near-term interface opportunities, but it also raises questions about the back-end improvements that are coming. The user interface will be an important game changer, but the ability to leverage information from electronic health records and other sources to improve clinical decision making will be just as important. Considering the impact that knowledge, through connecting important dots, has had on the history of medicine, I can only imagine what we can achieve in the next 10 years with improvements in processing power and communications. All will add significant value to the 3Ps (Patient, Provider and Payer).
A very thought-provoking piece. What intrigues me is what will need to happen under the hood to make this possible. Such rich user interfaces and complex functionality across a whole domain will not be possible from a single vendor. Software vendors will need to focus on which piece of the jigsaw they want to be really good at, and the result will need to be orchestrated from the pieces of many vendors into seamless user interfaces.
Thanks for your comment. At least part of the "jigsaw" might be solved through "clinical groupware". See my latest HealthBlog post for more on that. As for the user interface, check out the work we've been doing in the UK with the National Health Service www.mscui.net.
Bill Crounse, MD
This looks like the future of clinical computing as seen from the perspective of a large US teaching hospital (and the MS product line)
Nothing wrong with that of course, but teaching hospitals represent a tiny (and very expensive) fraction of clinical care.
Thanks for writing. While the images I used may convey what you suggest, it is certainly not the case that these tools and technologies only apply to large academic centers and teaching hospitals. In fact, some of the most creative applications of these technologies that I see around the world are actually happening in small private practices and retail health centers where innovative clinicians are using commodity and widely available ICT solutions to drive organizational efficiency and improve communication and collaboration with care teams and patients.
The idea of voice-controlled interface navigation intrigues me: that we may be able to open a file from our database not by navigating through a long series of folders, but by simply asking the computer for what we need (not to mention fixing my difficulty with sticky buttons and dodgy touch screens). Great post.