Next month I will have a unique opportunity to influence the design of future devices for clinical computing. Our OEM (original equipment manufacturer) group at Microsoft has invited me to the Consumer Electronics Show (CES) in Las Vegas. My mission? Meet with our OEM partners to talk about trends in clinical computing and help them design the next generation of devices that physicians and other clinicians will use in hospitals, clinics, and at home.
That’s a pretty tall order. Of course, being as close to the industry as I am, I have lots of my own thoughts about this. I’ve been involved in many such discussions with OEMs in the US and abroad over the years. But I’ve not had an opportunity to meet with so many vendors at one time in one place. That’s why I’m looking forward to CES.
The fact that I am doing this at CES speaks volumes about how the industry itself is changing. Today, one of the biggest trends in clinical computing is doctors, nurses, and other clinicians purchasing consumer devices like ultrabooks, tablets, and smartphones and bringing them to work. These dedicated health professionals increasingly expect to be allowed to connect their personal devices to the enterprise network and go about their work. This can cause all kinds of headaches for IT staff who must worry about the privacy and security of patient and financial data. Another cause for concern is the possibility that these consumer devices might introduce malware or viruses to the hospital or clinic network. In fact, this is such a hot topic that I'll be participating in a special webcast on this subject on Wednesday, December 7th. Here is more information about that.
I’ve stated on HealthBlog before that I believe technology has finally matured to the point that robust devices can now be made to meet most of the requirements of clinicians and clinical workflow. You’d have to be living on another planet not to know about the influx of smartphones and slate computers (especially the iPad) into healthcare. Manufacturers can now offer devices with much improved battery life, brilliant displays, light weight, mobile connectivity, and more intuitive user interfaces. Unfortunately, most of the consumer devices currently on the market fail to deliver what clinicians need in other ways. They may not hold up to disinfectants or other cleaners. They are often not rugged enough to withstand sustained clinical use. The displays, while excellent, may not offer sufficient resolution to meet requirements for medical images. Consumer devices may lack the input and output ports needed for connectivity to other devices like barcode scanners, pumps, printers, projectors, and digital physiological equipment and monitors. I could go on, but I think you get the point.
I’ve also said before that there really is no such thing as a single device that meets every clinician’s needs. Chances are you will want to use different kinds of devices depending on where you are and what you are doing. Out to dinner with your spouse? You’ll probably want to connect to the data you need on your smartphone. Rounding in the hospital? You’ll likely want something with a bit more screen real estate. Doing a history and physical? That means lots of data input, so a keyboard, digital ink, or better yet, robust speech recognition is needed. Checking on a patient in the ICU? You may want a 42-inch monitor to survey all that data.
So, recognizing that no single device can meet all your needs, if you had an opportunity to tell device manufacturers what’s missing from today’s solutions that you’d like to see in future devices, what would you tell them? Write your ideas in the comments section here on HealthBlog, and I’ll be sure to vet as many of your ideas as possible when I meet with the OEM community at CES next month.
Bill Crounse, MD, Senior Director, Worldwide Health, Microsoft
Very nice post.
My experience is that the problem is not at the device layer; it is actually the application (not OS) support for different interfaces like touch. Look, an iPad is great, just like the new Dell Latitude ST; the problem is that the PACS/RIS app or the EMR, ED, or surgical app just isn't designed for a tablet ecosystem. Generally the app is delivered via a Citrix layer, so the OS has little part to play. You also have to bear in mind that the solution, whatever it may be, must work straight up. It cannot be developed on the fly but must be mature and workable from the outset, or you are just getting in the way of good patient outcomes.
As a clinician I can tell you that your focus on devices is not the issue; the issue today is documentation time and cost. Laptops, tablets, and smartphones with appropriate operating systems like full versions of Windows or iOS, with appropriate apps, provide access to EMR data like histories, labs, radiology, and pathology.
However, after seeing patients all day, scheduling, then email and phone calls, another six hours can be spent verbally describing what went on with each patient. For reimbursement, to prevent fraud, and to avoid having to take time searching through old records, much of the data is REDUNDANT with the exception of that day's events. This redundant data takes hours to repeat each day. We need software that will automatically populate the latest form being created with unchanged data, so the physician only has to describe new events for each day. This would save huge amounts of time and transcription cost, not to mention clinician voices, the vasculature of those sitting at a desk, and transcribers' carpal tunnels :)
I also think Steve Jobs was wrong about a stylus and Bill Gates was correct: it has a role in healthcare. We spend hours writing and then more hours dictating what we took notes on. It would be great to write on a tablet on electronic forms (created with InfoPath, Adobe software, etc.) and have software that accurately converts the handwriting to digital text and puts it into the EHR. Hence hours of time and money saved.
I would be happy to consult on the development of such products (unless they already exist and I am not aware of them). IT staff at my previous institutions have said what I have described above is impossible. I say it can be done and will improve healthcare.
Dear Windowsdoc. Thanks for sharing your wisdom. You and I are totally on the same page. In fact, I've been saying for many years now that the device isn't nearly as important as the software. Of course, the software is often dependent on the device, so you cannot look at one in isolation from the other. I've also commented many times that data input remains the holy grail of clinical workflow, and I agree that we need greater automation, not only to facilitate input but also to integrate with the record as you have outlined. I'll be sure to bring up these issues when I meet with our partners in Vegas. Thanks for taking the time to provide your thoughts.
Bill Crounse, MD
Just one more comment. Having finally recovered after two years of dealing with sports-related rotator cuff calcifications and impingement, I've grown very sensitive to the topic of ergonomics and repetitive-motion injury.
At one of the recent Apple events, the late Steve Jobs commented on touch screens for iMacs, saying something to the effect that it won't work; they had looked into it, and you'd only want it "if you want your arm to fall off."
When I heard this, I thought bravo Steve (and/or his team that was alert enough to recognize the problem).
With Windows 8 they keep talking about touch on multiple form factors. Although touch appears to be great for tablets and smartphones, I'm concerned about desktop screens. Interestingly, I have an older HP touch-screen all-in-one for family use, but I keep the touch function turned off; I've never used it and didn't buy the machine for that capability. All I can say is that touch is NOT for desktop screens. The human shoulder was not designed to reach up and forward to a screen all day. I'm sure that after a period of months or years this type of repetitive motion would cause shoulder injuries.
I hope others realize that touch on desktop screens could lead to numerous shoulder problems, suffering, and health care dollars spent on an avoidable problem.
Thanks again for weighing in. I've always been adamant that clinicians need multiple ways to interact with computers depending on workflow, data acquisition vs. data entry, device, etc. Ideally, this would include touch, no-touch (gesture), speech, handwriting recognition, point and click, keyboard and perhaps other modalities not yet invented. How we interact with the devices we use is as important as what we do with them. And, of course, if we do anything that is repetitive in excess we risk injury. So, the secret is to take breaks, stretch, and vary the ways we work. Recently, I installed a device in my office that lets me quickly adjust my desktop arrangement from sitting to standing. It has made a huge difference in my productivity and comfort level throughout the day.
I couldn't agree more that getting input right is the key to this. Too many clinicians spend time tapping on their keyboard and not focusing on their patient, which really disrupts that important part of medicine. I agree that writing is key, and the big failure of the iPad for healthcare is the lack of an accurate stylus (although it succeeds in many other areas). I also agree that speech, gestures, etc. are key.
We do need all-day battery life. We also need something that is suitably portable yet robust enough to survive clinical practice. The technology is finally catching up with the great ideas from 10 years ago, but is it there yet?
As others have said, the interface is the key. I don't yet see an EMR with the rigid focus on usability and simplicity that is so required.
Thanks for responding. I'll be sure to incorporate your thoughts, and those of all the other clinicians who've responded, when I meet with device manufacturers and health industry partners at CES in Las Vegas.