As an on-demand society, we rely on our smartphones, tablets, and other cloud-connected devices to provide immediate access to information, complementing and enhancing every aspect of our lives. And when physicians go to work, they don’t set these expectations aside.
Face it: if the odds were high that your paycheck would not be correctly deposited into your online bank account, you wouldn’t bank online. But that’s not the case. Our general experience is that digital banking is highly accurate, and although errors are possible, we know they’re not probable. So, when we log in to our accounts to make deposits, transfer money or pay bills, we expect and assume those requests will be properly executed, on time and without incident. The impact this automated service has made on our everyday lives is significant: no more waiting in lines, no more rushing to get to the bank before closing time; quite simply, less hassle.
There was a time, however (not so long ago), when customers shied away from the idea of online banking because it was new and uncertain. But, as this technology became more advanced and secure, people developed confidence and flocked to it because deposits appeared correctly, bills were paid – all by the touch of a button or the use of their voice. The nature of technological evolution is that acceptance develops into confidence which, in turn, becomes reliance.
Consider a field such as natural language understanding (NLU), where we are seeing continual and rapid advancements in accuracy, typically measured in terms of precision and recall. This technology is quietly becoming deeply embedded in the devices we use daily to get our jobs done, making everyday life a little more convenient and productive. NLU is being built into applications to understand context and identify relationships found within documents and commands, and when it is leveraged in a user interface, it makes the user feel like the application is “predicting” what he or she wants to do next. Over time, this capability will blend into the tapestry of the applications we touch every day and become second nature to us. And as fact-extraction accuracy rises to the level of human agreement, the application of NLU will push the limits of what we have come to expect as a great (and productive) user experience.
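To make the metrics above concrete, here is a minimal sketch of how precision and recall are typically computed when scoring a fact-extraction system against a human-annotated reference. The facts themselves are made up for illustration and do not come from any real CLU engine.

```python
# Illustrative only: compare a system's extracted facts against a
# human-annotated reference set for one document.
extracted = {"dx:hypertension", "rx:lisinopril", "rx:aspirin"}
reference = {"dx:hypertension", "rx:lisinopril", "dx:diabetes"}

true_positives = len(extracted & reference)          # facts found and correct
precision = true_positives / len(extracted)          # of what was extracted, how much is right
recall = true_positives / len(reference)             # of what should be found, how much was
f1 = 2 * precision * recall / (precision + recall)   # balance of the two

print(f"precision={precision:.2f} recall={recall:.2f} f1={f1:.2f}")
```

Here both precision and recall come out to 2/3: the system extracted one spurious fact and missed one reference fact. “Human agreement” in this context means the system’s scores approach what two trained human annotators achieve when scored against each other.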
Physicians need the convenience of anytime, anywhere access to critical information without being tethered to a computer, and they need to get patients what they need without jumping through hoops. Clinical Language Understanding (CLU), which is NLU purposed specifically for the medical domain, is helping physicians navigate electronic medical records (EMRs), perform basic computerized provider order entry (CPOE) tasks such as ordering lab tests and medications, or, when combined with speech recognition, create a patient note in seconds. For all of these use models, the CLU engine is becoming deeply embedded in the core of cloud-based applications. It quietly operates in the background, simultaneously extracting facts and processing evidence through a variety of domain-specific knowledge bases. For example, when a physician documents a patient note, the engine drives Computer-Aided Physician Documentation (CAPD) queries back to the clinical teams, creating real-time decision support.
On the cutting edge of CLU’s application are virtual assistants, which use the cloud to enable dialogue between physicians and their clinical applications, drawing on meaningful information to enhance workflows and inform clinical decisions. This technology has the power to transform how physicians interact with EMRs and share data with colleagues and their patients, fostering real-time decision support and, someday, qualifying as a trusted advisor in the physician’s pocket. This type of immediate feedback acts as a second set of eyes, relying on vast clinical knowledge bases to review the patient’s case and assist the physician in outlining different treatment options.
Technology like CLU is already beginning to transform how care is provided. It is, however, only a matter of time before it becomes commonplace for a conversation between a patient and her physician to be comprehended by an intelligent system that transcribes the dialogue, pulls up relevant health information, documents procedures, records reactions to treatments, and codes the office visit. And all without so much as a break in eye contact between the patient and her doctor. The future is here, bringing with it new use cases for NLU and cloud capabilities, positioning healthcare for the next phase of its evolution.