There is an old joke told among people in aviation: the airplane cockpit of the
future will have a pilot and a dog. The
pilot is there to feed the dog, and the dog is there to bite the pilot if he
tries to touch anything. The idea behind
this is that the airplane of the future, carrying hundreds of people, will be
fully automated and safer than planes are today, and that the pilot will be not
only unnecessary but potentially dangerous to those on board. Of course, that joke does not acknowledge that
computers may shut down or need to be overridden. Presumably in the cockpit of the future a
Captain Chesley Sullenberger will not be needed to land a plane with no working
engines on the Hudson River and safely get all of his passengers out without
any deaths. As a passenger, however, I
would be reassured to see “Sully” or someone like him up there in the cockpit,
no matter how good the systems were.
Human beings are at least as complex as airplanes, yet we now
see a belief, most eloquently voiced by the brilliant businessman and venture
capitalist Vinod Khosla, co-founder of Sun Microsystems (the link to download Mr. Khosla’s full statement is attached), that can be read to suggest that the
doctor’s office of the future will be staffed by a doctor (but more probably a nurse,
since doctors are too costly) and a dog.
His thesis is that 80% of what is now done by a physician will, in the
future, be done by technology. As I read
his thoughts, two of his assumptions became apparent to me. One is that he believes the diagnosis of
disease and the defining of treatment plans are a bigger part of medical
practice, especially primary care practice, than they really are. The second is that he has great faith in the
ability of systems to consider non-medical factors, such as a person’s beliefs,
values, social situation, financial situation, and psychological makeup, that may
affect diagnostics and therapeutics. I wonder. The reality of practice is that 80% of doctors’
visits are driven by fear and social isolation rather than diagnosable
disease. The reality of therapeutics is
that the diagnosis and treatment plan are only a small step in the march
toward cure, which depends as much on those non-medical factors, and on the
physician’s and nurse’s ability to understand and influence them, as it does
on the science of disease.
Mr. Khosla’s contention is that much of the knowledge that now
drives specific diagnostic approaches and the outlining of treatment steps will
be, and should be, facilitated by technology.
That is absolutely true. All he says about the promise of technological
systems to process medical inputs such as blood pressure, pulse, weight, lab
data, and certain pieces of history to create a better health care experience
for all is also true. However, if our
current approaches to laws, rules, regulations, and health systems are based entirely
on Khosla’s faulty assumptions about the nature of medical practice rather than on
practice realities, then we may create a system that is not as individualized, and
not as attuned to the inherent uncertainty that comes with people as social,
spiritual, and psychological animals, as good medicine should be. I worry that if we build a health system
on the already widely accepted assumption that technology and guidelines will
fix all of our health care woes, we will end up with a dehumanized system that
creates an entirely new set of problems. In our zeal to improve what we acknowledge to
be broken in health care, we may minimize the danger of a system based
purely on technology, rules, and algorithms.
A recent case in California, which David Shaywitz wrote about in his Forbes blog, illustrates the point.
A nurse in an extended care facility did not perform CPR on a patient
because it was against institutional policy for her to do so. Instead of seeing a person in need, this
nurse, who presumably was trained in CPR and trained to recognize someone having
a cardiac arrest, followed the institutional policy instead of doing the right
thing for that patient. This is an
example of procedure trumping patient care, empathy, and
thought. One can make the argument, as Khosla does,
that our current systems and protocols are primitive, that they will improve over
time, and that in the future they would not allow this to happen. However, even in the best of systems, the
danger of relying on expert systems, and even requiring their use, is that fear
of overriding the system can take precedence over the need to help.
The recent arrival of electronic medical records is another
case in point. The EMR has allowed
medical guidelines to be built into the medical record, a strong
plus when combined with the ability to catch and prevent certain errors that may
endanger patients. However, it has also fragmented
the information a health professional needs, so that navigating a
medical record to get a quick understanding of the patient sitting before you has
become even more of a nightmare than it was with a paper record. It also requires sometimes time-consuming and
complex data entry by the caregiver. I
have accompanied friends to emergency rooms only to watch the doctor pay more
attention to data entry and following the algorithm than to the patient in
need. Bad information can also become so
embedded, by a system that encourages “copy and paste” and standardized wording
instead of independent, thoughtful analysis, that incorrect diagnoses and
histories become impossible to change.
So I sit here and can only hope that my care and my
family’s will always be directed by us, working in partnership with a caring
health professional, usually a doctor, who uses technology but never allows it
to override his or her own judgment, knowledge, and empathy. I will always prefer a captain in the cockpit
of an airplane and a caring physician by my side.