Doctors don’t see their own limits.
Mistakes originate from cognitive errors. Being aware of them can change things for the better.
One should be allowed to make a mistake. That seems to be the mantra of the whizkids of Silicon Valley. They have transformed from garage nerds into billionaires, so we might as well believe them. To fall and rise, to fail and reinvent yourself, is apparently the road to success. In my sector, however, healthcare that is, people don’t seem to appreciate mistakes that much. I became painfully aware of this fact the first time I tried (and failed, obviously) to insert a cannula. “To err is human, to forgive divine”, wrote the poet Alexander Pope in the 18th century. Nonetheless, our society is far from divine. We prefer to accuse rather than to forgive, especially in times when fraudulent behavior seems to become ever more commonplace.
“Primum non nocere.” (First of all, do no harm.) It is the first sentence of the Hippocratic Oath, sworn by every student who becomes a doctor. We claim to strive for a sector without mistakes. Society’s lack of trust has turned the number of inspection committees and the amount of paperwork into a frustrating burden for every doctor. We have become obsessed with preventing visible mistakes. But in fact, a great deal is to be gained from the invisible ones.
Meanwhile in Zambia
The time was five to seven in the morning. My mosquito net felt oppressive after a busy night shift. The ringtone of the old Nokia broke the fragile silence. “The girl in the burns unit is sick. Please come.” The four-year-old had been admitted a few days earlier. She had fallen into an open fire, leaving one third of her body covered in second-degree burns. I put on my scrubs and white coat and hurried towards the hospital, a five-minute walk from my house. Upon arriving, I met my supervisor, an English surgeon.
The bandages covering her wounds had all come loose. She had a high fever, and with every breath she grunted softly. After examining her, we suspected pneumonia; we gave her extra oxygen and fluids and started antibiotics. I stayed with her to re-dress the bandages. She seemed to recover slightly. The nurses and her mother agreed. I walked away to join the rest of the team for the morning rounds. Thirty minutes later, we arrived at her bed. She was no longer breathing. We tried to resuscitate her. At 8:42 am, she died.
I have replayed this scenario hundreds of times in my head. Where did it go wrong? Where did I go wrong? What could I have done differently? There were plenty of excuses to avoid taking responsibility. In the heart of Africa, you don’t have the means to be a good doctor. Such severe burns always have the odds against them. I did not have enough knowledge or experience to treat such a complicated case. All of it is true. Yet I could have been more aware of the ‘cognitive pitfalls’ my brain had fallen into during this event.
What are cognitive pitfalls?
The daily practice of medicine has few absolute truths. We strike compromises between knowledge from scientific research and the experience of our colleagues, our patients and ourselves. Especially where children are concerned, we don’t want to make mistakes. To comfort ourselves, we fall prey to ‘confirmation bias’: we seek out people who agree with our thoughts. It was seven o’clock in the morning, I had slept only a couple of hours, and my colleagues would meet me shortly. I wanted to be able to tell them that all had gone well, so I looked for symptoms confirming that the girl was improving.
I had listened to her lungs and heard diminished breath sounds. As soon as my mind connected her symptoms to a possible explanation, I stopped thinking — a cognitive error described as ‘search satisficing’. Of course, given my lack of experience, I turned to my supervisor. Since he agreed with our plan, another pitfall lurked: the bandwagon effect, the tendency to believe things because other people believe them too.
What happens in stressful situations?
“In order to acquire knowledge, we must be aware of our limits.” This phrase comes from the inaugural lecture of professor Marja Boermeester, gastrointestinal surgeon at the Academic Medical Center in Amsterdam. A similar philosophy is voiced by Robert M. Pirsig in his famous novel Zen and the Art of Motorcycle Maintenance: “A problem that we don’t see is either too big or too small.” One could dismiss all this as just an adventurous story out of Africa, not comparable to anything in the Western world. The opposite seems to be true. Our capacity to act rationally in stressful situations is very limited, whether we have nothing or everything at our disposal.
“Where did it go wrong? Where did I go wrong?”
Every invention that widens our spectrum of sensory input increases our understanding. The microscope showed us bacteria, and we understood disease. Today we can collect, store and analyze large amounts of data, and we want to understand those patterns. Initiatives that collect medical data in a standardized way (e.g. DICA, ICHOM) make our measurable mistakes visible. At the same time, we should be aware of the aspects of daily clinical practice that remain invisible to current measurement instruments. The ability to put a hand on a shoulder, or to talk about a patient’s favorite holiday destination, is not part of a cost-effectiveness analysis in Excel.
To this day, our blind spot for cognitive errors is a bias in itself. The quality of the doctor of the future will depend greatly on the ability to see our own limits. Hopefully, we will not only learn how to insert a cannula correctly, but also learn what clouds our brain when deciding whether or not to insert a cannula at all.