I recently stumbled over a small article about the "Semmelweis reflex". It was published in a Swiss magazine, and I found it quite interesting. I translated and summarized it because I saw an analogy to software testing:
In 1846, the Hungarian gynaecologist Ignaz Semmelweis observed that in one department of a maternity clinic the rate of mothers dying was 4 percent, while in another department within the same hospital the rate was 10 percent.
While analyzing the reason for this, he identified that the department with the higher death rate was mainly operated by doctors who were also involved in performing post-mortem examinations. Right after an autopsy they went to help mothers without properly disinfecting their hands.
The other department, with the lower death rate, was run mainly by midwives who were not involved in any such autopsies. Based on this observation, Semmelweis advised all employees to disinfect their hands with chlorinated lime. The result: the death rate dropped remarkably.
Despite the clear evidence, the employees' reaction remained sceptical; some were even hostile. Traditional beliefs were stronger than empiricism.
Even though this happened more than 150 years ago, people haven't changed much since. We still bias ourselves a lot. The tendency to reject new evidence that does not match our current beliefs is still common today; it is known as the "Semmelweis reflex". We all have our own experience and convictions. That is fine, but it is important to understand that these are personal convictions which cannot simply be elevated to a general truth.
How can we fight such bias? Be curious. If you spontaneously react with antipathy to something new, force yourself to find pieces in the presenter's arguments that could still be interesting, despite your disbelief in the whole.
Second, make it a common practice to question yourself by saying "I might be wrong". This attitude helps you overcome prejudice by allowing new information to be considered and processed.
Sources: text translated to English and summarized from the original article "Warum wir die Wahrheit nicht hören wollen" ("Why we don't want to hear the truth") by Krogerus & Tschäppeler, Magazin, Switzerland, March 20, 2021.
Back to testing:
What I take from this article: in case we didn't do it before, we should start questioning ourselves more often. We should learn to listen and not hide behind dismissive arguments simply because what we are told doesn't match our current view of the world.
But this story has two sides. If I can be biased, then others may be biased, too. Not all information reaching my desk must be right. The presenter's view of the world may be equally limited.
My personal credo therefore is NOT ONLY to question myself, but also to question statements made by others, even if they confirm my current view of the world and even if it all sounds reasonable.
The presenter may be trying to achieve something without having to answer critical questions. They may want to avoid being challenged into discussions that would uncover their own "sloppy" analysis.
Call me a doubting Thomas. I don't believe anything until I've seen it.
- If someone tells you "a user will never do that", question it.
- If someone tells you "users typically do it this way", question it.
- If someone tells you "this can't happen", question it.
- If someone tells you "we didn't change anything", question it.
I trust in facts only. This is the result of 20 years of testing software.
Come on, I don't even trust my own code. I must be some kind of maniac =;O)