I recently stumbled upon a short article about the
"Semmelweis-Reflex". It was published in a Swiss magazine, and I found it
quite interesting because I could draw an analogy to software testing:
In 1846, the Hungarian gynecologist Ignaz Semmelweis,
working in a delivery unit, realized that the rate of mothers dying in one
department was 4 percent, while in another department of the same hospital it
was 10 percent.
While analyzing the reason for this, he found that the
department with the higher death rate was run mainly by doctors who were
also involved in performing post-mortem examinations. Right after an autopsy,
they went to help delivering mothers without properly disinfecting their hands.
The other department, with the lower death rate, was
staffed mainly by midwives who were not involved in any such autopsies. Based
on this observation, Semmelweis advised all employees to disinfect their hands
with chlorinated lime.
The consequence: the death rate decreased remarkably.
Despite the clear evidence, the staff remained
sceptical; some were even hostile. Traditional beliefs were stronger
than empiricism.
Even though this happened more than 150 years ago, people
haven't changed that much since then. We still bias ourselves a lot.
The tendency to reject new arguments that do not match our current beliefs is
still common today; it is known as the Semmelweis-Reflex.
We all have our own experience and convictions. That is all fine, but it is
important to understand that these are personal convictions and cannot simply
be generalized into a universal truth.
How can we fight such bias? Be curious! If you
spontaneously react with antipathy to something new, force yourself to find
pieces in the presenter's arguments that could still be interesting, despite your
current disbelief in the whole.
Second, make it a common practice to question yourself
by telling yourself "I might be wrong". This attitude helps overcome prejudice
by allowing new information to be considered and processed.
Back to testing:
From this article, I learn that we should start to listen and not hide behind dismissive
arguments simply because what we are told doesn't match our current view of
the world.
But this story has two sides. If I can be biased,
then others may be biased, too. Not all information reaching my desk can be
considered right by default. The presenter's view of the world may be equally
limited.
Plus, the presenter may have a goal: his or her intention
may be to sell us something. The presenter's view may be wrong and based on
"sloppy" analysis, if any facts were collected at all.
Call me a doubting Thomas, but I don't believe
anything until I've seen the facts.
So what?
If someone tells you "a user will never do
that", question it!
|
It may be wrong.
|
If someone tells you "users typically do it
this way", question it!
|
It may be an assumption and not a fact.
|
If someone tells you "this can't happen",
question it!
|
Maybe she has just not experienced it yet and will happen
soon.
|
If someone tells you "we didn't change
anything", question it!
|
One line of code change is often considered as hardly
anything changed at all, but in fact this adaptation can be the root of a new
severe bug or a disaster.
|
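To make that last point concrete, here is a minimal, entirely hypothetical Python sketch (the function, the values, and the "one-line edit" are invented for illustration and do not come from the article): a seemingly trivial change to a single validation line silently turns a questionable input into a severe pricing bug.

    # Hypothetical example: a "harmless" one-line change to a discount function.
    # Before the change, the guard was:  if not (0 <= percent <= 100): raise ...
    # After the "trivial" edit, only the lower bound is still checked.

    def apply_discount(price: float, percent: float) -> float:
        """Return the price after subtracting a percentage discount."""
        if percent < 0:  # the upper-bound check (<= 100) was silently dropped
            raise ValueError("discount cannot be negative")
        return price * (1 - percent / 100)

    # A discount of 150% now produces a negative price instead of an error:
    print(apply_discount(40.0, 150.0))  # -20.0 -- "we didn't change anything"

The diff is one line, so everyone agrees that "nothing really changed", yet the change removes exactly the safety net a tester would want to probe.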
I trust facts only, or, as someone said: "in
God we trust, the rest we test". This is the result of many years of testing software. Come on, I don't even
trust my own piece of code. I must be some kind of maniac!
Source: text translated
to English and summarized from the original article
published as "Warum wir die Wahrheit nicht hören wollen" by
Krogerus & Tschäppeler, Magazin, Switzerland, March 20, 2021.