We need to add a test case

Over the years I have often had this conversation with test managers:
– Why didn’t we find this defect earlier? (Usually accompanied by a pitying, understanding smile).
After the painful response, we would agree to “add a test case”.
While installing a software product, a tester got the message “The following applications must be closed before proceeding”, along with the options “Retry”, “Cancel” and “Ignore”. After choosing “Ignore”, the installer proceeded and after some time gave a standard cryptic message, “Error – Installer cannot proceed….”. The tester then had to redo the install. When the developer saw this problem, his response was, “Of course, you need to close Firefox. Our installer modifies Firefox…”
A user had been performing an operation on Windows XP for years, but the same operation did not work on Windows 7. Because of limitations or differences in Windows 7, the software could not behave the same way there, so the team decided to “document” the “limitation”.
In each of these cases I felt something was wrong, but I was not able to articulate it. When managers asked, “Why did we miss this problem?”, the fact that they had never used the software never stopped them from being surprised that the defect was not “caught” earlier. Every time, I felt we had failed to empathize with the user and the tester.
This behavior, hindsight bias, has been described by Joseph T. Hallinan in his book “Why We Make Mistakes”: “When something goes wrong, especially something big, the natural tendency is to lay blame. But it isn’t always easy to figure out where the fault lies. If the mistake is big enough, it will be analyzed by investigators who are presumed to be impartial. But they are plagued by a bias of their own: they know what happened. And knowing what happened alters our perception of why it happened – often in dramatic ways. Researchers call this effect hindsight bias. With hindsight, things appear obvious after the fact that weren’t obvious before the fact. This is why so many of our mistakes appear – in hindsight – to be so dunderheaded. (“What do you mean you locked yourself out of the house again?”) It’s also why so many of the “fixes” for those mistakes were equally dunderheaded. If our multitasking driver wrecks the car while fiddling with the GPS device on the dashboard, the driver will be blamed for the accident. But if you want to reduce those kinds of accidents, the solution lies not in retooling the driver but in retooling the car.”
When testers and developers find problems during software development, the focus seems to be on “retooling” the user instead of the “car”. The thinking is: if only the user were more careful, if only the user had read the documentation….
As a tester, it is important to be aware of hindsight bias. When looking at problems reported by users, do not jump to a quick solution. Try to imagine what the user was thinking, what their background was, and what they did or did not know about the software and the environment.
As a tester, when you find something confusing or are unable to complete an operation in the software, try to capture your thoughts. When a developer or an experienced tester resolves your problem, analyze why you were confused. What did the software assume about you and about what you knew?
Hallinan continues with this insight on hindsight bias:
“Basically, hindsight bias comes down to this: knowing how things turned out profoundly influences the way we perceive and remember past events. This is true no matter how trivial they may be. It could be the 1975 Super Bowl or Grandma’s colostomy or the decision to neuter Rover; knowing how the event turned out alters our recollection of it. Even historians are prone to this error. It is much easier after an event – whether it is the Battle of Gettysburg or the bombing of Pearl Harbor – to sort the relevant factors from the irrelevant. Those who write of these events will almost invariably give the outcome an appearance of inevitability. But this type of compelling narrative is achieved by suppressing some facts at the expense of others – a process known as creeping determinism. Near the conclusion of her influential history of the attack on Pearl Harbor, for instance, the noted military analyst Roberta Wohlstetter had this to say: “After the event, of course, a signal is always crystal clear; we can now see what disaster it was signaling, since the disaster has occurred. But before the event it is obscure and pregnant with conflicting meanings.””