Guest Column | February 8, 2016

The Science of Being Wrong

By Derek Hennecke, CEO and President, Xcelience

Everyone gets it wrong sometimes.

Most of us like to think that we’re good sports about it. If it’s a little thing, we may laugh it off good-naturedly. If it’s a major mistake, we shoulder responsibility, accept consequences, and correct course. As scientists, most of us probably think we have a healthier-than-average relationship with our human tendency to err. We are, one could argue, obsessed with the possibility of being wrong. We are wrong far more than we are right on the path to knowledge. Ninety-five percent of drug candidates fail during development. Our egos are buff. We dust ourselves off and keep going.

We think we’re better at admitting our errors because we must do so every day. But really, most of the day-to-day disappointments are little things. They are part of our job, not linked to our fragile egos. When it matters, are we really any better? We’re still human.

I took my MBA in Holland. A Dutch friend of mine seemed to have it all together. During her master’s in chemistry, she had stumbled across a promising drug candidate. She was working her way through the MBA as preparation for launching a start-up. She supplemented her studies with internships in other start-up labs; her off hours were spent networking and fundraising for her candidate.

When we graduated, most of us headed off to decently-paying corporate careers. She took on a load of debt, fitted out a small lab, brought in a couple of colleagues who were willing to work for the promise of a better future, and began in earnest.

Her drug candidate, unfortunately, was one of the 95 percent. The science spoke pretty clearly, and pretty early on. But she was so emotionally involved that she carried on, clinging to hope despite the evidence, for months longer than she should have. So much time, energy, money and emotion had gone into this single project.

It can be hard to let go of your convictions, even when the rational part of you – the scientist in you - knows that the sooner you accept you’re wrong, the better.

How does it feel to be wrong? It feels a lot like being right

If I ask you what it feels like to be wrong right now, you’ll probably say it feels horrible, depressing, or something like that. But you’re not answering my question. I asked you what it feels like to be wrong right now, in the present. You told me what it felt like in the past. The moment that you realize that you were wrong, it’s already in the past, writes Kathryn Schulz in Being Wrong: Adventures in the Margin of Error. When you are wrong and haven’t yet realized it – how does that feel? That’s the real question. And the truth is, it feels an awful lot like being right.  

In her TED Talk on the same subject, Schulz reminds us of Wile E. Coyote and his perpetual pursuit of the Roadrunner. In almost every episode, the Roadrunner eventually runs off a cliff. Being a bird, he just flies away. When Wile E. Coyote, in hot pursuit, pedals valiantly over the edge of the cliff, he always makes headway for several feet in the thin air until the moment he looks down, realizes his mistake, and plunges to the ground. Being wrong is a lot like that, Schulz says. It feels just like being right, until the moment we realize that we are wrong. Then it feels bad.

Being Wrong forces us to look inside ourselves and recognize our own misguided behavior. It makes us think about what we can do to realize sooner when we’re wrong, despite our inborn psychology of denial. Schulz’s goal is to make us embrace our errant nature, accept it, and learn from it.

Science finds its earliest roots in the acceptance of being wrong

The history of science is a long chain of stories about perceiving errors and correcting for them. Knowledge advances when current theories collapse under the weight of new evidence. The Scientific Revolution was, to a great extent, a shift among the greatest thinkers of the time from certainty about everything to doubt, according to Schulz. Augustine was so steeped in doubt that he penned, “I err, therefore I am” – in essence, the only thing he could really be certain of was that, because he made mistakes, he must exist. Descartes later wrote, “I think, therefore I am,” which he explained as “I doubt, therefore I think, therefore I am,” expressing the same universal doubt as his predecessor.

Indeed, if we never erred we would be gods. Of all the species on earth, we are the only one driven to understand the universe. Our lives demand that we make decisions based on what we think will happen in the future, and so it is inevitable that we will get things wrong. It is the most fundamental characteristic of our humanity. And yet, as Schulz aptly points out, we deny our ability to err at every turn. Many of us go through life assuming we are right about just about everything, from political and intellectual convictions to religious and moral beliefs. We call ourselves human, and yet we walk around acting omniscient.

Top scientific theories that were wrong

Science dictates that we should abandon a theory that is collapsing under its own counterevidence, but the truth is, we rarely abandon a floundering hypothesis unless we can first find a new and better theory to adhere to. We believed in a geocentric universe despite countless inexplicable anomalies, until Copernicus finally gave us a better theory. Aristotle insisted that some animals arise spontaneously rather than from other animals of their kind; it would be centuries before science displaced this theory. More recently, we believed in the stress theory of ulcers despite the lack of lifestyle evidence among sufferers, until Australian clinical researcher Barry Marshall identified the bacterium H. pylori, which proved to cause peptic ulcer disease.

Theories help us identify what questions to ask and what to look for, but in doing so, they give us a sort of tunnel vision, effectively keeping us from asking the burning questions we ought to ask. We need to “learn to actively combat our inductive biases,” Schulz reasons, “to deliberately seek out evidence that challenges our beliefs, and to take seriously such evidence when we come across it”. She adds, “Remembering to attend to counterevidence isn’t difficult; it is simply a habit of mind. But, like all habits of mind, it requires conscious cultivation.”

Scientific group think

Membership in any community influences the way we see the world, and sharing a belief with others can solidly wall us off from those who might contend that we are wrong. Scientists are not immune to this effect. We tend to believe and not question the current dominant theories. Group membership confers a sort of “disagreement deficit,” writes Schulz, which “creates a kind of social counterpart to cognition’s confirmation bias” and shields us from the possibility that we are wrong.

John F. Kennedy made a grave mistake when he let his cabinet of the ‘best and brightest’ minds of the day lead him into the Bay of Pigs invasion of Cuba. Kennedy’s submission to the prevailing opinion is held up as a classic group think scenario, in which no one in the group dared to challenge the prevailing wisdom. Kennedy learned from this experience and avoided falling into the same trap in his handling of the Cuban Missile Crisis.

In management, I fend off group think by going around the room and demanding an individual opinion from everyone present. Scientists can be introverted, and it would be a mistake to rely only on the opinions of the most vocal. I’m not looking for a democratic conclusion; I’m just putting everyone on notice that they are individually responsible for the decision the group makes. This practice also serves to seed doubt in my own convictions by forcing me to listen to countervailing advice.

Business model group think: the mega-company

Pfizer’s impending merger with Allergan is much in the news these days. The prevailing wisdom is that bigger is better. Such mega-companies made sense in the era of blockbuster drugs, but that era is ending. Cancer, diabetes and other major diseases are all being segmented into smaller and smaller markets based on genotype and, most recently, microbiome.

An article in Israel21c in November claims that there’s no such thing as a one-size-fits-all diet, because everyone’s body responds differently to different foods. In what is claimed to be the largest diet study of its kind, researchers found that glucose may spike your blood sugar levels while sushi does the same for your partner. Both of you may experience spikes when eating after exercise, but only one of you after sleep. After a week of eating diets that matched their personal profiles, volunteers experienced consistent changes in the composition of their gut microbes, suggesting that the microbiome may be influenced by personalized diets.

Other recent papers show that responses to immune checkpoint inhibitors like ipilimumab differ according to the microbiome.

Here’s a perspective from outside the Pfizer-Allergan group think: the days of mega-blockbusters prescribed for everyone are disappearing, and with them the mega-giants that depend on them. Personalized medicine will mean more niche markets, dominated by smaller, more nimble players.

Sunk costs and learning from our mistakes

We discussed how scientists will live with a crumbling theory long beyond its expiry date, until a new and better theory presents itself. Non-scientists are just as reluctant to find themselves between theories. Everyone finds it excruciatingly uncomfortable to be “stuck in real-time wrongness with no obvious way out”, as Schulz puts it. The bigger the wrong, the harder it is to abandon it. Whether we’re talking about religious convictions or major capital expenditures, we all struggle to walk away from our sunk costs.

About fifteen years ago, I took a position as General Manager of a DSM biologics plant in Montreal. The plant had swallowed up and spit out three GMs in four years. It was growing fast, but it had inherent structural problems – piping and engineering – that were, to make a long story short, affecting quality and causing the entire business to rather spectacularly implode. In hindsight, the diagnosis should have been clear: the plant was not salvageable. But I was young with a spotless career, and certain I could succeed where others had failed. DSM had sunk so much time and money into the plant that they were eager to support my attempts.

Looking back, I doubt there was anything I could’ve done to prevent the final outcome, but I should’ve seen the problem much sooner. There was an element of group think among upper management, for sure. We told ourselves the answer was to work harder, throwing time after issues that were ultimately beyond our control. I lasted two and a half years in the post – which I guess is an accomplishment in self-punishment. The next guy lasted another two before the plant met its destiny and closed.

That failure at DSM was the lowest point in my career, but I learned about sunk costs and the dangers of group think. A good swift kick to my ego was probably good for me too. I wouldn’t repeat those difficult years for anything – and yet they made me a better person and a better manager going forward. I learned.

It’s sad when people experience some of life’s biggest lessons and fail to grow from them. Republican presidential candidate Carly Fiorina may be such a person. When Fiorina was CEO of Hewlett-Packard, the company lost over half its value. Sure, many tech companies fell in that period, but few lost 55 percent as HP did under her leadership. Apple and Dell rose in the same period, Google went public and Facebook launched. The S&P fell only 7 percent in the same timeframe. The decision that defined her tenure was the $25 billion purchase of Compaq, undertaken at a time when devices were becoming low margin commodities. History has already passed judgment on the decision, and it is indefensible. She was wrong. Whatever her reasons may have been, she made a bad choice.

My point is not to disparage her failure. Everyone screws up. What troubles me is that she denies it. Instead of shouldering responsibility in the presidential debates, she tried to repaint the picture, claiming she wasn’t a failure but rather had doubled revenues. Anyone in business knows that doubling revenue is a meaningless metric. What counts is not revenue, but profit. And that’s the metric she did not deliver on.

A leader who accepts and learns from failure – that is someone I could follow. She would be following in the footsteps of Abraham Lincoln, Thomas Jefferson and William McKinley, all of whom recovered from bankruptcies. There is something heroic about rising from the ashes of defeat, like Steve Jobs or even Martha Stewart. But a leader who denies her mistakes cannot learn from them. And that is not a characteristic I would look for in a president.

Learning from our mistakes in drug development: FDA filings

In 2008, companies filing new molecular entities were rejected 66 percent of the time. So far this year, with 28 entities submitted, the approval rate is 89 percent – 96 percent if you exclude new uses for existing drugs – according to BioMedTracker figures reported in Forbes, a service that helps investors track events in the pharma world.

Has the Agency eased up? I find it hard to believe that the FDA, with a mission as urgent as food and drug safety and a large, unwieldy bureaucracy behind it, could possibly have become more lenient in its approval system. What I do see is that we’ve gotten better at accepting sooner when we’re on the wrong track, killing drug candidates earlier in the process, before the trials begin.

Publishing negative clinical trial results

Those of us in drug development should know better than anyone the importance of learning from our mistakes. So why is it that so often failed trials never get published?

Agnès Dechartres, an epidemiologist at Paris Descartes University, pulled 600 trials at random from the FDA database ClinicalTrials.gov. Posting results to the database has been mandatory since 2008; researchers who fail to post results within a year of a trial’s completion may lose grants and be fined up to $10,000 per day. The database is extremely useful, but it doesn’t replace publications, which are generally much more detailed and provide the basis for reviews of research on a given drug.

Dechartres found that only 50 percent of the trials she randomly selected made it to publication. This is a big deal. Failure to publish, Dechartres argues, breaches the implied contract between researchers and the patients participating in the trials. I would state her assertion more strongly: failure to publish may condemn future researchers and patients to repeat what was already a lengthy and costly mistake.

The blame probably doesn’t lie with the researchers alone. I’m sure sometimes results are submitted to journals and rejected. Negative results, which show that a drug doesn’t work better than the placebo, are less likely to find their way to print than positive results. And yet without these results in the body of scientific literature, the literature is biased.

We do still have the ClinicalTrials.gov database; however, even it is incomplete. Phase 1 trials do not need to be reported, nor do trials performed entirely outside the United States. A bill before Congress this year may, if passed, address some of these omissions.

How to be right more often

Six Sigma, a revered set of techniques and tools for process improvement, says there will always be errors and deviations. The key is to:

  • Accept that there will be errors.
  • Create an environment that talks about errors without retribution.
  • Rely on verifiable facts and data rather than opinions or anecdotes.

The latter point is especially important. When there is a mistake in my business, I don’t just ask whether there was an investigation, or whether the matter has been handled. The answer is invariably, “yes”. I ask for specifics: How is the investigation being handled? Who is handling it? How will it be reported, to whom, and when? If answers are vague, I keep asking. Again. And again. Sometimes five or six times. Until I have established either the facts or the lack thereof. Then we can move forward.

Being wrong, in science and in life, is never the issue. Failing to recognize when you’re wrong, denying it after the fact, covering it up, or failing to learn from it: these are the errors that keep us from moving forward in science, in business, and in life.