Amy Edmondson on how to fail intelligently


May 11, 2025

The Harvard professor encourages us to embrace our failures for valuable lessons


Interview by Neelima Mahajan
Illustrations by Nigel Buchanan

After the data proved her PhD hypothesis to be conclusively wrong, Amy Edmondson saw we lacked productive ways to think about failure. Now the leading psychological safety expert is shining a spotlight on how we can navigate uncertainty more effectively. 

Psychologically, all of us are primed to think success is good and failure is bad. What's the origin of this notion?

In school you learn quite quickly that you're supposed to get the right answer. As a child, you're learning all sorts of information that other people already know – and there often is a right answer. You learn that the kids who get valued and celebrated are the ones getting the right answers. Then you grow up, and in the real world, in most settings, success is very much valued over failure. Some of that is quite sensible, but some of it is not right. Some of it gets in the way of people's willingness to take smart risks through which progress, innovation and discovery come.

Right kind of wrong: Amy Edmondson outlines four criteria that make a failure good, or intelligent.

Is there a cultural component? Are there cultures where failure is absolutely bad?

Yes, it's probably roughly correlated with the power distance index, the degree to which people take seriously the need to get it right and to hit their targets, to have the right answers, to look good, not bad, to save face. High power distance index cultures are more allergic to failure than low power distance index cultures. But the truth is that nobody really likes failure. So the happy talk about failure comes from a good place. What we've seen historically is that countries or cultures with less tolerance for failure tend to be less innovative. And, in every culture, innovation isn't for everyone: Many people don't have the language and the tools they need to progress thoughtfully toward novelty.

"Good failure" guru

Amy Edmondson is the Novartis Professor of Leadership and Management at Harvard Business School and the author of seven books and over 60 scholarly papers. She has been recognized by the biennial Thinkers50 ranking since 2011 and was ranked No. 1 in 2021 and 2023. Her most recent book, Right Kind of Wrong, won the Financial Times and Schroders Business Book of the Year Award in 2023. This interview was conducted at the Global Peter Drucker Forum.

Your book is titled "Right Kind of Wrong". This indicates that there is a good kind of failure. How did you hit upon that?

I hit upon it by studying consultancies, scientists, physicians and innovators in companies. It's very logical, even intuitive, but it's not emotionally intuitive. So, it becomes clear after a while. There are actually four criteria for a failure to be good, or intelligent.

One is if it's in new territory, meaning there isn't a current precedent and process that allows you to get the results you want. There's no recipe on the internet.

Two is if it's in pursuit of a goal. You're not just messing around, failing for the fun of it; you're trying to develop a new product or make a scientific discovery or write a book.

Three: You've done your homework. You have taken the time and effort to find out everything you can about what works and what doesn't, and you've got a hypothesis about what to try next. So it's a thoughtful experiment.

Four: The failures that do occur – and they will – are small and not dangerous. They're small from a safety perspective, a reputation perspective and a financial perspective. So that's just a smart experimentation strategy and you learn from it.

There are two kinds of preventable failure, as well. One is a basic failure, which has a single cause, often but not always human error. The other is a complex failure, which has multiple causes, any one of which on its own wouldn't have led to the failure. With great teamwork, management, mentorship, vigilance, learning and so forth, you can come pretty close to failure-free in familiar territory. This is, for example, the essence of the Toyota production system: Let's produce failure-free cars, but the only way we can do that is if people are willing to catch and correct the inevitable errors that happen along the way. So I'm just as passionate about preventing preventable failures as I am about welcoming intelligent failures.

"The first question to ask is always 'what happened?' It's not 'who did it?' or 'what caused it?"

Amy Edmondson

Professor
Harvard Business School

What does an intelligent failure look like? Can you give me an example?

I'll give you one very good example. So a global pharmaceutical company has got a clinical trial for a promising new drug they hope will alleviate a particular kind of cancer. You must do a trial with scientific sampling to show that the drug actually has the impact you hope, in a treatment condition versus a control condition. If you do everything right and you fail to show efficacy, that's an intelligent failure. It's terribly disappointing. But you couldn't have known it in advance. You had to try it. It's a painful setback for the company, but they'll figure out why it didn't work. Maybe they'll tweak the mix, maybe they'll go back to the lab, but they will go forward.

What mechanisms can organizations start using to make the most out of these intelligent failures?

First, you put in place the structures and support so that people can have intelligent experiments: space and resources. Second, you put in place the structures, rituals and support for learning from the failures, because you really want to get your money's worth once you've invested. It's about ensuring that the experimentation is as smart as it can be, and then that the learning is as deep and rich as it can be. And the first question to ask is always "what happened?" It's not "who did it?" or "what caused it?"

No reason to hide: Agreeing to face our failures and learn from them helps create a safe space in which we can all continually grow and share organizational knowledge.
Climb over these barriers and start failing better
Aversion

Failure is never fun, but no one ever grew from dodging blame. Reframe your thinking from seeing a loss to seeing the potential for even bigger gains.

Confusion

We are much more likely to fail in a novel context than a consistent one, but we often lack the framework to see the difference. Not all failures are equal.

Fear

Our evolutionary brain fears social rejection as much as being hit by a bus. A culture that allows for mistakes will also allow us to reach the highest standards.

How can we train our organizations to distinguish between different types of failure and adopt tolerance thresholds?

You clarify the concepts. And it's really important to have the kind of psychological safety that allows people to speak up. Part of this is building a healthy failure culture that is both tolerant of intelligent failures and eager to avoid as many problems as possible. So, how do you do it? First you need to help people understand the difference, and then you teach them to run through the criteria – novelty, goal-driven, hypothesis-driven, and as small as possible. It can all be subjectively assessed. I've seen a lot of innovation failures in organizations where they flunked the fourth test. So then you say, how could we have gotten the same lesson with less time and money?

I'm not 100% enthusiastic about the word "tolerance" because I think we, as fellow human beings, have to tolerate each other. But we don't want to tolerate sloppiness. So assume good intent and then try to understand what happened and what you could have done. Be curious about it before you start blaming, but in general, set standards. Be clear in advance that we don't tolerate a failure to wear safety equipment. Decide where the boundaries are. Then when people cross the line, there are consequences.

Learning to read the signs: If we are aware of what we need to steer clear of, there is a chance that we can become smarter at avoiding major mistakes.

Are there examples of organizations that have been curious in this regard and institutionalized the learnings? 

On one end of the spectrum is IDEO, arguably the world's most celebrated innovation consultancy. It's a company that has built smart failure into its activities. Now, the reason that's at one end of the spectrum is that this is a company where that's all they do: innovation projects. At the other end of the spectrum is Toyota, where they have R&D, of course, and are willing to experiment in the laboratory. They're smart – and they're also more eager than any other manufacturing company that I know to produce flawless quality in familiar territory. They realize that fallible human beings and systems don't produce that without help. So they train everybody in vigilance, in speaking up, in problem-solving. It's a beautifully engineered mindset and set of practices, policies and systems that all work toward that single-minded goal of excellence.

In real life, organizations are looking for outcomes. And CEOs have to answer to the board. So failure is probably not seen as a good thing.

Boards don't usually talk about failures that happened in the lab, or what pilot projects yielded disappointing results and had to be dropped. But if I'm on a board and I do have access to that kind of data, I would be very disappointed if your failure rate was too low, because I can see maybe some nice profits this year. But where are you going to be in five years if there's no innovation? So it's the board's job and, more accurately, senior management's job, to make sure we are delivering beautiful work today for today's customers, but also that we are creating products and services that will be needed in the future. It's very easy to privilege the present over the future. You must not fall prey to that trap.

Drop the failure dichotomy

The opposite of success isn't failure – it's missing the opportunity to minimize unproductive failure. We can learn when failure is our friend, pursue smart risks and prevent avoidable harm. Embrace your fallible human self and you might just end up being more successful.

In your book you talk about the unequal license to fail. What implications does this have for organizations?

The unequal license to fail refers to the fact that it's one thing to clarify what an intelligent failure is for an entrepreneur, a scientist or an innovator, and it's another to put in place, in practice, a culture where that opportunity is equally available to everyone, including people from underrepresented groups in any given role. Their failures will stand out because of the way our brains work. The failures will be attributed to their identity group in a way that would never happen for a majority white male in the same role. People might say: "This is what happened when we put a woman in charge."

What can companies do about that very real psychological bias that we all have? 

The answer is make it discussable. Be clear when we give someone an important job to do, where there's uncertainty and risk, where there will be failures. Get out ahead and say this thing very well may fail because nobody's ever done anything like this before. Also call attention to the unequal license to fail so that everybody can think: "Oh, yeah, my brain might do that. And I'm going to try to teach my brain not to do that."

About the author
Neelima Mahajan
Neelima Mahajan is Editor-in-Chief of Think:Act magazine. She has been a business journalist for two decades at various publications in India and China, including a stint on the founding team of Forbes magazine in India. In 2010-11 she was a visiting scholar at the University of California, Berkeley, where she was also a recipient of a Bill and Melinda Gates Foundation grant for an Africa reporting project. Neelima has a keen interest in management thought and has done extensive work in the domain. She has interviewed several world-renowned management thought leaders, Nobel Prize winners and global business leaders. In 2010, Neelima received the Polestar Award for Excellence in IT and Business Journalism, one of the highest awards in journalism in India.