

• Last updated: June 5, 2021

Podcast #112: The Science of Insights with Dr. Gary Klein

How do we make decisions in complex environments? Can we trust our gut? How do we gain insights? In today’s podcast I talk to Dr. Gary Klein to get some answers to these questions. Dr. Klein pioneered the field of naturalistic decision-making and is an expert on the science of insights.

Show Highlights

  • What “naturalistic decision-making” is and how it differs from what we typically think of as decision-making
  • Why our gut instinct or intuition isn’t as irrational as we think
  • The biggest myths about gut-based decision making
  • How the “pre-mortem” technique can help you make better decisions
  • How analytical thinking can get us into trouble
  • The patterns of gaining insights
  • Why some people are better at gaining insights than others
  • What we can do to cultivate insights
  • And much more!

Seeing What Others Don’t, book cover by Gary Klein.

If you’ve enjoyed our articles about the OODA Loop or situational awareness, you’ll enjoy Dr. Klein’s books. Streetlights and Shadows focuses on making quick decisions in complex, fast-changing, and even threatening environments. His latest, Seeing What Others Don’t, uncovers the science behind insights and what happens in our brains when we have those “a-ha!” moments. Both books provide a nuanced and balanced look at the “fast vs. slow thinking” debate that’s prevalent today.

Be sure to follow Dr. Klein on Twitter for links to fascinating studies on decision-making. Also, check out his websites: MacroCognition and Shadow Box Training.

Special thanks to Keelan O’Hara for editing the podcast!

Transcript

Brett: Brett McKay here, and welcome to another edition of The Art of Manliness Podcast. The past few years, there have been a lot of books that have come out about how irrational humans are, and that we shouldn’t follow our gut or intuition, we shouldn’t use heuristics to make decisions quickly. Instead, we should really think things out. My guest today makes the case that while the research these books are based on is useful and provides some great insights, it doesn’t really show us how human beings make decisions in the real world. All of the experiments this research is based on have been done in the lab.

My guest today makes the case that in the real world, in complex environments, your intuition, your gut, whatever you want to call it, using heuristics, can actually be extremely useful, and right most of the time. His name is Gary Klein. He is a pioneer in the field of naturalistic decision making and the author of several books. One that I’ve read recently is Streetlights and Shadows, which is all about decision making in complex environments, making fast decisions based on your intuition. His most recent book, which I’ve also read, is called Seeing What Others Don’t, and it’s all about how we gain insights and the science of gaining new insights.

Anyways, today on the podcast Dr. Klein and I are going to discuss his research in naturalistic decision making and how we can become better, more agile decision makers, learning to use heuristics and intuition when we need to, and also knowing when to use that slower, more methodical, analytical decision making in certain situations. We also talk about insights, the science of insights, and what we can do to create an environment around ourselves to have more aha moments. A completely fascinating discussion with some actionable things that you can apply today in your life. I think you’re really going to like this show, so let’s do this.

Dr. Gary Klein, thank you so much for being on the show.

Dr. Klein: Thanks for having me. I appreciate it.

Brett: You have spent your career dedicated to studying and researching insights, the way we get insights, and decision making in ambiguous situations. You study naturalistic decision making. Can you describe briefly what naturalistic decision making is and how it differs from the classical type of decision making we read about in books and in blog posts?

Dr. Klein: Sure. Naturalistic decision making studies how people actually make decisions in complex situations. Not just decisions, but how they make sense of events, how they plan, just a wide range of cognitive activities, you could say “in the wild,” in situations that aren’t controlled. It differs from the conventional approaches to decision making, which primarily rely on carefully controlled studies using well-researched paradigms and tasks, and using populations such as college students that are easily recruited, can be scheduled for an hour or so at a time, and can be given tasks that they haven’t seen before, because you don’t want them to vary in how familiar they are, because that can mess up the results.

Right away there’s a problem, because we find in natural settings that people rely on their expertise, and so rather than screening it out in order to achieve greater control and precision, we want to see how expertise comes into play. Naturalistic decision making really just is a way of exploring all kinds of cognitive phenomena, but in field settings and not under controlled laboratory conditions.

Brett: Some of these field studies or field settings have involved firefighters, for example: what they do to decide how to approach a house that’s on fire.

Dr. Klein: Right. That was one of my early studies back in the 1980s. The belief was that in order to be a good decision maker, you had to generate a range of different options, and then you had to have some criteria for evaluating the options, and then you figure out which options scored best on the criteria. We thought that firefighters didn’t have enough time to set up that kind of a matrix. They probably were just looking at two options rather than a large set. I wanted to test that out. We thought that was a daring hypothesis, but it turned out to be too cautious a hypothesis. They weren’t even comparing two options. They would look at a situation and know what to do, and be right most of the time.

That created all kinds of confusion for us, because we weren’t expecting that. How could they be so confident and so accurate with the first option? It turned out it was because they had enough experience; over ten, fifteen, twenty years, they’d built up a large repertoire of patterns, and so they could rely on pattern matching to figure out what was going on. They still had to evaluate the option, so how could you evaluate one option without comparing it to others? We looked at our notes and we looked at our interview results, and we found that the way they evaluated it wasn’t by comparing it to other options, it was by imagining how it would play out in that particular context.

If it worked, then they could make a decision in less than a minute. If it almost worked, they could improve the option. If they couldn’t figure out a way to improve it so that it was adequate, then they would say, “Forget this, what else can I do?” and keep going down their repertoire until they found one that would work. This was a recognitional strategy, and nobody had identified it before because they hadn’t studied how people use their experience to make tough decisions under time pressure and uncertainty.
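
[For readers who think in code, the recognitional strategy Dr. Klein describes has an algorithm-like shape, so here is a minimal sketch of it in Python. This is an illustration only, not code from Klein’s research; the function names, and the `simulate` and `improve` callables standing in for mental simulation and option adaptation, are all hypothetical.]

```python
# A sketch of the recognition-primed decision (RPD) loop described above:
# consider the option that pattern matching suggests first, evaluate it by
# mental simulation, improve it if it almost works, and only then move on.
# "simulate" and "improve" are hypothetical stand-ins, not real APIs.

def rpd_decide(situation, repertoire, simulate, improve):
    """repertoire: options ordered by pattern-match strength, best first.
    simulate(option, situation) -> "works" | "almost" | "fails"
    improve(option, situation)  -> a modified option, or None
    """
    for option in repertoire:
        outcome = simulate(option, situation)       # imagine it playing out
        if outcome == "works":
            return option                           # first workable option wins
        if outcome == "almost":
            adapted = improve(option, situation)
            if adapted is not None and simulate(adapted, situation) == "works":
                return adapted                      # improved version is adequate
        # Otherwise: "Forget this, what else can I do?" Try the next pattern.
    return None                                     # repertoire exhausted
```

[Note that the loop satisfices: it commits to the first option that survives mental simulation rather than scoring options against each other, which is exactly the departure from the option-comparison matrix Dr. Klein describes.]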

Brett: This was the development of the recognition-primed decision model, correct?

Dr. Klein: Yes, that’s where it got started. It came as a surprise to us. We were just trying to work with experts who were trained and were demonstrably good at making decisions, namely firefighters, to see how they could do it. Because the research suggested that it took at least a half hour or so to arrange one of these matrices, and if you didn’t make a decision like that, you were supposedly not making a good or rational choice. Yet they were making good choices, and we didn’t know how they did it, so that’s what we studied in the wild.

Brett: You develop this ability. Are patterns what you’d call mental models, or are they something separate and distinct from mental models?

Dr. Klein: A mental model is a really squishy kind of concept. For us, a mental model is the story that we tell about how things work. It reflects the kinds of causes that we think are operating and how those causes interact with each other, and that’s our mental model. Usually we can’t articulate what our mental model is, but we certainly have one, because we know that somebody with more experience has a richer mental model and can reflect a wider variety of causes and be more accurate about what’s happening and what’s going to happen.

Brett: One of the things I loved about your book, what was refreshing about it, is that it seems like over the past ten years or so there have been a lot of books put out there about how human beings are extremely irrational, how we make poor decisions, and that we shouldn’t trust our instincts or intuition. I guess The Upside of Irrationality is one of them, Thinking, Fast and Slow another. But you point out, and I think you said this a little bit in the introduction, that the problem with the research these books are based on is that it doesn’t capture actual decision making. It’s in a lab. Are we as irrational as these books make us out to be, or can we trust our intuition sometimes?

Dr. Klein: We’re not as irrational as these books are claiming. They’re making a bold statement. It’s exciting and it’s popular, it’s appealing, and the researchers are extraordinarily talented at setting up controlled laboratory conditions that make their subjects look like they don’t know what they’re doing, look incompetent. The reason why they set these studies up that way is to show that people will make poor choices because they use heuristics, and the way to demonstrate it is to arrange for these heuristics to be misleading and inappropriate, and still people use them. The original idea was to show that people use heuristics, which are rules of thought, simple rules, strategies for how to get things done.

They demonstrated that, but that doesn’t mean we shouldn’t use these heuristics. We would be lost without them, and the researchers haven’t shown how valuable these heuristics are and how much we rely on them. They haven’t looked at the positive value of the heuristics that we learn through experience. I have a big problem with the takeaway message from these kinds of books, that you shouldn’t trust your intuition, that you should ignore your intuition because it’s going to get you in trouble. With regard to trust, I don’t want to encourage anybody to have blind trust in their instinct, in their intuition, because intuition can mislead us.

That’s why, as we found, the firefighters weren’t just going with the top option that popped into their minds; they would evaluate it by imagining, “What will happen if I put that course of action into play here?” They were evaluating it, just not in the conventional fashion. We shouldn’t blindly trust our intuition, but we also shouldn’t blindly trust the results of our analysis, because we see people fooling themselves and doing incomplete analyses all the time. I don’t think we should blindly trust either intuition or analysis.

Brett: Have a balance.

Dr. Klein: Mm-hmm, yeah. We need to balance the two, and rely on the two. Those are two different kinds of capabilities that we want to take advantage of. When we have intuitions about things, we shouldn’t immediately do it, but we shouldn’t ignore the intuitions. We should listen to them, because they may be telling us things from our unconscious that we otherwise wouldn’t have been aware of.

Brett: Are there situations where intuition is a better decision making model, and are there some situations where the more analytical, procedural decision making process is better?

Dr. Klein: There are situations that favor one or the other, but I’m uncomfortable with that question, because in most situations we have to rely on both of them. You don’t have to choose, “Should I be intuitive in this situation? Should I be analytical in that one?” We should draw on both of these strengths, and we do.

Brett: Okay. One of the interesting sections in your first book, Streetlights and Shadows, was that procedures and checklists, that methodical, analytical process, actually have some downsides. What are the downsides of relying on a procedure-based model of decision making?

Dr. Klein: Right. I don’t want to mislead anyone into thinking that procedures aren’t useful. They’re extraordinarily useful. I don’t want to take off in an airplane where the pilots have left their checklists behind and they’re just going to go by the seat of their pants.

Brett: Sure.

Dr. Klein: I want them to use checklists. There are many situations where checklists are extremely valuable: situations where the actions are repetitive, the situation is relatively straightforward, you can work out what the procedures are, and there’s not an awful lot of complexity to contend with. But in many situations that’s not the case, and the procedures break down, because people aren’t using procedures to handle complex situations. They’re relying on the patterns and on the experience that they’ve built up. Even when they’re trying to rely on procedures, they’re using their experience to know how to adapt the procedures, which steps to skip and which steps to add that maybe weren’t in the original design.

People who have experience, even when they’re trying to follow procedures, are modifying the procedures to fit the situation. Procedures by themselves aren’t going to let people perform at a satisfactory or expert level.

Brett: I think you mentioned that typically amateurs follow procedure more, and experts are more likely to use the experience they’ve gained to modify the procedure if they’ve recognized a pattern that dictates a different course of action.

Dr. Klein: Right, yeah. Novices don’t have any expertise, and so the best thing we can do for them is provide them with some ground rules and steps that they can follow, and they’re extremely grateful when we do that, but then sometimes in organizations people get carried away and say, “Let’s proceduralize as much as possible.” That shows that they don’t appreciate their skilled workers and what their skilled workers have learned and the kind of expertise that they have gained.

Brett: If we want to become more adept with our intuition, we need expertise, but how do we gain expertise? Is it just a matter of time and seeing things over and over again? Do we need to be methodical about it, or does it just sort of happen? Do we just encounter such a vast amount of experiences that through that process we become experts?

Dr. Klein: Yeah. That’s what most people rely on, is just the slow accumulation of experiences. It sort of works, because most of us get better at what we do, so there’s nothing wrong with it, except it’s awfully slow and it’s awfully haphazard. In a lot of situations, we rely on feedback, but the feedback isn’t always reliable. I think there are ways to speed up the growth of expertise, to accelerate its development. One of the ways that is available to most organizations, that they don’t use, is to take advantage of the highly experienced workers and have them coach the junior ones. That way, they can recycle the expertise and make it more broadly available.

Most organizations don’t take advantage of this. The senior people, the more experienced ones, you might say maybe they don’t want to share their secrets because then they’ll be obsolete, but even if you wait until shortly before they retire and say, “Look, can you help us out now?” they resist it, because their expertise is not based on rules, so they don’t have an easy way to describe how they recognize things, how they make perceptual discriminations, what their mental models are. All of this is called tacit knowledge, and it’s the basis of expertise. People don’t have an easy time describing it. That’s why it’s tacit knowledge. Experts don’t do a good job of explaining it.

However, what we found is that it’s possible to take the experts in the organization and sensitize them to their tacit knowledge, make them aware of the skills that they have, that they’re not just following rules, and of the opportunities that they have for bringing things to the attention of their junior colleagues. I think you can do a much better job of on-the-job learning than organizations currently do. I think that’s one big way that organizations and leaders can improve the skill level of the people in their organization.

The second way is for the learners to try to be more deliberate about how they improve, about how they work. They should be watching what the skilled performers are able to do, and then maybe they should be the ones who start prompting the discussion. I know that’s hard for somebody who’s junior, and it seems a bit aggressive, but we find that people who are experts are pretty proud of what they do and appreciate that kind of acknowledgement and that kind of attention. By saying, “When you did this, that wasn’t what I thought you were going to do. Why did you do it?” Or, “What were you noticing? What were you thinking about?” You can have that kind of a dialogue. I think it’s possible to have that kind of an arrangement.

We’ve also been using a method of skill development called the ShadowBox method, which we developed within the last year or two, which is to help trainees see the world through the eyes of the experts without the experts being there. The strategy is that we present challenging scenarios and people go through them, and then in the middle of a scenario they’ve got to answer questions like, “Rank these options of different courses of action or different roles.” They rank them and they write down their rationale, but then we show them, “Here’s how a panel of experts ranked them, and here’s what the experts were noticing. Here’s the rationale they gave,” so that the trainee gets a chance to compare his or her skills and observations with those of the experts.

The experts don’t even have to be in the room, but by capturing all that information up front, the trainees and the novices can see how the experts were viewing the situation and how that differed from the way they had been viewing it.
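
[The ShadowBox flow just described, scenario, pause, rank, then compare with the expert panel, can be modeled roughly in code. This is a hypothetical sketch, not the actual ShadowBox software; every name in it is illustrative.]

```python
# A rough model of one ShadowBox-style decision point: the trainee ranks
# the options and writes a rationale, then sees the expert panel's answers,
# which were captured in advance. Illustrative only; not the real software.

from dataclasses import dataclass

@dataclass
class DecisionPoint:
    prompt: str                  # the question posed mid-scenario
    options: list[str]           # courses of action to rank
    expert_ranking: list[str]    # captured from the expert panel up front
    expert_rationale: str        # what the experts noticed and why

def run_decision_point(point: DecisionPoint, get_ranking, get_rationale):
    trainee_ranking = get_ranking(point.prompt, point.options)
    trainee_rationale = get_rationale(point.prompt)
    # The experts never need to be in the room: their captured answers
    # are replayed so every trainee can compare views with theirs.
    return {
        "trainee": (trainee_ranking, trainee_rationale),
        "experts": (point.expert_ranking, point.expert_rationale),
    }
```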

Brett: You actually started a website with this program, correct?

Dr. Klein: Correct. The website, for anybody who’s interested, is www.shadowboxtraining, that’s one word, dot com.

Brett: In your book Streetlights and Shadows, you talk about another method, the premortem. I think a lot of us have heard of the postmortem. I always did these in law school after my exams: you’d circle up with your law school buddies and dissect the exam afterwards to see what problems you might have missed. That’s the postmortem, but with the premortem, are you basically doing a postmortem before the actual event?

Dr. Klein: Right. The postmortem concept comes from medicine, where you have a discussion after a patient has died to see what caused it. It’s a valuable opportunity to learn. The physicians learn, and the family members can learn why their loved one died. If there’s something unusual, then you write it up so the whole community can … Everybody benefits, except for the patient. The patient is dead. We said, “Why don’t we move that up front?” If you’re doing a project, instead of doing a postmortem at the end, which you may still want to do, why not do a premortem?

The way it works is really a form of risk assessment. If you have a plan, if you have a new program getting started, and you’ve got the team ready and they’re all enthusiastic and they see what the plan is, then usually at the kick-off meeting or as you’re planning it you’ll say, “Okay, does anybody have any criticism or see any weaknesses?” It’s hard to say to an energetic and enthusiastic group, “Here’s my reservation,” so people tend not to respond with critiques, and they may not even be thinking about any. They’re in a mindset of “let’s do it” rather than “what could go wrong?”

We developed this premortem technique. The way it works is we say, “Okay, here’s the plan. You’re looking in a crystal ball and it’s now four months later and we started this pro- … Oh, gosh, the image in the crystal ball is really ugly. This plan has fallen apart, it’s been a disaster. We know that. The image is clear that this has been a disaster, the plan has failed.” Everybody in the room, people on the team and observers, everybody’s got two minutes to write down why the plan failed. You just wait and you give them two minutes to write down their reasons, and then you go around the room and compile the reasons.

It’s amazing the kinds of things that people pick up, because the mindset is different. Instead of the mindset being, “Yeah, let’s get started, we’re enthusiastic,” here the mindset is, “It’s not about whether it will fail; we know it has failed. Now use your experience to try to identify what went wrong.” People will say the most amazing and prophetic things, because they are in a different place. The exercise seems to work. We’ve done some research, and it seems to work far more effectively than just asking people to do a critique.

Brett: Interesting. I can see this sort of depressing people to where they’re just like, “I don’t want to take action on this.” How do you avoid that, where you get all these problems and they’re just so overwhelming that, “Oh, maybe this thing’s not going to work at all, let’s not even try”? Is there a balance to it? How do you take action despite seeing all the problems?

Dr. Klein: We were using this premortem technique, we were teaching it to other organizations and we were using it ourselves, and then some of our staff members started to raise exactly that concern. You don’t want to be overconfident and overoptimistic, but this was really reducing our motivation, so we added an additional step. I’m glad you asked me about that. The additional step at the end of the premortem is to say, “Okay, we now have all the reasons people have identified for why this plan failed. Now let’s take another two minutes and have everybody write down what he or she personally can do to prevent this outcome from occurring, to prevent these reasons from biting us.”

Now we compile that, and we found some ways to improve the plan to make it more rugged. That seems to take some of the emotional sting out of the premortem and leave people enthusiastic, but a bit shaken and maybe not overoptimistic anymore.
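
[The full exercise, including the prevention step Dr. Klein just described, has a simple procedural shape. Here is a minimal facilitation sketch; the two-minute prompts follow the conversation, while `run_premortem`, the `collect` callback, and the prompt wording are illustrative assumptions.]

```python
# A sketch of the premortem exercise as described, written as a simple
# facilitation script. The two-minute prompts follow the transcript; how
# answers are gathered ("collect") is an illustrative assumption.

def run_premortem(plan_name, participants, collect):
    """collect(participant, prompt) -> list of short written answers."""
    failure_prompt = (
        f"The crystal ball shows that '{plan_name}' has failed. It's a "
        "disaster. Take two minutes and write down why it failed."
    )
    reasons = []
    for person in participants:
        reasons.extend(collect(person, failure_prompt))   # compiled around the room

    prevention_prompt = (
        "Take another two minutes and write down what you personally can do "
        "to prevent these reasons from biting us."
    )
    preventions = []
    for person in participants:
        preventions.extend(collect(person, prevention_prompt))

    return reasons, preventions   # raw material for making the plan more rugged
```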

Brett: That’s good. I guess overoptimism has gotten us into a lot of trouble, and caused a lot of societal problems. I guess the mortgage crisis was a crisis of overoptimism.

Dr. Klein: Yes.

Brett: For example.

Dr. Klein: People are just overconfident in these situations, because when you’re just getting started, you wouldn’t start if you didn’t think it was going to be a good program, and so you’re looking at all the ways that it’s going to succeed and you tend to ignore some of the potential problems. We want to correct that.

Brett: Your most recent book is Seeing What Others Don’t, and it’s about how we gain insights. I think most people think of insights as sort of this mystical thing that just happens out of the blue, the eureka moment, and that there’s really nothing we can do to control it, no science behind it. In your book, you make the case that there are actually some patterns to insights. What are the main patterns for insights that you’ve uncovered?

Dr. Klein: I conducted a study of a hundred and twenty examples of insights. Over the last few decades, there had been a number of researchers conducting studies of insight, but they conducted them in laboratory settings, and it’s hard to schedule an insight. It’s hard to say you’re going to have an insight at 3:00 Friday afternoon. The paradigms they used presented people with an impasse problem, where people are likely to be making inappropriate assumptions that trap them, and so the problem seems unsolvable and people wrestle with it. Then some of them, not all of them, realize, “Wait a second, here’s the trap.” They discover the belief that’s trapping them, the unnecessary assumption they’re making, and then they get around the impasse.

We call that an impasse path, and when I did my review as a naturalistic decision researcher, I found that some of the cases fit that category, but not too many. That was actually one of the smallest categories I found. The most common category was a connection pathway that people use, where you have knowledge that you’ve already gathered, and then you’re exposed to some additional knowledge and you put that together with what you have, and now you have a much richer mental model, a much richer idea. It’s sort of an explosive aha experience of, “Wow, now I see what I can do that I didn’t realize before.” That’s a second path, and it was the most common.

We also found a third path, which we hadn’t seen in the literature, that we call a contradiction path. The connection path is about how you put things together. The contradiction path is when you realize that these things don’t fit together, that there’s a disconnect. Do you have time for a short example?

Brett: Sure.

Dr. Klein: This came from one of the interviews we had done in a project where we studied police officers. This highly experienced police officer is riding in his car, and he’s got a much less experienced officer next to him. They’re at a stop light and they’re just waiting for the light to turn. The younger officer looks up and he says, “Huh?” because he sees the driver of the car ahead, which is a brand new BMW, take a drag on his cigarette and then flick the ashes. He says, “What?” because nobody who’s driving a new BMW is just going to flick the ashes onto the upholstery of the car, and if you’d borrowed the car from somebody, you wouldn’t do that to somebody else’s car.

This doesn’t make any sense at all, so they pull the car over, and as you can imagine, it was a stolen car. The insight, which he got immediately, was, “These pieces don’t fit together. Something is wrong, we need to investigate.” This was a contradiction path, the third path that we identified. It turns out that there are several different pathways to gaining insight, not just working around impasses and giving up unnecessary assumptions.

Brett: In the book you talk about things that keep us from gaining insights, and you say that goals can possibly get in the way of insights. How could goals get in the way of gaining new insights? Because goals are supposed to be great, right? We’re supposed to have goals and accomplish them and feel good about ourselves.

Dr. Klein: Yeah. This happens all the time. The problem with goals … Who could be against setting goals? We certainly should set goals. The trouble is when we fixate on them. If the project is clear, the goals can be identified nicely up front, and the situation isn’t going to change, then, yeah, you try to reach your goal. The trouble is, in many cases we’re approaching what are called wicked problems, with ill-defined goals that can’t be pinned down up front and don’t have any clear definition. For example, our goal is to find a solution for global warming. What’s the right answer? There isn’t a right answer. Or, our goal is to reduce the cost of healthcare. We all want to do that. What’s the right answer? There’s not a right answer.

These kinds of goals, and many smaller ones on a personal level, are examples of wicked problems. The goals that we start out with turn out to be fairly shallow. The more we wrestle with the problem, the more we’re going to discover. The trap here is that if we stay locked into our initial idea of what the goal is, we’re not going to improve, we’re not going to shift to a more sophisticated goal. We have people, especially in organizations, who are given tasks and told, “Here’s the goal,” and they’re afraid to make any kind of change, so they lock into their initial goal, they fixate on it. Their initial goal is simply inadequate and too shallow, and they do a mediocre job.

For that reason, we advocate something we’ve called, not management by objectives, but management by discovery: have people try to identify their goal up front, but then be sensitive to what they learn about the goal as they go along, so that they can replace their initial goal with a more valuable and more powerful one.

Brett: Fascinating. In another part of your book, in Seeing What Others Don’t, you talk about the role of serendipity in insights and making new connections. I’m curious. We’re seeing this in our own lives right now with companies like Amazon and Netflix that have algorithms or programs that allow smart discovery, where you find things that are related to your existing interests. There’s no more browsing the bookstore, where you just stumble upon a book you never would’ve found if you weren’t browsing randomly. Are those algorithms going to get in the way of insights?

Dr. Klein: I think so. I certainly appreciate the power of big data. We have access to an amazing amount of data that we never could have imagined before, and different kinds of data that we can obtain very easily. There’s more data than we can possibly handle, so we want algorithms that will do the job of sorting through it. These algorithms can do, and do, a wonderful job of finding patterns, tracking trends, and things like that. The problem with a big data approach is that it locks us into what the original programmers knew, what their beliefs were, and what their dimensions were, and so the algorithms reflect what people already knew.

You can use these programs to follow historical trends, which is very powerful, but what happens if you get into a situation that has changed in some subtle but really important way, so that the historical trends don’t apply anymore? What we see happening in organizations is people giving up their own control to the analytical approaches and relying on them, and so their expertise starts to diminish as they cede control to the programs. The programs, which are very good at crunching the data using what we already know, are usually not as good at picking up departures from previous trends, at noticing that the world has shifted in some small, subtle, but significant ways that have to be taken into account.

Without somebody watching what’s happening, the programs can mislead us, if we’re not careful. If we’re not mindful of this, we’re going to lose our capability to provide an oversight for these kinds of programs and just become more and more dependent on them, and I’m seeing that happen in too many situations.

Brett: Sure. People would say in the stock market you might be seeing that, and other areas of economic life as well. I guess the trick would be use these as a tool, but don’t make them a crutch.

Dr. Klein: Right. For people who want to be leaders, there’s no way to evade the responsibility of developing your own expertise, building your own mental models, and making sure that the people on your team continue to develop expertise so that they can work together more effectively.

Brett: What can individuals do to cultivate insights? Are there practices or routines or something of that nature that we can put into place in our everyday lives and basically make it a more fertile environment for insights?

Dr. Klein: I’ve wondered about that ever since I started working on the book. It’s only in the last few months that I’ve come up with something. I haven’t tested it, so I may be wrong, but it’s an idea that I’m playing around with now. The idea is about having people cultivate an active mindset, an active stance, I’m calling it an insight stance, or an “in-stance,” about things that happen around us. Part of that will involve noticing our insights, our large ones, but even our small ones. We tend to dwell on our mistakes, a lot of us do. I know I do. “I should have done this,” or, “I should have realized that.”

That’s helpful, but we should also celebrate the insights that we have, when we notice contradictions that other people weren’t spotting, or when we make connections that other people hadn’t seen, or when we realize what’s wrong with one of our beliefs or mental models and improve it. We should be celebrating these and appreciating how we’re continuing to build expertise. Part of it is just becoming more sensitive to the insights that we have, and more alert to the insights that we might have. Related to that is trying to encourage people’s curiosity, so that if you see a coincidence you don’t just dismiss it: “Oh, it’s just a coincidence.” Maybe it is, but maybe it deserves a second or two to think, “I wonder, could there be something here that is worth my attention?”

Or, if you receive a piece of information that contradicts what you believe, instead of saying, “Well, that’s just an anomaly, I don’t have to pay attention to it,” most of the time you don’t, but maybe give it a few seconds and say, “If this was actually an accurate data point, what could it be telling me?” We can try to be more deliberative, more mindful about becoming open. It’s not simply becoming open, that sounds too passive. It’s about being curious about coincidences and connections and anomalies and things going bad, rather than just doing our work in a mindless fashion. You could also try to be more alert, when working with a team, be more alert to conflicts and confusions.

I’ll give you an example there. I was putting on a workshop for a bunch of executives. We were talking about expressing intent, how important it is and how hard it is. One of the executives said, “I know just what you mean. I just had a situation in the last couple of days. I gave one of my subordinates a job to do. I told him, ‘Here’s what I want,’ I explained what I wanted. My deputy was there, so he heard me. Off he went. Then I brought him back a day or two later to see how he was doing, and he was going off in the wrong direction. He just totally missed the boat.

“I said, ‘No, no. That’s not what I want. Here’s what I want,’ and he said, ‘Okay, I’ll try to do better,’ and off he went. I turned to my deputy and I said, ‘Didn’t I explain?’ He said, ‘Yeah, you did, he just missed it.’” He was resonating with what I was saying, but that didn’t feel right to me, so I asked him, “When he came back and he had missed the boat, did you think about asking him, ‘What did you think I wanted?’” The man said, “No, I didn’t. Why would I do that?” I thought, of course that’s what you’d want to do. If somebody has misunderstood your directions, maybe there’s a flaw in that person’s mental model, and this is a chance for you to discover it with him. Or maybe your description wasn’t as clear as you thought, and this is an opportunity for you to get feedback about how people are understanding your directions, so maybe you can do a better job.

There are all kinds of opportunities when people are confused or having a conflict, things we would just as soon sweep under the rug. Instead of sweeping them under the rug, we can say, “This is an opportunity to gather some insights.” There are ways of having an insight stance individually and also in an organization.

Brett: Fascinating. Gary, where can people find out more about your work?

Dr. Klein: I’ve given the website, www.shadowboxtraining.com. There is another website that we have, www.macrocognition, that’s all one word, macrocognition.com. Or, they can look up my work on Amazon or go into a bookstore. My book, Seeing What Others Don’t: The Remarkable Ways People Gain Insights, or Streetlights and Shadows: Searching for the Keys to Adaptive Decision Making, those are my two most recent books.

Brett: Fantastic. Dr. Gary Klein, thank you so much for your time. It’s been a pleasure.

Dr. Klein: Thank you. I appreciate the conversation.

Brett: Our guest today was Dr. Gary Klein. He is a research psychologist specializing in naturalistic decision making. Make sure to check out his book Streetlights and Shadows on Amazon.com. Completely fascinating. If you enjoyed our articles on situational awareness or the OODA loop, this book will flesh out some of the concepts we discussed in those articles. Also, check out his latest book, Seeing What Others Don’t, all about the science of insight. Really fascinating, and you’ll also get some takeaways on how you can have more insights in your own life, or create an environment for insights. Make sure you also follow Gary on Twitter; he’s always posting interesting research. His Twitter handle is @Kleinsight. Then you can visit his websites to learn more about his work: macrocognition.com and shadowboxtraining.com.

That wraps up another edition of The Art of Manliness Podcast. For more manly tips and advice, make sure to check out The Art of Manliness website at artofmanliness.com. If you enjoy this podcast and have gotten something out of it, I’d really appreciate it if you go to iTunes or Stitcher or whatever it is you use to listen to the podcast and give us a review. I’d really appreciate that. Also, please recommend us to your friends. That’s the best compliment you can give us. Until next time, this is Brett McKay telling you to stay manly.
