We all have biases. We demonstrate this whenever we show an implicit, often unspoken tendency for or against someone or something. For better and for worse, biases are baked into our social, psychological, and cognitive evolution as human beings. Our pattern-seeking brains are hardwired to make predictions, and biases make life quicker and easier to predict.
Biases can be excellent things, for example, when we prefer foods that are good for us to foods that are bad for us, or when we teach our children not to get into cars with strangers. But biases can be nasty things, too, for example, when we succumb to preconceived notions about race, religion, or culture and then make assumptions and generalizations according to those prejudices.
Even protective intelligence analysts are prone to biases; it can't be helped. But we can help ourselves, and those who consume the intelligence that we produce, by recognizing our biases and doing our best to mitigate them.
Four types of biases, and what they mean for intelligence analysts
Analysts working in protective intelligence must overcome at least four different types of biases. Let's take a look at each of these.
1. Cultural biases:
Everyone was born into a culture, and we naturally judge the world, and people of other cultures, according to our own cultural standards. For example, it's not that the Japanese tendency to put the group before the individual is right, or that the American propensity to put the spotlight on the individual is wrong; they're just expressions of different cultures. Where things go wrong for intelligence analysts is when our culturally biased evaluations lead to intelligence failure.
For example, consider a junior analyst asked to research a foreign location with which he had no experience. Instead of becoming familiar with the country's culture and customs, asking someone who was familiar with them, or admitting that he had no experience with the foreign culture, the analyst wrote a report that leaned heavily on his American background.
While this should have been obvious to the reviewer of the report, it wasn't. Since the analyst's previous reports had always been accurate and effective, the reviewer indulged in another bias and simply checked the report for grammatical errors and basic information before forwarding it to the marketing team headed abroad. The report, culturally biased as it was, proved to be misleading and did the marketers more harm than good. It was a long time before the marketing team again trusted their intelligence colleagues.
2. Organizational biases:
Most of the work that intelligence analysts do occurs in large, complex organizations, where decision-making is based on authority and power as well as on rational criteria. Organizational politics are as real as organizational charts. It should be no surprise that organizational biases can contribute to poor intelligence analysis.
Corporate and world histories are rife with intelligence failures due to organizational bias. Politics can enter into every stage of the intelligence cycle, from selectively framing what's important in the planning phase, to meeting top-down expectations in the collection and analysis phases, to avoiding unpopular or "politically incorrect" conclusions in the analysis and dissemination phases. As much as we hate to admit it, this can happen in protective intelligence analysis, too.
When the highest-paid opinion in the room has spoken, it can be difficult for a lowly analyst to do anything but fall in line.
3. Self-interest biases:
We're all motivated to "look out for number one," both as individuals and as groups. But when this motivation goes too far and we put our own interests above those of others or of the organization as a whole, things can quickly go south.
This is also true for intelligence analysis. It's not just a case of analysts shaping intelligence for their own monetary or career gains, although this can happen, too. It also occurs because people interpret reality in their own way based on individual perspective, expertise, and personal hobby horses.
The self-interest bias can be mitigated by making decision-making criteria explicit and clear from the beginning and revisiting them frequently throughout the intelligence cycle. It also helps to recognize that different individuals and groups have different interests and that these must be faced openly even if this rocks the boat.
4. Cognitive biases:
If the three bias types mentioned above are hard to recognize and mitigate, cognitive biases rooted in the way our brains work can be even more difficult to combat. Amos Tversky and Daniel Kahneman's groundbreaking 1974 research on judgment under uncertainty showed how deeply heuristics and biases shape our thinking, and the CIA's Richards J. Heuer, Jr. later summed up the problem: cognitive biases are "mental errors caused by our simplified information processing strategies." In other words, these biases are baked into how we as humans think; they are not something we learn through experience or education.
Here, we turn our attention to how cognitive biases can influence protective intelligence analysis. As Heuer writes in Psychology of Intelligence Analysis, analysts must be aware of five cognitive biases in particular:
Vividness beats boring but true: Heuer describes it like this: "Information that is vivid, concrete, and personal has a greater impact on our thinking than pallid, abstract information that may actually have substantially greater value as evidence." In other words, what stands out sticks, even if it's not true.
Things we hear or experience ourselves rank higher than second-hand information. Stories, anecdotes, and cases influence us more than statistics. Again, it's just how we're wired. Analysts, and the consumers of our reports, can easily push aside piles of correct but abstract and boring data and instead give more importance to a memorable tale they heard first-hand.
No evidence? No problem: Analysts are often tasked with coming up with reports on situations and issues for which significant information is lacking. If all the key evidence were there and easily accessible, then why would our stakeholders ask us to come up with it in the first place?
Unfortunately, you don't know what you don't know, and "out of sight, out of mind" can easily end up "out of report." To mitigate this bias, analysts need to recognize what they don't know, determine its relevance, and factor it into both their analyses and the level of confidence they place in their findings.
Fooled by consistency: Consistency is a must for analysts, and inconsistencies in analysis and reporting are sure signs of trouble. But consistency is only great until it isn't.
Information might be perceived as consistent, and therefore valid, because it is frequently correlated in the same way or repeated again and again. But apparent correlation plus repetition don't add up to truth. Information might seem consistent only because it all comes from the same source or from a limited group of sources. A small amount of consistent evidence, no matter how often it is repeated, should make analysts think hard about its validity and about their level of confidence in evaluations based on it.
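To make this concrete, here is a minimal sketch, with invented reports and source names, of tracing seemingly consistent reporting back to its originating sources before judging how much independent evidence actually exists:

```python
# A minimal sketch with hypothetical reports: trace "consistent" reporting back
# to its originating sources before judging how much independent evidence exists.

reports = [
    {"claim": "protest planned at venue", "origin": "local blog"},
    {"claim": "protest planned at venue", "origin": "local blog"},      # re-post
    {"claim": "protest planned at venue", "origin": "local blog"},      # aggregator copy
    {"claim": "protest planned at venue", "origin": "police liaison"},
]

raw_count = len(reports)
independent_origins = {r["origin"] for r in reports}

print(f"Reports seen: {raw_count}")                        # looks like four data points
print(f"Independent origins: {len(independent_origins)}")  # really only two
```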
Basing too much on best guesses: We base judgments on different bits of information, and it's hard to know whether information is accurate and just how accurate it is. We might have misunderstood something, we might not have seen some connections, our sources might want to mislead us... the list goes on. But life goes on, too. At some point, we have to decide about the validity of our information.
Heuer writes that because we have such a hard time dealing with complicated probabilistic relationships, we tend to default to a "best guess" strategy, short-circuit the difficulty, and simplify matters into a yes-no decision: either this bit of information is accurate, or it isn't.
Instead of betting too much on best guesses, Heuer recommends breaking the issues down so separate bits of information can be assigned separate levels of probability. Then, the analyst needs to figure out how to integrate all of these different probability levels into an overall evaluation. Again, easier said than done, but much better to try than not.
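A minimal sketch of the kind of compounding Heuer is warning about, using invented accuracy figures and a made-up two-report scenario, shows why rounding each report to "accurate" or "not accurate" can overstate our confidence:

```python
# A minimal sketch (hypothetical figures) of why "best guess" rounding misleads:
# carry the probabilities through to the conclusion instead of rounding each
# report to "accurate" or "not accurate".

reports = {"report_a": 0.70, "report_b": 0.60}  # analyst's accuracy estimates

# "Best guess" strategy: round each estimate to a yes/no call, then conclude.
best_guess = all(p >= 0.5 for p in reports.values())
print("Best-guess verdict:", "scenario holds" if best_guess else "scenario fails")

# Probabilistic strategy: the scenario needs BOTH reports to be accurate,
# so (assuming independence) the probabilities multiply.
overall = 1.0
for p in reports.values():
    overall *= p
print(f"Probability the scenario holds: {overall:.2f}")  # 0.42, worse than a coin flip
```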
Sticking to bad evidence despite the facts: Finally, Heuer asks us to consider another cognitive bias that baffles psychologists: "Impressions tend to persist even after the evidence that created those impressions has been fully discredited." Once we believe something, we tend to keep believing it even though the reasons for our beliefs are proven wrong.
This is not just stubbornness. Subjects in numerous psychological experiments have continued to hold false beliefs they were inculcated with as part of the experiment, even after the researchers went out of their way to show that those beliefs were based on false evidence.
Analysts can run into trouble with this bias, too. If they have made judgments based on sources they deemed accurate, and the reliability of those sources later comes into question, it can be difficult to rethink the entire judgment, even if this might be called for.
Know your biases, and use structure and tools to mitigate them in protective intelligence
When hiring, use interview questions about a candidate's awareness of their own biases as a tool to gauge analyst maturity and opportunities for mentoring. Most junior analysts are either not aware of their own biases or think they have none; they concentrate on the process, the research, and the finished product. If you follow up with, "How would you start to learn what your biases are?" and they stick to the notion that they have no biases, then you might want to ask even more questions.
We all have biases, so the only way to mitigate them is through structure and tools which force us to address them in the products we provide.
Systematic approaches to intelligence analysis, like the Structured Analytic Techniques (SATs) often used at the CIA, walk analysts through methods designed to ensure that they use accurate information, challenge their initial hypotheses, and review all possibilities. SATs have been criticized as somewhat outdated and as not doing enough to counter the impact of biases, but they are a good starting point for young analysts, and for the rest of us.
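One of the best-known structured techniques is Heuer's Analysis of Competing Hypotheses (ACH), which scores every piece of evidence against every hypothesis and favors the hypothesis with the fewest inconsistencies. The sketch below is a minimal, hypothetical illustration of that scoring idea; the hypotheses, evidence items, and marks are invented for the example, and a real ACH workup involves far more rigor:

```python
# A minimal, illustrative sketch of one structured technique, Analysis of
# Competing Hypotheses (ACH): score evidence against every hypothesis and
# favor the hypothesis with the fewest inconsistencies. All hypotheses,
# evidence items, and marks below are invented for the example.

# For each evidence item: "C" = consistent, "I" = inconsistent, "N" = neutral.
matrix = {
    "letter cites principal's travel dates": {"targeted threat": "C", "random nuisance": "I"},
    "author has no prior contact on record": {"targeted threat": "N", "random nuisance": "C"},
    "author researched the venue layout":    {"targeted threat": "C", "random nuisance": "I"},
}

hypotheses = {h for marks in matrix.values() for h in marks}
inconsistencies = {h: 0 for h in hypotheses}

# ACH emphasizes disconfirmation: tally the evidence each hypothesis cannot explain.
for marks in matrix.values():
    for hypothesis, mark in marks.items():
        if mark == "I":
            inconsistencies[hypothesis] += 1

# The hypothesis with the fewest inconsistencies survives best; it is not "proven".
for hypothesis, count in sorted(inconsistencies.items(), key=lambda kv: kv[1]):
    print(f"{hypothesis}: {count} inconsistent item(s)")
```

Even a toy matrix like this forces the analyst to confront the evidence that does not fit the preferred hypothesis, which is exactly the discipline such techniques are meant to impose.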