Bayesian Modeling in Intelligence Analysis

I stumbled across an unclassified CIA document evaluating Bayesian modeling for intelligence analysis. Statistics are commonly used in analysis, but often the data that best supports a prediction is not quantifiable. In the rare cases where it can be quantified, advanced statistics can be quite helpful.

Bayes can help analysts in several ways, which I will illustrate with an example. We take for granted that most analysts have extremely good knowledge of their area of specialty, so it is reasonable to assume they can make ballpark probability estimates. Take the North Korean case. Using current information, an analyst may predict that there is a 15% chance of North Korea attacking the South within the next month. But what happens when new information comes in? When faced with new and possibly alarming information, like troop movements near a border, one can be predisposed to overestimate its impact. Luckily, Bayes' theorem lets us rationally update the previous intelligence assessment.

Using the Korean example again, let's say the analyst starts with a 10% chance of an attack. Then a new report is rushed in suggesting that an attack may be imminent: perhaps troop movements like the ones just observed near the border are seen before 90% of attacks. Someone without an understanding of Bayes might panic and conclude that an attack is now almost certain. A Bayesian update that folds in the new information, however, might put the chance of war at roughly 50%, which, while noticeably higher, is not nearly as dire as the non-Bayesian reading would suggest.
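
To make the arithmetic concrete, here is a minimal sketch of that update in Python. The 10% prior is the one from the example; the two likelihoods (how often such troop movements are seen before an actual attack versus in a month with no attack) are my own illustrative assumptions chosen so the numbers land near 50%, not figures from the CIA document.

```python
# Minimal Bayes update for the Korea example. The prior comes from the post;
# both likelihoods are illustrative assumptions, not figures from the CIA study.

def bayes_update(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Return P(hypothesis | evidence) via Bayes' theorem."""
    numerator = p_evidence_given_h * prior
    denominator = numerator + p_evidence_given_not_h * (1 - prior)
    return numerator / denominator

prior_attack = 0.10          # analyst's prior: 10% chance of an attack this month
p_moves_if_attack = 0.90     # assumed: such troop movements precede 90% of attacks
p_moves_if_no_attack = 0.10  # assumed: they also occur in 10% of months with no attack

posterior = bayes_update(prior_attack, p_moves_if_attack, p_moves_if_no_attack)
print(f"P(attack | troop movements) = {posterior:.2f}")  # -> 0.50
```

The posterior of 50% is five times the prior, so the new report clearly matters, but it is nowhere near the near-certainty a naive reading of the 90% figure might suggest.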

This is not to say that intuition should be thrown out the window, or that our nation's intelligence community should make decisions based on cold statistics alone, but I do believe that Bayesian analysis could seriously improve the quality of analysis when used as a check on other predictions.


3 thoughts on “Bayesian Modeling in Intelligence Analysis”

  1. I agree that this is where Bayes’ Theorem is extremely useful. This has been demonstrated in the case of disease testing quite a bit.

    Tests for extremely rare diseases often have a false positive rate that is higher than the disease's actual rate of incidence. This means that even if you take a test and get a positive result, your knowledge of just how rare the disease is should outweigh the result.

    That is, Pr(Have the disease | Positive test) is still very low. Higher than simply Pr(Have the disease), but not high enough to get too worried yet. (A quick numeric sketch of this appears after the comments.)

  2. This sounds a lot like actuarial science. It seems to me that this might run into practical issues when put into practice: how can you know that one event is actually linked to the outcome you think it's linked to? For example, how could anyone be sure that troop movements within a country have anything to do with the likelihood of an impending war? There are so many variables. There's a famous correlation linking the decline of sea pirates and the rise of global warming. Statistically, that makes sense, but like you say, intuitively Bayes can be misleading.

  3. This is a good description of the cases where Bayes' Rule is used to calculate odds. I like the application to the intelligence world as well, because it shows how widely the subjects we have covered thus far apply to different types of situations. I think that it is very useful, and I also wonder whether they (the CIA) incorporate some use of cascading behaviors within networks to model the possible spread of panic and hostility to neighbors and/or allies of North Korea (China?).
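
As a quick numeric sketch of the disease-testing point in the first comment: with an assumed prevalence of 1 in 10,000, a test that catches 99% of true cases, and a 1% false positive rate (all illustrative numbers, not data for any real test), the same kind of update gives the following.

```python
# Sketch of the disease-testing case with assumed, illustrative numbers.
prevalence = 0.0001         # assumed: 1 in 10,000 people have the disease
sensitivity = 0.99          # assumed: P(positive test | disease)
false_positive_rate = 0.01  # assumed: P(positive test | no disease)

p_positive = sensitivity * prevalence + false_positive_rate * (1 - prevalence)
p_disease_given_positive = sensitivity * prevalence / p_positive
print(f"P(disease | positive) = {p_disease_given_positive:.3f}")  # roughly 0.01
```

Even a positive result leaves the probability around 1%, roughly a hundred times the prior but still low, which is exactly the commenter's point.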
