A military adage from the intelligence community. It is very difficult to gauge the intent of an enemy who doesn't think like you, who doesn't look like you, who doesn't wage war like you. This is one of the few rules that has made it into the canon of intelligence wisdom. It's very similar to the civilian philosophy that "when all you have is a hammer, all your problems look like nails." The Department of Justice can use this philosophy to prosecute people like the Unabomber, who get caught hoarding materials for homemade pipe bombs. The three-word phrase is almost self-explanatory, but here's a very brief example:
  • Country A and Country B are traditionally enemies.
  • Country A usually maintains several divisions of tanks at the border with Country B, and Country B maintains a moderately resilient force with a defense-only posture.
  • For no apparent reason, Country A scales down its tank and troop presence at the border.
  • Two weeks later, Country A releases footage of a successful IRBM (intermediate-range ballistic missile) launch.

You may think this means that Country A is preparing for a war, but based on the idea that capability implies intent, you can safely say that Country A won't invade Country B with tanks anytime soon. The somewhat scary downside is that, because they are capable of launching IRBMs, you have to assume they intend to use them, possibly in a first strike that eliminates the need for a war, and that they pulled the tanks back so as not to pressure Country B until they have enough missiles built.

One more succinct example, from my father's advice to me as a child: "Never point a gun at anything you don't want to shoot."

I definitely agree with the assessment below: using this doctrine as your sole source of intelligence is a bad idea, as it produces the scariest, most conservative estimate and sketches in details that may not be there at all. Most analysts agree that no action should be taken based on the content of a single report; unfortunately, an analyst's job is to provide intelligence, not to suggest a course of action.
One major problem with this adage is that it can lead otherwise peaceful nations into the nastiness of arms races, brinkmanship, or even unwanted war. The problem here lies in the conservative thread that underlies this behavior: if you don't understand the opponent, then you must prepare for the worst, since you won't be able to tell whether the signs point toward an attack or away from one.

Consider:

  • Country A, Country B and Country C are opponents.
  • Country B lies between A and C.
  • Country A has always maintained a strong conventional military presence on the border with B as a deterrent/defense against attack.
  • Country B has always maintained its land forces at the borders of A and C for the same reason.

Now suppose Country C develops and tests an IRBM. If B and C have a longstanding grudge, then Country B is suddenly in a vulnerable position; logically, it begins to pursue IRBMs as well in order to deter C. A, however, sees B begin to develop a technology that threatens it, and by this adage must assume that B intends to use that capability, perhaps to preempt A. A has the following choices:

  • Ignore the situation, which is unwise according to this node’s prescription.
  • Start developing its own IRBMs to counterbalance Country B. This might then drive Country B, in turn, to try to maintain enough forces to offset both A and C, which would mean that it had enough forces to perform a splendid first strike on either one singly... and so it goes.
  • Preemptively attack B using its existing forces in order to avoid falling dangerously behind in the missile race.

The conflation of capability with intent is the root cause of arms racing and conflict spiraling (where small countermoves beget additional countermoves, which beget larger moves, and so forth). One reason this sort of analysis continues to get done is that (as noted above) it is far easier to do beancounting than to understand an opponent. It is much easier to say to one's superiors 'Here is their order of battle, and I'm 95% sure of the totals; if I'm wrong, it's intelligence's fault' than to say 'Here is what we think they want and what they will do.' While the second is more useful, it is much easier to get wrong, which of course no analyst wants to do, and no decision-maker likes to risk.
