
The fog of certainty: Learning from the intelligence failures of the 1973 war

Photo: Then-Israeli Major General (res.) Ariel Sharon confers with comrades in the Sinai Peninsula during the 1973 Middle East War, October 10, 1973. (Shlomo Arad/Government Press Office via Reuters)

At 2 p.m. 44 years ago, on October 6, 1973, deafening air-raid sirens broke the silence that enveloped Israel on the holiest day of the Jewish year, Yom Kippur. A harrowing portent of the dire days that lay ahead, this marked the beginning of the 1973 war between Egypt, Syria, and Israel, known to Israelis as the Yom Kippur War and to Arabs as the October War. Among the many aftershocks of this fateful clash, the hardest for Israel to digest was Syria and Egypt’s successful strategic and tactical deception, and the failure of the Israeli intelligence community (IC) to provide the necessary warning despite having sufficient intelligence indicators.

This professional failure was due largely to what became known as “the concept”: a preconception, held across the Israeli IC, that the Arab states would not initiate a war against Israel that they could not win. The preconception proved wrong, and clinging to it until the last minute proved a fatal mistake.

After the war, Israel established a special state commission of inquiry, the Agranat Commission, to assign personal accountability for the failures and to propose structural changes to the Israeli IC. The commission’s recommendations led to the forced resignation of the Israel Defense Forces (IDF) chief of staff, David Elazar, and eventually to the voluntary resignations of Prime Minister Golda Meir and Defense Minister Moshe Dayan. The commission also assigned additional analytic responsibilities to both the Mossad and the Ministry of Foreign Affairs.

The Israeli failure to recognize and respond to key intelligence indicators was neither the first nor the last time a misconception proved fatal. Despite many lessons learned, which resulted in substantial structural changes both in Israel and in intelligence communities worldwide, the danger of relying on unchallenged basic assumptions remains. Only by comprehending the nature of perception, and the inherent limitations and pitfalls involved in processing ambiguous information, can we mitigate the dangers inherent in assumptions and preconceptions.

Victory and Defeat Inverted

“The concept” refers to the Israeli IC assessment that the probability of war between the Arab nations and Israel was low. As our colleague Bruce Riedel described at length in a recent essay on the intelligence failures in the lead-up to the 1973 war, the Israeli inability to incorporate indicators that did not fit the prevailing paradigm produced the misconception that because the Arabs could not win a war given their military preparedness, they would not initiate one.

This assumption was accompanied by an exaggerated sense of confidence brought on by Israel’s surprise military victory in the 1967 Six-Day War. Winning, under the Israeli conception, was viewed purely in military terms. Accordingly, Israelis perceived the Arab definition of victory to be reclaiming by military means the territories lost in 1967.

Egyptian President Anwar Sadat’s calculus for going to war involved considerably more ambitious and sophisticated goals, ones that could be achieved by a symbolic victory. A successful surprise attack, in which territory needed to be held for only a few days, would be enough to compel the superpowers to intervene and force the parties to the negotiating table. The Egyptians were willing to take losses on the battlefield in order to win at the negotiating table, thereby redefining the parameters of victory. Thus, despite numerous relevant and reliable indicators, Israel’s preconception survived the contradictory intelligence.

Intelligence Déjà Vu

History is littered with similar cases in which the prevailing intelligence paradigm failed to predict crucial events. To mention only a few: the Japanese attack on Pearl Harbor (1941), the Iranian revolution (1979), the Soviet invasion of Afghanistan (1979), the collapse of the U.S.S.R. (1991), the 9/11 attacks (2001), and the so-called Arab Spring.

Israel has also suffered recurring intelligence failures of its own. These include the glaring misreading of both Hezbollah and Hamas, with Israel significantly underestimating both groups in their early stages of development. Israel also utterly misjudged the political landscape during the 1982 Lebanon War, in which the Israeli goal was to upend the balance of power in Lebanon and, in so doing, destroy the Palestine Liberation Organization (PLO), weaken Syria’s influence in the country, and ultimately refashion the geopolitical landscape of the Middle East: a misguided and overly ambitious goal.

While there are obvious differences between these cases, and each may have involved additional complicating factors, it is evident that they all share an overreliance on preconceptions. Sufficient intelligence indicators existed to prevent the element of surprise, and yet, in every case, analysts’ steadfast reliance on their preconceptions in the face of contradictory information was directly responsible for the failure.

Nevertheless, preconception is not a dirty word. Intelligence attempts to elucidate the unknown, and reality cannot be interpreted without an initial subjective frame. Building an intelligence assessment begins with a preconception, which is in essence a cumulative evaluation based on both concrete intelligence inputs and assumptions. Assumptions are a necessary component of the intelligence process, as in everyday life, because they fill the gaps of missing information needed to make decisions. These gaps are most apparent when trying to ascertain leaders’ intentions or to predict the outcome of complex circumstances.

The ABCs of Misconceptions: Analytical, Bureaucratic, and Cognitive

Israel’s failure to correctly identify Egypt’s intentions in 1973 was a maelstrom of faulty judgments, the result of cultural, organizational, and cognitive biases. While such biases help to simplify complexity and resolve ambiguity, they are pernicious to intelligence analysts. The following five pitfalls capture a few of the difficulties involved in the analytical process, and warrant special attention:

  1. Hidden assumptions: At times, analysts are not even aware that an assumption exists. It can be embedded so deeply in one’s own perception, or in the organizational mindset, that it is treated as fact and never questioned.
  2. Seeking stability through heuristics: Confirmation bias is a well-known and thoroughly researched behavioral phenomenon, whereby information is interpreted in a way that confirms and reinforces prior beliefs. It is also simpler and more natural to stick to a previously well-constructed assessment than to change it. When facing new contradictory information, the default approach is to seek an alternative explanation that would not alter the assessment; new information is often assimilated into the existing picture, or explained away.
  3. Linear prediction: Prediction that extrapolates from patterns of what came before is both cognitively and methodologically simpler. The tools for forecasting a game-changing, disruptive event remain far less clear-cut.
  4. Mirror imaging: Analysts often project their own conceptual or cultural norms onto their adversary, a mindset that distorts judgments of whether an adversary’s goals are likely or not. Perception is filtered through the context in which it occurs.
  5. Groupthink: Particularly in hierarchical organizations, where discipline is required, the natural disposition may tend toward groupthink, in which institutional norms pressure members to stay in line and the consensus reigns.

Strategies for More Conscious Analysis

Today’s world is characterized by increasing complexity, making it ever more difficult to correctly predict future events and their outcomes. The main factors contributing to this complexity are the increasingly fast-paced nature of world events, the growing number of relevant participants in any given process, and the multiplying interactions between them. Relying on assumptions remains indispensable; the paradox is that as the need to rely on mindsets and preconceptions grows, so do the potential for error and the cost of those errors. Given this dilemma, the following strategies allow those involved in the decision-making process to take a more self-aware approach to assumptions and preconceptions.

  • Create a detached thinking session as a distinct step in the analytic process. Analysts should hold a session whose sole aim is to articulate, define, and challenge the most basic and profound assumptions of the prevailing mindset. This is most useful at the very beginning of a planning process, and periodically thereafter, following any apparent change in circumstances. The session should include outside participants who are removed from the planning process itself, preferably with disparate areas of expertise. Had such a session been held before the 1973 war, the Israelis would likely have further examined the Egyptian definition of victory, leading to an updated assessment.
  • Offer an alternative concept. While analysts work from a situation-specific concept, they should maintain an alternative concept on the back burner. The difficulty here is that this alternative cannot simply be the opposite of the prevailing paradigm. It has to be a valid, coherent, and fully articulated construct, valuable despite its improbability and even-handed about its consequences. One concept takes the lead role, while the alternative is carefully examined whenever indicators point in its direction. One difficulty inherent in this approach is “collection bias,” the feedback loop between collection and analysis: by focusing extensively on one preconception, intelligence collection is directed toward finding evidence that supports it. Collection efforts supporting alternative hypotheses are weaker, and the differing approach is consequently taken less seriously. Sufficient collection efforts should therefore be directed at the alternative concept, lest collection itself reinforce the bias. An example of successfully using an alternative concept is the head of IDF Military Intelligence, Major General Aharon Yariv. Prior to the 1967 war, he held the preconception that since Egypt was embroiled in the civil war in Yemen, President Nasser would not have the bandwidth to launch a war against Israel; Egypt’s military buildup was intended solely to deter Israel. Yet just days before the war broke out, he changed this assessment, using tactical information on Egypt’s air force deployment as justification for a pre-emptive attack, thus opening the 1967 war. Thanks to the alternative paradigm he maintained, coupled with a constructive dose of constant doubt, Yariv affected the course and outcome of the war.
  • Re-examine the major assumptions vis-à-vis the ever-changing reality. While this appears self-evident, it is often overlooked due to time constraints. Dedicating a separate session to the sole question of how a new piece of information changes the underlying assumptions and preconceptions can be extremely beneficial. Separating this session from the other sessions in the intelligence process increases awareness and combats the tendency to adopt comfortable explanations.
  • Actively look for misconceptions in retrospect, and not only following dramatic failures such as the 1973 war or the 9/11 attacks. Best practice dictates implementing a debriefing process in which preconceptions are reviewed to assess whether they proved useful or damaging, and why. This practice is crucial because it buttresses institutional memory, ensuring the organization is prepared to handle misconceptions in the future. It also creates the necessary individual self-awareness, as analysts learn to cope with their own psychological limitations.
  • Make the institutional framework underlying these strategies pluralistic and open, so that colleagues can challenge their superiors during the thinking process and speaking up is encouraged. Establishing such a culture should be the highest priority, and the organization must be willing to pay the price of nurturing, teaching, and practicing it.

There is no easy remedy for the age-old problem of being led astray by misconceptions, just as there is no way to avoid ever making misguided decisions. The measures suggested here are no panacea; they can, however, make an already difficult and ambiguous process considerably less precarious. Objective analysis is achieved not by avoiding preconceptions altogether, but by re-examining the essential components of the mindset from the ground up. Basic assumptions and reasoning should be made as explicit as possible, so that members of the intelligence community can challenge them, ensuring objectivity and validity.
