Fog of War: Errol Morris and Robert McNamara Discuss Military Failures
An intelligent person plans effectively for future activities and anticipates the scenarios that could cause problems. A wise person not only reviews those plans and the actions that followed, but also applies the lessons learned to future engagements.
Fog of War (2003), from Academy Award-winning documentarian Errol Morris, is a film built from a series of interviews with former Secretary of Defense Robert McNamara, who served (1961–1968) under Presidents John F. Kennedy and Lyndon B. Johnson. Not known for humility or for admitting his failures, McNamara used the film, released six years before his death at the age of 93, as a retrospective review of his actions as the controversial SECDEF during the Vietnam War and the Cuban Missile Crisis. McNamara came from an academic background and served during WWII on General Curtis LeMay's staff, planning the bombardment of Japan. After the war, he became the first person outside the Ford family to lead Ford Motor Company and subsequently improved Ford's profits. JFK appointed McNamara to Defense because of the non-traditional, data-driven decision making he believed the Defense Department needed.
McNamara was a lightning rod for controversy. He was disliked by the defense establishment, the political establishment, and the anti-Vietnam War demonstrators alike. When one studies him, one notices that he rarely apologized for his errors, nor did he admit those failures openly. Fog of War, along with his 1995 book In Retrospect: The Tragedy and Lessons of Vietnam, opened the door to those tough lessons. Watching McNamara discuss the past, one sees an elderly man who is openly emotional about his mistakes, wishing he and others could have done better. That is the power of this documentary: a man admitting at the end of his life that some of his life's work was wrong.
Throughout the film, McNamara reviews his life and offers 11 lessons for future generations on security, war-fighting, and political planning.
Lesson #1: Empathize with your enemy.
If one wants to defeat an enemy, or at least break even, one must empathize with one's counterparts. As military leaders, we cannot simply project our own social, military, or political assumptions onto adversaries; we must evaluate their actions through their own assumptions. McNamara reviews the Cuban Missile Crisis and points to several occasions where JFK and his administration responded to Soviet actions through U.S. assumptions. Years after the crisis, through conversations with Cuban President Fidel Castro, McNamara realized those assumptions had been incorrect. The same was true of U.S. assumptions in fighting the North Vietnamese.
Lesson #2: Rationality alone will not save us.
Rationality underpins nuclear deterrence theory; the idea is that rational people will not make decisions that cross the nuclear threshold. McNamara notes that all the players in the Cuban Missile Crisis were rational (JFK, Khrushchev, Castro, etc.), yet the situation played out in an irrational manner, pushing toward the nuclear red line. He reiterates that it was pure luck that the Cuban Missile Crisis ended the way it did.
Lesson #3: There’s something beyond one’s self.
McNamara walked away from a thriving academic life to join the U.S. Army Air Corps during WWII. Later, as the extremely successful head of Ford Motor Company, he accepted a significant pay cut to serve in the Kennedy Administration. Several times in his life he sacrificed money and fame to take on government work where his talents could make a significant difference.
Lesson #4: Maximize efficiency.
McNamara was known as an efficiency expert. Everything he did in his professional endeavors focused on improving effectiveness by increasing efficiency. He highlights his time working on Guam during the U.S. firebombing of Japanese cities and how his analysis increased weapons effectiveness. Looking back at LeMay's campaign, he concludes that their actions against Japanese civilian populations could have been considered war crimes.
Lesson #5: Proportionality should be a guideline in war.
Warring nations should remember to respond proportionally. Even with its technological advantages, the U.S. should continue to use proportionality as a guideline of war, ensuring our actions do not obliterate cities or populations. Even if a military action is easy and meets the objective, it should be weighed against the total brutality inflicted on the adversary. Destroying huge percentages of Japanese cities during WWII was easy for the U.S. Army Air Corps, but those actions inflicted brutality far beyond anything the military objective required.
Lesson #6: Get the data.
Make decisions from data as much as possible to counter emotional responses. Decision making does not stop at the first layer; it requires thorough root-cause analysis. He highlights Ford Motor Company's work in the 1950s to get at the root causes behind automobile injuries, running even the most primitive tests to get answers.
Lesson #7: Belief and seeing are both often wrong.
McNamara points to several occasions where his beliefs and the things he witnessed did not match reality. We may think we know a situation and its background, but most of the time our estimates are incorrect. Received intelligence and after-action reports often do not match what actually happened during the events. McNamara references the Gulf of Tonkin incident and how reality did not match the story.
Lesson #8: Be prepared to reexamine your reasoning.
Because our assumptions can be incorrect, or the situation may not match what we see or believe, one must be ready to re-evaluate one's own decision logic. It is easy to acknowledge problems in the input data; it is harder, but necessary, to examine why we think the way we do.
Lesson #9: In order to do good, you may have to engage in evil.
“Recognize at times we have to engage in evil, but minimize it.” McNamara believed that in order to achieve political objectives, military actions sometimes have to take on an “evil” dimension. This sounds counter to some of the previous lessons, but it ties back to proportionality and minimizing evil: militaries may have to use weapons or tactics perceived as evil in order to meet the objective.
Lesson #10: Never say never.
Absolutely brilliant in its simplicity: keep possibilities open.
Lesson #11: You can’t change human nature.
No matter how much society or the rules of international law change, human nature remains the same, and it is not necessarily a positive one. McNamara makes these statements through the lens of looking back on his own life: in hindsight, we can peer through the fog of war (and of life) and see how little things change.
Fog of War is a great documentary and should be watched alongside The Unknown Known (2013), Morris's documentary about Donald Rumsfeld. The power of this documentary lies in the reflection on one's life and where one made mistakes. McNamara looks broken, maybe beaten, and realizes that as his life comes to an end the best thing for him to do is talk with the U.S. political equivalent of a minister: a documentarian. Morris's interviewing approach is also unparalleled in setting the
A 19-year Active Duty Air Force Officer who loves a good story. At the time of this article, he is reading the book “Justinian’s Flea” by William Rosen.