Last week, the commanding general of Special Operations Command Africa was reprimanded over an October 2017 ambush in Niger that left four American soldiers dead.  In all likelihood the actual issue is that the incident brought unwanted attention to the American presence in Niger.  Nevertheless, the Army’s attitude toward this incident reminds me of Tom Wolfe’s The Right Stuff:

Barely a week had gone by before another member of the Group was coming in for a landing in the same type of aircraft, the A3J, making a ninety-degree turn to his final approach, and something went wrong with the controls, and he ended up with one rear stabilizer wing up and the other one down, and his ship rolled in like a corkscrew from 800 feet up and crashed…after dinner one night they mentioned that the departed had been a good man but was inexperienced, and when the malfunction in the controls put him in that bad corner, he didn’t know how to get out of it.
[…]
Not long after that, another good friend of theirs went up in an F-4, the Navy’s newest and hottest fighter plane, known as the Phantom. He reached twenty thousand feet and then nosed over and dove straight into Chesapeake Bay. It turned out that a hose connection was missing in his oxygen system and he had suffered hypoxia and passed out at the high altitude…How could anybody fail to check his hose connections? And how could anybody be in such poor condition as to pass out that quickly from hypoxia?
[…]
When Bud Jennings crashed and burned in the swamps at Jacksonville, the other pilots in Pete Conrad’s squadron said: How could he have been so stupid? It turned out that Jennings had gone up in the SNJ with his cockpit canopy opened in a way that was expressly forbidden in the manual, and carbon monoxide had been sucked in from the exhaust, and he passed out and crashed. All agreed that Bud Jennings was a good guy and a good pilot, but his epitaph on the ziggurat was: How could he have been so stupid? This seemed shocking at first, but by the time Conrad had reached the end of that bad string at Pax River, he was capable of his own corollary to the theorem: viz., no single factor ever killed a pilot; there was always a chain of mistakes. But what about Ted Whelan, who fell like a rock from 8,100 feet when his parachute failed? Well, the parachute was merely part of the chain: first, someone should have caught the structural defect that resulted in the hydraulic leak that triggered the emergency; second, Whelan did not check out his seat-parachute rig, and the drogue failed to separate the main parachute from the seat; but even after those two mistakes, Whelan had fifteen or twenty seconds, as he fell, to disengage himself from the seat and open the parachute manually. Why just stare at the scenery coming up to smack you in the face! And everyone nodded. (He failed—but I wouldn’t have!)
–Tom Wolfe, The Right Stuff
Feel free to read the whole thing; it’s worth it.  Anyway, the “flight test” community of the era, and later NASA, were organizations with no tolerance for human error.  If cultivating such an attitude required occasionally blaming someone for something that was not, objectively speaking, his fault, well, the payoff in effort and vigilance was worth it.  Better to reprimand one innocent man than let two others make avoidable mistakes.

Modern special forces “operators” have much higher life expectancies than 1950s test pilots, in no small part thanks to the technical and organizational efforts of and for those pilots.  That no incident has a single cause, but rather results from a chain of “mistakes” or errors, has gone from Pete Conrad’s private observation to official fact.  Of course, it took a long time, decades or even generations, for this attitude to really take root elsewhere (sometime around the late 1980s, as best I can tell).  And by the way, it worked.
[Chart: accidental deaths, showing declining fatal accident rates over time]
There is a fundamental difference, however, between what NASA was doing in the 1960s and what special forces, or combat troops more generally, do.  The struggle Tom Wolfe documents, however bloody, was one of man against nature.  There was no thinking enemy.
The presence of the enemy changes everything: “You can be 100% right, and still be 100% dead.”  Mr. Right Stuff himself, Neil Armstrong, was shot down over Korea.  Unfortunately, thinking of “exposure” as the only significant risk factor can encourage troops to avoid contact and sacrifice victory for the prospect of individual survival.  Conversely, accepting the role of Fate in one’s destiny easily leads to a sort of learned helplessness about factors that are within human control.  I do not think it a coincidence that the declining fatal accident rates in the graph I posted above correlate with declining numbers of Vietnam veterans on active duty.
The truth is that both propositions hold at once: human diligence and The Right Stuff go a long way, and they can only do so much.  Unfortunately, there’s a certain level of cognitive dissonance involved here, compounded by the fact that if you really believe there’s no such thing as luck, you may well end up with better results.  Particularly if you’re lucky.  If you’re the leader of such an organization, you will find yourself in possession of esoterica: the boys are rolling the dice, but they’ll have better odds if they don’t believe it — so you don’t let them.
Coming back around to Niger in 2017: SOCOM tries its damnedest to reduce enemy initiative to a rounding error in its plans.  The more elite (i.e., direct-action oriented) the organization, the more planning, the more resources, and the more variables factored into an operation.  In Niger, the enemy attacked from a deliberate ambush with superior numbers and managed to kill several of its targets.  Had the Special Forces team involved done things differently, it would undoubtedly have avoided the ambush.  And the chain of responsibility for not having done those things extends all the way up to the equivalent of the division commander.
Or maybe this time the Americans got outvoted by the enemy, and nothing would have made a difference.  Zero-defect thinking works as well as it does because the enemy operates at the lowest level of proficiency and technical capability possible without actually being an uncontacted primitive tribe.
In any conflict against even a remotely near-peer force, the idea that every single combat fatality results from avoidable mistakes by one side will go out the window.  Until then, however, pretending otherwise provides a valuable training tool for building a maximally competent force.