The two disasters led to investigations, to the removal of senior officials, to promises of institutional change, to undying expressions of sorrow. Among the remorseful were engineers who had warned that problems like eroded O-rings and shedding foam insulation created risks, but who could not persuade their immediate superiors to act. One former NASA engineer, Rodney Rocha, lamented to Retro Report that he wished he had been more aggressive in making his concerns known to those at the very top. Observing the chain of command looms large in the world of engineering, Mr. Rocha said, adding, “I will regret, always, why I didn’t break down the door by myself.”

In theorizing about what went wrong in these two disasters and in others, some social scientists have observed that certain circumstances may well be beyond anyone’s control. Even before Challenger, in 1984, a sociologist named Charles Perrow put forth the concept of “normal accidents,” by which he meant that in a technologically complex operation something is bound to go wrong at some point. These systems are made up of so many tightly linked parts, Mr. Perrow said, that even a seemingly minor glitch could lead to a cascade of woes that make catastrophic failure almost unavoidable. An example for him was the near-meltdown in 1979 at the Three Mile Island nuclear power plant in Pennsylvania.

Another sociologist, Diane Vaughan, has written extensively about Challenger and served on the board that investigated the Columbia disaster. She has advanced the theory of the “normalization of deviance,” meaning that in many organizations, NASA certainly being no exception, some problems and risks come to be understood as acceptable, part of doing business, if you will. Take those problematic O-rings on Challenger. Their erosion had been evident on earlier launches, but flying with them became routine. To Ms. Vaughan, NASA’s decision to forge ahead on that fateful January day in 1986, despite newly raised concerns about the O-rings, did not reflect cold, bottom-line thinking or an amoral bending of rules. “They applied all the usual rules,” she told Retro Report. Regrettably, they did so “in a situation where the usual rules didn’t apply.”

Theories of this sort can conceivably touch on other calamities. The Deepwater Horizon oil spill of 2010 in the Gulf of Mexico might be one example. Another could be the recent South Korean ferry disaster, the sinking of the Sewol, in which some 300 people died, most of them high school students. Setting aside possibly criminal conduct by senior officers who abandoned ship, reports out of South Korea suggest that safety failings had long been accepted as routine: normalized deviance that included the chronic overloading of cargo onto the ferry.

Even when a system is not technologically complex, a tiny defect can lead to calamity. This is not a new thought. Perhaps the tragedies of Challenger and Columbia, seemingly rooted in relatively minor problems, were anticipated by a proverb that has been around for centuries. It has something to do with what can happen for want of a nail.

CLYDE HABERMAN, a regular contributor to Retro Report, has been a reporter, columnist and editorial writer for The New York Times, where he spent nearly 13 years based in Tokyo, Rome and Jerusalem.

This article first appeared in The New York Times.