March 2026

Collapsing Structures and Public Mismanagement


Author(s):

Wolfgang Seibel

Reference:

Seibel, W. (2022). Collapsing Structures and Public Mismanagement. Palgrave Macmillan / Springer Nature.

Our opinion

4 stars

This month, we have selected a book which, although it does not originate from the safety science research community nor deal with major accident hazard industries, describes a range of organizational and political processes that can hinder sound risk management and that may not be entirely unfamiliar to our readers.
Analyzing four civil engineering accidents, the author explains how institutional fragmentation, poorly controlled delegation of responsibility for safety issues, the erosion of professional integrity standards, the politicization of decision-making, and willful ignorance led to managerial negligence that allowed these disasters to occur.

Our Summary

Written using an AI system.


The book by Wolfgang Seibel, Professor of Political Science and Public Administration at the University of Konstanz (Germany), examines the tragic impact of poor public management on citizen safety by providing an in-depth analysis of real-life cases in which the collapse of structures built or monitored by public authorities resulted in significant loss of life. While engineering and technical safety are often highlighted, Seibel emphasizes that managerial, institutional, and bureaucratic negligence plays a central role in these failures. Through a comparative analysis of four emblematic cases from Australia, New Zealand, the United States, and Germany, the author develops a theory of the causal mechanisms of administrative dysfunction: dilution of responsibility, willful ignorance, paper safety, erosion of professional integrity, and the politicization of technical decisions.

At the theoretical level, the book is situated within the debate between Charles Perrow’s normal accident theory, which argues that certain complex organizational systems inevitably generate accidents, and high-reliability organization (HRO) theory, which, by contrast, maintains that organizations can sustain a high level of reliability even in high-risk contexts, provided they implement organizational arrangements that integrate safety as a core component of performance and adopt professional practices that support safety. Seibel aligns himself with HRO theory: preventing catastrophe through public oversight is, in itself, fundamentally achievable; the real challenge lies in resisting incentives that threaten professional integrity. The author thus rejects purely structural or bureaucratic explanations and refocuses the analysis on actors, their motivations, and their responsibilities.
 

Collapsing structures and public mismanagement


In the preface, Seibel situates his work in the year 2020, marked by the Covid-19 pandemic, which revealed the crucial importance of public agencies in protecting human life. The book raises two central questions: what explains the unlikely failure of expert agencies and public authorities to safeguard safety? And what can be generalized from rare cases for the purposes of learning and prevention? The introductory chapter lays out the conceptual and methodological foundations of the book. The author draws on the metaphor of the “black swan,” borrowed from Nassim Nicholas Taleb, to refer to rare, unpredictable events with massive consequences. However, Seibel qualifies this notion: the collapse of public structures is not entirely unpredictable; such events follow recognizable trajectories of social causality that could have been interrupted through appropriate institutional vigilance.


 

The author presents four cases of structural collapse:

•    The West Gate Bridge in Melbourne (1970)
•    The I-35W Mississippi River Bridge in Minneapolis (2007)
•    The Canterbury Television Building in Christchurch (2011) 
•    The Ice Skating Rink in Bad Reichenhall (2006)

 

He justifies their selection by the richness of the empirical documentation available (commissions of inquiry, judicial archives), as well as by the diversity of the institutional contexts they represent.


 

The Collapse of the West Gate Bridge in Melbourne (1970)

The second chapter analyzes the collapse of span 10–11 of the West Gate Bridge in Melbourne, which killed 35 workers and engineers while the structure was still under construction. The inquiry conducted by a Royal Commission revealed that the disaster resulted from a combination of an ambitious and unconventional structural design, an unusual erection method, and fragmented and conflict-ridden site management, the deleterious effects of which were not brought under control by the responsible authorities.

The central causal mechanism identified by Seibel is that of “evaporated responsibility.” Regulatory and oversight powers had been delegated to a quasi-public body (a QUANGO), which diluted responsibility structures and decisively weakened the agency’s capacity for coordination and control. The competent authority lacked the technical resources and functional authority required to properly supervise contractors who were in a state of permanent conflict.

The author retraces the urban context in which the project developed: the need to relieve congestion at the crossing of the Yarra River between Port Melbourne and Williamstown, two separated suburbs, and the decisions made in the 1960s regarding the design and the contracting arrangements. Contractual fragmentation among several main contractors and consulting engineers generated counterproductive rivalry, unresolved technical disagreements, and an inability to impose corrections at the critical moment. The analysis highlights how the absence of a supervisory body endowed with real authority constituted a necessary condition for the disaster. This case illustrates how excessive institutional fragmentation and delegation of responsibilities can lead to a loss of organizational control and to dangerous technical decisions.

 

The Collapse of the I-35W Mississippi River Bridge (2007)

The third chapter examines the collapse of the I-35W Bridge in Minneapolis, which claimed the lives of 13 people. The investigation conducted by the National Transportation Safety Board (NTSB) established that the direct cause was the failure of gusset plates that had been undersized from the original design of the bridge in 1967 by an engineering firm and were never detected during subsequent inspections.

The causal mechanism highlighted is that of “willful ignorance.” The Minnesota Department of Transportation had systematically avoided carrying out a full load rating analysis, which would necessarily have revealed the bridge’s structural weaknesses and therefore made an expensive rehabilitation unavoidable. Decision-makers were aware that the bridge posed considerable budgetary challenges: identified as early as 2004 as one of the “budget buster” bridges, its full replacement was ruled out from the outset because of the substantial cost and the traffic disruptions it would entail. Discussions in 2006 focused on replacing the deck, not the truss, and even less so the gusset plates.

Seibel emphasizes that photographs of damaged gusset plates had existed since 1999 (University of Minnesota) and, more importantly, since 2003 (URS consulting firm), but were deliberately ignored or downplayed. The NTSB explicitly concluded that, had a load capacity evaluation been performed prior to the bridge’s opening, the design error could have been detected. This case illustrates how a public organization can institutionalize ignorance in order to avoid politically and financially uncomfortable decisions, at the cost of an increasing risk to public safety.

 

The Collapse of the Canterbury Television Building in Christchurch (2011)

The fourth chapter is devoted to the collapse of the Canterbury Television (CTV) building during the 2011 earthquake in Christchurch, New Zealand. This event is the deadliest of the four cases analyzed: 115 people lost their lives, including many foreign students enrolled in a language school housed in the building. The New Zealand Royal Commission determined that the building permit had been granted in 1986 by the Christchurch City Council (CCC) despite serious structural deficiencies.

The central causal mechanism is the erosion of professional integrity. The lead engineer, David Harding, had used a structural analysis software package whose limitations he did not fully understand and had never previously designed a multi-story building with the specific characteristics of the CTV building. His supervisor, Alan Reay, had not checked any of the building’s structural details. The CCC building consent officer, Bryan Bluck, had granted the permit under the personal influence of Reay, who had sought an intervention to bypass reservations raised internally – thereby compromising his own professional integrity as well as that of the institution.

Seibel emphasizes the accumulation of “missed tipping points”: in 1990, during an inspection commissioned by the Canterbury Regional Council in preparation for the acquisition of the building, engineer John Hare had identified a major structural non-compliance, but this information was not passed on to the competent authority. After the first earthquake in September 2010, the CCC did not require a structural assessment of the building and issued it a “green placard” declaring it fit for use. The weakness of the joints between the floors and the shear walls, the probable cause of the collapse, had thus been implicitly known for years, without anyone judging it necessary to take formal action.

This case illustrates how, over time, professional standards can be relaxed, and how institutional and bureaucratic pressures can undermine engineering practices that are nevertheless essential to safety.

 

The Collapse of the Ice Skating Rink in Bad Reichenhall (2006)

The fifth chapter examines the collapse of the roof of the municipal ice rink in Bad Reichenhall, Bavaria (Germany), which occurred in January 2006 after heavy snowfall. Fifteen people lost their lives, including twelve children aged 7 to 15 and three mothers accompanying them; 34 others were injured. Judicial proceedings concluded that the City of Bad Reichenhall had seriously neglected the maintenance of the building over a long period, despite clear signs of water infiltration and weakening of the roof structure.

The central causal mechanism emphasized by Seibel is the “politicization of the non-politicizable.” The city’s mayor admitted in court that he had deliberately obstructed the municipal council’s decision to renovate the ice rink because he intended to demolish it and replace it with a modern leisure and wellness center. He thus used his power to subordinate a matter of physical safety – which, by its very nature, should not be subject to political calculation – to a personal vision of urban development. The city’s technical authority not only failed to neutralize this drift but also implemented its damaging consequences to the detriment of user safety at the ice rink.

Seibel shows how the civil engineer convicted at first instance had accumulated three professional failings: construction not compliant with the approved permit, inadequate structural calculations, and deficient oversight of the construction process. These technical faults, however, developed within an institutional environment in which political pressure had made any independent safety control illusory. This case illustrates how “political logic” can invade and corrupt “technical logic,” with fatal consequences.

 

Conclusion: Strategic learning and situational high reliability

The final chapter brings together the cross-cutting lessons drawn from the four cases and formulates a normative theory of the prevention of administrative disasters. Based on a comparative synthesis of the causal mechanisms identified, Seibel develops an analytical framework applicable to other public management contexts.

The cross-case comparison shows that, despite the diversity of national settings and types of infrastructure, the same broad families of causal mechanisms are at work:

  • dilution of responsibility (West Gate),
  • deliberate avoidance of threatening information (I-35W),
  • erosion of professional standards through relationships of collusion (CTV), and
  • subordination of safety imperatives to political or personal interest-driven logics (Bad Reichenhall). 

In all cases, the conditions for catastrophe were known or knowable and could have been neutralized through adequate institutional oversight.

Seibel then develops the concept of situational high reliability, in contrast to the idea of permanently high-reliability organizations. He argues that public authorities are not inherently high-reliability organizations, but that they must act as such whenever safety is at stake. This reliability is situational: it must be activated in response to specific warning signals. This implies an organizational capacity to recognize such signals and to give them priority over budgetary, political, or relational considerations.

The author’s central proposal is that of strategic learning. Rather than relying on abstract, bureaucratic structural reforms, he advocates learning processes focused on neutralizing threats to integrity – financial, professional, and political – and on strengthening public officials’ sense of responsibility. This learning must be strategic in that it responds to the specific configurations of vulnerability identified through the analysis of causal mechanisms. The issue is not to reform public administration as a whole, but to precisely identify breaking points in chains of responsibility and to equip institutional actors with the resources – normative, procedural, and political – needed to resist them.

 


Commentary by Éric Marsden, Program Manager at Foncsi

The cases examined in this volume fall within the field of civil engineering, rather than the world of high-risk industries with which Foncsi is more familiar. Nevertheless, many themes common to both sectors emerge from the reading.

 

Several cases illustrate the difficulty authorities face in overseeing complex activities due to a lack of appropriate internal expertise. This issue in the relationship between regulator and regulated entity can be found in several emblematic industrial accidents (the Boeing 737 Max, for example) and is becoming increasingly pronounced with the growing importance of complex technological systems such as AI-based systems.

 

The fragmentation of responsibilities appears in many accident cases. In the case of the Francis Scott Key Bridge in Baltimore (USA), destroyed by a container ship collision in 2024, no fewer than nine organizations and stakeholders were involved, with roles in management, funding, oversight, and advisory functions regarding applicable construction standards. This multiplicity of actors delayed the decision to undertake (costly) work to improve the bridge’s protective features. The public inquiry following the Grenfell Tower fire (London, 2017, 72 fatalities) implicated 19 organizations, including the engineering firm, the companies carrying out the refurbishment, suppliers of insulation materials unsuitable for use on high-rise buildings, the materials testing laboratory, the body that accredited the laboratory, and public authorities. The King’s Cross Underground fire (London, 1987, 31 fatalities) provides another example: a multiplicity of organizations (London Underground, British Transport Police, London Fire Brigade) shared fragmented jurisdiction over network safety, without any one of them holding clear responsibility.

 

The predominance of a financial logic focused on maximizing shareholder value over long-term safety issues, combined with weak safety indicators to inform catastrophic risk management, can be found in many decision trade-offs. Similar phenomena occurred in the Piper Alpha offshore platform explosion (1988, 167 fatalities), where economic and organizational pressure gradually eroded formal permit-to-work practices until they became ineffective, and in the Texas City disaster (2005, 15 fatalities), where numerous repeated signals of technical degradation of the facilities were ignored by the operator in the name of reducing maintenance costs.

 

The issue of safety audits based on superficial paper compliance, rather than on an in-depth analysis of the state of organizational practices or technical structures, is present in accidents such as Deepwater Horizon (2010, USA). In the case of the I-35W bridge, procedures originally designed to enable effective maintenance lost their meaning and became rituals – mere “paper” obligations. The oversight system was characterized by a concern for compliance and by an excessive simplification of technical reports to make them compatible with financial statements (thus running counter to the preoccupation with failure and the reluctance to simplify interpretations emphasized by high-reliability organization theory).

 

Barry Turner (Man-Made Disasters, 1978) argues that catastrophes are preceded by a long incubation period during which warning signals are ignored or misinterpreted. The Morandi Bridge collapse (Genoa, 2018) illustrates this theory: the corrosion of the stay cables had been known since the 1990s, but a gradual drift from preventive toward purely reactive maintenance reflected an implicit acceptance of risk by managers within the public-private governance structure. Turner emphasized the role of erroneous beliefs and institutional rigidities in the failure to decode these signals; Seibel adds to this analysis the active mechanisms (politicization, collusion, financial calculation) that transform passive ignorance into active ignorance. A link can also be made with the process of normalization of deviance theorized by Diane Vaughan in her analysis of the Challenger space shuttle disaster. Vaughan shows that NASA had gradually incorporated O-ring failures into its definition of “acceptable” operation, to the point that the emergence of a new risk factor (exceptionally low temperatures) was not properly assessed. In the case of the I-35W bridge, the failure to detect defective gusset plates was the product of an organization that had embedded the avoidance of uncomfortable information into its routines.

 

More generally, the accidents analyzed in this volume illustrate a key argument in the academic Safety Science literature: major accidents are rarely freak, unpredictable events, but rather symptoms of systemic organizational pathologies.