The part that we play
Hayes, J., Maslen, S., Holdsworth, S. & Sandri, O. (2025). The part that we play: Engineers’ perceptions of their responsibility for public safety. Safety Science, 187
Our opinion
An article by Australian colleagues offering a way to reconsider a current trend that systematically emphasizes organizational responsibility in accidents and their prevention. The aim is to bring the pendulum back to the center by also acknowledging the active responsibility of workers — and in particular engineers — both in accidents and, even more importantly, in their prevention. A responsibility that also restores the ethical commitment of these engineers to doing their job well.
Our summary
The 1980s saw a major shift in how accident causality — and responsibility — was understood, moving sharply from individual causes toward more organizational and systemic ones. But the pendulum may have swung too far, and this article reasserts the importance of the individual’s role — particularly that of engineers — both in accidents and, perhaps even more importantly, in the creation of safety. This individual responsibility is also closely tied to the professional ethics of engineers striving to do their job well.
James Reason, a key figure in this shift toward organizational factors, had himself clearly stated that the error wisdom of frontline workers should also serve as an effective safeguard against organizational accidents. He did not advocate blaming frontline workers for accidents, but rather argued that greater hazard awareness could, in some cases, compensate for organizational shortcomings. He cited healthcare as a strong example where “although organizational accidents stem from systemic weaknesses, purely systemic countermeasures are not sufficient to prevent tragedies… The last line of defense is a doctor or a nurse working close to the patient.”
For other professionals, such as engineers, whose decision-making is more collective by nature, it is harder to take individual action to prevent a catastrophe.
The article explores these questions through a literature review complemented by data gathered from ten workshops held with 93 Australian gas engineers. These workshops were centered around the (re)analysis of four emblematic cases of technological failure.
- The 2018 accident in Massachusetts, where gas network reconnection work suddenly exposed 10,000 customers to dangerous overpressure, causing one death and 22 injuries;
- The pipeline rupture in California in 2010 due to overpressure, resulting in a massive fire;
- The loss of NASA’s Space Shuttle Challenger in 1986;
- An accident on a river rapids ride at an Australian amusement park in 2016, which caused four fatalities.
Each workshop lasted half a day, with two 90-minute face-to-face sessions facilitated by two professional moderators. The first session used a case-based discussion approach, while the second was structured as a role-play exercise. In the case discussion, participants were presented with a brief scenario describing the context of the issue and the stakeholders involved. The discussion focused on a real judgment or decision, requiring participants to grapple with a concrete problem. During the role-play, participants were assigned the roles of real individuals whose decisions had contributed to the development of the accident in that particular case.
The authors first revisit several key concepts from the literature
One of them is the ethical challenge engineers face when caught between their professional integrity and the culture of the company they belong to.
It is not easy for engineers to navigate the internal contradictions between, on one hand, an ethical, objective, and critical view of the quality of their own work within the company, and on the other hand, the pressure to conform to the cultural norms and values of the organization they are part of — and to which, in some sense, they are also expected to adhere.
Two examples clearly highlight this dilemma.
The case of Boeing’s 737 Max
The case of Boeing’s 737 Max clearly illustrates the difficulty of finding the right balance between technical decision-making and managerial decision-making — and the consequences of getting it wrong. Over more than two decades, this imbalance deepened following major cultural shifts that prioritized profit over engineering excellence. Boeing’s organizational structure further reinforced this divide, with engineering and management operating in silos, and a hierarchy that discouraged engineers from bringing bad news to top management. Despite these communication barriers, some Boeing engineers nevertheless tried — unsuccessfully — to voice their concerns.
The Challenger disaster
The complex cultural interaction between management and engineers was also highlighted in Diane Vaughan’s 1996 analysis of the Challenger disaster. Vaughan challenges the idea that the managers were solely responsible for ignoring, on the eve of the scheduled launch, the engineers’ warnings that the launch was unsafe. She writes: “Socially organized and historically dependent, the decision they made was unlikely to have been different, given the multiple overlapping cultures to which they all belonged.”
In other words, while engineers hold the responsibility to act in the interest of safety, it is unrealistic to expect them to suddenly act against the culture they are part of when faced with an imminent disaster.
This perspective has given rise to a significant body of literature on the failure of prevention (Turner, 1978; Fischhoff, 1975; Weick, 1998; Woods, 2009), which can occur for a wide range of reasons. Such failure is closely linked to organizational values that either enable — or prevent — the expression of dissenting views and the regular communication of bad news up the hierarchy, well before the eve of a major disaster (Hopkins, 2019).
The search for someone to blame
Despite the widespread adoption of system-based explanations for accidents, public opinion typically demands that a specific individual be held accountable following a major accident.
In several countries (including France), the search for culpability has shifted toward senior executives, particularly with the introduction of legislation establishing the criminal liability of company directors for involuntary manslaughter. These laws hold senior managers accountable for workplace fatalities, even when their decision-making is remote in time and space from the specific incident. Prosecutions serve two purposes: punishment and deterrence.
However, the legal literature remains divided over the pros and cons of individual versus corporate (i.e., organizational) liability in cases of wrongdoing. In a way, this debate mirrors a broader discussion on the respective merits of criminal (retributive) justice — which seeks to assign blame — and civil (distributive) justice, which aims to provide financial compensation.
Owning responsibility
Responsibility is fundamentally about how a professional perceives their obligations and sphere of influence, independently of legal frameworks.
The literature on professional ethics in engineering argues that engineers have a responsibility to act in accordance with the rules and standards of their profession, including operating within their area of competence and acting with honesty. However, ambiguity quickly arises when it comes to interpreting the boundaries of this responsibility. While this responsibility clearly covers safety, health, and public well-being in relation to specific technical projects or activities (Mitcham, 2009), there is less consensus regarding the broader social impacts of the technologies that engineers may work on (Herkert & Borenstein, 2022).
Some scholars argue that addressing this broader impact and contributing to social justice should be core aspects of the engineer’s role. Others propose engineering ethics frameworks that instead emphasize loyalty to the employer as a primary professional value.
Experts also describe responsibility as being situated in time, distinguishing between answering for past decisions and the blame they may attract (retrospective responsibility) and making decisions that serve the long-term integrity of an asset (prospective responsibility). In this context, a distinction is made between formal “duties” — such as reporting hazards and incidents — and a broader approach to professional practice, where individuals do what is necessary to address issues that cannot be fully codified. These concepts are particularly relevant when considering engineers’ responsibilities in the context of public safety.
What responsibility does an engineer have when they are “just one” among many?
What can an individual engineer do to prevent an industrial accident given the inherently collective nature of engineering practice? In what is commonly called the “problem of many hands” (Thompson, 1980; van de Poel et al., 2012), decisions related to the design and operation of complex sociotechnical systems are distributed across numerous actors. Engineering responsibility must be examined and assessed within this context. The multiplicity of actors can prevent engineers from fully appreciating the potential negative consequences of their own actions. Ethics specialists in engineering conclude that systems thinking is not only a technical skill but also critical for ethical decision-making (Riley, 2023).
However, this systemic ethical approach has its limits. To paraphrase Perrow (1984), no improved engineering practice focused on safety responsibility can make a highly complex and tightly coupled technology completely safe. Perrow’s concept of the “normal accident” remains a genuine risk, even though the ways to manage and avoid this risk are scarcely covered in the literature. The few empirical studies addressing this issue appear mainly focused on engineering consultants, and the complex relationship between consultants, their clients, and legal requirements for managing safety and environmental risks.
Results from the workshops with engineers
The results are grouped by themes common to all four cases.
Taking responsibility
Participants emphasize the contrast between the apparent routine nature of daily work and the need to maintain constant awareness of the potential risk of catastrophe. Speaking up becomes an important tool to raise these often hidden risks and concerns, especially regarding how to ensure lessons have been learned, necessary changes have been made, and critical issues have been properly communicated to management.
The four accidents studied also highlight the importance of personal responsibility, which should not be confined to a technical “echo chamber”. Engineers must remain active at all levels, ensuring that operators fully understand the causes of failures and that this understanding shapes the culture all the way up to executive management. Engineers cannot remain passive; their professionalism and responsibility depend on it.
Bureaucratic risk is everywhere, and it is also up to engineers to ensure that recommendations are not buried in company bureaucracy.
Another debate concerns what constitutes a “sufficient level of safety” to uphold, and what is reasonably achievable (“safe enough”). It is recognized that engineers are often overly cautious and conservative, supporting safety requirements that exceed what is reasonably feasible.
Ultimately, all these stances require that the individual conviction of engineers be shared by top management. Even if engineers are not passive, they must understand that their actions are always bounded and mediated by their company’s culture (which they share to some extent as employees), as well as the prevailing economic and political realities.
Sharing concerns with management
Participants emphasize that it is not just about delivering a one-time message but about establishing an ongoing communication posture with executive management based on trust and sufficient time for explanation and dialogue on both sides.
Managers necessarily deal with a broader range of risks than engineers, but they must include technological risks in their decision-making and therefore need to have properly assessed and understood their possible consequences. To achieve this, engineers must go beyond a simple risk briefing, discussing solutions, alternatives, and well-reasoned consequences — in short, they must be convincing and clearly understood.
The way risk is communicated matters. Some use a ‘scare tactic’ by recalling historical cases and outlining the most extreme consequences — the worst that could happen — often prompting demands for at least temporary shutdown measures. Naturally, management is more reluctant to embrace these catastrophic hypothetical scenarios, as the social and economic consequences of a preventive shutdown may be perceived as even more disastrous than the technological risk itself.
Another dimension of being heard by management concerns the legitimacy of engineers speaking about issues beyond purely technical matters, such as the human cost of working under degraded conditions.
However, engineers must be careful not to discredit themselves by focusing solely on catastrophes and the system’s weaknesses; it is clearly better to concentrate on priorities.
Other strategies can help facilitate management’s attention, such as arranging site visits for senior executives to share and directly experience the risks. These visits can also include tours of other sites or companies that handle things differently (and better).
Ultimately, everyone acknowledges that being heard is not easy and can fail. Accepting this reflects a certain maturity on the part of the engineer, who may, or may not, act on it for themselves, sometimes even leaving a company that has become deaf to their concerns.
Building confidence to speak up
It is not always easy to raise problems with management, especially when expecting a cold reception.
The legal dimension of individual responsibility can motivate engineers to speak up, as can the moral dimension. Young engineers often find it more difficult to voice these concerns than their more experienced colleagues; they are more dependent on management for their careers and compensation, but they have the advantage of being able to present almost iconoclastic viewpoints that may leave a strong impression on their listeners. They also have more latitude to ask for explanations under the excusable cover of (perhaps feigned) naivety.
Conversely, criticizing or even simply questioning the validity of management directives is not easy, but it is possible for more senior engineers who have little left to gain from their careers.
Loyalty is another factor, reflecting the complex ethics engineers face in their ability or inability to dissociate themselves from management.
Environmental factors influencing responsibility
For some participants, being affiliated with a professional body and certified in their specialty matters, as it imposes ethical values on its members.
Another factor is the level of stress experienced at work and the workload, which can reduce curiosity and cause individuals to become more withdrawn and focused on immediate daily problems.
Finally, having genuine decision-making authority at any level within the system is a key element of taking ownership of safety. This sense of responsibility in safety decision-making stands in opposition to a blame culture, where actors seek any excuse to avoid exposure and shirk responsibility. It is also noted that it is the engineer’s responsibility to contribute to building and maintaining a strong safety culture.
Conclusion
The workshops showed that engineers are willing to discuss their professional responsibilities. They acknowledge that holding engineers accountable after a disaster can make sense, just as they recognize that one of their tasks is, as far as available resources allow, to prevent accidents.
Although the literature often takes a negative view of holding engineers responsible after an accident, past cases can serve as lessons and as material for building frameworks and improving the teaching of safety practice. This includes understanding why engineers failed or were trapped, and which skills or organizational factors were lacking, including non-technical competencies such as leading teams.
Comments by Hervé Laroche and Eric Marsden from the Foncsi team
The authors of this article describe how engineers perceive their responsibility for accident prevention from a predominantly Anglo-Saxon perspective (the workshops were conducted with Australian engineers). It is worth noting some French specificities regarding the professional culture and the institutional role of engineers:
- A fairly strong permeability between engineering and management roles and careers, particularly for graduates of the “Grandes Écoles”.
- The absence of a professional order (licensing body) for engineers in France, whereas in other countries such as the USA, such bodies, together with the specific regulatory responsibilities of engineers in certain sectors, give engineers leverage to influence their employer’s decisions.
Finally, since the authors of the article mention it, the Challenger case and the catastrophic decision made on the eve of the launch call for a more nuanced view of the relationship between engineers and managers.
First of all, even though, as the article states, it is “not surprising that managers followed their ‘culture’” in deciding in favor of the launch, this does not absolve them of responsibility! Otherwise, the only cases of personal responsibility would be those where decision-makers did not follow their culture! In other words, and more bluntly: a decision-maker is responsible for the culture that permeates them.
But the most important point lies elsewhere. During the infamous meeting examining the issue of temperature effects on the O-rings, a perverse interaction between engineers and managers took place. Certainly, overall, the engineers vigorously advocated for postponing the launch for safety reasons, while the managers decided to proceed. However, both parties were ultimately quite comfortable with this cultural distribution of roles. The engineers could raise technical alarms without bearing the weight of the final decision, while the managers retained the power.
At the contractor Thiokol, only two participants acted against the launch decision once it was made: one engineer (who incidentally noted that there was no unanimity within Thiokol) and one manager (who refused to sign the final recommendation sent to NASA). But neither they nor any other participant tried to openly oppose the decision by bypassing procedures and hierarchical channels (for example, by directly addressing key actors absent from the meeting, such as NASA executives). Boisjoly, the most vehement engineer opposing the launch, remained silent. He never recovered from it.