Employees: A Problem to Control or Solution to Harness?
By Sidney Dekker
Throughout the 20th century, we have been of two minds about the role of humans in creating and breaking safety; perhaps we still are. One view is that humans are a problem to control. Before World War II, an Oxford psychologist named Vernon studied 50,000 personal injury incidents and concluded that they "depend, in the main, on carelessness and lack of attention of the workers."
Humans were seen as the cause of safety trouble. These negative characteristics could be identified by testing and screening employees. Psychology was dominated by behaviorism during this time (something still seen in behavioral safety today). Behaviorism aims to influence human behavior so that it fits the constraints and demands of the system in which people work. It does not really ask why people do what they do, for it has no models of the mind to explain any of that (in fact it actively resists such introspection). Rather, behavioral intervention assumes that with the right incentives and sanctions, people can be engaged in safety.
World War II changed this. Technological developments were so rapid and complex that no amount of intervention addressing the human worker alone could solve safety problems. Practically all
What happened was that in the last decades of the 20th century, experts dealing with mechanization in many settings all moved away from a focus on the careless or cursed individual who caused accidents. Instead, they concentrated, to a remarkable extent, on devising technologies that would prevent damage no matter how wrongheaded the actions of an individual person, such as a worker or a driver.
The psychology that was called on was different from behaviorism. It was a psychology of cognition, a science interested in understanding why it made sense for people to do what they did. For this, it had to develop models of the mind: of attention, perception, decision making and information processing. Technologies and work environments could then be designed to consider what had been learned about human performance and limitations.
Safety Engineering Today
How do we relate to safety engineering ideas today? The tension between the two views is still visible. On one hand, we have learned through numerous high-visibility negative events that people are resources to harness, and that expertise and practical experience matter. In the wake of the
Deference to expertise means engaging those who are practiced at recognizing risks and anomalies in operational processes. These may be the workers who are in direct contact with the organization's safety-critical processes. This has become a well-established prescription in research on high-reliability organizations and resilience (Hollnagel, Woods & Leveson, 2006; Weick & Sutcliffe, 2007).
That said, experts do not always get it right either. Research has identified limits on experts' privileged knowledge of safety-critical processes and safety margins (Dörner, 1989). Continued operational success, for instance, can be perceived by experts as evidence that risk-free pathways have been developed (Barton & Sutcliffe, 2009; Starbuck & Milliken, 1988), and exceptional expert competence is often associated with taking greater risk (Amalberti, 2013).
Greater Bureaucratization?
The question is whether the solution lies in greater bureaucratization, and in seeing people as a problem to control. Let's first define bureaucratization; it means the administrative governing, by not necessarily representative organization members, of the relationship between the means an organization dedicates to safety and the ends it pursues. Bureaucratization involves hierarchy, specialization, division of labor and formalized rules. Between 1974 and 2008,
*Oil-drilling: In 2008, 2 years before the Macondo well blowout, BP warned that it had "too many risk processes" that had become "too complicated and cumbersome to effectively manage" (Elkind & Whitford, 2011, p. 9). While measurable safety successes were celebrated, the organization's coherent understanding of engineering risk across a complex network of contractors had apparently eroded (Dekker, 2011; Graham, Reilly, Beinecke, et al., 2011).
*School trips: The UK Health and Safety Executive has pointed to:
misunderstandings about the application of health and safety law [which] may, in some cases, discourage schools and teachers from organising [school] trips. These ... may include frustrations about paperwork, fears of prosecution if the trip goes wrong, [or] that a teacher will be sued if a child is injured. (HSE, 2011, p. 1)
*Aviation:
The rate of production of new rules in aviation is significantly increasing while the global aviation safety remains for years on a plateau at 10-6 (over 200 new policies/guidance/rules per year). Since nobody knows really what rules/materials are really linked to the final safety level, the system is purely additive, and old rules and guidance material are never cleaned up. (Amalberti, 2001, p. 111)
*Wildlife surveys: One professional conducting environmental impact studies reports:
I am obliged to wear a hard hat (even in treeless fields); high-visibility clothing; long-sleeved shirts with the sleeves buttoned at the wrist; long trousers; steel-capped boots; and safety glasses. I may have to carry a GPS, EPIRB, UHF radio, first-aid kit, 5 kilograms of water, sunscreen, insect repellent and, albeit rarely, a defibrillator. (Reis, 2014)
*Procurement, now typically requiring "pretender/supplier health and safety questionnaires ... of varying or increasing complexity and all requiring different information," and the increased use of a "third party to assess a supplier's suitability to be included on the approved list [involving] an assessment fee and annual membership fee" (Simmons, 2012, p. 20).
Of course, increasing regulation and the kind of standardization and systematization that comes with bureaucratic governance have paid great safety dividends during the 20th century. Increasing bureaucratization of safety over the past decades owes much to a moral commitment to stop hurting and killing people at work. But other factors are probably at play, too. Contracting out work, for example (including safety-critical work), is another trend from past decades. Managing, monitoring and controlling operations across a network of contractors and subcontractors tends to be vastly more complex, so bureaucratic organization becomes a plausible means to do so (Vaughan, 1996). This includes self-regulation, where organizations themselves need to gather, analyze and distill data requested by regulators and by each other. Bureaucratic accountability is increasingly becoming a business-to-business requirement, as with showing lost-time incident rates or medical-treatment incident rates, or proving the presence of an occupational safety and health management system.
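As a concrete illustration of the kind of metric exchanged in such business-to-business accountability, consider a lost-time injury frequency rate. The sketch below is illustrative only: the function name and figures are invented, and it assumes the common per-1,000,000-hours convention (some schemes, such as OSHA recordable rates, normalize per 200,000 hours instead).

```python
# Illustrative only: a lost-time injury frequency rate (LTIFR) of the
# kind organizations exchange as bureaucratic accountability evidence.
# Assumes the per-1,000,000-hours convention.

def ltifr(lost_time_injuries: int, hours_worked: float) -> float:
    """Lost-time injuries per million hours worked."""
    return lost_time_injuries * 1_000_000 / hours_worked

# A contractor reporting 3 lost-time injuries over 1.5 million hours:
print(ltifr(3, 1_500_000))  # → 2.0
```

Note that the number says nothing by itself about how risk is actually managed in practice; it is a reporting artifact of exactly the kind the surrounding discussion questions.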
Furthermore, technological capabilities for surveillance and behavior monitoring in workplaces have expanded over the past decades. The commitment to a zero vision has both necessitated and been enabled by surveillance and measurement of incident and injury data, which in turn both requires and generates bureaucratic accountability processes for its capture, reporting, tabulation, storage and analysis (Hallowell & Gambatese, 2009; Zwetsloot, Aaltonen, Wybo, et al., 2013).
Secondary (Negative) Effects of Bureaucratized Safety
The safety yield of further bureaucratization, however, seems to be declining or plateauing in many industries.
In addition, bureaucracy tends to generate secondary effects that actually run counter to its objectives. Research over the past decades shows that such secondary effects in safety include an inability to predict unexpected events, a focus on bureaucratic accountability, quantification and numbers games, the occasional creation of safety problems that result from the application of rules or safety systems, and constraints on workers' freedom, diversity and creativity. Here are some highlights.
*A recent Delphi analysis of safety interventions in the construction industry showed that the interventions most associated with bureaucracy are deemed the least worthwhile (Hallowell & Gambatese, 2009). This includes the writing of safety plans and policies; recordkeeping and incident analysis; and emergency response planning. Having them makes no real safety difference. The more safety policies and structures are developed or enforced bureaucratically by those who are at a distance from the operation, the less they might represent risk and how it is managed in practice. Emergency response planning has been critiqued in this regard, particularly for its generation of fantasy documents that bear little relation to actual requirements or conditions. Such documents are tested against reality only rarely, and draw from an unrealistic or idealistic view of the organization and its environment (Clarke & Perrow, 1996).
*A study of Finnish construction and manufacturing from 1977 to 1991 showed a strong negative correlation between incident rate and fatalities (r = -.82, p < .001). In other words, the fewer incidents a construction site reported, the higher its fatality rate (Saloniemi & Oksanen, 1998). Low incident reporting rates might suggest workplaces where superiors are not open to hearing bad news, which might explain why those that report fewer incidents also suffer more fatal incidents. A zero vision can sometimes be implicated, as it can encourage the suppression, discouragement (e.g., through postinjury drug testing) or recategorization of incident or injury data (Donaldson, 2013) and lead to other "numbers games" (Frederick & Lessin, 2000), such as an inappropriate use of modified duties or return-to-work programs.
*Notwithstanding the noted point about low incident reporting rates, in industries with near-zero safety performance (i.e., a tiny residue of fatalities), the predictive value of incidents (for those fatalities or larger-consequence incidents) is no longer so obvious (Amalberti, 2013). Bigger incidents in such industries seem to be preceded not by the things that are seen as (or that were ever reported as) incidents, but by normal work. Incidents are preceded by a gradual drift into failure, driven by production pressures and continued operational success (Dekker, 2011). Such "fine-tuning until something breaks" (Starbuck & Milliken, 1988) is difficult to capture in incident reporting, because the precursors are part of the messy details of normal work-the daily workarounds and frustrations that are part of getting the job done. Bureaucratic systems of tabulating and reporting are not typically sensitive to those subtleties.
*Rules, of course, offer advantages to follower and imposer alike. They save time and effort, prevent reinvention of the wheel, offer clarity about tasks and responsibilities and create more predictability. The disadvantages, however, include supervisory demands on compliance monitoring, and a blindness to, or unpreparedness for, new situations that do not fit the rules (Hale & Swuste, 1998). They can also be experienced as a loss of freedom and a constraint on initiative that hampers innovation: "Compliance with detailed, prescriptive regulations may build a reactive compliance culture, which stifles innovation in developing new products, processes and risk control measures" (Hale et al., 2013, p. 2).
*Following rules and complying with procedures and bureaucratic protocol can actually harm safety in certain circumstances (Dekker, 2001): "Major accidents such as Mann Gulch and Piper Alpha have shown that it can be those who violate rules who survive such emergencies, whilst those who obey die" (Hale & Borys, 2013, p. 214). This is of course a result of the insensitivity of rules and compliance pressure to context (Dekker, 2003; Woods & Shattuck, 2000). The case of the environmental impact assessor, discussed earlier, presents a fairly obvious (and less extreme) example: by enforcing various layers of PPE, including a hard hat and long sleeves, the wearer may suffer dehydration and heat stroke more quickly in the climate where s/he typically works.
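The negative correlation reported by Saloniemi and Oksanen above can be made concrete with a small sketch. The numbers here are invented for illustration; only the direction of the relationship mirrors the published finding (r = -.82).

```python
# Invented numbers illustrating the direction of the Saloniemi & Oksanen
# finding: a Pearson correlation between reported incident rates and
# fatality rates across sites. Only the sign of r mirrors the study.
import statistics

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

incidents = [40, 35, 30, 22, 15]        # reported incidents per 1,000 workers
fatalities = [0.5, 0.9, 1.1, 1.6, 2.2]  # fatalities per 10,000 workers

r = pearson_r(incidents, fatalities)    # strongly negative: fewer reported
                                        # incidents, more fatalities
```

The point of the arithmetic is the paradox itself: a bureaucratic metric that looks better (fewer reported incidents) can coexist with an outcome that is worse (more fatalities).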
Conclusion & Ways Forward
Bureaucratization of safety has been driven by a complexity of factors, including legislation and regulation, changes in liability and insurance arrangements, a wholesale move to outsourcing and contracting, and increased technological capabilities for surveillance, monitoring, storage and data analysis. Bureaucratization tends to see people as a problem to control (e.g., by standardizing and fixing rules, expecting compliance) and generates secondary effects that run counter to its original goals. These effects include bureaucratic entrepreneurism, an inability to predict unexpected events, a focus on bureaucratic accountability, quantification and numbers games, the occasional creation of new safety problems and constraints on organization members' personal freedom, diversity and innovation.
The most useful prescription is to strike a balance between bureaucratically controlled safety and worker-managed safety (Amalberti, 2013), or between deference to protocol and procedure on the one hand, and practical expertise on the other (Galison, 2000). When confronted with a safety problem, or a proposed solution to one, ask the following:
*Is a who or a what responsible for the safety problem? If you ask who is responsible, you likely see people (or a person, or certain people) as the problem to control. These people may, however, be the recipients of trouble deeper inside the organization (Dekker, 2006).
*What is the balance in the organization between intervening at the behavior level (which might assume that tools and tasks are fixed, and that people must fit to them) and intervening with people's working conditions and equipment (which suggests the environment can be shaped so as to fit people better)? If capital investments were made in machinery that might last a few more decades, there may be no choice. But these lessons can inform the design of the next generation of equipment.
*Is safety being measured mostly as an absence of bad events? Or is the focus on looking for the presence of positive capacities in people, teams, organization? These include the capacities to question continued success (and not see it as a guarantee of future safety), the capacity to say no in the face of acute production pressures, and the capacity to bring in fresh perspectives on a problem and listen to the voice from below.
*Are safety policies mostly organized around limiting, constraining and controlling what people do? Or do the policies actually empower people, encourage them to share or invite them to help innovate?
*When observing a worker acting unsafely, do you just tell him/her not to do it? Or do you try to understand why it made sense to do what s/he did? The worker probably did not come to work to do a bad job. If what s/he did made sense to him/her, it probably makes sense to others as well. That points to systemic conditions to examine.
*When a gap is observed between how people work and what the rule tells them to do, is that called a violation? That means it has already been decided who is right, and nothing new may be learned. Instead, consider this same gap as workers finishing the design of a procedure or the equipment because that design was imperfect to begin with. See that gap as resilience, as workers recognizing and adapting to situations that fall outside of what was designed or trained for.
*Are you, as ASSE laureate
IN BRIEF
* Safety interventions can target the human, or the technological and organizational environment. The choice means either fitting people into fixed systems or engineering systems so they are fit for people.
* Today, various factors seem to favor safety interventions that consider people as a problem to control (through procedures, compliance, standardization, sanctions). Safety is measured mainly by the absence of negatives.
* Safety professionals should view people as a solution to harness rather than a problem to control, consider safety more as the presence of positive capacities, and move from a vocabulary of control, constraint and deficit to one of empowerment, diversity and human opportunity.
References
Amalberti, R. (2001). The paradoxes of almost totally safe transportation systems. Safety Science, 37 (2-3), 109-126.
Amalberti, R. (2013). Navigating safety: Necessary compromises and trade-offs: Theory and practice. Heidelberg, Germany: Springer.
Baker, J.A. (2007). The report of the BP U.S. Refineries Independent Safety Review Panel.
Barton, M.A. & Sutcliffe, K.M. (2009). Overcoming dysfunctional momentum: Organizational safety as a social achievement. Human Relations, 62(9), 1327-1356.
Chapanis, A. (1970). Human factors in systems engineering. In
Clarke, L. & Perrow, C. (1996). Prosaic organizational failure. American Behavioral Scientist, 39(8), 1040-1057.
Dekker, S.W.A. (2001). Follow the procedure or survive. Human Factors and Aerospace Safety, 1(4), 381-385.
Dekker, S.W.A. (2003). Failure to adapt or adaptations that fail: Contrasting models on procedures and safety. Applied Ergonomics, 34(3), 233-238. doi:10.1016/s0003-6870(03)00031-0
Dekker, S.W.A. (2006). The field guide to understanding human error.
Dekker, S.W.A. (2011). Drift into failure: From hunting broken components to understanding complex systems. Farnham, England: Ashgate.
Donaldson, C. (2013, March). Zero harm: Infallible or ineffectual? OHS Professional, 22-27.
Dörner, D. (1989). The logic of failure: Recognizing and avoiding error in complex situations.
Elkind, P. & Whitford, D. (2011,
Fitts, P.M. & Jones, R.E. (1947). Analysis of factors contributing to 460 "pilot error" experiences in operating aircraft controls.
Frederick, J. & Lessin, N. (2000, Nov.). The rise of behavioral-based safety programs. Multinational Monitor, 22(11), 1-7.
Galison, P. (2000). An accident of history. In
Graham, B., Reilly, W.K., Beinecke, F., et al. (2011). Deep water: The Gulf oil disaster and the future of offshore drilling [Report to the President].
Hale, A. & Borys, D. (2013, June). Working to rule, or working safely? Part 1: A state-of-the-art review. Safety Science, 55, 207-221.
Hale, A., Borys, D. & Adams, M. (2013). Safety regulation: The lessons of workplace safety rule management for managing the regulatory burden. Safety Science. Retrieved from http://dx.doi.org/10.1016/j.ssci.2013.11.012
Hale, A. & Swuste, P. (1998). Safety rules: Procedural freedom or action constraint? Safety Science, 29(3), 163-177.
Hallowell, M.R. & Gambatese, J.A. (2009). Construction safety risk mitigation.
Hasle, P. & Zwetsloot, G.I.J.M. (2011, Aug.). Editorial:
Hollnagel, E., Woods, D.D. & Leveson, N.G. (2006). Resilience engineering: Concepts and precepts.
Health and Safety Executive. (2011). School trips and outdoor learning activities: Tackling the health and safety myths.
Reis, T. (2014). How wearing a hard hat can threaten wildlife.
Saloniemi, A. & Oksanen, H. (1998). Accidents and fatal accidents-Some paradoxes. Safety Science, 29(1), 59-66.
Simmons, R. (2012, May). The stranglehold of bureaucracy. The Safety & Health Practitioner, 30, 20.
Starbuck, W.H. & Milliken, F.J. (1988). Challenger: Finetuning the odds until something breaks.
U.S.
Vaughan, D. (1996). The Challenger launch decision: Risky technology, culture and deviance at NASA. Chicago, IL: University of Chicago Press.
Weick, K.E. & Sutcliffe, K.M. (2007). Managing the unexpected: Resilient performance in an age of uncertainty (2nd ed.).
Woods, D.D. & Shattuck, L.G. (2000). Distant supervision-Local action given the potential for surprise. Cognition, Technology & Work, 2(4), 242-245.
Zwetsloot, G.I.J.M., Aaltonen, M., Wybo, J.L., et al. (2013). The case for research into the zero accident vision. Safety Science, 58, 41-48.
Copyright (c) 2014 American Society of Safety Engineers