2018
Volume 1, Issue 1
  • ISSN: 2589-6725
  • E-ISSN: 2589-6733

Abstract

In the dilemma of Work-as-Imagined versus Work-as-Done (WAI/WAD), the delicate balance between rules and reality is often explained by stating that, in case of an accident, a drift into failure had occurred. Such explanations refer to either an individual operator’s or managerial decision making, in which a violation of a predefined rational, and hence optimal and efficient, decision-making process is said to have taken place. This notion of drift is substantiated by the normative theories of Reason, Perrow and Turner, each with their underlying assumptions and simplifications. Their notions of human error and failure of foresight are considered to have gained a generic validity over the years. More modern theories on human error and failure of foresight criticise these assumptions from an academic and theoretical perspective; they do not, however, take their normative component into account. The socio-economic context in which these theories were developed served to legitimise failure: the values and norms of Anglo-Saxon society in the 1960s and 1970s. Such a construct is criticised nowadays within the socio-psychological and organisational sciences, even raising doubts as to whether safety science is a science at all. This contribution elaborates on several options for closing the gap between WAI and WAD, advocating the abolition of several obsolete notions that hamper a better understanding of the dilemma. These constructs represent a logic of simplifications, assumptions and linearisations based on Procrustes’ bed, Laocoön’s fate and Ockham’s razor respectively. Drift into failure is a mystifying construct, disregarding notions and knowledge about democratic participation, engineering design principles and socio-economic drivers for optimisation.

DOI: 10.5117/ADV2018.1.007.STOO · Published: 2018-05-01

References

  1. Berkhout, A., (2000). The Dynamic Role of Knowledge in Innovation. The Netherlands Research School for Transport, Infrastructure and Logistics TRAIL. The Netherlands: Delft University of Technology.
  2. Dekker, S., (2011). Drift into Failure. From Hunting Broken Components to Understanding Complex Systems. Ashgate Publishers.
  3. Dekker, S. and Pruchnicki, S., (2013). Drift into failure: theorising the dynamics of disaster incubation. Theoretical Issues in Ergonomics Science, doi:10.1080/1463922X.2013.856495.
  4. Pidgeon, N. and O’Leary, M., (2000). Man-made disasters: why technology and organizations (sometimes) fail. Safety Science, 34, 15–30.
  5. Hendrickx, L., (1991). How versus How Often. The Role of Scenario Information and Frequency Information in Risk Judgement and Risky Decision Making. (Doctoral thesis). The Netherlands: Rijksuniversiteit Groningen.
  6. Kahneman, D., (2011). Thinking, Fast and Slow. Penguin Books.
  7. Lande, K., (2014). Aircraft Controllability and Primary Flight Displays – A Human Factors Centred Approach. European 46th SETP and 25th SFTE Symposium, 15–18 June, Luleå, Sweden.
  8. McCartin, Joseph A., (2006). Professional Air Traffic Controllers Strike (1981). In: Eric Arnesen (Ed.), Encyclopedia of U.S. Labor and Working-class History, CRC Press, 1123–1126.
  9. Minsky, H., (1986). Stabilizing an Unstable Economy. McGraw-Hill Publishers.
  10. Milne, S., (1995). The Enemy Within: The Secret War Against the Miners. London: Pan Edition.
  11. Mohrmann, F., Lemmers, A. and Stoop, J., (2015). Investigating Flight Crew Recovery Capabilities from System Failures in Highly Automated Fourth Generation Aircraft. Aviation Psychology and Applied Human Factors, 5(2), 71–82.
  12. Ockels, W., (2002). Personal conversation between the author and Wubbo Ockels, the Dutch NASA astronaut, during the NVVK Jan de Kroes Lecture on Transport Safety.
  13. Safety Science, (2014). Special Issue on the foundations of safety science. Safety Science, 67, 1–70.
  14. Slovic, P., (1994). Trust, emotion, sex, politics and science: surveying the risk-assessment battlefield. Risk Analysis, 19(4), 689–701.
  15. Stoop, J. and Benner, L., (2015). What do STAMP-based analysts expect from safety investigations? Procedia Engineering, 128, 93–102.
  16. Stoop, J., De Kroes, J. and Hale, A., (2017). Safety Science, a founding fathers’ retrospection. Safety Science, 94, 103–115.
  17. Stoop, J.A. and Van der Burg, R., (2012). From factor to vector, a system engineering design perspective on safety. PSAM 11 and ESREL 2012 Conference on Probabilistic Safety Assessment, June 25–29, Helsinki, Finland.
  18. Stoop, J., (2017). How did aviation become so safe, and beyond? 53rd ESReDA Seminar Enhancing Safety: The Challenge of Foresight, 14–15 November 2017, EU Commission JRC, Ispra, Italy.
  19. Taleb, N., (2007). The Black Swan: The Impact of the Highly Improbable. New York, NY: Random House.
  20. Torenbeek, E., (2013). Advanced Aircraft Design. Conceptual Design, Analysis and Optimization of Subsonic Civil Airplanes. Wiley Aerospace Series.
  21. Troadec, J.P., (2013). AF447, presentation by the Director of the BEA. Second International Accident Investigation Forum, Singapore, 23–25 April 2013.
  22. Turner, B.A., (1978). Man-Made Disasters. London: Wykeham Science Press.
  23. Van Kleef, E. and Stoop, J.A., (2016). Life cycle analysis of an infrastructural project. 51st ESReDA Seminar on Maintenance and Life Cycle Assessment of Structures and Industrial Systems, 20–21 October, Clermont-Ferrand, France.
  24. Vaughan, D., (1996). The Challenger Launch Decision. Risky Technology, Culture and Deviance at NASA. Chicago: University of Chicago Press.
  25. Vincenti, W., (1990). What Engineers Know and How They Know It. Analytical Studies from Aeronautical History. The Johns Hopkins University Press.
  26. Vuorio, A., Stoop, J. and Johnson, C., (2017). The need to establish consistent international safety investigation guidelines for the chemical industries. Safety Science, 95, 62–74.
  27. Wikibooks, (2017a). Professionalism/Diane Vaughan and the Normalization of Deviance. Retrieved 5 November 2017 from: https://en.wikibooks.org/wiki/Professionalism/Diane_Vaughan_and_the_normalization_of_deviance
  28. Wikipedia, (2017b). Space Shuttle Columbia disaster. Retrieved 5 November 2017 from: https://en.wikipedia.org/wiki/Space_Shuttle_Columbia_disaster
  29. Wikipedia, (2017c). Roger Boisjoly. Retrieved 5 November 2017 from: https://en.wikipedia.org/wiki/Roger_Boisjoly
  30. Wikipedia, (2017d). TransAsia Airways Flight 235. Retrieved 5 November 2017 from: https://en.wikipedia.org/wiki/TransAsia_Airways_Flight_235
  31. Woods, D.D., (2016). Origins of Cognitive Systems Engineering. In: P. Smith and R. Hofman (Eds), Cognitive Systems Engineering: A Future for a Changing World.
  • Article Type: Research Article
Keyword(s): Drift Into Failure; Safety; Systems Engineering