Date of Award

12-2013

Document Type

Dissertation

Degree Name

Doctor of Philosophy (PhD)

Legacy Department

Industrial Engineering

Committee Chair/Advisor

Dr. Anand Gramopadhye

Committee Member

Dr. Scott Shappell

Committee Member

Dr. Mary E. Kurz

Committee Member

Dr. Julia Sharp

Abstract

Human error has been identified as the primary contributing cause of up to 80% of accidents in complex, high-risk systems such as aviation, oil and gas, mining, and healthcare. Many models have been proposed to analyze these incidents and identify their causes, with a focus on the human factor. One such safety model is the Human Factors Analysis and Classification System (HFACS), a comprehensive accident investigation and analysis tool that considers not only the act of the individual preceding the accident but also other contributing factors in the system. Since its development, HFACS has received substantial research attention; however, the literature on its reliability is limited. This study adds to past research by investigating the overall intra-rater and inter-rater reliability of HFACS, as well as the intra-rater and inter-rater reliability of each tier and category. For this investigation, 125 coders with similar HFACS training coded 95 causal factors extracted from actual incident/accident reports from several sectors. Overall intra-rater reliability was evaluated using percent agreement, Krippendorff's Alpha, and Cohen's Kappa, while overall inter-rater reliability was analyzed using percent agreement, Krippendorff's Alpha, and Fleiss' Kappa. Because of analytical limitations, only percent agreement and Krippendorff's Alpha were used for the intra-rater evaluation at the individual tier and category level, and only Fleiss' Kappa and Krippendorff's Alpha for the corresponding inter-rater evaluation. The overall intra-rater and inter-rater results at the tier level, as well as those for the individual HFACS tiers, achieved acceptable reliability with respect to all agreement coefficients. Although the overall intra-rater and inter-rater reliability results at the category level were lower than at the tier level, both types of reliability achieved acceptable levels, with inter-rater reliability lower than intra-rater.
In addition, the intra-rater and inter-rater results for the individual HFACS categories ranged from low to acceptable reliability. Both the inter-rater and intra-rater analyses found that the same five of the 19 categories - Skill-Based Error, Decision Error, Inadequate Supervision, Planned Inappropriate Operations, and Supervisory Violation - fell below the required minimum reliability threshold. While the overall findings suggest that HFACS is reasonably reliable, the five categories with low reliability warrant further research on methods to improve the framework's reliability. One such method could be to design and develop a standard HFACS training program, which would have the potential to enhance both confidence in HFACS as an accident analysis tool and the effectiveness of the safety plans and strategies based on it.
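As an illustration of the agreement statistics named above, a minimal sketch of Cohen's Kappa (the coefficient used for the overall intra-rater evaluation, which compares two sets of ratings and corrects percent agreement for chance) might look like the following. The category labels and ratings here are hypothetical examples, not data from the study.

```python
def cohens_kappa(ratings_a, ratings_b):
    """Cohen's Kappa for two rating sequences over the same items.

    Kappa = (p_o - p_e) / (1 - p_e), where p_o is observed agreement
    and p_e is the agreement expected by chance from each rater's
    marginal category proportions.
    """
    assert len(ratings_a) == len(ratings_b), "ratings must cover the same items"
    n = len(ratings_a)
    categories = set(ratings_a) | set(ratings_b)

    # Observed agreement: fraction of items with identical codes.
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n

    # Chance agreement: product of marginal proportions, summed over categories.
    p_e = sum(
        (ratings_a.count(c) / n) * (ratings_b.count(c) / n)
        for c in categories
    )
    return (p_o - p_e) / (1 - p_e)


# Hypothetical example: one coder's first and second pass over four
# causal factors, coded as Skill-Based Error (SE) or Decision Error (DE).
kappa = cohens_kappa(["SE", "SE", "DE", "DE"], ["SE", "DE", "DE", "DE"])
```

In this example the raters agree on 3 of 4 items (p_o = 0.75) and chance agreement is 0.5, giving Kappa = 0.5 - illustrating how Kappa discounts the agreement a pair of raters would reach by guessing alone, which is why it is preferred over raw percent agreement.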

Included in

Engineering Commons
