DOI: 10.3724/SP.J.1042.2017.01614

Advances in Psychological Science (心理科学进展), 2017, 25(9), 1614-1622

The detriments of unadjusted automation trust and dependence to aviation safety, and their improvement

Automation systems are now widely used by airlines. However, the introduction of complex automation has also produced new error modes, making human-factors issues in aviation safety increasingly prominent. Because many factors affect human-automation interaction, users cannot always maintain a well-adjusted level of trust in the systems they operate, and a variety of aviation safety incidents have been induced by such unadjusted automation trust and dependence. It should be noted, however, that human-centered display design and training can shift unadjusted automation trust and dependence back toward an adjusted state.

Key words: automation trust, automation dependence, human-automation interaction, aviation safety

Release date: 2017-10-20

Bahner, J. E., Hüper, A. D., & Manzey, D. (2008). Misuse of automated decision aids: Complacency, automation bias and the impact of training experience. International Journal of Human-Computer Studies, 66, 688-699.

Billings, C. E. (1996). Human-centered aviation automation: Principles and guidelines. NASA Technical Memorandum No. 110381. Moffett Field, CA.

Billings, C. E. (1997). Aviation automation: The search for a human-centered approach. Englewood Cliffs, NJ: Erlbaum.

Billings, C. E., Lauber, J. K., Funkhouser, H., Lyman, G., & Huff, E. M. (1976). NASA aviation safety reporting system. Technical Report TM-X-3445. Moffett Field, CA: NASA Ames Research Center.

Bureau d'Enquêtes et d'Analyses pour la Sécurité de l'Aviation Civile. (2013). Final report on the accident on 1st June 2009 to the Airbus A330-203 registered F-GZCP operated by Air France, flight AF 447 Rio de Janeiro-Paris. Paris: Author.

de Boer, R. J. (2012). Seneca's error: An affective model of cognitive resistance (Unpublished doctoral dissertation). TU Delft.

de Boer, R. J., Heems, W., & Hurts, K. (2014). The duration of automation bias in a realistic setting. The International Journal of Aviation Psychology, 24, 287-299.

de Vries, P., & Midden, C. (2008). Effect of indirect information on system trust and control allocation. Behaviour & Information Technology, 27, 17-29.

Dehais, F., Peysakhovich, V., Scannella, S., Fongue, J., & Gateau, T. (2015). "Automation surprise" in aviation: Real-time solutions. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems. New York, NY: ACM.

Dekker, S. W. A., & Woods, D. D. (2002). MABA-MABA or abracadabra? Progress on human-automation coordination. Cognition, Technology & Work, 4, 240-244.

Dornheim, M. A. (2000). Crew distractions emerge as new safety focus. Aviation Week & Space Technology, 153, 58-60.

Dutch Safety Board. (2010). Crashed during approach, Boeing 737-800, near Amsterdam Schiphol Airport, 25 February 2009. The Hague: Author.

Federal Aviation Administration. (2013a). Operational use of flight path management systems: Final report of the Performance-based Operations Aviation Rulemaking Committee/Commercial Aviation Safety Team Flight Deck Automation Working Group. Washington, DC: Author.

Federal Aviation Administration. (2013b). Safety alert for operators (13002). Washington, DC: Author.

Ferris, T., Sarter, N., & Wickens, C. D. (2010). Cockpit automation: Still struggling to catch up…. In E. Salas & D. Maurino (Eds.), Human factors in aviation (2nd ed., pp. 479-503). Amsterdam, Netherlands: Elsevier.

Geiselman, E. E., Johnson, C. M., & Buck, D. R. (2013). Flight deck automation: Invaluable collaborator or insidious enabler? Ergonomics in Design: The Quarterly of Human Factors Applications, 21, 22-26.

Giraudet, L., Imbert, J.-P., Bérenger, M., Tremblay, S., & Causse, M. (2015). The neuroergonomic evaluation of human machine interface design in air traffic control using behavioral and EEG/ERP measures. Behavioural Brain Research, 294, 246-253.

Hoff, K. A., & Bashir, M. (2015). Trust in automation: Integrating empirical evidence on factors that influence trust. Human Factors: The Journal of the Human Factors and Ergonomics Society, 57, 407-434.

Huerta, E., Glandon, T., & Petrides, Y. (2012). Framing, decision-aid systems, and culture: Exploring influences on fraud investigations. International Journal of Accounting Information Systems, 13, 316-333.

Hughes, J. S., Rice, S., Trafimow, D., & Clayton, K. (2009). The automated cockpit: A comparison of attitudes towards human and automated pilots. Transportation Research Part F: Traffic Psychology and Behaviour, 12, 428-439.

Hurts, K., & de Boer, R. J. (2014, September). "What is it doing now?" Results of a survey into automation surprise. Paper presented at the 31st EAAP Conference, Valletta, Malta.

Kaber, D. B., Onal, E., & Endsley, M. R. (1999). Level of automation effects on telerobot performance and human operator situation awareness and subjective workload. In M. W. Scerbo & M. Mouloua (Eds.), Automation technology and human performance: Current research and trends (pp. 165-170). Mahwah, NJ: Erlbaum.

Lee, J. D., & Moray, N. (1992). Trust, control strategies and allocation of function in human-machine systems. Ergonomics, 35, 1243-1270.

Lee, J. D., & See, K. A. (2004). Trust in automation: Designing for appropriate reliance. Human Factors: The Journal of the Human Factors and Ergonomics Society, 46, 50-80.

Lee, J. D., & Seppelt, B. D. (2009). Human factors in automation design. In S. Y. Nof (Ed.), Springer handbook of automation (pp. 417-436). New York: Springer.

Manzey, D., Reichenbach, J., & Onnasch, L. (2012). Human performance consequences of automated decision aids: The impact of degree of automation and system experience. Journal of Cognitive Engineering and Decision Making, 6, 57-87.

Marsh, S., & Dibben, M. R. (2003). The role of trust in information science and technology. Annual Review of Information Science and Technology, 37, 465-498.

McGuirl, J. M., & Sarter, N. B. (2006). Supporting trust calibration and the effective use of decision aids by presenting dynamic system confidence information. Human Factors: The Journal of the Human Factors and Ergonomics Society, 48, 656-665.

Merritt, S. M., Unnerstall, J. L., Lee, D., & Huber, K. (2015). Measuring individual differences in the perfect automation schema. Human Factors: The Journal of the Human Factors and Ergonomics Society, 57, 740-753.

Metzger, U., & Parasuraman, R. (2005). Automation in future air traffic management: Effects of decision aid reliability on controller performance and mental workload. Human Factors: The Journal of the Human Factors and Ergonomics Society, 47, 35-49.

Meyer, J. (2001). Effects of warning validity and proximity on responses to warnings. Human Factors: The Journal of the Human Factors and Ergonomics Society, 43, 563-572.

Molesworth, R. C., & Koo, T. R. (2016). The influence of attitude towards individuals' choice for a remotely piloted commercial flight: A latent class logit approach. Transportation Research Part C, 71, 51-62.

Mosier, K. L., & Fischer, U. M. (2010). Judgment and decision making by individuals and teams: Issues, models, and applications. Reviews of Human Factors and Ergonomics, 6, 198-255.

Mosier, K. L., & Skitka, L. J. (1996). Human decision makers and automated decision aids: Made for each other? In R. Parasuraman & M. Mouloua (Eds.), Automation and human performance: Theory and applications (pp. 201-220). Mahwah, NJ: Lawrence Erlbaum.

National Transportation Safety Board. (2013). Descent below visual glidepath and impact with seawall, Asiana Airlines Flight 214, Boeing 777-200ER, HL7742 (Report No. NTSB/AAR-14/01). Washington, DC: Author.

Neyedli, H. F., Hollands, J. G., & Jamieson, G. A. (2011). Beyond identity: Incorporating system reliability information into an automated combat identification system. Human Factors: The Journal of the Human Factors and Ergonomics Society, 53, 338-355.

Operator's Guide to Human Factors in Aviation. (2014). Unexpected events training (OGHFA BN). Briefing note.

Parasuraman, R., Hancock, P. A., & Olofinboba, O. (1997). Alarm effectiveness in driver-centred collision-warning systems. Ergonomics, 40, 390-399.

Parasuraman, R., & Manzey, D. H. (2010). Complacency and bias in human use of automation: An attentional integration. Human Factors: The Journal of the Human Factors and Ergonomics Society, 52, 381-410.

Parasuraman, R., & Riley, V. (1997). Humans and automation: Use, misuse, disuse, abuse. Human Factors: The Journal of the Human Factors and Ergonomics Society, 39, 230-253.

Parasuraman, R., & Wickens, C. D. (2008). Humans: Still vital after all these years of automation. Human Factors: The Journal of the Human Factors and Ergonomics Society, 50, 511-520.

Perkins, L., Miller, J. E., Hashemi, A., & Burns, G. (2010). Designing for human-centered systems: Situational risk as a factor of trust in automation. In Proceedings of the Human Factors and Ergonomics Society 54th Annual Meeting (pp. 2130-213). Santa Monica, CA: Human Factors and Ergonomics Society.

Rempel, J. K., Holmes, J. G., & Zanna, M. P. (1985). Trust in close relationships. Journal of Personality and Social Psychology, 49, 95-112.

Rice, S., Kraemer, K., Winter, S. R., Mehta, R., Dunbar, V., Rosser, T. G., & Moore, J. C. (2014). Passengers from India and the United States have differential opinions about autonomous auto-pilots for commercial flights. International Journal of Aviation, Aeronautics, and Aerospace, 1, 1-12.

Rovira, E., McGarry, K., & Parasuraman, R. (2007). Effects of imperfect automation on decision making in a simulated command and control task. Human Factors: The Journal of the Human Factors and Ergonomics Society, 49, 76-87.

Sebok, A., Wickens, C., Sarter, N., Quesada, S., Socash, C., & Anthony, B. (2012). The automation design advisor tool (ADAT): Development and validation of a model-based tool to support flight deck automation design for NextGen operations. Human Factors and Ergonomics in Manufacturing & Service Industries, 22, 378-394.

Singh, I. L., Molloy, R., & Parasuraman, R. (1993). Automation-induced "complacency": Development of the complacency-potential rating scale. The International Journal of Aviation Psychology, 3, 111-122.

Sorkin, R. D., Kantowitz, B. H., & Kantowitz, S. C. (1988). Likelihood alarm displays. Human Factors: The Journal of the Human Factors and Ergonomics Society, 30, 445-459.

Spain, R. D., & Madhavan, P. (2009). The role of automation etiquette and pedigree in trust and dependence. In Proceedings of the Human Factors and Ergonomics Society 53rd Annual Meeting (pp. 339-343). Santa Monica, CA: Human Factors and Ergonomics Society.

Stokes, C. K., Lyons, J. B., Littlejohn, K., Natarian, J., Case, E., & Speranza, N. (2010). Accounting for the human in cyberspace: Effects of mood on trust in automation. In Proceedings of the 2010 International Symposium on Collaborative Technologies and Systems (pp. 180-187). Chicago, IL: IEEE.

Wang, L., Jamieson, G. A., & Hollands, J. G. (2009). Trust and reliance on an automated combat identification system. Human Factors: The Journal of the Human Factors and Ergonomics Society, 51, 281-291.

Watts, A. C., Ambrosia, V. C., & Hinkley, E. A. (2012). Unmanned aircraft systems in remote sensing and scientific research: Classification and considerations of use. Remote Sensing, 4, 1671-1692.

Wickens, C. D., & Alexander, A. L. (2009). Attentional tunneling and task management in synthetic vision displays. The International Journal of Aviation Psychology, 19, 182-199.

Wickens, C. D., Hollands, J. G., Banbury, S., & Parasuraman, R. (2012). Engineering psychology and human performance (4th ed., pp. 446-462). Upper Saddle River, NJ: Pearson.

Wickens, C. D., Rice, S., Keller, D., Hutchins, S., Hughes, J., & Clayton, K. (2009). False alerts in air traffic control conflict alerting system: Is there a "cry wolf" effect? Human Factors: The Journal of the Human Factors and Ergonomics Society, 51, 446-462.

Wiener, E. L. (1981). Complacency: Is the term useful for air safety? In Proceedings of the 26th Corporate Aviation Safety Seminar (pp. 116-125). Denver, CO: Flight Safety Foundation.

Wiener, E. L. (1989). Human factors of advanced technology ("glass cockpit") transport aircraft. NASA Contractor Report No. 177528. Moffett Field, CA: NASA Ames Research Center.

Woods, D. D., Johannesen, L. J., Cook, R. I., & Sarter, N. B. (1994). Behind human error: Cognitive systems, computers, and hindsight. Wright-Patterson Air Force Base, Dayton, OH: CSERIAC.

Yeh, M., Merlo, J. L., Wickens, C. D., & Brandenburg, D. L. (2003). Head up versus head down: The costs of imprecision, unreliability, and visual clutter on cue effectiveness for display signaling. Human Factors: The Journal of the Human Factors and Ergonomics Society, 45, 390-407.