How Aviation Automation Is Helping to Improve Efficiency and Safety




How is aviation automation helping to improve aviation efficiency and safety?

According to Wise et al. (1994), there has been a sustained effort to improve aviation efficiency and safety by automating various cockpit procedures. The aim of this automation has been to reduce the occurrence of errors and violations by the flight crew, and in many cases it has yielded the expected results, though in others it has failed. Merritt and Klinect (2006) define errors as actions or inactions by the flight crew that cause a deviation from the crew's intentions, with reduced safety margins and an increased probability of unfavorable operational events during flight or on the ground. Errors fall into three categories: communication, procedural, and aircraft-handling errors. Communication errors result from miscommunication among pilots, flight crew, and external agents; procedural errors arise when the flight crew deviates from the requirements of the flight manual, regulations, or the airline's standard operating procedures; and aircraft-handling errors are deviations related to aircraft configuration, speed, and direction.
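To make the taxonomy concrete, the following minimal Python sketch encodes the three categories together with a naive keyword-based classifier. The classify_event function and its keyword lists are hypothetical illustrations, not part of Merritt and Klinect's framework, in which trained observers code events.

from enum import Enum

class CrewError(Enum):
    """Three error categories after Merritt and Klinect (2006)."""
    COMMUNICATION = "miscommunication among pilots, crew, or external agents"
    PROCEDURAL = "deviation from manuals, regulations, or operating procedures"
    AIRCRAFT_HANDLING = "configuration-, speed-, or direction-related deviation"

def classify_event(description: str) -> CrewError:
    """Hypothetical keyword-based classifier, for illustration only."""
    text = description.lower()
    if any(k in text for k in ("readback", "callout", "atc", "briefing")):
        return CrewError.COMMUNICATION
    if any(k in text for k in ("checklist", "sop", "procedure", "limitation")):
        return CrewError.PROCEDURAL
    return CrewError.AIRCRAFT_HANDLING

print(classify_event("missed checklist item before descent"))  # CrewError.PROCEDURAL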

According to Airbus (2005), an error in aviation occurs when an intentional action or inaction fails to achieve its intended outcome. The error itself cannot be intentional, although the original action that led to it is. Movements beyond the actor's control, such as reflexes, are therefore not considered errors, and an outcome determined by factors beyond the actor's control does not count as an error either. Errors fall into two categories: slips and lapses, and mistakes. Slips and lapses are failures to execute intended actions: actions that fail to proceed as planned are slips, while memory failures are lapses. Mistakes occur when the plan of action itself fails, that is, when the intended outcome is not achieved even though the plan is executed correctly. A plan leads to mistakes because it is deficient, clumsy, dangerous, or a good plan applied inappropriately.

A violation is an intentional action or inaction that goes against known procedures, rules, or norms. Whereas violations are deliberate, errors are not: a violation is committed by conscious decision, while errors happen despite the actor's intention to avoid them. It is important to note that a person who commits a violation does not necessarily anticipate the negative consequences that may follow; in some cases the situation remains under control despite the violation. Slips and lapses occur at the skill-based performance level, where violations are normally part of the actor's automated routines. Mistakes stem from conscious decisions and therefore occur at both the rule-based and knowledge-based performance levels (Airbus, 2005).

In earlier times, aircraft accidents were attributed to adverse weather conditions or to systems and engine failures, and accident investigations paid attention only to technical aspects. When commercial jet transport became common in the 1970s, the technology matured and the number of accidents due to mechanical failure declined accordingly (Leiden, Keller & French, 2001). High-profile accidents still occurred, however, in which no mechanical problem could be identified as the cause. To increase aviation safety, research was therefore needed into the contribution of human factors to the errors behind accidents. Between 1970 and 1997, human factors contributed to 69 percent of the causes of aircraft accidents. Human errors and violations originate from causal factors such as incapacitation, fatigue, lack of experience or proper training, and poor crew management (Karwal, Verkaik & Jansen, 2000).

Reason (1990) argued that four levels of human failure lead to aviation accidents, each level influencing the next: organizational influences lead to unsafe supervision, which creates conditions favoring unsafe acts, which culminate in the unsafe acts of aircraft operators. It is these unsafe acts that directly produce the accidents attributed to human error. Shappell and Wiegmann (2000) divided such errors into three categories: decision errors, which occur when the crew makes the wrong choice or applies the required procedures inappropriately; skill-based errors, which affect actions that flight crews perform without significant conscious thought and which are therefore vulnerable to attention failures; and perceptual errors, which result from degraded perception, as when flying at night or in visually impoverished weather conditions.
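Reason's four-level chain can be rendered as a simple data structure, as in the Python sketch below; the field names paraphrase the levels, and the example values are illustrative rather than taken from HFACS documentation.

from dataclasses import dataclass

@dataclass
class HfacsChain:
    """Reason's (1990) four levels of human failure, each influencing the next."""
    organizational_influences: str       # level 1: e.g., schedule pressure
    unsafe_supervision: str              # level 2: e.g., inadequate oversight
    preconditions_for_unsafe_acts: str   # level 3: e.g., crew fatigue
    unsafe_act: str                      # level 4: decision, skill-based, or perceptual error

    def trace(self) -> str:
        """Render the top-down causal path that ends in an accident."""
        return " -> ".join((self.organizational_influences, self.unsafe_supervision,
                            self.preconditions_for_unsafe_acts, self.unsafe_act))

print(HfacsChain("schedule pressure", "inadequate oversight",
                 "crew fatigue", "skill-based error").trace())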

Despite improving aviation safety, automation has also introduced safety issues of its own (Young, 2007). In response to problems arising from flight-deck automation, Billings (1991) argued that flight-deck designs must define the roles of automated agents and of humans, and must support those roles explicitly. Inadequate consideration of how automated systems are integrated has itself proved a new source of error: one extensive study found, for example, that the control display unit, the interface to the flight management system, introduced a new form of error (Abbott, 1996). There was therefore a need to develop adaptive displays that keep the flight crew in control of highly automated aircraft systems.

According to NATO (2002), the problem of aviation automation is better approached from the viewpoint of providing an assistant, or electronic crew member. The approach succeeds, however, only in non-time-pressured situations, in which the flight crew has time to review and understand proposed actions. In time-pressured situations such systems have been unsuccessful, falling short of the promise that they would interact with human-like intelligence. Rather than emulating an electronic crew member, Schutte and Goodrich (2006) suggested a complementary automation strategy in which human and machine work together in symbiotic dependence. The human operator provides general intelligence, common-sense knowledge, and creative thinking; the machine provides resistance to fatigue, swift and precise control, and encyclopedic memory. In general, the human is in charge of decisions and actions that have significant consequences for overall safety, while the machine performs the more deterministic, tedious, repetitive, and time-constrained activities that require great precision. This greatly simplifies the operator's interaction with the system while lowering the chances of error and subsequent failure.
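A rough Python sketch of such a function-allocation rule follows. The task attributes and the allocate function are assumptions made for illustration; they are not Schutte and Goodrich's actual design.

from enum import Enum, auto

class Agent(Enum):
    HUMAN = auto()    # general intelligence, common sense, creative thinking
    MACHINE = auto()  # fatigue resistance, precise control, encyclopedic memory

def allocate(task: dict) -> Agent:
    """Hypothetical allocation rule: safety-critical judgment stays with the
    human; deterministic, repetitive, or time-constrained work goes to the
    machine; the human retains authority whenever in doubt."""
    if task["safety_critical_judgment"]:
        return Agent.HUMAN
    if task["deterministic"] and (task["repetitive"] or task["time_constrained"]):
        return Agent.MACHINE
    return Agent.HUMAN

print(allocate({"safety_critical_judgment": False, "deterministic": True,
                "repetitive": True, "time_constrained": False}))  # Agent.MACHINE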

There have been reported cases in which individuals monitoring automated systems remain unaware of changes in the system (Ephrath & Young, 1981). Parasuraman (1987) associated monitoring failures with simple tasks, but noted that humans are poor monitors of the passive activities of automated systems regardless of the complexity of those activities. Billings (1991) holds that the chances of human failure in monitoring automated systems are highest when a system behaves in a reasonable but incorrect manner, or when operators are not alert to its state. A major impediment to implementing and operating automated systems is that many operators have difficulty understanding a system even when it functions exactly as designed, whether because of the system's complexity, poor interface design, or insufficient training. In such circumstances there is a high risk of human error brought about by automation (Endsley, 1996).
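One mitigation this literature suggests is to surface automation state changes actively rather than leaving them to passive monitoring. The Python sketch below is a hypothetical monitor, not an actual avionics interface: every transition triggers an explicit announcement instead of being silently logged.

class AutopilotMonitor:
    """Hypothetical monitor that announces automation state changes so the
    crew is not left unaware of them (cf. Ephrath & Young, 1981)."""
    def __init__(self, announce):
        self._state = None
        self._announce = announce  # callback, e.g. an aural or visual alert

    def update(self, new_state: str) -> None:
        # A silent transition is exactly the failure mode described above.
        if new_state != self._state:
            self._announce(f"AUTOMATION STATE CHANGE: {self._state} -> {new_state}")
            self._state = new_state

mon = AutopilotMonitor(print)
mon.update("ALT HOLD")  # AUTOMATION STATE CHANGE: None -> ALT HOLD
mon.update("OPEN DES")  # AUTOMATION STATE CHANGE: ALT HOLD -> OPEN DES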

In a January 1992 accident involving an Airbus A320, no mechanical failure was identified as a probable cause. The aircraft's voice recorder registered no sign of panic immediately before the accident. Although the circumstances surrounding the crash were inevitably complex, De Keyser and Javaux (1996) agree that it was caused by an inappropriate entry in the autopilot mode. The cockpit crew had been denied permission to land on their first request and were being assisted by air traffic control toward a second attempt. Both the pilot and the co-pilot were very busy correcting the lateral course, deploying the landing gear, running the pre-descent checklist, and entering a suitable descent rate. Using the bimodal dial system, the pilot entered a descent rate of 3,300 feet per minute in place of a 3.3-degree descent angle, and the aircraft crashed.
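The mechanism of that error can be made concrete with a small Python sketch of a bimodal dial, in which the same two-digit entry means radically different things depending on the active mode. The function and mode names are hypothetical, not the A320's actual interface logic.

from enum import Enum

class DescentMode(Enum):
    VERTICAL_SPEED = "V/S"     # dial digits read as hundreds of feet per minute
    FLIGHT_PATH_ANGLE = "FPA"  # dial digits read as tenths of a degree

def interpret_dial(digits: int, mode: DescentMode) -> str:
    """The same entry of 33 yields a gentle 3.3-degree path in FPA mode but
    a steep 3,300 ft/min descent in V/S mode."""
    if mode is DescentMode.FLIGHT_PATH_ANGLE:
        return f"{digits / 10:.1f} degree descent angle"
    return f"{digits * 100} feet per minute descent rate"

print(interpret_dial(33, DescentMode.FLIGHT_PATH_ANGLE))  # 3.3 degree descent angle
print(interpret_dial(33, DescentMode.VERTICAL_SPEED))     # 3300 feet per minute descent rate

A design that forces the crew to confirm the active mode before accepting the entry, or that displays the two interpretations side by side, would make this class of error far harder to commit.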

Automation in aviation has also created error situations through imperfections such as false alarms, incorrect recommendations, and missed alerts. Consequently, the level of automation should be varied with circumstances, a process called adaptive automation. Over-reliance on automation by the human operator results in diminished situation awareness and skill degradation, and hence a higher likelihood of error (Wickens & Dixon, 2007). Decision aids are mostly imperfect; since automated systems cannot be depended upon completely, especially for decision-making functions, the costs and error possibilities of such systems should be analyzed critically. Crocoll and Coury (2003) found that the cost of imperfect advice from automated systems was higher in situations that involved decision making than in situations where the automated system merely provided information. Rovira et al. (2007) found that reliable automation reduces errors by a large margin and increases accuracy, both for battlefield engagement decisions and in aviation, at high, medium, and low levels of automation; when an incorrect advisory was provided, however, accuracy dropped significantly, as indicated below.

[Figure: Decision accuracy under different forms of automation support, for correct and incorrect advisories (Rovira et al., 2007).]
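The pattern Rovira et al. report, in which operators who comply with an advisory inherit its reliability, can be approximated with a simple Monte Carlo sketch in Python; the reliability, compliance, and unaided-accuracy parameters below are illustrative assumptions, not values from the study.

import random

def simulate_accuracy(reliability: float, compliance: float,
                      unaided: float, trials: int = 100_000) -> float:
    """Operators who follow the aid inherit its reliability; otherwise they
    fall back on their own unaided accuracy."""
    correct = 0
    for _ in range(trials):
        if random.random() < compliance:  # operator follows the advisory
            correct += random.random() < reliability
        else:                             # operator decides alone
            correct += random.random() < unaided
    return correct / trials

print(simulate_accuracy(reliability=0.9, compliance=0.8, unaided=0.7))  # about 0.86
print(simulate_accuracy(reliability=0.5, compliance=0.8, unaided=0.7))  # about 0.54

Even this toy model reproduces the qualitative finding: reliable automation lifts overall accuracy above the unaided baseline, while an unreliable aid drags it below.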

In conclusion, automation in the aviation industry aims mainly to reduce the workload on the flight crew and to increase accuracy, especially in overly complex and critical processes. Although automation can itself lead to error, most errors arising in automated processes originate in human error. It is, however, not possible to do away with the human factor completely, since human control is needed even for aircraft that fly without human pilots. The larger share of aviation errors is human; only a small share is mechanical.

References

Abbott, K. (1996). The Interfaces Between Flightcrews and Modern Flight Deck Systems. Federal Aviation Administration.

Airbus (2005). Flight Operations Briefing Notes: Human Performance, Error Management. FOBN Reference: FLT_OPS – HUM_PER – SEQ 07 – REV 01.

Billings, C. (1991). Human-Centered Aircraft Automation: A Concept and Guidelines. NASA TM 103885. Washington, DC: NASA.

Crocoll, W. M. and Coury, B. G. (2003). Status or recommendation: Selecting the type of information for decision aiding. In Proceedings of the Human Factors Society 34th Annual Meeting (pp. 1524–1528). Santa Monica, CA.

De Keyser, V. and Javaux, D. (1996). Human Factors in Aeronautics. Design, Specification and Verification of Interactive Systems.

Endsley, M. R. (1996). Automation and situation awareness. Texas Tech University.

Ephrath, A. R. and Young, L. R. (1981). Monitoring vs. Man-in-the-Loop Detection of Aircraft Control Failures. New York: Plenum Press.

Karwal, A. K., Verkaik, R. and Jansen, C. (2000). Non-Adherence to Procedures: Why Does It Happen? The University of Texas.

Leiden, K., Keller, J. and French, J. (2001). Context of Human Error in Commercial Aviation. NASA System-Wide Accident Prevention Program, Ames Research Center, Moffett Field, CA 94035-1000.

Merritt, A. and Klinect, J. (2006). Defensive Flying for Pilots: An Introduction to Threat and Error Management. The University of Texas.

Parasuraman, R. (1987). Human-computer monitoring. Human Factors, 29(6), 695–705.

Reason, J. (1990). Human Error. New York: Cambridge University Press

Research and Technology Organization (RTO) of NATO (2002). Tactical Decision Aids and Situational Awareness. RTO/NATO, BP 25, 7 rue Ancelle, F-92201 Neuilly-sur-Seine Cedex, France.

Rovira, E., McGarry, K. and Parasuraman, R. (2007). Effects of imperfect automation on decision making in a simulated command and control task. Human Factors, 49, 76–87.

Schutte, P. and Goodrich, K. (2006). The Naturalistic Flight Deck System: An Integrated System Concept for Improved Single-Pilot Operations. NASA TM (in review). Washington, DC: NASA.

Shappell, S. and Wiegmann, D. (2000). The Human Factors Analysis and Classification System (HFACS). Report No. DOT/FAA/AM-00/7. Washington, DC: Federal Aviation Administration, Office of Aviation Medicine.

Wiegmann, D. and Faaborg, T. (2005). Human Error and General Aviation Accidents: A Comprehensive, Fine-Grained Analysis Using HFACS. Oklahoma City, OK: Civil Aerospace Medical Institute, Federal Aviation Administration.

Wise, J. A., Tilden, D. S., Abbott, D. W., Dyck, J. L. and Guide, P. C. (1994). Managing Automation in the Cockpit. Orlando: University of Central Florida.

Young, S. D. (2007). Aviation Safety Program: Integrated Intelligent Flight Deck, Technical Plan Summary.



