- •Textbook Series
- •Contents
- •1 Basic Concepts
- •The History of Human Performance
- •The Relevance of Human Performance in Aviation
- •ICAO Requirement for the Study of Human Factors
- •The Pilot and Pilot Training
- •Aircraft Accident Statistics
- •Flight Safety
- •The Most Significant Flight Safety Equipment
- •Safety Culture
- •Reason’s Swiss Cheese Model
- •The Five Elements of Safety Culture
- •Flight Safety/Threat and Error Management
- •Threats
- •Errors
- •Undesired Aircraft States
- •Duties of Flight Crew
- •2 The Circulation System
- •Blood Circulation
- •The Blood
- •Composition of the Blood
- •Carriage of Carbon Dioxide
- •The Circulation System
- •What Can Go Wrong
- •System Failures
- •Factors Predisposing to Heart Attack
- •Insufficient Oxygen Carried
- •Carbon Monoxide
- •Smoking
- •Blood Pressure
- •Pressoreceptors and their Function Maintaining Blood Pressure
- •Function
- •Donating Blood and Aircrew
- •Pulmonary Embolism
- •Questions
- •Answers
- •3 Oxygen and Respiration
- •Oxygen Intake
- •Thresholds of Oxygen Requirements Summary
- •Hypoxic Hypoxia
- •Hypoxic Hypoxia Symptoms
- •Stages/Zones of Hypoxia
- •Factors Determining the Severity of and the Susceptibility to Hypoxic Hypoxia
- •Anaemic Hypoxia
- •Time of Useful Consciousness (TUC)
- •Times of Useful Consciousness at Various Altitudes
- •Effective Performance Time (EPT)
- •Hyperventilation
- •Symptoms of Hyperventilation
- •Hypoxia or Hyperventilation?
- •Cabin Pressurization
- •Cabin Decompression
- •Decompression Sickness (DCS)
- •DCS in Flight and Treatment
- •Questions
- •Answers
- •4 The Nervous System, Ear, Hearing and Balance
- •Introduction
- •The Nervous System
- •The Sense Organs
- •Audible Range of the Human Ear and Measurement of Sound
- •Hearing Impairment
- •The Ear and Balance
- •Problems of Balance and Disorientation
- •Somatogyral and Somatogravic Illusions
- •Alcohol and Flying
- •Motion Sickness
- •Coping with Motion Sickness
- •Questions
- •Answers
- •5 The Eye and Vision
- •Function and Structure
- •The Cornea
- •The Iris and Pupil
- •The Lens
- •The Retina
- •The Fovea and Visual Acuity
- •Light and Dark Adaptation
- •Night Vision
- •The Blind Spot
- •Stereopsis (Stereoscopic Vision)
- •Empty Visual Field Myopia
- •High Light Levels
- •Sunglasses
- •Eye Movement
- •Visual Defects
- •Use of Contact Lenses
- •Colour Vision
- •Colour Blindness
- •Vision and Speed
- •Monocular and Binocular Vision
- •Questions
- •Answers
- •6 Flying and Health
- •Flying and Health
- •Acceleration
- •G-forces
- •Effects of Positive G-force on the Human Body
- •Long Duration Negative G
- •Short Duration G-forces
- •Susceptibility and Tolerance to G-forces
- •Summary of G Tolerances
- •Barotrauma
- •Toxic Hazards
- •Body Mass Index (BMI)
- •Obesity
- •Losing Weight
- •Exercise
- •Nutrition and Food Hygiene
- •Fits
- •Faints
- •Alcohol and Alcoholism
- •Alcohol and Flying
- •Drugs and Flying
- •Psychiatric Illnesses
- •Diseases Spread by Animals and Insects
- •Sexually Transmitted Diseases
- •Personal Hygiene
- •Stroboscopic Effect
- •Radiation
- •Common Ailments and Fitness to Fly
- •Drugs and Self-medication
- •Anaesthetics and Analgesics
- •Symptoms in the Air
- •Questions
- •Answers
- •7 Stress
- •An Introduction to Stress
- •The Stress Model
- •Arousal and Performance
- •Stress Reaction and the General Adaption Syndrome (GAS)
- •Stress Factors (Stressors)
- •Physiological Stress Factors
- •External Physiological Factors
- •Internal Physiological Factors
- •Cognitive Stress Factors/Stressors
- •Non-professional Personal Factors/Stressors
- •Stress Table
- •Imaginary Stress (Anxiety)
- •Organizational Stress
- •Stress Effects
- •Coping with Stress
- •Coping with Stress on the Flight Deck
- •Stress Management Away from the Flight Deck
- •Stress Summary
- •Questions
- •Answers
- •8 Information Processing
- •Introduction
- •Basic Information Processing
- •Stimuli
- •Receptors and Sensory Memories/Stores
- •Attention
- •Perception
- •Perceived Mental Models
- •Three Dimensional Models
- •Short-term Memory (Working Memory)
- •Long-term Memory
- •Central Decision Maker and Response Selection
- •Motor Programmes (Skills)
- •Human Reliability, Errors and Their Generation
- •The Learning Process
- •Mental Schema
- •Questions
- •Answers
- •9 Behaviour and Motivation
- •An Introduction to Behaviour
- •Categories of Behaviour
- •Evaluating Data
- •Situational Awareness
- •Motivation
- •Questions
- •Answers
- •10 Cognition in Aviation
- •Cognition in Aviation
- •Visual Illusions
- •An Illusion of Movement
- •Other Sources of Illusions
- •Illusions When Taxiing
- •Illusions on Take-off
- •Illusions in the Cruise
- •Approach and Landing
- •Initial Judgement of Appropriate Glideslope
- •Maintenance of the Glideslope
- •Ground Proximity Judgements
- •Protective Measures against Illusions
- •Collision and the Retinal Image
- •Human Performance Cognition in Aviation
- •Special Situations
- •Spatial Orientation in Flight and the “Seat-of-the-pants”
- •Oculogravic and Oculogyral Illusions
- •Questions
- •Answers
- •11 Sleep and Fatigue
- •General
- •Biological Rhythms and Clocks
- •Body Temperature
- •Time of Day and Performance
- •Credit/Debit Systems
- •Measurement and Phases of Sleep
- •Age and Sleep
- •Naps and Microsleeps
- •Shift Work
- •Time Zone Crossing
- •Sleep Planning
- •Sleep Hygiene
- •Sleep and Alcohol
- •Sleep Disorders
- •Drugs and Sleep Management
- •Fatigue
- •Vigilance and Hypovigilance
- •Questions
- •Answers
- •12 Individual Differences and Interpersonal Relationships
- •Introduction
- •Personality
- •Interactive Style
- •The Individual’s Contribution within a Group
- •Cohesion
- •Group Decision Making
- •Improving Group Decision Making
- •Leadership
- •The Authority Gradient and Leadership Styles
- •Interacting with Other Agencies
- •Questions
- •Answers
- •13 Communication and Cooperation
- •Introduction
- •A Simple Communications Model
- •Types of Questions
- •Communications Concepts
- •Good Communications
- •Personal Communications
- •Cockpit Communications
- •Professional Languages
- •Metacommunications
- •Briefings
- •Communications to Achieve Coordination
- •Synchronization
- •Synergy in Joint Actions
- •Barriers to Crew Cooperation and Teamwork
- •Good Team Work
- •Summary
- •Miscommunication
- •Questions
- •Answers
- •14 Man and Machine
- •Introduction
- •The Conceptual Model
- •Software
- •Hardware and Automation
- •Intelligent Flight Decks
- •Colour Displays
- •System Active and Latent Failures/Errors
- •System Tolerance
- •Design-induced Errors
- •Questions
- •Answers
- •15 Decision Making and Risk
- •Introduction
- •The Mechanics of Decision Making
- •Standard Operating Procedures
- •Errors, Sources and Limits in the Decision-making Process
- •Personality Traits and Effective Crew Decision Making
- •Judgement Concept
- •Commitment
- •Questions
- •Answers
- •16 Human Factors Incident Reporting
- •Incident Reporting
- •Aeronautical Information Circulars
- •Staines Trident Accident 1972
- •17 Introduction to Crew Resource Management
- •Introduction
- •Communication
- •Hearing Versus Listening
- •Question Types
- •Methods of Communication
- •Communication Styles
- •Overload
- •Situational Awareness and Mental Models
- •Decision Making
- •Personality
- •Where We Focus Our Attention
- •How We Acquire Information
- •How We Make Decisions
- •How People Live
- •Behaviour
- •Modes of Behaviour
- •Team Skill
- •18 Specimen Questions
- •Answers to Specimen Papers
- •Revision Questions
- •Answers to Revision Questions
- •Specimen Examination Paper
- •Answers to Specimen Examination Paper
- •Explanations to Specimen Examination Paper
- •19 Glossary
- •Glossary of Terms
- •20 Index
Man and Machine 14
•If you do not understand a displayed piece of information, double-check it. It could be wrong.
•Take any opportunity to take extra simulator training.
Automation - Summary
Aircraft automation is gaining ground and is here to stay. It is a tool and, as we have seen, far from a panacea. It has gone a long way towards solving many of the traditional problems but brings new problems of its own in its wake.
As with any tool, its effectiveness depends on the user. It should be handled correctly and with an awareness of its weaknesses and dangers. Used badly it can lead to catastrophic results; handled well it becomes a major contributor to flight safety.
Intelligent Flight Decks
There is no precise line that divides the ‘automated’ from the ‘intelligent’, but the problem solving and data evaluation of which modern computers are capable merit terms such as ‘pilot’s associate’ and ‘electronic crew member’. There are, however, three main human factors issues that may be identified:
• How much autonomy should be given to the machine? Should the computer be allowed to evaluate information, make decisions, and execute them without reference to the pilot? Or should it remain in an advisory role, presenting suggestions to the pilot to assist him in making the necessary decisions? For example, should an aircraft fitted with a GPWS take automatic climb action on receipt of a terrain warning?
• As machines become more complex they evaluate greater quantities of possibly ‘noisy’ data. Any increase in this ‘noisy’ data will lead to an increase in ‘probabilistic’ solutions. Present aircraft displays do not give the pilot any estimate of the reliability of the data displayed; they simply display the machine’s best guess.
• The computer uses information from both its internal inertial system and ground-based fixing aids. However, the pilot is given the same display regardless of whether the aircraft ‘knows’ that good data is being received from all sources or whether the computer ‘knows’ that it is receiving information, for example, from two poor cross-cuts, distant VORs or an inertial system that has been drifting for a number of hours.
• Pilots must have an appropriate level of trust in their equipment since under-trust can lead to unnecessary workload and over-trust has obvious dangers. Modern equipment is normally very reliable and the perceived reliability will determine the amount of trust that pilots have in the equipment. Another factor for consideration is that the modern display may be so compelling that it generates more trust than it actually deserves.
Colour Displays
Where colour is used to indicate a change of state, for example, from ‘ALT (altitude) capture’ (blue) to ‘ALT hold’ (green), the colour change should be accompanied by a change of caption or location. The change of colour by itself is not normally sufficient to ensure that the crew will notice the difference.
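The guideline above can be sketched in code: a mode change should alter the caption (and/or its location), not only the colour. This is a minimal illustration only; the mode names, captions and colours below follow the ALT capture / ALT hold example in the text but do not represent any real avionics display logic.

```python
# Sketch: each mode change alters BOTH the caption and the colour, so the
# crew are not relying on a colour change alone to notice the new state.
# Mode names, captions and colours are illustrative, not from a real FMA.

MODES = {
    "ALT_CAPTURE": {"caption": "ALT*", "colour": "blue"},
    "ALT_HOLD":    {"caption": "ALT",  "colour": "green"},
}

def annunciate(mode: str) -> str:
    """Return the annunciation string for a given autopilot mode."""
    m = MODES[mode]
    return f"{m['caption']} ({m['colour']})"

print(annunciate("ALT_CAPTURE"))  # ALT* (blue)
print(annunciate("ALT_HOLD"))     # ALT (green)
```

Because the caption changes from ALT* to ALT along with the colour, the crew have a redundant cue to the change of state.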
System Active and Latent Failures/Errors
Introduction
The human contribution to failures in modern technological systems can be divided into two types: active and latent failures. The distinction between the two lies in:
•who made the error and/or
•how long these errors take to appear
Active Failures/Errors
Active errors/failures are committed at the human-system interface (i.e. in the cockpit, in the cabin or at the Air Traffic Controller’s desk) and have an immediate effect. We have already discussed a number of these (Action Slip, Environmental Capture, etc.).
Latent Errors/Failures
Latent errors/failures are normally the result of decisions taken by designers, manufacturers and senior management. These people are usually far removed from the immediate system. However, the consequences of their actions or decisions, which may have lain dormant - perhaps for a long time - can have sudden and disastrous results. An example of a latent failure was the Mount Erebus crash, where an aircraft database contained an unnoticed waypoint error of 2°W. This was sufficient for the aircraft to hit a mountain in poor visibility. Rushed or incomplete preparation is another example of latent failure.
System Tolerance
Sod’s Law states: If something can go wrong, it will.
An example of Sod’s Law is Murphy’s Law, which states: If a system can be operated incorrectly, sooner or later it will be.
Error Tolerance
Aviation systems, whether aircraft, organizational or procedural, must be error-tolerant. This ensures that no single error has serious implications for the overall safety or conduct of the system. An example would be an automatic system that prevents an aircraft from moving outside its flight envelope regardless of the orders the pilot enters through the controls.
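The flight envelope protection example can be sketched as simple limiting logic: whatever the pilot demands, the system clamps the demand to the permitted envelope. This is a minimal sketch only; the limit values and function names are invented for illustration and do not correspond to any real aircraft’s control laws.

```python
# Sketch of error tolerance via envelope protection: the pilot's demand is
# clamped so the aircraft cannot be driven outside its flight envelope.
# All limits and names here are illustrative, not from a real aircraft.

PITCH_LIMITS_DEG = (-15.0, 30.0)   # hypothetical pitch envelope
BANK_LIMIT_DEG = 67.0              # hypothetical bank envelope

def protected_pitch_demand(pilot_demand_deg: float) -> float:
    """Clamp the pilot's pitch demand to the permitted envelope."""
    low, high = PITCH_LIMITS_DEG
    return max(low, min(high, pilot_demand_deg))

def protected_bank_demand(pilot_demand_deg: float) -> float:
    """Clamp the pilot's bank demand symmetrically about wings level."""
    return max(-BANK_LIMIT_DEG, min(BANK_LIMIT_DEG, pilot_demand_deg))

# A full back-stick demand of 45 degrees is limited to the 30-degree cap:
print(protected_pitch_demand(45.0))   # 30.0
print(protected_bank_demand(-90.0))   # -67.0
```

The pilot’s erroneous input is tolerated rather than propagated: the system accepts the command but bounds its effect, which is exactly what error tolerance requires.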
Protected and Vulnerable Systems
Systems must also be designed to contain their own intrinsic protection. A system is considered vulnerable if a single error is allowed to affect the whole system. Figure 14.7 illustrates the concept: a brick taken out of the “protected” wall leaves the main structure standing, whereas in the case of the “vulnerable” wall the whole function is affected.
Figure 14.7 Protected and vulnerable systems
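The wall analogy of Figure 14.7 can be expressed as a tiny sketch contrasting the two designs. This is an illustration only; the function names are invented, and “component” stands for any brick in the wall.

```python
# Illustrative contrast between a "vulnerable" system, where one failed
# component defeats the whole function, and a "protected" system, whose
# redundant design survives the loss of any single component.

def vulnerable_system(components_ok: list) -> bool:
    # Every component must work: a single failure brings the system down.
    return all(components_ok)

def protected_system(components_ok: list) -> bool:
    # Redundancy: the function survives while at least one component works.
    return any(components_ok)

one_brick_removed = [True, True, False]
print(vulnerable_system(one_brick_removed))  # False - whole function lost
print(protected_system(one_brick_removed))   # True  - structure still stands
```

The design choice mirrors the figure: the protected wall degrades gracefully, while the vulnerable wall fails completely from a single error.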
Design-induced Errors
These errors are those made by aircrew as a direct result of poor or faulty design of any part of the aircraft. The philosophy which will underpin all future EASA design efforts - especially those in the field of avionics and automation - will be based upon:
Detectability Tolerance Recoverability
Systems will be expected to detect errors made by aircrew, tolerate them and, as far as is possible, to recover from these errors.
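The detect / tolerate / recover philosophy can be sketched for a simple crew data entry. This is a hypothetical illustration only; the validation bounds, function name and recovery behaviour are invented and do not represent any real FMS logic.

```python
# Sketch of the detect / tolerate / recover philosophy applied to a crew
# altitude entry. Bounds and recovery behaviour are invented for illustration.

def accept_altitude_entry(entry: str, last_good_ft: int) -> int:
    # Detect: reject entries that cannot be parsed as an altitude at all.
    try:
        value = int(entry)
    except ValueError:
        return last_good_ft        # Recover: fall back to the last good value
    # Tolerate: an implausible but parseable value is caught by sanity bounds.
    if not 0 <= value <= 60000:
        return last_good_ft        # Recover rather than propagate the error
    return value

print(accept_altitude_entry("35000", 10000))   # 35000 (accepted)
print(accept_altitude_entry("350O0", 10000))   # 10000 (detected, recovered)
print(accept_altitude_entry("999999", 10000))  # 10000 (tolerated, recovered)
```

The aircrew error is detected, its effect is tolerated by the bounds check, and the system recovers to a known-good state instead of acting on bad data.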
Questions
1. A pilot is reading a checklist. In terms of the SHELL concept, which interface is this?

a. S - L
b. H - L
c. L - S
d. H - E

2. What percentage of the appropriate population are anthropometric data table measurements taken from?

a. 80%, i.e. the tenth to the ninetieth percentile, using contour, dynamic and static data
b. 90%, i.e. the fifth to the ninety-fifth percentile, using contour, dynamic and static data
c. 50%, i.e. the twenty-fifth to the seventy-fifth percentile, using contour, dynamic and static data
d. None of the above

3. What is the most common checklist error?

a. Action slip
b. Too many capital letters are used
c. Responded to automatically
d. Missing items

4. What is the purpose of the lumbar support?

a. To allow the most comfortable position for the spine and higher neck bones
b. To allow the most comfortable position for the spine and shoulder bones
c. To allow the most comfortable position for the spine
d. To produce an even pressure on the discs by allowing the lower spine to curve naturally

5. What are the essential characteristics of a cockpit warning?

a. It should have the best attention-getting qualities possible
b. It should be attention-getting but not alarming
c. It should have attention-getting qualities which do not compromise a clear indication to the pilot of the faulty component/system
d. It must not dazzle or possibly compromise the crew’s night vision

6. What is the most important feature of flight deck design?

a. Escape and emergency exits should be clear of obstructions
b. The design eye point must be clearly marked
c. Important controls must be located in easily reached and unobstructed positions
d. Controls and indicators should be standardized

7. What will the pilot lose sight of on the approach if seated below the Design Eye Point?

a. Some of the undershoot
b. Some of the overshoot
c. Peripheral objects, especially at night
d. The sight view

8. What instrument is best for showing small changes?

a. A digital display
b. An analogue display
c. A mixed digital/analogue display
d. An ultra/high-precision gyro instrument

9. What colour should the ‘Alert’ warning be on a CRT?

a. Bright red and flashing
b. Steady red
c. Flashing yellow/amber
d. Steady yellow

10. Which pitot/static instrument is most likely to be misread?

a. The ASI at night illuminated by a red light
b. The ASI at night illuminated by low intensity white light
c. The three point altimeter
d. The four point altimeter

11. A manually operated valve should be opened by:

a. turning it clockwise
b. turning it anticlockwise
c. turning it either way
d. depends on the system it operates

12. The three types of anthropometric measurements are:

a. static, design, contour
b. contour, design, dynamic
c. static, dynamic, contour
d. static, dynamic, design

13. In the SHELL Model, L stands for:

a. latent errors
b. long-term errors
c. lengthy errors
d. liveware

14. System Tolerance can be subdivided into:

a. protected and semi-protected systems
b. protected and endangered systems
c. protected and vulnerable systems
d. protected and quasi-protected systems

15. A flashing red warning light on a CRT normally indicates:

a. there is a fault in a critical system
b. emergency
c. alert
d. danger

16. Automation Complacency is:

a. overconfidence in the handling capability of the pilot
b. overconfidence in the handling capability of computers
c. overreliance on automation
d. blind belief in automation

17. Mode error is associated with:

a. automation
b. hardware
c. INS
d. software

18. A danger of automation is that:

a. there can be greater delays between the performance of the crew and its ultimate effect
b. delays between the performance of the crew and its ultimate effect are shortened
c. delays between the performance of the crew and its ultimate effect are not appreciated
d. delays between the performance of the crew and its ultimate effect have no effect

19. Automation:

a. helps with unusual and unexpected situations
b. may result in a pilot being unaware of important information when dealing with an unusual and unexpected situation
c. increases reaction time when dealing with unusual and unexpected situations
d. decreases reaction time when dealing with unusual and unexpected situations

20. Automation can result in:

a. lack of information being passed between crew members
b. too much information being passed between crew members
c. confused information being passed between crew members
d. too much detailed information being passed between crew members