
Man and Machine 14

If you do not understand a displayed piece of information, double-check it. It could be wrong.

Take every opportunity to undertake extra simulator training.

Automation - Summary

Aircraft automation is gaining ground and is here to stay. It is a tool and, as we have seen, it is far from a panacea. Certainly it has gone a long way towards solving many of the traditional problems but, in its wake, it brings problems of its own.

As with any tool, its effectiveness depends on the user. It should be handled in the correct way and with an awareness of its weaknesses and dangers. Used badly it can lead to catastrophic results, but handled well it becomes a major contributor to flight safety.

Intelligent Flight Decks

There is no precise line that divides the ‘automated’ from the ‘intelligent’, but the problem solving and data evaluation of which modern computers are capable would merit the use of terms such as ‘pilot’s associate’ and ‘electronic crew member’. There are, however, three main human factors issues that may be identified:

• How much autonomy should be given to the machine? Should the computer be allowed to evaluate information, make decisions, and execute them without reference to the pilot? Or should it remain in an advisory role, presenting suggestions to the pilot to assist him in making the necessary decisions? For example, should an aircraft fitted with a GPWS take automatic climb action on receipt of a terrain warning?

• As machines become more complex they evaluate greater quantities of possibly ‘noisy’ data. Any increase in this ‘noisy’ data will lead to an increase in ‘probabilistic’ solutions. Present aircraft displays do not give the pilot any estimate of the reliability of the data displayed; they simply display the machine’s best guess. The computer uses information from both its internal inertial system and ground based fixing aids. However, the pilot is given the same display regardless of whether the aircraft ‘knows’ that good data is being received from all sources or whether the computer ‘knows’ that it is receiving information, for example, from two poor cross-cuts, distant VORs or an inertial system that has been drifting for a number of hours. (A simple sketch after this list illustrates the gap between the displayed best guess and the uncertainty the computer could attach to it.)

 

• Pilots must have an appropriate level of trust in their equipment since under-trust can lead to unnecessary workload and over-trust has obvious dangers. Modern equipment is normally very reliable and the perceived reliability will determine the amount of trust that pilots have in the equipment. Another factor for consideration is that the modern display may be so compelling that it generates more trust than it actually deserves.
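As a rough illustration of the ‘best guess without a reliability estimate’ point above, the following Python sketch blends position estimates of different assumed quality. All source labels, positions and sigma values are invented for illustration and are not taken from any real system; the point is only that the computer could calculate an uncertainty alongside its best guess, while a conventional display would present just the position.

```python
# Minimal sketch (hypothetical figures): a nav computer blending position
# sources of very different quality. It can compute an uncertainty for the
# blended fix, but a conventional display shows only the "best guess".

def blend(fixes):
    """Inverse-variance weighted position (NM) and its 1-sigma uncertainty."""
    weights = [1.0 / sigma ** 2 for _, sigma in fixes]
    estimate = sum(w * pos for w, (pos, _) in zip(weights, fixes)) / sum(weights)
    sigma = (1.0 / sum(weights)) ** 0.5
    return estimate, sigma

# (position NM, assumed 1-sigma error NM) -- illustrative numbers only
good_sources = [(100.2, 0.1), (100.1, 0.3), (100.3, 0.5)]   # e.g. DME/DME, VOR, fresh INS
poor_sources = [(100.2, 3.0), (101.5, 4.0), (98.8, 6.0)]    # distant VORs, drifted INS

for name, fixes in (("good sources", good_sources), ("poor sources", poor_sources)):
    pos, sigma = blend(fixes)
    print(f"{name}: displayed position {pos:.1f} NM, internal uncertainty +/-{sigma:.1f} NM")
```

Run with these figures, the displayed position is almost identical in both cases, while the internal uncertainty differs by a factor of more than twenty; only the former is normally shown to the pilot.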

Colour Displays

Where colour is used to indicate a change of state, for example, from ‘ALT (altitude) capture’ (blue) to ‘ALT hold’ (green), the colour change should be accompanied by a change of caption or location. The change of colour by itself is not normally sufficient to ensure that the crew will notice the difference.


System Active and Latent Failures/Errors

Introduction

The human contribution to failures of modern technological systems can be divided into two types: Active and Latent Failures. The distinction between the two lies in:

• who made the error, and/or

• how long these errors take to appear

Active Failures/Errors

Active errors/failures are committed at the human-system interface (i.e. in the cockpit, in the cabin or at the Air Traffic Controller's desk) and have an immediate effect. We have already discussed a number of these (Action Slip, Environmental Capture etc.).


Latent Errors/Failures

Latent errors/failures are normally the result of decisions taken by designers, manufacturers and senior management. These people are usually a long way removed from the immediate system, yet the consequences of their actions or decisions may lie dormant - perhaps for a long time - before producing sudden and disastrous results. An example of latent failure was the Mount Erebus crash, where an aircraft database had an unnoticed waypoint error of 2°W. This was sufficient for the aircraft to hit a mountain in poor visibility. Rushed or incomplete preparation is another example of latent failure.
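A back-of-the-envelope calculation shows why a 2° longitude error was so significant. The 2° figure comes from the text; the latitude used below is an assumption for illustration (Mount Erebus lies at a very high southern latitude), not a value taken from the text.

```python
import math

error_deg = 2.0        # waypoint longitude error quoted in the text (degrees)
latitude_deg = 77.5    # assumed latitude of the Mount Erebus area (illustrative)

# One degree of longitude spans 60 NM at the equator and shrinks with cos(latitude).
offset_nm = error_deg * 60.0 * math.cos(math.radians(latitude_deg))
print(f"Track displacement: about {offset_nm:.0f} NM ({offset_nm * 1.852:.0f} km)")
# -> roughly 26 NM (about 48 km): far more than enough to place an aircraft
#    over high ground instead of the intended route.
```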

System Tolerance

Sod’s Law states:

If something can go wrong, it will.

An example of Sod’s Law is:

Murphy’s Law

which states: If a system can be operated incorrectly, sooner or later it will be.

Error Tolerance

Aviation systems, whether aircraft, organizational or procedural, must be error-tolerant. This ensures that no error has serious implications for the overall safety or conduct of the system. An example would be an automatic system that prevents an aircraft from moving outside its flight envelope regardless of the orders the pilot enters through the controls.
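As a minimal sketch of that error-tolerant principle, the Python fragment below limits whatever attitude is demanded to an assumed envelope before it is acted on. The function name and the limit values are illustrative assumptions and are not taken from any real aircraft or from the text.

```python
# Assumed envelope limits (illustrative only)
ENVELOPE = {"pitch_deg": (-15.0, 25.0), "bank_deg": (-67.0, 67.0)}

def protect(demanded):
    """Clamp each demanded value to the envelope, whatever the pilot orders."""
    limited = {}
    for axis, value in demanded.items():
        lo, hi = ENVELOPE[axis]
        limited[axis] = min(max(value, lo), hi)
    return limited

# A demand that would exceed the envelope is limited before it takes effect:
print(protect({"pitch_deg": 40.0, "bank_deg": -80.0}))
# -> {'pitch_deg': 25.0, 'bank_deg': -67.0}
```

The design intent is that the limiting happens regardless of the input, so a single erroneous order cannot take the system outside safe bounds.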

Protected and Vulnerable Systems

Systems must also be designed to contain their own intrinsic protection. A system is considered vulnerable if one error is allowed to affect the whole system. Figure 14.7 illustrates the concept: a brick taken out of the “protected” wall will leave the main structure standing, whereas in the case of the “vulnerable” wall the whole function will be affected.


Figure 14.7 Protected and vulnerable systems
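The wall analogy of Figure 14.7 can also be expressed in software terms. The sketch below is purely illustrative; the sensor names and values are hypothetical. It contrasts a design that loses its whole function when one element fails with one that degrades gracefully.

```python
# One of three hypothetical air data sources has failed (None).
readings = {"adc_1": 250.0, "adc_2": 251.0, "adc_3": None}

def vulnerable(sources):
    """The whole function is lost as soon as any single element fails."""
    if any(v is None for v in sources.values()):
        raise RuntimeError("whole function lost after a single failure")
    return sum(sources.values()) / len(sources)

def protected(sources):
    """Carry on with whatever valid elements remain; degrade, don't collapse."""
    valid = [v for v in sources.values() if v is not None]
    return sum(valid) / len(valid) if valid else None

print(protected(readings))   # -> 250.5, still usable
# vulnerable(readings) would raise, losing the entire function
```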

Design-induced Errors

These errors are those made by aircrew as a direct result of poor or faulty design of any part of the aircraft. The philosophy which will underpin all future EASA design efforts - especially those in the field of avionics and automation - will be based upon:

• Detectability

• Tolerance

• Recoverability

Systems will be expected to detect errors made by aircrew, tolerate them and, as far as is possible, to recover from these errors.


Questions

 

1. A pilot is reading a checklist. In what way is this referenced in the SHELL concept?

a. S - L
b. H - L
c. L - S
d. H - E

2. What percentage of the appropriate population are anthropometric data table measurements taken from?

a. 80%, i.e. the tenth to the ninetieth percentile, using contour, dynamic and static data
b. 90%, i.e. the fifth to the ninety-fifth percentile, using contour, dynamic and static data
c. 50%, i.e. the twenty-fifth to the seventy-fifth percentile, using contour, dynamic and static data
d. None of the above

3. What is the most common checklist error?

a. Action slip
b. Too many capital letters are used
c. Responded to automatically
d. Missing items

4. What is the purpose of the lumbar support?

a. To allow the most comfortable position for the spine and higher neck bones
b. To allow the most comfortable position for the spine and shoulder bones
c. To allow the most comfortable position for the spine
d. To produce an even pressure on the discs by allowing the lower spine to curve naturally

5. What are the essential characteristics of a cockpit warning?

a. It should have the best attention-getting qualities possible
b. It should be attention-getting but not alarming
c. It should have attention-getting qualities which do not compromise a clear indication to the pilot of the faulty component/system
d. It must not dazzle or possibly compromise the crew's night vision

6. What is the most important feature of flight deck design?

a. Escape and emergency exits should be clear of obstructions
b. The design eye point must be clearly marked
c. Important controls must be located in easily reached and unobstructed positions
d. Controls and indicators should be standardized

7. What will the pilot lose sight of on the approach if seated below the Design Eye Point?

a. Some of the undershoot
b. Some of the overshoot
c. Peripheral objects especially at night
d. The sight view

8. What instrument is best for showing small change?

a. A digital display
b. An analogue display
c. A mixed digital/analogue display
d. Ultra/high-precision gyro instrument

9. What colour should the ‘Alert’ warning be on a CRT?

a. Bright red and flashing
b. Steady red
c. Flashing yellow/amber
d. Steady yellow

10. Which pitot/static instrument is most likely to be misread?

a. The ASI at night illuminated by a red light
b. The ASI at night illuminated by low intensity white light
c. The three point altimeter
d. The four point altimeter

 

 

 

 

 

11. A manually operated valve should be opened by:

a. turning it clockwise
b. turning it anticlockwise
c. turning either way
d. depends on the system it operates

12. The three types of anthropometric measurements are:

a. static, design, contour
b. contour, design, dynamic
c. static, dynamic, contour
d. static, dynamic, design

13. In the SHELL model, L stands for:

a. latent errors
b. long-term errors
c. lengthy errors
d. liveware

 

14. System Tolerance can be subdivided into:

a. protected and semi-protected systems
b. protected and endangered systems
c. protected and vulnerable systems
d. protected and quasi-protected systems

15. A flashing red warning light on a CRT normally indicates:

a. there is a fault in a critical system
b. emergency
c. alert
d. danger

16. Automation Complacency is:

a. overconfidence in the handling capability of the pilot
b. overconfidence in the handling capability of computers
c. overreliance on automation
d. the blind belief in automation

17. Mode error is associated with:

a. automation
b. hardware
c. INS
d. software

18. A danger of automation is that:

a. there can be greater delays between the performance of the crew and its ultimate effect
b. delays between the performance of the crew and its ultimate effect are shortened
c. delays between the performance of the crew and its ultimate effect are not appreciated
d. delays between the performance of the crew and its ultimate effect have no effect

19. Automation:

a. helps with unusual and unexpected situations
b. may result in a pilot being unaware of important information when dealing with an unusual and unexpected situation
c. increases reaction time when dealing with unusual and unexpected situations
d. decreases reaction time when dealing with unusual and unexpected situations

20. Automation can result in:

a. lack of information being passed between crew members
b. too much information being passed between crew members
c. confused information being passed between crew members
d. too much detailed information being passed between crew members
