Orynbekova A.B. – KSTU student (IM-15-1 group)
Zhalgas A.B. – KSTU student (IM-15-1 group)
Supervisor – Ayazbaeva S.S.
IRONIES OF AUTOMATION
Abstract – This paper discusses the ways in which automation of industrial processes may expand rather than eliminate problems with the human operator. Some comments are made on methods of alleviating these problems within the 'classic' approach of leaving the operator with responsibility for abnormal conditions, and on the potential for continued use of the human operator for on-line decision-making within human-computer collaboration.
The classic aim of automation is to replace human manual control, planning and problem solving by automatic devices and computers. However, as Bibby and colleagues (1975) point out: «even highly automated systems, such as electric power networks, need human beings for supervision, adjustment, maintenance, expansion and improvement. Therefore one can draw the paradoxical conclusion that automated systems still are man-machine systems, for which both technical and human factors are important.» This paper suggests that the increased interest in human factors among engineers reflects the irony that the more advanced a control system is, the more crucial the contribution of the human operator may be.
This paper is particularly concerned with control in process industries, although examples will be drawn from flight-deck automation. In process plants the different modes of operation may be automated to different extents; for example, normal operation and shut-down may be automatic while start-up and abnormal conditions are manual. The problems of the use of automatic or manual control are a function of the predictability of process behaviour, whatever the mode of operation. The first two sections of this paper discuss automatic on-line control where a human operator is expected to take over in abnormal conditions; the last section introduces some aspects of human-computer collaboration in on-line control.
Tasks after automation. There are two general categories of task left for an operator in an automated system. He may be expected to monitor that the automatic system is operating correctly, and if it is not he may be expected to call a more experienced operator or to take over himself. We will discuss the ironies of manual take-over first, as the points made also have implications for monitoring. To take over and stabilize the process requires manual control skills; to diagnose the fault as a basis for shut-down or recovery requires cognitive skills.
Cognitive skills. Long-term knowledge: An operator who finds out how to control the plant for himself, without explicit training, uses a set of propositions about possible process behaviour, from which he generates strategies to try (e.g. Bainbridge, 1981). Similarly, an operator will only be able to generate successful new strategies for unusual situations if he has an adequate knowledge of the process. There are two problems with this for 'machine-minding' operators. One is that efficient retrieval of knowledge from long-term memory depends on frequency of use (consider any subject which you passed an examination in at school and have not thought about since). The other is that this type of knowledge develops only through use and feedback about its effectiveness. There is some concern that the present generation of automated systems, which are monitored by former manual operators, are riding on their skills, which later generations of operators cannot be expected to have.
Working storage: The other important aspect of cognitive skills in on-line decision making is that decisions are made within the context of the operator's knowledge of the current state of the process. This is a more complex form of running memory than the notion of a limited-capacity short-term store used for items such as telephone numbers. The operator has in his head (Bainbridge, 1975) not raw data about the process state, but the results of making predictions and decisions about the process which will be useful in future situations, including his future actions. This information takes time to build up. Manual operators may come into the control room a quarter to half an hour before they are due to take over control, so they can get this feel for what the process is doing.
The implication of this for manual take-over from an automatically controlled plant is that the operator who has to do something quickly can only do so on the basis of minimum information; he will not be able to make decisions based on wide knowledge of the plant state until he has had time to check and think about it.
The ingenious suggestions reviewed in the last section show that humans working without time pressure can be impressive problem solvers. The difficulty remains that they are less effective when under time pressure. I hope this paper has made clear both the irony that one is not necessarily removing the difficulties by automating, and also the possibility that resolving them will require even greater technological ingenuity than does classic automation.