Automation is often seen as the path to higher efficiency, better safety and reduced human error.

But automation can actually give rise to human error. This can occur if the design of automated systems does not take into account the interaction between human operators and automation.

When a process is automated, humans are usually not removed from the picture entirely. A human operator generally monitors the automated system’s progress toward completion of the process, standing ready to take remedial action if the automated system fails to complete a task successfully.

Figure 1: Even in a fully automated process, human interaction is not far removed.

Humans are needed because elements of automation like robotics and artificial intelligence are not yet advanced enough to handle every scenario, especially unexpected, novel situations for which an appropriate reaction is not present in the automated system’s programming.

The best automation systems are designed with these limitations in mind, providing human operators with the power to make up for automation’s deficiencies while taking advantage of automation’s potential benefits like improved efficiency and quality.

Human error can actually increase, and lead to failure of an automated system, if automation design does not consider the effects of automation on the human operator: how automation changes the feedback the operator receives during a process, how it changes the kinds of tasks the operator performs, and how these changes affect the operator’s behavior.

Altered feedback

To properly control a system, an operator must have sufficient feedback on variations in relevant process variables. Automating a process often reduces the quality of feedback an operator receives compared to feedback received during manual work.

For example, as the pulp and paper industry automated its operations, machine operators were no longer required on the factory floor. Instead, operators monitored processes remotely. As a result, operators lost a number of sensory feedback elements, including auditory (the sounds machines make), haptic (the vibrations machines produce), and even olfactory (the smells created during processing). Operators had to adjust to a new method of controlling the process, monitoring a different set of indicators, such as digital gauges displaying pressures in the press section of the paper manufacturing line or temperatures in the drying section.

Automation design should take this change in feedback into account, compensating for the loss of one type of feedback by providing the operator with sufficient alternative feedback. Inadequate feedback makes it harder for operators to spot problems in an automated process or to determine the nature of an automation failure, and that difficulty leads to errors.
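As a rough sketch of what explicit alternative feedback might look like, the hypothetical Python example below turns raw press-section pressures and dryer-section temperatures into alarms the operator can act on. The variable names, limits, and readings are illustrative assumptions, not values from any real paper machine or control system.

```python
from dataclasses import dataclass

@dataclass
class ProcessVariable:
    """One monitored variable with alarm limits (illustrative values only)."""
    name: str
    units: str
    low_limit: float
    high_limit: float

    def check(self, reading: float) -> str | None:
        """Return an alarm message if the reading is outside its limits."""
        if reading < self.low_limit:
            return f"LOW alarm: {self.name} = {reading} {self.units} (< {self.low_limit})"
        if reading > self.high_limit:
            return f"HIGH alarm: {self.name} = {reading} {self.units} (> {self.high_limit})"
        return None

# Hypothetical press-section and dryer-section variables.
variables = [
    ProcessVariable("press section nip pressure", "kPa", 250.0, 400.0),
    ProcessVariable("dryer section temperature", "°C", 90.0, 130.0),
]

# Hypothetical readings received from the automation layer.
readings = {
    "press section nip pressure": 415.0,
    "dryer section temperature": 112.0,
}

for var in variables:
    alarm = var.check(readings[var.name])
    if alarm:
        print(alarm)  # surface the alarm on the remote operator display
```

The specific thresholds matter less than the design choice: deviations are made visible to the remote operator rather than left for the operator to notice unaided, partially replacing the sensory cues lost when work moved off the factory floor.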

Different tasks

Automation transforms jobs, changing the set of tasks human operators are expected to perform. Simple tasks like manual labor are often eliminated, replaced by more complex tasks that require a different set of skills.

For instance, if a material handling system is automated, operators may no longer need to manually load or move process materials from one machine to the next. On the other hand, to be able to recognize, diagnose and correct failures in the system, they need an understanding of how automated conveying systems work, including components such as conveyor roller motors or alignment sensors that could be points of failure.

Figure 2: An automated material handling system still requires human understanding when it comes to failure or maintenance.
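To illustrate the kind of diagnostic knowledge these altered tasks demand, the hypothetical Python sketch below maps a few assumed conveyor signals (roller motor current, belt speed, an alignment sensor, drive power) to likely failure causes. The signal names and rules are illustrative assumptions, not a real conveyor control API.

```python
def diagnose_conveyor(status: dict) -> list[str]:
    """Map raw component signals to likely failure causes (illustrative rules only)."""
    findings = []

    # A jammed or seized roller often shows high motor current with no belt movement.
    if (status.get("roller_motor_current_amps", 0.0) > 8.0
            and status.get("belt_speed_mps", 0.0) == 0.0):
        findings.append("Roller motor drawing high current with no belt movement: "
                        "possible jam or seized roller.")

    # An alignment sensor fault can mean the belt is tracking off-center.
    if not status.get("alignment_sensor_ok", True):
        findings.append("Alignment sensor reports fault: check belt tracking and sensor wiring.")

    # No power to the drive means nothing downstream will move.
    if not status.get("drive_power_on", True):
        findings.append("Conveyor drive is unpowered: check breaker and emergency stops.")

    return findings or ["No obvious fault detected; escalate to maintenance."]

# Hypothetical snapshot of signals the automated system exposes to the operator.
snapshot = {
    "roller_motor_current_amps": 9.2,
    "belt_speed_mps": 0.0,
    "alignment_sensor_ok": True,
    "drive_power_on": True,
}

for finding in diagnose_conveyor(snapshot):
    print(finding)
```

Whether this logic lives in software or in the operator’s head, someone has to know which components can fail and what their symptoms look like; that knowledge is exactly what the new supervisory tasks require.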

The potential for human error rises if operators are not properly trained and equipped with the knowledge appropriate for the altered tasks that automation presents.

Modified behavior

As processes are automated, direct manual human control is relinquished. Instead, the operator’s job is to monitor the automated system, ensuring its proper operation and successful completion of the process. This shift from active manual control to passive supervisory control separates an operator’s actions from the controlled process.

As a result, the human operator may develop a complacent reliance on automation and lose focus on monitoring the automated system. This is especially likely when the operator is responsible for monitoring multiple machines and systems and has additional tasks competing for their attention. Inadequate monitoring can cause operators to miss small errors in an automated system that cascade into more serious failures.
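One design response is to have the system itself escalate small, persistent deviations into explicit operator alerts, so a lapse in attention is less likely to let a minor error grow into a serious failure. The Python sketch below is a minimal, hypothetical illustration; the setpoint, tolerance, window size, and readings are assumptions, not parameters of any real system.

```python
from collections import deque

def make_drift_monitor(setpoint: float, tolerance: float, window: int = 5):
    """Return an update function that flags a deviation once it persists across several samples."""
    recent = deque(maxlen=window)

    def update(reading: float) -> str | None:
        recent.append(reading)
        # Escalate only when the window is full and every sample is off the setpoint,
        # i.e., the deviation is small but persistent rather than a one-off blip.
        if len(recent) == window and all(abs(r - setpoint) > tolerance for r in recent):
            return (f"Persistent deviation from setpoint {setpoint}: "
                    f"last {window} readings = {list(recent)}; notify operator")
        return None

    return update

# Hypothetical slow drift that a distracted operator might otherwise miss.
monitor = make_drift_monitor(setpoint=100.0, tolerance=2.0)
for reading in [100.4, 102.5, 102.8, 103.1, 103.6, 104.0]:
    alert = monitor(reading)
    if alert:
        print(alert)
```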

Trust issues also arise when manual processes are automated. For example, an operator might place too much trust in an automated system’s capabilities, leading to overreliance and lax monitoring. Conversely, if the operator places too little trust in the automated system, the operator might put too much effort into verification tasks, leading to reduced efficiency or overwork.

Because automation changes the feedback operators receive and the tasks they perform, operator behavior changes as well. The probability of human error rises if automation system design does not account for these effects on human operators.

Designs that focus on the interaction between humans and automation, and on their interdependence as a collaborative system, rather than solely on the technical implementation of the automated system (e.g., hardware and control algorithms), are more likely to produce a system with lower rates of human error and greater overall efficiency.
