How to deal with hazards in process operations more effectively
19/3/2025
10 min read
Feature
Despite the widespread use of risk assessments and hazard identification, accidents continue to occur in the energy industry and elsewhere. Perhaps more attention should be given to human factors in process safety, and particularly the conditions that raise the likelihood of errors. The co-author of new Energy Institute guidelines on the subject, Dr Marcin Nazaruk, founder and CEO of Psychology Applied, explains.
A very unfortunate workplace incident occurred a couple of years ago. Two technicians were planning to do some repair and maintenance on a valve in a process plant. A risk assessment was in place. However, on the day they approached the valve and started working on it, a jet of steam was released that killed both of them.
Despite the risk assessment being in place and the hazards controlled, they had opened the wrong valve. The case went to court, and the judge emphasised that the employer should have foreseen and prevented the risk of the men selecting the wrong valve.
The lesson of this sad story is that identifying and controlling hazards isn’t enough to effectively reduce risk in operations. That’s because something else is co-creating risk.
Hazards and error-traps
Hazards can be defined as anything with the potential to cause harm. There is a school of thought that hazards can be expressed in terms of energy (gravity, motion, mechanical, electrical, pressure, temperature, chemical, biological, radiation, sound), where injury occurs when the energy released exceeds what the human body can withstand.
‘Error traps’ are conditions that increase the likelihood of error, such as out-of-date procedures, training that misses key skills, a lack of certain tools, or confusing designs. When people face these challenges, they adapt in ways that can lead to error or risk. For example, if the procedure is unworkable, workers can develop their own procedure; if the tools aren’t right, they may fabricate their own. These adaptations are sometimes given misleading labels, but workers employ them to help get the job done.
Consider for example the case of two aerosol cans in a kitchen. One is a cooking oil spray; the other is an industrial lubricant that contains a toxic chemical. They are the same shape and colour and happen to have similar-looking labels. In this case, the hazard is the toxicity of the lubricant that may lead to harm if ingested. The potential error is using the industrial lubricant when cooking; the error-producing conditions are the visual resemblance of the two cans and their proximity in storage.
Here is an industrial example (see Fig 1). A wall-mounted panel with buttons is used to control a process. In front of the panel, pipes have been installed which partially block access to it, so that an operator has to reach through the pipes. Hazards could include electrocution, hand entrapment, burns, hearing damage, or a potential pressure release.
Fig 1: How physical barriers can increase operational risk
Source: Psychology Applied Limited
But there are more things that increase the risk of failure. For example, the pipes in front of the panel make it harder to operate, restricting visibility and hand movement. And what if it’s the middle of the night, this should be a two-person job but the colleague has been called to a different task, the work instructions do not cover how it can be done by a single operator, and the radio to the control room is broken? It’s difficult to classify these factors as hazards, because on their own they don’t have the potential to cause harm in the way physical hazards do. Yet they do increase the likelihood of failure, and the risk that the operator will press the wrong button by mistake.
How do you control risks before a job?
The traditional UK risk assessment process consists of five steps: identify hazards, assess risks, control hazards, record them and review.
That works well for hazards, but the process needs to be expanded to include identifying error-producing conditions and potential errors, and then considering how to address those and control the associated risk.
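To make this concrete, below is a minimal sketch, in Python, of what an extended task-based risk assessment record could look like, with error-producing conditions and potential errors captured alongside hazards. The class and field names are illustrative assumptions, not taken from EI 3579.

```python
# A minimal, hypothetical sketch of a task-based risk assessment record
# that captures error-producing conditions (error traps) and potential
# errors alongside hazards. Field names are illustrative only.
from dataclasses import dataclass, field
from typing import List


@dataclass
class Hazard:
    description: str          # e.g. "pressurised steam in the line"
    controls: List[str]       # e.g. ["isolation", "permit to work"]


@dataclass
class ErrorTrap:
    condition: str            # e.g. "valve position not visually indicated"
    potential_error: str      # e.g. "wrong valve selected"
    mitigation: str           # e.g. "add position indicator; peer check"


@dataclass
class TaskRiskAssessment:
    task: str
    hazards: List[Hazard] = field(default_factory=list)
    error_traps: List[ErrorTrap] = field(default_factory=list)

    def review_questions(self) -> List[str]:
        """Prompts that push the assessment beyond hazards alone."""
        return [
            f"What makes '{self.task}' difficult for the people doing it?",
            "Which conditions could lead someone to the wrong valve, tool or step?",
            "How would the worker know, at the point of work, that an error had occurred?",
        ]
```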
The Energy Institute has published new, free guidance on how to identify error-producing conditions before they contribute to risk: EI 3579 Guidance on human factors in task-based risk assessment, which I wrote with Piotr Cichowicz of Psychology Applied. It is not a replacement for safety critical task analysis, but a practical tool to help manage risks associated with human factors.
Case study
Let’s say that several workers are told that they must empty a chemical tank. The task involves fitting a hose, opening a valve and then, when finished, closing the valve and disconnecting the hose. The hazards are physical contact with the corrosive chemical contained in the tank, or releasing it into the environment. The controls applied were chemical awareness and a spill response procedure.
But when we engaged with the technicians doing this work, and asked what makes this task difficult, we learned that they might disconnect the hose before closing the valve, so the chemical leaks out. This is a potential error. The error-producing condition is that the valve, which is actuated externally by turning a wrench, has no visual indication of its position (open or closed).
With that insight, how can we prevent this error? In this case, the employer decided to redesign the hose connection to make it impossible to disconnect the hose while the valve is open, and also added a visual indication of valve position. What is important is that this aspect of risk was not visible to those who did the original risk assessment, so the original risk profile and calculated likelihood of injury didn’t reflect the actual risk.
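Continuing the illustrative sketch above (and assuming the Hazard, ErrorTrap and TaskRiskAssessment classes defined there), the case study might be recorded as follows. The structure is an assumption for illustration, not the format used by the employer or the EI guidance.

```python
# Recording the tank-emptying case study with the illustrative sketch above.
assessment = TaskRiskAssessment(
    task="Empty the chemical tank via a temporary hose",
    hazards=[
        Hazard(
            description="Contact with the corrosive chemical, or release to the environment",
            controls=["chemical awareness", "spill response procedure"],
        ),
    ],
    error_traps=[
        ErrorTrap(
            condition="Valve actuated by wrench has no visual open/closed indication",
            potential_error="Hose disconnected before the valve is closed",
            mitigation="Redesigned hose connection plus a visual valve position indicator",
        ),
    ],
)

# Prompts a review team could work through with the technicians doing the job.
for question in assessment.review_questions():
    print(question)
```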
How do you control risks during the job?
That is an example of a planned situation. But job sites are dynamic. They often change and require people to adapt to stay safe. For that, a last-minute risk assessment helps workers check whether any conditions have arisen that increase risk.
Below are several techniques which have been tested in various industries.
One is called a ‘15-second scan’, and involves an operator stopping work and, starting from an arbitrary point, viewing the entire work area systematically by slowly turning their head or body. This is based on research in human vision and attention, which indicates that when we scan the area around us systematically, following a pattern such as a horizontal sweep, we are more effective at spotting potential sources of danger. The same approach is used to train aeroplane pilots to scan outside the cockpit for hazards and systematically review their gauges.
A second technique is called ‘look, point and call out’. It has been used by train drivers in Japan to reduce the risk of passing a red signal. It is a way of focusing attention on a single point; verbalisation activates cognitive processes that reduce the risk of error. I use this when I fuel my diesel car to reduce the risk of filling the tank with petrol. When I choose the nozzle marked ‘diesel’ from the multiple ones at the pump, I say to myself: ‘This is the diesel pump.’ I point to the car and say, for confirmation: ‘This is a diesel car.’ It is a simple behavioural self-management technique for error reduction. Other techniques include self-checks, peer checks and having criteria for when to stop the job.
It is important to emphasise that it’s not as simple as just asking people to do a 15-second scan. This is a skill that needs to be practised and developed into a habit before it works effectively.
Common challenges with implementation
There are some pitfalls in implementing these techniques. When distinguishing between hazards and error traps, it is common to blame behaviours (such as selecting the wrong tool) or cognitive states (complacency, underestimating the risk) for a mistake. This is not very helpful.
While cognition is a factor in process safety, humans are bad at viewing the causes of events objectively. For example, the fundamental attribution error creates a tendency to point to personal characteristics as the problem. Secondly, people tend to find what they look for, seeking out information that confirms their pre-existing view of the world.
Concluding that an accident is caused by ‘not being aware’ or ‘not knowing’ is not good enough. We need to ask what is responsible for giving people awareness or knowledge in the workplace.
Organisations don’t employ random people off the street. If workers don’t have the knowledge or awareness they should have, that’s not necessarily their fault: something (or someone) in the organisation should have provided that information, or selected people who already had it. If we conclude that the root cause was ‘not knowing’, then we gravitate towards solutions like ‘remind’, ‘refresh’ and campaign posters, which are actually not that effective at reducing risk.
Moreover, the bias of people ‘finding what they were looking for’ means that giving workers an updated template will only result in them tweaking it to confirm what they already know. If they aren’t prepared for it, workers will use the new template to give you more of the same.
Someone is needed to ensure the quality of the output, and to achieve common understanding and use of terminology.
Based on work with front-line employees, the quality of hazard identification can be classified into three levels.
- Level 1: using a guide word for the relevant issue, such as ‘gravity’.
- Level 2: translating that into a generic risk, so ‘gravity’ becomes ‘dropped object’. Better, but not good enough.
- Level 3: describing the hazard in a contextual manner. For example: ‘When the forklift driver puts the pallet on the top storage rack, if the light is poor and the driver pushes it too far, it will fall.’ This connects the potential danger with the activity and the additional conditions that exist in the work environment.

Getting from the first level to the third requires practice, repetition and training.
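As a rough illustration of those levels, the same hazard could be represented at each level as follows. The enum and helper function are an assumed framing for this article, not part of the training material described here.

```python
# Illustrative only: the three levels of hazard description quality.
from enum import IntEnum


class HazardDescriptionLevel(IntEnum):
    GUIDE_WORD = 1    # e.g. "gravity"
    GENERIC_RISK = 2  # e.g. "dropped object"
    CONTEXTUAL = 3    # activity + conditions + consequence


examples = {
    HazardDescriptionLevel.GUIDE_WORD: "gravity",
    HazardDescriptionLevel.GENERIC_RISK: "dropped object",
    HazardDescriptionLevel.CONTEXTUAL: (
        "When the forklift driver puts the pallet on the top storage rack, "
        "if the light is poor and the driver pushes it too far, it will fall."
    ),
}


def needs_coaching(level: HazardDescriptionLevel) -> bool:
    """Flag hazard descriptions that have not yet reached the contextual level."""
    return level < HazardDescriptionLevel.CONTEXTUAL
```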
We developed a training course to help front-line workers and supervisors identify hazards and error traps, differentiate between the two, identify critical steps, apply situational awareness and dynamic risk assessment, develop sensitivity to changes, predict movement and what can collide with what, and scan the area around them.
Having received this training, workers returned to the workplace, where they practised these techniques, reinforced with the help of team leaders on an ongoing basis. Before-and-after tests revealed that the training had increased the number of identified hazards by 50% and improved the quality of hazard identification by 40% (shifting from level 1 towards level 3). It also increased identification of error traps.
How can these techniques be integrated into organisational processes?
Risk management is an ecosystem of different components that need to be aligned. There are the policies or procedures, from which stem the templates that allow people to comply, and the training that provides the skills to use those templates and comply with the policy. There is also coaching of others, data analysis and the tracking of KPIs (key performance indicators).
Some error traps, through poor design, are in place all the time. Others, like a broken radio, or someone being off work, are temporary and change all the time. But any change will affect the risk profile, so workers need to know how to identify temporary aspects of risk.
Imagine that your team identifies a hazard or an error trap before or during the job. Now what? What happens with that information? Where is it recorded? Who records it? Who receives that report? Who validates it? Who decides what to do with it? What action is taken? Who provides the resources if a new tool is needed, or time spent to rewrite the procedure? How is that information used to reduce the risk and provide feedback to the workers? How is this information shared more broadly?
These are questions that should be considered when deploying these techniques in an organisation.
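As one possible way to pin those questions down, below is a minimal sketch of an error-trap report lifecycle. The statuses, roles and fields are assumptions for illustration, not taken from the EI guidance.

```python
# A minimal, hypothetical sketch of how an identified error trap could be
# recorded and tracked. Statuses, roles and fields are assumptions.
from dataclasses import dataclass
from datetime import date
from enum import Enum
from typing import Optional


class ReportStatus(Enum):
    RAISED = "raised"                # identified before or during the job
    VALIDATED = "validated"          # confirmed by a supervisor or adviser
    ACTION_AGREED = "action agreed"  # resource owner has committed time or tools
    CLOSED = "closed"                # fix verified and fed back to the workers


@dataclass
class ErrorTrapReport:
    description: str                      # what was spotted, in contextual terms
    raised_by: str                        # who records it
    raised_on: date
    validated_by: Optional[str] = None    # who validates it
    agreed_action: Optional[str] = None   # what action is taken
    resource_owner: Optional[str] = None  # who provides the tools or time
    status: ReportStatus = ReportStatus.RAISED

    def feedback_message(self) -> str:
        """What gets shared back to the team that raised the report."""
        if self.status is ReportStatus.CLOSED and self.agreed_action:
            return f"Closed: {self.agreed_action}"
        return f"In progress ({self.status.value})"
```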
- Further reading: EI 3579: Guidance on human factors in task-based risk assessment
- ‘Learning from Normal Work: How to proactively reduce risk when nothing goes wrong’, Marcin Nazaruk, PSJ Process Safety, November 2023.
- ‘Why are process safety incidents still happening and what can we do to stop them?’ Dr Stephen Bater FEI, Director of Process Safety Auditing Limited, recommends implementing the EI Process Safety Management Framework to maintain safe operations.
- G+ Technical Manager Mariana Carvalho explains why the new G+ lifesaving rules for offshore wind are so important.