Doctors bury their mistakes, and lawyers get to lock theirs away; but engineers build their mistakes out in the open for everyone to come and see.
- An Engineering Professor
It's all fun and games until someone puts out an eye...
- Any parent, any time, anywhere
The Beginnings of Risk Assessment
It could be argued that the clever application of technology, in the form of machines and so forth, is one of the notable characteristics of an industrialised society, and leads to a greater enjoyment of life - or at least, the appearance of a greater enjoyment of life.
There are also some who would argue that life is defined by risk itself, and that technology tends to turn life into a brain-bending boring affair from which all meaning is stripped away. The argument then boils down to what makes life worth living, which is beyond the scope of this entry.
Machines have been designed to perform functions that people prefer not to perform themselves, for reasons either of convenience or, more frequently, the danger involved. We have machines to do everything: from washing our clothes and dishes to entertaining us during the time we used to spend washing dishes and clothes. They even do some amazingly difficult and dangerous work, such as the handling, moving, or confinement of highly hazardous material.
Having machines to perform our most dangerous or undesirable tasks for us is a fine luxury indeed - unless they make things worse, of course. One might ask, 'What if the machines make things worse? What do we mean by worse, anyway?' The answers to such questions are not to be had easily, and they are at the heart of philosophical debate on the usefulness of any machine.
What Is Risk?
One of the most important tools for evaluating a machine's usefulness is the concept of risk. There are of course other uses for the concept of risk (most of which are similar in nature to the one applied to machines and industry): for example, fiscal and insurance policies, to name just two.
To understand what is meant by risk, it must first be clearly differentiated from another important concept: hazard.
In engineering risk assessment, 'hazard' is a term used to refer to a physical property or condition of a material (or a set of properties and conditions), in a situation where the possibility for harm exists. The term 'hazard' may on the surface look a lot like the definition of risk that will be given in just a moment, but the primary difference is that hazard is a property that is independent of frequency or consequence. In simple terms, a hazard is a source of danger, while the risk is a quantitative or qualitative expression of a possible loss, an expression that considers both the probability and the consequences.
When identifying a hazard, there is no consideration of the likelihood or credibility of accidents, or of any sort of prevention or mitigation. There is a very important distinction to be made here. Risks can be controlled, lessened, or minimised. Hazards either exist or do not exist; they are not controlled or minimised.
Many dictionaries define risk as 'the possibility of suffering harm or loss'. Further entries list synonyms such as 'danger', 'peril', 'jeopardy', and 'hazard'.
While such terms are fine for general use, the term 'risk' has come to be used more and more often by those who design and make machines, and has taken on a more detailed and specific meaning.
Anyone who has met an engineer knows that engineers are most comfortable with things that can be measured and quantified. To Risk Management pioneers, terms as nebulous and prone to human interpretation as 'danger' or 'peril' must have seemed quite unacceptable when applied to machines. Thus the term 'risk' has developed into a quantifiable concept that can be applied in the all-important decision-making process carried out before the making of the machine.
It should be noted that 'risk' is still a relative term. As discussed below, there are limits to how well and accurately we can predict probabilities and consequences, and therefore a lot of 'engineering judgement' (and other phrases that boil down to expert opinion) becomes necessary.
The need for risk-based decision-making arose in the early days of industrialised society, when it quickly became apparent that there was a need to ask very important questions, such as,
Will this machine help?
Will this machine work?
and, of course
Will this machine, in fact, kill us?
It became apparent rather quickly that a specific answer (such as Yes or No) was required before turning a machine on. This meant that the engineers had some work to do in order to come up with something the people could hang their hats on.
Engineers (and scientists, policy-makers, investors, governments, et al) then developed a means of quantifying risk. They quickly realised that risk was best quantified as a combination of two factors:
- How badly can something go wrong?
- How often can it happen?
This made it easier to come up with some sort of 'number' that could quantify the level of 'risk' resulting from a combination of consequence and frequency (which can be thought of as the probability of something happening in a given time span). As an added benefit to splitting consequence from frequency, each part of the equation could be weighted individually, according to certain criteria determined by the sensitivity of either factor in the final decision-making process.
That is, one could decide beforehand that, no matter what the likelihood, certain consequences are just too grave to accept.
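The split between consequence and frequency, with separate weights and a veto on unacceptably grave consequences, can be sketched as a toy calculation. The scales, weights, and threshold below are invented purely for illustration; a real assessment would derive them from detailed analysis.

```python
# Hypothetical risk number combining consequence and frequency.
# All scales, weights, and the veto threshold are invented for this example.

UNACCEPTABLE_CONSEQUENCE = 9  # consequences at or above this are vetoed outright


def risk_score(consequence, frequency_per_year,
               consequence_weight=2.0, frequency_weight=1.0):
    """Return a relative risk number, or None if the consequence is
    too grave to accept, no matter what the likelihood."""
    if consequence >= UNACCEPTABLE_CONSEQUENCE:
        return None  # 'certain consequences are just too grave to accept'
    # Weighting each factor separately lets the decision-makers make the
    # score more sensitive to severity than to frequency (or vice versa).
    return (consequence ** consequence_weight) * (frequency_per_year ** frequency_weight)


# A frequent minor event can outscore a rare moderate one:
print(risk_score(2, 10.0))   # minor consequence, ten times a year -> 40.0
print(risk_score(5, 0.01))   # moderate consequence, once a century -> 0.25
print(risk_score(10, 1e-9))  # grave consequence: vetoed -> None
```

Note that the resulting number is only meaningful relative to other scores from the same scheme, which is precisely the limitation discussed later in this entry.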
Through this application and the definitions used above, the modern engineering concept of risk came into being.
Understanding and Controlling Risk
Risk is determined using risk assessment methods that vary somewhat from application to application, but generally have a few key points in common.
A risk assessment begins with a hazard identification, then considers the effects of structures, systems, and components that prevent, detect, or mitigate accidents. Risk assessments may employ all sorts of methods to determine the frequency with which a particular hazard might result in an accident or other undesired event (sometimes rather endearingly called 'off-normal'), as well as the worst-case consequences that could result. It is at this stage that sciences such as mathematics (and particularly probability) and physics have a large part to play.
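As a very rough illustration of how the frequency side of such an assessment might look, the sketch below estimates how often an initial hazardous event progresses to an accident, given layers of controls that prevent, detect, or mitigate it. All the numbers are invented, and the controls are assumed to fail independently - a simplification a real assessment would have to justify.

```python
# Toy sketch: how controls change the numbers in a risk assessment.
# The hazard frequency and failure probabilities are invented for illustration.

initial_event_frequency = 0.1          # hazardous challenges per year (e.g. a leak begins)
control_failure_probs = [0.01, 0.05]   # independent prevent/mitigate layers

accident_frequency = initial_event_frequency
for p_fail in control_failure_probs:
    # The accident only progresses if this control layer also fails.
    accident_frequency *= p_fail

print(accident_frequency)  # roughly 5e-05 accidents per year
```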
This entry will not discuss the myriad methods used in rigorous risk assessment in engineering applications. Suffice it to say that a study of probability and physics is required for anyone professionally involved in engineering risk assessment.
The study of probability, as it relates to engineered machines and other industrial factors, is essential to determining half of the risk equation. The study of physics (and in particular the sciences that deal with phenomena such as fire and explosion), as well as the study of medicine and human anatomy, is crucial to understanding and predicting the consequences of an event. This makes up the other side of the risk equation.
There are many more disciplines involved than any one person can normally become an expert in; thus there is usually a large number of scientists, engineers, medical doctors, and environmental experts involved in engineering risk assessment.
The large amount of information and the number of people involved usually present quite a logistical problem when conducting large-scale industrial or engineering risk assessments. One problem that is common to all risk assessment, however, and must be recognised by those who rely on risk assessment to make critical decisions, is the fundamental one of attempting to quantify risk itself.
Advantages and Disadvantages of Quantified Risk
It is important to note that when employed rigorously, using the sciences of mathematical probability and physics, risk assessment gives a number that only represents some relative concept of more or less risk. Deciding what that actually means in plain English is the most difficult part of risk assessment.
At present, there are only a few units of risk for which any attempt has been made at a definition, and none has been defined officially. This is because there is no real means of developing a unit that could apply to consequences that vary greatly between accident scenarios, at least outside of military applications. (Perhaps one notorious example of a maximum consequence is the very macabre unit thought up to express the damage of a global nuclear war in terms of millions of human lives - the 'megadeath'.)
Risk exists as a sometimes arbitrary-seeming grouping or range that has some sort of loosely-defined consequences ascribed to it. Regardless of how one may wish to express risk, however, those who use risk assessment to make decisions must recognise, and remain cognisant of, the fact that such an expression of risk is useful for a single, all-important purpose: deciding how to control it.
While engineering risk assessments can be very complicated, quite frequently risk assessment is performed in a less rigorous manner using the concept of estimation. (The practice of routinely using over-estimation of risk, or a 'safety factor', is frequently referred to as 'conservatism' in risk assessment.) In fact, people probably employ some amount of risk assessment in their everyday lives, using their own estimation.
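The idea of conservatism can be illustrated as a deliberately over-estimated frequency compared against an acceptance limit. The figures and the limit below are hypothetical, chosen only to show how the safety factor tips a borderline decision towards caution.

```python
# Hypothetical illustration of 'conservatism' in risk assessment:
# deliberately over-estimating risk by applying a safety factor.

ACCEPTANCE_LIMIT = 0.01   # events per year judged tolerable (invented)

best_estimate = 0.002     # events per year, from analysis (invented)
safety_factor = 10        # deliberate over-estimation
conservative_estimate = best_estimate * safety_factor

# The best estimate would pass the limit; the conservative one does not,
# prompting further controls rather than simple acceptance.
print(best_estimate <= ACCEPTANCE_LIMIT)          # True
print(conservative_estimate <= ACCEPTANCE_LIMIT)  # False
```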
A fairly common example would be the decisions and actions involved when driving a car. The act of deciding to buckle one's safety belt when riding in a motor vehicle is a conscientious application of a risk assessment process.
In engineering terms, our seatbelt-wearing person has provided a control that mitigates the consequences of an accidental condition. The seatbelt does nothing to prevent the accident, it must be noted. To prevent accidents, one may study how to drive one's particular car, take courses in its safe operation, and learn the rules and regulations governing the routes one is likely to use while driving the vehicle.
In analogous engineering terms, such preventive human-interactive practices are termed 'administrative controls'. One might purchase a vehicle that is capable of stopping quickly, or one that also handles well under a variety of conditions, so as to make collisions less likely. Such measures are termed preventive 'engineered' controls when applied to an industrial process.
In industry, engineered controls are generally preferred over administrative controls, precisely because machines lack a brain or cognitive reasoning - a distinct advantage when it comes to maintaining 100% vigilance over a risky situation. Machines are far better than humans at waiting around for ages until something unexpected happens and then reacting in a very rapid and predictable manner, because they cannot get bored. Humans tend to wander off when things are slow - which, as any fireman can tell you, is precisely when things tend to happen suddenly.
An important and obvious caveat to the preference of engineered controls over administrative ones is the possibility that something can go wrong in a totally unplanned and unusual way. In such a case engineered controls may be inadequate or, even worse, contribute to the overall negative consequence of an accident if allowed to perform as they were designed.
In industry, particularly high-consequence industries such as some toxic chemical processing facilities and large commercial nuclear reactors, risk is controlled through a wide range of engineered controls of varying complexity, which are all backed up by human beings at some point. People who monitor and control safety systems at commercial nuclear power plants are frequently called 'Operators', and such a title has been applied elsewhere for similar positions at large-scale industrial plants.
Maintaining Risk Control
Once the risk has been assessed, the next step is, naturally, to try to control or manage the risk involved. After determining how best to control risk, maintaining the risk control becomes the highest priority. In fact, it is in maintaining control of the operation of highly hazardous industrial or engineering operations that the most significant expenditure of resources and time is usually required.
It is essential, then, to determine how important a risk-controlling procedure or practice may be. Unfortunately, and sometimes tragically, maintaining the risk control itself is the step most often skipped over, particularly with very complex processes and applications.
It seems logical that something that is relied upon to provide some amount of protection from a risky event would be maintained and treated as a Very Important Thing for Safety. Logically this is obvious, but as the complexity of machines and the systems incorporating them increases, so too does the difficulty of maintaining controls on small components that may seem to be insignificant parts of the overall process.
The problem is that humans are not particularly noted for their absolute adherence to logic. This is why the requirement for properly documenting the risk assessment process is so important. People are far less likely to make decisions outside of the risk-based logic if they are forced to write down their thought processes on paper for their peers to read.
In fact, although it is common and quite useful to have one's peers review the documented assessment of risk, people are known to be quite easily swayed by a natural tendency to think as a group following a leader. (The psychology of this can be very intriguing and intricate.) It is therefore important to have members outside of the immediate peer group review and comment on the risk assessment process documented by the originating group of risk assessors.
Taking this philosophy further, the level of public involvement in documenting and reviewing risk assessment has greatly increased over the past 20 years or so, most notably as a result of some very tragic events. As can be seen in the case histories part of this article, certain large-scale tragic accidents have led to the adoption of laws (frequently termed 'rules' in the USA).
Related BBCi Site
Are we any good at assessing risk? Listen to a discussion.