Following on from the philosophical pages, we assume here that certainty is only available through faith. But we will not assume that this implies that we are helpless and cannot act to fulfil a proper duty of care.
Dictionary definitions of certainty include words such as truth, surety, freedom from doubt, trustworthiness, reliability, dependability, unquestionableness, inevitability, indisputableness, definiteness, particularity or unambiguity.
We can unpack these meanings by noting that there are three groups. First is the age-old philosophical and practical question of what we mean when we say something is true (e.g. truth, surety). Second is the need to be able to trust what we say (freedom from doubt, trustworthiness, reliability, dependability, unquestionableness, inevitability, indisputableness). Third is the level of clarity of what we say (definiteness, particularity or unambiguity).
By contrast, dictionary definitions of uncertainty use opposite terms like irregularity, variable, erratic, unpredictable, indeterminate, skeptical, hesitancy, not assured or fully confident and ambivalent. Uncertainty is also a quality of being with respect to duration, precariousness, continuance or occurrence; a liability to chance or accident. We can distill these many aspects into another three kinds of meaning – changeableness (irregularity, variable, erratic, unpredictable), incompleteness (indeterminate, skeptical, hesitancy, not assured or fully confident and ambivalent) and risk (precariousness, continuance or occurrence; a liability to chance or accident).
These six kinds of uncertainty (truth, trust, clarity, changeableness, incompleteness and risk) are manifest in engineering in the following way. Truth and changeableness (together as one kind), with clarity and incompleteness, are the three structural attributes of any proposition based on any model of information. Trust is central to the making of decisions by ‘experts’ in managing and controlling risk.
How does this manifest itself in the scientific analysis of hard systems? By this we mean the engineering science of structures, hydraulics, thermodynamics, electronics etc. First these analyses use physical and measurable parameters such as loads, geometries, material properties etc. Models are (usually) mathematical relationships between the parameters that capture system behaviour and allow predictions of future behaviour. They are tested under laboratory conditions and are usually quite accurate. Unfortunately as we move away from those well-defined conditions in the laboratory to working situations in practice the accuracy of the relationship suffers and the actual performance becomes less certain. This new uncertainty is usually described as parameter uncertainty and model/system uncertainty.
Parameter uncertainty is the uncertainty of truth and changeableness of the values of the parameters – such as the load on a bridge. Model uncertainty is the uncertainty of truth and changeableness of the model or system given precisely defined parameters. Engineers have to judge whether a model is appropriate or not in a context and they do that on the basis of evidence. In both cases (parameter and system uncertainty) reliability analysts model the truth and changeableness using probability theory. Unfortunately it is difficult to capture lack of clarity and incompleteness this way.
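The probabilistic treatment of parameter and model uncertainty can be sketched in code. The following Monte Carlo fragment is only an illustration: the load and capacity distributions, and the model bias factor, are invented numbers for the example, not values from any real bridge.

```python
import random

random.seed(1)

def bridge_demand():
    """Parameter uncertainty: the load is not a single number but a
    random draw (assumed normal distribution, values in kN)."""
    return random.gauss(500.0, 75.0)

def bridge_capacity():
    """Parameter uncertainty in material properties and geometry,
    lumped into a random capacity (kN, assumed distribution)."""
    return random.gauss(800.0, 60.0)

# Model uncertainty: suppose test evidence suggests our analysis model
# over-predicts capacity by about 5% on average.
MODEL_FACTOR = 1.05

def estimate_failure_probability(trials=50_000):
    """Monte Carlo estimate of the chance that demand exceeds the
    model-corrected capacity."""
    failures = sum(
        bridge_demand() > bridge_capacity() / MODEL_FACTOR
        for _ in range(trials)
    )
    return failures / trials

print("estimated failure probability:", estimate_failure_probability())
```

Note that both kinds of uncertainty end up expressed the same way – as probability distributions – which is exactly why, as the text says, lack of clarity and incompleteness are hard to capture by this route.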
The lack of clarity in a model or piece of information is of at least three types. The first is that it is poorly constructed or difficult to understand – the only way to deal with that is to seek clarification. The second is that it may be vague and the third is that it may be ambiguous. For example, if I say to you that my friend John is of medium height then I communicate something to you clearly but not precisely. The information is vague or fuzzy. A scientist may demand that I be more precise and state his height in metres. If I say he is between 1.80 and 1.82 metres then I am more precise but less likely to be accurate, and I may have to spend resources to get that information. In a practical situation I may be able to solve my problem without spending that resource. For example, if I want to buy him a shirt which comes in three sizes of small, medium and large then I have all the information I need.
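The shirt-buying example can be made concrete with a fuzzy membership function. The break-points below (1.60 to 1.90 metres) are assumed purely for illustration:

```python
def medium_membership(height_m: float) -> float:
    """Degree (0 to 1) to which a height counts as 'medium' --
    a triangular fuzzy set with assumed break-points."""
    if 1.70 <= height_m <= 1.80:
        return 1.0                           # definitely medium
    if 1.60 < height_m < 1.70:
        return (height_m - 1.60) / 0.10      # shading up from 'small'
    if 1.80 < height_m < 1.90:
        return (1.90 - height_m) / 0.10      # shading down to 'large'
    return 0.0                               # definitely not medium

# John at 1.80-1.82 m is still largely 'medium': the information is
# vague, but good enough to choose between small, medium and large.
print(medium_membership(1.81))
```

The point of the fuzzy set is exactly the one made in the text: ‘medium’ is clear enough for the purpose at hand without being numerically precise.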
Information is ambiguous if it has more than one meaning. Whilst poetry and the arts thrive on ambiguity, science and engineering require practitioners not to be equivocal and to be as clear and precise as possible. Therefore it is part of an engineer’s duty to explore any potential ambiguity and remove it wherever possible. There is no mathematics that can deal with ambiguity – it has to be managed by users. Ambivalence is similar to ambiguity – it is having mixed or contradictory feelings.
Incomplete information is not fully formed, unfinished and lacking in something. For example a structural model of a bridge says nothing about its aesthetic value. However a strict definition of incompleteness is ‘that which we do not know’ – and therein lies the difficulty of anticipating surprises or unforeseen events.
Donald Rumsfeld, the one-time US Secretary of Defense, was ridiculed when he famously said, ‘There are known knowns – these are things we know we know. There are known unknowns… these are the things we know we do not know. But there are also unknown unknowns; these are the things we don’t know we don’t know’.
But Rumsfeld was right. Unless you are incredibly arrogant you must admit that there are many things that you don’t know even exist. They may be known by others but not by you – you are completely unaware. Until lateral torsional buckling of beams was identified in the late 19th century it was an unknown unknown to everyone – no-one could recognise it even if presented with it in practice. When some of these things are identified then they become a known known. But perhaps, as an individual, you stop studying them before getting to the deepest level of understanding – you know they exist but that’s all – they are a known unknown to you. We all have lots of these – quantum physics is one such for most people.
The recognition of the importance of incompleteness is very humbling because it makes us realise that unanticipated events may happen. If the consequences are minor then we can cope. The central concern is managing surprises where the consequences are serious. This century began with the big surprise we call 9/11 and continued with the surprise financial crisis of 2008. In today’s complex world there will be more surprises and we have to prepare for them as best we can.
Trust is a key commodity, easily broken and very difficult to rebuild. When people who have been trusted are found wanting, enormous damage ensues. There has been, in recent years, a loss of confidence in the idea of an ‘expert’. A general questioning of authority, scandals in the City, corrupt lawyers who end up in jail, and large-scale engineering failures have all contributed to this loss of faith. Professional people have to grapple with complex problems within complex systems and have to make judgments which lay people have to take on trust.
Risk is about the future – it is the chance or likelihood of a specific hazard set in a specific context actually coming about. Risk is the chance of some state of affairs happening at some time in the future combined with the consequences that will follow.
Risks can only be understood in context. To identify, understand, monitor and change them we need dependable evidence. Evidence is information that helps us to come to a conclusion. It is the basis or reason for us to believe something – though we must hastily agree that, as individuals, we can quite easily believe something without evidence.
One final aspect of risk is worth highlighting. When a small amount of damage to a system can cause disproportionate consequences, that system is vulnerable even if the chances of that damage are very low. Risk consists of three equally important factors – the three Cs of chance, consequences and context. Much of the risk literature focuses on chance. The much neglected analysis of vulnerability concerns disproportionate consequences. Both suffer from insufficient explicit consideration of context.
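These two ideas – risk as chance combined with consequences, and vulnerability as disproportion between triggering damage and consequences – can be caricatured in a few lines of code. All the numbers and the ratio threshold below are invented for illustration:

```python
def risk(chance: float, consequences: float) -> float:
    """Expected-loss style measure: chance (0..1) of a specific hazard
    occurring, combined with its consequences (in some loss unit).
    Context determines what both numbers mean."""
    return chance * consequences

def is_vulnerable(damage: float, consequences: float,
                  ratio_threshold: float = 10.0) -> bool:
    """A system is vulnerable when small damage produces
    disproportionate consequences; the threshold ratio is assumed."""
    return consequences / damage > ratio_threshold

# A low-chance hazard with huge consequences can dominate the risk
# picture over a likely but minor one.
print(risk(0.001, 1_000_000) > risk(0.5, 100))       # True

# Small triggering damage, large consequences: vulnerable.
print(is_vulnerable(damage=1.0, consequences=50.0))  # True
```

The sketch also shows why focusing on chance alone misleads: a vulnerable system can have a very small `chance` and still carry an unacceptable risk.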
A common assumption in conventional risk and reliability studies is that all uncertainty is either aleatoric (random or stochastic) or epistemic (knowledge-based). This is simply insufficient for managing uncertainty in engineering.
Here we maintain that the core attributes in the discrimination of an object are fuzziness, incompleteness and randomness (FIR). We use them to classify the essential structural uncertainty because they characterise both the structural uncertainty of the objects (the parts that make a whole) and the connections between them.
F is for Fuzziness – the vagueness or imprecision of the definition of an object.
I is for Incompleteness – that which we do not know about an object.
R is for Randomness – a lack of a specific pattern in data concerning the parameters of the state of an object.
FIR are definitional structural attributes of uncertainty. All other attributes, such as ambiguity (unclear meaning), unreliability (untrustworthiness) and capriciousness (inconsistency and being erratic), are interpretive and natural language attributes of uncertainty. In other words they give meaning to a combination of fuzzy (F), incomplete (I) and changeable (R) propositions. As such the meaning emerges from these three structural attributes in a particular context. We show this in the figure below where FIR are the three axes.
For example ambiguity in the interpretation of the meaning of a proposition emerges when there is a high level of fuzziness and incompleteness. The role of changeableness (randomness) is low. Likewise an interpretation of a proposition will be conflicting or contradictory when there is a high level of fuzziness, incompleteness and changeableness – i.e. when all three FIR are present.
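The way interpretive attributes emerge from positions on the FIR axes could be caricatured as a simple classifier. The threshold and the label boundaries below are assumptions for illustration only, not part of the FIR scheme itself:

```python
def interpret_fir(f: float, i: float, r: float, high: float = 0.6) -> str:
    """Crude reading of where a proposition sits on the FIR axes
    (each of f, i, r in 0..1). Threshold and labels are illustrative."""
    if f > high and i > high and r > high:
        return "conflicting or contradictory"  # all three FIR present
    if f > high and i > high:
        return "ambiguous"                     # fuzzy + incomplete, low randomness
    if r > high:
        return "erratic or capricious"
    if i > high:
        return "incomplete"
    if f > high:
        return "vague"
    return "relatively clear"

print(interpret_fir(0.8, 0.8, 0.1))  # ambiguous
print(interpret_fir(0.9, 0.9, 0.9))  # conflicting or contradictory
```

Real interpretation is of course contextual, as the text stresses; the sketch only shows how a natural-language attribute can be read off a combination of the three structural ones.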
In order to find a practical, informative and relatively simple way of representing uncertainty we have developed the use of interval probability theory into the colourful Italian Flag.
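As a rough sketch of the idea (the exact calculus of interval probability is developed elsewhere), an Italian Flag splits the unit interval into evidence for a proposition (green), evidence against it (red), and a white band of ‘don’t know’ in between. The numbers in the example are invented:

```python
def italian_flag(lower: float, upper: float) -> dict:
    """Interval probability [lower, upper] that a proposition is true.
    green = support for, red = support against, white = what is left
    open by incompleteness of the evidence."""
    assert 0.0 <= lower <= upper <= 1.0
    return {
        "green": lower,          # evidence for the proposition
        "white": upper - lower,  # the 'don't know' band
        "red": 1.0 - upper,      # evidence against the proposition
    }

# e.g. evidence supports the proposition to degree 0.6 and rules it
# out to degree 0.1, leaving a white band of 0.3.
print(italian_flag(0.6, 0.9))
```

The appeal of the representation is that incompleteness is shown explicitly as the width of the white band, rather than being forced into a single point probability.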