EDITORIAL
May 1, 2005

Incorporating Uncertainty and Variability in Engineering Analysis

Publication: Journal of Water Resources Planning and Management
Volume 131, Issue 3
Engineering education has always been strong in providing students with the tools needed to solve problems. As civil engineers, we learned methods for calculating runoff, solving for flows and pressures in pipe networks, and applying various optimization techniques. The basic concept is to understand what data are needed, where or how that information can be acquired, how to choose and apply the appropriate methodology, and then how to use the results of that analysis to design or analyze a system. The underlying assumption is that if you use the correct data and select the proper methodology, then the resulting answers will be the correct solutions. However, to a lesser degree, engineering students are also exposed to the idea that engineering is not quite as black and white, or more appropriately, as deterministic, as strict application of formulas may suggest. There are shades of gray that must be considered in the form of variability and uncertainty, which are factors in all engineering (and other) decisions. It is this topic that I will discuss here, both in general and as it applies to the subject of this special issue of the Journal—water distribution system analysis.

Variability and Uncertainty

Though there is some ambiguity as to the exact meaning and difference between variability and uncertainty, the following definitions from the book Risk Analysis—A Quantitative Guide by David Vose (2000) provide a good distinction between these terms. “Variability is the effect of chance and is a function of the system. Uncertainty is the assessor’s lack of knowledge (level of ignorance) about the parameters that characterize the physical system that is being modeled.” Vose then goes on to say that “total uncertainty is the combination of uncertainty and variability” and that “these two components act together to erode our ability to be able to predict what the future holds.”
There are many cases where the public is being exposed to the concepts of variability and uncertainty. Weather forecasts no longer state that it will rain tomorrow. Rather, they are posed in probabilistic terms: “there is a 90% chance of precipitation.” This statement actually includes both concepts of variability and uncertainty. Variability recognizes that it may rain in one neighborhood while another area a few miles away may not receive rain. There is also uncertainty in such a situation because the mathematical models used to predict weather are not perfect. Another example in the area of weather is in the prediction of hurricane tracks. Instead of offering one probable track, modern hurricane predictions provide a cone of uncertainty with a central line representing the most likely movement of the hurricane. The outer limits of the cone represent the spatial area within which meteorologists have determined, to some level of confidence, that the hurricane will likely strike. Another very different example is the way in which polling results for elections are reported. Candidate A is reported to be favored by 48% of the voters, while candidate B is supported by 45% of the voters with a margin of error of plus or minus 4%. This statement actually means that the pollsters think that candidate A is slightly ahead, but due to the underlying variability and uncertainty, the race is still too close to call. Variability in this case may be due to the fact that the interviewees may not actually know whether they will vote and for whom they will vote, while uncertainty is caused by the size and makeup of the poll and the polling techniques.
In the field of water resources engineering there are numerous examples of variability and uncertainty. In some studies, variability and uncertainty are explicitly considered; in other cases they are more subtly incorporated into the analysis; and in all too many cases they are completely ignored. Hydrologic analysis generally includes an explicit representation of variability and uncertainty. For example, floods and rainfall are typically treated probabilistically when designing dams, levees, and other structures by specifying probability distribution functions rather than point estimates.
However, in many areas of water resources planning, variability and uncertainty are largely ignored. Water supply planning projects typically express system performance in terms of "safe" or "firm" yield. Recent droughts have demonstrated that there is no such dependable level. In most water system master plans, water consumption is projected for the next 5, 10, and 20 years. However, the degree of uncertainty in these projections is generally not stated, and the implications of falsely precise predictions for the resulting designs are not examined. Similarly, uncertainty in the parameters used in distribution system models is usually ignored; rather, a most likely parameter value is selected and used in the analysis. Walski (2000) very aptly addressed this issue in his paper "Model Calibration Data: The Good, the Bad and the Useless." He shows that errors in field data can be quantified and that, for example, uncertainty in measuring flow and head loss during a C-factor test can render the resulting estimates essentially useless.
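To illustrate the kind of error propagation Walski describes, the following is a minimal sketch in Python. The pipe dimensions, test conditions, and measurement-error magnitudes are hypothetical illustrations, not values taken from Walski (2000); the point is simply that a modest gauge error on a small head loss translates into a very wide range of plausible C-factors.

```python
# Sketch: propagating measurement error into a Hazen-Williams C-factor estimate.
# All pipe dimensions and error magnitudes below are hypothetical illustrations,
# not values from Walski (2000).
import random

L_PIPE = 500.0      # pipe length (m)
D = 0.30            # pipe diameter (m)
Q_TRUE = 0.05       # true flow (m^3/s)
HLOSS_TRUE = 0.8    # true head loss over the test section (m)

def c_factor(q, h_loss):
    """Back-calculate the Hazen-Williams C-factor from a flow/head-loss test."""
    return q * (10.67 * L_PIPE / (h_loss * D**4.87)) ** (1 / 1.852)

# Assumed measurement errors: +/-5% on flow, 0.3 m standard deviation on each
# of the two gauge readings used to compute head loss.
samples = []
for _ in range(10_000):
    q_meas = Q_TRUE * random.gauss(1.0, 0.05)
    h_meas = HLOSS_TRUE + random.gauss(0.0, 0.3 * 2**0.5)
    if h_meas > 0:                      # discard physically meaningless draws
        samples.append(c_factor(q_meas, h_meas))

samples.sort()
print("true C        :", round(c_factor(Q_TRUE, HLOSS_TRUE)))
print("median C      :", round(samples[len(samples) // 2]))
print("90% interval  :", round(samples[int(0.05 * len(samples))]),
      "-", round(samples[int(0.95 * len(samples))]))
```

Running this sketch shows the back-calculated C-factor spreading over a very wide interval, which is the sense in which such test data can become essentially useless.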
Water distribution system analysis is particularly challenging in terms of examining uncertainty because of the spatial and temporal character of the data involved, the use of long-range forecasts, temporal and spatial autocorrelation, the significantly different time and space scales that must be included simultaneously in the analysis (forecasts in years, dynamic behavior in minutes), and the high degree of "model uncertainty" in understanding all of the operative mechanisms (for example, chlorine residual decay or THM formation) well enough for prediction. Further, it is very difficult to define an aggregate measure of water distribution system performance that accounts for temporal and spatial factors and for both hydraulic and water quality aspects. Analysis is typically performed so that the system reliably performs above certain standards at the lowest cost, but metrics that reward increased performance, or allow for occasional substandard performance, are not widely used.
Another critical but little-noted issue is how best to make choices when presented with information about uncertainty. Most decision makers (and most engineers) prefer a clear set of choices, typically based on a single aggregate measure of performance, and most analyses provide no more than that. When multiple measures of performance are incorporated, the problem becomes a multicriteria decision problem, adding further complexity. When uncertainty is included, choices must frequently be made between better "average" performance and reduced risk of poor performance. The well-known technique of Monte Carlo simulation (discussed below) can provide probability distributions of performance measures rather than simple point estimates. From these distributions, simpler statistical measures, such as a mean and standard deviation, can be developed, but the decision maker must still make trade-offs in the final decision. Examples include the explicit choice between higher mean performance and reduced risk (a classic problem in financial investment decisions such as stock market portfolio analysis, where the "risk-return" issue is paramount [Males 2002]) or between lower cost and increased reliability.
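As a small, hypothetical illustration of this trade-off (not drawn from any of the studies cited here), the sketch below compares two invented design alternatives by their mean simulated pressure and by the probability of falling below a minimum-service threshold.

```python
# Sketch: comparing two hypothetical alternatives on mean performance vs. risk.
# The performance distributions and threshold are invented for illustration only.
import random
import statistics

THRESHOLD = 20.0  # hypothetical minimum acceptable pressure (m)

def summarize(name, samples):
    mean = statistics.mean(samples)
    risk = sum(s < THRESHOLD for s in samples) / len(samples)
    print(f"{name}: mean = {mean:.1f} m, P(substandard) = {risk:.2%}")

random.seed(1)
# Alternative A: higher average pressure but more variable.
alt_a = [random.gauss(30.0, 6.0) for _ in range(10_000)]
# Alternative B: lower average pressure but tightly controlled.
alt_b = [random.gauss(27.0, 2.0) for _ in range(10_000)]

summarize("Alternative A", alt_a)
summarize("Alternative B", alt_b)
```

Alternative A has the better average, and Alternative B the lower risk of substandard service; choosing between them is a value judgment that the analysis can inform but not make.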
Thus, not only is it necessary to display the range of choices, with information about the uncertainty surrounding each choice, but it is also necessary to provide paradigms for decision making when uncertainty is present and explicitly stated.

Monte Carlo Simulation

There are several quantitative methods for incorporating uncertainty and variability in an analysis. The most commonly used approach, and the one most accessible to researchers and practicing engineers, is Monte Carlo simulation. This approach, named after the gambling capital of Monaco, uses the concept of "rolling dice" as part of a simulation process. A deterministic model of a process is developed and applied for many iterations, with the value of each model parameter in each iteration drawn at random from a probability distribution rather than set to a single best estimate. The results of the analysis can then be expressed probabilistically. Monte Carlo analysis is especially attractive because of the relative ease of applying it: several "add-ins" are available for spreadsheet software, or computer programming can be used to develop custom solutions.
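In code, the recipe is little more than a loop around the deterministic model. The sketch below is a hypothetical illustration in Python: the "model" is a single-pipe Hazen-Williams head-loss calculation, and the demand and roughness distributions are assumed for the example rather than taken from any real system.

```python
# Sketch of the basic Monte Carlo recipe: sample uncertain inputs, run the
# deterministic model, and summarize the outputs as a distribution.
# The model, pipe dimensions, and input distributions are hypothetical placeholders.
import random
import statistics

def deterministic_model(demand, roughness):
    """Stand-in for any deterministic analysis, e.g., one run of a network solver.
    Here: Hazen-Williams head loss (m) in a single 1000 m, 300 mm pipe."""
    return 10.67 * 1000.0 * demand**1.852 / (roughness**1.852 * 0.30**4.87)

results = []
for _ in range(5_000):
    # Draw each uncertain parameter from an assumed probability distribution
    # instead of using a single best estimate.
    demand = max(random.gauss(0.05, 0.01), 0.0)      # flow (m^3/s)
    roughness = random.triangular(90, 140, 120)      # Hazen-Williams C
    results.append(deterministic_model(demand, roughness))

results.sort()
print("mean head loss      :", round(statistics.mean(results), 2), "m")
print("5th-95th percentile :", round(results[250], 2), "-", round(results[4750], 2), "m")
```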
Proper application of Monte Carlo simulation requires that:
1. All possible sources of variability and uncertainty for the problem under study be identified;
2. The most important variables to be dealt with in the probabilistic analysis be determined; and
3. Probability distributions be assigned to these variables (not always a simple task).
The discipline of working through a problem in this framework is extremely valuable to the practitioner, forcing a deeper understanding of the problem and an explicit examination and admission of the uncertainties present in the analysis. Though Monte Carlo simulation has been widely used in many areas of analysis for almost half a century, there are relatively few examples of its application in water distribution system analysis. Murray et al. (2004) are applying Monte Carlo simulation to study the vulnerability of water distribution systems to terrorist contamination threats, based on a conceptual model developed by Propato and Uber (2004). Variables such as the location and characteristics of the contaminant are treated probabilistically through simulation of a large number of scenarios. Because Monte Carlo simulation involves many iterations, running a model of a large system repeatedly demands extensive computer resources; in this study, a large parallel processing capability is used to simulate many scenarios simultaneously. Similarly, in a paper in this special issue of the Journal, Nilsson et al. (2005) use Monte Carlo computer experiments to simulate a deliberate biochemical assault on a municipal drinking-water distribution system in order to generate empirical frequency histograms of mass-dose loadings at selected nodes in a typical water distribution system.
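Because the scenarios are independent, they parallelize naturally. The sketch below shows the general idea using Python's standard library; it is not the implementation used in the studies cited above, and simulate_scenario is a hypothetical stand-in for a full hydraulic and water quality simulation.

```python
# Sketch: spreading independent Monte Carlo scenarios across processor cores.
# simulate_scenario is a hypothetical stand-in for a full network simulation.
import random
from concurrent.futures import ProcessPoolExecutor

def simulate_scenario(seed):
    """Placeholder for one contamination scenario with randomly drawn characteristics."""
    rng = random.Random(seed)
    injection_node = rng.randrange(1000)         # hypothetical 1000-node network
    contaminant_mass = rng.lognormvariate(0, 1)  # kg, assumed distribution
    # ... a real study would run a hydraulic/water-quality model here ...
    return injection_node, contaminant_mass

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:
        outcomes = list(pool.map(simulate_scenario, range(10_000)))
    print(len(outcomes), "scenarios simulated")
```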
Harding and Grayman (2002) applied Monte Carlo simulation in an historical reconstruction of distribution system contamination as part of a legal case in California. Variables such as wellhead concentrations, system operation, and demands were treated probabilistically in the analysis. Rather than stating a single value, results were presented in probabilistic terms: for example, as a statement that the analysts were 95% confident that the true concentration of the contaminant of concern fell between an upper and a lower value. In a recent federal case, a similar technique was applied in Hawaii (Linda Akee et al. vs. the Dole Corporation et al. 2004). The significance of this case was that the methodology incorporating Monte Carlo simulation in water distribution system analysis passed the rigid "Daubert test" under the Federal Rules of Evidence, thus legitimizing this procedure in the eyes of the court.

What Do We Need to Do?

To consider uncertainty and variability effectively in water resources decision analysis, there is a need for both increased sensitivity to the issue and improved training in this area. Most students are exposed to some basic concepts of probability and statistics. In addition, some universities offer either required or elective courses that specifically address uncertainty.
However, to incorporate uncertainty effectively into engineering analysis, both engineering students and practicing engineers must be further sensitized to the pervasiveness of this concept. In reality, every piece of information and every result of a model application should be viewed in a multidimensional framework. The narrow view is that the information or model result is the "correct" answer. The broader view considers each piece of data or model result as one observation drawn from a probability distribution. That observation may be the most likely value, but there are other possible values, each with some finite probability, that must also be considered in any analysis. Once this concept is accepted and used, the engineer becomes more sensitive to the implications and impacts of uncertainty.

References

Harding, B. L., and Grayman, W. M. (2002). “Historical reconstruction of contamination in a distribution system incorporating uncertainty.” Proc., 12th Int. Society of Exposure Analysis Conf. and 14th Int. Society for Environmental Epidemiology Conf., Final program and abstracts, Vancouver, B.C.
Males, R. (2002). “Beyond expected value: Making decisions under risk and uncertainty.” US Army Corps of Engineers, Institute for Water Resources, Ft. Belvoir, Va., ⟨http://www.iwr.usace.army.mil/iwr/pdf/02r4bey_exp_val.pdf⟩ (accessed Feb. 7, 2005).
Murray, R., Janke, R., and Uber, J. (2004). “The threat ensemble vulnerability assessment (TEVA) program for drinking water distribution system security.” Proc., World Water & Environmental Resources Congress, EWRI-ASCE, Reston, Va.
Nilsson, K., Buchberger, S. G., and Clark, R. M. (2005). “Simulating exposures to deliberate intrusions into water distribution systems.” J. Water Resour. Plan. Manage., 131(3).
Propato, M., and Uber, J. G. (2004). “Vulnerability of water distribution systems to pathogen intrusion: How effective is a disinfectant residual?” Environ. Sci. Technol., 38(13), 3713–3723.
Vose, D. (2000). Risk analysis—A quantitative guide, Wiley, Chichester, U.K.
Walski, T. M. (2000). “Model calibration data: The good, the bad and the useless.” J. Am. Water Works Assoc., 92(1), 94–99.

Walter M. Grayman
Consulting Engineer, Cincinnati, OH 45215
