Open access
Case Studies
Jun 2, 2021

Assessing Data Adequacy for Determining Utility-Specific Water Loss Reduction Standards

Publication: Journal of Water Resources Planning and Management
Volume 147, Issue 8

Abstract

In 2017, urban water retailers in California reported 364 million m3 in real water losses within their distribution systems, representing on average 7.12% of their total water delivered that year. Recognizing the value of these water losses, the California State Water Resources Control Board (SWRCB) has proposed setting economic water loss reduction standards—regulatory water loss levels—that are individually tailored based on utility-specific data. Our research analyzed data sets currently available to the SWRCB, including water audits and electronic annual reports, as well as supplemental data collected by utilities, to determine if the information currently available is adequate for creating individually tailored water loss reduction regulations. Adequacy was evaluated based on the data’s completeness, validity scores, and consistency. We found that, given the current state of the data, the resulting standards may not be representative of individual utilities and that annual data variability may be greater than a proposed reduction. It is recommended that entities considering similar individual water loss reduction targets consider timing, the data collection tool, and data selection when developing utility-specific real loss reduction policy.

Introduction

In 2017, urban water retailers in California reported 364 million m3 (295,196 acre-ft) in real water losses in their distribution systems, representing a loss of both $167 million and approximately 803.1 GWh used to produce that water (Copeland and Carter 2017; Department of Water Resources, State of California 2019). Water losses from a public utility’s distribution system can occur via leaks, unauthorized consumption, data or administrative errors, and metering system inaccuracy or failure (USEPA 2010). Leaked or unaccounted-for water translates directly into lost revenue for utilities and requires them to make greater source withdrawals, potentially depleting water sources and impacting their future reliability.
To reduce water losses in the distribution system, SB 555 (California State Senate 2015) directs the State Water Resources Control Board (SWRCB) to set water loss performance standards for urban retail water suppliers (defined as those that serve more than 3,000 connections or that produce more than 3,000 acre-ft of water each year) while accounting for the economic impact these standards will have on individual retail water utilities. The SWRCB has proposed tailoring standards to each utility based largely on data taken from the utility’s water audits and electronic annual reports. This differs from nontailored approaches such as overall percentage reductions in water loss or maximum allowable percentage losses. Unlike individually tailored requirements, a uniform standard would be applied as a blanket regulation for acceptable water losses for all water retailers within a jurisdictional area. While nontailored approaches require fewer data from each water retailer, they disproportionately impact particular utilities.
This case study examines whether the data currently available in California from water audits and electronic annual reports are adequate for setting individual water loss standards in terms of the data’s completeness, validity, and consistency. This analysis does not assess economic methods to calculate a water loss standard. Since the method of standard calculation can have as much impact on any resulting water loss reduction as the data used in the calculation, this analysis cannot assess the sensitivity or numerical impact of the data on a resulting regulation. Instead, it focuses on whether the data used in such a calculation are of adequate quality to represent a specific utility, highlighting deficiencies in the existing data and the potential impacts of the data on tailored water loss regulations. While this research directly references California’s data and efforts, it can be used to highlight general data issues that should be considered by any entity pursuing individually tailored water loss standards. Ultimately, any standard that is created is only as reliable as the data it is based on.

Background

In light of its history of droughts and its increasing population, California has passed several regulations that promote reduced water consumption (Lund et al. 2018). In 2009 the California Senate passed the 20×2020 Water Conservation Plan, SBX7-7, which outlined specific activities to achieve a 20% reduction in urban per capita water demand by 2020 (California Department of Water Resources 2010). In keeping with this plan and emerging from a severe drought in 2012–2016, the state was motivated to pass additional legislation such as the Title 20 Water Efficiency Standards (California Energy Commission 2016), which mandates that faucets, showers, and toilets sold in California as of 2016 meet water efficiency requirements, and SB 606 (California State Senate 2018), which sets indoor and outdoor urban water budgets. SB 555 (California State Senate 2015), establishing economic water loss performance standards for urban water retailers, is yet another regulation that has been driven by the goal to use water more efficiently throughout the state.
Determination of an appropriate economic standard requires regulations that consider the activities a utility can undertake to reduce its real losses and the specific costs and benefits of doing so. How these regulations are formulated, such as maximizing the benefit-to-cost ratio to determine an “optimal” leak rate or basing standards solely on a benefit-to-cost ratio greater than one (the “breakeven” leakage), will be decided by the prevailing regulator. However, an economic standard must consider both the costs and the benefits to the individual utility of meeting some prescribed water loss level. The derived regulatory standard will be affected just as much by the method as by the data used in its estimation. In this paper, the data to be used in deriving the standard are assessed, but the method is not directly discussed. The data examined represent the information required to estimate the costs and benefits of water loss control actions to achieve an economic leakage level.

Real Loss Management and Assessment

Water loss control programs aim to mitigate water loss by using data about the distribution system to identify and manage leaks. Typical measures for reducing real losses include pressure management, active leak control (sometimes broken into leak detection and speed and quality of repairs), and pipeline and asset management (Lambert and Fantozzi 2005). As depicted in Fig. 1, these three components can be used to reduce current real losses to an economic leakage level (ELL) and to as low a level as the unavoidable annual real loss (UARL). The UARL is the theoretically lowest level of leakage possible for a specific distribution system; it assumes all infrastructure is in favorable condition and that operators are applying state-of-the-art leakage control. While technically possible, the UARL is often achievable only by ignoring economic considerations (Lambert et al. 1999). In contrast, the ELL is achievable by cost-effective methods.
Fig. 1. Components of real-loss management. (Adapted from Lambert and Fantozzi 2005.)
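The UARL is conventionally estimated with the empirical formula of Lambert et al. (1999), which combines mains length, connection count, private pipe length, and average pressure. A minimal sketch, using a hypothetical system for illustration:

```python
def uarl_liters_per_day(mains_km, service_connections, private_pipe_km, pressure_m):
    """Unavoidable annual real losses (L/day) per the IWA/AWWA formula
    (Lambert et al. 1999): UARL = (18*Lm + 0.8*Nc + 25*Lp) * P,
    with Lm and Lp in km, Nc in connections, and P in meters of head."""
    return (18 * mains_km + 0.8 * service_connections + 25 * private_pipe_km) * pressure_m

# Hypothetical system: 500 km of mains, 20,000 connections,
# 40 km of private pipe, average pressure of 50 m.
uarl = uarl_liters_per_day(500, 20_000, 40, 50)
```

Because the UARL scales linearly with average pressure, pressure management directly lowers the floor against which current losses are compared.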
Pressure management includes eliminating or reducing excessive pressure, mitigating pressure transients, and stabilizing pressure variation. All of these activities reduce the frequency of pipe failures and their associated water losses (Thornton and Lambert 2005; Martinez Garcia et al. 2020). Lower pressure also results in slower leak flow rates, making pressure reduction one of the ways to reduce real losses from background leakage. For operators to understand where they can reduce pressure, they must have adequate data on their distribution systems to ensure that all customers are supplied with adequate pressure and that fire flow requirements are met. This may require time series pressure data at critical nodes for different pressure zones and hydraulic data on current leakage to estimate the effect of reducing pressure on current losses (Thornton and Lambert 2005).
Active leak control requires proactive detection and timely repairs. Leakage is usually detected through a leak detection survey, which locates leaks that may never be detected otherwise. Quickly and effectively repairing leaky infrastructure can both reduce the time a leak is allowed to run and prevent the leak from recurring. The economics of active leak control depend largely on the quantity of leaks, the rate at which they develop, the speed and effectiveness of their repair, and the costs to locate and fix them.
Asset management generally refers to effectively managing pipes, pumps, valves, reservoirs, and similar assets to ensure that maintenance, repair, and replacement occur on time and that there are sufficient resources to support these activities. To account for asset management when estimating economic real losses, a large amount of utility-specific information is required, including detailed asset characteristics, failure records, and maintenance history (Boxall et al. 2007; Rogers and Grigg 2009). Furthermore, understanding how asset management strategies impact water losses requires extensive data to model correlations between asset characteristics and operations and their impact on failure frequencies and leakage rates (Martinez Garcia et al. 2018).
When it comes to measuring water losses, several metrics can be used to assess the performance of a utility. Known as key performance indicators (KPIs), they quantify water loss performance, both assessing performance over time and comparing it to that of other utilities. Commonly accepted KPIs include losses as a percentage of delivered water, losses in gallons per connection per day (GPCD), GPCD per unit of system pressure, and the infrastructure leakage index (ILI), the ratio of current annual to unavoidable annual real losses. Not all KPIs are created equal, and some may be more appropriate than others for tracking performance and establishing an acceptable water loss standard. KPI selection is important and depends on a regulation’s goals and applications. Current development of regulations in California has explored economic GPCD losses for an individual utility with no lower bounds. Therefore, this paper uses this metric in its discussion while recognizing that it is not the only option.
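The KPIs listed above can all be computed from a handful of annual audit values. The sketch below uses hypothetical volumes (gallons per year); the field names are illustrative, not the audit's own:

```python
def water_loss_kpis(real_losses_gal_per_year, delivered_gal_per_year,
                    service_connections, uarl_gal_per_year):
    """Compute common water loss KPIs from annual audit totals.
    All volumes are in gallons per year; names are illustrative."""
    return {
        # Losses as a percentage of total water delivered.
        "percent_of_delivered": 100 * real_losses_gal_per_year / delivered_gal_per_year,
        # Gallons per service connection per day.
        "gpcd": real_losses_gal_per_year / service_connections / 365,
        # Infrastructure leakage index: current / unavoidable annual real losses.
        "ili": real_losses_gal_per_year / uarl_gal_per_year,
    }

# Hypothetical utility.
kpis = water_loss_kpis(
    real_losses_gal_per_year=100_000_000,
    delivered_gal_per_year=1_400_000_000,
    service_connections=20_000,
    uarl_gal_per_year=40_000_000,
)
```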

California Data

In October 2015, California enacted SB 555, requiring all 412 urban water retailers to submit annual audits beginning in 2017. Prior to this bill, very few California utilities separated their overall nonrevenue water into water lost to leaks—real losses—and water lost to unauthorized and unbilled uses—apparent losses (Sayers et al. 2016). Now urban water retailers track their water consumption via a water balance that includes both real and apparent losses in their annual audits.
The water audits utilities use were jointly designed by the American Water Works Association (AWWA) and the International Water Association (IWA), which also designed validity scores to quantify the trustworthiness of audit data, accounting for limitations in metering and monitoring throughout a distribution system (Kunkel 2016). Validity scores for individual data parameters range from 1 to 10, determined by specific grading criteria based on the metering the utility does in a service area or on its data quality policy and processes. A grading matrix provides a weighted summary of individual validity scores and an overall water audit validity score ranging from 1 to 100 (Kunkel 2016). Utilities are informed of areas that decrease their overall scores so they can improve their data. High validity scores do not guarantee accurate values, and low validity scores do not necessarily mean inaccurate measurements; instead, they are meant to represent the degree of confidence in reported values. Validity scores are not statistical measures and do not necessarily follow a linear relationship. In other words, a parameter score of 6 is not twice as likely to be accurate as a score of 3.
In addition to the water audit, utilities must submit an electronic annual report (eAR) to the SWRCB. Whereas water audits track water losses, eARs track system operations and modifications from year to year. As of 2018, eARs include new questions regarding infrastructure, water pressure, and real loss reduction measures (State Water Board Division of Drinking Water 2018) specifically meant to help the SWRCB develop water loss performance standards per SB 555.
Any data necessary for estimating the ELL that cannot be found in water audits or eARs must be supplied by the utilities, which may or may not collect these data. Table 1 outlines some of the data necessary to model economic water losses as well as the potential sources of those data. For example, variable production cost, which is the cost to produce one unit of water, can represent the value of water lost to leaks and is reported in the water audit. On the other hand, flow data are not reported in either the water audit or the eAR but are necessary to understand how much water is lost from leaks and to estimate background leakage and the impacts of pressure reduction. To accurately estimate an ELL, then, the utility itself must supply these data in addition to information already reported in its audit and eAR submissions.
Table 1. Data required to model economic water losses
Type | Source | Specific data
Performance | Water audit | Current annual real losses, infrastructure leakage index, unavoidable annual real losses
Pressure | Water audit, eAR, utility | Average pressures, minimum pressures, pressures at critical nodes
Asset | eAR, utility | Pipe material, pipe age, pipe length, number of connections, current infrastructure
Failure records | eAR, utility | Type of failure, pipe characteristics, estimated impacts
Infrastructure costs | eAR, utility | Replacing/rehabilitating pipe, pressure management (sensors, loggers, PRVs)
Variable production costs | Water audit | Variable production cost
Flow data | Utility | Failure flows, nightly flow rates

Methods

Data Sets

This analysis used two data sets: 2016–2018 California water audits and a 2018 eAR. The water audit collected data on a distribution system’s water budget, including imported and exported sources, apparent losses, and nonrevenue water. Additionally, it collected basic system data such as number of service connections, average operating pressure, and variable production cost. An overall validity score was assigned to each audit and performance metrics such as ILI were calculated. The eAR data set included data on service connections, water served, infrastructure, water rates, water quality, and system improvements.
Parameters of interest (Table 1) for the water audit include length of mains, number of service connections, average pressures, variable water production cost, and overall data validity scores. For the eAR, they include service and main breaks and leaks, pipe materials, average pipe ages, minimum operating pressure, and repair costs.

Data Processing

The water audit data used in this study were downloaded from wuedata.water.ca.gov. This data set contained three years of data for over 300 urban water retailers. If a utility had more than one entry for a given year, only the most recent entry was kept (California Department of Water Resources 2019). This data set, referenced from here forward as the “complete data set,” was used to investigate the data as a whole. A data set with additional filtering, referenced as the “filtered data set,” was used to investigate only the data deemed probable (likely to be accurate). Improbable data were removed both by deleting audits reporting negative real losses and by use of the ILI. Andrews and Sturm (2016) labeled ILIs greater than 20 as improbably high and those less than 1 as impossible, and omitted both from consideration. This study took a similar approach, except that it retained audits with ILIs outside the 1–20 range if they also had a validity score greater than the median of 63.0, indicating a higher degree of confidence in the reported values. Based on these parameters, the complete data set included 1,050 values and the filtered data set 978 for the three-year study period.
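The filtering rules described above can be expressed as a short predicate. The record structure and values below are hypothetical; 63.0 is the median validity score reported in this study:

```python
# Mirror the study's rules: drop audits reporting negative real losses,
# and drop ILIs outside 1-20 unless the audit's overall validity score
# exceeds the data set median (63.0). Field names are hypothetical.
MEDIAN_VALIDITY = 63.0

def keep_audit(audit):
    if audit["real_losses"] < 0:
        return False                       # negative real losses are impossible
    if 1 <= audit["ili"] <= 20:
        return True                        # plausible ILI range
    return audit["validity_score"] > MEDIAN_VALIDITY  # out-of-range but trusted

audits = [
    {"real_losses": 1200.0, "ili": 4.2,  "validity_score": 58},  # kept: plausible ILI
    {"real_losses": -50.0,  "ili": 3.0,  "validity_score": 80},  # dropped: negative losses
    {"real_losses": 900.0,  "ili": 0.4,  "validity_score": 71},  # kept: high validity
    {"real_losses": 700.0,  "ili": 35.0, "validity_score": 40},  # dropped: improbable ILI
]
filtered = [a for a in audits if keep_audit(a)]
```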
Only a single year of eAR data was used because only as of 2018 did the SWRCB require answers to questions related to water loss and system characteristics. The eAR data set was filtered to remove any public water systems that did not fit the criteria of urban water retailers, since water loss reduction standards did not apply to them. Filtering the eAR data left 312 unique records.

Data Adequacy

To assess data adequacy, individual utility data were examined where possible. The criteria used for this evaluation were (1) completeness, (2) validity, and (3) consistency from year to year. Completeness was determined by tabulating total responses, including default values; validity, by comparing validity score ranges and distributions for each data entry over the three years; and consistency, by calculating percentage changes in key parameters from year to year and assessing their distribution. The results of this analysis are described in the following sections, and their implications for setting utility-specific standards are discussed.
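The consistency criterion reduces to a mean absolute year-to-year percentage change per utility and parameter. A minimal sketch with hypothetical values:

```python
def avg_abs_pct_change(values):
    """Mean absolute percentage change between consecutive annual values."""
    changes = [abs(b - a) / a * 100 for a, b in zip(values, values[1:])]
    return sum(changes) / len(changes)

# Hypothetical utility's GPCD real losses for 2016, 2017, and 2018:
# a drop of 20% followed by a rise of 25%.
gpcd_by_year = [30.0, 24.0, 30.0]
change = avg_abs_pct_change(gpcd_by_year)
```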

Results and Discussion

Big Picture in California

Overall water losses in California have been fairly steady over time but variable among individual utilities. From 2016 to 2018, up to 377 utilities submitted water audits to the SWRCB, reporting up to 364 million m3/year (295,196 acre-ft/year) in real losses and averaging 7.1% of the reported total annual water delivered. Individual utilities reported a wide range of losses and ILIs. The heterogeneity observed in Fig. 2 among 2018 utility GPCD real losses supports the effort to create utility-specific standards, since a uniform water loss reduction strategy would likely have an uneven and perhaps overly burdensome impact on some water retailers.
Fig. 2. (a) Boxplot of 2018 GPCD real losses reported by California urban water retailers; and (b) histogram of 2018 GPCD real losses reported by California urban water retailers.

Data Completeness

Water Audit Parameters

The water audits were mostly complete and contained many useful values that could be used in an economic estimation. Before a utility submits its water audit, a third party (certified by the SWRCB) validates it, ensuring that the spreadsheet is complete to the best of the utility’s ability. Occasionally critical parameters are missing, making it difficult to model ELLs. In some situations, if utilities do not know the values for a specific parameter, they use a default value, which is the same for all utilities.
The audits were first evaluated to determine the number of missing entries each year for the parameters of interest. Table 2 shows the number of entries for each parameter as a percentage of the total number of utilities that submitted a water audit each year. Overall the data set had relatively few missing data points for the parameters of interest, for all of which the number of missing entries decreased over the three-year time frame. Average pressure, a critical component for calculating real losses and UARL, was reported 100% of the time. The water audit parameters are important for estimating ELLs, but they are not comprehensive. Average pressure, for example, is insufficient for estimating economic levels of pressure management as it does not capture any system complexities (Puust et al. 2010).
Table 2. Percentage of reporting utilities including a given parameter
Parameter | Complete data set (n=1,050) | 2016 (n=377) | 2017 (n=348) | 2018 (n=325)
Length of mains | 98.3 | 97.4 | 98.6 | 99.1
Number of service connections | 98.3 | 97.4 | 98.6 | 99.1
Overall validity score | 97.9 | 97.4 | 97.7 | 98.8
Average pressure | 100.0 | 100.0 | 100.0 | 100.0
Variable production cost | 97.1 | 96.3 | 96.8 | 98.2
Defaults are heavily relied on to complete a water audit and may or may not represent an individual utility. For a number of parameters, utilities can use default percentages when they do not have exact values. With these default values, utilities can estimate unbilled unmetered volumes, unauthorized consumption, customer metering inaccuracies, and systematic data-handling errors. Reliance on default values did not decrease over time, and over 90% of utilities depended on them to estimate unauthorized consumption and customer metering inaccuracies. Table 3 shows the ranges of nondefault values entered by utilities, the recommended default values, and the number of utilities using defaults as a percentage of total utilities. For example, for customer metering inaccuracies, the default was overregistration of 1.0% of water supplied, whereas utility-entered values ranged from 0.1% to 5.6%. The discrepancy between default and known percentages indicates that default values may not be appropriate for some utilities; here the default percentage could be off by up to 4.6%.
Table 3. Range of values entered by utilities, recommended defaults, and percentage of utilities using default values each year
Parameter | Utility-provided range (%) | Default value (% of water supplied) | Using default, 2016 (n=377) (%) | Using default, 2017 (n=348) (%) | Using default, 2018 (n=325) (%)
Unbilled unmetered | 0.0–10.5 | 1.25 | 4.5 | 18.1 | 17.2
Unauthorized consumption | 0.0–1.5 | 0.25 | 92.8 | 94.5 | 93.5
Customer metering inaccuracies | 0.1–5.6 | 1.0 | 91.0 | 92.2 | 91.4
Systematic data handling errors | 0.0–0.8 | 0.25 | 90.7 | 88.8 | 88.3
The impact of using a nonrepresentative default value depends on how that value is incorporated in an ELL estimation and if any margin of error is considered. Defaults may be appropriate for some retailers, but in many cases may be used simply because the actual number is unknown. In situations where defaults are heavily relied on, potential errors can be compounded, resulting in an inaccurate estimate of current annual real losses and a water loss reduction standard that does not truly represent the state of the urban water retailer.
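The compounding effect of a nonrepresentative default can be seen in a toy water balance (not the full AWWA audit calculation). The volumes below are hypothetical, in million gallons per year, and metering inaccuracy is treated as a percentage of water supplied, as in the audit's default:

```python
def estimated_real_losses(supplied, authorized, other_apparent, metering_pct):
    """Toy water balance: real losses are what remains of water supplied
    after authorized consumption and apparent losses are subtracted.
    Metering inaccuracy volume is taken as a percentage of water supplied."""
    metering_losses = supplied * metering_pct / 100
    return supplied - authorized - other_apparent - metering_losses

# Hypothetical utility, volumes in million gallons/year.
with_default = estimated_real_losses(1000.0, 880.0, 20.0, 1.0)  # default of 1.0%
with_actual = estimated_real_losses(1000.0, 880.0, 20.0, 5.6)   # top of reported range
```

In this sketch the default yields 90 versus 44 million gallons of estimated real losses, overstating the true figure by roughly half; a standard derived from the default would target losses the utility cannot actually recover.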

Electronic Annual Report Parameters

Electronic annual reports (eARs) were largely incomplete and likely not adequate for use in setting standards. The 2018 eAR contained several pieces of data (Table 1). Unlike the water audits, eARs received no formal third-party review or validation of the data; it was up to utilities to assess their data quality. Data were missing most of the time for most of the parameters of interest (Table 4). Most utilities (71.5%) reported the number of breaks and leaks found on service lines, but very few (1.6%) provided estimates of repair costs. This might have been because they did not collect these data or found them too burdensome to report in the required form.
Table 4. Percentage of utilities reporting specific data parameters in 2018 eAR
Parameter | Percentage reporting (n=312)
Service connection breaks/leaks | 71.5
Main breaks/leaks | 54.5
Pipe materials | 45.2
Average pipe ages | 31.4
Minimum operating pressure | 33.3
Repair costs | 1.6
In the absence of data, defaults or assumptions may be used to estimate ELLs. Depending on the parameter, the assumptions used can have a significant impact on these estimates. For example, main breaks are likely to vary significantly between utilities, making a single blanket assumption inappropriate. Alternatively, repair or replacement costs may be more consistent between utilities, making an assumed cost a fitting one.
Missing or default data in the water audits and eARs provide information about the status of the data available to the state regulating agency, but do not necessarily signify that the information does not exist at the utility level. Absent data may be a result of the data collection tools themselves. That only 54.5% of utilities reported their main breaks/leaks in the eAR does not necessarily mean that the other 45.5% did not collect this information; they may simply not have reported it, or may have collected it in a form not conducive to the eAR's or audit's reporting format. Similarly, defaults in the water audit may have been convenient when measured data collection was onerous, but this does not necessarily imply that the data could not be measured or that the default was appropriate. In situations such as these, an accurate water loss reduction standard may require allowing direct utility interaction and data collection instead of reliance on data as reported to the regulator.

Utility-Shared Parameters

Because the water audit and eAR did not ask for all parameters necessary to model economic water losses, utilities would need to use their internally collected data. For instance, average operating and minimum operating pressures tell little about the capacity to reduce pressures and the costs and benefits of doing so. To include this in the estimation, the utility at a minimum needs to supply additional data about pressures at various points and times, which may require surveying a distribution system or constructing a hydraulic model. Asset management is similar. Without condition data on the infrastructure and the likelihood of pipe failure, it is difficult to estimate water reductions resulting from economic asset management activities. Even if these data were available, translating pipe characteristics into a reduced rate of pipe failure and water loss is a topic of current research (Martinez Garcia et al. 2018, 2020). At present it is unknown whether all utilities in California either do collect or can collect the data necessary to consider economic levels of the three main water-loss control components.

Data Validity

Validity scores have moderately improved over time and may be adequate to begin benchmarking, but they are likely still too low to be the basis of future requirements. The water audit asked utilities to indicate the reliability of their data by assigning validity scores to the relevant parameters. Low validity scores indicate low confidence in reported values; therefore, low-validity data are likely inadequate for informing regulations. Fig. 3 summarizes the validity scores for parameters from the filtered data set for all three reporting years (2016:n=350; 2017:n=325; 2018:n=303). It highlights median values with a single point and the middle 50% of the data with a box. The whiskers extend to the 5th- and 95th-percentile data values. Each validity score is assigned a specific definition; for example, a median score of 8 in Fig. 3(b) for number of service connections indicates a confidence level within 2% of the actual number, and a median score of 5 in Fig. 3(c) for average pressure signifies that calculations are based on a reasonably reliable mix of electronically logged telemetry data and pressure gauge data from hydrants. A full explanation of each value for each parameter can be found in the AWWA’s M36 Water Audits and Loss Control Programs (Kunkel 2016).
Fig. 3. Box and whisker plots of data validity scores for each reporting year: (a) length of mains; (b) service connections; (c) average pressure; (d) variable production cost; and (e) cumulative data validity scores.
Fig. 3(e) presents cumulative data validity scores and their shifts over the three mandatory reporting years. From 2016 to 2018, the score range shrank and the mean increased from 60 to 65. Kunkel (2016) proposes a minimum validity score of 51 before utilities can begin performance benchmarking; in 2018, 97.1% of utilities reported scores at or above that level. M36 Water Audits and Loss Control Programs recommends validity scores of 71 or higher before performance benchmarking can be “meaningful”; in 2018, only 28.2% of utilities reported this level.
Overall, the majority of utilities received reasonably high data validity scores for both length of mains and service connections. Validity scores for average pressure varied significantly, indicating that these data may not be adequate for estimating water loss reductions or UARLs for some utilities. For variable production costs, all utilities had a validity score of at least 4, indicating some degree of confidence. Generally, validity scores increased, and as of 2018 the majority of reporting utilities had achieved the minimum scores necessary to begin benchmarking and long-term planning.

Data Consistency

Some parameters were relatively consistent, but others experienced large variations from one year to the next. Data consistency is the steadiness of a data measurement over time. A water retailer is unlikely to experience large fluctuations year to year in parameters such as water losses or average pressures. Therefore, sizable changes in these parameters may indicate that the data are not reliable. Fig. 4 shows the average percentage changes (in absolute values) from 2016 to 2017 and from 2017 to 2018 for individual utilities using the filtered data set. To more easily visualize trends, Fig. 4 does not display outlier percentage changes, focusing instead on the bulk of the data.
Fig. 4. Utility count by average percentage change, 2016–2018: (a) average pressures; (b) variable production costs; and (c) GPCD real losses.
Average pressures each year were relatively constant, with over half of utilities reporting no change at all over the three reporting years. Variable production costs fluctuated from year to year, with a median average percentage change of 10.9%. Calculated as total annual real losses divided by total service connections and expressed per day, real loss GPCD had the greatest variability, with a median average percentage change of 35.1%. For this parameter, any given year of data could have a large (and potentially inaccurate) impact on resulting standards. Consistency was not evaluated for eAR data since only one record was available.

Data Adequacy Implications for Applying Standards Today

Given the findings of this research, there are certain implications of using currently available data to set utility-specific volumetric water loss reduction standards.
First, reductions based on missing or incomplete data may not reflect a utility’s reality. While most utilities collect certain data parameters, very few if any report all the data necessary to calculate a truly utility-specific standard. Table 4 shows that data from the eAR for parameters such as pipe material and age, minimum operating pressures, and repair costs were missing over half the time. When utilities are missing data, they can substitute default values, but Table 3 demonstrates how defaults can significantly differ from actual data, which means that the resulting performance standards may not represent the conditions of the individual utilities.
Second, regulated water loss reductions may be inconsequential in the face of expected variations in reported real losses. Volume of real losses per service connection, such as GPCD losses, is a common performance indicator utilities use to benchmark their water losses and track their performance (Lambert et al. 2015). Currently available data on GPCD losses are highly variable and, depending on how such data are used, can have sizable ramifications on resulting standards and compliance.
A utility may do nothing and meet the standard, or do much and not meet it, simply because of annual data variations. For example, for a median utility with losses of 28.2 GPCD and a median average percentage change of 35.1%, Fig. 5 plots the expected range of GPCD losses in gray; the black line indicates the water loss percentage reductions that can be enforced based on the utility’s current GPCD losses. In this case, if an economic model suggests a reduction based on current GPCD real losses, the reduction must be greater than 35% before it falls outside the expected range for the next year. For a first-quartile utility, a reduction in real losses must be greater than 21.7%; for a third-quartile utility, at least 75.2%. For a median utility with an average annual percentage change in water losses of 35%, reducing its GPCD real losses by, say, 10% is meaningless. Compliance then depends only minimally on actual water loss reduction steps and much more on the data used to calculate real losses, undermining the intent of the regulation itself.
Fig. 5. Projected GPCD losses for a median utility with hypothetical standard percentage reductions.
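The comparison above can be sketched numerically. The following is a minimal illustration (not the authors' model) that assumes the expected range for next year's losses is simply the current value plus or minus the average annual percentage change; the function name is hypothetical:

```python
# Sketch: test whether a mandated reduction falls outside a utility's
# expected year-to-year variability band. Assumes (simplistically) the band
# is current losses +/- the average annual percentage change.

def reduction_is_detectable(current_gpcd, avg_annual_pct_change, reduction_pct):
    """Return True if the reduced target lies below the expected range."""
    lower_bound = current_gpcd * (1 - avg_annual_pct_change / 100)
    target = current_gpcd * (1 - reduction_pct / 100)
    return target < lower_bound

# Median utility from the paper: 28.2 GPCD losses, 35.1% average annual change.
print(reduction_is_detectable(28.2, 35.1, 10))  # 10% reduction: within noise (False)
print(reduction_is_detectable(28.2, 35.1, 40))  # 40% reduction: detectable (True)
```

Under this simple assumption, any reduction smaller than the average annual change is indistinguishable from ordinary year-to-year variation, which mirrors the 35% threshold described for the median utility.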

Data Adequacy Considerations for Formulating Policy

Given the condition of the data and the implications of using these data in their current form, any entity considering regulation should carefully consider the time frame of implementation, the tools to collect the data, and the data it uses to set policy.

Time Frame of Implementation

The timing of a regulation can have significant impacts on the available data and therefore on the resulting standard. As shown in Table 2 and Fig. 3, missing data decreased and validity scores increased over time, indicating that, given more time, utilities would be able to improve both data quantity and quality. M36 Water Audits and Loss Control Programs states that most water systems need three to six years to achieve adequate validity scores (Kunkel 2016). Presumably, variability decreases as data quality increases, allowing retailers to establish more representative baseline values. However, Fig. 4 shows that accurate baselining is currently difficult to achieve because many utilities experience sizable data fluctuations from year to year. Entities considering water loss reduction policies should therefore carefully consider how much time to allow for data collection before formulating regulations.

Data Collection Tool

California’s water audits, although well filled out, were not created with the intent of estimating economic real loss levels. As a result, critical data were missing. Using only the data available to California regulators risks oversimplification, creating a regulation that does not represent the unique situations of some urban water retailers. This runs counter to the intent of a tailored approach. To create tailored standards, California regulators may need to work with utilities to gather the necessary data, in the appropriate format, for estimating their ELL.

Data Selection

Data selection also significantly impacts the resulting standards. First, because data change from year to year, regulating entities should consider which year or years of data they will use to create regulations, or whether some level of consistency must be achieved before benchmarking water losses and estimating economic losses can begin. Given their current inconsistency, single-year values may not be appropriate. To be representative, standards can instead incorporate multiple years of data, wait until a level of consistency is achieved, and apply confidence intervals or a regulatory range instead of a single value to account for data variations.
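A regulatory range of the kind suggested above could be computed from a few years of reported data. The sketch below is one possible formulation (not taken from the paper): a mean of five illustrative annual GPCD values with a t-based 95% confidence half-width; the input values and function name are hypothetical:

```python
# Sketch: set a baseline from several years of reported GPCD losses and
# express the standard as a range (mean +/- a t-based confidence half-width)
# rather than a single-year value. Input values are illustrative only.
import statistics

def baseline_range(annual_gpcd, t_crit=2.776):  # t for 95% CI with n=5 (df=4)
    mean = statistics.mean(annual_gpcd)
    half_width = t_crit * statistics.stdev(annual_gpcd) / len(annual_gpcd) ** 0.5
    return mean - half_width, mean + half_width

# Five hypothetical annual GPCD loss values for one utility:
low, high = baseline_range([31.0, 24.5, 28.2, 35.6, 26.9])
print(f"regulatory range: {low:.1f}-{high:.1f} GPCD")
```

With highly variable inputs, the range is wide; as reporting consistency improves, the interval narrows and the standard becomes more meaningful.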
Second, if utilities rely heavily on default values or on values with low validity scores (i.e., low confidence), the potential for error should be considered. As shown in Table 3, the difference between what a utility actually measures and what a default assumes can be as large as 9.25% for some parameters. If utilities use multiple defaults, these errors can compound and carry through to the estimate of annual real losses. Likewise, low-confidence data are more likely to be in error, which can affect the overall estimate of real losses. Given this limitation, a resulting regulation should account for all sources of error and report an expected overall error that can be incorporated into the standard.

Conclusion

Our analysis demonstrates that the data currently available for setting utility-specific water loss reduction standards in California are improving but not yet adequate. The water audits, although mostly complete, relied on default values for certain parameters, and most utilities were missing data for the majority of the parameters in their eARs. While validity scores have been increasing, as of 2018 only 28.2% of utilities had overall validity scores sufficient to begin performance benchmarking. Additionally, data fluctuated from year to year: for example, the key performance indicator of GPCD losses showed a median average percentage change of 35.1% from 2016 to 2018. Overall, these data may produce water loss performance standards that are not truly representative of individual utilities' economic capacity for water loss reduction and that may not be meaningful given the inherent variation in the current data. For this reason, regulators should carefully consider the timing of data collection, the collection tool used, and the data selected when writing standards. Until data adequacy improves, tailored water loss standards may not be accurate or appropriate. For other entities considering a similar approach, these results highlight the importance of data adequacy when setting standards and the implications of using inadequate data.

Data Availability Statement

All data, models, or code that support the findings of this study are available from the corresponding author upon reasonable request. This includes all California water audit data sets and the relevant R code used to process and clean them.

Acknowledgments

This document was prepared as a result of work sponsored by the California State Water Resources Control Board. It does not necessarily represent the views of the State Water Board, its employees, or the State of California. The State Water Board, the State of California, its employees, contractors, and subcontractors make no warranty, express or implied, and assume no legal liability for the information in this document; nor does any party represent that the use of this information will not infringe upon privately owned rights. This manuscript has not been approved or disapproved by the Water Board nor has the Water Board passed upon the accuracy of the information in this report.

References

Andrews, L., and R. Sturm. 2016. “Water audits in the United States: Challenges, successes, and opportunities.” J. Am. Water Works Assn. 108 (2): 24–29. https://doi.org/10.5942/jawwa.2016.108.0032.
Boxall, J. B., A. O’Hagan, S. Pooladsaz, A. J. Saul, and D. M. Unwin. 2007. “Estimation of burst rates in water distribution mains.” Proc. Inst. Civ. Eng. Water Manage. 160 (2): 73–82. https://doi.org/10.1680/wama.2007.160.2.73.
California Department of Water Resources. 2010. “20 × 2020 Water conservation plan.” Accessed May 13, 2021. https://www.waterboards.ca.gov/water_issues/hot_topics/20x2020/docs/20x2020plan.pdf.
California Energy Commission. 2016. Title 20 water efficiency standards, California code of regulations §§ 1601-1609. Sacramento, CA: California Energy Commission.
California State Senate. 2015. Urban retail water suppliers: Water loss management. Sacramento, CA: Legislative Counsel’s Digest.
California State Senate. 2018. Water management planning. Sacramento, CA: Legislative Counsel’s Digest.
Copeland, C., and N. T. Carter. 2017. “Energy-water nexus: The water sector’s energy use.” Accessed December 15, 2019. https://fas.org/sgp/crs/misc/R43200.pdf.
Department of Water Resources, State of California. 2019. “Water use efficiency data.” Accessed November 13, 2019. http://wuedata.water.ca.gov/.
Kunkel, G. A. 2016. M36 water audits and loss control programs. 4th ed. Denver: American Water Works Association.
Lambert, A., et al. 2015. Good practices on leakage management. Brussels, Belgium: European Union.
Lambert, A. O., T. G. Brown, M. Takizawa, and D. Weimer. 1999. “A review of performance indicators for real losses from water supply systems.” Aqua 48 (6): 227–237.
Lambert, A. O., and M. Fantozzi. 2005. “Recent advances in calculating economic intervention frequency for active leakage control, and implications for calculation of economic leakage levels.” Water Sci. Technol. Water Supply 5 (6): 263–271. https://doi.org/10.2166/ws.2005.0072.
Lund, J., J. Medellin-Azuara, J. Durand, and K. Stone. 2018. “Lessons from California’s 2012–2016 drought.” J. Water Resour. Plann. Manage. 144 (10): 04018067. https://doi.org/10.1061/(ASCE)WR.1943-5452.0000984.
Martinez Garcia, D., J. Lee, J. Keck, J. Kooy, P. Yang, and B. Wilfley. 2020. “Pressure-based analysis of water main failures in California.” J. Water Resour. Plann. Manage. 146 (9): 05020016. https://doi.org/10.1061/(ASCE)WR.1943-5452.0001255.
Martinez Garcia, D., J. Lee, J. Keck, P. Yang, and R. Guzzetta. 2018. “Hot spot analysis of water main failures in California.” J. Am. Water Works Assn. 110 (6): 39–49. https://doi.org/10.1002/awwa.1039.
Puust, R., Z. Kapelan, D. A. Savic, and T. Koppel. 2010. “A review of methods for leakage management in pipe networks.” Urban Water J. 7 (1): 25–45. https://doi.org/10.1080/15730621003610878.
Rogers, P. D., and N. S. Grigg. 2009. “Failure assessment modeling to prioritize water pipe renewal: Two case studies.” J. Infrastruct. Syst. 15 (3): 162–171. https://doi.org/10.1061/(ASCE)1076-0342(2009)15:3(162).
Sayers, D., W. Jernigan, G. Kunkel, and A. Chastain-Howley. 2016. “The water audit data initiative: Five years and accounting.” J. Am. Water Works Assn. 108 (11): E598–E605. https://doi.org/10.5942/jawwa.2016.108.0169.
State Water Board Division of Drinking Water. 2018. “Electronic Annual Reporting System.” Accessed November 13, 2019. https://www.drinc.ca.gov/ear/.
Thornton, J., and A. Lambert. 2005. “Progress in practical prediction of pressure: Leakage, pressure: Burst frequency and pressure: Consumption relationships.” In Proc., IWA Special Conf. ‘Leakage 2005’. London: International Water Association.
USEPA. 2010. Control and mitigation of drinking water losses in distribution systems. Washington, DC: USEPA.


History

Received: May 20, 2020
Accepted: Mar 15, 2021
Published online: Jun 2, 2021
Published in print: Aug 1, 2021
Discussion open until: Nov 2, 2021

Authors

Affiliations

Ph.D. Candidate, Center for Water-Energy Efficiency, Univ. of California, Davis, CA 95616. ORCID: https://orcid.org/0000-0002-1177-156X. Email: [email protected]
MacKenzie S. Guilliams
Undergraduate Student Intern, Center for Water-Energy Efficiency, Univ. of California, Davis, CA 95616.
Undergraduate Student Intern, Center for Water-Energy Efficiency, Univ. of California, Davis, CA 95616. ORCID: https://orcid.org/0000-0002-0953-9358
Katrina K. Jessoe, Ph.D.
Faculty Researcher/Associate Professor, Agricultural and Resource Economics, Univ. of California, Davis, CA 95616.
Frank J. Loge, Ph.D. [email protected]
Faculty Researcher/Center Director, Center for Water-Energy Efficiency, Univ. of California, Davis, CA 95616 (corresponding author). Email: [email protected]
