Introduction
Fresh water supplies are diminishing as a result of human population growth, climate change, urbanization, and increasing water demand. These pressures are sometimes exacerbated by competition across the municipal, industrial, agricultural, and environmental sectors (
Kummu et al. 2010;
Oki and Kanae 2006;
Vörösmarty et al. 2000). Water reuse offers a viable strategy to alleviate this pressure by reducing environmental withdrawals, augmenting fresh water supplies, or directly supplying finished drinking water. This is accomplished with conventional or advanced treatment trains that can be optimized based on local wastewater characteristics to achieve a water quality consistent with the intended use (i.e., fit-for-purpose) and regulatory framework (
Gerrity et al. 2013). Benchmark examples include the
full advanced treatment (FAT) train employed by the Orange County Water District in California and the ozone-biofiltration treatment train employed by Gwinnett County, Georgia (Fig.
1). Although these treatment trains are able to remove a wide range of chemical and microbial contaminants (
Pecson et al. 2015), there are potential risks that must be assessed and mitigated (
Dominguez-Chicas and Scrimshaw 2010). For example, intermittent failures may increase microbial and/or chemical loads to downstream barriers (
Pecson et al. 2018), potentially leading to compound failures (i.e., domino effects) and increased public health risks (
Amoueyan et al. 2017,
2019).
Regulatory frameworks currently rely on log removal values (LRVs) to estimate pathogen attenuation during treatment. Specifically, each qualifying unit process within a treatment train (e.g., ozonation, filtration) is awarded an LRV for each target pathogen (
Sano et al. 2016;
SWRCB 2016;
Amarasiri et al. 2017). The sum of the LRVs across the whole treatment train is then calculated for each pathogen and compared against regulatory requirements or public health benchmarks (e.g.,
annual risk of infection). These water reuse regulations are administered on a state-by-state basis in the United States (
USEPA 2012), but there is currently little standardization across states or between countries (
EPHC 2008;
USEPA 2012;
Paranychianakis et al. 2015;
WHO 2017). For example, LRVs of 12/10/10 for viruses,
Cryptosporidium, and
Giardia are required for potable reuse in California, while North Carolina is considering LRVs of 6/5/4 for
E. coli, MS2 bacteriophage, and
Clostridium perfringens (
USEPA 2012). These frameworks use raw sewage as the starting point for the LRV calculation. In contrast, Texas requires LRVs of 8/5.5/6 for viruses,
Cryptosporidium, and
Giardia, with conventional wastewater treatment plant effluent as the starting point for the LRV calculation (
TWDB 2015). The LRV awarded for a particular unit process may even vary by facility within the same jurisdiction (
SWRCB 2016).
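In general terms, the LRV of an individual unit process is computed from its influent and effluent concentrations of a given pathogen, and the treatment train total is the sum across qualifying processes:

$$\mathrm{LRV}_i = \log_{10}\!\left(\frac{C_{\mathrm{in},i}}{C_{\mathrm{out},i}}\right), \qquad \mathrm{LRV}_{\mathrm{total}} = \sum_{i} \mathrm{LRV}_i$$

where $C_{\mathrm{in},i}$ and $C_{\mathrm{out},i}$ denote the pathogen concentrations entering and leaving process $i$. Under the California framework described above, for example, $\mathrm{LRV}_{\mathrm{total}} \geq 12$ would be required for viruses.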
One of the principal treatment processes affecting water quality and operational performance in water reuse applications is secondary biological treatment. This has traditionally been accomplished with activated sludge basins and secondary clarifiers, with low-pressure membranes often applied downstream of secondary treatment (
Gerrity et al. 2013). Concurrent with the expansion of water reuse applications, membrane bioreactors (MBRs), which integrate biological treatment with semipermeable membranes, have rapidly increased in popularity due to their reduced structural footprint and potential for improved water quality (
Zhang and Farahbakhsh 2007;
De Luca et al. 2013;
Purnell et al. 2016). MBRs have been shown to remove microorganisms more effectively than conventional secondary treatment (
De Luca et al. 2013;
Francy et al. 2012;
Hmaied et al. 2015;
Ottoson et al. 2006), making them well suited for water reuse applications where human exposure is anticipated (
Hai et al. 2014).
Despite their efficacy, MBRs are rarely awarded pathogen LRVs, while independent microfiltration (MF) or ultrafiltration (UF) membranes are generally awarded 0/4/4 LRVs for viruses,
Cryptosporidium, and
Giardia (
SWRCB 2016). The resulting LRV deficiencies are problematic for potable reuse systems seeking to substitute an MBR for independent secondary and tertiary wastewater treatment processes (Fig.
1). In MBRs, larger microorganisms such as protozoa and bacteria should be retained by size exclusion, considering that nominal membrane pore sizes are often on the order of 0.1 μm (i.e., MF) or smaller (i.e., UF). Based on size alone, many bacteriophages and human viruses should pass through these membranes. However, studies have shown that MBRs can achieve high virus removal (
Chaudhry et al. 2015;
Kuo et al. 2010;
Simmons et al. 2011), either due to inactivation during biological treatment (
Bertucci et al. 1977;
Ward 1982;
Kim and Unno 1996) or physical removal facilitated by particle attachment (
Van den Akker et al. 2014;
Miura et al. 2015), cake or fouling layer development (
Ueda and Horan 2000;
Farahbakhsh and Smith 2004;
Marti et al. 2011), or direct membrane interactions (
Chaudhry et al. 2015;
Lv et al. 2006;
Wu et al. 2010). Cake layer disruption following membrane backwashing or cleaning may promote viral passage, but the degree of passage will depend on the pathogen, membrane pore size, and operational parameters (
Hirani et al. 2014;
Lv et al. 2006;
Van den Akker et al. 2014;
Erdal and Vorheis 2015).
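As a rough illustration of the size-exclusion argument above, the following sketch compares nominal membrane pore sizes against representative organism dimensions; all values are approximate, order-of-magnitude literature figures rather than measurements from any particular membrane or study:

```python
# Rough size-exclusion comparison (illustrative, order-of-magnitude values).
NOMINAL_PORE_NM = {"MF": 100.0, "UF": 10.0}  # ~0.1 um for MF; smaller for UF

ORGANISM_SIZE_NM = {
    "Giardia cyst": 8000.0,            # ~8-12 um
    "Cryptosporidium oocyst": 4000.0,  # ~4-6 um
    "E. coli": 1000.0,                 # ~1-2 um
    "enteric virus": 30.0,             # ~20-100 nm depending on the virus
    "MS2 bacteriophage": 27.0,         # ~27 nm capsid
}

for membrane, pore_nm in NOMINAL_PORE_NM.items():
    for organism, size_nm in ORGANISM_SIZE_NM.items():
        status = "retained" if size_nm > pore_nm else "may pass"
        print(f"{membrane} ({pore_nm:.0f} nm pore): {organism} -> {status}")
```

Consistent with the discussion above, this crude criterion retains protozoa and bacteria on both membrane classes but predicts passage of small viruses through MF, which is why observed virus removal in MBRs is attributed to mechanisms beyond size exclusion alone.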
From a design perspective, discrepancies between observed versus regulatory LRVs can result in overly conservative and potentially unsustainable treatment train designs (
Schimmoller et al. 2015). Therefore, the water reuse industry is now seeking to establish virus LRVs for membrane-based processes, particularly MBRs, and to identify rapid methods for validating process integrity and performance. These efforts are often hindered by low ambient virus concentrations in membrane feeds, which necessitates spiking of surrogate viruses (e.g., MS2 bacteriophage) during challenge tests. However, alternative viral surrogates may occur in sufficient quantities under ambient conditions to facilitate characterization of membrane performance. These surrogates include plant viruses found in certain foods [e.g., pepper mild mottle virus (PMMoV)] and bacteriophages associated with nonpathogenic human gut microbiota (e.g., ϕB124-14 and crAssphage). PMMoV is a rod-shaped (approximately 312 × 18 nm; Wetter et al. 1984) RNA virus with a low isoelectric point (3.2–4.9;
Haramoto et al. 2013;
Shirasaki et al. 2017). It is also one of the most abundant viruses in human feces (up to 10⁹ virions per gram of dry feces;
Zhang et al. 2006), presumably due to consumption of pepper-based foods. Consequently, PMMoV is abundant in wastewater (
Rosario et al. 2009) and in drinking water sources (
Haramoto et al. 2013). The
Bacteroides-specific bacteriophages are spherical, double-stranded DNA viruses (
Jofre et al. 2014;
Stachler et al. 2017) with diameters of approximately 50 nm for ϕB124-14 (
Ogilvie et al. 2012) and 75 nm for crAssphage (
Shkoporov et al. 2018), with isoelectric points potentially similar to that of MS2 (approximately 3.9;
Rhodes et al. 2016). These bacteriophages have been found at high concentrations in wastewater (
Jofre et al. 2014;
Stachler and Bibby 2014). Because PMMoV, ϕB124-14, and crAssphage are abundant in feces, and ultimately raw sewage, they can serve as indicators of fecal contamination and as valuable surrogates for treatment process performance in water reuse applications.
In this study, we used quantitative polymerase chain reaction (qPCR) to monitor these endogenous viral surrogates and their bacterial analogues at a full-scale MBR facility and at a demonstration-scale FAT facility. The objectives of this study were to (1) quantify the occurrence of the five molecular targets in wastewater matrices; (2) characterize their removal across common unit processes in water reuse applications; and (3) determine their suitability for awarding virus LRVs (e.g., across an MBR).
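As a sketch of how objective (3) could be operationalized, the snippet below computes per-sample surrogate LRVs from paired influent/effluent qPCR concentrations and flags censored results when the effluent falls below the assay limit of quantification (LoQ), in which case only a lower-bound LRV can be reported. The function and the example values are hypothetical and are not drawn from the datasets described in this study:

```python
import math

def surrogate_lrv(influent_gc_per_ml, effluent_gc_per_ml, loq_gc_per_ml):
    """Return (lrv, censored) for one paired qPCR sample.

    When the effluent is below the LoQ, the LoQ is substituted as a
    conservative denominator and the result is flagged as a lower bound
    (the true LRV is >= the returned value).
    """
    censored = effluent_gc_per_ml < loq_gc_per_ml
    denominator = loq_gc_per_ml if censored else effluent_gc_per_ml
    return math.log10(influent_gc_per_ml / denominator), censored

# Hypothetical paired samples across an MBR: (influent, effluent) in gene copies/mL
samples = [(1.0e6, 50.0), (5.0e5, 0.0), (2.0e6, 120.0)]
LOQ = 10.0  # hypothetical assay LoQ in gene copies/mL

for influent, effluent in samples:
    lrv, censored = surrogate_lrv(influent, effluent, LOQ)
    print(f"LRV {'>=' if censored else '='} {lrv:.2f}")
```

In this framing, a surrogate can only demonstrate an LRV as large as the ratio of its feed concentration to the assay LoQ, which is why highly abundant endogenous targets are needed to document large LRVs.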
Conclusions
Use of MBRs is increasing in water reuse applications, but broad implementation for potable reuse requires a regulatory framework for awarding LRVs. Such a framework requires extensive datasets, rapid methods, and suitable surrogates to document process performance and integrity with sufficient frequency. In this study, the Bacteroides-associated bacteriophages ϕB124-14 and crAssphage and the plant-associated PMMoV were highly abundant in the MBR feed, which allowed virus LRVs to be demonstrated across the MBR. However, data in the literature suggest that the more conservative of the observed surrogate LRVs may be more consistent with human pathogen removal by MBRs. PMMoV proved to be a valuable indicator of human fecal contamination due to its persistence through wastewater treatment, but that same persistence may make it an overly conservative surrogate for treatment performance. Finally, FAT was effective in reducing all molecular targets to their LoQs, but surrogate abundance was insufficient to demonstrate LRVs beyond the initial ozone treatment process, at least with the methods used in this study. Although these data represent a single case study, they expand the body of scientific literature on viral surrogates in water treatment and reuse. A more comprehensive study evaluating viral surrogates across multiple MBR and FAT facilities is warranted to characterize the broader applicability of the findings presented here.