Technical Papers
Jun 23, 2017

Hidden-Model Processes for Adaptive Management under Uncertain Climate Change

Publication: Journal of Infrastructure Systems
Volume 23, Issue 4

Abstract

Predictions of climate change can significantly affect the optimization of measures reducing the long-term risk for assets exposed to extreme events. Although a single climate model can be represented by a Markov stochastic process and directly integrated into the sequential decision-making procedure, optimization under epistemic uncertainty about the model is computationally more challenging. Decision makers have to define not only a set of models with corresponding probabilities, but also whether and how they will learn more about the likelihood of these models during the asset-management process. Different assumed learning rates about the climate can suggest opposite behaviors. For example, an agent believing, optimistically, that the correct model will soon be identified may prefer to wait for this information before making relevant decisions; on the other hand, an agent predicting, pessimistically, that no further information will ever be available may prefer to immediately take actions with long-term consequences. This paper proposes a set of optimization procedures based on the Markov decision process (MDP) framework to support decision making depending on the assumed learning rate, thus trading off the need for a prompt response against the need to reduce uncertainty before deciding. Specifically, it outlines how approaches based on MDPs, hidden-mode MDPs, dynamic programming, and point-based value iteration can be used, depending on the assumptions on future learning. The paper describes the complexity of these procedures, discusses their performance in different settings, and applies them to flood risk mitigation.
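
As a minimal illustration of the MDP framework the paper builds on (not the paper's own model), the sketch below runs value iteration on a toy asset-deterioration problem. The states, actions, costs, and transition probabilities are invented for the example, and the known-model setting shown here corresponds to the simplest case the paper considers, before epistemic uncertainty over climate models is introduced.

```python
import numpy as np

# Toy finite MDP: three asset condition states, two actions
# ("do nothing" vs. "protect"). All numbers are illustrative only.
n_states, n_actions = 3, 2
gamma = 0.95  # discount factor

# P[a][s, s'] = transition probability under action a
P = np.array([
    [[0.70, 0.20, 0.10],    # action 0: deterioration likely
     [0.00, 0.60, 0.40],
     [0.00, 0.00, 1.00]],
    [[0.95, 0.04, 0.01],    # action 1: protection slows deterioration
     [0.30, 0.60, 0.10],
     [0.10, 0.30, 0.60]],
])
# C[s, a] = immediate cost: protection has an upfront cost,
# degraded states incur losses
C = np.array([[0.0, 2.0],
              [5.0, 6.0],
              [20.0, 18.0]])

V = np.zeros(n_states)
for _ in range(1000):
    # Bellman backup: Q[s, a] = C[s, a] + gamma * sum_t P[a, s, t] * V[t]
    Q = C + gamma * np.einsum('ast,t->sa', P, V)
    V_new = Q.min(axis=1)          # minimize expected discounted cost
    if np.max(np.abs(V_new - V)) < 1e-8:
        break
    V = V_new
policy = Q.argmin(axis=1)          # greedy policy w.r.t. converged values
```

Under epistemic uncertainty, the single transition model `P` would be replaced by a set of candidate models with a belief over them, which is where the hidden-mode MDP and point-based value iteration machinery discussed in the paper comes in.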

Acknowledgments

The first author acknowledges the support of NSF project CMMI #1638327, titled “CRISP Type 1/Collaborative Research: A Computational Approach for Integrated Network Resilience Analysis under Extreme Events for Financial and Physical Infrastructures.” The authors thank the Center for Engineering and Resilience for Climate Adaptation (CERCA) of the CEE/EPP departments at Carnegie Mellon University for inspiring this research.

Information & Authors

History

Received: Jul 12, 2016
Accepted: Mar 7, 2017
Published online: Jun 23, 2017
Discussion open until: Nov 23, 2017
Published in print: Dec 1, 2017

Authors

Affiliations

Matteo Pozzi, A.M.ASCE
Faculty, Dept. of Civil and Environmental Engineering, Faculty Affiliate, Scott Institute for Energy Innovation, Carnegie Mellon Univ., 107b Porter Hall, 5000 Forbes Ave., Pittsburgh, PA 15213-3890 (corresponding author). E-mail: [email protected]
Milad Memarzadeh
Postdoctoral Scholar, Dept. of Environmental Science, Policy and Management, Univ. of California Berkeley, 201 Wellman Hall, Berkeley, CA 94720. E-mail: [email protected]
Kelly Klima
Adjunct Assistant Professor, Research Scientist, Dept. of Engineering and Public Policy, Scott Institute for Energy Innovation, Carnegie Mellon Univ., Baker 129, 5000 Forbes Ave., Pittsburgh, PA 15213-3890. E-mail: [email protected]
