TECHNICAL PAPERS
Sep 18, 2010

Three-Dimensional Field-Scale Coupled Thermo-Hydro-Mechanical Modeling: Parallel Computing Implementation

Publication: International Journal of Geomechanics
Volume 11, Issue 2

Abstract

An approach for the simulation of three-dimensional field-scale coupled thermo-hydro-mechanical problems is presented, including the implementation of parallel computation algorithms. The approach is designed to allow three-dimensional large-scale coupled simulations to be undertaken in reduced time. Owing to progress in computer technology, existing parallel implementations have been found to be ineffective, with the time taken for communication dominating any reduction in time gained by splitting the computation across processors. After analysis of the behavior of the solver and of the architecture of multicore, nodal, parallel computers, a modification of the parallel algorithm using a novel hybrid Message Passing Interface/Open Multiprocessing (MPI/OpenMP) method was implemented and found to yield significant improvements by reducing the amount of communication required. This finding reflects recent enhancements in high-performance computing architectures. An increase in solver performance of 500% over existing parallel implementations on current processors was achieved. An example problem involving the Prototype Repository experiment undertaken by the Swedish Nuclear Fuel and Waste Management Co. [Svensk Kärnbränslehantering AB (SKB)] in Äspö, Sweden, is presented to demonstrate situations in which parallel computation is invaluable because of the complex, highly coupled nature of the problem.
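
A minimal illustrative sketch of the hybrid MPI/OpenMP pattern described above is given below; it is not the authors' code. MPI distributes the work between nodes while OpenMP threads share the load within each multicore node, so fewer MPI ranks, and therefore fewer messages, are needed. The vector length, the values, and the dot-product kernel are hypothetical placeholders standing in for the coupled thermo-hydro-mechanical solver.

/*
 * Sketch: hybrid MPI/OpenMP reduction. MPI handles inter-node
 * communication; OpenMP threads handle intra-node parallelism.
 * The data and kernel below are placeholders, not the authors' solver.
 */
#include <mpi.h>
#include <omp.h>
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char **argv)
{
    int provided;
    /* Request thread support so OpenMP regions can coexist with MPI. */
    MPI_Init_thread(&argc, &argv, MPI_THREAD_FUNNELED, &provided);

    int rank, nranks;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &nranks);

    const int n_local = 100000;          /* unknowns held by this rank (placeholder) */
    double *x = malloc(n_local * sizeof(double));
    double *d = malloc(n_local * sizeof(double));
    double local_dot = 0.0;

    /* Intra-node parallelism: OpenMP threads share the local loop,
       replacing the extra MPI ranks a pure-MPI decomposition would use. */
    #pragma omp parallel for reduction(+ : local_dot)
    for (int i = 0; i < n_local; ++i) {
        x[i] = 1.0;                      /* placeholder solution vector */
        d[i] = 2.0;                      /* placeholder diagonal entry */
        local_dot += d[i] * x[i] * x[i]; /* local contribution to x'Dx */
    }

    /* Inter-node parallelism: one collective per rank, not per core,
       which is where the communication saving comes from. */
    double global_dot = 0.0;
    MPI_Allreduce(&local_dot, &global_dot, 1, MPI_DOUBLE, MPI_SUM,
                  MPI_COMM_WORLD);

    if (rank == 0)
        printf("ranks=%d threads=%d  x'Dx=%.1f\n",
               nranks, omp_get_max_threads(), global_dot);

    free(x);
    free(d);
    MPI_Finalize();
    return 0;
}

Compiled with, for example, mpicc -fopenmp and launched with one MPI rank per node and OMP_NUM_THREADS set to the number of cores per node, this layout reproduces the communication-reducing arrangement discussed in the abstract.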

Acknowledgments

Support for the first writer from an Engineering and Physical Sciences Research Council (EPSRC) studentship, funding by the European Commission (EC) via the Prototype Repository Project (EC Contract FIKW-CT-2000-00055), and access to SKB’s high-quality data set related to the Prototype Repository project are gratefully acknowledged. Support and use of the computing facilities both at the High-Performance Computing Collaboratory (HPC2) at Mississippi State University and at Advanced Research Computing @ Cardiff (ARCCA) at Cardiff University are also acknowledged with appreciation. The writers would also like to thank the National Science Foundation for its partial support of this work through grants NSF EPS #0556308 and NSF IIP #1034897.

Information & Authors

Information

Published In

International Journal of Geomechanics
Volume 11, Issue 2, April 2011
Pages: 90–98

History

Received: Jul 3, 2009
Accepted: Feb 24, 2010
Published online: Sep 18, 2010
Published in print: Apr 1, 2011

Authors

Affiliations

Philip James Vardon
Research Fellow, Geoenvironmental Research Centre, Cardiff School of Engineering, Cardiff Univ., Cardiff, UK (corresponding author).
Peter John Cleall
Lecturer, Geoenvironmental Research Centre, Cardiff School of Engineering, Cardiff Univ., Cardiff, UK.
Hywel Rhys Thomas
Professor and Director, Geoenvironmental Research Centre, Cardiff School of Engineering, Cardiff Univ., Cardiff, UK.
Roger Norman Philp
Lecturer, Computer Science, Cardiff Univ., Cardiff, UK.
Ioana Banicescu
Professor, Dept. of Computer Science and Centre for Computational Sciences, Mississippi State Univ., Starkville, MS 39759.
