ABSTRACT
Ensuring reliability and reproducibility in computational research poses unique challenges in the supercomputing context. Specialized architectures, extensive and customized software stacks, and complex workflows all raise barriers to transparency, while established concepts such as Validation, Verification, and Uncertainty Quantification point ways forward. The topic has attracted national attention: President Obama's July 29, 2015 Executive Order "Creating a National Strategic Computing Initiative" includes accessibility and workflow capture as objectives; an XSEDE14 workshop released the report "Standing Together for Reproducibility in Large-Scale Computing"; on May 5, 2015 ACM Transactions on Mathematical Software announced its "Replicated Computational Results Initiative"; and this conference hosts a new workshop, "Numerical Reproducibility at Exascale", to name but a few examples. Against this backdrop, I will outline a research agenda to establish reproducibility and reliability as a cornerstone of scientific computing.