Scientific workflows are routinely used in most scientific disciplines today, as they provide a systematic way to execute a range of applications in science and engineering. They sit at the interface between end-users and computing infrastructures, often relying on workflow management systems and a variety of parallel and/or distributed computing resources. In addition, with the drastic increase in raw data volumes in many domains, workflows play an important role in helping scientists organize and process their data and leverage High-Performance or High-Throughput computing resources.
E-HPC: a library for elastic resource management in HPC environments
Next-generation data-intensive scientific workflows need to support streaming and real-time applications with dynamic resource needs on high performance computing (HPC) platforms. The static resource allocation model on current HPC systems that was ...
On the use of burst buffers for accelerating data-intensive scientific workflows
Science applications frequently produce and consume large volumes of data, but delivering this data to and from compute resources can be challenging, as parallel file system performance is not keeping up with compute and memory performance. To mitigate ...
rvGAHP: push-based job submission using reverse SSH connections
Computational science researchers running large-scale scientific workflow applications often want to run their workflows on the largest available compute systems to improve time to solution. Workflow tools used in distributed, heterogeneous, high ...
A compiler transformation-based approach to scientific workflow enactment
We investigate in this paper the application of compiler transformations to workflow applications using the Manycore Workflow Runtime Environment (MWRE), a compiler-based workflow environment for modern manycore computing architectures. MWRE translates ...
Supporting task-level fault-tolerance in HPC workflows by launching MPI jobs inside MPI jobs
While the use of workflows for HPC is growing, MPI interoperability remains a challenge for workflow management systems. The MPI standard and/or its implementations provide a number of ways to build multiple-programs-multiple-data (MPMD) applications. ...
Towards preserving results confidentiality in cloud-based scientific workflows
Cloud computing has established itself as a solid computational model that allows scientists to deploy their simulation-based experiments on distributed virtual resources to execute a wide range of scientific experiments. These experiments can be ...
A machine learning approach for modular workflow performance prediction
Scientific workflows provide an opportunity for declarative computational experiment design in an intuitive and efficient way. A distributed workflow is typically executed on a variety of resources and it uses a variety of computational algorithms or ...
Processing of crowd-sourced data from an internet of floating things
Raffaele Montella, Diana Di Luccio, Livia Marcellino, Ardelio Galletti, Sokol Kosta, Alison Brizius, and Ian Foster
Sensors incorporated into mobile devices provide unique opportunities to capture detailed environmental information that cannot be readily collected in other ways. We show here how data from networked navigational sensors on leisure vessels can be used ...
Index Terms
- Proceedings of the 12th Workshop on Workflows in Support of Large-Scale Science