DOI: 10.1145/1273360
WORKS '07: Proceedings of the 2nd workshop on Workflows in support of large-scale science
ACM 2007 Proceeding
  • General Chairs:
  • Ewa Deelman,
  • Ian Taylor
Publisher:
  • Association for Computing Machinery, New York, NY, United States
Conference:
HPDC07: International Symposium on High Performance Distributed Computing, Monterey, California, USA, 25 June 2007
ISBN:
978-1-59593-715-5
Published:
25 June 2007
Abstract

Welcome to the 2nd Workshop on Workflows in Support of Large-Scale Science (WORKS'07). Since the 1st WORKS workshop, interest in workflow technologies has continued to grow, and workflows remain a key technology for enabling large-scale science applications. Workflows enable scientists to design complex applications composed of individual application components or services. Often, these components and services are designed, developed, and tested collaboratively. Because of the size of the data and the complexity of the analysis, large amounts of shared resources such as clusters and storage systems are used to store the data sets and execute the workflows. The process of workflow design and execution in a distributed environment can be very complex, involving the mapping of high-level workflow descriptions onto the available resources as well as monitoring and debugging of the subsequent execution. Because computations and data access operations are performed on shared resources, there is increased interest in the fair allocation and management of those resources at the workflow level.

Adequate workflow descriptions are needed to support the complex workflow management process that includes workflow creation, reuse, and modifications made to the workflow over time---for example modifications to the individual components. Additional annotations may provide guidelines and requirements for resource mapping and execution.

Large-scale scientific applications impose requirements on the workflow systems. Besides the magnitude of data processed by the workflow components, the resulting and intermediate data need to be annotated with provenance information and any other information needed to evaluate the quality of the data and support the repeatability of the analysis.

The Workshop on Workflows in Support of Large-Scale Science focuses on the entire workflow lifecycle including the workflow composition, mapping, and robust execution, as well as workflow applications.

During the 2nd WORKS meeting, papers spanning a range of workflow topics will be presented. Among them are: real-time workflow systems, graphical workflow composition, distributed workflow caching in P2P environments, workflow automation, workflow-based applications and semantic authoring tools.

Article
myExperiment: social networking for workflow-using e-scientists

We present the Taverna workflow workbench and argue that scientific workflow environments need a rich ecosystem of tools that support the scientists' experimental lifecycle. Workflows are scientific objects in their own right, to be exchanged and ...

Article
GRIDCC: real-time workflow system

The Grid is a concept which allows the sharing of resources between distributed communities, allowing each to progress towards potentially different goals. As adoption of the Grid increases, so do the activities that people wish to conduct through it. ...

Article
Cache for workflows

This paper discusses the motivation for the design of a decentralised data-caching scheme for Internet scale distributed computing applications. We provide three target applications in distributed cycle sharing applications, music information retrieval ...

SESSION: Adaptation and integration
Article
Integrating existing scientific workflow systems: the Kepler/Pegasus example

Scientific workflows have become an important tool used by scientists to conduct large-scale analysis in distributed environments. Today there are a variety of workflow systems that provide an often disjoint set of capabilities and expose different ...

Article
Workflow adaptation as an autonomic computing problem

The performance of long running scientific workflows stands to benefit from adapting to changes in their environment. Autonomic Computing provides methodologies for managing run-time adaptations in managed systems. In this paper, we apply the monitoring,...

SESSION: Workflow applications and models
Article
Workflow automation for processing plasma fusion simulation data

The Center for Plasma Edge Simulation project aims to automate the tedious tasks of monitoring the simulation, archiving and post-processing the output. This paper describes the tasks and requirements, the several components developed within the Kepler ...

Article
Supporting large-scale science with workflows

Current workflow systems support data flow through complex analyses in a distributed environment. The scope of scientific workflow systems could expand to support the entire scientific research cycle, which includes data flow, design flow, and knowledge ...

Article
On the black art of designing computational workflows

Computational workflows have recently emerged as an effective paradigm to manage large-scale distributed scientific computations. Workflow systems can automate many execution-level details and provide assistance in composing and validating workflows. ...

SESSION: Short papers
Article
WS-VLAM: towards a scalable workflow system on the grid

Large scale scientific applications require extensive support from middleware and frameworks that provide the capabilities for distributed execution in the Grid environment. In particular, one of the examples of such frameworks is a Grid-enabled ...

Article
A semantic workflow authoring tool for programming grids

Workflows have an increasing role in scientific applications programming and business environment development because they are an effective technology to define composition of different pieces of knowledge, both in the application domain and in the ...

Article
A workflow approach to designed reservoir study

Reservoir simulations are commonly used to predict the performance of oil and gas reservoirs, taking into account a myriad of uncertainties in the geophysical structure of the reservoir as well as operational factors such as well location. Designed ...

Contributors
  • University of Southern California
  • Cardiff University

Acceptance Rates

Overall Acceptance Rate: 30 of 54 submissions, 56%

Year        Submitted  Accepted  Rate
WORKS '17       25         8      32%
WORKS '15       13         9      69%
WORKS '13       16        13      81%
Overall         54        30      56%