Welcome to WORKS11

Data-intensive workflows (a.k.a. scientific workflows) are a key technology for setting up large-scale data analysis experiments in all scientific areas, exploiting the capabilities of large-scale distributed and parallel computing infrastructures. Workflows enable scientists to design complex analyses composed of individual application components or services, and such components and services are often designed, developed, and tested collaboratively. On the large-scale computing infrastructures routinely used for e-Science today, workflow management systems provide both a formal description of distributed processes and an engine to enact applications composed of a wealth of concurrent processes.

The Sixth Workshop on Workflows in Support of Large-Scale Science focuses on the entire workflow lifecycle, including workflow composition, mapping, robust execution, and the recording of provenance information. The workshop also welcomes contributions in the applications area, from which requirements on workflow management systems can be derived.

Related Workshops: