Generic Components of the eScience Infrastructure Ecosystem
Monday, 29 October 2018
At the high-energy and cosmic frontiers of physics, the challenges we face in dealing with measured and reference data sets come as familiar twins: size and complexity. By combining the quantitative statistical methods of HEP with generic pattern-matching algorithms, machine learning, and the advances in vectorisation that benefit both GPU and massively multi-processor programming, we stand a chance of dealing with the next generation of experiments - provided we manage the complex task of integrating these generic components into the large existing scientific code base and sustaining them for the decades to come: as much of a challenge as developing these (very promising) methods and tools in the first place!
Session 1 (09.00 - 10.30): building software as an infrastructure
- 09.10 Sustainable software as an infrastructure - Oxana Smirnova, Lund University
- 09.40 An infrastructure for file distribution: evolving from software distribution to data - Radu Popescu, CERN
- 10.10 SoftDrive.nl: end-user managed software distribution with CVMFS - Dennis van Dok, Nikhef
Session 2 (11.00 - 12.30): building the components
- 11.00 Collaborative analysis at a global scale with SWAN: Jupyter-as-a-service - Massimo Lamanna, CERN
- 11.30 Reproducible data, reproducible analysis: today, tomorrow, next decade - Tibor Simko, CERN
- 12.00 Getting researchers to the data: data from the lake - Jaroslava Schovancová, CERN
Session 3 (13.30 - 15.00): scaling the services for the next decade
- 13.30 Building next decade's infrastructure: by yourself, or public in the HNSciCloud? - Martin Brandt, SURFsara and the HNSciCloud project
- 14.00 Infrastructures for Research: the science of deployability and operations - Maurice Bouwhuis, SURFsara
- 14.30 Plenary discussion: what mechanisms do we need to put in place to sustain our eScience infrastructure over the next decades?
The Generic Components workshop is organised by the Nikhef Physics Data Processing group for advanced computing for research.