Physics today would be inconceivable without computing and intensive data processing. From the analysis of LHC data, more than 30 petabytes per year, to the long-term archival of unique results and the calculation of exact theoretical predictions, data processing is all around us. And that data processing is collaborative, global, and at the edge of what is technically feasible — and sometimes just beyond it.
The Nikhef Data-Processing Facility (NDPF) offers federated and local high-throughput compute and data services. It provides the Netherlands Tier-1 service for the LHC, is a main node for XENON and Virgo data, and serves Dutch science through the Dutch National e-Infrastructure coordinated by SURF. It also provides the infrastructure for Stoomboot, the Nikhef analysis cluster and high-throughput storage environment.
Provisioned on top of a terabit-speed network, connected globally, and embedded in our NikhefHousing data centre, the facility combines hardware systems with an underlying cloud platform, so it can follow demand and accommodate 'exotic' use cases. For unique experiments, we work with many systems and network vendors to challenge the limits of tomorrow's computing.
We study the integration of ICT infrastructure (computing systems, networks, and storage); the implementation methodology that lets algorithms exploit high-throughput and high-performance computing and storage; and the secure collaboration mechanisms that enable this infrastructure to operate as a collective, coherent, and reliable ecosystem.
The Nikhef PDP group is happy to talk to you! Contact us by email (email@example.com), or visit us at Nikhef.