Today, computers and supercomputers are essential tools for physics research, used to analyse experimental data and to implement theoretical models of phenomena that are often too complex to investigate with conventional analytical tools. Thanks to their capacity to perform complex calculations very quickly, the supercomputers available to many cutting-edge experiments can efficiently tackle a wide range of problems. For example, they can model the behaviour of fluids in extreme conditions, simulate particle collisions at very high energy, or study the formation dynamics of galaxies.
The fields of application go, of course, well beyond fundamental physics. Today, supercomputers are crucial, for example, for investigating the behaviour of so-called complex systems. Their applications range from studying pandemics to finance, and from meteorology and medicine to climate modelling.
INFN has always been at the forefront of managing advanced IT resources and developing high-performance computing infrastructure, both for fundamental physics experiments and research, and for applications. In particular, in 2023 INFN proposed the National Research Centre in High Performance Computing, Big Data and Quantum Computing, one of the five national centres envisaged by Italy’s National Recovery and Resilience Plan (PNRR). The centre has 51 founding members spread across the country, drawn from the public and private sectors, from the world of scientific research and from industry. It has the threefold goal of building Italian supercomputing infrastructure, aggregating research and innovation resources in the sectors most strategic for the country, and positioning itself as a national platform for supporting scientific and industrial initiatives.
Quantum computers have always been considered a kind of “holy grail” of technological innovation, capable of revolutionising the IT world and all of society thanks to their applications.
The increase in the sensitivity and efficiency of physics experiments, combined with the use of more and more technologically advanced electronics, has led to an explosion in the quantity of data collected during experiments in recent decades.
For many years, INFN has developed its own infrastructure dedicated to scientific computing. Both the analysis of data produced by the big experiments and theoretical simulations require substantial computing power, vast storage capacity, and ultra-fast networks.
The integration of machine learning – i.e. the capacity of a machine to learn and improve its performance with experience – into the modelling of physical systems and the analysis of experimental data is offering very interesting opportunities (in some cases, revolutionary ones) for the scientific community.
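The idea of a machine improving with experience can be illustrated with a minimal, purely hypothetical sketch (not INFN code): a model repeatedly adjusts its estimate of a physical parameter – here the spring constant in Hooke's law, F = kx – from noisy synthetic measurements, getting closer to the true value with each pass over the data.

```python
import random

# Illustrative sketch: "learning" the spring constant k in Hooke's law
# F = k * x from noisy synthetic measurements via gradient descent.
# The estimate improves with experience, i.e. with each pass over the data.
random.seed(0)
true_k = 3.5
data = [(0.1 * i, true_k * 0.1 * i + random.gauss(0, 0.1))
        for i in range(1, 51)]

k = 0.0    # initial guess for the spring constant
lr = 0.02  # learning rate (step size of each adjustment)
for epoch in range(200):
    # gradient of the mean squared error with respect to k
    grad = sum(2 * (k * x - f) * x for x, f in data) / len(data)
    k -= lr * grad

print(f"learned k = {k:.2f}")  # close to the true value 3.5
```

Real applications in physics replace this toy linear fit with deep neural networks trained on simulated or experimental events, but the principle – iterative improvement driven by data – is the same.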
Since the dawn of the scientific investigation of natural phenomena, the construction of simplified models of physical reality has been an essential tool for studying more or less complex systems.
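A classic, minimal example of such a simplification: a pendulum's exact equation of motion is nonlinear, but in the small-angle approximation it reduces to simple harmonic motion, which a computer can integrate step by step and compare with the known analytic solution. The sketch below is illustrative only, with parameter values chosen arbitrarily.

```python
import math

# Simplified pendulum model: replacing sin(theta) with theta
# (small-angle approximation) gives simple harmonic motion,
# integrated here with explicit Euler steps.
g, L = 9.81, 1.0          # gravity (m/s^2) and pendulum length (m)
omega = math.sqrt(g / L)  # angular frequency of the simplified model

theta, vel = 0.1, 0.0     # initial angle (rad) and angular velocity (rad/s)
dt, t = 1e-4, 0.0         # time step and clock (s)
while t < 1.0:
    acc = -omega**2 * theta  # restoring acceleration of the linear model
    theta += vel * dt
    vel += acc * dt
    t += dt

# Analytic solution of the simplified model at t = 1 s
analytic = 0.1 * math.cos(omega * 1.0)
print(f"numerical: {theta:.4f}  analytic: {analytic:.4f}")
```

The numerical result tracks the analytic one closely; the same step-by-step approach still works when the simplifying assumption is dropped and no closed-form solution exists, which is precisely where computers become indispensable.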