Since the dawn of the scientific investigation of natural phenomena, the construction of simplified models of physical reality has been an essential tool for studying systems of varying complexity. Typically, models capture the essential features of a system, making it possible to explore the laws that govern its behaviour. While inevitably an approximation of reality, a good model must describe, in a sufficiently precise and rigorous way, the system it is intended to represent. Today, the availability of highly advanced IT and computing tools offers researchers the opportunity to model a large number of complex systems with an extraordinary level of detail, both for studying fundamental physics problems and for applications.
One fundamental area is that of computational simulations, which, thanks to powerful supercomputers, allow the behaviour of physical systems to be recreated in detail. This is very useful, for example, for simulating “in advance” the possible results of an experiment, thus guiding the definition of experimental goals, and then for comparing the simulation’s predictions with actual data. Simulations are important in theoretical physics too. The theory of strong interactions, QCD, discretized on a lattice of space-time points, can be simulated to obtain numerical predictions of physical processes that cannot otherwise be calculated.
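Lattice QCD itself requires dedicated supercomputers, but the general idea of simulating a theory discretized on a lattice can be conveyed with a toy example. The sketch below, a minimal Metropolis Monte Carlo for the two-dimensional Ising model, is purely illustrative: the lattice size, coupling, and number of sweeps are arbitrary choices, and it stands in for, rather than reproduces, an actual lattice-QCD code.

```python
# Toy illustration of lattice Monte Carlo sampling (2D Ising model).
# This is NOT lattice QCD; it only sketches the idea of generating
# configurations on a discretized lattice and measuring observables.
# All parameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)
L = 16          # lattice size (L x L sites)
beta = 0.44     # inverse temperature, near the 2D Ising critical point
n_sweeps = 200  # number of Metropolis sweeps

spins = rng.choice([-1, 1], size=(L, L))

def sweep(spins):
    """One Metropolis sweep: attempt L*L single-spin flips."""
    for _ in range(L * L):
        i, j = rng.integers(0, L, size=2)
        # sum of the four nearest neighbours (periodic boundaries)
        nn = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j] +
              spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
        dE = 2.0 * spins[i, j] * nn          # energy change if flipped
        if dE <= 0 or rng.random() < np.exp(-beta * dE):
            spins[i, j] *= -1                # accept the flip

magnetisation = []
for _ in range(n_sweeps):
    sweep(spins)
    magnetisation.append(abs(spins.mean()))  # a simple observable

print(f"average |m| over {n_sweeps} sweeps: {np.mean(magnetisation):.3f}")
```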
Another key aspect of simulations is the capacity to explore scenarios that cannot be reproduced experimentally. A large part of research in astrophysics is based on simulations. These may model, for example, the evolution of galaxies across very long time scales, enabling researchers to study processes that are otherwise impossible to observe directly. Another very important example of the use of models and simulations is found in atmospheric physics and climatology, where the integration of a vast range of atmospheric, terrestrial, and oceanic data may make it possible to predict climate changes over time and their potential impacts on ecological and social systems.
Quantum computers have always been considered a kind of “holy grail” of technological innovation, capable of revolutionising the IT world and, through their applications, society as a whole.
Today, computers and supercomputers are essential tools for physics research: they are used to analyse experimental data and to implement theoretical models for studying phenomena that are often very complex and impossible to investigate with conventional analytical tools.
The increase in the sensitivity and efficiency of physics experiments, combined with the use of increasingly advanced electronics, has led to an explosion in the quantity of data collected during experiments in recent decades.
For many years, INFN has been developing its own infrastructure dedicated to scientific computing: both the analysis of data produced by the large experiments and theoretical simulations require significant computing power, large storage capacity, and ultra-fast networks.
The integration of automatic learning, or machine learning – i.e. the capacity of a machine to learn and improve its performance with experience – into the modelling of physical systems and the analysis of experimental data is offering very interesting, and in some cases revolutionary, opportunities to the scientific community.
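As a purely illustrative sketch of how machine learning can enter data analysis, the example below trains a standard classifier (scikit-learn’s GradientBoostingClassifier) to separate synthetic “signal” from “background” events. The features, sample sizes, and distributions are invented for the illustration and do not correspond to any real experiment.

```python
# Minimal, illustrative sketch of machine learning in data analysis:
# a classifier separating synthetic "signal" from "background" events.
# Feature choices and parameters are hypothetical, not from a real experiment.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
n = 5000

# Two toy features drawn from slightly different distributions
# for signal and background events.
signal = rng.normal(loc=[1.0, 0.5], scale=1.0, size=(n, 2))
background = rng.normal(loc=[0.0, 0.0], scale=1.0, size=(n, 2))

X = np.vstack([signal, background])
y = np.concatenate([np.ones(n), np.zeros(n)])  # 1 = signal, 0 = background

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

# Train the classifier and evaluate how well it ranks signal above background.
clf = GradientBoostingClassifier().fit(X_train, y_train)
scores = clf.predict_proba(X_test)[:, 1]
print(f"ROC AUC on the test set: {roc_auc_score(y_test, scores):.3f}")
```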