The thesis can be entered by the author, by an experiment manager, or by the scientific secretariats.
For any problem of a technical nature, write to Supporto Web INFN.

NEW: It is strictly forbidden to enter the supervisor's name in place of the thesis author's in order to bypass registration in the personnel database. Theses entered this way will be considered invalid and removed from the system.

The thesis author must be present in the centralized personnel database (GODiVA).
Search for the thesis author by surname using the dedicated button, and if necessary add them via the proposed link. Basic personal data and the type/number of an identity document are required.
Last updated 21 Jan 2018
Author
Riccardo Poggi
Sex M
Experiment ATLAS
Type Master's degree (Laurea Magistrale)
Destination after conferral of the degree PhD (abroad)
University Università di Pavia
INFN structure/Institution Pavia
Title Hadronic tau trigger performance study and process management evolution plan for the data acquisition system of the ATLAS experiment at LHC
Abstract Yearning for an understanding of the universe, physicists from all over the world have combined theories and discoveries into the Standard Model (SM) of particle physics, which successfully describes the electromagnetic, weak, and strong interactions. The SM is a paradigm of quantum field theory and has demonstrated enormous and continued success over the years, but at the same time it fails to accommodate important experimental results. Altogether there are clear signs of the presence of new physics Beyond the Standard Model (BSM). Supersymmetry (SUSY) is a theoretical extension of the SM that introduces new supersymmetric particles to offer a possible Dark Matter candidate and to address the sensitivity of the Higgs potential to new physics outside the SM (the hierarchy problem). In 2015, after a scheduled shutdown, the LHC resumed proton collisions at a centre-of-mass energy of √s = 13 TeV and is foreseen to run for many years ahead with steadily increasing luminosity and pile-up conditions. This is a great opportunity for BSM research, as it opens up sensitivity to as yet unexplored parts of the SUSY phase space. The ATLAS experiment is collecting and analysing the high volume of experimental data produced at the LHC. The ATLAS Trigger and Data Acquisition (TDAQ) system is a critical hardware and software component, which holds the very challenging task of selecting the few interesting interactions concealed within a huge background by accepting approximately one in O(10^5) events almost in real time. Contrary to offline data analyses, which can be improved upon and executed multiple times, data not selected by the trigger are lost forever. Consequently, it is crucial to maximise the trigger efficiency for the physics signal without introducing uncontrolled biases.
Furthermore, as the LHC evolves over the years towards harsher experimental conditions, the trigger decision times are expected to increase and selections will have to become more and more stringent, at the risk of sacrificing physics sensitivity. Which strategies will be adopted to cope with the new physical and technical challenges? This thesis presents actions that were taken on different levels to help solve this problem, both in the current operations and in plans for future upgrades. The first step is to optimise the trigger selection algorithms towards the needs of the offline analysis; at the same time, it is essential for the offline analysis to know which specific trigger selections would better shape its signal region. A trigger strategy was outlined that would benefit present searches for direct top squark pair production with final states consisting of missing transverse energy, b-jets, and two hadronically decaying tau leptons. The performance of a ditau trigger was compared to that of a pure missing transverse energy trigger on a Monte Carlo sample. The results suggested using two orthogonal signal regions, both making optimal use of the two independent triggers. The previous analysis in this channel at √s = 8 TeV used only one signal region, and the new approach would lead to a significant gain in signal significance for the new version of this analysis at √s = 13 TeV. The trigger algorithms for the reconstruction and selection of tau leptons are among the most demanding in terms of execution time and processing power. While optimisation of the trigger strategies may help in mitigating the effects of TDAQ-imposed limitations, it is also important to keep pushing these boundaries. A key role in the efficient selection of interesting physics events is played by the ATLAS trigger computing cluster.
In 2015 and 2016, as part of this thesis, a software package was developed and successfully operated for the DAQ commissioning of approximately 1200 server machines out of the 2000 that today compose the computing cluster. This tool automated the setup, monitoring, and testing stages of the overall validation procedure. A series of tests reproducing the characteristic Data Flow of the ATLAS experiment was executed. The package is now part of the ATLAS TDAQ software, alongside exhaustive documentation, and will also be used when new batches of machines are installed in the future. This project was recognised by the ATLAS collaboration, which, after one year of qualification work, granted me authorship of its general scientific publications. The ATLAS TDAQ computing cluster comprises approximately 3000 machines and 30000 processes, which in a coordinated manner provide the data-taking functionality of the overall experiment. One of the pillars of the current infrastructure is the Process Manager (PMG): a service responsible for the creation, termination, and management of all DAQ processes across the whole cluster. As part of this thesis and my one-year project as a Technical Student at CERN, different approaches to upgrading the PMG for Phase-II have been evaluated. A survey of seven possible open-source projects for the replacement of the PMG was conducted. No candidate was able to fulfil the strict DAQ requirements. Therefore, options on how to modify the PMG were evaluated. A design was chosen, and a series of tests demonstrated its proof of concept. Future plans include further development and full integration with the ATLAS TDAQ system.
Enrolment year 2014
Date of award 28 Apr 2017
Place of award Pavia
Supervisor(s)
Andrea Negri
File PDF
170428.TesiPoggi.pdf
File PS