Moderators
COMCHA: Quantum Computing, Simulations, PID & Advanced Techniques
- Luca Fiorini (IFIC / U. Valencia - CSIC)
- Arantza Oyanguren (IFIC - Valencia)
COMCHA: Institutes, Infrastructures and Entities
- Arantza Oyanguren (IFIC - Valencia)
- Luca Fiorini (IFIC / U. Valencia - CSIC)
COMCHA: Detector Design, Reconstruction and Analysis
- Luca Fiorini (IFIC / U. Valencia - CSIC)
- Arantza Oyanguren (IFIC - Valencia)
Future experiments such as the HL-LHC plan to reach unprecedented energies and amounts of data in the search for physics beyond the Standard Model, producing about $10^{10}$ tracks per second or more and thus pushing the data challenge to new frontiers when processing these events at the various stages of the experimental pipeline.
Within this context, the use of Quantum Computing for this type of fundamental...
The identification of anomalous events, not explained by the Standard Model of particle physics, and the possible discovery of exotic physical phenomena pose significant theoretical, experimental, and computational challenges. These challenges are expected to grow substantially with the operation of next-generation colliders, such as the High-Luminosity Large Hadron Collider...
The generation of hard-scattering events in high-energy physics, such as the process $gg \to t\bar{t}g$, is one of the computational bottlenecks in collider phenomenology. MadGraph provides a flexible framework to evaluate these matrix elements, but the sheer scale of Monte Carlo event production required at the LHC drives both execution time and power consumption to critical levels. In this...
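As an illustrative aside on where the cost accrues, the sketch below times a batch of per-event matrix-element evaluations in Python; `matrix_element` is a hypothetical placeholder for MadGraph-generated code, not the actual MadGraph API, and the phase-space points are dummies.

```python
import random
import time

def matrix_element(momenta):
    # Hypothetical stand-in for a MadGraph-generated |M|^2 evaluation
    # for gg -> t tbar g; real code would come from MadGraph's
    # standalone output, not from this placeholder arithmetic.
    return sum(p[0] for p in momenta) ** 2

def benchmark(n_events=10_000):
    # Dummy phase-space points: five external legs, one (E, px, py, pz)
    # four-momentum each.
    events = [[[random.random() for _ in range(4)] for _ in range(5)]
              for _ in range(n_events)]
    start = time.perf_counter()
    for momenta in events:
        matrix_element(momenta)
    elapsed = time.perf_counter() - start
    print(f"{elapsed / n_events * 1e6:.2f} us per event")

if __name__ == "__main__":
    benchmark()
```

Scaled to the billions of events in an LHC Monte Carlo campaign, even microseconds per evaluation translate into the execution-time and power budgets discussed above.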
For the HL-LHC era, the Phase-2 CMS upgrade includes a full replacement of the trigger and data acquisition system. The upgraded readout electronics will support a maximum Level-1 (L1) accept rate of 750 kHz with a latency of 12.5 µs. The muon trigger is implemented as a multi-layered system that reconstructs and measures muon momenta by correlating signals from different muon chambers within...
We present the reconstruction and identification of the main decay modes of the $\tau$ lepton at the electron-positron Future Circular Collider (FCC-ee), using the CLD detector, in one of the first FCC-ee studies based on a realistic full detector simulation. Using simulated data from the $e^+e^- \rightarrow Z \rightarrow \tau^+\tau^-$ process, different...
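As a hedged illustration of the baseline technique, the main hadronic decay modes are commonly categorized by counting reconstructed charged hadrons (prongs) and $\pi^0$ candidates associated with the $\tau$ candidate; the function below is a minimal sketch under that assumption, not the analysis code used in the study.

```python
# Minimal prong-counting sketch: assign a tau decay-mode category from
# the number of reconstructed charged hadrons and pi0 candidates.
# The inputs are assumed to come from an upstream reconstruction step.
def tau_decay_mode(n_charged: int, n_pi0: int) -> str:
    if n_charged == 1:
        return "1-prong" if n_pi0 == 0 else f"1-prong + {n_pi0} pi0"
    if n_charged == 3:
        return "3-prong" if n_pi0 == 0 else f"3-prong + {n_pi0} pi0"
    return "other"

print(tau_decay_mode(1, 1))  # -> "1-prong + 1 pi0", i.e. a tau -> rho nu topology
```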
The LHCb experiment relies on a two-level trigger system to efficiently select events of interest among the vast number of proton-proton collisions that occur at the LHC. In this work, we present a proof-of-concept study exploring the integration of an autoencoder into the High Level Trigger 2 (HLT2) as a novel strategy for event selection. Autoencoders, as unsupervised machine learning...
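A minimal sketch of the autoencoder idea follows: the network is trained to reconstruct ordinary events, and a large reconstruction error flags an event as anomalous. The input dimension, architecture, and selection threshold are illustrative assumptions, not the HLT2 configuration.

```python
import torch
import torch.nn as nn

# Illustrative autoencoder whose reconstruction error serves as an
# anomaly score for event selection.
class AutoEncoder(nn.Module):
    def __init__(self, n_features=20, latent_dim=4):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(n_features, 64), nn.ReLU(),
            nn.Linear(64, latent_dim),
        )
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 64), nn.ReLU(),
            nn.Linear(64, n_features),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

def anomaly_scores(model, events):
    # Events the model reconstructs poorly (high per-event MSE) are
    # candidates for selection as anomalous.
    with torch.no_grad():
        reco = model(events)
        return ((events - reco) ** 2).mean(dim=1)

model = AutoEncoder()                   # in practice: trained on standard events
events = torch.randn(1000, 20)          # stand-in for preprocessed event features
scores = anomaly_scores(model, events)
selected = events[scores > scores.quantile(0.99)]  # keep the most anomalous 1%
```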
Compton imaging has long been constrained by intrinsic limitations in sensitivity, resolution, and computational efficiency. Traditional reconstruction methods, largely based on analytic backprojection or iterative schemes, often fail to fully exploit the complex statistical and structural information contained in the measured data. These deficiencies translate into blurred images, loss of...
We present a novel deep learning approach, inclusive flavor tagging (IFT), to determine the production flavor of B mesons at LHCb. This technique is designed to overcome the performance limitations of classical taggers in current and future conditions, where luminosity and event track multiplicity increase. The IFT utilizes state-of-the-art deep learning models to process...
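One common way to process an unordered, variable-length set of tracks is a DeepSets-style model: each track is embedded independently, the embeddings are pooled with a permutation-invariant sum, and a classification head predicts the flavor. The sketch below is a hedged illustration with assumed feature counts and layer sizes, not the actual IFT architecture.

```python
import torch
import torch.nn as nn

# Illustrative set-based tagger: per-track embedding, permutation-
# invariant pooling, then a binary flavor decision.
class SetFlavourTagger(nn.Module):
    def __init__(self, n_track_features=8, embed_dim=32):
        super().__init__()
        self.track_net = nn.Sequential(
            nn.Linear(n_track_features, 64), nn.ReLU(),
            nn.Linear(64, embed_dim),
        )
        self.head = nn.Sequential(
            nn.Linear(embed_dim, 32), nn.ReLU(),
            nn.Linear(32, 1),  # logit for B vs. anti-B production flavor
        )

    def forward(self, tracks, mask):
        # tracks: (batch, max_tracks, n_features); mask: (batch, max_tracks),
        # 1 where a real track exists and 0 for padding.
        h = self.track_net(tracks) * mask.unsqueeze(-1)
        pooled = h.sum(dim=1)  # order of tracks does not matter
        return self.head(pooled).squeeze(-1)

tagger = SetFlavourTagger()
tracks = torch.randn(4, 30, 8)             # 4 events, up to 30 tracks each
mask = (torch.rand(4, 30) > 0.3).float()   # toy padding mask
logits = tagger(tracks, mask)
```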
This talk summarises the software and computing activities of the LHCb UB group.
The Center for Astroparticles and High Energy Physics (CAPA), recently recognized as a Research Institute of the University of Zaragoza, is an interdisciplinary research group encompassing high-energy, nuclear, and particle physics, as well as astrophysics, cosmology, astroparticle physics, theoretical physics, and related technological developments. Progress in these research areas poses new...
We summarize the software work performed by the HEP group of the University of A Coruña. This includes LHCb Real-Time Analysis, offline analysis on GPUs, flavor tagging, green algorithms, quantum computing, reconstruction software for HyperK, and data compression for KOTO.
This talk summarises the software and computing activities of the High-Low team at IFIC.
The ROOT project is an open-source, modular scientific software toolkit for data analysis, developed at CERN primarily for high-energy physics. The project can help address the future computing challenges posed by the HL-LHC and other scientific experiments.
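As a brief usage illustration, ROOT's RDataFrame provides a declarative analysis interface from Python; the file, tree, and branch names below ("data.root", "events", "pt") are assumptions made for the example.

```python
import ROOT

# Declarative RDataFrame analysis: filter events, then book a histogram.
# The event loop runs lazily, only when the result is first accessed.
df = ROOT.RDataFrame("events", "data.root")
hist = (df.Filter("pt > 20.0", "high-pT selection")
          .Histo1D(("h_pt", "Transverse momentum;p_{T} [GeV];events",
                    100, 0.0, 200.0), "pt"))

canvas = ROOT.TCanvas()
hist.Draw()                 # triggers the event loop
canvas.SaveAs("pt.png")
```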
IRIS-HEP is a software institute funded by the National Science Foundation. It is developing the state-of-the-art software cyberinfrastructure required for the challenges of data-intensive scientific research at the High-Luminosity Large Hadron Collider (HL-LHC) at CERN and other planned HEP experiments of the 2020s.
The HEP Software Foundation (HSF) is an international community that facilitates cooperation and common efforts in high energy physics (HEP) software and computing. Its goal is to help developers and users create, discover, and use common software, while also supporting the career development of software and computing specialists.
An optimization framework is presented for a Parallel-Plate Avalanche Counter (PPAC) with Optical Readout for heavy-ion tracking and imaging. In a previous work, a differentiable optimization framework was developed in which a surrogate model predicted reconstructed positions of impinging charged particles as a function of detector parameters. This approach is extended by introducing a...
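A minimal sketch of the surrogate idea: once a differentiable surrogate maps detector parameters to an expected reconstruction error, the parameters themselves can be tuned by gradient descent through the surrogate. The parameter names, network shape, and objective below are illustrative assumptions, not the framework described above.

```python
import torch
import torch.nn as nn

# Surrogate mapping 3 detector parameters to a predicted position-
# reconstruction error. In practice it would be trained on simulation;
# here it stands in untrained, purely for illustration.
surrogate = nn.Sequential(
    nn.Linear(3, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 1),
)

# Hypothetical parameters, e.g. gap width, gas pressure, optical gain.
params = torch.tensor([1.0, 0.5, 2.0], requires_grad=True)
opt = torch.optim.Adam([params], lr=1e-2)

for step in range(200):
    opt.zero_grad()
    predicted_error = surrogate(params)[0]  # differentiable proxy for resolution
    predicted_error.backward()              # gradients flow to the detector parameters
    opt.step()

print(params.detach())  # candidate optimized detector configuration
```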
The High-Luminosity upgrade of the LHC will increase the collision rate by a factor of five, resulting in dense environments with dozens of overlapping interactions. Within this context, the LHCb Upgrade II and its next-generation electromagnetic calorimeter, the PicoCal, will face major challenges in the accurate energy reconstruction of photons, electrons, and neutral pions. To address these...
Graph Neural Networks (GNNs) have become promising candidates for particle reconstruction and identification in high-energy physics, but their computational complexity makes them challenging to deploy in real-time data processing pipelines. In the next-generation LHCb calorimeter, detector hits, characterized by energy, position, and timing, can be naturally encoded as node features, with...
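As a hedged sketch of that encoding, the block below hand-rolls a single message-passing layer over hits with (energy, x, y, time) node features and a toy edge list; it illustrates the technique, with all sizes assumed, and is not the LHCb model.

```python
import torch
import torch.nn as nn

# One message-passing layer: build messages on edges from source and
# target node features, aggregate them per target node, then update
# each node's embedding.
class HitMessagePassing(nn.Module):
    def __init__(self, n_features=4, hidden=32):
        super().__init__()
        self.message = nn.Sequential(nn.Linear(2 * n_features, hidden), nn.ReLU())
        self.update = nn.Sequential(nn.Linear(n_features + hidden, hidden), nn.ReLU())

    def forward(self, x, edge_index):
        # x: (n_hits, n_features); edge_index: (2, n_edges) of node indices.
        src, dst = edge_index
        msgs = self.message(torch.cat([x[src], x[dst]], dim=1))
        agg = torch.zeros(x.size(0), msgs.size(1)).index_add_(0, dst, msgs)
        return self.update(torch.cat([x, agg], dim=1))

hits = torch.randn(50, 4)                    # energy, x, y, time per hit
edge_index = torch.randint(0, 50, (2, 200))  # stand-in for a kNN adjacency
embeddings = HitMessagePassing()(hits, edge_index)
```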
The aim of this contribution is to present a comprehensive ML framework compiled after a period of applying ML/DL methods to physics analysis in the ATLAS experiment. From a technical and organizational point of view, we address the use of different ML/DL libraries, the management of relevant computing infrastructures, the processing of different kinds of datasets, etc. Another...
Imaging Atmospheric Cherenkov Telescopes (IACTs) rely on the electromagnetic calorimetry technique to record gamma rays of cosmic origin. They therefore use combined analog and digital electronics for their trigger systems, implementing simple but fast algorithms; such trigger techniques are dictated by the extremely high data rates and strict timing requirements. In recent years, a design of...
This work focuses on adapting LHCb's existing machine learning approach for electron identification to the upgraded conditions of Run 3. The LHCb experiment at CERN focuses on precision measurements in heavy-flavour physics, where efficient particle identification (PID) is crucial. During the last LHC Long Shutdown (2019-2022), a major upgrade of the detector was carried out, including a fully-software...