Difference between revisions of "Master Projects"
Line 127: | Line 127: | ||
Contact: ''[mailto:zwolffs@nikhef.nl Zef Wolffs], [mailto:mvozak@cern.ch Matouš Vozák], [mailto:b.kortman@nikhef.nl Bryan Kortman] and [mailto:Ivo.van.Vulpen@nikhef.nl Ivo van Vulpen]'' | Contact: ''[mailto:zwolffs@nikhef.nl Zef Wolffs], [mailto:mvozak@cern.ch Matouš Vozák], [mailto:b.kortman@nikhef.nl Bryan Kortman] and [mailto:Ivo.van.Vulpen@nikhef.nl Ivo van Vulpen]'' | ||
− | === ATLAS: HGTD | + | === ATLAS: A new timing detector - the HGTD === |
− | The ATLAS is going to get a new ability: a | + | The ATLAS is going to get a new ability: a timing detector. This allows us to reconstruct tracks not only in the 3 dimensions of space but adds the ability of measuring very precisely also the time (at picosecond level) at which the particles pass the sensitive layers of the HGTD detector. This allows to construct the trajectories of the particles created at the LHC in 4 dimensions and ultimately will lead to a better reconstruction of physics at ATLAS. The new HGTD detector is still in construction and work needs to be done on different levels such as understanding the detector response (taking measurements in the lab and performing simulations) or developing algorithms to reconstruct the particle trajectories (programming and analysis work). With this work you will be part of the Atlas group and/or the Fast Timing detector group together with the R&D department at Nikhef. |
− | In this project there are two | + | '''In this project there are two possibilities:''' |
One can choose to either focus on the impact on physics performance by studying how the timing measurements can be included in the reconstruction of tracks, and what effect this has on how much better we can understand the physical processes occurring in the particles produced in the LHC collisions. | One can choose to either focus on the impact on physics performance by studying how the timing measurements can be included in the reconstruction of tracks, and what effect this has on how much better we can understand the physical processes occurring in the particles produced in the LHC collisions. |
Revision as of 10:49, 17 March 2023
Master Thesis Research Projects
The following Master thesis research projects are offered at Nikhef. If you are interested in one of these projects, please contact the coordinator listed with the project.
Projects with a 2023 start
ALICE: The next-generation multi-purpose detector at the LHC
This main goal of this project is to focus on the next-generation multi-purpose detector planned to be built at the LHC. Its core will be a nearly massless barrel detector consisting of truly cylindrical layers based on curved wafer-scale ultra-thin silicon sensors with MAPS technology, featuring an unprecedented low material budget of 0.05% X0 per layer, with the innermost layers possibly positioned inside the beam pipe. The proposed detector is conceived for studies of pp, pA and AA collisions at luminosities a factor of 20 to 50 times higher than possible with the upgraded ALICE detector, enabling a rich physics program ranging from measurements with electromagnetic probes at ultra-low transverse momenta to precision physics in the charm and beauty sector.
Contact: Panos Christakoglou and Alessandro Grelli and Marco van Leeuwen
ALICE: Searching for the strongest magnetic field in nature
In a non-central collision between two Pb ions, with a large value of impact parameter, the charged nucleons that do not participate in the interaction (called spectators) create strong magnetic fields. A back of the envelope calculation using the Biot-Savart law brings the magnitude of this filed close to 10^19Gauss in agreement with state of the art theoretical calculation, making it the strongest magnetic field in nature. The presence of this field could have direct implications in the motion of final state particles. The magnetic field, however, decays rapidly. The decay rate depends on the electric conductivity of the medium which is experimentally poorly constrained. Overall, the presence of the magnetic field, the main goal of this project, is so far not confirmed experimentally.
Contact: Panos Christakoglou
ALICE: Looking for parity violating effects in strong interactions
Within the Standard Model, symmetries, such as the combination of charge conjugation (C) and parity (P), known as CP-symmetry, are considered to be key principles of particle physics. The violation of the CP-invariance can be accommodated within the Standard Model in the weak and the strong interactions, however it has only been confirmed experimentally in the former. Theory predicts that in heavy-ion collisions, in the presence of a deconfined state, gluonic fields create domains where the parity symmetry is locally violated. This manifests itself in a charge-dependent asymmetry in the production of particles relative to the reaction plane, what is called the Chiral Magnetic Effect (CME). The first experimental results from STAR (RHIC) and ALICE (LHC) are consistent with the expectations from the CME, however further studies are needed to constrain background effects. These highly anticipated results have the potential to reveal exiting, new physics.
Contact: Panos Christakoglou
ALICE: Machine learning techniques as a tool to study the production of heavy flavour particles
There was recently a shift in the field of heavy-ion physics triggered by experimental results obtained in collisions between small systems (e.g. protons on protons). These results resemble the ones obtained in collisions between heavy ions. This consequently raises the question of whether we create the smallest QGP droplet in collisions between small systems. The main objective of this project will be to study the production of charm particles such as D-mesons and Λc-baryons in pp collisions at the LHC. This will be done with the help of a new and innovative technique which is based on machine learning (ML). The student will also extend the studies to investigate how this production rate depends on the event activity e.g. on how many particles are created after every collision.
Contact: Panos Christakoglou and Alessandro Grelli
ALICE: Search for new physics with the most sensitive vertex tracker at the LHC
With the newly installed Inner Tracking System consisting fully of monolithic detectors, ALICE is very sensitive to particles with low transverse momenta, more so than ATLAS and CMS. With the ALICE time projection chamber (TPC), the energy loss dE/dx can be measured quite accurately, also enabling searches for heavy stable charged particles. This project will search for new physics beyond the standard model in ALICE, such as new particles leaving disappearing tracks or magnetic monopoles, that appear as highly ionizing particles in for example the ALICE TPC.
Contact: Jory Sonneveld and Panos Christakoglou
ATLAS: The Higgs boson's self-coupling
The coupling of the Higgs boson to itself is one of the main unobserved interactions of the Standard Model and its observation is crucial to understand the shape of the Higgs potential. Here we propose to study the 'ttHH' final state: two top quarks and two Higgs bosons produced in a single collision. This topology is yet unexplored at the ATLAS experiment and the project consists of setting up the new analysis (including multivariate analysis techniques to recognise the complicated final state), optimising the sensitivity and including the result in the full ATLAS study of the Higgs boson's coupling to itself. With the LHC data from the upcoming Run-3, we might be able to see its first glimpses!
Contact: Tristan du Pree and Carlo Pandini
ATLAS: Triple-Higgs production as a probe of the Higgs potential
So far, the investigation of Higgs self-couplings (the coupling of the Higgs boson to itself) at the LHC has focused on the measurement of the Higgs tri-linear coupling λ3 mainly through direct double-Higgs production searches. In this research project we propose the investigation of Higgs tri-linear and quartic coupling parameters λ3 and λ4, via a novel measurement of triple-Higgs production at the LHC (HHH) with the ATLAS experiment. While in the SM these parameters are expected to be identical, only a combined measurement can provide an answer regarding how the Higgs potential is realised in Nature. Processes in which three Higgs bosons are produced simultaneously are extremely rare, and very difficult to measure and disentangle from background. In this project we plan to investigate different decay channels (to bottom quarks and tau leptons), and to study advanced machine learning techniques to reconstruct such a complex hadronic final state. This kind of processes is still quite unexplored in ATLAS, and the goal of this project is to put the basis for the first measurement of HHH production at the LHC.
Furthermore, we'd like to study the possible implication of a precise measurement of the self-coupling parameters from HHH production from a phenomenological point of view: what could be the impact of a deviation in the HHH measurements on the big open questions in physics (for instance, the mechanisms at the root of baryogenesis)?
Contact: Tristan du Pree and Carlo Pandini
ATLAS: The Next Generation
After the observation of the coupling of Higgs bosons to fermions of the third generation, the search for the coupling to fermions of the second generation is one of the next priorities for research at CERN's Large Hadron Collider. The search for the decay of the Higgs boson to two charm quarks is very new [1] and we see various opportunities for interesting developments. For this project we propose improvements in reconstruction (using exclusive decays) and advanced analysis techiques (using deep learning methods).
[1]https://atlas.cern/updates/briefing/charming-Higgs-decay
Contact: Tristan du Pree
ATLAS: Searching for new particles in very energetic diboson production
The discovery of new phenomena in high-energy proton–proton collisions is one of the main goals of the Large Hadron Collider (LHC). New heavy particles decaying into a pair of vector bosons (WW, WZ, ZZ) are predicted in several extensions to the Standard Model (e.g. extended gauge-symmetry models, Grand Unified theories, theories with warped extra dimensions, etc). In this project we will investigate new ideas to look for these resonances in promising regions. We will focus on final states where both vector bosons decay into quarks, or where one decays into quarks and one into leptons. These have the potential to bring the highest sensitivity to the search for Beyond the Standard Model physics [1, 2]. We will try to reconstruct and exploit new ways to identify vector bosons (using machine learning methods) and then tackle the problem of estimating contributions from beyond the Standard Model processes in the tails of the mass distribution.
[1] https://arxiv.org/abs/1906.08589
[2] https://arxiv.org/abs/2004.14636
Contact: Flavia de Almeida Dias, Robin Hayes, Elizaveta Cherepanova and Dylan van Arneman
ATLAS: Top-quark and Higgs-boson analysis combination, and Effective Field Theory interpretation [not available in 2023]
We are looking for a master student with interest in theory and data-analysis in the search for physics beyond the Standard Model in the top-quark and Higgs-boson sectors.
Your master-project starts just at the right time for preparing the Run-3 analysis of the ATLAS experiment at the LHC. In Run-3 (2022-2026), three times more data becomes available, enabling analysis of rare processes with innovative software tools and techniques.
This project aims to explore the newest strategy to combine the top-quark and Higgs-boson measurements in the perspective of constraining the existence of new physics beyond the Standard Model (SM) of Particle Physics. We selected the pp->tZq and gg->HZ processes as promising candidates for a combination to constrain new physics in the context of Standard Model Effective Field Theory (SMEFT). SMEFT is the state-of-the-art framework for theoretical interpretation of LHC data. In particular, you will study the SMEFT OtZ and Ophit operators, which are not well constrained by current measurements.
Besides affinity with particle physics theory, the ideal candidate for this project has developed python/C++ skills and is eager to learn advanced techniques. You start with a simulation of the signal and background samples using existing software tools. Then, an event selection study is required using Machine Learning techniques. To evaluate the SMEFT effects, a fitting procedure based on the innovative Morphing technique is foreseen, for which the basic tools in the ROOT and RooFit framework are available. The work is carried out in the ATLAS group at Nikhef and may lead to an ATLAS note.
Contact: Geoffrey Gilles and Wouter Verkerke and Marcel Vreeswijk
ATLAS: Machine learning to search for very rare Higgs decays
Since the Higgs boson discovery in 2012 at the ATLAS experiment, the investigation of the properties of the Higgs boson has been a priority for research at the Large Hadron Collider (LHC). However, there are still a many open questions: Is the Higgs boson the only origin of Electroweak Symmetry Breaking? Is there a mechanism which can explain the observed mass pattern of SM particles? Many of these questions are linked to the Higgs boson coupling structure.
While the Higgs boson coupling to fermions of the third generation has been established experimentally, the investigation of the Higgs boson coupling to the light fermions of the second generation will be a major project for the upcoming data-taking period (2022-2025). The Higgs boson decay to muons is the most sensitive channel for probing this coupling. In this project, you will optimize the event selection for Higgs boson decays to muons in the Vector Boson Fusion (VBF) production channel with a focus on distinguishing signal events from background processes like Drell-Yan and electroweak Z boson production. For this purpose, you will develop, implement and validate advanced machine learning and deep learning algorithms.
Contact: Oliver Rieger and Wouter Verkerke and Peter Kluit
ATLAS: Interpretation of experimental data using SMEFT
The Standard Model Effective Field Theory (SMEFT) provides a systematic approach to test the impact of new physics at the energy scale of the LHC through higher-dimensional operators. The interpretation of experimental data using SMEFT requires a particular interest in solving complex technical challenges, advanced statistical techniques, and a deep understanding of particle physics. We would be happy to discuss different project opportunities based on your interests with you.
Contact: Oliver Rieger and Wouter Verkerke
ATLAS: Reconstructing tracks from particle physics detector hits with state-of-the-art machine learning techniques
This project concerns the application of new machine learning techniques to tackle the problem of track reconstruction at the ATLAS detector in CERN. While algorithms to construct particle tracks from low-level detector information such as particle hits and timestamps have been around for decades, recent developments in the field of machine learning open up new opportunities to improve these algorithms significantly. Some recent developments that could help in this context include graph-based neural networks, which embed the input data in the format of a graph and as such have the capability to enhance underlying correlations within events. Transformer neural networks are a particular extension of graph-based neural networks proposed only in 2017 which could also provide helpful in this case. Another option would be to build upon some of the work done within the field of computer vision and see if image segmentation networks can help solve this problem. There are a range of available options and this project includes the freedom for the student to choose particular types of networks, but more explicit guidance could be provided in case it is desired.
In this project the student will develop and compare the performance of various machine learning models to initially reconstruct tracks from simplified test data. Upon successful completion of this, simulated data from the actual ATLAS detector can be analysed as well in the scope of this project. The student will need some familiarity with programming in python and an interest in machine learning, but a physics background is not required. In this project the student will be able to contribute to fundamental physics research and will familiarize themselves with state-of-the-art machine learning models.
Contact: Zef Wolffs, Matouš Vozák and Ivo van Vulpen
ATLAS: New machine learning approaches to target Higgs interference signatures in LHC data
In this project we aim to improve an ongoing analysis to determine the lifetime of the Higgs Boson through state-of-the-art machine learning techniques, in particular by addressing a novel solution to an as of yet unsolved fundamental problem in modeling quantum interference. While the Higgs is an elusive particle that generally only appears in physics processes with small cross sections, its signature can be amplified in the Large Hadron Collider (LHC) through quantum interference with larger background (non-Higgs) processes. This is the effect that the Higgs’ lifetime analysis relies on to be able to measure the relevant Higgs signature. A fundamental physics modelling problem arises though in the simulation of individual events for this interference due to the fact that these events are in reality described by a superposition of underlying Higgs and non-Higgs processes.
Since machine learning models in particle physics are typically trained to characterise individual physics events, the fact that interference events cannot currently be generated is a significant problem when interference is the target. In the currently existing Higgs lifetime analysis, a machine learning model was trained which instead focuses only on the explicit Higgs-mediated processes as a proxy, which is suboptimal. The aim of this project is to improve upon this current machine learning strategy used in this analysis by implementing either of the inference-aware approaches suggested in [1] and [2]. The idea behind these inference-aware machine learning algorithms is that they do not optimise for a simplified goal such as the loss function which is common in traditional machine learning, but rather for the end-goal of the analysis. In this case, this would omit the need for interference event generation altogether and allow the machine learning models to be trained optimally regardless.
The first checkpoint of this project is to use either of the frameworks used in [1] and [2] (which are both publicly available) and run them with a simplified dataset from the aforementioned analysis. After this proof-of-principle is achieved, the next goal would be to actually implement the newly developed machine learning models in the full analysis and to determine the improvement upon the existing result. A successful completion of these tasks would not only benefit the Higgs lifetime analysis, but would be an important stepping stone to future developments to make machine learning approaches also aware of other hard to model effects such as systematic uncertainties. Finally, there are further options to improve this analysis such as the generation of actual interference training data, which could be attempted in case the primary project finishes earlier than expected.
[1] De Castro, P., & Dorigo, T. (2019). INFERNO: inference-aware neural optimisation. Computer Physics Communications, 244, 170-179.
[2] Simpson, N., & Heinrich, L. (2023, February). neos: End-to-end-optimised summary statistics for high energy physics. In Journal of Physics: Conference Series (Vol. 2438, No. 1, p. 012105). IOP Publishing.
Contact: Zef Wolffs, Matouš Vozák and Ivo van Vulpen
ATLAS: Development of state-of-the art modeling techniques to generate Higgs interference events
In this project we aim to improve an ongoing analysis to determine the lifetime of the Higgs Boson through new event generation strategies, in particular by addressing a novel solution to an as of yet unsolved fundamental problem in modeling quantum interference. While the Higgs is an elusive particle that generally only appears in physics processes with small cross sections, its signature can be amplified in the Large Hadron Collider (LHC) through quantum interference with larger background (non-Higgs) processes. This is the effect that the Higgs’ lifetime analysis relies on to be able to measure the relevant Higgs signature. A fundamental physics modelling problem arises though in the simulation of individual events for this interference due to the fact that these events are in reality described by a superposition of underlying Higgs and non-Higgs processes.
The current approach to deal with this problem is to ignore the interference in analysis optimization and instead optimize only for explicitly Higgs mediated processes, but this severely impacts analysis performance. In the context of Effective Field Theories (EFT) however, a similar problem arises and has been solved for simple (leading order) processes. In this project we plan to take the machinery developed for EFT and apply it to the Higgs lifetime analysis. Furthermore, with the recent development of a Next-to-Leading Order (NLO) Higgs event generation tool [1] a subsequent goal would be to use this to also generate interference at the NLO level. Successful completion of this project would lead to a much improved analysis result, significantly constraining the lifetime of the Higgs Boson. Besides, the techniques developed would almost certainly be used in future analyses on Large Hadron Collider (LHC) run 3 data.
[1] Alioli, S., Ravasio, S. F., Lindert, J. M., & Röntsch, R. (2021). Four-lepton production in gluon fusion at NLO matched to parton showers. The European Physical Journal C, 81(8), 687.
Contact: Zef Wolffs, Matouš Vozák, Bryan Kortman and Ivo van Vulpen
ATLAS: Approaching the Higgs from a new direction: Constraining new physics with off shell Higgs data from the LHC
The Heisenberg uncertainly principle allows for all elementary particles---including the Higgs Boson---to momentarily disobey the fundamental energy-momentum relation, allowing the particle in question to have a significantly larger mass than usual. A description of the Higgs Boson in this state (“off shell Higgs Boson”) can provide a portal to the discovery of potential new physics, albeit very difficult to do due to its infrequent appearance. The goal of this project is to constrain or hint at new physics by estimating parameters of a generalized model which allows for new physics, Effective Field Theory (EFT), using off shell Higgs data.
Most of the underlying analysis to measure the prevalence of off shell Higgs bosons has already been set up, so the goal of this project is to do the aforementioned EFT interpretation on top of this existing analysis. From a theoretical point of view much of the groundwork has also been done on simulated data which showed the potential for this EFT interpretation to constrain new physics [1]. Being on the interface between experimental and theoretical physics this project allows the student to gain a deeper understanding of both, furthermore its successful completion could be one of the first hints towards as of yet not understood physics.
[1] Azatov, A., de Blas, J., Falkowski, A., Gritsan, A. V., Grojean, C., Kang, L., ... & Vryonidou, E. (2022). Off-shell Higgs Interpretations Task Force: Models and Effective Field Theories Subgroup Report. arXiv preprint arXiv:2203.02418.
Contact: Zef Wolffs, Matouš Vozák, Bryan Kortman and Ivo van Vulpen
ATLAS: A new timing detector - the HGTD
The ATLAS is going to get a new ability: a timing detector. This allows us to reconstruct tracks not only in the 3 dimensions of space but adds the ability of measuring very precisely also the time (at picosecond level) at which the particles pass the sensitive layers of the HGTD detector. This allows to construct the trajectories of the particles created at the LHC in 4 dimensions and ultimately will lead to a better reconstruction of physics at ATLAS. The new HGTD detector is still in construction and work needs to be done on different levels such as understanding the detector response (taking measurements in the lab and performing simulations) or developing algorithms to reconstruct the particle trajectories (programming and analysis work). With this work you will be part of the Atlas group and/or the Fast Timing detector group together with the R&D department at Nikhef.
In this project there are two possibilities:
One can choose to either focus on the impact on physics performance by studying how the timing measurements can be included in the reconstruction of tracks, and what effect this has on how much better we can understand the physical processes occurring in the particles produced in the LHC collisions.
The second possibility is to focus on the working of the fast-timing detectors themselves and participate in measurements in the lab and in test-beam campaigns at CERN.
If you are interested, contact me to discuss the possibilities. Contact: Hella Snoek
Cosmic Rays/Neutrinos: Seasonal muon flux variations and the pion/kaon ratio
The KM3NeT ARCA and ORCA detectors, located kilometers deep in the Mediterranean Sea, have neutrinos as primary probes. Muons from cosmic ray interactions reach the detectors in relatively large quantities too. These muons, exploiting the capabilities and location of the detectors allow the study of cosmic rays and their interactions. In this way, questions about their origin, type, propagation can be addressed. In particular these muons are tracers of hadronic interactions at energies inaccessible at particle accelerators.
The muons reaching the depths of the detectors result from decays of mesons, mostly pions and kaons, created in interactions of high-energy cosmic rays with atoms in the upper atmosphere. Seasonal changes of the temperature – and thus density - profile of the atmosphere modulate the balance between the probability for these mesons to decay (producing muons) or to re-interact. Pions and kaons are affected differently, allowing to extract their production ratio by determining how changes in muon rate depend on changes in the effective temperature – an integral over the atmospheric temperature profile weighted by a depth dependent meson production rate.
In this project, the aim is to measure the rate of muons in the detectors and to calculate the effective temperature above the KM3NeT detectors from atmospheric data, both as function of time. The relation between these two can be used to extract the pion to kaon ratio.
Contact: Ronald Bruijn
Dark Matter: Building better Dark Matter Detectors - the XAMS R&D Setup
The Amsterdam Dark Matter group operates an R&D xenon detector at Nikhef. The detector is a dual-phase xenon time-projection chamber and contains about 0.5kg of ultra-pure liquid xenon in the central volume. We use this detector for the development of new detection techniques - such as utilizing our newly installed silicon photomultipliers - and to improve the understanding of the response of liquid xenon to various forms of radiation. The results could be directly used in the XENONnT experiment, the world’s most sensitive direct detection dark matter experiment at the Gran Sasso underground laboratory, or for future Dark Matter experiments like DARWIN. We have several interesting projects for this facility. We are looking for someone who is interested in working in a laboratory on high-tech equipment, modifying the detector, taking data and analyzing the data themselves You will "own" this experiment.
Contact: Patrick Decowski and Auke Colijn
Dark Matter: Searching for Dark Matter Particles - XENONnT Data Analysis
The XENON collaboration has used the XENON1T detector to achieve the world’s most sensitive direct detection dark matter results and is currently operating the XENONnT successor experiment. The detectors operate at the Gran Sasso underground laboratory and consist of so-called dual-phase xenon time-projection chambers filled with ultra-pure xenon. Our group has an opening for a motivated MSc student to do analysis with the new data coming from the XENONnT detector. The work will consist of understanding the detector signals and applying a deep neural network to improve the (gas-) background discrimination in our Python-based analysis tool to improve the sensitivity for low-mass dark matter particles. The work will continue a study started by a recent graduate. There will also be opportunity to do data-taking shifts at the Gran Sasso underground laboratory in Italy.
Contact: Patrick Decowski and Auke Colijn
Dark Matter: Signal reconstruction and correction in XENONnT
XENONnT is a low background experiment operating at the INFN - Gran Sasso underground laboratory with the main goal of detecting Dark Matter interactions with xenon target nuclei. The detector, consisting of a dual-phase time projection chamber, is filled with ultra-pure xenon, which acts as a target and detection medium. Understanding the detector's response to various calibration sources is a mandatory step in exploiting the scientific data acquired. This MSc thesis aims to develop new methods to improve the reconstruction and correction of scintillation/ ionization signals from calibration data. The student will work with modern analysis techniques (python-based) and will collaborate with other analysts within the international XENON Collaboration.
Contact: Maxime Pierre, Patrick Decowski
Dark Matter: The Ultimate Dark Matter Experiment - DARWIN Sensitivity Studies
DARWIN is the “ultimate” direct detection dark matter experiment, with the goal to reach the so-called “neutrino floor”, when neutrinos become a hard-to-reduce background. The large and exquisitely clean xenon mass will allow DARWIN to also be sensitive to other physics signals such as solar neutrinos, double-beta decay from Xe-136, axions and axion-like particles etc. While the experiment will only start in 2027, we are in the midst of optimizing the experiment, which is driven by simulations. We have an opening for a student to work on the GEANT4 Monte Carlo simulations for DARWIN. We are also working on a “fast simulation” that could be included in this framework. It is your opportunity to steer the optimization of a large and unique experiment. This project requires good programming skills (Python and C++) and data analysis/physics interpretation skills.
Contact: Tina Pollmann, Patrick Decowski or Auke Colijn
Dark Matter: Exploring new background sources for DARWIN
Experiments based on the xenon dual-phase time projection chamber detection technology have already demonstrated their leading role in the search for Dark Matter. The unprecedented low level of background reached by the current generation, such as XENONnT, allows such experiments to be sensitive to new rare-events physics searches, broadening their physics program. The next generation of experiments is already under consideration with the DARWIN observatory, which aims to surpass its predecessors in terms of background level and mass of xenon target. With the increased sensitivity to new physics channels, such as the study of neutrino properties, new sources of backgrounds may arise. This MSc thesis aims to investigate potential new sources of background for DARWIN and is a good opportunity for the student to contribute to the design of the experiment. This project will rely on Monte Carlo simulation tools such as GEANT4 and FLUKA, and good programming skills (Python and C++) are advantageous.
Contact: Maxime Pierre, Patrick Decowski
Dark Matter: Sensitive tests of wavelength-shifting properties of materials for dark matter detectors
Rare event search experiments that look for neutrino and dark matter interactions are performed with highly sensitive detector systems, often relying on scintillators, especially liquid noble gases, to detect particle interactions. Detectors consist of structural materials that are assumed to be optically passive, and light detection systems that use reflectors, light detectors, and sometimes, wavelength-shifting materials. MSc theses are available related to measuring the efficiency of light detection systems that might be used in future detectors. Furthermore, measurements to ensure that presumably passive materials do not fluoresce, at the low level relevant to the detectors, can be done. Part of the thesis work can include Monte Carlo simulations and data analysis for current and upcoming dark matter detectors, to study the effect of different levels of desired and nuisance wavelength shifting. In this project, students will acquire skills in photon detection, wavelength shifting technologies, vacuum systems, UV and extreme-UV optics, detector design, and optionally in Python and C++ programming, data analysis, and Monte Carlo techniques.
Contact: Tina Pollmann
Detector R&D: Studies of wafer-scale sensors for ALICE detector upgrade and beyond
One of the biggest milestones of the ALICE detector upgrade (foreseen in 2026) is the implementation of wafer-scale (~ 28 cm x 18 cm) monolithic silicon active pixel sensors in the tracking detector, with the goal of having truly cylindrical barrels around the beam pipe. To demonstrate such an unprecedented technology in high energy physics detectors, few chips will be soon available in Nikhef laboratories for testing and characterization purposes. The goal of the project is to contribute to the validation of the samples against the ALICE tracking detector requirements, with a focus on timing performance in view of other applications in future high energy physics experiments beyond ALICE. We are looking for a student with a focus on lab work and interested in high precision measurements with cutting-edge instrumentation. You will be part of the Nikhef Detector R&D group and you will have, at the same time, the chance to work in an international collaboration where you will report about the performance of these novel sensors. There may even be the opportunity to join beam tests at CERN or DESY facilities. Besides interest in hardware, some proficiency in computing is required (Python or C++/ROOT).
Contact: Jory Sonneveld , Roberto Russo
Detector R&D: Time resolution of monolithic silicon detectors
Monolithic silicon detectors based on industrial Complementary Metal Oxide Semiconductor (CMOS) processes offer a promising approach for large scale detectors due to their ease of production and low material budget. Until recently, their low radiation tolerance has hindered their applicability in high energy particle physics experiments. However, new prototypes ~~such as the one in this project~~ have overcome these hurdles, making them feasible candidates for future experiments in high energy particle physics. Achieving the required radiation tolerance has brought the spatial and temporal resolution of these detectors to the forefront. In this project, you will investigate the temporal performance of a radiation hard monolithic detector prototype, using laser setups in the laboratory. You will also participate in meetings with the international collaboration working on this detector, where you will report on the prototype's performance. Depending on the progress of the work, there may be a chance to participate in test beams performed at the CERN accelerator complex and a first full three dimensional characterization of the prototypes performance using a state-of-the-art two-photon absorption laser setup at Nikhef. This project is looking for someone interested in working hands on with cutting edge detector and laser systems at the Nikhef laboratory. Python programming skills and linux experience are an advantage.
Contact: Jory Sonneveld, Uwe Kraemer
Detector R&D: Improving a Laser Setup for Testing Fast Silicon Pixel Detectors
For the upgrades of the innermost detectors of experiments at the Large Hadron Collider in Geneva, in particular to cope with the large number of collisions per second from 2027, the Detector R&D group at Nikhef tests new pixel detector prototypes with a variety of laser equipment with several wavelengths. The lasers can be focused down to a small spot to scan over the pixels on a pixel chip. Since the laser penetrates the silicon, the pixels will not be illuminated by just the focal spot, but by the entire three dimensional hourglass or double cone like light intensity distribution. So, how well defined is the volume in which charge is released? And can that be made much smaller than a pixel? And, if so, what would the optimum focus be? For this project the student will first estimate the intensity distribution inside a sensor that can be expected. This will correspond to the density of released charge within the silicon. To verify predictions, you will measure real pixel sensors for the LHC experiments. This project will involve a lot of 'hands on work' in the lab and involve programming and work on unix machines.
Contact: Martin Fransen
Detector R&D: Time resolution of the hybrid pixel detectors with the Timepix4 chip
Precise time measurements with silicon pixel detectors are very important for experiments at the High-Luminosity LHC and the future circular collider. The spatial resolution of current silicon trackers will not be sufficient to distinguish the large number of collisions that will occur within individual bunch crossings. In a new method, typically referred to as 4D tracking, spatial measurements of pixel detectors will be combined with time measurements to better distinguish collision vertices that occur close together. New sensor technologies are being explored to reach the required time measurement resolution of tens of picoseconds, and the results are promising. However, the signals that these pixelated sensors produce have to be processed by front-end electronics, which hence also play a role in the total time resolution of the detector. An important contribution comes from the systematic differences between the front-end electronics of different pixels. Many of these systematic effects can be corrected by performing detailed calibrations of the readout electronics. To achieve the required time resolution at future experiments, it is vital that these effects are understood and corrected. In this project you will be working with the Timepix4 chip. This is a so-called application specific integrated circuit (ASIC) that is designed to read out pixelated sensors. This ASIC will be used extensively in detector R&D for the characterisation of new sensor technologies requiring precise timing (< 50 ps). In order to do so, it is necessary to first study the systematic differences between the pixels, which you will do using a laser setup in our lab. This will be combined with data analysis of proton beam measurements, or with measurements performed using the built-in test-pulse mechanism of the Timepix4 ASIC. Your work will enable further research performed with this ASIC, and serve as input to the design and operation of future ASICs for experiments at the High-Luminosity LHC.
Contact: Kevin Heijhoff and Martin van Beuzekom
Detector R&D: Performance studies of Trench Isolated Low Gain Avalanche Detectors (TI-LGAD)
The future vertex detector of the LHCb Experiment needs to measure the spatial coordinates and time of the particles originating in the LHC proton-proton collisions with resolutions better than 10 um and 50 ps, respectively. Several technologies are being considered to achieve these resolutions. Among those is a novel sensor technology called Trench Isolated Low Gain Avalanche Detector. Prototype pixelated sensors have been manufactured recently and have to be characterised. Therefore these new sensors will be bump bonded to a Timepix4 ASIC which provides charge and time measurements in each of 230 thousand pixels. Characterisation will be done using a lab setup at Nikhef, and includes tests with a micro-focused laser beam, radioactive sources, and possibly with particle tracks obtained in a test-beam. This project involves data taking with these new devices and analysing the data to determine the performance parameters such as the spatial and temporal resolution. as function of temperature and other operational conditions.
Contacts: Kazu Akiba and Martin van Beuzekom
Detector R&D: Other projects
Are you looking for a slightly different project? Are the above projects already taken? Are you coming in at an unusual time of the year? Do not hesitate to contact us! We always have new projects coming up at different times in the year and we are open to your ideas.
Contact: Jory Sonneveld
FCC: The Next Collider
After the LHC, the next planned large collider at CERN is the proposed 100 kilometer circular collider "FCC". In the first stage of the project, as a high-luminosity electron-positron collider, precision measurements of the Higgs boson are the main goal. One of the channels that will improve by orders of magnitude at this new accelerator is the decay of the Higgs boson to a pair of charm quarks. This project will estimate a projected sensitivity for the coupling of the Higgs boson to second generation quarks, and in particular target the improved reconstruction of the topology of long-lived mesons in the clean environment of a precision e+e- machine.
Contact: Tristan du Pree
Gravitational Waves: Computer modelling to design the laser interferometers for the Einstein telescope
A new field of instrument science led to the successful detection of gravitational waves by the LIGO detectors in 2015. We are now preparing the next generation of gravitational wave observatories, such as the Einstein Telescope, with the aim to increase the detector sensitivity by a factor of ten, which would allow, for example, to detect stellar-mass black holes from early in the universe when the first stars began to form. This ambitious goal requires us to find ways to significantly improve the best laser interferometers in the world.
Gravitational wave detectors, such as LIGO and VIRGO, are complex Michelson-type interferometers enhanced with optical cavities. We develop and use numerical models to study these laser interferometers, to invent new optical techniques and to quantify their performance. For example, we synthesize virtual mirror surfaces to study the effects of higher-order optical modes in the interferometers, and we use opto-mechanical models to test schemes for suppressing quantum fluctuations of the light field. We can offer several projects based on numerical modelling of laser interferometers. All projects will be directly linked to the ongoing design of the Einstein Telescope.
Contact: Andreas Freise
LHCb: Search for light dark particles
The Standard Model of elementary particles does not contain a proper Dark Matter candidate. One of the most tantalizing theoretical developments is the so-called Hidden Valley models: a mirror-like copy of the Standard Model, with dark particles that communicate with standard ones via a very feeble interaction. These models predict the existence of dark hadrons – composite particles that are bound similarly to ordinary hadrons in the Standard Model. Such dark hadrons can be abundantly produced in high-energy proton-proton collisions, making the LHC a unique place to search for them. Some dark hadrons are stable like a proton, which makes them excellent Dark Matter candidates, while others decay to ordinary particles after flying a certain distance in the collider experiment. The LHCb detector has a unique capability to identify such decays, particularly if the new particles have a mass below ten times the proton mass.
This project assumes a unique search for light dark hadrons that covers a mass range not accessible to other experiments. It assumes an interesting program on data analysis (python-based) with non-trivial machine learning solutions and phenomenology research using fast simulation framework. Depending on the interest, there is quite a bit of flexibility in the precise focus of the project.
Contact: Andrii Usachov
LHCb: Searching for dark matter in exotic six-quark particles
Three quarters of the mass in the Universe is of unknown type. Many hypotheses about this dark matter have been proposed, but none confirmed. Recently it has been proposed that it could be made of particles made of the six quarks uuddss, which would be a Standard-Model solution to the dark matter problem. This idea has recently gained credibility as many similar multi-quarks states are being discovered by the LHCb experiment. Such a particle could be produced in decays of heavy baryons, or directly in proton-proton collisions. The anti-particle, made of six antiquarks, could be seen when annihilating with detector material. It is also proposed to use Xi_b baryons produced at LHCb to search for such a state where the state would appear as missing 4-momentum in a kinematically constrained decay. The project consists in defining a selection and applying it to LHCb data. See arXiv:2007.10378.
Contact: Patrick Koppenburg
LHCb: Measuring lepton flavour universality with excited Ds states in semileptonic Bs decays
One of the most striking discrepancies between the Standard Model and measurements are the lepton flavour universality (LFU) measurements with tau decays. At the moment, we have observed an excess of 3-4 sigma in B → Dτν decays. This could point even to a new force of nature! To understand this discrepancy, we need to make further measurements.
One very exciting (pun intended) projects to verify these discrepancies involves measuring the Bs → Ds2*τν and/or Bs → Ds1*τν decays. These decays with excited states of the Ds meson have not been observed before in the tau decay mode, and have a unique way of coupling to potential new physics candidates that can only be measured in Bs decays [1]. See slides for more detail: File:LHCbLFUwithExcitedDs.pdf
[1] https://arxiv.org/abs/1606.09300
Contact: Suzanne Klaver
LHCb: New physics in the angular distributions of B decays to K*ee
Lepton flavour violation in B decays can be explained by a variety of non-standard model interactions. Angular distributions in decays of a B meson to a hadron and two leptons are an important source of information to understand which model is correct. Previous analyses at the LHCb experiment have considered final states with a pair of muons. Our LHCb group at Nikhef concentrates on a new measurement of angular distributions in decays with two electrons. The main challenge in this measurement is the calibration of the detection efficiency. In this project you will confront estimates of the detection efficiency derived from simulation with decay distributions in a well known B decay. Once the calibration is understood, the very first analysis of the angular distributions in the electron final state can be performed.
Contact: Mara Soares and Wouter Hulsbergen
LHCb: Discovering heavy neutrinos in B decays
Neutrinos are the lightest of all fermions in the standard model. Mechanisms to explain their small mass rely on the introduction of new, much heavier, neutral leptons. If the mass of these new neutrinos is below the b-quark mass, they can be observed in B hadron decays.
In this project we search for the decay of B+ mesons in into an ordinary electron or muon and the yet undiscovered heavy neutrino. The heavy neutrino is expected to be unstable and in turn decay quickly into a charged pion and another electron or muon. The final state in which the two leptons differ in flavour, "B+ to e mu pi", is particularly interesting: It is forbidden in the standard model, such that backgrounds are small. The analysis will be performed within the LHCb group at Nikhef using LHCb run-2 data.
LHCb: Scintillating Fibre tracker software
The installation of the scintillating-fibre tracker in LHCb’s underground cavern was recently completed. This detector uses 10000 km of fibres to track particle trajectories in the LHCb detector when the LHC starts up again later this year. The light emitted by the scintillating fibres when a particle interacts with them is measured using photon multiplier tubes. The studies proposed for this project will focus on software, and could include writing a framework to monitor the detector output, improving the detector simulation or working on the data processing.
Contact: Emmy Gabriel
LHCb: Vertex detector calibration
In summer 2022 LHCb has started data taking will an almost entirely new detector. At the point closest to the interaction point, the trajectories of charge particles are reconstructed with a so-called silicon pixel detector. The design hit resolution of this detector is about 15 micron. However, to actually reach this resolution a precise calibration of the spatial positions of the silicon sensors needs to be performed. In this project, you will use the first data of the new LHCb detector to perform this calibration and measure the detector performance.
Contact: Wouter Hulsbergen
Neutrinos: Neutrino scattering: the ultimate resolution
Neutrino telescopes like IceCube and KM3NeT aim at detecting neutrinos from cosmic sources. The neutrinos are detected with the best resolution when charged current interactions with nucleons produce a muon, which can be detected with high accuracy (depending on the detector). A crucial ingredient in the ultimate achievable pointing accuracy of neutrino telescopes is the scattering angle between the neutrino and the muon. While published computations have investigated the cross-section of the process in great detail, this important scattering angle has not received much attention. The aim of the project is to compute and characterize the distribution of this angle, and that the ultimate resolution of a neutrino telescope. If successful, the results of this project can lead to publication of interest to the neutrino telescope community.
Depending on your interests, the study could be based on a first-principles calculation (using the deep-inelastic scattering formalism), include state-of-the-art parton distribution functions, and/or exploit existing event-generation software for a more experimental approach.
Contacts: Aart Heijboer
Neutrinos: acoustic detection of ultra-high energy neutrinos
The study of the cosmic neutrinos of energies above 1017 eV, the so-called ultra-high energy neutrinos, provides a unique view on the universe and may provide insight in the origin of the most violent astrophysical sources, such as gamma ray bursts, supernovae or even dark matter. In addition, the observation of high energy neutrinos may provide a unique tool to study interactions at high energies. The energy deposition of these extreme neutrinos in water induce a thermo-acoustic signal, which can be detected using sensitive hydrophones. The expected neutrino flux is however extremely low and the signal that neutrinos induce is small. TNO is presently developing sensitive hydrophone technology based on fiber optics. Optical fibers form a natural way to create a distributed sensing system. Using this technology a large scale neutrino telescope can be built in the deep sea. TNO is aiming for a prototype hydrophone which will form the building block of a future telescope.
The work will be executed at the Nikhef institute and/or the TNO laboratories in Delft. In this project master students have the opportunity to contribute in the following ways:
Project 1: Hardware development on fiber optics hydrophones technology Goal: characterize existing prototype optical fibre hydrophones in an anechoic basin at TNO laboratory. Data collection, calibration, characterization, analysis of consequences for design future acoustic hydrophone neutrino telescopes; Keywords: Optical fiber technology, signal processing, electronics, lab.
Project 2: Investigation of ultra-high energy neutrinos and their interactions with matter. Goal: Discriminate the neutrino signals from the background noises, in particular clicks from whales and dolphins in the deep sea. Study impact on physics reach for future acoustic hydrophone neutrino telescopes; Keywords: Monte Carlo simulations, particle physics, neutrino physics, data analysis algorithms.
Further information: Info on ultra-high energy neutrinos can be found at: http://arxiv.org/abs/1102.3591; Info on acoustic detection of neutrinos can be found at: http://arxiv.org/abs/1311.7588
Contact: Ernst Jan Buis or Ivo van Vulpen
Neutrinos: Oscillation analysis with the first data of KM3NeT
The neutrino telescope KM3NeT is under construction in the Mediterranean Sea aiming to detect cosmic neutrinos. Its first few strings with sensitive photodetectors have been deployed at both the Italian and the French detector sites. Already these few strings provide for the option to reconstruct in the detector the abundant muons stemming from interactions of cosmic rays with the atmosphere and to identify neutrino interactions. In this project the available data will be used together with simulations to best reconstruct the event topologies and optimally identify and reconstruct the first neutrino interactions in the KM3NeT detector. The data will then be used to measure neutrino oscillation parameters, and prepare for a future neutrino mass ordering determination.
Programming skills are essential, mostly root and C++ will be used. Contact: Ronald Bruijn Paul de Jong
Neutrinos: the Deep Underground Neutrino Experiment (DUNE)
The Deep Underground Neutrino Experiment (DUNE) is under construction in the USA, and will consist of a powerful neutrino beam originating at Fermilab, a near detector at Fermilab, and a far detector in the SURF facility in Lead, South Dakota, 1300 km away. During travelling, neutrinos oscillate and a fraction of the neutrino beam changes flavour; DUNE will determine the neutrino oscillation parameters to unrivaled precision, and try and make a first detection of CP-violation in neutrinos. In this project, various elements of DUNE can be studied, including the neutrino oscillation fit, neutrino physics with the near detector, event reconstruction and classification (including machine learning), or elements of data selection and triggering.
Contact: Paul de Jong
Neutrinos: Searching for Majorana Neutrinos with KamLAND-Zen
The KamLAND-Zen experiment, located in the Kamioka mine in Japan, is a large liquid scintillator experiment with 750kg of ultra-pure Xe-136 to search for neutrinoless double-beta decay (0n2b). The observation of the 0n2b process would be evidence for lepton number violation and the Majorana nature of neutrinos, i.e. that neutrinos are their own anti-particles. Current limits on this extraordinary rare hypothetical decay process are presented as a half-life, with a lower limit of 10^26 years. KamLAND-Zen, the world’s most sensitive 0n2b experiment, is currently taking data and there is an opportunity to work on the data analysis, analyzing data with the possibility of taking part in a ground-breaking discovery. The main focus will be on developing new techniques to filter the spallation backgrounds, i.e. the production of radioactive isotopes by passing muons. There will be close collaboration with groups in the US (MIT, Berkeley, UW) and Japan (Tohoku Univ). Contact: Patrick Decowski
Theoretical Particle Physics: Effective Field Theories of Particle Physics from low- to high-energies
Known elementary matter particles exhibit a surprising three-fold structure. The particles belonging to each of these three “generations” seem to display a remarkable pattern of identical properties, yet have vastly different masses. This puzzling pattern is unexplained. Equally unexplained is the bewildering imbalance between matter and anti-matter observed in the universe, despite minimal differences in the properties of particles and anti-particles. These two mystifying phenomena may originate from a deeper, still unknown, fundamental structure characterised by novel types of particles and interactions, whose unveiling would revolutionise our understanding of nature. The ultimate goal of particle physics is uncovering a fundamental theory which allows the coherent interpretation of phenomena taking place at all energy and distance scales. In this project, the students will exploit the Standard Model Effective Field Theory (SMEFT) formalism, which allows the theoretical interpretation of particle physics data in terms of new fundamental quantum interactions which relate seemingly disconnected processes with minimal assumptions on the nature of an eventual UV-complete theory that replaces the Standard Model. Specifically, the goal is to connect measurements from ATLAS, CMS, and LHCb experiments at the CERN's LHC among them and to jointly interpret this information with that provided by other experiments including very low-energy probes such as the anomalous magnetic moment of the muon or electric dipole moments of the electron and neutron.
This project will be based on theoretical calculations in particle physics, numerical simulations in Python, analysis of existing data from the LHC and other experiments, as well as formal developments in understanding the operator structure of effective field theories. Depending on the student's profile, sub-projects with a strong computational and/or machine-learning component are also possible, for instance constructing new observables with optimal sensitivity to the New Physics effects encoded by the SMEFT higher-dimensional operators. Topics that can be considered in this project include the interpretation of novel physical observables at the LHC and their integration into the global SMEFiT analysis, the matching of EFTs to UV-complete theories and their phenomenological analysis, projections for the impact of future-collider data on the SMEFT parameter space, the synergies between EFT studies and proton structure fits, and the matching to the Weak Effective Field Theory to include data on flavour observables such as B-meson decays.
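As a schematic illustration of what such a fit involves (a toy sketch with invented numbers, not the actual SMEFiT machinery, which handles many operators and observables simultaneously), one can constrain a single Wilson coefficient c from one hypothetical measurement:

 import numpy as np
 mu_exp, sigma_exp = 1.02, 0.05   # hypothetical measured signal strength
 a, b = 0.10, 0.02                # assumed linear and quadratic EFT terms
 def mu_theory(c):
     # Generic dimension-6 parametrisation of an observable:
     # SM + interference (1/Lambda^2) + quadratic (1/Lambda^4) contribution
     return 1.0 + a * c + b * c ** 2
 c_grid = np.linspace(-10.0, 10.0, 2001)
 chi2 = ((mu_theory(c_grid) - mu_exp) / sigma_exp) ** 2
 allowed = c_grid[chi2 - chi2.min() < 3.84]  # ~95% CL for one parameter
 print(f"best fit c = {c_grid[np.argmin(chi2)]:+.2f}; "
       f"95% CL envelope ~ [{allowed.min():+.2f}, {allowed.max():+.2f}]")

Note that the quadratic term produces a second, degenerate minimum of the chi-square, a typical feature of real EFT fits.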
References: https://arxiv.org/abs/2105.00006, https://arxiv.org/abs/2302.06660, https://arxiv.org/abs/2211.02058, https://arxiv.org/abs/1901.05965, https://arxiv.org/abs/1906.05296, https://arxiv.org/abs/1908.05588, https://arxiv.org/abs/1905.05215
Contact: Juan Rojo
Theoretical Particle Physics: High-energy neutrino-nucleon interactions at the Forward Physics Facility
High-energy collisions at the High-Luminosity Large Hadron Collider (HL-LHC) produce a large number of particles along the beam collision axis, outside the acceptance of existing experiments. The proposed Forward Physics Facility (FPF), to be located several hundred meters from the ATLAS interaction point and shielded by concrete and rock, will host a suite of experiments to probe Standard Model (SM) processes and search for physics beyond the Standard Model (BSM). High-statistics neutrino detection will provide valuable data for fundamental topics in perturbative and non-perturbative QCD and in weak interactions. Experiments at the FPF will also enable synergies between forward particle production at the LHC and astroparticle physics to be exploited. The FPF has the promising potential to probe our understanding of the strong interactions as well as of proton and nuclear structure, providing access to both the very low-x and the very high-x regions of the colliding protons. The former regime is sensitive to novel QCD production mechanisms, such as BFKL effects and non-linear dynamics, as well as to the gluon parton distribution function (PDF) down to x = 10^-7, well beyond the coverage of other experiments and providing key inputs for astroparticle physics. In addition, the FPF acts as a neutrino-induced deep-inelastic scattering (DIS) experiment with TeV-scale neutrino beams. The resulting measurements of neutrino DIS structure functions represent a valuable handle on the partonic structure of nucleons and nuclei, particularly their quark flavour separation, that is fully complementary to the charged-lepton DIS measurements expected at the upcoming Electron-Ion Collider (EIC).
In this project, the student will carry out updated predictions for the neutrino fluxes expected at the FPF, assess the precision with which neutrino cross-sections will be measured, and quantify their impact on proton and nuclear structure by means of machine-learning tools within the NNPDF framework and state-of-the-art calculations in perturbative Quantum Chromodynamics. This project contributes to ongoing work within the FPF Initiative towards a Conceptual Design Report (CDR) to be presented within two years. Topics that can be considered as part of this project include the assessment of the extent to which nuclear modifications of the free-proton PDFs can be constrained by FPF measurements, the determination of the small-x gluon PDF from suitably defined observables at the FPF and its implications for ultra-high-energy particle astrophysics, the study of the intrinsic charm content of the proton and its consequences for the FPF physics program, and the validation of models for neutrino-nucleon cross-sections in the region beyond the validity of perturbative QCD.
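For reference, the charged-current neutrino DIS cross-section underlying these structure-function measurements can be written schematically (in LaTeX notation, neglecting target-mass corrections; upper sign for neutrinos, lower for antineutrinos) as

 \frac{d^2\sigma^{\nu(\bar{\nu})N}}{dx\,dy} \simeq \frac{G_F^2\, M_N E_\nu}{\pi\,(1 + Q^2/M_W^2)^2}
   \left[ x y^2 F_1 + (1-y)\, F_2 \pm x y \left(1 - \frac{y}{2}\right) F_3 \right],
 \qquad x = \frac{Q^2}{2 M_N E_\nu y},

so that measuring event rates differentially in x and y with TeV-scale neutrino beams gives direct access to the structure functions F_i and hence to the underlying parton distributions.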
References: https://arxiv.org/abs/2203.05090, https://arxiv.org/abs/2109.10905, https://arxiv.org/abs/2208.08372, https://arxiv.org/abs/2201.12363, https://arxiv.org/abs/2109.02653, https://github.com/NNPDF/
Contact: Juan Rojo
Theoretical Particle Physics: Probing the origin of the proton spin with machine learning
At energy-frontier facilities such as the Large Hadron Collider (LHC), scientists study the laws of nature in their quest for novel phenomena both within and beyond the Standard Model of particle physics. An in-depth understanding of the quark and gluon substructure of protons and heavy nuclei is crucial to address pressing questions ranging from the nature of the Higgs boson to the origin of cosmic neutrinos. The key to addressing some of these questions is a universal analysis of nucleon structure: the simultaneous determination of the momentum and spin distributions of quarks and gluons and of their fragmentation into hadrons. This effort requires combining an extensive experimental dataset and cutting-edge theory calculations within a machine-learning framework in which neural networks parametrise the underlying physical laws while minimising ad-hoc model assumptions. The upcoming Electron-Ion Collider (EIC), due to start taking data in 2029, will be the world's first polarised lepton-hadron collider and will offer a plethora of opportunities to address key open questions in our understanding of the strong nuclear force, such as the origin of the mass and the intrinsic angular momentum (spin) of hadrons, and whether there exists a state of matter entirely dominated by gluons. To fully exploit this scientific potential, novel analysis methodologies need to be developed that make it possible to carry out large-scale, coherent interpretations of measurements from the EIC and other high-energy colliders.
In this project, the student will carry out a new global analysis of the spin structure of the proton using the machine-learning tools provided by the NNPDF open-source fitting framework and state-of-the-art calculations in perturbative Quantum Chromodynamics, and integrate it with the corresponding global NNPDF analyses of unpolarised proton and nuclear structure into a combined global analysis of non-perturbative QCD. Specifically, the project aims to realise an NNLO global fit of polarised quark and gluon PDFs that combines all available data with state-of-the-art perturbative QCD calculations, and to study the phenomenological implications for other experiments including the EIC, for the spin content of the proton, for comparisons with lattice QCD calculations, and for non-perturbative models of hadron structure.
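For orientation, the decomposition at the heart of the proton-spin question is the spin sum rule (written here in LaTeX notation, in the Jaffe-Manohar scheme), whose quark and gluon helicity contributions are integrals of the polarised PDFs determined in such a fit:

 \frac{1}{2} = \frac{1}{2}\,\Delta\Sigma(\mu^2) + \Delta G(\mu^2) + L_q(\mu^2) + L_g(\mu^2),
 \qquad
 \Delta\Sigma = \sum_q \int_0^1 dx \left[\, \Delta q(x,\mu^2) + \Delta\bar{q}(x,\mu^2) \right],
 \quad
 \Delta G = \int_0^1 dx\, \Delta g(x,\mu^2),

with L_q and L_g the quark and gluon orbital angular momentum contributions.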
References: https://arxiv.org/abs/2201.12363, https://arxiv.org/abs/2109.02653, https://arxiv.org/abs/2103.05419, https://arxiv.org/abs/1404.4293, https://inspirehep.net/literature/1302398, https://github.com/NNPDF/
Contact: Juan Rojo
Finished master projects
See:
- https://wiki.nikhef.nl/education/Master_Theses
- https://www.nikhef.nl/master-theses-2021/
- https://www.nikhef.nl/facts-figures-2020/master-theses-2020/