The following Master thesis research projects are offered at Nikhef. If you are interested in one of these projects, please contact the coordinator listed with the project.  
 
  
== Projects with September 2020 start ==
== Projects with a 2024 start [WORK IN PROGRESS, please look below for older projects] ==
  
=== ATLAS: Top Spin optimal observables using Artificial Intelligence ===

The top quark has an exceptionally high mass, close to the electroweak symmetry breaking scale, and is therefore sensitive to new physics effects. Theoretically, new physics is well described in the EFT framework [1]. The EFT operators are experimentally well accessible in single-top t-channel production, where the top quark is produced spin polarized. The focus at Nikhef is the operator O_{tW} with a possible imaginary phase, leading to CP violation. Experimentally, many angular distributions are reconstructed in the top rest frame to hunt for these effects. We are looking for a limited set of optimal observables. The objective of your Master project would be to find optimal observables using simulated events, including detector effects and possible systematic deviations. All techniques are allowed, but promising new developments are methods which involve artificial intelligence. This work could lead to an ATLAS note.

[1] https://arxiv.org/abs/1807.03576

''Contact: Marcel Vreeswijk [mailto:h73@nikhef.nl] and Jordy Degens [mailto:jdegens@nikhef.nl]''

=== ALICE: Search for new physics with 4D tracking at the most sensitive vertex detector at the LHC ===

With the newly installed Inner Tracking System, consisting fully of monolithic detectors, ALICE is more sensitive to particles with low transverse momenta than ATLAS and CMS, and this will improve further with the upgraded ALICE detector in 2033. By using timing information along a track, this detector could potentially become even more sensitive to long-lived particles that leave peculiar signatures in the tracker, such as disappearing or kinked tracks. In this project you will investigate how timing information in the different tracking layers can improve or even enable a search for new physics beyond the Standard Model in ALICE. If you show a possibility for major improvements, this can have real consequences for the choice of sensors for this ALICE inner tracker upgrade.

''Contact: [mailto:jory.sonneveld@nikhef.nl Jory Sonneveld] and [mailto:Panos.Christakoglou@nikhef.nl Panos Christakoglou]''
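The 4D-tracking idea above can be illustrated with a small toy study: if every tracking layer measures a time as well as a position, a particle that decays in flight (a kinked or disappearing track) shows up as a break in the time-versus-radius relation. The sketch below is purely illustrative; the layer radii, time resolution and velocities are assumed numbers, not ALICE specifications.

<syntaxhighlight lang="python">
import numpy as np

C = 29.9792458  # speed of light [cm/ns]

def kink_score(radii_cm, times_ns, sigma_t_ns, n_inner=4):
    """Fit t = t0 + r/(beta*c) to the innermost hits and return the largest
    deviation (in sigma) of the outer hits from the extrapolation.
    A large score hints at a velocity change (kink) along the track."""
    slope, t0 = np.polyfit(radii_cm[:n_inner], times_ns[:n_inner], 1)
    pulls = (times_ns[n_inner:] - (t0 + slope * radii_cm[n_inner:])) / sigma_t_ns
    return np.max(np.abs(pulls))

rng = np.random.default_rng(seed=1)
sigma_t = 0.02                                              # assumed 20 ps time resolution per hit
radii = np.array([2.3, 3.1, 3.9, 19.6, 24.5, 34.4, 39.3])   # assumed layer radii [cm]
beta_before, beta_after, r_kink = 0.99, 0.70, 20.0          # toy particle that slows down at r_kink

times = np.where(radii < r_kink,
                 radii / (beta_before * C),
                 r_kink / (beta_before * C) + (radii - r_kink) / (beta_after * C))
times += rng.normal(0.0, sigma_t, size=times.size)

print(f"kink score: {kink_score(radii, times, sigma_t):.1f} sigma")
</syntaxhighlight>

In a real analysis the same idea would be folded into the track fit itself, but even this crude extrapolation shows how picosecond-level timing can separate a kinked track from a prompt one.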
=== ALICE: Connecting the hot and cold QCD matter by searching for the strongest magnetic field in nature ===

In a non-central collision between two Pb ions, with a large value of the impact parameter, the charged nucleons that do not participate in the interaction (called spectators) create strong magnetic fields. A back-of-the-envelope calculation using the Biot-Savart law brings the magnitude of this field close to 10^19 Gauss, in agreement with state-of-the-art theoretical calculations, making it the strongest magnetic field in nature. The presence of this field could have direct implications for the motion of final-state particles. The magnetic field, however, decays rapidly. The decay rate depends on the electric conductivity of the medium, which is experimentally poorly constrained. Overall, the presence of the magnetic field, the main goal of this project, is so far not confirmed experimentally, and it can have implications for measurements of gravitational waves emitted from the merger of neutron stars.

''Contact: [mailto:Panos.Christakoglou@nikhef.nl Panos Christakoglou]''

=== ATLAS: The Next Generation ===

After the observation of the coupling of Higgs bosons to fermions of the third generation, the search for the coupling to fermions of the second generation is one of the next priorities for research at CERN's Large Hadron Collider. The search for the decay of the Higgs boson to two charm quarks is very new [1] and we see various opportunities for interesting developments. For this project we propose improvements in reconstruction (using exclusive decays), advanced analysis techniques (using deep learning methods) and expanding the theory interpretation. Another opportunity would be the development of the first statistical combination of results between the ATLAS and CMS experiments, which could significantly improve the discovery potential.

[1] https://arxiv.org/abs/1802.04329

''Contact: [mailto:tdupree@nikhef.nl Tristan du Pree and Marko Stamenkovic]''

=== ALICE/LHCb Tracking: Innovative tracking techniques exploiting modern heterogeneous architectures ===

The reconstruction of charged-particle tracks is one of the most computationally demanding components of modern high energy physics experiments. In particular, the upcoming High-Luminosity Large Hadron Collider (HL-LHC) makes the usage of fast tracking algorithms on modern computing architectures with many cores and accelerators essential. In this project we will be investigating innovative, machine-learning based, experiment-agnostic tracking algorithms on modern architectures, e.g. GPUs and FPGAs.

''Contact: [mailto:jdevries@nikhef.nl Jacco de Vries] and [mailto:Panos.Christakoglou@nikhef.nl Panos Christakoglou]''

=== ATLAS: Search for very rare Higgs decays to second-generation fermions ===

While the Higgs boson coupling to fermions of the third generation has been established experimentally, the investigation of the Higgs boson coupling to the light fermions of the second generation will be a central project for the current data-taking period of the LHC (2022-2025). The Higgs boson decay to muons is the most sensitive channel for probing this coupling. In this project, event selection algorithms for Higgs boson decays to muons in the associated production with a gauge boson (VH) are developed with the aim to distinguish signal events from background processes like Drell-Yan and WZ boson production. For this purpose, the candidate will implement and validate deep learning algorithms, and extract the final results based on a fit to the output of the deep learning classifier.

''Contact: [mailto:oliver.rieger@nikhef.nl Oliver Rieger] and [mailto:verkerke@nikhef.nl Wouter Verkerke]''

=== ATLAS: Advanced deep-learning techniques for lepton identification ===

The ATLAS experiment at the Large Hadron Collider facilitates a broad spectrum of physics analyses. A critical aspect of these analyses is the efficient and accurate identification of leptons, which are crucial for both signal detection and background event rejection. The ability to distinguish between prompt leptons, arising directly from the collision, and non-prompt leptons, originating from heavy-flavour hadron decays, is a challenging task. This project aims to develop and implement advanced techniques based on deep learning models to push lepton identification beyond the capabilities of the current standard methods.

''Contact: [mailto:oliver.rieger@nikhef.nl Oliver Rieger] and [mailto:verkerke@nikhef.nl Wouter Verkerke]''
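To give a flavour of what the lepton-identification project involves, the sketch below trains a small multivariate classifier to separate prompt from non-prompt leptons on invented toy features (isolation and impact-parameter significance). The real project would use ATLAS simulation, many more inputs and a deep neural network; the boosted decision tree here is only a stand-in.

<syntaxhighlight lang="python">
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(seed=0)
n = 20000

# Toy features loosely inspired by typical lepton-ID inputs:
# relative isolation and transverse impact-parameter significance.
iso_prompt = rng.exponential(0.05, n)           # prompt leptons: well isolated
d0sig_prompt = np.abs(rng.normal(0.0, 1.0, n))  # compatible with the primary vertex
iso_nonprompt = rng.exponential(0.30, n)        # leptons from heavy-flavour decays
d0sig_nonprompt = np.abs(rng.normal(0.0, 4.0, n))

X = np.vstack([np.column_stack([iso_prompt, d0sig_prompt]),
               np.column_stack([iso_nonprompt, d0sig_nonprompt])])
y = np.concatenate([np.ones(n), np.zeros(n)])   # 1 = prompt, 0 = non-prompt

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
clf = GradientBoostingClassifier().fit(X_train, y_train)
scores = clf.predict_proba(X_test)[:, 1]
print(f"ROC AUC on toy data: {roc_auc_score(y_test, scores):.3f}")
</syntaxhighlight>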
  
=== ATLAS: The Most Energetic Higgs Boson ===

The production of Higgs bosons at the highest energies could give the first indications for deviations from the standard model of particle physics, but production energies above 500 GeV have not been observed yet [1]. The LHC Run-2 dataset, collected during the last 4 years, might be the first opportunity to observe such processes, and we have various ideas for new studies. Possible developments include the improvement of boosted reconstruction techniques, for example using multivariate deep learning methods. Also, there are various opportunities for unexplored theory interpretations (using the MadGraph event generator), including effective field theory models (with novel ‘morphing’ techniques) and new interpretations of the newly observed boosted VZ(bb) process.

[1] https://arxiv.org/abs/1709.05543

''Contact: [mailto:tdupree@nikhef.nl Tristan du Pree and Brian Moser]''

=== ATLAS: Probing CP-violation in the Higgs sector with the ATLAS experiment ===

The Standard Model Effective Field Theory (SMEFT) provides a systematic approach to test the impact of new physics at the energy scale of the LHC through higher-dimensional operators. The scarcity of antimatter in the cosmos arises from slight differences in the behavior of particles and their antiparticle counterparts, known as CP-violation. The current data-taking period of the LHC is expected to yield a comprehensive dataset, enabling the investigation of CP-odd SMEFT operators in the Higgs boson's interactions with other particles. The interpretation of experimental data using SMEFT requires a particular interest in solving complex technical challenges, advanced statistical techniques, and a deep understanding of particle physics.

''Contact: [mailto:lbrenner@nikhef.nl Lydia Brenner], [mailto:oliver.rieger@nikhef.nl Oliver Rieger] and [mailto:verkerke@nikhef.nl Wouter Verkerke]''

=== ATLAS: Signal and background sensitivity in Standard Model Effective Field Theory (SMEFT) ===

Complex statistical combinations of large sectors of the ATLAS scientific program are currently being used to obtain the best experimental sensitivity to SMEFT parameters. However, to achieve a fully consistent investigation of SMEFT, and to push the limit of what is possible with the data already collected, it is necessary to also include background-modification effects. Joining our efforts on this topic means contributing to a cutting-edge investigation that requires both a particular motivation for solving complex technical challenges and an interest in obtaining a broad knowledge of experimental particle physics.

''Contact: [mailto:avisibil@nikhef.nl Andrea Visibile] and [mailto:lbrenner@nikhef.nl Lydia Brenner]''
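The point about background modifications in SMEFT fits can be illustrated with a toy binned likelihood scan: the yields below are invented numbers, and the linear-plus-quadratic signal parameterisation is a generic SMEFT-style form, not an ATLAS result. Letting the background also depend on the Wilson coefficient visibly changes the inferred interval.

<syntaxhighlight lang="python">
import numpy as np

# Toy binned yields (invented numbers, one observable with 4 bins).
sig0 = np.array([20.0, 15.0, 10.0, 5.0])     # SM signal prediction per bin
bkg0 = np.array([200.0, 120.0, 60.0, 20.0])  # SM background prediction per bin

def expected(c, modify_background=True):
    """SMEFT-style yield parameterisation: linear + quadratic terms for the
    signal, and (optionally) a linear modification of the background."""
    sig = sig0 * (1.0 + 0.8 * c + 0.3 * c**2)
    bkg = bkg0 * (1.0 + (0.1 * c if modify_background else 0.0))
    return sig + bkg

rng = np.random.default_rng(seed=3)
observed = rng.poisson(expected(0.0))         # pseudo-data generated at c = 0

def nll(c, modify_background):
    mu = expected(c, modify_background)
    return np.sum(mu - observed * np.log(mu))  # Poisson negative log-likelihood (up to a constant)

scan = np.linspace(-2.0, 2.0, 801)
for modify in (False, True):
    values = np.array([nll(c, modify) for c in scan])
    dnll = values - values.min()
    interval = scan[dnll < 0.5]               # approximate 68% CL interval (Delta NLL < 0.5)
    print(f"background modified: {modify}  ->  c in [{interval.min():+.2f}, {interval.max():+.2f}]")
</syntaxhighlight>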
  
=== ATLAS: Performing a Bell test in Higgs to di-boson decays ===

Recently, theorists [1] have proposed to perform a Bell test in Higgs to di-boson decays. This is a fundamental test not only of quantum mechanics but also of quantum field theory, using the elusive scalar Higgs particle. At Nikhef we have started to brainstorm on the experimental aspects of this challenging measurement. Thanks to the studies of a PhD student [2] we have considerable experience in the reconstruction of the Higgs rest-frame angles that are essential to perform a Bell test. Is there a master student who wants to join our efforts to study the ''"spooky action at a distance"'' in Higgs to WW decays?

[1] Review article: https://arxiv.org/pdf/2402.07972.pdf

[2] https://www.nikhef.nl/pub/services/biblio/theses_pdf/thesis_R_Aben.pdf

''Contact: [mailto:Peter.Kluit@nikhef.nl Peter Kluit]''

=== LHCb: Measurement of delta md ===

The decay B0->D-pi+ is very abundant in LHCb, and therefore ideal to study the oscillation frequency delta md, with which B0 mesons oscillate into anti-B0 mesons, and vice versa. This process proceeds through a so-called box diagram, which might hide new, yet-undiscovered particles. Recently, it has been realized that the value of delta md is in tension with the value of the CKM angle gamma, triggering renewed interest in this measurement.

''Contact: [mailto:Marcel.Merk@nikhef.nl Marcel Merk]''

=== LHCb: Searching for CPT violation ===

CPT symmetry is closely linked to Lorentz symmetry, and any violation would revolutionize science. There are possibilities, though, that supergravity could cause CPT-violating effects in the system of neutral mesons. The precise study of B0s oscillations in the abundant Bs->Dspi decays can give the most stringent limits on Im(z) to date.

''Contact: [mailto:Marcel.Merk@nikhef.nl Marcel Merk]''
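Both LHCb projects above revolve around extracting an oscillation frequency from tagged decay-time distributions. The toy below generates mixed and unmixed B decays and fits the time-dependent mixing asymmetry; the lifetime and frequency are approximately the known B0 values, while the event count, binning and mistag rate are illustrative assumptions rather than LHCb numbers.

<syntaxhighlight lang="python">
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(seed=7)

tau, dmd, mistag = 1.52, 0.51, 0.35   # B0 lifetime [ps], delta md [ps^-1], assumed mistag rate
n_events = 500_000

t = rng.exponential(tau, n_events)                     # proper decay time [ps]
p_mixed = 0.5 * (1.0 - np.cos(dmd * t))                # true mixing probability
mixed = rng.random(n_events) < p_mixed
flip = rng.random(n_events) < mistag                   # imperfect flavour tagging
observed_mixed = mixed ^ flip

bins = np.linspace(0.0, 10.0, 41)
centers = 0.5 * (bins[1:] + bins[:-1])
n_mix, _ = np.histogram(t[observed_mixed], bins)
n_unmix, _ = np.histogram(t[~observed_mixed], bins)
asym = (n_unmix - n_mix) / (n_unmix + n_mix)
asym_err = np.sqrt(1.0 - asym**2) / np.sqrt(n_unmix + n_mix)

def model(t, dilution, freq):
    return dilution * np.cos(freq * t)

popt, pcov = curve_fit(model, centers, asym, p0=[0.3, 0.5], sigma=asym_err, absolute_sigma=True)
print(f"fitted delta md = {popt[1]:.3f} +- {np.sqrt(pcov[1, 1]):.3f} ps^-1 (true {dmd})")
print(f"fitted dilution = {popt[0]:.3f} (expected 1 - 2*mistag = {1 - 2*mistag:.2f})")
</syntaxhighlight>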
  
=== LHCb: BR(B0->D-pi+) and fd/fu with B+->D0pi+ ===

The abundant decay B0->D-pi+ is often used as a normalization channel, given its clean signal and well-known branching fraction, as measured at the B-factories. However, this branching fraction can be determined more precisely by comparing to the decay B+->D0pi+, which is known with twice better precision. In addition, the production rates of B0 and B+ mesons are often assumed to be equal, based on isospin symmetry. The study of B+->D0pi+ and B0->D-pi+ allows for the first measurement of this ratio, fd/fu.

''Contact: [mailto:Marcel.Merk@nikhef.nl Marcel Merk]''

=== LHCb: Optimization studies for the Vertex detector at the High-Lumi LHCb ===

The LHCb experiment is dedicated to measuring tiny differences between matter and antimatter through the precise study of rare processes involving b or c quarks. The LHCb detector will undergo a major modification in order to dramatically increase the luminosity and be able to measure indirect effects of physics beyond the standard model. In this environment, over 42 simultaneous collisions are expected to happen within a time interval of about 200 ps in which the two proton bunches overlap. The particles of interest have a relatively long lifetime and therefore the best way to distinguish them from the background collisions is through the precise reconstruction of displaced vertices and pointing directions. The new detector considers using extremely recent or even future technologies to measure space (with resolutions below 10 um) and time (100 ps or better) to efficiently reconstruct the events of interest for physics. The project involves changing the LHCb Vertex Locator (VELO) design completely in simulation and determining the best achievable performance for the upgraded detector, considering different spatial and temporal resolutions.

''Contact: [mailto:kazu.akiba@nikhef.nl Kazu Akiba]''

=== ATLAS: A new timing detector - the HGTD ===

The ATLAS detector is going to get a new ability: a timing detector. This allows us to reconstruct tracks not only in the three dimensions of space but also to measure very precisely (at the picosecond level) the time at which the particles pass the sensitive layers of the HGTD. The added information helps to reconstruct the trajectories of the particles created at the LHC in four dimensions and will ultimately lead to a better reconstruction of physics at ATLAS. The new HGTD detector is still under construction and work needs to be done on different levels, such as understanding the detector response (taking measurements in the lab and performing simulations) or developing algorithms to reconstruct the particle trajectories (programming and analysis work).

'''Several projects are available within the context of the new HGTD detector:'''

# One can choose to focus on '''''the impact on physics analysis performance''''' by studying how the timing measurements can be included in the reconstruction of tracks, and what effect this has on our understanding of the physical processes occurring in the LHC collisions. With this work you will be part of the ATLAS group at Nikhef.
# The second possibility is to '''''test the sensors in our lab''''' and in test-beam setups at CERN/DESY. The analysis will be performed in the context of the ATLAS HGTD test-beam group, in connection with both the ATLAS group and the R&D department at Nikhef.
# The third is to contribute to an ongoing effort '''''to precisely simulate/model the silicon avalanche detectors''''' in the Allpix2 framework. There are several models that try to describe the detector's response. The models depend on operating temperature, field strengths and radiation damage. We are getting close to being able to model our detector - but we are not there yet. This work will be within the ATLAS group.

''Contact: [mailto:hella.snoek@nikhef.nl Hella Snoek]''
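To get a feeling for why a fourth, timing coordinate helps both the HGTD and the upgraded VELO described above, the toy below associates a single track to one of many simultaneous (pile-up) vertices, first using only the longitudinal position and then position plus time. All resolutions and the pile-up multiplicity are assumed round numbers, not detector specifications.

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(seed=5)

n_events, n_vertices = 2000, 40          # assumed pile-up of 40 vertices per event
sigma_z_vtx, sigma_t_vtx = 45.0, 0.18    # assumed beam-spot spread: 45 mm in z, 180 ps in time
sigma_z_trk, sigma_t_trk = 2.5, 0.03     # assumed track resolutions: 2.5 mm, 30 ps

correct_z, correct_zt = 0, 0
for _ in range(n_events):
    vz = rng.normal(0.0, sigma_z_vtx, n_vertices)
    vt = rng.normal(0.0, sigma_t_vtx, n_vertices)
    # one track originating from vertex 0, smeared by the track resolutions
    trk_z = vz[0] + rng.normal(0.0, sigma_z_trk)
    trk_t = vt[0] + rng.normal(0.0, sigma_t_trk)

    chi2_z = ((trk_z - vz) / sigma_z_trk) ** 2
    chi2_zt = chi2_z + ((trk_t - vt) / sigma_t_trk) ** 2
    correct_z += int(np.argmin(chi2_z) == 0)
    correct_zt += int(np.argmin(chi2_zt) == 0)

print(f"correct association, z only : {correct_z / n_events:.1%}")
print(f"correct association, z + t  : {correct_zt / n_events:.1%}")
</syntaxhighlight>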
=== ATLAS: Studying rare modes of Higgs boson production at the LHC ===

The Higgs boson is a crucial piece of the Standard Model and its most recently discovered particle. Studying Higgs boson production and decay at the LHC might hold the key to unlocking new information about the physical laws governing our universe. With the LHC now in its third run, we can also use the enormous amounts of data being collected to study Higgs boson production modes we have not previously been able to access. For instance, we can look at the production of a Higgs boson via the fusion of two vector bosons, accompanied by the emission of a photon, with subsequent H->WW decay. This state is experimentally distinctive and should be accessible to us using the current dataset of the LHC. It is also theoretically interesting because it probes the Higgs boson’s interaction with W bosons. This exact interaction is a cornerstone of electroweak symmetry breaking, the process by which particles gain mass, so studying it provides a window onto a fundamental part of the Standard Model. This project will study the feasibility of measuring this or another rare Higgs production mode using H->WW decays, providing a chance to be involved in the design of an analysis from the ground up.

''Contact: [mailto:rhayes@nikhef.nl Robin Hayes], [mailto:f.dias@nikhef.nl Flavia de Almeida Dias]''

=== LHCb: Measurement of charge multiplication in heavily irradiated sensors ===

During the R&D phase for the LHCb VELO Upgrade detector, a few sensor prototypes were irradiated to the extreme fluence expected to be reached during the detector lifetime. These samples were tested using high energy particles at the SPS facility at CERN, with their trajectories reconstructed by the Timepix3 telescope. A preliminary analysis revealed that at the highest irradiation levels the amount of signal observed is higher than expected, and even larger than the signal obtained at lower doses. At the Device Under Test (DUT) position inside the telescope, the spatial resolution attained by this system is below 2 um. This means that a detailed analysis can be performed in order to study where and how this signal amplification happens within the 55x55 um^2 pixel cell. This project involves analysing the telescope and DUT data to investigate the charge multiplication mechanism at the microscopic level.

''Contact: [mailto:kazu.akiba@nikhef.nl Kazu Akiba]''

=== ATLAS: Exploring triboson polarisation in loop-induced processes at the LHC ===

Spin is a fundamental, quantum mechanical property carried by (most) elementary particles. When high-energy particles scatter, their spin influences how angular momentum is propagated through the process and ultimately how the final-state particles are (geometrically) distributed. Helicity is the projection of the spin vector onto the momentum. For example, in the loop-induced process gg > W+W-Z, the angular separation between the various decay products of the W and Z bosons depends on the helicity polarisation of the intermediate W and Z bosons. The aim of this project is to explore helicity polarisation in multiboson processes, and specifically the gg > WWZ process, at the Large Hadron Collider. This project is at the interface between theory and experiment, and you will work with Monte Carlo generators, analysis design and sensitivity studies.

''Contact: [mailto:f.dias@nikhef.nl Flavia de Almeida Dias]''

=== LHCb: Testing the flavour anomalies at LHCb ===

Lepton Flavour Universality (LFU) is an intrinsic property of the Standard Model, which implies that the three generations of leptons are subject to the same interactions. This fundamental law of the SM can be investigated by looking at rare B-meson decays with muons or electrons in the final state. Recent measurements of these decays from LHCb show deviations from the SM (known as flavour anomalies) that, if confirmed, would lead to a major discovery of New Physics (NP). The project consists of the analysis of the 2017-18 dataset, which will double the statistics of the current results. This new dataset will lead to a measurement with better precision, which can either confirm or exclude the contribution of NP to these decays. The project will explore all the crucial aspects of data analysis, from simulation to signal modeling, including cutting-edge software such as fitting large amounts of data using GPUs (Graphics Processing Units).

''Contact: [mailto:a.mauri@cern.ch Andrea Mauri] and [mailto:marcel.merk@nikhef.nl Marcel Merk]''
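At its core, an LFU test like the one above boils down to an efficiency-corrected ratio of yields in the muon and electron channels. The snippet below propagates statistical and efficiency uncertainties for an invented set of numbers; it is a back-of-the-envelope illustration, not an LHCb measurement.

<syntaxhighlight lang="python">
import numpy as np

# Invented inputs: fitted signal yields and total selection efficiencies per channel.
n_mumu, n_ee = 1760.0, 340.0               # signal yields (toy numbers)
eff_mumu, eff_ee = 0.042, 0.0085           # efficiencies (toy numbers)
rel_sigma_eff = 0.03                        # assumed 3% relative uncertainty on each efficiency

r = (n_mumu / eff_mumu) / (n_ee / eff_ee)   # efficiency-corrected ratio of branching fractions

# Relative uncertainties added in quadrature: Poisson yields + efficiency systematics.
rel_err = np.sqrt(1.0 / n_mumu + 1.0 / n_ee + 2.0 * rel_sigma_eff**2)
print(f"R = {r:.3f} +- {r * rel_err:.3f}")
</syntaxhighlight>

In the real analysis the ratio is extracted from a simultaneous fit and many efficiency effects cancel by construction, but the error budget has the same structure.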
=== LHCb: Search for long-lived heavy neutral leptons in B decays ===

The masses of neutrinos are many orders of magnitude smaller than those of the other fermions. In the seesaw mechanism this puzzling fact is explained by the existence of another set of neutral leptons that are much heavier. If their mass is below about 5 GeV, such neutrinos can be produced at the LHC in decays of B hadrons. Their small coupling will lead to a lifetime of the order of picoseconds, which means that they will fly an observable distance before they decay. In this project we search for such long-lived heavy neutrinos in decays of charged B mesons using the LHCb Run-2 dataset.

''Contact: [mailto:v.lukashenko@nikhef.nl Lera Lukashenko] and [mailto:wouter.hulsbergen@nikhef.nl Wouter Hulsbergen]''

=== LHCb: Discovering the Bc->eta_c mu nu decay ===

The Bc meson, consisting of heavy c and anti-b quarks, is of great interest for flavour physics. A recent LHCb measurement of Bc->J/psi l nu decays [1] showed a possible deviation from the Standard Model prediction, which entered the so-called lepton universality puzzle - one of the hottest topics in b-physics in recent years. Following that, the study of a similar decay mode - Bc->eta_c mu nu - is strongly requested by the theory community. However, the reconstruction of the eta_c meson is challenging, so the decay has not been discovered yet. The project aims at the discovery of the Bc->eta_c mu nu decay using the unique capabilities of the LHCb experiment. The data analysis will consist of finding the optimal event selection using machine learning techniques, research on background sources, performing fits to data, etc. The project requires not being afraid of analysis software and statistics. The results will be presented within the collaboration: talks at working group meetings, an analysis note, etc. Skills in git, Python and ROOT (and similar packages) are extremely welcome.

[1] https://arxiv.org/pdf/1711.05623.pdf

''Contact: [mailto:andrii.usachov@nikhef.nl Andrii Usachov] and [mailto:marcel.merk@nikhef.nl Marcel Merk]''

=== ATLAS: High-Performance Simulations for High-Energy Physics Experiments - Multiple Enhancements ===

The role of simulation and synthetic data generation for High-Energy Physics (HEP) research is profound. While there are physics-accurate simulation frameworks available to provide the most realistic data syntheses, these tools are slow. Additionally, the output from physics-accurate simulations is closely tied to the experiment that the simulation was developed for and to its software.

Fast simulation frameworks, on the other hand, can drastically simplify the simulation while still striking a balance between the speed and the accuracy of the simulated events. The applications of simplified simulations and data are numerous. We will be focusing on the role of such data as an enabler for Machine Learning (ML) model design research.

This project aims to extend the REDVID simulation framework [1, 2] through the addition of new features. The features considered for this iteration include:

*Interaction with common Monte Carlo event generators: to calculate hit points for imported events
*Addition of a basic magnetic-field effect: simulation of a simplified, uniform magnetic field affecting charged-particle trajectories
*Inclusion of pile-up effects during simulation: multiple particle collisions occurring in close vicinity
*Indication of bunch size
*Spherical coordinates
*Vectorised helical tracks (a minimal helix sketch is given after this project's references)
*Considerations for reproducibility of collision events

The project is part of an ongoing effort to train and test ML models for particle track reconstruction for the HL-LHC. The improved version of REDVID can be used by the student and other users to generate training data for ML models. Depending on the progress and the interest, a secondary goal could be to perform comparisons with physics-accurate simulations or to investigate the impact of the new features on developed ML models.

'''Bonus:''' The student will be encouraged and supported to publish the output of this study in a relevant journal, such as "Data in Brief" by Elsevier.

==== Appendix - Terminology ====

The terminology for the considered simulations and their features is domain-specific and is explained below:

*Synthetic data: Data generated during a simulation, which resembles real data to a limited extent.
*Physics-accurate simulation: A type of simulation that strongly takes into account real-world physical interactions and utilises physics formulas to achieve this.
*Complexity-aware simulation framework: A simulator which can be configured with different levels of simulation complexity, making the simulation closer to or further away from the real-world case.
*Complexity-reduced data set: Simplified data resulting from simplified simulations. This is in comparison to real data, or data generated by physics-accurate simulations.

==== References ====

[1] U. Odyurt et al., 2023, "Reduced Simulations for High-Energy Physics, a Middle Ground for Data-Driven Physics Research". URL: https://doi.org/10.48550/arXiv.2309.03780

[2] U. Odyurt, 2023, "REDVID Simulation Framework". URL: https://virtualdetector.com/redvid

''Contact: [mailto:uodyurt@nikhef.nl dr. ir. Uraz Odyurt] and [mailto:roel.aaij@nikhef.nl dr. Roel Aaij]''
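As a flavour of the "basic magnetic field" and "vectorised helical tracks" features listed above, the sketch below parameterises the helix of a charged particle in a uniform solenoidal field and computes where it crosses a set of cylindrical detector layers. The field value, layer radii and track parameters are arbitrary illustrations, not REDVID internals.

<syntaxhighlight lang="python">
import numpy as np

def helix_hits(pt_gev, pz_gev, phi0, charge, b_tesla, layer_radii_m):
    """Return (x, y, z) crossing points of a helical track starting at the
    origin with cylindrical layers, assuming a uniform field along z."""
    radius = pt_gev / (0.3 * abs(charge) * b_tesla)   # bending radius [m]
    h = -np.sign(charge * b_tesla)                    # sense of rotation
    cx = -h * radius * np.sin(phi0)                   # circle centre in the xy-plane
    cy = +h * radius * np.cos(phi0)

    alpha = np.linspace(0.0, np.pi, 20000)            # turning angle (vectorised)
    x = cx + h * radius * np.sin(phi0 + h * alpha)
    y = cy - h * radius * np.cos(phi0 + h * alpha)
    z = radius * alpha * (pz_gev / pt_gev)            # transverse arc length times dip slope
    r = np.hypot(x, y)

    hits = []
    for r_layer in layer_radii_m:
        crossing = np.argmax(r >= r_layer)            # first sample beyond the layer
        if r[crossing] >= r_layer:
            hits.append((x[crossing], y[crossing], z[crossing]))
    return hits

layers = [0.05, 0.10, 0.20, 0.40, 0.80]               # toy cylindrical layers [m]
for point in helix_hits(pt_gev=1.0, pz_gev=0.5, phi0=0.3, charge=-1, b_tesla=2.0, layer_radii_m=layers):
    print("hit at x=%.3f m, y=%.3f m, z=%.3f m" % point)
</syntaxhighlight>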
=== ALICE: Looking for parity violating effects in strong interactions ===

Within the Standard Model, symmetries, such as the combination of charge conjugation (C) and parity (P), known as CP-symmetry, are considered to be key principles of particle physics. The violation of CP-invariance can be accommodated within the Standard Model in the weak and the strong interactions; however, it has only been confirmed experimentally in the former. Theory predicts that in heavy-ion collisions, in the presence of a deconfined state, gluonic fields create domains where the parity symmetry is locally violated. This manifests itself in a charge-dependent asymmetry in the production of particles relative to the reaction plane, which is called the Chiral Magnetic Effect (CME).

The first experimental results from STAR (RHIC) and ALICE (LHC) are consistent with the expectations from the CME, however further studies are needed to constrain background effects. These highly anticipated results have the potential to reveal exciting new physics.

''Contact: [mailto:Panos.Christakoglou@nikhef.nl Panos Christakoglou]''

=== ALICE: Machine learning techniques as a tool to study the production of heavy flavour particles ===

There was recently a shift in the field of heavy-ion physics triggered by experimental results obtained in collisions between small systems (e.g. protons on protons). These results resemble the ones obtained in collisions between heavy ions. This consequently raises the question of whether we create the smallest QGP droplet in collisions between small systems. The main objective of this project will be to study the production of charm particles such as D-mesons and Λc-baryons in pp collisions at the LHC. This will be done with the help of a new and innovative technique based on machine learning (ML). The student will also extend the studies to investigate how this production rate depends on the event activity, e.g. on how many particles are created in every collision.

''Contact: [mailto:Panos.Christakoglou@nikhef.nl Panos Christakoglou] and [mailto:Alessandro.Grelli@cern.ch Alessandro Grelli]''

=== ALICE: Energy Loss of Energetic Quarks and Gluons in the Quark-Gluon Plasma ===

One of the ways to study the quark-gluon plasma that is formed in high-energy nuclear collisions is by using high-energy partons (quarks or gluons) that are produced early in the collision and interact with the quark-gluon plasma as they propagate through it. There are several current open questions related to this topic which can be explored in a Master's project. For example, we would like to use the new Monte Carlo generator framework JetScape to simulate collisions to see whether we can extract information about the interaction with the quark-gluon plasma. In the project you will collaborate with one of the PhD students or postdocs in our group to use the model to generate predictions of measurements and compare those to data analysis results. Depending on your interests, the project can focus more on the modeling aspects or on the analysis of experimental data from the ALICE detector at the LHC.

''Contact: [mailto:marco.van.leeuwen@cern.ch Marco van Leeuwen] and [mailto:marta.verweij@cern.ch Marta Verweij]''

=== ALICE: Extreme Rare Probes of the Quark-Gluon Plasma ===

The quark-gluon plasma is formed in high-energy nuclear collisions and also existed shortly after the Big Bang. With the large amount of data collected in recent years at the Large Hadron Collider at CERN, rare processes that previously were not accessible now provide new ways to study how the quark-gluon plasma emerges from the fundamental theory of strong interaction. One such process is the production of the heavy W boson, which in many cases decays to two quarks. The W boson itself doesn’t interact with the quark-gluon plasma because it doesn’t carry color, but the quark decay products do interact with the plasma and therefore provide an ideal tool to study the space-time evolution of this hot and dense medium. In this project you will use data from the ALICE detector at the LHC and simulated data from generators to study various physics mechanisms that could be happening in the real collisions.

''Contact: [mailto:marta.verweij@cern.ch Marta Verweij] and [mailto:marco.van.leeuwen@cern.ch Marco van Leeuwen]''

=== ALICE: Jet Quenching with Machine Learning ===

Machine learning applications are rising steadily as a vital tool in the field of data science but are relatively new in the particle physics community. In this project machine learning tools will be used to gain insights into the modification of a parton shower in the quark-gluon plasma (QGP). The QGP is created in high-energy nuclear collisions and only lives for a very short period of time. Highly energetic partons created in the same collisions interact with the plasma while they traverse it and are observed in the detector as collimated sprays of particles, known as jets. One of the key recent insights is that the internal structure of jets provides information about the evolution of the QGP. With data recorded by the ALICE experiment, you will use jet substructure techniques in combination with machine learning algorithms to dissect the structure of the QGP. Machine learning will be used to select the regions of radiation phase space that are affected by the presence of the QGP.

''Contact: [mailto:marta.verweij@cern.ch Marta Verweij] and [mailto:marco.van.leeuwen@cern.ch Marco van Leeuwen]''

=== ATLAS: High-Performance Simulations for High-Energy Physics Experiments - Electron and Muon Simulation (2 projects) ===

The role of simulation and synthetic data generation for High-Energy Physics (HEP) research is profound. While there are physics-accurate simulation frameworks available to provide the most realistic data syntheses, these tools are slow. Additionally, the output from physics-accurate simulations is closely tied to the experiment that the simulation was developed for (e.g. a fixed detector geometry) and to its software.

Fast simulation frameworks, on the other hand, can drastically simplify the simulation while still striking a balance between the speed and the accuracy of the simulated events. The applications of simplified simulations and data are numerous. We will be focusing on the role of such data as an enabler for Machine Learning (ML) model design research.

This project aims to extend the REDVID simulation framework [1, 2] through the addition of new features.

==== Electron simulation ====

The main feature considered for this iteration is support for different particles, especially electrons.

It is paramount to have enough differentiation between different particle types within a simulation. To be able to simulate the behaviour of an electron, certain characteristics have to be implemented, which are as follows:

* Electrons interact with matter and could emit bremsstrahlung radiation, in turn leading to the generation of secondary particles in the form of showers.
* The concept of jets and showers can be designed in the same way within REDVID. This is an acceptable simplification and boosts code reuse.
* Electrons also lose energy through bremsstrahlung radiation as they pass through matter. This loss of energy can alter the electron's trajectory, causing it to slow down or change direction.

There will be a need for dedicated virtual detector segments to act as matter, or the detector sublayers should be modelled with a finite thickness, or both. The student will test the impact of the added information on developed ML models, which may involve training/retraining of these models.

==== Muon simulation ====

The main feature considered for this iteration is support for different particles, especially muons.

It is paramount to have enough differentiation between different particle types within a simulation. To be able to simulate the behaviour of a muon, certain characteristics have to be implemented, which are as follows:

* Muons are heavy particles and, as a result, possess higher penetration power in matter.
* Muons are unstable particles and decay into other particles, but not necessarily within the range of the detector.
* Muons interact with matter, which could result in a change of their original direction.
* Muons are charged particles and are affected by magnetic fields, resulting in bent trajectories. The curvature of muon trajectories in magnetic fields reveals information about their momentum (a short curvature-to-momentum sketch is given just before the appendix below).
* Distinguishing muons from other particles, i.e. background signals, is crucial.

The student shall study, select and implement a minimum set of distinguishing characteristics in REDVID. A validation step showcasing the differences in particle behaviour may be required. There may be a need for dedicated virtual detector layers to be defined. The student will test the impact of the added information on developed ML models, which may involve training/retraining of these models.

The project is part of an ongoing effort to train and test ML models for particle track reconstruction for the HL-LHC. The improved version of REDVID can be used by the student and other users to generate training data for ML models. Depending on the progress and the interest, a secondary goal could be to perform comparisons with physics-accurate simulations or to investigate the impact of the new features on developed ML models.

'''Bonus:''' The student will be encouraged and supported to publish the output of this study in a relevant journal, such as "Data in Brief" by Elsevier.
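The muon bullet on curvature can be made concrete in a few lines: given three measured hit positions in the bending plane, the radius of the circle through them fixes the transverse momentum via pT ≈ 0.3 B R. The hits, field and resolution below are made-up numbers used only for illustration.

<syntaxhighlight lang="python">
import numpy as np

def circumradius(p1, p2, p3):
    """Radius of the circle through three points in the bending plane."""
    a = np.linalg.norm(p2 - p3)
    b = np.linalg.norm(p1 - p3)
    c = np.linalg.norm(p1 - p2)
    cross = (p2[0] - p1[0]) * (p3[1] - p1[1]) - (p2[1] - p1[1]) * (p3[0] - p1[0])
    return a * b * c / (2.0 * abs(cross))   # R = abc / (4 * triangle area)

b_field = 2.0                        # assumed solenoid field [T]
pt_true = 20.0                       # toy muon transverse momentum [GeV]
r_track = pt_true / (0.3 * b_field)  # bending radius [m]

# three hits on that circle (small turning angles), smeared by a 50 um resolution
rng = np.random.default_rng(seed=2)
angles = np.array([0.02, 0.05, 0.08])
hits = np.stack([r_track * np.sin(angles),
                 r_track * (1.0 - np.cos(angles))], axis=1)
hits += rng.normal(0.0, 50e-6, size=hits.shape)

pt_meas = 0.3 * b_field * circumradius(*hits)
print(f"reconstructed pT = {pt_meas:.1f} GeV (true {pt_true} GeV)")
</syntaxhighlight>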
  
==== Appendix - Terminology ====

The terminology for the considered simulations and their features is domain-specific and is explained below:

*Synthetic data: Data generated during a simulation, which resembles real data to a limited extent.
*Physics-accurate simulation: A type of simulation that strongly takes into account real-world physical interactions and utilises physics formulas to achieve this.
*Complexity-aware simulation framework: A simulator which can be configured with different levels of simulation complexity, making the simulation closer to or further away from the real-world case.
*Complexity-reduced data set: Simplified data resulting from simplified simulations. This is in comparison to real data, or data generated by physics-accurate simulations.

==== References ====

[1] U. Odyurt et al., 2023, "Reduced Simulations for High-Energy Physics, a Middle Ground for Data-Driven Physics Research". URL: https://doi.org/10.48550/arXiv.2309.03780

[2] U. Odyurt, 2023, "REDVID Simulation Framework". URL: https://virtualdetector.com/redvid

[3] T. Sjöstrand et al., 2006, "PYTHIA 6.4 physics and manual". URL: https://doi.org/10.1088/1126-6708/2006/05/026

''Contact: [mailto:uodyurt@nikhef.nl dr. ir. Uraz Odyurt] and [mailto:f.dias@nikhef.nl dr. Flavia de Almeida Dias]''

----

=== Lepton Collider: Pixel TPC testbeam ===

In the Lepton Collider group at Nikhef we work on a tracking detector for a future collider (e.g. the ILC in Japan). We are developing a gaseous Time Projection Chamber with a pixel readout. At Nikhef we have built an 8-quad GridPix module based on the Timepix3 chip, which is a detector of about 20 cm x 40 cm x 10 cm in size. In August 2020 we will test the device at the DESY particle accelerator in Hamburg. For the project you could work on preparations for the test beam (e.g. running the data acquisition and performing data monitoring using our setup in the lab). The next topics will be the participation in the data taking during the test beam at DESY, the analysis of the data using C++ and ROOT and - finally - publication of the results in a scientific journal.

Our latest paper can be found at https://www.nikhef.nl/~s01/quad_paper.pdf.

''Contact: [mailto:Peter.Kluit@nikhef.nl Peter Kluit] and Kees Ligtenberg''

=== Dark Matter: Signal reconstruction in XENONnT ===

The next generation direct detection dark matter experiment - XENONnT - comprises close to 500 photomultiplier tubes (PMTs) in the main detector volume. These PMTs are configured to be able to detect even single photons. When a single photoelectron (PE) is detected, the recorded signal (a pulse) is convolved with the detector response of the PMT. Due to this detector response the pulse shape of a single PE is spread out in time. For XENONnT we would like to explore the possibility of implementing a digital (software) filter to deconvolve the detected pulse back to the “true” instantaneous shape (without the detector spread). This is a virtually unexplored new step in the Xenon analysis framework. Later in the analysis framework these pulses from all the PMTs are combined into a signal referred to as a ‘peak’. For XENONnT it is essential to be extremely good at discriminating between the two types of peaks caused by interactions in the detector: a prompt primary scintillation signal (S1) and a secondary ionization signal (S2). The parameters in the software haven’t - as of the time of writing - been optimized for the XENONnT-detector conditions.

The student would investigate how a deconvolution filter would benefit the XENONnT analysis framework and develop such a filter. Furthermore, the student will work on the classification of these signals to fully exploit the XENONnT detector. This will be done with simulated data at first, but may later even be performed on actual XENONnT data. As an extension, the possibility of applying machine learning to correctly distinguish between the two signals could be explored. This is a data-analysis oriented project where Python skills are paramount.

''Contact: [mailto:decowski@nikhef.nl Patrick Decowski] and [mailto:j.angevaare@nikhef.nl Joran Angevaare]''
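The deconvolution idea in the XENONnT signal-reconstruction project above can be prototyped in a few lines: convolve a "true" photon-arrival pattern with a single-PE response template, add noise, and undo the spread with a regularised inverse filter in the frequency domain. The template shape, sampling and noise level below are invented for illustration and are not the real PMT response.

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(seed=4)
n, dt = 512, 10.0                       # samples and sampling step [ns] (toy values)

# "True" instantaneous signal: two photons arriving 80 ns apart.
true = np.zeros(n)
true[100], true[108] = 1.0, 0.7

# Single-PE response template: fast rise, slower exponential fall (toy shape).
t = np.arange(n) * dt
template = (1.0 - np.exp(-t / 15.0)) * np.exp(-t / 80.0)
template /= template.sum()

observed = np.convolve(true, template)[:n] + rng.normal(0.0, 0.002, n)

# Regularised (Wiener-like) inverse filter in the frequency domain.
H = np.fft.rfft(template)
reg = 0.01 * np.max(np.abs(H)) ** 2
deconvolved = np.fft.irfft(np.fft.rfft(observed) * np.conj(H) / (np.abs(H) ** 2 + reg), n)

print("observed peak sample   :", int(np.argmax(observed)))
print("deconvolved peak sample:", int(np.argmax(deconvolved)))
print("true peak sample       :", int(np.argmax(true)))
</syntaxhighlight>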
  
=== Dark Matter: Building better Dark Matter Detectors - the XAMS  R&D Setup ===
The Amsterdam Dark Matter group operates an R&D xenon detector at Nikhef. The detector is a dual-phase xenon time-projection chamber and contains about 0.5 kg of ultra-pure liquid xenon in the central volume. We use this detector for the development of new detection techniques - such as utilizing our newly installed silicon photomultipliers - and to improve the understanding of the response of liquid xenon to various forms of radiation. The results could be directly used in the XENONnT experiment, the world’s most sensitive direct detection dark matter experiment at the Gran Sasso underground laboratory, or for future Dark Matter experiments like DARWIN. We have several interesting projects for this facility. We are looking for someone who is interested in working in a laboratory on high-tech equipment, modifying the detector, taking data and analyzing the data themselves. You will "own" this experiment.
  
 
''Contact: [mailto:decowski@nikhef.nl Patrick Decowski] and [mailto:z37@nikhef.nl Auke Colijn]''
 
  
===Dark Matter: Searching for Dark Matter Particles - XENONnT Data Analysis===
The XENON collaboration has used the XENON1T detector to achieve the world’s most sensitive direct detection dark matter results and is currently operating the XENONnT successor experiment. The detectors operate at the Gran Sasso underground laboratory and consist of so-called dual-phase xenon time-projection chambers filled with ultra-pure xenon. Our group has an opening for a motivated MSc student to do analysis with the new data coming from the XENONnT detector. The work will consist of understanding the detector signals and applying a deep neural network to improve the (gas-)background discrimination in our Python-based analysis tool, in order to improve the sensitivity for low-mass dark matter particles. The work will continue a study started by a recent graduate. There will also be an opportunity to do data-taking shifts at the Gran Sasso underground laboratory in Italy.
  
 
''Contact: [mailto:decowski@nikhef.nl Patrick Decowski] and [mailto:z37@nikhef.nl Auke Colijn]''
 
  
=== Dark Matter: Signal reconstruction and correction in XENONnT ===

XENONnT is a low background experiment operating at the INFN Gran Sasso underground laboratory with the main goal of detecting Dark Matter interactions with xenon target nuclei. The detector, consisting of a dual-phase time projection chamber, is filled with ultra-pure xenon, which acts as a target and detection medium. Understanding the detector's response to various calibration sources is a mandatory step in exploiting the scientific data acquired. This MSc thesis aims to develop new methods to improve the reconstruction and correction of scintillation/ionization signals from calibration data. The student will work with modern (Python-based) analysis techniques and will collaborate with other analysts within the international XENON Collaboration.

''Contact: [mailto:mpierre@nikhef.nl Maxime Pierre], [mailto:decowski@nikhef.nl Patrick Decowski]''

=== Dark Matter: Fast simulation studies ===

For Dark Matter experiments it is crucial to understand sources of backgrounds in great detail. The most common way to study the effect of backgrounds on the Dark Matter sensitivity is by the use of Monte Carlo simulations. Unfortunately, the standard Monte Carlo techniques are extremely inefficient: one sometimes needs to simulate millions of events before one background event appears in the Dark Matter search area. We have developed a Monte Carlo technique that accelerates this process by up to 1000x. The method has been validated on very simple and unrealistic detector models. The goal of this project is to make a realistic detector model for the fast detector simulations. For this we are looking for a student with good programming skills, an interest in a software project, and the desire to deeply understand the analysis of Dark Matter experimental data.

''Contact: [mailto:decowski@nikhef.nl Patrick Decowski] and [mailto:z37@nikhef.nl Auke Colijn]''
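Speed-ups like the "up to 1000x" quoted for the fast-simulation project typically come from biased (importance-sampled) Monte Carlo. The toy below estimates a rare tail probability with plain sampling and with a shifted proposal distribution, to show how the weighted estimator reaches the same answer with far fewer events; the Gaussian here is a deliberately simple stand-in for a real radiation-transport problem, not the method used at Nikhef.

<syntaxhighlight lang="python">
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(seed=6)
threshold = 5.0                         # stand-in for "background reaches the signal region"
exact = norm.sf(threshold)              # ~2.9e-7

# Plain Monte Carlo: almost never produces an event beyond the threshold.
n = 1_000_000
plain = np.mean(rng.normal(size=n) > threshold)

# Importance sampling: draw from a proposal shifted into the rare region
# and weight each event by the ratio of the true to the proposal density.
x = rng.normal(loc=threshold, size=n)
weights = norm.pdf(x) / norm.pdf(x, loc=threshold)
biased = np.mean(weights * (x > threshold))
biased_err = np.std(weights * (x > threshold)) / np.sqrt(n)

print(f"exact              : {exact:.3e}")
print(f"plain MC           : {plain:.3e}")
print(f"importance sampling: {biased:.3e} +- {biased_err:.3e}")
</syntaxhighlight>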
===Dark Matter: The Ultimate Dark Matter Experiment - DARWIN Sensitivity Studies===
DARWIN is the “ultimate” direct detection dark matter experiment, with the goal to reach the so-called “neutrino floor”, when neutrinos become a hard-to-reduce background. The large and exquisitely clean xenon mass will allow DARWIN to also be sensitive to other physics signals such as solar neutrinos, double-beta decay from Xe-136, axions and axion-like particles etc. While the experiment will only start in 2027, we are in the midst of optimizing the experiment, which is driven by simulations. We have an opening for a student to work on the GEANT4 Monte Carlo simulations for DARWIN. We are also working on a “fast simulation” that could be included in this framework. It is your opportunity to steer the optimization of a large and unique experiment. This project requires good programming skills (Python and C++) and data analysis/physics interpretation skills.
 
''Contact: [mailto:t.pollmann@nikhef.nl Tina Pollmann], [mailto:decowski@nikhef.nl Patrick Decowski] or [mailto:z37@nikhef.nl Auke Colijn]''
===Dark Matter: Exploring new background sources for DARWIN===
Experiments based on the xenon dual-phase time projection chamber detection technology have already demonstrated their leading role in the search for Dark Matter. The unprecedented low level of background reached by the current generation, such as XENONnT, allows such experiments to be sensitive to new rare-event physics searches, broadening their physics program. The next generation of experiments is already under consideration with the DARWIN observatory, which aims to surpass its predecessors in terms of background level and mass of xenon target. With the increased sensitivity to new physics channels, such as the study of neutrino properties, new sources of backgrounds may arise. This MSc thesis aims to investigate potential new sources of background for DARWIN and is a good opportunity for the student to contribute to the design of the experiment. This project will rely on Monte Carlo simulation tools such as GEANT4 and FLUKA, and good programming skills (Python and C++) are advantageous.
 
''Contact: [mailto:mpierre@nikhef.nl Maxime Pierre], [mailto:decowski@nikhef.nl Patrick Decowski]''
===Dark Matter: Sensitive tests of wavelength-shifting properties of materials for dark matter detectors===
Rare event search experiments that look for neutrino and dark matter interactions are performed with highly sensitive detector systems, often relying on scintillators, especially liquid noble gases, to detect particle interactions. Detectors consist of structural materials that are assumed to be optically passive, and light detection systems that use reflectors, light detectors, and sometimes, wavelength-shifting materials. MSc theses are available related to measuring the efficiency of light detection systems that might be used in future detectors. Furthermore, measurements to ensure that presumably passive materials do not fluoresce, at the low level relevant to the detectors, can be done. Part of the thesis work can include Monte Carlo simulations and data analysis for current and upcoming dark matter detectors, to study the effect of different levels of desired and nuisance wavelength shifting. In this project, students will acquire skills in photon detection, wavelength shifting technologies, vacuum systems, UV and extreme-UV optics, detector design, and optionally in Python and C++ programming, data analysis, and Monte Carlo techniques.
 
''Contact: [mailto:Tina.Pollmann@tum.de Tina Pollmann]''
===Detector R&D: Energy Calibration of hybrid pixel detector with the Timepix4 chip===
The Large Hadron Collider at CERN will increase its luminosity in the coming years. For the LHCb experiment the number of collisions per bunch crossing increases from 7 to more than 40. To distinguish all tracks from the quasi-simultaneous collisions, time information will have to be used in addition to spatial information. A big step on the way to fast silicon detectors is the recently developed Timepix4 ASIC. Timepix4 consists of 448x512 pixels, but the pixels are not identical and there are pixel-to-pixel fluctuations in the time and charge measurement. The ultimate time resolution can only be achieved after calibration of both the time and energy measurements.

The goal of this project is to study the energy calibration of Timepix4. Typical research questions are: how does the resolution depend on the threshold and the Krummenacher (discharge) current, and does a different sensor affect the energy resolution? In this research you will take measurements with calibration pulses, lasers and radioactive sources to obtain data to calibrate the detector. The work consists of hands-on work in the lab to build/adapt the test set-up, and analysis of the data obtained.
 
''Contact: [mailto:doppenhu@nikhef.nl Daan Oppenhuis] and [mailto:hella.snoek@nikhef.nl Hella Snoek]''
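Energy calibration of Timepix-family chips is often done per pixel by fitting the time-over-threshold (ToT) response to test pulses of known charge with a "surrogate" function of the form f(E) = a·E + b − c/(E − t). The snippet below fits that function to invented test-pulse data with scipy; real Timepix4 data, thresholds and units would of course differ, so treat it as a sketch of the procedure only.

<syntaxhighlight lang="python">
import numpy as np
from scipy.optimize import curve_fit

def surrogate(energy, a, b, c, t):
    """Commonly used ToT-vs-energy surrogate function for Timepix-like chips."""
    return a * energy + b - c / (energy - t)

# Invented "true" pixel response and toy test-pulse data (energy in keV, ToT in counts).
true_params = (2.0, 30.0, 120.0, 2.5)
energies = np.linspace(5.0, 60.0, 25)
rng = np.random.default_rng(seed=8)
tot = surrogate(energies, *true_params) + rng.normal(0.0, 1.0, energies.size)

popt, pcov = curve_fit(surrogate, energies, tot, p0=(1.5, 20.0, 100.0, 2.0))
errors = np.sqrt(np.diag(pcov))
for name, value, err in zip("abct", popt, errors):
    print(f"{name} = {value:7.2f} +- {err:.2f}")
</syntaxhighlight>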
===Detector R&D: Studies of wafer-scale sensors for ALICE detector upgrade and beyond===
One of the biggest milestones of the ALICE detector upgrade (foreseen in 2026) is the implementation of wafer-scale (~28 cm x 18 cm) monolithic silicon active pixel sensors in the tracking detector, with the goal of having truly cylindrical barrels around the beam pipe. To demonstrate such an unprecedented technology in high energy physics detectors, a few chips will soon be available in the Nikhef laboratories for testing and characterization purposes.

The goal of the project is to contribute to the validation of the samples against the ALICE tracking detector requirements, with a focus on timing performance, in view of other applications in future high energy physics experiments beyond ALICE.

We are looking for a student with a focus on lab work who is interested in high precision measurements with cutting-edge instrumentation. You will be part of the Nikhef Detector R&D group and you will have, at the same time, the chance to work in an international collaboration where you will report on the performance of these novel sensors. There may even be the opportunity to join beam tests at the CERN or DESY facilities. Besides interest in hardware, some proficiency in computing is required (Python or C++/ROOT).
 
''Contact: [mailto:jory.sonneveld@nikhef.nl Jory Sonneveld]''
  
===Detector R&D: Time resolution of monolithic silicon detectors===
Monolithic silicon detectors based on industrial Complementary Metal Oxide Semiconductor (CMOS) processes offer a promising approach for large scale detectors due to their ease of production and low material budget. Until recently, their low radiation tolerance has hindered their applicability in high energy particle physics experiments. However, new prototypes, such as the one in this project, have started to overcome these hurdles, making them feasible candidates for future experiments in high energy particle physics. In this project, you will investigate the temporal performance of a radiation-hard monolithic detector prototype, produced at the end of 2023, using laser setups in the laboratory. You will also participate in meetings with the international collaboration working on this detector to present reports on the prototype's performance. Different aspects of the system, such as charge calibration and power consumption, are to be investigated concerning their impact on the temporal resolution. Depending on the progress of the work, a first full three-dimensional characterization of the prototype's performance using a state-of-the-art two-photon absorption laser setup at Nikhef and/or an investigation into irradiated samples, for a closer look at the impact of radiation damage on the prototype, are possible. This project is looking for someone interested in working hands-on with cutting-edge detector and laser systems at the Nikhef laboratory. Python programming skills and Linux experience are an advantage.
 
''Contact: [mailto:jory.sonneveld@nikhef.nl Jory Sonneveld], [mailto:uwe.kraemer@nikhef.nl Uwe Kraemer]''
  
=== Dark Matter & Amsterdam Scientific Instruments: Simulations for Industry ===

In the Nikhef Dark Matter group we have built up extensive expertise with Monte Carlo simulations of ionizing radiation. Although these simulations aim to estimate background levels in our XENON experiments, the same techniques can be applied to study radiation transport in industrial devices. Amsterdam Scientific Instruments (ASI) is a company at Science Park that develops and sells radiation imaging equipment that is used, amongst others, in electron microscopy. For this application ASI needs a detailed study of gamma-ray backgrounds to optimize shielding for their products. The project aims at optimizing a shielding design based on GEANT4 simulations. The results may be implemented in next-generation products of ASI. We are looking for a student with preferably strong computing skills and with an interest in science-industry collaboration.

''Contact: [mailto:decowski@nikhef.nl Patrick Decowski] and [mailto:z37@nikhef.nl Auke Colijn]''

===Detector R&D: Improving a Laser Setup for Testing Fast Silicon Pixel Detectors===

For the upgrades of the innermost detectors of experiments at the Large Hadron Collider in Geneva, in particular to cope with the large number of collisions per second from 2027, the Detector R&D group at Nikhef tests new pixel detector prototypes with a variety of laser equipment at several wavelengths. The lasers can be focused down to a small spot to scan over the pixels on a pixel chip. Since the laser penetrates the silicon, the pixels are not illuminated by just the focal spot, but by the entire three-dimensional, hourglass-like (double-cone) light intensity distribution. So, how well defined is the volume in which charge is released? Can it be made much smaller than a pixel? And, if so, what would the optimum focus be? For this project the student will first estimate the expected intensity distribution inside a sensor. This corresponds to the density of released charge within the silicon. To verify the predictions, you will measure real pixel sensors for the LHC experiments.

This project involves a lot of hands-on work in the lab, as well as programming and working on Unix machines.
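
As a rough, illustrative starting point for the intensity estimate mentioned above (this sketch is an editorial addition, not part of the project description), the textbook Gaussian-beam model gives the beam radius versus depth, w(z) = w0·sqrt(1+(z/zR)^2), with Rayleigh range zR = pi·n·w0^2/lambda inside a medium of refractive index n. The wavelength, waist, refractive index and thickness below are assumed example values only.

<syntaxhighlight lang="python">
import numpy as np

# Assumed example parameters (not taken from the project description)
wavelength = 1064e-9      # laser wavelength in vacuum [m]
n_si       = 3.5          # approximate refractive index of silicon near 1064 nm
w0         = 1.0e-6       # assumed focal spot radius (waist) [m]
thickness  = 100e-6       # assumed sensor thickness [m]

z_R = np.pi * n_si * w0**2 / wavelength   # Rayleigh range inside the silicon

z = np.linspace(-thickness / 2, thickness / 2, 11)   # depth relative to the focal plane
w = w0 * np.sqrt(1.0 + (z / z_R)**2)                 # beam radius versus depth

# The on-axis intensity falls as (w0/w)^2 since the power per slice is constant
for zi, wi in zip(z, w):
    print(f"z = {zi*1e6:6.1f} um   beam radius = {wi*1e6:5.2f} um   "
          f"relative on-axis intensity = {(w0/wi)**2:5.3f}")
</syntaxhighlight>

Comparing w(z) at the sensor surfaces with the pixel pitch gives a first idea of how well the charge-deposition volume can be confined.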
  
''Contact: [mailto:martinfr@nikhef.nl Martin Fransen]''
  
=== The Modulation experiment: Data Analysis ===

For years there have been controversial claims of potential new physics based on time-varying decay rates of radioactive sources on top of ordinary exponential decay. While some of these claims have been refuted, others have still to be confirmed or falsified. To this end, a dedicated experiment - the Modulation experiment - has been designed and has been operational for the past four years. Using four identical and independent setups, the experiment is almost ready for a final analysis to conclude on these claims. In this project the student will perform this analysis, preferably resulting in a conclusive paper. This will require combining the data of the four setups and close collaboration with a small group from the four involved institutes (Purdue University (USA), Universität Zürich (Switzerland), Centro Brasileiro de Pesquisas Físicas (Brazil) and Nikhef). This project is data-analysis oriented. Additionally, lab skills can be required as one of the setups is situated at Nikhef.

''Contact: [mailto:z37@nikhef.nl Auke Colijn] and [mailto:j.angevaare@nikhef.nl Joran Angevaare]''

===Detector R&D: Time resolution of hybrid pixel detectors with the Timepix4 chip===

Precise time measurements with silicon pixel detectors are very important for experiments at the High-Luminosity LHC and a future circular collider. The spatial resolution of current silicon trackers will not be sufficient to distinguish the large number of collisions that will occur within individual bunch crossings. In a new method, typically referred to as 4D tracking, spatial measurements of pixel detectors will be combined with time measurements to better distinguish collision vertices that occur close together. New sensor technologies are being explored to reach the required time measurement resolution of tens of picoseconds, and the results are promising.

However, the signals that these pixelated sensors produce have to be processed by front-end electronics, which hence play a large role in the total time resolution of the detector. The front-end electronics have many parameters that can be optimised to give the best time resolution for a specific sensor type.

In this project you will be working with the Timepix4 chip, which is a so-called application-specific integrated circuit (ASIC) that is designed to read out pixelated sensors. This ASIC is used extensively in detector R&D for the characterisation of new sensor technologies requiring precise timing (< 50 ps). To study the time resolution you will be using laser setups in our lab, and there might be an opportunity to join a test with charged-particle beams at CERN.

These measurements will be complemented with data from the built-in calibration-pulse mechanism of the Timepix4 ASIC. Your work will enable further research performed with this ASIC, and serve as input to the design and operation of future ASICs for experiments at the High-Luminosity LHC.
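
For orientation only (an editorial illustration, not taken from the project description), the individual contributions to the measured time resolution are usually combined in quadrature; all numbers below, including the TDC bin width, are placeholder assumptions.

<syntaxhighlight lang="python">
import numpy as np

# Placeholder example values in picoseconds (assumptions, not measured numbers)
sigma_sensor   = 30.0   # sensor contribution (charge deposition and signal formation)
sigma_frontend = 25.0   # front-end jitter and residual time walk after correction
tdc_bin        = 50.0   # assumed TDC bin width; quantization contributes bin/sqrt(12)

sigma_tdc   = tdc_bin / np.sqrt(12.0)
sigma_total = np.sqrt(sigma_sensor**2 + sigma_frontend**2 + sigma_tdc**2)

print(f"TDC quantization contribution : {sigma_tdc:.1f} ps")
print(f"Total time resolution         : {sigma_total:.1f} ps")
</syntaxhighlight>

In practice the project consists of measuring and minimising each of these terms for a given sensor and operating point.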
  
''Contact: [mailto:k.heijhoff@nikhef.nl Kevin Heijhoff] and [mailto:martinb@nikhef.nl Martin van Beuzekom]''
  
===Detector R&D: Performance studies of Trench Isolated Low Gain Avalanche Detectors (TI-LGAD) ===

The future vertex detector of the LHCb experiment needs to measure the spatial coordinates and time of the particles originating from the LHC proton-proton collisions with resolutions better than 10 um and 50 ps, respectively. Several technologies are being considered to achieve these resolutions. Among those is a novel sensor technology called the Trench Isolated Low Gain Avalanche Detector (TI-LGAD).

Prototype pixelated sensors have been manufactured recently and have to be characterised. To this end, these new sensors will be bump-bonded to a Timepix4 ASIC, which provides charge and time measurements in each of its 230 thousand pixels. Characterisation will be done using a lab setup at Nikhef, and includes tests with a micro-focused laser beam, radioactive sources, and possibly with particle tracks obtained in a test beam. This project involves data taking with these new devices and analysing the data to determine performance parameters such as the spatial and temporal resolution as a function of temperature and other operational conditions.
  
''Contacts: [mailto:kazu.akiba@nikhef.nl Kazu Akiba] and [mailto:martinb@nikhef.nl Martin van Beuzekom]''
  
=== Detector R&D: Test beam with a bent ALPIDE monolithic active pixel sensor ===

The ALICE inner tracking system (ITS) 2 is currently being installed at CERN. This detector makes use of ultra-lightweight monolithic active pixel sensors, the first to use this technology after the STAR experiment at RHIC in Brookhaven. These very thin pixel detectors have a low power consumption, result in very little material in the detector, and still have optimal timing and resolution. For the next long shutdown in 2025, an even smaller technology version of the ALPIDE chip will be used, installed by bending larger sensor surfaces around the beam pipe. Recent test beams at DESY in Hamburg show this yields good results. One important property of a sensor is its resolution. This is, however, not straightforward to determine for a bent sensor. In this project, you will work on measuring the resolution of the bent ALPIDE MAPS for the first time. If the travel situation allows, you will have the opportunity to join the ALICE test beam group at DESY in Hamburg to take part in the exciting experience of taking real data.

''Contact: [mailto:jory.sonneveld@nikhef.nl Jory Sonneveld]''
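
One common way to quote an intrinsic resolution from test-beam data is to measure the width of the track-to-hit residuals and subtract the telescope pointing resolution in quadrature. The toy below is an editorial illustration of that recipe with assumed numbers (approximate ALPIDE-like pitch, invented telescope resolution), assuming Gaussian and uncorrelated contributions; the project itself addresses how this picture changes for a bent sensor.

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(1)

pitch           = 29.24e-3                 # approximate ALPIDE-like pixel pitch [mm]
sigma_sensor    = pitch / np.sqrt(12.0)    # naive binary-readout expectation
sigma_telescope = 3e-3                     # assumed telescope pointing resolution at the DUT [mm]

# Toy residuals: sensor smearing plus telescope pointing smearing
n = 100_000
residuals = rng.normal(0.0, sigma_sensor, n) + rng.normal(0.0, sigma_telescope, n)

sigma_meas      = residuals.std()
sigma_intrinsic = np.sqrt(sigma_meas**2 - sigma_telescope**2)

print(f"measured residual width : {sigma_meas*1e3:.2f} um")
print(f"estimated intrinsic res.: {sigma_intrinsic*1e3:.2f} um "
      f"(input was {sigma_sensor*1e3:.2f} um)")
</syntaxhighlight>

In reality, charge sharing and, for a bent sensor, the locally varying incidence angle modify this simple picture, which is exactly what the measurement sets out to quantify.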
=== Detector R&D: A Telescope with Ultrathin Sensors for Beam Tests===

To measure the performance of new prototypes for upgrades of the LHC experiments and beyond, a telescope is typically used in a beam line of charged particles, so that the results in the prototype can be compared to particle tracks measured with this telescope. In this project, you will continue work on a very lightweight, compact telescope using ALICE PIxel DEtectors (ALPIDEs). This includes work on the mechanics, the data acquisition software, and a movable stage. The telescope is foreseen to be tested in the Delft Proton Therapy Center. If time allows, you will add a timing plane and perform a measurement with one of our prototypes. Apart from travel to Delft, there is a possibility to travel to other beam line facilities.

''Contact: [mailto:jory.sonneveld@nikhef.nl Jory Sonneveld]''
  
===Detector R&D: Laser Interferometer Space Antenna (LISA) - the first gravitational wave detector in space===
The space-based gravitational wave antenna LISA is one of the most challenging space missions ever proposed. ESA plans to launch around 2035 three spacecraft separated by a few million kilometres. This constellation measures tiny variations in the distances between test masses located in each satellite to detect gravitational waves from sources such as supermassive black holes. LISA is based on laser interferometry, and the three satellites form a giant Michelson interferometer. LISA measures a relative phase shift between one local laser and one distant laser by light interference. The phase shift measurement requires sensitive sensors. The Nikhef Detector R&D group fabricated prototype sensors in 2020 together with the photonics industry and the Dutch institute for space research SRON. Nikhef & SRON are responsible for the Quadrant PhotoReceiver (QPR) system: the sensors, the housing including a complex mount to align the sensors with tens of nanometers accuracy, various environmental tests at the European Space Research and Technology Centre (ESTEC), and the overall performance of the QPR in the LISA instrument. Currently we are fabricating improved sensors, optimizing the mechanics and preparing environmental tests. As an MSc student, you will work on various aspects of the wavefront sensor development: studying the performance of the epitaxial stacks of Indium Gallium Arsenide, setting up test benches to characterize the sensors and the QPR system, and performing the actual tests and data analysis, in combination with performance studies and simulations of the LISA instrument.
Possible projects are listed below, but it is best to contact us, as the exact content may change:
#'''Title''': Simulating LISA QPD performance for LISA mission sensitivity. <br>    '''Topic''': Simulation and Data Analysis. <br>    '''Description''': we must provide accurate information to the LISA collaboration about the expected and actual performance of the LISA QPRs. This project will focus on using data from measurements taken at Nikhef to integrate into the simulation packages used within the LISA collaboration. The student will have the option to collect their own data to verify the simulations. Performance parameters include spatial uniformity and phase response, crosstalk and thermal response across the LISA sensitivity band. <br> These simulations can then be used to investigate the full LISA performance and the impact of noise sources. This will involve simulating the heterodyne signals expected on the LISA QPD and the impact on sensing techniques such as Differential Wavefront Sensing (DWS) and Tilt-to-Length (TTL) noise. Simulation tools include Finesse (Python), IFOCAD (C++) or FieldProp (MATLAB), depending on the student's capabilities and preference. This work is important for understanding how stably, and with what noise, LISA interferometry will perform during real operation in space.
#'''Title''': Investigate the Response of the Gap in the LISA QPD. <br>    '''Topic''': Experimental. <br>    '''Description''': At Nikhef we are developing the photodiodes that will be used in the upcoming ESA/NASA LISA mission. We currently have our first batch of Quadrant Photodiodes (QPDs) that vary in diameter, thickness and gap width between the quadrants. The goal of this project is to develop a free-space laser test setup to measure the response of the gap between the quadrants of the LISA Quadrant Photodiode (QPD). It is important to understand the behaviour of the gap between the photodiode quadrants, since this can impact the overall performance of the photodiode and thus the sensitivity of LISA (a toy beam-scan sketch illustrating this gap response follows this list). <br> The measurements will involve characterising the test laser beam, configuring test equipment, and handling and installing optical components. Besides taking the data, the student will also be responsible for analysing the results, preferably using Python (other computer languages are acceptable, based on the student's preference).
#'''Title''': Investigate the Response of LISA QPDs for Einstein Telescope Pathfinder. <br>    '''Topic''': Experimental. <br>    '''Description''': Current gravitational wave (GW) interferometers typically operate at a wavelength of 1064 nm. However, future GW detectors will operate at longer wavelengths such as 1550 nm or 2000 nm. As a result of the wavelength change, much of the current technology is unsuitable; thus, developments are underway for the next generation of GW detectors. Europe's future GW detector, the Einstein Telescope, is currently in its infancy. A smaller-scale prototype, known as ET Pathfinder, is currently being built and serves as a test bench for the full-scale detector. <br> In Nikhef's Detector R&D group, we want to develop quadrant photodiodes (QPDs) that sense the light from the interferometer for the Einstein Telescope (ET) and ET Pathfinder. These QPDs require very low noise as well as high sensitivity in order to measure the small interferometer signals. To that end, our first step is to use the current QPDs that have been developed for the ESA/NASA LISA mission. <br> This project will focus on performance tests of the LISA QPDs using a 1550 nm laser. The student will be tasked with developing a test setup as well as taking the data and analysing the results. As part of this project, the student will learn about laser characterisation, Gaussian optics and instrumentation techniques. These results will be important for designing the next generation of QPDs and are of interest to the ET consortium, where the student can present their results.
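
To illustrate what the gap measurement in project 2 could look like (an editorial sketch; the beam size and gap width are assumed values, and the model is one-dimensional), the toy below scans a Gaussian spot across the dead strip between two quadrants and computes the fraction of power landing on active silicon.

<syntaxhighlight lang="python">
import numpy as np

w    = 0.05   # assumed 1/e^2 beam radius [mm]
gap  = 0.02   # assumed gap width between quadrants [mm]
half = gap / 2.0

def collected_fraction(x0, nsamp=2001):
    """Fraction of a Gaussian spot centred at x0 that lands on active silicon
    (everything outside the dead strip |x| < half); 1D model along the scan axis."""
    x = np.linspace(x0 - 5 * w, x0 + 5 * w, nsamp)
    intensity = np.exp(-2.0 * (x - x0) ** 2 / w ** 2)
    active = np.abs(x) > half
    return (intensity * active).sum() / intensity.sum()

for x0 in np.linspace(-0.1, 0.1, 9):
    print(f"spot position {x0:+.3f} mm -> collected fraction {collected_fraction(x0):.3f}")
</syntaxhighlight>

The measured dip as the spot crosses the gap, compared to such a simple model, reveals effects like charge collection from within the nominally dead region.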
''Contact: [mailto:nielsvb@nikhef.nl Niels van Bakel] or [mailto:tmistry@nikhef.nl Timesh Mistry]''
  
=== Detector R&D: Simulating the performance of the ATLAS pixel detector after years of radiation ===

The innermost detector of the ATLAS experiment, closest to the beam pipe, is the ATLAS pixel detector. The pixel sensors in this area receive the highest amounts of radiation and their performance suffers accordingly. To better understand the effects of radiation damage and to be able to predict the future performance, the pixel sensors are modeled using programs such as technology computer-aided design (TCAD) for modeling the electric fields, which serve as input for programs such as AllPix2 for modeling observables that affect the signal quality, such as the charge collection efficiency. In this project, you will use TCAD to make electric field maps of the sensor, and include effects of radiation damage in AllPix2, which has not been done before. You will compare your simulations to real data from the ATLAS experiment as well as to data from test beams. After validation, improved predictions for the performance of the next ATLAS pixel detector, to be installed in the next long shutdown in 2025, will help ATLAS better prepare for future LHC runs.
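
As a first orientation (an editorial sketch; the trapping constant and drift time below are assumed, order-of-magnitude numbers, not ATLAS values), radiation-induced charge trapping is often parametrised with an effective trapping time that scales inversely with fluence, giving a charge collection efficiency of roughly exp(-t_drift/tau_eff).

<syntaxhighlight lang="python">
import numpy as np

beta    = 4.0e-16   # assumed trapping constant [cm^2/ns], order of magnitude only
t_drift = 5.0       # assumed average drift time [ns]

for phi in [1e14, 5e14, 1e15, 5e15, 1e16]:      # fluence in 1 MeV n_eq / cm^2
    tau_eff = 1.0 / (beta * phi)                # effective trapping time [ns]
    cce = np.exp(-t_drift / tau_eff)
    print(f"fluence {phi:9.1e} n_eq/cm^2 -> tau_eff {tau_eff:6.1f} ns, CCE ~ {cce:5.2f}")
</syntaxhighlight>

A full simulation replaces these single numbers with position-dependent drift times obtained from the TCAD field maps, which is the point of the project.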
''Contact: [mailto:jory.sonneveld@nikhef.nl Jory Sonneveld]''

===Detector R&D: Other projects===

Are you looking for a slightly different project? Are the above projects already taken? Are you coming in at an unusual time of the year? Do not hesitate to contact us! We always have new projects coming up at different times in the year and we are open to your ideas.

''Contact: [mailto:jory.sonneveld@nikhef.nl Jory Sonneveld]''
  
=== Detector R&D: Characterization of the ALPIDE monolithic active pixel sensor ===

The ALICE inner tracking system (ITS) 2 is currently being installed at CERN. This detector makes use of ultra-lightweight monolithic active pixel sensors, the first to use this technology after the STAR experiment at RHIC in Brookhaven. These very thin pixel detectors have a low power consumption, result in very little material in the detector, and still have optimal timing and resolution. To characterize the performance of these sensors, the threshold and noise as well as the analog pulse shape will be measured in the lab. To measure the charge collection efficiency, an important indicator of the performance of the sensor, the laser setups as well as radioactive sources in the Detector R&D lab will be used. The work will involve setting up the experiments, carrying out measurements, and analyzing data, and could lead to novel insights into monolithic active pixel sensors.
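
Threshold and noise are typically extracted from an "S-curve": the hit probability versus injected charge follows an error function whose centre is the threshold and whose width is the noise. The snippet below is an editorial illustration of that fit on toy data with made-up numbers.

<syntaxhighlight lang="python">
import numpy as np
from scipy.optimize import curve_fit
from scipy.special import erf

def s_curve(q, threshold, noise):
    """Hit probability for injected charge q, assuming Gaussian noise."""
    return 0.5 * (1.0 + erf((q - threshold) / (np.sqrt(2.0) * noise)))

# Toy scan: injected charge in electrons, 100 injections per point
rng = np.random.default_rng(0)
q_inj = np.arange(50, 301, 10)
true_thr, true_noise, n_inj = 170.0, 12.0, 100
hits = rng.binomial(n_inj, s_curve(q_inj, true_thr, true_noise)) / n_inj

popt, _ = curve_fit(s_curve, q_inj, hits, p0=[150.0, 10.0])
print(f"fitted threshold = {popt[0]:.1f} e-, fitted noise = {popt[1]:.1f} e-")
</syntaxhighlight>

Repeating such a fit per pixel yields the threshold and noise maps used to characterize the chip.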
''Contact: [mailto:jory.sonneveld@nikhef.nl Jory Sonneveld]''

===Gravitational Waves: Computer modelling to design the laser interferometers for the Einstein Telescope===
 
A new field of instrument science led to the successful detection of gravitational waves by the LIGO detectors in 2015. We are now preparing the next generation of gravitational wave observatories, such as the Einstein Telescope, with the aim to increase the detector sensitivity by a factor of ten, which would, for example, allow the detection of stellar-mass black holes from early in the universe, when the first stars began to form. This ambitious goal requires us to find ways to significantly improve the best laser interferometers in the world.
 
Gravitational wave detectors are complex Michelson-type interferometers enhanced with optical cavities. We develop and use numerical models to study these laser interferometers, to invent new optical techniques and to quantify their performance. For example, we synthesize virtual mirror surfaces to study the effects of higher-order optical modes in the interferometers, and we use opto-mechanical models to test schemes for suppressing quantum fluctuations of the light field. We can offer several projects based on numerical modelling of laser interferometers. All projects will be directly linked to the ongoing design of the Einstein Telescope.
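
To give a flavour of what such models compute (a minimal standalone editorial sketch, independent of the Finesse-style tools actually used in the group; mirror parameters are assumed), the circulating power of a simple two-mirror cavity follows from summing the field over round trips:

<syntaxhighlight lang="python">
import numpy as np

# Assumed example mirror power reflectivities (lossless mirrors)
R1, R2 = 0.99, 0.999
r1, r2 = np.sqrt(R1), np.sqrt(R2)
t1 = np.sqrt(1.0 - R1)

# Microscopic detuning of the cavity length, expressed as round-trip phase
phi = np.linspace(-np.pi / 50, np.pi / 50, 9)

# Circulating power relative to the input power
gain = t1**2 / np.abs(1.0 - r1 * r2 * np.exp(1j * phi))**2

for p, g in zip(phi, gain):
    print(f"round-trip detuning {p:+.4f} rad -> circulating power gain {g:7.1f}")
</syntaxhighlight>

Full detector models chain many such cavities, add realistic mirror maps and radiation-pressure effects, and are what the projects above build on.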
 
''Contact: [mailto:a.freise@nikhef.nl Andreas Freise]''
===Gravitational-Waves: Get rid of those damn vibrations!===
In 2015, large-scale precision interferometry led to the detection of gravitational waves. In 2017, Europe's Advanced Virgo detector joined this international network, and the best-studied astrophysical event in history, GW170817, was detected both in gravitational waves and across the electromagnetic spectrum.
The Nikhef gravitational wave group is actively contributing to improvements to the current gravitational-wave detectors and to the rapidly maturing design for Europe's next-generation gravitational-wave observatory, the Einstein Telescope, with one of two candidate sites located in the Netherlands. These detectors will unveil the gravitational symphony of the dark universe out to cosmological distances. Breaking past the sensitivity achieved by the current observatories will require a radically new approach to core components of these state-of-the-art machines. This is especially true at the lowest, audio-band frequencies that the Einstein Telescope is targeting, where large improvements are needed.
Our project, Omnisens, brings techniques from space-based satellite control back to Earth, building a platform capable of actively cancelling ground vibrations to levels never reached in the past. This is realised with state-of-the-art compact interferometric sensors and precision mechanics. Substantial cancellation of seismic motion is an essential improvement for the Einstein Telescope, to reach below-attometer (10<sup>-18</sup> m) displacements.
We are excited to offer two projects in 2024:
#You will experimentally demonstrate and optimise Omnisens' novel vibration isolation for future deployment on the Einstein Telescope. The activity will involve hands-on experience with laser, electronics, mechanical and high-vacuum systems.
# You will contribute to the design of the Einstein Telescope by modelling the coupling of seismic and technical noises (such as actuation and sensing noises) through different configurations of seismic actuation chains. Accurate modelling of the origin and transmission of those noises is crucial in designing a system that prevents them from limiting the interferometer's readout (a toy transmissibility sketch follows this list).
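
Purely as an illustration of the kind of transfer-function modelling involved (an editorial sketch with assumed resonance frequencies and quality factors; this is not the Omnisens design), the snippet below cascades a few passive damped-harmonic-oscillator isolation stages and prints how much ground motion survives at a few frequencies.

<syntaxhighlight lang="python">
import numpy as np

def stage_transmissibility(f, f0, Q):
    """Ground-to-payload motion transfer of one passive isolation stage
    (base-excited damped harmonic oscillator with resonance f0 and quality factor Q)."""
    x = f / f0
    return np.sqrt((1.0 + (x / Q) ** 2) / ((1.0 - x ** 2) ** 2 + (x / Q) ** 2))

stages = [(0.5, 5.0), (0.7, 5.0), (1.0, 5.0)]   # assumed (f0 [Hz], Q) per stage

for f in [0.1, 1.0, 3.0, 10.0]:
    total = np.prod([stage_transmissibility(f, f0, Q) for f0, Q in stages])
    print(f"f = {f:5.1f} Hz : residual ground-motion fraction ~ {total:.2e}")
</syntaxhighlight>

The project replaces these toy stages with realistic chains, adds the active sensing and actuation loops, and injects the relevant technical noises at each node.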
Contact: [mailto:c.m.mow-lowry@vu.nl Conor Mow-Lowry]
===Gravitational Waves: Signal models & tools for data analysis ===
Theoretical predictions of gravitational-wave (GW) signals provide essential tools to detect and analyse transient GW events in the data of GW instruments like LIGO and Virgo. Over the last few years, there has been significant effort to develop signal models that accurately describe the complex morphology of GWs from merging neutron-star and black-hole binaries. Future analyses of Einstein Telescope (ET) data will need to tackle much longer and louder compact binary signals, which will require significant developments beyond the current status quo of GW modeling (i.e., improvements in model accuracy and computational efficiency, increased parameter space coverage, ...)  
We can offer up to two projects: one in GW signal modeling (at the interface of perturbation theory, numerical relativity simulations and fast phenomenological descriptions), and one on developing applications of signal models in GW data analysis. Although not strictly required, prior knowledge of basic concepts of general relativity and/or GW theory will be helpful. Some proficiency in computing is required (Mathematica, Python or C++).
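
To give a flavour of what even the simplest signal model encodes (a leading-order, Newtonian-inspiral toy added here for illustration; the masses are assumed, and the real models used in the project are far more sophisticated), the time to coalescence of a compact binary follows from the chirp mass alone:

<syntaxhighlight lang="python">
import numpy as np

G, c, Msun = 6.674e-11, 2.998e8, 1.989e30

m1, m2 = 1.4 * Msun, 1.4 * Msun                 # assumed binary neutron star
Mc = (m1 * m2) ** 0.6 / (m1 + m2) ** 0.2        # chirp mass

def time_to_merger(f):
    """Leading-order (Newtonian) time to coalescence from GW frequency f [Hz]."""
    return 5.0 / 256.0 * (np.pi * f) ** (-8.0 / 3.0) * (G * Mc / c**3) ** (-5.0 / 3.0)

for f in [10.0, 20.0, 100.0, 500.0]:
    print(f"f_GW = {f:6.1f} Hz -> time to merger ~ {time_to_merger(f):10.1f} s")
</syntaxhighlight>

The roughly quarter-hour signal duration from 10 Hz already hints at why Einstein Telescope analyses, which reach well below 10 Hz, need much longer and more accurate waveforms.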
''Contact: [mailto:mhaney@nikhef.nl Maria Haney]''
=== Theoretical Particle Physics: High-energy neutrino physics at the LHC===
High-energy collisions at the LHC and its High-Luminosity upgrade (HL-LHC) produce a large number of particles along the beam collision axis, outside of the acceptance of existing experiments. The FASER experiment has in 2023, for the first time, detected neutrinos produced in LHC collisions, and is now starting to elucidate their properties. In this context, the proposed Forward Physics Facility (FPF), to be located several hundred meters from the ATLAS interaction point and shielded by concrete and rock, will host a suite of experiments to probe Standard Model (SM) processes and search for physics beyond the Standard Model (BSM). High-statistics neutrino detection will provide valuable data for fundamental topics in perturbative and non-perturbative QCD and in weak interactions. Experiments at the FPF will enable synergies between forward particle production at the LHC and astroparticle physics to be exploited. The FPF has the promising potential to probe our understanding of the strong interactions as well as of proton and nuclear structure, providing access to both the very low-x and the very high-x regions of the colliding protons. The former regime is sensitive to novel QCD production mechanisms, such as BFKL effects and non-linear dynamics, as well as the gluon parton distribution function (PDF) down to x=1e-7, well beyond the coverage of other experiments and providing key inputs for astroparticle physics. In addition, the FPF acts as a neutrino-induced deep-inelastic scattering (DIS) experiment with TeV-scale neutrino beams. The resulting measurements of neutrino DIS structure functions represent a valuable handle on the partonic structure of nucleons and nuclei, particularly their quark flavour separation, that is fully complementary to the charged-lepton DIS measurements expected at the upcoming Electron-Ion Collider (EIC).
In this project, the student will carry out updated predictions for the neutrino fluxes expected at the FPF, assess the precision with which neutrino cross-sections will be measured, develop novel Monte Carlo event generation tools for high-energy neutrino scattering, and quantify their impact on proton and nuclear structure by means of machine learning tools within the NNPDF framework and state-of-the-art calculations in perturbative Quantum Chromodynamics. This project contributes to ongoing work within the FPF Initiative towards a Conceptual Design Report (CDR) to be presented within two years. Topics that can be considered as part of this project include the assessment of the extent to which nuclear modifications of the free-proton PDFs can be constrained by FPF measurements, the determination of the small-x gluon PDF from suitably defined observables at the FPF and the implications for ultra-high-energy particle astrophysics, the study of the intrinsic charm content in the proton and its consequences for the FPF physics program, and the validation of models for neutrino-nucleon cross-sections in the region beyond the validity of perturbative QCD.
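
For orientation (an added example with invented numbers, not part of the project description), the standard charged-current DIS variables for a muon-neutrino interaction are reconstructed from the outgoing muon as follows:

<syntaxhighlight lang="python">
import numpy as np

M_N = 0.938                      # nucleon mass [GeV]

def dis_kinematics(E_nu, E_mu, theta_mu):
    """Inelasticity y, momentum transfer Q^2 and Bjorken x for nu_mu CC DIS,
    neglecting lepton masses. Energies in GeV, muon scattering angle in radians."""
    y  = 1.0 - E_mu / E_nu
    Q2 = 4.0 * E_nu * E_mu * np.sin(theta_mu / 2.0) ** 2
    x  = Q2 / (2.0 * M_N * E_nu * y)
    return x, y, Q2

# Invented example: 1 TeV neutrino, 600 GeV muon emitted at 1 mrad
x, y, Q2 = dis_kinematics(1000.0, 600.0, 1e-3)
print(f"x = {x:.4f}, y = {y:.2f}, Q^2 = {Q2:.1f} GeV^2")
</syntaxhighlight>

Binning simulated FPF events in (x, Q^2) in this way is the starting point for the structure-function and PDF-impact studies described above.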
References: https://arxiv.org/abs/2203.05090, https://arxiv.org/abs/2109.10905, https://arxiv.org/abs/2208.08372, https://arxiv.org/abs/2201.12363, https://arxiv.org/abs/2109.02653, https://github.com/NNPDF/; see also this [https://www.dropbox.com/s/30co188f1almzq2/rojo-GRAPPA-MSc-2023.pdf?dl=0 project description].
''Contacts: [mailto:j.rojo@vu.nl Juan Rojo]''
===Theoretical Particle Physics: Unravelling proton structure with machine learning ===
At energy-frontier facilities such as the Large Hadron Collider (LHC), scientists study the laws of nature in their quest for novel phenomena both within and beyond the Standard Model of particle physics. An in-depth understanding of the quark and gluon substructure of protons and heavy nuclei is crucial to address pressing questions from the nature of the Higgs boson to the origin of cosmic neutrinos. The key to address some of these questions is carrying out a global analysis of nucleon structure by combining an extensive experimental dataset and cutting-edge theory calculations. Within the NNPDF approach, this is achieved by means of a machine learning framework where neural networks parametrise the underlying physical laws while minimising ad-hoc model assumptions. In addition to the LHC, the upcoming Electron Ion Collider (EIC), to start taking data in 2029, will be the world's first ever polarised lepton-hadron collider and will offer a plethora of opportunities to address key open questions in our understanding of the strong nuclear force, such as the origin of the mass and the intrinsic angular momentum (spin) of hadrons and whether there exists a state of matter which is entirely dominated by gluons.  
In this project, the student will develop novel machine learning and AI approaches aimed at improving global analyses of proton structure and delivering better predictions for the LHC, the EIC, and astroparticle physics experiments. These new approaches will be implemented within the machine learning tools provided by the NNPDF open-source fitting framework and use state-of-the-art calculations in perturbative Quantum Chromodynamics. Techniques that will be considered include normalising flows, graph neural networks, Gaussian processes, and kernel methods for unsupervised learning. Particular emphasis will be devoted to the automated determination of model hyperparameters, as well as to the estimation of the associated model uncertainties and their systematic validation with a battery of statistical tests. The outcome of the project will benefit the ongoing program of high-precision theory predictions for ongoing and future experiments in particle physics.
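
As a schematic illustration of the parametrisation idea (a toy written for this page; it is not the NNPDF code, and the architecture, preprocessing exponents and random weights are arbitrary), a PDF can be written as a small neural network multiplied by a preprocessing factor:

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(42)

# Tiny multilayer perceptron: 1 input (log x) -> 8 hidden units -> 1 output
W1, b1 = rng.normal(size=(8, 1)), rng.normal(size=8)
W2, b2 = rng.normal(size=(1, 8)), rng.normal(size=1)

def nn(x):
    h = np.tanh(W1 @ np.atleast_2d(x) + b1[:, None])
    return (W2 @ h + b2[:, None]).ravel()

def pdf(x, alpha=0.1, beta=3.0):
    """Toy parametrisation: x^(-alpha) * (1-x)^beta * NN(log x)^2 (squared to stay positive)."""
    return x ** (-alpha) * (1.0 - x) ** beta * nn(np.log(x)) ** 2

# Momentum-fraction style integral of x*f(x) on a log-spaced grid (trapezoid rule)
x = np.logspace(-5, -0.0001, 2000)
xf = x * pdf(x)
integral = np.sum(0.5 * (x[1:] - x[:-1]) * (xf[1:] + xf[:-1]))
print(f"integral of x f(x) dx for this random network: {integral:.3f}")
</syntaxhighlight>

In the real framework the network weights are fitted to data, with sum rules imposed as constraints and uncertainties propagated through replica fits; the project is about improving exactly those ingredients.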
References: https://arxiv.org/abs/2201.12363, https://arxiv.org/abs/2109.02653, https://arxiv.org/abs/2103.05419, https://arxiv.org/abs/1404.4293, https://inspirehep.net/literature/1302398, https://github.com/NNPDF/; see also this [https://www.dropbox.com/s/30co188f1almzq2/rojo-GRAPPA-MSc-2023.pdf?dl=0 project description].
''Contacts: [mailto:j.rojo@vu.nl Juan Rojo]''
=== Theoretical Particle Physics: Sterile neutrino dark matter===
The existence of right-handed (sterile) neutrinos is well motivated, as all other Standard Model particles come in both chiralities, and moreover, they would naturally explain the small masses of the left-handed (active) neutrinos. If the lightest sterile neutrino is very long-lived, it could be dark matter. Although sterile neutrinos can be produced by neutrino oscillations in the early universe, this is not efficient enough to explain all dark matter. It has been proposed that additional self-interactions between sterile neutrinos can solve this (https://arxiv.org/abs/2307.15565, see also the more recent https://arxiv.org/abs/2402.13878). In this project you would examine whether the additional field mediating the self-interactions can also explain the neutrino masses. As a first step you would reproduce the results in the literature, and then extend them to map out the range of masses possible for this extra field.
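
For orientation, the statement about the small active-neutrino masses refers to the (type-I) seesaw relation, schematically

<math>m_\nu \;\simeq\; \frac{(y\,v)^2}{M_N},</math>

where y is a Yukawa coupling, v = 246 GeV the Higgs vacuum expectation value, and M_N the heavy (sterile) neutrino mass; lighter sterile neutrinos therefore require correspondingly smaller Yukawa couplings to keep m_nu at the observed sub-eV level.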
''Contact: [mailto:mpostma@nikhef.nl Marieke Postma]''
===Theoretical Particle Physics: Baryogenesis at the electroweak scale===
Given that the Standard Model treats particles and antiparticles nearly the same, it is a puzzle why there is no antimatter left in our universe.

Electroweak baryogenesis is the proposal that the matter-antimatter asymmetry is created during the phase transition in which the Higgs field obtains a vev and the electroweak symmetry is broken. One important ingredient in this scenario is that there are new charge and parity (CP) violating interactions. However, this is strongly constrained by the very precise measurements of the electric dipole moment of the electron. An old proposal, which was recently revived, is to use a CP-violating coupling of the Higgs field to the gauge field (https://arxiv.org/abs/2307.01270, https://inspirehep.net/literature/300823). The project would be to study the efficacy of this kind of operator for baryogenesis.
''Contact: [mailto:mpostma@nikhef.nl Marieke Postma]''
===Theoretical Particle Physics: Neutrinoless double beta decay with sterile neutrinos===
The search for neutrinoless double beta decay represents a prominent probe of new particle physics and is very well motivated by its tight connection to neutrino masses, which so far lack an experimentally verified explanation. As such, it also provides a convenient probe of new interactions of the known elementary particles with hypothesized right-handed neutrinos, which are thought to play a prime role in neutrino mass generation. The main focus of this project would be the extension of NuDoBe, a Python tool for the computation of neutrinoless double beta decay (0vbb) rates in terms of lepton-number-violating operators in the Standard Model Effective Field Theory (SMEFT), see https://arxiv.org/abs/2304.05415. In a first step, the code should be expanded to also include the effective operators involving right-handed neutrinos, based on the existing literature (https://arxiv.org/abs/2002.07182) covering the general rate of neutrinoless double beta decay within the SMEFT extended by right-handed neutrinos. Besides that, additional functionalities could be added to the code, such as a routine for extracting the explicit form of the neutrino mass and mixing matrices. This work would be very useful for future phenomenological studies and is particularly timely given the ongoing experimental efforts, which are to be further boosted by the upcoming tonne-scale upgrades of the double-beta experiments.
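
For context, the quantity such a tool ultimately evaluates generalises the standard light-neutrino-exchange master formula,

<math>\left[T_{1/2}^{0\nu}\right]^{-1} \;=\; G_{0\nu}\,\big|M_{0\nu}\big|^{2}\,\left|\frac{m_{\beta\beta}}{m_e}\right|^{2}, \qquad m_{\beta\beta}=\Big|\sum_i U_{ei}^{2}\, m_i\Big|,</math>

with G the phase-space factor and M the nuclear matrix element, to the full set of SMEFT lepton-number-violating operators; adding right-handed neutrinos introduces further contributions and interference terms with their own matrix elements and phase-space factors.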
''Contacts: [mailto:j.devries4@uva.nl Jordy de Vries] and Lukas Graf''
===Theoretical Particle Physics: Phase space factors for single, double, and neutrinoless beta-decay rates===
In light of the increasingly precise measurements of beta-decay and double-beta-decay rates and spectra, the theoretical predictions seem to fall behind. The existing, rather phenomenological approaches to the associated phase-space calculations employ a variety of different approximations, introducing errors that are, given their phenomenological nature, not easily quantifiable. A key goal of this project is to understand, reproduce and improve the methods and results available in the literature. Ideally, these efforts would be summarized in the form of a compact Mathematica notebook or Python package available to the broad community of beta-decay experimentalists and phenomenologists, who could easily implement it in the workflows of their analyses. The focus should be not only on the Standard-Model contributions to (double) beta decay, but also on hypothetical exotic modes stemming from various beyond-the-Standard-Model scenarios (see e.g. https://arxiv.org/abs/nucl-ex/0605029 and https://arxiv.org/abs/2003.11836). If time permits, new, more particle-physics-based approaches to the phase-space computations can be investigated.
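
As a minimal point of comparison for the more sophisticated calculations targeted here (an editorial sketch using the simplest non-relativistic Fermi function; all numbers are illustrative), the allowed beta spectrum and its phase-space integral can be evaluated numerically:

<syntaxhighlight lang="python">
import numpy as np

m_e   = 0.511                      # electron mass [MeV]
alpha = 1.0 / 137.036

def fermi_nonrel(Z, E):
    """Simplest non-relativistic Fermi function for a beta- transition, daughter charge Z."""
    beta = np.sqrt(1.0 - (m_e / E) ** 2)
    eta  = alpha * Z / beta
    return 2.0 * np.pi * eta / (1.0 - np.exp(-2.0 * np.pi * eta))

def phase_space(Q, Z, n=2000):
    """Integral of p E (Q - T)^2 F(Z, E) over electron kinetic energy T (allowed shape)."""
    T = np.linspace(1e-4, Q - 1e-4, n)
    E = T + m_e
    p = np.sqrt(E ** 2 - m_e ** 2)
    integrand = p * E * (Q - T) ** 2 * fermi_nonrel(Z, E)
    return np.sum(0.5 * (T[1:] - T[:-1]) * (integrand[1:] + integrand[:-1]))

# Illustrative only: tritium-like endpoint (18.6 keV) versus a 1 MeV endpoint, daughter Z = 2
print(phase_space(0.0186, 2), phase_space(1.0, 2))
</syntaxhighlight>

The project would replace this crude Fermi function with properly computed relativistic electron wave functions, screening and finite-nuclear-size corrections, and extend the integrand to exotic decay modes.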
''Contacts: [mailto:j.devries4@uva.nl Jordy de Vries] and Lukas Graf''
===Theoretical Particle Physics: Predictions for Charged Particle Tracks from First Principles===
Measurements based on tracks of charged particles benefit from superior angular resolution. This is essential for a new class of observables called energy correlators, for which a range of interesting applications have been identified: studying the [https://arxiv.org/abs/2201.07800 confinement transition], measuring the [https://arxiv.org/abs/2201.08393 top quark mass] more precisely, etc. I developed a [https://arxiv.org/abs/1303.6637 framework] for calculating track-based observables, in which the conversion of quarks and gluons to charged hadrons is described by track functions. This generalization of the well-studied parton distribution functions and fragmentation functions is currently being measured by ATLAS, though the data is not public yet. Interestingly, two groups proposed predicting fragmentation functions from first principles in recent years (https://arxiv.org/abs/2010.02934, https://arxiv.org/abs/2301.09649). In this project you would extend one (or both) approaches to obtain a prediction for the track function.
''Contact: [mailto:w.j.waalewijn@uva.nl Wouter Waalewijn]''
===Neutrinos: Neutrino Oscillation Analysis with the KM3NeT/ORCA Detector===
The KM3NeT/ORCA neutrino detector at the bottom of the Mediterranean Sea is able to detect oscillations of atmospheric neutrinos. Neutrinos traversing the detector are reconstructed as a function of two observables: the neutrino energy and the neutrino direction. In order to improve the neutrino oscillation analysis, we need to add one more observable, the so-called Bjorken y (inelasticity), which indicates the fraction of the energy transferred from the incoming neutrino to the hadronic final state. For this project, we will study simulated and real reconstructed data and use them to implement this additional observable in the existing analysis framework. Subsequently, we will study how much the sensitivity of the final analysis improves as a result.
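
For reference (an added illustration with invented numbers), on an event-by-event basis the inelasticity is simply the fraction of the neutrino energy carried by the hadronic shower:

<syntaxhighlight lang="python">
def bjorken_y(E_shower, E_muon):
    """Inelasticity y = E_had / E_nu for a charged-current nu_mu event,
    with the neutrino energy estimated as the sum of shower and muon energies."""
    E_nu = E_shower + E_muon
    return E_shower / E_nu

# Invented example: 12 GeV hadronic shower accompanied by an 8 GeV muon track
print(f"y = {bjorken_y(12.0, 8.0):.2f}")   # -> 0.60
</syntaxhighlight>

The analysis challenge lies in reconstructing the shower and track energies (and hence y) reliably from the recorded light, and in propagating the associated uncertainties into the oscillation fit.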
C++ and Python programming skills are advantageous.
''Contacts:'' [mailto:dveijk@nikhef.nl Daan van Eijk], [mailto:h26@nikhef.nl Paul de Jong]
===Neutrinos: Searching for neutrinos of cosmic origin with KM3NeT===
KM3NeT is a neutrino telescope under construction in the Mediterranean Sea, already taking data with the first deployed detection units. In particular the KM3NeT/ARCA detector off-shore of Sicily is designed for high-energy neutrinos and is suited for the detection of neutrinos of cosmic origin. In this project we will use the first KM3NeT data to search for evidence of a cosmic neutrino source, and also study ways to improve the analysis.
''Contact:'' [mailto:aart.heijboer@nikhef.nl Aart Heijboer]
===Neutrinos: the Deep Underground Neutrino Experiment (DUNE) ===
The Deep Underground Neutrino Experiment (DUNE) is under construction in the USA and will consist of a powerful neutrino beam originating at Fermilab, a near detector at Fermilab, and a far detector in the SURF facility in Lead, South Dakota, 1300 km away. As they travel, neutrinos oscillate and a fraction of the neutrino beam changes flavour; DUNE will determine the neutrino oscillation parameters to unrivaled precision, and aim to make a first detection of CP violation in neutrinos. In this project, various elements of DUNE can be studied, including the neutrino oscillation fit, neutrino physics with the near detector, event reconstruction and classification (including machine learning), or elements of data selection and triggering.
''Contact:'' [mailto:h26@nikhef.nl Paul de Jong]
===Neutrinos: Searching for Majorana Neutrinos with KamLAND-Zen===
The KamLAND-Zen experiment, located in the Kamioka mine in Japan, is a large liquid scintillator experiment with 750 kg of ultra-pure Xe-136 to search for neutrinoless double-beta decay (0n2b). The observation of the 0n2b process would be evidence for lepton number violation and the Majorana nature of neutrinos, i.e. that neutrinos are their own anti-particles. Current limits on this extraordinarily rare hypothetical decay process are presented as a half-life, with a lower limit of 10^26 years. KamLAND-Zen, the world's most sensitive 0n2b experiment, is currently taking data and there is an opportunity to work on the data analysis, with the possibility of taking part in a ground-breaking discovery. The main focus will be on developing new techniques to filter the spallation backgrounds, i.e. the production of radioactive isotopes by passing muons. There will be close collaboration with groups in the US (MIT, Berkeley, UW) and Japan (Tohoku Univ).
''Contact: [mailto:decowski@nikhef.nl Patrick Decowski]''
===Neutrinos: TRIF𝒪RCE (PTOLEMY)===
The PTOLEMY demonstrator will place limits on the neutrino mass using the ''β-''decay endpoint of atomic tritium. The detector will require a CRES-based (cyclotron radiation emission spectroscopy) trigger and a non-destructive tracking system. The "''TRItium-endpoint From 𝒪(fW) Radio-frequency Cyclotron Emissions"'' group is developing radio-frequency cavities for the simultaneous transport of endpoint electrons and the extraction of their kinematic information. This is essential to providing a fast online trigger and precise energy-loss corrections to electrons reconstructed near the tritium endpoint. The cryogenic low-noise, high-frequency analogue electronics developed at Nikhef combined with FPGA-based front-end analysis capabilities will provide the PTOLEMY demonstrator with its CRES readout and a testbed to be hosted at the Gran Sasso National Laboratory for the full CνB detector. The focus of this project will be modelling CR in RF cavities and its detection for the purposes of single electron spectroscopy and optimised trajectory reconstruction for prototype and demonstrator setups. This may extend to firmware-based fast tagging and reconstruction algorithm development with the RF-SoC.
''Contact: [mailto:jmead@nikhef.nl James Vincent Mead]''
===Cosmic Rays: Energy loss profile of cosmic ray muons in the KM3NeT neutrino detector ===
The dominant signal in the KM3NeT detectors is not neutrinos, but muons created in particle cascades (extensive air showers) initiated when cosmic rays interact at the top of the atmosphere. While these muons are a background for neutrino studies, they present an opportunity to study the nature of cosmic rays and hadronic interactions at the highest energies. Reconstruction algorithms are used to determine the properties of the particle interactions, normally of neutrinos, from the recorded photons. The aim of this project is to explore the possibility to reconstruct the longitudinal energy loss profile of single and multiple simultaneous muons ('bundles') originating from cosmic ray interactions. The potential to use this energy loss profile to extract information on the number of muons and the lateral extension of the muon 'bundles' will also be explored. These properties allow one to extract information on the high-energy interactions of cosmic rays.
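
The average muon energy loss in water is commonly parametrised as dE/dx ≈ a + bE, with a the quasi-continuous ionisation term and bE the stochastic radiative term. The sketch below (added for illustration, with approximate textbook-level values for the constants) integrates this to show how far high-energy muons penetrate.

<syntaxhighlight lang="python">
import numpy as np

a = 0.24    # approximate ionisation loss [GeV per metre water equivalent]
b = 3.5e-4  # approximate radiative loss coefficient [per metre water equivalent]

def muon_range(E0):
    """Range [m w.e.] from integrating dE/dx = a + b*E (continuous-slowing-down approximation)."""
    return np.log(1.0 + b * E0 / a) / b

for E0 in [1e2, 1e3, 1e4, 1e5]:   # muon energy in GeV
    print(f"E = {E0:8.0f} GeV -> range ~ {muon_range(E0):7.0f} m water equivalent")
</syntaxhighlight>

The stochastic nature of the radiative term is what makes the longitudinal energy loss profile informative about the muon multiplicity and energies within a bundle.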
''Contact:  [mailto:rbruijn@nikhef.nl Ronald Bruijn]''
===LHCb: Search for light dark particles===
The Standard Model of elementary particles does not contain a proper Dark Matter candidate. One of the most tantalizing theoretical developments are the so-called ''Hidden Valley'' models: a mirror-like copy of the ''Standard Model'', with dark particles that communicate with standard ones via a very feeble interaction. These models predict the existence of ''dark hadrons'' – composite particles that are bound similarly to ordinary hadrons in the ''Standard Model''. Such ''dark hadrons'' can be abundantly produced in high-energy proton-proton collisions, making the LHC a unique place to search for them. Some ''dark hadrons'' are stable like a proton, which makes them excellent ''Dark Matter'' candidates, while others decay to ordinary particles after flying a certain distance in the collider experiment. The LHCb detector has a unique capability to identify such decays, particularly if the new particles have a mass below ten times the proton mass.
This project comprises a unique search for light ''dark hadrons'' that covers a mass range not accessible to other experiments. It involves an interesting programme of data analysis (Python-based) with non-trivial machine learning solutions, and phenomenology research using a fast simulation framework. Depending on your interest, there is quite a bit of flexibility in the precise focus of the project.
''Contact: [mailto:andrii.usachov@nikhef.nl Andrii Usachov]''
===LHCb: Searching for dark matter in exotic six-quark particles===
Most of the matter in the Universe is of unknown type. Many hypotheses about this dark matter have been proposed, but none confirmed. Recently it has been proposed that it could be made of particles composed of the six quarks uuddss, which would be a Standard-Model solution to the dark matter problem. This idea has recently gained credibility as many similar multi-quark states are being discovered by the LHCb experiment. Such a particle could be produced in decays of heavy baryons, or directly in proton-proton collisions. The anti-particle, made of six antiquarks, could be seen when annihilating with detector material. It is also proposed to use Xi_b baryons produced at LHCb to search for such a state, where the state would appear as missing 4-momentum in a kinematically constrained decay. The project consists of defining a selection and applying it to LHCb data. See [https://arxiv.org/abs/2007.10378 arXiv:2007.10378].
Contact: ''[mailto:patrick.koppenburg@cern.ch Patrick Koppenburg]''
===LHCb: New physics in the angular distributions of B decays to K*ee===
Lepton flavour universality violation in B decays can be explained by a variety of non-Standard-Model interactions. Angular distributions in decays of a B meson to a hadron and two leptons are an important source of information to understand which model is correct. Previous analyses at the LHCb experiment have considered final states with a pair of muons. Our LHCb group at Nikhef concentrates on a new measurement of angular distributions in decays with two electrons. The main challenge in this measurement is the calibration of the detection efficiency. In this project you will confront estimates of the detection efficiency derived from simulation with decay distributions in a well-known B decay. Once the calibration is understood, the very first analysis of the angular distributions in the electron final state can be performed.
Contact:  [mailto:m.senghi.soares@nikhef.nl Mara Soares] and [mailto:wouterh@nikhef.nl Wouter Hulsbergen]
===LHCb: CP violation in B -> J/psi Ks decays with first run-3 data===
The decay B -> J/psi Ks is the 'golden channel' for measuring the CP-violating angle beta of the CKM matrix. In this project we will use the first data from the upgraded LHCb detector to perform this measurement. Performing such a measurement with a new detector is going to be very challenging: we will learn a lot about whether the upgraded LHCb detector performs as well as expected.
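
For reference, the measurement extracts sin 2β from the time-dependent decay-rate asymmetry between mesons produced as B0 and anti-B0, which in the Standard Model (neglecting the small direct-CP-violation term) takes the form

<math>A_{CP}(t) \;=\; \frac{\Gamma_{\bar{B}^0\to J/\psi K_S}(t)-\Gamma_{B^0\to J/\psi K_S}(t)}{\Gamma_{\bar{B}^0\to J/\psi K_S}(t)+\Gamma_{B^0\to J/\psi K_S}(t)} \;\simeq\; \sin 2\beta \,\sin(\Delta m_d\, t),</math>

so the analysis relies on flavour tagging and on resolving the decay time t well enough to see the oscillation with frequency Δm_d.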
Contact: ''[mailto:wouterh@nikhef.nl Wouter Hulsbergen]''
===LHCb: Optimization of primary vertex reconstruction===
A key part of the LHCb event classification is the reconstruction of the collision point of the protons from the LHC beams. This so-called primary vertex is found by reconstructing the common origin of the charged particles found in the detector. A rudimentary algorithm exists, but it is expected that its performance can be improved by tuning parameters (or perhaps by implementing an entirely new algorithm). In this project you are challenged to optimize the LHCb primary vertex reconstruction algorithm using recent simulated and real data from LHC Run 3.
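
To make the task concrete (a deliberately naive editorial toy, not the LHCb algorithm), primary-vertex finding can be thought of as one-dimensional clustering of the points where tracks cross the beam line:

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(7)

# Toy event: three true pp collisions along z, each producing tracks whose
# point of closest approach to the beam line is smeared by the track resolution
true_pv_z   = [-35.0, 2.0, 41.0]                       # mm
track_sigma = 0.2                                      # mm
z_points = np.concatenate([rng.normal(z, track_sigma, rng.integers(10, 40))
                           for z in true_pv_z])

# Naive clustering: sort, then split wherever the gap between neighbours is large
z_sorted = np.sort(z_points)
split    = np.where(np.diff(z_sorted) > 2.0)[0] + 1    # 2 mm gap threshold (a tuning parameter!)
clusters = np.split(z_sorted, split)

for c in clusters:
    print(f"found vertex candidate at z = {c.mean():7.2f} mm  ({len(c)} tracks)")
</syntaxhighlight>

The real algorithm has to cope with dozens of vertices per bunch crossing, outlier tracks and merged clusters, which is exactly where the tuning (or a smarter method) comes in.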
Contact: ''[mailto:wouterh@nikhef.nl Wouter Hulsbergen]''
===LHCb: Measurement of B decays to two electrons===
Instead of searching for new physics by direct production of new particles, one can search for enhancements of very rare processes as an indirect signal for the existence of new particles or forces. The observed decay of Bs to two muons by the LHCb collaboration and Nikhef/Maastricht is such a measurement, and as the rarest decay ever observed at the LHC it has a large impact on the new-physics landscape. In this project, we will extend this work by searching for the even rarer decay into two electrons. You would join the ongoing work in the context of an NWO Veni grant, and can be based in Maastricht or at Nikhef.
Contact: ''[mailto:jdevries@nikhef.nl Jacco de Vries]''
===Muon Collider===
There is currently a lively global debate about the next accelerator to succeed the successful LHC. Different options are on the table: linear, circular, electrons, protons, on various continents... Of these, the most ambitious project is the muon collider, designed to collide the relatively massive (105 MeV) but short-lived (2.2 μs!) leptons. Such a novel collider would combine the advantages of electron-positron colliders (excellent precision) and proton-proton colliders (highest energy). In this project, we'll perform a feasibility study for the search for the elusive double-Higgs process: this as yet unobserved process is crucial to probe the simultaneous interaction of multiple Higgs bosons and thereby the shape of the Higgs potential as predicted by the Brout-Englert-Higgs mechanism. This sensitivity study will be instrumental to understanding one of the main scientific prospects of this ambitious project, and also to optimizing the detector design, as well as the interface of the particle detectors to the accelerator machine. The project is based at Nikhef but can also be (partially) performed at the University of Twente.
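
The reason the short muon lifetime is not an immediate show-stopper is relativistic time dilation; the snippet below (an added illustration, with arbitrarily chosen beam energies) shows the mean laboratory-frame decay length at collider energies.

<syntaxhighlight lang="python">
import numpy as np

c    = 2.998e8       # speed of light [m/s]
tau  = 2.197e-6      # muon proper lifetime [s]
m_mu = 0.1057        # muon mass [GeV]

for E in [63.0, 500.0, 5000.0]:        # example beam energies in GeV (illustrative choices)
    gamma = E / m_mu
    decay_length = gamma * c * tau     # mean lab-frame decay length (beta ~ 1)
    print(f"E = {E:7.1f} GeV -> gamma = {gamma:8.0f}, mean decay length ~ {decay_length/1e3:8.1f} km")
</syntaxhighlight>

Even so, the muons decay continuously in flight, so the accelerator and detectors must cope with the resulting beam-induced background, which is one of the design questions this kind of study feeds into.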
Reference: https://www.science.org/content/article/muon-collider-could-revolutionize-particle-physics-if-it-can-be-built
Contact: ''[mailto:f.dias@nikhef.nl Flavia Dias] and [mailto:tdupree@nikhef.nl Tristan du Pree]''
==Projects with a 2023 start==
===ALICE: The next-generation multi-purpose detector at the LHC===
The main goal of this project is to focus on the next-generation multi-purpose detector planned to be built at the LHC. Its core will be a nearly massless barrel detector consisting of truly cylindrical layers based on curved wafer-scale ultra-thin silicon sensors with MAPS technology, featuring an unprecedentedly low material budget of 0.05% X0 per layer, with the innermost layers possibly positioned inside the beam pipe. The proposed detector is conceived for studies of pp, pA and AA collisions at luminosities a factor of 20 to 50 times higher than possible with the upgraded ALICE detector, enabling a rich physics program ranging from measurements with electromagnetic probes at ultra-low transverse momenta to precision physics in the charm and beauty sector.
''Contact: [mailto:Panos.Christakoglou@nikhef.nl Panos Christakoglou] and [mailto:Alessandro.Grelli@cern.ch Alessandro Grelli] and [mailto:marco.van.leeuwen@cern.ch Marco van Leeuwen]''
===ALICE: Searching for the strongest magnetic field in nature===
In a non-central collision between two Pb ions, with a large value of the impact parameter, the charged nucleons that do not participate in the interaction (called spectators) create strong magnetic fields. A back-of-the-envelope calculation using the Biot-Savart law brings the magnitude of this field close to 10^19 Gauss, in agreement with state-of-the-art theoretical calculations, making it the strongest magnetic field in nature. The presence of this field could have direct implications for the motion of final-state particles. The magnetic field, however, decays rapidly. The decay rate depends on the electric conductivity of the medium, which is experimentally poorly constrained. Overall, the presence of the magnetic field, the main goal of this project, is so far not confirmed experimentally.
''Contact: [mailto:Panos.Christakoglou@nikhef.nl Panos Christakoglou]''
===ALICE: Looking for parity violating effects in strong interactions===
Within the Standard Model, symmetries, such as the combination of charge conjugation (C) and parity (P), known as CP symmetry, are considered to be key principles of particle physics. The violation of CP invariance can be accommodated within the Standard Model in the weak and the strong interactions; however, it has only been confirmed experimentally in the former. Theory predicts that in heavy-ion collisions, in the presence of a deconfined state, gluonic fields create domains where the parity symmetry is locally violated. This manifests itself in a charge-dependent asymmetry in the production of particles relative to the reaction plane, which is called the Chiral Magnetic Effect (CME).

The first experimental results from STAR (RHIC) and ALICE (LHC) are consistent with the expectations from the CME; however, further studies are needed to constrain background effects. These highly anticipated results have the potential to reveal exciting, new physics.
  
''Contact: [mailto:Panos.Christakoglou@nikhef.nl Panos Christakoglou]''
===ALICE: Machine learning techniques as a tool to study the production of heavy flavour particles ===
There was recently a shift in the field of heavy-ion physics triggered by experimental results obtained in collisions between small systems (e.g. protons on protons). These results resemble the ones obtained in collisions between heavy ions. This consequently raises the question of whether we create the smallest QGP droplet in collisions between small systems. The main objective of this project will be to study the production of charm particles such as D-mesons and Λc-baryons in pp collisions at the LHC. This will be done with the help of a new and innovative technique which is based on machine learning (ML). The student will also extend the studies to investigate how this production rate depends on the event activity e.g. on how many particles are created after every collision.
''Contact: [mailto:Panos.Christakoglou@nikhef.nl Panos Christakoglou] and [mailto:Alessandro.Grelli@cern.ch Alessandro Grelli]''
  
=== Detector R&D: Laser Interferometer Space Antenna (LISA) ===
The space-based gravitational wave antenna LISA is, without a doubt, one of the most challenging space missions ever proposed. ESA plans to launch around 2030 three spacecraft that are separated by a few million kilometers, to measure tiny variations in the distances between test masses located in each satellite and so detect the gravitational waves from sources such as supermassive black holes. The triangular constellation of the LISA mission is dynamic, requiring constant fine-tuning of the pointing of the laser links between the spacecraft and a simultaneous refocusing of the telescope. The noise sources related to the laser links are expected to provide a dominant contribution to the LISA performance.
 
An update and extension of the LISA science simulation software are needed to assess the hardware development for LISA at Nikhef, TNO, and SRON. A position is therefore available for a master student to study the impact of instrumental noise on the performance of LISA. Realistic simulations based on hardware (noise) characterization measurements performed at TNO will be carried out and compared to the expected tantalizing gravitational wave sources.
 
  
''Contact: [mailto:nielsvb@nikhef.nl Niels van Bakel] and [mailto:ernst-jan.buis@tno.nl Ernst-Jan Buis]''
===ATLAS: The Higgs boson's self-coupling===
  
=== Detector R&D: Spectral X-ray imaging - Looking at colours the eyes can't see ===
+
The coupling of the Higgs boson to itself is one of the main unobserved interactions of the Standard Model and its observation is crucial to understand the shape of the Higgs potential. Here we propose to study the 'ttHH' final state: two top quarks and two Higgs bosons produced in a single collision. This topology is yet unexplored at the ATLAS experiment and the project consists of setting up the new analysis (including multivariate analysis techniques to recognise the complicated final state), optimising the sensitivity and including the result in the full ATLAS study of the Higgs boson's coupling to itself. With the LHC data from the upcoming Run-3, we might be able to see its first glimpses!
When a conventional X-ray image is taken, one acquires an image that only shows intensities. a ‘black and white’ image. Most of the information carried by the photon energy is lost. Lacking spectral information can result in an ambiguity between material composition and amount of material in the sample. If the X-ray intensity as a function of the energy can be measured (i.e. a ‘colour’ X-ray image) more information can be obtained from a sample. This translates to less required dose and/or to a better understanding of the sample that is being investigated. For example, two fields that can benefit from spectral X-ray imaging are mammography and real time CT.
 
  
Detectors using Medipix3 chips are used for X-ray imaging. Such a detector is composed of a pixel chip with a semiconductor sensor bonded on top of it. Photoelectric absorption of X-rays in the sensor results in an amount of charge being released that is proportional to the X-ray energy. This charge is registered by a pixel. Depending on configuration, in each pixel 1, 2, 4 or 8 detection thresholds can be set and so, a number of energy bins can be defined. One of the challenges is to maximise X-ray image quality by minimising effects caused by dispersion in the sensitivity of the pixels. The effects of this dispersion can partly be compensated by applying a specific measurement method in combination with image post processing.  
+
''Contact: [mailto:tdupree@nikhef.nl Tristan du Pree]  and [mailto:cpandini@nikhef.nl Carlo Pandini]''
  
You can work on improving measurement methods and on improving post processing methods. There is flexibility of the planned work depending on the skillset you have. The aim is to get the best X-ray energy resolution over the entire pixel chip. This in turn improves image quality and therefore X-ray CT reconstruction quality.
+
===ATLAS: Triple-Higgs production as a probe of the Higgs potential===
 +
So far, the investigation of Higgs self-couplings (the coupling of the Higgs boson to itself) at the LHC has focused on the measurement of the Higgs tri-linear coupling λ3 mainly through direct double-Higgs production searches. In this research project we propose the investigation of Higgs tri-linear and quartic coupling parameters λ3 and λ4, via a novel measurement of triple-Higgs production at the LHC (HHH) with the ATLAS experiment. While in the SM these parameters are expected to be identical, only a combined measurement can provide an answer regarding how the Higgs potential is realised in Nature. Processes in which three Higgs bosons are produced simultaneously are extremely rare, and very difficult to measure and disentangle from background. In this project we plan to investigate different decay channels (to bottom quarks and tau leptons), and to study advanced machine learning techniques to reconstruct such a complex hadronic final state. This kind of processes is still quite unexplored in ATLAS, and the goal of this project is to put the basis for the first measurement of HHH production at the LHC.
  
Important note: much of this work is to be performed in the laboratory. Because of the corona pandemic, it is not certain whether it will be possible to be physically present for enough of the time for this project. Please contact us to discuss the possibilities.
Furthermore, we'd like to study the possible implications of a precise measurement of the self-coupling parameters from HHH production from a phenomenological point of view: what could be the impact of a deviation in the HHH measurements on the big open questions in physics (for instance, the mechanisms at the root of baryogenesis)?
  
Please see the following videos for examples of our work:
Contact: ''[mailto:tdupree@nikhef.nl Tristan du Pree] and [mailto:cpandini@nikhef.nl Carlo Pandini]''
  
https://youtu.be/cgwQvjfUYns
===ATLAS: The Next Generation===
  
https://youtu.be/tf9ZLALPVNY
After the observation of the coupling of Higgs bosons to fermions of the third generation, the search for the coupling to fermions of the second generation is one of the next priorities for research at CERN's Large Hadron Collider. The search for the decay of the Higgs boson to two charm quarks is very new [1], and we see various opportunities for interesting developments. For this project we propose improvements in reconstruction (using exclusive decays) and in advanced analysis techniques (using deep learning methods).
  
https://youtu.be/vjPX7SxvSUk
[1] https://arxiv.org/abs/1802.04329 (see also https://atlas.cern/updates/briefing/charming-Higgs-decay)
  
https://youtu.be/LqjNVSm7Hoo
''Contact: [mailto:tdupree@nikhef.nl Tristan du Pree]''
  
''Contact: [mailto:martinfr@nikhef.nl Martin Fransen], [mailto:navritb@nikhef.nl Navrit Bal]''
===ATLAS: Searching for new particles in very energetic diboson production===
  
=== Detector R&D: Holographic projector ===
The discovery of new phenomena in high-energy proton–proton collisions is one of the main goals of the Large Hadron Collider (LHC). New heavy particles decaying into a pair of vector bosons (WW, WZ, ZZ) are predicted in several extensions to the Standard Model (e.g. extended gauge-symmetry models, Grand Unified theories, theories with warped extra dimensions, etc.). In this project we will investigate new ideas to look for these resonances in promising regions. We will focus on final states where both vector bosons decay into quarks, or where one decays into quarks and one into leptons. These final states have the potential to bring the highest sensitivity to the search for Beyond-the-Standard-Model physics [1, 2]. We will explore new ways to identify vector bosons (using machine learning methods) and then tackle the problem of estimating the contributions from beyond-the-Standard-Model processes in the tails of the mass distribution.
  
A difficulty in projecting holograms (based on the interference of light) is the dense pixel pitch required of a projector: one would need a pixel pitch of less than 200 nanometres. With larger pixels, artefacts occur due to spatial undersampling. A pixel pitch of 200 nanometres is difficult, if not impossible, to achieve, especially over larger areas. Another challenge is the massive amount of computing power that would be required to control such a dense pixel matrix.
[1] https://arxiv.org/abs/1906.08589
  
A new holographic projection method has been developed that reduces undersampling artefacts for projectors with a ‘low’ pixel density. It uses 'pixels' at random but known positions, resulting in an array of (coherent) light points that lacks (or has suppressed) spatial periodicity. As a result, a holographic projector can be built with a significantly lower pixel density and correspondingly lower computing-power requirements. This could bring holography within reach for many applications such as displays, lithography, 3D printing and metrology.
[2] https://arxiv.org/abs/2004.14636
  
Of course, nothing comes for free: with fewer pixels, holograms become noisier and the contrast is reduced (not all light ends up in the hologram). The questions are: how does the quality of a hologram depend on pixel density, and how do we determine projector requirements based on requirements for hologram quality?
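As an illustration of the trade-off, the following toy Python estimate (assumed wavelength, aperture and emitter count; a 1D stand-in for the real 2D projector) compares the far-field intensity of a regular emitter grid with that of emitters at random positions:

<syntaxhighlight lang="python">
# Toy 1D comparison of a regular and a random emitter array (assumed numbers:
# 633 nm light, 1 mm aperture, 200 point-like emitters, beam steered to 2 degrees).
import numpy as np

lam = 633e-9                        # wavelength [m]
aperture = 1e-3                     # total aperture [m]
n_emit = 200                        # number of 'pixels' (emitters)
theta_target = np.deg2rad(2.0)      # steering angle
k = 2 * np.pi / lam

x_regular = np.linspace(0.0, aperture, n_emit)
x_random = np.sort(np.random.default_rng(0).uniform(0.0, aperture, n_emit))
theta = np.deg2rad(np.linspace(-10, 10, 4001))

def far_field(x):
    # each emitter is phased so that the array adds up coherently at theta_target
    phase = -k * x * np.sin(theta_target)
    field = np.exp(1j * (k * np.outer(np.sin(theta), x) + phase)).sum(axis=1)
    return np.abs(field) ** 2 / n_emit ** 2

for label, x in [("regular", x_regular), ("random", x_random)]:
    intensity = far_field(x)
    # strongest peak found more than 1 degree away from the steering direction
    side = intensity[np.abs(theta - theta_target) > np.deg2rad(1.0)].max()
    print(f"{label:8s}: main lobe {intensity.max():.2f}, strongest artefact {side:.3f}")
</syntaxhighlight>

With the regular grid, the strongest off-axis peak is a full-strength grating lobe, whereas the random layout spreads that power into a low, noise-like background – exactly the trade-off between artefacts and contrast described above.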
''Contact: [mailto:f.dias@nikhef.nl Flavia de Almeida Dias], [mailto:rhayes@nikhef.nl Robin Hayes], Elizaveta Cherepanova and Dylan van Arneman''
  
Requirements for a hologram can be expressed in terms of noise, contrast, resolution, suppression of undersampling artefacts, etc.
===ATLAS: Top-quark and Higgs-boson analysis combination, and Effective Field Theory interpretation (also in 2023)===
  
For this project we have built a proof-of-concept holographic emitter. This set-up will be used to verify simulation results (and also to project some cool holograms of course ;-).
We are looking for a Master's student with an interest in theory and data analysis in the search for physics beyond the Standard Model in the top-quark and Higgs-boson sectors.
  
Examples of what you could be working on:
Your Master's project starts just at the right time to prepare the Run-3 analysis of the ATLAS experiment at the LHC. In Run-3 (2022-2026), three times more data becomes available, enabling the analysis of rare processes with innovative software tools and techniques.
  
a. Calibration/characterisation of the current projector and compensation of systematic errors.
This project aims to explore the newest strategy of combining top-quark and Higgs-boson measurements, with the aim of constraining the existence of new physics beyond the Standard Model (SM) of particle physics. We selected the pp->tZq and gg->HZ processes as promising candidates for a combination to constrain new physics in the context of the Standard Model Effective Field Theory (SMEFT). SMEFT is the state-of-the-art framework for the theoretical interpretation of LHC data. In particular, you will study the SMEFT OtZ and Ophit operators, which are not well constrained by current measurements.
  
b. To realize a phased array of randomly placed light sources, the pixel matrix of the projector must be ‘relayed’ onto a mask with apertures at random but precisely known positions. Determine the best possible relaying optics and design an optimized mask accordingly. Factors like deformation of the projected pixel matrix and limitations in the resolving power of the lens system must be taken into account for the mask design.
Besides affinity with particle physics theory, the ideal candidate for this project has developed Python/C++ skills and is eager to learn advanced techniques. You start with a simulation of the signal and background samples using existing software tools. Then, an event selection study is performed using machine learning techniques. To evaluate the SMEFT effects, a fitting procedure based on the innovative morphing technique is foreseen, for which the basic tools in the ROOT and RooFit framework are available. The work is carried out in the ATLAS group at Nikhef and may lead to an ATLAS note.
  
Important note: much of this work is to be performed in the laboratory. Because of the corona pandemic, it is not certain whether it will be possible to be physically present for enough of the time for this project. Please contact me to discuss the possibilities.
''Contact: [mailto:o.rieger@nikhef.nl Oliver Rieger]  and [mailto:h73@nikhef.nl Marcel Vreeswijk]''
  
''Contact: [mailto:martinfr@nikhef.nl Martin Fransen]''
===ATLAS: Machine learning to search for very rare Higgs decays===
  
=== Theory: The Effective Field Theory Pathway to New Physics at the LHC ===
Since the Higgs boson discovery in 2012 at the ATLAS experiment, the investigation of the properties of the Higgs boson has been a priority for research at the Large Hadron Collider (LHC). However, there are still many open questions: is the Higgs boson the only origin of electroweak symmetry breaking? Is there a mechanism which can explain the observed mass pattern of the SM particles? Many of these questions are linked to the Higgs boson coupling structure.


A promising framework to parametrise in a robust and model-independent way deviations from the Standard Model (SM) induced by new heavy particles is the Standard Model Effective Field Theory (SMEFT). In this formalism, beyond the SM effects are encapsulated in higher-dimensional operators constructed from SM fields respecting their symmetry properties. In this project, we aim to carry out a global analysis of the SMEFT from high-precision LHC data, including Higgs boson production, flavour observables, and low-energy measurements. This analysis will be carried out in the context of the recently developed SMEFiT approach [1] based on Machine Learning techniques to efficiently explore the complex theory parameter space. The ultimate goal is either to uncover glimpses of new particles or interactions at the LHC, or to derive the most stringent model-independent bounds to date on general theories of New Physics. Of particular interest are novel methods for charting the parameter space [2], the matching to UV-complete theories in explicit BSM scenarios [3], and the interplay between EFT-based model-independent searches for new physics and determinations of the proton structure from LHC data [4].
 
  
[1] https://arxiv.org/abs/1901.05965
While the Higgs boson coupling to fermions of the third generation has been established experimentally, the investigation of the Higgs boson coupling to the light fermions of the second generation will be a major project for the upcoming data-taking period (2022-2025). The Higgs boson decay to muons is the most sensitive channel for probing this coupling. In this project, you will optimize the event selection for Higgs boson decays to muons in the Vector Boson Fusion (VBF) production channel with a focus on distinguishing signal events from background processes like Drell-Yan and electroweak Z boson production. For this purpose, you will develop, implement and validate advanced machine learning and deep learning algorithms.  
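As a minimal illustration of the kind of classifier involved, the sketch below trains a boosted-decision-tree discriminant on a few invented VBF-like variables; the variable choices and distributions are toys, not ATLAS simulation:

<syntaxhighlight lang="python">
# Toy signal/background classification for a VBF-like event selection.
# The variables (dijet mass, |delta eta(jj)|, dimuon pT) and their distributions
# are invented; in the project they come from ATLAS simulation.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 20000

def toy_events(is_signal, n):
    mjj = rng.normal(1000 if is_signal else 400, 300, n).clip(min=0)   # GeV
    deta = np.abs(rng.normal(4.5 if is_signal else 2.5, 1.2, n))
    ptmm = rng.exponential(120 if is_signal else 60, n)                # GeV
    return np.column_stack([mjj, deta, ptmm])

X = np.vstack([toy_events(True, n), toy_events(False, n)])
y = np.concatenate([np.ones(n), np.zeros(n)])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = GradientBoostingClassifier(n_estimators=200, max_depth=3)
clf.fit(X_tr, y_tr)
score = clf.predict_proba(X_te)[:, 1]
print("ROC AUC on the toy sample:", round(roc_auc_score(y_te, score), 3))
</syntaxhighlight>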
[2] https://arxiv.org/abs/1906.05296
 
[3] https://arxiv.org/abs/1908.05588
 
[4] https://arxiv.org/abs/1905.05215
 
  
''Contact: [mailto:j.rojo@vu.nl Juan Rojo]''
''Contact: [mailto:oliver.rieger@nikhef.nl Oliver Rieger] and [mailto:verkerke@nikhef.nl Wouter Verkerke] and [mailto:s01@nikhef.nl Peter Kluit]''
  
=== Theory: Charting the quark and gluon structure of protons and nuclei with Machine Learning ===
===ATLAS: Interpretation of experimental data using SMEFT===
Deepening our knowledge of the partonic content of nucleons and nuclei [1] represents a central endeavour of modern high-energy and nuclear physics, with ramifications in related disciplines such as astroparticle physics. There are two main scientific drivers motivating these investigations of the partonic structure of hadrons. On the one hand, addressing fundamental open issues in our understanding in the strong interactions such as the origin of the nucleon mass, spin, and transverse structure; the presence of heavy quarks in the nucleon wave function; and the possible onset of novel gluon-dominated dynamical regimes. On the other hand, pinning down with the highest possible precision the substructure of nucleons and nuclei is a central component for theoretical predictions in a wide range of experiments, from proton and heavy ion collisions at the Large Hadron Collider to ultra-high energy neutrino interactions at neutrino telescopes. The goal of this project is to exploit Machine Learning and Artificial Intelligence tools [2,3] (neural networks trained by stochastic gradient descent) to pin down the quark and gluon substructure of protons and nuclei by using recent measurements from proton-proton and proton-lead collisions at the LHC. Topics of special interest are i) the strange content of protons and nuclei, ii) parton distributions at higher-orders in the QCD couplings for precision Higgs physics, iii) the interplay between jet, photon, and top quark production data to pin down the large-x gluon, and iv) charm quarks as a probe of gluon shadowing at small-x. The project also involves developing projects for the Electron-Ion Collider (EIC), a new lepton-nucleus experiment to start operations in the next years.
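The following toy sketch (not the NNPDF code) illustrates the basic idea of parametrising a PDF-like shape with a small neural network and training it on pseudo-data by stochastic gradient descent; the functional form, uncertainties and hyper-parameters are invented for the example:

<syntaxhighlight lang="python">
# Toy fit of a PDF-like shape xf(x) with a small neural network trained by
# hand-written stochastic gradient descent.  Everything here (shape, errors,
# hyper-parameters) is invented to illustrate the method, nothing more.
import numpy as np

rng = np.random.default_rng(3)
x = np.logspace(-3, -0.3, 60)                     # momentum fractions
true = 1.5 * x ** -0.2 * (1 - x) ** 3             # assumed 'true' xf(x)
sigma = 0.1                                       # toy absolute uncertainty
data = true + rng.normal(0.0, sigma, x.size)

u = (np.log10(x) - np.log10(x).mean()) / np.log10(x).std()   # standardised input
mu_t, sd_t = data.mean(), data.std()
t = (data - mu_t) / sd_t                                     # standardised target

# one hidden layer of 20 tanh units, mini-batch SGD on the mean squared error
W1 = rng.normal(0, 1, (20, 1)); b1 = np.zeros((20, 1))
W2 = rng.normal(0, 0.1, (1, 20)); b2 = np.zeros((1, 1))
lr = 0.02
for step in range(40000):
    idx = rng.choice(x.size, 16)
    h = np.tanh(W1 @ u[None, idx] + b1)
    pred = (W2 @ h + b2).ravel()
    g = 2.0 * (pred - t[idx]) / idx.size          # dMSE/dpred
    dW2 = g[None, :] @ h.T
    db2 = np.array([[g.sum()]])
    dh = (W2.T @ g[None, :]) * (1 - h ** 2)       # backpropagate through tanh
    dW1 = dh @ u[None, idx].T
    db1 = dh.sum(axis=1, keepdims=True)
    W1 -= lr * dW1; b1 -= lr * db1; W2 -= lr * dW2; b2 -= lr * db2

fit = (W2 @ np.tanh(W1 @ u[None, :] + b1) + b2).ravel() * sd_t + mu_t
print("chi2 per point of the toy fit:", round(float(np.mean(((fit - data) / sigma) ** 2)), 2))
</syntaxhighlight>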
 
  
[1] https://arxiv.org/abs/1910.03408
The Standard Model Effective Field Theory (SMEFT) provides a systematic approach to test the impact of new physics at the energy scale of the LHC through higher-dimensional operators. The interpretation of experimental data using SMEFT calls for an interest in solving complex technical challenges, advanced statistical techniques, and a deep understanding of particle physics. We would be happy to discuss different project opportunities with you, based on your interests.
[2] https://arxiv.org/abs/1904.00018
 
[3] https://arxiv.org/abs/1706.00428
 
  
''Contact: [mailto:j.rojo@vu.nl Juan Rojo]''
''Contact: [mailto:oliver.rieger@nikhef.nl Oliver Rieger] and [mailto:verkerke@nikhef.nl Wouter Verkerke]''
  
=== Theory: Machine learning for Electron Microscopy for next-generation materials ===
===ATLAS: A new timing detector - the HGTD===
Machine Learning tools developed and applied for particle physics hold great potential for applications in material science, in particular concerning faithful uncertainty estimation and model training for large parameter spaces. In this project, carried out in collaboration with the group of Dr. Sonia Conesa-Boj from the Kavli Institute Nanoscience Delft, http://www.conesabojlab.tudelft.nl, we will  develop and deploy ML tools for data analysis in Electron Microscopy. We will focus on pinning down the properties of novel quantum materials such as topological insulators and van der Waals materials. Examples of possible applications include model-independent background subtraction in electron-energy loss spectroscopy, automatic classification of crystalline structures, and enhancing spatial and spectral resolution using convolutional networks.
The ATLAS detector is going to get a new capability: a timing detector. This allows us to reconstruct tracks not only in the three dimensions of space, but also to measure very precisely (at the picosecond level) the time at which particles pass the sensitive layers of the HGTD detector. This makes it possible to reconstruct the trajectories of the particles created at the LHC in four dimensions, and will ultimately lead to a better reconstruction of physics at ATLAS. The new HGTD detector is still under construction, and work needs to be done on different levels, such as understanding the detector response (taking measurements in the lab and performing simulations) or developing algorithms to reconstruct the particle trajectories (programming and analysis work).
  
''Contact: [mailto:j.rojo@vu.nl Juan Rojo]''
'''Several projects are available within the context of the new HGTD detector:'''  
  
===Theory: The electroweak phase transition and baryogenesis/gravitational wave production ===
#One can choose to focus on '''''the impact on physics analysis performance''''' by studying how the timing measurements can be included in the reconstruction of tracks, and what effect this has on how well we can understand the physical processes occurring in the LHC collisions. With this work you will be part of the ATLAS group at Nikhef.
#The second possibility is to '''''test the sensors in our lab''''' and in test-beam setups at CERN. The analysis will be performed in the context of the ATLAS HGTD test-beam group, in connection with both the ATLAS group and the R&D department at Nikhef.
#The third is to contribute to an ongoing effort '''''to precisely simulate/model the silicon avalanche detectors''''' in the Allpix2 framework. There are several models that try to describe the detector response, with dependencies on operating temperature, field strengths and radiation damage. We are getting close to being able to model our detector - but we are not there yet. This work will be within the ATLAS group, together with Hella Snoek and Andrea Visibile.
  
In extensions of the Standard Model the electroweak phase transition can be first order and proceed via the nucleation of bubbles. Colliding bubbles can produce gravitational waves [1] and plasma particles interacting with the bubbles can generate a matter-antimatter asymmetry [2]. A detailed understanding of the dynamics of the phase transition is needed to accurately describe these processes. One project is to study QFT at finite temperature and compare/apply methods that address the non-perturbative IR dynamics of the thermal processes [3,4]. Another project is to calculate the velocity with which the bubbles expand, which is an important parameter for gravitational-wave production and baryogenesis. This entails, among other things, tunneling dynamics, (thermal) scattering rates and Boltzmann equations [5].
If you are interested, contact me to discuss the possibilities.  
Contact: ''[mailto:hella.snoek@nikhef.nl Hella Snoek]''
  
[1]https://arxiv.org/abs/1705.01783
 
[2]https://arxiv.org/pdf/hep-ph/0609145.pdf
 
[3]https://arxiv.org/pdf/1609.06230.pdf
 
[4]https://arxiv.org/pdf/1612.00466.pdf
 
[5]https://arxiv.org/pdf/1809.04907.pdf
 
  
''Contact: [mailto:mpostma@nikhef.nl Marieke Postma]''
===ATLAS: The next full-silicon Inner Tracker: ITk===
[[File:ITk endcap structure.jpg|210x210px|thumb|alt=]]The inner detector of the present ATLAS experiment has been designed and developed to function in the environment of the present Large Hadron Collider (LHC). At the ATLAS Phase-II Upgrade, the particle densities and radiation levels will exceed current levels by a factor of ten. The instantaneous luminosity is expected to reach unprecedented values, resulting in up to 200 proton-proton interactions in a typical bunch crossing. The new detectors must be faster and they need to be more highly segmented. The sensors used also need to be far more resistant to radiation, and they require much greater power delivery to the front-end systems. At the same time, they cannot introduce excess material which could undermine tracking performance. For those reasons, the inner tracker of the ATLAS detector (ITk) was redesigned and will be rebuilt completely.
  
===Theory: Cosmology of the QCD axion ===
Nikhef is one of the sites in charge of building and integrating some big parts of ITk. One of the next steps consists of testing the sensors that we will install in the structures we have built (check one of the structures in the picture of our cleanroom). This project offers the possibility of working on a full hardware project, doing something completely new, by testing the sensors of a future component of the next ATLAS detector.
  
The QCD axion provides an elegant solution to the strong CP problem in QCD [1]. This project focuses on the cosmological dynamics of this hypothesized axion field, and in particular the possibility that it can produce both the observed matter-antimatter asymmetry and the dark matter abundance in our universe [2,3].
''Contact'':  ''[mailto:aalonso@nikhef.nl Andrea García Alonso]''
  
[1]https://arxiv.org/abs/1812.02669
===Cosmic Rays/Neutrinos: Seasonal muon flux variations and the pion/kaon ratio===
[2]https://arxiv.org/pdf/hep-ph/0609145.pdf
The KM3NeT ARCA and ORCA detectors, located kilometres deep in the Mediterranean Sea, have neutrinos as their primary probes. Muons from cosmic-ray interactions reach the detectors in relatively large quantities too. These muons, exploiting the capabilities and location of the detectors, allow the study of cosmic rays and their interactions. In this way, questions about their origin, type and propagation can be addressed. In particular, these muons are tracers of hadronic interactions at energies inaccessible at particle accelerators.
[3]https://arxiv.org/pdf/1910.02080.pdf
 
  
''Contact: [mailto:mpostma@nikhef.nl Marieke Postma]''
The muons reaching the depths of the detectors result from decays of mesons, mostly pions and kaons, created in interactions of high-energy cosmic rays with atoms in the upper atmosphere. Seasonal changes of the temperature – and thus density – profile of the atmosphere modulate the balance between the probability for these mesons to decay (producing muons) or to re-interact. Pions and kaons are affected differently, which allows their production ratio to be extracted by determining how changes in the muon rate depend on changes in the effective temperature – an integral over the atmospheric temperature profile weighted by a depth-dependent meson production rate.
  
===Theory: Neutrinos, hierarchy problem and cosmology ===
In this project, the aim is to measure the rate of muons in the detectors and to calculate the effective temperature above the KM3NeT detectors from atmospheric data, both as a function of time. The relation between the two can then be used to extract the pion-to-kaon ratio.
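The sketch below illustrates the analysis idea with invented inputs: an effective temperature is built as a weighted average over atmospheric levels, and the correlation coefficient between relative muon-rate and effective-temperature variations is extracted with a linear fit. In the project the weights, temperatures and rates come from atmospheric data and from the detector.

<syntaxhighlight lang="python">
# Toy version of the muon-rate vs effective-temperature analysis.  The weights,
# temperature profile, muon rate and injected correlation are all invented; in
# the project they come from atmospheric reanalysis data and from KM3NeT itself.
import numpy as np

rng = np.random.default_rng(7)
days = np.arange(365)

n_levels = 20
weights = np.exp(-np.linspace(0, 3, n_levels))     # assumed meson-production weights
weights /= weights.sum()
T_levels = (220 + 10 * np.linspace(0, 1, n_levels))[None, :] \
           + 8 * np.sin(2 * np.pi * days / 365)[:, None]   # seasonal modulation [K]

T_eff = T_levels @ weights                          # weighted average per day
alpha_true = 0.9                                    # injected correlation coefficient
rate = 1.0e5 * (1 + alpha_true * (T_eff - T_eff.mean()) / T_eff.mean())
rate = rng.poisson(rate)                            # counting fluctuations

dT = (T_eff - T_eff.mean()) / T_eff.mean()
dR = (rate - rate.mean()) / rate.mean()
alpha_fit = np.polyfit(dT, dR, 1)[0]
print("fitted alpha_T: %.2f  (injected: %.2f)" % (alpha_fit, alpha_true))
</syntaxhighlight>

The fitted coefficient alpha_T is the quantity that, compared with model predictions, constrains the pion/kaon production ratio.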
  
The electroweak hierarchy problem is absent if the quadratic term in the Higgs potential is generated dynamically. This is achieved in 'the neutrino option' [1], where the Higgs potential stems exclusively from quantum effects of heavy right-handed neutrinos, which can also generate the mass pattern of the observed left-handed neutrinos. The project focuses on model-building aspects (e.g. [2]) and the cosmology (e.g. leptogenesis [3]) of these set-ups.
''Contact: [mailto:rbruijn@nikhef.nl Ronald Bruijn]''
===Detector R&D: Studies of wafer-scale sensors for ALICE detector upgrade and beyond===
One of the biggest milestones of the ALICE detector upgrade (foreseen in 2026) is the implementation of wafer-scale (~ 28 cm x 18 cm) monolithic silicon active pixel sensors in the tracking detector, with the goal of having truly cylindrical barrels around the beam pipe. To demonstrate such an unprecedented technology in high-energy physics detectors, a few chips will soon be available in the Nikhef laboratories for testing and characterization purposes.
The goal of the project is to contribute to the validation of the samples against the ALICE tracking detector requirements, with a focus on timing performance in view of other applications in future high energy physics experiments beyond ALICE.
We are looking for a student with a focus on lab work and interested in high precision measurements with cutting-edge instrumentation. You will be part of the Nikhef Detector R&D group and you will have, at the same time, the chance to work in an international collaboration where you will report about the performance of these novel sensors. There may even be the opportunity to join beam tests at CERN or DESY facilities. Besides interest in hardware, some proficiency in computing is required (Python or C++/ROOT).
  
[1] https://arxiv.org/pdf/1703.10924.pdf
''Contact: [mailto:jory.sonneveld@nikhef.nl Jory Sonneveld], [mailto:rrusso@nikhef.nl Roberto Russo]''
[2] https://arxiv.org/pdf/1807.11490.pdf
 
[3] https://arxiv.org/pdf/1905.12642.pdf
 
  
''Contact: [mailto:mpostma@nikhef.nl Marieke Postma]''
===Detector R&D: Time resolution of monolithic silicon detectors===
Monolithic silicon detectors based on industrial Complementary Metal Oxide Semiconductor (CMOS) processes offer a promising approach for large-scale detectors due to their ease of production and low material budget. Until recently, their low radiation tolerance has hindered their applicability in high-energy particle physics experiments. However, new prototypes, such as the one in this project, have overcome these hurdles, making them feasible candidates for future experiments in high-energy particle physics. Achieving the required radiation tolerance has brought the spatial and temporal resolution of these detectors to the forefront. In this project, you will investigate the temporal performance of a radiation-hard monolithic detector prototype, using laser setups in the laboratory. You will also participate in meetings with the international collaboration working on this detector, where you will report on the prototype's performance. Depending on the progress of the work, there may be a chance to participate in test beams performed at the CERN accelerator complex and in a first full three-dimensional characterization of the prototype's performance using a state-of-the-art two-photon absorption laser setup at Nikhef. This project is looking for someone interested in working hands-on with cutting-edge detector and laser systems at the Nikhef laboratory. Python programming skills and Linux experience are an advantage.
  
=== KM3NeT: Reconstruction of first neutrino interactions in KM3NeT ===
''Contact: [mailto:jory.sonneveld@nikhef.nl Jory Sonneveld], [mailto:uwe.kraemer@nikhef.nl Uwe Kraemer]''
  
The neutrino telescope KM3NeT is under construction in the Mediterranean Sea, aiming to detect cosmic neutrinos. Its first few strings with sensitive photodetectors have been deployed at both the Italian and the French detector sites. Already these few strings make it possible to reconstruct in the detector the abundant muons stemming from interactions of cosmic rays with the atmosphere and to identify neutrino interactions. In this project the available data will be used together with simulations to best reconstruct the event topologies and to optimally identify and reconstruct the first neutrino interactions in the KM3NeT detector, thereby paving the way towards accurate neutrino oscillation measurements and neutrino astronomy.
===Detector R&D: Improving a Laser Setup for Testing Fast Silicon Pixel Detectors===
For the upgrades of the innermost detectors of the experiments at the Large Hadron Collider in Geneva, in particular to cope with the large number of collisions per second from 2027 onwards, the Detector R&D group at Nikhef tests new pixel detector prototypes with a variety of laser equipment at several wavelengths. The lasers can be focused down to a small spot to scan over the pixels of a pixel chip. Since the laser penetrates the silicon, the pixels will not be illuminated by just the focal spot, but by the entire three-dimensional hourglass- or double-cone-like light intensity distribution. So, how well defined is the volume in which charge is released? Can it be made much smaller than a pixel? And, if so, what would the optimum focus be? For this project the student will first estimate the expected intensity distribution inside a sensor, which corresponds to the density of released charge within the silicon. To verify the predictions, you will measure real pixel sensors for the LHC experiments.
This project will involve a lot of hands-on work in the lab, as well as programming and work on Unix machines.
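As a starting point, the expected width of the illuminated volume can be estimated with Gaussian-beam optics; the wavelength, beam waist and sensor thickness in the sketch below are assumptions to be replaced by the parameters of the actual setup:

<syntaxhighlight lang="python">
# Gaussian-beam estimate of the illuminated volume inside a silicon sensor.
# The wavelength, beam waist, refractive index and thickness below are assumed,
# illustrative values to be replaced by those of the actual laser setup.
import numpy as np

lam0 = 1064e-9          # vacuum wavelength [m] (infrared light penetrates silicon)
n_si = 3.55             # approximate refractive index of silicon at this wavelength
w0 = 2e-6               # assumed beam-waist radius at the focus [m]
thickness = 300e-6      # assumed sensor thickness [m]

z_R = np.pi * w0 ** 2 * n_si / lam0       # Rayleigh range inside the medium

def spot_radius(z):
    """1/e^2 intensity radius at a distance z from the focus."""
    return w0 * np.sqrt(1 + (z / z_R) ** 2)

print(f"Rayleigh range in silicon: {z_R * 1e6:.1f} um")
for z in (0.0, thickness / 2):   # at the focus, and at a surface if focused mid-sensor
    print(f"z = {z * 1e6:6.1f} um : spot radius = {spot_radius(z) * 1e6:5.2f} um")
</syntaxhighlight>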
  
Programming skills are essential; mostly ROOT and C++ will be used.
''Contact: [mailto:martinfr@nikhef.nl Martin Fransen]''
''Contact: [mailto:bruijn@nikhef.nl Ronald Bruijn] [mailto:dosamt@nikhef.nl Dorothea Samtleben]''
 
  
=== KM3NeT: Searching for New Heavy Neutrinos ===
===Detector R&D: Time resolution of hybrid pixel detectors with the Timepix4 chip ===
Precise time measurements with silicon pixel detectors are very important for experiments at the High-Luminosity LHC and the future circular collider. The spatial resolution of current silicon trackers will not be sufficient to distinguish the large number of collisions that will occur within individual bunch crossings. In a new method, typically referred to as 4D tracking, spatial measurements of pixel detectors will be combined with time measurements to better distinguish collision vertices that occur close together.
New sensor technologies are being explored to reach the required time measurement resolution of tens of picoseconds, and the results are promising. However, the signals that these pixelated sensors produce have to be processed by front-end electronics, which hence also play a role in the total time resolution of the detector. An important contribution comes from the systematic differences between the front-end electronics of different pixels. Many of these systematic effects can be corrected by performing detailed calibrations of the readout electronics. To achieve the required time resolution at future experiments, it is vital that these effects are understood and corrected.
In this project you will be working with the Timepix4 chip. This is a so-called application specific integrated circuit (ASIC) that is designed to read out pixelated sensors. This ASIC will be used extensively in detector R&D for the characterisation of new sensor technologies requiring precise timing (< 50 ps). In order to do so, it is necessary to first study the systematic differences between the pixels, which you will do using a laser setup in our lab. This will be combined with data analysis of proton beam measurements, or with measurements performed using the built-in test-pulse mechanism of the Timepix4 ASIC. Your work will enable further research performed with this ASIC, and serve as input to the design and operation of future ASICs for experiments at the High-Luminosity LHC.
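A toy version of such a per-pixel calibration is sketched below (the offsets, jitter and number of pulses are invented): each pixel's systematic time offset is estimated from a calibration run and subtracted, improving the overall time resolution:

<syntaxhighlight lang="python">
# Toy per-pixel time-offset calibration (invented numbers, not Timepix4 software).
# Pulses arrive at a known reference time; each pixel adds a systematic offset
# plus random jitter.  Subtracting the per-pixel mean offset measured in a
# calibration run improves the time resolution on an independent data set.
import numpy as np

rng = np.random.default_rng(11)
n_pixels, n_pulses = 400, 2000
offsets = rng.normal(0.0, 100.0, n_pixels)          # systematic offsets [ps]
jitter = 40.0                                       # per-hit jitter [ps]

# measured arrival times relative to the reference, per (pulse, pixel)
t_meas = offsets[None, :] + rng.normal(0.0, jitter, (n_pulses, n_pixels))

# calibration: estimate each pixel's offset from the first half of the pulses
calib = t_meas[: n_pulses // 2].mean(axis=0)

# apply the calibration to the independent second half and compare resolutions
raw = t_meas[n_pulses // 2:]
corrected = raw - calib[None, :]
print("time resolution before correction: %.0f ps" % raw.std())
print("time resolution after  correction: %.0f ps" % corrected.std())
</syntaxhighlight>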
  
In this project we will be searching for a new heavy neutrino, looking at signatures created by atmospheric neutrinos interacting in the detector volume of KM3NeT-ORCA. The aim of this project is to study a specific event topology which appears as double blobs of signals detected separately by the densely instrumented ORCA detector units. We will exploit the tau reconstruction algorithms to verify whether ORCA can detect such signals and to estimate the potential sensitivity of the experiment. Basic knowledge of elementary particle physics and data analysis techniques will be advantageous. Knowledge of programming languages, e.g. Python (and possibly C++), and of ROOT is advantageous but not mandatory.
''Contact: [mailto:k.heijhoff@nikhef.nl Kevin Heijhoff] and [mailto:martinb@nikhef.nl Martin van Beuzekom]''
  
''Contact: [mailto:suzanbp@nikhef.nl Suzan B. du Pree] [mailto:dveijk@nikhef.nl Daan van Eijk]''
===Detector R&D: Performance studies of Trench Isolated Low Gain Avalanche Detectors (TI-LGAD)===
The future vertex detector of the LHCb Experiment needs to measure the spatial coordinates and time of the particles originating in the LHC proton-proton collisions with resolutions better than 10 um and 50 ps, respectively. Several technologies are being considered to achieve these resolutions.  Among those is a novel sensor  technology called Trench Isolated Low Gain Avalanche Detector.
Prototype pixelated sensors have been manufactured recently and have to be characterised. To this end, these new sensors will be bump-bonded to a Timepix4 ASIC, which provides charge and time measurements in each of its 230 thousand pixels. Characterisation will be done using a lab setup at Nikhef, and includes tests with a micro-focused laser beam, radioactive sources, and possibly with particle tracks obtained in a test beam. This project involves data taking with these new devices and analysing the data to determine performance parameters such as the spatial and temporal resolution as a function of temperature and other operational conditions.
  
=== KM3NeT: Dark Matter with KM3NeT-ORCA ===
''Contacts: [mailto:kazu.akiba@nikhef.nl Kazu Akiba] and [mailto:martinb@nikhef.nl Martin van Beuzekom]''
  
Dark Matter is thought to be everywhere (we should be swimming through it), but we have no idea what it is. Using the good energy and angular resolutions of the KM3NeT neutrino telescope, we can search for Dark Matter signatures that originate from the center of our galaxy. In this project, we will search for such signatures using the reconstructed track and shower events of the KM3NeT-ORCA detector to discover relatively light Dark Matter particles. Since this year, the KM3NeT-ORCA experiment has 6 detection lines under the Mediterranean Sea, fully operational and continuously taking data. Using the available data, it is possible to compare data and simulation for different event topologies and to estimate the experiment's sensitivity. The project is suitable for a student who is interested in exploring new physics scenarios and willing to develop new skills. Basic knowledge of elementary particle physics and data analysis techniques will be advantageous. Knowledge of programming languages, e.g. Python (possibly C++), and of the ROOT data analysis tool is advantageous but not mandatory.
===Detector R&D: A Telescope with Ultrathin Sensors for Beam Tests===
To measure the performance of new prototypes for upgrades of the LHC experiments and beyond, a telescope is typically used in a beam line of charged particles, so that the results obtained with the prototype can be compared to particle tracks measured with the telescope. In this project, you will continue work on a very lightweight, compact telescope using ALICE PIxel DEtectors (ALPIDEs). This includes work on the mechanics, the data acquisition software, and a moveable stage. You will foreseeably test this telescope in the Delft Proton Therapy Center. If time allows, you will add a timing plane and perform a measurement with one of our prototypes. Apart from travel to Delft, there is a possibility to travel to other beam line facilities.
  
''Contact: [mailto:suzanbp@nikhef.nl Suzan B. du Pree] [mailto:dveijk@nikhef.nl Daan van Eijk]''
''Contact: [mailto:jory.sonneveld@nikhef.nl Jory Sonneveld]''
  
===Detector R&D: Laser Interferometer Space Antenna (LISA) - the first gravitational wave detector in space ===
  
=== Gravitational Waves: Unraveling the structure of neutron stars with gravitational wave observations ===
The space-based gravitational wave antenna LISA is one of the most challenging space missions ever proposed. ESA plans to launch, around 2034, three spacecraft separated by a few million kilometres. This constellation measures tiny variations in the distances between test masses located in each satellite to detect gravitational waves from sources such as supermassive black holes. LISA is based on laser interferometry, and the three satellites form a giant Michelson interferometer. LISA measures a relative phase shift between one local laser and one distant laser by light interference. The phase shift measurement requires sensitive sensors. The Nikhef Detector R&D group fabricated prototype sensors in 2020 together with the photonics industry and the Dutch institute for space research SRON. Nikhef & SRON are responsible for the Quadrant PhotoReceiver (QPR) system: the sensors, the housing including a complex mount to align the sensors with tens of nanometres accuracy, various environmental tests at the European Space Research and Technology Centre (ESTEC), and the overall performance of the QPR in the LISA instrument. Currently we are discussing possible sensor improvements for a second fabrication run in 2022, optimizing the mechanics and preparing environmental tests. As an MSc student, you will work on various aspects of the wavefront sensor development: studying the performance of the epitaxial stacks of Indium-Gallium-Arsenide, setting up test benches to characterize the sensors and the QPR system, and performing the actual tests and data analysis, in combination with performance studies and simulations of the LISA instrument.
  
Neutron stars were first discovered more than half a century ago, yet their detailed internal structure largely remains a mystery. A range of theoretical models have been put forward for the neutron star "equation of state", but until recently there was no real way to test them. The direct detection of gravitational waves with LIGO and Virgo has the potential to remedy the situation. When two neutron stars spiral towards each other, they get tidally deformed in a way that is determined by the equation of state, and these deformations get imprinted upon the shape of the gravitational wave that gets emitted. After the first gravitational wave observation of such an event in 2017, several equation of state models could already be ruled out. With expected upgrades of the detectors, we will at some point have access not only to the "inspiral" of binary neutron stars, but to the merger itself, and what happens afterwards. The project will consist of using results from large-scale numerical simulations to come up with a heuristic model for the waveform that describes the inspiral-merger-postmerger process with sufficient accuracy given expected detector sensitivities, and to develop data analysis techniques to efficiently use this model to extract information about the neutron star equation of state.
''Contact: [mailto:nielsvb@nikhef.nl Niels van Bakel]''
  
''Contact: [mailto:vdbroeck@nikhef.nl Chris Van Den Broeck]''
===Detector R&D: Other projects===
Are you looking for a slightly different project? Are the above projects already taken? Are you coming in at an unusual time of the year? Do not hesitate to contact us! We always have new projects coming up at different times in the year and we are open to your ideas.
  
''Contact: [mailto:jory.sonneveld@nikhef.nl Jory Sonneveld]''
  
=== Gravitational Waves: Searches for gravitational waves from compact binary coalescence ===
===FCC: The Next Collider===
Searches for gravitational waves from the mergers of black holes and neutron stars have been extraordinarily successful in the last four years. We are now beginning to study a population of heavy stellar-mass black holes in detail, including understanding how these systems came to form and whether they are consistent with general relativity. Additionally, the detection of binary neutron star mergers is allowing us to probe their extreme matter. However, we’ve only just scratched the surface of possible signals and the new physics they’d allow us to study. The detection of highly spinning and precessing systems would allow us to perform black hole population statistics to an extraordinary degree of accuracy. Detection of sub-solar mass systems would provide evidence of dark matter. However, these searches are difficult because they require us to work in high-dimensional spaces and develop new statistical methods. There are possibilities for several projects that involve the development and implementation of these new searches as well as the interpretation of the results, particularly in terms of the physics describing compact binary mergers.
 
  
''Contact: [mailto:physarah@gmail.com Sarah Caudill]''
After the LHC, the next planned large collider at CERN is the proposed 100 kilometer circular collider "FCC". In the first stage of the project, as a high-luminosity electron-positron collider, precision measurements of the Higgs boson are the main goal. One of the channels that will improve by orders of magnitude at this new accelerator is the decay of the Higgs boson to a pair of charm quarks. This project will estimate a projected sensitivity for the coupling of the Higgs boson to second generation quarks, and in particular target the improved reconstruction of the topology of long-lived mesons in the clean environment of a precision e+e- machine.
  
''Contact: [mailto:tdupree@nikhef.nl Tristan du Pree]''
  
=== Gravitational Waves: Computer modelling to design the laser interferometers for the Einstein telescope ===
  
 
A new field of instrument science led to the successful detection of gravitational waves by the LIGO detectors in 2015. We are now preparing the next generation of gravitational wave observatories, such as the Einstein Telescope, with the aim of increasing the detector sensitivity by a factor of ten, which would make it possible, for example, to detect stellar-mass black holes from early in the universe, when the first stars began to form. This ambitious goal requires us to find ways to significantly improve the best laser interferometers in the world.
 
 
''Contact: [mailto:a.freise@nikhef.nl Andreas Freise]''
 
  
===LHCb: Search for light dark particles ===
The Standard Model of elementary particles does not contain a proper Dark Matter candidate. One of the most tantalizing theoretical developments is the so-called ''Hidden Valley models'': a mirror-like copy of the ''Standard Model'', with dark particles that communicate with standard ones via a very feeble interaction. These models predict the existence of ''dark hadrons'' – composite particles that are bound similarly to ordinary hadrons in the ''Standard Model''. Such ''dark hadrons'' can be abundantly produced in high-energy proton-proton collisions, making the LHC a unique place to search for them. Some ''dark hadrons'' are stable like a proton, which makes them excellent ''Dark Matter'' candidates, while others decay to ordinary particles after flying a certain distance in the collider experiment. The LHCb detector has a unique capability to identify such decays, particularly if the new particles have a mass below ten times the proton mass.
This project involves a unique search for light ''dark hadrons'' that covers a mass range not accessible to other experiments. It comprises an interesting programme of data analysis (Python-based) with non-trivial machine-learning solutions, as well as phenomenology research using a fast simulation framework. Depending on your interest, there is quite a bit of flexibility in the precise focus of the project.
''Contact: [mailto:andrii.usachov@nikhef.nl Andrii Usachov]''
===LHCb: Searching for dark matter in exotic six-quark particles===
Most of the matter in the Universe is of unknown type. Many hypotheses about this dark matter have been proposed, but none has been confirmed. Recently it has been proposed that it could be made of particles composed of the six quarks uuddss, which would be a Standard-Model solution to the dark matter problem. This idea has recently gained credibility as many similar multi-quark states are being discovered by the LHCb experiment. Such a particle could be produced in decays of heavy baryons, or directly in proton-proton collisions. The anti-particle, made of six antiquarks, could be seen when annihilating with detector material. It is also proposed to use Xi_b baryons produced at LHCb to search for such a state, where it would appear as missing four-momentum in a kinematically constrained decay. The project consists of defining a selection and applying it to LHCb data. See [https://arxiv.org/abs/2007.10378 arXiv:2007.10378].
Contact: ''[mailto:patrick.koppenburg@cern.ch Patrick Koppenburg]''
===LHCb: Measuring lepton flavour universality with excited Ds states in semileptonic Bs decays===
One of the most striking discrepancies between the Standard Model and experimental measurements is found in the lepton flavour universality (LFU) measurements with tau decays. At the moment, an excess of 3-4 sigma has been observed in ''B → Dτν'' decays. This could even point to a new force of nature! To understand this discrepancy, we need to make further measurements.
One very exciting (pun intended) project to verify these discrepancies involves measuring the ''B<sub>s</sub> → D<sub>s2</sub><sup>*</sup>τν'' and/or ''B<sub>s</sub> → D<sub>s1</sub><sup>*</sup>τν'' decays. These decays with excited states of the ''D<sub>s</sub>'' meson have not been observed before in the tau decay mode, and they have a unique way of coupling to potential new-physics candidates that can only be measured in ''B<sub>s</sub>'' decays [1]. See the slides for more detail: [[File:LHCbLFUwithExcitedDs.pdf|thumb]]
[1] https://arxiv.org/abs/1606.09300
''Contact: [mailto:suzannek@nikhef.nl Suzanne Klaver]''
===LHCb: New physics in the angular distributions of B decays to K*ee===
The violation of lepton flavour universality in B decays can be explained by a variety of non-Standard-Model interactions. Angular distributions in decays of a B meson to a hadron and two leptons are an important source of information to understand which model is correct. Previous analyses at the LHCb experiment have considered final states with a pair of muons. Our LHCb group at Nikhef concentrates on a new measurement of the angular distributions in decays with two electrons. The main challenge in this measurement is the calibration of the detection efficiency. In this project you will confront estimates of the detection efficiency derived from simulation with decay distributions in a well-known B decay. Once the calibration is understood, the very first analysis of the angular distributions in the electron final state can be performed.
Contact:  [mailto:m.senghi.soares@nikhef.nl Mara Soares] and [mailto:wouterh@nikhef.nl Wouter Hulsbergen]
===LHCb: Discovering heavy neutrinos in B decays===
Neutrinos are the lightest of all fermions in the standard model. Mechanisms to explain their small mass rely on the introduction of new, much heavier, neutral leptons. If the mass of these new neutrinos is below the b-quark mass, they can be observed in B hadron decays.
In this project we search for the decay of B+ mesons into an ordinary electron or muon and the yet-undiscovered heavy neutrino. The heavy neutrino is expected to be unstable and in turn decay quickly into a charged pion and another electron or muon. The final state in which the two leptons differ in flavour, "B+ to e mu pi", is particularly interesting: it is forbidden in the Standard Model, such that backgrounds are small. The analysis will be performed within the LHCb group at Nikhef using LHCb Run-2 data.
=== LHCb: Scintillating Fibre tracker software===
The installation of the scintillating-fibre tracker in LHCb’s underground cavern was recently completed. This detector uses 10000 km of fibres to track particle trajectories in the LHCb detector when the LHC starts up again later this year. The light emitted by the scintillating fibres when a particle interacts with them is measured using silicon photomultipliers. The studies proposed for this project will focus on software, and could include writing a framework to monitor the detector output, improving the detector simulation or working on the data processing.
''Contact: [mailto:e.gabriel@nikhef.nl Emmy Gabriel]''
=== LHCb: Vertex detector calibration===
In summer 2022, LHCb started data taking with an almost entirely new detector. At the point closest to the interaction point, the trajectories of charged particles are reconstructed with a so-called silicon pixel detector. The design hit resolution of this detector is about 15 microns. However, to actually reach this resolution, a precise calibration of the spatial positions of the silicon sensors needs to be performed. In this project, you will use the first data of the new LHCb detector to perform this calibration and measure the detector performance.
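The toy example below (not the LHCb alignment software) illustrates the principle: straight tracks are interpolated between two planes that are taken as aligned references, and the mean residual in each other plane estimates its misalignment. All geometry and resolution numbers are invented:

<syntaxhighlight lang="python">
# Toy alignment exercise (invented geometry and resolutions, not the LHCb
# alignment software): straight tracks are interpolated between the first and
# last plane, which are taken as well-aligned references; the mean residual in
# each remaining plane then estimates its misalignment in x.
import numpy as np

rng = np.random.default_rng(5)
z = np.array([0.0, 20.0, 40.0, 60.0, 80.0])              # plane positions [mm]
true_dx = np.array([0.0, 0.030, -0.045, 0.020, 0.0])     # misalignments [mm]
hit_res = 0.015                                          # single-hit resolution [mm]
n_tracks = 5000

# generate straight tracks x(z) = x0 + tx*z and their measured hits
x0 = rng.uniform(-5, 5, n_tracks)
tx = rng.uniform(-0.05, 0.05, n_tracks)
x_true = x0[:, None] + tx[:, None] * z[None, :]
x_meas = x_true + true_dx[None, :] + rng.normal(0, hit_res, (n_tracks, len(z)))

# interpolate each track between the two reference planes (first and last)
slope = (x_meas[:, -1] - x_meas[:, 0]) / (z[-1] - z[0])
intercept = x_meas[:, 0] - slope * z[0]
prediction = intercept[:, None] + slope[:, None] * z[None, :]

dx_est = (x_meas - prediction).mean(axis=0)
for i, (est, ref) in enumerate(zip(dx_est, true_dx)):
    print(f"plane {i}: estimated dx = {est * 1e3:6.1f} um   (true {ref * 1e3:6.1f} um)")
</syntaxhighlight>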
''Contact: [mailto:wouterh@nikhef.nl Wouter Hulsbergen]''
===Neutrinos: Neutrino scattering: the ultimate resolution===
Neutrino telescopes like IceCube and KM3NeT aim at detecting neutrinos from cosmic sources. The neutrinos are detected with the best resolution when charged-current interactions with nucleons produce a muon, which can be detected with high accuracy (depending on the detector). A crucial ingredient in the ultimate achievable pointing accuracy of neutrino telescopes is the scattering angle between the neutrino and the muon. While published computations have investigated the cross-section of the process in great detail, this important scattering angle has not received much attention. The aim of the project is to compute and characterize the distribution of this angle, and thus the ultimate resolution of a neutrino telescope. If successful, the results of this project can lead to a publication of interest to the neutrino telescope community.
Depending on your interests, the study could be based on a first-principles calculation (using the deep-inelastic scattering formalism), include state-of-the-art parton distribution functions, and/or exploit existing event-generation software for a more experimental approach.
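A rough feel for the angle can already be obtained from the charged-current DIS kinematics alone, as in the toy estimate below; the x and y distributions sampled here are crude placeholders for the proper differential cross-section:

<syntaxhighlight lang="python">
# Toy estimate of the neutrino-muon angle in charged-current deep-inelastic
# scattering, using Q^2 = 2 E_nu E_mu (1 - cos theta), E_mu = (1 - y) E_nu and
# Q^2 = 2 M x y E_nu.  The x and y distributions sampled below are crude
# placeholders; in the project they follow from the differential cross-section.
import numpy as np

rng = np.random.default_rng(2)
M = 0.938                       # nucleon mass [GeV]
E_nu = 1.0e3                    # neutrino energy [GeV]
n = 200000

y = rng.uniform(0.05, 0.95, n)             # toy inelasticity distribution
x = 10 ** rng.uniform(-3, -0.3, n)         # toy Bjorken-x distribution (log-flat)

cos_theta = 1.0 - M * x * y / ((1.0 - y) * E_nu)
theta_deg = np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))
print("median nu-mu angle at E_nu = %.0f GeV: %.3f deg" % (E_nu, np.median(theta_deg)))
print("90%% of the toy events are below %.3f deg" % np.percentile(theta_deg, 90))
</syntaxhighlight>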
''Contacts: [mailto:aart.heijboer@nikhef.nl Aart Heijboer]''
===Neutrinos: acoustic detection of ultra-high energy neutrinos===
The study of cosmic neutrinos with energies above 10<sup>17</sup> eV, the so-called ultra-high energy neutrinos, provides a unique view on the universe and may provide insight into the origin of the most violent astrophysical sources, such as gamma-ray bursts, supernovae or even dark matter. In addition, the observation of high-energy neutrinos may provide a unique tool to study interactions at high energies. The energy deposition of these extreme neutrinos in water induces a thermo-acoustic signal, which can be detected using sensitive hydrophones. The expected neutrino flux is, however, extremely low, and the signal that neutrinos induce is small. TNO is presently developing sensitive hydrophone technology based on fiber optics. Optical fibers form a natural way to create a distributed sensing system. Using this technology a large-scale neutrino telescope can be built in the deep sea. TNO is aiming for a prototype hydrophone which will form the building block of a future telescope.
The work will be executed at the Nikhef institute and/or the TNO laboratories in Delft. In this project master students have the opportunity to contribute in the following ways:
'''Project 1:''' Hardware development on fiber-optic hydrophone technology. Goal: characterize existing prototype optical fibre hydrophones in an anechoic basin at the TNO laboratory. Data collection, calibration, characterization, and analysis of the consequences for the design of future acoustic hydrophone neutrino telescopes;
Keywords: Optical fiber technology, signal processing, electronics, lab.
'''Project 2:''' Investigation of ultra-high energy neutrinos and their interactions with matter. Goal: discriminate the neutrino signals from background noise, in particular clicks from whales and dolphins in the deep sea, and study the impact on the physics reach of future acoustic hydrophone neutrino telescopes;
Keywords: Monte Carlo simulations, particle physics, neutrino physics, data analysis algorithms.
Further information: Info on ultra-high energy neutrinos can be found at: http://arxiv.org/abs/1102.3591; Info on acoustic detection of neutrinos can be found at: http://arxiv.org/abs/1311.7588
''Contact: [mailto:ernst-jan.buis@tno.nl Ernst Jan Buis]'' or ''[mailto:ivo.van.vulpen@nikhef.nl Ivo van Vulpen]''
===Neutrinos: Oscillation analysis with the first data of KM3NeT ===
The neutrino telescope KM3NeT is under construction in the Mediterranean Sea, aiming to detect cosmic neutrinos. Its first few strings with sensitive photodetectors have been deployed at both the Italian and the French detector sites. Already these few strings make it possible to reconstruct in the detector the abundant muons stemming from interactions of cosmic rays with the atmosphere and to identify neutrino interactions. In this project the available data will be used together with simulations to best reconstruct the event topologies and to optimally identify and reconstruct the first neutrino interactions in the KM3NeT detector. The data will then be used to measure neutrino oscillation parameters and to prepare for a future determination of the neutrino mass ordering.
Programming skills are essential; mostly ROOT and C++ will be used.
''Contact: [mailto:bruijn@nikhef.nl Ronald Bruijn] [mailto:h26@nikhef.nl Paul de Jong]''
===Neutrinos: the Deep Underground Neutrino Experiment (DUNE)===
The Deep Underground Neutrino Experiment (DUNE) is under construction in the USA, and will consist of a powerful neutrino beam originating at Fermilab, a near detector at Fermilab, and a far detector in the SURF facility in Lead, South Dakota, 1300 km away. While travelling, neutrinos oscillate and a fraction of the neutrino beam changes flavour; DUNE will determine the neutrino oscillation parameters to unrivaled precision, and try to make a first detection of CP violation in neutrinos. In this project, various elements of DUNE can be studied, including the neutrino oscillation fit, neutrino physics with the near detector, event reconstruction and classification (including machine learning), or elements of data selection and triggering.
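As a minimal illustration of the oscillation physics (the real DUNE fit uses three flavours and matter effects), the standard two-flavour vacuum formula already shows where the first oscillation maximum sits for the 1300 km baseline; the mixing parameters below are example values:

<syntaxhighlight lang="python">
# Two-flavour vacuum-oscillation illustration for the DUNE baseline
# (the full analysis uses three flavours and matter effects):
#   P(numu -> nue) ~ sin^2(2*theta) * sin^2(1.27 * dm2[eV^2] * L[km] / E[GeV])
# The mixing parameters below are example values.
import numpy as np

L = 1300.0                        # Fermilab -> SURF baseline [km]
dm2 = 2.5e-3                      # |Delta m^2| [eV^2], approximate
sin2_2theta = 0.09                # effective mixing, example value

for E in np.linspace(0.5, 5.0, 10):       # neutrino energy [GeV]
    P = sin2_2theta * np.sin(1.27 * dm2 * L / E) ** 2
    print(f"E = {E:4.1f} GeV  ->  P(numu->nue) ~ {P:.3f}")
</syntaxhighlight>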
''Contact: [mailto:h26@nikhef.nl Paul de Jong]''
===Neutrinos: relic neutrino detection with PTOLEMY===
PTOLEMY aims to make the first direct observation of the Big Bang relic neutrinos (the cosmic neutrino background, CνB) by resolving the β-decay endpoint of atomic tritium (neutrino capture target) to O(meV) precision. This remains an outstanding test of the Standard Model in an expanding universe. Not only does the CνB carry with it a signal from the hot, dense universe only one second after the Big Bang, but it also helps to constrain the balance of hot versus cold dark matter responsible for its evolution. In doing so, the PTOLEMY experiment would also measure the lowest neutrino mass, an as-yet unknown fundamental constant. The experiment is currently in the prototyping phase, and the group at Nikhef is responsible for developing the radio-frequency (RF) system used for the cyclotron-radiation (CR) based trigger and tracking. This component will provide the trajectory of electrons entering the novel transverse drift filter, constraining the electrons' energy losses before they reach the cryogenic calorimeter, which in turn records their final energy. The focus of this project will be modelling CR and its detection for the purposes of single-electron spectroscopy and optimised trajectory reconstruction. There is also the opportunity to test hardware and readout electronics for the prototype RF system.
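A quick estimate of the relevant cyclotron frequency follows directly from f = eB/(2π γ m<sub>e</sub>); the magnetic field in the sketch below is an assumed value for illustration:

<syntaxhighlight lang="python">
# Cyclotron frequency of tritium beta-decay electrons, f = e*B / (2*pi*gamma*m_e).
# The magnetic field value is an assumption chosen purely for illustration.
import numpy as np

e = 1.602176634e-19         # elementary charge [C]
m_e = 9.1093837015e-31      # electron mass [kg]
mec2_keV = 510.99895        # electron rest energy [keV]
B = 1.0                     # assumed magnetic field [T]

for K in (1.0, 10.0, 18.6):             # kinetic energies [keV]; ~18.6 keV is the endpoint
    gamma = 1.0 + K / mec2_keV
    f = e * B / (2 * np.pi * gamma * m_e)
    print(f"K = {K:5.1f} keV : cyclotron frequency = {f / 1e9:6.2f} GHz")
</syntaxhighlight>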
''Contact: [mailto:jmead@nikhef.nl James Vincent Mead]''
=== Theoretical Particle Physics: Effective Field Theories of Particle Physics from low- to high-energies===
Known elementary matter particles exhibit a surprising three-fold structure. The particles belonging to each of these three “generations” seem to display a remarkable pattern of identical properties, yet have vastly different masses. This puzzling pattern is unexplained. Equally unexplained is the bewildering imbalance between matter and anti-matter observed in the universe, despite minimal differences in the properties of particles and anti-particles. These two mystifying phenomena may originate from a deeper, still unknown, fundamental structure characterised by novel types of particles and interactions, whose unveiling would revolutionise our understanding of nature. The ultimate goal of particle physics is uncovering a fundamental theory which allows the coherent interpretation of phenomena taking place at all energy and distance scales. In this project, the students will exploit the Standard Model Effective Field Theory (SMEFT) formalism, which allows the theoretical interpretation of particle physics data in terms of new fundamental quantum interactions which relate seemingly disconnected processes with minimal assumptions on the nature of an eventual UV-complete theory that replaces the Standard Model. Specifically, the goal is to connect measurements from ATLAS, CMS, and LHCb experiments at the CERN's LHC among them and to jointly interpret this information with that provided by other experiments including very low-energy probes such as the anomalous magnetic moment of the muon or electric dipole moments of the electron and neutron.
This project will be based on theoretical calculations in particle physics, numerical simulations in Python, analysis of existing data from the LHC and other experiments, as well as formal developments in understanding the operator structure of effective field theories. Depending on the student profile, sub-projects with a strong computational and/or machine learning component are also possible, for instance to construct new operators with optimal sensitivity to New Physics effects as encoded by the SMEFT higher-dimensional operators. Topics that can be considered in this project include the interpretation of novel physical observables at the LHC and their integration into the global SMEFiT analysis, matching of EFTs to UV-complete theories and their phenomenological analyses, projections for the impact in the SMEFT parameter space of data for future colliders, the synergies between EFT studies and proton structure fits, and the matching to the Weak Effective Field Theory to include data on flavour observables such as B-meson decays.
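As a minimal illustration of how a single Wilson coefficient could be constrained, the toy chi-squared scan below assumes a cross section of the form sigma(c) = sigma_SM (1 + a c + b c^2); the coefficients, the pseudo-measurement and its uncertainty are invented purely to show the procedure:

<syntaxhighlight lang="python">
# Toy chi^2 scan for one Wilson coefficient c, assuming a linear + quadratic EFT
# dependence sigma(c) = sigma_SM * (1 + a*c + b*c^2).  The coefficients, the
# pseudo-measurement and its uncertainty are invented to illustrate the procedure.
import numpy as np

sigma_sm = 1.00             # SM prediction (arbitrary units)
a, b = 0.20, 0.05           # assumed linear and quadratic EFT coefficients
meas, unc = 1.02, 0.04      # pseudo-measurement and its uncertainty

c_scan = np.linspace(-3, 3, 1201)
sigma_eft = sigma_sm * (1 + a * c_scan + b * c_scan ** 2)
chi2 = ((sigma_eft - meas) / unc) ** 2

c_best = c_scan[np.argmin(chi2)]
allowed = c_scan[chi2 - chi2.min() < 3.84]          # ~95% CL for one parameter
print(f"best-fit c = {c_best:+.2f}")
print(f"95% CL interval: [{allowed.min():+.2f}, {allowed.max():+.2f}]")
</syntaxhighlight>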
References: https://arxiv.org/abs/2105.00006, https://arxiv.org/abs/2302.06660, https://arxiv.org/abs/2211.02058, https://arxiv.org/abs/1901.05965, https://arxiv.org/abs/1906.05296, https://arxiv.org/abs/1908.05588, https://arxiv.org/abs/1905.05215. See also this [https://www.dropbox.com/s/30co188f1almzq2/rojo-GRAPPA-MSc-2023.pdf?dl=0 project description].
''Contacts: [mailto:j.rojo@vu.nl Juan Rojo]''
===Theoretical Particle Physics: High-energy neutrino-nucleon interactions at the Forward Physics Facility===
High-energy collisions at the High-Luminosity Large Hadron Collider (HL-LHC) produce a large number of particles along the beam collision axis, outside of the acceptance of existing experiments. The proposed Forward Physics Facility (FPF) to be located several hundred meters from the ATLAS interaction point and shielded by concrete and rock, will host a suite of experiments to probe Standard Model (SM) processes and search for physics beyond the Standard Model (BSM). High statistics neutrino detection will provide valuable data for fundamental topics in perturbative and non-perturbative QCD and in weak interactions. Experiments at the FPF will enable synergies between forward particle production at the LHC and astroparticle physics to be exploited. The FPF has the promising potential to probe our understanding of the strong interactions as well as of proton and nuclear structure, providing access to both the very low-x and the very high-x regions of the colliding protons. The former regime is sensitive to novel QCD production mechanisms, such as BFKL effects and non-linear dynamics, as well as the gluon parton distribution function (PDF) down to x=1e-7, well beyond the coverage of other experiments and providing key inputs for astroparticle physics. In addition, the FPF acts as a neutrino-induced deep-inelastic scattering (DIS) experiment with TeV-scale neutrino beams. The resulting measurements of neutrino DIS structure functions represent a valuable handle on the partonic structure of nucleons and nuclei, particularly their quark flavour separation, that is fully complementary to the charged-lepton DIS measurements expected at the upcoming Electron-Ion Collider (EIC).
In this project, the student will carry out updated predictions for the neutrino fluxes expected at the FPF, assess the precision with which neutrino cross-sections will be measured, and quantify their impact on proton and nuclear structure by means of machine learning tools within the NNPDF framework and state-of-the-art calculations in perturbative Quantum Chromodynamics. This project contributes to ongoing work within the FPF Initiative towards a Conceptual Design Report (CDR) to be presented within two years. Topics that can be considered as part of this project include the assessment of the extent to which nuclear modifications of the free-proton PDFs can be constrained by FPF measurements, the determination of the small-x gluon PDF from suitably defined observables at the FPF and its implications for ultra-high-energy particle astrophysics, the study of the intrinsic charm content of the proton and its consequences for the FPF physics program, and the validation of models for neutrino-nucleon cross-sections in the region beyond the validity of perturbative QCD.
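As a flavour of the numerical side of the project, the sketch below evaluates the gluon PDF at very small x with a public NNPDF set; it assumes the LHAPDF Python bindings and the NNPDF4.0 grid "NNPDF40_nnlo_as_01180" are installed locally (neither is specified on this page, so treat the set name and the kinematic values as examples).

<syntaxhighlight lang="python">
# Minimal sketch: evaluate x*g(x, Q2) at small x with LHAPDF.
# Assumes the LHAPDF Python bindings and the NNPDF4.0 grid are installed.
import lhapdf

pdf = lhapdf.mkPDF("NNPDF40_nnlo_as_01180")  # example PDF set, not prescribed by the project
Q2 = 10.0  # GeV^2, illustrative scale
for x in (1e-3, 1e-5, 1e-7):
    # xfxQ2(pid, x, Q2) returns x*f(x, Q2); PDG id 21 is the gluon
    print(f"x = {x:.0e}:  x*g = {pdf.xfxQ2(21, x, Q2):.3f}")
</syntaxhighlight>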
References: https://arxiv.org/abs/2203.05090, https://arxiv.org/abs/2109.10905, https://arxiv.org/abs/2208.08372, https://arxiv.org/abs/2201.12363, https://arxiv.org/abs/2109.02653, https://github.com/NNPDF/. See also this [https://www.dropbox.com/s/30co188f1almzq2/rojo-GRAPPA-MSc-2023.pdf?dl=0 project description].
''Contacts: [mailto:j.rojo@vu.nl Juan Rojo]''
  
=== Gravitational Waves: Digging away the noise to find the signal ===

Gravitational Wave interferometers are extremely sensitive, but suffer from instrumental issues that produce noise that mimics astrophysical signals. This needs to be solved as much as possible before the data analysis. The problem is that instrumentalists don't know about analysis pipelines, and data analysts don't know about experimental details. We need your help to bridge the gap. This is a good opportunity to learn about both sides and contribute directly to a booming international field. We have several tools and new ideas for correlating noises with the state of the instrument. These need to be developed further, used on years of data, and written up. The project will require Python, signal processing and statistics.

''Contact: [mailto:swinkels@nikhef.nl Bas Swinkels] and [mailto:physarah@gmail.com Sarah Caudill]''

===Theoretical Particle Physics: Probing the origin of the proton spin with machine learning===

At energy-frontier facilities such as the Large Hadron Collider (LHC), scientists study the laws of nature in their quest for novel phenomena both within and beyond the Standard Model of particle physics. An in-depth understanding of the quark and gluon substructure of protons and heavy nuclei is crucial to address pressing questions from the nature of the Higgs boson to the origin of cosmic neutrinos. The key to addressing some of these questions is a universal analysis of nucleon structure based on the simultaneous determination of the momentum and spin distributions of quarks and gluons and their fragmentation into hadrons. This effort requires combining an extensive experimental dataset and cutting-edge theory calculations within a machine learning framework where neural networks parametrise the underlying physical laws while minimising ad-hoc model assumptions. The upcoming Electron Ion Collider (EIC), to start taking data in 2029, will be the world's first ever polarised lepton-hadron collider and will offer a plethora of opportunities to address key open questions in our understanding of the strong nuclear force, such as the origin of the mass and the intrinsic angular momentum (spin) of hadrons and whether there exists a state of matter which is entirely dominated by gluons. To fully exploit this scientific potential, novel analysis methodologies need to be developed that make it possible to carry out large-scale, coherent interpretations of measurements from the EIC and other high-energy colliders.

In this project, the student will carry out a new global analysis of the spin structure of the proton by means of the machine learning tools provided by the NNPDF open-source fitting framework and state-of-the-art calculations in perturbative Quantum Chromodynamics, and integrate it within the corresponding global NNPDF analyses of unpolarised proton and nuclear structure, in the framework of a combined, integrated global analysis of non-perturbative QCD. Specifically, the project aims to realise an NNLO global fit of polarised quark and gluon PDFs that combines all available data and state-of-the-art perturbative QCD calculations, and to study the phenomenological implications for other experiments, including the EIC, for the spin content of the proton, for comparisons with lattice QCD calculations, and for non-perturbative models of hadron structure.

References: https://arxiv.org/abs/2201.12363, https://arxiv.org/abs/2109.02653, https://arxiv.org/abs/2103.05419, https://arxiv.org/abs/1404.4293, https://inspirehep.net/literature/1302398, https://github.com/NNPDF/. See also this [https://www.dropbox.com/s/30co188f1almzq2/rojo-GRAPPA-MSc-2023.pdf?dl=0 project description].

''Contacts: [mailto:j.rojo@vu.nl Juan Rojo]''
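For orientation, the decomposition such a polarised fit ultimately addresses is the proton spin sum rule (written here in the standard Jaffe-Manohar form):

<math>\frac{1}{2} = \frac{1}{2}\,\Delta\Sigma(\mu^2) + \Delta G(\mu^2) + L_q(\mu^2) + L_g(\mu^2),</math>

where ΔΣ and ΔG are the first moments of the polarised quark-singlet and gluon PDFs that the global fit would determine, and L_q and L_g denote the quark and gluon orbital angular momenta.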
  
=== Gravitational Waves: Machine Learning techniques for GW Interferometers ===

The control of suspended optical cavities in the non-linear regime.

Gravitational Wave interferometers are extremely sensitive, but suffer from a very small control range, causing unlocks and reducing the robustness of these instruments. In this project we will use a table-top replica of a suspended optical cavity, located in the new R&D laser lab at Nikhef, for the development of a neural network that reconstructs the positions of a free-falling mirror from beam images. A database with simulated beam images can be used to train various neural networks before deployment in the table-top experiment. We are looking for a hands-on and enthusiastic master student, interested in machine learning and experienced in programming languages like Python.

''Contact: [mailto:r.walet@nikhef.nl Rob Walet] and [mailto:f.l.linde@gmail.com Frank Linde]''

==='''Theoretical Particle Physics''': Charged lepton flavor violation in neutrino mass models===

The nonzero value of neutrino masses requires an explanation beyond the Standard Model of particle physics. A promising solution involves the existence of extra neutrinos, often called right-handed or sterile neutrinos. These models elegantly explain neutrino masses and can also be connected to other puzzles such as the absence of anti-matter in our universe. In this project you will investigate potential experimental signatures of sterile neutrinos through decays that are extremely rare in the Standard Model. Examples are muon decays to electrons and photons, or muon + neutron -> electron + neutron. You will perform Quantum Field Theory calculations within the neutrino-extended Standard Model to compute the rates of these processes and compare them to experimental sensitivities.

''Contacts: [mailto:j.devries4@uva.nl Jordy de Vries]''
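To see why such decays are so extremely rare in the Standard Model with only light neutrinos (quoted here for orientation; the sterile-neutrino rates to be computed in this project can be very different), the classic result for radiative muon decay is

<math>\mathrm{BR}(\mu \to e\gamma) \simeq \frac{3\alpha}{32\pi}\left|\sum_i U^{*}_{\mu i} U_{e i}\,\frac{m_{\nu_i}^2}{M_W^2}\right|^2 \sim 10^{-54},</math>

roughly forty orders of magnitude below the current experimental limits of order 10^-13.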
  
=== VU LaserLaB: Measuring the electric dipole moment (EDM) of the electron ===

In collaboration with Nikhef and the Van Swinderen Institute for Particle Physics and Gravity at the University of Groningen, we have recently started an exciting project to measure the electric dipole moment (EDM) of the electron in cold beams of barium-fluoride molecules. The eEDM, which is predicted by the Standard Model of particle physics to be extremely small, is a powerful probe to explore physics beyond this Standard Model. All extensions to the Standard Model, most prominently supersymmetry, naturally predict an electron EDM that is just below the current experimental limits. We aim to improve on the best current measurement by at least an order of magnitude. To do so we will perform a precision measurement on a slow beam of laser-cooled BaF molecules. With this low-energy precision experiment, we test physics at energies comparable to those of the LHC!

At LaserLaB VU, we are responsible for building and testing a cryogenic source of BaF molecules. The main parts of this source are currently being constructed in the workshop. We are looking for enthusiastic master students to help set up the laser system that will be used to detect BaF. Furthermore, projects are available to perform trajectory simulations to design a lens system that guides the BaF molecules from the exit of the cryogenic source to the experiment.

''Contact: [mailto:H.L.Bethlem@vu.nl Rick Bethlem]''

==='''Theoretical Particle Physics''': The electric dipole moment of paramagnetic systems in the Standard Model===

Electric dipole moments (EDMs) of elementary particles, hadrons, nuclei, atoms, and molecules would indicate the violation of CP symmetry. The Standard Model (SM) contains CP violation in the weak interaction in the so-called CKM matrix (the quark-mixing matrix), but it leads to EDMs that are too small to be seen; at least, this is often claimed. In this work we will reinvestigate the computation of the EDMs of systems that are used in state-of-the-art experiments. In particular, we will compute a CP-violating interaction between electrons and nucleons mediated by the SM weak interaction. During this project you will obtain a deep understanding of the Standard Model and perform explicit quantum field theory calculations across a wide range of energy scales.

''Contacts: [mailto:j.devries4@uva.nl Jordy de Vries]''
+
=== VU LaserLaB: Physics beyond the Standard Model from molecules ===

Our team, with a number of staff members (Ubachs, Eikema, Salumbides, Bethlem, Koelemeij), focuses on precision measurements in the hydrogen molecule and its isotopologues. The work aims at testing the QED calculations of energy levels in H2, D2, T2, HD, etc. with the most precise measurements, where all kinds of experimental laser techniques play a role (cw and pulsed lasers, atomic clocks, frequency combs, molecular beams). Also a target of studies is the connection to the "proton size puzzle", which may be solved through studies in the hydrogen molecular isotopes.

In the past half year we have produced a number of important results, described in the following papers:
* ''Frequency comb (Ramsey-type) electronic excitations in the H2 molecule'': Deep-ultraviolet frequency metrology of H2 for tests of molecular quantum theory, http://www.nat.vu.nl/~wimu/Publications/Altmann-PRL-2018.pdf
* ''Precision measurement of an infrared transition in the HD molecule'': Sub-Doppler frequency metrology in HD for tests of fundamental physics, https://arxiv.org/abs/1712.08438
* ''The first precision study in molecular tritium T2'': Relativistic and QED effects in the fundamental vibration of T2, http://arxiv.org/abs/1803.03161
* ''Dissociation energy of the hydrogen molecule at 10^-9 accuracy'': paper submitted to Phys. Rev. Lett.
* ''Probing QED and fundamental constants through laser spectroscopy of vibrational transitions in HD+'': this is also a study of the hydrogen molecular ion HD+, where important results were obtained not so long ago and where we have a strong activity, http://www.nat.vu.nl/~wimu/Publications/ncomms10385.pdf

These five results mark the various directions we are pursuing, and in all directions we aim at obtaining improvements. Specific projects with students can be defined; those are mostly experimental, although there might be some theoretical tasks, like performing calculations of hyperfine structures.

''Contact: [mailto:w.m.g.ubachs@vu.nl Wim Ubachs] [mailto:k.s.e.eikema@vu.nl Kjeld Eikema] [mailto:h.l.bethlem@vu.nl Rick Bethlem]''

==Finished master projects==

See:
*https://wiki.nikhef.nl/education/Master_Theses
*https://www.nikhef.nl/master-theses-2021/
*https://www.nikhef.nl/facts-figures-2020/master-theses-2020/
 
  
  
  
 
[[Last years MSc Projects|Last year's MSc Projects]]
 

Latest revision as of 14:40, 22 September 2024


ALICE/LHCb Tracking: Innovative tracking techniques exploiting modern heterogeneous architectures

The reconstruction of charged-particle tracks is one of the most computationally demanding components of modern high-energy physics experiments. In particular, the upcoming High-Luminosity Large Hadron Collider (HL-LHC) makes fast tracking algorithms that exploit modern computing architectures with many cores and accelerators essential. In this project we will investigate innovative, machine-learning-based, experiment-agnostic tracking algorithms on modern architectures, e.g. GPUs and FPGAs.
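As a toy illustration of why such algorithms map well onto GPUs and other parallel architectures (a generic sketch, not an ALICE or LHCb algorithm), the snippet below builds track-seed doublets between two detector layers with fully vectorised operations; all hit data are random placeholders.

<syntaxhighlight lang="python">
# Toy, vectorised doublet seeding between two tracking layers (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
phi1 = rng.uniform(-np.pi, np.pi, 1000)   # azimuthal angles of hits on layer 1 (placeholder)
phi2 = rng.uniform(-np.pi, np.pi, 1200)   # azimuthal angles of hits on layer 2 (placeholder)

# Pair every hit on layer 1 with every hit on layer 2 and keep the pairs whose
# azimuthal difference is compatible with a stiff (high-momentum) track.
dphi = np.abs(phi2[None, :] - phi1[:, None])
dphi = np.minimum(dphi, 2.0 * np.pi - dphi)      # wrap around +-pi
i1, i2 = np.nonzero(dphi < 0.02)                 # loose cut, placeholder value
print(f"{i1.size} candidate doublets out of {phi1.size * phi2.size} combinations")
</syntaxhighlight>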

Contact: Jacco de Vries and Panos Christakoglou

ATLAS: Search for very rare Higgs decays to second-generation fermions

While the Higgs boson coupling to fermions of the third generation has been established experimentally, the investigation of the Higgs boson coupling to the light fermions of the second generation will be a central project for the current data-taking period of the LHC (2022-2025). The Higgs boson decay to muons is the most sensitive channel for probing this coupling. In this project, event selection algorithms for Higgs boson decays to muons in associated production with a gauge boson (VH) are developed with the aim of distinguishing signal events from background processes like Drell-Yan and WZ boson production. For this purpose, the candidate will implement and validate deep learning algorithms, and extract the final results based on a fit to the output of the deep learning classifier.
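A minimal sketch of the kind of deep-learning classifier meant here is shown below. It is not the ATLAS analysis code: the architecture and the eight input features are arbitrary choices, and the training data are random placeholders standing in for simulated VH, H->mumu signal and Drell-Yan/WZ background events.

<syntaxhighlight lang="python">
# Minimal signal-vs-background classifier sketch (PyTorch); all inputs are placeholders.
import torch
import torch.nn as nn

n_features = 8  # e.g. dimuon kinematics, jet multiplicity, ... (placeholder choice)
model = nn.Sequential(
    nn.Linear(n_features, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 1),   # one logit: signal vs background
)
loss_fn = nn.BCEWithLogitsLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# Placeholder training set: replace with simulated signal/background events.
x = torch.randn(1024, n_features)
y = torch.randint(0, 2, (1024, 1)).float()

for epoch in range(10):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()

# After training, the (sigmoid of the) network output would be the classifier
# score entering the final fit that extracts the signal strength.
</syntaxhighlight>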

Contact: Oliver Rieger and Wouter Verkerke

ATLAS: Advanced deep-learning techniques for lepton identification

The ATLAS experiment at the Large Hadron Collider facilitates a broad spectrum of physics analyses. A critical aspect of these analyses is the efficient and accurate identification of leptons, which are crucial for both signal detection and background event rejection. The ability to distinguish between prompt leptons, arising directly from the collision, and non-prompt leptons, originating from heavy-flavour hadron decays, is a challenging task. This project aims to develop and implement advanced techniques based on deep learning models to push lepton identification beyond the capabilities of the current standard methods.

Contact: Oliver Rieger and Wouter Verkerke

ATLAS: Probing CP-violation in the Higgs sector with the ATLAS experiment

The Standard Model Effective Field Theory (SMEFT) provides a systematic approach to test the impact of new physics at the energy scale of the LHC through higher-dimensional operators. The scarcity of antimatter in the cosmos arises from the slight differences in the behavior of particles and their antiparticle counterparts, known as CP violation. The current data-taking period of the LHC is expected to yield a comprehensive dataset, enabling the investigation of CP-odd SMEFT operators in the Higgs boson's interactions with other particles. The interpretation of experimental data using SMEFT requires a particular interest in solving complex technical challenges, advanced statistical techniques, and a deep understanding of particle physics.

Contact: Lydia Brenner, Oliver Rieger and Wouter Verkerke

ATLAS: Signal and background sensitivity in Standard Model Effective Field Theory (SMEFT)

Complex statistical combinations of large sectors of the ATLAS scientific program are currently being used to obtain the best experimental sensitivity to SMEFT parameters. However, to achieve a fully consistent investigation of SMEFT, and to push the limit of what is possible with the data already collected, it is necessary to include background modification effects. Joining our efforts on this topic means contributing to a cutting-edge investigation that requires both a particular motivation for solving complex technical challenges and an interest in obtaining a broad knowledge of experimental particle physics.

Contact: Andrea Visibile and Lydia Brenner

ATLAS: Performing a Bell test in Higgs to di-boson decays

Recently, theorists [1] have proposed to perform a Bell test in Higgs to di-boson decays. This is a fundamental test not only of quantum mechanics but also of quantum field theory, using the elusive scalar Higgs particle. At Nikhef we have started to brainstorm on the experimental aspects of this challenging measurement. Thanks to the studies of a PhD student [2], we have considerable experience in the reconstruction of the Higgs rest-frame angles that are essential to perform a Bell test. Is there a master student who wants to join our efforts to study the "spooky action at a distance" in Higgs to WW decays?

Contact: Peter Kluit

  [1] Review article https://arxiv.org/pdf/2402.07972.pdf

  [2] https://www.nikhef.nl/pub/services/biblio/theses_pdf/thesis_R_Aben.pdf
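For reference, the quantity usually tested in such proposals is the CHSH combination of spin-correlation expectation values, which any local hidden-variable theory must satisfy:

<math>\left|\langle A_1 B_1\rangle + \langle A_1 B_2\rangle + \langle A_2 B_1\rangle - \langle A_2 B_2\rangle\right| \le 2,</math>

while quantum mechanics allows values up to 2√2. In H->WW the roles of A_i and B_j are played by suitably chosen spin observables of the two W bosons, reconstructed from the lepton angles in the Higgs rest frame, which is why the rest-frame reconstruction mentioned above is the experimental bottleneck.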

ATLAS: A new timing detector - the HGTD

The ATLAS detector is going to gain a new ability: a timing detector. This will allow us to reconstruct tracks not only in the three dimensions of space but also to measure very precisely (at the picosecond level) the time at which particles pass the sensitive layers of the HGTD detector. The added information helps to reconstruct the trajectories of the particles created at the LHC in 4 dimensions and will ultimately lead to a better reconstruction of physics at ATLAS. The new HGTD detector is still under construction and work needs to be done on different levels, such as understanding the detector response (taking measurements in the lab and performing simulations) or developing algorithms to reconstruct the particle trajectories (programming and analysis work).

Several projects are available within the context of the new HGTD detector:

  1. One can choose to focus on the impact on physics analysis performance by studying how the timing measurements can be included in the reconstruction of tracks, and what effect this has on how well we can understand the physical processes occurring in the LHC collisions. With this work you will be part of the ATLAS group at Nikhef.
  2. The second possibility is to test the sensors in our lab and in test-beam setups at CERN/DESY. The analysis will be performed in the context of the ATLAS HGTD test-beam group, in connection with both the ATLAS group and the R&D department at Nikhef.
  3. The third is to contribute to an ongoing effort to precisely simulate/model the silicon avalanche detectors in the Allpix2 framework. There are several models that try to describe the detector response. The models depend on operating temperature, field strengths and radiation damage. We are getting close to being able to model our detector, but we are not there yet. This work will be within the ATLAS group.

Contact: Hella Snoek

ATLAS: Studying rare modes of Higgs boson production at the LHC

The Higgs boson is a crucial piece of the Standard Model and its most recently discovered particle. Studying Higgs boson production and decay at the LHC might hold the key to unlocking new information about the physical laws governing our universe. With the LHC now in its third run, we can also use the enormous amounts of data being collected to study Higgs boson production modes we have not previously been able to access. For instance, we can look at the production of a Higgs boson via the fusion of two vector bosons, accompanied by emission of a photon, with subsequent H->WW decay. This state is experimentally distinctive and should be accessible to us using the current dataset of the LHC. It is also theoretically interesting because it probes the Higgs boson’s interaction with W bosons. This exact interaction is a cornerstone of electroweak symmetry breaking, the process by which particles gain mass, so studying it provides a window onto a fundamental part of the Standard Model. This project will study the feasibility of measuring this or another rare Higgs production mode using H->WW decays, providing a chance to be involved in the design of an analysis from the ground up.

Contact: Robin Hayes, Flavia de Almeida Dias

ATLAS: Exploring triboson polarisation in loop-induced processes at the LHC

Spin is a fundamental, quantum-mechanical property carried by (most) elementary particles. When high-energy particles scatter, their spin influences how angular momentum is propagated through the process and ultimately how final-state particles are (geometrically) distributed. Helicity is the projection of the spin vector onto the momentum. For example: in the loop-induced process gg > W+W-Z, the angular separation between the various decay products of the W and Z bosons depends on the helicity polarisation of the intermediate W and Z bosons. The aim of this project is to explore helicity polarisation in multiboson processes, and specifically the gg > WWZ process, at the Large Hadron Collider. This project is at the interface between theory and experiment, and you will work with Monte Carlo generators, analysis design and sensitivity studies.

Contact: Flavia de Almeida Dias

ATLAS: High-Performance Simulations for High-Energy Physics Experiments - Multiple Enhancements

The role of simulation and synthetic data generation for High-Energy Physics (HEP) research is profound. While there are physics-accurate simulation frameworks available to provide the most realistic data syntheses, these tools are slow. Additionally, the output from physics-accurate simulations is closely tied to the experiment that the simulation was developed for and its software.

Fast simulation frameworks, on the other hand, can drastically simplify the simulation while still striking a balance between speed and accuracy of the simulated events. The applications of simplified simulations and data are numerous. We will be focusing on the role of such data as an enabler for Machine Learning (ML) model design research.

This project aims to extend the REDVID simulation framework [1, 2] through addition of new features. The features considered for this iteration include:

  • Interaction with common Monte Carlo event generators: To calculate hit points for imported events
  • Addition of basic magnetic field effect: Simulation of a simplified, uniform magnetic field, affecting charged particle trajectories
  • Inclusion of pile-up effects during simulation: Multiple particle collisions occurring in close vicinity
  • Indication of bunch size
  • Spherical coordinates
  • Vectorised helical tracks
  • Considerations for reproducibility of collision events

The project is part of an ongoing effort to train and test ML models for particle track reconstruction for the HL-LHC. The improved version of REDVID can be used by the student and other users to generate training data for ML models. Depending on the progress and the interest, a secondary goal could be to perform comparisons with physics-accurate simulations or to investigate the impact of the new features on developed ML models.
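As an indication of how the magnetic-field and "vectorised helical tracks" features listed above could look in practice (a sketch under assumed conventions, not the REDVID implementation), a charged particle in a uniform solenoidal field follows a helix that can be evaluated for many path points at once:

<syntaxhighlight lang="python">
# Toy helix parametrisation in a uniform B_z field, vectorised with numpy (illustrative only).
import numpy as np

def helix(pt_gev, pz_gev, q, bz_tesla, s):
    """Return (x, y, z) in metres along path lengths s (array, in metres),
    starting at the origin with the transverse momentum along +x."""
    r = pt_gev / (0.3 * abs(q) * bz_tesla)   # bending radius in metres (pT in GeV, B in T)
    p = np.hypot(pt_gev, pz_gev)
    omega = np.sign(q) / r                   # signed curvature
    s_t = s * pt_gev / p                     # transverse component of the path length
    x = np.sin(omega * s_t) / omega
    y = (1.0 - np.cos(omega * s_t)) / omega
    z = s * pz_gev / p
    return x, y, z

s = np.linspace(0.0, 1.0, 100)               # 1 m of path, placeholder
x, y, z = helix(pt_gev=1.0, pz_gev=0.5, q=+1, bz_tesla=2.0, s=s)
</syntaxhighlight>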

Bonus: The student will be encouraged and supported to publish the output of this study in a relevant journal, such as "Data in Brief" by Elsevier.

Appendix - Terminology

The terminology for the considered simulations and their features is domain-specific and is explained below:

  • Synthetic data: Data generated during a simulation, which resembles real data to a limited extent.
  • Physics-accurate simulation: A type of simulation that strongly takes into account real-world physical interactions and utilises physics formulas to achieve this.
  • Complexity-aware simulation framework: A simulator which can be configured with different levels of simulation complexity, making the simulation closer or further away from the real-world case.
  • Complexity-reduced data set: Simplified data resulting from simplified simulations. This is in comparison to real data, or data generated by physics-accurate simulations.

References

[1] U. Odyurt et al., 2023, "Reduced Simulations for High-Energy Physics, a Middle Ground for Data-Driven Physics Research". URL: https://doi.org/10.48550/arXiv.2309.03780

[2] U. Odyurt, 2023, "REDVID Simulation Framework". URL: https://virtualdetector.com/redvid

Contact: dr. ir. Uraz Odyurt, dr. Roel Aaij

ATLAS: High-Performance Simulations for High-Energy Physics Experiments - Electron and Muon Simulation (2 projects)

The role of simulation and synthetic data generation for High-Energy Physics (HEP) research is profound. While there are physics-accurate simulation frameworks available to provide the most realistic data syntheses, these tools are slow. Additionally, the output from physics-accurate simulations is closely tied to the experiment for which the simulation was developed (e.g., a fixed detector geometry) and to its software.

Fast simulation frameworks, on the other hand, can drastically simplify the simulation while still striking a balance between speed and accuracy of the simulated events. The applications of simplified simulations and data are numerous. We will be focusing on the role of such data as an enabler for Machine Learning (ML) model design research.

This project aims to extend the REDVID simulation framework [1, 2] through addition of new features.

Electron simulation

The main feature considered for this iteration is support for different particles, especially electrons.

It is paramount to have enough differentiation between different particle types within a simulation. To be able to simulate the behaviour of an electron, certain characteristics have to be implemented, which are as follows:

  • Electrons interact with matter and could emit bremsstrahlung radiation, in turn, leading to generation of secondary particles in the form of showers.
  • The concept of jets and showers can be designed in the same way within REDVID. This will be an acceptable simplification and boosts code reuse.
  • Electrons also lose energy through bremsstrahlung radiation as they go through the matter. This loss of energy can alter the electron's trajectory, causing it to slow down or change direction.

Dedicated virtual detector segments will be needed to act as matter, or the detector sublayers should be given a thickness, or both. The student will test the impact of the added information on developed ML models, which may involve training/retraining of these models.

Muon simulation

The main feature considered for this iteration is support for different particles, especially muons.

It is paramount to have enough differentiation between different particle types within a simulation. To be able to simulate the behaviour of a muon, certain characteristics have to be implemented, which are as follows:

  • Muons are heavy particles and as a result, possess higher penetration power on matter.
  • Muons are unstable particles and decay into other particles, but not necessarily within the range of the detector.
  • Muons interact with matter, which could result in a change of the original direction.
  • Muons are charged particles and are affected by magnetic fields, resulting in bent trajectories. The curvature of muon trajectories in magnetic fields reveals information about their momentum.
  • Distinguishing muons from other particles, i.e., background signals, is crucial.

The student shall study, select and implement in REDVID a minimal set of distinguishing characteristics. A validation step, showcasing the differences in particle behaviour, may be required. There may be a need for dedicated virtual detector layers to be defined. The student will test the impact of the added information on developed ML models, which may involve training/retraining of these models.

The project is part of an ongoing effort to train and test ML models for particle track reconstruction for the HL-LHC. The improved version of REDVID can be used by the student and other users to generate training data for ML models. Depending on the progress and the interest, a secondary goal could be to perform comparisons with physics-accurate simulations or to investigate the impact of the new features on developed ML models.

Bonus: The student will be encouraged and supported to publish the output of this study in a relevant journal, such as "Data in Brief" by Elsevier.

Appendix - Terminology

The terminology for the considered simulations and their features is domain-specific and is explained below:

  • Synthetic data: Data generated during a simulation, which resembles real data to a limited extent.
  • Physics-accurate simulation: A type of simulation that strongly takes into account real-world physical interactions and utilises physics formulas to achieve this.
  • Complexity-aware simulation framework: A simulator which can be configured with different levels of simulation complexity, making the simulation closer or further away from the real-world case.
  • Complexity-reduced data set: Simplified data resulting from simplified simulations. This is in comparison to real data, or data generated by physics-accurate simulations.

References

[1] U. Odyurt et al., 2023, "Reduced Simulations for High-Energy Physics, a Middle Ground for Data-Driven Physics Research". URL: https://doi.org/10.48550/arXiv.2309.03780

[2] U. Odyurt, 2023, "REDVID Simulation Framework". URL: https://virtualdetector.com/redvid

[3] T. Sjöstrand et al., 2006, "PYTHIA 6.4 physics and manual". URL: https://doi.org/10.1088/1126-6708/2006/05/026

Contact: dr. ir. Uraz Odyurt, dr. Flavia de Almeida Dias


Dark Matter: Building better Dark Matter Detectors - the XAMS R&D Setup

The Amsterdam Dark Matter group operates an R&D xenon detector at Nikhef. The detector is a dual-phase xenon time-projection chamber and contains about 0.5 kg of ultra-pure liquid xenon in the central volume. We use this detector for the development of new detection techniques - such as utilizing our newly installed silicon photomultipliers - and to improve the understanding of the response of liquid xenon to various forms of radiation. The results could be directly used in the XENONnT experiment, the world’s most sensitive direct detection dark matter experiment at the Gran Sasso underground laboratory, or for future Dark Matter experiments like DARWIN. We have several interesting projects for this facility. We are looking for someone who is interested in working in a laboratory on high-tech equipment, modifying the detector, taking data and analyzing the data themselves. You will "own" this experiment.

Contact: Patrick Decowski and Auke Colijn

Dark Matter: Searching for Dark Matter Particles - XENONnT Data Analysis

The XENON collaboration has used the XENON1T detector to achieve the world’s most sensitive direct detection dark matter results and is currently operating the XENONnT successor experiment. The detectors operate at the Gran Sasso underground laboratory and consist of so-called dual-phase xenon time-projection chambers filled with ultra-pure xenon. Our group has an opening for a motivated MSc student to do analysis with the new data coming from the XENONnT detector. The work will consist of understanding the detector signals and applying a deep neural network to improve the (gas-)background discrimination in our Python-based analysis tool, thereby improving the sensitivity for low-mass dark matter particles. The work will continue a study started by a recent graduate. There will also be opportunities to do data-taking shifts at the Gran Sasso underground laboratory in Italy.

Contact: Patrick Decowski and Auke Colijn

Dark Matter: Signal reconstruction and correction in XENONnT

XENONnT is a low background experiment operating at the INFN - Gran Sasso underground laboratory with the main goal of detecting Dark Matter interactions with xenon target nuclei. The detector, consisting of a dual-phase time projection chamber, is filled with ultra-pure xenon, which acts as a target and detection medium. Understanding the detector's response to various calibration sources is a mandatory step in exploiting the scientific data acquired. This MSc thesis aims to develop new methods to improve the reconstruction and correction of scintillation/ionization signals from calibration data. The student will work with modern analysis techniques (Python-based) and will collaborate with other analysts within the international XENON Collaboration.

Contact: Maxime Pierre, Patrick Decowski

Dark Matter: The Ultimate Dark Matter Experiment - DARWIN Sensitivity Studies

DARWIN is the “ultimate” direct detection dark matter experiment, with the goal to reach the so-called “neutrino floor”, when neutrinos become a hard-to-reduce background. The large and exquisitely clean xenon mass will allow DARWIN to also be sensitive to other physics signals such as solar neutrinos, double-beta decay from Xe-136, axions and axion-like particles etc. While the experiment will only start in 2027, we are in the midst of optimizing the experiment, which is driven by simulations. We have an opening for a student to work on the GEANT4 Monte Carlo simulations for DARWIN. We are also working on a “fast simulation” that could be included in this framework. It is your opportunity to steer the optimization of a large and unique experiment. This project requires good programming skills (Python and C++) and data analysis/physics interpretation skills.

Contact: Tina Pollmann, Patrick Decowski or Auke Colijn

Dark Matter: Exploring new background sources for DARWIN

Experiments based on the xenon dual-phase time projection chamber detection technology have already demonstrated their leading role in the search for Dark Matter. The unprecedented low level of background reached by the current generation, such as XENONnT, allows such experiments to be sensitive to new rare-events physics searches, broadening their physics program. The next generation of experiments is already under consideration with the DARWIN observatory, which aims to surpass its predecessors in terms of background level and mass of xenon target. With the increased sensitivity to new physics channels, such as the study of neutrino properties, new sources of backgrounds may arise. This MSc thesis aims to investigate potential new sources of background for DARWIN and is a good opportunity for the student to contribute to the design of the experiment. This project will rely on Monte Carlo simulation tools such as GEANT4 and FLUKA, and good programming skills (Python and C++) are advantageous.

Contact: Maxime Pierre, Patrick Decowski

Dark Matter: Sensitive tests of wavelength-shifting properties of materials for dark matter detectors

Rare event search experiments that look for neutrino and dark matter interactions are performed with highly sensitive detector systems, often relying on scintillators, especially liquid noble gases, to detect particle interactions. Detectors consist of structural materials that are assumed to be optically passive, and light detection systems that use reflectors, light detectors, and sometimes, wavelength-shifting materials. MSc theses are available related to measuring the efficiency of light detection systems that might be used in future detectors. Furthermore, measurements to ensure that presumably passive materials do not fluoresce, at the low level relevant to the detectors, can be done. Part of the thesis work can include Monte Carlo simulations and data analysis for current and upcoming dark matter detectors, to study the effect of different levels of desired and nuisance wavelength shifting. In this project, students will acquire skills in photon detection, wavelength shifting technologies, vacuum systems, UV and extreme-UV optics, detector design, and optionally in Python and C++ programming, data analysis, and Monte Carlo techniques.

Contact: Tina Pollmann

Detector R&D: Energy Calibration of hybrid pixel detector with the Timepix4 chip

The Large Hadron Collider at CERN will increase its luminosity in the coming years. For the LHCb experiment the number of collisions per bunch crossing increases from 7 to more than 40. To distinguish all tracks from the quasi-simultaneous collisions, time information will have to be used in addition to spatial information. A big step on the way to fast silicon detectors is the recently developed Timepix4 ASIC. Timepix4 consists of 448x512 pixels, but the pixels are not identical and there are pixel-to-pixel fluctuations in the time and charge measurements. The ultimate time resolution can only be achieved after calibration of both the time and energy measurements. The goal of this project is to study the energy calibration of Timepix4. Typical research questions are: how does the resolution depend on the threshold and the Krummenacher (discharge) current, and does a different sensor affect the energy resolution? In this research you will do measurements with calibration pulses, lasers and radioactive sources to obtain data to calibrate the detector. The work consists of hands-on work in the lab to build/adapt the test set-up, and analysis of the data obtained.
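A minimal sketch of one ingredient of such a calibration, a per-pixel threshold (S-curve) scan fitted with an error function, is shown below; the scan data are synthetic placeholders and the procedure is generic rather than Timepix4-specific.

<syntaxhighlight lang="python">
# Minimal per-pixel S-curve (threshold scan) fit; data below are synthetic placeholders.
import numpy as np
from scipy.optimize import curve_fit
from scipy.special import erf

def s_curve(amplitude, mu, sigma):
    """Hit probability versus injected test-pulse amplitude."""
    return 0.5 * (1.0 + erf((amplitude - mu) / (np.sqrt(2.0) * sigma)))

amp = np.linspace(500.0, 1500.0, 41)    # injected charge (electrons), placeholder values
rng = np.random.default_rng(1)
prob = np.clip(s_curve(amp, 1000.0, 40.0) + rng.normal(0.0, 0.03, amp.size), 0.0, 1.0)

popt, pcov = curve_fit(s_curve, amp, prob, p0=[1000.0, 50.0])
mu_fit, sigma_fit = popt
print(f"threshold = {mu_fit:.0f} e-, noise (ENC) = {sigma_fit:.0f} e-")
</syntaxhighlight>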

Contact: Daan Oppenhuis, Hella Snoek

Detector R&D: Studies of wafer-scale sensors for ALICE detector upgrade and beyond

One of the biggest milestones of the ALICE detector upgrade (foreseen in 2026) is the implementation of wafer-scale (~ 28 cm x 18 cm) monolithic silicon active pixel sensors in the tracking detector, with the goal of having truly cylindrical barrels around the beam pipe. To demonstrate such an unprecedented technology in high-energy physics detectors, a few chips will soon be available in the Nikhef laboratories for testing and characterization purposes. The goal of the project is to contribute to the validation of the samples against the ALICE tracking detector requirements, with a focus on timing performance in view of other applications in future high-energy physics experiments beyond ALICE. We are looking for a student with a focus on lab work and an interest in high-precision measurements with cutting-edge instrumentation. You will be part of the Nikhef Detector R&D group and you will have, at the same time, the chance to work in an international collaboration where you will report on the performance of these novel sensors. There may even be the opportunity to join beam tests at CERN or DESY facilities. Besides interest in hardware, some proficiency in computing is required (Python or C++/ROOT).

Contact: Jory Sonneveld

Detector R&D: Time resolution of monolithic silicon detectors

Monolithic silicon detectors based on industrial Complementary Metal Oxide Semiconductor (CMOS) processes offer a promising approach for large-scale detectors due to their ease of production and low material budget. Until recently, their low radiation tolerance has hindered their applicability in high-energy particle physics experiments. However, new prototypes, such as the one in this project, have started to overcome these hurdles, making them feasible candidates for future experiments in high-energy particle physics. In this project, you will investigate the temporal performance of a radiation-hard monolithic detector prototype, produced at the end of 2023, using laser setups in the laboratory. You will also participate in meetings with the international collaboration working on this detector to present reports on the prototype's performance. Different aspects of the system, such as charge calibration and power consumption, are to be investigated for their impact on the temporal resolution. Depending on the progress of the work, a first full three-dimensional characterization of the prototype's performance using a state-of-the-art two-photon absorption laser setup at Nikhef and/or an investigation into irradiated samples, for a closer look at the impact of radiation damage on the prototype, are possible. This project is looking for someone interested in working hands-on with cutting-edge detector and laser systems at the Nikhef laboratory. Python programming skills and Linux experience are an advantage.

Contact: Jory Sonneveld, Uwe Kraemer

Detector R&D: Improving a Laser Setup for Testing Fast Silicon Pixel Detectors

For the upgrades of the innermost detectors of experiments at the Large Hadron Collider in Geneva, in particular to cope with the large number of collisions per second from 2027, the Detector R&D group at Nikhef tests new pixel detector prototypes with a variety of laser equipment at several wavelengths. The lasers can be focused down to a small spot to scan over the pixels on a pixel chip. Since the laser penetrates the silicon, the pixels will not be illuminated by just the focal spot, but by the entire three-dimensional, hourglass- or double-cone-like light intensity distribution. So, how well defined is the volume in which charge is released? Can that be made much smaller than a pixel? And, if so, what would the optimum focus be? For this project the student will first estimate the intensity distribution inside a sensor that can be expected. This will correspond to the density of released charge within the silicon. To verify predictions, you will measure real pixel sensors for the LHC experiments. This project will involve a lot of hands-on work in the lab, as well as programming and work on Unix machines.

Contact: Martin Fransen

Detector R&D: Time resolution of hybrid pixel detectors with the Timepix4 chip

Precise time measurements with silicon pixel detectors are very important for experiments at the High-Luminosity LHC and the future circular collider. The spatial resolution of current silicon trackers will not be sufficient to distinguish the large number of collisions that will occur within individual bunch crossings. In a new method, typically referred to as 4D tracking, spatial measurements of pixel detectors will be combined with time measurements to better distinguish collision vertices that occur close together. New sensor technologies are being explored to reach the required time measurement resolution of tens of picoseconds, and the results are promising. However, the signals that these pixelated sensors produce have to be processed by front-end electronics, which hence play a large role in the total time resolution of the detector. The front-end electronics has many parameters that can be optimised to give the best time resolution for a specific sensor type. In this project you will be working with the Timepix4 chip, which is a so-called application specific integrated circuit (ASIC) that is designed to read out pixelated sensors. This ASIC is used extensively in detector R&D for the characterisation of new sensor technologies requiring precise timing (< 50 ps). To study the time resolution you will be using laser setups in our lab, and there might be an opportunity to join a test with charged particle beams at CERN. These measurements will be complemented with data from the built-in calibration-pulse mechanism of the Timepix4 ASIC. Your work will enable further research performed with this ASIC, and serve as input to the design and operation of future ASICs for experiments at the High-Luminosity LHC.
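For orientation, the contributions that such a characterisation typically disentangles can be summarised as (a standard decomposition, not a Timepix4-specific formula)

<math>\sigma_t^2 \simeq \left(\frac{\sigma_{\rm noise}}{\mathrm{d}V/\mathrm{d}t}\right)^2 + \sigma_{\rm time\,walk}^2 + \sigma_{\rm Landau}^2 + \frac{\Delta_{\rm TDC}^2}{12},</math>

where the first term is the electronic jitter, the time-walk term is what the calibration-pulse data help to correct, the Landau term reflects fluctuations in the charge deposition, and Δ_TDC is the TDC bin width.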

Contact: Kevin Heijhoff and Martin van Beuzekom

Detector R&D: Performance studies of Trench Isolated Low Gain Avalanche Detectors (TI-LGAD)

The future vertex detector of the LHCb Experiment needs to measure the spatial coordinates and time of the particles originating in the LHC proton-proton collisions with resolutions better than 10 um and 50 ps, respectively. Several technologies are being considered to achieve these resolutions. Among those is a novel sensor technology called the Trench Isolated Low Gain Avalanche Detector. Prototype pixelated sensors have been manufactured recently and have to be characterised. Therefore these new sensors will be bump-bonded to a Timepix4 ASIC, which provides charge and time measurements in each of its 230 thousand pixels. Characterisation will be done using a lab setup at Nikhef, and includes tests with a micro-focused laser beam, radioactive sources, and possibly with particle tracks obtained in a test-beam. This project involves data taking with these new devices and analysing the data to determine performance parameters such as the spatial and temporal resolution, as a function of temperature and other operational conditions.

Contacts: Kazu Akiba and Martin van Beuzekom

Detector R&D: A Telescope with Ultrathin Sensors for Beam Tests

To measure the performance of new prototypes for upgrades of the LHC experiments and beyond, a telescope is typically used in a beam line of charged particles, so that the response of the prototype can be compared to particle tracks measured with this telescope. In this project, you will continue work on a very lightweight, compact telescope using ALICE PIxel DEtectors (ALPIDEs). This includes work on the mechanics, the data acquisition software, and a moveable stage. The plan is to test this telescope in the Delft Proton Therapy Center. If time allows, you will add a timing plane and perform a measurement with one of our prototypes. Apart from travel to Delft, there is a possibility to travel to other beam-line facilities.

Contact: Jory Sonneveld

Detector R&D: Laser Interferometer Space Antenna (LISA) - the first gravitational wave detector in space

The space-based gravitational wave antenna LISA is one of the most challenging space missions ever proposed. ESA plans to launch, around 2035, three spacecraft separated by a few million kilometres. This constellation measures tiny variations in the distances between test masses located in each satellite to detect gravitational waves from sources such as supermassive black holes. LISA is based on laser interferometry, and the three satellites form a giant Michelson interferometer. LISA measures a relative phase shift between one local laser and one distant laser by light interference. The phase shift measurement requires sensitive sensors. The Nikhef Detector R&D group fabricated prototype sensors in 2020 together with the photonics industry and the Dutch institute for space research SRON. Nikhef & SRON are responsible for the Quadrant PhotoReceiver (QPR) system: the sensors, the housing including a complex mount to align the sensors with tens-of-nanometres accuracy, various environmental tests at the European Space Research and Technology Centre (ESTEC), and the overall performance of the QPR in the LISA instrument. Currently we are fabricating improved sensors, optimizing the mechanics and preparing environmental tests. As an MSc student, you will work on various aspects of the wavefront sensor development: studying the performance of the epitaxial stacks of Indium-Gallium-Arsenide, setting up test benches to characterize the sensors and the QPR system, and performing the actual tests and data analysis, in combination with performance studies and simulations of the LISA instrument. Possible projects are listed below, but it is better to contact us, as the exact content may change:

  1. Title: Simulating LISA QPD performance for LISA mission sensitivity.
    Topic: Simulation and Data Analysis.
    Description: we must provide accurate information to the LISA collaboration about the expected and actual performance of the LISA QPRs. This project will focus on using data from measurements taken at Nikhef to integrate into the simulation packages used within the LISA collaboration. The student will have the option to collect their own data to verify the simulations. Performance parameters include spatial uniformity and phase response, crosstalk and thermal response across the LISA sensitivity.
    These simulations can then be used to investigate the full LISA performance and the impact of noise sources. This will involve simulating the heterodyne signals expected on the LISA QPD and their impact on sensing techniques such as Differential Wavefront Sensing (DWS) and Tilt-to-Length (TTL) noise. Simulation tools include Finesse (Python), IFOCAD (C++) or FieldProp (MATLAB), depending on the student's capabilities and preference. This work is important for understanding the stability and noise performance of LISA interferometry during real operation in space.
  2. Title: Investigate the Response of the Gap in the LISA QPD.
    Topic: Experimental.
    Description: At Nikhef we are developing the photodiodes that will be used in the upcoming ESA/NASA LISA mission. We currently have our first batch of Quadrant Photodiodes (QPDs) that vary in diameter, thickness and gap width between the quadrants. The goal of this project is to develop a free-space laser test set-up to measure the response of the gap between the quadrants of the LISA Quadrant Photodiode (QPD). It is important to understand the behaviour of the gap between the photodiode quadrants since this can impact the overall performance of the photodiode and thus the sensitivity of LISA.
    The measurements will involve characterising the test laser beam, configuring test equipment, and handling and installing optical components. Furthermore, as well as taking the data, the student will also be responsible for analysing the results using Python; however, other programming languages are acceptable (based on the student's preference).
  3. Title: Investigate the Response of LISA QPDs for Einstein Telescope Pathfinder.
    Topic: Experimental.
    Description: Current gravitational wave (GW) interferometers typically operate using 1064 nm wavelengths. However, future GW detectors will operate at longer wavelengths such as 1550 nm or 2000 nm. As a result of the wavelength change, much of the current technology is unsuitable; thus, developments are underway for the next generation of GW detectors. Europe’s future GW detector, the Einstein Telescope, is currently in its infancy. A smaller-scale prototype, known as ET Pathfinder, is currently being built and serves as a test bench for the full-scale detector.
    At Nikhef’s R&D group, we want to develop quadrant photodiodes (QPDs) that sense the light from the interferometer for the Einstein Telescope (ET) and ET Pathfinder. These QPDs require very low noise performance as well as high sensitivity in order to measure the small interferometer signals. To that end, our first step is to use the current QPDs that have been developed for the ESA/NASA LISA mission.
    This project will focus on performance tests of the LISA QPDs using a 1550 nm laser. The student will be tasked with developing a test setup as well as taking the data and analysing the results. As part of this project, the student will learn about laser characterisation, Gaussian optics and instrumentation techniques. These results will be important for designing the next generation of QPDs and are of interest to the ET consortium, where the student can present their results.


Contact: Niels van Bakel or Timesh Mistry
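As a toy illustration of how a quadrant photodiode senses the position of a beam spot (and, via interference between quadrants, the wavefront tilt), the snippet below places a Gaussian spot on a four-quadrant grid and computes the normalised left-right and top-bottom imbalances; all numbers are placeholders and this is not the LISA simulation chain.

<syntaxhighlight lang="python">
# Toy quadrant-photodiode illustration: Gaussian spot, normalised imbalances (placeholders only).
import numpy as np

n = 512
x = np.linspace(-1e-3, 1e-3, n)                 # detector coordinates in metres
X, Y = np.meshgrid(x, x)
w = 3e-4                                        # beam radius (m), placeholder
x0, y0 = 5e-5, -2e-5                            # spot offset (m), placeholder
intensity = np.exp(-2.0 * ((X - x0) ** 2 + (Y - y0) ** 2) / w**2)

left = intensity[:, X[0] < 0].sum()
right = intensity[:, X[0] > 0].sum()
bottom = intensity[Y[:, 0] < 0, :].sum()
top = intensity[Y[:, 0] > 0, :].sum()
print("horizontal imbalance:", (right - left) / (right + left))
print("vertical imbalance:  ", (top - bottom) / (top + bottom))
</syntaxhighlight>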

Detector R&D: Other projects

Are you looking for a slightly different project? Are the above projects already taken? Are you coming in at an unusual time of the year? Do not hesitate to contact us! We always have new projects coming up at different times in the year and we are open to your ideas.

Contact: Jory Sonneveld

Gravitational Waves: Computer modelling to design the laser interferometers for the Einstein Telescope

A new field of instrument science led to the successful detection of gravitational waves by the LIGO detectors in 2015. We are now preparing the next generation of gravitational wave observatories, such as the Einstein Telescope, with the aim to increase the detector sensitivity by a factor of ten, which would allow, for example, to detect stellar-mass black holes from early in the universe when the first stars began to form. This ambitious goal requires us to find ways to significantly improve the best laser interferometers in the world.

Gravitational wave detectors are complex Michelson-type interferometers enhanced with optical cavities. We develop and use numerical models to study these laser interferometers, to invent new optical techniques and to quantify their performance. For example, we synthesize virtual mirror surfaces to study the effects of higher-order optical modes in the interferometers, and we use opto-mechanical models to test schemes for suppressing quantum fluctuations of the light field. We can offer several projects based on numerical modelling of laser interferometers. All projects will be directly linked to the ongoing design of the Einstein Telescope.

Contact: Andreas Freise

Gravitational-Waves: Get rid of those damn vibrations!

In 2015, large-scale precision interferometry led to the detection of gravitational waves. In 2017 Europe’s Advanced Virgo detector joined this international network and the best studied astrophysical event in history, GW170817, was detected in both gravitational waves and across the electromagnetic spectrum.

The Nikhef gravitational wave group is actively contributing to improvements of the current gravitational-wave detectors and to the rapidly maturing design for Europe’s next-generation gravitational-wave observatory, the Einstein Telescope, with one of two candidate sites located in the Netherlands. These detectors will unveil the gravitational symphony of the dark universe out to cosmological distances. Breaking past the sensitivity achieved by the current observatories will require a radically new approach to core components of these state-of-the-art machines. This is especially true at the lowest, audio-band frequencies that the Einstein Telescope is targeting, where large improvements are needed.

Our project, Omnisens, brings techniques from space-based satellite control back to Earth, building a platform capable of actively cancelling ground vibrations to levels never reached before. This is realised with state-of-the-art compact interferometric sensors and precision mechanics. Substantial cancellation of seismic motion is an essential improvement for the Einstein Telescope, which has to reach below attometre (10^-18 m) displacements.

We are excited to offer two projects in 2024:

  1. You will experimentally demonstrate and optimise Omnisens’ novel vibration isolation for future deployment on the Einstein Telescope. The activity will involve hands-on experience with laser, electronics, mechanical and high-vacuum systems.
  2. You will contribute to the design of the Einstein Telescope by modelling the coupling of seismic and technical noises (such as actuation and sensing noises) through different configurations of seismic actuation chains (a minimal modelling sketch follows this list). Accurate modelling of the origin and transmission of those noises is crucial for designing a system that prevents them from limiting the interferometer’s readout.
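
As a minimal illustration of such noise modelling (under simplifying assumptions, with invented numbers), the sketch below evaluates the ground-to-payload transmissibility of a single damped mass-spring isolation stage, the basic building block of an isolation chain.

  # Ground-to-payload transmissibility of a single damped mass-spring stage.
  # Resonance frequency and damping ratio are illustrative numbers.
  import numpy as np
  f0, zeta = 0.5, 0.01            # resonance [Hz] and damping ratio (illustrative)
  f = np.logspace(-2, 2, 500)     # frequency axis [Hz]
  r = f / f0
  transmissibility = np.sqrt((1 + (2 * zeta * r)**2) /
                             ((1 - r**2)**2 + (2 * zeta * r)**2))
  # Above resonance a single stage suppresses ground motion roughly as 1/f^2;
  # cascading N stages gives ~1/f^(2N), which is why multi-stage chains are used.
  print(f"isolation factor at 10 Hz: {1.0/np.interp(10.0, f, transmissibility):.0f}x")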

Contact: Conor Mow-Lowry

Gravitational Waves: Signal models & tools for data analysis

Theoretical predictions of gravitational-wave (GW) signals provide essential tools to detect and analyse transient GW events in the data of GW instruments like LIGO and Virgo. Over the last few years, there has been significant effort to develop signal models that accurately describe the complex morphology of GWs from merging neutron-star and black-hole binaries. Future analyses of Einstein Telescope (ET) data will need to tackle much longer and louder compact binary signals, which will require significant developments beyond the current status quo of GW modelling (i.e., improvements in model accuracy and computational efficiency, increased parameter-space coverage, etc.).

We can offer up to two projects: in GW signal modeling (at the interface of perturbation theory, numerical relativity simulations and fast phenomenological descriptions), as well as developing applications of signal models in GW data analysis. Although not strictly required, prior knowledge of basic concepts of general relativity and/or GW theory will be helpful. Some proficiency in computing is required (Mathematica, Python or C++).
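
For orientation, the sketch below generates the simplest possible compact-binary signal, a leading-order (Newtonian quadrupole) inspiral chirp; the masses and sampling are illustrative, and real analyses rely on far more accurate waveform families.

  # Leading-order (quadrupole) inspiral chirp for a compact binary (illustrative).
  import numpy as np
  G, c, MSUN = 6.674e-11, 2.998e8, 1.989e30
  m1, m2 = 30.0 * MSUN, 30.0 * MSUN            # illustrative component masses
  Mc = (m1 * m2)**0.6 / (m1 + m2)**0.2         # chirp mass
  tc = 4.0                                     # coalescence time [s]
  t = np.arange(0.0, tc - 1e-3, 1.0 / 4096)    # sample at 4096 Hz
  tau = tc - t                                 # time to coalescence
  # GW frequency evolution at leading post-Newtonian order:
  f = (1.0 / np.pi) * (5.0 / (256.0 * tau))**0.375 * (G * Mc / c**3)**(-0.625)
  phase = 2.0 * np.pi * np.cumsum(f) / 4096
  h = 1e-21 * (f / f[0])**(2.0 / 3.0) * np.cos(phase)   # unnormalised strain shape
  print(f"frequency sweeps from {f[0]:.0f} Hz to {f[-1]:.0f} Hz before merger")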

Contact: Maria Haney

Theoretical Particle Physics: High-energy neutrino physics at the LHC

High-energy collisions at the LHC and its High-Luminosity upgrade (HL-LHC) produce a large number of particles along the beam collision axis, outside of the acceptance of existing experiments. The FASER experiment has in 2023, for the first time, detected neutrinos produced in LHC collisions, and is now starting to elucidate their properties. In this context, the proposed Forward Physics Facility (FPF), to be located several hundred meters from the ATLAS interaction point and shielded by concrete and rock, will host a suite of experiments to probe Standard Model (SM) processes and search for physics beyond the Standard Model (BSM). High-statistics neutrino detection will provide valuable data for fundamental topics in perturbative and non-perturbative QCD and in weak interactions. Experiments at the FPF will enable synergies between forward particle production at the LHC and astroparticle physics to be exploited. The FPF has the promising potential to probe our understanding of the strong interactions as well as of proton and nuclear structure, providing access to both the very low-x and the very high-x regions of the colliding protons. The former regime is sensitive to novel QCD production mechanisms, such as BFKL effects and non-linear dynamics, as well as the gluon parton distribution function (PDF) down to x=1e-7, well beyond the coverage of other experiments and providing key inputs for astroparticle physics. In addition, the FPF acts as a neutrino-induced deep-inelastic scattering (DIS) experiment with TeV-scale neutrino beams. The resulting measurements of neutrino DIS structure functions represent a valuable handle on the partonic structure of nucleons and nuclei, particularly their quark flavour separation, which is fully complementary to the charged-lepton DIS measurements expected at the upcoming Electron-Ion Collider (EIC).

In this project, the student will carry out updated predictions for the neutrino fluxes expected at the FPF, assess the precision with which neutrino cross-sections will be measured, develop novel Monte Carlo event generation tools for high-energy neutrino scattering, and quantify their impact on proton and nuclear structure by means of machine learning tools within the NNPDF framework and state-of-the-art calculations in perturbative Quantum Chromodynamics. This project contributes to ongoing work within the FPF Initiative towards a Conceptual Design Report (CDR) to be presented within two years. Topics that can be considered as part of this project include the assessment of the extent to which nuclear modifications of the free-proton PDFs can be constrained by FPF measurements, the determination of the small-x gluon PDF from suitably defined observables at the FPF and the implications for ultra-high-energy particle astrophysics, the study of the intrinsic charm content in the proton and its consequences for the FPF physics program, and the validation of models for neutrino-nucleon cross-sections in the region beyond the validity of perturbative QCD.
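
As a concrete, low-level starting point, the sketch below evaluates the gluon PDF at very small x with LHAPDF; it assumes that the LHAPDF Python bindings and the public NNPDF4.0 NNLO grid are installed on your system.

  # Sketch: gluon PDF at small x (assumes the LHAPDF Python bindings and the
  # NNPDF40_nnlo_as_01180 grid are installed locally).
  import numpy as np
  import lhapdf
  pdf = lhapdf.mkPDF("NNPDF40_nnlo_as_01180")   # central member
  Q = 2.0                                       # scale in GeV
  for x in np.logspace(-7, -2, 6):
      xg = pdf.xfxQ(21, x, Q)                   # returns x*g(x, Q); PDG id 21 = gluon
      print(f"x = {x:.1e}   x*g(x, Q={Q} GeV) = {xg:.3f}")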

References: https://arxiv.org/abs/2203.05090, https://arxiv.org/abs/2109.10905, https://arxiv.org/abs/2208.08372, https://arxiv.org/abs/2201.12363, https://arxiv.org/abs/2109.02653, https://github.com/NNPDF/; see also this project description.

Contacts: Juan Rojo

Theoretical Particle Physics: Unravelling proton structure with machine learning

At energy-frontier facilities such as the Large Hadron Collider (LHC), scientists study the laws of nature in their quest for novel phenomena both within and beyond the Standard Model of particle physics. An in-depth understanding of the quark and gluon substructure of protons and heavy nuclei is crucial to address pressing questions from the nature of the Higgs boson to the origin of cosmic neutrinos. The key to address some of these questions is carrying out a global analysis of nucleon structure by combining an extensive experimental dataset and cutting-edge theory calculations. Within the NNPDF approach, this is achieved by means of a machine learning framework where neural networks parametrise the underlying physical laws while minimising ad-hoc model assumptions. In addition to the LHC, the upcoming Electron Ion Collider (EIC), to start taking data in 2029, will be the world's first ever polarised lepton-hadron collider and will offer a plethora of opportunities to address key open questions in our understanding of the strong nuclear force, such as the origin of the mass and the intrinsic angular momentum (spin) of hadrons and whether there exists a state of matter which is entirely dominated by gluons.

In this project, the student will develop novel machine learning and AI approaches aimed at improving global analyses of proton structure and at delivering better predictions for the LHC, the EIC, and astroparticle physics experiments. These new approaches will be implemented within the machine learning tools provided by the NNPDF open-source fitting framework and will use state-of-the-art calculations in perturbative Quantum Chromodynamics. Techniques that will be considered include normalising flows, graph neural networks, Gaussian processes, and kernel methods for unsupervised learning. Particular emphasis will be devoted to the automated determination of model hyperparameters, as well as to the estimation of the associated model uncertainties and their systematic validation with a battery of statistical tests. The outcome of the project will benefit the ongoing program of high-precision theory predictions for ongoing and future experiments in particle physics.
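
As a cartoon of the NNPDF parametrisation strategy (not the actual NNPDF code), the sketch below writes a single distribution as a preprocessing factor times a small feed-forward neural network with (x, ln x) inputs; in the real framework the network parameters are fitted to data with full uncertainty propagation.

  # Cartoon of an NNPDF-style parametrisation: preprocessing factor times a
  # small neural network.  Weights are random here; in a fit they are trained.
  import numpy as np
  rng = np.random.default_rng(0)
  W1, b1 = rng.normal(size=(2, 20)), np.zeros(20)
  W2, b2 = rng.normal(size=(20, 1)), np.zeros(1)
  def nn(x):
      inp = np.stack([x, np.log(x)], axis=-1)    # NNPDF-like input features (x, ln x)
      hidden = np.tanh(inp @ W1 + b1)
      return (hidden @ W2 + b2)[..., 0]
  def xf(x, A=1.0, alpha=1.1, beta=3.0):
      return A * x**(1.0 - alpha) * (1.0 - x)**beta * nn(x)
  x = np.logspace(-5, -0.001, 5)
  print(xf(x))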

References: https://arxiv.org/abs/2201.12363, https://arxiv.org/abs/2109.02653, https://arxiv.org/abs/2103.05419, https://arxiv.org/abs/1404.4293, https://inspirehep.net/literature/1302398, https://github.com/NNPDF/; see also this project description.

Contacts: Juan Rojo

Theoretical Particle Physics: Sterile neutrino dark matter

The existence of right-handed (sterile) neutrinos is well motivated: all other Standard Model particles come in both chiralities, and, moreover, sterile neutrinos naturally explain the small masses of the left-handed (active) neutrinos. If the lightest sterile neutrino is very long lived, it could be dark matter. Although sterile neutrinos can be produced by neutrino oscillations in the early universe, this mechanism is not efficient enough to explain all dark matter. It has been proposed that additional self-interactions between sterile neutrinos can solve this (https://arxiv.org/abs/2307.15565, see also the more recent https://arxiv.org/abs/2402.13878). In this project you would examine whether the additional field mediating the self-interactions can also explain the neutrino masses. As a first step you would reproduce the results in the literature, and then extend them to map out the range of masses possible for this extra field.

Contacts: Marieke Postma

Theoretical Particle Physics: Baryogenesis at the electroweak scale

Given that the Standard Model treats particles and antiparticles nearly the same, it is a puzzle why there is no antimatter left in our universe. Electroweak baryogenesis is the proposal that the matter-antimatter asymmetry is created during the phase transition in which the Higgs field obtains a vev and the electroweak symmetry is broken. One important ingredient in this scenario is the presence of new charge-parity (CP) violating interactions. However, this is strongly constrained by the very precise measurements of the electric dipole moment of the electron. An old proposal, recently revived, is to use a CP-violating coupling of the Higgs field to the gauge field (https://arxiv.org/abs/2307.01270, https://inspirehep.net/literature/300823). The project would be to study the efficacy of this kind of operator for baryogenesis.

Contacts: Marieke Postma

Theoretical Particle Physics: Neutrinoless double beta decay with sterile neutrinos

The search for neutrinoless double beta decay represents a prominent probe of new particle physics and is very well motivated by its tight connection to neutrino masses, which, so far, lack an experimentally verified explanation. As such, it also provides a convenient probe of new interactions of the known elementary particles with hypothesized right-handed neutrinos, which are thought to play a prime role in neutrino mass generation. The main focus of this project would be the extension of NuDoBe, a Python tool for the computation of neutrinoless double beta decay (0vbb) rates in terms of lepton-number-violating operators in the Standard Model Effective Field Theory (SMEFT), see https://arxiv.org/abs/2304.05415. In a first step, the code should be expanded to include also the effective operators involving right-handed neutrinos, based on the existing literature (https://arxiv.org/abs/2002.07182) covering the general neutrinoless double beta decay rate in the SMEFT extended by right-handed neutrinos. Besides that, additional functionalities could be added to the code, such as a routine for extracting the explicit form of the neutrino mass and mixing matrices. This work would be very useful for future phenomenological studies and particularly timely given the ongoing experimental efforts, which are to be further boosted by the upcoming tonne-scale upgrades of the double-beta-decay experiments.
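
For orientation on the standard (light-neutrino exchange) mechanism that such codes extend, the sketch below evaluates the effective Majorana mass m_bb = |sum_i U_ei^2 m_i| for normal ordering; the oscillation parameters are representative values, not a global fit.

  # Effective Majorana mass m_bb for the standard light-neutrino exchange
  # mechanism, normal ordering.  Parameter values are representative only.
  import numpy as np
  s12sq, s13sq = 0.31, 0.022            # sin^2(theta12), sin^2(theta13)
  dm21sq, dm31sq = 7.4e-5, 2.5e-3       # mass-squared splittings [eV^2]
  m1 = 0.001                            # lightest mass [eV] (free parameter)
  m2 = np.sqrt(m1**2 + dm21sq)
  m3 = np.sqrt(m1**2 + dm31sq)
  c12sq, c13sq = 1 - s12sq, 1 - s13sq
  for a21, a31 in [(0, 0), (np.pi, 0), (0, np.pi), (np.pi, np.pi)]:   # Majorana phases
      mbb = abs(c12sq * c13sq * m1
                + s12sq * c13sq * m2 * np.exp(1j * a21)
                + s13sq * m3 * np.exp(1j * a31))
      print(f"phases ({a21:.2f}, {a31:.2f}):  m_bb = {mbb*1e3:.2f} meV")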

Contacts: Jordy de Vries and Lukas Graf

Theoretical Particle Physics: Phase space factors for single, double, and neutrinoless beta-decay rates.

In light of the increasingly precise measurements of beta-decay and double-beta-decay rates and spectra, the theoretical predictions seem to fall behind. The existing, rather phenomenological approaches to the associated phase-space calculations employ a variety of different approximations, introducing errors that are, given their phenomenological nature, not easily quantifiable. A key goal of this project is to understand, reproduce and improve the methods and results available in the literature. Ideally, these efforts would be summarized in the form of a compact Mathematica notebook or Python package available to the broad community of beta-decay experimentalists and phenomenologists, who could easily implement it in the workflows of their analyses. The focus should be not only on the Standard-Model contributions to (double) beta decay, but also on hypothetical exotic modes stemming from various beyond-the-Standard-Model scenarios (see e.g. https://arxiv.org/abs/nucl-ex/0605029 and https://arxiv.org/abs/2003.11836). If time permits, new, more particle-physics-based approaches to the phase-space computations can be investigated.
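
As a concrete starting point, the sketch below evaluates the allowed beta-decay phase-space integral with the simple non-relativistic Fermi function, in electron-mass units; the literature methods to be reproduced and improved in this project use relativistic electron wave functions plus finite-size and screening corrections.

  # Allowed beta-decay phase-space integral f = int F(Z,E) p E (E0-E)^2 dE,
  # in units of the electron mass, with the non-relativistic Fermi function.
  import numpy as np
  from scipy.integrate import quad
  ALPHA = 1.0 / 137.036
  def fermi_function(Z, E):
      # Non-relativistic Coulomb correction for a beta- emitter of daughter charge Z.
      p = np.sqrt(E**2 - 1.0)             # electron momentum (m_e = c = 1)
      eta = ALPHA * Z * E / p
      return 2.0 * np.pi * eta / (1.0 - np.exp(-2.0 * np.pi * eta))
  def phase_space(Z, E0):
      # E0 = total endpoint energy (kinetic endpoint + m_e), in electron masses.
      integrand = lambda E: fermi_function(Z, E) * np.sqrt(E**2 - 1.0) * E * (E0 - E)**2
      value, _ = quad(integrand, 1.0 + 1e-6, E0)   # start just above the p = 0 threshold
      return value
  # Example: tritium-like endpoint of 18.6 keV (E0 = 1 + 18.6/511) on daughter Z = 2
  print(f"f = {phase_space(2, 1.0 + 18.6 / 511.0):.3e}")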

Contacts: Jordy de Vries and Lukas Graf

Theoretical Particle Physics: Predictions for Charged Particle Tracks from First Principles

Measurements based on tracks of charged particles benefit from superior angular resolution. This is essential for a new class of observables called energy correlators, for which a range of interesting applications have been identified: studying the confinement transition, measuring the top quark mass more precisely, etc. I developed a framework for calculating track-based observables, in which the conversion of quarks and gluons to charged hadrons is described by track functions. This generalization of the well-studied parton distribution functions and fragmentation functions is currently being measured by ATLAS, though the data is not public yet. Interestingly, two groups proposed predicting fragmentation functions from first principles in recent years (https://arxiv.org/abs/2010.02934, https://arxiv.org/abs/2301.09649). In this project you would extend one (or both) approaches to obtain a prediction for the track function.
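
To make the observable concrete, the sketch below computes a two-point energy correlator restricted to charged tracks in a toy event, EEC(chi) = sum_{i,j} E_i E_j / Q^2 binned in the pair opening angle chi; the particle list is invented, and in practice one would run over Monte Carlo or data.

  # Toy two-point energy correlator on charged tracks.
  import numpy as np
  rng = np.random.default_rng(1)
  n = 30
  E = rng.exponential(5.0, n)                           # track energies [GeV] (invented)
  dirs = rng.normal(size=(n, 3))
  dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)   # unit momentum directions
  Q = E.sum()                                           # total visible energy
  cos_chi = np.clip(dirs @ dirs.T, -1.0, 1.0)
  chi = np.arccos(cos_chi)
  weights = np.outer(E, E) / Q**2
  bins = np.linspace(0.0, np.pi, 25)
  eec, _ = np.histogram(chi.ravel(), bins=bins, weights=weights.ravel())
  print(eec)   # note: i = j self-pairs land in the chi = 0 bin; conventions differ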

Contacts: Wouter Waalewijn

Neutrinos: Neutrino Oscillation Analysis with the KM3NeT/ORCA Detector

The KM3NeT/ORCA neutrino detector at the bottom of the Mediterranean Sea is able to detect oscillations of atmospheric neutrinos. Neutrinos traversing the detector are reconstructed as a function of two observables: the neutrino energy and the neutrino direction. In order to improve the neutrino oscillation analysis, we need to add one more observable, the so-called Bjorken-y (inelasticity), which indicates the fraction of the neutrino energy transferred to the hadronic system in the interaction. For this project, we will study simulated and real reconstructed data and use those to implement this additional observable in the existing analysis framework. Subsequently, we will study how much the sensitivity of the final analysis improves as a result.
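
In reconstructed charged-current events the inelasticity can be estimated from the reconstructed shower and muon energies; a minimal sketch with invented numbers:

  # Bjorken-y (inelasticity) estimate for reconstructed nu_mu charged-current
  # events: y = E_shower / (E_shower + E_mu).  Numbers are invented examples.
  import numpy as np
  E_shower = np.array([12.0, 3.5, 20.0])   # reconstructed hadronic-shower energies [GeV]
  E_mu     = np.array([ 8.0, 9.0,  5.0])   # reconstructed muon energies [GeV]
  E_nu = E_shower + E_mu
  y = E_shower / E_nu
  for e, yy in zip(E_nu, y):
      print(f"E_nu ~ {e:5.1f} GeV   y ~ {yy:.2f}")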

C++ and Python programming skills are advantageous.

Contacts: Daan van Eijk, Paul de Jong

Neutrinos: Searching for neutrinos of cosmic origin with KM3NeT

KM3NeT is a neutrino telescope under construction in the Mediterranean Sea, already taking data with the first deployed detection units. In particular the KM3NeT/ARCA detector off-shore of Sicily is designed for high-energy neutrinos and is suited for the detection of neutrinos of cosmic origin. In this project we will use the first KM3NeT data to search for evidence of a cosmic neutrino source, and also study ways to improve the analysis.

Contact: Aart Heijboer

Neutrinos: the Deep Underground Neutrino Experiment (DUNE)

The Deep Underground Neutrino Experiment (DUNE) is under construction in the USA, and will consist of a powerful neutrino beam originating at Fermilab, a near detector at Fermilab, and a far detector in the SURF facility in Lead, South Dakota, 1300 km away. While travelling, neutrinos oscillate and a fraction of the neutrino beam changes flavour; DUNE will determine the neutrino oscillation parameters to unrivalled precision, and aim to make a first detection of CP violation in neutrinos. In this project, various elements of DUNE can be studied, including the neutrino oscillation fit, neutrino physics with the near detector, event reconstruction and classification (including machine learning), or elements of data selection and triggering.
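
As a flavour of the oscillation phenomenology, the sketch below evaluates the two-flavour muon-neutrino survival probability over the 1300 km baseline; the parameters are representative values, and the actual DUNE fit uses the full three-flavour formalism with matter effects.

  # Two-flavour nu_mu survival probability over the DUNE baseline (L = 1300 km).
  import numpy as np
  L = 1300.0                      # baseline [km]
  dm2 = 2.5e-3                    # |Delta m^2_32| [eV^2] (representative)
  sin2_2theta23 = 1.0             # maximal mixing for illustration
  E = np.linspace(0.5, 6.0, 12)   # neutrino energy [GeV]
  P_surv = 1.0 - sin2_2theta23 * np.sin(1.267 * dm2 * L / E)**2
  for e, p in zip(E, P_surv):
      print(f"E = {e:4.1f} GeV   P(nu_mu -> nu_mu) = {p:.2f}")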

Contact: Paul de Jong

Neutrinos: Searching for Majorana Neutrinos with KamLAND-Zen

The KamLAND-Zen experiment, located in the Kamioka mine in Japan, is a large liquid scintillator experiment with 750 kg of ultra-pure Xe-136 to search for neutrinoless double-beta decay (0n2b). The observation of the 0n2b process would be evidence for lepton number violation and the Majorana nature of neutrinos, i.e. that neutrinos are their own anti-particles. Current limits on this extraordinarily rare hypothetical decay process are presented as a half-life, with a lower limit of 10^26 years. KamLAND-Zen, the world’s most sensitive 0n2b experiment, is currently taking data and there is an opportunity to work on the data analysis, with the possibility of taking part in a ground-breaking discovery. The main focus will be on developing new techniques to filter the spallation backgrounds, i.e. the production of radioactive isotopes by passing muons. There will be close collaboration with groups in the US (MIT, Berkeley, UW) and Japan (Tohoku Univ).

Contact: Patrick Decowski


Neutrinos: TRIF𝒪RCE (PTOLEMY)

The PTOLEMY demonstrator will place limits on the neutrino mass using the β-decay endpoint of atomic tritium. The detector will require a CRES-based (cyclotron radiation emission spectroscopy) trigger and a non-destructive tracking system. The "TRItium-endpoint From 𝒪(fW) Radio-frequency Cyclotron Emissions" group is developing radio-frequency cavities for the simultaneous transport of endpoint electrons and the extraction of their kinematic information. This is essential to providing a fast online trigger and precise energy-loss corrections to electrons reconstructed near the tritium endpoint. The cryogenic low-noise, high-frequency analogue electronics developed at Nikhef combined with FPGA-based front-end analysis capabilities will provide the PTOLEMY demonstrator with its CRES readout and a testbed to be hosted at the Gran Sasso National Laboratory for the full CνB detector. The focus of this project will be modelling CR in RF cavities and its detection for the purposes of single electron spectroscopy and optimised trajectory reconstruction for prototype and demonstrator setups. This may extend to firmware-based fast tagging and reconstruction algorithm development with the RF-SoC.
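
The heart of CRES is the relation between the emitted cyclotron frequency and the electron kinetic energy, f_c = eB / (2 pi gamma m_e); a quick numerical sketch (the 1 T field is an illustrative choice, not a PTOLEMY design value):

  # Cyclotron radiation frequency vs. electron kinetic energy.
  import numpy as np
  E_CHARGE, M_E, C = 1.602e-19, 9.109e-31, 2.998e8
  B = 1.0                                    # magnetic field [T] (illustrative)
  ke_eV = np.array([0.0, 1e3, 18.6e3])       # kinetic energies [eV]; 18.6 keV ~ tritium endpoint
  gamma = 1.0 + ke_eV * E_CHARGE / (M_E * C**2)
  f_c = E_CHARGE * B / (2.0 * np.pi * gamma * M_E)
  for ke, f in zip(ke_eV, f_c):
      print(f"KE = {ke/1e3:5.1f} keV   f_c = {f/1e9:.4f} GHz")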

Contact: James Vincent Mead

Cosmic Rays: Energy loss profile of cosmic ray muons in the KM3NeT neutrino detector

The dominant signal in the KM3NeT detectors is not neutrinos, but muons created in particle cascades (extensive air showers) initiated when cosmic rays interact at the top of the atmosphere. While these muons are a background for neutrino studies, they present an opportunity to study the nature of cosmic rays and hadronic interactions at the highest energies. Reconstruction algorithms are used to determine the properties of the particle interactions, normally of neutrinos, from the recorded photons. The aim of this project is to explore the possibility of reconstructing the longitudinal energy loss profile of single and multiple simultaneous muons ('bundles') originating from cosmic ray interactions. The potential to use this energy loss profile to extract information on the number of muons and the lateral extension of the muon 'bundles' will also be explored. These properties make it possible to extract information on the high-energy interactions of cosmic rays.
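
For orientation, the average muon energy loss in water is well described by dE/dX = a + bE; the sketch below uses the closed-form solution to estimate the remaining muon energy at depth, with ballpark values of a and b for water that should be checked against the literature.

  # Average muon energy vs. slant depth in water, from dE/dX = a + b*E.
  # a and b are ballpark values for water (to be checked against the literature).
  import numpy as np
  a = 0.24       # ionisation term [GeV per m.w.e.]
  b = 3.3e-4     # radiative term [1 per m.w.e.]
  def muon_energy(E0, X):
      # Remaining energy after slant depth X [m.w.e.]; zero means the muon ranged out.
      return np.maximum((E0 + a / b) * np.exp(-b * X) - a / b, 0.0)
  for E0 in [1e3, 5e3, 1e4]:       # surface energies [GeV]
      print(f"E0 = {E0:7.0f} GeV -> E(3000 m.w.e.) = {muon_energy(E0, 3000.0):8.1f} GeV")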

Contact: Ronald Bruijn

LHCb: Search for light dark particles

The Standard Model of elementary particles does not contain a proper Dark Matter candidate. One of the most tantalizing theoretical developments is the so-called Hidden Valley models: a mirror-like copy of the Standard Model, with dark particles that communicate with standard ones via a very feeble interaction. These models predict the existence of dark hadrons – composite particles that are bound similarly to ordinary hadrons in the Standard Model. Such dark hadrons can be abundantly produced in high-energy proton-proton collisions, making the LHC a unique place to search for them. Some dark hadrons are stable like a proton, which makes them excellent Dark Matter candidates, while others decay to ordinary particles after flying a certain distance in the collider experiment. The LHCb detector has a unique capability to identify such decays, particularly if the new particles have a mass below ten times the proton mass.

This project offers a unique search for light dark hadrons that covers a mass range not accessible to other experiments. It involves an interesting programme of data analysis (Python-based) with non-trivial machine learning solutions, as well as phenomenology research using a fast simulation framework. Depending on your interest, there is quite a bit of flexibility in the precise focus of the project.

Contact: Andrii Usachov

LHCb: Searching for dark matter in exotic six-quark particles

Three quarters of the mass in the Universe is of unknown type. Many hypotheses about this dark matter have been proposed, but none confirmed. Recently it has been proposed that it could be made of particles composed of the six quarks uuddss, which would be a Standard-Model solution to the dark matter problem. This idea has recently gained credibility as many similar multi-quark states are being discovered by the LHCb experiment. Such a particle could be produced in decays of heavy baryons, or directly in proton-proton collisions. The anti-particle, made of six antiquarks, could be seen when annihilating with detector material. It is also proposed to use Xi_b baryons produced at LHCb to search for such a state, where the state would appear as missing 4-momentum in a kinematically constrained decay. The project consists of defining a selection and applying it to LHCb data. See arXiv:2007.10378.

Contact: Patrick Koppenburg


LHCb: New physics in the angular distributions of B decays to K*ee

Lepton flavour violation in B decays can be explained by a variety of non-standard model interactions. Angular distributions in decays of a B meson to a hadron and two leptons are an important source of information to understand which model is correct. Previous analyses at the LHCb experiment have considered final states with a pair of muons. Our LHCb group at Nikhef concentrates on a new measurement of angular distributions in decays with two electrons. The main challenge in this measurement is the calibration of the detection efficiency. In this project you will confront estimates of the detection efficiency derived from simulation with decay distributions in a well known B decay. Once the calibration is understood, the very first analysis of the angular distributions in the electron final state can be performed.

Contact: Mara Soares and Wouter Hulsbergen

LHCb: CP violation in B -> J/psi Ks decays with first run-3 data

The decay B -> J/psi Ks is the `golden channel' for measuring the CP-violating angle beta of the CKM matrix. In this project we will use the first data from the upgraded LHCb detector to perform this measurement. Performing such a measurement with a new detector is going to be very challenging: we will learn a lot about whether the upgraded LHCb detector performs as well as expected.

Contact: Wouter Hulsbergen


LHCb: Optimization of primary vertex reconstruction

A key part of the LHCb event reconstruction is finding the collision point of the protons from the LHC beams. This so-called primary vertex is found by reconstructing the common origin of the charged particles observed in the detector. A rudimentary algorithm exists, but it is expected that its performance can be improved by tuning parameters (or perhaps by implementing an entirely new algorithm). In this project you are challenged to optimize the LHCb primary vertex reconstruction algorithm using recent simulated and real data from LHC Run-3.
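
As a toy version of the problem, the sketch below seeds primary-vertex candidates by histogramming the z positions at which simulated tracks cross the beam line; the real LHCb algorithm works with full track states and an adaptive vertex fit, so this only illustrates the kind of parameters (bin width, seeding threshold) one ends up tuning.

  # Toy primary-vertex seeding: histogram the z positions where tracks cross
  # the beam line and look for peaks.  All numbers are invented toy data.
  import numpy as np
  rng = np.random.default_rng(2)
  true_pvs = np.array([-35.0, 10.0, 62.0])                  # true vertex z positions [mm]
  z_tracks = np.concatenate([rng.normal(z, 0.8, rng.integers(10, 40))
                             for z in true_pvs])            # smeared track z at the beam line
  bin_width = 1.0                                           # tunable parameter [mm]
  bins = np.arange(z_tracks.min() - 5, z_tracks.max() + 5, bin_width)
  counts, edges = np.histogram(z_tracks, bins=bins)
  centres = 0.5 * (edges[:-1] + edges[1:])
  threshold = 5                                             # tunable seeding threshold
  seeds = centres[counts >= threshold]                      # adjacent bins would be merged next
  print("seed candidates [mm]:", np.round(seeds, 1))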

Contact: Wouter Hulsbergen

LHCb: Measurement of B decays to two electrons

Instead of searching for new physics by direct production of new particles, one can search for enhancements of very rare processes as an indirect signal of the existence of new particles or forces. The observed decay of Bs to two muons by the LHCb collaboration and Nikhef/Maastricht is such a measurement, and as the rarest decay ever observed at the LHC it has a large impact on the new-physics landscape. In this project, we will extend this work by searching for the even rarer decay into two electrons. You would join the ongoing work in the context of an NWO Veni grant, and can be based in Maastricht or at Nikhef.

Contact: Jacco de Vries

Muon Collider

There is currently a lively global debate about the next accelerator to succeed the successful LHC. Different options are on the table: linear, circular, electrons, protons, on various continents... Out of these, the most ambitious project is the muon collider, designed to collide the relatively massive (105 MeV) but short-lived (2.2 μs!) leptons. Such a novel collider would combine the advantages of electron-positron colliders (excellent precision) and proton-proton colliders (highest energy). In this project, we'll perform a feasibility study for the search for the elusive double-Higgs process: this as-yet unobserved process is crucial to probe the simultaneous interaction of multiple Higgs bosons and thereby the shape of the Higgs potential as predicted by the Brout-Englert-Higgs mechanism. This sensitivity study will be instrumental for understanding one of the main scientific prospects of this ambitious project, and also for optimizing the detector design, as well as the interface of the particle detectors to the accelerator machine. The project is based at Nikhef but can also be (partially) performed at the University of Twente.
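
The key numbers behind the muon-collider challenge are easy to estimate: the lab-frame decay length is gamma*c*tau. A quick sketch with the PDG muon mass and lifetime (the beam energies are illustrative):

  # Lab-frame muon decay length gamma*c*tau for a few illustrative beam energies.
  import numpy as np
  M_MU = 0.1057      # muon mass [GeV]
  TAU = 2.197e-6     # muon lifetime [s]
  C = 2.998e8        # speed of light [m/s]
  for E_beam in [63.0, 1500.0, 5000.0]:      # beam energies [GeV] (illustrative)
      gamma = E_beam / M_MU
      decay_length = gamma * C * TAU
      print(f"E = {E_beam:6.0f} GeV   gamma = {gamma:8.0f}   decay length = {decay_length/1e3:7.1f} km")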

Reference: https://www.science.org/content/article/muon-collider-could-revolutionize-particle-physics-if-it-can-be-built

Contact: Flavia Dias and Tristan du Pree



Projects with a 2023 start

ALICE: The next-generation multi-purpose detector at the LHC

The main goal of this project is to focus on the next-generation multi-purpose detector planned to be built at the LHC. Its core will be a nearly massless barrel detector consisting of truly cylindrical layers based on curved wafer-scale ultra-thin silicon sensors with MAPS technology, featuring an unprecedentedly low material budget of 0.05% X0 per layer, with the innermost layers possibly positioned inside the beam pipe. The proposed detector is conceived for studies of pp, pA and AA collisions at luminosities a factor of 20 to 50 times higher than possible with the upgraded ALICE detector, enabling a rich physics program ranging from measurements with electromagnetic probes at ultra-low transverse momenta to precision physics in the charm and beauty sector.

Contact: Panos Christakoglou and Alessandro Grelli and Marco van Leeuwen

ALICE: Searching for the strongest magnetic field in nature

In a non-central collision between two Pb ions, with a large value of the impact parameter, the charged nucleons that do not participate in the interaction (called spectators) create strong magnetic fields. A back-of-the-envelope calculation using the Biot-Savart law brings the magnitude of this field close to 10^19 Gauss, in agreement with state-of-the-art theoretical calculations, making it the strongest magnetic field in nature. The presence of this field could have direct implications for the motion of final-state particles. The magnetic field, however, decays rapidly. The decay rate depends on the electric conductivity of the medium, which is experimentally poorly constrained. Overall, the presence of the magnetic field, the main goal of this project, is so far not confirmed experimentally.
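
The back-of-the-envelope estimate quoted above can be reproduced in a few lines by treating each spectator proton as a relativistic charge at impact parameter b; the Lorentz factor, impact parameter and spectator count below are illustrative choices.

  # Order-of-magnitude estimate of the spectator magnetic field in a peripheral
  # Pb-Pb collision: B ~ N_spec * mu0 * gamma * e * v / (4 pi b^2).
  import numpy as np
  MU0, E_CHARGE, C = 4e-7 * np.pi, 1.602e-19, 2.998e8
  gamma = 2700          # Lorentz factor of the beams at sqrt(s_NN) ~ 5 TeV
  b = 1e-14             # impact parameter ~ 10 fm [m]
  n_spec = 50           # rough number of spectator protons per nucleus
  B_tesla = n_spec * MU0 * gamma * E_CHARGE * C / (4.0 * np.pi * b**2)
  print(f"B ~ {B_tesla:.1e} T  =  {B_tesla * 1e4:.1e} Gauss")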

Contact: Panos Christakoglou

ALICE: Looking for parity violating effects in strong interactions

Within the Standard Model, symmetries, such as the combination of charge conjugation (C) and parity (P), known as CP-symmetry, are considered to be key principles of particle physics. The violation of CP-invariance can be accommodated within the Standard Model in both the weak and the strong interactions; however, it has only been confirmed experimentally in the former. Theory predicts that in heavy-ion collisions, in the presence of a deconfined state, gluonic fields create domains where the parity symmetry is locally violated. This manifests itself in a charge-dependent asymmetry in the production of particles relative to the reaction plane, which is called the Chiral Magnetic Effect (CME). The first experimental results from STAR (RHIC) and ALICE (LHC) are consistent with the expectations from the CME; however, further studies are needed to constrain background effects. These highly anticipated results have the potential to reveal exciting new physics.

Contact: Panos Christakoglou

ALICE: Machine learning techniques as a tool to study the production of heavy flavour particles

There was recently a shift in the field of heavy-ion physics triggered by experimental results obtained in collisions between small systems (e.g. protons on protons). These results resemble the ones obtained in collisions between heavy ions. This consequently raises the question of whether we create the smallest QGP droplet in collisions between small systems. The main objective of this project will be to study the production of charm particles such as D-mesons and Λc-baryons in pp collisions at the LHC. This will be done with the help of a new and innovative technique which is based on machine learning (ML). The student will also extend the studies to investigate how this production rate depends on the event activity e.g. on how many particles are created after every collision.

Contact: Panos Christakoglou and Alessandro Grelli

ALICE: Search for new physics with 4D tracking at the most sensitive vertex detector at the LHC

With the newly installed Inner Tracking System consisting fully of monolithic detectors, ALICE is very sensitive to particles with low transverse momenta, more so than ATLAS and CMS. This will be even more so for the ALICE upgrade detector in 2033. This detector could potentially be even more sensitive to long-lived particles that leave peculiar tracks such as disappearing or kinked tracks in the tracker by using timing information along a track. In this project you will investigate how timing information in the different tracking layers can improve or even enable a search for new physics beyond the Standard Model in ALICE. If you show a possibility for major improvements, this can have real consequences for the choice of sensors for this ALICE inner tracker upgrade.

Contact: Jory Sonneveld and Panos Christakoglou

ATLAS: The Higgs boson's self-coupling

The coupling of the Higgs boson to itself is one of the main unobserved interactions of the Standard Model and its observation is crucial to understand the shape of the Higgs potential. Here we propose to study the 'ttHH' final state: two top quarks and two Higgs bosons produced in a single collision. This topology is yet unexplored at the ATLAS experiment and the project consists of setting up the new analysis (including multivariate analysis techniques to recognise the complicated final state), optimising the sensitivity and including the result in the full ATLAS study of the Higgs boson's coupling to itself. With the LHC data from the upcoming Run-3, we might be able to see its first glimpses!

Contact: Tristan du Pree and Carlo Pandini

ATLAS: Triple-Higgs production as a probe of the Higgs potential

So far, the investigation of Higgs self-couplings (the coupling of the Higgs boson to itself) at the LHC has focused on the measurement of the Higgs tri-linear coupling λ3, mainly through direct double-Higgs production searches. In this research project we propose the investigation of the Higgs tri-linear and quartic coupling parameters λ3 and λ4 via a novel measurement of triple-Higgs production (HHH) at the LHC with the ATLAS experiment. While in the SM these parameters are expected to be identical, only a combined measurement can provide an answer regarding how the Higgs potential is realised in Nature. Processes in which three Higgs bosons are produced simultaneously are extremely rare, and very difficult to measure and disentangle from background. In this project we plan to investigate different decay channels (to bottom quarks and tau leptons), and to study advanced machine learning techniques to reconstruct such a complex hadronic final state. This kind of process is still largely unexplored in ATLAS, and the goal of this project is to lay the basis for the first measurement of HHH production at the LHC.

Furthermore, we'd like to study the possible implication of a precise measurement of the self-coupling parameters from HHH production from a phenomenological point of view: what could be the impact of a deviation in the HHH measurements on the big open questions in physics (for instance, the mechanisms at the root of baryogenesis)?

Contact: Tristan du Pree and Carlo Pandini

ATLAS: The Next Generation

After the observation of the coupling of the Higgs boson to fermions of the third generation, the search for its coupling to fermions of the second generation is one of the next priorities for research at CERN's Large Hadron Collider. The search for the decay of the Higgs boson to two charm quarks is very new [1] and we see various opportunities for interesting developments. For this project we propose improvements in reconstruction (using exclusive decays) and advanced analysis techniques (using deep learning methods).

[1]https://atlas.cern/updates/briefing/charming-Higgs-decay

Contact: Tristan du Pree

ATLAS: Searching for new particles in very energetic diboson production

The discovery of new phenomena in high-energy proton–proton collisions is one of the main goals of the Large Hadron Collider (LHC). New heavy particles decaying into a pair of vector bosons (WW, WZ, ZZ) are predicted in several extensions to the Standard Model (e.g. extended gauge-symmetry models, Grand Unified theories, theories with warped extra dimensions, etc.). In this project we will investigate new ideas to look for these resonances in promising regions. We will focus on final states where both vector bosons decay into quarks, or where one decays into quarks and one into leptons. These have the potential to bring the highest sensitivity to the search for Beyond the Standard Model physics [1, 2]. We will develop and exploit new ways to identify vector bosons (using machine learning methods) and then tackle the problem of estimating contributions from beyond-the-Standard-Model processes in the tails of the mass distribution.

[1] https://arxiv.org/abs/1906.08589

[2] https://arxiv.org/abs/2004.14636

Contact: Flavia de Almeida Dias, Robin Hayes, Elizaveta Cherepanova and Dylan van Arneman

ATLAS: Top-quark and Higgs-boson analysis combination, and Effective Field Theory interpretation (also in 2023)

We are looking for a master student with interest in theory and data-analysis in the search for physics beyond the Standard Model in the top-quark and Higgs-boson sectors.

Your master-project starts just at the right time for preparing the Run-3 analysis of the ATLAS experiment at the LHC. In Run-3 (2022-2026), three times more data becomes available, enabling analysis of rare processes with innovative software tools and techniques.

This project aims to explore the newest strategy to combine the top-quark and Higgs-boson measurements in the perspective of constraining the existence of new physics beyond the Standard Model (SM) of Particle Physics. We selected the pp->tZq and gg->HZ processes as promising candidates for a combination to constrain new physics in the context of Standard Model Effective Field Theory (SMEFT). SMEFT is the state-of-the-art framework for theoretical interpretation of LHC data. In particular, you will study the SMEFT OtZ and Ophit operators, which are not well constrained by current measurements.

Besides affinity with particle physics theory, the ideal candidate for this project has developed python/C++ skills and is eager to learn advanced techniques. You start with a simulation of the signal and background samples using existing software tools. Then, an event selection study is required using Machine Learning techniques. To evaluate the SMEFT effects, a fitting procedure based on the innovative Morphing technique is foreseen, for which the basic tools in the ROOT and RooFit framework are available. The work is carried out in the ATLAS group at Nikhef and may lead to an ATLAS note.
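
The essence of the morphing technique is that, for a single operator, the predicted yield depends at most quadratically on the Wilson coefficient, so three simulated benchmark points fix the full dependence; a minimal numerical sketch with invented yields (the real workflow uses the ROOT/RooFit morphing tools mentioned above):

  # Minimal morphing sketch for one SMEFT Wilson coefficient c:
  # yield(c) = s0 + s1*c + s2*c^2, fixed by three simulated benchmark points.
  import numpy as np
  c_bench = np.array([0.0, 1.0, -1.0])           # benchmark Wilson-coefficient values
  y_bench = np.array([100.0, 130.0, 85.0])       # predicted yields at the benchmarks (invented)
  A = np.vander(c_bench, 3, increasing=True)     # columns: 1, c, c^2
  s0, s1, s2 = np.linalg.solve(A, y_bench)
  def morphed_yield(c):
      return s0 + s1 * c + s2 * c**2
  for c in [-2.0, -0.5, 0.5, 2.0]:
      print(f"c = {c:+.1f}  predicted yield = {morphed_yield(c):.1f}")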

Contact: Oliver Rieger and Marcel Vreeswijk

ATLAS: Machine learning to search for very rare Higgs decays

Since the Higgs boson discovery in 2012 at the ATLAS experiment, the investigation of the properties of the Higgs boson has been a priority for research at the Large Hadron Collider (LHC). However, there are still many open questions: Is the Higgs boson the only origin of electroweak symmetry breaking? Is there a mechanism which can explain the observed mass pattern of SM particles? Many of these questions are linked to the Higgs boson coupling structure.

While the Higgs boson coupling to fermions of the third generation has been established experimentally, the investigation of the Higgs boson coupling to the light fermions of the second generation will be a major project for the upcoming data-taking period (2022-2025). The Higgs boson decay to muons is the most sensitive channel for probing this coupling. In this project, you will optimize the event selection for Higgs boson decays to muons in the Vector Boson Fusion (VBF) production channel with a focus on distinguishing signal events from background processes like Drell-Yan and electroweak Z boson production. For this purpose, you will develop, implement and validate advanced machine learning and deep learning algorithms.

Contact: Oliver Rieger and Wouter Verkerke and Peter Kluit

ATLAS: Interpretation of experimental data using SMEFT

The Standard Model Effective Field Theory (SMEFT) provides a systematic approach to test the impact of new physics at the energy scale of the LHC through higher-dimensional operators. The interpretation of experimental data using SMEFT requires a particular interest in solving complex technical challenges, advanced statistical techniques, and a deep understanding of particle physics. We would be happy to discuss different project opportunities based on your interests with you.

Contact: Oliver Rieger and Wouter Verkerke

ATLAS: A new timing detector - the HGTD

The ATLAS detector is going to gain a new ability: a timing detector. This will allow us to reconstruct tracks not only in the three dimensions of space, but also to measure very precisely (at the picosecond level) the time at which the particles pass the sensitive layers of the HGTD detector. This makes it possible to reconstruct the trajectories of the particles created at the LHC in four dimensions and will ultimately lead to a better reconstruction of physics in ATLAS. The new HGTD detector is still under construction and work needs to be done on different levels, such as understanding the detector response (taking measurements in the lab and performing simulations) or developing algorithms to reconstruct the particle trajectories (programming and analysis work).

Several projects are available within the context of the new HGTD detector:

  1. One can choose to focus on the impact on physics analysis performance by studying how the timing measurements can be included in the reconstruction of tracks, and what effect this has on how much better we can understand the physical processes occurring in the particles produced in the LHC collisions. With this work you will be part of the ATLAS group at Nikhef.
  2. The second possibility is to test the sensors in our lab and in test-beam setups at CERN. The analysis will be performed in the context of the ATLAS HGTD test beam group, in connection with both the ATLAS group and the R&D department at Nikhef.
  3. The third is to contribute to an ongoing effort to precisely simulate/model the silicon avalanche detectors in the Allpix2 framework. There are several models that try to describe the detector response, with dependencies on operating temperature, field strengths and radiation damage. We are getting close to being able to model our detector, but are not there yet. This work will be within the ATLAS group, together with Hella Snoek and Andrea Visibile.

If you are interested, contact me to discuss the possibilities. Contact: Hella Snoek


ATLAS: The next full-silicon Inner Tracker: ITk

The inner detector of the present ATLAS experiment has been designed and developed to function in the environment of the present Large Hadron Collider (LHC). At the ATLAS Phase-II Upgrade, the particle densities and radiation levels will exceed current levels by a factor of ten. The instantaneous luminosity is expected to reach unprecedented values, resulting in up to 200 proton-proton interactions in a typical bunch crossing. The new detectors must be faster and they need to be more highly segmented. The sensors used also need to be far more resistant to radiation, and they require much greater power delivery to the front-end systems. At the same time, they cannot introduce excess material which could undermine tracking performance. For those reasons, the inner tracker of the ATLAS detector (ITk) was redesigned and will be rebuilt completely.

Nikhef is one of the sites in charge of building and integrating some big parts of ITk. One of the next steps consists of testing the sensors that we will install in the structures we have built (check one of the structures in the picture of our cleanroom). This project offers the possibility of working on a full hardware project, doing something completely new, by testing the sensors of a future component of the next ATLAS detector.

Contact: Andrea García Alonso

Cosmic Rays/Neutrinos: Seasonal muon flux variations and the pion/kaon ratio

The KM3NeT ARCA and ORCA detectors, located kilometres deep in the Mediterranean Sea, have neutrinos as primary probes. Muons from cosmic ray interactions reach the detectors in relatively large quantities too. Exploiting the capabilities and location of the detectors, these muons allow the study of cosmic rays and their interactions. In this way, questions about their origin, type and propagation can be addressed. In particular, these muons are tracers of hadronic interactions at energies inaccessible at particle accelerators.

The muons reaching the depths of the detectors result from decays of mesons, mostly pions and kaons, created in interactions of high-energy cosmic rays with atoms in the upper atmosphere. Seasonal changes of the temperature (and thus density) profile of the atmosphere modulate the balance between the probability for these mesons to decay (producing muons) or to re-interact. Pions and kaons are affected differently, allowing their production ratio to be extracted by determining how changes in the muon rate depend on changes in the effective temperature, an integral over the atmospheric temperature profile weighted by a depth-dependent meson production rate.

In this project, the aim is to measure the rate of muons in the detectors and to calculate the effective temperature above the KM3NeT detectors from atmospheric data, both as a function of time. The relation between these two can be used to extract the pion-to-kaon ratio.
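
The analysis boils down to fitting a linear correlation between the relative variations of the muon rate and of the effective temperature, dR/<R> = alpha_T * dT_eff/<T_eff>, where the correlation coefficient alpha_T carries the pion/kaon information; a minimal sketch with invented daily values:

  # Fit the seasonal correlation dR/<R> = alpha_T * dT_eff/<T_eff>.
  # Daily muon rates and effective temperatures below are invented numbers.
  import numpy as np
  rate  = np.array([51.85, 52.05, 52.30, 52.40, 52.20, 51.95, 51.80])        # muon rate [Hz]
  t_eff = np.array([218.5, 219.4, 220.6, 221.0, 220.1, 219.0, 218.2])        # T_eff [K]
  dr = (rate - rate.mean()) / rate.mean()
  dt = (t_eff - t_eff.mean()) / t_eff.mean()
  alpha_T, _ = np.polyfit(dt, dr, 1)       # slope = alpha_T, intercept ~ 0
  print(f"alpha_T = {alpha_T:.2f}")        # values close to 1 are expected for deep detectors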

Contact: Ronald Bruijn

Detector R&D: Studies of wafer-scale sensors for ALICE detector upgrade and beyond

One of the biggest milestones of the ALICE detector upgrade (foreseen in 2026) is the implementation of wafer-scale (~28 cm x 18 cm) monolithic silicon active pixel sensors in the tracking detector, with the goal of having truly cylindrical barrels around the beam pipe. To demonstrate such an unprecedented technology in high energy physics detectors, a few chips will soon be available in the Nikhef laboratories for testing and characterization purposes. The goal of the project is to contribute to the validation of the samples against the ALICE tracking detector requirements, with a focus on timing performance in view of other applications in future high energy physics experiments beyond ALICE. We are looking for a student with a focus on lab work who is interested in high-precision measurements with cutting-edge instrumentation. You will be part of the Nikhef Detector R&D group and you will have, at the same time, the chance to work in an international collaboration where you will report on the performance of these novel sensors. There may even be the opportunity to join beam tests at the CERN or DESY facilities. Besides interest in hardware, some proficiency in computing is required (Python or C++/ROOT).

Contact: Jory Sonneveld, Roberto Russo

Detector R&D: Time resolution of monolithic silicon detectors

Monolithic silicon detectors based on industrial Complementary Metal Oxide Semiconductor (CMOS) processes offer a promising approach for large-scale detectors due to their ease of production and low material budget. Until recently, their low radiation tolerance has hindered their applicability in high energy particle physics experiments. However, new prototypes, such as the one studied in this project, have overcome these hurdles, making them feasible candidates for future experiments in high energy particle physics. Achieving the required radiation tolerance has brought the spatial and temporal resolution of these detectors to the forefront. In this project, you will investigate the temporal performance of a radiation-hard monolithic detector prototype, using laser setups in the laboratory. You will also participate in meetings with the international collaboration working on this detector, where you will report on the prototype's performance. Depending on the progress of the work, there may be a chance to participate in test beams performed at the CERN accelerator complex and a first full three-dimensional characterization of the prototype's performance using a state-of-the-art two-photon absorption laser setup at Nikhef. This project is looking for someone interested in working hands-on with cutting-edge detector and laser systems at the Nikhef laboratory. Python programming skills and Linux experience are an advantage.

Contact: Jory Sonneveld, Uwe Kraemer

Detector R&D: Improving a Laser Setup for Testing Fast Silicon Pixel Detectors

For the upgrades of the innermost detectors of the experiments at the Large Hadron Collider in Geneva, in particular to cope with the large number of collisions per second from 2027, the Detector R&D group at Nikhef tests new pixel detector prototypes with a variety of laser equipment at several wavelengths. The lasers can be focused down to a small spot to scan over the pixels on a pixel chip. Since the laser penetrates the silicon, the pixels will not be illuminated by just the focal spot, but by the entire three-dimensional, hourglass- or double-cone-like light intensity distribution. So, how well defined is the volume in which charge is released? Can that be made much smaller than a pixel? And, if so, what would the optimum focus be? For this project the student will first estimate the intensity distribution that can be expected inside a sensor. This will correspond to the density of released charge within the silicon. To verify the predictions, you will measure real pixel sensors for the LHC experiments. This project will involve a lot of hands-on work in the lab, as well as programming and work on Unix machines.
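
To get a first feeling for the question, the sketch below combines the Gaussian-beam spot size with Beer-Lambert absorption to estimate how the deposited charge density falls off with depth in silicon; the wavelength, waist and absorption length are illustrative numbers only.

  # Charge-deposition density of a focused laser inside silicon: Gaussian beam
  # spot size combined with Beer-Lambert absorption.  Illustrative numbers only.
  import numpy as np
  lam = 1060e-9        # vacuum wavelength [m]
  n_si = 3.5           # refractive index of silicon (approximate)
  w0 = 1.5e-6          # focal spot radius [m], focus placed at z = 0 (sensor surface)
  l_abs = 300e-6       # assumed absorption length in silicon at this wavelength (rough)
  z = np.linspace(0.0, 300e-6, 7)                   # depth into the sensor [m]
  zr = np.pi * w0**2 * n_si / lam                   # Rayleigh range inside silicon
  w = w0 * np.sqrt(1.0 + (z / zr)**2)
  # Power absorbed per unit depth ~ exp(-z/l_abs), spread over the spot area:
  density = np.exp(-z / l_abs) / (np.pi * w**2)
  density /= density[0]
  for zz, ww, d in zip(z, w, density):
      print(f"z = {zz*1e6:5.0f} um   spot radius = {ww*1e6:4.1f} um   rel. charge density = {d:.2f}")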

Contact: Martin Fransen

Detector R&D: Time resolution of hybrid pixel detectors with the Timepix4 chip

Precise time measurements with silicon pixel detectors are very important for experiments at the High-Luminosity LHC and the future circular collider. The spatial resolution of current silicon trackers will not be sufficient to distinguish the large number of collisions that will occur within individual bunch crossings. In a new method, typically referred to as 4D tracking, spatial measurements of pixel detectors will be combined with time measurements to better distinguish collision vertices that occur close together. New sensor technologies are being explored to reach the required time measurement resolution of tens of picoseconds, and the results are promising. However, the signals that these pixelated sensors produce have to be processed by front-end electronics, which hence also play a role in the total time resolution of the detector. An important contribution comes from the systematic differences between the front-end electronics of different pixels. Many of these systematic effects can be corrected by performing detailed calibrations of the readout electronics. To achieve the required time resolution at future experiments, it is vital that these effects are understood and corrected. In this project you will be working with the Timepix4 chip. This is a so-called application specific integrated circuit (ASIC) that is designed to read out pixelated sensors. This ASIC will be used extensively in detector R&D for the characterisation of new sensor technologies requiring precise timing (< 50 ps). In order to do so, it is necessary to first study the systematic differences between the pixels, which you will do using a laser setup in our lab. This will be combined with data analysis of proton beam measurements, or with measurements performed using the built-in test-pulse mechanism of the Timepix4 ASIC. Your work will enable further research performed with this ASIC, and serve as input to the design and operation of future ASICs for experiments at the High-Luminosity LHC.
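
One of the dominant per-pixel systematic effects is timewalk, which is commonly modelled as t = t0 + c/(Q - Q_thr); the sketch below fits this model to (charge, time) calibration points of a single pixel, with invented data.

  # Per-pixel timewalk calibration: fit t = t0 + c/(Q - Q_thr) to measured
  # (charge, time) points.  The data points are invented for illustration.
  import numpy as np
  from scipy.optimize import curve_fit
  def timewalk(q, t0, c, q_thr):
      return t0 + c / (q - q_thr)
  q = np.array([1.5, 2.0, 3.0, 5.0, 8.0, 12.0, 20.0])       # charge [ke-]
  t = np.array([9.2, 5.9, 3.6, 2.2, 1.6, 1.35, 1.15])       # measured delay [ns]
  popt, _ = curve_fit(timewalk, q, t, p0=[1.0, 8.0, 1.0],
                      bounds=([0, 0, 0], [5, 50, 1.4]))
  t0, c, q_thr = popt
  print(f"t0 = {t0:.2f} ns, c = {c:.2f} ns*ke, threshold = {q_thr:.2f} ke")
  # The fitted curve can then be used to correct each hit's time-stamp as a
  # function of its measured charge, pixel by pixel.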

Contact: Kevin Heijhoff and Martin van Beuzekom

Detector R&D: Performance studies of Trench Isolated Low Gain Avalanche Detectors (TI-LGAD)

The future vertex detector of the LHCb experiment needs to measure the spatial coordinates and time of the particles originating from the LHC proton-proton collisions with resolutions better than 10 μm and 50 ps, respectively. Several technologies are being considered to achieve these resolutions. Among them is a novel sensor technology called the Trench Isolated Low Gain Avalanche Detector. Prototype pixelated sensors have been manufactured recently and have to be characterised. To this end, these new sensors will be bump-bonded to a Timepix4 ASIC, which provides charge and time measurements in each of its 230 thousand pixels. Characterisation will be done using a lab setup at Nikhef, and includes tests with a micro-focused laser beam, radioactive sources, and possibly with particle tracks obtained in a test beam. This project involves data taking with these new devices and analysing the data to determine performance parameters such as the spatial and temporal resolution, as a function of temperature and other operational conditions.

Contacts: Kazu Akiba and Martin van Beuzekom

Detector R&D: A Telescope with Ultrathin Sensors for Beam Tests

To measure the performance of new prototypes for upgrades of the LHC experiments and beyond, a telescope is typically used in a beam line of charged particles, so that the results in the prototype can be compared to particle tracks measured with this telescope. In this project, you will continue work on a very lightweight, compact telescope using ALICE PIxel DEtectors (ALPIDEs). This includes work on the mechanics, the data acquisition software, and a movable stage. You will foreseeably test this telescope in the Delft Proton Therapy Center. If time allows, you will add a timing plane and perform a measurement with one of our prototypes. Apart from travel to Delft, there is a possibility to travel to other beam line facilities.

Contact: Jory Sonneveld

Detector R&D: Laser Interferometer Space Antenna (LISA) - the first gravitational wave detector in space

The space-based gravitational wave antenna LISA is one of the most challenging space missions ever proposed. ESA plans to launch around 2034 three spacecraft separated by a few million kilometres. This constellation measures tiny variations in the distances between test-masses located in each satellite to detect gravitational waves from sources such as supermassive black holes. LISA is based on laser interferometry, and the three satellites form a giant Michelson interferometer. LISA measures a relative phase shift between one local laser and one distant laser by light interference. The phase shift measurement requires sensitive sensors. The Nikhef DR&D group fabricated prototype sensors in 2020 together with the Photonics industry and the Dutch institute for space research SRON. Nikhef & SRON are responsible for the Quadrant PhotoReceiver (QPR) system: the sensors, the housing including a complex mount to align the sensors with 10's of nanometer accuracy, various environmental tests at the European Space Research and Technology Centre (ESTEC), and the overall performance of the QPR in the LISA instrument. Currently we are discussing possible sensor improvements for a second fabrication run in 2022, optimizing the mechanics and preparing environmental tests. As a MSc student, you will work on various aspects of the wavefront sensor development: study the performance of the epitaxial stacks of Indium-Gallium-Arsenide, setting up test benches to characterize the sensors and QPR system, performing the actual tests and data analysis, in combination with performance studies and simulations of the LISA instrument.

Contact: Niels van Bakel

Detector R&D: Other projects

Are you looking for a slightly different project? Are the above projects already taken? Are you coming in at an unusual time of the year? Do not hesitate to contact us! We always have new projects coming up at different times in the year and we are open to your ideas.

Contact: Jory Sonneveld

FCC: The Next Collider

After the LHC, the next planned large collider at CERN is the proposed 100 kilometer circular collider "FCC". In the first stage of the project, as a high-luminosity electron-positron collider, precision measurements of the Higgs boson are the main goal. One of the channels that will improve by orders of magnitude at this new accelerator is the decay of the Higgs boson to a pair of charm quarks. This project will estimate a projected sensitivity for the coupling of the Higgs boson to second generation quarks, and in particular target the improved reconstruction of the topology of long-lived mesons in the clean environment of a precision e+e- machine.

Contact: Tristan du Pree

Gravitational Waves: Computer modelling to design the laser interferometers for the Einstein Telescope

A new field of instrument science led to the successful detection of gravitational waves by the LIGO detectors in 2015. We are now preparing the next generation of gravitational wave observatories, such as the Einstein Telescope, with the aim to increase the detector sensitivity by a factor of ten, which would allow, for example, to detect stellar-mass black holes from early in the universe when the first stars began to form. This ambitious goal requires us to find ways to significantly improve the best laser interferometers in the world.

Gravitational wave detectors, such as LIGO and VIRGO, are complex Michelson-type interferometers enhanced with optical cavities. We develop and use numerical models to study these laser interferometers, to invent new optical techniques and to quantify their performance. For example, we synthesize virtual mirror surfaces to study the effects of higher-order optical modes in the interferometers, and we use opto-mechanical models to test schemes for suppressing quantum fluctuations of the light field. We can offer several projects based on numerical modelling of laser interferometers. All projects will be directly linked to the ongoing design of the Einstein Telescope.

Contact: Andreas Freise

LHCb: Search for light dark particles

The Standard Model of elementary particles does not contain a proper Dark Matter candidate. One of the most tantalizing theoretical developments is the class of so-called Hidden Valley models: a mirror-like copy of the Standard Model, with dark particles that communicate with ordinary ones via a very feeble interaction. These models predict the existence of dark hadrons, composite particles that are bound similarly to ordinary hadrons in the Standard Model. Such dark hadrons can be abundantly produced in high-energy proton-proton collisions, making the LHC a unique place to search for them. Some dark hadrons are stable, like the proton, which makes them excellent Dark Matter candidates, while others decay to ordinary particles after flying a certain distance in the collider experiment. The LHCb detector has a unique capability to identify such decays, particularly if the new particles have a mass below ten times the proton mass.

This project proposes a unique search for light dark hadrons that covers a mass range not accessible to other experiments. It combines an interesting programme of data analysis (Python-based) with non-trivial machine-learning solutions and phenomenology research using a fast-simulation framework. Depending on your interests, there is quite a bit of flexibility in the precise focus of the project.
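
The sketch below shows the flavour of the machine-learning part: a gradient-boosted classifier separating toy "displaced dark-hadron" candidates from toy background using a few invented features (flight distance, vertex quality, transverse momentum). It is scikit-learn on pseudo-data, not an LHCb analysis.

import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
n = 20000

# Toy features: flight distance [mm], vertex chi2, candidate pT [GeV] (all invented).
sig = np.column_stack([rng.exponential(15.0, n), rng.chisquare(3, n), rng.exponential(4.0, n)])
bkg = np.column_stack([rng.exponential(2.0, n),  rng.chisquare(8, n), rng.exponential(2.0, n)])
X = np.vstack([sig, bkg])
y = np.concatenate([np.ones(n), np.zeros(n)])

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
clf = GradientBoostingClassifier(n_estimators=200, max_depth=3)
clf.fit(X_train, y_train)
score = roc_auc_score(y_test, clf.predict_proba(X_test)[:, 1])
print(f"ROC AUC on toy data: {score:.3f}")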

Contact: Andrii Usachov

LHCb: Searching for dark matter in exotic six-quark particles

Most of the matter in the Universe is of an unknown type. Many hypotheses about this dark matter have been proposed, but none has been confirmed. Recently it has been proposed that it could consist of particles made of the six quarks uuddss, which would be a Standard-Model solution to the dark matter problem. This idea has gained credibility as many similar multi-quark states are being discovered by the LHCb experiment. Such a particle could be produced in decays of heavy baryons, or directly in proton-proton collisions. The anti-particle, made of six antiquarks, could be seen when annihilating with detector material. It is also proposed to use Xi_b baryons produced at LHCb to search for such a state, which would appear as missing 4-momentum in a kinematically constrained decay. The project consists of defining a selection and applying it to LHCb data. See arXiv:2007.10378.
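
To illustrate the missing-4-momentum idea, the toy below builds the missing-mass-squared distribution from pseudo four-vectors for a hypothetical two-body decay Xi_b -> S + visible system, in the Xi_b rest frame. The S mass, the visible-system mass and the smearing are invented, and no LHCb reconstruction effects are included.

import numpy as np

rng = np.random.default_rng(7)

def mass2(p):
    # Invariant mass squared of four-vectors stored as [..., (E, px, py, pz)].
    return p[..., 0] ** 2 - np.sum(p[..., 1:] ** 2, axis=-1)

# Toy two-body decay Xi_b -> S + visible, in the Xi_b rest frame (masses in GeV; illustrative).
m_parent, m_S, m_vis = 5.79, 2.0, 2.5
p_star = np.sqrt((m_parent**2 - (m_S + m_vis)**2) * (m_parent**2 - (m_S - m_vis)**2)) / (2 * m_parent)

n = 10000
costh, phi = rng.uniform(-1, 1, n), rng.uniform(0, 2 * np.pi, n)
sinth = np.sqrt(1 - costh**2)
p3 = p_star * np.column_stack([sinth * np.cos(phi), sinth * np.sin(phi), costh])
p3 *= 1 + rng.normal(0, 0.01, (n, 1))            # crude 1% momentum-scale smearing
p_vis = np.column_stack([np.sqrt(m_vis**2 + np.sum(p3**2, axis=1)), p3])

p_parent = np.array([m_parent, 0.0, 0.0, 0.0])
mm2 = mass2(p_parent - p_vis)
print(f"missing mass peaks near {np.sqrt(np.median(mm2)):.2f} GeV (input m_S = {m_S} GeV)")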

Contact: Patrick Koppenburg

LHCb: Measuring lepton flavour universality with excited Ds states in semileptonic Bs decays

One of the most striking discrepancies between the Standard Model and experimental measurements is found in lepton flavour universality (LFU) tests with tau decays. At the moment, we observe an excess of 3-4 sigma in B → Dτν decays. This could even point to a new force of nature! To understand this discrepancy, we need to make further measurements.

One very exciting (pun intended) project to verify these discrepancies involves measuring the Bs → Ds2*τν and/or Bs → Ds1*τν decays. These decays, involving excited states of the Ds meson, have not been observed before in the tau decay mode, and they have a unique way of coupling to potential new physics candidates that can only be measured in Bs decays [1]. See the slides for more detail: File:LHCbLFUwithExcitedDs.pdf

[1] https://arxiv.org/abs/1606.09300

Contact: Suzanne Klaver

LHCb: New physics in the angular distributions of B decays to K*ee

Hints of lepton flavour universality violation in B decays can be explained by a variety of non-Standard-Model interactions. Angular distributions in decays of a B meson to a hadron and two leptons are an important source of information to understand which model is correct. Previous analyses at the LHCb experiment have considered final states with a pair of muons. Our LHCb group at Nikhef concentrates on a new measurement of the angular distributions in decays with two electrons. The main challenge in this measurement is the calibration of the detection efficiency. In this project you will confront estimates of the detection efficiency derived from simulation with decay distributions in a well-known B decay. Once the calibration is understood, the very first analysis of the angular distributions in the electron final state can be performed.
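
The snippet below illustrates the calibration idea on pseudo-data: a per-bin detection efficiency in cos(theta_l) is estimated from toy "generated" and "reconstructed" samples, and then used to correct an observed distribution from a control channel. The efficiency shape and sample sizes are invented for illustration.

import numpy as np

rng = np.random.default_rng(3)
bins = np.linspace(-1, 1, 11)

def detection_prob(cos_theta):
    # Invented efficiency shape that sculpts the angular distribution.
    return 0.5 + 0.3 * (1 - cos_theta**2)

# Toy 'simulation': flat generated cos(theta_l), filtered by the efficiency.
cos_gen = rng.uniform(-1, 1, 200000)
cos_rec = cos_gen[rng.uniform(size=cos_gen.size) < detection_prob(cos_gen)]
n_gen, _ = np.histogram(cos_gen, bins)
n_rec, _ = np.histogram(cos_rec, bins)
eff = n_rec / n_gen
eff_err = np.sqrt(eff * (1 - eff) / n_gen)          # binomial uncertainty per bin

# Toy 'control channel' data (also flat at generator level), corrected by the efficiency map.
cos_ctrl = rng.uniform(-1, 1, 40000)
cos_obs = cos_ctrl[rng.uniform(size=cos_ctrl.size) < detection_prob(cos_ctrl)]
n_obs, _ = np.histogram(cos_obs, bins)
n_corr = n_obs / eff                                # should be roughly flat if the calibration is right
print("bin           efficiency        corrected yield")
for i, (lo, hi) in enumerate(zip(bins[:-1], bins[1:])):
    print(f"[{lo:+.1f},{hi:+.1f}]  {eff[i]:.3f} +- {eff_err[i]:.3f}   {n_corr[i]:8.0f}")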

Contact: Mara Soares and Wouter Hulsbergen

LHCb: Discovering heavy neutrinos in B decays

Neutrinos are the lightest of all fermions in the Standard Model. Mechanisms to explain their small mass rely on the introduction of new, much heavier, neutral leptons. If the mass of these new neutrinos is below the b-quark mass, they can be observed in B-hadron decays.

In this project we search for the decay of B+ mesons into an ordinary electron or muon and the as-yet-undiscovered heavy neutrino. The heavy neutrino is expected to be unstable and in turn decay quickly into a charged pion and another electron or muon. The final state in which the two leptons differ in flavour, "B+ to e mu pi", is particularly interesting: it is forbidden in the Standard Model, such that backgrounds are small. The analysis will be performed within the LHCb group at Nikhef using LHCb Run 2 data.
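
As a simple illustration of the kinematics, the lines below evaluate the momentum of the heavy neutrino N in the B+ rest frame for the two-body decay B+ -> mu N, over a few illustrative N masses inside the kinematically allowed window.

import numpy as np

# Two-body momentum of the heavy neutrino N in the B+ rest frame for B+ -> mu N,
# as a function of the (unknown) N mass. Masses in GeV; m_N values are illustrative.
m_B, m_mu, m_pi = 5.279, 0.1057, 0.1396

def p_two_body(M, m1, m2):
    # Momentum of either daughter in a two-body decay M -> m1 + m2 (rest frame of M).
    return np.sqrt((M**2 - (m1 + m2)**2) * (M**2 - (m1 - m2)**2)) / (2 * M)

for m_N in [0.5, 1.0, 2.0, 3.0, 4.0, 5.0]:
    if m_mu + m_pi < m_N < m_B - m_mu:      # allowed window for B+ -> mu N with N -> mu pi
        print(f"m_N = {m_N:.1f} GeV: p(N) in B rest frame = {p_two_body(m_B, m_mu, m_N):.2f} GeV")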

LHCb: Scintillating Fibre tracker software

The installation of the scintillating-fibre tracker in LHCb's underground cavern was recently completed. This detector uses 10000 km of fibres to track particle trajectories in the LHCb detector when the LHC starts up again later this year. The light emitted by the scintillating fibres when a particle interacts with them is measured using silicon photomultipliers. The studies proposed for this project will focus on software, and could include writing a framework to monitor the detector output, improving the detector simulation or working on the data processing.
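
A monitoring framework ultimately turns raw detector output into summary quantities; the fragment below sketches that idea on a fake hit stream (random channel numbers standing in for readout channels), flagging channels whose occupancy deviates strongly from the median. This is a toy, not LHCb monitoring code.

import numpy as np

rng = np.random.default_rng(5)
n_channels = 128

# Fake hit stream: mostly uniform occupancy, with one noisy channel and one dead channel.
hits = rng.integers(0, n_channels, size=100000)
hits = np.concatenate([hits, np.full(3000, 17)])      # channel 17 is noisy
hits = hits[hits != 90]                               # channel 90 is dead

occupancy = np.bincount(hits, minlength=n_channels)
median = np.median(occupancy)
for ch in range(n_channels):
    if occupancy[ch] > 2 * median or occupancy[ch] < 0.5 * median:
        print(f"channel {ch:3d}: occupancy {occupancy[ch]} vs median {median:.0f} -> flag for checking")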

Contact: Emmy Gabriel

LHCb: Vertex detector calibration

In the summer of 2022 LHCb started data taking with an almost entirely new detector. At the point closest to the interaction point, the trajectories of charged particles are reconstructed with a so-called silicon pixel detector. The design hit resolution of this detector is about 15 microns. However, to actually reach this resolution, a precise calibration of the spatial positions of the silicon sensors needs to be performed. In this project, you will use the first data of the new LHCb detector to perform this calibration and measure the detector performance.
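
The snippet below illustrates the principle of such an alignment on a toy setup: straight-line tracks are fitted in a small telescope of planes, and the mean unbiased residual on one deliberately misaligned plane recovers the misalignment that was put in. Geometry, track parameters and the 15-micron resolution are all toy values.

import numpy as np

rng = np.random.default_rng(2)

# Toy telescope: 5 planes along z, straight tracks, 15 micron hit resolution (lengths in mm).
z = np.array([0.0, 30.0, 60.0, 90.0, 120.0])
sigma_hit = 0.015
misalign = 0.030            # plane 2 is shifted by +30 micron in x (to be recovered)
n_tracks = 5000

slopes = rng.normal(0, 0.01, n_tracks)
offsets = rng.normal(0, 1.0, n_tracks)
hits = offsets[:, None] + slopes[:, None] * z + rng.normal(0, sigma_hit, (n_tracks, len(z)))
hits[:, 2] += misalign      # apply the unknown misalignment to plane 2

# Unbiased residuals: fit each track with all planes except plane 2, then predict there.
mask = np.array([True, True, False, True, True])
residuals = np.empty(n_tracks)
for i in range(n_tracks):
    coeff = np.polyfit(z[mask], hits[i, mask], 1)      # straight-line fit
    residuals[i] = hits[i, 2] - np.polyval(coeff, z[2])

print(f"mean unbiased residual on plane 2: {1e3 * residuals.mean():.1f} "
      f"+- {1e3 * residuals.std() / np.sqrt(n_tracks):.1f} micron (input misalignment 30 micron)")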

Contact: Wouter Hulsbergen


Neutrinos: Neutrino scattering: the ultimate resolution

Neutrino telescopes like IceCube and KM3NeT aim at detecting neutrinos from cosmic sources. The neutrinos are detected with the best resolution when charged-current interactions with nucleons produce a muon, which can be detected with high accuracy (depending on the detector). A crucial ingredient in the ultimate achievable pointing accuracy of neutrino telescopes is the scattering angle between the neutrino and the muon. While published computations have investigated the cross-section of the process in great detail, this important scattering angle has not received much attention. The aim of the project is to compute and characterize the distribution of this angle, and thereby the ultimate resolution of a neutrino telescope. If successful, the results of this project can lead to a publication of interest to the neutrino telescope community.

Depending on your interests, the study could be based on a first-principles calculation (using the deep-inelastic scattering formalism), include state-of-the-art parton distribution functions, and/or exploit existing event-generation software for a more experimental approach.
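
To set the scale of the effect, the short calculation below uses leading-order DIS kinematics to express the neutrino-muon angle in terms of Bjorken x, the inelasticity y and the neutrino energy, and evaluates it at typical kinematic values. Lepton masses are neglected; this is only a back-of-the-envelope cross-check, not the full calculation proposed in the project.

import numpy as np

M_N = 0.938          # nucleon mass [GeV]

def muon_angle(E_nu, x, y):
    # Neutrino-muon angle (rad) in LO DIS kinematics, neglecting lepton masses.
    # Uses Q^2 = 2 M_N E_nu x y and Q^2 = 2 E_nu E_mu (1 - cos theta) with E_mu = (1 - y) E_nu.
    cos_theta = 1.0 - M_N * x * y / (E_nu * (1.0 - y))
    return np.arccos(np.clip(cos_theta, -1.0, 1.0))

# Typical (x, y) values just to set the scale (the project would use the full distributions).
for E_nu in [1e2, 1e3, 1e4, 1e5]:                      # GeV
    theta = muon_angle(E_nu, x=0.1, y=0.3)
    print(f"E_nu = {E_nu:8.0f} GeV: theta(nu, mu) ~ {np.degrees(theta):.3f} deg")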

Contacts: Aart Heijboer

Neutrinos: acoustic detection of ultra-high energy neutrinos

The study of cosmic neutrinos with energies above 10^17 eV, the so-called ultra-high-energy neutrinos, provides a unique view of the universe and may provide insight into the origin of the most violent astrophysical sources, such as gamma-ray bursts, supernovae or even dark matter. In addition, the observation of high-energy neutrinos may provide a unique tool to study interactions at high energies. The energy deposition of these extreme neutrinos in water induces a thermo-acoustic signal, which can be detected using sensitive hydrophones. The expected neutrino flux is, however, extremely low, and the signal that neutrinos induce is small. TNO is presently developing sensitive hydrophone technology based on fibre optics. Optical fibres form a natural way to create a distributed sensing system. Using this technology a large-scale neutrino telescope can be built in the deep sea. TNO is aiming for a prototype hydrophone which will form the building block of a future telescope.

The work will be executed at the Nikhef institute and/or the TNO laboratories in Delft. In this project master students have the opportunity to contribute in the following ways:

Project 1: Hardware development of fibre-optic hydrophone technology. Goal: characterize existing prototype optical-fibre hydrophones in an anechoic basin at the TNO laboratory, including data collection, calibration, characterization, and analysis of the consequences for the design of future acoustic hydrophone neutrino telescopes. Keywords: optical fibre technology, signal processing, electronics, lab work.

Project 2: Investigation of ultra-high-energy neutrinos and their interactions with matter. Goal: discriminate neutrino signals from background noise, in particular clicks from whales and dolphins in the deep sea, and study the impact on the physics reach of future acoustic hydrophone neutrino telescopes. Keywords: Monte Carlo simulations, particle physics, neutrino physics, data-analysis algorithms.
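
As a flavour of the signal-processing side of Project 2, the sketch below buries a generic bipolar pressure pulse (a stand-in for the thermo-acoustic signal; amplitude, duration, sampling rate and noise level are invented) in white noise and recovers its position with a matched filter.

import numpy as np

rng = np.random.default_rng(8)
fs = 200e3                                  # sampling rate [Hz], illustrative
t = np.arange(-5e-4, 5e-4, 1.0 / fs)

# Generic bipolar pulse (derivative of a Gaussian) as a stand-in for the thermo-acoustic signal.
tau = 3e-5
template = -t / tau * np.exp(-t**2 / (2 * tau**2))
template /= np.linalg.norm(template)

noise = rng.normal(0, 1.0, 20000)
stream = noise.copy()
inject_at = 12345
stream[inject_at:inject_at + template.size] += 5.0 * template   # signal-to-noise set by hand

# Matched filter: correlate the noisy stream with the template.
mf = np.correlate(stream, template, mode="valid")
print(f"pulse injected at sample {inject_at}, matched filter peaks at {np.argmax(np.abs(mf))}")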

Further information: Info on ultra-high energy neutrinos can be found at: http://arxiv.org/abs/1102.3591; Info on acoustic detection of neutrinos can be found at: http://arxiv.org/abs/1311.7588

Contact: Ernst Jan Buis or Ivo van Vulpen

Neutrinos: Oscillation analysis with the first data of KM3NeT

The neutrino telescope KM3NeT is under construction in the Mediterranean Sea, aiming to detect cosmic neutrinos. Its first few strings with sensitive photodetectors have been deployed at both the Italian and the French detector sites. Already these few strings make it possible to reconstruct the abundant muons stemming from interactions of cosmic rays with the atmosphere and to identify neutrino interactions in the detector. In this project the available data will be used together with simulations to best reconstruct the event topologies and to optimally identify and reconstruct the first neutrino interactions in the KM3NeT detector. The data will then be used to measure neutrino oscillation parameters and to prepare for a future neutrino mass ordering determination.
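
For orientation, the few lines below evaluate the two-flavour vacuum survival probability for vertically up-going atmospheric muon neutrinos crossing the Earth, with oscillation parameters rounded to typical values; the real analysis uses three flavours, matter effects and the full detector response, and Python is used here only for illustration (the analysis itself is ROOT/C++ based).

import numpy as np

# Two-flavour vacuum survival probability P(nu_mu -> nu_mu) (approximate parameter values).
sin2_2theta23 = 0.99        # near-maximal mixing
dm2_31 = 2.5e-3             # eV^2
L = 12700.0                 # km, vertically up-going through the Earth

def p_survival(E_GeV):
    return 1.0 - sin2_2theta23 * np.sin(1.267 * dm2_31 * L / E_GeV) ** 2

for E in [5.0, 10.0, 25.0, 50.0, 100.0]:
    print(f"E = {E:6.1f} GeV: P(nu_mu -> nu_mu) = {p_survival(E):.2f}")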

Programming skills are essential; mostly ROOT and C++ will be used.

Contact: Ronald Bruijn and Paul de Jong


Neutrinos: the Deep Underground Neutrino Experiment (DUNE)

The Deep Underground Neutrino Experiment (DUNE) is under construction in the USA and will consist of a powerful neutrino beam originating at Fermilab, a near detector at Fermilab, and a far detector in the SURF facility in Lead, South Dakota, 1300 km away. During their travel, neutrinos oscillate and a fraction of the neutrino beam changes flavour; DUNE will determine the neutrino oscillation parameters to unrivalled precision and try to make a first detection of CP violation in neutrinos. In this project, various elements of DUNE can be studied, including the neutrino oscillation fit, neutrino physics with the near detector, event reconstruction and classification (including machine learning), or elements of data selection and triggering.

Contact: Paul de Jong

Neutrinos: relic neutrino detection with PTOLEMY

PTOLEMY aims to make the first direct observation of the Big Bang relic neutrinos (the cosmic neutrino background, CνB) by resolving the β-decay endpoint of atomic tritium (the neutrino capture target) to O(meV) precision. This remains an outstanding test of the Standard Model in an expanding universe. Not only does the CνB carry a signal from the hot, dense universe only one second after the Big Bang, it also helps to constrain the balance of hot versus cold dark matter responsible for its evolution. In doing so, the PTOLEMY experiment would also measure the lowest neutrino mass, an as-yet-unknown fundamental constant. The experiment is currently in the prototyping phase, and the group at Nikhef is responsible for developing the radio-frequency (RF) system used for the cyclotron-radiation (CR) based trigger and tracking. This component will provide the trajectory of electrons entering the novel transverse drift filter, constraining the electrons' energy losses before they reach the cryogenic calorimeter, which in turn records their final energy. The focus of this project will be modelling CR and its detection for the purposes of single-electron spectroscopy and optimised trajectory reconstruction. There is also the opportunity to test hardware and readout electronics for the prototype RF system.

Contact: James Vincent Mead
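
To get a feel for the RF signal mentioned above, the snippet below evaluates the relativistic cyclotron frequency f = eB / (2 pi gamma m_e) for electrons near the tritium beta endpoint (18.6 keV) in an assumed 1 T field, showing the kinetic-energy dependence that single-electron CR spectroscopy exploits. The field value is a placeholder, not the PTOLEMY design value.

import numpy as np

e = 1.602176634e-19        # C
m_e = 9.1093837015e-31     # kg
c = 299792458.0            # m/s
keV = 1e3 * e              # J

B = 1.0                    # tesla (placeholder field strength)

def cyclotron_frequency(E_kin_keV, B_tesla):
    # Relativistic cyclotron frequency f = e*B / (2*pi*gamma*m_e).
    gamma = 1.0 + E_kin_keV * keV / (m_e * c**2)
    return e * B_tesla / (2.0 * np.pi * gamma * m_e)

f_endpoint = cyclotron_frequency(18.6, B)
for dE in [0.0, -0.1, -1.0]:                     # keV below the endpoint
    f = cyclotron_frequency(18.6 + dE, B)
    print(f"E = {18.6 + dE:5.1f} keV: f_c = {f / 1e9:.4f} GHz "
          f"(shift {(f - f_endpoint) / 1e6:+.2f} MHz)")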

Theoretical Particle Physics: Effective Field Theories of Particle Physics from low- to high-energies

Known elementary matter particles exhibit a surprising three-fold structure. The particles belonging to each of these three “generations” seem to display a remarkable pattern of identical properties, yet have vastly different masses. This puzzling pattern is unexplained. Equally unexplained is the bewildering imbalance between matter and anti-matter observed in the universe, despite the minimal differences in the properties of particles and anti-particles. These two mystifying phenomena may originate from a deeper, still unknown, fundamental structure characterised by novel types of particles and interactions, whose unveiling would revolutionise our understanding of nature. The ultimate goal of particle physics is to uncover a fundamental theory which allows the coherent interpretation of phenomena taking place at all energy and distance scales. In this project, the student will exploit the Standard Model Effective Field Theory (SMEFT) formalism, which allows the theoretical interpretation of particle physics data in terms of new fundamental quantum interactions that relate seemingly disconnected processes, with minimal assumptions on the nature of an eventual UV-complete theory that replaces the Standard Model. Specifically, the goal is to connect measurements from the ATLAS, CMS, and LHCb experiments at CERN's LHC and to jointly interpret this information with that provided by other experiments, including very low-energy probes such as the anomalous magnetic moment of the muon or the electric dipole moments of the electron and neutron.

This project will be based on theoretical calculations in particle physics, numerical simulations in Python, analysis of existing data from the LHC and other experiments, as well as formal developments in understanding the operator structure of effective field theories. Depending on the student profile, sub-projects with a strong computational and/or machine-learning component are also possible, for instance constructing new observables with optimal sensitivity to New Physics effects as encoded by the SMEFT higher-dimensional operators. Topics that can be considered in this project include the interpretation of novel physical observables at the LHC and their integration into the global SMEFiT analysis, the matching of EFTs to UV-complete theories and their phenomenological analyses, projections for the impact of data from future colliders on the SMEFT parameter space, the synergies between EFT studies and proton structure fits, and the matching to the Weak Effective Field Theory to include data on flavour observables such as B-meson decays.
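
The snippet below is a deliberately minimal caricature of the global-fit machinery: one Wilson coefficient modifying a handful of pseudo-observables linearly and quadratically, constrained by a chi-square scan against pseudo-data. All numbers (SM predictions, EFT sensitivities, uncertainties) are invented and are not SMEFiT results.

import numpy as np

rng = np.random.default_rng(4)

# Pseudo-observables with an assumed EFT dependence theory(c) = sm * (1 + a*c + b*c^2)
# on a single Wilson coefficient c (all numbers invented).
sm    = np.array([1.00,  0.50, 2.00, 0.80])
a     = np.array([0.10, -0.30, 0.05, 0.20])
b     = np.array([0.02,  0.05, 0.01, 0.03])
sigma = np.array([0.05,  0.04, 0.10, 0.06])

c_true = 0.4
data = sm * (1 + a * c_true + b * c_true**2) + rng.normal(0, sigma)

def chi2(c):
    theory = sm * (1 + a * c + b * c**2)
    return np.sum(((data - theory) / sigma) ** 2)

c_grid = np.linspace(-2, 2, 4001)
chi2_grid = np.array([chi2(c) for c in c_grid])
best = c_grid[np.argmin(chi2_grid)]
inside = c_grid[chi2_grid < chi2_grid.min() + 1.0]     # Delta chi2 < 1 interval
print(f"best-fit c = {best:.2f}, 68% interval [{inside.min():.2f}, {inside.max():.2f}] (injected {c_true})")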

References: https://arxiv.org/abs/2105.00006 , https://arxiv.org/abs/2302.06660, https://arxiv.org/abs/2211.02058 , https://arxiv.org/abs/1901.05965 , https://arxiv.org/abs/1906.05296 ,  https://arxiv.org/abs/1908.05588,  https://arxiv.org/abs/1905.05215. see also this project description.

Contacts: Juan Rojo

Theoretical Particle Physics: High-energy neutrino-nucleon interactions at the Forward Physics Facility

High-energy collisions at the High-Luminosity Large Hadron Collider (HL-LHC) produce a large number of particles along the beam collision axis, outside the acceptance of existing experiments. The proposed Forward Physics Facility (FPF), to be located several hundred metres from the ATLAS interaction point and shielded by concrete and rock, will host a suite of experiments to probe Standard Model (SM) processes and search for physics beyond the Standard Model (BSM). High-statistics neutrino detection will provide valuable data for fundamental topics in perturbative and non-perturbative QCD and in weak interactions. Experiments at the FPF will make it possible to exploit synergies between forward particle production at the LHC and astroparticle physics. The FPF has the promising potential to probe our understanding of the strong interactions as well as of proton and nuclear structure, providing access to both the very low-x and the very high-x regions of the colliding protons. The former regime is sensitive to novel QCD production mechanisms, such as BFKL effects and non-linear dynamics, as well as to the gluon parton distribution function (PDF) down to x=1e-7, well beyond the coverage of other experiments and providing key inputs for astroparticle physics. In addition, the FPF acts as a neutrino-induced deep-inelastic scattering (DIS) experiment with TeV-scale neutrino beams. The resulting measurements of neutrino DIS structure functions represent a valuable handle on the partonic structure of nucleons and nuclei, particularly their quark flavour separation, that is fully complementary to the charged-lepton DIS measurements expected at the upcoming Electron-Ion Collider (EIC).

In this project, the student will carry out updated predictions for the neutrino fluxes expected at the FPF, assess the precision with which neutrino cross-sections will be measured, and quantify their impact on proton and nuclear structure by means of machine-learning tools within the NNPDF framework and state-of-the-art calculations in perturbative Quantum Chromodynamics. This project contributes to ongoing work within the FPF Initiative towards a Conceptual Design Report (CDR) to be presented within two years. Topics that can be considered as part of this project include the assessment of the extent to which nuclear modifications of the free-proton PDFs can be constrained by FPF measurements, the determination of the small-x gluon PDF from suitably defined observables at the FPF and its implications for ultra-high-energy particle astrophysics, the study of the intrinsic charm content of the proton and its consequences for the FPF physics programme, and the validation of models for neutrino-nucleon cross-sections in the region beyond the validity of perturbative QCD.
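
For orientation, the snippet below evaluates the leading-order charged-current neutrino DIS double-differential cross-section in terms of the structure functions F2 and xF3, with crude toy parametrisations standing in for real PDFs (the project would use NNPDF sets and higher-order QCD). The Callan-Gross relation is assumed and the target-mass term is neglected.

import numpy as np

G_F = 1.1663787e-5          # GeV^-2
M_N = 0.938                 # GeV
M_W = 80.38                 # GeV
GEV2_TO_CM2 = 0.3894e-27    # (hbar*c)^2 in cm^2 GeV^2

# Crude toy structure functions (real analyses use NNPDF PDFs and NLO/NNLO QCD).
def F2(x, Q2):
    return 3.0 * x**0.5 * (1 - x)**3

def xF3(x, Q2):
    return 1.5 * x**0.5 * (1 - x)**3

def d2sigma_dxdy(E_nu, x, y, neutrino=True):
    # LO CC DIS cross-section d^2(sigma)/dx dy in cm^2 (Callan-Gross, no target-mass term).
    Q2 = 2.0 * M_N * E_nu * x * y
    prop = (M_W**2 / (Q2 + M_W**2)) ** 2
    sign = +1.0 if neutrino else -1.0
    bracket = (1 - y + 0.5 * y**2) * F2(x, Q2) + sign * (y - 0.5 * y**2) * xF3(x, Q2)
    return G_F**2 * M_N * E_nu / np.pi * prop * bracket * GEV2_TO_CM2

E_nu = 1000.0               # TeV-scale neutrino energy, typical of the FPF
print(f"d2sigma/dxdy at x=0.1, y=0.5: {d2sigma_dxdy(E_nu, 0.1, 0.5):.2e} cm^2")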

References: https://arxiv.org/abs/2203.05090, https://arxiv.org/abs/2109.10905 ,https://arxiv.org/abs/2208.08372 , https://arxiv.org/abs/2201.12363 , https://arxiv.org/abs/2109.02653, https://github.com/NNPDF/ see also this project description.

Contacts: Juan Rojo

Theoretical Particle Physics: Probing the origin of the proton spin with machine learning

At energy-frontier facilities such as the Large Hadron Collider (LHC), scientists study the laws of nature in their quest for novel phenomena both within and beyond the Standard Model of particle physics. An in-depth understanding of the quark and gluon substructure of protons and heavy nuclei is crucial to address pressing questions ranging from the nature of the Higgs boson to the origin of cosmic neutrinos. The key to addressing some of these questions is a universal analysis of nucleon structure, based on the simultaneous determination of the momentum and spin distributions of quarks and gluons and of their fragmentation into hadrons. This effort requires combining an extensive experimental dataset and cutting-edge theory calculations within a machine-learning framework in which neural networks parametrise the underlying physical laws while minimising ad-hoc model assumptions. The upcoming Electron-Ion Collider (EIC), to start taking data in 2029, will be the world's first ever polarised lepton-hadron collider and will offer a plethora of opportunities to address key open questions in our understanding of the strong nuclear force, such as the origin of the mass and the intrinsic angular momentum (spin) of hadrons, and whether there exists a state of matter which is entirely dominated by gluons. To fully exploit this scientific potential, novel analysis methodologies need to be developed that make it possible to carry out large-scale, coherent interpretations of measurements from the EIC and other high-energy colliders.

In this project, the student will carry out a new global analysis of the spin structure of the proton by means of the machine-learning tools provided by the NNPDF open-source fitting framework and state-of-the-art calculations in perturbative Quantum Chromodynamics, and integrate it with the corresponding global NNPDF analyses of unpolarised proton and nuclear structure, in the framework of a combined global analysis of non-perturbative QCD. Specifically, the project aims to realise an NNLO global fit of polarised quark and gluon PDFs that combines all available data with state-of-the-art perturbative QCD calculations, and to study the phenomenological implications for other experiments, including the EIC, for the spin content of the proton, for comparisons with lattice QCD calculations, and for non-perturbative models of hadron structure.
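
As a cartoon of what a polarised fit delivers, the lines below integrate toy polarised quark-singlet and gluon distributions (purely illustrative parametrisations, not NNPDF results) to obtain their first moments, i.e. the quark and gluon contributions entering the proton spin sum rule 1/2 = (1/2) DeltaSigma + DeltaG + L.

import numpy as np
from scipy.integrate import quad

# Toy polarised densities at some scale (purely illustrative shapes and normalisations).
def delta_sigma(x):          # quark-singlet helicity density
    return 0.77 * x**(-0.2) * (1 - x)**3

def delta_g(x):              # gluon helicity density
    return 0.9 * x**(-0.1) * (1 - x)**5

DeltaSigma, _ = quad(delta_sigma, 1e-5, 1.0)
DeltaG, _ = quad(delta_g, 1e-5, 1.0)
L = 0.5 - 0.5 * DeltaSigma - DeltaG          # orbital angular momentum from the sum rule
print(f"Delta Sigma = {DeltaSigma:.2f}, Delta G = {DeltaG:.2f}, inferred L = {L:.2f}")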

References: https://arxiv.org/abs/2201.12363, https://arxiv.org/abs/2109.02653 , https://arxiv.org/abs/2103.05419, https://arxiv.org/abs/1404.4293 , https://inspirehep.net/literature/1302398, https://github.com/NNPDF/ see also this project description.

Contacts: Juan Rojo

Theoretical Particle Physics: Charged lepton flavor violation in neutrino mass models

The nonzero value of neutrino masses requires an explanation beyond the Standard Model of particle physics. A promising solution involves the existence of extra neutrinos, often called right-handed or sterile neutrinos. These models elegantly explain neutrino masses and can also be connected to other puzzles such as the absence of anti-matter in our universe. In this project you will investigate potential experimental signatures of sterile neutrinos through decays that are extremely rare in the Standard Model. Examples are muon decays to electrons and photons, or muon + neutron -> electron + neutron. You will perform Quantum Field Theory calculations within the neutrino-extended Standard Model to compute the rates of these processes and compare them to experimental sensitivities.

Contacts: Jordy de Vries

Theoretical Particle Physics: The electric dipole moment of paramagnetic systems in the Standard Model

Electric dipole moments (EDMs) of elementary particles, hadrons, nuclei, atoms, and molecules would indicate the violation of CP symmetry. The Standard Model (SM) contains CP violation in the weak interaction in the so-called CKM matrix (the quark-mixing matrix), but this leads to EDMs that are too small to be seen. At least, this is often claimed. In this work we will reinvestigate the computation of the EDMs of systems that are used in state-of-the-art experiments. In particular, we will compute a CP-violating interaction between electrons and nucleons mediated by the SM weak interaction. During this project you will obtain a deep understanding of the Standard Model and of explicit quantum field theory calculations across a wide range of energy scales.

Contacts: Jordy de Vries



Finished master projects

See: Last year's MSc Projects