The following Master thesis research projects are offered at Nikhef. If you are interested in one of these projects, please contact the coordinator listed with the project.

== Projects with a 2024 start [WORK IN PROGRESS, please look below for older projects] ==

=== ALICE: Search for new physics with 4D tracking at the most sensitive vertex detector at the LHC ===
  
With the newly installed Inner Tracking System, consisting fully of monolithic detectors, ALICE is more sensitive to particles with low transverse momenta than ATLAS and CMS, and this will be even more true for the ALICE upgrade detector in 2033. By using timing information along a track, this detector could be even more sensitive to long-lived particles that leave peculiar signatures in the tracker, such as disappearing or kinked tracks. In this project you will investigate how timing information in the different tracking layers can improve, or even enable, a search for new physics beyond the Standard Model in ALICE. If you demonstrate the possibility of major improvements, this can have real consequences for the choice of sensors for the ALICE inner tracker upgrade.

''Contact: [mailto:jory.sonneveld@nikhef.nl Jory Sonneveld] and [mailto:Panos.Christakoglou@nikhef.nl Panos Christakoglou]''

=== ALICE: Connecting the hot and cold QCD matter by searching for the strongest magnetic field in nature ===

In a non-central collision between two Pb ions, with a large impact parameter, the charged nucleons that do not participate in the interaction (called spectators) create strong magnetic fields. A back-of-the-envelope calculation using the Biot-Savart law puts the magnitude of this field close to 10^19 Gauss, in agreement with state-of-the-art theoretical calculations, making it the strongest magnetic field in nature. The presence of this field could have direct implications for the motion of final-state particles. The magnetic field, however, decays rapidly, at a rate that depends on the electric conductivity of the medium, which is experimentally poorly constrained. Establishing the presence of this magnetic field experimentally, the main goal of this project, has not yet been achieved, and could also have implications for measurements of gravitational waves emitted from the merger of neutron stars.
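The back-of-the-envelope estimate mentioned above can be reproduced in a few lines of Python; the spectator charge, Lorentz factor and impact parameter below are illustrative round numbers chosen for this sketch, not ALICE analysis values.

```python
# Crude Biot-Savart estimate of the magnetic field from spectator charges in a
# non-central Pb-Pb collision. All inputs are illustrative assumptions.
MU0_OVER_4PI = 1e-7   # T*m/A
E_CHARGE = 1.602e-19  # C
C_LIGHT = 3.0e8       # m/s

Z_SPECTATOR = 82      # treat one whole Pb nucleus as the moving charge (upper bound)
GAMMA = 2700          # assumed Lorentz factor per beam at LHC Pb-Pb energies
B_IMPACT = 7e-15      # m, assumed impact parameter (~7 fm)

# Peak transverse field of a boosted point charge at perpendicular distance b:
# B ~ (mu0 / 4 pi) * Z * e * v * gamma / b^2
b_tesla = MU0_OVER_4PI * Z_SPECTATOR * E_CHARGE * C_LIGHT * GAMMA / B_IMPACT**2
b_gauss = b_tesla * 1e4
print(f"B ~ {b_gauss:.1e} G")
```

Varying the Lorentz factor and impact parameter within reasonable ranges moves this crude estimate between roughly 10^18 and 10^20 G, consistent in order of magnitude with the value quoted above.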
  
''Contact: [mailto:Panos.Christakoglou@nikhef.nl Panos Christakoglou]''

=== ALICE/LHCb Tracking: Innovative tracking techniques exploiting modern heterogeneous architectures ===

The reconstruction of charged-particle tracks is one of the most computationally demanding components of modern high-energy physics experiments. In particular, the upcoming High-Luminosity Large Hadron Collider (HL-LHC) makes fast tracking algorithms that use modern computing architectures with many cores and accelerators essential. In this project we will investigate innovative, machine-learning-based, experiment-agnostic tracking algorithms on modern architectures, e.g. GPUs and FPGAs.

''Contact: [mailto:jdevries@nikhef.nl Jacco de Vries] and [mailto:Panos.Christakoglou@nikhef.nl Panos Christakoglou]''

=== ATLAS: Charged particle tracking in the ATLAS detector with new machine learning techniques ===

This project concerns the application of new machine learning techniques to the problem of track reconstruction at the ATLAS detector at CERN. While algorithms that construct particle tracks from low-level detector information such as particle hits have been around for decades, recent developments in machine learning open up new opportunities to improve these algorithms significantly. In particular, transformer neural networks (the architecture that ChatGPT is based on) and graph neural networks will be studied for this problem. There is, however, a range of available options, and the project leaves the student free to choose particular types of networks that are of special interest.

At the start of this project, simplified test data will be used for initial model development. Upon successful completion of this stage, simulated data from the actual ATLAS detector will be used. The student will need some familiarity with programming in Python and an interest in machine learning, but a strong physics background is not necessary. In this project the student will be able to contribute to fundamental physics research and will become familiar with state-of-the-art machine learning models.

Contact: ''[mailto:zwolffs@nikhef.nl Zef Wolffs] and [mailto:Ivo.van.Vulpen@nikhef.nl Ivo van Vulpen]''

=== ATLAS: Search for very rare Higgs decays to second-generation fermions ===

While the Higgs boson coupling to fermions of the third generation has been established experimentally, the investigation of the Higgs boson coupling to the light fermions of the second generation will be a central project of the current data-taking period of the LHC (2022-2025). The Higgs boson decay to muons is the most sensitive channel for probing this coupling. In this project, event selection algorithms for Higgs boson decays to muons in associated production with a gauge boson (VH) are developed, with the aim of distinguishing signal events from background processes such as Drell-Yan and WZ boson production. For this purpose, the candidate will implement and validate deep learning algorithms and extract the final results based on a fit to the output of the deep learning classifier.

''Contact: [mailto:oliver.rieger@nikhef.nl Oliver Rieger] and [mailto:verkerke@nikhef.nl Wouter Verkerke]''

=== ATLAS: Advanced deep-learning techniques for lepton identification ===

The ATLAS experiment at the Large Hadron Collider facilitates a broad spectrum of physics analyses. A critical aspect of these analyses is the efficient and accurate identification of leptons, which is crucial for both signal detection and background rejection. Distinguishing between prompt leptons, arising directly from the collision, and non-prompt leptons, originating from heavy-flavour hadron decays, is a challenging task. This project aims to develop and implement advanced deep-learning models to push lepton identification beyond the capabilities of the current standard methods.

''Contact: [mailto:oliver.rieger@nikhef.nl Oliver Rieger] and [mailto:verkerke@nikhef.nl Wouter Verkerke]''

=== ATLAS: Probing CP-violation in the Higgs sector with the ATLAS experiment ===

The Standard Model Effective Field Theory (SMEFT) provides a systematic approach to test the impact of new physics at the energy scale of the LHC through higher-dimensional operators. The scarcity of antimatter in the cosmos arises from slight differences in the behavior of particles and their antiparticle counterparts, known as CP-violation. The current data-taking period of the LHC is expected to yield a comprehensive dataset, enabling the investigation of CP-odd SMEFT operators in the Higgs boson's interactions with other particles. The interpretation of experimental data using SMEFT requires a particular interest in solving complex technical challenges, advanced statistical techniques, and a deep understanding of particle physics.

''Contact: [mailto:lbrenner@nikhef.nl Lydia Brenner], [mailto:oliver.rieger@nikhef.nl Oliver Rieger] and [mailto:verkerke@nikhef.nl Wouter Verkerke]''

=== ATLAS: Signal and background sensitivity in Standard Model Effective Field Theory (SMEFT) ===

Complex statistical combinations of large sectors of the ATLAS scientific program are currently being used to obtain the best experimental sensitivity to SMEFT parameters. However, to achieve a fully consistent investigation of SMEFT, and to push the limit of what is possible with the data already collected, the effects of background modifications need to be included. Joining our efforts on this topic means contributing to a cutting-edge investigation that requires both a particular motivation for solving complex technical challenges and a broad knowledge of experimental particle physics.

Contact: ''[mailto:avisibil@nikhef.nl Andrea Visibile] and [mailto:lbrenner@nikhef.nl Lydia Brenner]''

=== ATLAS: Performing a Bell test in Higgs to di-boson decays ===

Recently, theorists [1] have proposed performing a Bell test in Higgs to di-boson decays. This is a fundamental test not only of quantum mechanics but also of quantum field theory, using the elusive scalar Higgs particle. At Nikhef we have started to brainstorm on the experimental aspects of this challenging measurement. Thanks to the studies of a PhD student [2], we have considerable experience in the reconstruction of the Higgs rest-frame angles that are essential to perform a Bell test. Is there a master student who wants to join our efforts to study the ''"spooky action at a distance"'' in Higgs to WW decays?

''Contact: [mailto:Peter.Kluit@nikhef.nl Peter Kluit]''

[1] Review article <nowiki>https://arxiv.org/pdf/2402.07972.pdf</nowiki>

[2] <nowiki>https://www.nikhef.nl/pub/services/biblio/theses_pdf/thesis_R_Aben.pdf</nowiki>

=== ATLAS: A new timing detector - the HGTD ===

The ATLAS detector is going to get a new ability: a timing detector. The HGTD allows us to reconstruct tracks not only in the three dimensions of space, but adds a very precise measurement (at the picosecond level) of the time at which particles pass its sensitive layers. The added information helps to construct the trajectories of the particles created at the LHC in four dimensions and will ultimately lead to a better reconstruction of physics at ATLAS. The new HGTD detector is still under construction, and work needs to be done on different levels, such as understanding the detector response (taking measurements in the lab and performing simulations) or developing algorithms to reconstruct the particle trajectories (programming and analysis work).

'''Several projects are available within the context of the new HGTD detector:'''

# One can choose to focus on '''''the impact on physics analysis performance''''' by studying how the timing measurements can be included in the reconstruction of tracks, and what effect this has on how well we can understand the physical processes occurring in the LHC collisions. With this work you will be part of the ATLAS group at Nikhef.
# The second possibility is to '''''test the sensors in our lab''''' and in test-beam setups at CERN/DESY. The analysis will be performed in the context of the ATLAS HGTD test-beam group, in connection with both the ATLAS group and the R&D department at Nikhef.
# The third is to contribute to an ongoing effort '''''to precisely simulate/model the silicon avalanche detectors''''' in the Allpix2 framework. There are several models that try to describe the detector response; they depend on operating temperature, field strengths and radiation damage. We are getting close to being able to model our detector - but are not there yet. This work will be within the ATLAS group.

Contact: ''[mailto:hella.snoek@nikhef.nl Hella Snoek]''

=== ATLAS: Studying rare modes of Higgs boson production at the LHC ===

The Higgs boson is a crucial piece of the Standard Model and its most recently discovered particle. Studying Higgs boson production and decay at the LHC might hold the key to unlocking new information about the physical laws governing our universe. With the LHC now in its third run, we can also use the enormous amounts of data being collected to study Higgs boson production modes we have not previously been able to access. For instance, we can look at the production of a Higgs boson via the fusion of two vector bosons, accompanied by the emission of a photon, with a subsequent H->WW decay. This final state is experimentally distinctive and should be accessible with the current LHC dataset. It is also theoretically interesting because it probes the Higgs boson's interaction with W bosons. This exact interaction is a cornerstone of electroweak symmetry breaking, the process by which particles gain mass, so studying it provides a window onto a fundamental part of the Standard Model. This project will study the feasibility of measuring this or another rare Higgs production mode using H->WW decays, providing a chance to be involved in the design of an analysis from the ground up.

''Contact: [mailto:rhayes@nikhef.nl Robin Hayes], [mailto:f.dias@nikhef.nl Flavia de Almeida Dias]''

=== ATLAS: Exploring triboson polarisation in loop-induced processes at the LHC ===

Spin is a fundamental, quantum mechanical property carried by (most) elementary particles. When high-energy particles scatter, their spin influences how angular momentum propagates through the process and ultimately how the final-state particles are (geometrically) distributed. Helicity is the projection of the spin vector onto the momentum. For example, in the loop-induced process gg > W+W-Z, the angular separation between the various decay products of the W and Z bosons depends on the helicity polarisation of the intermediate W and Z bosons. The aim of this project is to explore helicity polarisation in multiboson processes, specifically the gg > WWZ process, at the Large Hadron Collider. This project is at the interface between theory and experiment, and you will work with Monte Carlo generators, analysis design and sensitivity studies.
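As a minimal numerical illustration of the definition just given (helicity as the projection of the spin vector on the momentum direction), with made-up vectors:

```python
import numpy as np

# Helicity = spin projected onto the unit momentum vector.
# The spin and momentum values below are invented, purely for illustration.
p = np.array([10.0, 5.0, 40.0])   # momentum (GeV), mostly along z
s = np.array([0.0, 0.0, 0.5])     # spin vector of a spin-1/2 particle along z

helicity = np.dot(s, p / np.linalg.norm(p))
print(f"helicity = {helicity:+.3f}")  # close to +1/2 because the motion is mostly along z
```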

''Contact: [mailto:f.dias@nikhef.nl Flavia de Almeida Dias]''

=== ATLAS: High-Performance Simulations for High-Energy Physics Experiments ===

The role of simulation and synthetic data generation in High-Energy Physics (HEP) research is profound. While physics-accurate simulation frameworks are available that provide the most realistic data syntheses, these tools are slow. Additionally, the output of physics-accurate simulations is closely tied to the experiment that the simulation was developed for and to its software.

Fast simulation frameworks, on the other hand, can drastically simplify the simulation, while still striking a balance between the speed and the accuracy of the simulated events. The applications of simplified simulations and data are numerous. We will focus on the role of such data as an enabler for Machine Learning (ML) model design research.

This project aims to extend the REDVID simulation framework [1, 2] through the addition of new features. The features considered for this iteration include:
*Interaction with common Monte Carlo event generators: To calculate hit points for imported events
*Addition of basic magnetic field effect: Simulation of a simplified, uniform magnetic field, affecting charged particle trajectories
*Inclusion of pile-up effects during simulation: Multiple particle collisions occurring in close vicinity
*Indication of bunch size
*Spherical coordinates
*Vectorised helical tracks
*Considerations for reproducibility of collision events
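To give a flavour of two of the listed features (a uniform magnetic field along the beam axis and vectorised helical tracks), a minimal sketch of helix generation is shown below; the function name, parametrisation and units are assumptions for this illustration and not part of REDVID:

```python
import numpy as np

def helix_points(pt_gev, pz_gev, q_sign, b_tesla, n_points=100, t_max=1.0):
    """Vectorised positions along a helical track in a uniform B field along z.

    Bending radius in metres from pT [GeV] and B [T]: r = pT / (0.3 * B).
    Returns an (n_points, 3) array of (x, y, z), starting at the origin.
    """
    r = pt_gev / (0.3 * b_tesla)           # bending radius [m]
    t = np.linspace(0.0, t_max, n_points)  # dimensionless path parameter
    phi = q_sign * t                       # turning angle, sign set by the charge
    x = r * np.sin(phi)
    y = q_sign * r * (1.0 - np.cos(phi))
    z = (pz_gev / pt_gev) * r * t          # constant pitch along the beam axis
    return np.column_stack([x, y, z])

track = helix_points(pt_gev=1.0, pz_gev=2.0, q_sign=+1, b_tesla=2.0)
print(track.shape)  # (100, 3)
```

The transverse projection of the generated points lies on a circle of the expected bending radius, which is a convenient cross-check for reproducibility.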
  
The project is part of an ongoing effort to train and test ML models for particle track reconstruction for the HL-LHC. The improved version of REDVID can be used by the student and other users to generate training data for ML models. Depending on progress and interest, a secondary goal could be to perform comparisons with physics-accurate simulations or to investigate the impact of the new features on developed ML models.

'''Bonus:''' The student will be encouraged and supported to publish the output of this study in a relevant journal, such as "Data in Brief" by Elsevier.
  
====Appendix - Terminology====
The terminology for the considered simulations and their features is domain-specific and is explained below:

*Synthetic data: Data generated during a simulation, which resembles real data to a limited extent.
*Physics-accurate simulation: A type of simulation that closely models real-world physical interactions and uses physics formulas to achieve this.
*Complexity-aware simulation framework: A simulator which can be configured with different levels of simulation complexity, bringing the simulation closer to or further from the real-world case.
*Complexity-reduced data set: Simplified data resulting from simplified simulations, in contrast to real data or data generated by physics-accurate simulations.

==== References ====
[1] U. Odyurt et al. 2023. "Reduced Simulations for High-Energy Physics, a Middle Ground for Data-Driven Physics Research". URL: https://doi.org/10.48550/arXiv.2309.03780

[2] U. Odyurt. 2023. "REDVID Simulation Framework". URL: https://virtualdetector.com/redvid

Contact: ''[mailto:uodyurt@nikhef.nl dr. ir. Uraz Odyurt], [mailto:roel.aaij@nikhef.nl dr. Roel Aaij]''

----

=== Dark Matter: Building better Dark Matter Detectors - the XAMS R&D Setup ===

The Amsterdam Dark Matter group operates an R&D xenon detector at Nikhef. The detector is a dual-phase xenon time-projection chamber and contains about 0.5 kg of ultra-pure liquid xenon in the central volume. We use this detector for the development of new detection techniques - such as utilizing our newly installed silicon photomultipliers - and to improve the understanding of the response of liquid xenon to various forms of radiation. The results could be directly used in the XENONnT experiment, the world's most sensitive direct detection dark matter experiment at the Gran Sasso underground laboratory, or for future dark matter experiments like DARWIN. We have several interesting projects for this facility. We are looking for someone who is interested in working in a laboratory on high-tech equipment, modifying the detector, taking data and analyzing the data themselves. You will "own" this experiment.

''Contact: [mailto:decowski@nikhef.nl Patrick Decowski] and [mailto:z37@nikhef.nl Auke Colijn]''

  
===Dark Matter: Searching for Dark Matter Particles - XENONnT Data Analysis===
The XENON collaboration has used the XENON1T detector to achieve the world's most sensitive direct detection dark matter results and is currently operating the XENONnT successor experiment. The detectors operate at the Gran Sasso underground laboratory and consist of so-called dual-phase xenon time-projection chambers filled with ultra-pure xenon. Our group has an opening for a motivated MSc student to do analysis with the new data coming from the XENONnT detector. The work will consist of understanding the detector signals and applying a deep neural network to improve the (gas-)background discrimination in our Python-based analysis tool, in order to improve the sensitivity for low-mass dark matter particles. The work will continue a study started by a recent graduate. There will also be opportunities to do data-taking shifts at the Gran Sasso underground laboratory in Italy.

''Contact: [mailto:decowski@nikhef.nl Patrick Decowski] and [mailto:z37@nikhef.nl Auke Colijn]''

  
===Dark Matter: Signal reconstruction and correction in XENONnT===
XENONnT is a low-background experiment operating at the INFN Gran Sasso underground laboratory with the main goal of detecting dark matter interactions with xenon target nuclei. The detector, consisting of a dual-phase time projection chamber, is filled with ultra-pure xenon, which acts as both target and detection medium. Understanding the detector's response to various calibration sources is a mandatory step in exploiting the scientific data acquired. This MSc thesis aims to develop new methods to improve the reconstruction and correction of scintillation/ionization signals from calibration data. The student will work with modern (Python-based) analysis techniques and will collaborate with other analysts within the international XENON Collaboration.

''Contact: [mailto:mpierre@nikhef.nl Maxime Pierre], [mailto:decowski@nikhef.nl Patrick Decowski]''

===Dark Matter: The Ultimate Dark Matter Experiment - DARWIN Sensitivity Studies===
DARWIN is the "ultimate" direct detection dark matter experiment, with the goal of reaching the so-called "neutrino floor", where neutrinos become a hard-to-reduce background. The large and exquisitely clean xenon mass will allow DARWIN to also be sensitive to other physics signals, such as solar neutrinos, double-beta decay of Xe-136, axions and axion-like particles. While the experiment will only start in 2027, we are in the midst of optimizing the experiment, which is driven by simulations. We have an opening for a student to work on the GEANT4 Monte Carlo simulations for DARWIN. We are also working on a "fast simulation" that could be included in this framework. This is your opportunity to steer the optimization of a large and unique experiment. The project requires good programming skills (Python and C++) and data analysis/physics interpretation skills.

''Contact: [mailto:t.pollmann@nikhef.nl Tina Pollmann], [mailto:decowski@nikhef.nl Patrick Decowski] or [mailto:z37@nikhef.nl Auke Colijn]''

===Dark Matter: Exploring new background sources for DARWIN===
Experiments based on the xenon dual-phase time projection chamber detection technology have already demonstrated their leading role in the search for dark matter. The unprecedentedly low level of background reached by the current generation, such as XENONnT, allows such experiments to be sensitive to new rare-event physics searches, broadening their physics program. The next generation of experiments is already under consideration with the DARWIN observatory, which aims to surpass its predecessors in terms of background level and mass of the xenon target. With the increased sensitivity to new physics channels, such as the study of neutrino properties, new sources of background may arise. This MSc thesis aims to investigate potential new sources of background for DARWIN and is a good opportunity for the student to contribute to the design of the experiment. The project will rely on Monte Carlo simulation tools such as GEANT4 and FLUKA, and good programming skills (Python and C++) are advantageous.

''Contact: [mailto:mpierre@nikhef.nl Maxime Pierre], [mailto:decowski@nikhef.nl Patrick Decowski]''

===Dark Matter: Sensitive tests of wavelength-shifting properties of materials for dark matter detectors===
Rare event search experiments that look for neutrino and dark matter interactions are performed with highly sensitive detector systems, often relying on scintillators, especially liquid noble gases, to detect particle interactions. Detectors consist of structural materials that are assumed to be optically passive, and light detection systems that use reflectors, light detectors and sometimes wavelength-shifting materials. MSc theses are available related to measuring the efficiency of light detection systems that might be used in future detectors. Furthermore, measurements can be done to ensure that presumably passive materials do not fluoresce at the low level relevant to the detectors. Part of the thesis work can include Monte Carlo simulations and data analysis for current and upcoming dark matter detectors, to study the effect of different levels of desired and nuisance wavelength shifting. In this project, students will acquire skills in photon detection, wavelength-shifting technologies, vacuum systems, UV and extreme-UV optics and detector design, and optionally in Python and C++ programming, data analysis and Monte Carlo techniques.

''Contact: [mailto:Tina.Pollmann@tum.de Tina Pollmann]''

===Detector R&D: Energy Calibration of a hybrid pixel detector with the Timepix4 chip===
The Large Hadron Collider at CERN will increase its luminosity in the coming years. For the LHCb experiment, the number of collisions per bunch crossing increases from 7 to more than 40. To distinguish all tracks from the quasi-simultaneous collisions, time information will have to be used in addition to spatial information. A big step on the way to fast silicon detectors is the recently developed Timepix4 ASIC. Timepix4 consists of 448x512 pixels, but the pixels are not identical and there are pixel-to-pixel fluctuations in the time and charge measurements. The ultimate time resolution can only be achieved after calibration of both the time and energy measurements.

The goal of this project is to study the energy calibration of Timepix4. Typical research questions are: how does the resolution depend on the threshold and the Krummenacher (discharge) current, and does a different sensor affect the energy resolution? In this research you will do measurements with calibration pulses, lasers and radioactive sources to obtain data to calibrate the detector. The work consists of hands-on work in the lab to build and adapt the test set-up, and analysis of the data obtained.
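As an illustration of what such an energy calibration involves, the sketch below fits the surrogate function commonly used for Timepix-family chips, ToT(E) = a*E + b - c/(E - t), to synthetic per-pixel calibration points; all numbers are invented for this example and are not Timepix4 measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def surrogate(energy_kev, a, b, c, t):
    """Surrogate function relating deposited energy to time-over-threshold (ToT)."""
    return a * energy_kev + b - c / (energy_kev - t)

# Synthetic calibration data: ToT values generated from assumed "true"
# parameters plus noise, mimicking test-pulse / X-ray line measurements.
TRUE_PARAMS = (2.0, 50.0, 200.0, 1.0)
energies = np.array([5.9, 8.0, 13.9, 17.5, 26.3, 59.5])  # typical source lines (keV)
rng = np.random.default_rng(0)
tot = surrogate(energies, *TRUE_PARAMS) + rng.normal(0.0, 0.5, energies.size)

# Per-pixel fit; in a real calibration this is repeated for every pixel.
popt, pcov = curve_fit(surrogate, energies, tot, p0=(1.5, 40.0, 150.0, 0.5))
print("fitted (a, b, c, t):", np.round(popt, 2))
```

The fitted parameters can then be inverted to convert a measured ToT back into an energy, pixel by pixel.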

''Contact: [mailto:doppenhu@nikhef.nl Daan Oppenhuis], [mailto:hella.snoek@nikhef.nl Hella Snoek]''

===Detector R&D: Studies of wafer-scale sensors for the ALICE detector upgrade and beyond===
One of the biggest milestones of the ALICE detector upgrade (foreseen in 2026) is the implementation of wafer-scale (~28 cm x 18 cm) monolithic silicon active pixel sensors in the tracking detector, with the goal of having truly cylindrical barrels around the beam pipe. To demonstrate such an unprecedented technology in high-energy physics detectors, a few chips will soon be available in the Nikhef laboratories for testing and characterization purposes.

The goal of the project is to contribute to the validation of the samples against the ALICE tracking detector requirements, with a focus on timing performance in view of other applications in future high-energy physics experiments beyond ALICE.

We are looking for a student with a focus on lab work who is interested in high-precision measurements with cutting-edge instrumentation. You will be part of the Nikhef Detector R&D group and you will have, at the same time, the chance to work in an international collaboration, where you will report on the performance of these novel sensors. There may even be the opportunity to join beam tests at the CERN or DESY facilities. Besides an interest in hardware, some proficiency in computing is required (Python or C++/ROOT).
 +
 
 +
''Contact: [mailto:(jory.sonneveld@nikhef.nl Jory Sonneveld]''
 +
 
 +
===Detector R&D: Time resolution of monolithic silicon detectors===
Monolithic silicon detectors based on industrial Complementary Metal Oxide Semiconductor (CMOS) processes offer a promising approach for large-scale detectors due to their ease of production and low material budget. Until recently, their low radiation tolerance has hindered their applicability in high-energy particle physics experiments. However, new prototypes, such as the one in this project, have started to overcome these hurdles, making them feasible candidates for future experiments in high-energy particle physics.

In this project, you will investigate the temporal performance of a radiation-hard monolithic detector prototype, produced at the end of 2023, using laser setups in the laboratory. You will also participate in meetings with the international collaboration working on this detector to report on the prototype's performance. Different aspects of the system, such as charge calibration and power consumption, are to be investigated for their impact on the temporal resolution. Depending on the progress of the work, a first full three-dimensional characterisation of the prototype's performance using a state-of-the-art two-photon absorption laser setup at Nikhef, and/or an investigation of irradiated samples for a closer look at the impact of radiation damage, are possible.

This project is looking for someone interested in working hands-on with cutting-edge detector and laser systems at the Nikhef laboratory. Python programming skills and Linux experience are an advantage.

''Contact: [mailto:jory.sonneveld@nikhef.nl Jory Sonneveld], [mailto:uwe.kraemer@nikhef.nl Uwe Kraemer]''

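
As a rough model, the temporal resolution measured in such a lab setup combines the front-end jitter and the TDC binning in quadrature. A toy sketch of this (all numbers illustrative, not properties of this prototype):

```python
import numpy as np

rng = np.random.default_rng(0)

sigma_jitter_ps = 100.0   # front-end noise jitter (illustrative)
tdc_bin_ps = 195.0        # TDC bin width (illustrative)

# Simulate arrival times: Gaussian jitter, then quantisation by the TDC.
t_true = 10_000.0
t_analog = t_true + rng.normal(0.0, sigma_jitter_ps, 200_000)
t_digital = np.floor(t_analog / tdc_bin_ps) * tdc_bin_ps + tdc_bin_ps / 2

measured = t_digital.std()
expected = np.hypot(sigma_jitter_ps, tdc_bin_ps / np.sqrt(12.0))
print(f"measured {measured:.1f} ps vs quadrature sum {expected:.1f} ps")
```

Comparing a measured width against this quadrature expectation is one way to disentangle the contributions of the analog front end and the digitisation.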
===Detector R&D: Improving a Laser Setup for Testing Fast Silicon Pixel Detectors===
For the upgrades of the innermost detectors of the experiments at the Large Hadron Collider in Geneva, in particular to cope with the large number of collisions per second expected from 2027, the Detector R&D group at Nikhef tests new pixel detector prototypes with a variety of laser equipment at several wavelengths. The lasers can be focused down to a small spot to scan over the pixels of a pixel chip. Since the laser penetrates the silicon, the pixels are not illuminated by just the focal spot, but by the entire three-dimensional, hourglass-like (double-cone) light intensity distribution. So, how well defined is the volume in which charge is released? Can it be made much smaller than a pixel? And, if so, what would the optimum focus be? In this project the student will first estimate the intensity distribution to be expected inside a sensor, which corresponds to the density of released charge within the silicon. To verify the predictions, you will measure real pixel sensors for the LHC experiments.

This project involves a lot of hands-on work in the lab, as well as programming and work on Unix machines.

''Contact: [mailto:martinfr@nikhef.nl Martin Fransen]''

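
The "hourglass" above can be estimated with Gaussian beam optics: inside the silicon the Rayleigh range is stretched by the refractive index, and the on-axis intensity falls to half its focal value one Rayleigh range from the focus. A sketch with illustrative numbers (wavelength, spot size and index are assumptions, not the group's actual setup parameters):

```python
import numpy as np

lam = 1.064e-6        # laser wavelength in vacuum [m] (illustrative)
w0 = 1.0e-6           # focal spot radius inside the silicon [m] (illustrative)
n_si = 3.5            # approximate refractive index of silicon near 1064 nm

# Rayleigh range inside the medium: z_R = pi * w0^2 * n / lambda
z_R = np.pi * w0**2 * n_si / lam

def w(z):
    # 1/e^2 beam radius at distance z from the focus
    return w0 * np.sqrt(1.0 + (z / z_R) ** 2)

# The on-axis intensity drops to half its focal value at |z| = z_R,
# so charge is mostly released in a region ~2 z_R long.
print(f"Rayleigh range: {z_R*1e6:.1f} um -> focal region ~{2*z_R*1e6:.0f} um long")
print(f"beam radius at z_R: {w(z_R)*1e6:.2f} um (= sqrt(2) * w0)")
```

With these numbers the charge-release region is tens of micrometres long, already comparable to a 55 um pixel pitch, which illustrates why the choice of focus matters.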
===Detector R&D: Time resolution of hybrid pixel detectors with the Timepix4 chip===
Precise time measurements with silicon pixel detectors are very important for experiments at the High-Luminosity LHC and a future circular collider. The spatial resolution of current silicon trackers will not be sufficient to distinguish the large number of collisions that will occur within individual bunch crossings. In a new method, typically referred to as 4D tracking, the spatial measurements of pixel detectors are combined with time measurements to better distinguish collision vertices that occur close together. New sensor technologies are being explored to reach the required time measurement resolution of tens of picoseconds, and the results are promising.

However, the signals that these pixelated sensors produce have to be processed by front-end electronics, which therefore play a large role in the total time resolution of the detector. The front-end electronics have many parameters that can be optimised to give the best time resolution for a specific sensor type.

In this project you will be working with the Timepix4 chip, a so-called application-specific integrated circuit (ASIC) designed to read out pixelated sensors. This ASIC is used extensively in detector R&D for the characterisation of new sensor technologies requiring precise timing (< 50 ps). To study the time resolution you will use laser setups in our lab, and there might be an opportunity to join a test with charged-particle beams at CERN.

These measurements will be complemented with data from the built-in calibration-pulse mechanism of the Timepix4 ASIC. Your work will enable further research performed with this ASIC, and serve as input to the design and operation of future ASICs for experiments at the High-Luminosity LHC.

''Contact: [mailto:k.heijhoff@nikhef.nl Kevin Heijhoff] and [mailto:martinb@nikhef.nl Martin van Beuzekom]''
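
One front-end effect such an optimisation must deal with is time-walk: small signals cross the discriminator threshold later than large ones, and the time-over-threshold (ToT) can be used to correct for this. A toy sketch of a ToT-based correction (the pulse model and all numbers are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

# Toy pulses: ToT proportional to charge, threshold-crossing delay ~ 1/charge.
charge = rng.uniform(1.0, 10.0, n)            # arbitrary units
tot = 25.0 * charge                           # ToT in ns (illustrative)
timewalk = 500.0 / charge                     # delay in ps (illustrative)
toa = timewalk + rng.normal(0.0, 30.0, n)     # measured arrival, 30 ps jitter

# Correction: subtract the mean measured delay in bins of ToT.
bins = np.linspace(tot.min(), tot.max(), 51)
idx = np.digitize(tot, bins)
correction = np.zeros(n)
for i in np.unique(idx):
    sel = idx == i
    correction[sel] = toa[sel].mean()
toa_corrected = toa - correction

print(f"sigma before: {toa.std():.0f} ps, after: {toa_corrected.std():.0f} ps")
```

The same binned-mean idea, applied per pixel and fed by the calibration-pulse mechanism, is the kind of calibration this project would study.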
 
===Detector R&D: Performance studies of Trench Isolated Low Gain Avalanche Detectors (TI-LGAD)===

The future vertex detector of the LHCb experiment needs to measure the spatial coordinates and time of the particles originating in the LHC proton-proton collisions with resolutions better than 10 um and 50 ps, respectively. Several technologies are being considered to achieve these resolutions. Among them is a novel sensor technology called the Trench Isolated Low Gain Avalanche Detector.

Prototype pixelated sensors have been manufactured recently and have to be characterised. To this end, these new sensors will be bump bonded to a Timepix4 ASIC, which provides charge and time measurements in each of its 230 thousand pixels. Characterisation will be done using a lab setup at Nikhef, and includes tests with a micro-focused laser beam, radioactive sources, and possibly particle tracks obtained in a test beam. This project involves data taking with these new devices and analysing the data to determine performance parameters, such as the spatial and temporal resolution, as a function of temperature and other operational conditions.

''Contacts: [mailto:kazu.akiba@nikhef.nl Kazu Akiba] and [mailto:martinb@nikhef.nl Martin van Beuzekom]''

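
When the device under test (DUT) is timed against a reference (a laser trigger, or a second detector), the DUT's own resolution follows from subtracting the reference jitter in quadrature, assuming the two fluctuate independently. A minimal sketch with illustrative numbers:

```python
import numpy as np

# Measured width of (t_DUT - t_ref) and the independently known reference jitter.
sigma_meas_ps = 52.0   # illustrative measured spread of the time difference
sigma_ref_ps = 15.0    # illustrative reference (e.g. laser trigger) jitter

# Independent fluctuations add in quadrature, so the DUT contribution is:
sigma_dut_ps = np.sqrt(sigma_meas_ps**2 - sigma_ref_ps**2)
print(f"DUT resolution: {sigma_dut_ps:.1f} ps")
```

Repeating this extraction at different temperatures and bias voltages gives the resolution curves this project is after.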
=== Detector R&D: A Telescope with Ultrathin Sensors for Beam Tests===
To measure the performance of new prototypes for upgrades of the LHC experiments and beyond, a telescope is typically used in a beam line of charged particles, so that the response of the prototype can be compared to particle tracks measured with the telescope. In this project, you will continue work on a very lightweight, compact telescope using ALICE PIxel DEtectors (ALPIDEs). This includes work on the mechanics, the data acquisition software, and a movable stage. You will foreseeably test this telescope in the Delft Proton Therapy Center. If time allows, you will add a timing plane and perform a measurement with one of our prototypes. Apart from travel to Delft, there is a possibility to travel to other beam line facilities.

''Contact: [mailto:jory.sonneveld@nikhef.nl Jory Sonneveld]''

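
The role of the telescope can be illustrated with a straight-line least-squares track fit: hits in the telescope planes are fitted and the track is interpolated to the prototype position, where the pointing resolution is better than that of a single plane. A toy sketch (geometry and resolutions invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)
z_planes = np.array([0., 50., 100., 200., 250., 300.])  # telescope planes [mm]
z_dut = 150.0                                           # prototype position [mm]
sigma_hit = 0.005                                       # 5 um plane resolution [mm]
n_tracks = 20_000

true_x0 = rng.uniform(-1, 1, n_tracks)                  # intercepts [mm]
true_sl = rng.uniform(-1e-3, 1e-3, n_tracks)            # slopes
hits = (true_x0[:, None] + true_sl[:, None] * z_planes
        + rng.normal(0, sigma_hit, (n_tracks, len(z_planes))))

# Straight-line least-squares fit per track, evaluated at the DUT position.
A = np.vstack([np.ones_like(z_planes), z_planes]).T
coef, *_ = np.linalg.lstsq(A, hits.T, rcond=None)       # shape (2, n_tracks)
x_at_dut = coef[0] + coef[1] * z_dut

pointing = (x_at_dut - (true_x0 + true_sl * z_dut)).std()
print(f"pointing resolution at DUT: {pointing*1e3:.1f} um")
```

With six planes and the DUT near the telescope centre, the interpolated track position is known to roughly the single-plane resolution divided by the square root of the number of planes, which is why a telescope can qualify prototypes finer than its own sensors.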
===Detector R&D: Laser Interferometer Space Antenna (LISA) - the first gravitational wave detector in space===

The space-based gravitational wave antenna LISA is one of the most challenging space missions ever proposed. Around 2035, ESA plans to launch three spacecraft separated by a few million kilometres. This constellation measures tiny variations in the distances between test masses located in each satellite to detect gravitational waves from sources such as supermassive black holes. LISA is based on laser interferometry: the three satellites form a giant Michelson interferometer, and LISA measures the relative phase shift between one local laser and one distant laser by light interference. This phase-shift measurement requires sensitive sensors.

The Nikhef Detector R&D group fabricated prototype sensors in 2020 together with the photonics industry and the Dutch institute for space research SRON. Nikhef and SRON are responsible for the Quadrant PhotoReceiver (QPR) system: the sensors, the housing including a complex mount to align the sensors with tens-of-nanometres accuracy, various environmental tests at the European Space Research and Technology Centre (ESTEC), and the overall performance of the QPR in the LISA instrument. Currently we are fabricating improved sensors, optimising the mechanics and preparing environmental tests. As an MSc student, you will work on various aspects of the wavefront sensor development: studying the performance of the epitaxial stacks of Indium-Gallium-Arsenide, setting up test benches to characterise the sensors and the QPR system, and performing the actual tests and data analysis, in combination with performance studies and simulations of the LISA instrument.

Possible projects are listed below, but please contact us, as the exact content may change:

#'''Title''': Simulating LISA QPD performance for LISA mission sensitivity. <br> '''Topic''': Simulation and Data Analysis. <br> '''Description''': We must provide accurate information to the LISA collaboration about the expected and actual performance of the LISA QPRs. This project will focus on using data from measurements taken at Nikhef to integrate into the simulation packages used within the LISA collaboration. The student will have the option to collect their own data to verify the simulations. Performance parameters include spatial uniformity and phase response, crosstalk, and thermal response across the LISA sensitivity band. <br> These simulations can then be used to investigate the full LISA performance and the impact of noise sources. This involves simulating the heterodyne signals expected on the LISA QPD and their impact on sensing techniques such as Differential Wavefront Sensing (DWS) and Tilt-to-Length (TTL) noise. Simulation tools include Finesse (Python), IFOCAD (C++) or FieldProp (MATLAB), depending on the student's capabilities and preference. This work is important for understanding how stable and quiet the LISA interferometry will be during real operation in space.
#'''Title''': Investigate the Response of the Gap in the LISA QPD. <br> '''Topic''': Experimental. <br> '''Description''': At Nikhef we are developing the photodiodes that will be used in the upcoming ESA/NASA LISA mission. We currently have our first batch of Quadrant Photodiodes (QPDs), which vary in diameter, thickness and gap width between the quadrants. The goal of this project is to develop a free-space laser test setup to measure the response of the gap between the quadrants of the LISA QPD. It is important to understand the behaviour of the gap between the photodiode quadrants, since this can impact the overall performance of the photodiode and thus the sensitivity of LISA. <br> The measurements will involve characterising the test laser beam, configuring test equipment, and handling and installing optical components. Besides taking the data, the student will also be responsible for analysing the results, preferably using Python (other languages are acceptable, based on the student's preference).
#'''Title''': Investigate the Response of LISA QPDs for the Einstein Telescope Pathfinder. <br> '''Topic''': Experimental. <br> '''Description''': Current gravitational wave (GW) interferometers typically operate at a wavelength of 1064 nm. However, future GW detectors will operate at longer wavelengths such as 1550 nm or 2000 nm. As a result of the wavelength change, much of the current technology is unsuitable, and developments are underway for the next generation of GW detectors. Europe's future GW detector, the Einstein Telescope (ET), is currently in its infancy. A smaller-scale prototype, known as ET Pathfinder, is currently being built and serves as a test bench for the full-scale detector. <br> At Nikhef's Detector R&D group, we want to develop quadrant photodiodes (QPDs) that sense the interferometer light for ET and ET Pathfinder. These QPDs require very low noise as well as high sensitivity in order to measure the small interferometer signals. To that end, our first step is to use the current QPDs that have been developed for the ESA/NASA LISA mission. <br> This project will focus on performance tests of the LISA QPDs using a 1550 nm laser. The student will be tasked with developing a test setup as well as taking the data and analysing the results. As part of this project, the student will learn about laser characterisation, Gaussian optics and instrumentation techniques. These results will be important for designing the next generation of QPDs and are of interest to the ET consortium, where the student can present their results.

''Contact: [mailto:nielsvb@nikhef.nl Niels van Bakel] or [mailto:tmistry@nikhef.nl Timesh Mistry]''
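
The Differential Wavefront Sensing mentioned above turns a relative beam tilt into a left-right phase difference on the quadrant photodiode. A numerical toy of this idea, integrating the beat of two Gaussian beams over the detector halves (beam size and tilt values are illustrative, not LISA parameters):

```python
import numpy as np

lam = 1.064e-6                  # laser wavelength [m]
k = 2 * np.pi / lam
w = 0.5e-3                      # beam radius on the photodiode [m] (illustrative)

# Grid over the detector plane.
x = np.linspace(-2e-3, 2e-3, 401)
X, Y = np.meshgrid(x, x)
gauss = np.exp(-(X**2 + Y**2) / w**2)   # fundamental-mode amplitude

def dws_signal(alpha):
    """Left-right phase difference for a distant beam tilted by alpha [rad]."""
    beat = gauss * gauss * np.exp(1j * k * alpha * X)   # local beam x tilted beam
    phi_right = np.angle(beat[:, X[0] > 0].sum())
    phi_left = np.angle(beat[:, X[0] < 0].sum())
    return phi_right - phi_left

a1, a2 = dws_signal(1e-7), dws_signal(2e-7)
print(f"DWS: {a1:.3e} rad per 100 nrad tilt; linearity ratio {a2/a1:.3f}")
```

For small tilts the DWS signal is linear in the tilt angle, which is what makes it usable as an alignment sensor; the geometry of the gap between quadrants (project 2 above) perturbs exactly this integral.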
  
===Detector R&D: Other projects===

Are you looking for a slightly different project? Are the above projects already taken? Are you coming in at an unusual time of the year? Do not hesitate to contact us! We always have new projects coming up at different times of the year, and we are open to your ideas.

''Contact: [mailto:jory.sonneveld@nikhef.nl Jory Sonneveld]''

===Gravitational Waves: Computer modelling to design the laser interferometers for the Einstein Telescope===

A new field of instrument science led to the successful detection of gravitational waves by the LIGO detectors in 2015. We are now preparing the next generation of gravitational-wave observatories, such as the Einstein Telescope, with the aim to increase the detector sensitivity by a factor of ten, which would allow us, for example, to detect stellar-mass black holes from early in the universe when the first stars began to form. This ambitious goal requires us to find ways to significantly improve the best laser interferometers in the world.

Gravitational-wave detectors are complex Michelson-type interferometers enhanced with optical cavities. We develop and use numerical models to study these laser interferometers, to invent new optical techniques and to quantify their performance. For example, we synthesize virtual mirror surfaces to study the effects of higher-order optical modes in the interferometers, and we use opto-mechanical models to test schemes for suppressing quantum fluctuations of the light field. We can offer several projects based on numerical modelling of laser interferometers. All projects will be directly linked to the ongoing design of the Einstein Telescope.

''Contact: [mailto:a.freise@nikhef.nl Andreas Freise]''
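
A basic building block in such interferometer models is the optical (Fabry-Perot) cavity: the circulating power as a function of round-trip phase already shows why mirror quality matters so much. A sketch with illustrative mirror reflectivities (not Einstein Telescope design values):

```python
import numpy as np

r1, r2 = np.sqrt(0.986), np.sqrt(0.99994)   # mirror amplitude reflectivities (illustrative)
t1_sq = 1 - r1**2                            # input-mirror power transmission

phi = np.linspace(-np.pi, np.pi, 100_001)    # round-trip phase detuning
buildup = t1_sq / np.abs(1 - r1 * r2 * np.exp(1j * phi))**2

peak = buildup.max()
analytic = t1_sq / (1 - r1 * r2)**2          # resonant build-up
finesse = np.pi * np.sqrt(r1 * r2) / (1 - r1 * r2)
print(f"resonant power build-up: {peak:.0f} (analytic {analytic:.0f}), finesse {finesse:.0f}")
```

Dedicated tools such as Finesse solve the same steady-state field equations, but for full interferometer topologies and with higher-order spatial modes included.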
  
===Gravitational Waves: Get rid of those damn vibrations!===

In 2015, large-scale precision interferometry led to the detection of gravitational waves. In 2017, Europe’s Advanced Virgo detector joined the international network, and the best-studied astrophysical event in history, GW170817, was detected both in gravitational waves and across the electromagnetic spectrum.

The Nikhef gravitational wave group is actively contributing to improvements of the current gravitational-wave detectors and to the rapidly maturing design of Europe’s next-generation gravitational-wave observatory, the Einstein Telescope, with one of two candidate sites located in the Netherlands. These detectors will unveil the gravitational symphony of the dark universe out to cosmological distances. Breaking past the sensitivity achieved by the current observatories will require a radically new approach to core components of these state-of-the-art machines. This is especially true at the lowest, audio-band frequencies that the Einstein Telescope is targeting, where large improvements are needed.

Our project, Omnisens, brings techniques from space-based satellite control back to Earth, building a platform capable of actively cancelling ground vibrations to levels never reached before. This is realised with state-of-the-art compact interferometric sensors and precision mechanics. Substantial cancellation of seismic motion is an essential improvement for the Einstein Telescope to reach below-attometre (10<sup>-18</sup> m) displacements.
  
We are excited to offer two projects in 2024:

#You will experimentally demonstrate and optimise Omnisens’ novel vibration isolation for future deployment on the Einstein Telescope. The activity will involve hands-on experience with lasers, electronics, mechanics and high-vacuum systems.
#You will contribute to the design of the Einstein Telescope by modelling the coupling of seismic and technical noises (such as actuation and sensing noises) through different configurations of seismic actuation chains. Accurate modelling of the origin and transmission of these noises is crucial in designing a system that prevents them from limiting the interferometer’s readout.

''Contact: [mailto:c.m.mow-lowry@vu.nl Conor Mow-Lowry]''
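
The benefit of vibration isolation can be illustrated with the textbook transmissibility of a damped harmonic isolator: above its resonance, ground motion is suppressed roughly as 1/f², and cascading stages multiplies the suppression. A sketch with illustrative parameters (not Omnisens design values):

```python
import numpy as np

f0 = 0.5     # isolator resonance frequency [Hz] (illustrative)
zeta = 0.05  # damping ratio (illustrative)

def transmissibility(f):
    """|X/U| of a damped harmonic isolator with ground motion U and payload motion X."""
    r = f / f0
    return np.sqrt((1 + (2 * zeta * r) ** 2) / ((1 - r**2) ** 2 + (2 * zeta * r) ** 2))

for f in (0.05, 0.5, 10.0):
    print(f"{f:5.2f} Hz: |X/U| = {transmissibility(f):.3g}")
```

The sketch also shows the passive approach's weakness: motion near and below the resonance is amplified or passed through unchanged, which is exactly where active, sensor-based cancellation such as Omnisens has to take over.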
  
===Gravitational Waves: Signal models & tools for data analysis===

Theoretical predictions of gravitational-wave (GW) signals provide essential tools to detect and analyse transient GW events in the data of GW instruments like LIGO and Virgo. Over the last few years, there has been significant effort to develop signal models that accurately describe the complex morphology of GWs from merging neutron-star and black-hole binaries. Future analyses of Einstein Telescope (ET) data will need to tackle much longer and louder compact binary signals, which will require significant developments beyond the current status quo of GW modelling (i.e., improvements in model accuracy and computational efficiency, increased parameter-space coverage, etc.).

We can offer up to two projects: in GW signal modelling (at the interface of perturbation theory, numerical relativity simulations and fast phenomenological descriptions), as well as in developing applications of signal models in GW data analysis. Although not strictly required, prior knowledge of basic concepts of general relativity and/or GW theory will be helpful. Some proficiency in computing is required (Mathematica, Python or C++).

''Contact: [mailto:mhaney@nikhef.nl Maria Haney]''
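
How much longer ET signals will be than those in current detectors can be estimated already at Newtonian order from the chirp mass. A sketch:

```python
import numpy as np

G = 6.674e-11        # gravitational constant [SI]
c = 2.998e8          # speed of light [m/s]
Msun = 1.989e30      # solar mass [kg]

def chirp_mass(m1, m2):
    return (m1 * m2) ** 0.6 / (m1 + m2) ** 0.2

def time_to_merger(f_low, m1_sun, m2_sun):
    """Newtonian-order time [s] from GW frequency f_low [Hz] to merger."""
    mc = chirp_mass(m1_sun, m2_sun) * Msun * G / c**3   # chirp mass in seconds
    return (5.0 / 256.0) * (np.pi * f_low) ** (-8.0 / 3.0) * mc ** (-5.0 / 3.0)

t10 = time_to_merger(10.0, 1.4, 1.4)   # band edge of current detectors
t3 = time_to_merger(3.0, 1.4, 1.4)     # an ET-like low-frequency cutoff (illustrative)
print(f"1.4+1.4 Msun binary: {t10/60:.0f} min from 10 Hz, {t3/3600:.1f} h from 3 Hz")
```

A binary neutron star that lasts minutes in today's detectors stays in an ET-like band for hours, which is the computational challenge for signal models mentioned above.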
  
===Theoretical Particle Physics: High-energy neutrino physics at the LHC===

High-energy collisions at the LHC and its High-Luminosity upgrade (HL-LHC) produce a large number of particles along the beam collision axis, outside the acceptance of existing experiments. In 2023, the FASER experiment detected, for the first time, neutrinos produced in LHC collisions, and is now starting to elucidate their properties. In this context, the proposed Forward Physics Facility (FPF), to be located several hundred meters from the ATLAS interaction point and shielded by concrete and rock, will host a suite of experiments to probe Standard Model (SM) processes and search for physics beyond the Standard Model (BSM). High-statistics neutrino detection will provide valuable data for fundamental topics in perturbative and non-perturbative QCD and in weak interactions, and experiments at the FPF will enable synergies between forward particle production at the LHC and astroparticle physics to be exploited.

The FPF has the promising potential to probe our understanding of the strong interactions as well as of proton and nuclear structure, providing access to both the very low-x and the very high-x regions of the colliding protons. The former regime is sensitive to novel QCD production mechanisms, such as BFKL effects and non-linear dynamics, as well as to the gluon parton distribution function (PDF) down to x=1e-7, well beyond the coverage of other experiments and providing key inputs for astroparticle physics. In addition, the FPF acts as a neutrino-induced deep-inelastic scattering (DIS) experiment with TeV-scale neutrino beams. The resulting measurements of neutrino DIS structure functions represent a valuable handle on the partonic structure of nucleons and nuclei, particularly their quark flavour separation, that is fully complementary to the charged-lepton DIS measurements expected at the upcoming Electron-Ion Collider (EIC).

In this project, the student will carry out updated predictions for the neutrino fluxes expected at the FPF, assess the precision with which neutrino cross-sections will be measured, develop novel Monte Carlo event generation tools for high-energy neutrino scattering, and quantify their impact on proton and nuclear structure by means of machine learning tools within the NNPDF framework and state-of-the-art calculations in perturbative Quantum Chromodynamics. This project contributes to ongoing work within the FPF Initiative towards a Conceptual Design Report (CDR) to be presented within two years. Topics that can be considered as part of this project include the assessment of the extent to which nuclear modifications of the free-proton PDFs can be constrained by FPF measurements, the determination of the small-x gluon PDF from suitably defined observables at the FPF and its implications for ultra-high-energy particle astrophysics, the study of the intrinsic charm content of the proton and its consequences for the FPF physics program, and the validation of models for neutrino-nucleon cross-sections in the region beyond the validity of perturbative QCD.

References: https://arxiv.org/abs/2203.05090, https://arxiv.org/abs/2109.10905, https://arxiv.org/abs/2208.08372, https://arxiv.org/abs/2201.12363, https://arxiv.org/abs/2109.02653, https://github.com/NNPDF/; see also this [https://www.dropbox.com/s/30co188f1almzq2/rojo-GRAPPA-MSc-2023.pdf?dl=0 project description].

''Contact: [mailto:j.rojo@vu.nl Juan Rojo]''

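
The DIS kinematics underlying these structure-function measurements follow from the standard relation Q² = 2 M_N E_ν x y. A minimal kinematics calculator for an FPF-like event (the chosen energy and x, y values are illustrative):

```python
# Deep-inelastic scattering kinematics for a TeV-scale neutrino beam.
M_N = 0.938          # nucleon mass [GeV]

def Q2(E_nu, x, y):
    """Momentum transfer squared [GeV^2] for neutrino energy E_nu [GeV],
    Bjorken x and inelasticity y (fixed-target kinematics)."""
    return 2.0 * M_N * E_nu * x * y

# An FPF-like event: 1 TeV neutrino, moderate x and y (illustrative values).
q2 = Q2(1000.0, 0.1, 0.5)
print(f"Q^2 = {q2:.1f} GeV^2")

# At fixed Q^2, the smallest accessible x corresponds to y -> 1:
x_at_ymax = q2 / (2.0 * M_N * 1000.0)
print(f"x at y=1 for the same Q^2: {x_at_ymax:.3f}")
```

Scanning such kinematics over the expected flux is the first step in assessing which (x, Q²) regions the FPF can actually constrain.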
===Theoretical Particle Physics: Unravelling proton structure with machine learning===

At energy-frontier facilities such as the Large Hadron Collider (LHC), scientists study the laws of nature in their quest for novel phenomena both within and beyond the Standard Model of particle physics. An in-depth understanding of the quark and gluon substructure of protons and heavy nuclei is crucial to address pressing questions from the nature of the Higgs boson to the origin of cosmic neutrinos. The key to addressing some of these questions is a global analysis of nucleon structure, combining an extensive experimental dataset with cutting-edge theory calculations. Within the NNPDF approach, this is achieved by means of a machine learning framework in which neural networks parametrise the underlying physical laws while minimising ad-hoc model assumptions. In addition to the LHC, the upcoming Electron-Ion Collider (EIC), to start taking data in 2029, will be the world's first polarised lepton-hadron collider and will offer a plethora of opportunities to address key open questions in our understanding of the strong nuclear force, such as the origin of the mass and the intrinsic angular momentum (spin) of hadrons, and whether there exists a state of matter that is entirely dominated by gluons.

In this project, the student will develop novel machine learning and AI approaches aimed at improving global analyses of proton structure and providing better predictions for the LHC, the EIC, and astroparticle physics experiments. These new approaches will be implemented within the machine learning tools provided by the NNPDF open-source fitting framework, using state-of-the-art calculations in perturbative Quantum Chromodynamics. Techniques that will be considered include normalising flows, graph neural networks, Gaussian processes, and kernel methods for unsupervised learning. Particular emphasis will be devoted to the automated determination of model hyperparameters, as well as to the estimation of the associated model uncertainties and their systematic validation with a battery of statistical tests. The outcome of the project will benefit the ongoing program of high-precision theory predictions for ongoing and future experiments in particle physics.

References: https://arxiv.org/abs/2201.12363, https://arxiv.org/abs/2109.02653, https://arxiv.org/abs/2103.05419, https://arxiv.org/abs/1404.4293, https://inspirehep.net/literature/1302398, https://github.com/NNPDF/; see also this [https://www.dropbox.com/s/30co188f1almzq2/rojo-GRAPPA-MSc-2023.pdf?dl=0 project description].

''Contact: [mailto:j.rojo@vu.nl Juan Rojo]''
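
PDF parametrisations of this kind typically take the form of a preprocessing factor x^a (1-x)^b multiplying a neural network. As a stand-in for the full NNPDF machinery, the sketch below fits just the preprocessing exponents to pseudo-data by linear least squares in log space (shape, normalisation and noise level all invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(5)

# Pseudo-data with a PDF-like shape A * x^a * (1-x)^b (stand-in for real data).
a_true, b_true, A_true = 0.5, 3.0, 2.0
x = np.linspace(0.01, 0.9, 60)
f = A_true * x**a_true * (1 - x)**b_true * np.exp(rng.normal(0, 0.01, x.size))

# log f = log A + a log x + b log(1-x): a model linear in the exponents.
design = np.column_stack([np.ones_like(x), np.log(x), np.log(1 - x)])
(logA, a_fit, b_fit), *_ = np.linalg.lstsq(design, np.log(f), rcond=None)
print(f"fitted exponents: a = {a_fit:.2f}, b = {b_fit:.2f} (true 0.5, 3.0)")
```

In a real global fit the neural network replaces this rigid functional form precisely to avoid baking such assumptions into the result; the exercise above only illustrates the role of the preprocessing exponents.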
  
"Contact: [mailto:K.Akiba@nikhef.nl Kazu Akiba]"
+
===Theoretical Particle Physics: Sterile neutrino dark matter===

The existence of right-handed (sterile) neutrinos is well motivated: all other Standard Model particles come in both chiralities, and sterile neutrinos naturally explain the small masses of the left-handed (active) neutrinos. If the lightest sterile neutrino is very long-lived, it could be dark matter. Although sterile neutrinos can be produced by neutrino oscillations in the early universe, this mechanism is not efficient enough to explain all dark matter. It has been proposed that additional self-interactions between sterile neutrinos can solve this (https://arxiv.org/abs/2307.15565, see also the more recent https://arxiv.org/abs/2402.13878). In this project you would examine whether the additional field mediating the self-interactions can also explain the neutrino masses. As a first step you would reproduce the results in the literature, and then extend them to map out the range of masses possible for this extra field.

''Contacts: [mailto:mpostma@nikhef.nl Marieke Postma]''

===Theoretical Particle Physics: Baryogenesis at the electroweak scale===

Given that the Standard Model treats particles and antiparticles almost identically, it is a puzzle why there is no antimatter left in our universe. Electroweak baryogenesis is the proposal that the matter-antimatter asymmetry was created during the phase transition in which the Higgs field obtains a vacuum expectation value and the electroweak symmetry is broken. One important ingredient in this scenario is the presence of new charge-parity (CP) violating interactions; these are, however, strongly constrained by the very precise measurements of the electric dipole moment of the electron. An old proposal, recently revived, is to use a CP-violating coupling of the Higgs field to the gauge field (https://arxiv.org/abs/2307.01270, https://inspirehep.net/literature/300823). The project would be to study the efficacy of this kind of operator for baryogenesis.

''Contacts: [mailto:mpostma@nikhef.nl Marieke Postma]''

==='''Theoretical Particle Physics: Neutrinoless double beta decay with sterile neutrinos'''===

The search for neutrinoless double beta decay represents a prominent probe of new particle physics and is well motivated by its tight connection to neutrino masses, which so far lack an experimentally verified explanation. As such, it also provides a convenient probe of new interactions of the known elementary particles with hypothesized right-handed neutrinos, which are thought to play a prime role in the generation of neutrino masses. The main focus of this project would be the extension of NuDoBe, a Python tool for the computation of neutrinoless double beta decay (0vbb) rates in terms of lepton-number-violating operators in the Standard Model Effective Field Theory (SMEFT), see <nowiki>https://arxiv.org/abs/2304.05415</nowiki>. As a first step, the code should be expanded to also include the effective operators involving right-handed neutrinos, based on the existing literature (<nowiki>https://arxiv.org/abs/2002.07182</nowiki>) covering the general rate of neutrinoless double beta decay within the SMEFT extended by right-handed neutrinos. Besides that, additional functionalities could be added to the code, such as a routine for extracting the explicit form of the neutrino mass and mixing matrices. This work would be very useful for future phenomenological studies and particularly timely given the ongoing experimental efforts, which are to be further boosted by the upcoming tonne-scale upgrades of the double-beta experiments.

''Contacts: [mailto:j.devries4@uva.nl Jordy de Vries] and Lukas Graf''

=== Detector R&D : Studying fast timing detectors ===

Fast timing detectors are a key ingredient of future tracking detectors. Under future LHC operating conditions and at future colliders, more and more particles are produced per collision. The resulting high particle densities make it increasingly difficult to separate particle trajectories using the spatial information that current silicon tracking detectors provide. A solution is to add very precise timestamps (of order 10 ps) to the spatial measurements of the particle trackers, which requires a good understanding of the performance of fast timing detectors. With the use of a pulsed laser in the lab, we study the characteristics of several prototype detectors.

''Contact: [mailto:H.Snoek@nikhef.nl Hella Snoek, Martin van Beuzekom, Kazu Akiba, Daniel Hynds]''

=== KM3NeT : Reconstruction of first neutrino interactions in KM3NeT ===

The neutrino telescope KM3NeT is under construction in the Mediterranean Sea, aiming to detect cosmic neutrinos. Its first few strings with sensitive photodetectors have been deployed at both the Italian and the French detector sites. Already these few strings make it possible to reconstruct the abundant muons stemming from interactions of cosmic rays with the atmosphere, and to identify neutrino interactions. In order to identify neutrinos, an accurate reconstruction and an optimal understanding of the backgrounds are crucial. In this project we will use the available data to identify and reconstruct the first neutrino interactions in the KM3NeT detector, and with this pave the path towards accurate neutrino oscillation measurements and neutrino astronomy.

Programming skills are essential; mostly ROOT and C++ will be used.

''Contact: [mailto:bruijn@nikhef.nl Ronald Bruijn] and [mailto:dosamt@nikhef.nl Dorothea Samtleben]''

==='''Theoretical Particle Physics: Phase space factors for single, double, and neutrinoless beta-decay rates'''===

In light of the increasingly precise measurements of beta-decay and double-beta-decay rates and spectra, the theoretical predictions are falling behind. The existing, rather phenomenological approaches to the associated phase-space calculations employ a variety of approximations that introduce errors which, given their phenomenological nature, are not easily quantifiable. A key goal of this project is to understand, reproduce and improve the methods and results available in the literature. Ideally, these efforts would be summarized in the form of a compact Mathematica notebook or Python package available to the broad community of beta-decay experimentalists and phenomenologists, who could easily implement it in the workflows of their analyses. The focus should be not only on the Standard-Model contributions to (double) beta decay, but also on hypothetical exotic modes stemming from various beyond-the-Standard-Model scenarios (see e.g. <nowiki>https://arxiv.org/abs/nucl-ex/0605029</nowiki> and <nowiki>https://arxiv.org/abs/2003.11836</nowiki>). If time permits, new, more particle-physics-based approaches to the phase-space computations can be investigated.
  
''Contacts: [mailto:j.devries4@uva.nl Jordy de Vries] and Lukas Graf''

=== KM3NeT : Acoustic detection of ultra-high energy cosmic-ray neutrinos (2 projects) ===

The study of cosmic neutrinos with energies above 10^17 eV, the so-called ultra-high energy neutrinos, provides a unique view on the universe and may give insight into the origin of the most violent astrophysical sources, such as gamma-ray bursts, supernovae or even dark matter. In addition, the observation of high-energy neutrinos may provide a unique tool to study interactions at high energies. The energy deposition of these extreme neutrinos in water induces a thermo-acoustic signal, which can be detected using sensitive hydrophones. The expected neutrino flux is, however, extremely low, and the signal that neutrinos induce is small. TNO is presently developing sensitive hydrophone technology based on fiber optics. Optical fibers form a natural way to create a distributed sensing system, and using this technology a large-scale neutrino telescope can be built in the deep sea. TNO is aiming for a prototype hydrophone which will form the building block of a future telescope.

The work will be executed at the Nikhef institute and/or the TNO laboratories in Delft. In this project there are two opportunities for master students to participate:<br>

<b>Student project 1:</b> Hardware development on fiber-optic hydrophone technology.<br>
Goal: characterise existing prototype optical fibre hydrophones in an anechoic basin at the TNO laboratory: data collection, calibration, characterisation, and analysis of the consequences for the design of future acoustic hydrophone neutrino telescopes.<br>
Keywords: optical fiber technology, signal processing, electronics, lab.

<b>Student project 2:</b> Investigation of ultra-high energy neutrinos and their interactions with matter.<br>
Goal: simulate the (currently imperfectly modelled) interactions at extremely high energies, and characterise the differences with currently available physics models as well as the impact on the physics reach of future acoustic hydrophone neutrino telescopes.<br>
Keywords: Monte Carlo simulations, particle physics, cosmology.<br>

Further information:<br>
- Info on ultra-high energy neutrinos can be found at: http://arxiv.org/abs/1102.3591<br>
- Info on acoustic detection of neutrinos can be found at: http://arxiv.org/abs/1311.7588

''Contact: [mailto:ernst-jan.buis@tno.nl Ernst-Jan Buis] and [mailto:ivo.van.vulpen@nikhef.nl Ivo van Vulpen]''

===Neutrinos: Searching for neutrinos of cosmic origin with KM3NeT===

KM3NeT is a neutrino telescope under construction in the Mediterranean Sea, already taking data with the first deployed detection units. In particular, the KM3NeT/ARCA detector offshore of Sicily is designed for high-energy neutrinos and is well suited for the detection of neutrinos of cosmic origin. In this project we will use the first KM3NeT data to search for evidence of a cosmic neutrino source, and also study ways to improve the analysis.

''Contact:'' [mailto:aart.heijboer@nikhef.nl Aart Heijboer]

===Neutrinos: the Deep Underground Neutrino Experiment (DUNE)===

The Deep Underground Neutrino Experiment (DUNE) is under construction in the USA and will consist of a powerful neutrino beam originating at Fermilab, a near detector at Fermilab, and a far detector in the SURF facility in Lead, South Dakota, 1300 km away. As they travel, neutrinos oscillate and a fraction of the neutrino beam changes flavour; DUNE will determine the neutrino oscillation parameters to unrivalled precision, and try to make a first detection of CP violation in neutrinos. In this project, various elements of DUNE can be studied, including the neutrino oscillation fit, neutrino physics with the near detector, event reconstruction and classification (including machine learning), or elements of data selection and triggering.

''Contact:'' [mailto:h26@nikhef.nl Paul de Jong]

===Neutrinos: Neutrino Oscillation Analysis with the KM3NeT/ORCA Detector===

The KM3NeT/ORCA neutrino detector at the bottom of the Mediterranean Sea is able to detect oscillations of atmospheric neutrinos. Neutrinos traversing the detector are reconstructed as a function of two observables: the neutrino energy and the neutrino direction. To improve the neutrino oscillation analysis, we need to add a third observable, the so-called Bjorken-y, which gives the fraction of the energy transferred from the incoming neutrino to the hadronic system rather than to the outgoing lepton. For this project, we will study simulated and real reconstructed data and use them to implement this additional observable in the existing analysis framework. Subsequently, we will study how much the sensitivity of the final analysis improves as a result.

C++ and Python programming skills are advantageous.

''Contacts:'' [mailto:dveijk@nikhef.nl Daan van Eijk], [mailto:h26@nikhef.nl Paul de Jong]
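For the KM3NeT/ORCA oscillation project, the Bjorken-y observable can be written down in a couple of lines. This is an illustrative sketch for a charged-current muon-neutrino event with made-up energies; the function name is hypothetical and the real analysis works with reconstructed quantities.

```python
def bjorken_y(e_neutrino, e_lepton):
    """Fraction of the neutrino energy transferred to the hadronic system."""
    if not 0.0 <= e_lepton <= e_neutrino:
        raise ValueError("lepton energy must lie between 0 and the neutrino energy")
    return 1.0 - e_lepton / e_neutrino

# Example: a 20 GeV neutrino whose outgoing muon carries 15 GeV gives y = 0.25
y = bjorken_y(20.0, 15.0)
```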
  
===Neutrinos: Searching for Majorana Neutrinos with KamLAND-Zen===
The KamLAND-Zen experiment, located in the Kamioka mine in Japan, is a large liquid scintillator experiment with 750 kg of ultra-pure Xe-136 to search for neutrinoless double-beta decay (0n2b). The observation of the 0n2b process would be evidence for lepton number violation and the Majorana nature of neutrinos, i.e. that neutrinos are their own anti-particles. Current limits on this extraordinarily rare hypothetical decay process are presented as a half-life, with a lower limit of 10^26 years. KamLAND-Zen, the world's most sensitive 0n2b experiment, is currently taking data, and there is an opportunity to work on the data analysis, with the possibility of taking part in a ground-breaking discovery. The main focus will be on developing new techniques to filter the spallation backgrounds, i.e. the production of radioactive isotopes by passing muons. There will be close collaboration with groups in the US (MIT, Berkeley, UW) and Japan (Tohoku Univ).
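To get a feeling for the numbers quoted above, the half-life limit can be translated into an expected number of decays per year. This is a back-of-envelope sketch that ignores detection efficiency and assumes the full 750 kg is Xe-136.

```python
import math

N_AVOGADRO = 6.022e23   # atoms per mole
mass_g = 750e3          # 750 kg of Xe-136, in grams
molar_mass_g = 136.0    # g/mol for Xe-136
half_life_yr = 1e26     # current lower limit on the 0n2b half-life, in years

n_atoms = mass_g / molar_mass_g * N_AVOGADRO
decays_per_year = n_atoms * math.log(2) / half_life_yr  # roughly 20 per year
```

Even at the current limit only of order twenty decays per year would occur in the whole target, which is why background suppression dominates the analysis.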
  
''Contact: [mailto:decowski@nikhef.nl Patrick Decowski]''

=== KM3NeT : Applying state-of-the-art reconstruction software to 10 years of Antares data ===

While the KM3NeT neutrino telescope is being constructed in the deep waters of the Mediterranean Sea, data from its precursor (Antares) have been accumulated for more than 10 years. The main objective of these neutrino telescopes is to determine the origin of (cosmic) neutrinos. The accuracy with which the origin of neutrinos can be determined depends critically on the probability density function (PDF) of the arrival time of the Cherenkov light produced by relativistic charged particles emerging from a neutrino interaction in the sea. It has been shown that these PDFs can be calculated from first principles, and that the obtained values can be interpolated efficiently in 4 and 5 dimensions without compromising the functional dependencies. The reconstruction software based on this input indeed yields the best resolution for KM3NeT. This project is aimed at applying the KM3NeT software to the available Antares data.

''Contact: [mailto:mjg@nikhef.nl Maarten de Jong]''
===Neutrinos: TRIF𝒪RCE (PTOLEMY)===

The PTOLEMY demonstrator will place limits on the neutrino mass using the β-decay endpoint of atomic tritium. The detector will require a trigger based on cyclotron radiation emission spectroscopy (CRES) and a non-destructive tracking system. The ''TRItium-endpoint From 𝒪(fW) Radio-frequency Cyclotron Emissions'' group is developing radio-frequency cavities for the simultaneous transport of endpoint electrons and the extraction of their kinematic information. This is essential for providing a fast online trigger and precise energy-loss corrections for electrons reconstructed near the tritium endpoint. The cryogenic low-noise, high-frequency analogue electronics developed at Nikhef, combined with FPGA-based front-end analysis capabilities, will provide the PTOLEMY demonstrator with its CRES readout and a testbed, to be hosted at the Gran Sasso National Laboratory, for the full CνB detector. The focus of this project will be modelling cyclotron radiation in RF cavities and its detection, for the purposes of single-electron spectroscopy and optimised trajectory reconstruction for the prototype and demonstrator setups. This may extend to firmware-based fast tagging and reconstruction algorithm development with the RF-SoC.

''Contact: [mailto:jmead@nikhef.nl James Vincent Mead]''

=== VU LaserLaB: Measuring the electric dipole moment (EDM) of the electron ===

In collaboration with Nikhef and the Van Swinderen Institute for Particle Physics and Gravity at the University of Groningen, we have recently started an exciting project to measure the electric dipole moment (EDM) of the electron in cold beams of barium-fluoride molecules. The eEDM, which is predicted by the Standard Model of particle physics to be extremely small, is a powerful probe of physics beyond this Standard Model. Many extensions of the Standard Model, most prominently supersymmetry, naturally predict an electron EDM just below the current experimental limits. We aim to improve on the best current measurement by at least an order of magnitude. To do so, we will perform a precision measurement on a slow beam of laser-cooled BaF molecules. With this low-energy precision experiment, we test physics at energies comparable to those probed at the LHC!

At LaserLaB VU, we are responsible for building and testing a cryogenic source of BaF molecules. The main parts of this source are currently being constructed in the workshop. We are looking for enthusiastic master students to help set up the laser system that will be used to detect BaF. Furthermore, projects are available to perform trajectory simulations to design a lens system that guides the BaF molecules from the exit of the cryogenic source to the experiment.

''Contact: [mailto:H.L.Bethlem@vu.nl Rick Bethlem]''

===Cosmic Rays: Energy loss profile of cosmic ray muons in the KM3NeT neutrino detector===

The dominant signal in the KM3NeT detectors is not neutrinos but muons, created in particle cascades (extensive air showers) initiated when cosmic rays interact at the top of the atmosphere. While these muons are a background for neutrino studies, they present an opportunity to study the nature of cosmic rays and hadronic interactions at the highest energies. Reconstruction algorithms are used to determine the properties of the particle interactions, normally of neutrinos, from the recorded photons. The aim of this project is to explore the possibility of reconstructing the longitudinal energy-loss profile of single and multiple simultaneous muons ('bundles') originating from cosmic-ray interactions. The potential to use this energy-loss profile to extract information on the number of muons and on the lateral extension of the muon bundles will also be explored. These properties give access to the high-energy interactions of cosmic rays.

''Contact: [mailto:rbruijn@nikhef.nl Ronald Bruijn]''
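For the cosmic-ray muon project, the average energy loss behind such longitudinal profiles is commonly parametrised as -dE/dx = a + b*E, with an ionisation term and a radiative term. The sketch below uses approximate, textbook-level values for water; the numbers are indicative only and real analyses rely on full simulation.

```python
import math

# Approximate continuous energy-loss parameters for muons in water
# (order-of-magnitude values, assumed here for illustration)
A_ION = 0.27    # GeV per metre: ionisation term
B_RAD = 3.4e-4  # per metre: radiative term (bremsstrahlung, pairs, photonuclear)

def muon_range_m(energy_gev):
    """Mean range obtained by integrating -dE/dx = A_ION + B_RAD * E down to E = 0."""
    return math.log(1.0 + B_RAD * energy_gev / A_ION) / B_RAD

# A 1 TeV muon travels on the order of a few kilometres in water
r_1tev = muon_range_m(1000.0)
```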
  
===LHCb: Search for light dark particles===

The Standard Model of elementary particles does not contain a proper Dark Matter candidate. One of the most tantalizing theoretical developments is the so-called ''Hidden Valley'' models: a mirror-like copy of the Standard Model, with dark particles that communicate with standard ones via a very feeble interaction. These models predict the existence of ''dark hadrons'' – composite particles that are bound similarly to ordinary hadrons in the Standard Model. Such ''dark hadrons'' can be abundantly produced in high-energy proton-proton collisions, making the LHC a unique place to search for them. Some ''dark hadrons'' are stable like a proton, which makes them excellent Dark Matter candidates, while others decay to ordinary particles after flying a certain distance in the collider experiment. The LHCb detector has a unique capability to identify such decays, particularly if the new particles have a mass below ten times the proton mass.

This project offers a unique search for light ''dark hadrons'' that covers a mass range not accessible to other experiments. It combines data analysis (Python-based) with non-trivial machine-learning solutions and phenomenology research using a fast simulation framework. Depending on your interest, there is quite a bit of flexibility in the precise focus of the project.

''Contact: [mailto:andrii.usachov@nikhef.nl Andrii Usachov]''

===LHCb: Searching for dark matter in exotic six-quark particles===

Most of the matter in the Universe is of unknown type. Many hypotheses about this dark matter have been proposed, but none confirmed. Recently it has been proposed that it could be made of particles composed of the six quarks uuddss, which would be a Standard-Model solution to the dark matter problem. This idea has gained credibility as many similar multi-quark states are being discovered by the LHCb experiment. Such a particle could be produced in decays of heavy baryons, or directly in proton-proton collisions. The anti-particle, made of six antiquarks, could be seen when annihilating with detector material. It is also proposed to use Xi_b baryons produced at LHCb to search for such a state, which would appear as missing 4-momentum in a kinematically constrained decay. The project consists of defining a selection and applying it to LHCb data. See [https://arxiv.org/abs/2007.10378 arXiv:2007.10378].

Contact: ''[mailto:patrick.koppenburg@cern.ch Patrick Koppenburg]''

=== VU LaserLab: Physics beyond the Standard model from molecules ===

Our team, with a number of staff members (Ubachs, Eikema, Salumbides, Bethlem, Koelemeij), focuses on precision measurements in the hydrogen molecule and its isotopomers. The work aims at testing the QED calculations of energy levels in H2, D2, T2, HD, etc. with the most precise measurements, where all kinds of experimental laser techniques play a role (cw and pulsed lasers, atomic clocks, frequency combs, molecular beams). A further target of study is the connection to the "proton size puzzle", which may be solved through studies in the hydrogen molecular isotopes.

In the past half year we have produced a number of important results that are described in the following papers:
* Frequency comb (Ramsey type) electronic excitations in the H2 molecule; see: Deep-ultraviolet frequency metrology of H2 for tests of molecular quantum theory, http://www.nat.vu.nl/~wimu/Publications/Altmann-PRL-2018.pdf
* ''Precision measurement of an infrared transition in the HD molecule''; see: Sub-Doppler frequency metrology in HD for tests of fundamental physics, https://arxiv.org/abs/1712.08438
* ''The first precision study in molecular tritium T2''; see: Relativistic and QED effects in the fundamental vibration of T2, http://arxiv.org/abs/1803.03161
* ''Dissociation energy of the hydrogen molecule at 10^-9 accuracy''; paper submitted to Phys. Rev. Lett.
* ''Probing QED and fundamental constants through laser spectroscopy of vibrational transitions in HD+''; this is also a study of the hydrogen molecular ion HD+, where important results were obtained not so long ago, and where we have a strong activity: http://www.nat.vu.nl/~wimu/Publications/ncomms10385.pdf

These five results mark the various directions we are pursuing, and in all directions we aim at obtaining improvements. Specific projects with students can be defined; those are mostly experimental, although there might be some theoretical tasks, like:
* Performing calculations of hyperfine structures

As for the theory there might also be an international connection for specifically bright theory students: we collaborate closely with prof. Krzysztof Pachucki, and we might find an opportunity for a student to perform (the best!) QED calculations in molecules, working in Warsaw and partly in Amsterdam. Prof. Frederic Merkt from the ETH Zurich, an expert in the field, will come to work with us on "hydrogen" during August-December 2018 while on sabbatical.

''Contact: [mailto:w.m.g.ubachs@vu.nl Wim Ubachs] [mailto:k.s.e.eikema@vu.nl Kjeld Eikema] [mailto:h.l.bethlem@vu.nl Rick Bethlem]''
 
  
===LHCb: New physics in the angular distributions of B decays to K*ee===
  
Hints of lepton flavour universality violation in B decays can be explained by a variety of non-Standard-Model interactions. Angular distributions in decays of a B meson to a hadron and two leptons are an important source of information for understanding which model is correct. Previous analyses at the LHCb experiment have considered final states with a pair of muons. Our LHCb group at Nikhef concentrates on a new measurement of angular distributions in decays with two electrons. The main challenge in this measurement is the calibration of the detection efficiency. In this project you will confront estimates of the detection efficiency derived from simulation with decay distributions in a well-known B decay. Once the calibration is understood, the very first analysis of the angular distributions in the electron final state can be performed.
  
Contact:  [mailto:m.senghi.soares@nikhef.nl Mara Soares] and [mailto:wouterh@nikhef.nl Wouter Hulsbergen]
  
== Projects with September 2018 start ==
===LHCb: CP violation in B -> J/psi Ks decays with first run-3 data===
  
The decay B -> J/psi Ks is the 'golden channel' for measuring the CP-violating angle beta of the CKM matrix. In this project we will use the first data from the upgraded LHCb detector to perform this measurement. Performing such a measurement with a new detector is going to be very challenging: we will learn a lot about whether the upgraded LHCb performs as well as expected.
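The measurement rests on the time-dependent decay-rate asymmetry between B0 and anti-B0 decays to J/psi Ks. A minimal sketch of the expected signal shape, ignoring detector resolution and flavour-mistag dilution (the parameter values below are approximate, for illustration only):

```python
import math

DELTA_MD = 0.51  # ps^-1, approximate B0 mixing frequency

def cp_asymmetry(t_ps, sin2beta=0.70):
    """Raw time-dependent CP asymmetry A(t) = sin(2beta) * sin(Delta m_d * t)."""
    return sin2beta * math.sin(DELTA_MD * t_ps)

# The asymmetry vanishes at t = 0 and is maximal near t = pi / (2 * DELTA_MD) ~ 3.1 ps
a_peak = cp_asymmetry(3.1)
```

Fitting this oscillation to the tagged decay-time distributions yields sin(2beta).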
  
Contact: ''[mailto:wouterh@nikhef.nl Wouter Hulsbergen]''
  
=== Theory:  Stress-testing the Standard Model at the high-energy frontier ===
 
  
A suitable framework to parametrise, in a model-independent way, deviations from the SM induced by new heavy particles is the Standard Model Effective Field Theory (SMEFT). In this formalism, beyond-the-SM effects are encapsulated in higher-dimensional operators constructed from SM fields respecting their symmetry properties. Here we aim to perform a global analysis of the SMEFT from high-precision LHC data. This will be achieved by extending the NNPDF fitting framework to constrain the SMEFT coefficients, with the ultimate aim of identifying possible beyond-the-SM signals.
''Contact: [mailto:j.rojo@vu.nl Juan Rojo]''

===LHCb: Optimization of primary vertex reconstruction===

A key part of LHCb event reconstruction is finding the collision point of the protons from the LHC beams. This so-called primary vertex is found by reconstructing the common origin of the charged particles observed in the detector. A rudimentary algorithm exists, but its performance can likely be improved by tuning parameters (or perhaps by implementing an entirely new algorithm). In this project you are challenged to optimize the LHCb primary vertex reconstruction algorithm using recent simulated and real data from LHC Run 3.

Contact: ''[mailto:wouterh@nikhef.nl Wouter Hulsbergen]''
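As a toy illustration of what a primary-vertex finder does (this is not the actual LHCb algorithm; names and parameter values are made up), one can greedily cluster the z-intercepts of tracks along the beamline:

```python
def find_primary_vertices(z_positions, window_mm=1.0, min_tracks=4):
    """Toy primary-vertex finder: greedy 1D clustering of track z-intercepts.

    Walk through the sorted z positions, start a new cluster whenever a track
    is further than `window_mm` from the running cluster mean, and keep only
    clusters with at least `min_tracks` tracks.
    """
    vertices, cluster = [], []
    for z in sorted(z_positions):
        if cluster and abs(z - sum(cluster) / len(cluster)) > window_mm:
            if len(cluster) >= min_tracks:
                vertices.append(sum(cluster) / len(cluster))
            cluster = []
        cluster.append(z)
    if len(cluster) >= min_tracks:
        vertices.append(sum(cluster) / len(cluster))
    return vertices

# Two simulated collisions at z = -30 mm and z = +12 mm, plus one stray track
z_tracks = [-30.2, -30.0, -29.9, -30.1, 11.8, 12.0, 12.1, 12.3, 80.0]
vertices = find_primary_vertices(z_tracks)
```

Tuning `window_mm` and `min_tracks` trades vertex-finding efficiency against fake vertices, which is the kind of optimisation the project is about.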
  
===LHCb: Measurement of B decays to two electrons===

Instead of searching for new physics through the direct production of new particles, one can search for enhancements of very rare processes as an indirect signal of the existence of new particles or forces. The observation of the decay of Bs to two muons by the LHCb collaboration and Nikhef/Maastricht is such a measurement; as the rarest decay ever observed at the LHC, it has a large impact on the new-physics landscape. In this project, we will extend this work by searching for the even rarer decay into two electrons. You would join the ongoing work in the context of an NWO Veni grant, and can be based in Maastricht or at Nikhef.

Contact: ''[mailto:jdevries@nikhef.nl Jacco de Vries]''

===Muon Collider===

There is currently a lively global debate about the next accelerator to succeed the successful LHC. Different options are on the table: linear, circular, electrons, protons, on various continents. Among these, the most ambitious project is the muon collider, designed to collide the relatively massive (105 MeV) but short-lived (2.2 μs!) leptons. Such a novel collider would combine the advantages of electron-positron colliders (excellent precision) and proton-proton colliders (highest energy). In this project, we will perform a feasibility study for the search for the elusive double-Higgs process: this as yet unobserved process is crucial to probe the simultaneous interaction of multiple Higgs bosons, and thereby the shape of the Higgs potential as predicted by the Brout-Englert-Higgs mechanism. This sensitivity study will be instrumental to understanding one of the main scientific prospects of this ambitious project, and also to optimizing the detector design as well as the interface of the particle detectors to the accelerator machine. The project is based at Nikhef but can also be (partially) performed at the University of Twente.

Reference: https://www.science.org/content/article/muon-collider-could-revolutionize-particle-physics-if-it-can-be-built

Contact: ''[mailto:f.dias@nikhef.nl Flavia Dias] and [mailto:tdupree@nikhef.nl Tristan du Pree]''

=== Theory: The quark and gluon internal structure of heavy nuclei in the LHC era ===

A precise knowledge of the parton distribution functions (PDFs) of the proton is essential in order to make predictions for the Standard Model and beyond at hadron colliders. The presence of the nuclear medium, and collective phenomena involving several nucleons, modifies the parton distribution functions of nuclei (nPDFs) compared to those of a free nucleon. These modifications have been investigated by different groups using global analyses of high-energy nuclear reaction world data. It is important to determine the nPDFs not only for establishing perturbative QCD factorisation in nuclei, but also for applications to heavy-ion physics and neutrino physics. In this project the student will join an ongoing effort towards the determination of a data-driven model of nPDFs, and will learn how to construct tailored Artificial Neural Networks (ANNs).

Further information [http://pcteserver.mi.infn.it/~nnpdf/VU/2018-MasterProject-nPDFs.pdf here]

''Contact: [mailto:j.rojo@vu.nl Juan Rojo]''

=== Theory: Combined QCD analysis of parton distribution and fragmentation functions ===

The formation of hadrons from quarks and gluons, or collectively partons, is a fundamental QCD process that has yet to be fully understood. Since parton-to-hadron fragmentation occurs over long distance scales, such information can only be extracted from experimental observables that identify mesons and baryons in the final state. Recent progress has been made in determining these fragmentation functions (FFs) from charged pion and kaon production in single-inclusive e+e− annihilation (SIA), and additionally in pp collisions and semi-inclusive deep inelastic scattering (SIDIS). However, charged hadron production in unpolarized pp collisions and in inelastic lepton-proton scattering also requires information about the momentum distributions of the quarks and gluons in the proton, which is encoded in non-perturbative parton distribution functions (PDFs). In this project, both PDFs and FFs will be treated simultaneously in a global QCD analysis of single-inclusive hadron production processes to determine the individual parton-to-hadron FFs. Furthermore, a robust statistical methodology with an artificial-neural-network learning algorithm will be used to obtain a precise estimation of the FF uncertainties. This work will emphasize in particular the impact of the pp-collision and SIDIS data on the gluon and on the separated quark/anti-quark FFs, respectively.

Further information [http://pcteserver.mi.infn.it/~nnpdf/VU/2018-MasterProject-FFpPDFs.pdf here]

''Contact: [mailto:j.rojo@vu.nl Juan Rojo]''
 
  
==Projects with a 2023 start==
  
=== ALICE: Charm is in the Quark Gluon Plasma ===
+
===ALICE: The next-generation multi-purpose detector at the LHC===
The goal of heavy-ion physics is to study the Quark Gluon Plasma (QGP), a hot and dense medium where quarks and gluons move freely over large distances, larger than the typical size of a hadron. Hydrodynamic simulations expect that the QGP will expand under its own pressure, and cool while expanding. These simulations are particularly successful in describing some of the key observables measured experimentally, such as particle spectra and various orders of flow harmonics. Charm quarks are produced very early during the evolution of a heavy-ion collision and can thus serve as an idea probe of the properties of the QGP. The goal of the project is to study higher order flow harmonics (e.g. triangular flow - v3) that are more sensitive to the transport properties of the QGP for charm-mesons, such as D0, D*, Ds. This will be the first ever measurement of this kind.  
+
This main goal of this project is to focus on the next-generation multi-purpose detector planned to be built at the LHC. Its core will be a nearly massless barrel detector consisting of truly cylindrical layers based on curved wafer-scale ultra-thin silicon sensors with MAPS technology, featuring an unprecedented low material budget of 0.05% X0 per layer, with the innermost layers possibly positioned inside the beam pipe. The proposed detector is conceived for studies of pp, pA and AA collisions at luminosities a factor of 20 to 50 times higher than possible with the upgraded ALICE detector, enabling a rich physics program ranging from measurements with electromagnetic probes at ultra-low transverse momenta to precision physics in the charm and beauty sector.  
  
''Contact: [mailto:Panos.Christakoglou@nikhef.nl Panos Christakoglou and Paul Kuijer]''
+
''Contact: [mailto:Panos.Christakoglou@nikhef.nl Panos Christakoglou] and [mailto:Alessandro.Grelli@cern.ch Alessandro Grelli] and [mailto:marco.van.leeuwen@cern.ch Marco van Leeuwen]''
  
===ALICE: Searching for the strongest magnetic field in nature===
In a non-central collision between two Pb ions, i.e. one with a large impact parameter, the charged nucleons that do not participate in the interaction (the so-called spectators) create strong magnetic fields. A back-of-the-envelope calculation using the Biot-Savart law puts the magnitude of this field close to 10^19 Gauss, in agreement with state-of-the-art theoretical calculations, making it the strongest magnetic field in nature. The presence of this field could have direct implications for the motion of final-state particles. The magnetic field, however, decays rapidly, at a rate that depends on the electric conductivity of the medium, which is experimentally poorly constrained. The existence of the field has not yet been confirmed experimentally; establishing it is the main goal of this project.
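The back-of-the-envelope estimate can be reproduced in a few lines of Python. This is a deliberately rough sketch in which the spectator protons of one nucleus are treated as a single boosted point charge; the impact parameter and Lorentz factor are assumed, illustrative numbers:

```python
import math

# Rough Biot-Savart estimate (illustrative assumptions, not a calculation of
# the field's time evolution): treat the Z = 82 protons of one Pb nucleus as
# a single point charge moving at ~c past the origin at impact parameter b.
MU0_OVER_4PI = 1e-7      # T*m/A
E_CHARGE = 1.602e-19     # proton charge, C
C_LIGHT = 3.0e8          # m/s
Z = 82                   # protons in a Pb nucleus
GAMMA = 2700             # assumed beam Lorentz factor at LHC energies
b = 7e-15                # assumed impact parameter, ~7 fm

# Peak field of a boosted point charge: B ~ (mu0/4pi) * gamma * Z*e*c / b^2
B_tesla = MU0_OVER_4PI * GAMMA * Z * E_CHARGE * C_LIGHT / b**2
B_gauss = B_tesla * 1e4  # 1 T = 10^4 Gauss
print(f"B ~ 10^{math.log10(B_gauss):.0f} Gauss")
```

Depending on the assumed impact parameter and Lorentz factor, such estimates land in the 10^18 to 10^21 Gauss range; the point is the scale, not the precise number.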
  
 
 
''Contact: [mailto:Panos.Christakoglou@nikhef.nl Panos Christakoglou]''
  
===ALICE: Looking for parity violating effects in strong interactions===
Within the Standard Model, symmetries, such as the combination of charge conjugation (C) and parity (P), known as CP-symmetry, are considered to be key principles of particle physics. The violation of CP-invariance can be accommodated within the Standard Model in both the weak and the strong interactions; however, it has only been confirmed experimentally in the former. Theory predicts that in heavy-ion collisions, in the presence of a deconfined state, gluonic fields create domains where the parity symmetry is locally violated. This manifests itself as a charge-dependent asymmetry in the production of particles relative to the reaction plane, an effect known as the Chiral Magnetic Effect (CME).
 
The first experimental results from STAR (RHIC) and ALICE (LHC) are consistent with the expectations from the CME; however, further studies are needed to constrain background effects. These highly anticipated results have the potential to reveal exciting new physics.
  
 
 
''Contact: [mailto:Panos.Christakoglou@nikhef.nl Panos Christakoglou]''
  
===ALICE: Machine learning techniques as a tool to study the production of heavy flavour particles ===
The field of heavy-ion physics recently experienced a shift, triggered by experimental results obtained in collisions between small systems (e.g. proton–proton collisions) that resemble the ones obtained in collisions between heavy ions. This raises the question of whether we create the smallest QGP droplet in collisions between small systems. The main objective of this project will be to study the production of charm particles, such as D-mesons and Λc-baryons, in pp collisions at the LHC. This will be done with the help of a new and innovative technique based on machine learning (ML). The student will also extend the studies to investigate how this production rate depends on the event activity, e.g. on how many particles are created in every collision.
 
''Contact: [mailto:Panos.Christakoglou@nikhef.nl Panos Christakoglou] and [mailto:Alessandro.Grelli@cern.ch Alessandro Grelli]''
 
===ALICE: Search for new physics with 4D tracking at the most sensitive vertex detector at the LHC===
  
With the newly installed Inner Tracking System, consisting fully of monolithic detectors, ALICE is very sensitive to particles with low transverse momenta, more so than ATLAS and CMS. This will be even more the case for the ALICE upgrade detector in 2033. By using timing information along a track, this detector could be even more sensitive to long-lived particles that leave peculiar signatures, such as disappearing or kinked tracks, in the tracker. In this project you will investigate how timing information in the different tracking layers can improve, or even enable, a search for new physics beyond the Standard Model in ALICE. If you demonstrate the possibility of major improvements, this can have real consequences for the choice of sensors for the ALICE inner tracker upgrade.
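To illustrate what timing along a track adds: combining the momentum of a track with its time-of-flight between two points yields a mass hypothesis, which is what makes slow, heavy or long-lived particles stand out. The following is a minimal sketch; the function and the numbers are illustrative, not ALICE specifications:

```python
import math

C_M_PER_NS = 0.2998  # speed of light in m/ns

def mass_from_tof(p_gev, path_m, tof_ns):
    """Mass hypothesis from momentum and time-of-flight:
    beta = L / (c * t),  m = p * sqrt(1/beta^2 - 1)."""
    beta = path_m / (C_M_PER_NS * tof_ns)
    return p_gev * math.sqrt(1.0 / beta**2 - 1.0)

# A 1 GeV/c track crossing 1 m with beta = 0.5 reconstructs to sqrt(3) GeV:
print(mass_from_tof(1.0, 1.0, 1.0 / (C_M_PER_NS * 0.5)))
```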
  
''Contact: [mailto:jory.sonneveld@nikhef.nl Jory Sonneveld] and [mailto:Panos.Christakoglou@nikhef.nl Panos Christakoglou]''
  
===ATLAS: The Higgs boson's self-coupling===
  
The coupling of the Higgs boson to itself is one of the main unobserved interactions of the Standard Model, and its observation is crucial to understand the shape of the Higgs potential. Here we propose to study the 'ttHH' final state: two top quarks and two Higgs bosons produced in a single collision. This topology is as yet unexplored at the ATLAS experiment, and the project consists of setting up the new analysis (including multivariate analysis techniques to recognise the complicated final state), optimising the sensitivity, and including the result in the full ATLAS study of the Higgs boson's coupling to itself. With the LHC data from Run-3, we might be able to see its first glimpses!
 
''Contact: [mailto:tdupree@nikhef.nl Tristan du Pree] and  [mailto:cpandini@nikhef.nl Carlo Pandini]''  
  
===ATLAS: Triple-Higgs production as a probe of the Higgs potential===
So far, the investigation of Higgs self-couplings (the coupling of the Higgs boson to itself) at the LHC has focused on the measurement of the Higgs tri-linear coupling λ3, mainly through direct double-Higgs production searches. In this research project we propose the investigation of the Higgs tri-linear and quartic coupling parameters λ3 and λ4, via a novel measurement of triple-Higgs (HHH) production at the LHC with the ATLAS experiment. While in the SM these parameters are expected to be identical, only a combined measurement can provide an answer regarding how the Higgs potential is realised in Nature. Processes in which three Higgs bosons are produced simultaneously are extremely rare, and very difficult to measure and disentangle from background. In this project we plan to investigate different decay channels (to bottom quarks and tau leptons), and to study advanced machine learning techniques to reconstruct such a complex hadronic final state. This kind of process is still largely unexplored in ATLAS, and the goal of this project is to lay the groundwork for the first measurement of HHH production at the LHC.
  
Furthermore, we would like to study the possible implications of a precise measurement of the self-coupling parameters from HHH production from a phenomenological point of view: what could be the impact of a deviation in the HHH measurements on the big open questions in physics (for instance, the mechanisms at the root of baryogenesis)?
 
Contact: ''[mailto:tdupree@nikhef.nl Tristan du Pree] and [mailto:cpandini@nikhef.nl Carlo Pandini]''
  
===ATLAS: The Next Generation===
  
After the observation of the coupling of the Higgs boson to fermions of the third generation, the search for its coupling to fermions of the second generation is one of the next priorities for research at CERN's Large Hadron Collider. The search for the decay of the Higgs boson to two charm quarks is very new [1] and we see various opportunities for interesting developments. For this project we propose improvements in reconstruction (using exclusive decays) and advanced analysis techniques (using deep learning methods).
  
[1] [https://arxiv.org/abs/1802.04329 arXiv:1802.04329]; see also [https://atlas.cern/updates/briefing/charming-Higgs-decay atlas.cern/updates/briefing/charming-Higgs-decay]
  
''Contact: [mailto:tdupree@nikhef.nl Tristan du Pree]''
  
===ATLAS: Searching for new particles in very energetic diboson production===
  
The discovery of new phenomena in high-energy proton–proton collisions is one of the main goals of the Large Hadron Collider (LHC). New heavy particles decaying into a pair of vector bosons (WW, WZ, ZZ) are predicted in several extensions of the Standard Model (e.g. extended gauge-symmetry models, Grand Unified Theories, theories with warped extra dimensions). In this project we will investigate new ideas to look for these resonances in promising regions. We will focus on final states where both vector bosons decay into quarks, or where one decays into quarks and one into leptons; these have the potential to bring the highest sensitivity to the search for Beyond the Standard Model physics [1, 2]. We will develop and exploit new ways to identify vector bosons (using machine learning methods) and then tackle the problem of estimating contributions from beyond the Standard Model processes in the tails of the mass distribution.
  
[1] https://arxiv.org/abs/1906.08589
 
[2] https://arxiv.org/abs/2004.14636
  
''Contact: [mailto:f.dias@nikhef.nl Flavia de Almeida Dias], [mailto:rhayes@nikhef.nl Robin Hayes], Elizaveta Cherepanova and Dylan van Arneman''
  
===ATLAS: Top-quark and Higgs-boson analysis combination, and Effective Field Theory interpretation (also in 2023)===
  
We are looking for a master student with interest in theory and data-analysis in the search for physics beyond the Standard Model in the top-quark and Higgs-boson sectors.
  
Your master-project starts just at the right time for preparing the Run-3 analysis of the ATLAS experiment at the LHC.  In Run-3 (2022-2026), three times more data becomes available, enabling analysis of rare processes with innovative software tools and techniques.
  
This project aims to explore the newest strategy of combining top-quark and Higgs-boson measurements with the aim of constraining new physics beyond the Standard Model (SM) of particle physics. We selected the pp->tZq and gg->HZ processes as promising candidates for a combination to constrain new physics in the context of the Standard Model Effective Field Theory (SMEFT). SMEFT is the state-of-the-art framework for theoretical interpretation of LHC data. In particular, you will study the SMEFT OtZ and Ophit operators, which are not well constrained by current measurements.
  
Besides affinity with particle physics theory, the ideal candidate for this project has developed Python/C++ skills and is eager to learn advanced techniques. You start with a simulation of the signal and background samples using existing software tools. Then an event selection study is carried out using machine learning techniques. To evaluate the SMEFT effects, a fitting procedure based on the innovative morphing technique is foreseen, for which the basic tools in the ROOT and RooFit framework are available. The work is carried out in the ATLAS group at Nikhef and may lead to an ATLAS note.
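The morphing technique exploits the fact that, for a single SMEFT operator, any predicted yield is an exact quadratic polynomial in the Wilson coefficient, so a small number of basis samples fixes the full dependence. A minimal sketch (the numbers are placeholders, not analysis inputs):

```python
def smeft_yield(c, n_sm, n_int, n_quad):
    """Predicted yield as a function of the Wilson coefficient c:
    N(c) = N_SM + c * N_interference + c^2 * N_quadratic."""
    return n_sm + c * n_int + c * c * n_quad

# Three 'basis' evaluations (e.g. c = 0, +1, -1) determine the polynomial,
# after which N(c) can be evaluated for any c during the fit:
print(smeft_yield(2.0, 100.0, 10.0, 3.0))  # 100 + 2*10 + 4*3 = 132.0
```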
  
''Contact:  [mailto:o.rieger@nikhef.nl Oliver Rieger]  and [mailto:h73@nikhef.nl Marcel Vreeswijk]''
  
===ATLAS: Machine learning to search for very rare Higgs decays===
  
Since the Higgs boson discovery in 2012 at the ATLAS experiment, the investigation of the properties of the Higgs boson has been a priority for research at the Large Hadron Collider (LHC). However, there are still many open questions: Is the Higgs boson the only origin of electroweak symmetry breaking? Is there a mechanism that can explain the observed mass pattern of SM particles? Many of these questions are linked to the Higgs boson's coupling structure.


  
While the Higgs boson coupling to fermions of the third generation has been established experimentally, the investigation of the Higgs boson coupling to the light fermions of the second generation will be a major project for the upcoming data-taking period (2022-2025). The Higgs boson decay to muons is the most sensitive channel for probing this coupling. In this project, you will optimize the event selection for Higgs boson decays to muons in the Vector Boson Fusion (VBF) production channel, with a focus on distinguishing signal events from background processes such as Drell-Yan and electroweak Z boson production. For this purpose, you will develop, implement and validate advanced machine learning and deep learning algorithms.
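Whatever classifier is used, the event-selection optimisation ultimately reduces to choosing a working point on the classifier score that maximises (an approximation of) the expected significance. A toy sketch, with made-up scores and the simple s/sqrt(b) figure of merit:

```python
import numpy as np

def best_cut(sig_scores, bkg_scores, cuts):
    """Scan cut values on a classifier score and return the
    (significance, cut) pair maximising s / sqrt(b)."""
    best = (0.0, None)
    for c in cuts:
        s = float(np.sum(sig_scores > c))  # signal events passing the cut
        b = float(np.sum(bkg_scores > c))  # background events passing
        if b > 0:
            z = s / np.sqrt(b)
            if z > best[0]:
                best = (z, c)
    return best

sig = np.array([0.9, 0.8, 0.7, 0.4])   # toy signal scores
bkg = np.array([0.1, 0.2, 0.3, 0.95])  # toy background scores
print(best_cut(sig, bkg, cuts=[0.25, 0.5, 0.75]))
```

In a real analysis the figure of merit would include systematic uncertainties, but the scan-and-maximise structure is the same.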
  
''Contact: [mailto:oliver.rieger@nikhef.nl Oliver Rieger] and [mailto:verkerke@nikhef.nl Wouter Verkerke] and [mailto:s01@nikhef.nl Peter Kluit]''
  
===ATLAS: Interpretation of experimental data using SMEFT===
  
The Standard Model Effective Field Theory (SMEFT) provides a systematic approach to test the impact of new physics at the energy scale of the LHC through higher-dimensional operators. The interpretation of experimental data using SMEFT requires a particular interest in solving complex technical challenges, advanced statistical techniques, and a deep understanding of particle physics. We would be happy to discuss different project opportunities based on your interests with you.
  
''Contact: [mailto:oliver.rieger@nikhef.nl Oliver Rieger] and [mailto:verkerke@nikhef.nl Wouter Verkerke]''
  
=== ATLAS: Reconstructing tracks from particle physics detector hits with state-of-the-art machine learning techniques===
 +
This project concerns the application of new machine learning techniques to tackle the problem of track reconstruction at the ATLAS detector at CERN. While algorithms to construct particle tracks from low-level detector information such as particle hits and timestamps have been around for decades, recent developments in the field of machine learning open up new opportunities to improve these algorithms significantly. Some recent developments that could help in this context include graph neural networks, which embed the input data in the format of a graph and as such can exploit underlying correlations within events. Transformer neural networks, a particular extension of graph-based neural networks proposed only in 2017, could also prove helpful in this case. Another option would be to build upon work done in the field of computer vision and see if image segmentation networks can help solve this problem. There is a range of available options, and the project includes the freedom for the student to choose particular types of networks, but more explicit guidance can be provided if desired.
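As a flavour of the graph-based approach: before any network is trained, the hits are connected into a graph, for instance by linking hits in adjacent layers that are close in azimuthal angle; the network then classifies the edges. A minimal sketch in which the hit format and the cut value are illustrative assumptions:

```python
def build_edges(hits, max_dphi=0.1):
    """hits: list of (layer, phi) tuples. Connect hit i to hit j when j sits
    one layer further out and their azimuthal angles differ by < max_dphi.
    The resulting edge list is what a graph neural network would then
    classify as 'same track' / 'different track'."""
    edges = []
    for i, (layer_i, phi_i) in enumerate(hits):
        for j, (layer_j, phi_j) in enumerate(hits):
            if layer_j == layer_i + 1 and abs(phi_i - phi_j) < max_dphi:
                edges.append((i, j))
    return edges

# Four hits: indices 0, 1 and 3 line up as a track candidate, hit 2 is isolated.
print(build_edges([(0, 0.00), (1, 0.05), (1, 1.00), (2, 0.06)]))  # [(0, 1), (1, 3)]
```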
  
In this project the student will develop and compare the performance of various machine learning models, initially to reconstruct tracks from simplified test data. Upon successful completion of this, simulated data from the actual ATLAS detector can be analysed as well within the scope of the project. The student will need some familiarity with programming in Python and an interest in machine learning, but a physics background is not required. In this project the student will contribute to fundamental physics research and become familiar with state-of-the-art machine learning models.
  
Contact: ''[mailto:zwolffs@nikhef.nl Zef Wolffs], [mailto:mvozak@cern.ch Matouš Vozák] and [mailto:Ivo.van.Vulpen@nikhef.nl Ivo van Vulpen]''
 
===ATLAS: New machine learning approaches to target Higgs interference signatures in LHC data===
In this project we aim to improve an ongoing analysis to determine the lifetime of the Higgs Boson through state-of-the-art machine learning techniques, in particular by addressing a ''novel solution to an as of yet unsolved fundamental problem in modeling quantum interference''. While the Higgs is an elusive particle that generally only appears in physics processes with small cross sections, its signature can be amplified in the Large Hadron Collider (LHC) through quantum interference with larger background (non-Higgs) processes. This is the effect that the Higgs’ lifetime analysis relies on to be able to measure the relevant Higgs signature. A fundamental physics modelling problem arises though in the simulation of individual events for this interference due to the fact that these events are in reality described by a superposition of underlying Higgs and non-Higgs processes.
 
Since machine learning models in particle physics are typically trained to characterise individual physics events, the fact that interference events cannot currently be generated is a significant problem when interference is the target. In the currently existing Higgs lifetime analysis, a machine learning model was trained which instead focuses only on the explicit Higgs-mediated processes as a proxy, which is suboptimal. The aim of this project is to improve upon this current machine learning strategy used in this analysis by implementing either of the inference-aware approaches suggested in [1] and [2]. The idea behind these inference-aware machine learning algorithms is that they do not optimise for a simplified goal such as the loss function which is common in traditional machine learning, but rather for the end-goal of the analysis. In this case, this would omit the need for interference event generation altogether and allow the machine learning models to be trained optimally regardless.
 
The first checkpoint of this project is to use either of the frameworks used in [1] and [2] (which are both publicly available) and run them with a simplified dataset from the aforementioned analysis. After this proof-of-principle is achieved, the next goal would be to actually implement the newly developed machine learning models in the full analysis and to determine the improvement upon the existing result. A successful completion of these tasks would not only benefit the Higgs lifetime analysis, but would be an important stepping stone to future developments to make machine learning approaches also aware of other hard to model effects such as systematic uncertainties. Finally, there are further options to improve this analysis such as the generation of actual interference training data, which could be attempted in case the primary project finishes earlier than expected.
 
[1] De Castro, P., & Dorigo, T. (2019). INFERNO: inference-aware neural optimisation. ''Computer Physics Communications'', ''244'', 170-179.
 
[2] Simpson, N., & Heinrich, L. (2023, February). neos: End-to-end-optimised summary statistics for high energy physics. In ''Journal of Physics: Conference Series'' (Vol. 2438, No. 1, p. 012105). IOP Publishing.
 
Contact: ''[mailto:zwolffs@nikhef.nl Zef Wolffs], [mailto:mvozak@cern.ch Matouš Vozák] and [mailto:Ivo.van.Vulpen@nikhef.nl Ivo van Vulpen]''
 
===ATLAS: Development of state-of-the art modeling techniques to generate Higgs interference events ===
In this project we aim to improve an ongoing analysis to determine the lifetime of the Higgs Boson through new event generation strategies, in particular by addressing a novel solution to an as of yet unsolved fundamental problem in modeling quantum interference. While the Higgs is an elusive particle that generally only appears in physics processes with small cross sections, its signature can be amplified in the Large Hadron Collider (LHC) through quantum interference with larger background (non-Higgs) processes. This is the effect that the Higgs’ lifetime analysis relies on to be able to measure the relevant Higgs signature. A fundamental physics modelling problem arises though in the simulation of individual events for this interference due to the fact that these events are in reality described by a superposition of underlying Higgs and non-Higgs processes.
 
The current approach to deal with this problem is to ignore the interference in analysis optimization and instead optimize only for explicitly Higgs mediated processes, but this severely impacts analysis performance. In the context of Effective Field Theories (EFT) however, a similar problem arises and has been solved for simple (leading order) processes. In this project we plan to take the machinery developed for EFT and apply it to the Higgs lifetime analysis. Furthermore, with the recent development of a Next-to-Leading Order (NLO) Higgs event generation tool [1] a subsequent goal would be to use this to also generate interference at the NLO level. Successful completion of this project would lead to a much improved analysis result, significantly constraining the lifetime of the Higgs Boson. Besides, the techniques developed would almost certainly be used in future analyses on Large Hadron Collider (LHC) run 3 data.
 
[1] Alioli, S., Ravasio, S. F., Lindert, J. M., & Röntsch, R. (2021). Four-lepton production in gluon fusion at NLO matched to parton showers. ''The European Physical Journal C'', ''81''(8), 687.
 
Contact: ''[mailto:zwolffs@nikhef.nl Zef Wolffs], [mailto:mvozak@cern.ch Matouš Vozák], [mailto:b.kortman@nikhef.nl Bryan Kortman]  and [mailto:Ivo.van.Vulpen@nikhef.nl Ivo van Vulpen]''
 
===ATLAS: Approaching the Higgs from a new direction: Constraining new physics with off shell Higgs data from the LHC===
The Heisenberg uncertainty principle allows all elementary particles, including the Higgs boson, to momentarily disobey the fundamental energy-momentum relation, allowing the particle in question to have a significantly larger mass than usual. A description of the Higgs boson in this state (an "off-shell" Higgs boson) can provide a portal to the discovery of potential new physics, albeit a difficult one due to its infrequent appearance. The goal of this project is to constrain or hint at new physics by estimating the parameters of a generalized model that allows for new physics, Effective Field Theory (EFT), using off-shell Higgs data.
 +
 
 +
Most of the underlying analysis to measure the prevalence of off shell Higgs bosons has already been set up, so the goal of this project is to do the aforementioned EFT interpretation on top of this existing analysis. From a theoretical point of view much of the groundwork has also been done on simulated data which showed the potential for this EFT interpretation to constrain new physics [1]. Being on the interface between experimental and theoretical physics this project allows the student to gain a deeper understanding of both, furthermore its successful completion could be one of the first hints towards as of yet not understood physics.
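To give a flavour of what such an EFT interpretation involves, here is a minimal one-parameter likelihood scan. All numbers (the linear and quadratic EFT terms, the measured signal strength and its uncertainty) are invented placeholders for illustration, not values from the actual analysis.

```python
import numpy as np

# Toy profile-likelihood scan for one Wilson coefficient "c".
# All numbers below (a, b, mu_obs, sigma) are hypothetical placeholders.
a, b = 0.8, 0.3           # linear (interference) and quadratic EFT terms
mu_obs, sigma = 1.1, 0.4  # hypothetical measured off-shell signal strength

def mu_pred(c):
    """Predicted signal strength as a function of the Wilson coefficient."""
    return 1.0 + a * c + b * c**2

def chi2(c):
    return ((mu_pred(c) - mu_obs) / sigma) ** 2

c_grid = np.linspace(-3, 3, 601)
chi2_vals = chi2(c_grid)
chi2_min = chi2_vals.min()
# 68% CL region for one parameter: delta chi2 < 1
allowed = c_grid[chi2_vals - chi2_min < 1.0]
print(f"best fit c = {c_grid[np.argmin(chi2_vals)]:.2f}, "
      f"allowed region spans [{allowed.min():.2f}, {allowed.max():.2f}]")
```

In the real analysis the single Gaussian measurement would be replaced by the full off-shell likelihood, but the logic of scanning a coefficient and reading off the allowed region is the same.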
[1] Azatov, A., de Blas, J., Falkowski, A., Gritsan, A. V., Grojean, C., Kang, L., ... & Vryonidou, E. (2022). Off-shell Higgs Interpretations Task Force: Models and Effective Field Theories Subgroup Report. arXiv preprint arXiv:2203.02418.

Contact: ''[mailto:zwolffs@nikhef.nl Zef Wolffs], [mailto:mvozak@cern.ch Matouš Vozák], [mailto:b.kortman@nikhef.nl Bryan Kortman] and [mailto:Ivo.van.Vulpen@nikhef.nl Ivo van Vulpen]''
===ATLAS: A new timing detector - the HGTD===

ATLAS is going to gain a new capability: a timing detector. The HGTD allows us to reconstruct tracks not only in the three spatial dimensions, but also to measure very precisely (at the picosecond level) the time at which particles pass its sensitive layers. This makes it possible to reconstruct the trajectories of the particles created at the LHC in four dimensions, and will ultimately lead to better physics reconstruction in ATLAS. The new HGTD detector is still under construction, and work is needed on different levels, such as understanding the detector response (taking measurements in the lab and performing simulations) or developing algorithms to reconstruct the particle trajectories (programming and analysis work).

'''Several projects are available within the context of the new HGTD detector:'''

#The first option focuses on '''''the impact on physics-analysis performance''''': studying how the timing measurements can be included in the reconstruction of tracks, and what effect this has on how well we can understand the physical processes occurring in the LHC collisions. With this work you will be part of the ATLAS group at Nikhef.
#The second possibility is to '''''test the sensors in our lab''''' and in test-beam setups at CERN. The analysis will be performed in the context of the ATLAS HGTD test-beam group, in connection with both the ATLAS group and the R&D department at Nikhef.
#The third is to contribute to an ongoing effort '''''to precisely simulate/model the silicon avalanche detectors''''' in the Allpix2 framework. Several models attempt to describe the detector response, with dependencies on operating temperature, field strengths and radiation damage. We are getting close to being able to model our detector, but are not there yet. This work will be within the ATLAS group, together with Hella Snoek and Andrea Visibile.

If you are interested, contact me to discuss the possibilities.

Contact:  ''[mailto:hella.snoek@nikhef.nl Hella Snoek]''
===ATLAS: The next full-silicon Inner Tracker: ITk===

[[File:ITk endcap structure.jpg|210x210px|thumb|alt=]]The inner detector of the present ATLAS experiment was designed and developed to function in the environment of the present Large Hadron Collider (LHC). At the ATLAS Phase-II Upgrade, the particle densities and radiation levels will exceed current levels by a factor of ten. The instantaneous luminosity is expected to reach unprecedented values, resulting in up to 200 proton-proton interactions in a typical bunch crossing. The new detectors must be faster and more highly segmented. The sensors also need to be far more resistant to radiation, and they require much greater power delivery to the front-end systems. At the same time, they cannot introduce excess material that would undermine the tracking performance. For these reasons, the inner tracker of the ATLAS detector (ITk) was redesigned and will be rebuilt completely.

Nikhef is one of the sites in charge of building and integrating large parts of the ITk. One of the next steps consists of testing the sensors that we will install in the structures we have built (see one of the structures in the picture of our cleanroom). This project offers the possibility of working on a full hardware project, doing something completely new, by testing the sensors of a future component of the next ATLAS detector.

''Contact'':  ''[mailto:aalonso@nikhef.nl Andrea García Alonso]''
===Cosmic Rays/Neutrinos: Seasonal muon flux variations and the pion/kaon ratio===

The KM3NeT ARCA and ORCA detectors, located kilometres deep in the Mediterranean Sea, have neutrinos as their primary probes. Muons from cosmic-ray interactions also reach the detectors in relatively large quantities. These muons, exploiting the capabilities and location of the detectors, allow the study of cosmic rays and their interactions; in this way, questions about their origin, composition and propagation can be addressed. In particular, these muons are tracers of hadronic interactions at energies inaccessible to particle accelerators.

The muons reaching the depths of the detectors result from decays of mesons, mostly pions and kaons, created in interactions of high-energy cosmic rays with atoms in the upper atmosphere. Seasonal changes of the temperature (and thus density) profile of the atmosphere modulate the balance between the probability for these mesons to decay (producing muons) or to re-interact. Pions and kaons are affected differently, which allows their production ratio to be extracted by determining how changes in the muon rate depend on changes in the effective temperature: an integral over the atmospheric temperature profile weighted by a depth-dependent meson production rate.

In this project, the aim is to measure the rate of muons in the detectors and to calculate the effective temperature above the KM3NeT detectors from atmospheric data, both as a function of time. The relation between these two can be used to extract the pion-to-kaon ratio.
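The core of the measurement is the correlation coefficient between relative rate and relative effective-temperature variations, dR/&lt;R&gt; = α<sub>T</sub> dT<sub>eff</sub>/&lt;T<sub>eff</sub>&gt;, since α<sub>T</sub> depends on the pion/kaon ratio. A minimal sketch of extracting α<sub>T</sub>, using synthetic data in place of the real detector and atmospheric time series:

```python
import numpy as np

# Toy extraction of alpha_T from dR/<R> = alpha_T * dT_eff/<T_eff>.
# The daily "data" below are synthetic; in the project, both series come
# from the detector and from atmospheric data.
rng = np.random.default_rng(1)
days = 365
t_eff = 220.0 + 4.0 * np.sin(2 * np.pi * np.arange(days) / 365)  # K, synthetic
alpha_true = 0.9
rate = 100.0 * (1 + alpha_true * (t_eff - t_eff.mean()) / t_eff.mean())
rate += rng.normal(0, 0.05, days)  # statistical fluctuations of the rate

x = (t_eff - t_eff.mean()) / t_eff.mean()   # dT_eff / <T_eff>
y = (rate - rate.mean()) / rate.mean()      # dR / <R>
alpha_fit = np.polyfit(x, y, 1)[0]          # slope = alpha_T
print(f"fitted alpha_T = {alpha_fit:.2f}")
```

With real data, α<sub>T</sub> obtained this way would then be compared to model predictions as a function of the pion/kaon production ratio.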
''Contact: [mailto:rbruijn@nikhef.nl Ronald Bruijn]''
===Detector R&D: Studies of wafer-scale sensors for ALICE detector upgrade and beyond===

One of the biggest milestones of the ALICE detector upgrade (foreseen in 2026) is the implementation of wafer-scale (~28 cm x 18 cm) monolithic silicon active pixel sensors in the tracking detector, with the goal of having truly cylindrical barrels around the beam pipe. To demonstrate such an unprecedented technology in high-energy-physics detectors, a few chips will soon be available in the Nikhef laboratories for testing and characterization purposes.

The goal of the project is to contribute to the validation of the samples against the ALICE tracking-detector requirements, with a focus on timing performance in view of other applications in future high-energy-physics experiments beyond ALICE.

We are looking for a student with a focus on lab work who is interested in high-precision measurements with cutting-edge instrumentation. You will be part of the Nikhef Detector R&D group and will at the same time have the chance to work in an international collaboration, where you will report on the performance of these novel sensors. There may even be the opportunity to join beam tests at the CERN or DESY facilities. Besides an interest in hardware, some proficiency in computing is required (Python or C++/ROOT).

''Contact: [mailto:jory.sonneveld@nikhef.nl Jory Sonneveld], [mailto:rrusso@nikhef.nl Roberto Russo]''
===Detector R&D: Time resolution of monolithic silicon detectors===

Monolithic silicon detectors based on industrial Complementary Metal Oxide Semiconductor (CMOS) processes offer a promising approach for large-scale detectors due to their ease of production and low material budget. Until recently, their low radiation tolerance has hindered their applicability in high-energy particle-physics experiments. However, new prototypes, such as the one in this project, have overcome these hurdles, making them feasible candidates for future experiments. Achieving the required radiation tolerance has brought the spatial and temporal resolution of these detectors to the forefront. In this project, you will investigate the temporal performance of a radiation-hard monolithic detector prototype, using laser setups in the laboratory. You will also participate in meetings of the international collaboration working on this detector, where you will report on the prototype's performance. Depending on the progress of the work, there may be a chance to participate in test beams at the CERN accelerator complex and in a first full three-dimensional characterization of the prototype's performance using a state-of-the-art two-photon absorption laser setup at Nikhef. This project is looking for someone interested in working hands-on with cutting-edge detector and laser systems in the Nikhef laboratory. Python programming skills and Linux experience are an advantage.

''Contact: [mailto:jory.sonneveld@nikhef.nl Jory Sonneveld], [mailto:uwe.kraemer@nikhef.nl Uwe Kraemer]''
===Detector R&D: Improving a Laser Setup for Testing Fast Silicon Pixel Detectors===

For the upgrades of the innermost detectors of the experiments at the Large Hadron Collider in Geneva, in particular to cope with the large number of collisions per second from 2027, the Detector R&D group at Nikhef tests new pixel-detector prototypes with a variety of laser equipment at several wavelengths. The lasers can be focused down to a small spot to scan over the pixels of a pixel chip. Since the laser penetrates the silicon, the pixels will not be illuminated by just the focal spot, but by the entire three-dimensional, hourglass-like (double-cone) light-intensity distribution. So, how well defined is the volume in which charge is released? Can it be made much smaller than a pixel? And, if so, what would the optimum focus be? For this project the student will first estimate the expected intensity distribution inside a sensor, which corresponds to the density of released charge within the silicon. To verify the predictions, you will measure real pixel sensors for the LHC experiments.

This project will involve a lot of hands-on work in the lab, as well as programming and work on Unix machines.
  
 
''Contact: [mailto:martinfr@nikhef.nl Martin Fransen]''
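As a starting point for the intensity-distribution estimate described above, the "hourglass" can be modelled with Gaussian-beam optics. The numbers below (wavelength, waist, refractive index) are example assumptions for illustration, not the parameters of the actual Nikhef setup.

```python
import numpy as np

# Gaussian-beam estimate of the illuminated volume inside silicon.
# Example numbers are assumptions, not the actual setup parameters.
lam = 1064e-9   # laser wavelength in vacuum [m]
n_si = 3.5      # approximate refractive index of silicon near 1064 nm
w0 = 2e-6       # beam waist radius at the focus [m]

# Rayleigh range inside the medium: distance over which the beam radius
# grows by a factor sqrt(2); it sets the length of the "hourglass" neck.
z_r = np.pi * w0**2 * n_si / lam

def beam_radius(z):
    """1/e^2 intensity radius at distance z from the focus."""
    return w0 * np.sqrt(1 + (z / z_r) ** 2)

for z in (0.0, z_r):
    print(f"z = {z*1e6:6.2f} um  ->  w = {beam_radius(z)*1e6:.2f} um")
```

Note that the high refractive index of silicon stretches the Rayleigh range, so the charge-deposition region is much longer along the beam axis than it is wide.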
 
===Detector R&D: Time resolution of hybrid pixel detectors with the Timepix4 chip===

Precise time measurements with silicon pixel detectors are very important for experiments at the High-Luminosity LHC and a future circular collider. The spatial resolution of current silicon trackers will not be sufficient to distinguish the large number of collisions that will occur within individual bunch crossings. In a new method, typically referred to as 4D tracking, the spatial measurements of pixel detectors are combined with time measurements to better distinguish collision vertices that occur close together.

New sensor technologies are being explored to reach the required time-measurement resolution of tens of picoseconds, and the results are promising. However, the signals that these pixelated sensors produce have to be processed by front-end electronics, which therefore also contribute to the total time resolution of the detector. An important contribution comes from the systematic differences between the front-end electronics of different pixels. Many of these systematic effects can be corrected by performing detailed calibrations of the readout electronics. To achieve the required time resolution at future experiments, it is vital that these effects are understood and corrected.

In this project you will work with the Timepix4 chip. This is a so-called application-specific integrated circuit (ASIC) designed to read out pixelated sensors. This ASIC will be used extensively in detector R&D for the characterisation of new sensor technologies requiring precise timing (< 50 ps). In order to do so, it is necessary to first study the systematic differences between the pixels, which you will do using a laser setup in our lab. This will be combined with data analysis of proton-beam measurements, or with measurements performed using the built-in test-pulse mechanism of the Timepix4 ASIC. Your work will enable further research performed with this ASIC, and serve as input to the design and operation of future ASICs for experiments at the High-Luminosity LHC.

''Contact: [mailto:k.heijhoff@nikhef.nl Kevin Heijhoff] and [mailto:martinb@nikhef.nl Martin van Beuzekom]''
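The idea behind the per-pixel calibration described in this section can be sketched with synthetic numbers: repeated laser pulses at a known arrival time expose each pixel's systematic time offset, which is then subtracted. All values below are invented for illustration.

```python
import numpy as np

# Toy per-pixel time-offset calibration. Laser pulses arrive at t = 0;
# each pixel adds its own systematic offset plus random jitter.
# All numbers are synthetic illustrations.
rng = np.random.default_rng(7)
n_pixels, n_pulses = 100, 500
true_offsets = rng.normal(0, 100, n_pixels)       # ps, per-pixel systematics
jitter = rng.normal(0, 30, (n_pulses, n_pixels))  # ps, random jitter
measured = true_offsets + jitter                  # measured timestamps

offsets = measured.mean(axis=0)   # calibration constant per pixel
corrected = measured - offsets    # apply the correction

print(f"time resolution before: {measured.std():.0f} ps, "
      f"after: {corrected.std():.0f} ps")
```

In reality the calibration also has to deal with signal-amplitude-dependent effects (time walk), but the principle of mapping and subtracting pixel-to-pixel systematics is the same.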
===Detector R&D: Performance studies of Trench Isolated Low Gain Avalanche Detectors (TI-LGAD)===

The future vertex detector of the LHCb experiment needs to measure the spatial coordinates and time of the particles originating from the LHC proton-proton collisions with resolutions better than 10 um and 50 ps, respectively. Several technologies are being considered to achieve these resolutions, among them a novel sensor technology called the Trench Isolated Low Gain Avalanche Detector.

Prototype pixelated sensors have recently been manufactured and have to be characterised. To this end, these new sensors will be bump-bonded to a Timepix4 ASIC, which provides charge and time measurements in each of its 230 thousand pixels. Characterisation will be done using a lab setup at Nikhef, and includes tests with a micro-focused laser beam, radioactive sources, and possibly particle tracks obtained in a test beam. This project involves data taking with these new devices and analysing the data to determine performance parameters such as the spatial and temporal resolution, as a function of temperature and other operational conditions.

''Contacts: [mailto:kazu.akiba@nikhef.nl Kazu Akiba] and [mailto:martinb@nikhef.nl Martin van Beuzekom]''
===Detector R&D: A Telescope with Ultrathin Sensors for Beam Tests===

To measure the performance of new prototypes for the upgrades of the LHC experiments and beyond, a telescope is typically placed in a beam line of charged particles, so that the results from the prototype can be compared to particle tracks measured with the telescope. In this project, you will continue work on a very lightweight, compact telescope using ALICE PIxel DEtectors (ALPIDEs). This includes work on the mechanics, the data-acquisition software, and a movable stage. You are expected to test this telescope at the Delft Proton Therapy Center. If time allows, you will add a timing plane and perform a measurement with one of our prototypes. Apart from travel to Delft, there is a possibility of travelling to other beam-line facilities.

''Contact: [mailto:jory.sonneveld@nikhef.nl Jory Sonneveld]''
===Detector R&D: Laser Interferometer Space Antenna (LISA) - the first gravitational wave detector in space ===
  
The space-based gravitational-wave antenna LISA is one of the most challenging space missions ever proposed. ESA plans to launch, around 2034, three spacecraft separated by a few million kilometres. This constellation measures tiny variations in the distances between test masses located in each satellite to detect gravitational waves from sources such as supermassive black holes. LISA is based on laser interferometry: the three satellites form a giant Michelson interferometer, and LISA measures the relative phase shift between one local laser and one distant laser by light interference. The phase-shift measurement requires sensitive sensors. The Nikhef Detector R&D group fabricated prototype sensors in 2020, together with the photonics industry and the Dutch institute for space research SRON. Nikhef and SRON are responsible for the Quadrant PhotoReceiver (QPR) system: the sensors, the housing (including a complex mount to align the sensors with an accuracy of tens of nanometres), various environmental tests at the European Space Research and Technology Centre (ESTEC), and the overall performance of the QPR in the LISA instrument. Currently we are discussing possible sensor improvements for a second fabrication run in 2022, optimizing the mechanics and preparing environmental tests. As an MSc student, you will work on various aspects of the wavefront-sensor development: studying the performance of the epitaxial stacks of indium gallium arsenide, setting up test benches to characterize the sensors and the QPR system, and performing the actual tests and data analysis, in combination with performance studies and simulations of the LISA instrument.
  
''Contact: [mailto:nielsvb@nikhef.nl Niels van Bakel]''
===Detector R&D: Other projects===

Are you looking for a slightly different project? Are the above projects already taken? Are you coming in at an unusual time of the year? Do not hesitate to contact us! We always have new projects coming up at different times in the year and we are open to your ideas.

''Contact: [mailto:jory.sonneveld@nikhef.nl Jory Sonneveld]''
===FCC: The Next Collider===

After the LHC, the next planned large collider at CERN is the proposed 100-km circular collider "FCC". In the first stage of the project, as a high-luminosity electron-positron collider, its main goal is precision measurements of the Higgs boson. One of the channels that will improve by orders of magnitude at this new accelerator is the decay of the Higgs boson to a pair of charm quarks. This project will estimate the projected sensitivity to the coupling of the Higgs boson to second-generation quarks, and in particular target the improved reconstruction of the topology of long-lived mesons in the clean environment of a precision e+e- machine.

''Contact: [mailto:tdupree@nikhef.nl Tristan du Pree]''
===Gravitational Waves: Computer modelling to design the laser interferometers for the Einstein telescope===

A new field of instrument science led to the successful detection of gravitational waves by the LIGO detectors in 2015. We are now preparing the next generation of gravitational-wave observatories, such as the Einstein Telescope, with the aim of increasing the detector sensitivity by a factor of ten. This would, for example, allow the detection of stellar-mass black holes from early in the universe, when the first stars began to form. This ambitious goal requires us to find ways to significantly improve the best laser interferometers in the world.

Gravitational-wave detectors such as LIGO and Virgo are complex Michelson-type interferometers, enhanced with optical cavities. We develop and use numerical models to study these laser interferometers, to invent new optical techniques and to quantify their performance. For example, we synthesize virtual mirror surfaces to study the effects of higher-order optical modes in the interferometers, and we use opto-mechanical models to test schemes for suppressing quantum fluctuations of the light field. We can offer several projects based on numerical modelling of laser interferometers. All projects will be directly linked to the ongoing design of the Einstein Telescope.

''Contact: [mailto:a.freise@nikhef.nl Andreas Freise]''
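To illustrate the kind of numerical modelling involved, here is a minimal model of a two-mirror Fabry-Perot cavity, the basic building block of such interferometers. The mirror reflectivities are illustrative values, not an actual Einstein Telescope design.

```python
import numpy as np

# Minimal steady-state model of a two-mirror Fabry-Perot cavity.
# Mirror amplitude reflectivities are assumed example values.
r1, r2 = 0.99, 0.999
t1 = np.sqrt(1 - r1**2)  # input-mirror amplitude transmissivity (lossless)

def circulating_power(phi):
    """Intra-cavity power build-up factor vs one-way phase detuning phi [rad]."""
    field = t1 / (1 - r1 * r2 * np.exp(-2j * phi))  # geometric sum of round trips
    return np.abs(field) ** 2

phi = np.linspace(-0.01, 0.01, 2001)
gain = circulating_power(phi)
print(f"peak build-up: {gain.max():.0f} "
      f"at detuning {phi[np.argmax(gain)]:.5f} rad")
```

Research codes such as those used for the Einstein Telescope design extend this idea to full interferometer topologies, higher-order optical modes and quantum noise, but the underlying physics of resonant field build-up is the same.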
===LHCb: Search for light dark particles===

The Standard Model of elementary particles does not contain a proper Dark Matter candidate. One of the most tantalizing theoretical developments is the so-called ''Hidden Valley'' models: a mirror-like copy of the ''Standard Model'', with dark particles that communicate with the standard ones via a very feeble interaction. These models predict the existence of ''dark hadrons'': composite particles that are bound similarly to ordinary hadrons in the ''Standard Model''. Such ''dark hadrons'' can be abundantly produced in high-energy proton-proton collisions, making the LHC a unique place to search for them. Some ''dark hadrons'' are stable like the proton, which makes them excellent ''Dark Matter'' candidates, while others decay to ordinary particles after flying a certain distance in the collider experiment. The LHCb detector has a unique capability to identify such decays, particularly if the new particles have a mass below ten times the proton mass.

This project concerns a unique search for light ''dark hadrons'' that covers a mass range not accessible to other experiments. It involves an interesting program of (Python-based) data analysis with non-trivial machine-learning solutions, and phenomenology research using a fast-simulation framework. Depending on your interests, there is quite a bit of flexibility in the precise focus of the project.

''Contact: [mailto:andrii.usachov@nikhef.nl Andrii Usachov]''
===LHCb: Searching for dark matter in exotic six-quark particles===

Three quarters of the mass in the Universe is of unknown type. Many hypotheses about this dark matter have been proposed, but none confirmed. Recently it has been proposed that it could be made of particles consisting of the six quarks uuddss, which would be a Standard-Model solution to the dark-matter problem. This idea has gained credibility as many similar multi-quark states are being discovered by the LHCb experiment. Such a particle could be produced in decays of heavy baryons, or directly in proton-proton collisions. The anti-particle, made of six antiquarks, could be seen when annihilating with detector material. It is also proposed to use Xi_b baryons produced at LHCb to search for such a state, which would appear as missing 4-momentum in a kinematically constrained decay. The project consists of defining a selection and applying it to LHCb data. See [https://arxiv.org/abs/2007.10378 arXiv:2007.10378].

Contact: ''[mailto:patrick.koppenburg@cern.ch Patrick Koppenburg]''
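The missing-4-momentum idea above can be sketched in a few lines: subtracting the summed four-momentum of the reconstructed (visible) decay products from the Xi_b candidate's four-momentum gives the invariant mass of the unseen state. The four-vectors below are invented numbers for illustration, not LHCb data.

```python
import numpy as np

# Sketch of the missing-mass technique for Xi_b -> (visible) + X.
# Four-vectors (E, px, py, pz) in GeV are invented for illustration.
def mass(p):
    """Invariant mass of a four-vector (E, px, py, pz)."""
    E, px, py, pz = p
    return np.sqrt(max(E**2 - px**2 - py**2 - pz**2, 0.0))

p_xib     = np.array([80.0, 0.0, 0.0, 79.79])  # hypothetical Xi_b candidate
p_visible = np.array([55.0, 0.3, -0.2, 54.9])  # sum of reconstructed tracks

p_missing = p_xib - p_visible  # four-momentum of the unseen state X
print(f"missing mass = {mass(p_missing):.2f} GeV")
```

In the real analysis the Xi_b four-momentum is itself constrained by the decay kinematics and vertex topology, which is what makes the missing mass a well-defined observable.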
===LHCb: Measuring lepton flavour universality with excited Ds states in semileptonic Bs decays===

One of the most striking discrepancies between the Standard Model and measurements comes from lepton-flavour-universality (LFU) tests with tau decays. At the moment, an excess of 3-4 sigma has been observed in ''B → Dτν'' decays. This could even point to a new force of nature! To understand this discrepancy, further measurements are needed.

One very exciting (pun intended) project to verify these discrepancies involves measuring the ''B<sub>s</sub> → D<sub>s2</sub><sup>*</sup>τν'' and/or ''B<sub>s</sub> → D<sub>s1</sub><sup>*</sup>τν'' decays. These decays with excited states of the ''D<sub>s</sub>'' meson have not been observed before in the tau decay mode, and have a unique way of coupling to potential new-physics candidates that can only be measured in ''B<sub>s</sub>'' decays [1]. See the slides for more detail: [[File:LHCbLFUwithExcitedDs.pdf|thumb]]

[1] https://arxiv.org/abs/1606.09300

''Contact: [mailto:suzannek@nikhef.nl Suzanne Klaver]''
===LHCb: New physics in the angular distributions of B decays to K*ee===

Hints of lepton-flavour-universality violation in B decays can be explained by a variety of non-Standard-Model interactions. Angular distributions in decays of a B meson to a hadron and two leptons are an important source of information to understand which model is correct. Previous analyses at the LHCb experiment have considered final states with a pair of muons. Our LHCb group at Nikhef concentrates on a new measurement of the angular distributions in decays with two electrons. The main challenge in this measurement is the calibration of the detection efficiency. In this project you will confront estimates of the detection efficiency derived from simulation with decay distributions in a well-known B decay. Once the calibration is understood, the very first analysis of the angular distributions in the electron final state can be performed.

Contact:  [mailto:m.senghi.soares@nikhef.nl Mara Soares] and [mailto:wouterh@nikhef.nl Wouter Hulsbergen]
===LHCb: Discovering heavy neutrinos in B decays===

Neutrinos are the lightest of all fermions in the Standard Model. Mechanisms to explain their small mass rely on the introduction of new, much heavier, neutral leptons. If the mass of these new neutrinos is below the b-quark mass, they can be observed in B-hadron decays.

In this project we search for the decay of B+ mesons into an ordinary electron or muon and the yet undiscovered heavy neutrino. The heavy neutrino is expected to be unstable and in turn decay quickly into a charged pion and another electron or muon. The final state in which the two leptons differ in flavour, "B+ → e mu pi", is particularly interesting: it is forbidden in the Standard Model, such that backgrounds are small. The analysis will be performed within the LHCb group at Nikhef using LHCb Run-2 data.
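The experimental signature is simple to state: the heavy neutrino would appear as a peak in the invariant mass of the pion-lepton pair from its decay. A minimal sketch, with invented four-vectors for illustration (units GeV):

```python
# Sketch of reconstructing a heavy-neutrino candidate N -> mu pi as the
# invariant mass of its decay products. Four-vectors (E, px, py, pz) are
# invented numbers for illustration only.
def add(p, q):
    return [a + b for a, b in zip(p, q)]

def inv_mass(p):
    E, px, py, pz = p
    return (E**2 - px**2 - py**2 - pz**2) ** 0.5

p_mu = [3.0, 0.5, 0.1, 2.95]   # hypothetical muon from the N decay
p_pi = [2.0, -0.3, 0.2, 1.95]  # hypothetical pion from the N decay

m_N = inv_mass(add(p_mu, p_pi))
print(f"heavy-neutrino candidate mass: {m_N:.2f} GeV")
```

In data, a genuine heavy neutrino would show up as an accumulation of candidates at a fixed mass, on top of a smooth combinatorial background.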
===LHCb: Scintillating Fibre tracker software===

The installation of the scintillating-fibre tracker in LHCb's underground cavern was recently completed. This detector uses 10000 km of fibres to track particle trajectories in the LHCb detector when the LHC starts up again later this year. The light emitted by the scintillating fibres when a particle interacts with them is measured with silicon photomultipliers. The studies proposed for this project will focus on software, and could include writing a framework to monitor the detector output, improving the detector simulation, or working on the data processing.

''Contact: [mailto:e.gabriel@nikhef.nl Emmy Gabriel]''
===LHCb: Vertex detector calibration===

In the summer of 2022, LHCb started data taking with an almost entirely new detector. At the point closest to the interaction point, the trajectories of charged particles are reconstructed with a so-called silicon pixel detector. The design hit resolution of this detector is about 15 micron. However, to actually reach this resolution, a precise calibration of the spatial positions of the silicon sensors needs to be performed. In this project, you will use the first data of the new LHCb detector to perform this calibration and measure the detector performance.

''Contact: [mailto:wouterh@nikhef.nl Wouter Hulsbergen]''
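The principle behind such a calibration can be illustrated with a toy version of track-based alignment: an unknown sensor translation shows up as a bias in the track-hit residuals, and the mean residual is used as the alignment correction. All numbers below are synthetic.

```python
import numpy as np

# Toy track-based alignment of a single sensor coordinate.
# A sensor shift biases the residuals (measured hit minus track
# prediction); the mean residual recovers the shift. Numbers are synthetic.
rng = np.random.default_rng(3)
true_misalignment = 0.042  # mm, unknown sensor shift to be recovered
hit_resolution = 0.015     # mm, single-hit resolution

n_tracks = 10_000
residuals = true_misalignment + rng.normal(0, hit_resolution, n_tracks)

correction = residuals.mean()       # alignment constant for this sensor
aligned = residuals - correction    # residuals after alignment
print(f"recovered shift: {correction*1e3:.1f} um, "
      f"residual spread after alignment: {aligned.std()*1e3:.1f} um")
```

The real calibration solves for many sensor positions and rotations simultaneously, with correlations between them, but each alignment constant is constrained by the track residuals in essentially this way.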
  
===Neutrinos: Neutrino scattering: the ultimate resolution===
  
Neutrino telescopes like IceCube and KM3NeT aim at detecting neutrinos from cosmic sources. The neutrinos are detected with the best resolution when charged-current interactions with nucleons produce a muon, which can be detected with high accuracy (depending on the detector). A crucial ingredient in the ultimate achievable pointing accuracy of a neutrino telescope is the scattering angle between the neutrino and the muon. While published computations have investigated the cross-section of the process in great detail, this important scattering angle has not received much attention. The aim of the project is to compute and characterize the distribution of this angle, and thus the ultimate resolution of a neutrino telescope. If successful, the results of this project can lead to a publication of interest to the neutrino-telescope community.
 
Depending on your interests, the study could be based on a first-principles calculation (using the deep-inelastic scattering formalism), include state-of-the-art parton distribution functions, and/or exploit existing event-generation software for a more experimental approach.  
  
''Contact: [mailto:aart.heijboer@nikhef.nl Aart Heijboer]''
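To set the scale of the effect, the median neutrino-muon angle in charged-current interactions is often quoted with an empirical rule of thumb of roughly 0.7° / (E<sub>ν</sub>/TeV)<sup>0.7</sup>. The sketch below uses that parameterization purely as a placeholder for the first-principles DIS calculation that the project would carry out.

```python
# Rule-of-thumb estimate of the median neutrino-muon scattering angle in
# charged-current interactions, ~0.7 deg / (E_nu / TeV)^0.7. This empirical
# parameterization is only a placeholder for the first-principles
# deep-inelastic-scattering calculation targeted by the project.
def median_angle_deg(e_nu_tev):
    return 0.7 / e_nu_tev ** 0.7

for e in (1, 10, 100):  # neutrino energy in TeV
    print(f"E_nu = {e:5d} TeV  ->  median angle ~ {median_angle_deg(e):.3f} deg")
```

The rapid decrease with energy is why the kinematic scattering angle matters most for the low-energy part of a telescope's sensitivity.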
  
===Neutrinos: acoustic detection of ultra-high energy neutrinos===
  
The study of cosmic neutrinos with energies above 10<sup>17</sup> eV, the so-called ultra-high-energy neutrinos, provides a unique view on the universe and may provide insight into the origin of the most violent astrophysical sources, such as gamma-ray bursts, supernovae or even dark matter. In addition, the observation of high-energy neutrinos may provide a unique tool to study interactions at high energies. The energy deposition of these extreme neutrinos in water induces a thermo-acoustic signal, which can be detected using sensitive hydrophones. The expected neutrino flux is, however, extremely low, and the signal that neutrinos induce is small. TNO is presently developing sensitive hydrophone technology based on fibre optics. Optical fibres form a natural way to create a distributed sensing system. Using this technology, a large-scale neutrino telescope can be built in the deep sea. TNO is aiming for a prototype hydrophone that will form the building block of a future telescope.
  
The work will be executed at the Nikhef institute and/or the TNO laboratories in Delft. In this project master students have the opportunity to contribute in the following ways:
  
'''Project 1:''' Hardware development on fiber optics hydrophones technology Goal: characterize existing prototype optical fibre hydrophones in an anechoic basin at TNO laboratory. Data collection, calibration, characterization, analysis of consequences for design future acoustic hydrophone neutrino telescopes;
 +
Keywords: Optical fiber technology, signal processing, electronics, lab.
  
An update and extension of the LISA science simulation software is needed to assess the hardware development for LISA at Nikhef, TNO and SRON. A position is therefore available for a master student to study the impact of instrumental noise on the performance of LISA. Realistic simulations based on hardware (noise) characterization measurements that were done at TNO will be carried out and compared to the expected tantalizing gravitational wave sources.
+
'''Project 2:''' Investigation of ultra-high energy neutrinos and their interactions with matter. Goal: Discriminate the neutrino signals from the background noises, in particular clicks from whales and dolphins in the deep sea. Study impact on physics reach for future acoustic hydrophone neutrino telescopes;
 +
Keywords: Monte Carlo simulations, particle physics, neutrino physics, data analysis algorithms.
  
Key words: LISA, space, gravitational waves, simulations, signal processing
+
Further information: Info on ultra-high energy neutrinos can be found at: http://arxiv.org/abs/1102.3591; Info on acoustic detection of neutrinos can be found at: http://arxiv.org/abs/1311.7588
  
''Contact: [mailto:nielsvb@nikhef.nl Niels van Bakel],[mailto:ernst-jan.buis@tno.nl Ernst-Jan Buis]''
+
''Contact: [mailto:ernst-jan.buis@tno.nl Ernst Jan Buis]'' or ''[mailto:ivo.van.vulpen@nikhef.nl Ivo van Vulpen]''
  
=== KM3NeT: Reconstruction of first neutrino interactions in KM3NeT ===

The neutrino telescope KM3NeT is under construction in the Mediterranean Sea, aiming to detect cosmic neutrinos. Its first two strings with sensitive photodetectors were deployed in 2015 and 2016. Already these few strings make it possible to reconstruct the abundant muons stemming from interactions of cosmic rays with the atmosphere and to identify neutrino interactions. In order to identify neutrinos, an accurate reconstruction and an optimal understanding of the backgrounds are crucial. In this project we will use the available data to identify and reconstruct the first neutrino interactions in the KM3NeT detector, and with this pave the path towards neutrino astronomy.

Programming skills are essential; mostly ROOT and C++ will be used.

''Contact: [mailto:bruijn@nikhef.nl Ronald Bruijn]''

=== Neutrinos: Oscillation analysis with the first data of KM3NeT ===

The neutrino telescope KM3NeT is under construction in the Mediterranean Sea, aiming to detect cosmic neutrinos. Its first few strings with sensitive photodetectors have been deployed at both the Italian and the French detector sites. Already these few strings make it possible to reconstruct the abundant muons stemming from interactions of cosmic rays with the atmosphere and to identify neutrino interactions. In this project the available data will be used together with simulations to best reconstruct the event topologies and to optimally identify and reconstruct the first neutrino interactions in the KM3NeT detector. The data will then be used to measure neutrino oscillation parameters and to prepare for a future determination of the neutrino mass ordering.

Programming skills are essential; mostly ROOT and C++ will be used.

''Contact: [mailto:bruijn@nikhef.nl Ronald Bruijn], [mailto:h26@nikhef.nl Paul de Jong]''
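As an illustration of the measurement principle, the two-flavour vacuum oscillation formula can be evaluated in a few lines of Python (a simplified sketch; the parameter values below are illustrative, not KM3NeT results):

```python
import math

def survival_prob(L_km, E_GeV, sin2_2theta=0.99, dm2_eV2=2.5e-3):
    """Two-flavour vacuum survival probability P(nu_mu -> nu_mu)."""
    phase = 1.27 * dm2_eV2 * L_km / E_GeV  # 1.27 converts eV^2 * km / GeV to radians
    return 1.0 - sin2_2theta * math.sin(phase) ** 2

# Atmospheric neutrinos crossing the Earth's diameter (~12700 km):
# with these parameters the first oscillation minimum lies near E ~ 25 GeV,
# while at very high energies the survival probability approaches 1.
p_min = survival_prob(12700, 25.7)
p_high = survival_prob(12700, 1000)
```

Fitting measured event rates as a function of reconstructed L/E against this kind of prediction (with detector response folded in) is what constrains the oscillation parameters.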
===Neutrinos: the Deep Underground Neutrino Experiment (DUNE)===

The Deep Underground Neutrino Experiment (DUNE) is under construction in the USA and will consist of a powerful neutrino beam originating at Fermilab, a near detector at Fermilab, and a far detector in the SURF facility in Lead, South Dakota, 1300 km away. As they travel, neutrinos oscillate and a fraction of the neutrino beam changes flavour; DUNE will determine the neutrino oscillation parameters to unrivaled precision and aim for a first detection of CP violation in neutrinos. In this project, various elements of DUNE can be studied, including the neutrino oscillation fit, neutrino physics with the near detector, event reconstruction and classification (including machine learning), or elements of data selection and triggering.

''Contact: [mailto:h26@nikhef.nl Paul de Jong]''
===Neutrinos: relic neutrino detection with PTOLEMY===

PTOLEMY aims to make the first direct observation of the Big Bang relic neutrinos (the cosmic neutrino background, CνB) by resolving the β-decay endpoint of atomic tritium (the neutrino capture target) to O(meV) precision. This remains an outstanding test of the Standard Model in an expanding universe. Not only does the CνB carry a signal from the hot, dense universe only one second after the Big Bang, it also helps to constrain the balance of hot versus cold dark matter responsible for its evolution. In doing so, the PTOLEMY experiment would also measure the lowest neutrino mass, an as-of-yet unknown fundamental constant. The experiment is currently in the prototyping phase, and the group at Nikhef is responsible for developing the radio-frequency (RF) system used for the cyclotron-radiation (CR) based trigger and tracking. This component will provide the trajectory of electrons entering the novel transverse drift filter, constraining the electrons' energy losses before they reach the cryogenic calorimeter, which in turn records their final energy. The focus of this project will be modelling CR and its detection for the purposes of single-electron spectroscopy and optimised trajectory reconstruction. There is also the opportunity to test hardware and readout electronics for the prototype RF system.

''Contact: [mailto:jmead@nikhef.nl James Vincent Mead]''
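To get a feel for the cyclotron-radiation signal, the emitted frequency f = eB/(2πγm<sub>e</sub>) can be estimated for an electron near the tritium β endpoint (a rough sketch; the 1 T field value is illustrative, not the PTOLEMY design value):

```python
import math

E_CHARGE = 1.602176634e-19   # electron charge [C]
M_E = 9.1093837015e-31       # electron mass [kg]
M_E_KEV = 510.99895          # electron rest energy [keV]

def cyclotron_freq(B_tesla, kin_keV):
    """Relativistic cyclotron frequency f = e*B / (2*pi*gamma*m_e)."""
    gamma = 1.0 + kin_keV / M_E_KEV
    return E_CHARGE * B_tesla / (2.0 * math.pi * gamma * M_E)

# Electron near the tritium beta endpoint (~18.6 keV) in a 1 T field:
# the frequency comes out around 27 GHz, i.e. in the microwave band,
# which is why an RF system is used to pick up the signal.
f_endpoint = cyclotron_freq(1.0, 18.6)
```

The small γ-dependent shift of this frequency with kinetic energy is what makes single-electron spectroscopy via cyclotron radiation possible.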
=== Theoretical Particle Physics: Effective Field Theories of Particle Physics from low to high energies ===

The known elementary matter particles exhibit a surprising three-fold structure. The particles belonging to each of these three "generations" seem to display a remarkable pattern of identical properties, yet have vastly different masses. This puzzling pattern is unexplained. Equally unexplained is the bewildering imbalance between matter and anti-matter observed in the universe, despite the minimal differences in the properties of particles and anti-particles. These two mystifying phenomena may originate from a deeper, still unknown, fundamental structure characterised by novel types of particles and interactions, whose unveiling would revolutionise our understanding of nature. The ultimate goal of particle physics is uncovering a fundamental theory which allows the coherent interpretation of phenomena taking place at all energy and distance scales. In this project, the student will exploit the Standard Model Effective Field Theory (SMEFT) formalism, which allows the theoretical interpretation of particle physics data in terms of new fundamental quantum interactions which relate seemingly disconnected processes, with minimal assumptions on the nature of an eventual UV-complete theory that replaces the Standard Model. Specifically, the goal is to connect measurements from the ATLAS, CMS, and LHCb experiments at CERN's LHC and to jointly interpret this information with that provided by other experiments, including very low-energy probes such as the anomalous magnetic moment of the muon or the electric dipole moments of the electron and neutron.

This project will be based on theoretical calculations in particle physics, numerical simulations in Python, analysis of existing data from the LHC and other experiments, as well as formal developments in understanding the operator structure of effective field theories. Depending on the student profile, sub-projects with a strong computational and/or machine learning component are also possible, for instance to construct new observables with optimal sensitivity to New Physics effects as encoded by the SMEFT higher-dimensional operators. Topics that can be considered in this project include the interpretation of novel physical observables at the LHC and their integration into the global SMEFiT analysis, the matching of EFTs to UV-complete theories and their phenomenological analyses, projections for the impact of data from future colliders on the SMEFT parameter space, the synergies between EFT studies and proton structure fits, and the matching to the Weak Effective Field Theory to include data on flavour observables such as B-meson decays.

References: https://arxiv.org/abs/2105.00006, https://arxiv.org/abs/2302.06660, https://arxiv.org/abs/2211.02058, https://arxiv.org/abs/1901.05965, https://arxiv.org/abs/1906.05296, https://arxiv.org/abs/1908.05588, https://arxiv.org/abs/1905.05215; see also this [https://www.dropbox.com/s/30co188f1almzq2/rojo-GRAPPA-MSc-2023.pdf?dl=0 project description].

''Contact: [mailto:j.rojo@vu.nl Juan Rojo]''
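As a toy illustration of how a Wilson coefficient is constrained in such a global fit, one can minimise a χ² over pseudo-data with a linearised EFT correction. All numbers below are invented for illustration; a real SMEFiT analysis involves thousands of data points, dozens of operators, and correlated uncertainties:

```python
import numpy as np

# Toy setup: an observable receives a linearised SMEFT correction,
#   sigma = sigma_SM * (1 + c * k),
# where c is a Wilson coefficient and k the per-bin sensitivity
# (the 1/Lambda^2 suppression is absorbed into k). Numbers are invented.
sigma_sm = np.array([10.0, 4.0, 1.5])   # SM predictions per bin
k = np.array([0.05, 0.12, 0.30])        # EFT sensitivities per bin
data = np.array([10.2, 4.3, 1.8])       # pseudo-measurements
err = np.array([0.4, 0.25, 0.15])       # uncertainties

def chi2(c):
    theory = sigma_sm * (1.0 + c * k)
    return float(np.sum(((data - theory) / err) ** 2))

# Scan the Wilson coefficient and pick the chi^2 minimum
grid = np.linspace(-2.0, 2.0, 4001)
best = grid[np.argmin([chi2(c) for c in grid])]
```

In practice the scan over a single coefficient is replaced by sampling a high-dimensional parameter space, which is where the machine-learning machinery of SMEFiT comes in.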
===Theoretical Particle Physics: High-energy neutrino-nucleon interactions at the Forward Physics Facility===

High-energy collisions at the High-Luminosity Large Hadron Collider (HL-LHC) produce a large number of particles along the beam collision axis, outside the acceptance of existing experiments. The proposed Forward Physics Facility (FPF), to be located several hundred meters from the ATLAS interaction point and shielded by concrete and rock, will host a suite of experiments to probe Standard Model (SM) processes and search for physics beyond the Standard Model (BSM). High-statistics neutrino detection will provide valuable data for fundamental topics in perturbative and non-perturbative QCD and in weak interactions. Experiments at the FPF will enable synergies between forward particle production at the LHC and astroparticle physics to be exploited. The FPF has the promising potential to probe our understanding of the strong interactions as well as of proton and nuclear structure, providing access to both the very low-x and the very high-x regions of the colliding protons. The former regime is sensitive to novel QCD production mechanisms, such as BFKL effects and non-linear dynamics, as well as to the gluon parton distribution function (PDF) down to x = 1e-7, well beyond the coverage of other experiments and providing key inputs for astroparticle physics. In addition, the FPF acts as a neutrino-induced deep-inelastic scattering (DIS) experiment with TeV-scale neutrino beams. The resulting measurements of neutrino DIS structure functions represent a valuable handle on the partonic structure of nucleons and nuclei, particularly their quark flavour separation, that is fully complementary to the charged-lepton DIS measurements expected at the upcoming Electron-Ion Collider (EIC).

In this project, the student will carry out updated predictions for the neutrino fluxes expected at the FPF, assess the precision with which neutrino cross-sections will be measured, and quantify their impact on proton and nuclear structure by means of machine learning tools within the NNPDF framework and state-of-the-art calculations in perturbative Quantum Chromodynamics. This project contributes to ongoing work within the FPF Initiative towards a Conceptual Design Report (CDR) to be presented within two years. Topics that can be considered as part of this project include the assessment of the extent to which nuclear modifications of the free-proton PDFs can be constrained by FPF measurements, the determination of the small-x gluon PDF from suitably defined observables at the FPF and its implications for ultra-high-energy particle astrophysics, the study of the intrinsic charm content of the proton and its consequences for the FPF physics program, and the validation of models for neutrino-nucleon cross-sections in the region beyond the validity of perturbative QCD.

References: https://arxiv.org/abs/2203.05090, https://arxiv.org/abs/2109.10905, https://arxiv.org/abs/2208.08372, https://arxiv.org/abs/2201.12363, https://arxiv.org/abs/2109.02653, https://github.com/NNPDF/; see also this [https://www.dropbox.com/s/30co188f1almzq2/rojo-GRAPPA-MSc-2023.pdf?dl=0 project description].

''Contact: [mailto:j.rojo@vu.nl Juan Rojo]''
  
=== ANTARES: Analysis of IceCube neutrino sources ===

The only evidence so far for high-energy neutrinos from cosmic sources comes from detections with the IceCube detector. Most of the detected events were reconstructed with a large uncertainty on their direction, which has prevented an association with astrophysical sources. Only for the high-energy muon-neutrino candidates has a high directional resolution been achieved, but also for those no significant correlation with astrophysical sources has been detected to date. The ANTARES neutrino telescope has continuously taken neutrino data since 2007 with high angular resolution, which can be exploited to further scrutinize the locations of these neutrino sources. In this project we will address the neutrino sources in a stacked analysis to further probe the origin of the neutrinos with enhanced sensitivity.

Programming skills are essential; mainly C++ and ROOT will be used.

''Contact: [mailto:dosamt@nikhef.nl Dorothea Samtleben]''

===Theoretical Particle Physics: Probing the origin of the proton spin with machine learning===

At energy-frontier facilities such as the Large Hadron Collider (LHC), scientists study the laws of nature in their quest for novel phenomena both within and beyond the Standard Model of particle physics. An in-depth understanding of the quark and gluon substructure of protons and heavy nuclei is crucial to address pressing questions from the nature of the Higgs boson to the origin of cosmic neutrinos. The key to addressing some of these questions is a universal analysis of nucleon structure based on the simultaneous determination of the momentum and spin distributions of quarks and gluons and their fragmentation into hadrons. This effort requires combining an extensive experimental dataset and cutting-edge theory calculations within a machine learning framework where neural networks parametrise the underlying physical laws while minimising ad-hoc model assumptions. The upcoming Electron-Ion Collider (EIC), to start taking data in 2029, will be the world's first ever polarised lepton-hadron collider and will offer a plethora of opportunities to address key open questions in our understanding of the strong nuclear force, such as the origin of the mass and the intrinsic angular momentum (spin) of hadrons, and whether there exists a state of matter which is entirely dominated by gluons. To fully exploit this scientific potential, novel analysis methodologies need to be developed that make it possible to carry out large-scale, coherent interpretations of measurements from the EIC and other high-energy colliders.

In this project, the student will carry out a new global analysis of the spin structure of the proton by means of the machine learning tools provided by the NNPDF open-source fitting framework and state-of-the-art calculations in perturbative Quantum Chromodynamics, and integrate it with the corresponding global NNPDF analyses of unpolarised proton and nuclear structure in the framework of a combined, integrated global analysis of non-perturbative QCD. Specifically, the project aims to realise an NNLO global fit of polarised quark and gluon PDFs that combines all available data and state-of-the-art perturbative QCD calculations, and to study the phenomenological implications for other experiments, including the EIC, for the spin content of the proton, for comparisons with lattice QCD calculations, and for non-perturbative models of hadron structure.

References: https://arxiv.org/abs/2201.12363, https://arxiv.org/abs/2109.02653, https://arxiv.org/abs/2103.05419, https://arxiv.org/abs/1404.4293, https://inspirehep.net/literature/1302398, https://github.com/NNPDF/; see also this [https://www.dropbox.com/s/30co188f1almzq2/rojo-GRAPPA-MSc-2023.pdf?dl=0 project description].

''Contact: [mailto:j.rojo@vu.nl Juan Rojo]''
  
=== VU LaserLaB: Measuring the electric dipole moment (EDM) of the electron ===

In collaboration with Nikhef and the Van Swinderen Institute for Particle Physics and Gravity at the University of Groningen, we have recently started an exciting project to measure the electric dipole moment (EDM) of the electron in cold beams of barium-fluoride molecules. The eEDM, which is predicted by the Standard Model of particle physics to be extremely small, is a powerful probe to explore physics beyond this Standard Model. Extensions of the Standard Model, most prominently supersymmetry, naturally predict an electron EDM that is just below the current experimental limits. We aim to improve on the best current measurement by at least an order of magnitude. To do so we will perform a precision measurement on a slow beam of laser-cooled BaF molecules. With this low-energy precision experiment, we test physics at energies comparable to those of the LHC!

At LaserLaB VU, we are responsible for building and testing a cryogenic source of BaF molecules. The main parts of this source are currently being constructed in the workshop. We are looking for enthusiastic master students to help set up the laser system that will be used to detect BaF. Furthermore, projects are available to perform trajectory simulations to design a lens system that guides the BaF molecules from the exit of the cryogenic source to the experiment.

''Contact: [mailto:H.L.Bethlem@vu.nl Rick Bethlem]''

==='''Theoretical Particle Physics''': Charged lepton flavor violation in neutrino mass models===

The nonzero value of neutrino masses requires an explanation beyond the Standard Model of particle physics. A promising solution involves the existence of extra neutrinos, often called right-handed or sterile neutrinos. These models elegantly explain neutrino masses and can also be connected to other puzzles, such as the absence of anti-matter in our universe. In this project you will investigate potential experimental signatures of sterile neutrinos through decays that are extremely rare in the Standard Model. Examples are muon decays to electrons and photons, or muon-to-electron conversion in nuclei. You will perform Quantum Field Theory calculations within the neutrino-extended Standard Model to compute the rates of these processes and compare them to experimental sensitivities.

''Contact: [mailto:j.devries4@uva.nl Jordy de Vries]''

==='''Theoretical Particle Physics''': The electric dipole moment of paramagnetic systems in the Standard Model===

Electric dipole moments (EDMs) of elementary particles, hadrons, nuclei, atoms, and molecules would indicate the violation of CP symmetry. The Standard Model (SM) contains CP violation in the weak interaction in the so-called CKM matrix (the quark-mixing matrix), but this is often claimed to lead to EDMs that are too small to be seen. In this project we will reinvestigate the computation of the EDMs of systems that are used in state-of-the-art experiments. In particular, we will compute a CP-violating interaction between electrons and nucleons mediated by the SM weak interaction. During this project you will obtain a deep understanding of the Standard Model and of explicit quantum field theory calculations across a wide range of energy scales.

''Contact: [mailto:j.devries4@uva.nl Jordy de Vries]''
 +
==='''Theoretical Particle Physics''': Predictions for Charged-Particle Tracks from First Principles===

Measurements based on tracks of charged particles benefit from superior angular resolution. This is essential for a new class of observables called energy correlators, for which a range of interesting applications has been identified: studying the [https://arxiv.org/abs/2201.07800 confinement transition], measuring the [https://arxiv.org/abs/2201.08393 top quark mass] more precisely, etc. I developed a [https://arxiv.org/abs/1303.6637 framework] for calculating track-based observables, in which the conversion of quarks and gluons to charged hadrons is described by track functions. This generalization of the well-studied parton distribution functions and fragmentation functions is currently being measured by ATLAS, though the data are not public yet. Interestingly, in recent years two groups have proposed predicting fragmentation functions from first principles (https://arxiv.org/abs/2010.02934, https://arxiv.org/abs/2301.09649). In this project you would extend one (or both) of these approaches to obtain a prediction for the track function.

''Contact: [mailto:w.j.waalewijn@uva.nl Wouter Waalewijn]''
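The idea of a track-based observable can be illustrated with a toy two-point energy correlator: every particle pair is weighted by E<sub>i</sub>E<sub>j</sub>/Q², and a track-based measurement restricts the sum to charged particles. The four-particle event below is hypothetical, and a real track-function calculation re-weights by charged-energy fractions rather than applying a hard cut:

```python
import itertools

# Toy final state: (energy [GeV], angle [rad, planar toy geometry], charged?)
particles = [(40.0, 0.00, True), (25.0, 0.05, False),
             (20.0, 1.00, True), (15.0, 1.10, True)]
Q = sum(E for E, _, _ in particles)  # total energy

def eec_pairs(parts, charged_only=False):
    """Energy-weighted pairwise angles: entries (theta_ij, E_i*E_j/Q^2).
    charged_only=True mimics a track-based (charged-particle) measurement."""
    if charged_only:
        parts = [p for p in parts if p[2]]
    out = []
    for (E1, a1, _), (E2, a2, _) in itertools.combinations(parts, 2):
        out.append((abs(a1 - a2), E1 * E2 / Q ** 2))
    return out

all_pairs = eec_pairs(particles)
track_pairs = eec_pairs(particles, charged_only=True)
```

Histogramming the angles with these weights gives the correlator; the track function is what lets one predict the charged-only version from the full parton-level calculation.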
  
=== VU LaserLab: Physics beyond the Standard Model from molecules ===

Our team, with a number of staff members (Ubachs, Eikema, Salumbides, Bethlem, Koelemeij), focuses on precision measurements in the hydrogen molecule and its isotopomers. The work aims at testing the QED calculations of energy levels in H2, D2, T2, HD, etc. with the most precise measurements, where all kinds of experimental laser techniques play a role (cw and pulsed lasers, atomic clocks, frequency combs, molecular beams). Also a target of study is the connection to the "proton size puzzle", which may be solved through studies in the hydrogen molecular isotopes.

In the past half year we have produced a number of important results that are described in the following papers:

* ''Frequency-comb (Ramsey-type) electronic excitations in the H2 molecule'': Deep-ultraviolet frequency metrology of H2 for tests of molecular quantum theory, http://www.nat.vu.nl/~wimu/Publications/Altmann-PRL-2018.pdf
* ''Precision measurement of an infrared transition in the HD molecule'': Sub-Doppler frequency metrology in HD for tests of fundamental physics, https://arxiv.org/abs/1712.08438
* ''The first precision study in molecular tritium T2'': Relativistic and QED effects in the fundamental vibration of T2, http://arxiv.org/abs/1803.03161
* ''Dissociation energy of the hydrogen molecule at 10^-9 accuracy'', paper submitted to Phys. Rev. Lett.
* ''Probing QED and fundamental constants through laser spectroscopy of vibrational transitions in HD+'': this is also a study of the hydrogen molecular ion HD+, where important results were obtained not so long ago and where we have a strong activity, http://www.nat.vu.nl/~wimu/Publications/ncomms10385.pdf

These five results mark the various directions we are pursuing, and in all directions we aim at obtaining improvements. Specific projects with students can be defined; these are mostly experimental, although there might be some theoretical tasks, such as:

* Performing calculations of hyperfine structures

As for the theory there might also be an international connection for specifically bright theory students: we collaborate closely with prof. Krzysztof Pachucki, and we might find an opportunity for a student to perform (the best!) QED calculations in molecules, working in Warsaw and partly in Amsterdam. Prof. Frederic Merkt from ETH Zurich, an expert in the field, will come to work with us on "hydrogen" during August-December 2018 while on sabbatical.

''Contact: [mailto:w.m.g.ubachs@vu.nl Wim Ubachs], [mailto:k.s.e.eikema@vu.nl Kjeld Eikema], [mailto:h.l.bethlem@vu.nl Rick Bethlem]''

----

== Finished master projects ==

See:
* https://wiki.nikhef.nl/education/Master_Theses
* https://www.nikhef.nl/master-theses-2021/
* https://www.nikhef.nl/facts-figures-2020/master-theses-2020/
  
  

Latest revision as of 15:21, 14 April 2024


ALICE: Search for new physics with 4D tracking at the most sensitive vertex detector at the LHC

With its newly installed Inner Tracking System, consisting fully of monolithic detectors, ALICE is more sensitive to particles with low transverse momenta than ATLAS and CMS, and this will be even more true for the ALICE upgrade detector planned for 2033. By using timing information along a track, this detector could potentially also be sensitive to long-lived particles that leave peculiar signatures in the tracker, such as disappearing or kinked tracks. In this project you will investigate how timing information in the different tracking layers can improve, or even enable, a search for new physics beyond the Standard Model in ALICE. If you demonstrate the possibility of major improvements, this can have real consequences for the choice of sensors for this ALICE inner tracker upgrade.

Contact: Jory Sonneveld and Panos Christakoglou

ALICE: Connecting the hot and cold QCD matter by searching for the strongest magnetic field in nature

In a non-central collision between two Pb ions, with a large value of the impact parameter, the charged nucleons that do not participate in the interaction (called spectators) create strong magnetic fields. A back-of-the-envelope calculation using the Biot-Savart law brings the magnitude of this field close to 10^19 Gauss, in agreement with state-of-the-art theoretical calculations, making it the strongest magnetic field in nature. The presence of this field could have direct implications for the motion of final-state particles. The magnetic field, however, decays rapidly, at a rate that depends on the electric conductivity of the medium, which is experimentally poorly constrained. Establishing the presence of this magnetic field, the main goal of this project, has so far not been achieved experimentally, and it can also have implications for measurements of gravitational waves emitted from the merger of neutron stars.
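The back-of-the-envelope estimate can be reproduced in a few lines (order-of-magnitude only; the spectator charge, Lorentz boost, and impact parameter below are illustrative values, not a full calculation):

```python
MU0_OVER_4PI = 1e-7     # magnetic constant / 4*pi [T*m/A]
E_CHARGE = 1.602e-19    # proton charge [C]
C_LIGHT = 2.998e8       # speed of light [m/s]

def spectator_field_tesla(Z, gamma, b_m):
    """Biot-Savart order-of-magnitude field of Z relativistic protons
    passing at transverse distance b: B ~ (mu0/4pi) * Z*e*gamma*c / b^2."""
    return MU0_OVER_4PI * Z * E_CHARGE * gamma * C_LIGHT / b_m ** 2

# ~80 spectator protons, gamma ~ 2700 (Pb-Pb at LHC energies), b ~ 20 fm
B_tesla = spectator_field_tesla(80, 2700, 20e-15)
B_gauss = B_tesla * 1e4   # 1 T = 10^4 G  -> order 10^19 Gauss
```

Varying the assumed charge and impact parameter by factors of a few moves the result within the 10^18-10^20 Gauss range quoted in the literature.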

Contact: Panos Christakoglou

ALICE/LHCb Tracking: Innovative tracking techniques exploiting modern heterogeneous architectures

The reconstruction of charged-particle tracks is one of the most computationally demanding components of modern high-energy physics experiments. In particular, the upcoming High-Luminosity Large Hadron Collider (HL-LHC) makes fast tracking algorithms that exploit modern computing architectures with many cores and accelerators essential. In this project we will investigate innovative, machine-learning-based, experiment-agnostic tracking algorithms on modern architectures, e.g. GPUs and FPGAs.

Contact: Jacco de Vries and Panos Christakoglou

ATLAS: Charged particle tracking in the ATLAS detector with new machine learning techniques

This project concerns the application of new machine learning techniques to the problem of track reconstruction at the ATLAS detector at CERN. While algorithms that construct particle tracks from low-level detector information, such as particle hits, have been around for decades, recent developments in the field of machine learning open up new opportunities to improve these algorithms significantly. In particular, transformer neural networks (the architecture that ChatGPT is based on) and graph neural networks will be studied for this problem, but there is a range of available options and the student is free to choose particular types of networks that are of special interest.

At the start of this project, simplified test data will be used for initial model development. Upon successful completion of this step, simulated data from the actual ATLAS detector will be used. The student will need some familiarity with programming in Python and an interest in machine learning, but a strong physics background is not necessary. In this project the student will be able to contribute to fundamental physics research and will become familiar with state-of-the-art machine learning models.
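As a minimal illustration of the graph-based approach, detector hits can be turned into graph edges by connecting hits on adjacent layers that are close in azimuthal angle; a graph neural network would then classify these candidate edges as belonging to a track or not. The geometry and numbers below are toy values, not ATLAS data:

```python
# Toy hits: (layer index, azimuthal angle phi [rad])
hits = [(0, 0.10), (0, 2.00), (1, 0.12), (1, 2.05), (2, 0.15), (2, -1.00)]

def build_edges(hits, max_dphi=0.1):
    """Connect hits on adjacent layers whose phi separation is small.
    These candidate edges are the inputs a GNN would classify."""
    edges = []
    for i, (layer_i, phi_i) in enumerate(hits):
        for j, (layer_j, phi_j) in enumerate(hits):
            if layer_j == layer_i + 1 and abs(phi_j - phi_i) < max_dphi:
                edges.append((i, j))
    return edges

edges = build_edges(hits)  # the isolated hit at phi = -1.00 gets no edge
```

Real pipelines add more features per hit (z, r, cluster shape) and per edge, but the graph-construction step has this same shape.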

Contact: Zef Wolffs and Ivo van Vulpen

ATLAS: Search for very rare Higgs decays to second-generation fermions

While the Higgs boson coupling to fermions of the third generation has been established experimentally, the investigation of the Higgs boson coupling to the light fermions of the second generation will be a central project for the current data-taking period of the LHC (2022-2025). The Higgs boson decay to muons is the most sensitive channel for probing this coupling. In this project, event selection algorithms for Higgs boson decays to muons in associated production with a gauge boson (VH) are developed, with the aim of distinguishing signal events from background processes such as Drell-Yan and WZ boson production. For this purpose, the candidate will implement and validate deep learning algorithms, and extract the final results based on a fit to the output of the deep learning classifier.
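A minimal sketch of the classification step, using a single-feature logistic model on invented toy data as a stand-in for a deep network (illustrative only; the real analysis uses many kinematic features and proper ATLAS simulation):

```python
import numpy as np

rng = np.random.default_rng(1)
# Toy events with one dimuon-mass-like variable: "signal" peaks near 125,
# "background" is smooth. All numbers are invented for illustration.
sig = rng.normal(125.0, 3.0, 1000)
bkg = rng.uniform(110.0, 140.0, 1000)
x = np.concatenate([sig, bkg])
y = np.concatenate([np.ones(1000), np.zeros(1000)])  # 1 = signal

f = np.abs(x - 125.0)  # single feature: distance from the signal peak
w, b = 0.0, 0.0
for _ in range(3000):  # logistic regression by gradient descent
    p = 1.0 / (1.0 + np.exp(-(w * f + b)))
    w -= 0.01 * np.mean((p - y) * f)
    b -= 0.01 * np.mean(p - y)

score_sig = 1.0 / (1.0 + np.exp(-(w * np.abs(sig - 125.0) + b)))
score_bkg = 1.0 / (1.0 + np.exp(-(w * np.abs(bkg - 125.0) + b)))
```

The classifier learns a negative weight (large distance from the peak lowers the signal score); the final physics result then comes from a statistical fit to the score distribution rather than from a hard cut.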

Contact: Oliver Rieger and Wouter Verkerke

ATLAS: Advanced deep-learning techniques for lepton identification

The ATLAS experiment at the Large Hadron Collider facilitates a broad spectrum of physics analyses. A critical aspect of these analyses is the efficient and accurate identification of leptons, which is crucial for both signal detection and background rejection. Distinguishing between prompt leptons, arising directly from the collision, and non-prompt leptons, originating from heavy-flavour hadron decays, is a challenging task. This project aims to develop and implement advanced deep learning models to push lepton identification beyond the capabilities of the current standard methods.

Contact: Oliver Rieger and Wouter Verkerke

ATLAS: Probing CP-violation in the Higgs sector with the ATLAS experiment

The Standard Model Effective Field Theory (SMEFT) provides a systematic approach to testing the impact of new physics at the energy scale of the LHC through higher-dimensional operators. The scarcity of antimatter in the cosmos arises from slight differences in the behavior of particles and their antiparticle counterparts, known as CP violation. The current data-taking period of the LHC is expected to yield a comprehensive dataset, enabling the investigation of CP-odd SMEFT operators in the Higgs boson's interactions with other particles. The interpretation of experimental data using SMEFT requires a particular interest in solving complex technical challenges, advanced statistical techniques, and a deep understanding of particle physics.

Contact: Lydia Brenner, Oliver Rieger and Wouter Verkerke

ATLAS: Signal and background sensitivity in Standard Model Effective Field Theory (SMEFT)

Complex statistical combinations of large sectors of the ATLAS scientific program are currently being used to obtain the best experimental sensitivity to SMEFT parameters. However, to achieve a fully consistent investigation of SMEFT, and to push the limit of what is possible with the data already collected, the effects of SMEFT operators on the background processes must be included as well. Joining our efforts on this topic means contributing to a cutting-edge investigation that requires both a particular motivation for solving complex technical challenges and a broad knowledge of experimental particle physics.

Contact: Andrea Visibile and Lydia Brenner

ATLAS: Performing a Bell test in Higgs to di-boson decays

Recently, theorists [1] have proposed to perform a Bell test in Higgs to di-boson decays. This is a fundamental test not only of quantum mechanics but also of quantum field theory, using the elusive scalar Higgs particle. At Nikhef we have started to brainstorm on the experimental aspects of this challenging measurement. Thanks to the studies of a PhD student [2], we have considerable experience in the reconstruction of the Higgs rest-frame angles that are essential to perform a Bell test. Is there a master student who wants to join our efforts to study the "spooky action at a distance" in Higgs to WW decays?

Contact: Peter Kluit

  [1] Review article https://arxiv.org/pdf/2402.07972.pdf

  [2] https://www.nikhef.nl/pub/services/biblio/theses_pdf/thesis_R_Aben.pdf

ATLAS: A new timing detector - the HGTD

The ATLAS detector is going to gain a new ability: a timing detector. This allows us to reconstruct tracks not only in the three dimensions of space, but also to measure very precisely (at the picosecond level) the time at which particles pass the sensitive layers of the HGTD detector. The added information helps to reconstruct the trajectories of the particles created at the LHC in four dimensions and will ultimately lead to a better reconstruction of physics at ATLAS. The new HGTD detector is still under construction, and work needs to be done at different levels, such as understanding the detector response (taking measurements in the lab and performing simulations) or developing algorithms to reconstruct the particle trajectories (programming and analysis work).
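The gain from timing can be illustrated with a toy calculation (not the ATLAS reconstruction software): if each track's production time is measured with a known resolution, an inverse-variance weighted average gives a vertex time estimate whose resolution improves with the number of tracks.

```python
import math

def combine_track_times(times_ps, sigmas_ps):
    """Inverse-variance weighted mean of per-track time measurements
    (illustrative only -- not the ATLAS reconstruction code)."""
    weights = [1.0 / s ** 2 for s in sigmas_ps]
    t_vertex = sum(w * t for w, t in zip(weights, times_ps)) / sum(weights)
    sigma_vertex = 1.0 / math.sqrt(sum(weights))  # improves as 1/sqrt(N)
    return t_vertex, sigma_vertex

# Two tracks, each measured with an assumed 35 ps resolution:
t_vtx, s_vtx = combine_track_times([10.0, 20.0], [35.0, 35.0])
```

Two nearby vertices that overlap in space can then still be separated if their combined time estimates differ by several sigma_vertex.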

Several projects are available within the context of the new HGTD detector:

  1. One can choose to focus on the impact on physics-analysis performance by studying how the timing measurements can be included in the reconstruction of tracks, and what effect this has on our understanding of the physical processes occurring in the LHC collisions. With this work you will be part of the ATLAS group at Nikhef.
  2. The second possibility is to test the sensors in our lab and in test-beam setups at CERN/DESY. The analysis will be performed in the context of the ATLAS HGTD test-beam group, in connection with both the ATLAS group and the R&D department at Nikhef.
  3. The third is to contribute to an ongoing effort to precisely simulate/model the silicon avalanche detectors in the Allpix2 framework. There are several models that try to describe the detector response; they depend on operating temperature, field strengths and radiation damage. We are getting close to being able to model our detector - but are not there yet. This work will be within the ATLAS group.

Contact: Hella Snoek

ATLAS: Studying rare modes of Higgs boson production at the LHC

The Higgs boson is a crucial piece of the Standard Model and its most recently discovered particle. Studying Higgs boson production and decay at the LHC might hold the key to unlocking new information about the physical laws governing our universe. With the LHC now in its third run, we can also use the enormous amounts of data being collected to study Higgs boson production modes we have not previously been able to access. For instance, we can look at the production of a Higgs boson via the fusion of two vector bosons, accompanied by the emission of a photon, with subsequent H->WW decay. This state is experimentally distinctive and should be accessible to us using the current dataset of the LHC. It is also theoretically interesting because it probes the Higgs boson's interaction with W bosons. This exact interaction is a cornerstone of electroweak symmetry breaking, the process by which particles gain mass, so studying it provides a window onto a fundamental part of the Standard Model. This project will study the feasibility of measuring this or another rare Higgs production mode using H->WW decays, providing a chance to be involved in the design of an analysis from the ground up.

Contact: Robin Hayes, Flavia de Almeida Dias

ATLAS: Exploring triboson polarisation in loop-induced processes at the LHC

Spin is a fundamental, quantum mechanical property carried by (most) elementary particles. When high-energy particles scatter, their spin influences how angular momentum is propagated through the process and, ultimately, how final-state particles are (geometrically) distributed. Helicity is the projection of the spin vector onto the momentum. For example: in the loop-induced process gg > W+W-Z, the angular separation between the various decay products of the W and Z bosons depends on the helicity polarisation of the intermediate W and Z bosons. The aim of this project is to explore helicity polarisation in multiboson processes, and specifically the gg > WWZ process, at the Large Hadron Collider. This project is at the interface between theory and experiment, and you will work with Monte Carlo generators, analysis design and sensitivity studies.
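As a toy illustration of how helicity shapes decay kinematics, the leading-order angular distributions of the decay lepton in the boson rest frame (proportional to sin^2(theta) for longitudinal states and (1 +/- cos(theta))^2 for transverse states; the sign conventions depend on charge and helicity and vary in the literature) can be sampled directly:

```python
import random

_RNG = random.Random(42)  # fixed seed for reproducibility

def sample_costheta(helicity):
    """Rejection-sample cos(theta*) of the decay lepton in the boson rest
    frame for a given helicity state (toy model; sign conventions vary)."""
    if helicity == 0:
        pdf = lambda c: 1.0 - c * c                      # longitudinal ~ sin^2
    else:
        pdf = lambda c: (1.0 + helicity * c) ** 2 / 4.0  # transverse
    while True:
        c = _RNG.uniform(-1.0, 1.0)
        if _RNG.random() < pdf(c):
            return c

# Longitudinal bosons emit decay products preferentially perpendicular to
# their flight direction; transverse ones peak forward or backward:
mean_long = sum(sample_costheta(0) for _ in range(5000)) / 5000
mean_plus = sum(sample_costheta(+1) for _ in range(5000)) / 5000
```

Comparing such generated angular distributions for different helicity fractions is, in essence, what the sensitivity studies in this project do with full Monte Carlo events.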

Contact: Flavia de Almeida Dias

ATLAS: High-Performance Simulations for High-Energy Physics Experiments

The role of simulation and synthetic data generation for High-Energy Physics (HEP) research is profound. While there are physics-accurate simulation frameworks available to provide the most realistic data syntheses, these tools are slow. Additionally, the output from physics-accurate simulations is closely tied to the experiment that the simulation was developed for and its software.

Fast simulation frameworks, on the other hand, can drastically simplify the simulation while still striking a balance between the speed and accuracy of the simulated events. The applications of simplified simulations and data are numerous. We will be focusing on the role of such data as an enabler for Machine Learning (ML) model design research.

This project aims to extend the REDVID simulation framework [1, 2] through addition of new features. The features considered for this iteration include:

  • Interaction with common Monte Carlo event generators: To calculate hit points for imported events
  • Addition of basic magnetic field effect: Simulation of a simplified, uniform magnetic field, affecting charged particle trajectories
  • Inclusion of pile-up effects during simulation: Multiple particle collisions occurring in close vicinity
  • Indication of bunch size
  • Spherical coordinates
  • Vectorised helical tracks
  • Considerations for reproducibility of collision events

The project is part of an ongoing effort to train and test ML models for particle track reconstruction for the HL-LHC. The improved version of REDVID can be used by the student and other users to generate training data for ML models. Depending on the progress and the interest, a secondary goal could be to perform comparisons with physics-accurate simulations or to investigate the impact of the new features on developed ML models.

Bonus: The student will be encouraged and supported to publish the output of this study in a relevant journal, such as "Data in Brief" by Elsevier.

Appendix - Terminology

The terminology for the considered simulations and their features is domain-specific and is explained below:

  • Synthetic data: Data generated during a simulation, which resembles real data to a limited extent.
  • Physics-accurate simulation: A type of simulation that strongly takes into account real-world physical interactions and utilises physics formulas to achieve this.
  • Complexity-aware simulation framework: A simulator which can be configured with different levels of simulation complexity, making the simulation closer or further away from the real-world case.
  • Complexity-reduced data set: Simplified data resulting from simplified simulations. This is in comparison to real data, or data generated by physics-accurate simulations.

References

[1] U. Odyurt et al. 2023. "Reduced Simulations for High-Energy Physics, a Middle Ground for Data-Driven Physics Research". URL: https://doi.org/10.48550/arXiv.2309.03780

[2] U. Odyurt. 2023. "REDVID Simulation Framework". URL: https://virtualdetector.com/redvid

Contact: dr. ir. Uraz Odyurt, dr. Roel Aaij


Dark Matter: Building better Dark Matter Detectors - the XAMS R&D Setup

The Amsterdam Dark Matter group operates an R&D xenon detector at Nikhef. The detector is a dual-phase xenon time-projection chamber and contains about 0.5 kg of ultra-pure liquid xenon in the central volume. We use this detector for the development of new detection techniques - such as utilizing our newly installed silicon photomultipliers - and to improve the understanding of the response of liquid xenon to various forms of radiation. The results could be directly used in the XENONnT experiment, the world's most sensitive direct detection dark matter experiment at the Gran Sasso underground laboratory, or for future dark matter experiments like DARWIN. We have several interesting projects for this facility. We are looking for someone who is interested in working in a laboratory on high-tech equipment, modifying the detector, taking data and analyzing the data themselves. You will "own" this experiment.

Contact: Patrick Decowski and Auke Colijn

Dark Matter: Searching for Dark Matter Particles - XENONnT Data Analysis

The XENON collaboration has used the XENON1T detector to achieve the world's most sensitive direct detection dark matter results and is currently operating the XENONnT successor experiment. The detectors operate at the Gran Sasso underground laboratory and consist of so-called dual-phase xenon time-projection chambers filled with ultra-pure xenon. Our group has an opening for a motivated MSc student to do analysis with the new data coming from the XENONnT detector. The work will consist of understanding the detector signals and applying a deep neural network to improve the (gas-)background discrimination in our Python-based analysis tool, improving the sensitivity for low-mass dark matter particles. The work will continue a study started by a recent graduate. There will also be an opportunity to do data-taking shifts at the Gran Sasso underground laboratory in Italy.

Contact: Patrick Decowski and Auke Colijn

Dark Matter: Signal reconstruction and correction in XENONnT

XENONnT is a low-background experiment operating at the INFN - Gran Sasso underground laboratory with the main goal of detecting Dark Matter interactions with xenon target nuclei. The detector, consisting of a dual-phase time projection chamber, is filled with ultra-pure xenon, which acts as both target and detection medium. Understanding the detector's response to various calibration sources is a mandatory step in exploiting the scientific data acquired. This MSc thesis aims to develop new methods to improve the reconstruction and correction of scintillation/ionization signals from calibration data. The student will work with modern analysis techniques (Python-based) and will collaborate with other analysts within the international XENON Collaboration.

Contact: Maxime Pierre, Patrick Decowski

Dark Matter: The Ultimate Dark Matter Experiment - DARWIN Sensitivity Studies

DARWIN is the “ultimate” direct detection dark matter experiment, with the goal to reach the so-called “neutrino floor”, when neutrinos become a hard-to-reduce background. The large and exquisitely clean xenon mass will allow DARWIN to also be sensitive to other physics signals such as solar neutrinos, double-beta decay from Xe-136, axions and axion-like particles etc. While the experiment will only start in 2027, we are in the midst of optimizing the experiment, which is driven by simulations. We have an opening for a student to work on the GEANT4 Monte Carlo simulations for DARWIN. We are also working on a “fast simulation” that could be included in this framework. It is your opportunity to steer the optimization of a large and unique experiment. This project requires good programming skills (Python and C++) and data analysis/physics interpretation skills.

Contact: Tina Pollmann, Patrick Decowski or Auke Colijn

Dark Matter: Exploring new background sources for DARWIN

Experiments based on the xenon dual-phase time projection chamber detection technology have already demonstrated their leading role in the search for Dark Matter. The unprecedented low level of background reached by the current generation, such as XENONnT, allows such experiments to be sensitive to new rare-events physics searches, broadening their physics program. The next generation of experiments is already under consideration with the DARWIN observatory, which aims to surpass its predecessors in terms of background level and mass of xenon target. With the increased sensitivity to new physics channels, such as the study of neutrino properties, new sources of backgrounds may arise. This MSc thesis aims to investigate potential new sources of background for DARWIN and is a good opportunity for the student to contribute to the design of the experiment. This project will rely on Monte Carlo simulation tools such as GEANT4 and FLUKA, and good programming skills (Python and C++) are advantageous.

Contact: Maxime Pierre, Patrick Decowski

Dark Matter: Sensitive tests of wavelength-shifting properties of materials for dark matter detectors

Rare event search experiments that look for neutrino and dark matter interactions are performed with highly sensitive detector systems, often relying on scintillators, especially liquid noble gases, to detect particle interactions. Detectors consist of structural materials that are assumed to be optically passive, and light detection systems that use reflectors, light detectors, and sometimes, wavelength-shifting materials. MSc theses are available related to measuring the efficiency of light detection systems that might be used in future detectors. Furthermore, measurements to ensure that presumably passive materials do not fluoresce, at the low level relevant to the detectors, can be done. Part of the thesis work can include Monte Carlo simulations and data analysis for current and upcoming dark matter detectors, to study the effect of different levels of desired and nuisance wavelength shifting. In this project, students will acquire skills in photon detection, wavelength shifting technologies, vacuum systems, UV and extreme-UV optics, detector design, and optionally in Python and C++ programming, data analysis, and Monte Carlo techniques.

Contact: Tina Pollmann

Detector R&D: Energy Calibration of hybrid pixel detector with the Timepix4 chip

The Large Hadron Collider at CERN will increase its luminosity in the coming years. For the LHCb experiment, the number of collisions per bunch crossing increases from 7 to more than 40. To distinguish all tracks from the quasi-simultaneous collisions, time information will have to be used in addition to spatial information. A big step on the way to fast silicon detectors is the recently developed Timepix4 ASIC. Timepix4 consists of 448x512 pixels, but the pixels are not identical and there are pixel-to-pixel fluctuations in the time and charge measurements. The ultimate time resolution can only be achieved after calibration of both the time and energy measurements. The goal of this project is to study the energy calibration of Timepix4. Typical research questions are: how does the resolution depend on the threshold and the Krummenacher (discharge) current, and does a different sensor affect the energy resolution? In this research you will do measurements with calibration pulses, lasers and radioactive sources to obtain data to calibrate the detector. The work consists of hands-on work in the lab to build/adapt the test set-up, and analysis of the data obtained.
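For Timepix-family chips, the per-pixel charge (time-over-threshold) response is commonly described by a surrogate function of the form ToT(E) = a*E + b - c/(E - t), whose constants are fitted from test-pulse and X-ray data. A minimal sketch of this function and its analytic inversion, with made-up constants standing in for fitted per-pixel values:

```python
import math

# Surrogate calibration function commonly used for Timepix-family chips:
#   ToT(E) = a*E + b - c/(E - t)
# The constants below are illustrative placeholders, not fitted values.
A, B, C, T = 0.4, 20.0, 150.0, 3.0

def tot_from_energy(e_kev):
    """Time-over-threshold (clock counts) for a deposited energy in keV."""
    return A * e_kev + B - C / (e_kev - T)

def energy_from_tot(tot):
    """Invert the surrogate by solving the quadratic
    a*E^2 + (b - a*t - tot)*E + (tot*t - b*t - c) = 0 (physical root)."""
    p = B - A * T - tot
    q = tot * T - B * T - C
    return (-p + math.sqrt(p * p - 4.0 * A * q)) / (2.0 * A)
```

In the actual calibration, one such fit is performed for every one of the roughly 230 thousand pixels, which is exactly why the pixel-to-pixel fluctuations matter.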

Contact: Daan Oppenhuis, Hella Snoek

Detector R&D: Studies of wafer-scale sensors for ALICE detector upgrade and beyond

One of the biggest milestones of the ALICE detector upgrade (foreseen in 2026) is the implementation of wafer-scale (~ 28 cm x 18 cm) monolithic silicon active pixel sensors in the tracking detector, with the goal of having truly cylindrical barrels around the beam pipe. To demonstrate such an unprecedented technology in high energy physics detectors, a few chips will soon be available in the Nikhef laboratories for testing and characterization purposes. The goal of the project is to contribute to the validation of the samples against the ALICE tracking detector requirements, with a focus on timing performance in view of other applications in future high energy physics experiments beyond ALICE. We are looking for a student with a focus on lab work who is interested in high-precision measurements with cutting-edge instrumentation. You will be part of the Nikhef Detector R&D group and you will have, at the same time, the chance to work in an international collaboration where you will report on the performance of these novel sensors. There may even be the opportunity to join beam tests at CERN or DESY facilities. Besides interest in hardware, some proficiency in computing is required (Python or C++/ROOT).

Contact: Jory Sonneveld

Detector R&D: Time resolution of monolithic silicon detectors

Monolithic silicon detectors based on industrial Complementary Metal Oxide Semiconductor (CMOS) processes offer a promising approach for large-scale detectors due to their ease of production and low material budget. Until recently, their low radiation tolerance has hindered their applicability in high energy particle physics experiments. However, new prototypes, such as the one in this project, have started to overcome these hurdles, making them feasible candidates for future experiments in high energy particle physics. In this project, you will investigate the temporal performance of a radiation-hard monolithic detector prototype, produced at the end of 2023, using laser setups in the laboratory. You will also participate in meetings with the international collaboration working on this detector to present reports on the prototype's performance. Different aspects of the system, such as charge calibration and power consumption, are to be investigated for their impact on the temporal resolution. Depending on the progress of the work, a first full three-dimensional characterization of the prototype's performance using a state-of-the-art two-photon absorption laser setup at Nikhef and/or an investigation into irradiated samples, for a closer look at the impact of radiation damage on the prototype, are possible. This project is looking for someone interested in working hands-on with cutting-edge detector and laser systems at the Nikhef laboratory. Python programming skills and Linux experience are an advantage.

Contact: Jory Sonneveld, Uwe Kraemer

Detector R&D: Improving a Laser Setup for Testing Fast Silicon Pixel Detectors

For the upgrades of the innermost detectors of experiments at the Large Hadron Collider in Geneva, in particular to cope with the large number of collisions per second from 2027, the Detector R&D group at Nikhef tests new pixel detector prototypes with a variety of laser equipment at several wavelengths. The lasers can be focused down to a small spot to scan over the pixels on a pixel chip. Since the laser penetrates the silicon, the pixels will not be illuminated by just the focal spot, but by the entire three-dimensional, hourglass- or double-cone-like light intensity distribution. So, how well defined is the volume in which charge is released? Can it be made much smaller than a pixel? And, if so, what would the optimum focus be? For this project the student will first estimate the intensity distribution that can be expected inside a sensor. This will correspond to the density of released charge within the silicon. To verify predictions, you will measure real pixel sensors for the LHC experiments. This project will involve a lot of hands-on work in the lab, as well as programming and work on Unix machines.
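The shape of that double cone follows from Gaussian-beam optics: the 1/e^2 beam radius grows as w(z) = w0*sqrt(1 + (z/z_R)^2), and inside silicon the Rayleigh range z_R is stretched by the refractive index, elongating the hourglass. A small sketch with assumed beam parameters (spot size, wavelength and power are illustrative, not our lab's values):

```python
import math

def beam_radius(z_um, w0_um=1.0, wavelength_um=1.064, n_si=3.5):
    """Gaussian-beam 1/e^2 radius at depth z inside silicon.
    n_si ~ 3.5 is the refractive index of silicon in the near infrared;
    w0 and wavelength here are assumed example values."""
    z_r = math.pi * w0_um ** 2 * n_si / wavelength_um  # Rayleigh range
    return w0_um * math.sqrt(1.0 + (z_um / z_r) ** 2)

def peak_intensity(z_um, power_mw=1.0, **kw):
    """On-axis intensity (mW/um^2) of the focused beam at depth z."""
    w = beam_radius(z_um, **kw)
    return 2.0 * power_mw / (math.pi * w * w)
```

Integrating such an intensity profile over a pixel cell gives the expected charge-deposition volume that the lab measurements can then be compared against.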

Contact: Martin Fransen

Detector R&D: Time resolution of hybrid pixel detectors with the Timepix4 chip

Precise time measurements with silicon pixel detectors are very important for experiments at the High-Luminosity LHC and the future circular collider. The spatial resolution of current silicon trackers will not be sufficient to distinguish the large number of collisions that will occur within individual bunch crossings. In a new method, typically referred to as 4D tracking, spatial measurements of pixel detectors will be combined with time measurements to better distinguish collision vertices that occur close together. New sensor technologies are being explored to reach the required time measurement resolution of tens of picoseconds, and the results are promising. However, the signals that these pixelated sensors produce have to be processed by front-end electronics, which hence play a large role in the total time resolution of the detector. The front-end electronics has many parameters that can be optimised to give the best time resolution for a specific sensor type. In this project you will be working with the Timepix4 chip, which is a so-called application specific integrated circuit (ASIC) that is designed to read out pixelated sensors. This ASIC is used extensively in detector R&D for the characterisation of new sensor technologies requiring precise timing (< 50 ps). To study the time resolution you will be using laser setups in our lab, and there might be an opportunity to join a test with charged particle beams at CERN. These measurements will be complemented with data from the built-in calibration-pulse mechanism of the Timepix4 ASIC. Your work will enable further research performed with this ASIC, and serve as input to the design and operation of future ASICs for experiments at the High-Luminosity LHC.
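Since the independent contributions to a time measurement add in quadrature, a simple budget already shows the floor set by TDC quantisation (a flat bin of width b contributes b/sqrt(12)). A sketch, assuming the nominal ~195 ps Timepix4 fine-TDC bin (check the chip manual for the exact value):

```python
import math

def time_resolution(sigma_sensor_ps, sigma_frontend_ps, tdc_bin_ps=195.0):
    """Total time resolution as the quadrature sum of independent terms.
    The TDC quantisation of a flat bin of width b contributes b/sqrt(12);
    tdc_bin_ps=195 is an assumed nominal Timepix4 value."""
    sigma_tdc = tdc_bin_ps / math.sqrt(12.0)
    return math.sqrt(sigma_sensor_ps ** 2 + sigma_frontend_ps ** 2
                     + sigma_tdc ** 2)

# Even a perfect sensor and front end cannot beat the TDC binning term:
floor_ps = time_resolution(0.0, 0.0)
```

This is why optimising the front-end parameters only pays off down to the quantisation floor, and why sub-50 ps sensor characterisation needs the complementary calibration-pulse data mentioned above.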

Contact: Kevin Heijhoff and Martin van Beuzekom

Detector R&D: Performance studies of Trench Isolated Low Gain Avalanche Detectors (TI-LGAD)

The future vertex detector of the LHCb experiment needs to measure the spatial coordinates and time of the particles originating in the LHC proton-proton collisions with resolutions better than 10 um and 50 ps, respectively. Several technologies are being considered to achieve these resolutions. Among them is a novel sensor technology called the Trench Isolated Low Gain Avalanche Detector. Prototype pixelated sensors have been manufactured recently and have to be characterised. To this end, these new sensors will be bump-bonded to a Timepix4 ASIC, which provides charge and time measurements in each of its 230 thousand pixels. Characterisation will be done using a lab setup at Nikhef, and includes tests with a micro-focused laser beam, radioactive sources, and possibly particle tracks obtained in a test beam. This project involves data taking with these new devices and analysing the data to determine performance parameters, such as the spatial and temporal resolution, as a function of temperature and other operational conditions.

Contacts: Kazu Akiba and Martin van Beuzekom

Detector R&D: A Telescope with Ultrathin Sensors for Beam Tests

To measure the performance of new prototypes for upgrades of the LHC experiments and beyond, typically a telescope is placed in a beam line of charged particles, so that the results in the prototype can be compared to particle tracks measured with this telescope. In this project, you will continue work on a very lightweight, compact telescope using ALICE PIxel DEtectors (ALPIDEs). This includes work on the mechanics, data acquisition software, and a moveable stage. You will foreseeably test this telescope in the Delft Proton Therapy Center. If time allows, you will add a timing plane and perform a measurement with one of our prototypes. Apart from travel to Delft, there is a possibility to travel to other beam line facilities.

Contact: Jory Sonneveld

Detector R&D: Laser Interferometer Space Antenna (LISA) - the first gravitational wave detector in space

The space-based gravitational wave antenna LISA is one of the most challenging space missions ever proposed. ESA plans to launch, around 2035, three spacecraft separated by a few million kilometres. This constellation measures tiny variations in the distances between test masses located in each satellite to detect gravitational waves from sources such as supermassive black holes. LISA is based on laser interferometry: the three satellites form a giant Michelson interferometer, and LISA measures the relative phase shift between one local laser and one distant laser by light interference. The phase-shift measurement requires sensitive sensors. The Nikhef Detector R&D group fabricated prototype sensors in 2020 together with the photonics industry and the Dutch institute for space research SRON. Nikhef & SRON are responsible for the Quadrant PhotoReceiver (QPR) system: the sensors, the housing including a complex mount to align the sensors with tens-of-nanometers accuracy, various environmental tests at the European Space Research and Technology Centre (ESTEC), and the overall performance of the QPR in the LISA instrument. Currently we are fabricating improved sensors, optimizing the mechanics and preparing environmental tests. As an MSc student, you will work on various aspects of the wavefront sensor development: studying the performance of the epitaxial stacks of Indium-Gallium-Arsenide, setting up test benches to characterize the sensors and the QPR system, performing the actual tests and data analysis, in combination with performance studies and simulations of the LISA instrument. Possible projects are listed below, but it is best to contact us as the exact content may change:

  1. Title: Simulating LISA QPD performance for LISA mission sensitivity.
    Topic: Simulation and Data Analysis.
    Description: we must provide accurate information to the LISA collaboration about the expected and actual performance of the LISA QPRs. This project will focus on integrating data from measurements taken at Nikhef into the simulation packages used within the LISA collaboration. The student will have the option to collect their own data to verify the simulations. Performance parameters include spatial uniformity and phase response, crosstalk, and thermal response across the LISA sensitivity band.
    These simulations can then be used to investigate the full LISA performance and the impact of noise sources. This will involve simulating the heterodyne signals expected on the LISA QPD and the impact on sensing techniques such as Differential Wavefront Sensing (DWS) and Tilt-to-Length (TTL) noise. Simulation tools include Finesse (Python), IFOCAD (C++) or FieldProp (MATLAB), depending on the student's capabilities and preference. This work is important for understanding how stable and noise-free LISA interferometry will be during real operation in space.
  2. Title: Investigate the Response of the Gap in the LISA QPD.
    Topic: Experimental.
    Description: At Nikhef we are developing the photodiodes that will be used in the upcoming ESA/NASA LISA mission. We currently have our first batch of Quadrant Photodiodes (QPDs), which vary in diameter, thickness and gap width between the quadrants. The goal of this project is to develop a free-space laser test set-up to measure the response of the gap between the quadrants of the LISA Quadrant Photodiode (QPD). It is important to understand the behaviour of the gap between the photodiode quadrants since it can impact the overall performance of the photodiode and thus the sensitivity of LISA.
    The measurements will involve characterising the test laser beam, configuring test equipment, and handling and installing optical components. Besides taking the data, the student will also be responsible for analysing the results, preferably using Python, although other programming languages are acceptable (based on the student's preference).
  3. Title: Investigate the Response of LISA QPDs for Einstein Telescope Pathfinder.
    Topic: Experimental.
    Description: Current gravitational wave (GW) interferometers typically operate at a 1064 nm wavelength. However, future GW detectors will operate at longer wavelengths such as 1550 nm or 2000 nm. As a result of the wavelength change, much of the current technology is unsuitable; thus, developments are underway for the next generation of GW detectors. Europe's future GW detector, the Einstein Telescope, is currently in its infancy. A smaller-scale prototype, known as ET Pathfinder, is currently being built and serves as a test bench for the full-scale detector.
    At Nikhef's R&D group, we want to develop quadrant photodiodes (QPDs) that sense the interferometer light for the Einstein Telescope (ET) and ET Pathfinder. These QPDs require very low-noise performance as well as high sensitivity in order to measure the small interferometer signals. To that end, our first step is to use the current QPDs that have been developed for the ESA/NASA LISA mission.
    This project will focus on performance tests of the LISA QPDs using a 1550 nm laser. The student will be tasked with developing a test setup as well as taking the data and analysing the results. As part of this project, the student will learn about laser characterisation, Gaussian optics and instrumentation techniques. These results will be important for designing the next generation of QPDs and are of interest to the ET consortium, where the student can present their results.
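For the gap study in project 2 above, a first idealised estimate of how much beam power falls on the gap can be made analytically: for a centred Gaussian beam, the cross-shaped gap is the union of two strips, and the power in each strip is given by an error function. This is only a back-of-the-envelope sketch, assuming a perfectly centred beam and a fully dead gap:

```python
import math

def gap_loss_fraction(gap_um, w_um):
    """Fraction of a centred Gaussian beam's power landing on the
    cross-shaped gap between the four quadrants of a QPD.
    Intensity ~ exp(-2 r^2 / w^2), so a strip |x| < g/2 collects
    erf(g / (sqrt(2) w)); the two strips combine by inclusion-exclusion.
    Idealised estimate: centred beam, fully insensitive gap."""
    s = math.erf(gap_um / (math.sqrt(2.0) * w_um))
    return 2.0 * s - s * s
```

The free-space measurements in the project would then probe how much the real gap response deviates from this idealised dead-zone picture.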


Contact: Niels van Bakel or Timesh Mistry

Detector R&D: Other projects

Are you looking for a slightly different project? Are the above projects already taken? Are you coming in at an unusual time of the year? Do not hesitate to contact us! We always have new projects coming up at different times in the year and we are open to your ideas.

Contact: Jory Sonneveld

Gravitational Waves: Computer modelling to design the laser interferometers for the Einstein Telescope

A new field of instrument science led to the successful detection of gravitational waves by the LIGO detectors in 2015. We are now preparing the next generation of gravitational wave observatories, such as the Einstein Telescope, with the aim to increase the detector sensitivity by a factor of ten, which would allow, for example, to detect stellar-mass black holes from early in the universe when the first stars began to form. This ambitious goal requires us to find ways to significantly improve the best laser interferometers in the world.

Gravitational wave detectors are complex Michelson-type interferometers enhanced with optical cavities. We develop and use numerical models to study these laser interferometers, to invent new optical techniques and to quantify their performance. For example, we synthesize virtual mirror surfaces to study the effects of higher-order optical modes in the interferometers, and we use opto-mechanical models to test schemes for suppressing quantum fluctuations of the light field. We can offer several projects based on numerical modelling of laser interferometers. All projects will be directly linked to the ongoing design of the Einstein Telescope.
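As a flavour of such modelling, the on-resonance power build-up of a single two-mirror cavity can already be computed in closed form; with an input-mirror transmission typical of current detectors (an assumed example value, not an Einstein Telescope design number) the circulating power is enhanced by a factor of a few hundred:

```python
import math

def cavity_power_gain(T_in, T_end, loss=0.0):
    """On-resonance circulating-power gain of a two-mirror cavity:
    P_circ / P_in = T_in / (1 - r_in * r_end)^2, with r = sqrt(1 - T - loss).
    Transmissions below are assumed example values."""
    r_in = math.sqrt(1.0 - T_in - loss)
    r_end = math.sqrt(1.0 - T_end - loss)
    return T_in / (1.0 - r_in * r_end) ** 2

# A 1.4% input coupler with a near-perfect end mirror:
gain = cavity_power_gain(0.014, 5e-6)
```

The full numerical models extend this single-number estimate to coupled cavities, realistic mirror maps and higher-order optical modes.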

Contact: Andreas Freise

Gravitational-Waves: Get rid of those damn vibrations!

In 2015 large scale, precision interferometry led to the detection of gravitational-waves. In 2017 Europe’s Advanced Virgo detector joined this international network and the best studied astrophysical event in history, GW170817, was detected in both gravitational waves and across the electromagnetic spectrum.

The Nikhef gravitational wave group is actively contributing improvements to the current gravitational-wave detectors and to the rapidly maturing design of Europe's next-generation gravitational-wave observatory, the Einstein Telescope, with one of two candidate sites located in the Netherlands. These detectors will unveil the gravitational symphony of the dark universe out to cosmological distances. Breaking past the sensitivity achieved by the current observatories will require a radically new approach to core components of these state-of-the-art machines. This is especially true at the lowest, audio-band frequencies that the Einstein Telescope is targeting, where large improvements are needed.

Our project, Omnisens, brings techniques from space-based satellite control back to Earth, building a platform capable of actively cancelling ground vibrations to levels never reached before. This is realised with state-of-the-art compact interferometric sensors and precision mechanics. Substantial cancellation of seismic motion is an essential improvement for the Einstein Telescope, to reach below-attometer (10^-18 m) displacements.

We are excited to offer two projects in 2024:

  1. You will experimentally demonstrate and optimise Omnisens’ novel vibration isolation for future deployment on the Einstein Telescope. The activity will involve hands-on experience with laser, electronic, mechanical, and high-vacuum systems.
  2. You will contribute to the design of the Einstein Telescope by modelling the coupling of seismic and technical noises (such as actuation and sensing noises) through different configurations of seismic actuation chains. An accurate modelling of the origin and transmission of those noises is crucial in designing a system that prevents them from limiting the interferometer’s readout.

Contact: Conor Mow-Lowry

Gravitational Waves: Signal models & tools for data analysis

Theoretical predictions of gravitational-wave (GW) signals provide essential tools to detect and analyse transient GW events in the data of GW instruments like LIGO and Virgo. Over the last few years, there has been significant effort to develop signal models that accurately describe the complex morphology of GWs from merging neutron-star and black-hole binaries. Future analyses of Einstein Telescope (ET) data will need to tackle much longer and louder compact binary signals, which will require significant developments beyond the current status quo of GW modeling (e.g., improvements in model accuracy and computational efficiency, increased parameter space coverage, ...).

We can offer up to two projects: in GW signal modeling (at the interface of perturbation theory, numerical relativity simulations and fast phenomenological descriptions), as well as developing applications of signal models in GW data analysis. Although not strictly required, prior knowledge of basic concepts of general relativity and/or GW theory will be helpful. Some proficiency in computing is required (Mathematica, Python or C++).
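To give a flavour of what a signal model predicts, the sketch below evaluates the GW frequency of an inspiralling binary a given time before merger at the lowest (Newtonian/quadrupole) order. This is only the leading-order approximation; the models discussed above include high post-Newtonian orders, spins, merger and ringdown. The binary masses chosen here are illustrative.

```python
# Illustrative only: lowest-order (quadrupole) chirp for a compact binary.
# Real LIGO/Virgo/ET signal models go far beyond this approximation.
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8          # speed of light, m/s
MSUN = 1.989e30      # solar mass, kg

def chirp_mass(m1, m2):
    """Chirp mass (kg) for component masses (kg)."""
    return (m1 * m2) ** 0.6 / (m1 + m2) ** 0.2

def gw_frequency(tau, mchirp):
    """GW frequency (Hz) a time tau (s) before merger, Newtonian order."""
    k = G * mchirp / C**3
    return (1.0 / math.pi) * (5.0 / (256.0 * tau)) ** 0.375 * k ** (-0.625)

mc = chirp_mass(1.4 * MSUN, 1.4 * MSUN)   # GW170817-like neutron-star binary
f = gw_frequency(1.0, mc)                  # frequency one second before merger
```

One second before merger such a binary already radiates at roughly 130 Hz, well inside the audio band targeted by ET.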

Contact: Maria Haney

Theoretical Particle Physics: High-energy neutrino physics at the LHC

High-energy collisions at the LHC and its High-Luminosity upgrade (HL-LHC) produce a large number of particles along the beam collision axis, outside of the acceptance of existing experiments. The FASER experiment has in 2023, for the first time, detected neutrinos produced in LHC collisions, and is now starting to elucidate their properties. In this context, the proposed Forward Physics Facility (FPF), to be located several hundred meters from the ATLAS interaction point and shielded by concrete and rock, will host a suite of experiments to probe Standard Model (SM) processes and search for physics beyond the Standard Model (BSM). High-statistics neutrino detection will provide valuable data for fundamental topics in perturbative and non-perturbative QCD and in weak interactions. Experiments at the FPF will enable synergies between forward particle production at the LHC and astroparticle physics to be exploited. The FPF has the promising potential to probe our understanding of the strong interactions as well as of proton and nuclear structure, providing access to both the very low-x and the very high-x regions of the colliding protons. The former regime is sensitive to novel QCD production mechanisms, such as BFKL effects and non-linear dynamics, as well as the gluon parton distribution function (PDF) down to x = 10^-7, well beyond the coverage of other experiments and providing key inputs for astroparticle physics. In addition, the FPF acts as a neutrino-induced deep-inelastic scattering (DIS) experiment with TeV-scale neutrino beams. The resulting measurements of neutrino DIS structure functions represent a valuable handle on the partonic structure of nucleons and nuclei, particularly their quark flavour separation, that is fully complementary to the charged-lepton DIS measurements expected at the upcoming Electron-Ion Collider (EIC).

In this project, the student will carry out updated predictions for the neutrino fluxes expected at the FPF, assess the precision with which neutrino cross-sections will be measured, develop novel Monte Carlo event generation tools for high-energy neutrino scattering, and quantify their impact on proton and nuclear structure by means of machine learning tools within the NNPDF framework and state-of-the-art calculations in perturbative Quantum Chromodynamics. This project contributes to ongoing work within the FPF Initiative towards a Conceptual Design Report (CDR) to be presented within two years. Topics that can be considered as part of this project include the assessment of the extent to which nuclear modifications of the free-proton PDFs can be constrained by FPF measurements, the determination of the small-x gluon PDF from suitably defined observables at the FPF and the implications for ultra-high-energy particle astrophysics, the study of the intrinsic charm content in the proton and its consequences for the FPF physics program, and the validation of models for neutrino-nucleon cross-sections in the region beyond the validity of perturbative QCD.

References: https://arxiv.org/abs/2203.05090, https://arxiv.org/abs/2109.10905, https://arxiv.org/abs/2208.08372, https://arxiv.org/abs/2201.12363, https://arxiv.org/abs/2109.02653, https://github.com/NNPDF/

Contacts: Juan Rojo

Theoretical Particle Physics: Unravelling proton structure with machine learning

At energy-frontier facilities such as the Large Hadron Collider (LHC), scientists study the laws of nature in their quest for novel phenomena both within and beyond the Standard Model of particle physics. An in-depth understanding of the quark and gluon substructure of protons and heavy nuclei is crucial to address pressing questions from the nature of the Higgs boson to the origin of cosmic neutrinos. The key to address some of these questions is carrying out a global analysis of nucleon structure by combining an extensive experimental dataset and cutting-edge theory calculations. Within the NNPDF approach, this is achieved by means of a machine learning framework where neural networks parametrise the underlying physical laws while minimising ad-hoc model assumptions. In addition to the LHC, the upcoming Electron Ion Collider (EIC), to start taking data in 2029, will be the world's first ever polarised lepton-hadron collider and will offer a plethora of opportunities to address key open questions in our understanding of the strong nuclear force, such as the origin of the mass and the intrinsic angular momentum (spin) of hadrons and whether there exists a state of matter which is entirely dominated by gluons.

In this project, the student will develop novel machine learning and AI approaches aimed at improving global analyses of proton structure and at delivering better predictions for the LHC, the EIC, and astroparticle physics experiments. These new approaches will be implemented within the machine learning tools provided by the NNPDF open-source fitting framework and will use state-of-the-art calculations in perturbative Quantum Chromodynamics. Techniques that will be considered include normalising flows, graph neural networks, Gaussian processes, and kernel methods for unsupervised learning. Particular emphasis will be devoted to the automated determination of model hyperparameters, as well as to estimating the associated model uncertainties and systematically validating them with a battery of statistical tests. The outcome of the project will benefit the ongoing program of high-precision theory predictions for ongoing and future experiments in particle physics.
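The core idea can be sketched in a few lines: a neural network parametrises the PDF at the initial scale, multiplied by a preprocessing factor that enforces the expected small-x and large-x behaviour. The sketch below is schematic only; the architecture, the exponents alpha and beta, and the random (unfitted) weights are illustrative assumptions, not the actual NNPDF code.

```python
# Schematic of the NNPDF idea (NOT the actual NNPDF framework): a small neural
# network times a preprocessing factor x^(1-alpha) * (1-x)^beta parametrises a
# PDF at the initial scale. Weights here are random, not fitted to data.
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)   # input features: (x, log x)
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)

def pdf(x, alpha=1.1, beta=3.0):
    """Toy PDF parametrisation: preprocessing factor * neural-network output."""
    features = np.stack([x, np.log(x)], axis=-1)
    hidden = np.tanh(features @ W1 + b1)
    nn_out = np.exp(hidden @ W2 + b2)[..., 0]    # exp keeps the output positive
    return x ** (1.0 - alpha) * (1.0 - x) ** beta * nn_out

xs = np.logspace(-5, -0.01, 50)   # sample from very small x up to x ~ 1
values = pdf(xs)
```

In a real fit the weights would be trained on the experimental dataset, and an ensemble of such networks (fitted to data replicas) propagates the uncertainties.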

References: https://arxiv.org/abs/2201.12363, https://arxiv.org/abs/2109.02653, https://arxiv.org/abs/2103.05419, https://arxiv.org/abs/1404.4293, https://inspirehep.net/literature/1302398, https://github.com/NNPDF/

Contacts: Juan Rojo

Theoretical Particle Physics: Sterile neutrino dark matter

The existence of right-handed (sterile) neutrinos is well motivated: all other Standard Model fermions come in both chiralities, and right-handed neutrinos naturally explain the small masses of the left-handed (active) neutrinos. If the lightest sterile neutrino is very long-lived, it could be dark matter. Although sterile neutrinos can be produced by neutrino oscillations in the early universe, this mechanism is not efficient enough to explain all dark matter. It has been proposed that additional self-interactions between sterile neutrinos can solve this (https://arxiv.org/abs/2307.15565, see also the more recent https://arxiv.org/abs/2402.13878). In this project you would examine whether the additional field mediating the self-interactions can also explain the neutrino masses. As a first step you would reproduce the results in the literature, and then extend them to map out the range of masses possible for this extra field.

Contacts: Marieke Postma

Theoretical Particle Physics: Baryogenesis at the electroweak scale

Given that the Standard Model treats particles and antiparticles nearly the same, it is a puzzle why there is no antimatter left in our universe. Electroweak baryogenesis is the proposal that the matter-antimatter asymmetry is created during the phase transition in which the Higgs field obtains a vev and the electroweak symmetry is broken. One important ingredient in this scenario is the presence of new charge-parity (CP) violating interactions. These, however, are strongly constrained by the very precise measurements of the electric dipole moment of the electron. An old proposal, recently revived, is to use a CP-violating coupling of the Higgs field to the gauge fields (https://arxiv.org/abs/2307.01270, https://inspirehep.net/literature/300823). The project would be to study the efficacy of this kind of operator for baryogenesis.

Contacts: Marieke Postma

Theoretical Particle Physics: Neutrinoless double beta decay with sterile neutrinos

The search for neutrinoless double beta decay represents a prominent probe of new particle physics and is well motivated by its tight connection to neutrino masses, which so far lack an experimentally verified explanation. As such, it also provides a convenient probe of new interactions of the known elementary particles with hypothesized right-handed neutrinos, which are thought to play a prime role in neutrino mass generation. The main focus of this project would be the extension of NuDoBe, a Python tool for the computation of neutrinoless double beta decay (0vbb) rates in terms of lepton-number-violating operators in the Standard Model Effective Field Theory (SMEFT), see https://arxiv.org/abs/2304.05415. In a first step, the code should be expanded to include also the effective operators involving right-handed neutrinos, based on the existing literature (https://arxiv.org/abs/2002.07182) covering the general rate of neutrinoless double beta decay within the SMEFT extended by right-handed neutrinos. Besides that, additional functionalities could be added to the code, such as a routine for extracting the explicit form of the neutrino mass and mixing matrices. This work would be very useful for future phenomenological studies and is particularly timely given the ongoing experimental efforts, which are to be further boosted by the upcoming tonne-scale upgrades of the double-beta-decay experiments.

Contacts: Jordy de Vries and Lukas Graf

Theoretical Particle Physics: Phase space factors for single, double, and neutrinoless beta-decay rates.

In light of the increasingly precise measurements of beta-decay and double-beta-decay rates and spectra, the theoretical predictions seem to fall behind. The existing, rather phenomenological approaches to the associated phase-space calculations employ a variety of approximations that introduce errors which, given their phenomenological nature, are not easily quantifiable. A key goal of this project is to understand, reproduce and improve the methods and results available in the literature. Ideally, these efforts would be summarized in the form of a compact Mathematica notebook or Python package made available to the broad community of beta-decay experimentalists and phenomenologists, who could easily implement it in the workflows of their analyses. The focus should be not only on the Standard-Model contributions to (double) beta decay, but also on hypothetical exotic modes stemming from various beyond-the-Standard-Model scenarios (see e.g. https://arxiv.org/abs/nucl-ex/0605029 and https://arxiv.org/abs/2003.11836). If time permits, new, more particle-physics-based approaches to the phase-space computations can be investigated.

Contacts: Jordy de Vries and Lukas Graf

Neutrinos: Neutrino Oscillation Analysis with the KM3NeT/ORCA Detector

The KM3NeT/ORCA neutrino detector at the bottom of the Mediterranean Sea is able to detect oscillations of atmospheric neutrinos. Neutrinos traversing the detector are reconstructed as a function of two observables: the neutrino energy and the neutrino direction. To improve the neutrino oscillation analysis, we want to add a third observable, the so-called Bjorken y, which indicates the fraction of the energy transferred from the incoming neutrino to the hadronic system it produces. For this project, we will study simulated and real reconstructed data and use them to implement this additional observable in the existing analysis framework. Subsequently, we will study how much the sensitivity of the final analysis improves as a result.
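In terms of reconstructed quantities, Bjorken y for a charged-current event is simply one minus the fraction of the neutrino energy carried by the outgoing lepton. A minimal sketch (the function name and example energies are illustrative, not part of the KM3NeT software):

```python
# Illustrative: Bjorken y for a charged-current neutrino interaction.
# y is the fraction of the neutrino energy transferred to the hadronic system,
# i.e. 1 minus the fraction carried away by the outgoing lepton.
def bjorken_y(e_neutrino, e_lepton):
    if not 0.0 <= e_lepton <= e_neutrino:
        raise ValueError("lepton energy must lie between 0 and the neutrino energy")
    return 1.0 - e_lepton / e_neutrino

# A 10 GeV neutrino whose outgoing lepton carries 4 GeV -> y = 0.6
y = bjorken_y(10.0, 4.0)
```

In practice the challenge is that both energies must be estimated from the detected Cherenkov light, which is what makes adding this observable non-trivial.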

C++ and Python programming skills are advantageous.

Contacts: Daan van Eijk, Paul de Jong

Neutrinos: Searching for neutrinos of cosmic origin with KM3NeT

KM3NeT is a neutrino telescope under construction in the Mediterranean Sea, already taking data with the first deployed detection units. In particular, the KM3NeT/ARCA detector offshore of Sicily is designed for high-energy neutrinos and is suited for the detection of neutrinos of cosmic origin. In this project we will use the first KM3NeT data to search for evidence of a cosmic neutrino source, and also study ways to improve the analysis.

Contact: Aart Heijboer

Neutrinos: the Deep Underground Neutrino Experiment (DUNE)

The Deep Underground Neutrino Experiment (DUNE) is under construction in the USA, and will consist of a powerful neutrino beam originating at Fermilab, a near detector at Fermilab, and a far detector in the SURF facility in Lead, South Dakota, 1300 km away. While travelling, neutrinos oscillate and a fraction of the neutrino beam changes flavour; DUNE will determine the neutrino oscillation parameters to unrivalled precision, and attempt a first detection of CP violation in neutrinos. In this project, various elements of DUNE can be studied, including the neutrino oscillation fit, neutrino physics with the near detector, event reconstruction and classification (including machine learning), or elements of data selection and triggering.

Contact: Paul de Jong

Neutrinos: Searching for Majorana Neutrinos with KamLAND-Zen

The KamLAND-Zen experiment, located in the Kamioka mine in Japan, is a large liquid scintillator experiment with 750 kg of ultra-pure Xe-136 to search for neutrinoless double-beta decay (0n2b). The observation of the 0n2b process would be evidence for lepton number violation and the Majorana nature of neutrinos, i.e. that neutrinos are their own anti-particles. Current limits on this extraordinarily rare hypothetical decay process are presented as a half-life, with a lower limit of 10^26 years. KamLAND-Zen, the world’s most sensitive 0n2b experiment, is currently taking data and there is an opportunity to work on the data analysis, with the possibility of taking part in a ground-breaking discovery. The main focus will be on developing new techniques to filter the spallation backgrounds, i.e. the production of radioactive isotopes by passing muons. There will be close collaboration with groups in the US (MIT, Berkeley, UW) and Japan (Tohoku Univ).
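A back-of-envelope calculation shows what a half-life of 10^26 years means in practice: even with 750 kg of Xe-136, only a couple of dozen decays per year would occur at that rate, which is why background rejection (e.g. of spallation products) is so critical. The numbers below are the ones quoted above; the calculation itself is only a sketch, not a KamLAND-Zen analysis.

```python
# Back-of-envelope (not a KamLAND-Zen analysis): how many 0n2b decays per year
# would 750 kg of Xe-136 produce if the half-life were exactly 1e26 years?
import math

N_A = 6.022e23           # Avogadro's number, atoms/mol
MASS_KG = 750.0          # Xe-136 target mass, from the project description
MOLAR_MASS = 136.0       # g/mol for Xe-136
T_HALF_YR = 1e26         # assumed half-life, years

n_atoms = MASS_KG * 1e3 / MOLAR_MASS * N_A          # ~3.3e27 atoms
decays_per_year = n_atoms * math.log(2) / T_HALF_YR  # activity = N * ln2 / T_1/2
```

The result is roughly 20 decays per year in the whole target, each of which must be separated from far more frequent backgrounds.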

Contact: Patrick Decowski


Neutrinos: TRIF𝒪RCE (PTOLEMY)

The PTOLEMY demonstrator will place limits on the neutrino mass using the β-decay endpoint of atomic tritium. The detector will require a CRES-based (cyclotron radiation emission spectroscopy) trigger and a non-destructive tracking system. The "TRItium-endpoint From 𝒪(fW) Radio-frequency Cyclotron Emissions" group is developing radio-frequency cavities for the simultaneous transport of endpoint electrons and the extraction of their kinematic information. This is essential to providing a fast online trigger and precise energy-loss corrections to electrons reconstructed near the tritium endpoint. The cryogenic low-noise, high-frequency analogue electronics developed at Nikhef combined with FPGA-based front-end analysis capabilities will provide the PTOLEMY demonstrator with its CRES readout and a testbed to be hosted at the Gran Sasso National Laboratory for the full CνB detector. The focus of this project will be modelling CR in RF cavities and its detection for the purposes of single electron spectroscopy and optimised trajectory reconstruction for prototype and demonstrator setups. This may extend to firmware-based fast tagging and reconstruction algorithm development with the RF-SoC.

Contact: James Vincent Mead

Cosmic Rays: Energy loss profile of cosmic ray muons in the KM3NeT neutrino detector

The dominant signal in the KM3NeT detectors is not neutrinos, but muons created in particle cascades (extensive air showers) initiated when cosmic rays interact at the top of the atmosphere. While these muons are a background for neutrino studies, they present an opportunity to study the nature of cosmic rays and hadronic interactions at the highest energies. Reconstruction algorithms are used to determine the properties of the particle interactions, normally of neutrinos, from the recorded photons. The aim of this project is to explore the possibility of reconstructing the longitudinal energy loss profile of single and multiple simultaneous muons ('bundles') originating from cosmic ray interactions. The potential to use this energy loss profile to extract information on the number of muons and the lateral extension of the muon bundles will also be explored. These properties carry information about the high-energy interactions of cosmic rays.
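The shape of that longitudinal profile is driven by the fact that, at TeV energies, muon energy loss in water is roughly dE/dx = a + bE, with the radiative term bE dominating. The sketch below integrates this analytically; the constants are typical textbook values for water, used here only to illustrate the behaviour, not KM3NeT calibration numbers.

```python
# Illustrative: muon energy after traversing a depth x of water, assuming the
# standard parametrisation dE/dx = a + b*E (a: ionisation, b: radiative losses).
# Closed-form solution: E(x) = (E0 + a/b) * exp(-b*x) - a/b.
# Constants are typical values for water, quoted only as an illustration.
import math

A = 0.24      # GeV per metre of water (ionisation)
B = 3.3e-4    # per metre of water (bremsstrahlung, pair production, photonuclear)

def muon_energy_after(e0_gev, depth_m):
    """Remaining muon energy (GeV) after depth_m metres of water."""
    eps = A / B                                   # critical energy scale, ~700 GeV
    e = (e0_gev + eps) * math.exp(-B * depth_m) - eps
    return max(e, 0.0)

e_after = muon_energy_after(10_000.0, 1000.0)     # 10 TeV muon after 1 km of water
```

A 10 TeV muon still carries roughly 7 TeV after a kilometre of water, which is why energetic air-shower muons reach the deep-sea detector at all.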

Contact: Ronald Bruijn

LHCb: Search for light dark particles

The Standard Model of elementary particles does not contain a proper Dark Matter candidate. One of the most tantalizing theoretical developments is the so-called Hidden Valley models: a mirror-like copy of the Standard Model, with dark particles that communicate with standard ones via a very feeble interaction. These models predict the existence of dark hadrons – composite particles that are bound similarly to ordinary hadrons in the Standard Model. Such dark hadrons can be abundantly produced in high-energy proton-proton collisions, making the LHC a unique place to search for them. Some dark hadrons are stable like a proton, which makes them excellent Dark Matter candidates, while others decay to ordinary particles after flying a certain distance in the collider experiment. The LHCb detector has a unique capability to identify such decays, particularly if the new particles have a mass below ten times the proton mass.

This project offers a unique search for light dark hadrons that covers a mass range not accessible to other experiments. It involves an interesting programme of data analysis (Python-based) with non-trivial machine learning solutions, and phenomenology research using a fast simulation framework. Depending on your interests, there is quite a bit of flexibility in the precise focus of the project.

Contact: Andrii Usachov

LHCb: Searching for dark matter in exotic six-quark particles

Most of the matter in the Universe is of an unknown type. Many hypotheses about this dark matter have been proposed, but none confirmed. Recently it has been proposed that it could consist of particles made of the six quarks uuddss, which would be a Standard-Model solution to the dark matter problem. This idea has gained credibility as many similar multi-quark states are being discovered by the LHCb experiment. Such a particle could be produced in decays of heavy baryons, or directly in proton-proton collisions. The antiparticle, made of six antiquarks, could be seen when annihilating with detector material. It is also proposed to use Xi_b baryons produced at LHCb to search for such a state, which would appear as missing 4-momentum in a kinematically constrained decay. The project consists of defining a selection and applying it to LHCb data. See arXiv:2007.10378.
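The missing-4-momentum idea can be illustrated with simple four-vector arithmetic: if all visible decay products are measured, the invariant mass of the unmeasured recoil is M_miss^2 = (p_Xib - p_visible)^2. The four-vectors below are invented toy numbers (in the Xi_b rest frame), not LHCb data.

```python
# Illustrative four-vector arithmetic (toy numbers, not LHCb data): a stable
# six-quark state produced in a Xi_b decay would show up as missing
# four-momentum, with M_miss^2 = (p_Xib - p_visible)^2 in the (+,-,-,-) metric.
def sub4(p, q):
    """Component-wise difference of two four-vectors (E, px, py, pz)."""
    return tuple(a - b for a, b in zip(p, q))

def m2(p):
    """Invariant mass squared of a four-vector."""
    e, px, py, pz = p
    return e * e - px * px - py * py - pz * pz

# Toy example in the Xi_b rest frame, in GeV: the visible system is measured,
# and the rest of the four-momentum is attributed to the invisible state.
p_xib = (5.797, 0.0, 0.0, 0.0)        # Xi_b at rest (mass ~5.8 GeV)
p_visible = (3.797, 0.0, 0.0, 1.0)    # hypothetical measured visible system
m_miss = m2(sub4(p_xib, p_visible)) ** 0.5
```

A peak in the M_miss distribution, rather than a smooth continuum, would signal a new state recoiling against the visible particles.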

Contact: Patrick Koppenburg


LHCb: New physics in the angular distributions of B decays to K*ee

Hints of lepton flavour universality violation in B decays can be explained by a variety of non-Standard-Model interactions. Angular distributions in decays of a B meson to a hadron and two leptons are an important source of information to understand which model is correct. Previous analyses at the LHCb experiment have considered final states with a pair of muons. Our LHCb group at Nikhef concentrates on a new measurement of angular distributions in decays with two electrons. The main challenge in this measurement is the calibration of the detection efficiency. In this project you will confront estimates of the detection efficiency derived from simulation with decay distributions in a well-known B decay. Once the calibration is understood, the very first analysis of the angular distributions in the electron final state can be performed.

Contact: Mara Soares and Wouter Hulsbergen

LHCb: CP violation in B -> J/psi Ks decays with first run-3 data

The decay B -> J/psi Ks is the `golden channel' for measuring the CP violating angle beta in the CKM matrix. In this project we will use the first data from the upgraded LHCb detector to perform this measurement. Performing such a measurement with a new detector is going to be very challenging: we will learn a lot about whether the upgraded LHCb performs as well as expected.

Contact: Wouter Hulsbergen


LHCb: Optimization of primary vertex reconstruction

A key part of the LHCb event classification is the reconstruction of the collision point of the protons from the LHC beams. This so-called primary vertex is found by reconstructing the common origin of the charged particles detected in the event. A rudimentary algorithm exists, but it is expected that its performance can be improved by tuning parameters (or perhaps by implementing an entirely new algorithm). In this project you are challenged to optimize the LHCb primary vertex reconstruction algorithm using recent simulated and real data from LHC Run 3.
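To give an idea of the kind of algorithm involved, the toy sketch below seeds vertex candidates by clustering the z positions where tracks cross the beam line and taking the mean of each cluster. This is a deliberately simplified illustration with invented numbers, not the LHCb algorithm.

```python
# Toy illustration (NOT the LHCb algorithm): seed primary-vertex candidates by
# grouping the z positions (mm) where tracks cross the beam line into clusters,
# then take the mean z of each cluster as the vertex estimate.
def find_vertices(track_z, max_gap=1.0):
    """Cluster sorted z positions; a gap larger than max_gap starts a new cluster."""
    clusters, current = [], []
    for z in sorted(track_z):
        if current and z - current[-1] > max_gap:
            clusters.append(current)
            current = []
        current.append(z)
    if current:
        clusters.append(current)
    return [sum(c) / len(c) for c in clusters]

# Two pp collisions in the same bunch crossing, with smeared track origins:
zs = [-0.2, 0.1, 0.0, 54.9, 55.2, 55.0]
vertices = find_vertices(zs)
```

A real algorithm additionally weights tracks by their impact-parameter uncertainty and iterates the fit; tuning exactly such choices is the substance of the project.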

Contact: Wouter Hulsbergen

LHCb: Measurement of B decays to two electrons

Instead of searching for new physics through direct production of new particles, one can search for enhancements of very rare processes as an indirect signal of the existence of new particles or forces. The observation of the decay of the Bs meson to two muons by the LHCb collaboration and Nikhef/Maastricht is such a measurement; as the rarest decay ever observed at the LHC, it has a large impact on the new-physics landscape. In this project, we will extend this work by searching for the even rarer decay into two electrons. You would join the ongoing work in the context of an NWO Veni grant, and can be based in Maastricht or at Nikhef.

Contact: Jacco de Vries

Muon Collider

There is currently a lively global debate about the next accelerator to succeed the successful LHC. Different options are on the table: linear, circular, electrons, protons, on various continents... Out of these, the most ambitious project is the muon collider, designed to collide the relatively massive (105 MeV) but short-lived (lifetime 2.2 μs!) leptons. Such a novel collider would combine the advantages of electron-positron colliders (excellent precision) and proton-proton colliders (highest energy). In this project, we'll perform a feasibility study for the search for the elusive double-Higgs process: this as-yet unobserved process is crucial to probe the simultaneous interaction of multiple Higgs bosons and thereby the shape of the Higgs potential as predicted in the Brout-Englert-Higgs mechanism. This sensitivity study will be instrumental to understand one of the main scientific prospects for this ambitious project, and also to optimize the detector design, as well as the interface of the particle detectors to the accelerator machine. The project is based at Nikhef but can also be (partially) performed at the University of Twente.
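Why is a 2.2 μs lifetime not a show-stopper? Relativistic time dilation: at collider energies the lab-frame decay length grows by the Lorentz factor gamma = E/m. The sketch below makes this quantitative; the 5 TeV beam energy is a hypothetical choice for illustration, not a fixed design parameter.

```python
# Illustrative: time dilation is what makes a muon collider feasible at all.
# At rest a muon lives ~2.2 us; in the lab frame its decay length is gamma*c*tau.
# The 5 TeV beam energy below is a hypothetical example, not a design value.
C = 2.998e8            # speed of light, m/s
TAU_MU = 2.197e-6      # muon lifetime at rest, s
M_MU_GEV = 0.1057      # muon mass, GeV

def decay_length_m(beam_energy_gev):
    """Mean lab-frame decay length (m) of a muon with the given energy."""
    gamma = beam_energy_gev / M_MU_GEV
    return gamma * C * TAU_MU

L = decay_length_m(5000.0)
```

At 5 TeV the mean decay length is about 30,000 km, i.e. thousands of orbits in a ring of a few kilometres' circumference, so a usable fraction of the beam survives long enough to collide.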

Reference: https://www.science.org/content/article/muon-collider-could-revolutionize-particle-physics-if-it-can-be-built

Contact: Flavia Dias and Tristan du Pree



Projects with a 2023 start

ALICE: The next-generation multi-purpose detector at the LHC

The main goal of this project is to focus on the next-generation multi-purpose detector planned to be built at the LHC. Its core will be a nearly massless barrel detector consisting of truly cylindrical layers based on curved wafer-scale ultra-thin silicon sensors with MAPS technology, featuring an unprecedented low material budget of 0.05% X0 per layer, with the innermost layers possibly positioned inside the beam pipe. The proposed detector is conceived for studies of pp, pA and AA collisions at luminosities a factor of 20 to 50 times higher than possible with the upgraded ALICE detector, enabling a rich physics program ranging from measurements with electromagnetic probes at ultra-low transverse momenta to precision physics in the charm and beauty sector.

Contact: Panos Christakoglou and Alessandro Grelli and Marco van Leeuwen

ALICE: Searching for the strongest magnetic field in nature

In a non-central collision between two Pb ions, i.e. with a large impact parameter, the charged nucleons that do not participate in the interaction (called spectators) create strong magnetic fields. A back-of-the-envelope calculation using the Biot-Savart law puts the magnitude of this field close to 10^19 gauss, in agreement with state-of-the-art theoretical calculations, making it the strongest magnetic field in nature. The presence of this field could have direct implications for the motion of final-state particles. The magnetic field, however, decays rapidly, at a rate that depends on the electric conductivity of the medium, which is experimentally poorly constrained. The presence of the magnetic field has so far not been confirmed experimentally, and doing so is the main goal of this project.
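The back-of-the-envelope estimate mentioned above can be reproduced in a few lines. The inputs are rough order-of-magnitude assumptions (tens of spectator protons treated as a single ultrarelativistic charge at a femtometre-scale distance), chosen only to show how the ~10^18-10^19 gauss scale emerges.

```python
# Back-of-envelope Biot-Savart estimate (order of magnitude only) of the
# magnetic field from spectator protons in a non-central Pb-Pb collision.
# Assumptions: ~80 spectator protons acting as one ultrarelativistic charge
# moving at ~c, evaluated at a distance of ~1 fm. B ~ (mu0/4pi) * q*v / r^2.
MU0_OVER_4PI = 1e-7        # T m / A
E_CHARGE = 1.602e-19       # elementary charge, C
C = 2.998e8                # speed of light, m/s

n_protons = 80             # rough number of spectator protons (assumption)
r = 1e-15                  # distance scale, m (1 fm, assumption)

B_tesla = MU0_OVER_4PI * n_protons * E_CHARGE * C / r**2
B_gauss = B_tesla * 1e4    # 1 T = 10^4 gauss
```

The result lands at a few times 10^18 gauss, consistent with the ~10^19 gauss scale quoted above; detailed calculations refine this with the full charge distribution and retardation effects.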

Contact: Panos Christakoglou

ALICE: Looking for parity violating effects in strong interactions

Within the Standard Model, symmetries, such as the combination of charge conjugation (C) and parity (P), known as CP-symmetry, are considered to be key principles of particle physics. The violation of CP-invariance can be accommodated within the Standard Model in both the weak and the strong interactions, however it has only been confirmed experimentally in the former. Theory predicts that in heavy-ion collisions, in the presence of a deconfined state, gluonic fields create domains where the parity symmetry is locally violated. This manifests itself in a charge-dependent asymmetry in the production of particles relative to the reaction plane, called the Chiral Magnetic Effect (CME). The first experimental results from STAR (RHIC) and ALICE (LHC) are consistent with the expectations from the CME, however further studies are needed to constrain background effects. These highly anticipated results have the potential to reveal exciting new physics.

Contact: Panos Christakoglou

ALICE: Machine learning techniques as a tool to study the production of heavy flavour particles

The field of heavy-ion physics recently shifted, triggered by experimental results obtained in collisions between small systems (e.g. protons on protons) that resemble those obtained in collisions between heavy ions. This raises the question of whether we create the smallest QGP droplet in collisions between small systems. The main objective of this project will be to study the production of charm particles such as D mesons and Λc baryons in pp collisions at the LHC. This will be done with the help of a new and innovative technique based on machine learning (ML). The student will also extend the studies to investigate how this production rate depends on the event activity, i.e. on how many particles are created in every collision.

Contact: Panos Christakoglou and Alessandro Grelli

ALICE: Search for new physics with 4D tracking at the most sensitive vertex detector at the LHC

With the newly installed Inner Tracking System, consisting fully of monolithic detectors, ALICE is very sensitive to particles with low transverse momenta, more so than ATLAS and CMS. This will be even more the case for the ALICE upgrade detector planned for 2033. By using timing information along a track, this detector could be even more sensitive to long-lived particles that leave peculiar signatures, such as disappearing or kinked tracks, in the tracker. In this project you will investigate how timing information in the different tracking layers can improve, or even enable, a search for new physics beyond the Standard Model in ALICE. If you demonstrate the possibility of major improvements, this can have real consequences for the choice of sensors for the ALICE inner tracker upgrade.

Contact: Jory Sonneveld and Panos Christakoglou

ATLAS: The Higgs boson's self-coupling

The coupling of the Higgs boson to itself is one of the main unobserved interactions of the Standard Model and its observation is crucial to understand the shape of the Higgs potential. Here we propose to study the 'ttHH' final state: two top quarks and two Higgs bosons produced in a single collision. This topology is yet unexplored at the ATLAS experiment and the project consists of setting up the new analysis (including multivariate analysis techniques to recognise the complicated final state), optimising the sensitivity and including the result in the full ATLAS study of the Higgs boson's coupling to itself. With the LHC data from the upcoming Run-3, we might be able to see its first glimpses!

Contact: Tristan du Pree and Carlo Pandini

ATLAS: Triple-Higgs production as a probe of the Higgs potential

So far, the investigation of Higgs self-couplings (the coupling of the Higgs boson to itself) at the LHC has focused on the measurement of the Higgs tri-linear coupling λ3, mainly through direct double-Higgs production searches. In this research project we propose the investigation of the Higgs tri-linear and quartic coupling parameters λ3 and λ4, via a novel measurement of triple-Higgs production at the LHC (HHH) with the ATLAS experiment. While in the SM these parameters are both fixed by the Higgs mass and vacuum expectation value, only a combined measurement can provide an answer regarding how the Higgs potential is realised in Nature. Processes in which three Higgs bosons are produced simultaneously are extremely rare, and very difficult to measure and disentangle from background. In this project we plan to investigate different decay channels (to bottom quarks and tau leptons), and to study advanced machine learning techniques to reconstruct such a complex hadronic final state. These processes are still largely unexplored in ATLAS, and the goal of this project is to lay the groundwork for the first measurement of HHH production at the LHC.

Furthermore, we would like to study the possible implications of a precise measurement of the self-coupling parameters from HHH production from a phenomenological point of view: what could be the impact of a deviation in the HHH measurements on the big open questions in physics (for instance, the mechanisms at the root of baryogenesis)?

Contact: Tristan du Pree and Carlo Pandini

ATLAS: The Next Generation

After the observation of the coupling of Higgs bosons to fermions of the third generation, the search for the coupling to fermions of the second generation is one of the next priorities for research at CERN's Large Hadron Collider. The search for the decay of the Higgs boson to two charm quarks is very new [1] and we see various opportunities for interesting developments. For this project we propose improvements in reconstruction (using exclusive decays) and advanced analysis techniques (using deep learning methods).

[1]https://atlas.cern/updates/briefing/charming-Higgs-decay

Contact: Tristan du Pree

ATLAS: Searching for new particles in very energetic diboson production

The discovery of new phenomena in high-energy proton–proton collisions is one of the main goals of the Large Hadron Collider (LHC). New heavy particles decaying into a pair of vector bosons (WW, WZ, ZZ) are predicted in several extensions to the Standard Model (e.g. extended gauge-symmetry models, Grand Unified Theories, or theories with warped extra dimensions). In this project we will investigate new ideas to look for these resonances in promising regions. We will focus on final states where both vector bosons decay into quarks, or where one decays into quarks and one into leptons. These have the potential to bring the highest sensitivity to the search for Beyond the Standard Model physics [1, 2]. We will develop and exploit new ways to identify vector bosons (using machine learning methods) and then tackle the problem of estimating contributions from beyond the Standard Model processes in the tails of the mass distribution.
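
As an illustration of the last step, the background under such a resonance search is often modelled by fitting a smooth, falling function to the mass spectrum and quantifying any local excess on top of it. A deliberately minimal sketch with a toy exponential spectrum and an injected bump (all numbers hypothetical; a real analysis would fit data sidebands with a dedicated dijet function and use a profile-likelihood significance rather than s/sqrt(b)):

```python
import math

def fit_exponential(masses, counts):
    """Fit counts = A * exp(-m / tau) by linear least squares on log(counts)."""
    ys = [math.log(c) for c in counts]
    n = len(masses)
    mx = sum(masses) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in masses)
    sxy = sum((x - mx) * (y - my) for x, y in zip(masses, ys))
    slope = sxy / sxx              # = -1/tau for a pure exponential
    intercept = my - slope * mx    # = log(A)
    return math.exp(intercept), -1.0 / slope

def local_significance(observed, expected):
    """Quick s/sqrt(b) estimate of a local excess in one mass window."""
    return (observed - expected) / math.sqrt(expected)

# Toy falling spectrum with a bump injected at 2.0 TeV
bins = [1.0 + 0.1 * i for i in range(20)]          # mass bin centres [TeV]
bkg = [1000.0 * math.exp(-m / 0.5) for m in bins]  # smooth background
data = list(bkg)
data[10] += 30.0                                   # injected signal

A, tau = fit_exponential(bins, bkg)                # fit the smooth shape
expected = A * math.exp(-bins[10] / tau)
z = local_significance(data[10], expected)         # significance of the bump
```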

[1] https://arxiv.org/abs/1906.08589

[2] https://arxiv.org/abs/2004.14636

Contact: Flavia de Almeida Dias, Robin Hayes, Elizaveta Cherepanova and Dylan van Arneman

ATLAS: Top-quark and Higgs-boson analysis combination, and Effective Field Theory interpretation (also in 2023)

We are looking for a master student with interest in theory and data-analysis in the search for physics beyond the Standard Model in the top-quark and Higgs-boson sectors.

Your master-project starts just at the right time for preparing the Run-3 analysis of the ATLAS experiment at the LHC. In Run-3 (2022-2026), three times more data becomes available, enabling analysis of rare processes with innovative software tools and techniques.

This project aims to explore the newest strategy to combine the top-quark and Higgs-boson measurements in the perspective of constraining the existence of new physics beyond the Standard Model (SM) of Particle Physics. We selected the pp->tZq and gg->HZ processes as promising candidates for a combination to constrain new physics in the context of Standard Model Effective Field Theory (SMEFT). SMEFT is the state-of-the-art framework for theoretical interpretation of LHC data. In particular, you will study the SMEFT OtZ and Ophit operators, which are not well constrained by current measurements.

Besides affinity with particle physics theory, the ideal candidate for this project has developed python/C++ skills and is eager to learn advanced techniques. You start with a simulation of the signal and background samples using existing software tools. Then, an event selection study is required using Machine Learning techniques. To evaluate the SMEFT effects, a fitting procedure based on the innovative Morphing technique is foreseen, for which the basic tools in the ROOT and RooFit framework are available. The work is carried out in the ATLAS group at Nikhef and may lead to an ATLAS note.
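
The morphing idea mentioned above can be illustrated with a toy one-operator example: because SMEFT amplitudes are linear in a Wilson coefficient c, the cross section is a quadratic polynomial in c, so three simulated samples are enough to fix the SM, interference, and quadratic terms, after which any c can be predicted without new simulation. A sketch under these assumptions (sample values hypothetical):

```python
def morphing_coefficients(samples):
    """Given cross sections at c = 0, +1, -1, solve
    sigma(c) = s_sm + c * s_int + c**2 * s_quad for the three terms."""
    s0, sp, sm = samples[0.0], samples[1.0], samples[-1.0]
    s_sm = s0
    s_int = (sp - sm) / 2.0          # linear (interference) term
    s_quad = (sp + sm) / 2.0 - s0    # pure-EFT quadratic term
    return s_sm, s_int, s_quad

def sigma(c, s_sm, s_int, s_quad):
    """Predicted cross section at any Wilson-coefficient value c."""
    return s_sm + c * s_int + c * c * s_quad

# Toy example: pretend these came from three Monte Carlo samples
samples = {0.0: 10.0, 1.0: 13.0, -1.0: 9.0}
s_sm, s_int, s_quad = morphing_coefficients(samples)
```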

Contact: Oliver Rieger and Marcel Vreeswijk

ATLAS: Machine learning to search for very rare Higgs decays

Since the Higgs boson discovery in 2012 by the ATLAS and CMS experiments, the investigation of the properties of the Higgs boson has been a priority for research at the Large Hadron Collider (LHC). However, there are still many open questions: Is the Higgs boson the only origin of Electroweak Symmetry Breaking? Is there a mechanism which can explain the observed mass pattern of SM particles? Many of these questions are linked to the Higgs boson coupling structure.

While the Higgs boson coupling to fermions of the third generation has been established experimentally, the investigation of the Higgs boson coupling to the light fermions of the second generation will be a major project for the upcoming data-taking period (2022-2025). The Higgs boson decay to muons is the most sensitive channel for probing this coupling. In this project, you will optimize the event selection for Higgs boson decays to muons in the Vector Boson Fusion (VBF) production channel with a focus on distinguishing signal events from background processes like Drell-Yan and electroweak Z boson production. For this purpose, you will develop, implement and validate advanced machine learning and deep learning algorithms.
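
One common way to choose a working point for such a classifier is to scan the cut on its output score and maximize an expected-significance estimate. A toy sketch assuming Gaussian-distributed scores for signal and background (all shapes and yields hypothetical; a real optimisation would use simulated H→μμ and Drell-Yan events):

```python
import math

def surviving_fraction(cut, mean, sigma):
    """Fraction of a Gaussian-distributed classifier score above the cut."""
    return 0.5 * math.erfc((cut - mean) / (sigma * math.sqrt(2.0)))

def significance(cut, n_sig, n_bkg, sig_pdf, bkg_pdf):
    """Simple expected-significance estimate s / sqrt(s + b) after the cut."""
    s = n_sig * surviving_fraction(cut, *sig_pdf)
    b = n_bkg * surviving_fraction(cut, *bkg_pdf)
    return s / math.sqrt(s + b) if s + b > 0 else 0.0

# Toy working-point scan: signal scores ~ N(0.7, 0.1), background ~ N(0.3, 0.15)
n_sig, n_bkg = 100.0, 10000.0
cuts = [i / 100.0 for i in range(100)]
zs = [significance(c, n_sig, n_bkg, (0.7, 0.1), (0.3, 0.15)) for c in cuts]
best_cut = cuts[zs.index(max(zs))]   # cut value with the highest significance
```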

Contact: Oliver Rieger and Wouter Verkerke and Peter Kluit

ATLAS: Interpretation of experimental data using SMEFT

The Standard Model Effective Field Theory (SMEFT) provides a systematic approach to test the impact of new physics at the energy scale of the LHC through higher-dimensional operators. The interpretation of experimental data using SMEFT calls for a particular interest in solving complex technical challenges, combined with advanced statistical techniques and a deep understanding of particle physics. We would be happy to discuss different project opportunities based on your interests with you.

Contact: Oliver Rieger and Wouter Verkerke

ATLAS: Reconstructing tracks from particle physics detector hits with state-of-the-art machine learning techniques

This project concerns the application of new machine learning techniques to tackle the problem of track reconstruction at the ATLAS detector at CERN. While algorithms to construct particle tracks from low-level detector information such as particle hits and timestamps have been around for decades, recent developments in the field of machine learning open up new opportunities to improve these algorithms significantly. Some recent developments that could help in this context include graph-based neural networks, which embed the input data in the format of a graph and as such have the capability to enhance underlying correlations within events. Transformer neural networks are a particular extension of graph-based neural networks, proposed only in 2017, which could also prove helpful in this case. Another option would be to build upon some of the work done within the field of computer vision and see if image segmentation networks can help solve this problem. There is a range of available options, and this project gives the student the freedom to choose particular types of networks, though more explicit guidance can be provided if desired.

In this project the student will develop and compare the performance of various machine learning models to initially reconstruct tracks from simplified test data. Upon successful completion of this, simulated data from the actual ATLAS detector can be analysed as well in the scope of this project. The student will need some familiarity with programming in python and an interest in machine learning, but a physics background is not required. In this project the student will be able to contribute to fundamental physics research and will familiarize themselves with state-of-the-art machine learning models.
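
To give a flavour of the graph-based approach: detector hits can be connected into a graph with edges between compatible hits on adjacent layers, after which track candidates are chains of edges. A deliberately minimal 1D sketch with no neural network involved (layers, positions, and the slope-compatibility cut are all hypothetical; a GNN would learn which edges to keep instead of using a fixed cut):

```python
def build_edges(hits, max_dslope=0.1):
    """Connect hits on adjacent layers whose slope from the origin agrees.
    Each hit is a (layer, x) pair; layers are numbered from 1."""
    edges = []
    for i, (li, xi) in enumerate(hits):
        for j, (lj, xj) in enumerate(hits):
            if lj == li + 1 and abs(xj / lj - xi / li) < max_dslope:
                edges.append((i, j))
    return edges

def find_tracks(hits, edges):
    """Follow chains of edges starting from hits on the first layer."""
    nxt = {}
    for i, j in edges:
        nxt.setdefault(i, []).append(j)
    tracks = []
    def walk(path):
        last = path[-1]
        if last not in nxt:
            tracks.append(path)
            return
        for j in nxt[last]:
            walk(path + [j])
    for i, (layer, _) in enumerate(hits):
        if layer == 1:
            walk([i])
    return tracks

# Two straight toy tracks (x = slope * layer) through three layers
hits = [(1, 1.0), (1, -0.5), (2, 2.0), (2, -1.0), (3, 3.0), (3, -1.5)]
tracks = find_tracks(hits, build_edges(hits))
```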

Contact: Zef Wolffs, Matouš Vozák and Ivo van Vulpen

ATLAS: New machine learning approaches to target Higgs interference signatures in LHC data

In this project we aim to improve an ongoing analysis to determine the lifetime of the Higgs Boson through state-of-the-art machine learning techniques, in particular by addressing a novel solution to an as of yet unsolved fundamental problem in modeling quantum interference. While the Higgs is an elusive particle that generally only appears in physics processes with small cross sections, its signature can be amplified in the Large Hadron Collider (LHC) through quantum interference with larger background (non-Higgs) processes. This is the effect that the Higgs’ lifetime analysis relies on to be able to measure the relevant Higgs signature. A fundamental physics modelling problem arises though in the simulation of individual events for this interference due to the fact that these events are in reality described by a superposition of underlying Higgs and non-Higgs processes.

Since machine learning models in particle physics are typically trained to characterise individual physics events, the fact that interference events cannot currently be generated is a significant problem when interference is the target. In the currently existing Higgs lifetime analysis, a machine learning model was trained which instead focuses only on the explicit Higgs-mediated processes as a proxy, which is suboptimal. The aim of this project is to improve upon the current machine learning strategy used in this analysis by implementing either of the inference-aware approaches suggested in [1] and [2]. The idea behind these inference-aware machine learning algorithms is that they do not optimise for a simplified intermediate goal, as is common in traditional machine learning, but rather for the end-goal of the analysis. In this case, this would obviate the need for interference event generation altogether and allow the machine learning models to be trained optimally regardless.

The first checkpoint of this project is to use either of the frameworks used in [1] and [2] (which are both publicly available) and run them with a simplified dataset from the aforementioned analysis. After this proof-of-principle is achieved, the next goal would be to actually implement the newly developed machine learning models in the full analysis and to determine the improvement upon the existing result. A successful completion of these tasks would not only benefit the Higgs lifetime analysis, but would be an important stepping stone to future developments to make machine learning approaches also aware of other hard to model effects such as systematic uncertainties. Finally, there are further options to improve this analysis such as the generation of actual interference training data, which could be attempted in case the primary project finishes earlier than expected.
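
The core idea of inference-aware optimisation, scoring a candidate summary statistic directly on the end-goal of the analysis rather than on a proxy loss, can be illustrated with a toy binned Poisson model: the same total signal and background give very different expected significances depending on the summary statistic chosen (numbers hypothetical; [1] and [2] make this figure of merit differentiable and optimise the network against it):

```python
import math

def asimov_significance(sig_bins, bkg_bins):
    """Median expected discovery significance for a binned Poisson counting
    analysis, using the Asimov approximation summed over bins in quadrature."""
    z2 = 0.0
    for s, b in zip(sig_bins, bkg_bins):
        if s > 0 and b > 0:
            z2 += 2.0 * ((s + b) * math.log(1.0 + s / b) - s)
    return math.sqrt(z2)

# Toy analysis: identical total yields, two candidate summary statistics:
# a single counting bin versus a 4-bin shape-aware summary.
sig_shape = [1.0, 3.0, 6.0, 10.0]
bkg_shape = [400.0, 80.0, 15.0, 5.0]
z_shape = asimov_significance(sig_shape, bkg_shape)
z_count = asimov_significance([sum(sig_shape)], [sum(bkg_shape)])
```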

[1] De Castro, P., & Dorigo, T. (2019). INFERNO: inference-aware neural optimisation. Computer Physics Communications, 244, 170-179.

[2] Simpson, N., & Heinrich, L. (2023, February). neos: End-to-end-optimised summary statistics for high energy physics. In Journal of Physics: Conference Series (Vol. 2438, No. 1, p. 012105). IOP Publishing.

Contact: Zef Wolffs, Matouš Vozák and Ivo van Vulpen

ATLAS: Development of state-of-the art modeling techniques to generate Higgs interference events

In this project we aim to improve an ongoing analysis to determine the lifetime of the Higgs Boson through new event generation strategies, in particular by addressing a novel solution to an as of yet unsolved fundamental problem in modeling quantum interference. While the Higgs is an elusive particle that generally only appears in physics processes with small cross sections, its signature can be amplified in the Large Hadron Collider (LHC) through quantum interference with larger background (non-Higgs) processes. This is the effect that the Higgs’ lifetime analysis relies on to be able to measure the relevant Higgs signature. A fundamental physics modelling problem arises though in the simulation of individual events for this interference due to the fact that these events are in reality described by a superposition of underlying Higgs and non-Higgs processes.

The current approach to deal with this problem is to ignore the interference in analysis optimization and instead optimize only for explicitly Higgs mediated processes, but this severely impacts analysis performance. In the context of Effective Field Theories (EFT) however, a similar problem arises and has been solved for simple (leading order) processes. In this project we plan to take the machinery developed for EFT and apply it to the Higgs lifetime analysis. Furthermore, with the recent development of a Next-to-Leading Order (NLO) Higgs event generation tool [1] a subsequent goal would be to use this to also generate interference at the NLO level. Successful completion of this project would lead to a much improved analysis result, significantly constraining the lifetime of the Higgs Boson. Besides, the techniques developed would almost certainly be used in future analyses on Large Hadron Collider (LHC) run 3 data.
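
The leading-order EFT-style trick alluded to above rests on the identity |A_H + A_bkg|^2 = |A_H|^2 + |A_bkg|^2 + 2 Re(A_H A_bkg*): although interference events cannot be generated on their own, the interference contribution can be isolated bin-by-bin from three samples that can be generated. A sketch with hypothetical histogram contents:

```python
def interference(full, higgs_only, bkg_only):
    """Per-bin interference term 2*Re(A_H * conj(A_bkg)), obtained as
    |A_H + A_bkg|^2 - |A_H|^2 - |A_bkg|^2 from three generatable samples."""
    return [f - h - b for f, h, b in zip(full, higgs_only, bkg_only)]

# Toy mass-spectrum histograms (hypothetical numbers); note the interference
# can be negative, which is why it cannot be sampled as ordinary events.
full       = [120.0, 95.0, 60.0]
higgs_only = [10.0, 8.0, 5.0]
bkg_only   = [115.0, 92.0, 58.0]
intf = interference(full, higgs_only, bkg_only)
```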

[1] Alioli, S., Ravasio, S. F., Lindert, J. M., & Röntsch, R. (2021). Four-lepton production in gluon fusion at NLO matched to parton showers. The European Physical Journal C, 81(8), 687.

Contact: Zef Wolffs, Matouš Vozák, Bryan Kortman and Ivo van Vulpen

ATLAS: Approaching the Higgs from a new direction: Constraining new physics with off shell Higgs data from the LHC

The Heisenberg uncertainty principle allows all elementary particles, including the Higgs boson, to momentarily disobey the fundamental energy-momentum relation, allowing the particle in question to have a significantly larger mass than usual. A description of the Higgs boson in this state ("off-shell Higgs boson") can provide a portal to the discovery of potential new physics, albeit one that is very difficult to exploit due to its infrequent appearance. The goal of this project is to constrain or hint at new physics by estimating parameters of a generalized model which allows for new physics, Effective Field Theory (EFT), using off-shell Higgs data.

Most of the underlying analysis to measure the prevalence of off-shell Higgs bosons has already been set up, so the goal of this project is to perform the aforementioned EFT interpretation on top of this existing analysis. From a theoretical point of view much of the groundwork has also been done on simulated data, which showed the potential for this EFT interpretation to constrain new physics [1]. Being at the interface between experimental and theoretical physics, this project allows the student to gain a deeper understanding of both; furthermore, its successful completion could provide one of the first hints of as-yet not-understood physics.
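
For orientation, the standard off-shell width argument works as follows: the on-shell signal strength scales as couplings^4 / Γ_H, while the off-shell signal strength does not depend on the total width, so the ratio of the two constrains Γ_H. A minimal sketch (the signal-strength values are hypothetical; in an EFT interpretation the couplings themselves also become fit parameters):

```python
def width_over_sm(mu_on_shell, mu_off_shell):
    """Gamma_H / Gamma_SM = mu_off / mu_on, assuming the couplings are the
    same on and off shell (the standard off-shell width argument)."""
    return mu_off_shell / mu_on_shell

# Hypothetical measured signal strengths
gamma_ratio = width_over_sm(mu_on_shell=1.0, mu_off_shell=1.1)
gamma_h_mev = 4.1 * gamma_ratio  # using the SM prediction Gamma_SM ~ 4.1 MeV
```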

[1] Azatov, A., de Blas, J., Falkowski, A., Gritsan, A. V., Grojean, C., Kang, L., ... & Vryonidou, E. (2022). Off-shell Higgs Interpretations Task Force: Models and Effective Field Theories Subgroup Report. arXiv preprint arXiv:2203.02418.

Contact: Zef Wolffs, Matouš Vozák, Bryan Kortman and Ivo van Vulpen

ATLAS: A new timing detector - the HGTD

ATLAS is going to gain a new capability: a timing detector. The HGTD will allow us to reconstruct tracks not only in the three spatial dimensions but also to measure very precisely (at the picosecond level) the time at which particles pass its sensitive layers. This makes it possible to reconstruct the trajectories of the particles created at the LHC in four dimensions, and will ultimately lead to a better reconstruction of physics at ATLAS. The new HGTD detector is still under construction, and work needs to be done on different levels, such as understanding the detector response (taking measurements in the lab and performing simulations) or developing algorithms to reconstruct the particle trajectories (programming and analysis work).

Several projects are available within the context of the new HGTD detector:

  1. The first possibility is to focus on physics analysis performance: study how the timing measurements can be included in the reconstruction of tracks, and what effect this has on our understanding of the physical processes occurring in the LHC collisions. With this work you will be part of the ATLAS group at Nikhef.
  2. The second possibility is to test the sensors in our lab and in test-beam setups at CERN. The analysis will be performed in the context of the ATLAS HGTD test-beam group, in connection with both the ATLAS group and the R&D department at Nikhef.
  3. The third is to contribute to an ongoing effort to precisely simulate/model the silicon avalanche detectors in the Allpix2 framework. There are several models that try to describe the detector response, with dependencies on operating temperature, field strengths and radiation damage. We are getting close to being able to model our detector, but are not there yet. This work will be within the ATLAS group together with Hella Snoek and Andrea Visibile.

If you are interested, contact me to discuss the possibilities. Contact: Hella Snoek


ATLAS: The next full-silicon Inner Tracker: ITk

The inner detector of the present ATLAS experiment has been designed and developed to function in the environment of the present Large Hadron Collider (LHC). At the ATLAS Phase-II Upgrade, the particle densities and radiation levels will exceed current levels by a factor of ten. The instantaneous luminosity is expected to reach unprecedented values, resulting in up to 200 proton-proton interactions in a typical bunch crossing. The new detectors must be faster and they need to be more highly segmented. The sensors used also need to be far more resistant to radiation, and they require much greater power delivery to the front-end systems. At the same time, they cannot introduce excess material which could undermine tracking performance. For those reasons, the inner tracker of the ATLAS detector (ITk) was redesigned and will be rebuilt completely.

Nikhef is one of the sites in charge of building and integrating some big parts of ITk. One of the next steps consists of testing the sensors that we will install in the structures we have built (check one of the structures in the picture of our cleanroom). This project offers the possibility of working on a full hardware project, doing something completely new, by testing the sensors of a future component of the next ATLAS detector.

Contact: Andrea García Alonso

Cosmic Rays/Neutrinos: Seasonal muon flux variations and the pion/kaon ratio

The KM3NeT ARCA and ORCA detectors, located kilometers deep in the Mediterranean Sea, have neutrinos as their primary probes. Muons from cosmic-ray interactions reach the detectors in relatively large quantities too. Exploiting the capabilities and location of the detectors, these muons allow the study of cosmic rays and their interactions. In this way, questions about their origin, composition and propagation can be addressed. In particular, these muons are tracers of hadronic interactions at energies inaccessible to particle accelerators.

The muons reaching the depths of the detectors result from decays of mesons, mostly pions and kaons, created in interactions of high-energy cosmic rays with atoms in the upper atmosphere. Seasonal changes of the temperature (and thus density) profile of the atmosphere modulate the balance between the probability for these mesons to decay (producing muons) or to re-interact. Pions and kaons are affected differently, allowing their production ratio to be extracted by determining how changes in the muon rate depend on changes in the effective temperature, an integral over the atmospheric temperature profile weighted by a depth-dependent meson production rate.

In this project, the aim is to measure the rate of muons in the detectors and to calculate the effective temperature above the KM3NeT detectors from atmospheric data, both as function of time. The relation between these two can be used to extract the pion to kaon ratio.
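
The last step can be sketched as a straightforward correlation fit: regress the relative muon-rate variations against the relative effective-temperature variations to obtain the correlation coefficient α_T, from which the pion/kaon ratio is then inferred. A toy example with hypothetical monthly values:

```python
def fit_alpha_t(rates, temps):
    """Least-squares fit of dR/<R> = alpha_T * dT/<T_eff>; alpha_T encodes
    the pion/kaon ratio through their different decay/interaction balance."""
    r0 = sum(rates) / len(rates)
    t0 = sum(temps) / len(temps)
    xs = [(t - t0) / t0 for t in temps]   # relative temperature variations
    ys = [(r - r0) / r0 for r in rates]   # relative rate variations
    return sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

# Toy monthly data with a positive rate-temperature correlation (hypothetical)
temps = [218.0, 220.0, 222.0, 224.0]   # effective temperature [K]
rates = [49.55, 50.0, 50.45, 50.9]     # muon rate [Hz]
alpha_t = fit_alpha_t(rates, temps)
```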

Contact: Ronald Bruijn

Detector R&D: Studies of wafer-scale sensors for ALICE detector upgrade and beyond

One of the biggest milestones of the ALICE detector upgrade (foreseen in 2026) is the implementation of wafer-scale (~ 28 cm x 18 cm) monolithic silicon active pixel sensors in the tracking detector, with the goal of having truly cylindrical barrels around the beam pipe. To demonstrate such an unprecedented technology in high energy physics detectors, a few chips will soon be available in the Nikhef laboratories for testing and characterization purposes. The goal of the project is to contribute to the validation of the samples against the ALICE tracking detector requirements, with a focus on timing performance in view of other applications in future high energy physics experiments beyond ALICE. We are looking for a student with a focus on lab work and interested in high precision measurements with cutting-edge instrumentation. You will be part of the Nikhef Detector R&D group and you will have, at the same time, the chance to work in an international collaboration where you will report about the performance of these novel sensors. There may even be the opportunity to join beam tests at CERN or DESY facilities. Besides interest in hardware, some proficiency in computing is required (Python or C++/ROOT).

Contact: Jory Sonneveld , Roberto Russo

Detector R&D: Time resolution of monolithic silicon detectors

Monolithic silicon detectors based on industrial Complementary Metal Oxide Semiconductor (CMOS) processes offer a promising approach for large scale detectors due to their ease of production and low material budget. Until recently, their low radiation tolerance has hindered their applicability in high energy particle physics experiments. However, new prototypes, such as the one in this project, have overcome these hurdles, making them feasible candidates for future experiments in high energy particle physics. Achieving the required radiation tolerance has brought the spatial and temporal resolution of these detectors to the forefront. In this project, you will investigate the temporal performance of a radiation hard monolithic detector prototype, using laser setups in the laboratory. You will also participate in meetings with the international collaboration working on this detector, where you will report on the prototype's performance. Depending on the progress of the work, there may be a chance to participate in test beams performed at the CERN accelerator complex and a first full three dimensional characterization of the prototype's performance using a state-of-the-art two-photon absorption laser setup at Nikhef. This project is looking for someone interested in working hands-on with cutting edge detector and laser systems at the Nikhef laboratory. Python programming skills and linux experience are an advantage.

Contact: Jory Sonneveld, Uwe Kraemer

Detector R&D: Improving a Laser Setup for Testing Fast Silicon Pixel Detectors

For the upgrades of the innermost detectors of experiments at the Large Hadron Collider in Geneva, in particular to cope with the large number of collisions per second from 2027, the Detector R&D group at Nikhef tests new pixel detector prototypes with a variety of laser equipment with several wavelengths. The lasers can be focused down to a small spot to scan over the pixels on a pixel chip. Since the laser penetrates the silicon, the pixels will not be illuminated by just the focal spot, but by the entire three-dimensional, hourglass-shaped (double-cone) light intensity distribution. So, how well defined is the volume in which charge is released? And can that be made much smaller than a pixel? And, if so, what would the optimum focus be? For this project the student will first estimate the intensity distribution inside a sensor that can be expected. This will correspond to the density of released charge within the silicon. To verify predictions, you will measure real pixel sensors for the LHC experiments. This project will involve a lot of 'hands-on' work in the lab, as well as programming and work on unix machines.
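
A first estimate of the 'hourglass' illumination volume follows from Gaussian-beam optics: the 1/e^2 beam radius grows with distance from the focus, with the Rayleigh range stretched by the refractive index of silicon (n ≈ 3.5 in the near infrared, assumed here along with the other laser parameters):

```python
import math

def beam_radius(z, w0, wavelength, n=3.5):
    """Gaussian-beam 1/e^2 radius at distance z from the focus inside a
    medium of refractive index n; wavelength is the vacuum wavelength."""
    z_r = math.pi * w0 ** 2 * n / wavelength  # Rayleigh range in the medium
    return w0 * math.sqrt(1.0 + (z / z_r) ** 2)

# How confined is the deposit over a 300 um thick sensor focused mid-depth?
w0 = 1.0e-6            # 1 um focal spot radius (hypothetical)
lam = 1.064e-6         # 1064 nm laser
half_thickness = 150e-6
w_surface = beam_radius(half_thickness, w0, lam)
```

With a 1 um focus, the beam is already roughly 15 um wide at the sensor surfaces, comparable to typical pixel pitches, which is exactly the trade-off the project asks the student to quantify.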

Contact: Martin Fransen

Detector R&D: Time resolution of hybrid pixel detectors with the Timepix4 chip

Precise time measurements with silicon pixel detectors are very important for experiments at the High-Luminosity LHC and the future circular collider. The spatial resolution of current silicon trackers will not be sufficient to distinguish the large number of collisions that will occur within individual bunch crossings. In a new method, typically referred to as 4D tracking, spatial measurements of pixel detectors will be combined with time measurements to better distinguish collision vertices that occur close together. New sensor technologies are being explored to reach the required time measurement resolution of tens of picoseconds, and the results are promising. However, the signals that these pixelated sensors produce have to be processed by front-end electronics, which hence also play a role in the total time resolution of the detector. An important contribution comes from the systematic differences between the front-end electronics of different pixels. Many of these systematic effects can be corrected by performing detailed calibrations of the readout electronics. To achieve the required time resolution at future experiments, it is vital that these effects are understood and corrected. In this project you will be working with the Timepix4 chip. This is a so-called application specific integrated circuit (ASIC) that is designed to read out pixelated sensors. This ASIC will be used extensively in detector R&D for the characterisation of new sensor technologies requiring precise timing (< 50 ps). In order to do so, it is necessary to first study the systematic differences between the pixels, which you will do using a laser setup in our lab. This will be combined with data analysis of proton beam measurements, or with measurements performed using the built-in test-pulse mechanism of the Timepix4 ASIC. 
Your work will enable further research performed with this ASIC, and serve as input to the design and operation of future ASICs for experiments at the High-Luminosity LHC.
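
One of the dominant per-pixel systematic effects is timewalk: small signals cross the discriminator threshold later than large ones, so the measured time depends on the deposited charge. A toy sketch of calibrating and correcting it (the functional form and constants below are hypothetical illustrations, not the actual Timepix4 calibration):

```python
def timewalk(q, c=20.0, q0=1.0):
    """Hypothetical timewalk model: delay in ns grows as c / (q - q0)
    for small charges q (in ke) near the threshold q0."""
    return c / (q - q0)

def correct_time(t_measured, q, c=20.0, q0=1.0):
    """Subtract the charge-dependent delay, calibrated per pixel."""
    return t_measured - timewalk(q, c, q0)

# A small (3 ke) and a large (21 ke) signal from the same particle crossing
t_true = 100.0                       # ns
t_small = t_true + timewalk(3.0)     # small signal arrives "late"
t_large = t_true + timewalk(21.0)
spread_before = abs(t_small - t_large)
spread_after = abs(correct_time(t_small, 3.0) - correct_time(t_large, 21.0))
```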

Contact: Kevin Heijhoff and Martin van Beuzekom

Detector R&D: Performance studies of Trench Isolated Low Gain Avalanche Detectors (TI-LGAD)

The future vertex detector of the LHCb Experiment needs to measure the spatial coordinates and time of the particles originating in the LHC proton-proton collisions with resolutions better than 10 um and 50 ps, respectively. Several technologies are being considered to achieve these resolutions. Among those is a novel sensor technology called Trench Isolated Low Gain Avalanche Detector. Prototype pixelated sensors have been manufactured recently and have to be characterised. Therefore, these new sensors will be bump-bonded to a Timepix4 ASIC, which provides charge and time measurements in each of 230 thousand pixels. Characterisation will be done using a lab setup at Nikhef, and includes tests with a micro-focused laser beam, radioactive sources, and possibly with particle tracks obtained in a test-beam. This project involves data taking with these new devices and analysing the data to determine performance parameters such as the spatial and temporal resolution as a function of temperature and other operational conditions.
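
The measured time resolution is commonly decomposed into independent contributions added in quadrature, e.g. electronics jitter, Landau fluctuations of the charge deposit, and the TDC binning of the readout chip. A sketch of such a budget with hypothetical numbers:

```python
import math

def time_resolution(jitter, landau, tdc_bin, clock=0.0):
    """Quadratic sum of independent contributions to the sensor+ASIC time
    resolution (all inputs in picoseconds)."""
    tdc = tdc_bin / math.sqrt(12.0)  # RMS quantisation error of the TDC bin
    return math.sqrt(jitter ** 2 + landau ** 2 + tdc ** 2 + clock ** 2)

# Hypothetical LGAD-like budget: 25 ps jitter, 30 ps Landau term, 195 ps TDC bin
sigma_t = time_resolution(jitter=25.0, landau=30.0, tdc_bin=195.0)
```

Even with an excellent sensor, a coarse TDC bin can dominate the total, which is one reason detailed calibration of the readout electronics matters.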

Contacts: Kazu Akiba and Martin van Beuzekom

Detector R&D: A Telescope with Ultrathin Sensors for Beam Tests

To measure the performance of new prototypes for upgrades of the LHC experiments and beyond, a telescope is typically placed in a charged-particle beam line, so that the response of the prototype can be compared to particle tracks measured with the telescope. In this project, you will continue work on a very lightweight, compact telescope using ALICE PIxel DEtectors (ALPIDEs). This includes work on the mechanics, data acquisition software, and a moveable stage. You will foreseeably test this telescope in the Delft Proton Therapy Center. If time allows, you will add a timing plane and perform a measurement with one of our prototypes. Apart from travel to Delft, there is a possibility to travel to other beam line facilities.

Contact: Jory Sonneveld

Detector R&D: Laser Interferometer Space Antenna (LISA) - the first gravitational wave detector in space

The space-based gravitational wave antenna LISA is one of the most challenging space missions ever proposed. ESA plans to launch, around 2034, three spacecraft separated by a few million kilometres. This constellation measures tiny variations in the distances between test-masses located in each satellite to detect gravitational waves from sources such as supermassive black holes. LISA is based on laser interferometry, and the three satellites form a giant Michelson interferometer. LISA measures a relative phase shift between one local laser and one distant laser by light interference. The phase shift measurement requires sensitive sensors. The Nikhef Detector R&D group fabricated prototype sensors in 2020 together with the photonics industry and the Dutch institute for space research SRON. Nikhef & SRON are responsible for the Quadrant PhotoReceiver (QPR) system: the sensors, the housing including a complex mount to align the sensors with tens-of-nanometres accuracy, various environmental tests at the European Space Research and Technology Centre (ESTEC), and the overall performance of the QPR in the LISA instrument. Currently we are discussing possible sensor improvements for a second fabrication run in 2022, optimizing the mechanics and preparing environmental tests. As an MSc student, you will work on various aspects of the wavefront sensor development: study the performance of the epitaxial stacks of Indium-Gallium-Arsenide, setting up test benches to characterize the sensors and QPR system, performing the actual tests and data analysis, in combination with performance studies and simulations of the LISA instrument.

Contact: Niels van Bakel

Detector R&D: Other projects

Are you looking for a slightly different project? Are the above projects already taken? Are you coming in at an unusual time of the year? Do not hesitate to contact us! We always have new projects coming up at different times in the year and we are open to your ideas.

Contact: Jory Sonneveld

FCC: The Next Collider

After the LHC, the next planned large collider at CERN is the proposed 100 kilometer circular collider "FCC". In the first stage of the project, as a high-luminosity electron-positron collider, precision measurements of the Higgs boson are the main goal. One of the channels that will improve by orders of magnitude at this new accelerator is the decay of the Higgs boson to a pair of charm quarks. This project will estimate a projected sensitivity for the coupling of the Higgs boson to second generation quarks, and in particular target the improved reconstruction of the topology of long-lived mesons in the clean environment of a precision e+e- machine.

Contact: Tristan du Pree

Gravitational Waves: Computer modelling to design the laser interferometers for the Einstein telescope

A new field of instrument science led to the successful detection of gravitational waves by the LIGO detectors in 2015. We are now preparing the next generation of gravitational wave observatories, such as the Einstein Telescope, with the aim of increasing the detector sensitivity by a factor of ten, which would allow us, for example, to detect stellar-mass black holes from early in the universe, when the first stars began to form. This ambitious goal requires us to find ways to significantly improve the best laser interferometers in the world.

Gravitational wave detectors, such as LIGO and VIRGO, are complex Michelson-type interferometers enhanced with optical cavities. We develop and use numerical models to study these laser interferometers, to invent new optical techniques and to quantify their performance. For example, we synthesize virtual mirror surfaces to study the effects of higher-order optical modes in the interferometers, and we use opto-mechanical models to test schemes for suppressing quantum fluctuations of the light field. We can offer several projects based on numerical modelling of laser interferometers. All projects will be directly linked to the ongoing design of the Einstein Telescope.
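
As a small taste of such modelling, the power build-up of a two-mirror Fabry-Perot cavity, the basic building block of these interferometers, can be computed from the mirror amplitude reflectivities and the round-trip phase. A minimal sketch assuming lossless mirrors (real design work uses dedicated tools and full mode decompositions):

```python
import math

def circulating_power_gain(r1, r2, phi):
    """Circulating power of a two-mirror Fabry-Perot cavity relative to the
    input power, from |t1 / (1 - r1*r2*exp(-i*phi))|^2 with a lossless
    input mirror (t1^2 = 1 - r1^2); phi is the round-trip phase."""
    t1_sq = 1.0 - r1 ** 2
    denom = (1.0 - r1 * r2) ** 2 + 4.0 * r1 * r2 * math.sin(phi / 2.0) ** 2
    return t1_sq / denom

# On resonance (phi = 0) a high-finesse cavity strongly amplifies the field
gain_res = circulating_power_gain(0.99, 0.999, 0.0)
gain_off = circulating_power_gain(0.99, 0.999, math.pi)
```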

Contact: Andreas Freise

LHCb: Search for light dark particles

The Standard Model of elementary particles does not contain a proper Dark Matter candidate. Among the most tantalizing theoretical developments are the so-called Hidden Valley models: a mirror-like copy of the Standard Model, with dark particles that communicate with standard ones via a very feeble interaction. These models predict the existence of dark hadrons – composite particles that are bound similarly to ordinary hadrons in the Standard Model. Such dark hadrons can be abundantly produced in high-energy proton-proton collisions, making the LHC a unique place to search for them. Some dark hadrons are stable like a proton, which makes them excellent Dark Matter candidates, while others decay to ordinary particles after flying a certain distance in the collider experiment. The LHCb detector has a unique capability to identify such decays, particularly if the new particles have a mass below ten times the proton mass.

This project offers a unique search for light dark hadrons that covers a mass range not accessible to other experiments. It combines an interesting programme of data analysis (Python-based) with non-trivial machine learning solutions and phenomenology research using a fast simulation framework. Depending on your interests, there is quite a bit of flexibility in the precise focus of the project.

Contact: Andrii Usachov

LHCb: Searching for dark matter in exotic six-quark particles

Most of the matter in the Universe is of unknown type. Many hypotheses about this dark matter have been proposed, but none confirmed. Recently it has been proposed that it could consist of particles made of the six quarks uuddss, which would be a Standard-Model solution to the dark matter problem. This idea has recently gained credibility as many similar multi-quark states are being discovered by the LHCb experiment. Such a particle could be produced in decays of heavy baryons, or directly in proton-proton collisions. The anti-particle, made of six antiquarks, could be seen when annihilating with detector material. It is also proposed to use Xi_b baryons produced at LHCb to search for such a state, where the state would appear as missing 4-momentum in a kinematically constrained decay. The project consists of defining a selection and applying it to LHCb data. See arXiv:2007.10378.
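
The missing-4-momentum idea can be sketched in a few lines of Python. All numbers below are invented illustrations, not LHCb data: if the Xi_b 4-momentum is constrained and the visible daughters are reconstructed, a six-quark state would show up as a peak in the missing mass.

```python
# Toy sketch of the missing-mass technique (hypothetical event, invented numbers).
import math

def four_vec(px, py, pz, m):
    """Return (E, px, py, pz) in GeV for a particle of mass m."""
    return (math.sqrt(px**2 + py**2 + pz**2 + m**2), px, py, pz)

def inv_mass(p):
    """Invariant mass of a 4-vector (E, px, py, pz)."""
    e, px, py, pz = p
    return math.sqrt(max(e**2 - px**2 - py**2 - pz**2, 0.0))

# Hypothetical event: a Xi_b- of known momentum and one visible Lambda daughter.
p_xib = four_vec(0.0, 0.0, 120.0, 5.797)  # Xi_b- mass ~5.797 GeV
p_vis = four_vec(0.3, -0.1, 70.0, 1.116)  # Lambda mass ~1.116 GeV

# Missing 4-momentum = parent minus visible system; its mass is the observable.
p_miss = tuple(a - b for a, b in zip(p_xib, p_vis))
m_miss = inv_mass(p_miss)
print(f"missing mass: {m_miss:.2f} GeV")
```

In a real analysis the peak position, not a single event, would carry the signal; backgrounds smear the missing-mass spectrum.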

Contact: Patrick Koppenburg

LHCb: Measuring lepton flavour universality with excited Ds states in semileptonic Bs decays

One of the most striking discrepancies between the Standard Model and experiment is found in lepton flavour universality (LFU) measurements with tau decays. At the moment, we have observed an excess of 3-4 sigma in B → Dτν decays. This could even point to a new force of nature! To understand this discrepancy, we need to make further measurements.

One very exciting (pun intended) project to verify these discrepancies involves measuring the Bs → Ds2*τν and/or Bs → Ds1*τν decays. These decays with excited states of the Ds meson have not been observed before in the tau decay mode, and they have a unique way of coupling to potential new physics candidates that can only be measured in Bs decays [1]. See the slides for more detail: File:LHCbLFUwithExcitedDs.pdf

[1] https://arxiv.org/abs/1606.09300

Contact: Suzanne Klaver

LHCb: New physics in the angular distributions of B decays to K*ee

Lepton flavour universality violation in B decays can be explained by a variety of non-Standard-Model interactions. Angular distributions in decays of a B meson to a hadron and two leptons are an important source of information to understand which model is correct. Previous analyses at the LHCb experiment have considered final states with a pair of muons. Our LHCb group at Nikhef concentrates on a new measurement of angular distributions in decays with two electrons. The main challenge in this measurement is the calibration of the detection efficiency. In this project you will confront estimates of the detection efficiency derived from simulation with decay distributions in a well-known B decay. Once the calibration is understood, the very first analysis of the angular distributions in the electron final state can be performed.
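
The calibration idea can be illustrated with a minimal toy (all bin counts below are invented, and this is not the LHCb calibration code): the simulated distribution of a well-known control decay is compared with the observed one, and the per-bin ratio of normalized shapes becomes a correction weight applied to the simulation.

```python
# Toy per-bin efficiency correction from a control channel (invented counts).
sim_counts  = [1000, 1800, 2500, 1900, 1100]   # simulated events per bin
data_counts = [ 980, 1850, 2400, 2000, 1050]   # observed events per bin

sim_total, data_total = sum(sim_counts), sum(data_counts)

# Normalise both shapes to unit area, then the per-bin ratio is the weight
# that corrects the simulated shape to match the data.
weights = [(d / data_total) / (s / sim_total)
           for s, d in zip(sim_counts, data_counts)]
print([round(w, 3) for w in weights])
```

These weights would then be applied event-by-event to the simulated signal channel before extracting the angular observables.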

Contact: Mara Soares and Wouter Hulsbergen

LHCb: Discovering heavy neutrinos in B decays

Neutrinos are the lightest of all fermions in the standard model. Mechanisms to explain their small mass rely on the introduction of new, much heavier, neutral leptons. If the mass of these new neutrinos is below the b-quark mass, they can be observed in B hadron decays.

In this project we search for the decay of B+ mesons into an ordinary electron or muon and the yet-undiscovered heavy neutrino. The heavy neutrino is expected to be unstable and in turn to decay quickly into a charged pion and another electron or muon. The final state in which the two leptons differ in flavour, "B+ to e mu pi", is particularly interesting: it is forbidden in the Standard Model, such that backgrounds are small. The analysis will be performed within the LHCb group at Nikhef using LHCb Run 2 data.

LHCb: Scintillating Fibre tracker software

The installation of the scintillating-fibre tracker in LHCb’s underground cavern was recently completed. This detector uses 10000 km of fibres to track particle trajectories in the LHCb detector when the LHC starts up again later this year. The light emitted by the scintillating fibres when a particle interacts with them is measured using silicon photomultipliers. The studies proposed for this project will focus on software, and could include writing a framework to monitor the detector output, improving the detector simulation, or working on the data processing.

Contact: Emmy Gabriel

LHCb: Vertex detector calibration

In summer 2022, LHCb started data taking with an almost entirely new detector. At the point closest to the interaction point, the trajectories of charged particles are reconstructed with a so-called silicon pixel detector. The design hit resolution of this detector is about 15 micron. However, to actually reach this resolution, a precise calibration of the spatial positions of the silicon sensors needs to be performed. In this project, you will use the first data of the new LHCb detector to perform this calibration and measure the detector performance.
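
The core idea behind such a calibration can be sketched with a toy example (hypothetical numbers, not the LHCb alignment framework): a rigid translation of a sensor shows up as a coherent offset of the track-hit residuals, which a least-squares average recovers to far better than the single-hit resolution.

```python
# Toy alignment: recover a sensor offset from track-hit residuals.
import random

random.seed(7)
TRUE_OFFSET_UM = 42.0   # unknown sensor misalignment we try to recover (made up)
HIT_RES_UM = 15.0       # single-hit resolution quoted for the pixel detector

# Residual = measured hit - track prediction = offset + measurement noise.
residuals = [TRUE_OFFSET_UM + random.gauss(0.0, HIT_RES_UM) for _ in range(5000)]

# For a pure translation the least-squares estimate is just the mean residual,
# with statistical uncertainty sigma / sqrt(N).
n = len(residuals)
estimate = sum(residuals) / n
error = HIT_RES_UM / n ** 0.5
print(f"estimated offset: {estimate:.1f} +/- {error:.1f} um")
```

The real problem is much richer (rotations, correlations between sensors, weak modes), but the statistical gain with track multiplicity works the same way.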

Contact: Wouter Hulsbergen


Neutrinos: Neutrino scattering: the ultimate resolution

Neutrino telescopes like IceCube and KM3NeT aim at detecting neutrinos from cosmic sources. The neutrinos are detected with the best resolution when charged-current interactions with nucleons produce a muon, which can be detected with high accuracy (depending on the detector). A crucial ingredient in the ultimate achievable pointing accuracy of neutrino telescopes is the scattering angle between the neutrino and the muon. While published computations have investigated the cross-section of the process in great detail, this important scattering angle has not received much attention. The aim of the project is to compute and characterize the distribution of this angle, and thus the ultimate resolution of a neutrino telescope. If successful, the results of this project can lead to a publication of interest to the neutrino telescope community.

Depending on your interests, the study could be based on a first-principles calculation (using the deep-inelastic scattering formalism), include state-of-the-art parton distribution functions, and/or exploit existing event-generation software for a more experimental approach.
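
As a feel for the problem, a crude toy Monte Carlo (not the first-principles DIS calculation the project asks for) already shows how the median neutrino-muon angle shrinks with neutrino energy. The flat x and y distributions below are an ad-hoc assumption purely for illustration.

```python
# Toy MC for the nu-mu scattering angle in charged-current events.
import math, random

random.seed(1)
M_N = 0.938  # nucleon mass, GeV

def toy_angle_deg(e_nu):
    """One toy event: angle between neutrino and outgoing muon, in degrees."""
    x = random.uniform(0.01, 0.5)  # ad-hoc Bjorken-x range
    y = random.uniform(0.1, 0.9)   # ad-hoc inelasticity range
    q2 = 2.0 * M_N * e_nu * x * y  # Q^2 = 2 M E_nu x y
    e_mu = (1.0 - y) * e_nu
    # Massless limit: Q^2 = 2 E_nu E_mu (1 - cos theta)
    cos_theta = 1.0 - q2 / (2.0 * e_nu * e_mu)
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_theta))))

medians = {}
for e_nu in (1e3, 1e4, 1e5):  # GeV
    angles = sorted(toy_angle_deg(e_nu) for _ in range(20000))
    medians[e_nu] = angles[len(angles) // 2]
    print(f"E_nu = {e_nu:8.0f} GeV  median nu-mu angle = {medians[e_nu]:.3f} deg")
```

Replacing the flat distributions with the actual double-differential cross-section (and real parton distributions) is precisely where the project starts.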

Contact: Aart Heijboer

Neutrinos: acoustic detection of ultra-high energy neutrinos

The study of cosmic neutrinos with energies above 10^17 eV, the so-called ultra-high-energy neutrinos, provides a unique view on the universe and may provide insight into the origin of the most violent astrophysical sources, such as gamma-ray bursts, supernovae or even dark matter. In addition, the observation of high-energy neutrinos may provide a unique tool to study interactions at high energies. The energy deposition of these extreme neutrinos in water induces a thermo-acoustic signal, which can be detected using sensitive hydrophones. The expected neutrino flux is, however, extremely low, and the signal that neutrinos induce is small. TNO is presently developing sensitive hydrophone technology based on fiber optics. Optical fibers form a natural way to create a distributed sensing system. Using this technology a large-scale neutrino telescope can be built in the deep sea. TNO is aiming for a prototype hydrophone which will form the building block of a future telescope.

The work will be executed at the Nikhef institute and/or the TNO laboratories in Delft. In this project master students have the opportunity to contribute in the following ways:

Project 1: Hardware development of fiber-optic hydrophone technology. Goal: characterize existing prototype optical fiber hydrophones in an anechoic basin at the TNO laboratory; data collection, calibration, characterization, and analysis of the consequences for the design of future acoustic hydrophone neutrino telescopes. Keywords: optical fiber technology, signal processing, electronics, lab work.

Project 2: Investigation of ultra-high-energy neutrinos and their interactions with matter. Goal: discriminate neutrino signals from background noise, in particular clicks from whales and dolphins in the deep sea; study the impact on the physics reach of future acoustic hydrophone neutrino telescopes. Keywords: Monte Carlo simulations, particle physics, neutrino physics, data analysis algorithms.

Further information: Info on ultra-high energy neutrinos can be found at: http://arxiv.org/abs/1102.3591; Info on acoustic detection of neutrinos can be found at: http://arxiv.org/abs/1311.7588

Contact: Ernst Jan Buis or Ivo van Vulpen

Neutrinos: Oscillation analysis with the first data of KM3NeT

The neutrino telescope KM3NeT is under construction in the Mediterranean Sea, aiming to detect cosmic neutrinos. Its first few strings with sensitive photodetectors have been deployed at both the Italian and the French detector sites. Already these few strings make it possible to reconstruct the abundant muons stemming from interactions of cosmic rays with the atmosphere, and to identify neutrino interactions. In this project the available data will be used together with simulations to best reconstruct the event topologies and to optimally identify and reconstruct the first neutrino interactions in the KM3NeT detector. The data will then be used to measure neutrino oscillation parameters and to prepare for a future neutrino mass ordering determination.
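
As background, the textbook two-flavour survival probability already shows why atmospheric neutrinos crossing the Earth constrain the oscillation parameters. The parameter values below are typical global-fit numbers used for illustration, not KM3NeT results.

```python
# Two-flavour nu_mu survival probability (standard approximation).
import math

def p_mumu_survival(l_km, e_gev, sin2_2theta=0.99, dm2_ev2=2.45e-3):
    """P(nu_mu -> nu_mu) = 1 - sin^2(2 theta) * sin^2(1.27 * dm2 * L / E)."""
    return 1.0 - sin2_2theta * math.sin(1.27 * dm2_ev2 * l_km / e_gev) ** 2

# An upward-going atmospheric neutrino crossing the Earth's diameter sits near
# the first oscillation maximum around E ~ 25 GeV:
for e in (10.0, 25.0, 100.0):
    print(f"E = {e:5.1f} GeV  P(survival) = {p_mumu_survival(12742.0, e):.3f}")
```

Measuring this energy- and zenith-dependent deficit of muon neutrinos is the core of the oscillation analysis; matter effects and three-flavour mixing refine the picture.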

Programming skills are essential; mostly ROOT and C++ will be used.

Contacts: Ronald Bruijn, Paul de Jong


Neutrinos: the Deep Underground Neutrino Experiment (DUNE)

The Deep Underground Neutrino Experiment (DUNE) is under construction in the USA, and will consist of a powerful neutrino beam originating at Fermilab, a near detector at Fermilab, and a far detector in the SURF facility in Lead, South Dakota, 1300 km away. While travelling, neutrinos oscillate and a fraction of the neutrino beam changes flavour; DUNE will determine the neutrino oscillation parameters to unrivalled precision, and try to make a first detection of CP violation in neutrinos. In this project, various elements of DUNE can be studied, including the neutrino oscillation fit, neutrino physics with the near detector, event reconstruction and classification (including machine learning), or elements of data selection and triggering.

Contact: Paul de Jong

Neutrinos: relic neutrino detection with PTOLEMY

PTOLEMY aims to make the first direct observation of the Big Bang relic neutrinos (the cosmic neutrino background, CνB) by resolving the β-decay endpoint of atomic tritium (the neutrino capture target) to O(meV) precision. This remains an outstanding test of the Standard Model in an expanding universe. Not only does the CνB carry with it a signal from the hot dense universe only one second after the Big Bang, it also helps to constrain the balance of hot versus cold dark matter responsible for its evolution. In doing so, the PTOLEMY experiment would also measure the lowest neutrino mass, an as-of-yet unknown fundamental constant. The experiment is currently in the prototyping phase, and the group at Nikhef is responsible for developing the radio-frequency (RF) system used for cyclotron-radiation (CR) based triggering and tracking. This component will provide the trajectory of electrons entering the novel transverse drift filter, constraining the electrons' energy losses before they reach the cryogenic calorimeter, which in turn records their final energy. The focus of this project will be modelling CR and its detection for the purposes of single-electron spectroscopy and optimised trajectory reconstruction. There is also the opportunity to test hardware and readout electronics for the prototype RF system.

Contact: James Vincent Mead
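
A back-of-the-envelope sketch using only the standard relativistic cyclotron formula (not PTOLEMY's design code) gives a feel for the frequencies involved: an electron near the tritium endpoint in a tesla-scale field radiates in the microwave band.

```python
# Relativistic cyclotron frequency of a single electron, f = eB / (2 pi gamma m_e).
import math

E_CHARGE = 1.602176634e-19   # elementary charge, C
M_E = 9.1093837015e-31       # electron mass, kg
M_E_KEV = 510.99895          # electron rest energy, keV

def cyclotron_freq_hz(kinetic_kev, b_tesla):
    """Cyclotron frequency of an electron with given kinetic energy and field."""
    gamma = 1.0 + kinetic_kev / M_E_KEV
    return E_CHARGE * b_tesla / (2.0 * math.pi * gamma * M_E)

# An 18.6 keV electron (near the tritium endpoint) in a 1 T field sits close
# to 27 GHz; the energy dependence of gamma is what makes the radiated
# frequency a spectroscopic observable.
f = cyclotron_freq_hz(18.6, 1.0)
print(f"{f / 1e9:.2f} GHz")
```

Detecting and tracking this faint radiation with the RF system is exactly the challenge described above.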

Theoretical Particle Physics: Effective Field Theories of Particle Physics from low- to high-energies

Known elementary matter particles exhibit a surprising three-fold structure. The particles belonging to each of these three “generations” seem to display a remarkable pattern of identical properties, yet have vastly different masses. This puzzling pattern is unexplained. Equally unexplained is the bewildering imbalance between matter and anti-matter observed in the universe, despite minimal differences in the properties of particles and anti-particles. These two mystifying phenomena may originate from a deeper, still unknown, fundamental structure characterised by novel types of particles and interactions, whose unveiling would revolutionise our understanding of nature. The ultimate goal of particle physics is uncovering a fundamental theory which allows the coherent interpretation of phenomena taking place at all energy and distance scales. In this project, the students will exploit the Standard Model Effective Field Theory (SMEFT) formalism, which allows the theoretical interpretation of particle physics data in terms of new fundamental quantum interactions which relate seemingly disconnected processes, with minimal assumptions on the nature of an eventual UV-complete theory that replaces the Standard Model. Specifically, the goal is to connect measurements from the ATLAS, CMS, and LHCb experiments at CERN's LHC with one another, and to jointly interpret this information with that provided by other experiments, including very low-energy probes such as the anomalous magnetic moment of the muon or the electric dipole moments of the electron and neutron.

This project will be based on theoretical calculations in particle physics, numerical simulations in Python, analysis of existing data from the LHC and other experiments, as well as formal developments in understanding the operator structure of effective field theories. Depending on the student profile, sub-projects with a strong computational and/or machine learning component are also possible, for instance to construct new observables with optimal sensitivity to New Physics effects as encoded by the SMEFT higher-dimensional operators. Topics that can be considered in this project include the interpretation of novel physical observables at the LHC and their integration into the global SMEFiT analysis, the matching of EFTs to UV-complete theories and their phenomenological analyses, projections for the impact of data from future colliders on the SMEFT parameter space, the synergies between EFT studies and proton structure fits, and the matching to the Weak Effective Field Theory to include data on flavour observables such as B-meson decays.
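
The statistical logic behind such global fits can be sketched in miniature (fabricated numbers, a single Wilson coefficient c, and linear order only; a real SMEFiT analysis uses thousands of data points, many operators, and far more sophisticated machinery): each observable is predicted as O_i = O_SM,i * (1 + a_i * c) and compared with a measurement via a chi-square.

```python
# Toy one-parameter SMEFT-style chi-square scan (all numbers fabricated).
a     = [0.10, -0.05, 0.20]   # linear sensitivities of each observable to c
o_sm  = [1.00,  2.00, 0.50]   # SM predictions
meas  = [1.02,  1.98, 0.52]   # toy measurements
sigma = [0.05,  0.10, 0.03]   # toy uncertainties

def chi2(c):
    """Chi-square of the toy dataset for Wilson coefficient value c."""
    return sum(((o * (1 + ai * c) - m) / s) ** 2
               for ai, o, m, s in zip(a, o_sm, meas, sigma))

# Crude grid scan for the best-fit value and the delta-chi2 < 1 interval.
grid = [i / 1000.0 for i in range(-2000, 2001)]
best = min(grid, key=chi2)
interval = [c for c in grid if chi2(c) - chi2(best) < 1.0]
print(f"best fit c = {best:.3f}, 68% interval ~ [{interval[0]:.3f}, {interval[-1]:.3f}]")
```

In practice the parameter space is high-dimensional and degenerate, which is why SMEFiT resorts to Monte Carlo and machine learning methods rather than grid scans.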

References: https://arxiv.org/abs/2105.00006, https://arxiv.org/abs/2302.06660, https://arxiv.org/abs/2211.02058, https://arxiv.org/abs/1901.05965, https://arxiv.org/abs/1906.05296, https://arxiv.org/abs/1908.05588, https://arxiv.org/abs/1905.05215. See also this project description.

Contact: Juan Rojo

Theoretical Particle Physics: High-energy neutrino-nucleon interactions at the Forward Physics Facility

High-energy collisions at the High-Luminosity Large Hadron Collider (HL-LHC) produce a large number of particles along the beam collision axis, outside of the acceptance of existing experiments. The proposed Forward Physics Facility (FPF) to be located several hundred meters from the ATLAS interaction point and shielded by concrete and rock, will host a suite of experiments to probe Standard Model (SM) processes and search for physics beyond the Standard Model (BSM). High statistics neutrino detection will provide valuable data for fundamental topics in perturbative and non-perturbative QCD and in weak interactions. Experiments at the FPF will enable synergies between forward particle production at the LHC and astroparticle physics to be exploited. The FPF has the promising potential to probe our understanding of the strong interactions as well as of proton and nuclear structure, providing access to both the very low-x and the very high-x regions of the colliding protons. The former regime is sensitive to novel QCD production mechanisms, such as BFKL effects and non-linear dynamics, as well as the gluon parton distribution function (PDF) down to x=1e-7, well beyond the coverage of other experiments and providing key inputs for astroparticle physics. In addition, the FPF acts as a neutrino-induced deep-inelastic scattering (DIS) experiment with TeV-scale neutrino beams. The resulting measurements of neutrino DIS structure functions represent a valuable handle on the partonic structure of nucleons and nuclei, particularly their quark flavour separation, that is fully complementary to the charged-lepton DIS measurements expected at the upcoming Electron-Ion Collider (EIC).

In this project, the student will carry out updated predictions for the neutrino fluxes expected at the FPF, assess the precision with which neutrino cross-sections will be measured, and quantify their impact on proton and nuclear structure by means of machine learning tools within the NNPDF framework and state-of-the-art calculations in perturbative Quantum Chromodynamics. This project contributes to ongoing work within the FPF Initiative towards a Conceptual Design Report (CDR) to be presented within two years. Topics that can be considered as part of this project include the assessment of the extent to which nuclear modifications of the free-proton PDFs can be constrained by FPF measurements, the determination of the small-x gluon PDF from suitably defined observables at the FPF and the implications for ultra-high-energy particle astrophysics, the study of the intrinsic charm content of the proton and its consequences for the FPF physics program, and the validation of models for neutrino-nucleon cross-sections in the region beyond the validity of perturbative QCD.

References: https://arxiv.org/abs/2203.05090, https://arxiv.org/abs/2109.10905, https://arxiv.org/abs/2208.08372, https://arxiv.org/abs/2201.12363, https://arxiv.org/abs/2109.02653, https://github.com/NNPDF/. See also this project description.

Contact: Juan Rojo

Theoretical Particle Physics: Probing the origin of the proton spin with machine learning

At energy-frontier facilities such as the Large Hadron Collider (LHC), scientists study the laws of nature in their quest for novel phenomena both within and beyond the Standard Model of particle physics. An in-depth understanding of the quark and gluon substructure of protons and heavy nuclei is crucial to address pressing questions from the nature of the Higgs boson to the origin of cosmic neutrinos. The key to addressing some of these questions is to carry out a universal analysis of nucleon structure from the simultaneous determination of the momentum and spin distributions of quarks and gluons and their fragmentation into hadrons. This effort requires combining an extensive experimental dataset and cutting-edge theory calculations within a machine learning framework where neural networks parametrise the underlying physical laws while minimising ad-hoc model assumptions. The upcoming Electron Ion Collider (EIC), to start taking data in 2029, will be the world's first ever polarised lepton-hadron collider and will offer a plethora of opportunities to address key open questions in our understanding of the strong nuclear force, such as the origin of the mass and the intrinsic angular momentum (spin) of hadrons, and whether there exists a state of matter which is entirely dominated by gluons. To fully exploit this scientific potential, novel analysis methodologies need to be developed that make it possible to carry out large-scale, coherent interpretations of measurements from the EIC and other high-energy colliders.

In this project, the student will carry out a new global analysis of the spin structure of the proton by means of the machine learning tools provided by the NNPDF open-source fitting framework and state-of-the-art calculations in perturbative Quantum Chromodynamics, and integrate it within the corresponding global NNPDF analyses of unpolarised proton and nuclear structure, in the framework of a combined, integrated global analysis of non-perturbative QCD. Specifically, the project aims to realise a NNLO global fit of polarised quark and gluon PDFs that combines all available data and state-of-the-art perturbative QCD calculations, and to study the phenomenological implications for other experiments, including the EIC, for the spin content of the proton, for comparisons with lattice QCD calculations, and for non-perturbative models of hadron structure.

References: https://arxiv.org/abs/2201.12363, https://arxiv.org/abs/2109.02653, https://arxiv.org/abs/2103.05419, https://arxiv.org/abs/1404.4293, https://inspirehep.net/literature/1302398, https://github.com/NNPDF/. See also this project description.

Contact: Juan Rojo

Theoretical Particle Physics: Charged lepton flavor violation in neutrino mass models

The nonzero value of neutrino masses requires an explanation beyond the Standard Model of particle physics. A promising solution involves the existence of extra neutrinos, often called right-handed or sterile neutrinos. These models elegantly explain neutrino masses and can also be connected to other puzzles, such as the absence of anti-matter in our universe. In this project you will investigate potential experimental signatures of sterile neutrinos through processes that are extremely rare in the Standard Model. Examples are muon decays to electrons and photons, or the conversion process muon + neutron -> electron + neutron. You will perform Quantum Field Theory calculations within the neutrino-extended Standard Model to compute the rates of these processes and compare them to experimental sensitivities.

Contact: Jordy de Vries

Theoretical Particle Physics: The electric dipole moment of paramagnetic systems in the Standard Model

Electric dipole moments (EDMs) of elementary particles, hadrons, nuclei, atoms, and molecules would indicate the violation of CP symmetry. The Standard Model (SM) contains CP violation in the weak interaction in the so-called CKM matrix (the quark-mixing matrix), but this leads to EDMs that are too small to be seen. At least, this is often claimed. In this project we will reinvestigate the computation of the EDMs of systems that are used in state-of-the-art experiments. In particular, we will compute a CP-violating interaction between electrons and nucleons mediated by the SM weak interaction. During this project you will obtain a deep understanding of the Standard Model and of explicit quantum field theory calculations across a wide range of energy scales.

Contact: Jordy de Vries

Theoretical Particle Physics: Predictions for Charged Particle Tracks from First Principles

Measurements based on tracks of charged particles benefit from superior angular resolution. This is essential for a new class of observables called energy correlators, for which a range of interesting applications have been identified: studying the confinement transition, measuring the top quark mass more precisely, etc. I developed a framework for calculating track-based observables, in which the conversion of quarks and gluons to charged hadrons is described by track functions. This generalization of the well-studied parton distribution functions and fragmentation functions is currently being measured by ATLAS, though the data are not public yet. Interestingly, in recent years two groups have proposed predicting fragmentation functions from first principles (https://arxiv.org/abs/2010.02934, https://arxiv.org/abs/2301.09649). In this project you would extend one (or both) of these approaches to obtain a prediction for the track function.

Contact: Wouter Waalewijn



Finished master projects

See: Last year's MSc Projects