Master Projects

From Education Wiki
The following Master thesis research projects are offered at Nikhef. If you are interested in one of these projects, please contact the coordinator listed with the project.  
== Projects with September 2020 start ==
=== Theory: The Effective Field Theory Pathway to New Physics at the LHC ===
The Standard Model Effective Field Theory (SMEFT) is a very promising framework for parametrising, in a robust and model-independent way, deviations from the Standard Model (SM) induced by new heavy particles. In this formalism, beyond-the-SM effects are encapsulated in higher-dimensional operators constructed from SM fields and respecting their symmetry properties. In this project, we aim to carry out a global analysis of the SMEFT from high-precision LHC data, including Higgs boson production, flavour observables, and low-energy measurements. This analysis will be carried out in the context of the recently developed SMEFiT approach [1], which is based on Machine Learning techniques to efficiently explore the complex theory parameter space. The ultimate goal is either to uncover glimpses of new particles or interactions at the LHC, or to derive the most stringent model-independent bounds to date on general theories of New Physics.
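As a toy illustration of what such a global fit does (this is not the actual SMEFiT code, and every number below is invented): when the observables depend approximately linearly on the Wilson coefficients, the chi-square is quadratic in them, so pseudo-data can be inverted with a weighted least-squares solve.

```python
import numpy as np

# Toy SMEFT-style fit: observables depend (approximately) linearly on the
# Wilson coefficients c, O_th = O_SM * (1 + K @ c). K, c_true and the
# uncertainties are all invented for illustration.
rng = np.random.default_rng(1)
n_obs, n_coeff = 20, 3
K = rng.normal(0.0, 0.1, size=(n_obs, n_coeff))  # linearised EFT sensitivities
c_true = np.array([0.5, -0.2, 0.1])
o_sm = np.ones(n_obs)                            # SM predictions (normalised)
sigma = 0.02 * np.ones(n_obs)                    # measurement uncertainties
data = o_sm * (1 + K @ c_true) + rng.normal(0, sigma)

# the chi^2 is quadratic in c, so its minimum is a weighted least-squares solve
A = (o_sm[:, None] * K) / sigma[:, None]
b = (data - o_sm) / sigma
c_fit, *_ = np.linalg.lstsq(A, b, rcond=None)
```

In the real analysis the parameter space is far larger and the dependence is not exactly linear, which is why sampling and machine-learning techniques are needed.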
''Contact: [ Juan Rojo]''
=== Theory: Pinning down the initial state of heavy-ion collisions with Machine Learning ===
It has been known for more than three decades that the parton distribution functions (PDFs) of nucleons bound within heavy nuclei are modified with respect to their free-nucleon counterparts. Despite active experimental and theoretical investigations, the underlying mechanisms that drive these in-medium modifications of nucleon substructure have yet to be fully understood. The determination of nuclear PDFs is highly relevant both for improving our fundamental understanding of the strong interaction in the nuclear environment and for the interpretation of heavy-ion collisions at RHIC and the LHC, in particular for the characterization of the Quark-Gluon Plasma. The goal of this project is to exploit Machine Learning and Artificial Intelligence tools [1,2] (neural networks trained by stochastic gradient descent) to pin down the initial state of heavy-ion collisions by using recent measurements from proton-lead collisions at the LHC. Emphasis will be put on the poorly known nuclear modifications of the gluon PDFs, which are still mostly ''terra incognita'' and highly relevant for phenomenological applications. In addition to theory calculations, the project will also involve code development using modern AI/ML tools such as TensorFlow and Keras.
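A minimal sketch of the underlying idea, in plain NumPy rather than the TensorFlow/Keras stack the project would actually use: parametrise an unknown function (here a mock nuclear modification ratio, with an invented shape) with a small neural network and fit it to pseudo-data by stochastic gradient descent.

```python
import numpy as np

# Fit a mock nuclear modification ratio R(x) with a one-hidden-layer
# neural network trained by stochastic gradient descent (backprop by hand).
rng = np.random.default_rng(0)
x = rng.uniform(1e-3, 1.0, (200, 1))
r_true = 1.0 - 0.3 * np.exp(-5.0 * x)        # invented shape, illustration only
y = r_true + rng.normal(0.0, 0.01, x.shape)  # pseudo-data with noise

w1, b1 = rng.normal(0.0, 1.0, (1, 16)), np.zeros(16)
w2, b2 = rng.normal(0.0, 1.0, (16, 1)), np.zeros(1)
lr = 0.05
for step in range(2000):
    i = rng.integers(0, len(x), 32)           # mini-batch indices
    h = np.tanh(x[i] @ w1 + b1)               # hidden layer
    err = (h @ w2 + b2) - y[i]                # residuals
    gh = (err @ w2.T) * (1.0 - h**2)          # backprop through tanh
    w2 -= lr * (h.T @ err) / len(i)
    b2 -= lr * err.mean(axis=0)
    w1 -= lr * (x[i].T @ gh) / len(i)
    b1 -= lr * gh.mean(axis=0)

pred = np.tanh(x @ w1 + b1) @ w2 + b2
mse = float(np.mean((pred - r_true) ** 2))
```

The NNPDF-style analyses replace the hand-written gradients with automatic differentiation and fit to real cross-section data with full uncertainty propagation.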
''Contact: [ Juan Rojo]''
=== Theory: The High-Energy Muon Crisis and Perturbative QCD ===
The production of charmed mesons from the collision of high-energy cosmic rays with air nucleons in the upper atmosphere provides an important component of the flux of high-energy muons and neutrinos that can be detected at cosmic-ray experiments such as AUGER and neutrino telescopes such as KM3NeT or IceCube. The production of forward muons from charmed meson decays is usually predicted from QCD models tuned to the data, rather than from first-principles QCD calculations. Interestingly, the number of such high-energy muons observed by AUGER seems to differ markedly from current theory predictions. In this project we aim to exploit state-of-the-art perturbative and non-perturbative QCD techniques to compute the flux of high-energy muons from charm decays and make predictions for a number of experiments sensitive to them.
''Contact: [ Juan Rojo]''
=== ATLAS: The lifetime of the Higgs boson ===
While the Higgs boson was discovered in 2012, many of its properties remain unconstrained. This master project revolves around one such property, the lifetime of the Higgs boson. The lifetime can be obtained by measuring the width of the boson, but because the width is a few hundred times smaller than the detector resolution, a direct measurement is currently impossible. There is, however, an idea to overcome that limitation: by exploiting the interference between the Higgs boson decay and background processes, an indirect measurement can be performed. This measurement potentially reaches the sensitivity needed to measure the width (or lifetime) predicted by the Standard Model. Specifically, the master project will focus on predicting the sensitivity of this measurement for different hypotheses for the Higgs width. The project is on the interface of theory and experiment, making use of Monte Carlo generators and the standard HEP analysis tools (ROOT, C++, Python).
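A toy picture of where the indirect sensitivity comes from (illustrative amplitudes and units only, not the actual analysis): near the peak the cross section behaves as |A_sig + A_bkg|^2, and the Breit-Wigner signal amplitude depends on the width Gamma differently in the peak and in the interference term, so the visible lineshape distinguishes width hypotheses even below the resolution scale.

```python
import numpy as np

# |A_sig + A_bkg|^2 around the Higgs peak: the Breit-Wigner signal
# amplitude scales with the width Gamma differently in the peak and in
# the interference term. Amplitudes and units are illustrative only.
def lineshape(m, m_h=125.0, gamma=0.004, c_bkg=0.05):
    a_sig = 1.0 / (m**2 - m_h**2 + 1j * m_h * gamma)  # Breit-Wigner propagator
    a_bkg = c_bkg + 0.0j                              # flat continuum amplitude
    return np.abs(a_sig + a_bkg) ** 2

m = np.linspace(124.0, 126.0, 2001)   # invariant mass grid, GeV
sm = lineshape(m, gamma=0.004)        # SM-like width (~4 MeV)
wide = lineshape(m, gamma=0.040)      # 10x wider hypothesis: lower, broader peak
```

The project would replace this caricature with full Monte Carlo predictions for signal, background and their interference.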
''Contact: [ Michiel Veen] or [ Hella Snoek & Ivo van Vulpen]''
=== ATLAS: The Next Generation ===
After the observation of the coupling of the Higgs boson to fermions of the third generation, the search for its coupling to fermions of the second generation is one of the next priorities for research at CERN's Large Hadron Collider. The search for the decay of the Higgs boson to two charm quarks is very new [1] and we see various opportunities for interesting developments. For this project we propose improvements in reconstruction (using exclusive decays), advanced analysis techniques (using deep learning methods) and expanding the new physics models (e.g. including a search for off-diagonal H->uc couplings). Another opportunity would be the development of the first statistical combination of results between the ATLAS and CMS experiments, which could significantly improve the discovery potential.
''Contact: [ Tristan du Pree and Marko Stamenkovic]''
=== ATLAS: The Most Energetic Higgs Boson ===
The production of Higgs bosons at the highest energies could give the first indications for deviations from the standard model of particle physics, but production energies above 500 GeV have not been observed yet [1]. The LHC Run-2 dataset, collected during the last 4 years, might be the first opportunity to observe such processes, and we have various ideas for new studies. Possible developments include the improvement of boosted reconstruction techniques, for example using multivariate deep learning methods. Also, there are various opportunities for unexplored theory interpretations (using the MadGraph event generator), including effective field theory models (with novel ‘morphing’ techniques) and the study of the Higgs boson’s self-coupling.
''Contact: [ Tristan du Pree and Brian Moser]''
=== LHCb:  Measurement of Central Exclusive Production Rates of Chi_c using converted photons in LHCb ===
Central exclusive production (CEP) of particles at the LHC is characterised by an extremely clean signature. Unlike typical inelastic collisions, where many particles are created, resulting in a so-called primary vertex, CEP events contain only the final-state particles of interest. In this project the particle of interest is a pair of charm quarks forming a chi_c particle. In theory this process is generated by a long-range gluon exchange and can elucidate the nature of the strong force, described by quantum chromodynamics in the Standard Model. The proposed work involves analysing a pre-existing dataset with reconstructed chi_c candidates and simulating events at LHCb in order to obtain the relative occurrence rate of each chi_c species (spins 0, 1, 2), a quantity that can be easily compared to theoretical predictions.
''Contact: [ Kazu Akiba]''
=== LHCb:  Optimization studies for Vertex detector at the High Lumi LHCb  ===
The LHCb experiment is dedicated to measuring tiny differences between matter and antimatter through the precise study of rare processes involving b or c quarks. The LHCb detector will undergo a major modification in order to dramatically increase the luminosity and be able to measure indirect effects of physics beyond the standard model. In this environment, over 42 simultaneous collisions are expected to happen within the roughly 200 ps time interval in which the two proton bunches overlap. The particles of interest have a relatively long lifetime, and therefore the best way to distinguish them from the background collisions is through the precise reconstruction of displaced vertices and pointing directions. The new detector considers using extremely recent or even future technologies to measure space (with resolutions below 10 um) and time (100 ps or better) to efficiently reconstruct the events of interest for physics. The project involves completely changing the LHCb Vertex Locator (VELO) design in simulation and determining the best achievable performance for the upgraded detector, considering different spatial and temporal resolutions.
''Contact: [ Kazu Akiba]''
=== LHCb:  Measurement of charge multiplication in heavily irradiated sensors ===
During the R&D phase for the LHCb VELO Upgrade detector a few sensor prototypes were irradiated to the extreme fluence expected to be achieved during the detector lifetime. These samples were tested using high energy particles at the SPS facility at CERN with their trajectories reconstructed by the Timepix3 telescope. A preliminary analysis revealed that at the highest irradiation levels the amount of signal observed is higher than expected, and even larger than the signal obtained at lower doses.  At the Device Under Test (DUT) position inside the telescope, the spatial resolution attained by this system is below 2 um. This means that a detailed analysis can be performed in order to study where and how this signal amplification happens within  the 55x55 um^2 pixel cell.  This project involves analysing the telescope and DUT data to investigate the charge multiplication mechanism at the microscopic level.
''Contact: [ Kazu Akiba]''
=== Detector R&D: Studying fast timing detectors  ===
Fast timing detectors are a key ingredient of future tracking detectors. In future LHC operating conditions and at future colliders, more and more particles are produced per collision. The high particle densities make it increasingly difficult to separate particle trajectories with the spatial information that current silicon tracking detectors provide. A solution is to add very precise (on the order of 10 ps) timestamps to the spatial measurements of the particle trackers. A good understanding of the performance of fast timing detectors is therefore necessary. With the use of a pulsed laser in the lab we study the characteristics of several prototype detectors.
''Contact: [ Hella Snoek, Martin van Beuzekom, Kazu Akiba, Daniel Hynds]''
===Detector R&D: Laser Interferometer Space Antenna (LISA) ===
The space-based gravitational wave antenna LISA is without doubt one of the most challenging space missions ever proposed. Around 2030, ESA plans to launch three spacecraft, separated by a few million kilometers, to measure tiny variations in the distances between test masses located in each spacecraft and thereby detect the gravitational waves from sources such as supermassive black holes. The triangular constellation of the LISA mission is dynamic, requiring constant fine-tuning of the pointing of the laser links between the spacecraft and simultaneous refocusing of the telescopes. The noise sources related to the laser links are expected to provide a dominant contribution to the LISA performance.
An update and extension of the LISA science simulation software is needed to assess the hardware development for LISA at Nikhef, TNO and SRON. A position is therefore available for a master student to study the impact of instrumental noise on the performance of LISA. Realistic simulations based on hardware (noise) characterization measurements performed at TNO will be carried out and compared to the signals expected from tantalizing gravitational wave sources.
Key words: LISA, space, gravitational waves, simulations, signal processing
''Contact: [ Niels van Bakel],[  Ernst-Jan Buis]''
===Detector R&D: Spectral X-ray imaging - Looking at colours the eyes can't see===
When a conventional X-ray image is made to analyse the composition of a sample, or to perform a medical examination on a patient, one acquires an image that only shows intensities: a ‘black and white’ image. Most of the information carried by the photon energy is lost. The lack of spectral information can result in an ambiguity between the material composition and the amount of material in the sample. If the X-ray intensity as a function of energy can be measured (i.e. a ‘colour’ X-ray image), more information can be obtained from a sample. This translates to a lower required dose and/or a better understanding of the sample that is being investigated. For example, two fields that can benefit from spectral X-ray imaging are mammography and real-time CT.
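The ambiguity argument can be made concrete with Beer-Lambert attenuation in two energy bins: two measurements give two equations, which can be solved for the thicknesses of two materials. The attenuation coefficients below are invented for illustration, not tabulated values.

```python
import numpy as np

# Two-bin spectral decomposition: Beer-Lambert gives, per energy bin E,
#   -ln(I_E / I0_E) = mu_A(E) * t_A + mu_B(E) * t_B,
# so two bins determine the two material thicknesses. The mu values
# below are invented, not tabulated coefficients.
mu = np.array([[0.5, 2.0],     # [mu_A, mu_B] in the low-energy bin (1/cm)
               [0.3, 0.8]])    # [mu_A, mu_B] in the high-energy bin (1/cm)
t_true = np.array([1.2, 0.4])  # cm of material A and material B
atten = mu @ t_true            # measured -ln(I/I0) per bin
t_est = np.linalg.solve(mu, atten)
```

A single-bin (conventional) measurement corresponds to one row of this system, which has infinitely many (t_A, t_B) solutions: exactly the material/thickness ambiguity described above.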
X-ray detectors based on Medipix/Timepix pixel chips have spectral resolving capabilities and can be used to make polychromatic X-ray images. Medipix and Timepix chips have branched from pixel chips developed for detectors for high energy physics collider experiments.
Activities in the field of (spectral) CT scans are performed in a collaboration between two institutes (Nikhef and CWI) and two companies (ASI and XRE).
Some activities that students can work on:
* Medical X-ray imaging (CT and conventional X-ray images): detection of iodine contrast agent, and detection of calcifications (a hint for a tumour).
* Material research: using spectral information to identify materials and recognize compounds.
* Determining how much existing applications can benefit from spectral X-ray imaging and looking for potential new applications.
* Characterizing, calibrating and optimizing X-ray imaging detector systems.
''Contact: [ Martin Fransen]''
===Detector R&D: Holographic projector===
A difficulty in generating holograms (based on the interference of light) is the required dense pixel pitch: one would need a pixel pitch of less than 200 nanometers. With larger pixels, artefacts occur due to spatial undersampling. A pixel pitch of 200 nanometers is difficult, if not impossible, to achieve, especially over larger areas. Another challenge is the massive amount of computing power that would be required to control such a dense pixel matrix.
A new holographic projection method has been developed that reduces undersampling artefacts for projectors with a ‘low’ pixel density. It uses 'pixels' at random but known positions, resulting in an array of (coherent) light points that lacks (or strongly suppresses) spatial periodicity. As a result, a holographic projector can be built with a significantly lower pixel density and correspondingly less required computing power. This could bring holography within reach for many applications such as displays, lithography, 3D printing and metrology.
Of course, nothing comes for free: with fewer pixels, holograms become noisier and the contrast is reduced (not all light ends up in the hologram). The questions are: How does the quality of a hologram depend on pixel density? How do we determine projector requirements based on requirements for hologram quality?
Requirements for a hologram can be expressed in terms of noise, contrast, resolution, suppression of undersampling artefacts, etc.
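A small simulation of the central claim, with invented parameters: an array of coherent emitters on a too-coarse regular grid, steered to a target angle, repeats its main lobe at grating-lobe angles (the undersampling artefact), while the same number of randomly placed emitters replaces those copies with a weak noise floor.

```python
import numpy as np

# Far field of N coherent emitters steered to theta0 = 5 degrees: a regular
# grid with pitch far above lambda/2 repeats the main lobe (grating lobes);
# randomly placed emitters suppress that periodicity. Parameters invented.
lam = 0.5                     # wavelength, um
n = 200
pitch = 2.0                   # regular pitch, um (coarse: 4 * lambda)
x_reg = np.arange(n) * pitch
rng = np.random.default_rng(0)
x_rnd = rng.uniform(0.0, n * pitch, n)   # random but known positions

theta0 = np.deg2rad(5.0)
deg = np.linspace(-20.0, 20.0, 4001)
angles = np.deg2rad(deg)

def far_field(x):
    # each emitter gets a phase that steers the beam towards theta0
    steer = -2j * np.pi * x * np.sin(theta0) / lam
    prop = 2j * np.pi * np.outer(np.sin(angles), x) / lam
    return np.abs(np.exp(prop + steer).sum(axis=1)) ** 2 / n**2

f_reg = far_field(x_reg)   # main lobe at +5 deg plus grating lobes
f_rnd = far_field(x_rnd)   # main lobe at +5 deg, only a diffuse noise floor
```

The trade-off discussed above is visible in `f_rnd`: the artefact copies are gone, but the energy removed from them reappears as a broadband noise floor that grows as the pixel count drops.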
For this project we are building a proof of concept holographic emitter. This set-up will be used to verify simulation results (and also to project some cool holograms of course).
Students can do hands-on lab work (building and testing the prototype projector) and/or work on setting up simulation methods and models. The simulations are highly parallel and are preferably written for multithreaded and/or GPU computing.
''Contact: [ Martin Fransen]''
=== KM3NeT: Reconstruction of first neutrino interactions in KM3NeT ===
The neutrino telescope KM3NeT is under construction in the Mediterranean Sea, aiming to detect cosmic neutrinos. Its first few strings with sensitive photodetectors have been deployed at both the Italian and the French detector sites. Already these few strings make it possible to reconstruct the abundant muons stemming from interactions of cosmic rays with the atmosphere and to identify neutrino interactions. In order to identify neutrinos, an accurate reconstruction and an optimal understanding of the backgrounds are crucial. In this project we will use the available data together with simulations to optimally identify and reconstruct the first neutrino interactions in the KM3NeT detector (also applying machine learning for background suppression) and thereby pave the path towards accurate neutrino oscillation measurements and neutrino astronomy.
Programming skills are essential; mostly ROOT and C++ will be used.
''Contact: [ Ronald Bruijn] [ Dorothea Samtleben]''
=== KM3NeT: Acoustic detection of ultra-high energy cosmic-ray neutrinos  (2 projects) ===
The study of cosmic neutrinos with energies above 10<sup>17</sup> eV, the so-called ultra-high energy neutrinos, provides a unique view on the universe and may provide insight into the origin of the most violent astrophysical sources, such as gamma-ray bursts, supernovae or even dark matter. In addition, the observation of high-energy neutrinos may provide a unique tool to study interactions at high energies. The energy deposition of these extreme neutrinos in water induces a thermo-acoustic signal, which can be detected using sensitive hydrophones. The expected neutrino flux is, however, extremely low, and the signal that neutrinos induce is small.
TNO is presently developing sensitive hydrophone technology based on fiber optics. Optical fibers form a natural way to create a distributed sensing system. Using this technology a large-scale neutrino telescope can be built in the deep sea. TNO is aiming for a prototype hydrophone that will form the building block of a future telescope.
The work will be executed at the Nikhef institute and/or the TNO laboratories in Delft. In this project there are two opportunities for master students to participate:<br>
<b>Student project 1:</b> Hardware development on fiber-optic hydrophone technology. Goal: characterise existing prototype optical fibre hydrophones in an anechoic basin at the TNO laboratory; data collection, calibration, characterisation, and analysis of the consequences for the design of future acoustic hydrophone neutrino telescopes. Keywords: optical fiber technology, signal processing, electronics, lab work.<br>
<b>Student project 2:</b> Investigation of ultra-high energy neutrinos and their interactions with matter. Goal: simulate the (currently imperfectly modelled) interactions at extremely high energies, and characterise the differences with currently available physics models and the impact on the physics reach of future acoustic hydrophone neutrino telescopes. Keywords: Monte Carlo simulations, particle physics, cosmology.<br>
Further information: Info on ultra-high energy neutrinos can be found at:; Info on acoustic detection of neutrinos can be found at:
''Contact: [ Ernst-Jan Buis] and [ Ivo van Vulpen]''
=== KM3NeT: Applying state-of-the-art reconstruction software to 10-years of Antares data ===
While the KM3NeT neutrino telescope is being constructed in the deep waters of the Mediterranean Sea, data from its precursor (Antares) have been accumulated for more than 10 years. The main objective of these neutrino telescopes is to determine the origin of (cosmic) neutrinos. The accuracy of the determination of the origin of neutrinos critically depends on the probability density function (PDF) of the arrival time of Cherenkov light produced by relativistic charged particles emerging from a neutrino interaction in the sea. It has been shown that these PDFs can be calculated from first principles and that the obtained values can efficiently be interpolated in 4 and 5 dimensions without compromising the functional dependencies. The reconstruction software based on this input indeed yields the best resolution for KM3NeT. This project is aimed at applying the KM3NeT software to the available Antares data.
''Contact: [ Maarten de Jong]''
=== HiSPARC: Extensive Air Shower Reconstruction using Machine Learning ===
An important aspect of high energy cosmic ray research is the reconstruction of the direction and energy of the primary cosmic ray. This is done by measuring the footprint of the extensive air shower initiated by the cosmic ray. The goal of this project is to advance the creation of a reconstruction algorithm based on machine learning (ML) techniques.
A previous master student has made great progress in the creation of an ML algorithm for the direction reconstruction. The algorithm was trained on simulations and applied to real data. The method works quite well, but we expect that better results can be achieved by improving the simulated data set. In this project you will implement a more accurate description of the photomultiplier tube in the simulation pipeline and check whether the reconstruction improves. The next step would be to extend the algorithm towards energy reconstruction. This means upscaling the current method and will involve the creation and manipulation of large simulated data sets.
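For context, the classical baseline that any ML direction reconstruction is compared against treats the shower front as a plane wave and solves for its direction from the arrival-time differences between detectors. A sketch with an invented three-detector station geometry:

```python
import numpy as np

# Plane-wave direction fit: a shower front crossing three detectors at
# positions pos produces arrival-time differences that fix (theta, phi).
# The station geometry below is invented for illustration.
c = 0.2998                                              # speed of light, m/ns
pos = np.array([[0.0, 0.0], [10.0, 0.0], [5.0, 8.66]])  # detector (x, y), m

def arrival_times(theta, phi):
    # the horizontal projection of the shower direction sets the time gradient
    n = np.array([np.sin(theta) * np.cos(phi), np.sin(theta) * np.sin(phi)])
    return pos @ n / c

def reconstruct(t):
    d = pos[1:] - pos[0]              # two baselines
    dt = t[1:] - t[0]                 # two time differences
    n = np.linalg.solve(d, c * dt)    # invert the plane-wave relation
    theta = np.arcsin(np.clip(np.linalg.norm(n), 0.0, 1.0))
    phi = np.arctan2(n[1], n[0])
    return theta, phi

theta, phi = reconstruct(arrival_times(np.deg2rad(30.0), np.deg2rad(60.0)))
```

The ML approach aims to beat this baseline by also exploiting signal amplitudes and the detector response, which the plane-wave fit ignores.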
The HiSPARC group is small. As a student you can have a big impact and there is freedom to tailor your own project. The proposed project is for students with a particular interest in computational (astro)physics. Advanced programming skills (mainly Python) and Linux knowledge are desirable.
''Contact: [ Kasper van Dam] en [ Bob van Eijk]''
=== VU LaserLaB: Measuring the electric dipole moment (EDM) of the electron ===
In collaboration with Nikhef and the Van Swinderen Institute for Particle Physics and Gravity at the University of Groningen, we have recently started an exciting project to measure the electric dipole moment (EDM) of the electron in cold beams of barium-fluoride molecules. The eEDM, which is predicted by the Standard Model of particle physics to be extremely small, is a powerful probe to explore physics beyond this Standard Model. All extensions to the Standard Model, most prominently supersymmetry, naturally predict an electron EDM that is just below the current experimental limits. We aim to improve on the best current measurement by at least an order of magnitude. To do so we will perform a precision measurement on a slow beam of laser-cooled BaF molecules. With this low-energy precision experiment, we test physics at energies comparable to those of the LHC!
At LaserLaB VU, we are responsible for building and testing a cryogenic source of BaF molecules. The main parts of this source are currently being constructed in the workshop. We are looking for enthusiastic master students to help set up the laser system that will be used to detect BaF. Furthermore, projects are available to perform trajectory simulations to design a lens system that guides the BaF molecules from the exit of the cryogenic source to the experiment.
''Contact: [ Rick Bethlem]''
=== VU LaserLab: Physics beyond the Standard model from molecules ===
Our team, with a number of staff members (Ubachs, Eikema, Salumbides, Bethlem, Koelemeij), focuses on precision measurements in the hydrogen molecule and its isotopologues. The work aims at testing the QED calculations of energy levels in H2, D2, T2, HD, etc. against the most precise measurements, where all kinds of experimental laser techniques play a role (cw and pulsed lasers, atomic clocks, frequency combs, molecular beams). Another target of study is the connection to the "proton size puzzle", which may be resolved through studies of the hydrogen molecular isotopes.
In the past half year we have produced a number of important results that are described in
the following papers:
* ''Frequency comb (Ramsey-type) electronic excitations in the H2 molecule''
see: Deep-ultraviolet frequency metrology of H2 for tests of molecular quantum theory
* ''Precision measurement of an infrared transition in the HD molecule''
see: Sub-Doppler frequency metrology in HD for tests of fundamental physics:
* ''The first precision study in molecular tritium T2''
see: Relativistic and QED effects in the fundamental vibration of T2:
* ''Dissociation energy of the hydrogen molecule at 10^-9 accuracy'' paper submitted to Phys. Rev. Lett.
* ''Probing QED and fundamental constants through laser spectroscopy of vibrational transitions in HD+''
This is also a study of the hydrogen molecular ion HD+, where important results were obtained not so long ago, and where we have a strong ongoing activity.
These five results mark the various directions we are pursuing, and in all directions we aim at obtaining improvements. Specific projects with students can be defined; those are mostly experimental, although there might be some theoretical tasks, like:
* Performing calculations of hyperfine structures
As for the theory, there might also be an international connection for particularly bright theory students: we collaborate closely with prof. Krzysztof Pachucki, and we might find an opportunity for a student to perform (the best!) QED calculations in molecules, working in Warsaw and partly in Amsterdam. Prof. Frederic Merkt from ETH Zurich, an expert in the field, will come to work with us on "hydrogen" during August - Dec 2018 while on sabbatical.
''Contact: [ Wim Ubachs] [ Kjeld Eikema] [ Rick Bethlem]''

Revision as of 11:07, 17 January 2020


=== Dark Matter: XENON1T Data Analysis ===

The XENON collaboration has used the XENON1T detector to achieve the world’s most sensitive direct detection dark matter results and is currently building the XENONnT successor experiment. The detectors operate at the Gran Sasso underground laboratory and consist of so-called dual-phase xenon time-projection chambers filled with ultra-pure xenon. Our group has an opening for a motivated MSc student to do analysis with the data from the XENON1T detector. The work will consist of understanding the detector signals and applying machine learning tools such as deep neural networks to improve the reconstruction performance in our Python-based analysis tool, following the approach described in arXiv:1804.09641. The final goal is to improve the energy and position reconstruction uncertainties for the dark matter search. There will also be the opportunity to do data-taking shifts at the Gran Sasso underground laboratory in Italy.
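As a rough illustration of the reconstruction problem (toy PMT geometry and light model, not the actual XENON1T simulation): the simplest baseline for the (x, y) position is the centroid of the S2 light over the top PMT array, which the neural-network approach is meant to improve upon.

```python
import numpy as np

# Toy dual-phase TPC: the S2 light of an event at (x, y) spreads over a
# top PMT array; the centroid of the hit pattern is the simplest position
# estimator. The PMT layout and light model below are invented.
rng = np.random.default_rng(3)
ang = np.linspace(0.0, 2.0 * np.pi, 12, endpoint=False)
pmt = np.vstack([[0.0, 0.0], np.c_[40.0 * np.cos(ang), 40.0 * np.sin(ang)]])

def hit_pattern(x, y, n_photons=5000):
    # photon counts per PMT, weights falling off with distance (toy model)
    d2 = (pmt[:, 0] - x) ** 2 + (pmt[:, 1] - y) ** 2
    w = 1.0 / (d2 + 400.0)
    return rng.multinomial(n_photons, w / w.sum())

def centroid(hits):
    return hits @ pmt / hits.sum()

x_rec, y_rec = centroid(hit_pattern(10.0, -5.0))
```

The centroid is biased towards the detector centre, which is one motivation for replacing it with a network trained on simulated hit patterns, as in the cited approach.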

''Contact: Patrick Decowski and Auke Colijn''

=== Dark Matter: XAMS R&D Setup ===

The Amsterdam Dark Matter group operates an R&D xenon detector at Nikhef. The detector is a dual-phase xenon time-projection chamber and contains about 4 kg of ultra-pure liquid xenon. We plan to use this detector for the development of new detection techniques (such as utilizing new photosensors) and to improve the understanding of the response of liquid xenon to various forms of radiation. The results could be directly used in the XENON experiment, the world’s most sensitive direct detection dark matter experiment at the Gran Sasso underground laboratory. We have several interesting projects for this facility. We are looking for someone who is interested in working in a laboratory on high-tech equipment, modifying the detector, taking data and analyzing the data yourself. You will "own" this experiment.

''Contact: Patrick Decowski and Auke Colijn''

=== Dark Matter: DARWIN Sensitivity Studies ===

DARWIN is the "ultimate" direct detection dark matter experiment, with the goal of reaching the so-called "neutrino floor", where neutrinos become a hard-to-reduce background. The large and exquisitely clean xenon mass will allow DARWIN to also be sensitive to other physics signals such as solar neutrinos, double-beta decay of Xe-136, axions and axion-like particles. While the experiment will only start in 2025, we are in the midst of optimizing the experiment, which is driven by simulations. We have an opening for a student to work on the GEANT4 Monte Carlo simulations for DARWIN, as part of a simulation team together with the universities of Freiburg and Zurich. We are also working on a "fast simulation" that could be included in this framework. It is your opportunity to steer the optimization of a large and unique experiment. This project requires good programming skills (Python and C++) and data analysis/physics interpretation skills.

''Contact: Patrick Decowski and Auke Colijn''

=== The Modulation Experiment: Data Analysis ===

There exist a few measurements that suggest an annual modulation in the activity of radioactive sources. Together with a few groups from the XENON collaboration we have developed four sets of table-top experiments to investigate this effect with a few well-known radioactive sources. The experiments are under construction at Purdue University (USA), on a mountain top in Switzerland, on a beach in Rio de Janeiro, and at Nikhef in Amsterdam. We urgently need a master student to (1) analyze the first big data set, and (2) contribute to the first physics paper from the experiment. We are looking for all-round physicists with an interest in both lab work and data analysis. The student(s) will directly collaborate with the other groups in this small collaboration (around 10 people), and the goal is to have the first physics publication ready by the end of the project. During the 2018-2019 season there are positions for two MSc students.
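The core of such an analysis can be sketched as follows (all rates, the half-life and the noise level are invented): divide out the known exponential decay and fit an annual cosine, which is linear in its parameters once the period and phase are fixed.

```python
import numpy as np

# Toy modulation analysis: pseudo-data for a decaying source with a small
# annual modulation; all numbers are invented for illustration.
rng = np.random.default_rng(7)
t = np.arange(0.0, 730.0)                  # days of data taking
half_life = 10700.0                        # days, Cs-137-like scale (invented)
rate = 1000.0 * np.exp(-np.log(2.0) * t / half_life)
data = rate * (1.0 + 0.002 * np.cos(2.0 * np.pi * t / 365.25))
data = data + rng.normal(0.0, 0.1, t.size) # measurement noise (toy)

# divide out the known decay; with fixed period and phase the model
# offset + amp * cos(...) is linear, so least squares finds the amplitude
resid = data / rate
X = np.c_[np.ones_like(t), np.cos(2.0 * np.pi * t / 365.25)]
(offset, amp), *_ = np.linalg.lstsq(X, resid, rcond=None)
```

The real analysis additionally has to disentangle such a signal from seasonal instrumental effects (temperature, pressure), which is why the four setups are placed in very different environments.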

''Contact: Auke Colijn''

== Last year's MSc Projects ==
