Master Projects

Master Thesis Research Projects

The following Master thesis research projects are offered at Nikhef. If you are interested in one of these projects, please contact the coordinator listed with the project.

Projects with September 2020 start

ALICE: Searching for the strongest magnetic field in nature

In a non-central collision between two Pb ions, i.e. one with a large impact parameter (b), the charged nucleons that do not participate in the interaction (called spectators) create strong magnetic fields. A back-of-the-envelope calculation using the Biot-Savart law puts the magnitude of this field close to 10^19 Gauss, in agreement with state-of-the-art theoretical calculations, making it the strongest magnetic field in nature. The presence of this field could have direct implications for the motion of final-state particles. The magnetic field, however, decays rapidly, at a rate that depends on the electric conductivity of the medium, which is experimentally poorly constrained. Establishing the presence of this magnetic field experimentally, which has not yet been done, is the main goal of this project.
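To make the back-of-the-envelope estimate above concrete, the short Python sketch below evaluates the Biot-Savart field of the spectator protons; the charge, Lorentz factor and impact parameter are rough illustrative assumptions for Pb-Pb collisions at the LHC, not numbers taken from the project.

 # Rough Biot-Savart estimate of the spectator magnetic field (invented toy numbers).
 mu0_over_4pi = 1e-7            # T m / A
 e = 1.602e-19                  # C
 c = 3.0e8                      # m/s
 z_spectators = 60              # assumed number of spectator protons (order of magnitude)
 gamma = 1500                   # assumed Lorentz factor of the Pb beams at the LHC
 b = 7e-15                      # assumed impact parameter, ~7 fm
 
 # Field of a relativistic point charge at closest approach: B ~ (mu0/4pi) * gamma * q * v / b^2
 B_tesla = mu0_over_4pi * gamma * z_spectators * e * c / b**2
 print(f"B ~ {B_tesla:.1e} T ~ {B_tesla * 1e4:.1e} Gauss")   # of order 10^19-10^20 Gauss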

Contact: Panos Christakoglou

ALICE: Looking for parity-violating effects in strong interactions

Within the Standard Model, symmetries such as the combination of charge conjugation (C) and parity (P), known as CP-symmetry, are considered key principles of particle physics. The violation of CP-invariance can be accommodated within the Standard Model in both the weak and the strong interactions; however, it has only been confirmed experimentally in the former. Theory predicts that in heavy-ion collisions, in the presence of a deconfined state, gluonic fields create domains where the parity symmetry is locally violated. This manifests itself in a charge-dependent asymmetry in the production of particles relative to the reaction plane, known as the Chiral Magnetic Effect (CME). The first experimental results from STAR (RHIC) and ALICE (LHC) are consistent with the expectations from the CME; however, further studies are needed to constrain background effects. These highly anticipated results have the potential to reveal exciting new physics.
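The charge-dependent asymmetry is commonly quantified with two-particle correlators measured relative to the reaction plane. As a purely illustrative toy (not the ALICE analysis code), the Python sketch below computes gamma = <cos(phi_a + phi_b - 2*Psi_RP)> for same-sign and opposite-sign pairs, with a small CME-like charge separation injected by hand.

 import numpy as np
 
 rng = np.random.default_rng(1)
 n_events, n_part, n_sig = 500, 80, 8        # toy multiplicities, purely illustrative
 psi = 0.0                                   # reaction-plane angle of the toy events
 
 def gamma(phi_a, phi_b):
     """<cos(phi_a + phi_b - 2*psi)> averaged over all pairs from two groups."""
     return np.mean(np.cos(phi_a[:, None] + phi_b[None, :] - 2 * psi))
 
 g_ss, g_os = [], []
 for _ in range(n_events):
     s = rng.choice([-1, 1])                 # CME-like sign flips event by event
     phi_pos = rng.uniform(0, 2 * np.pi, n_part)
     phi_neg = rng.uniform(0, 2 * np.pi, n_part)
     # Inject a small charge separation perpendicular to the reaction plane.
     phi_pos[:n_sig] = psi + s * np.pi / 2 + rng.normal(0, 0.3, n_sig)
     phi_neg[:n_sig] = psi - s * np.pi / 2 + rng.normal(0, 0.3, n_sig)
     g_ss.append(0.5 * (gamma(phi_pos, phi_pos) + gamma(phi_neg, phi_neg)))
     g_os.append(gamma(phi_pos, phi_neg))
 
 print("same-sign gamma     :", np.mean(g_ss))   # negative for a CME-like signal
 print("opposite-sign gamma :", np.mean(g_os))   # positive for a CME-like signal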


Contact: Panos Christakoglou

ALICE: Machine learning techniques as a tool to study the production of heavy flavour particles

The field of heavy-ion physics has recently seen a shift, triggered by experimental results obtained in collisions between small systems (e.g. proton-proton collisions). These results resemble the ones obtained in collisions between heavy ions, which raises the question of whether we create the smallest QGP droplet in collisions between small systems. The main objective of this project is to study the production of charm particles such as D-mesons and Λc-baryons in pp collisions at the LHC. This will be done with the help of a new and innovative technique based on machine learning (ML). The student will also extend the studies to investigate how this production rate depends on the event activity, e.g. on how many particles are created in every collision.
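As an indication of the kind of ML-based selection involved, the sketch below trains a boosted-decision-tree classifier on invented D-meson candidate features (decay length, impact parameter, invariant mass) with scikit-learn; the features, their distributions and the classifier choice are illustrative assumptions, not the actual ALICE analysis.

 import numpy as np
 from sklearn.ensemble import GradientBoostingClassifier
 from sklearn.model_selection import train_test_split
 
 rng = np.random.default_rng(0)
 n = 5000
 # Invented candidate features: [decay length (cm), impact parameter (cm), inv. mass (GeV)]
 signal = np.column_stack([rng.exponential(0.05, n) + 0.01,
                           rng.normal(0.0, 0.002, n),
                           rng.normal(1.87, 0.01, n)])
 backgr = np.column_stack([rng.exponential(0.02, n),
                           rng.normal(0.0, 0.01, n),
                           rng.uniform(1.75, 2.00, n)])
 X = np.vstack([signal, backgr])
 y = np.concatenate([np.ones(n), np.zeros(n)])
 
 X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
 clf = GradientBoostingClassifier(n_estimators=200, max_depth=3)
 clf.fit(X_train, y_train)
 print("test accuracy:", clf.score(X_test, y_test))
 # clf.predict_proba(X_test)[:, 1] would then serve as the candidate selection variable.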


Contact: Panos Christakoglou and Alessandro Grelli

Lepton Collider: Pixel TPC testbeam

In the Lepton Collider group at Nikhef we work on a tracking detector for a future lepton collider (e.g. the ILC in Japan). We are developing a gaseous Time Projection Chamber with a pixel readout. At Nikhef we have built an 8-quad GridPix module based on the Timepix3 chip, a detector of about 20 cm x 40 cm x 10 cm in size. In August 2020 we will test the device in a test beam at the DESY accelerator in Hamburg. For this project you could start with preparations for the test beam (e.g. running the data acquisition and performing data monitoring using our setup in the lab). The next steps are participation in the data taking during the test beam at DESY, analysis of the data using C++ and ROOT and, finally, publication of the results in a scientific journal.

Our latest paper can be found at https://www.nikhef.nl/~s01/quad_paper.pdf.
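To give a flavour of the test-beam analysis (which in practice is done in C++ and ROOT, as mentioned above), the Python sketch below fits a straight track to toy pixel hits with a least-squares fit and inspects the residuals; the geometry and resolution numbers are invented.

 import numpy as np
 
 rng = np.random.default_rng(2)
 # Toy hits along the track: true straight line x = a*z + b plus single-point resolution.
 z_hits = np.linspace(0.0, 100.0, 40)          # mm, positions of the 40 toy hits
 a_true, b_true, sigma = 0.05, 3.0, 0.02       # assumed slope, offset and resolution (mm)
 x_hits = a_true * z_hits + b_true + rng.normal(0, sigma, z_hits.size)
 
 # Least-squares straight-line fit.
 A = np.column_stack([z_hits, np.ones_like(z_hits)])
 (a_fit, b_fit), *_ = np.linalg.lstsq(A, x_hits, rcond=None)
 residuals = x_hits - (a_fit * z_hits + b_fit)
 
 print(f"fitted slope {a_fit:.4f}, offset {b_fit:.3f} mm")
 print(f"residual RMS {residuals.std():.4f} mm (input resolution {sigma} mm)")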

Contact: Peter Kluit and Kees Ligtenberg

ATLAS: Top Spin optimal observables using Artificial Intelligence

The top quark has an exceptionally high mass, close to the electroweak symmetry-breaking scale, and is therefore sensitive to new physics effects. Theoretically, new physics is well described in the EFT framework [1]. The EFT operators are experimentally well accessible in single-top t-channel production, where the top quark is produced spin-polarized. The focus at Nikhef is the operator O_{tW} with a possible imaginary phase, leading to CP violation. Experimentally, many angular distributions are reconstructed in the top rest frame to hunt for these effects. We are looking for a limited set of optimal observables. The objective of your Master project would be to find optimal observables using simulated events, including detector effects and possible systematic deviations. All techniques are allowed, but promising new developments are methods that involve artificial intelligence. This work could lead to an ATLAS note.

[1] https://arxiv.org/abs/1807.03576
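As a minimal illustration of the 'optimal observable' idea, the sketch below builds a per-event log-likelihood ratio from histograms of a single toy angular variable generated under an SM-like and a modified hypothesis; the distributions are invented, and in the actual project a machine-learning method would replace the simple histogram step.

 import numpy as np
 
 rng = np.random.default_rng(3)
 
 def sample(alpha, n):
     """Draw cos(theta) from a pdf ~ (1 + alpha*cos(theta))/2 on [-1, 1] (accept-reject)."""
     out = []
     while len(out) < n:
         c = rng.uniform(-1, 1, n)
         keep = rng.uniform(0, 1 + abs(alpha), n) < 1 + alpha * c
         out.extend(c[keep])
     return np.array(out[:n])
 
 n = 100000
 cos_sm  = sample(+0.30, n)        # SM-like toy angular distribution
 cos_alt = sample(-0.10, n)        # toy distribution with a modified coupling
 
 # Histogram-based per-event log-likelihood ratio: an approximation of the optimal observable.
 bins = np.linspace(-1, 1, 41)
 h_sm,  _ = np.histogram(cos_sm,  bins=bins, density=True)
 h_alt, _ = np.histogram(cos_alt, bins=bins, density=True)
 llr = np.log(h_alt / h_sm)
 
 idx = np.clip(np.digitize(cos_alt, bins) - 1, 0, len(llr) - 1)
 print("mean log-likelihood ratio on modified-coupling events:", np.mean(llr[idx]))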

Contact: Marcel Vreeswijk and Jordy Degens

ATLAS: The Next Generation

After the observation of the coupling of the Higgs boson to fermions of the third generation, the search for its coupling to fermions of the second generation is one of the next priorities for research at CERN's Large Hadron Collider. The search for the decay of the Higgs boson to two charm quarks is very new [1] and we see various opportunities for interesting developments. For this project we propose improvements in reconstruction (using exclusive decays), advanced analysis techniques (using deep learning methods) and expanding the theory interpretation. Another opportunity would be the development of the first statistical combination of results between the ATLAS and CMS experiments, which could significantly improve the discovery potential.

[1] https://arxiv.org/abs/1802.04329

Contact: Tristan du Pree and Marko Stamenkovic

ATLAS: The Most Energetic Higgs Boson

The production of Higgs bosons at the highest energies could give the first indications of deviations from the Standard Model of particle physics, but production energies above 500 GeV have not been observed yet [1]. The LHC Run-2 dataset, collected during the last four years, might offer the first opportunity to observe such processes, and we have various ideas for new studies. Possible developments include the improvement of boosted reconstruction techniques, for example using multivariate deep learning methods. There are also various opportunities for unexplored theory interpretations (using the MadGraph event generator), including effective field theory models (with novel ‘morphing’ techniques) and new interpretations of the recently observed boosted VZ(bb) process.

[1] https://arxiv.org/abs/1709.05543

Contact: Tristan du Pree and Brian Moser

Dark Matter: XENON1T Data Analysis

The XENON collaboration has used the XENON1T detector to achieve the world’s most sensitive direct-detection dark matter results and is currently building the XENONnT successor experiment. The detectors operate at the Gran Sasso underground laboratory and consist of so-called dual-phase xenon time-projection chambers filled with ultra-pure xenon. Our group has an opening for a motivated MSc student to do analysis with the data from the XENON1T detector. The work will consist of understanding the detector signals and applying machine learning tools such as deep neural networks to improve the reconstruction performance in our Python-based analysis tool, following the approach described in arXiv:1804.09641. The final goal is to improve the energy and position reconstruction uncertainties for the dark matter search. There will also be the opportunity to do data-taking shifts at the Gran Sasso underground laboratory in Italy.
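To illustrate the type of regression task involved (this is not the approach of arXiv:1804.09641 nor the collaboration's code), the sketch below trains a small scikit-learn neural network to map a toy top-array PMT hit pattern onto the (x, y) position of the interaction; the PMT layout and light-collection model are invented.

 import numpy as np
 from sklearn.neural_network import MLPRegressor
 from sklearn.model_selection import train_test_split
 
 rng = np.random.default_rng(4)
 # Invented top PMT array on a square grid; the real XENON arrays are circular and larger.
 gx, gy = np.meshgrid(np.linspace(-40, 40, 6), np.linspace(-40, 40, 6))
 pmt_pos = np.column_stack([gx.ravel(), gy.ravel()])            # 36 toy PMTs (cm)
 
 def hit_pattern(xy, n_photons=2000):
     """Share photons over the PMTs with a simple 1/(r^2 + h^2) light-collection model."""
     r2 = np.sum((pmt_pos - xy) ** 2, axis=1)
     w = 1.0 / (r2 + 5.0 ** 2)
     return rng.poisson(n_photons * w / w.sum())
 
 n_events = 4000
 true_xy = rng.uniform(-35, 35, size=(n_events, 2))
 patterns = np.array([hit_pattern(xy) for xy in true_xy], dtype=float)
 patterns /= patterns.sum(axis=1, keepdims=True)                # normalise each event
 
 X_train, X_test, y_train, y_test = train_test_split(patterns, true_xy,
                                                     test_size=0.3, random_state=0)
 net = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
 net.fit(X_train, y_train)
 resid = net.predict(X_test) - y_test
 print("toy position resolution (RMS per axis, cm):", resid.std(axis=0))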

Contact: Patrick Decowski and Auke Colijn

Dark Matter: Signal reconstruction in XENONnT

The next-generation direct-detection dark matter experiment, XENONnT, comprises close to 500 photomultiplier tubes (PMTs) in the main detector volume. These PMTs are configured to detect even single photons. When a single photoelectron (PE) is detected, the recorded signal (a pulse) is convolved with the detector response of the PMT. Due to this detector response, the pulse shape of a single PE is spread out in time. For XENONnT we would like to explore the possibility of implementing a digital (software) filter to deconvolve the detected pulse back to the “true” instantaneous shape (without the detector spread). This is a virtually unexplored new step in the XENON analysis framework.

Later in the analysis framework the pulses from all the PMTs are combined into a signal referred to as a ‘peak’. For XENONnT it is essential to discriminate very well between the two types of peaks caused by interactions in the detector: the prompt primary scintillation signal (S1) and the secondary ionization signal (S2). The parameters in the software have not, as of the time of writing, been optimized for the XENONnT detector conditions. The student will investigate how a deconvolution filter would benefit the XENONnT analysis framework and develop such a filter. Furthermore, the student will work on the classification of these signals in order to fully exploit the XENONnT detector. This will be done with simulated data at first, but may later be performed on actual XENONnT data. As an extension, the possibility of applying machine learning to correctly distinguish between the two signals could be explored. This is a data-analysis oriented project in which Python skills are paramount.
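A minimal sketch of the deconvolution idea, assuming a known single-PE response and using a regularised frequency-domain (Wiener-like) filter on a toy waveform; the real XENONnT processing framework, its waveform format and its parameters are not used here.

 import numpy as np
 from scipy.signal import find_peaks
 
 rng = np.random.default_rng(5)
 n = 512                                        # samples in the toy waveform
 t = np.arange(n)
 
 # Assumed single-PE response of the PMT: sharp rise at sample 5, exponential tail.
 spe = np.exp(-np.maximum(t - 5, 0) / 15.0) * (t >= 5)
 spe /= spe.sum()
 
 # "True" instantaneous signal: two closely spaced photon arrivals.
 truth = np.zeros(n)
 truth[100], truth[130] = 1.0, 0.7
 
 # Detected pulse: the truth convolved with the SPE response, plus electronics noise.
 pulse = np.convolve(truth, spe)[:n] + rng.normal(0, 0.002, n)
 
 # Regularised frequency-domain deconvolution (Wiener-like filter).
 eps = 1e-3                                     # regularisation against noise blow-up
 H = np.fft.rfft(spe)
 deconv = np.fft.irfft(np.fft.rfft(pulse) * np.conj(H) / (np.abs(H) ** 2 + eps), n=n)
 
 peaks, _ = find_peaks(deconv, height=0.3 * deconv.max())
 print("reconstructed arrival samples:", peaks)  # close to the injected samples 100 and 130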

Contact: Patrick Decowski and Joran Angevaare

Dark Matter: XAMS R&D Setup

The Amsterdam Dark Matter group operates an R&D xenon detector at Nikhef. The detector is a dual-phase xenon time-projection chamber and contains about 4 kg of ultra-pure liquid xenon. We use this detector for the development of new detection techniques - such as utilizing our newly installed silicon photomultipliers - and to improve the understanding of the response of liquid xenon to various forms of radiation. The results could be directly used in the XENONnT experiment, the world's most sensitive direct-detection dark matter experiment at the Gran Sasso underground laboratory, or for future dark matter experiments like DARWIN. We have several interesting projects for this facility. We are looking for someone who is interested in working in a laboratory on high-tech equipment, modifying the detector, taking data and analyzing the data themselves. You will "own" this experiment.

Contact: Patrick Decowski and Auke Colijn

Dark Matter: DARWIN Sensitivity Studies

DARWIN is the "ultimate" direct-detection dark matter experiment, with the goal of reaching the so-called "neutrino floor", at which neutrinos become a hard-to-reduce background. The large and exquisitely clean xenon mass will allow DARWIN to also be sensitive to other physics signals such as solar neutrinos, double-beta decay of Xe-136, axions and axion-like particles. While the experiment will only start in 2025, we are in the midst of optimizing the experiment, a process driven by simulations. We have an opening for a student to work on the GEANT4 Monte Carlo simulations for DARWIN, as part of a simulation team together with the Universities of Freiburg and Zurich. We are also working on a "fast simulation" that could be included in this framework. This is your opportunity to steer the optimization of a large and unique experiment. The project requires good programming skills (Python and C++) and data analysis/physics interpretation skills.

Contact: Patrick Decowski and Auke Colijn

Dark Matter: Fast simulation studies

For Dark Matter experiments it is crucial to understand the sources of background in great detail. The most common way to study the effect of backgrounds on the Dark Matter sensitivity is with Monte Carlo simulations. Unfortunately, standard Monte Carlo techniques are extremely inefficient: one sometimes needs to simulate millions of events before a single background event appears in the Dark Matter search region. We have developed a Monte Carlo technique that accelerates this process by up to a factor of 1000. The method has been validated on very simple and unrealistic detector models. The goal of this project is to build a realistic detector model for the fast detector simulations. For this we are looking for a student with good programming skills, an interest in a software project, and the desire to deeply understand the analysis of Dark Matter experimental data.
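Purely as an illustration of the general idea of accelerating rare-event Monte Carlo (the group's actual technique may differ), the sketch below uses importance sampling to estimate a small 'leakage into the signal region' probability far more efficiently than brute force.

 import numpy as np
 from math import erf, sqrt
 
 rng = np.random.default_rng(6)
 threshold = 4.5                                 # toy "signal region": events with x > 4.5
 p_exact = 0.5 * (1 - erf(threshold / sqrt(2)))  # exact Gaussian tail probability, for reference
 
 # Brute force: simulate standard-normal "background" events and count the leakage.
 n = 200000
 brute = np.mean(rng.normal(0, 1, n) > threshold)
 
 # Importance sampling: draw from a Gaussian shifted into the tail and reweight each
 # event by the ratio of the true density to the sampling density.
 shift = threshold
 x = rng.normal(shift, 1, n)
 weights = np.exp(-0.5 * x**2) / np.exp(-0.5 * (x - shift)**2)
 importance = np.mean((x > threshold) * weights)
 
 print(f"exact       {p_exact:.2e}")
 print(f"brute force {brute:.2e}   (often zero events pass)")
 print(f"importance  {importance:.2e}")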

Contact: Patrick Decowski and Auke Colijn

Dark Matter & Amsterdam Scientific Instruments: Simulations for Industry

In the Nikhef Dark Matter group we have built up extensive expertise with Monte Carlo simulations of ionizing radiation. Although these simulations aim to estimate background levels in our XENON experiments, the same techniques can be applied to study radiation transport in industrial devices. Amsterdam Scientific Instruments (ASI) is a company at Science Park that develops and sells radiation imaging equipment, used among other applications in electron microscopy. For this application ASI needs a detailed study of gamma-ray backgrounds to optimize the shielding for their products. The project aims at optimizing a shielding design based on GEANT4 simulations. The results may be implemented in the next generation of ASI products. We are looking for a student with preferably strong computing skills and an interest in science-industry collaboration.

Contact: Patrick Decowski and Auke Colijn

The Modulation experiment: Data Analysis

For years there have been controversial claims of potential new physics based on time-varying decay rates of radioactive sources on top of the ordinary exponential decay. While some of these claims have been refuted, others have yet to be confirmed or falsified. To this end a dedicated experiment - the Modulation experiment - has been designed and has been operational for the past four years. Using four identical and independent setups, the experiment is almost ready for a final analysis to settle these claims. In this project the student will perform this analysis, preferably resulting in a conclusive paper. This requires combining the data of the four setups and close collaboration with a small group from the four institutes involved: Purdue University (USA), Universität Zürich (Switzerland), Centro Brasileiro de Pesquisas Fisicas (Brazil) and Nikhef. This project is data-analysis oriented. Additionally, lab skills may be required, as one of the setups is situated at Nikhef.
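As a sketch of the final analysis step, the code below fits a toy decay-rate time series with an exponential plus an annual sinusoidal modulation and extracts the modulation amplitude; the half-life, amplitude and noise level are invented numbers, not properties of the actual sources.

 import numpy as np
 from scipy.optimize import curve_fit
 
 rng = np.random.default_rng(7)
 
 def rate(t, r0, tau, a, phase):
     """Exponential decay with a superimposed annual modulation of relative amplitude a."""
     return r0 * np.exp(-t / tau) * (1 + a * np.cos(2 * np.pi * t / 365.25 + phase))
 
 t = np.arange(0, 4 * 365, 1.0)                            # four years of daily data points
 truth = rate(t, r0=1e4, tau=2000.0, a=2e-3, phase=0.3)    # invented source parameters
 data = truth + rng.normal(0, np.sqrt(truth))              # counting-like noise
 
 popt, pcov = curve_fit(rate, t, data, p0=[1e4, 1800.0, 1e-3, 0.0])
 a_fit, a_err = popt[2], np.sqrt(pcov[2, 2])
 print(f"fitted modulation amplitude: {a_fit:.1e} +/- {a_err:.1e} (injected 2.0e-03)")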

Contact: Auke Colijn and Joran Angevaare

Detector R&D: Laser Interferometer Space Antenna (LISA)

The space-based gravitational wave antenna LISA is, without a doubt, one of the most challenging space missions ever proposed. Around 2030, ESA plans to launch three spacecraft, separated by a few million kilometers, to measure tiny variations in the distances between test masses located in each satellite and thereby detect gravitational waves from sources such as supermassive black holes. The triangular constellation of the LISA mission is dynamic, requiring constant fine-tuning of the pointing of the laser links between the spacecraft and a simultaneous refocusing of the telescope. The noise sources related to the laser links are expected to provide a dominant contribution to the LISA performance. An update and extension of the LISA science simulation software are needed to assess the hardware being developed for LISA at Nikhef, TNO and SRON. A position is therefore available for a master student to study the impact of instrumental noise on the performance of LISA. Realistic simulations based on hardware (noise) characterization measurements performed at TNO will be carried out and compared to the expected signals from tantalizing gravitational wave sources.

Contact: Niels van Bakel, Ernst-Jan Buis

Detector R&D: Spectral X-ray imaging - Looking at colours the eyes can't see

When a conventional X-ray image is taken, one acquires an image that only shows intensities: a ‘black and white’ image. Most of the information carried by the photon energy is lost. The lack of spectral information can result in an ambiguity between the material composition and the amount of material in the sample. If the X-ray intensity can be measured as a function of energy (i.e. a ‘colour’ X-ray image), more information can be obtained from a sample. This translates into a lower required dose and/or a better understanding of the sample under investigation. Two fields that can benefit from spectral X-ray imaging are, for example, mammography and real-time CT.

Detectors based on Medipix3 chips are used for X-ray imaging. Such a detector is composed of a pixel chip with a semiconductor sensor bonded on top of it. Photoelectric absorption of X-rays in the sensor releases an amount of charge proportional to the X-ray energy. This charge is registered by a pixel. Depending on the configuration, 1, 2, 4 or 8 detection thresholds can be set in each pixel, so that a number of energy bins can be defined. One of the challenges is to maximise the X-ray image quality by minimising the effects caused by dispersion in the sensitivity of the pixels. These effects can partly be compensated by applying a specific measurement method in combination with image post-processing.
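Because each threshold counts every photon that deposits more charge than that threshold, the counts in an energy bin follow from the difference between consecutive threshold counters. A minimal sketch of that bookkeeping, with invented numbers, is shown below.

 import numpy as np
 
 # Assumed threshold settings (keV) and the counts registered above each threshold.
 thresholds = np.array([10.0, 20.0, 30.0, 40.0])       # keV, illustrative values
 counts_above = np.array([9500, 6200, 2800, 700])      # integral counts above each threshold
 
 # Differential spectrum: counts falling between consecutive thresholds.
 bin_counts = -np.diff(counts_above)                   # 10-20, 20-30, 30-40 keV bins
 
 for lo, hi, c in zip(thresholds[:-1], thresholds[1:], bin_counts):
     print(f"{lo:.0f}-{hi:.0f} keV: {c} counts")
 print(f">{thresholds[-1]:.0f} keV: {counts_above[-1]} counts")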

You can work on improving measurement methods and on improving post-processing methods. There is flexibility in the planned work, depending on your skillset. The aim is to get the best X-ray energy resolution over the entire pixel chip. This in turn improves image quality and therefore X-ray CT reconstruction quality.

Important note: Much of this work is to be performed in the laboratory. For as long as corona safety measures are active, the labs at Nikhef are not accessible to students and this project cannot be worked on, except for the post-processing in software. Currently we hope that the situation will have improved by August. Please see the following videos for examples of our work:

https://youtu.be/cgwQvjfUYns

https://youtu.be/tf9ZLALPVNY

https://youtu.be/vjPX7SxvSUk

https://youtu.be/LqjNVSm7Hoo

Contact: Martin Fransen, Navrit Bal

Detector R&D: Holographic projector

A difficulty in projecting holograms (based on the interference of light) is the dense pixel pitch required of a projector: one would need a pixel pitch of less than 200 nanometres. With larger pixels, artefacts occur due to spatial undersampling. A pixel pitch of 200 nanometres is difficult, if not impossible, to achieve, especially over larger areas. Another challenge is the massive amount of computing power that would be required to control such a dense pixel matrix.

A new holographic projection method has been developed that reduces undersampling artefacts for projectors with a ‘low’ pixel density. It uses 'pixels' at random but known positions, resulting in an array of (coherent) light points that lacks (or has suppressed) spatial periodicity. As a result, a holographic projector can be built with a significantly lower pixel density and correspondingly less required computing power. This could bring holography within reach of many applications such as displays, lithography, 3D printing, metrology, etc.

Of course, nothing comes for free: with fewer pixels, holograms become noisier and the contrast is reduced (not all light ends up in the hologram). The questions are: how does the quality of a hologram depend on pixel density, and how do we determine projector requirements based on requirements for hologram quality?

Requirements for a hologram can be expressed in terms of noise, contrast, resolution, suppression of undersampling artefacts, etc.
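To make the undersampling question concrete, the toy simulation below compares the far-field intensity of a regular and a randomly positioned one-dimensional array of coherent point sources steered towards the same angle: the regular array shows strong grating lobes (the aliasing artefacts discussed above), while the random array trades them for a diffuse noise floor. The pitch, wavelength and array size are invented numbers.

 import numpy as np
 
 wavelength = 0.5e-6                         # m, illustrative
 k = 2 * np.pi / wavelength
 n_src, pitch = 200, 2.0e-6                  # 2 micron pitch: far coarser than lambda/2
 theta_target = np.deg2rad(5.0)              # steering angle of the projected point
 
 rng = np.random.default_rng(8)
 x_regular = np.arange(n_src) * pitch
 x_random = np.sort(rng.uniform(0, n_src * pitch, n_src))   # random but known positions
 
 theta = np.linspace(-np.pi / 6, np.pi / 6, 4000)
 
 def farfield(x_src):
     """Coherent far-field intensity with source phases set to steer towards theta_target."""
     phases = -k * x_src * np.sin(theta_target)
     field = np.exp(1j * (k * np.outer(np.sin(theta), x_src) + phases)).sum(axis=1)
     return np.abs(field) ** 2 / n_src ** 2
 
 for name, x_src in [("regular", x_regular), ("random", x_random)]:
     intensity = farfield(x_src)
     i_main = np.argmax(intensity)
     away_from_main = np.abs(np.arange(theta.size) - i_main) > 20
     print(f"{name:8s} main lobe {intensity[i_main]:.2f}, "
           f"strongest artefact elsewhere {intensity[away_from_main].max():.3f}")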

For this project we have built a proof of concept holographic emitter. This set-up will be used to verify simulation results (and also to project some cool holograms of course ;-).

Examples of what you could be working on:

a. Calibration/characterisation of the current projector and compensation of systematic errors.

b. To realize a phased array of randomly placed light sources, the pixel matrix of the projector must be ‘relayed’ onto a mask with apertures at random but precisely known positions. Determine the best possible relay optics and design an optimized mask accordingly. Factors like the deformation of the projected pixel matrix and limitations in the resolving power of the lens system must be taken into account in the mask design.

Important note: Much of this work is to be performed in the laboratory. For as long as corona safety measures are active, the labs at Nikhef are not accessible to students and this project cannot be worked on. Currently we hope that the situation will have improved by August.

Contact: Martin Fransen

Theory: The Effective Field Theory Pathway to New Physics at the LHC

A promising framework to parametrise, in a robust and model-independent way, the deviations from the Standard Model (SM) induced by new heavy particles is the Standard Model Effective Field Theory (SMEFT). In this formalism, beyond-the-SM effects are encapsulated in higher-dimensional operators constructed from SM fields and respecting their symmetry properties. In this project we aim to carry out a global analysis of the SMEFT using high-precision LHC data, including Higgs boson production, flavour observables, and low-energy measurements. This analysis will be carried out in the context of the recently developed SMEFiT approach [1], based on Machine Learning techniques to efficiently explore the complex theory parameter space. The ultimate goal is either to uncover glimpses of new particles or interactions at the LHC, or to derive the most stringent model-independent bounds to date on general theories of New Physics. Of particular interest are novel methods for charting the parameter space [2], the matching to UV-complete theories in explicit BSM scenarios [3], and the interplay between EFT-based model-independent searches for new physics and determinations of the proton structure from LHC data [4].

[1] https://arxiv.org/abs/1901.05965 [2] https://arxiv.org/abs/1906.05296 [3] https://arxiv.org/abs/1908.05588 [4] https://arxiv.org/abs/1905.05215
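As a much-simplified illustration of what a global EFT fit does (the actual SMEFiT analysis [1] uses Machine Learning to explore a far larger parameter space), the sketch below fits two toy Wilson coefficients that enter pseudo-measurements linearly, via a chi-square minimisation.

 import numpy as np
 
 rng = np.random.default_rng(9)
 
 # Toy setup: each observable is obs_i = SM_i + sum_j K_ij * c_j (linearised EFT corrections).
 n_obs, n_coef = 20, 2
 sm = rng.uniform(0.5, 2.0, n_obs)                 # toy SM predictions
 K = rng.normal(0, 0.1, size=(n_obs, n_coef))      # toy linear EFT sensitivities
 c_true = np.array([0.8, -0.3])                    # injected Wilson coefficients
 sigma = 0.05 * sm                                 # toy experimental uncertainties
 data = sm + K @ c_true + rng.normal(0, sigma)
 
 # The chi-square is quadratic in c for a linearised EFT, so the best fit is a
 # weighted least-squares solution and the covariance follows from the design matrix.
 A = K / sigma[:, None]
 b = (data - sm) / sigma
 c_fit, *_ = np.linalg.lstsq(A, b, rcond=None)
 cov = np.linalg.inv(A.T @ A)
 for j in range(n_coef):
     print(f"c_{j}: fitted {c_fit[j]:+.2f} +/- {np.sqrt(cov[j, j]):.2f} "
           f"(injected {c_true[j]:+.2f})")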

Contact: Juan Rojo

Theory: Charting the quark and gluon structure of protons and nuclei with Machine Learning

Deepening our knowledge of the partonic content of nucleons and nuclei [1] represents a central endeavour of modern high-energy and nuclear physics, with ramifications in related disciplines such as astroparticle physics. There are two main scientific drivers motivating these investigations of the partonic structure of hadrons. On the one hand, addressing fundamental open issues in our understanding of the strong interactions, such as the origin of the nucleon mass, spin, and transverse structure; the presence of heavy quarks in the nucleon wave function; and the possible onset of novel gluon-dominated dynamical regimes. On the other hand, pinning down with the highest possible precision the substructure of nucleons and nuclei is a central ingredient of theoretical predictions for a wide range of experiments, from proton and heavy-ion collisions at the Large Hadron Collider to ultra-high-energy neutrino interactions at neutrino telescopes. The goal of this project is to exploit Machine Learning and Artificial Intelligence tools [2,3] (neural networks trained by stochastic gradient descent) to pin down the quark and gluon substructure of protons and nuclei using recent measurements from proton-proton and proton-lead collisions at the LHC. Topics of special interest are i) the strange content of protons and nuclei, ii) parton distributions at higher orders in the QCD coupling for precision Higgs physics, iii) the interplay between jet, photon, and top quark production data to pin down the large-x gluon, and iv) charm quarks as a probe of gluon shadowing at small-x. The project also involves developing studies for the Electron-Ion Collider (EIC), a new lepton-nucleus experiment that will start operating in the coming years.

[1] https://arxiv.org/abs/1910.03408 [2] https://arxiv.org/abs/1904.00018 [3] https://arxiv.org/abs/1706.00428

Contact: Juan Rojo

Theory: The electroweak phase transition and baryogenesis/gravitational wave production

In extensions of the Standard Model the electroweak phase transition can be first order and proceed via the nucleation of bubbles. Colliding bubbles can produce gravitational waves [1], and plasma particles interacting with the bubbles can generate a matter-antimatter asymmetry [2]. A detailed understanding of the dynamics of the phase transition is needed to accurately describe these processes. One project is to study QFT at finite temperature and to compare/apply methods that address the non-perturbative IR dynamics of the thermal processes [3]. Another project is to calculate the velocity with which the bubbles expand, which is an important parameter for gravitational wave production and baryogenesis. This entails, among other things, tunneling dynamics, (thermal) scattering rates and Boltzmann equations [4].

[1] https://arxiv.org/abs/1705.01783 [2] https://arxiv.org/pdf/hep-ph/0609145.pdf [3] https://arxiv.org/pdf/1609.06230.pdf, https://arxiv.org/pdf/1612.00466.pdf [4] https://arxiv.org/pdf/1809.04907.pdf
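As a minimal numerical illustration of a first-order transition, the sketch below uses a generic polynomial form of the finite-temperature effective potential, V(phi, T) = D (T^2 - T0^2) phi^2 - E T phi^3 + (lambda/4) phi^4, with invented parameter values rather than any specific model, and scans the temperature to find where the symmetric and broken minima become degenerate (the critical temperature).

 import numpy as np
 
 # Generic quartic finite-temperature potential; the cubic term drives a first-order transition.
 D, E, lam, T0 = 0.17, 0.05, 0.12, 120.0          # invented parameter values (T in GeV)
 
 def V(phi, T):
     return D * (T**2 - T0**2) * phi**2 - E * T * phi**3 + 0.25 * lam * phi**4
 
 phi = np.linspace(1e-3, 400.0, 4000)
 for T in np.linspace(180.0, 120.0, 601):         # scan downwards in temperature
     if V(phi, T).min() < 0.0:                    # broken minimum degenerate with V(0, T) = 0
         phi_c = phi[np.argmin(V(phi, T))]
         print(f"T_c ~ {T:.1f}, phi_c ~ {phi_c:.0f}, phi_c/T_c ~ {phi_c / T:.2f}")
         break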

Contact: Marieke Postma


Theory: Cosmology of the QCD axion

The QCD axion provides an elegant solution to the strong CP problem of QCD [1]. This project focuses on the cosmological dynamics of this hypothesized axion field, and in particular on the possibility that it can produce both the observed matter-antimatter asymmetry and the dark matter abundance in our universe [2].

[1] https://arxiv.org/abs/1812.02669 [2] https://arxiv.org/pdf/hep-ph/0609145.pdf, https://arxiv.org/pdf/1910.02080.pdf
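As a toy illustration of the axion misalignment dynamics, the sketch below integrates the field equation assuming radiation domination (H = 1/2t) and a constant axion mass, in units where m_a = 1; the realistic case has a temperature-dependent mass, but the characteristic behaviour is visible: the field is frozen while H >> m_a and oscillates once H ~ m_a, after which its energy density redshifts like matter.

 import numpy as np
 from scipy.integrate import solve_ivp
 
 # theta'' + 3 H theta' + sin(theta) = 0 with H = 1/(2t), time in units of 1/m_a.
 def eom(t, y):
     theta, dtheta = y
     H = 1.0 / (2.0 * t)
     return [dtheta, -3.0 * H * dtheta - np.sin(theta)]
 
 theta0 = 1.0                                   # initial misalignment angle
 sol = solve_ivp(eom, t_span=(1e-3, 200.0), y0=[theta0, 0.0],
                 dense_output=True, rtol=1e-6, atol=1e-9)
 
 for t in (0.1, 1.0, 3.0, 30.0, 100.0):
     theta, dtheta = sol.sol(t)
     rho = 0.5 * dtheta**2 + (1.0 - np.cos(theta))   # energy density in units of m_a^2 f_a^2
     print(f"t = {t:6.1f}   theta = {theta:+.3f}   rho = {rho:.4f}")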

Contact: Marieke Postma


Theory: Neutrinos, hierarchy problem and cosmology

The electroweak hierarchy problem is absent if the quadratic term in the Higgs potential is generated dynamically. This is achieved in 'the neutrino option' [1], where the Higgs potential stems exclusively from quantum effects of heavy right-handed neutrinos, which can also generate the mass pattern of the observed left-handed neutrinos. The project focuses on model-building aspects (e.g. [2]) and the cosmology (e.g. leptogenesis [3]) of these setups.

[1] https://arxiv.org/pdf/1703.10924.pdf [2] https://arxiv.org/pdf/1807.11490.pdf [3] https://arxiv.org/pdf/1905.12642.pdf

Contact: Marieke Postma


Last year's MSc Projects