Master Projects

From Education Wiki
 
The following Master thesis research projects are offered at Nikhef. If you are interested in one of these projects, please contact the coordinator listed with the project.  
 
[MORE PROJECTS TO COME!]

[[Last years MSc Projects|Last year's MSc Projects]]

== Projects with September 2020 start ==

=== ATLAS: Top Spin optimal observables using Artificial Intelligence ===
  
The top quark has an exceptionally high mass, close to the electroweak symmetry breaking scale, and is therefore sensitive to new physics effects. Theoretically, new physics is well described in the EFT framework [1]. The EFT operators are experimentally well accessible in single-top t-channel production, where the top quark is produced spin polarized. The focus at Nikhef is the operator O_{tW} with a possible imaginary phase, leading to CP violation. Experimentally, many angular distributions are reconstructed in the top rest frame to hunt for these effects. We are looking for a limited set of optimal observables. The objective of your Master project would be to find optimal observables using simulated events, including detector effects and possible systematic deviations. All techniques are allowed, but promising new developments are methods that involve artificial intelligence. This work could lead to an ATLAS note.
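To make the idea concrete, here is a small numpy-only toy (not ATLAS code): for a linear EFT-style deformation p(x|c) ∝ 1 + c·x of an angular distribution, the optimal observable is the score O(x) = d log p/dc evaluated at c = 0, and its sample mean already separates the SM from the deformed hypothesis. The distribution and the value c = 0.3 are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def sample(c, n):
    # Rejection-sample cos(theta) from the toy density p(x) ~ 1 + c*x on [-1, 1].
    out = []
    while len(out) < n:
        x = rng.uniform(-1, 1, n)
        u = rng.uniform(0, 1 + abs(c), n)
        out.extend(x[u < 1 + c * x])
    return np.array(out[:n])

# Optimal observable for this linear deformation: O(x) = d log p / dc at c=0 = x.
def optimal_observable(x):
    return x

sm_events = sample(0.0, 20000)   # "SM" sample
np_events = sample(0.3, 20000)   # "new physics" sample (assumed c = 0.3)

# For this density E[x] = c/3, so the mean of O is ~0 for SM and ~0.1 here.
o_sm = optimal_observable(sm_events).mean()
o_np = optimal_observable(np_events).mean()
```

In a real analysis the score is not known in closed form, which is where machine-learning estimates of likelihood ratios come in.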
  
[1] https://arxiv.org/abs/1807.03576

''Contact: [mailto:h73@nikhef.nl Marcel Vreeswijk] and [mailto:jdegens@nikhef.nl Jordy Degens]''

=== The XENON Dark Matter Experiment: Data Analysis ===

The XENON collaboration is currently commissioning the XENON1T detector, soon to be the world's most sensitive direct detection dark matter experiment, with the Nikhef group playing an important role in this work. The detector operates at the Gran Sasso underground laboratory and consists of a so-called dual-phase xenon time-projection chamber filled with 3500 kg of ultra-pure xenon. Our group has an opening for a motivated MSc student to do data analysis on this new detector. The work will consist of understanding the signals that come out of the detector, with a particular focus on so-called double-scatter events. We are interested in developing methods to better interpret the response of the detector and are developing sophisticated statistical tools to do this. This work will include looking at data and developing new algorithms in our Python-based analysis tool.

''Contact: [mailto:decowski@nikhef.nl Patrick Decowski]''
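As a sketch of what identifying double-scatter events can look like at the software level (a toy, not the actual XENON analysis code), one can count well-separated S2-like pulses in a simulated waveform with a simple threshold-crossing peak finder:

```python
import numpy as np

# Toy waveform: two Gaussian S2 pulses on a flat baseline (a "double scatter").
t = np.arange(2000)
wf = (50.0 * np.exp(-0.5 * ((t - 600) / 30) ** 2)
      + 30.0 * np.exp(-0.5 * ((t - 1400) / 30) ** 2))

def count_pulses(w, threshold=5.0):
    """Count contiguous regions above threshold (a crude peak finder)."""
    above = w > threshold
    # A pulse starts wherever `above` switches from False to True.
    starts = np.flatnonzero(above[1:] & ~above[:-1]) + 1
    if above[0]:
        starts = np.concatenate(([0], starts))
    return len(starts)

n_pulses = count_pulses(wf)   # 2 for this double-scatter toy
```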
=== ATLAS: The Next Generation ===

After the observation of the coupling of the Higgs boson to fermions of the third generation, the search for its coupling to fermions of the second generation is one of the next priorities for research at CERN's Large Hadron Collider. The search for the decay of the Higgs boson to two charm quarks is very new [1] and we see various opportunities for interesting developments. For this project we propose improvements in reconstruction (using exclusive decays), advanced analysis techniques (using deep learning methods) and expanding the theory interpretation. Another opportunity would be the development of the first statistical combination of results between the ATLAS and CMS experiments, which could significantly improve the discovery potential.

[1] https://arxiv.org/abs/1802.04329

''Contact: [mailto:tdupree@nikhef.nl Tristan du Pree and Marko Stamenkovic]''

=== XAMS Dark Matter R&D Setup ===

The Amsterdam Dark Matter group has built an R&D xenon detector at Nikhef. The detector is a dual-phase xenon time-projection chamber and contains about 4 kg of ultra-pure liquid xenon. We plan to use this detector for the development of new detection techniques (such as utilizing new photosensors) and to improve the understanding of the response of liquid xenon to various forms of radiation. The results could be directly used in the XENON experiment, the world's most sensitive direct detection dark matter experiment at the Gran Sasso underground laboratory. We have several interesting projects for this facility. We are looking for someone who is interested in working in a laboratory on high-tech equipment, modifying the detector, and taking and analyzing data themselves. You will "own" this experiment.

''Contact: [mailto:decowski@nikhef.nl Patrick Decowski]''
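A minimal sketch of one basic dual-phase TPC analysis step (illustrative numbers, not XAMS calibration values): the interaction depth follows from the drift time between the prompt S1 light and the delayed S2 charge signal, given the electron drift velocity.

```python
# In a dual-phase xenon TPC, the z-coordinate of an interaction follows from
# the S1-S2 drift time. The drift velocity below is a typical order of
# magnitude for liquid xenon, assumed here for illustration only.
DRIFT_VELOCITY_MM_PER_US = 1.7

def interaction_depth_mm(t_s1_us, t_s2_us, v=DRIFT_VELOCITY_MM_PER_US):
    """Depth below the liquid surface from the S1-S2 time difference."""
    return (t_s2_us - t_s1_us) * v

depth = interaction_depth_mm(0.0, 40.0)   # 40 us drift -> 68 mm
```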
  
=== ATLAS: Beyond the Standard Model with multiple leptons ===

The Standard Model of particle physics (SM) is extremely successful, but would it hold up against a check with data containing multiple leptons? Although they are very rare processes, the production of leptons is calculated in the SM with high precision. On the detector side, leptons (electrons and muons) are easy to reconstruct, and such a sample contains very little "non-lepton" background. This analysis has the very ambitious goal of testing many final states at once, without over-tuning for a specific model. The second step would then be to test the obtained results against models of composite lepton structure or the presence of heavy right-handed neutrinos favored in seesaw theories. With this project, the student would gain close familiarity with modern experimental techniques (statistical analysis, SM background estimates, etc.), with Monte Carlo generators and with the standard HEP analysis tools (ROOT, C++, etc.).

''Contact: [mailto:O.Igonkina@nikhef.nl Olya Igonkina]''

=== ATLAS: The Most Energetic Higgs Boson ===

The production of Higgs bosons at the highest energies could give the first indications of deviations from the standard model of particle physics, but production energies above 500 GeV have not been observed yet [1]. The LHC Run-2 dataset, collected during the last 4 years, might be the first opportunity to observe such processes, and we have various ideas for new studies. Possible developments include the improvement of boosted reconstruction techniques, for example using multivariate deep learning methods. Also, there are various opportunities for unexplored theory interpretations (using the MadGraph event generator), including effective field theory models (with novel 'morphing' techniques) and new interpretations of the newly observed boosted VZ(bb) process.

[1] https://arxiv.org/abs/1709.05543

''Contact: [mailto:tdupree@nikhef.nl Tristan du Pree and Brian Moser]''

=== KM3NeT: Reconstruction of first neutrinos in KM3NeT ===
  
The neutrino telescope KM3NeT is under construction in the Mediterranean Sea, aiming to detect cosmic neutrinos. Its first string with sensitive photodetectors was deployed at the end of 2015; in total 30 will be deployed by the end of 2017. Already these few strings make it possible to reconstruct in the detector the abundant muons stemming from interactions of cosmic rays with the atmosphere and to identify neutrino interactions. The performance and calibration of the detector will be evaluated, also in comparison with simulations. Procedures to identify and optimally reconstruct the directions of the muons and neutrinos will be developed, to verify the performance and potential of the detector and to pave the path towards neutrino astronomy. Programming skills are essential; mostly ROOT and C++ will be used.

''Contact: [mailto:rbruijn@nikhef.nl Ronald Bruijn]''

=== LHCb: Measurement of delta md ===

The decay B0->D-pi+ is very abundant in LHCb, and therefore ideal to study the oscillation frequency delta md, with which B0 mesons oscillate into anti-B0 mesons, and vice versa. This process proceeds through a so-called box diagram, which might hide new, yet-undiscovered particles. Recently, it has been realized that the value of delta md is in tension with the value of the CKM angle gamma, triggering renewed interest in this measurement.

''Contact: [mailto:Marcel.Merk@nikhef.nl Marcel Merk]''
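The measurement idea can be sketched in a few lines of Python (a toy chi-square scan on an idealized mixing asymmetry A(t) = cos(Δmd·t), nothing like the real LHCb fit, which must handle flavour tagging, resolution and acceptance):

```python
import numpy as np

DELTA_MD = 0.51   # ps^-1, roughly the known B0 oscillation frequency (toy truth)

# Toy measured asymmetry A(t) = cos(delta_md * t) with Gaussian noise.
t = np.linspace(0.5, 10.0, 60)            # decay time in ps
rng = np.random.default_rng(7)
asym = np.cos(DELTA_MD * t) + rng.normal(0, 0.02, t.size)

# Recover the frequency with a simple chi2 scan (a stand-in for a real fit).
scan = np.linspace(0.3, 0.7, 401)
chi2 = [((asym - np.cos(dm * t)) ** 2).sum() for dm in scan]
dm_fit = scan[int(np.argmin(chi2))]       # close to the input 0.51 ps^-1
```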
  
=== LHCb: Searching for CPT violation ===

CPT symmetry is closely linked to Lorentz symmetry, and any violation would revolutionize science. There are possibilities, though, that supergravity could cause CPT-violating effects in the system of neutral mesons. The precise study of B0s oscillations in the abundant Bs->Dspi decays can give the most stringent limits on Im(z) to date.

''Contact: [mailto:Marcel.Merk@nikhef.nl Marcel Merk]''

=== Neutrino mass hierarchy with KM3NeT/ORCA ===

Neutrinos exist in three flavors and are known to oscillate between them, whereby the detected flavor depends on the (partly) known oscillation parameters, the mass differences, their energy and their travel length. The neutrino telescope KM3NeT is planning a dedicated set of detection units in order to pursue an oscillation measurement of unprecedented precision using neutrinos from atmospheric interactions, thereby enabling the measurement of the so far still unknown neutrino mass hierarchy. The measurement of this subtle effect requires unprecedented precision in the reconstruction and identification of the flavor, energy and direction. Various projects are available in the reconstruction and in the evaluation of the mass hierarchy using dedicated simulations. Programming skills are essential; mainly C++ and ROOT will be used.

''Contact: [mailto:aart.heijboer@nikhef.nl Aart Heijboer]''
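The oscillation effect ORCA targets can be illustrated with the standard two-flavor vacuum formula (parameter values are indicative only; the real analysis needs full three-flavor oscillations in Earth matter, which is precisely where the hierarchy sensitivity comes from):

```python
import numpy as np

def p_mu_survival(e_gev, l_km, sin2_2theta=0.99, dm2_ev2=2.5e-3):
    """Two-flavor muon-neutrino survival probability (vacuum approximation).

    P = 1 - sin^2(2*theta) * sin^2(1.27 * dm2 * L / E); the mixing and
    mass-splitting values are indicative atmospheric parameters.
    """
    return 1.0 - sin2_2theta * np.sin(1.27 * dm2_ev2 * l_km / e_gev) ** 2

# Atmospheric neutrinos crossing the Earth (L ~ 12700 km) show a deep
# oscillation minimum at energies of a few tens of GeV and below --
# the region ORCA is designed to probe.
p = p_mu_survival(e_gev=25.0, l_km=12700.0)   # deep in the oscillation dip
```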
+
=== LHCb: BR(B0->D-pi+) and fd/fu with B+->D0pi+ ===

The abundant decay B0->D-pi+ is often used as a normalization channel, given its clean signal and well-known branching fraction, as measured by the B-factories. However, this branching fraction can be determined more precisely by comparing to the decay B+->D0pi+, which is known with twice better precision. In addition, the production rates of B0 and B+ mesons are often assumed to be equal, based on isospin symmetry. The study of B+->D0pi+ and B0->D-pi+ allows for the first measurement of this ratio, fd/fu.

''Contact: [mailto:Marcel.Merk@nikhef.nl Marcel Merk]''

=== All-flavor-neutrino analysis of ANTARES data ===
  
The ANTARES neutrino telescope has been taking data continuously since 2007. Most analyses of the data have used the signature of a muon neutrino interaction, whereby a long track can be reconstructed in the detector.

Recent developments have, for the first time, also allowed the reconstruction of a cascade signature in the detector with high angular resolution, so that electron and tau neutrino interactions can be detected as well (these two are not distinguishable from each other). A search for neutrinos from cosmic sources in the first 6 years of data has by now been accomplished using this new reconstruction. In this project, this search will be continued and extended with two more years of data in a dedicated, optimized analysis of the Galactic Center, to probe the possible neutrino emission from this highly interesting region.

Programming skills are essential; mainly C++ and ROOT will be used.

Other options for analyses of the ANTARES data are also available.
 
  
''Contact: [mailto:dosamt@nikhef.nl Dorothea Samtleben]''
=== LHCb: Optimization studies for the Vertex detector at the High-Lumi LHCb ===

The LHCb experiment is dedicated to measuring tiny differences between matter and antimatter through the precise study of rare processes involving b or c quarks. The LHCb detector will undergo a major modification in order to dramatically increase the luminosity and be able to measure indirect effects of physics beyond the standard model. In this environment, over 42 simultaneous collisions are expected to happen within a time interval of 200 ps where the two proton bunches overlap. The particles of interest have a relatively long lifetime, and therefore the best way to distinguish them from the background collisions is through the precise reconstruction of displaced vertices and pointing directions. The new detector considers using extremely recent or even future technologies to measure space (with resolutions below 10 um) and time (100 ps or better) to efficiently reconstruct the events of interest for physics. The project involves completely changing the LHCb Vertex Locator (VELO) design in simulation and determining the best achievable performance for the upgraded detector, considering different spatial and temporal resolutions.

''Contact: [mailto:kazu.akiba@nikhef.nl Kazu Akiba]''

=== Particle Polarization in Strong Magnetic Fields ===

When two atomic nuclei, moving in opposite directions, collide off-center, the Quark Gluon Plasma (QGP) created in the overlap zone is expected to rotate. The nucleons not participating in the collision represent electric currents generating an intense magnetic field. The magnetic field could be as large as 10^{18} gauss, orders of magnitude larger than the strongest magnetic fields found in astronomical objects. Proving the existence of the rotation and/or the magnetic field could be done by checking whether particles with spin are aligned with the rotation axis, or whether charged particles have different production rates relative to the direction of the magnetic field. In particular, the longitudinal and transverse polarisation of the Lambda^0 baryon will be studied. This project requires some affinity with computer programming.

''Contact: [mailto:Panos.Christakoglou@nikhef.nl P. Christakoglou], [mailto:Paul.Kuijer@nikhef.nl P. Kuijer]''
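A toy of how such a polarization could be extracted from decay angles (illustrative only, not the ALICE analysis): for a decay-angle distribution dN/dcosθ* ∝ 1 + α_Λ·P·cosθ*, the sample mean of cosθ* gives back the polarization P.

```python
import numpy as np

ALPHA_LAMBDA = 0.732   # Lambda decay parameter (PDG value)

rng = np.random.default_rng(3)

def sample_costheta(polarization, n):
    # Rejection-sample cos(theta*) from (1 + alpha * P * cos) / 2.
    a = ALPHA_LAMBDA * polarization
    out = []
    while len(out) < n:
        x = rng.uniform(-1, 1, n)
        u = rng.uniform(0, 1 + abs(a), n)
        out.extend(x[u < 1 + a * x])
    return np.array(out[:n])

# For this distribution <cos(theta*)> = alpha * P / 3, so the polarization
# can be estimated directly from the sample mean of the decay angle.
data = sample_costheta(0.4, 50000)          # assumed true P = 0.4
p_est = 3.0 * data.mean() / ALPHA_LAMBDA    # estimator for P
```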
=== LHCb: Measurement of charge multiplication in heavily irradiated sensors ===

During the R&D phase for the LHCb VELO Upgrade detector, a few sensor prototypes were irradiated to the extreme fluence expected to be reached during the detector lifetime. These samples were tested using high-energy particles at the SPS facility at CERN, with their trajectories reconstructed by the Timepix3 telescope. A preliminary analysis revealed that at the highest irradiation levels the amount of signal observed is higher than expected, and even larger than the signal obtained at lower doses. At the Device Under Test (DUT) position inside the telescope, the spatial resolution attained by this system is below 2 um. This means that a detailed analysis can be performed to study where and how this signal amplification happens within the 55x55 um^2 pixel cell. This project involves analysing the telescope and DUT data to investigate the charge multiplication mechanism at the microscopic level.

''Contact: [mailto:kazu.akiba@nikhef.nl Kazu Akiba]''

=== Forward Particle Production from the Color Glass Condensate ===

It has been proposed that a new state of matter (the color-glass condensate, or CGC) may provide a universal description of hadronic collisions (e.g. proton-proton collisions) at very high energy. The CGC may be seen as the classical field limit of Quantum Chromodynamics, and a framework for calculating observables from this state has been developed. Several measurements are consistent with the assumption of a CGC, but no experimental proof exists so far. In this project we intend to perform a systematic study of the sensitivity to the CGC of different possible measurements at the LHC. The work will be performed in close collaboration with an external world expert in this field. It is advantageous to have a good background in theoretical physics.

''Contact: [mailto:T.Peitzmann@uu.nl T. Peitzmann], [mailto:marco.van.leeuwen@nikhef.nl M. van Leeuwen]''

=== Blast-Wave Model in Heavy-Ion Collisions ===

The goal of heavy-ion physics is to study the Quark Gluon Plasma (QGP), a hot and dense medium where quarks and gluons move freely over large distances, larger than the typical size of a hadron. Hydrodynamic simulations predict that the QGP will expand under its own pressure and cool while expanding. These simulations are particularly successful in describing some of the key observables measured experimentally, such as particle spectra and elliptic flow. A reasonable reproduction of the same observables is also achieved with models that use parameterisations resembling the hydrodynamical evolution of the system, assuming a given freeze-out scenario; these are usually referred to as blast-wave models. The goal of this project is to work on different blast-wave parametrisations, test their dependence on the input parameters, and extend their applicability by including more observables studied in heavy-ion collisions in the global fit.

''Contact: [mailto:Panos.Christakoglou@nikhef.nl P. Christakoglou], [mailto:Paul.Kuijer@nikhef.nl P. Kuijer]''
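As a minimal illustration of the fitting idea (a single-slope thermal toy, far simpler than a real blast-wave fit, which folds in a radial flow velocity profile):

```python
import numpy as np

# Toy m_T spectrum: dN/dm_T ~ m_T * exp(-m_T / T_eff), the simplest
# thermal shape; in a blast-wave fit T_eff would absorb radial flow.
T_TRUE = 0.30                      # GeV, assumed effective temperature
mt = np.linspace(0.5, 2.5, 30)     # transverse mass points, GeV
spectrum = mt * np.exp(-mt / T_TRUE)

# The inverse slope of log(spectrum / m_T) versus m_T returns T_eff.
slope, _ = np.polyfit(mt, np.log(spectrum / mt), 1)
t_fit = -1.0 / slope               # recovers 0.30 GeV
```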
=== LHCb: Testing the flavour anomalies at LHCb ===

Lepton Flavour Universality (LFU) is an intrinsic property of the Standard Model, which implies that the three generations of leptons are subject to the same interactions. This fundamental law of the SM can be investigated by looking at rare B-meson decays with muons or electrons in the final state. Recent measurements of these decays at LHCb show deviations from the SM (known as flavour anomalies) that, if confirmed, would lead to a major discovery of New Physics (NP). The project consists of the analysis of the 2017-18 dataset, which will double the statistics of the current results. This new dataset will lead to a measurement with better precision, which can either confirm or exclude the contribution of NP to these decays. The project will explore all the crucial aspects of data analysis, from simulation to signal modeling, including cutting-edge software such as fitting large amounts of data using GPUs (Graphics Processing Units).

''Contact: [mailto:a.mauri@cern.ch Andrea Mauri] and [mailto:marcel.merk@nikhef.nl Marcel Merk]''

=== Energy Loss of Energetic Quarks and Gluons in the Quark-Gluon Plasma ===

One of the ways to study the quark-gluon plasma that is formed in high-energy nuclear collisions is using high-energy partons (quarks or gluons) that are produced early in the collision and interact with the quark-gluon plasma as they propagate through it. There are several current open questions related to this topic which can be explored in a Master's project. For example, we would like to use a new Monte Carlo generator model (JEWEL) of the collision to see whether we can measure the shape of the collision region using measurements of hadron pairs. In the project you will collaborate with one of the PhD students in our group to use the model to generate predictions of measurements and compare those to data analysis results. Depending on your interests, the project can focus more on the modeling aspects or on the analysis of experimental data from the ALICE detector at the LHC.

''Contact: [mailto:marco.van.leeuwen@nikhef.nl M. van Leeuwen]''
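The standard way to quantify parton energy loss is the nuclear modification factor R_AA = (AA yield) / (N_coll × pp yield); a toy with an assumed pT-shifted spectrum shows the characteristic suppression pattern (all numbers are illustrative, not measured values):

```python
import numpy as np

pt = np.linspace(5, 50, 10)            # transverse momentum bins, GeV
pp_yield = pt ** -6.0                  # toy power-law pp spectrum
n_coll = 1600.0                        # assumed number of binary collisions

# Toy quenched AA spectrum: energy loss modeled as an effective 3 GeV
# shift of the steeply falling spectrum.
aa_yield = n_coll * (pt + 3.0) ** -6.0

# R_AA < 1 signals suppression; for a fixed shift the suppression is
# strongest at low pT and slowly recovers towards high pT.
r_aa = aa_yield / (n_coll * pp_yield)
```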
=== LHCb: Search for long-lived heavy neutral leptons in B decays ===

The masses of neutrinos are many orders of magnitude smaller than those of the other fermions. In the seesaw mechanism this puzzling fact is explained by the existence of another set of neutral leptons that are much heavier in mass. If their mass is below about 5 GeV, such neutrinos can be produced at the LHC in decays of B hadrons. Their small coupling will lead to a lifetime of the order of picoseconds, which means that they will fly an observable distance before they decay. In this project we search for such long-lived heavy neutrinos in decays of charged B mesons using the LHCb Run-2 dataset.

''Contact: [mailto:v.lukashenko@nikhef.nl Lera Lukashenko] and [mailto:wouter.hulsbergen@nikhef.nl Wouter Hulsbergen]''

=== Chiral Magnetic Effect and the Strong CP Problem ===

Within the Standard Model, symmetries, such as the combination of charge conjugation (C) and parity (P), known as CP-symmetry, are considered key principles of particle physics. The violation of CP-invariance can be accommodated within the Standard Model in both the weak and the strong interactions; however, it has only been confirmed experimentally in the former. Theory predicts that in heavy-ion collisions gluonic fields create domains where the parity symmetry is locally violated. This manifests itself in a charge-dependent asymmetry in the production of particles relative to the reaction plane, which is called the Chiral Magnetic Effect (CME). The first experimental results from STAR (RHIC) and ALICE (LHC) are consistent with the expectations from the CME, but background effects have not yet been properly disentangled. In this project you will develop and test new observables of the CME, trying to understand and discriminate the background sources that affect such a measurement.

''Contact: [mailto:Panos.Christakoglou@nikhef.nl P. Christakoglou], [mailto:Paul.Kuijer@nikhef.nl P. Kuijer]''
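One widely used charge-dependent correlator can be sketched as follows (a deliberately exaggerated toy signal, not real data; in practice the CME signal is tiny and buried under flow-related backgrounds):

```python
import numpy as np

rng = np.random.default_rng(5)

def gamma_correlator(phi_a, phi_b, psi_rp=0.0):
    """Charge-dependent correlator <cos(phi_a + phi_b - 2*Psi_RP)>,
    the standard observable proposed for CME searches."""
    return float(np.mean(np.cos(phi_a + phi_b - 2.0 * psi_rp)))

# Toy: a strong charge separation across the reaction plane (CME-like).
n = 20000
phi_plus = rng.normal(np.pi / 2, 0.3, n)     # positive charges emitted "up"
phi_minus = rng.normal(-np.pi / 2, 0.3, n)   # negative charges emitted "down"

# Same-charge pairs give a negative correlator, opposite-charge positive.
gamma_same = gamma_correlator(phi_plus, phi_plus[::-1])
gamma_opp = gamma_correlator(phi_plus, phi_minus)
```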
=== LHCb: Discovering the Bc->eta_c mu nu decay ===

The Bc meson, consisting of heavy c and anti-b quarks, is of great interest for flavour physics. A recent LHCb measurement of Bc->J/psi l nu decays [1] showed a possible deviation from the Standard Model prediction, which entered the so-called lepton universality puzzle, the hottest topic in b physics in recent years. Following that, the study of a similar decay mode, Bc->eta_c mu nu, is strongly requested by the theory community. However, the reconstruction of the eta_c meson is challenging, so this decay has not been discovered yet. The project aims at the discovery of the Bc->eta_c mu nu decay using the unique capabilities of the LHCb experiment. The data analysis will consist of finding the optimal event selection using machine learning techniques, researching background sources, performing fits to data, etc. The project requires comfort with analysis software and statistics. The results will be presented in the collaboration: talks at working group meetings, an analysis note, etc. Skills in git, Python and ROOT (and similar packages) are extremely welcome.

[1] https://arxiv.org/pdf/1711.05623.pdf

''Contact: [mailto:andrii.usachov@nikhef.nl Andrii Usachov] and [mailto:marcel.merk@nikhef.nl Marcel Merk]''

=== Quantum Coherence in Particle Production with Intensity Interferometry ===

Intensity interferometry, also known as the HBT effect or Bose-Einstein correlations, is a method to study the space-time structure of the particle-emitting source in high-energy physics. The main interest so far has been in the width of correlation functions in momentum space, which reflects the space-time information. The strength of the correlation also carries information, but this has often been ignored. The correlation strength is in particular influenced by the degree of coherence of particle production. Recently, new studies have been performed to extract this degree of coherence; however, many other effects might distort such a measurement, in particular the production of pions via resonance decay. In this project we will study the role of resonance decays in a measurement of coherence in intensity interferometry and try to establish possible correction methods for any distortions they may cause. We will perform theoretical model calculations with Monte Carlo simulation methods.

''Contact: [mailto:T.Peitzmann@uu.nl T. Peitzmann]''
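The observable in question can be sketched with the common Gaussian parameterisation C2(q) = 1 + λ·exp(−q²R²), where the intercept λ = C2(0) − 1 carries the coherence information (the R and λ values below are illustrative):

```python
import numpy as np

def c2(q_gev, radius_fm=5.0, lam=0.7):
    """Gaussian Bose-Einstein correlation function C2(q) = 1 + lam*exp(-q^2 R^2).

    lam < 1 can signal partial coherence (or resonance feed-down, which is
    exactly the distortion the project studies); hbar*c = 0.1973 GeV*fm
    converts the radius into inverse GeV.
    """
    hbarc = 0.1973
    return 1.0 + lam * np.exp(-(q_gev * radius_fm / hbarc) ** 2)

intercept = c2(0.0)   # C2(0) = 1 + lam = 1.7 for these inputs
tail = c2(0.5)        # -> 1 at large relative momentum q
```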
  
=== Higher Harmonic Flow ===

When two ions collide with a non-zero impact parameter, the overlap region is not isotropic. This spatial anisotropy of the overlap region is transformed into an anisotropy in momentum space through interactions between partons and, at a later stage, between the produced particles. It was recently realized that the overlap region of the colliding nuclei exhibits an irregular shape. These irregularities originate from the initial density profile of nucleons participating in the collision, which is not smooth and differs from one event to the other. The resulting higher-order flow harmonics (e.g. v3, v4, and v5, usually referred to as triangular, quadrangular, and pentangular flow, respectively), and in particular their transverse momentum dependence, are argued to be more sensitive probes than elliptic flow, not only of the initial geometry and its fluctuations but also of the shear viscosity over entropy density (η/s). The goal of this project is to study v3, v4, and v5 for identified particles in collisions of heavy ions at the LHC.

''Contact: [mailto:Panos.Christakoglou@nikhef.nl P. Christakoglou], [mailto:Paul.Kuijer@nikhef.nl P. Kuijer]''

=== ALICE: Searching for the strongest magnetic field in nature ===

In a non-central collision between two Pb ions, with a large value of the impact parameter (b), the charged nucleons that do not participate in the interaction (called spectators) create strong magnetic fields. A back-of-the-envelope calculation using the Biot-Savart law brings the magnitude of this field close to 10^{19} gauss, in agreement with state-of-the-art theoretical calculations, making it the strongest magnetic field in nature. The presence of this field could have direct implications for the motion of final-state particles. The magnetic field, however, decays rapidly, with a decay rate that depends on the electric conductivity of the medium, which is experimentally poorly constrained. Establishing the presence of this magnetic field experimentally, which has not yet been done, is the main goal of this project.

''Contact: [mailto:Panos.Christakoglou@nikhef.nl Panos Christakoglou]''
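The back-of-the-envelope estimate mentioned above can be reproduced roughly as follows. All inputs (spectator count, impact parameter, Lorentz factor) are order-of-magnitude assumptions, and the point-charge formula ignores the spatial spread of the spectators, so only the conclusion "enormously larger than any astrophysical field" should be taken from it:

```python
import numpy as np

MU0 = 4e-7 * np.pi      # vacuum permeability, T*m/A
E_CHARGE = 1.602e-19    # elementary charge, C
C_LIGHT = 3e8           # speed of light, m/s

z_spectators = 60       # assumed spectator protons per nucleus (order of magnitude)
b_impact = 7e-15        # assumed impact parameter ~7 fm, in meters
gamma = 1500            # rough Lorentz factor per beam at LHC Pb-Pb energies

# Peak transverse field of a boosted point charge at distance r:
# B ~ mu0 * gamma * q * v / (4 * pi * r^2), with v ~ c and r ~ b/2.
b_tesla = (MU0 * gamma * z_spectators * E_CHARGE * C_LIGHT
           / (4 * np.pi * (b_impact / 2) ** 2))
b_gauss = b_tesla * 1e4   # 1 T = 10^4 gauss; dwarfs magnetar fields (~10^15 G)
```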
  
=== A New Detector for Very High-Energy Photons: FoCal ===

High-energy photons are important messenger particles in particle physics. In particular, direct photons (i.e. those produced directly in elementary scattering processes) are interesting, but it is a difficult task to discriminate them from the photons originating from particle decays. Existing detectors have limited capabilities for such a discrimination, in particular at the highest energies. Our institute has pioneered a detector based on a new concept: a digital pixel calorimeter with Si-sensors of unprecedented granularity. First proof-of-principle measurements have already been performed. In this project we will study the performance of a particular detector design for measurements of direct photons at the LHC and optimize the design parameters for such a measurement. Performance studies for other measurements, e.g. jets, J/ψ, or ϒ particles, may be carried out in addition.

''Contact: [mailto:T.Peitzmann@uu.nl T. Peitzmann], [mailto:marco.van.leeuwen@nikhef.nl M. van Leeuwen]''

=== ALICE: Looking for parity violating effects in strong interactions ===

Within the Standard Model, symmetries, such as the combination of charge conjugation (C) and parity (P), known as CP-symmetry, are considered key principles of particle physics. The violation of CP-invariance can be accommodated within the Standard Model in both the weak and the strong interactions; however, it has only been confirmed experimentally in the former. Theory predicts that in heavy-ion collisions, in the presence of a deconfined state, gluonic fields create domains where the parity symmetry is locally violated. This manifests itself in a charge-dependent asymmetry in the production of particles relative to the reaction plane, which is called the Chiral Magnetic Effect (CME). The first experimental results from STAR (RHIC) and ALICE (LHC) are consistent with the expectations from the CME; however, further studies are needed to constrain background effects. These highly anticipated results have the potential to reveal exciting new physics.

''Contact: [mailto:Panos.Christakoglou@nikhef.nl Panos Christakoglou]''
  
=== Thermal Photon Emission: Quark-Gluon Plasma or Hadron Gas? ===

Recently, measurements of thermal photon emission in high-energy nuclear collisions have been performed at RHIC and at the LHC. It is generally believed that a quark-gluon plasma equation of state is the natural description of the hot initial phase of these collisions, and so far only theoretical model calculations including such a phase have been compared to those measurements. In this project we will revisit hadron gas models and try to reproduce the thermal photon yield together with other observables. In this work we will use, and possibly modify, Monte Carlo implementations of relativistic hydrodynamics, tune them to existing data on hadron production, and then estimate the photon production from the same model. The model implementation will be based on previous work of external theoretical colleagues and will be carried out in collaboration with them.

''Contact: [mailto:T.Peitzmann@uu.nl T. Peitzmann]''

=== ALICE: Machine learning techniques as a tool to study the production of heavy flavour particles ===

There was recently a shift in the field of heavy-ion physics, triggered by experimental results obtained in collisions between small systems (e.g. protons on protons). These results resemble the ones obtained in collisions between heavy ions, which raises the question of whether we create the smallest QGP droplet in collisions between small systems. The main objective of this project will be to study the production of charm particles, such as D-mesons and Λc-baryons, in pp collisions at the LHC. This will be done with the help of a new and innovative technique based on machine learning (ML). The student will also extend the studies to investigate how this production rate depends on the event activity, e.g. on how many particles are created in every collision.

''Contact: [mailto:Panos.Christakoglou@nikhef.nl Panos Christakoglou] and [mailto:Alessandro.Grelli@cern.ch Alessandro Grelli]''
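As a stand-in for the ML technique (a tiny numpy-only logistic regression on two invented discriminating features, e.g. decay length and pointing angle; the actual analysis uses far richer inputs and models):

```python
import numpy as np

rng = np.random.default_rng(11)

# Toy candidate samples: "signal" and "background" drawn from two
# overlapping Gaussians in a 2D feature space (illustrative only).
n = 2000
sig = rng.normal([1.0, 1.0], 0.5, (n, 2))
bkg = rng.normal([0.0, 0.0], 0.5, (n, 2))
X = np.vstack([sig, bkg])
y = np.concatenate([np.ones(n), np.zeros(n)])

# Plain gradient-descent logistic regression as a minimal classifier.
w = np.zeros(2)
b = 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    w -= 0.5 * (X.T @ (p - y)) / len(y)
    b -= 0.5 * np.mean(p - y)

pred = (1.0 / (1.0 + np.exp(-(X @ w + b)))) > 0.5
accuracy = np.mean(pred == y)   # well above 0.5 for separated classes
```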
  
=== A New Detector for Proton Therapy and Proton Computed Tomography ===

Conventional medical imaging of humans relies mostly on electromagnetic radiation (CT, MRT) or positrons (PET). A recently proposed new imaging strategy, in particular in the context of proton therapy for cancer treatment, is to use proton beams. Current detectors for the scattered protons have severe limitations, in particular in their precision and measurement times. New developments of intelligent Si-sensors in particle physics offer possibilities to develop much more efficient detectors for such proton CT measurements. We will perform R&D on the use of new silicon pixel detectors, developed in the context of the ALICE experiment at CERN, for such medical applications. Studies will include Monte Carlo simulations of a possible detector setup and measurements with the first samples of the appropriate silicon sensors, which will become available in early 2016. The project will be carried out in the context of a scientific collaboration with Bergen University, Norway.

''Contact: [mailto:T.Peitzmann@uu.nl T. Peitzmann]''

=== ALICE: Energy Loss of Energetic Quarks and Gluons in the Quark-Gluon Plasma ===

One of the ways to study the quark-gluon plasma that is formed in high-energy nuclear collisions is using high-energy partons (quarks or gluons) that are produced early in the collision and interact with the quark-gluon plasma as they propagate through it. There are several current open questions related to this topic which can be explored in a Master's project. For example, we would like to use the new Monte Carlo generator framework JetScape to simulate collisions, to see whether we can extract information about the interaction with the quark-gluon plasma. In the project you will collaborate with one of the PhD students or postdocs in our group to use the model to generate predictions of measurements and compare those to data analysis results. Depending on your interests, the project can focus more on the modeling aspects or on the analysis of experimental data from the ALICE detector at the LHC.

''Contact: [mailto:marco.van.leeuwen@cern.ch Marco van Leeuwen] and [mailto:marta.verweij@cern.ch Marta Verweij]''
  
=== Medical X-ray Imaging ===
+
=== ALICE: Extreme Rare Probes of the Quark-Gluon Plasma ===
The quark-gluon plasma is formed in high-energy nuclear collisions and also existed shortly after the Big Bang. With the large amount of data collected in recent years at the Large Hadron Collider at CERN, rare processes that were previously inaccessible now provide new ways to study how the quark-gluon plasma emerges from the fundamental theory of the strong interaction. One such process is the production of the heavy W boson, which in many cases decays to two quarks. The W boson itself does not interact with the quark-gluon plasma because it carries no color charge, but its quark decay products do interact with the plasma and therefore provide an ideal tool to study the space-time evolution of this hot and dense medium. In this project you will use data from the ALICE detector at the LHC and simulated data from event generators to study the various physics mechanisms that could be at play in real collisions.
  
''Contact: [mailto:marta.verweij@cern.ch Marta Verweij] and [mailto:marco.van.leeuwen@cern.ch Marco van Leeuwen]''
  
=== ALICE: Jet Quenching with Machine Learning ===
  
Machine learning is steadily becoming a vital tool in data science but is still relatively new to the particle physics community. In this project, machine learning tools will be used to gain insight into the modification of a parton shower in the quark-gluon plasma (QGP). The QGP is created in high-energy nuclear collisions and lives only for a very short time. Highly energetic partons created in the same collisions interact with the plasma as they traverse it and are observed in the detector as collimated sprays of particles, known as jets. One of the key recent insights is that the internal structure of jets provides information about the evolution of the QGP. Using data recorded by the ALICE experiment, you will use jet substructure techniques in combination with machine learning algorithms to dissect the structure of the QGP. Machine learning will be used to select the regions of the radiation phase space that are affected by the presence of the QGP.
  
''Contact: [mailto:marta.verweij@cern.ch Marta Verweij] and [mailto:marco.van.leeuwen@cern.ch Marco van Leeuwen]''
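As a toy illustration of the kind of machine-learning workflow involved (a sketch only, not the actual ALICE analysis): the snippet below trains a small logistic-regression classifier, written in plain NumPy, to separate "quenched" from vacuum-like jets using two hypothetical substructure observables. The data are randomly generated for illustration; all distributions and parameters are made up.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy data: two hypothetical jet substructure observables (e.g. a groomed
# momentum fraction and a jet-mass-like variable). "Quenched" jets are
# shifted relative to vacuum-like jets; the numbers are illustrative only.
n = 1000
vacuum = rng.normal(loc=[0.25, 10.0], scale=[0.08, 3.0], size=(n, 2))
quenched = rng.normal(loc=[0.18, 13.0], scale=[0.08, 3.0], size=(n, 2))

X = np.vstack([vacuum, quenched])
y = np.concatenate([np.zeros(n), np.ones(n)])

# Standardize the features, then fit logistic regression by gradient descent
X = (X - X.mean(axis=0)) / X.std(axis=0)
w, b = np.zeros(2), 0.0
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # predicted P(quenched)
    grad_w = X.T @ (p - y) / len(y)
    grad_b = np.mean(p - y)
    w -= 0.5 * grad_w
    b -= 0.5 * grad_b

accuracy = np.mean((p > 0.5) == y)
print(f"training accuracy: {accuracy:.2f}")
```

In a realistic analysis the two toy observables would be replaced by measured jet substructure variables and the classifier by a more expressive model.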
  
=== Lepton Collider: Pixel TPC testbeam ===

In the Lepton Collider group at Nikhef we work on a tracking detector for a future lepton collider (e.g. the ILC in Japan). We are developing a gaseous Time Projection Chamber with a pixel readout. At Nikhef we have built an 8-quad GridPix module based on the Timepix3 chip: a detector of about 20 cm x 40 cm x 10 cm in size. In August 2020 we will test the device at the DESY particle accelerator in Hamburg. For this project you could work on preparations for the test beam (e.g. running the data acquisition and performing data monitoring with our setup in the lab). Subsequent topics are participation in the data taking during the test beam at DESY, analysis of the data using C++ and ROOT and, finally, publication of the results in a scientific journal.
  
Our latest paper can be found at https://www.nikhef.nl/~s01/quad_paper.pdf.
''Contact: [mailto:Peter.Kluit@nikhef.nl Peter Kluit] and Kees Ligtenberg''
=== Dark Matter: Sensitive tests of wavelength-shifting properties of materials for dark matter detectors ===
Rare event search experiments that look for neutrino and dark matter interactions rely on highly sensitive detector systems, often based on scintillators, especially liquefied noble gases, to detect particle interactions. Detectors consist of structural materials that are assumed to be optically passive, and light detection systems that use reflectors, light detectors, and sometimes wavelength-shifting materials. MSc theses are available related to measuring the efficiency of light detection systems that might be used in future detectors. Furthermore, measurements can be done to verify that presumably passive materials do not fluoresce at the low levels relevant to these detectors. Part of the thesis work can include Monte Carlo simulations and data analysis for current and upcoming dark matter detectors, to study the effect of different levels of desired and nuisance wavelength shifting. In this project, students will acquire skills in photon detection, wavelength-shifting technologies, vacuum systems, UV and extreme-UV optics, and detector design, and optionally in C++ programming, data analysis, and Monte Carlo techniques.
  
''Contact: [mailto:Tina.Pollmann@tum.de Tina Pollmann] and [mailto:decowski@nikhef.nl Patrick Decowski]''
  
=== Dark Matter: Signal reconstruction in XENONnT ===

The next-generation direct detection dark matter experiment, XENONnT, comprises close to 500 photomultiplier tubes (PMTs) in the main detector volume. These PMTs are configured to detect even single photons. When a single photoelectron (PE) is detected, the recorded pulse is convolved with the detector response of the PMT, which spreads the pulse shape out in time. For XENONnT we would like to explore the possibility of implementing a digital (software) filter to deconvolve the detected pulse back to the "true" instantaneous shape, without the detector spread. This is a virtually unexplored new step in the XENON analysis framework. Later in the analysis chain, the pulses from all PMTs are combined into a signal referred to as a 'peak'. For XENONnT it is essential to discriminate extremely well between the two types of peaks caused by interactions in the detector: the prompt primary scintillation signal (S1) and the secondary ionization signal (S2). The parameters in the software have not, as of the time of writing, been optimized for the XENONnT detector conditions.

The student would investigate how a deconvolution filter would benefit the XENONnT analysis framework and develop such a filter. Furthermore, the student will work on the classification of these signals to fully exploit the XENONnT detector. This will be done with simulated data at first but may later be performed on actual XENONnT data. As an extension, the possibility of applying machine learning to correctly distinguish between the two signals could be explored. This is a data-analysis oriented project where Python skills are paramount.

''Contact: [mailto:decowski@nikhef.nl Patrick Decowski] and [mailto:j.angevaare@nikhef.nl Joran Angevaare]''
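The deconvolution idea can be sketched in a few lines of NumPy (a toy model: the exponential response, pulse shapes, and regularization constant are illustrative, not the actual XENONnT detector response). Two photoelectrons that merge into one smeared pulse are separated again by a regularized (Wiener-style) inverse filter:

```python
import numpy as np

def wiener_deconvolve(pulse, response, noise_power=1e-3):
    """Deconvolve a recorded pulse with a known single-PE response
    using a regularized (Wiener-style) inverse filter."""
    n = len(pulse)
    H = np.fft.rfft(response, n)   # transfer function of the PMT response
    Y = np.fft.rfft(pulse, n)      # spectrum of the recorded pulse
    # Regularized inverse H* / (|H|^2 + noise) avoids dividing by ~0
    G = np.conj(H) / (np.abs(H) ** 2 + noise_power)
    return np.fft.irfft(Y * G, n)

# Toy "true" signal: two instantaneous photoelectrons, 3 samples apart
t = np.arange(200)
truth = np.zeros(200)
truth[50] = 1.0
truth[53] = 0.8

# Toy detector response: normalized exponential tail that smears the pulse
response = np.exp(-t / 10.0)
response /= response.sum()

recorded = np.convolve(truth, response)[:200]  # what the digitizer would see
recovered = wiener_deconvolve(recorded, response)

print(np.argmax(recorded), np.argmax(recovered))
```

In the smeared trace the maximum has shifted away from the true arrival time of the first photoelectron; after deconvolution the two separate photoelectrons reappear at their original positions.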
=== Dark Matter: XAMS  R&D Setup ===

The Amsterdam Dark Matter group operates an R&D xenon detector at Nikhef. The detector is a dual-phase xenon time-projection chamber and contains about 4 kg of ultra-pure liquid xenon. We use this detector to develop new detection techniques, such as utilizing our newly installed silicon photomultipliers, and to improve our understanding of the response of liquid xenon to various forms of radiation. The results could be used directly in XENONnT, the world's most sensitive direct detection dark matter experiment at the Gran Sasso underground laboratory, or in future dark matter experiments like DARWIN. We have several interesting projects for this facility. We are looking for someone who is interested in working in a laboratory on high-tech equipment, modifying the detector, and taking and analyzing the data themselves. You will "own" this experiment.

''Contact: [mailto:decowski@nikhef.nl Patrick Decowski] and [mailto:z37@nikhef.nl Auke Colijn]''
=== Dark Matter: DARWIN Sensitivity Studies ===

DARWIN is the "ultimate" direct detection dark matter experiment, with the goal of reaching the so-called "neutrino floor", where neutrinos become a hard-to-reduce background. The large and exquisitely clean xenon mass will also make DARWIN sensitive to other physics signals, such as solar neutrinos, double-beta decay of Xe-136, and axions and axion-like particles. While the experiment will only start in 2025, we are in the midst of optimizing it, a process driven by simulations. We have an opening for a student to work on the GEANT4 Monte Carlo simulations for DARWIN, as part of a simulation team together with the Universities of Freiburg and Zurich. We are also working on a "fast simulation" that could be included in this framework. This is your opportunity to steer the optimization of a large and unique experiment. The project requires good programming skills (Python and C++) and data analysis/physics interpretation skills.

''Contact: [mailto:decowski@nikhef.nl Patrick Decowski] and [mailto:z37@nikhef.nl Auke Colijn]''
=== Dark Matter: Fast simulation studies ===

For dark matter experiments it is crucial to understand the sources of background in great detail. The most common way to study the effect of backgrounds on the dark matter sensitivity is with Monte Carlo simulations. Unfortunately, standard Monte Carlo techniques are extremely inefficient: one sometimes needs to simulate millions of events before a single background event appears in the dark matter search region. We have developed a Monte Carlo technique that accelerates this process by up to a factor of 1000. The method has been validated on very simple, unrealistic detector models. The goal of this project is to build a realistic detector model for the fast detector simulations. We are looking for a student with good programming skills, an interest in a software project, and the desire to deeply understand the analysis of dark matter experimental data.

''Contact: [mailto:decowski@nikhef.nl Patrick Decowski] and [mailto:z37@nikhef.nl Auke Colijn]''
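The general idea behind such acceleration can be illustrated with a generic importance-sampling example (this is not the Nikhef implementation, just the textbook technique): to estimate the probability of a rare 4-sigma fluctuation, draw from a proposal distribution shifted into the tail and reweight each sample by the density ratio.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
threshold = 4.0  # "rare event": a standard normal fluctuating above 4 sigma

# Brute force: only a handful of samples (if any) land in the tail
x = rng.normal(0.0, 1.0, n)
p_brute = np.mean(x > threshold)

# Importance sampling: draw from a proposal shifted into the tail and
# reweight every sample by the density ratio p(x)/q(x); the Gaussian
# normalization constants cancel in the ratio.
shift = threshold
x_is = rng.normal(shift, 1.0, n)
weights = np.exp(-0.5 * x_is**2) / np.exp(-0.5 * (x_is - shift) ** 2)
p_is = np.mean((x_is > threshold) * weights)

true_p = 3.167e-5  # P(Z > 4) for a standard normal, for comparison
print(p_brute, p_is, true_p)
```

With the same number of samples, the brute-force estimate is dominated by Poisson noise (a few events at best), while the reweighted estimate is accurate to the percent level.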
=== Dark Matter & Amsterdam Scientific Instruments: Simulations for Industry ===

In the Nikhef Dark Matter group we have built up extensive expertise with Monte Carlo simulations of ionizing radiation. Although these simulations aim to estimate background levels in our XENON experiments, the same techniques can be applied to study radiation transport in industrial devices. Amsterdam Scientific Instruments (ASI) is a company at Science Park that develops and sells radiation imaging equipment, used among other applications in electron microscopy. For this application ASI needs a detailed study of gamma-ray backgrounds to optimize the shielding of their products. The project aims at optimizing a shielding design based on GEANT4 simulations. The results may be implemented in next-generation ASI products. We are looking for a student with preferably strong computing skills and an interest in science-industry collaboration.

''Contact: [mailto:decowski@nikhef.nl Patrick Decowski] and [mailto:z37@nikhef.nl Auke Colijn]''
=== The Modulation experiment: Data Analysis  ===

For years there have been controversial claims of potential new physics based on time-varying decay rates of radioactive sources, on top of the ordinary exponential decay. While some of these claims have been refuted, others have yet to be confirmed or falsified. To this end a dedicated experiment, the Modulation experiment, was designed; it has been operational for the past four years. Using four identical and independent setups, the experiment is almost ready for a final analysis to settle these claims. In this project the student will perform this analysis, preferably resulting in a conclusive paper. This requires combining the data of the four setups and working closely with the small collaboration formed by the four institutes involved (Purdue University (USA), Universität Zürich (Switzerland), Centro Brasileiro de Pesquisas Físicas (Brazil) and Nikhef). The project is data-analysis oriented, though lab skills may also be needed, as one of the setups is located at Nikhef.

''Contact: [mailto:z37@nikhef.nl Auke Colijn] and [mailto:j.angevaare@nikhef.nl Joran Angevaare]''
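The core analysis task can be sketched with a toy fit (all numbers here are invented, not Modulation-experiment values): generate four years of daily rates from an exponentially decaying source with a small annual modulation on top, then recover the modulation amplitude by dividing out the exponential and fitting the residual with linear least squares.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy data: four years of daily rates, exponential decay plus a small
# annual modulation; decay constant, amplitude and noise are made up.
t = np.arange(0, 4 * 365.0)  # days
tau = 5000.0                 # decay time constant [days]
amp_true = 0.002             # 0.2% annual modulation amplitude
rate = np.exp(-t / tau) * (1 + amp_true * np.cos(2 * np.pi * t / 365.0))
rate *= 1 + rng.normal(0, 5e-4, t.size)  # relative measurement noise

# Step 1: fit the exponential by linear regression on log(rate)
slope, intercept = np.polyfit(t, np.log(rate), 1)
residual = rate / np.exp(intercept + slope * t) - 1.0

# Step 2: fit residual = a*cos(wt) + b*sin(wt) with linear least squares
w = 2 * np.pi / 365.0
A = np.column_stack([np.cos(w * t), np.sin(w * t)])
(a, b), *_ = np.linalg.lstsq(A, residual, rcond=None)
amp_fit = np.hypot(a, b)
print(f"fitted modulation amplitude: {amp_fit:.4f}")
```

The real analysis additionally has to combine four setups, handle systematic effects, and set limits when no modulation is found; this sketch only shows the basic signal-extraction step.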
=== Detector R&D: Performance of the ALPIDE monolithic active pixel sensor with radiation damage ===

The ALICE Inner Tracking System 2 (ITS2) is currently being installed at the Large Hadron Collider (LHC) at CERN. This detector makes use of ultra-lightweight monolithic active pixel sensors, the first use of this technology at a particle collider since the STAR experiment at RHIC in Brookhaven. These very thin pixel detectors have a low power consumption, put very little material in the detector, and still have excellent timing and resolution, making them a promising technology for future experiments. You will be part of the international ALICE collaboration and investigate the ALICE ALPIDE chip. Although ALICE will not see high levels of radiation at the LHC, it has so far not been tested whether this chip can withstand very high radiation levels and could, if its performance does not degrade strongly, be used in experiments like ATLAS as well. You will be part of the Nikhef R&D group, where you will learn about new detector technologies for high energy physics and design a test setup to characterize the ALPIDE chip in a particle beam using the many instruments in the Nikhef R&D labs. You will then test the chip at the Delft or Groningen facilities that provide a particle beam.

''Contact: [mailto:jory.sonneveld@nikhef.nl Jory Sonneveld]''
=== Detector R&D: Time resolution of a high voltage monolithic active pixel sensor ===

For the first time, CMOS monolithic active pixel sensors (MAPS), in which chip and sensor are integrated, are being used in an experiment at the LHC. Although this is a common technology in industry, it is rather new to the high-rate, high-radiation environments of high energy particle physics. The ALICE experiment is currently installing such MAPS, to which a moderate bias voltage can be applied. You will work in the international RD50 collaboration, which develops radiation-hard semiconductor devices for very high luminosity colliders, and investigate its MAPS that can be biased to very high voltages to avoid signal degradation after radiation damage. You will be part of the Nikhef R&D group, where you will learn about new detector technologies for high energy physics and design a test setup for a first measurement of the time resolution of the RD50 HV-CMOS chip using the many instruments in the Nikhef R&D labs.

''Contact: [mailto:jory.sonneveld@nikhef.nl Jory Sonneveld]''
=== Detector R&D: Test beam with a bent ALPIDE monolithic active pixel sensor ===

The ALICE Inner Tracking System 2 (ITS2) is currently being installed at the Large Hadron Collider (LHC) at CERN. This detector makes use of ultra-lightweight monolithic active pixel sensors, the first use of this technology at a particle collider since the STAR experiment at RHIC in Brookhaven. These very thin pixel detectors have a low power consumption, put very little material in the detector, and still have excellent timing and resolution, making them a promising technology for future experiments. For the next long shutdown in 2025, a version of the ALPIDE chip with an even smaller feature size will be installed by bending larger sensor surfaces around the beam pipe. Recent test beams at DESY in Hamburg show that this yields good results. You will be part of the Nikhef R&D group, where you will learn about new detector technologies for high energy physics and design a test setup to characterize the ALPIDE chip using the many instruments in the Nikhef R&D labs. You will work within an international collaboration and learn to analyze test beam data. If the travel situation allows, you will have the opportunity to join the ALICE test beam group at DESY in Hamburg to take part in the exciting experience of taking real data.

''Contact: [mailto:jory.sonneveld@nikhef.nl Jory Sonneveld]''
=== Detector R&D: Simulating the performance of the ATLAS pixel detector after years of radiation ===

The innermost detector of the ATLAS experiment at the Large Hadron Collider (LHC), closest to the beam pipe, is the ATLAS pixel detector. The pixel sensors in this region receive the highest radiation doses, and their performance suffers accordingly. To better understand the effects of radiation damage and to predict future performance, the pixel sensors are modeled with programs such as technology computer-aided design (TCAD), which models the electric fields that serve as input for programs such as AllPix2, which in turn model observables affecting signal quality such as the charge collection efficiency. In this project you will learn to use TCAD, a tool widely used in the semiconductor industry, to model electric field maps of the sensor, and estimate the uncertainties by comparing the predictions of different models. You will compare your simulations to real data from the ATLAS experiment as well as to test beam data. You will work in an international environment within the ATLAS collaboration and be part of the Nikhef detector R&D group, where you will learn about the newest detector technologies for high energy physics and beyond. Your improved predictions for the performance of the next ATLAS pixel detector will help ATLAS prepare for LHC data taking after the installation of this detector in 2025.

''Contact: [mailto:jory.sonneveld@nikhef.nl Jory Sonneveld]''
=== Detector R&D: Laser Interferometer Space Antenna (LISA) ===

The space-based gravitational wave antenna LISA is without a doubt one of the most challenging space missions ever proposed. Around 2030, ESA plans to launch three spacecraft, separated by a few million kilometers, to measure tiny variations in the distances between test masses located in each satellite, and thereby detect gravitational waves from sources such as supermassive black holes. The triangular constellation of the LISA mission is dynamic, requiring constant fine-tuning of the pointing of the laser links between the spacecraft and simultaneous refocusing of the telescope. The noise sources related to the laser links are expected to make a dominant contribution to the LISA performance.

An update and extension of the LISA science simulation software are needed to assess the hardware being developed for LISA at Nikhef, TNO, and SRON. A position is therefore available for a master student to study the impact of instrumental noise on the performance of LISA. Realistic simulations based on hardware (noise) characterization measurements performed at TNO will be carried out and compared to the expected, tantalizing gravitational wave sources.

''Contact: [mailto:nielsvb@nikhef.nl Niels van Bakel], [mailto:ernst-jan.buis@tno.nl Ernst-Jan Buis]''
=== Detector R&D: Spectral X-ray imaging - Looking at colours the eyes can't see ===

When a conventional X-ray image is taken, one acquires an image that only shows intensities: a 'black and white' image. Most of the information carried by the photon energies is lost. The lack of spectral information can result in an ambiguity between the material composition and the amount of material in the sample. If the X-ray intensity can be measured as a function of energy (i.e. a 'colour' X-ray image), more information can be obtained from a sample. This translates into a lower required dose and/or a better understanding of the sample under investigation. Two fields that can benefit from spectral X-ray imaging are, for example, mammography and real-time CT.

Detectors based on Medipix3 chips are used for X-ray imaging. Such a detector is composed of a pixel chip with a semiconductor sensor bonded on top of it. Photoelectric absorption of an X-ray in the sensor releases an amount of charge proportional to the X-ray energy, which is registered by a pixel. Depending on the configuration, 1, 2, 4 or 8 detection thresholds can be set in each pixel, so that a number of energy bins can be defined. One of the challenges is to maximise X-ray image quality by minimising the effects of dispersion in the sensitivity of the pixels. These effects can partly be compensated by a specific measurement method combined with image post-processing.

You can work on improving both the measurement methods and the post-processing methods; there is flexibility in the planned work depending on your skillset. The aim is to obtain the best X-ray energy resolution over the entire pixel chip, which in turn improves image quality and therefore X-ray CT reconstruction quality.
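The threshold-to-bin logic can be sketched in a few lines (a toy model with invented energies and thresholds, not actual Medipix3 data): each threshold counter registers every photon above its energy, so the content of an energy bin is simply the difference between neighbouring threshold counters.

```python
import numpy as np

# Hypothetical photon energies hitting one pixel [keV] (toy spectrum)
rng = np.random.default_rng(0)
energies = rng.uniform(5, 60, 10_000)

# Programmable thresholds of the pixel [keV]; each counter registers
# every photon that deposits more energy than its threshold.
thresholds = np.array([10, 20, 30, 40, 50])
counts_above = np.array([(energies > th).sum() for th in thresholds])

# Energy-bin contents: differences between neighbouring threshold counters,
# plus the open-ended bin above the highest threshold.
bin_contents = -np.diff(counts_above)                      # [10,20), [20,30), ...
bin_contents = np.append(bin_contents, counts_above[-1])   # > 50 keV
print(dict(zip(["10-20", "20-30", "30-40", "40-50", ">50"], bin_contents)))
```

Because the counters are integrating, pixel-to-pixel dispersion in the effective threshold positions directly distorts the bin contents, which is why the threshold equalization discussed above matters.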
Important note: much of this work is to be performed in the laboratory. Because of the corona pandemic it is not certain whether it will be possible to be physically present enough of the time for this project. Please contact us to discuss the possibilities.

Please see the following videos for examples of our work:

https://youtu.be/cgwQvjfUYns

https://youtu.be/tf9ZLALPVNY

https://youtu.be/vjPX7SxvSUk

https://youtu.be/LqjNVSm7Hoo

''Contact: [mailto:martinfr@nikhef.nl Martin Fransen], [mailto:navritb@nikhef.nl Navrit Bal]''
=== Detector R&D: Holographic projector ===

A difficulty in projecting holograms (based on the interference of light) is the dense pixel pitch a projector requires: a pixel pitch of less than 200 nanometer is needed, since with larger pixels artefacts occur due to spatial undersampling. A pixel pitch of 200 nanometer is difficult, if not impossible, to achieve, especially over larger areas. Another challenge is the massive amount of computing power that would be required to control such a dense pixel matrix.

A new holographic projection method has been developed that reduces undersampling artefacts for projectors with a 'low' pixel density. It uses 'pixels' at random but known positions, resulting in an array of (coherent) light points that lacks (or has suppressed) spatial periodicity. As a result, a holographic projector can be built with a significantly lower pixel density and correspondingly less computing power. This could bring holography within reach of many applications, such as displays, lithography, 3D printing, and metrology.

Of course, nothing comes for free: with fewer pixels, holograms become noisier and the contrast is reduced (not all light ends up in the hologram). The questions are: How does the quality of a hologram depend on pixel density? How do we determine projector requirements based on requirements for hologram quality?

Requirements for a hologram can be expressed in terms of noise, contrast, resolution, suppression of undersampling artefacts, etc.

For this project we have built a proof-of-concept holographic emitter. This set-up will be used to verify simulation results (and also to project some cool holograms, of course ;-).
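The benefit of random emitter positions can be demonstrated with a small 1D simulation (a toy model with invented dimensions, not our actual projector): the far-field pattern of a regular array of coherent emitters repeats its main lobe as strong grating lobes, while the same number of randomly placed emitters spreads that energy into a much weaker speckle floor.

```python
import numpy as np

rng = np.random.default_rng(3)
wavelength = 0.5e-6   # 500 nm light (illustrative)
n_src = 200
aperture = 400e-6     # 400 micron wide 1D emitter array (illustrative)

# Emitter positions: regular ~2 um pitch vs random but known positions
x_regular = np.linspace(0, aperture, n_src)
x_random = rng.uniform(0, aperture, n_src)

angles = np.linspace(-0.3, 0.3, 4001)  # observation angles [rad]

def far_field(x):
    """Normalized far-field amplitude of coherent point sources at x."""
    k = 2 * np.pi / wavelength
    phase = k * np.outer(np.sin(angles), x)
    return np.abs(np.exp(1j * phase).sum(axis=1)) / len(x)

I_reg = far_field(x_regular) ** 2
I_rnd = far_field(x_random) ** 2

# Away from the central lobe, the regular array shows a grating lobe of
# near-unit intensity (an undersampling artefact); the random array only
# shows a suppressed speckle floor.
main = np.abs(angles) < 0.01
print(I_reg[~main].max(), I_rnd[~main].max())
```

The trade-off described above is also visible here: the random array trades the sharp grating lobes for a broadband noise floor of order 1/N, which is exactly the noise-versus-artefact balance the project investigates.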
Examples of what you could be working on:

a. Calibration/characterisation of the current projector and compensation of systematic errors.

b. To realize a phased array of randomly placed light sources, the pixel matrix of the projector must be 'relayed' onto a mask with apertures at random but precisely known positions. Determine the best possible relaying optics and design an optimized mask accordingly. Factors like deformation of the projected pixel matrix and limitations in the resolving power of the lens system must be taken into account in the mask design.

Important note: much of this work is to be performed in the laboratory. Because of the corona pandemic it is not certain whether it will be possible to be physically present enough of the time for this project. Please contact me to discuss the possibilities.

''Contact: [mailto:martinfr@nikhef.nl Martin Fransen]''
=== Theory: The Effective Field Theory Pathway to New Physics at the LHC ===

A promising framework to parametrise, in a robust and model-independent way, deviations from the Standard Model (SM) induced by new heavy particles is the Standard Model Effective Field Theory (SMEFT). In this formalism, beyond-the-SM effects are encapsulated in higher-dimensional operators constructed from SM fields and respecting their symmetry properties. In this project we aim to carry out a global analysis of the SMEFT using high-precision LHC data, including Higgs boson production, flavour observables, and low-energy measurements. The analysis will be carried out in the context of the recently developed SMEFiT approach [1], which is based on Machine Learning techniques to efficiently explore the complex theory parameter space. The ultimate goal is either to uncover glimpses of new particles or interactions at the LHC, or to derive the most stringent model-independent bounds to date on general theories of New Physics. Of particular interest are novel methods for charting the parameter space [2], the matching to UV-complete theories in explicit BSM scenarios [3], and the interplay between EFT-based model-independent searches for new physics and determinations of the proton structure from LHC data [4].

[1] https://arxiv.org/abs/1901.05965
[2] https://arxiv.org/abs/1906.05296
[3] https://arxiv.org/abs/1908.05588
[4] https://arxiv.org/abs/1905.05215

''Contact: [mailto:j.rojo@vu.nl Juan Rojo]''
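How EFT bounds arise can be sketched with a one-parameter toy (invented numbers, not SMEFiT): an observable depends linearly and quadratically on a Wilson coefficient c, and a chi-square scan against a pseudo-measurement yields the allowed interval.

```python
import numpy as np

# Toy EFT prediction for one observable: sigma(c) = SM * (1 + a*c + b*c^2),
# with made-up interference (a) and quadratic (b) coefficients.
sm, a, b = 1.00, 0.10, 0.005

def prediction(c):
    return sm * (1 + a * c + b * c**2)

measurement, uncertainty = 1.02, 0.05  # pseudo-data with 5% precision

c_grid = np.linspace(-10, 10, 2001)
chi2 = ((prediction(c_grid) - measurement) / uncertainty) ** 2

best = c_grid[np.argmin(chi2)]
# 95% CL interval for one parameter: delta chi2 < 3.84
allowed = c_grid[chi2 - chi2.min() < 3.84]
print(f"best fit c = {best:.2f}, 95% CL in [{allowed.min():.2f}, {allowed.max():.2f}]")
```

The real global fit does this simultaneously for dozens of operators and hundreds of observables with correlated uncertainties, which is why efficient sampling of the parameter space becomes the central technical challenge.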
=== Theory: Charting the quark and gluon structure of protons and nuclei with Machine Learning ===

Deepening our knowledge of the partonic content of nucleons and nuclei [1] is a central endeavour of modern high-energy and nuclear physics, with ramifications in related disciplines such as astroparticle physics. Two main scientific drivers motivate these investigations of the partonic structure of hadrons. On the one hand, they address fundamental open issues in our understanding of the strong interaction, such as the origin of the nucleon mass, spin, and transverse structure; the presence of heavy quarks in the nucleon wave function; and the possible onset of novel gluon-dominated dynamical regimes. On the other hand, pinning down the substructure of nucleons and nuclei with the highest possible precision is a central ingredient of theoretical predictions for a wide range of experiments, from proton and heavy-ion collisions at the Large Hadron Collider to ultra-high energy neutrino interactions at neutrino telescopes. The goal of this project is to exploit Machine Learning and Artificial Intelligence tools [2,3] (neural networks trained by stochastic gradient descent) to pin down the quark and gluon substructure of protons and nuclei using recent measurements from proton-proton and proton-lead collisions at the LHC. Topics of special interest are i) the strange content of protons and nuclei, ii) parton distributions at higher orders in the QCD coupling for precision Higgs physics, iii) the interplay between jet, photon, and top quark production data in pinning down the large-x gluon, and iv) charm quarks as a probe of gluon shadowing at small x. The project also involves developing studies for the Electron-Ion Collider (EIC), a new lepton-nucleus experiment that will start operation in the coming years.

[1] https://arxiv.org/abs/1910.03408
[2] https://arxiv.org/abs/1904.00018
[3] https://arxiv.org/abs/1706.00428

''Contact: [mailto:j.rojo@vu.nl Juan Rojo]''
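The core methodology, a neural network trained by stochastic gradient descent to parametrise an unknown x-dependent function, can be sketched in plain NumPy (a toy: the target shape and all hyperparameters are invented, and this is unrelated to the actual fitting codes):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "data": a PDF-like shape x^a (1-x)^b with noise (made up, for illustration)
x = rng.uniform(0.01, 0.99, 256)
y = x**0.5 * (1 - x) ** 3 + rng.normal(0, 0.005, x.size)

# One hidden layer of 16 tanh units, trained by mini-batch SGD
W1 = rng.normal(0, 1, (1, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 1, (16, 1)); b2 = np.zeros(1)
lr = 0.05

for step in range(5000):
    idx = rng.choice(x.size, 32)             # random mini-batch
    xb, yb = x[idx, None], y[idx, None]
    h = np.tanh(xb @ W1 + b1)                # forward pass
    pred = h @ W2 + b2
    err = pred - yb                          # backpropagate squared error
    gW2 = h.T @ err / len(idx); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h**2)
    gW1 = xb.T @ dh / len(idx); gb1 = dh.mean(0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

mse = np.mean((np.tanh(x[:, None] @ W1 + b1) @ W2 + b2 - y[:, None]) ** 2)
print(f"final mse: {mse:.5f}")
```

In a real PDF fit the network output is additionally constrained by QCD evolution and sum rules, and uncertainties are propagated, e.g. by refitting Monte Carlo replicas of the data.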
=== Theory: Machine learning for Electron Microscopy for next-generation materials ===

Machine Learning tools developed and applied in particle physics hold great potential for applications in materials science, in particular concerning faithful uncertainty estimation and model training over large parameter spaces. In this project, carried out in collaboration with the group of Dr. Sonia Conesa-Boj at the Kavli Institute of Nanoscience Delft, http://www.conesabojlab.tudelft.nl, we will develop and deploy ML tools for data analysis in Electron Microscopy. We will focus on pinning down the properties of novel quantum materials such as topological insulators and van der Waals materials. Examples of possible applications include model-independent background subtraction in electron energy-loss spectroscopy, automatic classification of crystalline structures, and enhancing spatial and spectral resolution using convolutional networks.

''Contact: [mailto:j.rojo@vu.nl Juan Rojo]''
=== Theory: The electroweak phase transition and baryogenesis/gravitational-wave production ===

In extensions of the Standard Model the electroweak phase transition can be first order and proceed via the nucleation of bubbles. Colliding bubbles can produce gravitational waves [1], and plasma particles interacting with the bubbles can generate a matter-antimatter asymmetry [2]. A detailed understanding of the dynamics of the phase transition is needed to describe these processes accurately. One project is to study QFT at finite temperature and to compare/apply methods that address the non-perturbative IR dynamics of the thermal processes [3,4]. Another project is to calculate the velocity with which the bubbles expand, an important parameter for gravitational-wave production and baryogenesis. This involves, among other things, tunneling dynamics, (thermal) scattering rates and Boltzmann equations [5].

[1] https://arxiv.org/abs/1705.01783
[2] https://arxiv.org/pdf/hep-ph/0609145.pdf
[3] https://arxiv.org/pdf/1609.06230.pdf
[4] https://arxiv.org/pdf/1612.00466.pdf
[5] https://arxiv.org/pdf/1809.04907.pdf

''Contact: [mailto:mpostma@nikhef.nl Marieke Postma]''

=== Theory: Cosmology of the QCD axion ===

The QCD axion provides an elegant solution to the strong CP problem of QCD [1]. This project focuses on the cosmological dynamics of this hypothesized axion field, and in particular on the possibility that it can produce both the observed matter-antimatter asymmetry and the dark matter abundance of our universe [2,3].

[1] https://arxiv.org/abs/1812.02669
[2] https://arxiv.org/pdf/hep-ph/0609145.pdf
[3] https://arxiv.org/pdf/1910.02080.pdf

''Contact: [mailto:mpostma@nikhef.nl Marieke Postma]''

=== Theory: Neutrinos, the hierarchy problem and cosmology ===

The electroweak hierarchy problem is absent if the quadratic term in the Higgs potential is generated dynamically. This is achieved in 'the neutrino option' [1], where the Higgs potential stems exclusively from quantum effects of heavy right-handed neutrinos, which can also generate the mass pattern of the observed left-handed neutrinos. The project focuses on model-building aspects (e.g. [2]) and the cosmology (e.g. leptogenesis [3]) of these set-ups.

[1] https://arxiv.org/pdf/1703.10924.pdf
[2] https://arxiv.org/pdf/1807.11490.pdf
[3] https://arxiv.org/pdf/1905.12642.pdf

''Contact: [mailto:mpostma@nikhef.nl Marieke Postma]''

=== KM3NeT: Reconstruction of first neutrino interactions in KM3NeT ===

The neutrino telescope KM3NeT is under construction in the Mediterranean Sea, aiming to detect cosmic neutrinos. Its first few strings with sensitive photodetectors have been deployed at both the Italian and the French detector sites. Even these few strings make it possible to reconstruct the abundant muons stemming from interactions of cosmic rays with the atmosphere and to identify neutrino interactions. In this project the available data will be used together with simulations to optimally reconstruct the event topologies and to identify and reconstruct the first neutrino interactions in the KM3NeT detector, paving the path towards accurate neutrino oscillation measurements and neutrino astronomy.

Programming skills are essential; mostly ROOT and C++ will be used.

''Contact: [mailto:bruijn@nikhef.nl Ronald Bruijn] and [mailto:dosamt@nikhef.nl Dorothea Samtleben]''

=== KM3NeT: Searching for New Heavy Neutrinos ===

In this project we will search for a new heavy neutrino, looking at signatures created by atmospheric neutrinos interacting in the detector volume of KM3NeT-ORCA. The aim is to study a specific event topology which appears as two separate blobs of signals detected by the densely instrumented ORCA detector units. We will exploit the tau reconstruction algorithms to verify ORCA's ability to detect such signals and to estimate the potential sensitivity of the experiment. Basic knowledge of elementary particle physics and of data analysis techniques will be advantageous. Knowledge of programming languages, e.g. Python (and possibly C++), and of ROOT is advantageous but not mandatory.

''Contact: [mailto:suzanbp@nikhef.nl Suzan B. du Pree] and [mailto:dveijk@nikhef.nl Daan van Eijk]''

=== KM3NeT: Dark Matter with KM3NeT-ORCA ===

Dark Matter is thought to be everywhere (we should be swimming through it), but we have no idea what it is. Using the good energy and angular resolutions of the KM3NeT neutrino telescope, we can search for Dark Matter signatures that originate from the center of our galaxy. In this project, we will search for such signatures using the reconstructed track and shower events of the KM3NeT-ORCA detector, aiming to discover relatively light Dark Matter particles. Since this year, the KM3NeT-ORCA experiment has six detection lines under the Mediterranean Sea, fully operational and continuously taking data. Using the available data, it is possible to compare data and simulation for different event topologies and to estimate the experiment's sensitivity. The project is suitable for a student who is interested in exploring new physics scenarios and willing to develop new skills. Basic knowledge of elementary particle physics and of data analysis techniques will be advantageous. Knowledge of programming languages, e.g. Python (and possibly C++), and of ROOT is advantageous but not mandatory.

''Contact: [mailto:suzanbp@nikhef.nl Suzan B. du Pree] and [mailto:dveijk@nikhef.nl Daan van Eijk]''

=== Gravitational Waves: Unraveling the structure of neutron stars with gravitational wave observations ===

Neutron stars were first discovered more than half a century ago, yet their detailed internal structure largely remains a mystery. A range of theoretical models has been put forward for the neutron star "equation of state", but until recently there was no real way to test them. The direct detection of gravitational waves with LIGO and Virgo has the potential to remedy the situation. When two neutron stars spiral towards each other, they get tidally deformed in a way that is determined by the equation of state, and these deformations get imprinted upon the shape of the gravitational wave that is emitted. After the first gravitational wave observation of such an event in 2017, several equation of state models could already be ruled out. With expected upgrades of the detectors, we will at some point have access not only to the "inspiral" of binary neutron stars, but also to the merger itself and what happens afterwards. The project consists of using results from large-scale numerical simulations to construct a heuristic model for the waveform that describes the inspiral-merger-postmerger process with sufficient accuracy given expected detector sensitivities, and of developing data analysis techniques to efficiently use this model to extract information about the neutron star equation of state.

''Contact: [mailto:vdbroeck@nikhef.nl Chris Van Den Broeck]''

=== Gravitational Waves: Searches for gravitational waves from compact binary coalescence ===
Searches for gravitational waves from the mergers of black holes and neutron stars have been extraordinarily successful over the last four years. We are now beginning to study a population of heavy stellar-mass black holes in detail, including how these systems came to form and whether they are consistent with general relativity. Additionally, the detection of binary neutron star mergers is allowing us to probe matter in its most extreme states. However, we have only just scratched the surface of possible signals and the new physics they would allow us to study. The detection of highly spinning and precessing systems would allow us to perform black hole population statistics to an extraordinary degree of accuracy. The detection of sub-solar-mass systems would provide evidence of dark matter. However, these searches are difficult because they require us to work in high-dimensional spaces and to develop new statistical methods. There are possibilities for several projects that involve the development and implementation of these new searches as well as the interpretation of the results, particularly in terms of the physics describing compact binary mergers.
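The workhorse of these searches, matched filtering, can be sketched in a toy form: correlate the data against a known template and look for a peak. Everything below is invented (white noise, a crude single "chirp", no power-spectral-density weighting as real pipelines use).

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data: white Gaussian noise with a known chirp-like template injected.
n = 4096
t = np.arange(512) / 512.0
template = np.sin(2 * np.pi * (20 * t + 40 * t**2)) * np.hanning(512)  # crude "chirp"
template /= np.linalg.norm(template)   # unit-norm template

noise = rng.standard_normal(n)
inj_at = 1500
data = noise.copy()
data[inj_at:inj_at + 512] += 8.0 * template   # injection with amplitude SNR ~ 8

# Matched filter for white noise: slide the template over the data
snr = np.correlate(data, template, mode="valid")
peak = int(np.argmax(np.abs(snr)))
print(f"peak |SNR| {np.abs(snr).max():.1f} at sample {peak} (injected at {inj_at})")
```

A real search repeats this over a bank of templates covering the (high-dimensional) space of masses and spins, which is exactly where the computational and statistical challenges mentioned above arise.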

''Contact: [mailto:physarah@gmail.com Sarah Caudill]''

=== Gravitational Waves: Computer modelling to design the laser interferometers for the Einstein Telescope ===

A new field of instrument science led to the successful detection of gravitational waves by the LIGO detectors in 2015. We are now preparing the next generation of gravitational wave observatories, such as the Einstein Telescope, with the aim of increasing the detector sensitivity by a factor of ten. This would allow us, for example, to detect stellar-mass black holes from early in the universe, when the first stars began to form. This ambitious goal requires us to find ways to significantly improve the best laser interferometers in the world.

Gravitational wave detectors, such as LIGO and Virgo, are complex Michelson-type interferometers enhanced with optical cavities. We develop and use numerical models to study these laser interferometers, to invent new optical techniques and to quantify their performance. For example, we synthesize virtual mirror surfaces to study the effects of higher-order optical modes in the interferometers, and we use opto-mechanical models to test schemes for suppressing quantum fluctuations of the light field. We can offer several projects based on numerical modelling of laser interferometers. All projects will be directly linked to the ongoing design of the Einstein Telescope.
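As a small taste of such modelling, here is a minimal plane-wave sketch of the circulating power in a two-mirror Fabry-Perot cavity as a function of detuning. The mirror parameters are invented; real design studies use dedicated simulation tools with full higher-order-mode and opto-mechanical physics.

```python
import numpy as np

# Steady-state response of a two-mirror Fabry-Perot cavity (plane-wave model).
r1, t1 = np.sqrt(0.986), np.sqrt(0.014)   # input mirror (lossless: r^2 + t^2 = 1)
r2 = np.sqrt(0.99996)                     # end mirror, highly reflective

phi = np.linspace(-np.pi / 50, np.pi / 50, 100001)  # one-way detuning phase
# Circulating field gain: E_circ / E_in = t1 / (1 - r1 r2 exp(-2 i phi))
g_circ = t1 / (1 - r1 * r2 * np.exp(-2j * phi))
power_gain = np.abs(g_circ) ** 2

print(f"max circulating power gain: {power_gain.max():.0f} "
      f"at phi = {phi[np.argmax(power_gain)]:.2e} rad")
```

Even this toy shows the key design trade-off: the circulating power builds up resonantly (here by a factor of a few hundred) only within an extremely narrow detuning range, which is why cavity control is so demanding.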

''Contact: [mailto:a.freise@nikhef.nl Andreas Freise]''

=== Gravitational Waves: Digging away the noise to find the signal ===

Gravitational wave interferometers are extremely sensitive, but suffer from instrumental issues that produce noise which mimics astrophysical signals. This needs to be resolved as much as possible before the data analysis. The problem is that instrumentalists don't know about analysis pipelines, and data analysts don't know about experimental details. We need your help to bridge the gap. This is a good opportunity to learn about both sides and to contribute directly to a booming international field. We have several tools and new ideas for correlating noises with the state of the instrument. These need to be developed further, used on years of data, and written up. The project will require Python, signal processing and statistics.
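A minimal sketch of the kind of correlation study meant here, on invented data: a toy "seismometer" channel and a glitch rate derived from it. Real studies would use the actual interferometer strain and auxiliary channels.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy auxiliary channel: a slowly wandering "ground motion" time series
n = 2000
seismometer = rng.standard_normal(n).cumsum()
seismometer -= seismometer.mean()

# Toy noise witness: glitches occur mainly when ground motion is high
glitch_rate = 0.05 * np.clip(seismometer, 0, None) + 0.1 * rng.standard_normal(n)

# Rank the auxiliary channel by its Pearson correlation with the glitch rate
rho = np.corrcoef(seismometer, glitch_rate)[0, 1]
print(f"correlation seismometer vs glitch rate: {rho:.2f}")
```

In practice one computes such statistics for thousands of auxiliary channels over long stretches of data, and the highest-ranked channels point to the instrumental origin of the noise.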

''Contact: [mailto:swinkels@nikhef.nl Bas Swinkels] and [mailto:physarah@gmail.com Sarah Caudill]''

=== Gravitational Waves: Machine Learning techniques for GW Interferometers ===

This project concerns the control of suspended optical cavities in the non-linear regime. Gravitational wave interferometers are extremely sensitive, but suffer from a very small control range, causing lock losses that reduce the robustness of these instruments. In this project we will use a table-top replica of a suspended optical cavity, located in the new R&D laser lab at Nikhef, to develop a neural network that reconstructs the positions of a free-falling mirror from beam images. A database with simulated beam images can be used to train various neural networks before deployment in the table-top experiment. We are looking for a hands-on and enthusiastic master student, interested in machine learning and experienced in programming languages like Python.

''Contact: [mailto:r.walet@nikhef.nl Rob Walet] and [mailto:f.l.linde@gmail.com Frank Linde]''
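As a toy stand-in for this pipeline, the sketch below regresses the spot position from simulated Gaussian beam images with a simple linear least-squares readout; a neural network would replace the linear model, and all image parameters here are invented.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy "simulated beam image database": Gaussian spots whose horizontal centre
# encodes the mirror position (purely illustrative numbers).
npix = 16
yy, xx = np.mgrid[0:npix, 0:npix]

def beam_image(pos):
    """Gaussian beam spot centred at horizontal position `pos` (pixels)."""
    img = np.exp(-((xx - pos) ** 2 + (yy - npix / 2) ** 2) / (2 * 2.0**2))
    return img + 0.01 * rng.standard_normal(img.shape)   # sensor noise

positions = rng.uniform(4, 12, 500)
images = np.array([beam_image(p).ravel() for p in positions])

# Fit a linear model pixels -> position by least squares, with a train/test split
X = np.hstack([images, np.ones((len(images), 1))])       # add bias column
w, *_ = np.linalg.lstsq(X[:400], positions[:400], rcond=None)
pred = X[400:] @ w
rmse = np.sqrt(np.mean((pred - positions[400:]) ** 2))
print(f"test RMSE: {rmse:.3f} pixels")
```

The appeal of a neural network over this linear baseline is that it can stay accurate when the beam shape itself distorts non-linearly with mirror motion, which is the regime relevant for lock acquisition.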
=== VU LaserLaB: Measuring the electric dipole moment (EDM) of the electron ===

In collaboration with Nikhef and the Van Swinderen Institute for Particle Physics and Gravity at the University of Groningen, we have recently started an exciting project to measure the electric dipole moment (EDM) of the electron in cold beams of barium fluoride molecules. The eEDM, which is predicted by the Standard Model of particle physics to be extremely small, is a powerful probe to explore physics beyond this Standard Model. Many extensions of the Standard Model, most prominently supersymmetry, naturally predict an electron EDM just below the current experimental limits. We aim to improve on the best current measurement by at least an order of magnitude. To do so we will perform a precision measurement on a slow beam of laser-cooled BaF molecules. With this low-energy precision experiment, we test physics at energies comparable to those of the LHC!

At LaserLaB VU, we are responsible for building and testing a cryogenic source of BaF molecules. The main parts of this source are currently being constructed in the workshop. We are looking for enthusiastic master students to help set up the laser system that will be used to detect BaF. Furthermore, projects are available to perform trajectory simulations for the design of a lens system that guides the BaF molecules from the exit of the cryogenic source to the experiment.

''Contact: [mailto:H.L.Bethlem@vu.nl Rick Bethlem]''
=== VU LaserLaB: Physics beyond the Standard Model from molecules ===

Our team, with a number of staff members (Ubachs, Eikema, Salumbides, Bethlem, Koelemeij), focuses on precision measurements in the hydrogen molecule and its isotopologues. The work aims at testing the QED calculations of energy levels in H2, D2, T2, HD, etc. against the most precise measurements, where all kinds of experimental laser techniques play a role (cw and pulsed lasers, atomic clocks, frequency combs, molecular beams). A further target of study is the connection to the "proton size puzzle", which may be solved through studies of the hydrogen molecular isotopologues.

In the past half year we have produced a number of important results that are described in the following papers:
* ''Frequency comb (Ramsey-type) electronic excitations in the H2 molecule'': see "Deep-ultraviolet frequency metrology of H2 for tests of molecular quantum theory", http://www.nat.vu.nl/~wimu/Publications/Altmann-PRL-2018.pdf
* ''Precision measurement of an infrared transition in the HD molecule'': see "Sub-Doppler frequency metrology in HD for tests of fundamental physics", https://arxiv.org/abs/1712.08438
* ''The first precision study in molecular tritium T2'': see "Relativistic and QED effects in the fundamental vibration of T2", http://arxiv.org/abs/1803.03161
* ''Dissociation energy of the hydrogen molecule at 10^-9 accuracy'': paper submitted to Phys. Rev. Lett.
* ''Probing QED and fundamental constants through laser spectroscopy of vibrational transitions in HD+'': this is also a study of the hydrogen molecular ion HD+, where important results were obtained not long ago and where we have a strong activity, http://www.nat.vu.nl/~wimu/Publications/ncomms10385.pdf

These five results mark the various directions we are pursuing, and in all directions we aim at obtaining improvements. Specific projects with students can be defined; these are mostly experimental, although there might be some theoretical tasks, like performing calculations of hyperfine structures.

''Contact: [mailto:w.m.g.ubachs@vu.nl Wim Ubachs] [mailto:k.s.e.eikema@vu.nl Kjeld Eikema] [mailto:h.l.bethlem@vu.nl Rick Bethlem]''

[[Last years MSc Projects|Last year's MSc Projects]]

Latest revision as of 09:41, 9 November 2020

Master Thesis Research Projects

The following Master thesis research projects are offered at Nikhef. If you are interested in one of these projects, please contact the coordinator listed with the project.


Projects with September 2020 start

ATLAS: Top Spin optimal observables using Artificial Intelligence

The top quark has an exceptionally high mass, close to the electroweak symmetry breaking scale, and is therefore sensitive to new physics effects. Theoretically, new physics is well described in the EFT framework [1]. The EFT operators are experimentally well accessible in single-top t-channel production, where the top quark is produced spin-polarized. The focus at Nikhef is the operator O_{tW} with a possible imaginary phase, leading to CP violation. Experimentally, many angular distributions are reconstructed in the top rest frame to hunt for these effects. We are looking for a limited set of optimal observables. The objective of your Master project would be to find optimal observables using simulated events, including the detector effects and possible systematic deviations. All techniques are allowed, but promising new developments are methods which involve artificial intelligence. This work could lead to an ATLAS note.

[1] https://arxiv.org/abs/1807.03576

Contact: Marcel Vreeswijk and Jordy Degens
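The link between classifiers and optimal observables can be illustrated with a toy: a classifier trained to separate two hypotheses learns a monotonic function of their likelihood ratio, which is the statistically optimal observable. The two "angular" distributions below are invented Gaussians, not simulated ATLAS events.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy samples in two invented "angles": SM versus a shifted (CP-violating) hypothesis
n = 5000
sm  = rng.normal(0.0, 1.0, (n, 2))
bsm = rng.normal(0.4, 1.0, (n, 2))

X = np.vstack([sm, bsm])
y = np.concatenate([np.zeros(n), np.ones(n)])

# Logistic regression trained by gradient descent; its output is a monotonic
# function of the likelihood ratio between the two hypotheses.
w = np.zeros(2); b = 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    g = p - y                       # gradient of the cross-entropy loss
    w -= 0.1 * (X.T @ g) / len(y)
    b -= 0.1 * g.mean()

score = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # the learned 1D observable
acc = np.mean((score > 0.5) == (y == 1))
print(f"classification accuracy of the learned observable: {acc:.2f}")
```

In the project, the same logic would be applied to simulated single-top events with detector effects, with deep networks replacing this linear toy and systematic variations included in the training.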

ATLAS: The Next Generation

After the observation of the coupling of the Higgs boson to fermions of the third generation, the search for its coupling to fermions of the second generation is one of the next priorities for research at CERN's Large Hadron Collider. The search for the decay of the Higgs boson to two charm quarks is very new [1], and we see various opportunities for interesting developments. For this project we propose improvements in reconstruction (using exclusive decays), advanced analysis techniques (using deep learning methods), and an expanded theory interpretation. Another opportunity would be the development of the first statistical combination of results between the ATLAS and CMS experiments, which could significantly improve the discovery potential.

[1] https://arxiv.org/abs/1802.04329

Contact: Tristan du Pree and Marko Stamenkovic

ATLAS: The Most Energetic Higgs Boson

The production of Higgs bosons at the highest energies could give the first indications of deviations from the standard model of particle physics, but production energies above 500 GeV have not been observed yet [1]. The LHC Run-2 dataset, collected during the last four years, might provide the first opportunity to observe such processes, and we have various ideas for new studies. Possible developments include the improvement of boosted reconstruction techniques, for example using multivariate deep learning methods. There are also various opportunities for unexplored theory interpretations (using the MadGraph event generator), including effective field theory models (with novel 'morphing' techniques) and new interpretations of the newly observed boosted VZ(bb) process.

[1] https://arxiv.org/abs/1709.05543

Contact: Tristan du Pree and Brian Moser

LHCb: Measurement of delta md

The decay B0->D-pi+ is very abundant at LHCb and therefore ideal to study the oscillation frequency delta md, with which B0 mesons oscillate into anti-B0 mesons, and vice versa. This process proceeds through a so-called box diagram, in which new, yet-undiscovered particles might hide. Recently it has been realized that the value of delta md is in tension with the value of the CKM angle gamma, triggering renewed interest in this measurement.
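The measurement rests on the standard time-dependent mixing formalism, sketched here with illustrative world-average-like values for delta md and the B0 lifetime (decay-width difference and CP violation in mixing neglected):

```python
import numpy as np

# Time-dependent mixing probabilities for a B0 meson (illustrative numbers)
dmd = 0.51   # oscillation frequency delta md in ps^-1
tau = 1.52   # B0 lifetime in ps

t = np.linspace(0.0, 10.0, 1000)   # decay time in ps
p_unmixed = 0.5 * np.exp(-t / tau) * (1 + np.cos(dmd * t))
p_mixed   = 0.5 * np.exp(-t / tau) * (1 - np.cos(dmd * t))

# The mixing asymmetry that an analysis fits to extract delta md:
asym = (p_unmixed - p_mixed) / (p_unmixed + p_mixed)   # = cos(dmd * t)
print(f"first zero-crossing of the asymmetry near t = {t[np.argmax(asym < 0)]:.2f} ps")
```

In the real analysis the flavour at production is inferred with flavour-tagging algorithms, and decay-time resolution and tagging dilution are folded into this asymmetry before fitting.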

Contact: Marcel Merk

LHCb: Searching for CPT violation

CPT symmetry is closely linked to Lorentz symmetry, and any violation would revolutionize science. There are possibilities, though, that supergravity could cause CPT-violating effects in the system of neutral mesons. The precise study of B0s oscillations in the abundant Bs->Ds pi decays can give the most stringent limits on Im(z) to date.

Contact: Marcel Merk

LHCb: BR(B0->D-pi+) and fd/fu with B+->D0pi+

The abundant decay B0->D-pi+ is often used as a normalization channel, given its clean signal and well-known branching fraction, as measured at the B-factories. However, this branching fraction can be determined more precisely by comparing to the decay B+->D0pi+, which is known with twice better precision. In addition, the production rates of B0 and B+ mesons are often assumed to be equal, based on isospin symmetry. The combined study of B+->D0pi+ and B0->D-pi+ allows for the first measurement of this ratio, fd/fu.

Contact: Marcel Merk


LHCb: Optimization studies for the Vertex detector at the High-Luminosity LHCb

The LHCb experiment is dedicated to measuring tiny differences between matter and antimatter through the precise study of rare processes involving b or c quarks. The LHCb detector will undergo a major modification in order to dramatically increase the luminosity and be able to measure indirect effects of physics beyond the standard model. In this environment, over 42 simultaneous collisions are expected within the roughly 200 ps time interval during which the two proton bunches overlap. The particles of interest have a relatively long lifetime, and therefore the best way to distinguish them from the background collisions is through the precise reconstruction of displaced vertices and pointing directions. The new detector considers using extremely recent or even future technologies to measure space (with resolutions below 10 um) and time (100 ps or better) to efficiently reconstruct the events of interest for physics. The project involves completely changing the LHCb Vertex Locator (VELO) design in simulation and determining the best achievable performance for the upgraded detector, considering different spatial and temporal resolutions.

Contact: Kazu Akiba

LHCb: Measurement of charge multiplication in heavily irradiated sensors

During the R&D phase for the LHCb VELO Upgrade detector, a few sensor prototypes were irradiated to the extreme fluence expected to be reached during the detector lifetime. These samples were tested using high-energy particles at the SPS facility at CERN, with their trajectories reconstructed by the Timepix3 telescope. A preliminary analysis revealed that at the highest irradiation levels the amount of signal observed is higher than expected, and even larger than the signal obtained at lower doses. At the Device Under Test (DUT) position inside the telescope, the spatial resolution attained by this system is below 2 um. This means that a detailed analysis can be performed to study where and how this signal amplification happens within the 55x55 um^2 pixel cell. This project involves analysing the telescope and DUT data to investigate the charge multiplication mechanism at the microscopic level.

Contact: Kazu Akiba

LHCb: Testing the flavour anomalies at LHCb

Lepton Flavour Universality (LFU) is an intrinsic property of the Standard Model, which implies that the three generations of leptons are subject to the same interactions. This fundamental law of the SM can be investigated by looking at rare B-meson decays with muons or electrons in the final state. Recent measurements of these decays at LHCb show deviations from the SM (known as the flavour anomalies) that, if confirmed, would lead to a major discovery of New Physics (NP). The project consists of the analysis of the 2017-18 dataset, which will double the statistics of the current results. This new dataset will lead to a measurement with better precision, which can either confirm or exclude the contribution of NP to these decays. The project will explore all the crucial aspects of data analysis, from simulation to signal modeling, including cutting-edge software such as fitting large amounts of data using GPUs (Graphics Processing Units).

Contact: Andrea Mauri and Marcel Merk

LHCb: Search for long-lived heavy neutral leptons in B decays

The masses of neutrinos are many orders of magnitude smaller than those of the other fermions. In the seesaw mechanism this puzzling fact is explained by the existence of another set of neutral leptons that are much heavier. If their mass is below about 5 GeV, such neutrinos can be produced at the LHC in decays of B hadrons. Their small coupling leads to a lifetime of the order of picoseconds, which means that they fly an observable distance before they decay. In this project we search for such long-lived heavy neutrinos in decays of charged B mesons using the LHCb Run-2 dataset.
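The quoted "observable distance" follows from a one-line estimate, L = (p/m) c tau. The mass, momentum and lifetime below are illustrative choices, not measured values:

```python
# Order-of-magnitude estimate of the flight distance of a heavy neutral lepton
# produced in a B decay at the LHC (all inputs are illustrative).
c = 0.299792458  # speed of light in mm/ps

m_n   = 3.0      # HNL mass in GeV
p_n   = 20.0     # typical momentum in GeV
tau_n = 1.0      # proper lifetime in ps

beta_gamma = p_n / m_n               # relativistic boost factor p/m
decay_len = beta_gamma * c * tau_n   # mean flight distance in mm
print(f"mean flight distance: {decay_len:.1f} mm")
```

A millimetre-scale flight distance is large compared to the vertex resolution of the VELO, which is what makes a displaced-vertex search feasible.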

Contact: Lera Lukashenko and Wouter Hulsbergen

LHCb: Discovering the Bc->eta_c mu nu decay

The Bc meson, consisting of a heavy c quark and a heavy anti-b quark, is of great interest for flavour physics. A recent LHCb measurement of Bc->J/psi l nu decays [1] showed a possible deviation from the Standard Model prediction, which entered the so-called lepton universality puzzle, the hottest topic in b-physics in recent years. Following that, the study of a similar decay mode, Bc->eta_c mu nu, is strongly requested by the theory community. However, the reconstruction of the eta_c meson is challenging, so this decay has not been discovered yet. The project aims at the discovery of the Bc->eta_c mu nu decay using the unique capabilities of the LHCb experiment. The data analysis will consist of finding the optimal event selection using machine learning techniques, researching background sources, performing fits to data, etc. The project requires that you are not afraid of analysis software and statistics. The results will be presented within the collaboration: talks at working group meetings, an analysis note, etc. Skills in git, Python and ROOT (and similar packages) are extremely welcome.

[1] https://arxiv.org/pdf/1711.05623.pdf

Contact: Andrii Usachov and Marcel Merk

ALICE: Searching for the strongest magnetic field in nature

In a non-central collision between two Pb ions, with a large value of the impact parameter (b), the charged nucleons that do not participate in the interaction (called spectators) create strong magnetic fields. A back-of-the-envelope calculation using the Biot-Savart law brings the magnitude of this field close to 10^19 Gauss, in agreement with state-of-the-art theoretical calculations, making it the strongest magnetic field in nature. The presence of this field could have direct implications for the motion of final-state particles. The magnetic field, however, decays rapidly, at a rate that depends on the electric conductivity of the medium, which is experimentally poorly constrained. The experimental confirmation of the presence of this field, so far lacking, is the main goal of this project.
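The back-of-the-envelope estimate mentioned above can be reproduced with a boosted point-charge approximation. All inputs below are order-of-magnitude assumptions (a single spectator charge cloud treated as a point charge passing at the impact-parameter scale), so the result should only be trusted to within an order of magnitude of the quoted value.

```python
import math

# Crude peak-field estimate for the spectator magnetic field in a Pb-Pb collision
mu0_over_4pi = 1e-7    # T m / A
e = 1.602e-19          # elementary charge in C
c = 2.998e8            # speed of light in m/s
Z = 82                 # charge of a lead nucleus
gamma = 1470           # Lorentz factor per nucleon, Pb-Pb at sqrt(s_NN) = 2.76 TeV
b = 7e-15              # impact-parameter scale, ~7 fm

# Field of a relativistic point charge at closest approach (boosted Biot-Savart)
B_tesla = mu0_over_4pi * Z * e * c * gamma / b**2
B_gauss = B_tesla * 1e4
print(f"B ~ 10^{math.log10(B_gauss):.0f} Gauss")
```

This crude peak-field number lands within an order of magnitude of the 10^19 Gauss quoted above; detailed calculations differ because the charge is distributed, the field is evaluated at the collision centre, and it decays rapidly in time.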

Contact: Panos Christakoglou

ALICE: Looking for parity violating effects in strong interactions

Within the Standard Model, symmetries such as the combination of charge conjugation (C) and parity (P), known as CP symmetry, are considered to be key principles of particle physics. Violation of CP invariance can be accommodated within the Standard Model in the weak and the strong interactions; however, it has only been confirmed experimentally in the former. Theory predicts that in heavy-ion collisions, in the presence of a deconfined state, gluonic fields create domains where the parity symmetry is locally violated. This manifests itself in a charge-dependent asymmetry in the production of particles relative to the reaction plane, called the Chiral Magnetic Effect (CME). The first experimental results from STAR (RHIC) and ALICE (LHC) are consistent with the expectations from the CME; however, further studies are needed to constrain background effects. These highly anticipated results have the potential to reveal exciting new physics.

Contact: Panos Christakoglou

ALICE: Machine learning techniques as a tool to study the production of heavy flavour particles

Recently there was a shift in the field of heavy-ion physics, triggered by experimental results obtained in collisions between small systems (e.g. protons on protons). These results resemble the ones obtained in collisions between heavy ions, which raises the question of whether we create the smallest QGP droplet in collisions between small systems. The main objective of this project is to study the production of charm particles, such as D mesons and Λc baryons, in pp collisions at the LHC. This will be done with the help of a new and innovative technique based on machine learning (ML). The student will also extend the studies to investigate how this production rate depends on the event activity, e.g. on how many particles are created in every collision.

Contact: Panos Christakoglou and Alessandro Grelli

ALICE: Energy Loss of Energetic Quarks and Gluons in the Quark-Gluon Plasma

One of the ways to study the quark-gluon plasma that is formed in high-energy nuclear collisions is by using high-energy partons (quarks or gluons) that are produced early in the collision and interact with the quark-gluon plasma as they propagate through it. There are several current open questions related to this topic which can be explored in a Master's project. For example, we would like to use the new Monte Carlo generator framework JetScape to simulate collisions and see whether we can extract information about the interaction with the quark-gluon plasma. In the project you will collaborate with one of the PhD students or postdocs in our group to use the model to generate predictions of measurements and compare those to data analysis results. Depending on your interests, the project can focus more on the modeling aspects or on the analysis of experimental data from the ALICE detector at the LHC.

Contact: Marco van Leeuwen and Marta Verweij

ALICE: Extremely Rare Probes of the Quark-Gluon Plasma

The quark-gluon plasma is formed in high-energy nuclear collisions and also existed shortly after the big bang. With the large amount of data collected in recent years at the Large Hadron Collider at CERN, rare processes that were previously not accessible now provide new ways to study how the quark-gluon plasma emerges from the fundamental theory of the strong interaction. One such process is the production of the heavy W boson, which in many cases decays to two quarks. The W boson itself doesn't interact with the quark-gluon plasma because it doesn't carry color, but the quark decay products do interact with the plasma and therefore provide an ideal tool to study the space-time evolution of this hot and dense medium. In this project you will use data from the ALICE detector at the LHC and simulated data from event generators to study various physics mechanisms that could be at play in the real collisions.

Contact: Marta Verweij and Marco van Leeuwen

ALICE: Jet Quenching with Machine Learning

Machine learning applications are steadily rising as a vital tool in the field of data science but are relatively new in the particle physics community. In this project machine learning tools will be used to gain insights into the modification of a parton shower in the quark-gluon plasma (QGP). The QGP is created in high-energy nuclear collisions and only lives for a very short period of time. Highly energetic partons created in the same collisions interact with the plasma while they traverse it and are observed as collimated sprays of particles, known as jets, in the detector. One of the key recent insights is that the internal structure of jets provides information about the evolution of the QGP. With data recorded by the ALICE experiment, you will use jet substructure techniques in combination with machine learning algorithms to dissect the structure of the QGP. Machine learning will be used to select the regions of the radiation phase space that are affected by the presence of the QGP.

Contact: Marta Verweij and Marco van Leeuwen

[edit] Lepton Collider: Pixel TPC testbeam

In the Lepton Collider group at Nikhef we work on a tracking detector for a future Collider (e.g. the ILC in Japan). We are developing a gaseous Time Projection Chamber with a pixel readout. At Nikhef we have built an 8-quad GridPix module based on the Timepix3 chip, which is a detector of about 20 cm x 40 cm x 10 cm in size. In August 2020 we will test the device at the DESY particle accelerator in Hamburg. For the project you could work on preparations for the test beam (e.g. running the data acquisition, perform data monitoring using our set up in the lab). The next topics will be the participation in the data taking during the test beam at DESY, the analysis of the data using C++ and ROOT and - finally - publication of the results in a scientific journal.

Our latest paper can be found at https://www.nikhef.nl/~s01/quad_paper.pdf.

Contact: Peter Kluit and Kees Ligtenberg

Dark Matter: Sensitive tests of wavelength-shifting properties of materials for dark matter detectors

Rare event search experiments that look for neutrino and dark matter interactions are performed with highly sensitive detector systems, often relying on scintillators, especially liquid noble gases, to detect particle interactions. Detectors consist of structural materials that are assumed to be optically passive, and light detection systems that use reflectors, light detectors, and sometimes, wavelength-shifting materials. MSc theses are available related to measuring the efficiency of light detection systems that might be used in future detectors. Furthermore, measurements to ensure that presumably passive materials do not fluoresce, at the low level relevant to the detectors, can be done. Part of the thesis work can include Monte Carlo simulations and data analysis for current and upcoming dark matter detectors, to study the effect of different levels of desired and nuisance wavelength shifting. In this project, students will acquire skills in photon detection, wavelength shifting technologies, vacuum systems, UV and extreme-UV optics, detector design, and optionally in C++ programming, data analysis, and Monte Carlo techniques.

Contact: Tina Pollmann and Patrick Decowski

Dark Matter: Signal reconstruction in XENONnT

The next-generation direct detection dark matter experiment - XENONnT - comprises close to 500 photomultiplier tubes (PMTs) in the main detector volume. These PMTs are configured to detect even single photons. When a single photoelectron (PE) is detected, the recorded signal (a pulse) is the true signal convolved with the detector response of the PMT. Due to this detector response the pulse shape of a single PE is spread out in time. For XENONnT we would like to explore the possibility of implementing a digital (software) filter to deconvolve the detected pulse back to the “true” instantaneous shape (without the detector spread). This is a virtually unexplored new step in the XENON analysis framework. Later in the analysis chain the pulses from all the PMTs are combined into a signal referred to as a ‘peak’. For XENONnT it is essential to discriminate extremely well between the two types of peaks caused by interactions in the detector: a prompt primary scintillation signal (S1) and a secondary ionization signal (S2). The relevant software parameters have not - as of the time of writing - been optimized for the XENONnT detector conditions. The student would investigate how a deconvolution filter would benefit the XENONnT analysis framework and develop such a filter. Furthermore, the student will work on the classification of these signals to fully exploit the XENONnT detector. This will be done with simulated data at first but may later be performed on actual XENONnT data. As an extension, the possibility of applying machine learning to correctly distinguish between the two signals could be explored. This is a data-analysis oriented project where Python skills are paramount.
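
To illustrate the kind of filter meant here, a minimal sketch of Wiener deconvolution, assuming the single-PE response is known. The pulse shape, sampling, and noise level below are invented for illustration and are not the XENONnT ones.

```python
# Minimal Wiener-deconvolution sketch for a digitized PMT pulse, assuming a
# known single-PE response. All shapes and parameters are illustrative.
import numpy as np

def wiener_deconvolve(pulse, response, noise_power=1e-3):
    """Estimate the instantaneous signal from pulse = signal (*) response + noise."""
    n = len(pulse)
    H = np.fft.rfft(response, n)
    # Wiener filter: H* / (|H|^2 + noise-to-signal ratio)
    G = np.conj(H) / (np.abs(H) ** 2 + noise_power)
    return np.fft.irfft(np.fft.rfft(pulse) * G, n)

# Toy example: a delta-like true signal smeared by an exponential PMT response.
t = np.arange(200)
response = np.exp(-t / 10.0)
response /= response.sum()
true_signal = np.zeros(200)
true_signal[50] = 1.0
pulse = np.convolve(true_signal, response)[:200]
recovered = wiener_deconvolve(pulse, response)
print("recovered peak at sample", recovered.argmax())
```

The `noise_power` term regularises the filter; tuning it against simulated noise is exactly the kind of optimisation the project would involve.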

Contact: Patrick Decowski and Joran Angevaare

Dark Matter: XAMS R&D Setup

The Amsterdam Dark Matter group operates an R&D xenon detector at Nikhef. The detector is a dual-phase xenon time-projection chamber containing about 4 kg of ultra-pure liquid xenon. We use this detector to develop new detection techniques - such as utilizing our newly installed silicon photomultipliers - and to improve our understanding of the response of liquid xenon to various forms of radiation. The results could be used directly in the XENONnT experiment, the world’s most sensitive direct detection dark matter experiment at the Gran Sasso underground laboratory, or in future dark matter experiments like DARWIN. We have several interesting projects for this facility. We are looking for someone who is interested in working in a laboratory on high-tech equipment, modifying the detector, and taking and analyzing the data themselves. You will "own" this experiment.

Contact: Patrick Decowski and Auke Colijn

Dark Matter: DARWIN Sensitivity Studies

DARWIN is the "ultimate" direct detection dark matter experiment, with the goal of reaching the so-called "neutrino floor", where neutrinos become a hard-to-reduce background. The large and exquisitely clean xenon mass will also make DARWIN sensitive to other physics signals such as solar neutrinos, double-beta decay of Xe-136, axions and axion-like particles. While the experiment will only start in 2025, we are in the midst of optimizing its design, which is driven by simulations. We have an opening for a student to work on the GEANT4 Monte Carlo simulations for DARWIN, as part of a simulation team together with the Universities of Freiburg and Zurich. We are also working on a "fast simulation" that could be included in this framework. This is your opportunity to steer the optimization of a large and unique experiment. The project requires good programming skills (Python and C++) and data analysis/physics interpretation skills.

Contact: Patrick Decowski and Auke Colijn

Dark Matter: Fast simulation studies

For Dark Matter experiments it is crucial to understand sources of background in great detail. The most common way to study the effect of backgrounds on the Dark Matter sensitivity is through Monte Carlo simulations. Unfortunately, standard Monte Carlo techniques are extremely inefficient: one sometimes needs to simulate millions of events before a single background event appears in the Dark Matter search region. We have developed a Monte Carlo technique that accelerates this process by up to a factor of 1000. The method has been validated on very simple and unrealistic detector models. The goal of this project is to build a realistic detector model for the fast detector simulations. We are looking for a student with good programming skills, an interest in a software project, and the desire to deeply understand the analysis of Dark Matter experimental data.
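
The general principle behind such acceleration can be illustrated with a toy importance-sampling example: rather than waiting for rare events in the plain Monte Carlo, one samples from a biased distribution and reweights. (This toy is not the actual Nikhef technique, just the underlying idea.)

```python
# Importance-sampling toy: estimate the probability that a standard normal
# exceeds a far threshold, with and without biasing. Numbers are illustrative.
import numpy as np

rng = np.random.default_rng(1)
threshold = 4.0          # "signal region" boundary, illustrative
n = 100_000

# Plain MC: almost no samples land beyond the threshold at this n.
plain = rng.normal(size=n)
p_plain = np.mean(plain > threshold)

# Biased MC: sample around the threshold, reweight by the density ratio
# (standard normal pdf divided by the shifted-normal proposal pdf).
biased = rng.normal(loc=threshold, size=n)
weights = np.exp(-0.5 * biased**2) / np.exp(-0.5 * (biased - threshold) ** 2)
p_biased = np.mean(weights * (biased > threshold))

print(p_plain, p_biased)   # true value is about 3.2e-5
```

The biased estimate reaches percent-level precision where the plain estimate typically sees only a handful of events.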

Contact: Patrick Decowski and Auke Colijn

Dark Matter & Amsterdam Scientific Instruments: Simulations for Industry

In the Nikhef Dark Matter group we have built up extensive expertise with Monte Carlo simulations of ionizing radiation. Although these simulations aim to estimate background levels in our XENON experiments, the same techniques can be applied to study radiation transport in industrial devices. Amsterdam Scientific Instruments (ASI) is a company at Science Park that develops and sells radiation imaging equipment used, among other applications, in electron microscopy. For this application ASI needs a detailed study of gamma-ray backgrounds to optimize the shielding for their products. The project aims at optimizing a shielding design based on GEANT4 simulations. The results may be implemented in next-generation ASI products. We are looking for a student with preferably strong computing skills and an interest in science-industry collaboration.

Contact: Patrick Decowski and Auke Colijn

The Modulation experiment: Data Analysis

For years there have been controversial claims of potential new physics based on time-varying decay rates of radioactive sources on top of ordinary exponential decay. While some of these claims have been refuted, others have yet to be confirmed or falsified. To this end a dedicated experiment - the Modulation experiment - was designed and has been operational for the past four years. Using four identical and independent setups, the experiment is almost ready for a final analysis to settle these claims. In this project the student will perform this analysis, preferably resulting in a conclusive paper. This requires combining the data of the four setups in close collaboration with a small group from the four institutes involved (Purdue University (USA), Universität Zürich (Switzerland), Centro Brasileiro de Pesquisas Físicas (Brazil) and Nikhef). This project is data-analysis oriented; additionally, lab skills may be required as one of the setups is situated at Nikhef.


Contact: Auke Colijn and Joran Angevaare

Detector R&D: Performance of the ALPIDE monolithic active pixel sensor with radiation damage

The ALICE Inner Tracking System (ITS) 2 is currently being installed at the Large Hadron Collider (LHC) at CERN. This detector makes use of ultra-lightweight monolithic active pixel sensors, the first to use this technology at a particle collider after the STAR experiment at RHIC in Brookhaven. These very thin pixel detectors have a low power consumption, add very little material to the detector, and still have excellent timing and resolution -- a promising technology for future experiments. You will be part of the international ALICE collaboration and investigate the ALICE ALPIDE chip. Although ALICE will not see high levels of radiation at the LHC, it has so far not been tested whether this chip can withstand very high radiation levels; if there is no large degradation in performance, it could be used in experiments like ATLAS as well. You will be part of the Nikhef R&D group, where you will learn about new detector technologies for high energy physics and design a test setup to characterize the ALPIDE chip in a particle beam using the many instruments in the Nikhef R&D labs. You will then test the chip at the Delft or Groningen facilities that provide a particle beam.

Contact: Jory Sonneveld

Detector R&D: Time resolution of a high voltage monolithic active pixel sensor

For the first time, CMOS monolithic active pixel sensors (MAPS), in which chip and sensor are integrated, are being used in an experiment at the LHC. Although this is a common technology in industry, it is rather new in the high-rate, high-radiation environments of high energy particle physics. The ALICE experiment is currently installing MAPS to which a moderate bias voltage can be applied. You will work in the international RD50 collaboration, which develops radiation-hard semiconductor devices for very high luminosity colliders, and investigate their MAPS that can be biased to very high voltages to avoid signal degradation after radiation damage. You will be part of the Nikhef R&D group, where you will learn about new detector technologies for high energy physics and design a test setup for a first measurement of the time resolution of the RD50 HV-CMOS chip using the many instruments in the Nikhef R&D labs.

Contact: Jory Sonneveld


Detector R&D: Test beam with a bent ALPIDE monolithic active pixel sensor

The ALICE Inner Tracking System (ITS) 2 is currently being installed at the Large Hadron Collider (LHC) at CERN. This detector makes use of ultra-lightweight monolithic active pixel sensors, the first to use this technology at a particle collider after the STAR experiment at RHIC in Brookhaven. These very thin pixel detectors have a low power consumption, add very little material to the detector, and still have excellent timing and resolution -- a promising technology for future experiments. For the next long shutdown in 2025, a version of the ALPIDE chip with an even smaller feature size will be used, installed by bending larger surfaces of sensor around the beam pipe. Recent test beams at DESY in Hamburg show that this yields good results. You will be part of the Nikhef R&D group, where you will learn about new detector technologies for high energy physics and design a test setup to characterize the ALPIDE chip using the many instruments in the Nikhef R&D labs. You will work within an international collaboration where you will learn to analyze test beam data. If the travel situation allows, you will have the opportunity to join the ALICE test beam group at DESY in Hamburg to take part in the exciting experience of taking real data.

Contact: Jory Sonneveld

Detector R&D: Simulating the performance of the ATLAS pixel detector after years of radiation

The innermost detector of the ATLAS experiment at the Large Hadron Collider (LHC), closest to the beam pipe, is the ATLAS pixel detector. The pixel sensors in this region receive the highest radiation doses and their performance suffers accordingly. To better understand the effects of radiation damage and to predict future performance, the sensors are modeled with technology computer-aided design (TCAD) programs; the resulting electric-field maps serve as input to tools such as AllPix2 that model observables affecting the signal quality, such as the charge collection efficiency. In this project, you will learn to use TCAD, a tool widely used in the semiconductor industry, to model electric field maps of the sensor, and estimate the uncertainties by comparing the predictions of different models. You will compare your simulations to real data from the ATLAS experiment as well as to test beam data. You will work in an international environment within the ATLAS collaboration and be part of the Nikhef detector R&D group, where you will learn about the newest detector technologies for high energy physics and beyond. Your improved predictions for the performance of the next ATLAS pixel detector will help ATLAS better prepare for future LHC data taking after the installation of this detector in 2025.

Contact: Jory Sonneveld

Detector R&D: Laser Interferometer Space Antenna (LISA)

The space-based gravitational wave antenna LISA is without a doubt one of the most challenging space missions ever proposed. Around 2030, ESA plans to launch three spacecraft, separated by a few million kilometers, to measure tiny variations in the distances between test masses located in each satellite, in order to detect gravitational waves from sources such as supermassive black holes. The triangular constellation of the LISA mission is dynamic, requiring constant fine-tuning of the pointing of the laser links between the spacecraft and simultaneous refocusing of the telescope. The noise sources related to the laser links are expected to provide a dominant contribution to the LISA performance. An update and extension of the LISA science simulation software is needed to assess the hardware development for LISA at Nikhef, TNO, and SRON. A position is therefore available for a master student to study the impact of instrumental noise on the performance of LISA. Realistic simulations based on hardware (noise) characterization measurements performed at TNO will be carried out and compared to the expected tantalizing gravitational wave sources.

Contact: Niels van Bakel, Ernst-Jan Buis

Detector R&D: Spectral X-ray imaging - Looking at colours the eyes can't see

When a conventional X-ray image is taken, one acquires an image that only shows intensities: a ‘black and white’ image. Most of the information carried by the photon energy is lost. The lack of spectral information can result in an ambiguity between the material composition and the amount of material in the sample. If the X-ray intensity can be measured as a function of energy (i.e. a ‘colour’ X-ray image), more information can be obtained from a sample. This translates into a lower required dose and/or a better understanding of the sample under investigation. Two fields that can benefit from spectral X-ray imaging are, for example, mammography and real-time CT.

Detectors using Medipix3 chips are used for X-ray imaging. Such a detector is composed of a pixel chip with a semiconductor sensor bonded on top of it. Photoelectric absorption of an X-ray in the sensor releases an amount of charge proportional to the X-ray energy. This charge is registered by a pixel. Depending on the configuration, 1, 2, 4 or 8 detection thresholds can be set in each pixel, defining a corresponding number of energy bins. One of the challenges is to maximise X-ray image quality by minimising the effects caused by dispersion in the sensitivity of the pixels. These effects can partly be compensated by applying a specific measurement method in combination with image post-processing.

You can work on improving both the measurement methods and the post-processing methods; the planned work is flexible, depending on your skillset. The aim is to achieve the best X-ray energy resolution over the entire pixel chip. This in turn improves image quality and therefore X-ray CT reconstruction quality.
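
To give a feel for what compensating pixel dispersion can look like, here is a hypothetical sketch of per-pixel threshold equalisation: estimate each pixel's threshold offset from a calibration scan and subtract a coarse trim value. All numbers (dispersion, scan noise, trim step) are invented and are not Medipix3 specifications.

```python
# Hypothetical threshold-equalisation sketch: measure per-pixel offsets with
# some scan noise, trim them in coarse steps, and check the residual spread.
import numpy as np

rng = np.random.default_rng(2)
n_pix = 256 * 256
true_offsets = rng.normal(0.0, 50.0, n_pix)      # per-pixel dispersion (a.u.)

# "Measure" each offset via a calibration scan, then trim in coarse steps.
measured = true_offsets + rng.normal(0.0, 5.0, n_pix)
trim_step = 10.0                                  # coarse trim granularity
trim = np.round(measured / trim_step) * trim_step
residual = true_offsets - trim

print(f"dispersion before: {true_offsets.std():.1f}  "
      f"after: {residual.std():.1f}")
```

The residual spread is set by the scan noise and the trim granularity, which is why both the measurement method and the post-processing matter.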

Important note: Much of this work is to be performed in the laboratory. Because of the corona pandemic it is not certain whether it will be possible to be physically present for enough of the time for this project. Please contact us to discuss the possibilities.

Please see the following videos for examples of our work:

https://youtu.be/cgwQvjfUYns

https://youtu.be/tf9ZLALPVNY

https://youtu.be/vjPX7SxvSUk

https://youtu.be/LqjNVSm7Hoo

Contact: Martin Fransen, Navrit Bal

Detector R&D: Holographic projector

A difficulty in projecting holograms (based on the interference of light) is the dense pixel pitch a projector requires: less than 200 nanometers. With larger pixels, artefacts occur due to spatial undersampling. A pixel pitch of 200 nanometers is difficult, if not impossible, to achieve, especially over larger areas. Another challenge is the massive computing power that would be required to control such a dense pixel matrix.

A new holographic projection method has been developed that reduces undersampling artefacts for projectors with a 'low' pixel density. It uses 'pixels' at random but known positions, resulting in an array of (coherent) light points that lacks (or has suppressed) spatial periodicity. As a result, a holographic projector can be built with a significantly lower pixel density and correspondingly less computing power. This could bring holography within reach of many applications: displays, lithography, 3D printing, metrology, etc.

Of course, nothing comes for free: with fewer pixels, holograms become noisier and the contrast is reduced (not all light ends up in the hologram). The questions are: how does the quality of a hologram depend on pixel density? How do we determine projector requirements based on requirements for hologram quality?

Requirements for a hologram can be expressed in terms of noise, contrast, resolution, suppression of undersampling artefacts, etc.
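
The benefit of random 'pixel' positions can be seen in a toy 1-D far-field calculation: a regular array with a pitch much larger than the wavelength repeats its main lobe at grating angles, while the same number of randomly placed emitters turns those lobes into a low noise floor. All parameters below are illustrative, not those of the actual set-up.

```python
# Toy 1-D comparison of a regular vs. randomly positioned emitter array.
import numpy as np

wavelength = 0.5e-6
pitch = 2e-6                       # "low" pixel density: pitch >> wavelength/2
n = 200
angles = np.linspace(-0.3, 0.3, 2000)   # observation angles (radians)

def far_field(positions):
    """Far-field intensity of n coherent point emitters at given positions."""
    k = 2 * np.pi / wavelength
    phase = k * np.outer(np.sin(angles), positions)
    return np.abs(np.exp(1j * phase).sum(axis=1)) ** 2

rng = np.random.default_rng(3)
regular = far_field(np.arange(n) * pitch)
rand = far_field(np.sort(rng.uniform(0, n * pitch, n)))

# The regular array repeats its main lobe at sin(theta) = wavelength / pitch;
# the random array replaces that grating lobe by a low noise floor.
lobe = np.argmin(np.abs(np.sin(angles) - wavelength / pitch))
print(regular[lobe] / regular.max(), rand[lobe] / rand.max())
```

The trade-off the project quantifies is visible here too: the random array pays for its suppressed grating lobes with a raised diffuse background.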

For this project we have built a proof-of-concept holographic emitter. This set-up will be used to verify simulation results (and also to project some cool holograms of course ;-).

Examples of what you could be working on:

a. Calibration/characterisation of the current projector and compensation of systematic errors.

b. To realize a phased array of randomly placed light sources the pixel matrix of the projector must be ‘relayed’ onto a mask with apertures at random but precisely known positions. Determine the best possible relaying optics and design an optimized mask accordingly. Factors like deformation of the projected pixel matrix and limitations in resolving power of the lens system must be taken into account for mask design.

Important note: Much of this work is to be performed in the laboratory. Because of the corona pandemic it is not certain whether it will be possible to be physically present for enough of the time for this project. Please contact me to discuss the possibilities.

Contact: Martin Fransen

Theory: The Effective Field Theory Pathway to New Physics at the LHC

A promising framework to parametrise, in a robust and model-independent way, deviations from the Standard Model (SM) induced by new heavy particles is the Standard Model Effective Field Theory (SMEFT). In this formalism, beyond-the-SM effects are encapsulated in higher-dimensional operators constructed from SM fields that respect their symmetry properties. In this project, we aim to carry out a global analysis of the SMEFT using high-precision LHC data, including Higgs boson production, flavour observables, and low-energy measurements. The analysis will be carried out in the context of the recently developed SMEFiT approach [1], based on Machine Learning techniques to efficiently explore the complex theory parameter space. The ultimate goal is either to uncover glimpses of new particles or interactions at the LHC, or to derive the most stringent model-independent bounds to date on general theories of New Physics. Of particular interest are novel methods for charting the parameter space [2], the matching to UV-complete theories in explicit BSM scenarios [3], and the interplay between EFT-based model-independent searches for new physics and determinations of the proton structure from LHC data [4].

[1] https://arxiv.org/abs/1901.05965 [2] https://arxiv.org/abs/1906.05296 [3] https://arxiv.org/abs/1908.05588 [4] https://arxiv.org/abs/1905.05215
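
The flavour of such a fit can be conveyed with a toy one-parameter example: an observable receives a linear EFT correction, and a chi-squared scan of a pseudo-measurement constrains the Wilson coefficient. All numbers (sensitivity, pseudo-data) are invented; the real SMEFiT analysis is a global fit over many operators with correlated uncertainties.

```python
# Toy one-parameter EFT fit: sigma = sigma_SM * (1 + c * A), constrained by a
# single pseudo-measurement. All numbers are invented for illustration.
import numpy as np

sigma_sm, A = 1.00, 0.15          # SM prediction and (made-up) EFT sensitivity
data, err = 1.03, 0.05            # pseudo-measurement and its uncertainty

c_grid = np.linspace(-2, 2, 4001)
chi2 = ((sigma_sm * (1 + c_grid * A) - data) / err) ** 2

best = c_grid[chi2.argmin()]
inside = c_grid[chi2 - chi2.min() <= 1.0]   # 68% interval: delta chi^2 <= 1
print(f"c = {best:.2f},  68% CL: [{inside.min():.2f}, {inside.max():.2f}]")
```

Replacing the grid scan with a sampler over tens of coefficients is where the Machine Learning machinery of [1] comes in.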

Contact: Juan Rojo

Theory: Charting the quark and gluon structure of protons and nuclei with Machine Learning

Deepening our knowledge of the partonic content of nucleons and nuclei [1] represents a central endeavour of modern high-energy and nuclear physics, with ramifications in related disciplines such as astroparticle physics. There are two main scientific drivers motivating these investigations of the partonic structure of hadrons. On the one hand, addressing fundamental open issues in our understanding of the strong interaction, such as the origin of the nucleon mass, spin, and transverse structure; the presence of heavy quarks in the nucleon wave function; and the possible onset of novel gluon-dominated dynamical regimes. On the other hand, pinning down with the highest possible precision the substructure of nucleons and nuclei is a central component of theoretical predictions for a wide range of experiments, from proton and heavy ion collisions at the Large Hadron Collider to ultra-high energy neutrino interactions at neutrino telescopes. The goal of this project is to exploit Machine Learning and Artificial Intelligence tools [2,3] (neural networks trained by stochastic gradient descent) to pin down the quark and gluon substructure of protons and nuclei using recent measurements from proton-proton and proton-lead collisions at the LHC. Topics of special interest are i) the strange content of protons and nuclei, ii) parton distributions at higher orders in the QCD coupling for precision Higgs physics, iii) the interplay between jet, photon, and top quark production data to pin down the large-x gluon, and iv) charm quarks as a probe of gluon shadowing at small-x. The project also involves developing projections for the Electron-Ion Collider (EIC), a new lepton-nucleus experiment due to start operating in the coming years.

[1] https://arxiv.org/abs/1910.03408 [2] https://arxiv.org/abs/1904.00018 [3] https://arxiv.org/abs/1706.00428
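
A minimal illustration of the methodology: parametrise a PDF-like shape with a tiny neural network and fit pseudo-data by gradient descent. This is entirely a toy (invented target shape, one input, one hidden layer); real fits use far richer architectures, data, and uncertainty treatments.

```python
# Toy neural-network fit of a PDF-like shape f(x) by plain gradient descent.
import numpy as np

rng = np.random.default_rng(4)
x = np.linspace(0.01, 0.99, 100)
target = x ** -0.2 * (1 - x) ** 3           # invented stand-in for pseudo-data

# One hidden layer with tanh units.
W1, b1 = rng.normal(0, 1, (8, 1)), np.zeros((8, 1))
W2, b2 = rng.normal(0, 1, (1, 8)), np.zeros((1, 1))

def forward(xs):
    h = np.tanh(W1 @ xs[None, :] + b1)
    return (W2 @ h + b2)[0]

mse_init = np.mean((forward(x) - target) ** 2)
lr = 0.02
for _ in range(5000):
    h = np.tanh(W1 @ x[None, :] + b1)
    pred = (W2 @ h + b2)[0]
    grad = 2 * (pred - target) / len(x)      # dMSE/dpred
    back = (W2.T @ grad[None, :]) * (1 - h ** 2)   # backprop through tanh
    W2 -= lr * grad[None, :] @ h.T
    b2 -= lr * grad.sum()
    W1 -= lr * back @ x[:, None]
    b1 -= lr * back.sum(axis=1, keepdims=True)

mse_final = np.mean((forward(x) - target) ** 2)
print(f"MSE before: {mse_init:.3f}  after: {mse_final:.3f}")
```

The same ingredients (flexible parametrisation, gradient-based training, goodness-of-fit) carry over to the full problem, where the "data" are thousands of cross-section measurements.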

Contact: Juan Rojo

Theory: Machine learning for Electron Microscopy for next-generation materials

Machine Learning tools developed and applied for particle physics hold great potential for applications in material science, in particular concerning faithful uncertainty estimation and model training for large parameter spaces. In this project, carried out in collaboration with the group of Dr. Sonia Conesa-Boj from the Kavli Institute Nanoscience Delft, http://www.conesabojlab.tudelft.nl, we will develop and deploy ML tools for data analysis in Electron Microscopy. We will focus on pinning down the properties of novel quantum materials such as topological insulators and van der Waals materials. Examples of possible applications include model-independent background subtraction in electron-energy loss spectroscopy, automatic classification of crystalline structures, and enhancing spatial and spectral resolution using convolutional networks.

Contact: Juan Rojo

Theory: The electroweak phase transition and baryogenesis/gravitational wave production

In extensions of the Standard Model, the electroweak phase transition can be first order and proceed via the nucleation of bubbles. Colliding bubbles can produce gravitational waves [1], and plasma particles interacting with the bubbles can generate a matter-antimatter asymmetry [2]. A detailed understanding of the dynamics of the phase transition is needed to accurately describe these processes. One project is to study QFT at finite temperature and to compare/apply methods that address the non-perturbative IR dynamics of the thermal processes [3,4]. Another project is to calculate the velocity with which the bubbles expand, an important parameter for gravitational wave production and baryogenesis. This entails, among other things, tunneling dynamics, (thermal) scattering rates and Boltzmann equations [5].

[1]https://arxiv.org/abs/1705.01783 [2]https://arxiv.org/pdf/hep-ph/0609145.pdf [3]https://arxiv.org/pdf/1609.06230.pdf [4]https://arxiv.org/pdf/1612.00466.pdf [5]https://arxiv.org/pdf/1809.04907.pdf
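
For orientation, a standard textbook result (not specific to the references above): the thermal bubble nucleation rate per unit volume is controlled by the three-dimensional bounce action S_3(T),

```latex
\Gamma(T) \;\sim\; T^4 \left(\frac{S_3(T)}{2\pi T}\right)^{3/2} e^{-S_3(T)/T} ,
```

and the transition completes once Gamma becomes comparable to H(T)^4, with H the Hubble rate; the bubble-wall velocity then follows from balancing the driving pressure of the potential difference against the friction encoded in the Boltzmann equations.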

Contact: Marieke Postma

Theory: Cosmology of the QCD axion

The QCD axion provides an elegant solution to the strong CP problem in QCD [1]. This project focuses on the cosmological dynamics of this hypothesized axion field, and in particular on the possibility that it can produce both the observed matter-antimatter asymmetry and the dark matter abundance in our universe [2,3].

[1]https://arxiv.org/abs/1812.02669 [2]https://arxiv.org/pdf/hep-ph/0609145.pdf [3]https://arxiv.org/pdf/1910.02080.pdf

Contact: Marieke Postma

Theory: Neutrinos, hierarchy problem and cosmology

The electroweak hierarchy problem is absent if the quadratic term in the Higgs potential is generated dynamically. This is achieved in 'the neutrino option' [1], where the Higgs potential stems exclusively from quantum effects of heavy right-handed neutrinos, which can also generate the mass pattern of the observed left-handed neutrinos. The project focusses on model-building aspects (e.g. [2]) and the cosmology (e.g. leptogenesis [3]) of these set-ups.

[1] https://arxiv.org/pdf/1703.10924.pdf [2] https://arxiv.org/pdf/1807.11490.pdf [3] https://arxiv.org/pdf/1905.12642.pdf

Contact: Marieke Postma

KM3NeT: Reconstruction of first neutrino interactions in KM3NeT

The neutrino telescope KM3NeT is under construction in the Mediterranean Sea, aiming to detect cosmic neutrinos. Its first few strings with sensitive photodetectors have been deployed at both the Italian and the French detector sites. Even these few strings make it possible to reconstruct the abundant muons stemming from interactions of cosmic rays with the atmosphere and to identify neutrino interactions. In this project the available data will be used together with simulations to best reconstruct the event topologies and to optimally identify and reconstruct the first neutrino interactions in the KM3NeT detector, paving the path towards accurate neutrino oscillation measurements and neutrino astronomy.

Programming skills are essential; mostly ROOT and C++ will be used.

Contact: Ronald Bruijn and Dorothea Samtleben

KM3NeT: Searching for New Heavy Neutrinos

In this project we will search for a new heavy neutrino, looking at signatures created by atmospheric neutrinos interacting in the detector volume of KM3NeT-ORCA. The aim is to study a specific event topology that appears as double blobs of signals detected separately by the densely instrumented ORCA detector units. We will exploit the tau reconstruction algorithms to verify the ability of ORCA to detect such signals and to estimate the potential sensitivity of the experiment. Basic knowledge of elementary particle physics and data analysis techniques will be advantageous. Knowledge of programming languages, e.g. Python (and possibly C++), and of ROOT is advantageous but not mandatory.

Contact: Suzan B. du Pree and Daan van Eijk

KM3NeT: Dark Matter with KM3NeT-ORCA

Dark Matter is thought to be everywhere (we should be swimming through it), but we have no idea what it is. Using the good energy and angular resolutions of the KM3NeT neutrino telescope, we can search for Dark Matter signatures originating from the center of our galaxy. In this project, we will search for such signatures using the reconstructed track and shower events of the KM3NeT-ORCA detector to look for relatively light Dark Matter particles. Since this year, the KM3NeT-ORCA experiment has had 6 detection lines under the Mediterranean Sea, fully operational and continuously taking data. Using the available data, it is possible to compare data and simulation for different event topologies and to estimate the experiment's sensitivity. The project is suitable for a student who is interested in exploring new physics scenarios and willing to develop new skills. Basic knowledge of elementary particle physics and data analysis techniques will be advantageous. Knowledge of programming languages, e.g. Python (and possibly C++), and of the ROOT data analysis tool is advantageous but not mandatory.

Contact: Suzan B. du Pree and Daan van Eijk


Gravitational Waves: Unraveling the structure of neutron stars with gravitational wave observations

Neutron stars were first discovered more than half a century ago, yet their detailed internal structure largely remains a mystery. A range of theoretical models have been put forward for the neutron star "equation of state", but until recently there was no real way to test them. The direct detection of gravitational waves with LIGO and Virgo has the potential to remedy the situation. When two neutron stars spiral towards each other, they get tidally deformed in a way that is determined by the equation of state, and these deformations get imprinted upon the shape of the gravitational wave that gets emitted. After the first gravitational wave observation of such an event in 2017, several equation of state models could already be ruled out. With expected upgrades of the detectors, we will at some point have access not only to the "inspiral" of binary neutron stars, but to the merger itself, and what happens afterwards. The project will consist of using results from large-scale numerical simulations to come up with a heuristic model for the waveform that describes the inspiral-merger-postmerger process with sufficient accuracy given expected detector sensitivities, and to develop data analysis techniques to efficiently use this model to extract information about the neutron star equation of state.
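
As background, a textbook relation rather than a result of this project: the tidal field of the companion induces a quadrupole moment Q_ij = -lambda E_ij, and the dimensionless combination imprinted on the waveform phasing is

```latex
\Lambda \;=\; \frac{2}{3}\, k_2 \left(\frac{c^2 R}{G m}\right)^{5} ,
```

with k_2 the second tidal Love number and R, m the neutron star radius and mass. Different equations of state predict different Lambda(m), which is what the inspiral signal constrains.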

Contact: Chris Van Den Broeck


Gravitational Waves: Searches for gravitational waves from compact binary coalescence

Searches for gravitational waves from the mergers of black holes and neutron stars have been extraordinarily successful in the last four years. We are now beginning to study a population of heavy stellar-mass black holes in detail, including understanding how these systems came to form and whether they are consistent with general relativity. Additionally, the detection of binary neutron star mergers is allowing us to probe their extreme matter. However, we’ve only just scratched the surface of possible signals and the new physics they’d allow us to study. The detection of highly spinning and precessing systems would allow us to perform black hole population statistics to an extraordinary degree of accuracy. Detection of sub-solar mass systems would provide evidence of dark matter. However, these searches are difficult because they require us to work in high-dimensional spaces and develop new statistical methods. There are possibilities for several projects that involve the development and implementation of these new searches as well as the interpretation of the results, particularly in terms of the physics describing compact binary mergers.

Contact: Sarah Caudill


=== Gravitational Waves: Computer modelling to design the laser interferometers for the Einstein Telescope ===

A new field of instrument science led to the successful detection of gravitational waves by the LIGO detectors in 2015. We are now preparing the next generation of gravitational wave observatories, such as the Einstein Telescope, with the aim of increasing the detector sensitivity by a factor of ten, which would allow us, for example, to detect stellar-mass black holes from early in the universe, when the first stars began to form. This ambitious goal requires us to find ways to significantly improve the best laser interferometers in the world.

Gravitational wave detectors such as LIGO and Virgo are complex Michelson-type interferometers enhanced with optical cavities. We develop and use numerical models to study these laser interferometers, to invent new optical techniques and to quantify their performance. For example, we synthesize virtual mirror surfaces to study the effects of higher-order optical modes in the interferometers, and we use opto-mechanical models to test schemes for suppressing quantum fluctuations of the light field. We can offer several projects based on numerical modelling of laser interferometers. All projects will be directly linked to the ongoing design of the Einstein Telescope.
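The simplest building block of such models is a plane-wave, two-mirror (Fabry-Perot) cavity. The sketch below computes the circulating power as a function of microscopic mirror detuning; the mirror transmissions are assumed example values, and dedicated tools used in the field handle the full higher-order-mode physics:

```python
import numpy as np

# Assumed power transmissivities of input and end mirror (lossless mirrors):
T1, T2 = 0.01, 1e-4
r1, r2 = np.sqrt(1 - T1), np.sqrt(1 - T2)   # amplitude reflectivities

# Microscopic one-way detuning phi (radians) around resonance:
phi = np.linspace(-np.pi / 50, np.pi / 50, 20001)

# Circulating power / input power for a plane wave:
gain = T1 / np.abs(1 - r1 * r2 * np.exp(-2j * phi)) ** 2

peak_analytic = T1 / (1 - r1 * r2) ** 2            # on-resonance build-up
finesse = np.pi * np.sqrt(r1 * r2) / (1 - r1 * r2) # cavity finesse
```

The sharp resonance (finesse of several hundred for these example mirrors) is what makes the detectors sensitive, and also what makes them hard to control.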

Contact: Andreas Freise


=== Gravitational Waves: Digging away the noise to find the signal ===

Gravitational wave interferometers are extremely sensitive, but they suffer from instrumental issues that produce noise mimicking astrophysical signals. As much of this noise as possible must be removed or flagged before the data analysis. The problem is that instrumentalists are often unfamiliar with the analysis pipelines, while data analysts are unfamiliar with the experimental details; we need your help to bridge that gap. This is a good opportunity to learn about both sides and to contribute directly to a booming international field. We have several tools and new ideas for correlating noise with the state of the instrument. These need to be developed further, applied to years of data, and written up. The project requires Python, signal processing and statistics.
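A minimal sketch of the idea: correlate the strain channel against auxiliary instrument channels to find which one couples noise into the data. The channel names and the linear coupling below are invented for illustration; real tools work with thousands of channels and more robust statistics:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

# Two hypothetical auxiliary channels monitoring the instrument state:
aux = {
    "scattered_light_bench": rng.normal(size=n),
    "mains_voltage": rng.normal(size=n),
}

# Toy strain data with an assumed linear coupling from one channel:
strain = rng.normal(size=n) + 0.3 * aux["scattered_light_bench"]

# Pearson correlation of each auxiliary channel with the strain channel:
corr = {name: np.corrcoef(ch, strain)[0, 1] for name, ch in aux.items()}
culprit = max(corr, key=lambda k: abs(corr[k]))
```

In practice the coupling is rarely linear or stationary, which is exactly why these methods need further development on years of real data.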

Contact: Bas Swinkels and Sarah Caudill


=== Gravitational Waves: Machine Learning techniques for GW Interferometers ===

This project concerns the control of suspended optical cavities in the non-linear regime. Gravitational wave interferometers are extremely sensitive, but have a very small control range; when control fails, the instrument "unlocks", reducing its robustness. In this project we will use a table-top replica of a suspended optical cavity, located in the new R&D laser lab at Nikhef, to develop a neural network that reconstructs the positions of the free-falling mirrors from beam images. A database of simulated beam images can be used to train various neural networks before deployment in the table-top experiment. We are looking for a hands-on and enthusiastic master student, interested in machine learning and experienced in programming languages such as Python.
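As a baseline for what such a network must beat, the sketch below generates simulated beam-spot images and fits a plain linear least-squares map from pixels to displacement; a neural network would replace the linear map. The beam size, noise level and displacement range are all assumed values:

```python
import numpy as np

rng = np.random.default_rng(0)
size, sigma = 21, 3.0                      # image size (px) and beam width (px), assumed
yy, xx = np.mgrid[0:size, 0:size]

def beam_image(dx, dy):
    """Noisy Gaussian beam spot displaced by (dx, dy) pixels from the center."""
    img = np.exp(-((xx - size // 2 - dx) ** 2 + (yy - size // 2 - dy) ** 2)
                 / (2 * sigma ** 2))
    return img + rng.normal(0.0, 0.01, img.shape)

offsets = rng.uniform(-3.0, 3.0, (700, 2))             # ground-truth displacements
images = np.array([beam_image(dx, dy).ravel() for dx, dy in offsets])
A = np.hstack([images, np.ones((700, 1))])             # flattened pixels + bias column

# "Train" on 600 images, "test" on the remaining 100:
coef, *_ = np.linalg.lstsq(A[:600], offsets[:600], rcond=None)
pred = A[600:] @ coef
rmse = np.sqrt(np.mean((pred - offsets[600:]) ** 2))   # sub-pixel on this toy data
```

On clean Gaussian spots a linear map already works well; the interest of a neural network is robustness to the distorted, higher-order-mode images of the real cavity.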

Contact: Rob Walet and Frank Linde

=== VU LaserLaB: Measuring the electric dipole moment (EDM) of the electron ===

In collaboration with Nikhef and the Van Swinderen Institute for Particle Physics and Gravity at the University of Groningen, we have recently started an exciting project to measure the electric dipole moment (EDM) of the electron in cold beams of barium-fluoride molecules. The eEDM, which the Standard Model of particle physics predicts to be extremely small, is a powerful probe of physics beyond the Standard Model. Many extensions of the Standard Model, most prominently supersymmetry, naturally predict an electron EDM just below the current experimental limits. We aim to improve on the best current measurement by at least an order of magnitude. To do so, we will perform a precision measurement on a slow beam of laser-cooled BaF molecules. With this low-energy precision experiment, we test physics at energy scales comparable to those probed at the LHC!

At LaserLaB VU, we are responsible for building and testing a cryogenic source of BaF molecules. The main parts of this source are currently being constructed in the workshop. We are looking for enthusiastic master students to help set up the laser system that will be used to detect BaF. Furthermore, projects are available to perform trajectory simulations to design a lens system that guides the BaF molecules from the exit of the cryogenic source to the experiment.
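A first feel for such trajectory simulations comes from thin-lens matrix optics: each molecule is a ray (position, angle), and drifts and the lens are 2x2 transfer matrices. The geometry and beam parameters below are assumed toy values; the real design requires integrating trajectories through computed electric or magnetic fields:

```python
import numpy as np

rng = np.random.default_rng(1)
x0 = rng.normal(0.0, 1e-3, 2000)     # transverse positions at the source (m), assumed
xp0 = rng.normal(0.0, 0.02, 2000)    # transverse angles (rad), ~20 mrad divergence

def drift(L):
    """Free flight over a distance L (m)."""
    return np.array([[1.0, L], [0.0, 1.0]])

def lens(f):
    """Thin lens of focal length f (m), a stand-in for the molecular lens."""
    return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

state = np.vstack([x0, xp0])
# 0.3 m drift, lens with f = 0.15 m, 0.3 m drift: a 1:1 imaging geometry
x_img = (drift(0.3) @ lens(0.15) @ drift(0.3) @ state)[0]
x_free = (drift(0.6) @ state)[0]     # same total distance without the lens
```

With the lens, the source is re-imaged and the beam stays millimetre-sized; without it, the 20 mrad divergence spreads the molecules over centimetres, and most would miss the experiment.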

Contact: Rick Bethlem

=== VU LaserLaB: Physics beyond the Standard Model from molecules ===

Our team, with a number of staff members (Ubachs, Eikema, Salumbides, Bethlem, Koelemeij), focuses on precision measurements on the hydrogen molecule and its isotopologues. The work aims at testing QED calculations of energy levels in H2, D2, T2, HD, etc. against the most precise measurements, in which all kinds of experimental laser techniques play a role (cw and pulsed lasers, atomic clocks, frequency combs, molecular beams). Another target of study is the connection to the "proton size puzzle", which may be resolved through studies of the hydrogen molecular isotopologues.
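The key relation behind comb-based frequency metrology is that every comb tooth sits at f_n = f_ceo + n * f_rep; a beat note against the nearest tooth, plus a coarse wavemeter reading to fix the integer n, pins down an absolute optical frequency. All numbers in the sketch below are invented for illustration:

```python
# Assumed comb parameters (Hz): repetition rate and carrier-envelope offset
f_rep, f_ceo = 250.0e6, 20.0e6

# Hypothetical "unknown" laser frequency near 563.26 THz (Hz):
f_laser_true = 563_260_037_000_000.0

# Coarse wavemeter reading, accurate to well within f_rep / 2:
f_wavemeter = f_laser_true + 30.0e6

# Integer mode number of the nearest comb tooth, fixed by the wavemeter:
n = round((f_wavemeter - f_ceo) / f_rep)
f_tooth = f_ceo + n * f_rep

# The measured beat note between laser and nearest tooth:
f_beat = abs(f_laser_true - f_tooth)
sign = 1 if f_laser_true > f_tooth else -1   # sign resolved e.g. by stepping f_rep

f_reconstructed = f_tooth + sign * f_beat    # absolute optical frequency
```

Because f_rep and f_ceo are radio frequencies referenced to an atomic clock, the comb transfers clock-level accuracy to optical transitions, which is what enables the tests of QED described above.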

In the past half year we have produced a number of important results that are described in the following papers:

* Frequency-comb (Ramsey-type) electronic excitations in the H2 molecule: "Deep-ultraviolet frequency metrology of H2 for tests of molecular quantum theory", http://www.nat.vu.nl/~wimu/Publications/Altmann-PRL-2018.pdf
* Precision measurement of an infrared transition in the HD molecule: "Sub-Doppler frequency metrology in HD for tests of fundamental physics", https://arxiv.org/abs/1712.08438
* The first precision study in molecular tritium, T2: "Relativistic and QED effects in the fundamental vibration of T2", http://arxiv.org/abs/1803.03161
* Dissociation energy of the hydrogen molecule at 10^-9 accuracy (paper submitted to Phys. Rev. Lett.)
* Probing QED and fundamental constants through laser spectroscopy of vibrational transitions in the hydrogen molecular ion HD+, where important results were obtained recently and where we have a strong activity: http://www.nat.vu.nl/~wimu/Publications/ncomms10385.pdf

These five results mark the various directions we are pursuing, and in all of them we aim at further improvements. Specific student projects can be defined; they are mostly experimental, although some theoretical tasks are possible, such as calculating hyperfine structures.

Contact: Wim Ubachs, Kjeld Eikema, Rick Bethlem


[[Last years MSc Projects|Last year's MSc Projects]]
