Master Projects

The following Master thesis research projects are offered at Nikhef. If you are interested in one of these projects, please contact the coordinator listed with the project.  
 
== Projects with September 2020 start ==

=== ATLAS: Top Spin optimal observables using Artificial Intelligence ===
  
The top quark has an exceptionally high mass, close to the electroweak symmetry-breaking scale, and is therefore sensitive to new physics effects. Theoretically, new physics is well described in the EFT framework [1]. The EFT operators are experimentally well accessible in single-top t-channel production, where the top quark is produced spin-polarized. The focus at Nikhef is the operator O_{tW} with a possible imaginary phase, leading to CP violation. Experimentally, many angular distributions are reconstructed in the top rest frame to hunt for these effects. We are looking for a limited set of optimal observables. The objective of your Master project would be to find optimal observables using simulated events, including detector effects and possible systematic deviations. All techniques are allowed, but methods that involve artificial intelligence are a promising new development. This work could lead to an ATLAS note.
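To give a flavour of what an "optimal observable from artificial intelligence" can mean in practice, the sketch below (invented toy distributions and a hand-rolled logistic regression, not ATLAS software or data) builds a classifier score that separates SM-like from CP-violating pseudo-events; such a score can itself serve as an approximately optimal observable:

<syntaxhighlight lang="python">
# Toy sketch, not ATLAS code: all distributions and numbers are invented.
import numpy as np

rng = np.random.default_rng(42)
n = 20000

def generate(asym, n):
    """Toy angular observables in the top rest frame with a tunable asymmetry."""
    cos_theta = np.clip(rng.uniform(-1, 1, n) + asym * rng.normal(0.3, 0.1, n), -1, 1)
    phi = rng.uniform(-np.pi, np.pi, n) + asym * rng.normal(0.2, 0.1, n)
    return np.column_stack([cos_theta, phi])

x_sm  = generate(0.0, n)   # Standard-Model-like pseudo-events
x_cpv = generate(1.0, n)   # pseudo-events with a CP-violating admixture
X = np.vstack([x_sm, x_cpv])
y = np.concatenate([np.zeros(n), np.ones(n)])

# Logistic regression trained by gradient descent: its score approximates the
# likelihood ratio between the two hypotheses, i.e. an (approximately) optimal
# observable for this toy model.
w, b = np.zeros(X.shape[1]), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    w -= 0.5 * X.T @ (p - y) / len(y)
    b -= 0.5 * np.mean(p - y)

print("mean score, SM-like events :", (x_sm @ w + b).mean())
print("mean score, CPV events     :", (x_cpv @ w + b).mean())
</syntaxhighlight>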
  
=== The XENON Dark Matter Experiment: Data Analysis ===
+
[1] https://arxiv.org/abs/1807.03576
  
The XENON collaboration is operating the XENON1T detector, the world’s most sensitive direct detection dark matter experiment. The Nikhef group is playing an important role in this experiment. The detector operates at the Gran Sasso underground laboratory and consists of a so-called dual-phase xenon time-projection chamber filled with 3200kg of ultra-pure xenon. Our group has an opening for a motivated MSc student to do analysis with the data from this detector. The work will consist of understanding the signals that come out of the detector and applying machine learning tools to improve the reconstruction performance in our Python-based analysis tool. The final goal is to improve the signal-to-background for the dark matter search. There will also be opportunity to do data-taking shifts at the Gran Sasso underground laboratory in Italy.  
+
''Contact: [mailto:h73@nikhef.nl Marcel Vreeswijk] and [mailto:jdegens@nikhef.nl Jordy Degens]''
  
''Contact: [mailto:decowski@nikhef.nl Patrick Decowski]''
+
=== ATLAS: The Next Generation ===
  
=== The Modulation Experiment: Data Analysis ===
+
After the observation of the coupling of Higgs bosons to fermions of the third generation, the search for the coupling to fermions of the second generation is one of the next priorities for research at CERN's Large Hadron Collider. The search for the decay of the Higgs boson to two charm quarks is very new [1] and we see various opportunities for interesting developments. For this project we propose improvements in reconstruction (using exclusive decays), advanced analysis techniques (using deep learning methods) and expanding the theory interpretation. Another opportunity would be the development of the first statistical combination of results between the ATLAS and CMS experiments, which could significantly improve the discovery potential.
  
There exist a few measurements that suggest an annual modulation in the activity of radioactive sources. With a few groups from the XENON collaboration we have developed four sets of table-top experiments to investigate this effect on a few well known radioactive sources. The experiments are under construction in Purdue University (USA), a mountain top in Switzerland, a beach in Rio de Janeiro and the last one at Nikhef in Amsterdam. We urgently need a master student to (1) analyze the first big data set, and (2) contribute to the first physics paper from the experiment. We are looking for all-round physicists with interest in both lab-work and data-analysis. The student(s) will directly collaborate with the other groups in this small collaboration (around 10 people), and the goal is to have the first physics publication ready by the end of the project. During the 2018-2019 season there are positions for two MSc students.
+
[1] https://arxiv.org/abs/1802.04329
  
''Contact: [mailto:z37@nikhef.nl Auke Colijn]''
+
''Contact: [mailto:tdupree@nikhef.nl Tristan du Pree and Marko Stamenkovic]''
  
=== Theory: Unbiased global analysis of heavy quark fragmentation functions ===
+
=== ATLAS: The Most Energetic Higgs Boson ===
  
The hadronisation of quarks and gluons into D mesons is of particular relevance in the era of the Large Hadron Collider (LHC). For example, production cross sections are used to constrain the gluon parton distribution function (PDF) at small momentum fractions, they play a vital role in cosmic-ray and neutrino astrophysics, and they provide the background to study modifications of heavy flavour yields in heavy-ion collisions. In perturbative Quantum Chromodynamics (QCD), the hadronisation of partons into hadrons is encoded in nonperturbative fragmentation functions (FFs). In the case of charmed hadrons, like the D mesons, the heavy quark mass introduces an additional large scale – apart from some other hard scale that characterises the process – whose effects are perturbatively computable. This project is about a determination of the FFs of D mesons from a QCD analysis of data. Measurements of cross sections in a broad range of single-inclusive hadron production will be considered, namely in electron-positron annihilation, and in hadron-hadron collisions (including in-jet fragmentation). The analysis will be carried out within the well-established NNPDF framework. Its main pillars are Monte Carlo sampling for the representation of data uncertainties, and neural networks for the parametrisation of PDFs. The project will allow for the acquisition of a broad set of data analysis computational techniques, widely used in high-energy physics and beyond.
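As a schematic illustration of the Monte Carlo replica idea mentioned above, the toy fit below uses invented pseudo-data and a simple functional ansatz in place of a neural network (it is not the actual NNPDF code):

<syntaxhighlight lang="python">
# Toy replica fit: invented pseudo-data; the real analysis replaces the simple
# ansatz below with a neural network parametrisation.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(1)

def ff(z, norm, a, b):
    """Simple fragmentation-function ansatz D(z) = N z^a (1-z)^b."""
    return norm * z**a * (1 - z)**b

z = np.linspace(0.1, 0.9, 25)
sigma = 0.03
data = ff(z, 5.0, 1.5, 2.0) + sigma * rng.normal(size=z.size)     # pseudo-data

# Monte Carlo replicas: refit data fluctuated within its uncertainties; the
# spread of the resulting curves propagates the experimental uncertainty.
replicas = []
for _ in range(200):
    pseudo = data + sigma * rng.normal(size=z.size)
    popt, _ = curve_fit(ff, z, pseudo, p0=[4.0, 1.0, 1.5],
                        sigma=np.full(z.size, sigma), maxfev=5000)
    replicas.append(ff(z, *popt))

replicas = np.array(replicas)
i = np.argmin(np.abs(z - 0.5))
print("D(z=0.5) = %.3f +- %.3f" % (replicas[:, i].mean(), replicas[:, i].std()))
</syntaxhighlight>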
+
The production of Higgs bosons at the highest energies could give the first indications for deviations from the standard model of particle physics, but production energies above 500 GeV have not been observed yet [1]. The LHC Run-2 dataset, collected during the last 4 years, might be the first opportunity to observe such processes, and we have various ideas for new studies. Possible developments include the improvement of boosted reconstruction techniques, for example using multivariate deep learning methods. Also, there are various opportunities for unexplored theory interpretations (using the MadGraph event generator), including effective field theory models (with novel ‘morphing’ techniques) and new interpretations of the newly observed boosted VZ(bb) process.

Further information [http://pcteserver.mi.infn.it/~nnpdf/VU/2018-MasterProject-DFFs.pdf here]
+
[1] https://arxiv.org/abs/1709.05543
  
''Contact: [mailto:j.rojo@vu.nl Juan Rojo]''
+
''Contact: [mailto:tdupree@nikhef.nl Tristan du Pree and Brian Moser]''
  
=== Theory:  The quark and gluon internal structure of heavy nuclei in the LHC era  ===
+
=== LHCb: Measurement of delta md ===
The decay B0->D-pi+ is very abundant in LHCb, and therefore ideal to study the oscillation frequency delta md, with which B0 mesons oscillate into anti-B0 mesons, and vice versa. This process proceeds through a so-called box diagram, which might hide new, yet-undiscovered particles. Recently, it has been realized that the value of delta md is in tension with the value of the CKM angle gamma, triggering renewed interest in this measurement.
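The principle of such an oscillation measurement is sketched below on generated toy decays (perfect flavour tagging and resolution, invented statistics; not the LHCb analysis code):

<syntaxhighlight lang="python">
# Toy delta_md fit on generated decays; tagging and resolution effects ignored.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(7)
tau_B0, dmd_true = 1.52, 0.51                  # B0 lifetime [ps], delta_md [1/ps]
t = rng.exponential(tau_B0, size=200_000)      # decay times

p_mix = 0.5 * (1 - np.cos(dmd_true * t))       # probability to decay as anti-B0
mixed = rng.random(t.size) < p_mix

bins = np.linspace(0, 10, 40)
centres = 0.5 * (bins[1:] + bins[:-1])
n_mix, _ = np.histogram(t[mixed], bins)
n_unm, _ = np.histogram(t[~mixed], bins)
tot = np.clip(n_mix + n_unm, 1, None)
asym = (n_unm - n_mix) / tot
err = np.sqrt(np.clip(1 - asym**2, 0.05, None) / tot)   # floor avoids zero errors

popt, pcov = curve_fit(lambda t, dmd: np.cos(dmd * t), centres, asym,
                       p0=[0.5], sigma=err)
print(f"delta_md = {popt[0]:.3f} +- {np.sqrt(pcov[0, 0]):.3f} ps^-1")
</syntaxhighlight>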
  
A precise knowledge of the parton distribution functions (PDFs) of the proton is essential in order to make predictions for the Standard Model and beyond at hadron colliders. The presence of nuclear medium and collective phenomena which involve several nucleons modifies the parton distribution functions of nuclei (nPDFs) compared to those of a free nucleon. These modifications have been investigated by different groups using global analyses of high energy nuclear reaction world data. It is important to determine the nPDFs not only for establishing perturbative QCD factorisation in nuclei but also for applications to heavy-ion physics and neutrino physics. In this project the student will join an ongoing effort towards the determination of a data-driven model of nPDFs, and will learn how to construct tailored Artificial Neural Networks (ANNs).  
+
''Contact: [mailto:Marcel.Merk@nikhef.nl Marcel Merk]''

Further information [http://pcteserver.mi.infn.it/~nnpdf/VU/2018-MasterProject-nPDFs.pdf here]
+
=== LHCb: Searching for CPT violation ===
CPT symmetry is closely linked to Lorentz symmetry, and any violation would revolutionize science. There are possibilities, though, that supergravity could cause CPT-violating effects in the system of neutral mesons. The precise study of B0s oscillations in the abundant Bs->Dspi decays can give the most stringent limits on Im(z) to date.
  
''Contact: [mailto:j.rojo@vu.nl Juan Rojo]''
+
''Contact: [mailto:Marcel.Merk@nikhef.nl Marcel Merk]''
  
=== Theory: Combined QCD analysis of parton distribution and fragmentation functions ===
+
=== LHCb: BR(B0->D-pi+) and fd/fu with B+->D0pi+ ===  
The abundant decay B0->D-pi+ is often used as a normalization channel, given its clean signal and well-known branching fraction, as measured by the B-factories. However, this branching fraction can be determined more precisely by comparing to the decay B+->D0pi+, which is known with twice the precision. In addition, the production of B0 and B+ mesons is often assumed to be equal, based on isospin symmetry. The study of B+->D0pi+ and B0->D-pi+ allows for the first measurement of this ratio, fd/fu.
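A back-of-the-envelope version of the measurement principle looks as follows (all yields, efficiencies and uncertainties are invented, and intermediate D branching fractions are ignored):

<syntaxhighlight lang="python">
# Invented numbers throughout; only meant to show the ratio and its error budget.
import numpy as np

N_B0, dN_B0 = 52_000.0, 300.0          # fitted B0->D-pi+ yield and uncertainty
N_Bp, dN_Bp = 98_000.0, 400.0          # fitted B+->D0pi+ yield and uncertainty
eff_B0, deff_B0 = 0.012, 0.0004        # selection efficiencies from simulation
eff_Bp, deff_Bp = 0.021, 0.0006

ratio = (N_B0 / eff_B0) / (N_Bp / eff_Bp)

# uncorrelated relative uncertainties added in quadrature
rel = np.sqrt((dN_B0 / N_B0)**2 + (dN_Bp / N_Bp)**2 +
              (deff_B0 / eff_B0)**2 + (deff_Bp / eff_Bp)**2)

print(f"(fd/fu) x BR(B0->D-pi+)/BR(B+->D0pi+) = {ratio:.3f} +- {ratio * rel:.3f}")
</syntaxhighlight>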
  
The formation of hadrons from quarks and gluons, or collectively partons, is a fundamental QCD process that has yet to be fully understood. Since parton-to-hadron fragmentation occurs over long-distance scales, such information can only be extracted from experimental observables that identify mesons and baryons in the final state. Recent progress has been made to determine these fragmentation functions (FFs) from charged pion and kaon production in single inclusive e+e−-annihilation (SIA) and additionally pp-collisions and semi-inclusive deep inelastic scattering (SIDIS). However, charged hadron production in unpolarized pp and inelastic lepton-proton scattering also requires information about the momentum distributions of the quarks and gluons in the proton, which is encoded in non-perturbative parton distribution functions (PDFs). In this project, a simultaneous treatment of both PDFs and FFs in a global QCD analysis of single inclusive hadron production processes will be made to determine the individual parton-to-hadron FFs. Furthermore, a robust statistical methodology with an artificial neural network learning algorithm will be used to obtain a precise estimation of the FF uncertainties. This work will emphasize in particular the impact of pp-collision and SIDIS data on the gluon and separated quark/anti-quark FFs, respectively.
+
''Contact: [mailto:Marcel.Merk@nikhef.nl Marcel Merk]''

Further information [http://pcteserver.mi.infn.it/~nnpdf/VU/2018-MasterProject-FFpPDFs.pdf here]
 
  
''Contact: [mailto:j.rojo@vu.nl Juan Rojo]''
+
=== LHCb: Optimization studies for Vertex detector at the High Lumi LHCb ===
 +
The LHCb experiment is dedicated to measuring tiny differences between matter and antimatter through the precise study of rare processes involving b or c quarks. The LHCb detector will undergo a major modification in order to dramatically increase the luminosity and be able to measure indirect effects of physics beyond the standard model. In this environment, over 42 simultaneous collisions are expected to happen within a time interval of about 200 ps in which the two proton bunches overlap. The particles of interest have a relatively long lifetime, and therefore the best way to distinguish them from the background collisions is through the precise reconstruction of displaced vertices and pointing directions. The new detector design considers extremely recent or even future technologies to measure space (with resolutions below 10 um) and time (100 ps or better) to efficiently reconstruct the events of interest for physics. The project involves completely changing the LHCb Vertex Locator (VELO) design in simulation and determining the best achievable performance for the upgraded detector, considering different spatial and temporal resolutions.
  
 +
''Contact: [mailto:kazu.akiba@nikhef.nl Kazu Akiba]''
  
=== ATLAS : Double Higgs searches with multiple leptons ===
+
=== LHCb: Measurement of charge multiplication in heavily irradiated sensors ===
 +
During the R&D phase for the LHCb VELO Upgrade detector a few sensor prototypes were irradiated to the extreme fluence expected to be achieved during the detector lifetime. These samples were tested using high energy particles at the SPS facility at CERN with their trajectories reconstructed by the Timepix3 telescope. A preliminary analysis revealed that at the highest irradiation levels the amount of signal observed is higher than expected, and even larger than the signal obtained at lower doses.  At the Device Under Test (DUT) position inside the telescope, the spatial resolution attained by this system is below 2 um. This means that a detailed analysis can be performed in order to study where and how this signal amplification happens within  the 55x55 um^2 pixel cell.  This project involves analysing the telescope and DUT data to investigate the charge multiplication mechanism at the microscopic level.
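The in-pixel mapping idea can be illustrated on synthetic data (an invented charge model, not the Timepix3 telescope framework): fold the track intercepts into a single 55x55 um^2 cell and profile the mean signal.

<syntaxhighlight lang="python">
# Synthetic example of an in-pixel charge map; the charge model is invented.
import numpy as np

rng = np.random.default_rng(3)
pitch = 55.0                                   # um
n_tracks = 100_000

# track intercepts on the sensor; toy model with enhanced signal near the
# centre of the pixel cell
x = rng.uniform(0, 256 * pitch, n_tracks)
y = rng.uniform(0, 256 * pitch, n_tracks)
xin, yin = x % pitch, y % pitch                # position folded into one cell
r2 = (xin - pitch / 2)**2 + (yin - pitch / 2)**2
charge = 12_000 + 4_000 * np.exp(-r2 / (2 * 10.0**2)) + rng.normal(0, 500, n_tracks)

nb = 11                                        # 5 um in-pixel bins
hsum, _, _ = np.histogram2d(xin, yin, bins=nb, range=[[0, pitch]] * 2, weights=charge)
hcnt, _, _ = np.histogram2d(xin, yin, bins=nb, range=[[0, pitch]] * 2)
mean_charge = hsum / np.clip(hcnt, 1, None)

print("mean charge across the cell: min %.0f, max %.0f electrons"
      % (mean_charge.min(), mean_charge.max()))
</syntaxhighlight>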
  
The Standard Model of particle physics (SM) is extremely successful, but will it hold when checked against data containing multiple leptons? Although it is a very rare process, the production of multiple leptons is calculated in the SM with high precision. On the detector side, the leptons (electrons and muons) are easy to reconstruct, and such a sample contains very little "non-lepton" background. This analysis has the ambitious goal of reconstructing events with two Higgs bosons using events with 4 leptons. With this project, the student would gain close familiarity with modern experimental techniques (statistical analysis, SM predictions, search for rare signals), with Monte Carlo generators and with the standard HEP analysis tools (ROOT, C++, python).
+
''Contact: [mailto:kazu.akiba@nikhef.nl Kazu Akiba]''
  
''Contact: [mailto:O.Igonkina@nikhef.nl Olya Igonkina and Marcus Morgenstern and Pepijn Bakker]''
+
=== LHCb: Testing the flavour anomalies at LHCb ===
 +
Lepton Flavour Universality (LFU) is an intrinsic property of the Standard Model, which implies that the three generations of leptons are subject to the same interactions. This fundamental law of the SM can be investigated by looking at rare B-meson decays with muons or electrons in the final state. Recent measurements of these decays from LHCb show deviations from the SM (known as flavour anomalies) that, if confirmed, would lead to a major discovery of New Physics (NP). The project consists of the analysis of the 2017-18 dataset, which will double the statistics of the current results. This new dataset will lead to a measurement with better precision, which can either confirm or exclude the contribution of NP to these decays. The project will explore all the crucial aspects of data analysis, from simulation to signal modelling, including cutting-edge software such as fitting large amounts of data using GPUs (Graphics Processing Units).
  
=== ATLAS : A search for lepton flavor violation with tau decays ===
+
''Contact: [mailto:a.mauri@cern.ch Andrea Mauri] and [mailto:marcel.merk@nikhef.nl Marcel Merk]''
  
Quarks mix, neutrinos mix, charged leptons do not mix. Why? Is that really how nature works, or is it just a limitation of our detection techniques? ATLAS has now recorded a huge sample of data. Even such difficult final states as tau->3mu become accessible. However, the decays of charm and beauty mesons could spoil the picture with decays that resemble the signal. The goal of the project is to understand what background decays are present and to find a way to suppress them. Success of the project will allow much higher sensitivity to the beyond-Standard-Model process tau->3mu. The student would gain close familiarity with modern experimental techniques (statistical analysis, SM predictions, search for rare signals), background suppression techniques and the standard HEP analysis tools (ROOT, C++, python).

''Contact: [mailto:O.Igonkina@nikhef.nl Olya Igonkina and Edwin Chow]''

=== LHCb: Search for long-lived heavy neutral leptons in B decays ===
The masses of neutrinos are many orders of magnitude smaller than those of the other fermions. In the seesaw mechanism this puzzling fact is explained by the existence of another set of neutral leptons that are much heavier in mass. If their mass is below about 5 GeV, such neutrinos can be produced at the LHC in decays of B hadrons. Their small coupling will lead to a lifetime of the order of picoseconds, which means that they will fly an observable distance before they decay. In this project we search for such long-lived heavy neutrinos in decays of charged B mesons using the LHCb Run-2 dataset.
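A rough feel for the relevant distances (an assumed mass, lifetime and momentum spectrum, not the analysis selection) can be obtained in a few lines:

<syntaxhighlight lang="python">
# Rough kinematic estimate with invented inputs: flight distance of a heavy
# neutral lepton with picosecond lifetime, as relevant for displaced vertices.
import numpy as np

rng = np.random.default_rng(11)
c_mm_per_ps = 0.2998            # speed of light in mm/ps
m_N  = 2.0                      # assumed heavy-neutrino mass [GeV]
tau  = 1.0                      # assumed proper lifetime [ps]
p = rng.exponential(30.0, size=100_000) + 5.0   # toy momentum spectrum [GeV]

beta_gamma = p / m_N
flight = rng.exponential(beta_gamma * c_mm_per_ps * tau)   # lab decay length [mm]

print("mean flight distance: %.1f mm" % flight.mean())
print("fraction decaying between 5 mm and 400 mm from the PV: %.2f"
      % np.mean((flight > 5) & (flight < 400)))
</syntaxhighlight>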
  
'' Contact: [mailto:v.lukashenko@nikhef.nl Lera Lukashenko] and [mailto:wouter.hulsbergen@nikhef.nl Wouter Hulsbergen]''
  
 +
=== LHCb: Discovering the Bc->eta_c mu nu decay ===
 +
The Bc meson, consisting of heavy c and anti-b quarks, is of great interest for flavour physics. A recent LHCb measurement of Bc->J/psi l nu decays [1] showed a possible deviation from the Standard Model prediction, which entered the so-called lepton universality puzzle - the hottest topic in b-physics in recent years. Following that, the study of a similar decay mode - Bc->eta_c mu nu - is strongly requested by the theory community. However, the reconstruction of the eta_c meson is challenging, so the decay has not been discovered yet. The project aims at the discovery of the Bc->eta_c mu nu decay using the unique capabilities of the LHCb experiment. The data analysis will consist of finding the optimal event selection using machine learning techniques, investigating background sources, performing fits to data, etc. The project requires that you are not afraid of analysis software and statistics. The results will be presented within the collaboration: talks at working group meetings, an analysis note, etc. Skills in git, python and ROOT (and similar packages) are extremely welcome.
  
=== ATLAS : A search for lepton non-universality in Bc meson decays ===
+
[1] https://arxiv.org/pdf/1711.05623.pdf
  
Recently, the LHCb experiment has reported a number of intriguing deviations from the SM in leptonic decays of B mesons. With this project we would like to probe whether ATLAS also observes the same kind of deviation, e.g. in the Bc->J/psi tau nu channel with respect to Bc->J/psi mu nu. Success of the project will be essential to understand whether we are finally observing a beyond-SM process or whether LHCb has some detector bias. The student would gain close familiarity with modern experimental techniques (statistical analysis, SM predictions, search for rare signals), background suppression techniques and the standard HEP analysis tools (ROOT, C++, python).
+
''Contact: [mailto:andrii.usachov@nikhef.nl Andrii Usachov] and [mailto:marcel.merk@nikhef.nl Marcel Merk]''
  
''Contact: [mailto:O.Igonkina@nikhef.nl Olya Igonkina and Edwin Chow]''
+
=== ALICE: Searching for the strongest magnetic field in nature ===
 +
In case of a non-central collision between two Pb ions, with a large value of the impact parameter (b), the charged nucleons that do not participate in the interaction (called spectators) create strong magnetic fields. A back-of-the-envelope calculation using the Biot-Savart law brings the magnitude of this field close to 10^19 Gauss, in agreement with state-of-the-art theoretical calculations, making it the strongest magnetic field in nature. The presence of this field could have direct implications for the motion of final-state particles. The magnetic field, however, decays rapidly. The decay rate depends on the electric conductivity of the medium, which is experimentally poorly constrained. Overall, the presence of this magnetic field, whose experimental confirmation is the main goal of this project, has so far not been established.
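The back-of-the-envelope estimate mentioned above can be reproduced in a few lines (all inputs are rough assumptions and the result is an order of magnitude only; detailed calculations treat retardation and the spatial spread of the spectators properly):

<syntaxhighlight lang="python">
# Order-of-magnitude Biot-Savart estimate; all inputs are rough assumptions.
mu0_over_4pi = 1e-7        # T*m/A
e = 1.602e-19              # C
c = 2.998e8                # m/s
b = 5e-15                  # m, assumed distance of the spectators from the centre
Z_spec = 50                # assumed number of spectator protons per nucleus
gamma = 2700               # approximate Lorentz factor of the LHC Pb beams

# field of Z_spec charges moving with v ~ c at closest approach; the transverse
# field of a relativistic charge is enhanced by gamma
B_static = mu0_over_4pi * Z_spec * e * c / b**2
B_boosted = gamma * B_static

tesla_to_gauss = 1e4
print(f"without relativistic enhancement: {B_static * tesla_to_gauss:.1e} G")
print(f"with gamma enhancement:           {B_boosted * tesla_to_gauss:.1e} G")
# detailed calculations land between these two extremes, around 10^19 G
</syntaxhighlight>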
  
 +
''Contact: [mailto:Panos.Christakoglou@nikhef.nl Panos Christakoglou]''
  
=== LHCb: Searching for dark matter in exotic six-quark particles ===
+
=== ALICE: Looking for parity violating effects in strong interactions ===
About three quarters of the mass in the Universe is of unknown type. Many hypotheses about this dark matter have been proposed, but none has been confirmed. Recently it has been proposed that it could consist of particles made of the six quarks uuddss. Such a particle could be produced in decays of heavy baryons. It is proposed to use Xi_b baryons produced at LHCb to search for such a state, which would appear as missing 4-momentum in a kinematically constrained decay. The project consists of optimising a selection and applying it to LHCb data. See [https://arxiv.org/abs/1708.08951 arXiv:1708.08951].
+
Within the Standard Model, symmetries, such as the combination of charge conjugation (C) and parity (P), known as CP-symmetry, are considered to be key principles of particle physics. The violation of CP-invariance can be accommodated within the Standard Model in the weak and the strong interactions; however, it has only been confirmed experimentally in the former. Theory predicts that in heavy-ion collisions, in the presence of a deconfined state, gluonic fields create domains where the parity symmetry is locally violated. This manifests itself in a charge-dependent asymmetry in the production of particles relative to the reaction plane, which is called the Chiral Magnetic Effect (CME).

The first experimental results from STAR (RHIC) and ALICE (LHC) are consistent with the expectations from the CME; however, further studies are needed to constrain background effects. These highly anticipated results have the potential to reveal exciting, new physics.
  
''Contact: [mailto:patrick.koppenburg@cern.ch Patrick Koppenburg]''
+
''Contact: [mailto:Panos.Christakoglou@nikhef.nl Panos Christakoglou]''
  
 +
=== ALICE: Machine learning techniques as a tool to study the production of heavy flavour particles ===
 +
There was recently a shift in the field of heavy-ion physics triggered by experimental results obtained in collisions between small systems (e.g. protons on protons). These results resemble the ones obtained in collisions between heavy ions. This consequently raises the question of whether we create the smallest QGP droplet in collisions between small systems. The main objective of this project will be to study the production of charm particles such as D-mesons and Λc-baryons in pp collisions at the LHC. This will be done with the help of a new and innovative technique which is based on machine learning (ML). The student will also extend the studies to investigate how this production rate depends on the event activity e.g. on how many particles are created after every collision.
  
----
+
''Contact: [mailto:Panos.Christakoglou@nikhef.nl Panos Christakoglou] and [mailto:Alessandro.Grelli@cern.ch Alessandro Grelli]''
  
== OLD Projects [from last year] ==
+
=== ALICE: Energy Loss of Energetic Quarks and Gluons in the Quark-Gluon Plasma ===
 +
One of the ways to study the quark-gluon plasma that is formed in high-energy nuclear collisions, is using high-energy partons (quarks or gluons) that are produced early in the collision and interact with the quark-gluon plasma as they propagate through it. There are several current open questions related to this topic, which can be explored in a Master's project. For example, we would like to use the new Monte Carlo generator framework JetScape to simulate collisions to see whether we can extract information about the interaction with the quark-gluon plasma. In the project you will collaborate with one of the PhD students or postdocs in our group to use the model to generate predictions of measurements and compare those to data analysis results. Depending on your interests, the project can focus more on the modeling aspects or on the analysis of experimental data from the ALICE detector at the LHC.
  
 +
''Contact: [mailto:marco.van.leeuwen@cern.ch Marco van Leeuwen] and [mailto:marta.verweij@cern.ch Marta Verweij]''
  
 +
=== ALICE: Extreme Rare Probes of the Quark-Gluon Plasma ===
 +
The quark-gluon plasma is formed in high-energy nuclear collisions and also existed shortly after the big bang. With the large amount of data collected in recent years at the Large Hadron Collider at CERN, rare processes that previously were not accessible now provide new ways to study how the quark-gluon plasma emerges from the fundamental theory of the strong interaction. One such process is the production of the heavy W boson, which in many cases decays to two quarks. The W boson itself doesn’t interact with the quark-gluon plasma because it doesn’t carry color, but the quark decay products do interact with the plasma and therefore provide an ideal tool to study the space-time evolution of this hot and dense medium. In this project you will use data from the ALICE detector at the LHC and simulated data from generators to study various physics mechanisms that could be happening in the real collisions.
  
=== The Modulation Experiment: Data Analysis ===
+
''Contact: [mailto:marta.verweij@cern.ch Marta Verweij] and [mailto:marco.van.leeuwen@cern.ch Marco van Leeuwen]''
  
There exist a few measurements that suggest an annual modulation in the activity of radioactive sources. With a few groups from the XENON collaboration we have developed four sets of table-top experiments to investigate this effect on a few well known radioactive sources. The experiments are under construction in Purdue University (USA), a mountain top in Switzerland, a beach in Rio de Janeiro and the last one at Nikhef in Amsterdam. We urgently need a master student to (1) analyze the first big data set, and (2) contribute to the first physics paper from the experiment. We are looking for an all-round physicist with interest in both lab-work and data-analysis. The student will directly collaborate with the other groups in this small collaboration (around 10 people), and the goal is to have the first physics publication ready by the end of the project.
+
=== ALICE: Jet Quenching with Machine Learning ===
  
''Contact: [mailto:z37@nikhef.nl Auke Colijn]''
+
Machine learning applications are rising steadily as a vital tool in the field of data science but are relatively new in the particle physics community. In this project machine learning tools will be used to gain insights into the modification of a parton shower in the quark-gluon plasma (QGP). The QGP is created in high-energy nuclear collisions and only lives for a very short period of time. Highly energetic partons created in the same collisions interact with the plasma while they traverse it and are observed as collimated sprays of particles, known as jets, in the detector. One of the key recent insights is that the internal structure of jets provides information about the evolution of the QGP. With data recorded by the ALICE experiment, you will use jet substructure techniques in combination with machine learning algorithms to dissect the structure of the QGP. Machine learning will be used to select the regions of radiation phase space that are affected by the presence of the QGP.
  
=== The XENON Dark Matter Experiment: Data Analysis ===
+
''Contact: [mailto:marta.verweij@cern.ch Marta Verweij] and [mailto:marco.van.leeuwen@cern.ch Marco van Leeuwen]''
  
The XENON collaboration has started operating the XENON1T detector, the world’s most sensitive direct detection dark matter experiment. The Nikhef group is playing an important role in this experiment. The detector operates at the Gran Sasso underground laboratory and consists of a so-called dual-phase xenon time-projection chamber filled with 3200kg of ultra-pure xenon. Our group has an opening for a motivated MSc student to do data-analysis on this new detector. The work will consist of understanding the signals that come out of the detector and in particular focus on the so-called double scatter events. We are interested in developing methods in order to interpret the response of the detector better and are developing sophisticated statistical tools to do this. This work will include looking at data and developing new algorithms in our Python-based analysis tool. There will also be opportunity to do data-taking shifts at the Gran Sasso underground laboratory in Italy.  
+
=== Lepton Collider: Pixel TPC testbeam ===
 +
In the Lepton Collider group at Nikhef we work on a tracking detector for a future collider (e.g. the ILC in Japan). We are developing a gaseous Time Projection Chamber with a pixel readout. At Nikhef we have built an 8-quad GridPix module based on the Timepix3 chip, which is a detector of about 20 cm x 40 cm x 10 cm in size. In August 2020 we will test the device at the DESY particle accelerator in Hamburg. For the project you could work on preparations for the test beam (e.g. running the data acquisition and performing data monitoring using our setup in the lab). The next topics will be participation in the data taking during the test beam at DESY, the analysis of the data using C++ and ROOT and - finally - publication of the results in a scientific journal.
  
''Contact: [mailto:decowski@nikhef.nl Patrick Decowski]''
+
Our latest paper can be found at https://www.nikhef.nl/~s01/quad_paper.pdf.
  
=== XAMS Dark Matter R&D Setup ===
+
''Contact: [mailto:Peter.Kluit@nikhef.nl Peter Kluit] and Kees Ligtenberg''
 +
 +
=== Dark Matter: Sensitive tests of wavelength-shifting properties of materials for dark matter detectors ===
 +
Rare event search experiments that look for neutrino and dark matter interactions are performed with highly sensitive detector systems, often relying on scintillators, especially liquid noble gases, to detect particle interactions. Detectors consist of structural materials that are assumed to be optically passive, and light detection systems that use reflectors, light detectors, and sometimes, wavelength-shifting materials. MSc theses are available related to measuring the efficiency of light detection systems that might be used in future detectors. Furthermore, measurements to ensure that presumably passive materials do not fluoresce, at the low level relevant to the detectors, can be done. Part of the thesis work can include Monte Carlo simulations and data analysis for current and upcoming dark matter detectors, to study the effect of different levels of desired and nuisance wavelength shifting. In this project, students will acquire skills in photon detection, wavelength shifting technologies, vacuum systems, UV and extreme-UV optics, detector design, and optionally in C++ programming, data analysis, and Monte Carlo techniques.
  
The Amsterdam Dark Matter group has built an R&D xenon detector at Nikhef. The detector is a dual-phase xenon time-projection chamber and contains about 4kg of ultra-pure liquid xenon. We plan to use this detector for the development of new detection techniques (such as utilizing new photosensors) and to improve the understanding of the response of liquid xenon to various forms of radiation. The results could be directly used in the XENON experiment, the world’s most sensitive direct detection dark matter experiment at the Gran Sasso underground laboratory. We have several interesting projects for this facility. We are looking for someone who is interested in working in a laboratory on high-tech equipment, modifying the detector, taking data and analyzing the data him/herself. You will "own" this experiment.  
+
''Contact: [mailto:Tina.Pollmann@tum.de Tina Pollmann] and [mailto:decowski@nikhef.nl Patrick Decowski]''
  
''Contact: [mailto:decowski@nikhef.nl Patrick Decowski]''
+
=== Dark Matter: Signal reconstruction in XENONnT ===
 +
The next generation direct detection dark matter experiment - XENONnT - comprises close to 500 photomultiplier tubes (PMTs) in the main detector volume. These PMTs are configured to be able to detect even single photons. When a single photoelectron (PE) is detected, the recorded signal (a pulse) is convolved with the detector response of the PMT. Due to this detector response, the pulse shape of a single PE is spread out in time. For XENONnT we would like to explore the possibility of implementing a digital (software) filter to deconvolve the detected pulse back to the “true” instantaneous shape (without the detector spread). This is a virtually unexplored new step in the XENON analysis framework. Later in the analysis framework these pulses from all the PMTs are combined into a signal referred to as a ‘peak’. For XENONnT it is essential to discriminate extremely well between the two types of peaks caused by interactions in the detector: a prompt primary scintillation signal (S1) and a secondary ionization signal (S2). The parameters in the software have not - as of the time of writing - been optimized for the XENONnT detector conditions.

The student would investigate how a deconvolution filter would benefit the XENONnT analysis framework and develop such a filter. Furthermore, the student will work on the classification of these signals, to fully exploit the XENONnT detector. This will be done with simulated data at first, but may later even be performed on actual XENONnT data. As an extension, the possibility of applying machine learning to correctly distinguish between the two signals could be explored. This is a data-analysis oriented project where Python skills are paramount.
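The core of the deconvolution idea can be sketched on a synthetic waveform; the example below uses an invented single-PE template and a simple regularised FFT deconvolution (one possible approach, not the XENONnT software):

<syntaxhighlight lang="python">
# Synthetic waveform deconvolution sketch; template, noise and regularisation
# are invented numbers.
import numpy as np

rng = np.random.default_rng(0)
n = 1000
t = np.arange(n)

template = np.exp(-t / 15.0) - np.exp(-t / 3.0)   # toy single-PE pulse shape
template /= template.sum()

truth = np.zeros(n)                               # "true" photon arrival pattern
truth[[200, 230, 600]] = [5.0, 3.0, 8.0]

observed = np.convolve(truth, template)[:n] + rng.normal(0, 0.002, n)

# Wiener-style deconvolution: divide by the template response in frequency
# space, regularised so that noise-dominated frequencies are not amplified
H = np.fft.rfft(template, n)
Y = np.fft.rfft(observed, n)
reg = 1e-3                                        # tunable regularisation
recovered = np.fft.irfft(Y * np.conj(H) / (np.abs(H)**2 + reg), n)

for peak, amp in zip((200, 230, 600), (5.0, 3.0, 8.0)):
    window = recovered[peak - 10:peak + 10].sum()
    print(f"true charge {amp:.1f}  recovered (integrated) {window:.2f}")
</syntaxhighlight>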
  
=== LHCb: A Scintillator Fibers Tracker ===
+
''Contact: [mailto:decowski@nikhef.nl Patrick Decowski] and [mailto:j.angevaare@nikhef.nl Joran Angevaare]''
  
The LHCb collaboration is upgrading the present tracking system
+
=== Dark Matter: XAMS  R&D Setup ===
constructing a new tracker based on scintillating fibers combined
+
The Amsterdam Dark Matter group operates an R&D xenon detector at Nikhef. The detector is a dual-phase xenon time-projection chamber and contains about 4kg of ultra-pure liquid xenon. We use this detector for the development of new detection techniques - such as utilizing our newly installed silicon photomultipliers - and to improve the understanding of the response of liquid xenon to various forms of radiation. The results could be directly used in the XENONnT experiment, the world’s most sensitive direct detection dark matter experiment at the Gran Sasso underground laboratory, or for future Dark Matter experiments like DARWIN. We have several interesting projects for this facility. We are looking for someone who is interested in working in a laboratory on high-tech equipment, modifying the detector, taking data and analyzing the data him/herself. You will "own" this experiment.  
with silicon photo-multipliers (SiPM): the SciFi Tracker!
+
Nikhef plays a key role in the project, as we will build the
+
SciFi fibers modules, the cold-box enclosure housing the SiPMs,
+
and a large part of the on-detector electronics. In all these
+
areas, interesting test hardware and software has to be realized,
+
and several research topics for a Master project are available,
+
taking the student in contact with state-of-the-art particle detectors,
+
in a large team of physicists and engineers. Possible collaborations
+
with the Nikhef R&D group can also be envisaged.
+
  
''Contact: [mailto:antonio@nikhef.nl Antonio Pellegrino]''
+
''Contact: [mailto:decowski@nikhef.nl Patrick Decowski] and [mailto:z37@nikhef.nl Auke Colijn]''
  
=== LHCb: Discovery of the Decay Lb --> p Ds+ ===
+
=== Dark Matter: DARWIN Sensitivity Studies ===
This project aims to measure the branching fraction of the decay Lb->p Ds+ (bud -> uud + ds).  
+
DARWIN is the "ultimate" direct detection dark matter experiment, with the goal to reach the so-called "neutrino floor", when neutrinos become a hard-to-reduce background. The large and exquisitely clean xenon mass will allow DARWIN to also be sensitive to other physics signals such as solar neutrinos, double-beta decay from Xe-136, axions and axion-like particles etc. While the experiment will only start in 2025, we are in the midst of optimizing the experiment, which is driven by simulations. We have an opening for a student to work on the GEANT4 Monte Carlo simulations for DARWIN, as part of a simulation team together with the University of Freiburg and Zurich. We are also working on a "fast simulation" that could be included in this framework. It is your opportunity to steer the optimization of a large and unique experiment. This project requires good programming skills (Python and C++) and data analysis/physics interpretation skills.
The decay Lb->p Ds+ is quite rare, because it occurs through the transition of a b-quark to a u-quark.  
+
It has not been measured yet (although some LHCb colleagues claim to have seen it).
+
This decay is interesting, because
+
  
1) It is sensitive to the b->u coupling (CKM element Vub), whose determination is heavily debated.
+
''Contact: [mailto:decowski@nikhef.nl Patrick Decowski] and [mailto:z37@nikhef.nl Auke Colijn]''
2) It can quantify non-factorisable QCD effects in b-baryon decays.
+
  
The decay is closely related to B0->pi-Ds+, which proceeds through a similar Feynman diagram.  
+
=== Dark Matter: Fast simulation studies ===
Also, the final state of B0->pi-Ds+ is almost identical to Lb->p Ds+.
+
For Dark Matter experiments it is crucial to understand sources of backgrounds in great detail. The most common way to study the effect of backgrounds on the Dark Matter sensitivity is by the use of Monte Carlo simulations.
The aim is to determine the relative branching fraction of Lb->pDs+ with respect to B0->D+pi- decays,  
Unfortunately, the standard Monte Carlo techniques are extremely inefficient: one sometimes needs to simulate millions of events before one background event appears in the Dark Matter search region. We have developed a Monte Carlo technique that accelerates this process by up to a factor of 1000. The method has been validated on very simple and unrealistic detector models. The goal of this project is to make a realistic detector model for the fast detector simulations. For this we are looking for a student with good programming skills, an interest in a software project, and the desire to deeply understand the analysis of Dark Matter experimental data.
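One family of techniques for this kind of acceleration is importance sampling; a generic one-dimensional illustration is given below (the actual Nikhef fast-simulation method may differ). In the real problem the simple "cut" is replaced by the requirement that a simulated event ends up in the dark matter search region.

<syntaxhighlight lang="python">
# Generic variance-reduction illustration: estimate a small "leakage"
# probability with naive Monte Carlo and with importance sampling.
import numpy as np

rng = np.random.default_rng(5)
lam, cut, n = 1.0, 12.0, 100_000
p_true = np.exp(-lam * cut)                   # known analytically for this toy

# naive MC: essentially no events survive the cut
x = rng.exponential(1 / lam, n)
p_naive = np.mean(x > cut)

# importance sampling: draw from a broader distribution g and reweight by f/g
lam_g = 0.25
xg = rng.exponential(1 / lam_g, n)
w = (lam * np.exp(-lam * xg)) / (lam_g * np.exp(-lam_g * xg))
contrib = w * (xg > cut)
p_is, err_is = contrib.mean(), contrib.std() / np.sqrt(n)

print(f"true       {p_true:.2e}")
print(f"naive MC   {p_naive:.2e}   ({int(p_naive * n)} surviving events)")
print(f"imp. samp. {p_is:.2e} +- {err_is:.1e}")
</syntaxhighlight>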
in close collaboration with the PhD (who will study BR(B0->pi-Ds+)/BR(B0->D+pi-) ).
+
This project will result in a journal publication on behalf of the LHCb collaboration, written by you.  
+
For this project computer skills are needed. The ROOT programme and C++ and/or Python macros are used.  
+
This is a project that is closely related to previous analyses in the group.  
+
Weekly video meetings with CERN coordinate the efforts within the LHCb collaboration.
+
Relevant information:
+
  
[1] R.Aaij et al. [LHCb Collaboration], ``Determination of the branching fractions of B0s->DsK and B0->DsK, JHEP 05 (2015) 019 [arXiv:1412.7654 [hep-ex]].
+
''Contact: [mailto:decowski@nikhef.nl Patrick Decowski] and [mailto:z37@nikhef.nl Auke Colijn]''
[2] R. Fleischer, N. Serra and N. Tuning, ``Tests of Factorization and SU(3) Relations in B Decays into Heavy-Light Final States, Phys. Rev. D 83, 014017 (2011) [arXiv:1012.2784 [hep-ph]].
+
  
''Contact: [mailto:h71@nikhef.nl Niels Tuning and Lennaert Bel and Mick Mulder]''
+
=== Dark Matter & Amsterdam Scientific Instruments: Simulations for Industry ===
 +
In the Nikhef Dark Matter group we have built up an extensive expertise with Monte Carlo simulations of ionizing radiation. Although these simulations have the aim to estimate background levels in our XENON experiments, the same techniques can be applied to study radiation transport in industrial devices. Amsterdam Scientific Instruments (ASI) is a company at Science Park that develops and sells radiation imaging equipment that is used amongst others in electron microscopy. For this application ASI needs a detailed study of gamma ray backgrounds to optimize shielding for their products. The project aims at optimizing a shielding design based on GEANT4 simulations. The results may be implemented in next generation products of ASI. We are looking for a student with preferably strong computing skills, and with an interest in science-industrial collaboration.
  
=== LHCb: Measurement of B0 -> pi Ds- , the b -> u quark transition ===
+
''Contact: [mailto:decowski@nikhef.nl Patrick Decowski] and [mailto:z37@nikhef.nl Auke Colijn]''
  
This project aims to measure the branching fraction of the decay B0->pi Ds+.
+
=== The Modulation experiment: Data Analysis  ===
This decay is closely related to Lb->p Ds+ (see above), and close collaboration between the two master projects is foreseen.
+
For years there have been controversial claims of potential new physics based on time-varying decay rates of radioactive sources on top of ordinary exponential decay. While some of these claims have been refuted, others have still to be confirmed or falsified. To this end, a dedicated experiment - the modulation experiment - has been designed and has been operational for the past four years. Using four identical and independent setups, the experiment is almost ready for a final analysis to conclude on these claims. In this project the student will perform this analysis, preferably resulting in a conclusive paper. This will require combining the data of the four setups and close collaboration with the small group formed by the four institutes involved (Purdue University (USA), Universität Zürich (Switzerland), Centro Brasileiro de Pesquisas Físicas (Brazil) and Nikhef). This project is data-analysis oriented. Additionally, lab skills may be required, as one of the setups is situated at Nikhef.
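Conceptually the final fit boils down to something like the toy below (simulated daily rates with invented numbers and a small modulation injected purely for illustration):

<syntaxhighlight lang="python">
# Toy decay-rate fit; source activity, half-life and modulation are invented.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(8)
t = np.arange(0, 4 * 365, 1.0)                 # days, four years of data
half_life = 10.0 * 365                         # toy half-life [days]

def rate(t, r0, amp, phase):
    return (r0 * np.exp(-np.log(2) * t / half_life)
               * (1 + amp * np.cos(2 * np.pi * (t - phase) / 365.25)))

data = rng.poisson(rate(t, 1e4, 0.005, 100.0)).astype(float)   # 0.5% modulation injected

popt, pcov = curve_fit(rate, t, data, p0=[1e4, 0.01, 50.0],
                       sigma=np.sqrt(np.clip(data, 1, None)))
print(f"fitted modulation amplitude: {popt[1]:.4f} +- {np.sqrt(pcov[1, 1]):.4f}")
</syntaxhighlight>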
This research was started by a previous master student.  
+
The new measurement will finish the work, and include the new data from 2015 and 2016.
+
  
See Mick Mulder's [http://www.nikhef.nl/pub/experiments/bfys/lhcb/Theses/master/2015_MickMulder.pdf master thesis] for more information.
+
''Contact: [mailto:z37@nikhef.nl Auke Colijn] and [mailto:j.angevaare@nikhef.nl Joran Angevaare]''
  
''Contact: [mailto:h71@nikhef.nl Niels Tuning and Lennaert Bel and Mick Mulder]''
+
=== Detector R&D: Laser Interferometer Space Antenna (LISA) ===
 +
The space-based gravitational wave antenna LISA is, without a doubt, one of the most challenging space missions ever proposed. ESA plans to launch, around 2030, three spacecraft separated by a few million kilometers, which will measure tiny variations in the distances between test masses located in each satellite to detect the gravitational waves from sources such as supermassive black holes. The triangular constellation of the LISA mission is dynamic, requiring constant fine-tuning of the pointing of the laser links between the spacecraft and a simultaneous refocusing of the telescope. The noise sources related to the laser links are expected to provide a dominant contribution to the LISA performance.
 +
An update and extension of the LISA science simulation software is needed to assess the hardware development for LISA at Nikhef, TNO, and SRON. A position is therefore available for a master student to study the impact of instrumental noise on the performance of LISA. Realistic simulations, based on hardware (noise) characterization measurements performed at TNO, will be carried out and compared to the signals expected from tantalizing gravitational wave sources.
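A minimal example of the kind of bookkeeping involved (invented noise levels, not the LISA simulation software): generate a displacement-noise time series and estimate its amplitude spectral density, the quantity that is typically compared against the instrument noise budget.

<syntaxhighlight lang="python">
# Toy noise time series and spectral estimate; the noise levels are invented.
import numpy as np
from scipy import signal

rng = np.random.default_rng(2)
fs, n = 1.0, 2**18                                   # 1 Hz sampling, ~3 days
white = 1e-11 * rng.normal(size=n)                   # ~10 pm/sqrt(Hz)-level white noise
drift = np.cumsum(1e-13 * rng.normal(size=n))        # slow random-walk component
x = white + drift

f, psd = signal.welch(x, fs=fs, nperseg=2**14)
asd = np.sqrt(psd)                                   # amplitude spectral density

for target in (1e-4, 1e-3, 1e-2):                    # LISA band is in the mHz range
    i = np.argmin(np.abs(f - target))
    print(f"ASD at {f[i]:.1e} Hz: {asd[i]:.2e} m/sqrt(Hz)")
</syntaxhighlight>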
  
=== LHCb: A search for heavy neutrinos in the decay of W bosons at LHCb ===
+
''Contact: [mailto:nielsvb@nikhef.nl Niels van Bakel],[mailto:ernst-jan.buis@tno.nl  Ernst-Jan Buis]''
  
Neutrinos are arguably the most mysterious of all known fundamental fermions as they are both much lighter than all others and only weakly interacting. It is thought that the tiny mass of neutrinos can be explained by their mixing with so-far unknown, much heavier, neutrino-like particles. In this research proposal we look for these new neutrinos in the decay of the SM W-boson using data with the LHCb experiment at CERN. The W boson is assumed to decay to a heavy neutrino and a muon. The heavy neutrino subsequently decays to a muon and a pair of quarks. Both like-sign and opposite-sign muon pairs will be studied. The result of the analysis will either be a limit on the production of the new neutrinos or the discovery of something entirely new.
+
=== Detector R&D: Spectral X-ray imaging - Looking at colours the eyes can't see ===
 +
When a conventional X-ray image is taken, one acquires an image that only shows intensities: a ‘black and white’ image. Most of the information carried by the photon energy is lost. The lack of spectral information can result in an ambiguity between the material composition and the amount of material in the sample. If the X-ray intensity can be measured as a function of energy (i.e. a ‘colour’ X-ray image), more information can be obtained from a sample. This translates to a lower required dose and/or a better understanding of the sample that is being investigated. For example, two fields that can benefit from spectral X-ray imaging are mammography and real-time CT.
  
''Contact: [mailto:wouterh@nikhef.nl Wouter Hulsbergen and Elena Dall'Occo]''
+
Detectors using Medipix3 chips are used for X-ray imaging. Such a detector is composed of a pixel chip with a semiconductor sensor bonded on top of it. Photoelectric absorption of X-rays in the sensor results in an amount of charge being released that is proportional to the X-ray energy. This charge is registered by a pixel. Depending on configuration, in each pixel 1, 2, 4 or 8 detection thresholds can be set and so, a number of energy bins can be defined. One of the challenges is to maximise X-ray image quality by minimising effects caused by dispersion in the sensitivity of the pixels. The effects of this dispersion can partly be compensated by applying a specific measurement method in combination with image post processing.
  
 +
You can work on improving measurement methods and on improving post-processing methods. There is flexibility in the planned work depending on the skillset you have. The aim is to get the best X-ray energy resolution over the entire pixel chip. This in turn improves image quality and therefore X-ray CT reconstruction quality.
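A simplified picture of why per-pixel corrections matter is sketched below (a synthetic pixel response with invented gain and offset dispersion, not Medipix3 calibration code):

<syntaxhighlight lang="python">
# Synthetic per-pixel response; gain/offset spreads and noise are invented.
import numpy as np

rng = np.random.default_rng(4)
n_pix = 64 * 64
gain   = rng.normal(1.0, 0.03, n_pix)     # per-pixel gain spread
offset = rng.normal(0.0, 0.8,  n_pix)     # per-pixel offset [keV]

E_photon, hits = 25.0, 200                # monochromatic illumination [keV]
noise = rng.normal(0, 0.5, (n_pix, hits)) # per-hit electronic noise [keV]
E_meas = gain[:, None] * E_photon + offset[:, None] + noise

raw = E_meas.ravel()
equalised = ((E_meas - offset[:, None]) / gain[:, None]).ravel()

print(f"spectral width without equalisation: {raw.std():.2f} keV")
print(f"spectral width with equalisation:    {equalised.std():.2f} keV")
</syntaxhighlight>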
  
=== ALICE : Particle polarisation in strong magnetic fields ===
+
Important note: Much of this work is to be performed in the laboratory. For as long as corona safety measures are active, the labs at Nikhef are not accessible for students and this project cannot be worked on except for post-processing in software. Currently we hope that the situation will have improved by August.  
When two atomic nuclei, moving in opposite directions, collide off-center, the Quark Gluon Plasma (QGP) created in the overlap zone is expected to rotate. The nucleons not participating in the collision represent electric currents generating an intense magnetic field. The magnetic field could be as large as 10^{18} gauss, orders of magnitude larger than the strongest magnetic fields found in astronomical objects. Proving the existence of the rotation and/or the magnetic field could be done by checking if particles with spin are aligned with the rotation axis or if charged particles have different production rates relative to the direction of the magnetic field. In particular, the longitudinal and transverse polarisation of the Lambda^0 baryon will be studied. This project requires some affinity with computer programming.
+
Please see the following videos for examples of our work:
  
''Contact: [mailto:Paul.Kuijer@nikhef.nl Paul Kuijer and Panos Christakoglou]''
+
https://youtu.be/cgwQvjfUYns
  
=== ALICE : Blast-Wave Model in heavy-ion collisions ===
+
https://youtu.be/tf9ZLALPVNY
The goal of heavy-ion physics is to study the Quark Gluon Plasma (QGP), a hot and dense medium where quarks and gluons move freely over large distances, larger than the typical size of a hadron. Hydrodynamic simulations expect that the QGP will expand under its own pressure, and cool while expanding. These simulations are particularly successful in describing some of the key observables measured experimentally, such as particle spectra and elliptic flow. A reasonable reproduction of the same observables is also achieved with models that use parameterisations that resemble the hydrodynamical evolution of the system assuming a given freeze-out scenario, usually referred to as blast-wave models. The goal of this project is to work on different blast wave parametrisations, test their dependence on the input parameters and extend their applicability by including more observables studied in heavy-ion collisions in the global fit.  
+
  
''Contact: [mailto:Panos.Christakoglou@nikhef.nl Panos Christakoglou and Paul Kuijer]''
+
https://youtu.be/vjPX7SxvSUk
  
=== ALICE : Higher Harmonic Flow ===
+
https://youtu.be/LqjNVSm7Hoo
When two ions collide, if the impact parameter is not zero, the overlap region is not isotropic. This spatial anisotropy of the overlap region is transformed into an anisotropy in momentum space through interactions between partons and at a later stage between the produced particles. It was recently realized that the overlap region of the colliding nuclei exhibits an irregular shape. These irregularities originate from the initial density profile of nucleons participating in the collision which is not smooth and is different from one event to the other. The resulting higher order flow harmonics (e.g. v3, v4, and v5, usually referred to as triangular, quadrangular, and pentangular flow, respectively) and in particular their transverse momentum dependence are argued to be more sensitive probes than elliptic flow not only of the initial geometry and its fluctuations but also of shear viscosity over entropy density (η/s). The goal of this project is to study v3, v4, and v5 for identified particles in collisions of heavy-ions at the LHC.
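A toy version of a flow measurement (generated particles with known symmetry planes, not ALICE data) shows how v2 and v3 are extracted in practice. In real data the symmetry planes are not known event by event and have to be estimated from the particles themselves (for example with Q-vectors), which introduces resolution corrections; the toy below sidesteps that by construction.

<syntaxhighlight lang="python">
# Toy flow analysis on generated events; multiplicities and v_n are invented.
import numpy as np

rng = np.random.default_rng(6)
v2_true, v3_true = 0.10, 0.03
n_events, mult = 2000, 500

def sample_phi(v2, v3, psi2, psi3, size):
    """Accept-reject sampling of dN/dphi ~ 1 + 2 v2 cos2(phi-psi2) + 2 v3 cos3(phi-psi3)."""
    out = []
    while len(out) < size:
        phi = rng.uniform(0, 2 * np.pi, 4 * size)
        pdf = 1 + 2 * v2 * np.cos(2 * (phi - psi2)) + 2 * v3 * np.cos(3 * (phi - psi3))
        keep = rng.uniform(0, 1 + 2 * v2 + 2 * v3, phi.size) < pdf
        out.extend(phi[keep][: size - len(out)])
    return np.asarray(out)

v2_meas, v3_meas = [], []
for _ in range(n_events):
    psi2, psi3 = rng.uniform(0, np.pi), rng.uniform(0, 2 * np.pi / 3)
    phi = sample_phi(v2_true, v3_true, psi2, psi3, mult)
    v2_meas.append(np.mean(np.cos(2 * (phi - psi2))))   # v2 w.r.t. known plane
    v3_meas.append(np.mean(np.cos(3 * (phi - psi3))))   # v3 w.r.t. known plane

print(f"v2 = {np.mean(v2_meas):.3f} +- {np.std(v2_meas) / np.sqrt(n_events):.3f}")
print(f"v3 = {np.mean(v3_meas):.3f} +- {np.std(v3_meas) / np.sqrt(n_events):.3f}")
</syntaxhighlight>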
+
  
''Contact: [mailto:Panos.Christakoglou@nikhef.nl Panos Christakoglou and Paul Kuijer]''
+
''Contact: [mailto:martinfr@nikhef.nl Martin Fransen],[mailto:navritb@nikhef.nl Navrit Bal]''
  
=== ALICE : Chiral Magnetic Effect and the Strong CP Problem ===
+
=== Detector R&D: Holographic projector ===
Within the Standard Model, symmetries, such as the combination of charge conjugation (C) and parity (P), known as CP-symmetry, are considered to be key principles of particle physics. The violation of CP-invariance can be accommodated within the Standard Model in the weak and the strong interactions; however, it has only been confirmed experimentally in the former. Theory predicts that in heavy-ion collisions gluonic fields create domains where the parity symmetry is locally violated. This manifests itself in a charge-dependent asymmetry in the production of particles relative to the reaction plane, which is called the Chiral Magnetic Effect (CME). The first experimental results from STAR (RHIC) and ALICE (LHC) are consistent with the expectations from the CME, but background effects have not yet been properly disentangled. In this project you will develop and test new observables of the CME, trying to understand and discriminate the background sources that affect such a measurement.
+
  
''Contact: [mailto:Panos.Christakoglou@nikhef.nl Panos Christakoglou and Paul Kuijer]''
+
A difficulty in projecting holograms (based on the interference of light) is the required dense pixel pitch of a projector. One would need a pixel pitch of less than 200 nanometer. With larger pixels, artefacts occur due to spatial under sampling. A pixel pitch of 200 nanometer is difficult, if not impossible, to achieve, especially for larger areas. Another challenge is the massive amount of computing power that would be required to control such a dense pixel matrix.
  
=== DR&D : Medical X-ray Imaging ===
+
A new holographic projection method has been developed that reduces under sampling artefacts for projectors with a ‘low’ pixel density. It uses 'pixels' at random but known positions, resulting in an array of (coherent) light points that lacks (or has suppressed) spatial periodicity. As a result a holographic projector can be built with a significantly lower pixel density and correspondingly less required computing power. This could bring holography within reach of many applications like display, lithography, 3D printing, metrology, etc.
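A one-dimensional toy calculation (invented emitter counts and dimensions, not a design tool) shows the effect: a periodic array produces strong grating lobes, while the same number of emitters at random positions spreads that power into a low background.

<syntaxhighlight lang="python">
# 1D far-field pattern of a periodic vs a random emitter array; toy numbers only.
import numpy as np

wavelength = 0.5e-6                        # 500 nm light
k = 2 * np.pi / wavelength
n_emit, aperture = 200, 200e-6             # 200 emitters over 200 um (~1 um pitch)

rng = np.random.default_rng(9)
x_periodic = np.linspace(0, aperture, n_emit)
x_random   = np.sort(rng.uniform(0, aperture, n_emit))   # random but known positions

theta = np.linspace(-0.6, 0.6, 4001)       # observation angles [rad]

def pattern(x):
    phases = np.exp(1j * k * np.outer(np.sin(theta), x))
    return np.abs(phases.sum(axis=1))**2 / n_emit**2      # main lobe normalised to 1

p_per, p_rnd = pattern(x_periodic), pattern(x_random)
side = np.abs(theta) > 0.01                # exclude the intended central lobe

print("largest spurious lobe, periodic array:", round(float(p_per[side].max()), 3))
print("largest spurious lobe, random array  :", round(float(p_rnd[side].max()), 3))
</syntaxhighlight>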
With the advent of true multi-threshold X-ray detectors, spectral imaging at low dose, including spectral CT, is now around the corner. The Medipix3RX chip from the Medipix Collaboration (CERN) features up to 8 programmable thresholds, which can select energy bins without a threshold scan. A number of projects could be derived from the R&D activities with the Medipix3RX within the Nikhef R&D group on X-ray imaging for medical applications:
* Medipix3RX characterization in all its operation modes and gains.
* Spectral CT and scarce sampling 3D reconstruction.
* Charge sharing: the charge-sum capabilities of the chip can be exploited to further understand the problem of charge sharing in pixelized detectors. A combination of the characterization of the charge-summing mode plus the use of both planar and 3D sensors, in the light of MC simulations, could reveal valuable information about charge sharing.
+
  
''Contact: [mailto:koffeman@nikhef.nl Els Koffeman],[mailto:martinfr@nihef.nl Martin Fransen]''
+
Of course, nothing comes for free: with fewer pixels, holograms become noisier and the contrast will be reduced (not all light ends up in the hologram). The questions are: How does the quality of a hologram depend on pixel density? How do we determine projector requirements based on requirements for hologram quality?
  
=== DR&D : Compton camera ===
+
Requirements for a hologram can be expressed in terms of: Noise, contrast, resolution, suppression of under sampling artefacts, etc..  
In the Nikhef R&D group we develop instrumentation for particle physics, but we also investigate how particle physics detectors can be used for different purposes. A successful development is the Medipix chip, which can be used in X-ray imaging. For use in large-scale medical applications, however, Compton scattering limits the energy-resolving possibilities. You will investigate whether it is in principle possible to design an X-ray application that detects the Compton-scattered electron and the absorbed photon. Your ideas can be tested in practice in the lab, where an X-ray scan can be performed.
+
  
''Contact: [mailto:koffeman@nikhef.nl Els Koffeman]''
+
For this project we have built a proof of concept holographic emitter. This set-up will be used to verify simulation results (and also to project some cool holograms of course ;-).  
  
=== KM3NeT : Reconstruction of first neutrino interactions in KM3NeT ===
+
Examples of what you could be working on:
  
The neutrino telescope KM3NeT is under construction in the Mediterranean Sea, aiming to detect cosmic neutrinos. Its first two strings with sensitive photodetectors were deployed in 2015 and 2016; in total 30 are to be deployed by the end of next year. Already these few strings make it possible to reconstruct the abundant muons stemming from interactions of cosmic rays with the atmosphere and to identify neutrino interactions. In order to identify neutrinos, an accurate reconstruction and an optimal understanding of the backgrounds are crucial. In this project we will use the available data to identify and reconstruct the first neutrino interactions in the KM3NeT detector, and with this pave the path towards neutrino astronomy.
+
a. Calibration/characterisation of the current projector and compensation of systematic errors.
 +
 
 +
b. To realize a phased array of randomly placed light sources the pixel matrix of the projector must be ‘relayed’ onto a mask with apertures at random but precisely known positions. Determine the best possible relaying optics and design an optimized mask accordingly. Factors like deformation of the projected pixel matrix and limitations in resolving power of the lens system must be taken into account for mask design.
 +
 
 +
Important note: Much of this work is to be performed in the laboratory. For as long as corona safety measures are active, the labs at Nikhef are not accessible for students and this project cannot be worked on. Currently we hope that the situation will have improved by august.
 +
 
 +
''Contact: [mailto:martinfr@nikhef.nl Martin Fransen]''
 +
 
 +
=== Theory: The Effective Field Theory Pathway to New Physics at the LHC ===

A promising framework to parametrise in a robust and model-independent way deviations from the Standard Model (SM) induced by new heavy particles is the Standard Model Effective Field Theory (SMEFT). In this formalism, beyond the SM effects are encapsulated in higher-dimensional operators constructed from SM fields respecting their symmetry properties. In this project, we aim to carry out a global analysis of the SMEFT from high-precision LHC data, including Higgs boson production, flavour observables, and low-energy measurements. This analysis will be carried out in the context of the recently developed SMEFiT approach [1] based on Machine Learning techniques to efficiently explore the complex theory parameter space. The ultimate goal is either to uncover glimpses of new particles or interactions at the LHC, or to derive the most stringent model-independent bounds to date on general theories of New Physics. Of particular interest are novel methods for charting the parameter space [2], the matching to UV-complete theories in explicit BSM scenarios [3], and the interplay between EFT-based model-independent searches for new physics and determinations of the proton structure from LHC data [4].

[1] https://arxiv.org/abs/1901.05965
[2] https://arxiv.org/abs/1906.05296
[3] https://arxiv.org/abs/1908.05588
[4] https://arxiv.org/abs/1905.05215

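As a purely illustrative toy (not the SMEFiT framework), a global EFT analysis amounts to parametrising each observable as a polynomial in the Wilson coefficients and minimising a chi-squared against the measurements; a minimal Python sketch with invented numbers:

<pre>
import numpy as np
from scipy.optimize import minimize

# Toy EFT parametrisation of three observables in two Wilson coefficients c = (c1, c2):
# sigma(c) = sigma_SM + A.c + B.(c*c), with all numbers invented for illustration.
sigma_sm = np.array([1.00, 0.50, 2.00])
A = np.array([[0.10, -0.05], [0.02, 0.08], [-0.04, 0.03]])   # linear EFT corrections
B = np.array([[0.01,  0.02], [0.01, 0.01], [ 0.02, 0.01]])   # quadratic (diagonal) terms
data = np.array([1.05, 0.52, 1.96])
unc  = np.array([0.03, 0.02, 0.05])

def theory(c):
    return sigma_sm + A @ c + B @ (c * c)

def chi2(c):
    return np.sum(((theory(c) - data) / unc) ** 2)

fit = minimize(chi2, x0=np.zeros(2))
print("best-fit Wilson coefficients:", fit.x, " chi2_min =", fit.fun)
</pre>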
''Contact: [mailto:j.rojo@vu.nl Juan Rojo]''
=== Theory: Charting the quark and gluon structure of protons and nuclei with Machine Learning ===

Deepening our knowledge of the partonic content of nucleons and nuclei [1] represents a central endeavour of modern high-energy and nuclear physics, with ramifications in related disciplines such as astroparticle physics. There are two main scientific drivers motivating these investigations of the partonic structure of hadrons. On the one hand, addressing fundamental open issues in our understanding of the strong interactions, such as the origin of the nucleon mass, spin, and transverse structure; the presence of heavy quarks in the nucleon wave function; and the possible onset of novel gluon-dominated dynamical regimes. On the other hand, pinning down with the highest possible precision the substructure of nucleons and nuclei is a central component for theoretical predictions in a wide range of experiments, from proton and heavy-ion collisions at the Large Hadron Collider to ultra-high-energy neutrino interactions at neutrino telescopes. The goal of this project is to exploit Machine Learning and Artificial Intelligence tools [2,3] (neural networks trained by stochastic gradient descent) to pin down the quark and gluon substructure of protons and nuclei using recent measurements from proton-proton and proton-lead collisions at the LHC. Topics of special interest are i) the strange content of protons and nuclei, ii) parton distributions at higher orders in the QCD coupling for precision Higgs physics, iii) the interplay between jet, photon, and top quark production data to pin down the large-x gluon, and iv) charm quarks as a probe of gluon shadowing at small-x. The project also involves developing projections for the Electron-Ion Collider (EIC), a new lepton-nucleus experiment that will start operation in the coming years.

[1] https://arxiv.org/abs/1910.03408
[2] https://arxiv.org/abs/1904.00018
[3] https://arxiv.org/abs/1706.00428

''Contact: [mailto:j.rojo@vu.nl Juan Rojo]''
=== Theory: Machine learning for Electron Microscopy for next-generation materials ===

Machine Learning tools developed and applied for particle physics hold great potential for applications in material science, in particular concerning faithful uncertainty estimation and model training for large parameter spaces. In this project, carried out in collaboration with the group of Dr. Sonia Conesa-Boj from the Kavli Institute of Nanoscience Delft, http://www.conesabojlab.tudelft.nl, we will develop and deploy ML tools for data analysis in Electron Microscopy. We will focus on pinning down the properties of novel quantum materials such as topological insulators and van der Waals materials. Examples of possible applications include model-independent background subtraction in electron-energy-loss spectroscopy, automatic classification of crystalline structures, and enhancing spatial and spectral resolution using convolutional networks.

''Contact: [mailto:j.rojo@vu.nl Juan Rojo]''
=== Theory: The electroweak phase transition and baryogenesis/gravitational wave production ===

In extensions of the Standard Model the electroweak phase transition can be first order and proceed via the nucleation of bubbles. Colliding bubbles can produce gravitational waves [1], and plasma particles interacting with the bubbles can generate a matter-antimatter asymmetry [2]. A detailed understanding of the dynamics of the phase transition is needed to accurately describe these processes. One project is to study QFT at finite temperature and to compare/apply methods that address the non-perturbative IR dynamics of the thermal processes [3,4]. Another project is to calculate the velocity with which the bubbles expand, which is an important parameter for gravitational wave production and baryogenesis. This entails, among other things, tunneling dynamics, (thermal) scattering rates and Boltzmann equations [5].

[1] https://arxiv.org/abs/1705.01783
[2] https://arxiv.org/pdf/hep-ph/0609145.pdf
[3] https://arxiv.org/pdf/1609.06230.pdf
[4] https://arxiv.org/pdf/1612.00466.pdf
[5] https://arxiv.org/pdf/1809.04907.pdf

''Contact: [mailto:mpostma@nikhef.nl Marieke Postma]''
=== Theory: Cosmology of the QCD axion ===

The QCD axion provides an elegant solution to the strong CP problem in QCD [1]. This project focuses on the cosmological dynamics of this hypothesized axion field, and in particular on the possibility that it can produce both the observed matter-antimatter asymmetry and the dark matter abundance in our universe [2,3].

[1] https://arxiv.org/abs/1812.02669
[2] https://arxiv.org/pdf/hep-ph/0609145.pdf
[3] https://arxiv.org/pdf/1910.02080.pdf

''Contact: [mailto:mpostma@nikhef.nl Marieke Postma]''
=== Theory: Neutrinos, hierarchy problem and cosmology ===

The electroweak hierarchy problem is absent if the quadratic term in the Higgs potential is generated dynamically. This is achieved in 'the neutrino option' [1], where the Higgs potential stems exclusively from quantum effects of heavy right-handed neutrinos, which can also generate the mass pattern of the observed left-handed neutrinos. The project focuses on model-building aspects (e.g. [2]) and the cosmology (e.g. leptogenesis [3]) of these set-ups.

[1] https://arxiv.org/pdf/1703.10924.pdf
[2] https://arxiv.org/pdf/1807.11490.pdf
[3] https://arxiv.org/pdf/1905.12642.pdf

''Contact: [mailto:mpostma@nikhef.nl Marieke Postma]''
=== KM3NeT: Reconstruction of first neutrino interactions in KM3NeT ===

The neutrino telescope KM3NeT is under construction in the Mediterranean Sea, aiming to detect cosmic neutrinos. Its first few strings with sensitive photodetectors have been deployed at both the Italian and the French detector sites. Already these few strings make it possible to reconstruct the abundant muons stemming from interactions of cosmic rays with the atmosphere and to identify neutrino interactions. In this project the available data will be used together with simulations to reconstruct the event topologies, optimally identify and reconstruct the first neutrino interactions in the KM3NeT detector, and thereby pave the path towards accurate neutrino oscillation measurements and neutrino astronomy.
  
 
Programming skills are essential; mostly ROOT and C++ will be used.

''Contact: [mailto:bruijn@nikhef.nl Ronald Bruijn] and [mailto:dosamt@nikhef.nl Dorothea Samtleben]''
=== KM3NeT: Searching for New Heavy Neutrinos ===

In this project we will be searching for a new heavy neutrino, looking at signatures created by atmospheric neutrinos interacting in the detector volume of KM3NeT-ORCA. The aim of this project is to study a specific event topology which appears as double blobs of signals detected separately by the densely instrumented ORCA detector units. We will exploit the tau reconstruction algorithms to verify whether ORCA can detect such signals and to estimate the potential sensitivity of the experiment. Basic knowledge of elementary particle physics and data analysis techniques will be advantageous. Knowledge of programming languages, e.g. Python (and possibly C++), and of ROOT is advantageous but not mandatory.

''Contact: [mailto:suzanbp@nikhef.nl Suzan B. du Pree] [mailto:dveijk@nikhef.nl Daan van Eijk]''

=== KM3NeT: Dark Matter with KM3NeT-ORCA ===

Dark Matter is thought to be everywhere (we should be swimming through it), but we have no idea what it is. Using the good energy and angular resolutions of the KM3NeT neutrino telescope, we can search for Dark Matter signatures that originate from the center of our galaxy. In this project, we will search for such signatures using the reconstructed track and shower events of the KM3NeT-ORCA detector to discover relatively light Dark Matter particles. Since this year, the KM3NeT-ORCA experiment has 6 detection lines under the Mediterranean Sea: fully operational and continuously taking data. Using the available data, it is possible to compare data and simulation for different event topologies and to estimate the experiment's sensitivity. The project is suitable for a student who is interested in exploring new physics scenarios and willing to develop new skills. Basic knowledge of elementary particle physics and data analysis techniques will be advantageous. Knowledge of programming languages, e.g. Python (possibly C++), and of the ROOT data analysis tool is advantageous but not mandatory.

''Contact: [mailto:suzanbp@nikhef.nl Suzan B. du Pree] [mailto:dveijk@nikhef.nl Daan van Eijk]''

=== ANTARES: Analysis of IceCube neutrino sources ===

The only evidence for high-energy neutrinos from cosmic sources so far comes from detections with the IceCube detector. Most of the detected events were reconstructed with a large uncertainty on their direction, which has prevented an association with astrophysical sources. Only for the high-energy muon neutrino candidates has a high directional resolution been achieved, but also for those no significant correlation with astrophysical sources has been detected to date. The ANTARES neutrino telescope has since 2007 continuously taken neutrino data with high angular resolution, which can be exploited to further scrutinize the locations of these neutrino sources. In this project we will address the neutrino sources in a stacked analysis to further probe the origin of the neutrinos with enhanced sensitivity.

Programming skills are essential; mainly C++ and ROOT will be used.

''Contact: [mailto:dosamt@nikhef.nl Dorothea Samtleben]''
  
=== ATLAS: Implementation of Morphing techniques for ATLAS top physics analysis ===

Perhaps the most promising gateway to physics beyond the Standard Model is the top quark, the heaviest elementary particle. Particularly interesting is how the different top quark spin states influence the angular distribution of electrons and other decay products, which can be measured very accurately. New interactions would alter this coupling, leading to decay patterns that are different from those predicted by the Standard Model. At Nikhef we are implementing NLO predictions of the so-called dimension-6 operators to describe several measurable distributions. To confront these distributions with data, a continuous parametrization is required. For this purpose, we want to introduce a novel technique in top quark analysis which is based on Morphing. The project consists of an implementation of Morphing to parametrize the top's angular distributions and a demonstration that the parameters can be extracted in a fitting procedure using (pseudo)data.

Affinity with software is essential; mainly C++ and ROOT will be used.

''Contact: [mailto:h73@nikhef.nl Marcel Vreeswijk]''

=== Gravitational Waves: Unraveling the structure of neutron stars with gravitational wave observations ===

Neutron stars were first discovered more than half a century ago, yet their detailed internal structure largely remains a mystery. A range of theoretical models have been put forward for the neutron star "equation of state", but until recently there was no real way to test them. The direct detection of gravitational waves with LIGO and Virgo has the potential to remedy the situation. When two neutron stars spiral towards each other, they get tidally deformed in a way that is determined by the equation of state, and these deformations get imprinted upon the shape of the gravitational wave that gets emitted. After the first gravitational wave observation of such an event in 2017, several equation of state models could already be ruled out. With expected upgrades of the detectors, we will at some point have access not only to the "inspiral" of binary neutron stars, but to the merger itself, and what happens afterwards. The project will consist of using results from large-scale numerical simulations to come up with a heuristic model for the waveform that describes the inspiral-merger-postmerger process with sufficient accuracy given expected detector sensitivities, and developing data analysis techniques to efficiently use this model to extract information about the neutron star equation of state.

''Contact: [mailto:vdbroeck@nikhef.nl Chris Van Den Broeck]''
  
=== Theory: Probing electroweak symmetry breaking with Higgs pair production at the LHC and beyond ===

The measurement of Higgs pair production will be a cornerstone of the LHC program in the coming years. Double Higgs production provides a crucial window upon the mechanism of electroweak symmetry breaking, and has a unique sensitivity to a number of currently unknown Higgs couplings, like the Higgs self-coupling λ and the coupling between a pair of Higgs bosons and two vector bosons. In this project, the student will explore the feasibility of the measurement of Higgs pair production in the 4b final state, both at the LHC and at a future 100 TeV collider. A number of production modes will be considered, including gluon fusion, vector-boson fusion, as well as Higgs pair production in association with a top-quark pair. A key ingredient of the project will be the exploitation of multivariate techniques such as Artificial Neural Networks and other multivariate discriminants to enhance the ratio of di-Higgs signal over backgrounds.

The project involves estimating the precision that can be achieved in the extraction of the Higgs self-coupling for a number of assumptions about the performance of the LHC detectors, and in particular quantifying the information that can be extracted from the Run II dataset with L = 300 1/fb. A similar approach will be applied to the determination of other unknown properties of the Higgs sector, such as the coupling between two Higgs bosons and two weak vector bosons, as well as the Wilson coefficients of higher-dimensional operators in the Standard Model Effective Field Theory (SM-EFT). Additional information on this project can be found here: [http://pcteserver.mi.infn.it/~nnpdf/VU/2017-MasterProject-HH.pdf].

''Contact: [mailto:j.rojo@vu.nl Juan Rojo]''

=== Gravitational Waves: Searches for gravitational waves from compact binary coalescence ===

Searches for gravitational waves from the mergers of black holes and neutron stars have been extraordinarily successful in the last four years. We are now beginning to study a population of heavy stellar-mass black holes in detail, including understanding how these systems came to form and whether they are consistent with general relativity. Additionally, the detection of binary neutron star mergers is allowing us to probe their extreme matter. However, we’ve only just scratched the surface of possible signals and the new physics they’d allow us to study. The detection of highly spinning and precessing systems would allow us to perform black hole population statistics to an extraordinary degree of accuracy. Detection of sub-solar-mass systems would provide evidence of dark matter. However, these searches are difficult because they require us to work in high-dimensional spaces and develop new statistical methods. There are possibilities for several projects that involve the development and implementation of these new searches as well as the interpretation of the results, particularly in terms of the physics describing compact binary mergers.

''Contact: [mailto:physarah@gmail.com Sarah Caudill]''

=== Theory: Constraining the proton structure with Run II LHC data ===

The non-perturbative dynamics that determine the energy distribution of quarks and gluons inside protons, the so-called parton distribution functions (PDFs), cannot be computed from first principles in Quantum Chromodynamics (QCD), and need to be determined from experimental data. PDFs are an essential ingredient for the scientific program at the Large Hadron Collider (LHC), from Higgs characterisation to searches for New Physics beyond the Standard Model. One recent breakthrough in PDF analysis has been the exploitation of the constraints from LHC data. From direct photons to top quark pair production cross-sections and charmed meson differential distributions, LHC measurements are now a central ingredient of PDF fits, providing important information on poorly-known PDFs such as the large- and small-x gluon or the large-x antiquarks. With the upcoming availability of data from Run II of the LHC, at a center-of-mass energy of 13 TeV, these constraints are expected to become even more stringent.

In this project, the implications of PDF-sensitive measurements at the LHC at 13 TeV will be quantified. Processes that will be considered include jet and dijet production at the multi-TeV scale, single-top quark production, and weak boson production in association with heavy quarks, among several others. These studies will be performed using the NNPDF fitting framework, based on artificial neural networks and genetic algorithms. The phenomenological implications of the improved PDF modelling for Higgs and new physics searches at the LHC will also be explored. Additional information on this project can be found here: [http://pcteserver.mi.infn.it/~nnpdf/talks/MSc_projects/2017-MasterProject-PDFs.pdf].

''Contact: [mailto:j.rojo@vu.nl Juan Rojo]''

=== Gravitational Waves: Computer modelling to design the laser interferometers for the Einstein Telescope ===

A new field of instrument science led to the successful detection of gravitational waves by the LIGO detectors in 2015. We are now preparing the next generation of gravitational wave observatories, such as the Einstein Telescope, with the aim to increase the detector sensitivity by a factor of ten, which would allow us, for example, to detect stellar-mass black holes from early in the universe when the first stars began to form. This ambitious goal requires us to find ways to significantly improve the best laser interferometers in the world.

Gravitational wave detectors, such as LIGO and Virgo, are complex Michelson-type interferometers enhanced with optical cavities. We develop and use numerical models to study these laser interferometers, to invent new optical techniques and to quantify their performance. For example, we synthesize virtual mirror surfaces to study the effects of higher-order optical modes in the interferometers, and we use opto-mechanical models to test schemes for suppressing quantum fluctuations of the light field. We can offer several projects based on numerical modelling of laser interferometers. All projects will be directly linked to the ongoing design of the Einstein Telescope.

''Contact: [mailto:a.freise@nikhef.nl Andreas Freise]''
=== Gravitational Waves: Digging away the noise to find the signal ===

Gravitational wave interferometers are extremely sensitive, but suffer from instrumental issues that produce noise that mimics astrophysical signals. This needs to be solved as much as possible before the data analysis. The problem is that instrumentalists don't know about analysis pipelines, and data analysts don't know about experimental details. We need your help to bridge the gap. This is a good opportunity to learn about both sides and contribute directly to a booming international field. We have several tools and new ideas for correlating noise with the state of the instrument. These need to be developed further, used on years of data, and written up. This will require Python, signal processing and statistics.

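A flavour of the kind of tool involved (a toy sketch with invented channel names and simulated data, not an existing pipeline): correlating an auxiliary instrument channel with the rate of glitches seen in the strain data.

<pre>
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)

# Hypothetical time series, one sample per minute over one day:
aux_channel = rng.normal(size=1440)                                   # e.g. a seismometer readout
glitch_rate = 0.3 * aux_channel + rng.normal(scale=1.0, size=1440)    # toy glitch rate in the strain data

r, p_value = pearsonr(aux_channel, glitch_rate)
print(f"Pearson r = {r:.2f} (p = {p_value:.1e})")
# A significant correlation flags this auxiliary channel as a candidate witness
# of instrumental noise, to be followed up with the instrumentalists.
</pre>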
''Contact: [mailto:swinkels@nikhef.nl Bas Swinkels] and [mailto:physarah@gmail.com Sarah Caudill]''

=== Gravitational Waves: Machine Learning techniques for GW Interferometers ===

The control of suspended optical cavities in the non-linear regime: gravitational wave interferometers are extremely sensitive, but suffer from a very small control range, causing unlocks and reducing the robustness of these instruments. In this project we will use a table-top replica of a suspended optical cavity, located in the new R&D laser lab at Nikhef, to develop a neural network that reconstructs the positions of a free-falling mirror from beam images. A database with simulated beam images can be used to train various neural networks before deployment in the table-top experiment. We are looking for a hands-on and enthusiastic master student, interested in machine learning and experienced in programming languages like Python.

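A minimal sketch (toy data, not the lab software) of the kind of network involved: a small convolutional net that regresses a mirror position from a simulated beam-spot image.

<pre>
import torch
import torch.nn as nn

class BeamNet(nn.Module):
    """Tiny CNN mapping a 64x64 beam image to one position estimate."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(8, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Sequential(nn.Flatten(), nn.Linear(16 * 16 * 16, 1))

    def forward(self, x):
        return self.head(self.features(x))

# Stand-ins for the simulated database: 256 images of 64x64 pixels with a scalar label.
images = torch.randn(256, 1, 64, 64)
positions = torch.randn(256, 1)

model, loss_fn = BeamNet(), nn.MSELoss()
optimiser = torch.optim.Adam(model.parameters(), lr=1e-3)

for epoch in range(5):                    # a few epochs just to show the training loop
    optimiser.zero_grad()
    loss = loss_fn(model(images), positions)
    loss.backward()
    optimiser.step()
    print(f"epoch {epoch}: loss = {loss.item():.3f}")
</pre>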
''Contact: [mailto:r.walet@nikhef.nl Rob Walet] and [mailto:f.l.linde@gmail.com Frank Linde]''
=== VU LaserLaB: Measuring the electric dipole moment (EDM) of the electron ===

In collaboration with Nikhef and the Van Swinderen Institute for Particle Physics and Gravity at the University of Groningen, we have recently started an exciting project to measure the electric dipole moment (EDM) of the electron in cold beams of barium-fluoride molecules. The eEDM, which is predicted by the Standard Model of particle physics to be extremely small, is a powerful probe to explore physics beyond this Standard Model. All extensions to the Standard Model, most prominently supersymmetry, naturally predict an electron EDM that is just below the current experimental limits. We aim to improve on the best current measurement by at least an order of magnitude. To do so we will perform a precision measurement on a slow beam of laser-cooled BaF molecules. With this low-energy precision experiment, we test physics at energies comparable to those of the LHC!

At LaserLaB VU, we are responsible for building and testing a cryogenic source of BaF molecules. The main parts of this source are currently being constructed in the workshop. We are looking for enthusiastic master students to help set up the laser system that will be used to detect BaF. Furthermore, projects are available to perform trajectory simulations to design a lens system that guides the BaF molecules from the exit of the cryogenic source to the experiment.

''Contact: [mailto:H.L.Bethlem@vu.nl Rick Bethlem]''
=== VU LaserLaB: Physics beyond the Standard Model from molecules ===

Our team, with a number of staff members (Ubachs, Eikema, Salumbides, Bethlem, Koelemeij), focuses on precision measurements in the hydrogen molecule and its isotopomers. The work aims at testing the QED calculations of energy levels in H2, D2, T2, HD, etc. with the most precise measurements, where all kinds of experimental laser techniques play a role (cw and pulsed lasers, atomic clocks, frequency combs, molecular beams). Also a target of study is the connection to the "proton size puzzle", which may be solved through studies in the hydrogen molecular isotopes.

In the past half year we have produced a number of important results that are described in the following papers:
* ''Frequency comb (Ramsey-type) electronic excitations in the H2 molecule'': Deep-ultraviolet frequency metrology of H2 for tests of molecular quantum theory, http://www.nat.vu.nl/~wimu/Publications/Altmann-PRL-2018.pdf
* ''Precision measurement of an infrared transition in the HD molecule'': Sub-Doppler frequency metrology in HD for tests of fundamental physics, https://arxiv.org/abs/1712.08438
* ''The first precision study in molecular tritium T2'': Relativistic and QED effects in the fundamental vibration of T2, http://arxiv.org/abs/1803.03161
* ''Dissociation energy of the hydrogen molecule at 10^-9 accuracy'': paper submitted to Phys. Rev. Lett.
* ''Probing QED and fundamental constants through laser spectroscopy of vibrational transitions in HD+'': this is also a study of the hydrogen molecular ion HD+, where important results were obtained not so long ago and where we have a strong activity, http://www.nat.vu.nl/~wimu/Publications/ncomms10385.pdf

These five results mark the various directions we are pursuing, and in all directions we aim at obtaining improvements. Specific projects with students can be defined; those are mostly experimental, although there might be some theoretical tasks, like performing calculations of hyperfine structures.

''Contact: [mailto:w.m.g.ubachs@vu.nl Wim Ubachs] [mailto:k.s.e.eikema@vu.nl Kjeld Eikema] [mailto:h.l.bethlem@vu.nl Rick Bethlem]''
  
 
  
  
 
[[Last years MSc Projects|Last year's MSc Projects]]
 




== Projects with September 2020 start ==

=== ATLAS: Top Spin optimal observables using Artificial Intelligence ===

The top quark has an exceptionally high mass, close to the electroweak symmetry breaking scale, and is therefore sensitive to new physics effects. Theoretically, new physics is well described in the EFT framework [1]. The (EFT) operators are experimentally well accessible in single top t-channel production, where the top quark is produced spin-polarized. The focus at Nikhef is the operator O_{tW} with a possible imaginary phase, leading to CP violation. Experimentally, many angular distributions are reconstructed in the top rest frame to hunt for these effects. We are looking for a limited set of optimal observables. The objective of your Master project would be to find optimal observables using simulated events including the detector effects and possible systematic deviations. All techniques are allowed, but promising new developments are methods which involve artificial intelligence. This work could lead to an ATLAS note.

[1] https://arxiv.org/abs/1807.03576

Contact: Marcel Vreeswijk and Jordy Degens

=== ATLAS: The Next Generation ===

After the observation of the coupling of Higgs bosons to fermions of the third generation, the search for the coupling to fermions of the second generation is one of the next priorities for research at CERN's Large Hadron Collider. The search for the decay of the Higgs boson to two charm quarks is very new [1] and we see various opportunities for interesting developments. For this project we propose improvements in reconstruction (using exclusive decays), advanced analysis techniques (using deep learning methods) and expanding the theory interpretation. Another opportunity would be the development of the first statistical combination of results between the ATLAS and CMS experiments, which could significantly improve the discovery potential.

[1] https://arxiv.org/abs/1802.04329

Contact: Tristan du Pree and Marko Stamenkovic

=== ATLAS: The Most Energetic Higgs Boson ===

The production of Higgs bosons at the highest energies could give the first indications for deviations from the standard model of particle physics, but production energies above 500 GeV have not been observed yet [1]. The LHC Run-2 dataset, collected during the last 4 years, might be the first opportunity to observe such processes, and we have various ideas for new studies. Possible developments include the improvement of boosted reconstruction techniques, for example using multivariate deep learning methods. Also, there are various opportunities for unexplored theory interpretations (using the MadGraph event generator), including effective field theory models (with novel ‘morphing’ techniques) and new interpretations of the newly observed boosted VZ(bb) process.

[1] https://arxiv.org/abs/1709.05543

Contact: Tristan du Pree and Brian Moser

=== LHCb: Measurement of delta md ===

The decay B0->D-pi+ is very abundant in LHCb, and therefore ideal to study the oscillation frequency delta md, with which B0 mesons oscillate into anti-B0 mesons, and vice versa. This process proceeds through a so-called box diagram which might hide new, yet-undiscovered particles. Recently, it has been realized that the value of delta md is in tension with the value of the CKM angle gamma, triggering renewed interest in this measurement.

Contact: Marcel Merk

=== LHCb: Searching for CPT violation ===

CPT symmetry is closely linked to Lorentz symmetry, and any violation would revolutionize science. There are possibilities, though, that supergravity could cause CPT-violating effects in the system of neutral mesons. The precise study of B0s oscillations in the abundant Bs->Dspi decays can give the most stringent limits on Im(z) to date.

Contact: Marcel Merk

=== LHCb: BR(B0->D-pi+) and fd/fu with B+->D0pi+ ===

The abundant decay B0->D-pi+ is often used as a normalization channel, given its clean signal and well-known branching fraction, as measured by the B-factories. However, this branching fraction can be determined more precisely by comparing to the decay B+->D0pi+, which is known with twice better precision. In addition, the production of B0 and B+ mesons is often assumed to be equal, based on isospin symmetry. The study of B+->D0pi+ and B0->D-pi+ allows for the first measurement of this ratio, fd/fu.

Contact: Marcel Merk


=== LHCb: Optimization studies for the Vertex detector at the High-Lumi LHCb ===

The LHCb experiment is dedicated to measuring tiny differences between matter and antimatter through the precise study of rare processes involving b or c quarks. The LHCb detector will undergo a major modification in order to dramatically increase the luminosity and be able to measure indirect effects of physics beyond the standard model. In this environment, over 42 simultaneous collisions are expected to happen within a time interval of 200 ps where the two proton bunches overlap. The particles of interest have a relatively long lifetime and therefore the best way to distinguish them from the background collisions is through the precise reconstruction of displaced vertices and pointing directions. The new detector considers using extremely recent or even future technologies to measure space (with resolutions below 10 um) and time (100 ps or better) to efficiently reconstruct the events of interest for physics. The project involves completely changing the LHCb Vertex Locator (VELO) design in simulation and determining the best achievable performance for the upgraded detector, considering different spatial and temporal resolutions.

Contact: Kazu Akiba

=== LHCb: Measurement of charge multiplication in heavily irradiated sensors ===

During the R&D phase for the LHCb VELO Upgrade detector a few sensor prototypes were irradiated to the extreme fluence expected to be achieved during the detector lifetime. These samples were tested using high energy particles at the SPS facility at CERN with their trajectories reconstructed by the Timepix3 telescope. A preliminary analysis revealed that at the highest irradiation levels the amount of signal observed is higher than expected, and even larger than the signal obtained at lower doses. At the Device Under Test (DUT) position inside the telescope, the spatial resolution attained by this system is below 2 um. This means that a detailed analysis can be performed in order to study where and how this signal amplification happens within the 55x55 um^2 pixel cell. This project involves analysing the telescope and DUT data to investigate the charge multiplication mechanism at the microscopic level.

Contact: Kazu Akiba

=== LHCb: Testing the flavour anomalies at LHCb ===

Lepton Flavour Universality (LFU) is an intrinsic property of the Standard Model, which implies that the three generations of leptons are subject to the same interactions. This fundamental law of the SM can be investigated by looking at rare B-meson decays with muons or electrons in the final state. Recent measurements of these decays from LHCb show deviations from the SM (known as flavour anomalies) that, if confirmed, would lead to a major discovery of New Physics (NP). The project consists of the analysis of the 2017-18 dataset, which will double the statistics of the current results. This new dataset will lead to a measurement with better precision, which can either confirm or exclude the contribution of NP to these decays. The project will explore all the crucial aspects of data analysis, from simulation to signal modeling, including cutting-edge software, such as fitting large amounts of data using GPUs (Graphics Processing Units).

Contact: Andrea Mauri and Marcel Merk

=== LHCb: Search for long-lived heavy neutral leptons in B decays ===

The masses of neutrinos are many orders of magnitude smaller than those of the other fermions. In the seesaw mechanism this puzzling fact is explained by the existence of another set of neutral leptons that are much heavier in mass. If their mass is below about 5 GeV, such neutrinos can be produced at the LHC in decays of B hadrons. Their small coupling will lead to a lifetime of the order of picoseconds, which means that they will fly an observable distance before they decay. In this project we search for such long-lived heavy neutrinos in decays of charged B mesons using the LHCb Run-2 dataset.
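A back-of-the-envelope check (with assumed, purely illustrative numbers) of the claim that a picosecond-lifetime heavy neutral lepton flies an observable distance in LHCb:

<pre>
C = 299_792_458.0        # speed of light in m/s
tau = 1e-12              # assumed proper lifetime: 1 ps
mass = 3.0               # assumed HNL mass in GeV
momentum = 30.0          # assumed typical momentum at LHCb in GeV

gamma_beta = momentum / mass            # beta*gamma = p/m
flight_length = gamma_beta * C * tau    # mean decay length in metres

print(f"mean flight length: {flight_length * 1000:.1f} mm")   # ~3 mm, resolvable by the vertex detector
</pre>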

Contact: Lera Lukashenko and Wouter Hulsbergen

=== LHCb: Discovering the Bc->eta_c mu nu decay ===

The Bc meson, consisting of heavy c and anti-b quarks, is of great interest for flavour physics. A recent LHCb measurement of Bc->J/psi l nu decays [1] showed a possible deviation from the Standard Model prediction, which entered the so-called lepton universality puzzle - the hottest topic in b-physics in recent years. Following that, the study of a similar decay mode - Bc->eta_c mu nu - is strongly requested by the theory community. However, the reconstruction of the eta_c meson is challenging, so the decay has not been discovered yet. The project aims at the discovery of the Bc->eta_c mu nu decay using the unique capabilities of the LHCb experiment. The data analysis will consist of finding the optimal event selection using machine learning techniques, research on background sources, performing fits to data, etc. The project requires not being afraid of analysis software and statistics. The results will be presented within the collaboration: talks at working group meetings, an analysis note, etc. Skills in git, Python and ROOT (and similar packages) are extremely welcome.

[1] https://arxiv.org/pdf/1711.05623.pdf

Contact: Andrii Usachov and Marcel Merk

=== ALICE: Searching for the strongest magnetic field in nature ===

In case of a non-central collision between two Pb ions, with a large value of the impact parameter (b), the charged nucleons that do not participate in the interaction (called spectators) create strong magnetic fields. A back-of-the-envelope calculation using the Biot-Savart law brings the magnitude of this field close to 10^19 Gauss, in agreement with state-of-the-art theoretical calculations, making it the strongest magnetic field in nature. The presence of this field could have direct implications for the motion of final-state particles. The magnetic field, however, decays rapidly. The decay rate depends on the electric conductivity of the medium, which is experimentally poorly constrained. Overall, the presence of the magnetic field, the main goal of this project, is so far not confirmed experimentally.

Contact: Panos Christakoglou

=== ALICE: Looking for parity violating effects in strong interactions ===

Within the Standard Model, symmetries, such as the combination of charge conjugation (C) and parity (P), known as CP-symmetry, are considered to be key principles of particle physics. The violation of CP-invariance can be accommodated within the Standard Model in the weak and the strong interactions, however it has only been confirmed experimentally in the former. Theory predicts that in heavy-ion collisions, in the presence of a deconfined state, gluonic fields create domains where the parity symmetry is locally violated. This manifests itself in a charge-dependent asymmetry in the production of particles relative to the reaction plane, known as the Chiral Magnetic Effect (CME). The first experimental results from STAR (RHIC) and ALICE (LHC) are consistent with the expectations from the CME, however further studies are needed to constrain background effects. These highly anticipated results have the potential to reveal exciting, new physics.

Contact: Panos Christakoglou

=== ALICE: Machine learning techniques as a tool to study the production of heavy flavour particles ===

There was recently a shift in the field of heavy-ion physics triggered by experimental results obtained in collisions between small systems (e.g. protons on protons). These results resemble the ones obtained in collisions between heavy ions. This consequently raises the question of whether we create the smallest QGP droplet in collisions between small systems. The main objective of this project will be to study the production of charm particles such as D-mesons and Λc-baryons in pp collisions at the LHC. This will be done with the help of a new and innovative technique which is based on machine learning (ML). The student will also extend the studies to investigate how this production rate depends on the event activity e.g. on how many particles are created after every collision.

Contact: Panos Christakoglou and Alessandro Grelli

=== ALICE: Energy Loss of Energetic Quarks and Gluons in the Quark-Gluon Plasma ===

One of the ways to study the quark-gluon plasma that is formed in high-energy nuclear collisions, is using high-energy partons (quarks or gluons) that are produced early in the collision and interact with the quark-gluon plasma as they propagate through it. There are several current open questions related to this topic, which can be explored in a Master's project. For example, we would like to use the new Monte Carlo generator framework JetScape to simulate collisions to see whether we can extract information about the interaction with the quark-gluon plasma. In the project you will collaborate with one of the PhD students or postdocs in our group to use the model to generate predictions of measurements and compare those to data analysis results. Depending on your interests, the project can focus more on the modeling aspects or on the analysis of experimental data from the ALICE detector at the LHC.

Contact: Marco van Leeuwen and Marta Verweij

=== ALICE: Extreme Rare Probes of the Quark-Gluon Plasma ===

The quark-gluon plasma is formed in high-energy nuclear collisions and also existed shortly after the big bang. With the large amount of data collected in recent years at the Large Hadron Collider at CERN, rare processes that previously were not accessible now provide new ways to study how the quark-gluon plasma emerges from the fundamental theory of the strong interaction. One such process is the production of the heavy W boson, which in many cases decays to two quarks. The W boson itself doesn’t interact with the quark-gluon plasma because it doesn’t carry color, but the quark decay products do interact with the plasma and therefore provide an ideal tool to study the space-time evolution of this hot and dense medium. In this project you will use data from the ALICE detector at the LHC and simulated data from generators to study various physics mechanisms that could be happening in the real collisions.

Contact: Marta Verweij and Marco van Leeuwen

=== ALICE: Jet Quenching with Machine Learning ===

Machine learning applications are rising steadily as a vital tool in the field of data science but are relatively new in the particle physics community. In this project machine learning tools will be used to gain insights into the modification of a parton shower in the quark-gluon plasma (QGP). The QGP is created in high-energy nuclear collisions and only lives for a very short period of time. Highly energetic partons created in the same collisions interact with the plasma while they traverse it and are observed as a collimated spray of particles, known as jets, in the detector. One of the key recent insights is that the internal structure of jets provides information about the evolution of the QGP. With data recorded by the ALICE experiment, you will use jet substructure techniques in combination with machine learning algorithms to dissect the structure of the QGP. Machine learning will be used to select the regions of radiation phase space that are affected by the presence of the QGP.
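As a purely illustrative toy (invented observables and labels, no ALICE data), a typical workflow trains a classifier on jet-substructure variables to separate "quenched" from "vacuum-like" jets:

<pre>
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 5000

# Toy jet-substructure observables (e.g. a groomed momentum fraction and a jet mass),
# generated so that the two classes overlap but differ on average.
vacuum   = np.column_stack([rng.beta(2, 2, n), rng.normal(10, 3, n)])
quenched = np.column_stack([rng.beta(1.5, 3, n), rng.normal(12, 3, n)])

X = np.vstack([vacuum, quenched])
y = np.concatenate([np.zeros(n), np.ones(n)])        # 0 = vacuum-like, 1 = quenched

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
clf = GradientBoostingClassifier().fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
</pre>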

Contact: Marta Verweij and Marco van Leeuwen

=== Lepton Collider: Pixel TPC testbeam ===

In the Lepton Collider group at Nikhef we work on a tracking detector for a future collider (e.g. the ILC in Japan). We are developing a gaseous Time Projection Chamber with a pixel readout. At Nikhef we have built an 8-quad GridPix module based on the Timepix3 chip, which is a detector of about 20 cm x 40 cm x 10 cm in size. In August 2020 we will test the device at the DESY particle accelerator in Hamburg. For the project you could work on preparations for the test beam (e.g. running the data acquisition, performing data monitoring using our setup in the lab). The next topics will be the participation in the data taking during the test beam at DESY, the analysis of the data using C++ and ROOT and - finally - publication of the results in a scientific journal.

Our latest paper can be found at https://www.nikhef.nl/~s01/quad_paper.pdf.

Contact: Peter Kluit and Kees Ligtenberg

=== Dark Matter: Sensitive tests of wavelength-shifting properties of materials for dark matter detectors ===

Rare event search experiments that look for neutrino and dark matter interactions are performed with highly sensitive detector systems, often relying on scintillators, especially liquid noble gases, to detect particle interactions. Detectors consist of structural materials that are assumed to be optically passive, and light detection systems that use reflectors, light detectors, and sometimes, wavelength-shifting materials. MSc theses are available related to measuring the efficiency of light detection systems that might be used in future detectors. Furthermore, measurements to ensure that presumably passive materials do not fluoresce, at the low level relevant to the detectors, can be done. Part of the thesis work can include Monte Carlo simulations and data analysis for current and upcoming dark matter detectors, to study the effect of different levels of desired and nuisance wavelength shifting. In this project, students will acquire skills in photon detection, wavelength shifting technologies, vacuum systems, UV and extreme-UV optics, detector design, and optionally in C++ programming, data analysis, and Monte Carlo techniques.

Contact: Tina Pollmann and Patrick Decowski

=== Dark Matter: Signal reconstruction in XENONnT ===

The next-generation direct detection dark matter experiment - XENONnT - comprises close to 500 photomultiplier tubes (PMTs) in the main detector volume. These PMTs are configured to be able to detect even single photons. When a single photoelectron (PE) is detected, the recorded signal (a pulse) is convolved with the detector response of the PMT. Due to this detector response the pulse shape of a single PE is spread out in time. For XENONnT we would like to explore the possibility to implement a digital (software) filter to deconvolve the detected pulse back to the “true” instantaneous shape (without the detector spread). This is a virtually unexplored new step in the Xenon analysis framework. Later in the analysis framework these pulses from all the PMTs are combined into a signal referred to as a ‘peak’. For XENONnT it is essential to discriminate extremely well between two types of peaks caused by interactions in the detector: a prompt primary scintillation signal (S1) and a secondary ionization signal (S2). The parameters in the software have not - as of the time of writing - been optimized for the XENONnT detector conditions. The student would investigate how a deconvolution filter would benefit the XENONnT analysis framework and develop such a filter. Furthermore, the student will work on the classification of these signals to fully exploit the XENONnT detector and optimize the classification. This will be done with simulated data at first but may later even be performed on actual XENONnT data. As an extension, the possibility of applying machine learning to correctly distinguish between the two signals could be explored. This is a data-analysis oriented project where Python skills are paramount.
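A minimal sketch (toy pulse shape and numbers, not the XENONnT software) of the idea behind such a filter: deconvolving a recorded pulse with a regularised, Wiener-like filter built from an assumed single-PE response.

<pre>
import numpy as np

n = 256
t = np.arange(n)

# Assumed toy single-PE response of the PMT: fast rise, exponential tail.
response = np.exp(-t / 20.0) * (1.0 - np.exp(-t / 2.0))
response /= response.sum()

true_signal = np.zeros(n)
true_signal[50] = 1.0                                   # one instantaneous photoelectron
recorded = np.convolve(true_signal, response)[:n] + np.random.normal(0, 0.002, n)

# Wiener-like deconvolution in the frequency domain; `reg` damps noise amplification.
reg = 1e-3
R = np.fft.rfft(response, n)
D = np.fft.rfft(recorded, n)
deconvolved = np.fft.irfft(D * np.conj(R) / (np.abs(R) ** 2 + reg), n)

print("reconstructed PE sample:", int(np.argmax(deconvolved)))   # expect ~50
</pre>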

Contact: Patrick Decowski and Joran Angevaare

=== Dark Matter: XAMS R&D Setup ===

The Amsterdam Dark Matter group operates an R&D xenon detector at Nikhef. The detector is a dual-phase xenon time-projection chamber and contains about 4kg of ultra-pure liquid xenon. We use this detector for the development of new detection techniques - such as utilizing our newly installed silicon photomultipliers - and to improve the understanding of the response of liquid xenon to various forms of radiation. The results could be directly used in the XENONnT experiment, the world’s most sensitive direct detection dark matter experiment at the Gran Sasso underground laboratory, or for future Dark Matter experiments like DARWIN. We have several interesting projects for this facility. We are looking for someone who is interested in working in a laboratory on high-tech equipment, modifying the detector, taking data and analyzing the data him/herself. You will "own" this experiment.

Contact: Patrick Decowski and Auke Colijn

=== Dark Matter: DARWIN Sensitivity Studies ===

DARWIN is the "ultimate" direct detection dark matter experiment, with the goal to reach the so-called "neutrino floor", when neutrinos become a hard-to-reduce background. The large and exquisitely clean xenon mass will allow DARWIN to also be sensitive to other physics signals such as solar neutrinos, double-beta decay from Xe-136, axions and axion-like particles etc. While the experiment will only start in 2025, we are in the midst of optimizing the experiment, which is driven by simulations. We have an opening for a student to work on the GEANT4 Monte Carlo simulations for DARWIN, as part of a simulation team together with the University of Freiburg and Zurich. We are also working on a "fast simulation" that could be included in this framework. It is your opportunity to steer the optimization of a large and unique experiment. This project requires good programming skills (Python and C++) and data analysis/physics interpretation skills.

Contact: Patrick Decowski and Auke Colijn

=== Dark Matter: Fast simulation studies ===

For Dark Matter experiments it is crucial to understand sources of backgrounds in great detail. The most common way to study the effect of backgrounds on the Dark Matter sensitivity is by the use of Monte Carlo simulations. Unfortunately, the standard Monte Carlo techniques are extremely inefficient. One sometimes needs to simulate millions of events before one background event appears in the Dark Matter search region. We have developed a Monte Carlo technique that accelerates this process by up to 1000x. The method has been validated on very simple and unrealistic detector models. The goal of this project is to make a realistic detector model for the fast detector simulations. For this we are looking for a student with good programming skills, an interest in a software project, and the desire to deeply understand the analysis of Dark Matter experimental data.
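A generic illustration (not the Nikhef technique itself) of why biased sampling can buy orders of magnitude: estimating a rare "leakage" probability with importance sampling instead of brute-force Monte Carlo.

<pre>
import numpy as np

rng = np.random.default_rng(1)
threshold = 4.0                      # an event "leaks" into the signal region if x > threshold

# Brute force: very few of the million samples ever pass the threshold.
x = rng.normal(size=1_000_000)
p_brute = np.mean(x > threshold)

# Importance sampling: draw from a proposal centred on the threshold and
# reweight each sample by the ratio of the true to the proposal density.
y = rng.normal(loc=threshold, size=10_000)
weights = np.exp(-0.5 * y**2) / np.exp(-0.5 * (y - threshold) ** 2)
p_importance = np.mean((y > threshold) * weights)

print(f"brute force:         {p_brute:.2e}  (10^6 samples)")
print(f"importance sampling: {p_importance:.2e}  (10^4 samples)")
</pre>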

Contact: Patrick Decowski and Auke Colijn

=== Dark Matter & Amsterdam Scientific Instruments: Simulations for Industry ===

In the Nikhef Dark Matter group we have built up an extensive expertise with Monte Carlo simulations of ionizing radiation. Although these simulations have the aim to estimate background levels in our XENON experiments, the same techniques can be applied to study radiation transport in industrial devices. Amsterdam Scientific Instruments (ASI) is a company at Science Park that develops and sells radiation imaging equipment that is used amongst others in electron microscopy. For this application ASI needs a detailed study of gamma ray backgrounds to optimize shielding for their products. The project aims at optimizing a shielding design based on GEANT4 simulations. The results may be implemented in next generation products of ASI. We are looking for a student with preferably strong computing skills, and with an interest in science-industrial collaboration.

Contact: Patrick Decowski and Auke Colijn

=== The Modulation experiment: Data Analysis ===

For years there have been controversial claims of potential new physics on the basis of time-varying decay rates of radioactive sources on top of ordinary exponential decay. While some of these claims have been refuted, others have still to be confirmed or falsified. To this end, a dedicated experiment - the modulation experiment - has been designed and has been operational for the past four years. Using four identical and independent setups, the experiment is almost ready for a final analysis to conclude on these claims. In this project the student will perform this analysis, preferably resulting in a conclusive paper. This will require combining the data of the four setups and close collaboration with a small group constituting a collaboration of the four different institutes involved (Purdue University (USA), Universität Zürich (Switzerland), Centro Brasileiro de Pesquisas Físicas (Brazil) and Nikhef). This project is data-analysis oriented. Additionally, lab skills may be required as one of the setups is situated at Nikhef.
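To illustrate the type of fit involved (simulated data and invented parameters only), one can fit an exponential decay with a small annual modulation on top, N(t) = N0 e^(-t/tau) (1 + A sin(2 pi t / T + phi)):

<pre>
import numpy as np
from scipy.optimize import curve_fit

def decay_with_modulation(t, n0, tau, amp, phase):
    """Exponential decay with a yearly (365.25 d) modulation of relative amplitude `amp`."""
    return n0 * np.exp(-t / tau) * (1.0 + amp * np.sin(2 * np.pi * t / 365.25 + phase))

rng = np.random.default_rng(3)
t_days = np.arange(0, 4 * 365)                                        # four years of daily rates
truth = decay_with_modulation(t_days, 1e6, 1925.0, 0.001, 0.3)        # invented parameters
rates = rng.normal(truth, np.sqrt(truth))                             # counting noise (Gaussian approx.)

popt, pcov = curve_fit(decay_with_modulation, t_days, rates,
                       p0=[1e6, 2000.0, 0.0, 0.0], sigma=np.sqrt(truth))
print(f"fitted modulation amplitude: {popt[2]:.5f} +/- {np.sqrt(pcov[2, 2]):.5f}")
</pre>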

Contact: Auke Colijn and Joran Angevaare

=== Detector R&D: Laser Interferometer Space Antenna (LISA) ===

The space-based gravitational wave antenna LISA is, without a doubt, one of the most challenging space missions ever proposed. Around 2030, ESA plans to launch three spacecraft, separated by a few million kilometers, to measure tiny variations in the distances between test masses located in each satellite, in order to detect the gravitational waves from sources such as supermassive black holes. The triangular constellation of the LISA mission is dynamic, requiring constant fine-tuning of the pointing of the laser links between the spacecraft and a simultaneous refocusing of the telescope. The noise sources related to the laser links are expected to provide a dominant contribution to the LISA performance. An update and extension of the LISA science simulation software are needed to assess the hardware development for LISA at Nikhef, TNO, and SRON. A position is therefore available for a master student to study the impact of instrumental noise on the performance of LISA. Realistic simulations based on hardware (noise) characterization measurements performed at TNO will be carried out and compared to the expected tantalizing gravitational wave sources.

Contact: Niels van Bakel, Ernst-Jan Buis

=== Detector R&D: Spectral X-ray imaging - Looking at colours the eyes can't see ===

When a conventional X-ray image is taken, one acquires an image that only shows intensities: a ‘black and white’ image. Most of the information carried by the photon energy is lost. Lacking spectral information can result in an ambiguity between the material composition and the amount of material in the sample. If the X-ray intensity as a function of energy can be measured (i.e. a ‘colour’ X-ray image), more information can be obtained from a sample. This translates to less required dose and/or a better understanding of the sample that is being investigated. For example, two fields that can benefit from spectral X-ray imaging are mammography and real-time CT.

Detectors using Medipix3 chips are used for X-ray imaging. Such a detector is composed of a pixel chip with a semiconductor sensor bonded on top of it. Photoelectric absorption of X-rays in the sensor results in an amount of charge being released that is proportional to the X-ray energy. This charge is registered by a pixel. Depending on configuration, in each pixel 1, 2, 4 or 8 detection thresholds can be set and so, a number of energy bins can be defined. One of the challenges is to maximise X-ray image quality by minimising effects caused by dispersion in the sensitivity of the pixels. The effects of this dispersion can partly be compensated by applying a specific measurement method in combination with image post processing.

You can work on improving measurement methods and on improving post-processing methods. There is flexibility in the planned work depending on the skillset you have. The aim is to get the best X-ray energy resolution over the entire pixel chip. This in turn improves image quality and therefore X-ray CT reconstruction quality.
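One simple post-processing ingredient, sketched here with synthetic data (numbers and method chosen purely for illustration), is a flat-field correction that compensates pixel-to-pixel sensitivity dispersion using an open-beam calibration image:

<pre>
import numpy as np

rng = np.random.default_rng(2)
shape = (256, 256)

pixel_gain = rng.normal(1.0, 0.05, shape)              # assumed per-pixel sensitivity spread

flat_field = rng.poisson(10_000 * pixel_gain)          # open-beam (no sample) calibration image
sample_raw = rng.poisson(3_000 * pixel_gain)           # image of a uniform sample

# Normalise by the flat field to remove the fixed-pattern sensitivity variations.
corrected = sample_raw * (flat_field.mean() / flat_field)

print("relative spread before:", np.std(sample_raw) / np.mean(sample_raw))
print("relative spread after: ", np.std(corrected) / np.mean(corrected))
</pre>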

Important note: Much of this work is to be performed in the laboratory. For as long as corona safety measures are active, the labs at Nikhef are not accessible for students and this project cannot be worked on except for post-processing in software. Currently we hope that the situation will have improved by August. Please see the following videos for examples of our work:

https://youtu.be/cgwQvjfUYns

https://youtu.be/tf9ZLALPVNY

https://youtu.be/vjPX7SxvSUk

https://youtu.be/LqjNVSm7Hoo

Contact: Martin Fransen, Navrit Bal

=== Detector R&D: Holographic projector ===

A difficulty in projecting holograms (based on the interference of light) is the required dense pixel pitch of a projector: one would need a pixel pitch of less than 200 nanometer. With larger pixels, artefacts occur due to spatial under-sampling. A pixel pitch of 200 nanometer is difficult, if not impossible, to achieve, especially for larger areas. Another challenge is the massive amount of computing power that would be required to control such a dense pixel matrix.
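A quick way to see where the 200 nm figure comes from (assumed wavelength, illustrative only): a regular pixel grid of pitch p diffracts light into replica orders at sin(theta) = m·lambda/p, which appear as under-sampling artefacts unless the pitch is small enough to push even the first order beyond 90 degrees:

<pre>
import numpy as np

wavelength = 500e-9                            # assumed green light, in metres
for pitch in (10e-6, 2e-6, 500e-9, 200e-9):    # from display-like to hologram-like pitch
    s = wavelength / pitch                     # sin(theta) of the first replica order
    if s <= 1:
        print(f"pitch {pitch * 1e9:6.0f} nm -> first replica order at {np.degrees(np.arcsin(s)):5.1f} deg")
    else:
        print(f"pitch {pitch * 1e9:6.0f} nm -> replica orders are evanescent: artefact-free")
</pre>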

A new holographic projection method has been developed that reduces under sampling artefacts for projectors with a ‘low’ pixel density. It uses 'pixels' at random but known positions, resulting in an array of (coherent) light points that lacks (or has suppressed) spatial periodicity. As a result a holographic projector can be built with a significantly lower pixel density and correspondingly less required computing power. This could bring holography in reach for many applications like display, lithography, 3D printing, metrology, etc...

Of course, nothing comes for free: with fewer pixels, holograms become noisier and the contrast is reduced (not all light ends up in the hologram). The questions: How does the quality of a hologram depend on pixel density? How do we determine projector requirements based on requirements for hologram quality?

Requirements for a hologram can be expressed in terms of: Noise, contrast, resolution, suppression of under sampling artefacts, etc..

For this project we have built a proof-of-concept holographic emitter. This set-up will be used to verify simulation results (and, of course, also to project some cool holograms).

Examples of what you could be working on:

a. Calibration/characterisation of the current projector and compensation of systematic errors.

b. To realize a phased array of randomly placed light sources, the pixel matrix of the projector must be ‘relayed’ onto a mask with apertures at random but precisely known positions. Determine the best possible relaying optics and design an optimised mask accordingly. Factors like deformation of the projected pixel matrix and limitations in the resolving power of the lens system must be taken into account in the mask design.

Important note: Much of this work is to be performed in the laboratory. For as long as corona safety measures are active, the labs at Nikhef are not accessible to students and this project cannot be worked on. We currently hope that the situation will have improved by August.

Contact: Martin Fransen

=== Theory: The Effective Field Theory Pathway to New Physics at the LHC ===

A promising framework to parametrise in a robust and model-independent way the deviations from the Standard Model (SM) induced by new heavy particles is the Standard Model Effective Field Theory (SMEFT). In this formalism, effects beyond the SM are encapsulated in higher-dimensional operators constructed from SM fields and respecting the SM symmetries. In this project, we aim to carry out a global analysis of the SMEFT from high-precision LHC data, including Higgs boson production, flavour observables, and low-energy measurements. This analysis will be carried out in the context of the recently developed SMEFiT approach [1], based on Machine Learning techniques to efficiently explore the complex theory parameter space. The ultimate goal is either to uncover glimpses of new particles or interactions at the LHC, or to derive the most stringent model-independent bounds to date on general theories of New Physics. Of particular interest are novel methods for charting the parameter space [2], the matching to UV-complete theories in explicit BSM scenarios [3], and the interplay between EFT-based model-independent searches for new physics and determinations of the proton structure from LHC data [4].

[1] https://arxiv.org/abs/1901.05965 [2] https://arxiv.org/abs/1906.05296 [3] https://arxiv.org/abs/1908.05588 [4] https://arxiv.org/abs/1905.05215
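
As a schematic illustration of what a global EFT fit does (this is not the SMEFiT methodology itself; the observables, coefficients and numbers below are invented), one can generate pseudo-data for a few observables that depend linearly and quadratically on two hypothetical Wilson coefficients and extract their best-fit values by minimising a chi-square. The real analysis replaces this by hundreds of measurements, full correlations, theory uncertainties and a machine-learning-based exploration of a much larger parameter space.

<syntaxhighlight lang="python">
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)

# Hypothetical EFT parametrisation: sigma_i(c) = SM_i * (1 + a_i.c + c.Q_i.c),
# with two Wilson coefficients c = (c1, c2). All numbers are illustrative only.
sm = np.array([1.0, 0.35, 2.1, 0.08])                 # 'SM predictions' (arbitrary units)
a = np.array([[0.10, -0.04], [0.25, 0.02],
              [-0.05, 0.12], [0.30, 0.30]])           # linear (interference) terms
Q = np.array([np.diag([0.01, 0.01])] * 4)             # small quadratic terms

def prediction(c):
    return sm * (1.0 + a @ c + np.einsum('i,kij,j->k', c, Q, c))

c_true = np.array([0.5, -1.0])
data = prediction(c_true) * (1 + 0.05 * rng.standard_normal(4))   # 5% pseudo-data errors
sigma = 0.05 * data

def chi2(c):
    return np.sum(((prediction(c) - data) / sigma) ** 2)

fit = minimize(chi2, x0=np.zeros(2))
print("best-fit Wilson coefficients:", fit.x)
print("chi2 per degree of freedom:", fit.fun / (len(data) - 2))
</syntaxhighlight>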

Contact: Juan Rojo

=== Theory: Charting the quark and gluon structure of protons and nuclei with Machine Learning ===

Deepening our knowledge of the partonic content of nucleons and nuclei [1] represents a central endeavour of modern high-energy and nuclear physics, with ramifications in related disciplines such as astroparticle physics. There are two main scientific drivers motivating these investigations of the partonic structure of hadrons. On the one hand, addressing fundamental open issues in our understanding of the strong interactions, such as the origin of the nucleon mass, spin, and transverse structure; the presence of heavy quarks in the nucleon wave function; and the possible onset of novel gluon-dominated dynamical regimes. On the other hand, pinning down with the highest possible precision the substructure of nucleons and nuclei is a central ingredient of theoretical predictions for a wide range of experiments, from proton and heavy-ion collisions at the Large Hadron Collider to ultra-high-energy neutrino interactions at neutrino telescopes. The goal of this project is to exploit Machine Learning and Artificial Intelligence tools [2,3] (neural networks trained by stochastic gradient descent) to pin down the quark and gluon substructure of protons and nuclei using recent measurements from proton-proton and proton-lead collisions at the LHC. Topics of special interest are i) the strange content of protons and nuclei, ii) parton distributions at higher orders in the QCD coupling for precision Higgs physics, iii) the interplay between jet, photon, and top quark production data in pinning down the large-x gluon, and iv) charm quarks as a probe of gluon shadowing at small-x. The project also involves preparatory studies for the Electron-Ion Collider (EIC), a new lepton-nucleus collider expected to start operation in the coming years.

[1] https://arxiv.org/abs/1910.03408 [2] https://arxiv.org/abs/1904.00018 [3] https://arxiv.org/abs/1706.00428
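
The core idea, fitting a flexible neural-network parametrisation to data by stochastic gradient descent, can be illustrated with a deliberately tiny toy (Python/NumPy; the functional form, pseudo-data and network below are invented for illustration and bear no relation to the actual fitting machinery): a one-hidden-layer network is trained to reproduce noisy pseudo-data generated from an assumed x f(x) = A x^a (1-x)^b shape.

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)

# Pseudo-data: noisy samples of an assumed toy shape x*f(x) = 5 * x^0.5 * (1-x)^3.
x = rng.uniform(0.01, 0.95, 200)
y = 5.0 * x**0.5 * (1 - x)**3 * (1 + 0.05 * rng.standard_normal(x.size))

# One-hidden-layer network: f(x) = w2 . tanh(w1*x + b1) + b2
n_hidden = 20
w1 = rng.normal(0, 1, n_hidden); b1 = rng.normal(0, 1, n_hidden)
w2 = rng.normal(0, 0.1, n_hidden); b2 = 0.0

lr = 0.05
for step in range(20000):
    i = rng.integers(0, x.size, 32)            # stochastic mini-batch
    h = np.tanh(np.outer(x[i], w1) + b1)       # hidden activations, shape (32, n_hidden)
    pred = h @ w2 + b2
    err = pred - y[i]                          # derivative of the squared error (up to 2/N)
    # Backpropagation of the mean-squared-error loss
    grad_w2 = h.T @ err / err.size
    grad_b2 = err.mean()
    dh = np.outer(err, w2) * (1 - h**2)        # back through the tanh
    grad_w1 = (dh * x[i, None]).sum(axis=0) / err.size
    grad_b1 = dh.sum(axis=0) / err.size
    w1 -= lr * grad_w1; b1 -= lr * grad_b1
    w2 -= lr * grad_w2; b2 -= lr * grad_b2

x_test = np.array([0.1, 0.5])
print("network fit at x=0.1, 0.5:", np.tanh(np.outer(x_test, w1) + b1) @ w2 + b2)
print("underlying truth        :", 5.0 * x_test**0.5 * (1 - x_test)**3)
</syntaxhighlight>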

Contact: Juan Rojo

=== Theory: Machine learning for Electron Microscopy for next-generation materials ===

Machine Learning tools developed and applied for particle physics hold great potential for applications in material science, in particular concerning faithful uncertainty estimation and model training over large parameter spaces. In this project, carried out in collaboration with the group of Dr. Sonia Conesa-Boj from the Kavli Institute of Nanoscience Delft, http://www.conesabojlab.tudelft.nl, we will develop and deploy ML tools for data analysis in Electron Microscopy. We will focus on pinning down the properties of novel quantum materials such as topological insulators and van der Waals materials. Examples of possible applications include model-independent background subtraction in electron energy-loss spectroscopy, automatic classification of crystalline structures, and enhancing spatial and spectral resolution using convolutional networks.
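
One of the applications mentioned above, background subtraction in electron energy-loss spectroscopy, is conventionally done by fitting a power law in a pre-edge window and extrapolating it under the edge of interest; the NumPy sketch below (with an invented toy spectrum) shows this baseline method, which the model-independent, ML-based approach developed in the project would aim to improve upon.

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(2)

# Toy EELS spectrum: power-law background A*E^-r plus a step-like core-loss edge at 285 eV.
energy = np.linspace(150.0, 450.0, 600)                      # energy loss (eV)
background = 1e9 * energy ** -2.5
edge = 800.0 * (energy > 285.0) * np.exp(-(energy - 285.0) / 80.0)
spectrum = rng.poisson(background + edge).astype(float)

# Conventional approach: fit log I = log A - r log E in a pre-edge window, then extrapolate.
window = (energy > 200.0) & (energy < 280.0)
slope, intercept = np.polyfit(np.log(energy[window]), np.log(spectrum[window]), 1)
fitted_bg = np.exp(intercept) * energy ** slope

signal = spectrum - fitted_bg                                # background-subtracted edge
print("fitted exponent r = %.2f (spectrum generated with 2.5)" % -slope)
print("integrated edge signal above 285 eV: %.0f counts" % signal[energy > 285.0].sum())
</syntaxhighlight>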

Contact: Juan Rojo

=== Theory: The electroweak phase transition and baryogenesis/gravitational wave production ===

In extensions of the Standard Model the electroweak phase transition can be first order and proceed via the nucleation of bubbles. Colliding bubbles can produce gravitational waves [1], and plasma particles interacting with the bubbles can generate a matter-antimatter asymmetry [2]. A detailed understanding of the dynamics of the phase transition is needed to accurately describe these processes. One project is to study QFT at finite temperature and to compare/apply methods that address the non-perturbative IR dynamics of the thermal processes [3,4]. Another project is to calculate the velocity with which the bubbles expand, which is an important parameter for gravitational wave production and baryogenesis. This entails, among other things, tunneling dynamics, (thermal) scattering rates and Boltzmann equations [5].

[1] https://arxiv.org/abs/1705.01783 [2] https://arxiv.org/pdf/hep-ph/0609145.pdf [3] https://arxiv.org/pdf/1609.06230.pdf [4] https://arxiv.org/pdf/1612.00466.pdf [5] https://arxiv.org/pdf/1809.04907.pdf

Contact: Marieke Postma

=== Theory: Cosmology of the QCD axion ===

The QCD axion provides an elegant solution to the strong CP problem of QCD [1]. This project focuses on the cosmological dynamics of this hypothesized axion field, and in particular on the possibility that it produces both the observed matter-antimatter asymmetry and the dark matter abundance of our universe [2,3].

[1] https://arxiv.org/abs/1812.02669 [2] https://arxiv.org/pdf/hep-ph/0609145.pdf [3] https://arxiv.org/pdf/1910.02080.pdf

Contact: Marieke Postma

=== Theory: Neutrinos, hierarchy problem and cosmology ===

The electroweak hierarchy problem is absent if the quadratic term in the Higgs potential is generated dynamically. This is achieved in 'the neutrino option' [1], where the Higgs potential stems exclusively from quantum effects of heavy right-handed neutrinos, which can also generate the mass pattern of the observed left-handed neutrinos. The project focuses on model-building aspects (e.g. [2]) and on the cosmology (e.g. leptogenesis [3]) of these set-ups.

[1] https://arxiv.org/pdf/1703.10924.pdf [2] https://arxiv.org/pdf/1807.11490.pdf [3] https://arxiv.org/pdf/1905.12642.pdf

Contact: Marieke Postma

=== KM3NeT: Reconstruction of first neutrino interactions in KM3NeT ===

The neutrino telescope KM3NeT is under construction in the Mediterranean Sea, aiming to detect cosmic neutrinos. Its first few strings with sensitive photodetectors have been deployed at both the Italian and the French detector sites. Already these few strings make it possible to reconstruct the abundant muons stemming from interactions of cosmic rays with the atmosphere and to identify neutrino interactions. In this project the available data will be used together with simulations to best reconstruct the event topologies and to optimally identify and reconstruct the first neutrino interactions in the KM3NeT detector, and with this pave the path towards accurate neutrino oscillation measurements and neutrino astronomy.

Programming skills are essential; mostly ROOT and C++ will be used.

Contact: Ronald Bruijn and Dorothea Samtleben

=== KM3NeT: Searching for New Heavy Neutrinos ===

In this project we will search for a new heavy neutrino, looking at signatures created by atmospheric neutrinos interacting in the detector volume of KM3NeT-ORCA. The aim of this project is to study a specific event topology which appears as two separate blobs of signals detected by the densely instrumented ORCA detector units. We will exploit the tau reconstruction algorithms to verify whether ORCA can detect such signals and to estimate the potential sensitivity of the experiment. Basic knowledge of elementary particle physics and data analysis techniques will be advantageous. Knowledge of programming languages, e.g. Python (and possibly C++), and of ROOT is advantageous but not mandatory.

Contact: Suzan B. du Pree and Daan van Eijk

=== KM3NeT: Dark Matter with KM3NeT-ORCA ===

Dark Matter is thought to be everywhere (we should be swimming through it), but we have no idea what it is. Using the good energy and angular resolutions of the KM3NeT neutrino telescope, we can search for Dark Matter signatures originating from the centre of our galaxy. In this project, we will search for such signatures using the reconstructed track and shower events of the KM3NeT-ORCA detector in order to discover relatively light Dark Matter particles. Since this year, the KM3NeT-ORCA experiment has six detection lines under the Mediterranean Sea, fully operational and continuously taking data. Using the available data, it is possible to compare data and simulation for different event topologies and to estimate the experiment's sensitivity. The project is suitable for a student who is interested in exploring new physics scenarios and willing to develop new skills. Basic knowledge of elementary particle physics and data analysis techniques will be advantageous. Knowledge of programming languages, e.g. Python (and possibly C++), and of the ROOT data analysis tool is advantageous but not mandatory.

Contact: Suzan B. du Pree and Daan van Eijk


=== Gravitational Waves: Unraveling the structure of neutron stars with gravitational wave observations ===

Neutron stars were first discovered more than half a century ago, yet their detailed internal structure largely remains a mystery. A range of theoretical models have been put forward for the neutron star "equation of state", but until recently there was no real way to test them. The direct detection of gravitational waves with LIGO and Virgo has the potential to remedy the situation. When two neutron stars spiral towards each other, they get tidally deformed in a way that is determined by the equation of state, and these deformations get imprinted upon the shape of the gravitational wave that gets emitted. After the first gravitational wave observation of such an event in 2017, several equation of state models could already be ruled out. With expected upgrades of the detectors, we will at some point have access not only to the "inspiral" of binary neutron stars, but to the merger itself, and what happens afterwards. The project will consist of using results from large-scale numerical simulations to come up with a heuristic model for the waveform that describes the inspiral-merger-postmerger process with sufficient accuracy given expected detector sensitivities, and to develop data analysis techniques to efficiently use this model to extract information about the neutron star equation of state.
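
A basic building block of such analyses is the noise-weighted inner product between two waveforms, <a|b> = 4 Re ∫ a(f) b*(f) / Sn(f) df, from which matches, signal-to-noise ratios and likelihoods are built. The sketch below (Python/NumPy, using an invented damped-sinusoid 'waveform' and a flat toy noise curve instead of a realistic detector spectrum or a tidal waveform model) shows how the match between two slightly different waveforms would be computed.

<syntaxhighlight lang="python">
import numpy as np

fs, duration = 4096.0, 4.0                       # sampling rate (Hz) and data length (s)
t = np.arange(0, duration, 1 / fs)

def toy_waveform(f0, tau):
    """Toy signal: a damped sinusoid (a stand-in for a real waveform model)."""
    return np.exp(-(t - 2.0) ** 2 / (2 * tau ** 2)) * np.sin(2 * np.pi * f0 * t)

def inner(a, b, psd, df):
    """Noise-weighted inner product <a|b> = 4 Re sum a(f) conj(b(f)) / Sn(f) df."""
    return 4.0 * np.real(np.sum(a * np.conj(b) / psd)) * df

def match(h1, h2, psd):
    """Normalised overlap of two time-domain waveforms (no time/phase maximisation)."""
    df = 1.0 / duration
    a, b = np.fft.rfft(h1) / fs, np.fft.rfft(h2) / fs
    return inner(a, b, psd, df) / np.sqrt(inner(a, a, psd, df) * inner(b, b, psd, df))

psd = np.full(len(t) // 2 + 1, 1e-44)            # toy flat noise power spectral density
h_a = toy_waveform(f0=200.0, tau=0.05)
h_b = toy_waveform(f0=205.0, tau=0.05)           # slightly different model parameters
print("match(h_a, h_b) = %.3f" % match(h_a, h_b, psd))
</syntaxhighlight>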

Contact: Chris Van Den Broeck


=== Gravitational Waves: Searches for gravitational waves from compact binary coalescence ===

Searches for gravitational waves from the mergers of black holes and neutron stars have been extraordinarily successful in the last four years. We are now beginning to study a population of heavy stellar-mass black holes in detail, including understanding how these systems came to form and whether they are consistent with general relativity. Additionally, the detection of binary neutron star mergers is allowing us to probe matter under extreme conditions. However, we have only just scratched the surface of possible signals and the new physics they would allow us to study. The detection of highly spinning and precessing systems would allow us to perform black hole population statistics to an extraordinary degree of accuracy. Detection of sub-solar-mass systems could provide evidence for dark matter. However, these searches are difficult because they require us to work in high-dimensional spaces and to develop new statistical methods. There are possibilities for several projects that involve the development and implementation of these new searches as well as the interpretation of the results, particularly in terms of the physics describing compact binary mergers.

Contact: Sarah Caudill


=== Gravitational Waves: Computer modelling to design the laser interferometers for the Einstein Telescope ===

A new field of instrument science led to the successful detection of gravitational waves by the LIGO detectors in 2015. We are now preparing the next generation of gravitational wave observatories, such as the Einstein Telescope, with the aim of increasing the detector sensitivity by a factor of ten, which would allow us, for example, to detect stellar-mass black holes from early in the universe, when the first stars began to form. This ambitious goal requires us to find ways to significantly improve the best laser interferometers in the world.

Gravitational wave detectors, such as LIGO and Virgo, are complex Michelson-type interferometers enhanced with optical cavities. We develop and use numerical models to study these laser interferometers, to invent new optical techniques and to quantify their performance. For example, we synthesize virtual mirror surfaces to study the effects of higher-order optical modes in the interferometers, and we use opto-mechanical models to test schemes for suppressing quantum fluctuations of the light field. We can offer several projects based on numerical modelling of laser interferometers. All projects will be directly linked to the ongoing design of the Einstein Telescope.
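
A minimal example of the kind of frequency-domain modelling involved (plain Python with illustrative mirror parameters, not the group's actual simulation tools) is the circulating power of a two-mirror Fabry-Perot cavity as a function of a microscopic length detuning, from which quantities such as the resonant power buildup and the cavity linewidth follow.

<syntaxhighlight lang="python">
import numpy as np

# Two-mirror Fabry-Perot cavity (plane-wave, lossless mirrors; illustrative numbers).
r1, t1 = np.sqrt(0.986), np.sqrt(0.014)   # input mirror amplitude reflectivity/transmissivity
r2 = np.sqrt(0.99999)                     # end mirror, nearly perfectly reflective
wavelength = 1064e-9                      # Nd:YAG laser wavelength (m)

def circulating_power(detuning_m, p_in=1.0):
    """Power circulating in the cavity versus microscopic length detuning (in metres)."""
    phi = 2 * np.pi * detuning_m / wavelength          # one-way phase shift
    e_circ = t1 / (1 - r1 * r2 * np.exp(-2j * phi))    # steady-state field buildup
    return p_in * np.abs(e_circ) ** 2

detunings = np.linspace(-wavelength / 4, wavelength / 4, 2001)
power = circulating_power(detunings)
fwhm = np.ptp(detunings[power > power.max() / 2])

print("power buildup on resonance: %.0f x input power" % power.max())
print("linewidth (FWHM) as a microscopic detuning: %.2e m" % fwhm)
</syntaxhighlight>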

Contact: Andreas Freise


=== Gravitational Waves: Digging away the noise to find the signal ===

Gravitational wave interferometers are extremely sensitive, but suffer from instrumental issues that produce noise mimicking astrophysical signals. This needs to be remedied as much as possible before the data analysis. The problem is that instrumentalists don't know about analysis pipelines, and data analysts don't know about experimental details; we need your help to bridge the gap. This is a good opportunity to learn about both sides and contribute directly to a booming international field. We have several tools and new ideas for correlating noise with the state of the instrument. These need to be developed further, used on years of data, and written up. The work will require Python, signal processing and statistics.
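
One standard ingredient of such studies is the spectral coherence between the strain channel and auxiliary instrument channels, which flags frequency bands where an instrumental disturbance leaks into the data. A minimal sketch using scipy (with synthetic signals standing in for the real detector channels) is given below.

<syntaxhighlight lang="python">
import numpy as np
from scipy.signal import coherence

rng = np.random.default_rng(3)
fs, duration = 1024.0, 64.0                      # sampling rate (Hz) and length (s)
n = int(fs * duration)
t = np.arange(n) / fs

# Synthetic stand-ins: an auxiliary channel with a 50 Hz disturbance that couples
# (with some lag and additional noise) into the 'strain' channel.
aux = np.sin(2 * np.pi * 50.0 * t) + rng.standard_normal(n)
strain = 0.1 * np.roll(aux, 17) + rng.standard_normal(n)

f, coh = coherence(strain, aux, fs=fs, nperseg=4096)
suspect = f[coh > 0.5]
print("frequency bins with coherence > 0.5:", suspect)
</syntaxhighlight>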

Contact: Bas Swinkels and Sarah Caudill


=== Gravitational Waves: Machine Learning techniques for GW Interferometers ===

This project concerns the control of suspended optical cavities in the non-linear regime. Gravitational wave interferometers are extremely sensitive, but suffer from a very small control range, causing lock losses and reducing the robustness of these instruments. In this project we will use a table-top replica of a suspended optical cavity, located in the new R&D laser lab at Nikhef, to develop a neural network that reconstructs the positions of the free-falling mirrors from beam images. A database with simulated beam images can be used to train various neural networks before deployment in the table-top experiment. We are looking for a hands-on and enthusiastic master student, interested in machine learning and experienced in programming languages like Python.
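
As an indication of the kind of network involved (a sketch using tf.keras and synthetic Gaussian beam spots; the actual simulated images, target observables and architecture are to be defined within the project), a small convolutional network can be trained to regress a position-like quantity, here simply the spot centre, from images.

<syntaxhighlight lang="python">
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(4)

def make_images(n, npix=32):
    """Synthetic 'beam images': Gaussian spots whose centre stands in for the mirror position."""
    yy, xx = np.mgrid[0:npix, 0:npix]
    pos = rng.uniform(8, npix - 8, size=(n, 2))                     # spot centres (the labels)
    img = np.exp(-((xx - pos[:, 0, None, None]) ** 2 +
                   (yy - pos[:, 1, None, None]) ** 2) / (2 * 3.0 ** 2))
    img += 0.05 * rng.standard_normal(img.shape)                    # camera noise
    return img[..., None].astype("float32"), (pos / npix).astype("float32")

x_train, y_train = make_images(4000)
x_val, y_val = make_images(500)

model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(8, 3, activation="relu", input_shape=(32, 32, 1)),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(2),                                       # regressed (x, y) position
])
model.compile(optimizer="adam", loss="mse")
model.fit(x_train, y_train, epochs=10, validation_data=(x_val, y_val), verbose=2)
</syntaxhighlight>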

Contact: Rob Walet and Frank Linde

=== VU LaserLaB: Measuring the electric dipole moment (EDM) of the electron ===

In collaboration with Nikhef and the Van Swinderen Institute for Particle Physics and Gravity at the University of Groningen, we have recently started an exciting project to measure the electric dipole moment (EDM) of the electron in cold beams of barium-fluoride molecules. The eEDM, which is predicted by the Standard Model of particle physics to be extremely small, is a powerful probe of physics beyond this Standard Model. Many extensions of the Standard Model, most prominently supersymmetry, naturally predict an electron EDM just below the current experimental limits. We aim to improve on the best current measurement by at least an order of magnitude. To do so we will perform a precision measurement on a slow beam of laser-cooled BaF molecules. With this low-energy precision experiment, we test physics at energy scales comparable to those probed at the LHC!

At LaserLaB VU, we are responsible for building and testing a cryogenic source of BaF molecules. The main parts of this source are currently being constructed in the workshop. We are looking for enthusiastic master students to help set up the laser system that will be used to detect BaF. Furthermore, projects are available to perform trajectory simulations in order to design a lens system that guides the BaF molecules from the exit of the cryogenic source to the experiment.
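
To give a flavour of such trajectory simulations (a 2-D toy model with invented beam and lens parameters, not the actual source or lens design), one can propagate molecules ballistically from the source and apply a transverse harmonic confinement inside the lens region, then count how many arrive within the acceptance of the experiment.

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(5)

# Toy beam of molecules leaving a cryogenic source (illustrative numbers only).
n = 100_000
v_z = rng.normal(180.0, 25.0, n)                 # forward velocity (m/s)
v_x = rng.normal(0.0, 5.0, n)                    # transverse velocity (m/s)
x = rng.normal(0.0, 1.5e-3, n)                   # transverse position (m)

def drift(x, v_x, v_z, length):
    """Free flight over a distance 'length' along the beam axis."""
    return x + v_x * (length / v_z), v_x

def lens(x, v_x, v_z, length, omega=2 * np.pi * 300.0):
    """Transverse harmonic confinement (angular frequency omega) over the lens length."""
    t = length / v_z
    x_new = x * np.cos(omega * t) + (v_x / omega) * np.sin(omega * t)
    v_new = -x * omega * np.sin(omega * t) + v_x * np.cos(omega * t)
    return x_new, v_new

x, v_x = drift(x, v_x, v_z, 0.20)        # source -> lens
x, v_x = lens(x, v_x, v_z, 0.30)         # inside the lens
x, v_x = drift(x, v_x, v_z, 0.50)        # lens -> detection region

accepted = np.abs(x) < 2e-3              # toy 2 mm half-aperture at the experiment
print("fraction of molecules within acceptance: %.3f" % accepted.mean())
</syntaxhighlight>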

Contact: Rick Bethlem

=== VU LaserLaB: Physics beyond the Standard Model from molecules ===

Our team, with a number of staff members (Ubachs, Eikema, Salumbides, Bethlem, Koelemeij), focuses on precision measurements in the hydrogen molecule and its isotopomers. The work aims at testing the QED calculations of energy levels in H2, D2, T2, HD, etc. with the most precise measurements, where all kinds of experimental laser techniques play a role (cw and pulsed lasers, atomic clocks, frequency combs, molecular beams). Another target of study is the connection to the 'proton size puzzle', which may be solved through studies of the hydrogen molecular isotopes.

In the past half year we have produced a number of important results that are described in the following papers:

* Frequency-comb (Ramsey-type) electronic excitations in the H2 molecule: see ''Deep-ultraviolet frequency metrology of H2 for tests of molecular quantum theory'', http://www.nat.vu.nl/~wimu/Publications/Altmann-PRL-2018.pdf
* Precision measurement of an infrared transition in the HD molecule: see ''Sub-Doppler frequency metrology in HD for tests of fundamental physics'', https://arxiv.org/abs/1712.08438
* The first precision study in molecular tritium T2: see ''Relativistic and QED effects in the fundamental vibration of T2'', http://arxiv.org/abs/1803.03161
* Dissociation energy of the hydrogen molecule at 10^-9 accuracy (paper submitted to Phys. Rev. Lett.)
* Probing QED and fundamental constants through laser spectroscopy of vibrational transitions in HD+: important results on the hydrogen molecular ion HD+ were obtained not so long ago, and we have a strong activity in this direction; see http://www.nat.vu.nl/~wimu/Publications/ncomms10385.pdf

These five results mark the various directions we are pursuing, and in all directions we aim at obtaining improvements. Specific projects with students can be defined; these are mostly experimental, although there may also be some theoretical tasks, such as performing calculations of hyperfine structures.

Contact: Wim Ubachs, Kjeld Eikema and Rick Bethlem


== Last year's MSc Projects ==
