Difference between revisions of "Last years MSc Projects"

From Education Wiki
== 2019: ==

=== Theory: The Effective Field Theory Pathway to New Physics at the LHC ===

A very promising framework to parametrise, in a robust and model-independent way, deviations from the Standard Model (SM) induced by new heavy particles is the Standard Model Effective Field Theory (SMEFT). In this formalism, beyond-the-SM effects are encapsulated in higher-dimensional operators constructed from SM fields and respecting their symmetry properties. In this project, we aim to carry out a global analysis of the SMEFT from high-precision LHC data, including Higgs boson production, flavour observables, and low-energy measurements. This analysis will be carried out in the context of the recently developed SMEFiT approach [1], based on Machine Learning techniques to efficiently explore the complex theory parameter space. The ultimate goal is either to uncover glimpses of new particles or interactions at the LHC, or to derive the most stringent model-independent bounds to date on general theories of New Physics.

[1] https://arxiv.org/abs/1901.05965

''Contact: [mailto:j.rojo@vu.nl Juan Rojo]''
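As a toy illustration of what such a global fit does, the sketch below constrains two hypothetical Wilson coefficients from three pseudo-measurements, assuming (as in the linearised SMEFT expansion) that each observable depends linearly on the coefficients. All numbers are invented, and a plain least-squares solution stands in for the ML-based SMEFiT sampling:

```python
import numpy as np

# Toy global EFT fit: observables depend linearly on Wilson coefficients c,
#   sigma_i(c) = sigma_sm_i * (1 + sum_j K_ij * c_j),
# and we minimise a chi^2 against pseudo-measurements. K, the pseudo-data
# and the uncertainties are invented numbers for illustration only.
sigma_sm = np.array([1.00, 0.50, 2.00])       # SM predictions (arb. units)
K = np.array([[0.30, 0.10],                   # linear EFT sensitivities
              [0.05, 0.40],
              [0.20, 0.02]])
data = np.array([1.02, 0.49, 2.05])           # pseudo-measurements
err = np.array([0.03, 0.02, 0.08])            # 1-sigma uncertainties

# chi^2 is quadratic in c, so its minimum solves a weighted linear system:
#   A c = b,  A_ij = sigma_sm_i * K_ij / err_i,  b_i = (data_i - sigma_sm_i) / err_i
A = (sigma_sm / err)[:, None] * K
b = (data - sigma_sm) / err
c_best, *_ = np.linalg.lstsq(A, b, rcond=None)
cov = np.linalg.inv(A.T @ A)                  # covariance of the fitted coefficients

for j, (c, dc) in enumerate(zip(c_best, np.sqrt(np.diag(cov)))):
    print(f"c_{j} = {c:+.3f} +- {dc:.3f}")
```

In the real analysis the parameter space is far larger and the likelihood non-Gaussian, which is why sampling techniques are needed instead of a one-shot linear solve.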
=== Theory: Pinning down the initial state of heavy-ion collisions with Machine Learning ===

It has been known for more than three decades that the parton distribution functions (PDFs) of nucleons bound within heavy nuclei are modified with respect to their free-nucleon counterparts. Despite active experimental and theoretical investigations, the underlying mechanisms that drive these in-medium modifications of nucleon substructure have yet to be fully understood. The determination of nuclear PDFs is highly relevant both to improve our fundamental understanding of the strong interactions in the nuclear environment and for the interpretation of heavy-ion collisions at RHIC and the LHC, in particular for the characterization of the Quark-Gluon Plasma. The goal of this project is to exploit Machine Learning and Artificial Intelligence tools [1,2] (neural networks trained by stochastic gradient descent) to pin down the initial state of heavy-ion collisions using recent measurements from proton-lead collisions at the LHC. Emphasis will be put on the poorly known nuclear modifications of the gluon PDFs, which are still largely ''terra incognita'' and highly relevant for phenomenological applications. In addition to theory calculations, the project will also involve code development using modern AI/ML tools such as TensorFlow and Keras.

[1] https://arxiv.org/abs/1811.05858

[2] https://arxiv.org/abs/1410.8849

''Contact: [mailto:j.rojo@vu.nl Juan Rojo]''
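The phrase "neural networks trained by stochastic gradient descent" can be made concrete with a small self-contained sketch: a toy 1-8-1 tanh network fitted with mini-batch SGD to an invented nuclear-modification-like ratio R(x). The real project would use TensorFlow/Keras and actual proton-lead data; the shape, noise level and architecture below are made up for illustration.

```python
import numpy as np

# Toy of "a neural network trained by stochastic gradient descent":
# fit an invented nuclear-modification-like ratio R(x) with a 1-8-1 network.
rng = np.random.default_rng(0)
x = rng.uniform(1e-3, 0.7, 200)[:, None]          # momentum fractions x
r_true = 1.0 - 0.3 * np.exp(-5.0 * x)             # invented shadowing-like shape
y = r_true + 0.02 * rng.normal(size=x.shape)      # pseudo-data with noise

W1 = rng.normal(0.0, 1.0, (1, 8)); b1 = np.zeros(8)   # hidden layer
W2 = rng.normal(0.0, 1.0, (8, 1)); b2 = np.zeros(1)   # output layer
lr = 0.05

for step in range(2000):
    i = rng.integers(0, len(x), 32)               # mini-batch indices
    h = np.tanh(x[i] @ W1 + b1)                   # forward pass
    pred = h @ W2 + b2
    g = 2.0 * (pred - y[i]) / len(i)              # dMSE/dpred
    gW2 = h.T @ g; gb2 = g.sum(0)                 # backpropagation
    gh = (g @ W2.T) * (1.0 - h**2)
    gW1 = x[i].T @ gh; gb1 = gh.sum(0)
    for p, gp in ((W1, gW1), (b1, gb1), (W2, gW2), (b2, gb2)):
        p -= lr * gp                              # SGD update

mse = float(np.mean((np.tanh(x @ W1 + b1) @ W2 + b2 - y) ** 2))
print(f"final MSE: {mse:.4f}")
```

In a Keras version the loop collapses to a `model.fit(...)` call; the point of the sketch is only to show what the optimiser is doing underneath.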
=== Theory: The High-Energy Muon Crisis and Perturbative QCD ===

The production of charmed mesons from the collision of high-energy cosmic rays with air nucleons in the upper atmosphere provides an important component of the flux of high-energy muons and neutrinos that can be detected at cosmic-ray experiments such as AUGER and neutrino telescopes such as KM3NeT or IceCube. The production of forward muons from charmed-meson decays is usually predicted from QCD models tuned to the data, rather than from first-principles QCD calculations. Interestingly, the number of such high-energy muons observed by AUGER seems to differ markedly from current theory predictions. In this project we aim to exploit state-of-the-art perturbative and non-perturbative QCD techniques to compute the flux of high-energy muons from charm decays and make predictions for a number of experiments sensitive to them.

[1] https://arxiv.org/abs/1904.12547

[2] https://arxiv.org/abs/1808.02034

[3] https://arxiv.org/abs/1511.06346

''Contact: [mailto:j.rojo@vu.nl Juan Rojo]''
=== ATLAS: The lifetime of the Higgs boson ===

While the Higgs boson was discovered in 2012, many of its properties remain unconstrained. This master project revolves around one such property: the lifetime of the Higgs boson. The lifetime can be obtained by measuring the width of the boson, but because the width is a few hundred times smaller than the detector resolution, a direct measurement is currently impossible. There is, however, an idea to overcome that limitation: by utilizing the interference between the Higgs boson decay and background processes we can perform an indirect measurement. This measurement potentially has the sensitivity to probe the width (or lifetime) predicted by the Standard Model. Specifically, the master project is about predicting the sensitivity of this measurement for different predictions of the Higgs width. The project is on the interface of theory and experiment, making use of Monte Carlo generators and the standard HEP analysis tools (ROOT, C++, Python).

''Contact: [mailto:mveen@nikhef.nl Michiel Veen] or [mailto:Ivo.van.Vulpen@nikhef.nl Hella Snoek & Ivo van Vulpen]''
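The statement that the width is a few hundred times smaller than the detector resolution can be checked numerically: smearing a Breit-Wigner lineshape with the SM width of about 4 MeV by a Gaussian resolution of 1.5 GeV (an assumed ballpark figure) gives an observed width set entirely by the resolution.

```python
import numpy as np

# Why a direct width measurement fails: convolve a Breit-Wigner lineshape
# (SM width ~4 MeV) with a Gaussian detector resolution (1.5 GeV here, an
# assumed ballpark value) and compare the apparent width with the true one.
m_h, gamma, sigma_res = 125.0, 0.0041, 1.5            # GeV

m = np.linspace(115.0, 135.0, 20001)                  # 1 MeV grid
bw = 1.0 / ((m**2 - m_h**2)**2 + (m_h * gamma)**2)    # relativistic Breit-Wigner shape
bw /= bw.sum()

# Convolve with the Gaussian resolution (kernel truncated at +-4 sigma)
dm = m[1] - m[0]
kernel = np.exp(-0.5 * (np.arange(-6000, 6001) * dm / sigma_res) ** 2)
kernel /= kernel.sum()
observed = np.convolve(bw, kernel, mode="same")

def fwhm(x, y):
    above = x[y >= y.max() / 2]                       # full width at half maximum
    return above[-1] - above[0]

print(f"true FWHM     : {fwhm(m, bw) * 1e3:.1f} MeV")
print(f"observed FWHM : {fwhm(m, observed):.2f} GeV")
```

Whatever the true width, the observed lineshape is essentially the resolution function, which is why the project relies on interference effects rather than the lineshape itself.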
=== ATLAS: The Next Generation ===

After the observation of the coupling of the Higgs boson to fermions of the third generation, the search for its coupling to fermions of the second generation is one of the next priorities for research at CERN's Large Hadron Collider. The search for the decay of the Higgs boson to two charm quarks is very new [1] and we see various opportunities for interesting developments. For this project we propose improvements in reconstruction (using exclusive decays), advanced analysis techniques (using deep learning methods) and expanding the new physics models (e.g. including a search for off-diagonal H->uc couplings). Another opportunity would be the development of the first statistical combination of results between the ATLAS and CMS experiments, which could significantly improve the discovery potential.

[1] https://arxiv.org/abs/1802.04329

''Contact: [mailto:tdupree@nikhef.nl Tristan du Pree and Marko Stamenkovic]''
=== ATLAS: The Most Energetic Higgs Boson ===

The production of Higgs bosons at the highest energies could give the first indications of deviations from the Standard Model of particle physics, but production energies above 500 GeV have not been observed yet [1]. The LHC Run-2 dataset, collected during the last four years, might be the first opportunity to observe such processes, and we have various ideas for new studies. Possible developments include the improvement of boosted reconstruction techniques, for example using multivariate deep learning methods. There are also various opportunities for unexplored theory interpretations (using the MadGraph event generator), including effective field theory models (with novel ‘morphing’ techniques) and the study of the Higgs boson’s self-coupling.

[1] https://arxiv.org/abs/1709.05543

''Contact: [mailto:tdupree@nikhef.nl Tristan du Pree and Brian Moser]''
=== LHCb: Measurement of Central Exclusive Production Rates of Chi_c using converted photons in LHCb ===

Central exclusive production (CEP) of particles at the LHC is characterised by an extremely clean signature. In contrast to typical inelastic collisions, where many particles are created, resulting in a so-called Primary Vertex, CEP events contain only the final-state particles of interest. In this project the particle of interest is the chi_c, a bound state of a charm quark-antiquark pair. In theory this process is generated by a long-range gluon exchange and can elucidate the nature of the strong force, described by quantum chromodynamics in the Standard Model. The proposed work involves analysing a pre-existing dataset with reconstructed chi_c candidates and simulating events at LHCb in order to obtain the relative occurrence rate of each chi_c species (spins 0, 1, 2), a quantity that can easily be compared to theoretical predictions.

''Contact: [mailto:K.Akiba@nikhef.nl Kazu Akiba]''
=== LHCb: Optimization studies for the Vertex detector at the High-Lumi LHCb ===

The LHCb experiment is dedicated to measuring tiny differences between matter and antimatter through the precise study of rare processes involving b or c quarks. The LHCb detector will undergo a major modification in order to dramatically increase the luminosity and be able to measure indirect effects of physics beyond the Standard Model. In this environment, over 42 simultaneous collisions are expected to happen within a time interval of 200 ps where the two proton bunches overlap. The particles of interest have a relatively long lifetime and therefore the best way to distinguish them from the background collisions is through the precise reconstruction of displaced vertices and pointing directions. The new detector considers using extremely recent or even future technologies to measure space (with resolutions below 10 um) and time (100 ps or better) to efficiently reconstruct the events of interest for physics. The project involves completely changing the LHCb Vertex Locator (VELO) design in simulation and determining the best performance achievable for the upgraded detector, considering different spatial and temporal resolutions.

''Contact: [mailto:K.Akiba@nikhef.nl Kazu Akiba]''
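A toy estimate of why timing helps: generate 42 simultaneous collisions spread in z and t, then count vertex pairs that a detector cannot resolve in space alone versus space plus time. The beam spreads and resolutions below are illustrative numbers, not the actual LHCb beam conditions.

```python
import numpy as np

# Count pileup-vertex pairs that are indistinguishable in z alone versus
# in z and t combined, for assumed (illustrative) spreads and resolutions.
rng = np.random.default_rng(1)
n_events, n_vtx = 500, 42
sigma_z_beam, sigma_t_beam = 50.0, 200.0      # luminous region spread: mm, ps
res_z, res_t = 0.1, 100.0                     # assumed vertex resolutions: mm, ps

unresolved_z, unresolved_zt = 0, 0
for _ in range(n_events):
    z = rng.normal(0.0, sigma_z_beam, n_vtx)
    t = rng.normal(0.0, sigma_t_beam, n_vtx)
    dz = np.abs(z[:, None] - z[None, :])
    dt = np.abs(t[:, None] - t[None, :])
    pair = np.triu(np.ones((n_vtx, n_vtx), dtype=bool), k=1)   # each pair once
    close_z = pair & (dz < 3 * res_z)          # indistinguishable in z alone
    unresolved_z += close_z.sum()
    unresolved_zt += (close_z & (dt < 3 * res_t)).sum()        # still merged with timing

print(f"pairs unresolved in z only : {unresolved_z}")
print(f"pairs unresolved in z and t: {unresolved_zt}")
```

Varying `res_z` and `res_t` in such a toy is, in miniature, the trade-off study the project would perform with the full simulation.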
=== LHCb: Measurement of charge multiplication in heavily irradiated sensors ===

During the R&D phase for the LHCb VELO Upgrade detector, a few sensor prototypes were irradiated to the extreme fluence expected to be reached during the detector lifetime. These samples were tested using high-energy particles at the SPS facility at CERN, with their trajectories reconstructed by the Timepix3 telescope. A preliminary analysis revealed that at the highest irradiation levels the amount of signal observed is higher than expected, and even larger than the signal obtained at lower doses. At the Device Under Test (DUT) position inside the telescope, the spatial resolution attained by this system is below 2 um. This means that a detailed analysis can be performed to study where and how this signal amplification happens within the 55x55 um^2 pixel cell. This project involves analysing the telescope and DUT data to investigate the charge multiplication mechanism at the microscopic level.

''Contact: [mailto:K.Akiba@nikhef.nl Kazu Akiba]''
=== Detector R&D: Studying fast timing detectors ===

Fast timing detectors are the solution for future tracking detectors. In future LHC operating conditions and at future colliders, more and more particles are produced per collision. The high particle densities make it increasingly difficult to separate particle trajectories with the spatial information that current silicon tracking detectors provide. A solution would be to add very precise (on the order of 10 ps) timestamps to the spatial measurements of the particle trackers. A good understanding of the performance of fast timing detectors is therefore necessary. With the use of a pulsed laser in the lab we study the characteristics of several prototype detectors.

''Contact: [mailto:H.Snoek@nikhef.nl Hella Snoek, Martin van Beuzekom, Kazu Akiba, Daniel Hynds]''
===Detector R&D: Laser Interferometer Space Antenna (LISA) ===

The space-based gravitational wave antenna LISA is without doubt one of the most challenging space missions ever proposed. Around 2030, ESA plans to launch three spacecraft, separated by a few million kilometres, to measure tiny variations in the distances between test masses located in each spacecraft, in order to detect gravitational waves from sources such as supermassive black holes. The triangular constellation of the LISA mission is dynamic, requiring constant fine-tuning of the pointing of the laser links between the spacecraft and a simultaneous refocusing of the telescope. The noise sources related to the laser links are expected to provide a dominant contribution to the LISA performance.

An update and extension of the LISA science simulation software is needed to assess the hardware development for LISA at Nikhef, TNO and SRON. A position is therefore available for a master student to study the impact of instrumental noise on the performance of LISA. Realistic simulations based on hardware (noise) characterization measurements that were done at TNO will be carried out and compared to the expected tantalizing gravitational wave sources.

Key words: LISA, space, gravitational waves, simulations, signal processing

''Contact: [mailto:nielsvb@nikhef.nl Niels van Bakel], [mailto:ernst-jan.buis@tno.nl Ernst-Jan Buis]''
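A minimal sketch of the kind of noise-impact study meant here: inject a monochromatic gravitational-wave signal into white instrumental noise and recover it with a matched filter. The sampling rate, frequency, amplitude and noise level are arbitrary illustrative values; the real study would use measured TNO noise spectra and realistic LISA response functions.

```python
import numpy as np

# Inject a monochromatic signal into white noise and estimate its amplitude
# and SNR with a matched filter. All numbers are illustrative placeholders.
rng = np.random.default_rng(2)
fs, duration = 1.0, 10000.0                   # Hz, s (LISA band is in the mHz range)
t = np.arange(0.0, duration, 1.0 / fs)
f_gw, amp, sigma_n = 5e-3, 0.2, 1.0           # signal frequency, amplitude, noise RMS

template = np.sin(2 * np.pi * f_gw * t)
data = amp * template + rng.normal(0.0, sigma_n, t.size)

# For white noise the matched filter reduces to a projection onto the template
a_hat = (data @ template) / (template @ template)
snr = a_hat * np.sqrt(template @ template) / sigma_n

print(f"recovered amplitude: {a_hat:.3f} (injected {amp})")
print(f"matched-filter SNR : {snr:.1f}")
```

Replacing the white noise with coloured noise from a measured hardware spectrum is exactly where the simulation work of this project would start.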
===Detector R&D: Spectral X-ray imaging - Looking at colours the eyes can't see===

When a conventional X-ray image is made to analyse the composition of a sample, or to perform a medical examination on a patient, one acquires an image that only shows intensities: a ‘black and white’ image. Most of the information carried by the photon energy is lost. The lack of spectral information can result in an ambiguity between the material composition and the amount of material in the sample. If the X-ray intensity as a function of energy can be measured (i.e. a ‘colour’ X-ray image), more information can be obtained from a sample. This translates to a lower required dose and/or a better understanding of the sample that is being investigated. Two fields that can benefit from spectral X-ray imaging are, for example, mammography and real-time CT.

X-ray detectors based on Medipix/Timepix pixel chips have spectral resolving capabilities and can be used to make polychromatic X-ray images. Medipix and Timepix chips have branched from pixel chips developed for detectors at high-energy physics collider experiments.

Activities in the field of (spectral) CT scans are performed in a collaboration between two institutes (Nikhef and CWI) and two companies (ASI and XRE).

Some activities that students can work on:

- Medical X-ray imaging (CT and conventional X-ray images): detection of iodine contrast agent; detection of calcifications (a hint for a tumour).

- Material research: using spectral information to identify materials and recognize compounds.

- Determining how much existing applications can benefit from spectral X-ray imaging and looking for potential new applications.

- Characterizing, calibrating and optimizing X-ray imaging detector systems.

''Contact: [mailto:martinfr@nikhef.nl Martin Fransen]''
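The composition/thickness ambiguity argument can be made quantitative: with transmission measured in two energy bins, Beer-Lambert attenuation gives two linear equations for two material thicknesses, which can then be solved for. The attenuation coefficients below are invented placeholder values, not tabulated data.

```python
import numpy as np

# Dual-energy material decomposition: two energy bins give two Beer-Lambert
# equations,  -ln(I/I0) at bin e  =  sum_m mu[e, m] * d[m],
# so two material thicknesses d can be solved for exactly.
mu = np.array([[0.50, 3.00],      # mu [1/cm] at E_low  for (water, iodine) - invented
               [0.30, 1.20]])     # mu [1/cm] at E_high for (water, iodine) - invented

d_true = np.array([4.0, 0.05])    # cm of water and iodine along this ray

# Simulated measurement: log-attenuation in each energy bin
log_att = mu @ d_true

# Decomposition: solve the 2x2 linear system for the thicknesses
d_reco = np.linalg.solve(mu, log_att)
print("recovered thicknesses [cm]:", d_reco)
```

With a single energy bin the same `log_att` value could come from many water/iodine combinations; the second bin is what breaks that degeneracy.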
===Detector R&D: Holographic projector===

A difficulty in generating holograms (based on the interference of light) is the required dense pixel pitch: one would need a pixel pitch of less than 200 nanometres. With larger pixels, artefacts occur due to spatial undersampling. A pixel pitch of 200 nanometres is difficult, if not impossible, to achieve, especially over larger areas. Another challenge is the massive amount of computing power that would be required to control such a dense pixel matrix.

A new holographic projection method has been developed that reduces undersampling artefacts for projectors with a ‘low’ pixel density. It uses 'pixels' at random but known positions, resulting in an array of (coherent) light points that lacks (or has strongly suppressed) spatial periodicity. As a result, a holographic projector can be built with a significantly lower pixel density and correspondingly less required computing power. This could bring holography within reach for many applications such as displays, lithography, 3D printing, metrology, etc.

Of course, nothing comes for free: with fewer pixels, holograms become noisier and the contrast is reduced (not all light ends up in the hologram). The questions: How does the quality of a hologram depend on pixel density? How do we determine projector requirements based on requirements for hologram quality?

Requirements for a hologram can be expressed in terms of noise, contrast, resolution, suppression of undersampling artefacts, etc.

For this project we are building a proof-of-concept holographic emitter. This set-up will be used to verify simulation results (and also to project some cool holograms, of course).

Students can do hands-on lab work (building and testing the prototype projector) and/or work on setting up simulation methods and models. The simulations are highly parallel and are preferably written for parallel/multithreaded computing and/or GPU computing.

''Contact: [mailto:martinfr@nikhef.nl Martin Fransen]''
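A 1-D numerical sketch of the central idea: the far-field intensity of N coherent emitters on a regular grid with a pitch much larger than the wavelength shows strong grating lobes (the undersampling artefacts), while the same number of emitters at random but known positions suppresses them, at the cost of a diffuse noise floor. All dimensions are illustrative.

```python
import numpy as np

# Far-field pattern |sum_n exp(i k x_n sin(theta))|^2 of N coherent emitters:
# regular grid (coarse pitch -> grating lobes) vs random but known positions.
lam = 0.5e-6                        # wavelength: 500 nm
pitch = 2.0e-6                      # 2 um pitch, far coarser than the ~lam/2 ideal
n = 200
k = 2 * np.pi / lam

x_per = np.arange(n) * pitch                    # periodic emitter positions
rng = np.random.default_rng(3)
x_rnd = rng.uniform(0, n * pitch, n)            # random emitter positions

sin_t = np.linspace(-0.6, 0.6, 4001)

def pattern(x):
    field = np.exp(1j * k * np.outer(sin_t, x)).sum(axis=1)
    return np.abs(field)**2 / n**2              # normalised: main lobe ~ 1

p_per, p_rnd = pattern(x_per), pattern(x_rnd)

# First grating lobe of the periodic array sits at sin(theta) = lam/pitch = 0.25
lobe = np.abs(sin_t - lam / pitch) < 0.01
print(f"periodic array near sin(theta)=0.25: peak {p_per[lobe].max():.3f}")
print(f"random array  near sin(theta)=0.25: peak {p_rnd[lobe].max():.3f}")
```

The residual speckle level of the random array (roughly 1/N of the main lobe) is exactly the noise/contrast penalty mentioned above, which is what the simulation studies would quantify.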
=== KM3NeT: Reconstruction of first neutrino interactions in KM3NeT ===

The neutrino telescope KM3NeT is under construction in the Mediterranean Sea, aiming to detect cosmic neutrinos. Its first few strings with sensitive photodetectors have been deployed at both the Italian and the French detector sites. Already these few strings provide the option to reconstruct in the detector the abundant muons stemming from interactions of cosmic rays with the atmosphere and to identify neutrino interactions. In order to identify neutrinos, an accurate reconstruction and an optimal understanding of the backgrounds are crucial. In this project we will use the available data together with simulations to optimally identify and reconstruct the first neutrino interactions in the KM3NeT detector (applying also machine learning for background suppression) and with this pave the path towards accurate neutrino oscillation measurements and neutrino astronomy.

Programming skills are essential; mostly ROOT and C++ will be used.

''Contact: [mailto:bruijn@nikhef.nl Ronald Bruijn] [mailto:dosamt@nikhef.nl Dorothea Samtleben]''
=== KM3NeT: Acoustic detection of ultra-high energy cosmic-ray neutrinos (2 projects) ===

The study of cosmic neutrinos with energies above 10^17 eV, the so-called ultra-high-energy neutrinos, provides a unique view on the universe and may provide insight into the origin of the most violent astrophysical sources, such as gamma-ray bursts, supernovae or even dark matter. In addition, the observation of high-energy neutrinos may provide a unique tool to study interactions at high energies. The energy deposition of these extreme neutrinos in water induces a thermo-acoustic signal, which can be detected using sensitive hydrophones. The expected neutrino flux is, however, extremely low, and the signal that neutrinos induce is small. TNO is presently developing sensitive hydrophone technology based on fiber optics. Optical fibers form a natural way to create a distributed sensing system. Using this technology a large-scale neutrino telescope can be built in the deep sea. TNO is aiming for a prototype hydrophone which will form the building block of a future telescope.

The work will be executed at the Nikhef institute and/or the TNO laboratories in Delft. In this project there are two opportunities for master students to participate:<br>
<b>Student project 1:</b> Hardware development on fiber-optic hydrophone technology. Goal: characterise existing prototype optical fiber hydrophones in an anechoic basin at the TNO laboratory. Data collection, calibration, characterisation, and analysis of the consequences for the design of future acoustic hydrophone neutrino telescopes. Keywords: optical fiber technology, signal processing, electronics, lab.<br>
<b>Student project 2:</b> Investigation of ultra-high-energy neutrinos and their interactions with matter. Goal: simulate the (currently imperfectly modelled) interactions at extremely high energies, and characterise the differences with currently available physics models and the impact on the physics reach of future acoustic hydrophone neutrino telescopes. Keywords: Monte Carlo simulations, particle physics, cosmology.<br>

Further information: Info on ultra-high-energy neutrinos can be found at http://arxiv.org/abs/1102.3591; info on acoustic detection of neutrinos can be found at http://arxiv.org/abs/1311.7588

''Contact: [mailto:ernst-jan.buis@tno.nl Ernst-Jan Buis] and [mailto:ivo.van.vulpen@nikhef.nl Ivo van Vulpen]''
=== KM3NeT: Applying state-of-the-art reconstruction software to 10 years of Antares data ===

While the KM3NeT neutrino telescope is being constructed in the deep waters of the Mediterranean Sea, data from its precursor (Antares) have been accumulated for more than 10 years. The main objective of these neutrino telescopes is to determine the origin of (cosmic) neutrinos. The accuracy of the determination of the origin of neutrinos critically depends on the probability density function (PDF) of the arrival time of Cherenkov light produced by relativistic charged particles emerging from a neutrino interaction in the sea. It has been shown that these PDFs can be calculated from first principles and that the obtained values can efficiently be interpolated in 4 and 5 dimensions without compromising the functional dependencies. The reconstruction software based on this input indeed yields the best resolution for KM3NeT. This project is aimed at applying the KM3NeT software to the available Antares data.

''Contact: [mailto:mjg@nikhef.nl Maarten de Jong]''
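The interpolation step mentioned above can be sketched in a few lines: multilinear interpolation of a tabulated function on a regular grid in d dimensions reduces to d successive 1-D linear interpolations. The 4-D toy table below merely stands in for the tabulated arrival-time PDFs; it is not the KM3NeT implementation.

```python
import numpy as np

# Multilinear interpolation on a regular d-dimensional grid, done as d
# successive linear interpolations along the leading axis.
def multilinear(grids, table, point):
    """Interpolate `table` (shape tuple(len(g) for g in grids)) at `point`."""
    vals = table
    for g, p in zip(grids, point):
        i = int(np.clip(np.searchsorted(g, p) - 1, 0, len(g) - 2))
        w = (p - g[i]) / (g[i + 1] - g[i])    # fractional position in the cell
        vals = (1 - w) * vals[i] + w * vals[i + 1]   # reduce the leading axis
    return vals

# Toy 4-D table of a smooth function on a regular grid
grids = [np.linspace(0.0, 1.0, 21) for _ in range(4)]
mesh = np.meshgrid(*grids, indexing="ij")
table = np.sin(mesh[0] + 2 * mesh[1]) * np.exp(-mesh[2]) + mesh[3]

pt = np.array([0.37, 0.52, 0.11, 0.83])
approx = multilinear(grids, table, pt)
exact = np.sin(pt[0] + 2 * pt[1]) * np.exp(-pt[2]) + pt[3]
print(f"interpolated {approx:.5f}  vs exact {exact:.5f}")
```

Evaluating the tables this way is what makes a first-principles PDF fast enough to sit inside a track-fit likelihood.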
=== HiSPARC: Extensive Air Shower Reconstruction using Machine Learning ===

An important aspect of high-energy cosmic-ray research is the reconstruction of the direction and energy of the primary cosmic ray. This is done by measuring the footprint of the extensive air shower initiated by the cosmic ray. The goal of this project is to advance the development of a reconstruction algorithm based on machine learning (ML) techniques.

A previous master student has made great progress in the creation of an ML algorithm for the direction reconstruction. The algorithm was trained on simulations and applied to real data. The method works quite well, but we expect that better results can be achieved by improving the simulated data set. In this project you will implement a more accurate description of the photomultiplier tube in the simulation pipeline and check whether the reconstruction improves. The next step would be to advance the algorithm towards energy reconstruction. This means upscaling the current method and will involve the creation and manipulation of large simulated data sets.

The HiSPARC group is small. As a student you can have a big impact and there is freedom to tailor your own project. The proposed project is for students with a particular interest in computational (astro)physics. Advanced programming skills (mainly Python) and Linux knowledge are desirable.

''Contact: [mailto:kaspervd@nikhef.nl Kasper van Dam] and [mailto:vaneijk@nikhef.nl Bob van Eijk]''
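For context, the classical (non-ML) baseline for direction reconstruction is a plane-wave fit to the arrival times on the ground: the shower front is approximated as a plane, and its direction follows from the time differences between detectors. The layout, direction and timing jitter below are invented for illustration, not an actual HiSPARC station configuration.

```python
import numpy as np

# Plane-front model: c*t_i = n . r_i + c*t0, with n the horizontal projection
# of the shower direction. Fit (n_x, n_y, c*t0) by least squares to the times.
C = 0.2998                                    # speed of light in m/ns
rng = np.random.default_rng(4)

det = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0],
                [10.0, 10.0], [5.0, 5.0], [2.0, 7.0]])   # detector x,y in m

theta_true, phi_true = np.radians(30.0), np.radians(120.0)
n_true = np.array([np.sin(theta_true) * np.cos(phi_true),
                   np.sin(theta_true) * np.sin(phi_true)])

# Simulated arrival times with 0.5 ns Gaussian timing jitter (assumed figure)
t = (det @ n_true) / C + 5.0 + rng.normal(0.0, 0.5, len(det))

A = np.column_stack([det, np.ones(len(det))])
sol, *_ = np.linalg.lstsq(A, C * t, rcond=None)
nx, ny = sol[0], sol[1]

theta = np.degrees(np.arcsin(np.hypot(nx, ny)))
phi = np.degrees(np.arctan2(ny, nx)) % 360.0
print(f"reconstructed zenith {theta:.1f} deg, azimuth {phi:.1f} deg (true: 30, 120)")
```

An ML reconstruction is judged against exactly this kind of baseline, and the same simulated events can feed both.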
=== VU LaserLaB: Measuring the electric dipole moment (EDM) of the electron ===

In collaboration with Nikhef and the Van Swinderen Institute for Particle Physics and Gravity at the University of Groningen, we have recently started an exciting project to measure the electric dipole moment (EDM) of the electron in cold beams of barium fluoride molecules. The eEDM, which is predicted by the Standard Model of particle physics to be extremely small, is a powerful probe to explore physics beyond this Standard Model. Many extensions of the Standard Model, most prominently supersymmetry, naturally predict an electron EDM just below the current experimental limits. We aim to improve on the best current measurement by at least an order of magnitude. To do so we will perform a precision measurement on a slow beam of laser-cooled BaF molecules. With this low-energy precision experiment, we test physics at energies comparable to those of the LHC!

At LaserLaB VU, we are responsible for building and testing a cryogenic source of BaF molecules. The main parts of this source are currently being constructed in the workshop. We are looking for enthusiastic master students to help set up the laser system that will be used to detect BaF. Furthermore, projects are available to perform trajectory simulations to design a lens system that guides the BaF molecules from the exit of the cryogenic source to the experiment.

''Contact: [mailto:H.L.Bethlem@vu.nl Rick Bethlem]''
=== VU LaserLab: Physics beyond the Standard Model from molecules ===

Our team, with a number of staff members (Ubachs, Eikema, Salumbides, Bethlem, Koelemeij), focuses on precision measurements in the hydrogen molecule and its isotopomers. The work aims at testing the QED calculations of energy levels in H2, D2, T2, HD, etc. with the most precise measurements, where all kinds of experimental laser techniques play a role (cw and pulsed lasers, atomic clocks, frequency combs, molecular beams). Another target of study is the connection to the "proton size puzzle", which may be solved through studies of the hydrogen molecular isotopes.

In the past half year we have produced a number of important results that are described in the following papers:

* ''Frequency comb (Ramsey-type) electronic excitations in the H2 molecule''
see: Deep-ultraviolet frequency metrology of H2 for tests of molecular quantum theory: http://www.nat.vu.nl/~wimu/Publications/Altmann-PRL-2018.pdf
* ''Precision measurement of an infrared transition in the HD molecule''
see: Sub-Doppler frequency metrology in HD for tests of fundamental physics: https://arxiv.org/abs/1712.08438
* ''The first precision study in molecular tritium T2''
see: Relativistic and QED effects in the fundamental vibration of T2: http://arxiv.org/abs/1803.03161
* ''Dissociation energy of the hydrogen molecule at 10^-9 accuracy'' (paper submitted to Phys. Rev. Lett.)
* ''Probing QED and fundamental constants through laser spectroscopy of vibrational transitions in HD+''
This is also a study of the hydrogen molecular ion HD+, where important results were obtained not so long ago and where we have a strong activity: http://www.nat.vu.nl/~wimu/Publications/ncomms10385.pdf

These five results mark the various directions we are pursuing, and in all directions we aim at obtaining improvements. Specific projects with students can be defined; these are mostly experimental, although there might be some theoretical tasks, such as:
* Performing calculations of hyperfine structures

As for the theory, there might also be an international connection for specifically bright theory students: we collaborate closely with Prof. Krzysztof Pachucki, and we might find an opportunity for a student to perform (the best!) QED calculations in molecules, working partly in Warsaw and partly in Amsterdam. Prof. Frederic Merkt from ETH Zurich, an expert in the field, will come to work with us on "hydrogen" from August to December 2018 while on sabbatical.

''Contact: [mailto:w.m.g.ubachs@vu.nl Wim Ubachs] [mailto:k.s.e.eikema@vu.nl Kjeld Eikema] [mailto:h.l.bethlem@vu.nl Rick Bethlem]''
== 2018: ==

=== Theory: Stress-testing the Standard Model at the high-energy frontier ===

Revision as of 12:07, 17 January 2020

2019:

Theory: The Effective Field Theory Pathway to New Physics at the LHC

A very promising framework to parametrise in a robust and model-independent way deviations from the Standard Model (SM) induced by new heavy particles is the Standard Model Effective Field Theory (SMEFT). In this formalism, Beyond the SM effects are encapsulated in higher-dimensional operators constructed from SM fields respecting their symmetry properties. In this project, we aim to carry out a global analysis of the SMEFT from high-precision LHC data, including Higgs boson production, flavour observables, and low-energy measurements. This analysis will be carried out in the context of the recently developed SMEFiT approach [1] based on Machine Learning techniques to efficiently explore the complex theory parameter space. The ultimate goal is either to uncover glimpses of new particles or interactions at the LHC, or to derive the most stringent model-independent bounds to date on general theories of New Physics.

[1] https://arxiv.org/abs/1901.05965

Contact: Juan Rojo

Theory: Pinning down the initial state of heavy-ion collisions with Machine Learning

It has been known for more than three decades that the parton distribution functions (PDFs) of nucleons bound within heavy nuclei are modified with respect to their free-nucleon counterparts. Despite active experimental and theoretical investigations, the underlying mechanisms that drive these in-medium modifications of nucleon substructure have yet to be fully understood. The determination of nuclear PDFs is a topic of high relevance in order both to improve our fundamental understanding of the strong interactions in the nuclear environment, as well as and for the interpretation of heavy ion collisions at RHIC and the LHC, in particular for the characterization of the Quark-Gluon Plasma. The goal of this project is to exploit Machine Learning and Artificial Intelligence tools [1,2] (neural networks trained by stochastic gradient descent) to pin down the initial state of heavy ion collisions by using recent measurements from proton-lead collisions at the LHC. Emphasis will be put on the poorly-known nuclear modifications of the gluon PDFs, which are still mostly terra incognita and highly relevant for phenomenological applications. In addition to theory calculations, the project will also involve code development using modern AI/ML tools such as TensorFlow and Keras.

[1] https://arxiv.org/abs/1811.05858 [2] https://arxiv.org/abs/1410.8849

Contact: Juan Rojo

Theory: The High-Energy Muon Crisis and Perturbative QCD

The production of charmed meson from the collision of high-energy cosmic rays with air nucleons in the upper atmosphere provides an important component of the flux of high-energy muons and neutrinos that can be detected at cosmic ray experiments such as AUGER and neutrino telescopes such as KM3NET or IceCube. The production of forward muons from charmed meson decays is usually predicted from QCD models tuned to the data, rather than from first principles QCD calculation. Interestingly, the number of such high-energy muons observed by AUGER seems to differ markedly from current theory predictions. In this project we aim to exploit state-of-the-art perturbative and non-perturbative QCD techniques to compute the flux of high-energy muons from charm decays and make predictions for a number of experiments sensitive to them


[1] https://arxiv.org/abs/1904.12547 [2] https://arxiv.org/abs/1808.02034 [3] https://arxiv.org/abs/1511.06346

Contact: Juan Rojo


ATLAS: The lifetime of the Higgs boson

While the Higgs boson was discovered in 2012, many of its properties still remain unconstrained. This master student project revolves around one such property, the lifetime of the Higgs boson. The lifetime can be obtained by measuring the width of the boson, but because the width is a few hundred times smaller than the detector resolution, a direct measurement is impossible at the moment. But there is an idea to overcome that limitation. By utilizing the interference between the Higgs boson decay and background processes we can perform an indirect measurement. This measurement potentially has the sensitivity that will allow us to perform a measurement of the width (or lifetime) as predicted by the Standard Model. Specifically, the master project will be about predicting the sensitivity of this measurement for different predictions of the Higgs width. The project is on the interface of theory and experiment, making use of Monte Carlo generators and the standard HEP analysis tools (ROOT, C++, python).

Contact: Michiel Veen or Hella Snoek & Ivo van Vulpen

ATLAS: The Next Generation

After the observation of the coupling of Higgs bosons to fermions of the third generation, the search for the coupling to fermions of the second generation is one of the next priorities for research at CERN's Large Hadron Collider. The search for the decay of the Higgs boson to two charm quarks is very new [1] and we see various opportunities for interesting developments. For this project we propose improvements in reconstruction (using exclusive decays), advanced analysis techiques (using deep learning methods) and expanding the new physics models (e.g. including a search for off-diagonal H->uc couplings). Another opportunity would be the development of the first statistical combination of results between the ATLAS and CMS experiment, which could significantly improve the discovery potentional.

[1] https://arxiv.org/abs/1802.04329

Contact: Tristan du Pree and Marko Stamenkovic

=== ATLAS: The Most Energetic Higgs Boson ===

The production of Higgs bosons at the highest energies could give the first indications of deviations from the standard model of particle physics, but production at energies above 500 GeV has not been observed yet [1]. The LHC Run-2 dataset, collected during the last 4 years, might offer the first opportunity to observe such processes, and we have various ideas for new studies. Possible developments include the improvement of boosted reconstruction techniques, for example using multivariate deep learning methods. There are also various opportunities for unexplored theory interpretations (using the MadGraph event generator), including effective field theory models (with novel ‘morphing’ techniques) and the study of the Higgs boson’s self-coupling.

[1] https://arxiv.org/abs/1709.05543

Contact: Tristan du Pree and Brian Moser

=== LHCb: Measurement of Central Exclusive Production Rates of Chi_c using converted photons in LHCb ===

Central exclusive production (CEP) of particles at the LHC is characterised by an extremely clean signature. Unlike typical inelastic collisions, where many particles are created, resulting in a so-called primary vertex, CEP events contain only the final-state particles of interest. In this project the particle of interest is the chi_c, a bound state of a charm quark-antiquark pair. In theory this process proceeds via long-range gluon exchange and can elucidate the nature of the strong force, described by quantum chromodynamics in the Standard Model. The proposed work involves analysing a pre-existing dataset with reconstructed chi_c candidates and simulating events in LHCb in order to obtain the relative occurrence rate of each chi_c species (spins 0, 1, 2), a quantity that can easily be compared to theoretical predictions.

Contact: Kazu Akiba

=== LHCb: Optimization studies for the Vertex detector at the High Lumi LHCb ===

The LHCb experiment is dedicated to measuring tiny differences between matter and antimatter through the precise study of rare processes involving b or c quarks. The LHCb detector will undergo a major modification in order to dramatically increase the luminosity and be able to measure indirect effects of physics beyond the Standard Model. In this environment, over 42 simultaneous collisions are expected within the roughly 200 ps time interval during which the two proton bunches overlap. The particles of interest have a relatively long lifetime, and therefore the best way to distinguish them from the background collisions is through the precise reconstruction of displaced vertices and pointing directions. The new detector design considers extremely recent or even future technologies to measure space (with resolutions below 10 um) and time (100 ps or better) in order to efficiently reconstruct the events of interest for physics. The project involves completely changing the LHCb Vertex Locator (VELO) design in simulation and determining the best achievable performance for the upgraded detector, considering different spatial and temporal resolutions.

Contact: Kazu Akiba

=== LHCb: Measurement of charge multiplication in heavily irradiated sensors ===

During the R&D phase for the LHCb VELO Upgrade detector a few sensor prototypes were irradiated to the extreme fluence expected to be achieved during the detector lifetime. These samples were tested using high energy particles at the SPS facility at CERN with their trajectories reconstructed by the Timepix3 telescope. A preliminary analysis revealed that at the highest irradiation levels the amount of signal observed is higher than expected, and even larger than the signal obtained at lower doses. At the Device Under Test (DUT) position inside the telescope, the spatial resolution attained by this system is below 2 um. This means that a detailed analysis can be performed in order to study where and how this signal amplification happens within the 55x55 um^2 pixel cell. This project involves analysing the telescope and DUT data to investigate the charge multiplication mechanism at the microscopic level.

Contact: Kazu Akiba

=== Detector R&D: Studying fast timing detectors ===

Fast timing detectors are a key ingredient of future tracking detectors. In future LHC operating conditions and at future colliders, more and more particles are produced per collision. The high particle densities make it increasingly difficult to separate particle trajectories using only the spatial information that current silicon tracking detectors provide. A solution is to add very precise (on the order of 10 ps) timestamps to the spatial measurements of the particle trackers. This requires a good understanding of the performance of fast timing detectors. With the use of a pulsed laser in the lab, we study the characteristics of several prototype detectors.

Contact: Hella Snoek, Martin van Beuzekom, Kazu Akiba, Daniel Hynds

=== Detector R&D: Laser Interferometer Space Antenna (LISA) ===

The space-based gravitational wave antenna LISA is without doubt one of the most challenging space missions ever proposed. Around 2030, ESA plans to launch three spacecraft, separated by a few million kilometers, to measure tiny variations in the distances between test masses located in each spacecraft, in order to detect gravitational waves from sources such as supermassive black holes. The triangular constellation of the LISA mission is dynamic, requiring constant fine-tuning of the pointing of the laser links between the spacecraft and a simultaneous refocusing of the telescope. The noise sources related to the laser links are expected to give a dominant contribution to the LISA performance.

An update and extension of the LISA science simulation software is needed to assess the hardware development for LISA at Nikhef, TNO and SRON. A position is therefore available for a master student to study the impact of instrumental noise on the performance of LISA. Realistic simulations, based on hardware (noise) characterization measurements performed at TNO, will be carried out and compared to the signals expected from tantalizing gravitational wave sources.

Key words: LISA, space, gravitational waves, simulations, signal processing

Contact: Niels van Bakel, Ernst-Jan Buis

=== Detector R&D: Spectral X-ray imaging - Looking at colours the eyes can't see ===

When a conventional X-ray image is made to analyse the composition of a sample, or to perform a medical examination on a patient, one acquires an image that only shows intensities: a ‘black and white’ image. Most of the information carried by the photon energy is lost. This lack of spectral information can result in an ambiguity between the material composition and the amount of material in the sample. If the X-ray intensity can be measured as a function of energy (i.e. a ‘colour’ X-ray image), more information can be obtained from a sample. This translates into a lower required dose and/or a better understanding of the sample under investigation. Two fields that can benefit from spectral X-ray imaging are, for example, mammography and real-time CT.
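Why spectral information removes the composition/thickness ambiguity can be seen in a small sketch (with hypothetical, uncalibrated attenuation coefficients): with the transmission measured in two energy bins, Beer-Lambert gives two equations for two unknown material thicknesses.

```python
# Dual-energy material decomposition toy example (made-up coefficients).
import numpy as np

# mu[i, j]: attenuation coefficient (1/cm) of material j in energy bin i
mu = np.array([[0.50, 2.00],   # low-energy bin: (soft tissue, iodine)
               [0.30, 0.60]])  # high-energy bin

t_true = np.array([4.0, 0.1])   # thicknesses in cm (tissue, iodine)
log_att = mu @ t_true           # measured -ln(I/I0) per energy bin

# With a single energy bin this system is under-determined; with two
# bins both thicknesses can be recovered:
t_rec = np.linalg.solve(mu, log_att)
```

A single intensity measurement would only constrain one combination of the two thicknesses, which is exactly the ambiguity described above.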

X-ray detectors based on Medipix/Timepix pixel chips have spectral resolving capabilities and can be used to make polychromatic X-ray images. Medipix and Timepix chips have branched from pixel chips developed for detectors for high energy physics collider experiments.

Activities in the field of (spectral) CT scans are performed in a collaboration between two institutes (Nikhef and CWI) and two companies (ASI and XRE). Some activities that students can work on:

- Medical X-ray imaging (CT and conventional X-ray images): Detection of iodine contrast agent. Detection of calcifications (hint for a tumour).

- Material research: Using spectral information to identify materials and recognize compounds.

- Determining how much existing applications can benefit from spectral X-ray imaging and looking for potential new applications.

- Characterizing, calibrating, optimizing X-ray imaging detector systems.

Contact: Martin Fransen

=== Detector R&D: Holographic projector ===

A difficulty in generating holograms (based on the interference of light) is the required dense pixel pitch: one would need a pixel pitch of less than 200 nanometers. With larger pixels, artefacts occur due to spatial undersampling. A pixel pitch of 200 nanometers is difficult, if not impossible, to achieve, especially over larger areas. Another challenge is the massive amount of computing power that would be required to control such a dense pixel matrix.

A new holographic projection method has been developed that reduces undersampling artefacts for projectors with a ‘low’ pixel density. It uses 'pixels' at random but known positions, resulting in an array of (coherent) light points that lacks (or strongly suppresses) spatial periodicity. As a result, a holographic projector can be built with a significantly lower pixel density and correspondingly smaller computing requirements. This could bring holography within reach of many applications, such as displays, lithography, 3D printing, and metrology.

Of course, nothing comes for free: with fewer pixels, holograms become noisier and the contrast is reduced (not all light ends up in the hologram). The key questions are: How does the quality of a hologram depend on pixel density? How do we determine projector requirements based on requirements for hologram quality?
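The suppression of undersampling artefacts by random pixel placement can be illustrated with a small far-field toy model (our own sketch, not the projector simulation): a regular array with a pitch well above half the wavelength shows strong grating lobes, while a random array of the same average density does not.

```python
# Far-field intensity of a 1D emitter array: regular vs random positions.
import numpy as np

rng = np.random.default_rng(0)
lam = 0.5e-6                  # 500 nm light
k = 2 * np.pi / lam
pitch = 2e-6                  # 2 um pitch, far above lam/2
n = 200

x_reg = np.arange(n) * pitch                # regular grid
x_rnd = rng.uniform(0, n * pitch, n)        # random, same density

theta = np.linspace(-0.5, 0.5, 2001)        # viewing angle (rad)

def intensity(x):
    """Coherent sum of point emitters, normalised to the main lobe."""
    field = np.exp(1j * k * np.outer(np.sin(theta), x)).sum(axis=1)
    return np.abs(field)**2 / n**2

i_reg = intensity(x_reg)   # grating lobe at sin(theta) = lam / pitch
i_rnd = intensity(x_rnd)   # grating lobe replaced by low-level noise
```

The trade-off mentioned above is visible too: the random array replaces the grating lobes with a diffuse background at roughly the 1/n level, i.e. extra noise instead of periodic artefacts.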

Requirements for a hologram can be expressed in terms of noise, contrast, resolution, suppression of undersampling artefacts, etc.

For this project we are building a proof of concept holographic emitter. This set-up will be used to verify simulation results (and also to project some cool holograms of course).

Students can do hands-on lab work (building and testing the prototype projector) and/or work on setting up simulation methods and models. The simulations are highly parallel and are preferably written for parallel/multithreaded and/or GPU computing.

Contact: Martin Fransen

=== KM3NeT: Reconstruction of first neutrino interactions in KM3NeT ===

The neutrino telescope KM3NeT is under construction in the Mediterranean Sea, aiming to detect cosmic neutrinos. Its first strings with sensitive photodetectors have been deployed at both the Italian and the French detector sites. Even these few strings make it possible to reconstruct the abundant muons stemming from interactions of cosmic rays with the atmosphere and to identify neutrino interactions. In order to identify neutrinos, an accurate reconstruction and an optimal understanding of the backgrounds are crucial. In this project we will use the available data together with simulations to optimally identify and reconstruct the first neutrino interactions in the KM3NeT detector (also applying machine learning for background suppression), and with this pave the path towards accurate neutrino oscillation measurements and neutrino astronomy.

Programming skills are essential; mostly ROOT and C++ will be used.

Contact: Ronald Bruijn and Dorothea Samtleben

=== KM3NeT: Acoustic detection of ultra-high energy cosmic-ray neutrinos (2 projects) ===

The study of cosmic neutrinos with energies above 10^17 eV, the so-called ultra-high energy neutrinos, provides a unique view on the universe and may provide insight into the origin of the most violent astrophysical sources, such as gamma-ray bursts, supernovae or even dark matter. In addition, the observation of high energy neutrinos may provide a unique tool to study interactions at high energies. The energy deposition of these extreme neutrinos in water induces a thermo-acoustic signal, which can be detected using sensitive hydrophones. The expected neutrino flux is, however, extremely low, and the signal that neutrinos induce is small. TNO is presently developing sensitive hydrophone technology based on fibre optics. Optical fibres form a natural way to create a distributed sensing system. Using this technology a large-scale neutrino telescope can be built in the deep sea. TNO is aiming for a prototype hydrophone which will form the building block of a future telescope.

The work will be executed at the Nikhef institute and/or the TNO laboratories in Delft. In this project there are two opportunities for master students to participate:
Student project 1: Hardware development on fibre-optic hydrophone technology. Goal: characterise existing prototype optical fibre hydrophones in an anechoic basin at the TNO laboratory, including data collection, calibration, characterisation, and an analysis of the consequences for the design of future acoustic hydrophone neutrino telescopes. Keywords: optical fibre technology, signal processing, electronics, lab work.

Student project 2: Investigation of ultra-high energy neutrinos and their interactions with matter. Goal: simulate the (currently imperfectly modelled) interactions at extremely high energies, and characterise the differences with respect to currently available physics models as well as the impact on the physics reach of future acoustic hydrophone neutrino telescopes. Keywords: Monte Carlo simulations, particle physics, cosmology.

Further information: Info on ultra-high energy neutrinos can be found at: http://arxiv.org/abs/1102.3591; Info on acoustic detection of neutrinos can be found at: http://arxiv.org/abs/1311.7588

Contact: Ernst-Jan Buis and Ivo van Vulpen

=== KM3NeT: Applying state-of-the-art reconstruction software to 10 years of Antares data ===

While the KM3NeT neutrino telescope is being constructed in the deep waters of the Mediterranean Sea, data from its precursor Antares have been accumulated for more than 10 years. The main objective of these neutrino telescopes is to determine the origin of (cosmic) neutrinos. The accuracy with which the origin of neutrinos can be determined depends critically on the probability density function (PDF) of the arrival time of the Cherenkov light produced by relativistic charged particles emerging from a neutrino interaction in the sea. It has been shown that these PDFs can be calculated from first principles and that the obtained values can be efficiently interpolated in 4 and 5 dimensions without compromising the functional dependencies. The reconstruction software based on this input indeed yields the best resolution for KM3NeT. This project is aimed at applying the KM3NeT software to the available Antares data.

Contact: Maarten de Jong

=== HiSPARC: Extensive Air Shower Reconstruction using Machine Learning ===

An important aspect of high energy cosmic ray research is the reconstruction of the direction and energy of the primary cosmic ray. This is done by measuring the footprint of the extensive air shower initiated by the cosmic ray. The goal of this project is to advance the creation of a reconstruction algorithm based on machine learning (ML) techniques.

A previous master student has made great progress in the creation of an ML algorithm for the direction reconstruction. The algorithm was trained on simulations and applied to real data. The method works quite well, but we expect that better results can be achieved by improving the simulated data set. In this project you will implement a more accurate description of the photomultiplier tube in the simulation pipeline and check whether the reconstruction improves. The next step would be to advance the algorithm towards energy reconstruction. This means scaling up the current method and will involve the creation and manipulation of large simulated data sets.
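For orientation, the classical (non-ML) baseline for direction reconstruction can be written in a few lines: under a plane-wave-front assumption the arrival times at the stations are linear in the direction vector, so the direction follows from a least-squares fit. The station layout and angles below are hypothetical.

```python
# Plane-wave-front direction reconstruction from arrival times (toy).
import numpy as np

C = 0.2998  # speed of light in m/ns

# Hypothetical station positions (m) and true shower direction
pos = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
zen, azi = np.radians(25.0), np.radians(40.0)
u = np.sin(zen) * np.array([np.cos(azi), np.sin(azi)])  # in-plane direction

t = pos @ u / C   # noiseless plane-front arrival times (ns)

# Least-squares fit of u from the times, then back to angles
u_fit, *_ = np.linalg.lstsq(pos / C, t, rcond=None)
zen_fit = np.arcsin(np.linalg.norm(u_fit))
azi_fit = np.arctan2(u_fit[1], u_fit[0])
```

An ML regressor would be trained to improve on this baseline when the plane-front and idealised-timing assumptions break down in real data.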

The HiSPARC group is small. As a student you can have a big impact and there is freedom to tailor your own project. The proposed project is for students with a particular interest in computational (astro)physics. Advanced programming skills (mainly Python) and Linux knowledge are desirable.

Contact: Kasper van Dam and Bob van Eijk

=== VU LaserLaB: Measuring the electric dipole moment (EDM) of the electron ===

In collaboration with Nikhef and the Van Swinderen Institute for Particle Physics and Gravity at the University of Groningen, we have recently started an exciting project to measure the electric dipole moment (EDM) of the electron in cold beams of barium fluoride molecules. The eEDM, which is predicted by the Standard Model of particle physics to be extremely small, is a powerful probe of physics beyond this Standard Model. Many extensions of the Standard Model, most prominently supersymmetry, naturally predict an electron EDM just below the current experimental limits. We aim to improve on the best current measurement by at least an order of magnitude. To do so, we will perform a precision measurement on a slow beam of laser-cooled BaF molecules. With this low-energy precision experiment, we test physics at energy scales comparable to those probed at the LHC!

At LaserLaB VU, we are responsible for building and testing a cryogenic source of BaF molecules. The main parts of this source are currently being constructed in the workshop. We are looking for enthusiastic master students to help set up the laser system that will be used to detect BaF. Furthermore, projects are available to perform trajectory simulations in order to design a lens system that guides the BaF molecules from the exit of the cryogenic source to the experiment.

Contact: Rick Bethlem

=== VU LaserLab: Physics beyond the Standard Model from molecules ===

Our team, with a number of staff members (Ubachs, Eikema, Salumbides, Bethlem, Koelemeij), focuses on precision measurements in the hydrogen molecule and its isotopologues. The work aims at testing the QED calculations of energy levels in H2, D2, T2, HD, etc. with the most precise measurements, in which all kinds of experimental laser techniques play a role (cw and pulsed lasers, atomic clocks, frequency combs, molecular beams). Another target of study is the connection to the "proton size puzzle", which may be solved through studies of the hydrogen molecular isotopes.

In the past half year we have produced a number of important results that are described in the following papers:

* Frequency comb (Ramsey-type) electronic excitations in the H2 molecule: "Deep-ultraviolet frequency metrology of H2 for tests of molecular quantum theory", http://www.nat.vu.nl/~wimu/Publications/Altmann-PRL-2018.pdf

* Precision measurement of an infrared transition in the HD molecule: "Sub-Doppler frequency metrology in HD for tests of fundamental physics", https://arxiv.org/abs/1712.08438

* The first precision study in molecular tritium T2: "Relativistic and QED effects in the fundamental vibration of T2", http://arxiv.org/abs/1803.03161

* Dissociation energy of the hydrogen molecule at 10^-9 accuracy (paper submitted to Phys. Rev. Lett.)

* Probing QED and fundamental constants through laser spectroscopy of vibrational transitions in HD+

This is also a study of the hydrogen molecular ion HD+, where important results were obtained not so long ago, and where we have a strong activity: http://www.nat.vu.nl/~wimu/Publications/ncomms10385.pdf

These five results mark the various directions we are pursuing, and in all of them we aim at further improvements. Specific projects with students can be defined; these are mostly experimental, although there may also be some theoretical tasks, such as:

* Performing calculations of hyperfine structures

As for theory, there might also be an international connection for specifically bright theory students: we collaborate closely with Prof. Krzysztof Pachucki, and we might find an opportunity for a student to perform (the best!) QED calculations in molecules, working in Warsaw and partly in Amsterdam. Prof. Frederic Merkt from ETH Zurich, an expert in the field, will come to work with us on "hydrogen" during August-December 2018 while on sabbatical.

Contact: Wim Ubachs, Kjeld Eikema, Rick Bethlem



== 2018: ==

=== Theory: Stress-testing the Standard Model at the high-energy frontier ===

A suitable framework to parametrise in a model-independent way deviations from the SM induced by new heavy particles is the Standard Model Effective Field Theory (SMEFT). In this formalism, beyond-the-SM (bSM) effects are encapsulated in higher-dimensional operators constructed from SM fields respecting their symmetry properties. Here we aim to perform a global analysis of the SMEFT from high-precision LHC data. This will be achieved by extending the NNPDF fitting framework to constrain the SMEFT coefficients, with the ultimate aim of identifying possible bSM signals.
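The structure of such a fit can be sketched in a few lines (a toy with invented numbers, not the NNPDF machinery): to linear order each observable shifts as O_i = O_i^SM (1 + sum_j K_ij c_j), and the Wilson coefficients c_j follow from a chi^2 minimisation.

```python
# Toy linear SMEFT fit by chi^2 minimisation (illustrative numbers).
import numpy as np

rng = np.random.default_rng(1)
K = rng.normal(size=(20, 2)) * 0.05   # assumed linear EFT sensitivities
c_true = np.array([0.8, -0.3])        # "true" Wilson coefficients
sigma = 0.02                          # uniform measurement uncertainty

# Pseudo-measurements of the relative shifts O/O_SM - 1
data = K @ c_true + rng.normal(0.0, sigma, 20)

# For a linear model, the chi^2 minimum is a weighted least-squares fit
w = 1.0 / sigma**2
cov = np.linalg.inv(K.T @ K * w)      # covariance of the fitted c_j
c_fit = cov @ (K.T @ data * w)
errs = np.sqrt(np.diag(cov))
```

A real global analysis involves many operators, correlated uncertainties and quadratic EFT terms, which is what makes a dedicated fitting framework necessary.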

Contact: Juan Rojo

=== Theory: The quark and gluon internal structure of heavy nuclei in the LHC era ===

A precise knowledge of the parton distribution functions (PDFs) of the proton is essential in order to make predictions for the Standard Model and beyond at hadron colliders. The presence of the nuclear medium and of collective phenomena involving several nucleons modifies the parton distribution functions of nuclei (nPDFs) compared to those of a free nucleon. These modifications have been investigated by different groups using global analyses of world data on high energy nuclear reactions. It is important to determine the nPDFs not only for establishing perturbative QCD factorisation in nuclei but also for applications to heavy-ion physics and neutrino physics. In this project the student will join an ongoing effort towards the determination of a data-driven model of nPDFs, and will learn how to construct tailored Artificial Neural Networks (ANNs).

"Further information [here]

Contact: Juan Rojo

=== Theory: Combined QCD analysis of parton distribution and fragmentation functions ===

The formation of hadrons from quarks and gluons (collectively, partons) is a fundamental QCD process that has yet to be fully understood. Since parton-to-hadron fragmentation occurs over long-distance scales, such information can only be extracted from experimental observables that identify mesons and baryons in the final state. Recent progress has been made in determining these fragmentation functions (FFs) from charged pion and kaon production in single-inclusive e+e− annihilation (SIA) and, additionally, in pp collisions and semi-inclusive deep inelastic scattering (SIDIS). However, charged hadron production in unpolarized pp collisions and inelastic lepton-proton scattering also requires information about the momentum distributions of the quarks and gluons in the proton, which is encoded in non-perturbative parton distribution functions (PDFs). In this project, a simultaneous treatment of both PDFs and FFs in a global QCD analysis of single-inclusive hadron production processes will be performed to determine the individual parton-to-hadron FFs. Furthermore, a robust statistical methodology based on an artificial neural network learning algorithm will be used to obtain a precise estimation of the FF uncertainties. This work will emphasise in particular the impact of pp-collision and SIDIS data on the gluon and separated quark/anti-quark FFs, respectively.

"Further information [here]

Contact: Juan Rojo


=== ALICE: Charm is in the Quark Gluon Plasma ===

The goal of heavy-ion physics is to study the Quark Gluon Plasma (QGP), a hot and dense medium in which quarks and gluons move freely over large distances, larger than the typical size of a hadron. Hydrodynamic simulations predict that the QGP expands under its own pressure and cools while expanding. These simulations are particularly successful in describing some of the key observables measured experimentally, such as particle spectra and various orders of flow harmonics. Charm quarks are produced very early in the evolution of a heavy-ion collision and can thus serve as an ideal probe of the properties of the QGP. The goal of the project is to study, for charm mesons such as the D0, D* and Ds, the higher-order flow harmonics (e.g. triangular flow, v3) that are more sensitive to the transport properties of the QGP. This would be the first measurement of its kind.
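The principle of a flow-harmonic measurement can be sketched with a toy Monte Carlo (ours, not ALICE code): inject a triangular modulation into the azimuthal angles and recover v3 from two-particle correlations, using &lt;cos 3(phi_i - phi_j)&gt; = v3^2.

```python
# Toy v3 measurement from two-particle azimuthal correlations.
import numpy as np

rng = np.random.default_rng(2)
v3, mult, nev = 0.1, 500, 200   # injected flow, multiplicity, events

def sample_event():
    psi = rng.uniform(0, 2 * np.pi)   # random event plane per event
    # rejection-sample dN/dphi ~ 1 + 2 v3 cos(3 (phi - psi))
    phi = rng.uniform(0, 2 * np.pi, 4 * mult)
    pdf = 1 + 2 * v3 * np.cos(3 * (phi - psi))
    keep = rng.uniform(0, 1 + 2 * v3, phi.size) < pdf
    return phi[keep][:mult]

c2 = []
for _ in range(nev):
    phi = sample_event()
    q = np.exp(3j * phi).sum()        # flow Q-vector for n = 3
    m = phi.size
    c2.append((np.abs(q)**2 - m) / (m * (m - 1)))  # <cos 3 dphi>

v3_est = np.sqrt(np.mean(c2))
```

In the real measurement the charm mesons are rare and must first be extracted via invariant-mass fits, which is part of what makes this analysis challenging.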

Contact: Panos Christakoglou and Paul Kuijer

=== ALICE: Probing the time evolution of particle production in the Quark-Gluon Plasma ===

Particle production is governed by conservation laws, such as local charge conservation. The latter ensures that each charged particle is balanced by an oppositely-charged partner, created at the same location in space and time. Charge-dependent angular correlations, traditionally studied with the balance function, have emerged as a powerful tool to probe the properties of the Quark-Gluon Plasma (QGP) created in high energy collisions. The goal of this project is to take full advantage of the particle identification capabilities of the ALICE detector, unique among the LHC experiments, to extend these studies to different particle species (e.g. pions, kaons, protons). These studies are highly anticipated by both the experimental and theoretical communities.
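Schematically, the balance function counts opposite-sign minus same-sign pairs per trigger charge as a function of relative angle. A toy version (our own construction, on generated events, not the ALICE analysis) looks as follows:

```python
# Toy balance function in relative azimuthal angle.
import numpy as np

rng = np.random.default_rng(4)
nev, npairs = 50, 200              # events, balanced pairs per event
bins = np.linspace(0, np.pi, 16)
h = {key: np.zeros(15) for key in ("pm", "mp", "pp", "mm")}

def pair_hist(pa, pb, same_sign):
    """Histogram of folded dphi for all ordered pairs (a, b)."""
    d = np.abs(pa[:, None] - pb[None, :]).ravel()
    d = np.minimum(d, 2 * np.pi - d)          # fold dphi into [0, pi]
    hh, _ = np.histogram(d, bins=bins)
    hh = hh.astype(float)
    if same_sign:
        hh[0] -= pa.size                      # drop trivial self-pairs
    return hh

for _ in range(nev):
    # each positive charge balanced by a negative partner nearby in phi
    phi_p = rng.uniform(0, 2 * np.pi, npairs)
    phi_m = (phi_p + rng.normal(0, 0.3, npairs)) % (2 * np.pi)
    h["pm"] += pair_hist(phi_p, phi_m, False)
    h["mp"] += pair_hist(phi_m, phi_p, False)
    h["pp"] += pair_hist(phi_p, phi_p, True)
    h["mm"] += pair_hist(phi_m, phi_m, True)

n_tot = nev * npairs
b = 0.5 * ((h["pm"] - h["mm"]) / n_tot + (h["mp"] - h["pp"]) / n_tot)
# b peaks at small dphi and integrates to ~1 when every charge balances
```

The width of the peak carries the physics: late-stage charge creation gives narrower balance functions, which is why the observable probes the time evolution of particle production.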

Contact: Panos Christakoglou

=== ALICE: CP violating effects in QCD: looking for the chiral magnetic effect with ALICE at the LHC ===

Within the Standard Model, symmetries, such as the combination of charge conjugation (C) and parity (P), known as CP-symmetry, are considered to be key principles of particle physics. The violation of CP-invariance can be accommodated within the Standard Model in both the weak and the strong interactions, but it has only been confirmed experimentally in the former. Theory predicts that in heavy-ion collisions gluonic fields create domains where parity symmetry is locally violated. This manifests itself as a charge-dependent asymmetry in the production of particles relative to the reaction plane, called the Chiral Magnetic Effect (CME). The first experimental results from STAR (RHIC) and ALICE (LHC) are consistent with the expectations from the CME, but background effects have not yet been properly disentangled. In this project you will develop and test new observables of the CME, trying to understand and discriminate the background sources that affect such a measurement.

Contact: Panos Christakoglou

=== LHCb: Searching for dark matter in exotic six-quark particles ===

Most of the matter in the Universe is of unknown type. Many hypotheses about this dark matter have been proposed, but none confirmed. Recently it has been proposed that it could consist of particles made of the six quarks uuddss. Such a particle could be produced in decays of heavy baryons. It is proposed to use Xi_b baryons produced at LHCb to search for such a state, which would appear as missing 4-momentum in a kinematically constrained decay. The project consists of optimising a selection and applying it to LHCb data. See arXiv:1708.08951.

Contact: Patrick Koppenburg


=== LHCb: Measurement of BR(B0 → Ds+ Ds-) ===

This project aims to measure the branching fraction of the decay B0 -> Ds+ Ds-. This decay is rare because it proceeds via Cabibbo-suppressed W-exchange between the b and the d quark of the B0 meson, and it has not yet been observed: theoretical calculations predict a branching fraction of order 10^-5, while the best experimental upper limit is 3.6x10^-5. A measurement of the decay rate of B0 -> Ds+ Ds- relative to that of B0 -> D+ D- can provide an estimate of the W-exchange contribution to the latter decay, a crucial piece of information for extracting the CKM angle gamma from B0 -> D(*)D(*) decays. The aim is to determine the branching fraction of B0 -> Ds+ Ds- relative to that of B0 -> Ds+ D- (currently the best-known branching ratio, (7.2 +- 0.8)x10^-3), in close collaboration with a PhD student. The goal is for this project to result in a journal publication on behalf of the LHCb collaboration. Computer skills are needed: the ROOT program and C++ and/or Python macros are used. The project is closely related to previous analyses in the group, and weekly video meetings with CERN coordinate the efforts within the LHCb collaboration.

Relevant information:

[1] M. Jung and S. Schacht, "Standard Model Predictions and New Physics Sensitivity in B -> DD Decays", https://arxiv.org/pdf/1410.8396.pdf

[2] L. Bel, K. de Bruyn, R. Fleischer, M. Mulder, N. Tuning, "Anatomy of B -> DD Decays", https://arxiv.org/pdf/1505.01361.pdf

[3] A. Zupanc et al. [Belle Collaboration], "Improved measurement of B0 -> DsD+ and search for B0 -> Ds+Ds at Belle", https://arxiv.org/pdf/hep-ex/0703040.pdf

[4] B. Aubert et al. [BaBar Collaboration], "Search for the W-exchange decays B0 -> DD+", https://arxiv.org/pdf/hep-ex/0510051.pdf

[5] R. Aaij et al. [LHCb Collaboration], "First observations of B0s -> D+D, Ds+D and D0D0 decays", https://arxiv.org/pdf/1302.5854.pdf
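The last step of such an analysis is simple enough to sketch (with invented placeholder numbers, not real yields): the relative branching fraction follows from the two signal yields and the efficiency ratio, with Gaussian error propagation.

```python
# Relative branching fraction with simple error propagation (toy numbers).
import numpy as np

n_sig, dn_sig = 40.0, 12.0       # hypothetical B0 -> Ds+ Ds- yield
n_ref, dn_ref = 9000.0, 120.0    # hypothetical B0 -> Ds+ D- yield
eff_ratio, deff = 0.92, 0.03     # efficiency(ref) / efficiency(sig)

r = (n_sig / n_ref) * eff_ratio  # BR(sig) / BR(ref)
dr = r * np.sqrt((dn_sig / n_sig)**2 + (dn_ref / n_ref)**2
                 + (deff / eff_ratio)**2)

# Multiplying by the measured BR(B0 -> Ds+ D-) gives the absolute BR
br_ref, dbr_ref = 7.2e-3, 0.8e-3
br = r * br_ref
dbr = br * np.sqrt((dr / r)**2 + (dbr_ref / br_ref)**2)
```

In practice the statistical treatment is more careful (likelihood fits, asymmetric errors for a small signal yield), but the structure is the same.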

Contact: Niels Tuning, Michele Veronesi (PhD), Sevda Esen (postdoc)

=== LHCb: Measurement of the relative ratio of B+ → D0D+ and B+ → D0Ds decays ===

This decay is closely related to B0 -> Ds+ Ds- (see above), and close collaboration between the two master projects is foreseen. The decay mode B+ -> D0D+ is expected to be dominated by tree diagrams with some additional contributions from penguin diagrams. Assuming SU(3) symmetry, a measurement of its branching fraction relative to the Cabibbo-favored B+ -> D0D will enable a better understanding of penguin contributions to the CP-violating mixing phase.

Relevant information:

[1] L. Bel, K. de Bruyn, R. Fleischer, M. Mulder, N. Tuning, "Anatomy of B -> DD Decays", https://arxiv.org/pdf/1505.01361.pdf

[2] R. Aaij et al. [LHCb Collaboration], "First observations of B0s -> D+D, Ds+D and D0D0 decays", https://arxiv.org/pdf/1302.5854.pdf

[3] PDG: http://pdglive.lbl.gov/BranchingRatio.action?desig=261&parCode=S041

Contact: Niels Tuning, Michele Veronesi (PhD), Sevda Esen (postdoc)


=== Virgo: Fast determination of gravitational wave properties ===

In the era of multi-messenger astronomy, the development of fast, accurate and computationally cheap methods for inferring the properties of gravitational wave signals is of paramount importance. In this project we will work on the development of rapid Bayesian parameter estimation methods for binary neutron stars as well as precessing binary black holes. Bayesian parameter estimation requires the evaluation of a likelihood that describes the probability of obtaining the data for a given set of model parameters, here the parameters of the gravitational wave signal. Bayesian inference for gravitational wave parameter estimation may require millions of these evaluations, making it computationally costly. This work will combine the benefits of machine learning/deep learning methods and order-reduction methods of gravitational wave source modelling to speed up Bayesian inference of gravitational waves.
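The structure of the problem can be shown with a much simpler toy (our own, one parameter, white noise): every point of the posterior costs one likelihood evaluation, which is exactly the cost the project tries to reduce.

```python
# Toy Bayesian parameter estimation on a grid: posterior of a signal
# amplitude in white Gaussian noise (illustrative only).
import numpy as np

rng = np.random.default_rng(3)
t = np.linspace(0.0, 1.0, 200)
template = np.sin(2 * np.pi * 30 * t)   # known waveform shape
a_true, sigma = 2.0, 1.0
data = a_true * template + rng.normal(0.0, sigma, t.size)

a_grid = np.linspace(0.0, 4.0, 401)     # flat prior over the grid
loglike = np.array([-0.5 * np.sum((data - a * template)**2) / sigma**2
                    for a in a_grid])   # one evaluation per grid point

post = np.exp(loglike - loglike.max())
da = a_grid[1] - a_grid[0]
post /= post.sum() * da                 # normalise the posterior

a_map = a_grid[np.argmax(post)]         # maximum a posteriori estimate
```

A compact-binary gravitational wave signal has of order 15 parameters, and each likelihood evaluation involves a waveform model, which is why reduced-order modelling and ML acceleration matter.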

Contact: Sarah Caudill

=== Virgo: Simulations of Binary Neutron Star Mergers and applications for multimessenger astronomy ===

With the detection of the binary neutron star merger in August 2017 (GW170817), a new era of multi-messenger astronomy started. GW170817 proved that neutron star mergers are ideal laboratories to constrain the equation of state of cold supranuclear matter, to study the central engines of short GRBs, and to understand the origin and production of heavy elements. The fundamental tool for understanding the last stages of the binary dynamics is numerical relativity simulations. In this project the student will be introduced to the basics of numerical relativity simulations of binary neutron star mergers and will be able to perform simulations on their own. Based on these simulations and this first experience, it will be possible to focus on one of the following aspects:

- the estimation of the material ejected in the merger and the development of models for the electromagnetic signals

- further improvement of gravitational waveform models including numerical relativity information

- further improvement of the construction of the initial conditions of binary neutron star simulations

- code improvements of the evolution code, incorporating additional microphysical aspects such as magnetic fields, tabulated equations of state, or neutrino leakage schemes.

- studying the merger properties of neutron stars with exotic objects such as boson or axion stars.

Contact: Tim Dietrich

Virgo: Measuring cosmological parameters from gravitational-wave observations of compact binaries

Gravitational-wave observation of the binary neutron star merger GW170817, with its coincident optical counterpart, led to a first "standard siren" measurement of the Hubble parameter, independent of the cosmological distance ladder. While multiple similar observations are expected to improve the precision of the measurement, a statistical method that cross-correlates gravitational-wave distance estimates with galaxy catalogues is expected to work even without identified electromagnetic transients, in particular for binary black hole mergers. The project would primarily be a study of the various systematic effects in this analysis and of ways to correct for them. The work will involve the use of computational techniques to analyze LIGO-Virgo data. Some prior experience of programming is expected.
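The idea behind the standard-siren measurement can be reduced to one line: at low redshift the Hubble law gives d_L ≈ cz/H0, so a gravitational-wave distance combined with an electromagnetic redshift yields H0. A minimal sketch (the numbers are merely at the GW170817 scale, not the published measurement):

```python
C_KM_S = 299_792.458        # speed of light in km/s

def hubble_constant(redshift, distance_mpc):
    """H0 in km/s/Mpc from the low-redshift Hubble law d_L = c z / H0
    (ignores peculiar velocities and higher-order cosmology)."""
    return C_KM_S * redshift / distance_mpc

# illustrative numbers roughly at the GW170817 scale
h0 = hubble_constant(redshift=0.0098, distance_mpc=40.0)
```

The statistical method replaces the single known redshift by a sum over galaxy-catalogue redshifts weighted by the gravitational-wave distance posterior.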

Contact: Archisman Ghosh and Chris Van Den Broeck

Detector R&D: Spectral X-ray imaging - Looking at colours the eyes can't see

When a conventional X-ray image is made to analyse the composition of a sample, or to perform a medical examination on a patient, one acquires an image that only shows intensities: a ‘black and white’ image. Most of the information carried by the photon energy is lost. The lack of spectral information can result in an ambiguity between the material composition and the amount of material in the sample. If the X-ray intensity can be measured as a function of energy (i.e. a ‘colour’ X-ray image), more information can be obtained from a sample. This translates into a lower required dose and/or a better understanding of the sample under investigation. Two fields that can benefit from spectral X-ray imaging are, for example, mammography and real-time CT.
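The composition/thickness ambiguity can be made concrete with Beer-Lambert attenuation in two energy bins: two measurements give two equations for two material thicknesses. A minimal sketch with invented attenuation coefficients:

```python
import math

# attenuation coefficients in 1/cm per energy bin (illustrative numbers)
MU = {"bone":   {"low": 0.60, "high": 0.30},
      "tissue": {"low": 0.25, "high": 0.20}}

def transmissions(t_bone, t_tissue):
    """Beer-Lambert transmission I/I0 in each energy bin."""
    return {b: math.exp(-(MU["bone"][b] * t_bone + MU["tissue"][b] * t_tissue))
            for b in ("low", "high")}

def decompose(trans):
    """Invert the 2x2 linear system mu . t = -log(I/I0) for the thicknesses."""
    y_low, y_high = -math.log(trans["low"]), -math.log(trans["high"])
    a, b = MU["bone"]["low"], MU["tissue"]["low"]
    c, d = MU["bone"]["high"], MU["tissue"]["high"]
    det = a * d - b * c
    return ((y_low * d - y_high * b) / det,    # bone thickness (cm)
            (y_high * a - y_low * c) / det)    # tissue thickness (cm)

t_bone, t_tissue = decompose(transmissions(1.0, 2.0))
```

A single intensity measurement could not distinguish 1 cm of bone plus 2 cm of tissue from other combinations with the same total attenuation; the second energy bin breaks the degeneracy.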

X-ray detectors based on Medipix/Timepix pixel chips have spectral resolving capabilities and can be used to make polychromatic X-ray images. Medipix and Timepix chips have branched from pixel chips developed for detectors for high energy physics collider experiments.

Activities in the field of (spectral) CT scans are performed in a collaboration between two institutes (Nikhef and CWI) and two companies (ASI and XRE).

Some activities that students can work on:

- Medical X-ray imaging (CT and ‘flat’ X-ray images): Detection of iodine contrast agent. Detection of calcifications (hint for a tumour).

- Material research: Using spectral information to identify materials and recognise compounds.

- Determine how much existing applications can benefit from spectral X-ray imaging and look for potential new applications.

- Characterise, calibrate, optimise X-ray imaging detector systems.

Contact: Martin Fransen

Detector R&D: Compton camera

In the Nikhef R&D group we develop instrumentation for particle physics, but we also investigate how particle physics detectors can be used for other purposes. A successful development is the Medipix chip, which can be used in X-ray imaging. For use in large-scale medical applications, however, Compton scattering limits the energy-resolving possibilities. You will investigate whether it is in principle possible to design an X-ray application that detects both the Compton-scattered electron and the absorbed photon. Your ideas can be tested in practice in the lab, where an X-ray scan can be performed.

Contact: Martin Fransen

Detector R&D: Holographic projector

A difficulty in generating holograms (based on the interference of light) is the required dense pixel pitch: one would need a pixel pitch of less than 200 nanometres. With larger pixels, artefacts occur due to spatial undersampling. A pixel pitch of 200 nanometres is difficult, if not impossible, to achieve, especially over larger areas. Another challenge is the massive amount of computing power that would be required to control such a dense pixel matrix.

A new holographic projection method has been developed that reduces undersampling artefacts for projectors with a ‘low’ pixel density. It uses 'pixels' at random but known positions, resulting in an array of (coherent) light points that lacks (or strongly suppresses) spatial periodicity. As a result, a holographic projector can be built with a significantly lower pixel density and correspondingly less computing power. This could bring holography within reach of many applications such as displays, lithography, 3D printing, and metrology.

Of course, nothing comes for free: with fewer pixels, holograms become noisier and the contrast is reduced. The big question: how do we derive the requirements for the holographic projector (in terms of pixel density, pixel positioning, etc.) from the requirements on the holograms? Requirements for a hologram can be expressed in terms of noise, contrast, resolution, suppression of undersampling artefacts, etc.
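The trade-off can be explored numerically: phase a set of coherent emitters at random but known positions so that their waves add up at a target point, then compare the focus intensity with the noise floor elsewhere. A crude sketch (all dimensions and emitter counts are illustrative):

```python
import math, cmath, random

random.seed(1)
WAVELENGTH = 0.5e-6                       # 500 nm, illustrative
K = 2.0 * math.pi / WAVELENGTH
N_EMITTERS = 200

# 'pixels' at random but known positions in a 1 mm x 1 mm plane at z = 0
emitters = [(random.uniform(0.0, 1e-3), random.uniform(0.0, 1e-3), 0.0)
            for _ in range(N_EMITTERS)]
target = (5e-4, 5e-4, 0.1)                # focus point 10 cm away

def dist(p, q):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

# set each emitter's phase so its wave arrives in phase at the target
phases = [K * dist(e, target) for e in emitters]

def intensity(point):
    field = sum(cmath.exp(1j * (ph - K * dist(e, point))) / dist(e, point)
                for e, ph in zip(emitters, phases))
    return abs(field) ** 2

on_target = intensity(target)
off_target = intensity((2e-4, 8e-4, 0.1))   # a point away from the focus
```

With N emitters the focus intensity scales like N², while the speckle-like background scales like N, which is exactly the noise-versus-pixel-count trade-off mentioned above.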

For this project we are building a proof of concept holographic emitter. This set-up will be used to verify simulation results (and also to project some cool holograms of course).

Students can do hands-on lab work (building and testing the prototype projector) and/or work on setting up simulation methods and models. Simulations in this field can be highly parallelised and are preferably written for parallel and/or GPU computing.


Contact: Martin Fransen

Detector R&D: Laser Interferometer Space Antenna (LISA)

The space-based gravitational wave antenna LISA is without doubt one of the most challenging space missions ever proposed. Around 2030, ESA plans to launch three spacecraft, separated by a few million kilometres, to measure tiny variations in the distances between test masses located in each spacecraft, in order to detect gravitational waves from sources such as supermassive black holes. The triangular constellation of the LISA mission is dynamic, requiring constant fine-tuning of the pointing of the laser links between the spacecraft and a simultaneous refocusing of the telescope. The noise sources related to the laser links are expected to provide a dominant contribution to the LISA performance.

An update and extension of the LISA science simulation software is needed to assess the hardware development for LISA at Nikhef, TNO and SRON. A position is therefore available for a master student to study the impact of instrumental noise on the performance of LISA. Realistic simulations based on hardware (noise) characterization measurements that were done at TNO will be carried out and compared to the expected tantalizing gravitational wave sources.

Key words: LISA, space, gravitational waves, simulations, signal processing

Contact: Niels van Bakel, Ernst-Jan Buis

KM3NeT : Reconstruction of first neutrino interactions in KM3NeT

The neutrino telescope KM3NeT is under construction in the Mediterranean Sea, aiming to detect cosmic neutrinos. Its first two strings with sensitive photodetectors were deployed in 2015 and 2016. Even these few strings already make it possible to reconstruct the abundant muons stemming from interactions of cosmic rays with the atmosphere and to identify neutrino interactions. In order to identify neutrinos, an accurate reconstruction and an optimal understanding of the backgrounds are crucial. In this project we will use the available data to identify and reconstruct the first neutrino interactions in the KM3NeT detector, and with this pave the way towards neutrino astronomy.

Programming skills are essential; mostly ROOT and C++ will be used.

Contact: Ronald Bruijn

ANTARES: Analysis of IceCube neutrino sources.

The only evidence so far for high-energy neutrinos from cosmic sources comes from detections with the IceCube detector. Most of the detected events were reconstructed with a large uncertainty on their direction, which has prevented an association with astrophysical sources. Only for the high-energy muon-neutrino candidates has a good directional resolution been achieved, but for those too no significant correlation with astrophysical sources has been detected to date. The ANTARES neutrino telescope has continuously taken neutrino data with high angular resolution since 2007, which can be exploited to further scrutinize the locations of these neutrino source candidates. In this project we will address the neutrino sources in a stacked analysis to probe the origin of the neutrinos with enhanced sensitivity.

Programming skills are essential; mainly C++ and ROOT will be used.

Contact: Dorothea Samtleben

VU LaserLaB: Measuring the electric dipole moment (EDM) of the electron

In collaboration with Nikhef and the Van Swinderen Institute for Particle Physics and Gravity at the University of Groningen, we have recently started an exciting project to measure the electric dipole moment (EDM) of the electron in cold beams of barium fluoride molecules. The eEDM, which is predicted by the Standard Model of particle physics to be extremely small, is a powerful probe of physics beyond this Standard Model. Many extensions of the Standard Model, most prominently supersymmetry, naturally predict an electron EDM just below the current experimental limits. We aim to improve on the best current measurement by at least an order of magnitude. To do so we will perform a precision measurement on a slow beam of laser-cooled BaF molecules. With this low-energy precision experiment, we test physics at energies comparable to those probed at the LHC!

At LaserLaB VU, we are responsible for building and testing a cryogenic source of BaF molecules. The main parts of this source are currently being constructed in the workshop. We are looking for enthusiastic master students to help set up the laser system that will be used to detect BaF. Furthermore, projects are available to perform trajectory simulations to design a lens system that guides the BaF molecules from the exit of the cryogenic source to the experiment.

Contact: Rick Bethlem


VU LaserLab: Physics beyond the Standard model from molecules

Our team, with a number of staff members (Ubachs, Eikema, Salumbides, Bethlem, Koelemeij), focuses on precision measurements in the hydrogen molecule and its isotopologues. The work aims at testing the QED calculations of energy levels in H2, D2, T2, HD, etc. with the most precise measurements, where all kinds of experimental laser techniques play a role (cw and pulsed lasers, atomic clocks, frequency combs, molecular beams). Another target of study is the connection to the "proton size puzzle", which may be solved through studies of the hydrogen molecular isotopologues.

In the past half year we have produced a number of important results that are described in the following papers:

  • Frequency comb (Ramsey type) electronic excitations in the H2 molecule:

see: Deep-ultraviolet frequency metrology of H2 for tests of molecular quantum theory http://www.nat.vu.nl/~wimu/Publications/Altmann-PRL-2018.pdf

  • Precision measurement of an infrared transition in the HD molecule

see: Sub-Doppler frequency metrology in HD for tests of fundamental physics: https://arxiv.org/abs/1712.08438

  • The first precision study in molecular tritium T2

see: Relativistic and QED effects in the fundamental vibration of T2: http://arxiv.org/abs/1803.03161

  • Dissociation energy of the hydrogen molecule at 10^-9 accuracy (paper submitted to Phys. Rev. Lett.)
  • Probing QED and fundamental constants through laser spectroscopy of vibrational transitions in HD+

This is also a study of the hydrogen molecular ion HD+, where important results were obtained not so long ago, and where we have a strong activity: http://www.nat.vu.nl/~wimu/Publications/ncomms10385.pdf

These five results mark the various directions we are pursuing, and in all directions we aim at obtaining improvements. Specific projects with students can be defined; those are mostly experimental, although there might be some theoretical tasks, like:

  • Performing calculations of hyperfine structures

As for the theory, there might also be an international connection for specifically bright theory students: we collaborate closely with Prof. Krzysztof Pachucki, and we might find an opportunity for a student to perform (the best!) QED calculations in molecules, working in Warsaw and partly in Amsterdam. Prof. Frederic Merkt from ETH Zurich, an expert in the field, will come to work with us on "hydrogen" during August-December 2018 while on sabbatical.

Contact: Wim Ubachs Kjeld Eikema Rick Bethlem



2017:

The Modulation Experiment: Data Analysis

There exist a few measurements that suggest an annual modulation in the activity of radioactive sources. Together with a few groups from the XENON collaboration we have developed four table-top experiments to investigate this effect on a few well-known radioactive sources. The experiments are under construction at Purdue University (USA), on a mountain top in Switzerland, on a beach in Rio de Janeiro, and at Nikhef in Amsterdam. We urgently need a master student to (1) analyze the first big data set, and (2) contribute to the first physics paper from the experiment. We are looking for an all-round physicist with an interest in both lab work and data analysis. The student will directly collaborate with the other groups in this small collaboration (around 10 people), and the goal is to have the first physics publication ready by the end of the project.
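The core analysis task can be sketched in a few lines: after dividing out the exponential decay law, a small annual modulation can be extracted by projecting the residuals onto sine and cosine at the annual frequency. All numbers below are invented for illustration:

```python
import math

TAU = 10_000.0        # mean lifetime in days (invented)
A_TRUE = 0.002        # 0.2% annual modulation amplitude (invented)
PERIOD = 365.25       # days

def rate(t):
    """Decay rate: exponential decay times a small annual modulation."""
    return math.exp(-t / TAU) * (1.0 + A_TRUE * math.cos(2.0 * math.pi * t / PERIOD))

days = range(731)     # two years of daily measurements
residual = [rate(t) / math.exp(-t / TAU) - 1.0 for t in days]

# project the residuals onto cos/sin at the annual frequency
n = len(residual)
c = sum(r * math.cos(2.0 * math.pi * t / PERIOD) for t, r in zip(days, residual))
s = sum(r * math.sin(2.0 * math.pi * t / PERIOD) for t, r in zip(days, residual))
a_fit = 2.0 * math.hypot(c, s) / n      # recovered amplitude, close to A_TRUE
```

In the real analysis the lifetime and amplitude would be fitted simultaneously to noisy data, with the statistical significance of the modulation as the key result.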

Contact: Auke Colijn

The XENON Dark Matter Experiment: Data Analysis

The XENON collaboration has started operating the XENON1T detector, the world’s most sensitive direct-detection dark matter experiment. The Nikhef group is playing an important role in this experiment. The detector operates at the Gran Sasso underground laboratory and consists of a so-called dual-phase xenon time-projection chamber filled with 3200 kg of ultra-pure xenon. Our group has an opening for a motivated MSc student to do data analysis on this new detector. The work will consist of understanding the signals that come out of the detector, with a particular focus on so-called double-scatter events. We are interested in developing methods to better interpret the response of the detector, and are developing sophisticated statistical tools to do this. This work will include looking at data and developing new algorithms in our Python-based analysis tool. There will also be opportunities to do data-taking shifts at the Gran Sasso underground laboratory in Italy.

Contact: Patrick Decowski

XAMS Dark Matter R&D Setup

The Amsterdam Dark Matter group has built an R&D xenon detector at Nikhef. The detector is a dual-phase xenon time-projection chamber and contains about 4 kg of ultra-pure liquid xenon. We plan to use this detector to develop new detection techniques (such as utilizing new photosensors) and to improve the understanding of the response of liquid xenon to various forms of radiation. The results could be used directly in the XENON experiment, the world’s most sensitive direct-detection dark matter experiment, at the Gran Sasso underground laboratory. We have several interesting projects for this facility. We are looking for someone who is interested in working in a laboratory on high-tech equipment, modifying the detector, taking data and analyzing the data him/herself. You will "own" this experiment.

Contact: Patrick Decowski

LHCb: A Scintillating-Fibre Tracker

The LHCb collaboration is upgrading its present tracking system by constructing a new tracker based on scintillating fibres combined with silicon photomultipliers (SiPMs): the SciFi Tracker! Nikhef plays a key role in the project, as we will build the SciFi fibre modules, the cold-box enclosure housing the SiPMs, and a large part of the on-detector electronics. In all these areas interesting test hardware and software has to be realized, and several research topics for a Master project are available, bringing the student into contact with state-of-the-art particle detectors in a large team of physicists and engineers. Possible collaborations with the Nikhef R&D group can also be envisaged.

Contact: Antonio Pellegrino

LHCb: Discovery of the Decay Lb --> p Ds+

This project aims to measure the branching fraction of the decay Lb->p Ds+ (bud -> uud + ds). The decay Lb->p Ds+ is quite rare, because it occurs through the transition of a b-quark to a u-quark. It has not been measured yet (although some LHCb colleagues claim to have seen it). This decay is interesting, because

1) It is sensitive to the b->u coupling (CKM element Vub), whose determination is heavily debated. 2) It can quantify non-factorisable QCD effects in b-baryon decays.

The decay is closely related to B0->pi-Ds+, which proceeds through a similar Feynman diagram; also, the final state of B0->pi-Ds+ is almost identical to that of Lb->p Ds+. The aim is to determine the branching fraction of Lb->p Ds+ relative to that of B0->D+pi- decays, in close collaboration with the PhD student (who will study BR(B0->pi-Ds+)/BR(B0->D+pi-)). This project will result in a journal publication on behalf of the LHCb collaboration, written by you. For this project computer skills are needed; the ROOT program and C++ and/or Python macros are used. This project is closely related to previous analyses in the group. Weekly video meetings with CERN coordinate the efforts within the LHCb collaboration. Relevant information:

[1] R. Aaij et al. [LHCb Collaboration], "Determination of the branching fractions of B0s->DsK and B0->DsK", JHEP 05 (2015) 019 [arXiv:1412.7654 [hep-ex]]. [2] R. Fleischer, N. Serra and N. Tuning, "Tests of Factorization and SU(3) Relations in B Decays into Heavy-Light Final States", Phys. Rev. D 83, 014017 (2011) [arXiv:1012.2784 [hep-ph]].
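The structure of such a relative measurement can be sketched as follows: normalising the signal yield to a well-known reference channel cancels the luminosity and most detector effects. The function and all numbers below are hypothetical placeholders, not LHCb inputs:

```python
def relative_bf(n_sig, n_ref, eff_sig, eff_ref, frac_ratio):
    """Ratio BR(signal)/BR(reference) from fitted yields, selection
    efficiencies, and the ratio of b-hadron production fractions
    frac_ratio = f(reference) / f(signal)."""
    return (n_sig / n_ref) * (eff_ref / eff_sig) * frac_ratio

# hypothetical placeholder numbers, not LHCb results
r = relative_bf(n_sig=120, n_ref=48_000,
                eff_sig=0.010, eff_ref=0.012, frac_ratio=2.0)
```

In practice the yields come from fits to invariant-mass distributions and the efficiencies from simulation, with their uncertainties propagated into the final ratio.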

Contact: Niels Tuning and Lennaert Bel and Mick Mulder

LHCb: Measurement of B0 -> pi Ds- , the b -> u quark transition

This project aims to measure the branching fraction of the decay B0->pi Ds+. This decay is closely related to Lb->p Ds+ (see above), and close collaboration between the two master projects is foreseen. This research was started by a previous master student; the new measurement will complete the work and include the new data from 2015 and 2016.

See Mick Mulder's master's thesis for more information.

Contact: Niels Tuning and Lennaert Bel and Mick Mulder

LHCb: A search for heavy neutrinos in the decay of W bosons at LHCb

Neutrinos are arguably the most mysterious of all known fundamental fermions as they are both much lighter than all others and only weakly interacting. It is thought that the tiny mass of neutrinos can be explained by their mixing with so-far unknown, much heavier, neutrino-like particles. In this research proposal we look for these new neutrinos in the decay of the SM W-boson using data with the LHCb experiment at CERN. The W boson is assumed to decay to a heavy neutrino and a muon. The heavy neutrino subsequently decays to a muon and a pair of quarks. Both like-sign and opposite-sign muon pairs will be studied. The result of the analysis will either be a limit on the production of the new neutrinos or the discovery of something entirely new.

Contact: Wouter Hulsbergen and Elena Dall'Occo


ALICE : Particle polarisation in strong magnetic fields

When two atomic nuclei, moving in opposite directions, collide off-centre, the Quark Gluon Plasma (QGP) created in the overlap zone is expected to rotate. The nucleons not participating in the collision represent electric currents generating an intense magnetic field. The magnetic field could be as large as 10^{18} gauss, orders of magnitude larger than the strongest magnetic fields found in astronomical objects. Proving the existence of the rotation and/or the magnetic field could be done by checking whether particles with spin are aligned with the rotation axis, or whether charged particles have different production rates relative to the direction of the magnetic field. In particular, the longitudinal and transverse polarisation of the Lambda^0 baryon will be studied. This project requires some affinity with computer programming.

Contact: Paul Kuijer and Panos Christakoglou

ALICE : Blast-Wave Model in heavy-ion collisions

The goal of heavy-ion physics is to study the Quark Gluon Plasma (QGP), a hot and dense medium where quarks and gluons move freely over large distances, larger than the typical size of a hadron. Hydrodynamic simulations predict that the QGP will expand under its own pressure and cool while expanding. These simulations are particularly successful in describing some of the key observables measured experimentally, such as particle spectra and elliptic flow. A reasonable reproduction of the same observables is also achieved with models that use parameterisations resembling the hydrodynamical evolution of the system under an assumed freeze-out scenario, usually referred to as blast-wave models. The goal of this project is to work on different blast-wave parametrisations, test their dependence on the input parameters, and extend their applicability by including more heavy-ion observables in the global fit.
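As a concrete example of what such a parametrisation looks like, the sketch below implements a simplified Boltzmann-Gibbs blast wave: a thermal source at temperature T boosted radially with a velocity profile beta_s·(r/R)^n. The Bessel functions are evaluated from their integral representations with a crude midpoint rule, and all parameter values are illustrative:

```python
import math

def bessel_i0(x, steps=200):
    """I0(x) = (1/pi) * integral_0^pi exp(x cos u) du (midpoint rule)."""
    h = math.pi / steps
    return sum(math.exp(x * math.cos((k + 0.5) * h)) for k in range(steps)) * h / math.pi

def bessel_k1(x, steps=400, t_max=20.0):
    """K1(x) = integral_0^inf exp(-x cosh t) cosh t dt (midpoint rule)."""
    h = t_max / steps
    return sum(math.exp(-x * math.cosh((k + 0.5) * h)) * math.cosh((k + 0.5) * h)
               for k in range(steps)) * h

def blast_wave(pt, mass, temp, beta_s, n_prof=1.0, r_steps=50):
    """Un-normalised dN/(pT dpT) for one species from the blast-wave ansatz."""
    mt = math.sqrt(pt * pt + mass * mass)
    total = 0.0
    for k in range(r_steps):
        r = (k + 0.5) / r_steps                      # r/R in (0, 1)
        rho = math.atanh(beta_s * r ** n_prof)       # transverse rapidity
        total += (r * mt * bessel_i0(pt * math.sinh(rho) / temp)
                         * bessel_k1(mt * math.cosh(rho) / temp))
    return total / r_steps

# radial flow hardens the spectrum: the high-pT/low-pT ratio grows with beta_s
PION_MASS, TEMP = 0.139, 0.100                       # GeV, illustrative
ratio_flow = (blast_wave(2.0, PION_MASS, TEMP, beta_s=0.6)
              / blast_wave(0.5, PION_MASS, TEMP, beta_s=0.6))
ratio_static = (blast_wave(2.0, PION_MASS, TEMP, beta_s=0.0)
                / blast_wave(0.5, PION_MASS, TEMP, beta_s=0.0))
```

A global fit would adjust temp, beta_s and n_prof simultaneously to the measured spectra of several particle species.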

Contact: Panos Christakoglou and Paul Kuijer

ALICE : Higher Harmonic Flow

When two ions collide at non-zero impact parameter, the overlap region is not isotropic. This spatial anisotropy of the overlap region is transformed into an anisotropy in momentum space through interactions between partons and, at a later stage, between the produced particles. It was recently realized that the overlap region of the colliding nuclei exhibits an irregular shape. These irregularities originate from the initial density profile of nucleons participating in the collision, which is not smooth and differs from one event to the other. The resulting higher-order flow harmonics (e.g. v3, v4, and v5, usually referred to as triangular, quadrangular, and pentangular flow, respectively), and in particular their transverse-momentum dependence, are argued to be probes more sensitive than elliptic flow not only of the initial geometry and its fluctuations but also of the shear viscosity over entropy density (η/s). The goal of this project is to study v3, v4, and v5 for identified particles in heavy-ion collisions at the LHC.
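The flow harmonics are the Fourier coefficients of the azimuthal distribution, dN/dphi ∝ 1 + 2 Σ_n v_n cos(n(phi − Ψ_n)). A toy event sample illustrates how they can be recovered as averages <cos(n·phi)> (taking Ψ_n = 0 for simplicity; the input values are invented):

```python
import math, random

random.seed(42)
V2_TRUE, V3_TRUE = 0.10, 0.03          # input harmonics (invented values)

def sample_phi():
    """Draw phi from dN/dphi ~ 1 + 2 v2 cos(2 phi) + 2 v3 cos(3 phi)
    by acceptance-rejection (envelope 1.3 > 1 + 2*(v2 + v3))."""
    while True:
        phi = random.uniform(0.0, 2.0 * math.pi)
        w = (1.0 + 2.0 * V2_TRUE * math.cos(2.0 * phi)
                 + 2.0 * V3_TRUE * math.cos(3.0 * phi))
        if random.uniform(0.0, 1.3) < w:
            return phi

phis = [sample_phi() for _ in range(200_000)]
# orthogonality of the Fourier modes gives v_n = <cos(n phi)>
v2 = sum(math.cos(2.0 * p) for p in phis) / len(phis)
v3 = sum(math.cos(3.0 * p) for p in phis) / len(phis)
```

Real measurements must in addition estimate the unknown symmetry planes Ψ_n event by event, e.g. with multi-particle correlation (cumulant) methods.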

Contact: Panos Christakoglou and Paul Kuijer

ALICE : Chiral Magnetic Effect and the Strong CP Problem

Within the Standard Model, symmetries, such as the combination of charge conjugation (C) and parity (P) known as CP-symmetry, are considered to be key principles of particle physics. The violation of CP-invariance can be accommodated within the Standard Model in both the weak and the strong interactions; however, it has only been confirmed experimentally in the former. Theory predicts that in heavy-ion collisions gluonic fields create domains where the parity symmetry is locally violated. This manifests itself in a charge-dependent asymmetry in the production of particles relative to the reaction plane, called the Chiral Magnetic Effect (CME). The first experimental results from STAR (RHIC) and ALICE (LHC) are consistent with the expectations from the CME, but background effects have not yet been properly disentangled. In this project you will develop and test new observables of the CME, trying to understand and discriminate the background sources that affect such a measurement.

Contact: Panos Christakoglou and Paul Kuijer

DR&D : Medical X-ray Imaging

With the advent of true multi-threshold X-ray detectors, low-dose spectral imaging, including spectral CT, is now within reach. The Medipix3RX chip, from the Medipix Collaboration (CERN), features up to 8 programmable thresholds which can select energy bins without a threshold scan. A number of projects could be derived from the R&D activities with the Medipix3RX within the Nikhef R&D group on X-ray imaging for medical applications:

  • Medipix3RX characterization in all its operation modes and gains.
  • Spectral CT and sparse-sampling 3D reconstruction
  • Charge sharing: the charge-summing capabilities of the chip can be exploited to further understand the problem of charge sharing in pixelated detectors. A combination of the characterization of the charge-summing mode and the use of both planar and 3D sensors, in the light of MC simulations, could reveal valuable information about charge sharing.

Contact: Els Koffeman, Martin Fransen

DR&D : Compton camera

In the Nikhef R&D group we develop instrumentation for particle physics, but we also investigate how particle physics detectors can be used for other purposes. A successful development is the Medipix chip, which can be used in X-ray imaging. For use in large-scale medical applications, however, Compton scattering limits the energy-resolving possibilities. You will investigate whether it is in principle possible to design an X-ray application that detects both the Compton-scattered electron and the absorbed photon. Your ideas can be tested in practice in the lab, where an X-ray scan can be performed.

Contact: Els Koffeman

KM3NeT : Reconstruction of first neutrino interactions in KM3NeT

The neutrino telescope KM3NeT is under construction in the Mediterranean Sea, aiming to detect cosmic neutrinos. Its first two strings with sensitive photodetectors were deployed in 2015 and 2016, with a total of 30 strings to be deployed by the end of next year. Even these few strings already make it possible to reconstruct the abundant muons stemming from interactions of cosmic rays with the atmosphere and to identify neutrino interactions. In order to identify neutrinos, an accurate reconstruction and an optimal understanding of the backgrounds are crucial. In this project we will use the available data to identify and reconstruct the first neutrino interactions in the KM3NeT detector, and with this pave the way towards neutrino astronomy.

Programming skills are essential; mostly ROOT and C++ will be used.

Contact: Ronald Bruijn

ANTARES: Analysis of IceCube neutrino sources.

The only evidence so far for high-energy neutrinos from cosmic sources comes from detections with the IceCube detector. Most of the detected events were reconstructed with a large uncertainty on their direction, which has prevented an association with astrophysical sources. Only for the high-energy muon-neutrino candidates has a good directional resolution been achieved, but for those too no significant correlation with astrophysical sources has been detected to date. The ANTARES neutrino telescope has continuously taken neutrino data with high angular resolution since 2007, which can be exploited to further scrutinize the locations of these neutrino source candidates. In this project we will address the neutrino sources in a stacked analysis to probe the origin of the neutrinos with enhanced sensitivity.

Programming skills are essential; mainly C++ and ROOT will be used.

Contact: Dorothea Samtleben


ATLAS: Implementation of Morphing techniques for ATLAS top physics analysis.

Perhaps the most promising gateway to physics beyond the Standard Model is the top quark, the heaviest elementary particle. Particularly interesting is how the different top-quark spin states influence the angular distributions of electrons and other decay products, which can be measured very accurately. New interactions would alter this coupling, leading to decay patterns different from those predicted by the Standard Model. At Nikhef we are implementing NLO predictions of the so-called dimension-6 operators to describe several measurable distributions. To confront these distributions with data, a continuous parametrisation is required. For this purpose, we want to introduce into top-quark analysis a novel technique based on morphing. The project consists of implementing morphing to parametrize the top's angular distributions and of demonstrating that the parameters can be extracted in a fitting procedure using (pseudo)data.
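The essence of morphing for a single dimension-6 operator: the amplitude is linear in its coefficient c, so any binned observable is quadratic in c, and three template samples fix the full c-dependence of each bin exactly. A minimal sketch with invented template yields:

```python
def morph_bin(y_m1, y_0, y_p1):
    """Given template yields of one bin at c = -1, 0, +1, return the
    exact quadratic y(c) = a + b*c + d*c**2 through the three points."""
    a = y_0
    b = 0.5 * (y_p1 - y_m1)
    d = 0.5 * (y_p1 + y_m1) - y_0
    return lambda c: a + b * c + d * c * c

# invented template yields for one histogram bin
y = morph_bin(y_m1=80.0, y_0=100.0, y_p1=140.0)
```

Applied bin by bin, this yields a continuous prediction of the full distribution as a function of the Wilson coefficient, which can then be fitted to (pseudo)data; with several operators, cross terms require correspondingly more template samples.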

Affinity with software is essential; mainly C++ and ROOT will be used.

Contact: Marcel Vreeswijk

Theory – Probing electroweak symmetry breaking with Higgs pair production at the LHC and beyond

The measurement of Higgs pair production will be a cornerstone of the LHC program in the coming years. Double Higgs production provides a crucial window on the mechanism of electroweak symmetry breaking and has a unique sensitivity to a number of currently unknown Higgs couplings, such as the Higgs self-coupling λ and the coupling between a pair of Higgs bosons and two vector bosons. In this project, the student will explore the feasibility of measuring Higgs pair production in the 4b final state, both at the LHC and at a future 100 TeV collider. A number of production modes will be considered, including gluon fusion, vector-boson fusion, and Higgs pair production in association with a top-quark pair. A key ingredient of the project will be the exploitation of multivariate techniques, such as artificial neural networks and other multivariate discriminants, to enhance the ratio of di-Higgs signal over backgrounds.

The project involves estimating the precision that can be achieved in the extraction of the Higgs self-coupling for a number of assumptions about the performance of the LHC detectors, and in particular quantifying the information that can be extracted from the Run II dataset with L = 300 fb^-1. A similar approach will be applied to the determination of other unknown properties of the Higgs sector, such as the coupling between two Higgs bosons and two weak vector bosons, as well as the Wilson coefficients of higher-dimensional operators in the Standard Model Effective Field Theory (SM-EFT). Additional information on this project can be found here: [1].

Contact: Juan Rojo

Theory – Constraining the proton structure with Run II LHC data

The non-perturbative dynamics that determine the energy distribution of quarks and gluons inside protons, the so-called parton distribution functions (PDFs), cannot be computed from first principles in Quantum Chromodynamics (QCD) and need to be determined from experimental data. PDFs are an essential ingredient of the scientific program at the Large Hadron Collider (LHC), from Higgs characterisation to searches for New Physics beyond the Standard Model. One recent breakthrough in PDF analysis has been the exploitation of constraints from LHC data. From direct photons to top-quark pair production cross-sections and charmed-meson differential distributions, LHC measurements are now a central ingredient of PDF fits, providing important information on poorly-known PDFs such as the small- and large-x gluon or the large-x antiquarks. With the upcoming availability of data from Run II of the LHC, at a center-of-mass energy of 13 TeV, these constraints are expected to become even more stringent.

In this project, the implications of PDF-sensitive measurements at the LHC 13 TeV will be quantified. Processes that will be considered include jet and dijet production at the multi-TeV scale, single-top quark production, and weak boson production in association with heavy quarks, among several others. These studies will be performed using the NNPDF fitting framework, based on artificial neural networks and genetic algorithms. The phenomenological implications of the improved PDF modelling for Higgs and new physics searches at the LHC will also be explored. Additional information on this project can be found here: [2].
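A toy version of the NNPDF-style parametrisation mentioned above: a small neural network multiplied by a preprocessing factor x^(-alpha)·(1-x)^beta that enforces the expected low- and high-x behaviour. The exponents and the (random, untrained) network weights below are illustrative stand-ins for what the fit would determine:

```python
import math, random

random.seed(7)

class ToyPDF:
    """x*f(x) = x**(-alpha) * (1 - x)**beta * NN(x); exponents and the
    random, untrained weights are illustrative stand-ins for a fit."""
    def __init__(self, hidden=5, alpha=0.1, beta=3.0):
        self.alpha, self.beta = alpha, beta
        self.w1 = [(random.gauss(0.0, 1.0), random.gauss(0.0, 1.0))
                   for _ in range(hidden)]
        self.w2 = [random.gauss(0.0, 1.0) for _ in range(hidden)]

    def net(self, x):
        # one hidden sigmoid layer, softplus output to keep the PDF positive
        hidden = [1.0 / (1.0 + math.exp(-(w * x + b))) for w, b in self.w1]
        z = sum(w * h for w, h in zip(self.w2, hidden))
        return math.log1p(math.exp(z))

    def xf(self, x):
        return x ** (-self.alpha) * (1.0 - x) ** self.beta * self.net(x)

pdf = ToyPDF()
values = [pdf.xf(x) for x in (1e-4, 1e-2, 0.1, 0.5, 0.9)]
```

In the actual NNPDF methodology an ensemble of such networks is trained on data replicas, so that the spread of the ensemble propagates the experimental uncertainties into the PDFs.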

Contact: Juan Rojo

2016:

Extreme Astronomy – Preparing for CTA, the Next-Generation Gamma-Ray Observatory

The Cherenkov Telescope Array (CTA) is a planned facility for measuring gamma rays from space, covering more than four orders of magnitude in energy, up to energies exceeding 100 TeV. CTA employs the imaging atmospheric Cherenkov technique to measure the properties of cosmic gamma rays. This technique is based on measuring the Cherenkov light emitted during the development of a gamma-ray air shower. CTA will be built at two sites, one in the Northern and one in the Southern Hemisphere, and will consist of up to 100 telescopes. It represents a major leap forward in sensitivity and precision for gamma-ray astronomy, and will allow us to explore very-high-energy processes of the extreme Universe at an unprecedented level.

Several master projects are available in the CTA group of UvA and the students will participate in photonic and electronic R&D studies contributing to the starting phase of CTA. These studies will either focus on laboratory-based measurements or simulations of novel kinds of single-photon detectors, referred to as silicon photomultipliers (SiPMs). By means of these simulations the performance of SiPMs used for arrays of Cherenkov telescopes will be assessed and their optimal operational parameters will be evaluated. Within one of the laboratory-based projects different types of SiPMs will be characterised and a measuring system to calibrate the photosensors will be designed and operated. In a different project various imaging and non-imaging light sources will be used to study the trigger performance of a CTA prototype camera.
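A flavour of the SiPM simulation work can be given with a toy response model (all parameter values below are assumed for illustration, not CTA specifications): Poisson-distributed photoelectrons, one generation of optical crosstalk, cell-to-cell gain spread and electronic noise:

```python
import numpy as np

rng = np.random.default_rng(1)

def sipm_charge(n_events, mu=5.0, p_xt=0.1, gain=1.0, gain_sigma=0.1, noise=0.05):
    """Toy SiPM response in units of the single-cell gain.
    mu: mean photoelectrons, p_xt: crosstalk probability (assumed values)."""
    npe = rng.poisson(mu, n_events)            # primary photoelectrons
    npe += rng.binomial(npe, p_xt)             # one generation of optical crosstalk
    # sum the gain-smeared single-cell charges, then add baseline noise
    charge = np.array([rng.normal(gain, gain_sigma, n).sum() for n in npe])
    return charge + rng.normal(0.0, noise, n_events)

q = sipm_charge(20000)
print("mean charge:", q.mean())  # expected near mu * (1 + p_xt) * gain
```

Histogramming `q` reproduces the characteristic comb of single-photoelectron peaks that such a characterisation setup would calibrate against.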

Contact: David Berge, Maurice Stephan


=== ATLAS: Astroparticle Physics at the LHC ===

Understanding particle acceleration up to very high energies in the Universe requires Earth-bound experimental techniques that exploit the Earth’s atmosphere as detection medium. Only the sheer size of the atmosphere provides a sufficiently large sensitive area to measure the very rare highest-energy particles from the cosmos as they impinge on the Earth. The idea of the atmospheric measurement is simple: a cosmic particle hitting the atmosphere is absorbed by developing into an air shower, a spray of secondary particles that originates in the collision of the primary cosmic particle with air molecules, and successive interactions of those secondary particles in the atmosphere. Such air showers can be traced and therefore measured on Earth, providing information about the energy, type, and direction of the primary cosmic particle, by different means. Important examples of such atmospheric detection techniques include the measurement of muons with particle counters at the Earth’s surface and the measurement of Cherenkov or fluorescence light emitted during the air shower development. The connection between measured quantities like particle numbers or light intensity and original quantities like particle energy or type is in all cases inferred using simulations of particle collisions and cascades in the atmosphere.

The goal of this master project is to exploit data of proton collisions measured with ATLAS, an experiment at the Large Hadron Collider (LHC), the highest-energy man-made particle collider, currently operating at CERN in Geneva (Switzerland), to test and improve simulations of particle collisions in the atmosphere up to the highest known energies (a few times 10^20 eV). The student will work on ATLAS data analysis and Monte Carlo simulations of particle collisions, both for protons colliding in ATLAS and for cosmic-ray protons colliding with air molecules in the atmosphere. The ultimate goal is to improve Monte Carlo model predictions used for experiments like the upcoming CTA (http://www.cta-observatory.org/) and Auger (http://www.auger.org/).
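How shower observables encode the primary energy can be caricatured with the classic Heitler toy model: the particle number doubles every splitting length until the energy per particle falls below a critical energy. The numbers below are illustrative defaults, not tuned model parameters:

```python
import math

def heitler_xmax(E0_eV, lam_gcm2=37.0, Ec_eV=85e6):
    """Heitler toy model of an electromagnetic air shower.
    lam_gcm2: splitting length, Ec_eV: critical energy (illustrative values)."""
    n_gen = math.log2(E0_eV / Ec_eV)  # number of doubling generations
    n_max = E0_eV / Ec_eV             # particle number at shower maximum
    x_max = n_gen * lam_gcm2          # depth of shower maximum [g/cm^2]
    return n_max, x_max

n_max, x_max = heitler_xmax(1e20)
print(f"N_max ~ {n_max:.2e} particles, X_max ~ {x_max:.0f} g/cm^2")
```

Even this cartoon reproduces the two key features the real simulations quantify: N_max grows linearly with E0, while X_max grows only logarithmically.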

Contact: David Berge, David Salek

=== ATLAS: Beyond the Standard Model with multiple leptons ===

The Standard Model of particle physics (SM) is extremely successful, but would it hold up against a check with data containing multiple leptons? Although the production of multiple leptons is a very rare process, it is calculated in the SM with high precision. On the detector side, leptons (electrons and muons) are easy to reconstruct, and such a sample contains very little "non-lepton" background. This analysis has the very ambitious goal of testing many final states at once, without over-tuning for a specific model. The second step would then be to test the obtained results against models of lepton compositeness or the presence of heavy right-handed neutrinos favored in seesaw theories. With this project, the student would gain close familiarity with modern experimental techniques (statistical analysis, SM background estimates, etc.), with Monte Carlo generators and with the standard HEP analysis tools (ROOT, C++, etc.).

Contact: Olya Igonkina

=== ATLAS: Search for supersymmetric dark matter-like particles ===

Supersymmetry is a theory of physics beyond the Standard Model of particle physics that makes a link between fermions and bosons. Apart from many other attractive features, supersymmetry also predicts a particle that may be a candidate for dark matter. Such particles, and their related particles, may be produced at the LHC. Searching for them, however, is challenging. The production cross-sections are small, and the final states are difficult to measure, for example because the final state particles may have little energy, and are ignored by the ATLAS trigger and reconstruction software. In this project we will investigate the measurement of such particles, with the aim to improve the current ATLAS search strategy. You will then apply the improved analysis to the ATLAS data (to be) collected in 2016. In the project you will learn modern experimental analysis techniques, as well as programming in ROOT and C++.

Contact: Paul de Jong

=== ATLAS inner tracker upgrade ===

Research description: One of the key sub-systems of the ATLAS experiment at the Large Hadron Collider (LHC) is the Inner Detector (ID), designed to provide excellent momentum and vertex resolution measurements for charged particles. In Phase-2 of the LHC the operating luminosity of the collider will be increased significantly. This requires an upgrade of all ATLAS subsystems. In particular, the ID will be fully replaced with a tracker made entirely of silicon, with higher granularity and radiation hardness. The R&D process for the new ATLAS ID is now ongoing. Different geometrical layouts are simulated and their performance is studied under different operating conditions in the search for the optimal detector architecture. Also, the performance of the new Si-sensors/modules is under investigation with dedicated laboratory tests. The focus of the project could be on the simulation of the High-Luminosity LHC version of the ATLAS Inner Detector. The student will learn how a high-energy physics experiment is designed and optimized. Alternatively, if possible at that time, the student could work at the Nikhef Silicon laboratory at the test-bench for new ATLAS Si-strip detectors and participate in the quality assurance procedure for the new ATLAS Si detectors.

Contact: Peter Vankov


=== ATLAS: Model testing for Beyond the Standard Model Higgs ===

Recently there has been a lot of excitement about hints of new Beyond the Standard Model (BSM) physics. When the search for the Higgs was ongoing we had a model to test possible bumps against; now we are looking for any new particle, and there are many models available. Testing the newest results from the LHC is challenging because of low statistics and model uncertainties. In this project we will investigate different BSM Higgs models that could possibly explain the newest results from the LHC. For this we will use the data collected in 2015 and the data that will be collected in 2016. In the end we will set limits on the tested models (or discover a new particle). During the project you will learn modern experimental analysis techniques as well as programming.

Contact: Wouter Verkerke, Lydia Brenner

=== Theory & ATLAS: How to violate lepton flavour? ===

One of the big outstanding questions in particle physics is what has caused the matter-antimatter asymmetry in the Universe. One of the possible solutions is related to processes that violate lepton flavour. Such phenomena could be observable in the ATLAS experiment at the Large Hadron Collider. In fact, recent data show a hint for the violation of lepton flavour in the decay of the Higgs boson into a muon and a tau lepton. However, additional experimental and theoretical investigations are necessary to further probe this very interesting phenomenon. The goal of this project is to identify complementary lepton flavour violating processes and to study correlations between them and other flavour probes. The work will involve theoretical calculations and a classification of the most useful observables for the ATLAS experiment.

Contact: Jordy de Vries, Olya Igonkina, Robert Fleischer

=== KM3NeT: Reconstruction of first neutrinos in KM3NeT ===

The neutrino telescope KM3NeT is under construction in the Mediterranean Sea, aiming to detect cosmic neutrinos. Its first string of sensitive photodetectors was deployed at the end of 2015; in total, 30 strings will be deployed by the end of 2017. Even these few strings make it possible to reconstruct in the detector the abundant muons stemming from interactions of cosmic rays with the atmosphere, and to identify neutrino interactions. The performance and calibration of the detector will be evaluated, also in comparison with simulations. Procedures to identify muons and neutrinos and to optimally reconstruct their directions will be developed, to verify the performance and potential of the detector and to pave the path towards neutrino astronomy. Programming skills are essential; mostly ROOT and C++ will be used.

Contact: Ronald Bruijn

=== Neutrino mass hierarchy with KM3NeT/ORCA ===

Neutrinos exist in three flavors and are known to oscillate between them, whereby the detected flavor depends on the (partly known) oscillation parameters, the mass differences, the neutrino energy and the travel length. The KM3NeT collaboration is planning a dedicated set of detection units in order to pursue an oscillation measurement of unprecedented precision using neutrinos from atmospheric interactions, and with it a measurement of the so-far unknown neutrino mass hierarchy. The measurement of this subtle effect requires unprecedented precision in the reconstruction and identification of the flavor, energy and direction of the neutrinos. Various projects are available in the reconstruction and in the evaluation of the mass-hierarchy sensitivity using dedicated simulations. Programming skills are essential; mainly C++ and ROOT will be used.
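The size of the effect being exploited can be illustrated with the standard two-flavour oscillation formula P = sin²(2θ)·sin²(1.27 Δm²[eV²] L[km] / E[GeV]); the parameter values below are round illustrative numbers, not the measured ones:

```python
import math

def p_oscillation(L_km, E_GeV, sin2_2theta=0.95, dm2_eV2=2.5e-3):
    """Two-flavour oscillation probability with L in km and E in GeV.
    Default mixing and mass splitting are round atmospheric-scale values."""
    return sin2_2theta * math.sin(1.27 * dm2_eV2 * L_km / E_GeV) ** 2

# Atmospheric neutrino crossing the full Earth diameter
p = p_oscillation(L_km=12742, E_GeV=25.0)
print("oscillation probability:", p)
```

For up-going atmospheric neutrinos at these energies the oscillation is near-maximal, which is why precise flavour, energy and direction reconstruction translates directly into hierarchy sensitivity.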

Contact: Aart Heijboer

=== All-flavor-neutrino analysis of ANTARES data ===

The ANTARES neutrino telescope has been taking data continuously since 2007. Most analyses of the data have used the signature of a muon-neutrino interaction, whereby a long track can be reconstructed in the detector. Recent developments allow for the first time also the reconstruction of a cascade signature in the detector with high angular resolution, so that electron- and tau-neutrino interactions can be detected as well (these two are not distinguishable from each other here). A search for neutrinos from cosmic sources on the first 6 years of data has by now been accomplished using this new reconstruction. In this project this search will be continued and extended with 2 more years of data for a dedicated, optimized analysis of the Galactic Center, to probe the possible neutrino emission from this highly interesting region. Programming skills are essential; mainly C++ and ROOT will be used. Other options for analyses of the ANTARES data are also available.

Contact: Dorothea Samtleben

=== ALICE: Particle Polarization in Strong Magnetic Fields ===

When two atomic nuclei, moving in opposite directions, collide off-center, the Quark Gluon Plasma (QGP) created in the overlap zone is expected to rotate. The nucleons not participating in the collision represent electric currents generating an intense magnetic field. The magnetic field could be as large as 10^18 gauss, orders of magnitude larger than the strongest magnetic fields found in astronomical objects. Proving the existence of the rotation and/or the magnetic field could be done by checking whether particles with spin are aligned with the rotation axis, or whether charged particles have different production rates relative to the direction of the magnetic field. In particular, the longitudinal and transverse polarisation of the Lambda^0 baryon will be studied. This project requires some affinity with computer programming.

Contact: P. Christakoglou, P. Kuijer

=== ALICE: Forward Particle Production from the Color Glass Condensate ===

It has been proposed that a new state of matter (the color-glass condensate, or CGC) may provide a universal description of hadronic collisions (e.g. proton-proton collisions) at very high energy. The CGC may be seen as the classical field limit of Quantum Chromodynamics, and a framework for calculating observables from this state has been developed. Several measurements are consistent with the assumption of a CGC, but no experimental proof exists so far. In this project we intend to perform a systematic study of the sensitivity to the CGC of different possible measurements at the LHC. The work will be performed in close collaboration with an external world expert in this field. It is advantageous to have a good background in theoretical physics.

Contact: T. Peitzmann, M. van Leeuwen

=== ALICE: Blast-Wave Model in Heavy-Ion Collisions ===

The goal of heavy-ion physics is to study the Quark Gluon Plasma (QGP), a hot and dense medium where quarks and gluons move freely over large distances, larger than the typical size of a hadron. Hydrodynamic simulations expect that the QGP will expand under its own pressure, and cool while expanding. These simulations are particularly successful in describing some of the key observables measured experimentally, such as particle spectra and elliptic flow. A reasonable reproduction of the same observables is also achieved with models that use parameterisations that resemble the hydrodynamical evolution of the system assuming a given freeze-out scenario, usually referred to as blast-wave models. The goal of this project is to work on different blast-wave parametrisations, test their dependence on the input parameters and extend their applicability by including more observables studied in heavy-ion collisions in the global fit.

Contact: P. Christakoglou, P. Kuijer

=== ALICE: Energy Loss of Energetic Quarks and Gluons in the Quark-Gluon Plasma ===

One of the ways to study the quark-gluon plasma that is formed in high-energy nuclear collisions is using high-energy partons (quarks or gluons) that are produced early in the collision and interact with the quark-gluon plasma as they propagate through it. There are several current open questions related to this topic, which can be explored in a Master's project. For example, we would like to use a new Monte Carlo generator model (JEWEL) of the collision to see whether we can measure the shape of the collision region using measurements of hadron pairs. In the project you will collaborate with one of the PhD students in our group to use the model to generate predictions of measurements and compare those to data analysis results. Depending on your interests, the project can focus more on the modeling aspects or on the analysis of experimental data from the ALICE detector at the LHC.

Contact: M. van Leeuwen

=== ALICE: Chiral Magnetic Effect and the Strong CP Problem ===

Within the Standard Model, symmetries, such as the combination of charge conjugation (C) and parity (P), known as CP-symmetry, are considered to be key principles of particle physics. Violation of CP-invariance can be accommodated within the Standard Model in both the weak and the strong interactions; however, it has only been confirmed experimentally in the former. Theory predicts that in heavy-ion collisions gluonic fields create domains where the parity symmetry is locally violated. This manifests itself in a charge-dependent asymmetry in the production of particles relative to the reaction plane, which is called the Chiral Magnetic Effect (CME). The first experimental results from STAR (RHIC) and ALICE (LHC) are consistent with the expectations from the CME, but background effects have not yet been properly disentangled. In this project you will develop and test new observables of the CME, trying to understand and discriminate the background sources that affect such a measurement.

Contact: P. Christakoglou, P. Kuijer

=== ALICE: Quantum Coherence in Particle Production with Intensity Interferometry ===

Intensity interferometry – also known as the HBT effect or Bose-Einstein correlations – is a method to study the space-time structure of the particle-emitting source in high-energy physics. The main interest so far has been the width of correlation functions in momentum space, which reflects the space-time information. The strength of the correlation also carries information, but it has often been neglected. The correlation strength is in particular influenced by the degree of coherence of particle production. Recently new studies have been performed to extract this degree of coherence; however, many other effects might distort such a measurement, in particular the production of pions via resonance decays. In this project we will study the role of resonance decays for a measurement of coherence in intensity interferometry and try to establish possible correction methods for any distortions they may cause. We will perform theoretical model calculations with Monte-Carlo simulation methods.
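The observable at stake can be sketched with the commonly used Gaussian parametrisation C₂(q) = 1 + λ·exp(−q²R²), where the intercept λ carries the correlation-strength information discussed above. The parameter values below are toys:

```python
import numpy as np

def c2(q_GeV, lam=0.5, R_fm=5.0):
    """Gaussian Bose-Einstein correlation function C2(q) = 1 + lam*exp(-q^2 R^2).
    A reduced intercept lam < 1 can signal partial coherence, but also
    resonance decays -- the degeneracy this project would study."""
    hbarc = 0.197  # GeV*fm, converts q [GeV/c] times R [fm] to a pure number
    return 1.0 + lam * np.exp(-(q_GeV * R_fm / hbarc) ** 2)

q = np.linspace(0.0, 0.3, 100)
corr = c2(q)
lam_est = corr[0] - 1.0  # the q -> 0 intercept estimates the strength lambda
print("extracted lambda:", lam_est)
```

Long-lived resonances effectively enlarge R beyond experimental resolution, feeding pion pairs into the flat region and pulling the apparent λ below its true value, which is exactly the distortion to be corrected for.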

Contact: T. Peitzmann

=== ALICE: Higher Harmonic Flow ===

When two ions collide, if the impact parameter is not zero, the overlap region is not isotropic. This spatial anisotropy of the overlap region is transformed into an anisotropy in momentum space through interactions between partons and at a later stage between the produced particles. It was recently realized that the overlap region of the colliding nuclei exhibits an irregular shape. These irregularities originate from the initial density profile of nucleons participating in the collision which is not smooth and is different from one event to the other. The resulting higher order flow harmonics (e.g. v3, v4, and v5, usually referred to as triangular, quadrangular, and pentangular flow, respectively) and in particular their transverse momentum dependence are argued to be more sensitive probes than elliptic flow not only of the initial geometry and its fluctuations but also of shear viscosity over entropy density (η/s). The goal of this project is to study v3, v4, and v5 for identified particles in collisions of heavy-ions at the LHC.
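A minimal numerical illustration of flow harmonics: sample azimuthal angles from a Fourier-modulated distribution and recover vₙ as ⟨cos n(φ − Ψₙ)⟩. For simplicity the symmetry planes are fixed at zero here, and all input values are toys (a real analysis must estimate the planes or use multi-particle correlations):

```python
import numpy as np

rng = np.random.default_rng(2)

def sample_phi(n, v2=0.1, v3=0.05):
    """Accept-reject sampling of dN/dphi ~ 1 + 2 v2 cos(2 phi) + 2 v3 cos(3 phi),
    with both symmetry planes fixed at phi = 0 (toy simplification)."""
    phi = rng.uniform(0.0, 2 * np.pi, 4 * n)          # oversample candidates
    w = 1 + 2 * v2 * np.cos(2 * phi) + 2 * v3 * np.cos(3 * phi)
    keep = rng.uniform(0.0, 1 + 2 * (v2 + v3), phi.size) < w
    return phi[keep][:n]

phi = sample_phi(200000)
v2_est = np.cos(2 * phi).mean()  # v_n = <cos n(phi - Psi_n)>, here Psi_n = 0
v3_est = np.cos(3 * phi).mean()
print("v2 =", v2_est, " v3 =", v3_est)
```

Splitting the sample into transverse-momentum bins before taking the averages gives the differential vₙ(pT) that the project would study for identified particles.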

Contact: P. Christakoglou, P. Kuijer

=== ALICE: A New Detector for Very High-Energy Photons: FoCal ===

High-energy photons are important messenger particles in particle physics. In particular direct photons (i.e. those produced directly in elementary scattering processes) are interesting, but it is a difficult task to discriminate them from the photons originating from particle decays. Existing detectors have limited capabilities for such a discrimination, in particular at the highest energies. Our institute has pioneered a detector based on a new concept, a digital pixel calorimeter with Si-sensors of unprecedented granularity. First proof-of-principle measurements have already been performed. In this project we will study the performance of a particular detector design for measurements of direct photons at the LHC and optimize the design parameters for such a measurement. Performance studies for other measurements – e.g. jets, J/ψ, or ϒ particles – may be carried out in addition.

Contact: T. Peitzmann M. van Leeuwen

=== ALICE: Thermal Photon Emission: Quark-Gluon Plasma or Hadron Gas? ===

Recently, measurements of thermal photon emission in high-energy nuclear collisions have been performed at RHIC and at the LHC. It is generally believed that a quark-gluon plasma equation of state is the natural description of the hot initial phase of these collisions, and so far only theoretical model calculations including such a phase have been compared to those measurements. In this project we will revisit hadron gas models and try to reproduce the thermal photon yield together with other observables. In this work we will use, and possibly modify, Monte Carlo implementations of relativistic hydrodynamics, tune them to existing data on hadron production and then estimate the photon production from the same model. The model implementation will be based on previous work of external theoretical colleagues and will be carried out in collaboration with them.

Contact: T. Peitzmann

=== A New Detector for Proton Therapy and Proton Computed Tomography ===

Conventional imaging of humans in medical treatment relies mostly on electromagnetic radiation (CT, MRI) or positrons (PET). A recently proposed new imaging strategy, in particular in the context of proton therapy for cancer treatment, is to use proton beams. Current detectors for the scattered protons have severe limitations, in particular in their precision and measurement times. New developments of intelligent Si-sensors in particle physics offer possibilities to develop much more efficient detectors for such proton CT measurements. We will perform R&D on the use of new silicon pixel detectors, developed in the context of the ALICE experiment at CERN, for such medical applications. Studies will include Monte-Carlo simulations of a possible detector setup and measurements with the first samples of the appropriate silicon sensors, which will become available in early 2016. The project will be carried out in the context of a scientific collaboration with Bergen University, Norway.

Contact: T. Peitzmann

=== Medical X-ray Imaging ===

With the advent of true multi-threshold X-ray detectors, low-dose spectral imaging, including spectral CT, is now a reality around the corner. The Medipix3RX chip, from the Medipix Collaboration (CERN), features up to 8 programmable thresholds which can select energy bins without a threshold scan. A number of projects can be derived from the R&D activities with the Medipix3RX within the Nikhef R&D group on X-ray imaging for medical applications:

* Medipix3RX characterization in all its operation modes and gains.
* Spectral CT and sparse-sampling 3D reconstruction.
* Charge sharing: the charge-summing capabilities of the chip can be exploited to further understand the problem of charge sharing in pixelized detectors. A combination of the characterization of the charge-summing mode and the use of both planar and 3D sensors, in the light of MC simulations, could reveal valuable information about charge sharing.
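How the programmable thresholds translate into spectral bins can be sketched as follows; the spectrum (two lines on a continuum) and the threshold settings are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical photon energies (keV) hitting one pixel: two emission
# lines superimposed on a broad continuum
energies = np.concatenate([
    rng.normal(25.0, 1.0, 3000),    # line 1
    rng.normal(59.5, 1.5, 2000),    # line 2
    rng.uniform(10.0, 80.0, 5000),  # continuum
])

# Up to 8 programmable thresholds act as the edges of the energy bins;
# each counter records photons falling between consecutive thresholds
thresholds = np.array([10, 20, 30, 40, 50, 60, 70, 80], dtype=float)
counts, _ = np.histogram(energies, bins=thresholds)
for lo, hi, nc in zip(thresholds[:-1], thresholds[1:], counts):
    print(f"{lo:>4.0f}-{hi:<4.0f} keV: {nc}")
```

This per-pixel binning, done in hardware in a single exposure, is what removes the need for a threshold scan and enables material decomposition in spectral CT.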

Contact: John Idarraga, Niels van Bakel

=== Compton camera ===

In the Nikhef R&D group we develop instrumentation for particle physics, but we also investigate how particle-physics detectors can be used for other purposes. A successful development is the Medipix chip, which can be used in X-ray imaging. For use in large-scale medical applications, however, Compton scattering limits the energy-resolving possibilities. You will investigate whether it is in principle possible to design an X-ray application that detects both the Compton-scattered electron and the absorbed photon. Your ideas can be tested in practice in the lab, where an X-ray scan can be performed.

Contact: Els Koffeman

=== The Modulation experiment ===

There exist a few measurements that suggest an annual modulation in the activity of radioactive sources. With a few groups from the XENON collaboration we have developed four sets of table-top experiments to investigate this effect on a few well-known radioactive sources. The experiments are under construction at Purdue University (USA), on a mountain top in Switzerland, on a beach in Rio de Janeiro, and at Nikhef in Amsterdam. We urgently need a master student to (1) do the final commissioning of the experiment, (2) collect the first big data set, and (3) analyse the first data. We are looking for an all-round physicist with interest in both lab work and data analysis. The student will collaborate directly with the other groups in this small collaboration (around 10 people), and the goal is to have the first publication ready by the end of the project.
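The analysis step can be sketched on toy data: simulate an exponentially decaying count rate carrying a small annual modulation, divide out the fitted decay, and project the residuals onto an annual cosine. The half-life, rate and amplitude below are invented numbers, not properties of the actual sources:

```python
import numpy as np

rng = np.random.default_rng(4)

# Two years of daily counts from a decaying source with a putative
# 1% annual modulation (all numbers illustrative)
t = np.arange(0.0, 730.0)                     # time in days
T12, a_true = 462.0, 0.01                     # assumed half-life and amplitude
rate = 1e4 * 0.5 ** (t / T12) * (1 + a_true * np.cos(2 * np.pi * t / 365.25))
counts = rng.poisson(rate)

# Fit and divide out the exponential decay, then project the residual
# onto the annual cosine to estimate the modulation amplitude
slope, logA = np.polyfit(t, np.log(counts), 1)
residual = counts / np.exp(logA + slope * t) - 1.0
a_est = 2.0 * np.mean(residual * np.cos(2 * np.pi * t / 365.25))
print("estimated modulation amplitude:", a_est)
```

With ~10^4 counts per day the Poisson noise on the daily rate is about 1%, so percent-level amplitudes only become visible after averaging over the full data set, which is why collecting the first large data set is step (2) of the project.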

Contact: Auke Colijn

=== Acoustic detection of ultra-high energy cosmic-ray neutrinos ===

The study of cosmic neutrinos with energies above 10^17 eV, so-called ultra-high-energy neutrinos, provides a unique view of the universe and may provide insight into the origin of the most violent sources, such as gamma-ray bursts, supernovae or even dark matter. The energy deposition of cosmic neutrinos in water induces a thermo-acoustic signal, which can be detected using sensitive hydrophones. The expected neutrino flux is, however, extremely low, and the signal that neutrinos induce is small. TNO is presently developing sensitive hydrophone technology based on fiber optics. Optical fibers form a natural way to create a distributed sensing system. Using this technology a large-scale neutrino telescope can be built in the deep sea. TNO is aiming for a prototype hydrophone which will form the building block of a future telescope.

Students have the possibility to participate in this project in the following ways: (i) modeling of the cosmic-ray-induced acoustic signal in a neutrino telescope (keywords: cosmic rays, Monte Carlo, signal processing, telescope optimization); (ii) testing and optimization of a fiber-optical hydrophone for a large-scale neutrino telescope (keywords: experimental physics, system design).

The work will be (partly) executed in Delft.

Contact: Ernst-Jan Buis

=== LHCb: A Scintillating Fiber Tracker ===

The LHCb collaboration is upgrading its present tracking system by constructing a new tracker based on scintillating fibers combined with silicon photomultipliers (SiPMs): the SciFi Tracker! Nikhef plays a key role in the project, as we will build the SciFi fiber modules, the cold-box enclosure housing the SiPMs, and a large part of the on-detector electronics. In all these areas, interesting test hardware and software has to be realized, and several research topics for a Master project are available, bringing the student into contact with state-of-the-art particle detectors in a large team of physicists and engineers. Possible collaborations with the Nikhef R&D group can also be envisaged.

Contact: Antonio Pellegrino

=== LHCb: Discovery of the Decay Lb → p Ds- ===

This project aims to measure the branching fraction of the decay Lb → p Ds- (udb → uud + c̄s; note that charge conservation requires the Ds-). The decay Lb → p Ds- is quite rare, because it occurs through the transition of a b-quark to a u-quark. It has not been measured yet (although some LHCb colleagues claim to have seen it). This decay is interesting, because:

# It is sensitive to the b → u coupling (CKM element Vub), whose determination is heavily debated.
# It can quantify non-factorisable QCD effects in b-baryon decays.

The decay is closely related to B0 → pi- Ds+, which proceeds through a similar Feynman diagram; the final state of B0 → pi- Ds+ is also almost identical to that of Lb → p Ds-. The aim is to determine the relative branching fraction of Lb → p Ds- with respect to B0 → D+ pi- decays, in close collaboration with the PhD student (who will study BR(B0 → pi- Ds+)/BR(B0 → D+ pi-)). This project will result in a journal publication on behalf of the LHCb collaboration, written by you. For this project computer skills are needed: the ROOT program and C++ and/or Python macros are used. This is a project that is closely related to previous analyses in the group. Weekly video meetings with CERN coordinate the efforts within the LHCb collaboration. Relevant information:

[1] R. Aaij et al. [LHCb Collaboration], "Determination of the branching fractions of Bs0 → DsK and B0 → DsK", JHEP 05 (2015) 019 [arXiv:1412.7654 [hep-ex]].
[2] R. Fleischer, N. Serra and N. Tuning, "Tests of Factorization and SU(3) Relations in B Decays into Heavy-Light Final States", Phys. Rev. D 83, 014017 (2011) [arXiv:1012.2784 [hep-ph]].
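The core of the measurement described above is a ratio of fitted yields corrected by selection efficiencies. A minimal illustration with invented numbers (a real analysis additionally handles b-hadron production fractions and correlated systematic uncertainties):

```python
def relative_bf(n_sig, n_norm, eff_sig, eff_norm):
    """Relative branching fraction from yields N and efficiencies eps:
    BF_sig / BF_norm = (N_sig / N_norm) * (eps_norm / eps_sig).
    Production-fraction corrections are deliberately omitted in this toy."""
    return (n_sig / n_norm) * (eff_norm / eff_sig)

# Hypothetical fitted yields and selection efficiencies
ratio = relative_bf(n_sig=120.0, n_norm=36000.0, eff_sig=0.010, eff_norm=0.012)
print(f"toy relative branching fraction: {ratio:.1e}")
```

Measuring relative to an abundant, well-known normalisation channel is what lets luminosity and most reconstruction uncertainties cancel in the ratio.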

Contact: Niels Tuning, Lennaert Bel and Mick Mulder

=== LHCb: Measurement of B0 → pi- Ds+, the b → u quark transition ===

This project aims to measure the branching fraction of the decay B0 → pi- Ds+. This decay is closely related to Lb → p Ds- (see above), and close collaboration between the two master projects is foreseen. This research was started by a previous master student; the new measurement will complete the work and include the new data from 2015 and 2016.

See Mick Mulder's master thesis for more information.

Contact: Niels Tuning and Lennaert Bel and Mick Mulder

=== LHCb: A search for heavy neutrinos in the decay of W bosons at LHCb ===

Neutrinos are arguably the most mysterious of all known fundamental fermions, as they are both much lighter than all others and only weakly interacting. It is thought that the tiny mass of neutrinos can be explained by their mixing with so-far unknown, much heavier, neutrino-like particles. In this research project we look for these new neutrinos in the decay of the SM W boson, using data from the LHCb experiment at CERN. The W boson is assumed to decay to a heavy neutrino and a muon. The heavy neutrino subsequently decays to a muon and a pair of quarks. Both like-sign and opposite-sign muon pairs will be studied. The result of the analysis will either be a limit on the production of the new neutrinos or the discovery of something entirely new.

Contact: Wouter Hulsbergen and Elena Dall'Occo


=== LHCb: Searches for new pentaquarks ===

In 2015 LHCb surprisingly discovered states containing five quarks, called Pc+ pentaquarks. Such particles question our understanding of confinement, the principle that forces quarks to remain in a single hadron. Which hadrons are allowed and which are not? The pentaquarks were found in the decay of the Lambda_b baryon to a Pc+ and a kaon, with the Pc+ decaying to a J/psi and a proton. This project aims at studying other similar but yet unobserved decays, which could reveal the presence of the known Pc+ or of yet unknown pentaquarks. The student will optimise a selection for finding such a decay in LHCb data using machine-learning techniques.

See arXiv:1406.0755 for more information.
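As a stand-in for the machine-learning selection (which in practice would use e.g. boosted decision trees on many variables), the sketch below optimises a single cut on a toy discriminant by maximising the expected significance S/√(S+B); the distributions and the scale factor are invented:

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy discriminant: signal peaks at high values, background at low values
sig = rng.normal(0.7, 0.15, 1000)
bkg = rng.normal(0.3, 0.20, 20000)

def significance(cut, lumi_scale=0.05):
    """Expected significance S/sqrt(S+B) for events above `cut`.
    lumi_scale converts toy sample sizes to expected yields (assumed)."""
    s = lumi_scale * np.sum(sig > cut)
    b = lumi_scale * np.sum(bkg > cut)
    return s / np.sqrt(s + b) if s + b > 0 else 0.0

cuts = np.linspace(0.0, 1.0, 101)
best = max(cuts, key=significance)
print("best cut:", best, " significance:", significance(best))
```

A multivariate classifier plays the same game in many dimensions at once; the figure of merit being maximised is the same.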

Contact: Patrick Koppenburg

== 2015: ==

=== Cool with Carbon Foam ===

The sensors and readout chips of tracking detectors produce heat, which must be removed by a cooling system. The amount of material used for cooling must be minimised to avoid spoiling the track measurement by multiple scattering, bremsstrahlung, and the like. Recently, highly porous carbon foams with low density and high thermal conductivity have become available. In this project we investigate and optimise the performance of gas-cooled, low-radiation-length carbon foams for cooling.

So far we have demonstrated the very high heat-transfer-coefficient from readout chip to gas. In a second phase we will make a more realistic detector prototype for study. We can also further optimise the design by machining the foam to direct the gas where it is needed most. With help from Nikhef engineering department we can study the implications for the off-detector part of the system.
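The basic sizing relation behind these studies is Q = h·A·ΔT. A one-liner for the heat-transfer coefficient a given chip requires (the chip power, area and allowed temperature rise below are purely illustrative):

```python
def required_htc(power_W, area_cm2, dT_K):
    """Heat-transfer coefficient h = Q / (A * dT) needed to keep a readout
    chip within dT kelvin of the cooling gas. Area converted cm^2 -> m^2."""
    return power_W / (area_cm2 * 1e-4 * dT_K)  # result in W / (m^2 K)

# e.g. a 2 W chip of 4 cm^2 allowed to run at most 10 K above the gas
h = required_htc(2.0, 4.0, 10.0)
print(h, "W/m^2/K")
```

Comparing such required values against the coefficients measured on the foam samples is what decides whether gas cooling alone can carry the heat load.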

Contact: Nigel Hessey

=== Electrode optimisation for Gaseous Pixel Detectors ===

The detector R&D Group develops highly accurate gaseous tracking detectors based on pixelised readout chips. In this computer-simulation based project we will use meshing and finite element analysis tools to calculate the electric field of a given detector design. We can then use the Garfield program to simulate the detector performance, and then optimise the design of the electrodes, improving the drift-field, avalanche field, and signal-pickup of future detectors.

In the first year of the project we have developed the tools needed, and are optimising the signal electrode design. With the tools now in place, in the coming year we can optimise many other features of the design.
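A crude stand-in for the meshing/finite-element step: solve Laplace's equation for the potential between a drift plane and a readout plane with one raised electrode, using Jacobi relaxation on a grid. The geometry and voltages are invented; a real study would use proper FEM meshes and feed the field into Garfield:

```python
import numpy as np

n, V_drift = 50, -100.0
phi = np.zeros((n, n))
fixed = np.zeros((n, n), dtype=bool)
phi[0, :], fixed[0, :] = V_drift, True        # drift electrode (top)
phi[-1, :], fixed[-1, :] = 0.0, True          # grounded readout plane (bottom)
phi[-1, 20:30], fixed[-1, 20:30] = 5.0, True  # small raised anode electrode

# Jacobi relaxation: repeatedly replace each free node by the average of
# its four neighbours (np.roll makes the x-direction periodic)
for _ in range(5000):
    new = 0.25 * (np.roll(phi, 1, 0) + np.roll(phi, -1, 0)
                  + np.roll(phi, 1, 1) + np.roll(phi, -1, 1))
    phi = np.where(fixed, phi, new)

centre = phi[n // 2, n // 2]
print("potential midway between the electrodes:", centre)
```

Gradients of `phi` give the drift field; electrode-shape optimisation then amounts to re-running the solve with modified boundary conditions and comparing field maps.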

Contact: Nigel Hessey


=== The Radon Terminator ===

For Dark Matter experiments, achieving low radioactive backgrounds determines the success of the experiment. Within the XENON collaboration a lot of expertise is present to control these radioactive backgrounds, but unfortunately some of them are extremely hard to control. One of these is radon: radon is an unstable noble gas with a lifetime of several days, which can be dissolved in the xenon we use in our experiment. The decays happen in the middle of the active volume of our detector and may form an irreducible background to the Dark Matter search. Several ideas exist to filter radon from xenon, and at Nikhef we are developing a new technique based on electrostatic separation. In our group we need a master student to commission and validate a radon separator we have built at Nikhef. The student will need to build or buy the diagnostics equipment and then show whether our proposed technique works or not. You will be the 'owner' of your own experimental setup. This is a high-risk project: there is no guarantee yet that the technique works, but if it works the pay-off is high!

Contact: Auke Colijn

The Modulation experiment

A few measurements exist that suggest an annual modulation in the activity of radioactive sources. Together with a few groups from the XENON collaboration we have developed four sets of table-top experiments to investigate this effect on a few well-known radioactive sources. The experiments are under construction at Purdue University (USA), on a mountain top in Switzerland, on a beach in Rio de Janeiro, and at Nikhef in Amsterdam. We urgently need a master student to (1) do the final commissioning of the experiment, (2) collect the first large data set, and (3) analyse the first data. We are looking for an all-round physicist with an interest in both lab work and data analysis. The student will collaborate directly with the other groups in this small collaboration (around 10 people), and the goal is to have the first publication ready by the end of the project.
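
The core of the eventual data analysis is fitting an annual sinusoid to measured decay rates. A minimal sketch of such a fit on simulated daily count rates (the rate and modulation amplitude are chosen arbitrarily for illustration):

```python
import numpy as np

# Toy data: two years of daily count rates with a 0.5% annual modulation
# (rate and amplitude are made up, chosen only for illustration).
rng = np.random.default_rng(1)
t = np.arange(0.0, 730.0)                  # days
omega = 2 * np.pi / 365.25                 # annual angular frequency
truth = 1000.0 * (1.0 + 0.005 * np.cos(omega * (t - 50.0)))
counts = rng.poisson(truth)

# Linear least-squares fit of R(t) = R0 + A cos(wt) + B sin(wt)
X = np.column_stack([np.ones_like(t), np.cos(omega * t), np.sin(omega * t)])
coef, *_ = np.linalg.lstsq(X, counts, rcond=None)
amplitude = np.hypot(coef[1], coef[2]) / coef[0]   # relative modulation
print(f"fitted relative amplitude: {amplitude:.4f}")
```

With Poisson fluctuations of this size the injected 0.5% modulation is recovered with an uncertainty of a few tenths of a percent, which is why the real experiments need long, stable running periods.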

Contact: Auke Colijn


Testing general relativity with gravitational waves

The Advanced LIGO and Advanced Virgo detectors are gearing up to make the first direct detections of gravitational waves over the next few years, with a first observing run scheduled for September 2015. Among the most promising sources are mergers of binary systems consisting of neutron stars and/or black holes. The ability to observe the emitted gravitational wave signals will, for the first time, give access to the genuinely strong-field dynamics of general relativity (GR), thereby putting the classical theory to the ultimate test. The Nikhef group has developed a data analysis method to look for generic deviations from GR using signals from merging binary neutron stars. We are now extending this framework to binary black holes, which have much richer dynamics and will allow for more penetrating tests of GR, but which also pose significant new challenges. The student will study the end-to-end response of the analysis pipeline to signals predicted by GR as well as a range of alternative theories of gravity, by adding simulated waveforms to real detector noise. Basic programming skills in C, Python, or related languages are a prerequisite.
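
The injection study can be pictured with a toy example: add a scaled template to white Gaussian noise and recover it with a matched filter. The real analysis uses GR waveforms, coloured detector noise and the LIGO/Virgo pipeline; everything below is a simplified stand-in.

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 4096.0                          # sample rate in Hz
t = np.arange(0, 4.0, 1 / fs)

# Toy "chirp" template: frequency sweeping from 30 to 300 Hz. This is not a
# GR waveform; real studies use post-Newtonian / numerical-relativity models.
f_inst = 30.0 + 270.0 * (t / t[-1]) ** 3
phase = 2 * np.pi * np.cumsum(f_inst) / fs
template = np.sin(phase) * np.hanning(t.size)

# Simulated data: white Gaussian noise plus a weak injected signal
data = rng.normal(0.0, 1.0, t.size) + 0.2 * template

# Matched-filter statistic for white noise: projection onto the template
snr = np.dot(data, template) / np.sqrt(np.dot(template, template))
print(snr > 5.0)                     # injection stands out above the noise
```

An injection that is invisible by eye in the time series is recovered with a signal-to-noise ratio of order ten here, which is the basic reason matched filtering underpins all compact-binary searches.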

Contact: Chris Van Den Broeck


New physics from Higgs interactions with polarised W bosons

The interactions of the Higgs boson with the electroweak gauge bosons W+ and W- are a crucial, precisely predicted part of the Standard Model (SM). Measuring separately the Higgs couplings to longitudinally and transversely polarised W bosons will determine, for the first time, whether the Higgs and gauge bosons are elementary, as predicted in the SM, or composite particles, indicating the presence of physics beyond the Standard Model. The student will be involved in all steps of the analysis: Monte Carlo studies, the analysis of ATLAS data, and background rejection. The basic tools will include programming in C++ and Python and using ROOT.

Contact: Magdalena Slawinska


ATLAS inner tracker upgrade

Research description: One of the key sub-systems of the ATLAS experiment at the Large Hadron Collider (LHC) is the Inner Detector (ID), designed to provide excellent momentum and vertex resolution measurements for charged particles.

In Phase-2 of the LHC run the operating luminosity of the collider will be increased significantly. This requires an upgrade of all ATLAS subsystems. In particular, the ID will be fully replaced with an all-silicon tracker with higher granularity and radiation hardness. The R&D process for the new ATLAS ID is ongoing: different geometrical layouts are simulated and their performance is studied under different operating conditions in search of the optimal detector architecture. In addition, the performance of the new Si sensors and modules is under investigation in dedicated laboratory tests. The focus of the project could be on the simulation of the High-Luminosity LHC version of the ATLAS Inner Detector; the student will learn how a high-energy physics experiment is designed and optimized. Alternatively, if possible at that moment, the student could work at the Nikhef Silicon laboratory on the test bench for new ATLAS Si-strip detectors and participate in the quality-assurance procedure for the new ATLAS Si detectors.

Contact: Peter Vankov


Searching for Dark Matter in the mono-jet channel in ATLAS

Searches for Dark Matter are one of the key points of the LHC physics programme in Run 2. The mono-jet analysis, where an energetic jet recoils against missing transverse energy, is the most sensitive general search channel for Dark Matter candidates in ATLAS. In this project, the student will take part in the data analysis, help with estimating Standard Model backgrounds, and prepare an interpretation of the results in terms of simplified models such as, for example, Higgs-portal Dark Matter. Basic knowledge of C++ and Python is required.
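
The central observable can be illustrated in a few lines: missing transverse energy is the magnitude of the negative vector sum of the visible transverse momenta. The jet values below are made up.

```python
import math

# Toy mono-jet event (made-up jets): missing transverse energy (MET) is the
# magnitude of the negative vector sum of the visible transverse momenta.
jets = [(250.0, 0.1), (30.0, 2.8)]   # (pT in GeV, azimuthal angle phi)

px = -sum(pt * math.cos(phi) for pt, phi in jets)
py = -sum(pt * math.sin(phi) for pt, phi in jets)
met = math.hypot(px, py)             # |negative vector sum| in GeV
print(round(met, 1))                 # large MET recoiling against the hard jet
```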

Contact: David Salek

ATLAS Run 2: Beyond the Standard Model with multiple leptons

The Standard Model of particle physics (SM) is extremely successful, but will it hold up against a check with data containing multiple leptons? Although multi-lepton production is a very rare process, it is calculated in the SM with high precision. On the detector side, leptons (electrons and muons) are easy to reconstruct, and such a sample contains very little "non-lepton" background. This analysis has the ambitious goal of testing many final states at once, without over-tuning for a specific model. The second step is to test the obtained results against models with a composite structure of leptons or with heavy right-handed neutrinos favoured in seesaw theories. With this project, the student will gain close familiarity with modern experimental techniques (statistical analysis, SM background estimates, etc.), with Monte Carlo generators, and with the standard HEP analysis tools (ROOT, C++, etc.).

Contact: Olya Igonkina


Higgs Physics: Is the observed Higgs-like particle at 125 GeV composite?

Now that a Higgs-like particle has been observed in several final states, it is important to test experimentally whether it is composite; in the Standard Model the Higgs is elementary. The test can be done by looking for a final state in which the H (composite) decays to the H (observed at 125 GeV) plus a photon.

For the analysis the clean four-lepton final state of the observed H will be used: H -> Z Z^* -> 4 l. By combining the four-lepton candidates with a photon, a search for a resonant composite particle - or excitation of the H (observed) ground state - can be performed by looking for a peak in the invariant mass spectrum. The full data set taken from 2010 to 2012, with about 30 observed signal events, will be used for this search. The goal is also to study the discovery reach for Run 2.
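
The key quantity in this search is the invariant mass of the four-lepton-plus-photon system. A minimal sketch with hypothetical kinematics (in the real analysis the four-vectors come from ATLAS reconstruction):

```python
import math

# Hypothetical kinematics (pT in GeV, eta, phi) for four leptons plus a
# photon; these numbers are invented purely for illustration.
def four_vector(pt, eta, phi, m=0.0):
    px, py, pz = pt * math.cos(phi), pt * math.sin(phi), pt * math.sinh(eta)
    return (math.sqrt(m * m + px * px + py * py + pz * pz), px, py, pz)

particles = [four_vector(45, 0.3, 0.5), four_vector(30, -0.2, 2.1),
             four_vector(25, 1.1, -1.0), four_vector(20, 0.4, 3.0),
             four_vector(40, -0.5, -2.5)]   # 4 leptons + 1 photon

# Invariant mass of the combined system: m^2 = E^2 - |p|^2
e, px, py, pz = (sum(c) for c in zip(*particles))
m_inv = math.sqrt(e * e - px * px - py * py - pz * pz)
print(round(m_inv, 1))   # candidate mass to histogram in the search
```

Filling a histogram with this quantity for all candidate events and looking for a peak above the smooth background is exactly the "bump hunt" described above.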

Contact: Peter Kluit


Acoustic detection of ultra-high energy cosmic-ray neutrinos

The study of cosmic neutrinos with energies above 10^17 eV, the so-called ultra-high-energy neutrinos, provides a unique view on the universe and may provide insight into the origin of the most violent sources, such as gamma-ray bursts, supernovae or even dark matter. The energy deposition of cosmic neutrinos in water induces a thermo-acoustic signal, which can be detected using sensitive hydrophones. The expected neutrino flux is, however, extremely low, and the signal that neutrinos induce is small. TNO is presently developing sensitive hydrophone technology based on fiber optics. Optical fibers form a natural way to create a distributed sensing system, and using this technology a large-scale neutrino telescope can be built in the deep sea. TNO is aiming for a prototype hydrophone that will form the building block of a future telescope.

Students can participate in this project in the following ways: (i) modeling of the cosmic-ray-induced acoustic signal in a neutrino telescope (keywords: cosmic rays, Monte Carlo, signal processing, telescope optimization); (ii) testing and optimization of the fiber-optical hydrophone for a large-scale neutrino telescope (keywords: experimental physics, system design).

The work will be (partly) executed in Delft.

Further information on ultra-high-energy neutrinos can be found at http://arxiv.org/abs/1102.3591 and on acoustic detection of neutrinos at http://arxiv.org/abs/1311.7588.

Contact: Ernst-Jan Buis


First KM3NeT data

The neutrino telescope KM3NeT, aiming to detect cosmic neutrinos, is under construction in the Mediterranean Sea. Its very first string with sensitive photodetectors will be deployed in the summer of 2015. Already the very first detection unit will make it possible to reconstruct the abundant muons stemming from interactions of cosmic rays with the atmosphere. The performance and calibration of the detector will be evaluated, also in comparison with simulations. Procedures to identify and reconstruct a background-free sample of muons will be developed to verify the performance and potential of the detector and to pave the path towards neutrino detection. Programming skills are essential; mostly ROOT and C++ will be used.

Contact: Ronald Bruijn

Tau neutrino identification in the KM3NeT neutrino telescope

In order to uniquely identify neutrinos from cosmic sources, a promising strategy is to focus on tau neutrinos. This flavour is (almost) not expected to be produced in interactions of cosmic rays with the atmosphere, so a selection of tau neutrinos can provide an almost background-free sample of cosmic neutrinos. The signature of a tau neutrino interaction in the KM3NeT neutrino telescope is special, as the highly energetic tau lepton created in the neutrino interaction will travel some length (>10 m) in the detector before decaying, so that two showers of particles are created (one at the interaction vertex and one at the decay vertex). The project will use simulations to investigate possible methods for the identification of the tau signature in the KM3NeT neutrino telescope, which is now under construction in the Mediterranean Sea. Programming skills are essential for this project; mainly C++ and ROOT will be used.
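
The >10 m figure follows directly from relativistic time dilation: with the PDG tau mass and lifetime, the mean decay length grows linearly with energy and reaches the detector scale in the PeV range.

```python
# Back-of-the-envelope decay length using PDG values: tau mass ~1.777 GeV
# and c*tau ~ 87.0 micrometres.
M_TAU_GEV = 1.777
CTAU_M = 87.03e-6

def decay_length_m(energy_gev):
    """Mean lab-frame decay length of a relativistic tau lepton."""
    return (energy_gev / M_TAU_GEV) * CTAU_M   # gamma * c * tau

print(round(decay_length_m(1.0e6), 1))   # ~49.0 m for a 1 PeV tau
```

So only taus of a few hundred TeV and above produce the resolvable double-shower ("double bang") topology the project looks for.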

Contact: Dorothea Samtleben

Neutrino mass hierarchy with KM3NeT/ORCA

Neutrinos exist in three flavours and are known to oscillate between flavours, whereby the detected flavour depends on the (partly) known oscillation parameters, the mass differences, the neutrino energy and the travel length. The KM3NeT collaboration is planning a dedicated set of detection units in order to pursue an oscillation measurement of unprecedented precision using neutrinos from atmospheric interactions, thereby enabling the measurement of the as-yet-unknown neutrino mass hierarchy. The measurement of this subtle effect requires unprecedented precision in the reconstruction and identification of the flavour, energy and direction. Various projects are available in the reconstruction and in the evaluation of the mass-hierarchy sensitivity using dedicated simulations. Programming skills are essential; mainly C++ and ROOT will be used.
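
The effect being measured can be previewed with the standard two-flavour oscillation formula, P = 1 - sin^2(2θ) sin^2(1.27 Δm² L/E); the parameter values below are illustrative, close to the atmospheric ones.

```python
import math

# Two-flavour survival probability with L in km, E in GeV, dm2 in eV^2.
# Parameter values are illustrative, close to the atmospheric ones
# (dm2 ~ 2.5e-3 eV^2, near-maximal mixing); the full ORCA analysis uses
# three-flavour oscillations including matter effects.
def p_mu_survival(L_km, E_GeV, dm2=2.5e-3, sin2_2theta=1.0):
    return 1.0 - sin2_2theta * math.sin(1.267 * dm2 * L_km / E_GeV) ** 2

# Up-going atmospheric neutrino crossing the Earth diameter (~12742 km):
print(round(p_mu_survival(12742.0, 25.0), 2))   # near-maximal disappearance
```

The mass hierarchy enters only through small matter-induced corrections to such probabilities, which is why the flavour, energy and direction resolution requirements quoted above are so demanding.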

Contact: Aart Heijboer


Bs->mumu and Bd->mumu normalization and B mesons hadronization probabilities

The measurement of the Bs -> mu mu and Bd -> mu mu decays is one of the flagships of the LHCb experiment; the latest result, in combination with CMS, has recently been published in Nature. The aim of this project is to study the yields of other decays with a J/psi in the final state, like B+ -> J/psi K+ and Bs -> J/psi phi, that can be detected by triggering on the muon decay products of the J/psi. These yields are a crucial input to the Bs -> mu mu and Bd -> mu mu branching fractions, as they provide a relative normalization.

Moreover, in order to use Bd decays to normalize the Bs -> mu mu yields, we need to measure the relative probabilities for a b quark to hadronize into a Bs (f_s) or a Bd (f_d) meson, which can also be obtained from B+ -> J/psi K+, Bs -> J/psi phi and Bd -> J/psi K* decays. The ratio f_s/f_d is not constant, and it is therefore important to measure it as a function of both the centre-of-mass energy of the pp collision and the B meson kinematics. The combination of previous data at 7 and 8 TeV with data at 13 TeV from the LHC 2015 run will provide important new insight and is a result worth a journal publication in its own right.

For this project some programming skills are needed (Python or C++). Some initial knowledge of the ROOT analysis framework is also useful. The student will perform their research in a group consisting of two seniors and two Ph.D. students engaged in the study of very rare decays of B mesons to di-muon final states and the search for lepton-flavour-violating final states (e.g. electron-muon). Relevant information: [1] R. Aaij et al. [LHCb Collaboration], "Measurement of the fragmentation fraction ratio f_s/f_d and its dependence on B meson kinematics", JHEP 04 (2013) 001 [arXiv:1301.5286].
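
Schematically, the normalization works as in the sketch below; every input number is a hypothetical placeholder, not an LHCb result.

```python
# Schematic branching-fraction normalization (all numbers are hypothetical
# placeholders chosen only to land in a plausible ballpark):
#   BF_sig = BF_norm * (f_u / f_s) * (eff_norm / eff_sig) * (N_sig / N_norm)
bf_norm = 6.0e-5        # ~BF(B+ -> J/psi(->mu mu) K+), order of magnitude
fu_over_fs = 3.9        # hadronization-fraction ratio (placeholder)
eff_ratio = 0.8         # eff_norm / eff_sig (placeholder)
n_sig, n_norm = 15.0, 1.2e6   # signal and normalization yields (placeholders)

bf_sig = bf_norm * fu_over_fs * eff_ratio * (n_sig / n_norm)
print(f"BF(Bs -> mu mu) ~ {bf_sig:.1e}")   # order 1e-9, the rare-decay regime
```

The sketch makes clear why the normalization yields and the hadronization fractions enter the final branching fraction multiplicatively: any uncertainty on f_s/f_d propagates one-to-one into BF(Bs -> mu mu).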

Contact: [mailto:pellegrino@nikhef.nl Antonio Pellegrino] and Maarten van Veghel (PhD)

B meson Production asymmetries

At the LHC, B0 mesons and anti-B0 mesons are not produced in equal quantities (about 0.5% more B0 mesons than anti-B0 mesons). This production asymmetry can be measured with semileptonic decays of the type B0 -> D(*)- mu+ nu (and its charge-conjugate decay). The goal is to measure the asymmetry as a function of the transverse momentum and (pseudo)rapidity of the B0 (or anti-B0). This requires unfolding the observed kinematic distributions.
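
In its simplest form the measured quantity is a raw yield asymmetry with its statistical uncertainty; the yields below are toy numbers.

```python
import math

# Raw asymmetry A = (N - Nbar) / (N + Nbar) and its statistical
# uncertainty (binomial approximation); yields are made up for illustration.
n_b, n_bbar = 502_500, 497_500     # hypothetical tagged B0 / anti-B0 yields

a_prod = (n_b - n_bbar) / (n_b + n_bbar)
sigma_a = math.sqrt((1.0 - a_prod ** 2) / (n_b + n_bbar))
print(round(a_prod, 4), round(sigma_a, 4))   # 0.005 0.001
```

A 0.5% asymmetry thus needs of order a million tagged decays per kinematic bin to reach the per-mille precision quoted, which motivates the unfolding of the kinematic distributions rather than a single inclusive number.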

Contact: Jeroen van Tilburg and Jacco de Vries


Quantum decoherence

When two particles are created in an antisymmetric wave function, the two particles are entangled, even though they may be separated by large distances. If one of the particles is forced into one state (projection), this determines the state of the other instantaneously. Several theoretical models, motivated by quantum-gravity effects, predict the existence of a decoherence parameter. Using decays of phi -> K_S K_L, it is possible to measure this decoherence parameter by counting the number of phi decays where both neutral kaons are observed as K_S -> pi+ pi-. If this parameter is measured to be non-zero, it would mean that our current understanding of quantum mechanics is not complete.
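
A heavily simplified toy model of the counting idea (not the full time-dependent analysis): standard quantum mechanics forbids same-time (pi+pi-, pi+pi-) double decays of the antisymmetric pair, and a decoherence parameter zeta lets such events appear in proportion to zeta. The linear scaling and the factor 1/2 below are modelling assumptions of this sketch only.

```python
# Toy counting model (a crude sketch, not the real analysis): zeta = 0
# reproduces the QM prediction of zero (pi+pi-, pi+pi-) double decays;
# zeta = 1 is the fully decohered limit of this toy parametrization.
BR_KS_PIPI = 0.692   # BR(K_S -> pi+ pi-)

def n_double_pipi(n_phi, zeta):
    """Expected (pi+pi-, pi+pi-) events for decoherence parameter zeta."""
    return n_phi * zeta * 0.5 * BR_KS_PIPI ** 2

print(n_double_pipi(1_000_000, 0.0))   # 0.0: the standard QM prediction
```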

Contact: [mailto:jtilburg@nikhef.nl Jeroen van Tilburg]


A search for heavy neutrinos in the decay of W at LHCb

Neutrinos are arguably the most mysterious of all known fundamental fermions, as they are both much lighter than all others and only weakly interacting. It is thought that the tiny mass of neutrinos can be explained by their mixing with so-far unknown, much heavier, neutrino-like particles. In this project we look for these new neutrinos in the decay of the SM W boson using data from the LHCb experiment at CERN. The W boson is assumed to decay to a heavy neutrino and a muon; the heavy neutrino subsequently decays to a muon and a pair of quarks. Both like-sign and opposite-sign muon pairs will be studied. The result of the analysis will be either a limit on the production of the new neutrinos or the discovery of something entirely new.

Contact: Wouter Hulsbergen and Elena Dall'Occo


Measurement of BR(B0->pi-Ds+) and BR(Bs->Ds*-pi+)/BR(Bs->Ds-pi+)

This project aims to measure the branching fraction of the decay B0->pi-Ds+. This decay is quite rare, because it occurs through the transition of a b quark to a u quark. It has been measured at the B factories only with modest precision (~12%). This decay is interesting because

  1. It is sensitive to the CKM element Vub, whose determination is heavily debated.
  2. It can be used to determine the ratio r_pi=B0->pi-D+/B0->D-pi+ which in turn is needed for CP violation measurements.
  3. It can quantify non-factorisable QCD effects in certain B-decays.

The experimental challenge is to understand the background from e.g. Bs->Ds*pi decays. The aim is also to determine the branching fraction of Bs->Ds*pi relative to Bs->Dspi decays. This is useful because

  • It helps in the measurement of B0->pi-Ds+
  • It might quantify the magnitude of the ratio of form factors F(Bs->Ds*)/F(Bs->Ds)

The aim is that this project results in a journal publication on behalf of the LHCb collaboration. For this project computer skills are needed: the ROOT program and C++ and/or Python macros are used. This project is closely related to important analyses in the group:

  • Measurements of fs/fd with hadronic Bs->DsPi decays,
  • Time dependent CP violation analysis of Bs->DsK decays.

Weekly video meetings with CERN coordinate the efforts within the LHCb collaboration.

Contact: Niels Tuning and Mick Mulder

Compton camera

In the Nikhef R&D group we develop instrumentation for particle physics, but we also investigate how particle-physics detectors can be used for other purposes. A successful development is the Medipix chip, which can be used in X-ray imaging. For use in large-scale medical applications, however, Compton scattering limits the energy-resolving possibilities. You will investigate whether it is in principle possible to design an X-ray application that detects both the Compton-scattered electron and the absorbed photon. Your ideas can be tested in practice in the lab, where an X-ray scan can be performed.
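
The physics limitation mentioned above is set by Compton kinematics: the scattered-photon energy follows E' = E / (1 + (E/m_e c^2)(1 - cos θ)). A quick evaluation at a typical medical X-ray energy:

```python
import math

# Compton kinematics: E' = E / (1 + (E / m_e c^2) * (1 - cos(theta)))
ME_KEV = 511.0   # electron rest energy in keV

def scattered_energy_kev(e_kev, theta):
    return e_kev / (1.0 + (e_kev / ME_KEV) * (1.0 - math.cos(theta)))

e0 = 60.0                                   # typical medical X-ray energy, keV
e_back = scattered_energy_kev(e0, math.pi)  # backscatter: maximal energy loss
print(round(e_back, 1), round(e0 - e_back, 1))   # 48.6 11.4 (photon, electron)
```

Even at full backscatter the electron receives only a modest fraction of the photon energy at these energies, which is why reconstructing a Compton event requires measuring both the scattered electron and the absorbed photon.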

Contact: Els Koffeman

Proton Radiography for Proton Beam Therapy

The construction of a Proton Beam Therapy centre in Groningen has started. The Nikhef R&D group is working with KVI-CART and the UMCG in Groningen to improve the quality of the data on which the treatment plan is based. The idea of Proton Beam Therapy is to stop the protons in the tumour, where they deposit the major part of their energy, thereby destroying the tumour. Currently, only X-ray Computed Tomography data is used to determine the area that needs to be irradiated with protons. However, this data is not ideal for calculating the proton-beam stopping-power distribution, as it is based on X-ray attenuation, a completely different physical process from the stopping of protons. Therefore, we want to implement Proton Beam Computed Tomography by shooting fast protons through the patient. To improve the information about where the protons will stop in the patient, we use a detector system that tracks the individual protons both before and after the patient and at the same time determines how much energy is dissipated in the patient.

In this project the topics that are under study are the following:

  • Analysis of the data taken in May 2015 at 150 MeV proton energy: reconstructing proton tracks and combining them with the deposited energy to identify the different materials in the irradiated phantom.
  • Improving the current set-up based on the lessons learned in the analysis.
  • Performing measurements at different initial proton energies with the same phantom to optimise the phantom reconstruction, as the information that can be extracted is energy dependent.
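
The relevance of the beam energy can be checked with the empirical Bragg-Kleeman range-energy rule for water, R ≈ α E^p, using the commonly quoted fit values α ≈ 0.0022 cm/MeV^p and p ≈ 1.77 (an approximation, not the treatment-planning model):

```python
# Bragg-Kleeman rule for protons in water, R = alpha * E^p, with commonly
# quoted fit values alpha ~ 0.0022 cm/MeV^p and p ~ 1.77 (approximate).
ALPHA_CM, P_EXP = 0.0022, 1.77

def range_cm(e_mev):
    """Approximate range in water of a proton with kinetic energy e_mev."""
    return ALPHA_CM * e_mev ** P_EXP

print(round(range_cm(150.0), 1))   # ~15.6 cm at the 150 MeV used in May 2015
```

At 150 MeV the protons penetrate about 16 cm of water-equivalent material, so for imaging (rather than treatment) the energy must be high enough that the protons exit the patient into the rear tracker.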

Contact: Jan Visser

Medical X-ray Imaging

With the advent of true multi-threshold X-ray detectors, low-dose spectral imaging, including spectral CT, is a reality around the corner. The Medipix3RX chip, from the Medipix Collaboration (CERN), features up to 8 programmable thresholds, which can select energy bins without a threshold scan. A number of projects could be derived from the R&D activities with the Medipix3RX within the Nikhef R&D group on X-ray imaging for medical applications:

  • Medipix3RX characterization in all its operation modes and gains.
  • Spectral CT and scarce-sampling 3D reconstruction.
  • Charge sharing: the charge-summing capabilities of the chip can be exploited to further understand the problem of charge sharing in pixelated detectors. A combination of the characterization of the charge-summing mode with the use of both planar and 3D sensors, in the light of MC simulation, could reveal valuable information about charge sharing.
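
The threshold-to-energy-bin logic can be sketched in a few lines: a counter at threshold T_i records every photon above T_i, so the spectrum in bin [T_i, T_{i+1}) is the difference of adjacent counters. The threshold settings and counts below are toy numbers for a single pixel.

```python
# Toy single-pixel readout: each counter records all photons above its
# threshold; differencing adjacent counters yields per-bin spectra without
# a threshold scan (numbers are made up for illustration).
thresholds_kev = [10, 20, 30, 40, 50, 60, 70, 80]   # 8 programmable thresholds
counts_above = [9000, 7200, 5100, 3300, 1900, 900, 300, 50]

bin_counts = [hi - lo for hi, lo in zip(counts_above, counts_above[1:])]
print(bin_counts)   # [1800, 2100, 1800, 1400, 1000, 600, 250]
```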

Contact: John Idarraga