Last year's MSc Projects

From Education Wiki
Revision as of 09:14, 30 March 2016



Cool with Carbon Foam

The sensors and readout chips of tracking detectors produce heat, which must be removed by a cooling system. The amount of material used for cooling must be minimised to avoid spoiling the track measurement through multiple scattering, bremsstrahlung, and the like. Recently, highly porous carbon foams with low density and high thermal conductivity have become available. In this project we investigate and optimise the performance of gas-cooled, low-radiation-length carbon foams for cooling.

So far we have demonstrated the very high heat-transfer coefficient from readout chip to gas. In a second phase we will build a more realistic detector prototype for study. We can also further optimise the design by machining the foam to direct the gas where it is needed most. With help from the Nikhef engineering department we can study the implications for the off-detector part of the system.
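
As a back-of-the-envelope illustration of the heat-transfer result, the steady-state chip-to-gas temperature difference follows from Q = h * A * dT. A minimal sketch; all numbers are illustrative placeholders, not measured values from the project:

```python
# Toy estimate of the chip-to-gas temperature difference for a given
# heat-transfer coefficient h. All numbers are illustrative placeholders,
# not measured values from the carbon-foam setup.

def temperature_rise(power_w, h_w_per_m2k, area_m2):
    """Steady-state temperature difference from Q = h * A * dT."""
    return power_w / (h_w_per_m2k * area_m2)

# Example: a 2 W readout chip of 4 cm^2 cooled with h = 500 W/(m^2 K)
dT = temperature_rise(2.0, 500.0, 4e-4)  # 10 K
```

A high heat-transfer coefficient directly reduces this temperature difference, which is why measuring h was the first milestone.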

Contact: Nigel Hessey

Electrode optimisation for Gaseous Pixel Detectors

The detector R&D group develops highly accurate gaseous tracking detectors based on pixelised readout chips. In this computer-simulation-based project we will use meshing and finite-element analysis tools to calculate the electric field of a given detector design. We can then use the Garfield program to simulate the detector performance and optimise the design of the electrodes, improving the drift field, avalanche field, and signal pickup of future detectors.

In the first year of the project we developed the tools needed and are now optimising the signal-electrode design. With the tools in place, in the coming year we can optimise many other features of the design.
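
As a toy stand-in for the meshing and finite-element step (the real project uses dedicated FEA tools and Garfield), the potential between two electrodes can be approximated by relaxing Laplace's equation on a small grid; the geometry and potentials here are arbitrary:

```python
# Jacobi relaxation of Laplace's equation on an n x n grid: the top row is an
# electrode held at v_top, all other boundary points are held at v_bottom.
# This is a toy illustration, not the project's actual FEA/Garfield chain.

def relax_potential(n=21, v_top=1.0, v_bottom=0.0, iterations=2000):
    grid = [[v_bottom] * n for _ in range(n)]
    grid[0] = [v_top] * n  # top electrode
    for _ in range(iterations):
        new = [row[:] for row in grid]
        for i in range(1, n - 1):
            for j in range(1, n - 1):
                # each interior point relaxes to the average of its neighbours
                new[i][j] = 0.25 * (grid[i - 1][j] + grid[i + 1][j]
                                    + grid[i][j - 1] + grid[i][j + 1])
        grid = new
    return grid

phi = relax_potential()  # phi[i][j] falls off away from the top electrode
```

From such a potential map the drift field follows by numerical differentiation, which is the quantity one would then feed into a Garfield-style transport simulation.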

Contact: Nigel Hessey

The Radon Terminator

For Dark Matter experiments, achieving low radioactive backgrounds determines the success of an experiment. Within the XENON collaboration there is a lot of expertise in controlling these radioactive backgrounds, but unfortunately some of them are extremely hard to control. One of these is radon: radon is an unstable noble gas with a lifetime of several days, which can dissolve into the xenon we use in our experiment. Its decays happen in the middle of the active volume of our detector and may form an irreducible background to Dark Matter searches. Several ideas exist to filter radon from xenon, and at Nikhef we are developing a new technique based on electrostatic separation. In our group we need a master student to commission and validate a radon separator we have built at Nikhef. The student will need to build or buy the diagnostics equipment and then show whether our proposed technique works. You will be the 'owner' of your own experimental setup. This is a high-risk project: there is no guarantee yet that the technique works, but if it does, the pay-off is high!
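
The radon problem is governed by a half-life of a few days (about 3.8 days for Rn-222), so radon dissolved in the xenon decays inside the active volume rather than escaping. A minimal exponential-decay sketch:

```python
import math

# Fraction of an initial radon population surviving after t days,
# using the Rn-222 half-life of roughly 3.8 days.

RN222_HALF_LIFE_DAYS = 3.8

def radon_fraction_remaining(t_days, half_life_days=RN222_HALF_LIFE_DAYS):
    return math.exp(-math.log(2.0) * t_days / half_life_days)
```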

Contact: Auke Colijn

The Modulation experiment

There exist a few measurements that suggest an annual modulation in the activity of radioactive sources. With a few groups from the XENON collaboration we have developed four sets of table-top experiments to investigate this effect on a few well-known radioactive sources. The experiments are under construction at Purdue University (USA), on a mountain top in Switzerland, on a beach in Rio de Janeiro, and at Nikhef in Amsterdam. We urgently need a master student to (1) do the final commissioning of the experiment, (2) collect the first big data set, and (3) analyse the first data. We are looking for an all-round physicist with an interest in both lab work and data analysis. The student will collaborate directly with the other groups in this small collaboration (around 10 people), and the goal is to have the first publication ready by the end of the project.
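
The sought-after effect can be modelled as a small sinusoidal modulation on top of a constant activity; the amplitude and phase below are purely illustrative, not measured values:

```python
import math

# Toy rate model for an annually modulated activity:
# rate(t) = r0 * (1 + A * cos(2*pi*(t - t0)/365.25)).
# r0, A (relative amplitude) and t0 (peak day) are illustrative placeholders.

def modulated_rate(t_days, r0=100.0, amplitude=0.003, phase_days=152.0):
    return r0 * (1.0 + amplitude
                 * math.cos(2.0 * math.pi * (t_days - phase_days) / 365.25))
```

Fitting such a model to the collected count rates, against the null hypothesis of a constant rate, is the core of step (3).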

Contact: Auke Colijn

Testing general relativity with gravitational waves

The Advanced LIGO and Advanced Virgo detectors are gearing up to make the first direct detections of gravitational waves over the next few years, with a first observing run scheduled for September 2015. Among the most promising sources are mergers of binary systems consisting of neutron stars and/or black holes. The ability to observe the emitted gravitational wave signals will, for the first time, give access to the genuinely strong-field dynamics of general relativity (GR), thereby putting the classical theory to the ultimate test. The Nikhef group has developed a data analysis method to look for generic deviations from GR using signals from merging binary neutron stars. We are now extending this framework to binary black holes, which have much richer dynamics and will allow for more penetrating tests of GR, but which also pose significant new challenges. The student will study the end-to-end response of the analysis pipeline to signals predicted by GR as well as a range of alternative theories of gravity, by adding simulated waveforms to real detector noise. Basic programming skills in C, Python, or related languages are a prerequisite.
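
The injection step described above can be sketched as adding a simulated waveform to noise; here a toy chirp stands in for a GR waveform, and Gaussian noise stands in for real detector data:

```python
import math
import random

# A sine with linearly increasing frequency as a crude stand-in for an
# inspiral waveform, injected into simulated Gaussian noise. The real
# analysis uses GR (and alternative-theory) waveforms and real detector data.

def toy_chirp(n, f0=0.01, k=1e-5):
    return [math.sin(2.0 * math.pi * (f0 * i + 0.5 * k * i * i))
            for i in range(n)]

def inject(signal, noise_sigma=1.0, seed=42):
    rng = random.Random(seed)  # fixed seed for reproducibility
    return [s + rng.gauss(0.0, noise_sigma) for s in signal]

data = inject(toy_chirp(1000))
```

The analysis pipeline would then be run on `data` to check whether the injected signal is recovered and whether deviations from GR are flagged.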

Contact: Chris Van Den Broeck

New physics from Higgs interactions with polarised W bosons

Higgs interactions with the electroweak gauge bosons W+ and W- are a crucial, precisely defined part of the Standard Model (SM). Measuring separately the Higgs coupling to longitudinally and transversely polarised bosons will determine, for the first time, whether the Higgs and gauge bosons are elementary, as predicted in the SM, or composite particles, indicating the presence of physics beyond the Standard Model (BSM). The student will be involved in all steps of the analysis: Monte Carlo studies, the analysis of the ATLAS data, and background rejection. The basic tools will include programming in C++ and Python and using ROOT.

Contact: Magdalena Slawinska

ATLAS inner tracker upgrade

Research description: One of the key sub-systems of the ATLAS experiment at the Large Hadron Collider (LHC) is the Inner Detector (ID), designed to provide excellent charged-particle momentum and vertex resolution measurements.

At Phase-2 of the LHC the operating luminosity of the collider will be increased significantly. This implies an upgrade of all ATLAS subsystems. In particular, the ID will be fully replaced with an all-silicon tracker with higher granularity and radiation hardness. The R&D process for the new ATLAS ID is now ongoing: different geometrical layouts are simulated and their performance is studied under different operating conditions in search of the optimal detector architecture. The performance of the new Si sensors and modules is also under investigation with dedicated laboratory tests. The focus of the project could be on the simulation of the High-Luminosity LHC version of the ATLAS Inner Detector; the student will learn how a high-energy physics experiment is designed and optimized. Alternatively, if possible at that moment, the student could work at the test bench for new ATLAS Si-strip detectors in the Nikhef silicon laboratory and participate in the quality-assurance procedure for the new ATLAS Si detectors.

Contact: Peter Vankov

Searching for Dark Matter in the mono-jet channel in ATLAS

Searches for Dark Matter are one of the key points of the LHC physics programme in Run 2. The mono-jet analysis, where an energetic jet recoils against missing transverse energy, is the most sensitive general search channel for Dark Matter candidates in ATLAS. In this project, the student will take part in the data analysis, help with estimating Standard Model backgrounds, and prepare an interpretation of the results in terms of simplified models such as, for example, Higgs-portal Dark Matter. Basic knowledge of C++ and Python is required.

Contact: David Salek

ATLAS Run 2: Beyond the Standard Model with multiple leptons

The Standard Model of particle physics (SM) is extremely successful, but would it hold up against a check with data containing multiple leptons? Although the production of multiple leptons is a very rare process, it is calculated in the SM with high precision. On the detector side, leptons (electrons and muons) are easy to reconstruct, and such a sample contains very little "non-lepton" background. This analysis has the very ambitious goal of testing many final states at once, without over-tuning for a specific model. The second step is then to test the obtained results against models of composite structure of leptons or the presence of heavy right-handed neutrinos favoured in seesaw theories. With this project, the student would gain close familiarity with modern experimental techniques (statistical analysis, SM background estimates, etc.), with Monte Carlo generators, and with standard HEP analysis tools (ROOT, C++, etc.).

Contact: Olya Igonkina

Higgs Physics: Is the observed Higgs-like particle at 125 GeV composite?

Now that a Higgs-like particle has been observed in several final states it is important to test experimentally whether it is composite. In the Standard Model the Higgs is elementary. The test can be done by looking in the final state where the H (composite) decays to the H (observed at 125 GeV) + a photon.

For the analysis the clean four-lepton final state of the H (observed) will be used: H -> Z Z^* -> 4 l. By combining the four-lepton candidates with a photon, a search for a resonant composite particle, or excitation of the H (observed) ground state, can be performed by looking for a peak in the invariant mass spectrum. The full data set taken from 2010 to 2012, with about 30 observed signal events, will be used for this search. The goal is also to study the discovery reach for Run 2.
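
The peak search uses the invariant mass of the combined (four-lepton + photon) system, m^2 = (sum E)^2 - |sum p|^2; a minimal sketch with particles given as (E, px, py, pz) four-vectors:

```python
import math

# Invariant mass of a set of four-vectors (E, px, py, pz), in consistent
# units (e.g. GeV). The search looks for a peak in this quantity for
# (4 leptons + photon) candidates.

def invariant_mass(particles):
    e = sum(p[0] for p in particles)
    px = sum(p[1] for p in particles)
    py = sum(p[2] for p in particles)
    pz = sum(p[3] for p in particles)
    return math.sqrt(max(e * e - px * px - py * py - pz * pz, 0.0))

# Two back-to-back massless particles of 62.5 GeV each combine to 125 GeV
m = invariant_mass([(62.5, 62.5, 0.0, 0.0), (62.5, -62.5, 0.0, 0.0)])
```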

Contact: Peter Kluit

Acoustic detection of ultra-high energy cosmic-ray neutrinos

Experiment

The study of cosmic neutrinos with energies above 10^17 eV, the so-called ultra-high energy neutrinos, provides a unique view on the universe and may provide insight into the origin of the most violent sources, such as gamma-ray bursts, supernovae, or even dark matter. The energy deposition of cosmic neutrinos in water induces a thermo-acoustic signal, which can be detected using sensitive hydrophones. The expected neutrino flux is, however, extremely low, and the signal that neutrinos induce is small. TNO is presently developing sensitive hydrophone technology based on fiber optics. Optical fibers form a natural way to create a distributed sensing system. Using this technology a large-scale neutrino telescope can be built in the deep sea. TNO is aiming for a prototype hydrophone that will form the building block of a future telescope.

Student projects

Students have the possibility to participate in this project in the following ways: (i) modeling of the cosmic-ray-induced acoustic signal in a neutrino telescope (keywords: cosmic rays, Monte Carlo, signal processing, telescope optimization); (ii) testing and optimization of a fiber-optical hydrophone for a large-scale neutrino telescope (keywords: experimental physics, system design).

The work will be (partly) executed in Delft.


Contact: Ernst-Jan Buis

First KM3NeT data

The neutrino telescope KM3NeT is under construction in the Mediterranean Sea, aiming to detect cosmic neutrinos. Its very first string with sensitive photodetectors will be deployed in the summer of 2015. Even the very first detection unit will make it possible to reconstruct the abundant muons stemming from interactions of cosmic rays with the atmosphere. The performance and calibration of the detector will be evaluated, also in comparison with simulations. Procedures to identify and reconstruct a background-free sample of muons will be developed to verify the performance and potential of the detector and to pave the path towards neutrino detection. Programming skills are essential; mostly ROOT and C++ will be used.

Contact: Ronald Bruijn

Tau neutrino identification in the KM3NeT neutrino telescope

In order to uniquely identify neutrinos from cosmic sources, a promising strategy is to focus on tau neutrinos. This flavour is (almost) not expected to be produced in interactions of cosmic rays with the atmosphere, so a selection of tau neutrinos can provide an almost background-free sample of cosmic neutrinos. The signature of a tau neutrino interaction in the KM3NeT neutrino telescope is special, as the highly energetic tau lepton created in the neutrino interaction travels some distance (>10 m) in the detector before decaying, so that two showers of particles are created (at the interaction and decay vertices). The project will use simulations to investigate possible methods for the identification of this tau signature in the KM3NeT neutrino telescope, which is now under construction in the Mediterranean Sea. Programming skills are essential for this project; mainly C++ and ROOT will be used.

Contact: Dorothea Samtleben

Neutrino mass hierarchy with KM3NeT/ORCA

Neutrinos exist in three flavours and are known to oscillate between them; the detected flavour depends on the (partly) known oscillation parameters, the mass differences, the neutrino energy, and the travel length. The KM3NeT neutrino telescope is planning a dedicated set of detection units to pursue an oscillation measurement of unprecedented precision using neutrinos from atmospheric interactions, thereby enabling the measurement of the so far unknown neutrino mass hierarchy. The measurement of this subtle effect requires unprecedented precision in the reconstruction and identification of the flavour, energy, and direction. Various projects are available on the reconstruction and on the evaluation of the mass-hierarchy sensitivity using dedicated simulations. Programming skills are essential; mainly C++ and ROOT will be used.
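
The L/E dependence that such an oscillation measurement exploits is already visible in the two-flavour approximation, P = sin^2(2 theta) * sin^2(1.27 * dm2[eV^2] * L[km] / E[GeV]); a minimal sketch with illustrative parameter values:

```python
import math

# Two-flavour oscillation probability,
# P = sin^2(2*theta) * sin^2(1.27 * dm2[eV^2] * L[km] / E[GeV]),
# showing the L/E dependence exploited with atmospheric neutrinos.
# Parameter values below are illustrative.

def osc_probability(theta, dm2_ev2, l_km, e_gev):
    return (math.sin(2.0 * theta) ** 2
            * math.sin(1.27 * dm2_ev2 * l_km / e_gev) ** 2)

# Example: maximal mixing, Earth-crossing baseline
p = osc_probability(math.pi / 4.0, 2.5e-3, 12700.0, 25.0)
```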

Contact: Aart Heijboer

Bs->mumu and Bd->mumu normalization and B meson hadronization probabilities

The measurement of the Bs->mumu and Bd->mumu decays is one of the flagships of the LHCb experiment; the latest result, in combination with CMS, has recently been published in Nature. The aim of this project is to study the yields of other decays with a J/Psi in the final state, like B+->J/Psi K+ and Bs->J/Psi Phi, which can be detected by triggering on the muon decay products of the J/Psi. These yields are a crucial input to the Bs->mumu and Bd->mumu branching fractions, as they provide a relative normalization.

Moreover, in order to use Bd decays to normalize the Bs->mumu yields, we need to measure the relative probabilities for a b quark to hadronize into a Bs (f_s) or a Bd (f_d) meson, which can also be obtained from B+->J/Psi K+, Bs->J/Psi Phi, and Bd->J/Psi K* decays. The ratio f_s/f_d is not a constant, and it is therefore important to measure it as a function of both the centre-of-mass energy of the pp collision and the B meson kinematics. The combination of previous data at 7 and 8 TeV and data at 13 TeV from the LHC 2015 run will provide important new insight and is a result worth a journal publication in its own right.

For this project some programming skills are needed (Python or C++). Some initial knowledge of the ROOT analysis framework is also useful. The student will perform their research in a group consisting of two seniors and two PhD students engaged in the study of very rare decays of B mesons to di-muon final states and the search for lepton-flavour-violating final states (e.g. electron-muon).

Relevant information: [1] R. Aaij et al. [LHCb Collaboration], "Measurement of the fragmentation fraction ratio f_s/f_d and its dependence on B meson kinematics", JHEP 04 (2013) 001 [arXiv:1301.5286 [hep-ph]].
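
The role of f_s/f_d in the normalization can be written in one line: the signal branching fraction follows from the normalization-channel branching fraction and yield, the efficiency ratio, and the fragmentation-fraction ratio. A sketch with numbers invented for illustration (not LHCb results):

```python
# One-line version of the normalization logic:
# BR(sig) = BR(norm) * (eff_norm / eff_sig) * (f_d / f_s) * N_sig / N_norm.
# All numbers below are made up for illustration, not LHCb results.

def br_signal(n_sig, n_norm, br_norm, eff_norm_over_sig, fd_over_fs):
    return br_norm * eff_norm_over_sig * fd_over_fs * n_sig / n_norm

br = br_signal(n_sig=10, n_norm=1_000_000, br_norm=6.0e-5,
               eff_norm_over_sig=2.0, fd_over_fs=3.8)
```

Every factor in this expression is something the project measures or takes as input, which is why the normalization-channel yields and f_s/f_d matter so much.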

Contact: Antonio Pellegrino and Maarten van Veghel (PhD)

B meson Production asymmetries

At the LHC, B0 mesons and anti-B0 mesons are not produced in equal quantities (there are about 0.5% more B0 mesons than anti-B0 mesons). This production asymmetry can be measured with semileptonic decays of the type B0 -> D-(*) mu+ nu (and its charge-conjugate decay). The goal of this measurement is to determine the asymmetry as a function of the transverse momentum and (pseudo)rapidity of the B0 (or anti-B0). This requires unfolding of the observed kinematic distributions.
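
The raw asymmetry itself is a simple counting quantity, A = (N_B0 - N_anti-B0) / (N_B0 + N_anti-B0); the measured value must then still be unfolded for detection effects, as described above. A minimal sketch:

```python
# Raw production asymmetry from signal yields:
# A = (N_B0 - N_anti-B0) / (N_B0 + N_anti-B0).
# The measured value must still be corrected (unfolded) for detection effects.

def production_asymmetry(n_b, n_bbar):
    return (n_b - n_bbar) / (n_b + n_bbar)

# A 0.5% relative excess of B0 over anti-B0 gives A = 0.005
a = production_asymmetry(100500, 99500)
```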

Contact: Jeroen van Tilburg and Jacco de Vries

Quantum decoherence

When two particles are created in an anti-symmetric wave function, the two particles are entangled, even though they may be separated by large distances. If one of the particles is forced into one state (projection), the state of the other is determined instantaneously. Several theoretical models, motivated by quantum gravity effects, predict the existence of a decoherence parameter. Using decays of phi->K_S K_L, it is possible to measure this decoherence parameter by counting the number of phi decays where both neutral kaons are measured as K_S-> pi+ pi-. If this parameter is measured to be non-zero, it would mean that our current understanding of quantum mechanics is not complete.

Contact: Jeroen van Tilburg

A search for heavy neutrinos in the decay of W at LHCb

Neutrinos are arguably the most mysterious of all known fundamental fermions, as they are both much lighter than all others and only weakly interacting. It is thought that the tiny mass of neutrinos can be explained by their mixing with so-far unknown, much heavier, neutrino-like particles. In this project we look for these new neutrinos in the decay of the SM W boson using data from the LHCb experiment at CERN. The W boson is assumed to decay to a heavy neutrino and a muon; the heavy neutrino subsequently decays to a muon and a pair of quarks. Both like-sign and opposite-sign muon pairs will be studied. The result of the analysis will either be a limit on the production of the new neutrinos or the discovery of something entirely new.

Contact: Wouter Hulsbergen and Elena Dall'Occo

Measurement of BR(B0->pi-Ds+) and BR(Bs->Ds*-pi+)/BR(Bs->Ds-pi+)

This project aims to measure the branching fraction of the decay B0->pi-Ds+. This decay is quite rare, because it occurs through the transition of a b quark to a u quark. It has been measured at the B-factories only with modest precision (~12%). This decay is interesting, because

  1. It is sensitive to the CKM element Vub, whose determination is heavily debated.
  2. It can be used to determine the ratio r_pi = B0->pi-D+ / B0->D-pi+, which in turn is needed for CP violation measurements.
  3. It can quantify non-factorisable QCD effects in certain B-decays.

The experimental challenge is to understand the background from e.g. Bs->Ds*pi decays. The aim is also to determine the branching fraction of Bs->Ds*pi relative to Bs->Dspi decays. This is useful, because

  • It helps in the measurement of B0->pi-Ds+
  • It might quantify the magnitude of the ratio of form factors F(Bs->Ds*)/F(Bs->Ds)

The aim is that this project results in a journal publication on behalf of the LHCb collaboration. For this project computer skills are needed: the ROOT program and C++ and/or Python macros are used. This project is closely related to three important analyses in the group:

  • Measurements of fs/fd with hadronic Bs->DsPi decays,
  • Time dependent CP violation analysis of Bs->DsK decays.

Weekly video meetings with CERN coordinate the efforts within the LHCb collaboration.

Contact: Niels Tuning and Mick Mulder

Compton camera

In the Nikhef R&D group we develop instrumentation for particle physics, but we also investigate how particle-physics detectors can be used for different purposes. A successful development is the Medipix chip, which can be used in X-ray imaging. For use in large-scale medical applications, however, Compton scattering limits the energy-resolving possibilities. You will investigate whether it is in principle possible to design an X-ray application that detects both the Compton-scattered electron and the absorbed photon. Your ideas can be tested in practice in the lab, where an X-ray scan can be performed.
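
Compton kinematics fix the relation between the scattered-photon energy and the scattering angle, E' = E / (1 + (E / m_e c^2) * (1 - cos theta)); measuring both the scattered electron and the absorbed photon therefore constrains the incoming photon. A minimal sketch:

```python
import math

# Compton kinematics: energy of the scattered photon as a function of the
# scattering angle, E' = E / (1 + (E / m_e c^2) * (1 - cos(theta))).
# The electron carries away the difference E - E'.

ME_C2_KEV = 511.0  # electron rest energy in keV

def scattered_photon_energy(e_kev, theta_rad):
    return e_kev / (1.0 + (e_kev / ME_C2_KEV) * (1.0 - math.cos(theta_rad)))
```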

Contact: Els Koffeman

Proton Radiography for Proton Beam Therapy

The construction of a Proton Beam Therapy centre in Groningen has started. The Nikhef R&D group is working with KVI-CART and the UMCG in Groningen to improve the quality of the data on which the treatment plan is based. The idea of Proton Beam Therapy is to stop the protons in the tumour, where they will deposit the major part of their energy, thereby destroying the tumour. Currently, only X-ray Computed Tomography data is used to determine the area that needs to be irradiated with protons to destroy the tumour. However, this data is not ideal for calculating the proton-beam stopping-power distribution, as it is based on X-ray attenuation, which is a completely different physical process from the stopping of protons. Therefore, we want to implement Proton Beam Computed Tomography by shooting fast protons through the patient. To improve the information about where the protons are going to stop in the patient, we use a detector system that can track the individual protons both before and after the patient, while also determining how much energy is dissipated in the patient.

In this project the topics that are under study are the following:

  • Analysis of the data taken in May 2015 at 150 MeV proton energy: reconstruction of the proton tracks, combined with the deposited energy, to identify the different materials in the irradiated phantom.
  • Improving the current set-up based on the lessons learned in the analysis.
  • Performing measurements at different initial proton energies with the same phantom to optimise the phantom reconstruction, as the information that can be extracted is energy dependent.
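
The range-energy relation that proton CT aims to pin down per patient can be approximated in water by the Bragg-Kleeman rule, R ≈ alpha * E^p; the constants below are textbook-level approximations, used here only as an order-of-magnitude sketch, not the project's calibration:

```python
# Bragg-Kleeman approximation for the proton range in water, R = alpha * E^p,
# with alpha ~ 0.0022 cm and p ~ 1.77 as rough textbook-level constants.
# Order-of-magnitude sketch only, not the project's calibration.

def proton_range_cm(e_mev, alpha=0.0022, p=1.77):
    return alpha * e_mev ** p

r150 = proton_range_cm(150.0)  # roughly 15-16 cm in water
```

This is why measurements at different initial proton energies probe different depths and carry complementary information about the phantom.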

Contact: Jan Visser

Medical X-ray Imaging

With the advent of true multi-threshold X-ray detectors, the possibility of spectral imaging with low dose, including spectral CT, is now around the corner. The Medipix3RX chip, from the Medipix Collaboration (CERN), features up to 8 programmable thresholds which can select energy bins without a threshold scan. A number of projects can be derived from the R&D activities with the Medipix3RX within the Nikhef R&D group on X-ray imaging for medical applications:

  • Medipix3RX characterization in all its operation modes and gains.
  • Spectral CT and sparse-sampling 3D reconstruction.
  • Charge sharing: the charge-summing capabilities of the chip can be exploited to further understand the problem of charge sharing in pixelized detectors. A combination of the characterization of the charge-summing mode with the use of both planar and 3D sensors, in the light of MC simulations, could reveal valuable information about charge sharing.
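
The threshold logic itself is simple: each photon is counted in the energy bin delimited by the two thresholds it falls between. A sketch of that binning (the threshold values are illustrative, not actual Medipix3RX settings):

```python
# Sketch of multi-threshold counting: a photon is assigned to the bin of the
# highest threshold it exceeds (None if below all thresholds). The threshold
# values are illustrative, not actual Medipix3RX settings.

def bin_photon(energy_kev, thresholds):
    idx = None
    for i, t in enumerate(sorted(thresholds)):
        if energy_kev >= t:
            idx = i
    return idx

THRESHOLDS_KEV = [10, 20, 30, 40, 50, 60, 70, 80]
```

With up to 8 thresholds this yields up to 8 energy bins per pixel in a single exposure, which is what removes the need for a threshold scan.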

Contact: John Idarraga
