Centre for Doctoral Training in Data Intensive Science

2018 Intake Projects

  • Pattern Recognition and Event Classification in the Search for Neutrinoless Double-Beta Decay with SuperNEMO

    Student: Matteo Ceschia

    Supervisory Team: Ruben Saakyan (SuperNEMO)

    SuperNEMO is an experiment, currently being commissioned, to search for neutrinoless double-beta decay (0νββ) and thereby address fundamental questions about the nature of the neutrino and the origin of the cosmic matter-antimatter asymmetry. The project, jointly led by UCL, is unique in the field in combining tracking and calorimetric information in the search for extremely rare nuclear decay modes with half-lives much longer than the age of the universe. SuperNEMO provides much richer per-event information than any other 0νββ experiment: tracker hit positions and drift times are combined with calorimeter timing and energy measurements to build a picture of each event. Although the detection principles are in this sense similar to those of collider experiments, in practice the challenges are quite different, owing to the extensive scattering of low-energy electrons and γ-rays and the wide variety of background isotopes with poorly known spatial distributions (both internal and external to the detector). SuperNEMO currently separates signal from background using a sequence of algorithms (clustering, track fitting and calorimeter association) followed by cut-based event classification. Because each stage is performed sequentially, later stages cannot influence earlier ones: if a particle track has been wrongly split, for example, there is no opportunity to recombine the pieces later, even if other event characteristics strongly favour that hypothesis. More complex topologies are often completely mis-reconstructed and mis-categorised by the current approach. A more holistic event classification approach, in which a machine learning algorithm is allowed to find features on its own, would allow some or all of these steps to be combined and would generate distinctly better analysis outcomes.
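
    As a concrete illustration of what such a holistic approach could look like, the sketch below (hypothetical code, assuming PyTorch; not SuperNEMO software) shows a permutation-invariant "deep sets" classifier that consumes raw per-hit features directly, so that clustering, tracking and association are learned jointly rather than applied as fixed sequential stages:

        import torch
        import torch.nn as nn

        class EventClassifier(nn.Module):
            """Permutation-invariant event classifier over raw detector hits."""
            def __init__(self, n_hit_features=5, n_classes=3):
                super().__init__()
                # Embed each hit (e.g. x, y, z, drift time, energy) independently.
                self.phi = nn.Sequential(
                    nn.Linear(n_hit_features, 64), nn.ReLU(),
                    nn.Linear(64, 64), nn.ReLU(),
                )
                # Classify the pooled, order-independent event representation.
                self.rho = nn.Sequential(
                    nn.Linear(64, 64), nn.ReLU(),
                    nn.Linear(64, n_classes),
                )

            def forward(self, hits):          # hits: (batch, n_hits, n_hit_features)
                embedded = self.phi(hits)     # (batch, n_hits, 64)
                pooled = embedded.sum(dim=1)  # sum-pooling: hit order is irrelevant
                return self.rho(pooled)       # event-class logits

        model = EventClassifier()
        events = torch.randn(8, 200, 5)       # 8 toy events with 200 hits each
        logits = model(events)                # (8, 3): e.g. signal vs. background classes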

  • Transforming exploration of the Dark Matter search frontier: application and development of ML and data intensive science techniques

    Student: Ishan Khurana

    Supervisory Team: Jim Dobson (LZ)

    During my PhD, I will be working on developing machine learning techniques to search for Weakly Interacting Massive Particles (WIMPs) in the first data from the LUX-ZEPLIN (LZ) dark matter experiment.
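
    As a toy illustration of the kind of technique involved (hypothetical code, assuming scikit-learn, with invented stand-in features loosely inspired by the S1/S2 pulse areas of a dual-phase xenon TPC; not the LZ analysis), a boosted decision tree can be trained to separate signal-like from background-like events:

        import numpy as np
        from sklearn.ensemble import GradientBoostingClassifier
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        n = 5000
        # Toy features: log10(S1), log10(S2), radius^2 -- stand-ins for real observables.
        signal = rng.normal([1.0, 3.5, 0.3], [0.2, 0.2, 0.2], size=(n, 3))
        background = rng.normal([1.2, 4.0, 0.5], [0.3, 0.3, 0.3], size=(n, 3))
        X = np.vstack([signal, background])
        y = np.concatenate([np.ones(n), np.zeros(n)])

        X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
        clf = GradientBoostingClassifier().fit(X_train, y_train)
        print(f"toy test accuracy: {clf.score(X_test, y_test):.3f}")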

  • Spherical Wavelet Analysis of Geophysical Data

    Student: Augustin Marignier

    Supervisory Team: Ana Ferreira (Earth Science)

    Despite being very distinct research areas, dark matter mapping in cosmology and the imaging of the seismic properties of the Earth's interior (e.g., wave speeds) in geophysics currently face similar technical challenges. For example, with the explosion of big data sets in both disciplines, many maps and images of the same object have been produced by different research groups. There are often substantial differences between them, but fully quantitative, objective analyses of their statistical properties are lacking. In addition, most images and maps do not report uncertainties, which limits their usefulness for understanding the underlying fundamental physical processes. This project addresses these challenges through a two-way transfer of knowledge between cosmology and geophysics by: (i) applying new methods developed for dark matter mapping in cosmology to the geophysical context, notably wavelet and sparse/compressed sensing approaches and Bayesian hierarchical modelling; and (ii) adapting novel Bayesian trans-dimensional approaches currently used in geophysics to cosmological mapping.
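
    As a toy illustration of the sparsity-based ideas mentioned above (hypothetical code, assuming the healpy package; a real analysis would use spherical wavelets rather than raw harmonic coefficients), the sketch below denoises a map on the sphere by thresholding its spherical harmonic coefficients:

        import numpy as np
        import healpy as hp

        nside, lmax = 64, 128
        cl = 1.0 / (np.arange(lmax + 1) + 1.0) ** 2       # toy power spectrum
        true_map = hp.synfast(cl, nside, lmax=lmax)       # toy map on the sphere
        noisy_map = true_map + 0.1 * np.random.randn(hp.nside2npix(nside))

        alm = hp.map2alm(noisy_map, lmax=lmax)            # to harmonic space
        threshold = 0.5 * np.std(np.abs(alm))
        alm[np.abs(alm) < threshold] = 0                  # sparsity: keep large coefficients only
        denoised_map = hp.alm2map(alm, nside, lmax=lmax)  # back to pixel space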

  • Deep learning in exoplanet data analysis

    Student: Mario Morvan

    Supervisory Team: Ingo Waldmann, Jan-Peter Muller, Angelos Tsiaras

    With thousands of planets confirmed, tens of exoplanetary atmospheres observed, and several promising missions in preparation, the field of exoplanetology is currently undergoing a complete revolution. Yet it still faces technical challenges: planetary signals remain very faint no matter the detection technique, and instrument systematics are numerous and poorly understood. Moreover, the rapid growth of observational data strongly suggests a shift towards more automated algorithms for the detection, de-trending and study of exoplanets. For these reasons, and in order to take full advantage of current detectors and the next generation of space-based transit observers (JWST, ARIEL...), the whole analysis pipeline needs to be transformed, processing the raw data from the camera all the way up to the retrieval of detrended transit light curves and transmission spectra. The current objective of this project is to build a deep learning framework able to deal with the volume and complexity of the input data coming from various exoplanet search projects.
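
    As a hedged sketch of one building block of such a framework (hypothetical code, assuming PyTorch; the architecture and shapes are illustrative, not the project's actual pipeline), a small 1D convolutional network can classify fixed-length light-curve segments as containing a transit or not:

        import torch
        import torch.nn as nn

        class TransitCNN(nn.Module):
            def __init__(self, n_points=512):
                super().__init__()
                self.features = nn.Sequential(
                    nn.Conv1d(1, 16, kernel_size=7, padding=3), nn.ReLU(), nn.MaxPool1d(4),
                    nn.Conv1d(16, 32, kernel_size=7, padding=3), nn.ReLU(), nn.MaxPool1d(4),
                )
                self.head = nn.Linear(32 * (n_points // 16), 1)

            def forward(self, flux):           # flux: (batch, 1, n_points)
                x = self.features(flux)
                return self.head(x.flatten(1)) # logit: transit vs. no transit

        model = TransitCNN()
        batch = torch.randn(4, 1, 512)         # 4 toy normalised light-curve segments
        print(model(batch).shape)              # torch.Size([4, 1])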

  • Applying machine learning to extract galaxy properties

    Student: Sunil Mucesh

    Supervisory Team: Ofer Lahav

  • DES/DECam follow-ups of Gravitational Wave events

    Student: Constantina Nicolaou

    Supervisory Team: Ofer Lahav

    With the advent of the LIGO/Virgo network of gravitational wave detectors, we have successfully detected gravitational waves (GWs) produced during compact object inspirals. This has enabled us to inspect the Universe using a completely new probe. One of the GW events detected, GW170817, is understood to be the result of a binary neutron star (BNS) merger in the galaxy NGC 4993. This was a multi-messenger event, as the Dark Energy Camera (DECam), along with other collaborations, detected an electromagnetic signal accompanying the gravitational wave detection. This detection alone has led to exciting discoveries in the field of physics. The PhD project is split into three main parts. Initially, the aim is to search for optical flashes from future LIGO/Virgo alerts as part of the DES/DECam team. About one GW event per month is expected, so there is a real-time element to the project, with the potential for exciting, unexpected discoveries. This part will involve advanced image processing methods applied to very large data sets. Once a long list of optical candidates is selected, a range of machine learning and Bayesian approaches will be implemented to assign probabilities to the most likely GW host galaxy. This is a 'finding a needle in a haystack' problem: in the case of the DES/DECam follow-up of GW170817, for example, there was a down-selection from 1,500 candidates to the 'real' one. Finally, a catalogue of all shell galaxies in DES will be created by comparing images to templates of simulated shell galaxies, making it possible to test whether BNS systems are more likely to form as a result of a galaxy merger. This hypothesis will be tested by cross-correlating the shell galaxies with X-ray, gamma-ray and GW events and comparing the frequency of events with that of non-shell galaxies. Other outcomes of this project include improved measurements of the Hubble constant using GWs.
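
    As a toy illustration of the Bayesian down-selection step (hypothetical code with invented numbers; not the DES/DECam pipeline), candidate host galaxies can be ranked by combining the GW sky-localisation probability at each galaxy's position with a simple luminosity prior (more luminous galaxies being more likely hosts):

        import numpy as np

        # Per-candidate inputs: GW skymap probability at the galaxy position,
        # and B-band luminosity in solar units (both invented for illustration).
        skymap_prob = np.array([1e-4, 5e-4, 2e-4, 8e-5])
        luminosity = np.array([1e10, 3e9, 5e10, 2e10])

        posterior = skymap_prob * luminosity   # unnormalised: P(location) x P(host | L)
        posterior /= posterior.sum()           # normalise over the candidate list

        for i in np.argsort(posterior)[::-1]:
            print(f"galaxy {i}: P(host) = {posterior[i]:.2f}")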

  • Radiotherapy Dose Verification with Cherenkov Light

    Student: Jeremy Ocampo

    Supervisory Team: Simon Jolly, Adam Gibson, Jamie McClelland

    Modern radiotherapy delivers treatment with a 360° rotating gantry that produces X-rays in a continuous arc around the patient. This X-ray beam is continuously modulated and shaped by adjusting both its intensity and the shape of the beam. The result is a highly conformal dose distribution that changes continuously over time, making dosimetry during treatment significantly more complex than when treatment is delivered with static fields. There is currently no widely accepted method for 4D in-vivo dosimetry in the clinical setting, and we believe that imaging Cherenkov light emission could fill that gap. As the X-rays deposit dose, they produce energetic electrons moving faster than the speed of light in tissue: the subsequent emission of Cherenkov light can be recorded to reconstruct the 4D dose distribution. Cherenkov imaging therefore allows the radiation field to be imaged in real time and registered to anatomical landmarks. This will give greater confidence that the planned dose is actually being delivered, enabling real-time detection, and ultimately correction, of discrepancies between the planned and delivered dose, and giving clinicians the confidence to trial more advanced therapies. Experiments at UCL have demonstrated that the treatment beam from a radiotherapy gantry generates visible Cherenkov light which can be imaged with a standard consumer digital SLR camera. Other groups have produced in-vivo movies showing the Cherenkov light emitted as treatment progresses. This project will build on that work and demonstrate the application of Cherenkov imaging to real-time in-vivo verification. The techniques developed in this project include medical image reconstruction, Geant4-based Monte Carlo simulation of the Cherenkov emission, and deep-learning methods to optimise the reconstruction of the Cherenkov signal.
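
    A short worked example of the underlying physics: Cherenkov light is emitted only when an electron's speed exceeds the phase velocity of light in the medium (β > 1/n). Assuming a typical refractive index of roughly n ≈ 1.4 for soft tissue, the threshold kinetic energy works out to about 220 keV:

        import math

        # Cherenkov threshold: beta > 1/n, so the threshold Lorentz factor is
        # gamma = 1 / sqrt(1 - 1/n^2) and T_threshold = m_e c^2 * (gamma - 1).
        m_e_c2 = 0.511   # electron rest energy in MeV
        n = 1.4          # assumed refractive index of soft tissue

        gamma_threshold = 1.0 / math.sqrt(1.0 - 1.0 / n**2)
        T_threshold = m_e_c2 * (gamma_threshold - 1.0)
        print(f"Cherenkov threshold in tissue: {T_threshold * 1000:.0f} keV")  # ~219 keV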

  • Applying machine learning to nuclear fusion

    Student: Katya Richards

    Supervisory Team: Nikos Konstantinidis

    In the current age of climate change, we need a new source of energy that is sustainable, reliable, and produces neither greenhouse gases nor long-term radioactive waste. Nuclear fusion, the process which powers the Sun, is a promising solution, with a few major problems still to solve. One of these problems, plasma disruptions, which have an expensive habit of melting parts of the reactor walls, is the topic of my PhD.
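
    As a purely illustrative sketch of the machine learning angle (hypothetical code with invented features and labels, assuming scikit-learn; not an actual tokamak analysis), disruption prediction can be framed as a classification problem over windowed plasma diagnostics:

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(1)
        n_windows = 2000
        # Toy features per time window, e.g. mode amplitude, density, current decay rate.
        X = rng.normal(size=(n_windows, 3))
        # Toy labels: 1 if a disruption follows within the warning horizon.
        y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(scale=0.5, size=n_windows) > 1.0).astype(int)

        clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X[:1500], y[:1500])
        print(f"toy held-out accuracy: {clf.score(X[1500:], y[1500:]):.3f}")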

  • Learning to Ignore Uncertainties with Adversaries at the LHC

    Student: Sam Van Stroud

    Supervisory Team: Tim Scanlon (ATLAS)

    My project at ATLAS is focused on improving b-tagging using data intensive methods. This will involve retraining the standard b-tagging MVA algorithms with a focus on both weakly supervised and unsupervised learning. The enhanced algorithm is intended to become the default ATLAS b-tagging tool, used by all analyses. Additionally, we will train a systematics-aware ANN using data-corrected simulated samples, which should significantly reduce the impact of systematic uncertainties on the systematically limited H->bb analysis. The final goal of the project is to perform the world's most precise H->bb measurement.
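
    As a sketch of the adversarial idea (hypothetical code, assuming PyTorch, in the spirit of "learning to pivot" approaches; not the ATLAS implementation), a tagger can be trained jointly against an adversary that tries to recover a nuisance parameter from the tagger's output, with the tagger penalised whenever the adversary succeeds:

        import torch
        import torch.nn as nn

        classifier = nn.Sequential(nn.Linear(10, 64), nn.ReLU(), nn.Linear(64, 1))
        adversary = nn.Sequential(nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, 1))

        opt_clf = torch.optim.Adam(classifier.parameters(), lr=1e-3)
        opt_adv = torch.optim.Adam(adversary.parameters(), lr=1e-3)
        bce, mse = nn.BCEWithLogitsLoss(), nn.MSELoss()
        lam = 1.0                                  # strength of the decorrelation penalty

        x = torch.randn(256, 10)                   # toy jet features
        y = torch.randint(0, 2, (256, 1)).float()  # toy b-jet labels
        z = torch.randn(256, 1)                    # toy nuisance parameter (e.g. a systematic shift)

        for step in range(100):
            # 1) train the adversary to recover z from the (frozen) tagger output
            with torch.no_grad():
                s_fixed = torch.sigmoid(classifier(x))
            opt_adv.zero_grad()
            mse(adversary(s_fixed), z).backward()
            opt_adv.step()

            # 2) train the tagger to identify b-jets while fooling the adversary
            opt_clf.zero_grad()
            logits = classifier(x)
            loss = bce(logits, y) - lam * mse(adversary(torch.sigmoid(logits)), z)
            loss.backward()
            opt_clf.step()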