Postdoctoral Researcher Positions

These positions are now closed: we will be advertising similar positions later in 2024/25.

We are now recruiting for five PDRA positions.

There is a three-year position available at each of our six institutions. (This recruitment relates to five of these positions; there is a separate recruitment process for the PDRA position at the University of Cambridge.) Interaction between the research groups at the six universities is strongly encouraged and resourced, with broad research projects designed to involve substantial cross-institutional collaboration.

The research projects led by each university differ, as does the technical expertise required from the PDRAs. These are detailed below.

To apply for the PDRA positions, click here.

For further information about the positions in general contact probaihub@lancaster.ac.uk.

  • Lead Researchers: Anthony Lee/Christophe Andrieu

    Themes: AI-scale probabilistic reasoning; Mathematical underpinnings of generative models; Structure-constrained and informed AI.

    The Bristol team has expertise in computational statistics and applied probability, with a focus on new algorithms and theoretical analysis of existing algorithms. We have recently been exploring a variety of generative modelling ideas.

    Possible Research Projects:

    - Sampling methods for AI-scale problems: theory and/or methodology.

    - Analysis/refinement of probabilistic properties of generative models. E.g. LLMs are trained to accurately predict masked tokens, which corresponds to learning a distribution over unobserved tokens.

    - Mathematical understanding of generative models, e.g. transformers, diffusion-based models and optimal transport.

    - Methodology using generative models for statistical tasks, e.g. by appropriate conditioning.

    - Methodology for incorporating constraints in AI models, e.g. by modularization, losses or architectures.

    Skills/Experience Required:

    - PhD in a Mathematical Science.

    - Strong mathematical background in a suitable combination of probability, statistics, machine learning, optimization and simulation.

    - Ability to code numerical algorithms (essential) and to use common machine learning frameworks (previous experience desirable but not necessary).

    - Familiarity with generative AI models (desirable).

  • Lead Researchers: Ben Leimkuhler/Aretha Teckentrup/Sara Wade/Konstantinos Zygalakis

    Themes: AI-scale probabilistic reasoning; Mathematical underpinnings of generative models; Structure-constrained and informed AI; and Probabilistic and uncertainty-aware methods for trustworthy and green AI.

    The Edinburgh team brings together expertise in computational statistics, scientific computing and applied analysis to address large-scale problems. Research in Edinburgh will focus on developing scalable and robust probabilistic inference.

    Potential Projects:

    - Efficient, scalable algorithms for sampling and optimization in artificial intelligence applications, including e.g. sparse Bayesian neural networks, (multilevel) Gaussian processes, gradient-based schemes, piece-wise deterministic Markov processes
    - Machine Learning inspired algorithms for SDEs and molecular systems
    - Understanding limiting behaviours of deep Gaussian processes and Bayesian neural networks, e.g. under physical constraints or sparsity
    - Numerical methods for accelerating and stabilising diffusion models

    Skills/Experience Required: Candidates must have a PhD in applied mathematics, statistics, or a closely related area. This should include expertise in relevant areas such as computational statistics, Bayesian inference, numerical analysis or uncertainty quantification, and evidence of the ability to develop new methods and/or theoretical understanding. Some programming experience is also essential. Prior experience in using AI or ML tools is desirable but not essential.

  • Lead Researchers: Paul Fearnhead/Chris Nemeth

    Themes: AI-scale probabilistic reasoning; Mathematical underpinnings of generative models; and Probabilistic and uncertainty-aware methods for trustworthy and green AI.

    Recent breakthroughs, from generative AI to the use of AI for emulation, hold promise to revolutionise many areas of science and impact wide-ranging applications. Research at Lancaster will look at links between these methods and more traditional approaches to computational and Bayesian statistics, how these links can make AI more reliable and efficient, and how they can lead to new computational statistics methods.

    Potential Projects:

    - Developing sampling methods that can scale to AI-size applications.

    - Exploring links between annealing, sequential Monte Carlo and diffusion generative models.

    - Extending generative modelling to new data structures.

    - New algorithms for constrained and conditional sampling for diffusion generative models.

    - Developing new scalable learning approaches to uncertainty quantification for AI.

    Skills/Experience Required: Candidates must have a PhD in some area of statistics or machine learning. This should include expertise in relevant areas such as: computational statistics, Bayesian statistics or stochastic processes, and evidence of the ability to develop new methods and/or theoretical understanding. Some programming experience is also essential. Prior experience in generative AI or other relevant areas of AI is desirable but not essential.

  • Lead Researchers: Catherine Powell/David Silvester/Jonas Latz

    Themes: Structure-constrained and informed AI.  

    Postdocs at Manchester will explore connections between the numerical analysis of, and numerical methods for, differential equations, and the design and analysis of novel probabilistic AI methods.

    ML and AI methods are becoming increasingly popular for building surrogates for models of physical processes or as models for complex data. While they offer more flexibility in how data can be incorporated into approximation and simulation tasks, standard approaches often ignore, or are unable to enforce, important structural information. Moreover, the quality of predictions obtained using existing ML and AI methods is often difficult to quantify. Finding novel ways to fuse structural and other domain-specific information with data promises not only better AI models but also more robust approximations with smaller generalisation error.

    Potential Projects:

    - ML models that satisfy physical laws

    - Developing hybrid classical and AI-informed solvers for PDE models

    - Designing better neural network architectures that mimic structure-preserving numerical schemes for differential equations

    - Using ML techniques to learn parameters that optimise performance of deterministic and randomised numerical methods

    - Learning PDEs from data using variational and Bayesian techniques

    Skills/Experience Required: Candidates must have a PhD in some area of applied mathematics, preferably in numerical analysis, with expertise in the implementation, application and analysis of numerical methods (such as finite elements, finite differences, time-stepping, etc.) for models consisting of differential equations (e.g. deterministic, parametric or stochastic ODEs/PDEs). Some programming experience is also essential. Prior experience in using AI or ML tools is desirable but not essential. Candidates with expertise only in Computer Science, Machine Learning or Statistics will not be considered for this post.

  • Lead Researchers: Paul Jenkins/Gareth Roberts/Matthew Thorpe

    Themes: AI-scale probabilistic reasoning; A dynamic systems and probabilistic view of AI; Mathematical underpinnings of generative models; Structure-constrained and informed AI.

    The Warwick team will fuse expertise in Bayesian and computational statistics, probability, stochastic simulation, continuum dynamics, PDEs and mathematical analysis to provide mathematical underpinning for existing AI algorithms and to study their limiting behaviours, in order to understand complexity and provide interpretability. This understanding will be used to devise new scalable sampling algorithms and to inspire new AI architectures and related machine learning algorithms.

    Potential Projects:

    - Developing theory and methodology for sampling algorithms that can scale to AI-size applications.

    - Introducing and providing theory for simulation methods for conditioned stochastic processes, for application in AI algorithms such as diffusion-based generative modelling.

    - Developing new methodology and associated theory for stochastic process limits of large AI structures such as neural networks.

    - Connecting (graph) neural networks to continuum models via deep-layer and mean-field limits to understand the behaviour of large networks.

    - Proposing new neural architectures through the PDEs arising from the continuum interpretation.

    - Providing an interpretation of learning in neural networks that can be formulated as gradient flows.

    - Developing dimension-independent (i.e. independent of network depth and width) sampling techniques for uncertainty quantification in neural networks.

    Skills/Experience Required: Candidates must have a PhD in some area of statistics or machine learning. This should include expertise in relevant areas such as: computational statistics, Bayesian statistics, probability, or applied analysis (in particular, functional analysis, PDEs and/or variational methods), and evidence of the ability to develop new methods and/or theoretical understanding. Candidates with a relevant theoretical background who can demonstrate a willingness to apply their expertise to AI applications are also encouraged to apply. Some programming experience is also desirable. Prior experience in existing methods of AI is desirable but not essential.