
PhD course (FMS020F-NAMS002): Statistical inference for partially observed stochastic processes

 

Institute/Department: Mathematical Statistics, Centre for Mathematical Sciences 

Credits: 7.5 credits

Topics: Inference and data imputation for diffusions and other continuous time processes; iterated filtering; particle marginal methods for parameter inference; approximate Bayesian computation; inference for Gaussian Markov random fields.

 

Prerequisites: Basics of inference for stochastic processes, Bayesian methods and Monte Carlo methods (e.g. Markov chain Monte Carlo, the Metropolis-Hastings method); for example, having taken the courses Time Series Analysis (FMS051/MASM17) and Monte Carlo and Empirical Methods for Stochastic Inference (FMS091/MASM11).

 

Registration: register at http://goo.gl/forms/SEJHnEfXDi

 

Schedule and first lecture: we plan to have our first lecture on Monday 1 February 2016, 10.15-12.15 and 13.15-14.15 in room MH:228. From 14.15 to 15.15 students are then invited to form groups and start working on the exercise material on their own; the lecturers can be contacted in their offices for questions. The complete schedule is available here.

 

Course content: This PhD-level course will present an overview of modern inferential methods for partially observed stochastic processes, with emphasis on state-space models (also known as Hidden Markov Models). Here is a detailed course description.

  • 1) Inference and data imputation for diffusions and other continuous-time processes. The inference problem for diffusion processes is generally difficult due to the lack of closed-form expressions for the likelihood function. However, the problem becomes manageable if data are imputed between the observations. This lecture will cover the basic ideas and some important variance reduction techniques (with a focus on bridge samplers), as well as pointing towards future extensions. (A minimal code sketch of the imputation idea is given after this list.)
  • 2) Iterated filtering. General state space models are defined in terms of a latent Markov process, from which partial observations can be obtained. This typically means that the latent process must be recovered in order to estimate parameters. An old idea, going back at least half a century, is to treat the model parameters as latent processes themselves. This idea has been tested repeatedly with varying success, but no proof was presented before the introduction of iterated filtering. The lecture will present a historical overview, while trying to explain why older methods have failed. These experiences are then used to introduce the "iterated filtering" method, for which strong consistency can be proved. We also look at some extensions that are far more efficient from a computational point of view. (A crude sketch of the parameters-as-latent-states idea is given after this list.)
  • 3) Particle marginal methods for parameter inference. Sequential Monte Carlo methods (SMC, a.k.a. particle filters) have revolutionized and simplified the problem of filtering for nonlinear, non-Gaussian models. For example, SMC can be used to filter the "signal" in state-space models out of noisy measurements by using a set of N computer-generated "trajectories" ("particles") of the system's state. SMC can also be used to construct an approximation to the likelihood function for the parameters of the state-space model of interest. A striking result arises when SMC is used in the context of Bayesian inference for the model parameters: if an unbiased approximation to the likelihood function is plugged into the posterior distribution of the parameters, it is possible to construct a standard MCMC algorithm for sampling exactly from that posterior distribution, regardless of the specific (finite) value of N. This is the so-called "pseudo-marginal approach", which is a special case of the class of algorithms known as particle MCMC. (A minimal pseudo-marginal sketch is given after this list.)
  • 4) Approximate Bayesian Computation. Approximate Bayesian Computation (ABC) is a class of algorithms allowing for inference in complex models with "intractable likelihoods". Specifically, by "complex" we mean models for which we are unable to make use of the likelihood function (because it is analytically unavailable or computationally too expensive to evaluate). However, it is often the case that it is possible -- and computationally cheap -- to simulate from the data generating model, which amounts to producing "simulated data" from the likelihood function. By repeatedly drawing from the likelihood we can construct "likelihood-free" methods for Bayesian inference even when we cannot evaluate the likelihood pointwise (but we can somehow sample from it!). In the most typical scenarios such methods only result in approximate Bayesian inference, though they can also produce exact inference under some very stringent conditions. (A minimal ABC rejection sketch is given after this list.)
  • 5) Gaussian Markov random fields. A common model for spatial data consists of a latent Gaussian field with (non-)Gaussian observations. The dependence structure in the latent process is often described using a parametric covariance function. For large datasets the computation, storage and inversion of the covariance matrix become a major issue. Replacing the covariance with a suitable Markov random field representation leads to latent fields with sparse precision matrices, which have computational benefits. This lecture will cover the basic ideas of models with latent Gaussian processes, discussing alternatives to covariance matrices for large data. We will then discuss the formulation of latent Gaussian processes as solutions to stochastic partial differential equations (SPDEs), the links between SPDEs and older conditional and simultaneous autoregressive models (CAR & SAR), the spectral interpretation of the SPDE, the construction of solutions using basis functions, and the selection of different basis functions. (A minimal sparse-precision sketch is given after this list.)
  • 6) Inference for Gaussian Markov random fields. Inference for GMRF-based models (and other latent field models) is often based on MCMC computations. This lecture will cover the general framework of hierarchical Bayesian modelling, i.e. a partially observed latent process with unknown parameters governing the process and the observations. Inference for these models will be discussed highlighting: 1) the advantage of blocking MCMC updates, 2) construction of Laplace approximations to the posterior, 3) using Laplace approximations to construct MCMC proposals, and 4) replacing the MCMC step with numerical integration, resulting in INLA. If time allows, other methods for estimating latent fields, mainly expectation maximisation (possibly something about SA-EM) and expectation conjugate gradient algorithms, can be discussed. (A scalar Laplace-approximation sketch is given after this list.)
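Topic 1, in its simplest form: the sketch below imputes latent values of a toy Ornstein-Uhlenbeck diffusion between two observations using a plain Brownian-bridge proposal, and evaluates the Euler-Maruyama complete-data log-likelihood on the imputed path. The model, the parameter values (theta, sigma, Delta, M) and the bridge construction are illustrative assumptions, not the variance-reduced samplers covered in the lecture.

```python
import numpy as np

# Toy setup (hypothetical): Ornstein-Uhlenbeck process dX = -theta*X dt + sigma dW,
# observed at times 0 and Delta; impute M-1 latent values in between.
rng = np.random.default_rng(0)
theta, sigma = 0.8, 0.5
x0, x1, Delta = 1.0, 0.4, 1.0
M = 10
t = np.linspace(0.0, Delta, M + 1)

def brownian_bridge_proposal(x0, x1, t, sigma, rng):
    """Propose an imputed path pinned at (t[0], x0) and (t[-1], x1)."""
    dW = rng.normal(scale=sigma * np.sqrt(np.diff(t)))
    free = x0 + np.concatenate(([0.0], np.cumsum(dW)))            # unconstrained path from x0
    return free + (t - t[0]) / (t[-1] - t[0]) * (x1 - free[-1])   # pin the endpoint at x1

def euler_loglik(path, t, theta, sigma):
    """Euler-Maruyama complete-data log-likelihood of an imputed path."""
    dt = np.diff(t)
    x = path[:-1]
    mean = x - theta * x * dt          # Euler-Maruyama transition mean
    var = sigma**2 * dt                # ... and transition variance
    resid = path[1:] - mean
    return np.sum(-0.5 * np.log(2 * np.pi * var) - 0.5 * resid**2 / var)

path = brownian_bridge_proposal(x0, x1, t, sigma, rng)
print("imputed path:", np.round(path, 3))
print("complete-data log-likelihood:", round(float(euler_loglik(path, t, theta, sigma)), 3))
```

In a full algorithm such imputed paths would be proposed inside an MCMC or importance-sampling scheme, with the discrepancy between proposal and target accounted for in the acceptance ratio or weights.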
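Topic 2: the following is a deliberately crude sketch of the underlying idea only (treat the parameter as a slowly varying latent state, let a particle filter track it, and cool the artificial noise over repeated passes). It is not the actual iterated filtering algorithm with its proven consistency; the toy model and all tuning constants are hypothetical.

```python
import numpy as np

# Toy model (hypothetical): x_t = phi*x_{t-1} + v_t, y_t = x_t + e_t, v_t, e_t ~ N(0, 1).
# Each particle carries its own phi, perturbed by random-walk noise whose scale is
# reduced over repeated filtering passes ("cooling").
rng = np.random.default_rng(5)

def simulate(phi, T=100):
    x, y = 0.0, np.zeros(T)
    for t in range(T):
        x = phi * x + rng.normal()
        y[t] = x + rng.normal()
    return y

def perturbed_filter(y, phi0, sd, N=500):
    """One particle-filter pass in which every particle carries a perturbed phi."""
    phi = phi0 + rng.normal(scale=sd, size=N)
    x = np.zeros(N)
    for obs in y:
        phi = phi + rng.normal(scale=sd, size=N)      # artificial parameter dynamics
        x = phi * x + rng.normal(size=N)              # propagate the state
        logw = -0.5 * (obs - x) ** 2                  # Gaussian observation weight (up to a constant)
        w = np.exp(logw - logw.max())
        idx = rng.choice(N, size=N, p=w / w.sum())    # resample states and parameters jointly
        x, phi = x[idx], phi[idx]
    return float(np.mean(phi))                        # updated point estimate of phi

y = simulate(phi=0.7)
phi_hat, sd = 0.2, 0.2
for _ in range(20):                                   # repeat with shrinking perturbation scale
    phi_hat = perturbed_filter(y, phi_hat, sd)
    sd *= 0.8
print("final estimate of phi:", round(phi_hat, 3))
```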
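Topic 3: a minimal pseudo-marginal (particle marginal Metropolis-Hastings) sketch for a toy linear-Gaussian state-space model. A bootstrap particle filter provides the likelihood estimate that is plugged into a random-walk Metropolis step for the autoregressive parameter phi; the model, prior and tuning constants are hypothetical choices for illustration.

```python
import numpy as np

# Toy state-space model (hypothetical): x_t = phi*x_{t-1} + v_t, y_t = x_t + e_t,
# with v_t, e_t ~ N(0, 1). A bootstrap particle filter estimates the log-likelihood,
# and this estimate drives a random-walk Metropolis chain for phi (PMMH).
rng = np.random.default_rng(1)

def simulate(phi, T=50):
    x, y = 0.0, np.zeros(T)
    for t in range(T):
        x = phi * x + rng.normal()
        y[t] = x + rng.normal()
    return y

def bootstrap_loglik(phi, y, N=200):
    """Particle estimate of the likelihood (unbiased on the natural scale), returned as a log."""
    x = np.zeros(N)
    ll = 0.0
    for obs in y:
        x = phi * x + rng.normal(size=N)                  # propagate particles
        logw = -0.5 * np.log(2 * np.pi) - 0.5 * (obs - x) ** 2
        m = logw.max()
        w = np.exp(logw - m)
        ll += m + np.log(w.mean())                        # log of the likelihood increment
        x = rng.choice(x, size=N, p=w / w.sum())          # multinomial resampling
    return ll

y = simulate(phi=0.7)
phi, ll, chain = 0.3, bootstrap_loglik(0.3, y), []
for _ in range(1000):
    phi_prop = phi + rng.normal(scale=0.1)                # random-walk proposal
    ll_prop = bootstrap_loglik(phi_prop, y)
    if np.log(rng.uniform()) < ll_prop - ll:              # flat (improper) prior on phi
        phi, ll = phi_prop, ll_prop
    chain.append(phi)
print("posterior mean of phi (after burn-in):", round(float(np.mean(chain[200:])), 3))
```

The key point of the pseudo-marginal construction is that the chain targets the exact posterior even though the likelihood is only estimated, for any finite number of particles N.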
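Topic 4, in its simplest form, is the ABC rejection sampler sketched below for a toy exponential model: only forward simulation from the model is used, never a likelihood evaluation. The summary statistic, tolerance and prior are arbitrary illustrative choices.

```python
import numpy as np

# Toy ABC rejection sampler (all choices hypothetical): infer the rate of an
# exponential model using only simulation from the model.
rng = np.random.default_rng(2)
y_obs = rng.exponential(scale=1 / 2.0, size=100)       # "observed" data, true rate = 2
s_obs = y_obs.mean()                                   # summary statistic of the data

accepted = []
for _ in range(50_000):
    rate = rng.uniform(0.1, 10.0)                      # draw a candidate rate from the prior
    y_sim = rng.exponential(scale=1 / rate, size=100)  # simulate a data set from the model
    if abs(y_sim.mean() - s_obs) < 0.05:               # accept if the summaries are close enough
        accepted.append(rate)

print(len(accepted), "accepted draws; approximate posterior mean of the rate:",
      round(float(np.mean(accepted)), 2))
```

In practice the choice of summary statistics and tolerance trades approximation quality against Monte Carlo cost, and MCMC or sequential versions of ABC reduce the waste of plain rejection sampling.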
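Topic 5: the sketch below shows the computational point of sparse precisions. A CAR-type latent field on a one-dimensional lattice has a sparse precision matrix, so conditioning on noisy observations reduces to a sparse linear solve that scales to large n, unlike working with a dense covariance matrix. The specific precision structure and parameter values are illustrative assumptions, not the SPDE construction from the lecture.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import spsolve

# Hypothetical model: latent field x ~ N(0, Q^{-1}) with a sparse CAR-type precision
# Q on a 1-D lattice, observed as y = x + Gaussian noise. The posterior mean solves a
# sparse linear system and therefore stays cheap even for large n.
rng = np.random.default_rng(3)
n, kappa, tau, sigma = 2000, 0.3, 1.0, 0.5

# Q = tau * (kappa^2 I + R), with R the second-difference (graph Laplacian) matrix
R = sp.diags([-1.0, 2.0, -1.0], offsets=[-1, 0, 1], shape=(n, n), format="csc")
Q = tau * (kappa**2 * sp.identity(n, format="csc") + R)

# fake data: a smooth signal plus noise stands in for a draw from the prior
x_true = np.sin(np.linspace(0, 6 * np.pi, n))
y = x_true + rng.normal(scale=sigma, size=n)

# posterior precision and mean: Q_post = Q + I / sigma^2,  mu = Q_post^{-1} (y / sigma^2)
Q_post = Q + sp.identity(n, format="csc") / sigma**2
mu = spsolve(Q_post, y / sigma**2)

print("posterior mean RMSE vs truth:", round(float(np.sqrt(np.mean((mu - x_true) ** 2))), 3))
```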
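Topic 6's basic building block, the Laplace approximation, is sketched below for a one-parameter toy Poisson model: the log-posterior is maximised and its curvature at the mode defines a Gaussian approximation. This is only the scalar version of the idea, with a hypothetical model and prior; INLA applies nested versions of it to high-dimensional latent fields.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Toy model (hypothetical): y_i ~ Poisson(exp(theta)) with a N(0, 1) prior on theta.
# The posterior is approximated by a Gaussian centred at the mode, with variance equal
# to the inverse curvature of the negative log-posterior at the mode.
rng = np.random.default_rng(4)
y = rng.poisson(lam=np.exp(0.7), size=50)            # data generated with theta = 0.7

def neg_log_post(theta):
    lam = np.exp(theta)
    loglik = np.sum(y * theta - lam)                 # Poisson log-likelihood (up to a constant)
    logprior = -0.5 * theta**2                       # N(0, 1) prior
    return -(loglik + logprior)

mode = minimize_scalar(neg_log_post).x               # posterior mode
h = 1e-4                                             # numerical second derivative at the mode
curv = (neg_log_post(mode + h) - 2 * neg_log_post(mode) + neg_log_post(mode - h)) / h**2
print(f"Laplace approximation: theta | y ~ N({mode:.3f}, {1 / curv:.4f})")
```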

 

Course literature: see the bibliographic references for each topic here.


Assessment: To pass the course students must solve at least 3 out of 6 home assignments and submit an approved final written project report/presentation. More specifically, students must solve at least one assignment from each of the following three areas: (a) particle methods and approximate Bayesian computation; (b) data imputation for diffusions and iterated filtering; (c) inference for Gaussian Markov random fields. For the final project, students freely choose a method or article to study in more detail (e.g. an extension of a method considered during the course) and implement the corresponding simulations.

 

Lecturers: Erik Lindström, Johan Lindström, Umberto Picchini.

 

Language: the course is given in English.

 

Course material: the table below will be populated with slides, code etc. as the course proceeds.

date | time | room | lecture | material | lecturer
1 February | 10.15-12 | MH228 | Inference and imputation for diffusions and other continuous time processes | slides | EL
1 February | 13.15-14 | MH227 | lecture continuation | | EL
1 February | 14.15-15 | MH227 | exercises session | |
4 February | 10.15-12 | MH228 | latent Gaussian processes and stochastic partial differential equations | slides | JL
4 February | 13.15-14 | MH227 | lecture continuation | | JL
4 February | 14.15-15 | MH227 | exercises session | |
8 February | 10.15-12 | MH228 | particle marginal methods for parameter inference | slides | UP
8 February | 13.15-14 | MH227 | lecture continuation | | UP
8 February | 14.15-15 | MH227 | exercises session | |
11 February | 10.15-12 | MH228 | inference for Gaussian Markov random fields and the INLA approach | slides | JL
11 February | 13.15-14 | MH227 | lecture continuation | | JL
11 February | 14.15-15 | MH227 | exercises session | |
15 February | 10.15-12 | MH228 | approximate Bayesian computation | slides | UP
15 February | 13.15-14 | MH227 | lecture continuation | | UP
15 February | 14.15-15 | MH227 | exercises session | |
18 February | 10.15-12 | MH228 | Iterated filtering | slides; List of Final Projects | EL
18 February | 13.15-14 | MH227 | lecture continuation | | EL
18 February | 14.15-15 | MH227 | exercises session | |


Practical info

Course start: first lecture on 1 February 2016 at 10.15

Room MH:228, Centre for Mathematical Sciences

Detailed schedule

 

Registration:

goo.gl/forms/SEJHnEfXDi

 

Course codes: FMS020F (LTH), NAMS002 (NatFak)

Credits: 7.5 ECTS

Detailed description: available here

Language: English

 

Lecturers:

Erik Lindström

Johan Lindström

Umberto Picchini

 
