Giacomo Zanella (University of Warwick): Bayesian complementary clustering, MCMC for data association and Anglo-Saxon placenames

December 11, 2013

Motivated by the study of Anglo-Saxon settlement locations, where administrative clusters involving a variety of complementary names tend to appear, we develop a Bayesian Cluster Model that, given a multi-type (k-type) point process, looks for clusters formed by points of different types. We obtain a multimodal, intractable posterior distribution on the space of matchings contained in a k-partite hypergraph. This distribution is closely related to the posterior distributions arising in multi-target tracking and data association problems.

We develop an efficient Metropolis-Hastings algorithm to sample from the posterior distribution. We consider the problem of choosing an optimal proposal distribution to sample from product-form measures on the space of matchings contained in a bipartite graph. Simulated tempering is used to overcome multimodality, and a multiple-proposal scheme is developed to allow for parallel programming. Finally, convergence diagnostics are used to assess the improvements in mixing brought by these techniques. This allows us to study the Anglo-Saxon placename locations dataset. First results seem to support the hypothesis that the settlements were organized into administrative clusters.

PhD work supervised by Wilfrid Kendall

Keywords: Bayesian cluster models; Complementary clustering; MCMC; Optimal proposal distribution; k-dimensional assignment problem; Probabilistic data association; Anglo-Saxon placenames locations
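As a flavour of how simulated tempering counters multimodality (a toy stand-in, not the authors' sampler on matchings), the sketch below runs Metropolis-Hastings on a bimodal one-dimensional density while also moving along an assumed inverse-temperature ladder; a full implementation would additionally tune pseudo-prior weights for the ladder, which is omitted here.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_target(x):
    # bimodal toy density standing in for the multimodal posterior on matchings
    return np.logaddexp(-0.5 * (x - 4.0) ** 2, -0.5 * (x + 4.0) ** 2)

betas = [1.0, 0.5, 0.25, 0.1]      # inverse-temperature ladder (assumed)
x, k = 0.0, 0                      # chain state: position and ladder index

samples = []
for it in range(20000):
    # Metropolis-Hastings move in x at inverse temperature betas[k]
    prop = x + rng.normal(scale=1.0 / np.sqrt(betas[k]))
    if np.log(rng.uniform()) < betas[k] * (log_target(prop) - log_target(x)):
        x = prop
    # symmetric random-walk move on the temperature index
    j = k + rng.choice([-1, 1])
    if 0 <= j < len(betas):
        if np.log(rng.uniform()) < (betas[j] - betas[k]) * log_target(x):
            k = j
    if k == 0:                     # retain only samples at the target temperature
        samples.append(x)

samples = np.array(samples)
```

Without pseudo-prior weights the chain spends unequal time at each rung, but for this toy target it still hops between the modes at ±4, which a fixed-temperature random walk with the same base step would rarely do.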

Simon Lacoste-Julien (INRIA, ENS, Paris): Sequential Kernel Herding: Frank-Wolfe Optimization for Particle Filtering

December 1, 2013

Recently, the Frank-Wolfe optimization algorithm was suggested as a procedure to obtain adaptive quadrature rules for integrals of functions in an RKHS, with a potentially faster rate of convergence than Monte Carlo integration (and “kernel herding” was shown to be a special case of this procedure). In this work, we propose to replace the random sampling step in a particle filter by Frank-Wolfe optimization. By optimizing the position of the particles, we can obtain better accuracy than random or quasi-Monte Carlo sampling. In applications where the evaluation of the emission probabilities is expensive (such as in robot localization), the additional computational cost of generating the particles through optimization can be justified. Experiments on standard synthetic examples as well as on a robot localization task indeed indicate an improvement in accuracy over random and quasi-Monte Carlo sampling.

Joint work with Fredrik Lindsten and Francis Bach.
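As a flavour of the herding/Frank-Wolfe quadrature idea (this is plain kernel herding with uniform weights, not the authors' particle-filter variant), the sketch below greedily selects quadrature points for p = N(0, 1) under a Gaussian kernel, whose mean embedding is available in closed form; the bandwidth and candidate grid are arbitrary assumptions.

```python
import numpy as np

sigma = 1.0                                 # kernel bandwidth (assumption)

def kern(x, y):
    return np.exp(-(x - y) ** 2 / (2 * sigma ** 2))

def mean_embedding(x):
    # closed form of E_{y ~ N(0,1)} kern(x, y) for the Gaussian kernel
    return sigma / np.sqrt(sigma ** 2 + 1) * np.exp(-x ** 2 / (2 * (sigma ** 2 + 1)))

grid = np.linspace(-5, 5, 2001)             # candidate particle locations
points = []
for n in range(50):
    # herding objective: mean embedding minus average kernel to chosen points
    obj = mean_embedding(grid)
    if points:
        obj = obj - np.mean([kern(xj, grid) for xj in points], axis=0)
    points.append(grid[np.argmax(obj)])

points = np.array(points)
est_mean = points.mean()                    # quadrature estimate of E[x]   (truth 0)
est_var = (points ** 2).mean()              # quadrature estimate of E[x^2] (truth 1)
```

Each new point maximizes the mean embedding minus the average kernel to the previously chosen points, i.e. a Frank-Wolfe step on the squared MMD; the resulting equal-weight point set then acts as a quadrature rule for smooth integrands.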

Wentao Li (Lancaster University): Efficient Sequential Monte Carlo with Multiple Proposals and Control Variates

November 30, 2013

Sequential Monte Carlo is a useful method for online filtering of state space models. Due to the complexity of modern problems, a single proposal distribution is usually not efficient, and considering multiple proposal distributions is a general way to address various aspects of the filtering. This paper proposes an efficient method for using multiple proposals in combination with control variates. The likelihood approach of Tan (2004) is used in both resampling and estimation. The new algorithm is shown to be asymptotically more efficient than the direct use of multiple proposals and control variates. Guidance for selecting multiple proposals and control variates is also given. Numerical studies of the AR(1) model observed with noise and the stochastic volatility model with AR(1) dynamics show that the new algorithm can significantly improve over the bootstrap filter and auxiliary particle filter.
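The following sketch is not Tan's likelihood method itself, but a generic illustration of the two ingredients: deterministic-mixture importance sampling with two proposals, and the proposal densities (whose expectations under the mixture are known) used as control variates. The target, proposals and integrand are toy assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def npdf(x, m, s):
    return np.exp(-0.5 * ((x - m) / s) ** 2) / (s * np.sqrt(2 * np.pi))

# Target pi = N(0, 1); integrand f(x) = x^2, true integral 1.
f = lambda x: x ** 2

# Two proposals, sampled in fixed proportions (a "multiple proposal" scheme)
means, sds, ns = [-1.0, 2.0], [1.0, 2.0], [5000, 5000]
x = np.concatenate([rng.normal(m, s, size=k) for m, s, k in zip(means, sds, ns)])
q = np.stack([npdf(x, m, s) for m, s in zip(means, sds)])      # proposal densities
q_mix = (ns[0] * q[0] + ns[1] * q[1]) / sum(ns)                # deterministic mixture

w = npdf(x, 0.0, 1.0) / q_mix
plain = np.mean(f(x) * w)            # multiple-proposal estimator, no control variates

# Control variates: each q_j / q_mix has known expectation 1 under the mixture
h = np.column_stack([qj / q_mix - 1.0 for qj in q])
beta, *_ = np.linalg.lstsq(h, f(x) * w, rcond=None)
cv = np.mean(f(x) * w - h @ beta)    # regression-adjusted estimator
```

Because ns[0]*h[:,0] + ns[1]*h[:,1] = 0 by construction, the control-variate matrix is rank-deficient; `lstsq` returns a minimum-norm coefficient vector, which is harmless here.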

Charlotte Baey (Ecole Centrale Paris): Stochastic algorithms for nonlinear mixed models

November 27, 2013

There is a strong genetic variability among plants, even of the same variety, which, combined with locally varying climatic effects in a given field, can lead to the development of highly different neighbouring plants. This is one of the reasons why population-based methods for modelling plant growth are of great interest. The functional-structural plant growth model GreenLab has already been shown to describe plant growth dynamics successfully, primarily at the individual level.
In this study, we extend its formulation to the population level. To model deviations from fixed but unknown biophysical and genetic parameters, we introduce random effects. The resulting model can be cast into the framework of nonlinear mixed models, which can be seen as a particular type of incomplete data model.
A stochastic variant of an EM-type (Expectation-Maximization) algorithm is generally needed to perform maximum likelihood estimation for this type of model. Under some assumptions, the complete-data distribution belongs to a subclass of the exponential family of distributions for which the M-step can be solved explicitly. In such cases, interest focuses on the best approximation of the E-step by competing simulation methods. In this direction, we compare two commonly used stochastic algorithms: the automated Monte Carlo EM (MCEM) algorithm and the SAEM algorithm. The performance of both algorithms is compared on simulated data, and an application to real data from sugar beet plants is also given.
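As a deliberately oversimplified illustration of the MCEM side of the comparison (not the GreenLab model), the sketch below fits y_i = mu + b_i + eps_i with b_i ~ N(0, tau2) and unit noise variance: the complete-data model is exponential-family, so the M-step is explicit, while the E-step expectations are approximated by a Metropolis-Hastings chain over the random effects. Chain lengths and the proposal scale are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy random-effects model: y_i = mu + b_i + eps_i, b_i ~ N(0, tau2), eps_i ~ N(0, 1)
n = 200
true_mu, true_tau2 = 2.0, 0.5
y = true_mu + rng.normal(0, np.sqrt(true_tau2), n) + rng.normal(0, 1, n)

mu, tau2 = 0.0, 1.0
for em_iter in range(50):
    # E-step: one MH chain per unit targeting p(b_i | y_i, mu, tau2), vectorized
    b = np.zeros(n)
    b_sum, b2_sum, m = np.zeros(n), np.zeros(n), 0
    for s in range(200):
        prop = b + rng.normal(0, 0.5, n)
        logr = (-0.5 * (y - mu - prop) ** 2 - 0.5 * prop ** 2 / tau2) \
             - (-0.5 * (y - mu - b) ** 2 - 0.5 * b ** 2 / tau2)
        acc = np.log(rng.uniform(size=n)) < logr
        b = np.where(acc, prop, b)
        if s >= 100:                      # discard burn-in of each inner chain
            b_sum += b; b2_sum += b ** 2; m += 1
    # M-step: explicit exponential-family updates given the Monte Carlo E-step
    mu = np.mean(y - b_sum / m)
    tau2 = np.mean(b2_sum / m)
```

Here the conditional of b_i is actually Gaussian, so the Monte Carlo E-step could be replaced by an exact one; that tractability is what makes this a convenient sanity check for MCEM-style code.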

Henrik Nyman (Åbo Akademi University): Stratified Gaussian Graphical Models

November 22, 2013

Gaussian graphical models represent the backbone of the statistical toolbox for analyzing continuous multivariate systems. However, due to the intrinsic properties of the multivariate normal distribution, use of this model family may hide certain forms of context-specific independence that are natural to consider from an applied perspective. Such independencies were earlier introduced to generalize discrete graphical models and Bayesian networks into more flexible model families. We will present a class of models that incorporates the idea of context-specific independence into Gaussian graphical models by introducing a stratification of the Euclidean space such that a conditional independence may hold in certain segments but be absent elsewhere. Additionally, a non-reversible MCMC approach is used to infer the model structure best suited to represent the dependence structure found in a given dataset.

Joint work with Johan Pensar and Jukka Corander.

Johan Dahlin (Linköping University): Particle Metropolis-Hastings using second-order proposals

November 4, 2013

We propose an improved proposal distribution for the Particle Metropolis-Hastings (PMH) algorithm for Bayesian parameter inference in nonlinear state space models. This proposal incorporates second-order information about the posterior distribution over the system parameters, which can be extracted from the particle filter used in the PMH algorithm. This makes the algorithm scale-invariant, simpler to calibrate, and shortens the burn-in phase. We also suggest improvements that reduce the computational complexity of our earlier first-order method: the complexity of the previous method is quadratic in the number of particles, whereas the new second-order method is linear.

Joint work of Johan Dahlin (Linköping University), Fredrik Lindsten (Linköping University), and Thomas B. Schön (Uppsala University)
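For context, here is a minimal particle Metropolis-Hastings loop on a toy linear-Gaussian state space model, with a plain random-walk proposal; the talk's contribution is precisely to replace this random walk with a second-order proposal extracted from the same particle filter. Model, particle count and step size are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy SSM: x_t = a*x_{t-1} + v_t, y_t = x_t + e_t, with v_t, e_t ~ N(0, 1)
T, a_true = 100, 0.7
x = np.zeros(T)
for t in range(1, T):
    x[t] = a_true * x[t - 1] + rng.normal()
y = x + rng.normal(size=T)

def pf_loglik(a, N=200):
    # bootstrap particle filter estimate of log p(y | a)
    part = rng.normal(size=N)
    ll = 0.0
    for t in range(T):
        part = a * part + rng.normal(size=N)
        logw = -0.5 * (y[t] - part) ** 2 - 0.5 * np.log(2 * np.pi)
        c = logw.max()
        w = np.exp(logw - c)
        ll += c + np.log(w.mean())
        part = part[rng.choice(N, N, p=w / w.sum())]   # multinomial resampling
    return ll

# PMH: Metropolis-Hastings on a, driven by the unbiased PF likelihood estimate
a, ll = 0.0, pf_loglik(0.0)
chain = []
for it in range(500):
    a_prop = a + rng.normal(scale=0.1)                 # random-walk proposal
    ll_prop = pf_loglik(a_prop)
    if np.log(rng.uniform()) < ll_prop - ll:           # flat prior on a
        a, ll = a_prop, ll_prop
    chain.append(a)

post_mean = np.mean(chain[200:])
```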

Wojciech Niemiro (Nicolaus Copernicus University, Torun, and University of Warsaw, Poland): Adaptive Monte Carlo Maximum Likelihood based on Importance Sampling with Resampling

October 31, 2013

We consider the problem of computing the maximum likelihood estimate in a model where the likelihood involves an intractable normalizing constant. This constant is computed via Monte Carlo. Straightforward importance sampling in this context is usually inefficient because of weight degeneracy. We propose an adaptive scheme in which resampling combined with MCMC is used to overcome the weight degeneracy problem. The asymptotic behaviour of our algorithm is examined.

Joint work with Jan Palczewski

Keywords: Adaptation, Maximum Likelihood, Normalizing constant, Martingale differences
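A static (non-adaptive) Monte Carlo maximum likelihood scheme is easy to sketch, and makes the degeneracy visible: all importance draws come from a single reference parameter theta0, so the weights exp((theta - theta0) x) degenerate as theta moves away from theta0 — this is what the adaptive resampling-plus-MCMC scheme addresses. The toy exponential family below (where Z is in fact known) is an assumption for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy exponential family p(x | theta) = exp(theta * x) N(x; 0, 1) / Z(theta):
# here Z(theta) = exp(theta^2 / 2) is known, but we treat it as intractable
# and estimate Z(theta) / Z(theta0) by importance sampling from p(. | theta0).
theta_star, theta0 = 1.0, 0.0
data = rng.normal(theta_star, 1.0, size=1000)    # p(. | theta) is N(theta, 1)
xs = rng.normal(theta0, 1.0, size=5000)          # importance draws at theta0

thetas = np.linspace(-3.0, 3.0, 601)
# log of the IS estimate of Z(theta) / Z(theta0), one entry per candidate theta
log_ratio = np.log(np.mean(np.exp((thetas[:, None] - theta0) * xs[None, :]), axis=1))
mc_loglik = thetas * data.mean() - log_ratio
theta_hat = thetas[np.argmax(mc_loglik)]         # Monte Carlo MLE on the grid
```

The Monte Carlo log-likelihood is accurate near theta0 and increasingly noisy far from it; the adaptive scheme of the abstract moves the sampling parameter and resamples instead of relying on one fixed theta0.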

Zacharie Naulet (CEA): Nonparametric Bayesian kernel-based function estimation

October 25, 2013

We consider the inverse (ill-posed) problem of Quantum Homodyne Tomography (see Artilès, 2005) from a fully Bayesian point of view. In our context, the functions of interest are complex-valued wave-functions belonging to L2(R). We use an approach inspired by the stochastic expansion over a continuous dictionary introduced by Abramovich et al. (2000) and later generalized by Wolpert et al. (2011). The basic idea is to represent the function of interest as a weighted sum of building blocks, generally a kernel function with arbitrary parameters. We propose a method relying on group representations to build efficient kernels, giving expansions that are dense in large families of well-known Banach spaces. Hence our model could be useful for many other applications. As noted by Wolpert et al., an approximation of the prior is needed for computational purposes, because no one-to-one link between the components of the expansion and the observed data is possible. They propose an RJMCMC algorithm relying on the truncation of components whose weights fall below a specified threshold, with general Lévy random fields as a prior. In the case of Lévy random fields built from the Gamma process, we propose an inference scheme based on the Dirichlet process (up to a Gamma-distributed scale), for which a number of efficient posterior sampling algorithms have been proposed (see Neal, 2000; Ishwaran & James, 2001). Concerning the approximation, instead of specifying a threshold for small components, we believe the particle approximation of Favaro et al. (2012) may be a better candidate. Going back to our initial problem, preliminary results show that our model performs well, even in what we believe to be the worst case, namely wave-functions given by Hermite polynomials.

This is joint work with Eric Barat, Judith Rousseau and Trong T. Truong

Kody Law (King Abdullah University): Dimension-independent likelihood-informed MCMC samplers

October 23, 2013

Many Bayesian inference problems require exploring the posterior distribution of high-dimensional parameters, which in principle can be described as functions. Formulating algorithms that are defined on function space yields dimension-independent algorithms. By exploiting the intrinsic low dimensionality of the likelihood function, we introduce a newly developed suite of proposals for the Metropolis-Hastings MCMC algorithm that can adapt to the complex structure of the posterior distribution, yet are defined on function space. I will present numerical examples indicating the efficiency of these dimension-independent likelihood-informed samplers. I will also present some applications of function-space samplers to problems relevant to numerical weather prediction and subsurface reconstruction.
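A simple prior-reversible member of this family of function-space samplers is the preconditioned Crank-Nicolson (pCN) proposal (not the likelihood-informed proposals of the talk, but the baseline they improve on): because the proposal preserves the Gaussian prior, the acceptance ratio involves only the likelihood and remains well behaved as the discretization is refined. Grid size, covariance kernel and the observation setup below are assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)

# Discretized "function space": u evaluated on a grid, with Gaussian prior
# N(0, C) built from a squared-exponential covariance (plus jitter).
d = 200
s = np.linspace(0, 1, d)
C = np.exp(-(s[:, None] - s[None, :]) ** 2 / (2 * 0.1 ** 2)) + 1e-6 * np.eye(d)
L = np.linalg.cholesky(C)

obs_idx = [50, 100, 150]                  # noisy point observations of u
y_obs = np.array([1.0, -0.5, 0.5])
noise_sd = 0.2

def phi(u):
    # negative log-likelihood (data misfit)
    return 0.5 * np.sum((u[obs_idx] - y_obs) ** 2) / noise_sd ** 2

beta = 0.2                                # pCN step size (assumption)
u = L @ rng.normal(size=d)                # start from a prior draw
keep = []
for it in range(5000):
    # pCN proposal: prior-reversible, so the acceptance ratio uses only phi
    u_prop = np.sqrt(1 - beta ** 2) * u + beta * (L @ rng.normal(size=d))
    if np.log(rng.uniform()) < phi(u) - phi(u_prop):
        u = u_prop
    if it >= 1000:
        keep.append(u[obs_idx].copy())

post_mean = np.mean(keep, axis=0)         # posterior mean at observed locations
```

Nothing in the accept/reject step depends on d, which is the sense in which the sampler is dimension-independent; the likelihood-informed proposals of the talk additionally adapt to the data-informed subspace.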

Alberto Caimo (USI Lugano): Bayesian modeling of network heterogeneity

October 23, 2013

With respect to the available statistical models for cross-sectional network data, one may roughly distinguish between two strands: (a) models in which the existence of an edge depends on nodal random effects, and (b) models in which the existence of an edge also depends on the local network structure. Strand (a) comprises the p1 and p2 models, while strand (b) is based on exponential random graph models (ERGMs). We present a comprehensive inferential framework for Bayesian ERGMs with nodal random effects in order to account for both the global dependence structure and network heterogeneity. Parameter inference and model selection procedures are based on an approximate exchange algorithm and its trans-dimensional extension.

Keywords: social network analysis, network heterogeneity, exponential random graphs, exchange algorithm.

Joint work with Stephanie Thiemichen (LMU Munich), Goeran Kauermann (LMU Munich) and Nial Friel (UCD Dublin)
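The (approximate) exchange step is easy to state in an exponential-family toy: below, an Ising-style model on a small grid stands in for an ERGM, the exact auxiliary draw is replaced by a short Gibbs run (hence "approximate" exchange), and the acceptance ratio involves only sufficient statistics, never the normalising constant. Grid size, true parameter and run lengths are assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)

# Ising-style stand-in for an ERGM: p(y | theta) ∝ exp(theta * s(y)), where
# s(y) counts agreeing neighbour pairs on an n x n binary grid (Z intractable).
n = 10

def suff(y):
    return np.sum(y[:-1, :] == y[1:, :]) + np.sum(y[:, :-1] == y[:, 1:])

def gibbs(y, theta, sweeps):
    y = y.copy()
    for _ in range(sweeps):
        for i in range(n):
            for j in range(n):
                agree1 = agree0 = 0
                for a, b in ((i-1, j), (i+1, j), (i, j-1), (i, j+1)):
                    if 0 <= a < n and 0 <= b < n:
                        agree1 += y[a, b] == 1
                        agree0 += y[a, b] == 0
                d = theta * (agree1 - agree0)          # conditional log-odds
                y[i, j] = rng.uniform() < 1.0 / (1.0 + np.exp(-d))
    return y

y_data = gibbs(rng.integers(0, 2, (n, n)), 0.3, 100)   # synthetic data, theta* = 0.3
s_data = suff(y_data)

theta, chain = 0.0, []
for it in range(150):
    theta_prop = theta + rng.normal(scale=0.1)
    y_aux = gibbs(rng.integers(0, 2, (n, n)), theta_prop, 20)  # approximate draw
    # exchange ratio in the exponential family (flat prior on theta)
    if np.log(rng.uniform()) < (theta_prop - theta) * (s_data - suff(y_aux)):
        theta = theta_prop
    chain.append(theta)

theta_est = np.mean(chain[50:])
```

For ERGMs the auxiliary simulation would be an MCMC run over graphs and s(y) a vector of network statistics, but the accept/reject structure is the same.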

Pierre Minvielle (CEA): Particle MCMC for Inverse Scattering in microwave control

October 19, 2013

This talk considers the estimation of local radioelectric properties from global electromagnetic scattering measurements. This challenging, ill-posed, high-dimensional inverse problem can be explored through intensive computations with a parallel Maxwell solver on a petascale supercomputer. I will present applied work extending [1]. It consists in investigating how Particle Marginal Metropolis-Hastings (PMMH), including a Rao-Blackwellised SMC algorithm based on interacting Kalman filters, can perform inference on material properties and determine a multiple-component (Debye relaxation / Lorentzian resonant) material model. I will give illustrations, discuss practical issues and finally propose different ways to adapt this computationally intensive approach to higher-dimensional problems.

Joint work with A. Todeschini, F. Caron and P. Del Moral

[1] F. Giraud, P. Minvielle and P. Del Moral, Advanced interacting sequential Monte Carlo sampling for inverse scattering, IOP Inverse Problems, vol. 29, 2013 [doi:10.1088/0266-5611/29/9/095014]

Matt Moores (QUT): Scalable Bayesian computation for intractable likelihoods in image analysis

August 27, 2013

The inverse temperature hyperparameter of the hidden Potts model governs the strength of spatial cohesion and therefore has a substantial influence on the resulting model fit. The difficulty arises from the dependence of an intractable normalising constant on the value of the inverse temperature, so there is no closed-form solution for sampling from the distribution directly. We review three computational approaches for addressing this issue, namely pseudolikelihood, path sampling, and the approximate exchange algorithm. We compare the accuracy and scalability of these methods using a simulation study.

This is joint work with Clair Alston, Kerrie Mengersen.
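Of the three approaches, pseudolikelihood is the simplest to sketch: replace the intractable joint by the product of full conditionals, which depends on the inverse temperature only through per-site neighbour counts. The sketch below does this for a small q-state Potts field; the grid size, q, the true inverse temperature and the Gibbs data generation (only an approximate draw from the model) are assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)

# q-state Potts field on an n x n grid: p(z | beta) ∝ exp(beta * #agreeing pairs)
n, q, beta_true = 16, 3, 0.4

# generate an (approximate) realisation by single-site Gibbs sampling
z = rng.integers(0, q, (n, n))
for sweep in range(60):
    for i in range(n):
        for j in range(n):
            cnt = np.zeros(q)
            for a, b in ((i-1, j), (i+1, j), (i, j-1), (i, j+1)):
                if 0 <= a < n and 0 <= b < n:
                    cnt[z[a, b]] += 1
            p = np.exp(beta_true * cnt)
            z[i, j] = rng.choice(q, p=p / p.sum())

# c[k, i, j] = number of neighbours of site (i, j) carrying label k
c = np.zeros((q, n, n))
for k in range(q):
    m = np.zeros((n, n))
    m[:-1, :] += z[1:, :] == k
    m[1:, :] += z[:-1, :] == k
    m[:, :-1] += z[:, 1:] == k
    m[:, 1:] += z[:, :-1] == k
    c[k] = m

def pseudo_loglik(beta):
    # sum over sites of log p(z_ij | neighbours); tractable in beta
    log_norm = np.log(np.exp(beta * c).sum(axis=0))
    picked = beta * np.take_along_axis(c, z[None], axis=0)[0]
    return (picked - log_norm).sum()

betas = np.linspace(0.0, 1.5, 151)
beta_hat = betas[np.argmax([pseudo_loglik(b) for b in betas])]   # PL estimate
```

Pseudolikelihood avoids the normalising constant entirely, which is why it scales well; its known cost, discussed in the comparison, is bias for strongly cohesive fields.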