FTP directory for Chris Williams (ckiw@ai.utoronto.ca)

Contents:

* nips8.ps.Z (Jan 96)
  Paper to appear in Proc. NIPS 8
  "Gaussian Processes for Regression"
  Christopher K. I. Williams and Carl Edward Rasmussen
  Abstract: The Bayesian analysis of neural networks is difficult because a
  simple prior over weights implies a complex prior distribution over
  functions. In this paper we investigate the use of Gaussian process priors
  over functions, which permit the predictive Bayesian analysis for fixed
  values of hyperparameters to be carried out exactly using matrix
  operations. Two methods, using optimization and averaging (via Hybrid
  Monte Carlo) over hyperparameters, have been tested on a number of
  challenging problems and have produced excellent results.
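The exact predictive computation mentioned in the abstract above can be illustrated with a minimal sketch of standard GP regression, assuming a squared-exponential covariance and Gaussian noise; the function names and default hyperparameter values here are illustrative, not taken from the paper.

```python
import numpy as np

def sq_exp_kernel(X1, X2, length_scale=1.0, signal_var=1.0):
    # Squared-exponential (RBF) covariance between two sets of inputs.
    d = X1[:, None, :] - X2[None, :, :]
    return signal_var * np.exp(-0.5 * np.sum(d ** 2, axis=2) / length_scale ** 2)

def gp_predict(X_train, y_train, X_test, noise_var=0.1):
    # Exact GP predictive mean and variance via matrix operations.
    K = sq_exp_kernel(X_train, X_train) + noise_var * np.eye(len(X_train))
    K_s = sq_exp_kernel(X_test, X_train)
    K_ss = sq_exp_kernel(X_test, X_test)
    L = np.linalg.cholesky(K)                                  # K = L L^T
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))  # K^{-1} y
    mean = K_s @ alpha
    v = np.linalg.solve(L, K_s.T)
    var = np.diag(K_ss) - np.sum(v ** 2, axis=0)
    return mean, var
```

For fixed hyperparameters this is the whole predictive step: one Cholesky factorization and a few triangular solves, with no iterative training loop.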

* bishop_nips8.ps.Z (Jan 96)
  Paper to appear in Proc. NIPS 8
  "EM Optimization of Latent-Variable Density Models"
  C. M. Bishop, M. S. Svensen and C. K. I. Williams
  Abstract: There is currently considerable interest in developing general
  non-linear density models based on latent, or hidden, variables. Such
  models have the ability to discover the presence of a relatively small
  number of underlying "causes", which, acting in combination, give rise to
  the apparent complexity of the observed data set. Unfortunately, training
  such models generally requires large computational effort. In this paper
  we introduce a novel latent-variable algorithm which retains the general
  non-linear capabilities of previous models but which uses a training
  procedure based on the EM algorithm. We demonstrate the performance of the
  model on a toy problem and on data from flow diagnostics for a multi-phase
  oil pipeline.

* nips7.ps.Z (Jan 95)
  Paper in Proc. NIPS 7
  "Using a neural net to instantiate a deformable model"
  by Christopher K. I. Williams, Michael D. Revow and Geoffrey E. Hinton
  Abstract: Deformable models are an attractive approach to recognizing
  non-rigid objects which have considerable within-class variability.
  However, there are severe search problems associated with fitting the
  models to data. We show that by using neural networks to provide better
  starting points, the search time can be significantly reduced. The method
  is demonstrated on a character recognition task.

* thesis.ps.Z (November 1994)
  PhD thesis
  "Combining deformable models and neural networks for handprinted digit
  recognition"
  by Christopher K. I. Williams
  (also available as thesis.part1.ps.Z and thesis.part2.ps.Z for printers
  with small amounts of memory)

* pami.ps.Z (July 1994)
  Paper submitted to PAMI
  "Using generative models for handwritten digit recognition"
  M. D. Revow, C. K. I. Williams, G. E. Hinton

* williams.unsup.ps.Z (September 1993)
  Paper in AAAI Fall 1993 Workshop on Machine Learning in Computer Vision
  "Discovering objects with unsupervised learning"
  by Christopher K. I. Williams, Richard S. Zemel and Michael C. Mozer
  Abstract: Given a set of images, each of which contains one instance of a
  small but unknown set of objects imaged from a random viewpoint, we show
  how to perform unsupervised learning to discover the object classes. To
  group the data into objects we use a mixture model which is trained with
  the EM algorithm. We have investigated characterizing the probability
  distribution for the features of each object either in terms of an object
  model or by a Gaussian distribution. We compare the performance of these
  two approaches on a dataset containing six different stick-animals, and on
  a dataset consisting of seven hand gestures.
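The mixture model trained with EM in the last entry can be illustrated with a generic sketch: fitting a mixture of isotropic Gaussians by EM. This is a minimal stand-in for the Gaussian-distribution variant described in the abstract (the object-model variant is richer); the function `em_isotropic_gmm` and its parameters are hypothetical, not from the paper.

```python
import numpy as np

def em_isotropic_gmm(X, k, n_iter=100, seed=0):
    # Fit a k-component mixture of isotropic Gaussians to X (n, d) by EM.
    rng = np.random.default_rng(seed)
    n, d = X.shape
    means = X[rng.choice(n, size=k, replace=False)]   # init at random data points
    variances = np.full(k, X.var())                   # one shared scale to start
    weights = np.full(k, 1.0 / k)
    for _ in range(n_iter):
        # E-step: posterior responsibility of each component for each point.
        dist2 = ((X[:, None, :] - means[None, :, :]) ** 2).sum(axis=2)  # (n, k)
        log_p = (np.log(weights)
                 - 0.5 * d * np.log(2 * np.pi * variances)
                 - 0.5 * dist2 / variances)
        log_p -= log_p.max(axis=1, keepdims=True)     # stabilise the exp
        resp = np.exp(log_p)
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: re-estimate mixture parameters from the responsibilities.
        Nk = resp.sum(axis=0)
        weights = Nk / n
        means = (resp.T @ X) / Nk[:, None]
        dist2 = ((X[:, None, :] - means[None, :, :]) ** 2).sum(axis=2)
        variances = (resp * dist2).sum(axis=0) / (d * Nk)
    return weights, means, variances
```

Each EM iteration alternates soft assignment of points to components with closed-form re-estimation, which is what lets the mixture discover cluster structure without labels.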