Bayesian Inference in Neural Networks

Abstract

Approximate marginal Bayesian computation and inference are developed for neural network models. The marginal considerations include determination of approximate Bayes factors for model choice concerning the number of nonlinear sigmoid terms, approximate predictive density computation for a future observable, and determination of approximate Bayes estimates for the nonlinear regression function. Standard conjugate analysis applied to the linear parameters leads to an explicit posterior on the nonlinear parameters. Further marginalisation is performed using Laplace approximations. The choice of prior and the use of an alternative sigmoid lead to posterior invariance in the nonlinear parameter, which is discussed in connection with the lack of sigmoid identifiability. A principal finding is that parsimonious model choice is best determined from the list of modal estimates used in the Laplace approximation of the Bayes factors for various numbers of sigmoids; by comparison, the values of the various Bayes factors are of only secondary importance. The proposed methods are illustrated on two nonlinear datasets that involve, respectively, univariate and multivariate nonlinear regression models.
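The abstract outlines the computational strategy: conjugate analysis for the linear weights, Laplace approximation for the remaining marginalisation, and approximate Bayes factors over the number of sigmoid units. The sketch below is not the authors' algorithm; it is a minimal, fully joint Laplace approximation to the log marginal likelihood for a one-hidden-layer logistic-sigmoid regression with Gaussian priors and a known error variance, intended only to illustrate how approximate Bayes factors for different numbers of sigmoids, and the accompanying modal estimates, can be computed. The toy data, prior variances, and helper names (predict, neg_log_post, laplace_log_marginal) are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Toy univariate data from a smooth nonlinear curve (illustration only).
n = 80
x = rng.uniform(-3, 3, n)
y = np.tanh(1.5 * x) + 0.1 * rng.standard_normal(n)

sigma2 = 0.1 ** 2   # error variance, assumed known for simplicity
tau2 = 4.0          # N(0, tau2) prior variance on all weights

def predict(theta, x, k):
    """Mean of a one-hidden-layer network with k logistic sigmoid units."""
    b0 = theta[0]
    beta = theta[1:1 + k]                 # linear (output) weights
    gam = theta[1 + k:].reshape(k, 2)     # nonlinear (hidden) weights
    h = 1.0 / (1.0 + np.exp(-(gam[:, 0][:, None] + gam[:, 1][:, None] * x)))
    return b0 + beta @ h

def neg_log_post(theta, x, y, k):
    """Negative unnormalised log posterior (log likelihood + log prior)."""
    resid = y - predict(theta, x, k)
    loglik = (-0.5 * np.sum(resid ** 2) / sigma2
              - 0.5 * len(y) * np.log(2 * np.pi * sigma2))
    logpri = (-0.5 * np.sum(theta ** 2) / tau2
              - 0.5 * len(theta) * np.log(2 * np.pi * tau2))
    return -(loglik + logpri)

def hessian_fd(f, theta, eps=1e-4):
    """Central finite-difference Hessian of a scalar function."""
    d = len(theta)
    H = np.zeros((d, d))
    for i in range(d):
        for j in range(d):
            ei, ej = np.zeros(d), np.zeros(d)
            ei[i], ej[j] = eps, eps
            H[i, j] = (f(theta + ei + ej) - f(theta + ei - ej)
                       - f(theta - ei + ej) + f(theta - ei - ej)) / (4 * eps ** 2)
    return H

def laplace_log_marginal(k, restarts=10):
    """Laplace approximation to log p(y | k sigmoid units)."""
    d = 1 + 3 * k
    best = None
    for _ in range(restarts):   # multiple starts: the posterior is multimodal
        theta0 = rng.standard_normal(d)
        res = minimize(neg_log_post, theta0, args=(x, y, k), method="BFGS")
        if best is None or res.fun < best.fun:
            best = res
    H = hessian_fd(lambda t: neg_log_post(t, x, y, k), best.x)
    _, logdet = np.linalg.slogdet(H)
    # log p(y) ~ log p(y, theta_hat) + (d/2) log 2*pi - 0.5 log |H|
    return -best.fun + 0.5 * d * np.log(2 * np.pi) - 0.5 * logdet, best.x

for k in (1, 2, 3):
    logml, mode = laplace_log_marginal(k)
    print(f"k = {k}: approximate log marginal likelihood = {logml:.1f}")
    print("  modal output weights:", np.round(mode[1:1 + k], 2))
```

Differences between the printed log marginal likelihoods give approximate log Bayes factors between numbers of sigmoid units; the modal estimates returned alongside them correspond to the quantities the abstract identifies as most informative for parsimonious model choice.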

Department(s)

Mathematics and Statistics

Keywords and Phrases

Bayesian computation; Laplace approximation; model choice; neural network; predictive density

International Standard Serial Number (ISSN)

0006-3444

Document Type

Article - Journal

Document Version

Citation

File Type

text

Language(s)

English

Rights

© 2001 Oxford University Press, All rights reserved.

Publication Date

01 Jan 2001
