Mixtures : estimation and applications
 Language
 English.
 Imprint
 Chichester, West Sussex : Wiley, 2011.
 Physical description
 xviii, 311 p. : ill. ; 24 cm.
 Series
 Wiley series in probability and statistics.
Access
Available online

Stacks

Unknown
QA273.6 .M59 2011

Contents/Summary
 Bibliography
 Includes bibliographical references and index.
 Contents

 Preface
 Acknowledgements
 List of Contributors
 1 The EM algorithm, variational approximations and expectation propagation for mixtures (D. Michael Titterington): 1.1 Preamble; 1.2 The EM algorithm; 1.3 Variational approximations; 1.4 Expectation propagation; Acknowledgements; References
 2 Online expectation maximisation (Olivier Cappé): 2.1 Introduction; 2.2 Model and assumptions; 2.3 The EM algorithm and the limiting EM recursion; 2.4 Online expectation maximisation; 2.5 Discussion; References
 3 The limiting distribution of the EM test of the order of a finite mixture (J. Chen and Pengfei Li): 3.1 Introduction; 3.2 The method and theory of the EM test; 3.3 Proofs; 3.4 Discussion; References
 4 Comparing Wald and likelihood regions applied to locally identifiable mixture models (Daeyoung Kim and Bruce G. Lindsay): 4.1 Introduction; 4.2 Background on likelihood confidence regions; 4.3 Background on simulation and visualisation of the likelihood regions; 4.4 Comparison between the likelihood regions and the Wald regions; 4.5 Application to a finite mixture model; 4.6 Data analysis; 4.7 Discussion; References
 5 Mixture of experts modelling with social science applications (Isobel Claire Gormley and Thomas Brendan Murphy): 5.1 Introduction; 5.2 Motivating examples; 5.3 Mixture models; 5.4 Mixture of experts models; 5.5 A mixture of experts model for ranked preference data; 5.6 A mixture of experts latent position cluster model; 5.7 Discussion; Acknowledgements; References
 6 Modelling conditional densities using finite smooth mixtures (Feng Li, Mattias Villani and Robert Kohn): 6.1 Introduction; 6.2 The model and prior; 6.3 Inference methodology; 6.4 Applications; 6.5 Conclusions; Acknowledgements; Appendix: Implementation details for the gamma and lognormal models; References
 7 Nonparametric mixed membership modelling using the IBP compound Dirichlet process (Sinead Williamson, Chong Wang, Katherine A. Heller and David M. Blei): 7.1 Introduction; 7.2 Mixed membership models; 7.3 Motivation; 7.4 Decorrelating prevalence and proportion; 7.5 Related models; 7.6 Empirical studies; 7.7 Discussion; References
 8 Discovering non-binary hierarchical structures with Bayesian rose trees (Charles Blundell, Yee Whye Teh and Katherine A. Heller): 8.1 Introduction; 8.2 Prior work; 8.3 Rose trees, partitions and mixtures; 8.4 Greedy construction of Bayesian rose tree mixtures; 8.5 Bayesian hierarchical clustering, Dirichlet process models and product partition models; 8.6 Results; 8.7 Discussion; References
 9 Mixtures of factor analyzers for the analysis of high-dimensional data (Geoffrey J. McLachlan, Jangsun Baek and Suren I. Rathnayake): 9.1 Introduction; 9.2 Single-factor analysis model; 9.3 Mixtures of factor analyzers; 9.4 Mixtures of common factor analyzers (MCFA); 9.5 Some related approaches; 9.6 Fitting of factor-analytic models; 9.7 Choice of the number of factors q; 9.8 Example; 9.9 Low-dimensional plots via the MCFA approach; 9.10 Multivariate t factor analyzers; 9.11 Discussion; Appendix; References
 10 Dealing with label switching under model uncertainty (Sylvia Frühwirth-Schnatter): 10.1 Introduction; 10.2 Labelling through clustering in the point-process representation; 10.3 Identifying mixtures when the number of components is unknown; 10.4 Overfitting heterogeneity of component-specific parameters; 10.5 Concluding remarks; References
 11 Exact Bayesian analysis of mixtures (Christian P. Robert and Kerrie L. Mengersen): 11.1 Introduction; 11.2 Formal derivation of the posterior distribution; References
 12 Manifold MCMC for mixtures (Vassilios Stathopoulos and Mark Girolami): 12.1 Introduction; 12.2 Markov chain Monte Carlo methods; 12.3 Finite Gaussian mixture models; 12.4 Experiments; 12.5 Discussion; Acknowledgements; Appendix; References
 13 How many components in a finite mixture? (Murray Aitkin): 13.1 Introduction; 13.2 The galaxy data; 13.3 The normal mixture model; 13.4 Bayesian analyses; 13.5 Posterior distributions for K (for flat prior); 13.6 Conclusions from the Bayesian analyses; 13.7 Posterior distributions of the model deviances; 13.8 Asymptotic distributions; 13.9 Posterior deviances for the galaxy data; 13.10 Conclusion; References
 14 Bayesian mixture models: a blood-free dissection of a sheep (Clair L. Alston, Kerrie L. Mengersen and Graham E. Gardner): 14.1 Introduction; 14.2 Mixture models; 14.3 Altering dimensions of the mixture model; 14.4 Bayesian mixture model incorporating spatial information; 14.5 Volume calculation; 14.6 Discussion; References
 Index.
 (source: Nielsen Book Data)
 Publisher's Summary
 This book uses the EM (expectation-maximization) algorithm to estimate simultaneously the missing data and the unknown parameters associated with a data set. The parameters describe the component distributions of the mixture; the distributions may be continuous or discrete. The editors provide a complete account of the applications, mathematical structure and statistical analysis of finite mixture distributions, along with MCMC computational methods and a range of detailed discussions of applications, with chapters contributed by leading experts on the subject. The applications are drawn from a range of scientific disciplines, including biostatistics, computer science, ecology and finance. This area of statistics is important to many disciplines, and its methodology attracts interest from researchers in the fields in which it can be applied.
(source: Nielsen Book Data)
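As a concrete illustration of the EM estimation the summary describes (a minimal sketch, not code from the book), the E-step computes each component's posterior responsibility for each observation, and the M-step re-estimates weights, means and variances from those responsibilities. The two-component univariate Gaussian case, with an ad hoc quartile-based initialisation, looks like this:

```python
import math
import random

def em_gmm2(data, iters=50):
    """EM for a two-component univariate Gaussian mixture.
    Returns (weights, means, variances)."""
    xs = sorted(data)
    n = len(xs)
    # Crude initialisation: place the means at the lower and upper quartiles.
    mu = [xs[n // 4], xs[3 * n // 4]]
    var = [1.0, 1.0]
    w = [0.5, 0.5]
    for _ in range(iters):
        # E-step: posterior responsibility of each component for each point.
        resp = []
        for x in xs:
            dens = [w[k] / math.sqrt(2 * math.pi * var[k])
                    * math.exp(-(x - mu[k]) ** 2 / (2 * var[k]))
                    for k in range(2)]
            s = sum(dens)
            resp.append([d / s for d in dens])
        # M-step: re-estimate weights, means, variances from responsibilities.
        for k in range(2):
            nk = sum(r[k] for r in resp)
            w[k] = nk / n
            mu[k] = sum(r[k] * x for r, x in zip(resp, xs)) / nk
            var[k] = sum(r[k] * (x - mu[k]) ** 2 for r, x in zip(resp, xs)) / nk
            var[k] = max(var[k], 1e-6)  # guard against degenerate components
    return w, mu, var

# Example: simulated data from two well-separated components.
random.seed(0)
data = ([random.gauss(0.0, 1.0) for _ in range(200)]
        + [random.gauss(5.0, 1.0) for _ in range(200)])
w, mu, var = em_gmm2(data)
```

On this simulated sample the fitted means land near 0 and 5 and the weights near 0.5 each. The function name `em_gmm2` and the initialisation scheme are illustrative choices, not conventions from the book.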
Subjects
Bibliographic information
 Publication date
 2011
 Responsibility
 edited by Kerrie L. Mengersen, Christian P. Robert, D. Michael Titterington.
 Series
 Wiley series in probability and statistics
 ISBN
 9781119993896 (cloth)
 111999389X (cloth)
 9781119995685 (ePDF)
 111999568X (ePDF)
 9781119995678 (oBook)
 1119995671 (oBook)
 9781119998440 (ePub)
 1119998441 (ePub)
 9781119998457 (Mobi)
 111999845X (Mobi)