ALBERT

All Library Books, journals and Electronic Records Telegrafenberg

    A first course in Bayesian statistical methods / Peter D. Hoff
    Monograph available for loan
    Dordrecht : Springer
    Call number: AWI S2-18-91494
    Description / Table of Contents: This book provides a compact self-contained introduction to the theory and application of Bayesian statistical methods. The book is accessible to readers having a basic familiarity with probability, yet allows more advanced readers to quickly grasp the principles underlying Bayesian theory and methods. The examples and computer code allow the reader to understand and implement basic Bayesian data analyses using standard statistical models and to extend the standard models to specialized data analysis situations. The book begins with fundamental notions such as probability, exchangeability and Bayes' rule, and ends with modern topics such as variable selection in regression, generalized linear mixed effects models, and semiparametric copula estimation. Numerous examples from the social, biological and physical sciences show how to implement these methodologies in practice. Monte Carlo summaries of posterior distributions play an important role in Bayesian data analysis. The open-source R statistical computing environment provides sufficient functionality to make Monte Carlo estimation very easy for a large number of statistical models, and example R code is provided throughout the text. Much of the example code can be run "as is" in R, and essentially all of it can be run after downloading the relevant datasets from the companion website for this book. Peter Hoff is an Associate Professor of Statistics and Biostatistics at the University of Washington. He has developed a variety of Bayesian methods for multivariate data, including covariance and copula estimation, cluster analysis, mixture modeling and social network analysis. He is on the editorial board of the Annals of Applied Statistics.
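The description emphasizes Monte Carlo summaries of posterior distributions, with the book's own examples written in R. As an illustrative sketch only (in Python rather than R, and not taken from the book; the data values y and n are made up), a conjugate beta-binomial posterior can be summarized by Monte Carlo like this:

```python
import random

random.seed(1)

# With a uniform Beta(1, 1) prior on a binomial probability theta,
# observing y successes in n trials gives the posterior Beta(1+y, 1+n-y).
y, n = 2, 20
a, b = 1 + y, 1 + (n - y)

# Monte Carlo: draw from the posterior and summarize the draws.
draws = sorted(random.betavariate(a, b) for _ in range(10_000))
post_mean = sum(draws) / len(draws)
ci_95 = (draws[249], draws[9_749])  # empirical 2.5% and 97.5% quantiles
```

The point of the Monte Carlo approach, as the description suggests, is that the same few lines (sample, then summarize) carry over to models where no closed-form posterior summaries exist.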
    Type of Medium: Monograph available for loan
    Pages: IX, 270 pages, illustrations
    ISBN: 9780387922997 (hardcover), 9780387924076 (electronic)
    Series Statement: Springer texts in statistics
    Language: English
    Note: Contents:
    1 Introduction and examples. - 1.1 Introduction. - 1.2 Why Bayes? - 1.2.1 Estimating the probability of a rare event. - 1.2.2 Building a predictive model. - 1.3 Where we are going. - 1.4 Discussion and further references.
    2 Belief, probability and exchangeability. - 2.1 Belief functions and probabilities. - 2.2 Events, partitions and Bayes' rule. - 2.3 Independence. - 2.4 Random variables. - 2.4.1 Discrete random variables. - 2.4.2 Continuous random variables. - 2.4.3 Descriptions of distributions. - 2.5 Joint distributions. - 2.6 Independent random variables. - 2.7 Exchangeability. - 2.8 de Finetti's theorem. - 2.9 Discussion and further references.
    3 One-parameter models. - 3.1 The binomial model. - 3.1.1 Inference for exchangeable binary data. - 3.1.2 Confidence regions. - 3.2 The Poisson model. - 3.2.1 Posterior inference. - 3.2.2 Example: Birth rates. - 3.3 Exponential families and conjugate priors. - 3.4 Discussion and further references.
    4 Monte Carlo approximation. - 4.1 The Monte Carlo method. - 4.2 Posterior inference for arbitrary functions. - 4.3 Sampling from predictive distributions. - 4.4 Posterior predictive model checking. - 4.5 Discussion and further references.
    5 The normal model. - 5.1 The normal model. - 5.2 Inference for the mean, conditional on the variance. - 5.3 Joint inference for the mean and variance. - 5.4 Bias, variance and mean squared error. - 5.5 Prior specification based on expectations. - 5.6 The normal model for non-normal data. - 5.7 Discussion and further references.
    6 Posterior approximation with the Gibbs sampler. - 6.1 A semiconjugate prior distribution. - 6.2 Discrete approximations. - 6.3 Sampling from the conditional distributions. - 6.4 Gibbs sampling. - 6.5 General properties of the Gibbs sampler. - 6.6 Introduction to MCMC diagnostics. - 6.7 Discussion and further references.
    7 The multivariate normal model. - 7.1 The multivariate normal density. - 7.2 A semiconjugate prior distribution for the mean. - 7.3 The inverse-Wishart distribution. - 7.4 Gibbs sampling of the mean and covariance. - 7.5 Missing data and imputation. - 7.6 Discussion and further references.
    8 Group comparisons and hierarchical modeling. - 8.1 Comparing two groups. - 8.2 Comparing multiple groups. - 8.2.1 Exchangeability and hierarchical models. - 8.3 The hierarchical normal model. - 8.3.1 Posterior inference. - 8.4 Example: Math scores in U.S. public schools. - 8.4.1 Prior distributions and posterior approximation. - 8.4.2 Posterior summaries and shrinkage. - 8.5 Hierarchical modeling of means and variances. - 8.5.1 Analysis of math score data. - 8.6 Discussion and further references.
    9 Linear regression. - 9.1 The linear regression model. - 9.1.1 Least squares estimation for the oxygen uptake data. - 9.2 Bayesian estimation for a regression model. - 9.2.1 A semiconjugate prior distribution. - 9.2.2 Default and weakly informative prior distributions. - 9.3 Model selection. - 9.3.1 Bayesian model comparison. - 9.3.2 Gibbs sampling and model averaging. - 9.4 Discussion and further references.
    10 Nonconjugate priors and Metropolis-Hastings algorithms. - 10.1 Generalized linear models. - 10.2 The Metropolis algorithm. - 10.3 The Metropolis algorithm for Poisson regression. - 10.4 Metropolis, Metropolis-Hastings and Gibbs. - 10.4.1 The Metropolis-Hastings algorithm. - 10.4.2 Why does the Metropolis-Hastings algorithm work? - 10.5 Combining the Metropolis and Gibbs algorithms. - 10.5.1 A regression model with correlated errors. - 10.5.2 Analysis of the ice core data. - …
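The contents list names the Metropolis algorithm (Chapter 10) among the posterior-approximation methods covered. As a minimal sketch of that idea only (Python rather than the book's R, and the standard-normal target is an arbitrary illustration, not one of the book's examples), a random-walk Metropolis sampler looks like this:

```python
import math
import random

random.seed(0)

def log_target(x):
    # Log-density of the target, up to an additive constant.
    # Here: a standard normal, chosen purely for illustration.
    return -0.5 * x * x

def metropolis(n_iter=20_000, step=1.0):
    """Random-walk Metropolis: propose x' ~ N(x, step^2), accept with
    probability min(1, target(x') / target(x))."""
    x = 0.0
    samples = []
    for _ in range(n_iter):
        proposal = x + random.gauss(0.0, step)
        if math.log(random.random()) < log_target(proposal) - log_target(x):
            x = proposal  # accept; otherwise keep the current state
        samples.append(x)
    return samples

s = metropolis()
mean = sum(s) / len(s)
var = sum((v - mean) ** 2 for v in s) / len(s)
```

Because only the ratio of target densities enters the acceptance step, the normalizing constant of the posterior is never needed, which is what makes the method usable with the nonconjugate priors this chapter discusses.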
    Location: AWI Reading room
    Branch Library: AWI Library