Learning probabilistic graphical models in R: familiarize yourself with probabilistic graphical models through real-world problems and illustrative code examples in R

Familiarize yourself with probabilistic graphical models through real-world problems and illustrative code examples in R. About This Book: Predict and use a probabilistic graphical model (PGM) as an expert system; comprehend how your computer can learn Bayesian modeling to solve real-world problems; Kn...

Overview

Bibliographic Details
Main Author: Bellot, David (Author)
Format: Licensed eBooks
Language: English
Published: Birmingham, UK: Packt Publishing, 2016.
Series: Community experience distilled.
Online Access: https://search.ebscohost.com/login.aspx?direct=true&scope=site&db=nlebk&AN=1230623
Table of Contents:
  • Cover
  • Copyright
  • Credits
  • About the Author
  • About the Reviewers
  • www.PacktPub.com
  • Table of Contents
  • Preface
  • Chapter 1: Probabilistic Reasoning
  • Machine learning
  • Representing uncertainty with probabilities
  • Beliefs and uncertainty as probabilities
  • Conditional probability
  • Probability calculus and random variables
  • Sample space, events, and probability
  • Random variables and probability calculus
  • Joint probability distributions
  • Bayes' rule
  • Interpreting the Bayes' formula
  • A first example of Bayes' rule
  • A first example of Bayes' rule in R
  • Probabilistic graphical models
  • Probabilistic models
  • Graphs and conditional independence
  • Factorizing a distribution
  • Directed models
  • Undirected models
  • Examples and applications
  • Summary
  • Chapter 2: Exact Inference
  • Building graphical models
  • Types of random variable
  • Building graphs
  • Probabilistic expert system
  • Basic structures in probabilistic graphical models
  • Variable elimination
  • Sum-product and belief updates
  • The junction tree algorithm
  • Examples of probabilistic graphical models
  • The sprinkler example
  • The medical expert system
  • Models with more than two layers
  • Tree structure
  • Summary
  • Chapter 3: Learning Parameters
  • Introduction
  • Learning by inference
  • Maximum likelihood
  • How are empirical and model distribution related?
  • The ML algorithm and its implementation in R
  • Application
  • Learning with hidden variables: the EM algorithm
  • Latent variables
  • Principles of the EM algorithm
  • Derivation of the EM algorithm
  • Applying EM to graphical models
  • Summary
  • Chapter 4: Bayesian Modeling – Basic Models
  • The Naive Bayes model
  • Representation
  • Learning the Naive Bayes model
  • Bayesian Naive Bayes
  • Beta-Binomial
  • The prior distribution
  • The posterior distribution with the conjugacy property
  • Which values should we choose for the Beta parameters?
  • The Gaussian mixture model
  • Definition
  • Summary
  • Chapter 5: Approximate Inference
  • Sampling from a distribution
  • Basic sampling algorithms
  • Standard distributions
  • Rejection sampling
  • An implementation in R
  • Importance sampling
  • An implementation in R
  • Markov Chain Monte Carlo
  • General idea of the method
  • The Metropolis-Hastings algorithm
  • MCMC for probabilistic graphical models in R
  • Installing Stan and RStan
  • A simple example in RStan
  • Summary
  • Chapter 6: Bayesian Modeling – Linear Models
  • Linear regression
  • Estimating the parameters
  • Bayesian linear models
  • Over-fitting a model
  • Graphical model of a linear model
  • Posterior distribution
  • Implementation in R
  • A stable implementation
  • More packages in R
  • Summary
  • Chapter 7: Probabilistic Mixture Models
  • Mixture models
  • EM for mixture models
  • Mixture of Bernoulli
  • Mixture of experts
  • Latent Dirichlet Allocation
  • The LDA model
  • Variational inference
  • Examples
  • Summary
  • Appendix
  • References
  • Books on the Bayesian theory
  • Books on machine learning
  • Papers
  • Index.