bnpy-dev / Resources

Welcome. This page outlines a community-driven collection of reading lists covering key concepts relevant to bnpy.

Recent publications using bnpy from the Sudderth lab

Theses at Brown University

  • "Variational Inference for Hierarchical Dirichlet Process Based Nonparametric Models". Undergraduate honors thesis by William Stephenson. 2015. [paper]

  • "Parallelization of Variational Inference for Bayesian nonparametric topic models". Undergraduate honors theses by Sonia Phene. 2015. [paper]

  • "Variational Inference for Beta-Bernoulli Dirichlet Process Mixture Models." Master's thesis by Mengrui Ni. 2015. [paper]

  • "Reliable and scalable variational inference for Bayesian nonparametric models". PhD thesis proposal by Michael C. Hughes. 2014. [paper]

Conference publications

  • "Memoized online variational inference for Dirichlet process mixture models." Michael C. Hughes and Erik B. Sudderth. NIPS 2013. [paper] [supplement] [bibtex]

  • "Reliable and scalable variational inference for the hierarchical Dirichlet process." Michael C. Hughes, Dae Il Kim, and Erik B. Sudderth. AISTATS 2015. [paper] [supplement] [bibtex]

  • "Efficient Online Inference for Bayesian Nonparametric Relational Models." Dae Il Kim, Prem Gopalan, David M. Blei and Erik B. Sudderth. NIPS 2013. [paper]

  • "Truly Nonparametric Online Variational Inference for Hierarchical Dirichlet Processes." Michael J. Bryant and Erik B. Sudderth. NIPS 2012. [paper]

Reading Lists

Basics: Simple Clustering

Key concepts: K-means, mixture models, EM algorithm, Gaussian distribution
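As a companion to these readings, here is a minimal, self-contained sketch of the EM algorithm for a spherical Gaussian mixture in plain NumPy. It is an illustration of the concept only, not bnpy's implementation; the function name and all parameter values are made up for this example.

```python
import numpy as np

def em_gmm(X, K=3, n_iters=50, seed=0):
    """Toy EM for a spherical Gaussian mixture (illustration only)."""
    rng = np.random.default_rng(seed)
    N, D = X.shape
    # Initialize means at random data points, unit variances, uniform weights.
    mu = X[rng.choice(N, size=K, replace=False)]
    sigma2 = np.ones(K)
    w = np.full(K, 1.0 / K)
    for _ in range(n_iters):
        # E-step: responsibilities r[n, k] proportional to w_k * N(x_n | mu_k, sigma2_k I).
        sq = ((X[:, None, :] - mu[None, :, :]) ** 2).sum(axis=2)
        logp = np.log(w) - 0.5 * D * np.log(2 * np.pi * sigma2) - 0.5 * sq / sigma2
        logp -= logp.max(axis=1, keepdims=True)
        r = np.exp(logp)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and variances from soft counts.
        Nk = r.sum(axis=0)
        w = Nk / N
        mu = (r.T @ X) / Nk[:, None]
        sq = ((X[:, None, :] - mu[None, :, :]) ** 2).sum(axis=2)
        sigma2 = (r * sq).sum(axis=0) / (D * Nk)
    return w, mu, sigma2

# Usage: two well-separated 2-D blobs; the recovered means should sit near 0 and 5.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, size=(100, 2)), rng.normal(5, 1, size=(100, 2))])
print(em_gmm(X, K=2)[1])
```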

Basics: Topic Models

Key concepts: Dirichlet distribution, topic models, Latent Dirichlet Allocation
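The generative story behind Latent Dirichlet Allocation is easy to see in code. The sketch below samples a toy corpus from the LDA model with NumPy; the function name, vocabulary size, and hyperparameter values are illustrative choices, not tied to bnpy.

```python
import numpy as np

def generate_lda_corpus(n_docs=5, n_words=20, vocab_size=10, K=3,
                        alpha=0.5, beta=0.1, seed=0):
    """Draw a toy corpus from the LDA generative model (illustration only)."""
    rng = np.random.default_rng(seed)
    # Each topic is a Dirichlet-distributed distribution over the vocabulary.
    topics = rng.dirichlet(np.full(vocab_size, beta), size=K)
    corpus = []
    for _ in range(n_docs):
        # Each document mixes topics with Dirichlet-distributed proportions.
        theta = rng.dirichlet(np.full(K, alpha))
        doc = []
        for _ in range(n_words):
            z = rng.choice(K, p=theta)               # choose a topic for this word
            w = rng.choice(vocab_size, p=topics[z])  # choose a word from that topic
            doc.append(w)
        corpus.append(doc)
    return topics, corpus

topics, corpus = generate_lda_corpus()
print(corpus[0])  # word ids for the first toy document
```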

Advanced: Topic Models

Key concepts: online learning, correlations, metadata, nonparametric

Basics: Sequential Markov Models

Key concepts: Markov chains, Hidden Markov Models
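A short sketch of ancestral sampling from a two-state Gaussian-emission hidden Markov model, in NumPy. All numbers here (initial distribution, transition matrix, emission means) are arbitrary values chosen for illustration.

```python
import numpy as np

def sample_hmm(T=10, seed=0):
    """Sample a state/observation sequence from a small Gaussian-emission HMM."""
    rng = np.random.default_rng(seed)
    pi0 = np.array([0.6, 0.4])            # initial state distribution
    A = np.array([[0.9, 0.1],             # state transition matrix
                  [0.2, 0.8]])
    means = np.array([0.0, 5.0])          # per-state Gaussian emission means
    states, obs = [], []
    z = rng.choice(2, p=pi0)
    for _ in range(T):
        states.append(z)
        obs.append(rng.normal(means[z], 1.0))
        z = rng.choice(2, p=A[z])         # Markov transition to the next state
    return np.array(states), np.array(obs)

states, obs = sample_hmm()
print(states)
print(np.round(obs, 2))
```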

Basics: Optimization-based Bayesian learning algorithms

Key concepts: Expectation-maximization, variational inference

Prerequisite: Basics: Simple Clustering
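Both algorithms on this list can be read as coordinate ascent on the same objective, the evidence lower bound (ELBO). A generic statement, with data x, latent variables z, and an approximating distribution q (notation chosen here for orientation, not bnpy-specific):

```latex
\log p(x) \;\ge\; \mathcal{L}(q)
  = \mathbb{E}_{q(z)}\bigl[\log p(x, z) - \log q(z)\bigr]
  = \log p(x) - \mathrm{KL}\bigl(q(z)\,\|\,p(z \mid x)\bigr)
```

EM tightens this bound by setting q to the exact posterior over latent variables; variational inference restricts q to a tractable family and maximizes the same bound within it.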

Basics: Dirichlet distributions

Key concepts: Gamma function, Dirichlet random variables

Advanced: Dirichlet process

Key concepts: stick breaking, Chinese restaurant process
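Stick breaking is compact enough to show directly. The snippet below draws a truncated set of Dirichlet process mixture weights with NumPy; the truncation level and concentration value are arbitrary choices for illustration.

```python
import numpy as np

def stick_breaking_weights(alpha=1.0, n_sticks=20, seed=0):
    """Truncated stick-breaking construction of Dirichlet process weights."""
    rng = np.random.default_rng(seed)
    v = rng.beta(1.0, alpha, size=n_sticks)                  # Beta(1, alpha) stick fractions
    remaining = np.concatenate([[1.0], np.cumprod(1.0 - v)[:-1]])
    return v * remaining                                     # pi_k = v_k * prod_{j<k} (1 - v_j)

# Usage: larger alpha spreads mass over more components.
weights = stick_breaking_weights(alpha=2.0)
print(np.round(weights, 3), "total:", round(weights.sum(), 3))
```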

Advanced: Nonparametric Bayesian models

TODO: [Basics : Sampling-based Bayesian learning algorithms]

TODO: [Basics : Online learning]

General references

Free textbooks

  • "Bayesian Reasoning and Machine Learning" by David Barber. 2012. [pdf].

  • "Information theory, inference, and learning algorithms" by David MacKay. 2003. [pdf]

Digital resources
