Interactive Tutorial on Dirichlet Processes Using R Shiny

For more posts on Bayesian models, Bayesian Nonparametrics, and causal inference, follow me on Twitter @stablemarkets. My advisor and his collaborator are teaching a short course on Bayesian Nonparametric Methods for Causal Inference at JSM next week. As part of the short course, I made an interactive tutorial on Dirichlet Processes using R Shiny. All …
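The tutorial itself is interactive, but as a rough sketch of the kind of object it explores, here is a truncated stick-breaking draw from a Dirichlet process prior in base R. The concentration parameter, truncation level, and standard-normal base measure are arbitrary illustrative choices, not settings taken from the Shiny app.

```r
# Minimal sketch: one truncated stick-breaking draw from a Dirichlet process prior.
# alpha (concentration), K (truncation level), and the N(0, 1) base measure are
# arbitrary illustrative choices.
set.seed(1)
alpha <- 1
K     <- 100

v     <- rbeta(K, 1, alpha)             # stick-breaking proportions
w     <- v * cumprod(c(1, 1 - v[-K]))   # weights w_k = v_k * prod(1 - v_j, j < k)
theta <- rnorm(K)                       # atoms drawn from the base measure

# A DP draw is the discrete distribution putting weight w[k] on atom theta[k]
x <- sample(theta, 1000, replace = TRUE, prob = w)
hist(x, breaks = 50, main = "Samples from one truncated DP draw")
```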

Bayesian Inference with Backfitting MCMC

Previous posts in this series on MCMC samplers for Bayesian inference (in order of publication): Bayesian Simple Linear Regression with Gibbs Sampling in R; Blocked Gibbs Sampling in R for Bayesian Multiple Linear Regression; Metropolis-in-Gibbs Sampling and Runtime Analysis with Profviz; and Speeding up Metropolis-Hastings with Rcpp. All code for this post (and previous posts) is in …

Speeding up Metropolis-Hastings with Rcpp

Previous posts in this series on MCMC samplers for Bayesian inference (in order of publication): Bayesian Simple Linear Regression with Gibbs Sampling in R; Blocked Gibbs Sampling in R for Bayesian Multiple Linear Regression; and Metropolis-in-Gibbs Sampling and Runtime Analysis with Profviz. The code for all of these posts can be found in my BayesianTutorials GitHub …
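The post walks through the details; as a rough sketch of the general idea (not the post's actual code), the expensive inner-loop computation of an MCMC sampler, such as a log-likelihood evaluated at every Metropolis-Hastings iteration, can be moved from R to compiled C++ with Rcpp. The function name loglik_cpp and the logistic-regression likelihood below are illustrative assumptions.

```r
library(Rcpp)

# Illustrative only: compile a C++ log-likelihood for a logistic regression,
# the kind of inner-loop quantity a Metropolis-Hastings sampler evaluates
# thousands of times per run.
cppFunction('
double loglik_cpp(NumericVector y, NumericMatrix X, NumericVector beta) {
  double ll = 0.0;
  for (int i = 0; i < X.nrow(); i++) {
    double eta = 0.0;
    for (int j = 0; j < X.ncol(); j++) eta += X(i, j) * beta[j];
    double p = 1.0 / (1.0 + exp(-eta));
    ll += y[i] * log(p) + (1.0 - y[i]) * log(1.0 - p);
  }
  return ll;
}')

# Compare against a pure-R version on simulated data
set.seed(1)
X <- cbind(1, rnorm(5000))
beta <- c(-0.5, 1)
y <- rbinom(5000, 1, plogis(X %*% beta))

loglik_r <- function(y, X, b) sum(dbinom(y, 1, plogis(X %*% b), log = TRUE))

c(cpp = loglik_cpp(y, X, beta), r = loglik_r(y, X, beta))  # identical values
# microbenchmark::microbenchmark(loglik_cpp(y, X, beta), loglik_r(y, X, beta))
```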

Metropolis-in-Gibbs Sampling and Runtime Analysis with Profviz

First off, here are the previous posts in my Bayesian sampling series: Bayesian Simple Linear Regression with Gibbs Sampling in R and Blocked Gibbs Sampling in R for Bayesian Multiple Linear Regression. In the first post, I illustrated Gibbs sampling, an algorithm for getting draws from a posterior when the conditional posteriors are known. In the …
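As a minimal sketch of what Metropolis-in-Gibbs looks like (illustrative only, not the post's model or code): below, the error variance of a normal model has a closed-form conditional and is updated with a Gibbs step, while the mean gets a non-conjugate Cauchy prior and is updated with a random-walk Metropolis step. The priors and proposal scale are assumptions made for the example.

```r
# Illustrative sketch (not the post's model or code): y ~ N(mu, sigma2) with a
# non-conjugate Cauchy prior on mu, so mu is updated with a random-walk
# Metropolis step while sigma2 has a closed-form (Gibbs) draw.
set.seed(1)
y <- rnorm(200, mean = 2, sd = 1.5)
n <- length(y)

log_post_mu <- function(mu, sigma2) {
  sum(dnorm(y, mu, sqrt(sigma2), log = TRUE)) + dcauchy(mu, 0, 5, log = TRUE)
}

n_iter <- 5000
mu <- 0; sigma2 <- 1
draws <- matrix(NA_real_, n_iter, 2, dimnames = list(NULL, c("mu", "sigma2")))

for (it in 1:n_iter) {
  # Gibbs step: with p(sigma2) proportional to 1/sigma2, the conditional of
  # sigma2 given mu and y is inverse-gamma
  sigma2 <- 1 / rgamma(1, shape = n / 2, rate = sum((y - mu)^2) / 2)

  # Metropolis step: random-walk proposal for mu, accepted on the log scale
  mu_prop <- rnorm(1, mu, 0.3)
  if (log(runif(1)) < log_post_mu(mu_prop, sigma2) - log_post_mu(mu, sigma2)) {
    mu <- mu_prop
  }
  draws[it, ] <- c(mu, sigma2)
}

colMeans(draws[-(1:1000), ])  # roughly c(2, 1.5^2)
# The runtime analysis in the post uses profiling; e.g. profvis::profvis({ ... })
# can wrap a loop like the one above to see where time is spent.
```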

Blocked Gibbs Sampling in R for Bayesian Multiple Linear Regression

In a previous post, I derived and coded a Gibbs sampler in R for estimating a simple linear regression. In this post, I will do the same for multiple linear regression. I will derive the conditional posterior distributions necessary for the blocked Gibbs sampler. I will then code the sampler and test it using simulated …
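To give a rough sense of the structure such a sampler takes (a minimal sketch under assumed conjugate priors, not the derivation or code from the post): with a flat prior on the coefficient vector and p(sigma2) proportional to 1/sigma2, the coefficients can be drawn as one block from a multivariate normal and the error variance from an inverse-gamma.

```r
# Minimal sketch of a blocked Gibbs sampler for y = X %*% beta + e, e ~ N(0, sigma2 I),
# under a flat prior on beta and p(sigma2) proportional to 1/sigma2. These priors and
# the simulated data are assumptions for illustration.
set.seed(1)
n <- 500
X <- cbind(1, rnorm(n), rnorm(n))
beta_true <- c(1, -2, 0.5)
y <- X %*% beta_true + rnorm(n, sd = 2)

XtX_inv  <- solve(crossprod(X))
XtX_inv  <- (XtX_inv + t(XtX_inv)) / 2        # guard against tiny asymmetries
beta_hat <- XtX_inv %*% crossprod(X, y)       # OLS estimate = conditional posterior mean

n_iter <- 5000
beta   <- rep(0, ncol(X)); sigma2 <- 1
draws  <- matrix(NA_real_, n_iter, ncol(X) + 1)

for (it in 1:n_iter) {
  # Block 1: draw the whole coefficient vector at once,
  # beta | sigma2, y ~ N(beta_hat, sigma2 * (X'X)^{-1})
  L    <- t(chol(sigma2 * XtX_inv))
  beta <- as.vector(beta_hat + L %*% rnorm(ncol(X)))

  # Block 2: draw the error variance,
  # sigma2 | beta, y ~ Inverse-Gamma(n / 2, sum of squared residuals / 2)
  resid  <- y - X %*% beta
  sigma2 <- 1 / rgamma(1, shape = n / 2, rate = sum(resid^2) / 2)

  draws[it, ] <- c(beta, sigma2)
}

colMeans(draws[-(1:1000), ])  # approximately c(1, -2, 0.5, 4)
```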

Bayesian Simple Linear Regression with Gibbs Sampling in R

Many introductions to Bayesian analysis use relatively simple didactic examples (e.g., making inferences about the probability of success given Bernoulli data). While this makes for a good introduction to Bayesian principles, the extension of these principles to regression is not straightforward. This post will sketch out how these principles extend to simple linear regression. Along …
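To make the didactic example mentioned above concrete, here is the standard conjugate Beta-Bernoulli calculation in a few lines of R; the simulated data and the uniform Beta(1, 1) prior are arbitrary choices for illustration.

```r
# The didactic example mentioned above: Bernoulli data with a conjugate Beta prior.
# With y_1, ..., y_n ~ Bernoulli(p) and p ~ Beta(a, b), the posterior is
# Beta(a + sum(y), b + n - sum(y)). The uniform Beta(1, 1) prior is an arbitrary choice.
set.seed(1)
y <- rbinom(50, size = 1, prob = 0.3)

a <- 1; b <- 1
a_post <- a + sum(y)
b_post <- b + length(y) - sum(y)

a_post / (a_post + b_post)              # posterior mean of p
qbeta(c(0.025, 0.975), a_post, b_post)  # 95% credible interval
```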

Fixed Effects, Random Effects, and First Differencing

I came across a Stack Overflow post the other day touching on first differencing and decided to write a quick review of the topic as well as the related random effects and fixed effects methods. In the end, we'll see that random effects, fixed effects, and first differencing are primarily used to handle unobserved heterogeneity within a …
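As a minimal sketch of the unobserved-heterogeneity point (the simulated data and variable names are my own illustration, not anything from the Stack Overflow post): in a two-period panel with an individual effect a_i that is correlated with the regressor, pooled OLS is biased, but differencing the two periods removes a_i.

```r
# Minimal sketch: a two-period panel y_it = beta * x_it + a_i + u_it, where a_i is
# unobserved, time-invariant heterogeneity correlated with x. Differencing the two
# periods removes a_i. Simulated data and names are illustrative.
set.seed(1)
n  <- 1000
a  <- rnorm(n)            # unobserved individual effect
x1 <- a + rnorm(n)        # regressor correlated with a in both periods
x2 <- a + rnorm(n)
beta <- 2
y1 <- beta * x1 + a + rnorm(n)
y2 <- beta * x2 + a + rnorm(n)

coef(lm(c(y1, y2) ~ c(x1, x2)))   # pooled OLS: biased upward by the omitted a_i

dy <- y2 - y1                     # first differences: a_i cancels out
dx <- x2 - x1
coef(lm(dy ~ 0 + dx))             # close to the true beta = 2
```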