This function runs the HMC algorithm on a generic model, given user-supplied log posterior logPOSTERIOR and gradient glogPOSTERIOR functions; a sketch of such a pair follows the usage below. All parameters specified within the list param are passed to these two functions. The tuning parameters epsilon and L are passed to the Leapfrog algorithm.

hmc(
  N = 10000,
  theta.init,
  epsilon = 0.01,
  L = 10,
  logPOSTERIOR,
  glogPOSTERIOR,
  randlength = FALSE,
  Mdiag = NULL,
  constrain = NULL,
  verbose = FALSE,
  varnames = NULL,
  param = list(),
  chains = 1,
  parallel = FALSE,
  ...
)
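
Any pair of functions taking theta as the first argument can be supplied. The sketch below is illustrative and not part of hmclearn: it fits a normal mean with known unit variance under a flat prior, and the data argument y is forwarded through param.

# Minimal sketch (not part of hmclearn): normal mean, known unit variance,
# flat prior, so the log posterior is -0.5 * sum((y - mu)^2) up to a constant
norm_posterior <- function(theta, y) {
  -0.5 * sum((y - theta[1])^2)
}

# Gradient with respect to mu; returns a vector the same length as theta
g_norm_posterior <- function(theta, y) {
  sum(y - theta[1])
}

y <- rnorm(50, mean = 2)
fit <- hmc(N = 1000, theta.init = 0, epsilon = 0.05, L = 20,
           logPOSTERIOR = norm_posterior, glogPOSTERIOR = g_norm_posterior,
           varnames = "mu", param = list(y = y))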

Arguments

N

Number of MCMC samples

theta.init

Vector of initial values for the parameters

epsilon

Step-size parameter for leapfrog

L

Number of leapfrog steps

logPOSTERIOR

Function to calculate and return the log posterior given a vector of values of theta

glogPOSTERIOR

Function to calculate and return the gradient of the log posterior given a vector of values of theta

randlength

Logical indicating whether to apply some randomness to the number of leapfrog steps L

Mdiag

Optional vector of the diagonal of the mass matrix M. Defaults to unit diagonal.

constrain

Optional vector indicating which parameters in theta accept positive values only. By default, all parameters accept all real numbers.

verbose

Logical to determine whether to display the progress of the HMC algorithm

varnames

Optional vector of theta parameter names

param

List of additional parameters for logPOSTERIOR and glogPOSTERIOR

chains

Number of MCMC chains to run

parallel

Logical to set whether multiple MCMC chains should be run in parallel

...

Additional parameters for logPOSTERIOR
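
As an illustration of the tuning arguments above, the call below is a sketch only: it assumes y and X from the linear regression model in the Examples section, jitters the number of leapfrog steps, sets an explicit (unit) mass matrix, and prints progress.

# Sketch: the linear model from the first example below, with tuning options set
fit <- hmc(N = 2000, theta.init = c(rep(0, 4), 1),
           epsilon = 0.02, L = 20,
           randlength = TRUE,   # randomize the number of leapfrog steps
           Mdiag = rep(1, 5),   # diagonal of the mass matrix (unit is the default)
           verbose = TRUE,      # display progress
           logPOSTERIOR = linear_posterior,
           glogPOSTERIOR = g_linear_posterior,
           varnames = c(paste0("beta", 0:3), "log_sigma_sq"),
           param = list(y = y, X = X))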

Value

Object of class hmclearn

Elements for hmclearn objects

N

Number of MCMC samples

theta

Nested list of length N of the sampled values of theta for each chain

thetaCombined

List of dataframes containing sampled values, one for each chain

r

List of length N of the sampled momenta

theta.all

Nested list of all values of theta sampled prior to the accept/reject step, for each chain

r.all

List of all values of the momenta r sampled prior to accept/reject

accept

Number of accepted proposals. The ratio accept / N is the acceptance rate

accept_v

Vector of length N indicating which samples were accepted

M

Mass matrix used in the HMC algorithm

algorithm

HMC for Hamiltonian Monte Carlo

varnames

Optional vector of parameter names

chains

Number of MCMC chains
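
These elements are accessed with the usual list operators. A short sketch, assuming fm1_hmc is the object fitted in the first example below:

fm1_hmc$accept / fm1_hmc$N        # acceptance rate
fm1_hmc$algorithm                 # "HMC"
head(fm1_hmc$thetaCombined[[1]])  # posterior samples for chain 1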

Available logPOSTERIOR and glogPOSTERIOR functions

linear_posterior

Linear regression: log posterior

g_linear_posterior

Linear regression: gradient of the log posterior

logistic_posterior

Logistic regression: log posterior

g_logistic_posterior

Logistic regression: gradient of the log posterior

poisson_posterior

Poisson (count) regression: log posterior

g_poisson_posterior

Poisson (count) regression: gradient of the log posterior

lmm_posterior

Linear mixed effects model: log posterior

g_lmm_posterior

Linear mixed effects model: gradient of the log posterior

glmm_bin_posterior

Logistic mixed effects model: log posterior

g_glmm_bin_posterior

Logistic mixed effects model: gradient of the log posterior

glmm_poisson_posterior

Poisson mixed effects model: log posterior

g_glmm_poisson_posterior

Poisson mixed effects model: gradient of the log posterior
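
Each pair is supplied to hmc in the same way as in the Examples below. A sketch for the logistic pair follows; the simulated binary data are illustrative only.

# Sketch: logistic regression with the supplied posterior pair
set.seed(412)
X <- cbind(1, rnorm(100))
y <- rbinom(100, size = 1, prob = 1 / (1 + exp(-X %*% c(-0.5, 1))))

fm_logit <- hmc(N = 500, theta.init = rep(0, 2), epsilon = 0.01, L = 10,
                logPOSTERIOR = logistic_posterior,
                glogPOSTERIOR = g_logistic_posterior,
                varnames = paste0("beta", 0:1),
                param = list(y = y, X = X))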

References

Neal, Radford. 2011. MCMC Using Hamiltonian Dynamics. In Handbook of Markov Chain Monte Carlo, edited by Steve Brooks, Andrew Gelman, Galin L. Jones, and Xiao-Li Meng, 116–62. Chapman & Hall/CRC.

Betancourt, Michael. 2017. A Conceptual Introduction to Hamiltonian Monte Carlo. arXiv preprint arXiv:1701.02434.

Thomas, S., Tu, W. 2020. Learning Hamiltonian Monte Carlo in R. arXiv preprint arXiv:2006.16194.

Author

Samuel Thomas samthoma@iu.edu, Wanzhu Tu wtu@iu.edu

Examples

# Linear regression example
set.seed(521)
X <- cbind(1, matrix(rnorm(300), ncol=3))
betavals <- c(0.5, -1, 2, -3)
y <- X %*% betavals + rnorm(100, sd=.2)

fm1_hmc <- hmc(N = 500, theta.init = c(rep(0, 4), 1),
               epsilon = 0.01, L = 10,
               logPOSTERIOR = linear_posterior,
               glogPOSTERIOR = g_linear_posterior,
               varnames = c(paste0("beta", 0:3), "log_sigma_sq"),
               param = list(y = y, X = X), parallel = FALSE, chains = 1)

summary(fm1_hmc, burnin = 100)
#> Summary of MCMC simulation
#>
#>                    2.5%         5%       25%        50%        75%        95%      97.5%
#> beta0         0.4778379  0.4847789  0.518149  0.5321988  0.5474873  0.5726653  0.5781760
#> beta1        -1.0435956 -1.0398451 -1.022090 -1.0099795 -0.9969146 -0.9695882 -0.9622133
#> beta2         1.9749648  1.9834760  2.003620  2.0201366  2.0330953  2.0521270  2.0554530
#> beta3        -3.0212212 -3.0146309 -2.996771 -2.9813806 -2.9668917 -2.9442674 -2.9308313
#> log_sigma_sq -3.2887695 -3.2522864 -3.150465 -3.0570739 -2.9441514 -2.8255475 -2.7737642
# Poisson regression example
set.seed(7363)
X <- cbind(1, matrix(rnorm(40), ncol=2))
betavals <- c(0.8, -0.5, 1.1)
lmu <- X %*% betavals
y <- sapply(exp(lmu), FUN = rpois, n = 1)

fm2_hmc <- hmc(N = 500, theta.init = rep(0, 3),
               epsilon = 0.01, L = 10,
               logPOSTERIOR = poisson_posterior,
               glogPOSTERIOR = g_poisson_posterior,
               varnames = paste0("beta", 0:2),
               param = list(y = y, X = X), parallel = FALSE, chains = 1)

summary(fm2_hmc, burnin = 100)
#> Summary of MCMC simulation
#>
#>             2.5%         5%        25%        50%        75%        95%      97.5%
#> beta0  0.5168403  0.5581553  0.6644147  0.7756224  0.8917828  1.0520828  1.1026491
#> beta1 -0.6815951 -0.6505318 -0.5738573 -0.5167261 -0.4646123 -0.3867174 -0.3636618
#> beta2  1.0298809  1.0541035  1.1469696  1.2088527  1.2757970  1.3789653  1.4163911
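
Multiple chains are run by setting chains > 1, optionally in parallel with parallel = TRUE. A sketch reusing y and X from the Poisson example above:

# Sketch: two chains for the Poisson model above; set parallel = TRUE to
# run them concurrently where supported
fm2_hmc2 <- hmc(N = 500, theta.init = rep(0, 3), epsilon = 0.01, L = 10,
                logPOSTERIOR = poisson_posterior,
                glogPOSTERIOR = g_poisson_posterior,
                varnames = paste0("beta", 0:2),
                param = list(y = y, X = X), chains = 2, parallel = FALSE)
summary(fm2_hmc2, burnin = 100)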