# hmc: Fit a generic model using Hamiltonian Monte Carlo (HMC)

This function runs the HMC algorithm on a generic model, provided
the `logPOSTERIOR` and gradient `glogPOSTERIOR` functions.
All parameters specified within the list `param` are passed to these two functions.
The tuning parameters `epsilon` and `L` are passed to the
Leapfrog algorithm.

## Usage
```r
hmc(
  N = 10000,
  theta.init,
  epsilon = 0.01,
  L = 10,
  logPOSTERIOR,
  glogPOSTERIOR,
  randlength = FALSE,
  Mdiag = NULL,
  constrain = NULL,
  verbose = FALSE,
  varnames = NULL,
  param = list(),
  chains = 1,
  parallel = FALSE,
  ...
)
```
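Because `hmc` only needs a log posterior and its gradient, user-defined models can be sampled directly. Below is a minimal sketch for a hypothetical normal-mean model: `normal_posterior` and `g_normal_posterior` are illustrative user code, not functions shipped with the package, and the N(0, 100) prior on `mu` and the tuning values are assumptions for the example.

```r
library(hmclearn)

# Hypothetical user-written model: normal likelihood with known variance
# sig2 and a N(0, 100) prior on the mean mu.  Data enter through param.
normal_posterior <- function(theta, y, sig2 = 1) {
  mu <- theta[1]
  sum(dnorm(y, mean = mu, sd = sqrt(sig2), log = TRUE)) +  # log likelihood
    dnorm(mu, mean = 0, sd = 10, log = TRUE)               # log prior
}

g_normal_posterior <- function(theta, y, sig2 = 1) {
  mu <- theta[1]
  sum(y - mu) / sig2 - mu / 100   # d/dmu of log likelihood plus log prior
}

set.seed(143)
y <- rnorm(50, mean = 2)

fit <- hmc(N = 1000, theta.init = 0, epsilon = 0.05, L = 10,
           logPOSTERIOR = normal_posterior,
           glogPOSTERIOR = g_normal_posterior,
           varnames = "mu", param = list(y = y))
```

Note that both functions take `theta` as their first argument, with everything in `param` matched by name to their remaining arguments.

## Arguments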
| Argument | Description |
|---|---|
| N | Number of MCMC samples |
| theta.init | Vector of initial values for the parameters |
| epsilon | Step-size parameter for leapfrog |
| L | Number of leapfrog steps parameter |
| logPOSTERIOR | Function to calculate and return the log posterior given a vector of values of theta |
| glogPOSTERIOR | Function to calculate and return the gradient of the log posterior given a vector of values of theta |
| randlength | Logical to determine whether to apply some randomness to the number of leapfrog steps tuning parameter L |
| Mdiag | Optional vector of the diagonal of the mass matrix M |
| constrain | Optional vector indicating which parameters in theta accept positive values only |
| verbose | Logical to determine whether to display the progress of the HMC algorithm |
| varnames | Optional vector of theta parameter names |
| param | List of additional parameters for logPOSTERIOR and glogPOSTERIOR |
| chains | Number of MCMC chains to run |
| parallel | Logical to set whether multiple MCMC chains should be run in parallel |
| ... | Additional parameters for logPOSTERIOR |
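The `chains` and `parallel` arguments together allow several independent chains. A brief sketch, reusing the linear-regression setup from the Examples below; the two-chain parallel run itself is illustrative, not from the original page:

```r
# Same linear model as in the Examples, run as two chains in parallel
set.seed(521)
X <- cbind(1, matrix(rnorm(300), ncol = 3))
y <- X %*% c(0.5, -1, 2, -3) + rnorm(100, sd = .2)

fm_par <- hmc(N = 500, theta.init = c(rep(0, 4), 1), epsilon = 0.01, L = 10,
              logPOSTERIOR = linear_posterior, glogPOSTERIOR = g_linear_posterior,
              varnames = c(paste0("beta", 0:3), "log_sigma_sq"),
              param = list(y = y, X = X), chains = 2, parallel = TRUE)
```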
    
## Value

Object of class `hmclearn`, a list with the following components:

| Component | Description |
|---|---|
| N | Number of MCMC samples |
| theta | Nested list of length N of the sampled values of theta for each chain |
| thetaCombined | List of data frames containing sampled values, one for each chain |
| r | List of length N of the sampled momenta |
| theta.all | Nested list of all parameter values of theta sampled prior to the accept/reject step for each chain |
| r.all | List of all values of the momenta r sampled prior to accept/reject |
| accept | Number of accepted proposals. The ratio accept / N is the acceptance rate |
| accept_v | Vector of length N indicating which samples were accepted |
| M | Mass matrix used in the HMC algorithm |
| algorithm | `HMC` for Hamiltonian Monte Carlo |
| varnames | Optional vector of parameter names |
| chains | Number of MCMC chains |
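For example, the acceptance rate noted above can be read straight off the returned components; a one-line sketch, assuming a fitted object `fit` as created in the Examples:

```r
fit$accept / fit$N   # proportion of accepted proposals
```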
## logPOSTERIOR and glogPOSTERIOR functions

The package supplies log posterior and gradient functions for several common models:

| Function | Description |
|---|---|
| linear_posterior | Linear regression: log posterior |
| g_linear_posterior | Linear regression: gradient of the log posterior |
| logistic_posterior | Logistic regression: log posterior |
| g_logistic_posterior | Logistic regression: gradient of the log posterior |
| poisson_posterior | Poisson (count) regression: log posterior |
| g_poisson_posterior | Poisson (count) regression: gradient of the log posterior |
| lmm_posterior | Linear mixed effects model: log posterior |
| g_lmm_posterior | Linear mixed effects model: gradient of the log posterior |
| glmm_bin_posterior | Logistic mixed effects model: log posterior |
| g_glmm_bin_posterior | Logistic mixed effects model: gradient of the log posterior |
| glmm_poisson_posterior | Poisson mixed effects model: log posterior |
| g_glmm_poisson_posterior | Poisson mixed effects model: gradient of the log posterior |
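Only the linear and Poisson versions appear in the Examples below. A hedged sketch of the logistic pair, assuming it follows the same `param = list(y = y, X = X)` interface with one `theta` element per column of `X`; the simulated data and tuning values are illustrative:

```r
# Simulated binary outcomes for a logistic regression sketch
set.seed(412)
X <- cbind(1, matrix(rnorm(200), ncol = 2))
p <- 1 / (1 + exp(-X %*% c(-0.5, 1, -1.5)))
y <- rbinom(nrow(X), size = 1, prob = p)

fm_logit <- hmc(N = 500, theta.init = rep(0, 3), epsilon = 0.01, L = 10,
                logPOSTERIOR = logistic_posterior,
                glogPOSTERIOR = g_logistic_posterior,
                varnames = paste0("beta", 0:2), param = list(y = y, X = X))
```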
## References

Neal, Radford. 2011. "MCMC Using Hamiltonian Dynamics." In Handbook of Markov Chain Monte Carlo, edited by Steve Brooks, Andrew Gelman, Galin L. Jones, and Xiao-Li Meng, 116-62. Chapman & Hall/CRC.

Betancourt, Michael. 2017. "A Conceptual Introduction to Hamiltonian Monte Carlo." arXiv:1701.02434.

Thomas, S., and Tu, W. 2020. "Learning Hamiltonian Monte Carlo in R."
## Author

Samuel Thomas (samthoma@iu.edu), Wanzhu Tu (wtu@iu.edu)
## Examples

```r
# Linear regression example
set.seed(521)
X <- cbind(1, matrix(rnorm(300), ncol = 3))
betavals <- c(0.5, -1, 2, -3)
y <- X %*% betavals + rnorm(100, sd = .2)

fm1_hmc <- hmc(N = 500, theta.init = c(rep(0, 4), 1), epsilon = 0.01, L = 10,
               logPOSTERIOR = linear_posterior, glogPOSTERIOR = g_linear_posterior,
               varnames = c(paste0("beta", 0:3), "log_sigma_sq"),
               param = list(y = y, X = X), parallel = FALSE, chains = 1)

summary(fm1_hmc, burnin = 100)
#> Summary of MCMC simulation
#>
#>                    2.5%         5%       25%        50%        75%        95%
#> beta0         0.4778379  0.4847789  0.518149  0.5321988  0.5474873  0.5726653
#> beta1        -1.0435956 -1.0398451 -1.022090 -1.0099795 -0.9969146 -0.9695882
#> beta2         1.9749648  1.9834760  2.003620  2.0201366  2.0330953  2.0521270
#> beta3        -3.0212212 -3.0146309 -2.996771 -2.9813806 -2.9668917 -2.9442674
#> log_sigma_sq -3.2887695 -3.2522864 -3.150465 -3.0570739 -2.9441514 -2.8255475
#>                   97.5%
#> beta0         0.5781760
#> beta1        -0.9622133
#> beta2         2.0554530
#> beta3        -2.9308313
#> log_sigma_sq -2.7737642

# Poisson regression example
set.seed(7363)
X <- cbind(1, matrix(rnorm(40), ncol = 2))
betavals <- c(0.8, -0.5, 1.1)
lmu <- X %*% betavals
y <- sapply(exp(lmu), FUN = rpois, n = 1)

fm2_hmc <- hmc(N = 500, theta.init = rep(0, 3), epsilon = 0.01, L = 10,
               logPOSTERIOR = poisson_posterior, glogPOSTERIOR = g_poisson_posterior,
               varnames = paste0("beta", 0:2),
               param = list(y = y, X = X), parallel = FALSE, chains = 1)

summary(fm2_hmc, burnin = 100)
#> Summary of MCMC simulation
#>
#>             2.5%         5%        25%        50%        75%        95%
#> beta0  0.5168403  0.5581553  0.6644147  0.7756224  0.8917828  1.0520828
#> beta1 -0.6815951 -0.6505318 -0.5738573 -0.5167261 -0.4646123 -0.3867174
#> beta2  1.0298809  1.0541035  1.1469696  1.2088527  1.2757970  1.3789653
#>            97.5%
#> beta0  1.1026491
#> beta1 -0.3636618
#> beta2  1.4163911
```
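Beyond the numeric summary, the sampled chains can also be inspected graphically; a short sketch, assuming the fitted `fm1_hmc` object above and hmclearn's `diagplots()` helper:

```r
# trace and density diagnostics for the fitted linear model
diagplots(fm1_hmc, burnin = 100)
```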