This is the basic computing function for HMC and should not be called directly except by experienced users.

hmc.fit(
  N,
  theta.init,
  epsilon,
  L,
  logPOSTERIOR,
  glogPOSTERIOR,
  varnames = NULL,
  randlength = FALSE,
  Mdiag = NULL,
  constrain = NULL,
  verbose = FALSE,
  ...
)

Arguments

N

Number of MCMC samples

theta.init

Vector of initial values for the parameters

epsilon

Step-size parameter for leapfrog

L

Number of leapfrog steps

logPOSTERIOR

Function to calculate and return the log posterior given a vector of values of theta

glogPOSTERIOR

Function to calculate and return the gradient of the log posterior given a vector of values of theta

varnames

Optional vector of theta parameter names

randlength

Logical indicating whether to randomize the number of leapfrog steps L on each iteration

Mdiag

Optional vector of the diagonal of the mass matrix M. Defaults to unit diagonal.

constrain

Optional vector indicating which parameters in theta are constrained to positive values only. By default, all parameters take values on the entire real line

verbose

Logical indicating whether to display the progress of the HMC algorithm

...

Additional parameters for logPOSTERIOR and glogPOSTERIOR
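The logPOSTERIOR and glogPOSTERIOR arguments expect user-supplied functions that take the parameter vector theta first, with any data passed through .... A minimal sketch (the function names norm_posterior and g_norm_posterior are illustrative, not part of the package), using a normal mean with a flat prior:

```r
# Hypothetical log posterior for a single normal mean (unit variance,
# flat prior): constant terms may be dropped, since HMC only needs the
# log posterior up to an additive constant.
norm_posterior <- function(theta, y) {
  -0.5 * sum((y - theta)^2)
}

# Its gradient with respect to theta, matching the signature expected
# by glogPOSTERIOR. Extra arguments (y here) arrive via '...'.
g_norm_posterior <- function(theta, y) {
  sum(y - theta)
}
```

Any additional named arguments supplied to hmc.fit (such as y above) are forwarded to both functions, so the two signatures should agree.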

Value

List for hmc

Elements for hmclearn objects

N

Number of MCMC samples

theta

Nested list of length N of the sampled values of theta for each chain

thetaCombined

List of dataframes containing sampled values, one for each chain

r

List of length N of the sampled momenta

theta.all

Nested list of all parameter values of theta sampled prior to the accept/reject step, for each chain

r.all

List of all values of the momenta r sampled prior to accept/reject

accept

Number of accepted proposals. The ratio accept / N is the acceptance rate

accept_v

Vector of length N indicating which samples were accepted

M

Mass matrix used in the HMC algorithm

algorithm

HMC for Hamiltonian Monte Carlo

References

Neal, Radford. 2011. MCMC Using Hamiltonian Dynamics. In Handbook of Markov Chain Monte Carlo, edited by Steve Brooks, Andrew Gelman, Galin L. Jones, and Xiao-Li Meng, 116–62. Chapman and Hall/CRC.

Betancourt, Michael. 2017. A Conceptual Introduction to Hamiltonian Monte Carlo. arXiv preprint arXiv:1701.02434.

Thomas, S., Tu, W. 2020. Learning Hamiltonian Monte Carlo in R.

Examples

# Logistic regression example
X <- cbind(1, seq(-100, 100, by = 0.25))
betavals <- c(-0.9, 0.2)
lodds <- X %*% betavals
prob1 <- as.numeric(1 / (1 + exp(-lodds)))

set.seed(9874)
y <- sapply(prob1, function(xx) {
  sample(c(0, 1), 1, prob = c(1 - xx, xx))
})

f1 <- hmc.fit(N = 500, theta.init = rep(0, 2),
              epsilon = c(0.1, 0.002), L = 10,
              logPOSTERIOR = logistic_posterior,
              glogPOSTERIOR = g_logistic_posterior,
              y = y, X = X)

f1$accept / f1$N
#> [1] 0.93
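The optional tuning arguments can be supplied in the same call. An illustrative sketch, not run here, that reuses the logistic setup above (it assumes hmclearn is loaded and y, X, logistic_posterior, and g_logistic_posterior are defined as in the example):

```r
# Rerun the logistic example with the optional tuning arguments spelled
# out explicitly. Values shown are illustrative choices, not recommendations.
f2 <- hmc.fit(N = 500, theta.init = rep(0, 2),
              epsilon = c(0.1, 0.002), L = 10,
              logPOSTERIOR = logistic_posterior,
              glogPOSTERIOR = g_logistic_posterior,
              varnames = c("beta0", "beta1"),  # label the sampled parameters
              randlength = TRUE,               # randomize the leapfrog steps L
              Mdiag = c(1, 1),                 # unit diagonal mass matrix (the default)
              verbose = FALSE,
              y = y, X = X)

f2$accept / f2$N                               # acceptance rate
```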