Creates a Metropolis-type MCMC with options for covariance adaptation, delayed rejection, Metropolis-within-Gibbs, and tempering
Metropolis(bayesianSetup, settings = list(startValue = NULL, optimize = T, proposalGenerator = NULL, consoleUpdates = 100, burnin = 0, thin = 1, parallel = NULL, adapt = T, adaptationInterval = 500, adaptationNotBefore = 3000, DRlevels = 1, proposalScaling = NULL, adaptationDepth = NULL, temperingFunction = NULL, gibbsProbabilities = NULL, message = TRUE))
bayesianSetup | an object of class BayesianSetup created by createBayesianSetup |
settings | a list of settings - possible options follow below |
startValue | startValue for the MCMC and optimization (if optimize = T). If not provided, the sampler will attempt to obtain the startValue from the bayesianSetup |
optimize | logical, determines whether an optimization for start values and proposal function should be run before starting the sampling |
proposalGenerator | optional proposal generator object (see createProposalGenerator) |
proposalScaling | additional scaling parameter for the proposals that controls the different scales of the proposals after delayed rejection (typically, after a rejection, one would want to try a smaller scale). Needs to be as long as DRlevels. Defaults to 0.5^(- 0:(mcmcSampler$settings$DRlevels - 1)) |
burnin | number of iterations treated as burn-in. These iterations are not recorded in the chain. |
thin | thinning parameter. Determines the interval in which values are recorded. |
consoleUpdates | integer, determines the frequency with which sampler progress is printed to the console |
adapt | logical, determines whether an adaptive algorithm should be used. Default is TRUE. |
adaptationInterval | integer, determines the interval at which the proposal is adapted if adapt = TRUE. |
adaptationNotBefore | integer, determines the iteration before which no adaptation takes place if adapt = TRUE. |
DRlevels | integer, determines the number of levels for a delayed rejection sampler. Default is 1, which means no delayed rejection is used. |
temperingFunction | function to implement simulated tempering in the algorithm. The function describes how the acceptance rate will be influenced in the course of the iterations. |
gibbsProbabilities | vector that defines the relative probabilities for the number of parameters to be changed simultaneously. |
message | logical, determines whether the sampler's progress should be printed |
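A minimal sketch of how these settings combine in a call to runMCMC. The two-parameter normal likelihood and the specific setting values are illustrative choices, not defaults:

```r
library(BayesianTools)

# Illustrative target: independent standard normals on two parameters
ll = function(x) sum(dnorm(x, log = TRUE))
setup = createBayesianSetup(ll, lower = c(-10, -10), upper = c(10, 10))

# Settings drawn from the options documented above (values are examples)
settings = list(iterations = 10000,          # total MCMC iterations
                burnin = 1000,               # discard the first 1000 iterations
                thin = 5,                    # record every 5th value
                optimize = FALSE,            # skip the start-value optimization
                adapt = TRUE,                # adapt the proposal covariance ...
                adaptationInterval = 500,    # ... every 500 iterations ...
                adaptationNotBefore = 3000,  # ... but not before iteration 3000
                message = FALSE)

out = runMCMC(bayesianSetup = setup, sampler = "Metropolis", settings = settings)
summary(out)
```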
The 'Metropolis' function is the main function for all Metropolis-based samplers in this package. To call the derivatives of the basic Metropolis-Hastings MCMC, you can either use the corresponding function (e.g. AM for an adaptive Metropolis sampler) or set the respective parameters of the basic Metropolis. The advantage of the latter is that you can easily combine different properties (e.g. adaptive sampling and delayed rejection sampling) without changing the function.
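For instance, combining covariance adaptation with delayed rejection yields a DRAM-type sampler. A sketch, using an illustrative two-parameter normal target:

```r
library(BayesianTools)

ll = function(x) sum(dnorm(x, log = TRUE))
setup = createBayesianSetup(ll, lower = c(-10, -10), upper = c(10, 10))

# Combining properties: adaptation (adapt = TRUE) plus two levels of
# delayed rejection (DRlevels = 2) gives a DRAM-type sampler
settings = list(iterations = 5000, optimize = FALSE,
                adapt = TRUE, DRlevels = 2, message = FALSE)
out = runMCMC(bayesianSetup = setup, sampler = "Metropolis", settings = settings)
```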
Haario, H., Saksman, E., and Tamminen, J. (2001). An adaptive Metropolis algorithm. Bernoulli 7 (2), 223-242.
Haario, H., Laine, M., Mira, A., and Saksman, E. (2006). DRAM: efficient adaptive MCMC. Statistics and Computing 16 (4), 339-354.
Hastings, W. K. (1970). Monte Carlo sampling methods using Markov chains and their applications. Biometrika 57 (1), 97-109.
Green, P. J., and Mira, A. (2001). Delayed rejection in reversible jump Metropolis-Hastings. Biometrika 88, 1035-1053.
Metropolis, N., Rosenbluth, A. W., Rosenbluth, M. N., Teller, A. H., and Teller, E. (1953). Equation of state calculations by fast computing machines. The Journal of Chemical Physics 21 (6), 1087-1092.
# Running the Metropolis sampler via runMCMC with a proposal covariance
# generated from the prior (can be useful for complicated priors)
ll = function(x) sum(dnorm(x, log = TRUE))
setup = createBayesianSetup(ll, lower = c(-10,-10), upper = c(10,10))

samples = setup$prior$sampler(1000)

generator = createProposalGenerator(diag(1, setup$numPars))
generator = updateProposalGenerator(generator, samples, manualScaleAdjustment = 1, message = TRUE)
#> Proposalgenerator settings changed
#> covariance set to:
#>            [,1]       [,2]
#> [1,] 34.8160170 -0.5583527
#> [2,] -0.5583527 32.8871576
#> covarianceDecomp set to:
#>         [,1]        [,2]
#> [1,] 5.90051 -0.09462788
#> [2,] 0.00000  5.73395180
#> gibbsProbabilities set to:
#> NULL
#> gibbsWeights set to:
#> NULL
#> otherDistribution set to:
#> NULL
#> otherDistributionLocation set to:
#> NULL

settings = list(proposalGenerator = generator, optimize = FALSE, iterations = 500)
out = runMCMC(bayesianSetup = setup, sampler = "Metropolis", settings = settings)
#> Running Metropolis-MCMC, chain 1 iteration 100 of 500. Current logp: -12.60143 Please wait!
#> Running Metropolis-MCMC, chain 1 iteration 200 of 500. Current logp: -8.423186 Please wait!
#> Running Metropolis-MCMC, chain 1 iteration 300 of 500. Current logp: -8.174177 Please wait!
#> Running Metropolis-MCMC, chain 1 iteration 400 of 500. Current logp: -8.532211 Please wait!
#> Running Metropolis-MCMC, chain 1 iteration 500 of 500. Current logp: -8.532211 Please wait!
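The Metropolis-within-Gibbs behaviour is activated by passing gibbsProbabilities in the settings. A sketch with the same illustrative setup; the weights are an arbitrary choice (here: update a single parameter twice as often as both at once):

```r
library(BayesianTools)

ll = function(x) sum(dnorm(x, log = TRUE))
setup = createBayesianSetup(ll, lower = c(-10, -10), upper = c(10, 10))

# Relative probabilities of changing 1 or 2 parameters per step
# (illustrative weights; relative, so they need not sum to 1)
settings = list(iterations = 2000, optimize = FALSE,
                gibbsProbabilities = c(2, 1), message = FALSE)
out = runMCMC(bayesianSetup = setup, sampler = "Metropolis", settings = settings)
```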