Step sizes

All adaptation algorithms are based on similar sequential stochastic-gradient-like (or Robbins-Monro) updates, which rely on a decreasing step size (or 'learning rate') sequence $\gamma_k$.

AdaptiveMCMC.jl uses an abstract StepSize type. Each concrete subtype should have a method get(stepsize, k), which returns the $\gamma_k$ corresponding to stepsize.
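
For example, a user-defined step size could look roughly like the following sketch. The concrete type name and the rule $\gamma_k = c/(k + k_0)$ are purely illustrative, and the qualified names AdaptiveMCMC.StepSize and AdaptiveMCMC.get are assumptions (in case these are not exported); check the package source for the exact interface.

using AdaptiveMCMC

# Illustrative custom step size gamma_k = c/(k + k0); it satisfies the usual
# decreasing step size conditions discussed below.
struct ShiftedHarmonicStepSize <: AdaptiveMCMC.StepSize
    c::Float64
    k0::Float64
end

# The method the package is documented to call for the k:th step size:
AdaptiveMCMC.get(s::ShiftedHarmonicStepSize, k::Real) = s.c/(k + s.k0)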

The current implementation of AdaptiveMCMC.jl provides essentially only step sizes of the form:

\[ \gamma_k = c k^{-\eta}\]

where $c>0$ and $1/2 < \eta\le 1$ are the two parameters of the sequence. (The given range for $\eta$ ensures that $\sum_k \gamma_k = \infty$ and $\sum_k \gamma_k^2 < \infty$, which are desirable properties for a step size sequence.)

AdaptiveMCMC.PolynomialStepSize (Type)
gamma = PolynomialStepSize(eta::AbstractFloat, [c::AbstractFloat=1.0])

Constructor for PolynomialStepSize.

Arguments

  • eta: The step size exponent, should be within (1/2,1].
  • c: Scaling factor; default 1.0.
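
A brief usage sketch; the numerical values simply follow from the formula $c k^{-\eta}$ above, and get may need to be qualified as AdaptiveMCMC.get if it is not exported:

using AdaptiveMCMC

gamma = PolynomialStepSize(2/3, 0.5)        # gamma_k = 0.5*k^(-2/3)
[AdaptiveMCMC.get(gamma, k) for k in 1:4]   # ≈ [0.5, 0.315, 0.240, 0.198]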

There is also a variant of the PolynomialStepSize for the robust adaptive Metropolis (RAM) algorithm:

\[ \gamma_k = \min\{1/2, d k^{-\eta}\},\]

where $d$ is the state dimension. The RAM step size can be constructed as follows:

AdaptiveMCMC.RAMStepSize (Type)
gamma = RAMStepSize(eta::AbstractFloat, d::Int)

Constructor for RAM step size.

Arguments

  • eta: The step size exponent, should be within (1/2,1].
  • d: State dimension.
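
A brief usage sketch, analogous to the one above; the values follow directly from the formula $\min\{1/2, d k^{-\eta}\}$ (again, qualify get as AdaptiveMCMC.get if needed):

using AdaptiveMCMC

gamma_ram = RAMStepSize(0.66, 10)                         # d = 10, eta = 0.66
[AdaptiveMCMC.get(gamma_ram, k) for k in (1, 10, 1000)]   # ≈ [0.5, 0.5, 0.105]
# The cap 1/2 applies for small k; for larger k the step size decays like d*k^(-eta).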