I am hopeful this is just a corner case when using a 1-d random effect, but I noticed this behavior while debugging a failure of the test "Laplace simplest 1D (constrained) with multiple data works" in test-Laplace.R. So I am recording it here, but I am not planning further investigation at the moment.
In the plot below I plot the Laplace marginal logLik as a function of the parameter. The black points use "zero" for innerOptimStart; the red points use "last.best" when calculating the logLik curve directly; and the green points use "last.best" when first finding the MLE and then calculating the logLik curve. Because the starting point for the inner optimization changes depending on the history of calls, we can get history dependence and some weird numerical behavior (a direct spot check of this is sketched after the reproduction code below). I'm hopeful this is confined to the 1-RE case: it doesn't seem to occur when extending this example to multiple REs, and I don't think we've noticed bad behavior in other examples, though we might just not have noticed.
Here's code to reproduce:
library(nimble)  # for nimbleModel, buildLaplace, compileNimble

## Case 1: innerOptimStart = 'zero'
set.seed(1)
m <- nimbleModel(
  nimbleCode({
    mu ~ dnorm(0, sd = 5)
    a ~ dexp(rate = exp(mu))
    for (i in 1:5) {
      y[i] ~ dnorm(a, sd = 2)
    }
  }), data = list(y = rnorm(5, 1, 2)), inits = list(mu = 2, a = 1),
  buildDerivs = TRUE
)
mLaplace <- buildLaplace(model = m, control = list(innerOptimStart = 'zero'))
cm <- compileNimble(m)
cLaplace <- compileNimble(mLaplace, project = m)
# cLaplace$calcLogLik(.289)
xs <- seq(.289, .290, len = 100)
ll0 <- sapply(xs, function(x) cLaplace$calcLogLik(x))
## Case 2: innerOptimStart = 'last.best', computing the logLik curve directly
set.seed(1)
m <- nimbleModel(
  nimbleCode({
    mu ~ dnorm(0, sd = 5)
    a ~ dexp(rate = exp(mu))
    for (i in 1:5) {
      y[i] ~ dnorm(a, sd = 2)
    }
  }), data = list(y = rnorm(5, 1, 2)), inits = list(mu = 2, a = 1),
  buildDerivs = TRUE
)
mLaplace <- buildLaplace(model = m, control = list(innerOptimStart = 'last.best'))
cm <- compileNimble(m)
cLaplace <- compileNimble(mLaplace, project = m)
ll1 <- sapply(xs, function(x) cLaplace$calcLogLik(x))
## Case 3: innerOptimStart = 'last.best', finding the MLE before computing the curve
set.seed(1)
m <- nimbleModel(
  nimbleCode({
    mu ~ dnorm(0, sd = 5)
    a ~ dexp(rate = exp(mu))
    for (i in 1:5) {
      y[i] ~ dnorm(a, sd = 2)
    }
  }), data = list(y = rnorm(5, 1, 2)), inits = list(mu = 2, a = 1),
  buildDerivs = TRUE
)
mLaplace <- buildLaplace(model = m, control = list(innerOptimStart = 'last.best'))
cm <- compileNimble(m)
cLaplace <- compileNimble(mLaplace, project = m)
opt <- cLaplace$findMLE(2)
ll2 <- sapply(xs, function(x) cLaplace$calcLogLik(x))
## Plot: black = 'zero', red = 'last.best' (curve only), green = 'last.best' (MLE first)
pdf('profiles.pdf')
plot(xs, ll0, type = 'p')
points(xs, ll1, col = 'red')
points(xs, ll2, col = 'green')
dev.off()
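Here is a minimal sketch of the spot check mentioned above, continuing from the objects built in case 3 (innerOptimStart = 'last.best'). The parameter value 0.2895, the reversed sweep order, and the names llA/llB are arbitrary choices for illustration; the point is just to evaluate the same outer parameter value after two different call histories.

## Spot check: same parameter value, different call histories.
## With innerOptimStart = 'last.best', the inner optimization starts from the
## best conditional mode found so far, so these two evaluations can differ.
x0 <- 0.2895
llA <- cLaplace$calcLogLik(x0)
invisible(sapply(rev(xs), function(x) cLaplace$calcLogLik(x)))  # change the call history
llB <- cLaplace$calcLogLik(x0)
llA - llB  # a nonzero difference indicates history dependence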
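Separately, here is a rough sketch of the kind of multi-RE extension referred to above; the two-effect structure, data layout, and object names (m2, cLaplace2, ll2re) are only illustrative, not necessarily the exact extension used. The same sweep over xs can be run to see whether the jumps reappear.

## Illustrative multi-RE extension of the same example
set.seed(1)
m2 <- nimbleModel(
  nimbleCode({
    mu ~ dnorm(0, sd = 5)
    for (j in 1:2) {
      a[j] ~ dexp(rate = exp(mu))
      for (i in 1:5) {
        y[i, j] ~ dnorm(a[j], sd = 2)
      }
    }
  }), data = list(y = matrix(rnorm(10, 1, 2), nrow = 5)),
  inits = list(mu = 2, a = c(1, 1)),
  buildDerivs = TRUE
)
mLaplace2 <- buildLaplace(model = m2, control = list(innerOptimStart = 'last.best'))
cm2 <- compileNimble(m2)
cLaplace2 <- compileNimble(mLaplace2, project = m2)
ll2re <- sapply(xs, function(x) cLaplace2$calcLogLik(x))  # check for the same jumps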