47 changes: 47 additions & 0 deletions BayesianCognitiveModeling/bayesian_cognitive_modeling.Rmd
@@ -1636,3 +1636,50 @@
While the range of errors that JAGS can throw is unpredictable, we treat the list below as a work-in-progress documentation of common errors that we have encountered, along with potential ways to solve them.

We will continue expanding this section.

### I.) Node inconsistent with parents

Sometimes you might get errors of the following format:

```
Error in checkForRemoteErrors(val) :
8 nodes produced errors; first error: Error in node choices[31,22]
Node inconsistent with parents
```

These are very common in JAGS, and it is often not immediately clear what the cause is. It is advisable to take a very close look at the underlying data at the specific data point.
In this example, that means checking the outcome and probability information of problem 31 for person 22 that the model has to work with, and the choice that the individual made. Here, the individual chose option B (choice = 0):

```
> choices[31,22]
07-JE-SU-MU
0
```

... even though the choice problem had a clearly dominant option: a sure win in option A versus a sure high loss in option B:

```
> prospectsA[31,,22]
[1] 150 1 0 0 -730 0
> prospectsB[31,,22]
[1] 810 0 0 0 -1000 1
```

Consequently, the model would predict a choice of A, whereas the empirical data (the participant) shows otherwise.
Safe options like this (where one outcome has probability 1) can appear frequently in experience-based choice data when people do not sample a lot.
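
Such trials can be screened for before fitting. Below is a minimal sketch, assuming the data layout suggested by the printouts above (outcomes in columns 1, 3, 5 and their probabilities in columns 2, 4, 6 of the problems × outcomes × participants prospect arrays):

```
# flag trials where either option is a "safe" option
# (one outcome has probability 1), since these can push the
# predicted choice probability to exactly 0 or 1
safeA <- apply(prospectsA[, c(2, 4, 6), ] == 1, c(1, 3), any)
safeB <- apply(prospectsB[, c(2, 4, 6), ] == 1, c(1, 3), any)
safe_trials <- which(safeA | safeB, arr.ind = TRUE) # columns: problem, person
head(safe_trials)
```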

Here, the JAGS error is likely caused by a computational problem: the model predicts a choice probability of 0 for the observed choice, so the observed node has zero likelihood given its parent nodes, which is exactly what "node inconsistent with parents" signals.
We can circumvent the problem in this case by constraining the predicted choice probability to a range between 0.00001 and 0.99999, for example.
That is, in the model code, we substitute

```
choices[i,j] ~ dbern(binval[i,j])
```
with
```
choices[i,j] ~ dbern(min(max(binval[i,j],0.00001),0.99999))
```
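
In the context of the full likelihood loop, the constrained version might look like the following sketch (here, `binval[i,j]` stands for the model-implied choice probability, and `nProblems` and `nSubjects` are placeholders for whatever loop bounds your model uses):

```
model{
  for (i in 1:nProblems){
    for (j in 1:nSubjects){
      # ... CPT valuation of the prospects and computation of binval[i,j] ...
      # clamp the predicted choice probability away from exactly 0 and 1
      theta[i,j] <- min(max(binval[i,j], 0.00001), 0.99999)
      choices[i,j] ~ dbern(theta[i,j])
    }
  }
}
```

Clamping is a pragmatic fix; a common alternative is to mix the model prediction with a small guessing component, e.g. `theta[i,j] <- phi * 0.5 + (1 - phi) * binval[i,j]`, which likewise keeps the likelihood of every observed choice strictly positive.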




53 changes: 44 additions & 9 deletions BayesianCognitiveModeling/helper_functions/CPT_plotting.R
@@ -7,23 +7,58 @@ color = rgb(1/255, 100/255, 200/255, alpha = 1)
####### value function TK92 ##########
######################################
vf_TK92 <- function(samples = samples, # JAGS output object name
color = rgb(1/255, 100/255, 200/255, alpha = 1), # color for lines
color = rgb(1/255, 100/255, 200/255, alpha = 1), # color for lines
fontsize = 1, # controls font size of axis and tick labels
condition = "", # condition for title of plot
alpha_subj = "alpha", # individual parameter names
lambda_subj = "lambda",
beta_subj = "alpha", # by default, the function will assume no separate curvature parameter for the loss domain. if there is one, that can be specified here
alpha_mean = "mu.alpha", # parameter names of group_level means
lambda_mean = "mu.lambda",
beta_mean = "mu.alpha" # by default, the function will assume no separate curvature parameter for the loss domain. if there is one, that can be specified here
) {


par(mfrow=c(1,1))
a <- seq(-100, 100, by=0.1)
plot(a, a, "l", axes=FALSE, xlab='', ylab='', cex.axis=.7, lty=2, lwd=1, ylim=c(-10, 10), xlim=c(-20, 20), col="white")
par(xpd=FALSE)
title(paste("Value function"), cex.main=1.5, font.main=1)
axis(1, seq(from=-20, to=20, by=5), pos=0, cex.axis=.6, mgp=c(3, .1, 1), tck=-.01)
axis(2, seq(from=-10, to=10, by=2), pos=0, cex.axis=.6, tck=-.01, las=1, mgp=c(3, 0.6, 0))
mtext(side=1, text="Outcome", line=1)
mtext(side=2, text="Subjective Value", line=.5)

# scale gaps to font size
lab_gap <- 0.6 * fontsize # distance of tick labels from axis line
title_gap <- 1.2 + 0.4*(fontsize - 1) # distance of axis titles (mtext) from plot

plot(a, a, type = "l", axes = FALSE, xlab = "", ylab = "",
cex.axis = fontsize * 1.1, lty = 2, lwd = 1,
ylim = c(-10, 10), xlim = c(-20, 20), col = "white")
par(xpd = FALSE)
title(paste0("Value function", condition), cex.main = fontsize * 1.5, font.main = 1)

axis(1, seq(-20, 20, by = 5), pos = 0,
cex.axis = 0.9* fontsize, tck = -0.01, mgp = c(3, lab_gap, 0))
axis(2, seq(-10, 10, by = 2), pos = 0,
cex.axis = 0.9*fontsize, tck = -0.01, las = 1, mgp = c(3, lab_gap, 0))

mtext(text = "Outcome", side = 1, line = title_gap, cex = fontsize)
mtext(text = "Subjective Value", side = 2, line = title_gap, cex = fontsize)



# plot(a, a, "l", axes=FALSE, xlab='', ylab='',
# cex.axis=fontsize*1.1, # .6,
# lty=2, lwd=1, ylim=c(-10, 10), xlim=c(-20, 20), col="white")
# par(xpd=FALSE)
# title(paste("Value function"), cex.main=1.5, font.main=1)
# axis(1, seq(from=-20, to=20, by=5), pos=0,
# cex.axis= fontsize, # .6,
# mgp=c(3, .1, 1), tck=-.01)
# axis(2, seq(from=-10, to=10, by=2), pos=0,
# cex.axis=fontsize, # .6,
# tck=-.01, las=1, mgp=c(3, 0.6, 0))
# mtext(side=1, text="Outcome",
# cex = fontsize,
# line=1)
# mtext(side=2, text="Subjective Value",
# cex = fontsize,
# line=.5)

# plot dashed line
lines(a,a,col="black",lty=2,lwd=1)
@@ -66,7 +101,7 @@ vf_TK92 <- function(samples = samples, # JAGS output object name

legend(1, -2, inset=0,
legend = c(expression("Group-level estimate"), expression("Individual estimates")),
cex = 1.2,
cex = 0.9 * fontsize,
col = c(color, color_subj), horiz = F,bty = "n",
lty = 1, # Solid line
lwd = 2 # Line width