diff --git a/.github/workflows/check.yml b/.github/workflows/check.yml index ffd6bace7..3e81c8885 100644 --- a/.github/workflows/check.yml +++ b/.github/workflows/check.yml @@ -25,7 +25,7 @@ jobs: run: Rscript -e 'install.packages(c("rmarkdown","blogdown", "remotes"))' - name: Install hugo - run: Rscript -e 'blogdown::install_hugo()' + run: Rscript -e 'blogdown::install_hugo(version="0.152.2")' # Latest is currently broken - name: Build site with blogdown run: Rscript -e "blogdown::build_site()" diff --git a/.github/workflows/deploy.yml b/.github/workflows/deploy.yml index 8d0f5a1c2..2474fcc18 100644 --- a/.github/workflows/deploy.yml +++ b/.github/workflows/deploy.yml @@ -25,7 +25,7 @@ jobs: run: Rscript -e 'install.packages(c("rmarkdown","blogdown", "remotes"))' - name: Install hugo - run: Rscript -e 'blogdown::install_hugo(version="0.139.4")' # Update to the latest version + run: Rscript -e 'blogdown::install_hugo(version="0.152.2")' # Latest is currently broken - name: Build site with blogdown run: Rscript -e "blogdown::build_site()" diff --git a/content/authors/ben-tribe/_index.md b/content/authors/ben-tribe/_index.md deleted file mode 100644 index 05e908251..000000000 --- a/content/authors/ben-tribe/_index.md +++ /dev/null @@ -1,33 +0,0 @@ ---- -# Name -title: Ben Tribe -# Folder name -authors: -- ben-tribe -bio: MRes student -social: -- icon: researchgate - icon_pack: fab - link: https://www.researchgate.net/profile/Benjamin_Tribe -- icon: github - icon_pack: fab - link: https://github.com/BenjaminTribe -role: Research Assistant
University of Sussex
-interests: -- (In)Attention and mind-wandering -- Consciousness, perception and belief -- Individual differences and (in)equity -- Data analysis and programming -- Walks 🏞 and Doggos 🐢 -education: - courses: - - course: MRes Psychological Methods - institution: University of Sussex - year: 2023 - 2025 - - course: BSc, Psychology with British Sign Language - institution: University of Sussex - year: 2018 - 2023 -superuser: false -user_groups: -- Research Assistants ---- diff --git a/content/authors/ben-tribe/avatar.JPG b/content/authors/ben-tribe/avatar.JPG deleted file mode 100644 index fbf204883..000000000 Binary files a/content/authors/ben-tribe/avatar.JPG and /dev/null differ diff --git a/content/post/2025-07-02-DataPipeOSF/index.md b/content/post/2025-07-02-DataPipeOSF/index.md index e161c18a9..0375ec64a 100644 --- a/content/post/2025-07-02-DataPipeOSF/index.md +++ b/content/post/2025-07-02-DataPipeOSF/index.md @@ -21,8 +21,7 @@ tags: - Psychology --- -Hello there! πŸ‘‹ -Let's learn how to set up DataPipe to collect and save data in OSF. +Hello there! πŸ‘‹ Let's learn how to set up DataPipe to collect and save data in OSF. Let's start with some basics! @@ -32,38 +31,38 @@ DataPipe is a tool that allows you to collect and save data in OSF (Open Science ## How to set up DataPipe in OSF -1. **Create an OSF Project**: Start by creating a new project in [OSF](https://osf.io/). This will be the container for your data and any related files. You can set up an account if you don't have one already, quite easily! +1. **Create an OSF Project**: Start by creating a new project in [OSF](https://osf.io/). This will be the container for your data and any related files. You can set up an account if you don't have one already, quite easily! - - Go to the OSF homepage and log in or create an account. You can easily sign up through institutional access.
- - Click on "Create a New Project" and fill in the necessary details such as project title, description, and visibility settings. ***DO NOT SET IT AS PUBLIC*** as the data being saved will not be anonymized and may contain sensitive information. + - Go to the OSF homepage and log in or create an account. You can easily sign up through institutional access. + - Click on "Create a New Project" and fill in the necessary details such as project title, description, and visibility settings. ***DO NOT SET IT AS PUBLIC*** as the data being saved will not be anonymized and may contain sensitive information. -2. **Create OSF Token**: You will need to create a token to grant DataPipe the necessary permissions to access your OSF project. +2. **Create OSF Token**: You will need to create a token to grant DataPipe the necessary permissions to access your OSF project. - - Go to your OSF account settings and navigate to the "personal access tokens" section. - - Click on "Create a new token" and give it a name (e.g., "DataPipe Token"). - - Set the permissions for the token, ensuring it has access to read and write data in your project. - - Copy the generated token; you will need it later. + - Go to your OSF account settings and navigate to the "personal access tokens" section. + - Click on "Create a new token" and give it a name (e.g., "DataPipe Token"). + - Set the permissions for the token, ensuring it has access to read and write data in your project. + - Copy the generated token; you will need it later. -3. **Link OSF to DataPipe**: In DataPipe, you will need to link your OSF project using the token you created. +3. **Link OSF to DataPipe**: In DataPipe, you will need to link your OSF project using the token you created. - - Open DataPipe, click Account on the top right corner and select settings. - - Click on the 'Set OSF Token' button and paste the token you copied earlier. + - Open DataPipe, click Account on the top right corner and select settings. 
- Click on the 'Set OSF Token' button and paste the token you copied earlier. -4. **Create new experiment on DataPipe**: Now that your OSF project is linked, you can create a new experiment in DataPipe. +4. **Create new experiment on DataPipe**: Now that your OSF project is linked, you can create a new experiment in DataPipe. - - Click on "Create New Experiment" in DataPipe. - - Give the experiment a name - I recommend using the same name as your OSF project for consistency. - - Add the OSF project ID to the experiment settings. You can find the project ID in the URL of your OSF project (it is the alphanumeric string after osf.io/) - - Create a new OSF Data component called 'data'. This will create a folder - named data - in your OSF project where all the data collected will be saved. - - Choose Germany - Frankfurt as the server location for your DataPipe experiment. This is important for data privacy and compliance with regulations such as GDPR. + - Click on "Create New Experiment" in DataPipe. + - Give the experiment a name - I recommend using the same name as your OSF project for consistency. + - Add the OSF project ID to the experiment settings. You can find the project ID in the URL of your OSF project (it is the alphanumeric string after osf.io/). + - Create a new OSF Data component called 'data'. This will create a folder - named data - in your OSF project where all the data collected will be saved. + - Choose Germany - Frankfurt as the server location for your DataPipe experiment. This is important for data privacy and compliance with regulations such as GDPR. -5. **Configure Data Collection**: Once the experiment is set up on DataPipe enable data collection on the Status section. +5. **Configure Data Collection**: Once the experiment is set up on DataPipe, enable data collection on the Status section. You can optionally enable base64 data collection if you wish to encode any video, audio or image files as strings.
'Condition assignment' can also be enabled; this makes DataPipe loop through the conditions when it requests the data. When deciding whether these features are suitable, it's best to consider how you will preprocess the data. It's advised that you only enable the minimum needed as a security measure. -6. **Save the data from the experiment hosted on GitHub**: If you are using a GitHub repository to host your experiment, you can save the data collected by writing the bellow code to the experiment HTML file. This bit of code should be called at the end of your experiment to ensure that all data is saved to the OSF project +6. **Save the data from the experiment hosted on GitHub**: If you are using a GitHub repository to host your experiment, you can save the data collected by writing the below code to the experiment HTML file. This bit of code should be called at the end of your experiment to ensure that all data is saved to the OSF project. - - Here is what the code should look like in your experiment HTML file: + - Here is what the code should look like in your experiment HTML file: - ```javascript + ``` javascript // Save data via DataPipe timeline.push({ type: jsPsychPipe, @@ -72,10 +71,18 @@ filename: `${participantID}.csv`, data_string: () => jsPsych.data.get().csv(), }) - ``` + ``` - - On the experiment created on DataPipe, there is an 'Experiment ID' field. This is the ID you need to add to the `experiment_id` field in the code above. - - The `filename` field can be customized to include the participant ID or any other identifier you prefer. + - On the experiment created on DataPipe, there is an 'Experiment ID' field. This is the ID you need to add to the `experiment_id` field in the code above. + - The `filename` field can be customized to include the participant ID or any other identifier you prefer. -7.
**Run Your Experiment**: With everything set up, you can now run your experiment. DataPipe will automatically collect and save the data to your OSF project as specified. *Give it a try!* \ No newline at end of file + - If publishing your experiment to GitHub, make sure the link is + + *'https://[your username].github.io/[your repository name]'* + + or *'https://[your username].github.io/[your repository name]/[name of experiment's HTML file]'* if the HTML file for your experiment is named anything other than `'index.html'` + +7. **Run Your Experiment**: With everything set up, you can now run your experiment. DataPipe will automatically collect and save the data to your OSF project as specified. + + *Give it a try!* If you'd like further clarification, the [DataPipe website](https://pipe.jspsych.org/getting-started) includes a useful outline. diff --git a/content/post/2025-12-20-Interoception/ans.webp b/content/post/2025-12-20-Interoception/ans.webp new file mode 100644 index 000000000..f2e21df6e Binary files /dev/null and b/content/post/2025-12-20-Interoception/ans.webp differ diff --git a/content/post/2025-12-20-Interoception/featured.png b/content/post/2025-12-20-Interoception/featured.png new file mode 100644 index 000000000..e284fcf01 Binary files /dev/null and b/content/post/2025-12-20-Interoception/featured.png differ diff --git a/content/post/2025-12-20-Interoception/index.md b/content/post/2025-12-20-Interoception/index.md new file mode 100644 index 000000000..9ae620352 --- /dev/null +++ b/content/post/2025-12-20-Interoception/index.md @@ -0,0 +1,117 @@ +--- +authors: +- roisin-sharma +- oliver-collins +categories: +- Reality Bending Lab +date: "2025-12-20" +title: "What is the Best Interoception Questionnaire?" +subtitle: "" +summary: "There are many self-report interoception scales out there, but which one should you pick? Let's discuss the pros and cons of each."
+tags: +- Reality Bending Lab +- ReBeL +- University of Sussex +- Psychology +- Interoception +- Mint Scale +- Self-Report Measures +--- + +Hello πŸ‘‹! We are [RΓ³isΓ­n](https://realitybending.github.io/authors/roisin-sharma/) and [Oliver](https://realitybending.github.io/authors/oliver-collins/), two [Research Assistants](https://realitybending.github.io/jobs/assistant/) at the lab, and today we are going to discuss the tricky topic of self-report interoception questionnaires. + +**Interoception**, essentially referring to one's sensation of their internal body, is a fundamental phenomenon that we rely on in everyday life, and recent research highlights it as a trans-diagnostic underpinning of a variety of somatic and psychological difficulties. + +**While we know interoception is very important, the specifics are still being worked out**. Debates continue on what exactly interoception is and is not, and what it encompasses in terms of modalities or processes. Is it limited to visceral sensations (i.e., from internal organs)? Does it include proprioception (i.e., body position sense)? Pain? What about tactile sensations (i.e., touch and skin)? Does it include the interaction with higher-order processes like attention and beliefs? + + +This chaotic and moving landscape has been accompanied by the development and repurposing of different interoception (and interoception-adjacent) questionnaires, each with its own philosophy and approach. Carefully choosing a good measure of interoception is crucial to avoid adding to the [**jingle-jangle fallacy**](https://en.wikipedia.org/wiki/Jingle-jangle_fallacies) plaguing the field, in which discrepancies and contradictions between results *"related to interoception"* are driven by differences in what aspect of it is actually being measured.
+ + +Moreover, unlike *exteroception* (vision, audition, etc.), where researchers can easily manipulate external stimuli to validate a participant's response, interoception presents a unique challenge: the stimuli originate from within the body. Because internal states are difficult to manipulate or observe directly, objective validation is complex. Nonetheless, especially as "objective" tasks like the Heart Beat Counting Task (HCT; [Schandry, 1981](https://onlinelibrary-wiley-com.sussex.idm.oclc.org/doi/10.1111/j.1469-8986.1981.tb02486.x)) have their own methodological drawbacks, self-report questionnaires remain scalable, practical, and widely used tools for assessing interoception. Let's explore the most popular and established questionnaires. + +## Questionnaires Overview + +### 😨 Body Perception Questionnaire (BPQ) + +The **BPQ** is one of the earliest interoception scales, originally built by [Porges in 1993](https://terpconnect.umd.edu/~sporges/body/body.txt). This questionnaire focuses on the autonomic nervous system, involved in stress responses, and thus is mainly concerned with internal sensing when there are problems (e.g., 'tremor in my lips' and 'general jitteriness' being two items for body awareness). This makes the scale useful in clinical contexts for investigating maladaptive interoception, particularly in patients who have a dysregulated autonomic nervous system. However, if you are interested in interoception in a wider context, other questionnaires may be more appropriate. + +
+![The Autonomic Nervous System (ANS)](ans.webp)
+ + +### πŸ§˜β€β™€οΈ Multidimensional Assessment of Interoceptive Awareness (MAIA) + +The **Multidimensional Assessment of Interoceptive Awareness (MAIA)** (the MAIA-2 being the most recent version) is another widely used questionnaire that accounts for body awareness in positive states, deriving from research on emotional regulation and pain. This questionnaire was created because [Mehling et al. (2012)](https://doi.org/10.1371/journal.pone.0048230) believed Western medicine focused too much on bodily awareness as a maladaptive trait, even though research was increasingly finding health benefits from a sense of embodiment. It was specifically designed to assess mind-body therapies and was finalised based on data from individuals with various therapeutic backgrounds, including yoga, tai chi and breath-work. The MAIA reconceptualises bodily awareness not only as an anxiety-related process but also as an integral part of mindfulness. This translates to many of the questions focusing on _metacognitive beliefs_ about one's body and emotions, as well as some targeting other mindfulness-related processes more directly, such as attention regulation and non-reactivity. The MAIA includes subscales encompassing self-regulation abilities which, while important, might be conceptualized as distinct from core interoception. + + +### 🀧 Interoceptive Accuracy Scale (IAS) + +More recently, the **Interoceptive Accuracy Scale (IAS)** [(Murphy et al., 2019)](https://doi-org.sussex.idm.oclc.org/10.1177/1747021819879826) took the opposite route, trying to remove contamination by meta-cognitive processes in order to focus on interoceptive _accuracy_ (distinct from interoceptive _attention_). It includes 21 questions ("I can always accurately perceive when...") pertaining to discrete, clear, and "objectifiable" interoceptive events, intended to be meaningful and consistently interpreted across participants (including those who have difficulty perceiving internal sensations).
+ + +### πŸƒ Multimodal Interoception Questionnaire (Mint) + +The **Multimodal Interoception Questionnaire (Mint;** [**Makowski et al., 2025**](https://doi.org/10.31234/osf.io/8qrht_v1)) is the most recent interoception questionnaire, designed to address the caveats and limitations of its predecessors by building on established measures and synthesising previous research. Fundamentally, the Mint takes a "**context-by-modality**" approach to item development, encompassing a wide range of (seven) **modalities** of interoceptive experience (cardiac, respiratory, gastric, etc.) while also controlling for the **contexts** in which these may appear (covering negative (*anxious*) and positive (*sexual*) arousal states). The Mint also incorporates both adaptive and maladaptive aspects of interoception (interoceptive confusion), as well as items targeting different levels of processing. + +Importantly, this questionnaire was developed with the aim of addressing some of the methodological shortcomings of previous interoception questionnaires, such as *interpretation variance*, *state dependency* (the fact that respondents "anchor" their answers to their current physiological state rather than their general trait), and *recency effects* (recent, salient physical experiences disproportionately influencing scores), in particular by providing a clear contextual reference for each item. The validation study showed strong correlations with the above questionnaires (suggesting that it can be used as a comprehensive replacement), while also demonstrating superior predictive power for a variety of clinical conditions. + +
+![Items of the Multimodal Interoception Questionnaire (Mint)](mint.png)
+ + + + +### Others + +- The **Interoceptive Attention Scale (IATS; Gabriele et al., 2021)**: Attention to bodily signals. Designed as the orthogonal counterpart of the Interoceptive Accuracy Scale, also using consistent phrasing for all statements ('Most of the time my attention is focused on...'). +- The **Three-Domain Interoceptive Sensations Questionnaire (THISQ; [Vlemincx et al., 2021](https://doi.org/10.1080/08870446.2021.2009479))**: Neutral internal sensations (not emotionally valenced), including cardiorespiratory activation, deactivation, and gastroesophageal sensations. +- The **Interoception Sensory Questionnaire (ISQ; [Fiene, 2018](https://link.springer.com/article/10.1007/s10803-018-3600-3))**: Designed to assess confusion about interoceptive bodily states unless these states are extreme (Alexisomia). +- The **Interoceptive Confusion Questionnaire (ICQ; [Brewer, 2016](https://royalsocietypublishing.org/rsos/article/3/10/150664/36458/Alexithymia-a-general-deficit-of))**: Assesses confusion about and misinterpretation of bodily signals. +- The **Body Consciousness Scale (BCS; Miller et al., 1981)**: Awareness of the "private body" (internal sensations) and the "public body" (observable aspects of the body). + + + +## In summary: which interoception questionnaire should I pick? + +Interoceptive questionnaires are a product of their time, often molded by specific contextual demands and underlying theoretical frameworks. As our understanding of interoception evolves, so too do the tools we use to measure it. It might seem like the best option is to pick a questionnaire based on the interoception facet you are interested in (e.g., confusion, attention, accuracy, ...), but as the field is still developing and theoretical models are in flux, it might be more useful to consider a broader, more comprehensive, theory-agnostic questionnaire that captures multiple facets and modalities of interoception, such as the **Mint**.
+ + + +## References + +Bergomi, C., Tschacher, W., & Kupper, Z. (2012). The Assessment of Mindfulness with Self-Report Measures: Existing Scales and Open Issues. _Mindfulness_, _4_(3), 191–202. https://doi.org/10.1007/s12671-012-0110-9 + +Gabriele, E., Spooner, R., Brewer, R., & Murphy, J. (2021). Dissociations between self-reported interoceptive accuracy and attention: Evidence from the interoceptive attention scale. _Biological Psychology_, _168_, 108243. https://doi.org/10.1016/j.biopsycho.2021.108243 + +Kolacz, J., & Bjorum, E. (2023). Measuring Autonomic Symptoms with the Body Perception Questionnaire. _The Traumatic Stress Research Consortium_. https://www.traumascience.org/s/TSRCMarch2023Newsletter.pdf + +Kolacz, J., Holmes, L., & Porges, S. W. (2018). Body Perception Questionnaire (BPQ) manual. Traumatic Stress Research Consortium. + +Makowski, D., Neves, A., Benn, E., Bennett, M., & Poerio, G. (2025). The Mint Scale: A Fresh Validation of the Multimodal Interoception Questionnaire and Comparison to the MAIA, BPQ and IAS. [https://doi.org/10.31234/osf.io/8qrht_v1](https://doi.org/10.31234/osf.io/8qrht_v1) + +Mehling, W. E., Price, C., Daubenmier, J. J., Acree, M., Bartmess, E., & Stewart, A. (2012). The Multidimensional Assessment of Interoceptive Awareness (MAIA). _PLOS ONE_, _7_(11), e48230. https://doi.org/10.1371/journal.pone.0048230 + +Mehling, W. E., Acree, M., Stewart, A., Silas, J., & Jones, A. (2018). The Multidimensional Assessment of Interoceptive Awareness, Version 2 (MAIA-2). _PLOS ONE_, _13_(12), e0208034. https://doi.org/10.1371/journal.pone.0208034 + +Miller, L. C., Murphy, R., & Buss, A. H. (1981). Consciousness of body: Private and public. _Journal of Personality and Social Psychology_, _41_(2), 397–406. https://doi.org/10.1037/0022-3514.41.2.397 + +Murphy, J., Brewer, R., Plans, D., Khalsa, S. S., Catmur, C., & Bird, G. (2019). Testing the independence of self-reported interoceptive accuracy and attention.
_Quarterly Journal of Experimental Psychology_, _73_(1), 115–133. https://doi.org/10.1177/1747021819879826 + +Solano DurΓ‘n, P., Morales, J.-P., & Huepe, D. (2024). Interoceptive awareness in a clinical setting: the need to bring interoceptive perspectives into clinical evaluation. _Frontiers in Psychology_, _15_(1244701). https://doi.org/10.3389/fpsyg.2024.1244701 + +Porges, S. W. (1993). _Body Perception Questionnaire_. Umd.edu. https://terpconnect.umd.edu/~sporges/body/body.txt + +Schandry, R. (1981). Heart Beat Perception and Emotional Experience. _Psychophysiology_, _18_(4), 483–488. https://doi.org/10.1111/j.1469-8986.1981.tb02486.x + +Sherrington, C. S. (1906). The integrative action of the nervous system. Yale University Press. + +Vlemincx, E., Walentynowicz, M., Zamariola, G., Van Oudenhove, L., & Luminet, O. (2021). A novel self-report scale of interoception: the three-domain interoceptive sensations questionnaire (THISQ). _Psychology & Health_, _38_(9), 1–20. https://doi.org/10.1080/08870446.2021.2009479 \ No newline at end of file diff --git a/content/post/2025-12-20-Interoception/mint.png b/content/post/2025-12-20-Interoception/mint.png new file mode 100644 index 000000000..541f223e8 Binary files /dev/null and b/content/post/2025-12-20-Interoception/mint.png differ
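Reviewer note on the DataPipe post touched by this diff: the jsPsychPipe snippet interpolates a `participantID` variable that the excerpt never defines, so readers pasting only that block will hit a ReferenceError. A minimal sketch of how it could be generated, assuming a hypothetical helper name `makeParticipantID` (jsPsych also ships `jsPsych.randomization.randomID()` for the same purpose):

```javascript
// Hypothetical helper: build a random alphanumeric participant ID.
// jsPsych's built-in jsPsych.randomization.randomID() serves the same role.
function makeParticipantID(length = 10) {
  const chars =
    "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789";
  let id = "";
  for (let i = 0; i < length; i += 1) {
    id += chars[Math.floor(Math.random() * chars.length)];
  }
  return id;
}

// Generate the ID once, before the timeline is built, so the same value
// is reused in the jsPsychPipe trial's `filename` field.
const participantID = makeParticipantID();
console.log(`${participantID}.csv`);
```

Defining the ID once near the top of the experiment script keeps the saved filename stable for the whole session, whereas generating it inside the trial would yield a different name each time it is evaluated.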