Resting-State fMRI Analysis in R#
From Preprocessed Data to Functional Connectivity#
Authors: Giulia Baracchini & Monika Doerig
Date: 13 Jan 2026
License:
Note: If this notebook uses neuroimaging tools from Neurocontainers, those tools retain their original licenses. Please see Neurodesk citation guidelines for details.
Citation and Resources:#
Tools included in this workflow#
R:
R Core Team. (2025). R: A language and environment for statistical computing (Version 4.5.2) [Software]. R Foundation for Statistical Computing. https://www.R-project.org/
Workflows this work is based on#
Original work from Giulia Baracchini:
Dataset#
HCP
Schaefer parcellation
Introduction#
This notebook is adapted from Giulia Baracchini’s comprehensive fMRI preprocessing tutorial (available on GitHub), which covers the essential steps of resting-state fMRI analysis from raw data preprocessing through functional connectivity analyses.
The original tutorial provides a complete pipeline covering:
Standard fMRI preprocessing - preparing raw neuroimaging data for analysis
Resting-state fMRI denoising - removing artifacts and noise from the signal
Time-series extraction and parcellation - converting voxel-level data to meaningful brain regions
Functional connectivity analyses - examining statistical relationships between brain regions
What This Notebook Covers#
While the original tutorial works with multiple subjects and uses the Schaefer 200 region-7 network parcellation, this notebook focuses on a single-subject analysis using the Schaefer 200 region-17 network parcellation applied to subject 101309.
Key Concepts#
Parcellation: The process of grouping individual voxels into meaningful brain regions or “parcels.” Instead of analyzing thousands of individual voxels, we average signals within anatomically or functionally defined regions, making our analyses more interpretable and computationally manageable.
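As a minimal illustration (toy data, not part of the tutorial), parcellation amounts to averaging the voxel time series within each labelled region:

```r
# Toy example: 5 time points x 6 "voxels", grouped into 2 parcels
set.seed(1)
voxels <- matrix(rnorm(5 * 6), nrow = 5, ncol = 6)  # time x voxel
labels <- c(1, 1, 1, 2, 2, 2)                       # parcel label per voxel

# Average the voxel signals within each parcel -> time x parcel matrix
parcels <- sapply(unique(labels), function(p) rowMeans(voxels[, labels == p]))
dim(parcels)  # 5 2
```

The same idea scales up to real data: thousands of voxels collapse into a handful of parcel-level time series.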
Functional Connectivity (FC): A statistical measure (typically Pearson’s correlation) that quantifies how synchronously different brain regions activate during rest. The result is a region × region matrix where each entry represents the strength of correlation between two brain areas.
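A minimal sketch of the idea, using synthetic time series rather than real data: regions that share an underlying signal correlate strongly, while unrelated ones do not.

```r
set.seed(42)
time_pts <- seq(0, 10, length.out = 100)
region_a <- sin(time_pts) + rnorm(100, sd = 0.3)  # shares a common signal
region_b <- sin(time_pts) + rnorm(100, sd = 0.3)
region_c <- rnorm(100)                            # pure noise, unrelated

cor(region_a, region_b)  # strongly positive
cor(region_a, region_c)  # near zero
```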
Analysis Pipeline#
This notebook focuses on the analysis phase using already preprocessed and parcellated data. Starting with clean time series data from 200 brain regions, we implement:
Data normalization - standardizing time series signals across regions
Functional connectivity calculation - computing correlation matrices between brain regions
Fisher z-transformation - normalizing correlation values for statistical analysis
Network visualization - creating heatmaps and brain plots to visualize connectivity patterns
Nodal strength analysis - quantifying each region’s overall connectivity
Relationships to other measures of brain organisation - relating connectivity patterns to brain organization gradients and gene expression patterns
The Schaefer parcellation we’re using divides the brain into 200 regions across 17 functional networks, providing a good balance between spatial resolution and interpretability for resting-state connectivity analyses.
Running R in Jupyter with Python Kernel#
⚠️ Run R in Jupyter Notebook: This notebook uses R magic commands (%%R) to run R code within a Python kernel environment. This approach offers several advantages:
- No kernel switching required - all code runs in the Python kernel
- Seamless Python ↔ R integration - easy data exchange between languages
- Fully automated setup - no manual kernel installation needed
Note: Alternatively, Jupyter supports native R kernels, but the magic command approach keeps everything within the Python kernel for simpler workflow management.
Setup Steps#
1. Install R runtime and packages via mamba:
# Install R runtime, Python↔R bridge, and geospatial R packages via conda
# Note: r-base provides the R runtime; we avoid r-essentials (80+ packages) to prevent solver conflicts
!mamba install -c conda-forge r-base rpy2 r-sf r-units r-s2 r-ggplot2 -y -q
# Set env vars before loading rpy2
# Set PROJ database path so sf can find coordinate reference systems
# (needed when r-sf is installed via conda)
import os
proj_path = os.path.join(os.environ.get("CONDA_PREFIX", "/opt/conda"), "share", "proj")
os.environ["PROJ_LIB"] = proj_path
os.environ["PROJ_DATA"] = proj_path
2. Enable %%R magic commands:
This extension only needs to be loaded once per session. After the installations above, the notebook can run both Python 3 and R code.
%load_ext rpy2.ipython
3. Install R packages from CRAN and r-universe:
%%R
# ==============================================================================
# Setup a User-Specific R Library and Install Packages
# ==============================================================================
# 1. Define and create a personal library path
user_lib <- "~/R/library"
dir.create(user_lib, recursive = TRUE, showWarnings = FALSE)
# 2. Add the personal library to R's library search path
.libPaths(c(user_lib, .libPaths()))
# 3. Set repositories: CRAN + ggseg r-universe
options(repos = c(
ggseg = "https://ggseg.r-universe.dev",
CRAN = "https://cloud.r-project.org"
))
# 4. Install required packages from CRAN
install.packages(c("remotes", "ggplot2", "dplyr", "tidyr", "superheat", "knitr"))
# 5. Install ggseg from r-universe
install.packages("ggseg")
# 6. Install ggsegSchaefer - try r-universe first, fall back to GitHub
# (r-universe may not have binaries for all platforms, e.g. linux/arm64)
install.packages("ggsegSchaefer")
if (!requireNamespace("ggsegSchaefer", quietly = TRUE)) {
message("ggsegSchaefer not found via r-universe, installing from GitHub...")
remotes::install_github("ggseg/ggsegSchaefer", upgrade = "never")
}
# 7. Verify the library path
print("R packages will be installed in and loaded from:")
print(.libPaths())
[C++ compilation output from building the dplyr and tidyr shared libraries omitted.]
[1] "R packages will be installed in and loaded from:"
[1] "/home/jovyan/R/library" "/opt/conda/lib/R/library"
* installing *source* package ‘remotes’ ...
** this is package ‘remotes’ version ‘2.5.0’
** package ‘remotes’ successfully unpacked and MD5 sums checked
** using staged installation
** R
** inst
** byte-compile and prepare package for lazy loading
** help
*** installing help indices
** building package indices
** installing vignettes
** testing if installed package can be loaded from temporary location
** testing if installed package can be loaded from final location
** testing if installed package keeps a record of temporary installation path
* DONE (remotes)
[Similar installation output follows for ggplot2 4.0.2, dplyr 1.2.0, knitr 1.51, tidyr 1.3.2, superheat 0.1.0, and ggseg 2.0.0 — all DONE.]
Installing packages into ‘/home/jovyan/R/library’
(as ‘lib’ is unspecified)
Warning: unable to access index for repository https://ggseg.r-universe.dev/src/contrib:
cannot open URL 'https://ggseg.r-universe.dev/src/contrib/PACKAGES'
trying URL 'https://cloud.r-project.org/src/contrib/remotes_2.5.0.tar.gz'
trying URL 'https://cloud.r-project.org/src/contrib/ggplot2_4.0.2.tar.gz'
trying URL 'https://cloud.r-project.org/src/contrib/dplyr_1.2.0.tar.gz'
trying URL 'https://cloud.r-project.org/src/contrib/tidyr_1.3.2.tar.gz'
trying URL 'https://cloud.r-project.org/src/contrib/superheat_0.1.0.tar.gz'
trying URL 'https://cloud.r-project.org/src/contrib/knitr_1.51.tar.gz'
The downloaded source packages are in
‘/tmp/Rtmpb1t3Uw/downloaded_packages’
Installing package into ‘/home/jovyan/R/library’
(as ‘lib’ is unspecified)
trying URL 'https://cloud.r-project.org/src/contrib/ggseg_2.0.0.tar.gz'
Content type 'application/x-gzip' length 4039860 bytes (3.9 MB)
==================================================
downloaded 3.9 MB
The downloaded source packages are in
‘/tmp/Rtmpb1t3Uw/downloaded_packages’
Installing package into ‘/home/jovyan/R/library’
(as ‘lib’ is unspecified)
Warning: unable to access index for repository https://ggseg.r-universe.dev/src/contrib:
cannot open URL 'https://ggseg.r-universe.dev/src/contrib/PACKAGES'
In addition: Warning message:
package ‘ggsegSchaefer’ is not available for this version of R
A version of this package for your version of R might be available elsewhere,
see the ideas at
https://cran.r-project.org/doc/manuals/r-patched/R-admin.html#Installing-packages
4. All subsequent R code uses the %%R cell magic:
Each cell containing R code must start with %%R to be executed as R code.
%%R
# Load packages
library(ggseg)
library(ggsegSchaefer)
library(sf)
library(units)
library(s2)
library(superheat)
library(ggplot2)
library(dplyr)
library(tidyr)
Linking to GEOS 3.14.1, GDAL 3.12.2, PROJ 9.7.1; sf_use_s2() is TRUE
udunits database from /opt/conda/lib/R/library/units/share/udunits/udunits2.xml
Attaching package: ‘dplyr’
The following objects are masked from ‘package:stats’:
filter, lag
The following objects are masked from ‘package:base’:
intersect, setdiff, setequal, union
Data download#
%%R
# Create data directory
dir.create("data", showWarnings = FALSE)
# Download files into data folder
download.file("https://raw.githubusercontent.com/giuliabaracc/teaching_fMRI/main/data/margulies2016_fcgradient01_20017Schaefer.csv",
destfile = "data/margulies2016_fcgradient01_20017Schaefer.csv")
download.file("https://raw.githubusercontent.com/giuliabaracc/teaching_fMRI/main/data/gene_pc1_20017Schaefer.csv",
destfile = "data/gene_pc1_20017Schaefer.csv")
# Don't forget the main data file too
download.file("https://raw.githubusercontent.com/giuliabaracc/teaching_fMRI/main/data/sub-101309_Schaefer20017.txt",
destfile = "data/sub-101309_Schaefer20017.txt")
trying URL 'https://raw.githubusercontent.com/giuliabaracc/teaching_fMRI/main/data/margulies2016_fcgradient01_20017Schaefer.csv'
Content type 'text/plain; charset=utf-8' length 8044 bytes
==================================================
downloaded 8044 bytes
trying URL 'https://raw.githubusercontent.com/giuliabaracc/teaching_fMRI/main/data/gene_pc1_20017Schaefer.csv'
Content type 'text/plain; charset=utf-8' length 8032 bytes
==================================================
downloaded 8032 bytes
trying URL 'https://raw.githubusercontent.com/giuliabaracc/teaching_fMRI/main/data/sub-101309_Schaefer20017.txt'
Content type 'text/plain; charset=utf-8' length 3094605 bytes (3.0 MB)
==================================================
downloaded 3.0 MB
Load data and visualize data from one subject#
%%R
### We are going to use data from subject 101309 in Schaefer 200-17 space
df <- read.table("./data/sub-101309_Schaefer20017.txt")
atlas <- read.csv("./data/margulies2016_fcgradient01_20017Schaefer.csv")$region # our 200-17 region names, used for plotting later
%%R
# Add a Time column so we can create a nice labelled plot
df2 <- df
df2$Time <- 1:nrow(df2)
# Reshape data frame to long format
df_long <- pivot_longer(df2,
cols = -Time,
names_to = "Region",
values_to = "Signal")
# Plot
ggplot(df_long, aes(x = Time, y = Signal, color = Region)) +
geom_line(alpha = 0.6) +
theme_minimal() +
labs(title = "ROI Time Series", x = "Time (TRs)", y = "fMRI BOLD Signal") +
theme(legend.position = "none")
Normalize time series data#
%%R
###First step: normalise their time series data
df_z <- scale(df, center = TRUE, scale = TRUE)
df_z <- as.data.frame(df_z) #make data frame
#Plot df_z
# Add a Time column so we can create a nice labelled plot
df_z$Time <- 1:nrow(df_z)
#Reshape data frame to long format
df_z_long <- pivot_longer(df_z,
cols = -Time,
names_to = "Region",
values_to = "Signal")
#Plot
ggplot(df_z_long, aes(x = Time, y = Signal, color = Region)) +
geom_line(alpha = 0.6) +
theme_minimal() +
labs(title = "ROI Time Series", x = "Time (TRs)", y = "fMRI BOLD Signal") +
theme(legend.position = "none")
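For reference, `scale()` with `center = TRUE, scale = TRUE` is just a column-wise z-score, i.e. subtracting each column's mean and dividing by its standard deviation. A quick check on toy values (not the subject data):

```r
x <- c(2, 4, 6, 8)
z <- (x - mean(x)) / sd(x)                # manual z-score
all.equal(as.numeric(scale(x)), z)        # TRUE
```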
Calculate their Functional Connectivity (FC) matrix#
Let’s do some analyses on these data now! First thing, let’s calculate functional connectivity (FC). As a refresher, FC is a statistical construct derived (typically) as the Pearson’s correlation between pairs of brain regions. The output is therefore a region x region matrix where each entry indicates how strong the correlation is between two regions. In our case, our FC matrix will be 200x200.
%%R
###Second step: calculate their FC matrix
matrix_df <- as.matrix(df) #convert to matrix format to do calculations
# Pearson's correlation
cor_mat <- cor(matrix_df) #compute FC on matrix format variable
#Look at distribution of FC values
hist(cor_mat)
Normalise FC values#
For group analyses, we need to normalise these FC values using the Fisher z-transformation.
%%R
###Third step: normalise their FC values
z_mat <- atanh(cor_mat)
# Clean matrix: replace Inf values (e.g. on the diagonal, where r = 1) produced by the Fisher z-transform
z_mat[!is.finite(z_mat)] <- NA
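Why `atanh()`? Fisher's z-transform is defined as z = 0.5 * log((1 + r) / (1 - r)), which is exactly the inverse hyperbolic tangent. A quick check on toy correlation values:

```r
r <- c(-0.8, -0.2, 0, 0.3, 0.9)
z_manual <- 0.5 * log((1 + r) / (1 - r))  # Fisher z, written out
all.equal(z_manual, atanh(r))             # TRUE
```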
%%R
hist(z_mat)
Visualise FC matrix#
You can play with the value limits, but in general you want to make sure you see blocks in your matrix reflecting the brain’s functional network organisation.
%%R
###Fourth step: roughly visualise their FC matrix
# Use these names to label your FC matrix
colnames(z_mat) <- atlas
rownames(z_mat) <- atlas
# Example: "17Networks_LH_VisCent_ExStr_1" → "VisCent"
networks <- sub("17Networks_.._([^_]+).*", "\\1", atlas)
networks <- factor(networks, levels = unique(networks))
reds <- colorRampPalette(c("#fee5d9", "#fcae91", "#fb6a4a", "#de2d26", "#a50f15"))(100)
superheat(z_mat,
membership.rows = networks,
membership.cols = networks,
left.label.size = 0.4,
bottom.label.size = 0,
scale = FALSE,
heat.pal = reds,
# make the legend bigger
legend.height = 0.25,
legend.width = 2,
legend.text.size = 10)
In addition: Warning message:
Using `size` aesthetic for lines was deprecated in ggplot2 3.4.0.
ℹ Please use `linewidth` instead.
ℹ The deprecated feature was likely used in the superheat package.
Please report the issue to the authors.
This warning is displayed once per session.
Call `lifecycle::last_lifecycle_warnings()` to see where this warning was
generated.
Calculate Nodal strength#
Let’s take this a step further. Let’s calculate how much each region is connected to the rest of the brain, a measure that is called node strength. Node strength, or region strength, is “the sum of weights of links connected to the node”. This will allow us to obtain a 1x200 vector that we can relate to a bunch of other measures of brain organisation.
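On a toy 3-node weighted network (made-up values), node strength is simply the row sum of the connectivity matrix:

```r
# Toy weighted FC matrix; NA on the diagonal (self-connections excluded)
W <- matrix(c( NA, 0.5, 0.2,
              0.5,  NA, 0.4,
              0.2, 0.4,  NA), nrow = 3, byrow = TRUE)
rowSums(W, na.rm = TRUE)  # 0.7 0.9 0.6
```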
%%R
###Fifth step: calculate nodal strength
node_strength <- rowSums(z_mat, na.rm = TRUE)
node_strength <- as.data.frame(node_strength)
rownames(node_strength) <- seq_len(200)
node_strength[, 2] <- atlas
colnames(node_strength) <- c("Value", "region")
%%R
options(repr.plot.width = 20, repr.plot.height = 8)
node_strength %>%
mutate(label = case_when(
grepl("^17Networks_LH", region) ~ paste0("lh_", region),
grepl("^17Networks_RH", region) ~ paste0("rh_", region),
TRUE ~ region
)) %>%
ggplot() +
geom_brain(atlas = schaefer17_200(),
position = position_brain(hemi ~ view,
view = c("lateral", "medial")),
mapping = aes(fill = Value),
colour = "black",
size = 0.1) +
scale_fill_viridis_c(limits = c(50, 90), oob = scales::squish) +
theme_void() +
labs(fill = "Node Strength")
Merging atlas and data by region and label.
%%R
top_n <- 15
node_strength %>%
slice_max(Value, n = top_n) %>%
ggplot(aes(x = reorder(region, Value), y = Value)) +
geom_col(fill = "steelblue") +
coord_flip() +
labs(title = paste("Top", top_n, "Node Strengths"),
x = "Brain Region",
y = "Node Strength") +
theme_minimal() +
theme(axis.text.y = element_text(size = 8))
Relate their nodal strength measures to measures of brain organisation and gene organisation#
%%R
###Sixth step: relate their nodal strength measures to measures of brain organisation and gene organisation
brain_organisation <- read.csv("./data/margulies2016_fcgradient01_20017Schaefer.csv")
gene_organisation <- read.csv("./data/gene_pc1_20017Schaefer.csv")
%%R
brain_organisation %>%
mutate(label = case_when(
grepl("^17Networks_LH", region) ~ paste0("lh_", region),
grepl("^17Networks_RH", region) ~ paste0("rh_", region),
TRUE ~ region
)) %>%
ggplot() +
geom_brain(atlas = schaefer17_200(),
position = position_brain(hemi ~ view,
view = c("lateral", "medial")),
mapping = aes(fill = Value),
colour = "black",
size = 0.1) +
scale_fill_viridis_c(limits = c(-6, 6), oob = scales::squish) +
theme_void() +
labs(fill = "Brain Organisation")
Merging atlas and data by region and label.
%%R
gene_organisation %>%
mutate(label = case_when(
grepl("^17Networks_LH", region) ~ paste0("lh_", region),
grepl("^17Networks_RH", region) ~ paste0("rh_", region),
TRUE ~ region
)) %>%
ggplot() +
geom_brain(atlas = schaefer17_200(),
position = position_brain(hemi ~ view,
view = c("lateral", "medial")),
mapping = aes(fill = Value),
colour = "black",
size = 0.1) +
scale_fill_viridis_c(limits = c(-100, 100), oob = scales::squish) +
theme_void() +
labs(fill = "Gene Organisation")
Merging atlas and data by region and label.
%%R
#Compute correlation amongst these vectors: nodal strength, brain organisation and gene organisation
corr_node_brainorg <- cor(node_strength$Value, brain_organisation$Value, method = 'spearman')
corr_node_geneorg <- cor(node_strength$Value, gene_organisation$Value, method = 'spearman')
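We use Spearman's rho here because it only assumes a monotonic relationship; it is equivalent to Pearson's correlation computed on the ranks of the data. A quick check on toy values:

```r
a <- c(3, 1, 4, 1, 5, 9, 2, 6)
b <- c(2, 7, 1, 8, 2, 8, 1, 8)
all.equal(cor(a, b, method = "spearman"),
          cor(rank(a), rank(b)))  # TRUE
```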
%%R
# Put results into a data frame
cor_results <- data.frame(
Comparison = c("Nodal Strength vs Brain Organisation",
"Nodal Strength vs Gene Organisation"),
Spearman_rho = c(corr_node_brainorg, corr_node_geneorg)
)
library(knitr)
kable(cor_results, digits = 3, caption = "Spearman correlations for Nodal Strength")
Table: Spearman correlations for Nodal Strength
|Comparison | Spearman_rho|
|:------------------------------------|------------:|
|Nodal Strength vs Brain Organisation | -0.359|
|Nodal Strength vs Gene Organisation | 0.459|
Complete session information for reproducibility#
%%R
cat("=== R Session Information ===\n\n")
sessionInfo()
=== R Session Information ===
R version 4.5.2 (2025-10-31)
Platform: x86_64-conda-linux-gnu
Running under: Ubuntu 24.04.3 LTS
Matrix products: default
BLAS/LAPACK: /opt/conda/lib/libopenblasp-r0.3.30.so; LAPACK version 3.12.0
locale:
[1] LC_CTYPE=C.UTF-8 LC_NUMERIC=C LC_TIME=C
[4] LC_COLLATE=C LC_MONETARY=C LC_MESSAGES=C
[7] LC_PAPER=C LC_NAME=C LC_ADDRESS=C
[10] LC_TELEPHONE=C LC_MEASUREMENT=C LC_IDENTIFICATION=C
time zone: Etc/UTC
tzcode source: system (glibc)
attached base packages:
[1] tools stats graphics grDevices utils datasets methods
[8] base
other attached packages:
[1] knitr_1.51 tidyr_1.3.2 dplyr_1.2.0
[4] ggplot2_4.0.2 superheat_0.1.0 s2_1.1.9
[7] units_1.0-0 sf_1.1-0 ggsegSchaefer_2.0.0
[10] ggseg_2.0.0
loaded via a namespace (and not attached):
[1] gtable_0.3.6 compiler_4.5.2 tidyselect_1.2.1
[4] Rcpp_1.1.1 scales_1.4.0 R6_2.6.1
[7] labeling_0.4.3 generics_0.1.4 classInt_0.4-11
[10] ggseg.formats_0.0.1 tibble_3.3.1 DBI_1.3.0
[13] pillar_1.11.1 RColorBrewer_1.1-3 rlang_1.1.7
[16] xfun_0.56 S7_0.2.1 viridisLite_0.4.3
[19] cli_3.6.5 withr_3.0.2 magrittr_2.0.4
[22] class_7.3-23 wk_0.9.5 grid_4.5.2
[25] lifecycle_1.0.5 vctrs_0.7.1 KernSmooth_2.23-26
[28] evaluate_1.0.5 proxy_0.4-29 glue_1.8.0
[31] farver_2.1.2 e1071_1.7-17 purrr_1.2.1
[34] pkgconfig_2.0.3