SLIM-BRAIN: A DATA- AND TRAINING-EFFICIENT FOUNDATION MODEL FOR FMRI DATA ANALYSIS


This repository contains the official implementation of SLIM-Brain. SLIM-Brain is a two-stage, selective-compute pipeline for voxel-level fMRI representation learning. A lightweight global branch ranks informative temporal windows; a high-capacity 4D Hiera–JEPA encoder processes only those windows, focusing compute on brain voxels and drastically reducing memory usage.

[Figure: SLIM-Brain framework overview]
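
The following is a minimal conceptual sketch of this selective-compute idea in PyTorch; the module names (scorer, encoder), tensor layout, and top-k selection rule are illustrative assumptions, not the repository's actual interface.

# Conceptual sketch of the selective-compute idea (illustrative only; module
# names, tensor shapes, and the top-k rule are assumptions, not the SLIM-Brain API)
import torch
import torch.nn as nn

class SelectiveComputeSketch(nn.Module):
    def __init__(self, scorer: nn.Module, encoder: nn.Module, k: int = 4):
        super().__init__()
        self.scorer = scorer    # lightweight global branch: scores each temporal window
        self.encoder = encoder  # high-capacity 4D encoder: runs only on selected windows
        self.k = k              # number of temporal windows to keep

    def forward(self, windows: torch.Tensor) -> torch.Tensor:
        # windows: (batch, num_windows, ...) voxel-level fMRI temporal windows
        scores = self.scorer(windows)                         # (batch, num_windows)
        top_idx = scores.topk(self.k, dim=1).indices          # most informative windows
        batch_idx = torch.arange(windows.size(0)).unsqueeze(1)
        selected = windows[batch_idx, top_idx]                # (batch, k, ...)
        return self.encoder(selected)                         # heavy compute on k windows only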


Installation

Setting up the environment requires Python 3.13 and CUDA-compatible PyTorch for GPU acceleration:

conda create -n hiera-jepa python=3.13.5
conda activate hiera-jepa

# Install dependencies
pip install -r requirements.txt
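
Optionally, a quick check (assuming a CUDA-capable GPU and driver are installed) confirms that PyTorch can see the device:

# Optional sanity check, run in Python after installation
import torch

print("PyTorch:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())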

Project Structure

The codebase is organized into modular components for easy navigation and extension:

hiera-jepa/
├── configs/              # YAML configuration files for training and model parameters
├── checkpoints/          # Saved model weights and training checkpoints
├── hiera/                # Hierarchical Vision Transformer backbone implementation
├── scripts/              # Bash launch scripts (e.g., finetune.sh)
├── finetune.py           # Downstream task training and feature extraction script
└── requirements.txt      # Python package dependencies

Downstream evaluation

  1. Ensure your pre-training data follows the structure below (a chunk-inspection sketch appears after this list):
data_root/
├── ABIDE_train/
├── ABIDE_val/
├── HCP_val/
└── HCP_train/
    ├── 0010001/                # Subject ID
    └── 0010002/
        ├── 0010002_run-1_0000-0199_1.npz  # Data chunk 1
        └── 0010002_run-1_0000-0199_2.npz  # Data chunk 2
  2. Configure downstream dataset loading with the following configuration structure:
task:
  csv: "/path/to/data_csv"

data:
  data_root: /path/to/data_root
  datasets: ["HCP"]
  mode: "directory"
  3. Start downstream training:
# Run downstream training
sh scripts/finetune.sh
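
The arrays stored inside each pre-training .npz chunk are not enumerated in this README, so the sketch below (using the example chunk path from the tree in step 1) simply lists whatever a chunk contains:

# Inspect one pre-training chunk (sketch; the array names inside the .npz
# are not documented here, so we only list what is present)
import numpy as np

chunk_path = "data_root/HCP_train/0010002/0010002_run-1_0000-0199_1.npz"
with np.load(chunk_path) as chunk:
    for name in chunk.files:
        print(name, chunk[name].shape, chunk[name].dtype)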

Model Checkpoints

Our pre-trained model weights can be found in the checkpoints directory: ./checkpoints/best_model.pth
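
The internal layout of best_model.pth (e.g., whether the weights sit under a "model" or "state_dict" key) is not documented here; a minimal sketch to inspect it before fine-tuning:

# Inspect the released checkpoint (sketch; the key layout is an assumption,
# so we only print top-level keys instead of loading into a model)
import torch

ckpt = torch.load("./checkpoints/best_model.pth", map_location="cpu")
keys = list(ckpt.keys()) if isinstance(ckpt, dict) else [type(ckpt).__name__]
print(keys[:10])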
