CoCoLIT: ControlNet-Conditioned Latent Image Translation for MRI to Amyloid PET Synthesis

CoCoLIT is a diffusion-based latent generative framework for synthesizing amyloid PET scans from structural MRI. It addresses challenges in 3D neuroimaging data translation through a novel Weighted Image Space Loss (WISL), Latent Average Stabilization (LAS), and ControlNet-based conditioning, improving both synthesis quality and inference consistency.

Paper: CoCoLIT: ControlNet-Conditioned Latent Image Translation for MRI to Amyloid PET Synthesis
Code: https://github.com/brAIn-science/CoCoLIT

Installation

This repository requires Python 3.10 and PyTorch 2.0 or later. To install the latest version, run:

pip install cocolit

Usage

After installing the package, you can convert a T1-weighted MRI to a Florbetapir SUVR map by running:

mri2pet --i /path/to/t1.nii.gz --o /path/to/output.nii.gz

To replicate the results presented in the paper, include the --m 64 flag.
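If you want to drive the CLI from Python (for example, to batch-convert a directory of scans), a thin wrapper around the documented flags can help. This is an illustrative sketch, not part of the package: the helper name `build_mri2pet_cmd` is hypothetical, and only the `--i`, `--o`, and `--m` flags shown above are assumed to exist.

```python
import subprocess

def build_mri2pet_cmd(in_path, out_path, m=None):
    # Assemble the mri2pet invocation using the flags documented above.
    # `m` is optional; pass 64 to match the paper-replication setting.
    cmd = ["mri2pet", "--i", str(in_path), "--o", str(out_path)]
    if m is not None:
        cmd += ["--m", str(m)]
    return cmd

def run_mri2pet(in_path, out_path, m=None):
    # Invoke the CLI; raises CalledProcessError on failure.
    subprocess.run(build_mri2pet_cmd(in_path, out_path, m), check=True)
```

For example, `run_mri2pet("/path/to/t1.nii.gz", "/path/to/output.nii.gz", m=64)` reproduces the paper-replication command shown above.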

Disclaimer

This software is not intended for clinical use, and the code is not licensed for commercial applications. For commercial inquiries, please contact the corresponding authors.

Citing

Arxiv Preprint:

@article{sargood2025cocolit,
  title={CoCoLIT: ControlNet-Conditioned Latent Image Translation for MRI to Amyloid PET Synthesis},
  author={Sargood, Alec and Puglisi, Lemuel and Cole, James H and Oxtoby, Neil P and Rav{\`\i}, Daniele and Alexander, Daniel C},
  journal={arXiv preprint arXiv:2508.01292},
  year={2025}
}