scGPT
Bo Wang Lab (University of Toronto)
Foundation model for single-cell multi-omics, built on generative pre-training over roughly 33 million cells. Fine-tunes to state-of-the-art performance on cell type annotation, multi-batch integration, perturbation-response prediction, and gene network inference.
Best For
Multi-task single-cell analysis from a single pre-trained foundation model
License
Open Source (check repo)
Strengths
- Pre-trained on 33M+ cells
- Multi-task fine-tuning
- Gene network inference
- Perturbation response prediction
Limitations
- Requires GPU for fine-tuning
- Performance debated vs. simpler baselines on some tasks
- Large model size
Related Tools
More in Single-Cell
CellTypist
Teichmann Lab (Wellcome Sanger Institute)
Automated cell type annotation tool for scRNA-seq data using logistic regression models trained on curated cross-tissue immune cell atlases. Provides a growing encyclopedia of pre-trained cell type models.
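CellTypist's core prediction step is multinomial logistic regression over gene expression. A minimal pure-Python sketch of that scoring step follows; the gene panel, weights, and cell type labels here are invented for illustration and are not CellTypist's actual models, which span thousands of genes trained on curated atlases:

```python
import math

# Hypothetical per-cell-type weights over a tiny gene panel (illustrative only).
GENES = ["CD3E", "CD19", "LYZ"]
MODELS = {  # cell type -> (weight vector over GENES, bias)
    "T cell":   ([2.1, -1.0, -0.5], -0.2),
    "B cell":   ([-0.8, 2.4, -0.6], -0.1),
    "Monocyte": ([-0.7, -0.9, 2.2], 0.0),
}

def predict_cell_type(expression):
    """Score one cell's (log-normalized) expression vector against each
    cell-type model and return softmax probabilities over cell types."""
    logits = {
        label: sum(w * x for w, x in zip(weights, expression)) + bias
        for label, (weights, bias) in MODELS.items()
    }
    m = max(logits.values())  # subtract the max for numerical stability
    exps = {label: math.exp(z - m) for label, z in logits.items()}
    total = sum(exps.values())
    return {label: e / total for label, e in exps.items()}

# A cell with high CD3E expression scores highest under the T cell model.
probs = predict_cell_type([3.2, 0.1, 0.4])
print(max(probs, key=probs.get))  # → T cell
```

The real tool applies this over every cell in an AnnData matrix and adds optional majority voting within clusters; the sketch only shows the per-cell decision rule.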
Geneformer
Theodoris Lab (Harvard / MIT)
Transformer-based foundation model pre-trained on ~30M single-cell transcriptomes. Learns context-dependent gene network dynamics and transfers to diverse downstream tasks including disease modeling and therapeutic target prioritization.