alphafold2 by lucidrains

PyTorch implementation for protein structure prediction

created 4 years ago
1,614 stars

Top 26.6% on sourcepulse

Project Summary

This repository provides an unofficial PyTorch implementation of AlphaFold2, targeting researchers and developers interested in protein structure prediction. It aims to replicate DeepMind's AlphaFold2 architecture, offering flexibility in predicting distograms, angles, and 3D coordinates, with a focus on integrating various attention mechanisms and structural refinement techniques.

How It Works

The core of the implementation is a modular Transformer architecture that processes sequence and Multiple Sequence Alignment (MSA) data. It incorporates axial attention for MSA processing and offers options for SE(3) Transformers, E(n)-Transformers, or EGNNs for iterative coordinate refinement. The design allows for customization of attention types (sparse, linear, Kronecker), convolutional blocks, and atom representations, enabling exploration of different architectural choices for improved accuracy and efficiency.
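As a rough illustration of how these pieces fit together, the sketch below enables coordinate prediction with an equivariant structure module. This is a minimal sketch assuming the package's Alphafold2 class accepts predict_coords and structure_module_type keyword arguments as described in the README; exact argument names, accepted values, and output shapes may differ between versions.

```python
import torch
from alphafold2_pytorch import Alphafold2

# Minimal sketch (keyword names assumed from the README; verify against the repo):
# the trunk attends over the sequence and MSA, then an equivariant structure
# module ('se3', 'en', or 'egnn') iteratively refines 3D coordinates.
model = Alphafold2(
    dim = 256,
    depth = 2,
    heads = 8,
    dim_head = 64,
    predict_coords = True,          # return 3D coordinates instead of a distogram
    structure_module_type = 'se3'   # or 'en' / 'egnn' for the other refinement options
).cuda()

seq = torch.randint(0, 21, (1, 64)).cuda()      # tokenized primary sequence
msa = torch.randint(0, 21, (1, 5, 64)).cuda()   # tokenized MSA with 5 aligned rows
mask = torch.ones_like(seq).bool().cuda()
msa_mask = torch.ones_like(msa).bool().cuda()

coords = model(seq, msa, mask = mask, msa_mask = msa_mask)  # refined 3D coordinates
```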

Quick Start & Requirements

  • Install via pip: pip install alphafold2-pytorch
  • Requires PyTorch and CUDA.
  • Optional: NVIDIA Apex library for pre-trained embeddings (ESM, MSA Transformers, Protein Transformer).
  • See the minimal usage sketch below and the repository's usage examples for detailed code snippets.
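
A hedged minimal example of the default distogram path, adapted from the pattern shown in the repository's README; the tensor shapes and the size of the final distance-bin dimension are illustrative:

```python
import torch
from alphafold2_pytorch import Alphafold2

# Default mode: the model returns a distogram, i.e. a distribution over
# pairwise-distance bins for every residue pair.
model = Alphafold2(
    dim = 256,
    depth = 2,
    heads = 8,
    dim_head = 64
).cuda()

seq = torch.randint(0, 21, (1, 128)).cuda()     # tokenized primary sequence
msa = torch.randint(0, 21, (1, 5, 128)).cuda()  # tokenized MSA (5 aligned sequences)
mask = torch.ones_like(seq).bool().cuda()
msa_mask = torch.ones_like(msa).bool().cuda()

distogram = model(seq, msa, mask = mask, msa_mask = msa_mask)
# expected shape roughly (1, 128, 128, num_distance_bins)
```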

Highlighted Details

  • Supports prediction of distograms, angles, and 3D coordinates.
  • Integrates various attention mechanisms: sparse, linear, Kronecker, and memory-compressed attention (see the configuration sketch after this list).
  • Offers multiple structure module options for coordinate refinement: SE(3) Transformer, E(n)-Transformer, EGNN.
  • Allows customization of convolutional kernels, dilations, and block ordering.
  • Can incorporate pre-trained embeddings from ESM, MSA Transformers, or Protein Transformer.
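
The attention choices listed above are selected through constructor arguments. The sketch below is illustrative only: the attn_types keyword and its values are assumptions patterned on the README and should be checked against the repository before use.

```python
from alphafold2_pytorch import Alphafold2

# Illustrative configuration sketch: interleave full and sparse self-attention
# across the trunk layers. The 'attn_types' keyword and its accepted values are
# assumptions taken from the README pattern; verify before relying on them.
model = Alphafold2(
    dim = 256,
    depth = 4,
    heads = 8,
    dim_head = 64,
    attn_types = ('full', 'sparse', 'full', 'sparse')  # cycled through the depth
)
```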

Maintenance & Community

  • Developed by lucidrains; the Health Check below shows no recent activity.
  • Discussion channel: #alphafold on Discord.
  • The README references numerous research papers and datasets.

Licensing & Compatibility

  • MIT License.
  • Compatible with commercial use and closed-source linking.

Limitations & Caveats

  • This is an unofficial implementation and may not perfectly match the official AlphaFold2.
  • Sparse attention is currently only supported for self-attention, not cross-attention.
  • Using sparse attention requires DeepSpeed built with sparse attention support plus a compatible Triton version; installation may involve extra steps.

Health Check

  • Last commit: 2 years ago
  • Responsiveness: Inactive
  • Pull Requests (30d): 0
  • Issues (30d): 0
  • Star History: 18 stars in the last 90 days
