tsalib by ofnote

Tensor Shape Annotation Library (tsalib) enables named tensor dimensions

Created 7 years ago
267 stars

Top 95.9% on SourcePulse

View on GitHub
Project Summary

This library provides Tensor Shape Annotations (TSA) for popular numerical and deep learning libraries such as NumPy, TensorFlow, and PyTorch, enabling users to name tensor dimensions for improved code clarity and debugging. It targets researchers and engineers working with complex tensor manipulations, offering a more intuitive way to manage and verify tensor shapes.

How It Works

tsalib leverages Python's type annotations and a custom shorthand notation (TSN) to associate names with tensor dimensions. It allows users to declare dimension variables (e.g., B, C, H, W = dvs('Batch(b):32 ...')) and use these names directly in tensor creation and shape assertions. A key feature is the warp operator, which composes multiple shape transformations (like reshape, permute) into a single, readable line using TSN. This approach simplifies complex tensor operations and enhances code maintainability.
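The core idea can be illustrated with a small, self-contained sketch. Note that this is not tsalib's actual API: the DimVar class, dim_vars parser, and assert_shape helper below are hypothetical stand-ins showing how named dimension variables and symbolic shape checks fit together conceptually.

```python
# Illustrative sketch of named dimension variables and symbolic shape
# checks. DimVar, dim_vars, and assert_shape are hypothetical helpers,
# not tsalib's real API.

class DimVar:
    """A named dimension with a concrete default size."""
    def __init__(self, name, size):
        self.name, self.size = name, size
    def __repr__(self):
        return self.name

def dim_vars(spec):
    """Parse a TSN-like string such as 'Batch(b):32 Dim(d):64'."""
    out = []
    for token in spec.split():
        name_part, size = token.split(":")
        out.append(DimVar(name_part.split("(")[0], int(size)))
    return out

def assert_shape(shape, spec):
    """Check a concrete shape against a tuple of dimension variables."""
    expected = tuple(d.size for d in spec)
    assert tuple(shape) == expected, f"expected {spec}={expected}, got {shape}"

B, D = dim_vars("Batch(b):32 Dim(d):64")
x_shape = (32, 64)            # stand-in for a real tensor's .shape
assert_shape(x_shape, (B, D))  # passes; (64, 32) would raise
```

Because the check compares against the dimension variables rather than hard-coded integers, changing a declared size in one place keeps every assertion valid, which is the property the library's symbolic assertions provide.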

Quick Start & Requirements

  • Install: pip install tsalib
  • Dependencies: sympy
  • Python version: >= 3.5 for type annotations, tested with 3.6, 3.7.
  • Documentation: Working documentation notebook

Highlighted Details

  • Supports NumPy, TensorFlow, PyTorch, Keras, MXNet backends.
  • warp operator for composing multiple shape transformations inline.
  • join, alignto, reduce_dims, and dot operators for tensor manipulation.
  • Enables symbolic shape assertions that remain valid even if dimension sizes change.
  • Annotated BERT and OpenAI Transformer models available for demonstration.
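To make the warp idea concrete, here is a shape-level sketch of folding reshape and permute steps into a single call. The compose function and its op tuples are illustrative stand-ins, not tsalib's actual warp signature or TSN syntax.

```python
import math

def compose(shape, ops):
    """Fold a pipeline of shape transforms into one call, mimicking how a
    warp-style operator chains reshape/permute steps on a single readable
    line. Operates on shape tuples only (no tensor data)."""
    for kind, arg in ops:
        if kind == "reshape":
            # element counts must agree before and after a reshape
            assert math.prod(shape) == math.prod(arg)
            shape = tuple(arg)
        elif kind == "permute":
            shape = tuple(shape[i] for i in arg)
        else:
            raise ValueError(f"unknown op: {kind}")
    return shape

# Split attention heads: (B, T, D) -> (B, H, T, D // H)
B, T, D, H = 32, 128, 512, 8
out = compose((B, T, D), [
    ("reshape", (B, T, H, D // H)),
    ("permute", (0, 2, 1, 3)),
])
print(out)  # (32, 8, 128, 64)
```

The benefit of composing transformations this way is that the whole pipeline reads as one declarative spec instead of several imperative reshape/transpose calls scattered across lines.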

Maintenance & Community

  • Author: Nishant Sinha, OffNote Labs.
  • Contributions and feedback are welcome.
  • Last update mentioned: May 2020 (for update_dim_vars_len).

Licensing & Compatibility

  • License: Not explicitly stated in the README. (Requires clarification for commercial use).
  • Compatibility: Designed for progressive adoption with minimal code changes.

Limitations & Caveats

The library's last update was in May 2020, indicating potential staleness. While it mentions support for multiple backends, the warp operator is backend-dependent, and the README does not specify which backends are currently fully maintained or tested. The license is not clearly stated, which could be a barrier for commercial adoption.

Health Check

  • Last Commit: 5 years ago
  • Responsiveness: Inactive
  • Pull Requests (30d): 0
  • Issues (30d): 0
  • Star History: 0 stars in the last 30 days

Explore Similar Projects

Starred by Awni Hannun (Author of MLX; Research Scientist at Apple), Patrick Kidger (Core Contributor to JAX ecosystem), and 4 more.

einx by fferflo

0%
408
Tensor operation library using Einstein-inspired notation
Created 2 years ago
Updated 5 months ago
Starred by Andrej Karpathy (Founder of Eureka Labs; formerly at Tesla, OpenAI; author of CS 231n), Edward Sun (Research Scientist at Meta Superintelligence Lab), and 5 more.

attorch by BobMcDear

0.2%
576
PyTorch nn module subset, implemented in Python using Triton
Created 2 years ago
Updated 1 month ago
Starred by George Hotz (Author of tinygrad; Founder of the tiny corp, comma.ai), Patrick von Platen (Author of Hugging Face Diffusers; Research Engineer at Mistral), and 6 more.

jaxtyping by patrick-kidger

0.7%
2k
Typing library for array shapes/dtypes
Created 3 years ago
Updated 4 months ago
Starred by Nat Friedman (Former CEO of GitHub), Chip Huyen (Author of "AI Engineering", "Designing Machine Learning Systems"), and 15 more.

FasterTransformer by NVIDIA

0.1%
6k
Optimized transformer library for inference
Created 4 years ago
Updated 1 year ago