awesome-kan by mintisan

KAN resources for researchers/developers in the Kolmogorov-Arnold Network field

created 1 year ago
3,013 stars

Top 16.2% on sourcepulse

Project Summary

This repository is a curated collection of resources for Kolmogorov-Arnold Networks (KANs), a novel neural network architecture inspired by the Kolmogorov-Arnold representation theorem. It serves researchers and developers by organizing papers, libraries, projects, tutorials, and discussions, aiming to foster understanding and advancement in KANs as a promising alternative to traditional Multi-Layer Perceptrons (MLPs).

How It Works

KANs replace the fixed activation functions on MLP nodes with learnable activation functions on edges, parameterized as univariate splines. This edge-based learnable activation approach allows KANs to achieve higher accuracy with fewer parameters than MLPs, particularly in data fitting and PDE solving. They also offer improved interpretability through intuitive visualization and potential for human interaction, enabling them to act as collaborators in scientific discovery.
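The edge-based design above can be sketched in a few lines. This is an illustrative NumPy toy, not the pykan implementation: it uses a Gaussian RBF basis as a stand-in for the B-splines the paper uses, and all names (`KANLayer`, `rbf_basis`) are invented for this sketch. Each edge (i, j) carries its own coefficient vector, so the activation phi_ij is learnable per edge, and each output is the sum of its incoming edge activations.

```python
import numpy as np

def rbf_basis(x, centers, width=0.5):
    # Gaussian radial basis functions as a stand-in for a B-spline basis
    return np.exp(-((x[..., None] - centers) ** 2) / (2 * width**2))

class KANLayer:
    """Toy KAN-style layer: one learnable univariate function per edge."""
    def __init__(self, in_dim, out_dim, n_basis=8, rng=None):
        rng = rng or np.random.default_rng(0)
        self.centers = np.linspace(-1.0, 1.0, n_basis)  # shared basis grid
        # One coefficient vector per (input i, output j) edge
        self.coef = rng.normal(0, 0.1, size=(in_dim, out_dim, n_basis))

    def __call__(self, x):
        # x: (batch, in_dim) -> basis values: (batch, in_dim, n_basis)
        b = rbf_basis(x, self.centers)
        # Output j sums the edge activations phi_ij(x_i) over all inputs i
        return np.einsum("bik,ijk->bj", b, self.coef)

layer = KANLayer(2, 3)
y = layer(np.random.default_rng(1).uniform(-1, 1, size=(4, 2)))
print(y.shape)  # (4, 3)
```

Training would fit the per-edge coefficients `coef` by gradient descent; interpretability comes from being able to plot each learned univariate phi_ij directly.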

Quick Start & Requirements

  • Primary install / run command: Varies by library; pykan (the official implementation) is installable via pip. Most listed projects are PyTorch-based.
  • Prerequisites vary by implementation: the curated libraries span Python (PyTorch, JAX, TensorFlow), Julia, Mojo, C++, C#, and MATLAB; GPU-accelerated projects may additionally require CUDA.
  • Links:
    • Official KAN paper: https://arxiv.org/abs/2404.19756
    • pykan (official implementation): https://github.com/KindXiaoming/pykan
    • Tutorials and explanations are abundant throughout the README.
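The pip route mentioned above is a one-liner, assuming the PyPI package name `pykan`:

```shell
# Install the official KAN implementation from PyPI
pip install pykan
```

Individual projects in the list have their own install instructions; check each repository's README.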

Highlighted Details

  • The KAN paper claims faster neural scaling laws than MLPs, along with superior accuracy and interpretability.
  • Numerous variations exist, including ConvKANs, KAN Transformers, KANs with different basis functions (Fourier, Chebyshev, Wavelets), and applications in survival analysis, reinforcement learning, and computer vision.
  • Benchmarking efforts are underway to compare the efficiency of various KAN implementations.
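The basis-function variants above keep the edge-activation idea and swap the B-spline for another univariate expansion. A minimal sketch of a Fourier-basis activation, with illustrative names and hand-picked coefficients (not any library's API):

```python
import numpy as np

def fourier_activation(x, a, b):
    # Learnable univariate activation in a truncated Fourier basis:
    #   phi(x) = sum_k a_k * sin(k x) + b_k * cos(k x)
    k = np.arange(1, len(a) + 1)
    return np.sin(np.outer(x, k)) @ a + np.cos(np.outer(x, k)) @ b

x = np.linspace(-np.pi, np.pi, 5)
a = np.array([1.0, 0.0, 0.5])   # sine coefficients (would be learned)
b = np.zeros(3)                 # cosine coefficients (would be learned)
y = fourier_activation(x, a, b)
print(y.shape)  # (5,)
```

Chebyshev and wavelet variants follow the same pattern, differing only in the basis evaluated before the coefficient contraction.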

Maintenance & Community

  • The primary author of the KAN paper is Ziming Liu.
  • Community discussions are linked via Hacker News and Twitter.
  • Contributions are welcomed via pull requests.

Licensing & Compatibility

  • The work is licensed under Creative Commons Attribution 4.0 International License.
  • Compatibility for commercial use is generally open, but specific library licenses should be checked.

Limitations & Caveats

Some research suggests KANs may not universally outperform MLPs across all tasks (e.g., computer vision, NLP) and that their advantage in symbolic representation might stem from the B-spline activation. Criticisms also question the naming and the extent to which KANs truly "beat the curse of dimensionality." Scalability and computational efficiency remain active research areas.

Health Check

  • Last commit: 1 month ago
  • Responsiveness: 1 day
  • Pull Requests (30d): 4
  • Issues (30d): 0
  • Star History: 128 stars in the last 90 days
