synalinks by SynaLinks

LM framework for neuro-symbolic systems and in-context RL

Created 11 months ago
400 stars

Top 72.3% on SourcePulse

View on GitHub
1 Expert Loves This Project
Project Summary

SynaLinks is a production-first framework for building, evaluating, training, and deploying neuro-symbolic language model (LM) applications. It targets professionals, researchers, and developers, using progressive disclosure of complexity to keep basic workflows simple while still supporting advanced system design. The framework improves prediction accuracy through in-context reinforcement learning and constrained structured output, without modifying model weights.

How It Works

SynaLinks adapts Keras 3 design principles to neuro-symbolic systems and in-context reinforcement learning. Users define LM applications as programs composed of modular components such as Input, Generator, and Sequential, built with a functional, subclassing, or mixed API. Structured data is handled through Pydantic-style data models, and any LM provider supported by LiteLLM (Ollama, OpenAI, Anthropic, Mistral, Groq) can be plugged in, as in the sketch below.
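
A minimal sketch of the functional-API pattern described above. The class names (DataModel, Field, Input, Generator, Program, LanguageModel) follow the project's Keras-inspired naming, but exact signatures may differ from the current synalinks release, and "ollama/mistral" assumes a locally running Ollama server.

```python
import asyncio
import synalinks


# Structured data models constrain the LM's input and output (Pydantic-style fields).
class Query(synalinks.DataModel):
    query: str = synalinks.Field(description="The user query")


class AnswerWithThinking(synalinks.DataModel):
    thinking: str = synalinks.Field(description="Step-by-step reasoning")
    answer: str = synalinks.Field(description="The final answer")


async def main():
    # Any LiteLLM-supported provider works; this model string assumes a local Ollama endpoint.
    language_model = synalinks.LanguageModel(model="ollama/mistral")

    # Functional API: wire modules together like Keras layers.
    inputs = synalinks.Input(data_model=Query)
    outputs = await synalinks.Generator(
        data_model=AnswerWithThinking,
        language_model=language_model,
    )(inputs)

    program = synalinks.Program(
        inputs=inputs,
        outputs=outputs,
        name="chain_of_thought",
        description="Answers a query with explicit step-by-step reasoning",
    )

    # Call semantics assumed to be async and Keras-like: pass an input data model, get the output model.
    result = await program(Query(query="What is the capital of France?"))
    print(result)


if __name__ == "__main__":
    asyncio.run(main())
```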

Quick Start & Requirements

  • Install: uv pip install synalinks
  • Initialize a project: uv run synalinks init
  • Prerequisites: Python, uv (recommended).
  • Documentation and code examples: linked from the project's GitHub repository.

Highlighted Details

  • Integrates multiple LM providers (Ollama, OpenAI, Anthropic, Mistral, Groq) via LiteLLM.
  • Supports in-context reinforcement learning for prompt optimization and constrained structured output for correctness.
  • Offers serialization to JSON for versioning with Git and FastAPI compatibility for REST API deployment.
  • Includes built-in metrics, rewards, and optimizers for evaluating and training programs (see the sketch after this list).
  • Provides visualization tools for program structure and training history.
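
A hedged sketch of the Keras-style training and serialization workflow, continuing from the program built in the earlier sketch. The specific reward (ExactMatch), optimizer (RandomFewShot), save method, and dataset variables are assumptions for illustration and may not match the current synalinks API exactly.

```python
# Continuing inside the async main() from the sketch above; x_train and y_train are
# placeholder (input, target) data model arrays, and the reward/optimizer class names
# below are assumptions in the style of the Keras API.
program.compile(
    reward=synalinks.rewards.ExactMatch(),           # scores predictions against ground truth
    optimizer=synalinks.optimizers.RandomFewShot(),  # in-context optimization: no weight updates
)

history = await program.fit(x_train, y_train, epochs=4)  # in-context RL over prompts/examples

program.save("program.json")  # JSON serialization, versionable with Git
```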

Maintenance & Community

  • Active development with a Beta release status.
  • Community support via Discord.
  • Inspired by Keras, DSPy, Pydantic, and LiteLLM.

Licensing & Compatibility

  • License: Apache-2.0.
  • Compatible with commercial use and closed-source linking.

Limitations & Caveats

The framework is in Beta, so breaking changes and undiscovered issues are possible. The core team aims to keep the library minimal, which may mean contributions of additional modules, metrics, or optimizers require core-team approval.

Health Check

  • Last Commit: 22 hours ago
  • Responsiveness: 1 day
  • Pull Requests (30d): 0
  • Issues (30d): 1

Star History

  • 13 stars in the last 30 days

Explore Similar Projects

Starred by Chip Huyen (author of "AI Engineering" and "Designing Machine Learning Systems"), Wing Lian (founder of Axolotl AI), and 3 more.

ROLL by alibaba

  • Top 2.3% on SourcePulse
  • 3k stars
  • RL library for large language models
  • Created 7 months ago; updated 21 hours ago