full-stack-fastapi-nextjs-llm-template by vstorm-co

Full-stack AI/LLM application generator

Created 3 weeks ago

463 stars

Top 65.4% on SourcePulse

Project Summary

This project provides a production-ready, full-stack template generator for AI/LLM applications, streamlining the development of complex systems. It targets developers building AI chatbots, ML applications, enterprise SaaS products, and startups; by abstracting away boilerplate infrastructure, it lets teams focus on core AI product development and ship faster.

How It Works

The template generates projects using FastAPI for the backend and Next.js for the frontend, integrating PydanticAI or LangChain for AI agent capabilities. It emphasizes real-time interaction via WebSocket streaming for LLM responses and includes robust features like conversation persistence, type-safe agents, and multi-provider LLM support. The backend employs a layered Repository + Service architectural pattern for maintainability, while observability is handled by Logfire for PydanticAI or LangSmith for LangChain, providing deep insights into application performance and AI interactions.
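The Repository + Service layering mentioned above can be sketched in plain Python. This is a minimal illustration only; the class and method names below are hypothetical and not taken from the template itself:

```python
from dataclasses import dataclass


@dataclass
class Message:
    role: str      # "user" or "assistant"
    content: str


# --- Repository layer: owns data access, knows nothing about business rules ---
class ConversationRepository:
    """In-memory stand-in for the async database repositories the template generates."""

    def __init__(self) -> None:
        self._store: dict[str, list[Message]] = {}

    def add_message(self, conversation_id: str, message: Message) -> None:
        self._store.setdefault(conversation_id, []).append(message)

    def get_history(self, conversation_id: str) -> list[Message]:
        return list(self._store.get(conversation_id, []))


# --- Service layer: business logic, depends only on the repository abstraction ---
class ChatService:
    def __init__(self, repo: ConversationRepository) -> None:
        self.repo = repo

    def handle_user_message(self, conversation_id: str, text: str) -> str:
        self.repo.add_message(conversation_id, Message("user", text))
        # A real service would call the PydanticAI/LangChain agent here;
        # we return a canned echo to keep the sketch self-contained.
        reply = f"echo: {text}"
        self.repo.add_message(conversation_id, Message("assistant", reply))
        return reply


repo = ConversationRepository()
service = ChatService(repo)
service.handle_user_message("conv-1", "hello")
```

The point of the split is testability: the service can be exercised against an in-memory repository, while the generated project swaps in a database-backed one behind the same interface.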

Quick Start & Requirements

Installation is straightforward via pip (pip install fastapi-fullstack), uv (uv tool install fastapi-fullstack), or pipx (pipx install fastapi-fullstack). Project generation is initiated interactively with fastapi-fullstack new or via command-line options like fastapi-fullstack create my_ai_app --preset production. Development setup involves make install for dependencies and make docker-up for containerized services. Prerequisites include Python, Docker, and make (which may require installation on Windows via Chocolatey or WSL).
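Collected from the paragraph above, a typical first session looks like this (command names exactly as quoted in the README; no flags beyond those are assumed):

```shell
# Install the generator (pick one)
pip install fastapi-fullstack
# uv tool install fastapi-fullstack
# pipx install fastapi-fullstack

# Generate a project interactively...
fastapi-fullstack new

# ...or non-interactively with a preset
fastapi-fullstack create my_ai_app --preset production

# Inside the generated project: install dependencies, start containerized services
cd my_ai_app
make install
make docker-up
```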

Highlighted Details

  • AI/LLM First: Choice of PydanticAI or LangChain, WebSocket streaming, conversation persistence, and multi-provider LLM support (OpenAI, Anthropic, OpenRouter).
  • Backend (FastAPI): Pydantic v2, multiple async databases (PostgreSQL, MongoDB, SQLite), JWT/API Key/OAuth2 authentication, background task support (Celery, Taskiq, ARQ), and a Django-style CLI.
  • Frontend (Next.js 15): Built with React 19, TypeScript, and Tailwind CSS v4, featuring an AI chat interface with WebSocket streaming and robust authentication.
  • Observability: Integrated Logfire for PydanticAI (tracing, metrics, logs) and LangSmith for LangChain (traces, feedback, datasets).
  • Enterprise Integrations: Over 20 integrations covering caching, security, admin panels, webhooks, and DevOps tooling.
  • CLI Tool: A powerful, auto-discovering CLI for project management, database migrations, and custom commands.
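The WebSocket-streaming idea in the first bullet can be illustrated with a toy async generator. This is plain asyncio with no FastAPI dependency, and the function names are hypothetical:

```python
import asyncio
from collections.abc import AsyncIterator


async def stream_tokens(reply: str) -> AsyncIterator[str]:
    """Yield a reply word by word, mimicking how an LLM response is
    streamed to the frontend over a WebSocket one chunk at a time."""
    for word in reply.split():
        await asyncio.sleep(0)   # stand-in for awaiting the next model chunk
        yield word + " "


async def collect(reply: str) -> list[str]:
    # In a generated app, each chunk would be pushed with websocket.send_text();
    # here we collect the chunks so the sketch runs anywhere.
    return [chunk async for chunk in stream_tokens(reply)]


chunks = asyncio.run(collect("streamed LLM responses arrive incrementally"))
```

The frontend's chat interface consumes the same stream on the other end of the socket, appending each chunk to the message as it arrives.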

Maintenance & Community

No specific details regarding maintainers, community channels (like Discord/Slack), or sponsorship were found in the provided README content.

Licensing & Compatibility

The project is released under the MIT License, which is permissive and generally suitable for commercial use and integration into closed-source projects.

Limitations & Caveats

The make command, used extensively in the development workflow, requires GNU Make, which is not natively available on Windows and needs additional setup (e.g., WSL or Chocolatey). The README also specifies Next.js 15 and React 19, both very recent releases, so some third-party packages in the ecosystem may not yet fully support them.

Health Check

  • Last Commit: 5 days ago
  • Responsiveness: Inactive
  • Pull Requests (30d): 18
  • Issues (30d): 16
  • Star History: 467 stars in the last 23 days
