insight by abhimishra91

NLP as a service with GUI and backend for transformer models

created 5 years ago
306 stars

Top 88.7% on sourcepulse

Project Summary

Project Insight offers an "NLP as a Service" platform, enabling users to deploy and interact with various transformer models for common NLP tasks like classification, entity recognition, sentiment analysis, and summarization. It targets developers and researchers seeking a flexible, Python-based solution for integrating NLP capabilities into applications, with a user-friendly Streamlit GUI and a robust FastAPI backend.

How It Works

The project employs a microservices architecture for its backend, with each NLP task (e.g., classification, NER) running as an independent FastAPI service. Nginx acts as a reverse proxy, routing requests to the appropriate service. This design facilitates independent updates, maintenance, and scaling of individual NLP functionalities. The frontend, built with Streamlit, provides a GUI for model selection and inference, automatically reflecting new models added to the backend.
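
The project's service code is not reproduced here, but a per-task service in this style can be sketched as a small FastAPI app. The route path, request schema, and model choice below are illustrative assumptions, not the project's actual implementation.

```python
# Minimal sketch of one per-task FastAPI microservice (illustrative only;
# the real services load their models from directories inside src_fastapi).
from fastapi import FastAPI
from pydantic import BaseModel
from transformers import pipeline

app = FastAPI()

# Load a transformer model once at startup (hypothetical model choice).
classifier = pipeline(
    "text-classification",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

class InferenceRequest(BaseModel):
    text: str

@app.post("/api/v1/classification/predict")  # assumed route, mirroring the docs URL pattern
def predict(req: InferenceRequest):
    # Run inference and return the top label and score as JSON.
    result = classifier(req.text)[0]
    return {"label": result["label"], "score": result["score"]}
```

In this layout, Nginx routes /api/v1/classification/* to this container, while sibling containers expose the other tasks (NER, sentiment, summarization) behind their own prefixes.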

Quick Start & Requirements

  • Installation: Clone the repository. Start the backend services with sudo docker-compose up -d from the src_fastapi directory. For the frontend, build and run the Streamlit image from the src_streamlit directory with sudo docker build -t streamlit_app . and sudo docker run -d --name streamlit_app streamlit_app, followed by streamlit run NLPfile.py.
  • Prerequisites: Docker, Docker Compose. Models need to be downloaded separately and placed in specific directories within src_fastapi.
  • Documentation: Interactive FastAPI docs for each service are available at http://localhost:8080/api/v1/<task>/docs (e.g., http://localhost:8080/api/v1/classification/docs); see the example request after this list.
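
Once the backend is running, a request to one of the services might look like the sketch below. The /predict path and the JSON payload shape are assumptions based on the documented URL pattern, not confirmed by the README, so check the per-service /docs page for the actual contract.

```python
# Hypothetical client call: the /predict route and payload shape are assumed
# for illustration; consult http://localhost:8080/api/v1/classification/docs
# for the service's real schema.
import requests

resp = requests.post(
    "http://localhost:8080/api/v1/classification/predict",
    json={"text": "Project Insight makes transformer inference easy."},
    timeout=30,
)
resp.raise_for_status()
print(resp.json())
```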

Highlighted Details

  • Python-centric codebase using FastAPI and Streamlit.
  • Microservices architecture for backend NLP tasks, managed via Docker and Nginx.
  • Expandable design allowing easy addition of new transformer models and tasks.
  • Command-line inference via the FastAPI backend is supported.

Maintenance & Community

The project appears to be maintained by abhimishra91. No specific community channels or roadmap details are provided in the README.

Licensing & Compatibility

Licensed under GPL-3.0. This license is copyleft, meaning derivative works must also be open-sourced under the GPL-3.0. Commercial use or integration into closed-source projects may require careful consideration of licensing obligations.

Limitations & Caveats

The frontend was noted as a work in progress (WIP) at the time the README was written. Adding new models requires manual updates to configuration files and service code, so expanding the platform carries some manual overhead.

Health Check

  • Last commit: 2 years ago
  • Responsiveness: Inactive
  • Pull Requests (30d): 0
  • Issues (30d): 0

Star History

  • 0 stars in the last 90 days
