IntelliQ by answerlink

Multi-turn QA system with LLM and intent recognition

created 1 year ago
634 stars

Top 53.3% on sourcepulse

Project Summary

IntelliQ is an open-source multi-turn question-answering system designed for developers building conversational AI applications. It leverages Large Language Models (LLMs) with intent recognition and slot-filling capabilities to enable sophisticated dialogue management and direct integration with external APIs via a NL2API approach.

How It Works

The system processes user input by first identifying the user's intent and then extracting relevant entities (slots) from the conversation. This information is used to manage the dialogue flow across multiple turns. A key feature is its "interface slot" technology, which directly maps recognized slots to external API calls, allowing for real-time data retrieval and action execution within the conversational context. This approach aims to provide a robust framework for building function-calling conversational agents.
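The flow described above can be sketched in a few lines of Python. This is a minimal illustration, not IntelliQ's actual implementation: the class and function names (Intent, Dialogue, fake_weather_api) are hypothetical, and the real system delegates intent recognition and slot extraction to an LLM rather than receiving pre-extracted slots.

```python
# Minimal sketch of multi-turn slot filling with an "interface slot":
# once all required slots are filled, they map directly onto an API call.
# All names here are illustrative, not from the IntelliQ codebase.
from dataclasses import dataclass, field
from typing import Callable

def fake_weather_api(city: str, date: str) -> str:
    # Stand-in for the external API invoked via the interface slot.
    return f"Sunny in {city} on {date}"

@dataclass
class Intent:
    name: str
    required_slots: list          # slots the LLM must extract before calling out
    api: Callable[..., str]       # the external API the slots map onto

WEATHER = Intent("weather_query", ["city", "date"], fake_weather_api)

@dataclass
class Dialogue:
    intent: Intent
    slots: dict = field(default_factory=dict)

    def update(self, extracted: dict) -> str:
        """Merge slots from this turn; ask a follow-up or call the API."""
        self.slots.update(extracted)
        missing = [s for s in self.intent.required_slots if s not in self.slots]
        if missing:
            # Dialogue management: prompt the user for what is still missing.
            return f"Please provide: {', '.join(missing)}"
        # Interface slot: filled slots become keyword arguments to the API.
        return self.intent.api(**self.slots)

d = Dialogue(WEATHER)
print(d.update({"city": "Beijing"}))   # → Please provide: date
print(d.update({"date": "tomorrow"}))  # → Sunny in Beijing on tomorrow
```

Each turn either elicits the missing slots or, once the frame is complete, executes the mapped API call, which is the essence of the NL2API approach.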

Quick Start & Requirements

  • Install dependencies with pip install -r requirements.txt.
  • Requires Git and Python 3.
  • Configure config/init.py with your GPT_URL (proxy) and API_KEY.
  • Run with python app.py.
  • Demo available at demo/user_input.html, or at http://127.0.0.1:5000 once the server is running.
  • Detailed API documentation is available via a provided link.
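The configuration step above might look like the following. The variable names GPT_URL and API_KEY come from the README; the values are placeholders you must replace with your own endpoint and key, and the exact shape of config/init.py may differ in the repository.

```python
# config/init.py — illustrative sketch only; substitute your own values.
GPT_URL = "https://your-proxy.example.com/v1/chat/completions"  # LLM endpoint or proxy
API_KEY = "your-api-key-here"                                   # LLM provider key
```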

Highlighted Details

  • Advanced multi-turn dialogue management.
  • Custom intent recognition and slot filling.
  • Direct integration with external APIs (NL2API).
  • Adaptive learning for improved responses.
  • Recent updates include integration with Tongyi Qianwen and support for private Qwen models.

Maintenance & Community

The project welcomes community contributions following standard Git workflow (fork, branch, commit, PR). A CONTRIBUTING.md file provides further guidance.

Licensing & Compatibility

Licensed under the Apache License, Version 2.0. This license is permissive and generally compatible with commercial use and closed-source linking.

Limitations & Caveats

The project requires configuration of API keys and potentially proxy URLs for LLM access, indicating a dependency on external LLM services. Specific hardware requirements or performance benchmarks are not detailed in the README.

Health Check

  • Last commit: 2 weeks ago
  • Responsiveness: 1 day
  • Pull Requests (30d): 2
  • Issues (30d): 8

Star History

  • 62 stars in the last 90 days
