node-question-answering by huggingface

Fast question answering for Node.js

Created 5 years ago · 466 stars · Top 66.0% on sourcepulse

Project Summary

This package provides fast, production-ready question answering capabilities directly within Node.js applications. It targets Node.js developers seeking to integrate NLP features without relying on external Python services, offering a streamlined API and compatibility with various Hugging Face models.

How It Works

The library pairs the Rust-based 🤗Tokenizers for fast text processing with TensorFlow.js for running pre-trained models locally. By default it uses a cased DistilBERT model fine-tuned on SQuAD, which reaches an 87.1 F1 score on the SQuAD v1.1 dev set. Models can be loaded in either SavedModel or TFJS format and executed locally, or served remotely through TensorFlow Serving.
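
For the remote path, here is a minimal sketch, assuming the initModel and RuntimeType exports behave as documented upstream (the model name and serving URL are placeholders, so verify both against the repo):

    import { initModel, QAClient, RuntimeType } from "question-answering";

    async function main() {
      // Point the client at a model already running under TensorFlow
      // Serving instead of loading a SavedModel in-process.
      const model = await initModel({
        name: "distilbert-cased",
        path: "http://localhost:8501/v1/models/distilbert", // TF Serving REST endpoint
        runtime: RuntimeType.Remote,
      });

      const qaClient = await QAClient.fromOptions({ model });
      console.log("remote client ready");
    }

    main();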

Quick Start & Requirements
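
Install with npm install question-answering@latest (requires Node.js; TypeScript users should also see Limitations & Caveats below). The following is a minimal usage sketch based on the package's documented QAClient API — the passage and question are illustrative, and the default model should be fetched automatically on first use:

    import { QAClient } from "question-answering";

    const context = `
      Super Bowl 50 was an American football game to determine the champion
      of the National Football League (NFL) for the 2015 season. The Denver
      Broncos defeated the Carolina Panthers 24-10 to earn their third title.
    `;

    async function main() {
      // Builds a client backed by the default cased DistilBERT model.
      const qaClient = await QAClient.fromOptions();
      const answer = await qaClient.predict("Who won Super Bowl 50?", context);
      console.log(answer); // e.g. { text: "Denver Broncos", score: ... }
    }

    main();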

Highlighted Details

  • Supports DistilBERT, BERT, and RoBERTa based models in SavedModel and TFJS formats.
  • Integrates with TensorFlow Serving for remote model deployment.
  • CLI available for downloading and managing models (see the sketch after this list).
  • Performance comparable to TensorFlow in Python due to native SavedModel execution in TFJS.
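
To use a non-default model, the README describes downloading it first with the bundled CLI and then loading it by name. A sketch under those assumptions (the model name is illustrative; verify the exact CLI invocation and flags against the repo):

    // Fetch the model first with the bundled CLI (run in a shell); by
    // default it is saved under a .models directory at the project root:
    //   npx question-answering download distilbert-base-uncased-distilled-squad

    import { initModel, QAClient } from "question-answering";

    async function main() {
      // Load the locally downloaded model and build a client from it.
      const model = await initModel({ name: "distilbert-base-uncased-distilled-squad" });
      const qaClient = await QAClient.fromOptions({ model });
      console.log("client ready");
    }

    main();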

Maintenance & Community

  • Maintained by Hugging Face.
  • Community support via Hugging Face Discord/Slack.

Licensing & Compatibility

  • MIT License. Permissive for commercial use and integration with closed-source projects.

Limitations & Caveats

A known TypeScript incompatibility with TFJS requires the --skipLibCheck flag or skipLibCheck: true in tsconfig.json. Direct use of @tensorflow/tfjs-node with SavedModel might require careful initialization order to avoid backend conflicts.
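
Concretely, the TypeScript workaround is a one-line compiler option in tsconfig.json:

    {
      "compilerOptions": {
        "skipLibCheck": true
      }
    }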

Health Check
  • Last commit: 2 years ago
  • Responsiveness: 1+ week
  • Pull Requests (30d): 0
  • Issues (30d): 0
  • Star History: 1 star in the last 90 days
