Fast question answering for Node.js
This package provides fast, production-ready question answering capabilities directly within Node.js applications. It targets Node.js developers seeking to integrate NLP features without relying on external Python services, offering a streamlined API and compatibility with various Hugging Face models.
How It Works
The library leverages the Rust-based 🤗Tokenizers library for efficient text processing and TensorFlow.js for running pre-trained models locally. It defaults to DistilBERT-cased fine-tuned on SQuAD v1.1, achieving an 87.1 F1 score. The architecture supports both SavedModel and TFJS model formats, allowing local execution or integration with TensorFlow Serving for remote model inference.
Quick Start & Requirements
npm install question-answering@latest
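After installing, the package's documented `QAClient` API can be used to answer a question against a passage. This is a minimal sketch; the passage and question are illustrative, and the first run downloads the default model:

```javascript
// Minimal sketch using the package's QAClient API.
// Assumes `npm install question-answering` has completed and the
// default DistilBERT model can be downloaded on first run.
const { QAClient } = require("question-answering");

const text = `
  Super Bowl 50 was an American football game to determine the champion of
  the National Football League (NFL) for the 2015 season. The Denver Broncos
  defeated the Carolina Panthers 24-10 to earn their third Super Bowl title.
`;
const question = "Who won Super Bowl 50?";

async function main() {
  // Loads the default model (DistilBERT-cased fine-tuned on SQuAD v1.1).
  const qaClient = await QAClient.fromOptions();
  const answer = await qaClient.predict(question, text);
  console.log(answer); // an object with the answer text and a confidence score
}

main();
```

Because everything runs in-process (tokenization via the native 🤗Tokenizers binding, inference via TensorFlow.js), no external Python service is needed.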
Highlighted Details
Maintenance & Community
Licensing & Compatibility
Limitations & Caveats
A known TypeScript incompatibility with TFJS requires passing the `--skipLibCheck` compiler flag or setting `skipLibCheck: true` in `tsconfig.json`. Direct use of `@tensorflow/tfjs-node` with SavedModel might require careful initialization order to avoid backend conflicts.
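For TypeScript projects, the workaround is a single compiler option. A minimal `tsconfig.json` fragment might look like this:

```json
{
  "compilerOptions": {
    "skipLibCheck": true
  }
}
```

`skipLibCheck` skips type checking of all declaration (`.d.ts`) files, which sidesteps the conflicting TFJS type definitions without affecting checks on your own source files.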