Keras layer for BERT, ALBERT, and adapter-BERT implementations
This repository provides a TensorFlow 2.0 Keras implementation of BERT, ALBERT, and adapter-BERT that can load the original pre-trained weights and produce activations numerically identical to the reference implementation. It is aimed at NLP researchers and engineers who want a flexible, efficient way to integrate these transformer models into Keras workflows.
How It Works
The implementation is built from scratch using basic TensorFlow operations, mirroring google-research/bert/modeling.py with simplifications, and leverages kpe/params-flow to reduce Keras boilerplate. Support for ALBERT and adapter-BERT comes through configuration parameters such as shared_layer and adapter_size, which enable parameter-efficient fine-tuning by adding small adapter layers over frozen BERT weights.
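For illustration, here is a minimal sketch of how those parameters might be set, based on the project's documented params API; the checkpoint path and adapter size below are placeholders:

```python
import bert

model_dir = ".models/uncased_L-12_H-768_A-12"  # placeholder: directory of a Google BERT checkpoint

# Read bert_config.json from the checkpoint and adjust the layer parameters.
bert_params = bert.params_from_pretrained_ckpt(model_dir)
bert_params.adapter_size = 64        # adapter-BERT: example bottleneck size (None disables adapters)
# bert_params.shared_layer = True    # ALBERT-style cross-layer weight sharing

l_bert = bert.BertModelLayer.from_params(bert_params, name="bert")

# After the surrounding Keras model is built (see Quick Start below), freeze
# everything except the adapter layers for parameter-efficient fine-tuning:
# l_bert.apply_adapter_freeze()
```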
Quick Start & Requirements
pip install bert-for-tf2
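As a quick, hedged sketch of end-to-end usage (the directory and checkpoint names are placeholders for a downloaded Google BERT checkpoint):

```python
import os
import bert
from tensorflow import keras

model_dir = ".models/uncased_L-12_H-768_A-12"  # placeholder checkpoint directory
max_seq_len = 128

# Build a BertModelLayer from the checkpoint's bert_config.json.
bert_params = bert.params_from_pretrained_ckpt(model_dir)
l_bert = bert.BertModelLayer.from_params(bert_params, name="bert")

# Use it like any other Keras layer.
l_input_ids = keras.layers.Input(shape=(max_seq_len,), dtype="int32")
output = l_bert(l_input_ids)                   # [batch_size, max_seq_len, hidden_size]
model = keras.Model(inputs=l_input_ids, outputs=output)
model.build(input_shape=(None, max_seq_len))

# Load the original pre-trained weights into the layer.
bert.load_stock_weights(l_bert, os.path.join(model_dir, "bert_model.ckpt"))
```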
Highlighted Details
Maintenance & Community
Licensing & Compatibility
Limitations & Caveats
The project's last significant update was in July 2020, so open issues may remain unaddressed and newer transformer architectures or TensorFlow releases may not be supported.