Research paper and code for encrypted traffic classification using transformers
Top 59.8% on SourcePulse
ET-BERT offers a novel approach to classifying encrypted network traffic by leveraging pre-trained transformer models. It aims to accurately identify traffic types by learning contextual relationships between datagrams within encrypted traffic, benefiting researchers and practitioners in network security and traffic analysis.
How It Works
ET-BERT utilizes a multi-layer attention mechanism to learn inter-datagram contextual and inter-traffic transport relationships from large-scale unlabeled traffic. This pre-training phase allows the model to capture nuanced patterns in encrypted data. Subsequently, it can be fine-tuned on smaller, labeled datasets for specific traffic classification tasks, offering a flexible and efficient method for identifying traffic types.
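To make the pre-train-then-fine-tune idea concrete, here is a minimal, self-contained PyTorch sketch of a BERT-style classifier over tokenized datagram sequences. The vocabulary size, model dimensions, sequence length, class count, and the randomly generated batches are placeholders for illustration only; they are not taken from the ET-BERT paper or repository.

import torch
import torch.nn as nn

class TinyTrafficClassifier(nn.Module):
    """Minimal stand-in for a BERT-style traffic classifier: token embeddings,
    a multi-layer transformer encoder, and a classification head read from the
    first ([CLS]-like) position."""
    def __init__(self, vocab_size=260, d_model=64, n_heads=4, n_layers=2,
                 n_classes=5, max_len=128):
        super().__init__()
        self.tok = nn.Embedding(vocab_size, d_model)   # datagram-token embeddings
        self.pos = nn.Embedding(max_len, d_model)      # learned position embeddings
        layer = nn.TransformerEncoderLayer(d_model, n_heads,
                                           dim_feedforward=128, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, n_classes)

    def forward(self, tokens):                          # tokens: (batch, seq_len)
        positions = torch.arange(tokens.size(1), device=tokens.device)
        x = self.tok(tokens) + self.pos(positions)      # (batch, seq_len, d_model)
        x = self.encoder(x)                             # contextualized token states
        return self.head(x[:, 0])                       # logits per traffic class

# Toy fine-tuning loop on random "datagram token" batches (placeholder data).
model = TinyTrafficClassifier()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()
for step in range(3):
    tokens = torch.randint(0, 260, (8, 128))  # 8 token sequences of length 128
    labels = torch.randint(0, 5, (8,))        # 8 traffic-class labels
    loss = loss_fn(model(tokens), labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

In the real project, the encoder weights would be initialized from the pre-training stage rather than at random, and the inputs would be token sequences derived from captured encrypted traffic.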
Quick Start & Requirements
The repository provides one script per stage of the workflow:
python3 pre-training/pretrain.py ...
python3 fine-tuning/run_classifier.py ...
python3 inference/run_classifier_infer.py ...
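The exact input format these scripts expect is defined by the repository's own data-processing code. Purely to illustrate the general idea of turning a datagram into a token sequence, the hypothetical helper below hex-encodes a payload into overlapping two-byte tokens and prints one labelled, TSV-style row; the function name, column names, and token scheme are assumptions, not the project's documented format.

import binascii

def payload_to_tokens(payload: bytes, max_tokens: int = 128) -> str:
    """Hypothetical helper: hex-encode a payload and emit overlapping
    two-byte (four hex character) tokens separated by spaces."""
    hex_str = binascii.hexlify(payload).decode()
    tokens = [hex_str[i:i + 4] for i in range(0, len(hex_str) - 2, 2)]
    return " ".join(tokens[:max_tokens])

# Example: turn one fake datagram into a labelled row for a classifier.
sample = bytes(range(16))
print("label\ttext_a")
print(f"0\t{payload_to_tokens(sample)}")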
Maintenance & Community
Last updated: 3 weeks ago. Activity status: inactive.
Licensing & Compatibility
The licensing status is not clearly defined, which could impact commercial use.

Limitations & Caveats
The project requires a specific environment (CUDA 11.4, an NVIDIA V100S GPU) and a complex set of dependencies, including optional ones such as Apex and TensorFlow, which may complicate setup.
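Given those requirements, a quick environment check along the following lines can confirm whether a CUDA-enabled PyTorch build and the optional extras are present before launching the scripts. This is a generic sketch, not a check shipped with or mandated by the repository.

import importlib.util
import torch

print("CUDA available:", torch.cuda.is_available())
print("CUDA build version:", torch.version.cuda)  # None for CPU-only builds
if torch.cuda.is_available():
    print("GPU:", torch.cuda.get_device_name(0))
for optional in ("apex", "tensorflow"):            # optional extras noted above
    found = importlib.util.find_spec(optional) is not None
    print(f"{optional}: {'installed' if found else 'not installed'}")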