Fashion-domain CLIP model fine-tuned for industry applications
FashionCLIP is a specialized CLIP-like model fine-tuned for the fashion domain, offering enhanced zero-shot performance on fashion-related tasks like retrieval and classification. It targets researchers and practitioners in the fashion industry seeking more accurate and domain-specific multimodal understanding.
How It Works
FashionCLIP builds upon the CLIP architecture, fine-tuning it on over 700K fashion image-text pairs from the Farfetch dataset. This fine-tuning adapts the model to capture domain-specific fashion concepts, improving generalization and zero-shot performance over general-purpose CLIP models. The latest version, FashionCLIP 2.0, starts from the laion/CLIP-ViT-B-32-laion2B-s34B-b79K checkpoint, further boosting performance thanks to that checkpoint's larger training corpus.
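As an illustration of the zero-shot setting, the checkpoint published on the Hugging Face Hub can be loaded with the standard transformers CLIP classes. This is a minimal sketch: the candidate labels and the local image path "product.jpg" are made-up placeholders, not part of the project.

import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

# Load the fine-tuned FashionCLIP checkpoint from the Hugging Face Hub.
model = CLIPModel.from_pretrained("patrickjohncyh/fashion-clip")
processor = CLIPProcessor.from_pretrained("patrickjohncyh/fashion-clip")

# Hypothetical candidate labels and image path for zero-shot classification.
labels = ["a red dress", "a leather jacket", "a pair of sneakers"]
image = Image.open("product.jpg")

inputs = processor(text=labels, images=image, return_tensors="pt", padding=True)
with torch.no_grad():
    outputs = model(**inputs)

# Image-text similarity scores, softmaxed over the candidate labels.
probs = outputs.logits_per_image.softmax(dim=-1)
print(dict(zip(labels, probs.squeeze().tolist())))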
Quick Start & Requirements
Install the package from PyPI:
$ pip install fashion-clip
Or clone patrickjohncyh/fashion-clip and install in editable mode from the project root:
$ pip install -e .
Highlighted Details
Provides FCLIPDataset and FashionCLIP classes for data handling and model interaction; a minimal usage sketch follows.
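The sketch below embeds images and texts with the FashionCLIP class, assuming the encode_images/encode_text methods shown in the project README. The image paths and captions are placeholders; check the README for the exact input types expected (e.g. file paths vs. PIL images).

import numpy as np
from fashion_clip.fashion_clip import FashionCLIP

# Instantiate the model; "fashion-clip" selects the released checkpoint.
fclip = FashionCLIP("fashion-clip")

# Placeholder inputs: local image paths and matching captions.
images = ["images/dress.jpg", "images/jacket.jpg"]
texts = ["a red midi dress", "a black leather jacket"]

image_embeddings = fclip.encode_images(images, batch_size=2)
text_embeddings = fclip.encode_text(texts, batch_size=2)

# L2-normalize, then score image-text similarity via dot product.
image_embeddings = image_embeddings / np.linalg.norm(image_embeddings, axis=-1, keepdims=True)
text_embeddings = text_embeddings / np.linalg.norm(text_embeddings, axis=-1, keepdims=True)
print(image_embeddings @ text_embeddings.T)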
Maintenance & Community
Licensing & Compatibility
Limitations & Caveats