Android app for local image search via natural language
PicQuery enables users to search their local Android image gallery using natural language queries, powered by OpenAI's CLIP model. It offers a privacy-focused, offline solution for efficiently finding images based on descriptive text, benefiting users who manage large personal photo collections and prioritize data security.
How It Works
The application leverages OpenAI's CLIP model to encode both images and text queries into vector representations. These image vectors are stored locally in a database. During a search, the user's text query is encoded into a vector, which is then compared against the indexed image vectors to find the most similar matches. This approach allows for semantic understanding of image content, enabling searches like "kitty in the grass" rather than relying solely on filenames or metadata.
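As a minimal sketch of that comparison step (illustrative only, not the project's actual indexing code; all names are assumptions), the core of such a search reduces to ranking the stored image vectors by cosine similarity against the encoded query:

```kotlin
import kotlin.math.sqrt

// Illustrative sketch: rank locally stored image embeddings by cosine
// similarity against the embedded text query and return the best matches.
fun cosineSimilarity(a: FloatArray, b: FloatArray): Float {
    var dot = 0f
    var normA = 0f
    var normB = 0f
    for (i in a.indices) {
        dot += a[i] * b[i]
        normA += a[i] * a[i]
        normB += b[i] * b[i]
    }
    return dot / (sqrt(normA) * sqrt(normB))
}

// index maps an image path to its precomputed CLIP embedding.
fun search(queryVector: FloatArray, index: Map<String, FloatArray>, topK: Int = 10): List<String> =
    index.entries
        .sortedByDescending { cosineSimilarity(queryVector, it.value) }
        .take(topK)
        .map { it.key }
```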
Quick Start & Requirements
Building from source requires the CLIP model files (clip-image-int8.ort and clip-text-int8.ort, or vision_model.ort and text_model.ort) to be placed in the app/src/main/assets directory.
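As a hedged example of the loading step (assumed, not taken from the project's source; the asset name and function are illustrative), a bundled .ort model can be read from assets and opened with the ONNX Runtime Android API:

```kotlin
import android.content.Context
import ai.onnxruntime.OrtEnvironment
import ai.onnxruntime.OrtSession

// Assumed sketch: load a CLIP encoder bundled under app/src/main/assets
// into an ONNX Runtime inference session.
fun loadClipSession(context: Context, assetName: String = "clip-text-int8.ort"): OrtSession {
    val env = OrtEnvironment.getEnvironment()
    val modelBytes = context.assets.open(assetName).use { it.readBytes() }
    return env.createSession(modelBytes)
}
```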
Highlighted Details
Maintenance & Community
The project acknowledges contributions from @mazzzystar and @Young-Flash. A link to community discussions is provided in the repository.
Licensing & Compatibility
Limitations & Caveats
The FAQ notes a potential FileNotFoundException or InvocationTargetException if the model files are not correctly placed in the app/src/main/assets directory, so building from source involves this manual step.
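One hypothetical way to surface this failure earlier (not from the project; the helper below is an assumption) is a pre-flight check that the expected files actually exist in assets before any session is created:

```kotlin
import android.content.Context

// Hypothetical pre-flight check: fail with a clear message if the model
// files were never copied into app/src/main/assets.
fun assertModelsPresent(context: Context, names: List<String>) {
    val bundled = context.assets.list("")?.toSet() ?: emptySet()
    val missing = names.filterNot { it in bundled }
    require(missing.isEmpty()) { "Missing model files in assets: $missing" }
}
```

For example, calling assertModelsPresent(context, listOf("clip-image-int8.ort", "clip-text-int8.ort")) at startup would report missing files directly instead of a later reflection-wrapped exception.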