Run OpenAI's CLIP and Apple's MobileCLIP models on iOS to search photos.
Updated Jan 4, 2025 - Swift
🔎 SimilaritySearchKit is a Swift package providing on-device text embeddings and semantic search functionality for iOS and macOS applications.
[NAACL 2022] Mobile text-to-image search powered by multimodal semantic representation models (e.g., OpenAI's CLIP)
Real-time on-device text-to-image and image-to-image semantic search with live camera capture, using the USearch & UForm AI Swift SDKs for Apple devices 🍏
CLIP-Finder enables semantic offline searches of images from gallery photos using natural language descriptions or the camera. Built on Apple's MobileCLIP-S0 architecture, it ensures optimal performance and accurate media retrieval.
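All of the projects above share the same core technique: embed the text query and the gallery images into a common vector space (e.g., with CLIP or MobileCLIP), then rank images by cosine similarity to the query vector. A minimal, library-free sketch of that ranking step is below; the toy 3-dimensional vectors and file names are invented for illustration and stand in for real CLIP embeddings:

```python
import math

def cosine_similarity(a, b):
    # Cosine of the angle between two vectors: dot product over norms.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def search(query_embedding, image_embeddings, top_k=2):
    # Rank stored image embeddings by similarity to the query embedding.
    scored = [(name, cosine_similarity(query_embedding, vec))
              for name, vec in image_embeddings.items()]
    scored.sort(key=lambda pair: pair[1], reverse=True)
    return [name for name, _ in scored[:top_k]]

# Toy embeddings standing in for real CLIP image vectors.
gallery = {
    "beach.jpg":  [0.9, 0.1, 0.0],
    "forest.jpg": [0.1, 0.9, 0.1],
    "city.jpg":   [0.0, 0.2, 0.9],
}
query = [0.8, 0.2, 0.1]  # pretend text embedding of "sunny beach"
print(search(query, gallery, top_k=1))  # → ['beach.jpg']
```

In the real apps, the embeddings come from an on-device encoder (such as a Core ML export of MobileCLIP), and the brute-force loop is replaced by an approximate nearest-neighbor index (e.g., USearch) once the gallery grows large.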