Run Apple's Mobile-Clip model on iOS to search photos.

Norod/Queryable-mobileclip
Queryable-MC [ML-MobileClip fork]

This is a fork of Queryable, an iOS app that leverages Apple's ml-mobileclip model to perform offline searches of the 'Photos' album. Unlike the category-based search built into the iOS Photos app, Queryable lets you search your album with natural-language statements, such as "a brown dog sitting on a bench". Because everything runs offline, your album's privacy is never exposed to any company, including Apple or Google.

How does it work?

  • Encode all album photos using the ml-mobileclip Image Encoder, compute image vectors, and save them.
  • For each new text query, compute the corresponding text vector using the Text Encoder.
  • Compare the similarity between this text vector and each image vector.
  • Rank and return the top K most similar results.
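The comparison and ranking steps above can be sketched in plain Swift. This is a minimal illustration, not the app's actual code: `cosineSimilarity` and `topK` are hypothetical helper names, and the vectors stand in for the encoders' outputs.

```swift
import Foundation

// Cosine similarity between two embedding vectors:
// dot(a, b) / (|a| * |b|). Assumes equal, non-zero length.
func cosineSimilarity(_ a: [Float], _ b: [Float]) -> Float {
    let dot = zip(a, b).map(*).reduce(0, +)
    let normA = a.map { $0 * $0 }.reduce(0, +).squareRoot()
    let normB = b.map { $0 * $0 }.reduce(0, +).squareRoot()
    return dot / (normA * normB)
}

// Score every stored image vector against the query's text vector,
// then return the indices of the top-K most similar photos.
func topK(query: [Float], imageVectors: [[Float]], k: Int) -> [Int] {
    return imageVectors.enumerated()
        .map { (index: $0.offset, score: cosineSimilarity(query, $0.element)) }
        .sorted { $0.score > $1.score }
        .prefix(k)
        .map { $0.index }
}
```

In the real app the image vectors are computed once per photo and cached, so each new query only costs one text-encoder pass plus this similarity scan.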

Run on Xcode

Download ImageEncoder_float32.mlmodelc and TextEncoder_float32.mlmodelc from Google Drive. Clone this repo, place the downloaded models under the CoreMLModels/ path, and build with Xcode; it should work out of the box.
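Once the two .mlmodelc bundles are in the Xcode target, loading them at runtime might look like the following sketch. This is an illustrative assumption, not the app's actual loader: `loadEncoders` is a hypothetical helper, and the guard around CoreML simply lets the snippet compile on platforms without it.

```swift
import Foundation
#if canImport(CoreML)
import CoreML

// Load the two compiled encoders from the app bundle. The resource
// names match the downloaded model files; loading throws if they
// were not added to the target.
func loadEncoders() throws -> (image: MLModel, text: MLModel) {
    guard let imageURL = Bundle.main.url(forResource: "ImageEncoder_float32", withExtension: "mlmodelc"),
          let textURL = Bundle.main.url(forResource: "TextEncoder_float32", withExtension: "mlmodelc") else {
        throw CocoaError(.fileNoSuchFile)
    }
    let config = MLModelConfiguration()
    config.computeUnits = .all  // allow Neural Engine / GPU where available
    let image = try MLModel(contentsOf: imageURL, configuration: config)
    let text = try MLModel(contentsOf: textURL, configuration: config)
    return (image, text)
}
#endif
```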

Core ML Export

If you only want to run Queryable, you can skip this step and use the exported models from Google Drive directly. If you wish to adapt Queryable to support your own native language, or to do model quantization/acceleration work, see the notebook I've prepared for converting Apple's S0 weights to Core ML.

Original Queryable License

MIT License

Copyright (c) 2023 Ke Fang

ML-MobileClip Port

By Doron Adler
