Perform multimodal image search and visualization using CLIP, ChromaDB, UMAP and Bokeh
In this blog post, I will show you how to perform image search over the Unsplash Lite dataset of 25,000 photos, using both text and image queries. To better understand the search results, I will then visualize the query and its matches by reducing their embeddings to two dimensions with UMAP and plotting them with Bokeh. By the end of this post, we will have generated a plot like the one below, showing both our input query image and the photos in the Unsplash Lite dataset that are closest to it in semantic meaning.