A library for style similarity search for art and design images.
- `analysis.ipynb`: a walkthrough of the EDA, development, and analysis of my similarity search method. Note that the notebook uses some dynamic inheritance modification and method declaration (monkey patching) to enhance its story-like flow; this is not included in the `analysts.py` class versions and would not be included in production code.
- `analysts.py`: classes used in analysis and EDA
- `style_stack.py`: primary similarity search classes for production
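For readers unfamiliar with the term, monkey patching just means attaching or replacing attributes on a class at runtime. A minimal generic sketch (the class and method names here are illustrative, not the notebook's actual code):

```python
class Analyst:
    """Illustrative stand-in for a notebook analysis class."""
    def __init__(self, data):
        self.data = data

# A function defined outside the class...
def summarize(self):
    return f"{len(self.data)} items"

# ...attached to the class at runtime: the monkey patch.
# Existing and future instances all pick up the new method.
Analyst.summarize = summarize

print(Analyst([1, 2, 3]).summarize())  # → "3 items"
```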
Installation requires conda. To install the packages, run one of the following, depending on whether you have a GPU:

```shell
# GPU version of faiss
conda install faiss-gpu -c pytorch
conda install --file requirements.txt
```

```shell
# CPU-only version of faiss
conda install faiss-cpu -c pytorch
conda install --file requirements.txt
```
Import `GramStack` and choose a model from `keras.applications`:

```python
from keras.applications.vgg16 import VGG16

from stylestack.gram_stack import GramStack
```
Set up arguments:

```python
image_dir = '../data/my_data'
model = VGG16(weights='imagenet', include_top=False)
layer_range = ('block1_conv1', 'block2_pool')
```
Build the `GramStack`:

```python
stack = GramStack.build(image_dir, model, layer_range)
```
Set the weighting for the embedding layers used in the similarity search. Any layers not specified are weighted as 0; alternatively, all layers can be used by specifying `None`:

```python
embedding_weights = {
    'block1_conv1': 1,
    'block3_conv2': 0.5,
    'block3_pool': 0.25
}
```
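To make the weighting concrete, here is a hypothetical sketch (not the library's internals) of how per-layer distances could be combined under these weights, with unspecified layers contributing 0:

```python
# Hypothetical per-layer distances for one candidate image (made-up numbers).
layer_distances = {
    'block1_conv1': 0.8,
    'block3_conv2': 0.4,
    'block3_pool': 0.6,
    'block5_conv1': 0.9,  # not in embedding_weights, so weighted as 0
}
embedding_weights = {'block1_conv1': 1, 'block3_conv2': 0.5, 'block3_pool': 0.25}

def combined_distance(distances, weights):
    # Layers absent from `weights` default to weight 0, i.e. they are ignored.
    return sum(weights.get(layer, 0) * d for layer, d in distances.items())

print(combined_distance(layer_distances, embedding_weights))
# 1 * 0.8 + 0.5 * 0.4 + 0.25 * 0.6 = 1.15
```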
Set the other arguments. Use `write_output` to write the results JSON to `/output/`:

```python
image_path = '../data/my_data/cat_painting.jpg'
n_results = 5
write_output = True
```
Query the `GramStack`:

```python
results = stack.query(image_path, embedding_weights, n_results, write_output)
```
Save the `GramStack` to disk:

```python
stack.save(lib_name='my_data')
```
Load a saved `GramStack`:

```python
GramStack.load(lib_name='my_data')
```
Once the `GramStack` is loaded, it can be queried and behaves the same as when it was built.