Add NNCF quantization for CLIP notebook (#1173)
* Add NNCF quantization for CLIP notebook

* Refactoring

* Remove image from notebook

* Add reduceSum to ignored scope

* Move visualize_result to visualize.py

* Fix comments

* Fix message

* Replace dataset

* Add custom nncf version

* Renamed notebooks

* Fix names

* Update notebooks/228-clip-zero-shot-image-classification/README.md

Co-authored-by: Ekaterina Aidova <ekaterina.aidova@intel.com>

* Update notebooks/228-clip-zero-shot-image-classification/README.md

Co-authored-by: Ekaterina Aidova <ekaterina.aidova@intel.com>

* make spell checker happy

* Apply suggestions from code review

---------

Co-authored-by: Ekaterina Aidova <ekaterina.aidova@intel.com>
l-bat and eaidova authored Jul 14, 2023
1 parent d0ec891 commit e68d657
Showing 4 changed files with 484 additions and 44 deletions.
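
The commit message above describes quantizing the CLIP IR with NNCF and putting `ReduceSum` into the ignored scope, but the diff rendered below only covers the zero-shot classification notebook. As a rough, hedged sketch of what that post-training quantization step typically looks like with NNCF's API (the IR paths, label set, and calibration image are illustrative assumptions, not taken from this commit):

```python
# Illustrative sketch of NNCF post-training quantization with an ignored scope.
# File names, label set, and the calibration image are assumptions, not from this commit.
import nncf
from openvino.runtime import Core, serialize
from transformers import CLIPProcessor
from PIL import Image

core = Core()
fp16_model = core.read_model("clip-vit-base-patch16.xml")  # hypothetical FP16 IR path

processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch16")
labels = ["cat", "dog", "horse"]
texts = [f"This is a photo of a {label}" for label in labels]

def transform_fn(image_path):
    # Build the same kind of inputs the classification notebook feeds to the IR.
    image = Image.open(image_path)
    inputs = processor(text=texts, images=[image], return_tensors="np", padding=True)
    return dict(inputs)

calibration_dataset = nncf.Dataset(["../data/image/coco.jpg"], transform_fn)
quantized_model = nncf.quantize(
    fp16_model,
    calibration_dataset,
    ignored_scope=nncf.IgnoredScope(types=["ReduceSum"]),  # mirrors "Add reduceSum to ignored scope"
)
serialize(quantized_model, "clip-vit-base-patch16_int8.xml")
```

Excluding `ReduceSum` here is expressed via `nncf.IgnoredScope(types=[...])`; the actual notebook may pin specific node names instead, but either way the intent is to keep precision-sensitive reduction ops out of INT8.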
@@ -58,46 +58,6 @@
"processor = CLIPProcessor.from_pretrained(\"openai/clip-vit-base-patch16\")"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from typing import List\n",
"import matplotlib.pyplot as plt\n",
"import numpy as np\n",
"from PIL import Image\n",
"\n",
"\n",
"def visualize_result(image:Image, labels:List[str], probs:np.ndarray, top:int = 5):\n",
" \"\"\" \n",
" Utility function for visualization classification results\n",
" params:\n",
" image: input image\n",
" labels: list of classification labels\n",
" probs: model predicted softmaxed probabilities for each label\n",
" top: number of the highest probability results for visualization\n",
" returns:\n",
" None\n",
" \"\"\"\n",
" plt.figure(figsize=(64, 64))\n",
" top_labels = np.argsort(-probs)[:min(top, probs.shape[0])]\n",
" top_probs = probs[top_labels]\n",
" plt.subplot(8, 8, 1)\n",
" plt.imshow(image)\n",
" plt.axis(\"off\")\n",
"\n",
" plt.subplot(8, 8, 2)\n",
" y = np.arange(top_probs.shape[-1])\n",
" plt.grid()\n",
" plt.barh(y, top_probs)\n",
" plt.gca().invert_yaxis()\n",
" plt.gca().set_axisbelow(True)\n",
" plt.yticks(y, [labels[index] for index in top_labels])\n",
" plt.xlabel(\"probability\") "
]
},
{
"attachments": {},
"cell_type": "markdown",
@@ -115,6 +75,9 @@
"metadata": {},
"outputs": [],
"source": [
"from PIL import Image\n",
"from visualize import visualize_result\n",
"\n",
"image = Image.open('../data/image/coco.jpg')\n",
"input_labels = ['cat', 'dog', 'wolf', 'tiger', 'man', 'horse', 'frog', 'tree', 'house', 'computer']\n",
"text_descriptions = [f\"This is a photo of a {label}\" for label in input_labels]\n",
@@ -197,7 +160,6 @@
"metadata": {},
"outputs": [],
"source": [
"import numpy as np\n",
"from scipy.special import softmax\n",
"from openvino.runtime import Core\n",
"\n",
@@ -290,6 +252,16 @@
"# visualize prediction\n",
"visualize_result(image, labels, probs[0])"
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"## Next Steps\n",
"\n",
"Open the [228-clip-zero-shot-quantize](228-clip-zero-shot-quantize.ipynb) notebook to quantize the IR model with the Post-training Quantization API of NNCF and compare `FP16` and `INT8` models."
]
}
],
"metadata": {
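
The new "Next Steps" cell points readers to the 228-clip-zero-shot-quantize notebook to compare the `FP16` and `INT8` models. A minimal, hedged sketch of such a comparison, assuming both IR files exist next to the notebook and rebuilding the same kind of processor inputs as the classification cell, could look like:

```python
# Hedged comparison sketch: run identical inputs through the FP16 and INT8 IRs
# and check how far the softmaxed label probabilities drift. Paths are assumptions.
import numpy as np
from PIL import Image
from scipy.special import softmax
from transformers import CLIPProcessor
from openvino.runtime import Core

processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch16")
labels = ["cat", "dog", "horse"]
texts = [f"This is a photo of a {label}" for label in labels]
image = Image.open("../data/image/coco.jpg")
inputs = dict(processor(text=texts, images=[image], return_tensors="np", padding=True))

core = Core()
fp16_compiled = core.compile_model("clip-vit-base-patch16.xml", "CPU")       # assumed FP16 IR
int8_compiled = core.compile_model("clip-vit-base-patch16_int8.xml", "CPU")  # assumed INT8 IR

def predict(compiled_model):
    # logits_per_image is assumed to be the first model output; softmax it over the labels.
    logits = compiled_model(inputs)[compiled_model.output(0)]
    return softmax(logits, axis=1)

fp16_probs, int8_probs = predict(fp16_compiled), predict(int8_compiled)
print("max probability drift:", np.abs(fp16_probs - int8_probs).max())
```

The follow-up notebook presumably also visualizes both predictions with `visualize_result`; the drift metric above is only a quick sanity check.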
