Updated onnx for all projects (#157)
adrianboguszewski authored Jan 9, 2025
Parent: 5f63d61 · Commit: 0c2a918
Showing 12 changed files with 23 additions and 113 deletions.
ai_ref_kits/agentic_llm_rag/requirements.txt (3 changes: 2 additions & 1 deletion)
@@ -13,7 +13,8 @@ llama-index-vector-stores-faiss==0.3.0
 faiss-cpu==1.9.0
 
 # onnx>1.16.1 doesn't work on windows
-onnx==1.16.1
+onnx==1.16.1; platform_system == "Windows"
+onnx==1.17.0; platform_system != "Windows"
 onnxruntime==1.17.3
 torch==2.5.1

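The split pin relies on standard environment markers (PEP 508): pip evaluates the condition after the semicolon at install time and keeps only the matching requirement, so Windows stays on 1.16.1 while other platforms get 1.17.0. Below is a minimal sketch of how such a marker resolves, using the `packaging` library; it is an illustration only and not part of the changed files.

```python
# Sketch: how pip-style environment markers such as
#   onnx==1.16.1; platform_system == "Windows"
# are evaluated against the current machine.
import platform

from packaging.markers import Marker

requirements = [
    ('onnx==1.16.1', Marker('platform_system == "Windows"')),
    ('onnx==1.17.0', Marker('platform_system != "Windows"')),
]

print(f"platform_system on this machine: {platform.system()}")
for spec, marker in requirements:
    # Marker.evaluate() checks the condition against the current environment,
    # so exactly one of the two onnx pins is selected per platform.
    print(f"{spec}: {'install' if marker.evaluate() else 'skip'}")
```

The same marker pattern repeats in every requirements file touched by this commit.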
ai_ref_kits/automated_self_checkout/requirements.txt (4 changes: 3 additions & 1 deletion)
@@ -1,7 +1,9 @@
 openvino-dev==2023.0.0
 nncf==2.5.0
 ultralytics==8.0.117
-onnx==1.16.0
+# onnx>1.16.1 doesn't work on windows
+onnx==1.16.1; platform_system == "Windows"
+onnx==1.17.0; platform_system != "Windows"
 supervision==0.16.0
 jupyterlab==4.2.5
 pycocotools==2.0.6
ai_ref_kits/conversational_ai_chatbot/README.md (8 changes: 4 additions & 4 deletions)
@@ -189,10 +189,10 @@ For the python script, you must include the following model directory arguments.
 
 - `--reranker_model path/to/reranker_model`: The path to your reranker model directory (for example, `model/bge-reranker-large-FP32`). This model reranks responses to ensure relevance and accuracy.
 
-- `--personality path/to/personality.yaml`: The path to your custom personality YAML file (for example, `concierge_personality.yaml`).
+- `--personality path/to/personality.yaml`: The path to your custom personality YAML file (for example, `config/concierge_personality.yaml`).
 This file defines the assistant's personality, including instructions, system configuration, and greeting prompts. You can create and specify your own custom personality file.
 
-- `--example_pdf path/to/personality.yaml`: The path to your custom PDF file which is an additional context (for example, `Grand_Azure_Resort_Spa_Full_Guide.pdf`).
+- `--example_pdf path/to/personality.yaml`: The path to your custom PDF file which is an additional context (for example, `data/Grand_Azure_Resort_Spa_Full_Guide.pdf`).
 This file defines the knowledge of the resort in this concierge use case. You can use your own custom file to build a local knowledge base.
 
 - `--public`: Include this flag to make the Gradio interface publicly accessible over the network. Without this flag, the interface is only available on your local machine.
@@ -204,8 +204,8 @@ python app.py \
   --chat_model path/to/chat_model \
   --embedding_model path/to/embedding_model \
   --reranker_model path/to/reranker_model \
-  --personality concierge_personality.yaml \
-  --example_pdf Grand_Azure_Resort_Spa_Full_Guide.pdf \
+  --personality config/concierge_personality.yaml \
+  --example_pdf data/Grand_Azure_Resort_Spa_Full_Guide.pdf \
   --public
 ```

ai_ref_kits/conversational_ai_chatbot/app.py (4 changes: 2 additions & 2 deletions)
@@ -465,8 +465,8 @@ def run(asr_model_dir: Path, chat_model_dir: Path, embedding_model_dir: Path, re
 parser.add_argument("--chat_model", type=str, default="model/llama3.1-8B-INT4", help="Path to the chat model directory")
 parser.add_argument("--embedding_model", type=str, default="model/bge-small-FP32", help="Path to the embedding model directory")
 parser.add_argument("--reranker_model", type=str, default="model/bge-reranker-large-FP32", help="Path to the reranker model directory")
-parser.add_argument("--personality", type=str, default="concierge_personality.yaml", help="Path to the YAML file with chatbot personality")
-parser.add_argument("--example_pdf", type=str, default="Grand_Azure_Resort_Spa_Full_Guide.pdf", help="Path to the PDF file which is an additional context")
+parser.add_argument("--personality", type=str, default="config/concierge_personality.yaml", help="Path to the YAML file with chatbot personality")
+parser.add_argument("--example_pdf", type=str, default="data/Grand_Azure_Resort_Spa_Full_Guide.pdf", help="Path to the PDF file which is an additional context")
 parser.add_argument("--public", default=False, action="store_true", help="Whether interface should be available publicly")
 
 args = parser.parse_args()
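Note that the new argparse defaults are relative paths. Unless app.py resolves them against its own location, Python interprets them relative to the working directory, so the script should be launched from the kit's root folder. A small illustration of that behaviour follows; it is a hypothetical snippet, not code from the repository.

```python
# Illustration: a relative default such as "config/concierge_personality.yaml"
# resolves against the current working directory, not the script's location.
from pathlib import Path

default_personality = Path("config/concierge_personality.yaml")
print(default_personality.resolve())  # absolute path depends on where Python was launched
print(default_personality.exists())   # True only when run from the kit's root folder
```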
ai_ref_kits/conversational_ai_chatbot/requirements.txt (3 changes: 2 additions & 1 deletion)
@@ -14,7 +14,8 @@ langchain-text-splitters==0.3.4
 faiss-cpu==1.9.0
 
 # onnx>1.16.1 doesn't work on windows
-onnx==1.16.1
+onnx==1.16.1; platform_system == "Windows"
+onnx==1.17.0; platform_system != "Windows"
 onnxruntime==1.17.3
 torch==2.5.1
 torchaudio==2.5.1
ai_ref_kits/multimodal_ai_visual_generator/README.md (2 changes: 1 addition & 1 deletion)
@@ -84,7 +84,7 @@ Next, you’ll download and optimize the required models. This will involve the
 ```shell
 python3 -m venv model_installation_venv
 source model_installation_venv/bin/activate
-pip install -r python3.12_requirements_model_installation.txt
+pip install -r requirements.txt
 python3 download_and_prepare_models.py
 ```
 After model installation, you can remove the `model_installation_venv` virtual environment as it is no longer needed.

This file was deleted.

@@ -11,7 +11,9 @@ openai-whisper==20240930
 # deep learning frameworks
 tensorflow==2.17.0
 
-onnx==1.16.1
+# onnx>1.16.1 doesn't work on windows
+onnx==1.16.1; platform_system == "Windows"
+onnx==1.17.0; platform_system != "Windows"
 --extra-index-url https://download.pytorch.org/whl/cpu
 torch==2.4.1
 torchvision==0.19.1
notebooks/onnxruntime_lcm/requirements.txt (4 changes: 3 additions & 1 deletion)
@@ -4,7 +4,9 @@ jupyterlab==4.2.5
 ipywidgets==8.1.5
 
 openvino==2024.4.0
-onnx==1.16.1
+# onnx>1.16.1 doesn't work on windows
+onnx==1.16.1; platform_system == "Windows"
+onnx==1.17.0; platform_system != "Windows"
 onnxruntime-openvino==1.20.0
 optimum==1.22.0
 accelerate==0.33.0
notebooks/onnxruntime_yolov8/requirements.txt (4 changes: 3 additions & 1 deletion)
@@ -5,5 +5,7 @@ jupyterlab==4.2.5
 openvino==2024.5.0
 ultralytics==8.3.38
 onnxruntime-openvino==1.20.0
-onnx==1.16.1
+# onnx>1.16.1 doesn't work on windows
+onnx==1.16.1; platform_system == "Windows"
+onnx==1.17.0; platform_system != "Windows"
 setuptools==73.0.1
