From d6afbc864c8141a56a9e894d5a6e99b9ab2e058d Mon Sep 17 00:00:00 2001
From: Alex Rakowski <20504869+agrski@users.noreply.github.com>
Date: Thu, 20 Jul 2023 20:57:11 +0100
Subject: [PATCH 1/8] Fix link for Rclone to k8s storage secrets

---
 docs/source/contents/models/rclone/index.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/source/contents/models/rclone/index.md b/docs/source/contents/models/rclone/index.md
index ae11d73ab6..c71c2284e2 100644
--- a/docs/source/contents/models/rclone/index.md
+++ b/docs/source/contents/models/rclone/index.md
@@ -4,4 +4,4 @@ We utilize [Rclone](https://rclone.org/) to copy model artifacts from a storage
 
 For local storage while developing see [here](../../getting-started/docker-installation/index.html#local-models).
 
-For authorization needed for cloud storage when running on Kubernetes see [here](../../kubernetes/cloud-storage/index.html#kubernetes-secret).
\ No newline at end of file
+For authorization needed for cloud storage when running on Kubernetes see [here](../../kubernetes/storage-secrets/index).

From c08e3785147a7012cfe644375e5cabb74faaf80f Mon Sep 17 00:00:00 2001
From: Alex Rakowski <20504869+agrski@users.noreply.github.com>
Date: Thu, 20 Jul 2023 21:11:28 +0100
Subject: [PATCH 2/8] Change link to use .md instead of .html for in-page link reference

---
 docs/source/contents/models/rclone/index.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/source/contents/models/rclone/index.md b/docs/source/contents/models/rclone/index.md
index c71c2284e2..8b1cd7f2ac 100644
--- a/docs/source/contents/models/rclone/index.md
+++ b/docs/source/contents/models/rclone/index.md
@@ -2,6 +2,6 @@
 
 We utilize [Rclone](https://rclone.org/) to copy model artifacts from a storage location to the model servers. This allows users to take advantage of Rclones support for over 40 cloud storage backends including Amazon S3, Google Storage and many others.
 
-For local storage while developing see [here](../../getting-started/docker-installation/index.html#local-models).
+For local storage while developing see [here](../../getting-started/docker-installation/index.md#local-models).
 
 For authorization needed for cloud storage when running on Kubernetes see [here](../../kubernetes/storage-secrets/index).

From f063008648e92e05093f23fd4f52b82eb487c197 Mon Sep 17 00:00:00 2001
From: Alex Rakowski <20504869+agrski@users.noreply.github.com>
Date: Thu, 20 Jul 2023 21:19:30 +0100
Subject: [PATCH 3/8] Allow linking to deeply nested sub-headings (H6 level)

---
 docs/source/conf.py | 2 ++
 1 file changed, 2 insertions(+)

diff --git a/docs/source/conf.py b/docs/source/conf.py
index 68150cde10..87f82f126f 100644
--- a/docs/source/conf.py
+++ b/docs/source/conf.py
@@ -52,6 +52,8 @@
     "tasklist",
 ]
 
+myst_heading_anchors = 6;
+
 source_suffix = ['.rst', '.md']
 # Add any paths that contain templates here, relative to this directory.
 templates_path = ['_templates']
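Patch 3 is the piece that makes the link style used in the rest of the series work: MyST-Parser only generates heading anchor slugs down to the configured depth. As a rough sketch (not the project's actual `conf.py`, which contains much more), the relevant configuration looks like the block below; the trailing semicolon in the patch above is legal but unnecessary in Python.

```python
# docs/source/conf.py: illustrative excerpt only, assuming myst-parser is installed.
extensions = [
    "myst_parser",  # Markdown support for Sphinx
]

# Generate GitHub-style anchor slugs for headings down to H6, so that
# cross-document links such as `model-zoo.md#lightgbm-model` resolve.
myst_heading_anchors = 6
```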
From 95bb9b74dd3c382ef56ce54f6972f792a4a72b9e Mon Sep 17 00:00:00 2001
From: Alex Rakowski <20504869+agrski@users.noreply.github.com>
Date: Thu, 20 Jul 2023 21:24:34 +0100
Subject: [PATCH 4/8] Fix broken links in inference artifacts docs

---
 .../contents/models/inference-artifacts/index.md | 10 +++++-----
 1 file changed, 5 insertions(+), 5 deletions(-)

diff --git a/docs/source/contents/models/inference-artifacts/index.md b/docs/source/contents/models/inference-artifacts/index.md
index 57790eb1a1..536d77fedb 100644
--- a/docs/source/contents/models/inference-artifacts/index.md
+++ b/docs/source/contents/models/inference-artifacts/index.md
@@ -29,15 +29,15 @@ To run your model inside Seldon you must supply an inference artifact that can b
 * - LightGBM
   - MLServer
   - `lightgbm`
-  - [example](../../examples/model-zoo.html#lightgbm-model)
+  - [example](../../examples/model-zoo.md#lightgbm-model)
 * - MLFlow
   - MLServer
   - `mlflow`
-  - [example](../../examples/model-zoo.html#mlflow-wine-model)
+  - [example](../../examples/model-zoo.md#mlflow-wine-model)
 * - ONNX
   - Triton
   - `onnx`
-  - [example](../../examples/model-zoo.html#onnx-mnist-model)
+  - [example](../../examples/model-zoo.md#onnx-mnist-model)
 * - OpenVino
   - Triton
   - `openvino`
@@ -53,7 +53,7 @@ To run your model inside Seldon you must supply an inference artifact that can b
 * - PyTorch
   - Triton
   - `pytorch`
-  - [example](../../examples/model-zoo.html#pytorch-mnist-model)
+  - [example](../../examples/model-zoo.md#pytorch-mnist-model)
 * - SKLearn
   - MLServer
   - `sklearn`
@@ -77,7 +77,7 @@ To run your model inside Seldon you must supply an inference artifact that can b
 * - XGBoost
   - MLServer
   - `xgboost`
-  - [example](../../examples/model-zoo.html#xgboost-model)
+  - [example](../../examples/model-zoo.md#xgboost-model)
 ```
 
 ## Saving Model artifacts
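Patches 4 and 5 apply the same mechanical rewrite, `.html#anchor` to `.md#anchor`, across two pages. A small, hypothetical helper (not part of this patch series) can flag any links of the old form that remain elsewhere in the docs tree:

```python
#!/usr/bin/env python3
"""List Markdown links that still point at built .html pages with anchors."""
import re
from pathlib import Path

# Matches link targets such as ../../examples/model-zoo.html#lightgbm-model
HTML_ANCHOR_LINK = re.compile(r"\]\(([^)\s]+\.html#[^)\s]+)\)")

for md_file in Path("docs/source").rglob("*.md"):
    for match in HTML_ANCHOR_LINK.finditer(md_file.read_text(encoding="utf-8")):
        print(f"{md_file}: {match.group(1)}")
```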
From bc80fc562503e75cff3d1beb1b82c0c844b2d73d Mon Sep 17 00:00:00 2001
From: Alex Rakowski <20504869+agrski@users.noreply.github.com>
Date: Thu, 20 Jul 2023 21:37:20 +0100
Subject: [PATCH 5/8] Fix broken links in pipeline examples docs

---
 docs/source/contents/pipelines/index.md | 10 +++++-----
 1 file changed, 5 insertions(+), 5 deletions(-)

diff --git a/docs/source/contents/pipelines/index.md b/docs/source/contents/pipelines/index.md
index 99b5c4d4ef..a64feba7ae 100644
--- a/docs/source/contents/pipelines/index.md
+++ b/docs/source/contents/pipelines/index.md
@@ -82,7 +82,7 @@ The simplest Pipeline chains models together: the output of one model goes into
 
 In the above we rename tensor `OUTPUT0` to `INPUT0` and `OUTPUT1` to `INPUT1`. This allows these models to be chained together. The shape and data-type of the tensors needs to match as well.
 
-This example can be found in the [pipeline-examples examples](../examples/pipeline-examples.html#model-chaining).
+This example can be found in the [pipeline-examples examples](../examples/pipeline-examples.md#model-chaining).
 
 ## Join
 
@@ -127,7 +127,7 @@ Joins can have a join type which can be specified with `inputsJoinType` and can
 * `outer` : wait for `joinWindowMs` to join any inputs. Ignoring any inputs that have not sent any data at that point. This will mean this step of the pipeline is guaranteed to have a latency of at least `joinWindowMs`.
 * `any` : Wait for any of the specified data sources.
 
-This example can be found in the [pipeline-examples examples](../examples/pipeline-examples.html#model-join).
+This example can be found in the [pipeline-examples examples](../examples/pipeline-examples.md#model-join).
 
 ## Conditional Logic
 
@@ -181,7 +181,7 @@ In the above we have a step `conditional` that either outputs a tensor named `OU
 
 Note, we also have a final Pipeline output step that does an `any` join on these two models essentially outputing fron the pipeline whichever data arrives from either model. This type of Pipeline can be used for Multi-Armed bandit solutions where you want to route traffic dynamically.
 
-This example can be found in the [pipeline-examples examples](../examples/pipeline-examples.html#conditional).
+This example can be found in the [pipeline-examples examples](../examples/pipeline-examples.md#conditional).
 
 ### Errors
 
@@ -193,7 +193,7 @@ Its also possible to abort pipelines when an error is produced to in effect crea
 
 This Pipeline runs normally or throws an error based on whether the input tensors have certain values.
 
-This example can be found in the [pipeline-examples examples](../examples/pipeline-examples.html#error).
+This example can be found in the [pipeline-examples examples](../examples/pipeline-examples.md#error).
 
 ### Triggers
 
@@ -241,7 +241,7 @@ Sometimes you want to run a step if an output is received from a previous step b
 
 In this example the last step `tfsimple3` runs only if there are outputs from `tfsimple1` and `tfsimple2` but also data from the `check` step. However, if the step `tfsimple3` is run it only receives the join of data from `tfsimple1` and `tfsimple2`.
 
-This example can be found in the [pipeline-examples examples](../examples/pipeline-examples.html#model-join-with-trigger).
+This example can be found in the [pipeline-examples examples](../examples/pipeline-examples.md#model-join-with-trigger).
 
 ### Trigger Joins

From 0a9dbc11a318860234f4ea7dee9efe90cd4de33f Mon Sep 17 00:00:00 2001
From: Alex Rakowski <20504869+agrski@users.noreply.github.com>
Date: Thu, 20 Jul 2023 21:39:07 +0100
Subject: [PATCH 6/8] Update link to changed section name

---
 docs/source/contents/metrics/usage.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/source/contents/metrics/usage.md b/docs/source/contents/metrics/usage.md
index 8cf36f37b7..22fdc7dfce 100644
--- a/docs/source/contents/metrics/usage.md
+++ b/docs/source/contents/metrics/usage.md
@@ -75,7 +75,7 @@ Hodometer is installed as a separate deployment, by default in the same namespac
 ````{group-tab} Helm
 
-If you install Seldon Core v2 by [Helm chart](../getting-started/kubernetes-installation/helm.md), there are values corresponding to the key environment variables discussed [above](#setting-options).
+If you install Seldon Core v2 by [Helm chart](../getting-started/kubernetes-installation/helm.md), there are values corresponding to the key environment variables discussed [above](#options).
 
 These Helm values and their equivalents are provided below:
 
 | Helm value | Environment variable |
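Patch 6 is a consequence of a renamed heading: with `myst_heading_anchors` the anchor is derived mechanically from the heading text, so `#setting-options` stops resolving once the section title changes (presumably from something like "Setting options" to "Options"). A rough approximation of that slug rule, sketched here rather than taken from MyST-Parser's real implementation, is:

```python
import re

def heading_to_anchor(heading: str) -> str:
    """Approximate the GitHub-style slug that myst_heading_anchors generates."""
    slug = heading.strip().lower()
    slug = re.sub(r"[^\w\- ]", "", slug)  # drop punctuation
    return re.sub(r"\s+", "-", slug)      # collapse spaces into hyphens

print(heading_to_anchor("Setting options"))         # -> setting-options
print(heading_to_anchor("Options"))                 # -> options
print(heading_to_anchor("Saving Model artifacts"))  # -> saving-model-artifacts
```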
From 59c54ca4ae78528b618115daa7254c5e36f70854 Mon Sep 17 00:00:00 2001
From: Alex Rakowski <20504869+agrski@users.noreply.github.com>
Date: Thu, 20 Jul 2023 21:46:41 +0100
Subject: [PATCH 7/8] Fix more broken links due to typos, moved docs, and filename suffix

---
 docs/source/contents/apis/inference/v2.md                    | 2 +-
 docs/source/contents/getting-started/index.md                | 2 +-
 .../contents/getting-started/kubernetes-installation/helm.md | 2 +-
 3 files changed, 3 insertions(+), 3 deletions(-)

diff --git a/docs/source/contents/apis/inference/v2.md b/docs/source/contents/apis/inference/v2.md
index ba3df563c2..6c0c210b85 100644
--- a/docs/source/contents/apis/inference/v2.md
+++ b/docs/source/contents/apis/inference/v2.md
@@ -869,7 +869,7 @@ matches the tensor's data type.
 
 A platform is a string indicating a DL/ML framework or backend. Platform
 is returned as part of the response to a
-[Model Metadata](#model_metadata) request but is information only. The
+[Model Metadata](#model-metadata) request but is information only. The
 proposed inference APIs are generic relative to the DL/ML framework used
 by a model and so a client does not need to know the platform of a given
 model to use the API. Platform names use the format
diff --git a/docs/source/contents/getting-started/index.md b/docs/source/contents/getting-started/index.md
index e3598316ca..a447f46ce1 100644
--- a/docs/source/contents/getting-started/index.md
+++ b/docs/source/contents/getting-started/index.md
@@ -12,7 +12,7 @@ Seldon Core can be installed either with Docker Compose or with Kubernetes:
 
 Once installed:
  * Try the existing [examples](../examples/index.md).
- * Train and deploy your own [model artifact](../models/inference-artifacts/index.html#saving-model-artifacts).
+ * Train and deploy your own [model artifact](../models/inference-artifacts/index.md#saving-model-artifacts). 
 
 ## Core Concepts
diff --git a/docs/source/contents/getting-started/kubernetes-installation/helm.md b/docs/source/contents/getting-started/kubernetes-installation/helm.md
index 3a77c04a1c..806ffdb59d 100644
--- a/docs/source/contents/getting-started/kubernetes-installation/helm.md
+++ b/docs/source/contents/getting-started/kubernetes-installation/helm.md
@@ -13,7 +13,7 @@ The Helm charts can be found within the `k8s/helm-charts` folder and they are pu
 
 Assuming you have installed any ecosystem components: Jaeger, Prometheus, Kafka as discussed [here](./index.md) you can follow the following steps.
 
-Note that for Kafka follow the steps discussed [here](kafka.md)
+Note that for Kafka follow the steps discussed [here](../../kubernetes/kafka/index)
 
 ## Add Seldon Core v2 Charts
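Beyond anchor fixes, patch 7 also corrects plain relative paths (a page that moved, a missing directory level, a stray `.html` suffix). A hypothetical sanity check for that class of breakage, sketched here rather than taken from the repository, resolves each relative link against its source file and confirms the target exists:

```python
import re
from pathlib import Path

# Capture relative link targets, ignoring external URLs and pure in-page anchors.
LINK = re.compile(r"\]\((?!https?://|#)([^)#\s]+)")

def report_missing_targets(docs_root: str = "docs/source") -> None:
    for md_file in Path(docs_root).rglob("*.md"):
        for target in LINK.findall(md_file.read_text(encoding="utf-8")):
            resolved = (md_file.parent / target).resolve()
            # Targets like `../../kubernetes/kafka/index` omit the suffix, so try `.md` too.
            if not (resolved.exists() or resolved.with_suffix(".md").exists()):
                print(f"{md_file}: broken relative link -> {target}")

if __name__ == "__main__":
    report_missing_targets()
```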
From bf3359a7fff6fd8623f0e7890d8ef722ccdad288 Mon Sep 17 00:00:00 2001
From: Alex Rakowski <20504869+agrski@users.noreply.github.com>
Date: Thu, 20 Jul 2023 21:47:02 +0100
Subject: [PATCH 8/8] Remove trailing whitespace in Getting Started page

---
 docs/source/contents/getting-started/index.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/docs/source/contents/getting-started/index.md b/docs/source/contents/getting-started/index.md
index a447f46ce1..bf644d9387 100644
--- a/docs/source/contents/getting-started/index.md
+++ b/docs/source/contents/getting-started/index.md
@@ -1,7 +1,7 @@
 # Getting Started
 
 ```{note}
-Some dependencies may require that the (virtual) machines on which you deploy, support the SSE4.2 instruction set or x86-64-v2 microarchitecture. If `lscpu | grep sse4_2` does not return anything on your machine, your CPU is not compatible, and you may need to update the (virtual) host's CPU. 
+Some dependencies may require that the (virtual) machines on which you deploy, support the SSE4.2 instruction set or x86-64-v2 microarchitecture. If `lscpu | grep sse4_2` does not return anything on your machine, your CPU is not compatible, and you may need to update the (virtual) host's CPU.
 ```
 
 Seldon Core can be installed either with Docker Compose or with Kubernetes:
@@ -12,7 +12,7 @@ Seldon Core can be installed either with Docker Compose or with Kubernetes:
 
 Once installed:
  * Try the existing [examples](../examples/index.md).
- * Train and deploy your own [model artifact](../models/inference-artifacts/index.md#saving-model-artifacts). 
+ * Train and deploy your own [model artifact](../models/inference-artifacts/index.md#saving-model-artifacts).
 
 ## Core Concepts