Commit

chore(docs,tests): fix typo and grammar error (#376)
Because

- there were many typos and grammar errors in various files, so I tried to fix them

This commit

- fixes almost all of the typos and grammar errors in the repo
FarukhS52 authored Oct 26, 2023
1 parent 5a2241f commit 39039a0
Showing 6 changed files with 18 additions and 18 deletions.
4 changes: 2 additions & 2 deletions README.md
@@ -37,7 +37,7 @@

- **macOS or Linux** - VDP works on macOS or Linux, but does not support Windows yet.

- **Docker and Docker Compose** - VDP uses Docker Compose (specifically, `Compose V2` and `Compose specification`) to run all services at local. Please install the latest stable [Docker](https://docs.docker.com/get-docker/) and [Docker Compose](https://docs.docker.com/compose/install/) before using VDP.
- **Docker and Docker Compose** - VDP uses Docker Compose (specifically, `Compose V2` and `Compose specification`) to run all services locally. Please install the latest stable [Docker](https://docs.docker.com/get-docker/) and [Docker Compose](https://docs.docker.com/compose/install/) before using VDP.

## Quick Start

@@ -106,7 +106,7 @@ Explore our open-source unstructured data infrastructure stack, comprising a col
<details>
<summary><b>⚗️ <a href="https://github.com/instill-ai/model" target="_blank">Instill Model</a>: Scalable AI model serving and training</b></summary><br>

**Instill Model**, or simply **Model**, emerges as an advanced ModelOps platform. Here, the focus is on empowering you to seamlessly import, train and serve Machine Learning (ML) models for inference purposes. Like other projects, Instill Model's source code is available for your exploration.
**Instill Model**, or simply **Model**, emerges as an advanced ModelOps platform. Here, the focus is on empowering you to seamlessly import, train, and serve Machine Learning (ML) models for inference purposes. Like other projects, Instill Model's source code is available for your exploration.
</details>

## No-/Low-code Access & Support
@@ -17,8 +17,8 @@ After onboarding, you will be redirected to the **Pipeline** page on the left si

#### Create a SYNC pipeline `stomata`

**Step 1: Add a HTTP source**
A HTTP source accepts HTTP requests with image payloads to be processed by a pipeline.
**Step 1: Add an HTTP source**
An HTTP source accepts HTTP requests with image payloads to be processed by a pipeline.

To set it up,

@@ -38,7 +38,7 @@ To set it up,
4. fill in the GitHub repository URL `instill-ai/model-stomata-instance-segmentation-dvc`, and a Git tag e.g., `v1.0-cpu` to import the model
5. click **Set up**.

**Step 3: Add a HTTP destination**
**Step 3: Add an HTTP destination**

Since we are building a `SYNC` pipeline, the HTTP destination is automatically paired with the HTTP source.

@@ -49,7 +49,7 @@ Just click **Next**.
Almost done! Just

1. give your pipeline a unique ID `stomata`,
2. [optional] add description, and
2. [optional] Add description, and
3. click **Set up**.

Now you should see the newly created SYNC pipeline `stomata` on the Pipeline page 🎉
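
As a rough illustration only (not part of this commit's diff): once the `SYNC` pipeline exists, its HTTP source can be triggered by POSTing an image payload to the VDP API. The endpoint path, port, and payload shape below are assumptions made for the sake of the sketch; check the VDP documentation for the exact trigger request of your version.

```python
import base64
import requests

# Hypothetical endpoint and payload layout -- only the pipeline ID "stomata"
# comes from the tutorial above; verify both against the VDP docs.
URL = "http://localhost:8080/v1alpha/pipelines/stomata/trigger"

with open("leaf.jpg", "rb") as f:
    payload = {"inputs": [{"image_base64": base64.b64encode(f.read()).decode()}]}

resp = requests.post(URL, json=payload)
resp.raise_for_status()
print(resp.json())
```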
@@ -90,4 +90,4 @@ You can now view your Streamlit app in your browser.
To shut down all running VDP services:
```
$ make down
```
```
@@ -13,14 +13,14 @@


def parse_instance_segmentation_response(resp: requests.Response) -> Tuple[List[Tuple[float]], List[str], List[str], List[float]]:
r""" Parse an Instance Segmentation response in to bounding boxes, RLEs, categories and scores
r""" Parse an Instance Segmentation response into bounding boxes, RLEs, categories, and scores
Args:
resp (`requests.Response`): response for standardised Instance Segmentation task
resp (`requests.Response`): response for standardized Instance Segmentation task
Returns: parsed outputs, a tuple of
List[Tuple[float]]: a list of detected bounding boxes in the format of (left, top, width, height)
List[str]: a list of Uncompressed Run-length encoding (RLE) in the format of comma separated string separated by comma. The length of this list must be the same as the detected bounding boxes.
List[str]: a list of Uncompressed Run-length encoding (RLE) in the format of a comma-separated string separated by a comma. The length of this list must be the same as the detected bounding boxes.
List[str]: a list of category labels, each of which corresponds to a detected bounding box. The length of this list must be the same as the detected bounding boxes.
List[float]: a list of scores, each of which corresponds to a detected bounding box. The length of this list must be the same as the detected bounding boxes.
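
An editorial aside, not part of this commit: the RLEs above arrive as comma-separated strings, while `rle_decode` further down expects a dict with `counts` and `size`. A minimal bridging sketch, assuming each RLE spans its own bounding box (which is how `rle_to_polygon` below pairs an RLE with `bbox_ltwh`):

```python
def rle_string_to_dict(rle_str, bbox_ltwh):
    # Hypothetical helper: convert a comma-separated RLE string into the
    # {'counts': [...], 'size': [height, width]} form used by rle_decode.
    # Assumption: the RLE covers its bounding box, so size is the box size.
    _, _, width, height = bbox_ltwh
    counts = [int(float(v)) for v in rle_str.split(",") if v.strip()]
    return {"counts": counts, "size": [int(height), int(width)]}
```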
@@ -76,7 +76,7 @@ def display_intro_markdown(pipeline_id="stomata"):

intro_markdown = """
# 🥦 Identify stomata by triggering VDP pipeline
# 🥦 Identify stomata by triggering the VDP pipeline
[Versatile Data Pipeline (VDP)](https://github.com/instill-ai/vdp) is a source available unstructured data ETL tool to streamline end-to-end unstructured data processing
@@ -87,7 +87,7 @@ def display_intro_markdown(pipeline_id="stomata"):
Give us a ⭐ on [![GitHub](https://img.shields.io/badge/github-%23121011.svg?style=for-the-badge&logo=github&logoColor=white)](https://github.com/instill-ai/vdp) and join our [![Discord](https://img.shields.io/badge/Community-%237289DA.svg?style=for-the-badge&logo=discord&logoColor=white)](https://discord.gg/sevxWsqpGh)
#### We are offering **FREE** fully-managed VDP on Instill Cloud
If you are interested in showcasing your models, please [sign up the form](https://www.instill.tech/get-access) and we will reach out to you. Seats are limited - first come , first served.
If you are interested in showcasing your models, please [sign up for the form](https://www.instill.tech/get-access) and we will reach out to you. Seats are limited - first come, first served.
# Demo
@@ -109,7 +109,7 @@ def rle_decode(uncompressed_rle):
'counts': [n1, n2, n3, ...],
'size': [height, width] of the mask
}
mask_shape: (height,width) of mask to return
mask_shape: (height, width) of mask to return
Returns np.ndarray binary mask, 1 - mask, 0 - background
"""
@@ -124,7 +124,7 @@ def binary_mask_to_polygons(binary_mask, tolerance=0):
r""" Converts a binary mask to COCO polygon representation
Args:
binary_mask: a 2D binary numpy array where '1's represent the object
tolerance: Maximum distance from original points of polygon to approximated polygonal chain. If tolerance is 0, the original coordinate array is returned.
tolerance: Maximum distance from original points of polygon to approximated polygonal chain. If the tolerance is 0, the original coordinate array is returned.
"""
def close_contour(contour):
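
Another aside: converting a binary mask into COCO polygons is commonly done with scikit-image, roughly as sketched below. The padding and contour-approximation steps are assumptions about the approach, not code taken from this repository.

```python
import numpy as np
from skimage import measure

def binary_mask_to_polygons_sketch(binary_mask, tolerance=0):
    # Pad by one pixel so contours touching the border close properly, trace
    # contours at the 0.5 level, then simplify them with the given tolerance.
    padded = np.pad(binary_mask, pad_width=1, mode="constant", constant_values=0)
    polygons = []
    for contour in measure.find_contours(padded, 0.5):
        contour = measure.approximate_polygon(contour, tolerance)
        if len(contour) < 3:
            continue
        contour = np.flip(contour, axis=1) - 1  # (row, col) -> (x, y), undo padding
        polygons.append(np.maximum(contour, 0).ravel().tolist())
    return polygons
```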
@@ -158,7 +158,7 @@ def rle_to_polygon(uncompressed_rle, bbox_ltwh):
'counts': [n1, n2, n3, ...],
'size': [height, width] of the mask
}
bbox_ltwh: the bounding box that constraints the RLE in the format of [left, top, width, height]
bbox_ltwh: the bounding box that constrains the RLE in the format of [left, top, width, height]
Return a list of polygons related to the original image
"""
4 changes: 2 additions & 2 deletions test/pipeline/helper.js
@@ -98,7 +98,7 @@ export function deployModel(model, repository, modelInstances) {
r.status === 200,
});

// Check the model instance state being updated in 24 hours. Some GitHub models is huge.
// Check the model instance state being updated in 24 hours. Some GitHub models are huge.
currentTime = new Date().getTime();
timeoutTime = new Date().getTime() + 24 * 60 * 60 * 1000;
while (timeoutTime > currentTime) {
@@ -209,4 +209,4 @@ export function sendSlackMessages(data) {
check(res, {
"is status 200": (r) => r.status === 200,
});
}
}
2 changes: 1 addition & 1 deletion test/pipeline/verify-classification.js
@@ -5,7 +5,7 @@ import {
let classificationDog = "golden retriever"
let classificationBear = "brown bear"

if (__ENV.TARGET == "m1") { // m1 have problem with classification extension and result with the Triton 22.12 is wrong when running batching.
if (__ENV.TARGET == "m1") { // m1 has a problem with classification extension and the result with the Triton 22.12 is wrong when running batching.
classificationDog = "tile roof"
classificationBear = "tile roof"
}
