# Wahdan/support array types (#17)
## Summary

Support array types

### Why?

Array types are part of the Sparkplug B specification.

### How?

- Added encoders and decoders for the new array types.
- Reached 100% test coverage for the `_datatype.py` module, including tests
from the spec.
- Updated uv lock local dev dependencies to include pytest-subtests (for
subtests in those unit tests) and numpy.
- Updated changelog for the next release.
- Updated type annotations to stop using the deprecated `Dict`, `List`, `Tuple`, `Type`, etc.
- Added `codecov.yml`.
- Omitted protobuf files and some untestable lines from coverage, and increased
coverage requirements.
- Updated demo notebooks.
- Removed redundant notebooks.
- Added documentation for unsupported datatypes.
- Added a `just` command for notebooks, removing `run.sh`.
- Fixed the Dockerfile for notebooks and renamed it.
- Documented notebooks better in the README.
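Conceptually, Sparkplug B encodes numeric array metrics as little-endian packed bytes. The sketch below illustrates the kind of encoder/decoder pair this change adds, using `INT8_ARRAY` values from the demo notebook; the helper names are hypothetical and are not pysparkplug's actual internals:

```python
import struct

def encode_int8_array(values: tuple[int, ...]) -> bytes:
    # Sparkplug B packs numeric arrays as little-endian bytes;
    # "<Nb" means N little-endian signed 8-bit integers.
    return struct.pack(f"<{len(values)}b", *values)

def decode_int8_array(data: bytes) -> tuple[int, ...]:
    # The element count is recovered from the byte length (1 byte per INT8).
    return struct.unpack(f"<{len(data)}b", data)

encoded = encode_int8_array((1, -2, 3))  # example values from the demo notebook
assert decode_int8_array(encoded) == (1, -2, 3)
```

Wider types follow the same pattern with different format codes (e.g. `"<h"` for INT16, `"<d"` for DOUBLE).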

---------

Co-authored-by: Ahmed Wahdan <ahmedibrahimwahdan@gmail.com>
matteosox and ahmedwahdan authored Jan 10, 2025
1 parent ca64b43 commit 176e73e
Showing 24 changed files with 1,731 additions and 396 deletions.
1 change: 0 additions & 1 deletion .github/CONTRIBUTING.md
@@ -150,7 +150,6 @@ When naming a branch, please use the syntax `username/branch-name-here`. If you
- Metadata
- Properties
- DataSet types
- Array types
- MQTT v5
- Historian/analytics (just listens)
- Refactor all of `_payload.py`.
14 changes: 14 additions & 0 deletions CHANGELOG.md
@@ -3,6 +3,20 @@
All notable changes for `pysparkplug` will be documented in this file.
This project adheres to [Semantic Versioning](http://semver.org/) and [Keep a Changelog](http://keepachangelog.com/).

## 0.5.0 (2025-01-09)

### Added
- Support for array datatypes, i.e. INT8_ARRAY, INT16_ARRAY, INT32_ARRAY, INT64_ARRAY,
UINT8_ARRAY, UINT16_ARRAY, UINT32_ARRAY, UINT64_ARRAY, FLOAT_ARRAY, DOUBLE_ARRAY,
STRING_ARRAY, BOOLEAN_ARRAY, and DATETIME_ARRAY.
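For `STRING_ARRAY`, the Sparkplug B specification packs the strings as null-terminated byte sequences concatenated together. A rough sketch of that encoding, using the `("hello", "world")` values from the demo notebook (hypothetical helpers, not the library's actual implementation):

```python
def encode_string_array(values: tuple[str, ...]) -> bytes:
    # Each string is UTF-8 encoded and null-terminated, then concatenated.
    return b"".join(s.encode("utf-8") + b"\x00" for s in values)

def decode_string_array(data: bytes) -> tuple[str, ...]:
    # Split on the null terminators; the trailing empty chunk is dropped.
    return tuple(chunk.decode("utf-8") for chunk in data.split(b"\x00")[:-1])

encoded = encode_string_array(("hello", "world"))
assert decode_string_array(encoded) == ("hello", "world")
```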

### Changed
- Payload `metrics` attribute is now type annotated and implemented as a `tuple`.
- `DATETIME` values are no longer all assumed to be UTC; they are now properly
converted to the UTC timezone, with naive datetime objects treated as the local
timezone.
- Unsupported datatypes now raise a `NotImplementedError` instead of a `ValueError` when attempting to encode/decode them.
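The new `DATETIME` behavior can be illustrated with a small sketch: Sparkplug timestamps are milliseconds since the Unix epoch in UTC, so aware datetimes are converted and naive ones are interpreted as local time. The `to_utc_millis` helper below is hypothetical, not pysparkplug's actual code:

```python
from datetime import datetime, timezone

def to_utc_millis(dt: datetime) -> int:
    # astimezone() interprets naive datetimes as local time and
    # converts aware datetimes to UTC before taking the timestamp.
    return int(dt.astimezone(timezone.utc).timestamp() * 1000)

aware = datetime(2025, 1, 9, 12, 0, tzinfo=timezone.utc)
assert to_utc_millis(aware) == int(aware.timestamp() * 1000)
```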

## 0.4.0 (2024-10-24)

### Added
2 changes: 1 addition & 1 deletion notebook.Dockerfile → Dockerfile
@@ -1,5 +1,5 @@
# Start from a core stack version
FROM jupyter/scipy-notebook:2024-10-21
FROM quay.io/jupyter/scipy-notebook:2025-01-06

# Move to directory where repo will be mounted in home directory
WORKDIR /home/jovyan/pysparkplug
4 changes: 2 additions & 2 deletions README.md
@@ -20,10 +20,10 @@ $ pip install pysparkplug

### Usage

More documentation to come later, but for now, you can find some example usage notebooks in the `notebooks` directory.
Simple demos of the `EdgeNode`, `Device`, and `Client` classes publishing and subscribing to all supported payloads and metric datatypes can be found in the `notebooks` directory. To run them dynamically, you'll need to install Docker and run `just notebooks` before opening your local browser to http://localhost:8888. The password is `bokchoy`.

## Features

### Fully type annotated

`pysparkplug`'s various interfaces are fully type annotated, passing [Mypy](https://mypy.readthedocs.io/en/stable/)'s static type checker.
`pysparkplug`'s various interfaces are fully type annotated.
8 changes: 8 additions & 0 deletions codecov.yml
@@ -0,0 +1,8 @@
coverage:
  status:
    patch:
      default:
        target: 100%
    project:
      default:
        target: 69%
3 changes: 1 addition & 2 deletions compose.yaml
@@ -1,14 +1,13 @@
services:
  emqx:
    image: emqx/emqx:5.0.9
    image: emqx/emqx:5.8.4
    ports:
      - "18083:18083" # Dashboard
    environment:
      EMQX_DASHBOARD__DEFAULT_PASSWORD: admin
  notebook:
    build:
      context: .
      dockerfile: notebook.Dockerfile
    command:
      - "start.sh"
      - "jupyter"
4 changes: 4 additions & 0 deletions justfile
@@ -51,3 +51,7 @@ draft:

publish: packaging
    uv publish --publish-url {{publish-url}}

notebooks:
    -docker compose up
    docker compose down
2 changes: 1 addition & 1 deletion notebooks/dcmd_demo.ipynb
@@ -98,7 +98,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.11"
"version": "3.12.8"
}
},
"nbformat": 4,
@@ -123,6 +123,84 @@
" ),\n",
" psp.Metric(\n",
" timestamp=psp.get_current_timestamp(),\n",
" name=\"int8_array\",\n",
" datatype=psp.DataType.INT8_ARRAY,\n",
" value=(1, -2, 3),\n",
" ),\n",
" psp.Metric(\n",
" timestamp=psp.get_current_timestamp(),\n",
" name=\"int16_array\",\n",
" datatype=psp.DataType.INT16_ARRAY,\n",
" value=(4, -5, 6),\n",
" ),\n",
" psp.Metric(\n",
" timestamp=psp.get_current_timestamp(),\n",
" name=\"int32_array\",\n",
" datatype=psp.DataType.INT32_ARRAY,\n",
" value=(7, -8, 9),\n",
" ),\n",
" psp.Metric(\n",
" timestamp=psp.get_current_timestamp(),\n",
" name=\"int64_array\",\n",
" datatype=psp.DataType.INT64_ARRAY,\n",
" value=(10, -11, 12),\n",
" ),\n",
" psp.Metric(\n",
" timestamp=psp.get_current_timestamp(),\n",
" name=\"uint8_array\",\n",
" datatype=psp.DataType.UINT8_ARRAY,\n",
" value=(1, 2, 3),\n",
" ),\n",
" psp.Metric(\n",
" timestamp=psp.get_current_timestamp(),\n",
" name=\"uint16_array\",\n",
" datatype=psp.DataType.UINT16_ARRAY,\n",
" value=(4, 5, 6),\n",
" ),\n",
" psp.Metric(\n",
" timestamp=psp.get_current_timestamp(),\n",
" name=\"uint32_array\",\n",
" datatype=psp.DataType.UINT32_ARRAY,\n",
" value=(7, 8, 9),\n",
" ),\n",
" psp.Metric(\n",
" timestamp=psp.get_current_timestamp(),\n",
" name=\"uint64_array\",\n",
" datatype=psp.DataType.UINT64_ARRAY,\n",
" value=(10, 11, 12),\n",
" ),\n",
" psp.Metric(\n",
" timestamp=psp.get_current_timestamp(),\n",
" name=\"float_array\",\n",
" datatype=psp.DataType.FLOAT_ARRAY,\n",
" value=(1.1, -2.2, 3.3),\n",
" ),\n",
" psp.Metric(\n",
" timestamp=psp.get_current_timestamp(),\n",
" name=\"double_array\",\n",
" datatype=psp.DataType.DOUBLE_ARRAY,\n",
" value=(4.4, -5.5, 6.6),\n",
" ),\n",
" psp.Metric(\n",
" timestamp=psp.get_current_timestamp(),\n",
" name=\"boolean_array\",\n",
" datatype=psp.DataType.BOOLEAN_ARRAY,\n",
" value=(True, False, True),\n",
" ),\n",
" psp.Metric(\n",
" timestamp=psp.get_current_timestamp(),\n",
" name=\"string_array\",\n",
" datatype=psp.DataType.STRING_ARRAY,\n",
" value=(\"hello\", \"world\"),\n",
" ),\n",
" psp.Metric(\n",
" timestamp=psp.get_current_timestamp(),\n",
" name=\"datetime_array\",\n",
" datatype=psp.DataType.DATETIME_ARRAY,\n",
" value=(datetime.datetime(2024, 1, 1), datetime.datetime.now()),\n",
" ),\n",
" psp.Metric(\n",
" timestamp=psp.get_current_timestamp(),\n",
" name=\"null_uint8\",\n",
" datatype=psp.DataType.UINT8,\n",
" ),\n",
@@ -149,19 +227,11 @@
"edge_node.connect(host)\n",
"time.sleep(1)\n",
"edge_node.update(metrics)\n",
"time.sleep(1)\n",
"edge_node.update_device(device_id, metrics)\n",
"edge_node.deregister(device_id)"
]
},
{
"cell_type": "code",
"execution_count": 2,
"id": "af1372c1-0fb2-4f39-a8b2-b01f6d27ea5f",
"metadata": {
"tags": []
},
"outputs": [],
"source": [
"time.sleep(1)\n",
"edge_node.deregister(device_id)\n",
"time.sleep(1)\n",
"edge_node.disconnect()"
]
}
@@ -182,7 +252,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.11"
"version": "3.12.8"
}
},
"nbformat": 4,
173 changes: 0 additions & 173 deletions notebooks/edge_node_demo.ipynb

This file was deleted.
