diff --git a/.github/workflows/tests.yml b/.github/workflows/tests.yml
index 1f093c1..a6ad599 100644
--- a/.github/workflows/tests.yml
+++ b/.github/workflows/tests.yml
@@ -1,9 +1,9 @@
 name: tests
 
 on:
-  # push:
-  #   branches:
-  #     - master
+  push:
+    branches:
+      - master
   workflow_dispatch:
 
 jobs:
@@ -18,4 +18,4 @@ jobs:
         uses: actions-rust-lang/setup-rust-toolchain@v1
 
       - name: Run tests
-        run: cargo test
+        run: cargo test --workspace
diff --git a/Makefile b/Makefile
index c41c9fb..328c2e4 100644
--- a/Makefile
+++ b/Makefile
@@ -36,13 +36,12 @@ profile-mem:
 ###############################################################################
 .PHONY: test # | Run tests
 test:
-	cargo test
+	cargo test --workspace
 
 ###############################################################################
 .PHONY: lint # | Run linter (clippy)
 lint:
-	cargo clippy
-	cargo clippy
+	cargo clippy --workspace
 
 .PHONY: format # | Run formatter (cargo fmt)
 format:
diff --git a/README.md b/README.md
index 9cda781..5df0d7f 100644
--- a/README.md
+++ b/README.md
@@ -36,7 +36,7 @@ Compute nodes can technically do any arbitrary task, from computing the square r
 
 - **Ping/Pong**: Dria Admin Node broadcasts **ping** messages at a set interval, it is a required duty of the compute node to respond with a **pong** to these so that they can be included in the list of available nodes for task assignment. These tasks will respect the type of model provided within the pong message, e.g. if a task requires `gpt-4o` and you are running `phi3`, you won't be selected for that task.
 
-- **Workflows**: Each task is given in the form of a workflow, based on [Ollama Workflows](https://github.com/andthattoo/ollama-workflows) (see repository for more information). In simple terms, each workflow defines the agentic behavior of an LLM, all captured in a single JSON file, and can represent things ranging from simple LLM generations to iterative web searching.
+- **Workflows**: Each task is given in the form of a workflow, based on [Ollama Workflows](https://github.com/andthattoo/ollama-workflows). In simple terms, each workflow defines the agentic behavior of an LLM, all captured in a single JSON file, and can represent things ranging from simple LLM generations to iterative web searching.
 
 ## Node Running
 
diff --git a/p2p/README.md b/p2p/README.md
index 391df53..4ff7908 100644
--- a/p2p/README.md
+++ b/p2p/README.md
@@ -1,5 +1,7 @@
 # DKN Peer-to-Peer Client
 
+Dria Knowledge Network is a peer-to-peer network built over libp2p. This crate is a wrapper client to easily interact with DKN.
+
 ## Installation
 
 Add the package via `git` within your Cargo dependencies:
@@ -10,4 +12,34 @@
 dkn-p2p = { git = "https://github.com/firstbatchxyz/dkn-compute-node" }
 
 ## Usage
-TODO: !!!
+You can create the client as follows:
+
+```rs
+use std::str::FromStr;
+
+use dkn_p2p::DriaP2PClient;
+// `Keypair` and `Multiaddr` are libp2p types
+use libp2p::{identity::Keypair, Multiaddr};
+
+// your identity keypair, e.g. derived from your wallet, or a random one
+let keypair = Keypair::generate_secp256k1();
+
+// your listen address
+let addr = Multiaddr::from_str("/ip4/0.0.0.0/tcp/4001")?;
+
+// static bootstrap & relay addresses
+let bootstraps = vec![Multiaddr::from_str(
+    "some-multiaddrs-here"
+)?];
+let relays = vec![Multiaddr::from_str(
+    "some-multiaddrs-here"
+)?];
+
+// protocol version number, usually derived as `{major}.{minor}`
+let version = "0.2";
+
+// create the client!
+let mut client = DriaP2PClient::new(keypair, addr, &bootstraps, &relays, version)?;
+```
+
+Then, you can use its underlying functions, such as `subscribe`, `process_events` and `unsubscribe`. In particular, `process_events` handles all peer-to-peer events and returns a GossipSub message when one is received.