Commit
fixed most examples for newest version
maniospas committed Aug 22, 2024
1 parent 8e50e0e commit 899cd6d
Showing 4 changed files with 15 additions and 19 deletions.
2 changes: 1 addition & 1 deletion JGNN/src/examples/tutorial/Learning.java
@@ -44,7 +44,7 @@ public static void main(String[] args) {
.config("regularize", 1.E-5)
.var("x")
.operation("h = relu(x@matrix(features, 16, regularize)+vector(16))")
.operation("yhat = softmax(h@matrix(16, classes)+vector(classes), row)")
.operation("yhat = softmax(h@matrix(16, classes)+vector(classes), dim: 'row')")
.out("yhat")
.assertBackwardValidity();
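The same positional-to-named change recurs across the examples fixed in this commit. As a sketch of the mechanical rewrite (a hypothetical helper, not part of JGNN's API), the old `softmax(expr, row)` form maps onto the newest `dim: 'row'` named-config form:

```java
public class SoftmaxMigration {
    // Hypothetical helper (not part of JGNN): rewrites the pre-update
    // positional form softmax(expr, row) to the newest named-config form
    // softmax(expr, dim: 'row'), mirroring the by-hand edits in this commit.
    static String migrate(String operation) {
        return operation.replaceAll("softmax\\((.*), row\\)", "softmax($1, dim: 'row')");
    }

    public static void main(String[] args) {
        String old = "yhat = softmax(h@matrix(16, classes)+vector(classes), row)";
        // prints the operation exactly as it appears after this commit
        System.out.println(migrate(old));
    }
}
```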

2 changes: 1 addition & 1 deletion JGNN/src/examples/tutorial/NN.java
@@ -36,7 +36,7 @@ public static void main(String[] args) {
.layer("h{l+1} = relu(h{l}@matrix(features, hidden)+vector(hidden))")
.layerRepeat("h{l+1} = relu(h{l}@matrix(hidden, hidden)+vector(hidden))", 2)
.concat(2)
.layer("yhat = softmax(h{l}@matrix(2hidden, classes)+vector(classes), row)")
.layer("yhat = softmax(h{l}@matrix(2hidden, classes)+vector(classes), dim: 'row')")
.out("yhat");

Slice sampleIds = dataset.samples().getSlice().shuffle(100);
5 changes: 0 additions & 5 deletions JGNN/src/examples/tutorial/Quickstart.java
@@ -1,8 +1,5 @@
package tutorial;

import java.nio.file.Files;
import java.nio.file.Paths;

import mklab.JGNN.adhoc.Dataset;
import mklab.JGNN.adhoc.ModelBuilder;
import mklab.JGNN.adhoc.datasets.Cora;
@@ -42,8 +39,6 @@ public static void main(String[] args) throws Exception {
.classify()
.autosize(new EmptyTensor(numSamples));

System.out.println(modelBuilder.getConfig("lr"));

ModelTraining trainer = new ModelTraining()
.setOptimizer(new Adam(0.01))
.setEpochs(3000)
25 changes: 13 additions & 12 deletions docs/index.html
@@ -866,12 +866,12 @@ <h3 id="fastbuilder">3.2. FastBuilder</h3>
<h3 id="neuralang">3.3. Neuralang</h3>

<p>Neuralang scripts consist of functions that declare machine learning
components and their interactions using a syntax inspired by the
components. These call each other and adopt a syntax inspired by the
<a href="https://www.modular.com/mojo" target="_blank">Mojo</a>
language. Use a Rust highlighter to cover all keywords, though.
Before explaining how to use the <code class="language-java">Neuralang</code> model builder,
To get a sense of the language's syntax, we present and analyse code that leads to a full
architecture definition. First, look at the <code>classify</code>
we present and analyse code that supports a fully functional architecture.
First, look at the <code>classify</code>
function, which for completeness is presented below.
This takes two tensor inputs: <code>nodes</code> that correspond to identifiers
indicating which nodes should be classified (the output has a number of rows equal to the
@@ -906,7 +906,8 @@ <h3 id="neuralang">3.3. Neuralang</h3>
3. Function signature defaults.<br>
</p>

<p>Next, let us look at the <code>gcnlayer</code> function. This accepts
<p>Next, let us look at some functions that create the main body of an architecture.
First, <code>gcnlayer</code> accepts
two parameters: an adjacency matrix <code>A</code> and input feature matrix <code>h</code>.
The configuration <code>hidden: 64</code> in the function's signature
specifies the default number of hidden units,
@@ -935,7 +936,7 @@ <h3 id="neuralang">3.3. Neuralang</h3>


<p>We now move to parsing our declarations with the <code class="language-java">Neuralang</code>
model builder and using them to create an architecture. To this end, save your Neuralang code
model builder and using them to create an architecture. To this end, save your code
to a file and get it as a path <code class="language-java">Path architecture = Paths.get("filename.nn");</code>,
or avoid external files by inlining the definition within Java code through
a multiline String per <code class="language-java">String architecture = """ ... """;</code>.
@@ -944,15 +945,15 @@ <h3 id="neuralang">3.3. Neuralang</h3>
</p>
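The two options can be sketched in plain Java (stdlib only; `filename.nn` is the placeholder name from the text, and the script body is elided):

```java
import java.nio.file.Path;
import java.nio.file.Paths;

public class ArchitectureSource {
    public static void main(String[] args) {
        // Option 1: reference an external Neuralang script by path;
        // constructing the Path does not require the file to exist yet.
        Path architecture = Paths.get("filename.nn");
        System.out.println(architecture.getFileName());

        // Option 2: avoid external files by inlining the declarations
        // as a Java multiline String (text block).
        String inline = """
                ...
                """;
    }
}
```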


<p>For our model builder, we set remaining hyperparameters and overwrite the default value
for <code class="language-java">"hidden"</code> using the
<code class="language-java">.config(String, double)</code> method. Now that
we know about broadcasts, this is the method that implements them. We also determine
which variables are constants, namely the adjacency matrix <code>A</code> and node
<p>For the model builder, the following snippet sets remaining hyperparameters
and overwrites the default value
for <code class="language-java">"hidden"</code>. It also specifies
that certain variables are constants, namely the adjacency matrix <code>A</code> and node
representation <code>h</code>, as well as that node identifiers form a variable that serves
as the architecture's inputs. There could be multiple inputs, so this distinction of what
as the architecture's input. There could be multiple inputs, so this distinction of what
is a constant and what is a variable depends mostly on which quantities change
during training. In the case of node classification, both the adjacency matrix and
during training and is managed only by the Java side of the code.
In the case of node classification, both the adjacency matrix and
node features remain constant, as we work in one graph. Finally, the definition
sets a Neuralang expression as the architecture's output
by calling the <code class="language-java">.out(String)</code> method,
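The precedence described above, where a <code>.config(String, double)</code> broadcast overwrites a signature default such as <code>hidden: 64</code>, can be sketched with a plain map. This is a stdlib-only illustration of the idea, not JGNN's internal state:

```java
import java.util.HashMap;
import java.util.Map;

public class ConfigBroadcast {
    // Hypothetical stand-in for the builder's hyperparameter state:
    // signature defaults fill the map first, broadcasts overwrite them.
    static Map<String, Double> resolve(Map<String, Double> signatureDefaults,
                                       Map<String, Double> broadcasts) {
        Map<String, Double> resolved = new HashMap<>(signatureDefaults);
        resolved.putAll(broadcasts); // a broadcast overwrites the default
        return resolved;
    }

    public static void main(String[] args) {
        Map<String, Double> defaults = new HashMap<>();
        defaults.put("hidden", 64.0);  // e.g. hidden: 64 in gcnlayer's signature

        Map<String, Double> broadcasts = new HashMap<>();
        broadcasts.put("hidden", 16.0); // e.g. a .config("hidden", 16) call

        System.out.println(resolve(defaults, broadcasts).get("hidden"));
    }
}
```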
