Node features embedding for time-series forecasting #384
-
Hello! First of all, I would like to know whether you think this architecture could work. My second question: so far I have not found a function in Spektral to extract these features (I am using the GCNConv class). Is there a tutorial or a pointer to some resource? Thank you in advance!
Replies: 1 comment
-
Hi @andremarco ,

```python
import numpy as np
import tensorflow as tf

from src.modules.models import NestedGRUCell

B = 1    # batch size
T = 100  # number of time steps
N = 10   # number of nodes (time series)
F = 5    # number of features per node

# Random node features (B, T, N, F) and identity adjacency matrices (B, T, N, N)
x = np.random.normal(size=(B, T, N, F))
a = np.asarray([[np.eye(N) for _ in range(T)] for _ in range(B)])

stateful = False
unroll = False
dropout = 0.1

input_1 = tf.keras.Input((None, N, F), batch_size=a.shape[0])
input_2 = tf.keras.Input((None, N, N), batch_size=a.shape[0])

cell = NestedGRUCell(
    nodes=N,
    hidden_size_in=F,
    hidden_size_out=F,
    dropout=dropout,
    recurrent_dropout=dropout,
    regularizer=None,
    gnn_h=True,
    layer_norm=False,
)
rnn = tf.keras.layers.RNN(cell, return_sequences=True, return_state=False, stateful=stateful, unroll=unroll)

outputs = rnn((input_1, input_2), training=True)
model = tf.keras.models.Model([input_1, input_2], outputs)
model.compile("rmsprop", "mse")
model.fit([x, a], x, batch_size=B, epochs=5)
o = model([x, a])
```
You can check out my implementation of a GAT-RNN, available from this link. The first input is the node features of the N time series, with shape BxTxNxF, and the second is a tensor of shape BxTxNxN that encodes the relations between the time series. B is the batch dimension, T the length of the time series, and F the feature dimension. The model outputs the hidden states of the RNN at each time step. You can then either apply a dense layer on top of the hidden representations, or use them directly to predict the original time series shifted by t with an MSE loss. The sample code above implements the model.
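As a minimal sketch of the "shifted by t" target construction mentioned above (plain NumPy, shapes matching the discussion; the variable names are illustrative, not part of the repo):

```python
import numpy as np

# Shapes as in the discussion: B batches, T steps, N nodes, F features
B, T, N, F = 1, 100, 10, 5
x = np.random.normal(size=(B, T, N, F))

# One-step-ahead forecasting targets (t = 1): predict x at step t+1
# from the hidden state at step t. Inputs drop the last step, targets
# drop the first, so both have T-1 steps.
x_in = x[:, :-1]  # (B, T-1, N, F)
y = x[:, 1:]      # (B, T-1, N, F)
```

You would then fit the model with `x_in` (plus the matching adjacency slice) as input and `y` as the MSE target, instead of reconstructing `x` itself.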