Hello,

According to the paper, the MLP's skip connection is at layer 2 (the third layer, counting from 0), but when using nerfactor I've seen strange behaviour, so I checked the code. `Network.call()` in `mlp.py` does the following:
```python
x_ = x + 0  # make a copy
for i, layer in enumerate(self.layers):
    y = layer(x_)
    if i in self.skip_at:
        y = tf.concat((y, x), -1)
    x_ = y
return y
```
So the concatenation is applied *after* calling the layer, and therefore the actual skip connection lands at the input of the next layer (the fourth one).
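For comparison, here is a minimal sketch of a loop where the skip lands at layer 2 itself, as the paper describes: the original input is concatenated *before* the layer at a skip index is applied. The layer widths, the `skip_at` value, and the function name are illustrative placeholders, not nerfactor's actual code.

```python
import tensorflow as tf

# Hypothetical stand-ins for illustration only.
skip_at = (2,)
dense_layers = [tf.keras.layers.Dense(256, activation="relu") for _ in range(4)]

def call_with_paper_skip(x):
    x_ = x + 0  # make a copy of the network input
    for i, layer in enumerate(dense_layers):
        if i in skip_at:
            # Concatenate the original input *before* applying layer i,
            # so layer 2 itself consumes [features, x] as in the paper.
            x_ = tf.concat((x_, x), -1)
        x_ = layer(x_)
    return x_

y = call_with_paper_skip(tf.random.normal((8, 63)))
```

With this ordering, layer 2 receives the concatenated tensor directly, whereas the current code effectively shifts the skip to layer 3.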