Great project, thank you! This is more a cry for help. I'm trying to add new functionality to the forward renderer, which works great, but now I'm having a lot of trouble understanding how to write the backward pass.
Do you have any suggestions on how to best approach this task?
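For concreteness, this is the pattern I assume is the right one: a minimal, self-contained `torch.autograd.Function` sketch with a toy shading step standing in for the renderer (the class, the math, and the tensor names are made up for illustration, not the project's actual kernels). The forward saves what the backward needs, and the backward returns one gradient per forward input, applying the chain rule to exactly the math the forward computes:

```python
import torch

class ToyShade(torch.autograd.Function):
    """Toy stand-in for one render step: out = weights * intensities**2.
    The point is the pattern, not the math: backward must return one
    gradient per forward input, in the same order and with the same shape."""

    @staticmethod
    def forward(ctx, weights, intensities):
        # Keep whatever the backward pass needs to replay the math.
        ctx.save_for_backward(weights, intensities)
        return weights * intensities ** 2

    @staticmethod
    def backward(ctx, grad_out):
        weights, intensities = ctx.saved_tensors
        # Chain rule against the forward expression above:
        # d(out)/d(weights) = intensities**2
        # d(out)/d(intensities) = 2 * weights * intensities
        grad_weights = grad_out * intensities ** 2
        grad_intensities = grad_out * 2.0 * weights * intensities
        return grad_weights, grad_intensities
```

If I understand correctly, any change to the forward math has to be mirrored by the corresponding chain-rule term in `backward`, which is exactly the step I find hard to get right in the real CUDA code.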
After a lot of trial and error I think the gradient is now computed correctly, but I still don't have a good procedure for this task: how to modify or add the derivatives in the backward step so that they reflect the changes to the forward math, and how best to test those changes. I tried torch.autograd.gradcheck, but I first had to convert everything to doubles, otherwise it lacks accuracy; and even then, if there is a problem, it isn't obvious how to spot where in the code it occurs.
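For reference, this is roughly how I run gradcheck against the toy Function sketched above; everything is in float64, since gradcheck's finite differences drown in float32 rounding error (which is why I had to convert to doubles), and the tolerance values here are just illustrative:

```python
import torch
from torch.autograd import gradcheck

# Small double-precision inputs with requires_grad=True.
w = torch.randn(8, dtype=torch.float64, requires_grad=True)
i = torch.randn(8, dtype=torch.float64, requires_grad=True)

# gradcheck compares the analytical backward against numerical
# (finite-difference) gradients of the forward.
ok = gradcheck(
    ToyShade.apply,
    (w, i),
    eps=1e-6,               # finite-difference step
    atol=1e-4,              # loosened tolerance for a noisier op
    raise_exception=True,   # failure message names the mismatching input
)
print("analytical and numerical gradients agree:", ok)
```

When gradcheck fails on the full renderer, the only way I have found to narrow it down is to wrap intermediate stages as their own autograd Functions and run gradcheck on each in isolation; the failing stage then points at the derivative term that no longer matches the forward math. If there is a better workflow, I would love to hear it.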