Fix a bug in forward-mode AD when multi-output is needed #1925
base: master
Conversation
Update example of heat equation (lululxvi#706)
Add document for Lorenz inverse with exogenous input (lululxvi#709)
OperatorPredictor supports backends tensorflow.compat.v1, tensorflow,…
… method of Lr decay in Pytorch
update to latest version
Here, we compute the dim (deepxde/deepxde/gradients/jacobian.py, Line 31 in b0d239b). Would this be a problem?
What you mentioned is indeed a problem: the dim computed is wrong. However, this error has nothing to do with this pull request; it is about the last forward-mode AD pull request. Since we changed the forward-mode AD to be loop-free, the dim computed there is wrong. I have used another way to compute the dim:
if bkd.ndim(ys[0]) == 2:
    self.dim_y = 1
elif bkd.ndim(ys[0]) == 3:
    self.dim_y = ys[0].shape[2]
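For illustration, here is a minimal sketch of how this branching compares with the `shape[-1]` simplification suggested below, using NumPy arrays as stand-ins for the backend tensors (so `np.ndim` plays the role of `bkd.ndim`); the shapes are hypothetical:

```python
import numpy as np

def dim_y_branching(y):
    # Mirrors the branching above: 2-D output -> 1, 3-D output -> last axis.
    if np.ndim(y) == 2:
        return 1
    elif np.ndim(y) == 3:
        return y.shape[2]

def dim_y_last_axis(y):
    # Suggested simplification. Note it only agrees with the branching
    # version in the 2-D case when the trailing axis has size 1; for a
    # (batch, d) tensor with d > 1 it returns d instead of 1.
    return y.shape[-1]

y2 = np.zeros((8, 1))       # 2-D case: (batch, 1)
y3 = np.zeros((8, 100, 3))  # 3-D case: (batch, points, outputs)

print(dim_y_branching(y2), dim_y_last_axis(y2))  # 1 1
print(dim_y_branching(y3), dim_y_last_axis(y3))  # 3 3
```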
How about using
self.dim_y = ys[0].shape[-1]
You mean using self.dim_y = ys[0].shape[-1] when the ndim of ys[0] is 3?
deepxde/data/pde_operator.py
-    losses_bc = [bkd.reduce_mean(bkd.stack(loss, 0)) for loss in losses_bc]
-    losses.append(losses_bc)
+    for loss in losses_bc:
+        losses.append(bkd.reduce_mean(bkd.stack(loss, 0)))
How about
losses_bc = [bkd.reduce_mean(bkd.stack(loss, 0)) for loss in losses_bc]
losses.extend(losses_bc)
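For what it's worth, the loop-and-append and comprehension-plus-extend variants produce the same flat list; a small sketch using NumPy in place of the `bkd` backend helpers, with made-up loss values:

```python
import numpy as np

# Hypothetical per-BC losses: each entry is a list of scalar losses
# across the sampled functions.
losses_bc = [[1.0, 3.0], [2.0, 4.0]]

# Loop-and-append variant (as in the PR diff):
losses_a = []
for loss in losses_bc:
    losses_a.append(np.mean(np.stack(loss, 0)))

# Comprehension-plus-extend variant (as suggested):
losses_b = []
losses_b.extend([np.mean(np.stack(loss, 0)) for loss in losses_bc])

# Both give the flat list of per-BC mean losses, [2.0, 3.0];
# the original losses.append(losses_bc) would instead have nested
# a whole list as a single element of losses.
assert losses_a == losses_b
```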
ok
I found the bug from this case: https://github.com/lululxvi/deepxde/blob/master/examples/operator/stokes_aligned_pideeponet.py
When setting num_output=3 and multi_output_strategy="independent", the output shape will be (batch size, # of coordinates, 3). Then the slice here is wrong; what we really want is to slice in the last dim.
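A minimal NumPy sketch of that shape issue (the array and sizes are illustrative, not taken from the example script):

```python
import numpy as np

batch, n_points, n_outputs = 4, 100, 3
# Shape produced with num_output=3 and multi_output_strategy="independent":
# (batch size, # of coordinates, # of outputs)
y = np.zeros((batch, n_points, n_outputs))

# Wrong: slicing axis 1 selects coordinates, not an output component.
wrong = y[:, 0:1]
print(wrong.shape)      # (4, 1, 3)

# Intended: slice the last axis to get one output component.
component = y[..., 0:1]
print(component.shape)  # (4, 100, 1)
```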