
Fix a bug of forward-mode AD when multi-output is needed #1925

Open
Jerry-Jzy wants to merge 188 commits into master

Conversation

Jerry-Jzy
Contributor

I found this bug in the following example: https://github.com/lululxvi/deepxde/blob/master/examples/operator/stokes_aligned_pideeponet.py

When setting num_output=3 and multi_output_strategy="independent", the output shape is (batch size, # of coordinates, 3). The slice here is then wrong; what we really want is to slice along the last dimension.
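To make the shape issue concrete, here is a minimal NumPy sketch (the array and variable names are illustrative, not DeepXDE's actual code): with an output of shape (batch_size, num_points, 3), slicing the second axis selects a coordinate across components, while slicing the last axis selects one output component at every point.

```python
import numpy as np

# Illustrative shapes only: num_output=3 with
# multi_output_strategy="independent" yields an output of shape
# (batch_size, num_points, num_output).
batch_size, num_points, num_output = 4, 10, 3
y = np.zeros((batch_size, num_points, num_output))

# Slicing the second axis picks one coordinate across all components:
per_point = y[:, 0]        # shape (4, 3)

# Slicing the last axis picks one output component at every point,
# which is what the multi-output case actually needs:
per_component = y[..., 0]  # shape (4, 10)

print(per_point.shape, per_component.shape)
```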

Jerry-Jzy and others added 30 commits May 27, 2022 16:39
Update example of heat equation (lululxvi#706)
Add document for Lorenz inverse with exogenous input (lululxvi#709)
OperatorPredictor supports backends tensorflow.compat.v1, tensorflow,…
@lululxvi
Owner

Here, we compute the dim:

    self.dim_y = ys[0].shape[1]

Would this be a problem?


@Jerry-Jzy Jerry-Jzy closed this Dec 26, 2024
@Jerry-Jzy Jerry-Jzy reopened this Dec 26, 2024
@Jerry-Jzy
Contributor Author

> Here, we compute the dim:
>
>     self.dim_y = ys[0].shape[1]
>
> Would this be a problem?

What you mentioned is indeed a problem: the computed dim is wrong. However, that error has nothing to do with this pull request; it comes from the last forward-mode AD pull request. Since we changed forward-mode AD to be loop-free, the dim is computed incorrectly there.

I have used another way to compute the dim of ys, and fixed a bug in PDEOperatorCartesianProd:

    if bkd.ndim(ys[0]) == 2:
        self.dim_y = 1
    elif bkd.ndim(ys[0]) == 3:
        self.dim_y = ys[0].shape[2]
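The branch above can be sketched in plain NumPy (compute_dim_y and the ndim helper below are hypothetical stand-ins for bkd.ndim and the attribute assignment, used only to illustrate the logic):

```python
import numpy as np

def ndim(t):
    # Stand-in for bkd.ndim; DeepXDE dispatches this to the active backend.
    return t.ndim

def compute_dim_y(ys0):
    # A 2-D output (batch_size, num_points) carries a single component,
    # while a 3-D output carries its component count in the last axis.
    if ndim(ys0) == 2:
        return 1
    elif ndim(ys0) == 3:
        return ys0.shape[2]
    raise ValueError(f"unexpected ndim: {ndim(ys0)}")

print(compute_dim_y(np.zeros((4, 10))))     # 1
print(compute_dim_y(np.zeros((4, 10, 3))))  # 3
```

Note that ys[0].shape[-1] alone would not cover the 2-D case: if a 2-D output has shape (batch_size, num_points), shape[-1] returns the number of points rather than 1, which is presumably why the explicit ndim check is needed.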
Owner
How about using

    self.dim_y = ys[0].shape[-1]

Contributor Author

You mean using `self.dim_y = ys[0].shape[-1]` when the ndim of ys[0] is 3?

```diff
- losses_bc = [bkd.reduce_mean(bkd.stack(loss, 0)) for loss in losses_bc]
- losses.append(losses_bc)
+ for loss in losses_bc:
+     losses.append(bkd.reduce_mean(bkd.stack(loss, 0)))
```
Owner

How about

    losses_bc = [bkd.reduce_mean(bkd.stack(loss, 0)) for loss in losses_bc]
    losses.extend(losses_bc)
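The difference between appending and extending here can be shown with a toy NumPy stand-in for bkd.reduce_mean/bkd.stack (the names and data below are made up for illustration): append would push the whole list of reduced losses as one nested element, while extend adds one scalar loss per boundary condition.

```python
import numpy as np

# Toy data: each "loss" is a list of per-sample values.
losses_bc = [[1.0, 3.0], [2.0, 4.0]]
losses = []

# Reduce each boundary-condition loss to a scalar, mimicking
# bkd.reduce_mean(bkd.stack(loss, 0)) with NumPy equivalents.
reduced = [float(np.mean(np.stack(loss, 0))) for loss in losses_bc]

# extend flattens: losses gets one scalar per boundary condition.
# (losses.append(reduced) would instead yield [[2.0, 3.0]].)
losses.extend(reduced)
print(losses)  # [2.0, 3.0]
```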

Contributor Author

OK.
