
where is shjax ? #6

Open
Chutlhu opened this issue Jan 5, 2024 · 2 comments

Comments


Chutlhu commented Jan 5, 2024

Dear author,

Thank you very much for this repository.
I am interested in the spherical harmonics hash encoding. Could you provide some more information about the shjax library? I cannot find it online.

Thank you very much!

blurgyy (Owner) commented Jan 5, 2024

Hi @Chutlhu,

Thank you for your interest.

I implemented shjax as a custom CUDA extension in deps/spherical-harmonics-encoding-jax/, and it is integrated into Python via

jaxngp/models/encoders.py

Lines 351 to 357 in d63c2c9

class SphericalHarmonicsEncoderCuda(Encoder):
    # highest degree
    L: int

    def __call__(self, dirs: jax.Array) -> jax.Array:
        "Just a thin wrapper on top of :func:`shjax.spherical_harmonics_encoding()`"
        return shjax.spherical_harmonics_encoding(dirs, self.L)

You can inspect its source there.

However, I ended up using a pure-JAX implementation of the spherical harmonics encoding, because in my benchmarks it is consistently faster than the custom CUDA implementation. I suspect this is because JAX code is more easily optimized through operations like kernel fusion. The JAX implementation used throughout the project can be found at

class SphericalHarmonicsEncoder(Encoder):
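For a rough idea of what such a pure-JAX encoder computes (this is a hypothetical sketch, not the repository's actual code), the real spherical harmonics basis can be evaluated analytically from the direction components; up to degree 2 that gives 9 coefficients per direction, and `jax.jit` can fuse all the elementwise arithmetic into a small number of kernels:

```python
import jax
import jax.numpy as jnp

def sh_encode_deg2(dirs: jax.Array) -> jax.Array:
    """Evaluate the real spherical harmonics basis up to degree 2
    (9 coefficients) for unit directions of shape (..., 3).
    Constants are the standard real-SH normalization factors."""
    x, y, z = dirs[..., 0], dirs[..., 1], dirs[..., 2]
    return jnp.stack([
        0.28209479177387814 * jnp.ones_like(x),             # l = 0
        -0.48860251190291987 * y,                           # l = 1
        0.48860251190291987 * z,
        -0.48860251190291987 * x,
        1.0925484305920792 * x * y,                         # l = 2
        -1.0925484305920792 * y * z,
        0.94617469575756008 * z * z - 0.31539156525252005,
        -1.0925484305920792 * x * z,
        0.54627421529603959 * (x * x - y * y),
    ], axis=-1)

# jit lets XLA fuse the whole basis evaluation
encode = jax.jit(sh_encode_deg2)
out = encode(jnp.array([[0.0, 0.0, 1.0]]))
```

Because every term is a short chain of elementwise multiplies, XLA can fuse the whole evaluation, which is plausibly why a hand-written CUDA kernel has no advantage here.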

The benchmark I used to compare the JAX and CUDA implementations of the spherical harmonics encoding is at

def bench_sh():
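As a reference point (a minimal sketch, not the repository's `bench_sh`), any such JAX micro-benchmark has to account for JAX's asynchronous dispatch: calls return before the device finishes, so you must call `block_until_ready()` before stopping the clock, and run the function once beforehand to exclude compilation time:

```python
import time
import jax
import jax.numpy as jnp

def bench(fn, dirs: jax.Array, n_iters: int = 100) -> float:
    """Return the mean per-call wall time of `fn(dirs)` in seconds.
    Hypothetical helper, assuming `fn` maps one array to one array."""
    fn = jax.jit(fn)
    out = fn(dirs)            # warm-up call triggers compilation
    out.block_until_ready()
    t0 = time.perf_counter()
    for _ in range(n_iters):
        out = fn(dirs)
    out.block_until_ready()   # sync: dispatch is asynchronous in JAX
    return (time.perf_counter() - t0) / n_iters
```

Without the warm-up and the final sync, the timing would mostly measure compilation and dispatch overhead rather than kernel execution.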

Cheers!


Chutlhu commented Jan 14, 2024

Dear @blurgyy,
Thank you very much! I found it.
I played with it a little. Using this positional embedding, the model overfits the training data very well (compared to standard Random Fourier Features), but it seems to lose the native interpolation property.
Do you know anything about this? Do you have any references on this problem?
