Renaming of normalization to standardization and implementation of actual normalization #123
Comments
Seems like a good idea to me. First we can deprecate `normalise`.
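For the deprecation step, a minimal sketch of the mechanics (assuming, purely for illustration, that the new name is `standardize`; the actual name is still open):

```julia
using Statistics: mean, std

# Hypothetical replacement; signature simplified relative to Flux's `normalise`.
standardize(x; dims=1) = (x .- mean(x; dims=dims)) ./ std(x; dims=dims)

# Old name keeps working for now but warns callers to migrate.
function normalise(x; dims=1)
    Base.depwarn("`normalise` is deprecated, use `standardize` instead.", :normalise)
    return standardize(x; dims=dims)
end
```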
I don't even think Flux uses `normalise`.
Flux has its own definition. Julia already uses `normalize` to mean something different to either of those:
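(For anyone skimming: `LinearAlgebra.normalize` rescales a vector to unit `p`-norm, which matches neither min-max scaling nor z-scoring.)

```julia
julia> using LinearAlgebra

julia> normalize([3.0, 4.0])        # unit Euclidean (2-) norm
2-element Vector{Float64}:
 0.6
 0.8

julia> normalize([3.0, 4.0], 1)     # unit 1-norm
2-element Vector{Float64}:
 0.42857142857142855
 0.5714285714285714
```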
I recently saw this package https://github.com/brendanjohnharris/Normalization.jl#normalization-methods but have not investigated closely. Thoughts on whether farming this out might be better? Edit: that uses JuliennedArrays for …
The Flux function ought to be renamed to something, anything else. The existing name is just confusing. We can probably discuss that over in NNlib though.
I think the …
Fastai and PyTorch use the name `Normalize` for the mean/std transform. I'm more worried about the proximity of `normalise` and `normalize`.
Currently the implemented function called `normalise` in reality does standardization, i.e., it transforms the numbers to mean 0 and standard deviation 1. I propose that we rename it and implement actual normalization. As a reminder:
- Normalization (Min-Max Scaling): $\hat{X} = (X - X_{min})/(X_{max} - X_{min})$
- Standardization (Z-Score Normalization): $\hat{X} = (X - \mu_X)/\sigma_X$
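For concreteness, a sketch of both transforms in Julia (the names `minmax_scale` and `standardize` are placeholders, not a naming proposal):

```julia
using Statistics: mean, std

# Min-max scaling: rescales to the [0, 1] range along `dims`.
function minmax_scale(x; dims=1)
    lo, hi = minimum(x; dims=dims), maximum(x; dims=dims)
    return (x .- lo) ./ (hi .- lo)
end

# Z-score standardization: mean 0 and standard deviation 1 along `dims`
# (this is what the current `normalise` computes).
function standardize(x; dims=1)
    return (x .- mean(x; dims=dims)) ./ std(x; dims=dims)
end
```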
I know this might be nitpicking, but since I think we should have a function that does actual normalization, it just seems odd to give that a different name.