Robustness and quality of life improvements to Shared Sparsity contribution. #58
base: master
Conversation
Updates to improve robustness and quality of life:
- Addition of a `max_norm` constraint to avoid overflow/non-sparse outcomes given the nonconvex regularizer (this involves a minor change to the optimization algorithm). The theory had included this parameter, but we omitted it from the original version since it only seems to matter in extreme circumstances where the hyperparameters are poorly chosen. There is also an `auto` option here for easy use, as well as `none` to ignore it.
- Addition of an `auto` option for the `mcp_alpha` parameter, since this parameter may be somewhat mysterious to some users.
- Addition of an `include_bias` option, defaulted to `True`. This subtracts off the mean of each group of covariates and outcomes (something we typically did outside the module), which is equivalent to adding an unregularized bias term to the regressions.
- Replacement of the `auto` option for `mcp_lambda` with a new option, `proportional_lambda`, which, if set to `True`, interprets the value of `mcp_lambda` as a constant of proportionality applied to the previous `auto` value. In practice I found that the coefficient of the `auto` option almost always needs to be tuned, but that the analytic expression it uses was extremely helpful otherwise.
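The `include_bias` equivalence claimed above (mean-centering ≡ an unregularized intercept) can be checked directly for ordinary least squares. This is a standalone sketch using NumPy, not the module's actual API; the coefficient values are illustrative:

```python
# Check: fitting with an explicit unregularized intercept gives the same
# slope coefficients as fitting mean-centered data with no intercept.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3)) + 2.0                    # covariates with nonzero mean
y = X @ np.array([1.0, -0.5, 0.25]) + 3.0 + rng.normal(scale=0.1, size=50)

# Fit with an explicit (unregularized) intercept column.
X_bias = np.column_stack([np.ones(len(X)), X])
coef_bias, *_ = np.linalg.lstsq(X_bias, y, rcond=None)

# Fit on mean-centered covariates and outcomes, no intercept.
Xc = X - X.mean(axis=0)
yc = y - y.mean()
coef_centered, *_ = np.linalg.lstsq(Xc, yc, rcond=None)

# Slope coefficients agree; the intercept is recoverable from the means.
assert np.allclose(coef_bias[1:], coef_centered)
```

Since the regularizer never touches the intercept, the same equivalence carries over to the penalized regressions here.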
edit parameter description
Fixes a bug where `proportional_lambda` was always set to `True` in `MCPSelector` regardless of the value passed to the actual user-instantiated `SharedSparsityConfounderSelection` model.
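A minimal sketch of the pattern behind this fix, with simplified stand-in classes (not the actual implementations): the user-facing model must forward the value it received rather than letting the inner selector fall back to its default.

```python
# Hypothetical, simplified versions of the two classes involved in the bug.
class MCPSelector:
    def __init__(self, proportional_lambda=True):
        self.proportional_lambda = proportional_lambda


class SharedSparsityConfounderSelection:
    def __init__(self, proportional_lambda=True):
        # Before the fix, the inner selector was effectively constructed as
        # MCPSelector(proportional_lambda=True), ignoring the user's value.
        # The fix forwards the argument through:
        self.selector_ = MCPSelector(proportional_lambda=proportional_lambda)


model = SharedSparsityConfounderSelection(proportional_lambda=False)
assert model.selector_.proportional_lambda is False
```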
Apologies for the late response; I was under the impression everything was OK and was just waiting for the right version change to merge and release. However, it seems these changes broke the existing tests.
No description provided.