Add golden section search #335
Conversation
a golden PR ;-)
alphadia/workflow/optimization.py
Outdated
self.tolerance_for_golden_section_search = workflow.config["optimization"][
    self.parameter_name
]["tolerance_for_golden_section_search"]
self.golden_search_converged = False
move this out of the if, to have a consistent set of attributes
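A minimal sketch of that suggestion, assuming the config lookup currently sits inside an `if` branch (the condition name below is hypothetical):

```python
# Hedged sketch: set both attributes unconditionally so every instance
# exposes the same set, whichever branch runs.
self.golden_search_converged = False
self.tolerance_for_golden_section_search = None
if use_golden_section_search:  # hypothetical condition standing in for the PR's if
    self.tolerance_for_golden_section_search = workflow.config["optimization"][
        self.parameter_name
    ]["tolerance_for_golden_section_search"]
```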
self.has_converged = True
self._update_after_convergence()

def _update_after_convergence(self):
docstring :-) (everywhere)
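For instance, a hedged sketch of such a docstring (the description is an assumption about the method's role, based on it being called right after `self.has_converged` is set):

```python
def _update_after_convergence(self):
    """Perform bookkeeping once the optimizer has converged.

    Hypothetical docstring sketch; called from the convergence branch
    immediately after ``self.has_converged`` is set to ``True``.
    """
```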
alphadia/workflow/optimization.py
Outdated
upper_bound = self.history_df.loc[self.golden_triple_idxes[2]].parameter
lower_bound = self.history_df.loc[self.golden_triple_idxes[0]].parameter
golden_parameter_proposal = upper_bound - 0.618 * (upper_bound - lower_bound)
magic number?
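The 0.618 is the inverse golden ratio, 1/φ = (√5 − 1)/2 ≈ 0.6180339887. A minimal sketch of replacing the literal with a named constant (the constant name is my choice, not the PR's):

```python
import math

# 1/phi = (sqrt(5) - 1) / 2, the ratio that gives golden section search its name
INVERSE_GOLDEN_RATIO = (math.sqrt(5) - 1) / 2

golden_parameter_proposal = upper_bound - INVERSE_GOLDEN_RATIO * (
    upper_bound - lower_bound
)
```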
alphadia/workflow/optimization.py
Outdated
parameter_history.loc[self.golden_triple_idxes[1]]
> parameter_history.iloc[-1]
could you extract this into a variable? (or comment what it means exactly :-)
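One possible extraction, with variable names that are a guess at what the comparison means:

```python
# Hypothetical naming: "the middle point of the golden triple is larger
# than the most recently recorded parameter value".
middle_of_triple = parameter_history.loc[self.golden_triple_idxes[1]]
latest_parameter = parameter_history.iloc[-1]
middle_exceeds_latest = middle_of_triple > latest_parameter
```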
golden_parameter_values = parameter_history.loc[self.golden_triple_idxes]

differences = np.diff(golden_parameter_values)
if differences[0] < differences[1]:
smth like

```python
sign = np.sign(differences[1] - differences[0])
golden_parameter_proposal = golden_parameter_values.iloc[0] + sign * 0.618 * (
    golden_parameter_values.iloc[2] - golden_parameter_values.iloc[0]
)
```
(pls double-check order of `sign` variable ;-))
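For comparison, the textbook placement rule (e.g. Numerical Recipes) probes a fraction 1 − 0.618 ≈ 0.382 into the larger of the two sub-intervals, measured from the middle point; a sketch with `a`, `b`, `c` as the ordered bracketing triple:

```python
# Textbook golden-section step, assuming a < b < c bracket the optimum.
a = golden_parameter_values.iloc[0]
b = golden_parameter_values.iloc[1]
c = golden_parameter_values.iloc[2]

step = 1 - 0.618  # ~0.382, complement of the inverse golden ratio
if (c - b) > (b - a):
    proposal = b + step * (c - b)  # probe the larger right-hand sub-interval
else:
    proposal = b - step * (b - a)  # probe the larger left-hand sub-interval
```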
self.history_df.index.values > feature_history.idxmax()
][self.feature_name].idxmax()
lower_bound = self.history_df.loc[lower_bound_idx].parameter
except ValueError:
could this be turned into an explicit check? (same below)
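A hedged sketch of such a check, assuming the `ValueError` comes from calling `.idxmax()` on an empty selection (the fallback is a placeholder for whatever the `except` branch does):

```python
# Rows recorded after the feature maximum; this selection can be empty.
rows_after_max = self.history_df[
    self.history_df.index.values > feature_history.idxmax()
]
if rows_after_max.empty:
    lower_bound = fallback_lower_bound  # hypothetical; mirrors the except branch
else:
    lower_bound_idx = rows_after_max[self.feature_name].idxmax()
    lower_bound = self.history_df.loc[lower_bound_idx].parameter
```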
@@ -587,6 +588,61 @@ def _skip_all_optimizers(
)
optimizer.skip()

def golden_section_search(self, optimizer):
you should definitely explain the idea behind this somewhere in detail, maybe the docstring of this method would be a good place
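For readers of the thread: golden section search brackets a unimodal optimum with a triple of points and shrinks the bracket by a constant factor of ≈0.618 per function evaluation, reusing one interior point each iteration. A self-contained textbook sketch, not the PR's implementation (function and parameter names are illustrative):

```python
import math

def golden_section_maximize(f, a, c, tolerance=1e-5):
    """Locate the maximum of a unimodal function f on [a, c].

    Textbook sketch: keep two interior points placed at the golden
    ratio; each step drops one end of the bracket and reuses one of
    the previous evaluations, shrinking the interval by ~0.618.
    """
    inv_phi = (math.sqrt(5) - 1) / 2  # ~0.618
    b = c - inv_phi * (c - a)  # left interior point
    x = a + inv_phi * (c - a)  # right interior point
    fb, fx = f(b), f(x)
    while abs(c - a) > tolerance:
        if fb > fx:
            # maximum lies left of x: drop the right end, reuse b as new x
            c, x, fx = x, b, fb
            b = c - inv_phi * (c - a)
            fb = f(b)
        else:
            # maximum lies right of b: drop the left end, reuse x as new b
            a, b, fb = b, x, fx
            x = a + inv_phi * (c - a)
            fx = f(x)
    return (a + c) / 2

# e.g. golden_section_maximize(lambda p: -(p - 2.0) ** 2, 0.0, 5.0) ~ 2.0
```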
This adds a deterministic search step after the main search. It may not be worth bothering with, but it could be useful in certain settings.