
Add golden section search #335

Closed · odespard wants to merge 13 commits

Conversation

odespard (Collaborator) commented Aug 30, 2024

This adds a deterministic search step after the main search. It may not be worth bothering with, but it could be useful in certain settings.
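
For background (editor's note, not taken from the PR's code; all names and the maximization setting are assumptions): golden-section search deterministically shrinks a bracket around the optimum of a unimodal function, cutting it by the inverse golden ratio at each step. A minimal sketch:

    import math

    INV_PHI = (math.sqrt(5) - 1) / 2  # inverse golden ratio, ~0.618

    def golden_section_search(f, a, b, tol=1e-5):
        """Locate the maximum of a unimodal function f on [a, b]."""
        # two interior probe points at golden-ratio positions
        c = b - INV_PHI * (b - a)
        d = a + INV_PHI * (b - a)
        fc, fd = f(c), f(d)
        while abs(b - a) > tol:
            if fc > fd:
                # maximum lies in [a, d]; reuse c as the new upper probe
                b, d, fd = d, c, fc
                c = b - INV_PHI * (b - a)
                fc = f(c)
            else:
                # maximum lies in [c, b]; reuse d as the new lower probe
                a, c, fc = c, d, fd
                d = a + INV_PHI * (b - a)
                fd = f(d)
        return (a + b) / 2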

@odespard odespard marked this pull request as draft August 30, 2024 09:54
mschwoer (Collaborator) left a comment

a golden PR ;-)

self.tolerance_for_golden_section_search = workflow.config["optimization"][
    self.parameter_name
]["tolerance_for_golden_section_search"]
self.golden_search_converged = False
mschwoer:
move this out of the if to have a consistent set of attributes
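
A sketch of what that could look like, assuming the assignment currently sits inside an if branch of __init__ (the surrounding structure is assumed; names are from the diff):

    def __init__(self, workflow):
        ...
        # set unconditionally so every instance exposes the same attributes,
        # whether or not the golden-section step is enabled
        self.golden_search_converged = False
        self.tolerance_for_golden_section_search = workflow.config["optimization"][
            self.parameter_name
        ]["tolerance_for_golden_section_search"]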

    self.has_converged = True
    self._update_after_convergence()

def _update_after_convergence(self):
mschwoer:
docstring :-) (everywhere)
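
For instance (a guess at the method's intent from its name and the diff, not the PR's wording):

    def _update_after_convergence(self):
        """Run follow-up steps once the primary search has converged,
        e.g. start the golden-section refinement of the parameter."""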


upper_bound = self.history_df.loc[self.golden_triple_idxes[2]].parameter
lower_bound = self.history_df.loc[self.golden_triple_idxes[0]].parameter
golden_parameter_proposal = upper_bound - 0.618 * (upper_bound - lower_bound)
mschwoer:
magic number?
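
0.618 here is the inverse golden ratio, 1/φ = (√5 − 1)/2 ≈ 0.618. One way to name it (a sketch; the constant name is made up):

    import math

    # inverse golden ratio, 1/phi = (sqrt(5) - 1) / 2 ≈ 0.6180339887
    INVERSE_GOLDEN_RATIO = (math.sqrt(5) - 1) / 2

    golden_parameter_proposal = upper_bound - INVERSE_GOLDEN_RATIO * (
        upper_bound - lower_bound
    )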

Comment on lines 254 to 255
parameter_history.loc[self.golden_triple_idxes[1]]
    > parameter_history.iloc[-1]
mschwoer:
could you extract this into a variable? (or comment what it means exactly :-)
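
For instance (a sketch; the variable name is a guess at the condition's meaning):

    # the most recently tested parameter lies below the middle of the triple
    latest_parameter_below_middle = (
        parameter_history.loc[self.golden_triple_idxes[1]]
        > parameter_history.iloc[-1]
    )
    if latest_parameter_below_middle:
        ...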

golden_parameter_values = parameter_history.loc[self.golden_triple_idxes]

differences = np.diff(golden_parameter_values)
if differences[0] < differences[1]:
mschwoer: something like

    sign = np.sign(differences[1] - differences[0])
    golden_parameter_proposal = golden_parameter_values.iloc[0] + sign * 0.618 * (
        golden_parameter_values.iloc[2] - golden_parameter_values.iloc[0]
    )

(please double-check the order of the `sign` variable ;-))
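
A toy check of that suggestion (editor's observation, not from the PR): with a triple (a, b, c) whose lower interval is larger, sign is -1 and the proposal a + sign * 0.618 * (c - a) lands below a, outside the bracket, so the two branches likely need different anchor points:

    import numpy as np

    a, b, c = 0.0, 0.7, 1.0                          # lower interval is larger
    differences = np.diff([a, b, c])                 # [0.7, 0.3]
    sign = np.sign(differences[1] - differences[0])  # -1.0
    print(a + sign * 0.618 * (c - a))                # -0.618, outside [a, c]

    # standard golden-section probe: measure from the end nearest the
    # larger sub-interval so the proposal stays inside the bracket
    if differences[0] < differences[1]:
        proposal = a + 0.618 * (c - a)  # probe the upper interval
    else:
        proposal = c - 0.618 * (c - a)  # probe the lower interval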

        self.history_df.index.values > feature_history.idxmax()
    ][self.feature_name].idxmax()
    lower_bound = self.history_df.loc[lower_bound_idx].parameter
except ValueError:
mschwoer:
could this be turned into an explicit check? (same below)
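
A sketch of an explicit check (assuming the try/except guards against idxmax() raising ValueError on an empty selection; the fallback body is a placeholder since the original except branch isn't shown):

    # rows recorded after the feature reached its maximum
    rows_after_max = self.history_df[
        self.history_df.index.values > feature_history.idxmax()
    ]
    if rows_after_max.empty:
        ...  # whatever the except ValueError branch did
    else:
        lower_bound_idx = rows_after_max[self.feature_name].idxmax()
        lower_bound = self.history_df.loc[lower_bound_idx].parameter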

@@ -587,6 +588,61 @@ def _skip_all_optimizers(
)
optimizer.skip()

def golden_section_search(self, optimizer):
mschwoer:
you should definitely explain the idea behind this somewhere in detail, maybe the docstring of this method would be a good place
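
A docstring along these lines could work (a sketch based on the general algorithm, not the PR's actual text):

    def golden_section_search(self, optimizer):
        """Refine the converged parameter with a golden-section search.

        Once the main search has produced a triple of tested parameters
        (lower, middle, upper) bracketing the optimum, repeatedly propose
        a new parameter at the golden-ratio point (0.618) of the larger
        sub-interval and shrink the bracket around the best-performing
        value, until its width falls below
        tolerance_for_golden_section_search.
        """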

Base automatically changed from refactor_convergence_rules to development August 30, 2024 13:22
@GeorgWa GeorgWa closed this Sep 11, 2024