Commit cec49b9: Tests pass

GilesStrong committed Mar 11, 2021
1 parent aa151f2 commit cec49b9
Showing 10 changed files with 2,327 additions and 2,457 deletions.
6 changes: 3 additions & 3 deletions CHANGES.md
@@ -2,7 +2,7 @@

## Important changes

- Fixed bug in `Model.set_mom` which resulted in momentum never being set (affects e.g. OneCycle and CyclicalMom)
- `Model.fit` now shuffles the fold indices for training folds prior to each epoch, rather than once per training; this removes the periodicity in the training loss that was occasionally apparent.
- Bugs found in `OneCycle`:
- When training multiple models, the initial LR for subsequent models was the end LR of the previous model (list in partial was being mutated)
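The partial-mutation bug above is a general Python pitfall: `functools.partial` stores its keyword arguments by reference, so a callback that edits a list argument in place changes what every later call of that partial receives. A minimal sketch of the failure mode and the fix, using hypothetical `BuggyCallback`/`FixedCallback` classes rather than LUMIN's actual `OneCycle`:

```python
from functools import partial

class BuggyCallback:
    """Keeps a reference to the list it is given; in-place edits leak
    back into the partial and affect every later instantiation."""
    def __init__(self, lr_range):
        self.lr_range = lr_range        # no copy

class FixedCallback:
    """Copies the list, so each model starts from the original LR range."""
    def __init__(self, lr_range):
        self.lr_range = list(lr_range)  # defensive copy

buggy = partial(BuggyCallback, lr_range=[1e-4, 1e-2])
cb1 = buggy()
cb1.lr_range[0] = cb1.lr_range[1]       # a OneCycle run ends at the top LR
cb2 = buggy()
print(cb2.lr_range[0])                  # 0.01: model 2 inherits model 1's end LR

fixed = partial(FixedCallback, lr_range=[1e-4, 1e-2])
cb1 = fixed()
cb1.lr_range[0] = cb1.lr_range[1]
cb2 = fixed()
print(cb2.lr_range[0])                  # 0.0001: each model gets a fresh range
```

Copying mutable arguments at construction time is the usual defence when the same partial is reused to build callbacks for several models.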
@@ -15,7 +15,7 @@

- Mish activation function
- `Model.fit_params.val_requires_grad` to control whether to compute the validation epoch with gradient tracking; default `False`, but some losses might require it in the future
-- `ParameterisedPrediction` now stores copies of values for parameterised features in case they change, or need to be changed locally during prediction.
+- `ParameterisedPrediction` now stores copies of values for parametrised features in case they change, or need to be changed locally during prediction.
- `freeze_layers` and `unfreeze_layers` methods for `Model`
- `PivotTraining` callback implementing Learning to Pivot [Louppe, Kagan, & Cranmer, 2016](https://papers.nips.cc/paper/2017/hash/48ab2f9b45957ab574cf005eb8a76760-Abstract.html)
- New example reimplementing paper's jets example
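The `freeze_layers`/`unfreeze_layers` idea can be sketched without any deep-learning framework: freezing a layer just means flagging its parameters so the optimiser skips them (in PyTorch this would be setting `requires_grad = False` on the layer's parameters). The `Layer` class and helper signatures below are illustrative stand-ins, not LUMIN's actual API:

```python
class Layer:
    """Toy stand-in for a network layer; a real framework would expose a
    requires_grad flag on each of the layer's parameters instead."""
    def __init__(self, name):
        self.name = name
        self.trainable = True

def freeze_layers(layers, indices):
    # Frozen layers keep their weights fixed, e.g. when fine-tuning
    # only the head of a pretrained network.
    for i in indices:
        layers[i].trainable = False

def unfreeze_layers(layers, indices):
    for i in indices:
        layers[i].trainable = True

net = [Layer(n) for n in ("body_0", "body_1", "head")]
freeze_layers(net, [0, 1])          # train only the head
print([l.trainable for l in net])   # [False, False, True]
unfreeze_layers(net, [0, 1])        # back to training everything
```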
@@ -24,7 +24,7 @@
- `train_models` now has arguments to:
- Exclude specific fold indices from training and validation
- Train models on unique folds, e.g. when training 5 models on a file with 10 folds, each model is trained on its own unique pair of folds
-- Added discussion of core concept in LUMIN to the docs
+- Added discussion of core concepts in LUMIN to the docs
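The unique-fold option above can be sketched as a simple partition of the fold indices, so that no two models share a training fold. This is a hypothetical helper illustrating the behaviour, not `train_models`' actual signature:

```python
def unique_fold_sets(n_folds, n_models, folds_per_model=2):
    """Split fold indices into disjoint sets, one per model, e.g.
    5 models on a 10-fold file -> 5 unique pairs of folds."""
    if n_models * folds_per_model > n_folds:
        raise ValueError("not enough folds for a unique assignment")
    idxs = list(range(n_folds))
    return [idxs[i * folds_per_model:(i + 1) * folds_per_model]
            for i in range(n_models)]

print(unique_fold_sets(10, 5))
# [[0, 1], [2, 3], [4, 5], [6, 7], [8, 9]]
```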

## Removals

6 changes: 3 additions & 3 deletions examples/Advanced_Model_Building.ipynb
@@ -896,7 +896,7 @@
"from lumin.nn.metrics.class_eval import AMS\n",
"\n",
"cb_partials = [partial(OneCycle, lengths=(5, 10), lr_range=[1e-4, 1e-2])]\n",
-"eval_metrics = {'AMS':AMS(n_total=250000, br=10, wgt_name='gen_orig_weight')}"
+"metric_partials = [partial(AMS,n_total=250000, br=10, wgt_name='gen_orig_weight', main_metric=False)]"
]
},
{
@@ -1013,7 +1013,7 @@
" model_builder=model_builder,\n",
" bs=bs,\n",
" cb_partials=cb_partials,\n",
-" eval_metrics=eval_metrics,\n",
+" metric_partials=metric_partials,\n",
" n_epochs=15)"
]
},
@@ -1422,7 +1422,7 @@
" model_builder=model_builder,\n",
" bs=bs,\n",
" cb_partials=cb_partials,\n",
-" eval_metrics=eval_metrics,\n",
+" metric_partials=metric_partials,\n",
" n_epochs=15)"
]
},
987 changes: 486 additions & 501 deletions examples/Binary_Classification_Signal_versus_Background.ipynb

Large diffs are not rendered by default.

800 changes: 396 additions & 404 deletions examples/Multi_Target_Regression_Di-tau_momenta.ipynb


1,015 changes: 469 additions & 546 deletions examples/Multiclass_Classification_Signal_versus_Backgrounds.ipynb


645 changes: 320 additions & 325 deletions examples/RNNs_CNNs_and_GNNs_for_matrix_data.ipynb



0 comments on commit cec49b9
