
Single RF loading at training and dl1->dl2 #985

Merged (11 commits, Jun 28, 2022)
Conversation

@gabemery (Collaborator) commented May 23, 2022

To solve #979 and reduce memory usage during training and application of the random forests.

  • Free memory in build_models()
  • Change apply_models or the script logic so that all RFs do not need to be pre-loaded
  • Finalise
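The first task item, freeing memory once a model has been written to disk, can be sketched as follows. This is a hypothetical helper illustrating the pattern discussed in this PR (the `save_and_free` name and signature are assumptions, not the actual lstchain API; lstchain's own code uses `joblib.dump(reg_energy, file_reg_energy, compress=3)` as quoted below):

```python
import gc

import joblib


def save_and_free(model, path, free_model_memory=True):
    """Dump a trained model to disk and optionally drop the in-memory copy.

    Hypothetical sketch of the free_model_memory behaviour discussed in
    this PR, not the real build_models() implementation.
    """
    joblib.dump(model, path, compress=3)
    if free_model_memory:
        # Drop this function's reference; the caller must not keep its own
        # reference, otherwise the (potentially large) forest stays alive.
        model = None
        gc.collect()
        return None
    return model
```

With `free_model_memory=True` the function returns `None` and the model can be garbage-collected; with `False` it behaves like the pre-PR code and hands the model back to the caller (as the tests still rely on, see the review thread below).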


file_reg_energy = path_models + "/reg_energy.sav"
joblib.dump(reg_energy, file_reg_energy, compress=3)
if free_model_memory:
Member

There is not really a reason to put this behind a flag, is there? The variable is not used after dumping.

Collaborator Author

For some reason the function returns the models (probably useful if they are not saved). I personally don't know if there is any use case. If not, we could remove save_models and free_model_memory.

Collaborator Author

It is at least used in the tests currently.

codecov bot commented May 23, 2022

Codecov Report

Merging #985 (d825e4f) into master (fdf7423) will increase coverage by 0.18%.
The diff coverage is 78.84%.

@@            Coverage Diff             @@
##           master     #985      +/-   ##
==========================================
+ Coverage   85.69%   85.88%   +0.18%     
==========================================
  Files          78       78              
  Lines        6572     6645      +73     
==========================================
+ Hits         5632     5707      +75     
+ Misses        940      938       -2     
| Impacted Files | Coverage Δ |
|---|---|
| lstchain/reco/dl1_to_dl2.py | 76.96% <77.08%> (+3.44%) ⬆️ |
| lstchain/tests/test_lstchain.py | 97.00% <100.00%> (+0.03%) ⬆️ |
| lstchain/io/config.py | 94.87% <0.00%> (-5.13%) ⬇️ |
| lstchain/paths.py | 87.50% <0.00%> (-0.97%) ⬇️ |
| lstchain/tools/lstchain_create_dl3_index_files.py | 76.92% <0.00%> (-0.86%) ⬇️ |
| lstchain/io/tests/test_config.py | 100.00% <0.00%> (ø) |
| lstchain/high_level/tests/test_hdus.py | 100.00% <0.00%> (ø) |
| lstchain/high_level/hdu_table.py | 93.65% <0.00%> (+0.06%) ⬆️ |
| lstchain/tools/lstchain_create_dl3_file.py | 92.66% <0.00%> (+0.06%) ⬆️ |

Continue to review the full report at Codecov.

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update fdf7423...d825e4f.

lstchain/reco/dl1_to_dl2.py — review thread (outdated, resolved)
@gabemery gabemery marked this pull request as ready for review May 25, 2022 13:20
@gabemery gabemery requested a review from vuillaut June 23, 2022 14:52
@gabemery
Collaborator Author

Can this be merged or is some review still needed? @maxnoe @vuillaut

@@ -165,6 +165,7 @@ def test_build_models(simulated_dl1_file, rf_models):
infile,
infile,
save_models=False,
free_model_memory=False,
Member

Maybe also test with True?
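Covering both settings of the new flag could look like the sketch below, using pytest's parametrize. This is a hypothetical illustration of the test pattern, with a stand-in `fake_build_models`, not the actual test in lstchain/tests/test_lstchain.py:

```python
import pytest


def fake_build_models(save_models, free_model_memory):
    """Stand-in for lstchain's build_models, used only to illustrate the
    parametrized test: returns None when models were freed, the models
    otherwise."""
    models = {"reg_energy": object(), "reg_disp": object()}
    return None if free_model_memory else models


@pytest.mark.parametrize("free_model_memory", [True, False])
def test_build_models_memory_flag(free_model_memory):
    models = fake_build_models(save_models=False,
                               free_model_memory=free_model_memory)
    # The flag and the "models were freed" outcome should agree.
    assert (models is None) == free_model_memory
```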

@vuillaut
Member

If you can, just add a test for the new option; otherwise lgtm.

@gabemery
Collaborator Author

Thanks @vuillaut. I am not allowed to merge myself, so it would be nice if anyone with merge rights could do it. Otherwise I will ask on Slack later.

3 participants