
Change calibration-frame naming #1641

Merged 12 commits into develop from cf_naming on Aug 11, 2023
Conversation

@kbwestfall (Collaborator)

... again.

This goes some way to addressing #1640 .

The implemented change compresses runs of sequential calibration IDs into ranges. For example, '0-1-2-3-4' becomes '0:5' and '3-5-6-10-11-12-15-18-19' becomes '3-5:7-10:13-15-18:20'. This introduces colons into the file-naming convention, so please comment if you think this will cause OS-specific problems!
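
The compression described above can be sketched in a few lines. This is a hypothetical illustration, not the actual implementation in pypeit/calibframe.py: runs of consecutive IDs collapse to 'start:stop' with an exclusive stop (as in Python slicing), and groups are joined with '-'.

```python
def compress_ids(ids):
    """Illustrative sketch: collapse consecutive calibration IDs into
    'start:stop' ranges (exclusive stop), joined with '-'."""
    if not ids:
        return ''
    ids = sorted(set(ids))
    groups = []
    start = prev = ids[0]
    for n in ids[1:]:
        if n == prev + 1:
            # Still inside a consecutive run; extend it.
            prev = n
        else:
            # Run broken; close the current group and start a new one.
            groups.append((start, prev))
            start = prev = n
    groups.append((start, prev))
    # Singletons stay bare; runs become 'start:stop' with exclusive stop.
    return '-'.join(str(s) if s == e else f'{s}:{e + 1}' for s, e in groups)
```

With this sketch, `compress_ids([0, 1, 2, 3, 4])` yields `'0:5'` and `compress_ids([3, 5, 6, 10, 11, 12, 15, 18, 19])` yields `'3-5:7-10:13-15-18:20'`, matching the examples above.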

I've directed this to develop, but I can merge in #1604 and redirect there if that makes more sense. Note that most of the doc updates repeat what's done in #1604. Edits for this PR are mostly confined to calibframe.py and the unit tests.

I've done some minimal testing, but will run the dev-suite.

@codecov-commenter commented Jul 26, 2023

Codecov Report

Merging #1641 (9c61063) into develop (dd9fc7a) will increase coverage by 0.02%.
The diff coverage is 91.30%.


@@             Coverage Diff             @@
##           develop    #1641      +/-   ##
===========================================
+ Coverage    40.98%   41.01%   +0.02%     
===========================================
  Files          189      189              
  Lines        43056    43076      +20     
===========================================
+ Hits         17646    17666      +20     
  Misses       25410    25410              
Files Changed           Coverage Δ
pypeit/calibframe.py    86.50% <91.30%> (+1.88%) ⬆️

@kbwestfall (Collaborator, Author)

All the QL tests failed in the dev-suite. I fixed the bug, added a test to the dev-suite (see pypeit/PypeIt-development-suite#277), and I've restarted the dev-suite.

@kbwestfall (Collaborator, Author)

Test Summary
--------------------------------------------------------
--- PYTEST PYPEIT UNIT TESTS PASSED  234 passed, 39 warnings in 202.34s (0:03:22) ---
--- PYTEST UNIT TESTS PASSED  123 passed, 190 warnings in 1011.64s (0:16:51) ---
--- PYTEST VET TESTS PASSED  44 passed, 9 warnings in 559.53s (0:09:19) ---
--- PYPEIT DEVELOPMENT SUITE PASSED 195/195 TESTS  ---
Testing Started at 2023-07-27T17:11:31.743640
Testing Completed at 2023-07-28T01:19:57.390633
Total Time: 8:08:25.646993

@profxj requested a review from freddavies, July 31, 2023 13:24
@freddavies (Collaborator) left a comment

Looks good to me, and solves #1640 (in cases where all would not solve it otherwise). Nice!

My only lingering comment would be that, now that the notation no longer uses the : -- which would otherwise directly correspond to slicing an array -- I'm not sure the second number should still fall outside the list of calibration IDs contained in the file. That is, if you had two groups with 0,1,2,3,4 and 5,6,7,8,9, the corresponding frame names would have 0+5 and 5+10; someone might be confused about which one actually contains calib ID = 5 without digging into the documentation to find out that the + acts like a :. But that's being pretty nit-picky, I think, so I'm fine with it.

@kbwestfall (Collaborator, Author)

> Looks good to me, and solves #1640 (in cases where all would not solve it otherwise). Nice!
>
> My only lingering comment would be that, now that the notation is no longer using the : -- which would otherwise directly correspond to slicing an array -- I'm not sure that it should still have the second number be outside of the list of calibration IDs contained in the file. That is, if you had two groups with 0,1,2,3,4 and 5,6,7,8,9, the corresponding frame names would have 0+5 and 5+10; someone might be confused which one actually contains calib ID = 5 without digging into the documentation to find out that the + acts like a :. But that's being pretty nit-picky, I think, so I'm fine with it.

This is a good point! I wasn't really crazy about the exclusive range anyway. I've changed the ranges so they're now inclusive. So the result for your example should be 0+4 and 5+9. Thanks for thinking this through!
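
Under the inclusive convention, the grouping logic is unchanged; only the formatting of the endpoints differs. A hypothetical sketch (again, not the actual code in calibframe.py) of the revised behavior:

```python
def compress_ids_inclusive(ids):
    """Illustrative sketch of the revised convention: consecutive
    calibration IDs collapse to 'first+last' with an INCLUSIVE last,
    groups joined with '-'."""
    if not ids:
        return ''
    ids = sorted(set(ids))
    groups = []
    start = prev = ids[0]
    for n in ids[1:]:
        if n == prev + 1:
            # Extend the current consecutive run.
            prev = n
        else:
            # Close the run and start a new one.
            groups.append((start, prev))
            start = prev = n
    groups.append((start, prev))
    # 'first+last' is inclusive: 0,1,2,3,4 -> '0+4', not '0+5'.
    return '-'.join(str(s) if s == e else f'{s}+{e}' for s, e in groups)
```

With this sketch, the two groups in the example above give `'0+4'` and `'5+9'`, so the second number is always a calib ID actually contained in the file.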

@profxj (Collaborator) left a comment

approving; might as well run tests...

(review thread on pypeit/calibframe.py, resolved)
@kbwestfall (Collaborator, Author)

Got some errors:

Test Summary
--------------------------------------------------------
--- PYTEST PYPEIT UNIT TESTS PASSED  234 passed, 73 warnings in 374.54s (0:06:14) ---
--- PYTEST UNIT TESTS FAILED  5 failed, 118 passed, 191 warnings in 792.92s (0:13:12) ---
--- PYTEST VET TESTS PASSED  47 passed, 30 warnings in 675.19s (0:11:15) ---
--- PYPEIT DEVELOPMENT SUITE FAILED 1/205 TESTS  ---
Failed tests:
    gemini_gmos/GS_HAM_R400_860 pypeit_sensfunc
Skipped tests:
    gemini_gmos/GS_HAM_R400_860 pypeit_flux_setup
    gemini_gmos/GS_HAM_R400_860 pypeit_flux
Testing Started at 2023-08-09T22:09:58.790372
Testing Completed at 2023-08-10T17:59:27.676496
Total Time: 19:49:28.886124

I expect the unit tests will pass once the corrections for the removal of the b2.fits.gz file are merged. I'll look into the gemini_gmos errors.

@kbwestfall (Collaborator, Author)

Test failures fixed by pypeit/PypeIt-development-suite#281

% ./pypeit_test afterburn -i gemini_gmos -s GS_HAM_R400_860
Running afterburner tests
Running tests on the following instruments:
    gemini_gmos

Reducing data from gemini_gmos for the following setups:
    GS_HAM_R400_860

 0 passed/ 0 failed/ 0 skipped STARTED gemini_gmos/GS_HAM_R400_860 pypeit_sensfunc
 1 passed/ 0 failed/ 0 skipped PASSED  gemini_gmos/GS_HAM_R400_860 pypeit_sensfunc
 1 passed/ 0 failed/ 0 skipped STARTED gemini_gmos/GS_HAM_R400_860 pypeit_flux_setup
 2 passed/ 0 failed/ 0 skipped PASSED  gemini_gmos/GS_HAM_R400_860 pypeit_flux_setup
 2 passed/ 0 failed/ 0 skipped STARTED gemini_gmos/GS_HAM_R400_860 pypeit_flux
 3 passed/ 0 failed/ 0 skipped PASSED  gemini_gmos/GS_HAM_R400_860 pypeit_flux
--- PYPEIT DEVELOPMENT SUITE PASSED 3/3 TESTS  ---

Test Summary
--------------------------------------------------------
--- PYPEIT DEVELOPMENT SUITE PASSED 3/3 TESTS  ---
Testing Started at 2023-08-10T14:08:02.352383
Testing Completed at 2023-08-10T14:10:20.366619
Total Time: 0:02:18.014236

@kbwestfall (Collaborator, Author)

Only the unit tests fail, which are fixed in a different dev-suite PR.

Test Summary
--------------------------------------------------------
--- PYTEST PYPEIT UNIT TESTS PASSED  234 passed, 73 warnings in 494.92s (0:08:14) ---
--- PYTEST UNIT TESTS FAILED  5 failed, 118 passed, 191 warnings in 950.41s (0:15:50) ---
--- PYTEST VET TESTS PASSED  47 passed, 30 warnings in 655.84s (0:10:55) ---
--- PYPEIT DEVELOPMENT SUITE PASSED 207/207 TESTS  ---
Testing Started at 2023-08-10T21:58:06.439644
Testing Completed at 2023-08-11T06:35:35.137632
Total Time: 8:37:28.697988

@kbwestfall merged commit 5980c0d into develop Aug 11, 2023
25 checks passed
@kbwestfall deleted the cf_naming branch August 11, 2023 14:51