From 29766de054fdbf74eaf3e20a0df0005a928744c1 Mon Sep 17 00:00:00 2001 From: Jessica Scheick Date: Fri, 21 Jun 2024 10:47:20 -0400 Subject: [PATCH 01/11] fix bib entry (#529) --- doc/source/tracking/icepyx_pubs.bib | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/doc/source/tracking/icepyx_pubs.bib b/doc/source/tracking/icepyx_pubs.bib index 47c024273..7456352b0 100644 --- a/doc/source/tracking/icepyx_pubs.bib +++ b/doc/source/tracking/icepyx_pubs.bib @@ -21,7 +21,7 @@ @misc{quest2023agu @misc{selper2023, -author = {Tilling, Rachel and +author = {Wegener, Rachel and Lopez, Luis and Steiker, Amy and Scheick, Jessica From 2c924b0a6f6e9082f3a000cb3ca8d8ef6d7e82de Mon Sep 17 00:00:00 2001 From: Jessica Scheick Date: Thu, 18 Jul 2024 16:22:52 -0400 Subject: [PATCH 02/11] [docs] update is2 resources (#535) --- doc/source/community/resources.rst | 31 +++++++---- .../community/resources/2020_IS2_HW.rst | 22 ++++---- .../community/resources/2022_IS2_HW.rst | 52 ++++++++++++++---- .../community/resources/2023_IS2_HW.rst | 54 +++++++++++++++++++ .../community/resources/IS2_software.rst | 25 +++++---- 5 files changed, 142 insertions(+), 42 deletions(-) create mode 100644 doc/source/community/resources/2023_IS2_HW.rst diff --git a/doc/source/community/resources.rst b/doc/source/community/resources.rst index 8d60fdee7..2f4701456 100644 --- a/doc/source/community/resources.rst +++ b/doc/source/community/resources.rst @@ -15,23 +15,34 @@ Other Ways to Access ICESat-2 Data ---------------------------------- icepyx aims to provide intuitive, object-based methods for finding, obtaining, visualizing, and analyzing ICESat-2 data as part of an open, reproducible workflow that leverages existing tools wherever possible (see :ref:`Complementary GitHub Repositories `) -and can be run locally, using high performance computing, or in the cloud using Pangeo. +and can be run locally, using high performance computing, or in the cloud. 
A few other options available for querying, visualizing, and downloading ICESat-2 data files are:
 
+- `earthaccess Python library `_
+
+  - A powerful tool for querying and downloading NASA datasets.
+  - Seamlessly handles authentication and cloud access tokens.
+  - Under active development to expand functionality,
+    including adding icepyx as a plugin to enable subsetting services for ICESat-2 data.
+
 - `NSIDC (DAAC) Data Access `_
 
-  - Select “ICESat-2 Data Sets” from the left hand menu. Choose your dataset (ATL##). Then, use the spatial and temporal filters to narrow your list of granules available for download.
+  - Select “ICESat-2 Data Sets” from the right hand menu.
+    Choose your dataset (ATL##).
+    Then, use the spatial and temporal filters to narrow your list of granules available for download.
 
-- `OpenAltimetry `_
+- `OpenAltimetry `_
 
-  - Collaboration between NSIDC, Scripps, and San Diego Supercomputer Center
-  - A web tool to visualize and download ICESat and ICESat-2 surface heights
-  - Data may be subsetted by data product, reference ground track (RGT), and beam
-  - Currently available ICESat-2 datasets are: ATL06 (land ice height), ATL07 (sea ice height), ATL08 (land/vegetation height), ATL13 (water surface height)
+  - Collaboration between NSIDC, Scripps, and San Diego Supercomputer Center.
+  - A web tool to visualize and download ICESat and ICESat-2 surface heights.
+  - Data may be subsetted by data product, reference ground track (RGT), and beam.
+  - Currently available ICESat-2 datasets are: ATL06 (land ice height), ATL07 (sea ice height),
+    ATL08 (land/vegetation height), ATL10 (sea ice freeboard), ATL12 (ocean surface height), ATL13 (water surface height).
 
 Software Packages for Working with ICESat-2 Data
 ------------------------------------------------
-icepyx is but one of several software packages designed to improve user experience with ICESat-2 data. The links below highlight other packages active in the community.
+icepyx is but one of several software packages designed to improve user experience with ICESat-2 data. +The links below highlight other packages active in the community. .. toctree:: :maxdepth: 2 @@ -40,7 +51,8 @@ icepyx is but one of several software packages designed to improve user experien Resources Developed For and During Hackweeks -------------------------------------------- -Previous hackweeks gave participants the opportunity to develop codes to help download and/or analyze ICESat-2 data. Many of these projects are inactive, but they may provide useful workflows for users to work with. +Hackweeks give participants the opportunity to develop code to help download and/or analyze ICESat-2 data. +Many of these projects become inactive following the event, but they may provide useful workflows for users to work with. .. toctree:: :maxdepth: 2 @@ -48,3 +60,4 @@ Previous hackweeks gave participants the opportunity to develop codes to help do resources/2019_IS2_HW resources/2020_IS2_HW resources/2022_IS2_HW + resources/2023_IS2_HW diff --git a/doc/source/community/resources/2020_IS2_HW.rst b/doc/source/community/resources/2020_IS2_HW.rst index ee447b61a..de7d50218 100644 --- a/doc/source/community/resources/2020_IS2_HW.rst +++ b/doc/source/community/resources/2020_IS2_HW.rst @@ -31,17 +31,17 @@ The published tutorial repo also includes links to presentation slides and video Tutorial Topics: 1. Introductory Session -1. ICESat-2 Mission: Satellite, Sensor, and Data -1. Git and GitHub -1. Jupyter and iPython -1. Geospatial Analysis with Python -1. Introduction to ICESat-2 Sea Ice and Land Ice Products and Data Access -1. Programmatic ICESat-2 data access -1. Introduction to HDF5 and ICESat-2 data files -1. Land ice applications -1. Sea ice applications -1. Science data generation -1. Machine learning +2. ICESat-2 Mission: Satellite, Sensor, and Data +3. Git and GitHub +4. Jupyter and iPython +5. Geospatial Analysis with Python +6. 
Introduction to ICESat-2 Sea Ice and Land Ice Products and Data Access +7. Programmatic ICESat-2 data access +8. Introduction to HDF5 and ICESat-2 data files +9. Land ice applications +10. Sea ice applications +11. Science data generation +12. Machine learning Projects ^^^^^^^^ diff --git a/doc/source/community/resources/2022_IS2_HW.rst b/doc/source/community/resources/2022_IS2_HW.rst index f27b9b75e..45911b627 100644 --- a/doc/source/community/resources/2022_IS2_HW.rst +++ b/doc/source/community/resources/2022_IS2_HW.rst @@ -1,17 +1,51 @@ .. _resource_IS2HW_2022: -Third ICESat-2 Hackweek Facilitated by the University of Washington eScience Institute --------------------------------------------------------------------------------------- -This event is currently being planned, and will take place 21-25 March 2022. -Please see the `event website `_ for more information. +Third [Virtual] ICESat-2 Cloud Computing Hackweek Facilitated by the University of Washington eScience Institute +---------------------------------------------------------------------------------------------------------------- +This March 2022 event produced a series of `tutorials `_, +developed by volunteer instructors and presented during the event. +A key focus of the event was on transitioning to cloud-based environments and +building skills around accessing ICESat-2 data in the cloud. +During the Hackweek, teams of attendees worked on +`projects `_ +related to their domain interests and learning goals. +.. |Zenodo badge| image:: https://zenodo.org/badge/DOI/10.5281/zenodo.6462479.svg + :target: https://doi.org/10.5281/zenodo.6462479 -Tutorials -^^^^^^^^^ -The tutorials for this event will be available to download for interactive use as well as pre-rendered on the event's website. -The website will be linked here once it is live, and final information posted here after the event. 
+Tutorials |Zenodo badge| +^^^^^^^^^^^^^^^^^^^^^^^^ +The tutorials from this event live within a dedicated `GitHub repository `_ +and are published on `Zenodo `_. +The published tutorial repo also includes links to presentation materials and videos of the recorded presentations. +Tutorial Topics: + +1. Open Science Tools +2. About ICESat-2 Data +3. ICESat-2 Data Access +4. Object-oriented Programming +5. Geospatial Transforms +6. Data Integration (Part I) +7. Data Visualization 2D +8. Cloud Computing Tools +9. Data Integration (Part II) +10. Data Integration (Part III) Projects ^^^^^^^^ -Projects will be listed here after the event. +These `project repositories `_ +often provide useful starting points to develop effective disciplinary +workflows. +Links to project GitHub repositories and presentation videos are at the above link. + +Project Names: + +- Polynyas +- Icebergs +- Coastal +- Sea ice floes +- Antarctic Ripples +- Snow DEM +- Ice discharge +- Strong Beams diff --git a/doc/source/community/resources/2023_IS2_HW.rst b/doc/source/community/resources/2023_IS2_HW.rst new file mode 100644 index 000000000..ed512d80a --- /dev/null +++ b/doc/source/community/resources/2023_IS2_HW.rst @@ -0,0 +1,54 @@ +.. _resource_IS2HW_2023: + +Fourth ICESat-2 Cloud Computing Hackweek at the University of Washington eScience Institute +------------------------------------------------------------------------------------------- +This August 2023 event produced a series of `tutorials `_, +developed by volunteer instructors and presented during the event. +A continuing focus of the event was considerations for cloud-native workflows +(such as cloud-optimized data) and +building skills around accessing ICESat-2 data in the cloud. +During the Hackweek, teams of attendees worked on +`projects `_ +related to their domain interests and learning goals. + +.. 
|Zenodo badge| image:: https://zenodo.org/badge/DOI/10.5281/zenodo.10519966.svg
+   :target: https://doi.org/10.5281/zenodo.10519966
+
+Tutorials |Zenodo badge|
+^^^^^^^^^^^^^^^^^^^^^^^^
+The tutorials from this event live within a dedicated `Jupyter Book `_
+and are published on `Zenodo `_.
+You can run the tutorials by following the instructions
+`here `_.
+The published tutorial repo also includes links to presentation slides and videos of the recorded presentations.
+
+Tutorial Topics:
+
+1. CryoCloud
+2. ICESat-2 Mission Overview and Data
+3. Open source development/Object Oriented Programming
+4. Cloud data access and data formats
+5. Best Practices for Collaborative development
+6. Application: snow depth
+7. Application: bathymetry
+8. Application: Visualization and Integration
+9. Application: inland hydrology
+10. Application: sea ice
+11. Application: ICESat-2 integration with GrIMP data
+
+Projects
+^^^^^^^^
+In an effort to improve learning outcomes, we asked participants
+to summarize their accomplishments in a structured
+`project readme `_
+(some of which remain drafts) rather than recording the final presentations.
+
+Project titles:
+
+- 3D Lakes
+- surfit
+- dzdt
+- grounding zones
+- h5cloud
+- Everything Anywhere All At Once
+- seabath
diff --git a/doc/source/community/resources/IS2_software.rst b/doc/source/community/resources/IS2_software.rst
index fc0edaf56..d7c75dae6 100644
--- a/doc/source/community/resources/IS2_software.rst
+++ b/doc/source/community/resources/IS2_software.rst
@@ -1,23 +1,17 @@
 Open-Source Packages
 --------------------
 ICESat-2 can be tricky to process for the first time, especially if working with the ATL03 data. Software packages have been developed to make ICESat-2 data analysis easier for new and experienced users.
-Here, we highlight some commonly-used software packages developed by the science community. Most of these can be used alongside Icepyx to facilitate ICESat-2 data processing.
+Here, we highlight some commonly-used software packages developed by the science community. Most of these can be used alongside icepyx to facilitate ICESat-2 data processing.
 Most of these packages are callable through Python, though others may require access to other software. Keep this in mind before attempting to use any package or plugin.
 
-* `SlideRule `_
+* `SlideRule `_
 
   - collaboration between the ICESat-2 science team and University of Washington
   - A Python client to process ICESat-2 ATL03 data prior to download.
-  - Aggregates ATL03 data into line segments of user-defined length, creating a customized form of the ATL06 product.
+  - Creates customized versions of ATL06 (land ice), ATL08 (vegetation), and ATL24 (bathymetry) products.
+    Ideal for situations where a given algorithm is not run or is too coarse for a particular application.
   - Data may also be subsetted based on spatial bounds and photon classification.
 
-* `IceFlow `_
-
-  - by National Snow and Ice Data Center (NSIDC)
-  - A Python library designed to simplify the co-registration of cryospheric datasets.
-  - Matches georeferencing parameters across data sets, allowing a user to derive a time series across multiple datasets for a given region.
-  - Currently valid datasets include ICESat, ICESat-2, and Operation IceBridge.
-
 * `PhoREAL `_
 
   - by Applied Research Laboratories, University of Texas at Austin
@@ -34,10 +28,10 @@ Here we describe a selection of publicly available Python code posted on GitHub
 This includes repositories that are more broadly designed for working with LiDAR/point cloud datasets in general.
 These repositories represent independent but complementary projects that we hope to make easily interoperable within icepyx in order to maximize capabilities and minimize duplication of efforts.
 Conversations about how to best accomplish this have been ongoing since the conception of icepyx, and we welcome everyone to join the conversation (please see our :ref:`contact page`).
+Some of these repositories may not be actively maintained. *Note: This list is a compilation of publicly available GitHub repositories and includes some annotations to reflect how they relate to icepyx. -Please check each repository's licensing information before using or modifying their code. -Additional resources having to do specifically with obtaining ICESat-2 data are noted in the last section of this document.* +Please check each repository's licensing information before using or modifying their code.* * `captoolkit `_ @@ -59,7 +53,12 @@ Additional resources having to do specifically with obtaining ICESat-2 data are - Retrieve IceBridge, ICESat, and ICESat-2 data using the NSIDC subsetter API - Command line tool - Download data and convert it into a georeferenced format (e.g. geojson, kml, or shapefile) - - We envision use of Nsidc-subsetter to improve interoperability between icepyx and the NSIDC subsetter API. Currently, icepyx has very limited subsetting capabilities that are not easy to access or find more information about. + +* `read-ICESat-2 `_ + + - by Tyler Sutterley + - Read selected ICESat-2 products into memory. 
+ * `pointCollection `_ From b4582df4fd61aacef71dada35ed8b212582e7cf2 Mon Sep 17 00:00:00 2001 From: Jessica Scheick Date: Wed, 24 Jul 2024 12:59:13 -0400 Subject: [PATCH 03/11] update docstring tests for numpy 2.0 (#537) --- icepyx/core/is2ref.py | 4 ++++ icepyx/core/query.py | 2 +- icepyx/core/spatial.py | 2 +- 3 files changed, 6 insertions(+), 2 deletions(-) diff --git a/icepyx/core/is2ref.py b/icepyx/core/is2ref.py index 361221d6a..86888547b 100644 --- a/icepyx/core/is2ref.py +++ b/icepyx/core/is2ref.py @@ -271,6 +271,10 @@ def _default_varlists(product): return common_list +# Currently this function is used one-off, but if it needs to be done for a series of values, +# a faster version using pandas map (instead of apply) is available in SlideRule: +# https://github.com/SlideRuleEarth/sliderule/issues/388 +# https://github.com/SlideRuleEarth/sliderule/commit/46cceac0e5f6d0a580933d399a6239bc911757f3 def gt2spot(gt, sc_orient): warnings.warn( "icepyx versions 0.8.0 and earlier used an incorrect spot number calculation." 
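Patch 03 above updates doctest outputs for NumPy 2.0. The underlying cause is NEP 51: as of NumPy 2.0, the repr of a NumPy scalar spells out its type, so doctests that echo dictionaries or other repr-style output containing NumPy scalars no longer match under the new version. A minimal sketch of the behavior change (the value mirrors the granule-size doctest; the printed forms depend on your installed NumPy version):

```python
import numpy as np

# NEP 51 (adopted in NumPy 2.0) made scalar reprs explicit about their type:
# repr(np.float64(55.17)) is now "np.float64(55.17)" rather than "55.17".
size = np.float64(55.166646003723145)
print(repr(size))   # numpy >= 2.0: np.float64(55.166646003723145); numpy 1.x: 55.166646003723145
print(float(size))  # casting to a builtin float always gives the pre-2.0 look
```

Note that `str()` output is unchanged by NEP 51, so only doctests built on repr-style displays (such as dictionaries of values) needed touching.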
diff --git a/icepyx/core/query.py b/icepyx/core/query.py index 46a306dd2..dce3c1c34 100644 --- a/icepyx/core/query.py +++ b/icepyx/core/query.py @@ -914,7 +914,7 @@ def avail_granules(self, ids=False, cycles=False, tracks=False, cloud=False): >>> reg_a = ipx.Query('ATL06',[-55, 68, -48, 71],['2019-02-20','2019-02-28']) >>> reg_a.avail_granules() {'Number of available granules': 4, - 'Average size of granules (MB)': 55.166646003723145, + 'Average size of granules (MB)': np.float64(55.166646003723145), 'Total size of all granules (MB)': 220.66658401489258} >>> reg_a = ipx.Query('ATL06',[-55, 68, -48, 71],['2019-02-20','2019-02-23']) diff --git a/icepyx/core/spatial.py b/icepyx/core/spatial.py index c34e928ed..721c949ab 100644 --- a/icepyx/core/spatial.py +++ b/icepyx/core/spatial.py @@ -49,7 +49,7 @@ def geodataframe(extent_type, spatial_extent, file=False, xdateline=None): >>> reg_a = ipx.Query('ATL06',[-55, 68, -48, 71],['2019-02-20','2019-02-28']) >>> gdf = geodataframe(reg_a.spatial.extent_type, reg_a.spatial.extent) >>> gdf.geometry - 0 POLYGON ((-48.00000 68.00000, -48.00000 71.000... + 0 POLYGON ((-48 68, -48 71, -55 71, -55 68, -48 ... 
Name: geometry, dtype: geometry
         """
 
From c81c14ac5642569d2545450efae709084961e6ab Mon Sep 17 00:00:00 2001
From: Jessica Scheick
Date: Wed, 24 Jul 2024 15:36:00 -0400
Subject: [PATCH 04/11] Add Zenodo badge and update all-contributors badge
 (#536)

---
 README.rst           | 10 ++++++++--
 doc/source/index.rst |  8 ++++++--
 2 files changed, 14 insertions(+), 4 deletions(-)

diff --git a/README.rst b/README.rst
index c0e4b6999..5240a1113 100644
--- a/README.rst
+++ b/README.rst
@@ -3,7 +3,9 @@ icepyx
 
 **Python tools for obtaining and working with ICESat-2 data**
 
-|GitHub license| |Conda install| |Pypi install| |Contributors| |JOSS|
+|Contributors| |GitHub license| |Conda install| |Pypi install|
+
+|JOSS| |Zenodo-all|
 
 Latest release (main branch): |Docs Status main| |Travis main Build Status| |Code Coverage main|
 
@@ -18,7 +20,7 @@ Current development version (development branch): |Docs Status dev| |Travis dev
 .. |Pypi install| image:: https://badge.fury.io/py/icepyx.svg
    :target: https://pypi.org/project/icepyx
 
-.. |Contributors| image:: https://img.shields.io/badge/all_contributors-40-orange.svg?style=flat-square
+.. |Contributors| image:: https://img.shields.io/github/all-contributors/icesat2py/icepyx?color=ee8449&style=flat-square
    :alt: All Contributors
    :target: https://github.com/icesat2py/icepyx/blob/main/CONTRIBUTORS.rst
 
@@ -26,6 +28,10 @@ Current development version (development branch): |Docs Status dev| |Travis dev
    :alt: JOSS publication link and DOI
    :target: https://doi.org/10.21105/joss.04912
 
+.. |Zenodo-all| image:: https://zenodo.org/badge/DOI/10.5281/zenodo.7729175.svg
+   :alt: Zenodo all-versions DOI for icepyx
+   :target: https://doi.org/10.5281/zenodo.7729175
+
..
|Docs Status main| image:: https://readthedocs.org/projects/icepyx/badge/?version=latest :target: http://icepyx.readthedocs.io/?badge=latest diff --git a/doc/source/index.rst b/doc/source/index.rst index 26f398605..e73818942 100644 --- a/doc/source/index.rst +++ b/doc/source/index.rst @@ -5,9 +5,13 @@ :alt: JOSS publication link and DOI :target: https://doi.org/10.21105/joss.04912 +.. |Zenodo-all| image:: https://zenodo.org/badge/DOI/10.5281/zenodo.7729175.svg + :alt: Zenodo all-versions DOI for icepyx + :target: https://doi.org/10.5281/zenodo.7729175 -icepyx |version badge| |JOSS| -================================== + +icepyx |version badge| |JOSS| |Zenodo-all| +=============================================== **Python tools for obtaining and working with ICESat-2 data** From 0d1e02edfb86e42ddd52e819e9aabfbce8805e28 Mon Sep 17 00:00:00 2001 From: "pre-commit-ci[bot]" <66853113+pre-commit-ci[bot]@users.noreply.github.com> Date: Tue, 6 Aug 2024 19:32:02 -0600 Subject: [PATCH 05/11] [pre-commit.ci] pre-commit autoupdate (#538) Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com> Co-authored-by: GitHub Action --- .pre-commit-config.yaml | 2 +- .../documentation/classes_dev_uml.svg | 525 +++++++++--------- .../documentation/classes_user_uml.svg | 333 +++++------ 3 files changed, 437 insertions(+), 423 deletions(-) diff --git a/.pre-commit-config.yaml b/.pre-commit-config.yaml index c7c0d675a..19e67d75b 100644 --- a/.pre-commit-config.yaml +++ b/.pre-commit-config.yaml @@ -1,6 +1,6 @@ repos: - repo: https://github.com/psf/black - rev: 24.4.2 + rev: 24.8.0 hooks: - id: black diff --git a/doc/source/user_guide/documentation/classes_dev_uml.svg b/doc/source/user_guide/documentation/classes_dev_uml.svg index d2ca62b29..f59739433 100644 --- a/doc/source/user_guide/documentation/classes_dev_uml.svg +++ b/doc/source/user_guide/documentation/classes_dev_uml.svg @@ -4,11 +4,11 @@ - + classes_dev_uml - + icepyx.quest.dataset_scripts.argo.Argo 
@@ -87,390 +87,397 @@ icepyx.core.auth.EarthdataAuthMixin - -EarthdataAuthMixin - -_auth : NoneType -_s3_initial_ts : NoneType, datetime -_s3login_credentials : NoneType -_session : NoneType -auth -s3login_credentials -session - -__init__(auth) -__str__() -earthdata_login(uid, email, s3token): None + +EarthdataAuthMixin + +_auth : NoneType +_s3_initial_ts : NoneType, datetime +_s3login_credentials : NoneType +_session : NoneType +auth +s3login_credentials +session + +__init__(auth) +__str__() +earthdata_login(uid, email, s3token): None icepyx.core.query.GenQuery - -GenQuery - -_spatial -_temporal -dates -end_time -spatial -spatial_extent -start_time -temporal - -__init__(spatial_extent, date_range, start_time, end_time) -__str__() + +GenQuery + +_spatial +_temporal +dates +end_time +spatial +spatial_extent +start_time +temporal + +__init__(spatial_extent, date_range, start_time, end_time) +__str__() icepyx.core.granules.Granules - -Granules - -avail : list -orderIDs : list - -__init__() -download(verbose, path, restart) -get_avail(CMRparams, reqparams, cloud) -place_order(CMRparams, reqparams, subsetparams, verbose, subset, geom_filepath) + +Granules + +avail : list +orderIDs : list + +__init__() +download(verbose, path, restart) +get_avail(CMRparams, reqparams, cloud) +place_order(CMRparams, reqparams, subsetparams, verbose, subset, geom_filepath) icepyx.core.granules.Granules->icepyx.core.auth.EarthdataAuthMixin - - + + icepyx.core.query.Query - -Query - -CMRparams -_CMRparams -_about_product -_cust_options : dict -_cycles : list -_granules -_order_vars -_prod : NoneType, str -_readable_granule_name : list -_reqparams -_subsetparams : NoneType -_tracks : list -_version -cycles -dataset -granules -order_vars -product -product_version -reqparams -tracks - -__init__(product, spatial_extent, date_range, start_time, end_time, version, cycles, tracks, auth) -__str__() -avail_granules(ids, cycles, tracks, cloud) -download_granules(path, verbose, subset, restart) 
-latest_version() -order_granules(verbose, subset, email) -product_all_info() -product_summary_info() -show_custom_options(dictview) -subsetparams() -visualize_elevation() -visualize_spatial_extent() + +Query + +CMRparams +_CMRparams +_about_product +_cust_options : dict +_cycles : list +_granules +_order_vars +_prod : NoneType, str +_readable_granule_name : list +_reqparams +_subsetparams : NoneType +_tracks : list +_version +cycles +dataset +granules +order_vars +product +product_version +reqparams +tracks + +__init__(product, spatial_extent, date_range, start_time, end_time, version, cycles, tracks, auth) +__str__() +avail_granules(ids, cycles, tracks, cloud) +download_granules(path, verbose, subset, restart) +latest_version() +order_granules(verbose, subset, email) +product_all_info() +product_summary_info() +show_custom_options(dictview) +subsetparams() +visualize_elevation() +visualize_spatial_extent() icepyx.core.granules.Granules->icepyx.core.query.Query - - -_granules + + +_granules icepyx.core.granules.Granules->icepyx.core.query.Query - - -_granules + + +_granules icepyx.core.icesat2data.Icesat2Data - -Icesat2Data - - -__init__() + +Icesat2Data + + +__init__() icepyx.core.exceptions.NsidcQueryError - -NsidcQueryError - -errmsg -msgtxt : str - -__init__(errmsg, msgtxt) -__str__() + +NsidcQueryError + +errmsg +msgtxt : str + +__init__(errmsg, msgtxt) +__str__() icepyx.core.exceptions.QueryError - -QueryError - - - + +QueryError + + + icepyx.core.exceptions.NsidcQueryError->icepyx.core.exceptions.QueryError - - + + icepyx.core.APIformatting.Parameters - -Parameters - -_fmted_keys : NoneType, dict -_poss_keys : dict -_reqtype : NoneType, str -fmted_keys -partype -poss_keys - -__init__(partype, values, reqtype) -_check_valid_keys() -_get_possible_keys() -build_params() -check_req_values() -check_values() + +Parameters + +_fmted_keys : NoneType, dict +_poss_keys : dict +_reqtype : NoneType, str +fmted_keys +partype +poss_keys + +__init__(partype, values, 
reqtype) +_check_valid_keys() +_get_possible_keys() +build_params() +check_req_values() +check_values() icepyx.core.APIformatting.Parameters->icepyx.core.query.Query - - -_CMRparams + + +_CMRparams icepyx.core.APIformatting.Parameters->icepyx.core.query.Query - - -_reqparams + + +_reqparams icepyx.core.APIformatting.Parameters->icepyx.core.query.Query - - -_subsetparams + + +_subsetparams icepyx.core.APIformatting.Parameters->icepyx.core.query.Query - - -_subsetparams + + +_subsetparams icepyx.core.query.Query->icepyx.core.auth.EarthdataAuthMixin - - + + icepyx.core.query.Query->icepyx.core.query.GenQuery - - + + icepyx.quest.quest.Quest - -Quest - -datasets : dict - -__init__(spatial_extent, date_range, start_time, end_time, proj) -__str__() -add_argo(params, presRange): None -add_icesat2(product, start_time, end_time, version, cycles, tracks, files): None -download_all(path) -save_all(path) -search_all() + +Quest + +datasets : dict + +__init__(spatial_extent, date_range, start_time, end_time, proj) +__str__() +add_argo(params, presRange): None +add_icesat2(product, start_time, end_time, version, cycles, tracks, files): None +download_all(path) +save_all(path) +search_all() icepyx.quest.quest.Quest->icepyx.core.query.GenQuery - - + + icepyx.core.read.Read - -Read - -_filelist -_out_obj : Dataset -_product -_read_vars -filelist -is_s3 -product -vars - -__init__(data_source, glob_kwargs, out_obj_type, product, filename_pattern, catalog) -_add_vars_to_ds(is2ds, ds, grp_path, wanted_groups_tiered, wanted_dict) -_build_dataset_template(file) -_build_single_file_dataset(file, groups_list) -_combine_nested_vars(is2ds, ds, grp_path, wanted_dict) -_read_single_grp(file, grp_path) -load() + +Read + +_filelist +_out_obj : Dataset +_product +_read_vars +filelist +is_s3 +product +vars + +__init__(data_source, glob_kwargs, out_obj_type, product, filename_pattern, catalog) +_add_vars_to_ds(is2ds, ds, grp_path, wanted_groups_tiered, wanted_dict) +_build_dataset_template(file) 
+_build_single_file_dataset(file, groups_list) +_combine_nested_vars(is2ds, ds, grp_path, wanted_dict) +_read_single_grp(file, grp_path) +load() icepyx.core.read.Read->icepyx.core.auth.EarthdataAuthMixin - - + + icepyx.core.spatial.Spatial - -Spatial - -_ext_type : str -_gdf_spat : GeoDataFrame -_geom_file : NoneType -_spatial_ext -_xdateln -extent -extent_as_gdf -extent_file -extent_type - -__init__(spatial_extent) -__str__() -fmt_for_CMR() -fmt_for_EGI() + +Spatial + +_ext_type : str +_gdf_spat : GeoDataFrame +_geom_file : NoneType +_spatial_ext +_xdateln +extent +extent_as_gdf +extent_file +extent_type + +__init__(spatial_extent) +__str__() +fmt_for_CMR() +fmt_for_EGI() icepyx.core.spatial.Spatial->icepyx.core.query.GenQuery - - -_spatial + + +_spatial icepyx.core.spatial.Spatial->icepyx.core.query.GenQuery - - -_spatial + + +_spatial icepyx.core.temporal.Temporal - -Temporal - -_end : datetime -_start : datetime -end -start - -__init__(date_range, start_time, end_time) -__str__() + +Temporal + +_end : datetime +_start : datetime +end +start + +__init__(date_range, start_time, end_time) +__str__() icepyx.core.temporal.Temporal->icepyx.core.query.GenQuery - - -_temporal + + +_temporal icepyx.core.variables.Variables - -Variables - -_avail : NoneType, list -_path : NoneType -_product : NoneType, str -_version -path -product -version -wanted : NoneType, dict - -__init__(vartype, path, product, version, avail, wanted, auth) -_check_valid_lists(vgrp, allpaths, var_list, beam_list, keyword_list) -_get_combined_list(beam_list, keyword_list) -_get_sum_varlist(var_list, all_vars, defaults) -_iter_paths(sum_varlist, req_vars, vgrp, beam_list, keyword_list) -_iter_vars(sum_varlist, req_vars, vgrp) -append(defaults, var_list, beam_list, keyword_list) -avail(options, internal) -parse_var_list(varlist, tiered, tiered_vars) -remove(all, var_list, beam_list, keyword_list) + +Variables + +_avail : NoneType, list +_path : NoneType +_product : NoneType, str +_version +path 
+product +version +wanted : NoneType, dict + +__init__(vartype, path, product, version, avail, wanted, auth) +_check_valid_lists(vgrp, allpaths, var_list, beam_list, keyword_list) +_get_combined_list(beam_list, keyword_list) +_get_sum_varlist(var_list, all_vars, defaults) +_iter_paths(sum_varlist, req_vars, vgrp, beam_list, keyword_list) +_iter_vars(sum_varlist, req_vars, vgrp) +append(defaults, var_list, beam_list, keyword_list) +avail(options, internal) +parse_var_list(varlist, tiered, tiered_vars) +remove(all, var_list, beam_list, keyword_list) icepyx.core.variables.Variables->icepyx.core.auth.EarthdataAuthMixin - - + + icepyx.core.variables.Variables->icepyx.core.query.Query - - -_order_vars + + +_order_vars icepyx.core.variables.Variables->icepyx.core.query.Query - - -_order_vars + + +_order_vars icepyx.core.variables.Variables->icepyx.core.read.Read - - -_read_vars + + +_read_vars + + + +icepyx.core.variables.Variables->icepyx.core.read.Read + + +_read_vars icepyx.core.visualization.Visualize - -Visualize - -bbox : list -cycles : NoneType -date_range : NoneType -product : NoneType, str -tracks : NoneType - -__init__(query_obj, product, spatial_extent, date_range, cycles, tracks) -generate_OA_parameters(): list -grid_bbox(binsize): list -make_request(base_url, payload) -parallel_request_OA(): da.array -query_icesat2_filelist(): tuple -request_OA_data(paras): da.array -viz_elevation(): (hv.DynamicMap, hv.Layout) + +Visualize + +bbox : list +cycles : NoneType +date_range : NoneType +product : NoneType, str +tracks : NoneType + +__init__(query_obj, product, spatial_extent, date_range, cycles, tracks) +generate_OA_parameters(): list +grid_bbox(binsize): list +make_request(base_url, payload) +parallel_request_OA(): da.array +query_icesat2_filelist(): tuple +request_OA_data(paras): da.array +viz_elevation(): (hv.DynamicMap, hv.Layout) diff --git a/doc/source/user_guide/documentation/classes_user_uml.svg b/doc/source/user_guide/documentation/classes_user_uml.svg 
index 6ecef9681..d5c17c066 100644 --- a/doc/source/user_guide/documentation/classes_user_uml.svg +++ b/doc/source/user_guide/documentation/classes_user_uml.svg @@ -4,11 +4,11 @@ - + classes_user_uml - + icepyx.core.auth.AuthenticationError @@ -42,231 +42,231 @@ icepyx.core.query.GenQuery - -GenQuery - -dates -end_time -spatial -spatial_extent -start_time -temporal - - + +GenQuery + +dates +end_time +spatial +spatial_extent +start_time +temporal + + icepyx.core.granules.Granules - -Granules - -avail : list -orderIDs : list - -download(verbose, path, restart) -get_avail(CMRparams, reqparams, cloud) -place_order(CMRparams, reqparams, subsetparams, verbose, subset, geom_filepath) + +Granules + +avail : list +orderIDs : list + +download(verbose, path, restart) +get_avail(CMRparams, reqparams, cloud) +place_order(CMRparams, reqparams, subsetparams, verbose, subset, geom_filepath) icepyx.core.granules.Granules->icepyx.core.auth.EarthdataAuthMixin - - + + icepyx.core.query.Query - -Query - -CMRparams -cycles -dataset -granules -order_vars -product -product_version -reqparams -tracks - -avail_granules(ids, cycles, tracks, cloud) -download_granules(path, verbose, subset, restart) -latest_version() -order_granules(verbose, subset, email) -product_all_info() -product_summary_info() -show_custom_options(dictview) -subsetparams() -visualize_elevation() -visualize_spatial_extent() + +Query + +CMRparams +cycles +dataset +granules +order_vars +product +product_version +reqparams +tracks + +avail_granules(ids, cycles, tracks, cloud) +download_granules(path, verbose, subset, restart) +latest_version() +order_granules(verbose, subset, email) +product_all_info() +product_summary_info() +show_custom_options(dictview) +subsetparams() +visualize_elevation() +visualize_spatial_extent() icepyx.core.granules.Granules->icepyx.core.query.Query - - -_granules + + +_granules icepyx.core.granules.Granules->icepyx.core.query.Query - - -_granules + + +_granules icepyx.core.icesat2data.Icesat2Data - 
-Icesat2Data - - - + +Icesat2Data + + + icepyx.core.exceptions.NsidcQueryError - -NsidcQueryError - -errmsg -msgtxt : str - - + +NsidcQueryError + +errmsg +msgtxt : str + + icepyx.core.exceptions.QueryError - -QueryError - - - + +QueryError + + + icepyx.core.exceptions.NsidcQueryError->icepyx.core.exceptions.QueryError - - + + icepyx.core.APIformatting.Parameters - -Parameters - -fmted_keys -partype -poss_keys - -build_params() -check_req_values() -check_values() + +Parameters + +fmted_keys +partype +poss_keys + +build_params() +check_req_values() +check_values() icepyx.core.APIformatting.Parameters->icepyx.core.query.Query - - -_CMRparams + + +_CMRparams icepyx.core.APIformatting.Parameters->icepyx.core.query.Query - - -_reqparams + + +_reqparams icepyx.core.APIformatting.Parameters->icepyx.core.query.Query - - -_subsetparams + + +_subsetparams icepyx.core.APIformatting.Parameters->icepyx.core.query.Query - - -_subsetparams + + +_subsetparams icepyx.core.query.Query->icepyx.core.auth.EarthdataAuthMixin - - + + icepyx.core.query.Query->icepyx.core.query.GenQuery - - + + icepyx.core.read.Read - -Read - -filelist -is_s3 -product -vars - -load() + +Read + +filelist +is_s3 +product +vars + +load() icepyx.core.read.Read->icepyx.core.auth.EarthdataAuthMixin - - + + icepyx.core.spatial.Spatial - -Spatial - -extent -extent_as_gdf -extent_file -extent_type - -fmt_for_CMR() -fmt_for_EGI() + +Spatial + +extent +extent_as_gdf +extent_file +extent_type + +fmt_for_CMR() +fmt_for_EGI() icepyx.core.spatial.Spatial->icepyx.core.query.GenQuery - - -_spatial + + +_spatial icepyx.core.spatial.Spatial->icepyx.core.query.GenQuery - - -_spatial + + +_spatial icepyx.core.temporal.Temporal - -Temporal - -end -start - - + +Temporal + +end +start + + icepyx.core.temporal.Temporal->icepyx.core.query.GenQuery - - -_temporal + + +_temporal @@ -287,49 +287,56 @@ icepyx.core.variables.Variables->icepyx.core.auth.EarthdataAuthMixin - - + + icepyx.core.variables.Variables->icepyx.core.query.Query - 
- -_order_vars + + +_order_vars icepyx.core.variables.Variables->icepyx.core.query.Query - - -_order_vars + + +_order_vars icepyx.core.variables.Variables->icepyx.core.read.Read - - -_read_vars + + +_read_vars + + + +icepyx.core.variables.Variables->icepyx.core.read.Read + + +_read_vars icepyx.core.visualization.Visualize - -Visualize - -bbox : list -cycles : NoneType -date_range : NoneType -product : NoneType, str -tracks : NoneType - -generate_OA_parameters(): list -grid_bbox(binsize): list -make_request(base_url, payload) -parallel_request_OA(): da.array -query_icesat2_filelist(): tuple -request_OA_data(paras): da.array -viz_elevation(): (hv.DynamicMap, hv.Layout) + +Visualize + +bbox : list +cycles : NoneType +date_range : NoneType +product : NoneType, str +tracks : NoneType + +generate_OA_parameters(): list +grid_bbox(binsize): list +make_request(base_url, payload) +parallel_request_OA(): da.array +query_icesat2_filelist(): tuple +request_OA_data(paras): da.array +viz_elevation(): (hv.DynamicMap, hv.Layout) From 38ebec9dfe01f510b45326e962df6a035d83bda6 Mon Sep 17 00:00:00 2001 From: Matt Fisher Date: Wed, 7 Aug 2024 18:31:21 -0600 Subject: [PATCH 06/11] Replace `setup.py` with equivalent `pyproject.toml` (#539) Co-authored-by: Wei Ji <23487320+weiji14@users.noreply.github.com> --- .pre-commit-config.yaml | 1 + pyproject.toml | 63 +++++++++++++++++++++++++++++++++++++++++ setup.py | 55 ----------------------------------- 3 files changed, 64 insertions(+), 55 deletions(-) create mode 100644 pyproject.toml delete mode 100644 setup.py diff --git a/.pre-commit-config.yaml b/.pre-commit-config.yaml index 19e67d75b..3b0674938 100644 --- a/.pre-commit-config.yaml +++ b/.pre-commit-config.yaml @@ -7,6 +7,7 @@ repos: - repo: https://github.com/pre-commit/pre-commit-hooks rev: v4.6.0 # Use the ref you want to point at hooks: + - id: check-toml - id: check-added-large-files args: ["--maxkb=5000"] - id: end-of-file-fixer diff --git a/pyproject.toml b/pyproject.toml new 
file mode 100644 index 000000000..917d03682 --- /dev/null +++ b/pyproject.toml @@ -0,0 +1,63 @@ +[project] +name = "icepyx" +description = "Python tools for obtaining and working with ICESat-2 data" +license = {file = "LICENSE"} +readme = "README.rst" + +requires-python = "~=3.7" +dynamic = ["version", "dependencies"] + +authors = [ + {name = "The icepyx Developers", email = "jbscheick@gmail.com"}, +] +maintainers = [ + {name = "The icepyx Developers", email = "jbscheick@gmail.com"}, +] + +classifiers=[ + "Development Status :: 4 - Beta", + "Intended Audience :: Science/Research", + "License :: OSI Approved :: BSD License", + "Operating System :: OS Independent", + "Programming Language :: Python :: 3", + "Programming Language :: Python :: 3.7", + "Programming Language :: Python :: 3.8", + "Programming Language :: Python :: 3.9", + "Programming Language :: Python :: 3.10", + "Programming Language :: Python :: 3.11", + "Programming Language :: Python :: 3.12", + "Topic :: Scientific/Engineering", + "Topic :: Scientific/Engineering :: GIS", + "Topic :: Software Development :: Libraries", +] + +[project.urls] +Homepage = "https://icepyx.readthedocs.io" +Documentation = "https://icepyx.readthedocs.io" +Repository = "https://github.com/icesat2py/icepyx" +Issues = "https://github.com/icesat2py/icepyx/issues" +Changelog = "https://icepyx.readthedocs.io/en/latest/user_guide/changelog/index.html" + + +[build-system] +build-backend = "setuptools.build_meta" +requires = ["setuptools>=66", "wheel", "setuptools_scm"] + +[tool.setuptools] +py-modules = ["_icepyx_version"] + +[tool.setuptools.dynamic] +dependencies = {file = ["requirements.txt"]} + +[project.optional-dependencies] +viz = ["geoviews >= 1.9.0", "cartopy >= 0.18.0", "scipy"] +complete = ["icepyx[viz]"] + +[tool.setuptools.packages.find] +exclude = ["*tests"] + +[tool.setuptools_scm] +version_file = "_icepyx_version.py" +version_file_template = 'version = "{version}"' +local_scheme = "node-and-date" +fallback_version 
= "unknown" diff --git a/setup.py b/setup.py deleted file mode 100644 index 003a5043f..000000000 --- a/setup.py +++ /dev/null @@ -1,55 +0,0 @@ -import setuptools - -with open("README.rst", "r") as f: - LONG_DESCRIPTION = f.read() - -with open("requirements.txt") as f: - INSTALL_REQUIRES = f.read().strip().split("\n") - -EXTRAS_REQUIRE = { - "viz": ["geoviews >= 1.9.0", "cartopy >= 0.18.0", "scipy"], -} -EXTRAS_REQUIRE["complete"] = sorted(set(sum(EXTRAS_REQUIRE.values(), []))) -# install with `pip install "icepyx[complete]"` There is no way to use this functionality with conda. - -setuptools.setup( - name="icepyx", - author="The icepyx Developers", - author_email="jbscheick@gmail.com", - maintainer="Jessica Scheick", - maintainer_email="jbscheick@gmail.com", - description="Python tools for obtaining and working with ICESat-2 data", - long_description=LONG_DESCRIPTION, - long_description_content_type="text/x-rst", - url="https://github.com/icesat2py/icepyx.git", - license="BSD 3-Clause", - packages=setuptools.find_packages(exclude=["*tests"]), - install_requires=INSTALL_REQUIRES, - extras_require=EXTRAS_REQUIRE, - python_requires=">=3", - # classifiers are a set of standard descriptions. 
Possible list: https://pypi.org/pypi?%3Aaction=list_classifiers - classifiers=[ - "Development Status :: 4 - Beta", - "Intended Audience :: Science/Research", - "License :: OSI Approved :: BSD License", - "Operating System :: OS Independent", - "Programming Language :: Python :: 3", - "Programming Language :: Python :: 3.7", - "Programming Language :: Python :: 3.8", - "Programming Language :: Python :: 3.9", - "Programming Language :: Python :: 3.10", - "Programming Language :: Python :: 3.11", - "Programming Language :: Python :: 3.12", - "Topic :: Scientific/Engineering", - "Topic :: Scientific/Engineering :: GIS", - "Topic :: Software Development :: Libraries", - ], - py_modules=["_icepyx_version"], - use_scm_version={ - "fallback_version": "unknown", - "local_scheme": "node-and-date", - "write_to": "_icepyx_version.py", - "write_to_template": 'version = "{version}"\n', - }, - setup_requires=["setuptools>=30.3.0", "wheel", "setuptools_scm"], -) From 91ca3c98fea017007c4d15906a131f4f6ba61c20 Mon Sep 17 00:00:00 2001 From: Matt Fisher Date: Thu, 8 Aug 2024 02:22:10 -0600 Subject: [PATCH 07/11] Fix continuous delivery & docs to account for setup.py -> pyproject.toml change (#541) --- .coveragerc | 1 - .github/workflows/publish_to_pypi.yml | 2 +- doc/source/getting_started/install.rst | 2 +- 3 files changed, 2 insertions(+), 3 deletions(-) diff --git a/.coveragerc b/.coveragerc index 2f02d0030..19831e170 100644 --- a/.coveragerc +++ b/.coveragerc @@ -2,5 +2,4 @@ branch = True source = icepyx omit = - setup.py doc/* diff --git a/.github/workflows/publish_to_pypi.yml b/.github/workflows/publish_to_pypi.yml index 91a19785e..e7fadd774 100644 --- a/.github/workflows/publish_to_pypi.yml +++ b/.github/workflows/publish_to_pypi.yml @@ -42,7 +42,7 @@ jobs: # Change setuptools-scm local_scheme to "no-local-version" so the # local part of the version isn't included, making the version string # compatible with PyPI. 
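The comment above (and the `sed` replacement that follows it) concerns setuptools-scm local version schemes: `node-and-date` appends a PEP 440 *local version segment* (`+g<hash>.d<date>`) to development builds, which PyPI rejects, while `no-local-version` keeps only the public part. A minimal sketch of how the local segment is delimited — the version strings below are hypothetical, not actual icepyx releases:

```python
def strip_local_segment(version: str) -> str:
    """Drop the PEP 440 local version segment, which follows a '+'."""
    return version.split("+", 1)[0]

# With local_scheme = "node-and-date", setuptools-scm produces versions
# like this (hypothetical) one; PyPI rejects the "+..." local segment:
dev_version = "0.8.2.dev12+g1a2b3c4.d20240808"

# With "no-local-version", only the public part remains:
print(strip_local_segment(dev_version))  # 0.8.2.dev12
```

This is why the CI workflow patches the scheme only at publish time: local builds keep the informative `node-and-date` suffix, while the artifacts uploaded to PyPI carry a compliant public version.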
- sed --in-place "s/node-and-date/no-local-version/g" setup.py + sed --in-place "s/node-and-date/no-local-version/g" pyproject.toml - name: Build source and wheel distributions run: | diff --git a/doc/source/getting_started/install.rst b/doc/source/getting_started/install.rst index 0ee77de55..8286c37f0 100644 --- a/doc/source/getting_started/install.rst +++ b/doc/source/getting_started/install.rst @@ -79,7 +79,7 @@ To clone the repository: Provided the location of the repo is part of your $PYTHONPATH, you should simply be able to add `import icepyx` to your Python document. -Alternatively, in a command line or terminal, navigate to the folder in your cloned repository containing setup.py and run +Alternatively, in a command line or terminal, navigate to the folder in your cloned repository containing `pyproject.toml` and run .. code-block:: From a77b685c945ce16e9b39d06dda1797960b219104 Mon Sep 17 00:00:00 2001 From: Matt Fisher Date: Sun, 11 Aug 2024 18:07:22 -0600 Subject: [PATCH 08/11] Autofix flake8 ignores E711, E712, E714, F401, F841 with Ruff (#542) --- .flake8 | 22 ++++----- .pre-commit-config.yaml | 13 ++--- icepyx/core/auth.py | 1 - icepyx/core/granules.py | 2 +- icepyx/core/is2ref.py | 2 +- icepyx/core/query.py | 13 ++--- icepyx/core/read.py | 2 +- icepyx/core/spatial.py | 72 ++++++++++++++-------------- icepyx/core/validate_inputs.py | 3 -- icepyx/core/variables.py | 52 ++++++++++---------- icepyx/core/visualization.py | 2 +- icepyx/quest/dataset_scripts/argo.py | 14 +++--- icepyx/quest/quest.py | 2 - icepyx/tests/is2class_query.py | 2 - icepyx/tests/test_Earthdata.py | 1 - icepyx/tests/test_auth.py | 1 - icepyx/tests/test_quest.py | 1 - icepyx/tests/test_quest_argo.py | 2 +- icepyx/tests/test_spatial.py | 41 +++++++--------- icepyx/tests/test_temporal.py | 19 ++++---- icepyx/tests/test_visualization.py | 1 - 21 files changed, 122 insertions(+), 146 deletions(-) diff --git a/.flake8 b/.flake8 index 97a126574..bacc40964 100644 --- a/.flake8 +++ b/.flake8 
@@ -8,32 +8,28 @@ per-file-ignores = test_granules.py:E501 # imported but unused __init__.py:F401 - # import not at top of file - doc/source/conf.py:E402 + # import not at top of file, imported but unused + doc/source/conf.py:E402,F401 -# GOAL: remove these ignores ignore = # line too long + # NOTE: This is a formatting concern. Black handles long lines of code, but + # allows inline comments to be infinitely long (automatically formatting + # them can have unintended consequences). In our codebase, we have a lot of + # overlong comments. + # See: https://github.com/psf/black/issues/1713#issuecomment-1357045092 E501 - # comparison syntax - E711 - # comparison syntax - E712 - # comparison syntax in tests - E714 + # GOAL: remove ignores below this line # comparison syntax in tests E721 # bare except E722 # ambiguous var name E741 - # imported but unused - F401 # unable to detect undefined names F403 - # assigned and unused (in tests) - F841 # line break before binary operator + # NOTE: This is a formatting concern W503 # GOAL: diff --git a/.pre-commit-config.yaml b/.pre-commit-config.yaml index 3b0674938..eb801b50d 100644 --- a/.pre-commit-config.yaml +++ b/.pre-commit-config.yaml @@ -1,13 +1,14 @@ repos: -- repo: https://github.com/psf/black - rev: 24.8.0 - hooks: +- repo: https://github.com/psf/black + rev: 24.8.0 + hooks: - id: black -- repo: https://github.com/pre-commit/pre-commit-hooks - rev: v4.6.0 # Use the ref you want to point at - hooks: +- repo: https://github.com/pre-commit/pre-commit-hooks + rev: v4.6.0 # Use the ref you want to point at + hooks: - id: check-toml + - id: check-yaml - id: check-added-large-files args: ["--maxkb=5000"] - id: end-of-file-fixer diff --git a/icepyx/core/auth.py b/icepyx/core/auth.py index ba07ac398..55e34fd7f 100644 --- a/icepyx/core/auth.py +++ b/icepyx/core/auth.py @@ -1,6 +1,5 @@ import copy import datetime -import warnings import earthaccess from icepyx.core.exceptions import DeprecationError diff --git 
a/icepyx/core/granules.py b/icepyx/core/granules.py index 5c298c625..205149f56 100644 --- a/icepyx/core/granules.py +++ b/icepyx/core/granules.py @@ -351,7 +351,7 @@ def place_order( # DevGoal: use the request response/number to do some error handling/ # give the user better messaging for failures # print(request.content) - root = ET.fromstring(request.content) + # root = ET.fromstring(request.content) # print([subset_agent.attrib for subset_agent in root.iter('SubsetAgent')]) if verbose is True: diff --git a/icepyx/core/is2ref.py b/icepyx/core/is2ref.py index 86888547b..38561168a 100644 --- a/icepyx/core/is2ref.py +++ b/icepyx/core/is2ref.py @@ -159,7 +159,7 @@ def get_varlist(elem): get_varlist(root) vars_vals = [ - v.replace(":", "/") if v.startswith("/") == False else v.replace("/:", "") + v.replace(":", "/") if v.startswith("/") is False else v.replace("/:", "") for v in vars_raw ] cust_options.update({"variables": vars_vals}) diff --git a/icepyx/core/query.py b/icepyx/core/query.py index dce3c1c34..a57806501 100644 --- a/icepyx/core/query.py +++ b/icepyx/core/query.py @@ -1,6 +1,5 @@ import geopandas as gpd import matplotlib.pyplot as plt -from pathlib import Path # used in docstring tests import pprint import icepyx.core.APIformatting as apifmt @@ -103,9 +102,10 @@ class GenQuery: Initializing Query with a geospatial polygon file. - >>> aoi = str(Path('./doc/source/example_notebooks/supporting_files/simple_test_poly.gpkg').resolve()) + >>> from pathlib import Path + >>> aoi = Path('./doc/source/example_notebooks/supporting_files/simple_test_poly.gpkg').resolve() >>> reg_a_dates = ['2019-02-22','2019-02-28'] - >>> reg_a = GenQuery(aoi, reg_a_dates) + >>> reg_a = GenQuery(str(aoi), reg_a_dates) >>> print(reg_a) Extent type: polygon Coordinates: [-55.0, 68.0, -55.0, 71.0, -48.0, 71.0, -48.0, 68.0, -55.0, 68.0] @@ -378,9 +378,10 @@ class Query(GenQuery, EarthdataAuthMixin): Initializing Query with a geospatial polygon file. 
- >>> aoi = str(Path('./doc/source/example_notebooks/supporting_files/simple_test_poly.gpkg').resolve()) + >>> from pathlib import Path + >>> aoi = Path('./doc/source/example_notebooks/supporting_files/simple_test_poly.gpkg').resolve() >>> reg_a_dates = ['2019-02-22','2019-02-28'] - >>> reg_a = Query('ATL06', aoi, reg_a_dates) + >>> reg_a = Query('ATL06', str(aoi), reg_a_dates) >>> print(reg_a) Product ATL06 v006 ('polygon', [-55.0, 68.0, -55.0, 71.0, -48.0, 71.0, -48.0, 68.0, -55.0, 68.0]) @@ -1132,7 +1133,7 @@ def visualize_spatial_extent( gdf = self._spatial.extent_as_gdf try: - from shapely.geometry import Polygon + from shapely.geometry import Polygon # noqa: F401 import geoviews as gv gv.extension("bokeh") diff --git a/icepyx/core/read.py b/icepyx/core/read.py index 2e4e03ade..e6bfc40b7 100644 --- a/icepyx/core/read.py +++ b/icepyx/core/read.py @@ -5,7 +5,6 @@ import earthaccess import numpy as np -from s3fs.core import S3File import xarray as xr from icepyx.core.auth import EarthdataAuthMixin @@ -634,6 +633,7 @@ def load(self): ) # wanted_groups, vgrp.keys())) # Closing the file prevents further operations on the dataset + # from s3fs.core import S3File # if isinstance(file, S3File): # file.close() diff --git a/icepyx/core/spatial.py b/icepyx/core/spatial.py index 721c949ab..d810726b3 100644 --- a/icepyx/core/spatial.py +++ b/icepyx/core/spatial.py @@ -1,12 +1,10 @@ import geopandas as gpd import numpy as np import os -from pathlib import Path from shapely.geometry import box, Polygon from shapely.geometry.polygon import orient import warnings -import icepyx.core.APIformatting as apifmt # DevGoal: need to update the spatial_extent docstring to describe coordinate order for input @@ -62,7 +60,7 @@ def geodataframe(extent_type, spatial_extent, file=False, xdateline=None): # print("this should cross the dateline:" + str(xdateline)) if extent_type == "bounding_box": - if xdateline == True: + if xdateline is True: cartesian_lons = [i if i > 0 else i + 360 for i 
in spatial_extent[0:-1:2]] cartesian_spatial_extent = [ item @@ -79,14 +77,14 @@ def geodataframe(extent_type, spatial_extent, file=False, xdateline=None): # DevGoal: Currently this if/else within this elif are not tested... # DevGoal: the crs setting and management needs to be improved - elif extent_type == "polygon" and file == False: + elif extent_type == "polygon" and file is False: # if spatial_extent is already a Polygon if isinstance(spatial_extent, Polygon): spatial_extent_geom = spatial_extent # else, spatial_extent must be a list of floats (or list of tuples of floats) else: - if xdateline == True: + if xdateline is True: cartesian_lons = [ i if i > 0 else i + 360 for i in spatial_extent[0:-1:2] ] @@ -109,7 +107,7 @@ def geodataframe(extent_type, spatial_extent, file=False, xdateline=None): # If extent_type is a polygon AND from a file, create a geopandas geodataframe from it # DevGoal: Currently this elif isn't tested... - elif extent_type == "polygon" and file == True: + elif extent_type == "polygon" and file is True: gdf = gpd.read_file(spatial_extent) else: @@ -397,37 +395,38 @@ def __init__(self, spatial_extent, **kwarg): Optional keyword argument to let user specify whether the spatial input crosses the dateline or not. - See Also - -------- - icepyx.Query - - - Examples - -------- - Initializing Spatial with a bounding box. - - >>> reg_a_bbox = [-55, 68, -48, 71] - >>> reg_a = Spatial(reg_a_bbox) - >>> print(reg_a) - Extent type: bounding_box - Coordinates: [-55.0, 68.0, -48.0, 71.0] - - Initializing Query with a list of polygon vertex coordinate pairs. - - >>> reg_a_poly = [(-55, 68), (-55, 71), (-48, 71), (-48, 68), (-55, 68)] - >>> reg_a = Spatial(reg_a_poly) - >>> print(reg_a) - Extent type: polygon - Coordinates: [-55.0, 68.0, -55.0, 71.0, -48.0, 71.0, -48.0, 68.0, -55.0, 68.0] + See Also + -------- + icepyx.Query - Initializing Query with a geospatial polygon file. 
- >>> aoi = str(Path('./doc/source/example_notebooks/supporting_files/simple_test_poly.gpkg').resolve()) - >>> reg_a = Spatial(aoi) - >>> print(reg_a) # doctest: +SKIP - Extent Type: polygon - Source file: ./doc/source/example_notebooks/supporting_files/simple_test_poly.gpkg - Coordinates: [-55.0, 68.0, -55.0, 71.0, -48.0, 71.0, -48.0, 68.0, -55.0, 68.0] + Examples + -------- + Initializing Spatial with a bounding box. + + >>> reg_a_bbox = [-55, 68, -48, 71] + >>> reg_a = Spatial(reg_a_bbox) + >>> print(reg_a) + Extent type: bounding_box + Coordinates: [-55.0, 68.0, -48.0, 71.0] + + Initializing Query with a list of polygon vertex coordinate pairs. + + >>> reg_a_poly = [(-55, 68), (-55, 71), (-48, 71), (-48, 68), (-55, 68)] + >>> reg_a = Spatial(reg_a_poly) + >>> print(reg_a) + Extent type: polygon + Coordinates: [-55.0, 68.0, -55.0, 71.0, -48.0, 71.0, -48.0, 68.0, -55.0, 68.0] + + Initializing Query with a geospatial polygon file. + + >>> from pathlib import Path + >>> aoi = Path('./doc/source/example_notebooks/supporting_files/simple_test_poly.gpkg').resolve() + >>> reg_a = Spatial(str(aoi)) + >>> print(reg_a) # doctest: +SKIP + Extent Type: polygon + Source file: ./doc/source/example_notebooks/supporting_files/simple_test_poly.gpkg + Coordinates: [-55.0, 68.0, -55.0, 71.0, -48.0, 71.0, -48.0, 68.0, -55.0, 68.0] """ scalar_types = (int, float, np.int64) @@ -590,6 +589,7 @@ def extent_file(self): >>> reg_a.extent_file + >>> from pathlib import Path >>> reg_a = Spatial(str(Path('./doc/source/example_notebooks/supporting_files/simple_test_poly.gpkg').resolve())) >>> reg_a.extent_file # doctest: +SKIP ./doc/source/example_notebooks/supporting_files/simple_test_poly.gpkg @@ -643,7 +643,7 @@ def fmt_for_CMR(self): extent = [float(i) for i in polygon] # TODO: explore how this will be impacted if the polygon is read in from a shapefile and crosses the dateline - if hasattr(self, "_xdateln") and self._xdateln == True: + if hasattr(self, "_xdateln") and self._xdateln is 
True: neg_lons = [i if i < 181.0 else i - 360 for i in extent[0:-1:2]] extent = [item for pair in zip(neg_lons, extent[1::2]) for item in pair] diff --git a/icepyx/core/validate_inputs.py b/icepyx/core/validate_inputs.py index a69f045fb..e70a8f944 100644 --- a/icepyx/core/validate_inputs.py +++ b/icepyx/core/validate_inputs.py @@ -1,10 +1,7 @@ import datetime as dt -import os import warnings import numpy as np -import icepyx.core.APIformatting as apifmt - def prod_version(latest_vers, version): """ diff --git a/icepyx/core/variables.py b/icepyx/core/variables.py index 15d5268e5..87c2a94e8 100644 --- a/icepyx/core/variables.py +++ b/icepyx/core/variables.py @@ -1,12 +1,10 @@ import numpy as np import os -import pprint from icepyx.core.auth import EarthdataAuthMixin import icepyx.core.is2ref as is2ref from icepyx.core.exceptions import DeprecationError import icepyx.core.validate_inputs as val -import icepyx.core as ipxc # DEVGOAL: use h5py to simplify some of these tasks, if possible! @@ -145,7 +143,7 @@ def avail(self, options=False, internal=False): 'quality_assessment/gt3r/signal_selection_source_fraction_3'] """ - if not hasattr(self, "_avail") or self._avail == None: + if not hasattr(self, "_avail") or self._avail is None: if not hasattr(self, "path") or self.path.startswith("s3"): self._avail = is2ref._get_custom_options( self.session, self.product, self.version @@ -167,15 +165,15 @@ def visitor_func(name, node): with h5py.File(self.path, "r") as h5f: h5f.visititems(visitor_func) - if options == True: + if options is True: vgrp, paths = self.parse_var_list(self._avail) allpaths = [] [allpaths.extend(np.unique(np.array(paths[p]))) for p in range(len(paths))] allpaths = np.unique(allpaths) - if internal == False: + if internal is False: print("var_list inputs: " + ", ".join(vgrp.keys())) print("keyword_list and beam_list inputs: " + ", ".join(allpaths)) - elif internal == True: + elif internal is True: return vgrp, allpaths else: return self._avail @@ -259,12 
+257,12 @@ def parse_var_list(varlist, tiered=True, tiered_vars=False): # create a dictionary of variable names and paths vgrp = {} - if tiered == False: + if tiered is False: paths = [] else: num = np.max([v.count("/") for v in varlist]) # print('max needed: ' + str(num)) - if tiered_vars == True: + if tiered_vars is True: paths = [[] for i in range(num + 1)] else: paths = [[] for i in range(num)] @@ -279,7 +277,7 @@ def parse_var_list(varlist, tiered=True, tiered_vars=False): vgrp[vkey].append(vn) if vpath: - if tiered == False: + if tiered is False: paths.append(vpath) else: j = 0 @@ -289,7 +287,7 @@ def parse_var_list(varlist, tiered=True, tiered_vars=False): for i in range(j, num): paths[i].append("none") i = i + 1 - if tiered_vars == True: + if tiered_vars is True: paths[num].append(vkey) return vgrp, paths @@ -363,7 +361,7 @@ def _get_sum_varlist(self, var_list, all_vars, defaults): Get the list of variables to add or iterate through, depending on function inputs. """ sum_varlist = [] - if defaults == True: + if defaults is True: sum_varlist = sum_varlist + is2ref._default_varlists(self.product) if var_list is not None: for vn in var_list: @@ -380,9 +378,9 @@ def _get_combined_list(beam_list, keyword_list): Get the combined list of beams and/or keywords to add or iterate through. """ combined_list = [] - if beam_list == None: + if beam_list is None: combined_list = keyword_list - elif keyword_list == None: + elif keyword_list is None: combined_list = beam_list else: combined_list = keyword_list + beam_list @@ -485,10 +483,10 @@ def append(self, defaults=False, var_list=None, beam_list=None, keyword_list=Non """ assert not ( - defaults == False - and var_list == None - and beam_list == None - and keyword_list == None + defaults is False + and var_list is None + and beam_list is None + and keyword_list is None ), "You must enter parameters to add to a variable subset list. 
If you do not want to subset by variable, ensure your is2.subsetparams dictionary does not contain the key 'Coverage'." final_vars = {} @@ -497,7 +495,7 @@ def append(self, defaults=False, var_list=None, beam_list=None, keyword_list=Non self._check_valid_lists(vgrp, allpaths, var_list, beam_list, keyword_list) # Instantiate self.wanted to an empty dictionary if it doesn't exist - if not hasattr(self, "wanted") or self.wanted == None: + if not hasattr(self, "wanted") or self.wanted is None: self.wanted = {} # DEVGOAL: add a secondary var list to include uncertainty/error information for lower level data if specific data variables have been specified... @@ -506,7 +504,7 @@ def append(self, defaults=False, var_list=None, beam_list=None, keyword_list=Non sum_varlist = self._get_sum_varlist(var_list, vgrp.keys(), defaults) # Case only variables (but not keywords or beams) are specified - if beam_list == None and keyword_list == None: + if beam_list is None and keyword_list is None: final_vars.update(self._iter_vars(sum_varlist, final_vars, vgrp)) # Case a beam and/or keyword list is specified (with or without variables) @@ -577,16 +575,16 @@ def remove(self, all=False, var_list=None, beam_list=None, keyword_list=None): >>> reg_a.order_vars.remove(keyword_list=['ancillary_data']) # doctest: +SKIP """ - if not hasattr(self, "wanted") or self.wanted == None: + if not hasattr(self, "wanted") or self.wanted is None: raise ValueError( "You must construct a wanted variable list in order to remove values from it." ) assert not ( - all == False - and var_list == None - and beam_list == None - and keyword_list == None + all is False + and var_list is None + and beam_list is None + and keyword_list is None ), "You must specify which variables/paths/beams you would like to remove from your wanted list." 
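Many of the hunks above replace equality comparisons against singletons (`== None`, `== True`, `== False`) with identity checks (`is None`, `is True`, `is False`), fixing flake8 E711/E712. The distinction is not purely stylistic: `==` dispatches to `__eq__`, which a class can override arbitrarily, while `is` compares object identity and cannot be intercepted. A small self-contained illustration with a hypothetical class:

```python
class Always:
    """Hypothetical class whose __eq__ claims equality with anything."""

    def __eq__(self, other):
        return True


x = Always()

# Equality can lie: __eq__ is consulted and returns True here.
print(x == None)  # True

# Identity cannot be overridden: x is not the None singleton.
print(x is None)  # False
```

The same concern applies to NumPy arrays, where `==` broadcasts elementwise and returns an array rather than a single bool — another reason identity checks are the safe default in a codebase like this one.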
# if not hasattr(self, 'avail'): self.get_avail() @@ -598,7 +596,7 @@ def remove(self, all=False, var_list=None, beam_list=None, keyword_list=None): # self._check_valid_lists(vgrp, allpaths, var_list, beam_list, keyword_list) - if all == True: + if all is True: try: self.wanted = None except NameError: @@ -606,7 +604,7 @@ def remove(self, all=False, var_list=None, beam_list=None, keyword_list=None): else: # Case only variables (but not keywords or beams) are specified - if beam_list == None and keyword_list == None: + if beam_list is None and keyword_list is None: for vn in var_list: try: del self.wanted[vn] @@ -617,7 +615,7 @@ def remove(self, all=False, var_list=None, beam_list=None, keyword_list=None): # Case a beam and/or keyword list is specified (with or without variables) else: combined_list = self._get_combined_list(beam_list, keyword_list) - if var_list == None: + if var_list is None: var_list = self.wanted.keys() # nec_varlist = ['sc_orient','atlas_sdp_gps_epoch','data_start_utc','data_end_utc', diff --git a/icepyx/core/visualization.py b/icepyx/core/visualization.py index edc10d66d..bdbc6d2d9 100644 --- a/icepyx/core/visualization.py +++ b/icepyx/core/visualization.py @@ -364,7 +364,7 @@ def request_OA_data(self, paras) -> da.array: df_series = df.query(expr="date == @Date").iloc[0] beam_data = df_series.beams - except (NameError, KeyError, IndexError) as error: + except (NameError, KeyError, IndexError): beam_data = None if not beam_data: diff --git a/icepyx/quest/dataset_scripts/argo.py b/icepyx/quest/dataset_scripts/argo.py index 8c614d301..b7f374bd6 100644 --- a/icepyx/quest/dataset_scripts/argo.py +++ b/icepyx/quest/dataset_scripts/argo.py @@ -268,10 +268,10 @@ def search_data(self, params=None, presRange=None, printURL=False) -> str: """ # if search is called with replaced parameters or presRange - if not params is None: + if params is not None: self.params = params - if not presRange is None: + if presRange is not None: self.presRange = presRange 
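The `argo.py` hunks above rewrite `not params is None` as `params is not None` (flake8 E714). Both forms evaluate identically, because `not x is None` parses as `not (x is None)` rather than `(not x) is None`, but the rewritten form states the intent without requiring the reader to recall operator precedence. A quick equivalence check:

```python
for x in (None, 0, "", [], object()):
    # `not x is None` parses as `not (x is None)` -- equivalent, but the
    # intent is clearer written as `x is not None` (flake8 E714):
    assert (not x is None) == (x is not None)

print("equivalent for all tested values")
```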
# builds URL to be submitted @@ -437,23 +437,23 @@ def download(self, params=None, presRange=None, keep_existing=True) -> pd.DataFr """ # TODO: do some basic testing of this block and how the dataframe merging actually behaves - if keep_existing == False: + if keep_existing is False: print( "Your previously stored data in reg.argodata", "will be deleted before new data is downloaded.", ) self.argodata = None - elif keep_existing == True and hasattr(self, "argodata"): + elif keep_existing is True and hasattr(self, "argodata"): print( "The data requested by running this line of code\n", "will be added to previously downloaded data.", ) # if download is called with replaced parameters or presRange - if not params is None: + if params is not None: self.params = params - if not presRange is None: + if presRange is not None: self.presRange = presRange # Add qc data for each of the parameters requested @@ -482,7 +482,7 @@ def download(self, params=None, presRange=None, keep_existing=True) -> pd.DataFr # now that we have a df from this round of downloads, we can add it to any existing dataframe # note that if a given column has previously been added, update needs to be used to replace nans (merge will not replace the nan values) - if not self.argodata is None: + if self.argodata is not None: self.argodata = self.argodata.merge(merged_df, how="outer") else: self.argodata = merged_df diff --git a/icepyx/quest/quest.py b/icepyx/quest/quest.py index 966b19dca..a7cf9be3c 100644 --- a/icepyx/quest/quest.py +++ b/icepyx/quest/quest.py @@ -1,5 +1,3 @@ -import matplotlib.pyplot as plt - from icepyx.core.query import GenQuery, Query from icepyx.quest.dataset_scripts.argo import Argo diff --git a/icepyx/tests/is2class_query.py b/icepyx/tests/is2class_query.py index 22a10b223..84c31cfa4 100644 --- a/icepyx/tests/is2class_query.py +++ b/icepyx/tests/is2class_query.py @@ -1,6 +1,4 @@ import icepyx as ipx -import pytest -import warnings def test_CMRparams(): diff --git 
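Several test hunks in this patch (e.g. in `test_spatial.py` below) drop unused assignments inside `pytest.raises` blocks, fixing flake8 F841: the constructor is called purely for the exception it raises, so binding its return value is dead code. A minimal sketch of the pattern, using a hypothetical `divide` function rather than the icepyx API:

```python
import pytest


def divide(a, b):
    return a / b


def test_divide_by_zero():
    # Call purely for the exception; no assignment needed (flake8 F841).
    with pytest.raises(ZeroDivisionError):
        divide(1, 0)
```

`pytest.raises` works as a plain context manager, so the assertion that the exception occurred is carried by the `with` block itself — nothing about the call's return value needs to be kept.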
a/icepyx/tests/test_Earthdata.py b/icepyx/tests/test_Earthdata.py index 60b92f621..81093f8ff 100644 --- a/icepyx/tests/test_Earthdata.py +++ b/icepyx/tests/test_Earthdata.py @@ -6,7 +6,6 @@ import os import pytest import shutil -import warnings # PURPOSE: test different authentication methods diff --git a/icepyx/tests/test_auth.py b/icepyx/tests/test_auth.py index c8f8e8f5d..50ae1e6ca 100644 --- a/icepyx/tests/test_auth.py +++ b/icepyx/tests/test_auth.py @@ -4,7 +4,6 @@ import earthaccess from icepyx.core.auth import EarthdataAuthMixin -from icepyx.core.exceptions import DeprecationError @pytest.fixture() diff --git a/icepyx/tests/test_quest.py b/icepyx/tests/test_quest.py index 0ba7325a6..2270bfa8b 100644 --- a/icepyx/tests/test_quest.py +++ b/icepyx/tests/test_quest.py @@ -1,5 +1,4 @@ import pytest -import re import icepyx as ipx from icepyx.quest.quest import Quest diff --git a/icepyx/tests/test_quest_argo.py b/icepyx/tests/test_quest_argo.py index a6940fe7b..fb20a3a47 100644 --- a/icepyx/tests/test_quest_argo.py +++ b/icepyx/tests/test_quest_argo.py @@ -59,7 +59,7 @@ def test_param_setter(argo_quest_instance): reg_a.params = ["temperature", "salinity"] - exp = list(set(["temperature", "salinity"])) + exp = ["temperature", "salinity"] assert reg_a.params == exp diff --git a/icepyx/tests/test_spatial.py b/icepyx/tests/test_spatial.py index 4d6369d9e..dc71cdac9 100644 --- a/icepyx/tests/test_spatial.py +++ b/icepyx/tests/test_spatial.py @@ -1,12 +1,9 @@ -import datetime as dt import geopandas as gpd import numpy as np -import os from pathlib import Path import pytest import re from shapely.geometry import Polygon -import warnings import icepyx.core.spatial as spat @@ -65,62 +62,62 @@ def test_intlist_with0_bbox(): def test_too_few_bbox_points(): with pytest.raises(AssertionError): - too_few_bbox_points = spat.Spatial([-64.2, 66.2, -55.5]) + spat.Spatial([-64.2, 66.2, -55.5]) def test_too_many_bbox_points(): with pytest.raises(AssertionError): - 
too_many_bbox_points = spat.Spatial([-64.2, 66.2, -55.5, 72.5, 0]) + spat.Spatial([-64.2, 66.2, -55.5, 72.5, 0]) def test_invalid_low_latitude_1_bbox(): with pytest.raises(AssertionError): - low_lat_1_bbox = spat.Spatial([-64.2, -90.2, -55.5, 72.5]) + spat.Spatial([-64.2, -90.2, -55.5, 72.5]) def test_invalid_high_latitude_1_bbox(): with pytest.raises(AssertionError): - high_lat_1_bbox = spat.Spatial([-64.2, 90.2, -55.5, 72.5]) + spat.Spatial([-64.2, 90.2, -55.5, 72.5]) def test_invalid_low_latitude_3_bbox(): with pytest.raises(AssertionError): - low_lat_3_bbox = spat.Spatial([-64.2, 66.2, -55.5, -90.5]) + spat.Spatial([-64.2, 66.2, -55.5, -90.5]) def test_invalid_high_latitude_3_bbox(): with pytest.raises(AssertionError): - high_lat_3_bbox = spat.Spatial([-64.2, 66.2, -55.5, 90.5]) + spat.Spatial([-64.2, 66.2, -55.5, 90.5]) def test_invalid_low_longitude_0_bbox(): with pytest.raises(AssertionError): - low_lon_0_bbox = spat.Spatial([-180.2, 66.2, -55.5, 72.5]) + spat.Spatial([-180.2, 66.2, -55.5, 72.5]) def test_invalid_high_longitude_0_bbox(): with pytest.raises(AssertionError): - high_lon_0_bbox = spat.Spatial([180.2, 66.2, -55.5, 72.5]) + spat.Spatial([180.2, 66.2, -55.5, 72.5]) def test_invalid_low_longitude_2_bbox(): with pytest.raises(AssertionError): - low_lon_2_bbox = spat.Spatial([-64.2, 66.2, -180.5, 72.5]) + spat.Spatial([-64.2, 66.2, -180.5, 72.5]) def test_invalid_high_longitude_2_bbox(): with pytest.raises(AssertionError): - high_lon_2_bbox = spat.Spatial([-64.2, 66.2, 180.5, 72.5]) + spat.Spatial([-64.2, 66.2, 180.5, 72.5]) def test_same_sign_lowleft_gt_upright_latitude_bbox(): with pytest.raises(AssertionError): - lat_ll_gt_ur_ss_bbox = spat.Spatial([-64.2, 72.5, -55.5, 66.2]) + spat.Spatial([-64.2, 72.5, -55.5, 66.2]) def test_bad_values_bbox(): with pytest.raises(ValueError): - bad_input = spat.Spatial(["a", "b", "c", "d"]) + spat.Spatial(["a", "b", "c", "d"]) # ############### END BOUNDING BOX TESTS 
################################################################ @@ -287,19 +284,17 @@ def test_numpy_intlist_latlon_coords(): def test_odd_num_lat_long_list_poly_throws_error(): with pytest.raises(AssertionError): - bad_input = spat.Spatial([-55, 68, -55, 71, -48, 71, -48, 68, -55]) + spat.Spatial([-55, 68, -55, 71, -48, 71, -48, 68, -55]) def test_wrong_num_lat_long_tuple_poly_throws_error(): with pytest.raises(ValueError): - bad_input = spat.Spatial( - [(-55, 68, 69), (-55, 71), (-48, 71), (-48, 68), (-55, 68)] - ) + spat.Spatial([(-55, 68, 69), (-55, 71), (-48, 71), (-48, 68), (-55, 68)]) def test_bad_value_types_poly(): with pytest.raises(ValueError): - bad_input = spat.Spatial(["a", "b", "c", "d", "e"]) + spat.Spatial(["a", "b", "c", "d", "e"]) # ###################### Automatically Closed Polygon Tests ########################################################### @@ -378,12 +373,12 @@ def test_poly_file_simple_one_poly(): def test_bad_poly_inputfile_name_throws_error(): with pytest.raises(AssertionError): - bad_input = spat.Spatial("bad_filename.gpkg") + spat.Spatial("bad_filename.gpkg") def test_bad_poly_inputfile_type_throws_error(): with pytest.raises(TypeError): - bad_input = spat.Spatial(str(Path("./icepyx/tests/test_read.py").resolve())) + spat.Spatial(str(Path("./icepyx/tests/test_read.py").resolve())) ########## geodataframe ########## @@ -461,7 +456,7 @@ def test_bbox_not_crosses_dateline(bbox): def test_poly_wrong_input(): with pytest.raises(AssertionError): - tuplelist = spat.check_dateline( + spat.check_dateline( "polygon", [[160, -45], [160, -40], [-170, -39], [-128, -40], [-128, -45], [160, -45]], ) diff --git a/icepyx/tests/test_temporal.py b/icepyx/tests/test_temporal.py index c93b30a38..cd24deda4 100644 --- a/icepyx/tests/test_temporal.py +++ b/icepyx/tests/test_temporal.py @@ -1,8 +1,5 @@ import datetime as dt -import numpy as np import pytest -from shapely.geometry import Polygon -import warnings import icepyx.core.temporal as tp @@ -239,44 
+236,44 @@ def test_range_str_yyyydoy_dict_time_start_end(): # (The following inputs are bad, testing to ensure the temporal class handles this elegantly) def test_bad_start_time_type(): with pytest.raises(AssertionError): - bad_start = tp.Temporal(["2016-01-01", "2020-01-01"], 100000, "13:10:01") + tp.Temporal(["2016-01-01", "2020-01-01"], 100000, "13:10:01") def test_bad_end_time_type(): with pytest.raises(AssertionError): - bad_end = tp.Temporal(["2016-01-01", "2020-01-01"], "01:00:00", 131001) + tp.Temporal(["2016-01-01", "2020-01-01"], "01:00:00", 131001) def test_range_bad_list_len(): with pytest.raises(ValueError): - result = tp.Temporal(["2016-01-01", "2020-01-01", "2022-02-15"]) + tp.Temporal(["2016-01-01", "2020-01-01", "2022-02-15"]) def test_range_str_bad_yyyydoy(): with pytest.raises(AssertionError): - bad_end = tp.Temporal(["2016-01-01", "2020-01-01"], "01:00:00", 131001) + tp.Temporal(["2016-01-01", "2020-01-01"], "01:00:00", 131001) def test_range_str_bad_yyyymmdd(): with pytest.raises(AssertionError): - bad_end = tp.Temporal(["2016-01-01", "2020-01-01"], "01:00:00", 131001) + tp.Temporal(["2016-01-01", "2020-01-01"], "01:00:00", 131001) # a "bad dict" is assumed to be one of the wrong length or with the wrong key names def test_bad_dict_keys(): with pytest.raises(ValueError): - result = tp.Temporal({"startdate": "2016-01-01", "enddate": "2020-01-01"}) + tp.Temporal({"startdate": "2016-01-01", "enddate": "2020-01-01"}) def test_bad_dict_length(): with pytest.raises(ValueError): - result = tp.Temporal({"start_date": "2016-01-01"}) + tp.Temporal({"start_date": "2016-01-01"}) # A "bad range" is a range where the start_date > end date def test_range_str_bad_range(): with pytest.raises(AssertionError): - result = tp.Temporal({"start_date": "2020-01-01", "end_date": "2016-01-01"}) + tp.Temporal({"start_date": "2020-01-01", "end_date": "2016-01-01"}) # NOTE: Not testing bad datetime/time inputs because it is assumed the datetime library diff --git 
a/icepyx/tests/test_visualization.py b/icepyx/tests/test_visualization.py index ede046f0b..403cb21f1 100644 --- a/icepyx/tests/test_visualization.py +++ b/icepyx/tests/test_visualization.py @@ -1,6 +1,5 @@ import pytest -from icepyx.core.visualization import Visualize import icepyx.core.visualization as vis From 49501ecaf6e08197e2d394de47f842c988cdb12d Mon Sep 17 00:00:00 2001 From: Matt Fisher Date: Mon, 12 Aug 2024 15:40:14 -0600 Subject: [PATCH 09/11] Switch to set comparison to fix a test dependent on unreliable order (#550) --- icepyx/tests/test_quest_argo.py | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/icepyx/tests/test_quest_argo.py b/icepyx/tests/test_quest_argo.py index fb20a3a47..a31b52f78 100644 --- a/icepyx/tests/test_quest_argo.py +++ b/icepyx/tests/test_quest_argo.py @@ -59,8 +59,8 @@ def test_param_setter(argo_quest_instance): reg_a.params = ["temperature", "salinity"] - exp = ["temperature", "salinity"] - assert reg_a.params == exp + exp = {"temperature", "salinity"} + assert set(reg_a.params) == exp def test_param_setter_invalid_inputs(argo_quest_instance): From e0e1738737db77714e604035242da32bcd263ad2 Mon Sep 17 00:00:00 2001 From: Matt Fisher Date: Mon, 12 Aug 2024 15:52:26 -0600 Subject: [PATCH 10/11] Switch to ruff (#543) Co-authored-by: Wei Ji <23487320+weiji14@users.noreply.github.com> --- .flake8 | 37 ---------------- .github/workflows/linter_actions.yml | 13 +++--- .pre-commit-config.yaml | 30 +++++++------ doc/sphinxext/announce.py | 1 + icepyx/core/is2ref.py | 19 ++++---- icepyx/tests/test_Earthdata.py | 1 + pyproject.toml | 65 ++++++++++++++++++++++------ requirements-dev.txt | 2 - 8 files changed, 87 insertions(+), 81 deletions(-) delete mode 100644 .flake8 diff --git a/.flake8 b/.flake8 deleted file mode 100644 index bacc40964..000000000 --- a/.flake8 +++ /dev/null @@ -1,37 +0,0 @@ -[flake8] -#GOAL: max_line_length = 79 or 99 -max_line_length = 99 -per-file-ignores = - # too many leading '#' for block comment - 
*/tests/*:E266 - # line too long (several test strs) - test_granules.py:E501 - # imported but unused - __init__.py:F401 - # import not at top of file, imported but unused - doc/source/conf.py:E402,F401 - -ignore = - # line too long - # NOTE: This is a formatting concern. Black handles long lines of code, but - # allows inline comments to be infinitely long (automatically formatting - # them can have unintended consequences). In our codebase, we have a lot of - # overlong comments. - # See: https://github.com/psf/black/issues/1713#issuecomment-1357045092 - E501 - # GOAL: remove ignores below this line - # comparison syntax in tests - E721 - # bare except - E722 - # ambiguous var name - E741 - # unable to detect undefined names - F403 - # line break before binary operator - # NOTE: This is a formatting concern - W503 - - # GOAL: - # syntax check doctests in docstrings - # doctests = True diff --git a/.github/workflows/linter_actions.yml b/.github/workflows/linter_actions.yml index d2f8c31f6..88f66a3bc 100644 --- a/.github/workflows/linter_actions.yml +++ b/.github/workflows/linter_actions.yml @@ -10,9 +10,10 @@ jobs: runs-on: ubuntu-latest steps: - uses: actions/checkout@v4 - - name: Run black linter - uses: psf/black@stable - # use the flake8 linter to annotate improperly formatted code - # note linter arguments are supplied via the .flake8 config file - - name: Annotate PR after running flake8 - uses: TrueBrain/actions-flake8@v2 + + # Use the Ruff linter to annotate code style / best-practice issues + # NOTE: More config provided in pyproject.toml + - name: Lint and annotate PR + uses: chartboost/ruff-action@v1 + with: + args: "check . 
--output-format github" diff --git a/.pre-commit-config.yaml b/.pre-commit-config.yaml index eb801b50d..62fe68370 100644 --- a/.pre-commit-config.yaml +++ b/.pre-commit-config.yaml @@ -1,19 +1,21 @@ repos: -- repo: https://github.com/psf/black - rev: 24.8.0 - hooks: - - id: black + - repo: https://github.com/pre-commit/pre-commit-hooks + rev: v4.6.0 + hooks: + - id: check-toml + - id: check-yaml + - id: check-added-large-files + args: ["--maxkb=5000"] + - id: end-of-file-fixer + - id: trailing-whitespace + args: [--markdown-linebreak-ext=md] -- repo: https://github.com/pre-commit/pre-commit-hooks - rev: v4.6.0 # Use the ref you want to point at - hooks: - - id: check-toml - - id: check-yaml - - id: check-added-large-files - args: ["--maxkb=5000"] - - id: end-of-file-fixer - - id: trailing-whitespace - args: [--markdown-linebreak-ext=md] + - repo: https://github.com/astral-sh/ruff-pre-commit + rev: v0.5.7 + hooks: + - id: ruff + args: ["--fix", "--show-fixes"] + - id: ruff-format ci: autoupdate_schedule: monthly diff --git a/doc/sphinxext/announce.py b/doc/sphinxext/announce.py index db6858678..6ff0e4884 100644 --- a/doc/sphinxext/announce.py +++ b/doc/sphinxext/announce.py @@ -48,6 +48,7 @@ $ ./scripts/announce.py $GITHUB v1.11.0..v1.11.1 > announce.rst """ + import codecs import os import re diff --git a/icepyx/core/is2ref.py b/icepyx/core/is2ref.py index 38561168a..be3a3c8da 100644 --- a/icepyx/core/is2ref.py +++ b/icepyx/core/is2ref.py @@ -56,14 +56,17 @@ def _validate_OA_product(product): """ if isinstance(product, str): product = str.upper(product) - assert product in [ - "ATL06", - "ATL07", - "ATL08", - "ATL10", - "ATL12", - "ATL13", - ], "Oops! Elevation visualization only supports products ATL06, ATL07, ATL08, ATL10, ATL12, ATL13; please try another product." + assert ( + product + in [ + "ATL06", + "ATL07", + "ATL08", + "ATL10", + "ATL12", + "ATL13", + ] + ), "Oops! 
Elevation visualization only supports products ATL06, ATL07, ATL08, ATL10, ATL12, ATL13; please try another product." else: raise TypeError("Please enter a product string") return product diff --git a/icepyx/tests/test_Earthdata.py b/icepyx/tests/test_Earthdata.py index 81093f8ff..cfa2eb2c5 100644 --- a/icepyx/tests/test_Earthdata.py +++ b/icepyx/tests/test_Earthdata.py @@ -2,6 +2,7 @@ """ test icepyx.core.query.Query.earthdata_login function """ + import netrc import os import pytest diff --git a/pyproject.toml b/pyproject.toml index 917d03682..4f4c0a88c 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -15,20 +15,20 @@ maintainers = [ ] classifiers=[ - "Development Status :: 4 - Beta", - "Intended Audience :: Science/Research", - "License :: OSI Approved :: BSD License", - "Operating System :: OS Independent", - "Programming Language :: Python :: 3", - "Programming Language :: Python :: 3.7", - "Programming Language :: Python :: 3.8", - "Programming Language :: Python :: 3.9", - "Programming Language :: Python :: 3.10", - "Programming Language :: Python :: 3.11", - "Programming Language :: Python :: 3.12", - "Topic :: Scientific/Engineering", - "Topic :: Scientific/Engineering :: GIS", - "Topic :: Software Development :: Libraries", + "Development Status :: 4 - Beta", + "Intended Audience :: Science/Research", + "License :: OSI Approved :: BSD License", + "Operating System :: OS Independent", + "Programming Language :: Python :: 3", + "Programming Language :: Python :: 3.7", + "Programming Language :: Python :: 3.8", + "Programming Language :: Python :: 3.9", + "Programming Language :: Python :: 3.10", + "Programming Language :: Python :: 3.11", + "Programming Language :: Python :: 3.12", + "Topic :: Scientific/Engineering", + "Topic :: Scientific/Engineering :: GIS", + "Topic :: Software Development :: Libraries", ] [project.urls] @@ -61,3 +61,40 @@ version_file = "_icepyx_version.py" version_file_template = 'version = "{version}"' local_scheme = 
"node-and-date" fallback_version = "unknown" +# [tool.ruff.format] +# docstring-code-format = true +# docstring-code-line-length = "dynamic" + +[tool.ruff.lint] +select = [ + "E", # pycodestyle + "F", # pyflakes +] +ignore = [ + # Line too long + # NOTE: This is a formatting concern. Formatter handles long lines of code, + # but allows inline comments to be infinitely long (automatically formatting + # them can have unintended consequences). In our codebase, we have a lot of + # overlong comments. + # See: https://github.com/psf/black/issues/1713#issuecomment-1357045092 + "E501", + # TODO: remove ignores below this line + # comparison syntax in tests + "E721", + # bare except + "E722", + # unable to detect undefined names + "F403", +] + +[tool.ruff.lint.per-file-ignores] +# Ignore import violations in all `__init__.py` files and doc config +"__init__.py" = ["E402", "F401"] +"doc/source/conf.py" = ["E402", "F401"] + +# Ignore line length in test file containing some very long test strings +"test_granules.py" = ["E501"] +"test_spatial.py" = ["E501"] + +# Ignore too many leading '#' for block comment +"*/tests/*" = ["E266"] diff --git a/requirements-dev.txt b/requirements-dev.txt index 6a0e3eba2..66106dab8 100644 --- a/requirements-dev.txt +++ b/requirements-dev.txt @@ -1,5 +1,3 @@ -black -flake8 pre-commit pypistats pytest>=4.6 From c85baec810b439618b74c7147d4911033c7b840f Mon Sep 17 00:00:00 2001 From: Jessica Scheick Date: Tue, 13 Aug 2024 15:41:47 -0400 Subject: [PATCH 11/11] release v1.2.0 (#551) Co-authored-by: Matt Fisher --- doc/source/user_guide/changelog/index.rst | 10 +++++- doc/source/user_guide/changelog/v1.1.0.rst | 2 +- doc/source/user_guide/changelog/v1.2.0.rst | 41 ++++++++++++++++++++++ 3 files changed, 51 insertions(+), 2 deletions(-) create mode 100644 doc/source/user_guide/changelog/v1.2.0.rst diff --git a/doc/source/user_guide/changelog/index.rst b/doc/source/user_guide/changelog/index.rst index ee5bb11b3..927bc497f 100644 --- 
a/doc/source/user_guide/changelog/index.rst +++ b/doc/source/user_guide/changelog/index.rst @@ -7,9 +7,17 @@ This is the list of changes made to icepyx in between each release. Full details can be found in the `commit logs `_. -Latest Release (Version 1.1.0) +Latest Release (Version 1.2.0) ------------------------------ +.. toctree:: + :maxdepth: 2 + + v1.2.0 + +Version 1.1.0 +------------- + .. toctree:: :maxdepth: 2 diff --git a/doc/source/user_guide/changelog/v1.1.0.rst b/doc/source/user_guide/changelog/v1.1.0.rst index a4c17dbb7..22787fed2 100644 --- a/doc/source/user_guide/changelog/v1.1.0.rst +++ b/doc/source/user_guide/changelog/v1.1.0.rst @@ -89,4 +89,4 @@ Other Contributors ~~~~~~~~~~~~ -.. contributors:: v0.4.0..v0.4.1|HEAD +.. contributors:: v1.0.0..v1.1.0|HEAD diff --git a/doc/source/user_guide/changelog/v1.2.0.rst b/doc/source/user_guide/changelog/v1.2.0.rst new file mode 100644 index 000000000..99d39197d --- /dev/null +++ b/doc/source/user_guide/changelog/v1.2.0.rst @@ -0,0 +1,41 @@ +What's new in 1.2.0 (14 August 2024) +------------------------------------ + +These are the changes in icepyx 1.2.0. See :ref:`release` for a full changelog +including other versions of icepyx. + + +New Features +~~~~~~~~~~~~ + +- Replace `setup.py` with equivalent `pyproject.toml` (#539) +- Fix continuous delivery & docs to account for setup.py -> pyproject.toml change (#541) + + +Bug fixes +~~~~~~~~~ + +- Switch to set comparison to fix a test dependent on unreliable order (#550) + + + +Maintenance +~~~~~~~~~~~ + +- update docstring tests for numpy 2.0 (#537) +- Add Zenodo badge and update all-contributors badge (#536) +- Autofix flake8 ignores E711, E712, E714, F401, F841 with Ruff (#542) +- Switch to ruff (#543) + + +Documentation +~~~~~~~~~~~~~ + +- fix bib entry (#529) +- [docs] update is2 resources (#535) + + +Contributors +~~~~~~~~~~~~ + +.. contributors:: v1.1.0..v1.2.0|HEAD
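Patch 09 above ("Switch to set comparison to fix a test dependent on unreliable order") replaces an order-sensitive list equality with a set comparison. The pattern can be sketched as follows; note this is illustrative only, not icepyx code — `set_params` here is a hypothetical stand-in for a setter that deduplicates its input without preserving order:

```python
def set_params(params):
    # Hypothetical setter: deduplicates via a set, so the output
    # order is an implementation detail and may vary between runs.
    return list(set(params))


result = set_params(["temperature", "salinity"])

# Fragile: list equality depends on set iteration order, which is
# not guaranteed and is exactly what made the original test flaky.
# assert result == ["temperature", "salinity"]

# Robust, as in the patched test: compare as sets, which ignores order.
assert set(result) == {"temperature", "salinity"}
```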
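The `argo.py` hunks in the patches above (and the E711/E714 autofixes noted in the changelog) convert equality checks like `not params is None` into identity checks like `params is not None`. The reason linters insist on this is that `__eq__` can be overridden while identity cannot; a minimal illustration with a deliberately pathological class (an assumption of this sketch, not icepyx code):

```python
class AlwaysEqual:
    # Pathological, for illustration: __eq__ claims equality
    # with everything, including None.
    def __eq__(self, other):
        return True


obj = AlwaysEqual()

# Equality comparison is fooled by the override ...
assert obj == None  # noqa: E711

# ... but an identity check against the None singleton is not.
assert obj is not None
```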