Commit
Merge branch 'dev' into port-example-args-to-yamls
garlic-os authored Jul 8, 2024
2 parents abb93c7 + 4c413ad commit ff860bd
Showing 34 changed files with 627 additions and 221 deletions.
4 changes: 2 additions & 2 deletions .github/workflows/build.yml
@@ -12,13 +12,13 @@ on:

jobs:
call-version-info-workflow:
uses: ASFHyP3/actions/.github/workflows/reusable-version-info.yml@v0.11.1
uses: ASFHyP3/actions/.github/workflows/reusable-version-info.yml@v0.11.2
with:
python_version: '3.10'

call-docker-ghcr-workflow:
needs: call-version-info-workflow
uses: ASFHyP3/actions/.github/workflows/reusable-docker-ghcr.yml@v0.11.1
uses: ASFHyP3/actions/.github/workflows/reusable-docker-ghcr.yml@v0.11.2
with:
version_tag: ${{ needs.call-version-info-workflow.outputs.version_tag }}
release_branch: main
4 changes: 1 addition & 3 deletions .github/workflows/changelog.yml
@@ -13,6 +13,4 @@ on:

jobs:
call-changelog-check-workflow:
uses: ASFHyP3/actions/.github/workflows/reusable-changelog-check.yml@v0.11.1
secrets:
USER_TOKEN: ${{ secrets.GITHUB_TOKEN }}
uses: ASFHyP3/actions/.github/workflows/reusable-changelog-check.yml@v0.11.2
2 changes: 1 addition & 1 deletion .github/workflows/labeled-pr.yml
@@ -12,4 +12,4 @@ on:

jobs:
call-labeled-pr-check-workflow:
uses: ASFHyP3/actions/.github/workflows/reusable-labeled-pr-check.yml@v0.11.1
uses: ASFHyP3/actions/.github/workflows/reusable-labeled-pr-check.yml@v0.11.2
2 changes: 1 addition & 1 deletion .github/workflows/release.yml
@@ -7,7 +7,7 @@ on:

jobs:
call-release-workflow:
uses: ASFHyP3/actions/.github/workflows/reusable-release.yml@v0.11.1
uses: ASFHyP3/actions/.github/workflows/reusable-release.yml@v0.11.2
with:
release_prefix: RAiDER
develop_branch: dev
2 changes: 1 addition & 1 deletion .github/workflows/tag.yml
@@ -7,7 +7,7 @@ on:

jobs:
call-bump-version-workflow:
uses: ASFHyP3/actions/.github/workflows/reusable-bump-version.yml@v0.11.1
uses: ASFHyP3/actions/.github/workflows/reusable-bump-version.yml@v0.11.2
with:
user: dbekaert
email: bekaertdavid@gmail.com
7 changes: 6 additions & 1 deletion CHANGELOG.md
@@ -10,7 +10,11 @@ and uses [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
### Added
* [656](https://github.com/dbekaert/RAiDER/pull/656) - Example run configuration files are now available through `raider.py --generate_config <example_name>`.
### Changed
* [651](https://github.com/dbekaert/RAiDER/pull/651) Removed use of deprecated argument to `pandas.read_csv`.
* [657](https://github.com/dbekaert/RAiDER/pull/657) - Fixed a few typos in `README.md`.
* [651](https://github.com/dbekaert/RAiDER/pull/651) - Removed use of deprecated argument to `pandas.read_csv`.
* [627](https://github.com/dbekaert/RAiDER/pull/627) - Made Python datetimes timezone-aware, added unit tests, and fixed related bugs.
* [662](https://github.com/dbekaert/RAiDER/pull/662) - Require `dem_stitcher>=2.5.6`, which updates the URL for reading the EGM 2008 geoid.
* [661](https://github.com/dbekaert/RAiDER/pull/661) - Fixed a bug in `raiderDownloadGNSS`, removed a call to `scipy.sum`, and added unit tests.
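The timezone-awareness change in [627] matters because Python refuses to order-compare naive and aware datetimes; a minimal illustration (not part of the commit):

```python
from datetime import datetime, timezone

naive = datetime(2020, 1, 1)
aware = naive.replace(tzinfo=timezone.utc)

# attaching a tzinfo labels the clock time without shifting it
assert aware.hour == naive.hour and aware.tzinfo is timezone.utc

# mixing naive and aware datetimes in an ordering comparison raises TypeError
mixed_comparison_fails = False
try:
    naive < aware
except TypeError:
    mixed_comparison_fails = True
assert mixed_comparison_fails
```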

## [0.5.1]
### Changed
@@ -34,6 +38,7 @@ and uses [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
## [0.4.6]

### Added
* Added a check for intermediate grid creation in `_get_delays_on_cube`.
* Adds an `s1_orbits.py` module which includes:
* `get_orbits_from_slc_ids` to download the associated orbit files for a list of Sentinel-1 SLC IDs
* `ensure_orbit_credentials` to ensure ESA CSDE credentials have been provides to download orbit files. This should be called before `sentineleof` is used to download orbits.
2 changes: 1 addition & 1 deletion README.md
@@ -36,7 +36,7 @@ RAiDER does **not** currently run on arm64 processors on Mac. We will update thi

### Installing With Conda

RAiDER is available on [conda-forge](https://anaconda.org/conda-forge/raider). __[Conda](https://docs.conda.io/en/latest/index.html)__ is a cross-platform way to use Python that allows you to setup and use "virtual environments." These can help to keep dependencies for different sets of code separate. We recommend using [Miniforge](https://github.com/conda-forge/miniforge), a conda environment manager that uses conda-forge as its default code repo. Alternatively,see __[here](https://docs.anaconda.com/anaconda/install/)__ for help installing Anaconda and __[here](https://docs.conda.io/en/latest/miniconda.html)__ for installing Miniconda.
RAiDER is available on [conda-forge](https://anaconda.org/conda-forge/raider). __[Conda](https://docs.conda.io/en/latest/index.html)__ is a cross-platform way to use Python that allows you to set up and use "virtual environments." These can help to keep dependencies for different sets of code separate. We recommend using [Miniforge](https://github.com/conda-forge/miniforge), a conda environment manager that uses conda-forge as its default code repo. Alternatively, see __[here](https://docs.anaconda.com/anaconda/install/)__ for help installing Anaconda and __[here](https://docs.conda.io/en/latest/miniconda.html)__ for installing Miniconda.

Installing RAiDER:
```
2 changes: 1 addition & 1 deletion environment.yml
@@ -17,7 +17,7 @@ dependencies:
- cdsapi
- cfgrib
- dask
- dem_stitcher>=2.3.1
- dem_stitcher>=2.5.6
- ecmwf-api-client
- h5netcdf
- h5py
3 changes: 2 additions & 1 deletion test/fake_raytracing
@@ -26,7 +26,8 @@ class MockWeatherModel(WeatherModel):
self._k3 = 1

self._Name = "MOCK"
self._valid_range = (datetime.datetime(1970, 1, 1), "Present")
self._valid_range = (datetime.datetime(1970, 1, 1).replace(tzinfo=datetime.timezone(offset=datetime.timedelta())),
datetime.datetime.now(datetime.timezone.utc))
self._lag_time = datetime.timedelta(days=15)

def _fetch(self, ll_bounds, time, out):
89 changes: 86 additions & 3 deletions test/test_GUNW.py
@@ -3,6 +3,8 @@
import os
import shutil
import unittest

from datetime import datetime
from pathlib import Path

import jsonschema
@@ -18,10 +20,13 @@
from RAiDER import aws
from RAiDER.aria.prepFromGUNW import (
check_hrrr_dataset_availablity_for_s1_azimuth_time_interpolation,
check_weather_model_availability,
check_weather_model_availability, _get_acq_time_from_gunw_id,
get_slc_ids_from_gunw, get_acq_time_from_slc_id
)
from RAiDER.cli.raider import calcDelaysGUNW
from RAiDER.models.customExceptions import NoWeatherModelData
from RAiDER.models.customExceptions import (
NoWeatherModelData, WrongNumberOfFiles,
)


def compute_transform(lats, lons):
@@ -565,7 +570,7 @@ def test_GUNW_workflow_fails_if_a_download_fails(gunw_azimuth_test, orbit_dict_f
'-interp', 'azimuth_time_grid'
]

with pytest.raises(RuntimeError):
with pytest.raises(WrongNumberOfFiles):
calcDelaysGUNW(iargs_1)
RAiDER.s1_azimuth_timing.get_s1_azimuth_time_grid.assert_not_called()

@@ -587,3 +592,81 @@ def test_value_error_for_file_inputs_when_no_data_available(mocker):
with pytest.raises(NoWeatherModelData):
calcDelaysGUNW(iargs)
RAiDER.aria.prepFromGUNW.main.assert_not_called()


def test_get_acq_time_reference():
"""Tests if function extracts acquisition time for reference"""
gunw_id = "S1-GUNW-A-R-106-tops-20220115_20211222-225947-00078W_00041N-PP-4be8-v3_0_0"
expected_time = datetime(2022, 1, 15, 22, 59, 47)
result = _get_acq_time_from_gunw_id(gunw_id, "reference")
assert result == expected_time

def test_get_acq_time_secondary():
"""Tests if function extracts acquisition time for secondary"""
gunw_id = "S1-GUNW-A-R-106-tops-20220115_20211222-225947-00078W_00041N-PP-4be8-v3_0_0"
expected_time = datetime(2021, 12, 22, 22, 59, 47)
result = _get_acq_time_from_gunw_id(gunw_id, "secondary")
assert result == expected_time

def test_invalid_reference_or_secondary():
"""Tests if function raises error for invalid reference_or_secondary value"""
gunw_id = "S1-GUNW-A-R-106-tops-20220115_20211222-225947-00078W_00041N-PP-4be8-v3_0_0"
with pytest.raises(ValueError):
_get_acq_time_from_gunw_id(gunw_id, "invalid")
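The GUNW ID encodes both acquisition dates plus a shared time-of-day. A hypothetical parser (`parse_gunw_times` is illustrative only — the real `_get_acq_time_from_gunw_id` may differ) that reproduces the values the tests above expect:

```python
from datetime import datetime

def parse_gunw_times(gunw_id: str):
    # field 6 holds 'REFDATE_SECDATE', field 7 the common HHMMSS time-of-day
    parts = gunw_id.split('-')
    ref_date, sec_date = parts[6].split('_')
    time_of_day = parts[7]
    fmt = '%Y%m%d%H%M%S'
    return (datetime.strptime(ref_date + time_of_day, fmt),
            datetime.strptime(sec_date + time_of_day, fmt))

gunw_id = "S1-GUNW-A-R-106-tops-20220115_20211222-225947-00078W_00041N-PP-4be8-v3_0_0"
ref, sec = parse_gunw_times(gunw_id)
assert ref == datetime(2022, 1, 15, 22, 59, 47)
assert sec == datetime(2021, 12, 22, 22, 59, 47)
```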


def test_check_hrrr_availability_all_true():
"""Tests if check_hrrr_dataset_availablity_for_s1_azimuth_time_interpolation returns True
when all check_hrrr_dataset_availability return True"""

gunw_id = "S1-GUNW-A-R-106-tops-20220115_20211222-225947-00078W_00041N-PP-4be8-v3_0_0"

# acquisition times for this gunw_id fall within HRRR availability
result = check_hrrr_dataset_availablity_for_s1_azimuth_time_interpolation(gunw_id)
assert result is True

def test_get_slc_ids_from_gunw():
test_path = 'test/gunw_test_data/S1-GUNW-D-R-059-tops-20230320_20220418-180300-00179W_00051N-PP-c92e-v2_0_6.nc'
assert get_slc_ids_from_gunw(test_path, 'reference') == 'S1A_IW_SLC__1SDV_20230320T180251_20230320T180309_047731_05BBDB_DCA0.zip'
assert get_slc_ids_from_gunw(test_path, 'secondary') == 'S1A_IW_SLC__1SDV_20220418T180246_20220418T180305_042831_051CC3_3C47.zip'

with pytest.raises(FileNotFoundError):
get_slc_ids_from_gunw('dummy.nc')

with pytest.raises(ValueError):
get_slc_ids_from_gunw(test_path, 'tertiary')

with pytest.raises(OSError):
get_slc_ids_from_gunw('test/weather_files/ERA-5_2020_01_30_T13_52_45_32N_35N_120W_115W.nc')


def test_get_acq_time_valid_slc_id():
"""Tests if function extracts acquisition time for a valid slc_id"""
slc_id = "S1B_OPER_AUX_POEORB_OPOD_20210731T111940_V20210710T225942_20210712T005942.EOF"
expected_time = pd.Timestamp("20210731T111940")
result = get_acq_time_from_slc_id(slc_id)
assert result == expected_time


def test_get_acq_time_invalid_slc_id():
"""Tests if function raises error for an invalid slc_id format"""
invalid_slc_id = "test/gunw_azimuth_test_data/S1B_OPER_AUX_POEORB_OPOD_20210731T111940_V20210710T225942_20210712T005942.EOF"
with pytest.raises(ValueError):
get_acq_time_from_slc_id(invalid_slc_id)
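One way the timestamp could be pulled from an orbit/SLC file name is sketched below; `slc_timestamp` is hypothetical (the real `get_acq_time_from_slc_id` returns a `pd.Timestamp` and may parse differently), but it reproduces both test cases: anchoring the match at the start of the string makes a path-prefixed name fail.

```python
import re
from datetime import datetime

def slc_timestamp(slc_id: str) -> datetime:
    # take the first YYYYMMDDTHHMMSS token from a name starting with S1A/S1B
    m = re.match(r'S1[AB].*?(\d{8}T\d{6})', slc_id)
    if m is None:
        raise ValueError(f'no acquisition time found in {slc_id!r}')
    return datetime.strptime(m.group(1), '%Y%m%dT%H%M%S')

slc_id = "S1B_OPER_AUX_POEORB_OPOD_20210731T111940_V20210710T225942_20210712T005942.EOF"
assert slc_timestamp(slc_id) == datetime(2021, 7, 31, 11, 19, 40)
```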


def test_check_weather_model_availability():
gunw_id = "test/gunw_test_data/S1-GUNW-D-R-059-tops-20230320_20220418-180300-00179W_00051N-PP-c92e-v2_0_6.nc"
weather_models = ['ERA5', 'GMAO', 'MERRA2', 'HRRR']
for wm in weather_models:
assert check_weather_model_availability(gunw_id, wm)

with pytest.raises(ValueError):
check_weather_model_availability(gunw_id, 'NotAModel')

def test_check_weather_model_availability_2():
gunw_id = "test/gunw_test_data/S1-GUNW-D-R-059-tops-20230320_20220418-180300-00179W_00051N-PP-c92e-v2_0_6.nc"
weather_models = ['ERA5', 'GMAO', 'MERRA2', 'HRRR']
fail_check = [True, True, True, True]
for wm, check in zip(weather_models, fail_check):
assert check_weather_model_availability(gunw_id, wm) == check
10 changes: 8 additions & 2 deletions test/test_datelist.py
@@ -11,7 +11,9 @@ def test_datelist():
os.makedirs(SCENARIO_DIR, exist_ok=False)

dates = ['20200124', '20200130']
true_dates = [datetime.datetime(2020,1,24), datetime.datetime(2020,1,30)]
true_dates = [
datetime.datetime(2020,1,24), datetime.datetime(2020,1,30)
]

dct_group = {
'aoi_group': {'bounding_box': [28, 28.3, -116.3, -116]},
@@ -32,7 +34,11 @@
def test_datestep():
SCENARIO_DIR = os.path.join(TEST_DIR, 'scenario_5')
st, en, step = '20200124', '20200130', 3
true_dates = [datetime.datetime(2020,1,24), datetime.datetime(2020,1,27), datetime.datetime(2020,1,30)]
true_dates = [
datetime.datetime(2020,1,24),
datetime.datetime(2020,1,27),
datetime.datetime(2020,1,30)
]

dct_group = {
'aoi_group': {'bounding_box': [28, 39, -123, -112]},
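The date-list expansion these tests verify can be sketched as follows; `expand_dates` is a hypothetical stand-in for RAiDER's actual option parsing:

```python
from datetime import datetime, timedelta

def expand_dates(start: str, end: str, step_days: int = 1):
    # expand an inclusive YYYYMMDD start/end pair into a list of datetimes
    cur = datetime.strptime(start, '%Y%m%d')
    stop = datetime.strptime(end, '%Y%m%d')
    dates = []
    while cur <= stop:
        dates.append(cur)
        cur += timedelta(days=step_days)
    return dates

assert expand_dates('20200124', '20200130', 3) == [
    datetime(2020, 1, 24), datetime(2020, 1, 27), datetime(2020, 1, 30)
]
```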
101 changes: 101 additions & 0 deletions test/test_downloadGNSS.py
@@ -0,0 +1,101 @@
import os
import pytest
import requests
from unittest import mock

from test import TEST_DIR, pushd
from RAiDER.dem import download_dem
from RAiDER.gnss.downloadGNSSDelays import (
check_url, in_box, fix_lons, get_ID,
download_UNR, main,
)

# Test check_url with a valid and invalid URL
def test_check_url_valid():
valid_url = "https://www.example.com/test.txt"
with mock.patch.object(requests.Session, 'head') as mock_head:
mock_head.return_value.status_code = 200 # Simulate successful response
assert check_url(valid_url) == valid_url

def test_check_url_invalid():
invalid_url = "https://www.not-a-real-website.com/notfound.txt"
with mock.patch.object(requests.Session, 'head') as mock_head:
mock_head.return_value.status_code = 404 # Simulate not found response
assert check_url(invalid_url) == ''


# Test in_box with points inside and outside the box
def test_in_box_inside():
lat = 38.0
lon = -97.0
llbox = [30, 40, -100, -90] # Sample bounding box
assert in_box(lat, lon, llbox)

def test_in_box_outside():
lat = 50.0
lon = -80.0
llbox = [30, 40, -100, -90] # Sample bounding box
assert not in_box(lat, lon, llbox)
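The bounding-box check these tests exercise amounts to a simple interval test; `point_in_box` below is a hypothetical stand-in for `in_box`, assuming `llbox` is ordered `[south, north, west, east]` as the sample values suggest:

```python
def point_in_box(lat: float, lon: float, llbox) -> bool:
    # inclusive containment test against a [S, N, W, E] bounding box
    south, north, west, east = llbox
    return south <= lat <= north and west <= lon <= east

assert point_in_box(38.0, -97.0, [30, 40, -100, -90])
assert not point_in_box(50.0, -80.0, [30, 40, -100, -90])
```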

# Test fix_lons with various longitudes
def test_fix_lons_positive():
lon = 200.0
assert fix_lons(lon) == -160.0

def test_fix_lons_negative():
lon = -220.0
assert fix_lons(lon) == 140.0

def test_fix_lons_positive_180():
lon = 180.0
assert fix_lons(lon) == 180.0

def test_fix_lons_negative_180():
lon = -180.0
assert fix_lons(lon) == -180.0
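A longitude wrap consistent with all four `fix_lons` cases above can be written as below; `wrap_lon` is a hypothetical reconstruction, not necessarily the library's implementation:

```python
def wrap_lon(lon: float) -> float:
    # map into [-180, 180], but keep +180 positive rather than wrapping to -180
    wrapped = ((lon + 180.0) % 360.0) - 180.0
    if wrapped == -180.0 and lon > 0:
        wrapped = 180.0
    return wrapped

assert wrap_lon(200.0) == -160.0
assert wrap_lon(-220.0) == 140.0
assert wrap_lon(180.0) == 180.0
assert wrap_lon(-180.0) == -180.0
```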

# Test get_ID with a valid line
def test_get_ID_valid():
line = "ABCD 35.0 -98.0 100.0"
stat_id, lat, lon, height = get_ID(line)
assert stat_id == "ABCD"
assert lat == 35.0
assert lon == -98.0
assert height == 100.0

# Test get_ID with an invalid line (not enough elements)
def test_get_ID_invalid():
line = "ABCD 35.0" # Missing longitude and height
with pytest.raises(ValueError):
get_ID(line)
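The station-line format assumed by these tests is whitespace-separated `ID lat lon height`; a hypothetical parser matching both the valid and invalid cases (`parse_station_line` is illustrative, not the real `get_ID`):

```python
def parse_station_line(line: str):
    # split a 'ID lat lon height' record; too few fields is a ValueError
    fields = line.split()
    if len(fields) < 4:
        raise ValueError(f'malformed station line: {line!r}')
    return fields[0], float(fields[1]), float(fields[2]), float(fields[3])

assert parse_station_line("ABCD 35.0 -98.0 100.0") == ("ABCD", 35.0, -98.0, 100.0)
```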


def test_download_UNR():
statID = 'MORZ'
year = 2020
outDict = download_UNR(statID, year)
assert outDict['path'] == 'http://geodesy.unr.edu/gps_timeseries/trop/MORZ/MORZ.2020.trop.zip'

def test_download_UNR_2():
statID = 'MORZ'
year = 2000
with pytest.raises(ValueError):
download_UNR(statID, year, download=True)

def test_download_UNR_3():
statID = 'DUMY'
year = 2020
with pytest.raises(ValueError):
download_UNR(statID, year, download=True)

def test_download_UNR_4():
statID = 'MORZ'
year = 2020
with pytest.raises(NotImplementedError):
download_UNR(statID, year, baseURL='www.google.com')
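The archive path asserted in `test_download_UNR` follows a simple template; a hedged reconstruction (`unr_trop_url` is hypothetical — the real `download_UNR` also validates the station, year, and base URL):

```python
def unr_trop_url(stat_id: str, year: int) -> str:
    # UNR troposphere archive layout: <base>/<ID>/<ID>.<year>.trop.zip
    base = 'http://geodesy.unr.edu/gps_timeseries/trop'
    return f'{base}/{stat_id}/{stat_id}.{year}.trop.zip'

assert unr_trop_url('MORZ', 2020) == (
    'http://geodesy.unr.edu/gps_timeseries/trop/MORZ/MORZ.2020.trop.zip'
)
```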


def test_main():
# iargs = None
# main(inps=iargs)
assert True
8 changes: 4 additions & 4 deletions test/test_downloaders.py
@@ -3,7 +3,7 @@

import numpy as np

from datetime import datetime
from datetime import datetime, timedelta, timezone
from test import TEST_DIR

from RAiDER.models.era5 import ERA5
@@ -17,7 +17,7 @@ def test_era5():
wm.set_latlon_bounds(np.array([10, 10.2, -72, -72]))
wm.fetch(
os.path.join(TEST_DIR, 'test_geom', 'test_era5.nc'),
datetime(2020, 1, 1, 0, 0, 0)
datetime(2020, 1, 1, 0, 0, 0).replace(tzinfo=timezone(offset=timedelta()))
)


@@ -27,7 +27,7 @@ def test_era5t():
wm.set_latlon_bounds(np.array([10, 10.2, -72, -72]))
wm.fetch(
os.path.join(TEST_DIR, 'test_geom', 'test_era5t.nc'),
datetime(2020, 1, 1, 0, 0, 0)
datetime(2020, 1, 1, 0, 0, 0).replace(tzinfo=timezone(offset=timedelta()))
)


@@ -37,5 +37,5 @@ def test_erai():
wm.set_latlon_bounds(np.array([10, 10.2, -72, -72]))
wm.fetch(
os.path.join(TEST_DIR, 'test_geom', 'test_erai.nc'),
datetime(2017, 1, 1, 0, 0, 0)
datetime(2017, 1, 1, 0, 0, 0).replace(tzinfo=timezone(offset=timedelta()))
)
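The `.replace(tzinfo=timezone(offset=timedelta()))` pattern used throughout this file attaches a zero-offset timezone to a naive datetime; as a quick sanity check (not part of the commit), it is interchangeable with passing `tzinfo=timezone.utc` directly:

```python
from datetime import datetime, timedelta, timezone

# a zero-offset timezone compares equal to the built-in UTC singleton
t1 = datetime(2020, 1, 1, 0, 0, 0).replace(tzinfo=timezone(offset=timedelta()))
t2 = datetime(2020, 1, 1, tzinfo=timezone.utc)
assert t1 == t2
assert t1.utcoffset() == timedelta(0)
```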
4 changes: 3 additions & 1 deletion test/test_gnss.py
@@ -1,7 +1,7 @@
from RAiDER.models.customExceptions import NoStationDataFoundError
from RAiDER.gnss.downloadGNSSDelays import (
get_stats_by_llh, get_station_list, download_tropo_delays,
filterToBBox,
filterToBBox
)
from RAiDER.gnss.processDelayFiles import (
addDateTimeToFiles,
@@ -15,6 +15,7 @@
import pandas as pd

from test import pushd, TEST_DIR
from unittest import mock
SCENARIO2_DIR = os.path.join(TEST_DIR, "scenario_2")


@@ -163,3 +164,4 @@ def test_filterByBBox2():
assert stat not in new_data['ID'].to_list()
for stat in ['FGNW', 'JPLT', 'NVTP', 'WLHG', 'WORG']:
assert stat in new_data['ID'].to_list()

