==> Discovery-asf_search-8.1.2/.coveragerc <==
[run]
omit = *tests* *examples*
==> Discovery-asf_search-8.1.2/.github/ISSUE_TEMPLATE/bug_report.md <==
---
name: Bug report
about: Create a report to help us improve
title: "[Bug]"
labels: ''
assignees: ''
---
**Describe the bug**
A clear and concise description of what the bug is.
**To Reproduce**
Steps to reproduce the behavior:
1. Go to '...'
2. Click on '....'
3. Scroll down to '....'
4. See error
**Expected behavior**
A clear and concise description of what you expected to happen.
**Screenshots**
If applicable, add screenshots to help explain your problem.
**Desktop (please complete the following information):**
- OS: [e.g. Ubuntu 20.04]
- Python Version: [e.g. python3.11]
- Pip Environment: [output of `python3 -m pip freeze`]
**Additional context**
Add any other context about the problem here.
==> Discovery-asf_search-8.1.2/.github/ISSUE_TEMPLATE/config.yml <==
contact_links:
  - name: Ask Questions
    url: https://github.com/asfadmin/Discovery-asf_search/discussions
    about: Feel free to ask and answer questions in GitHub's Discussions
  - name: Gitter Chat
    url: https://gitter.im/ASFDiscovery/asf_search
    about: Come chat with the asf_search community
==> Discovery-asf_search-8.1.2/.github/ISSUE_TEMPLATE/feature_request.md <==
---
name: Feature request
about: Suggest an idea for this project
title: "[Feature]"
labels: ''
assignees: ''
---
**Is your feature request related to a problem? Please describe.**
A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]
**Describe the solution you'd like**
A clear and concise description of what you want to happen.
**Describe alternatives you've considered**
A clear and concise description of any alternative solutions or features you've considered.
**Additional context**
Add any other context or screenshots about the feature request here.
==> Discovery-asf_search-8.1.2/.github/workflows/changelog.yml <==
name: Update changelog on Releases
on:
  pull_request:
    types:
      - opened
      - labeled
      - unlabeled
      - synchronize
    branches:
      - stable

jobs:
  changelog-updated:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v1
      - name: Changelog check
        uses: Zomzog/changelog-checker@v1.0.0
        with:
          fileName: CHANGELOG.md
          noChangelogLabel: bumpless
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
==> Discovery-asf_search-8.1.2/.github/workflows/label-prod-pr.yml <==
name: Check for required labels
on:
  pull_request:
    types:
      - opened
      - reopened
      - labeled
      - unlabeled
      - synchronize
    branches:
      - stable

jobs:
  check-version-label:
    runs-on: ubuntu-latest
    if: github.event.pull_request.state == 'open'
    steps:
      - name: Require Version Label
        uses: mheap/github-action-required-labels@v1
        with:
          mode: exactly
          count: 1
          labels: "major, minor, patch, bumpless"
==> Discovery-asf_search-8.1.2/.github/workflows/lint.yml <==
on: push
jobs:
  lint:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: chartboost/ruff-action@v1
        with:
          src: './asf_search'
==> Discovery-asf_search-8.1.2/.github/workflows/prod-request-merged.yml <==
name: Merged to Stable
on:
  pull_request:
    types: [closed]
    branches:
      - stable

jobs:
  OpenRequest:
    runs-on: ubuntu-latest
    # Run if a merge request triggered the push AND that request DOESN'T carry the
    # 'bumpless' label. (All three version labels are checked, instead of 'not
    # bumpless', because if an admin overrides the tests the request might not
    # have ANY labels at that point.)
    if: >
      github.event.pull_request.merged &&
      (
        contains(github.event.pull_request.labels.*.name, 'patch') ||
        contains(github.event.pull_request.labels.*.name, 'minor') ||
        contains(github.event.pull_request.labels.*.name, 'major')
      )
    steps:
      - uses: actions/checkout@v2
      - name: Save version type
        # Whichever expression returns true lets its 'echo' statement run.
        # Each line must be wrapped in "(*) || true" to prevent the step from
        # exiting on failure, until 'allow-failure' is finished being added:
        # https://github.com/actions/toolkit/issues/399
        run: |
          (${{ contains(github.event.pull_request.labels.*.name, 'patch') }} && echo "version_type=patch" >> $GITHUB_ENV) || true
          (${{ contains(github.event.pull_request.labels.*.name, 'minor') }} && echo "version_type=minor" >> $GITHUB_ENV) || true
          (${{ contains(github.event.pull_request.labels.*.name, 'major') }} && echo "version_type=major" >> $GITHUB_ENV) || true
      - name: Create a Release
        uses: zendesk/action-create-release@v1
        env:
          # NOT the built-in token, so this can trigger other actions:
          GITHUB_TOKEN: ${{ secrets.DISCO_GITHUB_MACHINE_USER }}
        with:
          # version_type is populated by the 'Save version type' step above
          auto_increment_type: "${{ env.version_type }}"
          tag_schema: semantic
          draft: false
          prerelease: false
          body: "${{ github.event.pull_request.body }}"
==> Discovery-asf_search-8.1.2/.github/workflows/pypi-publish.yml <==
# This workflow will upload a Python Package using Twine when a release is created
# For more information see: https://help.github.com/en/actions/language-and-framework-guides/using-python-with-github-actions#publishing-to-package-registries
name: Upload Python Package

on:
  release:
    types: [created]
    branches:
      - stable

jobs:
  DeployToPypi:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Install dependencies
        run: python3 -m pip install --upgrade pip build
      - name: Build package
        run: python3 -m build .
      - name: Publish package
        uses: pypa/gh-action-pypi-publish@bea5cda687c2b79989126d589ef4411bedce0195
        with:
          user: __token__
          password: ${{ secrets.PYPI_TOKEN }}
==> Discovery-asf_search-8.1.2/.github/workflows/run-pytest.yml <==
name: tests
on: [push]

jobs:
  run-tests:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-python@v5
        with:
          python-version: '3.9'
      - name: Install Dependencies
        run: |
          python3 -m pip install --upgrade pip
          python3 -m pip install .[extras,test]
      - name: Run Tests
        run: python3 -m pytest -n auto --cov=asf_search --cov-report=xml --dont-run-file test_known_bugs .
      - name: Upload coverage to Codecov
        uses: codecov/codecov-action@v3
        with:
          fail_ci_if_error: false
          files: ./coverage.xml
          flags: unittests
          name: asf_admin pytest
          verbose: true
==> Discovery-asf_search-8.1.2/.gitignore <==
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class
# C extensions
*.so
# Distribution / packaging
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
pip-wheel-metadata/
share/python-wheels/
*.egg-info/
.installed.cfg
*.egg
MANIFEST
# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec
# Installer logs
pip-log.txt
pip-delete-this-directory.txt
# Unit test / coverage reports
htmlcov/
.tox/
.nox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
*.py,cover
.hypothesis/
.pytest_cache/
# Translations
*.mo
*.pot
# Django stuff:
*.log
local_settings.py
db.sqlite3
db.sqlite3-journal
# Flask stuff:
instance/
.webassets-cache
# Scrapy stuff:
.scrapy
# Sphinx documentation
docs/_build/
# PyBuilder
target/
# Jupyter Notebook
.ipynb_checkpoints
# IPython
profile_default/
ipython_config.py
# pyenv
.python-version
# pipenv
# According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control.
# However, in case of collaboration, if having platform-specific dependencies or dependencies
# having no cross-platform support, pipenv may install dependencies that don't work, or not
# install all needed dependencies.
#Pipfile.lock
# PEP 582; used by e.g. github.com/David-OConnor/pyflow
__pypackages__/
# Celery stuff
celerybeat-schedule
celerybeat.pid
# SageMath parsed files
*.sage.py
# Environments
.env
.venv
env/
venv/
ENV/
env.bak/
venv.bak/
# Spyder project settings
.spyderproject
.spyproject
# Rope project settings
.ropeproject
# mkdocs documentation
/site
# mypy
.mypy_cache/
.dmypy.json
dmypy.json
# Pyre type checker
.pyre/
# VS Code
.vscode/
search_results.csv
search_results.metalink
==> Discovery-asf_search-8.1.2/CHANGELOG.md <==
# Changelog
All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [PEP 440](https://www.python.org/dev/peps/pep-0440/)
and uses [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
------
## [v8.1.2](https://github.com/asfadmin/Discovery-asf_search/compare/v8.1.1...v8.1.2)
### Added
- Added NISAR search parameters `frameCoverage`, `jointObservation`, `mainBandPolarization`, `sideBandPolarization`, `rangeBandwidth`.
- Updated `NISARProduct` to include these new searchable fields in `properties` dictionary
- Include new NISAR fields in jsonlite & jsonlite2 output
------
## [v8.1.1](https://github.com/asfadmin/Discovery-asf_search/compare/v8.1.0...v8.1.1)
### Fixed
- SLC Burst product urls are now searchable with `find_urls()`
------
## [v8.1.0](https://github.com/asfadmin/Discovery-asf_search/compare/v8.0.1...v8.1.0)
### Added
- Adds `ASFSearchResults.find_urls()` and `ASFProduct.find_urls()` to gather urls/uris from results by extension and/or regex pattern
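The extension/regex filtering this adds can be illustrated with a plain-Python sketch (the `find_urls` helper below is a stand-in for the library method, and the URLs are made up):

```python
import re
from typing import List, Optional

def find_urls(urls: List[str], extension: Optional[str] = None,
              pattern: Optional[str] = None) -> List[str]:
    """Filter a list of urls by file extension and/or a regex pattern."""
    results = urls
    if extension is not None:
        results = [u for u in results if u.endswith(extension)]
    if pattern is not None:
        regex = re.compile(pattern)
        results = [u for u in results if regex.search(u)]
    return results

urls = [
    "https://datapool.asf.alaska.edu/S1A_IW_SLC.zip",
    "https://datapool.asf.alaska.edu/S1A_IW_SLC.iso.xml",
]
zips = find_urls(urls, extension=".zip")
```

With no filters, all urls pass through; with both, a url must satisfy the extension check and match the pattern.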
### Changed
- Changed log level from warning to debug/info for search timing log messages
- Raised minimum Python version to 3.9 from 3.8, which reached EOL last year (see the official [Status of Python versions](https://devguide.python.org/versions/) for the Python version release cycle)
------
## [v8.0.1](https://github.com/asfadmin/Discovery-asf_search/compare/v8.0.0...v8.0.1)
### Fixed
- Fixed setting end date timezone when translating search opts to CMR opts
------
## [v8.0.0](https://github.com/asfadmin/Discovery-asf_search/compare/v7.1.0...v8.0.0)
### Added
- Added `asf.ASFSearchOptions(circle=[lat, long, radius])` search param. Takes list of exactly 3 numbers.
- Exposed `asf.validator_map`, which, given an `opts` search param, can be used to look up which method it will be validated against.
- Exposed `ASFProduct.get_urls`, which returns the URLs for its products directly. Which products are included can be controlled with the `fileType` enum.
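The `circle` constraint takes a list of exactly three numbers; a minimal validator sketch (a hypothetical helper shown only to illustrate that rule, not the library's actual validator — the radius units are not specified in this entry):

```python
from numbers import Number

def parse_circle(value):
    """Validate a circle search value: a list of exactly three numbers
    (latitude, longitude, radius). Raises ValueError otherwise."""
    if not isinstance(value, (list, tuple)) or len(value) != 3:
        raise ValueError("circle must be a list of exactly 3 numbers")
    if not all(isinstance(v, Number) for v in value):
        raise ValueError("circle values must be numeric")
    lat, long_, radius = value
    return {"lat": float(lat), "long": float(long_), "radius": float(radius)}

circle = parse_circle([64.8, -147.7, 10000])
```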
### Removed
- Removed the `get_property_paths()` static method from `ASFProduct`; `_base_properties` is used directly instead
------
## [v7.1.4](https://github.com/asfadmin/Discovery-asf_search/compare/v7.1.3...v7.1.4)
### Changed
- Replaced the `ciso8601` package with `dateutil` for package wheel compatibility. `ciso8601` is still used when installed via the `extra` dependency
### Fixed
- Fixes syntax warning with escaped slash in `translate.py`
------
## [v7.1.3](https://github.com/asfadmin/Discovery-asf_search/compare/v7.1.2...v7.1.3)
### Fixed
- Adds missing values for polarization constants `DUAL_HH`, `DUAL_VV`, `DUAL_HV`, `DUAL_VH`, `HH_3SCAN`, `HH_4SCAN`, `HH_5SCAN`
- processingLevel `RAW` now includes `C1234413256-ASFDEV` in collection alias list (`SENTINEL-1B_RAW`'s collection for ASFDEV provider)
------
## [v7.1.2](https://github.com/asfadmin/Discovery-asf_search/compare/v7.1.1...v7.1.2)
### Fixed
- `OPERAS1Product` subclass now properly assigned to PGE v2.0.1 results
### Changed
- `ARIAS1GUNWProduct.is_ARIAS1GUNWProduct()` removed, replaced with `ASFProduct._is_subclass()` implementation
------
## [v7.1.1](https://github.com/asfadmin/Discovery-asf_search/compare/v7.1.0...v7.1.1)
### Changed
- Uses `ciso8601.parse_datetime()` in baseline calculations, speeds up calculations on larger stacks
### Added
- Adds `ASF_LOGGER` logging in `search_generator()` and related methods
### Fixed
- `ASFProduct.get_sort_keys()` no longer returns `None` if a sort key is missing; it now defaults to an empty string
------
## [v7.1.0](https://github.com/asfadmin/Discovery-asf_search/compare/v7.0.9...v7.1.0)
### Added
- Improved logging in `ASFSession` authentication methods
### Changed
- Uses `ciso8601` module for parsing dates from CMR response, significant performance improvement post-query
- `ASFSession` now allows for authorized user access to hidden/restricted CMR datasets via `auth_with_creds()` or `auth_with_cookiejar()` authentication methods (previously only supported via `auth_with_token()` method)
- `ASFSession.auth_with_token()` now authenticates directly against EDL endpoint
- UMM Platform ShortName used as final fallback criteria for product subclass assignment
------
## [v7.0.9](https://github.com/asfadmin/Discovery-asf_search/compare/v7.0.8...v7.0.9)
### Changed
- collection "ARIA_S1_GUNW" added to `ARIA_S1_GUNW` dataset, V3 products now loaded as `ARIAS1GUNWProduct` subclass
- `ARIAS1GUNWProduct` now exposes `ariaVersion` and (for V3 products) `inputGranules` in `ARIAS1GUNWProduct.properties`
------
## [v7.0.8](https://github.com/asfadmin/Discovery-asf_search/compare/v7.0.7...v7.0.8)
### Added
- `s3Urls` property added to `S1Product`, `OPERAS1Product`, and `NISARProduct` types, exposing direct access S3 links
------
## [v7.0.7](https://github.com/asfadmin/Discovery-asf_search/compare/v7.0.6...v7.0.7)
### Added
- Adds `cmr_keywords` search keyword, enables passing CMR format strings in search directly
- Adds `shortName` keyword, for use with lists of collection short names
### Changed
- Allows using `dataset` and `platform` in same search
------
## [v7.0.6](https://github.com/asfadmin/Discovery-asf_search/compare/v7.0.5...v7.0.6)
### Changed
- timestamps while building queries and reading results from CMR now use UTC if no timezone is provided
- Changed which collections the `NISAR` dataset and platform collection lists point at.
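The UTC-default rule for naive timestamps can be sketched as follows (an illustrative helper, not the library's code):

```python
from datetime import datetime, timezone

def ensure_utc(dt: datetime) -> datetime:
    """Treat naive timestamps as UTC; leave timezone-aware ones unchanged."""
    if dt.tzinfo is None:
        return dt.replace(tzinfo=timezone.utc)
    return dt

start = ensure_utc(datetime(2023, 6, 1, 12, 30))  # naive -> assumed UTC
```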
------
## [v7.0.5](https://github.com/asfadmin/Discovery-asf_search/compare/v7.0.4...v7.0.5)
### Added
- Adds basic NISAR dataset search and product functionality for test data
------
## [v7.0.4](https://github.com/asfadmin/Discovery-asf_search/compare/v7.0.3...v7.0.4)
### Changed
- `OPERA-S1-CALIBRATION` dataset is now the `OPERA-S1-CALVAL` dataset, uses the `OPERA_S1_CALVAL` constant
------
## [v7.0.3](https://github.com/asfadmin/Discovery-asf_search/compare/v7.0.2...v7.0.3)
### Fixed
- Fixes typo for constant variable name `constants.PRODUCT_TYPE.CSLC_STATIC`
- Normalizes concept-id lists for `OPERA-S1` dataset product types
### Changed
- Completely removes `CSLC-STATIC` Calval and `RTC-STATIC` Calval collections from concept-id lists
------
## [v7.0.2](https://github.com/asfadmin/Discovery-asf_search/compare/v7.0.1...v7.0.2)
### Added
- Adds `AUTH_COOKIES` to `constants.INTERNAL` and `auth_cookie_names` variable for `ASFSession`, used by `auth_with_creds()` and `auth_with_cookiejar()` to confirm login.
### Fixed
- Attempting to authorize `ASFSession` against CMR UAT using `auth_with_creds()` and `auth_with_cookiejar()` no longer raises an exception on valid login
- Fixes custom host in `ASFSearchOptions` raising type error while searching.
------
## [v7.0.1](https://github.com/asfadmin/Discovery-asf_search/compare/v7.0.0...v7.0.1)
### Fixed
- Fixed `OPERA-S1-CALIBRATION` dataset products raising error during search.
------
## [v7.0.0](https://github.com/asfadmin/Discovery-asf_search/compare/v6.7.3...v7.0.0)
### Added
- `ASFProduct` now has 13 subclasses for different sub-products that correspond to datasets:
- `S1Product`, `S1BurstProduct`, `OPERAS1Product`, `ARIAS1GUNWProduct`, `ALOSProduct`, `RADARSATProduct`, `AIRSARProduct`, `ERSProduct`, `JERSProduct`, `UAVSARProduct`, `SIRCProduct`, `SEASATProduct`, `SMAPProduct`
- Each subclass defines relevant keys to pull from `umm` response, reducing the amount of irrelevant values in `properties` dict for certain product types
- Adds `collectionAlias` to `ASFSearchOptions` validator map as config param. Set to `False` to disable concept-id aliasing behaviour for `processingLevel` and `platform`.
- Adds warning when scenes in stack are missing state vectors, and logs baseline warnings with `ASF_LOGGER`
- Adds `OPERA-S1-CALIBRATION` entry to `dataset_collections` and corresponding `OPERA_S1_CALIBRATION` constant to `DATASET.py`, used to search for OPERA-S1 `CSLC` and `RTC` calibration data.
### Changed
- `remotezip` is now an optional dependency of asf-search's pip and conda installs (pip install example: `python3 -m pip install asf-search[extras]`)
- Constants are no longer a top-level import; they are now accessible through their respective modules
- `processingLevel` and `platform` are now aliased by collection concept-ids (lists of concept-ids by their processing levels/platforms are viewable in `dataset.py`), improving search performance and bypassing the subquery system
- Baseline stacking no longer excludes products with missing state vectors from final stack, like SearchAPI
- `OPERA-S1` dataset no longer includes calibration data (moved to new dataset)
- Adds optional `ASFSession` constructor keyword arguments for new class variables:
- `edl_host`
- `edl_client_id`
- `asf_auth_host`
- `cmr_host`
- `cmr_collections`
- `auth_domains`
- `ASFSession` imports `asf_search.constants.INTERNAL` in constructor call
- `ASFSession` methods `auth_with_creds()`, `auth_with_token()`, and `rebuild_auth()` use new class variables instead of constants
------
## [v6.7.3](https://github.com/asfadmin/Discovery-asf_search/compare/v6.7.2...v6.7.3)
### Added
- Adds OPERA-S1 constants `RTC`, `RTC_STATIC` (RTC-STATIC), `CSLC`, `CSLC_STATIC` (CSLC-STATIC) to `PRODUCT_TYPE.py`
### Fixed
- Harmonizes `search()`, `geo_search()`, and `search_count()` parameters
- Updates python version requirement in `setup.py` to 3.8+
### Changed
- search method params with `Iterable` type hinting now changed to `Sequence`
- search method param validators updated to support `Sequence` type
------
## [v6.7.2](https://github.com/asfadmin/Discovery-asf_search/compare/v6.7.1...v6.7.2)
### Added
- Adds constants for `dataset` keyword, under `asf_search.DATASET`
- Adds CALVAL concept-ids to 'OPERA-S1' dataset
- Adds `validityStartDate` for applicable OPERA-S1 products
### Fixed
- Fixes OPERA-S1 dataset `RTC-STATIC` and `CSLC-STATIC` breaking returned results, sorts by `validityStartDate` in place of `stopTime`
------
## [v6.7.1](https://github.com/asfadmin/Discovery-asf_search/compare/v6.7.0...v6.7.1)
### Fixed
- Fixes issue with certain S1 products not stacking properly in certain environments, which caused null `perpendicularBaseline` values
------
## [v6.7.0](https://github.com/asfadmin/Discovery-asf_search/compare/v6.6.3...v6.7.0)
### Added
- Adds new `dataset` keyword to `search()` as an alternative to `platform`. Allows users to get results from multiple platforms at once in a single page
- Adds `operaBurstID` keyword to `search()`
- Adds OPERA-S1 param `operaBurstID` to `ASFProduct.properties`, and adds Opera product urls to `additionalUrls`
- OPERA-S1 RTC product `polarization` now shows both polarizations as list
- adds `frameNumber` properties support for new `Sentinel-1 Interferogram` products
- added `CMR_TIMEOUT` constant. This is the amount of time in seconds to wait without seeing *any* data. (Default=30)
### Changed
- Changes `CMR_FORMAT_EXT` constant from `umm_json_v1_4` to `umm_json`, umm returned from CMR will now be in latest umm format by default
### Fixed
- ERS-1, ERS-2, JERS-1, and RADARSAT-1 now assign `FRAME_NUMBER` to the `frameNumber` properties field
------
## [v6.6.3](https://github.com/asfadmin/Discovery-asf_search/compare/v6.6.2...v6.6.3)
### Fixed
- Fixes type hinting compatibility break introduced in v6.6.2 in `search_generator.py` for Python versions < v3.9
------
## [v6.6.2](https://github.com/asfadmin/Discovery-asf_search/compare/v6.6.1...v6.6.2)
### Added
- Adds new `CMRIncompleteError` exception, raised by search methods when CMR returns an incomplete page
### Fixed
- Fixes bug in `search_generator()` causing results to sometimes wrongly be marked as incomplete
### Changed
- `stack_from_id()` now raises if results are incomplete, before checking if reference was found
------
## [v6.6.1](https://github.com/asfadmin/Discovery-asf_search/compare/v6.6.0...v6.6.1)
### Added
- Adds automated release notes
### Fixed
- `filename` can be used again with the `ASFProduct.download()` method (ignored if multiple files are to be downloaded)
------
## [v6.6.0](https://github.com/asfadmin/Discovery-asf_search/compare/v6.5.0...v6.6.0)
### Added
- Adds `fileType` param to the `ASFProduct` and `ASFSearchResults` download methods. Lets users download burst .xml and/or .tiff files from the burst extractor with the `FileDownloadType` enum (`DEFAULT_FILE`, `ADDITIONAL_FILES`, `ALL_FILES`)
### Fixed
- Fixes typo in convex hull warning message
------
## [v6.5.0](https://github.com/asfadmin/Discovery-asf_search/compare/v6.4.0...v6.5.0)
### Added
- Adds `collections` search keyword, letting results be limited to the provided concept-ids
- Adds `temporalBaselineDays` search keyword, allows searching `Sentinel-1 Interferogram (BETA)` products by their temporal baseline
### Changed
- `search_generator()` now uses tenacity library to poll CMR
- moves/re-organizes certain constant url fields to `INTERNAL.py`
### Fixed
- TimeoutErrors now properly caught and logged
------
## [v6.4.0](https://github.com/asfadmin/Discovery-asf_search/compare/v6.3.1...v6.4.0)
### Added
- Burst product downloads now supported
- `IPFVersion` field added to `ASFProduct` properties
### Fixed
- `BURST` product `url`, `fileName`, and `bytes` properties populated again
- `search_count()` now uses `ASFSearchOptions.host` when building query url
### Changed:
- `BURST` product baseline stacking now uses `fullBurstID` and `polarization` for getting the initial stack
- Changed order of entries in `ASFSession`'s `User-Agent` header
- `BURST` `filename` field uses "`sceneName`.`extension`" format
------
## [v6.3.1](https://github.com/asfadmin/Discovery-asf_search/compare/v6.3.0...v6.3.1)
### Changed
- Changed `CMR_PAGE_SIZE` constant from 500 to 250
------
## [v6.3.0](https://github.com/asfadmin/Discovery-asf_search/compare/v6.2.0...v6.3.0)
### Added
- `BURST` product temporal/perpendicular baseline stacking now supported
- Added searchable burst keyword params, `relativeBurstID`, `absoluteBurstID`, and `fullBurstID`
### Changed
- `validate_wkt()` now returns both wrapped and unwrapped wkts along with repair reports.
- asf-search now sends the wrapped wkt to CMR when using the `intersectsWith` keyword
- Removed `burstAnxTime`, `timeFromAnxSeconds`
- Added `azimuthAnxTime`, `azimuthTime`
------
## [v6.2.0](https://github.com/asfadmin/Discovery-asf_search/compare/v6.1.0...v6.2.0)
### Added
- `search_generator()` returns a generator, which returns results from CMR page-by-page, yielding each page as an `ASFSearchResults` object. See /examples/1-Basic_Overview.ipynb for an example.
- The generator can be passed to different output formats via `results_to_[format]()` methods, allowing users to stream results to different format strings as they're received from CMR
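The page-by-page pattern can be sketched with a stub generator (the CMR interaction is stubbed out here; only the streaming shape is shown):

```python
def search_generator(pages):
    """Stand-in for asf_search.search_generator(): yields one page of
    results at a time instead of accumulating everything first."""
    for page in pages:
        yield page

# Stream pages as they arrive; nothing beyond the current page is held.
fake_pages = [["granuleA", "granuleB"], ["granuleC"]]
total = 0
for page in search_generator(fake_pages):
    total += len(page)
```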
### Changed
- Removed Jinja2 as a dependency for metalink, kml, and csv output formats.
------
## [v6.1.0](https://github.com/asfadmin/Discovery-asf_search/compare/v6.0.2...v6.1.0)
### Added
- Burst metadata available in `ASFProduct.properties['burst']`, also available in `csv`, `kml`, `jsonlite`, and `jsonlite2` output formats.
- Added `BURST` to `PRODUCT_TYPE.py` constants
- Added python `logging` support, for easier debugging and reporting when using asf_search inside an application.
### Changed
- Decreased the scope of tested platforms used in platform test cases
### Fixed
- Adds markupsafe<=2.0.1 as package requirement (Jinja2 requires this version)
- CMR url will now actually use the `host` property in `ASFSearchOptions` object
------
## [v6.0.2](https://github.com/asfadmin/Discovery-asf_search/compare/v6.0.1...v6.0.2)
### Fixed
- Fixed Setuptools not including csv, kml, and metalink export templates
------
## [v6.0.1](https://github.com/asfadmin/Discovery-asf_search/compare/v6.0.0...v6.0.1)
### Fixed
- `csv()`, `metalink()`, and `kml()` output formats should now work properly when installed from pip
------
## [v6.0.0](https://github.com/asfadmin/Discovery-asf_search/compare/v5.1.2...v6.0.0)
### Added
- Search errors are now automatically reported to ASF, users can opt out by changing `asf_search.REPORT_ERRORS` after import
- Example and information available in "Usage" section of /examples/1-Basic_Overview.ipynb
- `ASFSearchResults` now has `raise_if_incomplete()` method, raises `ASFSearchError()` if a search encountered an error and was unable to return all results from CMR
- `ASFProduct` now has a `remotezip()` method, which takes a user's pre-authenticated `ASFSession` and returns a `RemoteZip` object. This can be used to list and download specific files from a product's zip archive, rather than the whole zip file.
- Example available in /examples/5-Download.ipynb
- see https://github.com/gtsystem/python-remotezip for further details on how to use the `RemoteZip` class.
- Adds `GRD_FD`, `PROJECTED_ML3X3`, `THREEFP` product type constants.
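The `RemoteZip` object follows the standard-library `zipfile.ZipFile` interface, so listing and reading individual members looks the same; the sketch below uses an in-memory archive in place of a remote product zip (member names are made up):

```python
import io
import zipfile

# An in-memory zip standing in for a remote product archive.
buffer = io.BytesIO()
with zipfile.ZipFile(buffer, "w") as zf:
    zf.writestr("annotation/manifest.safe", "<xml/>")
    zf.writestr("measurement/band.tiff", "binary-data")

# RemoteZip exposes the same namelist()/read() calls as ZipFile,
# but only fetches the byte ranges it needs from the remote archive.
with zipfile.ZipFile(buffer) as zf:
    names = zf.namelist()
    manifest = zf.read("annotation/manifest.safe")
```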
### Changed
- While returning results, `search()` will no longer throw. Instead, `search()` will retry the request 3 times. If all 3 attempts fail:
- `search()` will return the results it found before the search error
- An error will be logged warning the user, and the returned results will be marked as incomplete. Use `raise_if_incomplete()` to raise an error when the returned `ASFSearchResults` are incomplete.
------
## [5.1.2](https://github.com/asfadmin/Discovery-asf_search/compare/v5.1.0...v5.1.2)
### Changed
- `CMR_PAGE_SIZE` reduced from 2000 to 500
------
## [5.1.0](https://github.com/asfadmin/Discovery-asf_search/compare/v5.0.2...v5.1.0)
### Added
- Adds export support to ASFSearchResults for `csv`, `jsonlite`, `jsonlite2`, `kml`, `metalink`
- example available in "Output" section of /examples/1-Basic_Overview.ipynb
- Adds `beamSwath` as a searchable parameter
### Fixed
- `count()` type hinting changed to `int`
### Changed
- Improved testing coverage of `ASFSearchResults`
------
## [5.0.2](https://github.com/asfadmin/Discovery-asf_search/compare/v5.0.1...v5.0.2)
### Fixed
- non-rectangular polygons are now sent to CMR instead of their bounding boxes
------
## [5.0.1](https://github.com/asfadmin/Discovery-asf_search/compare/v5.0.0...v5.0.1)
### Changed
- `ASFProduct` is now aware of the session used during search (if available) and will use that by default to download. A session can still be explicitly provided as before.
- `ASFProduct.stack()` now uses the session provided via the opts argument. If none is provided, it will use the session referenced by `ASFProduct.session`.
- `ASFProduct` more gracefully handles missing or malformed metadata during instantiation.
------
## [5.0.0](https://github.com/asfadmin/Discovery-asf_search/compare/v4.0.3...v5.0.0)
### Changed
- `asf_search` now searches CMR directly, no longer relying on ASF's SearchAPI
- This should significantly improve reliability and performance
- With this change, ALL metadata fields provided by CMR's UMM JSON format are now available through `ASFProduct`.
- All metadata fields previously available through `ASFProduct.properties` remain where they are
- For those and any other fields, the full CMR `umm` and `meta` records are available through `ASFProduct.umm` and `ASFProduct.meta` respectively
- Some geojson fields were previously presented as strings; they are now more appropriate types such as `int` or `float`:
- `bytes`, `centerLat`, `centerLon`, `frame`, `offNadirAngle`, `orbit`, `pathNumber`
- Timestamps in geojson fields now include an explicit `Z` time zone indicator.
- `ASFSearchOptions.reset()` has been renamed to `reset_search()` for clarity of purpose and to make room for future similar functionality regarding search opts configuration.
- `search()` (and related functions) now return results pre-sorted, most recent first
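The type coercions described above can be sketched as follows (the numeric field list comes from this entry; the per-field `int`/`float` mapping and the choice of `startTime`/`stopTime` as the timestamp fields are assumptions for illustration):

```python
NUMERIC_FIELDS = {
    "bytes": int, "centerLat": float, "centerLon": float,
    "frame": int, "offNadirAngle": float, "orbit": int, "pathNumber": int,
}

def coerce_properties(props: dict) -> dict:
    """Cast stringly-typed geojson fields to int/float and append an
    explicit 'Z' time zone indicator to bare timestamps."""
    out = dict(props)
    for field, cast in NUMERIC_FIELDS.items():
        if field in out:
            out[field] = cast(out[field])
    for field in ("startTime", "stopTime"):
        value = out.get(field)
        if isinstance(value, str) and not value.endswith("Z"):
            out[field] = value + "Z"
    return out
```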
------
## [4.0.3](https://github.com/asfadmin/Discovery-asf_search/compare/v4.0.2...v4.0.3)
### Fixed
- `product_search()` now assigns `product_list` parameter to `ASFSearchOptions.product_list` instead of `ASFSearchOptions.granule_list`
------
## [4.0.2](https://github.com/asfadmin/Discovery-asf_search/compare/v4.0.1...v4.0.2)
### Changed
- Removed `scikit-learn` module as a dependency, greatly reducing install footprint
- Simplified AOI refinement:
- AOIs are iteratively simplified with an increasing threshold; that threshold now starts at 0.004
- AOIs with an MBR <= 0.004 in lat/lon are collapsed to a single point
- AOIs with an MBR <= 0.004 in either lat or lon are collapsed to a line along the center of the rectangle
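The collapse rules above can be sketched as (an illustrative helper, not the library's implementation):

```python
def simplify_aoi(min_lon, min_lat, max_lon, max_lat, threshold=0.004):
    """Collapse a tiny minimum bounding rectangle (MBR):
    - both dimensions <= threshold -> a single center point
    - one dimension <= threshold  -> a line along the rectangle's center
    - otherwise                   -> the rectangle is kept as-is
    """
    width = max_lon - min_lon
    height = max_lat - min_lat
    cx = (min_lon + max_lon) / 2
    cy = (min_lat + max_lat) / 2
    if width <= threshold and height <= threshold:
        return ("point", (cx, cy))
    if height <= threshold:
        return ("line", [(min_lon, cy), (max_lon, cy)])
    if width <= threshold:
        return ("line", [(cx, min_lat), (cx, max_lat)])
    return ("polygon", (min_lon, min_lat, max_lon, max_lat))
```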
------
## [4.0.1](https://github.com/asfadmin/Discovery-asf_search/compare/v4.0.0...v4.0.1)
### Changed
- Removed WKTUtils module as a dependency, that functionality is now directly included
------
## [4.0.0](https://github.com/asfadmin/Discovery-asf_search/compare/v3.0.4...v4.0.0)
### Added
- `ASFSearchOptions`: This class provides a number of useful ways to build search results
- Search parameters are immediately validated upon object creation/edit instead of at search time, which should lead to fewer errors at search time
- All search functions allow both the previous style of keyword arguments, as well as simply passing in an ASFSearchOptions object using the `opts` keyword arg. `opts` is always optional.
- If both approaches are used, the two are merged, with specific keyword args superseding the options in the object
- Most search functions now expect only their specific parameters, and an optional `opts` parameter. This allows simple usage in most cases, while the `opts` parameter provides access to advanced behavior or alternate workflows.
- Internally, all search functions work by passing ASFSearchOptions objects. This allows consistency when working with differently-configured search environments, such as in development.
- `ASFSearchResults` objects now include a `searchOptions` property, which describes the search used to create those results. This object can be copied, altered, used for subsequent searches, etc.
- When downloading, `ASFSearchResults` and `ASFProduct` default to use the session inside `searchOptions`, so you don't have to pass the same session in for both fetching and downloading results.
- Exposed `get_stack_opts()` to support more approaches for building insar stacks.
- `get_stack_opts()` accepts an `ASFProduct` as a stack reference and returns the ASFSearchOptions object that would be used to build a corresponding insar stack
- A matching convenience method has been added to `ASFProduct`
- Supports the new `opts` argument described above.
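The merge rule — explicit keyword args supersede the options object — can be sketched with plain dicts (illustrative only; `ASFSearchOptions` is not a dict, and the parameter names are examples):

```python
def merge_search_args(opts: dict, **kwargs) -> dict:
    """Merge an options object (modeled here as a plain dict) with keyword
    arguments; explicitly-passed keyword args supersede values in opts."""
    merged = dict(opts)
    for key, value in kwargs.items():
        if value is not None:
            merged[key] = value
    return merged

base = {"platform": "SENTINEL-1", "maxResults": 100}
merged = merge_search_args(base, maxResults=5)  # kwarg wins over opts
```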
### Changed
- All search functions now accept the optional `opts=` argument; see `ASFSearchOptions` notes above.
- Replaced all `cmr_token` key arguments with `session`, which takes a `Session`-compatible object. See https://docs.asf.alaska.edu/asf_search/ASFSession/ for more details.
- Removed old GitHub actions
### Fixed
- `season` filter in `asf.search()` no longer throws when used.
------
## [3.2.2](https://github.com/asfadmin/Discovery-asf_search/compare/v3.2.1...v3.2.2)
### Fixed
- netrc authentication works again, affects `ASFProduct.download()`, `ASFSearchResults.download()`, `download_urls()`, `download_url()`
------
## [3.2.1](https://github.com/asfadmin/Discovery-asf_search/compare/v3.2.0...v3.2.1)
### Fixed
- `ASFProduct.stack()` and `asf_search.baseline_search.stack_from_id()` now return ASFSearchResults instead of a list
------
## [3.2.0](https://github.com/asfadmin/Discovery-asf_search/compare/v3.1.3...v3.2.0)
### Changed
- `ASFProduct.stack()` and `asf_search.baseline_search.stack_from_id()` now calculate `temporalBaseline` and `perpendicularBaseline` values of stacked products locally
- `search()` now internally uses a custom format when communicating with ASF's SearchAPI. This should have no apparent impact on current usage of asf_search.
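Computing a temporal baseline locally amounts to a date difference between acquisitions. A minimal sketch of the idea — the timestamp format and whole-day granularity here are assumptions, not the library's exact implementation:

```python
from datetime import datetime

def temporal_baseline(reference_time: str, secondary_time: str) -> int:
    # Days elapsed from the reference scene to the secondary scene;
    # negative if the secondary scene was acquired first
    fmt = '%Y-%m-%dT%H:%M:%SZ'
    reference = datetime.strptime(reference_time, fmt)
    secondary = datetime.strptime(secondary_time, fmt)
    return (secondary - reference).days
```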
------
## [3.1.3](https://github.com/asfadmin/Discovery-asf_search/compare/v3.1.2...v3.1.3)
### Fixed
- Centroid calculation fixed for scenes spanning the antimeridian
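The idea behind the fix: when a footprint's longitudes span more than 180°, shift the negative ones by 360 before computing a centroid, so the centroid isn't computed on the wrong side of the globe. A simplified sketch (the real implementation lives in `ASFProduct.centroid()` and uses shapely):

```python
def unwrap_antimeridian(coords):
    # coords: [[lon, lat], ...]; if the longitudes span more than 180 degrees
    # the footprint likely crosses the antimeridian, so unwrap the negative
    # longitudes into the 180..360 range
    lons = [lon for lon, _ in coords]
    if max(lons) - min(lons) > 180:
        return [[lon + 360, lat] if lon < 0 else [lon, lat] for lon, lat in coords]
    return coords
```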
------
## [3.1.2](https://github.com/asfadmin/Discovery-asf_search/compare/v3.1.1...v3.1.2)
### Changed
- `ASFSession` methods `auth_with_cookiejar()` and `auth_with_token()` now raise an error if the passed cookiejar/token is invalid or expired
- `ASFAuthenticationError` raised when encountering a 400 level error while downloading files
### Fixed
- Downloading files with sessions authenticated by `auth_with_token()` method works again
------
## [3.1.1](https://github.com/asfadmin/Discovery-asf_search/compare/v3.1.0...v3.1.1)
### Fixed
- Fix missing CMR module import
------
## [3.1.0](https://github.com/asfadmin/Discovery-asf_search/compare/v3.0.6...v3.1.0)
### Added
- Added walkthrough in the form of several jupyter notebooks in /examples
- Added `campaigns()` in `Campaigns` module, returns a list of campaigns for `UAV, AIRSAR, SENTINEL-1 INTERFEROGRAM (BETA)` platforms
### Changed
- Re-enable run-pytest workflow
- Add tests for `ASFSearch, ASFSession, ASFProduct` as well as baseline, geographic, and search modules
- Add Pytest-Automation Plugin integration
- Add automated CodeCov badge to readme
- "collectionName" parameter in `geo_search()` and `search()` is deprecated and raises a warning. Will be removed in a future release, use "campaign" instead
### Fixed
- Fix error while raising ASFBaselineError in `baseline_search.get_stack_params()`
------
## [3.0.6](https://github.com/asfadmin/Discovery-asf_search/compare/v3.0.5...v3.0.6)
### Changed
- Skip download if file already exists
- In the future we will apply file size and/or checksum checks to ensure the existing file is correct
------
## [3.0.5](https://github.com/asfadmin/Discovery-asf_search/compare/v3.0.4...v3.0.5)
### Added
- Add documentation URL to setup.py
- Add Gitter badge/link to readme
### Fixed
- Change hyphens to underscores in some product type constants
------
## [3.0.4](https://github.com/asfadmin/Discovery-asf_search/compare/v3.0.3...v3.0.4)
### Changed
- When working with source, package **must** be installed directly:
- `python3 -m pip install -e .`
### Fixed
- In-region S3 downloads should now function without issue
------
## [3.0.3](https://github.com/asfadmin/Discovery-asf_search/compare/v3.0.2...v3.0.3)
### Fixed
- Replace `ASFProduct.centroid()` calculation with shapely-based calculation
- See: https://github.com/asfadmin/Discovery-asf_search/pull/53
- Removes numpy requirement
- Adds shapely requirement
------
## [3.0.2](https://github.com/asfadmin/Discovery-asf_search/compare/v3.0.0...v3.0.2)
### Added
- Feature and Bug Report github issue templates
### Fixed
- Fix download authentication header issue during direct-to-S3 redirects
- Fix Sentinel-1 stacking to include both A and B in stacks
------
## [3.0.0](https://github.com/asfadmin/Discovery-asf_search/compare/v2.0.2...v3.0.0)
### Added
- Auth support for username/password and cookiejars, in addition to the previously available token-based approach. Create a session, authenticate it with the method of choice, then pass the session to whichever download method is being used.
- Sessions can be created using the `ASFSession` class, a subclass of `requests.Session`
- Once a session is created, call one of its authentication methods:
- `auth_with_creds('user', 'pass')`
- `auth_with_token('EDL token')`
- `auth_with_cookiejar(http.cookiejar)`
- If you were previously using the `token` argument, such as:
- `results.download(path='...', token='EDL token')`
- Updating can be as simple as:
- `results.download(path='...', session=ASFSession().auth_with_token('EDL token'))`
- Sessions can be re-used and are thread-safe
### Changed
- `download_url()`, `download_urls()`, `ASFProduct.download()` and `ASFSearchResults.download()` now expect a `session` argument instead of `token`
- Send auth headers to every step along a download redirect chain (including final AWS S3 buckets)
------
## [2.0.2](https://github.com/asfadmin/Discovery-asf_search/compare/v2.0.1...v2.0.2)
### Added
- INSTRUMENT constants for C-SAR, PALSAR, and AVNIR-2
------
## [2.0.1](https://github.com/asfadmin/Discovery-asf_search/compare/v2.0.0...v2.0.1)
### Fixed
- Versioning workflow corrected for proper versioning, stop bumping major instead of patch!
------
## [2.0.0](https://github.com/asfadmin/Discovery-asf_search/compare/v1.1.0...v2.0.0)
### Fixed
- Fixed import order of operations bug
- Updated ASFProduct and ASFSearchResults to use path arg in download methods
------
## [1.1.0](https://github.com/asfadmin/Discovery-asf_search/compare/v0.4.0...v1.1.0)
### Added
- Parallel downloads now supported by ASFSearchResults. Defaults to 1 (sequential download)
- For `search()`-based functions that take an argument as a list, single values are now also allowed
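Accepting either a single value or a list usually comes down to a small normalization step; a sketch of the idea (not asf_search's actual helper):

```python
def as_list(value):
    # Pass lists/tuples through unchanged; wrap a bare value so downstream
    # code can always iterate; leave None alone (parameter unset)
    if value is None or isinstance(value, (list, tuple)):
        return value
    return [value]
```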
### Changed
- Import download functionality in asf_search (for `download_url()` and `download_urls()`)
- "parallel" is now "processes" in download functionality
### Fixed
- Fixed ASFProduct import in search.py
- importlib metadata fix for python <3.8
------
## [0.4.0](https://github.com/asfadmin/Discovery-asf_search/compare/v0.3.0...v0.4.0)
### Added
- ASFSearchResults now has a geojson() method which returns a data structure that matches the geojson specification
- ASFProduct now has a geojson() method that produces a data structure matching a geojson feature snippet
- ASFSearchResults and ASFProduct both have `__str__()` methods that serialize the output of their geojson() methods
- Added CodeFactor shield to readme
- Now calculates temporal baselines when building a stack
- New search options:
- min/maxDoppler
- min/maxFaradayRotation
- flightLine
- offNadirAngle
- season
### Changed
- ASFProduct is no longer a subclass of dict. Instead, metadata has been moved to .properties and .geometry
- ASFSearchResults is now a subclass of UserList, for list-type operations
- Newly-built stacks are sorted by temporal baselines, ascending
### Fixed
- Cleaned up cruft from various refactors
------
## [0.3.0](https://github.com/asfadmin/Discovery-asf_search/compare/v0.2.4...v0.3.0)
### Added
- Laid out framework for INSTRUMENT constants (needs to be populated)
- Support for baseline stacking of pre-calculated datasets
- Download support for single products or entire search result sets, token-based auth only
- ASFSearchResults and ASFProduct classes
- Lower-level ASFError exception class
- ASFDownloadError exception class
- ASFBaselineError exception class
- Better path/frame/platform/product example
### Changed
- No longer uses range type for parameters that accept lists of values and/or ranges. Now expects a 2-value tuple.
- Removed DATASET constants (not searchable, use platform+instrument to identify a dataset)
- Updated hello_world.py baseline example
- Removed output options across the board, geojson only until we no longer rely on SearchAPI calls
- insarStackID now a search option (needed for baseline stacking of pre-calculated datasets)
- Flatter structure for constants
- baseline functionality moved into search group (file restructuring)
### Fixed
- Corrected handling of version number in user agent string
- unused import cleanup
- better type hinting on centroid() function
------
## [0.2.4](https://github.com/asfadmin/Discovery-asf_search/compare/v0.0.0...v0.2.4)
### Added
- product_search(): search using a list of Product IDs (CMR's GranuleUR)
- granule_search(): search using a list of Granule names (aka Scene names)
- geo_search(): search using a WKT string, as well as other parameters
- search(): a generic search function, allowing any combination of the above search features
- stack(): provides basic Baseline stacking functionality (does not yet provide perpendicular/temporal baseline values)
- Numerous constants available, covering common BEAMMODE, DATASET, FLIGHT_DIRECTION, PLATFORM, POLARIZATION, and PRODUCT_TYPE values
- Basic exception classes and error handling for search parameter and server errors
- Populated readme with instructions, examples, and badges
### Changed
- Improved packaging/build process
- Restructured branch layout according to https://gist.github.com/digitaljhelms/4287848
### Fixed
- Removed hard-coded version string
- Install setuptools_scm in pypi publish action
------
Discovery-asf_search-8.1.2/LICENSE 0000664 0000000 0000000 00000002775 14777330235 0016620 0 ustar 00root root 0000000 0000000 BSD 3-Clause License
Copyright (c) 2021, Alaska Satellite Facility
All rights reserved.
Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are met:
1. Redistributions of source code must retain the above copyright notice, this
list of conditions and the following disclaimer.
2. Redistributions in binary form must reproduce the above copyright notice,
this list of conditions and the following disclaimer in the documentation
and/or other materials provided with the distribution.
3. Neither the name of the copyright holder nor the names of its
contributors may be used to endorse or promote products derived from
this software without specific prior written permission.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE
FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
Discovery-asf_search-8.1.2/README.md 0000664 0000000 0000000 00000013517 14777330235 0017066 0 ustar 00root root 0000000 0000000 # asf_search
[](https://pypi.python.org/pypi/asf_search/)
[](https://anaconda.org/conda-forge/asf_search)
[](https://pypi.python.org/pypi/asf_search/)
[](https://pypi.python.org/pypi/asf_search/)
[](https://www.codefactor.io/repository/github/asfadmin/discovery-asf_search)
[](https://github.com/asfadmin/Discovery-asf_search/actions/workflows/run-pytest.yml)

[](https://docs.asf.alaska.edu/asf_search/basics/)
[](https://gitter.im/ASFDiscovery/asf_search?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge)
Python wrapper for the ASF SearchAPI
```python
import asf_search as asf
results = asf.granule_search(['ALPSRS279162400', 'ALPSRS279162200'])
print(results)
wkt = 'POLYGON((-135.7 58.2,-136.6 58.1,-135.8 56.9,-134.6 56.1,-134.9 58.0,-135.7 58.2))'
results = asf.geo_search(platform=[asf.PLATFORM.SENTINEL1], intersectsWith=wkt, maxResults=10)
print(results)
```
## Install
In order to easily manage dependencies, we recommend using dedicated project environments
via [Anaconda/Miniconda](https://docs.conda.io/projects/conda/en/latest/user-guide/install/index.html)
or [Python virtual environments](https://docs.python.org/3/tutorial/venv.html).
asf_search can be installed into a conda environment with
```bash
conda install -c conda-forge asf_search
```
or into a virtual environment with
```bash
python3 -m pip install asf_search
```
To install pytest/cov packages for testing, along with the minimal packages:
```bash
python3 -m pip install asf_search[test]
```
## Usage
_Full documentation is available at https://docs.asf.alaska.edu/asf_search/basics/_
Programmatically searching for ASF data is made simple with asf_search. Several search functions are provided:
- `geo_search()` Find product info over an area of interest using a WKT string
- `granule_search()` Find product info using a list of scenes
- `product_search()` Find product info using a list of products
- `search()` Find product info using any combination of search parameters
- `stack()` Find a baseline stack of products using a reference scene
- Additionally, numerous constants are provided to ease the search process
Additionally, asf_search supports downloading data, both from search results as provided by the above search functions, and directly from product URLs. An authenticated session is generally required. This is provided by the `ASFSession` class and one of its three authentication methods:
- `auth_with_creds('user', 'pass')`
- `auth_with_token('EDL token')`
- `auth_with_cookiejar(http.cookiejar)`
That session should be passed to whichever download method is being called, can be re-used, and is thread safe. Examples:
```python
results = asf_search.granule_search([...])
session = asf_search.ASFSession()
session.auth_with_creds('user', 'pass')
results.download(path='/Users/SARGuru/data', session=session)
```
Alternately, downloading a list of URLs contained in `urls` and creating the session inline:
```python
urls = [...]
asf_search.download_urls(urls=urls, path='/Users/SARGuru/data', session=ASFSession().auth_with_token('EDL token'))
```
Also note that `ASFSearchResults.download()` and the generic `download_urls()` function both accept a `processes` parameter which allows for parallel downloads.
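The `processes` parameter fans the per-URL downloads out over a worker pool, roughly like the sketch below. Here `fetch` stands in for the real per-URL download call; this is an illustration of the behavior, not asf_search's internals:

```python
from multiprocessing.pool import ThreadPool

def download_all(urls, fetch, processes=1):
    # processes=1 (the default) downloads sequentially;
    # processes>1 downloads up to that many URLs concurrently,
    # preserving the input order in the returned results
    if processes <= 1:
        return [fetch(url) for url in urls]
    with ThreadPool(processes) as pool:
        return pool.map(fetch, urls)
```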
Further examples of all of the above can be found in `examples/`
## Development
### Branching
| Instance | Branch | Description, Instructions, Notes |
| --- | --- | --- |
| Stable | stable | Accepts merges from Working and Hotfixes |
| Working | master | Accepts merges from Features/Issues and Hotfixes |
| Features/Issues | topic-* | Always branch off HEAD of Working |
| Hotfix | hotfix-* | Always branch off Stable |
For an extended description of our workflow, see https://gist.github.com/digitaljhelms/4287848
### Enable Logging
We use the standard `logging` module in our package for output.
Here's a basic example of hooking into it with your application:
```python
import asf_search as asf
import logging
ASF_LOGGER = logging.getLogger("asf_search")
formatter = logging.Formatter('[ %(asctime)s (%(name)s) %(filename)s:%(lineno)d ] %(levelname)s - %(message)s')
# Get output to the console:
stream_handle = logging.StreamHandler()
stream_handle.setFormatter(formatter)
ASF_LOGGER.addHandler(stream_handle)
# If you want it write to a file too:
file_handle = logging.FileHandler('MyCustomApp.log')
file_handle.setFormatter(formatter)
ASF_LOGGER.addHandler(file_handle)
# Only see messages that might affect you
ASF_LOGGER.setLevel(logging.WARNING)
# Test the logger: log an error and confirm you see it as expected:
ASF_LOGGER.error("This is only a drill. Please do not panic.")
# Should output this:
# [ 2023-01-17 10:04:53,780 (asf_search) main.py:42 ] ERROR - This is only a drill. Please do not panic.
```
For more configuration options for `logging`, please visit [their howto page](https://docs.python.org/3/howto/logging.html).
Discovery-asf_search-8.1.2/asf_search/ 0000775 0000000 0000000 00000000000 14777330235 0017676 5 ustar 00root root 0000000 0000000 Discovery-asf_search-8.1.2/asf_search/ASFProduct.py 0000664 0000000 0000000 00000047454 14777330235 0022240 0 ustar 00root root 0000000 0000000 import os
from typing import Any, Dict, Tuple, Type, List, final
import warnings
from shapely.geometry import shape, Point, Polygon, mapping
import json
import re
from urllib import parse
from asf_search import ASFSession, ASFSearchResults
from asf_search.ASFSearchOptions import ASFSearchOptions
from asf_search.download import download_url
from asf_search.download.file_download_type import FileDownloadType
from asf_search.CMR.translate import try_parse_date
from asf_search.CMR.translate import try_parse_float, try_parse_int, try_round_float
class ASFProduct:
"""
The ASFProduct class is the base class for search results from asf-search.
Key props:
- properties:
- stores commonly accessed properties of the CMR UMM for convenience
- umm:
- The data portion of the CMR response
- meta:
- The metadata portion of the CMR response
- geometry:
- The geometry `{coordinates: [[lon, lat] ...], 'type': Polygon}`
- baseline:
- used for spatio-temporal baseline stacking, stores state vectors/ascending
node time/insar baseline values when available (Not set in base ASFProduct class)
- See `S1Product` or `ALOSProduct` `get_baseline_calc_properties()`
methods for implementation examples
Key methods:
- `download()`
- `stack()`
- `remotezip()`
"""
@classmethod
def get_classname(cls):
return cls.__name__
_base_properties = {
# min viable product
'centerLat': {
'path': ['AdditionalAttributes', ('Name', 'CENTER_LAT'), 'Values', 0],
'cast': try_parse_float,
},
'centerLon': {
'path': ['AdditionalAttributes', ('Name', 'CENTER_LON'), 'Values', 0],
'cast': try_parse_float,
},
'stopTime': {
'path': ['TemporalExtent', 'RangeDateTime', 'EndingDateTime'],
'cast': try_parse_date,
}, # primary search results sort key
'fileID': {'path': ['GranuleUR']}, # secondary search results sort key
'flightDirection': {
'path': [
'AdditionalAttributes',
('Name', 'ASCENDING_DESCENDING'),
'Values',
0,
]
},
'pathNumber': {
'path': ['AdditionalAttributes', ('Name', 'PATH_NUMBER'), 'Values', 0],
'cast': try_parse_int,
},
'processingLevel': {
'path': ['AdditionalAttributes', ('Name', 'PROCESSING_TYPE'), 'Values', 0]
},
# commonly used
'url': {'path': ['RelatedUrls', ('Type', 'GET DATA'), 'URL']},
'startTime': {
'path': ['TemporalExtent', 'RangeDateTime', 'BeginningDateTime'],
'cast': try_parse_date,
},
'sceneName': {
'path': [
'DataGranule',
'Identifiers',
('IdentifierType', 'ProducerGranuleId'),
'Identifier',
]
},
'browse': {'path': ['RelatedUrls', ('Type', [('GET RELATED VISUALIZATION', 'URL')])]},
'platform': {'path': ['AdditionalAttributes', ('Name', 'ASF_PLATFORM'), 'Values', 0]},
'bytes': {
'path': ['AdditionalAttributes', ('Name', 'BYTES'), 'Values', 0],
'cast': try_round_float,
},
'md5sum': {'path': ['AdditionalAttributes', ('Name', 'MD5SUM'), 'Values', 0]},
'frameNumber': {
'path': ['AdditionalAttributes', ('Name', 'CENTER_ESA_FRAME'), 'Values', 0],
'cast': try_parse_int,
}, # overloaded by S1, ALOS, and ERS
'granuleType': {'path': ['AdditionalAttributes', ('Name', 'GRANULE_TYPE'), 'Values', 0]},
'orbit': {
'path': ['OrbitCalculatedSpatialDomains', 0, 'OrbitNumber'],
'cast': try_parse_int,
},
'polarization': {'path': ['AdditionalAttributes', ('Name', 'POLARIZATION'), 'Values', 0]},
'processingDate': {
'path': ['DataGranule', 'ProductionDateTime'],
'cast': try_parse_date,
},
'sensor': {
'path': ['Platforms', 0, 'Instruments', 0, 'ShortName'],
},
}
"""
_base_properties dictionary, mapping readable property names to paths and optional type casting
entries are organized as such:
- `PROPERTY_NAME`: The name the property should be called in `ASFProduct.properties`
- `path`: the expected path in the CMR UMM json granule response as a list
- `cast`: (optional): the optional type casting method
Defining `_base_properties` in subclasses allows for
defining custom properties or overriding existing ones.
See `S1Product.get_property_paths()` on how subclasses are expected to
combine `ASFProduct._base_properties` with their own separately defined `_base_properties`
"""
_url_types = ['GET DATA', 'EXTENDED METADATA', 'GET DATA VIA DIRECT ACCESS', 'GET RELATED VISUALIZATION', 'VIEW RELATED INFORMATION', 'USE SERVICE API']
def __init__(self, args: Dict = {}, session: ASFSession = ASFSession()):
self.meta = args.get('meta')
self.umm = args.get('umm')
translated = self.translate_product(args)
self.properties = translated['properties']
self.geometry = translated['geometry']
self.baseline = None
self.session = session
def __str__(self):
return json.dumps(self.geojson(), indent=2, sort_keys=True)
def geojson(self) -> Dict:
"""
Returns ASFProduct object as a geojson formatted dictionary
with `type`, `geometry`, and `properties` keys
"""
return {
'type': 'Feature',
'geometry': self.geometry,
'properties': self.properties,
}
def download(
self,
path: str,
filename: str = None,
session: ASFSession = None,
fileType=FileDownloadType.DEFAULT_FILE,
) -> None:
"""
Downloads this product to the specified path and optional filename.
:param path: The directory into which this product should be downloaded.
:param filename: Optional filename to use instead of the original filename of this product.
:param session: The session to use, defaults to the one used to find the results.
:return: None
"""
default_filename = self.properties['fileName']
if filename is not None:
multiple_files = (
fileType == FileDownloadType.ADDITIONAL_FILES
and len(self.properties['additionalUrls']) > 1
) or fileType == FileDownloadType.ALL_FILES
if multiple_files:
warnings.warn(
'Attempting to download multiple files for product, '
f'ignoring user provided filename argument "{filename}", using default.'
)
else:
default_filename = filename
if session is None:
session = self.session
urls = self.get_urls(fileType=fileType)
for url in urls:
base_filename = '.'.join(default_filename.split('.')[:-1])
extension = url.split('.')[-1]
download_url(
url=url,
path=path,
filename=f'{base_filename}.{extension}',
session=session,
)
def get_urls(self, fileType=FileDownloadType.DEFAULT_FILE) -> list:
urls = []
if fileType == FileDownloadType.DEFAULT_FILE:
urls.append(self.properties['url'])
elif fileType == FileDownloadType.ADDITIONAL_FILES:
urls.extend(self.properties.get('additionalUrls', []))
elif fileType == FileDownloadType.ALL_FILES:
urls.append(self.properties['url'])
urls.extend(self.properties.get('additionalUrls', []))
else:
raise ValueError(
"Invalid FileDownloadType provided, the valid types are 'DEFAULT_FILE', 'ADDITIONAL_FILES', and 'ALL_FILES'"
)
return urls
def _get_additional_filenames_and_urls(
self,
default_filename: str = None, # for subclasses without fileName in url (see S1BurstProduct implementation) # noqa F401
) -> List[Tuple[str, str]]:
return [
(self._parse_filename_from_url(url), url)
for url in self.properties.get('additionalUrls', [])
]
def _parse_filename_from_url(self, url: str) -> str:
file_path = os.path.split(parse.urlparse(url).path)
filename = file_path[1]
return filename
def stack(
self, opts: ASFSearchOptions = None, useSubclass: Type['ASFProduct'] = None
) -> ASFSearchResults:
"""
Builds a baseline stack from this product.
Parameters
----------
opts:
An ASFSearchOptions object describing the search parameters to be used.
Search parameters specified outside this object will override in event of a conflict.
ASFProductSubclass: An ASFProduct subclass constructor to cast results to
Returns
----------
asf_search.ASFSearchResults
containing the stack, with the addition of baseline values
(temporal, perpendicular) attached to each ASFProduct.
"""
from .search.baseline_search import stack_from_product
if opts is None:
opts = ASFSearchOptions(session=self.session)
return stack_from_product(self, opts=opts, ASFProductSubclass=useSubclass)
def get_stack_opts(self, opts: ASFSearchOptions = None) -> ASFSearchOptions:
"""
Build search options that can be used to find an insar stack for this product
:return: ASFSearchOptions describing appropriate options
for building a stack from this product
"""
return None
def _get_access_urls(
self,
url_types: List[str] = ['GET DATA', 'EXTENDED METADATA']
) -> List[str]:
accessUrls = []
for url_type in url_types:
if urls := self.umm_get(self.umm, 'RelatedUrls', ('Type', [(url_type, 'URL')]), 0):
accessUrls.extend(urls)
return sorted(list(set(accessUrls)))
def _get_urls(self) -> List[str]:
"""Finds and returns all umm urls"""
urls = self._get_access_urls(self._url_types)
return [
url for url in urls if not url.startswith('s3://')
]
def _get_s3_uris(self) -> List[str]:
"""Finds and returns all umm S3 direct access uris"""
s3_urls = self._get_access_urls(self._url_types)
return [url for url in s3_urls if url.startswith('s3://')]
def _get_additional_urls(self) -> List[str]:
"""Finds and returns all non-md5/image urls and filters out the existing `url` property"""
access_urls = self._get_urls()
return [
url for url in access_urls
if not url.endswith('.md5')
and not url.endswith('.png')
and url != self.properties['url']
and 's3credentials' not in url
]
def find_urls(self, extension: str = None, pattern: str = r'.*', directAccess: bool = False) -> List[str]:
"""
Searches for all urls matching a given extension and/or pattern
:param extension: the file extension to search for. (Defaults to `None`)
- Example: '.tiff'
:param pattern: A regex pattern to search each url for. (Defaults to `r'.*'`)
- Example: `r'(QA_)+'` to find urls with 'QA_' at least once
:param directAccess: should search in s3 bucket urls (Defaults to `False`)
"""
search_list = self._get_s3_uris() if directAccess else self._get_urls()
def _get_extension(file_url: str):
path = parse.urlparse(file_url).path
return os.path.splitext(path)[-1]
if extension is not None:
search_list = [url for url in search_list if _get_extension(url) == extension]
regexp = re.compile(pattern=pattern)
return sorted([url for url in search_list if regexp.search(url) is not None])
def centroid(self) -> Point:
"""
Finds the centroid of a product
"""
coords = mapping(shape(self.geometry))['coordinates'][0]
lons = [p[0] for p in coords]
if max(lons) - min(lons) > 180:
unwrapped_coords = [a if a[0] > 0 else [a[0] + 360, a[1]] for a in coords]
else:
unwrapped_coords = [a for a in coords]
return Polygon(unwrapped_coords).centroid
def remotezip(self, session: ASFSession) -> 'RemoteZip': # type: ignore # noqa: F821
"""Returns a RemoteZip object which can be used to download
a part of an ASFProduct's zip archive. (See example in examples/5-Download.ipynb)
requires installing optional dependencies via pip or conda to use the `remotezip` package:
`python3 -m pip install asf-search[extras]`
:param session: an authenticated ASFSession
"""
from .download.download import remotezip
return remotezip(self.properties['url'], session=session)
def _read_umm_property(self, umm: Dict, mapping: Dict) -> Any:
value = self.umm_get(umm, *mapping['path'])
if mapping.get('cast') is None:
return value
return self.umm_cast(mapping['cast'], value)
def translate_product(self, item: Dict) -> Dict:
"""
Generates `properties` and `geometry` from the CMR UMM response
"""
try:
coordinates = item['umm']['SpatialExtent']['HorizontalSpatialDomain']['Geometry'][
'GPolygons'
][0]['Boundary']['Points']
coordinates = [[c['Longitude'], c['Latitude']] for c in coordinates]
geometry = {'coordinates': [coordinates], 'type': 'Polygon'}
except KeyError:
geometry = {'coordinates': None, 'type': 'Polygon'}
umm = item.get('umm')
# additionalAttributes = {attr['Name']: attr['Values'] for attr in umm['AdditionalAttributes']}
properties = {
prop: self._read_umm_property(umm, umm_mapping)
for prop, umm_mapping in self._base_properties.items()
}
if properties.get('url') is not None:
properties['fileName'] = properties['url'].split('/')[-1]
else:
properties['fileName'] = None
# Fallbacks
if properties.get('beamModeType') is None:
properties['beamModeType'] = self.umm_get(
umm, 'AdditionalAttributes', ('Name', 'BEAM_MODE'), 'Values', 0
)
if properties.get('platform') is None:
properties['platform'] = self.umm_get(umm, 'Platforms', 0, 'ShortName')
return {'geometry': geometry, 'properties': properties, 'type': 'Feature'}
def get_sort_keys(self) -> Tuple[str, str]:
"""
Returns tuple of primary and secondary date values used for sorting final search results
Any subclass must return strings for the final `sort()` to work
"""
# `sort()` will raise an error when comparing `NoneType`,
# using self._read_property() to wrap standard `dict.get()` for possible `None` values
primary_key = self._read_property(key='stopTime', default='')
secondary_key = self._read_property(
key='fileID', default=self._read_property('sceneName', '')
)
return (primary_key, secondary_key)
def _read_property(self, key: str, default: Any = None) -> Any:
"""
Helper method wraps `properties.get()`.
Since a property's value can be `None`, `dict.get('key', 'default')`
will return `None` rather than the default when the key exists
"""
output = default
if (value := self.properties.get(key)) is not None:
output = value
return output
@final
@staticmethod
def umm_get(item: Dict, *args):
"""
Used to search for values in CMR UMM
:param item: the umm dict returned from CMR
:param *args: the expected path to the value
Example case:
"I want to grab the polarization from the granule umm"
```
item = {
'AdditionalAttributes': [
{
'Name': 'POLARIZATION',
'Values': ['VV', 'VH']
},
...
],
...
}
```
The path provided to *args would look like this:
```
'AdditionalAttributes', ('Name', 'POLARIZATION'), 'Values', 0
result: 'VV'
```
- `'AdditionalAttributes'` acts like item['AdditionalAttributes'],
which is a list of dictionaries
- Since `AdditionalAttributes` is a LIST of dictionaries,
we search for a dict with the key value pair,
`('Name', 'POLARIZATION')`
- If found, we try to access that dictionary's `Values` key
- Since `Values` is a list, we can access the first index `0` (in this case, 'VV')
---
If you want more of the umm, simply reduce how deep you search:
Example: "I need BOTH polarizations" (`OPERAS1Product` does this; notice the omitted `0`)
```
'AdditionalAttributes', ('Name', 'POLARIZATION'), 'Values'
result: ['VV', 'VH']
```
---
Example: "I need the ENTIRE POLARIZATION dict"
```
'AdditionalAttributes', ('Name', 'POLARIZATION')
result: {
'Name': 'POLARIZATION',
'Values': ['VV', 'VH']
}
```
---
ADVANCED:
Sometimes there are multiple dictionaries in a list that have
the same key value pair we're searching for (See `OPERAS1Product` umm under `RelatedUrls`).
This means we can miss values since we're only grabbing the first match
depending on how the umm is organized.
There is a way to get ALL data that matches our key value criteria.
Example: "I need ALL `URL` values for dictionaries in `RelatedUrls`
where `Type` is `GET DATA`" (See in use in `OPERAS1Product` class)
```
'RelatedUrls', ('Type', [('GET DATA', 'URL')]), 0
```
"""
if item is None:
return None
for key in args:
if isinstance(key, str):
item = item.get(key)
elif isinstance(key, int):
item = item[key] if key < len(item) else None
elif isinstance(key, tuple):
(a, b) = key
if isinstance(b, List):
output = []
b = b[0]
for child in item:
if ASFProduct.umm_get(child, key[0]) == b[0]:
output.append(ASFProduct.umm_get(child, b[1]))
if len(output):
return output
return None
found = False
for child in item:
if ASFProduct.umm_get(child, a) == b:
item = child
found = True
break
if not found:
return None
if item is None:
return None
if item in [None, 'NA', 'N/A', '']:
item = None
return item
@final
@staticmethod
def umm_cast(f, v):
"""Tries to cast value v by callable f, returns None if it fails"""
try:
return f(v)
except TypeError:
return None
@staticmethod
def _is_subclass(item: Dict) -> bool:
"""
Used to determine which subclass to use for specific
edge-cases when parsing results in search methods
(Currently implemented for ARIA and OPERA subclasses).
params:
- item (dict): the CMR UMM-G item to read from
"""
raise NotImplementedError()
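The lookup rules documented in `umm_get` above can be sketched as a standalone helper run against a hand-made UMM fragment (`umm_walk` and `sample_umm` are illustrative stand-ins, not part of asf_search or real CMR output):

```python
# Illustrative sketch of the umm_get lookup semantics: a str key does a
# dict lookup, an int key indexes a list, and a (key, value) tuple finds
# the first matching dict in a list of dicts.
sample_umm = {
    'AdditionalAttributes': [
        {'Name': 'LOOK_DIRECTION', 'Values': ['RIGHT']},
        {'Name': 'POLARIZATION', 'Values': ['VV', 'VH']},
    ]
}

def umm_walk(item, *args):
    for key in args:
        if item is None:
            return None
        if isinstance(key, str):
            item = item.get(key)
        elif isinstance(key, int):
            item = item[key] if key < len(item) else None
        elif isinstance(key, tuple):
            name, value = key
            item = next((child for child in item if child.get(name) == value), None)
    return item

# Deepest search: a single value
first = umm_walk(sample_umm, 'AdditionalAttributes', ('Name', 'POLARIZATION'), 'Values', 0)
# Omitting the trailing 0 returns the whole Values list
both = umm_walk(sample_umm, 'AdditionalAttributes', ('Name', 'POLARIZATION'), 'Values')
print(first, both)  # VV ['VV', 'VH']
```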
# asf_search/ASFSearchOptions/ASFSearchOptions.py
import warnings
import json
from .validator_map import validator_map, validate
from .config import config
from asf_search import ASF_LOGGER
class ASFSearchOptions:
def __init__(self, **kwargs):
"""
Initialize the object, creating the list of attributes
based on the contents of validator_map, and assign them based on kwargs
:param kwargs: any search options to be set immediately
"""
# init the built in attrs:
for key in validator_map:
self.__setattr__(key, None)
# Apply any parameters passed in:
for key, value in kwargs.items():
self.__setattr__(key, value)
def __setattr__(self, key, value):
"""
Set a search option, restricting to the keys in validator_map only,
and applying validation to the value before setting
:param key: the name of the option to be set
:param value: the value to which to set the named option
"""
# self.* calls custom __setattr__ method, creating inf loop. Use super().*
# Let values always be None, even if their validator doesn't agree. Used to delete them too:
if key in validator_map:
if value is None: # always maintain config on required fields
if key in config:
super().__setattr__(key, config[key])
else:
super().__setattr__(key, None)
else:
super().__setattr__(key, validate(key, value))
else:
msg = f"key '{key}' is not a valid search option (setattr)"
ASF_LOGGER.error(msg)
raise KeyError(msg)
def __delattr__(self, item):
"""
Clear a search option by setting its value to None
:param item: the name of the option to clear
"""
if item in validator_map:
self.__setattr__(item, None)
else:
msg = f"key '{item}' is not a valid search option (delattr)"
ASF_LOGGER.error(msg)
raise KeyError(msg)
def __iter__(self):
"""
Filters search parameters, only returning populated fields. Used when casting to a dict.
"""
for key in validator_map:
if not self._is_val_default(key):
value = self.__getattribute__(key)
yield key, value
def __str__(self):
"""
What to display if `print(opts)` is called.
"""
return json.dumps(dict(self), indent=4, default=str)
# Default is set to '...', since 'None' is a very valid value here
def pop(self, key, default=...):
"""
        Removes 'key' from self and returns its value. Raises KeyError if it doesn't exist
:param key: name of key to return value of, and delete
"""
if key not in validator_map:
msg = f"key '{key}' is not a valid key for ASFSearchOptions. (pop)"
ASF_LOGGER.error(msg)
raise KeyError(msg)
if self._is_val_default(key):
if default != ...:
return default
msg = f"key '{key}' is set to empty/None. (pop)"
ASF_LOGGER.error(msg)
raise KeyError(msg)
# Success, delete and return it:
val = getattr(self, key)
self.__delattr__(key)
return val
def reset_search(self):
"""
Resets all populated search options, excluding config options (host, session, etc)
"""
for key, _ in self:
if key not in config:
super().__setattr__(key, None)
def merge_args(self, **kwargs) -> None:
"""
Merges all keyword args into this ASFSearchOptions object.
Emits a warning for any options that are over-written by the operation.
:param kwargs: The search options to merge into the object
:return: None
"""
for key in kwargs:
# Spit out warning if the value is something other than the default:
if not self._is_val_default(key):
msg = (
'While merging search options, '
f'existing option {key}:{getattr(self, key, None)} '
f'overwritten by kwarg with value {kwargs[key]}'
)
ASF_LOGGER.warning(msg)
warnings.warn(msg)
self.__setattr__(key, kwargs[key])
def _is_val_default(self, key) -> bool:
"""
        Returns True if the key's current value is the same as its default value
:param key: The key to check
:return: bool
"""
default_val = config[key] if key in config else None
current_val = getattr(self, key, None)
return current_val == default_val
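The validator-map pattern above (every attribute assignment routed through a per-key parser, unknown keys rejected) can be sketched with a tiny standalone class (`TinyOptions` and its two keys are illustrative only):

```python
# Minimal sketch of the ASFSearchOptions __setattr__ pattern: each known
# key has a parser, every assignment goes through it, and unknown keys
# raise KeyError instead of silently creating attributes.
class TinyOptions:
    _validators = {'maxResults': int, 'platform': str}

    def __setattr__(self, key, value):
        if key not in self._validators:
            raise KeyError(f"key '{key}' is not a valid search option")
        # Use super() to avoid recursing back into this __setattr__
        super().__setattr__(key, None if value is None else self._validators[key](value))

opts = TinyOptions()
opts.maxResults = '250'  # parsed by int()
print(opts.maxResults)   # 250
```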
# asf_search/ASFSearchOptions/__init__.py
from .ASFSearchOptions import ASFSearchOptions # noqa F401
from .validators import * # noqa F401 F403
# asf_search/ASFSearchOptions/config.py
from asf_search.constants import INTERNAL
from asf_search.ASFSession import ASFSession
config = {
'host': INTERNAL.CMR_HOST,
'provider': INTERNAL.DEFAULT_PROVIDER,
'session': ASFSession(),
'collectionAlias': True,
}
# asf_search/ASFSearchOptions/validator_map.py
from asf_search import ASF_LOGGER
from .validators import (
parse_string,
parse_float,
parse_wkt,
parse_date,
parse_string_list,
parse_int_list,
parse_int_or_range_list,
parse_float_or_range_list,
parse_cmr_keywords_list,
parse_session,
parse_circle,
parse_linestring,
parse_point,
)
def validate(key, value):
if key not in validator_map:
error_msg = f'Key "{key}" is not a valid search option.'
        # See if they just messed up case sensitivity:
for valid_key in validator_map:
if key.lower() == valid_key.lower():
error_msg += f' (Did you mean "{valid_key}"?)'
break
ASF_LOGGER.error(error_msg)
raise KeyError(error_msg)
try:
return validator_map[key](value)
except ValueError as exc:
ASF_LOGGER.exception(f'Failed to parse item in ASFSearchOptions: {key=} {value=} {exc=}')
raise
validator_map = {
# Search parameters Parser
'maxResults': int,
'absoluteOrbit': parse_int_or_range_list,
'asfFrame': parse_int_or_range_list,
'beamMode': parse_string_list,
'beamSwath': parse_string_list,
'campaign': parse_string,
'circle': parse_circle,
'linestring': parse_linestring,
'point': parse_point,
'maxDoppler': parse_float,
'minDoppler': parse_float,
'maxFaradayRotation': parse_float,
'minFaradayRotation': parse_float,
'flightDirection': parse_string,
'flightLine': parse_string,
'frame': parse_int_or_range_list,
'granule_list': parse_string_list,
'product_list': parse_string_list,
'intersectsWith': parse_wkt,
'lookDirection': parse_string,
'offNadirAngle': parse_float_or_range_list,
'platform': parse_string_list,
'polarization': parse_string_list,
'processingLevel': parse_string_list,
'relativeOrbit': parse_int_or_range_list,
'processingDate': parse_date,
'start': parse_date,
'end': parse_date,
'season': parse_int_list,
'groupID': parse_string_list,
'insarStackId': parse_string,
'instrument': parse_string,
'collections': parse_string_list,
'shortName': parse_string_list,
'dataset': parse_string_list,
'cmr_keywords': parse_cmr_keywords_list,
    # S1 Interferogram Filters
'temporalBaselineDays': parse_string_list,
# Opera Burst Filters
'operaBurstID': parse_string_list,
# SLC Burst Filters
'absoluteBurstID': parse_int_list,
'relativeBurstID': parse_int_list,
'fullBurstID': parse_string_list,
    # NISAR parameters
'frameCoverage': parse_string,
'jointObservation': bool,
'mainBandPolarization': parse_string_list,
'sideBandPolarization': parse_string_list,
'rangeBandwidth': parse_string_list,
# Config parameters Parser
'session': parse_session,
'host': parse_string,
'provider': parse_string,
'collectionAlias': bool,
}
# asf_search/ASFSearchOptions/validators.py
import dateparser
from datetime import datetime, timezone
import requests
from typing import Dict, Union, Tuple, TypeVar, Callable, List, Type, Sequence
import math
from shapely import wkt, errors
number = TypeVar('number', int, float)
def parse_string(value: str) -> str:
"""
Base string validator. Maybe silly, but we can also ensure any constraints needed in the future.
:param value: The string to validate
:return: The validated string, with any required modifications
"""
# Convert to string first, so length is checked against only str types:
try:
value = f'{value}'
except ValueError as exc: # If this happens, printing v's value would fail too...
raise ValueError(f"Invalid string: Can't cast type '{type(value)}' to string.") from exc
if len(value) == 0:
raise ValueError('Invalid string: Empty.')
return value
def parse_float(value: float) -> float:
"""
Base float validator. Ensures values like Inf are not allowed even though they are valid floats.
:param value: The float to validate
:return: The validated float
"""
try:
value = float(value)
except ValueError as exc:
raise ValueError(f'Invalid float: {value}') from exc
if math.isinf(value) or math.isnan(value):
raise ValueError(f'Float values must be finite: got {value}')
return value
def parse_date(value: Union[str, datetime]) -> Union[datetime, str]:
"""
Base date validator
:param value: String or datetime object to be validated
    :return: A UTC datetime if a datetime was passed in; otherwise the parsed date as a
    '%Y-%m-%dT%H:%M:%SZ' string. (Strings like "today" must parse, while "asdf" raises.)
"""
if isinstance(value, datetime):
return _to_utc(value)
date = dateparser.parse(str(value))
if date is None:
raise ValueError(f"Invalid date: '{value}'.")
return _to_utc(date).strftime('%Y-%m-%dT%H:%M:%SZ')
def _to_utc(date: datetime):
if date.tzinfo is None:
date = date.replace(tzinfo=timezone.utc)
return date
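The date normalization above is small but easy to get wrong, so here is a standalone sketch of its two rules: naive datetimes are assumed to be UTC, and parsed dates are emitted as ISO-8601 'Z' strings:

```python
# Sketch of the normalization parse_date performs on datetime inputs.
from datetime import datetime, timezone

def to_utc(date):
    # A datetime with no tzinfo is treated as already being in UTC
    if date.tzinfo is None:
        date = date.replace(tzinfo=timezone.utc)
    return date

stamp = to_utc(datetime(2020, 1, 2, 3, 4, 5)).strftime('%Y-%m-%dT%H:%M:%SZ')
print(stamp)  # 2020-01-02T03:04:05Z
```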
def parse_range(
value: Tuple[number, number], h: Callable[[number], number]
) -> Tuple[number, number]:
"""
Base range validator. For our purposes, a range is a tuple
with exactly two numeric elements (a, b), requiring a <= b.
Parameters
----------
value: The range to be validated. Examples: (3, 5), (1.1, 12.3)
h: The validator function to apply to each individual value
Returns
----------
Validated tuple representing the range
"""
if isinstance(value, tuple):
if len(value) < 2:
raise ValueError(f'Not enough values in min/max tuple: {value}')
if len(value) > 2:
raise ValueError(f'Too many values in min/max tuple: {value}')
value = (h(value[0]), h(value[1]))
if math.isinf(value[0]) or math.isnan(value[0]):
raise ValueError(
f'Expected finite numeric min in min/max tuple, got {value[0]}: {value}'
)
if math.isinf(value[1]) or math.isnan(value[1]):
raise ValueError(
f'Expected finite numeric max in min/max tuple, got {value[1]}: {value}'
)
if value[0] > value[1]:
raise ValueError(
f'Min must be less than max when using min/max tuples to search: {value}'
)
return value
raise ValueError(f'Invalid range. Expected 2-value numeric tuple, got {type(value)}: {value}')
# Parse and validate a date range: "1991-10-01T00:00:00Z,1991-10-02T00:00:00Z"
def parse_date_range(
value: Tuple[Union[str, datetime], Union[str, datetime]],
) -> Tuple[datetime, datetime]:
return parse_range(value, parse_date)
# Parse and validate an integer range: "3-5"
def parse_int_range(value: Tuple[int, int]) -> Tuple[int, int]:
return parse_range(value, int)
# Parse and validate a float range: "1.1-12.3"
def parse_float_range(value: Tuple[float, float]) -> Tuple[float, float]:
return parse_range(value, float)
# Parse and validate an iterable of values, using h() to validate each value:
# "a,b,c", "1,2,3", "1.1,2.3"
def parse_list(value: Sequence, h) -> List:
if not isinstance(value, Sequence) or isinstance(value, str):
value = [value]
try:
return [h(a) for a in value]
except ValueError as exc:
raise ValueError(f'Invalid {h.__name__} list: {exc}') from exc
def parse_cmr_keywords_list(value: Sequence[Union[Dict, Sequence]]):
if not isinstance(value, Sequence) or (
len(value) == 2 and isinstance(value[0], str)
): # in case we're passed single key value pair as sequence
value = [value]
for idx, item in enumerate(value):
if not isinstance(item, tuple) and not isinstance(item, Sequence):
raise ValueError(
f'Expected item in cmr_keywords list index {idx} to be tuple pair, '
f'got value {item} of type {type(item)}'
)
if len(item) != 2:
raise ValueError(
f'Expected item in cmr_keywords list index {idx} to be of length 2, '
f'got value {item} of length {len(item)}'
)
search_key, search_value = item
if not isinstance(search_key, str) or not isinstance(search_value, str):
raise ValueError(
f'Expected tuple pair of types: '
f'"{type(str)}, {type(str)}" in cmr_keywords at index {idx}, '
f'got value "{str(item)}" '
f'of types: "{type(search_key)}, {type(search_value)}"'
)
return value
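The shape checks above boil down to two rules: a single `('key', 'value')` pair gets wrapped into a list, and every entry must be a pair of strings. A minimal standalone sketch (the helper name and sample keyword are illustrative):

```python
# Sketch of cmr_keywords normalization: wrap a bare pair, validate entries.
def normalize_cmr_keywords(value):
    # A single ('key', 'value') pair becomes a one-element list
    if len(value) == 2 and isinstance(value[0], str):
        value = [value]
    for idx, item in enumerate(value):
        if len(item) != 2 or not all(isinstance(part, str) for part in item):
            raise ValueError(f'Bad cmr_keywords entry at index {idx}: {item!r}')
    return list(value)

print(normalize_cmr_keywords(('options[platform][ignore_case]', 'true')))
# [('options[platform][ignore_case]', 'true')]
```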
# Parse and validate an iterable of strings: "foo,bar,baz"
def parse_string_list(value: Sequence[str]) -> List[str]:
return parse_list(value, parse_string)
# Parse and validate an iterable of integers: "1,2,3"
def parse_int_list(value: Sequence[int]) -> List[int]:
return parse_list(value, int)
# Parse and validate an iterable of floats: "1.1,2.3,4.5"
def parse_float_list(value: Sequence[float]) -> List[float]:
return parse_list(value, float)
def parse_number_or_range(value: Union[List, Tuple[number, number], range], h):
try:
if isinstance(value, tuple):
return parse_range(value, h)
if isinstance(value, range):
if value.step == 1:
return [value.start, value.stop]
return h(value)
except ValueError as exc:
raise ValueError(f'Invalid {h.__name__} or range: {exc}') from exc
# Parse and validate an iterable of numbers or number ranges, using h() to validate each value:
# "1,2,3-5", "1.1,1.4,5.1-6.7"
def parse_number_or_range_list(value: Sequence, h) -> List:
if not isinstance(value, Sequence) or isinstance(value, range):
value = [value]
return [parse_number_or_range(x, h) for x in value]
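The mixed number/range handling above can be sketched standalone: plain values pass through `h()`, 2-tuples become validated `(min, max)` ranges, and step-1 `range` objects become `[start, stop]` pairs (the helper here is a simplified stand-in for `parse_number_or_range`):

```python
# Sketch of "number or range" parsing for values like 1, (3, 5), range(7, 10).
def number_or_range(value, h):
    if isinstance(value, tuple):
        lo, hi = h(value[0]), h(value[1])
        if lo > hi:
            raise ValueError(f'Min must be less than max: {value}')
        return (lo, hi)
    if isinstance(value, range) and value.step == 1:
        return [value.start, value.stop]
    return h(value)

print([number_or_range(v, int) for v in [1, (3, 5), range(7, 10)]])
# [1, (3, 5), [7, 10]]
```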
# Parse and validate an iterable of integers or integer ranges: "1,2,3-5"
def parse_int_or_range_list(value: Sequence) -> List:
return parse_number_or_range_list(value, int)
# Parse and validate an iterable of float or float ranges: "1.0,2.0,3.0-5.0"
def parse_float_or_range_list(value: Sequence) -> List:
return parse_number_or_range_list(value, parse_float)
# Parse and validate a coordinate list
def parse_coord_list(value: Sequence[float]) -> List[float]:
if not isinstance(value, Sequence):
        raise ValueError(f'Invalid coord list: Must pass in an iterable. Got {type(value)}.')
for coord in value:
try:
float(coord)
except ValueError as exc:
raise ValueError(f'Invalid coordinate: {coord}') from exc
if len(value) % 2 != 0:
raise ValueError(f'Invalid coordinate list, odd number of values provided: {value}')
return value
# Parse and validate a bbox coordinate list
def parse_bbox_list(value: Sequence[float]) -> List[float]:
try:
# This also makes sure v is an iterable:
value = parse_coord_list(value)
except ValueError as exc:
raise ValueError(f'Invalid bbox: {exc}') from exc
if len(value) != 4:
raise ValueError(f'Invalid bbox, must be 4 values: {value}')
return value
# Parse and validate a point coordinate list
def parse_point_list(value: Sequence[float]) -> List[float]:
try:
# This also makes sure v is an iterable:
value = parse_coord_list(value)
except ValueError as exc:
raise ValueError(f'Invalid point: {exc}') from exc
if len(value) != 2:
raise ValueError(f'Invalid point, must be 2 values: {value}')
return value
# Parse a WKT and convert it to a coordinate string
def parse_wkt(value: str) -> str:
try:
value = wkt.loads(value)
except errors.WKTReadingError as exc:
raise ValueError(f'Invalid wkt: {exc}') from exc
return wkt.dumps(value)
# Parse a CMR circle:
# [longitude, latitude, radius(meters)]
def parse_circle(value: List[float]) -> List[float]:
value = parse_float_list(value)
if len(value) != 3:
        raise ValueError(f'Invalid circle, must be 3 values (long, lat, radius). Got: {value}')
return value
# Parse a CMR linestring:
# [longitude, latitude, longitude, latitude, ...]
def parse_linestring(value: List[float]) -> List[float]:
value = parse_float_list(value)
if len(value) % 2 != 0:
raise ValueError(
            f'Invalid linestring, must be values of format (long, lat, long, lat, ...). Got: {value}'
)
return value
def parse_point(value: List[float]) -> List[float]:
value = parse_float_list(value)
if len(value) != 2:
        raise ValueError(f'Invalid point, must be values of format (long, lat). Got: {value}')
return value
# Parse and validate a coordinate string
def parse_coord_string(value: List):
value = parse_float_list(value)
if len(value) % 2 != 0:
raise ValueError(
            f'Invalid coordinate string, must be values of format (long, lat, long, lat, ...). Got: {value}'
)
return value
# Take "requests.Session", or anything that subclasses it:
def parse_session(session: Type[requests.Session]):
if issubclass(type(session), requests.Session):
return session
else:
raise ValueError(
'Invalid Session: expected ASFSession or a requests.Session subclass. '
f'Got {type(session)}'
)
# asf_search/ASFSearchResults.py
from collections import UserList
from multiprocessing import Pool
import json
from typing import List
from asf_search import ASFSession, ASFSearchOptions
from asf_search.download.file_download_type import FileDownloadType
from asf_search.exceptions import ASFSearchError
from asf_search import ASF_LOGGER
from asf_search.export.csv import results_to_csv
from asf_search.export.jsonlite import results_to_jsonlite
from asf_search.export.jsonlite2 import results_to_jsonlite2
from asf_search.export.kml import results_to_kml
from asf_search.export.metalink import results_to_metalink
class ASFSearchResults(UserList):
def __init__(self, *args, opts: ASFSearchOptions = None):
super().__init__(*args)
# Store it JUST so the user can access it (There might be zero products)
# Each product will use their own reference to opts (but points to the same obj)
self.searchOptions = opts
self.searchComplete = False
def geojson(self):
return {
'type': 'FeatureCollection',
'features': [product.geojson() for product in self],
}
def csv(self):
return results_to_csv(self)
def kml(self):
return results_to_kml(self)
def metalink(self):
return results_to_metalink(self)
def jsonlite(self):
return results_to_jsonlite(self)
def jsonlite2(self):
return results_to_jsonlite2(self)
def find_urls(self, extension: str = None, pattern: str = r'.*', directAccess: bool = False) -> List[str]:
"""Returns a flat list of all https or s3 urls from all results matching an extension and/or regex pattern
param extension: the file extension to search for. (Defaults to `None`)
- Example: '.tiff'
        param pattern: A regex pattern to search each url for. (Defaults to `r'.*'`)
        - Example: `r'(QA_)+'` to find urls with 'QA_' at least once
        param directAccess: whether to search s3 bucket urls instead of https urls (Defaults to `False`)
"""
urls = []
for product in self:
urls.extend(product.find_urls(extension=extension, pattern=pattern, directAccess=directAccess))
return sorted(list(set(urls)))
def __str__(self):
return json.dumps(self.geojson(), indent=2, sort_keys=True)
def download(
self,
path: str,
session: ASFSession = None,
processes: int = 1,
fileType=FileDownloadType.DEFAULT_FILE,
) -> None:
"""
Iterates over each ASFProduct and downloads them to the specified path.
Parameters
----------
path:
The directory into which the products should be downloaded.
session:
The session to use
Defaults to the session used to fetch the results, or a new one if none was used.
processes:
Number of download processes to use. Defaults to 1 (i.e. sequential download)
"""
ASF_LOGGER.info(f'Started downloading ASFSearchResults of size {len(self)}.')
if processes == 1:
for product in self:
product.download(path=path, session=session, fileType=fileType)
else:
            ASF_LOGGER.info(f'Using {processes} processes - starting up pool.')
pool = Pool(processes=processes)
args = [(product, path, session, fileType) for product in self]
pool.map(_download_product, args)
pool.close()
pool.join()
ASF_LOGGER.info(f'Finished downloading ASFSearchResults of size {len(self)}.')
def raise_if_incomplete(self) -> None:
if not self.searchComplete:
msg = (
'Results are incomplete due to a search error. '
'See logging for more details. (ASFSearchResults.raise_if_incomplete called)'
)
ASF_LOGGER.error(msg)
raise ASFSearchError(msg)
def get_products_by_subclass_type(self) -> dict:
"""
Organizes results into dictionary by ASFProduct subclass name
: return: Dict of ASFSearchResults, organized by ASFProduct subclass names
"""
subclasses = {}
for product in self.data:
product_type = product.get_classname()
if subclasses.get(product_type) is None:
subclasses[product_type] = ASFSearchResults([])
subclasses[product_type].append(product)
return subclasses
def _download_product(args) -> None:
product, path, session, fileType = args
product.download(path=path, session=session, fileType=fileType)
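The per-product url filtering that `find_urls` aggregates can be sketched standalone (the url list and `filter_urls` helper are hypothetical, not real ASF urls or library code):

```python
# Sketch of extension/pattern/scheme filtering as done by find_urls.
import re

urls = [
    'https://example.test/granule_QA_mask.tiff',
    'https://example.test/granule.h5',
    's3://example-bucket/granule.tiff',
]

def filter_urls(urls, extension=None, pattern=r'.*', directAccess=False):
    scheme = 's3://' if directAccess else 'https://'
    matched = [
        url for url in urls
        if url.startswith(scheme)
        and (extension is None or url.endswith(extension))
        and re.search(pattern, url)
    ]
    # Deduplicate and sort, like ASFSearchResults.find_urls()
    return sorted(set(matched))

print(filter_urls(urls, extension='.tiff'))  # ['https://example.test/granule_QA_mask.tiff']
print(filter_urls(urls, directAccess=True))  # ['s3://example-bucket/granule.tiff']
```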
# asf_search/ASFSession.py
from logging import warn
import platform
from typing import List, Union
import requests
from requests.utils import get_netrc_auth
import http.cookiejar
from asf_search import ASF_LOGGER, __name__ as asf_name, __version__ as asf_version
from asf_search.exceptions import ASFAuthenticationError
import warnings
class ASFSession(requests.Session):
def __init__(
self,
edl_host: str = None,
edl_client_id: str = None,
asf_auth_host: str = None,
cmr_host: str = None,
cmr_collections: str = None,
auth_domains: List[str] = None,
auth_cookie_names: List[str] = None,
):
"""
ASFSession is a subclass of `requests.Session`, and is meant to ease
downloading ASF hosted data by simplifying logging in to Earthdata Login.
To create an EDL account, see here: https://urs.earthdata.nasa.gov/users/new
ASFSession provides three built-in methods for authorizing downloads:
- EDL Username and Password: `auth_with_creds()`
- EDL Token: `auth_with_token()`
- Authenticated cookiejars: `auth_with_cookiejar()`
Parameters
----------
`edl_host`:
the Earthdata login endpoint used by auth_with_creds().
Defaults to `asf_search.constants.INTERNAL.EDL_HOST`
`edl_client_id`:
The Earthdata Login client ID for this package.
Defaults to `asf_search.constants.INTERNAL.EDL_CLIENT_ID`
`asf_auth_host`:
the ASF auth endpoint.
Defaults to `asf_search.constants.INTERNAL.ASF_AUTH_HOST`
`cmr_host (DEPRECATED V7.0.9)`:
the base CMR endpoint to test EDL login tokens against.
Defaults to `asf_search.constants.INTERNAL.CMR_HOST`
`cmr_collections`:
the CMR endpoint path login tokens will be tested against.
Defaults to `asf_search.constants.INTERNAL.CMR_COLLECTIONS`
`auth_domains`:
the list of authorized endpoints that are allowed to pass auth credentials.
Defaults to `asf_search.constants.INTERNAL.AUTH_DOMAINS`.
Authorization headers WILL NOT be stripped from the session object
when redirected through these domains.
`auth_cookie_names`:
the list of cookie names to use when verifying
with `auth_with_creds()` & `auth_with_cookiejar()`
More information on Earthdata Login can be found here:
https://urs.earthdata.nasa.gov/documentation/faq
"""
super().__init__()
user_agent = '; '.join(
[
f'Python/{platform.python_version()}',
f'{requests.__name__}/{requests.__version__}',
f'{asf_name}/{asf_version}',
]
)
self.headers.update({'User-Agent': user_agent}) # For all hosts
self.headers.update({'Client-Id': f'{asf_name}_v{asf_version}'}) # For CMR
from asf_search.constants import INTERNAL
self.edl_host = INTERNAL.EDL_HOST if edl_host is None else edl_host
self.edl_client_id = INTERNAL.EDL_CLIENT_ID if edl_client_id is None else edl_client_id
self.asf_auth_host = INTERNAL.ASF_AUTH_HOST if asf_auth_host is None else asf_auth_host
self.cmr_collections = (
INTERNAL.CMR_COLLECTIONS if cmr_collections is None else cmr_collections
)
self.auth_domains = INTERNAL.AUTH_DOMAINS if auth_domains is None else auth_domains
self.auth_cookie_names = (
INTERNAL.AUTH_COOKIES if auth_cookie_names is None else auth_cookie_names
)
self.cmr_host = INTERNAL.CMR_HOST
if cmr_host is not None:
warnings.warn(
'Use of `cmr_host` keyword with `ASFSession` is deprecated '
'for asf-search versions >= 7.0.9, '
'and may be removed in a future major release.'
'\nTo authenticate an EDL token for a non-prod deployment of CMR, '
'set the `edl_host` keyword instead. '
                '\n(ex: session arguments for authenticating against uat: '
'`ASFSession(edl_host="uat.urs.earthdata.nasa.gov")`)',
category=DeprecationWarning,
stacklevel=2,
)
self.cmr_host = cmr_host
def __eq__(self, other):
return (
self.auth == other.auth
and self.headers == other.headers
and self.cookies == other.cookies
)
def auth_with_creds(self, username: str, password: str):
"""
Authenticates the session using EDL username/password credentials
Parameters
----------
username:
EDL username, see https://urs.earthdata.nasa.gov/
password:
EDL password, see https://urs.earthdata.nasa.gov/
host:
(optional): EDL host to log in to
Returns
----------
ASFSession
"""
login_url = f'https://{self.edl_host}/oauth/authorize?client_id={self.edl_client_id}&response_type=code&redirect_uri=https://{self.asf_auth_host}/login' # noqa F401
self.auth = (username, password)
ASF_LOGGER.info(f'Attempting to login via "{login_url}"')
self.get(login_url)
if not self._check_auth_cookies(self.cookies.get_dict()):
raise ASFAuthenticationError('Username or password is incorrect')
ASF_LOGGER.info('Login successful')
token = self.cookies.get_dict().get('urs-access-token')
if token is None:
warn(
f'Provided asf_auth_host "{self.asf_auth_host}" returned no EDL token '
'during ASFSession validation. EDL Token expected in "urs-access-token" cookie, '
'required for hidden/restricted dataset access. '
'The current session will use basic authorization.'
)
else:
ASF_LOGGER.info(
'Found "urs-access-token" cookie in response from auth host, '
'using token for downloads and cmr queries.'
)
self.auth = None
self._update_edl_token(token=token)
return self
def auth_with_token(self, token: str):
"""
Authenticates the session using an EDL Authorization: Bearer token
Parameters
----------
token:
EDL Auth Token for authenticated downloads, see https://urs.earthdata.nasa.gov/user_tokens
Returns
----------
ASFSession
"""
oauth_authorization = (
f'https://{self.edl_host}/oauth/tokens/user?client_id={self.edl_client_id}'
)
ASF_LOGGER.info(f'Authenticating EDL token against {oauth_authorization}')
response = self.post(url=oauth_authorization, data={'token': token})
if not 200 <= response.status_code <= 299:
if not self._try_legacy_token_auth(token=token):
raise ASFAuthenticationError('Invalid/Expired token passed')
ASF_LOGGER.info('EDL token authentication successful')
self._update_edl_token(token=token)
return self
    def _try_legacy_token_auth(self, token: str) -> bool:
"""
Checks `cmr_host` search endpoint directly with provided token
using method used in previous versions of asf-search (<7.0.9).
This may be removed in a future release
"""
from asf_search.constants import INTERNAL
if self.cmr_host != INTERNAL.CMR_HOST:
self.headers.update({'Authorization': 'Bearer {0}'.format(token)})
legacy_auth_url = f'https://{self.cmr_host}{self.cmr_collections}'
response = self.get(legacy_auth_url)
self.headers.pop('Authorization')
return 200 <= response.status_code <= 299
return False
def _update_edl_token(self, token: str):
self.headers.update({'Authorization': 'Bearer {0}'.format(token)})
def auth_with_cookiejar(
self,
cookies: Union[http.cookiejar.CookieJar, requests.cookies.RequestsCookieJar],
):
"""
Authenticates the session using a pre-existing cookiejar
:param cookies: Any http.cookiejar compatible object
:return ASFSession: returns self for convenience
"""
if not self._check_auth_cookies(cookies):
raise ASFAuthenticationError('Cookiejar does not contain login cookies')
for cookie in cookies:
if cookie.is_expired():
raise ASFAuthenticationError('Cookiejar contains expired cookies')
token = cookies.get_dict().get('urs-access-token')
if token is None:
ASF_LOGGER.warning(
'Failed to find EDL Token in cookiejar. '
'EDL Token expected in "urs-access-token" cookie, '
'required for hidden/restricted dataset access.'
)
else:
ASF_LOGGER.info('Authenticating EDL token found in "urs-access-token" cookie')
try:
self.auth_with_token(token)
except ASFAuthenticationError:
ASF_LOGGER.warning(
                    'Failed to authenticate with the EDL token found in the cookiejar. '
'Access to hidden/restricted cmr data may be limited.'
)
self.cookies = cookies
return self
def _check_auth_cookies(
self,
cookies: Union[http.cookiejar.CookieJar, requests.cookies.RequestsCookieJar],
) -> bool:
if isinstance(cookies, requests.cookies.RequestsCookieJar):
cookies = dict(cookies)
return any(cookie in self.auth_cookie_names for cookie in cookies)
def rebuild_auth(self, prepared_request: requests.Request, response: requests.Response):
"""
Overrides requests.Session.rebuild_auth()
default behavior of stripping the Authorization header
upon redirect. This allows token authentication to work with redirects to trusted domains
"""
headers = prepared_request.headers
url = prepared_request.url
if 'Authorization' in headers:
original_domain = '.'.join(self._get_domain(response.request.url).split('.')[-3:])
redirect_domain = '.'.join(self._get_domain(url).split('.')[-3:])
if original_domain != redirect_domain and (
original_domain not in self.auth_domains or redirect_domain not in self.auth_domains
):
del headers['Authorization']
new_auth = get_netrc_auth(url) if self.trust_env else None
if new_auth is not None:
prepared_request.prepare_auth(new_auth)
def _get_domain(self, url: str):
return requests.utils.urlparse(url).hostname
# multi-processing does an implicit copy of ASFSession objects,
# this ensures ASFSession class variables are included
def __getstate__(self):
state = super().__getstate__()
state = {
**state,
'edl_host': self.edl_host,
'edl_client_id': self.edl_client_id,
'asf_auth_host': self.asf_auth_host,
'cmr_host': self.cmr_host,
'cmr_collections': self.cmr_collections,
'auth_domains': self.auth_domains,
'auth_cookie_names': self.auth_cookie_names,
}
return state
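The trusted-domain comparison `rebuild_auth` makes before stripping the `Authorization` header can be sketched standalone (the `AUTH_DOMAINS` list here is illustrative, not the real `INTERNAL.AUTH_DOMAINS`):

```python
# Sketch of rebuild_auth's redirect check: keep the Authorization header
# only when the redirect stays on the same base domain, or when both the
# original and redirect base domains are in the trusted list.
from urllib.parse import urlparse

AUTH_DOMAINS = ['asf.alaska.edu', 'earthdata.nasa.gov']

def base_domain(url):
    # Keep only the last three labels of the hostname, as rebuild_auth does
    return '.'.join(urlparse(url).hostname.split('.')[-3:])

def keeps_auth_header(original_url, redirect_url):
    original = base_domain(original_url)
    redirect = base_domain(redirect_url)
    return original == redirect or (
        original in AUTH_DOMAINS and redirect in AUTH_DOMAINS
    )

print(keeps_auth_header('https://urs.earthdata.nasa.gov/a', 'https://cmr.earthdata.nasa.gov/b'))  # True
print(keeps_auth_header('https://urs.earthdata.nasa.gov/a', 'https://evil.example.com/b'))        # False
```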
# asf_search/ASFStackableProduct.py
from enum import Enum
import copy
from typing import Dict, Union
from asf_search import ASFSession, ASFProduct
from asf_search.ASFSearchOptions import ASFSearchOptions
from asf_search.exceptions import ASFBaselineError
class ASFStackableProduct(ASFProduct):
"""
    Base class for products that support baseline stacking, e.g. ERS-1 and ERS-2 products
ASF ERS-1 Dataset Documentation Page: https://asf.alaska.edu/datasets/daac/ers-1/
ASF ERS-2 Dataset Documentation Page: https://asf.alaska.edu/datasets/daac/ers-2/
"""
class BaselineCalcType(Enum):
"""
Defines how asf-search will calculate perpendicular baseline for products of this subclass
"""
PRE_CALCULATED = 0
"""Has pre-calculated insarBaseline value that will be used for perpendicular calculations""" # noqa F401
CALCULATED = 1
"""Uses position/velocity state vectors and ascending node time for perpendicular calculations""" # noqa F401
baseline_type = BaselineCalcType.PRE_CALCULATED
"""Determines how asf-search will attempt to stack products of this type."""
def __init__(self, args: Dict = {}, session: ASFSession = ASFSession()):
super().__init__(args, session)
self.baseline = self.get_baseline_calc_properties()
def get_baseline_calc_properties(self) -> Dict:
insarBaseline = self.umm_cast(
float,
self.umm_get(
self.umm,
'AdditionalAttributes',
('Name', 'INSAR_BASELINE'),
'Values',
0,
),
)
if insarBaseline is None:
return None
return {'insarBaseline': insarBaseline}
def get_stack_opts(self, opts: ASFSearchOptions = None):
        stack_opts = ASFSearchOptions() if opts is None else copy.copy(opts)
stack_opts.processingLevel = self.get_default_baseline_product_type()
if self.properties.get('insarStackId') in [None, 'NA', 0, '0']:
raise ASFBaselineError(
'Requested reference product needs a baseline stack ID '
f'but does not have one: {self.properties["fileID"]}'
)
stack_opts.insarStackId = self.properties['insarStackId']
return stack_opts
def is_valid_reference(self):
        # Unlike stacking S1 products, we don't stack at all if any product
        # in the stack is missing insarBaseline
if 'insarBaseline' not in self.baseline:
raise ValueError('No baseline values available for precalculated dataset')
return True
@staticmethod
def get_default_baseline_product_type() -> Union[str, None]:
"""
Returns the product type to search for when building a baseline stack.
"""
return None
def has_baseline(self) -> bool:
baseline = self.get_baseline_calc_properties()
return baseline is not None
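The BaselineCalcType dispatch above determines whether a product can anchor a baseline stack. A minimal, self-contained sketch of that logic (local stand-ins only; nothing here is the asf_search API itself):

```python
from enum import Enum


class BaselineCalcType(Enum):
    # Mirrors ASFStackableProduct.BaselineCalcType above; local stand-in.
    PRE_CALCULATED = 0  # product carries a pre-computed insarBaseline value
    CALCULATED = 1      # baseline derived from state vectors at stack time


def has_baseline(properties: dict, baseline_type: BaselineCalcType) -> bool:
    # PRE_CALCULATED products are only stackable when insarBaseline exists,
    # mirroring the is_valid_reference() check above.
    if baseline_type is BaselineCalcType.PRE_CALCULATED:
        return properties.get('insarBaseline') is not None
    return True


print(has_baseline({'insarBaseline': 102.5}, BaselineCalcType.PRE_CALCULATED))  # True
print(has_baseline({}, BaselineCalcType.PRE_CALCULATED))                        # False
```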
Discovery-asf_search-8.1.2/asf_search/CMR/MissionList.py
from typing import Dict
from asf_search.exceptions import CMRError
from asf_search.constants.INTERNAL import CMR_HOST, CMR_COLLECTIONS_PATH
import requests
def get_campaigns(data) -> Dict:
"""Queries CMR Collections endpoint for
collections associated with the given platform
:param data: a dictionary with required keys:
'include_facets', 'provider', 'platform[]' and optional key: 'instrument[]'
:return: Dictionary containing CMR umm_json response
"""
response = requests.post(f'https://{CMR_HOST}{CMR_COLLECTIONS_PATH}', data=data)
if response.status_code != 200:
raise CMRError(f'CMR_ERROR {response.status_code}: {response.text}')
try:
data = response.json()
except Exception as e:
raise CMRError(f'CMR_ERROR: Error parsing JSON from CMR: {e}')
return data
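get_campaigns() above POSTs a form-encoded query to the CMR collections endpoint. A hedged sketch of the payload its docstring describes; the host and path constants below are local placeholder assumptions, not the real asf_search.constants.INTERNAL values:

```python
# Placeholder endpoint constants (assumptions, not asf_search's actual values).
CMR_HOST = 'cmr.earthdata.nasa.gov'
CMR_COLLECTIONS_PATH = '/search/collections'

# Payload shape per the get_campaigns() docstring: the required keys,
# with 'instrument[]' optionally added.
data = {
    'include_facets': 'true',
    'provider': 'ASF',
    'platform[]': 'SENTINEL-1A',
}

url = f'https://{CMR_HOST}{CMR_COLLECTIONS_PATH}'
print(url)  # https://cmr.earthdata.nasa.gov/search/collections
```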
Discovery-asf_search-8.1.2/asf_search/CMR/__init__.py
from .MissionList import get_campaigns  # noqa: F401
from .subquery import build_subqueries # noqa: F401
from .translate import translate_opts # noqa: F401
from .field_map import field_map # noqa: F401
from .datasets import ( # noqa: F401
dataset_collections, # noqa: F401
collections_per_platform, # noqa: F401
collections_by_processing_level, # noqa: F401
get_concept_id_alias, # noqa: F401
get_dataset_concept_ids, # noqa: F401
)
Discovery-asf_search-8.1.2/asf_search/CMR/datasets.py
from typing import List
dataset_collections = {
'NISAR': {
'NISAR_NEN_RRST_BETA_V1': [
'C1261815181-ASFDEV',
'C1261815288-ASF',
'C2850220296-ASF',
],
'NISAR_NEN_RRST_PROVISIONAL_V1': [
'C1261832381-ASFDEV',
'C1261832657-ASF',
'C2853068083-ASF',
],
'NISAR_NEN_RRST_V1': [
'C1256533420-ASFDEV',
'C1257349121-ASF',
'C2727902012-ASF',
],
'NISAR_L0A_RRST_BETA_V1': [
'C1261813453-ASFDEV',
'C1261815147-ASF',
'C2850223384-ASF',
],
'NISAR_L0A_RRST_PROVISIONAL_V1': [
'C1261832466-ASFDEV',
'C1261832658-ASF',
'C2853086824-ASF',
],
'NISAR_L0A_RRST_V1': [
'C1256524081-ASFDEV',
'C1257349120-ASF',
'C2727901263-ASF',
],
'NISAR_L0B_RRSD_BETA_V1': [
'C1261815274-ASFDEV',
'C1261815289-ASF',
'C2850224301-ASF',
],
'NISAR_L0B_RRSD_PROVISIONAL_V1': [
'C1261832497-ASFDEV',
'C1261832659-ASF',
'C2853089814-ASF',
],
'NISAR_L0B_RRSD_V1': [
'C1256358262-ASFDEV',
'C1257349115-ASF',
'C2727901639-ASF',
],
'NISAR_L0B_CRSD_BETA_V1': [
'C1261815276-ASFDEV',
'C1261815301-ASF',
'C2850225137-ASF',
],
'NISAR_L0B_CRSD_PROVISIONAL_V1': [
'C1261832632-ASFDEV',
'C1261832671-ASF',
'C2853091612-ASF',
],
'NISAR_L0B_CRSD_V1': [
'C1256358463-ASFDEV',
'C1257349114-ASF',
'C2727901523-ASF',
],
'NISAR_L1_RSLC_BETA_V1': [
'C1261813489-ASFDEV',
'C1261815148-ASF',
'C2850225585-ASF',
],
'NISAR_L1_RSLC_PROVISIONAL_V1': [
'C1261832868-ASFDEV',
'C1261833052-ASF',
'C2853145197-ASF',
],
'NISAR_L1_RSLC_V1': [
'C1256363301-ASFDEV',
'C1257349109-ASF',
'C2727900439-ASF',
],
'NISAR_L1_RIFG_BETA_V1': [
'C1261819086-ASFDEV',
'C1261819120-ASF',
'C2850234202-ASF',
],
'NISAR_L1_RIFG_PROVISIONAL_V1': [
'C1261832940-ASFDEV',
'C1261833063-ASF',
'C2853147928-ASF',
],
'NISAR_L1_RIFG_V1': [
'C1256381769-ASFDEV',
'C1257349108-ASF',
'C2723110181-ASF',
],
'NISAR_L1_RUNW_BETA_V1': [
'C1261819098-ASFDEV',
'C1261819121-ASF',
'C2850235455-ASF',
],
'NISAR_L1_RUNW_PROVISIONAL_V1': [
'C1261832990-ASFDEV',
'C1261833064-ASF',
'C2853153429-ASF',
],
'NISAR_L1_RUNW_V1': [
'C1256420738-ASFDEV',
'C1257349107-ASF',
'C2727900827-ASF',
],
'NISAR_L1_ROFF_BETA_V1': [
'C1261819110-ASFDEV',
'C1261819145-ASF',
'C2850237619-ASF',
],
'NISAR_L1_ROFF_PROVISIONAL_V1': [
'C1261832993-ASFDEV',
'C1261833076-ASF',
'C2853156054-ASF',
],
'NISAR_L1_ROFF_V1': [
'C1256411631-ASFDEV',
'C1257349103-ASF',
'C2727900080-ASF',
],
'NISAR_L2_GSLC_BETA_V1': [
'C1261819167-ASFDEV',
'C1261819258-ASF',
'C2850259510-ASF',
],
'NISAR_L2_GSLC_PROVISIONAL_V1': [
'C1261833024-ASFDEV',
'C1261833127-ASF',
'C2854332392-ASF',
],
'NISAR_L2_GSLC_V1': [
'C1256413628-ASFDEV',
'C1257349102-ASF',
'C2727896667-ASF',
],
'NISAR_L2_GUNW_BETA_V1': [
'C1261819168-ASFDEV',
'C1261819270-ASF',
'C2850261892-ASF',
],
'NISAR_L2_GUNW_PROVISIONAL_V1': [
'C1261833025-ASFDEV',
'C1261846741-ASF',
'C2854335566-ASF',
],
'NISAR_L2_GUNW_V1': [
'C1256432264-ASFDEV',
'C1257349096-ASF',
'C2727897718-ASF',
],
'NISAR_L2_GCOV_BETA_V1': [
'C1261819211-ASFDEV',
'C1261819275-ASF',
'C2850262927-ASF',
],
'NISAR_L2_GCOV_PROVISIONAL_V1': [
'C1261833026-ASFDEV',
'C1261846880-ASF',
'C2854338529-ASF',
],
'NISAR_L2_GCOV_V1': [
'C1256477304-ASFDEV',
'C1257349095-ASF',
'C2727896018-ASF',
],
'NISAR_L2_GOFF_BETA_V1': [
'C1261819233-ASFDEV',
'C1261819281-ASF',
'C2850263910-ASF',
],
'NISAR_L2_GOFF_PROVISIONAL_V1': [
'C1261833027-ASFDEV',
'C1261846994-ASF',
'C2854341702-ASF',
],
'NISAR_L2_GOFF_V1': [
'C1256479237-ASFDEV',
'C1257349094-ASF',
'C2727896460-ASF',
],
'NISAR_L3_SME2_BETA_V1': [
'C1261819245-ASFDEV',
'C1261819282-ASF',
'C2850265000-ASF',
],
'NISAR_L3_SME2_PROVISIONAL_V1': [
'C1261833050-ASFDEV',
'C1261847095-ASF',
'C2854344945-ASF',
],
'NISAR_L3_SME2_V1': [
'C1256568692-ASFDEV',
'C1257349093-ASF',
'C2727894546-ASF',
],
'NISAR_CUSTOM_PROVISIONAL_V1': [
'C1262134528-ASFDEV',
'C1262135006-ASF',
'C2874824964-ASF',
],
},
'SENTINEL-1': {
'SENTINEL-1A_SLC': ['C1214470488-ASF', 'C1205428742-ASF', 'C1234413245-ASFDEV'],
'SENTINEL-1B_SLC': ['C1327985661-ASF', 'C1216244348-ASF', 'C1234413263-ASFDEV'],
'SENTINEL-1A_DP_GRD_HIGH': [
'C1214470533-ASF',
'C1212201032-ASF',
'C1234413229-ASFDEV',
],
'SENTINEL-1A_DP_META_GRD_HIGH': [
'C1214470576-ASF',
'C1212209226-ASF',
'C1234413232-ASFDEV',
],
'SENTINEL-1B_DP_GRD_HIGH': [
'C1327985645-ASF',
'C1216244589-ASF',
'C1234413247-ASFDEV',
],
'SENTINEL-1A_META_SLC': [
'C1214470496-ASF',
'C1208117434-ASF',
'C1234413236-ASFDEV',
],
'SENTINEL-1A_META_RAW': [
'C1214470532-ASF',
'C1208115009-ASF',
'C1234413235-ASFDEV',
],
'SENTINEL-1A_OCN': ['C1214472977-ASF', 'C1212212560-ASF', 'C1234413237-ASFDEV'],
'SENTINEL-1A_DP_META_GRD_MEDIUM': [
'C1214472336-ASF',
'C1212212493-ASF',
'C1234413233-ASFDEV',
],
'SENTINEL-1A_META_OCN': [
'C1266376001-ASF',
'C1215704763-ASF',
'C1234413234-ASFDEV',
],
'SENTINEL-1A_SP_META_GRD_HIGH': [
'C1214470732-ASF',
'C1212158326-ASF',
'C1234413243-ASFDEV',
],
'SENTINEL-1B_DP_GRD_MEDIUM': [
'C1327985660-ASF',
'C1216244594-ASF',
'C1234413248-ASFDEV',
],
'SENTINEL-1B_DP_META_GRD_HIGH': [
'C1327985741-ASF',
'C1216244601-ASF',
'C1234413250-ASFDEV',
],
'SENTINEL-1B_DP_META_GRD_MEDIUM': [
'C1327985578-ASF',
'C1216244591-ASF',
'C1234413251-ASFDEV',
],
'SENTINEL-1B_META_RAW': [
'C1327985650-ASF',
'C1216244595-ASF',
'C1234413253-ASFDEV',
],
'SENTINEL-1B_META_SLC': [
'C1327985617-ASF',
'C1216244585-ASF',
'C1234413254-ASFDEV',
],
'SENTINEL-1B_OCN': ['C1327985579-ASF', 'C1216244593-ASF', 'C1234413255-ASFDEV'],
'SENTINEL-1B_SP_META_GRD_HIGH': [
'C1327985619-ASF',
'C1216244587-ASF',
'C1234413261-ASFDEV',
],
'SENTINEL-1A_SP_GRD_MEDIUM': [
'C1214472994-ASF',
'C1212158318-ASF',
'C1234413241-ASFDEV',
],
'SENTINEL-1A_SP_META_GRD_MEDIUM': [
'C1214473170-ASF',
'C1212233976-ASF',
'C1234413244-ASFDEV',
],
'SENTINEL-1B_META_OCN': [
'C1327985646-ASF',
'C1216244590-ASF',
'C1234413252-ASFDEV',
],
'SENTINEL-1B_SP_GRD_MEDIUM': [
'C1327985740-ASF',
'C1216244600-ASF',
'C1234413259-ASFDEV',
],
'SENTINEL-1B_SP_META_GRD_MEDIUM': [
'C1327985739-ASF',
'C1216244598-ASF',
'C1234413262-ASFDEV',
],
'SENTINEL-1A_RAW': ['C1214470561-ASF', 'C1205264459-ASF', 'C1234413238-ASFDEV'],
'SENTINEL-1A_DP_GRD_MEDIUM': [
'C1214471521-ASF',
'C1212209035-ASF',
'C1234413230-ASFDEV',
],
'SENTINEL-1A_SP_GRD_HIGH': [
'C1214470682-ASF',
'C1212158327-ASF',
'C1234413240-ASFDEV',
],
'SENTINEL-1B_RAW': ['C1327985647-ASF', 'C1216244592-ASF', 'C1234413256-ASFDEV'],
'SENTINEL-1A_DP_GRD_FULL': [
'C1214471197-ASF',
'C1212200781-ASF',
'C1234413228-ASFDEV',
],
'SENTINEL-1A_DP_META_GRD_FULL': [
'C1214471960-ASF',
'C1212209075-ASF',
'C1234413231-ASFDEV',
],
'SENTINEL-1A_SP_GRD_FULL': ['C1214472978-ASF', 'C1234413239-ASFDEV'],
'SENTINEL-1A_SP_META_GRD_FULL': ['C1214473165-ASF', 'C1234413242-ASFDEV'],
'SENTINEL-1B_DP_GRD_FULL': [
'C1327985697-ASF',
'C1216244597-ASF',
'C1234413246-ASFDEV',
],
'SENTINEL-1B_DP_META_GRD_FULL': [
'C1327985651-ASF',
'C1216244596-ASF',
'C1234413249-ASFDEV',
],
'SENTINEL-1B_SP_GRD_FULL': [
'C1327985644-ASF',
'C1216244588-ASF',
'C1234413257-ASFDEV',
],
'SENTINEL-1B_SP_GRD_HIGH': [
'C1327985571-ASF',
'C1216244586-ASF',
'C1234413258-ASFDEV',
],
'SENTINEL-1B_SP_META_GRD_FULL': [
'C1327985674-ASF',
'C1216244599-ASF',
'C1234413260-ASFDEV',
],
'S1_Bursts': ['C1244552887-ASFDEV'],
'SENTINEL-1_BURSTS_DEV10': ['C1257175154-ASFDEV'],
'Sentinel-1_Burst_Map': ['C1244598379-ASFDEV'],
'Various Browse Images': ['C1240784657-ASFDEV'],
},
'OPERA-S1': {
'OPERA_L2_CSLC-S1_V1': ['C2777443834-ASF', 'C1259976861-ASF'],
'OPERA_L2_RTC-S1_V1': ['C2777436413-ASF', 'C1259974840-ASF'],
'OPERA_L2_CSLC-S1-STATIC_PROVISIONAL_V0': ['C1258354200-ASF'],
'OPERA_L2_CSLC-S1-STATIC_V1': ['C1259982010-ASF', 'C2795135668-ASF'],
'OPERA_L2_CSLC-S1_PROVISIONAL_V0': ['C1257995185-ASF'],
'OPERA_L2_RTC-S1-STATIC_PROVISIONAL_V0': ['C1258354201-ASF'],
'OPERA_L2_RTC-S1-STATIC_V1': ['C1259981910-ASF', 'C2795135174-ASF'],
'OPERA_L2_RTC-S1_PROVISIONAL_V0': ['C1257995186-ASF'],
},
'OPERA-S1-CALVAL': {
'OPERA_L2_CSLC-S1_CALVAL_V1': ['C1260721945-ASF', 'C2803501758-ASF'],
'OPERA_L2_RTC-S1_CALVAL_V1': ['C1260721853-ASF', 'C2803501097-ASF'],
},
'SLC-BURST': {'SENTINEL-1_BURSTS': ['C2709161906-ASF', 'C1257024016-ASF']},
'ALOS PALSAR': {
'ALOS_PSR_RTC_HIGH': ['C1206487504-ASF', 'C1207181535-ASF'],
'ALOS_PSR_L1.5': ['C1206485940-ASF', 'C1205261223-ASF'],
'ALOS_PSR_RTC_LOW': ['C1206487217-ASF', 'C1208013295-ASF'],
'ALOS_PSR_KMZ': ['C1206156901-ASF', 'C1207019609-ASF'],
'ALOS_PSR_L1.0': ['C1206485320-ASF'],
'ALOS_PSR_L1.1': ['C1206485527-ASF', 'C1207710476-ASF', 'C1239611505-ASFDEV'],
'ALOS_PSR_L2.2': ['C2011599335-ASF', 'C1239927797-ASF', 'C1238733834-ASFDEV'],
'ALOS_PALSAR_INSAR_METADATA': ['C1229740239-ASF'],
},
'ALOS AVNIR-2': {
'ALOS_AVNIR_OBS_ORI': [
'C1808440897-ASF',
'C1233629671-ASF',
'C1234413224-ASFDEV',
],
'ALOS_AVNIR_OBS_ORI_BROWSE': ['C1234712303-ASF'],
},
'SIR-C': {
'STS-59_BROWSE_GRD': [
'C1661710578-ASF',
'C1226557819-ASF',
'C1234413264-ASFDEV',
],
'STS-59_BROWSE_SLC': [
'C1661710581-ASF',
'C1226557809-ASF',
'C1234413265-ASFDEV',
],
'STS-59_GRD': ['C1661710583-ASF', 'C1226557808-ASF', 'C1234413266-ASFDEV'],
'STS-59_META_GRD': ['C1661710586-ASF', 'C1226557810-ASF', 'C1234413267-ASFDEV'],
'STS-59_META_SLC': ['C1661710588-ASF', 'C1226557811-ASF', 'C1234413268-ASFDEV'],
'STS-59_SLC': ['C1661710590-ASF', 'C1226557812-ASF', 'C1234413269-ASFDEV'],
'STS-68_BROWSE_GRD': [
'C1661710593-ASF',
'C1226557813-ASF',
'C1234413270-ASFDEV',
],
'STS-68_BROWSE_SLC': [
'C1661710596-ASF',
'C1226557814-ASF',
'C1234413271-ASFDEV',
],
'STS-68_GRD': ['C1661710597-ASF', 'C1226557815-ASF', 'C1234413272-ASFDEV'],
'STS-68_META_GRD': ['C1661710600-ASF', 'C1226557816-ASF', 'C1234413273-ASFDEV'],
'STS-68_META_SLC': ['C1661710603-ASF', 'C1226557817-ASF', 'C1234413274-ASFDEV'],
'STS-68_SLC': ['C1661710604-ASF', 'C1226557818-ASF', 'C1234413275-ASFDEV'],
},
'ARIA S1 GUNW': {
'SENTINEL-1_INTERFEROGRAMS': ['C1595422627-ASF', 'C1225776654-ASF'],
'SENTINEL-1_INTERFEROGRAMS_AMPLITUDE': ['C1596065640-ASF', 'C1225776655-ASF'],
'SENTINEL-1_INTERFEROGRAMS_COHERENCE': ['C1596065639-ASF', 'C1225776657-ASF'],
'SENTINEL-1_INTERFEROGRAMS_CONNECTED_COMPONENTS': [
'C1596065641-ASF',
'C1225776658-ASF',
],
'SENTINEL-1_INTERFEROGRAMS_UNWRAPPED_PHASE': [
'C1595765183-ASF',
'C1225776659-ASF',
],
'ARIA_S1_GUNW': ['C2859376221-ASF', 'C1261881077-ASF'],
},
'SMAP': {
'SPL1A_RO_METADATA_003': ['C1243122884-ASF', 'C1233103964-ASF'],
'SPL1A_RO_QA_003': ['C1243124139-ASF', 'C1216074923-ASF'],
'SPL1A_001': ['C1214473171-ASF', 'C1212243761-ASF'],
'SPL1A_002': ['C1243149604-ASF', 'C1213091807-ASF'],
'SPL1A_METADATA_001': ['C1214473426-ASF', 'C1212243437-ASF'],
'SPL1A_METADATA_002': ['C1243119801-ASF', 'C1213096699-ASF'],
'SPL1A_QA_001': ['C1214473839-ASF', 'C1212249653-ASF'],
'SPL1A_QA_002': ['C1243133204-ASF', 'C1213101573-ASF'],
'SPL1A_RO_001': ['C1243197402-ASF'],
'SPL1A_RO_002': ['C1243215430-ASF', 'C1213136240-ASF'],
'SPL1A_RO_003': ['C1243124754-ASF', 'C1216074755-ASF'],
'SPL1A_RO_METADATA_001': ['C1243141638-ASF', 'C1213136752-ASF'],
'SPL1A_RO_METADATA_002': ['C1243162394-ASF', 'C1213136799-ASF'],
'SPL1A_RO_QA_001': ['C1243168733-ASF', 'C1213136709-ASF'],
'SPL1A_RO_QA_002': ['C1243168866-ASF', 'C1213136844-ASF'],
'SPL1B_SO_LoRes_001': ['C1214473308-ASF', 'C1212249811-ASF'],
'SPL1B_SO_LoRes_002': ['C1243253631-ASF', 'C1213125007-ASF'],
'SPL1B_SO_LoRes_003': ['C1243133445-ASF', 'C1216074919-ASF'],
'SPL1B_SO_LoRes_METADATA_001': ['C1214473550-ASF', 'C1212196951-ASF'],
'SPL1B_SO_LoRes_METADATA_002': ['C1243197502-ASF', 'C1213115690-ASF'],
'SPL1B_SO_LoRes_METADATA_003': ['C1243126328-ASF', 'C1216074758-ASF'],
'SPL1B_SO_LoRes_QA_001': ['C1214474243-ASF', 'C1212243666-ASF'],
'SPL1B_SO_LoRes_QA_002': ['C1243216659-ASF', 'C1213115896-ASF'],
'SPL1B_SO_LoRes_QA_003': ['C1243129847-ASF', 'C1216074761-ASF'],
'SPL1C_S0_HiRes_001': ['C1214473367-ASF', 'C1212250364-ASF'],
'SPL1C_S0_HiRes_002': ['C1243268956-ASF', 'C1213134622-ASF'],
'SPL1C_S0_HiRes_003': ['C1243144528-ASF', 'C1216074770-ASF'],
'SPL1C_S0_HiRes_METADATA_001': ['C1214473624-ASF', 'C1212246173-ASF'],
'SPL1C_S0_HiRes_METADATA_002': ['C1243228612-ASF', 'C1213125156-ASF'],
'SPL1C_S0_HiRes_METADATA_003': ['C1243136142-ASF', 'C1216074764-ASF'],
'SPL1C_S0_HiRes_QA_001': ['C1214474435-ASF', 'C1212249773-ASF'],
'SPL1C_S0_HiRes_QA_002': ['C1243255360-ASF', 'C1213134486-ASF'],
'SPL1C_S0_HiRes_QA_003': ['C1243140611-ASF', 'C1233101609-ASF'],
'SPL1A_003': ['C1216074922-ASF'],
'SPL1A_METADATA_003': ['C1216074750-ASF'],
'SPL1A_QA_003': ['C1216074751-ASF'],
},
'UAVSAR': {
'UAVSAR_POL_META': ['C1214353986-ASF', 'C1210487703-ASF'],
'UAVSAR_INSAR_META': ['C1214336717-ASF', 'C1212030772-ASF'],
'UAVSAR_INSAR_INT': ['C1214336045-ASF', 'C1212001698-ASF'],
'UAVSAR_INSAR_AMP': ['C1214335430-ASF', 'C1206116665-ASF'],
'UAVSAR_INSAR_AMP_GRD': ['C1214335471-ASF', 'C1206132445-ASF'],
'UAVSAR_INSAR_DEM': ['C1214335903-ASF', 'C1211962154-ASF'],
'UAVSAR_INSAR_INT_GRD': ['C1214336154-ASF', 'C1212005594-ASF'],
'UAVSAR_INSAR_KMZ': ['C1214336554-ASF', 'C1212019993-ASF'],
'UAVSAR_POL_DEM': ['C1214353593-ASF', 'C1207638502-ASF'],
'UAVSAR_POL_INC': ['C1214353754-ASF', 'C1210025872-ASF'],
'UAVSAR_POL_KMZ': ['C1214353859-ASF', 'C1210485039-ASF'],
'UAVSAR_POL_ML_CMPLX_GRD': ['C1214337770-ASF', 'C1207188317-ASF'],
'UAVSAR_POL_ML_CMPLX_GRD_3X3': ['C1214354144-ASF', 'C1210546638-ASF'],
'UAVSAR_POL_ML_CMPLX_GRD_5X5': ['C1214354235-ASF', 'C1206122195-ASF'],
'UAVSAR_POL_ML_CMPLX_SLANT': ['C1214343609-ASF', 'C1209970710-ASF'],
'UAVSAR_POL_PAULI': ['C1214354031-ASF', 'C1207038647-ASF'],
'UAVSAR_POL_SLOPE': ['C1214408428-ASF', 'C1210599503-ASF'],
'UAVSAR_POL_STOKES': ['C1214419355-ASF', 'C1210599673-ASF'],
},
'RADARSAT-1': {
'RSAT-1_L0': ['C1206897141-ASF'],
'RSAT-1_L1': ['C1206936391-ASF', 'C1205181982-ASF'],
'RSAT-1_POLAR_YEAR_ANTARCTICA_L1': ['C1215670813-ASF'],
'RSAT-1_POLAR_YEAR_GREENLAND_L0': ['C1215709884-ASF'],
'RSAT-1_POLAR_YEAR_GREENLAND_L1': ['C1215709880-ASF'],
'RSAT-1_POLAR_YEAR_KAMCHATKA_L1': ['C1215714443-ASF'],
'RSAT-1_POLAR_YEAR_SEA_ICE_MIN_MAX_L1': ['C1215775284-ASF'],
'RSAT-1_POLAR_YEAR_TOOLIK_L1': ['C1215614037-ASF'],
},
'ERS': {
'ERS-1_L0': ['C1210197768-ASF', 'C1205261222-ASF'],
'ERS-1_L1': ['C1211627521-ASF', 'C1205302527-ASF'],
'ERS-2_L0': ['C1208794942-ASF', 'C1207143701-ASF'],
'ERS-2_L1': ['C1209373626-ASF', 'C1207144966-ASF'],
},
'JERS-1': {
'JERS-1_L0': ['C1208662092-ASF', 'C1207175327-ASF'],
'JERS-1_L1': ['C1207933168-ASF', 'C1207177736-ASF'],
},
'AIRSAR': {
'AIRSAR_POL_3FP': ['C1213921661-ASF', 'C1205256880-ASF'],
'AIRSAR_INT_JPG': ['C1213921626-ASF', 'C1000000306-ASF'],
'AIRSAR_POL_SYN_3FP': ['C1213928843-ASF', 'C1208713702-ASF'],
'AIRSAR_TOP_C-DEM_STOKES': ['C1213927035-ASF', 'C1208707768-ASF'],
'AIRSAR_TOP_DEM': ['C179001730-ASF', 'C1208655639-ASF'],
'AIRSAR_TOP_DEM_C': ['C1213925022-ASF', 'C1208680681-ASF'],
'AIRSAR_TOP_DEM_L': ['C1213926419-ASF', 'C1208691361-ASF'],
'AIRSAR_TOP_DEM_P': ['C1213926777-ASF', 'C1208703384-ASF'],
'AIRSAR_TOP_L-STOKES': ['C1213927939-ASF'],
'AIRSAR_TOP_P-STOKES': ['C1213928209-ASF'],
'AIRSAR_INT': ['C1208652494-ASF'],
},
'SEASAT': {
'SEASAT_SAR_L1_TIFF': ['C1206500826-ASF', 'C1206752770-ASF'],
'SEASAT_SAR_L1_HDF5': ['C1206500991-ASF', 'C1206144699-ASF'],
},
}
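The dataset_collections mapping above groups CMR concept IDs by dataset and then by collection name. A hypothetical helper (not part of asf_search) showing how such a mapping can be flattened into the full concept ID list for one dataset:

```python
from typing import Dict, List


def concept_ids_for_dataset(mapping: Dict[str, Dict[str, List[str]]],
                            dataset: str) -> List[str]:
    # Collect every concept ID under every collection of the named dataset.
    ids: List[str] = []
    for concept_ids in mapping.get(dataset, {}).values():
        ids.extend(concept_ids)
    return ids


# Small excerpt of the ERS entry above, used as sample input.
sample = {
    'ERS': {
        'ERS-1_L0': ['C1210197768-ASF', 'C1205261222-ASF'],
        'ERS-1_L1': ['C1211627521-ASF', 'C1205302527-ASF'],
    }
}
print(concept_ids_for_dataset(sample, 'ERS'))
```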
collections_per_platform = {
'SENTINEL-1A': [
'C2803501758-ASF',
'C2803501097-ASF',
'C1214470488-ASF',
'C1214470533-ASF',
'C1214470576-ASF',
'C1595422627-ASF',
'C2859376221-ASF',
'C1261881077-ASF',
'C1214470496-ASF',
'C1214470532-ASF',
'C1214472977-ASF',
'C1214472336-ASF',
'C1266376001-ASF',
'C1214472994-ASF',
'C1214470732-ASF',
'C1214473170-ASF',
'C1214470561-ASF',
'C1214471521-ASF',
'C1214470682-ASF',
'C2777443834-ASF',
'C2777436413-ASF',
'C1214471197-ASF',
'C1214471960-ASF',
'C1214472978-ASF',
'C1214473165-ASF',
'C2709161906-ASF',
'C1596065640-ASF',
'C1596065639-ASF',
'C1596065641-ASF',
'C1595765183-ASF',
'C2450786986-ASF',
'C1205428742-ASF',
'C1212201032-ASF',
'C1212212560-ASF',
'C1205264459-ASF',
'C1208117434-ASF',
'C1212209035-ASF',
'C1212209226-ASF',
'C1208115009-ASF',
'C1212158327-ASF',
'C1215704763-ASF',
'C1225776654-ASF',
'C1212158318-ASF',
'C1212212493-ASF',
'C1212158326-ASF',
'C1212233976-ASF',
'C1260726384-ASF',
'C1258354200-ASF',
'C1259982010-ASF',
'C2795135668-ASF',
'C1260721945-ASF',
'C1257995185-ASF',
'C1259976861-ASF',
'C1258354201-ASF',
'C1259981910-ASF',
'C2795135174-ASF',
'C1260721853-ASF',
'C1257995186-ASF',
'C1259974840-ASF',
'C1212200781-ASF',
'C1212209075-ASF',
'C1257024016-ASF',
'C1225776655-ASF',
'C1225776657-ASF',
'C1225776658-ASF',
'C1225776659-ASF',
'C1245953394-ASF',
'C1234413245-ASFDEV',
'C1234413229-ASFDEV',
'C1234413237-ASFDEV',
'C1234413238-ASFDEV',
'C1234413236-ASFDEV',
'C1234413230-ASFDEV',
'C1234413232-ASFDEV',
'C1234413235-ASFDEV',
'C1234413240-ASFDEV',
'C1234413234-ASFDEV',
'C1234413241-ASFDEV',
'C1234413233-ASFDEV',
'C1234413243-ASFDEV',
'C1234413244-ASFDEV',
'C1244552887-ASFDEV',
'C1234413228-ASFDEV',
'C1234413231-ASFDEV',
'C1234413239-ASFDEV',
'C1234413242-ASFDEV',
'C1257175154-ASFDEV',
'C1244598379-ASFDEV',
'C1240784657-ASFDEV',
],
'SENTINEL-1B': [
'C2803501758-ASF',
'C2803501097-ASF',
'C1327985661-ASF',
'C1327985645-ASF',
'C1595422627-ASF',
'C1327985617-ASF',
'C1327985660-ASF',
'C1327985741-ASF',
'C1327985578-ASF',
'C1327985646-ASF',
'C1327985650-ASF',
'C1327985579-ASF',
'C1327985740-ASF',
'C1327985619-ASF',
'C1327985739-ASF',
'C1327985647-ASF',
'C2777443834-ASF',
'C2777436413-ASF',
'C1327985697-ASF',
'C1327985651-ASF',
'C1327985644-ASF',
'C1327985571-ASF',
'C1327985674-ASF',
'C2709161906-ASF',
'C1596065640-ASF',
'C1596065639-ASF',
'C1596065641-ASF',
'C1595765183-ASF',
'C2450786986-ASF',
'C1216244348-ASF',
'C1216244589-ASF',
'C1216244594-ASF',
'C1216244593-ASF',
'C1216244585-ASF',
'C1216244592-ASF',
'C1216244595-ASF',
'C1225776654-ASF',
'C1216244590-ASF',
'C1216244601-ASF',
'C1216244600-ASF',
'C1216244591-ASF',
'C1216244587-ASF',
'C1216244598-ASF',
'C1216244586-ASF',
'C1260726384-ASF',
'C1258354200-ASF',
'C1259982010-ASF',
'C2795135668-ASF',
'C1260721945-ASF',
'C1257995185-ASF',
'C1259976861-ASF',
'C1258354201-ASF',
'C1259981910-ASF',
'C2795135174-ASF',
'C1260721853-ASF',
'C1257995186-ASF',
'C1259974840-ASF',
'C1216244597-ASF',
'C1216244596-ASF',
'C1216244588-ASF',
'C1216244599-ASF',
'C1257024016-ASF',
'C1225776655-ASF',
'C1225776657-ASF',
'C1225776658-ASF',
'C1225776659-ASF',
'C1245953394-ASF',
'C1234413263-ASFDEV',
'C1234413247-ASFDEV',
'C1234413248-ASFDEV',
'C1234413255-ASFDEV',
'C1234413254-ASFDEV',
'C1234413256-ASFDEV',
'C1234413253-ASFDEV',
'C1234413252-ASFDEV',
'C1234413250-ASFDEV',
'C1234413259-ASFDEV',
'C1234413251-ASFDEV',
'C1234413261-ASFDEV',
'C1234413262-ASFDEV',
'C1234413258-ASFDEV',
'C1244552887-ASFDEV',
'C1234413246-ASFDEV',
'C1234413249-ASFDEV',
'C1234413257-ASFDEV',
'C1234413260-ASFDEV',
'C1257175154-ASFDEV',
'C1244598379-ASFDEV',
],
'STS-59': [
'C1661710578-ASF',
'C1661710581-ASF',
'C1661710583-ASF',
'C1661710586-ASF',
'C1661710588-ASF',
'C1661710590-ASF',
'C1226557819-ASF',
'C1226557809-ASF',
'C1226557808-ASF',
'C1226557810-ASF',
'C1226557811-ASF',
'C1226557812-ASF',
'C1234413264-ASFDEV',
'C1234413265-ASFDEV',
'C1234413266-ASFDEV',
'C1234413267-ASFDEV',
'C1234413268-ASFDEV',
'C1234413269-ASFDEV',
],
'STS-68': [
'C1661710593-ASF',
'C1661710596-ASF',
'C1661710597-ASF',
'C1661710600-ASF',
'C1661710603-ASF',
'C1661710604-ASF',
'C1226557813-ASF',
'C1226557814-ASF',
'C1226557815-ASF',
'C1226557816-ASF',
'C1226557817-ASF',
'C1226557818-ASF',
'C1234413270-ASFDEV',
'C1234413271-ASFDEV',
'C1234413272-ASFDEV',
'C1234413273-ASFDEV',
'C1234413274-ASFDEV',
'C1234413275-ASFDEV',
],
'ALOS': [
'C1206487504-ASF',
'C1206487217-ASF',
'C1206485940-ASF',
'C1206156901-ASF',
'C1206485320-ASF',
'C1206485527-ASF',
'C1808440897-ASF',
'C2011599335-ASF',
'C1207181535-ASF',
'C1207710476-ASF',
'C1234712303-ASF',
'C1239927797-ASF',
'C1205261223-ASF',
'C1233629671-ASF',
'C1208013295-ASF',
'C1207019609-ASF',
'C1229740239-ASF',
'C1239611505-ASFDEV',
'C1238733834-ASFDEV',
'C1234413224-ASFDEV',
],
'ERS-1': [
'C1210197768-ASF',
'C1211627521-ASF',
'C1205261222-ASF',
'C1205302527-ASF',
],
'ERS-2': [
'C1208794942-ASF',
'C1209373626-ASF',
'C1207143701-ASF',
'C1207144966-ASF',
],
'JERS-1': [
'C1208662092-ASF',
'C1207933168-ASF',
'C1207175327-ASF',
'C1207177736-ASF',
],
'RADARSAT-1': [
'C1206897141-ASF',
'C1206936391-ASF',
'C1205181982-ASF',
'C1215670813-ASF',
'C1215709884-ASF',
'C1215709880-ASF',
'C1215714443-ASF',
'C1215775284-ASF',
'C1215614037-ASF',
],
'DC-8': [
'C1213921661-ASF',
'C1213921626-ASF',
'C1213928843-ASF',
'C1213927035-ASF',
'C179001730-ASF',
'C1213925022-ASF',
'C1213926419-ASF',
'C1213926777-ASF',
'C1213927939-ASF',
'C1213928209-ASF',
'C1205256880-ASF',
'C1208652494-ASF',
'C1000000306-ASF',
'C1208713702-ASF',
'C1208707768-ASF',
'C1208655639-ASF',
'C1208680681-ASF',
'C1208691361-ASF',
'C1208703384-ASF',
],
'SEASAT 1': [
'C1206500826-ASF',
'C1206500991-ASF',
'C1206752770-ASF',
'C1206144699-ASF',
],
'SMAP': [
'C1243122884-ASF',
'C1243124139-ASF',
'C1214473171-ASF',
'C1243149604-ASF',
'C1214473426-ASF',
'C1243119801-ASF',
'C1214473839-ASF',
'C1243133204-ASF',
'C1243197402-ASF',
'C1243215430-ASF',
'C1243124754-ASF',
'C1243141638-ASF',
'C1243162394-ASF',
'C1243168733-ASF',
'C1243168866-ASF',
'C1214473308-ASF',
'C1243253631-ASF',
'C1243133445-ASF',
'C1214473550-ASF',
'C1243197502-ASF',
'C1243126328-ASF',
'C1214474243-ASF',
'C1243216659-ASF',
'C1243129847-ASF',
'C1214473367-ASF',
'C1243268956-ASF',
'C1243144528-ASF',
'C1214473624-ASF',
'C1243228612-ASF',
'C1243136142-ASF',
'C1214474435-ASF',
'C1243255360-ASF',
'C1243140611-ASF',
'C1233103964-ASF',
'C1216074923-ASF',
'C1212243761-ASF',
'C1213091807-ASF',
'C1216074922-ASF',
'C1212243437-ASF',
'C1213096699-ASF',
'C1216074750-ASF',
'C1212249653-ASF',
'C1213101573-ASF',
'C1216074751-ASF',
'C1213136240-ASF',
'C1216074755-ASF',
'C1213136752-ASF',
'C1213136799-ASF',
'C1213136709-ASF',
'C1213136844-ASF',
'C1212249811-ASF',
'C1213125007-ASF',
'C1216074919-ASF',
'C1212196951-ASF',
'C1213115690-ASF',
'C1216074758-ASF',
'C1212243666-ASF',
'C1213115896-ASF',
'C1216074761-ASF',
'C1212250364-ASF',
'C1213134622-ASF',
'C1216074770-ASF',
'C1212246173-ASF',
'C1213125156-ASF',
'C1216074764-ASF',
'C1212249773-ASF',
'C1213134486-ASF',
'C1233101609-ASF',
],
'G-III': [
'C1214353986-ASF',
'C1214336045-ASF',
'C1214336717-ASF',
'C1214335430-ASF',
'C1214335471-ASF',
'C1214335903-ASF',
'C1214336154-ASF',
'C1214336554-ASF',
'C1214353593-ASF',
'C1214353754-ASF',
'C1214353859-ASF',
'C1214337770-ASF',
'C1214354144-ASF',
'C1214354235-ASF',
'C1214343609-ASF',
'C1214354031-ASF',
'C1214408428-ASF',
'C1214419355-ASF',
'C1210487703-ASF',
'C1212030772-ASF',
'C1206116665-ASF',
'C1206132445-ASF',
'C1211962154-ASF',
'C1212001698-ASF',
'C1212005594-ASF',
'C1212019993-ASF',
'C1207638502-ASF',
'C1210025872-ASF',
'C1210485039-ASF',
'C1207188317-ASF',
'C1210546638-ASF',
'C1206122195-ASF',
'C1209970710-ASF',
'C1207038647-ASF',
'C1210599503-ASF',
'C1210599673-ASF',
],
'NISAR': [
# UAT ASFDEV
'C1261815181-ASFDEV',
'C1261832381-ASFDEV',
'C1256533420-ASFDEV',
'C1261813453-ASFDEV',
'C1261832466-ASFDEV',
'C1256524081-ASFDEV',
'C1261815274-ASFDEV',
'C1261832497-ASFDEV',
'C1256358262-ASFDEV',
'C1261815276-ASFDEV',
'C1261832632-ASFDEV',
'C1256358463-ASFDEV',
'C1261813489-ASFDEV',
'C1261832868-ASFDEV',
'C1256363301-ASFDEV',
'C1261819086-ASFDEV',
'C1261832940-ASFDEV',
'C1256381769-ASFDEV',
'C1261819098-ASFDEV',
'C1261832990-ASFDEV',
'C1256420738-ASFDEV',
'C1261819110-ASFDEV',
'C1261832993-ASFDEV',
'C1256411631-ASFDEV',
'C1261819167-ASFDEV',
'C1261833024-ASFDEV',
'C1256413628-ASFDEV',
'C1261819168-ASFDEV',
'C1261833025-ASFDEV',
'C1256432264-ASFDEV',
'C1261819211-ASFDEV',
'C1261833026-ASFDEV',
'C1256477304-ASFDEV',
'C1261819233-ASFDEV',
'C1261833027-ASFDEV',
'C1256479237-ASFDEV',
'C1261819245-ASFDEV',
'C1261833050-ASFDEV',
'C1256568692-ASFDEV',
'C1262134528-ASFDEV',
# UAT
'C1261815288-ASF',
'C1261832657-ASF',
'C1257349121-ASF',
'C1261815147-ASF',
'C1261832658-ASF',
'C1257349120-ASF',
'C1261815289-ASF',
'C1261832659-ASF',
'C1257349115-ASF',
'C1261815301-ASF',
'C1261832671-ASF',
'C1257349114-ASF',
'C1261815148-ASF',
'C1261833052-ASF',
'C1257349109-ASF',
'C1261819120-ASF',
'C1261833063-ASF',
'C1257349108-ASF',
'C1261819121-ASF',
'C1261833064-ASF',
'C1257349107-ASF',
'C1261819145-ASF',
'C1261833076-ASF',
'C1257349103-ASF',
'C1261819258-ASF',
'C1261833127-ASF',
'C1257349102-ASF',
'C1261819270-ASF',
'C1261846741-ASF',
'C1257349096-ASF',
'C1261819275-ASF',
'C1261846880-ASF',
'C1257349095-ASF',
'C1261819281-ASF',
'C1261846994-ASF',
'C1257349094-ASF',
'C1261819282-ASF',
'C1261847095-ASF',
'C1257349093-ASF',
'C1262135006-ASF',
# PROD
'C2850220296-ASF',
'C2853068083-ASF',
'C2727902012-ASF',
'C2850223384-ASF',
'C2853086824-ASF',
'C2727901263-ASF',
'C2850224301-ASF',
'C2853089814-ASF',
'C2727901639-ASF',
'C2850225137-ASF',
'C2853091612-ASF',
'C2727901523-ASF',
'C2850225585-ASF',
'C2853145197-ASF',
'C2727900439-ASF',
'C2850234202-ASF',
'C2853147928-ASF',
'C2723110181-ASF',
'C2850235455-ASF',
'C2853153429-ASF',
'C2727900827-ASF',
'C2850237619-ASF',
'C2853156054-ASF',
'C2727900080-ASF',
'C2850259510-ASF',
'C2854332392-ASF',
'C2727896667-ASF',
'C2850261892-ASF',
'C2854335566-ASF',
'C2727897718-ASF',
'C2850262927-ASF',
'C2854338529-ASF',
'C2727896018-ASF',
'C2850263910-ASF',
'C2854341702-ASF',
'C2727896460-ASF',
'C2850265000-ASF',
'C2854344945-ASF',
'C2727894546-ASF',
'C2874824964-ASF',
],
}
collections_by_processing_level = {
'SLC': [
'C1214470488-ASF',
'C1205428742-ASF',
'C1234413245-ASFDEV',
'C1327985661-ASF',
'C1216244348-ASF',
'C1234413263-ASFDEV',
'C1661710588-ASF',
'C1661710590-ASF',
'C1226557811-ASF',
'C1226557812-ASF',
'C1661710603-ASF',
'C1661710604-ASF',
'C1226557817-ASF',
'C1226557818-ASF',
],
'GRD_HD': [
'C1214470533-ASF',
'C1212201032-ASF',
'C1234413229-ASFDEV',
'C1327985645-ASF',
'C1216244589-ASF',
],
'METADATA_GRD_HD': [
'C1214470576-ASF',
'C1212209226-ASF',
'C1234413232-ASFDEV',
'C1327985741-ASF',
'C1216244601-ASF',
],
'GUNW_STD': [
'C1595422627-ASF',
'C1225776654-ASF',
'C1595422627-ASF',
'C1225776654-ASF',
],
'METADATA_SLC': [
'C1214470496-ASF',
'C1208117434-ASF',
'C1234413236-ASFDEV',
'C1327985617-ASF',
'C1216244585-ASF',
'C1234413254-ASFDEV',
],
'METADATA_RAW': [
'C1214470532-ASF',
'C1208115009-ASF',
'C1234413235-ASFDEV',
'C1327985650-ASF',
'C1216244595-ASF',
],
'OCN': [
'C1214472977-ASF',
'C1212212560-ASF',
'C1234413237-ASFDEV',
'C1327985579-ASF',
'C1216244593-ASF',
'C1234413255-ASFDEV',
],
'METADATA_GRD_MD': [
'C1214472336-ASF',
'C1212212493-ASF',
'C1234413233-ASFDEV',
'C1327985578-ASF',
'C1216244591-ASF',
],
'METADATA_OCN': [
'C1266376001-ASF',
'C1215704763-ASF',
'C1234413234-ASFDEV',
'C1327985646-ASF',
'C1216244590-ASF',
'C1234413252-ASFDEV',
],
'GRD_MS': [
'C1214472994-ASF',
'C1212158318-ASF',
'C1327985740-ASF',
'C1216244600-ASF',
],
'METADATA_GRD_HS': [
'C1214470732-ASF',
'C1212158326-ASF',
'C1234413243-ASFDEV',
'C1327985619-ASF',
'C1216244587-ASF',
],
'METADATA_GRD_MS': [
'C1214473170-ASF',
'C1212233976-ASF',
'C1327985739-ASF',
'C1216244598-ASF',
],
'RAW': [
'C1214470561-ASF',
'C1205264459-ASF',
'C1234413238-ASFDEV',
'C1327985647-ASF',
'C1216244592-ASF',
'C1234413256-ASFDEV',
],
'GRD_MD': [
'C1214471521-ASF',
'C1212209035-ASF',
'C1234413230-ASFDEV',
'C1327985660-ASF',
'C1216244594-ASF',
],
'GRD_HS': [
'C1214470682-ASF',
'C1212158327-ASF',
'C1234413240-ASFDEV',
'C1327985571-ASF',
'C1216244586-ASF',
],
'CSLC': [
'C2777443834-ASF',
'C1260721945-ASF',
'C2803501758-ASF',
'C1259976861-ASF',
],
'RTC': [
'C2777436413-ASF',
'C1260721853-ASF',
'C2803501097-ASF',
'C1259974840-ASF',
],
'GRD_FD': ['C1214471197-ASF', 'C1212200781-ASF'],
'METADATA_GRD_FD': ['C1214471960-ASF', 'C1212209075-ASF'],
'BURST': [
'C2709161906-ASF',
'C1257024016-ASF',
'C1257175154-ASFDEV',
],
'GUNW_AMP': [
'C1596065640-ASF',
'C1225776655-ASF',
'C1596065640-ASF',
'C1225776655-ASF',
],
'GUNW_COH': [
'C1596065639-ASF',
'C1225776657-ASF',
'C1596065639-ASF',
'C1225776657-ASF',
],
'GUNW_CON': [
'C1596065641-ASF',
'C1225776658-ASF',
'C1596065641-ASF',
'C1225776658-ASF',
],
'GUNW_UNW': [
'C1595765183-ASF',
'C1225776659-ASF',
'C1595765183-ASF',
'C1225776659-ASF',
],
'CSLC-STATIC': ['C1259982010-ASF', 'C2795135668-ASF'],
'RTC-STATIC': ['C1259981910-ASF', 'C2795135174-ASF'],
'GRD': [
'C1661710583-ASF',
'C1661710586-ASF',
'C1226557808-ASF',
'C1226557810-ASF',
'C1661710597-ASF',
'C1661710600-ASF',
'C1226557815-ASF',
'C1226557816-ASF',
],
'RTC_HI_RES': ['C1206487504-ASF', 'C1207181535-ASF'],
'RTC_LOW_RES': ['C1206487217-ASF', 'C1208013295-ASF'],
'L1.5': ['C1206485940-ASF', 'C1205261223-ASF'],
'KMZ': [
'C1206156901-ASF',
'C1207019609-ASF',
'C1214336554-ASF',
'C1214353859-ASF',
'C1212019993-ASF',
'C1210485039-ASF',
],
'L1.0': ['C1206485320-ASF'],
'L1.1': ['C1206485527-ASF', 'C1207710476-ASF', 'C1239611505-ASFDEV'],
'L2.2': ['C2011599335-ASF', 'C1239927797-ASF', 'C1238733834-ASFDEV'],
'L0': [
'C1210197768-ASF',
'C1205261222-ASF',
'C1208794942-ASF',
'C1207143701-ASF',
'C1207933168-ASF',
'C1207175327-ASF',
'C1206897141-ASF',
],
'L1': [
'C1211627521-ASF',
'C1205302527-ASF',
'C1209373626-ASF',
'C1207144966-ASF',
'C1208662092-ASF',
'C1207177736-ASF',
'C1206936391-ASF',
'C1205181982-ASF',
'C1206500991-ASF',
'C1206144699-ASF',
],
'3FP': ['C1213921661-ASF', 'C1213928843-ASF', 'C1205256880-ASF', 'C1208713702-ASF'],
'JPG': ['C1213921626-ASF', 'C1000000306-ASF'],
'CSTOKES': ['C1213927035-ASF', 'C1208707768-ASF'],
'DEM': ['C179001730-ASF', 'C1208655639-ASF'],
'CTIF': ['C1213925022-ASF', 'C1208680681-ASF'],
'LTIF': ['C1213926419-ASF', 'C1208691361-ASF'],
'PTIF': ['C1213926777-ASF', 'C1208703384-ASF'],
'LSTOKES': ['C1213927939-ASF'],
'PSTOKES': ['C1213928209-ASF'],
'ATI': ['C1208652494-ASF'],
'GEOTIFF': ['C1206500826-ASF', 'C1206752770-ASF'],
'L1A_Radar_RO_ISO_XML': [
'C1243122884-ASF',
'C1243141638-ASF',
'C1243162394-ASF',
'C1233103964-ASF',
'C1213136752-ASF',
'C1213136799-ASF',
],
'L1A_Radar_RO_QA': [
'C1243124139-ASF',
'C1243168733-ASF',
'C1243168866-ASF',
'C1216074923-ASF',
'C1213136709-ASF',
'C1213136844-ASF',
],
'L1A_Radar_HDF5': [
'C1214473171-ASF',
'C1243149604-ASF',
'C1212243761-ASF',
'C1213091807-ASF',
],
'L1A_Radar_ISO_XML': [
'C1214473426-ASF',
'C1243119801-ASF',
'C1212243437-ASF',
'C1213096699-ASF',
],
'L1A_Radar_QA': [
'C1214473839-ASF',
'C1243133204-ASF',
'C1212249653-ASF',
'C1213101573-ASF',
],
'L1A_Radar_RO_HDF5': [
'C1243197402-ASF',
'C1243215430-ASF',
'C1243124754-ASF',
'C1213136240-ASF',
'C1216074755-ASF',
],
'L1B_S0_LoRes_HDF5': [
'C1214473308-ASF',
'C1243253631-ASF',
'C1243133445-ASF',
'C1212249811-ASF',
'C1213125007-ASF',
'C1216074919-ASF',
],
'L1B_S0_LoRes_ISO_XML': [
'C1214473550-ASF',
'C1243197502-ASF',
'C1243126328-ASF',
'C1212196951-ASF',
'C1213115690-ASF',
'C1216074758-ASF',
],
'L1B_S0_LoRes_QA': [
'C1214474243-ASF',
'C1243216659-ASF',
'C1243129847-ASF',
'C1212243666-ASF',
'C1213115896-ASF',
'C1216074761-ASF',
],
'L1C_S0_HiRes_HDF5': [
'C1214473367-ASF',
'C1243268956-ASF',
'C1243144528-ASF',
'C1212250364-ASF',
'C1213134622-ASF',
'C1216074770-ASF',
],
'L1C_S0_HiRes_ISO_XML': [
'C1214473624-ASF',
'C1243228612-ASF',
'C1243136142-ASF',
'C1212246173-ASF',
'C1213125156-ASF',
'C1216074764-ASF',
],
'L1C_S0_HiRes_QA': [
'C1214474435-ASF',
'C1243255360-ASF',
'C1243140611-ASF',
'C1212249773-ASF',
'C1213134486-ASF',
'C1233101609-ASF',
],
'METADATA': [
'C1214353986-ASF',
'C1214336717-ASF',
'C1210487703-ASF',
'C1212030772-ASF',
],
'INTERFEROMETRY': ['C1214336045-ASF', 'C1212001698-ASF'],
'AMPLITUDE': ['C1214335430-ASF', 'C1206116665-ASF'],
'AMPLITUDE_GRD': ['C1214335471-ASF', 'C1206132445-ASF'],
'DEM_TIFF': [
'C1214335903-ASF',
'C1214353593-ASF',
'C1211962154-ASF',
'C1207638502-ASF',
],
'INTERFEROMETRY_GRD': ['C1214336154-ASF', 'C1212005594-ASF'],
'INC': ['C1214353754-ASF', 'C1210025872-ASF'],
'PROJECTED': ['C1214337770-ASF', 'C1207188317-ASF'],
'PROJECTED_ML3X3': ['C1214354144-ASF', 'C1210546638-ASF'],
'PROJECTED_ML5X5': ['C1214354235-ASF', 'C1206122195-ASF'],
'COMPLEX': ['C1214343609-ASF', 'C1209970710-ASF'],
'PAULI': ['C1214354031-ASF', 'C1207038647-ASF'],
'SLOPE': ['C1214408428-ASF', 'C1210599503-ASF'],
'STOKES': ['C1214419355-ASF', 'C1210599673-ASF'],
}
# Helper Methods
def get_concept_id_alias(param_list: List[str], collections_dict: dict) -> List[str]:
"""
param: param_list (List[str]): list of search values to alias
param: collections_dict (dict): The search value to concept-id dictionary to read from
returns List[str]: Returns a list of concept-ids
that correspond to the given list of search values.
If any of the search values are not keys in the collections_dict,
this will instead return an empty list.
"""
concept_id_aliases = []
for param in param_list:
if alias := collections_dict.get(param):
concept_id_aliases.extend(alias)
else:
return []
return concept_id_aliases
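The all-or-nothing behavior described in the docstring can be illustrated standalone (the concept-ids below are made up for the demo):

```python
def get_concept_id_alias(param_list, collections_dict):
    # collect concept-ids for every search value; bail out entirely on any miss
    concept_id_aliases = []
    for param in param_list:
        if alias := collections_dict.get(param):
            concept_id_aliases.extend(alias)
        else:
            return []
    return concept_id_aliases


demo = {'GRD': ['C1-DEMO', 'C2-DEMO'], 'SLC': ['C3-DEMO']}
print(get_concept_id_alias(['GRD', 'SLC'], demo))  # → ['C1-DEMO', 'C2-DEMO', 'C3-DEMO']
print(get_concept_id_alias(['GRD', 'OCN'], demo))  # → []
```

A single unknown key ('OCN') discards the aliases already gathered, signalling the caller to fall back to the un-aliased keyword.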
def get_dataset_concept_ids(datasets: List[str]) -> List[str]:
"""
Returns concept-ids for provided dataset(s)
If an invalid dataset is provided, a ValueError is raised
:param `datasets` (`List[str]`): a list of datasets to grab concept-ids for
:returns `List[str]`: the list of concept-ids associated with the given datasets
"""
output = []
for dataset in datasets:
if collections_by_short_name := dataset_collections.get(dataset):
for concept_ids in collections_by_short_name.values():
output.extend(concept_ids)
else:
raise ValueError(
f'Could not find dataset named "{dataset}" provided for dataset keyword.'
)
return output
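Unlike the alias helper, an unknown dataset raises rather than returning an empty list. A minimal sketch with a made-up dataset dictionary:

```python
def get_dataset_concept_ids(datasets, dataset_collections):
    # flatten every collection list found under each requested dataset
    output = []
    for dataset in datasets:
        if collections_by_short_name := dataset_collections.get(dataset):
            for concept_ids in collections_by_short_name.values():
                output.extend(concept_ids)
        else:
            raise ValueError(f'Could not find dataset named "{dataset}" provided for dataset keyword.')
    return output


demo = {'DEMO-SAT': {'SLC': ['C1-DEMO'], 'GRD': ['C2-DEMO', 'C3-DEMO']}}
print(get_dataset_concept_ids(['DEMO-SAT'], demo))  # → ['C1-DEMO', 'C2-DEMO', 'C3-DEMO']
```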
# ---- Discovery-asf_search-8.1.2/asf_search/CMR/field_map.py ----
field_map = {
# API parameter CMR keyword CMR format strings
'absoluteOrbit': {'key': 'orbit_number', 'fmt': '{0}'},
'asfFrame': {'key': 'attribute[]', 'fmt': 'int,FRAME_NUMBER,{0}'},
'maxBaselinePerp': {'key': 'attribute[]', 'fmt': 'float,INSAR_BASELINE,,{0}'},
'minBaselinePerp': {'key': 'attribute[]', 'fmt': 'float,INSAR_BASELINE,{0},'},
'bbox': {'key': 'bounding_box', 'fmt': '{0}'},
'beamMode': {'key': 'attribute[]', 'fmt': 'string,BEAM_MODE,{0}'},
'beamSwath': {'key': 'attribute[]', 'fmt': 'string,BEAM_MODE_TYPE,{0}'},
'campaign': {'key': 'attribute[]', 'fmt': 'string,MISSION_NAME,{0}'},
'circle': {'key': 'circle', 'fmt': '{0}'},
'maxDoppler': {'key': 'attribute[]', 'fmt': 'float,DOPPLER,,{0}'},
'minDoppler': {'key': 'attribute[]', 'fmt': 'float,DOPPLER,{0},'},
'maxFaradayRotation': {'key': 'attribute[]', 'fmt': 'float,FARADAY_ROTATION,,{0}'}, # noqa F401
'minFaradayRotation': {'key': 'attribute[]', 'fmt': 'float,FARADAY_ROTATION,{0},'}, # noqa F401
'flightDirection': {'key': 'attribute[]', 'fmt': 'string,ASCENDING_DESCENDING,{0}'}, # noqa F401
'flightLine': {'key': 'attribute[]', 'fmt': 'string,FLIGHT_LINE,{0}'},
'frame': {'key': 'attribute[]', 'fmt': 'int,CENTER_ESA_FRAME,{0}'},
'granule_list': {'key': 'readable_granule_name[]', 'fmt': '{0}'},
'groupID': {'key': 'attribute[]', 'fmt': 'string,GROUP_ID,{0}'},
'insarStackId': {'key': 'attribute[]', 'fmt': 'int,INSAR_STACK_ID,{0}'},
'linestring': {'key': 'line', 'fmt': '{0}'},
'lookDirection': {'key': 'attribute[]', 'fmt': 'string,LOOK_DIRECTION,{0}'},
'maxInsarStackSize': {'key': 'attribute[]', 'fmt': 'int,INSAR_STACK_SIZE,,{0}'},
'minInsarStackSize': {'key': 'attribute[]', 'fmt': 'int,INSAR_STACK_SIZE,{0},'},
'instrument': {'key': 'instrument[]', 'fmt': '{0}'},
'offNadirAngle': {'key': 'attribute[]', 'fmt': 'float,OFF_NADIR_ANGLE,{0}'},
'platform': {'key': 'platform[]', 'fmt': '{0}'},
'polarization': {'key': 'attribute[]', 'fmt': 'string,POLARIZATION,{0}'},
'point': {'key': 'point', 'fmt': '{0}'},
'polygon': {'key': 'polygon', 'fmt': '{0}'},
'processingDate': {'key': 'updated_since', 'fmt': '{0}'},
'processingLevel': {'key': 'attribute[]', 'fmt': 'string,PROCESSING_TYPE,{0}'},
'product_list': {'key': 'granule_ur[]', 'fmt': '{0}'},
'provider': {'key': 'provider', 'fmt': '{0}'},
'relativeOrbit': {'key': 'attribute[]', 'fmt': 'int,PATH_NUMBER,{0}'},
'temporal': {'key': 'temporal', 'fmt': '{0}'},
'collections': {'key': 'echo_collection_id[]', 'fmt': '{0}'},
'shortName': {'key': 'shortName', 'fmt': '{0}'},
'temporalBaselineDays': {'key': 'attribute[]', 'fmt': 'int,TEMPORAL_BASELINE_DAYS,{0}'}, # noqa F401
# SLC BURST fields
'absoluteBurstID': {'key': 'attribute[]', 'fmt': 'int,BURST_ID_ABSOLUTE,{0}'},
'relativeBurstID': {'key': 'attribute[]', 'fmt': 'int,BURST_ID_RELATIVE,{0}'},
'fullBurstID': {'key': 'attribute[]', 'fmt': 'string,BURST_ID_FULL,{0}'},
# OPERA-S1 field
'operaBurstID': {'key': 'attribute[]', 'fmt': 'string,OPERA_BURST_ID,{0}'},
# NISAR fields
'mainBandPolarization': {'key': 'attribute[]', 'fmt': 'string,FREQUENCY_A_POLARIZATION_CONCAT,{0}'},
'sideBandPolarization': {'key': 'attribute[]', 'fmt': 'string,FREQUENCY_B_POLARIZATION_CONCAT,{0}'},
'frameCoverage': {'key': 'attribute[]', 'fmt': 'string,FULL_FRAME,{0}'},
'jointObservation': {'key': 'attribute[]', 'fmt': 'string,JOINT_OBSERVATION,{0}'},
'rangeBandwidth': {'key': 'attribute[]', 'fmt': 'string,RANGE_BANDWIDTH_CONCAT,{0}'},
}
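Each field_map entry pairs a CMR query key with a format string, so translating one API parameter is a single `str.format` call. A sketch using the `asfFrame` entry shown above:

```python
field_map = {'asfFrame': {'key': 'attribute[]', 'fmt': 'int,FRAME_NUMBER,{0}'}}

param, value = 'asfFrame', 125
entry = field_map[param]
cmr_key, cmr_val = entry['key'], entry['fmt'].format(value)
print(cmr_key, cmr_val)  # → attribute[] int,FRAME_NUMBER,125
```

Range-style format strings (e.g. `'float,DOPPLER,,{0}'`) work the same way; the empty slot between commas marks the open end of the range.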
# ---- Discovery-asf_search-8.1.2/asf_search/CMR/subquery.py ----
from typing import List, Tuple
import itertools
from copy import copy
from asf_search.ASFSearchOptions import ASFSearchOptions
from asf_search.constants import CMR_PAGE_SIZE
from asf_search.CMR.datasets import (
collections_by_processing_level,
collections_per_platform,
get_concept_id_alias,
get_dataset_concept_ids,
)
from numpy import intersect1d, union1d
def build_subqueries(opts: ASFSearchOptions) -> List[ASFSearchOptions]:
"""
Build a list of sub-queries using the cartesian product
of all the list parameters described by opts
:param opts: The search options to split into sub-queries
:return list: A list of ASFSearchOptions objects
"""
params = dict(opts)
# Break out two big list offenders into manageable chunks
for chunked_key in ['granule_list', 'product_list']:
if params.get(chunked_key) is not None:
params[chunked_key] = chunk_list(params[chunked_key], CMR_PAGE_SIZE)
list_param_names = [
'platform',
'season',
'collections',
'dataset',
'cmr_keywords',
'shortName',
'circle',
'linestring',
'point',
] # these parameters will dodge the subquery system
skip_param_names = [
'maxResults',
] # these params exist in opts, but shouldn't be passed on to subqueries at ALL
collections, aliased_keywords = get_keyword_concept_ids(params, opts.collectionAlias)
params['collections'] = list(union1d(collections, params.get('collections', [])))
for keyword in [*skip_param_names, *aliased_keywords]:
params.pop(keyword, None)
subquery_params, list_params = {}, {}
for key, value in params.items():
if key in list_param_names:
list_params[key] = value
else:
subquery_params[key] = value
sub_queries = cartesian_product(subquery_params)
return [_build_subquery(query, opts, list_params) for query in sub_queries]
def _build_subquery(
query: List[Tuple[dict]], opts: ASFSearchOptions, list_params: dict
) -> ASFSearchOptions:
"""
Composes query dict and list params into new ASFSearchOptions object
param: query: the cartesian search query options
param: opts: the search options to pull config options from (provider, host, session)
param: list_params: the subquery parameters
"""
q = dict()
for p in query:
q.update(p)
q['provider'] = opts.provider
q['host'] = opts.host
q['session'] = copy(opts.session)
return ASFSearchOptions(**q, **list_params)
def get_keyword_concept_ids(params: dict, use_collection_alias: bool = True) -> Tuple[List[str], List[str]]:
"""
Gets concept-ids for dataset, platform, processingLevel keywords
processingLevel is scoped by dataset or platform concept-ids when available
: param params:
search parameter dictionary pre-CMR translation
: param use_collection_alias:
whether or not to alias platform and processingLevel with concept-ids
: returns two lists:
- list of concept-ids for dataset, platform, and processingLevel
- list of aliased keywords to remove from final parameters
"""
collections = []
aliased_keywords = []
if use_collection_alias:
if 'processingLevel' in params.keys():
collections = get_concept_id_alias(
params.get('processingLevel'), collections_by_processing_level
)
if len(collections):
aliased_keywords.append('processingLevel')
if 'platform' in params.keys():
platform_concept_ids = get_concept_id_alias(
[platform.upper() for platform in params.get('platform')],
collections_per_platform,
)
if len(platform_concept_ids):
aliased_keywords.append('platform')
collections = _get_intersection(platform_concept_ids, collections)
if 'dataset' in params.keys():
aliased_keywords.append('dataset')
dataset_concept_ids = get_dataset_concept_ids(params.get('dataset'))
collections = _get_intersection(dataset_concept_ids, collections)
return collections, aliased_keywords
def _get_intersection(keyword_concept_ids: List[str], intersecting_ids: List[str]) -> List[str]:
"""
Returns the intersection between two lists. If the second list is empty the first list
is returned unchanged
"""
if len(intersecting_ids):
return list(intersect1d(intersecting_ids, keyword_concept_ids))
return keyword_concept_ids
def chunk_list(source: List, n: int) -> List:
"""
Breaks a longer list into a list of lists, each of length n
:param source: The list to be broken into chunks
:param n: The maximum length of each chunk
:return List[List, ...]:
"""
return [source[i * n : (i + 1) * n] for i in range((len(source) + n - 1) // n)]
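The slicing arithmetic above is easiest to verify with a list that doesn't divide evenly:

```python
def chunk_list(source, n):
    # ceil(len/n) consecutive slices of at most n items each
    return [source[i * n:(i + 1) * n] for i in range((len(source) + n - 1) // n)]


print(chunk_list(list(range(7)), 3))  # → [[0, 1, 2], [3, 4, 5], [6]]
```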
def cartesian_product(params):
formatted_params = format_query_params(params)
p = list(itertools.product(*formatted_params))
return p
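Because each parameter is first expanded into a list of single-key dicts, `itertools.product` yields one tuple of dicts per subquery. A standalone sketch of that expansion (parameter values here are illustrative):

```python
import itertools

params = {'platform': ['SENTINEL-1A', 'SENTINEL-1B'], 'processingLevel': ['SLC']}
# each parameter becomes a list of single-key dicts
listed = [[{name: val} for val in vals] for name, vals in params.items()]
subqueries = list(itertools.product(*listed))
print(subqueries)
# → [({'platform': 'SENTINEL-1A'}, {'processingLevel': 'SLC'}),
#    ({'platform': 'SENTINEL-1B'}, {'processingLevel': 'SLC'})]
```

Two platforms crossed with one processing level produce two subqueries; `_build_subquery` then merges each tuple's dicts into one ASFSearchOptions.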
def format_query_params(params) -> List[List[dict]]:
listed_params = []
for param_name, param_val in params.items():
plist = translate_param(param_name, param_val)
listed_params.append(plist)
return listed_params
def translate_param(param_name, param_val) -> List[dict]:
param_list = []
if not isinstance(param_val, list):
param_val = [param_val]
for unformatted_val in param_val:
formatted_val = unformatted_val
if isinstance(unformatted_val, list):
formatted_val = ','.join([f'{t}' for t in unformatted_val])
param_list.append({param_name: formatted_val})
return param_list
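The helper above wraps scalars and collapses nested lists (ranges) into comma-joined strings; shown standalone:

```python
def translate_param(param_name, param_val):
    # scalars are wrapped in a list; nested lists become comma-joined strings
    if not isinstance(param_val, list):
        param_val = [param_val]
    param_list = []
    for unformatted_val in param_val:
        formatted_val = unformatted_val
        if isinstance(unformatted_val, list):
            formatted_val = ','.join(f'{t}' for t in unformatted_val)
        param_list.append({param_name: formatted_val})
    return param_list


print(translate_param('absoluteOrbit', [[100, 200], 300]))
# → [{'absoluteOrbit': '100,200'}, {'absoluteOrbit': 300}]
```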
# ---- Discovery-asf_search-8.1.2/asf_search/CMR/translate.py ----
from datetime import datetime, timezone
from typing import Any, Dict, List, Optional
from asf_search.ASFSearchOptions import ASFSearchOptions
from asf_search.CMR.datasets import get_concept_id_alias
from asf_search.constants import CMR_PAGE_SIZE
import re
from shapely import wkt
from shapely.geometry import Polygon
from shapely.geometry.base import BaseGeometry
from .field_map import field_map
from .datasets import collections_per_platform
import logging
try:
from ciso8601 import parse_datetime
except ImportError:
from dateutil.parser import parse as parse_datetime
def translate_opts(opts: ASFSearchOptions) -> List:
# Need to add params which ASFSearchOptions cant support (like temporal),
# so use a dict to avoid the validate_params logic:
dict_opts = dict(opts)
# Escape commas for each key in the list.
# intersectsWith, temporal, and other keys should not be escaped, so keep a whitelist of keys to escape instead
for escape_commas in ['campaign']:
if escape_commas in dict_opts:
dict_opts[escape_commas] = dict_opts[escape_commas].replace(',', '\\,')
dict_opts = fix_cmr_shapes(dict_opts)
# Additional Attribute FULL_FRAME stored as a TRUE/FALSE string
if 'frameCoverage' in dict_opts:
dict_opts['frameCoverage'] = {
'F': 'TRUE',
'P': 'FALSE',
}[dict_opts['frameCoverage'][0].upper()]
if 'jointObservation' in dict_opts:
dict_opts['jointObservation'] = str(dict_opts['jointObservation']).upper()
# Special case to unravel WKT field a little for compatibility
if 'intersectsWith' in dict_opts:
shape = wkt.loads(dict_opts.pop('intersectsWith', None))
# If a wide rectangle is provided, make sure to use the bounding box
# instead of the wkt for better responses from CMR
# This will provide better results with AOI's near poles
if should_use_bbox(shape):
bounds = shape.boundary.bounds
if bounds[0] > 180 or bounds[2] > 180:
bounds = [
(x + 180) % 360 - 180 if idx % 2 == 0 and abs(x) > 180 else x
for idx, x in enumerate(bounds)
]
bottom_left = [str(coord) for coord in bounds[:2]]
top_right = [str(coord) for coord in bounds[2:]]
bbox = ','.join([*bottom_left, *top_right])
dict_opts['bbox'] = bbox
else:
(shapeType, shape) = wkt_to_cmr_shape(shape).split(':')
dict_opts[shapeType] = shape
# If you need to use the temporal key:
if any(key in dict_opts for key in ['start', 'end', 'season']):
dict_opts = fix_date(dict_opts)
dict_opts = fix_range_params(dict_opts)
# convert the above parameters to a list of key/value tuples
cmr_opts = []
# user provided umm fields
custom_cmr_keywords = dict_opts.pop('cmr_keywords', [])
for key, val in dict_opts.items():
# If it's "session" or something else CMR doesn't accept, don't send it:
if key not in field_map:
continue
if isinstance(val, list):
for x in val:
if key in ['granule_list', 'product_list']:
for y in x.split(','):
cmr_opts.append((key, y))
else:
if isinstance(x, tuple):
cmr_opts.append((key, ','.join([str(t) for t in x])))
else:
cmr_opts.append((key, x))
else:
cmr_opts.append((key, val))
# translate the above tuples to CMR key/values
for i, opt in enumerate(cmr_opts):
cmr_opts[i] = field_map[opt[0]]['key'], field_map[opt[0]]['fmt'].format(opt[1])
if should_use_asf_frame(cmr_opts):
cmr_opts = use_asf_frame(cmr_opts)
cmr_opts.extend(custom_cmr_keywords)
additional_keys = [
('page_size', CMR_PAGE_SIZE),
('options[temporal][and]', 'true'),
('sort_key[]', '-end_date'),
('sort_key[]', 'granule_ur'),
('options[platform][ignore_case]', 'true'),
('provider', opts.provider),
]
cmr_opts.extend(additional_keys)
return cmr_opts
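The longitude normalization applied to wide bounding boxes in `translate_opts` can be demonstrated in isolation. Even indices of the bounds tuple are longitudes; anything outside ±180 is wrapped back into [-180, 180]:

```python
bounds = [190.0, 10.0, 220.0, 20.0]  # min_lon, min_lat, max_lon, max_lat
wrapped = [
    (x + 180) % 360 - 180 if idx % 2 == 0 and abs(x) > 180 else x
    for idx, x in enumerate(bounds)
]
print(wrapped)  # → [-170.0, 10.0, -140.0, 20.0]
```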
def fix_cmr_shapes(fixed_params: Dict[str, Any]) -> Dict[str, Any]:
"""Fixes raw CMR lon lat coord shapes"""
for param in ['point', 'linestring', 'circle']:
if param in fixed_params:
fixed_params[param] = ','.join(map(str, fixed_params[param]))
return fixed_params
def should_use_asf_frame(cmr_opts):
asf_frame_platforms = ['SENTINEL-1A', 'SENTINEL-1B', 'ALOS']
asf_frame_collections = get_concept_id_alias(asf_frame_platforms, collections_per_platform)
return any(
[
p[0] == 'platform[]'
and p[1].upper() in asf_frame_platforms
or p[0] == 'echo_collection_id[]'
and p[1] in asf_frame_collections
for p in cmr_opts
]
)
def use_asf_frame(cmr_opts):
"""
Sentinel/ALOS: always use asf frame instead of esa frame
Platform-specific hack
We do them at the subquery level in case the main query crosses
platforms that don't suffer this issue.
"""
for n, p in enumerate(cmr_opts):
if not isinstance(p[1], str):
continue
m = re.search(r'CENTER_ESA_FRAME', p[1])
if m is None:
continue
logging.debug('Sentinel/ALOS subquery, using ASF frame instead of ESA frame')
cmr_opts[n] = (p[0], p[1].replace(',CENTER_ESA_FRAME,', ',FRAME_NUMBER,'))
return cmr_opts
# some products don't have integer values in BYTES fields, round to nearest int
def try_round_float(value: str) -> Optional[int]:
if value is None:
return None
value = float(value)
return round(value)
def try_parse_int(value: str) -> Optional[int]:
if value is None:
return None
return int(value)
def try_parse_float(value: str) -> Optional[float]:
if value is None:
return None
return float(value)
def try_parse_bool(val: str) -> Optional[bool]:
"""Boolean values are stored as strings in umm json"""
if val is None:
return None
return val.lower() == 'true'
def try_parse_frame_coverage(val: str) -> Optional[str]:
"""Frame Coverage is stored as a string boolean in FULL_FRAME, convert it to Partial/Full"""
if val is not None:
if val.lower() == 'true':
val = 'Full'
else:
val = 'Partial'
return val
def try_parse_date(value: str) -> Optional[str]:
if value is None:
return None
try:
date = parse_datetime(value)
except ValueError:
return None
if date is None:
return value
if date.tzinfo is None:
date = date.replace(tzinfo=timezone.utc)
# Turn all inputs into a consistent format:
return date.strftime('%Y-%m-%dT%H:%M:%SZ')
def fix_date(fixed_params: Dict[str, Any]):
if 'start' in fixed_params or 'end' in fixed_params or 'season' in fixed_params:
fixed_params['start'] = (
fixed_params['start'] if 'start' in fixed_params else '1978-01-01T00:00:00Z'
)
fixed_params['end'] = (
fixed_params['end'] if 'end' in fixed_params else datetime.now(timezone.utc).isoformat()
)
fixed_params['season'] = (
','.join(str(x) for x in fixed_params['season']) if 'season' in fixed_params else ''
)
fixed_params['temporal'] = (
f'{fixed_params["start"]},{fixed_params["end"]},{fixed_params["season"]}'
)
# And a little cleanup
fixed_params.pop('start', None)
fixed_params.pop('end', None)
fixed_params.pop('season', None)
return fixed_params
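The collapse of start/end/season into CMR's single `temporal` value can be sketched standalone (note the trailing comma when no season is given):

```python
from datetime import datetime, timezone


def fix_date(fixed_params):
    # merge start/end/season into the one 'temporal' parameter CMR expects
    if any(k in fixed_params for k in ('start', 'end', 'season')):
        start = fixed_params.pop('start', '1978-01-01T00:00:00Z')
        end = fixed_params.pop('end', datetime.now(timezone.utc).isoformat())
        season = ','.join(str(x) for x in fixed_params.pop('season', []))
        fixed_params['temporal'] = f'{start},{end},{season}'
    return fixed_params


print(fix_date({'start': '2020-01-01T00:00:00Z', 'end': '2020-02-01T00:00:00Z'}))
# → {'temporal': '2020-01-01T00:00:00Z,2020-02-01T00:00:00Z,'}
```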
def fix_range_params(fixed_params: Dict[str, Any]) -> Dict[str, Any]:
"""Converts ranges to comma separated strings"""
for param in [
'offNadirAngle',
'relativeOrbit',
'absoluteOrbit',
'frame',
'asfFrame',
]:
if param in fixed_params.keys() and isinstance(fixed_params[param], list):
fixed_params[param] = ','.join([str(val) for val in fixed_params[param]])
return fixed_params
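Shown standalone, a two-element range becomes the comma-separated form CMR expects, while scalar values pass through untouched:

```python
def fix_range_params(fixed_params):
    # ranges like [10, 20] become the '10,20' form CMR expects
    for param in ['offNadirAngle', 'relativeOrbit', 'absoluteOrbit', 'frame', 'asfFrame']:
        if isinstance(fixed_params.get(param), list):
            fixed_params[param] = ','.join(str(val) for val in fixed_params[param])
    return fixed_params


print(fix_range_params({'relativeOrbit': [10, 20], 'frame': 300}))
# → {'relativeOrbit': '10,20', 'frame': 300}
```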
def should_use_bbox(shape: BaseGeometry):
"""
If the passed shape is a polygon, and if that polygon
is equivalent to its bounding box (i.e. it's a rectangle),
we should use the bounding box to search instead
"""
if isinstance(shape, Polygon):
coords = [
[shape.bounds[0], shape.bounds[1]],
[shape.bounds[2], shape.bounds[1]],
[shape.bounds[2], shape.bounds[3]],
[shape.bounds[0], shape.bounds[3]],
]
return shape.equals(Polygon(shell=coords))
return False
def wkt_to_cmr_shape(shape: BaseGeometry):
# take note of the WKT type
if shape.geom_type not in ['Point', 'LineString', 'Polygon']:
raise ValueError('Unsupported WKT: {0}.'.format(shape.wkt))
if shape.geom_type == 'Polygon':
coords = shape.exterior.coords
else: # type == Point | Linestring
coords = shape.coords
# Turn [[x,y],[x,y]] into [x,y,x,y]:
lon_lat_sequence = []
for lon_lat in coords:
lon_lat_sequence.extend(lon_lat)
# Turn any "6e8" into a literal number. (As a string):
coords = ['{:.16f}'.format(float(coord)) for coord in lon_lat_sequence]
return '{0}:{1}'.format(shape.geom_type.lower(), ','.join(coords))
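The flatten-and-format step at the end of `wkt_to_cmr_shape` can be exercised on its own; fixed-point formatting keeps values that would otherwise print in scientific notation (e.g. `1e2`) literal:

```python
coords = [(30.0, 1e2), (31.0, 101.0)]  # lon/lat pairs
# Turn [[x,y],[x,y]] into [x,y,x,y]:
lon_lat_sequence = []
for lon_lat in coords:
    lon_lat_sequence.extend(lon_lat)
formatted = ['{:.16f}'.format(float(c)) for c in lon_lat_sequence]
print(','.join(formatted))
# → 30.0000000000000000,100.0000000000000000,31.0000000000000000,101.0000000000000000
```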
# ---- Discovery-asf_search-8.1.2/asf_search/Products/AIRSARProduct.py ----
from typing import Dict
from asf_search import ASFSession, ASFProduct
from asf_search.CMR.translate import try_parse_int
class AIRSARProduct(ASFProduct):
"""
ASF Dataset Overview Page: https://asf.alaska.edu/data-sets/sar-data-sets/airsar/
"""
_base_properties = {
**ASFProduct._base_properties,
'frameNumber': {
'path': ['AdditionalAttributes', ('Name', 'CENTER_ESA_FRAME'), 'Values', 0],
'cast': try_parse_int,
},
'groupID': {'path': ['AdditionalAttributes', ('Name', 'GROUP_ID'), 'Values', 0]},
'insarStackId': {'path': ['AdditionalAttributes', ('Name', 'INSAR_STACK_ID'), 'Values', 0]},
'md5sum': {'path': ['AdditionalAttributes', ('Name', 'MD5SUM'), 'Values', 0]},
}
def __init__(self, args: Dict = {}, session: ASFSession = ASFSession()):
super().__init__(args, session)
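The `**ASFProduct._base_properties` spread used above is plain dict merging: the subclass extends or overrides keys without mutating the parent's map. A toy illustration of the pattern (property paths abbreviated):

```python
parent_props = {'md5sum': {'path': ['AdditionalAttributes', ('Name', 'MD5SUM'), 'Values', 0]}}
child_props = {
    **parent_props,  # copy the parent's entries first
    'groupID': {'path': ['AdditionalAttributes', ('Name', 'GROUP_ID'), 'Values', 0]},
}
print(sorted(child_props))   # → ['groupID', 'md5sum']
print(list(parent_props))    # parent map is unchanged
```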
# ---- Discovery-asf_search-8.1.2/asf_search/Products/ALOSProduct.py ----
from typing import Dict, Union
from asf_search import ASFSession, ASFStackableProduct
from asf_search.CMR.translate import try_parse_float, try_parse_int, try_round_float
from asf_search.constants import PRODUCT_TYPE
class ALOSProduct(ASFStackableProduct):
"""
Used for ALOS PALSAR and AVNIR-2 dataset products
ASF Dataset Documentation Page: https://asf.alaska.edu/datasets/daac/alos-palsar/
"""
_base_properties = {
**ASFStackableProduct._base_properties,
'frameNumber': {
'path': ['AdditionalAttributes', ('Name', 'FRAME_NUMBER'), 'Values', 0],
'cast': try_parse_int,
},
'faradayRotation': {
'path': ['AdditionalAttributes', ('Name', 'FARADAY_ROTATION'), 'Values', 0],
'cast': try_parse_float,
},
'offNadirAngle': {
'path': ['AdditionalAttributes', ('Name', 'OFF_NADIR_ANGLE'), 'Values', 0],
'cast': try_parse_float,
},
'bytes': {
'path': ['AdditionalAttributes', ('Name', 'BYTES'), 'Values', 0],
'cast': try_round_float,
},
'insarStackId': {'path': ['AdditionalAttributes', ('Name', 'INSAR_STACK_ID'), 'Values', 0]},
'beamModeType': {'path': ['AdditionalAttributes', ('Name', 'BEAM_MODE_TYPE'), 'Values', 0]},
}
def __init__(self, args: Dict = {}, session: ASFSession = ASFSession()):
super().__init__(args, session)
if self.properties.get('groupID') is None:
self.properties['groupID'] = self.properties['sceneName']
@staticmethod
def get_default_baseline_product_type() -> Union[str, None]:
"""
Returns the product type to search for when building a baseline stack.
"""
return PRODUCT_TYPE.L1_1
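The `('Name', 'FRAME_NUMBER')` tuples in the property paths above select an entry from the UMM `AdditionalAttributes` array by its `Name`. A simplified, standalone version of that lookup (the real traversal lives in `ASFProduct.umm_get`):

```python
def read_additional_attribute(umm, name):
    # scan AdditionalAttributes for the entry whose Name matches,
    # then return its first value (UMM stores values as string lists)
    for attr in umm.get('AdditionalAttributes', []):
        if attr.get('Name') == name:
            values = attr.get('Values', [])
            return values[0] if values else None
    return None


umm = {'AdditionalAttributes': [{'Name': 'FARADAY_ROTATION', 'Values': ['3.1']}]}
print(read_additional_attribute(umm, 'FARADAY_ROTATION'))  # → 3.1
print(read_additional_attribute(umm, 'MISSING'))           # → None
```

The `cast` callables (`try_parse_float`, etc.) are then applied to the raw string values such a lookup returns.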
# ---- Discovery-asf_search-8.1.2/asf_search/Products/ARIAS1GUNWProduct.py ----
from typing import Dict
from asf_search import ASFSession
from asf_search.ASFProduct import ASFProduct
from asf_search.ASFSearchOptions import ASFSearchOptions
from asf_search.Products import S1Product
from asf_search.CMR.translate import try_parse_float
class ARIAS1GUNWProduct(S1Product):
"""
Used for ARIA S1 GUNW Products
ASF Dataset Documentation Page:
https://asf.alaska.edu/data-sets/derived-data-sets/sentinel-1-interferograms/
"""
_base_properties = {
**S1Product._base_properties,
'perpendicularBaseline': {
'path': [
'AdditionalAttributes',
('Name', 'PERPENDICULAR_BASELINE'),
'Values',
0,
],
'cast': try_parse_float,
},
'orbit': {'path': ['OrbitCalculatedSpatialDomains']},
'inputGranules': {'path': ['InputGranules']},
'ariaVersion': {'path': ['AdditionalAttributes', ('Name', 'VERSION'), 'Values', 0]},
}
def __init__(self, args: Dict = {}, session: ASFSession = ASFSession()):
super().__init__(args, session)
self.properties['orbit'] = [orbit['OrbitNumber'] for orbit in self.properties['orbit']]
urls = self.umm_get(self.umm, 'RelatedUrls', ('Type', [('USE SERVICE API', 'URL')]), 0)
self.properties['additionalUrls'] = []
if urls is not None:
self.properties['url'] = urls[0]
self.properties['fileName'] = self.properties['fileID'] + '.' + urls[0].split('.')[-1]
self.properties['additionalUrls'] = urls[1:]
def get_stack_opts(self, opts: ASFSearchOptions = None) -> ASFSearchOptions:
"""
Build search options that can be used to find an insar stack for this product
:return: ASFSearchOptions describing appropriate options
for building a stack from this product
"""
return None
def is_valid_reference(self):
return False
@staticmethod
def get_default_baseline_product_type() -> None:
"""
Returns the product type to search for when building a baseline stack.
"""
return None
@staticmethod
def _is_subclass(item: Dict) -> bool:
platform = ASFProduct.umm_get(item['umm'], 'Platforms', 0, 'ShortName')
if platform in ['SENTINEL-1A', 'SENTINEL-1B']:
asf_platform = ASFProduct.umm_get(
item['umm'],
'AdditionalAttributes',
('Name', 'ASF_PLATFORM'),
'Values',
0,
)
return 'Sentinel-1 Interferogram' in asf_platform
return False
# ---- Discovery-asf_search-8.1.2/asf_search/Products/ERSProduct.py ----
from typing import Dict, Union
from asf_search import ASFSession, ASFStackableProduct
from asf_search.CMR.translate import try_round_float
from asf_search.constants import PRODUCT_TYPE
class ERSProduct(ASFStackableProduct):
"""
Used for ERS-1 and ERS-2 products
ASF ERS-1 Dataset Documentation Page: https://asf.alaska.edu/datasets/daac/ers-1/
ASF ERS-2 Dataset Documentation Page: https://asf.alaska.edu/datasets/daac/ers-2/
"""
_base_properties = {
**ASFStackableProduct._base_properties,
'frameNumber': {'path': ['AdditionalAttributes', ('Name', 'FRAME_NUMBER'), 'Values', 0]},
'bytes': {
'path': ['AdditionalAttributes', ('Name', 'BYTES'), 'Values', 0],
'cast': try_round_float,
},
'esaFrame': {'path': ['AdditionalAttributes', ('Name', 'CENTER_ESA_FRAME'), 'Values', 0]},
'md5sum': {'path': ['AdditionalAttributes', ('Name', 'MD5SUM'), 'Values', 0]},
'beamModeType': {'path': ['AdditionalAttributes', ('Name', 'BEAM_MODE_TYPE'), 'Values', 0]},
'insarStackId': {'path': ['AdditionalAttributes', ('Name', 'INSAR_STACK_ID'), 'Values', 0]},
}
def __init__(self, args: Dict = {}, session: ASFSession = ASFSession()):
super().__init__(args, session)
@staticmethod
def get_default_baseline_product_type() -> Union[str, None]:
"""
Returns the product type to search for when building a baseline stack.
"""
return PRODUCT_TYPE.L0
# ---- Discovery-asf_search-8.1.2/asf_search/Products/JERSProduct.py ----
from typing import Dict, Union
from asf_search import ASFSession, ASFStackableProduct
from asf_search.constants import PRODUCT_TYPE
class JERSProduct(ASFStackableProduct):
"""
ASF Dataset Documentation Page: https://asf.alaska.edu/datasets/daac/jers-1/
"""
_base_properties = {
**ASFStackableProduct._base_properties,
'browse': {'path': ['RelatedUrls', ('Type', [('GET RELATED VISUALIZATION', 'URL')])]},
'groupID': {'path': ['AdditionalAttributes', ('Name', 'GROUP_ID'), 'Values', 0]},
'md5sum': {'path': ['AdditionalAttributes', ('Name', 'MD5SUM'), 'Values', 0]},
'beamModeType': {'path': ['AdditionalAttributes', ('Name', 'BEAM_MODE_TYPE'), 'Values', 0]},
'insarStackId': {'path': ['AdditionalAttributes', ('Name', 'INSAR_STACK_ID'), 'Values', 0]},
}
def __init__(self, args: Dict = {}, session: ASFSession = ASFSession()):
super().__init__(args, session)
@staticmethod
def get_default_baseline_product_type() -> Union[str, None]:
"""
Returns the product type to search for when building a baseline stack.
"""
return PRODUCT_TYPE.L0
# ---- Discovery-asf_search-8.1.2/asf_search/Products/NISARProduct.py ----
from typing import Dict, Tuple, Union
from asf_search import ASFSearchOptions, ASFSession, ASFStackableProduct
from asf_search.CMR.translate import try_parse_frame_coverage, try_parse_bool
class NISARProduct(ASFStackableProduct):
"""
Used for NISAR dataset products
ASF Dataset Documentation Page: https://asf.alaska.edu/nisar/
"""
_base_properties = {
**ASFStackableProduct._base_properties,
'pgeVersion': {'path': ['PGEVersionClass', 'PGEVersion']},
'mainBandPolarization': {'path': ['AdditionalAttributes', ('Name', 'FREQUENCY_A_POLARIZATION'), 'Values']},
'sideBandPolarization': {'path': ['AdditionalAttributes', ('Name', 'FREQUENCY_B_POLARIZATION'), 'Values']},
'frameCoverage': {'path': ['AdditionalAttributes', ('Name', 'FULL_FRAME'), 'Values', 0], 'cast': try_parse_frame_coverage},
'jointObservation': {'path': ['AdditionalAttributes', ('Name', 'JOINT_OBSERVATION'), 'Values', 0], 'cast': try_parse_bool},
'rangeBandwidth': {'path': ['AdditionalAttributes', ('Name', 'RANGE_BANDWIDTH_CONCAT'), 'Values']},
}
def __init__(self, args: Dict = {}, session: ASFSession = ASFSession()):
super().__init__(args, session)
self.properties['additionalUrls'] = self._get_additional_urls()
self.properties['s3Urls'] = self._get_s3_uris()
if self.properties.get('groupID') is None:
self.properties['groupID'] = self.properties['sceneName']
@staticmethod
def get_default_baseline_product_type() -> Union[str, None]:
"""
Returns the product type to search for when building a baseline stack.
"""
return None
def is_valid_reference(self):
return False
def get_stack_opts(self, opts: ASFSearchOptions = None) -> ASFSearchOptions:
"""
Build search options that can be used to find an insar stack for this product
:return: ASFSearchOptions describing appropriate options
for building a stack from this product
"""
return None
def get_sort_keys(self) -> Tuple[str, str]:
keys = super().get_sort_keys()
if keys[0] == '':
return (self._read_property('processingDate', ''), keys[1])
return keys
# ---- Discovery-asf_search-8.1.2/asf_search/Products/OPERAS1Product.py ----
from typing import Dict, Tuple
from asf_search import ASFSearchOptions, ASFSession
from asf_search.CMR.translate import try_parse_date
from asf_search.Products import S1Product
class OPERAS1Product(S1Product):
"""
ASF Dataset Documentation Page: https://asf.alaska.edu/datasets/daac/opera/
"""
_base_properties = {
**S1Product._base_properties,
'centerLat': {'path': []}, # Opera products lacks these fields
'centerLon': {'path': []},
'frameNumber': {'path': []},
'operaBurstID': {'path': ['AdditionalAttributes', ('Name', 'OPERA_BURST_ID'), 'Values', 0]},
'validityStartDate': {'path': ['TemporalExtent', 'SingleDateTime'], 'cast': try_parse_date},
'bytes': {'path': ['DataGranule', 'ArchiveAndDistributionInformation']},
'subswath': {'path': ['AdditionalAttributes', ('Name', 'SUBSWATH_NAME'), 'Values', 0]},
'polarization': {
'path': ['AdditionalAttributes', ('Name', 'POLARIZATION'), 'Values']
}, # dual polarization is in list rather than a 'VV+VH' style format
}
_subclass_concept_ids = {
'C1257995185-ASF',
'C1257995186-ASF',
'C1258354200-ASF',
'C1258354201-ASF',
'C1259974840-ASF',
'C1259976861-ASF',
'C1259981910-ASF',
'C1259982010-ASF',
'C2777436413-ASF',
'C2777443834-ASF',
'C2795135174-ASF',
'C2795135668-ASF',
'C1260721853-ASF',
'C1260721945-ASF',
'C2803501097-ASF',
'C2803501758-ASF',
}
def __init__(self, args: Dict = {}, session: ASFSession = ASFSession()):
super().__init__(args, session)
self.baseline = None
self.properties['beamMode'] = self.umm_get(
self.umm, 'AdditionalAttributes', ('Name', 'BEAM_MODE'), 'Values', 0
)
self.properties['additionalUrls'] = self._get_additional_urls()
self.properties['operaBurstID'] = self.umm_get(
self.umm, 'AdditionalAttributes', ('Name', 'OPERA_BURST_ID'), 'Values', 0
)
self.properties['bytes'] = {
entry['Name']: {'bytes': entry['SizeInBytes'], 'format': entry['Format']}
for entry in self.properties['bytes']
}
center = self.centroid()
self.properties['centerLat'] = center.y
self.properties['centerLon'] = center.x
self.properties.pop('frameNumber')
if (processingLevel := self.properties['processingLevel']) in [
'RTC',
'RTC-STATIC',
]:
self.properties['bistaticDelayCorrection'] = self.umm_get(
self.umm,
'AdditionalAttributes',
('Name', 'BISTATIC_DELAY_CORRECTION'),
'Values',
0,
)
if processingLevel == 'RTC':
self.properties['noiseCorrection'] = self.umm_get(
self.umm,
'AdditionalAttributes',
('Name', 'NOISE_CORRECTION'),
'Values',
0,
)
self.properties['postProcessingFilter'] = self.umm_get(
self.umm,
'AdditionalAttributes',
('Name', 'POST_PROCESSING_FILTER'),
'Values',
0,
)
@staticmethod
def get_default_baseline_product_type() -> None:
"""
Returns the product type to search for when building a baseline stack.
"""
return None
def is_valid_reference(self):
return False
def get_stack_opts(self, opts: ASFSearchOptions = None) -> ASFSearchOptions:
"""
Build search options that can be used to find an insar stack for this product
:return: ASFSearchOptions describing appropriate options
for building a stack from this product
"""
return None
def get_sort_keys(self) -> Tuple[str, str]:
keys = super().get_sort_keys()
if keys[0] == '':
return (self._read_property('validityStartDate', ''), keys[1])
return keys
@staticmethod
def _is_subclass(item: Dict) -> bool:
# not all umm products have this field set,
# but when it's available it's convenient for fast matching
concept_id = item['meta'].get('collection-concept-id')
return concept_id in OPERAS1Product._subclass_concept_ids
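The `_base_properties` path lists above are resolved against the granule's UMM JSON by `ASFProduct.umm_get`: string keys index into dicts, integers index into lists, and a `('Name', 'OPERA_BURST_ID')` tuple selects the list element whose `Name` field matches. A minimal sketch of that lookup convention — `resolve_path` is a hypothetical stand-in for illustration, not the real implementation:

```python
from typing import Any, List


def resolve_path(umm: Any, path: List) -> Any:
    """Walk a UMM-style structure: str keys index dicts, ints index lists,
    and a (field, value) tuple selects the list element whose field matches."""
    current = umm
    for step in path:
        if current is None:
            return None
        if isinstance(step, tuple):
            field, wanted = step
            current = next((item for item in current if item.get(field) == wanted), None)
        elif isinstance(step, int):
            current = current[step] if len(current) > step else None
        else:
            current = current.get(step) if isinstance(current, dict) else None
    return current


# Toy UMM fragment (values are made up for the example)
umm = {
    'AdditionalAttributes': [
        {'Name': 'OPERA_BURST_ID', 'Values': ['T102-217155-IW2']},
        {'Name': 'SUBSWATH_NAME', 'Values': ['IW2']},
    ]
}
print(resolve_path(umm, ['AdditionalAttributes', ('Name', 'OPERA_BURST_ID'), 'Values', 0]))
# → T102-217155-IW2
```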
# Discovery-asf_search-8.1.2/asf_search/Products/RADARSATProduct.py
from typing import Dict, Union
from asf_search import ASFSession, ASFStackableProduct
from asf_search.CMR.translate import try_parse_float, try_parse_int
from asf_search.constants import PRODUCT_TYPE
class RADARSATProduct(ASFStackableProduct):
"""
ASF Dataset Documentation Page: https://asf.alaska.edu/datasets/daac/radarsat-1/
"""
_base_properties = {
**ASFStackableProduct._base_properties,
'faradayRotation': {'path': ['AdditionalAttributes', ('Name', 'FARADAY_ROTATION'), 'Values', 0], 'cast': try_parse_float},
'md5sum': {'path': ['AdditionalAttributes', ('Name', 'MD5SUM'), 'Values', 0]},
'beamModeType': {'path': ['AdditionalAttributes', ('Name', 'BEAM_MODE_TYPE'), 'Values', 0]},
'insarStackId': {'path': ['AdditionalAttributes', ('Name', 'INSAR_STACK_ID'), 'Values', 0]},
        'frameNumber': {'path': ['AdditionalAttributes', ('Name', 'FRAME_NUMBER'), 'Values', 0], 'cast': try_parse_int}, # Sentinel and ALOS product alt for frameNumber (ESA_FRAME)
'esaFrame': {'path': ['AdditionalAttributes', ('Name', 'CENTER_ESA_FRAME'), 'Values', 0], 'cast': try_parse_int},
}
def __init__(self, args: Dict = {}, session: ASFSession = ASFSession()):
super().__init__(args, session)
@staticmethod
def get_default_baseline_product_type() -> Union[str, None]:
"""
Returns the product type to search for when building a baseline stack.
"""
return PRODUCT_TYPE.L0
# Discovery-asf_search-8.1.2/asf_search/Products/S1BurstProduct.py
from copy import copy
from typing import Dict, Union
from asf_search import ASFSearchOptions, ASFSession
from asf_search.Products import S1Product
from asf_search.CMR.translate import try_parse_date
from asf_search.CMR.translate import try_parse_int
from asf_search.constants import PRODUCT_TYPE
class S1BurstProduct(S1Product):
"""
S1Product Subclass made specifically for Sentinel-1 SLC-BURST products
Key features/properties:
- `properties['burst']` contains SLC-BURST Specific fields
such as `fullBurstID` and `burstIndex`
- `properties['additionalUrls']` contains BURST-XML url
- SLC-BURST specific stacking params
ASF Dataset Documentation Page:
https://asf.alaska.edu/datasets/data-sets/derived-data-sets/sentinel-1-bursts/
"""
_base_properties = {
**S1Product._base_properties,
'bytes': {'path': ['AdditionalAttributes', ('Name', 'BYTE_LENGTH'), 'Values', 0]},
'absoluteBurstID': {'path': ['AdditionalAttributes', ('Name', 'BURST_ID_ABSOLUTE'), 'Values', 0], 'cast': try_parse_int},
'relativeBurstID': {'path': ['AdditionalAttributes', ('Name', 'BURST_ID_RELATIVE'), 'Values', 0], 'cast': try_parse_int},
'fullBurstID': {'path': ['AdditionalAttributes', ('Name', 'BURST_ID_FULL'), 'Values', 0]},
'burstIndex': {'path': ['AdditionalAttributes', ('Name', 'BURST_INDEX'), 'Values', 0], 'cast': try_parse_int},
'samplesPerBurst': {'path': ['AdditionalAttributes', ('Name', 'SAMPLES_PER_BURST'), 'Values', 0], 'cast': try_parse_int},
'subswath': {'path': ['AdditionalAttributes', ('Name', 'SUBSWATH_NAME'), 'Values', 0]},
'azimuthTime': {'path': ['AdditionalAttributes', ('Name', 'AZIMUTH_TIME'), 'Values', 0], 'cast': try_parse_date},
'azimuthAnxTime': {'path': ['AdditionalAttributes', ('Name', 'AZIMUTH_ANX_TIME'), 'Values', 0]},
}
def __init__(self, args: Dict = {}, session: ASFSession = ASFSession()):
super().__init__(args, session)
self.properties["sceneName"] = self.properties["fileID"]
# Gathers burst properties into `burst` specific dict
# rather than properties dict to limit breaking changes
self.properties["burst"] = {
"absoluteBurstID": self.properties.pop("absoluteBurstID"),
"relativeBurstID": self.properties.pop("relativeBurstID"),
"fullBurstID": self.properties.pop("fullBurstID"),
"burstIndex": self.properties.pop("burstIndex"),
"samplesPerBurst": self.properties.pop("samplesPerBurst"),
"subswath": self.properties.pop("subswath"),
"azimuthTime": self.properties.pop("azimuthTime"),
"azimuthAnxTime": self.properties.pop("azimuthAnxTime"),
}
urls = self.umm_get(
self.umm, "RelatedUrls", ("Type", [("USE SERVICE API", "URL")]), 0
)
if urls is not None:
self.properties["url"] = urls[0]
self.properties["fileName"] = (
self.properties["fileID"] + "." + urls[0].split(".")[-1]
)
self.properties["additionalUrls"] = [urls[1]] # xml-metadata url
def get_stack_opts(self, opts: ASFSearchOptions = None):
"""
Returns the search options asf-search will use internally
to build an SLC-BURST baseline stack from
:param opts: additional criteria for limiting
        :returns ASFSearchOptions used to build a Sentinel-1 SLC-BURST stack
"""
stack_opts = ASFSearchOptions() if opts is None else copy(opts)
stack_opts.processingLevel = self.get_default_baseline_product_type()
stack_opts.fullBurstID = self.properties["burst"]["fullBurstID"]
stack_opts.polarization = [self.properties["polarization"]]
return stack_opts
def _get_additional_filenames_and_urls(self, default_filename: str = None):
        # Burst XML filenames are just numbers; this makes them more identifiable
if default_filename is None:
default_filename = self.properties["fileName"]
file_name = f"{'.'.join(default_filename.split('.')[:-1])}.xml"
return [(file_name, self.properties["additionalUrls"][0])]
@staticmethod
def get_default_baseline_product_type() -> Union[str, None]:
"""
Returns the product type to search for when building a baseline stack.
"""
return PRODUCT_TYPE.BURST
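`S1BurstProduct.__init__` above moves the burst-specific fields out of the flat `properties` dict into a nested `properties['burst']` dict using `pop`, so the top-level keys don't break existing consumers. A standalone sketch of that regrouping pattern (the property values here are made up for illustration):

```python
# Keys that S1BurstProduct gathers under properties['burst']
burst_keys = [
    'absoluteBurstID', 'relativeBurstID', 'fullBurstID', 'burstIndex',
    'samplesPerBurst', 'subswath', 'azimuthTime', 'azimuthAnxTime',
]

# Flat properties dict as it would look before regrouping (illustrative values)
properties = {
    'fileID': 'S1_136231_IW2_20200604T022312_VV_7C85-BURST',
    'fullBurstID': '136_290320_IW2',
    'burstIndex': 3,
    'subswath': 'IW2',
}

# pop() moves each burst field into the nested dict; the guard is only needed
# here because this sketch populates a subset of the real keys
properties['burst'] = {key: properties.pop(key) for key in burst_keys if key in properties}
print(properties['burst'])
```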
# Discovery-asf_search-8.1.2/asf_search/Products/S1Product.py
from copy import copy
from typing import Dict, List, Optional, Tuple
from asf_search import ASFSearchOptions, ASFSession, ASFStackableProduct
from asf_search.CMR.translate import try_parse_date
from asf_search.CMR.translate import try_parse_int
from asf_search.constants import PLATFORM
from asf_search.constants import PRODUCT_TYPE
class S1Product(ASFStackableProduct):
"""
The S1Product classes covers most Sentinel-1 Products
(For S1 BURST-SLC, OPERA-S1, and ARIA-S1 GUNW Products, see relevant S1 subclasses)
ASF Dataset Overview Page: https://asf.alaska.edu/datasets/daac/sentinel-1/
"""
_base_properties = {
**ASFStackableProduct._base_properties,
'frameNumber': {
'path': ['AdditionalAttributes', ('Name', 'FRAME_NUMBER'), 'Values', 0],
'cast': try_parse_int,
}, # Sentinel and ALOS product alt for frameNumber (ESA_FRAME)
'groupID': {'path': ['AdditionalAttributes', ('Name', 'GROUP_ID'), 'Values', 0]},
'md5sum': {'path': ['AdditionalAttributes', ('Name', 'MD5SUM'), 'Values', 0]},
'pgeVersion': {'path': ['PGEVersionClass', 'PGEVersion']},
}
"""
S1 Specific path override
- frameNumber: overrides ASFProduct's `CENTER_ESA_FRAME` with `FRAME_NUMBER`
"""
baseline_type = ASFStackableProduct.BaselineCalcType.CALCULATED
def __init__(self, args: Dict = {}, session: ASFSession = ASFSession()):
super().__init__(args, session)
self.properties['s3Urls'] = self._get_s3_uris()
if self.has_baseline():
self.baseline = self.get_baseline_calc_properties()
def has_baseline(self) -> bool:
baseline = self.get_baseline_calc_properties()
return baseline is not None and None not in baseline['stateVectors']['positions'].values()
def get_baseline_calc_properties(self) -> Dict:
"""
:returns properties required for SLC baseline stack calculations
"""
ascendingNodeTime = self.umm_cast(
self._parse_timestamp,
self.umm_get(self.umm, 'AdditionalAttributes', ('Name', 'ASC_NODE_TIME'), 'Values', 0),
)
return {
'stateVectors': self.get_state_vectors(),
'ascendingNodeTime': ascendingNodeTime,
}
def get_state_vectors(self) -> Dict:
"""
Used in spatio-temporal perpendicular baseline calculations for non-pre-calculated stacks
:returns dictionary of pre/post positions, velocities, and times"""
positions = {}
velocities = {}
sv_pre_position = self.umm_get(
self.umm, 'AdditionalAttributes', ('Name', 'SV_POSITION_PRE'), 'Values', 0
)
sv_post_position = self.umm_get(
self.umm, 'AdditionalAttributes', ('Name', 'SV_POSITION_POST'), 'Values', 0
)
sv_pre_velocity = self.umm_get(
self.umm, 'AdditionalAttributes', ('Name', 'SV_VELOCITY_PRE'), 'Values', 0
)
sv_post_velocity = self.umm_get(
self.umm, 'AdditionalAttributes', ('Name', 'SV_VELOCITY_POST'), 'Values', 0
)
positions['prePosition'], positions['prePositionTime'] = self.umm_cast(
self._parse_state_vector, sv_pre_position
)
positions['postPosition'], positions['postPositionTime'] = self.umm_cast(
self._parse_state_vector, sv_post_position
)
velocities['preVelocity'], velocities['preVelocityTime'] = self.umm_cast(
self._parse_state_vector, sv_pre_velocity
)
velocities['postVelocity'], velocities['postVelocityTime'] = self.umm_cast(
self._parse_state_vector, sv_post_velocity
)
return {'positions': positions, 'velocities': velocities}
def _parse_timestamp(self, timestamp: str) -> Optional[str]:
if timestamp is None:
return None
return try_parse_date(timestamp)
def _parse_state_vector(self, state_vector: str) -> Tuple[Optional[List], Optional[str]]:
if state_vector is None:
return None, None
velocity = [float(val) for val in state_vector.split(',')[:3]]
timestamp = self._parse_timestamp(state_vector.split(',')[-1])
return velocity, timestamp
def get_stack_opts(self, opts: ASFSearchOptions = None) -> ASFSearchOptions:
"""
Returns the search options asf-search will use internally
to build an SLC baseline stack from
:param opts: additional criteria for limiting
        :returns ASFSearchOptions used to build a Sentinel-1 SLC stack
"""
stack_opts = ASFSearchOptions() if opts is None else copy(opts)
stack_opts.processingLevel = self.get_default_baseline_product_type()
stack_opts.beamMode = [self.properties['beamModeType']]
stack_opts.flightDirection = self.properties['flightDirection']
stack_opts.relativeOrbit = [int(self.properties['pathNumber'])] # path
stack_opts.platform = [PLATFORM.SENTINEL1A, PLATFORM.SENTINEL1B]
if self.properties['polarization'] in ['HH', 'HH+HV']:
stack_opts.polarization = ['HH', 'HH+HV']
else:
stack_opts.polarization = ['VV', 'VV+VH']
stack_opts.intersectsWith = self.centroid().wkt
return stack_opts
def is_valid_reference(self) -> bool:
        keys = ['prePosition', 'prePositionTime', 'postPosition', 'postPositionTime']
for key in keys:
if self.baseline['stateVectors']['positions'].get(key) is None:
return False
return True
@staticmethod
def get_default_baseline_product_type() -> str:
"""
Returns the product type to search for when building a baseline stack.
"""
return PRODUCT_TYPE.SLC
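`_parse_state_vector` above splits the UMM state-vector attribute string — three comma-separated floats followed by a timestamp — into a position/velocity triple and its time. A self-contained sketch of that parsing (the sample value is illustrative, not real orbit data):

```python
from typing import List, Optional, Tuple


def parse_state_vector(state_vector: Optional[str]) -> Tuple[Optional[List[float]], Optional[str]]:
    """Split an 'x,y,z,timestamp' string into ([x, y, z], timestamp)."""
    if state_vector is None:
        return None, None
    parts = state_vector.split(',')
    # First three fields are ECEF components; the last is the timestamp
    return [float(val) for val in parts[:3]], parts[-1]


vec, t = parse_state_vector('-2076862.2,6953850.8,491501.4,2021-01-01T00:00:00.000000')
print(vec, t)
```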
# Discovery-asf_search-8.1.2/asf_search/Products/SEASATProduct.py
from typing import Dict
from asf_search import ASFSession, ASFProduct
from asf_search.CMR.translate import try_round_float
class SEASATProduct(ASFProduct):
"""
ASF Dataset Documentation Page: https://asf.alaska.edu/data-sets/sar-data-sets/seasat/
"""
_base_properties = {
**ASFProduct._base_properties,
'bytes': {
'path': ['AdditionalAttributes', ('Name', 'BYTES'), 'Values', 0],
'cast': try_round_float,
},
'insarStackId': {'path': ['AdditionalAttributes', ('Name', 'INSAR_STACK_ID'), 'Values', 0]},
'md5sum': {'path': ['AdditionalAttributes', ('Name', 'MD5SUM'), 'Values', 0]},
}
def __init__(self, args: Dict = {}, session: ASFSession = ASFSession()):
super().__init__(args, session)
# Discovery-asf_search-8.1.2/asf_search/Products/SIRCProduct.py
from typing import Dict
from asf_search import ASFProduct, ASFSession
class SIRCProduct(ASFProduct):
"""
Dataset Documentation Page: https://eospso.nasa.gov/missions/spaceborne-imaging-radar-c
"""
_base_properties = {
**ASFProduct._base_properties,
'groupID': {'path': ['AdditionalAttributes', ('Name', 'GROUP_ID'), 'Values', 0]},
'md5sum': {'path': ['AdditionalAttributes', ('Name', 'MD5SUM'), 'Values', 0]},
'pgeVersion': {'path': ['PGEVersionClass', 'PGEVersion']},
'beamModeType': {'path': ['AdditionalAttributes', ('Name', 'BEAM_MODE_TYPE'), 'Values', 0]},
}
def __init__(self, args: Dict = {}, session: ASFSession = ASFSession()):
super().__init__(args, session)
# Discovery-asf_search-8.1.2/asf_search/Products/SMAPProduct.py
from typing import Dict
from asf_search import ASFProduct, ASFSession
class SMAPProduct(ASFProduct):
"""
ASF Dataset Documentation Page:
https://asf.alaska.edu/data-sets/sar-data-sets/soil-moisture-active-passive-smap-mission/
"""
_base_properties = {
**ASFProduct._base_properties,
'groupID': {'path': ['AdditionalAttributes', ('Name', 'GROUP_ID'), 'Values', 0]},
'insarStackId': {'path': ['AdditionalAttributes', ('Name', 'INSAR_STACK_ID'), 'Values', 0]},
'md5sum': {'path': ['AdditionalAttributes', ('Name', 'MD5SUM'), 'Values', 0]},
}
def __init__(self, args: Dict = {}, session: ASFSession = ASFSession()):
super().__init__(args, session)
# Discovery-asf_search-8.1.2/asf_search/Products/UAVSARProduct.py
from typing import Dict
from asf_search import ASFProduct, ASFSession
class UAVSARProduct(ASFProduct):
"""
ASF Dataset Documentation Page: https://asf.alaska.edu/datasets/daac/uavsar/
"""
_base_properties = {
**ASFProduct._base_properties,
'groupID': {'path': ['AdditionalAttributes', ('Name', 'GROUP_ID'), 'Values', 0]},
'insarStackId': {'path': ['AdditionalAttributes', ('Name', 'INSAR_STACK_ID'), 'Values', 0]},
'md5sum': {'path': ['AdditionalAttributes', ('Name', 'MD5SUM'), 'Values', 0]},
}
def __init__(self, args: Dict = {}, session: ASFSession = ASFSession()):
super().__init__(args, session)
# Discovery-asf_search-8.1.2/asf_search/Products/__init__.py
from .S1Product import S1Product # noqa: F401
from .ALOSProduct import ALOSProduct # noqa: F401
from .RADARSATProduct import RADARSATProduct # noqa: F401
from .AIRSARProduct import AIRSARProduct # noqa: F401
from .ERSProduct import ERSProduct # noqa: F401
from .JERSProduct import JERSProduct # noqa: F401
from .UAVSARProduct import UAVSARProduct # noqa: F401
from .SIRCProduct import SIRCProduct # noqa: F401
from .SEASATProduct import SEASATProduct # noqa: F401
from .SMAPProduct import SMAPProduct # noqa: F401
from .S1BurstProduct import S1BurstProduct # noqa: F401
from .OPERAS1Product import OPERAS1Product # noqa: F401
from .ARIAS1GUNWProduct import ARIAS1GUNWProduct # noqa: F401
from .NISARProduct import NISARProduct # noqa: F401
# Discovery-asf_search-8.1.2/asf_search/WKT/RepairEntry.py
class RepairEntry:
def __init__(self, report_type: str, report: str) -> None:
self.report_type = report_type
self.report = report
def __str__(self) -> str:
return f'{self.report_type}: {self.report}'
# Discovery-asf_search-8.1.2/asf_search/WKT/__init__.py
from .validate_wkt import validate_wkt # noqa: F401
from .RepairEntry import RepairEntry # noqa: F401
# Discovery-asf_search-8.1.2/asf_search/WKT/validate_wkt.py
import logging
from typing import Union, Tuple, List
from shapely import wkt
from shapely.geometry.base import BaseGeometry
from shapely.geometry import (
Polygon,
MultiPolygon,
Point,
LineString,
GeometryCollection,
)
from shapely.geometry.collection import BaseMultipartGeometry
from shapely.ops import transform, orient, unary_union
from .RepairEntry import RepairEntry
from asf_search.exceptions import ASFWKTError
def validate_wkt(
aoi: Union[str, BaseGeometry],
) -> Tuple[BaseGeometry, BaseGeometry, List[RepairEntry]]:
"""
Param aoi: the WKT string or Shapely Geometry to validate and prepare for the CMR query
    Validates the given area of interest and prepares it for the CMR query
    returns: tuple of the AOI's CMR-ready wrapped geometry, its unwrapped counterpart,
    and a list of RepairEntry reports describing any repairs applied
"""
if isinstance(aoi, str):
aoi_shape = wkt.loads(aoi)
else:
aoi_shape = wkt.loads(aoi.wkt)
if not aoi_shape.is_valid:
aoi_shape = _search_wkt_prep(aoi_shape)
if not aoi_shape.is_valid and not isinstance(aoi_shape, MultiPolygon):
if isinstance(aoi_shape, Polygon):
if not aoi_shape.exterior.is_simple:
raise ASFWKTError(
f'WKT string: "{aoi_shape.wkt}" is a self intersecting polygon'
)
raise ASFWKTError(f'WKT string: "{aoi_shape.wkt}" is not a valid WKT string')
if aoi_shape.is_empty:
raise ASFWKTError(f'WKT string: "{aoi_shape.wkt}" empty WKT is not a valid AOI')
wrapped, unwrapped, reports = _simplify_geometry(aoi_shape)
return wrapped, unwrapped, [report for report in reports if report is not None]
def _search_wkt_prep(shape: BaseGeometry):
if isinstance(shape, MultiPolygon):
output = []
for geom in shape.geoms:
output.append(orient(Polygon(geom.exterior)))
return MultiPolygon(output)
if isinstance(shape, Polygon):
return orient(Polygon(shape.exterior), sign=1.0)
def _simplify_geometry(
geometry: BaseGeometry,
) -> Tuple[BaseGeometry, BaseGeometry, List[RepairEntry]]:
"""
param geometry: AOI Shapely Geometry to be prepped for CMR
prepares geometry for CMR by:
1. Flattening any nested multi-part geometry into single collection
2. clamping latitude +/-90, unwrapping longitude +/-180,
removing coordinate dimensions higher than 2 (lon,lat)
3. Merging any overlapping shapes
4. convex-hulling the remainder into a single shape
    5. simplifying until the shape has <= 300 points, with no point closer than 0.00001
    6. Orienting vertices in counter-clockwise winding order
returns: geometry prepped for CMR
"""
flattened = _flatten_multipart_geometry(geometry)
merged, merge_report = _merge_overlapping_geometry(flattened)
convex, convex_report = _get_convex_hull(merged)
simplified, simplified_report = _simplify_aoi(convex)
reoriented, reorientation_report = _counter_clockwise_reorientation(simplified)
wrapped, unwrapped, clamp_report = _get_clamped_and_wrapped_geometry(reoriented)
dimension_report = (
RepairEntry(
report_type="'type': 'EXTRA_DIMENSION'",
report="'report': Only 2-Dimensional area of interests are supported (lon/lat), "
'higher dimension coordinates will be ignored',
)
if geometry.has_z
else None
)
if convex_report is not None:
merge_report = None
repair_reports = [
dimension_report,
merge_report,
convex_report,
*clamp_report,
*simplified_report,
reorientation_report,
]
for report in repair_reports:
if report is not None:
logging.info(f'{report}')
validated_wrapped = transform(lambda x, y, z=None: tuple([round(x, 14), round(y, 14)]), wrapped)
validated_unwrapped = transform(
lambda x, y, z=None: tuple([round(x, 14), round(y, 14)]), unwrapped
)
return validated_wrapped, validated_unwrapped, repair_reports
def _flatten_multipart_geometry(unflattened_geometry: BaseGeometry) -> BaseGeometry:
"""
Recursively flattens nested geometric collections,
guarantees geometric collections have a depth equal to 1.
Also ignores any empty shapes in multipart geometry
"""
def _recurse_nested_geometry(geometry: BaseGeometry) -> List[BaseGeometry]:
output = []
if isinstance(geometry, BaseMultipartGeometry):
for geom in geometry.geoms:
output.extend(_recurse_nested_geometry(geom))
elif not geometry.is_empty:
if isinstance(geometry, Polygon):
return [Polygon(geometry.exterior)]
return [geometry]
return output
flattened = _recurse_nested_geometry(unflattened_geometry)
return flattened[0] if len(flattened) == 1 else GeometryCollection(flattened)
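`_flatten_multipart_geometry` recurses into each multipart member and collects the leaf shapes into one flat collection, dropping empty members along the way. The same recursion pattern on plain nested lists — a shapely-free stand-in where `None` plays the role of an empty shape:

```python
from typing import Any, List


def flatten(geometry: Any) -> List[Any]:
    """Collect leaf items from arbitrarily nested lists, skipping Nones
    (which stand in for empty shapes in this sketch)."""
    output: List[Any] = []
    if isinstance(geometry, list):
        # "Multipart" node: recurse into each member and merge the results
        for geom in geometry:
            output.extend(flatten(geom))
    elif geometry is not None:
        # Leaf shape: keep it
        output.append(geometry)
    return output


print(flatten(['poly1', ['poly2', None, ['point1']], 'line1']))
# → ['poly1', 'poly2', 'point1', 'line1']
```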
def _merge_overlapping_geometry(
geometry: BaseGeometry,
) -> Tuple[BaseGeometry, RepairEntry]:
"""
parameter geometry: geometry to merge
Performs a unary union overlapping operation of the input geometry,
ensuring geometric collections (multipolygon, multipartgeometry, etc)
    are simplified as much as possible before the convex-hull step
output: merged-overlapping geometry
"""
merge_report = None
if isinstance(geometry, BaseMultipartGeometry):
original_amount = len(geometry.geoms)
if original_amount == 1:
return geometry, merge_report
merged = unary_union(geometry)
# if there were non-overlapping shapes
if isinstance(merged, BaseMultipartGeometry):
unique_shapes = len(merged.geoms)
merged = orient(
unary_union(GeometryCollection([geom.convex_hull for geom in merged.geoms]))
)
if isinstance(merged, BaseMultipartGeometry):
if unique_shapes != len(merged.geoms):
merge_report = RepairEntry(
"'type': 'OVERLAP_MERGE'",
f"'report': {unique_shapes - len(merged.geoms)} "
'non-overlapping shapes merged by their convex-hulls',
)
else:
merge_report = RepairEntry(
"'type': 'OVERLAP_MERGE'",
f"'report': {unique_shapes} non-overlapping shapes merged by their convex-hulls", # noqa F401
)
else:
merge_report = RepairEntry(
"'type': 'OVERLAP_MERGE'",
f"'report': Overlapping {original_amount} shapes merged into one",
)
return merged, merge_report
return geometry, merge_report
def _counter_clockwise_reorientation(geometry: Union[Point, LineString, Polygon]):
"""
param geometry: Shapely geometry to re-orient
Ensures the geometry coordinates are wound counter-clockwise
output: counter-clockwise oriented geometry
"""
reoriented_report = RepairEntry("'type': 'REVERSE'", "'report': Reversed polygon winding order")
reoriented = orient(geometry)
if isinstance(geometry, Polygon):
        # if the vertex ordering has changed
if reoriented.exterior.is_ccw != geometry.exterior.is_ccw:
return reoriented, reoriented_report
return reoriented, None
def _get_clamped_and_wrapped_geometry(
shape: BaseGeometry,
) -> Tuple[BaseGeometry, BaseGeometry, List[RepairEntry]]:
"""
param geometry: Shapely geometry to clamp
Clamps geometry to +/-90 latitude and wraps longitude +/-180
output: clamped shapely geometry
"""
coords_clamped = 0
coords_wrapped = 0
def _clamp_lat(x, y, z=None):
clamped = _clamp(y)
if clamped != y:
nonlocal coords_clamped
coords_clamped += 1
return tuple([x, clamped])
def _wrap_lon(x, y, z=None):
wrapped = x
if abs(x) > 180:
wrapped = (x + 180) % 360 - 180
if wrapped != x:
nonlocal coords_wrapped
coords_wrapped += 1
return tuple([wrapped, y])
def _unwrap_lon(x, y, z=None):
unwrapped = x if x >= 0 else x + 360 # This undoes wrapping
return tuple([unwrapped, y])
clamped_lat = transform(_clamp_lat, shape)
wrapped = transform(_wrap_lon, clamped_lat)
if wrapped.bounds[2] - wrapped.bounds[0] > 180:
unwrapped = transform(_unwrap_lon, wrapped)
else:
unwrapped = wrapped
clampRepairReport = None
wrapRepairReport = None
if coords_clamped > 0:
clampRepairReport = RepairEntry(
"'type': 'CLAMP'",
f"'report': 'Clamped {coords_clamped} value(s) to +/-90 latitude'",
)
if coords_wrapped > 0:
wrapRepairReport = RepairEntry(
"'type': 'WRAP'",
f"'report': 'Wrapped {coords_wrapped} value(s) to +/-180 longitude'",
)
return (wrapped, unwrapped, [clampRepairReport, wrapRepairReport])
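The clamp and wrap rules applied above are simple closed-form operations: latitude is pinned to the valid range, and longitude is shifted into +/-180 with the modulo identity used in `_wrap_lon`. Isolated for clarity:

```python
def clamp_lat(y: float) -> float:
    """Clamp latitude to the +/-90 range."""
    return max(-90.0, min(90.0, y))


def wrap_lon(x: float) -> float:
    """Wrap longitude into the +/-180 range; in-range values pass through."""
    return (x + 180.0) % 360.0 - 180.0 if abs(x) > 180.0 else x


print(clamp_lat(95.0), wrap_lon(190.0))  # → 90.0 -170.0
```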
def _get_convex_hull(geometry: BaseGeometry) -> Tuple[BaseGeometry, RepairEntry]:
"""
param geometry: geometry to perform possible convex hull operation on
If the given geometry is a collection of geometries,
creates a convex-hull encompassing said geometry
output: convex hull of multi-part geometry, or the original single-shaped geometry
"""
if geometry.geom_type not in [
'MultiPoint',
'MultiLineString',
'MultiPolygon',
'GeometryCollection',
]:
return geometry, None
possible_repair = RepairEntry(
"'type': 'CONVEX_HULL_INDIVIDUAL'",
"'report': 'Unconnected shapes: Convex-hulled each INDIVIDUAL shape to merge them together.'", # noqa F401
)
return geometry.convex_hull, possible_repair
def _simplify_aoi(
shape: Union[Polygon, LineString, Point],
threshold: float = 0.004,
max_depth: int = 10,
) -> Tuple[Union[Polygon, LineString, Point], List[RepairEntry]]:
"""
param shape: Shapely geometry to simplify
param threshold: point proximity threshold to merge nearby points of geometry with
    param max_depth: the maximum number of simplification passes, defaults to 10
    Iteratively simplifies geometry with increasing threshold
    until there are no more than 300 points
output: simplified geometry
"""
repairs = []
if shape.geom_type == 'Point':
return shape, repairs
# Check for very small shapes and collapse accordingly
mbr_width = shape.bounds[2] - shape.bounds[0]
mbr_height = shape.bounds[3] - shape.bounds[1]
# If both pass, it's a tiny box. Turn it to a point
if mbr_width <= threshold and mbr_height <= threshold:
simplified = shape.centroid
repair = RepairEntry(
"'type': 'GEOMETRY_SIMPLIFICATION'",
"'report': 'Shape Collapsed to Point: "
f'shape of {_get_shape_coords_len(shape)} '
f'simplified to {_get_shape_coords_len(simplified)} '
f"with proximity threshold of {threshold}'",
)
return simplified, [*repairs, repair]
# If it's a single line segment, it's already as simple as can be. Don't do anything
elif shape.geom_type == 'LineString' and len(shape.coords) == 2:
return shape, repairs
# Else, check if it's slim enough to become a linestring:
elif mbr_width <= threshold:
lon = (shape.bounds[2] - shape.bounds[0]) / 2 + shape.bounds[0]
simplified = LineString([(lon, shape.bounds[1]), (lon, shape.bounds[3])])
repair = RepairEntry(
"'type': 'GEOMETRY_SIMPLIFICATION'",
f"'report': 'Shape Collapsed to Vertical Line: shape of {_get_shape_coords_len(shape)} "
f'simplified to {_get_shape_coords_len(simplified)} '
f"with proximity threshold of {threshold}'",
)
return simplified, [*repairs, repair]
elif mbr_height <= threshold:
lat = (shape.bounds[3] - shape.bounds[1]) / 2 + shape.bounds[1]
simplified = LineString([(shape.bounds[0], lat), (shape.bounds[2], lat)])
repair = RepairEntry(
"'type': 'GEOMETRY_SIMPLIFICATION'",
"'report': 'Shape Collapsed to Horizontal Line: "
f'shape of {_get_shape_coords_len(shape)} simplified '
f"to {_get_shape_coords_len(simplified)} with proximity threshold of {threshold}'",
)
return simplified, [*repairs, repair]
# Keep taking away points until it's under 300:
for simplify_level in range(0, max_depth):
        simplified = shape.simplify(tolerance=threshold * (1.5**simplify_level))
        coords_length = _get_shape_coords_len(simplified)
        if _get_shape_coords_len(shape) != coords_length:
            repairs.append(
                RepairEntry(
                    "'type': 'GEOMETRY_SIMPLIFICATION'",
                    f"'report': 'Shape Simplified: shape of {_get_shape_coords_len(shape)} "
                    f"simplified to {coords_length} with proximity threshold of {threshold}'",
                )
            )
        if coords_length <= 300:
            return simplified, repairs
raise ASFWKTError(f'Failed to simplify wkt string: {shape.wkt}')
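The retry loop in `_simplify_aoi` re-runs `shape.simplify` with a geometrically growing tolerance, `threshold * 1.5**level`, stopping as soon as the result has 300 or fewer points. With the 0.004 default and 10 levels, the tolerance schedule it sweeps looks like:

```python
threshold = 0.004  # default proximity threshold from _simplify_aoi
max_depth = 10     # default number of passes

# Tolerance used at each simplification level: threshold * 1.5**level
schedule = [threshold * (1.5 ** level) for level in range(max_depth)]
print([round(t, 6) for t in schedule])
```

Each pass is ~50% more aggressive than the last, so point count falls off quickly before the `ASFWKTError` fallback is ever reached.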
def _clamp(num):
"""Clamps value between -90 and 90"""
return max(-90, min(90, num))
def _get_shape_coords_len(geometry: BaseGeometry):
return len(_get_shape_coords(geometry))
def _get_shape_coords(geometry: BaseGeometry):
"""Returns flattened coordinates of input Shapely geometry"""
if geometry.geom_type == 'Polygon':
return list(geometry.exterior.coords[:-1])
if geometry.geom_type == 'LineString':
return list(geometry.coords)
if geometry.geom_type == 'Point':
return list(geometry.coords)
output = []
for geom in geometry.geoms:
coords = _get_shape_coords(geom)
output = [*output, *coords]
return output
# Discovery-asf_search-8.1.2/asf_search/__init__.py
# backport of importlib.metadata for python < 3.8
from importlib_metadata import PackageNotFoundError, version
## Setup logging now, so it's available if __version__ fails:
import logging
ASF_LOGGER = logging.getLogger(__name__)
# Add null handle so we do nothing by default. It's up to whatever
# imports us, if they want logging.
ASF_LOGGER.addHandler(logging.NullHandler())
try:
__version__ = version(__name__)
except PackageNotFoundError as e:
msg = str(
"package is not installed!\n"
"Install in editable/develop mode via (from the top of this repo):\n"
" python3 -m pip install -e .\n"
"Or, to just get the version number use:\n"
" python setup.py --version"
)
print(msg)
ASF_LOGGER.exception(msg) # type: ignore # noqa: F821
raise PackageNotFoundError(
"Install with 'python3 -m pip install -e .' to use"
) from e
from .ASFSession import ASFSession # noqa: F401, E402
from .ASFProduct import ASFProduct # noqa: F401 E402
from .ASFStackableProduct import ASFStackableProduct # noqa: F401 E402
from .ASFSearchResults import ASFSearchResults # noqa: F401 E402
from .ASFSearchOptions import ASFSearchOptions, validators # noqa: F401 E402
from .Products import * # noqa: F403 F401 E402
from .exceptions import * # noqa: F403 F401 E402
from .constants import ( # noqa: F401 E402
BEAMMODE, # noqa: F401 E402
FLIGHT_DIRECTION, # noqa: F401 E402
INSTRUMENT, # noqa: F401 E402
PLATFORM, # noqa: F401 E402
POLARIZATION, # noqa: F401 E402
PRODUCT_TYPE, # noqa: F401 E402
INTERNAL, # noqa: F401 E402
DATASET, # noqa: F401 E402
RANGE_BANDWIDTH, # noqa: F401 E402
)
from .health import * # noqa: F403 F401 E402
from .search import * # noqa: F403 F401 E402
from .download import * # noqa: F403 F401 E402
from .CMR import * # noqa: F403 F401 E402
from .baseline import * # noqa: F403 F401 E402
from .WKT import validate_wkt # noqa: F401 E402
from .export import * # noqa: F403 F401 E402
REPORT_ERRORS = True
"""Enables automatic search error reporting to ASF, send any questions to uso@asf.alaska.edu"""
# Discovery-asf_search-8.1.2/asf_search/baseline/__init__.py
from .calc import * # noqa: F403 F401
from .stack import * # noqa: F403 F401
# Discovery-asf_search-8.1.2/asf_search/baseline/calc.py
from asf_search import ASFProduct
from math import sqrt, cos, sin, radians
from typing import List
import numpy as np
try:
from ciso8601 import parse_datetime
except ImportError:
from dateutil.parser import parse as parse_datetime
# WGS84 constants
a = 6378137
f = pow((1.0 - 1 / 298.257224), 2)
# Technically the flattening is just the 1/298.257224 part, but (1 - f)^2 is
# all we ever use, so pre-calculate and cache that and call it all f anyhow
def calculate_perpendicular_baselines(reference: str, stack: List[ASFProduct]):
for product in stack:
baselineProperties = product.baseline
positionProperties = baselineProperties["stateVectors"]["positions"]
if len(positionProperties.keys()) == 0:
baselineProperties["noStateVectors"] = True
continue
if None in [
positionProperties["prePositionTime"],
positionProperties["postPositionTime"],
positionProperties["prePosition"],
positionProperties["postPosition"],
]:
baselineProperties["noStateVectors"] = True
continue
asc_node_time = parse_datetime(
baselineProperties["ascendingNodeTime"]
).timestamp()
start = parse_datetime(product.properties["startTime"]).timestamp()
end = parse_datetime(product.properties["stopTime"]).timestamp()
center = start + ((end - start) / 2)
baselineProperties["relative_start_time"] = start - asc_node_time
baselineProperties["relative_center_time"] = center - asc_node_time
baselineProperties["relative_end_time"] = end - asc_node_time
t_pre = parse_datetime(positionProperties["prePositionTime"]).timestamp()
t_post = parse_datetime(positionProperties["postPositionTime"]).timestamp()
product.baseline["relative_sv_pre_time"] = t_pre - asc_node_time
product.baseline["relative_sv_post_time"] = t_post - asc_node_time
for product in stack:
if product.properties["sceneName"] == reference:
reference = product
reference.properties["perpendicularBaseline"] = 0
# Cache these values
reference.baseline["granulePosition"] = get_granule_position(
reference.properties["centerLat"], reference.properties["centerLon"]
)
break
for secondary in stack:
if secondary.baseline.get("noStateVectors"):
secondary.properties["perpendicularBaseline"] = None
continue
shared_rel_time = get_shared_sv_time(reference, secondary)
reference_shared_pos = get_pos_at_rel_time(reference, shared_rel_time)
reference_shared_vel = get_vel_at_rel_time(reference, shared_rel_time)
secondary_shared_pos = get_pos_at_rel_time(secondary, shared_rel_time)
# secondary_shared_vel = get_vel_at_rel_time(secondary, shared_rel_time) # unused
# need to get sat pos and sat vel at center time
reference.baseline["alongBeamVector"] = get_along_beam_vector(
reference_shared_pos, reference.baseline["granulePosition"]
)
reference.baseline["upBeamVector"] = get_up_beam_vector(
reference_shared_vel, reference.baseline["alongBeamVector"]
)
perpendicular_baseline = get_paired_granule_baseline(
reference.baseline["granulePosition"],
reference.baseline["upBeamVector"],
secondary_shared_pos,
)
if abs(perpendicular_baseline) > 100000:
perpendicular_baseline = None
secondary.properties["perpendicularBaseline"] = perpendicular_baseline
return stack
# Convert granule center lat/lon to fixed earth coordinates in meters using WGS84 ellipsoid.
def get_granule_position(scene_center_lat, scene_center_lon):
lat = radians(float(scene_center_lat))
lon = radians(float(scene_center_lon))
coslat = cos(lat) # This value gets used a couple times, cache it
sinlat = sin(lat) # This value gets used a couple times, cache it
C = 1.0 / (sqrt(pow(coslat, 2) + f * pow(sinlat, 2)))
S = f * C
aC = a * C
granule_position = np.array(
[aC * coslat * cos(lon), aC * coslat * sin(lon), a * S * sinlat]
)
return granule_position
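As a sanity check on the conversion above, here is a minimal standalone sketch of the same geodetic-to-ECEF calculation (the function name `geodetic_to_ecef` is ours, and a plain tuple stands in for the numpy array). On the equator at the prime meridian the point should lie one equatorial radius out along the x-axis:

```python
from math import sqrt, cos, sin, radians

# Same WGS84 constants as the module above.
a = 6378137
f = pow((1.0 - 1 / 298.257224), 2)

def geodetic_to_ecef(lat_deg, lon_deg):
    lat, lon = radians(float(lat_deg)), radians(float(lon_deg))
    coslat, sinlat = cos(lat), sin(lat)
    C = 1.0 / sqrt(pow(coslat, 2) + f * pow(sinlat, 2))
    aC = a * C
    # z uses S = f * C, exactly as in get_granule_position
    return (aC * coslat * cos(lon), aC * coslat * sin(lon), a * f * C * sinlat)

print(geodetic_to_ecef(0, 0))  # (6378137.0, 0.0, 0.0)
```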
# Calculate along beam vector from sat pos and granule pos
def get_along_beam_vector(satellite_position, granule_position):
along_beam_vector = np.subtract(satellite_position, granule_position)
along_beam_vector = np.divide(
along_beam_vector, np.linalg.norm(along_beam_vector)
) # normalize
return along_beam_vector
# Calculate up beam vector from sat velocity and along beam vector
def get_up_beam_vector(satellite_velocity, along_beam_vector):
up_beam_vector = np.cross(satellite_velocity, along_beam_vector)
up_beam_vector = np.divide(
up_beam_vector, np.linalg.norm(up_beam_vector)
) # normalize
return up_beam_vector
# Calculate baseline between reference and paired granule
def get_paired_granule_baseline(
reference_granule_position, reference_up_beam_vector, paired_satellite_position
):
posd = np.subtract(paired_satellite_position, reference_granule_position)
baseline = np.dot(reference_up_beam_vector, posd)
return int(round(baseline))
# Find a relative orbit time covered by both granules' SVs
def get_shared_sv_time(reference, secondary):
start = max(
reference.baseline["relative_sv_pre_time"],
secondary.baseline["relative_sv_pre_time"],
)
    end = min(
        reference.baseline["relative_sv_post_time"],
        secondary.baseline["relative_sv_post_time"],
    )
# Favor the start/end SV time of the reference so
# we can use that SV directly without interpolation
if start == reference.baseline["relative_sv_pre_time"]:
return start
if end == reference.baseline["relative_sv_post_time"]:
return end
return start
# Interpolate a position SV based on relative time
def get_pos_at_rel_time(granule: ASFProduct, relative_time):
if relative_time == granule.baseline["relative_sv_pre_time"]:
return granule.baseline["stateVectors"]["positions"]["prePosition"]
if relative_time == granule.baseline["relative_sv_post_time"]:
return granule.baseline["stateVectors"]["positions"]["postPosition"]
duration = (
granule.baseline["relative_sv_post_time"] - granule.baseline["relative_sv_pre_time"]
)
factor = (relative_time - granule.baseline["relative_sv_pre_time"]) / duration
vec_a = granule.baseline["stateVectors"]["positions"]["prePosition"]
vec_b = granule.baseline["stateVectors"]["positions"]["postPosition"]
v = [
interpolate(vec_a[0], vec_b[0], factor),
interpolate(vec_a[1], vec_b[1], factor),
interpolate(vec_a[2], vec_b[2], factor),
]
return radius_fix(granule, v, relative_time)
# Interpolate a velocity SV based on relative time
def get_vel_at_rel_time(granule: ASFProduct, relative_time):
velocityProperties = granule.baseline["stateVectors"]["velocities"]
if relative_time == granule.baseline["relative_sv_pre_time"]:
return velocityProperties["preVelocity"]
if relative_time == granule.baseline["relative_sv_post_time"]:
return velocityProperties["postVelocity"]
duration = (
granule.baseline["relative_sv_post_time"] - granule.baseline["relative_sv_pre_time"]
)
factor = (relative_time - granule.baseline["relative_sv_pre_time"]) / duration
vec_a = velocityProperties["preVelocity"]
vec_b = velocityProperties["postVelocity"]
v = [
interpolate(vec_a[0], vec_b[0], factor),
interpolate(vec_a[1], vec_b[1], factor),
interpolate(vec_a[2], vec_b[2], factor),
]
return v
# convenience 1d linear interp
def interpolate(p0, p1, x):
return (p0 * (1.0 - x)) + (p1 * x)
# Bump the provided sat pos out to a radius interpolated between the start and end sat pos vectors
def radius_fix(granule: ASFProduct, sat_pos, relative_time):
positionProperties = granule.baseline["stateVectors"]["positions"]
pre_l = np.linalg.norm(positionProperties["prePosition"])
post_l = np.linalg.norm(positionProperties["postPosition"])
sat_pos_l = np.linalg.norm(sat_pos)
dt = relative_time - granule.baseline["relative_sv_pre_time"]
new_l = pre_l + (post_l - pre_l) * dt / (
granule.baseline["relative_sv_post_time"] - granule.baseline["relative_sv_pre_time"]
)
sat_pos[0] = sat_pos[0] * new_l / sat_pos_l
sat_pos[1] = sat_pos[1] * new_l / sat_pos_l
sat_pos[2] = sat_pos[2] * new_l / sat_pos_l
return sat_pos
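To see why `radius_fix` exists: linearly interpolating between two positions on a (roughly circular) orbit yields a point on a chord, inside the orbit, which then has to be pushed back out to orbital radius. A minimal sketch on an idealized circular orbit, with plain lists standing in for the numpy vectors above:

```python
from math import sqrt

def interpolate(p0, p1, x):
    return (p0 * (1.0 - x)) + (p1 * x)

# Two position samples on a circular orbit of radius 7000 (units arbitrary),
# a quarter-orbit apart.
pre = [7000.0, 0.0, 0.0]
post = [0.0, 7000.0, 0.0]

# Linear interpolation at the midpoint cuts a chord inside the orbit...
mid = [interpolate(p, q, 0.5) for p, q in zip(pre, post)]
chord_radius = sqrt(sum(c * c for c in mid))  # ~4949.7, well short of 7000

# ...so the radius fix rescales the point back out to the interpolated radius
# (pre and post radii are equal here, so the target is simply 7000).
fixed = [c * 7000.0 / chord_radius for c in mid]
print(round(sqrt(sum(c * c for c in fixed)), 6))  # 7000.0
```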
# File: asf_search/baseline/stack.py
from asf_search import ASFProduct, ASFStackableProduct, ASFSearchResults
from typing import Tuple, List, Union
import pytz
from .calc import calculate_perpendicular_baselines
try:
from ciso8601 import parse_datetime
except ImportError:
from dateutil.parser import parse as parse_datetime
def get_baseline_from_stack(
reference: ASFProduct, stack: ASFSearchResults
) -> Tuple[ASFSearchResults, List[dict]]:
warnings = []
if len(stack) == 0:
raise ValueError("No products found matching stack parameters")
stack = [
product
for product in stack
if not product.properties["processingLevel"].lower().startswith("metadata") and
product.baseline is not None
]
reference, stack, reference_warnings = check_reference(reference, stack)
if reference_warnings is not None:
warnings.append(reference_warnings)
stack = calculate_temporal_baselines(reference, stack)
if reference.baseline_type == ASFStackableProduct.BaselineCalcType.PRE_CALCULATED:
stack = offset_perpendicular_baselines(reference, stack)
else:
stack = calculate_perpendicular_baselines(
reference.properties["sceneName"], stack
)
missing_state_vectors = _count_missing_state_vectors(stack)
if missing_state_vectors > 0:
warnings.append(
{
"MISSING STATE VECTORS":
f'{missing_state_vectors} scenes in stack missing State Vectors, '
'perpendicular baseline not calculated for these scenes'
}
)
return ASFSearchResults(stack), warnings
def _count_missing_state_vectors(stack) -> int:
return len([scene for scene in stack if scene.baseline.get("noStateVectors")])
def find_new_reference(stack: ASFSearchResults) -> Union[ASFProduct, None]:
for product in stack:
if product.is_valid_reference():
return product
return None
def check_reference(reference: ASFProduct, stack: ASFSearchResults):
warnings = None
if reference.properties["sceneName"] not in [
product.properties["sceneName"] for product in stack
]: # Somehow the reference we built the stack from is missing?! Just pick one
reference = stack[0]
warnings = [
{
'NEW_REFERENCE':
'A new reference scene had to be selected in order to calculate baseline values.'
}
]
    # For non-S1 products, is_valid_reference raises an error, while for S1 we try
    # to find a valid reference. Do we want this behaviour for pre-calc stacks?
if not reference.is_valid_reference():
reference = find_new_reference(stack)
if reference is None:
raise ValueError(
"No valid state vectors on any scenes in stack, this is fatal"
)
return reference, stack, warnings
def calculate_temporal_baselines(reference: ASFProduct, stack: ASFSearchResults):
"""
Calculates temporal baselines for a stack of products based on a reference scene
and injects those values into the stack.
:param reference: The reference product from which to calculate temporal baselines.
:param stack: The stack to operate on.
    :return: The stack, with temporal baseline values injected (the stack is also modified in place).
"""
reference_time = parse_datetime(reference.properties["startTime"])
if reference_time.tzinfo is None:
reference_time = pytz.utc.localize(reference_time)
for secondary in stack:
secondary_time = parse_datetime(secondary.properties["startTime"])
if secondary_time.tzinfo is None:
secondary_time = pytz.utc.localize(secondary_time)
secondary.properties["temporalBaseline"] = (
secondary_time.date() - reference_time.date()
).days
return stack
def offset_perpendicular_baselines(reference: ASFProduct, stack: ASFSearchResults):
reference_offset = float(reference.baseline["insarBaseline"])
for product in stack:
product.properties["perpendicularBaseline"] = round(
float(product.baseline["insarBaseline"]) - reference_offset
)
return stack
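The temporal baseline calculation above boils down to a whole-day difference between the two scenes' start dates, regardless of clock time. A self-contained sketch using only the standard library (the timestamps are made up):

```python
from datetime import datetime

# Two acquisition start times twelve calendar days apart, even though the
# clock times differ by 30 seconds; matches the .date() arithmetic above.
reference_time = datetime.fromisoformat('2021-01-01T12:00:00+00:00')
secondary_time = datetime.fromisoformat('2021-01-13T11:59:30+00:00')

temporal_baseline = (secondary_time.date() - reference_time.date()).days
print(temporal_baseline)  # 12
```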
# File: asf_search/constants/BEAMMODE.py
IW = 'IW'
EW = 'EW'
S1 = 'S1'
S2 = 'S2'
S3 = 'S3'
S4 = 'S4'
S5 = 'S5'
S6 = 'S6'
WV = 'WV'
DSN = 'DSN'
FBS = 'FBS'
FBD = 'FBD'
PLR = 'PLR'
WB1 = 'WB1'
WB2 = 'WB2'
OBS = 'OBS'
SIRC11 = '11'
SIRC13 = '13'
SIRC16 = '16'
SIRC20 = '20'
SLC = 'SLC'
STD = 'STD'
POL = 'POL'
RPI = 'RPI'
EH3 = 'EH3'
EH4 = 'EH4'
EH6 = 'EH6'
EL1 = 'EL1'
FN1 = 'FN1'
FN2 = 'FN2'
FN3 = 'FN3'
FN4 = 'FN4'
FN5 = 'FN5'
SNA = 'SNA'
SNB = 'SNB'
ST1 = 'ST1'
ST2 = 'ST2'
ST3 = 'ST3'
ST4 = 'ST4'
ST5 = 'ST5'
ST6 = 'ST6'
ST7 = 'ST7'
SWA = 'SWA'
SWB = 'SWB'
WD1 = 'WD1'
WD2 = 'WD2'
WD3 = 'WD3'
# File: asf_search/constants/DATASET.py
SENTINEL1 = 'SENTINEL-1'
OPERA_S1 = 'OPERA-S1'
OPERA_S1_CALVAL = 'OPERA-S1-CALVAL'
SLC_BURST = 'SLC-BURST'
ALOS_PALSAR = 'ALOS PALSAR'
ALOS_AVNIR_2 = 'ALOS AVNIR-2'
SIRC = 'SIR-C'
ARIA_S1_GUNW = 'ARIA S1 GUNW'
SMAP = 'SMAP'
UAVSAR = 'UAVSAR'
RADARSAT_1 = 'RADARSAT-1'
ERS = 'ERS'
JERS_1 = 'JERS-1'
AIRSAR = 'AIRSAR'
SEASAT = 'SEASAT'
NISAR = 'NISAR'
# File: asf_search/constants/FLIGHT_DIRECTION.py
ASCENDING = 'ASCENDING'
DESCENDING = 'DESCENDING'
# File: asf_search/constants/INSTRUMENT.py
C_SAR = 'C-SAR'
PALSAR = 'PALSAR'
AVNIR_2 = 'AVNIR-2'
# File: asf_search/constants/INTERNAL.py
ASF_AUTH_HOST = 'auth.asf.alaska.edu'
CMR_HOST = 'cmr.earthdata.nasa.gov'
CMR_TIMEOUT = 30
CMR_FORMAT_EXT = 'umm_json'
CMR_GRANULE_PATH = f'/search/granules.{CMR_FORMAT_EXT}'
CMR_COLLECTIONS = '/search/collections'
CMR_COLLECTIONS_PATH = f'{CMR_COLLECTIONS}.{CMR_FORMAT_EXT}'
CMR_HEALTH_PATH = '/search/health'
CMR_PAGE_SIZE = 250
EDL_HOST = 'urs.earthdata.nasa.gov'
EDL_CLIENT_ID = 'BO_n7nTIlMljdvU6kRRB3g'
DEFAULT_PROVIDER = 'ASF'
AUTH_DOMAINS = ['asf.alaska.edu', 'earthdata.nasa.gov']
AUTH_COOKIES = ['urs_user_already_logged', 'uat_urs_user_already_logged']
ERROR_REPORTING_ENDPOINT = 'search-error-report.asf.alaska.edu'
# File: asf_search/constants/PLATFORM.py
SENTINEL1 = 'SENTINEL-1'
SENTINEL1A = 'Sentinel-1A'
SENTINEL1B = 'Sentinel-1B'
SIRC = 'SIR-C'
ALOS = 'ALOS'
ERS = 'ERS'
ERS1 = 'ERS-1'
ERS2 = 'ERS-2'
JERS = 'JERS-1'
RADARSAT = 'RADARSAT-1'
AIRSAR = 'AIRSAR'
SEASAT = 'SEASAT 1'
SMAP = 'SMAP'
UAVSAR = 'UAVSAR'
NISAR = 'NISAR'
# File: asf_search/constants/POLARIZATION.py
HH = 'HH'
VV = 'VV'
VV_VH = 'VV+VH'
HH_HV = 'HH+HV'
DUAL_HH = 'DUAL HH'
DUAL_VV = 'DUAL VV'
DUAL_HV = 'DUAL HV'
DUAL_VH = 'DUAL VH'
HH_3SCAN = 'HH 3SCAN'
HH_4SCAN = 'HH 4SCAN'
HH_5SCAN = 'HH 5SCAN'
QUAD = 'quadrature'
HH_VV = 'HH+VV'
HH_HV_VH_VV = 'HH+HV+VH+VV'
FULL = 'full'
UNKNOWN = 'UNKNOWN'
# NISAR
LH_LV = 'LH+LV'
RH_RV = 'RH+RV'
HH_HV_VV_VH = 'HH+HV+VV+VH'
# File: asf_search/constants/PRODUCT_TYPE.py
# Sentinel-1
GRD_HD = 'GRD_HD'
GRD_MD = 'GRD_MD'
GRD_MS = 'GRD_MS'
GRD_HS = 'GRD_HS'
GRD_FD = 'GRD_FD'
SLC = 'SLC'
OCN = 'OCN'
RAW = 'RAW'
METADATA_GRD_HD = 'METADATA_GRD_HD'
METADATA_GRD_MD = 'METADATA_GRD_MD'
METADATA_GRD_MS = 'METADATA_GRD_MS'
METADATA_GRD_HS = 'METADATA_GRD_HS'
METADATA_SLC = 'METADATA_SLC'
METADATA_OCN = 'METADATA_OCN'
METADATA_RAW = 'METADATA_RAW'
BURST = 'BURST'
# ALOS PALSAR
L1_0 = 'L1.0'
L1_1 = 'L1.1'
L1_5 = 'L1.5'
L2_2 = 'L2.2'
RTC_LOW_RES = 'RTC_LOW_RES'
RTC_HIGH_RES = 'RTC_HI_RES'
KMZ = 'KMZ'
# ALOS AVNIR
# No PROCESSING_TYPE attribute in CMR
# SIR-C
# SLC and SLC metadata are both 'SLC', provided by Sentinel-1 constants
# Sentinel-1 InSAR
GUNW_STD = 'GUNW_STD'
GUNW_AMP = 'GUNW_AMP'
GUNW_CON = 'GUNW_CON'
GUNW_COH = 'GUNW_COH'
GUNW_UNW = 'GUNW_UNW'
# SMAP
L1A_RADAR_RO_HDF5 = 'L1A_Radar_RO_HDF5'
L1A_RADAR_HDF5 = 'L1A_Radar_HDF5'
L1B_S0_LOW_RES_HDF5 = 'L1B_S0_LoRes_HDF5'
L1C_S0_HIGH_RES_HDF5 = 'L1C_S0_HiRes_HDF5'
L1A_RADAR_RO_QA = 'L1A_Radar_RO_QA'
L1A_RADAR_QA = 'L1A_Radar_QA'
L1B_S0_LOW_RES_QA = 'L1B_S0_LoRes_QA'
L1C_S0_HIGH_RES_QA = 'L1C_S0_HiRes_QA'
L1A_RADAR_RO_ISO_XML = 'L1A_Radar_RO_ISO_XML'
L1B_S0_LOW_RES_ISO_XML = 'L1B_S0_LoRes_ISO_XML'
L1C_S0_HIGH_RES_ISO_XML = 'L1C_S0_HiRes_ISO_XML'
# UAVSAR
AMPLITUDE = 'AMPLITUDE'
STOKES = 'STOKES'
AMPLITUDE_GRD = 'AMPLITUDE_GRD'
PROJECTED = 'PROJECTED'
PROJECTED_ML5X5 = 'PROJECTED_ML5X5'
PROJECTED_ML3X3 = 'PROJECTED_ML3X3'
INTERFEROMETRY_GRD = 'INTERFEROMETRY_GRD'
INTERFEROMETRY = 'INTERFEROMETRY'
COMPLEX = 'COMPLEX'
# KMZ provided by ALOS PALSAR
INC = 'INC'
SLOPE = 'SLOPE'
DEM_TIFF = 'DEM_TIFF'
PAULI = 'PAULI'
METADATA = 'METADATA'
# RADARSAT
L0 = 'L0'
L1 = 'L1'
# ERS
# L0 provided by RADARSAT
# L1 provided by RADARSAT
# JERS
# L0 provided by RADARSAT
# L1 provided by RADARSAT
# AIRSAR
CTIF = 'CTIF'
PTIF = 'PTIF'
LTIF = 'LTIF'
JPG = 'JPG'
LSTOKES = 'LSTOKES'
PSTOKES = 'PSTOKES'
CSTOKES = 'CSTOKES'
DEM = 'DEM'
THREEFP = '3FP'
# SEASAT
GEOTIFF = 'GEOTIFF'
# L1 provided by RADARSAT
# OPERA-S1
RTC = 'RTC'
CSLC = 'CSLC'
RTC_STATIC = 'RTC-STATIC'
CSLC_STATIC = 'CSLC-STATIC'
# File: asf_search/constants/RANGE_BANDWIDTH.py
# NISAR sensor bandwidths
## L-SAR
BW_20_5 = "20+5"
BW_40_5 = "40+5"
BW_77 = "77"
BW_5 = "5"
BW_5_5 = "5+5"
## S-SAR
BW_10 = "10"
BW_25 = "25"
BW_37 = "37"
BW_75 = "75"
# File: asf_search/constants/__init__.py
"""Various constants to be used in search and related functions,
provided as a convenience to help ensure sensible values."""
from .BEAMMODE import * # noqa: F403 F401
from .FLIGHT_DIRECTION import * # noqa: F403 F401
from .INSTRUMENT import * # noqa: F403 F401
from .PLATFORM import * # noqa: F403 F401
from .POLARIZATION import * # noqa: F403 F401
from .PRODUCT_TYPE import * # noqa: F403 F401
from .INTERNAL import * # noqa: F403 F401
from .DATASET import * # noqa: F403 F401
from .RANGE_BANDWIDTH import * # noqa: F403 F401
# File: asf_search/download/__init__.py
from .download import download_urls, download_url, remotezip  # noqa: F401
from .file_download_type import FileDownloadType # noqa: F401
# File: asf_search/download/download.py
from typing import Iterable
from multiprocessing import Pool
import os.path
from urllib import parse
from requests import Response
from requests.exceptions import HTTPError
import warnings
from asf_search.exceptions import ASFAuthenticationError, ASFDownloadError
from asf_search import ASFSession
from tenacity import retry, stop_after_delay, retry_if_result, wait_fixed
try:
from remotezip import RemoteZip
except ImportError:
RemoteZip = None
def _download_url(arg):
url, path, session = arg
download_url(url=url, path=path, session=session)
def download_urls(urls: Iterable[str], path: str, session: ASFSession = None, processes: int = 1):
"""
Downloads all products from the specified URLs to the specified location.
:param urls: List of URLs from which to download
:param path: Local path in which to save the product
:param session: The session to use, in most cases should be authenticated beforehand
:param processes: Number of download processes to use. Defaults to 1 (i.e. sequential download)
:return:
"""
if session is None:
session = ASFSession()
if processes <= 1:
for url in urls:
download_url(url=url, path=path, session=session)
else:
pool = Pool(processes=processes)
args = [(url, path, session) for url in urls]
pool.map(_download_url, args)
pool.close()
pool.join()
def download_url(url: str, path: str, filename: str = None, session: ASFSession = None) -> None:
"""
Downloads a product from the specified URL to the specified location and (optional) filename.
:param url: URL from which to download
:param path: Local path in which to save the product
:param filename: Optional filename to be used, extracted from the URL by default
:param session: The session to use, in most cases should be authenticated beforehand
:return:
"""
if filename is None:
filename = os.path.split(parse.urlparse(url).path)[1]
if not os.path.isdir(path):
raise ASFDownloadError(f'Error downloading {url}: directory not found: {path}')
if os.path.isfile(os.path.join(path, filename)):
warnings.warn(f'File already exists, skipping download: {os.path.join(path, filename)}')
return
if session is None:
session = ASFSession()
response = _try_get_response(session=session, url=url)
with open(os.path.join(path, filename), 'wb') as f:
for chunk in response.iter_content(chunk_size=8192):
f.write(chunk)
def remotezip(url: str, session: ASFSession) -> 'RemoteZip': # type: ignore # noqa: F821
"""
:param url: the url to the zip product
:param session: the authenticated ASFSession to read and download from the zip file
"""
if RemoteZip is None:
        raise ImportError(
            'Could not find remotezip package in current python environment. '
            '"remotezip" is an optional dependency of asf-search required '
            'for the `remotezip()` method. '
            'Enable by including the appropriate pip or conda install. '
            'Ex: `python3 -m pip install asf-search[extras]`'
        )
session.hooks['response'].append(strip_auth_if_aws)
return RemoteZip(url, session=session)
def strip_auth_if_aws(r, *args, **kwargs):
if (
300 <= r.status_code <= 399
and 'amazonaws.com' in parse.urlparse(r.headers['location']).netloc
):
location = r.headers['location']
r.headers.clear()
r.headers['location'] = location
# if it's an unprocessed burst product it'll return a 202 and we'll have to query again
# https://sentinel1-burst-docs.asf.alaska.edu/
def _is_burst_processing(response: Response):
return response.status_code == 202
@retry(
reraise=True,
retry=retry_if_result(_is_burst_processing),
wait=wait_fixed(1),
stop=stop_after_delay(90),
)
def _try_get_response(session: ASFSession, url: str):
response = session.get(url, stream=True, hooks={'response': strip_auth_if_aws})
try:
response.raise_for_status()
except HTTPError as e:
if 400 <= response.status_code <= 499:
raise ASFAuthenticationError(f'HTTP {e.response.status_code}: {e.response.text}')
raise e
return response
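For readers without tenacity installed, the 202-retry behavior of `_try_get_response` can be approximated with a plain loop. This is an illustrative stand-in (`retry_while` is not part of asf_search or tenacity), not the library's implementation:

```python
import time

def retry_while(get_response, should_retry, wait_s=1.0, max_wait_s=90.0, sleep=time.sleep):
    """Re-issue a request while should_retry(response) holds, mirroring the
    tenacity policy above: fixed 1-second wait, give up after ~90 seconds."""
    waited = 0.0
    response = get_response()
    while should_retry(response) and waited < max_wait_s:
        sleep(wait_s)
        waited += wait_s
        response = get_response()
    return response

# Simulated burst product: two 202 "still processing" replies, then a 200.
replies = iter([202, 202, 200])
status = retry_while(lambda: next(replies), lambda s: s == 202, sleep=lambda s: None)
print(status)  # 200
```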
# File: asf_search/download/file_download_type.py
from enum import Enum
class FileDownloadType(Enum):
DEFAULT_FILE = 1
ADDITIONAL_FILES = 2
ALL_FILES = 3
# File: asf_search/exceptions.py
class ASFError(Exception):
"""Base ASF Exception, not intended for direct use"""
class ASFSearchError(ASFError):
"""Base search-related Exception"""
class ASFSearch4xxError(ASFSearchError):
"""Raise when CMR returns a 4xx error"""
class ASFSearch5xxError(ASFSearchError):
"""Raise when CMR returns a 5xx error"""
class ASFBaselineError(ASFSearchError):
"""Raise when baseline related errors occur"""
class ASFDownloadError(ASFError):
"""Base download-related Exception"""
class ASFAuthenticationError(ASFError):
    """Base authentication-related Exception"""
class ASFWKTError(ASFError):
"""Raise when wkt related errors occur"""
class CMRError(Exception):
"""Base CMR Exception"""
class CMRConceptIDError(CMRError):
"""Raise when CMR encounters a concept-id error"""
class CMRIncompleteError(CMRError):
"""Raise when CMR returns an incomplete page of results"""
# File: asf_search/export/__init__.py
from .export_translators import ASFSearchResults_to_properties_list  # noqa: F401
from .csv import results_to_csv # noqa: F401
from .metalink import results_to_metalink # noqa: F401
from .kml import results_to_kml # noqa: F401
from .jsonlite import results_to_jsonlite # noqa: F401
from .jsonlite2 import results_to_jsonlite2 # noqa: F401
from .geojson import results_to_geojson # noqa: F401
# File: asf_search/export/csv.py
import csv
from types import GeneratorType
from asf_search import ASF_LOGGER
from asf_search.export.export_translators import ASFSearchResults_to_properties_list
import inspect
extra_csv_fields = [
("sceneDate", ["AdditionalAttributes", ("Name", "ACQUISITION_DATE"), "Values", 0]),
("nearStartLat", ["AdditionalAttributes", ("Name", "NEAR_START_LAT"), "Values", 0]),
("nearStartLon", ["AdditionalAttributes", ("Name", "NEAR_START_LON"), "Values", 0]),
("farStartLat", ["AdditionalAttributes", ("Name", "FAR_START_LAT"), "Values", 0]),
("farStartLon", ["AdditionalAttributes", ("Name", "FAR_START_LON"), "Values", 0]),
("nearEndLat", ["AdditionalAttributes", ("Name", "NEAR_END_LAT"), "Values", 0]),
("nearEndLon", ["AdditionalAttributes", ("Name", "NEAR_END_LON"), "Values", 0]),
("farEndLat", ["AdditionalAttributes", ("Name", "FAR_END_LAT"), "Values", 0]),
("farEndLon", ["AdditionalAttributes", ("Name", "FAR_END_LON"), "Values", 0]),
(
"faradayRotation",
["AdditionalAttributes", ("Name", "FARADAY_ROTATION"), "Values", 0],
),
(
"configurationName",
["AdditionalAttributes", ("Name", "BEAM_MODE_DESC"), "Values", 0],
),
("doppler", ["AdditionalAttributes", ("Name", "DOPPLER"), "Values", 0]),
("sizeMB", ["DataGranule", "ArchiveAndDistributionInformation", 0, "Size"]),
(
"insarStackSize",
["AdditionalAttributes", ("Name", "INSAR_STACK_SIZE"), "Values", 0],
),
(
"offNadirAngle",
["AdditionalAttributes", ("Name", "OFF_NADIR_ANGLE"), "Values", 0],
),
]
fieldnames = (
"Granule Name",
"Platform",
"Sensor",
"Beam Mode",
"Beam Mode Description",
"Orbit",
"Path Number",
"Frame Number",
"Acquisition Date",
"Processing Date",
"Processing Level",
"Start Time",
"End Time",
"Center Lat",
"Center Lon",
"Near Start Lat",
"Near Start Lon",
"Far Start Lat",
"Far Start Lon",
"Near End Lat",
"Near End Lon",
"Far End Lat",
"Far End Lon",
"Faraday Rotation",
"Ascending or Descending?",
"URL",
"Size (MB)",
"Off Nadir Angle",
"Stack Size",
"Doppler",
"GroupID",
"Pointing Angle",
"TemporalBaseline",
"PerpendicularBaseline",
"relativeBurstID",
"absoluteBurstID",
"fullBurstID",
"burstIndex",
"azimuthTime",
"azimuthAnxTime",
"samplesPerBurst",
"subswath",
)
def results_to_csv(results):
ASF_LOGGER.info("started translating results to csv format")
if inspect.isgeneratorfunction(results) or isinstance(results, GeneratorType):
return CSVStreamArray(results)
return CSVStreamArray([results])
class CSVStreamArray(list):
def __init__(self, results):
self.pages = results
self.len = 1
def __iter__(self):
return self.streamRows()
def __len__(self):
return self.len
def get_additional_output_fields(self, product):
additional_fields = {}
for key, path in extra_csv_fields:
additional_fields[key] = product.umm_get(product.umm, *path)
return additional_fields
def streamRows(self):
f = CSVBuffer()
writer = csv.DictWriter(f, quoting=csv.QUOTE_ALL, fieldnames=fieldnames)
yield writer.writeheader()
completed = False
for page_idx, page in enumerate(self.pages):
ASF_LOGGER.info(f"Streaming {len(page)} products from page {page_idx}")
completed = page.searchComplete
properties_list = ASFSearchResults_to_properties_list(
page, self.get_additional_output_fields
)
yield from [writer.writerow(self.getItem(p)) for p in properties_list]
if not completed:
ASF_LOGGER.warn("Failed to download all results from CMR")
ASF_LOGGER.info("Finished streaming csv results")
def getItem(self, p):
return {
"Granule Name": p.get("sceneName"),
"Platform": p.get("platform"),
"Sensor": p.get("sensor"),
"Beam Mode": p.get("beamModeType"),
"Beam Mode Description": p.get("configurationName"),
"Orbit": p.get("orbit"),
"Path Number": p.get("pathNumber"),
"Frame Number": p.get("frameNumber"),
"Acquisition Date": p.get("sceneDate"),
"Processing Date": p.get("processingDate"),
"Processing Level": p.get("processingLevel"),
"Start Time": p.get("startTime"),
"End Time": p.get("stopTime"),
"Center Lat": p.get("centerLat"),
"Center Lon": p.get("centerLon"),
"Near Start Lat": p.get("nearStartLat"),
"Near Start Lon": p.get("nearStartLon"),
"Far Start Lat": p.get("farStartLat"),
"Far Start Lon": p.get("farStartLon"),
"Near End Lat": p.get("nearEndLat"),
"Near End Lon": p.get("nearEndLon"),
"Far End Lat": p.get("farEndLat"),
"Far End Lon": p.get("farEndLon"),
"Faraday Rotation": p.get("faradayRotation"),
"Ascending or Descending?": p.get("flightDirection"),
"URL": p.get("url"),
"Size (MB)": p.get("sizeMB"),
"Off Nadir Angle": p.get("offNadirAngle"),
"Stack Size": p.get("insarStackSize"),
"Doppler": p.get("doppler"),
"GroupID": p.get("groupID"),
"Pointing Angle": p.get("pointingAngle"),
"TemporalBaseline": p.get("teporalBaseline"),
"PerpendicularBaseline": p.get("pependicularBaseline"),
"relativeBurstID": p["burst"]["relativeBurstID"]
if p["processingLevel"] == "BURST"
else None,
"absoluteBurstID": p["burst"]["absoluteBurstID"]
if p["processingLevel"] == "BURST"
else None,
"fullBurstID": p["burst"]["fullBurstID"]
if p["processingLevel"] == "BURST"
else None,
"burstIndex": p["burst"]["burstIndex"]
if p["processingLevel"] == "BURST"
else None,
"azimuthTime": p["burst"]["azimuthTime"]
if p["processingLevel"] == "BURST"
else None,
"azimuthAnxTime": p["burst"]["azimuthAnxTime"]
if p["processingLevel"] == "BURST"
else None,
"samplesPerBurst": p["burst"]["samplesPerBurst"]
if p["processingLevel"] == "BURST"
else None,
"subswath": p["burst"]["subswath"]
if p["processingLevel"] == "BURST"
else None,
}
class CSVBuffer:
# https://docs.djangoproject.com/en/3.2/howto/outputting-csv/#streaming-large-csv-files
# A dummy CSV buffer to be used by the csv.writer class, returns the
# formatted csv row "written" to it when writer.writerow/writeheader is called
def write(self, value):
"""Write the value by returning it, instead of storing in a buffer."""
return value
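The `CSVBuffer` pattern works because `csv.writer` returns the underlying `write()` call's return value from `writerow()`/`writeheader()`, so formatted rows can be yielded to the caller instead of accumulating in a buffer. A cut-down demonstration:

```python
import csv

class CSVBuffer:
    # Same trick as above: csv.writer hands each formatted row to write(),
    # and writerow()/writeheader() return whatever write() returns.
    def write(self, value):
        return value

writer = csv.writer(CSVBuffer(), quoting=csv.QUOTE_ALL)
chunk = writer.writerow(['Granule Name', 'Platform'])
print(repr(chunk))  # '"Granule Name","Platform"\r\n'
```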
# File: asf_search/export/export_translators.py
from types import FunctionType
from datetime import datetime
from asf_search import ASFSearchResults
# ASFProduct.properties don't have every property required of certain output formats,
# This grabs the missing properties from ASFProduct.umm required by the given format
def ASFSearchResults_to_properties_list(
results: ASFSearchResults, get_additional_fields: FunctionType
):
property_list = []
for product in results:
additional_fields = get_additional_fields(product)
properties = {**product.properties, **additional_fields}
property_list.append(properties)
# Format dates to match format used by SearchAPI output formats
for product in property_list:
# S1 date properties are formatted differently from other platforms
is_S1 = product['platform'].upper() in [
'SENTINEL-1',
'SENTINEL-1B',
'SENTINEL-1A',
]
for key, data in product.items():
if ('date' in key.lower() or 'time' in key.lower()) and data is not None:
if not is_S1:
# Remove trailing zeroes from miliseconds, add Z
if len(data.split('.')) == 2:
d = len(data.split('.')[0])
data = data[:d] + 'Z'
time = datetime.strptime(data, '%Y-%m-%dT%H:%M:%SZ')
product[key] = time.strftime('%Y-%m-%dT%H:%M:%SZ')
return property_list
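Pulled out as a standalone helper (the name `normalize_timestamp` is ours), the fractional-second trimming logic for non-Sentinel-1 timestamps behaves like this:

```python
from datetime import datetime

def normalize_timestamp(data: str) -> str:
    # Same steps as the non-S1 branch above: trim fractional seconds,
    # re-append 'Z', then round-trip through strptime/strftime.
    if len(data.split('.')) == 2:
        data = data.split('.')[0] + 'Z'
    time = datetime.strptime(data, '%Y-%m-%dT%H:%M:%SZ')
    return time.strftime('%Y-%m-%dT%H:%M:%SZ')

print(normalize_timestamp('2006-06-27T06:31:14.000000Z'))  # 2006-06-27T06:31:14Z
```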
# File: asf_search/export/geojson.py
import inspect
import json
from types import GeneratorType
from asf_search import ASF_LOGGER
def results_to_geojson(results):
ASF_LOGGER.info('started translating results to geojson format')
if not inspect.isgeneratorfunction(results) and not isinstance(results, GeneratorType):
results = [results]
streamer = GeoJSONStreamArray(results)
for p in json.JSONEncoder(indent=2, sort_keys=True).iterencode(
{'type': 'FeatureCollection', 'features': streamer}
):
yield p
class GeoJSONStreamArray(list):
def __init__(self, results):
self.results = results
# need to make sure we actually have results so we can intelligently set __len__, otherwise
# iterencode behaves strangely and will output invalid json
self.len = 1
def __iter__(self):
return self.streamDicts()
def __len__(self):
return self.len
def streamDicts(self):
completed = False
for page_idx, page in enumerate(self.results):
ASF_LOGGER.info(f'Streaming {len(page)} products from page {page_idx}')
completed = page.searchComplete
yield from [self.getItem(p) for p in page if p is not None]
if not completed:
            ASF_LOGGER.warning('Failed to download all results from CMR')
ASF_LOGGER.info('Finished streaming geojson results')
def getItem(self, p):
return p.geojson()
# Discovery-asf_search-8.1.2/asf_search/export/jsonlite.py
import inspect
import json
from types import GeneratorType
from typing import Tuple
from shapely.geometry import shape
from shapely.ops import transform
from asf_search import ASF_LOGGER
from asf_search.export.export_translators import ASFSearchResults_to_properties_list
extra_jsonlite_fields = [
(
"processingTypeDisplay",
["AdditionalAttributes", ("Name", "PROCESSING_TYPE_DISPLAY"), "Values", 0],
),
("thumb", ["AdditionalAttributes", ("Name", "THUMBNAIL_URL"), "Values", 0]),
(
"faradayRotation",
["AdditionalAttributes", ("Name", "FARADAY_ROTATION"), "Values", 0],
),
("sizeMB", ["DataGranule", "ArchiveAndDistributionInformation", 0, "Size"]),
("flightLine", ["AdditionalAttributes", ("Name", "FLIGHT_LINE"), "Values", 0]),
("missionName", ["AdditionalAttributes", ("Name", "MISSION_NAME"), "Values", 0]),
]
def results_to_jsonlite(results):
ASF_LOGGER.info('started translating results to jsonlite format')
if len(results) == 0:
yield from json.JSONEncoder(indent=2, sort_keys=True).iterencode({'results': []})
return
if not inspect.isgeneratorfunction(results) and not isinstance(results, GeneratorType):
results = [results]
streamer = JSONLiteStreamArray(results)
jsondata = {"results": streamer}
for p in json.JSONEncoder(indent=2, sort_keys=True).iterencode(jsondata):
yield p
def unwrap_shape(x, y, z=None):
x = x if x > 0 else x + 360
return tuple([x, y])
def get_wkts(geometry) -> Tuple[str, str]:
wrapped = shape(geometry)
min_lon, max_lon = (wrapped.bounds[0], wrapped.bounds[2])
if max_lon - min_lon > 180:
unwrapped = transform(unwrap_shape, wrapped)
else:
unwrapped = wrapped
return wrapped.wkt, unwrapped.wkt
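The unwrapping above shifts negative longitudes by +360 when a geometry's longitude span exceeds 180 degrees, so antimeridian-crossing shapes stay contiguous. A dependency-free sketch of that pointwise transform (the names `unwrap_lon` and `coords` are hypothetical; shapely's `transform` applies the same function to every vertex):

```python
# Pure-Python stand-in for the shapely transform used in get_wkts:
# shift negative longitudes east by 360 degrees.
def unwrap_lon(x: float) -> float:
    return x if x > 0 else x + 360

# A quad straddling the antimeridian (bounds span > 180 when wrapped)
coords = [(179.0, 50.0), (-179.0, 50.0), (-179.0, 51.0), (179.0, 51.0)]
unwrapped = [(unwrap_lon(x), y) for x, y in coords]
print(unwrapped)  # [(179.0, 50.0), (181.0, 50.0), (181.0, 51.0), (179.0, 51.0)]
```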
class JSONLiteStreamArray(list):
def __init__(self, results):
self.results = results
# need to make sure we actually have results so we can intelligently set __len__, otherwise
# iterencode behaves strangely and will output invalid json
self.len = 1
def __iter__(self):
return self.streamDicts()
def __len__(self):
return self.len
def get_additional_output_fields(self, product):
# umm = product.umm
additional_fields = {}
for key, path in extra_jsonlite_fields:
additional_fields[key] = product.umm_get(product.umm, *path)
if product.properties["platform"].upper() in [
"ALOS",
"RADARSAT-1",
"JERS-1",
"ERS-1",
"ERS-2",
]:
insarGrouping = product.umm_get(
product.umm,
*["AdditionalAttributes", ("Name", "INSAR_STACK_ID"), "Values", 0],
)
if insarGrouping not in [None, 0, "0", "NA", "NULL"]:
additional_fields["canInsar"] = True
additional_fields["insarStackSize"] = product.umm_get(
product.umm,
*[
"AdditionalAttributes",
("Name", "INSAR_STACK_SIZE"),
"Values",
0,
],
)
else:
additional_fields["canInsar"] = False
else:
additional_fields["canInsar"] = product.baseline is not None
additional_fields["geometry"] = product.geometry
return additional_fields
def streamDicts(self):
completed = False
for page_idx, page in enumerate(self.results):
ASF_LOGGER.info(f"Streaming {len(page)} products from page {page_idx}")
completed = page.searchComplete
yield from [
self.getItem(p)
for p in ASFSearchResults_to_properties_list(
page, self.get_additional_output_fields
)
if p is not None
]
if not completed:
            ASF_LOGGER.warning("Failed to download all results from CMR")
ASF_LOGGER.info(f"Finished streaming {self.getOutputType()} results")
def getItem(self, p):
for i in p.keys():
if p[i] == "NA" or p[i] == "":
p[i] = None
try:
if p.get("offNadirAngle") is not None and float(p["offNadirAngle"]) < 0:
p["offNadirAngle"] = None
except TypeError:
pass
try:
            if p.get("pathNumber"):
                if float(p["pathNumber"]) < 0:
                    p["pathNumber"] = None
except TypeError:
pass
try:
if p.get("groupID") is None:
p["groupID"] = p["sceneName"]
except TypeError:
pass
try:
p["sizeMB"] = float(p["sizeMB"])
except TypeError:
pass
try:
p["pathNumber"] = int(p["pathNumber"])
except TypeError:
pass
try:
p['frameNumber'] = int(p.get('frameNumber'))
except TypeError:
pass
try:
p["orbit"] = int(p["orbit"])
except TypeError:
pass
wrapped, unwrapped = get_wkts(p["geometry"])
result = {
"beamMode": p["beamModeType"],
"browse": [] if p.get("browse") is None else p.get("browse"),
"canInSAR": p.get("canInsar"),
"dataset": p.get("platform"),
"downloadUrl": p.get("url"),
"faradayRotation": p.get("faradayRotation"), # ALOS
"fileName": p.get("fileName"),
"flightDirection": p.get("flightDirection"),
"flightLine": p.get("flightLine"),
"frame": p.get("frameNumber"),
"granuleName": p.get("sceneName"),
"groupID": p.get("groupID"),
"instrument": p.get("sensor"),
"missionName": p.get("missionName"),
"offNadirAngle": str(p["offNadirAngle"])
if p.get("offNadirAngle") is not None
else None, # ALOS
"orbit": [str(p["orbit"])],
"path": p.get("pathNumber"),
"polarization": p.get("polarization"),
"pointingAngle": p.get("pointingAngle"),
"productID": p.get("fileID"),
"productType": p.get("processingLevel"),
"productTypeDisplay": p.get("processingTypeDisplay"),
"sizeMB": p.get("sizeMB"),
"stackSize": p.get(
"insarStackSize"
), # Used for datasets with precalculated stacks
"startTime": p.get("startTime"),
"stopTime": p.get("stopTime"),
"thumb": p.get("thumb"),
"wkt": wrapped,
"wkt_unwrapped": unwrapped,
"pgeVersion": p.get("pgeVersion"),
}
for key in result.keys():
if result[key] in ["NA", "NULL"]:
result[key] = None
if "temporalBaseline" in p.keys() or "perpendicularBaseline" in p.keys():
result["temporalBaseline"] = p["temporalBaseline"]
result["perpendicularBaseline"] = p["perpendicularBaseline"]
if p.get("processingLevel") == "BURST": # is a burst product
result["burst"] = p["burst"]
if p.get('operaBurstID') is not None or result['productID'].startswith('OPERA'):
result['opera'] = {
'operaBurstID': p.get('operaBurstID'),
'additionalUrls': p.get('additionalUrls'),
}
if p.get('validityStartDate'):
result['opera']['validityStartDate'] = p.get('validityStartDate')
if p.get('platform') == 'NISAR':
result['nisar'] = {
'pgeVersion': p.get('pgeVersion'),
'mainBandPolarization': p.get('mainBandPolarization'),
'sideBandPolarization': p.get('sideBandPolarization'),
'frameCoverage': p.get('frameCoverage'),
'jointObservation': p.get('jointObservation'),
'rangeBandwidth': p.get('rangeBandwidth'),
}
return result
def getOutputType(self) -> str:
return "jsonlite"
# Discovery-asf_search-8.1.2/asf_search/export/jsonlite2.py
import inspect
import json
from types import GeneratorType
from asf_search import ASF_LOGGER
from .jsonlite import JSONLiteStreamArray
def results_to_jsonlite2(results):
ASF_LOGGER.info('started translating results to jsonlite2 format')
if len(results) == 0:
yield from json.JSONEncoder(indent=2, sort_keys=True).iterencode({'results': []})
return
if not inspect.isgeneratorfunction(results) and not isinstance(results, GeneratorType):
results = [results]
streamer = JSONLite2StreamArray(results)
for p in json.JSONEncoder(sort_keys=True, separators=(",", ":")).iterencode(
{"results": streamer}
):
yield p
class JSONLite2StreamArray(JSONLiteStreamArray):
def getItem(self, p):
# pre-processing of the result is the same as in the base jsonlite streamer,
# so use that and then rename/substitute fields
p = super().getItem(p)
result = {
"b": [a.replace(p["granuleName"], "{gn}") for a in p["browse"]]
if p["browse"] is not None
else p["browse"],
"bm": p["beamMode"],
"d": p["dataset"],
"du": p["downloadUrl"].replace(p["granuleName"], "{gn}"),
"f": p["frame"],
"fd": p["flightDirection"],
"fl": p["flightLine"],
"fn": p["fileName"].replace(p["granuleName"], "{gn}"),
"fr": p["faradayRotation"], # ALOS
"gid": p["groupID"].replace(p["granuleName"], "{gn}"),
"gn": p["granuleName"],
"i": p["instrument"],
"in": p["canInSAR"],
"mn": p["missionName"],
"o": p["orbit"],
"on": p["offNadirAngle"], # ALOS
"p": p["path"],
"pid": p["productID"].replace(p["granuleName"], "{gn}"),
"pa": p["pointingAngle"],
"po": p["polarization"],
"pt": p["productType"],
"ptd": p["productTypeDisplay"],
"s": p["sizeMB"],
"ss": p["stackSize"], # Used for datasets with precalculated stacks
"st": p["startTime"],
"stp": p["stopTime"],
"t": p["thumb"].replace(p["granuleName"], "{gn}")
if p["thumb"] is not None
else p["thumb"],
"w": p["wkt"],
"wu": p["wkt_unwrapped"],
"pge": p["pgeVersion"],
}
if 'temporalBaseline' in p.keys():
result['tb'] = p['temporalBaseline']
if 'perpendicularBaseline' in p.keys():
result['pb'] = p['perpendicularBaseline']
if p.get('burst') is not None: # is a burst product
result['s1b'] = p['burst']
if p.get('opera') is not None:
result['s1o'] = p['opera']
if p.get('nisar') is not None:
result['nisar'] = p['nisar']
return result
def getOutputType(self) -> str:
return "jsonlite2"
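The jsonlite2 renaming above also compresses payload size by substituting a `{gn}` token wherever a field embeds the granule name, which a client can re-expand from the `gn` field. A small sketch of that round trip (the `compress` helper and example URL are hypothetical):

```python
# Token-substitution trick used by jsonlite2: granule-name occurrences
# inside URLs/IDs are replaced with '{gn}' and restored client-side.
def compress(value, granule_name):
    return value.replace(granule_name, '{gn}') if value is not None else None

gn = 'S1A_IW_SLC__1SDV_20230101'
url = f'https://example.com/{gn}.zip'  # hypothetical download URL
short = compress(url, gn)
print(short)                      # https://example.com/{gn}.zip
print(short.replace('{gn}', gn))  # round-trips to the original URL
```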
# Discovery-asf_search-8.1.2/asf_search/export/kml.py
import inspect
from types import GeneratorType
from typing import Dict
from asf_search import ASF_LOGGER
from asf_search.export.metalink import MetalinkStreamArray
import xml.etree.ElementTree as ETree
extra_kml_fields = [
(
"configurationName",
["AdditionalAttributes", ("Name", "BEAM_MODE_DESC"), "Values", 0],
),
(
"faradayRotation",
["AdditionalAttributes", ("Name", "FARADAY_ROTATION"), "Values", 0],
),
(
"processingTypeDisplay",
["AdditionalAttributes", ("Name", "PROCESSING_TYPE_DISPLAY"), "Values", 0],
),
("sceneDate", ["AdditionalAttributes", ("Name", "ACQUISITION_DATE"), "Values", 0]),
(
"shape",
[
"SpatialExtent",
"HorizontalSpatialDomain",
"Geometry",
"GPolygons",
0,
"Boundary",
"Points",
],
),
("thumbnailUrl", ["AdditionalAttributes", ("Name", "THUMBNAIL_URL"), "Values", 0]),
(
"faradayRotation",
["AdditionalAttributes", ("Name", "FARADAY_ROTATION"), "Values", 0],
),
(
"offNadirAngle",
["AdditionalAttributes", ("Name", "OFF_NADIR_ANGLE"), "Values", 0],
),
]
def results_to_kml(results):
ASF_LOGGER.info("Started translating results to kml format")
if inspect.isgeneratorfunction(results) or isinstance(results, GeneratorType):
return KMLStreamArray(results)
return KMLStreamArray([results])
class KMLStreamArray(MetalinkStreamArray):
def __init__(self, results):
MetalinkStreamArray.__init__(self, results)
        self.header = """<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
<Document>
    <name>ASF Datapool Search Results</name>
    <description>Search Performed: </description>\n    """
        self.footer = """</Document>\n</kml>"""
def getOutputType(self) -> str:
return "kml"
def get_additional_fields(self, product):
umm = product.umm
additional_fields = {}
for key, path in extra_kml_fields:
additional_fields[key] = product.umm_get(umm, *path)
return additional_fields
def getItem(self, p):
placemark = ETree.Element("Placemark")
name = ETree.Element("name")
name.text = p["sceneName"]
placemark.append(name)
description = ETree.Element("description")
description.text = """<![CDATA["""
placemark.append(description)
h1 = ETree.Element("h1")
h1.text = (
f"{p['platform']} ({p['configurationName']}), acquired {p['sceneDate']}"
)
h2 = ETree.Element("h2")
h2.text = p.get("url", "")
description.append(h1)
description.append(h2)
div = ETree.Element(
"div", attrib={"style": "position:absolute;left:20px;top:200px"}
)
description.append(div)
h3 = ETree.Element("h3")
h3.text = "Metadata"
div.append(h3)
ul = ETree.Element("ul")
div.append(ul)
for text, value in self.metadata_fields(p).items():
li = ETree.Element("li")
li.text = text + str(value)
ul.append(li)
d = ETree.Element(
"div", attrib={"style": "position:absolute;left:300px;top:250px"}
)
description.append(d)
a = ETree.Element("a")
if p.get("browse") is not None:
a.set("href", p.get("browse")[0])
else:
a.set("href", "")
d.append(a)
img = ETree.Element("img")
if p.get("thumbnailUrl") is not None:
img.set("src", p.get("thumbnailUrl"))
else:
img.set("src", "None")
a.append(img)
styleUrl = ETree.Element("styleUrl")
styleUrl.text = "#yellowLineGreenPoly"
placemark.append(styleUrl)
polygon = ETree.Element("Polygon")
placemark.append(polygon)
extrude = ETree.Element("extrude")
extrude.text = "1"
polygon.append(extrude)
altitudeMode = ETree.Element("altitudeMode")
altitudeMode.text = "relativeToGround"
polygon.append(altitudeMode)
outerBondaryIs = ETree.Element("outerBoundaryIs")
polygon.append(outerBondaryIs)
linearRing = ETree.Element("LinearRing")
outerBondaryIs.append(linearRing)
coordinates = ETree.Element("coordinates")
if p.get("shape") is not None:
coordinates.text = (
"\n"
+ (14 * " ")
+ ("\n" + (14 * " ")).join(
[f"{c['Longitude']},{c['Latitude']},2000" for c in p.get("shape")]
)
+ "\n"
+ (14 * " ")
)
linearRing.append(coordinates)
self.indent(placemark, 3)
        # for the CDATA section, manually replace the XML-escaped "&amp;" entity with a literal "&"
        return ETree.tostring(placemark, encoding="unicode").replace("&amp;", "&")
# Helper method for getting additional fields in tag
def metadata_fields(self, item: Dict):
required = {
'Processing type: ': item.get('processingTypeDisplay'),
'Frame: ': item.get('frameNumber'),
'Path: ': item.get('pathNumber'),
'Orbit: ': item.get('orbit'),
'Start time: ': item.get('startTime'),
'End time: ': item.get('stopTime'),
}
optional = {}
for text, key in [
("Faraday Rotation: ", "faradayRotation"),
("Ascending/Descending: ", "flightDirection"),
("Off Nadir Angle: ", "offNadirAngle"),
("Pointing Angle: ", "pointingAngle"),
("Temporal Baseline: ", "temporalBaseline"),
("Perpendicular Baseline: ", "perpendicularBaseline"),
]:
if item.get(key) is not None:
if isinstance(item[key], float) and key == "offNadirAngle":
optional[text] = f"{item[key]:g}" # trim trailing zeros
else:
optional[text] = item[key]
elif key not in ["temporalBaseline", "perpendicularBaseline"]:
optional[text] = "None"
output = {**required, **optional}
if item["processingLevel"] == "BURST":
burst = {
"Absolute Burst ID: ": item["burst"]["absoluteBurstID"],
"Relative Burst ID: ": item["burst"]["relativeBurstID"],
"Full Burst ID: ": item["burst"]["fullBurstID"],
"Burst Index: ": item["burst"]["burstIndex"],
"Azimuth Time: ": item["burst"]["azimuthTime"],
"Azimuth Anx Time: ": item["burst"]["azimuthAnxTime"],
"Samples per Burst: ": item["burst"]["samplesPerBurst"],
"Subswath: ": item["burst"]["subswath"],
}
output = {**output, **burst}
return output
# Discovery-asf_search-8.1.2/asf_search/export/metalink.py
import inspect
from types import GeneratorType
import xml.etree.ElementTree as ETree
from asf_search import ASF_LOGGER
from asf_search.export.export_translators import ASFSearchResults_to_properties_list
def results_to_metalink(results):
ASF_LOGGER.info('Started translating results to metalink format')
if inspect.isgeneratorfunction(results) or isinstance(results, GeneratorType):
return MetalinkStreamArray(results)
return MetalinkStreamArray([results])
class MetalinkStreamArray(list):
def __init__(self, results):
self.pages = results
self.len = 1
        self.header = (
            '<?xml version="1.0"?>'
            '<metalink xmlns="http://www.metalinker.org/" version="3.0">'
            '<publisher><name>Alaska Satellite Facility</name><url>http://www.asf.alaska.edu/</url></publisher>'  # noqa F401
            '<files>'
        )
        self.footer = '\n</files>\n</metalink>'
def get_additional_fields(self, product):
return {}
def __iter__(self):
return self.streamPages()
def __len__(self):
return self.len
def streamPages(self):
yield self.header
completed = False
for page_idx, page in enumerate(self.pages):
ASF_LOGGER.info(f'Streaming {len(page)} products from page {page_idx}')
completed = page.searchComplete
properties_list = ASFSearchResults_to_properties_list(page, self.get_additional_fields)
yield from [self.getItem(p) for p in properties_list]
if not completed:
            ASF_LOGGER.warning('Failed to download all results from CMR')
yield self.footer
ASF_LOGGER.info(f'Finished streaming {self.getOutputType()} results')
def getOutputType(self) -> str:
return 'metalink'
def getItem(self, p):
file = ETree.Element('file', attrib={'name': p['fileName']})
resources = ETree.Element('resources')
url = ETree.Element('url', attrib={'type': 'http'})
url.text = p['url']
resources.append(url)
file.append(resources)
if p.get('md5sum') and p.get('md5sum') != 'NA':
verification = ETree.Element('verification')
h = ETree.Element('hash', {'type': 'md5'})
h.text = p['md5sum']
verification.append(h)
file.append(verification)
if p['bytes'] and p['bytes'] != 'NA':
size = ETree.Element('size')
size.text = str(p['bytes'])
file.append(size)
return '\n' + (8 * ' ') + ETree.tostring(file, encoding='unicode')
def indent(self, elem, level=0):
# Only Python 3.9+ has a built-in indent function for element tree.
# https://stackoverflow.com/a/33956544
        i = '\n' + level * '  '
if len(elem):
if not elem.text or not elem.text.strip():
elem.text = i + ' '
if not elem.tail or not elem.tail.strip():
elem.tail = i
for elem in elem:
self.indent(elem, level + 1)
if not elem.tail or not elem.tail.strip():
elem.tail = i
else:
if level and (not elem.tail or not elem.tail.strip()):
elem.tail = i
# Discovery-asf_search-8.1.2/asf_search/health/__init__.py
from .health import health  # noqa: F401
# Discovery-asf_search-8.1.2/asf_search/health/health.py
from typing import Dict
import requests
import json
import asf_search.constants
def health(host: str = None) -> Dict:
"""
Checks basic connectivity to and health of the ASF SearchAPI.
Parameters
----------
    host:
SearchAPI host, defaults to Production SearchAPI.
This option is intended for dev/test purposes.
Returns
-------
Current configuration and status of subsystems as a dict
"""
if host is None:
host = asf_search.INTERNAL.CMR_HOST
return json.loads(requests.get(f'https://{host}{asf_search.INTERNAL.CMR_HEALTH_PATH}').text)
# Discovery-asf_search-8.1.2/asf_search/search/__init__.py
from .search import search  # noqa: F401
from .granule_search import granule_search # noqa: F401
from .product_search import product_search # noqa: F401
from .geo_search import geo_search # noqa: F401
from .baseline_search import stack_from_id # noqa: F401
from .campaigns import campaigns # noqa: F401
from .search_count import search_count # noqa: F401
from .search_generator import search_generator, preprocess_opts # noqa: F401
# Discovery-asf_search-8.1.2/asf_search/search/baseline_search.py
from typing import Type
from asf_search.baseline.stack import get_baseline_from_stack
from asf_search import ASF_LOGGER
from copy import copy
from asf_search.search import search, product_search
from asf_search.ASFSearchOptions import ASFSearchOptions
from asf_search.ASFSearchResults import ASFSearchResults
from asf_search import ASFProduct
from asf_search.constants import PLATFORM
from asf_search.exceptions import ASFSearchError
precalc_platforms = [
PLATFORM.ALOS,
PLATFORM.RADARSAT,
PLATFORM.ERS1,
PLATFORM.ERS2,
PLATFORM.JERS,
]
def stack_from_product(
reference: ASFProduct,
opts: ASFSearchOptions = None,
ASFProductSubclass: Type[ASFProduct] = None,
) -> ASFSearchResults:
"""
Finds a baseline stack from a reference ASFProduct
Parameters
----------
reference:
Reference scene to base the stack on,
and from which to calculate perpendicular/temporal baselines
opts:
An ASFSearchOptions object describing the search parameters to be used.
Search parameters specified outside this object will override in event of a conflict.
ASFProductSubclass:
An ASFProduct subclass constructor.
Returns
-------
`asf_search.ASFSearchResults`
list of search results of subclass ASFProduct or of provided ASFProductSubclass
"""
opts = ASFSearchOptions() if opts is None else copy(opts)
opts.merge_args(**dict(reference.get_stack_opts()))
stack = search(opts=opts)
is_complete = stack.searchComplete
if ASFProductSubclass is not None:
_cast_results_to_subclass(stack, ASFProductSubclass)
stack, warnings = get_baseline_from_stack(reference=reference, stack=stack)
stack.searchComplete = is_complete # preserve final outcome of earlier search()
stack.sort(key=lambda product: product.properties['temporalBaseline'])
for warning in warnings:
ASF_LOGGER.warning(f'{warning}')
return stack
def stack_from_id(
reference_id: str,
opts: ASFSearchOptions = None,
useSubclass: Type[ASFProduct] = None,
) -> ASFSearchResults:
"""
Finds a baseline stack from a reference product ID
Parameters
----------
reference_id:
Reference product to base the stack from,
and from which to calculate perpendicular/temporal baselines
opts:
An ASFSearchOptions object describing the search parameters to be used.
Search parameters specified outside this object will override in event of a conflict.
    useSubclass:
        An ASFProduct subclass constructor.
Returns
-------
`asf_search.ASFSearchResults`
list of search results of subclass ASFProduct or of provided ASFProductSubclass
"""
opts = ASFSearchOptions() if opts is None else copy(opts)
reference_results = product_search(product_list=reference_id, opts=opts)
reference_results.raise_if_incomplete()
if len(reference_results) <= 0:
raise ASFSearchError(f'Reference product not found: {reference_id}')
reference = reference_results[0]
if useSubclass is not None:
reference = _cast_to_subclass(reference, useSubclass)
return reference.stack(opts=opts, useSubclass=useSubclass)
def _cast_results_to_subclass(stack: ASFSearchResults, ASFProductSubclass: Type[ASFProduct]):
"""
Converts results from default ASFProduct subclasses to custom ones
"""
for idx, product in enumerate(stack):
stack[idx] = _cast_to_subclass(product, ASFProductSubclass)
def _cast_to_subclass(product: ASFProduct, subclass: Type[ASFProduct]) -> ASFProduct:
"""
Casts this ASFProduct object as a new object of return type subclass.
example:
```
class MyCustomClass(ASFProduct):
_base_properties = {
**ASFProduct._base_properties,
'some_unique_property': {'path': ['AdditionalAttributes', 'UNIQUE_PROPERTY', ...]}
}
# subclass as constructor
customReference = reference.cast_to_subclass(MyCustomClass)
print(customReference.properties['some_unique_property'])
```
:param subclass: The ASFProduct subclass constructor to call on the product
:returns return product as `ASFProduct` subclass
"""
try:
if isinstance(subclass, type(ASFProduct)):
return subclass(
args={'umm': product.umm, 'meta': product.meta}, session=product.session
)
except Exception as e:
raise ValueError(f'Unable to use provided subclass {type(subclass)}, \nError Message: {e}')
raise ValueError(f'Expected ASFProduct subclass constructor, got {type(subclass)}')
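The cast above works by re-invoking the subclass constructor with the original product's raw `umm`/`meta` payloads. A minimal stand-in with dummy classes (not the real ASFProduct API; `Product`, `CustomProduct`, and the `scene` property are invented for illustration):

```python
# Dummy sketch of the subclass-cast pattern: rebuild an object as a
# richer subclass from the same underlying metadata dicts.
class Product:
    def __init__(self, args, session=None):
        self.umm = args['umm']
        self.meta = args['meta']
        self.session = session

class CustomProduct(Product):
    @property
    def scene(self):
        return self.umm.get('GranuleUR')

base = Product(args={'umm': {'GranuleUR': 'S1A_EXAMPLE'}, 'meta': {}})
cast = CustomProduct(args={'umm': base.umm, 'meta': base.meta}, session=base.session)
print(cast.scene)  # S1A_EXAMPLE
```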
# Discovery-asf_search-8.1.2/asf_search/search/campaigns.py
from typing import Dict, List, Union
from asf_search.CMR.MissionList import get_campaigns
def campaigns(platform: str) -> List[str]:
"""
Returns a list of campaign names for the given platform,
each name being usable as a campaign for asf_search.search() and asf_search.geo_search()
:param platform: The name of the platform to gather campaign names for.
Platforms currently supported include UAVSAR, AIRSAR, and SENTINEL-1 INTERFEROGRAM (BETA)
:return: A list of campaign names for the given platform
"""
data = {'include_facets': 'true', 'provider': 'ASF'}
if platform is not None:
if platform == 'UAVSAR':
data['platform[]'] = 'G-III'
data['instrument[]'] = 'UAVSAR'
elif platform == 'AIRSAR':
data['platform[]'] = 'DC-8'
data['instrument[]'] = 'AIRSAR'
elif platform == 'SENTINEL-1 INTERFEROGRAM (BETA)':
data['platform[]'] = 'SENTINEL-1A'
else:
data['platform[]'] = platform
missions = get_campaigns(data)
mission_names = _get_project_names(missions)
return mission_names
def _get_project_names(data: Union[Dict, List]) -> List[str]:
"""
Recursively searches for campaign names
under "Projects" key in CMR umm_json response
:param data: CMR umm_json response
:return: A list of found campaign names for the given platform
"""
output = []
if isinstance(data, Dict):
for key, value in data.items():
if key == 'Projects':
return [list(item.values())[0] for item in value]
output.extend(_get_project_names(value))
elif isinstance(data, List):
for item in data:
output.extend(_get_project_names(item))
return output
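The recursive walk above can be demonstrated against a toy umm_json-style response (the field names and values below are hypothetical, not a real CMR payload; `get_project_names` re-states the private helper for a self-contained run):

```python
# Self-contained copy of the recursive 'Projects' search, run on toy data.
from typing import Dict, List, Union

def get_project_names(data: Union[Dict, List]) -> List[str]:
    output = []
    if isinstance(data, dict):
        for key, value in data.items():
            if key == 'Projects':
                return [list(item.values())[0] for item in value]
            output.extend(get_project_names(value))
    elif isinstance(data, list):
        for item in data:
            output.extend(get_project_names(item))
    return output

response = {'items': [{'umm': {'Projects': [{'ShortName': 'ABoVE'}, {'ShortName': 'GRFN'}]}}]}
print(get_project_names(response))  # ['ABoVE', 'GRFN']
```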
# Discovery-asf_search-8.1.2/asf_search/search/error_reporting.py
from asf_search import ASFSearchOptions
from asf_search import INTERNAL
import requests
import logging
def report_search_error(search_options: ASFSearchOptions, message: str):
"""Reports CMR Errors automatically to ASF"""
from asf_search import REPORT_ERRORS
if not REPORT_ERRORS:
        logging.warning(
            'Automatic search error reporting is turned off, '
            'search errors will NOT be reported to ASF.'
            '\nTo enable automatic error reporting, set asf_search.REPORT_ERRORS to True'
            '\nIf you have any questions email uso@asf.alaska.edu'
        )
return
user_agent = search_options.session.headers.get('User-Agent')
search_options_list = '\n'.join(
[f'\t{option}: {key}' for option, key in dict(search_options).items()]
)
message = f'Error Message: {str(message)}\nUser Agent: {user_agent} \
\nSearch Options: {{\n{search_options_list}\n}}'
response = requests.post(
f'https://{INTERNAL.ERROR_REPORTING_ENDPOINT}',
data={'Message': f'This error message and info was automatically generated:\n\n{message}'},
)
try:
response.raise_for_status()
except requests.exceptions.HTTPError:
logging.error(
'asf-search failed to automatically report an error,'
'if you have any questions email uso@asf.alaska.edu'
f"\nError Text: HTTP {response.status_code}: {response.json()['errors']}"
)
return
    if response.status_code == 200:
        logging.error(
            'The asf-search module encountered an error with CMR, '
            'and the following message was automatically reported to ASF:'
            f'\n\n"\n{message}\n"\n'
            'If you have any questions email uso@asf.alaska.edu'
        )
# Discovery-asf_search-8.1.2/asf_search/search/geo_search.py
from typing import Literal, Tuple, Union, Sequence
import datetime
from copy import copy
from asf_search.search import search
from asf_search.ASFSearchOptions import ASFSearchOptions
from asf_search.ASFSearchResults import ASFSearchResults
def geo_search(
absoluteOrbit: Union[
int, Tuple[int, int], range, Sequence[Union[int, Tuple[int, int], range]]
] = None,
asfFrame: Union[
int, Tuple[int, int], range, Sequence[Union[int, Tuple[int, int], range]]
] = None,
beamMode: Union[str, Sequence[str]] = None,
beamSwath: Union[str, Sequence[str]] = None,
campaign: Union[str, Sequence[str]] = None,
maxDoppler: float = None,
minDoppler: float = None,
end: Union[datetime.datetime, str] = None,
maxFaradayRotation: float = None,
minFaradayRotation: float = None,
flightDirection: str = None,
flightLine: str = None,
frame: Union[int, Tuple[int, int], range, Sequence[Union[int, Tuple[int, int], range]]] = None,
granule_list: Union[str, Sequence[str]] = None,
groupID: Union[str, Sequence[str]] = None,
insarStackId: str = None,
instrument: Union[str, Sequence[str]] = None,
intersectsWith: str = None,
lookDirection: Union[str, Sequence[str]] = None,
offNadirAngle: Union[
float, Tuple[float, float], Sequence[Union[float, Tuple[float, float]]]
] = None,
platform: Union[str, Sequence[str]] = None,
polarization: Union[str, Sequence[str]] = None,
processingDate: Union[datetime.datetime, str] = None,
processingLevel: Union[str, Sequence[str]] = None,
product_list: Union[str, Sequence[str]] = None,
relativeOrbit: Union[
int, Tuple[int, int], range, Sequence[Union[int, Tuple[int, int], range]]
] = None,
season: Tuple[int, int] = None,
start: Union[datetime.datetime, str] = None,
absoluteBurstID: Union[int, Sequence[int]] = None,
relativeBurstID: Union[int, Sequence[int]] = None,
fullBurstID: Union[str, Sequence[str]] = None,
temporalBaselineDays: Union[str, Sequence[str]] = None,
operaBurstID: Union[str, Sequence[str]] = None,
frameCoverage: Literal["FULL", "PARTIAL"] = None,
mainBandPolarization: Union[str, Sequence[str]] = None,
sideBandPolarization: Union[str, Sequence[str]] = None,
rangeBandwidth: Union[str, Sequence[str]] = None,
jointObservation: bool = None,
dataset: Union[str, Sequence[str]] = None,
collections: Union[str, Sequence[str]] = None,
shortName: Union[str, Sequence[str]] = None,
cmr_keywords: Union[Tuple[str, str], Sequence[Tuple[str, str]]] = None,
maxResults: int = None,
opts: ASFSearchOptions = None,
) -> ASFSearchResults:
"""
Performs a geographic search against the Central Metadata Repository (CMR),
returning all results in a single list.
Parameters
----------
absoluteOrbit:
For ALOS, ERS-1, ERS-2, JERS-1, and RADARSAT-1, Sentinel-1A, Sentinel-1B
this value corresponds to the orbit count within the orbit cycle.
For UAVSAR it is the Flight ID.
asfFrame:
This is primarily an ASF / JAXA frame reference. However,
some platforms use other conventions. See ‘frame’ for ESA-centric frame searches.
beamMode:
The beam mode used to acquire the data.
beamSwath:
Encompasses a look angle and beam mode.
campaign:
For UAVSAR and AIRSAR data collections only. Search by general location,
site description, or data grouping as supplied by flight agency or project.
maxDoppler:
Doppler provides an indication of how much the look direction deviates
from the ideal perpendicular flight direction acquisition.
minDoppler:
Doppler provides an indication of how much the look direction deviates
from the ideal perpendicular flight direction acquisition.
end:
End date of data acquisition. Supports timestamps
as well as natural language such as "3 weeks ago"
maxFaradayRotation:
Rotation of the polarization plane of
the radar signal impacts imagery, as HH and HV signals become mixed.
minFaradayRotation:
Rotation of the polarization plane of
the radar signal impacts imagery, as HH and HV signals become mixed.
flightDirection:
Satellite orbit direction during data acquisition
flightLine:
Specify a flightline for UAVSAR or AIRSAR.
frame:
ESA-referenced frames are offered to give users a universal framing convention.
Each ESA frame has a corresponding ASF frame assigned. See also: asfframe
granule_list:
List of specific granules.
Search results may include several products per granule name.
groupID:
Identifier used to find products considered to
be of the same scene but having different granule names
insarStackId:
Identifier used to find products of the same InSAR stack
instrument:
The instrument used to acquire the data. See also: platform
intersectsWith:
Search by polygon, linestring,
or point defined in 2D Well-Known Text (WKT)
lookDirection:
Left or right look direction during data acquisition
offNadirAngle:
Off-nadir angles for ALOS PALSAR
platform:
Remote sensing platform that acquired the data.
Platforms that work together, such as Sentinel-1A/1B and ERS-1/2
have multi-platform aliases available. See also: instrument
polarization:
A property of SAR electromagnetic waves
that can be used to extract meaningful information about surface properties of the earth.
processingDate:
Used to find data that has been processed at ASF since a given
time and date. Supports timestamps as well as natural language such as "3 weeks ago"
processingLevel:
Level to which the data has been processed
product_list:
List of specific products.
Guaranteed to be at most one product per product name.
relativeOrbit:
Path or track of satellite during data acquisition.
For UAVSAR it is the Line ID.
season:
Start and end day of year for desired seasonal range.
This option is used in conjunction with start/end to specify a seasonal range
within an overall date range.
start:
Start date of data acquisition.
Supports timestamps as well as natural language such as "3 weeks ago"
collections:
List of collections (concept-ids) to limit search to
temporalBaselineDays:
List of temporal baselines,
used for Sentinel-1 Interferogram (BETA)
maxResults:
The maximum number of results to be returned by the search
opts:
An ASFSearchOptions object describing the search parameters to be used.
Search parameters specified outside this object will override in event of a conflict.
Returns
-------
`asf_search.ASFSearchResults` (list of search results of subclass ASFProduct)
"""
kwargs = locals()
data = dict((k, v) for k, v in kwargs.items() if k not in ['host', 'opts'] and v is not None)
opts = ASFSearchOptions() if opts is None else copy(opts)
opts.merge_args(**data)
return search(opts=opts)
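The `locals()`-based filtering just above (collect every keyword the caller actually set, drop `None`s and the `host`/`opts` bookkeeping keys) is an idiom repeated throughout this module. A minimal standalone sketch of the same pattern — the helper name is illustrative, not part of the library:

```python
def collect_search_kwargs(**kwargs):
    """Keep only caller-supplied (non-None) search parameters,
    dropping bookkeeping keys, as the dict comprehension above does."""
    excluded = {'host', 'opts'}
    return {k: v for k, v in kwargs.items() if k not in excluded and v is not None}

# Only explicitly-set parameters survive the filter:
filtered = collect_search_kwargs(platform='SENTINEL-1', start=None, opts=None)
# filtered == {'platform': 'SENTINEL-1'}
```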
Discovery-asf_search-8.1.2/asf_search/search/granule_search.py

from typing import Sequence
from copy import copy
from asf_search.search import search
from asf_search.ASFSearchOptions import ASFSearchOptions
from asf_search.ASFSearchResults import ASFSearchResults
def granule_search(granule_list: Sequence[str], opts: ASFSearchOptions = None) -> ASFSearchResults:
"""
Performs a granule name search using the ASF SearchAPI
Parameters
----------
granule_list:
List of specific granules.
Search results may include several products per granule name.
opts:
An ASFSearchOptions object describing the search parameters to be used.
Search parameters specified outside this object will override in event of a conflict.
Returns
-------
`asf_search.ASFSearchResults` (list of search results of subclass ASFProduct)
"""
opts = ASFSearchOptions() if opts is None else copy(opts)
opts.merge_args(granule_list=granule_list)
return search(opts=opts)
Discovery-asf_search-8.1.2/asf_search/search/product_search.py

from typing import Sequence
from copy import copy
from asf_search.search import search
from asf_search.ASFSearchOptions import ASFSearchOptions
from asf_search.ASFSearchResults import ASFSearchResults
def product_search(product_list: Sequence[str], opts: ASFSearchOptions = None) -> ASFSearchResults:
"""
Performs a product ID search using the ASF SearchAPI
Parameters
----------
product_list:
List of specific products.
Guaranteed to be at most one product per product name.
opts:
An ASFSearchOptions object describing the search parameters to be used.
Search parameters specified outside this object will override in event of a conflict.
Returns
-------
`asf_search.ASFSearchResults` (list of search results of subclass ASFProduct)
"""
opts = ASFSearchOptions() if opts is None else copy(opts)
opts.merge_args(product_list=product_list)
return search(opts=opts)
Discovery-asf_search-8.1.2/asf_search/search/search.py

import time
from typing import Literal, Union, Sequence, Tuple
from copy import copy
import datetime
from asf_search import ASF_LOGGER, ASFSearchResults
from asf_search.ASFSearchOptions import ASFSearchOptions
from asf_search.search.search_generator import search_generator
def search(
absoluteOrbit: Union[
int, Tuple[int, int], range, Sequence[Union[int, Tuple[int, int], range]]
] = None,
asfFrame: Union[
int, Tuple[int, int], range, Sequence[Union[int, Tuple[int, int], range]]
] = None,
beamMode: Union[str, Sequence[str]] = None,
beamSwath: Union[str, Sequence[str]] = None,
campaign: Union[str, Sequence[str]] = None,
maxDoppler: float = None,
minDoppler: float = None,
end: Union[datetime.datetime, str] = None,
maxFaradayRotation: float = None,
minFaradayRotation: float = None,
flightDirection: str = None,
flightLine: str = None,
frame: Union[int, Tuple[int, int], range, Sequence[Union[int, Tuple[int, int], range]]] = None,
granule_list: Union[str, Sequence[str]] = None,
groupID: Union[str, Sequence[str]] = None,
insarStackId: str = None,
instrument: Union[str, Sequence[str]] = None,
intersectsWith: str = None,
lookDirection: Union[str, Sequence[str]] = None,
offNadirAngle: Union[
float, Tuple[float, float], Sequence[Union[float, Tuple[float, float]]]
] = None,
platform: Union[str, Sequence[str]] = None,
polarization: Union[str, Sequence[str]] = None,
processingDate: Union[datetime.datetime, str] = None,
processingLevel: Union[str, Sequence[str]] = None,
product_list: Union[str, Sequence[str]] = None,
relativeOrbit: Union[
int, Tuple[int, int], range, Sequence[Union[int, Tuple[int, int], range]]
] = None,
season: Tuple[int, int] = None,
start: Union[datetime.datetime, str] = None,
absoluteBurstID: Union[int, Sequence[int]] = None,
relativeBurstID: Union[int, Sequence[int]] = None,
fullBurstID: Union[str, Sequence[str]] = None,
temporalBaselineDays: Union[str, Sequence[str]] = None,
operaBurstID: Union[str, Sequence[str]] = None,
frameCoverage: Literal['FULL', 'PARTIAL'] = None,
mainBandPolarization: Union[str, Sequence[str]] = None,
sideBandPolarization: Union[str, Sequence[str]] = None,
rangeBandwidth: Union[str, Sequence[str]] = None,
jointObservation: bool = None,
dataset: Union[str, Sequence[str]] = None,
collections: Union[str, Sequence[str]] = None,
shortName: Union[str, Sequence[str]] = None,
cmr_keywords: Union[Tuple[str, str], Sequence[Tuple[str, str]]] = None,
maxResults: int = None,
opts: ASFSearchOptions = None,
) -> ASFSearchResults:
"""
Performs a generic search against the Central Metadata Repository (CMR),
returning all results in a single list.
(For accessing results page by page see `asf_search.search_generator()`)
Accepts a number of search parameters, and/or an ASFSearchOptions object.
If an ASFSearchOptions object is provided as well as other specific parameters,
the two sets of options will be merged, preferring the specific keyword arguments.
Parameters
----------
absoluteOrbit:
For ALOS, ERS-1, ERS-2, JERS-1, and RADARSAT-1, Sentinel-1A, Sentinel-1B
this value corresponds to the orbit count within the orbit cycle.
For UAVSAR it is the Flight ID.
asfFrame:
This is primarily an ASF / JAXA frame reference. However,
some platforms use other conventions. See ‘frame’ for ESA-centric frame searches.
beamMode:
The beam mode used to acquire the data.
beamSwath:
Encompasses a look angle and beam mode.
campaign:
For UAVSAR and AIRSAR data collections only. Search by general location,
site description, or data grouping as supplied by flight agency or project.
maxDoppler:
Doppler provides an indication of how much the look direction deviates
from the ideal perpendicular flight direction acquisition.
minDoppler:
Doppler provides an indication of how much the look direction deviates
from the ideal perpendicular flight direction acquisition.
end:
End date of data acquisition. Supports timestamps
as well as natural language such as "3 weeks ago"
maxFaradayRotation:
Rotation of the polarization plane of
the radar signal impacts imagery, as HH and HV signals become mixed.
minFaradayRotation:
Rotation of the polarization plane of
the radar signal impacts imagery, as HH and HV signals become mixed.
flightDirection:
Satellite orbit direction during data acquisition
flightLine:
Specify a flightline for UAVSAR or AIRSAR.
frame:
ESA-referenced frames are offered to give users a universal framing convention.
Each ESA frame has a corresponding ASF frame assigned. See also: asfframe
granule_list:
List of specific granules.
Search results may include several products per granule name.
groupID:
Identifier used to find products considered to
be of the same scene but having different granule names
insarStackId:
Identifier used to find products of the same InSAR stack
instrument:
The instrument used to acquire the data. See also: platform
intersectsWith:
Search by polygon, linestring,
or point defined in 2D Well-Known Text (WKT)
lookDirection:
Left or right look direction during data acquisition
offNadirAngle:
Off-nadir angles for ALOS PALSAR
platform:
Remote sensing platform that acquired the data.
Platforms that work together, such as Sentinel-1A/1B and ERS-1/2
have multi-platform aliases available. See also: instrument
polarization:
A property of SAR electromagnetic waves
that can be used to extract meaningful information about surface properties of the earth.
processingDate:
Used to find data that has been processed at ASF since a given
time and date. Supports timestamps as well as natural language such as "3 weeks ago"
processingLevel:
Level to which the data has been processed
product_list:
List of specific products.
Guaranteed to be at most one product per product name.
relativeOrbit:
Path or track of satellite during data acquisition.
For UAVSAR it is the Line ID.
season:
Start and end day of year for desired seasonal range.
This option is used in conjunction with start/end to specify a seasonal range
within an overall date range.
start:
Start date of data acquisition.
Supports timestamps as well as natural language such as "3 weeks ago"
collections:
List of collections (concept-ids) to limit search to
temporalBaselineDays:
List of temporal baselines,
used for Sentinel-1 Interferogram (BETA)
maxResults:
The maximum number of results to be returned by the search
opts:
An ASFSearchOptions object describing the search parameters to be used.
Search parameters specified outside this object will override in event of a conflict.
Returns
-------
`asf_search.ASFSearchResults` (list of search results of subclass ASFProduct)
"""
kwargs = locals()
data = dict((k, v) for k, v in kwargs.items() if k not in ['host', 'opts'] and v is not None)
opts = ASFSearchOptions() if opts is None else copy(opts)
opts.merge_args(**data)
results = ASFSearchResults([])
# The last page will be marked as complete if the search was successful
perf = time.time()
for page in search_generator(opts=opts):
ASF_LOGGER.debug(f'Page Time Elapsed {time.time() - perf}')
results.extend(page)
results.searchComplete = page.searchComplete
results.searchOptions = page.searchOptions
perf = time.time()
results.raise_if_incomplete()
try:
results.sort(key=lambda p: p.get_sort_keys(), reverse=True)
except TypeError as exc:
ASF_LOGGER.warning(f'Failed to sort final results, leaving results unsorted. Reason: {exc}')
return results
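The try/except around the final sort above guards against products whose sort keys are not mutually comparable, preferring unsorted results over a failed search. The same defensive pattern in isolation, with hypothetical dict "products" standing in for `ASFProduct` objects:

```python
import logging

def sort_or_leave(results, key):
    """Sort in place, newest first; if keys are not mutually comparable
    (TypeError), log a warning instead of failing the whole search."""
    try:
        results.sort(key=key, reverse=True)
    except TypeError as exc:
        logging.warning('Failed to sort results, leaving unsorted: %s', exc)

items = [{'stopTime': '2021-01-02'}, {'stopTime': '2021-01-03'}, {'stopTime': '2021-01-01'}]
sort_or_leave(items, key=lambda p: p['stopTime'])
# items is now newest-first

mixed = [{'stopTime': '2021-01-01'}, {'stopTime': None}]
sort_or_leave(mixed, key=lambda p: p['stopTime'])  # TypeError is caught; no exception escapes
```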
Discovery-asf_search-8.1.2/asf_search/search/search_count.py

import datetime
from typing import Literal, Sequence, Tuple, Union
from copy import copy
from asf_search.ASFSearchOptions import ASFSearchOptions
from asf_search.CMR.subquery import build_subqueries
from asf_search.CMR import translate_opts
from asf_search.search.search_generator import get_page, preprocess_opts
from asf_search import INTERNAL
def search_count(
absoluteOrbit: Union[
int, Tuple[int, int], range, Sequence[Union[int, Tuple[int, int], range]]
] = None,
asfFrame: Union[
int, Tuple[int, int], range, Sequence[Union[int, Tuple[int, int], range]]
] = None,
beamMode: Union[str, Sequence[str]] = None,
beamSwath: Union[str, Sequence[str]] = None,
campaign: Union[str, Sequence[str]] = None,
maxDoppler: float = None,
minDoppler: float = None,
end: Union[datetime.datetime, str] = None,
maxFaradayRotation: float = None,
minFaradayRotation: float = None,
flightDirection: str = None,
flightLine: str = None,
frame: Union[int, Tuple[int, int], range, Sequence[Union[int, Tuple[int, int], range]]] = None,
granule_list: Union[str, Sequence[str]] = None,
groupID: Union[str, Sequence[str]] = None,
insarStackId: str = None,
instrument: Union[str, Sequence[str]] = None,
intersectsWith: str = None,
lookDirection: Union[str, Sequence[str]] = None,
offNadirAngle: Union[
float, Tuple[float, float], Sequence[Union[float, Tuple[float, float]]]
] = None,
platform: Union[str, Sequence[str]] = None,
polarization: Union[str, Sequence[str]] = None,
processingDate: Union[datetime.datetime, str] = None,
processingLevel: Union[str, Sequence[str]] = None,
product_list: Union[str, Sequence[str]] = None,
relativeOrbit: Union[
int, Tuple[int, int], range, Sequence[Union[int, Tuple[int, int], range]]
] = None,
season: Tuple[int, int] = None,
start: Union[datetime.datetime, str] = None,
absoluteBurstID: Union[int, Sequence[int]] = None,
relativeBurstID: Union[int, Sequence[int]] = None,
fullBurstID: Union[str, Sequence[str]] = None,
temporalBaselineDays: Union[str, Sequence[str]] = None,
operaBurstID: Union[str, Sequence[str]] = None,
frameCoverage: Literal["FULL", "PARTIAL"] = None,
mainBandPolarization: Union[str, Sequence[str]] = None,
sideBandPolarization: Union[str, Sequence[str]] = None,
rangeBandwidth: Union[str, Sequence[str]] = None,
jointObservation: bool = None,
dataset: Union[str, Sequence[str]] = None,
collections: Union[str, Sequence[str]] = None,
shortName: Union[str, Sequence[str]] = None,
cmr_keywords: Union[Tuple[str, str], Sequence[Tuple[str, str]]] = None,
maxResults: int = None,
opts: ASFSearchOptions = None,
) -> int:
# Create a kwargs dict, that's all of the 'not None' items, and merge it with opts:
kwargs = locals()
opts = ASFSearchOptions() if kwargs['opts'] is None else copy(opts)
del kwargs['opts']
kwargs = dict((k, v) for k, v in kwargs.items() if v is not None)
kw_opts = ASFSearchOptions(**kwargs)
# Anything passed in as kwargs has priority over anything in opts:
opts.merge_args(**dict(kw_opts))
preprocess_opts(opts)
url = '/'.join(s.strip('/') for s in [f'https://{opts.host}', f'{INTERNAL.CMR_GRANULE_PATH}'])
count = 0
for query in build_subqueries(opts):
translated_opts = translate_opts(query)
idx = translated_opts.index(('page_size', INTERNAL.CMR_PAGE_SIZE))
translated_opts[idx] = ('page_size', 0)
response = get_page(session=opts.session, url=url, translated_opts=translated_opts)
count += response.json()['hits']
return count
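`search_count` reuses the normal query translation but zeroes out `page_size`, so CMR returns only the `hits` count without any granule payloads. The in-place tuple replacement can be sketched generically — this hypothetical helper searches by key rather than by the exact `(name, value)` pair the code above uses:

```python
def override_param(params, name, new_value):
    """Replace the first (name, value) tuple in a list of query params,
    mirroring the page_size -> 0 swap used for count-only queries."""
    for idx, (key, _) in enumerate(params):
        if key == name:
            params[idx] = (name, new_value)
            return params
    raise ValueError(f'parameter {name!r} not present')

query = [('platform', 'SENTINEL-1'), ('page_size', 250)]
override_param(query, 'page_size', 0)
# query is now [('platform', 'SENTINEL-1'), ('page_size', 0)]
```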
Discovery-asf_search-8.1.2/asf_search/search/search_generator.py

import time
from typing import Dict, Generator, Literal, Union, Sequence, Tuple, List
from copy import copy
from requests.exceptions import HTTPError
from requests import ReadTimeout, Response
from tenacity import (
retry,
retry_if_exception_type,
stop_after_attempt,
wait_exponential,
wait_fixed,
)
import datetime
import dateparser
from asf_search import ASF_LOGGER
from asf_search.ASFSearchResults import ASFSearchResults
from asf_search.ASFSearchOptions import ASFSearchOptions
from asf_search.CMR import build_subqueries, translate_opts
from asf_search.CMR.datasets import dataset_collections
from asf_search.ASFSession import ASFSession
from asf_search.ASFProduct import ASFProduct
from asf_search.exceptions import (
ASFSearch4xxError,
ASFSearch5xxError,
ASFSearchError,
CMRIncompleteError,
)
from asf_search.constants import INTERNAL
from asf_search.WKT.validate_wkt import validate_wkt
from asf_search.search.error_reporting import report_search_error
import asf_search.Products as ASFProductType
def search_generator(
absoluteOrbit: Union[
int, Tuple[int, int], range, Sequence[Union[int, Tuple[int, int], range]]
] = None,
asfFrame: Union[
int, Tuple[int, int], range, Sequence[Union[int, Tuple[int, int], range]]
] = None,
beamMode: Union[str, Sequence[str]] = None,
beamSwath: Union[str, Sequence[str]] = None,
campaign: Union[str, Sequence[str]] = None,
maxDoppler: float = None,
minDoppler: float = None,
end: Union[datetime.datetime, str] = None,
maxFaradayRotation: float = None,
minFaradayRotation: float = None,
flightDirection: str = None,
flightLine: str = None,
frame: Union[int, Tuple[int, int], range, Sequence[Union[int, Tuple[int, int], range]]] = None,
granule_list: Union[str, Sequence[str]] = None,
groupID: Union[str, Sequence[str]] = None,
insarStackId: str = None,
instrument: Union[str, Sequence[str]] = None,
intersectsWith: str = None,
lookDirection: Union[str, Sequence[str]] = None,
offNadirAngle: Union[
float, Tuple[float, float], Sequence[Union[float, Tuple[float, float]]]
] = None,
platform: Union[str, Sequence[str]] = None,
polarization: Union[str, Sequence[str]] = None,
processingDate: Union[datetime.datetime, str] = None,
processingLevel: Union[str, Sequence[str]] = None,
product_list: Union[str, Sequence[str]] = None,
relativeOrbit: Union[
int, Tuple[int, int], range, Sequence[Union[int, Tuple[int, int], range]]
] = None,
season: Tuple[int, int] = None,
start: Union[datetime.datetime, str] = None,
absoluteBurstID: Union[int, Sequence[int]] = None,
relativeBurstID: Union[int, Sequence[int]] = None,
fullBurstID: Union[str, Sequence[str]] = None,
temporalBaselineDays: Union[str, Sequence[str]] = None,
operaBurstID: Union[str, Sequence[str]] = None,
frameCoverage: Literal["FULL", "PARTIAL"] = None,
mainBandPolarization: Union[str, Sequence[str]] = None,
sideBandPolarization: Union[str, Sequence[str]] = None,
rangeBandwidth: Union[str, Sequence[str]] = None,
jointObservation: bool = None,
dataset: Union[str, Sequence[str]] = None,
collections: Union[str, Sequence[str]] = None,
shortName: Union[str, Sequence[str]] = None,
cmr_keywords: Union[Tuple[str, str], Sequence[Tuple[str, str]]] = None,
maxResults: int = None,
opts: ASFSearchOptions = None,
) -> Generator[ASFSearchResults, None, None]:
"""
Performs a generic search against the Central Metadata Repository (CMR),
yielding results page by page (250 products at a time by default) as they're returned from CMR.
Accepts a number of search parameters, and/or an ASFSearchOptions object.
If an ASFSearchOptions object is provided as well as other specific parameters,
the two sets of options will be merged, preferring the specific keyword arguments.
Parameters
----------
absoluteOrbit:
For ALOS, ERS-1, ERS-2, JERS-1, and RADARSAT-1, Sentinel-1A, Sentinel-1B
this value corresponds to the orbit count within the orbit cycle.
For UAVSAR it is the Flight ID.
asfFrame:
This is primarily an ASF / JAXA frame reference. However,
some platforms use other conventions. See ‘frame’ for ESA-centric frame searches.
beamMode:
The beam mode used to acquire the data.
beamSwath:
Encompasses a look angle and beam mode.
campaign:
For UAVSAR and AIRSAR data collections only. Search by general location,
site description, or data grouping as supplied by flight agency or project.
maxDoppler:
Doppler provides an indication of how much the look direction deviates
from the ideal perpendicular flight direction acquisition.
minDoppler:
Doppler provides an indication of how much the look direction deviates
from the ideal perpendicular flight direction acquisition.
end:
End date of data acquisition. Supports timestamps
as well as natural language such as "3 weeks ago"
maxFaradayRotation:
Rotation of the polarization plane of
the radar signal impacts imagery, as HH and HV signals become mixed.
minFaradayRotation:
Rotation of the polarization plane of
the radar signal impacts imagery, as HH and HV signals become mixed.
flightDirection:
Satellite orbit direction during data acquisition
flightLine:
Specify a flightline for UAVSAR or AIRSAR.
frame:
ESA-referenced frames are offered to give users a universal framing convention.
Each ESA frame has a corresponding ASF frame assigned. See also: asfframe
granule_list:
List of specific granules.
Search results may include several products per granule name.
groupID:
Identifier used to find products considered to
be of the same scene but having different granule names
insarStackId:
Identifier used to find products of the same InSAR stack
instrument:
The instrument used to acquire the data. See also: platform
intersectsWith:
Search by polygon, linestring,
or point defined in 2D Well-Known Text (WKT)
lookDirection:
Left or right look direction during data acquisition
offNadirAngle:
Off-nadir angles for ALOS PALSAR
platform:
Remote sensing platform that acquired the data.
Platforms that work together, such as Sentinel-1A/1B and ERS-1/2
have multi-platform aliases available. See also: instrument
polarization:
A property of SAR electromagnetic waves
that can be used to extract meaningful information about surface properties of the earth.
processingDate:
Used to find data that has been processed at ASF since a given
time and date. Supports timestamps as well as natural language such as "3 weeks ago"
processingLevel:
Level to which the data has been processed
product_list:
List of specific products.
Guaranteed to be at most one product per product name.
relativeOrbit:
Path or track of satellite during data acquisition.
For UAVSAR it is the Line ID.
season:
Start and end day of year for desired seasonal range.
This option is used in conjunction with start/end to specify a seasonal range
within an overall date range.
start:
Start date of data acquisition.
Supports timestamps as well as natural language such as "3 weeks ago"
collections:
List of collections (concept-ids) to limit search to
temporalBaselineDays:
List of temporal baselines,
used for Sentinel-1 Interferogram (BETA)
maxResults:
The maximum number of results to be returned by the search
opts:
An ASFSearchOptions object describing the search parameters to be used.
Search parameters specified outside this object will override in event of a conflict.
Yields
-------
`asf_search.ASFSearchResults` (list of search results of subclass ASFProduct, page by page)
"""
# Create a kwargs dict, that's all of the 'not None' items, and merge it with opts:
kwargs = locals()
opts = ASFSearchOptions() if kwargs['opts'] is None else copy(opts)
del kwargs['opts']
kwargs = dict((k, v) for k, v in kwargs.items() if v is not None)
kw_opts = ASFSearchOptions(**kwargs)
# Anything passed in as kwargs has priority over anything in opts:
opts.merge_args(**dict(kw_opts))
maxResults = opts.pop('maxResults', None)
if maxResults is not None and (
getattr(opts, 'granule_list', False) or getattr(opts, 'product_list', False)
):
raise ValueError('Cannot use maxResults along with product_list/granule_list.')
ASF_LOGGER.debug(f'SEARCH: preprocessing opts: {opts}')
preprocess_opts(opts)
ASF_LOGGER.debug(f'SEARCH: preprocessed opts: {opts}')
ASF_LOGGER.info(f'SEARCH: Using search opts {opts}')
url = '/'.join(s.strip('/') for s in [f'https://{opts.host}', f'{INTERNAL.CMR_GRANULE_PATH}'])
total = 0
queries = build_subqueries(opts)
ASF_LOGGER.info(f'SEARCH: Using cmr endpoint: "{url}"')
ASF_LOGGER.debug(f'SEARCH: Built {len(queries)} subqueries')
for subquery_idx, query in enumerate(queries):
ASF_LOGGER.info(f'SUBQUERY {subquery_idx + 1}: Beginning subquery with opts: {query}')
ASF_LOGGER.debug(f'TRANSLATION: Translating subquery:\n{query}')
translated_opts = translate_opts(query)
ASF_LOGGER.debug(f'TRANSLATION: Subquery translated to cmr keywords:\n{translated_opts}')
cmr_search_after_header = ''
subquery_count = 0
page_number = 1
while cmr_search_after_header is not None:
try:
ASF_LOGGER.debug(f'SUBQUERY {subquery_idx + 1}: Fetching page {page_number}')
items, subquery_max_results, cmr_search_after_header = query_cmr(
opts.session, url, translated_opts, subquery_count
)
except (ASFSearchError, CMRIncompleteError) as exc:
message = str(exc)
ASF_LOGGER.error(message)
report_search_error(query, message)
opts.session.headers.pop('CMR-Search-After', None)
# If it's a CMRIncompleteError, we can just stop here and return what we have
# It's up to the user to call .raise_if_incomplete() if they're using the
# generator directly.
if isinstance(exc, CMRIncompleteError):
return
else:
raise
ASF_LOGGER.debug(
f'SUBQUERY {subquery_idx + 1}: Page {page_number} fetched, returned {len(items)} items.'
)
opts.session.headers.update({'CMR-Search-After': cmr_search_after_header})
perf = time.time()
last_page = process_page(
items, maxResults, subquery_max_results, total, subquery_count, opts
)
ASF_LOGGER.info(f'Page Processing Time {time.time() - perf}')
subquery_count += len(last_page)
total += len(last_page)
last_page.searchComplete = subquery_count == subquery_max_results or total == maxResults
yield last_page
if last_page.searchComplete:
if total == maxResults: # the user has as many results as they wanted
ASF_LOGGER.info(f'SEARCH COMPLETE: MaxResults ({maxResults}) reached')
opts.session.headers.pop('CMR-Search-After', None)
return
else: # or we've gotten all possible results for this subquery
ASF_LOGGER.info(
f'SUBQUERY {subquery_idx + 1} COMPLETE: results exhausted for subquery'
)
cmr_search_after_header = None
page_number += 1
opts.session.headers.pop('CMR-Search-After', None)
ASF_LOGGER.info(f'SEARCH COMPLETE: results exhausted for search opts {opts}')
@retry(
reraise=True,
retry=retry_if_exception_type(CMRIncompleteError),
wait=wait_fixed(2),
stop=stop_after_attempt(3),
)
def query_cmr(
session: ASFSession,
url: str,
translated_opts: Dict,
sub_query_count: int,
):
response = get_page(session=session, url=url, translated_opts=translated_opts)
perf = time.time()
items = [as_ASFProduct(f, session=session) for f in response.json()['items']]
ASF_LOGGER.debug(f'Product Subclassing Time {time.time() - perf}')
hits: int = response.json()['hits'] # total count of products given search opts
# sometimes CMR returns results with the wrong page size
if len(items) != INTERNAL.CMR_PAGE_SIZE and len(items) + sub_query_count < hits:
raise CMRIncompleteError(
'CMR returned page of incomplete results. '
f'Expected {min(INTERNAL.CMR_PAGE_SIZE, hits - sub_query_count)} results, '
f'got {len(items)}'
)
return items, hits, response.headers.get('CMR-Search-After', None)
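`query_cmr` is retried (via tenacity) whenever CMR hands back a short page that cannot be the final one. The completeness check reduces to a small predicate; in this standalone sketch `page_size` stands in for `INTERNAL.CMR_PAGE_SIZE` and the names are illustrative:

```python
def page_is_incomplete(page_len, already_fetched, total_hits, page_size=250):
    """True when CMR returned a short page that cannot be the final one:
    fewer items than a full page, yet hits remain unfetched."""
    return page_len != page_size and page_len + already_fetched < total_hits

# A short page while hits remain -> incomplete, triggers a retry.
# A genuinely final page is allowed to be short.
```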
def process_page(
items: List[ASFProduct],
max_results: int,
subquery_max_results: int,
total: int,
subquery_count: int,
opts: ASFSearchOptions,
):
if max_results is None:
last_page = ASFSearchResults(
items[: min(subquery_max_results - subquery_count, len(items))], opts=opts
)
else:
last_page = ASFSearchResults(items[: min(max_results - total, len(items))], opts=opts)
return last_page
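`process_page` trims each page so the running totals never exceed either the subquery's total hit count or the user's `maxResults` budget. The slice arithmetic in isolation (a sketch over plain lists rather than `ASFSearchResults`):

```python
def trim_page(items, max_results, subquery_max_results, total, subquery_count):
    """Truncate a page of results, mirroring the slicing above:
    cap by remaining subquery hits when max_results is unset,
    otherwise by the user's remaining result budget."""
    if max_results is None:
        return items[: min(subquery_max_results - subquery_count, len(items))]
    return items[: min(max_results - total, len(items))]

# e.g. only 5 hits remain in the subquery -> a 250-item page is cut to 5
```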
@retry(
reraise=True,
retry=retry_if_exception_type(ASFSearch5xxError),
wait=wait_exponential(
multiplier=1, min=3, max=10
),  # exponential backoff: 1 * 2^n seconds between retries, clamped to the range 3-10 seconds
stop=stop_after_attempt(3),
)
def get_page(session: ASFSession, url: str, translated_opts: List) -> Response:
from asf_search.constants.INTERNAL import CMR_TIMEOUT
perf = time.time()
try:
response = session.post(url=url, data=translated_opts, timeout=CMR_TIMEOUT)
response.raise_for_status()
except HTTPError as exc:
error_message = f'HTTP {response.status_code}: {response.json()["errors"]}'
if 400 <= response.status_code <= 499:
raise ASFSearch4xxError(error_message) from exc
if 500 <= response.status_code <= 599:
raise ASFSearch5xxError(error_message) from exc
except ReadTimeout as exc:
raise ASFSearchError(
f'Connection Error (Timeout): CMR took too long to respond. Set asf constant "asf_search.constants.INTERNAL.CMR_TIMEOUT" to increase. ({url=}, timeout={CMR_TIMEOUT})'
) from exc
ASF_LOGGER.info(f'Query Time Elapsed {time.time() - perf}')
return response
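The `wait_exponential(multiplier=1, min=3, max=10)` policy above grows the delay geometrically while holding it inside a fixed window. A hand-rolled sketch of that clamped schedule — the exact attempt indexing is my assumption and may differ slightly from tenacity's internals:

```python
def backoff_delay(attempt, multiplier=1, min_wait=3, max_wait=10):
    """Clamped exponential delay: multiplier * 2**attempt,
    held within [min_wait, max_wait]."""
    return max(min_wait, min(multiplier * 2 ** attempt, max_wait))

# Early attempts are floored at min_wait; later ones are capped at max_wait.
```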
def preprocess_opts(opts: ASFSearchOptions):
# Repair WKT here so it only happens once, and you can save the result to the new Opts object:
wrap_wkt(opts=opts)
# Date/Time logic, convert "today" to the literal timestamp if needed:
set_default_dates(opts=opts)
# Platform Alias logic:
set_platform_alias(opts=opts)
def wrap_wkt(opts: ASFSearchOptions):
if opts.intersectsWith is not None:
wrapped, _, repairs = validate_wkt(opts.intersectsWith)
opts.intersectsWith = wrapped.wkt
if len(repairs):
ASF_LOGGER.warning(
'WKT REPAIR/VALIDATION: The following repairs were performed '
f'on the provided AOI:\n{[str(repair) for repair in repairs]}'
)
def set_default_dates(opts: ASFSearchOptions):
if opts.start is not None and isinstance(opts.start, str):
opts.start = dateparser.parse(opts.start, settings={'RETURN_AS_TIMEZONE_AWARE': True})
if opts.end is not None and isinstance(opts.end, str):
opts.end = dateparser.parse(opts.end, settings={'RETURN_AS_TIMEZONE_AWARE': True})
# If both are used, make sure they're in the right order:
if opts.start is not None and opts.end is not None:
if opts.start > opts.end:
ASF_LOGGER.warning(
f'Start date ({opts.start}) is after end date ({opts.end}). Switching the two.'
)
opts.start, opts.end = opts.end, opts.start
# Can't do this sooner, since you need to compare start vs end:
if opts.start is not None:
opts.start = opts.start.strftime('%Y-%m-%dT%H:%M:%SZ')
if opts.end is not None:
opts.end = opts.end.strftime('%Y-%m-%dT%H:%M:%SZ')
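The ordering-and-formatting steps above can be reproduced with plain `datetime` objects. This sketch assumes already-parsed datetimes; the real code first runs strings through `dateparser`, which also accepts natural language like "3 weeks ago":

```python
import datetime

def normalize_date_range(start, end):
    """Swap a reversed (start, end) pair, then render both in the
    UTC timestamp format sent to CMR."""
    if start is not None and end is not None and start > end:
        start, end = end, start  # matches the warn-and-switch above
    fmt = '%Y-%m-%dT%H:%M:%SZ'
    return (
        start.strftime(fmt) if start is not None else None,
        end.strftime(fmt) if end is not None else None,
    )

# Reversed inputs get swapped before formatting:
start, end = normalize_date_range(
    datetime.datetime(2021, 6, 1), datetime.datetime(2021, 1, 1)
)
```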
def set_platform_alias(opts: ASFSearchOptions):
# Platform Alias logic:
if opts.platform is not None:
plat_aliases = {
# Groups:
'S1': ['SENTINEL-1A', 'SENTINEL-1B'],
'SENTINEL-1': ['SENTINEL-1A', 'SENTINEL-1B'],
'SENTINEL': ['SENTINEL-1A', 'SENTINEL-1B'],
'ERS': ['ERS-1', 'ERS-2'],
'SIR-C': ['STS-59', 'STS-68'],
# Singles / Aliases:
'R1': ['RADARSAT-1'],
'E1': ['ERS-1'],
'E2': ['ERS-2'],
'J1': ['JERS-1'],
'A3': ['ALOS'],
'AS': ['DC-8'],
'AIRSAR': ['DC-8'],
'SS': ['SEASAT 1'],
'SEASAT': ['SEASAT 1'],
'SA': ['SENTINEL-1A'],
'SB': ['SENTINEL-1B'],
'SP': ['SMAP'],
'UA': ['G-III'],
'UAVSAR': ['G-III'],
}
platform_list = []
for plat in opts.platform:
# If it's a key, *replace* it with all the values. Else just add the key:
if plat.upper() in plat_aliases:
platform_list.extend(plat_aliases[plat.upper()])
else:
platform_list.append(plat)
opts.platform = list(set(platform_list))
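The alias expansion replaces group keys with their member platforms and de-duplicates, so e.g. `['S1', 'SENTINEL-1A']` collapses to the two Sentinel-1 platforms. A trimmed standalone version using a subset of the alias table above (sorted here for a deterministic result, where the original keeps `set` order):

```python
PLAT_ALIASES = {
    'S1': ['SENTINEL-1A', 'SENTINEL-1B'],
    'ERS': ['ERS-1', 'ERS-2'],
    'UAVSAR': ['G-III'],
}

def expand_platforms(platforms):
    """Expand alias keys to their member platforms, pass through
    anything unrecognized, and de-duplicate the result."""
    expanded = []
    for plat in platforms:
        expanded.extend(PLAT_ALIASES.get(plat.upper(), [plat]))
    return sorted(set(expanded))
```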
_dataset_collection_items = dataset_collections.items()
def as_ASFProduct(item: Dict, session: ASFSession) -> ASFProduct:
"""Returns the granule umm as the corresponding ASFProduct subclass,
or ASFProduct if no equivalent is found
:param item: the granule umm json
:param session: the session used to query CMR for the product
:returns the granule as an object of type ASFProduct
"""
if ASFProductType.OPERAS1Product._is_subclass(item):
return ASFProductType.OPERAS1Product(item, session=session)
product_type_key = _get_product_type_key(item)
# if there's a direct entry in our dataset to product type dict
# perf = time.time()
subclass = dataset_to_product_types.get(product_type_key)
if subclass is not None:
# ASF_LOGGER.warning(f'subclass selection time {time.time() - perf}')
return subclass(item, session=session)
# if the key matches one of the shortnames in any of our datasets
for dataset, collections in _dataset_collection_items:
if collections.get(product_type_key) is not None:
subclass = dataset_to_product_types.get(dataset)
if subclass is not None:
# ASF_LOGGER.warning(f'subclass selection time {time.time() - perf}')
return subclass(item, session=session)
break # dataset exists, but is not in dataset_to_product_types yet
# If the platform exists, try to match it
platform = _get_platform(item=item)
if ASFProductType.ARIAS1GUNWProduct._is_subclass(item=item):
return dataset_to_product_types.get('ARIA S1 GUNW')(item, session=session)
elif (subclass := dataset_to_product_types.get(platform)) is not None:
return subclass(item, session=session)
output = ASFProduct(item, session=session)
granule_concept_id = output.meta.get('concept-id', 'Missing Granule Concept ID')
fileID = output.properties.get(
'fileID', output.properties.get('sceneName', 'fileID and sceneName Missing')
)
ASF_LOGGER.warning(
'Failed to find corresponding ASFProduct subclass for '
f'Product: "{fileID}", Granule Concept ID: "{granule_concept_id}", '
'defaulting to "ASFProduct"'
)
return output
def _get_product_type_key(item: Dict) -> str:
"""Match the umm response to the right ASFProduct subclass by returning one of the following:
1. collection shortName (Ideal case)
2. platform_shortName (Fallback)
- special case: Aria S1 GUNW
"""
collection_shortName = ASFProduct.umm_get(item['umm'], 'CollectionReference', 'ShortName')
if collection_shortName is None:
if ASFProductType.ARIAS1GUNWProduct._is_subclass(item=item):
return 'ARIA S1 GUNW'
platform = _get_platform(item=item)
return platform
return collection_shortName
def _get_platform(item: Dict):
return ASFProduct.umm_get(item['umm'], 'Platforms', 0, 'ShortName')
# Maps datasets from DATASET.py and collection/platform shortnames to ASFProduct subclasses
dataset_to_product_types = {
'SENTINEL-1': ASFProductType.S1Product,
'OPERA-S1': ASFProductType.OPERAS1Product,
'OPERA-S1-CALVAL': ASFProductType.OPERAS1Product,
'SLC-BURST': ASFProductType.S1BurstProduct,
'ALOS': ASFProductType.ALOSProduct,
'SIR-C': ASFProductType.SIRCProduct,
'STS-59': ASFProductType.SIRCProduct,
'STS-68': ASFProductType.SIRCProduct,
'ARIA S1 GUNW': ASFProductType.ARIAS1GUNWProduct,
'SMAP': ASFProductType.SMAPProduct,
'UAVSAR': ASFProductType.UAVSARProduct,
'G-III': ASFProductType.UAVSARProduct,
'RADARSAT-1': ASFProductType.RADARSATProduct,
'ERS': ASFProductType.ERSProduct,
'ERS-1': ASFProductType.ERSProduct,
'ERS-2': ASFProductType.ERSProduct,
'JERS-1': ASFProductType.JERSProduct,
'AIRSAR': ASFProductType.AIRSARProduct,
'DC-8': ASFProductType.AIRSARProduct,
'SEASAT': ASFProductType.SEASATProduct,
'SEASAT 1': ASFProductType.SEASATProduct,
'NISAR': ASFProductType.NISARProduct,
}
Discovery-asf_search-8.1.2/examples/ 0000775 0000000 0000000 00000000000 14777330235 0017416 5 ustar 00root root 0000000 0000000 Discovery-asf_search-8.1.2/examples/0-Intro.md 0000664 0000000 0000000 00000005167 14777330235 0021201 0 ustar 00root root 0000000 0000000 # SAR Data in Python: Getting to Know asf_search
***
## About the Alaska Satellite Facility
__ASF is part of the Geophysical Institute of the University of Alaska Fairbanks.__
- ASF downlinks, processes, archives, and distributes remote-sensing data to scientific users around the world.
- ASF promotes, facilitates, and participates in the advancement of remote sensing to support national and international Earth science research, field operations, and commercial applications.
- ASF commits to provide the highest quality data and services in a timely manner.
__Distributed Active Archive Center (DAAC):__ ASF operates the NASA archive of synthetic aperture radar (SAR) data from a variety of satellites and aircraft, providing these data and associated specialty support services to researchers in support of NASA’s Earth Science Data and Information System (ESDIS) project.
[ASF Website](https://asf.alaska.edu)
[Contact ASF](https://asf.alaska.edu/contact/)
***
## ASF Discovery Team
__The ASF Discovery team's focus is to provide tools that help users find and acquire the data they want as quickly and smoothly as possible.__
Some tools provided by ASF's Discovery team include:
- [Vertex](https://search.asf.alaska.edu): web application for searching the ASF archive, as well as performing meta-analysis and custom On Demand processing
- [ASF Search API](https://docs.asf.alaska.edu/api/basics/): Public REST API for searching the ASF archive
- [ASF Python Search Module](https://docs.asf.alaska.edu/asf_search/basics/) (asf_search): Python module for programmatically finding and acquiring data from the ASF archive
***
## Working in Python: asf_search
__asf_search is a Python module created to simplify the process of finding and acquiring data programmatically.__
- Search API is very technical: we saw a need to reduce the technical overhead required to use it.
- Vertex is very interactive: sometimes, an automated or semi-automated process is required, and we wanted to make the power of Vertex available in that context.
- Downloading data can be difficult: there are many ways to acquire data, and knowing how to use them effectively can be difficult.
***
## Today's Topic
__We will explore some basic usage of asf_search, with a focus on search functionality.__
Specific features to be covered include:
- Classes for working with ASF data
- Search functions
- Authentication methods
- Download functionality
This session is targeted largely at users who have a passing familiarity with Python, but `asf_search` is designed to be easily used by everyone from novice to expert.
***
Next: [Basic Overview](./1-Basic_Overview.ipynb) Discovery-asf_search-8.1.2/examples/1-Basic_Overview.ipynb 0000664 0000000 0000000 00000032231 14777330235 0023527 0 ustar 00root root 0000000 0000000 {
"cells": [
{
"cell_type": "markdown",
"id": "8bb99b33-3f6e-49f8-9a09-b56ebd0ed3d2",
"metadata": {
"tags": []
},
"source": [
"# asf_search - Basic Overview\n",
"`asf_search` provides several helpful classes and functions to simplify working with the ASF catalog. This notebook briefly describes how to get started with `asf_search`.\n",
"***\n",
"## Before You Start\n",
" \n",
"The steps outlined in this notebook assume `asf_search` is available on your system. `asf_search` is available through [PyPi](https://pypi.org/project/asf-search/), [Conda](https://anaconda.org/conda-forge/asf_search), and [Github](https://github.com/asfadmin/Discovery-asf_search). Additionally, full documentation for `asf_search` and many other services offered by ASF is available at [https://docs.asf.alaska.edu/](https://docs.asf.alaska.edu/)\n",
" \n",
"For this demonstration, we have already installed `asf_search` within a virtual environment through PyPi via the command:\n",
" \n",
" \n",
"```pip install asf_search```\n",
" \n",
"`asf_search` requires Python 3.6 or higher."
]
},
{
"cell_type": "markdown",
"id": "eb5a1320-5f3e-4125-8184-9a9b28fbf904",
"metadata": {
"tags": []
},
"source": [
"***\n",
"## Usage\n",
"Once installed, simply import `asf_search` as you would any other Python module:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "52144455-af61-4f36-b35b-57a04683ebb1",
"metadata": {
"tags": []
},
"outputs": [],
"source": [
"import asf_search as asf"
]
},
{
"cell_type": "markdown",
"id": "bc891ff7-608b-4bc2-ab7d-bc2468539722",
"metadata": {},
"source": [
"`asf_search` version numbers are based on [Semantic Versioning 2.0.0](https://semver.org/spec/v2.0.0.html):"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "0387c290-3d11-4ab2-b1e1-f94b761d245d",
"metadata": {},
"outputs": [],
"source": [
"asf.__version__"
]
},
{
"cell_type": "markdown",
"id": "6a8ada7a",
"metadata": {},
"source": [
"`asf_search` automatically reports search errors to ASF. To opt out, simply add the following line after importing the package:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "c8117da3",
"metadata": {},
"outputs": [],
"source": [
"asf.REPORT_ERRORS = False"
]
},
{
"cell_type": "markdown",
"id": "322a9186",
"metadata": {},
"source": [
"If you have any questions regarding automatic error reporting, email uso@asf.alaska.edu"
]
},
{
"cell_type": "markdown",
"id": "2bd9fb48-b2a8-4080-baab-89241cd1f8b1",
"metadata": {
"tags": []
},
"source": [
"***\n",
"\n",
"## Performing a Basic Search\n",
"[View this search in Vertex](https://search.asf.alaska.edu/#/?resultsLoaded=true&dataset=SENTINEL-1)\n",
"\n",
"For this basic example, we will specify two search parameters: a platform, and how many results we want to retrieve:\n",
"- Sentinel-1\n",
"- 5 results max\n",
" - (results are retrieved newest-first, so this will be the 5 newest products)."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "c4885370-3298-47ae-81c0-873ecfa47aba",
"metadata": {},
"outputs": [],
"source": [
"results = asf.search(platform=asf.PLATFORM.SENTINEL1, maxResults=5)"
]
},
{
"cell_type": "markdown",
"id": "630339c1-c1b6-46fd-b7de-3251553a61ca",
"metadata": {},
"source": [
"Alternatively, it may be useful to handle your search arguments as a dictionary:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "57f51960-dd3f-4531-8f0c-28f601d0b5e9",
"metadata": {},
"outputs": [],
"source": [
"opts = {\n",
" 'platform': asf.PLATFORM.SENTINEL1,\n",
" 'maxResults': 5\n",
"}\n",
"results = asf.search(**opts)"
]
},
{
"cell_type": "markdown",
"id": "cbe2e4ae",
"metadata": {},
"source": [
"If `search()` encounters an error while querying CMR, the search will halt, log the error, and return all results retrieved up until the error occurred. To check programmatically that results are complete, use `ASFSearchResults.raise_if_incomplete()` in a `try-except` statement:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "29333d4f",
"metadata": {},
"outputs": [],
"source": [
"from asf_search.exceptions import ASFSearchError\n",
"\n",
"try:\n",
" results.raise_if_incomplete()\n",
"except ASFSearchError as e:\n",
" pass #handle the error here"
]
},
{
"attachments": {},
"cell_type": "markdown",
"id": "1fc55d9f",
"metadata": {},
"source": [
"`search()` is itself a convenient wrapper around `search_generator()`, which queries CMR and yields `ASFSearchResults` page-by-page. If `search_generator()` completes without encountering an error, the last page will have `ASFSearchResults.searchComplete` set to `True`. Check the final page's `searchComplete` flag to verify the status of the results, or to mark your combined `ASFSearchResults` object as complete."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "0ac69ddf",
"metadata": {},
"outputs": [],
"source": [
"paged_results = asf.ASFSearchResults([])\n",
"\n",
"pages = asf.search_generator(**opts)\n",
"\n",
"for page in pages:\n",
" paged_results.extend(page)\n",
"\n",
" # The last page will be marked as complete once all results have returned successfully,\n",
" # keep track by updating ASFSearchResults manually if you wish to use raise_if_incomplete() \n",
" paged_results.searchComplete = page.searchComplete\n",
" paged_results.searchOptions = page.searchOptions\n",
"\n",
"\n",
"try:\n",
" paged_results.raise_if_incomplete()\n",
"except ASFSearchError as e:\n",
" pass #handle the error here"
]
},
{
"cell_type": "markdown",
"id": "7bd0e8d7-ba02-472b-8b91-413090238eda",
"metadata": {},
"source": [
"Note the use of `asf_search`-provided constants, as many editors support use of these through autocompletion. Categories of constants include:\n",
"- `BEAMMODE`\n",
"- `FLIGHT_DIRECTION`\n",
"- `INSTRUMENT`\n",
"- `PLATFORM`\n",
"- `POLARIZATION`\n",
"- `PRODUCT_TYPE`\n",
" \n"
]
},
{
"cell_type": "markdown",
"id": "92ac4b7c",
"metadata": {},
"source": [
"`asf_search` also provides the `search_count()` method, which will return the count of total results matching the passed search options. \n",
"For example passing the same `opts` object we used for our search above will return the current size of the `SENTINEL1` catalog."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "be53c20f",
"metadata": {},
"outputs": [],
"source": [
"count = asf.search_count(**opts)"
]
},
{
"cell_type": "markdown",
"id": "f506de89-f36c-44dd-a408-d4d6853bae67",
"metadata": {
"tags": []
},
"source": [
"***\n",
"## Working With Results\n",
"### `ASFSearchResults`\n",
"Search results are returned as an `ASFSearchResults` object, a subclass of `UserList` containing a list of `ASFProduct` objects. Each of these classes provides additional functionality to aid in working with the results and individual products:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "85157600-fa63-41e6-943a-9dfc4adc33da",
"metadata": {},
"outputs": [],
"source": [
"results"
]
},
{
"cell_type": "markdown",
"id": "f4b7a403-82e2-4c9d-872c-7a126841c654",
"metadata": {
"tags": []
},
"source": [
"### `ASFProduct`\n",
"`ASFProduct` provides a number of metadata fields, such as:\n",
"- Geographic coordinates\n",
" - Latitude/Longitude\n",
" - Shape type\n",
"- Scene and product metadata\n",
" - Path, frame\n",
" - Platform, beam, polarization\n",
" - File name, size, URL\n",
" - and many others\n",
" \n",
"Geographic coordinates are stored in the `geometry` attribute:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "4aafded2-662b-4ff8-b47c-d793834b7e0e",
"metadata": {},
"outputs": [],
"source": [
"results[0].geometry"
]
},
{
"cell_type": "markdown",
"id": "d66ecc39-57db-436e-b00c-efb86d903bf2",
"metadata": {
"tags": []
},
"source": [
"Other metadata is available through the `properties` attribute:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "bb928002-c003-4d13-864b-35d9e51a9862",
"metadata": {
"tags": []
},
"outputs": [],
"source": [
"results[0].properties"
]
},
{
"cell_type": "markdown",
"id": "5d9b4e27-3505-4303-bed9-feaa56a76358",
"metadata": {
"tags": []
},
"source": [
"### Output\n",
"The layout of the above data structure mirrors the geojson output format provided by ASF's SearchAPI, for a smooth transition to `asf_search`.\n",
" \n",
"In fact, the `ASFSearchResults.__str__()` method serializes results to geojson _identical_ to that of ASF's SearchAPI, allowing for a drop-in replacement for users of the SearchAPI, simply by explicitly or implicitly casting search results to a string:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "20a620fb-d219-489e-96f5-8ac3dd407a51",
"metadata": {
"scrolled": true,
"tags": []
},
"outputs": [],
"source": [
"print(results)"
]
},
{
"cell_type": "markdown",
"id": "4464dd9b-af8b-4a62-8281-87ae22e4b84d",
"metadata": {},
"source": [
"Additionally, individual `ASFProduct` objects provide geojson-based serialization, in the form of a geojson `Feature` snippet:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "f4fe28bf-3a16-4f1e-abce-d340005a815d",
"metadata": {
"scrolled": true,
"tags": []
},
"outputs": [],
"source": [
"print(results[0])"
]
},
{
"cell_type": "markdown",
"id": "b7dd770b",
"metadata": {},
"source": [
"`ASFSearchResults` also supports the following output formats:\n",
"- `csv`\n",
"- `jsonlite`\n",
"- `jsonlite2`\n",
"- `metalink`\n",
"- `kml`\n",
"\n",
"All formats are callable as methods on any `ASFSearchResults` object, producing output equivalent to ASF's SearchAPI.\n",
"\n",
"Below is an example of saving results to a csv file using the `ASFSearchResults.csv()` method."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "712d1b0b",
"metadata": {},
"outputs": [],
"source": [
"with open(\"search_results.csv\", \"w\") as f:\n",
" f.writelines(results.csv())"
]
},
{
"cell_type": "markdown",
"id": "880990be",
"metadata": {},
"source": [
"The output can also be previewed in the terminal:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "6fc75e9b",
"metadata": {},
"outputs": [],
"source": [
"print(*results.csv(), sep='')"
]
},
{
"attachments": {},
"cell_type": "markdown",
"id": "de64e149",
"metadata": {},
"source": [
"For more advanced usage, the output formats can be combined with `asf_search.search_generator()` to stream results as they are returned from CMR, page-by-page:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "4dc756bd",
"metadata": {},
"outputs": [],
"source": [
"# asf-search queries CMR with page sizes of 500, \n",
"# so setting maxResults=1000 means asf-search will have to query cmr twice, each time returning 500 products\n",
"large_results_generator = asf.search_generator(maxResults=1000, platform=asf.PLATFORM.SENTINEL1A)\n",
"\n",
"with open(\"search_results.metalink\", \"w\") as f:\n",
" f.writelines(asf.export.results_to_metalink(large_results_generator))"
]
},
{
"cell_type": "markdown",
"id": "cf042a01-674d-4518-9a05-067cd61fb51e",
"metadata": {},
"source": [
"## Summary\n",
"A complete example, showing how simple use of `asf_search` can be:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "04465e37-17ec-4648-ae63-7cb17b933bd9",
"metadata": {
"scrolled": true,
"tags": []
},
"outputs": [],
"source": [
"import asf_search as asf\n",
"\n",
"print(asf.search(platform='S1', maxResults=5))"
]
},
{
"cell_type": "markdown",
"id": "ffec1743-d84f-402e-a805-5e109b13869d",
"metadata": {},
"source": [
"***\n",
"Next: [Geographic Searches](./2-Geographic_Search.ipynb)"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.6"
},
"toc-autonumbering": false,
"toc-showcode": false,
"toc-showmarkdowntxt": false,
"toc-showtags": false
},
"nbformat": 4,
"nbformat_minor": 5
}
Discovery-asf_search-8.1.2/examples/2-Geographic_Search.ipynb 0000664 0000000 0000000 00000015240 14777330235 0024157 0 ustar 00root root 0000000 0000000 {
"cells": [
{
"cell_type": "markdown",
"id": "928e44f7-b11c-4392-bb4f-832ea77f3855",
"metadata": {
"tags": []
},
"source": [
"# asf_search - Geographic Search\n",
"`asf_search` provides several search-oriented functions. One of these functions is `geo_search()`. This function accepts an area of interest in the form of a WKT (Well Known Text), as well as a variety of search parameters by which to refine your search results.\n",
"***\n",
"## Before You Start\n",
"The steps outlined in this demonstration assume `asf_search` is available in your environment. For guidance on installing `asf_search`, [begin here](./1-Basic_Overview.ipynb#Before-You-Start)."
]
},
{
"cell_type": "markdown",
"id": "43588eb5-1dfc-4e82-8503-940bbbaffbb6",
"metadata": {
"tags": []
},
"source": [
"***\n",
"## Performing a Geographic Search\n",
"[View this search in Vertex](https://search.asf.alaska.edu/#/?zoom=5.373&center=-153.186,53.964&polygon=POLYGON((-152.81%2058.49,-154.9%2057.49,-155.08%2056.3,-153.82%2056.34,-151.99%2057.3,-151.43%2058.19,-152.81%2058.49))&dataset=ALOS&start=2010-01-01T06:00:00Z&end=2010-02-02T05:59:59Z&resultsLoaded=true)\n",
"\n",
"To begin, we will need to craft a suitable WKT. ASF's [Vertex](https://search.asf.alaska.edu) can be helpful in this regard, as it allows you to draw on a map, or import a geospatial file such as a shapefile or geojson, after which a WKT string can be copied and used elsewhere.\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "52144455-af61-4f36-b35b-57a04683ebb1",
"metadata": {
"tags": []
},
"outputs": [],
"source": [
"import asf_search as asf\n",
"\n",
"aoi = 'POLYGON((-152.81 58.49,-154.90 57.49,-155.08 56.30,-153.82 56.34,-151.99 57.30,-151.43 58.19,-152.81 58.49))'"
]
},
{
"cell_type": "markdown",
"id": "43f1d561-c88f-435a-ac64-d691c8fb7e76",
"metadata": {},
"source": [
"Next, we will assemble any additional search parameters, such as platform, date range, and the type of product in which we are interested. All options to `geo_search()` can be specified using _kwargs_, which also allows them to be handled using a dictionary:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "4dd4ec77-6e5f-4d5e-8253-719a4152285e",
"metadata": {},
"outputs": [],
"source": [
"opts = {\n",
" 'platform': asf.PLATFORM.ALOS,\n",
" 'start': '2010-01-01T00:00:00Z',\n",
" 'end': '2010-02-01T23:59:59Z'\n",
"}"
]
},
{
"cell_type": "markdown",
"id": "084583dc-9f5c-4887-9cae-1f10d7e2cedf",
"metadata": {},
"source": [
"Once all the search parameters are ready, we can run the query and retrieve the results:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "8e996159-0e4b-4cdd-bf22-f75a48163b4d",
"metadata": {},
"outputs": [],
"source": [
"results = asf.geo_search(intersectsWith=aoi, **opts)\n",
"\n",
"print(f'{len(results)} results found')"
]
},
{
"cell_type": "markdown",
"id": "c6be9b2a-a0c6-4ef3-a118-98c99f82f229",
"metadata": {},
"source": [
"At this point, your search is complete and you can perform any of the usual workflows on your search results."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "42d0123c-7f84-4356-b0b9-23359f9d106f",
"metadata": {
"scrolled": true,
"tags": []
},
"outputs": [],
"source": [
"print(results)"
]
},
{
"cell_type": "markdown",
"id": "6c4900f9-e610-4136-8c90-1fbdaee93bdf",
"metadata": {},
"source": [
"***\n",
"## Search Based on Scene's Footprint"
]
},
{
"cell_type": "markdown",
"id": "f0572135-4d19-4a2f-9a2a-7b1db2411b4b",
"metadata": {},
"source": [
"As a secondary example, suppose you want to perform another geographic search to find results that overlap with one of your previous results. Using `shapely`, which is already installed as a dependency of `asf_search` itself, we can quickly produce a WKT based on a result and then perform a secondary search:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "d04567f4-1223-475f-9b2a-a59ff7ce1486",
"metadata": {},
"outputs": [],
"source": [
"from shapely.geometry import shape\n",
"\n",
"new_aoi = shape(results[0].geometry).wkt\n",
"\n",
"refined_results = asf.geo_search(intersectsWith=new_aoi, **opts)\n",
"\n",
"print(f'{len(refined_results)} results found')"
]
},
{
"cell_type": "markdown",
"id": "8bd15b6e-355e-4b96-858b-cd2085ace0e8",
"metadata": {},
"source": [
"***\n",
"## Search Based on Scene's Centroid\n",
"Alternately, suppose we would like to search for similar scenes that overlap another scene's centroid. `ASFProduct` provides a `centroid()` convenience function, which returns a `shapely.Point` object for this purpose:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "d6dd53c7-fcfd-4369-9859-a71dbcbef12b",
"metadata": {},
"outputs": [],
"source": [
"centroid = results[0].centroid().wkt\n",
"\n",
"centroid_results = asf.geo_search(intersectsWith=centroid, **opts)\n",
"\n",
"print(f'{len(centroid_results)} results found')"
]
},
{
"cell_type": "markdown",
"id": "8aed3b7c-a557-4cbb-878e-aa9fe8330646",
"metadata": {},
"source": [
"***\n",
"## Summary\n",
"A complete, basic example of using `geo_search()`:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "ceea6e9a-b1ea-417f-8daa-98bd17c9730c",
"metadata": {
"scrolled": true,
"tags": []
},
"outputs": [],
"source": [
"import asf_search as asf\n",
"\n",
"print(\n",
" asf.geo_search(\n",
" intersectsWith='POLYGON((-152.81 58.49,-154.90 57.49,-155.08 56.30,-153.82 56.34,-151.99 57.30,-151.43 58.19,-152.81 58.49))',\n",
" platform=asf.PLATFORM.ALOS,\n",
" start='2010-01-01',\n",
" end='2010-02-01'))"
]
},
{
"cell_type": "markdown",
"id": "44497649-d720-45dc-a2f7-b6a737b635bb",
"metadata": {},
"source": [
"***\n",
"Next: [Granule Searches](./3-Granule_Search.ipynb)"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.8.9"
},
"toc-autonumbering": false
},
"nbformat": 4,
"nbformat_minor": 5
}
Discovery-asf_search-8.1.2/examples/3-Granule_Search.ipynb 0000664 0000000 0000000 00000025061 14777330235 0023507 0 ustar 00root root 0000000 0000000 {
"cells": [
{
"cell_type": "markdown",
"id": "928e44f7-b11c-4392-bb4f-832ea77f3855",
"metadata": {
"tags": []
},
"source": [
"# asf_search - Granule Search\n",
"`asf_search` provides several search-oriented functions. One of these functions is `granule_search()`. This function accepts a granule name or names (also known as a scene name) in order to find search results. A similar function, `product_search()`, is also covered here.\n",
"***\n",
"## Before You Start\n",
"The steps outlined in this demonstration assume `asf_search` is available in your environment. For guidance on installing `asf_search`, [begin here](./1-Basic_Overview.ipynb#Before-You-Start)."
]
},
{
"cell_type": "markdown",
"id": "43588eb5-1dfc-4e82-8503-940bbbaffbb6",
"metadata": {
"tags": []
},
"source": [
"***\n",
"## Performing a Granule Search\n",
"[View this search in Vertex](https://search.asf.alaska.edu/#/?zoom=7.370&center=-134.548,55.753&resultsLoaded=true&searchType=List%20Search&searchList=S1B_IW_GRDH_1SDV_20190822T151551_20190822T151616_017700_0214D2_6084,S1B_IW_GRDH_1SDV_20190810T151550_20190810T151615_017525_020F5A_2F74,S1B_IW_GRDH_1SDV_20190729T151549_20190729T151614_017350_020A0A_C3E2,S1B_IW_GRDH_1SDV_20190717T151548_20190717T151613_017175_0204EA_4181,S1B_IW_GRDH_1SDV_20190705T151548_20190705T151613_017000_01FFC4_24EC,S1B_IW_GRDH_1SDV_20190623T151547_20190623T151612_016825_01FA95_14B9,S1B_IW_GRDH_1SDV_20190611T151546_20190611T151611_016650_01F566_D7CE,S1B_IW_GRDH_1SDV_20190530T151546_20190530T151611_016475_01F02E_BF97,S1B_IW_GRDH_1SDV_20190518T151545_20190518T151610_016300_01EAD8_9308,S1B_IW_GRDH_1SDV_20190506T151544_20190506T151609_016125_01E56C_1D67)\n",
"\n",
"To perform a granule list search, you need a list of granule names. This could come from previous `asf_search` results, ASF's [Vertex](https://search.asf.alaska.edu) app, a colleague, or your favorite spreadsheet.\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "52144455-af61-4f36-b35b-57a04683ebb1",
"metadata": {
"tags": []
},
"outputs": [],
"source": [
"import asf_search as asf\n",
"\n",
"granule_list = [\n",
" 'S1B_IW_GRDH_1SDV_20190822T151551_20190822T151616_017700_0214D2_6084',\n",
" 'S1B_IW_GRDH_1SDV_20190810T151550_20190810T151615_017525_020F5A_2F74',\n",
" 'S1B_IW_GRDH_1SDV_20190729T151549_20190729T151614_017350_020A0A_C3E2',\n",
" 'S1B_IW_GRDH_1SDV_20190717T151548_20190717T151613_017175_0204EA_4181',\n",
" 'S1B_IW_GRDH_1SDV_20190705T151548_20190705T151613_017000_01FFC4_24EC',\n",
" 'S1B_IW_GRDH_1SDV_20190623T151547_20190623T151612_016825_01FA95_14B9',\n",
" 'S1B_IW_GRDH_1SDV_20190611T151546_20190611T151611_016650_01F566_D7CE',\n",
" 'S1B_IW_GRDH_1SDV_20190530T151546_20190530T151611_016475_01F02E_BF97',\n",
" 'S1B_IW_GRDH_1SDV_20190518T151545_20190518T151610_016300_01EAD8_9308',\n",
" 'S1B_IW_GRDH_1SDV_20190506T151544_20190506T151609_016125_01E56C_1D67'\n",
"]"
]
},
{
"cell_type": "markdown",
"id": "43f1d561-c88f-435a-ac64-d691c8fb7e76",
"metadata": {},
"source": [
"`granule_search()` does not make use of any other search filters, so at this point we can simply run the search and retrieve our results:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "4dd4ec77-6e5f-4d5e-8253-719a4152285e",
"metadata": {
"scrolled": true,
"tags": []
},
"outputs": [],
"source": [
"results = asf.granule_search(granule_list)\n",
"\n",
"print(results)"
]
},
{
"cell_type": "markdown",
"id": "084583dc-9f5c-4887-9cae-1f10d7e2cedf",
"metadata": {},
"source": [
"For convenience, `granule_search()` will also accept kwargs for consistency with other search functions:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "8e996159-0e4b-4cdd-bf22-f75a48163b4d",
"metadata": {},
"outputs": [],
"source": [
"results = asf.granule_search(granule_list=granule_list)\n",
"\n",
"print(f'{len(results)} results found')"
]
},
{
"cell_type": "markdown",
"id": "c6be9b2a-a0c6-4ef3-a118-98c99f82f229",
"metadata": {},
"source": [
"A single granule name is allowed with either above approach as well:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "42d0123c-7f84-4356-b0b9-23359f9d106f",
"metadata": {
"tags": []
},
"outputs": [],
"source": [
"results = asf.granule_search('S1B_IW_GRDH_1SDV_20190822T151551_20190822T151616_017700_0214D2_6084')\n",
"\n",
"print(f'{len(results)} results found')"
]
},
{
"cell_type": "markdown",
"id": "c4d2c680-080f-4f7d-9cbc-f08519a91635",
"metadata": {
"tags": []
},
"source": [
"***\n",
"## Performing a Product Search\n",
"[View this search in Vertex](https://search.asf.alaska.edu/#/?zoom=7.370&center=-134.548,55.753&resultsLoaded=true&searchType=List%20Search&searchList=S1A_IW_GRDH_1SDV_20190809T001336_20190809T001401_028485_033839_78A1-GRD_HD,S1A_IW_GRDH_1SDV_20150322T000454_20150322T000524_005137_006794_56E3-GRD_HD,S1A_IW_GRDH_1SDV_20160121T001256_20160121T001321_009585_00DF26_5B84-GRD_HD,S1A_IW_GRDH_1SDV_20151117T000448_20151117T000513_008637_00C455_3DC2-GRD_HD&listSearchType=Product)\n",
"\n",
"To perform a product list search, you need a list of product names. Where granule names refer to a scene that may include multiple products, product names are guaranteed-unique identifiers for specific products in the archive, and as such they are always 1:1.\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "90d3e41e-0a15-40cb-9361-7023f904a423",
"metadata": {
"tags": []
},
"outputs": [],
"source": [
"import asf_search as asf\n",
"\n",
"product_list = [\n",
" 'S1A_IW_GRDH_1SDV_20190809T001336_20190809T001401_028485_033839_78A1-GRD_HD',\n",
" 'S1A_IW_GRDH_1SDV_20150322T000454_20150322T000524_005137_006794_56E3-GRD_HD',\n",
" 'S1A_IW_GRDH_1SDV_20160121T001256_20160121T001321_009585_00DF26_5B84-GRD_HD',\n",
" 'S1A_IW_GRDH_1SDV_20151117T000448_20151117T000513_008637_00C455_3DC2-GRD_HD'\n",
"]"
]
},
{
"cell_type": "markdown",
"id": "296e5054-28c2-4f38-957e-fcacefe122e6",
"metadata": {},
"source": [
"`product_search()` does not make use of any other search filters, so we have everything we need to run the search:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "1f03d2e7-766c-4cbb-a53a-a261220dddf5",
"metadata": {
"scrolled": true,
"tags": []
},
"outputs": [],
"source": [
"results = asf.product_search(product_list)\n",
"\n",
"print(results)"
]
},
{
"cell_type": "markdown",
"id": "cd4ca5c5-1e02-4f31-8437-bd8ad9b59e9c",
"metadata": {},
"source": [
"Like `granule_search()`, `product_search()` will also accept kwargs for consistency with other search functions:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "04d07e01-a4da-4b79-a8ce-8e7ff3ea3675",
"metadata": {},
"outputs": [],
"source": [
"results = asf.product_search(product_list=product_list)\n",
"\n",
"print(f'{len(results)} results found')"
]
},
{
"cell_type": "markdown",
"id": "79223bfd-f6a8-4d5c-9a45-9ddae31d2879",
"metadata": {},
"source": [
"A single product name is allowed with either above approach as well:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "6beb628c-b409-45c4-a227-25904bd1afe8",
"metadata": {
"tags": []
},
"outputs": [],
"source": [
"results = asf.product_search('S1A_IW_GRDH_1SDV_20190809T001336_20190809T001401_028485_033839_78A1-GRD_HD')\n",
"\n",
"print(f'{len(results)} results found')"
]
},
{
"cell_type": "markdown",
"id": "007ba2a5-3170-41d4-94ac-a5fbb99031af",
"metadata": {},
"source": [
"## Best Practices\n",
" \n",
"It is generally preferred to \"collapse\" many small queries into fewer large queries. That is, it may be easy and logically reasonable to run a number of small `granule_search()` queries via a `foreach` loop over each of the items in the original granule list. _Please do not do this._ It consumes a lot of resources at both ASF and at CMR. Instead, combine your small queries into a single large query where possible, as shown in the original example, and then post-process the results locally. `granule_search()` can support very large lists, and will break them up internally when needed.\n",
" \n",
"This approach of using a single query to yield all desired results, then post-processing those results locally, rather than issuing multiple requests to collect small parts of the same overall results, applies to all usage of `asf_search`."
]
},
{
"cell_type": "markdown",
"id": "8aed3b7c-a557-4cbb-878e-aa9fe8330646",
"metadata": {},
"source": [
"***\n",
"## Summary\n",
"A complete, basic example of using `granule_search()`:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "ceea6e9a-b1ea-417f-8daa-98bd17c9730c",
"metadata": {
"scrolled": true,
"tags": []
},
"outputs": [],
"source": [
"import asf_search as asf\n",
"\n",
"print(\n",
" asf.granule_search(['S1B_IW_GRDH_1SDV_20190822T151551_20190822T151616_017700_0214D2_6084']))"
]
},
{
"cell_type": "markdown",
"id": "d866ae37-f33e-4b81-8f2c-81f7c22a8fa1",
"metadata": {},
"source": [
"A complete, basic example of using `product_search()`:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "9594fdcd-547d-4fc5-9c9c-72d9fc215861",
"metadata": {
"scrolled": true,
"tags": []
},
"outputs": [],
"source": [
"import asf_search as asf\n",
"\n",
"print(\n",
" asf.product_search(['S1A_IW_GRDH_1SDV_20190809T001336_20190809T001401_028485_033839_78A1-GRD_HD']))"
]
},
{
"cell_type": "markdown",
"id": "44497649-d720-45dc-a2f7-b6a737b635bb",
"metadata": {},
"source": [
"***\n",
"Next: [Baseline Searches](./4-Baseline_Search.ipynb)"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.8.9"
},
"toc-autonumbering": false
},
"nbformat": 4,
"nbformat_minor": 5
}
Discovery-asf_search-8.1.2/examples/4-Baseline_Search.ipynb
{
"cells": [
{
"cell_type": "markdown",
"id": "928e44f7-b11c-4392-bb4f-832ea77f3855",
"metadata": {
"tags": []
},
"source": [
"# asf_search - Baseline Search\n",
"`asf_search` provides several search-oriented functions, including ways to find baseline stacks, such as for InSAR work. There are two main approaches to creating a baseline stack, detailed below.\n",
"***\n",
"## Before You Start\n",
"The steps outlined in this demonstration assume `asf_search` is available in your environment. For guidance on installing `asf_search`, [begin here](./1-Basic_Overview.ipynb#Before-You-Start)."
]
},
{
"cell_type": "markdown",
"id": "43588eb5-1dfc-4e82-8503-940bbbaffbb6",
"metadata": {
"tags": []
},
"source": [
"***\n",
"## Building a Stack from an `ASFProduct`\n",
"[View this search in Vertex](https://search.asf.alaska.edu/#/?resultsLoaded=true&zoom=7.220¢er=-73.969,39.213&searchType=Baseline%20Search&master=S1A_IW_SLC__1SDV_20220215T225119_20220215T225146_041930_04FE2E_9252&temporal=-2533to0&perp=-153to98)\n",
"\n",
"The most typical use case when building a baseline stack using `asf_search` is that the user has performed a search according to some criteria, and then wishes to build a baseline stack or stacks based on those results. In order to build a stack from an ASFProduct, the first required step is to perform some other sort of search. For a trivial example, we will simply perform a product search:\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "52144455-af61-4f36-b35b-57a04683ebb1",
"metadata": {
"scrolled": true,
"tags": []
},
"outputs": [],
"source": [
"import asf_search as asf\n",
"\n",
"results = asf.product_search('S1A_IW_SLC__1SDV_20220215T225119_20220215T225146_041930_04FE2E_9252-SLC')\n",
"\n",
"reference = results[0]\n",
"\n",
"print(reference)"
]
},
{
"cell_type": "markdown",
"id": "43f1d561-c88f-435a-ac64-d691c8fb7e76",
"metadata": {},
"source": [
"Given those results, we can simply call a specific `ASFProduct`'s `stack()` method, which internally builds the required parameters by which to identify products in the stack, and then performs a query based on those parameters, returning the stack in the form of another `ASFSearchResults` object. Here we will create a stack from the first (and in this case, only) result from our initial search:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "4dd4ec77-6e5f-4d5e-8253-719a4152285e",
"metadata": {},
"outputs": [],
"source": [
"stack = reference.stack()\n",
"\n",
"print(f'{len(stack)} products found in stack')"
]
},
{
"cell_type": "markdown",
"id": "084583dc-9f5c-4887-9cae-1f10d7e2cedf",
"metadata": {},
"source": [
"As stated, these results are a standard `ASFSearchResults` object containing a list of `ASFProduct` objects, each with all the usual functionality.\n",
" \n",
"There is one addition to the `ASFProduct` objects, however: a new field in the `properties` dictionary. This field describes the temporal offset in days from the reference scene used to build the stack. The reference scene is included in the stack and will always have a temporal baseline of 0. This additional field is included when serializing to geojson."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "8e996159-0e4b-4cdd-bf22-f75a48163b4d",
"metadata": {},
"outputs": [],
"source": [
"stack[0].properties"
]
},
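{
"cell_type": "markdown",
"id": "4a5b6c7d-2222-4e8f-9a0b-1c2d3e4f5a01",
"metadata": {},
"source": [
"The `temporalBaseline` value makes it easy to trim a stack locally. A minimal sketch, assuming the `stack` from above is still in scope:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "4a5b6c7d-2222-4e8f-9a0b-1c2d3e4f5a02",
"metadata": {},
"outputs": [],
"source": [
"# temporalBaseline is measured in days; the reference scene itself is 0\n",
"nearby = [product for product in stack if abs(product.properties['temporalBaseline']) <= 365]\n",
"\n",
"print(f'{len(nearby)} products within one year of the reference')"
]
},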
{
"cell_type": "markdown",
"id": "c6be9b2a-a0c6-4ef3-a118-98c99f82f229",
"metadata": {},
"source": [
"__Note:__ At this time, perpendicular baseline values are not available (but they're coming!)"
]
},
{
"cell_type": "markdown",
"id": "6c4900f9-e610-4136-8c90-1fbdaee93bdf",
"metadata": {},
"source": [
"***\n",
"## Building a Stack from a Product ID\n",
"Sometimes, you may not have arrived at your product by way of `asf_search`, and all you have is the product ID. In this case, you can make use of `asf_search.stack_from_id()` to build a stack. The results returned using this approach are identical to those above."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "d04567f4-1223-475f-9b2a-a59ff7ce1486",
"metadata": {},
"outputs": [],
"source": [
"product_id = 'S1A_IW_SLC__1SDV_20220215T225119_20220215T225146_041930_04FE2E_9252-SLC'\n",
"\n",
"stack = asf.stack_from_id(product_id)\n",
"\n",
"print(f'{len(stack)} products found in stack')"
]
},
{
"cell_type": "markdown",
"id": "52e62364-dce3-438c-ad0d-65a5f945054f",
"metadata": {},
"source": [
"***\n",
"## Best Practices\n",
"\n",
"`stack_from_id()` is provided largely as a convenience: internally, it performs a `product_search()` using the provided ID, and then returns the results of that product's `stack()` method. For this reason, it is recommend that if you have an `ASFProduct` object at hand, you use that to build your stack directly, as it removes the need for the additional search action.\n",
"\n",
"For other cases where you have parameters describing your reference scene but not an `ASFProduct` object itself, it is appropriate to use one of the various search features available with `asf_search` to obtain an `ASFProduct` first."
]
},
{
"cell_type": "markdown",
"id": "8aed3b7c-a557-4cbb-878e-aa9fe8330646",
"metadata": {},
"source": [
"***\n",
"## Summary\n",
"A complete, basic example of using `ASFProduct.stack()`:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "ceea6e9a-b1ea-417f-8daa-98bd17c9730c",
"metadata": {
"scrolled": true,
"tags": []
},
"outputs": [],
"source": [
"import asf_search as asf\n",
"\n",
"reference = asf.product_search('S1A_IW_SLC__1SDV_20220215T225119_20220215T225146_041930_04FE2E_9252-SLC')[0]\n",
"\n",
"print(reference.stack())"
]
},
{
"cell_type": "markdown",
"id": "e35d5b08-7c23-4102-946a-693f396ceadf",
"metadata": {},
"source": [
"A complete, basic example of using `stack_from_id()`:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "59412d12-1264-4c36-bb46-6a733b8f5acc",
"metadata": {
"scrolled": true,
"tags": []
},
"outputs": [],
"source": [
"import asf_search as asf\n",
"\n",
"print(asf.stack_from_id('S1A_IW_SLC__1SDV_20220215T225119_20220215T225146_041930_04FE2E_9252-SLC'))"
]
},
{
"cell_type": "markdown",
"id": "44497649-d720-45dc-a2f7-b6a737b635bb",
"metadata": {},
"source": [
"***\n",
"Next: [Downloading Data](./5-Download.ipynb)"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.8.9"
},
"toc-autonumbering": false
},
"nbformat": 4,
"nbformat_minor": 5
}
Discovery-asf_search-8.1.2/examples/5-Download.ipynb
{
"cells": [
{
"cell_type": "markdown",
"id": "928e44f7-b11c-4392-bb4f-832ea77f3855",
"metadata": {
"tags": []
},
"source": [
"# asf_search - Downloading Data\n",
"`asf_search` provides many ways to find data, but equally important is the ability to download that data. Fortunately, `asf_search` provides a simple interface through which to download data, using a variety of authentication methods.\n",
"***\n",
"## Before You Start\n",
"The steps outlined in this demonstration assume `asf_search` is available in your environment. For guidance on installing `asf_search`, [begin here](./1-Basic_Overview.ipynb#Before-You-Start).\n",
"\n",
"Additionally, this section expects you to have an [Earthdata Login](https://urs.earthdata.nasa.gov/) account with the appropriate applications authorized, EULAs signed, and profile fields set. The easiest way to check that your EDL account is in order is to simply go to [Vertex](https://search.asf.alaska.edu) and download a product.\n",
"\n",
"Lastly, the examples in this notebook assume a few directories exist, namely `./downloads`, `./downloads1`, `./downloads2`, and `./downloads3`. You can create them yourself, or run the following code block:"
]
},
{
"cell_type": "code",
"execution_count": 2,
"id": "db06fa80-4ac3-40b5-9787-256b422d49e6",
"metadata": {
"tags": []
},
"outputs": [],
"source": [
"from pathlib import Path\n",
"dirs = ['downloads', 'downloads1', 'downloads2', 'downloads3']\n",
"for d in dirs:\n",
" Path(d).mkdir(exist_ok=True)"
]
},
{
"cell_type": "markdown",
"id": "43588eb5-1dfc-4e82-8503-940bbbaffbb6",
"metadata": {
"tags": []
},
"source": [
"***\n",
"## ASFSession\n",
"\n",
"Because downloading any product in the ASF archive requires authentication, `asf_search` provides the `ASFSession` class, a subclass of `requests.Session` with a few specific methods added to make authentication straightforward.\n",
"\n",
"Using .netrc credentials is the preferred method for authentication. For more information, see the [session authentication documentation](https://docs.asf.alaska.edu/asf_search/downloading/#session-authentication)\n",
"\n",
"A new, unauthenticated session can be created, although the authentication methods listed below allow chaining for direct creation of an authenticated session."
]
},
{
"cell_type": "code",
"execution_count": 1,
"id": "38a2c671-0789-4e7c-b758-5b48745b2877",
"metadata": {},
"outputs": [],
"source": [
"import asf_search as asf\n",
"\n",
"session = asf.ASFSession()"
]
},
{
"cell_type": "markdown",
"id": "118f2976-fae9-481d-b8bf-6d35927f685a",
"metadata": {},
"source": [
"### `auth_with_creds()`\n",
"This authentication method accepts a username and password and establishes an authentication session with EDL and ASF."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "f4ea3b83-58cb-4baa-beb1-5ae272466ef1",
"metadata": {},
"outputs": [],
"source": [
"import getpass\n",
"username = input('Username:')\n",
"password = getpass.getpass('Password:')\n",
"\n",
"try:\n",
" user_pass_session = asf.ASFSession().auth_with_creds(username, password)\n",
"except asf.ASFAuthenticationError as e:\n",
" print(f'Auth failed: {e}')\n",
"else:\n",
" print('Success!')"
]
},
{
"cell_type": "markdown",
"id": "9ab80cec-237a-4517-bda6-5ca888e58ab8",
"metadata": {},
"source": [
"### `auth_with_token()`\n",
"This authentication method accepts an EDL Token which is then included as part of an `Authorization: Bearer` header on any downloads using this session. To generate an EDL Token, [sign in to EDL](https://urs.earthdata.nasa.gov/home), select the \"Generate Token\" tab, and then click the green \"Generate Token\" button. The token can then be copied and used below.\n",
" \n",
"__Note:__ While it is extremely convenient, not all datapool hosts are compatible with this authentication method yet."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "3f1e9023-98ba-46c2-911b-4aa043947f3b",
"metadata": {},
"outputs": [],
"source": [
"import getpass\n",
"token = getpass.getpass('EDL Token:')\n",
"\n",
"token_session = asf.ASFSession().auth_with_token(token)"
]
},
{
"cell_type": "markdown",
"id": "4c0551b8-5319-437c-97cd-c7dee8d7c17e",
"metadata": {},
"source": [
"### `auth_with_cookiejar()`\n",
"This method accepts an `http.cookiejar` compatible object, such as a previously authenticated session stored for later re-use.\n",
"\n",
"For this demonstration, we will make use of the cookiejar from one of the previously authenticated sessions above:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "fcedf4b8-f69f-4472-a96a-01a4c4e362da",
"metadata": {},
"outputs": [],
"source": [
"cookiejar = user_pass_session.cookies"
]
},
{
"cell_type": "markdown",
"id": "4a7466f1-4dd3-45cf-b47b-ec46d2c884e8",
"metadata": {},
"source": [
"In practice, this cookiejar might be saved to a file and loaded again later. At that point, a new ASFSession can be instantiated using the cookiejar. While this method is not required to reload the session, it can simplify exception handling and EDL/ASF-specific auth processes, and allows `ASFSession` to be used consistently in all cases:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "d295ae06-d933-49a7-ab5c-a1e824ef239c",
"metadata": {},
"outputs": [],
"source": [
"cookiejar_session = asf.ASFSession().auth_with_cookiejar(cookiejar)"
]
},
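{
"cell_type": "markdown",
"id": "5b6c7d8e-3333-4f90-a1b2-2d3e4f5a6b01",
"metadata": {},
"source": [
"As a sketch of that save/reload workflow (assuming write access to the working directory), the cookiejar can be serialized with `pickle` and restored later:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "5b6c7d8e-3333-4f90-a1b2-2d3e4f5a6b02",
"metadata": {},
"outputs": [],
"source": [
"import pickle\n",
"from pathlib import Path\n",
"\n",
"# save the cookiejar for later re-use\n",
"Path('cookiejar.pkl').write_bytes(pickle.dumps(cookiejar))\n",
"\n",
"# later: load it and build a new authenticated session from it\n",
"restored = pickle.loads(Path('cookiejar.pkl').read_bytes())\n",
"restored_session = asf.ASFSession().auth_with_cookiejar(restored)"
]
},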
{
"cell_type": "markdown",
"id": "0bbfdce0-2a90-4dc0-9ca4-a8096be4f3bd",
"metadata": {},
"source": [
"***\n",
"## Downloading\n",
"[View this search in Vertex](https://search.asf.alaska.edu/#/?dataset=UAVSAR&productTypes=METADATA&resultsLoaded=true&zoom=8.090¢er=-90.488,28.359&polygon=POLYGON((-91.97%2028.78,-88.85%2028.78,-88.85%2030.31,-91.97%2030.31,-91.97%2028.78)))\n",
" \n",
"With authentication handled, we can now begin downloading products. First, we will need some search results to work with:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "a69116c4-7783-4648-bdbe-ee9a3966b059",
"metadata": {},
"outputs": [],
"source": [
"results = asf.geo_search(\n",
" intersectsWith='POLYGON((-91.97 28.78,-88.85 28.78,-88.85 30.31,-91.97 30.31,-91.97 28.78))',\n",
" platform=asf.PLATFORM.UAVSAR,\n",
" processingLevel=asf.PRODUCT_TYPE.METADATA,\n",
" maxResults=250)\n",
"\n",
"print(f'{len(results)} results found')"
]
},
{
"cell_type": "markdown",
"id": "6c4900f9-e610-4136-8c90-1fbdaee93bdf",
"metadata": {},
"source": [
"***\n",
"## Downloading Single Products\n",
"To download a single `ASFProduct`, simply call its `download()` method, passing in a previously-authenticated session, a path, and optionally a filename. If no filename is provided, the default is to use the filename of the product itself, as described in `properties['fileName']`."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "4afbac1b-df3d-4b8a-871d-df1f80b5f333",
"metadata": {},
"outputs": [],
"source": [
"from os import listdir\n",
"\n",
"results[0].download(path='./downloads1', session=user_pass_session)\n",
"\n",
"listdir('./downloads1')"
]
},
{
"cell_type": "markdown",
"id": "42506e17",
"metadata": {},
"source": [
"Some results may be stored as zip files. To download only part of a single `ASFProduct`'s zip, call its `remotezip()` method, passing in a previously-authenticated session. It returns a `RemoteZip` object, which provides functionality to download parts of the `ASFProduct`'s zip archive. Below is an example of using a `RemoteZip` object to download all .tiff files from a single product.\n",
"\n",
"NOTE: the `remotezip()` method requires installing the asf-search module's extra dependencies, in particular the `remotezip` package. Extra dependencies can be installed via pip like so:\n",
"\n",
"``` bash\n",
"python3 -m pip install asf-search[extras]\n",
"```"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "833d2a7e",
"metadata": {},
"outputs": [],
"source": [
"results_with_zips = asf.search(platform=asf.PLATFORM.SENTINEL1, processingLevel=asf.PRODUCT_TYPE.GRD_HD, maxResults=250)\n",
"\n",
"with results_with_zips[0].remotezip(session=user_pass_session) as z:\n",
" file_paths = [file.filename for file in z.filelist if file.filename.endswith('.tiff')]\n",
"\n",
" print(f'found {len(file_paths)} tiff files in zip')\n",
"\n",
" for file_path in file_paths:\n",
" z.extract(file_path, path='./downloads1')\n",
"\n",
"listdir('./downloads1')"
]
},
{
"cell_type": "markdown",
"id": "5684f338",
"metadata": {},
"source": [
"For more information on remotezip functionality, see https://github.com/gtsystem/python-remotezip"
]
},
{
"cell_type": "markdown",
"id": "8bd15b6e-355e-4b96-858b-cd2085ace0e8",
"metadata": {
"tags": []
},
"source": [
"***\n",
"## Downloading Multiple Products\n",
"More often than not, we want to download an entire set of search results rather than just a single product. `ASFSearchResults` provides this functionality similarly to `ASFProduct` via the identically-named `download()` method, albeit with two key differences: filenames always use the default behavior, and downloads can occur in parallel. If a particular file already exists, a `UserWarning` will be emitted, and the file will be skipped."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "232f04cc-88e6-4a23-ac78-9f2d98d0760a",
"metadata": {},
"outputs": [],
"source": [
"results[0:10].download(path='./downloads1', session=user_pass_session)\n",
"listdir('./downloads1')"
]
},
{
"cell_type": "markdown",
"id": "84a28894-3bae-4a32-a981-8242dd2b5b19",
"metadata": {},
"source": [
"While the above example downloads each file in sequence by default, it is often more performant to download multiple files in parallel. With that in mind, `ASFSearchResults.download()` allows setting a maximum number of downloads to run in parallel:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "89b64286-c961-4b95-b67d-427abc30206e",
"metadata": {
"scrolled": true,
"tags": []
},
"outputs": [],
"source": [
"results.download(path='./downloads2', session=user_pass_session, processes=50)\n",
"\n",
"listdir('./downloads2')"
]
},
{
"cell_type": "markdown",
"id": "a3260757-d28d-4a2c-aaf7-5504ef37fb85",
"metadata": {},
"source": [
"***\n",
"## Downloading Arbitrary URLs\n",
"Lastly, it may occur that you have a list of product URLs you wish to download, but have not arrived at that list through `asf_search`. Perhaps you have a service in the cloud and it's convenient to just copy/paste a list of URLs from some external process. In that case, `asf_search` exposes its download functionality more directly, through `download_urls()`. This function takes a list of arbitrary URLs, a path, an authenticated session, and optionally a number of downloads to run in parallel:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "2e2626a6-14ba-4352-862a-b7a8fea05cc7",
"metadata": {},
"outputs": [],
"source": [
"urls = [\n",
" 'https://datapool.asf.alaska.edu/METADATA/UA/aleutn_06005_09051_003_090723_L090_CX_01.ann',\n",
" 'https://datapool.asf.alaska.edu/METADATA/UA/aleutn_06004_09051_004_090723_L090_CX_01.ann',\n",
" 'https://datapool.asf.alaska.edu/METADATA/UA/aleutn_04701_09051_005_090723_L090_CX_01.ann',\n",
" 'https://datapool.asf.alaska.edu/METADATA/UA/aleutn_23301_09050_001_090722_L090_CX_01.ann',\n",
" 'https://datapool.asf.alaska.edu/METADATA/UA/aleutn_19802_11054_001_110802_L090_CX_01.ann']\n",
"\n",
"asf.download_urls(urls=urls, path='./downloads3', session=user_pass_session, processes=5)\n",
"\n",
"listdir('./downloads3')"
]
},
{
"cell_type": "markdown",
"id": "30760f6f",
"metadata": {},
"source": [
"***\n",
"## S3 URIs\n",
"Some products have S3 URIs available (SENTINEL-1, OPERA, and NISAR)."
]
},
{
"cell_type": "code",
"execution_count": 2,
"id": "dd4a81ed",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"['s3://asf-cumulus-prod-opera-browse/OPERA_L2_CSLC-S1/OPERA_L2_CSLC-S1_T113-241605-IW3_20240610T110743Z_20240611T073356Z_S1A_VV_v1.1/OPERA_L2_CSLC-S1_T113-241605-IW3_20240610T110743Z_20240611T073356Z_S1A_VV_v1.1_BROWSE.png.md5',\n",
" 's3://asf-cumulus-prod-opera-browse/OPERA_L2_CSLC-S1/OPERA_L2_CSLC-S1_T113-241605-IW3_20240610T110743Z_20240611T073356Z_S1A_VV_v1.1/OPERA_L2_CSLC-S1_T113-241605-IW3_20240610T110743Z_20240611T073356Z_S1A_VV_v1.1_BROWSE_low-res.png.md5',\n",
" 's3://asf-cumulus-prod-opera-browse/OPERA_L2_CSLC-S1/OPERA_L2_CSLC-S1_T113-241605-IW3_20240610T110743Z_20240611T073356Z_S1A_VV_v1.1/OPERA_L2_CSLC-S1_T113-241605-IW3_20240610T110743Z_20240611T073356Z_S1A_VV_v1.1_BROWSE_thumbnail.png.md5',\n",
" 's3://asf-cumulus-prod-opera-products/OPERA_L2_CSLC-S1/OPERA_L2_CSLC-S1_T113-241605-IW3_20240610T110743Z_20240611T073356Z_S1A_VV_v1.1/OPERA_L2_CSLC-S1_T113-241605-IW3_20240610T110743Z_20240611T073356Z_S1A_VV_v1.1.h5',\n",
" 's3://asf-cumulus-prod-opera-products/OPERA_L2_CSLC-S1/OPERA_L2_CSLC-S1_T113-241605-IW3_20240610T110743Z_20240611T073356Z_S1A_VV_v1.1/OPERA_L2_CSLC-S1_T113-241605-IW3_20240610T110743Z_20240611T073356Z_S1A_VV_v1.1.h5.md5',\n",
" 's3://asf-cumulus-prod-opera-products/OPERA_L2_CSLC-S1/OPERA_L2_CSLC-S1_T113-241605-IW3_20240610T110743Z_20240611T073356Z_S1A_VV_v1.1/OPERA_L2_CSLC-S1_T113-241605-IW3_20240610T110743Z_20240611T073356Z_S1A_VV_v1.1.iso.xml',\n",
" 's3://asf-cumulus-prod-opera-products/OPERA_L2_CSLC-S1/OPERA_L2_CSLC-S1_T113-241605-IW3_20240610T110743Z_20240611T073356Z_S1A_VV_v1.1/OPERA_L2_CSLC-S1_T113-241605-IW3_20240610T110743Z_20240611T073356Z_S1A_VV_v1.1.iso.xml.md5']"
]
},
"execution_count": 2,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"opera_product = asf.search(dataset=asf.DATASET.OPERA_S1, maxResults=1)[0]\n",
"opera_product.properties['s3Urls']"
]
},
{
"cell_type": "markdown",
"id": "159b5eb8",
"metadata": {},
"source": [
"From there, authorized users can use their preferred method of authenticating to and downloading S3 objects."
]
},
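{
"cell_type": "markdown",
"id": "6c7d8e9f-4444-4a0b-b2c3-3e4f5a6b7c01",
"metadata": {},
"source": [
"One common approach is sketched below using `boto3`, which is not an `asf_search` dependency; it assumes `boto3` is installed and that your AWS credentials have access to these buckets (e.g. via Earthdata in-region direct access):"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "6c7d8e9f-4444-4a0b-b2c3-3e4f5a6b7c02",
"metadata": {},
"outputs": [],
"source": [
"import boto3\n",
"\n",
"s3 = boto3.client('s3')\n",
"\n",
"# split an s3:// URI into bucket and key\n",
"uri = opera_product.properties['s3Urls'][0]\n",
"bucket, key = uri.removeprefix('s3://').split('/', 1)\n",
"\n",
"# download the object to the current directory\n",
"s3.download_file(bucket, key, key.split('/')[-1])"
]
},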
{
"cell_type": "markdown",
"id": "8aed3b7c-a557-4cbb-878e-aa9fe8330646",
"metadata": {
"tags": []
},
"source": [
"***\n",
"## Summary\n",
"A complete, basic example of downloading search results:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "ceea6e9a-b1ea-417f-8daa-98bd17c9730c",
"metadata": {
"scrolled": true,
"tags": []
},
"outputs": [],
"source": [
"from os import listdir\n",
"import getpass\n",
"username = input('Username:')\n",
"password = getpass.getpass('Password:')\n",
"\n",
"import asf_search as asf\n",
"\n",
"session = asf.ASFSession().auth_with_creds(username=username, password=password)\n",
"\n",
"results = asf.geo_search(\n",
" intersectsWith='POLYGON((-91.97 28.78,-88.85 28.78,-88.85 30.31,-91.97 30.31,-91.97 28.78))',\n",
" platform=asf.PLATFORM.UAVSAR,\n",
" processingLevel=asf.PRODUCT_TYPE.METADATA,\n",
" maxResults=20)\n",
"\n",
"results.download(\n",
" path='./downloads',\n",
" session=session,\n",
" processes=10)\n",
"\n",
"listdir('./downloads')"
]
},
{
"cell_type": "markdown",
"id": "38ca6c8a-3b3a-45e4-9385-37f0eff0e60c",
"metadata": {},
"source": [
"***\n",
"Next: [Closing](./6-Outro.md)"
]
}
],
"metadata": {
"celltoolbar": "Raw Cell Format",
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.5"
},
"toc-autonumbering": false,
"toc-showtags": false
},
"nbformat": 4,
"nbformat_minor": 5
}
Discovery-asf_search-8.1.2/examples/6-Outro.md
# Thank You!
***
## Where to Go Next
`asf_search` is available through:
- [PyPi](https://pypi.org/project/asf-search/)
- [Conda](https://anaconda.org/conda-forge/asf_search)
- [Github](https://github.com/asfadmin/Discovery-asf_search)
- [The notebooks used for this presentation](https://github.com/asfadmin/Discovery-asf_search/tree/master/examples)
- [Documentation](https://docs.asf.alaska.edu/)

Contact ASF at:
- [ASF Website](https://asf.alaska.edu)
- [Contact ASF](https://asf.alaska.edu/contact/)

Contact the ASF Discovery team directly:
- [Gitter](https://gitter.im/ASFDiscovery/)
***
## The ASF Discovery Team
Andrew Anderson, Tyler Chase, Olena Ellis, Kim Fairbanks, Christy Fleming, Gregory Short, Cameron Showalter, William Horn
***
[Back to Start](./0-Intro.md) Discovery-asf_search-8.1.2/examples/Advanced-Custom-ASFProduct-Subclassing.ipynb 0000664 0000000 0000000 00000033505 14777330235 0027667 0 ustar 00root root 0000000 0000000 {
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# `ASFProduct` Subclasses\n",
"\n",
"`ASFProduct` is the base class for all search result objects as of asf-search v7.0.0. There are several subclasses of `ASFProduct` that asf-search uses for specific platforms and product types with unique properties/functionality.\n",
"\n",
"Key Methods:\n",
"- `geojson()`\n",
"- `download()`\n",
"- `stack()`\n",
"- `get_stack_opts()` (returns None in `ASFProduct`, implemented by `ASFStackableProduct` subclass and its subclasses)\n",
"- `centroid()`\n",
"- `remotezip()` (requires asf-search's optional dependency be installed)\n",
"- `translate_product()` (reads properties from umm, populates `properties` with associated keyword)\n",
"- `get_sort_keys()`\n",
"- `umm_get()`\n",
"\n",
"Key Properties:\n",
"- `properties`\n",
"- `_base_properties` (maps `properties` keys to values in umm json)\n",
"- `umm` (The product's umm JSON from CMR)\n",
"- `meta` (The product's metadata JSON from CMR)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import asf_search as asf\n",
"products = ['S1A_IW_SLC__1SDV_20231226T162948_20231226T163016_051828_0642C6_272F-SLC', 'S1_185682_IW2_20210224T161634_VV_035E-BURST','S1-GUNW-D-R-087-tops-20190301_20190223-161540-20645N_18637N-PP-7a85-v2_0_1-unwrappedPhase','ALPSRP111041130-RTC_HI_RES', 'UA_newyor_03204_22005-013_22010-002_0014d_s01_L090_01-INTERFEROMETRY']\n",
"results = asf.product_search(product_list=products)\n",
"results"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Notice the different types in the `results` list: `S1Product`, `S1BurstProduct`, `ARIAS1GUNWProduct`, `ALOSProduct`, and `UAVSARProduct`.\n",
"Each of these classes is subclassed from `ASFProduct` in some way.\n",
"\n",
"Let's compare the `properties` of `S1Product` and `UAVSARProduct`"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"s1, uavsar, s1Burst, ariaGunw, alos = results\n",
"\n",
"def compare_properties(lhs: asf.ASFProduct, rhs: asf.ASFProduct):\n",
" # Compares properties of two ASFProduct objects in a color coded table\n",
" # values printed in red are missing from that product type altogether\n",
" \n",
" # Color Coding\n",
" RED = '\\033[31m'\n",
" GREEN = '\\033[32m'\n",
" BLUE = '\\033[34m'\n",
" RESET = '\\033[0m'\n",
"\n",
" print(f'\\t{GREEN}{type(lhs)}{RESET}\\t{BLUE}{type(rhs)}{RESET}')\n",
" \n",
" keys = {*lhs.properties.keys(), *rhs.properties.keys()}\n",
" for key in keys:\n",
" print(f\"{key}:\\n\\t{GREEN}{lhs.properties.get(key, f'{RED}None')}{RESET}\\t{BLUE}{rhs.properties.get(key, f'{RED}None')}{RESET}\\n\")\n",
"\n",
"compare_properties(s1, uavsar)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Notice a few properties (marked in red) are missing from each product properties dict. For example, `S1Product` has `pgeVersion`, while `UAVSARProduct` has `insarStackId`. \n",
"\n",
"Moreover, `S1Product` has one major difference with `UAVSARProduct`: `S1Product` inherits from `ASFStackableProduct` (see section below)."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"print(f\"{s1.properties['fileID']}\\n\\t{s1.baseline}\\n\")\n",
"print(f\"{uavsar.properties['fileID']}\\n\\t{uavsar.baseline}\")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# `ASFStackableProduct`\n",
"\n",
"`ASFStackableProduct` is an important `ASFProduct` subclass, from which stackable product types meant for time-series analysis are derived. `ASFStackableProduct` has a class enum, `BaselineCalcType`, that determines how asf-search will handle perpendicular stack calculations. Each subclass keeps track of its baseline calculation type via the `baseline_type` property.\n",
"\n",
"Inherits: `ASFProduct`\n",
"\n",
"Inherited By:\n",
"- `ALOSProduct`\n",
"- `ERSProduct`\n",
"- `JERSProduct`\n",
"- `RADARSATProduct`\n",
"- `S1Product`\n",
" - `S1BurstProduct`\n",
" - `OPERAS1Product` (Stacking currently disabled)\n",
" - `ARIAS1GUNWProduct` (Stacking currently disabled)\n",
"\n",
"Key Methods:\n",
"- `get_baseline_calc_properties()`\n",
"- `get_stack_opts()` (Overrides `ASFProduct`)\n",
"- `is_valid_reference()`\n",
"- `get_default_baseline_product_type()`\n",
"\n",
"Key Definitions:\n",
"class enum `BaselineCalcType`:\n",
"- `PRE_CALCULATED` Has pre-calculated `insarBaseline` value that will be used for perpendicular calculations\n",
"- `CALCULATED` Uses position/velocity state vectors and ascending node time for perpendicular calculations\n",
"\n",
"Key Fields:\n",
"- `baseline`\n",
"- `baseline_type` (`BaselineCalcType.PRE_CALCULATED` by default or `BaselineCalcType.CALCULATED`)\n",
"\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"print(\"Baseline Calculation Types\")\n",
"print(f\"ASFStackableProduct: {asf.ASFStackableProduct.baseline_type}\")\n",
"print(f\"ALOSProduct:\\t {alos.baseline_type}\")\n",
"print(f\"S1Product:\\t {s1.baseline_type}\")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"`ASFStackableProduct` subclasses even have their own stack search option methods. The `ASFStackableProduct` implementation of `get_stack_opts()` returns the commonly used params for pre-calculated datasets (processing level and insar stack ID), but subclasses like `S1Product` and `S1BurstProduct` use their own approach. "
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"print(f\"S1Product:\\n{s1.get_stack_opts()}\\n\")\n",
"print(f\"S1BurstProduct:\\n{s1Burst.get_stack_opts()}\\n\")\n",
"print(f\"ALOSProduct:\\n{alos.get_stack_opts()}\")"
]
},
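{
"cell_type": "markdown",
"id": "7d8e9fa0-5555-4b1c-83d4-4f5a6b7c8d01",
"metadata": {},
"source": [
"The returned `ASFSearchOptions` can be passed straight back into `asf.search()` to perform the stack search manually, which is essentially what `stack()` does internally. A minimal sketch:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "7d8e9fa0-5555-4b1c-83d4-4f5a6b7c8d02",
"metadata": {},
"outputs": [],
"source": [
"opts = s1.get_stack_opts()\n",
"manual_stack = asf.search(opts=opts)\n",
"\n",
"print(f'{len(manual_stack)} products in stack')"
]
},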
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Writing Custom `ASFProduct` Subclasses\n",
"Because `ASFProduct` is built for subclassing, users can provide their own custom subclasses derived directly from `ASFProduct` or even from a pre-existing subclass like `S1Product` or `OPERAS1Product`.\n",
"\n",
"In this example we subclass `S1Product`, override the default `ASFProduct.stack()` with a version that returns a _list_ of `S1BurstProduct` stacks based on an area of interest, modify `geojson()` to return state vectors, and add a new helper method for getting the raw umm CMR response!"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from typing import List, Type, Union, Dict\n",
"from asf_search import ASFSearchOptions, ASFSession, ASFSearchResults\n",
"from asf_search.CMR.translate import try_parse_int\n",
"from datetime import datetime\n",
"\n",
"class MyCustomS1Subclass(asf.S1Product):\n",
" def __init__(\n",
" #default ASFProduct constructor arguments\n",
" self, args: dict = {}, session: ASFSession = ASFSession()\n",
" ):\n",
" super().__init__(args, session)\n",
"\n",
"        # totally unique property of MyCustomS1Subclass\n",
" self.timestamp = datetime.now()\n",
"\n",
" # _base_properties is a special dict of ASFProduct that maps keywords to granule UMM json\n",
" # defining properties and their paths here will let you\n",
" # easily access them in the product's `properties` dictionary\n",
" # see `ASFProduct.umm_get()` for explanation of pathing\n",
" _base_properties = {\n",
" # Most product types use `CENTER_ESA_FRAME` as the value for `frameNumber` (unlike S1 and ALOS, which use `FRAME_NUMBER`), \n",
" # this creates a new `esaFrame` property so we have that value too\n",
" **asf.S1Product._base_properties,\n",
" 'esaFrame': {'path': ['AdditionalAttributes', ('Name', 'CENTER_ESA_FRAME'), 'Values', 0], 'cast': try_parse_int}, #Sentinel and ALOS product alt for frameNumber (ESA_FRAME)\n",
" }\n",
"\n",
" \"\"\" Example umm that the above pathing would map to:\n",
" 'umm': {\n",
" 'AdditionalAttributes': [\n",
" {\n",
" 'Name': 'CENTER_ESA_FRAME',\n",
" \"Values\": ['1300'] \n",
" },\n",
" ...\n",
" ],\n",
" ...\n",
" }\n",
" \"\"\"\n",
"\n",
" # CUSTOM CLASS METHODS\n",
"    # Returns the product's raw umm and meta JSON from CMR\n",
" def as_umm_json(self) -> Dict:\n",
" return { 'umm': self.umm, 'meta': self.meta }\n",
" \n",
" # CLASS OVERRIDE METHODS\n",
" \n",
" # This override of `geojson()` includes the product's state vectors in the final geojson output, \n",
" # along with a custom class field timestamp and what version of asf-search was used at runtime\n",
" def geojson(self) -> dict:\n",
" output = super().geojson()\n",
"\n",
" output['properties']['stateVectors'] = self.get_state_vectors()\n",
" output['properties']['timestamp'] = str(self.timestamp)\n",
" output['properties']['ASFSearchVersion'] = asf.__version__\n",
" return output\n",
"\n",
" # ASFProduct.stack() normally stacks the current product\n",
" # in this version we search for every SLC-BURST product that\n",
" # overlaps the given area with the same source scene, \n",
" # and return a list of burst stacks\n",
" # if no bursts are found, we fall back to building a regular stack\n",
" def stack(self, \n",
" opts: ASFSearchOptions = None,\n",
" useSubclass: Type[asf.ASFProduct] = None,\n",
" aoi: str = None\n",
" ) -> Union[ASFSearchResults, List[ASFSearchResults]]:\n",
"\n",
" bursts = asf.search(\n",
" groupID=self.properties['groupID'], \n",
" processingLevel=asf.PRODUCT_TYPE.BURST,\n",
"            intersectsWith=aoi if aoi is not None else (opts.intersectsWith if opts is not None else None)\n",
" )\n",
" \n",
"        if len(bursts) == 0:\n",
"            return super().stack(opts=opts)\n",
"\n",
"        return [burst.stack(opts=opts) for burst in bursts]"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"\n",
"customS1SubclassProduct = MyCustomS1Subclass({'umm': s1.umm, 'meta': s1.meta}, session=s1.session)\n",
"\n",
"customS1SubclassProduct.geojson()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Notice the `timestamp`, `ASFSearchVersion`, `stateVectors`, and `esaFrame` fields in the output from `geojson()`.\n",
"Below is a comparison of properties between the built-in `S1Product` and our `customS1SubclassProduct`."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"compare_properties(s1, customS1SubclassProduct)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"palmer_to_anchorage = 'LINESTRING(-149.1052 61.6054,-149.5376 61.3162,-149.8764 61.2122)'\n",
"customStack = customS1SubclassProduct.stack(aoi=palmer_to_anchorage)\n",
"customStack"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Notice that instead of a stack of `MyCustomS1Subclass` products we now have a list of `S1BurstProduct` stacks!\n",
"Below is a breakdown of this list of stacks:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from typing import List\n",
"\n",
"def view_stack_of_stacks(stack_of_stacks: List):\n",
" print(f'Found {len(stack_of_stacks)} SLC-BURST stacks over AOI, stack lengths:')\n",
" for stack_idx, stack in enumerate(stack_of_stacks):\n",
" print(f\"\\t{stack_idx+1}:\\t{len(stack)} SLC-BURSTs \\t(Full Burst ID: {stack[-1].properties['burst']['fullBurstID']}, polarization: {stack[-1].properties['polarization']})\")\n",
"\n",
"view_stack_of_stacks(customStack)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Using Custom `ASFProduct` Subclasses in Baseline Search\n",
"\n",
"There may be instances where you want to build a spatio-temporal baseline stack from a reference scene of a custom subclass. `stack_from_id()` and `ASFProduct.stack()` support this via the `useSubclass` keyword."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"opts = asf.ASFSearchOptions(intersectsWith=palmer_to_anchorage) # our custom class will be able to use our aoi this way\n",
"\n",
"customSubclassStack = asf.stack_from_id('S1A_IW_SLC__1SDV_20231226T162948_20231226T163016_051828_0642C6_272F-SLC', opts=opts, useSubclass=MyCustomS1Subclass)\n",
"\n",
"view_stack_of_stacks(customSubclassStack)"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "asf-search-env-current",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.5"
}
},
"nbformat": 4,
"nbformat_minor": 2
}
Discovery-asf_search-8.1.2/examples/hello_world.py 0000664 0000000 0000000 00000007131 14777330235 0022304 0 ustar 00root root 0000000 0000000 """
Simple example script showing a few basic uses of asf_search
"""
import json
import asf_search as asf
print('=' * 80)
print('Constants')
print(f'asf.BEAMMODE.IW: {asf.BEAMMODE.IW}')
print(f'asf.POLARIZATION.HH_HV: {asf.POLARIZATION.HH_HV}')
print(f'asf.PLATFORM.SENTINEL1: {asf.PLATFORM.SENTINEL1}')
print('=' * 80)
print(f'Health check: {json.dumps(asf.health(), indent=2)}')
print('=' * 80)
results = asf.search(platform=[asf.PLATFORM.SENTINEL1], maxResults=2)
print(f'Basic search example: {results}')
print('=' * 80)
results = asf.granule_search(['ALPSRS279162400', 'ALPSRS279162200'])
print(f'Granule search example: {results}')
print('=' * 80)
results = asf.product_search(['ALAV2A279102730', 'ALAV2A279133150'])
print(f'Product search example: {results}')
print('=' * 80)
wkt = 'POLYGON((-135.7 58.2,-136.6 58.1,-135.8 56.9,-134.6 56.1,-134.9 58.0,-135.7 58.2))'
results = asf.geo_search(platform=[asf.PLATFORM.SENTINEL1], intersectsWith=wkt, maxResults=2)
print(f'Geographic search example: {results}')
print('=' * 80)
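# Search constraints can also be gathered into a reusable ASFSearchOptions
# object and passed to search functions via the `opts` keyword. This is a
# sketch; no request is made until the options are handed to a search function:
opts = asf.ASFSearchOptions(platform=[asf.PLATFORM.SENTINEL1], maxResults=2)
print(f'Reusable search options example: {dict(opts)}')
print('=' * 80)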
results = asf.search(
platform=[asf.PLATFORM.SENTINEL1],
frame=[100, 150, (200, 205)],
relativeOrbit=[100, 105, (110, 115)],
processingLevel=[asf.PRODUCT_TYPE.SLC],
)
print(f'Path/frame/platform/product type example: {results}')
print('=' * 80)
results = asf.stack_from_id(
'S1B_WV_SLC__1SSV_20210126T234925_20210126T235632_025332_030462_C733-SLC'
)
print(f'Baseline stack search example, ephemeris-based: {results}')
print('=' * 80)
try:
results = asf.stack_from_id('nonexistent-scene')
except asf.ASFSearchError as e:
print(f'Stacking a non-existent scene throws an exception: {e}')
print('=' * 80)
try:
results = asf.stack_from_id('UA_atchaf_06309_21024_020_210401_L090_CX_01-PROJECTED')
except asf.ASFBaselineError as e:
print(f'Not everything can be stacked: {e}')
print('=' * 80)
results = asf.stack_from_id('ALPSRP279071390-RTC_HI_RES')
print(f'Baseline stack search example, pre-calculated: {results}')
print('=' * 80)
results = results[0].stack()
print(f'Baseline stacks can also be made from an ASFProduct: {results}')
print('=' * 80)
print(f'ASFSearchResults work like lists: {results[3:5]}')
print('=' * 80)
print(f'ASFSearchResults serializes to geojson: {results[3:5]}')
print('=' * 80)
product = results[2]
print(f'ASFProduct serializes to geojson: {product}')
print('=' * 80)
wkt = 'POLYGON((-160 65,-150 65,-160 60,-150 60,-160 65))' # Self-intersecting bowtie
try:
results = asf.geo_search(platform=[asf.PLATFORM.SENTINEL1], intersectsWith=wkt)
except asf.ASFWKTError as e:
print(f'Exception example: {e}')
print('=' * 80)
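# The bowtie AOI above is rejected because it self-intersects; shapely (already
# a dependency of asf_search) can be used to sanity-check a WKT string first:
from shapely import wkt as shapely_wkt
bowtie = shapely_wkt.loads('POLYGON((-160 65,-150 65,-160 60,-150 60,-160 65))')
print(f'Bowtie polygon is valid: {bowtie.is_valid}')
print('=' * 80)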
print('A few more exception examples:')
try:
asf.search(offNadirAngle=[tuple([1])])
except ValueError as e:
print(f'Tuple too short: {e}')
try:
asf.search(offNadirAngle=[(1, 2, 3)])
except ValueError as e:
print(f'Tuple too long: {e}')
try:
asf.search(offNadirAngle=[('a', 2)])
except ValueError as e:
print(f'Tuple non-numeric min: {e}')
try:
asf.search(offNadirAngle=[(1, 'b')])
except ValueError as e:
print(f'Tuple non-numeric max: {e}')
try:
asf.search(offNadirAngle=[(float('NaN'), 2)])
except ValueError as e:
print(f'Tuple non-finite min: {e}')
try:
    asf.search(offNadirAngle=[(1, float('Inf'))])
except ValueError as e:
print(f'Tuple non-finite max: {e}')
try:
asf.search(offNadirAngle=[(2, 1)])
except ValueError as e:
print(f'Tuple min > max: {e}')
try:
asf.search(offNadirAngle=[float('Inf')])
except ValueError as e:
print(f'Bare value non-finite: {e}')
try:
asf.search(offNadirAngle=['a'])
except ValueError as e:
print(f'Bare value non-numeric: {e}')
Discovery-asf_search-8.1.2/pyproject.toml 0000664 0000000 0000000 00000000731 14777330235 0020515 0 ustar 00root root 0000000 0000000 [build-system]
requires = [
"setuptools>=42",
"wheel",
"setuptools_scm[toml]>=3.4"
]
build-backend = "setuptools.build_meta"
# Same as declaring use_scm_version in setup.py, but avoids
# "UserWarning: Unknown distribution option: 'use_scm_version'"
# if setuptools_scm isn't installed when setup.py is called:
[tool.setuptools_scm]
[tool.ruff]
line-length = 100
fix = true
[tool.ruff.format]
# Prefer single quotes over double quotes.
quote-style = "single"
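# With the ruff configuration above, formatting and lint checks can be run
# locally (assuming ruff is installed), e.g.:
#   ruff format .
#   ruff check .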
Discovery-asf_search-8.1.2/setup.py 0000664 0000000 0000000 00000004340 14777330235 0017313 0 ustar 00root root 0000000 0000000 """asf_search setuptools configuration"""
from setuptools import find_packages, setup
requirements = [
'requests',
'shapely',
'pytz',
'importlib_metadata',
'numpy',
'dateparser',
'python-dateutil',
'tenacity == 8.2.2',
]
test_requirements = [
'pytest==8.1.1',
'pytest-automation==3.0.0',
'pytest-cov',
'pytest-xdist',
'coverage',
'requests-mock==1.11.0',
'nbformat',
'nbconvert',
'ipykernel',
]
extra_requirements = [
'remotezip>=0.10.0',
'ciso8601',
]
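# The optional dependency groups above map to pip extras, e.g. (assuming the
# published package name):
#   python3 -m pip install asf_search[extras]
#   python3 -m pip install -e .[test]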
with open('README.md', 'r') as readme_file:
readme = readme_file.read()
setup(
name='asf_search',
# version=Declared in pyproject.toml, through "[tool.setuptools_scm]"
author='Alaska Satellite Facility Discovery Team',
author_email='uaf-asf-discovery@alaska.edu',
description="Python wrapper for ASF's SearchAPI",
long_description=readme,
long_description_content_type='text/markdown',
url='https://github.com/asfadmin/Discovery-asf_search.git',
project_urls={'Documentation': 'https://docs.asf.alaska.edu/asf_search/basics/'},
packages=find_packages(exclude=['tests.*', 'tests', 'examples.*', 'examples']),
package_dir={'asf_search': 'asf_search'},
include_package_data=True,
python_requires='>=3.9',
install_requires=requirements,
extras_require={'test': test_requirements, 'extras': extra_requirements},
license='BSD',
license_files=('LICENSE',),
classifiers=[
'Development Status :: 5 - Production/Stable',
'License :: OSI Approved :: BSD License',
'Operating System :: OS Independent',
'Intended Audience :: Developers',
'Intended Audience :: Science/Research',
'Programming Language :: Python :: 3',
'Programming Language :: Python :: 3 :: Only',
'Programming Language :: Python :: 3.9',
'Programming Language :: Python :: 3.10',
'Programming Language :: Python :: 3.11',
'Programming Language :: Python :: 3.12',
'Topic :: Software Development',
'Topic :: Scientific/Engineering :: Atmospheric Science',
'Topic :: Scientific/Engineering :: GIS',
'Topic :: Scientific/Engineering :: Hydrology',
'Topic :: Utilities',
],
)
Discovery-asf_search-8.1.2/tests/ 0000775 0000000 0000000 00000000000 14777330235 0016742 5 ustar 00root root 0000000 0000000 Discovery-asf_search-8.1.2/tests/ASFProduct/ 0000775 0000000 0000000 00000000000 14777330235 0020714 5 ustar 00root root 0000000 0000000 Discovery-asf_search-8.1.2/tests/ASFProduct/test_ASFProduct.py 0000664 0000000 0000000 00000007246 14777330235 0024310 0 ustar 00root root 0000000 0000000 import pytest
import unittest
from asf_search import (
ASFProduct,
ASFSearchResults,
ASFSearchOptions,
ASFSession,
FileDownloadType,
)
from unittest.mock import patch
from shapely.geometry import shape
from shapely.ops import orient
import requests
from asf_search.search.search_generator import as_ASFProduct
def run_test_ASFProduct(product_json):
if product_json is None:
product = ASFProduct()
geojson = product.geojson()
assert geojson['type'] == 'Feature'
assert geojson['geometry'] == {'coordinates': None, 'type': 'Polygon'}
for val in geojson['properties'].values():
assert val is None
return
product = as_ASFProduct(product_json, ASFSession())
geojson = product.geojson()
if geojson['geometry']['coordinates'] is not None:
expected_shape = orient(shape(product_json['geometry']))
output_shape = orient(shape(geojson['geometry']))
assert output_shape.equals(expected_shape)
elif product.meta != {}:
assert product.properties == product_json['properties']
assert product.geometry == product_json['geometry']
assert product.umm == product_json['umm']
assert product.meta == product_json['meta']
def run_test_stack(reference, pre_processed_stack, processed_stack):
product = as_ASFProduct(reference, ASFSession())
with patch('asf_search.baseline_search.search') as search_mock:
temp = ASFSearchResults([as_ASFProduct(prod, ASFSession()) for prod in pre_processed_stack])
for idx, prod in enumerate(temp):
prod.baseline = pre_processed_stack[idx]['baseline']
search_mock.return_value = temp
stack = product.stack()
stack = [
product
for product in stack
if product.properties['temporalBaseline'] is not None
and product.properties['perpendicularBaseline'] is not None
]
for idx, secondary in enumerate(stack):
if idx > 0:
assert (
secondary.properties['temporalBaseline']
>= stack[idx - 1].properties['temporalBaseline']
)
assert (
secondary.properties['temporalBaseline']
== processed_stack[idx]['properties']['temporalBaseline']
)
assert (
secondary.properties['perpendicularBaseline']
== processed_stack[idx]['properties']['perpendicularBaseline']
)
def run_test_product_get_stack_options(reference, options):
product = as_ASFProduct(reference, ASFSession())
expected_options = dict(ASFSearchOptions(**options))
product_options = dict(product.get_stack_opts())
assert product_options == dict(expected_options)
def run_test_ASFProduct_download(reference, filename, filetype, additional_urls):
product = as_ASFProduct(reference, ASFSession())
product.properties['additionalUrls'] = additional_urls
with patch('asf_search.ASFSession.get') as mock_get:
resp = requests.Response()
resp.status_code = 200
mock_get.return_value = resp
resp.iter_content = lambda chunk_size: []
with patch('builtins.open', unittest.mock.mock_open()):
if filename is not None and (
(filetype == FileDownloadType.ADDITIONAL_FILES and len(additional_urls) > 1)
or (filetype == FileDownloadType.ALL_FILES and len(additional_urls) > 0)
):
with pytest.warns(Warning):
product.download('./', filename=filename, fileType=filetype)
else:
product.download('./', filename=filename, fileType=filetype)
Discovery-asf_search-8.1.2/tests/ASFSearchOptions/ 0000775 0000000 0000000 00000000000 14777330235 0022055 5 ustar 00root root 0000000 0000000 Discovery-asf_search-8.1.2/tests/ASFSearchOptions/test_ASFSearchOptions.py 0000664 0000000 0000000 00000005431 14777330235 0026604 0 ustar 00root root 0000000 0000000 import copy
from asf_search.ASFSearchOptions import validators, ASFSearchOptions
from asf_search.ASFSearchOptions.config import config
from asf_search.ASFSearchOptions.validator_map import validate, validator_map
from pytest import raises
def run_test_validator_map_validate(key, value, output):
if key not in list(validator_map.keys()):
with raises(KeyError) as keyerror:
validate(key, value)
if key in [
validator_key.lower()
for validator_key in list(validator_map.keys())
if key not in config.keys()
]:
assert 'Did you mean' in str(keyerror.value)
return
assert validate(key, value) == output
def run_test_ASFSearchOptions_validator(validator_name, param, output, error):
validator = getattr(validators, validator_name)
if error is None:
assert output == validator(param)
else:
with raises(ValueError) as e:
validator(param)
assert error in str(e.value)
def run_test_ASFSearchOptions(**kwargs):
test_info = copy.copy(kwargs['test_info'])
exception = test_info['exception'] # Can be "None" for don't.
if 'expect_output' in test_info:
expect_output = test_info.pop('expect_output')
else:
expect_output = {}
# Take out anything that isn't supposed to reach the options object:
del test_info['title']
del test_info['exception']
try:
options_obj = ASFSearchOptions(**test_info)
except (KeyError, ValueError) as e:
assert (
type(e).__name__ == exception
), f"ERROR: Didn't expect exception {type(e).__name__} to occur."
return
else:
assert (
exception is None
), f'ERROR: Expected exception {exception}, but SearchOptions never threw.'
for key, val in expect_output.items():
assert (
getattr(options_obj, key) == val
), f"ERROR: options object param '{key}' should have value '{val}'. Got '{getattr(options_obj, key)}'."
# test ASFSearchOptions.reset_search()
options_obj.reset_search()
assert (
len([val for key, val in dict(options_obj).items() if key not in config.keys()]) == 0
), 'ERROR: ASFSearchOptions.reset() did not clear all non-default searchable params'
for key, value in config.items():
if test_info.get(key) is not None:
assert (
getattr(options_obj, key) == test_info[key]
), f"ERROR: User defined value '{test_info[key]}' for default param '{key}', but value was lost after ASFSearchOptions.reset()"
else:
assert (
getattr(options_obj, key) == value
        ), f"ERROR: default param '{key}' was left at its default by the user but changed, should have value '{value}'. Got '{getattr(options_obj, key)}'."
Discovery-asf_search-8.1.2/tests/ASFSearchResults/ 0000775 0000000 0000000 00000000000 14777330235 0022063 5 ustar 00root root 0000000 0000000 Discovery-asf_search-8.1.2/tests/ASFSearchResults/test_ASFSearchResults.py 0000664 0000000 0000000 00000023211 14777330235 0026614 0 ustar 00root root 0000000 0000000 from typing import Dict, List
import asf_search as asf
from asf_search import ASFSearchResults
import defusedxml.ElementTree as DefusedETree
import xml.etree.ElementTree as ETree
import json
import shapely.wkt as WKT
import requests
import csv
from shapely.geometry import Polygon
from shapely.wkt import loads
from shapely.geometry import shape
from shapely.geometry.base import BaseGeometry
from asf_search.CMR.translate import try_parse_date
from asf_search.constants import PLATFORM
from asf_search import ASF_LOGGER
import re
from asf_search.exceptions import ASFSearchError
# when this replaces SearchAPI change values to cached
API_URL = 'https://api.daac.asf.alaska.edu/services/search/param?'
def run_test_output_format(results: ASFSearchResults):
# search results are always sorted this way when returned from asf_search.search(),
# but not all test case resources are
results.sort(key=lambda p: (p.properties['stopTime'], p.properties['fileID']), reverse=True)
product_list_str = ','.join([product.properties['fileID'] for product in results])
results.searchComplete = True
for output_type in ['csv', 'kml', 'metalink', 'jsonlite', 'jsonlite2', 'geojson']:
expected = get_SearchAPI_Output(product_list_str, output_type)
if output_type == 'csv':
check_csv(results, expected)
if output_type == 'kml':
check_kml(results, expected)
elif output_type == 'metalink':
check_metalink(results, expected)
elif output_type in ['jsonlite', 'jsonlite2']:
check_jsonLite(results, expected, output_type)
elif output_type == 'geojson':
check_geojson(results)
def check_metalink(results: ASFSearchResults, expected_str: str):
actual = ''.join([line for line in results.metalink()])
actual_tree = DefusedETree.fromstring(actual)
expected_tree = DefusedETree.fromstring(expected_str)
canon_actual = ETree.canonicalize(DefusedETree.tostring(actual_tree), strip_text=True)
canon_expected = ETree.canonicalize(DefusedETree.tostring(expected_tree), strip_text=True)
assert canon_actual == canon_expected
def check_kml(results: ASFSearchResults, expected_str: str):
namespaces = {'kml': 'http://www.opengis.net/kml/2.2'}
placemarks_path = './/kml:Placemark'
expected_root = DefusedETree.fromstring(expected_str)
expected_placemarks = expected_root.findall(placemarks_path, namespaces)
actual_root = DefusedETree.fromstring(''.join([block for block in results.kml()]))
actual_placemarks = actual_root.findall(placemarks_path, namespaces)
# Check polygons for equivalence (asf-search starts from a different pivot)
# and remove them from the kml so we can easily compare the rest of the placemark data
for expected_placemark, actual_placemark in zip(expected_placemarks, actual_placemarks):
expected_polygon = expected_placemark.findall('./*')[-1]
actual_polygon = actual_placemark.findall('./*')[-1]
expected_coords = get_coordinates_from_kml(DefusedETree.tostring(expected_polygon))
actual_coords = get_coordinates_from_kml(DefusedETree.tostring(actual_polygon))
assert Polygon(expected_coords).equals(Polygon(actual_coords))
expected_placemark.remove(expected_polygon)
actual_placemark.remove(actual_polygon)
# Get canonicalize xml strings so minor differences are normalized
actual_canon = ETree.canonicalize(DefusedETree.tostring(actual_root), strip_text=True)
expected_canon = ETree.canonicalize(DefusedETree.tostring(expected_root), strip_text=True)
    date_pattern = r'\>(?P<date_str>[\w ]*time|Time): *(?P<date_value>[^\<]*)\<'
actual_dates = re.findall(date_pattern, actual_canon, re.MULTILINE)
expected_date = re.findall(date_pattern, expected_canon, re.MULTILINE)
for idx, match in enumerate(actual_dates):
date_str, date_value = match
assert expected_date[idx][0] == date_str
assert try_parse_date(expected_date[idx][1]) == try_parse_date(date_value)
actual_canon = re.sub(date_pattern, '', actual_canon)
expected_canon = re.sub(date_pattern, '', expected_canon)
assert actual_canon == expected_canon
def get_coordinates_from_kml(data: str):
namespaces = {'kml': 'http://www.opengis.net/kml/2.2'}
coords = []
coords_lon_lat_path = './/kml:outerBoundaryIs/kml:LinearRing/kml:coordinates'
root = DefusedETree.fromstring(data)
coordinates_elements = root.findall(coords_lon_lat_path, namespaces)
for lon_lat_z in coordinates_elements[0].text.split('\n'):
if len(lon_lat_z.split(',')) == 3:
lon, lat, _ = lon_lat_z.strip().split(',')
coords.append([float(lon), float(lat)])
return coords
def check_csv(results: ASFSearchResults, expected_str: str):
expected = [product for product in csv.reader(expected_str.split('\n')) if product != []]
# actual = [prod for prod in csv.reader(''.join([s for s in results.csv()]).split('\n')) if prod != []]
expected = csv.DictReader(expected_str.split('\n'))
actual = csv.DictReader([s for s in results.csv()])
for actual_row, expected_row in zip(actual, expected):
actual_dict = dict(actual_row)
expected_dict = dict(expected_row)
for key in expected_dict.keys():
if expected_dict[key] in ['None', None, '']:
assert actual_dict[key] in ['None', None, '']
else:
try:
expected_value = float(expected_dict[key])
actual_value = float(actual_dict[key])
assert (
expected_value == actual_value
), f"expected '{expected_dict[key]}' for key '{key}', got '{actual_dict[key]}'"
except ValueError:
try:
expected_date = try_parse_date(expected_dict[key])
actual_date = try_parse_date(actual_dict[key])
assert (
expected_date == actual_date
), f"Expected date '{expected_date}' for key '{key}', got '{actual_date}'"
except ValueError:
assert (
expected_dict[key] == actual_dict[key]
), f"expected '{expected_dict[key]}' for key '{key}', got '{actual_dict[key]}'"
def check_jsonLite(results: ASFSearchResults, expected_str: str, output_type: str):
jsonlite2 = output_type == 'jsonlite2'
expected = json.loads(expected_str)['results']
if jsonlite2:
wkt_key = 'w'
wkt_unwrapped_key = 'wu'
start_time_key = 'st'
stop_time_key = 'stp'
else:
wkt_key = 'wkt'
wkt_unwrapped_key = 'wkt_unwrapped'
start_time_key = 'startTime'
stop_time_key = 'stopTime'
actual = json.loads(''.join(results.jsonlite2() if jsonlite2 else results.jsonlite()))[
'results'
]
for idx, expected_product in enumerate(expected):
wkt = expected_product.pop(wkt_key)
wkt_unwrapped = expected_product.pop(wkt_unwrapped_key)
startTime = expected_product.pop(start_time_key)
stopTime = expected_product.pop(stop_time_key)
for key in expected_product.keys():
assert actual[idx][key] == expected_product[key]
assert WKT.loads(actual[idx][wkt_key]).equals(WKT.loads(wkt))
assert WKT.loads(actual[idx][wkt_unwrapped_key]).equals(WKT.loads(wkt_unwrapped))
assert actual[idx][start_time_key] == try_parse_date(startTime)
assert actual[idx][stop_time_key] == try_parse_date(stopTime)
def check_geojson(results: ASFSearchResults):
expected = results.geojson()
actual = asf.export.results_to_geojson(results)
assert json.loads(''.join(actual)) == expected
def get_SearchAPI_Output(product_list: List[str], output_type: str) -> List[Dict]:
response = requests.get(API_URL, [('product_list', product_list), ('output', output_type)])
response.raise_for_status()
expected = response.text
return expected
def run_test_ASFSearchResults_intersection(wkt: str):
wrapped, unwrapped, _ = asf.validate_wkt(wkt)
unchanged_aoi = loads(wkt) # sometimes geometries don't come back with wrapping in mind
# exclude SMAP products
platforms = [PLATFORM.SENTINEL1, PLATFORM.UAVSAR]
def overlap_check(s1: BaseGeometry, s2: BaseGeometry):
return s1.overlaps(s2) or s1.touches(s2) or s2.distance(s1) <= 0.005
for platform in platforms:
asf.constants.INTERNAL.CMR_TIMEOUT = 120
try:
results = asf.geo_search(intersectsWith=wkt, platform=platform, maxResults=250)
except ASFSearchError as exc:
asf.constants.INTERNAL.CMR_TIMEOUT = 30
if str(exc).startswith("Connection Error (Timeout):"):
ASF_LOGGER.warning('CMR timeout while running intersection test')
continue
else:
raise BaseException(
f'Failed to perform intersection test with wkt: {wkt}\nplatform: {platform}.\nOriginal exception: {exc}'
)
asf.constants.INTERNAL.CMR_TIMEOUT = 30
for product in results:
if shape(product.geometry).is_valid:
product_geom_wrapped, product_geom_unwrapped, _ = asf.validate_wkt(
shape(product.geometry)
)
original_shape = unchanged_aoi
assert (
overlap_check(product_geom_wrapped, wrapped)
or overlap_check(product_geom_wrapped, original_shape)
), f"OVERLAP FAIL: {product.properties['sceneName']}, {product.geometry} \nproduct: {product_geom_wrapped.wkt} \naoi: {wrapped.wkt}"
Discovery-asf_search-8.1.2/tests/ASFSession/ 0000775 0000000 0000000 00000000000 14777330235 0020717 5 ustar 00root root 0000000 0000000 Discovery-asf_search-8.1.2/tests/ASFSession/test_ASFSession.py 0000664 0000000 0000000 00000016274 14777330235 0024317 0 ustar 00root root 0000000 0000000 import numbers
from typing import List
import asf_search
from asf_search.ASFSession import ASFSession
from requests.cookies import create_cookie
import http.cookiejar
import requests
from multiprocessing import Pool
from unittest.mock import patch
def run_auth_with_creds(username: str, password: str):
session = ASFSession()
session.auth_with_creds(username=username, password=password)
def run_auth_with_token(token: str):
session = ASFSession()
with patch('asf_search.ASFSession.post') as mock_token_session:
if not token.startswith('Bearer EDL'):
mock_token_session.return_value.status_code = 400
session.auth_with_token(token)
mock_token_session.return_value.status_code = 200
session.auth_with_token(token)
def run_auth_with_cookiejar(cookies: List):
cookiejar = http.cookiejar.CookieJar()
for cookie in cookies:
cookiejar.set_cookie(create_cookie(name=cookie.pop('name'), **cookie))
# requests.cookies.RequestsCookieJar, which has slightly different behaviour
session = ASFSession()
session.auth_with_cookiejar(cookiejar)
request_cookiejar_session = ASFSession()
request_cookiejar_session.auth_with_cookiejar(session.cookies)
def run_test_asf_session_rebuild_auth(
original_domain: str,
response_domain: str,
response_code: numbers.Number,
final_token,
):
if final_token == 'None':
final_token = None
session = ASFSession()
with patch('asf_search.ASFSession.post') as mock_token_session:
mock_token_session.return_value.status_code = 200
session.auth_with_token('bad_token')
req = requests.Request(original_domain)
req.headers.update({'Authorization': 'Bearer fakeToken'})
response = requests.Response()
response.status_code = response_code
response.location = response_domain
response.request = requests.Request()
response.request.url = response_domain
response.headers.update({'Authorization': 'Bearer fakeToken'})
with patch('asf_search.ASFSession._get_domain') as hostname_patch:
hostname_patch.side_effect = [original_domain, response_domain]
session.rebuild_auth(req, response)
assert req.headers.get('Authorization') == final_token
def test_ASFSession_INTERNAL_mangling():
session = asf_search.ASFSession()
session.cmr_host = asf_search.constants.INTERNAL.EDL_HOST
session.edl_host = asf_search.constants.INTERNAL.EDL_CLIENT_ID
session.auth_cookie_names = asf_search.constants.INTERNAL.ASF_AUTH_HOST
session.auth_domains = asf_search.constants.INTERNAL.CMR_HOST
session.asf_auth_host = asf_search.constants.INTERNAL.CMR_COLLECTIONS
session.cmr_collections = asf_search.constants.INTERNAL.AUTH_DOMAINS
session.edl_client_id = asf_search.constants.INTERNAL.AUTH_COOKIES
# get the current defaults since we're going to mangle them
DEFAULT_EDL_HOST = asf_search.constants.INTERNAL.EDL_HOST
DEFAULT_EDL_CLIENT_ID = asf_search.constants.INTERNAL.EDL_CLIENT_ID
DEFAULT_ASF_AUTH_HOST = asf_search.constants.INTERNAL.ASF_AUTH_HOST
DEFAULT_CMR_HOST = asf_search.constants.INTERNAL.CMR_HOST
DEFAULT_CMR_COLLECTIONS = asf_search.constants.INTERNAL.CMR_COLLECTIONS
DEFAULT_AUTH_DOMAINS = asf_search.constants.INTERNAL.AUTH_DOMAINS
DEFAULT_AUTH_COOKIES = asf_search.constants.INTERNAL.AUTH_COOKIES
uat_domain = 'cmr.uat.earthdata.nasa.gov'
edl_client_id = 'custom_client_id'
auth_host = 'custom_auth_host'
cmr_collection = '/search/granules'
auth_domains = ['custom_auth_domain']
uat_login_cookie = ['uat_urs_user_already_logged']
uat_login_domain = 'uat.urs.earthdata.nasa.gov'
asf_search.constants.INTERNAL.CMR_HOST = uat_domain
asf_search.constants.INTERNAL.EDL_HOST = uat_login_domain
asf_search.constants.INTERNAL.AUTH_COOKIES = uat_login_cookie
asf_search.constants.INTERNAL.EDL_CLIENT_ID = edl_client_id
asf_search.constants.INTERNAL.AUTH_DOMAINS = auth_domains
asf_search.constants.INTERNAL.ASF_AUTH_HOST = auth_host
asf_search.constants.INTERNAL.CMR_COLLECTIONS = cmr_collection
    mangled_session = asf_search.ASFSession()
    # set them back
    asf_search.constants.INTERNAL.EDL_HOST = DEFAULT_EDL_HOST
    asf_search.constants.INTERNAL.EDL_CLIENT_ID = DEFAULT_EDL_CLIENT_ID
    asf_search.constants.INTERNAL.ASF_AUTH_HOST = DEFAULT_ASF_AUTH_HOST
    asf_search.constants.INTERNAL.CMR_HOST = DEFAULT_CMR_HOST
    asf_search.constants.INTERNAL.CMR_COLLECTIONS = DEFAULT_CMR_COLLECTIONS
    asf_search.constants.INTERNAL.AUTH_DOMAINS = DEFAULT_AUTH_DOMAINS
    asf_search.constants.INTERNAL.AUTH_COOKIES = DEFAULT_AUTH_COOKIES
    assert mangled_session.cmr_host == uat_domain
    assert mangled_session.edl_host == uat_login_domain
    assert mangled_session.auth_cookie_names == uat_login_cookie
    assert mangled_session.auth_domains == auth_domains
    assert mangled_session.asf_auth_host == auth_host
    assert mangled_session.cmr_collections == cmr_collection
    assert mangled_session.edl_client_id == edl_client_id
custom_session = asf_search.ASFSession(
cmr_host=uat_domain,
edl_host=uat_login_domain,
auth_cookie_names=uat_login_cookie,
auth_domains=auth_domains,
asf_auth_host=auth_host,
cmr_collections=cmr_collection,
edl_client_id=edl_client_id,
)
assert custom_session.cmr_host == uat_domain
assert custom_session.edl_host == uat_login_domain
assert custom_session.auth_cookie_names == uat_login_cookie
assert custom_session.auth_domains == auth_domains
assert custom_session.asf_auth_host == auth_host
assert custom_session.cmr_collections == cmr_collection
assert custom_session.edl_client_id == edl_client_id
def test_ASFSession_pooling():
uat_domain = 'cmr.uat.earthdata.nasa.gov'
edl_client_id = 'custom_client_id'
auth_host = 'custom_auth_host'
cmr_collection = '/search/granules'
auth_domains = ['custom_auth_domain']
uat_login_cookie = ['uat_urs_user_already_logged']
uat_login_domain = 'uat.urs.earthdata.nasa.gov'
custom_session = asf_search.ASFSession(
cmr_host=uat_domain,
edl_host=uat_login_domain,
auth_cookie_names=uat_login_cookie,
auth_domains=auth_domains,
asf_auth_host=auth_host,
cmr_collections=cmr_collection,
edl_client_id=edl_client_id,
)
Pool()
pool = Pool(processes=2)
pool.map(_assert_pooled_instance_variables, [custom_session, custom_session])
pool.close()
pool.join()
def _assert_pooled_instance_variables(session):
uat_domain = 'cmr.uat.earthdata.nasa.gov'
edl_client_id = 'custom_client_id'
auth_host = 'custom_auth_host'
cmr_collection = '/search/granules'
auth_domains = ['custom_auth_domain']
uat_login_cookie = ['uat_urs_user_already_logged']
uat_login_domain = 'uat.urs.earthdata.nasa.gov'
assert session.cmr_host == uat_domain
assert session.edl_host == uat_login_domain
assert session.auth_cookie_names == uat_login_cookie
assert session.auth_domains == auth_domains
assert session.asf_auth_host == auth_host
assert session.cmr_collections == cmr_collection
assert session.edl_client_id == edl_client_id
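Pool.map pickles every argument into the worker processes, so the pooling test above only passes if the session's custom instance attributes survive a pickle round trip. A minimal stdlib sketch of that round trip, with a plain dict standing in for the session state:

```python
import pickle

# Stand-in for the attributes the pooling test checks; Pool.map pickles
# each argument into the worker process, so they must survive unchanged.
session_state = {
    'cmr_host': 'cmr.uat.earthdata.nasa.gov',
    'edl_host': 'uat.urs.earthdata.nasa.gov',
    'auth_cookie_names': ['uat_urs_user_already_logged'],
}
restored_state = pickle.loads(pickle.dumps(session_state))
assert restored_state == session_state
```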
# Discovery-asf_search-8.1.2/tests/BaselineSearch/Stack/test_stack.py
from typing import List
from numbers import Number
from asf_search.baseline.stack import find_new_reference, get_baseline_from_stack
from asf_search import ASFProduct, ASFSearchResults, ASFSession, ASFStackableProduct
from asf_search.search.search_generator import as_ASFProduct
import pytest
def run_test_find_new_reference(stack: List, output_index: Number) -> None:
"""
Test asf_search.baseline.stack.find_new_reference
"""
if stack == []:
assert find_new_reference(stack) is None
else:
products = [as_ASFProduct(product, ASFSession()) for product in stack]
for idx, product in enumerate(products):
product = clear_baseline(stack[idx], product)
assert (
find_new_reference(products).properties['sceneName']
== stack[output_index]['properties']['sceneName']
)
def run_test_get_default_product_type(product: ASFStackableProduct, product_type: str) -> None:
assert product.get_default_baseline_product_type() == product_type
def run_test_get_baseline_from_stack(reference, stack, output_stack, error):
reference = as_ASFProduct(reference, ASFSession())
stack = ASFSearchResults([as_ASFProduct(product, ASFSession()) for product in stack])
if error is None:
stack, warnings = get_baseline_from_stack(reference, stack)
keys = ['sceneName', 'perpendicularBaseline', 'temporalBaseline']
for idx, product in enumerate(stack):
for key in keys:
assert product.properties[key] == output_stack[idx]['properties'][key]
return
with pytest.raises(ValueError):
for product in stack:
if product.baseline.get('insarBaseline', False):
product.baseline = {}
else:
product.baseline['stateVectors']['positions'] = {}
product.baseline['stateVectors']['velocities'] = {}
reference.baseline = {}
get_baseline_from_stack(reference=reference, stack=stack)
def run_test_valid_state_vectors(reference, output):
if reference is not None:
product = as_ASFProduct(reference, ASFSession())
clear_baseline(reference, product)
assert output == product.is_valid_reference()
return
def clear_baseline(resource, product: ASFProduct):
    # Baseline values can be restored from UMM in the ASFProduct constructor;
    # this erases them again if the resource omitted them from the product
if stateVectors := resource['baseline'].get('stateVectors'):
if stateVectors.get('positions') == {}:
product.baseline = {'stateVectors': {'positions': {}, 'velocities': {}}}
return product
# Discovery-asf_search-8.1.2/tests/BaselineSearch/test_baseline_search.py
from copy import deepcopy
from unittest.mock import patch
from asf_search.exceptions import ASFBaselineError, ASFSearchError
from asf_search.ASFSearchResults import ASFSearchResults
from asf_search import ASFSession
from asf_search.search.baseline_search import stack_from_id, stack_from_product
from asf_search.baseline.stack import calculate_temporal_baselines
import pytest
from asf_search.search.search_generator import as_ASFProduct
def run_test_get_preprocessed_stack_params(product):
reference = as_ASFProduct(product, ASFSession())
params = reference.get_stack_opts()
original_properties = product['properties']
assert params.processingLevel == [reference.get_default_baseline_product_type()]
assert params.insarStackId == original_properties['insarStackId']
assert len(dict(params)) == 2
def run_test_get_unprocessed_stack_params(product):
reference = as_ASFProduct(product, ASFSession())
params = reference.get_stack_opts()
original_properties = product['properties']
assert original_properties['polarization'] in params.polarization
if reference.properties['processingLevel'] == 'BURST':
assert [reference.properties['polarization']] == params.polarization
assert [reference.properties['burst']['fullBurstID']] == params.fullBurstID
else:
assert (
['VV', 'VV+VH'] == params.polarization
if reference.properties['polarization'] in ['VV', 'VV+VH']
else ['HH', 'HH+HV'] == params.polarization
)
assert len(dict(params)) == 7
def run_get_stack_opts_invalid_insarStackId(product):
invalid_reference = as_ASFProduct(product, ASFSession())
invalid_reference.properties['insarStackId'] = '0'
with pytest.raises(ASFBaselineError):
invalid_reference.get_stack_opts()
def run_test_calc_temporal_baselines(reference, stack):
reference = as_ASFProduct(reference, ASFSession())
stack = ASFSearchResults([as_ASFProduct(product, ASFSession()) for product in stack])
stackLength = len(stack)
calculate_temporal_baselines(reference, stack)
assert len(stack) == stackLength
for secondary in stack:
assert 'temporalBaseline' in secondary.properties
def run_test_stack_from_product(reference, stack):
reference = as_ASFProduct(reference, ASFSession())
with patch('asf_search.baseline_search.search') as search_mock:
search_mock.return_value = ASFSearchResults(
[as_ASFProduct(product, ASFSession()) for product in stack]
)
stack = stack_from_product(reference)
for idx, secondary in enumerate(stack):
if idx > 0:
assert (
secondary.properties['temporalBaseline']
>= stack[idx - 1].properties['temporalBaseline']
)
def run_test_stack_from_id(stack_id: str, reference, stack):
temp = deepcopy(stack)
with patch('asf_search.baseline_search.product_search') as mock_product_search:
mock_product_search.return_value = ASFSearchResults(
[as_ASFProduct(product, ASFSession()) for product in stack]
)
if not stack_id:
with pytest.raises(ASFSearchError):
stack_from_id(stack_id)
else:
with patch('asf_search.baseline_search.search') as search_mock:
search_mock.return_value = ASFSearchResults(
[as_ASFProduct(product, ASFSession()) for product in temp]
)
returned_stack = stack_from_id(stack_id)
assert len(returned_stack) == len(stack)
for idx, secondary in enumerate(returned_stack):
if idx > 0:
assert (
secondary.properties['temporalBaseline']
>= stack[idx - 1]['properties']['temporalBaseline']
)
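The stack tests above assert that results come back sorted by temporal baseline. A small sketch of that ordering property, with hypothetical baseline values:

```python
# temporalBaseline must be non-decreasing through the returned stack;
# the values here are hypothetical.
stack_props = [{'temporalBaseline': t} for t in (-24, -12, 0, 12)]
assert all(
    stack_props[i]['temporalBaseline'] >= stack_props[i - 1]['temporalBaseline']
    for i in range(1, len(stack_props))
)
```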
# Discovery-asf_search-8.1.2/tests/CMR/test_MissionList.py
from asf_search.CMR.MissionList import get_campaigns
from asf_search.search.campaigns import _get_project_names
import pytest
import requests_mock
from asf_search.constants.INTERNAL import CMR_COLLECTIONS_PATH, CMR_HOST
from asf_search.exceptions import CMRError
def test_getMissions_error():
with requests_mock.Mocker() as m:
m.register_uri(
'POST',
'https://' + CMR_HOST + CMR_COLLECTIONS_PATH,
status_code=300,
json={'error': {'report': ''}},
)
with pytest.raises(CMRError):
get_campaigns({})
def test_getMissions_error_parsing():
with requests_mock.Mocker() as m:
m.post('https://' + CMR_HOST + CMR_COLLECTIONS_PATH)
with pytest.raises(CMRError):
get_campaigns({})
def run_test_get_project_names(cmr_ummjson, campaigns):
assert _get_project_names(cmr_ummjson) == campaigns
# Discovery-asf_search-8.1.2/tests/Search/test_search.py
from numbers import Number
from asf_search import ASFProduct, ASFSearchOptions
from asf_search import ASFSession
# from asf_search.CMR.translate import get
from tenacity import retry, retry_if_exception_type, stop_after_attempt
from asf_search import ASF_LOGGER
from asf_search.CMR.subquery import build_subqueries
from asf_search.CMR.translate import try_parse_date
from asf_search.constants import INTERNAL
from asf_search.exceptions import ASFSearchError
from asf_search.search import search
from asf_search.ASFSearchResults import ASFSearchResults
from asf_search.CMR import dataset_collections
from pytest import raises
from typing import List
import requests
import requests_mock
from asf_search.search.search_generator import as_ASFProduct, preprocess_opts
SEARCHAPI_URL = 'https://api.daac.asf.alaska.edu'
SEARCHAPI_ENDPOINT = '/services/search/param?'
def run_test_ASFSearchResults(search_resp):
search_results = ASFSearchResults(
[as_ASFProduct(product, ASFSession()) for product in search_resp]
)
assert len(search_results) == len(search_resp)
assert search_results.geojson()['type'] == 'FeatureCollection'
for idx, feature in enumerate(search_results):
# temporal and perpendicular baseline values are calculated post-search,
# so there's no instance where they'll be returned in a CMR search
search_resp[idx]['properties'].pop('temporalBaseline', None)
search_resp[idx]['properties'].pop('perpendicularBaseline', None)
assert feature.geojson()['geometry'] == search_resp[idx]['geometry']
for key, item in feature.geojson()['properties'].items():
if key == 'esaFrame':
assert search_resp[idx]['properties']['frameNumber'] == item
elif 'esaFrame' in feature.geojson()['properties'].keys() and key == 'frameNumber':
continue
elif key in ['stopTime', 'startTime', 'processingDate']:
assert try_parse_date(item) == try_parse_date(search_resp[idx]['properties'][key])
elif search_resp[idx]['properties'].get(key) is not None and item is not None:
assert item == search_resp[idx]['properties'][key]
def run_test_search(search_parameters, answer):
with requests_mock.Mocker() as m:
m.post(
f'https://{INTERNAL.CMR_HOST}{INTERNAL.CMR_GRANULE_PATH}',
json={'items': answer, 'hits': len(answer)},
)
response = search(**search_parameters)
if search_parameters.get('maxResults', False):
assert len(response) == search_parameters['maxResults']
assert len(response) == len(answer)
# assert(response.geojson()["features"] == answer)
def run_test_search_http_error(search_parameters, status_code: Number, report: str):
if not len(search_parameters.keys()):
with requests_mock.Mocker() as m:
m.register_uri(
'POST',
f'https://{INTERNAL.CMR_HOST}{INTERNAL.CMR_GRANULE_PATH}',
status_code=status_code,
json={'errors': {'report': report}},
)
m.register_uri('POST', 'https://search-error-report.asf.alaska.edu/', real_http=True)
searchOptions = ASFSearchOptions(**search_parameters)
with raises(ASFSearchError):
search(opts=searchOptions)
return
# If we're not doing an empty search we want to fire off one real query to CMR, then interrupt it with an error
# We can tell a search isn't the first one by checking if 'CMR-Search-After' has been set
def custom_matcher(request: requests.Request):
if 'CMR-Search-After' in request.headers.keys():
resp = requests.Response()
resp.status_code = 200
return resp
return None
with requests_mock.Mocker() as m:
m.register_uri(
'POST',
f'https://{INTERNAL.CMR_HOST}{INTERNAL.CMR_GRANULE_PATH}',
real_http=True,
)
m.register_uri(
'POST',
f'https://{INTERNAL.CMR_HOST}{INTERNAL.CMR_GRANULE_PATH}',
additional_matcher=custom_matcher,
status_code=status_code,
json={'errors': {'report': report}},
)
m.register_uri('POST', 'https://search-error-report.asf.alaska.edu/', real_http=True)
search_parameters['maxResults'] = INTERNAL.CMR_PAGE_SIZE + 1
searchOptions = ASFSearchOptions(**search_parameters)
with raises(ASFSearchError):
search(opts=searchOptions)
def run_test_dataset_search(datasets: List):
if any(dataset for dataset in datasets if dataset_collections.get(dataset) is None):
with raises(ValueError):
search(dataset=datasets, maxResults=1)
else:
for dataset in datasets:
valid_shortnames = list(dataset_collections.get(dataset))
response = search(dataset=dataset, maxResults=250)
# Get collection shortName of all granules
shortNames = list(
set(
[
shortName
for product in response
if (
shortName := ASFProduct.umm_get(
product.umm, 'CollectionReference', 'ShortName'
)
)
is not None
]
)
)
# and check that results are limited to the expected datasets by their shortname
for shortName in shortNames:
assert shortName in valid_shortnames
def run_test_build_subqueries(params: ASFSearchOptions, expected: List):
# mainly for getting platform aliases
preprocess_opts(params)
actual = build_subqueries(params)
for a, b in zip(actual, expected):
for key, actual_val in a:
expected_val = getattr(b, key)
if isinstance(actual_val, list):
if key == 'cmr_keywords':
for idx, key_value_pair in enumerate(actual_val):
assert key_value_pair == expected_val[idx]
else:
if len(actual_val) > 0: # ASFSearchOptions leaves empty lists as None
expected_set = set(expected_val)
actual_set = set(actual_val)
difference = expected_set.symmetric_difference(actual_set)
assert (
len(difference) == 0
), f'Found {len(difference)} missing entries for subquery generated keyword: "{key}"\n{list(difference)}'
else:
assert actual_val == expected_val
def run_test_keyword_aliasing_results(params: ASFSearchOptions):
module_response = search(opts=params)
try:
api_response = query_endpoint(dict(params))
except requests.ReadTimeout:
        ASF_LOGGER.warning(f'SearchAPI timed out, skipping test for params {str(params)}')
return
api_results = api_response['results']
api_dict = {product['granuleName']: True for product in api_results}
for product in module_response:
sceneName = product.properties['sceneName']
assert api_dict.get(
sceneName, False
    ), f'Found unexpected scene in asf-search module results, {sceneName}\n{dict(params)}'
@retry(
stop=stop_after_attempt(3),
retry=retry_if_exception_type(requests.HTTPError),
reraise=True,
)
def query_endpoint(params):
response = requests.post(
url=SEARCHAPI_URL + SEARCHAPI_ENDPOINT, data={**params, 'output': 'jsonlite'}
)
response.raise_for_status()
return response.json()
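query_endpoint above leans on tenacity for its retry behaviour. A stdlib-only sketch of the same semantics (retry_call and flaky are hypothetical stand-ins): up to 3 attempts, retrying only on one exception type, re-raising the last error unchanged if every attempt fails.

```python
def retry_call(fn, attempts=3, exc_type=Exception):
    # Retry fn up to `attempts` times on exc_type only.
    for attempt in range(1, attempts + 1):
        try:
            return fn()
        except exc_type:
            if attempt == attempts:
                raise  # mirrors tenacity's reraise=True

calls = []

def flaky():
    # Fails twice, then succeeds, so exactly 3 attempts are made.
    calls.append(1)
    if len(calls) < 3:
        raise ValueError('transient failure')
    return 'ok'

assert retry_call(flaky, attempts=3, exc_type=ValueError) == 'ok'
assert len(calls) == 3
```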
# Discovery-asf_search-8.1.2/tests/Search/test_search_generator.py
from asf_search import ASFSearchOptions, ASFSearchResults
from asf_search import INTERNAL
from typing import List
import math
from asf_search.search import search_generator, preprocess_opts
def run_test_search_generator_multi(search_opts: List[ASFSearchOptions]):
queries = [search_generator(opts=opts) for opts in search_opts]
expected_results_size = sum([opts.maxResults for opts in search_opts])
expected_page_count = sum(
[math.ceil(opts.maxResults / INTERNAL.CMR_PAGE_SIZE) for opts in search_opts]
)
combined_results = []
page_count = 0
searches = {}
for opt in search_opts:
if isinstance(opt.platform, list):
for platform in opt.platform:
searches[platform] = False
else:
searches[opt.platform] = False
while len(queries):
queries_iter = iter(queries)
for idx, query in enumerate(queries_iter): # Alternate pages between results
page = next(query, None)
if page is not None:
combined_results.extend(page)
page_count += 1
if page.searchComplete:
if isinstance(page.searchOptions.platform, list):
for platform in page.searchOptions.platform:
searches[platform] = True
else:
searches[page.searchOptions.platform] = True
else:
queries[idx] = None
queries = [query for query in queries if query is not None]
assert page_count == expected_page_count
assert len(combined_results) == expected_results_size
    assert len([completed for completed in searches.values() if completed]) >= len(search_opts)
def run_test_search_generator(search_opts: ASFSearchOptions):
pages_iter = search_generator(opts=search_opts)
page_count = int(search_opts.maxResults / INTERNAL.CMR_PAGE_SIZE)
page_idx = 0
results = ASFSearchResults([])
for page in pages_iter:
results.extend(page)
results.searchComplete = page.searchComplete
results.searchOptions = page.searchOptions
page_idx += 1
assert page_count <= page_idx
assert len(results) <= search_opts.maxResults
assert results.searchComplete
preprocess_opts(search_opts)
for key, val in search_opts:
if key != 'maxResults':
assert getattr(results.searchOptions, key) == val
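The generator tests above reason about expected page counts. A sketch of that arithmetic (paged is a hypothetical stand-in for the CMR paging): `total` results served in pages of `page_size` yield ceil(total / page_size) pages.

```python
import math

def paged(total, page_size):
    # Yield consecutive pages of at most page_size items.
    for start in range(0, total, page_size):
        yield list(range(start, min(start + page_size, total)))

pages = list(paged(total=501, page_size=250))
assert len(pages) == math.ceil(501 / 250) == 3
assert sum(len(p) for p in pages) == 501
```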
# Discovery-asf_search-8.1.2/tests/Serialization/test_serialization.py
from asf_search import ASFSearchResults, ASFSession
from asf_search.ASFSearchOptions.ASFSearchOptions import ASFSearchOptions
import os
import json
from asf_search.search.search_generator import as_ASFProduct
def run_test_serialization(product=None, results=None, opts=ASFSearchOptions()):
if product is None:
to_serialize = ASFSearchResults([json_to_product(prod) for prod in results])
else:
to_serialize = ASFSearchResults([json_to_product(product)])
with open('serialized_product.json', 'w') as f:
f.write(json.dumps({'results': to_serialize.geojson(), 'opts': dict(opts)}))
f.close()
with open('serialized_product.json', 'r') as f:
deserialized = json.loads(f.read())
f.close()
os.remove('serialized_product.json')
deserialized_results = deserialized.get('results')
deserialized_opts = deserialized.get('opts')
for key, value in deserialized_opts.items():
assert value == getattr(opts, key)
for idx, original in enumerate(to_serialize):
assert deserialized_results['features'][idx]['properties'] == original.properties
assert deserialized_results['features'][idx]['geometry'] == original.geometry
assert deserialized_results['type'] == 'FeatureCollection'
def json_to_product(product):
output = as_ASFProduct(product, session=ASFSession())
output.meta = product['meta']
output.properties = product['properties']
output.geometry = product['geometry']
output.umm = product['umm']
return output
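run_test_serialization above writes results out as geojson and reads them back. A minimal sketch of that round trip, with hypothetical feature values:

```python
import json

# A geojson-style FeatureCollection dumped to JSON and loaded back must
# compare equal field-for-field.
feature = {
    'type': 'Feature',
    'geometry': {'type': 'Point', 'coordinates': [0.0, 1.0]},
    'properties': {'sceneName': 'example-scene'},  # hypothetical values
}
payload = {'results': {'type': 'FeatureCollection', 'features': [feature]}, 'opts': {}}
round_tripped = json.loads(json.dumps(payload))
assert round_tripped == payload
```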
# Discovery-asf_search-8.1.2/tests/WKT/test_validate_wkt.py
from numbers import Number
import pytest
from typing import List
from shapely.wkt import loads
from shapely.geometry.base import BaseMultipartGeometry
from asf_search.WKT.validate_wkt import (
validate_wkt,
_search_wkt_prep,
_get_clamped_and_wrapped_geometry,
_get_convex_hull,
_merge_overlapping_geometry,
_counter_clockwise_reorientation,
_simplify_aoi,
_get_shape_coords,
)
from asf_search.exceptions import ASFWKTError
def run_test_validate_wkt_invalid_wkt_error(wkt: str):
with pytest.raises(ASFWKTError):
validate_wkt(wkt)
def run_test_validate_wkt_valid_wkt(wkt: str, validated_wkt: str):
expected_aoi = loads(validated_wkt)
actual_wrapped, actual_unwrapped, _ = validate_wkt(wkt)
assert actual_wrapped.equals(
expected_aoi
), f'expected, {expected_aoi.wkt}, got {actual_wrapped.wkt}'
actual_from_geom_wrapped, actual_from_geom_unwrapped, _ = validate_wkt(loads(wkt))
assert actual_from_geom_wrapped.equals(expected_aoi)
def run_test_validate_wkt_clamp_geometry(
wkt: str, clamped_wkt: str, clamped_count: Number, wrapped_count: Number
):
resp = _get_clamped_and_wrapped_geometry(loads(wkt))
assert resp[0].wkt == clamped_wkt
if clamped_count > 0:
assert resp[2][0].report.split(' ')[2] == str(clamped_count)
if wrapped_count > 0:
assert resp[2][1].report.split(' ')[2] == str(wrapped_count)
def run_test_validate_wkt_convex_hull(wkt: str, corrected_wkt: str):
shape = loads(wkt)
assert corrected_wkt == _get_convex_hull(shape)[0].wkt
def run_test_validate_wkt_merge_overlapping_geometry(wkt: str, merged_wkt: str):
shape = loads(wkt)
overlapping = _merge_overlapping_geometry(shape)
if isinstance(overlapping, BaseMultipartGeometry):
overlapping = overlapping.geoms
assert overlapping[0].equals(loads(merged_wkt))
def run_test_validate_wkt_counter_clockwise_reorientation(wkt: str, cc_wkt: str):
shape = loads(wkt)
assert cc_wkt == _counter_clockwise_reorientation(shape)[0].wkt
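A stdlib sketch of the orientation rule behind _counter_clockwise_reorientation (signed_area is a hypothetical helper, not the library's implementation): the shoelace sum is positive for a counter-clockwise ring and negative for a clockwise one.

```python
def signed_area(ring):
    # Shoelace formula over consecutive vertex pairs, closing the ring.
    return 0.5 * sum(
        x1 * y2 - x2 * y1
        for (x1, y1), (x2, y2) in zip(ring, ring[1:] + ring[:1])
    )

ccw_ring = [(0, 0), (1, 0), (1, 1), (0, 1)]
assert signed_area(ccw_ring) > 0
assert signed_area(list(reversed(ccw_ring))) < 0
```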
def run_test_validate_wkt_get_shape_coords(wkt: str, coords: List[Number]):
shape = loads(wkt)
shape_coords = [[coord[0], coord[1]] for coord in _get_shape_coords(shape)]
coords.sort()
shape_coords.sort()
assert len(shape_coords) == len(coords)
assert shape_coords == coords
def run_test_search_wkt_prep(wkt: str):
if wkt == ' ':
with pytest.raises(ASFWKTError):
_search_wkt_prep(None)
return
shape = loads(wkt)
ls = _search_wkt_prep(shape)
assert ls.geometryType() == shape.geometryType()
assert shape.wkt == wkt
def run_test_simplify_aoi(wkt: str, simplified: str, repairs: List[str]):
shape = loads(wkt)
resp, shape_repairs = _simplify_aoi(shape)
assert resp.equals(loads(simplified))
for idx, repair in enumerate(repairs):
assert shape_repairs[idx].report.startswith(repair)
# Discovery-asf_search-8.1.2/tests/download/test_download.py
import unittest
from asf_search.exceptions import ASFAuthenticationError, ASFDownloadError
import pytest
from unittest.mock import patch
import requests
from asf_search.download.download import download_url
def run_test_download_url_auth_error(url, path, filename):
with patch('asf_search.ASFSession.get') as mock_get:
resp = requests.Response()
resp.status_code = 401
mock_get.return_value = resp
if url == 'pathError':
with pytest.raises(ASFDownloadError):
download_url(url, path, filename)
with patch('os.path.isdir') as path_mock:
path_mock.return_value = True
if url == 'urlError':
with patch('os.path.isfile') as isfile_mock:
isfile_mock.return_value = False
with pytest.raises(ASFAuthenticationError):
download_url(url, path, filename)
with patch('os.path.isfile') as isfile_mock:
isfile_mock.return_value = True
with pytest.warns(Warning):
download_url(url, path, filename)
def run_test_download_url(url, path, filename):
if filename == 'BURST':
with patch('asf_search.ASFSession.get') as mock_get:
resp = requests.Response()
resp.status_code = 202
resp.headers.update({'content-type': 'application/json'})
mock_get.return_value = resp
with patch('asf_search.ASFSession.get') as mock_get_burst:
resp_2 = requests.Response()
resp_2.status_code = 200
resp_2.headers.update({'content-type': 'image/tiff'})
mock_get_burst.return_value = resp_2
resp_2.iter_content = lambda chunk_size: []
with patch('builtins.open', unittest.mock.mock_open()):
download_url(url, path, filename)
else:
with patch('asf_search.ASFSession.get') as mock_get:
resp = requests.Response()
resp.status_code = 200
mock_get.return_value = resp
resp.iter_content = lambda chunk_size: []
with patch('builtins.open', unittest.mock.mock_open()):
download_url(url, path, filename)
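The download tests above never touch the network: the HTTP layer is stubbed with mocks whose return_value is a canned response. A minimal sketch of that pattern:

```python
from unittest.mock import MagicMock

# A mock session whose get() returns a canned 401 response, so error
# paths can be exercised without any real request.
session = MagicMock()
session.get.return_value = MagicMock(status_code=401)

response = session.get('https://example.invalid/file.zip')
assert response.status_code == 401
session.get.assert_called_once()
```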
# Discovery-asf_search-8.1.2/tests/pytest-config.yml
# Contents of pytest-config.yml
test_types:
- For running ASFProduct tests:
required_keys: products
method: test_ASFProduct
- For running ASFProduct_Stack tests:
required_keys: ["product", "preprocessed_stack", "processed_stack"]
required_in_title: ASFProduct_Stack
method: test_ASFProduct_Stack
- For running ASFProduct_get_stack_options tests:
required_keys: ["product", "options"]
required_in_title: ASFProduct-get-stack-options
method: test_ASFProduct_get_stack_options
- For running ASFProduct_download_file tests:
required_keys: ["product", "filename", "filetype", "additionalUrls"]
required_in_title: ASFProduct-download-file
method: test_ASFProduct_download
- For running ASFSession tests:
required_in_title: password-login
required_keys: ['username', 'password']
method: test_ASFSession_Error
- For running ASFSession token_auth tests:
required_in_title: token-auth
required_keys: token
method: test_ASFSession_Token_Error
- For running ASFSession cookiejar_auth tests:
required_in_title: cookiejar-auth
required_keys: cookies
method: test_ASFSession_Cookie_Error
- For running ASFSession rebuild_auth tests:
required_in_title: rebuild_auth
required_keys: ['original_domain', 'response_domain', 'response_code', 'final_token']
method: test_asf_session_rebuild_auth
- For running preprocessed baseline stack params tests:
required_keys: product
required_in_title: test-preprocessed-product
method: test_get_preprocessed_stack_params
- For running unprocessed baseline stack params tests:
required_keys: product
required_in_title: test-unprocessed-product
method: test_get_unprocessed_stack_params
- For running Invalid Stack ID stack params tests:
required_keys: product
required_in_title: test-invalid-insarStackID
method: test_get_stack_opts_invalid_insarStackId
- For running temporal baseline calculation tests:
required_keys: ["product", "stack"]
required_in_title: test-temporal-baseline
method: test_temporal_baseline
- For running stack_from_product tests:
required_keys: ["product", "stack"]
required_in_title: test-product-stack
method: test_stack_from_product
- For running stack_from_id tests:
required_keys: ["stack_id", "stack", "stack_reference"]
required_in_title: test-stack-id
method: test_stack_from_id
- For running stack from ASFSearch tests:
required_keys: response
required_in_title: test-ASFSearch
method: test_ASFSearchResults
- For ASFSearch search with parameters tests:
required_keys: ["parameters", "answer"]
required_in_title: test-ASFSearch-search-valid
method: test_ASFSearch_Search
- For ASFSearch search_generator with parameters tests:
required_keys: parameters
required_in_title: test-ASFSearch-search-valid
method: test_ASFSearch_Search_Generator
- For running ASFSearch search error tests:
required_keys: ["parameters", "status_code", "report"]
required_in_title: test-ASFSearch-search-error
method: test_ASFSearch_Search_Error
- For running WKT Validation invalid wkt error tests:
required_keys: wkt
required_in_title: 'test-validate-wkt-invalid-error'
method: test_wkt_validation_Invalid_WKT_Error
- For running WKT Validation point tests:
required_keys: ["wkt", "validated-wkt"]
required_in_title: 'test-validate-wkt-valid'
method: test_wkt_validation_WKT_Valid
- For running _get_clamped_and_wrapped_geometry tests:
required_keys: ["wkt", "clamped-wkt", "clamped-count", "wrapped-count"]
required_in_title: test-validate-wkt-clamp
method: test_wkt_validation_WKT_clamp_geometry
- For running _get_convex_hull tests:
required_keys: ["wkt", "convex-wkt"]
required_in_title: test-validate-wkt-convex-hull
method: test_wkt_validation_convex_hull
- For running _merge_overlapping_geometry tests:
required_keys: ["wkt", "merged-wkt"]
required_in_title: test-validate-wkt-merge-overlapping-geometry
method: test_wkt_validation_merge_overlapping_geometry
- For running _counter_clockwise_reorientation tests:
required_keys: ["wkt", "cc-wkt"]
required_in_title: test-validate-wkt-reorient
method: test_wkt_validation_counter_clockwise_reorientation
- For running _wkt_prep tests:
required_keys: wkt
required_in_title: test-search-wkt-prep
method: test_search_wkt_prep
- For running _simplify_aoi tests:
required_keys: ["wkt", "simplified-wkt", "RepairEntries"]
required_in_title: test-simplify-aoi
method: test_simplify_aoi
- For running _get_project_names tests:
required_keys: ["cmr_ummjson", "campaigns"]
required_in_title: test_get_project_names
method: test_get_platform_campaign_names
- For running _get_shape_coords tests:
required_keys: ["wkt", "coordinates"]
required_in_title: test-validate-wkt-get-shape-coords
method: test_validate_wkt_get_shape_coords
- For running ASFSearchResults_intersection tests:
required_keys: ["wkt"]
required_in_title: ASFSearchResults_intersection
method: test_ASFSearchResults_intersection
- For running download_url tests:
required_keys: ["url", "path", "filename"]
required_in_title: test-download-url
method: test_download_url
- For running find_new_reference tests:
required_keys: ["stack", "output_index"]
required_in_title: test-find-new-reference
method: test_find_new_reference
- For running test_get_default_product_type tests:
required_keys: ["product", "product_type"]
required_in_title: test-get_default_product_type
method: test_get_default_product_type
- For running test_get_baseline_from_stack tests:
required_keys: ['reference', 'stack', 'output_stack', 'error']
required_in_title: test-get-baseline-from-stack
method: test_get_baseline_from_stack
- For running valid state vectors tests:
required_keys: ["reference", "output"]
required_in_title: test-valid-state-vectors
method: test_valid_state_vectors
- For running test_validator_map_validate tests:
required_keys: ['key', 'value', 'output']
required_in_title: test-validator-map-validate
method: test_validator_map_validate
- For running ASFSearchOptions validator tests:
required_keys: ['validator', 'input', 'output', 'error']
required_in_title: test-validators
method: test_ASFSearchOptions_validator
- For running ASFSearchOptions tests:
required_in_title: test-ASFSearchOptions
method: test_ASFSearchOptions
- For running serialization tests:
required_in_title: serialization
method: test_serialization
# - For running ASFSearchOptions tests:
# required_in_title: ASFSearchResults-format
# required_keys: results
# method: test_output_format
- For running search-api keyword-collection aliasing tests:
required_in_title: test-aliasing-search-against-api
required_keys: params
method: test_keyword_aliasing_results
- For running dataset keyword tests:
required_in_title: search-dataset
required_keys: dataset
method: test_search_dataset
- For running build_subquery tests:
required_in_title: search-build_subquery
required_keys: ['params', 'expected']
method: test_build_subqueries
- For running jupyter notebook example tests:
required_keys: notebook
method: test_notebook_examples
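A stdlib sketch of the matching rule this config implies (the real dispatch lives in the pytest-automation plugin; resolve_method and the two sample entries are hypothetical): a test case is routed to the first entry whose required_keys all appear in the test data and whose required_in_title substring, if any, appears in the test title.

```python
sample_types = [
    {'required_keys': ['product', 'stack'],
     'required_in_title': 'test-product-stack',
     'method': 'test_stack_from_product'},
    {'required_keys': 'products', 'method': 'test_ASFProduct'},
]

def resolve_method(title, test_info):
    for entry in sample_types:
        keys = entry['required_keys']
        keys = [keys] if isinstance(keys, str) else keys  # allow scalar or list
        if not all(key in test_info for key in keys):
            continue
        if entry.get('required_in_title', '') not in title:
            continue
        return entry['method']
    return None

assert resolve_method('test-product-stack case', {'product': {}, 'stack': []}) == 'test_stack_from_product'
assert resolve_method('basic case', {'products': []}) == 'test_ASFProduct'
```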
# Discovery-asf_search-8.1.2/tests/pytest-managers.py
from typing import Dict, List
from asf_search import ASFSearchOptions, ASFSession, FileDownloadType, search
from asf_search.exceptions import ASFAuthenticationError
from ASFProduct.test_ASFProduct import (
run_test_ASFProduct,
run_test_ASFProduct_download,
run_test_product_get_stack_options,
run_test_stack,
)
from ASFSearchOptions.test_ASFSearchOptions import run_test_ASFSearchOptions
from ASFSearchResults.test_ASFSearchResults import (
run_test_ASFSearchResults_intersection,
)
from ASFSession.test_ASFSession import (
run_auth_with_cookiejar,
run_auth_with_creds,
run_auth_with_token,
run_test_asf_session_rebuild_auth,
)
from BaselineSearch.test_baseline_search import (
run_test_stack_from_id,
run_test_stack_from_product,
run_test_calc_temporal_baselines,
run_get_stack_opts_invalid_insarStackId,
run_test_get_unprocessed_stack_params,
run_test_get_preprocessed_stack_params,
)
from Search.test_search import (
run_test_ASFSearchResults,
run_test_build_subqueries,
run_test_dataset_search,
run_test_keyword_aliasing_results,
run_test_search,
run_test_search_http_error,
)
from Search.test_search_generator import (
run_test_search_generator,
run_test_search_generator_multi,
)
from CMR.test_MissionList import run_test_get_project_names
from pytest import raises
from unittest.mock import patch
import os
import pathlib
import yaml
from WKT.test_validate_wkt import (
run_test_search_wkt_prep,
run_test_validate_wkt_get_shape_coords,
run_test_validate_wkt_clamp_geometry,
run_test_validate_wkt_valid_wkt,
run_test_validate_wkt_convex_hull,
run_test_validate_wkt_counter_clockwise_reorientation,
run_test_validate_wkt_invalid_wkt_error,
run_test_validate_wkt_merge_overlapping_geometry,
run_test_simplify_aoi,
)
from ASFSearchOptions.test_ASFSearchOptions import (
run_test_ASFSearchOptions_validator,
run_test_validator_map_validate,
)
from BaselineSearch.Stack.test_stack import (
run_test_find_new_reference,
run_test_get_baseline_from_stack,
run_test_get_default_product_type,
run_test_valid_state_vectors,
)
from asf_search.search.search_generator import as_ASFProduct
from download.test_download import (
run_test_download_url,
run_test_download_url_auth_error,
)
from Serialization.test_serialization import run_test_serialization
import nbformat
from nbconvert.preprocessors import ExecutePreprocessor
# asf_search.ASFProduct Tests
def test_ASFProduct(**args) -> None:
"""
    Tests basic ASFProduct with a mock searchAPI response
"""
test_info = args['test_info']
geographic_response = get_resource(test_info['products'])
run_test_ASFProduct(geographic_response)
def test_ASFProduct_Stack(**args) -> None:
"""
Tests ASFProduct.stack() with reference and corresponding stack
Checks for temporalBaseline order,
asserting the stack is ordered by the scene's temporalBaseline (in ascending order)
"""
test_info = args['test_info']
reference = get_resource(test_info['product'])
preprocessed_stack = get_resource(test_info['preprocessed_stack'])
processed_stack = get_resource(test_info['processed_stack'])
run_test_stack(reference, preprocessed_stack, processed_stack)
def test_ASFProduct_get_stack_options(**args) -> None:
test_info = args['test_info']
reference = get_resource(test_info['product'])
options = get_resource(test_info['options'])
run_test_product_get_stack_options(reference, options)
def test_ASFProduct_download(**args) -> None:
test_info = args['test_info']
reference = get_resource(test_info['product'])
filename = test_info['filename']
filetype_raw = test_info['filetype']
additional_urls = test_info['additionalUrls']
if filetype_raw == 1:
filetype = FileDownloadType.DEFAULT_FILE
elif filetype_raw == 2:
filetype = FileDownloadType.ADDITIONAL_FILES
else:
filetype = FileDownloadType.ALL_FILES
run_test_ASFProduct_download(reference, filename, filetype, additional_urls)
# asf_search.ASFSession Tests
def test_ASFSession_Error(**args) -> None:
"""
Test ASFSession.auth_with_creds for sign in errors
"""
test_info = args['test_info']
username = test_info['username']
password = test_info['password']
with patch('asf_search.ASFSession.get') as mock_get:
mock_get.return_value = 'Error'
with raises(ASFAuthenticationError):
run_auth_with_creds(username, password)
def test_ASFSession_Token_Error(**args) -> None:
"""
Test ASFSession.auth_with_token for sign in errors
"""
test_info = args['test_info']
token = test_info['token']
with raises(ASFAuthenticationError):
run_auth_with_token(token)
def test_ASFSession_Cookie_Error(**args) -> None:
"""
Test ASFSession.auth_with_cookie for sign in errors
"""
test_info = args['test_info']
cookies = test_info['cookies']
with raises(ASFAuthenticationError):
run_auth_with_cookiejar(cookies)
def test_asf_session_rebuild_auth(**args) -> None:
"""
Test asf_search.ASFSession.rebuild_auth
When redirecting from an ASF domain, only accept
domains listed in ASFSession.AUTH_DOMAINS
"""
test_info = args['test_info']
original_domain = test_info['original_domain']
response_domain = test_info['response_domain']
response_code = test_info['response_code']
final_token = test_info['final_token']
run_test_asf_session_rebuild_auth(original_domain, response_domain, response_code, final_token)
# asf_search.search.baseline_search Tests
def test_get_preprocessed_stack_params(**args) -> None:
"""
Test asf_search.search.baseline_search.get_stack_opts with a reference scene
that's part of a pre-calculated platform, asserting that get_stack_opts returns an object with two parameters
\n1. processingLevel
\n2. insarStackId
"""
test_info = args['test_info']
reference = get_resource(test_info['product'])
run_test_get_preprocessed_stack_params(reference)
def test_get_unprocessed_stack_params(**args) -> None:
"""
Test asf_search.search.baseline_search.get_stack_opts with a reference scene
that's not part of a pre-calculated platform, asserting that get_stack_opts returns an object with seven parameters
"""
test_info = args['test_info']
reference = get_resource(test_info['product'])
run_test_get_unprocessed_stack_params(reference)
def test_get_stack_opts_invalid_insarStackId(**args) -> None:
"""
    Test asf_search.search.baseline_search.get_stack_opts with the reference scene's
    insarStackID set to an invalid value, asserting an ASFBaselineError is raised
"""
test_info = args['test_info']
reference = get_resource(test_info['product'])
run_get_stack_opts_invalid_insarStackId(reference)
def test_temporal_baseline(**args) -> None:
"""
Test asf_search.search.baseline_search.calc_temporal_baselines, asserting mutated baseline stack
is still the same length and that each product's properties contain a temporalBaseline key
"""
test_info = args['test_info']
reference = get_resource(test_info['product'])
stack = get_resource(test_info['stack'])
run_test_calc_temporal_baselines(reference, stack)
def test_stack_from_product(**args) -> None:
"""
Test asf_search.search.baseline_search.stack_from_product, asserting stack returned is ordered
by temporalBaseline value in ascending order
"""
test_info = args['test_info']
reference = get_resource(test_info['product'])
stack = get_resource(test_info['stack'])
run_test_stack_from_product(reference, stack)
def test_stack_from_id(**args) -> None:
"""
Test asf_search.search.baseline_search.stack_from_id, asserting stack returned is ordered
by temporalBaseline value in ascending order
"""
test_info = args['test_info']
stack_id = test_info['stack_id']
stack_reference_data = test_info['stack_reference']
stack_data = test_info['stack']
stack_reference = get_resource(stack_reference_data)
stack = []
if stack_data != []:
stack = get_resource(stack_data)
run_test_stack_from_id(stack_id, stack_reference, stack)
# asf_search.ASFSearchResults Tests
def test_ASFSearchResults(**args) -> None:
"""
Test asf_search.ASFSearchResults, asserting initialized values,
and geojson response returns object with type FeatureCollection
"""
test_info = args['test_info']
search_response = get_resource(test_info['response'])
run_test_ASFSearchResults(search_response)
# asf_search.search Tests
def test_ASFSearch_Search(**args) -> None:
"""
Test asf_search.search, asserting returned value is expected result
"""
test_info = args['test_info']
parameters = get_resource(test_info['parameters'])
answer = get_resource(test_info['answer'])
run_test_search(parameters, answer)
def test_ASFSearch_Search_Generator(**args) -> None:
test_info = args['test_info']
params = get_resource(test_info['parameters'])
if isinstance(params, list):
opts = []
for p in params:
opts.append(ASFSearchOptions(**p))
run_test_search_generator_multi(opts)
else:
run_test_search_generator(ASFSearchOptions(**params))
def test_ASFSearch_Search_Error(**args) -> None:
"""
Test asf_search.search errors,
asserting server and client errors are raised
"""
test_info = args['test_info']
parameters = test_info['parameters']
report = test_info['report']
error_code = test_info['status_code']
run_test_search_http_error(parameters, error_code, report)
def test_wkt_validation_Invalid_WKT_Error(**args) -> None:
"""
Test asf_search.wkt errors,
asserting wkt validation errors are raised
"""
test_info = args['test_info']
wkt = get_resource(test_info['wkt'])
run_test_validate_wkt_invalid_wkt_error(wkt)
def test_wkt_validation_WKT_Valid(**args) -> None:
"""
Test asf_search.validate_wkt, asserting expected wkts are returned
"""
test_info = args['test_info']
wkt = get_resource(test_info['wkt'])
validated_wkt = get_resource(test_info['validated-wkt'])
run_test_validate_wkt_valid_wkt(wkt, validated_wkt)
def test_wkt_validation_WKT_clamp_geometry(**args) -> None:
"""
    Test asf_search.validate_wkt._get_clamped_and_wrapped_geometry, asserting the number of clamped and wrapped coordinates
"""
test_info = args['test_info']
wkt = get_resource(test_info['wkt'])
clamped_wkt = get_resource(test_info['clamped-wkt'])
clamped_count = get_resource(test_info['clamped-count'])
wrapped_count = get_resource(test_info['wrapped-count'])
run_test_validate_wkt_clamp_geometry(wkt, clamped_wkt, clamped_count, wrapped_count)
def test_wkt_validation_convex_hull(**args) -> None:
"""
    Test asf_search.validate_wkt._get_convex_hull, asserting convex hulls produced are as expected
"""
test_info = args['test_info']
wkt = get_resource(test_info['wkt'])
convex_wkt = get_resource(test_info['convex-wkt'])
run_test_validate_wkt_convex_hull(wkt, convex_wkt)
def test_wkt_validation_merge_overlapping_geometry(**args) -> None:
"""
Test asf_search.validate_wkt._merge_overlapping_geometry, asserting expected shapes are merged
"""
test_info = args['test_info']
wkt = get_resource(test_info['wkt'])
merged_wkt = get_resource(test_info['merged-wkt'])
run_test_validate_wkt_merge_overlapping_geometry(wkt, merged_wkt)
def test_wkt_validation_counter_clockwise_reorientation(**args) -> None:
"""
Test asf_search.validate_wkt._counter_clockwise_reorientation reverses polygon orientation if polygon is wound clockwise,
and maintains counter-clockwise winding when polygon orientation is correct
"""
test_info = args['test_info']
wkt = get_resource(test_info['wkt'])
cc_wkt = get_resource(test_info['cc-wkt'])
run_test_validate_wkt_counter_clockwise_reorientation(wkt, cc_wkt)
def test_validate_wkt_get_shape_coords(**args) -> None:
"""
    Test asf_search.validate_wkt._get_shape_coords, asserting all coordinates are returned as expected
"""
test_info = args['test_info']
wkt = get_resource(test_info['wkt'])
coords = get_resource(test_info['coordinates'])
run_test_validate_wkt_get_shape_coords(wkt, coords)
def test_search_wkt_prep(**args) -> None:
"""
Test asf_search.validate_wkt.wkt_prep, asserting returned shape is correct geometric type and expected shape
"""
test_info = args['test_info']
wkt = get_resource(test_info['wkt'])
run_test_search_wkt_prep(wkt)
def test_simplify_aoi(**args) -> None:
"""
Test asf_search.validate_wkt.wkt_prep, asserting returned shape is correct geometric type and expected shape
"""
test_info = args['test_info']
wkt = get_resource(test_info['wkt'])
simplified = get_resource(test_info['simplified-wkt'])
    repair_entries = get_resource(test_info['RepairEntries'])
    run_test_simplify_aoi(wkt, simplified, repair_entries)
def test_get_platform_campaign_names(**args) -> None:
test_info = args['test_info']
cmr_ummjson = get_resource(test_info['cmr_ummjson'])
campaigns: List[str] = get_resource(test_info['campaigns'])
run_test_get_project_names(cmr_ummjson, campaigns)
def test_download_url(**args) -> None:
"""
Test asf_search.download.download_url
"""
test_info = args['test_info']
url = test_info['url']
path = test_info['path']
filename = test_info['filename']
if filename == 'error':
run_test_download_url_auth_error(url, path, filename)
else:
run_test_download_url(url, path, filename)
def test_find_new_reference(**args) -> None:
"""
Test asf_search.baseline.calc.find_new_reference
"""
test_info = args['test_info']
stack = get_resource(test_info['stack'])
output_index = get_resource(test_info['output_index'])
run_test_find_new_reference(stack, output_index)
def test_get_default_product_type(**args) -> None:
test_info = args['test_info']
product = get_resource(test_info['product'])
product_type = get_resource(test_info['product_type'])
product = as_ASFProduct({'meta': product['meta'], 'umm': product['umm']}, ASFSession())
if product.properties.get('sceneName') is None:
product.properties['sceneName'] = 'BAD_SCENE'
run_test_get_default_product_type(product, product_type)
def test_get_baseline_from_stack(**args) -> None:
test_info = args['test_info']
reference = get_resource(test_info['reference'])
stack = get_resource(test_info['stack'])
output_stack = get_resource(test_info['output_stack'])
error = get_resource(test_info['error'])
run_test_get_baseline_from_stack(reference, stack, output_stack, error)
def test_valid_state_vectors(**args) -> None:
test_info = args['test_info']
reference = get_resource(test_info['reference'])
output = get_resource(test_info['output'])
run_test_valid_state_vectors(reference, output)
def test_validator_map_validate(**args) -> None:
test_info = args['test_info']
key = get_resource(test_info['key'])
value = get_resource(test_info['value'])
output = get_resource(test_info['output'])
run_test_validator_map_validate(key, value, output)
def test_ASFSearchOptions_validator(**kwargs) -> None:
    test_info = kwargs['test_info']
validator_name = get_resource(test_info['validator'])
param = safe_load_tuple(get_resource(test_info['input']))
output = safe_load_tuple(get_resource(test_info['output']))
error = get_resource(test_info['error'])
run_test_ASFSearchOptions_validator(validator_name, param, output, error)
def test_ASFSearchOptions(**kwargs) -> None:
run_test_ASFSearchOptions(**kwargs)
def test_ASFSearchResults_get_urls() -> None:
response = search(
granule_list=[
'OPERA_L2_RTC-S1_T131-279916-IW1_20231202T162856Z_20231202T232622Z_S1A_30_v1.0'
]
)
actual_urls = response.find_urls()
expected_urls = [
'https://cumulus.asf.alaska.edu/s3credentials',
'https://datapool.asf.alaska.edu/BROWSE/OPERA-S1/OPERA_L2_RTC-S1_T131-279916-IW1_20231202T162856Z_20231202T232622Z_S1A_30_v1.0_BROWSE.png',
'https://datapool.asf.alaska.edu/BROWSE/OPERA-S1/OPERA_L2_RTC-S1_T131-279916-IW1_20231202T162856Z_20231202T232622Z_S1A_30_v1.0_BROWSE.png.md5',
'https://datapool.asf.alaska.edu/BROWSE/OPERA-S1/OPERA_L2_RTC-S1_T131-279916-IW1_20231202T162856Z_20231202T232622Z_S1A_30_v1.0_BROWSE_low-res.png',
'https://datapool.asf.alaska.edu/BROWSE/OPERA-S1/OPERA_L2_RTC-S1_T131-279916-IW1_20231202T162856Z_20231202T232622Z_S1A_30_v1.0_BROWSE_low-res.png.md5',
'https://datapool.asf.alaska.edu/BROWSE/OPERA-S1/OPERA_L2_RTC-S1_T131-279916-IW1_20231202T162856Z_20231202T232622Z_S1A_30_v1.0_BROWSE_thumbnail.png',
'https://datapool.asf.alaska.edu/BROWSE/OPERA-S1/OPERA_L2_RTC-S1_T131-279916-IW1_20231202T162856Z_20231202T232622Z_S1A_30_v1.0_BROWSE_thumbnail.png.md5',
'https://datapool.asf.alaska.edu/RTC/OPERA-S1/OPERA_L2_RTC-S1_T131-279916-IW1_20231202T162856Z_20231202T232622Z_S1A_30_v1.0.h5',
'https://datapool.asf.alaska.edu/RTC/OPERA-S1/OPERA_L2_RTC-S1_T131-279916-IW1_20231202T162856Z_20231202T232622Z_S1A_30_v1.0.h5.md5',
'https://datapool.asf.alaska.edu/RTC/OPERA-S1/OPERA_L2_RTC-S1_T131-279916-IW1_20231202T162856Z_20231202T232622Z_S1A_30_v1.0.iso.xml',
'https://datapool.asf.alaska.edu/RTC/OPERA-S1/OPERA_L2_RTC-S1_T131-279916-IW1_20231202T162856Z_20231202T232622Z_S1A_30_v1.0.iso.xml.md5',
'https://datapool.asf.alaska.edu/RTC/OPERA-S1/OPERA_L2_RTC-S1_T131-279916-IW1_20231202T162856Z_20231202T232622Z_S1A_30_v1.0_VH.tif',
'https://datapool.asf.alaska.edu/RTC/OPERA-S1/OPERA_L2_RTC-S1_T131-279916-IW1_20231202T162856Z_20231202T232622Z_S1A_30_v1.0_VH.tif.md5',
'https://datapool.asf.alaska.edu/RTC/OPERA-S1/OPERA_L2_RTC-S1_T131-279916-IW1_20231202T162856Z_20231202T232622Z_S1A_30_v1.0_VV.tif',
'https://datapool.asf.alaska.edu/RTC/OPERA-S1/OPERA_L2_RTC-S1_T131-279916-IW1_20231202T162856Z_20231202T232622Z_S1A_30_v1.0_VV.tif.md5',
'https://datapool.asf.alaska.edu/RTC/OPERA-S1/OPERA_L2_RTC-S1_T131-279916-IW1_20231202T162856Z_20231202T232622Z_S1A_30_v1.0_mask.tif',
'https://datapool.asf.alaska.edu/RTC/OPERA-S1/OPERA_L2_RTC-S1_T131-279916-IW1_20231202T162856Z_20231202T232622Z_S1A_30_v1.0_mask.tif.md5',
]
assert actual_urls == expected_urls
assert response.find_urls('.tif') == [
'https://datapool.asf.alaska.edu/RTC/OPERA-S1/OPERA_L2_RTC-S1_T131-279916-IW1_20231202T162856Z_20231202T232622Z_S1A_30_v1.0_VH.tif',
'https://datapool.asf.alaska.edu/RTC/OPERA-S1/OPERA_L2_RTC-S1_T131-279916-IW1_20231202T162856Z_20231202T232622Z_S1A_30_v1.0_VV.tif',
'https://datapool.asf.alaska.edu/RTC/OPERA-S1/OPERA_L2_RTC-S1_T131-279916-IW1_20231202T162856Z_20231202T232622Z_S1A_30_v1.0_mask.tif',
]
assert response.find_urls(pattern='.*s3credentials') == [
'https://cumulus.asf.alaska.edu/s3credentials'
]
assert response.find_urls('.tif', directAccess=True) == [
's3://asf-cumulus-prod-opera-products/OPERA_L2_RTC-S1/OPERA_L2_RTC-S1_T131-279916-IW1_20231202T162856Z_20231202T232622Z_S1A_30_v1.0/OPERA_L2_RTC-S1_T131-279916-IW1_20231202T162856Z_20231202T232622Z_S1A_30_v1.0_VH.tif',
's3://asf-cumulus-prod-opera-products/OPERA_L2_RTC-S1/OPERA_L2_RTC-S1_T131-279916-IW1_20231202T162856Z_20231202T232622Z_S1A_30_v1.0/OPERA_L2_RTC-S1_T131-279916-IW1_20231202T162856Z_20231202T232622Z_S1A_30_v1.0_VV.tif',
's3://asf-cumulus-prod-opera-products/OPERA_L2_RTC-S1/OPERA_L2_RTC-S1_T131-279916-IW1_20231202T162856Z_20231202T232622Z_S1A_30_v1.0/OPERA_L2_RTC-S1_T131-279916-IW1_20231202T162856Z_20231202T232622Z_S1A_30_v1.0_mask.tif',
]
assert response.find_urls(pattern=r'.*BROWSE.*') == [
'https://datapool.asf.alaska.edu/BROWSE/OPERA-S1/OPERA_L2_RTC-S1_T131-279916-IW1_20231202T162856Z_20231202T232622Z_S1A_30_v1.0_BROWSE.png',
'https://datapool.asf.alaska.edu/BROWSE/OPERA-S1/OPERA_L2_RTC-S1_T131-279916-IW1_20231202T162856Z_20231202T232622Z_S1A_30_v1.0_BROWSE.png.md5',
'https://datapool.asf.alaska.edu/BROWSE/OPERA-S1/OPERA_L2_RTC-S1_T131-279916-IW1_20231202T162856Z_20231202T232622Z_S1A_30_v1.0_BROWSE_low-res.png',
'https://datapool.asf.alaska.edu/BROWSE/OPERA-S1/OPERA_L2_RTC-S1_T131-279916-IW1_20231202T162856Z_20231202T232622Z_S1A_30_v1.0_BROWSE_low-res.png.md5',
'https://datapool.asf.alaska.edu/BROWSE/OPERA-S1/OPERA_L2_RTC-S1_T131-279916-IW1_20231202T162856Z_20231202T232622Z_S1A_30_v1.0_BROWSE_thumbnail.png',
'https://datapool.asf.alaska.edu/BROWSE/OPERA-S1/OPERA_L2_RTC-S1_T131-279916-IW1_20231202T162856Z_20231202T232622Z_S1A_30_v1.0_BROWSE_thumbnail.png.md5',
]
assert response.find_urls('.png', pattern=r'.*BROWSE.*') == [
'https://datapool.asf.alaska.edu/BROWSE/OPERA-S1/OPERA_L2_RTC-S1_T131-279916-IW1_20231202T162856Z_20231202T232622Z_S1A_30_v1.0_BROWSE.png',
'https://datapool.asf.alaska.edu/BROWSE/OPERA-S1/OPERA_L2_RTC-S1_T131-279916-IW1_20231202T162856Z_20231202T232622Z_S1A_30_v1.0_BROWSE_low-res.png',
'https://datapool.asf.alaska.edu/BROWSE/OPERA-S1/OPERA_L2_RTC-S1_T131-279916-IW1_20231202T162856Z_20231202T232622Z_S1A_30_v1.0_BROWSE_thumbnail.png',
]
def test_ASFSearchResults_intersection(**kwargs) -> None:
wkt = get_resource(kwargs['test_info']['wkt'])
run_test_ASFSearchResults_intersection(wkt)
def test_search_dataset(**kwargs) -> None:
dataset = get_resource(kwargs['test_info']['dataset'])
run_test_dataset_search(dataset)
def test_build_subqueries(**kwargs) -> None:
params = ASFSearchOptions(**get_resource(kwargs['test_info']['params']))
expected = [
ASFSearchOptions(**subquery) for subquery in get_resource(kwargs['test_info']['expected'])
]
run_test_build_subqueries(params, expected)
def test_serialization(**args) -> None:
test_info = args['test_info']
product = get_resource(test_info.get('product'))
results = get_resource(test_info.get('results'))
search_opts = get_resource(test_info.get('searchOpts'))
options = ASFSearchOptions(**search_opts if search_opts else {})
run_test_serialization(product, results, options)
def test_notebook_examples(**args) -> None:
test_info = args['test_info']
notebook_file = test_info['notebook']
path = os.path.join('examples', notebook_file)
with open(path) as f:
notebook = nbformat.read(f, as_version=4)
ep = ExecutePreprocessor(timeout=600)
try:
assert ep.preprocess(notebook) is not None, f'Got empty notebook for {notebook_file}'
except Exception as e:
assert False, f'Failed executing {notebook_file}: {e}'
# Testing resource loading utilities
def safe_load_tuple(param):
"""
    Loads a tuple from a list if param is a dict with key 'tuple'
    (Arbitrary constructor initialization is not supported by yaml.safe_load
    as a security measure)
"""
if isinstance(param, Dict):
if 'tuple' in param.keys():
param = tuple(param['tuple'])
return param
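# Illustrative sketch (not part of the suite): yaml.safe_load refuses to
# construct arbitrary Python objects, so the yml resources encode tuples as
# {'tuple': [...]} and safe_load_tuple converts them back. The _demo_ name
# below is hypothetical, with the conversion inlined for illustration.
def _demo_safe_load_tuple():
    loaded = yaml.safe_load('tuple: [1, 2]')  # parses to {'tuple': [1, 2]}
    if isinstance(loaded, dict) and 'tuple' in loaded:
        return tuple(loaded['tuple'])
    return loaded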
# def test_output_format(**args) -> None:
# test_info = args['test_info']
# products = get_resource(test_info['results'])
# if not isinstance(products, List):
# products = [products]
# results = ASFSearchResults([as_ASFProduct({'meta': product['meta'], 'umm': product['umm']}, ASFSession()) for product in products])
# run_test_output_format(results)
def test_keyword_aliasing_results(**args) -> None:
test_info = args['test_info']
opts = ASFSearchOptions(**test_info['params'])
opts.maxResults = 250
run_test_keyword_aliasing_results(opts)
# Finds and loads a file from yml_tests/Resources/ if the loaded field ends with a .yml/.yaml extension
def get_resource(yml_file):
if isinstance(yml_file, str):
if yml_file.endswith(('.yml', '.yaml')):
base_path = pathlib.Path(__file__).parent.resolve()
with open(os.path.join(base_path, 'yml_tests', 'Resources', yml_file), 'r') as f:
try:
return yaml.safe_load(f)
except yaml.YAMLError as exc:
print(exc)
elif isinstance(yml_file, List): # check if it's a list of yml files
if len(yml_file) > 0:
if isinstance(yml_file[0], str):
if yml_file[0].endswith(('.yml', '.yaml')):
return [get_resource(file) for file in yml_file]
return yml_file
Discovery-asf_search-8.1.2/tests/yml_tests/ 0000775 0000000 0000000 00000000000 14777330235 0020765 5 ustar 00root root 0000000 0000000 Discovery-asf_search-8.1.2/tests/yml_tests/Resources/ 0000775 0000000 0000000 00000000000 14777330235 0022737 5 ustar 00root root 0000000 0000000 Discovery-asf_search-8.1.2/tests/yml_tests/Resources/Alos_response.yml 0000664 0000000 0000000 00000027257 14777330235 0026313 0 ustar 00root root 0000000 0000000 {
"type": "Feature",
"geometry": {
"coordinates": [
[
[
-135.071,
56.643
],
[
-135.227,
57.142
],
[
-136.295,
57.037
],
[
-136.125,
56.538
],
[
-135.071,
56.643
]
]
],
"type": "Polygon"
},
"properties": {
"beamModeType": "FBS",
"browse": "https://datapool.asf.alaska.edu/BROWSE/A3/ALPSRP111041130.jpg",
"bytes": 408738585,
"centerLat": 56.8411,
"centerLon": -135.6799,
"faradayRotation": 0.456056,
"fileID": "ALPSRP111041130-L1.0",
"flightDirection": null,
"groupID": "ALPSRP111041130",
"granuleType": "ALOS_PALSAR_SCENE",
"insarStackId": "1486384",
"md5sum": "2a5fa75a25f9eb8d176ffd5bf1bfab21",
"offNadirAngle": 34.3,
"orbit": 11104,
"pathNumber": 238,
"platform": "ALOS",
"pointingAngle": null,
"polarization": "HH",
"processingDate": "2012-08-23T00:00:00.000Z",
"processingLevel": "L1.0",
"sceneName": "ALPSRP111041130",
"sensor": "PALSAR",
"startTime": "2008-02-24T07:13:21.000Z",
"stopTime": "2008-02-24T07:13:29.000Z",
"url": "https://datapool.asf.alaska.edu/L1.0/A3/ALPSRP111041130-L1.0.zip",
"fileName": "ALPSRP111041130-L1.0.zip",
"frameNumber": "1130"
},
"baseline": {
"insarBaseline": 4798.7874
},
"umm": {
"TemporalExtent": {
"RangeDateTime": {
"BeginningDateTime": "2008-02-24T07:13:21.000Z",
"EndingDateTime": "2008-02-24T07:13:29.000Z"
}
},
"OrbitCalculatedSpatialDomains": [
{
"OrbitNumber": 11104
}
],
"GranuleUR": "ALPSRP111041130-L1.0",
"AdditionalAttributes": [
{
"Name": "FLIGHT_LINE",
"Values": [
"NULL"
]
},
{
"Name": "GROUP_ID",
"Values": [
"ALPSRP111041130"
]
},
{
"Name": "OFF_NADIR_ANGLE",
"Values": [
"34.3"
]
},
{
"Name": "MD5SUM",
"Values": [
"2a5fa75a25f9eb8d176ffd5bf1bfab21"
]
},
{
"Name": "GRANULE_TYPE",
"Values": [
"ALOS_PALSAR_SCENE"
]
},
{
"Name": "ASCENDING_DESCENDING",
"Values": [
"ASCENDING"
]
},
{
"Name": "FAR_END_LAT",
"Values": [
"57.142"
]
},
{
"Name": "INSAR_STACK_SIZE",
"Values": [
"23"
]
},
{
"Name": "BEAM_MODE_TYPE",
"Values": [
"FBS"
]
},
{
"Name": "INSAR_BASELINE",
"Values": [
"4798.7874"
]
},
{
"Name": "CENTER_FRAME_ID",
"Values": [
"1137"
]
},
{
"Name": "CENTER_ESA_FRAME",
"Values": [
"1137"
]
},
{
"Name": "ACQUISITION_DATE",
"Values": [
"2008-02-24T07:13:29Z"
]
},
{
"Name": "MISSION_NAME",
"Values": [
"NULL"
]
},
{
"Name": "CENTER_LON",
"Values": [
"-135.6799"
]
},
{
"Name": "NEAR_START_LAT",
"Values": [
"56.538"
]
},
{
"Name": "BEAM_MODE",
"Values": [
"FBS"
]
},
{
"Name": "BEAM_MODE_DESC",
"Values": [
"ALOS PALSAR sensor: High Resolution Observation Mode (single polarization)"
]
},
{
"Name": "PROCESSING_TYPE",
"Values": [
"L1.0"
]
},
{
"Name": "PROCESSING_DESCRIPTION",
"Values": [
"Reconstructed, unprocessed signal data"
]
},
{
"Name": "FRAME_NUMBER",
"Values": [
"1130"
]
},
{
"Name": "PROCESSING_LEVEL",
"Values": [
"L0"
]
},
{
"Name": "PROCESSING_DATE",
"Values": [
"2012-08-23 00:00:00"
]
},
{
"Name": "NEAR_START_LON",
"Values": [
"-136.125"
]
},
{
"Name": "DOPPLER",
"Values": [
"0"
]
},
{
"Name": "FAR_START_LAT",
"Values": [
"56.643"
]
},
{
"Name": "NEAR_END_LON",
"Values": [
"-136.295"
]
},
{
"Name": "PROCESSING_TYPE_DISPLAY",
"Values": [
"Level 1.0"
]
},
{
"Name": "POLARIZATION",
"Values": [
"HH"
]
},
{
"Name": "FAR_START_LON",
"Values": [
"-135.071"
]
},
{
"Name": "THUMBNAIL_URL",
"Values": [
"https://datapool.asf.alaska.edu/THUMBNAIL/A3/AP_11104_FBS_F1130_THUMBNAIL.jpg"
]
},
{
"Name": "ASF_PLATFORM",
"Values": [
"ALOS"
]
},
{
"Name": "INSAR_STACK_ID",
"Values": [
"1486384"
]
},
{
"Name": "LOOK_DIRECTION",
"Values": [
"R"
]
},
{
"Name": "PATH_NUMBER",
"Values": [
"238"
]
},
{
"Name": "NEAR_END_LAT",
"Values": [
"57.037"
]
},
{
"Name": "FARADAY_ROTATION",
"Values": [
"0.456056"
]
},
{
"Name": "FAR_END_LON",
"Values": [
"-135.227"
]
},
{
"Name": "BYTES",
"Values": [
"408738585"
]
},
{
"Name": "CENTER_LAT",
"Values": [
"56.8411"
]
}
],
"SpatialExtent": {
"HorizontalSpatialDomain": {
"Geometry": {
"GPolygons": [
{
"Boundary": {
"Points": [
{
"Longitude": -135.071,
"Latitude": 56.643
},
{
"Longitude": -135.227,
"Latitude": 57.142
},
{
"Longitude": -136.295,
"Latitude": 57.037
},
{
"Longitude": -136.125,
"Latitude": 56.538
},
{
"Longitude": -135.071,
"Latitude": 56.643
}
]
}
}
]
}
}
},
"ProviderDates": [
{
"Date": "2012-01-16T12:10:57.000Z",
"Type": "Insert"
},
{
"Date": "2012-08-23T00:00:00.000Z",
"Type": "Update"
}
],
"CollectionReference": {
"EntryTitle": "ALOS_PALSAR_LEVEL1.0"
},
"RelatedUrls": [
{
"Format": "Not provided",
"Type": "GET DATA",
"URL": "https://datapool.asf.alaska.edu/L1.0/A3/ALPSRP111041130-L1.0.zip"
},
{
"Format": "Not provided",
"Type": "GET RELATED VISUALIZATION",
"URL": "https://datapool.asf.alaska.edu/BROWSE/A3/ALPSRP111041130.jpg"
},
{
"Format": "Not provided",
"Type": "GET RELATED VISUALIZATION",
"URL": "https://datapool.asf.alaska.edu/BROWSE/A3/AP_11104_FBS_F1130.jpg"
}
],
"DataGranule": {
"DayNightFlag": "Unspecified",
"Identifiers": [
{
"Identifier": "ALPSRP111041130",
"IdentifierType": "ProducerGranuleId"
}
],
"ProductionDateTime": "2012-08-23T00:00:00.000Z",
"ArchiveAndDistributionInformation": [
{
"Name": "Not provided",
"Size": 389.8,
"SizeUnit": "MB",
"Format": "Not provided"
}
]
},
"Platforms": [
{
"ShortName": "ALOS",
"Instruments": [
{
"ShortName": "PALSAR",
"ComposedOf": [
{
"ShortName": "FBS"
}
]
}
]
}
]
},
"meta": {
"concept-type": "granule",
"concept-id": "G1213802033-ASF",
"revision-id": 2,
"native-id": "ALPSRP111041130-L1.0",
"provider-id": "ASF",
"format": "application/echo10+xml",
"revision-date": "2016-05-13T14:32:33.502Z"
}
} Discovery-asf_search-8.1.2/tests/yml_tests/Resources/Alos_response_maxResults3.yml 0000664 0000000 0000000 00000055467 14777330235 0030631 0 ustar 00root root 0000000 0000000 [
{
"properties":
{
"beamModeType": "WB1",
"browse": ["https://datapool.asf.alaska.edu/BROWSE/A3/ALPSRS279162650.jpg"],
"bytes": 493077369,
"centerLat": 47.8206,
"centerLon": -145.828,
"faradayRotation": 7.068555,
"fileID": "ALPSRS279162650-L1.0",
"flightDirection": "DESCENDING",
"groupID": null,
"granuleType": "ALOS_PALSAR_SCENE",
"insarStackId": "0",
"md5sum": "53412c179f5b16b73970ac9314d95cc0",
"offNadirAngle": 27.1,
"orbit": 27916,
"pathNumber": 598,
"platform": "ALOS",
"pointingAngle": null,
"polarization": "HH 5scan",
"processingDate": "2013-10-12T21:37:42.000Z",
"processingLevel": "L1.0",
"sceneName": "ALPSRS279162650",
"sensor": "PALSAR",
"startTime": "2011-04-21T20:23:11.000Z",
"stopTime": "2011-04-21T20:23:36.000Z",
"url": "https://datapool.asf.alaska.edu/L1.0/A3/ALPSRS279162650-L1.0.zip",
"fileName": "ALPSRS279162650-L1.0.zip",
"frameNumber": 2650,
"pgeVersion": null
},
"geometry":
{
"coordinates":
[
[
[-147.788, 49.629],
[-148.523, 46.565],
[-144.011, 45.968],
[-143.006, 49.024],
[-147.788, 49.629],
],
],
"type": "Polygon",
},
"baseline": { "insarBaseline": 0.0 },
"umm":
{
"TemporalExtent":
{
"RangeDateTime":
{
"BeginningDateTime": "2011-04-21T20:23:11.000Z",
"EndingDateTime": "2011-04-21T20:23:36.000Z",
},
},
"OrbitCalculatedSpatialDomains": [{ "OrbitNumber": 27916 }],
"GranuleUR": "ALPSRS279162650-L1.0",
"AdditionalAttributes":
[
{ "Name": "FLIGHT_LINE", "Values": ["NULL"] },
{ "Name": "OFF_NADIR_ANGLE", "Values": ["27.1"] },
{
"Name": "MD5SUM",
"Values": ["53412c179f5b16b73970ac9314d95cc0"],
},
{ "Name": "GRANULE_TYPE", "Values": ["ALOS_PALSAR_SCENE"] },
{ "Name": "ASCENDING_DESCENDING", "Values": ["DESCENDING"] },
{ "Name": "FAR_END_LAT", "Values": ["46.565"] },
{ "Name": "INSAR_STACK_SIZE", "Values": ["0"] },
{ "Name": "BEAM_MODE_TYPE", "Values": ["WB1"] },
{ "Name": "INSAR_BASELINE", "Values": ["0"] },
{ "Name": "CENTER_FRAME_ID", "Values": ["2613"] },
{ "Name": "CENTER_ESA_FRAME", "Values": ["2613"] },
{ "Name": "ACQUISITION_DATE", "Values": ["2011-04-21T20:23:36Z"] },
{ "Name": "MISSION_NAME", "Values": ["NULL"] },
{ "Name": "CENTER_LON", "Values": ["-145.828"] },
{ "Name": "NEAR_START_LAT", "Values": ["49.024"] },
{ "Name": "BEAM_MODE", "Values": ["WB1"] },
{
"Name": "BEAM_MODE_DESC",
"Values":
["ALOS PALSAR sensor: Scan Observation Mode (Burst Mode 1)"],
},
{ "Name": "PROCESSING_TYPE", "Values": ["L1.0"] },
{
"Name": "PROCESSING_DESCRIPTION",
"Values": ["Reconstructed, unprocessed signal data"],
},
{ "Name": "FRAME_NUMBER", "Values": ["2650"] },
{ "Name": "PROCESSING_LEVEL", "Values": ["L0"] },
{
"Name": "PROCESSING_DATE",
"Values": ["2013-10-12 21:37:42.395788"],
},
{ "Name": "NEAR_START_LON", "Values": ["-143.006"] },
{ "Name": "DOPPLER", "Values": ["-1"] },
{ "Name": "FAR_START_LAT", "Values": ["49.629"] },
{ "Name": "NEAR_END_LON", "Values": ["-144.011"] },
{ "Name": "PROCESSING_TYPE_DISPLAY", "Values": ["Level 1.0"] },
{ "Name": "POLARIZATION", "Values": ["HH 5scan"] },
{ "Name": "FAR_START_LON", "Values": ["-147.788"] },
{
"Name": "THUMBNAIL_URL",
"Values":
[
"https://datapool.asf.alaska.edu/THUMBNAIL/A3/ALPSRS279162650_THUMBNAIL.jpg",
],
},
{ "Name": "ASF_PLATFORM", "Values": ["ALOS"] },
{ "Name": "INSAR_STACK_ID", "Values": ["0"] },
{ "Name": "LOOK_DIRECTION", "Values": ["R"] },
{ "Name": "PATH_NUMBER", "Values": ["598"] },
{ "Name": "NEAR_END_LAT", "Values": ["45.968"] },
{ "Name": "FARADAY_ROTATION", "Values": ["7.068555"] },
{ "Name": "FAR_END_LON", "Values": ["-148.523"] },
{ "Name": "BYTES", "Values": ["493077369"] },
{ "Name": "CENTER_LAT", "Values": ["47.8206"] },
],
"SpatialExtent":
{
"HorizontalSpatialDomain":
{
"Geometry":
{
"GPolygons":
[
{
"Boundary":
{
"Points":
[
{ "Longitude": -147.788, "Latitude": 49.629 },
{ "Longitude": -148.523, "Latitude": 46.565 },
{ "Longitude": -144.011, "Latitude": 45.968 },
{ "Longitude": -143.006, "Latitude": 49.024 },
{ "Longitude": -147.788, "Latitude": 49.629 },
],
},
},
],
},
},
},
"ProviderDates":
[
{ "Date": "2013-10-12T21:37:42.000Z", "Type": "Insert" },
{ "Date": "2013-10-12T21:37:42.000Z", "Type": "Update" },
],
"CollectionReference": { "EntryTitle": "ALOS_PALSAR_LEVEL1.0" },
"RelatedUrls":
[
{
"Format": "Not provided",
"Type": "GET DATA",
"URL": "https://datapool.asf.alaska.edu/L1.0/A3/ALPSRS279162650-L1.0.zip",
},
{
"Format": "Not provided",
"Type": "GET RELATED VISUALIZATION",
"URL": "https://datapool.asf.alaska.edu/BROWSE/A3/ALPSRS279162650.jpg",
},
],
"DataGranule":
{
"DayNightFlag": "Unspecified",
"Identifiers":
[
{
"Identifier": "ALPSRS279162650",
"IdentifierType": "ProducerGranuleId",
},
],
"ProductionDateTime": "2013-10-12T21:37:42.000Z",
"ArchiveAndDistributionInformation":
[
{
"Name": "Not provided",
"Size": 470.23,
"SizeUnit": "MB",
"Format": "Not provided",
},
],
},
"Platforms":
[
{
"ShortName": "ALOS",
"Instruments":
[
{
"ShortName": "PALSAR",
"ComposedOf": [{ "ShortName": "WB1" }],
},
],
},
],
},
"meta":
{
"concept-type": "granule",
"concept-id": "G1209461416-ASF",
"revision-id": 1,
"native-id": "ALPSRS279162650-L1.0",
"provider-id": "ASF",
"format": "application/echo10+xml",
"revision-date": "2015-11-06T19:20:15.260Z",
},
},
{
"properties":
{
"beamModeType": "WB1",
"browse": ["https://datapool.asf.alaska.edu/BROWSE/A3/ALPSRS279162650.jpg"],
"bytes": 25543487,
"centerLat": 47.8206,
"centerLon": -145.828,
"faradayRotation": 7.068555,
"fileID": "ALPSRS279162650-L1.5",
"flightDirection": "DESCENDING",
"groupID": null,
"granuleType": "ALOS_PALSAR_SCENE",
"insarStackId": "0",
"md5sum": "f885395571540ed61b4ef02f4e0c49c9",
"offNadirAngle": 27.1,
"orbit": 27916,
"pathNumber": 598,
"platform": "ALOS",
"pointingAngle": null,
"polarization": "HH 5scan",
"processingDate": "2013-10-12T21:50:49.000Z",
"processingLevel": "L1.5",
"sceneName": "ALPSRS279162650",
"sensor": "PALSAR",
"startTime": "2011-04-21T20:23:11.000Z",
"stopTime": "2011-04-21T20:23:36.000Z",
"url": "https://datapool.asf.alaska.edu/L1.5/A3/ALPSRS279162650-L1.5.zip",
"fileName": "ALPSRS279162650-L1.5.zip",
"frameNumber": 2650,
"pgeVersion": null
},
"geometry":
{
"coordinates":
[
[
[-147.788, 49.629],
[-148.523, 46.565],
[-144.011, 45.968],
[-143.006, 49.024],
[-147.788, 49.629],
],
],
"type": "Polygon",
},
"baseline": { "insarBaseline": 0.0 },
"umm":
{
"TemporalExtent":
{
"RangeDateTime":
{
"BeginningDateTime": "2011-04-21T20:23:11.000Z",
"EndingDateTime": "2011-04-21T20:23:36.000Z",
},
},
"OrbitCalculatedSpatialDomains": [{ "OrbitNumber": 27916 }],
"GranuleUR": "ALPSRS279162650-L1.5",
"AdditionalAttributes":
[
{ "Name": "FLIGHT_LINE", "Values": ["NULL"] },
{ "Name": "OFF_NADIR_ANGLE", "Values": ["27.1"] },
{
"Name": "MD5SUM",
"Values": ["f885395571540ed61b4ef02f4e0c49c9"],
},
{ "Name": "GRANULE_TYPE", "Values": ["ALOS_PALSAR_SCENE"] },
{ "Name": "ASCENDING_DESCENDING", "Values": ["DESCENDING"] },
{ "Name": "FAR_END_LAT", "Values": ["46.565"] },
{ "Name": "INSAR_STACK_SIZE", "Values": ["0"] },
{ "Name": "BEAM_MODE_TYPE", "Values": ["WB1"] },
{ "Name": "INSAR_BASELINE", "Values": ["0"] },
{ "Name": "CENTER_FRAME_ID", "Values": ["2613"] },
{ "Name": "CENTER_ESA_FRAME", "Values": ["2613"] },
{ "Name": "ACQUISITION_DATE", "Values": ["2011-04-21T20:23:36Z"] },
{ "Name": "MISSION_NAME", "Values": ["NULL"] },
{ "Name": "CENTER_LON", "Values": ["-145.828"] },
{ "Name": "NEAR_START_LAT", "Values": ["49.024"] },
{ "Name": "BEAM_MODE", "Values": ["WB1"] },
{
"Name": "BEAM_MODE_DESC",
"Values":
["ALOS PALSAR sensor: Scan Observation Mode (Burst Mode 1)"],
},
{ "Name": "PROCESSING_TYPE", "Values": ["L1.5"] },
{
"Name": "PROCESSING_DESCRIPTION",
"Values": ["Fully processed, multi-look, georeferenced data"],
},
{ "Name": "FRAME_NUMBER", "Values": ["2650"] },
{ "Name": "PROCESSING_LEVEL", "Values": ["L1"] },
{
"Name": "PROCESSING_DATE",
"Values": ["2013-10-12 21:50:49.238796"],
},
{ "Name": "NEAR_START_LON", "Values": ["-143.006"] },
{ "Name": "DOPPLER", "Values": ["-1"] },
{ "Name": "FAR_START_LAT", "Values": ["49.629"] },
{ "Name": "NEAR_END_LON", "Values": ["-144.011"] },
{
"Name": "PROCESSING_TYPE_DISPLAY",
"Values": ["Level 1.5 Image"],
},
{ "Name": "POLARIZATION", "Values": ["HH 5scan"] },
{ "Name": "FAR_START_LON", "Values": ["-147.788"] },
{
"Name": "THUMBNAIL_URL",
"Values":
[
"https://datapool.asf.alaska.edu/THUMBNAIL/A3/ALPSRS279162650_THUMBNAIL.jpg",
],
},
{ "Name": "ASF_PLATFORM", "Values": ["ALOS"] },
{ "Name": "INSAR_STACK_ID", "Values": ["0"] },
{ "Name": "LOOK_DIRECTION", "Values": ["R"] },
{ "Name": "PATH_NUMBER", "Values": ["598"] },
{ "Name": "NEAR_END_LAT", "Values": ["45.968"] },
{ "Name": "FARADAY_ROTATION", "Values": ["7.068555"] },
{ "Name": "FAR_END_LON", "Values": ["-148.523"] },
{ "Name": "BYTES", "Values": ["25543487"] },
{ "Name": "CENTER_LAT", "Values": ["47.8206"] },
],
"SpatialExtent":
{
"HorizontalSpatialDomain":
{
"Geometry":
{
"GPolygons":
[
{
"Boundary":
{
"Points":
[
{ "Longitude": -147.788, "Latitude": 49.629 },
{ "Longitude": -148.523, "Latitude": 46.565 },
{ "Longitude": -144.011, "Latitude": 45.968 },
{ "Longitude": -143.006, "Latitude": 49.024 },
{ "Longitude": -147.788, "Latitude": 49.629 },
],
},
},
],
},
},
},
"ProviderDates":
[
{ "Date": "2013-10-12T21:50:49.000Z", "Type": "Insert" },
{ "Date": "2013-10-12T21:50:49.000Z", "Type": "Update" },
],
"CollectionReference": { "EntryTitle": "ALOS_PALSAR_LEVEL1.5" },
"RelatedUrls":
[
{
"Format": "Not provided",
"Type": "GET DATA",
"URL": "https://datapool.asf.alaska.edu/L1.5/A3/ALPSRS279162650-L1.5.zip",
},
{
"Format": "Not provided",
"Type": "GET RELATED VISUALIZATION",
"URL": "https://datapool.asf.alaska.edu/BROWSE/A3/ALPSRS279162650.jpg",
},
],
"DataGranule":
{
"DayNightFlag": "Unspecified",
"Identifiers":
[
{
"Identifier": "ALPSRS279162650",
"IdentifierType": "ProducerGranuleId",
},
],
"ProductionDateTime": "2013-10-12T21:50:49.000Z",
"ArchiveAndDistributionInformation":
[
{
"Name": "Not provided",
"Size": 24.36,
"SizeUnit": "MB",
"Format": "Not provided",
},
],
},
"Platforms":
[
{
"ShortName": "ALOS",
"Instruments":
[
{
"ShortName": "PALSAR",
"ComposedOf": [{ "ShortName": "WB1" }],
},
],
},
],
},
"meta":
{
"concept-type": "granule",
"concept-id": "G1212133627-ASF",
"revision-id": 1,
"native-id": "ALPSRS279162650-L1.5",
"provider-id": "ASF",
"format": "application/echo10+xml",
"revision-date": "2015-11-10T15:33:44.983Z",
},
},
{
"properties":
{
"beamModeType": "WB1",
"browse": ["https://datapool.asf.alaska.edu/BROWSE/A3/ALPSRS279162600.jpg"],
"bytes": 434727854,
"centerLat": 50.2797,
"centerLon": -145.1062,
"faradayRotation": 6.533453,
"fileID": "ALPSRS279162600-L1.0",
"flightDirection": "DESCENDING",
"groupID": null,
"granuleType": "ALOS_PALSAR_SCENE",
"insarStackId": "0",
"md5sum": "3027326bda0f3973fba1736dd7211760",
"offNadirAngle": 27.1,
"orbit": 27916,
"pathNumber": 598,
"platform": "ALOS",
"pointingAngle": null,
"polarization": "HH 5scan",
"processingDate": "2013-10-12T21:37:14.000Z",
"processingLevel": "L1.0",
"sceneName": "ALPSRS279162600",
"sensor": "PALSAR",
"startTime": "2011-04-21T20:22:30.000Z",
"stopTime": "2011-04-21T20:22:55.000Z",
"url": "https://datapool.asf.alaska.edu/L1.0/A3/ALPSRS279162600-L1.0.zip",
"fileName": "ALPSRS279162600-L1.0.zip",
"frameNumber": 2600,
"pgeVersion": null
},
"geometry":
{
"coordinates":
[
[
[-147.166, 52.088],
[-147.936, 49.028],
[-143.21, 48.424],
[-142.131, 51.474],
[-147.166, 52.088],
],
],
"type": "Polygon",
},
"baseline": { "insarBaseline": 0.0 },
"umm":
{
"TemporalExtent":
{
"RangeDateTime":
{
"BeginningDateTime": "2011-04-21T20:22:30.000Z",
"EndingDateTime": "2011-04-21T20:22:55.000Z",
},
},
"OrbitCalculatedSpatialDomains": [{ "OrbitNumber": 27916 }],
"GranuleUR": "ALPSRS279162600-L1.0",
"AdditionalAttributes":
[
{ "Name": "FLIGHT_LINE", "Values": ["NULL"] },
{ "Name": "OFF_NADIR_ANGLE", "Values": ["27.1"] },
{
"Name": "MD5SUM",
"Values": ["3027326bda0f3973fba1736dd7211760"],
},
{ "Name": "GRANULE_TYPE", "Values": ["ALOS_PALSAR_SCENE"] },
{ "Name": "ASCENDING_DESCENDING", "Values": ["DESCENDING"] },
{ "Name": "FAR_END_LAT", "Values": ["49.028"] },
{ "Name": "INSAR_STACK_SIZE", "Values": ["0"] },
{ "Name": "BEAM_MODE_TYPE", "Values": ["WB1"] },
{ "Name": "INSAR_BASELINE", "Values": ["0"] },
{ "Name": "CENTER_FRAME_ID", "Values": ["2564"] },
{ "Name": "CENTER_ESA_FRAME", "Values": ["2564"] },
{ "Name": "ACQUISITION_DATE", "Values": ["2011-04-21T20:22:55Z"] },
{ "Name": "MISSION_NAME", "Values": ["NULL"] },
{ "Name": "CENTER_LON", "Values": ["-145.1062"] },
{ "Name": "NEAR_START_LAT", "Values": ["51.474"] },
{ "Name": "BEAM_MODE", "Values": ["WB1"] },
{
"Name": "BEAM_MODE_DESC",
"Values":
["ALOS PALSAR sensor: Scan Observation Mode (Burst Mode 1)"],
},
{ "Name": "PROCESSING_TYPE", "Values": ["L1.0"] },
{
"Name": "PROCESSING_DESCRIPTION",
"Values": ["Reconstructed, unprocessed signal data"],
},
{ "Name": "FRAME_NUMBER", "Values": ["2600"] },
{ "Name": "PROCESSING_LEVEL", "Values": ["L0"] },
{
"Name": "PROCESSING_DATE",
"Values": ["2013-10-12 21:37:14.696874"],
},
{ "Name": "NEAR_START_LON", "Values": ["-142.131"] },
{ "Name": "DOPPLER", "Values": ["-1"] },
{ "Name": "FAR_START_LAT", "Values": ["52.088"] },
{ "Name": "NEAR_END_LON", "Values": ["-143.21"] },
{ "Name": "PROCESSING_TYPE_DISPLAY", "Values": ["Level 1.0"] },
{ "Name": "POLARIZATION", "Values": ["HH 5scan"] },
{ "Name": "FAR_START_LON", "Values": ["-147.166"] },
{
"Name": "THUMBNAIL_URL",
"Values":
[
"https://datapool.asf.alaska.edu/THUMBNAIL/A3/ALPSRS279162600_THUMBNAIL.jpg",
],
},
{ "Name": "ASF_PLATFORM", "Values": ["ALOS"] },
{ "Name": "INSAR_STACK_ID", "Values": ["0"] },
{ "Name": "LOOK_DIRECTION", "Values": ["R"] },
{ "Name": "PATH_NUMBER", "Values": ["598"] },
{ "Name": "NEAR_END_LAT", "Values": ["48.424"] },
{ "Name": "FARADAY_ROTATION", "Values": ["6.533453"] },
{ "Name": "FAR_END_LON", "Values": ["-147.936"] },
{ "Name": "BYTES", "Values": ["434727854"] },
{ "Name": "CENTER_LAT", "Values": ["50.2797"] },
],
"SpatialExtent":
{
"HorizontalSpatialDomain":
{
"Geometry":
{
"GPolygons":
[
{
"Boundary":
{
"Points":
[
{ "Longitude": -147.166, "Latitude": 52.088 },
{ "Longitude": -147.936, "Latitude": 49.028 },
{ "Longitude": -143.21, "Latitude": 48.424 },
{ "Longitude": -142.131, "Latitude": 51.474 },
{ "Longitude": -147.166, "Latitude": 52.088 },
],
},
},
],
},
},
},
"ProviderDates":
[
{ "Date": "2013-10-12T21:37:14.000Z", "Type": "Insert" },
{ "Date": "2013-10-12T21:37:14.000Z", "Type": "Update" },
],
"CollectionReference": { "EntryTitle": "ALOS_PALSAR_LEVEL1.0" },
"RelatedUrls":
[
{
"Format": "Not provided",
"Type": "GET DATA",
"URL": "https://datapool.asf.alaska.edu/L1.0/A3/ALPSRS279162600-L1.0.zip",
},
{
"Format": "Not provided",
"Type": "GET RELATED VISUALIZATION",
"URL": "https://datapool.asf.alaska.edu/BROWSE/A3/ALPSRS279162600.jpg",
},
],
"DataGranule":
{
"DayNightFlag": "Unspecified",
"Identifiers":
[
{
"Identifier": "ALPSRS279162600",
"IdentifierType": "ProducerGranuleId",
},
],
"ProductionDateTime": "2013-10-12T21:37:14.000Z",
"ArchiveAndDistributionInformation":
[
{
"Name": "Not provided",
"Size": 414.58,
"SizeUnit": "MB",
"Format": "Not provided",
},
],
},
"Platforms":
[
{
"ShortName": "ALOS",
"Instruments":
[
{
"ShortName": "PALSAR",
"ComposedOf": [{ "ShortName": "WB1" }],
},
],
},
],
},
"meta":
{
"concept-type": "granule",
"concept-id": "G1212389847-ASF",
"revision-id": 1,
"native-id": "ALPSRS279162600-L1.0",
"provider-id": "ASF",
"format": "application/echo10+xml",
"revision-date": "2015-11-11T11:28:39.482Z",
},
},
]
Discovery-asf_search-8.1.2/tests/yml_tests/Resources/Alos_response_missing_baseline.yml
{
"type": "Feature",
"geometry": {
"coordinates": [
[
[
-135.071,
56.643
],
[
-135.227,
57.142
],
[
-136.295,
57.037
],
[
-136.125,
56.538
],
[
-135.071,
56.643
]
]
],
"type": "Polygon"
},
"properties": {
"beamModeType": "FBS",
"browse": "https://datapool.asf.alaska.edu/BROWSE/A3/ALPSRP111041130.jpg",
"bytes": 408738585,
"centerLat": 56.8411,
"centerLon": -135.6799,
"faradayRotation": 0.456056,
"fileID": "ALPSRP111041130-L1.0",
"flightDirection": null,
"groupID": "ALPSRP111041130",
"granuleType": "ALOS_PALSAR_SCENE",
"insarStackId": "1486384",
"md5sum": "2a5fa75a25f9eb8d176ffd5bf1bfab21",
"offNadirAngle": 34.3,
"orbit": 11104,
"pathNumber": 238,
"platform": "ALOS",
"pointingAngle": null,
"polarization": "HH",
"processingDate": "2012-08-23T00:00:00.000Z",
"processingLevel": "L1.0",
"sceneName": "ALPSRP111041130",
"sensor": "PALSAR",
"startTime": "2008-02-24T07:13:21.000Z",
"stopTime": "2008-02-24T07:13:29.000Z",
"url": "https://datapool.asf.alaska.edu/L1.0/A3/ALPSRP111041130-L1.0.zip",
"fileName": "ALPSRP111041130-L1.0.zip",
"frameNumber": "1130"
},
# "baseline": {
# "insarBaseline": 4798.7874
# },
"umm": {
"TemporalExtent": {
"RangeDateTime": {
"BeginningDateTime": "2008-02-24T07:13:21.000Z",
"EndingDateTime": "2008-02-24T07:13:29.000Z"
}
},
"OrbitCalculatedSpatialDomains": [
{
"OrbitNumber": 11104
}
],
"GranuleUR": "ALPSRP111041130-L1.0",
"AdditionalAttributes": [
{
"Name": "FLIGHT_LINE",
"Values": [
"NULL"
]
},
{
"Name": "GROUP_ID",
"Values": [
"ALPSRP111041130"
]
},
{
"Name": "OFF_NADIR_ANGLE",
"Values": [
"34.3"
]
},
{
"Name": "MD5SUM",
"Values": [
"2a5fa75a25f9eb8d176ffd5bf1bfab21"
]
},
{
"Name": "GRANULE_TYPE",
"Values": [
"ALOS_PALSAR_SCENE"
]
},
{
"Name": "ASCENDING_DESCENDING",
"Values": [
"ASCENDING"
]
},
{
"Name": "FAR_END_LAT",
"Values": [
"57.142"
]
},
{
"Name": "INSAR_STACK_SIZE",
"Values": [
"23"
]
},
{
"Name": "BEAM_MODE_TYPE",
"Values": [
"FBS"
]
},
{
"Name": "INSAR_BASELINE",
"Values": [
"4798.7874"
]
},
{
"Name": "CENTER_FRAME_ID",
"Values": [
"1137"
]
},
{
"Name": "CENTER_ESA_FRAME",
"Values": [
"1137"
]
},
{
"Name": "ACQUISITION_DATE",
"Values": [
"2008-02-24T07:13:29Z"
]
},
{
"Name": "MISSION_NAME",
"Values": [
"NULL"
]
},
{
"Name": "CENTER_LON",
"Values": [
"-135.6799"
]
},
{
"Name": "NEAR_START_LAT",
"Values": [
"56.538"
]
},
{
"Name": "BEAM_MODE",
"Values": [
"FBS"
]
},
{
"Name": "BEAM_MODE_DESC",
"Values": [
"ALOS PALSAR sensor: High Resolution Observation Mode (single polarization)"
]
},
{
"Name": "PROCESSING_TYPE",
"Values": [
"L1.0"
]
},
{
"Name": "PROCESSING_DESCRIPTION",
"Values": [
"Reconstructed, unprocessed signal data"
]
},
{
"Name": "FRAME_NUMBER",
"Values": [
"1130"
]
},
{
"Name": "PROCESSING_LEVEL",
"Values": [
"L0"
]
},
{
"Name": "PROCESSING_DATE",
"Values": [
"2012-08-23 00:00:00"
]
},
{
"Name": "NEAR_START_LON",
"Values": [
"-136.125"
]
},
{
"Name": "DOPPLER",
"Values": [
"0"
]
},
{
"Name": "FAR_START_LAT",
"Values": [
"56.643"
]
},
{
"Name": "NEAR_END_LON",
"Values": [
"-136.295"
]
},
{
"Name": "PROCESSING_TYPE_DISPLAY",
"Values": [
"Level 1.0"
]
},
{
"Name": "POLARIZATION",
"Values": [
"HH"
]
},
{
"Name": "FAR_START_LON",
"Values": [
"-135.071"
]
},
{
"Name": "THUMBNAIL_URL",
"Values": [
"https://datapool.asf.alaska.edu/THUMBNAIL/A3/AP_11104_FBS_F1130_THUMBNAIL.jpg"
]
},
{
"Name": "ASF_PLATFORM",
"Values": [
"ALOS"
]
},
{
"Name": "INSAR_STACK_ID",
"Values": [
"1486384"
]
},
{
"Name": "LOOK_DIRECTION",
"Values": [
"R"
]
},
{
"Name": "PATH_NUMBER",
"Values": [
"238"
]
},
{
"Name": "NEAR_END_LAT",
"Values": [
"57.037"
]
},
{
"Name": "FARADAY_ROTATION",
"Values": [
"0.456056"
]
},
{
"Name": "FAR_END_LON",
"Values": [
"-135.227"
]
},
{
"Name": "BYTES",
"Values": [
"408738585"
]
},
{
"Name": "CENTER_LAT",
"Values": [
"56.8411"
]
}
],
"SpatialExtent": {
"HorizontalSpatialDomain": {
"Geometry": {
"GPolygons": [
{
"Boundary": {
"Points": [
{
"Longitude": -135.071,
"Latitude": 56.643
},
{
"Longitude": -135.227,
"Latitude": 57.142
},
{
"Longitude": -136.295,
"Latitude": 57.037
},
{
"Longitude": -136.125,
"Latitude": 56.538
},
{
"Longitude": -135.071,
"Latitude": 56.643
}
]
}
}
]
}
}
},
"ProviderDates": [
{
"Date": "2012-01-16T12:10:57.000Z",
"Type": "Insert"
},
{
"Date": "2012-08-23T00:00:00.000Z",
"Type": "Update"
}
],
"CollectionReference": {
"EntryTitle": "ALOS_PALSAR_LEVEL1.0"
},
"RelatedUrls": [
{
"Format": "Not provided",
"Type": "GET DATA",
"URL": "https://datapool.asf.alaska.edu/L1.0/A3/ALPSRP111041130-L1.0.zip"
},
{
"Format": "Not provided",
"Type": "GET RELATED VISUALIZATION",
"URL": "https://datapool.asf.alaska.edu/BROWSE/A3/ALPSRP111041130.jpg"
},
{
"Format": "Not provided",
"Type": "GET RELATED VISUALIZATION",
"URL": "https://datapool.asf.alaska.edu/BROWSE/A3/AP_11104_FBS_F1130.jpg"
}
],
"DataGranule": {
"DayNightFlag": "Unspecified",
"Identifiers": [
{
"Identifier": "ALPSRP111041130",
"IdentifierType": "ProducerGranuleId"
}
],
"ProductionDateTime": "2012-08-23T00:00:00.000Z",
"ArchiveAndDistributionInformation": [
{
"Name": "Not provided",
"Size": 389.8,
"SizeUnit": "MB",
"Format": "Not provided"
}
]
},
"Platforms": [
{
"ShortName": "ALOS",
"Instruments": [
{
"ShortName": "PALSAR",
"ComposedOf": [
{
"ShortName": "FBS"
}
]
}
]
}
]
},
"meta": {
"concept-type": "granule",
"concept-id": "G1213802033-ASF",
"revision-id": 2,
"native-id": "ALPSRP111041130-L1.0",
"provider-id": "ASF",
"format": "application/echo10+xml",
"revision-date": "2016-05-13T14:32:33.502Z"
}
}
Discovery-asf_search-8.1.2/tests/yml_tests/Resources/Fairbanks_L1.yml
{
"type": "Feature",
"properties":
{
"beamModeType": "STD",
"browse": "https://datapool.asf.alaska.edu/BROWSE/E1/E1_19942_STD_F287.jpg",
"bytes": 58750445,
"centerLat": 64.9813,
"centerLon": -147.7602,
"faradayRotation": None,
"fileID": "E1_19942_STD_F287-L1",
"flightDirection": "DESCENDING",
"groupID": None,
"granuleType": "E1_STD_FRAME",
"insarStackId": "1736495",
"md5sum": "612958259af2fa499cd10a12d9e8c9a4",
"offNadirAngle": -1.0,
"orbit": 19942,
"pathNumber": 415,
"platform": "ERS-1",
"pointingAngle": None,
"polarization": "VV",
"processingDate": "2010-12-05T11:21:45.000Z",
"processingLevel": "L1",
"sceneName": "E1_19942_STD_F287",
"sensor": "SAR",
"startTime": "1995-05-08T21:09:19.000Z",
"stopTime": "1995-05-08T21:09:36.000Z",
"url": "https://datapool.asf.alaska.edu/L1/E1/E1_19942_STD_F287.zip",
"pgeVersion": None,
"fileName": "E1_19942_STD_F287.zip",
"frameNumber": 287,
},
"meta":
{
"concept-type": "granule",
"concept-id": "G1213363105-ASF",
"revision-id": 1,
"native-id": "E1_19942_STD_F287-L1",
"provider-id": "ASF",
"format": "application/echo10+xml",
"revision-date": "2015-11-13T18:39:18.230Z",
},
"umm":
{
"TemporalExtent":
{
"RangeDateTime":
{
"BeginningDateTime": "1995-05-08T21:09:19.000Z",
"EndingDateTime": "1995-05-08T21:09:36.000Z",
},
},
"OrbitCalculatedSpatialDomains": [{ "OrbitNumber": 19942 }],
"GranuleUR": "E1_19942_STD_F287-L1",
"AdditionalAttributes":
[
{ "Name": "FLIGHT_LINE", "Values": ["NULL"] },
{ "Name": "OFF_NADIR_ANGLE", "Values": ["-1"] },
{
"Name": "MD5SUM",
"Values": ["612958259af2fa499cd10a12d9e8c9a4"],
},
{ "Name": "GRANULE_TYPE", "Values": ["E1_STD_FRAME"] },
{
"Name": "ASCENDING_DESCENDING",
"Values": ["DESCENDING"],
},
{ "Name": "FAR_END_LAT", "Values": ["64.6274"] },
{ "Name": "INSAR_STACK_SIZE", "Values": ["139"] },
{ "Name": "BEAM_MODE_TYPE", "Values": ["STD"] },
{ "Name": "INSAR_BASELINE", "Values": ["0"] },
{ "Name": "CENTER_FRAME_ID", "Values": ["2291"] },
{ "Name": "CENTER_ESA_FRAME", "Values": ["2291"] },
{
"Name": "ACQUISITION_DATE",
"Values": ["1995-05-08T21:09:36Z"],
},
{ "Name": "MISSION_NAME", "Values": ["NULL"] },
{ "Name": "CENTER_LON", "Values": ["-147.7602"] },
{ "Name": "NEAR_START_LAT", "Values": ["65.3242"] },
{ "Name": "BEAM_MODE", "Values": ["Standard"] },
{
"Name": "BEAM_MODE_DESC",
"Values":
[
"ERS-1,ERS-2,JERS-1,SEASAT,SMAP Standard Beam SAR",
],
},
{ "Name": "PROCESSING_TYPE", "Values": ["L1"] },
{
"Name": "PROCESSING_DESCRIPTION",
"Values": ["Fully processed SAR data."],
},
{ "Name": "FRAME_NUMBER", "Values": ["287"] },
{ "Name": "PROCESSING_LEVEL", "Values": ["L1"] },
{
"Name": "PROCESSING_DATE",
"Values": ["2010-12-05 11:21:45.673251"],
},
{ "Name": "NEAR_START_LON", "Values": ["-146.4032"] },
{ "Name": "DOPPLER", "Values": ["0"] },
{ "Name": "FAR_START_LAT", "Values": ["65.5755"] },
{ "Name": "NEAR_END_LON", "Values": ["-147.0941"] },
{
"Name": "PROCESSING_TYPE_DISPLAY",
"Values": ["Level One Image"],
},
{ "Name": "POLARIZATION", "Values": ["VV"] },
{ "Name": "FAR_START_LON", "Values": ["-148.4602"] },
{
"Name": "THUMBNAIL_URL",
"Values":
[
"https://datapool.asf.alaska.edu/THUMBNAIL/E1/E1_19942_STD_F287_THUMBNAIL.jpg",
],
},
{ "Name": "ASF_PLATFORM", "Values": ["ERS-1"] },
{ "Name": "INSAR_STACK_ID", "Values": ["1736495"] },
{ "Name": "LOOK_DIRECTION", "Values": ["R"] },
{ "Name": "PATH_NUMBER", "Values": ["415"] },
{ "Name": "NEAR_END_LAT", "Values": ["64.3827"] },
{ "Name": "FARADAY_ROTATION", "Values": ["NA"] },
{ "Name": "FAR_END_LON", "Values": ["-149.0811"] },
{ "Name": "BYTES", "Values": ["58750445"] },
{ "Name": "CENTER_LAT", "Values": ["64.9813"] },
],
"SpatialExtent":
{
"HorizontalSpatialDomain":
{
"Geometry":
{
"GPolygons":
[
{
"Boundary":
{
"Points":
[
{
"Longitude": -148.460235,
"Latitude": 65.57549,
},
{
"Longitude": -149.081122,
"Latitude": 64.627404,
},
{
"Longitude": -147.094104,
"Latitude": 64.38274,
},
{
"Longitude": -146.403205,
"Latitude": 65.324152,
},
{
"Longitude": -148.460235,
"Latitude": 65.57549,
},
],
},
},
],
},
},
},
"ProviderDates":
[
{ "Date": "2010-12-05T11:21:45.000Z", "Type": "Insert" },
{ "Date": "2010-12-05T11:21:45.000Z", "Type": "Update" },
],
"CollectionReference": { "EntryTitle": "ERS-1_LEVEL1" },
"RelatedUrls":
[
{
"URL": "https://datapool.asf.alaska.edu/L1/E1/E1_19942_STD_F287.zip",
"Type": "GET DATA",
},
{
"URL": "https://datapool.asf.alaska.edu/BROWSE/E1/E1_19942_STD_F287.jpg",
"Type": "GET RELATED VISUALIZATION",
},
],
"DataGranule":
{
"DayNightFlag": "Unspecified",
"Identifiers":
[
{
"Identifier": "E1_19942_STD_F287",
"IdentifierType": "ProducerGranuleId",
},
],
"ProductionDateTime": "2010-12-05T11:21:45.000Z",
"ArchiveAndDistributionInformation":
[
{
"Name": "Not provided",
"Size": 56.02,
"SizeUnit": "MB",
},
],
},
"Platforms":
[
{
"ShortName": "ERS-1",
"Instruments":
[
{
"ShortName": "SAR",
"ComposedOf": [{ "ShortName": "STD" }],
},
],
},
],
"MetadataSpecification":
{
"URL": "https://cdn.earthdata.nasa.gov/umm/granule/v1.6.5",
"Name": "UMM-G",
"Version": "1.6.5",
},
},
"geometry":
{
"coordinates":
[
[
[-148.460235, 65.57549],
[-149.081122, 64.627404],
[-147.094104, 64.38274],
[-146.403205, 65.324152],
[-148.460235, 65.57549],
],
],
"type": "Polygon",
},
"baseline": { "insarBaseline": 0.0 },
}
}
Discovery-asf_search-8.1.2/tests/yml_tests/Resources/Fairbanks_S1_stack incomplete.yml
[
{
"type": "Feature",
"geometry": {
"coordinates": [
[
[
-150.171432,
65.531036
],
[
-149.24501,
63.941902
],
[
-144.135376,
64.386147
],
[
-144.750443,
65.98999
],
[
-150.171432,
65.531036
]
]
],
"type": "Polygon"
},
"properties": {
"beamModeType": "IW",
"browse": null,
"bytes": 4190691686,
"centerLat": 64.9858,
"centerLon": -147.0898,
"faradayRotation": null,
"fileID": "S1B_IW_SLC__1SDV_20210114T032030_20210114T032057_025145_02FE61_454A-SLC",
"flightDirection": null,
"groupID": "S1B_IWDV_0209_0216_025145_094",
"granuleType": "SENTINEL_1B_FRAME",
"insarStackId": null,
"md5sum": "2a76325db9d931414189689082163ee5",
"offNadirAngle": null,
"orbit": 25145,
"pathNumber": 94,
"platform": "Sentinel-1B",
"pointingAngle": null,
"polarization": "VV+VH",
"processingDate": "2021-01-14T03:20:30.000Z",
"processingLevel": "SLC",
"sceneName": "S1B_IW_SLC__1SDV_20210114T032030_20210114T032057_025145_02FE61_454A",
"sensor": "C-SAR",
"startTime": "2021-01-14T03:20:30.000Z",
"stopTime": "2021-01-14T03:20:57.000Z",
"url": "https://datapool.asf.alaska.edu/SLC/SB/S1B_IW_SLC__1SDV_20210114T032030_20210114T032057_025145_02FE61_454A.zip",
"fileName": "S1B_IW_SLC__1SDV_20210114T032030_20210114T032057_025145_02FE61_454A.zip",
"frameNumber": 210,
"temporalBaseline": 0,
"perpendicularBaseline": 0,
},
"baseline": {
"stateVectors": {
"positions": {
"prePosition": [
-2893725.563321,
-1235772.278165,
6327540.675522
],
"prePositionTime": "2021-01-14T03:20:42.000000",
"postPosition": [
-2845242.394622,
-1186516.792913,
6358810.884979
],
"postPositionTime": "2021-01-14T03:20:52.000000"
},
"velocities": {
"preVelocity": [
4828.615891,
4922.252624,
3162.766956
],
"preVelocityTime": "2021-01-14T03:20:42.000000",
"postVelocity": [
4867.928786,
4928.742823,
3091.216457
],
"postVelocityTime": "2021-01-14T03:20:52.000000"
}
},
"ascendingNodeTime": "2021-01-14T03:02:58.414522"
},
"umm": {
"TemporalExtent": {
"RangeDateTime": {
"BeginningDateTime": "2021-01-14T03:20:30.000Z",
"EndingDateTime": "2021-01-14T03:20:57.000Z"
}
},
"OrbitCalculatedSpatialDomains": [
{
"OrbitNumber": 25145
}
],
"GranuleUR": "S1B_IW_SLC__1SDV_20210114T032030_20210114T032057_025145_02FE61_454A-SLC",
"AdditionalAttributes": [
{
"Name": "ACQUISITION_DATE",
"Values": [
"2021-01-14T03:20:57.000000"
]
},
{
"Name": "ASCENDING_DESCENDING",
"Values": [
"ASCENDING"
]
},
{
"Name": "ASC_NODE_TIME",
"Values": [
"2021-01-14T03:02:58.414522"
]
},
{
"Name": "ASF_PLATFORM",
"Values": [
"Sentinel-1B"
]
},
{
"Name": "BEAM_MODE",
"Values": [
"IW"
]
},
{
"Name": "BEAM_MODE_DESC",
"Values": [
"Interferometric Wide. 250 km swath, 5 m x 20 m spatial resolution and burst synchronization for interferometry. IW is considered to be the standard mode over land masses."
]
},
{
"Name": "BEAM_MODE_TYPE",
"Values": [
"IW"
]
},
{
"Name": "BYTES",
"Values": [
"4190691686"
]
},
{
"Name": "CENTER_ESA_FRAME",
"Values": [
"1300"
]
},
{
"Name": "CENTER_FRAME_ID",
"Values": [
"213"
]
},
{
"Name": "CENTER_LAT",
"Values": [
"64.9858"
]
},
{
"Name": "CENTER_LON",
"Values": [
"-147.0898"
]
},
{
"Name": "DOPPLER",
"Values": [
"0"
]
},
{
"Name": "FARADAY_ROTATION",
"Values": [
"NA"
]
},
{
"Name": "FAR_END_LAT",
"Values": [
"65.98999"
]
},
{
"Name": "FAR_END_LON",
"Values": [
"-144.750443"
]
},
{
"Name": "FAR_START_LAT",
"Values": [
"64.386147"
]
},
{
"Name": "FAR_START_LON",
"Values": [
"-144.135376"
]
},
{
"Name": "FRAME_NUMBER",
"Values": [
"210"
]
},
{
"Name": "GRANULE_TYPE",
"Values": [
"SENTINEL_1B_FRAME"
]
},
{
"Name": "GROUP_ID",
"Values": [
"S1B_IWDV_0209_0216_025145_094"
]
},
{
"Name": "LOOK_DIRECTION",
"Values": [
"R"
]
},
{
"Name": "MD5SUM",
"Values": [
"2a76325db9d931414189689082163ee5"
]
},
{
"Name": "MISSION_NAME",
"Values": [
"NA"
]
},
{
"Name": "NEAR_END_LAT",
"Values": [
"65.531036"
]
},
{
"Name": "NEAR_END_LON",
"Values": [
"-150.171432"
]
},
{
"Name": "NEAR_START_LAT",
"Values": [
"63.941902"
]
},
{
"Name": "NEAR_START_LON",
"Values": [
"-149.24501"
]
},
{
"Name": "PATH_NUMBER",
"Values": [
"94"
]
},
{
"Name": "POLARIZATION",
"Values": [
"VV+VH"
]
},
{
"Name": "PROCESSING_DATE",
"Values": [
"2021-01-15T05:50:38.350917"
]
},
{
"Name": "PROCESSING_DESCRIPTION",
"Values": [
"Sentinel-1B Single Look Complex product"
]
},
{
"Name": "PROCESSING_LEVEL",
"Values": [
"L1"
]
},
{
"Name": "PROCESSING_TYPE",
"Values": [
"SLC"
]
},
{
"Name": "PROCESSING_TYPE_DISPLAY",
"Values": [
"L1 Single Look Complex (SLC)"
]
},
{
"Name": "SV_POSITION_POST",
"Values": [
"-2845242.394622,-1186516.792913,6358810.884979,2021-01-14T03:20:52.000000"
]
},
{
"Name": "SV_POSITION_PRE",
"Values": [
"-2893725.563321,-1235772.278165,6327540.675522,2021-01-14T03:20:42.000000"
]
},
{
"Name": "SV_VELOCITY_POST",
"Values": [
"4867.928786,4928.742823,3091.216457,2021-01-14T03:20:52.000000"
]
},
{
"Name": "SV_VELOCITY_PRE",
"Values": [
"4828.615891,4922.252624,3162.766956,2021-01-14T03:20:42.000000"
]
}
],
"SpatialExtent": {
"HorizontalSpatialDomain": {
"Geometry": {
"GPolygons": [
{
"Boundary": {
"Points": [
{
"Longitude": -150.171432,
"Latitude": 65.531036
},
{
"Longitude": -149.24501,
"Latitude": 63.941902
},
{
"Longitude": -144.135376,
"Latitude": 64.386147
},
{
"Longitude": -144.750443,
"Latitude": 65.98999
},
{
"Longitude": -150.171432,
"Latitude": 65.531036
}
]
}
}
]
}
}
},
"ProviderDates": [
{
"Date": "2021-01-15T05:50:42.000Z",
"Type": "Insert"
},
{
"Date": "2021-01-15T05:50:42.000Z",
"Type": "Update"
}
],
"CollectionReference": {
"ShortName": "SENTINEL-1B_SLC",
"Version": "1"
},
"RelatedUrls": [
{
"Format": "Not provided",
"Description": "This link provides direct download access to the granule.",
"Type": "GET DATA",
"URL": "https://datapool.asf.alaska.edu/SLC/SB/S1B_IW_SLC__1SDV_20210114T032030_20210114T032057_025145_02FE61_454A.zip"
},
{
"Format": "Not provided",
"Description": "ASF DAAC Sentinel-1 data set landing page",
"Type": "VIEW RELATED INFORMATION",
"URL": "www.asf.alaska.edu/sar-data-sets/sentinel-1"
},
{
"Format": "Not provided",
"Description": "ASF DAAC Sentinel-1 User Guide and Technical Documentation",
"Type": "VIEW RELATED INFORMATION",
"URL": "www.asf.alaska.edu/sar-information/sentinel-1-documents-tools"
}
],
"DataGranule": {
"DayNightFlag": "Unspecified",
"Identifiers": [
{
"Identifier": "S1B_IW_SLC__1SDV_20210114T032030_20210114T032057_025145_02FE61_454A",
"IdentifierType": "ProducerGranuleId"
}
],
"ProductionDateTime": "2021-01-14T03:20:30.000Z",
"ArchiveAndDistributionInformation": [
{
"Name": "Not provided",
"Size": 3996.5550289154053,
"SizeUnit": "MB",
"Format": "Not provided"
}
]
},
"Platforms": [
{
"ShortName": "SENTINEL-1B",
"Instruments": [
{
"ShortName": "C-SAR"
}
]
}
]
},
"meta": {
"concept-type": "granule",
"concept-id": "G1993900886-ASF",
"revision-id": 1,
"native-id": "S1B_IW_SLC__1SDV_20210114T032030_20210114T032057_025145_02FE61_454A-SLC",
"provider-id": "ASF",
"format": "application/echo10+xml",
"revision-date": "2021-01-15T05:50:42.934Z"
}
},
{
"type": "Feature",
"geometry": {
"coordinates": [
[
[
-150.171249,
65.531166
],
[
-149.244873,
63.942146
],
[
-144.135239,
64.386398
],
[
-144.750275,
65.99012
],
[
-150.171249,
65.531166
]
]
],
"type": "Polygon"
},
"properties": {
"beamModeType": "IW",
"browse": null,
"bytes": 4184931225,
"centerLat": 64.986,
"centerLon": -147.0897,
"faradayRotation": null,
"fileID": "S1B_IW_SLC__1SDV_20210126T032030_20210126T032057_025320_0303F3_7BE5-SLC",
"flightDirection": null,
"groupID": "S1B_IWDV_0209_0216_025320_094",
"granuleType": "SENTINEL_1B_FRAME",
"insarStackId": null,
"md5sum": "4e51b15bbe20cf3fdb1e099ebb4d2c36",
"offNadirAngle": null,
"orbit": 25320,
"pathNumber": 94,
"platform": "Sentinel-1B",
"pointingAngle": null,
"polarization": "VV+VH",
"processingDate": "2021-01-26T03:20:30.000Z",
"processingLevel": "SLC",
"sceneName": "S1B_IW_SLC__1SDV_20210126T032030_20210126T032057_025320_0303F3_7BE5",
"sensor": "C-SAR",
"startTime": "2021-01-26T03:20:30.000Z",
"stopTime": "2021-01-26T03:20:57.000Z",
"url": "https://datapool.asf.alaska.edu/SLC/SB/S1B_IW_SLC__1SDV_20210126T032030_20210126T032057_025320_0303F3_7BE5.zip",
"fileName": "S1B_IW_SLC__1SDV_20210126T032030_20210126T032057_025320_0303F3_7BE5.zip",
"frameNumber": 210,
"temporalBaseline": 12,
"perpendicularBaseline": 3,
},
"baseline": {
"stateVectors": {
"positions": {
"prePosition": [
-2893709.242044,
-1235768.184608,
6327545.175914
],
"prePositionTime": "2021-01-26T03:20:42.000000",
"postPosition": [
-2845226.283146,
-1186512.31404,
6358815.223669
],
"postPositionTime": "2021-01-26T03:20:52.000000"
},
"velocities": {
"preVelocity": [
4828.59494,
4922.291175,
3162.750859
],
"preVelocityTime": "2021-01-26T03:20:42.000000",
"postVelocity": [
4867.907754,
4928.781369,
3091.20021
],
"postVelocityTime": "2021-01-26T03:20:52.000000"
}
},
"ascendingNodeTime": "2021-01-26T03:02:57.994247"
},
"umm": {
"TemporalExtent": {
"RangeDateTime": {
"BeginningDateTime": "2021-01-26T03:20:30.000Z",
"EndingDateTime": "2021-01-26T03:20:57.000Z"
}
},
"OrbitCalculatedSpatialDomains": [
{
"OrbitNumber": 25320
}
],
"GranuleUR": "S1B_IW_SLC__1SDV_20210126T032030_20210126T032057_025320_0303F3_7BE5-SLC",
"AdditionalAttributes": [
{
"Name": "ACQUISITION_DATE",
"Values": [
"2021-01-26T03:20:57.000000"
]
},
{
"Name": "ASCENDING_DESCENDING",
"Values": [
"ASCENDING"
]
},
{
"Name": "ASC_NODE_TIME",
"Values": [
"2021-01-26T03:02:57.994247"
]
},
{
"Name": "ASF_PLATFORM",
"Values": [
"Sentinel-1B"
]
},
{
"Name": "BEAM_MODE",
"Values": [
"IW"
]
},
{
"Name": "BEAM_MODE_DESC",
"Values": [
"Interferometric Wide. 250 km swath, 5 m x 20 m spatial resolution and burst synchronization for interferometry. IW is considered to be the standard mode over land masses."
]
},
{
"Name": "BEAM_MODE_TYPE",
"Values": [
"IW"
]
},
{
"Name": "BYTES",
"Values": [
"4184931225"
]
},
{
"Name": "CENTER_ESA_FRAME",
"Values": [
"1300"
]
},
{
"Name": "CENTER_FRAME_ID",
"Values": [
"213"
]
},
{
"Name": "CENTER_LAT",
"Values": [
"64.986"
]
},
{
"Name": "CENTER_LON",
"Values": [
"-147.0897"
]
},
{
"Name": "DOPPLER",
"Values": [
"0"
]
},
{
"Name": "FARADAY_ROTATION",
"Values": [
"NA"
]
},
{
"Name": "FAR_END_LAT",
"Values": [
"65.99012"
]
},
{
"Name": "FAR_END_LON",
"Values": [
"-144.750275"
]
},
{
"Name": "FAR_START_LAT",
"Values": [
"64.386398"
]
},
{
"Name": "FAR_START_LON",
"Values": [
"-144.135239"
]
},
{
"Name": "FRAME_NUMBER",
"Values": [
"210"
]
},
{
"Name": "GRANULE_TYPE",
"Values": [
"SENTINEL_1B_FRAME"
]
},
{
"Name": "GROUP_ID",
"Values": [
"S1B_IWDV_0209_0216_025320_094"
]
},
{
"Name": "LOOK_DIRECTION",
"Values": [
"R"
]
},
{
"Name": "MD5SUM",
"Values": [
"4e51b15bbe20cf3fdb1e099ebb4d2c36"
]
},
{
"Name": "MISSION_NAME",
"Values": [
"NA"
]
},
{
"Name": "NEAR_END_LAT",
"Values": [
"65.531166"
]
},
{
"Name": "NEAR_END_LON",
"Values": [
"-150.171249"
]
},
{
"Name": "NEAR_START_LAT",
"Values": [
"63.942146"
]
},
{
"Name": "NEAR_START_LON",
"Values": [
"-149.244873"
]
},
{
"Name": "PATH_NUMBER",
"Values": [
"94"
]
},
{
"Name": "POLARIZATION",
"Values": [
"VV+VH"
]
},
{
"Name": "PROCESSING_DATE",
"Values": [
"2021-01-26T23:06:47.704061"
]
},
{
"Name": "PROCESSING_DESCRIPTION",
"Values": [
"Sentinel-1B Single Look Complex product"
]
},
{
"Name": "PROCESSING_LEVEL",
"Values": [
"L1"
]
},
{
"Name": "PROCESSING_TYPE",
"Values": [
"SLC"
]
},
{
"Name": "PROCESSING_TYPE_DISPLAY",
"Values": [
"L1 Single Look Complex (SLC)"
]
},
{
"Name": "SV_POSITION_POST",
"Values": [
"-2845226.283146,-1186512.31404,6358815.223669,2021-01-26T03:20:52.000000"
]
},
{
"Name": "SV_POSITION_PRE",
"Values": [
"-2893709.242044,-1235768.184608,6327545.175914,2021-01-26T03:20:42.000000"
]
},
{
"Name": "SV_VELOCITY_POST",
"Values": [
"4867.907754,4928.781369,3091.20021,2021-01-26T03:20:52.000000"
]
},
{
"Name": "SV_VELOCITY_PRE",
"Values": [
"4828.59494,4922.291175,3162.750859,2021-01-26T03:20:42.000000"
]
}
],
"SpatialExtent": {
"HorizontalSpatialDomain": {
"Geometry": {
"GPolygons": [
{
"Boundary": {
"Points": [
{
"Longitude": -150.171249,
"Latitude": 65.531166
},
{
"Longitude": -149.244873,
"Latitude": 63.942146
},
{
"Longitude": -144.135239,
"Latitude": 64.386398
},
{
"Longitude": -144.750275,
"Latitude": 65.99012
},
{
"Longitude": -150.171249,
"Latitude": 65.531166
}
]
}
}
]
}
}
},
"ProviderDates": [
{
"Date": "2021-01-26T23:07:05.000Z",
"Type": "Insert"
},
{
"Date": "2021-01-26T23:07:05.000Z",
"Type": "Update"
}
],
"CollectionReference": {
"ShortName": "SENTINEL-1B_SLC",
"Version": "1"
},
"RelatedUrls": [
{
"Format": "Not provided",
"Description": "This link provides direct download access to the granule.",
"Type": "GET DATA",
"URL": "https://datapool.asf.alaska.edu/SLC/SB/S1B_IW_SLC__1SDV_20210126T032030_20210126T032057_025320_0303F3_7BE5.zip"
},
{
"Format": "Not provided",
"Description": "ASF DAAC Sentinel-1 data set landing page",
"Type": "VIEW RELATED INFORMATION",
"URL": "www.asf.alaska.edu/sar-data-sets/sentinel-1"
},
{
"Format": "Not provided",
"Description": "ASF DAAC Sentinel-1 User Guide and Technical Documentation",
"Type": "VIEW RELATED INFORMATION",
"URL": "www.asf.alaska.edu/sar-information/sentinel-1-documents-tools"
}
],
"DataGranule": {
"DayNightFlag": "Unspecified",
"Identifiers": [
{
"Identifier": "S1B_IW_SLC__1SDV_20210126T032030_20210126T032057_025320_0303F3_7BE5",
"IdentifierType": "ProducerGranuleId"
}
],
"ProductionDateTime": "2021-01-26T03:20:30.000Z",
"ArchiveAndDistributionInformation": [
{
"Name": "Not provided",
"Size": 3991.0614252090454,
"SizeUnit": "MB",
"Format": "Not provided"
}
]
},
"Platforms": [
{
"ShortName": "SENTINEL-1B",
"Instruments": [
{
"ShortName": "C-SAR"
}
]
}
]
},
"meta": {
"concept-type": "granule",
"concept-id": "G1996951408-ASF",
"revision-id": 1,
"native-id": "S1B_IW_SLC__1SDV_20210126T032030_20210126T032057_025320_0303F3_7BE5-SLC",
"provider-id": "ASF",
"format": "application/echo10+xml",
"revision-date": "2021-01-26T23:07:05.405Z"
}
}
] Discovery-asf_search-8.1.2/tests/yml_tests/Resources/Fairbanks_S1_stack.yml 0000664 0000000 0000000 00000136065 14777330235 0027125 0 ustar 00root root 0000000 0000000 [
{
"geometry":
{
"coordinates":
[
[
[-150.172562, 65.53125],
[-149.246063, 63.942123],
[-144.136368, 64.386414],
[-144.751495, 65.99025],
[-150.172562, 65.53125],
],
],
"type": "Polygon",
},
"meta":
{
"concept-type": "granule",
"concept-id": "G1989758351-ASF",
"revision-id": 13,
"native-id": "S1B_IW_SLC__1SDV_20210102T032031_20210102T032058_024970_02F8C3_C081-SLC",
"provider-id": "ASF",
"format": "application/echo10+xml",
"revision-date": "2023-06-16T03:04:36.493Z",
},
"umm":
{
"TemporalExtent":
{
"RangeDateTime":
{
"BeginningDateTime": "2021-01-02T03:20:31.092706Z",
"EndingDateTime": "2021-01-02T03:20:58.059549Z",
},
},
"OrbitCalculatedSpatialDomains": [{ "OrbitNumber": 24970 }],
"GranuleUR": "S1B_IW_SLC__1SDV_20210102T032031_20210102T032058_024970_02F8C3_C081-SLC",
"AdditionalAttributes":
[
{
"Name": "ACQUISITION_DATE",
"Values": ["2021-01-02T03:20:58.059549Z"],
},
{
"Name": "ASCENDING_DESCENDING",
"Values": ["ASCENDING"],
},
{
"Name": "ASC_NODE_TIME",
"Values": ["2021-01-02T03:02:58.934857Z"],
},
{ "Name": "ASF_PLATFORM", "Values": ["Sentinel-1B"] },
{ "Name": "BEAM_MODE", "Values": ["IW"] },
{
"Name": "BEAM_MODE_DESC",
"Values":
[
"Interferometric Wide. 250 km swath, 5 m x 20 m spatial resolution and burst synchronization for interferometry. IW is considered to be the standard mode over land masses.",
],
},
{ "Name": "BEAM_MODE_TYPE", "Values": ["IW"] },
{ "Name": "BYTES", "Values": ["4193723581"] },
{ "Name": "CENTER_ESA_FRAME", "Values": ["1300"] },
{ "Name": "CENTER_FRAME_ID", "Values": ["213"] },
{ "Name": "CENTER_LAT", "Values": ["64.9861"] },
{ "Name": "CENTER_LON", "Values": ["-147.0909"] },
{ "Name": "DOPPLER", "Values": ["0"] },
{ "Name": "FARADAY_ROTATION", "Values": ["NA"] },
{ "Name": "FAR_END_LAT", "Values": ["65.99025"] },
{ "Name": "FAR_END_LON", "Values": ["-144.751495"] },
{ "Name": "FAR_START_LAT", "Values": ["64.386414"] },
{ "Name": "FAR_START_LON", "Values": ["-144.136368"] },
{ "Name": "FRAME_NUMBER", "Values": ["210"] },
{
"Name": "GRANULE_TYPE",
"Values": ["SENTINEL_1B_FRAME"],
},
{
"Name": "GROUP_ID",
"Values": ["S1B_IWDV_0209_0216_024970_094"],
},
{ "Name": "LOOK_DIRECTION", "Values": ["R"] },
{
"Name": "MD5SUM",
"Values": ["6dd7f6a56ed98ba7037dfeb833217d5b"],
},
{ "Name": "MISSION_NAME", "Values": ["NA"] },
{ "Name": "NEAR_END_LAT", "Values": ["65.53125"] },
{ "Name": "NEAR_END_LON", "Values": ["-150.172562"] },
{ "Name": "NEAR_START_LAT", "Values": ["63.942123"] },
{ "Name": "NEAR_START_LON", "Values": ["-149.246063"] },
{ "Name": "PATH_NUMBER", "Values": ["94"] },
{ "Name": "POLARIZATION", "Values": ["VV+VH"] },
{
"Name": "PROCESSING_DATE",
"Values": ["2021-01-02T12:40:19.324537Z"],
},
{
"Name": "PROCESSING_DESCRIPTION",
"Values":
["Sentinel-1B Single Look Complex product"],
},
{ "Name": "PROCESSING_LEVEL", "Values": ["L1"] },
{ "Name": "PROCESSING_TYPE", "Values": ["SLC"] },
{
"Name": "PROCESSING_TYPE_DISPLAY",
"Values": ["L1 Single Look Complex (SLC)"],
},
{
"Name": "SV_POSITION_POST",
"Values":
[
"-2845284.115433,-1186496.621016,6358798.348458,2021-01-02T03:20:53.000000",
],
},
{
"Name": "SV_POSITION_PRE",
"Values":
[
"-2893767.065414,-1235752.268405,6327528.043215,2021-01-02T03:20:43.000000",
],
},
{
"Name": "SV_VELOCITY_POST",
"Values":
[
"4867.907153,4928.758938,3091.226142,2021-01-02T03:20:53.000000",
],
},
{
"Name": "SV_VELOCITY_PRE",
"Values":
[
"4828.593801,4922.268943,3162.776438,2021-01-02T03:20:43.000000",
],
},
],
"SpatialExtent":
{
"HorizontalSpatialDomain":
{
"Geometry":
{
"GPolygons":
[
{
"Boundary":
{
"Points":
[
{
"Longitude": -150.172562,
"Latitude": 65.53125,
},
{
"Longitude": -149.246063,
"Latitude": 63.942123,
},
{
"Longitude": -144.136368,
"Latitude": 64.386414,
},
{
"Longitude": -144.751495,
"Latitude": 65.99025,
},
{
"Longitude": -150.172562,
"Latitude": 65.53125,
},
],
},
},
],
},
},
},
"ProviderDates":
[
{
"Date": "2023-06-16T03:04:36.000Z",
"Type": "Insert",
},
{
"Date": "2023-06-16T03:04:36.000Z",
"Type": "Update",
},
],
"CollectionReference":
{ "ShortName": "SENTINEL-1B_SLC", "Version": "1" },
"PGEVersionClass":
{ "PGEName": "Sentinel-1 IPF", "PGEVersion": "003.31" },
"RelatedUrls":
[
{
"Format": "Not provided",
"Description": "This link provides direct download access to the granule.",
"Type": "GET DATA",
"URL": "https://datapool.asf.alaska.edu/SLC/SB/S1B_IW_SLC__1SDV_20210102T032031_20210102T032058_024970_02F8C3_C081.zip",
},
{
"Format": "Not provided",
"Description": "This link provides direct download access to the granule.",
"Type": "GET DATA",
"URL": "s3://asf-ngap2w-p-s1-slc-7b420b89/S1B_IW_SLC__1SDV_20210102T032031_20210102T032058_024970_02F8C3_C081.zip",
},
{
"Format": "Not provided",
"Description": "ASF DAAC Sentinel-1 data set landing page",
"Type": "VIEW RELATED INFORMATION",
"URL": "www.asf.alaska.edu/sar-data-sets/sentinel-1",
},
{
"Format": "Not provided",
"Description": "ASF DAAC Sentinel-1 User Guide and Technical Documentation",
"Type": "VIEW RELATED INFORMATION",
"URL": "www.asf.alaska.edu/sar-information/sentinel-1-documents-tools",
},
],
"DataGranule":
{
"DayNightFlag": "Unspecified",
"Identifiers":
[
{
"Identifier": "S1B_IW_SLC__1SDV_20210102T032031_20210102T032058_024970_02F8C3_C081",
"IdentifierType": "ProducerGranuleId",
},
],
"ProductionDateTime": "2021-01-02T03:20:31.092706Z",
"ArchiveAndDistributionInformation":
[
{
"Name": "Not provided",
"Size": 3999.446469306946,
"SizeUnit": "MB",
"Format": "Not provided",
},
],
},
"Platforms":
[
{
"ShortName": "SENTINEL-1B",
"Instruments": [{ "ShortName": "C-SAR" }],
},
],
},
"properties":
{
"beamModeType": "IW",
"browse": None,
"bytes": 4193723581,
"centerLat": 64.9861,
"centerLon": -147.0909,
"faradayRotation": None,
"fileID": "S1B_IW_SLC__1SDV_20210102T032031_20210102T032058_024970_02F8C3_C081-SLC",
"flightDirection": "ASCENDING",
"groupID": "S1B_IWDV_0209_0216_024970_094",
"granuleType": "SENTINEL_1B_FRAME",
"insarStackId": None,
"md5sum": "6dd7f6a56ed98ba7037dfeb833217d5b",
"offNadirAngle": None,
"orbit": 24970,
"pathNumber": 94,
"platform": "Sentinel-1B",
"pointingAngle": None,
"polarization": "VV+VH",
"processingDate": "2021-01-02T03:20:31.092706Z",
"processingLevel": "SLC",
"sceneName": "S1B_IW_SLC__1SDV_20210102T032031_20210102T032058_024970_02F8C3_C081",
"sensor": "C-SAR",
"startTime": "2021-01-02T03:20:31.092706Z",
"stopTime": "2021-01-02T03:20:58.059549Z",
"url": "https://datapool.asf.alaska.edu/SLC/SB/S1B_IW_SLC__1SDV_20210102T032031_20210102T032058_024970_02F8C3_C081.zip",
"pgeVersion": "003.31",
"fileName": "S1B_IW_SLC__1SDV_20210102T032031_20210102T032058_024970_02F8C3_C081.zip",
"frameNumber": 210,
"temporalBaseline": 0,
"perpendicularBaseline": 0,
},
"baseline":
{
"stateVectors":
{
"positions":
{
"prePosition":
[
-2893767.065414,
-1235752.268405,
6327528.043215,
],
"prePositionTime": "2021-01-02T03:20:43.000000",
"postPosition":
[
-2845284.115433,
-1186496.621016,
6358798.348458,
],
"postPositionTime": "2021-01-02T03:20:53.000000",
},
"velocities":
{
"preVelocity":
[4828.593801, 4922.268943, 3162.776438],
"preVelocityTime": "2021-01-02T03:20:43.000000",
"postVelocity":
[4867.907153, 4928.758938, 3091.226142],
"postVelocityTime": "2021-01-02T03:20:53.000000",
},
},
"ascendingNodeTime": "2021-01-02T03:02:58.934857Z",
},
},
{
"geometry":
{
"coordinates":
[
[
[-150.171432, 65.531036],
[-149.24501, 63.941902],
[-144.135376, 64.386147],
[-144.750443, 65.98999],
[-150.171432, 65.531036],
],
],
"type": "Polygon",
},
"meta":
{
"concept-type": "granule",
"concept-id": "G1993900886-ASF",
"revision-id": 5,
"native-id": "S1B_IW_SLC__1SDV_20210114T032030_20210114T032057_025145_02FE61_454A-SLC",
"provider-id": "ASF",
"format": "application/echo10+xml",
"revision-date": "2023-06-16T03:05:43.819Z",
},
"umm":
{
"TemporalExtent":
{
"RangeDateTime":
{
"BeginningDateTime": "2021-01-14T03:20:30.566073Z",
"EndingDateTime": "2021-01-14T03:20:57.532917Z",
},
},
"OrbitCalculatedSpatialDomains": [{ "OrbitNumber": 25145 }],
"GranuleUR": "S1B_IW_SLC__1SDV_20210114T032030_20210114T032057_025145_02FE61_454A-SLC",
"AdditionalAttributes":
[
{
"Name": "ACQUISITION_DATE",
"Values": ["2021-01-14T03:20:57.532917Z"],
},
{
"Name": "ASCENDING_DESCENDING",
"Values": ["ASCENDING"],
},
{
"Name": "ASC_NODE_TIME",
"Values": ["2021-01-14T03:02:58.414522Z"],
},
{ "Name": "ASF_PLATFORM", "Values": ["Sentinel-1B"] },
{ "Name": "BEAM_MODE", "Values": ["IW"] },
{
"Name": "BEAM_MODE_DESC",
"Values":
[
"Interferometric Wide. 250 km swath, 5 m x 20 m spatial resolution and burst synchronization for interferometry. IW is considered to be the standard mode over land masses.",
],
},
{ "Name": "BEAM_MODE_TYPE", "Values": ["IW"] },
{ "Name": "BYTES", "Values": ["4190691686"] },
{ "Name": "CENTER_ESA_FRAME", "Values": ["1300"] },
{ "Name": "CENTER_FRAME_ID", "Values": ["213"] },
{ "Name": "CENTER_LAT", "Values": ["64.9858"] },
{ "Name": "CENTER_LON", "Values": ["-147.0898"] },
{ "Name": "DOPPLER", "Values": ["0"] },
{ "Name": "FARADAY_ROTATION", "Values": ["NA"] },
{ "Name": "FAR_END_LAT", "Values": ["65.98999"] },
{ "Name": "FAR_END_LON", "Values": ["-144.750443"] },
{ "Name": "FAR_START_LAT", "Values": ["64.386147"] },
{ "Name": "FAR_START_LON", "Values": ["-144.135376"] },
{ "Name": "FRAME_NUMBER", "Values": ["210"] },
{
"Name": "GRANULE_TYPE",
"Values": ["SENTINEL_1B_FRAME"],
},
{
"Name": "GROUP_ID",
"Values": ["S1B_IWDV_0209_0216_025145_094"],
},
{ "Name": "LOOK_DIRECTION", "Values": ["R"] },
{
"Name": "MD5SUM",
"Values": ["2a76325db9d931414189689082163ee5"],
},
{ "Name": "MISSION_NAME", "Values": ["NA"] },
{ "Name": "NEAR_END_LAT", "Values": ["65.531036"] },
{ "Name": "NEAR_END_LON", "Values": ["-150.171432"] },
{ "Name": "NEAR_START_LAT", "Values": ["63.941902"] },
{ "Name": "NEAR_START_LON", "Values": ["-149.24501"] },
{ "Name": "PATH_NUMBER", "Values": ["94"] },
{ "Name": "POLARIZATION", "Values": ["VV+VH"] },
{
"Name": "PROCESSING_DATE",
"Values": ["2021-01-15T05:50:38.350917Z"],
},
{
"Name": "PROCESSING_DESCRIPTION",
"Values":
["Sentinel-1B Single Look Complex product"],
},
{ "Name": "PROCESSING_LEVEL", "Values": ["L1"] },
{ "Name": "PROCESSING_TYPE", "Values": ["SLC"] },
{
"Name": "PROCESSING_TYPE_DISPLAY",
"Values": ["L1 Single Look Complex (SLC)"],
},
{
"Name": "SV_POSITION_POST",
"Values":
[
"-2845242.394622,-1186516.792913,6358810.884979,2021-01-14T03:20:52.000000",
],
},
{
"Name": "SV_POSITION_PRE",
"Values":
[
"-2893725.563321,-1235772.278165,6327540.675522,2021-01-14T03:20:42.000000",
],
},
{
"Name": "SV_VELOCITY_POST",
"Values":
[
"4867.928786,4928.742823,3091.216457,2021-01-14T03:20:52.000000",
],
},
{
"Name": "SV_VELOCITY_PRE",
"Values":
[
"4828.615891,4922.252624,3162.766956,2021-01-14T03:20:42.000000",
],
},
],
"SpatialExtent":
{
"HorizontalSpatialDomain":
{
"Geometry":
{
"GPolygons":
[
{
"Boundary":
{
"Points":
[
{
"Longitude": -150.171432,
"Latitude": 65.531036,
},
{
"Longitude": -149.24501,
"Latitude": 63.941902,
},
{
"Longitude": -144.135376,
"Latitude": 64.386147,
},
{
"Longitude": -144.750443,
"Latitude": 65.98999,
},
{
"Longitude": -150.171432,
"Latitude": 65.531036,
},
],
},
},
],
},
},
},
"ProviderDates":
[
{
"Date": "2023-06-16T03:05:43.000Z",
"Type": "Insert",
},
{
"Date": "2023-06-16T03:05:43.000Z",
"Type": "Update",
},
],
"CollectionReference":
{ "ShortName": "SENTINEL-1B_SLC", "Version": "1" },
"PGEVersionClass":
{ "PGEName": "Sentinel-1 IPF", "PGEVersion": "003.31" },
"RelatedUrls":
[
{
"Format": "Not provided",
"Description": "This link provides direct download access to the granule.",
"Type": "GET DATA",
"URL": "https://datapool.asf.alaska.edu/SLC/SB/S1B_IW_SLC__1SDV_20210114T032030_20210114T032057_025145_02FE61_454A.zip",
},
{
"Format": "Not provided",
"Description": "This link provides direct download access to the granule.",
"Type": "GET DATA",
"URL": "s3://asf-ngap2w-p-s1-slc-7b420b89/S1B_IW_SLC__1SDV_20210114T032030_20210114T032057_025145_02FE61_454A.zip",
},
{
"Format": "Not provided",
"Description": "ASF DAAC Sentinel-1 data set landing page",
"Type": "VIEW RELATED INFORMATION",
"URL": "www.asf.alaska.edu/sar-data-sets/sentinel-1",
},
{
"Format": "Not provided",
"Description": "ASF DAAC Sentinel-1 User Guide and Technical Documentation",
"Type": "VIEW RELATED INFORMATION",
"URL": "www.asf.alaska.edu/sar-information/sentinel-1-documents-tools",
},
],
"DataGranule":
{
"DayNightFlag": "Unspecified",
"Identifiers":
[
{
"Identifier": "S1B_IW_SLC__1SDV_20210114T032030_20210114T032057_025145_02FE61_454A",
"IdentifierType": "ProducerGranuleId",
},
],
"ProductionDateTime": "2021-01-14T03:20:30.566073Z",
"ArchiveAndDistributionInformation":
[
{
"Name": "Not provided",
"Size": 3996.5550289154053,
"SizeUnit": "MB",
"Format": "Not provided",
},
],
},
"Platforms":
[
{
"ShortName": "SENTINEL-1B",
"Instruments": [{ "ShortName": "C-SAR" }],
},
],
},
"properties":
{
"beamModeType": "IW",
"browse": None,
"bytes": 4190691686,
"centerLat": 64.9858,
"centerLon": -147.0898,
"faradayRotation": None,
"fileID": "S1B_IW_SLC__1SDV_20210114T032030_20210114T032057_025145_02FE61_454A-SLC",
"flightDirection": "ASCENDING",
"groupID": "S1B_IWDV_0209_0216_025145_094",
"granuleType": "SENTINEL_1B_FRAME",
"insarStackId": None,
"md5sum": "2a76325db9d931414189689082163ee5",
"offNadirAngle": None,
"orbit": 25145,
"pathNumber": 94,
"platform": "Sentinel-1B",
"pointingAngle": None,
"polarization": "VV+VH",
"processingDate": "2021-01-14T03:20:30.566073Z",
"processingLevel": "SLC",
"sceneName": "S1B_IW_SLC__1SDV_20210114T032030_20210114T032057_025145_02FE61_454A",
"sensor": "C-SAR",
"startTime": "2021-01-14T03:20:30.566073Z",
"stopTime": "2021-01-14T03:20:57.532917Z",
"url": "https://datapool.asf.alaska.edu/SLC/SB/S1B_IW_SLC__1SDV_20210114T032030_20210114T032057_025145_02FE61_454A.zip",
"pgeVersion": "003.31",
"fileName": "S1B_IW_SLC__1SDV_20210114T032030_20210114T032057_025145_02FE61_454A.zip",
"frameNumber": 210,
"temporalBaseline": 12,
"perpendicularBaseline": 37,
},
"baseline":
{
"stateVectors":
{
"positions":
{
"prePosition":
[
-2893725.563321,
-1235772.278165,
6327540.675522,
],
"prePositionTime": "2021-01-14T03:20:42.000000",
"postPosition":
[
-2845242.394622,
-1186516.792913,
6358810.884979,
],
"postPositionTime": "2021-01-14T03:20:52.000000",
},
"velocities":
{
"preVelocity":
[4828.615891, 4922.252624, 3162.766956],
"preVelocityTime": "2021-01-14T03:20:42.000000",
"postVelocity":
[4867.928786, 4928.742823, 3091.216457],
"postVelocityTime": "2021-01-14T03:20:52.000000",
},
},
"ascendingNodeTime": "2021-01-14T03:02:58.414522Z",
},
},
{
"geometry":
{
"coordinates":
[
[
[-150.171249, 65.531166],
[-149.244873, 63.942146],
[-144.135239, 64.386398],
[-144.750275, 65.99012],
[-150.171249, 65.531166],
],
],
"type": "Polygon",
},
"meta":
{
"concept-type": "granule",
"concept-id": "G1996951408-ASF",
"revision-id": 5,
"native-id": "S1B_IW_SLC__1SDV_20210126T032030_20210126T032057_025320_0303F3_7BE5-SLC",
"provider-id": "ASF",
"format": "application/echo10+xml",
"revision-date": "2023-06-16T03:10:08.749Z",
},
"umm":
{
"TemporalExtent":
{
"RangeDateTime":
{
"BeginningDateTime": "2021-01-26T03:20:30.147637Z",
"EndingDateTime": "2021-01-26T03:20:57.112425Z",
},
},
"OrbitCalculatedSpatialDomains": [{ "OrbitNumber": 25320 }],
"GranuleUR": "S1B_IW_SLC__1SDV_20210126T032030_20210126T032057_025320_0303F3_7BE5-SLC",
"AdditionalAttributes":
[
{
"Name": "ACQUISITION_DATE",
"Values": ["2021-01-26T03:20:57.112425Z"],
},
{
"Name": "ASCENDING_DESCENDING",
"Values": ["ASCENDING"],
},
{
"Name": "ASC_NODE_TIME",
"Values": ["2021-01-26T03:02:57.994247Z"],
},
{ "Name": "ASF_PLATFORM", "Values": ["Sentinel-1B"] },
{ "Name": "BEAM_MODE", "Values": ["IW"] },
{
"Name": "BEAM_MODE_DESC",
"Values":
[
"Interferometric Wide. 250 km swath, 5 m x 20 m spatial resolution and burst synchronization for interferometry. IW is considered to be the standard mode over land masses.",
],
},
{ "Name": "BEAM_MODE_TYPE", "Values": ["IW"] },
{ "Name": "BYTES", "Values": ["4184931225"] },
{ "Name": "CENTER_ESA_FRAME", "Values": ["1300"] },
{ "Name": "CENTER_FRAME_ID", "Values": ["213"] },
{ "Name": "CENTER_LAT", "Values": ["64.986"] },
{ "Name": "CENTER_LON", "Values": ["-147.0897"] },
{ "Name": "DOPPLER", "Values": ["0"] },
{ "Name": "FARADAY_ROTATION", "Values": ["NA"] },
{ "Name": "FAR_END_LAT", "Values": ["65.99012"] },
{ "Name": "FAR_END_LON", "Values": ["-144.750275"] },
{ "Name": "FAR_START_LAT", "Values": ["64.386398"] },
{ "Name": "FAR_START_LON", "Values": ["-144.135239"] },
{ "Name": "FRAME_NUMBER", "Values": ["210"] },
{
"Name": "GRANULE_TYPE",
"Values": ["SENTINEL_1B_FRAME"],
},
{
"Name": "GROUP_ID",
"Values": ["S1B_IWDV_0209_0216_025320_094"],
},
{ "Name": "LOOK_DIRECTION", "Values": ["R"] },
{
"Name": "MD5SUM",
"Values": ["4e51b15bbe20cf3fdb1e099ebb4d2c36"],
},
{ "Name": "MISSION_NAME", "Values": ["NA"] },
{ "Name": "NEAR_END_LAT", "Values": ["65.531166"] },
{ "Name": "NEAR_END_LON", "Values": ["-150.171249"] },
{ "Name": "NEAR_START_LAT", "Values": ["63.942146"] },
{ "Name": "NEAR_START_LON", "Values": ["-149.244873"] },
{ "Name": "PATH_NUMBER", "Values": ["94"] },
{ "Name": "POLARIZATION", "Values": ["VV+VH"] },
{
"Name": "PROCESSING_DATE",
"Values": ["2021-01-26T23:06:47.704061Z"],
},
{
"Name": "PROCESSING_DESCRIPTION",
"Values":
["Sentinel-1B Single Look Complex product"],
},
{ "Name": "PROCESSING_LEVEL", "Values": ["L1"] },
{ "Name": "PROCESSING_TYPE", "Values": ["SLC"] },
{
"Name": "PROCESSING_TYPE_DISPLAY",
"Values": ["L1 Single Look Complex (SLC)"],
},
{
"Name": "SV_POSITION_POST",
"Values":
[
"-2845226.283146,-1186512.31404,6358815.223669,2021-01-26T03:20:52.000000",
],
},
{
"Name": "SV_POSITION_PRE",
"Values":
[
"-2893709.242044,-1235768.184608,6327545.175914,2021-01-26T03:20:42.000000",
],
},
{
"Name": "SV_VELOCITY_POST",
"Values":
[
"4867.907754,4928.781369,3091.20021,2021-01-26T03:20:52.000000",
],
},
{
"Name": "SV_VELOCITY_PRE",
"Values":
[
"4828.59494,4922.291175,3162.750859,2021-01-26T03:20:42.000000",
],
},
],
"SpatialExtent":
{
"HorizontalSpatialDomain":
{
"Geometry":
{
"GPolygons":
[
{
"Boundary":
{
"Points":
[
{
"Longitude": -150.171249,
"Latitude": 65.531166,
},
{
"Longitude": -149.244873,
"Latitude": 63.942146,
},
{
"Longitude": -144.135239,
"Latitude": 64.386398,
},
{
"Longitude": -144.750275,
"Latitude": 65.99012,
},
{
"Longitude": -150.171249,
"Latitude": 65.531166,
},
],
},
},
],
},
},
},
"ProviderDates":
[
{
"Date": "2023-06-16T03:10:08.000Z",
"Type": "Insert",
},
{
"Date": "2023-06-16T03:10:08.000Z",
"Type": "Update",
},
],
"CollectionReference":
{ "ShortName": "SENTINEL-1B_SLC", "Version": "1" },
"PGEVersionClass":
{ "PGEName": "Sentinel-1 IPF", "PGEVersion": "003.31" },
"RelatedUrls":
[
{
"Format": "Not provided",
"Description": "This link provides direct download access to the granule.",
"Type": "GET DATA",
"URL": "https://datapool.asf.alaska.edu/SLC/SB/S1B_IW_SLC__1SDV_20210126T032030_20210126T032057_025320_0303F3_7BE5.zip",
},
{
"Format": "Not provided",
"Description": "This link provides direct download access to the granule.",
"Type": "GET DATA",
"URL": "s3://asf-ngap2w-p-s1-slc-7b420b89/S1B_IW_SLC__1SDV_20210126T032030_20210126T032057_025320_0303F3_7BE5.zip",
},
{
"Format": "Not provided",
"Description": "ASF DAAC Sentinel-1 data set landing page",
"Type": "VIEW RELATED INFORMATION",
"URL": "www.asf.alaska.edu/sar-data-sets/sentinel-1",
},
{
"Format": "Not provided",
"Description": "ASF DAAC Sentinel-1 User Guide and Technical Documentation",
"Type": "VIEW RELATED INFORMATION",
"URL": "www.asf.alaska.edu/sar-information/sentinel-1-documents-tools",
},
],
"DataGranule":
{
"DayNightFlag": "Unspecified",
"Identifiers":
[
{
"Identifier": "S1B_IW_SLC__1SDV_20210126T032030_20210126T032057_025320_0303F3_7BE5",
"IdentifierType": "ProducerGranuleId",
},
],
"ProductionDateTime": "2021-01-26T03:20:30.147637Z",
"ArchiveAndDistributionInformation":
[
{
"Name": "Not provided",
"Size": 3991.0614252090454,
"SizeUnit": "MB",
"Format": "Not provided",
},
],
},
"Platforms":
[
{
"ShortName": "SENTINEL-1B",
"Instruments": [{ "ShortName": "C-SAR" }],
},
],
},
"properties":
{
"beamModeType": "IW",
"browse": None,
"bytes": 4184931225,
"centerLat": 64.986,
"centerLon": -147.0897,
"faradayRotation": None,
"fileID": "S1B_IW_SLC__1SDV_20210126T032030_20210126T032057_025320_0303F3_7BE5-SLC",
"flightDirection": "ASCENDING",
"groupID": "S1B_IWDV_0209_0216_025320_094",
"granuleType": "SENTINEL_1B_FRAME",
"insarStackId": None,
"md5sum": "4e51b15bbe20cf3fdb1e099ebb4d2c36",
"offNadirAngle": None,
"orbit": 25320,
"pathNumber": 94,
"platform": "Sentinel-1B",
"pointingAngle": None,
"polarization": "VV+VH",
"processingDate": "2021-01-26T03:20:30.147637Z",
"processingLevel": "SLC",
"sceneName": "S1B_IW_SLC__1SDV_20210126T032030_20210126T032057_025320_0303F3_7BE5",
"sensor": "C-SAR",
"startTime": "2021-01-26T03:20:30.147637Z",
"stopTime": "2021-01-26T03:20:57.112425Z",
"url": "https://datapool.asf.alaska.edu/SLC/SB/S1B_IW_SLC__1SDV_20210126T032030_20210126T032057_025320_0303F3_7BE5.zip",
"pgeVersion": "003.31",
"fileName": "S1B_IW_SLC__1SDV_20210126T032030_20210126T032057_025320_0303F3_7BE5.zip",
"frameNumber": 210,
"temporalBaseline": 24,
"perpendicularBaseline": 40,
},
"baseline":
{
"stateVectors":
{
"positions":
{
"prePosition":
[
-2893709.242044,
-1235768.184608,
6327545.175914,
],
"prePositionTime": "2021-01-26T03:20:42.000000",
"postPosition":
[
-2845226.283146,
-1186512.31404,
6358815.223669,
],
"postPositionTime": "2021-01-26T03:20:52.000000",
},
"velocities":
{
"preVelocity":
[4828.59494, 4922.291175, 3162.750859],
"preVelocityTime": "2021-01-26T03:20:42.000000",
"postVelocity":
[4867.907754, 4928.781369, 3091.20021],
"postVelocityTime": "2021-01-26T03:20:52.000000",
},
},
"ascendingNodeTime": "2021-01-26T03:02:57.994247Z",
},
},
]
Discovery-asf_search-8.1.2/tests/yml_tests/Resources/Fairbanks_S1_stack_preprocessed.yml 0000664 0000000 0000000 00000132611 14777330235 0031674 0 ustar 00root root 0000000 0000000 [
{
"type": "Feature",
"geometry": {
"coordinates": [
[
[
-150.172562,
65.53125
],
[
-149.246063,
63.942123
],
[
-144.136368,
64.386414
],
[
-144.751495,
65.99025
],
[
-150.172562,
65.53125
]
]
],
"type": "Polygon"
},
"properties": {
"beamModeType": "IW",
"browse": null,
"bytes": 4193723581,
"centerLat": 64.9861,
"centerLon": -147.0909,
"faradayRotation": null,
"fileID": "S1B_IW_SLC__1SDV_20210102T032031_20210102T032058_024970_02F8C3_C081-SLC",
"flightDirection": null,
"groupID": "S1B_IWDV_0209_0216_024970_094",
"granuleType": "SENTINEL_1B_FRAME",
"insarStackId": null,
"md5sum": "6dd7f6a56ed98ba7037dfeb833217d5b",
"offNadirAngle": null,
"orbit": 24970,
"pathNumber": 94,
"platform": "Sentinel-1B",
"pointingAngle": null,
"polarization": "VV+VH",
"processingDate": "2021-01-02T03:20:31.000Z",
"processingLevel": "SLC",
"sceneName": "S1B_IW_SLC__1SDV_20210102T032031_20210102T032058_024970_02F8C3_C081",
"sensor": "C-SAR",
"startTime": "2021-01-02T03:20:31.000Z",
"stopTime": "2021-01-02T03:20:58.000Z",
"url": "https://datapool.asf.alaska.edu/SLC/SB/S1B_IW_SLC__1SDV_20210102T032031_20210102T032058_024970_02F8C3_C081.zip",
"fileName": "S1B_IW_SLC__1SDV_20210102T032031_20210102T032058_024970_02F8C3_C081.zip",
"frameNumber": 210
},
"baseline": {
"stateVectors": {
"positions": {
"prePosition": [
-2893767.065414,
-1235752.268405,
6327528.043215
],
"prePositionTime": "2021-01-02T03:20:43.000000",
"postPosition": [
-2845284.115433,
-1186496.621016,
6358798.348458
],
"postPositionTime": "2021-01-02T03:20:53.000000"
},
"velocities": {
"preVelocity": [
4828.593801,
4922.268943,
3162.776438
],
"preVelocityTime": "2021-01-02T03:20:43.000000",
"postVelocity": [
4867.907153,
4928.758938,
3091.226142
],
"postVelocityTime": "2021-01-02T03:20:53.000000"
}
},
"ascendingNodeTime": "2021-01-02T03:02:58.934857"
},
"umm": {
"TemporalExtent": {
"RangeDateTime": {
"BeginningDateTime": "2021-01-02T03:20:31.000Z",
"EndingDateTime": "2021-01-02T03:20:58.000Z"
}
},
"OrbitCalculatedSpatialDomains": [
{
"OrbitNumber": 24970
}
],
"GranuleUR": "S1B_IW_SLC__1SDV_20210102T032031_20210102T032058_024970_02F8C3_C081-SLC",
"AdditionalAttributes": [
{
"Name": "ACQUISITION_DATE",
"Values": [
"2021-01-02T03:20:58.000000"
]
},
{
"Name": "ASCENDING_DESCENDING",
"Values": [
"ASCENDING"
]
},
{
"Name": "ASC_NODE_TIME",
"Values": [
"2021-01-02T03:02:58.934857"
]
},
{
"Name": "ASF_PLATFORM",
"Values": [
"Sentinel-1B"
]
},
{
"Name": "BEAM_MODE",
"Values": [
"IW"
]
},
{
"Name": "BEAM_MODE_DESC",
"Values": [
"Interferometric Wide. 250 km swath, 5 m x 20 m spatial resolution and burst synchronization for interferometry. IW is considered to be the standard mode over land masses."
]
},
{
"Name": "BEAM_MODE_TYPE",
"Values": [
"IW"
]
},
{
"Name": "BYTES",
"Values": [
"4193723581"
]
},
{
"Name": "CENTER_ESA_FRAME",
"Values": [
"1300"
]
},
{
"Name": "CENTER_FRAME_ID",
"Values": [
"213"
]
},
{
"Name": "CENTER_LAT",
"Values": [
"64.9861"
]
},
{
"Name": "CENTER_LON",
"Values": [
"-147.0909"
]
},
{
"Name": "DOPPLER",
"Values": [
"0"
]
},
{
"Name": "FARADAY_ROTATION",
"Values": [
"NA"
]
},
{
"Name": "FAR_END_LAT",
"Values": [
"65.99025"
]
},
{
"Name": "FAR_END_LON",
"Values": [
"-144.751495"
]
},
{
"Name": "FAR_START_LAT",
"Values": [
"64.386414"
]
},
{
"Name": "FAR_START_LON",
"Values": [
"-144.136368"
]
},
{
"Name": "FRAME_NUMBER",
"Values": [
"210"
]
},
{
"Name": "GRANULE_TYPE",
"Values": [
"SENTINEL_1B_FRAME"
]
},
{
"Name": "GROUP_ID",
"Values": [
"S1B_IWDV_0209_0216_024970_094"
]
},
{
"Name": "LOOK_DIRECTION",
"Values": [
"R"
]
},
{
"Name": "MD5SUM",
"Values": [
"6dd7f6a56ed98ba7037dfeb833217d5b"
]
},
{
"Name": "MISSION_NAME",
"Values": [
"NA"
]
},
{
"Name": "NEAR_END_LAT",
"Values": [
"65.53125"
]
},
{
"Name": "NEAR_END_LON",
"Values": [
"-150.172562"
]
},
{
"Name": "NEAR_START_LAT",
"Values": [
"63.942123"
]
},
{
"Name": "NEAR_START_LON",
"Values": [
"-149.246063"
]
},
{
"Name": "PATH_NUMBER",
"Values": [
"94"
]
},
{
"Name": "POLARIZATION",
"Values": [
"VV+VH"
]
},
{
"Name": "PROCESSING_DATE",
"Values": [
"2021-01-02T12:40:19.324537"
]
},
{
"Name": "PROCESSING_DESCRIPTION",
"Values": [
"Sentinel-1B Single Look Complex product"
]
},
{
"Name": "PROCESSING_LEVEL",
"Values": [
"L1"
]
},
{
"Name": "PROCESSING_TYPE",
"Values": [
"SLC"
]
},
{
"Name": "PROCESSING_TYPE_DISPLAY",
"Values": [
"L1 Single Look Complex (SLC)"
]
},
{
"Name": "SV_POSITION_POST",
"Values": [
"-2845284.115433,-1186496.621016,6358798.348458,2021-01-02T03:20:53.000000"
]
},
{
"Name": "SV_POSITION_PRE",
"Values": [
"-2893767.065414,-1235752.268405,6327528.043215,2021-01-02T03:20:43.000000"
]
},
{
"Name": "SV_VELOCITY_POST",
"Values": [
"4867.907153,4928.758938,3091.226142,2021-01-02T03:20:53.000000"
]
},
{
"Name": "SV_VELOCITY_PRE",
"Values": [
"4828.593801,4922.268943,3162.776438,2021-01-02T03:20:43.000000"
]
}
],
"SpatialExtent": {
"HorizontalSpatialDomain": {
"Geometry": {
"GPolygons": [
{
"Boundary": {
"Points": [
{
"Longitude": -150.172562,
"Latitude": 65.53125
},
{
"Longitude": -149.246063,
"Latitude": 63.942123
},
{
"Longitude": -144.136368,
"Latitude": 64.386414
},
{
"Longitude": -144.751495,
"Latitude": 65.99025
},
{
"Longitude": -150.172562,
"Latitude": 65.53125
}
]
}
}
]
}
}
},
"ProviderDates": [
{
"Date": "2021-01-02T14:36:15.000Z",
"Type": "Insert"
},
{
"Date": "2021-01-02T14:36:15.000Z",
"Type": "Update"
}
],
"CollectionReference": {
"ShortName": "SENTINEL-1B_SLC",
"Version": "1"
},
"RelatedUrls": [
{
"Format": "Not provided",
"Description": "This link provides direct download access to the granule.",
"Type": "GET DATA",
"URL": "https://datapool.asf.alaska.edu/SLC/SB/S1B_IW_SLC__1SDV_20210102T032031_20210102T032058_024970_02F8C3_C081.zip"
},
{
"Format": "Not provided",
"Description": "ASF DAAC Sentinel-1 data set landing page",
"Type": "VIEW RELATED INFORMATION",
"URL": "www.asf.alaska.edu/sar-data-sets/sentinel-1"
},
{
"Format": "Not provided",
"Description": "ASF DAAC Sentinel-1 User Guide and Technical Documentation",
"Type": "VIEW RELATED INFORMATION",
"URL": "www.asf.alaska.edu/sar-information/sentinel-1-documents-tools"
}
],
"DataGranule": {
"DayNightFlag": "Unspecified",
"Identifiers": [
{
"Identifier": "S1B_IW_SLC__1SDV_20210102T032031_20210102T032058_024970_02F8C3_C081",
"IdentifierType": "ProducerGranuleId"
}
],
"ProductionDateTime": "2021-01-02T03:20:31.000Z",
"ArchiveAndDistributionInformation": [
{
"Name": "Not provided",
"Size": 3999.446469306946,
"SizeUnit": "MB",
"Format": "Not provided"
}
]
},
"Platforms": [
{
"ShortName": "SENTINEL-1B",
"Instruments": [
{
"ShortName": "C-SAR"
}
]
}
]
},
"meta": {
"concept-type": "granule",
"concept-id": "G1989758351-ASF",
"revision-id": 9,
"native-id": "S1B_IW_SLC__1SDV_20210102T032031_20210102T032058_024970_02F8C3_C081-SLC",
"provider-id": "ASF",
"format": "application/echo10+xml",
"revision-date": "2021-01-02T14:36:16.174Z"
}
},
{
"type": "Feature",
"geometry": {
"coordinates": [
[
[
-150.171432,
65.531036
],
[
-149.24501,
63.941902
],
[
-144.135376,
64.386147
],
[
-144.750443,
65.98999
],
[
-150.171432,
65.531036
]
]
],
"type": "Polygon"
},
"properties": {
"beamModeType": "IW",
"browse": null,
"bytes": 4190691686,
"centerLat": 64.9858,
"centerLon": -147.0898,
"faradayRotation": null,
"fileID": "S1B_IW_SLC__1SDV_20210114T032030_20210114T032057_025145_02FE61_454A-SLC",
"flightDirection": null,
"groupID": "S1B_IWDV_0209_0216_025145_094",
"granuleType": "SENTINEL_1B_FRAME",
"insarStackId": null,
"md5sum": "2a76325db9d931414189689082163ee5",
"offNadirAngle": null,
"orbit": 25145,
"pathNumber": 94,
"platform": "Sentinel-1B",
"pointingAngle": null,
"polarization": "VV+VH",
"processingDate": "2021-01-14T03:20:30.000Z",
"processingLevel": "SLC",
"sceneName": "S1B_IW_SLC__1SDV_20210114T032030_20210114T032057_025145_02FE61_454A",
"sensor": "C-SAR",
"startTime": "2021-01-14T03:20:30.000Z",
"stopTime": "2021-01-14T03:20:57.000Z",
"url": "https://datapool.asf.alaska.edu/SLC/SB/S1B_IW_SLC__1SDV_20210114T032030_20210114T032057_025145_02FE61_454A.zip",
"fileName": "S1B_IW_SLC__1SDV_20210114T032030_20210114T032057_025145_02FE61_454A.zip",
"frameNumber": 210
},
"baseline": {
"stateVectors": {
"positions": {
"prePosition": [
-2893725.563321,
-1235772.278165,
6327540.675522
],
"prePositionTime": "2021-01-14T03:20:42.000000",
"postPosition": [
-2845242.394622,
-1186516.792913,
6358810.884979
],
"postPositionTime": "2021-01-14T03:20:52.000000"
},
"velocities": {
"preVelocity": [
4828.615891,
4922.252624,
3162.766956
],
"preVelocityTime": "2021-01-14T03:20:42.000000",
"postVelocity": [
4867.928786,
4928.742823,
3091.216457
],
"postVelocityTime": "2021-01-14T03:20:52.000000"
}
},
"ascendingNodeTime": "2021-01-14T03:02:58.414522"
},
"umm": {
"TemporalExtent": {
"RangeDateTime": {
"BeginningDateTime": "2021-01-14T03:20:30.000Z",
"EndingDateTime": "2021-01-14T03:20:57.000Z"
}
},
"OrbitCalculatedSpatialDomains": [
{
"OrbitNumber": 25145
}
],
"GranuleUR": "S1B_IW_SLC__1SDV_20210114T032030_20210114T032057_025145_02FE61_454A-SLC",
"AdditionalAttributes": [
{
"Name": "ACQUISITION_DATE",
"Values": [
"2021-01-14T03:20:57.000000"
]
},
{
"Name": "ASCENDING_DESCENDING",
"Values": [
"ASCENDING"
]
},
{
"Name": "ASC_NODE_TIME",
"Values": [
"2021-01-14T03:02:58.414522"
]
},
{
"Name": "ASF_PLATFORM",
"Values": [
"Sentinel-1B"
]
},
{
"Name": "BEAM_MODE",
"Values": [
"IW"
]
},
{
"Name": "BEAM_MODE_DESC",
"Values": [
"Interferometric Wide. 250 km swath, 5 m x 20 m spatial resolution and burst synchronization for interferometry. IW is considered to be the standard mode over land masses."
]
},
{
"Name": "BEAM_MODE_TYPE",
"Values": [
"IW"
]
},
{
"Name": "BYTES",
"Values": [
"4190691686"
]
},
{
"Name": "CENTER_ESA_FRAME",
"Values": [
"1300"
]
},
{
"Name": "CENTER_FRAME_ID",
"Values": [
"213"
]
},
{
"Name": "CENTER_LAT",
"Values": [
"64.9858"
]
},
{
"Name": "CENTER_LON",
"Values": [
"-147.0898"
]
},
{
"Name": "DOPPLER",
"Values": [
"0"
]
},
{
"Name": "FARADAY_ROTATION",
"Values": [
"NA"
]
},
{
"Name": "FAR_END_LAT",
"Values": [
"65.98999"
]
},
{
"Name": "FAR_END_LON",
"Values": [
"-144.750443"
]
},
{
"Name": "FAR_START_LAT",
"Values": [
"64.386147"
]
},
{
"Name": "FAR_START_LON",
"Values": [
"-144.135376"
]
},
{
"Name": "FRAME_NUMBER",
"Values": [
"210"
]
},
{
"Name": "GRANULE_TYPE",
"Values": [
"SENTINEL_1B_FRAME"
]
},
{
"Name": "GROUP_ID",
"Values": [
"S1B_IWDV_0209_0216_025145_094"
]
},
{
"Name": "LOOK_DIRECTION",
"Values": [
"R"
]
},
{
"Name": "MD5SUM",
"Values": [
"2a76325db9d931414189689082163ee5"
]
},
{
"Name": "MISSION_NAME",
"Values": [
"NA"
]
},
{
"Name": "NEAR_END_LAT",
"Values": [
"65.531036"
]
},
{
"Name": "NEAR_END_LON",
"Values": [
"-150.171432"
]
},
{
"Name": "NEAR_START_LAT",
"Values": [
"63.941902"
]
},
{
"Name": "NEAR_START_LON",
"Values": [
"-149.24501"
]
},
{
"Name": "PATH_NUMBER",
"Values": [
"94"
]
},
{
"Name": "POLARIZATION",
"Values": [
"VV+VH"
]
},
{
"Name": "PROCESSING_DATE",
"Values": [
"2021-01-15T05:50:38.350917"
]
},
{
"Name": "PROCESSING_DESCRIPTION",
"Values": [
"Sentinel-1B Single Look Complex product"
]
},
{
"Name": "PROCESSING_LEVEL",
"Values": [
"L1"
]
},
{
"Name": "PROCESSING_TYPE",
"Values": [
"SLC"
]
},
{
"Name": "PROCESSING_TYPE_DISPLAY",
"Values": [
"L1 Single Look Complex (SLC)"
]
},
{
"Name": "SV_POSITION_POST",
"Values": [
"-2845242.394622,-1186516.792913,6358810.884979,2021-01-14T03:20:52.000000"
]
},
{
"Name": "SV_POSITION_PRE",
"Values": [
"-2893725.563321,-1235772.278165,6327540.675522,2021-01-14T03:20:42.000000"
]
},
{
"Name": "SV_VELOCITY_POST",
"Values": [
"4867.928786,4928.742823,3091.216457,2021-01-14T03:20:52.000000"
]
},
{
"Name": "SV_VELOCITY_PRE",
"Values": [
"4828.615891,4922.252624,3162.766956,2021-01-14T03:20:42.000000"
]
}
],
"SpatialExtent": {
"HorizontalSpatialDomain": {
"Geometry": {
"GPolygons": [
{
"Boundary": {
"Points": [
{
"Longitude": -150.171432,
"Latitude": 65.531036
},
{
"Longitude": -149.24501,
"Latitude": 63.941902
},
{
"Longitude": -144.135376,
"Latitude": 64.386147
},
{
"Longitude": -144.750443,
"Latitude": 65.98999
},
{
"Longitude": -150.171432,
"Latitude": 65.531036
}
]
}
}
]
}
}
},
"ProviderDates": [
{
"Date": "2021-01-15T05:50:42.000Z",
"Type": "Insert"
},
{
"Date": "2021-01-15T05:50:42.000Z",
"Type": "Update"
}
],
"CollectionReference": {
"ShortName": "SENTINEL-1B_SLC",
"Version": "1"
},
"RelatedUrls": [
{
"Format": "Not provided",
"Description": "This link provides direct download access to the granule.",
"Type": "GET DATA",
"URL": "https://datapool.asf.alaska.edu/SLC/SB/S1B_IW_SLC__1SDV_20210114T032030_20210114T032057_025145_02FE61_454A.zip"
},
{
"Format": "Not provided",
"Description": "ASF DAAC Sentinel-1 data set landing page",
"Type": "VIEW RELATED INFORMATION",
"URL": "www.asf.alaska.edu/sar-data-sets/sentinel-1"
},
{
"Format": "Not provided",
"Description": "ASF DAAC Sentinel-1 User Guide and Technical Documentation",
"Type": "VIEW RELATED INFORMATION",
"URL": "www.asf.alaska.edu/sar-information/sentinel-1-documents-tools"
}
],
"DataGranule": {
"DayNightFlag": "Unspecified",
"Identifiers": [
{
"Identifier": "S1B_IW_SLC__1SDV_20210114T032030_20210114T032057_025145_02FE61_454A",
"IdentifierType": "ProducerGranuleId"
}
],
"ProductionDateTime": "2021-01-14T03:20:30.000Z",
"ArchiveAndDistributionInformation": [
{
"Name": "Not provided",
"Size": 3996.5550289154053,
"SizeUnit": "MB",
"Format": "Not provided"
}
]
},
"Platforms": [
{
"ShortName": "SENTINEL-1B",
"Instruments": [
{
"ShortName": "C-SAR"
}
]
}
]
},
"meta": {
"concept-type": "granule",
"concept-id": "G1993900886-ASF",
"revision-id": 1,
"native-id": "S1B_IW_SLC__1SDV_20210114T032030_20210114T032057_025145_02FE61_454A-SLC",
"provider-id": "ASF",
"format": "application/echo10+xml",
"revision-date": "2021-01-15T05:50:42.934Z"
}
},
{
"type": "Feature",
"geometry": {
"coordinates": [
[
[
-150.171249,
65.531166
],
[
-149.244873,
63.942146
],
[
-144.135239,
64.386398
],
[
-144.750275,
65.99012
],
[
-150.171249,
65.531166
]
]
],
"type": "Polygon"
},
"properties": {
"beamModeType": "IW",
"browse": null,
"bytes": 4184931225,
"centerLat": 64.986,
"centerLon": -147.0897,
"faradayRotation": null,
"fileID": "S1B_IW_SLC__1SDV_20210126T032030_20210126T032057_025320_0303F3_7BE5-SLC",
"flightDirection": null,
"groupID": "S1B_IWDV_0209_0216_025320_094",
"granuleType": "SENTINEL_1B_FRAME",
"insarStackId": null,
"md5sum": "4e51b15bbe20cf3fdb1e099ebb4d2c36",
"offNadirAngle": null,
"orbit": 25320,
"pathNumber": 94,
"platform": "Sentinel-1B",
"pointingAngle": null,
"polarization": "VV+VH",
"processingDate": "2021-01-26T03:20:30.000Z",
"processingLevel": "SLC",
"sceneName": "S1B_IW_SLC__1SDV_20210126T032030_20210126T032057_025320_0303F3_7BE5",
"sensor": "C-SAR",
"startTime": "2021-01-26T03:20:30.000Z",
"stopTime": "2021-01-26T03:20:57.000Z",
"url": "https://datapool.asf.alaska.edu/SLC/SB/S1B_IW_SLC__1SDV_20210126T032030_20210126T032057_025320_0303F3_7BE5.zip",
"fileName": "S1B_IW_SLC__1SDV_20210126T032030_20210126T032057_025320_0303F3_7BE5.zip",
"frameNumber": 210
},
"baseline": {
"stateVectors": {
"positions": {
"prePosition": [
-2893709.242044,
-1235768.184608,
6327545.175914
],
"prePositionTime": "2021-01-26T03:20:42.000000",
"postPosition": [
-2845226.283146,
-1186512.31404,
6358815.223669
],
"postPositionTime": "2021-01-26T03:20:52.000000"
},
"velocities": {
"preVelocity": [
4828.59494,
4922.291175,
3162.750859
],
"preVelocityTime": "2021-01-26T03:20:42.000000",
"postVelocity": [
4867.907754,
4928.781369,
3091.20021
],
"postVelocityTime": "2021-01-26T03:20:52.000000"
}
},
"ascendingNodeTime": "2021-01-26T03:02:57.994247"
},
"umm": {
"TemporalExtent": {
"RangeDateTime": {
"BeginningDateTime": "2021-01-26T03:20:30.000Z",
"EndingDateTime": "2021-01-26T03:20:57.000Z"
}
},
"OrbitCalculatedSpatialDomains": [
{
"OrbitNumber": 25320
}
],
"GranuleUR": "S1B_IW_SLC__1SDV_20210126T032030_20210126T032057_025320_0303F3_7BE5-SLC",
"AdditionalAttributes": [
{
"Name": "ACQUISITION_DATE",
"Values": [
"2021-01-26T03:20:57.000000"
]
},
{
"Name": "ASCENDING_DESCENDING",
"Values": [
"ASCENDING"
]
},
{
"Name": "ASC_NODE_TIME",
"Values": [
"2021-01-26T03:02:57.994247"
]
},
{
"Name": "ASF_PLATFORM",
"Values": [
"Sentinel-1B"
]
},
{
"Name": "BEAM_MODE",
"Values": [
"IW"
]
},
{
"Name": "BEAM_MODE_DESC",
"Values": [
"Interferometric Wide. 250 km swath, 5 m x 20 m spatial resolution and burst synchronization for interferometry. IW is considered to be the standard mode over land masses."
]
},
{
"Name": "BEAM_MODE_TYPE",
"Values": [
"IW"
]
},
{
"Name": "BYTES",
"Values": [
"4184931225"
]
},
{
"Name": "CENTER_ESA_FRAME",
"Values": [
"1300"
]
},
{
"Name": "CENTER_FRAME_ID",
"Values": [
"213"
]
},
{
"Name": "CENTER_LAT",
"Values": [
"64.986"
]
},
{
"Name": "CENTER_LON",
"Values": [
"-147.0897"
]
},
{
"Name": "DOPPLER",
"Values": [
"0"
]
},
{
"Name": "FARADAY_ROTATION",
"Values": [
"NA"
]
},
{
"Name": "FAR_END_LAT",
"Values": [
"65.99012"
]
},
{
"Name": "FAR_END_LON",
"Values": [
"-144.750275"
]
},
{
"Name": "FAR_START_LAT",
"Values": [
"64.386398"
]
},
{
"Name": "FAR_START_LON",
"Values": [
"-144.135239"
]
},
{
"Name": "FRAME_NUMBER",
"Values": [
"210"
]
},
{
"Name": "GRANULE_TYPE",
"Values": [
"SENTINEL_1B_FRAME"
]
},
{
"Name": "GROUP_ID",
"Values": [
"S1B_IWDV_0209_0216_025320_094"
]
},
{
"Name": "LOOK_DIRECTION",
"Values": [
"R"
]
},
{
"Name": "MD5SUM",
"Values": [
"4e51b15bbe20cf3fdb1e099ebb4d2c36"
]
},
{
"Name": "MISSION_NAME",
"Values": [
"NA"
]
},
{
"Name": "NEAR_END_LAT",
"Values": [
"65.531166"
]
},
{
"Name": "NEAR_END_LON",
"Values": [
"-150.171249"
]
},
{
"Name": "NEAR_START_LAT",
"Values": [
"63.942146"
]
},
{
"Name": "NEAR_START_LON",
"Values": [
"-149.244873"
]
},
{
"Name": "PATH_NUMBER",
"Values": [
"94"
]
},
{
"Name": "POLARIZATION",
"Values": [
"VV+VH"
]
},
{
"Name": "PROCESSING_DATE",
"Values": [
"2021-01-26T23:06:47.704061"
]
},
{
"Name": "PROCESSING_DESCRIPTION",
"Values": [
"Sentinel-1B Single Look Complex product"
]
},
{
"Name": "PROCESSING_LEVEL",
"Values": [
"L1"
]
},
{
"Name": "PROCESSING_TYPE",
"Values": [
"SLC"
]
},
{
"Name": "PROCESSING_TYPE_DISPLAY",
"Values": [
"L1 Single Look Complex (SLC)"
]
},
{
"Name": "SV_POSITION_POST",
"Values": [
"-2845226.283146,-1186512.31404,6358815.223669,2021-01-26T03:20:52.000000"
]
},
{
"Name": "SV_POSITION_PRE",
"Values": [
"-2893709.242044,-1235768.184608,6327545.175914,2021-01-26T03:20:42.000000"
]
},
{
"Name": "SV_VELOCITY_POST",
"Values": [
"4867.907754,4928.781369,3091.20021,2021-01-26T03:20:52.000000"
]
},
{
"Name": "SV_VELOCITY_PRE",
"Values": [
"4828.59494,4922.291175,3162.750859,2021-01-26T03:20:42.000000"
]
}
],
"SpatialExtent": {
"HorizontalSpatialDomain": {
"Geometry": {
"GPolygons": [
{
"Boundary": {
"Points": [
{
"Longitude": -150.171249,
"Latitude": 65.531166
},
{
"Longitude": -149.244873,
"Latitude": 63.942146
},
{
"Longitude": -144.135239,
"Latitude": 64.386398
},
{
"Longitude": -144.750275,
"Latitude": 65.99012
},
{
"Longitude": -150.171249,
"Latitude": 65.531166
}
]
}
}
]
}
}
},
"ProviderDates": [
{
"Date": "2021-01-26T23:07:05.000Z",
"Type": "Insert"
},
{
"Date": "2021-01-26T23:07:05.000Z",
"Type": "Update"
}
],
"CollectionReference": {
"ShortName": "SENTINEL-1B_SLC",
"Version": "1"
},
"RelatedUrls": [
{
"Format": "Not provided",
"Description": "This link provides direct download access to the granule.",
"Type": "GET DATA",
"URL": "https://datapool.asf.alaska.edu/SLC/SB/S1B_IW_SLC__1SDV_20210126T032030_20210126T032057_025320_0303F3_7BE5.zip"
},
{
"Format": "Not provided",
"Description": "ASF DAAC Sentinel-1 data set landing page",
"Type": "VIEW RELATED INFORMATION",
"URL": "www.asf.alaska.edu/sar-data-sets/sentinel-1"
},
{
"Format": "Not provided",
"Description": "ASF DAAC Sentinel-1 User Guide and Technical Documentation",
"Type": "VIEW RELATED INFORMATION",
"URL": "www.asf.alaska.edu/sar-information/sentinel-1-documents-tools"
}
],
"DataGranule": {
"DayNightFlag": "Unspecified",
"Identifiers": [
{
"Identifier": "S1B_IW_SLC__1SDV_20210126T032030_20210126T032057_025320_0303F3_7BE5",
"IdentifierType": "ProducerGranuleId"
}
],
"ProductionDateTime": "2021-01-26T03:20:30.000Z",
"ArchiveAndDistributionInformation": [
{
"Name": "Not provided",
"Size": 3991.0614252090454,
"SizeUnit": "MB",
"Format": "Not provided"
}
]
},
"Platforms": [
{
"ShortName": "SENTINEL-1B",
"Instruments": [
{
"ShortName": "C-SAR"
}
]
}
]
},
"meta": {
"concept-type": "granule",
"concept-id": "G1996951408-ASF",
"revision-id": 1,
"native-id": "S1B_IW_SLC__1SDV_20210126T032030_20210126T032057_025320_0303F3_7BE5-SLC",
"provider-id": "ASF",
"format": "application/echo10+xml",
"revision-date": "2021-01-26T23:07:05.405Z"
}
}
] Discovery-asf_search-8.1.2/tests/yml_tests/Resources/Fairbanks_S1_stack_preprocessed_incomplete.yml 0000664 0000000 0000000 00000126644 14777330235 0034124 0 ustar 00root root 0000000 0000000 [
{
"type": "Feature",
"geometry": {
"coordinates": [
[
[
-150.172562,
65.53125
],
[
-149.246063,
63.942123
],
[
-144.136368,
64.386414
],
[
-144.751495,
65.99025
],
[
-150.172562,
65.53125
]
]
],
"type": "Polygon"
},
"properties": {
"beamModeType": "IW",
"browse": null,
"bytes": 4193723581,
"centerLat": 64.9861,
"centerLon": -147.0909,
"faradayRotation": null,
"fileID": "S1B_IW_SLC__1SDV_20210102T032031_20210102T032058_024970_02F8C3_C081-SLC",
"flightDirection": null,
"groupID": "S1B_IWDV_0209_0216_024970_094",
"granuleType": "SENTINEL_1B_FRAME",
"insarStackId": null,
"md5sum": "6dd7f6a56ed98ba7037dfeb833217d5b",
"offNadirAngle": null,
"orbit": 24970,
"pathNumber": 94,
"platform": "Sentinel-1B",
"pointingAngle": null,
"polarization": "VV+VH",
"processingDate": "2021-01-02T03:20:31.000Z",
"processingLevel": "SLC",
"sceneName": "S1B_IW_SLC__1SDV_20210102T032031_20210102T032058_024970_02F8C3_C081",
"sensor": "C-SAR",
"startTime": "2021-01-02T03:20:31.000Z",
"stopTime": "2021-01-02T03:20:58.000Z",
"url": "https://datapool.asf.alaska.edu/SLC/SB/S1B_IW_SLC__1SDV_20210102T032031_20210102T032058_024970_02F8C3_C081.zip",
"fileName": "S1B_IW_SLC__1SDV_20210102T032031_20210102T032058_024970_02F8C3_C081.zip",
"frameNumber": 210
},
"baseline": {
"stateVectors": {
"positions": {},
"velocities": {}
},
"ascendingNodeTime": "2021-01-02T03:02:58.934857"
},
"umm": {
"TemporalExtent": {
"RangeDateTime": {
"BeginningDateTime": "2021-01-02T03:20:31.000Z",
"EndingDateTime": "2021-01-02T03:20:58.000Z"
}
},
"OrbitCalculatedSpatialDomains": [
{
"OrbitNumber": 24970
}
],
"GranuleUR": "S1B_IW_SLC__1SDV_20210102T032031_20210102T032058_024970_02F8C3_C081-SLC",
"AdditionalAttributes": [
{
"Name": "ACQUISITION_DATE",
"Values": [
"2021-01-02T03:20:58.000000"
]
},
{
"Name": "ASCENDING_DESCENDING",
"Values": [
"ASCENDING"
]
},
{
"Name": "ASC_NODE_TIME",
"Values": [
"2021-01-02T03:02:58.934857"
]
},
{
"Name": "ASF_PLATFORM",
"Values": [
"Sentinel-1B"
]
},
{
"Name": "BEAM_MODE",
"Values": [
"IW"
]
},
{
"Name": "BEAM_MODE_DESC",
"Values": [
"Interferometric Wide. 250 km swath, 5 m x 20 m spatial resolution and burst synchronization for interferometry. IW is considered to be the standard mode over land masses."
]
},
{
"Name": "BEAM_MODE_TYPE",
"Values": [
"IW"
]
},
{
"Name": "BYTES",
"Values": [
"4193723581"
]
},
{
"Name": "CENTER_ESA_FRAME",
"Values": [
"1300"
]
},
{
"Name": "CENTER_FRAME_ID",
"Values": [
"213"
]
},
{
"Name": "CENTER_LAT",
"Values": [
"64.9861"
]
},
{
"Name": "CENTER_LON",
"Values": [
"-147.0909"
]
},
{
"Name": "DOPPLER",
"Values": [
"0"
]
},
{
"Name": "FARADAY_ROTATION",
"Values": [
"NA"
]
},
{
"Name": "FAR_END_LAT",
"Values": [
"65.99025"
]
},
{
"Name": "FAR_END_LON",
"Values": [
"-144.751495"
]
},
{
"Name": "FAR_START_LAT",
"Values": [
"64.386414"
]
},
{
"Name": "FAR_START_LON",
"Values": [
"-144.136368"
]
},
{
"Name": "FRAME_NUMBER",
"Values": [
"210"
]
},
{
"Name": "GRANULE_TYPE",
"Values": [
"SENTINEL_1B_FRAME"
]
},
{
"Name": "GROUP_ID",
"Values": [
"S1B_IWDV_0209_0216_024970_094"
]
},
{
"Name": "LOOK_DIRECTION",
"Values": [
"R"
]
},
{
"Name": "MD5SUM",
"Values": [
"6dd7f6a56ed98ba7037dfeb833217d5b"
]
},
{
"Name": "MISSION_NAME",
"Values": [
"NA"
]
},
{
"Name": "NEAR_END_LAT",
"Values": [
"65.53125"
]
},
{
"Name": "NEAR_END_LON",
"Values": [
"-150.172562"
]
},
{
"Name": "NEAR_START_LAT",
"Values": [
"63.942123"
]
},
{
"Name": "NEAR_START_LON",
"Values": [
"-149.246063"
]
},
{
"Name": "PATH_NUMBER",
"Values": [
"94"
]
},
{
"Name": "POLARIZATION",
"Values": [
"VV+VH"
]
},
{
"Name": "PROCESSING_DATE",
"Values": [
"2021-01-02T12:40:19.324537"
]
},
{
"Name": "PROCESSING_DESCRIPTION",
"Values": [
"Sentinel-1B Single Look Complex product"
]
},
{
"Name": "PROCESSING_LEVEL",
"Values": [
"L1"
]
},
{
"Name": "PROCESSING_TYPE",
"Values": [
"SLC"
]
},
{
"Name": "PROCESSING_TYPE_DISPLAY",
"Values": [
"L1 Single Look Complex (SLC)"
]
},
{
"Name": "SV_POSITION_POST",
"Values": [
"-2845284.115433,-1186496.621016,6358798.348458,2021-01-02T03:20:53.000000"
]
},
{
"Name": "SV_POSITION_PRE",
"Values": [
"-2893767.065414,-1235752.268405,6327528.043215,2021-01-02T03:20:43.000000"
]
},
{
"Name": "SV_VELOCITY_POST",
"Values": [
"4867.907153,4928.758938,3091.226142,2021-01-02T03:20:53.000000"
]
},
{
"Name": "SV_VELOCITY_PRE",
"Values": [
"4828.593801,4922.268943,3162.776438,2021-01-02T03:20:43.000000"
]
}
],
"SpatialExtent": {
"HorizontalSpatialDomain": {
"Geometry": {
"GPolygons": [
{
"Boundary": {
"Points": [
{
"Longitude": -150.172562,
"Latitude": 65.53125
},
{
"Longitude": -149.246063,
"Latitude": 63.942123
},
{
"Longitude": -144.136368,
"Latitude": 64.386414
},
{
"Longitude": -144.751495,
"Latitude": 65.99025
},
{
"Longitude": -150.172562,
"Latitude": 65.53125
}
]
}
}
]
}
}
},
"ProviderDates": [
{
"Date": "2021-01-02T14:36:15.000Z",
"Type": "Insert"
},
{
"Date": "2021-01-02T14:36:15.000Z",
"Type": "Update"
}
],
"CollectionReference": {
"ShortName": "SENTINEL-1B_SLC",
"Version": "1"
},
"RelatedUrls": [
{
"Format": "Not provided",
"Description": "This link provides direct download access to the granule.",
"Type": "GET DATA",
"URL": "https://datapool.asf.alaska.edu/SLC/SB/S1B_IW_SLC__1SDV_20210102T032031_20210102T032058_024970_02F8C3_C081.zip"
},
{
"Format": "Not provided",
"Description": "ASF DAAC Sentinel-1 data set landing page",
"Type": "VIEW RELATED INFORMATION",
"URL": "www.asf.alaska.edu/sar-data-sets/sentinel-1"
},
{
"Format": "Not provided",
"Description": "ASF DAAC Sentinel-1 User Guide and Technical Documentation",
"Type": "VIEW RELATED INFORMATION",
"URL": "www.asf.alaska.edu/sar-information/sentinel-1-documents-tools"
}
],
"DataGranule": {
"DayNightFlag": "Unspecified",
"Identifiers": [
{
"Identifier": "S1B_IW_SLC__1SDV_20210102T032031_20210102T032058_024970_02F8C3_C081",
"IdentifierType": "ProducerGranuleId"
}
],
"ProductionDateTime": "2021-01-02T03:20:31.000Z",
"ArchiveAndDistributionInformation": [
{
"Name": "Not provided",
"Size": 3999.446469306946,
"SizeUnit": "MB",
"Format": "Not provided"
}
]
},
"Platforms": [
{
"ShortName": "SENTINEL-1B",
"Instruments": [
{
"ShortName": "C-SAR"
}
]
}
]
},
"meta": {
"concept-type": "granule",
"concept-id": "G1989758351-ASF",
"revision-id": 9,
"native-id": "S1B_IW_SLC__1SDV_20210102T032031_20210102T032058_024970_02F8C3_C081-SLC",
"provider-id": "ASF",
"format": "application/echo10+xml",
"revision-date": "2021-01-02T14:36:16.174Z"
}
},
{
"type": "Feature",
"geometry": {
"coordinates": [
[
[
-150.171432,
65.531036
],
[
-149.24501,
63.941902
],
[
-144.135376,
64.386147
],
[
-144.750443,
65.98999
],
[
-150.171432,
65.531036
]
]
],
"type": "Polygon"
},
"properties": {
"beamModeType": "IW",
"browse": null,
"bytes": 4190691686,
"centerLat": 64.9858,
"centerLon": -147.0898,
"faradayRotation": null,
"fileID": "S1B_IW_SLC__1SDV_20210114T032030_20210114T032057_025145_02FE61_454A-SLC",
"flightDirection": null,
"groupID": "S1B_IWDV_0209_0216_025145_094",
"granuleType": "SENTINEL_1B_FRAME",
"insarStackId": null,
"md5sum": "2a76325db9d931414189689082163ee5",
"offNadirAngle": null,
"orbit": 25145,
"pathNumber": 94,
"platform": "Sentinel-1B",
"pointingAngle": null,
"polarization": "VV+VH",
"processingDate": "2021-01-14T03:20:30.000Z",
"processingLevel": "SLC",
"sceneName": "S1B_IW_SLC__1SDV_20210114T032030_20210114T032057_025145_02FE61_454A",
"sensor": "C-SAR",
"startTime": "2021-01-14T03:20:30.000Z",
"stopTime": "2021-01-14T03:20:57.000Z",
"url": "https://datapool.asf.alaska.edu/SLC/SB/S1B_IW_SLC__1SDV_20210114T032030_20210114T032057_025145_02FE61_454A.zip",
"fileName": "S1B_IW_SLC__1SDV_20210114T032030_20210114T032057_025145_02FE61_454A.zip",
"frameNumber": 210
},
"baseline": {
"stateVectors": {
"positions": {
"prePosition": [
-2893725.563321,
-1235772.278165,
6327540.675522
],
"prePositionTime": "2021-01-14T03:20:42.000000",
"postPosition": [
-2845242.394622,
-1186516.792913,
6358810.884979
],
"postPositionTime": "2021-01-14T03:20:52.000000"
},
"velocities": {
"preVelocity": [
4828.615891,
4922.252624,
3162.766956
],
"preVelocityTime": "2021-01-14T03:20:42.000000",
"postVelocity": [
4867.928786,
4928.742823,
3091.216457
],
"postVelocityTime": "2021-01-14T03:20:52.000000"
}
},
"ascendingNodeTime": "2021-01-14T03:02:58.414522"
},
"umm": {
"TemporalExtent": {
"RangeDateTime": {
"BeginningDateTime": "2021-01-14T03:20:30.000Z",
"EndingDateTime": "2021-01-14T03:20:57.000Z"
}
},
"OrbitCalculatedSpatialDomains": [
{
"OrbitNumber": 25145
}
],
"GranuleUR": "S1B_IW_SLC__1SDV_20210114T032030_20210114T032057_025145_02FE61_454A-SLC",
"AdditionalAttributes": [
{
"Name": "ACQUISITION_DATE",
"Values": [
"2021-01-14T03:20:57.000000"
]
},
{
"Name": "ASCENDING_DESCENDING",
"Values": [
"ASCENDING"
]
},
{
"Name": "ASC_NODE_TIME",
"Values": [
"2021-01-14T03:02:58.414522"
]
},
{
"Name": "ASF_PLATFORM",
"Values": [
"Sentinel-1B"
]
},
{
"Name": "BEAM_MODE",
"Values": [
"IW"
]
},
{
"Name": "BEAM_MODE_DESC",
"Values": [
"Interferometric Wide. 250 km swath, 5 m x 20 m spatial resolution and burst synchronization for interferometry. IW is considered to be the standard mode over land masses."
]
},
{
"Name": "BEAM_MODE_TYPE",
"Values": [
"IW"
]
},
{
"Name": "BYTES",
"Values": [
"4190691686"
]
},
{
"Name": "CENTER_ESA_FRAME",
"Values": [
"1300"
]
},
{
"Name": "CENTER_FRAME_ID",
"Values": [
"213"
]
},
{
"Name": "CENTER_LAT",
"Values": [
"64.9858"
]
},
{
"Name": "CENTER_LON",
"Values": [
"-147.0898"
]
},
{
"Name": "DOPPLER",
"Values": [
"0"
]
},
{
"Name": "FARADAY_ROTATION",
"Values": [
"NA"
]
},
{
"Name": "FAR_END_LAT",
"Values": [
"65.98999"
]
},
{
"Name": "FAR_END_LON",
"Values": [
"-144.750443"
]
},
{
"Name": "FAR_START_LAT",
"Values": [
"64.386147"
]
},
{
"Name": "FAR_START_LON",
"Values": [
"-144.135376"
]
},
{
"Name": "FRAME_NUMBER",
"Values": [
"210"
]
},
{
"Name": "GRANULE_TYPE",
"Values": [
"SENTINEL_1B_FRAME"
]
},
{
"Name": "GROUP_ID",
"Values": [
"S1B_IWDV_0209_0216_025145_094"
]
},
{
"Name": "LOOK_DIRECTION",
"Values": [
"R"
]
},
{
"Name": "MD5SUM",
"Values": [
"2a76325db9d931414189689082163ee5"
]
},
{
"Name": "MISSION_NAME",
"Values": [
"NA"
]
},
{
"Name": "NEAR_END_LAT",
"Values": [
"65.531036"
]
},
{
"Name": "NEAR_END_LON",
"Values": [
"-150.171432"
]
},
{
"Name": "NEAR_START_LAT",
"Values": [
"63.941902"
]
},
{
"Name": "NEAR_START_LON",
"Values": [
"-149.24501"
]
},
{
"Name": "PATH_NUMBER",
"Values": [
"94"
]
},
{
"Name": "POLARIZATION",
"Values": [
"VV+VH"
]
},
{
"Name": "PROCESSING_DATE",
"Values": [
"2021-01-15T05:50:38.350917"
]
},
{
"Name": "PROCESSING_DESCRIPTION",
"Values": [
"Sentinel-1B Single Look Complex product"
]
},
{
"Name": "PROCESSING_LEVEL",
"Values": [
"L1"
]
},
{
"Name": "PROCESSING_TYPE",
"Values": [
"SLC"
]
},
{
"Name": "PROCESSING_TYPE_DISPLAY",
"Values": [
"L1 Single Look Complex (SLC)"
]
},
{
"Name": "SV_POSITION_POST",
"Values": [
"-2845242.394622,-1186516.792913,6358810.884979,2021-01-14T03:20:52.000000"
]
},
{
"Name": "SV_POSITION_PRE",
"Values": [
"-2893725.563321,-1235772.278165,6327540.675522,2021-01-14T03:20:42.000000"
]
},
{
"Name": "SV_VELOCITY_POST",
"Values": [
"4867.928786,4928.742823,3091.216457,2021-01-14T03:20:52.000000"
]
},
{
"Name": "SV_VELOCITY_PRE",
"Values": [
"4828.615891,4922.252624,3162.766956,2021-01-14T03:20:42.000000"
]
}
],
"SpatialExtent": {
"HorizontalSpatialDomain": {
"Geometry": {
"GPolygons": [
{
"Boundary": {
"Points": [
{
"Longitude": -150.171432,
"Latitude": 65.531036
},
{
"Longitude": -149.24501,
"Latitude": 63.941902
},
{
"Longitude": -144.135376,
"Latitude": 64.386147
},
{
"Longitude": -144.750443,
"Latitude": 65.98999
},
{
"Longitude": -150.171432,
"Latitude": 65.531036
}
]
}
}
]
}
}
},
"ProviderDates": [
{
"Date": "2021-01-15T05:50:42.000Z",
"Type": "Insert"
},
{
"Date": "2021-01-15T05:50:42.000Z",
"Type": "Update"
}
],
"CollectionReference": {
"ShortName": "SENTINEL-1B_SLC",
"Version": "1"
},
"RelatedUrls": [
{
"Format": "Not provided",
"Description": "This link provides direct download access to the granule.",
"Type": "GET DATA",
"URL": "https://datapool.asf.alaska.edu/SLC/SB/S1B_IW_SLC__1SDV_20210114T032030_20210114T032057_025145_02FE61_454A.zip"
},
{
"Format": "Not provided",
"Description": "ASF DAAC Sentinel-1 data set landing page",
"Type": "VIEW RELATED INFORMATION",
"URL": "www.asf.alaska.edu/sar-data-sets/sentinel-1"
},
{
"Format": "Not provided",
"Description": "ASF DAAC Sentinel-1 User Guide and Technical Documentation",
"Type": "VIEW RELATED INFORMATION",
"URL": "www.asf.alaska.edu/sar-information/sentinel-1-documents-tools"
}
],
"DataGranule": {
"DayNightFlag": "Unspecified",
"Identifiers": [
{
"Identifier": "S1B_IW_SLC__1SDV_20210114T032030_20210114T032057_025145_02FE61_454A",
"IdentifierType": "ProducerGranuleId"
}
],
"ProductionDateTime": "2021-01-14T03:20:30.000Z",
"ArchiveAndDistributionInformation": [
{
"Name": "Not provided",
"Size": 3996.5550289154053,
"SizeUnit": "MB",
"Format": "Not provided"
}
]
},
"Platforms": [
{
"ShortName": "SENTINEL-1B",
"Instruments": [
{
"ShortName": "C-SAR"
}
]
}
]
},
"meta": {
"concept-type": "granule",
"concept-id": "G1993900886-ASF",
"revision-id": 1,
"native-id": "S1B_IW_SLC__1SDV_20210114T032030_20210114T032057_025145_02FE61_454A-SLC",
"provider-id": "ASF",
"format": "application/echo10+xml",
"revision-date": "2021-01-15T05:50:42.934Z"
}
},
{
"type": "Feature",
"geometry": {
"coordinates": [
[
[
-150.171249,
65.531166
],
[
-149.244873,
63.942146
],
[
-144.135239,
64.386398
],
[
-144.750275,
65.99012
],
[
-150.171249,
65.531166
]
]
],
"type": "Polygon"
},
"properties": {
"beamModeType": "IW",
"browse": null,
"bytes": 4184931225,
"centerLat": 64.986,
"centerLon": -147.0897,
"faradayRotation": null,
"fileID": "S1B_IW_SLC__1SDV_20210126T032030_20210126T032057_025320_0303F3_7BE5-SLC",
"flightDirection": null,
"groupID": "S1B_IWDV_0209_0216_025320_094",
"granuleType": "SENTINEL_1B_FRAME",
"insarStackId": null,
"md5sum": "4e51b15bbe20cf3fdb1e099ebb4d2c36",
"offNadirAngle": null,
"orbit": 25320,
"pathNumber": 94,
"platform": "Sentinel-1B",
"pointingAngle": null,
"polarization": "VV+VH",
"processingDate": "2021-01-26T03:20:30.000Z",
"processingLevel": "SLC",
"sceneName": "S1B_IW_SLC__1SDV_20210126T032030_20210126T032057_025320_0303F3_7BE5",
"sensor": "C-SAR",
"startTime": "2021-01-26T03:20:30.000Z",
"stopTime": "2021-01-26T03:20:57.000Z",
"url": "https://datapool.asf.alaska.edu/SLC/SB/S1B_IW_SLC__1SDV_20210126T032030_20210126T032057_025320_0303F3_7BE5.zip",
"fileName": "S1B_IW_SLC__1SDV_20210126T032030_20210126T032057_025320_0303F3_7BE5.zip",
"frameNumber": 210
},
"baseline": {
"stateVectors": {
"positions": {},
"velocities": {}
},
"ascendingNodeTime": "2021-01-26T03:02:57.994247"
},
"umm": {
"TemporalExtent": {
"RangeDateTime": {
"BeginningDateTime": "2021-01-26T03:20:30.000Z",
"EndingDateTime": "2021-01-26T03:20:57.000Z"
}
},
"OrbitCalculatedSpatialDomains": [
{
"OrbitNumber": 25320
}
],
"GranuleUR": "S1B_IW_SLC__1SDV_20210126T032030_20210126T032057_025320_0303F3_7BE5-SLC",
"AdditionalAttributes": [
{
"Name": "ACQUISITION_DATE",
"Values": [
"2021-01-26T03:20:57.000000"
]
},
{
"Name": "ASCENDING_DESCENDING",
"Values": [
"ASCENDING"
]
},
{
"Name": "ASC_NODE_TIME",
"Values": [
"2021-01-26T03:02:57.994247"
]
},
{
"Name": "ASF_PLATFORM",
"Values": [
"Sentinel-1B"
]
},
{
"Name": "BEAM_MODE",
"Values": [
"IW"
]
},
{
"Name": "BEAM_MODE_DESC",
"Values": [
"Interferometric Wide. 250 km swath, 5 m x 20 m spatial resolution and burst synchronization for interferometry. IW is considered to be the standard mode over land masses."
]
},
{
"Name": "BEAM_MODE_TYPE",
"Values": [
"IW"
]
},
{
"Name": "BYTES",
"Values": [
"4184931225"
]
},
{
"Name": "CENTER_ESA_FRAME",
"Values": [
"1300"
]
},
{
"Name": "CENTER_FRAME_ID",
"Values": [
"213"
]
},
{
"Name": "CENTER_LAT",
"Values": [
"64.986"
]
},
{
"Name": "CENTER_LON",
"Values": [
"-147.0897"
]
},
{
"Name": "DOPPLER",
"Values": [
"0"
]
},
{
"Name": "FARADAY_ROTATION",
"Values": [
"NA"
]
},
{
"Name": "FAR_END_LAT",
"Values": [
"65.99012"
]
},
{
"Name": "FAR_END_LON",
"Values": [
"-144.750275"
]
},
{
"Name": "FAR_START_LAT",
"Values": [
"64.386398"
]
},
{
"Name": "FAR_START_LON",
"Values": [
"-144.135239"
]
},
{
"Name": "FRAME_NUMBER",
"Values": [
"210"
]
},
{
"Name": "GRANULE_TYPE",
"Values": [
"SENTINEL_1B_FRAME"
]
},
{
"Name": "GROUP_ID",
"Values": [
"S1B_IWDV_0209_0216_025320_094"
]
},
{
"Name": "LOOK_DIRECTION",
"Values": [
"R"
]
},
{
"Name": "MD5SUM",
"Values": [
"4e51b15bbe20cf3fdb1e099ebb4d2c36"
]
},
{
"Name": "MISSION_NAME",
"Values": [
"NA"
]
},
{
"Name": "NEAR_END_LAT",
"Values": [
"65.531166"
]
},
{
"Name": "NEAR_END_LON",
"Values": [
"-150.171249"
]
},
{
"Name": "NEAR_START_LAT",
"Values": [
"63.942146"
]
},
{
"Name": "NEAR_START_LON",
"Values": [
"-149.244873"
]
},
{
"Name": "PATH_NUMBER",
"Values": [
"94"
]
},
{
"Name": "POLARIZATION",
"Values": [
"VV+VH"
]
},
{
"Name": "PROCESSING_DATE",
"Values": [
"2021-01-26T23:06:47.704061"
]
},
{
"Name": "PROCESSING_DESCRIPTION",
"Values": [
"Sentinel-1B Single Look Complex product"
]
},
{
"Name": "PROCESSING_LEVEL",
"Values": [
"L1"
]
},
{
"Name": "PROCESSING_TYPE",
"Values": [
"SLC"
]
},
{
"Name": "PROCESSING_TYPE_DISPLAY",
"Values": [
"L1 Single Look Complex (SLC)"
]
},
{
"Name": "SV_POSITION_POST",
"Values": [
"-2845226.283146,-1186512.31404,6358815.223669,2021-01-26T03:20:52.000000"
]
},
{
"Name": "SV_POSITION_PRE",
"Values": [
"-2893709.242044,-1235768.184608,6327545.175914,2021-01-26T03:20:42.000000"
]
},
{
"Name": "SV_VELOCITY_POST",
"Values": [
"4867.907754,4928.781369,3091.20021,2021-01-26T03:20:52.000000"
]
},
{
"Name": "SV_VELOCITY_PRE",
"Values": [
"4828.59494,4922.291175,3162.750859,2021-01-26T03:20:42.000000"
]
}
],
"SpatialExtent": {
"HorizontalSpatialDomain": {
"Geometry": {
"GPolygons": [
{
"Boundary": {
"Points": [
{
"Longitude": -150.171249,
"Latitude": 65.531166
},
{
"Longitude": -149.244873,
"Latitude": 63.942146
},
{
"Longitude": -144.135239,
"Latitude": 64.386398
},
{
"Longitude": -144.750275,
"Latitude": 65.99012
},
{
"Longitude": -150.171249,
"Latitude": 65.531166
}
]
}
}
]
}
}
},
"ProviderDates": [
{
"Date": "2021-01-26T23:07:05.000Z",
"Type": "Insert"
},
{
"Date": "2021-01-26T23:07:05.000Z",
"Type": "Update"
}
],
"CollectionReference": {
"ShortName": "SENTINEL-1B_SLC",
"Version": "1"
},
"RelatedUrls": [
{
"Format": "Not provided",
"Description": "This link provides direct download access to the granule.",
"Type": "GET DATA",
"URL": "https://datapool.asf.alaska.edu/SLC/SB/S1B_IW_SLC__1SDV_20210126T032030_20210126T032057_025320_0303F3_7BE5.zip"
},
{
"Format": "Not provided",
"Description": "ASF DAAC Sentinel-1 data set landing page",
"Type": "VIEW RELATED INFORMATION",
"URL": "www.asf.alaska.edu/sar-data-sets/sentinel-1"
},
{
"Format": "Not provided",
"Description": "ASF DAAC Sentinel-1 User Guide and Technical Documentation",
"Type": "VIEW RELATED INFORMATION",
"URL": "www.asf.alaska.edu/sar-information/sentinel-1-documents-tools"
}
],
"DataGranule": {
"DayNightFlag": "Unspecified",
"Identifiers": [
{
"Identifier": "S1B_IW_SLC__1SDV_20210126T032030_20210126T032057_025320_0303F3_7BE5",
"IdentifierType": "ProducerGranuleId"
}
],
"ProductionDateTime": "2021-01-26T03:20:30.000Z",
"ArchiveAndDistributionInformation": [
{
"Name": "Not provided",
"Size": 3991.0614252090454,
"SizeUnit": "MB",
"Format": "Not provided"
}
]
},
"Platforms": [
{
"ShortName": "SENTINEL-1B",
"Instruments": [
{
"ShortName": "C-SAR"
}
]
}
]
},
"meta": {
"concept-type": "granule",
"concept-id": "G1996951408-ASF",
"revision-id": 1,
"native-id": "S1B_IW_SLC__1SDV_20210126T032030_20210126T032057_025320_0303F3_7BE5-SLC",
"provider-id": "ASF",
"format": "application/echo10+xml",
"revision-date": "2021-01-26T23:07:05.405Z"
}
}
] Discovery-asf_search-8.1.2/tests/yml_tests/Resources/Fairbanks_SLC.csv 0000664 0000000 0000000 00000002272 14777330235 0026060 0 ustar 00root root 0000000 0000000 "Granule Name","Platform","Sensor","Beam Mode","Beam Mode Description","Orbit","Path Number","Frame Number","Acquisition Date","Processing Date","Processing Level","Start Time","End Time","Center Lat","Center Lon","Near Start Lat","Near Start Lon","Far Start Lat","Far Start Lon","Near End Lat","Near End Lon","Far End Lat","Far End Lon","Faraday Rotation","Ascending or Descending?","URL","Size (MB)","Off Nadir Angle","Stack Size","Doppler","GroupID","Pointing Angle"
"S1B_IW_SLC__1SDV_20210102T032031_20210102T032058_024970_02F8C3_C081","Sentinel-1B","C-SAR","IW","Interferometric Wide. 250 km swath, 5 m x 20 m spatial resolution and burst synchronization for interferometry. IW is considered to be the standard mode over land masses.","24970","94","210","2021-01-02T03:20:58.000000","2021-01-02T03:20:31.000000","SLC","2021-01-02T03:20:31.000000","2021-01-02T03:20:58.000000","64.9861","-147.0909","63.942123","-149.246063","64.386414","-144.136368","65.53125","-150.172562","65.99025","-144.751495","None","ASCENDING","https://datapool.asf.alaska.edu/SLC/SB/S1B_IW_SLC__1SDV_20210102T032031_20210102T032058_024970_02F8C3_C081.zip","3999.446469306946","None","None","0","S1B_IWDV_0209_0216_024970_094",""
Discovery-asf_search-8.1.2/tests/yml_tests/Resources/Fairbanks_SLC.kml 0000664 0000000 0000000 00000004440 14777330235 0026047 0 ustar 00root root 0000000 0000000
<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
<Document>
<name>ASF Datapool Search Results</name>
<description>Search Performed: </description>
<Placemark>
<name>S1B_IW_SLC__1SDV_20210102T032031_20210102T032058_024970_02F8C3_C081</name>
<description><![CDATA[
Sentinel-1B (Interferometric Wide. 250 km swath, 5 m x 20 m spatial resolution and burst synchronization for interferometry. IW is considered to be the standard mode over land masses.), acquired 2021-01-02T03:20:58.000000
https://datapool.asf.alaska.edu/SLC/SB/S1B_IW_SLC__1SDV_20210102T032031_20210102T032058_024970_02F8C3_C081.zip
Metadata
- Processing type: L1 Single Look Complex (SLC)
- Frame: 210
- Path: 94
- Orbit: 24970
- Start time: 2021-01-02T03:20:31.000000
- End time: 2021-01-02T03:20:58.000000
- Faraday Rotation: None
- Ascending/Descending: ASCENDING
- Off Nadir Angle: None
- Pointing Angle: None
]]></description>
<styleUrl>#yellowLineGreenPoly</styleUrl>
<Polygon>
<extrude>1</extrude>
<altitudeMode>relativeToGround</altitudeMode>
<outerBoundaryIs>
<LinearRing>
<coordinates>
-144.751495,65.990250,2000
-144.136368,64.386414,2000
-149.246063,63.942123,2000
-150.172562,65.531250,2000
-144.751495,65.990250,2000
</coordinates>
</LinearRing>
</outerBoundaryIs>
</Polygon>
</Placemark>
</Document>
</kml>