Draft
139 commits
6b1b7cc
Separated NTS backends
Jun 29, 2023
4f3f611
n/a
Jul 5, 2023
e32d5ad
More nts backend stuff
Jul 6, 2023
ccc0de4
Working(?) np backend
Jul 10, 2023
743fb1d
Merge branch 'apache:master' into SDAP-472-gridded-zarr
RKuttruff Jul 10, 2023
b77aa11
Working(?) np backend
Jul 10, 2023
4ccec2e
gitignore ini
Jul 10, 2023
736a44e
ASF headers
Jul 10, 2023
70bdab1
First functioning test of 2 simultaneous backends
Jul 11, 2023
f3981cd
Removed accidentally committed ini files
Jul 12, 2023
26f6220
Working zarr backend ds list
Jul 12, 2023
91de6ef
Capture and handle NTS requests routed to backend that doesn't (yet) …
Jul 12, 2023
df23919
analysis setup fails to find VERSION.txt when building locally
Jul 12, 2023
07404f0
Implemented more NTS functions in zarr backend
Jul 12, 2023
72888aa
Added misc backend time metrics record field in NCSH
Jul 12, 2023
1c4a0e4
fixes
Jul 13, 2023
0a7cd7f
Dynamic dataset management
Jul 17, 2023
c8e7dbb
Dynamic dataset management
Jul 18, 2023
e78f7ad
Dataset management
Jul 20, 2023
a84d77e
Timeseriesspark support
Jul 27, 2023
53190e2
Update backend dict on dataset mgmt query
Jul 31, 2023
2e7a0dc
Fixes and improvements
Jul 31, 2023
0869375
Adapted matchup to work with zarr backends
Jul 31, 2023
c156826
Merge branch 'apache:master' into SDAP-472-gridded-zarr
RKuttruff Jul 31, 2023
1eb680b
Zarr support
Aug 1, 2023
0aef0f1
DDAS adjustments
Aug 2, 2023
42b912e
find_tile_by_polygon_and_most_recent_day_of_year impl
Aug 3, 2023
1559fba
Don't sel by time if neither max nor min time are given
Aug 8, 2023
2bb52af
Fix not calling partial when needed
Aug 15, 2023
f9dc2ae
Pinned s3fs and fsspec versions
Aug 18, 2023
a6f602d
Fixed some dependencies to ensure image builds properly + s3fs works
Aug 18, 2023
1a451eb
Config override for backends
Aug 21, 2023
6f8f7b1
Deps update
Aug 21, 2023
5baf9ec
Merge remote-tracking branch 'RKuttruff/master' into SDAP-472-gridded…
Aug 21, 2023
8cc9d5d
Merge branch 'apache:master' into SDAP-472-gridded-zarr
RKuttruff Aug 22, 2023
e693d27
Merge branch 'apache:master' into SDAP-484-CoG
RKuttruff Aug 22, 2023
492be4b
Merge branch 'apache:master' into SDAP-472-gridded-zarr
RKuttruff Aug 23, 2023
8a9283a
Merge branch 'apache:master' into SDAP-484-CoG
RKuttruff Aug 23, 2023
483ad9f
Add metadata from Zarr collection to /list
Aug 31, 2023
e1dec65
CoG backend start
Sep 1, 2023
bb2fa00
Start of CoG work
Sep 5, 2023
d57a531
rioxarray dep
Sep 5, 2023
3ad5530
CoG URL work and some Solr queries
Sep 6, 2023
4b24ec3
Merge branch 'apache:master' into SDAP-472-gridded-zarr
RKuttruff Sep 6, 2023
87f7bc0
Merge branch 'apache:master' into SDAP-484-CoG
RKuttruff Sep 6, 2023
6712c82
Merge remote-tracking branch 'RKuttruff/SDAP-484-CoG' into SDAP-484-CoG
Sep 6, 2023
aee843b
More functions implemented
Sep 7, 2023
c7b938d
More functions implemented
Sep 7, 2023
9d2605b
Merge remote-tracking branch 'RKuttruff/SDAP-484-CoG' into SDAP-484-CoG
Sep 7, 2023
6077ac2
Merge branch 'apache:master' into SDAP-472-gridded-zarr
RKuttruff Sep 7, 2023
a43db23
Merge branch 'apache:master' into SDAP-484-CoG
RKuttruff Sep 7, 2023
d51cf63
More CoG stuff
Sep 11, 2023
0d3c0fc
Merge branch 'apache:master' into SDAP-472-gridded-zarr
RKuttruff Sep 14, 2023
f80496c
Merge branch 'apache:master' into SDAP-484-CoG
RKuttruff Sep 14, 2023
8d51337
Merge branch 'apache:master' into SDAP-472-gridded-zarr
RKuttruff Sep 14, 2023
d26f4a2
Merge branch 'apache:master' into SDAP-484-CoG
RKuttruff Sep 14, 2023
f5750c3
Zarr: Probe lat order and flip if necessary
Sep 14, 2023
b97677e
Zarr: Probe lat order and flip if necessary
Sep 14, 2023
4cdb485
Fixes for subsetting
Sep 18, 2023
1ee25c2
Warnings for geo subsetting
Sep 20, 2023
7fc260a
Strip quotes from variable names
Sep 20, 2023
62564f7
Strip quotes from variable names
Sep 20, 2023
b5a223b
Fixed find_tile_by_id not routing to correct backend
Sep 21, 2023
06504e0
Ensure geotiffs are sorted by dt
Sep 21, 2023
b5df944
removed resultSizeLimit param from matchup
stephenykp Sep 25, 2023
5e0fbb2
Add # of primaries/average secondaries to job output
stephenykp Sep 25, 2023
fbad6b7
rename to executionId
stephenykp Sep 25, 2023
e0a5999
update changelog
stephenykp Sep 25, 2023
8942afc
add totalSecondaryMatched field to /job output
stephenykp Sep 29, 2023
dd73036
num unique secondaries addition
stephenykp Sep 29, 2023
a7beb85
Reenabled TIFF geo subsetting
Oct 3, 2023
741d4b6
Ensure latitudes are in ascending order
Oct 3, 2023
5003420
Catch and report algs that are unsupported by ds backend
Oct 9, 2023
15fce2e
Catch and report algs that are unsupported by ds backend
Oct 9, 2023
d0742ad
Code for opening tiffs in S3
Oct 12, 2023
db68d4f
updated docs to use correct sea_water_temperature param name
stephenykp Oct 13, 2023
5e77bae
Merge remote-tracking branch 'origin/master' into SDAP-484-CoG
Oct 17, 2023
29f556f
Merge branch 'apache:master' into SDAP-484-CoG
RKuttruff Oct 26, 2023
7e11a4c
Merge remote-tracking branch 'origin' into SDAP-493
stephenykp Nov 1, 2023
a8be9b8
bugfix
stephenykp Nov 1, 2023
62de867
fix division by zero bug
stephenykp Nov 6, 2023
972f3dd
add params to dataset management handler classes
Nov 8, 2023
9f0a107
Merge remote-tracking branch 'origin/master' into SDAP-472-gridded-zarr
Nov 8, 2023
831ca37
add page number to default filename for matchup output
stephenykp Nov 16, 2023
4ab2f9b
pagination improvements
stephenykp Nov 16, 2023
3677c11
removed debugging line
stephenykp Nov 16, 2023
86f1348
changelog
stephenykp Nov 16, 2023
1e8cc4e
Update helm cassandra dependency (#289)
RKuttruff Nov 27, 2023
32ca3d7
stac catalog
stephenykp Jan 5, 2024
3563ae9
Updated openapi spec
stephenykp Jan 6, 2024
0691d87
move stac endpoints to matchup tag in openapi spec
stephenykp Jan 6, 2024
e02fc78
SDAP-507 - Changes to remove geos sub-dependency
Jan 11, 2024
51231ca
SDAP-507 - Changelog
Jan 11, 2024
5c75573
SDAP-507 - Changes to remove geos sub-dependency
Jan 11, 2024
7f717c0
SDAP-507 - Changelog
Jan 11, 2024
2b6efa6
Merge remote-tracking branch 'origin/SDAP-507' into SDAP-507
Jan 11, 2024
9779f40
delete instead of comment out
Jan 19, 2024
3e700b8
Merge branch 'SDAP-500' into SDAP-499
stephenykp Jan 19, 2024
9378760
Revert "Update helm cassandra dependency (#289)"
stephenykp Jan 19, 2024
092c87b
Merge branch 'SDAP-499' into SDAP-506
stephenykp Jan 19, 2024
e6730eb
Merge branch 'release/1.2.0' into SDAP-506
stephenykp Jan 19, 2024
3aafb7e
Merge remote-tracking branch 'origin/SDAP-506' into SDAP-472-gridded-…
Jan 19, 2024
5303146
deleted disabled endpoint files
Jan 19, 2024
d6a75e3
Merge branch 'SDAP-507' into SDAP-472-gridded-zarr
Jan 19, 2024
2ed29fd
Merge branch 'develop' into SDAP-493
stephenykp Jan 19, 2024
2a340dc
Merge branch 'SDAP-493' into SDAP-500
stephenykp Jan 19, 2024
718067c
Merge branch 'SDAP-500' into SDAP-499
stephenykp Jan 19, 2024
681ba5a
Merge branch 'SDAP-499' into SDAP-506
stephenykp Jan 19, 2024
935000b
Merge branch 'release/1.2.0' into SDAP-472-gridded-zarr
Jan 19, 2024
ee5e5c8
fix bug where still-running jobs failed /job endpoint due to missing …
stephenykp Jan 25, 2024
0f388a3
Merge branch 'release/1.2.0' into SDAP-506
stephenykp Jan 25, 2024
6bd5f0e
Merge remote-tracking branch 'origin/SDAP-506' into SDAP-472-gridded-…
Jan 29, 2024
639d7d7
Update .asf.yaml (#293)
RKuttruff Feb 1, 2024
5937db1
Merge branch 'apache:master' into SDAP-472-gridded-zarr
RKuttruff Feb 1, 2024
3050ef3
Merge remote-tracking branch 'origin/develop' into SDAP-472-gridded-zarr
Feb 1, 2024
3875f2d
Moved changelog entries
Feb 1, 2024
d989c66
SDAP-472 changelog entries
Feb 1, 2024
2c22be6
Merge branch 'SDAP-472-gridded-zarr' into SDAP-484-CoG
Feb 6, 2024
f766df1
Merge branch 'develop' into SDAP-472-gridded-zarr
Feb 8, 2024
3285c1e
Merge branch 'SDAP-472-gridded-zarr' into SDAP-484-CoG
Feb 12, 2024
ec2ed11
pyproj requirement
Feb 12, 2024
1392983
CoG: some minor fixes for docker build &c
Feb 13, 2024
1ce1732
Merge branch 'develop' into SDAP-472-gridded-zarr
RKuttruff Mar 6, 2024
42530a2
Merge branch 'SDAP-472-gridded-zarr' into SDAP-484-CoG
Apr 1, 2024
c1c4c49
Merge remote-tracking branch 'origin/develop' into SDAP-484-CoG
Apr 1, 2024
168ae73
Dependencies update to poetry
Apr 1, 2024
cefa8ab
Handling of GeoTIFFs with non-float dtypes
Apr 2, 2024
2800569
Improved opening of GeoTIFFs to avoid mask & scale issues with dtype
Apr 16, 2024
6b6bb75
Merge branch 'develop' into SDAP-484-CoG
May 6, 2024
d1fb37c
Merge branch 'develop' into SDAP-484-CoG
Jun 10, 2024
18bbfea
Poetry re-lock
Jun 10, 2024
9a11bad
Fix bad merge
Jun 10, 2024
d44e530
Merge branch 'apache:develop' into SDAP-484-CoG
RKuttruff Jun 25, 2024
f41ebe6
Merge remote-tracking branch 'origin/develop' into SDAP-484-CoG
Jul 10, 2024
c7e6e82
poetry lock
Jul 10, 2024
606e13f
Merge remote-tracking branch 'origin/develop' into SDAP-484-CoG
Oct 8, 2024
f2e5cf1
Poetry re-lock
Oct 8, 2024
0cb9c83
Update backend for new NTS method
Oct 8, 2024
1e746f8
Merge branch 'apache:develop' into SDAP-484-CoG
RKuttruff Oct 14, 2024
3 changes: 3 additions & 0 deletions analysis/webservice/algorithms/DataInBoundsSearch.py
@@ -29,6 +29,7 @@

EPOCH = timezone('UTC').localize(datetime(1970, 1, 1))
ISO_8601 = '%Y-%m-%dT%H:%M:%S%z'
logger = logging.getLogger(__name__)


@nexus_handler
@@ -157,6 +158,8 @@ def calc(self, computeOptions, **args):
tiles = self._get_tile_service().get_tiles_by_metadata(metadata_filter, ds, start_time, end_time)
need_to_fetch = False

logger.info(f'Matched {len(tiles)} tiles')

data = []

log.info(f'Matched {len(tiles):,} tiles.')
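The diff above adds a module-level logger alongside the existing `log.info(f'Matched {len(tiles):,} tiles.')` call. The pattern can be sketched standalone (the function name here is illustrative, not from the SDAP codebase):

```python
import logging

# Module-level logger, as added in the diff above
logger = logging.getLogger(__name__)


def report_matches(tiles):
    # The ':,' format spec groups digits with commas, e.g. 12345 -> '12,345'
    logger.info(f'Matched {len(tiles):,} tiles')
    return len(tiles)
```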
@@ -22,6 +22,9 @@
from webservice.webmodel import NexusRequestObjectTornadoFree, NexusRequestObject, NexusProcessingException
from webservice.algorithms_spark.NexusCalcSparkTornadoHandler import NexusCalcSparkTornadoHandler

from nexustiles.exception import AlgorithmUnsupportedForDatasetException

from py4j.protocol import Py4JJavaError

class NexusRequestHandler(tornado.web.RequestHandler):
def initialize(self, thread_pool, clazz=None, **kargs):
@@ -72,6 +75,56 @@ def get(self):
except NexusProcessingException as e:
self.async_onerror_callback(e.reason, e.code)

except AlgorithmUnsupportedForDatasetException as e:
self.logger.exception(e)
self.async_onerror_callback(
reason='Algorithm unsupported for dataset (backend has yet to implement functionality)',
code=400
)

except Py4JJavaError as e:
self.logger.exception(e)

if 'AlgorithmUnsupportedForDatasetException' in str(e):
self.async_onerror_callback(
reason='Algorithm unsupported for dataset (backend has yet to implement functionality)',
code=400
)
else:
self.async_onerror_callback(str(e), 500)

except Exception as e:
self.logger.exception(e)
self.async_onerror_callback(str(e), 500)

@tornado.gen.coroutine
def post(self):
self.logger.info("Received %s" % self._request_summary())

request = NexusRequestObject(self)

# create NexusCalcHandler which will process the request
instance = self.__clazz(**self._clazz_init_args)

try:
# process the request asynchronously on a different thread,
# the current tornado handler is still available to get other user requests
results = yield tornado.ioloop.IOLoop.current().run_in_executor(self.executor, instance.calc, request)

if results:
try:
self.set_status(results.status_code)
except AttributeError:
pass

renderer = NexusRendererFactory.get_renderer(request)
renderer.render(self, results)

except NexusProcessingException as e:
self.async_onerror_callback(e.reason, e.code)

except Exception as e:
self.async_onerror_callback(str(e), 500)

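Because an `AlgorithmUnsupportedForDatasetException` raised inside Spark surfaces on the Python side wrapped in a `Py4JJavaError`, the handler above falls back to matching the exception name in the error text. That dispatch can be sketched in isolation (the function name is ours, not from the PR):

```python
# Sketch of the error-dispatch logic added above: a Py4J error whose text
# mentions AlgorithmUnsupportedForDatasetException is reported as a client
# error (400); anything else is a server error (500).
def classify_spark_error(err: Exception) -> int:
    if 'AlgorithmUnsupportedForDatasetException' in str(err):
        return 400
    return 500
```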
3 changes: 3 additions & 0 deletions data-access/nexustiles/backends/__init__.py
@@ -13,3 +13,6 @@
# See the License for the specific language governing permissions and
# limitations under the License.

from nexustiles.backends.cog.backend import CoGBackend
from nexustiles.backends.zarr.backend import ZarrBackend
from nexustiles.backends.nexusproto.backend import NexusprotoTileService
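With three backends exported from the package, the tile service needs some way to route a dataset to its implementation. A hypothetical sketch of such a registry (the class names come from the imports above; the registry and function are illustrative, not the actual NTS routing code):

```python
# Map a dataset's store type to the backend class name that handles it.
BACKEND_BY_STORE_TYPE = {
    'zarr': 'ZarrBackend',
    'cog': 'CoGBackend',
    'nexusproto': 'NexusprotoTileService',
}


def backend_for(store_type: str) -> str:
    try:
        return BACKEND_BY_STORE_TYPE[store_type]
    except KeyError:
        raise ValueError(f'No backend registered for store type {store_type!r}')
```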
284 changes: 284 additions & 0 deletions data-access/nexustiles/backends/backend.py.template
@@ -0,0 +1,284 @@
############################################################################
### ###
### THIS IS A TEMPLATE FOR STARTING A NEW NTS BACKEND IMPLEMENTATION ###
### ###
############################################################################

# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

import logging
import sys
from datetime import datetime
from urllib.parse import urlparse

import numpy as np
import numpy.ma as ma
import s3fs
import xarray as xr
from nexustiles.AbstractTileService import AbstractTileService
from nexustiles.exception import NexusTileServiceException
from nexustiles.model.nexusmodel import Tile, BBox, TileVariable
from pytz import timezone
from shapely.geometry import MultiPolygon, box
from yarl import URL

EPOCH = timezone('UTC').localize(datetime(1970, 1, 1))
ISO_8601 = '%Y-%m-%dT%H:%M:%S%z'

logging.basicConfig(
level=logging.INFO,
format='%(asctime)s - %(name)s - %(levelname)s - %(message)s',
datefmt="%Y-%m-%dT%H:%M:%S", stream=sys.stdout)
logger = logging.getLogger(__name__)


class ZarrBackend(AbstractTileService):
def __init__(self, dataset_name, path, config=None):
AbstractTileService.__init__(self, dataset_name)
self.__config = config if config is not None else {}
config = self.__config

logger.info(f'Opening zarr backend at {path} for dataset {self._name}')

url = urlparse(path)

self.__url = path

self.__store_type = url.scheme
self.__host = url.netloc
self.__path = url.path

if 'variable' in config:
data_vars = config['variable']
elif 'variables' in config:
data_vars = config['variables']
else:
raise KeyError('Data variables not provided in config')

if isinstance(data_vars, str):
self.__variables = [data_vars]
elif isinstance(data_vars, list):
self.__variables = data_vars
else:
raise TypeError(f'Improper type for variables config: {type(data_vars)}')

self.__longitude = config['coords']['longitude']
self.__latitude = config['coords']['latitude']
self.__time = config['coords']['time']

self.__depth = config['coords'].get('depth')

if self.__store_type in ['', 'file']:
store = self.__path
elif self.__store_type == 's3':
try:
aws_cfg = self.__config['aws']

if aws_cfg['public']:
# region = aws_cfg.get('region', 'us-west-2')
# store = f'https://{self.__host}.s3.{region}.amazonaws.com{self.__path}'
s3 = s3fs.S3FileSystem(True)
store = s3fs.S3Map(root=path, s3=s3, check=False)
else:
s3 = s3fs.S3FileSystem(False, key=aws_cfg['accessKeyID'], secret=aws_cfg['secretAccessKey'])
store = s3fs.S3Map(root=path, s3=s3, check=False)
except Exception as e:
logger.error(f'Failed to open zarr dataset at {self.__path}, ignoring it. Cause: {e}')
raise NexusTileServiceException(f'Cannot open S3 dataset ({e})')
else:
raise ValueError(self.__store_type)

try:
self.__ds: xr.Dataset = xr.open_zarr(store, consolidated=True)
except Exception as e:
logger.error(f'Failed to open zarr dataset at {self.__path}, ignoring it. Cause: {e}')
raise NexusTileServiceException(f'Cannot open dataset ({e})')

def get_dataseries_list(self, simple=False):
raise NotImplementedError()

def find_tile_by_id(self, tile_id, **kwargs):
return [tile_id]

def find_tiles_by_id(self, tile_ids, ds=None, **kwargs):
return tile_ids

def find_days_in_range_asc(self, min_lat, max_lat, min_lon, max_lon, dataset, start_time, end_time,
metrics_callback=None, **kwargs):
raise NotImplementedError()

def find_tile_by_polygon_and_most_recent_day_of_year(self, bounding_polygon, ds, day_of_year, **kwargs):
"""
Given a bounding polygon, dataset, and day of year, find tiles in that dataset with the same bounding
polygon and the closest day of year.

For example:
given a polygon minx=0, miny=0, maxx=1, maxy=1; dataset=MY_DS; and day of year=32
search for first tile in MY_DS with identical bbox and day_of_year <= 32 (sorted by day_of_year desc)

Valid matches:
minx=0, miny=0, maxx=1, maxy=1; dataset=MY_DS; day of year = 32
minx=0, miny=0, maxx=1, maxy=1; dataset=MY_DS; day of year = 30

Invalid matches:
minx=1, miny=0, maxx=2, maxy=1; dataset=MY_DS; day of year = 32
minx=0, miny=0, maxx=1, maxy=1; dataset=MY_OTHER_DS; day of year = 32
minx=0, miny=0, maxx=1, maxy=1; dataset=MY_DS; day of year = 30 if minx=0, miny=0, maxx=1, maxy=1; dataset=MY_DS; day of year = 32 also exists

:param bounding_polygon: The exact bounding polygon of tiles to search for
:param ds: The dataset name being searched
:param day_of_year: Tile day of year to search for, tile nearest to this day (without going over) will be returned
:return: List of one tile from ds with bounding_polygon on or before day_of_year or raise NexusTileServiceException if no tile found
"""

raise NotImplementedError()

def find_all_tiles_in_box_at_time(self, min_lat, max_lat, min_lon, max_lon, dataset, time, **kwargs):
return self.find_tiles_in_box(min_lat, max_lat, min_lon, max_lon, dataset, time, time, **kwargs)

def find_all_tiles_in_polygon_at_time(self, bounding_polygon, dataset, time, **kwargs):
return self.find_tiles_in_polygon(bounding_polygon, dataset, time, time, **kwargs)

def find_tiles_in_box(self, min_lat, max_lat, min_lon, max_lon, ds=None, start_time=0, end_time=-1, **kwargs):
raise NotImplementedError()

def find_tiles_in_polygon(self, bounding_polygon, ds=None, start_time=None, end_time=None, **kwargs):
# Find tiles that fall within the polygon in the Solr index
raise NotImplementedError()

def find_tiles_by_metadata(self, metadata, ds=None, start_time=0, end_time=-1, **kwargs):
"""
Return list of tiles whose metadata matches the specified metadata, start_time, end_time.
:param metadata: List of metadata values to search for tiles e.g ["river_id_i:1", "granule_s:granule_name"]
:param ds: The dataset name to search
:param start_time: The start time to search for tiles
:param end_time: The end time to search for tiles
:return: A list of tiles
"""
raise NotImplementedError()

def find_tiles_by_exact_bounds(self, bounds, ds, start_time, end_time, **kwargs):
"""
The method will return tiles with the exact given bounds within the time range. It differs from
find_tiles_in_polygon in that only tiles with exactly the given bounds will be returned as opposed to
doing a polygon intersection with the given bounds.

:param bounds: (minx, miny, maxx, maxy) bounds to search for
:param ds: Dataset name to search
:param start_time: Start time to search (seconds since epoch)
:param end_time: End time to search (seconds since epoch)
:param kwargs: fetch_data: True/False = whether or not to retrieve tile data
:return:
"""
raise NotImplementedError()

def find_all_boundary_tiles_at_time(self, min_lat, max_lat, min_lon, max_lon, dataset, time, **kwargs):
# Due to the precise nature of gridded Zarr's subsetting, it doesn't make sense to have a boundary region like
# this
raise NotImplementedError()

def get_min_max_time_by_granule(self, ds, granule_name):
raise NotImplementedError()

def get_dataset_overall_stats(self, ds):
raise NotImplementedError()

def get_stats_within_box_at_time(self, min_lat, max_lat, min_lon, max_lon, dataset, time, **kwargs):
raise NotImplementedError()

def get_bounding_box(self, tile_ids):
"""
Retrieve a bounding box that encompasses all of the tiles represented by the given tile ids.
:param tile_ids: List of tile ids
:return: shapely.geometry.Polygon that represents the smallest bounding box that encompasses all of the tiles
"""

raise NotImplementedError()

# def __get_ds_min_max_date(self):
# min_date = self.__ds[self.__time].min().to_numpy()
# max_date = self.__ds[self.__time].max().to_numpy()
#
# if np.issubdtype(min_date.dtype, np.datetime64):
# min_date = ((min_date - np.datetime64(EPOCH)) / 1e9).astype(int).item()
#
# if np.issubdtype(max_date.dtype, np.datetime64):
# max_date = ((max_date - np.datetime64(EPOCH)) / 1e9).astype(int).item()
#
# return min_date, max_date

def get_min_time(self, tile_ids, ds=None):
"""
Get the minimum tile date from the list of tile ids
:param tile_ids: List of tile ids
:param ds: Filter by a specific dataset. Defaults to None (queries all datasets)
:return: long time in seconds since epoch
"""
raise NotImplementedError()

def get_max_time(self, tile_ids, ds=None):
"""
Get the maximum tile date from the list of tile ids
:param tile_ids: List of tile ids
:param ds: Filter by a specific dataset. Defaults to None (queries all datasets)
:return: long time in seconds since epoch
"""
raise NotImplementedError()

def get_distinct_bounding_boxes_in_polygon(self, bounding_polygon, ds, start_time, end_time):
"""
Get a list of distinct tile bounding boxes from all tiles within the given polygon and time range.
:param bounding_polygon: The bounding polygon of tiles to search for
:param ds: The dataset name to search
:param start_time: The start time to search for tiles
:param end_time: The end time to search for tiles
:return: A list of distinct bounding boxes (as shapely polygons) for tiles in the search polygon
"""
raise NotImplementedError()

def get_tile_count(self, ds, bounding_polygon=None, start_time=0, end_time=-1, metadata=None, **kwargs):
"""
Return number of tiles that match search criteria.
:param ds: The dataset name to search
:param bounding_polygon: The polygon to search for tiles
:param start_time: The start time to search for tiles
:param end_time: The end time to search for tiles
:param metadata: List of metadata values to search for tiles e.g ["river_id_i:1", "granule_s:granule_name"]
:return: number of tiles that match search criteria
"""
raise NotImplementedError()

def fetch_data_for_tiles(self, *tiles):
for tile in tiles:
self.__fetch_data_for_tile(tile)

return tiles

def __fetch_data_for_tile(self, tile: Tile):
raise NotImplementedError()


def _metadata_store_docs_to_tiles(self, *store_docs):
return [ZarrBackend.__nts_url_to_tile(d) for d in store_docs]

@staticmethod
def __nts_url_to_tile(nts_url):
raise NotImplementedError()

@staticmethod
def __to_url(dataset, **kwargs):
raise NotImplementedError()

