Update stac-fastapi parent libraries to 5.1.1 #354

Merged: 22 commits, Apr 16, 2025

2 changes: 1 addition & 1 deletion .github/workflows/cicd.yml
@@ -65,7 +65,7 @@ jobs:

strategy:
matrix:
python-version: [ "3.8", "3.9", "3.10", "3.11", "3.12", "3.13"]
python-version: [ "3.9", "3.10", "3.11", "3.12", "3.13"]
backend: [ "elasticsearch7", "elasticsearch8", "opensearch"]

name: Python ${{ matrix.python-version }} testing with ${{ matrix.backend }}
8 changes: 7 additions & 1 deletion CHANGELOG.md
@@ -7,13 +7,18 @@ and this project adheres to [Semantic Versioning](http://semver.org/spec/v2.0.0.

## [Unreleased]

## [v4.0.0a0]

### Added
- Added support for dynamically-generated queryables based on Elasticsearch/OpenSearch mappings, with extensible metadata augmentation [#351](https://github.com/stac-utils/stac-fastapi-elasticsearch-opensearch/pull/351)
- Included default queryables configuration for seamless integration. [#351](https://github.com/stac-utils/stac-fastapi-elasticsearch-opensearch/pull/351)

### Changed
- Refactored database logic to reduce duplication [#351](https://github.com/stac-utils/stac-fastapi-elasticsearch-opensearch/pull/351)
- Replaced `fastapi-slim` with `fastapi` dependency [#351](https://github.com/stac-utils/stac-fastapi-elasticsearch-opensearch/pull/351)
- Changed minimum Python version to 3.9 [#354](https://github.com/stac-utils/stac-fastapi-elasticsearch-opensearch/pull/354)
- Updated the stac-fastapi `api`, `types`, and `extensions` libraries from 3.0.0 to 5.1.1 and made the associated code changes [#354](https://github.com/stac-utils/stac-fastapi-elasticsearch-opensearch/pull/354)
- Changed Makefile commands from `docker-compose` to `docker compose` [#354](https://github.com/stac-utils/stac-fastapi-elasticsearch-opensearch/pull/354)

### Fixed
- Improved performance of `mk_actions` and `filter-links` methods [#351](https://github.com/stac-utils/stac-fastapi-elasticsearch-opensearch/pull/351)
@@ -314,7 +319,8 @@ and this project adheres to [Semantic Versioning](http://semver.org/spec/v2.0.0.
- Use genexp in execute_search and get_all_collections to return results.
- Added db_to_stac serializer to item_collection method in core.py.

[Unreleased]: https://github.com/stac-utils/stac-fastapi-elasticsearch/tree/v3.2.5...main
[Unreleased]: https://github.com/stac-utils/stac-fastapi-elasticsearch/tree/v4.0.0a0...main
[v4.0.0a0]: https://github.com/stac-utils/stac-fastapi-elasticsearch/tree/v3.2.5...v4.0.0a0
[v3.2.5]: https://github.com/stac-utils/stac-fastapi-elasticsearch/tree/v3.2.4...v3.2.5
[v3.2.4]: https://github.com/stac-utils/stac-fastapi-elasticsearch/tree/v3.2.3...v3.2.4
[v3.2.3]: https://github.com/stac-utils/stac-fastapi-elasticsearch/tree/v3.2.2...v3.2.3
22 changes: 11 additions & 11 deletions Makefile
@@ -10,15 +10,15 @@ OS_APP_PORT ?= 8082
OS_HOST ?= docker.for.mac.localhost
OS_PORT ?= 9202

run_es = docker-compose \
run_es = docker compose \
run \
-p ${EXTERNAL_APP_PORT}:${ES_APP_PORT} \
-e PY_IGNORE_IMPORTMISMATCH=1 \
-e APP_HOST=${APP_HOST} \
-e APP_PORT=${ES_APP_PORT} \
app-elasticsearch

run_os = docker-compose \
run_os = docker compose \
run \
-p ${EXTERNAL_APP_PORT}:${OS_APP_PORT} \
-e PY_IGNORE_IMPORTMISMATCH=1 \
@@ -45,7 +45,7 @@ run-deploy-locally:

.PHONY: image-dev
image-dev:
docker-compose build
docker compose build

.PHONY: docker-run-es
docker-run-es: image-dev
@@ -66,28 +66,28 @@ docker-shell-os:
.PHONY: test-elasticsearch
test-elasticsearch:
-$(run_es) /bin/bash -c 'export && ./scripts/wait-for-it-es.sh elasticsearch:9200 && cd stac_fastapi/tests/ && pytest'
docker-compose down
docker compose down

.PHONY: test-opensearch
test-opensearch:
-$(run_os) /bin/bash -c 'export && ./scripts/wait-for-it-es.sh opensearch:9202 && cd stac_fastapi/tests/ && pytest'
docker-compose down
docker compose down

.PHONY: test
test:
-$(run_es) /bin/bash -c 'export && ./scripts/wait-for-it-es.sh elasticsearch:9200 && cd stac_fastapi/tests/ && pytest'
docker-compose down
docker compose down

-$(run_os) /bin/bash -c 'export && ./scripts/wait-for-it-es.sh opensearch:9202 && cd stac_fastapi/tests/ && pytest'
docker-compose down
docker compose down

.PHONY: run-database-es
run-database-es:
docker-compose run --rm elasticsearch
docker compose run --rm elasticsearch

.PHONY: run-database-os
run-database-os:
docker-compose run --rm opensearch
docker compose run --rm opensearch

.PHONY: pybase-install
pybase-install:
@@ -107,10 +107,10 @@ install-os: pybase-install

.PHONY: docs-image
docs-image:
docker-compose -f docker-compose.docs.yml \
docker compose -f docker-compose.docs.yml \
build

.PHONY: docs
docs: docs-image
docker-compose -f docker-compose.docs.yml \
docker compose -f docker-compose.docs.yml \
run docs
2 changes: 0 additions & 2 deletions docker-compose.yml
@@ -1,5 +1,3 @@
version: '3.9'

services:
app-elasticsearch:
container_name: stac-fastapi-es
8 changes: 4 additions & 4 deletions stac_fastapi/core/setup.py
@@ -9,10 +9,10 @@
"fastapi",
"attrs>=23.2.0",
"pydantic",
"stac_pydantic>=3",
"stac-fastapi.types==3.0.0",
"stac-fastapi.api==3.0.0",
"stac-fastapi.extensions==3.0.0",
"stac_pydantic==3.1.*",
"stac-fastapi.api==5.1.1",
"stac-fastapi.extensions==5.1.1",
"stac-fastapi.types==5.1.1",
"orjson",
"overrides",
"geojson-pydantic",
57 changes: 34 additions & 23 deletions stac_fastapi/core/stac_fastapi/core/core.py
@@ -37,7 +37,7 @@
from stac_fastapi.types.core import AsyncBaseCoreClient, AsyncBaseTransactionsClient
from stac_fastapi.types.extension import ApiExtension
from stac_fastapi.types.requests import get_base_url
from stac_fastapi.types.rfc3339 import DateTimeType
from stac_fastapi.types.rfc3339 import DateTimeType, rfc3339_str_to_datetime
from stac_fastapi.types.search import BaseSearchPostRequest

logger = logging.getLogger(__name__)
@@ -277,7 +277,7 @@ async def item_collection(
self,
collection_id: str,
bbox: Optional[BBox] = None,
datetime: Optional[DateTimeType] = None,
datetime: Optional[str] = None,
limit: Optional[int] = 10,
token: Optional[str] = None,
**kwargs,
@@ -287,7 +287,7 @@
Args:
collection_id (str): The identifier of the collection to read items from.
bbox (Optional[BBox]): The bounding box to filter items by.
datetime (Optional[DateTimeType]): The datetime range to filter items by.
datetime (Optional[str]): The datetime range to filter items by.
limit (int): The maximum number of items to return. The default value is 10.
token (str): A token used for pagination.
request (Request): The incoming request.
@@ -426,39 +426,50 @@ def _return_date(

return result

def _format_datetime_range(self, date_tuple: DateTimeType) -> str:
def _format_datetime_range(self, date_str: str) -> str:
"""
Convert a tuple of datetime objects or None into a formatted string for API requests.
Convert a datetime range string into a normalized UTC string for API requests using rfc3339_str_to_datetime.

Args:
date_tuple (tuple): A tuple containing two elements, each can be a datetime object or None.
date_str (str): A string containing two datetime values separated by a '/'.

Returns:
str: A string formatted as 'YYYY-MM-DDTHH:MM:SS.sssZ/YYYY-MM-DDTHH:MM:SS.sssZ', with '..' used if any element is None.
str: A string formatted as 'YYYY-MM-DDTHH:MM:SSZ/YYYY-MM-DDTHH:MM:SSZ', with '..' used for open or missing bounds.
"""

def format_datetime(dt):
"""Format a single datetime object to the ISO8601 extended format with 'Z'."""
return dt.strftime("%Y-%m-%dT%H:%M:%S.%f")[:-3] + "Z" if dt else ".."

start, end = date_tuple
return f"{format_datetime(start)}/{format_datetime(end)}"
def normalize(dt):
dt = dt.strip()
if not dt or dt == "..":
return ".."
dt_obj = rfc3339_str_to_datetime(dt)
dt_utc = dt_obj.astimezone(timezone.utc)
return dt_utc.strftime("%Y-%m-%dT%H:%M:%SZ")

if not isinstance(date_str, str):
return "../.."
if "/" not in date_str:
return f"{normalize(date_str)}/{normalize(date_str)}"
try:
start, end = date_str.split("/", 1)
except Exception:
return "../.."
return f"{normalize(start)}/{normalize(end)}"

async def get_search(
self,
request: Request,
collections: Optional[List[str]] = None,
ids: Optional[List[str]] = None,
bbox: Optional[BBox] = None,
datetime: Optional[DateTimeType] = None,
datetime: Optional[str] = None,
limit: Optional[int] = 10,
query: Optional[str] = None,
token: Optional[str] = None,
fields: Optional[List[str]] = None,
sortby: Optional[str] = None,
q: Optional[List[str]] = None,
intersects: Optional[str] = None,
filter: Optional[str] = None,
filter_expr: Optional[str] = None,
filter_lang: Optional[str] = None,
**kwargs,
) -> stac_types.ItemCollection:
@@ -468,7 +479,7 @@ async def get_search(
collections (Optional[List[str]]): List of collection IDs to search in.
ids (Optional[List[str]]): List of item IDs to search for.
bbox (Optional[BBox]): Bounding box to search in.
datetime (Optional[DateTimeType]): Filter items based on the datetime field.
datetime (Optional[str]): Filter items based on the datetime field.
limit (Optional[int]): Maximum number of results to return.
query (Optional[str]): Query string to filter the results.
token (Optional[str]): Access token to use when searching the catalog.
@@ -495,7 +506,7 @@
}

if datetime:
base_args["datetime"] = self._format_datetime_range(datetime)
base_args["datetime"] = self._format_datetime_range(date_str=datetime)

if intersects:
base_args["intersects"] = orjson.loads(unquote_plus(intersects))
@@ -506,12 +517,12 @@
for sort in sortby
]

if filter:
base_args["filter-lang"] = "cql2-json"
if filter_expr:
base_args["filter_lang"] = "cql2-json"
base_args["filter"] = orjson.loads(
unquote_plus(filter)
unquote_plus(filter_expr)
if filter_lang == "cql2-json"
else to_cql2(parse_cql2_text(filter))
else to_cql2(parse_cql2_text(filter_expr))
)

if fields:
@@ -593,8 +604,8 @@ async def post_search(
)

# only cql2_json is supported here
if hasattr(search_request, "filter"):
cql2_filter = getattr(search_request, "filter", None)
if hasattr(search_request, "filter_expr"):
cql2_filter = getattr(search_request, "filter_expr", None)
try:
search = self.database.apply_cql2_filter(search, cql2_filter)
except Exception as e:
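
The core.py changes above switch the `datetime` parameter from a parsed `DateTimeType` tuple to the raw interval string, so `_format_datetime_range` now parses and normalizes the value itself via `rfc3339_str_to_datetime`. A minimal standalone sketch of that behavior (the function name and the sample values are illustrative, not taken from the repository):

    from datetime import timezone

    from stac_fastapi.types.rfc3339 import rfc3339_str_to_datetime


    def format_datetime_range(date_str: str) -> str:
        """Normalize an RFC 3339 interval into UTC 'Z' timestamps, keeping '..'
        for open ends -- mirrors the new _format_datetime_range logic above."""

        def normalize(value: str) -> str:
            value = value.strip()
            if not value or value == "..":
                return ".."
            # Parse the RFC 3339 string, convert to UTC, and drop the offset.
            return (
                rfc3339_str_to_datetime(value)
                .astimezone(timezone.utc)
                .strftime("%Y-%m-%dT%H:%M:%SZ")
            )

        if "/" not in date_str:
            return f"{normalize(date_str)}/{normalize(date_str)}"
        start, end = date_str.split("/", 1)
        return f"{normalize(start)}/{normalize(end)}"


    # Illustrative input/output:
    print(format_datetime_range("2020-02-12T12:30:22+01:00/.."))
    # -> 2020-02-12T11:30:22Z/..
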
16 changes: 8 additions & 8 deletions stac_fastapi/core/stac_fastapi/core/extensions/aggregation.py
@@ -338,7 +338,7 @@ async def aggregate(
datetime: Optional[DateTimeType] = None,
intersects: Optional[str] = None,
filter_lang: Optional[str] = None,
filter: Optional[str] = None,
filter_expr: Optional[str] = None,
aggregations: Optional[str] = None,
ids: Optional[List[str]] = None,
bbox: Optional[BBox] = None,
@@ -380,18 +380,18 @@
if datetime:
base_args["datetime"] = self._format_datetime_range(datetime)

if filter:
base_args["filter"] = self.get_filter(filter, filter_lang)
if filter_expr:
base_args["filter"] = self.get_filter(filter_expr, filter_lang)
aggregate_request = EsAggregationExtensionPostRequest(**base_args)
else:
# Workaround for optional path param in POST requests
if "collections" in path:
collection_id = path.split("/")[2]

filter_lang = "cql2-json"
if aggregate_request.filter:
aggregate_request.filter = self.get_filter(
aggregate_request.filter, filter_lang
if aggregate_request.filter_expr:
aggregate_request.filter_expr = self.get_filter(
aggregate_request.filter_expr, filter_lang
)

if collection_id:
@@ -465,10 +465,10 @@ async def aggregate(
detail=f"Aggregation {agg_name} not supported at catalog level",
)

if aggregate_request.filter:
if aggregate_request.filter_expr:
try:
search = self.database.apply_cql2_filter(
search, aggregate_request.filter
search, aggregate_request.filter_expr
)
except Exception as e:
raise HTTPException(
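
The aggregation changes track the stac-fastapi 5.x rename of the request attribute from `filter` to `filter_expr`. Where downstream code needs to tolerate request models from either major version, a small compatibility helper along these lines could be used (a hypothetical sketch, not part of this PR, which simply adopts `filter_expr` everywhere):

    from typing import Any, Optional


    def get_cql2_filter(search_request: Any) -> Optional[dict]:
        """Return the CQL2-JSON filter from a search request, preferring the
        stac-fastapi 5.x attribute name and falling back to the pre-5.x one."""
        for attr in ("filter_expr", "filter"):
            value = getattr(search_request, attr, None)
            if value is not None:
                return value
        return None
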
2 changes: 1 addition & 1 deletion stac_fastapi/core/stac_fastapi/core/version.py
@@ -1,2 +1,2 @@
"""library version."""
__version__ = "3.2.5"
__version__ = "4.0.0a0"
2 changes: 1 addition & 1 deletion stac_fastapi/elasticsearch/setup.py
@@ -6,7 +6,7 @@
desc = f.read()

install_requires = [
"stac-fastapi.core==3.2.5",
"stac-fastapi.core==4.0.0a0",
"elasticsearch[async]==8.11.0",
"elasticsearch-dsl==8.11.0",
"uvicorn",
2 changes: 1 addition & 1 deletion stac_fastapi/elasticsearch/stac_fastapi/elasticsearch/version.py
@@ -1,2 +1,2 @@
"""library version."""
__version__ = "3.2.5"
__version__ = "4.0.0a0"
2 changes: 1 addition & 1 deletion stac_fastapi/opensearch/setup.py
@@ -6,7 +6,7 @@
desc = f.read()

install_requires = [
"stac-fastapi.core==3.2.5",
"stac-fastapi.core==4.0.0a0",
"opensearch-py==2.4.2",
"opensearch-py[async]==2.4.2",
"uvicorn",
2 changes: 1 addition & 1 deletion stac_fastapi/opensearch/stac_fastapi/opensearch/version.py
@@ -1,2 +1,2 @@
"""library version."""
__version__ = "3.2.5"
__version__ = "4.0.0a0"
5 changes: 1 addition & 4 deletions stac_fastapi/tests/resources/test_item.py
@@ -2,7 +2,7 @@
import os
import uuid
from copy import deepcopy
from datetime import datetime, timedelta, timezone
from datetime import datetime, timedelta
from random import randint
from urllib.parse import parse_qs, urlparse, urlsplit

@@ -478,13 +478,10 @@ async def test_item_search_temporal_window_timezone_get(
app_client, ctx, load_test_data
):
"""Test GET search with spatio-temporal query ending with Zulu and pagination(core)"""
tzinfo = timezone(timedelta(hours=1))
test_item = load_test_data("test_item.json")
item_date = rfc3339_str_to_datetime(test_item["properties"]["datetime"])
item_date_before = item_date - timedelta(seconds=1)
item_date_before = item_date_before.replace(tzinfo=tzinfo)
item_date_after = item_date + timedelta(seconds=1)
item_date_after = item_date_after.replace(tzinfo=tzinfo)
Comment on lines -481 to -487
Collaborator:

If we wanted to keep the timezone info in the test, we could use `astimezone` instead of `replace`. But I think it's okay to leave it out.

    tzinfo = timezone(timedelta(hours=1))
    test_item = load_test_data("test_item.json")
    item_date = rfc3339_str_to_datetime(test_item["properties"]["datetime"])
    item_date_before = item_date - timedelta(seconds=1)
    item_date_before = item_date_before.astimezone(tzinfo)
    item_date_after = item_date + timedelta(seconds=1)
    item_date_after = item_date_after.astimezone(tzinfo)

`astimezone` converts the value to the target timezone (the wall-clock time changes but the instant does not), while `replace` just swaps the tzinfo without adjusting the time.
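
For illustration, a self-contained snippet (the values are made up, not taken from the test data) showing the difference described above:

    from datetime import datetime, timedelta, timezone

    utc_time = datetime(2020, 2, 12, 12, 30, 22, tzinfo=timezone.utc)
    plus_one = timezone(timedelta(hours=1))

    # astimezone converts the instant: the wall-clock time shifts by +1 hour,
    # but the result still represents the same moment in time.
    converted = utc_time.astimezone(plus_one)
    print(converted.isoformat())   # 2020-02-12T13:30:22+01:00
    print(converted == utc_time)   # True  (same instant)

    # replace only swaps the tzinfo: the wall-clock time is unchanged,
    # so the value now represents a different (earlier) instant.
    relabeled = utc_time.replace(tzinfo=plus_one)
    print(relabeled.isoformat())   # 2020-02-12T12:30:22+01:00
    print(relabeled == utc_time)   # False (one hour off)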

Collaborator (author):

I am getting errors in the test. Maybe we should add an issue for more comprehensive tests related to datetime parsing, which is something I don't know too well.


params = {
"collections": test_item["collection"],