
Add put endpoint for editing site #88


Merged
merged 94 commits into from
Apr 23, 2023
Commits (94)
a9fb439
Merge pull request #55 from openclimatefix/clearsky-forecast-endpoint
peterdudfield Mar 24, 2023
01a5e21
Bump version: 0.0.36 → 0.0.37
Mar 24, 2023
17341a8
TDD: add failing test, looking at forecasts in the future
peterdudfield Mar 27, 2023
8495325
add start utc filter on forecast future qery
peterdudfield Mar 27, 2023
50d8f67
lint
peterdudfield Mar 27, 2023
b4c83ac
lint
peterdudfield Mar 27, 2023
b806ea9
isort
peterdudfield Mar 27, 2023
de2de3e
add freeze gun to dev dependencies
peterdudfield Mar 27, 2023
f0e1800
self PR comments
peterdudfield Mar 27, 2023
d4e397d
Merge pull request #64 from openclimatefix/issue/future-forecasts
peterdudfield Mar 28, 2023
25904bb
Bump version: 0.0.37 → 0.0.38
Mar 28, 2023
3157e67
Make site_uuid field on PVSiteMetadata model optional
AndrewLester Mar 29, 2023
585bb2d
Make pydantic field optional
AndrewLester Mar 29, 2023
48e08ff
Move test site with real UUID to real db test
AndrewLester Mar 29, 2023
8520566
Don't use uuid in tests
AndrewLester Mar 29, 2023
d6f59e5
Add part of check back
AndrewLester Mar 29, 2023
89910a0
Ensure autogenerated site uuid is not null in database
AndrewLester Mar 30, 2023
c93f46f
Add WHERE statement on query for past forecasts
simlmx Mar 30, 2023
f5bb648
Merge pull request #66 from openclimatefix/optimize-forecast-query
peterdudfield Mar 31, 2023
ceeecd0
Bump version: 0.0.38 → 0.0.39
Mar 31, 2023
dacdf29
Merge pull request #65 from openclimatefix/db-id-generation
peterdudfield Mar 31, 2023
db98579
Bump version: 0.0.39 → 0.0.40
Mar 31, 2023
15afb03
add logging
peterdudfield Mar 31, 2023
47f8fa3
Merge commit 'db98579aa877a5b8fd906477921a627f28b76569' into issue/ad…
peterdudfield Mar 31, 2023
f1b4d71
lint
peterdudfield Mar 31, 2023
920e5c9
run blacks
peterdudfield Mar 31, 2023
d357c2c
add extra logs
peterdudfield Mar 31, 2023
2761085
Merge pull request #67 from openclimatefix/issue/add-logging
peterdudfield Mar 31, 2023
7a4586a
Bump version: 0.0.40 → 0.0.41
Mar 31, 2023
b0c61f4
add structlogging
peterdudfield Mar 31, 2023
6ef9e75
add structlog to requiremwnts
peterdudfield Mar 31, 2023
fdbb441
add logging
peterdudfield Mar 31, 2023
cae5329
Merge pull request #69 from openclimatefix/issue/add-struct-logs
peterdudfield Mar 31, 2023
e0815c6
Bump version: 0.0.41 → 0.0.42
Mar 31, 2023
3e2744e
Merge remote-tracking branch 'origin/main' into h4i/enode
AndrewLester Apr 11, 2023
d0d4fa1
Add structlog initialisation
devsjc Apr 11, 2023
8975181
blacks
peterdudfield Apr 11, 2023
65bef9e
isort
peterdudfield Apr 11, 2023
cc4037a
Merge pull request #76 from openclimatefix/structlog-fix
peterdudfield Apr 11, 2023
a49fd34
Bump version: 0.0.42 → 0.0.43
Apr 11, 2023
a916dff
Uniformize check to FAKE and fix related test
simlmx Apr 11, 2023
187e0db
Set codecov targets
simlmx Apr 12, 2023
4307066
Hard-code the "now" time for all tests
simlmx Apr 12, 2023
f9b1b60
add caching
peterdudfield Apr 13, 2023
6d8b0e2
fix
peterdudfield Apr 13, 2023
595947c
lint
peterdudfield Apr 13, 2023
707862e
isort
peterdudfield Apr 13, 2023
2f77710
fix for routes calling routes
peterdudfield Apr 13, 2023
0d1c782
lint
peterdudfield Apr 13, 2023
b9414b4
Bump version: 0.0.43 → 0.0.44
Apr 13, 2023
316fc31
PR comment
peterdudfield Apr 13, 2023
429646d
add args to cache
peterdudfield Apr 13, 2023
5456c1a
lint
peterdudfield Apr 13, 2023
d8ee3ef
Update pv_site_api/cache.py
peterdudfield Apr 13, 2023
38a4e2f
tidy
peterdudfield Apr 13, 2023
c505c55
Merge pull request #78 from openclimatefix/caching
peterdudfield Apr 13, 2023
b1011e4
Bump version: 0.0.44 → 0.0.45
Apr 13, 2023
32496fb
What I've got
AndrewLester Apr 17, 2023
c4a0a2b
Add basic authorization to the /sites* routes
simlmx Apr 17, 2023
4190a01
refactor
ericcccsliu Apr 17, 2023
ed017b7
add tests
ericcccsliu Apr 17, 2023
60c4a7a
fix error in import
ericcccsliu Apr 17, 2023
a9bac8c
merge in main
ericcccsliu Apr 17, 2023
e0d85ab
update pvsite-datamodel (in line with main)
ericcccsliu Apr 17, 2023
7232d67
fix bug
ericcccsliu Apr 17, 2023
806408a
fix incompatible types
ericcccsliu Apr 17, 2023
347a0ad
lint and format
ericcccsliu Apr 17, 2023
d39d081
add site existence check
ericcccsliu Apr 18, 2023
b721b5c
add back correct datamodel dependency
ericcccsliu Apr 18, 2023
c87fb1c
fix test name
ericcccsliu Apr 18, 2023
6fcf5b2
allow tests to pass by adding fake condition
ericcccsliu Apr 18, 2023
f94837b
remove 404 check
ericcccsliu Apr 18, 2023
d43fa45
use or instead of is not none
ericcccsliu Apr 18, 2023
1ee3196
Bump version: 0.0.45 → 0.0.46
Apr 19, 2023
1beb448
regenerate lock file to resolve conflicts
ericcccsliu Apr 19, 2023
fc0f3a3
add site existence check
ericcccsliu Apr 19, 2023
5eecace
format and lint
ericcccsliu Apr 19, 2023
3e63ffe
Merge pull request #84 from openclimatefix/al/multiple-clearsky
peterdudfield Apr 20, 2023
e8fae44
Bump version: 0.0.46 → 0.0.47
Apr 20, 2023
7b67556
merge in main
ericcccsliu Apr 22, 2023
3c7c3ba
add post endpoint
ericcccsliu Apr 22, 2023
cfe6aa6
add test back in
ericcccsliu Apr 22, 2023
1600aac
fix errors and format
ericcccsliu Apr 22, 2023
74d59dd
add site name to test
ericcccsliu Apr 23, 2023
8a7b6c4
fix tiny error
ericcccsliu Apr 23, 2023
0cc2a45
fix another small error
ericcccsliu Apr 23, 2023
fccdeb1
run tests
ericcccsliu Apr 23, 2023
5da0e67
fix bugs
ericcccsliu Apr 23, 2023
2afc24e
lint and format
ericcccsliu Apr 23, 2023
300f468
fix fake
ericcccsliu Apr 23, 2023
de944b7
:pleading-face:
ericcccsliu Apr 23, 2023
24ac3fe
merge in h4i/enode
ericcccsliu Apr 23, 2023
32478d2
yet another is_fake() addition
ericcccsliu Apr 23, 2023
9ed54b5
address comments
ericcccsliu Apr 23, 2023
2 changes: 1 addition & 1 deletion .bumpversion.cfg
Original file line number Diff line number Diff line change
@@ -1,7 +1,7 @@
[bumpversion]
commit = True
tag = True
current_version = 0.0.36
current_version = 0.0.47

[bumpversion:file:pv_site_api/__init__.py]
search = __version__ = "{current_version}"
8 changes: 8 additions & 0 deletions codecov.yml
@@ -0,0 +1,8 @@
coverage:
status:
project:
default:
target: 90%
patch:
default:
target: 90%
187 changes: 186 additions & 1 deletion poetry.lock

Large diffs are not rendered by default.

24 changes: 23 additions & 1 deletion pv_site_api/__init__.py
@@ -1,3 +1,25 @@
"""pv_site_api package"""

__version__ = "0.0.36"
import structlog

__version__ = "0.0.47"

# Add required processors and formatters to structlog
structlog.configure(
processors=[
structlog.processors.EventRenamer("message", replace_by="_event"),
structlog.stdlib.PositionalArgumentsFormatter(),
structlog.processors.CallsiteParameterAdder(
[
structlog.processors.CallsiteParameter.FILENAME,
structlog.processors.CallsiteParameter.LINENO,
],
),
structlog.processors.dict_tracebacks,
structlog.stdlib.add_log_level,
structlog.processors.TimeStamper(fmt="iso"),
structlog.processors.StackInfoRenderer(),
structlog.processors.format_exc_info,
structlog.processors.JSONRenderer(sort_keys=True),
],
)
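For readers unfamiliar with structlog, the pipeline configured above (rename the event to `message`, attach callsite filename/lineno, add the level, ISO-timestamp, and render sorted-key JSON) can be approximated with a stdlib-only formatter. This is an illustrative analogue, not part of `pv_site_api` — the `JsonFormatter` name and exact field set are assumptions:

```python
# Stdlib-only sketch approximating the structlog processor chain above.
# Each payload field maps to one of the configured processors.
import json
import logging
from datetime import datetime, timezone


class JsonFormatter(logging.Formatter):
    def format(self, record: logging.LogRecord) -> str:
        payload = {
            "message": record.getMessage(),       # EventRenamer("message")
            "level": record.levelname.lower(),    # stdlib.add_log_level
            "filename": record.filename,          # CallsiteParameterAdder(FILENAME)
            "lineno": record.lineno,              # CallsiteParameterAdder(LINENO)
            "timestamp": datetime.now(timezone.utc).isoformat(),  # TimeStamper(fmt="iso")
        }
        return json.dumps(payload, sort_keys=True)  # JSONRenderer(sort_keys=True)
```

Attaching this formatter to a `logging.StreamHandler` yields one JSON object per log line, which is roughly what the structlog configuration emits.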
51 changes: 42 additions & 9 deletions pv_site_api/_db_helpers.py
@@ -9,9 +9,10 @@
import datetime as dt
import uuid
from collections import defaultdict
from typing import Any
from typing import Any, Optional

import sqlalchemy as sa
import structlog
from pvsite_datamodel.read.generation import get_pv_generation_by_sites
from pvsite_datamodel.sqlmodels import ForecastSQL, ForecastValueSQL, InverterSQL, SiteSQL
from sqlalchemy.orm import Session, aliased
@@ -24,6 +25,9 @@
SiteForecastValues,
)

logger = structlog.stdlib.get_logger()


# Sqlalchemy rows are tricky to type: we use this to make the code more readable.
Row = Any

@@ -46,6 +50,7 @@ def _get_forecasts_for_horizon(
.where(ForecastSQL.site_uuid.in_(site_uuids))
# Also filtering on `timestamp_utc` makes the query faster.
.where(ForecastSQL.timestamp_utc >= start_utc - dt.timedelta(minutes=horizon_minutes))
.where(ForecastSQL.timestamp_utc < end_utc)
.where(ForecastValueSQL.horizon_minutes == horizon_minutes)
.where(ForecastValueSQL.start_utc >= start_utc)
.where(ForecastValueSQL.start_utc < end_utc)
@@ -61,7 +66,9 @@ def _get_inverters_by_site(session: Session, site_uuid: str) -> list[Row]:
return query.all()


def _get_latest_forecast_by_sites(session: Session, site_uuids: list[str]) -> list[Row]:
def _get_latest_forecast_by_sites(
session: Session, site_uuids: list[str], start_utc: Optional[dt.datetime] = None
) -> list[Row]:
"""Get the latest forecast for given site uuids."""
# Get the latest forecast for each site.
subquery = (
@@ -77,11 +84,15 @@ def _get_latest_forecast_by_sites(session: Session, site_uuids: list[str]) -> li
forecast_subq = aliased(ForecastSQL, subquery, name="ForecastSQL")

# Join the forecast values.
query = (
session.query(forecast_subq, ForecastValueSQL)
.join(ForecastValueSQL)
.order_by(forecast_subq.timestamp_utc, ForecastValueSQL.start_utc)
)
query = session.query(forecast_subq, ForecastValueSQL)
query = query.join(ForecastValueSQL)

# only get future forecast values. This solves the case when a forecast is made 1 day a go,
# but since then, no new forecast have been made
if start_utc is not None:
query = query.filter(ForecastValueSQL.start_utc >= start_utc)

query.order_by(forecast_subq.timestamp_utc, ForecastValueSQL.start_utc)

return query.all()

@@ -141,6 +152,8 @@ def get_forecasts_by_sites(
This is what we show in the UI.
"""

logger.info(f"Getting forecast for {len(site_uuids)} sites")

end_utc = dt.datetime.utcnow()

rows_past = _get_forecasts_for_horizon(
@@ -150,9 +163,16 @@
end_utc=end_utc,
horizon_minutes=horizon_minutes,
)
rows_future = _get_latest_forecast_by_sites(session, site_uuids)
logger.debug("Found %s past forecasts", len(rows_past))

rows_future = _get_latest_forecast_by_sites(
session=session, site_uuids=site_uuids, start_utc=start_utc
)
logger.debug("Found %s future forecasts", len(rows_future))

logger.debug("Formatting forecasts to pydantic objects")
forecasts = _forecast_rows_to_pydantic(rows_past + rows_future)
logger.debug("Formatting forecasts to pydantic objects: done")

return forecasts

@@ -161,13 +181,16 @@ def get_generation_by_sites(
session: Session, site_uuids: list[str], start_utc: dt.datetime
) -> list[MultiplePVActual]:
"""Get the generation since yesterday (midnight) for a list of sites."""
logger.info(f"Getting generation for {len(site_uuids)} sites")
rows = get_pv_generation_by_sites(
session=session, start_utc=start_utc, site_uuids=[uuid.UUID(su) for su in site_uuids]
)

# Go through the rows and split the data by site.
pv_actual_values_per_site: dict[str, list[PVActualValue]] = defaultdict(list)

# TODO can we speed this up?
logger.info("Formatting generation 1")
for row in rows:
site_uuid = str(row.site_uuid)
pv_actual_values_per_site[site_uuid].append(
Expand All @@ -177,11 +200,21 @@ def get_generation_by_sites(
)
)

return [
logger.info("Formatting generation 2")
multiple_pv_actuals = [
MultiplePVActual(site_uuid=site_uuid, pv_actual_values=pv_actual_values)
for site_uuid, pv_actual_values in pv_actual_values_per_site.items()
]

logger.debug("Getting generation for {len(site_uuids)} sites: done")
return multiple_pv_actuals


def get_sites_by_uuids(session: Session, site_uuids: list[str]) -> list[PVSiteMetadata]:
sites = session.query(SiteSQL).where(SiteSQL.site_uuid.in_(site_uuids)).all()
pydantic_sites = [site_to_pydantic(site) for site in sites]
return pydantic_sites


def site_to_pydantic(site: SiteSQL) -> PVSiteMetadata:
"""Converts a SiteSQL object into a PVSiteMetadata object."""
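The per-site grouping step in `get_generation_by_sites` (the `defaultdict` loop flagged with `# TODO can we speed this up?`) can be sketched in isolation with plain dictionaries. The `Row` tuple and its field names below are illustrative stand-ins for the SQLAlchemy rows, not the real datamodel:

```python
# Minimal sketch of the grouping step: flatten
# (site_uuid, datetime_utc, pv_power_kw) rows into one list per site.
from collections import defaultdict
from typing import NamedTuple


class Row(NamedTuple):
    site_uuid: str
    datetime_utc: str
    pv_power_kw: float


def group_generation_by_site(rows: list[Row]) -> dict[str, list[dict]]:
    per_site: dict[str, list[dict]] = defaultdict(list)
    for row in rows:
        # One append per row, preserving the input (chronological) order per site.
        per_site[str(row.site_uuid)].append(
            {"datetime_utc": row.datetime_utc, "pv_power_kw": row.pv_power_kw}
        )
    return dict(per_site)
```

This is O(n) in the number of rows; the real function then wraps each per-site list in a `MultiplePVActual` pydantic model.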
37 changes: 37 additions & 0 deletions pv_site_api/auth.py
@@ -0,0 +1,37 @@
import jwt
from fastapi import Depends, HTTPException
from fastapi.security import HTTPAuthorizationCredentials, HTTPBearer

token_auth_scheme = HTTPBearer()


class Auth:
"""Fast api dependency that validates an JWT token."""

def __init__(self, domain: str, api_audience: str, algorithm: str):
self._domain = domain
self._api_audience = api_audience
self._algorithm = algorithm

self._jwks_client = jwt.PyJWKClient(f"https://{domain}/.well-known/jwks.json")

def __call__(self, auth_credentials: HTTPAuthorizationCredentials = Depends(token_auth_scheme)):
token = auth_credentials.credentials

try:
signing_key = self._jwks_client.get_signing_key_from_jwt(token).key
except (jwt.exceptions.PyJWKClientError, jwt.exceptions.DecodeError) as e:
raise HTTPException(status_code=401, detail=str(e))

try:
payload = jwt.decode(
token,
signing_key,
algorithms=self._algorithm,
audience=self._api_audience,
issuer=f"https://{self._domain}/",
)
except Exception as e:
raise HTTPException(status_code=401, detail=str(e))

return payload
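The core idea behind `Auth` — split the compact JWT, verify the signature, then trust the payload — can be shown with a stdlib-only HS256 sketch. The real class uses PyJWT with RS256 keys fetched from the Auth0 JWKS endpoint; the shared secret and helper names here are illustrative assumptions:

```python
# Stdlib sketch of JWT verification: recompute the HS256 signature over
# "<header>.<payload>" and compare in constant time before decoding the claims.
import base64
import hashlib
import hmac
import json


def _b64url_decode(part: str) -> bytes:
    # JWT segments drop base64 padding; restore it before decoding.
    return base64.urlsafe_b64decode(part + "=" * (-len(part) % 4))


def verify_hs256(token: str, secret: bytes) -> dict:
    header_b64, payload_b64, sig_b64 = token.split(".")
    expected = hmac.new(
        secret, f"{header_b64}.{payload_b64}".encode(), hashlib.sha256
    ).digest()
    if not hmac.compare_digest(expected, _b64url_decode(sig_b64)):
        # The FastAPI dependency raises HTTPException(status_code=401) here.
        raise ValueError("invalid signature")
    return json.loads(_b64url_decode(payload_b64))
```

Note the sketch omits the audience/issuer/expiry checks that `jwt.decode` performs in the real dependency.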
77 changes: 77 additions & 0 deletions pv_site_api/cache.py
@@ -0,0 +1,77 @@
""" Caching utils for api"""
import json
import os
from datetime import datetime, timedelta, timezone
from functools import wraps

import structlog

logger = structlog.stdlib.get_logger()

CACHE_TIME_SECONDS = 120
cache_time_seconds = int(os.getenv("CACHE_TIME_SECONDS", CACHE_TIME_SECONDS))


def cache_response(func):
"""
Decorator that caches the response of a FastAPI async function.

Example:
```
app = FastAPI()

@app.get("/")
@cache_response
async def example():
return {"message": "Hello World"}
```
"""
response = {}
last_updated = {}

@wraps(func)
def wrapper(*args, **kwargs): # noqa
nonlocal response
nonlocal last_updated

# get the variables that go into the route
# we don't want to use the cache for different variables
route_variables = kwargs.copy()

# drop session and user
for var in ["session", "user"]:
if var in route_variables:
route_variables.pop(var)

# make into string
route_variables = json.dumps(route_variables)
args_as_json = json.dumps(args)
function_name = func.__name__
key = f"{function_name}_{args_as_json}_{route_variables}"

# seeing if we need to run the function
now = datetime.now(tz=timezone.utc)
last_updated_datetime = last_updated.get(key)
refresh_cache = (last_updated_datetime is None) or (
now - timedelta(seconds=cache_time_seconds) > last_updated_datetime
)

# check if it's been called before
if last_updated_datetime is None:
logger.debug(f"First time this is route run for {key}")

# re-run if cache time out is up
elif refresh_cache:
logger.debug(f"Not using cache as longer than {cache_time_seconds} seconds for {key}")

if refresh_cache:
# calling function
response[key] = func(*args, **kwargs)
last_updated[key] = now
return response[key]
else:
# use cache
logger.debug(f"Using cache route {key}")
return response[key]

return wrapper
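Stripped of the FastAPI specifics (dropping `session`/`user` from the key, the env-var override), the decorator above reduces to a small stdlib TTL cache keyed on function name plus serialized arguments. A minimal sketch, assuming the 120-second default from `cache.py`:

```python
# Minimal stdlib reduction of cache_response: one cached value per
# (function, args, kwargs) key, recomputed once the TTL has elapsed.
import json
import time
from functools import wraps

CACHE_TIME_SECONDS = 120  # default mirrored from cache.py


def cache_response(func, ttl: float = CACHE_TIME_SECONDS):
    response: dict = {}
    last_updated: dict = {}

    @wraps(func)
    def wrapper(*args, **kwargs):
        # Different arguments must not share a cache entry.
        key = f"{func.__name__}_{json.dumps(args)}_{json.dumps(kwargs, sort_keys=True)}"
        now = time.monotonic()
        stale = key not in last_updated or now - last_updated[key] > ttl
        if stale:
            response[key] = func(*args, **kwargs)  # call through and cache
            last_updated[key] = now
        return response[key]

    return wrapper
```

As in the original, the cache lives in closure variables, so it is per-decorated-function and per-process; a multi-worker deployment would hold one cache per worker.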