diff --git a/.github/SCREENSHOTS/damp.png b/.github/SCREENSHOTS/damp.png
index 3a27fc2a..745ea3bc 100644
Binary files a/.github/SCREENSHOTS/damp.png and b/.github/SCREENSHOTS/damp.png differ
diff --git a/.gitignore b/.gitignore
index d0d0cd07..17b2e6eb 100644
--- a/.gitignore
+++ b/.gitignore
@@ -1,4 +1,5 @@
.DS_Store
.vscode/settings.json
.vscode
+.vs
__pycache__
\ No newline at end of file
diff --git a/README.md b/README.md
index 1c041a1a..626aedff 100644
--- a/README.md
+++ b/README.md
@@ -42,7 +42,7 @@ This custom component integrates the Solcast Hobby PV Forecast API into Home Ass
> [!NOTE]
> Solcast have altered their API limits for new account creators
>
-> Solcast now only offer new account creators a quota of 10 API calls per day (used to be 50).
+> Solcast now only offer new account creators a limit of 10 API calls per day (used to be 50).
> Old account users still have 50 API calls.
>
> The integration no longer includes automatic API polling. Users now need to create their own automations to call the Solcast update service to poll for new data. Keep in mind your API poll limit.
@@ -52,6 +52,8 @@ Sign up for an API key (https://solcast.com/).
> Solcast may take up to 24hrs to create the account.
+Configure your rooftop sites correctly at `solcast.com`.
+
Copy the API Key for use with this integration (See [Configuration](#Configuration) below).
## Installation
@@ -127,7 +129,7 @@ You probably **do not** want to do this! Use the HACS method above unless you kn
[Set up a new integration screenshot](https://github.com/BJReplay/ha-solcast-solar/blob/main/.github/SCREENSHOTS/Setupanewintegration.png)
-1. Enter your `Solcast API Key`, `API quota` and click `Submit`. If you have more than one Solcast account because you have more than two rooftop setups, enter both account API keys separated by a comma `xxxxxxxx-xxxxx-xxxx,yyyyyyyy-yyyyy-yyyy` (_NB: this goes against Solcast T&C's by having more than one account_). If the API quota is the same for multiple accounts then enter a single value, or both values separated by a comma.
+1. Enter your `Solcast API Key` and `API limit`, then click `Submit`. If you have more than one Solcast account because you have more than two rooftop setups, enter both account API keys separated by a comma, `xxxxxxxx-xxxxx-xxxx,yyyyyyyy-yyyyy-yyyy` (_NB: having more than one account goes against Solcast's T&Cs_). If the API limit is the same for multiple accounts then enter a single value, or both values separated by a comma.
1. Create your own automation to call the service `solcast_solar.update_forecasts` at the times you would like to update the solar forecast (a minimal example automation is sketched after this list).
1. Set up HA Energy Dashboard settings.
1. To change other configuration options after installation, select the integration in `Devices & services` then `CONFIGURE`.
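+
+A minimal sketch of such an update automation is shown below. The trigger times are illustrative only; pick times that suit your setup and stay within your daily API limit.
+
+```yaml
+alias: Solcast update
+trigger:
+  - platform: time
+    at: "10:00:00"
+  - platform: time
+    at: "16:00:00"
+action:
+  - service: solcast_solar.update_forecasts
+mode: single
+```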
@@ -243,7 +245,7 @@ mode: single
>
> Log capture instructions are in the Bug Issue Template - you will see them if you start creating a new issue - make sure you include these logs if you want the assistance of the repository contributors.
>
-> An example of busy messages and a successful retry are shown below (with debug logging enabled). In this case there is no issue, as the retry succeeds. Should five consecutive attempts fail, then the forecast retrieval will end with an `ERROR`. If that happens, manually trigger another `solcast_solar.update_forecasts` service call, or wait for your next scheduled automation run.
+> An example of busy messages and a successful retry are shown below (with debug logging enabled). In this case there is no issue, as the retry succeeds. Should ten consecutive attempts fail, then the forecast retrieval will end with an `ERROR`. If that happens, manually trigger another `solcast_solar.update_forecasts` service call, or wait for your next scheduled automation run.
>
> Should the sites data load at integration startup fail with 429/Too busy, the integration cannot start correctly, and it will retry continuously.
@@ -420,7 +422,8 @@ The following YAML produces a graph of today's PV generation, PV forecast and PV
Customise with appropriate Home Assistant sensors for today's total solar generation and solar panel PV power output.
-> [!NOTE] The chart assumes that your Solar PV sensors are in kW, but if some are in W, add the line `transform: "return x / 1000;"` under the entity id to convert the sensor value to kW.
+> [!NOTE]
+> The chart assumes that your Solar PV sensors are in kW, but if some are in W, add the line `transform: "return x / 1000;"` under the entity id to convert the sensor value to kW.
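+
+For instance, a series entry for a sensor that reports watts might look like this sketch (the entity ID is illustrative):
+
+```yaml
+  - entity: sensor.inverter_pv_power
+    name: PV Power
+    transform: "return x / 1000;"
+```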
### Reveal code
Click here
@@ -547,10 +550,25 @@ series:
## Known issues
-If a hard limit or dampening factors are set then the individual sites breakdown attributes will not be limited by these factors. The only way to implement this would be to have separate hard limits and dampening factors for each site, and this would become overly complex.
+* Code was added that checked whether the integration had been down for a long time and, if so, automatically polled Solcast on the next start. This feature is currently broken because Solcast removed the API call that reported current usage.
+* If a hard limit or dampening factors are set, the individual site breakdown attributes are not limited by these factors. The only way to implement this would be separate hard limits and dampening factors for each site, which would become overly complex.
## Changes
+v4.1.4
+* Update Polish translation by @home409ca
+* Rename integration in HACS to Solcast PV Forecast by @BJReplay
+* Reduce aiofiles version requirement to >=23.2.0 by @autoSteve
+* Configuration dialog improvements by @autoSteve
+* Misc translation updates by @autoSteve
+* Refactor moment and remaining spline build by @autoSteve
+* Prevent negative forecast for X hour sensor by @autoSteve
+* Suppress spline bounce for reducing spline by @autoSteve
+* More careful serialisation of solcast.json by @autoSteve
+* Extensive code clean-up by @autoSteve
+
+Full Changelog: https://github.com/BJReplay/ha-solcast-solar/compare/v4.1.3...v4.1.4
+
v4.1.3
* Accommodate the removal of API call GetUserUsageAllowance by @autoSteve
* Halve retry delays by @autoSteve
@@ -585,9 +603,9 @@ v4.1
* @autoSteve is welcomed as a CodeOwner
* It is now apparent that it is unlikely that this repo will be added as a default repo in HACS until HACS 2.0 is out, so the installation instructions make it clear that adding via the Manual Repository flow is the preferred approach, and new instructions have been added to show how to do this.
-Release Changelog: https://github.com/BJReplay/ha-solcast-solar/compare/v4.0.31...v4.1
+Release Changelog: https://github.com/BJReplay/ha-solcast-solar/compare/v4.0.31...v4.1.0
-Most Recent Changes: https://github.com/BJReplay/ha-solcast-solar/compare/v4.0.43...v4.1
+Most Recent Changes: https://github.com/BJReplay/ha-solcast-solar/compare/v4.0.43...v4.1.0
### Prior changes
Click here for changes back to v3.0
diff --git a/custom_components/solcast_solar/__init__.py b/custom_components/solcast_solar/__init__.py
index 0042241d..d7a3e1a9 100644
--- a/custom_components/solcast_solar/__init__.py
+++ b/custom_components/solcast_solar/__init__.py
@@ -1,14 +1,17 @@
"""Support for Solcast PV forecast."""
+# pylint: disable=C0304, C0321, E0401, E1135, W0613, W0702, W0718
+
import logging
import traceback
import random
import os
import json
-import aiofiles
-import os.path as path
from datetime import timedelta
+from typing import Final
+import aiofiles
+import voluptuous as vol
from homeassistant import loader
from homeassistant.config_entries import ConfigEntry
from homeassistant.const import CONF_API_KEY, Platform
@@ -45,10 +48,6 @@
from .coordinator import SolcastUpdateCoordinator
from .solcastapi import ConnectionOptions, SolcastApi
-from typing import Final
-
-import voluptuous as vol
-
PLATFORMS = [Platform.SENSOR, Platform.SELECT,]
_LOGGER = logging.getLogger(__name__)
@@ -97,7 +96,7 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
# Introduced in core 2024.6.0: async_get_time_zone
try:
- dt_util.async_get_time_zone
+ dt_util.async_get_time_zone # pylint: disable=W0104
asynctz = True
except:
asynctz = False
@@ -110,7 +109,7 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
entry.options[CONF_API_KEY],
entry.options[API_QUOTA],
SOLCAST_URL,
- hass.config.path('%s/solcast.json' % os.path.abspath(os.path.join(os.path.dirname(__file__) ,"../.."))),
+ hass.config.path(f"{os.path.abspath(os.path.join(os.path.dirname(__file__) ,'../..'))}/solcast.json"),
tz,
optdamp,
entry.options.get(CUSTOM_HOUR_SENSOR, 1),
@@ -128,32 +127,33 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
try:
await solcast.sites_data()
- if solcast._sites_loaded: await solcast.sites_usage()
+ if solcast.sites_loaded:
+ await solcast.sites_usage()
except Exception as ex:
raise ConfigEntryNotReady(f"Getting sites data failed: {ex}") from ex
- if not solcast._sites_loaded:
- raise ConfigEntryNotReady(f"Sites data could not be retrieved")
+ if not solcast.sites_loaded:
+ raise ConfigEntryNotReady('Sites data could not be retrieved')
- if not await solcast.load_saved_data():
- raise ConfigEntryNotReady(f"Failed to load initial data from cache or the Solcast API")
+ status = await solcast.load_saved_data()
+ if status != '':
+ raise ConfigEntryNotReady(status)
- _VERSION = ""
try:
+ _VERSION = "" # pylint: disable=C0103
integration = await loader.async_get_integration(hass, DOMAIN)
- _VERSION = str(integration.version)
- _LOGGER.info(
- f"\n{'-'*67}\n"
- f"Solcast integration version: {_VERSION}\n\n"
- f"This is a custom integration. When troubleshooting a problem, after\n"
- f"reviewing open and closed issues, and the discussions, check the\n"
- f"required automation is functioning correctly and try enabling debug\n"
- f"logging to see more. Troubleshooting tips available at:\n"
- f"https://github.com/BJReplay/ha-solcast-solar/discussions/38\n\n"
- f"Beta versions may also have addressed some issues so look at those.\n\n"
- f"If all else fails, then open an issue and our community will try to\n"
- f"help: https://github.com/BJReplay/ha-solcast-solar/issues\n"
- f"{'-'*67}")
+ _VERSION = str(integration.version) # pylint: disable=C0103
+ _LOGGER.info('''\n%s
+Solcast integration version: %s\n
+This is a custom integration. When troubleshooting a problem, after
+reviewing open and closed issues, and the discussions, check the
+required automation is functioning correctly and try enabling debug
+logging to see more. Troubleshooting tips available at:
+https://github.com/BJReplay/ha-solcast-solar/discussions/38\n
+Beta versions may also have addressed some issues so look at those.\n
+If all else fails, then open an issue and our community will try to
+help: https://github.com/BJReplay/ha-solcast-solar/issues
+%s''', '-'*67, _VERSION, '-'*67)
except loader.IntegrationNotFound:
pass
@@ -169,41 +169,38 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
entry.async_on_unload(entry.add_update_listener(async_update_options))
- _LOGGER.debug(f"UTC times are converted to {hass.config.time_zone}")
+ _LOGGER.debug("UTC times are converted to %s", hass.config.time_zone)
if options.hard_limit < 100:
- _LOGGER.info(
- f"Solcast inverter hard limit value has been set. If the forecasts and graphs are not as you expect, try running the service 'solcast_solar.remove_hard_limit' to remove this setting. "
- f"This setting is really only for advanced quirky solar setups."
- )
+ _LOGGER.info("Solcast inverter hard limit value has been set. If the forecasts and graphs are not as you expect, remove this setting")
# If the integration has failed for some time and then is restarted retrieve forecasts
if solcast.get_api_used_count() == 0 and solcast.get_last_updated_datetime() < solcast.get_day_start_utc() - timedelta(days=1):
try:
- _LOGGER.info('Integration has been failed for some time, or your update automation has not been running (see readme). Retrieving forecasts.')
+            _LOGGER.info('Integration has been failing for some time, or your update automation has not been running (see readme), so retrieving forecasts')
#await solcast.sites_weather()
await solcast.http_data(dopast=False)
- coordinator._dataUpdated = True
+ coordinator.set_data_updated(True)
await coordinator.update_integration_listeners()
- coordinator._dataUpdated = False
- except Exception as ex:
- _LOGGER.error("Exception force fetching data on stale start: %s", ex)
+ coordinator.set_data_updated(False)
+ except Exception as e:
+ _LOGGER.error("Exception force fetching data on stale start: %s", e)
_LOGGER.error(traceback.format_exc())
async def handle_service_update_forecast(call: ServiceCall):
"""Handle service call"""
- _LOGGER.info(f"Service call: {SERVICE_UPDATE}")
+ _LOGGER.info("Service call: %s", SERVICE_UPDATE)
await coordinator.service_event_update()
async def handle_service_clear_solcast_data(call: ServiceCall):
"""Handle service call"""
- _LOGGER.info(f"Service call: {SERVICE_CLEAR_DATA}")
+ _LOGGER.info("Service call: %s", SERVICE_CLEAR_DATA)
await coordinator.service_event_delete_old_solcast_json_file()
async def handle_service_get_solcast_data(call: ServiceCall) -> ServiceResponse:
"""Handle service call"""
try:
- _LOGGER.info(f"Service call: {SERVICE_QUERY_FORECAST_DATA}")
+ _LOGGER.info("Service call: %s", SERVICE_QUERY_FORECAST_DATA)
start = call.data.get(EVENT_START_DATETIME, dt_util.now())
end = call.data.get(EVENT_END_DATETIME, dt_util.now())
@@ -220,12 +217,12 @@ async def handle_service_get_solcast_data(call: ServiceCall) -> ServiceResponse:
async def handle_service_set_dampening(call: ServiceCall):
"""Handle service call"""
try:
- _LOGGER.info(f"Service call: {SERVICE_SET_DAMPENING}")
+ _LOGGER.info("Service call: %s", SERVICE_SET_DAMPENING)
factors = call.data.get(DAMP_FACTOR, None)
- if factors == None:
- raise HomeAssistantError(f"Error processing {SERVICE_SET_DAMPENING}: Empty factor string")
+ if factors is None:
+                raise HomeAssistantError(f"Error processing {SERVICE_SET_DAMPENING}: Empty factor string")
else:
factors = factors.strip().replace(" ","")
if len(factors) == 0:
@@ -245,7 +242,7 @@ async def handle_service_set_dampening(call: ServiceCall):
d.update({f"{i}": float(sp[i])})
opt[f"damp{i:02}"] = float(sp[i])
- solcast._damp = d
+ solcast.damp = d
hass.config_entries.async_update_entry(entry, options=opt)
except intent.IntentHandleError as err:
raise HomeAssistantError(f"Error processing {SERVICE_SET_DAMPENING}: {err}") from err
@@ -253,12 +250,12 @@ async def handle_service_set_dampening(call: ServiceCall):
async def handle_service_set_hard_limit(call: ServiceCall):
"""Handle service call"""
try:
- _LOGGER.info(f"Service call: {SERVICE_SET_HARD_LIMIT}")
+ _LOGGER.info("Service call: %s", SERVICE_SET_HARD_LIMIT)
hl = call.data.get(HARD_LIMIT, 100000)
- if hl == None:
+ if hl is None:
raise HomeAssistantError(f"Error processing {SERVICE_SET_HARD_LIMIT}: Empty hard limit value")
else:
val = int(hl)
@@ -267,18 +264,17 @@ async def handle_service_set_hard_limit(call: ServiceCall):
opt = {**entry.options}
opt[HARD_LIMIT] = val
- # solcast._hardlimit = val
hass.config_entries.async_update_entry(entry, options=opt)
- except ValueError:
- raise HomeAssistantError(f"Error processing {SERVICE_SET_HARD_LIMIT}: Hard limit value not a positive number")
+ except ValueError as err:
+ raise HomeAssistantError(f"Error processing {SERVICE_SET_HARD_LIMIT}: Hard limit value not a positive number") from err
except intent.IntentHandleError as err:
raise HomeAssistantError(f"Error processing {SERVICE_SET_DAMPENING}: {err}") from err
async def handle_service_remove_hard_limit(call: ServiceCall):
"""Handle service call"""
try:
- _LOGGER.info(f"Service call: {SERVICE_REMOVE_HARD_LIMIT}")
+ _LOGGER.info("Service call: %s", SERVICE_REMOVE_HARD_LIMIT)
opt = {**entry.options}
opt[HARD_LIMIT] = 100000
@@ -314,7 +310,7 @@ async def handle_service_remove_hard_limit(call: ServiceCall):
return True
async def async_unload_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
- """Unload a config entry."""
+ """Unload a config entry"""
unload_ok = await hass.config_entries.async_unload_platforms(entry, PLATFORMS)
if unload_ok:
hass.data[DOMAIN].pop(entry.entry_id)
@@ -329,6 +325,7 @@ async def async_unload_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
return unload_ok
async def async_remove_config_entry_device(hass: HomeAssistant, entry: ConfigEntry, device) -> bool:
+ """Remove ConfigEntry device"""
device_registry(hass).async_remove_device(device.id)
return True
@@ -337,7 +334,7 @@ async def async_update_options(hass: HomeAssistant, entry: ConfigEntry):
await hass.config_entries.async_reload(entry.entry_id)
async def async_migrate_entry(hass: HomeAssistant, config_entry: ConfigEntry) -> bool:
- """Migrate old entry."""
+ """Migrate old entry"""
def upgraded():
_LOGGER.debug("Upgraded to options version %s", config_entry.version)
@@ -436,15 +433,15 @@ def upgraded():
new = {**config_entry.options}
try:
default = []
- configDir = path.abspath(path.join(path.dirname(__file__) ,"../.."))
+ _config_dir = os.path.abspath(os.path.join(os.path.dirname(__file__) ,"../.."))
for spl in new[CONF_API_KEY].split(','):
- apiCacheFileName = "%s/solcast-usage%s.json" % (configDir, "" if len(new[CONF_API_KEY].split(',')) < 2 else "-" + spl.strip())
- async with aiofiles.open(apiCacheFileName) as f:
+ api_cache_filename = f"{_config_dir}/solcast-usage{'' if len(new[CONF_API_KEY].split(',')) < 2 else '-' + spl.strip()}.json"
+ async with aiofiles.open(api_cache_filename) as f:
usage = json.loads(await f.read())
default.append(str(usage['daily_limit']))
default = ','.join(default)
except Exception as e:
- _LOGGER.warning('Could not load API usage cache quota while upgrading config, using default: %s', e)
+        _LOGGER.warning('Could not load cached API usage limit while upgrading config, using default: %s', e)
default = '10'
if new.get(API_QUOTA) is None: new[API_QUOTA] = default
try:
diff --git a/custom_components/solcast_solar/config_flow.py b/custom_components/solcast_solar/config_flow.py
index a06c87f9..34bf5b03 100755
--- a/custom_components/solcast_solar/config_flow.py
+++ b/custom_components/solcast_solar/config_flow.py
@@ -1,4 +1,7 @@
"""Config flow for Solcast Solar integration"""
+
+# pylint: disable=C0304, E0401, W0702
+
from __future__ import annotations
from typing import Any
@@ -35,10 +38,10 @@ async def async_step_user(
"""Handle a flow initiated by the user."""
if self._async_current_entries():
return self.async_abort(reason="single_instance_allowed")
-
+
if user_input is not None:
return self.async_create_entry(
- title= TITLE,
+ title= TITLE,
data = {},
options={
CONF_API_KEY: user_input[CONF_API_KEY],
@@ -96,18 +99,20 @@ def __init__(self, config_entry: ConfigEntry) -> None:
self.config_entry = config_entry
self.options = dict(config_entry.options)
- async def async_step_init(self, user_input=None):
+ async def async_step_init(self, user_input=None) -> Any:
+ """Initialise steps"""
+
errors = {}
if user_input is not None:
if "solcast_config_action" in user_input:
- nextAction = user_input["solcast_config_action"]
- if nextAction == "configure_dampening":
+ next_action = user_input["solcast_config_action"]
+ if next_action == "configure_dampening":
return await self.async_step_dampen()
- elif nextAction == "configure_api":
+ elif next_action == "configure_api":
return await self.async_step_api()
- elif nextAction == "configure_customsensor":
+ elif next_action == "configure_customsensor":
return await self.async_step_customsensor()
- elif nextAction == "configure_attributes":
+ elif next_action == "configure_attributes":
return await self.async_step_attributes()
else:
errors["base"] = "incorrect_options_action"
@@ -132,25 +137,25 @@ async def async_step_api(self, user_input: dict[str, Any] | None = None) -> Flow
"""Manage the API key/quota"""
errors = {}
- apiQuota = self.config_entry.options[API_QUOTA]
-
+ api_quota = self.config_entry.options[API_QUOTA]
+
if user_input is not None:
try:
- apiQuota = user_input[API_QUOTA]
+ api_quota = user_input[API_QUOTA]
- allConfigData = {**self.config_entry.options}
+ all_config_data = {**self.config_entry.options}
k = user_input["api_key"].replace(" ","").strip()
k = ','.join([s for s in k.split(',') if s])
- allConfigData["api_key"] = k
- allConfigData[API_QUOTA] = apiQuota
+ all_config_data["api_key"] = k
+ all_config_data[API_QUOTA] = api_quota
self.hass.config_entries.async_update_entry(
self.config_entry,
title=TITLE,
- options=allConfigData,
+ options=all_config_data,
)
return self.async_create_entry(title=TITLE, data=None)
- except Exception as e:
+ except:
errors["base"] = "unknown"
return self.async_show_form(
@@ -193,7 +198,7 @@ async def async_step_dampen(self, user_input: dict[str, Any] | None = None) -> F
damp21 = self.config_entry.options["damp21"]
damp22 = self.config_entry.options["damp22"]
damp23 = self.config_entry.options["damp23"]
-
+
if user_input is not None:
try:
damp00 = user_input["damp00"]
@@ -221,40 +226,40 @@ async def async_step_dampen(self, user_input: dict[str, Any] | None = None) -> F
damp22 = user_input["damp22"]
damp23 = user_input["damp23"]
- allConfigData = {**self.config_entry.options}
- allConfigData["damp00"] = damp00
- allConfigData["damp01"] = damp01
- allConfigData["damp02"] = damp02
- allConfigData["damp03"] = damp03
- allConfigData["damp04"] = damp04
- allConfigData["damp05"] = damp05
- allConfigData["damp06"] = damp06
- allConfigData["damp07"] = damp07
- allConfigData["damp08"] = damp08
- allConfigData["damp09"] = damp09
- allConfigData["damp10"] = damp10
- allConfigData["damp11"] = damp11
- allConfigData["damp12"] = damp12
- allConfigData["damp13"] = damp13
- allConfigData["damp14"] = damp14
- allConfigData["damp15"] = damp15
- allConfigData["damp16"] = damp16
- allConfigData["damp17"] = damp17
- allConfigData["damp18"] = damp18
- allConfigData["damp19"] = damp19
- allConfigData["damp20"] = damp20
- allConfigData["damp21"] = damp21
- allConfigData["damp22"] = damp22
- allConfigData["damp23"] = damp23
+ all_config_data = {**self.config_entry.options}
+ all_config_data["damp00"] = damp00
+ all_config_data["damp01"] = damp01
+ all_config_data["damp02"] = damp02
+ all_config_data["damp03"] = damp03
+ all_config_data["damp04"] = damp04
+ all_config_data["damp05"] = damp05
+ all_config_data["damp06"] = damp06
+ all_config_data["damp07"] = damp07
+ all_config_data["damp08"] = damp08
+ all_config_data["damp09"] = damp09
+ all_config_data["damp10"] = damp10
+ all_config_data["damp11"] = damp11
+ all_config_data["damp12"] = damp12
+ all_config_data["damp13"] = damp13
+ all_config_data["damp14"] = damp14
+ all_config_data["damp15"] = damp15
+ all_config_data["damp16"] = damp16
+ all_config_data["damp17"] = damp17
+ all_config_data["damp18"] = damp18
+ all_config_data["damp19"] = damp19
+ all_config_data["damp20"] = damp20
+ all_config_data["damp21"] = damp21
+ all_config_data["damp22"] = damp22
+ all_config_data["damp23"] = damp23
self.hass.config_entries.async_update_entry(
self.config_entry,
title=TITLE,
- options=allConfigData,
+ options=all_config_data,
)
-
+
return self.async_create_entry(title=TITLE, data=None)
- except Exception as e:
+ except:
errors["base"] = "unknown"
return self.async_show_form(
@@ -320,22 +325,22 @@ async def async_step_customsensor(self, user_input: dict[str, Any] | None = None
errors = {}
customhoursensor = self.config_entry.options[CUSTOM_HOUR_SENSOR]
-
+
if user_input is not None:
try:
customhoursensor = user_input[CUSTOM_HOUR_SENSOR]
- allConfigData = {**self.config_entry.options}
- allConfigData[CUSTOM_HOUR_SENSOR] = customhoursensor
+ all_config_data = {**self.config_entry.options}
+ all_config_data[CUSTOM_HOUR_SENSOR] = customhoursensor
self.hass.config_entries.async_update_entry(
self.config_entry,
title=TITLE,
- options=allConfigData,
+ options=all_config_data,
)
-
+
return self.async_create_entry(title=TITLE, data=None)
- except Exception as e:
+ except:
errors["base"] = "unknown"
return self.async_show_form(
@@ -348,55 +353,55 @@ async def async_step_customsensor(self, user_input: dict[str, Any] | None = None
),
errors=errors,
)
-
+
async def async_step_attributes(self, user_input: dict[str, Any] | None = None) -> FlowResult:
"""Manage the attributes present"""
errors = {}
- estimateBreakdown = self.config_entry.options[BRK_ESTIMATE]
- estimateBreakdown10 = self.config_entry.options[BRK_ESTIMATE10]
- estimateBreakdown90 = self.config_entry.options[BRK_ESTIMATE90]
- siteBreakdown = self.config_entry.options[BRK_SITE]
- halfHourly = self.config_entry.options[BRK_HALFHOURLY]
+ estimate_breakdown = self.config_entry.options[BRK_ESTIMATE]
+ estimate_breakdown10 = self.config_entry.options[BRK_ESTIMATE10]
+ estimate_breakdown90 = self.config_entry.options[BRK_ESTIMATE90]
+ site_breakdown = self.config_entry.options[BRK_SITE]
+ half_hourly = self.config_entry.options[BRK_HALFHOURLY]
hourly = self.config_entry.options[BRK_HOURLY]
-
+
if user_input is not None:
try:
- estimateBreakdown = user_input[BRK_ESTIMATE]
- estimateBreakdown10 = user_input[BRK_ESTIMATE10]
- estimateBreakdown90 = user_input[BRK_ESTIMATE90]
- siteBreakdown = user_input[BRK_SITE]
- halfHourly = user_input[BRK_HALFHOURLY]
+ estimate_breakdown = user_input[BRK_ESTIMATE]
+ estimate_breakdown10 = user_input[BRK_ESTIMATE10]
+ estimate_breakdown90 = user_input[BRK_ESTIMATE90]
+ site_breakdown = user_input[BRK_SITE]
+ half_hourly = user_input[BRK_HALFHOURLY]
hourly = user_input[BRK_HOURLY]
- allConfigData = {**self.config_entry.options}
- allConfigData[BRK_ESTIMATE] = estimateBreakdown
- allConfigData[BRK_ESTIMATE10] = estimateBreakdown10
- allConfigData[BRK_ESTIMATE90] = estimateBreakdown90
- allConfigData[BRK_SITE] = siteBreakdown
- allConfigData[BRK_HALFHOURLY] = halfHourly
- allConfigData[BRK_HOURLY] = hourly
+ all_config_data = {**self.config_entry.options}
+ all_config_data[BRK_ESTIMATE] = estimate_breakdown
+ all_config_data[BRK_ESTIMATE10] = estimate_breakdown10
+ all_config_data[BRK_ESTIMATE90] = estimate_breakdown90
+ all_config_data[BRK_SITE] = site_breakdown
+ all_config_data[BRK_HALFHOURLY] = half_hourly
+ all_config_data[BRK_HOURLY] = hourly
self.hass.config_entries.async_update_entry(
self.config_entry,
title=TITLE,
- options=allConfigData,
+ options=all_config_data,
)
-
+
return self.async_create_entry(title=TITLE, data=None)
- except Exception as e:
+ except:
errors["base"] = "unknown"
return self.async_show_form(
step_id="attributes",
data_schema=vol.Schema(
{
- vol.Required(BRK_ESTIMATE10, description={"suggested_value": estimateBreakdown10}): bool,
- vol.Required(BRK_ESTIMATE, description={"suggested_value": estimateBreakdown}): bool,
- vol.Required(BRK_ESTIMATE90, description={"suggested_value": estimateBreakdown90}): bool,
- vol.Required(BRK_SITE, description={"suggested_value": siteBreakdown}): bool,
- vol.Required(BRK_HALFHOURLY, description={"suggested_value": halfHourly}): bool,
+ vol.Required(BRK_ESTIMATE10, description={"suggested_value": estimate_breakdown10}): bool,
+ vol.Required(BRK_ESTIMATE, description={"suggested_value": estimate_breakdown}): bool,
+ vol.Required(BRK_ESTIMATE90, description={"suggested_value": estimate_breakdown90}): bool,
+ vol.Required(BRK_SITE, description={"suggested_value": site_breakdown}): bool,
+ vol.Required(BRK_HALFHOURLY, description={"suggested_value": half_hourly}): bool,
vol.Required(BRK_HOURLY, description={"suggested_value": hourly}): bool,
}
),
diff --git a/custom_components/solcast_solar/const.py b/custom_components/solcast_solar/const.py
index 296ff12c..8233199b 100755
--- a/custom_components/solcast_solar/const.py
+++ b/custom_components/solcast_solar/const.py
@@ -1,5 +1,7 @@
"""Constants for the Solcast Solar integration."""
+# pylint: disable=C0304, E0401
+
from __future__ import annotations
from typing import Final
@@ -10,7 +12,6 @@
TITLE = "Solcast Solar"
SOLCAST_URL = "https://api.solcast.com.au"
-
ATTR_ENTRY_TYPE: Final = "entry_type"
ENTRY_TYPE_SERVICE: Final = "service"
diff --git a/custom_components/solcast_solar/coordinator.py b/custom_components/solcast_solar/coordinator.py
index 372af597..aed0a8bc 100644
--- a/custom_components/solcast_solar/coordinator.py
+++ b/custom_components/solcast_solar/coordinator.py
@@ -1,12 +1,16 @@
"""The Solcast Solar coordinator"""
+
+# pylint: disable=C0302, C0304, C0321, E0401, R0902, R0914, W0105, W0613, W0702, W0706, W0719
+
from __future__ import annotations
from datetime import datetime as dt
+from typing import Any, Dict
+
import logging
import traceback
from homeassistant.core import HomeAssistant
-from homeassistant.helpers.event import async_track_time_change
from homeassistant.helpers.event import async_track_utc_time_change
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator
@@ -25,9 +29,9 @@ def __init__(self, hass: HomeAssistant, solcast: SolcastApi, version: str) -> No
self._hass = hass
self._previousenergy = None
self._version = version
- self._lastDay = None
- self._dayChanged = False
- self._dataUpdated = False
+ self._last_day = None
+ self._date_changed = False
+ self._data_updated = False
super().__init__(
hass,
@@ -38,70 +42,98 @@ def __init__(self, hass: HomeAssistant, solcast: SolcastApi, version: str) -> No
async def _async_update_data(self):
"""Update data via library"""
- return self.solcast._data
+ return self.solcast.get_data()
- async def setup(self):
+ async def setup(self) -> None:
+ """Set up time change tracking"""
d={}
self._previousenergy = d
- self._lastDay = dt.now(self.solcast._tz).day
+ self._last_day = dt.now(self.solcast.options.tz).day
try:
#4.0.18 - added reset usage call to reset usage sensors at UTC midnight
async_track_utc_time_change(self._hass, self.update_utcmidnight_usage_sensor_data, hour=0,minute=0,second=0)
async_track_utc_time_change(self._hass, self.update_integration_listeners, minute=range(0, 60, 5), second=0)
- except Exception as error:
+ except:
_LOGGER.error("Exception in Solcast coordinator setup: %s", traceback.format_exc())
- async def update_integration_listeners(self, *args):
+ async def update_integration_listeners(self, *args) -> None:
+ """Get updated sensor values"""
try:
- crtDay = dt.now(self.solcast._tz).day
- self._dateChanged = (crtDay != self._lastDay)
- if self._dateChanged:
- self._lastDay = crtDay
+ current_day = dt.now(self.solcast.options.tz).day
+ self._date_changed = current_day != self._last_day
+ if self._date_changed:
+ self._last_day = current_day
#4.0.41 - recalculate splines at midnight local
await self.update_midnight_spline_recalc()
self.async_update_listeners()
- except Exception:
+ except:
#_LOGGER.error("update_integration_listeners: %s", traceback.format_exc())
pass
- async def update_utcmidnight_usage_sensor_data(self, *args):
+ async def update_utcmidnight_usage_sensor_data(self, *args) -> None:
+ """Resets tracked API usage at midnight UTC"""
try:
await self.solcast.reset_api_usage()
- except Exception:
+ except:
#_LOGGER.error("Exception in update_utcmidnight_usage_sensor_data(): %s", traceback.format_exc())
pass
- async def update_midnight_spline_recalc(self, *args):
+ async def update_midnight_spline_recalc(self, *args) -> None:
+ """Re-calculates splines at midnight local time"""
try:
- _LOGGER.debug('Recalculating splines')
+ _LOGGER.debug("Recalculating splines")
await self.solcast.spline_moments()
await self.solcast.spline_remaining()
- except Exception:
+ except:
_LOGGER.error("Exception in update_midnight_spline_recalc(): %s", traceback.format_exc())
- pass
- async def service_event_update(self, *args):
+ async def service_event_update(self, *args) -> None:
+ """Get updated forecast data when requested by a service call"""
try:
#await self.solcast.sites_weather()
await self.solcast.http_data(dopast=False)
- self._dataUpdated = True
+ self._data_updated = True
await self.update_integration_listeners()
- self._dataUpdated = False
- except Exception as ex:
+ self._data_updated = False
+ except:
_LOGGER.error("Exception in service_event_update(): %s", traceback.format_exc())
- async def service_event_delete_old_solcast_json_file(self, *args):
+ async def service_event_delete_old_solcast_json_file(self, *args) -> None:
+ """Delete the solcast.json file when requested by a service call"""
await self.solcast.delete_solcast_file()
async def service_query_forecast_data(self, *args) -> tuple:
+ """Return forecast data requested by a service call"""
return await self.solcast.get_forecast_list(*args)
- def get_energy_tab_data(self):
+    def get_solcast_sites(self) -> list[dict[str, Any]]:
+ """Return the active solcast sites"""
+ return self.solcast.sites
+
+ def get_previousenergy(self) -> dict[str, Any]:
+ """Return the prior energy dictionary"""
+ return self._previousenergy
+
+ def get_energy_tab_data(self) -> dict[str, Any]:
+ """Return an energy page compatible dictionary"""
return self.solcast.get_energy_data()
- def get_sensor_value(self, key=""):
+ def get_data_updated(self) -> bool:
+ """Return whether all data has updated, which will trigger all sensor values to update"""
+ return self._data_updated
+
+    def set_data_updated(self, updated) -> None:
+ """Set whether all data has updated"""
+ self._data_updated = updated
+
+ def get_date_changed(self) -> bool:
+ """Return whether rolled over to tomorrow, which will trigger all sensor values to update"""
+ return self._date_changed
+
+ def get_sensor_value(self, key="") -> (int | dt | float | Any | str | bool | None):
+ """Return the value of a sensor"""
match key:
case "peak_w_today":
return self.solcast.get_peak_w_day(0)
@@ -112,7 +144,7 @@ def get_sensor_value(self, key=""):
case "forecast_next_hour":
return self.solcast.get_forecast_n_hour(1)
case "forecast_custom_hours":
- return self.solcast.get_forecast_custom_hours(self.solcast._customhoursensor)
+ return self.solcast.get_forecast_custom_hours(self.solcast.custom_hour_sensor)
case "total_kwh_forecast_today":
return self.solcast.get_total_kwh_forecast_day(0)
case "total_kwh_forecast_tomorrow":
@@ -146,20 +178,21 @@ def get_sensor_value(self, key=""):
case "lastupdated":
return self.solcast.get_last_updated_datetime()
case "hard_limit":
- return False if self.solcast._hardlimit == 100 else f"{round(self.solcast._hardlimit * 1000)}w"
+ return False if self.solcast.hard_limit == 100 else f"{round(self.solcast.hard_limit * 1000)}w"
# case "weather_description":
# return self.solcast.get_weather()
case _:
return None
- def get_sensor_extra_attributes(self, key=""):
+ def get_sensor_extra_attributes(self, key="") -> (Dict[str, Any] | None):
+ """Return the attributes for a sensor"""
match key:
case "forecast_this_hour":
return self.solcast.get_forecasts_n_hour(0)
case "forecast_next_hour":
return self.solcast.get_forecasts_n_hour(1)
case "forecast_custom_hours":
- return self.solcast.get_forecasts_custom_hours(self.solcast._customhoursensor)
+ return self.solcast.get_forecasts_custom_hours(self.solcast.custom_hour_sensor)
case "total_kwh_forecast_today":
ret = self.solcast.get_forecast_day(0)
ret = {**ret, **self.solcast.get_sites_total_kwh_forecast_day(0)}
@@ -207,14 +240,16 @@ def get_sensor_extra_attributes(self, key=""):
case _:
return None
- def get_site_sensor_value(self, roof_id, key):
+ def get_site_sensor_value(self, roof_id, key) -> (float | None):
+ """Get the site total for today"""
match key:
case "site_data":
return self.solcast.get_rooftop_site_total_today(roof_id)
case _:
return None
- def get_site_sensor_extra_attributes(self, roof_id, key):
+ def get_site_sensor_extra_attributes(self, roof_id, key) -> (dict[str, Any] | None):
+ """Get the attributes for a sensor"""
match key:
case "site_data":
return self.solcast.get_rooftop_site_extra_data(roof_id)
diff --git a/custom_components/solcast_solar/diagnostics.py b/custom_components/solcast_solar/diagnostics.py
index 057a0be4..3e11f35d 100644
--- a/custom_components/solcast_solar/diagnostics.py
+++ b/custom_components/solcast_solar/diagnostics.py
@@ -1,4 +1,7 @@
"""Support for the Solcast diagnostics."""
+
+# pylint: disable=C0304, E0401
+
from __future__ import annotations
from typing import Any
@@ -22,13 +25,12 @@ async def async_get_config_entry_diagnostics(
coordinator: SolcastUpdateCoordinator = hass.data[DOMAIN][config_entry.entry_id]
return {
- "tz_conversion": coordinator.solcast._tz,
+ "tz_conversion": coordinator.solcast.options.tz,
"used_api_requests": coordinator.solcast.get_api_used_count(),
"api_request_limit": coordinator.solcast.get_api_limit(),
- "rooftop_site_count": len(coordinator.solcast._sites),
- "forecast_hard_limit_set": coordinator.solcast._hardlimit < 100,
+ "rooftop_site_count": len(coordinator.solcast.sites),
+ "forecast_hard_limit_set": coordinator.solcast.hard_limit < 100,
"data": (coordinator.data, TO_REDACT),
- "energy_history_graph": coordinator._previousenergy,
- "energy_forecasts_graph": coordinator.solcast._dataenergy["wh_hours"],
- }
-
+ "energy_history_graph": coordinator.get_previousenergy(),
+ "energy_forecasts_graph": coordinator.solcast.get_energy_data()["wh_hours"],
+ }
\ No newline at end of file
diff --git a/custom_components/solcast_solar/energy.py b/custom_components/solcast_solar/energy.py
index 23715826..2771cc05 100755
--- a/custom_components/solcast_solar/energy.py
+++ b/custom_components/solcast_solar/energy.py
@@ -1,4 +1,7 @@
"""Energy platform"""
+
+# pylint: disable=C0304, E0401
+
from __future__ import annotations
import logging
diff --git a/custom_components/solcast_solar/manifest.json b/custom_components/solcast_solar/manifest.json
index 03de771b..106d7536 100644
--- a/custom_components/solcast_solar/manifest.json
+++ b/custom_components/solcast_solar/manifest.json
@@ -9,6 +9,6 @@
"integration_type": "service",
"iot_class": "cloud_polling",
"issue_tracker": "https://github.com/BJReplay/ha-solcast-solar/issues",
- "requirements": ["aiohttp>=3.8.5", "aiofiles>=24.1.0", "datetime>=4.3", "isodate>=0.6.1"],
- "version": "4.1.3"
+ "requirements": ["aiohttp>=3.8.5", "aiofiles>=23.2.0", "datetime>=4.3", "isodate>=0.6.1"],
+ "version": "4.1.4"
}
diff --git a/custom_components/solcast_solar/recorder.py b/custom_components/solcast_solar/recorder.py
index 8a8d2a3d..9f82f5a0 100644
--- a/custom_components/solcast_solar/recorder.py
+++ b/custom_components/solcast_solar/recorder.py
@@ -1,4 +1,7 @@
"""Integration platform for recorder."""
+
+# pylint: disable=C0304, E0401, W0613
+
from __future__ import annotations
from homeassistant.core import HomeAssistant, callback
diff --git a/custom_components/solcast_solar/select.py b/custom_components/solcast_solar/select.py
index 89feaa0f..60dbda73 100644
--- a/custom_components/solcast_solar/select.py
+++ b/custom_components/solcast_solar/select.py
@@ -1,4 +1,7 @@
"""Selector to allow users to select the pv_ data field to use for calcualtions."""
+
+# pylint: disable=C0304, E0401, W0212
+
import logging
from enum import IntEnum
@@ -32,7 +35,6 @@ class PVEstimateMode(IntEnum):
ESTIMATE - Default forecasts
ESTIMATE10 = Forecasts 10 - cloudier than expected scenario
ESTIMATE90 = Forecasts 90 - less cloudy than expected scenario
-
"""
ESTIMATE = 0
@@ -63,12 +65,12 @@ async def async_setup_entry(
entry: ConfigEntry,
async_add_entities: AddEntitiesCallback,
) -> None:
-
+    """Set up entry"""
coordinator: SolcastUpdateCoordinator = hass.data[DOMAIN][entry.entry_id]
try:
est_mode = coordinator.solcast.options.key_estimate
- except (ValueError):
+ except ValueError:
_LOGGER.debug("Could not read estimate mode", exc_info=True)
else:
entity = EstimateModeEntity(
@@ -84,7 +86,7 @@ async def async_setup_entry(
class EstimateModeEntity(SelectEntity):
"""Entity representing the solcast estimate field to use for calculations."""
- _attr_attribution = ATTRIBUTION
+ _attr_attribution = ATTRIBUTION
_attr_should_poll = False
_attr_has_entity_name = True
@@ -103,7 +105,7 @@ def __init__(
self.entity_description = entity_description
self._attr_unique_id = f"{entity_description.key}"
-
+
self._attr_options = supported_options
self._attr_current_option = current_option
@@ -114,7 +116,7 @@ def __init__(
self._attr_device_info = {
ATTR_IDENTIFIERS: {(DOMAIN, entry.entry_id)},
- ATTR_NAME: "Solcast PV Forecast",
+ ATTR_NAME: "Solcast PV Forecast",
ATTR_MANUFACTURER: "BJReplay",
ATTR_MODEL: "Solcast PV Forecast",
ATTR_ENTRY_TYPE: DeviceEntryType.SERVICE,
@@ -130,4 +132,4 @@ async def async_select_option(self, option: str) -> None:
new = {**self._entry.options}
new[KEY_ESTIMATE] = option
- self.coordinator._hass.config_entries.async_update_entry(self._entry, options=new)
+ self.coordinator._hass.config_entries.async_update_entry(self._entry, options=new)
\ No newline at end of file
diff --git a/custom_components/solcast_solar/sensor.py b/custom_components/solcast_solar/sensor.py
index 4bbdd531..2d717b2a 100755
--- a/custom_components/solcast_solar/sensor.py
+++ b/custom_components/solcast_solar/sensor.py
@@ -1,5 +1,7 @@
"""Support for Solcast PV forecast sensors."""
+# pylint: disable=C0304, E0401, W0718
+
from __future__ import annotations
import logging
@@ -235,10 +237,12 @@
}
class SensorUpdatePolicy(Enum):
+ """Sensor update policy"""
DEFAULT = 0
EVERY_TIME_INTERVAL = 1
-def getSensorUpdatePolicy(key) -> SensorUpdatePolicy:
+def get_sensor_update_policy(key) -> SensorUpdatePolicy:
+ """Get the sensor update policy"""
match key:
case (
"forecast_this_hour" |
@@ -264,11 +268,11 @@ async def async_setup_entry(
coordinator: SolcastUpdateCoordinator = hass.data[DOMAIN][entry.entry_id]
entities = []
- for sensor_types in SENSORS:
+ for sensor_types, _ in SENSORS.items():
sen = SolcastSensor(coordinator, SENSORS[sensor_types], entry)
entities.append(sen)
- for site in coordinator.solcast._sites:
+ for site in coordinator.get_solcast_sites():
k = RooftopSensorEntityDescription(
key=site["resource_id"],
name=site["name"],
@@ -306,11 +310,11 @@ def __init__(
#doesnt work :()
if entity_description.key == "forecast_custom_hours":
- self._attr_translation_placeholders = {"forecast_custom_hours": f"{coordinator.solcast._customhoursensor}"}
+ self._attr_translation_placeholders = {"forecast_custom_hours": f"{coordinator.solcast.custom_hour_sensor}"}
self.entity_description = entity_description
self.coordinator = coordinator
- self.update_policy = getSensorUpdatePolicy(entity_description.key)
+ self.update_policy = get_sensor_update_policy(entity_description.key)
self._attr_unique_id = f"{entity_description.key}"
self._attributes = {}
@@ -318,10 +322,8 @@ def __init__(
try:
self._sensor_data = coordinator.get_sensor_value(entity_description.key)
- except Exception as ex:
- _LOGGER.error(
- f"Unable to get sensor value {ex} %s", traceback.format_exc()
- )
+ except Exception as e:
+ _LOGGER.error("Unable to get sensor value: %s: %s", e, traceback.format_exc())
self._sensor_data = None
if self._sensor_data is None:
@@ -344,13 +346,9 @@ def __init__(
def extra_state_attributes(self):
"""Return the state extra attributes of the sensor."""
try:
- return self.coordinator.get_sensor_extra_attributes(
- self.entity_description.key
- )
- except Exception as ex:
- _LOGGER.error(
- f"Unable to get sensor value {ex} %s", traceback.format_exc()
- )
+ return self.coordinator.get_sensor_extra_attributes(self.entity_description.key)
+ except Exception as e:
+ _LOGGER.error("Unable to get sensor value: %s: %s", e, traceback.format_exc())
return None
@property
@@ -368,21 +366,17 @@ def _handle_coordinator_update(self) -> None:
"""Handle updated data from the coordinator."""
# these sensors will pick-up the change on the next interval update (5mins)
- if self.update_policy == SensorUpdatePolicy.EVERY_TIME_INTERVAL and self.coordinator._dataUpdated:
+ if self.update_policy == SensorUpdatePolicy.EVERY_TIME_INTERVAL and self.coordinator.get_data_updated():
return
# these sensors update when the date changed or when there is new data
- if self.update_policy == SensorUpdatePolicy.DEFAULT and not (self.coordinator._dateChanged or self.coordinator._dataUpdated) :
- return
+ if self.update_policy == SensorUpdatePolicy.DEFAULT and not (self.coordinator.get_date_changed() or self.coordinator.get_data_updated()) :
+ return
try:
- self._sensor_data = self.coordinator.get_sensor_value(
- self.entity_description.key
- )
- except Exception as ex:
- _LOGGER.error(
- f"Unable to get sensor value {ex} %s", traceback.format_exc()
- )
+ self._sensor_data = self.coordinator.get_sensor_value(self.entity_description.key)
+ except Exception as e:
+ _LOGGER.error("Unable to get sensor value: %s: %s", e, traceback.format_exc())
self._sensor_data = None
if self._sensor_data is None:
@@ -394,10 +388,17 @@ def _handle_coordinator_update(self) -> None:
@dataclass
class RooftopSensorEntityDescription(SensorEntityDescription):
+ """Representation of a rooftop entity description"""
+ key: str | None = None
+ name: str | None = None
+ icon: str | None = None
+ device_class: SensorDeviceClass = SensorDeviceClass.ENERGY
+ native_unit_of_measurement: UnitOfEnergy = UnitOfEnergy.KILO_WATT_HOUR
+ suggested_display_precision: int = 2
rooftop_id: str | None = None
class RooftopSensor(CoordinatorEntity, SensorEntity):
- """Representation of a Solcast Sensor device."""
+ """Representation of a rooftop sensor device"""
_attr_attribution = ATTRIBUTION
@@ -423,10 +424,8 @@ def __init__(
try:
self._sensor_data = coordinator.get_site_sensor_value(self.rooftop_id, key)
- except Exception as ex:
- _LOGGER.error(
- f"Unable to get sensor value {ex} %s", traceback.format_exc()
- )
+ except Exception as e:
+ _LOGGER.error("Unable to get sensor value: %s: %s", e, traceback.format_exc())
self._sensor_data = None
self._attr_device_info = {
@@ -460,14 +459,9 @@ def unique_id(self):
def extra_state_attributes(self):
"""Return the state extra attributes of the sensor."""
try:
- return self.coordinator.get_site_sensor_extra_attributes(
- self.rooftop_id,
- self.key,
- )
- except Exception as ex:
- _LOGGER.error(
- f"Unable to get sensor value {ex} %s", traceback.format_exc()
- )
+ return self.coordinator.get_site_sensor_extra_attributes(self.rooftop_id, self.key )
+ except Exception as e:
+ _LOGGER.error("Unable to get sensor attributes: %s: %s", e, traceback.format_exc())
return None
@property
@@ -483,21 +477,14 @@ def should_poll(self) -> bool:
async def async_added_to_hass(self) -> None:
"""When entity is added to hass."""
await super().async_added_to_hass()
- self.async_on_remove(
- self.coordinator.async_add_listener(self._handle_coordinator_update)
- )
+ self.async_on_remove(self.coordinator.async_add_listener(self._handle_coordinator_update))
@callback
def _handle_coordinator_update(self) -> None:
"""Handle updated data from the coordinator."""
try:
- self._sensor_data = self.coordinator.get_site_sensor_value(
- self.rooftop_id,
- self.key,
- )
- except Exception as ex:
- _LOGGER.error(
- f"Unable to get sensor value {ex} %s", traceback.format_exc()
- )
+ self._sensor_data = self.coordinator.get_site_sensor_value(self.rooftop_id, self.key)
+ except Exception as e:
+ _LOGGER.error("Unable to get sensor value: %s: %s", e, traceback.format_exc())
self._sensor_data = None
self.async_write_ha_state()
\ No newline at end of file
diff --git a/custom_components/solcast_solar/solcastapi.py b/custom_components/solcast_solar/solcastapi.py
index 9378c6cd..2dee6351 100644
--- a/custom_components/solcast_solar/solcastapi.py
+++ b/custom_components/solcast_solar/solcastapi.py
@@ -1,8 +1,10 @@
"""Solcast API"""
+
+# pylint: disable=C0302, C0304, C0321, E0401, R0902, R0914, W0105, W0702, W0706, W0718, W0719
+
from __future__ import annotations
import asyncio
-import aiofiles
import copy
import json
import logging
@@ -13,7 +15,6 @@
import traceback
import random
import re
-from .spline import cubic_interp
from dataclasses import dataclass
from datetime import datetime as dt
from datetime import timedelta, timezone
@@ -23,31 +24,41 @@
from typing import Any, Dict, cast
import async_timeout
+import aiofiles
from aiohttp import ClientConnectionError, ClientSession
from aiohttp.client_reqrep import ClientResponse
from isodate import parse_datetime
+from .spline import cubic_interp
+
# For current func name, specify 0 or no argument
# For name of caller of current func, specify 1
# For name of caller of caller of current func, specify 2, etc.
-currentFuncName = lambda n=0: sys._getframe(n + 1).f_code.co_name
+currentFuncName = lambda n=0: sys._getframe(n + 1).f_code.co_name # pylint: disable=C3001, W0212
_SENSOR_DEBUG_LOGGING = False
+_FORECAST_DEBUG_LOGGING = False
+_SPLINE_DEBUG_LOGGING = False
_JSON_VERSION = 4
_LOGGER = logging.getLogger(__name__)
class DateTimeEncoder(json.JSONEncoder):
+ """Date/time helper"""
def default(self, o):
if isinstance(o, dt):
return o.isoformat()
+ else:
+ return None
class JSONDecoder(json.JSONDecoder):
- def __init__(self, *args, **kwargs):
+ """JSON decoder helper"""
+ def __init__(self, *args, **kwargs) -> None:
json.JSONDecoder.__init__(
self, object_hook=self.object_hook, *args, **kwargs)
- def object_hook(self, obj):
+ def object_hook(self, obj) -> dict: # pylint: disable=E0202
+        """Convert period_start string values back to datetime objects when deserialising"""
ret = {}
for key, value in obj.items():
if key in {'period_start'}:
@@ -56,12 +67,14 @@ def object_hook(self, obj):
ret[key] = value
return ret
+# HTTP status code translation.
+# A 418 error is included here for fun. This was included in RFC2324#section-2.3.2 as an April Fools joke in 1998.
statusTranslate = {
200: 'Success',
401: 'Unauthorized',
403: 'Forbidden',
404: 'Not found',
- 418: 'I\'m a teapot', # Included here for fun. An April Fools joke in 1998. Included in RFC2324#section-2.3.2
+ 418: 'I\'m a teapot',
429: 'Try again later',
500: 'Internal web server error',
501: 'Not implemented',
@@ -70,8 +83,9 @@ def object_hook(self, obj):
504: 'Gateway timeout',
}
-def translate(status):
- return ('%s/%s' % (str(status), statusTranslate[status], )) if statusTranslate.get(status) else status
+def translate(status) -> str | Any:
+ """Translate HTTP status code to a human-readable translation"""
+ return (f"{str(status)}/{statusTranslate[status]}") if statusTranslate.get(status) else status
@dataclass
@@ -84,7 +98,7 @@ class ConnectionOptions:
file_path: str
tz: timezone
dampening: dict
- customhoursensor: int
+ custom_hour_sensor: int
key_estimate: str
hard_limit: int
attr_brk_estimate: bool
@@ -102,246 +116,307 @@ def __init__(
self,
aiohttp_session: ClientSession,
options: ConnectionOptions,
- apiCacheEnabled: bool = False
+ api_cache_enabled: bool = False
):
"""Device init"""
- self.aiohttp_session = aiohttp_session
+
+ # Public vars
self.options = options
- self.apiCacheEnabled = apiCacheEnabled
- self._sites_loaded = False
- self._sites = []
+ self.hard_limit = options.hard_limit
+ self.custom_hour_sensor = options.custom_hour_sensor
+ self.damp = options.dampening
+ self.sites = []
+ self.sites_loaded = False
+
+ # Protected vars
+ self._aiohttp_session = aiohttp_session
self._data = {'siteinfo': {}, 'last_updated': dt.fromtimestamp(0, timezone.utc).isoformat()}
self._tally = {}
self._api_used = {}
self._api_limit = {}
+ self._api_used_reset = {}
self._filename = options.file_path
- self.configDir = dirname(self._filename)
- _LOGGER.debug("Configuration directory is %s", self.configDir)
+ self._config_dir = dirname(self._filename)
self._tz = options.tz
- self._dataenergy = {}
+ self._data_energy = {}
self._data_forecasts = []
self._site_data_forecasts = {}
+ self._spline_period = list(range(0, 90000, 1800))
+ self._forecasts_moment = {}
+ self._forecasts_remaining = {}
self._forecasts_start_idx = 0
- self._detailedForecasts = []
self._loaded_data = False
self._serialize_lock = asyncio.Lock()
- self._damp = options.dampening
- self._customhoursensor = options.customhoursensor
self._use_data_field = f"pv_{options.key_estimate}"
- self._hardlimit = options.hard_limit
- self._estimen = {'pv_estimate': options.attr_brk_estimate, 'pv_estimate10': options.attr_brk_estimate10, 'pv_estimate90': options.attr_brk_estimate90}
- self._spline_period = list(range(0, 90000, 1800))
- self.fc_moment = {}
- self.fc_remaining = {}
+ self._estimate_set = {'pv_estimate': options.attr_brk_estimate, 'pv_estimate10': options.attr_brk_estimate10, 'pv_estimate90': options.attr_brk_estimate90}
#self._weather = ""
+ self._api_cache_enabled = api_cache_enabled # For offline development
+
+ _LOGGER.debug("Configuration directory is %s", self._config_dir)
+
+ def get_data(self) -> dict[str, Any]:
+ """Return the data dictionary"""
+ return self._data
+
+ def redact_api_key(self, api_key) -> str:
+ """Obfuscate API key"""
+ return '*'*6 + api_key[-6:]
+
+ def redact_msg_api_key(self, msg, api_key) -> str:
+ """Obfuscate API key in messages"""
+ return msg.replace(api_key, self.redact_api_key(api_key))
+
+ def get_usage_cache_filename(self, entry_name):
+        """Build a fully qualified API usage cache filename; a separate cache file is used per API key when more than one key is configured"""
+ return '%s/solcast-usage%s.json' % (self._config_dir, "" if len(self.options.api_key.split(",")) < 2 else "-" + entry_name) # pylint: disable=C0209
+
+ def get_sites_cache_filename(self, entry_name):
+        """Build a fully qualified site details cache filename; a separate cache file is used per API key when more than one key is configured"""
+ return '%s/solcast-sites%s.json' % (self._config_dir, "" if len(self.options.api_key.split(",")) < 2 else "-" + entry_name) # pylint: disable=C0209
async def serialize_data(self):
"""Serialize data to file"""
+ serialise = True
try:
if not self._loaded_data:
_LOGGER.debug("Not saving forecast cache in serialize_data() as no data has been loaded yet")
return
- # If the _loaded_data flag is True, yet last_updated is 1/1/1970 then data has not been
- # loaded properly for some reason, or no forecast has been received since startup.
- # Abort the save.
+        # If the _loaded_data flag is True yet last_updated is 1/1/1970 then either the data has not been
+        # loaded properly for some reason or no forecast has been received since startup, so abort the save
if self._data['last_updated'] == dt.fromtimestamp(0, timezone.utc).isoformat():
_LOGGER.error("Internal error: Solcast forecast cache date has not been set, not saving data")
return
-
- async with self._serialize_lock:
- async with aiofiles.open(self._filename, "w") as f:
- await f.write(json.dumps(self._data, ensure_ascii=False, cls=DateTimeEncoder))
- _LOGGER.debug("Saved forecast cache")
- except Exception as ex:
- _LOGGER.error("Exception in serialize_data(): %s", ex)
+ payload = json.dumps(self._data, ensure_ascii=False, cls=DateTimeEncoder)
+ except Exception as e:
+ _LOGGER.error("Exception in serialize_data(): %s", e)
_LOGGER.error(traceback.format_exc())
+ serialise = False
+ if serialise:
+ try:
+ async with self._serialize_lock:
+ async with aiofiles.open(self._filename, 'w') as f:
+ await f.write(payload)
+ _LOGGER.debug("Saved forecast cache")
+ except Exception as e:
+ _LOGGER.error("Exception writing forecast data: %s", e)
- def redact_api_key(self, api_key):
- return '*'*6 + api_key[-6:]
-
- def redact_msg_api_key(self, msg, api_key):
- return msg.replace(api_key, self.redact_api_key(api_key))
-
- async def write_api_usage_cache_file(self, json_file, json_content, api_key):
+ async def serialise_usage(self, api_key, reset=False):
+ """Serialise the usage cache file"""
+ serialise = True
try:
- _LOGGER.debug(f"Writing API usage cache file: {self.redact_msg_api_key(json_file, api_key)}")
- async with aiofiles.open(json_file, 'w') as f:
- await f.write(json.dumps(json_content, ensure_ascii=False))
- except Exception as ex:
- _LOGGER.error("Exception in write_api_usage_cache_file(): %s", ex)
+ json_file = self.get_usage_cache_filename(api_key)
+ if reset:
+ self._api_used_reset[api_key] = self.get_day_start_utc()
+ _LOGGER.debug("Writing API usage cache file: %s", self.redact_msg_api_key(json_file, api_key))
+ json_content = {"daily_limit": self._api_limit[api_key], "daily_limit_consumed": self._api_used[api_key], "reset": self._api_used_reset[api_key]}
+ payload = json.dumps(json_content, ensure_ascii=False, cls=DateTimeEncoder)
+ except Exception as e:
+ _LOGGER.error("Exception in serialise_usage(): %s", e)
_LOGGER.error(traceback.format_exc())
-
- def get_api_usage_cache_filename(self, entry_name):
- return "%s/solcast-usage%s.json" % (self.configDir, "" if len(self.options.api_key.split(",")) < 2 else "-" + entry_name) # For more than one API key use separate files
-
- def get_api_sites_cache_filename(self, entry_name):
- return "%s/solcast-sites%s.json" % (self.configDir, "" if len(self.options.api_key.split(",")) < 2 else "-" + entry_name) # Ditto
-
- async def reset_api_usage(self):
- for api_key in self._api_used.keys():
- self._api_used[api_key] = 0
- await self.write_api_usage_cache_file(
- self.get_api_usage_cache_filename(api_key),
- {"daily_limit": self._api_limit[api_key], "daily_limit_consumed": self._api_used[api_key]},
- api_key
- )
+ serialise = False
+ if serialise:
+ try:
+ async with self._serialize_lock:
+ async with aiofiles.open(json_file, 'w') as f:
+ await f.write(payload)
+ except Exception as e:
+ _LOGGER.error("Exception writing usage cache for %s: %s", self.redact_msg_api_key(json_file, api_key), e)
async def sites_data(self):
- """Request sites detail"""
-
+ """Request site details"""
try:
- def redact(s):
+ def redact_lat_lon(s):
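+                # Replace the numeric part of 'latitude'/'longitude' values so site coordinates are not written to the debug log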
return re.sub(r'itude\': [0-9\-\.]+', 'itude\': **.******', s)
sp = self.options.api_key.split(",")
for spl in sp:
- params = {"format": "json", "api_key": spl.strip()}
+ api_key = spl.strip()
async with async_timeout.timeout(60):
- apiCacheFileName = self.get_api_sites_cache_filename(spl)
- _LOGGER.debug(f"{'Sites cache ' + ('exists' if file_exists(apiCacheFileName) else 'does not yet exist')}")
- if self.apiCacheEnabled and file_exists(apiCacheFileName):
- _LOGGER.debug(f"Loading cached sites data")
+ cache_filename = self.get_sites_cache_filename(api_key)
+ _LOGGER.debug("%s", 'Sites cache ' + ('exists' if file_exists(cache_filename) else 'does not yet exist'))
+ if self._api_cache_enabled and file_exists(cache_filename):
+ _LOGGER.debug("Loading cached sites data")
status = 404
- async with aiofiles.open(apiCacheFileName) as f:
+ async with aiofiles.open(cache_filename) as f:
resp_json = json.loads(await f.read())
status = 200
else:
- _LOGGER.debug(f"Connecting to {self.options.host}/rooftop_sites?format=json&api_key={self.redact_api_key(spl)}")
+ url = f"{self.options.host}/rooftop_sites"
+ params = {"format": "json", "api_key": api_key}
+ _LOGGER.debug("Connecting to %s?format=json&api_key=%s", url, self.redact_api_key(api_key))
retries = 3
retry = retries
success = False
- useCacheImmediate = False
- cacheExists = file_exists(apiCacheFileName)
+ use_cache_immediate = False
+ cache_exists = file_exists(cache_filename)
while retry >= 0:
- resp: ClientResponse = await self.aiohttp_session.get(
- url=f"{self.options.host}/rooftop_sites", params=params, ssl=False
- )
+ resp: ClientResponse = await self._aiohttp_session.get(url=url, params=params, ssl=False)
status = resp.status
- _LOGGER.debug(f"HTTP session returned status {translate(status)} in sites_data()")
+ _LOGGER.debug("HTTP session returned status %s in sites_data()%s", translate(status), ', trying cache' if status != 200 else '')
try:
resp_json = await resp.json(content_type=None)
except json.decoder.JSONDecodeError:
_LOGGER.error("JSONDecodeError in sites_data(): Solcast site could be having problems")
- except: raise
+ except:
+ raise
if status == 200:
- _LOGGER.debug(f"Writing sites cache")
- async with aiofiles.open(apiCacheFileName, 'w') as f:
- await f.write(json.dumps(resp_json, ensure_ascii=False))
- success = True
- break
+ if resp_json['total_records'] > 0:
+ _LOGGER.debug("Writing sites cache")
+ async with self._serialize_lock:
+ async with aiofiles.open(cache_filename, 'w') as f:
+ await f.write(json.dumps(resp_json, ensure_ascii=False))
+ success = True
+ break
+ else:
+ _LOGGER.error('No sites for the API key %s are configured at solcast.com', self.redact_api_key(api_key))
+ return
else:
- if cacheExists:
- useCacheImmediate = True
+ if cache_exists:
+ use_cache_immediate = True
break
if retry > 0:
- _LOGGER.debug(f"Will retry get sites, retry {(retries - retry) + 1}")
+ _LOGGER.debug("Will retry get sites, retry %d", (retries - retry) + 1)
await asyncio.sleep(5)
retry -= 1
if not success:
- if not useCacheImmediate:
- _LOGGER.warning(f"Retries exhausted gathering Solcast sites, last call result: {translate(status)}, using cached data if it exists")
+ if not use_cache_immediate:
+ _LOGGER.warning("Retries exhausted gathering Solcast sites, last call result: %s, using cached data if it exists", translate(status))
status = 404
- if cacheExists:
- async with aiofiles.open(apiCacheFileName) as f:
+ if cache_exists:
+ async with aiofiles.open(cache_filename) as f:
resp_json = json.loads(await f.read())
status = 200
- _LOGGER.info(f"Loaded sites cache for {self.redact_api_key(spl)}")
+ _LOGGER.info("Sites loaded for %s", self.redact_api_key(api_key))
else:
- _LOGGER.error(f"Cached Solcast sites are not yet available for {self.redact_api_key(spl)} to cope with API call failure")
- _LOGGER.error(f"At least one successful API 'get sites' call is needed, so the integration will not function correctly")
+ _LOGGER.error("Cached Solcast sites are not yet available for %s to cope with API call failure", self.redact_api_key(api_key))
+ _LOGGER.error("At least one successful API 'get sites' call is needed, so the integration will not function correctly")
if status == 200:
d = cast(dict, resp_json)
- _LOGGER.debug(f"Sites data: {redact(str(d))}")
+ _LOGGER.debug("Sites data: %s", redact_lat_lon(str(d)))
for i in d['sites']:
- i['apikey'] = spl.strip()
+ i['apikey'] = api_key
#v4.0.14 to stop HA adding a pin to the map
i.pop('longitude', None)
i.pop('latitude', None)
- self._sites = self._sites + d['sites']
- self._sites_loaded = True
+ self.sites = self.sites + d['sites']
+ self.sites_loaded = True
+ self._api_used_reset[api_key] = None
+ _LOGGER.info("Sites loaded for %s", self.redact_api_key(api_key))
else:
- _LOGGER.error(f"{self.options.host} HTTP status error {translate(status)} in sites_data() while gathering sites")
- raise Exception(f"HTTP sites_data error: Solcast Error gathering sites")
- except ConnectionRefusedError as err:
- _LOGGER.error("Connection refused in sites_data(): %s", err)
+ _LOGGER.error("%s HTTP status error %s in sites_data() while gathering sites", self.options.host, translate(status))
+                        raise Exception("HTTP sites_data error: Error gathering Solcast sites")
+ except ConnectionRefusedError as e:
+ _LOGGER.error("Connection refused in sites_data(): %s", e)
except ClientConnectionError as e:
- _LOGGER.error('Connection error in sites_data(): %s', str(e))
+ _LOGGER.error('Connection error in sites_data(): %s', e)
except asyncio.TimeoutError:
try:
_LOGGER.warning("Retrieving Solcast sites timed out, attempting to continue")
error = False
for spl in sp:
- apiCacheFileName = self.get_api_sites_cache_filename(spl)
- cacheExists = file_exists(apiCacheFileName)
- if cacheExists:
- _LOGGER.info("Loading cached Solcast sites for {self.redact_api_key(spl)}")
- async with aiofiles.open(apiCacheFileName) as f:
+ api_key = spl.strip()
+ cache_filename = self.get_sites_cache_filename(api_key)
+ cache_exists = file_exists(cache_filename)
+ if cache_exists:
+ _LOGGER.info("Loading cached Solcast sites for %s", self.redact_api_key(api_key))
+ async with aiofiles.open(cache_filename) as f:
resp_json = json.loads(await f.read())
d = cast(dict, resp_json)
- _LOGGER.debug(f"Sites data: {redact(str(d))}")
+ _LOGGER.debug("Sites data: %s", redact_lat_lon(str(d)))
for i in d['sites']:
- i['apikey'] = spl.strip()
+ i['apikey'] = api_key
#v4.0.14 to stop HA adding a pin to the map
i.pop('longitude', None)
i.pop('latitude', None)
- self._sites = self._sites + d['sites']
- self._sites_loaded = True
- _LOGGER.info(f"Loaded sites cache for {self.redact_api_key(spl)}")
+ self.sites = self.sites + d['sites']
+ self.sites_loaded = True
+ self._api_used_reset[api_key] = None
+ _LOGGER.info("Sites loaded for %s", self.redact_api_key(api_key))
else:
error = True
- _LOGGER.error(f"Cached sites are not yet available for {self.redact_api_key(spl)} to cope with Solcast API call failure")
- _LOGGER.error(f"At least one successful API 'get sites' call is needed, so the integration cannot function yet")
+ _LOGGER.error("Cached Solcast sites are not yet available for %s to cope with API call failure", self.redact_api_key(api_key))
+ _LOGGER.error("At least one successful API 'get sites' call is needed, so the integration will not function correctly")
if error:
- _LOGGER.error("Timed out getting Solcast sites, and one or more site caches failed to load")
- _LOGGER.error("This is critical, and the integration cannot function reliably yet")
- _LOGGER.error("Suggestion: Double check your overall HA configuration, specifically networking related")
- except Exception as e:
+                    _LOGGER.error("Suggestion: Check your overall HA configuration, specifically anything networking related (Is IPv6 an issue for you? DNS? Proxy?)")
+ except:
pass
except Exception as e:
- _LOGGER.error("Exception in sites_data(): %s", traceback.format_exc())
+ _LOGGER.error("Exception in sites_data(): %s: %s", e, traceback.format_exc())
async def sites_usage(self):
"""Load api usage cache"""
try:
+ if not self.sites_loaded:
+ _LOGGER.error("Internal error. Sites must be loaded before sites_usage() is called")
+ return
+
sp = self.options.api_key.split(",")
qt = self.options.api_quota.split(",")
try:
                for i in range(len(sp)): # If only one quota value is present, yet there are multiple API keys, use the same quota for each
- if len(qt) < i+1: qt.append(qt[i-1])
+ if len(qt) < i+1:
+ qt.append(qt[i-1])
quota = { sp[i].strip(): int(qt[i].strip()) for i in range(len(qt)) }
except Exception as e:
- _LOGGER.error('Exception: %s', e)
- _LOGGER.warning('Could not interpret API quota configuration string, using default of 10')
- quota = {}
- for i in range(len(sp)): quota[sp[i]] = 10
+ _LOGGER.error("Exception: %s", e)
+ _LOGGER.warning("Could not interpret API limit configuration string (%s), using default of 10", self.options.api_quota)
+                quota = {s.strip(): 10 for s in sp}
for spl in sp:
- sitekey = spl.strip()
- _LOGGER.debug(f"Getting API usage from cache for API key {self.redact_api_key(sitekey)}")
- apiCacheFileName = self.get_api_usage_cache_filename(sitekey)
- _LOGGER.debug(f"{'API usage cache ' + ('exists' if file_exists(apiCacheFileName) else 'does not yet exist')}")
- if file_exists(apiCacheFileName):
- async with aiofiles.open(apiCacheFileName) as f:
- usage = json.loads(await f.read())
- if usage['daily_limit'] != quota[spl]:
- usage['daily_limit'] = quota[spl]
- await self.write_api_usage_cache_file(apiCacheFileName, usage, sitekey)
- _LOGGER.info(f"API usage cache loaded and updated with new quota")
- else:
- _LOGGER.debug(f"API usage cache loaded")
+ api_key = spl.strip()
+ cache_filename = self.get_usage_cache_filename(api_key)
+ _LOGGER.debug("%s for %s", 'Usage cache ' + ('exists' if file_exists(cache_filename) else 'does not yet exist'), self.redact_api_key(api_key))
+ cache = True
+ if file_exists(cache_filename):
+ async with aiofiles.open(cache_filename) as f:
+ try:
+ usage = json.loads(await f.read(), cls=JSONDecoder)
+ except json.decoder.JSONDecodeError:
+ _LOGGER.error("The usage cache for %s is corrupt, re-creating cache with zero usage", self.redact_api_key(api_key))
+ cache = False
+ except Exception as e:
+ _LOGGER.error("Load usage cache exception %s for %s, re-creating cache with zero usage", e, self.redact_api_key(api_key))
+ cache = False
+ if cache:
+ self._api_limit[api_key] = usage.get("daily_limit", None)
+ self._api_used[api_key] = usage.get("daily_limit_consumed", None)
+ self._api_used_reset[api_key] = usage.get("reset", None)
+ if self._api_used_reset[api_key] is not None:
+ try:
+ self._api_used_reset[api_key] = parse_datetime(self._api_used_reset[api_key]).astimezone(timezone.utc)
+ except:
+ _LOGGER.error("Internal error parsing datetime from usage cache, continuing")
+ _LOGGER.error(traceback.format_exc())
+ self._api_used_reset[api_key] = None
+ if usage['daily_limit'] != quota[api_key]: # Limit has been adjusted, so rewrite the cache
+ self._api_limit[api_key] = quota[api_key]
+ await self.serialise_usage(api_key)
+ _LOGGER.info("Usage loaded and cache updated with new limit")
+ else:
+ _LOGGER.info("Usage loaded for %s", self.redact_api_key(api_key))
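+                            # A reset timestamp more than 24 hours old means the daily reset was missed (e.g. HA was not running at the time), so the cached count is stale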
+ if self._api_used_reset[api_key] is not None and self.get_real_now_utc() > self._api_used_reset[api_key] + timedelta(hours=24):
+                                _LOGGER.warning("Resetting usage for %s, last reset was more than 24 hours ago", self.redact_api_key(api_key))
+ self._api_used[api_key] = 0
+ await self.serialise_usage(api_key, reset=True)
else:
- _LOGGER.warning(f"No Solcast API usage cache found, creating one assuming zero API used")
- usage = {'daily_limit': quota[spl], 'daily_limit_consumed': 0}
- await self.write_api_usage_cache_file(apiCacheFileName, usage, sitekey)
+ cache = False
+ if not cache:
+ _LOGGER.warning("Creating usage cache for %s, assuming zero API used", self.redact_api_key(api_key))
+ self._api_limit[api_key] = quota[api_key]
+ self._api_used[api_key] = 0
+ await self.serialise_usage(api_key, reset=True)
+ _LOGGER.debug("API counter for %s is %d/%d", self.redact_api_key(api_key), self._api_used[api_key], self._api_limit[api_key])
+ except Exception as e:
+ _LOGGER.error("Exception in sites_usage(): %s: %s", e, traceback.format_exc())
- self._api_limit[sitekey] = usage.get("daily_limit", None)
- self._api_used[sitekey] = usage.get("daily_limit_consumed", None)
- _LOGGER.debug(f"API counter for {self.redact_api_key(sitekey)} is {self._api_used[sitekey]}/{self._api_limit[sitekey]}")
- except:
- _LOGGER.error("Exception in sites_usage(): %s", traceback.format_exc())
+ async def reset_api_usage(self):
+ """Reset the daily API usage counter"""
+        for api_key in self._api_used:
+ self._api_used[api_key] = 0
+ await self.serialise_usage(api_key, reset=True)
'''
async def sites_usage(self):
@@ -351,67 +426,70 @@ async def sites_usage(self):
sp = self.options.api_key.split(",")
for spl in sp:
- sitekey = spl.strip()
- params = {"api_key": sitekey}
- _LOGGER.debug(f"Getting API limit and usage from solcast for {self.redact_api_key(sitekey)}")
+ api_key = spl.strip()
+ _LOGGER.debug("Getting API limit and usage from solcast for %s", self.redact_api_key(api_key))
async with async_timeout.timeout(60):
- apiCacheFileName = self.get_api_usage_cache_filename(sitekey)
- _LOGGER.debug(f"{'API usage cache ' + ('exists' if file_exists(apiCacheFileName) else 'does not yet exist')}")
+ cache_filename = self.get_usage_cache_filename(api_key)
+ _LOGGER.debug("%s", 'API usage cache ' + ('exists' if file_exists(cache_filename) else 'does not yet exist'))
+ url = f"{self.options.host}/json/reply/GetUserUsageAllowance"
+ params = {"api_key": api_key}
retries = 3
retry = retries
success = False
- useCacheImmediate = False
- cacheExists = file_exists(apiCacheFileName)
+ use_cache_immediate = False
+ cache_exists = file_exists(cache_filename)
while retry > 0:
- resp: ClientResponse = await self.aiohttp_session.get(
- url=f"{self.options.host}/json/reply/GetUserUsageAllowance", params=params, ssl=False
- )
+ resp: ClientResponse = await self._aiohttp_session.get(url=url, params=params, ssl=False)
status = resp.status
try:
resp_json = await resp.json(content_type=None)
except json.decoder.JSONDecodeError:
_LOGGER.error("JSONDecodeError in sites_usage() - Solcast site could be having problems")
except: raise
- _LOGGER.debug(f"HTTP session returned status {translate(status)} in sites_usage()")
+ _LOGGER.debug("HTTP session returned status %s in sites_usage()", translate(status))
if status == 200:
- await self.write_api_usage_cache_file(apiCacheFileName, resp_json, sitekey)
+ d = cast(dict, resp_json)
+ self._api_limit[api_key] = d.get("daily_limit", None)
+ self._api_used[api_key] = d.get("daily_limit_consumed", None)
+ await self.serialise_usage(api_key)
retry = 0
success = True
else:
- if cacheExists:
- useCacheImmediate = True
+ if cache_exists:
+ use_cache_immediate = True
break
- _LOGGER.debug(f"Will retry GetUserUsageAllowance, retry {(retries - retry) + 1}")
+ _LOGGER.debug("Will retry GetUserUsageAllowance, retry %d", (retries - retry) + 1)
await asyncio.sleep(5)
retry -= 1
if not success:
- if not useCacheImmediate:
- _LOGGER.warning(f"Timeout getting Solcast API usage allowance, last call result: {translate(status)}, using cached data if it exists")
+ if not use_cache_immediate:
+ _LOGGER.warning("Timeout getting Solcast API usage allowance, last call result: %s, using cached data if it exists", translate(status))
status = 404
- if cacheExists:
- async with aiofiles.open(apiCacheFileName) as f:
+ if cache_exists:
+ async with aiofiles.open(cache_filename) as f:
resp_json = json.loads(await f.read())
status = 200
- _LOGGER.info(f"Loaded API usage cache")
+ d = cast(dict, resp_json)
+ self._api_limit[api_key] = d.get("daily_limit", None)
+ self._api_used[api_key] = d.get("daily_limit_consumed", None)
+ _LOGGER.info("Loaded API usage cache")
else:
- _LOGGER.warning(f"No Solcast API usage cache found")
+ _LOGGER.warning("No Solcast API usage cache found")
if status == 200:
- d = cast(dict, resp_json)
- self._api_limit[sitekey] = d.get("daily_limit", None)
- self._api_used[sitekey] = d.get("daily_limit_consumed", None)
- _LOGGER.debug(f"API counter for {self.redact_api_key(sitekey)} is {self._api_used[sitekey]}/{self._api_limit[sitekey]}")
+ _LOGGER.debug("API counter for %s is %d/%d", self.redact_api_key(api_key), self._api_used[api_key], self._api_limit[api_key])
else:
- self._api_limit[sitekey] = 10
- self._api_used[sitekey] = 0
+ self._api_limit[api_key] = 10
+ self._api_used[api_key] = 0
+ await self.serialise_usage(api_key)
raise Exception(f"Gathering site usage failed in sites_usage(). Request returned Status code: {translate(status)} - Response: {resp_json}.")
except json.decoder.JSONDecodeError:
_LOGGER.error("JSONDecodeError in sites_usage(): Solcast site could be having problems")
- except ConnectionRefusedError as err:
- _LOGGER.error("Error in sites_usage(): %s", err)
+ except ConnectionRefusedError as e:
+ _LOGGER.error("Error in sites_usage(): %s", e)
except ClientConnectionError as e:
- _LOGGER.error('Connection error in sites_usage(): %s', str(e))
+ _LOGGER.error('Connection error in sites_usage(): %s', e)
except asyncio.TimeoutError:
_LOGGER.error("Connection error in sites_usage(): Timed out connecting to solcast server")
except Exception as e:
@@ -423,33 +501,31 @@ async def sites_weather(self):
"""Request site weather byline"""
try:
- if len(self._sites) > 0:
+ if len(self.sites) > 0:
sp = self.options.api_key.split(",")
- rid = self._sites[0].get("resource_id", None)
-
+ rid = self.sites[0].get("resource_id", None)
+ url=f"{self.options.host}/json/reply/GetRooftopSiteSparklines"
params = {"resourceId": rid, "api_key": sp[0]}
- _LOGGER.debug(f"Get weather byline")
+ _LOGGER.debug("Get weather byline")
async with async_timeout.timeout(60):
- resp: ClientResponse = await self.aiohttp_session.get(
- url=f"https://api.solcast.com.au/json/reply/GetRooftopSiteSparklines", params=params, ssl=False
- )
+ resp: ClientResponse = await self._aiohttp_session.get(url=url, params=params, ssl=False)
resp_json = await resp.json(content_type=None)
status = resp.status
if status == 200:
d = cast(dict, resp_json)
- _LOGGER.debug(f"Returned data in sites_weather(): {d}")
+ _LOGGER.debug("Returned data in sites_weather(): %s", str(d))
self._weather = d.get("forecast_descriptor", None).get("description", None)
- _LOGGER.debug(f"Weather description: {self._weather}")
+ _LOGGER.debug("Weather description: %s", self._weather)
else:
raise Exception(f"Gathering weather description failed. request returned Status code: {translate(status)} - Response: {resp_json}.")
except json.decoder.JSONDecodeError:
_LOGGER.error("JSONDecodeError in sites_weather(): Solcast site could be having problems")
- except ConnectionRefusedError as err:
- _LOGGER.error("Error in sites_weather(): %s", err)
+ except ConnectionRefusedError as e:
+ _LOGGER.error("Error in sites_weather(): %s", e)
except ClientConnectionError as e:
- _LOGGER.error("Connection error in sites_weather(): %s", str(e))
+ _LOGGER.error("Connection error in sites_weather(): %s", e)
except asyncio.TimeoutError:
_LOGGER.error("Connection Error in sites_weather(): Timed out connection to solcast server")
except Exception as e:
@@ -457,62 +533,66 @@ async def sites_weather(self):
'''
async def load_saved_data(self):
+ """Load the saved solcast.json data, also checking for new API keys and site removal"""
try:
- if len(self._sites) > 0:
+ status = ''
+ if len(self.sites) > 0:
if file_exists(self._filename):
async with aiofiles.open(self._filename) as data_file:
- jsonData = json.loads(await data_file.read(), cls=JSONDecoder)
- json_version = jsonData.get("version", 1)
- #self._weather = jsonData.get("weather", "unknown")
- _LOGGER.debug(f"The saved data file exists, file type is {type(jsonData)}")
+ json_data = json.loads(await data_file.read(), cls=JSONDecoder)
+ json_version = json_data.get("version", 1)
+ #self._weather = json_data.get("weather", "unknown")
+ _LOGGER.debug("Data cache exists, file type is %s", type(json_data))
if json_version == _JSON_VERSION:
- self._data = jsonData
+ self._data = json_data
self._loaded_data = True
# Check for any new API keys so no sites data yet for those
ks = {}
- for d in self._sites:
- if not any(s == d.get('resource_id', '') for s in jsonData['siteinfo']):
+ for d in self.sites:
+ if not any(s == d.get('resource_id', '') for s in json_data['siteinfo']):
ks[d.get('resource_id')] = d.get('apikey')
if len(ks.keys()) > 0:
# Some site data does not exist yet so get it
- _LOGGER.info("New site(s) have been added, so getting forecast data for just those site(s)")
- for a in ks:
- await self.http_data_call(self.get_api_usage_cache_filename(ks[a]), r_id=a, api=ks[a], dopast=True)
+ _LOGGER.info("New site(s) have been added, so getting forecast data for them")
+                            for a, _api_key in ks.items():
+ await self.http_data_call(r_id=a, api=_api_key, dopast=True)
await self.serialize_data()
# Check for sites that need to be removed
l = []
- for s in jsonData['siteinfo']:
- if not any(d.get('resource_id', '') == s for d in self._sites):
- _LOGGER.info(f"Solcast site resource id {s} is no longer configured, removing saved data from cached file")
+ for s in json_data['siteinfo']:
+ if not any(d.get('resource_id', '') == s for d in self.sites):
+ _LOGGER.warning("Solcast site resource id %s is no longer configured, removing saved data from cached file", s)
l.append(s)
for ll in l:
- del jsonData['siteinfo'][ll]
+ del json_data['siteinfo'][ll]
# Create an up to date forecast
await self.buildforecastdata()
- _LOGGER.info(f"Loaded solcast.json forecast cache")
+ _LOGGER.info("Data loaded")
if not self._loaded_data:
# No file to load
- _LOGGER.warning(f"There is no solcast.json to load, so fetching solar forecast, including past forecasts")
+ _LOGGER.warning("There is no solcast.json to load, so fetching solar forecast, including past forecasts")
                    # Could be a brand new install of the integration, or the file has been removed. Poll once now...
- await self.http_data(dopast=True)
-
- if self._loaded_data: return True
+ status = await self.http_data(dopast=True)
else:
- _LOGGER.error(f"Solcast site count is zero in load_saved_data(); the get sites must have failed, and there is no sites cache")
+ _LOGGER.error("Solcast site count is zero in load_saved_data(); the get sites must have failed, and there is no sites cache")
+ status = 'Solcast sites count is zero, add sites'
except json.decoder.JSONDecodeError:
_LOGGER.error("The cached data in solcast.json is corrupt in load_saved_data()")
+ status = 'The cached data in /config/solcast.json is corrupted, suggest removing or repairing it'
except Exception as e:
_LOGGER.error("Exception in load_saved_data(): %s", traceback.format_exc())
- return False
+ status = f"Exception in load_saved_data(): {e}"
+ return status
- async def delete_solcast_file(self, *args):
- _LOGGER.debug(f"Service event to delete old solcast.json file")
+ async def delete_solcast_file(self, *args): # pylint: disable=W0613
+ """Service event to delete old solcast.json file"""
+ _LOGGER.debug("Service event to delete old solcast.json file")
try:
if file_exists(self._filename):
os.remove(self._filename)
@@ -522,9 +602,10 @@ async def delete_solcast_file(self, *args):
else:
_LOGGER.warning("There is no solcast.json to delete")
except Exception:
- _LOGGER.error(f"Service event to delete old solcast.json file failed")
+ _LOGGER.error("Service event to delete old solcast.json file failed")
async def get_forecast_list(self, *args):
+ """Service event to get list of forecasts"""
try:
st_time = time.time()
@@ -539,23 +620,22 @@ async def get_forecast_list(self, *args):
return tuple( {**d, "period_start": d["period_start"].astimezone(self._tz)} for d in h )
except Exception:
- _LOGGER.error(f"Service event to get list of forecasts failed")
+ _LOGGER.error("Service event to get list of forecasts failed")
return None
def get_api_used_count(self):
- """Return API polling count for this UTC 24hr period"""
+ """Return total API polling count for this UTC 24hr period (all accounts combined)"""
used = 0
- for _, v in self._api_used.items(): used += v
+ for _, v in self._api_used.items():
+ used += v
return used
def get_api_limit(self):
- """Return API polling limit for this account"""
- try:
- limit = 0
- for _, v in self._api_limit.items(): limit += v
- return limit
- except Exception:
- return None
+ """Return API polling limit (all accounts combined)"""
+ limit = 0
+ for _, v in self._api_limit.items():
+ limit += v
+ return limit
# def get_weather(self):
# """Return weather description"""
@@ -567,12 +647,13 @@ def get_last_updated_datetime(self) -> dt:
def get_rooftop_site_total_today(self, site) -> float:
"""Return total kW for today for a site"""
- if self._tally.get(site) == None: _LOGGER.warning(f"Site total kW forecast today is currently unavailable for {site}")
+ if self._tally.get(site) is None:
+ _LOGGER.warning("Site total kW forecast today is currently unavailable for %s", site)
return self._tally.get(site)
def get_rooftop_site_extra_data(self, site = ""):
"""Return information about a site"""
- g = tuple(d for d in self._sites if d["resource_id"] == site)
+ g = tuple(d for d in self.sites if d["resource_id"] == site)
if len(g) != 1:
raise ValueError(f"Unable to find site {site}")
site: Dict[str, Any] = g[0]
@@ -593,21 +674,29 @@ def get_rooftop_site_extra_data(self, site = ""):
return ret
def get_now_utc(self):
+ """Datetime helper"""
return dt.now(self._tz).replace(second=0, microsecond=0).astimezone(timezone.utc)
+ def get_real_now_utc(self):
+ """Datetime helper"""
+ return dt.now(self._tz).astimezone(timezone.utc)
+
def get_interval_start_utc(self, moment):
+ """Datetime helper"""
n = moment.replace(second=0, microsecond=0)
return n.replace(minute=0 if n.minute < 30 else 30).astimezone(timezone.utc)
def get_hour_start_utc(self):
+ """Datetime helper"""
return dt.now(self._tz).replace(minute=0, second=0, microsecond=0).astimezone(timezone.utc)
def get_day_start_utc(self):
+ """Datetime helper"""
return dt.now(self._tz).replace(hour=0, minute=0, second=0, microsecond=0).astimezone(timezone.utc)
def get_forecast_day(self, futureday) -> Dict[str, Any]:
"""Return forecast data for the Nth day ahead"""
- noDataError = True
+ no_data_error = True
start_utc = self.get_day_start_utc() + timedelta(days=futureday)
end_utc = start_utc + timedelta(days=1)
@@ -624,7 +713,7 @@ def get_forecast_day(self, futureday) -> Dict[str, Any]:
tup = tuple( {**d, "period_start": d["period_start"].astimezone(self._tz)} for d in h )
if len(tup) < 48:
- noDataError = False
+ no_data_error = False
hourlytup = []
for index in range(0,len(tup),2):
@@ -639,16 +728,18 @@ def get_forecast_day(self, futureday) -> Dict[str, Any]:
x2 = round((tup[index]["pv_estimate10"]), 4)
x3 = round((tup[index]["pv_estimate90"]), 4)
hourlytup.append({"period_start":tup[index]["period_start"], "pv_estimate":x1, "pv_estimate10":x2, "pv_estimate90":x3})
- except Exception as ex:
- _LOGGER.error("Exception in get_forecast_day(): %s", ex)
+ except Exception as e:
+ _LOGGER.error("Exception in get_forecast_day(): %s", e)
_LOGGER.error(traceback.format_exc())
res = {
"dayname": start_utc.astimezone(self._tz).strftime("%A"),
- "dataCorrect": noDataError,
+ "dataCorrect": no_data_error,
}
- if self.options.attr_brk_halfhourly: res["detailedForecast"] = tup
- if self.options.attr_brk_hourly: res["detailedHourly"] = hourlytup
+ if self.options.attr_brk_halfhourly:
+ res["detailedForecast"] = tup
+ if self.options.attr_brk_hourly:
+ res["detailedHourly"] = hourlytup
return res
def get_forecast_n_hour(self, n_hour, site=None, _use_data_field=None) -> int:
@@ -659,14 +750,17 @@ def get_forecast_n_hour(self, n_hour, site=None, _use_data_field=None) -> int:
return res
def get_forecasts_n_hour(self, n_hour) -> Dict[str, Any]:
+ """Return forecast for the Nth hour for all sites and individual sites"""
res = {}
if self.options.attr_brk_site:
- for site in self._sites:
+ for site in self.sites:
res[site['resource_id']] = self.get_forecast_n_hour(n_hour, site=site['resource_id'])
for _data_field in ('pv_estimate', 'pv_estimate10', 'pv_estimate90'):
- if self._estimen.get(_data_field): res[_data_field.replace('pv_','')+'-'+site['resource_id']] = self.get_forecast_n_hour(n_hour, site=site['resource_id'], _use_data_field=_data_field)
+ if self._estimate_set.get(_data_field):
+ res[_data_field.replace('pv_','')+'-'+site['resource_id']] = self.get_forecast_n_hour(n_hour, site=site['resource_id'], _use_data_field=_data_field)
for _data_field in ('pv_estimate', 'pv_estimate10', 'pv_estimate90'):
- if self._estimen.get(_data_field): res[_data_field.replace('pv_','')] = self.get_forecast_n_hour(n_hour, _use_data_field=_data_field)
+ if self._estimate_set.get(_data_field):
+ res[_data_field.replace('pv_','')] = self.get_forecast_n_hour(n_hour, _use_data_field=_data_field)
return res
def get_forecast_custom_hours(self, n_hours, site=None, _use_data_field=None) -> int:
@@ -677,14 +771,17 @@ def get_forecast_custom_hours(self, n_hours, site=None, _use_data_field=None) ->
return res
def get_forecasts_custom_hours(self, n_hour) -> Dict[str, Any]:
+ """Return forecast for the next N hours for all sites and individual sites"""
res = {}
if self.options.attr_brk_site:
- for site in self._sites:
+ for site in self.sites:
res[site['resource_id']] = self.get_forecast_custom_hours(n_hour, site=site['resource_id'])
for _data_field in ('pv_estimate', 'pv_estimate10', 'pv_estimate90'):
- if self._estimen.get(_data_field): res[_data_field.replace('pv_','')+'-'+site['resource_id']] = self.get_forecast_custom_hours(n_hour, site=site['resource_id'], _use_data_field=_data_field)
+ if self._estimate_set.get(_data_field):
+ res[_data_field.replace('pv_','')+'-'+site['resource_id']] = self.get_forecast_custom_hours(n_hour, site=site['resource_id'], _use_data_field=_data_field)
for _data_field in ('pv_estimate', 'pv_estimate10', 'pv_estimate90'):
- if self._estimen.get(_data_field): res[_data_field.replace('pv_','')] = self.get_forecast_custom_hours(n_hour, _use_data_field=_data_field)
+ if self._estimate_set.get(_data_field):
+ res[_data_field.replace('pv_','')] = self.get_forecast_custom_hours(n_hour, _use_data_field=_data_field)
return res
def get_power_n_mins(self, n_mins, site=None, _use_data_field=None) -> int:
@@ -693,14 +790,17 @@ def get_power_n_mins(self, n_mins, site=None, _use_data_field=None) -> int:
return round(1000 * self.get_forecast_pv_moment(time_utc, site=site, _use_data_field=_use_data_field))
def get_sites_power_n_mins(self, n_mins) -> Dict[str, Any]:
+ """Return expected power generation in the next N minutes for all sites and individual sites"""
res = {}
if self.options.attr_brk_site:
- for site in self._sites:
+ for site in self.sites:
res[site['resource_id']] = self.get_power_n_mins(n_mins, site=site['resource_id'])
for _data_field in ('pv_estimate', 'pv_estimate10', 'pv_estimate90'):
- if self._estimen.get(_data_field): res[_data_field.replace('pv_','')+'-'+site['resource_id']] = self.get_power_n_mins(n_mins, site=site['resource_id'], _use_data_field=_data_field)
+ if self._estimate_set.get(_data_field):
+ res[_data_field.replace('pv_','')+'-'+site['resource_id']] = self.get_power_n_mins(n_mins, site=site['resource_id'], _use_data_field=_data_field)
for _data_field in ('pv_estimate', 'pv_estimate10', 'pv_estimate90'):
- if self._estimen.get(_data_field): res[_data_field.replace('pv_','')] = self.get_power_n_mins(n_mins, site=None, _use_data_field=_data_field)
+ if self._estimate_set.get(_data_field):
+ res[_data_field.replace('pv_','')] = self.get_power_n_mins(n_mins, site=None, _use_data_field=_data_field)
return res
def get_peak_w_day(self, n_day, site=None, _use_data_field=None) -> int:
@@ -712,14 +812,17 @@ def get_peak_w_day(self, n_day, site=None, _use_data_field=None) -> int:
return 0 if res is None else round(1000 * res[_data_field])
def get_sites_peak_w_day(self, n_day) -> Dict[str, Any]:
+ """Return max kW for site N days ahead for all sites and individual sites"""
res = {}
if self.options.attr_brk_site:
- for site in self._sites:
+ for site in self.sites:
res[site['resource_id']] = self.get_peak_w_day(n_day, site=site['resource_id'])
for _data_field in ('pv_estimate', 'pv_estimate10', 'pv_estimate90'):
- if self._estimen.get(_data_field): res[_data_field.replace('pv_','')+'-'+site['resource_id']] = self.get_peak_w_day(n_day, site=site['resource_id'], _use_data_field=_data_field)
+ if self._estimate_set.get(_data_field):
+ res[_data_field.replace('pv_','')+'-'+site['resource_id']] = self.get_peak_w_day(n_day, site=site['resource_id'], _use_data_field=_data_field)
for _data_field in ('pv_estimate', 'pv_estimate10', 'pv_estimate90'):
- if self._estimen.get(_data_field): res[_data_field.replace('pv_','')] = self.get_peak_w_day(n_day, site=None, _use_data_field=_data_field)
+ if self._estimate_set.get(_data_field):
+ res[_data_field.replace('pv_','')] = self.get_peak_w_day(n_day, site=None, _use_data_field=_data_field)
return res
def get_peak_w_time_day(self, n_day, site=None, _use_data_field=None) -> dt:
@@ -730,14 +833,17 @@ def get_peak_w_time_day(self, n_day, site=None, _use_data_field=None) -> dt:
return res if res is None else res["period_start"]
def get_sites_peak_w_time_day(self, n_day) -> Dict[str, Any]:
+ """Return hour of max kW for site N days ahead for all sites and individual sites"""
res = {}
if self.options.attr_brk_site:
- for site in self._sites:
+ for site in self.sites:
res[site['resource_id']] = self.get_peak_w_time_day(n_day, site=site['resource_id'])
for _data_field in ('pv_estimate', 'pv_estimate10', 'pv_estimate90'):
- if self._estimen.get(_data_field): res[_data_field.replace('pv_','')+'-'+site['resource_id']] = self.get_peak_w_time_day(n_day, site=site['resource_id'], _use_data_field=_data_field)
+ if self._estimate_set.get(_data_field):
+ res[_data_field.replace('pv_','')+'-'+site['resource_id']] = self.get_peak_w_time_day(n_day, site=site['resource_id'], _use_data_field=_data_field)
for _data_field in ('pv_estimate', 'pv_estimate10', 'pv_estimate90'):
- if self._estimen.get(_data_field): res[_data_field.replace('pv_','')] = self.get_peak_w_time_day(n_day, site=None, _use_data_field=_data_field)
+ if self._estimate_set.get(_data_field):
+ res[_data_field.replace('pv_','')] = self.get_peak_w_time_day(n_day, site=None, _use_data_field=_data_field)
return res
def get_forecast_remaining_today(self, site=None, _use_data_field=None) -> float:
@@ -749,14 +855,17 @@ def get_forecast_remaining_today(self, site=None, _use_data_field=None) -> float
return res
def get_forecasts_remaining_today(self) -> Dict[str, Any]:
+ """Return remaining forecasted production for today for all sites and individual sites"""
res = {}
if self.options.attr_brk_site:
- for site in self._sites:
+ for site in self.sites:
res[site['resource_id']] = self.get_forecast_remaining_today(site=site['resource_id'])
for _data_field in ('pv_estimate', 'pv_estimate10', 'pv_estimate90'):
- if self._estimen.get(_data_field): res[_data_field.replace('pv_','')+'-'+site['resource_id']] = self.get_forecast_remaining_today(site=site['resource_id'], _use_data_field=_data_field)
+ if self._estimate_set.get(_data_field):
+ res[_data_field.replace('pv_','')+'-'+site['resource_id']] = self.get_forecast_remaining_today(site=site['resource_id'], _use_data_field=_data_field)
for _data_field in ('pv_estimate', 'pv_estimate10', 'pv_estimate90'):
- if self._estimen.get(_data_field): res[_data_field.replace('pv_','')] = self.get_forecast_remaining_today(_use_data_field=_data_field)
+ if self._estimate_set.get(_data_field):
+ res[_data_field.replace('pv_','')] = self.get_forecast_remaining_today(_use_data_field=_data_field)
return res
def get_total_kwh_forecast_day(self, n_day, site=None, _use_data_field=None) -> float:
@@ -767,19 +876,23 @@ def get_total_kwh_forecast_day(self, n_day, site=None, _use_data_field=None) ->
return res
def get_sites_total_kwh_forecast_day(self, n_day) -> Dict[str, Any]:
+ """Return forecast kWh total for site N days ahead for all sites and individual sites"""
res = {}
if self.options.attr_brk_site:
- for site in self._sites:
+ for site in self.sites:
res[site['resource_id']] = self.get_total_kwh_forecast_day(n_day, site=site['resource_id'])
for _data_field in ('pv_estimate', 'pv_estimate10', 'pv_estimate90'):
- if self._estimen.get(_data_field): res[_data_field.replace('pv_','')+'-'+site['resource_id']] = self.get_total_kwh_forecast_day(n_day, site=site['resource_id'], _use_data_field=_data_field)
+ if self._estimate_set.get(_data_field):
+ res[_data_field.replace('pv_','')+'-'+site['resource_id']] = self.get_total_kwh_forecast_day(n_day, site=site['resource_id'], _use_data_field=_data_field)
for _data_field in ('pv_estimate', 'pv_estimate10', 'pv_estimate90'):
- if self._estimen.get(_data_field): res[_data_field.replace('pv_','')] = self.get_total_kwh_forecast_day(n_day, site=None, _use_data_field=_data_field)
+ if self._estimate_set.get(_data_field):
+ res[_data_field.replace('pv_','')] = self.get_total_kwh_forecast_day(n_day, site=None, _use_data_field=_data_field)
return res
- def get_forecast_list_slice(self, _data, start_utc, end_utc=None, search_past=False):
+ def get_forecast_list_slice(self, _data, start_utc, end_utc=None, search_past=False) -> tuple[int, int]:
"""Return pv_estimates list slice (st_i, end_i) for interval"""
- if end_utc is None: end_utc = start_utc + timedelta(seconds=1800)
+ if end_utc is None:
+ end_utc = start_utc + timedelta(seconds=1800)
crt_i = -1
st_i = -1
end_i = len(_data)
@@ -800,96 +913,84 @@ def get_forecast_list_slice(self, _data, start_utc, end_utc=None, search_past=Fa
end_i = 0
return st_i, end_i
- async def spline_moments(self):
- """A cubic spline to retrieve interpolated inter-interval momentary estimates for five minute periods"""
- df = ['pv_estimate']
- if self.options.attr_brk_estimate10: df.append('pv_estimate10')
- if self.options.attr_brk_estimate90: df.append('pv_estimate90')
- xx = [ i for i in range(0, 1800*len(self._spline_period), 300) ]
- _data = self._data_forecasts
- st, _ = self.get_forecast_list_slice(_data, self.get_day_start_utc()) # Get start of day index
- self.fc_moment['all'] = {}
+ def get_spline(self, spline, st, xx, _data, df, reducing=False) -> None:
+ """Build a forecast spline, momentary or day reducing"""
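+        # spline: the dict to populate (one list per data field); st: index of today's first interval in _data; xx: five-minute sample offsets in seconds; df: pv_estimate field names to build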
for _data_field in df:
if st > 0:
y = [_data[st+i][_data_field] for i in range(0, len(self._spline_period))]
- self.fc_moment['all'][_data_field] = cubic_interp(xx, self._spline_period, y)
- for j in xx:
- i = int(j/300)
- if math.copysign(1.0, self.fc_moment['all'][_data_field][i]) < 0: self.fc_moment['all'][_data_field][i] = 0.0 # Suppress negative values
- k = int(math.floor(j/1800))
- if k+1 <= len(y)-1 and y[k] == 0 and y[k+1] == 0: self.fc_moment['all'][_data_field][i] = 0.0 # Suppress spline bounce
- self.fc_moment['all'][_data_field] = ([0]*3) + self.fc_moment['all'][_data_field] # Shift right by fifteen minutes because 30-minute averages, padding
- else: # The list slice was not found, so zero the moments
- self.fc_moment['all'][_data_field] = [0] * (len(self._spline_period) * 6)
- if self.options.attr_brk_site:
- for site in self._sites:
- self.fc_moment[site['resource_id']] = {}
- _data = self._site_data_forecasts[site['resource_id']]
- st, _ = self.get_forecast_list_slice(_data, self.get_day_start_utc()) # Get start of day index
- for _data_field in df:
- if st > 0:
- y = [_data[st+i][_data_field] for i in range(0, len(self._spline_period))]
- self.fc_moment[site['resource_id']][_data_field] = cubic_interp(xx, self._spline_period, y)
- for j in xx:
- i = int(j/300)
- if math.copysign(1.0, self.fc_moment[site['resource_id']][_data_field][i]) < 0: self.fc_moment[site['resource_id']][_data_field][i] = 0.0 # Suppress negative values
- k = int(math.floor(j/1800))
- if k+1 <= len(y)-1 and y[k] == 0 and y[k+1] == 0: self.fc_moment[site['resource_id']][_data_field][i] = 0.0 # Suppress spline bounce
- self.fc_moment[site['resource_id']][_data_field] = ([0]*3) + self.fc_moment[site['resource_id']][_data_field] # Shift right by fifteen minutes because 30-minute averages, padding
- else: # The list slice was not found, so zero the moments
- self.fc_moment[site['resource_id']][_data_field] = [0] * (len(self._spline_period) * 6)
-
- def get_moment(self, site, _data_field, t):
- return self.fc_moment['all' if site is None else site][self._data_field if _data_field is None else _data_field][int(t / 300)]
-
- async def spline_remaining(self):
- """A cubic spline to retrieve interpolated inter-interval reducing estimates for five minute periods"""
- def buildY(_data, _data_field, st):
- y = []
- for i in range(0, len(self._spline_period)):
- rem = 0
- for j in range(i, len(self._spline_period)): rem += _data[st+j][_data_field]
- y.append(0.5 * rem)
- return y
- df = ['pv_estimate']
- if self.options.attr_brk_estimate10: df.append('pv_estimate10')
- if self.options.attr_brk_estimate90: df.append('pv_estimate90')
+ if reducing:
+ # Build a decreasing set of forecasted values instead
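+                # Each y value is the energy remaining from that interval to the end of the spline period (0.5 converts the half-hour kW averages to kWh)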
+ y = [0.5 * sum(y[i:]) for i in range(0, len(self._spline_period))]
+ spline[_data_field] = cubic_interp(xx, self._spline_period, y)
+ self.sanitise_spline(spline, _data_field, xx, y, reducing=reducing)
+ else: # The list slice was not found, so zero all values in the spline
+ spline[_data_field] = [0] * (len(self._spline_period) * 6)
+ if _SPLINE_DEBUG_LOGGING:
+ _LOGGER.debug(str(spline))
+
+ def sanitise_spline(self, spline, _data_field, xx, y, reducing=False) -> None:
+        """Ensure no negative values are returned, and shift the spline to account for the half-hour average input values"""
+ for j in xx:
+ i = int(j/300)
+ # Suppress negative values
+ if math.copysign(1.0, spline[_data_field][i]) < 0:
+ spline[_data_field][i] = 0.0
+ # Suppress spline bounce
+ if reducing:
+ if i+1 <= len(xx)-1 and spline[_data_field][i+1] > spline[_data_field][i]:
+ spline[_data_field][i+1] = spline[_data_field][i]
+ else:
+ k = int(math.floor(j/1800))
+ if k+1 <= len(y)-1 and y[k] == 0 and y[k+1] == 0:
+ spline[_data_field][i] = 0.0
+ # Shift right by fifteen minutes because 30-minute averages, padding as appropriate
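+        # (Three five-minute entries equal fifteen minutes, which treats each 30-minute average as the value at the mid-point of its interval)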
+ if reducing:
+ spline[_data_field] = ([spline[_data_field][0]]*3) + spline[_data_field]
+ else:
+ spline[_data_field] = ([0]*3) + spline[_data_field]
+
+ def build_splines(self, variant, reducing=False) -> None:
+ """Build cubic splines for interpolated inter-interval momentary or reducing estimates"""
+ df = ['pv_estimate'] + (['pv_estimate10'] if self.options.attr_brk_estimate10 else []) + (['pv_estimate90'] if self.options.attr_brk_estimate90 else [])
xx = [ i for i in range(0, 1800*len(self._spline_period), 300) ]
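+        # xx holds five-minute offsets in seconds spanning the whole spline period; the spline is evaluated at each offset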
- _data = self._data_forecasts
- st, _ = self.get_forecast_list_slice(_data, self.get_day_start_utc()) # Get start of day index
- self.fc_remaining['all'] = {}
- for _data_field in df:
- if st > 0:
- y = buildY(_data, _data_field, st)
- self.fc_remaining['all'][_data_field] = cubic_interp(xx, self._spline_period, y)
- for j in xx:
- i = int(j/300)
- k = int(math.floor(j/1800))
- if math.copysign(1.0, self.fc_remaining['all'][_data_field][i]) < 0: self.fc_remaining['all'][_data_field][i] = 0.0 # Suppress negative values
- if k+1 <= len(y)-1 and y[k] == y[k+1] and self.fc_remaining['all'][_data_field][i] > round(y[k],4): self.fc_remaining['all'][_data_field][i] = y[k] # Suppress spline bounce
- self.fc_remaining['all'][_data_field] = ([self.fc_remaining['all'][_data_field][0]]*3) + self.fc_remaining['all'][_data_field] # Shift right by fifteen minutes because 30-minute averages, padding
- else: # The list slice was not found, so zero the remainings
- self.fc_remaining['all'][_data_field] = [0] * (len(self._spline_period) * 6)
+ st, _ = self.get_forecast_list_slice(self._data_forecasts, self.get_day_start_utc()) # Get start of day index
+
+ variant['all'] = {}
+ self.get_spline(variant['all'], st, xx, self._data_forecasts, df, reducing=reducing)
if self.options.attr_brk_site:
- for site in self._sites:
- self.fc_remaining[site['resource_id']] = {}
- _data = self._site_data_forecasts[site['resource_id']]
- st, _ = self.get_forecast_list_slice(_data, self.get_day_start_utc()) # Get start of day index
- for _data_field in df:
- if st > 0:
- y = buildY(_data, _data_field, st)
- self.fc_remaining[site['resource_id']][_data_field] = cubic_interp(xx, self._spline_period, y)
- for j in xx:
- i = int(j/300)
- k = int(math.floor(j/1800))
- if math.copysign(1.0, self.fc_remaining[site['resource_id']][_data_field][i]) < 0: self.fc_remaining[site['resource_id']][_data_field][i] = 0.0 # Suppress negative values
- if k+1 <= len(y)-1 and y[k] == y[k+1] and self.fc_remaining[site['resource_id']][_data_field][i] > round(y[k],4): self.fc_remaining[site['resource_id']][_data_field][i] = y[k] # Suppress spline bounce
- self.fc_remaining[site['resource_id']][_data_field] = ([self.fc_remaining[site['resource_id']][_data_field][0]]*3) + self.fc_remaining[site['resource_id']][_data_field] # Shift right by fifteen minutes because 30-minute averages, padding
- else: # The list slice was not found, so zero the remainings
- self.fc_remaining[site['resource_id']][_data_field] = [0] * (len(self._spline_period) * 6)
-
- def get_remaining(self, site, _data_field, t):
- return self.fc_remaining['all' if site is None else site][self._data_field if _data_field is None else _data_field][int(t / 300)]
+            for site in self.sites:
+                variant[site['resource_id']] = {}
+                # Per-site splines are built from that site's own forecasts, with the day-start index found per site
+                site_data = self._site_data_forecasts[site['resource_id']]
+                st, _ = self.get_forecast_list_slice(site_data, self.get_day_start_utc())
+                self.get_spline(variant[site['resource_id']], st, xx, site_data, df, reducing=reducing)
+
+ async def spline_moments(self) -> None:
+ """Build the moments splines"""
+ try:
+ self.build_splines(self._forecasts_moment)
+ except Exception as e:
+ _LOGGER.debug('Exception in spline_moments(): %s', e)
+
+ def get_moment(self, site, _data_field, t) -> float:
+        """Get a value from a moment spline; the requested time must fall within today and on a five-minute boundary"""
+ try:
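+            # t is seconds since the local day start; spline entries are five minutes apart, hence dividing by 300 to index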
+ return self._forecasts_moment['all' if site is None else site][self._use_data_field if _data_field is None else _data_field][int(t / 300)]
+ except Exception as e:
+ _LOGGER.debug('Exception in get_moment(): %s', e)
+ return 0
+
+ async def spline_remaining(self) -> None:
+ """Build the descending splines"""
+ try:
+ self.build_splines(self._forecasts_remaining, reducing=True)
+ except Exception as e:
+ _LOGGER.debug('Exception in spline_remaining(): %s', e)
+
+ def get_remaining(self, site, _data_field, t) -> float:
+        """Get a value from a reducing spline; the requested time must fall within today and on a five-minute boundary"""
+ try:
+ return self._forecasts_remaining['all' if site is None else site][self._use_data_field if _data_field is None else _data_field][int(t / 300)]
+ except Exception as e:
+ _LOGGER.debug('Exception in get_remaining(): %s', e)
+ return 0
def get_forecast_pv_remaining(self, start_utc, end_utc=None, site=None, _use_data_field=None) -> float:
"""Return pv_estimates remaining for period"""
@@ -902,9 +1003,11 @@ def get_forecast_pv_remaining(self, start_utc, end_utc=None, site=None, _use_dat
res = self.get_remaining(site, _data_field, (start_utc - day_start).total_seconds())
if end_utc is not None:
end_utc = end_utc.replace(minute = math.floor(end_utc.minute / 5) * 5)
- if end_utc < day_start + timedelta(seconds=1800*len(self._spline_period)): # Spline data points are limited
+ if end_utc < day_start + timedelta(seconds=1800*len(self._spline_period)):
+ # End is within today so use spline data
res -= self.get_remaining(site, _data_field, (end_utc - day_start).total_seconds())
else:
+ # End is beyond today, so revert to simple linear interpolation
st_i2, _ = self.get_forecast_list_slice(_data, day_start + timedelta(seconds=1800*len(self._spline_period))) # Get post-spline day onwards start index
for d in _data[st_i2:end_i]:
d2 = d['period_start'] + timedelta(seconds=1800)
@@ -912,7 +1015,7 @@ def get_forecast_pv_remaining(self, start_utc, end_utc=None, site=None, _use_dat
f = 0.5 * d[_data_field]
if end_utc < d2:
s -= (d2 - end_utc).total_seconds()
- res += f * s / 1800 # Simple linear interpolation
+ res += f * s / 1800
else:
res += f
if _SENSOR_DEBUG_LOGGING: _LOGGER.debug(
@@ -922,9 +1025,9 @@ def get_forecast_pv_remaining(self, start_utc, end_utc=None, site=None, _use_dat
end_utc.strftime('%Y-%m-%d %H:%M:%S') if end_utc is not None else None,
st_i, end_i, round(res,4)
)
- return res
- except Exception as ex:
- _LOGGER.error(f"Exception in get_forecast_pv_remaining(): {ex}")
+ return res if res > 0 else 0
+ except Exception as e:
+ _LOGGER.error("Exception in get_forecast_pv_remaining(): %s", e)
_LOGGER.error(traceback.format_exc())
return 0
@@ -947,8 +1050,8 @@ def get_forecast_pv_estimates(self, start_utc, end_utc, site=None, _use_data_fie
st_i, end_i, round(res,4)
)
return res
- except Exception as ex:
- _LOGGER.error(f"Exception in get_forecast_pv_estimates(): {ex}")
+ except Exception as e:
+ _LOGGER.error("Exception in get_forecast_pv_estimates(): %s", e)
_LOGGER.error(traceback.format_exc())
return 0
@@ -965,8 +1068,8 @@ def get_forecast_pv_moment(self, time_utc, site=None, _use_data_field=None) -> f
time_utc.strftime('%Y-%m-%d %H:%M:%S'), (time_utc - day_start).total_seconds(), round(res, 4)
)
return res
- except Exception as ex:
- _LOGGER.error(f"Exception in get_forecast_pv_moment(): {ex}")
+ except Exception as e:
+ _LOGGER.error("Exception in get_forecast_pv_moment(): %s", e)
_LOGGER.error(traceback.format_exc())
return 0
@@ -975,10 +1078,10 @@ def get_max_forecast_pv_estimate(self, start_utc, end_utc, site=None, _use_data_
try:
_data = self._data_forecasts if site is None else self._site_data_forecasts[site]
_data_field = self._use_data_field if _use_data_field is None else _use_data_field
- res = None
st_i, end_i = self.get_forecast_list_slice(_data, start_utc, end_utc)
+ res = _data[st_i]
for d in _data[st_i:end_i]:
- if res is None or res[_data_field] < d[_data_field]:
+ if res[_data_field] < d[_data_field]:
res = d
if _SENSOR_DEBUG_LOGGING: _LOGGER.debug(
"Get max estimate: %s()%s %s st %s end %s st_i %d end_i %d res %s",
@@ -988,41 +1091,48 @@ def get_max_forecast_pv_estimate(self, start_utc, end_utc, site=None, _use_data_
st_i, end_i, res
)
return res
- except Exception as ex:
- _LOGGER.error(f"Exception in get_max_forecast_pv_estimate(): {ex}")
+ except Exception as e:
+ _LOGGER.error("Exception in get_max_forecast_pv_estimate(): %s", e)
_LOGGER.error(traceback.format_exc())
return None
def get_energy_data(self) -> dict[str, Any]:
+ """Get energy data"""
try:
- return self._dataenergy
- except Exception as ex:
- _LOGGER.error(f"Exception in get_energy_data(): {ex}")
+ return self._data_energy
+ except Exception as e:
+ _LOGGER.error("Exception in get_energy_data(): %s", e)
_LOGGER.error(traceback.format_exc())
return None
async def http_data(self, dopast = False):
"""Request forecast data for all sites"""
try:
- if self.get_last_updated_datetime() + timedelta(minutes=15) > dt.now(timezone.utc):
- _LOGGER.warning(f"Not requesting a forecast from Solcast because time is within fifteen minutes of last update ({self.get_last_updated_datetime().astimezone(self._tz)})")
- return
+ status = ''
+ if self.get_last_updated_datetime() + timedelta(minutes=1) > dt.now(timezone.utc):
+ status = f"Not requesting a forecast from Solcast because time is within one minute of last update ({self.get_last_updated_datetime().astimezone(self._tz)})"
+ _LOGGER.warning(status)
+ return status
failure = False
- sitesAttempted = 0
- for site in self._sites:
- sitesAttempted += 1
- _LOGGER.info(f"Getting forecast update for Solcast site {site['resource_id']}")
- result = await self.http_data_call(self.get_api_usage_cache_filename(site['apikey']), site['resource_id'], site['apikey'], dopast)
+ sites_attempted = 0
+ for site in self.sites:
+ sites_attempted += 1
+ _LOGGER.info("Getting forecast update for Solcast site %s", site['resource_id'])
+ result = await self.http_data_call(site['resource_id'], site['apikey'], dopast)
if not result:
failure = True
- if len(self._sites) > sitesAttempted:
- _LOGGER.warning('Forecast update for site %s failed, so not getting remaining sites', site['resource_id'])
+ if len(self.sites) > 1:
+ if sites_attempted < len(self.sites):
+ _LOGGER.warning('Forecast update for site %s failed so not getting remaining sites%s', site['resource_id'], ' - API use count may be odd' if len(self.sites) > 2 else '')
+ else:
+                            _LOGGER.warning('Forecast update for the last site queued (%s) failed - API use count may be odd', site['resource_id'])
else:
- _LOGGER.warning('Forecast update for the last site queued failed (%s), so not getting remaining sites - API use count will look odd', site['resource_id'])
+ _LOGGER.warning('Forecast update for site %s failed', site['resource_id'])
+                    status = 'At least one site forecast retrieval failed'
break
- if sitesAttempted > 0 and not failure:
+ if sites_attempted > 0 and not failure:
self._data["last_updated"] = dt.now(timezone.utc).isoformat()
#self._data["weather"] = self._weather
@@ -1032,20 +1142,23 @@ async def http_data(self, dopast = False):
await self.serialize_data()
else:
- if sitesAttempted > 0:
- _LOGGER.error("At least one Solcast site forecast failed to fetch, so forecast data has not been built")
+ if sites_attempted > 0:
+                    _LOGGER.error("At least one Solcast site forecast failed to fetch, so the forecast has not been built")
else:
- _LOGGER.error("No Solcast sites were attempted, so forecast data has not been built - check for earlier failure to retrieve sites")
- except Exception as ex:
- _LOGGER.error("Exception in http_data(): %s - Forecast data has not been built", ex)
+                    _LOGGER.error("Internal error: there is no sites data, so the forecast has not been built")
+            status = 'At least one site forecast retrieval failed'
+ except Exception as e:
+ status = f"Exception in http_data(): {e} - Forecast has not been built"
+ _LOGGER.error(status)
_LOGGER.error(traceback.format_exc())
+ return status
- async def http_data_call(self, usageCacheFileName, r_id = None, api = None, dopast = False):
+ async def http_data_call(self, r_id = None, api = None, dopast = False):
"""Request forecast data via the Solcast API"""
try:
lastday = self.get_day_start_utc() + timedelta(days=8)
numhours = math.ceil((lastday - self.get_now_utc()).total_seconds() / 3600)
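+            # Forecasts are requested out to the start of the day eight days from today, i.e. the remainder of today plus seven full days, rounded up to whole hours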
- _LOGGER.debug(f"Polling API for site {r_id} lastday {lastday} numhours {numhours}")
+ _LOGGER.debug('Polling API for site %s lastday %s numhours %d', r_id, lastday.strftime('%Y-%m-%d'), numhours)
_data = []
_data2 = []
@@ -1053,18 +1166,15 @@ async def http_data_call(self, usageCacheFileName, r_id = None, api = None, dopa
if dopast:
# Run once, for a new install or if the solcast.json file is deleted. This will use up api call quota.
ae = None
- resp_dict = await self.fetch_data(usageCacheFileName, "estimated_actuals", 168, site=r_id, apikey=api, cachedname="actuals")
+ resp_dict = await self.fetch_data("estimated_actuals", 168, site=r_id, apikey=api, cachedname="actuals")
if not isinstance(resp_dict, dict):
- _LOGGER.error(f"No data was returned for Solcast estimated_actuals so this WILL cause errors...")
- _LOGGER.error(f"Either your API limit is exhaused, Internet down, or networking is misconfigured...")
- _LOGGER.error(f"This almost certainly not a problem with the integration, and sensor values will be wrong"
- )
+                _LOGGER.error('No data was returned for estimated_actuals so this WILL cause issues. Your API limit may be exhausted, or Solcast has a problem...')
raise TypeError(f"Solcast API did not return a json object. Returned {resp_dict}")
ae = resp_dict.get("estimated_actuals", None)
if not isinstance(ae, list):
- raise TypeError(f"estimated actuals must be a list, not {type(ae)}")
+ raise TypeError(f"Estimated actuals must be a list, not {type(ae)}")
oldest = dt.now(self._tz).replace(hour=0,minute=0,second=0,microsecond=0) - timedelta(days=6)
oldest = oldest.astimezone(timezone.utc)
@@ -1086,7 +1196,7 @@ async def http_data_call(self, usageCacheFileName, r_id = None, api = None, dopa
}
)
- resp_dict = await self.fetch_data(usageCacheFileName, "forecasts", numhours, site=r_id, apikey=api, cachedname="forecasts")
+ resp_dict = await self.fetch_data("forecasts", numhours, site=r_id, apikey=api, cachedname="forecasts")
if resp_dict is None:
return False
@@ -1097,7 +1207,7 @@ async def http_data_call(self, usageCacheFileName, r_id = None, api = None, dopa
if not isinstance(af, list):
raise TypeError(f"forecasts must be a list, not {type(af)}")
- _LOGGER.debug(f"Solcast returned {len(af)} records")
+ _LOGGER.debug("Solcast returned %d records", len(af))
st_time = time.time()
for x in af:
@@ -1128,19 +1238,20 @@ async def http_data_call(self, usageCacheFileName, r_id = None, api = None, dopa
_LOGGER.debug("Forecasts dictionary length %s", len(_fcasts_dict))
+ #loop each site and its forecasts
for x in _data:
- #loop each site and its forecasts
-
itm = _fcasts_dict.get(x["period_start"])
if itm:
itm["pv_estimate"] = x["pv_estimate"]
itm["pv_estimate10"] = x["pv_estimate10"]
itm["pv_estimate90"] = x["pv_estimate90"]
else:
- _fcasts_dict[x["period_start"]] = {"period_start": x["period_start"],
- "pv_estimate": x["pv_estimate"],
- "pv_estimate10": x["pv_estimate10"],
- "pv_estimate90": x["pv_estimate90"]}
+ _fcasts_dict[x["period_start"]] = {
+ "period_start": x["period_start"],
+ "pv_estimate": x["pv_estimate"],
+ "pv_estimate10": x["pv_estimate10"],
+ "pv_estimate90": x["pv_estimate90"]
+ }
# _fcasts_dict contains all data for the site up to 730 days worth
# Delete data that is older than two years
@@ -1151,40 +1262,38 @@ async def http_data_call(self, usageCacheFileName, r_id = None, api = None, dopa
self._data['siteinfo'].update({r_id:{'forecasts': copy.deepcopy(_forecasts)}})
- _LOGGER.debug(f"HTTP data call processing took {round(time.time() - st_time, 4)}s")
+ _LOGGER.debug("HTTP data call processing took %.3f seconds", round(time.time() - st_time, 4))
return True
- except Exception as ex:
- _LOGGER.error("Exception in http_data_call(): %s", ex)
+ except Exception as e:
+ _LOGGER.error("Exception in http_data_call(): %s", e)
_LOGGER.error(traceback.format_exc())
return False
- async def fetch_data(self, usageCacheFileName, path="error", hours=168, site="", apikey="", cachedname="forcasts") -> dict[str, Any]:
+ async def fetch_data(self, path="error", hours=168, site="", apikey="", cachedname="forecasts") -> dict[str, Any]:
"""Fetch forecast data"""
try:
- params = {"format": "json", "api_key": apikey, "hours": hours}
- url=f"{self.options.host}/rooftop_sites/{site}/{path}"
- _LOGGER.debug(f"Fetch data url: {url}")
-
async with async_timeout.timeout(900):
- apiCacheFileName = self.configDir + '/' + cachedname + "_" + site + ".json"
- if self.apiCacheEnabled and file_exists(apiCacheFileName):
- status = 404
- async with aiofiles.open(apiCacheFileName) as f:
- resp_json = json.loads(await f.read())
- status = 200
- _LOGGER.debug(f"Got cached file data for site {site}")
+ if self._api_cache_enabled:
+ api_cache_filename = self._config_dir + '/' + cachedname + "_" + site + ".json"
+ if file_exists(api_cache_filename):
+ status = 404
+ async with aiofiles.open(api_cache_filename) as f:
+ resp_json = json.loads(await f.read())
+ status = 200
+ _LOGGER.debug("Offline cached mode enabled, loaded data for site %s", site)
else:
if self._api_used[apikey] < self._api_limit[apikey]:
+ url = f"{self.options.host}/rooftop_sites/{site}/{path}"
+ params = {"format": "json", "api_key": apikey, "hours": hours}
+ _LOGGER.debug("Fetch data url: %s", url)
tries = 10
counter = 0
backoff = 15 # On every retry the back-off increases by (at least) fifteen seconds more than the previous back-off
while True:
- _LOGGER.debug(f"Fetching forecast")
+ _LOGGER.debug("Fetching forecast")
counter += 1
- resp: ClientResponse = await self.aiohttp_session.get(
- url=url, params=params, ssl=False
- )
+ resp: ClientResponse = await self._aiohttp_session.get(url=url, params=params, ssl=False)
status = resp.status
if status == 200:
break
@@ -1196,15 +1305,12 @@ async def fetch_data(self, usageCacheFileName, path="error", hours=168, site="",
if rs is not None:
if rs.get('error_code') == 'TooManyRequests':
status = 998
- _LOGGER.debug(f"Exceeded daily free limit, setting API Counter to {self._api_limit[apikey]}")
self._api_used[apikey] = self._api_limit[apikey]
- await self.write_api_usage_cache_file(usageCacheFileName,
- {"daily_limit": self._api_limit[apikey], "daily_limit_consumed": self._api_used[apikey]},
- apikey)
+ await self.serialise_usage(apikey)
break
else:
+ status = 1000
_LOGGER.warning("An unexpected error occurred: %s", rs.get('message'))
- status = 1000 # Intentionally not handled below
break
except:
pass
@@ -1213,66 +1319,67 @@ async def fetch_data(self, usageCacheFileName, path="error", hours=168, site="",
break
# Solcast is busy, so delay (15 seconds * counter), plus a random number of seconds between zero and 15
delay = (counter * backoff) + random.randrange(0,15)
- _LOGGER.warning(f"The Solcast API is busy, pausing {delay} seconds before retry")
+ _LOGGER.warning("The Solcast API is busy, pausing %d seconds before retry", delay)
await asyncio.sleep(delay)
else:
break
if status == 200:
- _LOGGER.debug(f"Fetch successful")
+ _LOGGER.debug("Fetch successful")
- _LOGGER.debug(f"API returned data. API Counter incremented from {self._api_used[apikey]} to {self._api_used[apikey] + 1}")
- self._api_used[apikey] = self._api_used[apikey] + 1
- await self.write_api_usage_cache_file(usageCacheFileName,
- {"daily_limit": self._api_limit[apikey], "daily_limit_consumed": self._api_used[apikey]},
- apikey)
+ _LOGGER.debug("API returned data, API counter incremented from %d to %d", self._api_used[apikey], self._api_used[apikey] + 1)
+ self._api_used[apikey] += 1
+ await self.serialise_usage(apikey)
resp_json = await resp.json(content_type=None)
- if self.apiCacheEnabled:
- async with aiofiles.open(apiCacheFileName, 'w') as f:
- await f.write(json.dumps(resp_json, ensure_ascii=False))
- elif status == 998:
- _LOGGER.error(f"The Solcast API use quota has been exceeded, attempt failed")
+ if self._api_cache_enabled:
+ async with self._serialize_lock:
+ async with aiofiles.open(api_cache_filename, 'w') as f:
+ await f.write(json.dumps(resp_json, ensure_ascii=False))
+ elif status == 998: # Exceeded API limit
+ _LOGGER.error("API allowed polling limit has been exceeded, API counter set to %d/%d", self._api_used[apikey], self._api_limit[apikey])
+ return None
+ elif status == 999: # Attempts exhausted
+ _LOGGER.error("API was tried %d times, but all attempts failed", tries)
return None
- elif status == 999:
- _LOGGER.error(f"The Solcast API was tried {tries} times, but all attempts have failed")
+ elif status == 1000: # An unexpected response
return None
else:
- _LOGGER.error(f"Solcast API returned status {translate(status)}. API used is {self._api_used[apikey]}/{self._api_limit[apikey]}")
+ _LOGGER.error("API returned status %s, API used is %d/%d", translate(status), self._api_used[apikey], self._api_limit[apikey])
return None
else:
- _LOGGER.warning(f"API limit exceeded, not getting forecast")
+ _LOGGER.warning("API polling limit exhausted, not getting forecast for site %s, API used is %d/%d", site, self._api_used[apikey], self._api_limit[apikey])
return None
- _LOGGER.debug(f"HTTP session returned data type in fetch_data() is {type(resp_json)}")
- _LOGGER.debug(f"HTTP session status in fetch_data() is {translate(status)}")
+ _LOGGER.debug("HTTP session returned data type %s", type(resp_json))
+ _LOGGER.debug("HTTP session status %s", translate(status))
if status == 429:
- _LOGGER.warning("Solcast is too busy or exceeded API allowed polling limit, API used is {self._api_used[apikey]}/{self._api_limit[apikey]}")
+ _LOGGER.warning("Solcast is too busy, try again later")
elif status == 400:
- _LOGGER.warning(
- "Status {translate(status)}: The Solcast site is likely missing capacity, please specify capacity or provide historic data for tuning."
- )
+ _LOGGER.warning("Status %s: The Solcast site is likely missing capacity, please specify capacity or provide historic data for tuning", translate(status))
elif status == 404:
- _LOGGER.error(f"The Solcast site cannot be found, status {translate(status)} returned")
+ _LOGGER.error("The Solcast site cannot be found, status %s returned", translate(status))
elif status == 200:
d = cast(dict, resp_json)
- _LOGGER.debug(f"Status {translate(status)} in fetch_data(), returned: {d}")
+ if _FORECAST_DEBUG_LOGGING:
+ _LOGGER.debug("HTTP session returned: %s", str(d))
return d
#await self.format_json_data(d)
- except ConnectionRefusedError as err:
- _LOGGER.error("Connection error in fetch_data(), connection refused: %s", err)
+ except ConnectionRefusedError as e:
+ _LOGGER.error("Connection error in fetch_data(), connection refused: %s", e)
except ClientConnectionError as e:
- _LOGGER.error("Connection error in fetch_data(): %s", str(e))
+ _LOGGER.error("Connection error in fetch_data(): %s", e)
except asyncio.TimeoutError:
_LOGGER.error("Connection error in fetch_data(): Timed out connecting to Solcast API server")
- except Exception as e:
+ except:
_LOGGER.error("Exception in fetch_data(): %s", traceback.format_exc())
return None
def makeenergydict(self) -> dict:
+ """Make an energy-compatible dictionary"""
wh_hours = {}
try:
lastv = -1
@@ -1294,7 +1401,7 @@ def makeenergydict(self) -> dict:
lastk = d
lastv = v[self._use_data_field]
- except Exception as e:
+ except:
_LOGGER.error("Exception in makeenergydict(): %s", traceback.format_exc())
return wh_hours
@@ -1317,27 +1424,32 @@ async def buildforecastdata(self):
z = x["period_start"]
zz = z.astimezone(self._tz) #- timedelta(minutes=30)
- # v4.0.8 added code to dampen the forecast data: (* self._damp[h])
+ # v4.0.8 added code to dampen the forecast data: (* self.damp[h])
if yesterday < zz.date() < lastday:
h = f"{zz.hour}"
if zz.date() == today:
- tally += min(x[self._use_data_field] * 0.5 * self._damp[h], self._hardlimit)
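+ # Half-hourly interval: multiply the kW estimate by 0.5 to get kWh for today's running total (capped at the hard limit)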
+ tally += min(x[self._use_data_field] * 0.5 * self.damp[h], self.hard_limit)
# Add the forecast for this site to the total
itm = _fcasts_dict.get(z)
if itm:
- itm["pv_estimate"] = min(round(itm["pv_estimate"] + (x["pv_estimate"] * self._damp[h]),4), self._hardlimit)
- itm["pv_estimate10"] = min(round(itm["pv_estimate10"] + (x["pv_estimate10"] * self._damp[h]),4), self._hardlimit)
- itm["pv_estimate90"] = min(round(itm["pv_estimate90"] + (x["pv_estimate90"] * self._damp[h]),4), self._hardlimit)
+ itm["pv_estimate"] = min(round(itm["pv_estimate"] + (x["pv_estimate"] * self.damp[h]),4), self.hard_limit)
+ itm["pv_estimate10"] = min(round(itm["pv_estimate10"] + (x["pv_estimate10"] * self.damp[h]),4), self.hard_limit)
+ itm["pv_estimate90"] = min(round(itm["pv_estimate90"] + (x["pv_estimate90"] * self.damp[h]),4), self.hard_limit)
else:
_fcasts_dict[z] = {"period_start": z,
- "pv_estimate": min(round((x["pv_estimate"] * self._damp[h]),4), self._hardlimit),
- "pv_estimate10": min(round((x["pv_estimate10"] * self._damp[h]),4), self._hardlimit),
- "pv_estimate90": min(round((x["pv_estimate90"] * self._damp[h]),4), self._hardlimit)}
+ "pv_estimate": min(round((x["pv_estimate"] * self.damp[h]),4), self.hard_limit),
+ "pv_estimate10": min(round((x["pv_estimate10"] * self.damp[h]),4), self.hard_limit),
+ "pv_estimate90": min(round((x["pv_estimate90"] * self.damp[h]),4), self.hard_limit)}
# Record the individual site forecast
- _site_fcasts_dict[z] = {"period_start": z, "pv_estimate": round((x["pv_estimate"]),4), "pv_estimate10": round((x["pv_estimate10"]),4), "pv_estimate90": round((x["pv_estimate90"]),4)}
+ _site_fcasts_dict[z] = {
+ "period_start": z,
+ "pv_estimate": round((x["pv_estimate"]),4),
+ "pv_estimate10": round((x["pv_estimate10"]),4),
+ "pv_estimate90": round((x["pv_estimate90"]),4),
+ }
self._site_data_forecasts[site] = sorted(_site_fcasts_dict.values(), key=itemgetter("period_start"))
@@ -1346,33 +1458,34 @@ async def buildforecastdata(self):
self._data_forecasts = sorted(_fcasts_dict.values(), key=itemgetter("period_start"))
- self._forecasts_start_idx = self.calcForecastStartIndex()
+ self._forecasts_start_idx = self.calc_forecast_start_index()
- self._dataenergy = {"wh_hours": self.makeenergydict()}
+ self._data_energy = {"wh_hours": self.makeenergydict()}
- await self.checkDataRecords()
+ await self.check_data_records()
- _LOGGER.debug('Calculating splines')
+ _LOGGER.debug("Calculating splines")
await self.spline_moments()
await self.spline_remaining()
- _LOGGER.debug(f"Build forecast processing took {round(time.time()-st_time,4)}s")
+ _LOGGER.debug("Build forecast processing took %.3f seconds", round(time.time() - st_time, 4))
- except Exception as e:
+ except:
_LOGGER.error("Exception in http_data(): %s", traceback.format_exc())
- def calcForecastStartIndex(self):
+ def calc_forecast_start_index(self):
+ """Get the start index of the forecasts at the interval just before midnight (not at midnight itself, because some sensors may need the previous interval)"""
midnight_utc = self.get_day_start_utc()
- # Search in reverse (less to iterate) and find the interval just before midnight
- # (Doesn't stop at midnight because some sensors may need the previous interval)
- for idx in range(len(self._data_forecasts)-1, -1, -1):
- if self._data_forecasts[idx]["period_start"] < midnight_utc: break
- _LOGGER.debug("Calc forecast start index midnight: %s UTC, idx %s, len %s", midnight_utc.strftime('%Y-%m-%d %H:%M:%S'), idx, len(self._data_forecasts))
+ for idx in range(len(self._data_forecasts)-1, -1, -1): # Search in reverse (less to iterate)
+ if self._data_forecasts[idx]["period_start"] < midnight_utc:
+ break
+ _LOGGER.debug("Calc forecast start index midnight: %s UTC, idx %d, len %d", midnight_utc.strftime('%Y-%m-%d %H:%M:%S'), idx, len(self._data_forecasts))
return idx
- async def checkDataRecords(self):
+ async def check_data_records(self):
+ """Verify that all records are present for each day"""
for i in range(0, 8):
start_utc = self.get_day_start_utc() + timedelta(days=i)
end_utc = start_utc + timedelta(days=1)
@@ -1381,6 +1494,6 @@ async def checkDataRecords(self):
da = dt.now(self._tz).date() + timedelta(days=i)
if num_rec == 48:
- _LOGGER.debug(f"Data for {da} contains all 48 records")
+ _LOGGER.debug("Data for %s contains all 48 records", da.strftime('%Y-%m-%d'))
else:
- _LOGGER.debug(f"Data for {da} contains only {num_rec} of 48 records and may produce inaccurate forecast data")
\ No newline at end of file
+ _LOGGER.debug("Data for %s contains only %d of 48 records and may produce inaccurate forecast data", da.strftime('%Y-%m-%d'), num_rec)
\ No newline at end of file
diff --git a/custom_components/solcast_solar/spline.py b/custom_components/solcast_solar/spline.py
index 86cca871..4d61f5b8 100644
--- a/custom_components/solcast_solar/spline.py
+++ b/custom_components/solcast_solar/spline.py
@@ -1,69 +1,71 @@
+"""Cubic spline from one-dimensional arrays"""
+
+# pylint: disable=C0200, C0304, C0321, R0914
+
import math
def cubic_interp(x0, x, y):
"""
- Cubic spline from one-dimensional arrays
-
x0: Array of floats to interpolate at
x : Array of floats in increasing order
y : Array of floats to interpolate
- Returns array of interpolaated values
+ Returns array of interpolated values
"""
def diff(lst): # numpy-like diff
size = len(lst) - 1
r = [0] * size
- for i in range(size): r[i] = lst[i+1] - lst[i]
+ for i in range(size): r[i] = lst[i+1] - lst[i]
return r
-
- def clip(lst, min_val, max_val, inPlace = False): # numpy-like clip
- if not inPlace: lst = lst[:]
+
+ def clip(lst, min_val, max_val, in_place = False): # numpy-like clip
+ if not in_place: lst = lst[:]
for i in range(len(lst)):
if lst[i] < min_val:
lst[i] = min_val
elif lst[i] > max_val:
- lst[i] = max_val
+ lst[i] = max_val
return lst
-
- def searchsorted(listToInsert, insertInto): # numpy-like searchsorted
- def float_searchsorted(floatToInsert, insertInto):
- for i in range(len(insertInto)):
- if floatToInsert <= insertInto[i]: return i
- return len(insertInto)
- return [float_searchsorted(i, insertInto) for i in listToInsert]
-
+
+ def searchsorted(list_to_insert, insert_into): # numpy-like searchsorted
+ def float_searchsorted(float_to_insert, insert_into):
+ for i in range(len(insert_into)):
+ if float_to_insert <= insert_into[i]: return i
+ return len(insert_into)
+ return [float_searchsorted(i, insert_into) for i in list_to_insert]
+
def subtract(a, b):
return a - b
-
+
size = len(x)
xdiff = diff(x)
ydiff = diff(y)
- Li = [0] * size
- Li_1 = [0] * (size - 1)
+ li = [0] * size
+ li_1 = [0] * (size - 1)
z = [0] * (size)
- Li[0] = math.sqrt(2 * xdiff[0])
- Li_1[0] = 0.0
- B0 = 0.0
- z[0] = B0 / Li[0]
+ li[0] = math.sqrt(2 * xdiff[0])
+ li_1[0] = 0.0
+ b0 = 0.0
+ z[0] = b0 / li[0]
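+ # Forward substitution through the Cholesky-style factorisation of the tridiagonal spline system; b0 = bn = 0 gives a natural spline (zero curvature at the end points)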
for i in range(1, size - 1, 1):
- Li_1[i] = xdiff[i-1] / Li[i-1]
- Li[i] = math.sqrt(2 * (xdiff[i-1] + xdiff[i]) - Li_1[i-1] * Li_1[i-1])
- Bi = 6 * (ydiff[i] / xdiff[i] - ydiff[i-1] / xdiff[i-1])
- z[i] = (Bi - Li_1[i-1] * z[i-1]) / Li[i]
+ li_1[i] = xdiff[i-1] / li[i-1]
+ li[i] = math.sqrt(2 * (xdiff[i-1] + xdiff[i]) - li_1[i-1] * li_1[i-1])
+ bi = 6 * (ydiff[i] / xdiff[i] - ydiff[i-1] / xdiff[i-1])
+ z[i] = (bi - li_1[i-1] * z[i-1]) / li[i]
i = size - 1
- Li_1[i-1] = xdiff[-1] / Li[i-1]
- Li[i] = math.sqrt(2 * xdiff[-1] - Li_1[i-1] * Li_1[i-1])
- Bi = 0.0
- z[i] = (Bi - Li_1[i-1] * z[i-1]) / Li[i]
+ li_1[i-1] = xdiff[-1] / li[i-1]
+ li[i] = math.sqrt(2 * xdiff[-1] - li_1[i-1] * li_1[i-1])
+ bi = 0.0
+ z[i] = (bi - li_1[i-1] * z[i-1]) / li[i]
i = size - 1
- z[i] = z[i] / Li[i]
+ z[i] = z[i] / li[i]
for i in range(size - 2, -1, -1):
- z[i] = (z[i] - Li_1[i-1] * z[i+1]) / Li[i]
+ z[i] = (z[i] - li_1[i-1] * z[i+1]) / li[i]
index = searchsorted(x0, x)
index = clip(index, 1, size - 1)
@@ -82,8 +84,8 @@ def subtract(a, b):
zi0[j] / (6 * hi1[j]) * (xi1[j] - x0[j]) ** 3 + \
zi1[j] / (6 * hi1[j]) * (x0[j] - xi0[j]) ** 3 + \
(yi1[j] / hi1[j] - zi1[j] * hi1[j] / 6) * (x0[j] - xi0[j]) + \
- (yi0[j] / hi1[j] - zi0[j] * hi1[j] / 6) * (xi1[j] - x0[j])
+ (yi0[j] / hi1[j] - zi0[j] * hi1[j] / 6) * (xi1[j] - x0[j])
,4
)
-
+
return f0
\ No newline at end of file
diff --git a/custom_components/solcast_solar/strings.json b/custom_components/solcast_solar/strings.json
index 2c6af6c3..8bd9a841 100755
--- a/custom_components/solcast_solar/strings.json
+++ b/custom_components/solcast_solar/strings.json
@@ -7,9 +7,9 @@
"user": {
"data": {
"api_key": "API key (comma separate multiple values)",
- "api_quota": "API quota (optionally comma separate multiple values per key)"
+ "api_quota": "API limit (optionally comma separate multiple values per key)"
},
- "description": "Solcast API Account Details"
+ "description": "Solcast Account Details"
}
}
},
@@ -24,9 +24,9 @@
"api": {
"data": {
"api_key": "API key (comma separate multiple values)",
- "api_quota": "API quota (optionally comma separate multiple values per key)"
+ "api_quota": "API limit (optionally comma separate multiple values per key)"
},
- "description": "Solcast API Account Details"
+ "description": "Solcast Account Details"
},
"dampen": {
"data": {
@@ -83,10 +83,10 @@
"selector": {
"solcast_config_action": {
"options": {
- "configure_api": "Solcast API key",
- "configure_dampening": "Configure Dampening",
- "configure_customsensor": "Configure Custom Hour Sensor",
- "configure_attributes": "Configure Available Attributes"
+ "configure_api": "Solcast account details",
+ "configure_dampening": "Configure dampening",
+ "configure_customsensor": "Configure custom hours sensor",
+ "configure_attributes": "Configure available attributes"
}
}
},
diff --git a/custom_components/solcast_solar/system_health.py b/custom_components/solcast_solar/system_health.py
index b84fce1f..5aa85ffb 100644
--- a/custom_components/solcast_solar/system_health.py
+++ b/custom_components/solcast_solar/system_health.py
@@ -1,4 +1,7 @@
"""Provide info to system health."""
+
+# pylint: disable=C0304, E0401, W0212, W0613
+
from __future__ import annotations
from typing import Any
@@ -26,5 +29,5 @@ async def system_health_info(hass: HomeAssistant) -> dict[str, Any]:
return {
"can_reach_server": system_health.async_check_can_reach_url(hass, SOLCAST_URL),
"used_requests": used_requests,
- "rooftop_site_count": len(coordinator.solcast._sites),
+ "rooftop_site_count": len(coordinator.solcast.sites),
}
\ No newline at end of file
diff --git a/custom_components/solcast_solar/test.py b/custom_components/solcast_solar/test.py
index 2738498c..4ced50f2 100755
--- a/custom_components/solcast_solar/test.py
+++ b/custom_components/solcast_solar/test.py
@@ -1,35 +1,57 @@
+"""Integration test - development only"""
#!/usr/bin/python3
+# pylint: disable=C0304, E0401, W0702
+
import asyncio
import logging
import traceback
+from aiohttp import ClientSession
-from aiohttp import ClientConnectionError, ClientSession
+from .const import SOLCAST_URL
from .solcastapi import ConnectionOptions, SolcastApi
-#logging.basicConfig(level=logging.DEBUG)
+logging.basicConfig(level=logging.DEBUG)
_LOGGER = logging.getLogger(__name__)
async def test():
+ """testing"""
+ print('This script is for development purposes only')
try:
-
+ optdamp = {}
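+ # Unity dampening factor for every hour of the day (no dampening applied)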
+ for a in range(0,24):
+ optdamp[str(a)] = 1.0
+
options = ConnectionOptions(
- "changetoyourapikey",
- "https://api.solcast.com.au",
- 'solcast.json'
+ "apikeygoeshere",
+ SOLCAST_URL,
+ 'solcast.json',
+ "/config",
+ "Australia/Sydney",
+ optdamp,
+ 1,
+ "estimate",
+ 100,
+ True,
+ True,
+ True,
+ True,
+ True,
+ True
)
-
+
async with ClientSession() as session:
- solcast = SolcastApi(session, options, apiCacheEnabled=True)
+ solcast = SolcastApi(session, options, api_cache_enabled=True)
await solcast.sites_data()
+ await solcast.sites_usage()
await solcast.load_saved_data()
- print("Total today " + str(solcast.get_total_kwh_forecast_today()))
- print("Peak today " + str(solcast.get_peak_w_today()))
- print("Peak time today " + str(solcast.get_peak_w_time_today()))
- except Exception as err:
- _LOGGER.error("async_setup_entry: %s",traceback.format_exc())
+ print("Total today " + str(solcast.get_total_kwh_forecast_day(0)))
+ print("Peak today " + str(solcast.get_peak_w_day(0)))
+ print("Peak time today " + str(solcast.get_peak_w_time_day(0)))
+ except:
+ _LOGGER.error(traceback.format_exc())
return False
diff --git a/custom_components/solcast_solar/translations/de.json b/custom_components/solcast_solar/translations/de.json
index 3c5f323d..c64c6332 100644
--- a/custom_components/solcast_solar/translations/de.json
+++ b/custom_components/solcast_solar/translations/de.json
@@ -1,12 +1,15 @@
{
"config": {
+ "abort": {
+ "single_instance_allowed": "Es ist nur eine Solcast-Instanz zulässig"
+ },
"step": {
"user": {
"data": {
"api_key": "API-Schlüssel (mehrere Werte durch Kommas trennen)",
"api_quota": "API-Kontingent (mehrere Werte durch Kommas trennen)"
},
- "description": "Details zum Solcast API-Konto"
+ "description": "Solcast-Kontodaten"
}
}
},
@@ -23,7 +26,7 @@
"api_key": "API-Schlüssel (mehrere Werte durch Kommas trennen)",
"api_quota": "API-Kontingent (mehrere Werte durch Kommas trennen)"
},
- "description": "Details zum Solcast API-Konto"
+ "description": "Solcast-Kontodaten"
},
"dampen": {
"data": {
@@ -73,6 +76,16 @@
}
}
},
+ "selector": {
+ "solcast_config_action": {
+ "options": {
+ "configure_api": "Solcast-Kontodaten",
+ "configure_dampening": "Dämpfung konfigurieren",
+ "configure_customsensor": "Konfigurieren Sie einen benutzerdefinierten Stundensensor",
+ "configure_attributes": "Konfigurieren Sie verfügbare Attribute"
+ }
+ }
+ },
"system_health": {
"info": {
"can_reach_server": "Verbindung zum Solcast-Server",
diff --git a/custom_components/solcast_solar/translations/en.json b/custom_components/solcast_solar/translations/en.json
index 7b7841e4..ab52dd31 100644
--- a/custom_components/solcast_solar/translations/en.json
+++ b/custom_components/solcast_solar/translations/en.json
@@ -7,9 +7,9 @@
"user": {
"data": {
"api_key": "API key (comma separate multiple values)",
- "api_quota": "API quota (optionally comma separate multiple values per key)"
+ "api_quota": "API limit (optionally comma separate multiple values per key)"
},
- "description": "Solcast API Account Details"
+ "description": "Solcast Account Details"
}
}
},
@@ -24,9 +24,9 @@
"api": {
"data": {
"api_key": "API key (comma separate multiple values)",
- "api_quota": "API quota (optionally comma separate multiple values per key)"
+ "api_quota": "API limit (optionally comma separate multiple values per key)"
},
- "description": "Solcast API Account Details"
+ "description": "Solcast Account Details"
},
"dampen": {
"data": {
@@ -83,10 +83,10 @@
"selector": {
"solcast_config_action": {
"options": {
- "configure_api": "Solcast API key",
- "configure_dampening": "Configure Dampening",
- "configure_customsensor": "Configure Custom Hour Sensor",
- "configure_attributes": "Configure Available Attributes"
+ "configure_api": "Solcast account details",
+ "configure_dampening": "Configure dampening",
+ "configure_customsensor": "Configure custom hours sensor",
+ "configure_attributes": "Configure available attributes"
}
}
},
diff --git a/custom_components/solcast_solar/translations/fr.json b/custom_components/solcast_solar/translations/fr.json
index c2a37ccc..4589c30c 100644
--- a/custom_components/solcast_solar/translations/fr.json
+++ b/custom_components/solcast_solar/translations/fr.json
@@ -9,7 +9,7 @@
"api_key": "Clé API (plusieurs valeurs séparées par des virgules)",
"api_quota": "Quota d'API (plusieurs valeurs séparées par des virgules)"
},
- "description": "Détails de votre compte API Solcast"
+ "description": "Détails du compte Solcast"
}
}
},
@@ -26,7 +26,7 @@
"api_key": "Clé API (plusieurs valeurs séparées par des virgules)",
"api_quota": "Quota d'API (plusieurs valeurs séparées par des virgules)"
},
- "description": "Détails de votre compte API Solcast"
+ "description": "Détails du compte Solcast"
},
"dampen": {
"data": {
@@ -83,8 +83,10 @@
"selector": {
"solcast_config_action": {
"options": {
- "configure_api": "Clé API Solcast",
- "configure_dampening": "Configurer le coefficient"
+ "configure_api": "Détails du compte Solcast",
+ "configure_dampening": "Configurer le coefficient",
+ "configure_customsensor": "Configurer le capteur d'heures personnalisées",
+ "configure_attributes": "Configurer les attributs disponibles"
}
}
},
diff --git a/custom_components/solcast_solar/translations/pl.json b/custom_components/solcast_solar/translations/pl.json
index 32aa4014..4312f367 100644
--- a/custom_components/solcast_solar/translations/pl.json
+++ b/custom_components/solcast_solar/translations/pl.json
@@ -1,12 +1,15 @@
{
"config": {
+ "abort": {
+ "single_instance_allowed": "Dozwolona tylko jedna instancja Solcast"
+ },
"step": {
"user": {
"data": {
"api_key": "Klucz API (wielokrotne wartości oddzielane przecinkiem)",
- "api_quota": "Limit interfejsu API (wielokrotne wartości oddzielane przecinkiem)"
+ "api_quota": "Limit API (opcjonalnie wielokrotne wartości oddzielane przecinkiem)"
},
- "description": "Dane konta Solcast API"
+ "description": "Dane konta Solcast"
}
}
},
@@ -14,16 +17,16 @@
"step": {
"init": {
"data": {
- "api_key": "Działanie"
+ "solcast_config_action": "Działanie"
},
"description": "Opcje konfiguracji Solcast"
},
"api": {
"data": {
"api_key": "Klucz API (wielokrotne wartości oddzielane przecinkiem)",
- "api_quota": "Limit interfejsu API (wielokrotne wartości oddzielane przecinkiem)"
+ "api_quota": "Limit API (opcjonalnie wielokrotne wartości oddzielane przecinkiem)"
},
- "description": "Dane konta Solcast API"
+ "description": "Dane konta Solcast"
},
"dampen": {
"data": {
@@ -71,12 +74,26 @@
},
"description": "Wybierz atrybuty czujnika, które będą dostępne"
}
+ },
+ "error": {
+ "unknown": "Nieznany błąd",
+ "incorrect_options_action": "Wybrano nieprawidłowe działanie"
+ }
+ },
+ "selector": {
+ "solcast_config_action": {
+ "options": {
+ "configure_api": "Dane konta Solcast",
+ "configure_dampening": "Konfiguruj tłumienie",
+ "configure_customsensor": "Konfiguruj niestandardowy czujnik godzin",
+ "configure_attributes": "Konfiguruj dostępne atrybuty"
+ }
}
},
"system_health": {
"info": {
"can_reach_server": "Połączenie z serwerem Solcast",
- "used_requests": "Wykorzystane zapytania API",
+ "used_requests": "Wykorzystane żądania API",
"rooftop_site_count": "Liczba połaci"
}
},
@@ -86,22 +103,46 @@
"description": "Pobierz najnowsze dane prognoz Solcast."
},
"clear_all_solcast_data": {
- "name": "Wyczyść dane Solcast",
- "description": "Usunięte zostaną wszystkie przechowywane dane Solcast. Plik solcast.json zostanie usunięty."
+ "name": "Wyczyść wszystkie zapisane dane Solcast",
+ "description": "Usuwa plik solcast.json, aby usunąć wszystkie aktualne dane witryny Solcast."
},
"query_forecast_data": {
"name": "Pobierz dane prognoz",
- "description": "Pobierz aktualne dane prognoz.",
+ "description": "Zwraca zestaw danych lub wartość dla podanego zapytania.",
"fields": {
"start_date_time": {
"name": "Data i godzina rozpoczęcia",
- "description": "Czas rozpoczęcia danych prognozowych."
+ "description": "Pobierz dane prognoz od określonej daty i godziny."
},
"end_date_time": {
"name": "Data i godzina zakończenia",
- "description": "Czas zakończenia danych prognozowych."
+ "description": "Pobierz dane prognoz do określonej daty i godziny."
}
}
+ },
+ "set_dampening": {
+ "name": "Ustaw tłumienie prognoz",
+ "description": "Ustaw godzinowy współczynnik tłumienia prognoz.",
+ "fields": {
+ "damp_factor": {
+ "name": "Ciąg tłumienia",
+ "description": "Ciąg wartości współczynnika tłumienia godzinowego, oddzielany przecinkiem."
+ }
+ }
+ },
+ "set_hard_limit": {
+ "name": "Ustaw twardy limit prognoz inwertera",
+ "description": "Zabrania wartości prognoz przekraczających maksymalną moc inwertera.",
+ "fields": {
+ "hard_limit": {
+ "name": "Wartość limitu w watach",
+ "description": "Ustaw maksymalną wartość w watach, jaką może wyprodukować inwerter."
+ }
+ }
+ },
+ "remove_hard_limit": {
+ "name": "Usuń twardy limit prognoz inwertera",
+ "description": "Usuń ustawiony limit."
}
},
"entity": {
@@ -114,6 +155,7 @@
"forecast_this_hour": {"name": "Prognoza na bieżącą godzinę"},
"get_remaining_today": {"name": "Pozostała prognoza na dziś"},
"forecast_next_hour": {"name": "Prognoza na następną godzinę"},
+ "forecast_custom_hours": {"name": "Prognoza na następne X godzin"},
"total_kwh_forecast_tomorrow": {"name": "Prognoza na jutro"},
"peak_w_tomorrow": {"name": "Szczytowa moc jutro"},
"peak_w_time_tomorrow": {"name": "Czas szczytowej mocy jutro"},
@@ -125,7 +167,12 @@
"total_kwh_forecast_d5": {"name": "Prognoza na dzień 5"},
"total_kwh_forecast_d6": {"name": "Prognoza na dzień 6"},
"total_kwh_forecast_d7": {"name": "Prognoza na dzień 7"},
- "power_now": {"name": "Aktualna moc"}
+ "power_now": {"name": "Aktualna moc"},
+ "weather_description": {"name": "Pogoda"},
+ "hard_limit": {"name": "Ustawiony twardy limit"}
+ },
+ "select": {
+ "estimate_mode" : {"name": "Użyj pola prognozy"}
}
}
}
\ No newline at end of file
diff --git a/custom_components/solcast_solar/translations/sk.json b/custom_components/solcast_solar/translations/sk.json
index 88adf67b..3852f080 100644
--- a/custom_components/solcast_solar/translations/sk.json
+++ b/custom_components/solcast_solar/translations/sk.json
@@ -9,7 +9,7 @@
"api_key": "Kľúč API (viac hodnôt oddelených čiarkou)",
"api_quota": "Kvóta rozhrania API (viac hodnôt oddelených čiarkou)"
},
- "description": "Podrobnosti účtu Solcast API"
+ "description": "Podrobnosti o účte Solcast"
}
}
},
@@ -26,7 +26,7 @@
"api_key": "Kľúč API (viac hodnôt oddelených čiarkou)",
"api_quota": "Kvóta rozhrania API (viac hodnôt oddelených čiarkou)"
},
- "description": "Podrobnosti účtu Solcast API"
+ "description": "Podrobnosti o účte Solcast"
},
"dampen": {
"data": {
@@ -83,7 +83,7 @@
"selector": {
"solcast_config_action": {
"options": {
- "configure_api": "Solcast API kľúč",
+ "configure_api": "Podrobnosti o účte Solcast",
"configure_dampening": "Konfigurácia tlmenia",
"configure_customsensor": "Konfigurácia vlastného snímača hodín",
"configure_attributes": "Konfigurácia dostupných atribútov"
diff --git a/custom_components/solcast_solar/translations/ur.json b/custom_components/solcast_solar/translations/ur.json
index 101a40a5..653801fc 100644
--- a/custom_components/solcast_solar/translations/ur.json
+++ b/custom_components/solcast_solar/translations/ur.json
@@ -9,7 +9,7 @@
"api_key": "کلید (کوما سے الگ متعدد اقدار) API",
"api_quota": "کوٹہ (کوما سے الگ متعدد اقدار) API"
},
- "description": "سولکاسٹ API اکاؤنٹ کی تفصیلات"
+ "description": "سولکاسٹ اکاؤنٹ کی تفصیلات"
}
}
},
@@ -26,7 +26,7 @@
"api_key": "کلید (کوما سے الگ متعدد اقدار) API",
"api_quota": "کوٹہ (کوما سے الگ متعدد اقدار) API"
},
- "description": "سولکاسٹ API اکاؤنٹ کی تفصیلات"
+ "description": "سولکاسٹ اکاؤنٹ کی تفصیلات"
},
"dampen": {
"data": {
@@ -83,8 +83,10 @@
"selector": {
"solcast_config_action": {
"options": {
- "configure_api": "سولکاسٹ API کلید",
- "configure_dampening": "ڈیمپننگ کو ترتیب دیں۔"
+ "configure_api": "سولکاسٹ اکاؤنٹ کی تفصیلات",
+ "configure_dampening": "ڈیمپننگ کو ترتیب دیں۔",
+ "configure_customsensor": "حسب ضرورت اوقات کے سینسر کو ترتیب دیں۔",
+ "configure_attributes": "دستیاب صفات کو ترتیب دیں۔"
}
}
},
diff --git a/hacs.json b/hacs.json
index de4f6c76..e4b5991c 100644
--- a/hacs.json
+++ b/hacs.json
@@ -1,5 +1,5 @@
{
- "name": "Solcast PV Solar",
+ "name": "Solcast PV Forecast",
"render_readme": true,
"homeassistant": "2023.7",
"zip_release": true,