***************
Troubleshooting
***************

Having trouble installing or running ``eGon-data``? Here's a list of
known issues and their solutions.


Installation Errors
===================

These are some errors you might encounter while trying to install
:py:mod:`egon.data`.

``importlib_metadata.PackageNotFoundError: No package metadata ...``
--------------------------------------------------------------------

You might have installed ``importlib-metadata==3.1.0`` for some reason,
which will lead to this error. Make sure you have
``importlib-metadata>=3.1.1`` installed. For more information, read the
discussion in :issue:`60`.
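
If you want to script the check, something like the following works in a
shell. This is a sketch, assuming ``pip`` operates on the same environment
that runs ``egon-data``; the upgrade command is left as a comment so you can
review it before running it.

```shell
# Show the installed importlib-metadata version, if any.
pip show importlib-metadata 2>/dev/null | grep '^Version' \
    || echo "importlib-metadata not found"
# To upgrade past the broken 3.1.0 release, run:
#   pip install --upgrade "importlib-metadata>=3.1.1"
```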


Runtime Errors
==============

These are some of the errors you might encounter while trying to run
:code:`egon-data`.

``ERROR: Couldn't connect to Docker daemon ...``
------------------------------------------------

To verify this, execute :code:`docker-compose -f <(echo {"service":
{"image": "hello-world"}}) ps` and you should see something like

.. code-block:: none

   ERROR: Couldn't connect to Docker daemon at http+docker://localunixsocket - is it running?
   If it's at a non-standard location, specify the URL with the DOCKER_HOST environment
   variable.

This can have at least two possible reasons. First, the Docker daemon
might not be running. On Linux systems, you can check for this by
running :code:`ps -e | grep dockerd`. If this generates no output, you
have to start the Docker daemon, which you can do via :code:`sudo
systemctl start docker.service` on recent Ubuntu systems.
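
The check above can be sketched as a small shell snippet. The
``systemctl`` hint only applies to systemd-based systems such as recent
Ubuntu, and starting the daemon requires root privileges.

```shell
# Check whether the Docker daemon process is present (mirrors the
# ps/grep check described above).
if ps -e | grep -q dockerd; then
    echo "dockerd is running"
else
    # Starting the daemon requires root; on systemd-based systems:
    echo "dockerd is not running; start it with: sudo systemctl start docker.service"
fi
```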

Second, your current user might not be a member of the ``docker`` group.
On Linux, you can check this by running :code:`groups $(whoami)`. If the
output does not contain the word ``docker``, you have to add your current
user to the ``docker`` group. You can find more information on how to do
this in the `docker documentation`_. Read the :issue:`initial discussion
<33>` for more context.
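
A quick way to script this membership check is sketched below. The
``usermod`` command shown in the hint is the one documented in the Docker
post-installation guide; it requires root privileges, and the change only
takes effect after logging out and back in.

```shell
# Check whether the current user belongs to the "docker" group.
if groups "$(whoami)" | grep -qw docker; then
    echo "user is in the docker group"
else
    # Fix documented in the Docker post-install guide (requires sudo):
    echo "run: sudo usermod -aG docker $(whoami), then log out and back in"
fi
```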

.. _docker documentation: https://docs.docker.com/engine/install/linux-postinstall/#manage-docker-as-a-non-root-user


``[ERROR] Connection in use ...``
---------------------------------

This error might arise when running :code:`egon-data serve`, making it
shut down early with :code:`ERROR - Shutting down webserver`. The reason
for this is that the local webserver from a previous :code:`egon-data
serve` run didn't shut down properly and is still running. This can be
fixed by running :code:`ps -eo pid,command | grep "gunicorn: master" |
grep -v grep`, which should lead to output like :code:`NUMBER gunicorn:
master [airflow-webserver]`, where :code:`NUMBER` is a varying number.
Once you have this, run :code:`kill -s INT NUMBER`, substituting
:code:`NUMBER` with the number you got previously. After this,
:code:`egon-data serve` should run without errors again.
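
On systems that provide ``pkill``, the two manual steps (finding the PID
and sending ``SIGINT``) can be combined into a one-liner. This is just a
shortcut for the procedure above, assuming a single matching process.

```shell
# Send SIGINT to a leftover gunicorn master from a previous
# `egon-data serve` run; `|| true` keeps the command from failing
# when no such process exists.
pkill -INT -f "gunicorn: master \[airflow-webserver\]" || true
```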


Other import or incompatible package version errors
====================================================

If you get an :py:class:`ImportError` when trying to run ``egon-data``,
or the installation complains with something like

.. code-block:: none

   first-package a.b.c requires second-package>=q.r.r, but you'll have
   second-package x.y.z which is incompatible.

you might have run into a problem of earlier ``pip`` versions. Either
upgrade to a ``pip`` version >=20.3 and reinstall ``egon.data``, or
reinstall the package via ``pip install -U --use-feature=2020-resolver``.
The ``-U`` flag is important to actually force a reinstall. For more
information, read the discussions in issues :issue:`36` and :issue:`37`.
********
Workflow
********

Project background
------------------

egon-data provides a transparent and reproducible open-data-based
processing pipeline for generating data models suitable for energy system
modeling. The data is customized for the requirements of the research
project eGo_n, which aims to develop tools for open and cross-sectoral
planning of transmission and distribution grids. For further information,
please visit the `eGo_n project website <https://ego-n.org/>`_.

egon-data is a further development of the `Data processing
<https://github.com/openego/data_processing>`_ developed in the former
research project `open_eGo <https://openegoproject.wordpress.com/>`_. It
aims for an extension of the data models as well as for better
replicability and manageability of the data preparation and processing.

The resulting data set serves as an input for the optimization tools
`eTraGo <https://github.com/openego/eTraGo>`_, `ding0
<https://github.com/openego/ding0>`_ and `eDisGo
<https://github.com/openego/eDisGo>`_ and delivers, for example, data on
grid topologies, demands/demand curves and generation capacities in a
high spatial resolution. The outputs of egon-data are published under
open source and open data licenses.


Data
----

egon-data retrieves and processes data from several external input
sources, all of which are freely available and published under an open
data license. The process handles different data types, such as spatial
data with a high geographical resolution or load/generation time series
with an hourly resolution.


Execution
---------

In principle, egon-data is not limited to the use of a specific
programming language, as the workflow integrates different scripts using
Apache Airflow, but Python and SQL are widely used within the process.
Apache Airflow organizes the order of execution of processing steps
through so-called operators. In the default case, the SQL processing is
executed on a containerized local PostgreSQL database using Docker. For
further information on Docker and its installation, please refer to its
`documentation <https://docs.docker.com/>`_. The connection information
for our local Docker database is defined in the corresponding
`docker-compose.yml
<https://github.com/openego/eGon-data/blob/dev/src/egon/data/airflow/docker-compose.yml>`_.

The egon-data workflow is composed of four sections: database setup, data
import, data processing and data export to the OpenEnergy Platform. Each
section consists of different tasks, which are managed by Apache Airflow
and interact with the local database. Only final datasets which serve as
an input for the optimization tools, or selected interim results, are
uploaded to the `Open Energy Platform <https://openenergy-platform.org/>`_.
The data processing in egon-data needs to be performed locally, as
calculations on the Open Energy Platform are prohibited. More information
on how to run the workflow can be found in the `getting started section
<https://egon-data.readthedocs.io/en/latest/getting_started.html#run-the-workflow>`_
of our documentation.

.. _DP_workflow_sketch:
.. figure:: images/DP_Workflow_15012021.svg


Versioning
----------

.. warning::
   Please note, the following is not implemented yet, but we are working
   on it.

Source code and data are versioned independently of each other. Every
data table uploaded to the Open Energy Platform contains a column
``version``, which is used to identify different versions of the same
data set. The version number is maintained separately for every table.
This is a major difference from the versioning concept applied in the
former data processing, where all (interim) results were versioned under
the same version number.