FloodsML is a deep learning–based workflow for flood extent detection using Sentinel-1 SAR imagery and DEM data. The pipeline is packaged in Docker for reproducibility and portability, and it integrates seamlessly with STAC-compliant datasets.
From the project root directory, build the Docker image:

```shell
docker build -t dev_floodsml -f dev_floodsml_Dockerfile .
```

Prepare an input folder on your local machine (e.g., `C:\Users\<user>\Desktop\Dev FloodsML\input_src`) and mount it into the container:

```shell
docker run -it --rm --name floods_demo_container ^
  -v "C:\Users\<user>\Desktop\Dev FloodsML\input_src:/app/input_src" ^
  dev_floodsml
```

The mounted folder must contain:
- Sentinel-1 before-flood acquisitions (VV and VH)
- Sentinel-1 after-flood acquisitions (VV and VH)
- DEM file(s) (one or more, named `copdem.tif`)
⚠️ All input data must follow the STAC item structure so that metadata, georeferencing, and provenance are preserved.
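Before starting a run, it can help to verify that the mounted folder actually contains what the container expects. The sketch below is a hypothetical pre-flight check, not part of the pipeline: only the `copdem.tif` name comes from this document, and the assumption that STAC items carry GeoTIFF assets plus JSON metadata reflects the STAC convention, not FloodsML's exact layout.

```python
from pathlib import Path

# Hypothetical sanity check for the mounted input folder. Only copdem.tif is a
# documented name; the *.tif / *.json expectations are assumptions based on the
# STAC item convention (GeoTIFF assets described by JSON metadata).
def check_input_folder(folder: str) -> list:
    root = Path(folder)
    if not root.is_dir():
        return [f"input folder not found: {root}"]
    problems = []
    # The DEM must be present under the documented name.
    if not list(root.rglob("copdem.tif")):
        problems.append("missing DEM file copdem.tif")
    # Sentinel-1 VV/VH acquisitions are expected as GeoTIFF assets.
    if not list(root.rglob("*.tif")):
        problems.append("no GeoTIFF assets found")
    # STAC items describe themselves via JSON metadata files.
    if not list(root.rglob("*.json")):
        problems.append("no STAC item JSON metadata found")
    return problems
```

Running such a check on the host before `docker run` avoids discovering a misplaced file only after the container has started.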
Inside the container, run:

```shell
python /app/env/main.py \
  --aoi_wkt "POLYGON((...))" \
  [--hand] \
  [--buffer 5000] \
  [--treshold 5]
```

- `--aoi_wkt` (required): AOI polygon in WGS84 (EPSG:4326). Must be of type POLYGON.
- `--hand` (optional): Apply HAND-based filtering in post-processing.
- `--buffer` (optional, default: `5000`): Buffer distance for the HAND calculation.
- `--treshold` (optional, default: `5`): Threshold parameter for HAND filtering.
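Since `--aoi_wkt` must be a WGS84 POLYGON, a quick client-side sanity check can catch malformed values before a container run fails. This is a minimal sketch with no external dependencies (not the pipeline's actual validation, and not a full WKT parser): it checks the geometry type, longitude/latitude ranges, and ring closure for a single-ring polygon.

```python
import re

# Minimal, dependency-free sanity check for an --aoi_wkt value: a single-ring
# WKT POLYGON with lon/lat pairs inside WGS84 bounds and a closed ring.
# This is an illustrative sketch, not the pipeline's own validator.
def validate_aoi_wkt(wkt: str) -> bool:
    m = re.fullmatch(r"\s*POLYGON\s*\(\(\s*(.+?)\s*\)\)\s*", wkt, re.IGNORECASE)
    if not m:
        return False  # must be of type POLYGON
    coords = []
    for pair in m.group(1).split(","):
        parts = pair.split()
        if len(parts) != 2:
            return False
        try:
            lon, lat = float(parts[0]), float(parts[1])
        except ValueError:
            return False
        if not (-180 <= lon <= 180 and -90 <= lat <= 90):
            return False  # outside EPSG:4326 bounds
        coords.append((lon, lat))
    # A valid ring has at least 4 points and ends where it starts.
    return len(coords) >= 4 and coords[0] == coords[-1]
```

For example, `validate_aoi_wkt("POLYGON((12.4 41.8, 12.6 41.8, 12.6 42.0, 12.4 41.8))")` passes, while a `POINT(...)` or out-of-range coordinates are rejected.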
Results are published to `/app/tmp/results` inside the container.
If you want the results to be visible on the host system, also map the results directory when starting the container, e.g.:

```shell
docker run -it --rm --name floods_demo_container ^
  -v "C:\Users\<user>\Desktop\Dev FloodsML\input_src:/app/input_src" ^
  -v "C:\Users\<user>\Desktop\Dev FloodsML\results:/app/tmp/results" ^
  dev_floodsml
```

The outputs are:
- Cloud-Optimized GeoTIFF (COG) masks of flooded areas
- STAC items wrapping those masks, for interoperability
The masks contain three classes:
- `0` – non-water
- `1` – permanent hydrography
- `2` – flooded areas
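A common first step with such masks is computing per-class coverage. The toy sketch below summarizes the three classes from a decoded pixel array; in practice you would read the COG with a raster library, but here a plain 2-D list stands in for the data, and the function itself is illustrative rather than part of FloodsML.

```python
from collections import Counter

# Class codes as documented for the output masks.
CLASS_NAMES = {0: "non-water", 1: "permanent hydrography", 2: "flooded areas"}

def class_summary(mask):
    """Return the fraction of pixels in each of the three mask classes."""
    counts = Counter(value for row in mask for value in row)
    total = sum(counts.values())
    return {CLASS_NAMES[c]: counts.get(c, 0) / total for c in CLASS_NAMES}

# Toy 3x3 mask standing in for a decoded COG band.
mask = [
    [0, 0, 1],
    [0, 2, 2],
    [0, 2, 1],
]
```

On this toy mask, `class_summary(mask)` reports 4/9 non-water, 2/9 permanent hydrography, and 3/9 flooded areas.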
This structure ensures compatibility with catalog-based workflows and easy integration with other disaster mapping products.
STAC input items → Preprocessing & Inference → STAC output items (COG masks)
The workflow ensures that both input and output remain fully standardized, enabling transparent use in broader EO platforms and services.
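To make the "STAC output items" side of the flow concrete, here is an illustrative sketch of the kind of STAC item that could wrap an output COG mask. Every field value below (the `id`, geometry, datetime, and asset `href`) is an assumption for the example, not the pipeline's actual metadata; only the overall STAC item shape and the standard COG media type follow the STAC specification.

```python
import json

# Illustrative STAC item wrapping a COG flood mask. All concrete values are
# placeholders; the structure follows the STAC Item spec (a GeoJSON Feature
# with stac_version, id, geometry, bbox, properties, assets, and links).
item = {
    "type": "Feature",
    "stac_version": "1.0.0",
    "id": "flood-mask-example",
    "geometry": {
        "type": "Polygon",
        "coordinates": [[[12.4, 41.8], [12.6, 41.8], [12.6, 42.0],
                         [12.4, 42.0], [12.4, 41.8]]],
    },
    "bbox": [12.4, 41.8, 12.6, 42.0],
    "properties": {"datetime": "2024-01-01T00:00:00Z"},
    "assets": {
        "mask": {
            "href": "./flood_mask.tif",
            # Conventional media type for a Cloud-Optimized GeoTIFF.
            "type": "image/tiff; application=geotiff; profile=cloud-optimized",
            "roles": ["data"],
        }
    },
    "links": [],
}

serialized = json.dumps(item)  # items are exchanged as plain JSON documents
```

Because the item is plain JSON, any STAC-aware catalog or client can index the mask without FloodsML-specific tooling.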
📌 Ready-to-use, portable, and interoperable — FloodsML can be deployed wherever flood mapping support is required.
"# floodsdl-mcube"