Timo21 committed on
Commit
44d7bee
·
2 Parent(s): c3216fe e108169

Merge pull request #1 from rodekruis/convert_to_work_with_gfm

.env_template ADDED
@@ -0,0 +1,2 @@
 
 
 
1
+ gfm_username=your-gfm-username
2
+ gfm_password=your-gfm-password
.gitignore CHANGED
@@ -1,3 +1,7 @@
 
 
 
 
1
  # Byte-compiled / optimized / DLL files
2
  __pycache__/
3
  *.py[cod]
 
1
+ .env
2
+ /output/*
3
+ !/output/.gitkeep
4
+
5
  # Byte-compiled / optimized / DLL files
6
  __pycache__/
7
  *.py[cod]
.pre-commit-config.yaml DELETED
@@ -1,39 +0,0 @@
1
- default_language_version:
2
- python: python3.9
3
-
4
- repos:
5
- - repo: https://github.com/pre-commit/pre-commit-hooks
6
- rev: v4.4.0
7
- hooks:
8
- - id: trailing-whitespace
9
- - id: end-of-file-fixer
10
- - id: check-ast
11
- - id: check-toml
12
- - id: check-yaml
13
- - repo: https://github.com/psf/black
14
- rev: 23.1.0
15
- hooks:
16
- - id: black
17
- - repo: https://github.com/PyCQA/isort
18
- rev: 5.12.0
19
- hooks:
20
- - id: isort
21
- - repo: https://github.com/pycqa/flake8
22
- rev: 6.0.0
23
- hooks:
24
- - id: flake8
25
- additional_dependencies:
26
- # - flake8-bugbear==21.9.2
27
- # - flake8-comprehensions==3.6.1
28
- # - flake8-deprecated==1.3
29
- - flake8-docstrings==1.6.0
30
- # - flake8-keyword-arguments==0.1.0
31
- - repo: https://github.com/pre-commit/mirrors-mypy
32
- rev: v0.910
33
- hooks:
34
- - id: mypy
35
- additional_dependencies:
36
- - types-PyYAML==5.4.10
37
- - types-setuptools==57.4.0
38
- - types-requests==2.25.9
39
- exclude: tests
README.md CHANGED
@@ -1,63 +1,15 @@
1
  # Flood Mapping Tool
2
 
3
- [![Open in Streamlit](https://static.streamlit.io/badges/streamlit_badge_black_white.svg)](https://mapaction-flood-map.streamlit.app/)
4
- [![license](https://img.shields.io/github/license/OCHA-DAP/pa-aa-toolbox.svg)](https://github.com/mapaction/flood-mapping-tool/blob/main/LICENSE)
5
- [![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/psf/black)
6
- [![Imports: isort](https://img.shields.io/badge/%20imports-isort-%231674b1?style=flat&labelColor=ef8336)](https://pycqa.github.io/isort/)
7
 
8
- This repository contains a Streamlit app that allows to estimate flood extent using Sentinel-1 synthetic-aperture radar <a href='https://sentinel.esa.int/web/sentinel/user-guidessentinel-1-sar'>SAR</a> data.
 
9
 
10
- The methodology is based on a <a href='https://un-spider.org/advisory-support/recommended-practices/recommended-\practice-google-earth-engine-flood-mapping'> recommended practice </a> published by the United Nations Platform for Space-based Information for Disaster Management and Emergency Response (UN-SPIDER) and it uses several satellite imagery datasets to produce the final output. The datasets are retrieved from <a href='https://earthengine.google.com/'>Google Earth Engine</a> which is a powerful web-platform for cloud-based processing of remote sensing data on large scales. More information on the methodology is given in the <i>Description</i> page in the Streamlit app.
11
-
12
- This analysis provides a comprehensive overview of a flooding event, across different areas of interest, from settlements to countries. However, as mentioned in the UN-SPIDER website, the methodology is meant for broad information provision in a global context, and contains inherent uncertainties. Therefore, it is important that the tool is not used as the only source of information for rescue response planning.
13
-
14
- ## Usage
15
-
16
- #### Requirements
17
-
18
- The Python version currently used is 3.10. Please install all packages from
19
- ``requirements.txt``:
20
-
21
- ```shell
22
- pip install -r requirements.txt
23
  ```
24
-
25
- #### Google Earth Engine authentication
26
-
27
- [Sign up](https://signup.earthengine.google.com/) for a Google Earth Engine account, if you don't already have one. Open a terminal window, type `python` and then paste the following code:
28
-
29
- ```python
30
- import ee
31
- ee.Authenticate()
32
- ```
33
-
34
- Log in to your Google account to obtain the authorization code and paste it back into the terminal. Once you press "Enter", an authorization token will be saved to your computer under the following file path (depending on your operating system):
35
-
36
- - Windows: `C:\\Users\\USERNAME\\.config\\earthengine\\credentials`
37
- - Linux: `/home/USERNAME/.config/earthengine/credentials`
38
- - MacOS: `/Users/USERNAME/.config/earthengine/credentials`
39
-
40
- The credentials will be used when initialising Google Earth Engine in the app.
41
-
42
- #### Run the app
43
-
44
- Finally, open a terminal and run
45
-
46
- ```shell
47
  streamlit run app/Home.py
48
  ```
49
 
50
- A new browser window will open and you can start using the tool.
51
-
52
- ## Contributing
53
-
54
- #### Pre-commit
55
-
56
- All code is formatted according to
57
- [black](https://github.com/psf/black) and [flake8](https://flake8.pycqa.org/en/latest) guidelines. The repo is set-up to use [pre-commit](https://github.com/pre-commit/pre-commit). Please run ``pre-commit install`` the first time you are editing. Thereafter all commits will be checked against black and flake8 guidelines.
58
-
59
- To check if your changes pass pre-commit without committing, run:
60
-
61
- ```shell
62
- pre-commit run --all-files
63
- ```
 
1
  # Flood Mapping Tool
2
 
3
+ ## Installation & Running
4
+ ### GFM account
5
+ To run the app you will need a (free) GFM account; to get one, register at https://portal.gfm.eodc.eu/login. Once you have an account, create a `.env` file with the contents of `.env_template` and fill in your GFM username and password. The `.env` file is gitignored.
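For example, the `.env` file could look like the snippet below (a sketch: the keys come from `.env_template`, the values are placeholders, and the username is the e-mail address of your GFM account):

```
gfm_username=your-gfm-account@example.org
gfm_password=your-gfm-password
```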
 
6
 
7
+ ### Python
8
+ The project uses Python 3.12. Install the requirements from `pyproject.toml` in your preferred way; we suggest `uv` (see [here](https://docs.astral.sh/uv/getting-started/installation/) for installation instructions). Once `uv` is installed, run `uv sync` to create a `.venv` folder. Activate the `.venv` (a sketch of these steps is shown below) and then run the line below to start the app:
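As a sketch, the environment setup on Linux/macOS could look like this (the activation command differs on Windows):

```shell
uv sync
source .venv/bin/activate
```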
9
10
```
11
  streamlit run app/Home.py
12
  ```
13
 
14
+ ## Project
15
+ TODO: Add more complete documentation.
app/Home.py CHANGED
@@ -1,21 +1,21 @@
1
  """Home page for Streamlit app."""
 
2
  import streamlit as st
3
  from src.config_parameters import params
 
 
 
 
4
  from src.utils import (
5
  add_about,
6
- add_logo,
7
  set_home_page_style,
8
  toggle_menu_button,
9
  )
10
 
11
- # Page configuration
12
- st.set_page_config(layout="wide", page_title=params["browser_title"])
13
-
14
  # If app is deployed hide menu button
15
  toggle_menu_button()
16
 
17
  # Create sidebar
18
- add_logo("app/img/MA-logo.png")
19
  add_about()
20
 
21
  # Set page style
@@ -26,110 +26,8 @@ st.markdown("# Home")
26
 
27
  # First section
28
  st.markdown("## Introduction")
29
- st.markdown(
30
- """
31
- This tool allows to estimate flood extent using Sentinel-1
32
- synthetic-aperture radar
33
- <a href='%s'>SAR</a> data.<br><br>
34
- The methodology is based on a <a href=
35
- '%s'>recommended practice</a>
36
- published by the United Nations Platform for Space-based Information for
37
- Disaster Management and Emergency Response (UN-SPIDER) and it uses several
38
- satellite imagery datasets to produce the final output. The datasets are
39
- retrieved from <a href='%s'>Google Earth
40
- Engine</a> which is a powerful web-platform for cloud-based processing of
41
- remote sensing data on large scales. More information on the methodology is
42
- given in the Description.<br><br>
43
- This analysis provides a comprehensive overview of a flooding event, across
44
- different areas of interest, from settlements to countries. However, as
45
- mentioned in the UN-SPIDER website, the methodology is meant for broad
46
- information provision in a global context, and contains inherent
47
- uncertainties. Therefore, it is important that the tool is not used as the
48
- only source of information for rescue response planning.
49
- """
50
- % (
51
- params["url_sentinel_esa"],
52
- params["url_unspider_tutorial"],
53
- params["url_gee"],
54
- ),
55
- unsafe_allow_html=True,
56
- )
57
 
58
  # Second section
59
  st.markdown("## How to use the tool")
60
- st.markdown(
61
- """
62
- <ul>
63
- <li><p>
64
- In the sidebar, choose <i>Flood extent analysis</i> to start the
65
- analysis.
66
- </p>
67
- <li><p>
68
- In the left panel, use the drawing tool to select an area of
69
- interest on the map. You can delete your selection by clicking on
70
- the bin icon. While the flood mapping is generated regardless of
71
- the size of the selected region, you will be able to save raster
72
- and vector flooding extent only if the side of the rectangular
73
- selection does not exceed 100 km.
74
- </p>
75
- <li><p>
76
- In the right panel click on the title <i>Choose Image Dates</i>
77
- in order to expand the section. Here you need to select four dates.
78
- The first two identify a range of dates based on which the
79
- reference imagery (before the flooding event) is defined. You can
80
- select even years worth of data (the reference imagery is
81
- calculated as the median between the range of observations), but
82
- make sure you take into account wet and dry seasons if only taking
83
- a few months. The last two refer to a period of time which comes
84
- after the flooding event. By setting periods, not single dates, you
85
- allow the selection of enough tiles to cover the area of interest.
86
- Sentinel-1 imagery is acquired minimum every 12 days for each point
87
- on the globe (see Figure 2 in the documentation).
88
- </p>
89
- <li>
90
- <p>
91
- By clicking on <i>Choose parameters</i>, you will be able to
92
- set two variables:
93
- </p>
94
- <ul>
95
- <li><p>
96
- The <i>threshold</i> is the value against which the
97
- difference the two satellite images - before and after the
98
- flooding event - is tested. Lower thresholds result in a
99
- greater area considered "flooded". It is recommended to set
100
- the value to 1.25, which was selected through trial and
101
- error. You may want to adjust the value in case of high
102
- rates of false positive or negative values, especially in
103
- case other sources of information are available and it is
104
- possible to compare flood extent estimations between
105
- sources.
106
- </p>
107
- <li><p>
108
- The <i>pass direction</i> has to do with the way the
109
- satellite travels around the Earth. Depending on your area
110
- of interest and time period, you may find more imagery
111
- available for either the <i>Ascending</i> or the
112
- <i>Descending</i> pass directions (see Figure 2 in the
113
- Documentation). It is recommended to leave the parameter
114
- unchanged for a first estimation and change its value in
115
- case partial or no imagery is produced.
116
- </p>
117
- </ul>
118
- <li><p>
119
- Once the parameters are set, you can finally click on <i>Compute
120
- flood extent</i> to run the calculations. A map will appear
121
- underneath, with a layer containing the flooded area within the
122
- area of interest.
123
- </p>
124
- <li><p>
125
- If you wish to export the layer to file, you can click on <i>Export
126
- to file</i> and download the raster and/or vector data.
127
- </p>
128
- </ul>
129
- <p>
130
- In case you get errors, follow the intructions. If you have doubts,
131
- feel free to contact the Data Science team.
132
- </p>
133
- """,
134
- unsafe_allow_html=True,
135
- )
 
1
  """Home page for Streamlit app."""
2
+
3
  import streamlit as st
4
  from src.config_parameters import params
5
+
6
+ # Page configuration
7
+ st.set_page_config(layout="wide", page_title=params["browser_title"])
8
+
9
  from src.utils import (
10
  add_about,
 
11
  set_home_page_style,
12
  toggle_menu_button,
13
  )
14
 
 
 
 
15
  # If app is deployed hide menu button
16
  toggle_menu_button()
17
 
18
  # Create sidebar
 
19
  add_about()
20
 
21
  # Set page style
 
26
 
27
  # First section
28
  st.markdown("## Introduction")
29
+ st.markdown("TODO: new introduction")
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
30
 
31
  # Second section
32
  st.markdown("## How to use the tool")
33
+ st.markdown("TODO: new how to use the tool")
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
app/img/workflow.png DELETED
Binary file (44.6 kB)
 
app/pages/0_AOIs.py ADDED
@@ -0,0 +1,169 @@
1
+ import json
2
+ import os
3
+ from pathlib import Path
4
+
5
+ import folium
6
+ import requests
7
+ import streamlit as st
8
+ from folium.plugins import Draw
9
+ from src.config_parameters import params
10
+ from src.gfm import download_gfm_geojson, get_existing_flood_geojson, sync_cached_data
11
+ from src.utils import (
12
+ add_about,
13
+ set_tool_page_style,
14
+ toggle_menu_button,
15
+ )
16
+ from streamlit_folium import st_folium
17
+
18
+ # Page configuration
19
+ st.set_page_config(layout="wide", page_title=params["browser_title"])
20
+
21
+ # If app is deployed hide menu button
22
+ toggle_menu_button()
23
+
24
+ # Create sidebar
25
+ add_about()
26
+
27
+ # Page title
28
+ st.markdown("# Flood extent analysis")
29
+
30
+ # Set page style
31
+ set_tool_page_style()
32
+
33
+ # Create two rows: top and bottom panel
34
+ row1 = st.container()
35
+ save_area = False
36
+
37
+ with row1:
38
+ action_type = st.radio(
39
+ label="Action Type",
40
+ options=["See Areas", "Create New Area", "Delete Area"],
41
+ label_visibility="hidden",
42
+ )
43
+
44
+ # call to render Folium map in Streamlit
45
+ folium_map = folium.Map([39, 0], zoom_start=8)
46
+ feat_group_selected_area = folium.FeatureGroup(name="selected_area")
47
+
48
+ if action_type == "See Areas":
49
+ with open("./bboxes/bboxes.json", "r") as f:
50
+ bboxes = json.load(f)
51
+ for area_name in bboxes.keys():
52
+ bbox = bboxes[area_name]["bounding_box"]
53
+ feat_group_selected_area.add_child(folium.GeoJson(bbox))
54
+
55
+ folium_map.fit_bounds(feat_group_selected_area.get_bounds())
56
+
57
+ elif action_type == "Create New Area":
58
+ Draw(
59
+ export=False,
60
+ draw_options={
61
+ "circle": False,
62
+ "polyline": False,
63
+ "polygon": False,
64
+ "rectangle": True,
65
+ "marker": False,
66
+ "circlemarker": False,
67
+ },
68
+ ).add_to(folium_map)
69
+
70
+ new_area_name = st.text_input("Area name")
71
+ save_area = st.button("Save Area")
72
+
73
+ elif action_type == "Delete Area":
74
+ # Load existing bboxes
75
+ with open("./bboxes/bboxes.json", "r") as f:
76
+ bboxes = json.load(f)
77
+ existing_areas = bboxes.keys()
78
+
79
+ area_to_delete = st.selectbox("Choose area to delete", options=existing_areas)
80
+ bbox = bboxes[area_to_delete]["bounding_box"]
81
+ feat_group_selected_area.add_child(folium.GeoJson(bbox))
82
+ folium_map.fit_bounds(feat_group_selected_area.get_bounds())
83
+
84
+ delete_area = st.button("Delete")
85
+
86
+ if delete_area:
87
+ with open("./bboxes/bboxes.json", "r") as f:
88
+ bboxes = json.load(f)
89
+ bboxes.pop(area_to_delete, None)
90
+ with open("./bboxes/bboxes.json", "w") as f:
91
+ json.dump(bboxes, f)
92
+ st.toast("Area successfully deleted")
93
+
94
+ with open("./bboxes/bboxes.json", "r") as f:
95
+ bboxes = json.load(f)
96
+
97
+ # geojson_catania = get_existing_flood_geojson("Catania")
98
+ # print(geojson_catania)
99
+ # geojson_selected_area = folium.GeoJson(geojson_catania)
100
+
101
+ # feat_group_selected_area.add_child(geojson_selected_area)
102
+ m = st_folium(
103
+ folium_map,
104
+ width=800,
105
+ height=450,
106
+ feature_group_to_add=feat_group_selected_area,
107
+ )
108
+
109
+ if save_area:
110
+ check_drawing = m["all_drawings"] != [] and m["all_drawings"] is not None
111
+ if not check_drawing:
112
+ st.error("Please create a region using the rectangle tool on the map.")
113
+ elif new_area_name == "":
114
+ st.error("Please provide a name for the new area")
115
+ else:
116
+ # Get the drawn area
117
+ selected_area_geojson = m["all_drawings"][-1]
118
+
119
+ print("starting to post new area name to gfm api")
120
+ coordinates = selected_area_geojson["geometry"]["coordinates"]
121
+
122
+ username = os.environ["gfm_username"]
123
+ password = os.environ["gfm_password"]
124
+ base_url = "https://api.gfm.eodc.eu/v1"
125
+
126
+ # Get token, setup header
127
+ token_url = f"{base_url}/auth/login"
128
+
129
+ payload = {"email": username, "password": password}
130
+
131
+ response = requests.post(token_url, json=payload)
132
+ user_id = response.json()["client_id"]
133
+ access_token = response.json()["access_token"]
134
+ header = {"Authorization": f"bearer {access_token}"}
135
+
136
+ print("authenticated to API")
137
+ # Create area of impact
138
+ create_aoi_url = f"{base_url}/aoi/create"
139
+
140
+ payload = {
141
+ "aoi_name": new_area_name,
142
+ "description": new_area_name,
143
+ "user_id": user_id,
144
+ "geoJSON": {"type": "Polygon", "coordinates": coordinates},
145
+ }
146
+
147
+ r = requests.post(url=create_aoi_url, json=payload, headers=header)
148
+ print(r.json())
149
+ print("Posted new AOI")
150
+
151
+ print("Writing new area to bbox json")
152
+ # Load existing bboxes
153
+ with open("./bboxes/bboxes.json", "r") as f:
154
+ bboxes = json.load(f)
155
+
156
+ # If the area doesn't exist, create it
157
+ if new_area_name not in bboxes:
158
+ bboxes[new_area_name] = {}
159
+
160
+ # Save the new bounding box under the date range key
161
+ bboxes[new_area_name] = {
162
+ "bounding_box": selected_area_geojson,
163
+ "date_ranges": [], # Will be populated when files are downloaded
164
+ }
165
+ # Write the updated data back to file
166
+ with open("./bboxes/bboxes.json", "w") as f:
167
+ json.dump(bboxes, f, indent=4)
168
+
169
+ st.toast("Area successfully created")
app/pages/1_🌍_Flood_extent_analysis.py CHANGED
@@ -1,22 +1,16 @@
1
- """Flood extent analysis page for Streamlit app."""
2
- import datetime as dt
3
 
4
- import ee
5
  import folium
6
- import geemap.foliumap as geemap
7
- import requests
8
  import streamlit as st
9
- import streamlit_ext as ste
10
- from folium.plugins import Draw, Geocoder, MiniMap
11
  from src.config_parameters import params
 
12
  from src.utils import (
13
  add_about,
14
- add_logo,
15
  set_tool_page_style,
16
  toggle_menu_button,
17
  )
18
- from src.utils_ee import ee_initialize
19
- from src.utils_flood_analysis import derive_flood_extents
20
  from streamlit_folium import st_folium
21
 
22
  # Page configuration
@@ -26,7 +20,6 @@ st.set_page_config(layout="wide", page_title=params["browser_title"])
26
  toggle_menu_button()
27
 
28
  # Create sidebar
29
- add_logo("app/img/MA-logo.png")
30
  add_about()
31
 
32
  # Page title
@@ -35,268 +28,103 @@ st.markdown("# Flood extent analysis")
35
  # Set page style
36
  set_tool_page_style()
37
 
38
- # Initialise Google Earth Engine
39
- ee_initialize(force_use_service_account=True)
40
-
41
-
42
- # Output_created is useful to decide whether the bottom panel with the
43
- # output map should be visualised or not
44
- if "output_created" not in st.session_state:
45
- st.session_state.output_created = False
46
-
47
-
48
- # Function to be used to hide bottom panel (when setting parameters for a
49
- # new analysis)
50
- def callback():
51
- """Set output created to zero: reset tool."""
52
- st.session_state.output_created = False
53
-
54
 
55
  # Create two rows: top and bottom panel
56
  row1 = st.container()
57
- row2 = st.container()
58
  # Crate two columns in the top panel: input map and paramters
59
  col1, col2 = row1.columns([2, 1])
 
60
  with col1:
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
61
  # Add collapsable container for input map
62
  with st.expander("Input map", expanded=True):
63
  # Create folium map
64
- Map = folium.Map(
65
- location=[52.205276, 0.119167],
66
- zoom_start=3,
67
- control_scale=True,
68
- # crs='EPSG4326'
69
  )
70
- # Add drawing tools to map
71
- Draw(
72
- export=False,
73
- draw_options={
74
- "circle": False,
75
- "polyline": False,
76
- "polygon": True,
77
- "circle": False,
78
- "marker": False,
79
- "circlemarker": False,
80
- },
81
- ).add_to(Map)
82
- # Add search bar with geocoder to map
83
- Geocoder(add_marker=False).add_to(Map)
84
- # Add minimap to map
85
- MiniMap().add_to(Map)
86
- # Export map to Streamlit
87
- output = st_folium(Map, width=800, height=600)
88
  with col2:
89
  # Add collapsable container for image dates
90
- with st.expander("Choose Image Dates"):
91
- # Callback is added, so that, every time a parameters is changed,
92
- # the bottom panel containing the output map is hidden
93
- before_start = st.date_input(
94
- "Start date for reference imagery",
95
- value=dt.date(year=2022, month=7, day=1),
96
- help="It needs to be prior to the flooding event",
97
- on_change=callback,
98
- )
99
- before_end = st.date_input(
100
- "End date for reference imagery",
101
- value=dt.date(year=2022, month=7, day=30),
102
- help=(
103
- "It needs to be prior to the flooding event, at least 15 "
104
- "days subsequent to the date selected above"
105
- ),
106
- on_change=callback,
107
- )
108
- after_start = st.date_input(
109
- "Start date for flooding imagery",
110
- value=dt.date(year=2022, month=9, day=1),
111
- help="It needs to be subsequent to the flooding event",
112
- on_change=callback,
113
- )
114
- after_end = st.date_input(
115
- "End date for flooding imagery",
116
- value=dt.date(year=2022, month=9, day=16),
117
- help=(
118
- "It needs to be subsequent to the flooding event and at "
119
- "least 10 days to the date selected above"
120
- ),
121
- on_change=callback,
122
- )
123
  # Add collapsable container for parameters
124
  with st.expander("Choose Parameters"):
125
  # Add slider for threshold
126
- add_slider = st.slider(
127
- label="Select a threshold",
128
- min_value=0.0,
129
- max_value=5.0,
130
- value=1.25,
131
- step=0.25,
132
- help="Higher values might reduce overall noise",
133
- on_change=callback,
134
- )
135
- # Add radio buttons for pass direction
136
- pass_direction = st.radio(
137
- "Set pass direction",
138
- ["Ascending", "Descending"],
139
- on_change=callback,
140
- )
141
- # Button for computation
142
- submitted = st.button("Compute flood extent")
143
- # Introduce date validation
144
- check_dates = before_start < before_end <= after_start < after_end
145
- # Introduce drawing validation (a polygon needs to exist)
146
- check_drawing = (
147
- output["all_drawings"] != [] and output["all_drawings"] is not None
148
- )
149
- # What happens when button is clicked on?
150
  if submitted:
151
  with col2:
 
 
 
 
152
  # Output error if dates are not valid
153
  if not check_dates:
154
  st.error("Make sure that the dates were inserted correctly")
155
- # Output error if no polygons were drawn
156
- elif not check_drawing:
157
- st.error("No region selected.")
158
- else:
159
- # Add output for computation
160
- with st.spinner("Computing... Please wait..."):
161
- # Extract coordinates from drawn polygon
162
- coords = output["all_drawings"][-1]["geometry"]["coordinates"][
163
- 0
164
- ]
165
- # Create geometry from coordinates
166
- ee_geom_region = ee.Geometry.Polygon(coords)
167
- # Crate flood raster and vector
168
- (
169
- detected_flood_vector,
170
- detected_flood_raster,
171
- _,
172
- _,
173
- ) = derive_flood_extents(
174
- aoi=ee_geom_region,
175
- before_start_date=str(before_start),
176
- before_end_date=str(before_end),
177
- after_start_date=str(after_start),
178
- after_end_date=str(after_end),
179
- difference_threshold=add_slider,
180
- polarization="VH",
181
- pass_direction=pass_direction,
182
- export=False,
183
- )
184
- # Create output map
185
- Map2 = geemap.Map(
186
- # basemap="HYBRID",
187
- plugin_Draw=False,
188
- Draw_export=False,
189
- locate_control=False,
190
- plugin_LatLngPopup=False,
191
  )
192
- try:
193
- # Add flood vector layer to map
194
- Map2.add_layer(
195
- ee_object=detected_flood_raster,
196
- name="Flood extent raster",
197
- )
198
- Map2.add_layer(
199
- ee_object=detected_flood_vector,
200
- name="Flood extent vector",
201
- )
202
- # Center map on flood raster
203
- Map2.centerObject(detected_flood_raster)
204
- except ee.EEException:
205
- # If error contains the sentence below, it means that
206
- # an image could not be properly generated
207
- st.error(
208
- """
209
- No satellite image found for the selected
210
- dates.\n\n
211
- Try changing the pass direction.\n\n
212
- If this does not work, choose different
213
- dates: it is likely that the satellite did not
214
- cover the area of interest in the range of
215
- dates specified (either before or after the
216
- flooding event).
217
- """
218
- )
219
- else:
220
- # If computation was succesfull, save outputs for
221
- # output map
222
- st.success("Computation complete")
223
- st.session_state.output_created = True
224
- st.session_state.Map2 = Map2
225
- st.session_state.detected_flood_raster = (
226
- detected_flood_raster
227
- )
228
- st.session_state.detected_flood_vector = (
229
- detected_flood_vector
230
- )
231
- st.session_state.ee_geom_region = ee_geom_region
232
- # If computation was successful, create output map in bottom panel
233
- if st.session_state.output_created:
234
- with row2:
235
- # Add collapsable container for output map
236
- with st.expander("Output map", expanded=True):
237
- # Export Map2 to streamlit
238
- st.session_state.Map2.to_streamlit()
239
- # Create button to export to file
240
- submitted2 = st.button("Export to file")
241
- # What happens if button is clicked on?
242
- if submitted2:
243
- # Add output for computation
244
- with st.spinner("Computing... Please wait..."):
245
- try:
246
- # Get download url for raster data
247
- raster = st.session_state.detected_flood_raster
248
- url_r = raster.getDownloadUrl(
249
- {
250
- "region": st.session_state.ee_geom_region,
251
- "scale": 30,
252
- "format": "GEO_TIFF",
253
- }
254
- )
255
- except Exception:
256
- st.error(
257
- """
258
- The image size is too big for the image to
259
- be exported to file. Select a smaller area
260
- of interest (side <~ 150km) and repeat the
261
- analysis.
262
- """
263
- )
264
- else:
265
- response_r = requests.get(url_r)
266
- # Get download url for raster data
267
- vector = st.session_state.detected_flood_vector
268
- url_v = vector.getDownloadUrl("GEOJSON")
269
- response_v = requests.get(url_v)
270
- filename = "flood_extent"
271
- timestamp = dt.datetime.now().strftime(
272
- "%Y-%m-%d_%H-%M"
273
- )
274
- with row2:
275
- # Create download buttons for raster and vector
276
- # data
277
- with open("flood_extent.tif", "wb"):
278
- ste.download_button(
279
- label="Download Raster Extent",
280
- data=response_r.content,
281
- file_name=(
282
- f"{filename}"
283
- "_raster_"
284
- f"{timestamp}"
285
- ".tif"
286
- ),
287
- mime="image/tif",
288
- )
289
- with open("flood_extent.geojson", "wb"):
290
- ste.download_button(
291
- label="Download Vector Extent",
292
- data=response_v.content,
293
- file_name=(
294
- f"{filename}"
295
- "_vector_"
296
- f"{timestamp}"
297
- ".geojson"
298
- ),
299
- mime="text/json",
300
- )
301
- # Output for computation complete
302
- st.success("Computation complete")
 
1
+ import json
2
+ from pathlib import Path
3
 
 
4
  import folium
 
 
5
  import streamlit as st
6
+ from folium.plugins import Draw
 
7
  from src.config_parameters import params
8
+ from src.gfm import download_gfm_geojson, get_existing_flood_geojson, sync_cached_data
9
  from src.utils import (
10
  add_about,
 
11
  set_tool_page_style,
12
  toggle_menu_button,
13
  )
 
 
14
  from streamlit_folium import st_folium
15
 
16
  # Page configuration
 
20
  toggle_menu_button()
21
 
22
  # Create sidebar
 
23
  add_about()
24
 
25
  # Page title
 
28
  # Set page style
29
  set_tool_page_style()
30
 
31
+ # Sync cached data
32
+ # ! WARNING: will erase your output folder
33
+ # # # sync_cached_data()
34
 
35
  # Create two rows: top and bottom panel
36
  row1 = st.container()
 
37
  # Crate two columns in the top panel: input map and paramters
38
  col1, col2 = row1.columns([2, 1])
39
+ feat_group_selected_area = folium.FeatureGroup(name="selected_area")
40
  with col1:
41
+ with open("./bboxes/bboxes.json", "r") as f:
42
+ bboxes = json.load(f)
43
+ selected_area = st.selectbox("Select saved area", options=bboxes.keys())
44
+
45
+ # retrieve and select available dates
46
+ if selected_area:
47
+ area_folder = Path(f"./output/{selected_area}")
48
+ if area_folder.exists():
49
+ available_product_times = [
50
+ str(f.name) for f in area_folder.iterdir() if f.is_dir()
51
+ ]
52
+
53
+ available_product_times = [
54
+ prod_time.replace("_", ":") for prod_time in available_product_times
55
+ ]
56
+ selected_product_time = st.selectbox(
57
+ "Select available date range", options=available_product_times
58
+ )
59
+ else:
60
+ selected_product_time = None
61
+
62
+ # display the bounding box
63
+ bounding_box = bboxes[selected_area]["bounding_box"]
64
+ geojson_selected_area = folium.GeoJson(bounding_box)
65
+ feat_group_selected_area.add_child(geojson_selected_area)
66
+
67
+ # geojson_selected_area = folium.GeoJson(bboxes[selected_area])
68
+ # feat_group_selected_area.add_child(geojson_selected_area)
69
+ if selected_product_time:
70
+ geojson_flood_area = get_existing_flood_geojson(
71
+ selected_area, selected_product_time
72
+ )
73
+ feat_group_selected_area.add_child(geojson_flood_area)
74
+
75
  # Add collapsable container for input map
76
  with st.expander("Input map", expanded=True):
77
  # Create folium map
78
+ # call to render Folium map in Streamlit
79
+ folium_map = folium.Map([39, 0], zoom_start=8)
80
+ # check if the FeatureGroup has any children (i.e., layers added)
81
+ if len(feat_group_selected_area._children) > 0:
82
+ # if there is data, fit the map to the bounds
83
+ folium_map.fit_bounds(feat_group_selected_area.get_bounds())
84
+ else:
85
+ # if there is no data, set a default view
86
+ # this is necessary to start up the page
87
+ folium_map = folium.Map(location=[39, 0], zoom_start=8)
88
+
89
+ m = st_folium(
90
+ folium_map,
91
+ width=800,
92
+ height=450,
93
+ feature_group_to_add=feat_group_selected_area,
94
  )
95
  with col2:
96
  # Add collapsable container for image dates
97
+ with st.expander("Choose Dates", expanded=True):
98
+ start_date = st.date_input("Start date")
99
+ end_date = st.date_input("End date")
100
  # Add collapsable container for parameters
101
  with st.expander("Choose Parameters"):
102
  # Add slider for threshold
103
+ st.text("Add relevant (API) parameters here")
104
+
105
+ submitted = st.button("Update flood extent")
106
+
107
+
108
+ # If the button is clicked do the following
109
  if submitted:
110
  with col2:
111
+ # Some basic validation on dates and that there's an area if relevant
112
+ get_gfm = True
113
+ check_dates = start_date <= end_date
114
+
115
  # Output error if dates are not valid
116
  if not check_dates:
117
  st.error("Make sure that the dates were inserted correctly")
118
+ get_gfm = False
119
+
120
+ # Only if checks pass go and get the GFM data
121
+ if get_gfm:
122
+ # Show loader because it will take a while
123
+ with st.spinner("Getting GFM files... Please wait..."):
124
+ # download_gfm_geojson(area_name)
125
+ download_gfm_geojson(
126
+ selected_area, from_date=start_date, to_date=end_date
127
  )
128
+
129
+ # Display that getting the files is finished
130
+ st.markdown("Getting GFM files finished")
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
app/pages/2_📖_Documentation.py CHANGED
@@ -1,10 +1,9 @@
1
  """Documentation page for Streamlit app."""
 
2
  import streamlit as st
3
- from PIL import Image
4
  from src.config_parameters import params
5
  from src.utils import (
6
  add_about,
7
- add_logo,
8
  set_doc_page_style,
9
  toggle_menu_button,
10
  )
@@ -16,7 +15,6 @@ st.set_page_config(layout="wide", page_title=params["browser_title"])
16
  toggle_menu_button()
17
 
18
  # Create sidebar
19
- add_logo("app/img/MA-logo.png")
20
  add_about()
21
 
22
  # Set page style
@@ -28,63 +26,9 @@ st.markdown("# Documentation")
28
  # First section
29
  st.markdown("## Methodology")
30
  st.markdown(
31
- """
32
- The methodology is based on the workflow depicted in Figure 1. In
33
- addition to Sentinel-1 synthetic-aperture radar <a href='%s'>SAR</a> data,
34
- two other datasets are used through <a href='%s'>Google Earth Engine</a>:
35
- <ul>
36
- <li><p>
37
- The <i>WWF HydroSHEDS Void-Filled DEM, 3 Arc-Seconds</i>
38
- <a href='%s'>dataset</a> is based on elevation data
39
- obtained in 2000 by NASA's Shuttle Radar Topography Mission (SRTM),
40
- and it is used to mask out areas with more than 5 percent slope
41
- (see following section on limitations).
42
- </p>
43
- <li><p>
44
- The <i>JRC Global Surface Water Mapping Layers, v1.4</i>
45
- <a href='%s'>dataset</a> contains maps of the
46
- location and temporal distribution of surface water from 1984 to
47
- 2021, and it is used to mask areas with perennial water bodies,
48
- such as rivers or lakes.
49
- </p>
50
- </ul>
51
- """
52
- % (
53
- params["url_sentinel_dataset"],
54
- params["url_gee"],
55
- params["url_elevation_dataset"],
56
- params["url_surface_water_dataset"],
57
- ),
58
- unsafe_allow_html=True,
59
  )
60
 
61
- # Add image workflow
62
- img = Image.open("app/img/workflow.png")
63
- col1, mid, col2, last = st.columns([5, 3, 10, 10])
64
- with col1:
65
- st.image(img, width=350)
66
- with col2:
67
- # Trick to add caption at the bottom of the column, as Streamlit has not
68
- # developed a functionality to allign text to bottom
69
- space_before_caption = "<br>" * 27
70
- st.markdown(
71
- space_before_caption,
72
- unsafe_allow_html=True,
73
- )
74
- st.markdown(
75
- """
76
- <p style="font-size:%s;">
77
- Figure 1. Workflow of the flood mapping methodology (<a href=
78
- '%s'>source</a>).
79
- </p>
80
- """
81
- % (
82
- params["docs_caption_fontsize"],
83
- params["url_unspider_tutorial_detail"],
84
- ),
85
- unsafe_allow_html=True,
86
- )
87
-
88
 
89
  # Second section
90
  st.markdown("## Radar imagery for flood detection")
@@ -140,40 +84,3 @@ st.markdown(
140
  % (params["docs_caption_fontsize"], params["url_sentinel_img_location"]),
141
  unsafe_allow_html=True,
142
  )
143
-
144
- # Third section
145
- st.markdown("## Key limitations")
146
- st.markdown(
147
- """
148
- Radar imagery is great for detecting floods, as it is good at picking up
149
- water and it is not affected by the time of the day or clouds (at this
150
- wavelength). But it has its limits, and performs actually quite bad if
151
- having to detect water in mountainous regions, especially if with narrow
152
- valleys, and in urban areas (urban canyons). The reasons are mainly around
153
- the viewing angles, which can cause image distortions. This method may also
154
- result in false positives for other land cover changes with smooth
155
- surfaces, such as roads and sand. Rough surface texture caused by wind or
156
- rainfall may also make it challenging for the radar imagery to identify
157
- water bodies.
158
- """,
159
- unsafe_allow_html=True,
160
- )
161
-
162
-
163
- # Last section
164
- st.markdown("## Useful links")
165
- st.markdown(
166
- """
167
- <a href='%s'>UN-SPIDER recommended practice</a><br>
168
- <a href='%s'>Sentinel-1 satellite imagery user guide</a><br>
169
- Relevant scientific publications:
170
- <a href='%s'>1</a>, <a href='%s'>2</a><br>
171
- """
172
- % (
173
- params["url_unspider_tutorial"],
174
- params["url_sentinel_esa"],
175
- params["url_publication_1"],
176
- params["url_publication_2"],
177
- ),
178
- unsafe_allow_html=True,
179
- )
 
1
  """Documentation page for Streamlit app."""
2
+
3
  import streamlit as st
 
4
  from src.config_parameters import params
5
  from src.utils import (
6
  add_about,
 
7
  set_doc_page_style,
8
  toggle_menu_button,
9
  )
 
15
  toggle_menu_button()
16
 
17
  # Create sidebar
 
18
  add_about()
19
 
20
  # Set page style
 
26
  # First section
27
  st.markdown("## Methodology")
28
  st.markdown(
29
+ "TODO: new documentation, only kept in Sentinel 1 section unchanged from the Mapaction tool"
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
30
  )
31
32
 
33
  # Second section
34
  st.markdown("## Radar imagery for flood detection")
 
84
  % (params["docs_caption_fontsize"], params["url_sentinel_img_location"]),
85
  unsafe_allow_html=True,
86
  )
app/src/config_parameters.py CHANGED
@@ -1,65 +1,26 @@
1
  """Configuration file."""
 
2
  params = {
3
  # Title browser tab
4
- "browser_title": "Flood mapping tool - MapAction",
5
  # Data scientists involved
6
  "data_scientists": {
7
- "Piet": "pgerrits@mapaction.org",
8
- "Daniele": "[email protected]",
9
- "Cate": "[email protected]",
10
  },
11
  # Urls
12
- "url_data_science_wiki": (
13
- "https://mapaction.atlassian.net/wiki/spaces/GAFO/overview"
14
- ),
15
- "url_gee": "https://earthengine.google.com/",
16
- "url_project_wiki": (
17
- "https://mapaction.atlassian.net/wiki/spaces/GAFO/pages/15920922751/"
18
- "Rapid+flood+mapping+from+satellite+imagery"
19
- ),
20
  "url_github_repo": "https://github.com/mapaction/flood-extent-tool",
21
- "url_sentinel_esa": (
22
- "https://sentinel.esa.int/web/sentinel/user-guides/sentinel-1-sar"
23
- ),
24
  "url_sentinel_dataset": (
25
- "https://developers.google.com/earth-engine/datasets/catalog/"
26
- "COPERNICUS_S1_GRD"
27
  ),
28
  "url_sentinel_img": (
29
  "https://sentinel.esa.int/documents/247904/4748961/Sentinel-1-Repeat-"
30
  "Coverage-Frequency-Geometry-2021.jpg"
31
  ),
32
  "url_sentinel_img_location": (
33
- "https://sentinel.esa.int/web/sentinel/missions/sentinel-1/"
34
- "observation-scenario"
35
- ),
36
- "url_unspider_tutorial": (
37
- "https://un-spider.org/advisory-support/recommended-practices/"
38
- "recommended-practice-google-earth-engine-flood-mapping"
39
- ),
40
- "url_unspider_tutorial_detail": (
41
- "https://un-spider.org/advisory-support/recommended-practices/"
42
- "recommended-practice-google-earth-engine-flood-mapping/in-detail"
43
- ),
44
- "url_elevation_dataset": (
45
- "https://developers.google.com/earth-engine/datasets/catalog/"
46
- "WWF_HydroSHEDS_03VFDEM"
47
- ),
48
- "url_surface_water_dataset": (
49
- "https://developers.google.com/earth-engine/datasets/catalog/"
50
- "JRC_GSW1_4_GlobalSurfaceWater"
51
- ),
52
- "url_publication_1": (
53
- "https://onlinelibrary.wiley.com/doi/full/10.1111/jfr3.12303"
54
- ),
55
- "url_publication_2": (
56
- "https://www.sciencedirect.com/science/article/abs/pii/"
57
- "S0924271620301702"
58
  ),
59
  # Layout and styles
60
  ## Sidebar
61
- "MA_logo_width": "60%",
62
- "MA_logo_background_position": "35% 10%",
63
  "sidebar_header": "Flood Mapping Tool",
64
  "sidebar_header_fontsize": "30px",
65
  "sidebar_header_fontweight": "bold",
 
1
  """Configuration file."""
2
+
3
  params = {
4
  # Title browser tab
5
+ "browser_title": "Flood mapping tool - 510",
6
  # Data scientists involved
7
  "data_scientists": {
8
+ "Daniele": "dcastellana@redcross.nl",
 
 
9
  },
10
  # Urls
 
 
 
 
 
 
 
 
11
  "url_github_repo": "https://github.com/mapaction/flood-extent-tool",
 
 
 
12
  "url_sentinel_dataset": (
13
+ "https://developers.google.com/earth-engine/datasets/catalog/COPERNICUS_S1_GRD"
 
14
  ),
15
  "url_sentinel_img": (
16
  "https://sentinel.esa.int/documents/247904/4748961/Sentinel-1-Repeat-"
17
  "Coverage-Frequency-Geometry-2021.jpg"
18
  ),
19
  "url_sentinel_img_location": (
20
+ "https://sentinel.esa.int/web/sentinel/missions/sentinel-1/observation-scenario"
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
21
  ),
22
  # Layout and styles
23
  ## Sidebar
 
 
24
  "sidebar_header": "Flood Mapping Tool",
25
  "sidebar_header_fontsize": "30px",
26
  "sidebar_header_fontweight": "bold",
app/src/gfm.py ADDED
@@ -0,0 +1,144 @@
1
+ import io
2
+ import json
3
+ import os
4
+ import zipfile
5
+ from pathlib import Path
6
+
7
+ import folium
8
+ import requests
9
+ from dotenv import load_dotenv
10
+
11
+ load_dotenv()
12
+
13
+
14
+ class FloodGeoJsonError(Exception):
15
+ """Custom exception for errors in fetching flood GeoJSON files."""
16
+
17
+ pass
18
+
19
+
20
+ def sync_cached_data(bboxes_path="./bboxes/bboxes.json", output_dir="./output"):
21
+ """
22
+ Ensures that all areas in bboxes.json have a corresponding folder in ./output/.
23
+ Removes any area entry from bboxes.json that does not have an output folder.
24
+ """
25
+ try:
26
+ # Load existing bounding boxes
27
+ with open(bboxes_path, "r") as f:
28
+ bboxes = json.load(f)
29
+
30
+ # Get a set of existing output folders
31
+ existing_folders = {
32
+ folder.name for folder in Path(output_dir).iterdir() if folder.is_dir()
33
+ }
34
+
35
+ # Remove entries from bboxes.json if the folder does not exist
36
+ updated_bboxes = {
37
+ area: data for area, data in bboxes.items() if area in existing_folders
38
+ }
39
+
40
+ # If changes were made, overwrite bboxes.json
41
+ if len(updated_bboxes) != len(bboxes):
42
+ with open(bboxes_path, "w") as f:
43
+ json.dump(updated_bboxes, f, indent=4)
44
+ print(f"Updated {bboxes_path}: Removed missing areas.")
45
+ else:
46
+ print("All areas have matching folders.")
47
+
48
+ except FileNotFoundError:
49
+ print(f"Error: {bboxes_path} not found.")
50
+ except json.JSONDecodeError:
51
+ print(f"Error: {bboxes_path} is not a valid JSON file.")
52
+
53
+
54
+ def download_gfm_geojson(
55
+ area_name, from_date=None, to_date=None, output_file_path=None
56
+ ):
57
+ """
58
+ Should provide an existing area name or a new area name with new_coordinates
59
+ """
60
+ username = os.environ["gfm_username"]
61
+ password = os.environ["gfm_password"]
62
+ base_url = "https://api.gfm.eodc.eu/v1"
63
+
64
+ # Get token, setup header
65
+ token_url = f"{base_url}/auth/login"
66
+
67
+ payload = {"email": username, "password": password}
68
+
69
+ response = requests.post(token_url, json=payload)
70
+ user_id = response.json()["client_id"]
71
+ access_token = response.json()["access_token"]
72
+ header = {"Authorization": f"bearer {access_token}"}
73
+ print("logged in")
74
+
75
+ # Get Area of Impact
76
+ aoi_url = f"{base_url}/aoi/user/{user_id}"
77
+ response = requests.get(aoi_url, headers=header)
78
+
79
+ # TODO: now only getting the first AOI, should extend to getting the whole list and unioning the geojsons
80
+ for aoi in response.json()["aois"]:
81
+ if aoi["aoi_name"] == area_name:
82
+ aoi_id = aoi["aoi_id"]
83
+ break
84
+
85
+ # Get all product IDs
86
+ params = {
87
+ "time": "range",
88
+ "from": f"{from_date}T00:00:00",
89
+ "to": f"{to_date}T00:00:00",
90
+ }
91
+ prod_url = f"{base_url}/aoi/{aoi_id}/products"
92
+ response = requests.get(prod_url, headers=header, params=params)
93
+ products = response.json()["products"]
94
+ print(f"Found {len(products)} products for {area_name}")
95
+
96
+ if not output_file_path:
97
+ base_file_path = "./output"
98
+
99
+ # Download all available flood products
100
+ for product in products:
101
+ product_id = product["product_id"]
102
+
103
+ # Converts product_time from e.g. "2025-01-05T06:10:37" to ""2025-01-05 06h
104
+ # Reason for bucketing per hour is that products are often seconds or minutes apart and should be grouped
105
+ product_time = product["product_time"]
106
+ product_time = product_time.split(":")[0].replace("T", " ") + "h"
107
+ output_file_path = f"{base_file_path}/{area_name}/{product_time}"
108
+ Path(output_file_path).mkdir(parents=True, exist_ok=True)
109
+
110
+ print(f"Downloading product: {product_id}")
111
+
112
+ download_url = f"{base_url}/download/product/{product_id}"
113
+ response = requests.get(download_url, headers=header)
114
+ download_link = response.json()["download_link"]
115
+
116
+ # Download and unzip file
117
+ r = requests.get(download_link)
118
+ with zipfile.ZipFile(io.BytesIO(r.content)) as z:
119
+ print("Extracting...")
120
+ z.extractall(str(Path(output_file_path)))
121
+
122
+ print("Done!")
123
+
124
+
125
+ def get_existing_flood_geojson(area_name, product_time, output_file_path=None):
126
+ """
127
+ Getting a saved GFM flood geojson in an output folder of GFM files. Merge in one feature group if multiple.
128
+ """
129
+ product_time = product_time.replace(":", "_")
130
+ if not output_file_path:
131
+ output_file_path = f"./output/{area_name}/{product_time}"
132
+
133
+ # Combine multiple flood files into a FeatureGroup
134
+ flood_geojson_group = folium.FeatureGroup(name=f"{area_name} Floods {product_time}")
135
+
136
+ for flood_file in Path(output_file_path).glob("*FLOOD*.geojson"):
137
+ with open(flood_file, "r") as f:
138
+ geojson_data = json.load(f)
139
+ flood_layer = folium.GeoJson(geojson_data)
140
+ flood_geojson_group.add_child(flood_layer)
141
+
142
+ # TODO: consider merging multiple flood layers into one, to avoid overlap
143
+
144
+ return flood_geojson_group
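A minimal usage sketch of the new `src.gfm` helpers (not part of the diff; the area name, dates, and product time are hypothetical, valid GFM credentials must be present in `.env`, and an AOI with that name must already exist in the GFM account):

```python
import folium

from src.gfm import download_gfm_geojson, get_existing_flood_geojson

# Download every GFM flood product for the AOI into ./output/<area>/<YYYY-MM-DD HHh>/
download_gfm_geojson("Catania", from_date="2025-01-01", to_date="2025-01-10")

# Load one downloaded product back as a folium FeatureGroup and add it to a map
floods = get_existing_flood_geojson("Catania", "2025-01-05 06h")
flood_map = folium.Map([39, 0], zoom_start=8)
floods.add_to(flood_map)
```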
app/src/utils.py CHANGED
@@ -1,9 +1,11 @@
1
  """Functions for the layout of the Streamlit app, including the sidebar."""
 
2
  import base64
3
  import os
4
  from datetime import date
5
 
6
  import streamlit as st
 
7
  from src.config_parameters import params
8
 
9
 
@@ -92,88 +94,6 @@ def set_tool_page_style():
92
 
93
 
94
  # Sidebar
95
- @st.cache(allow_output_mutation=True)
96
- def get_base64_of_bin_file(png_file):
97
- """
98
- Get base64 from image file.
99
-
100
- Inputs:
101
- png_file (str): image filename
102
-
103
- Returns:
104
- str: encoded ASCII file
105
- """
106
- with open(png_file, "rb") as f:
107
- data = f.read()
108
- return base64.b64encode(data).decode()
109
-
110
-
111
- def build_markup_for_logo(
112
- png_file,
113
- ):
114
- """
115
- Create full string for navigation bar, including logo and title.
116
-
117
- Inputs:
118
- png_file (str): image filename
119
- background_position (str): position logo
120
- image_width (str): width logo
121
- image_height (str): height logo
122
-
123
- Returns
124
- str: full string with logo and title for sidebar
125
- """
126
- binary_string = get_base64_of_bin_file(png_file)
127
- return """
128
- <style>
129
- [data-testid="stSidebarNav"] {
130
- background-image: url("data:image/png;base64,%s");
131
- background-repeat: no-repeat;
132
- padding-top: 50px;
133
- padding-bottom: 10px;
134
- background-position: %s;
135
- background-size: %s %s;
136
- }
137
- [data-testid="stSidebarNav"]::before {
138
- content: "%s";
139
- margin-left: 20px;
140
- margin-top: 20px;
141
- margin-bottom: 20px;
142
- font-size: %s;
143
- font-weight: %s;
144
- position: relative;
145
- text-align: center;
146
- top: 85px;
147
- }
148
- </style>
149
- """ % (
150
- binary_string,
151
- params["MA_logo_background_position"],
152
- params["MA_logo_width"],
153
- "",
154
- params["sidebar_header"],
155
- params["sidebar_header_fontsize"],
156
- params["sidebar_header_fontweight"],
157
- )
158
-
159
-
160
- def add_logo(png_file):
161
- """
162
- Add logo to sidebar.
163
-
164
- Inputs:
165
- png_file (str): image filename
166
- Returns:
167
- None
168
- """
169
- logo_markup = build_markup_for_logo(png_file)
170
- # st.sidebar.title("ciao")
171
- st.markdown(
172
- logo_markup,
173
- unsafe_allow_html=True,
174
- )
175
-
176
-
177
  def add_about():
178
  """
179
  Add about and contacts to sidebar.
@@ -183,46 +103,16 @@ def add_about():
183
  Returns:
184
  None
185
  """
186
- today = date.today().strftime("%B %d, %Y")
187
-
188
  # About textbox
189
  st.sidebar.markdown("## About")
190
  st.sidebar.markdown(
191
- """
192
- <div class='warning' style='
193
- background-color: %s;
194
- margin: 0px;
195
- padding: 1em;'
196
- '>
197
- <p style='
198
- margin-left:1em;
199
- margin: 0px;
200
- font-size: 1rem;
201
- margin-bottom: 1em;
202
- '>
203
- Last update: %s
204
- </p>
205
- <p style='
206
- margin-left:1em;
207
- font-size: 1rem;
208
- margin: 0px
209
- '>
210
- <a href='%s'>
211
- Wiki reference page</a><br>
212
- <a href='%s'>
213
- GitHub repository</a><br>
214
- <a href='%s'>
215
- Data Science Lab</a>
216
- </p>
217
- </div>
218
- """
219
- % (
220
- params["about_box_background_color"],
221
- today,
222
- params["url_project_wiki"],
223
- params["url_github_repo"],
224
- params["url_data_science_wiki"],
225
- ),
226
  unsafe_allow_html=True,
227
  )
228
 
 
1
  """Functions for the layout of the Streamlit app, including the sidebar."""
2
+
3
  import base64
4
  import os
5
  from datetime import date
6
 
7
  import streamlit as st
8
+
9
  from src.config_parameters import params
10
 
11
 
 
94
 
95
 
96
  # Sidebar
97
  def add_about():
98
  """
99
  Add about and contacts to sidebar.
 
103
  Returns:
104
  None
105
  """
 
 
106
  # About textbox
107
  st.sidebar.markdown("## About")
108
  st.sidebar.markdown(
109
+ f"""
110
+ <p>
111
+ Todo: general about stuff <br />
112
+ <a href='{params["url_github_repo"]}'>
113
+ Github Repo</a>
114
+ </p>
115
+ """,
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
116
  unsafe_allow_html=True,
117
  )
118
 
app/src/utils_ee.py DELETED
@@ -1,36 +0,0 @@
1
- """Module for ee-related functionalities."""
2
- import ee
3
- import streamlit as st
4
- from ee import oauth
5
- from google.oauth2 import service_account
6
- from src.utils import is_app_on_streamlit
7
-
8
-
9
- @st.experimental_memo
10
- def ee_initialize(force_use_service_account: bool = False):
11
- """Initialise Google Earth Engine.
12
-
13
- Checks whether the app is deployed on Streamlit Cloud and, based on the
14
- result, initialises Google Earth Engine in different ways: if the app is
15
- run locally, the credentials are retrieved from the user's credentials
16
- stored in the local system (personal Google account is used). If the app
17
- is deployed on Streamlit Cloud, credentials are taken from the secrets
18
- field in the cloud (a dedicated service account is used).
19
- Inputs:
20
- force_use_service_account (bool): If True, the dedicated Google
21
- service account is used, regardless of whether the app is run
22
- locally or in the cloud. To be able to use a service account
23
- locally, a file called "secrets.toml" should be added to the
24
- folder ".streamlit", in the main project folder.
25
-
26
- Returns:
27
- None
28
- """
29
- if force_use_service_account or is_app_on_streamlit():
30
- service_account_keys = st.secrets["ee_keys"]
31
- credentials = service_account.Credentials.from_service_account_info(
32
- service_account_keys, scopes=oauth.SCOPES
33
- )
34
- ee.Initialize(credentials)
35
- else:
36
- ee.Initialize()
app/src/utils_flood_analysis.py DELETED
@@ -1,369 +0,0 @@
1
- """Functions to derive flood extent using Google Earth Engine."""
2
- import time
3
-
4
- import ee
5
-
6
-
7
- def _check_task_completed(task_id, verbose=False):
8
- """
9
- Return True if a task export completes successfully, else returns false.
10
-
11
- Inputs:
12
- task_id (str): Google Earth Engine task id
13
-
14
- Returns:
15
- boolean
16
-
17
- """
18
- status = ee.data.getTaskStatus(task_id)[0]
19
- if status["state"] in (
20
- ee.batch.Task.State.CANCELLED,
21
- ee.batch.Task.State.FAILED,
22
- ):
23
- if "error_message" in status:
24
- if verbose:
25
- print(status["error_message"])
26
- return True
27
- elif status["state"] == ee.batch.Task.State.COMPLETED:
28
- return True
29
- return False
30
-
31
-
32
- def wait_for_tasks(task_ids, timeout=3600, verbose=False):
33
- """
34
- Wait for tasks to complete, fail, or timeout.
35
-
36
- Wait for all active tasks if task_ids is not provided.
37
- Note: Tasks will not be canceled after timeout, and
38
- may continue to run.
39
- Inputs:
40
- task_ids (list):
41
- timeout (int):
42
-
43
- Returns:
44
- None
45
- """
46
- start = time.time()
47
- elapsed = 0
48
- while elapsed < timeout or timeout == 0:
49
- elapsed = time.time() - start
50
- finished = [_check_task_completed(task) for task in task_ids]
51
- if all(finished):
52
- if verbose:
53
- print(f"Tasks {task_ids} completed after {elapsed}s")
54
- return True
55
- time.sleep(5)
56
- if verbose:
57
- print(
58
- f"Stopped waiting for {len(task_ids)} tasks \
59
- after {timeout} seconds"
60
- )
61
- return False
62
-
63
-
64
- def export_flood_data(
65
- flooded_area_vector,
66
- flooded_area_raster,
67
- image_before_flood,
68
- image_after_flood,
69
- region,
70
- filename="flood_extents",
71
- verbose=False,
72
- ):
73
- """
74
- Export the results of derive_flood_extents function to Google Drive.
75
-
76
- Inputs:
77
- flooded_area_vector (ee.FeatureCollection): Detected flood extents as
78
- vector geometries.
79
- flooded_area_raster (ee.Image): Detected flood extents as a binary
80
- raster.
81
- image_before_flood (ee.Image): The 'before' Sentinel-1 image.
82
- image_after_flood (ee.Image): The 'after' Sentinel-1 image containing
83
- view of the flood waters.
84
- region (ee.Geometry.Polygon): Geographic extent of analysis area.
85
- filename (str): Desired filename prefix for exported files
86
-
87
- Returns:
88
- None
89
- """
90
- if verbose:
91
- print(
92
- "Exporting detected flood extents to your Google Drive. \
93
- Please wait..."
94
- )
95
- s1_before_task = ee.batch.Export.image.toDrive(
96
- image=image_before_flood,
97
- description="export_before_s1_scene",
98
- scale=30,
99
- region=region,
100
- fileNamePrefix=filename + "_s1_before",
101
- crs="EPSG:4326",
102
- fileFormat="GeoTIFF",
103
- )
104
-
105
- s1_after_task = ee.batch.Export.image.toDrive(
106
- image=image_after_flood,
107
- description="export_flooded_s1_scene",
108
- scale=30,
109
- region=region,
110
- fileNamePrefix=filename + "_s1_after",
111
- crs="EPSG:4326",
112
- fileFormat="GeoTIFF",
113
- )
114
-
115
- raster_task = ee.batch.Export.image.toDrive(
116
- image=flooded_area_raster,
117
- description="export_flood_extents_raster",
118
- scale=30,
119
- region=region,
120
- fileNamePrefix=filename + "_raster",
121
- crs="EPSG:4326",
122
- fileFormat="GeoTIFF",
123
- )
124
-
125
- vector_task = ee.batch.Export.table.toDrive(
126
- collection=flooded_area_vector,
127
- description="export_flood_extents_polygons",
128
- fileFormat="shp",
129
- fileNamePrefix=filename + "_polygons",
130
- )
131
-
132
- s1_before_task.start()
133
- s1_after_task.start()
134
- raster_task.start()
135
- vector_task.start()
136
-
137
- if verbose:
138
- print("Exporting before Sentinel-1 scene: Task id ", s1_before_task.id)
139
- print("Exporting flooded Sentinel-1 scene: Task id ", s1_after_task.id)
140
- print("Exporting flood extent geotiff: Task id ", raster_task.id)
141
- print("Exporting flood extent shapefile: Task id ", vector_task.id)
142
-
143
- wait_for_tasks(
144
- [s1_before_task.id, s1_after_task.id, raster_task.id, vector_task.id]
145
- )
146
-
147
-
148
- def retrieve_image_collection(
-     search_region,
-     start_date,
-     end_date,
-     polarization="VH",
-     pass_direction="Ascending",
- ):
-     """
-     Retrieve a Sentinel-1 image collection from Google Earth Engine.
-
-     Inputs:
-         search_region (ee.Geometry.Polygon): Geographic extent of image search.
-         start_date (str): Date in format yyyy-mm-dd, e.g., '2020-10-01'.
-         end_date (str): Date in format yyyy-mm-dd, e.g., '2020-10-01'.
-         polarization (str): Synthetic aperture radar polarization mode, e.g.,
-             'VH' or 'VV'. VH is mostly the preferred polarization for
-             flood mapping.
-         pass_direction (str): Synthetic aperture radar pass direction, either
-             'Ascending' or 'Descending'.
-
-     Returns:
-         collection (ee.ImageCollection): Sentinel-1 images matching the search
-             criteria.
-     """
-     collection = (
-         ee.ImageCollection("COPERNICUS/S1_GRD")
-         .filter(ee.Filter.eq("instrumentMode", "IW"))
-         .filter(
-             ee.Filter.listContains(
-                 "transmitterReceiverPolarisation", polarization
-             )
-         )
-         .filter(ee.Filter.eq("orbitProperties_pass", pass_direction.upper()))
-         .filter(ee.Filter.eq("resolution_meters", 10))
-         .filterDate(start_date, end_date)
-         .filterBounds(search_region)
-         .select(polarization)
-     )
-
-     return collection
-
-
- def smooth(image, smoothing_radius=50):
-     """
-     Reduce the radar speckle by smoothing.
-
-     Inputs:
-         image (ee.Image): Input image.
-         smoothing_radius (int): The radius of the kernel to use for focal mean
-             smoothing.
-
-     Returns:
-         smoothed_image (ee.Image): The resulting image after smoothing is
-             applied.
-     """
-     smoothed_image = image.focal_mean(
-         radius=smoothing_radius, kernelType="circle", units="meters"
-     )
-
-     return smoothed_image
-
-
- def mask_permanent_water(image):
-     """
-     Query the JRC Global Surface Water Mapping Layers, v1.4.
-
-     The goal is to determine where perennial water bodies (water present
-     more than 10 months per year) occur, and to mask these areas.
-     Inputs:
-         image (ee.Image): Input image.
-
-     Returns:
-         masked_image (ee.Image): The resulting image after surface water
-             masking is applied.
-     """
-     surface_water = ee.Image("JRC/GSW1_4/GlobalSurfaceWater").select(
-         "seasonality"
-     )
-     surface_water_mask = surface_water.gte(10).updateMask(
-         surface_water.gte(10)
-     )
-
-     # Flooded layer where perennial water bodies (water > 10 mo/yr) are
-     # assigned a 0 value
-     where_surface_water = image.where(surface_water_mask, 0)
-
-     masked_image = image.updateMask(where_surface_water)
-
-     return masked_image
-
-
- def reduce_noise(image):
-     """
-     Reduce noise in the image.
-
-     Compute the connectivity of pixels and eliminate pixels that belong to
-     patches of fewer than 8 connected pixels.
-     Inputs:
-         image (ee.Image): A binary image.
-
-     Returns:
-         reduced_noise_image (ee.Image): The resulting image after noise
-             reduction is applied.
-     """
-     connections = image.connectedPixelCount()
-     reduced_noise_image = image.updateMask(connections.gte(8))
-
-     return reduced_noise_image
-
-
- def mask_slopes(image):
-     """
-     Mask out areas with more than 5 % slope with a Digital Elevation Model.
-
-     Inputs:
-         image (ee.Image): Input image.
-     Returns:
-         slopes_masked (ee.Image): The resulting image after slope masking is
-             applied.
-     """
-     dem = ee.Image("WWF/HydroSHEDS/03VFDEM")
-     terrain = ee.Algorithms.Terrain(dem)
-     slope = terrain.select("slope")
-     slopes_masked = image.updateMask(slope.lt(5))
-
-     return slopes_masked
-
-
- def derive_flood_extents(
-     aoi,
-     before_start_date,
-     before_end_date,
-     after_start_date,
-     after_end_date,
-     difference_threshold=1.25,
-     polarization="VH",
-     pass_direction="Ascending",
-     export=False,
-     export_filename="flood_extents",
- ):
-     """
-     Derive flood extents from Sentinel-1 images BEFORE and AFTER a flood.
-
-     The before and after periods need to be long enough for Sentinel-1 to
-     acquire at least one image.
-
-     Inputs:
-         aoi (ee.Geometry.Polygon): Geographic extent of analysis area.
-         before_start_date (str): Date in format yyyy-mm-dd, e.g., '2020-10-01'.
-         before_end_date (str): Date in format yyyy-mm-dd, e.g., '2020-10-01'.
-         after_start_date (str): Date in format yyyy-mm-dd, e.g., '2020-10-01'.
-         after_end_date (str): Date in format yyyy-mm-dd, e.g., '2020-10-01'.
-         difference_threshold (float): Threshold to be applied on the
-             differenced image (after flood - before flood). It has been chosen
-             by trial and error. In case your flood extent result shows many
-             false-positive or false-negative signals, consider changing it.
-         polarization (str): Synthetic aperture radar polarization mode,
-             'VH' or 'VV'.
-         pass_direction (str): Synthetic aperture radar pass direction,
-             'Ascending' or 'Descending'.
-         export (bool): Flag to export derived flood extents to Google Drive.
-         export_filename (str): Desired filename prefix for exported files. Only
-             used if export=True.
-
-     Returns:
-         flood_vectors (ee.FeatureCollection): Detected flood extents as vector
-             geometries.
-         flood_rasters (ee.Image): Detected flood extents as a binary raster.
-         before_filtered (ee.Image): The 'before' Sentinel-1 image.
-         after_filtered (ee.Image): The 'after' Sentinel-1 image containing view
-             of the flood waters.
-     """
-     before_flood_img_col = retrieve_image_collection(
-         search_region=aoi,
-         start_date=before_start_date,
-         end_date=before_end_date,
-         polarization=polarization,
-         pass_direction=pass_direction,
-     )
-     after_flood_img_col = retrieve_image_collection(
-         search_region=aoi,
-         start_date=after_start_date,
-         end_date=after_end_date,
-         polarization=polarization,
-         pass_direction=pass_direction,
-     )
-
-     # Create a mosaic of selected tiles and clip to the study area
-     before_mosaic = before_flood_img_col.mosaic().clip(aoi)
-     after_mosaic = after_flood_img_col.mosaic().clip(aoi)
-
-     # Reduce radar speckle by smoothing
-     before_filtered = smooth(before_mosaic)
-     after_filtered = smooth(after_mosaic)
-
-     # Calculate the difference between the before and after images
-     difference = after_filtered.divide(before_filtered)
-
-     # Apply the predefined difference threshold and create the flood extent
-     # mask
-     difference_binary = difference.gt(difference_threshold)
-     difference_binary_masked = mask_permanent_water(difference_binary)
-     difference_binary_masked_reduced_noise = reduce_noise(
-         difference_binary_masked
-     )
-     flood_rasters = mask_slopes(difference_binary_masked_reduced_noise)
-
-     # Convert the extent of detected flood to vector format
-     flood_vectors = flood_rasters.reduceToVectors(
-         scale=10,
-         geometryType="polygon",
-         geometry=aoi,
-         eightConnected=False,
-         bestEffort=True,
-         tileScale=2,
-     )
-
-     if export:
-         export_flood_data(
-             flooded_area_vector=flood_vectors,
-             flooded_area_raster=flood_rasters,
-             image_before_flood=before_filtered,
-             image_after_flood=after_filtered,
-             region=aoi,
-             filename=export_filename,
-         )
-
-     return flood_vectors, flood_rasters, before_filtered, after_filtered
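
The removed module above exposed `derive_flood_extents` as its single entry point, with `export_flood_data` and `wait_for_tasks` handling the optional Google Drive exports. For anyone still relying on the old Earth Engine workflow, a minimal sketch of how it could have been invoked is given below; the date windows are hypothetical placeholders, the bounding box reuses the Valencia coordinates added in `bboxes/bboxes.json` later in this diff, the import path is assumed, and an authenticated Earth Engine session is required.

```python
import ee

# Module path assumed for illustration; the removed file's name is not
# shown in this diff.
from utils import derive_flood_extents

# Assumes Earth Engine credentials have already been set up, e.g. by
# running `earthengine authenticate` once on this machine.
ee.Initialize()

# Area of interest: the Valencia bounding box from bboxes/bboxes.json.
aoi = ee.Geometry.Polygon(
    [
        [
            [-1.098633, 39.227998],
            [-1.098633, 39.639538],
            [0.439453, 39.639538],
            [0.439453, 39.227998],
            [-1.098633, 39.227998],
        ]
    ]
)

# Hypothetical date windows before and after a flood event.
flood_vectors, flood_raster, s1_before, s1_after = derive_flood_extents(
    aoi=aoi,
    before_start_date="2020-09-01",
    before_end_date="2020-09-15",
    after_start_date="2020-10-01",
    after_end_date="2020-10-15",
    difference_threshold=1.25,
    export=False,
)
```

With `export=True`, the same call would also queue the four Drive export tasks defined in `export_flood_data` and block in `wait_for_tasks` until they finish or time out.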
bboxes/bboxes.json ADDED
@@ -0,0 +1,167 @@
+ {
+   "Valencia": {
+     "bounding_box": {
+       "type": "Feature",
+       "properties": {},
+       "geometry": {
+         "type": "Polygon",
+         "coordinates": [
+           [
+             [
+               -1.098633,
+               39.227998
+             ],
+             [
+               -1.098633,
+               39.639538
+             ],
+             [
+               0.439453,
+               39.639538
+             ],
+             [
+               0.439453,
+               39.227998
+             ],
+             [
+               -1.098633,
+               39.227998
+             ]
+           ]
+         ]
+       }
+     }
+   },
+   "Tsjad": {
+     "bounding_box": {
+       "type": "Feature",
+       "properties": {},
+       "geometry": {
+         "type": "Polygon",
+         "coordinates": [
+           [
+             [
+               17.050781,
+               9.178025
+             ],
+             [
+               17.050781,
+               11.18918
+             ],
+             [
+               20.313721,
+               11.18918
+             ],
+             [
+               20.313721,
+               9.178025
+             ],
+             [
+               17.050781,
+               9.178025
+             ]
+           ]
+         ]
+       }
+     }
+   },
+   "Uganda": {
+     "bounding_box": {
+       "type": "Feature",
+       "properties": {},
+       "geometry": {
+         "type": "Polygon",
+         "coordinates": [
+           [
+             [
+               32.519531,
+               0.966751
+             ],
+             [
+               32.519531,
+               2.921097
+             ],
+             [
+               34.145508,
+               2.921097
+             ],
+             [
+               34.145508,
+               0.966751
+             ],
+             [
+               32.519531,
+               0.966751
+             ]
+           ]
+         ]
+       }
+     }
+   },
+   "Pego": {
+     "bounding_box": {
+       "type": "Feature",
+       "properties": {},
+       "geometry": {
+         "type": "Polygon",
+         "coordinates": [
+           [
+             [
+               -0.085831,
+               38.867781
+             ],
+             [
+               -0.085831,
+               38.884085
+             ],
+             [
+               -0.058966,
+               38.884085
+             ],
+             [
+               -0.058966,
+               38.867781
+             ],
+             [
+               -0.085831,
+               38.867781
+             ]
+           ]
+         ]
+       }
+     }
+   },
+   "Albufera": {
+     "bounding_box": {
+       "type": "Feature",
+       "properties": {},
+       "geometry": {
+         "type": "Polygon",
+         "coordinates": [
+           [
+             [
+               -0.410614,
+               39.2615
+             ],
+             [
+               -0.410614,
+               39.373057
+             ],
+             [
+               -0.263672,
+               39.373057
+             ],
+             [
+               -0.263672,
+               39.2615
+             ],
+             [
+               -0.410614,
+               39.2615
+             ]
+           ]
+         ]
+       }
+     }
+   }
+ }
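
The added `bboxes/bboxes.json` stores one GeoJSON `Feature` per named area of interest under a `"bounding_box"` key. The app code that consumes this file is not part of this diff, but a minimal sketch of reading one entry with geopandas and drawing it with folium (both declared in the new `pyproject.toml` below) could look like this; the relative file path and the output filename are assumptions.

```python
import json

import folium
import geopandas as gpd

# Load the bounding boxes; each entry holds a GeoJSON Feature under the
# "bounding_box" key.
with open("bboxes/bboxes.json") as f:
    bboxes = json.load(f)

valencia = bboxes["Valencia"]["bounding_box"]
gdf = gpd.GeoDataFrame.from_features([valencia], crs="EPSG:4326")

# Centre a folium map on the bounding box and draw its outline.
minx, miny, maxx, maxy = gdf.total_bounds
m = folium.Map(location=[(miny + maxy) / 2, (minx + maxx) / 2], zoom_start=9)
folium.GeoJson(valencia, name="Valencia bounding box").add_to(m)
m.save("valencia_bbox.html")
```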
pyproject.toml CHANGED
@@ -1,6 +1,15 @@
- [tool.black]
- line-length = 79
-
- [tool.isort]
- profile = "black"
- line_length = 79
+ [project]
+ name = "flood-mapping-gfm"
+ version = "0.1.0"
+ description = "A streamlit app to map gfm flood forecasts"
+ readme = "README.md"
+ requires-python = ">=3.12"
+ dependencies = [
+     "requests>=2.32.3",
+     "geopandas>=1.0.1",
+     "folium>=0.19.4",
+     "python-dotenv==1.0.1",
+     "streamlit>=1.41.1",
+     "streamlit-folium>=0.24.0",
+     "ipykernel>=6.29.5",
+ ]
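
The new `[project]` table replaces the old formatter configuration with package metadata and the runtime dependencies of a Streamlit app. A minimal page skeleton wiring those dependencies together is sketched below; the page title, map centre, and widget sizes are placeholders, not the app's actual entry point.

```python
import folium
import streamlit as st
from streamlit_folium import st_folium

# Placeholder page configuration; the real app's layout is not part of
# this diff.
st.set_page_config(page_title="Flood mapping", layout="wide")
st.title("GFM flood extent viewer")

# Placeholder map centre (roughly the Valencia area from bboxes.json).
m = folium.Map(location=[39.4, -0.4], zoom_start=9)
st_folium(m, width=900, height=600)
```

Such a page would typically be launched with Streamlit's own runner (`streamlit run <entry-point>.py`), with the dependencies installed from this `pyproject.toml` and the accompanying `uv.lock`.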
requirements.txt DELETED
@@ -1,7 +0,0 @@
- earthengine-api==0.1.331
- folium==0.13.0
- geemap==0.17.2
- streamlit==1.14.1
- streamlit_ext==0.1.4
- streamlit-folium==0.7.0
- pre-commit==2.18.1
setup.cfg DELETED
@@ -1,15 +0,0 @@
- [flake8]
- max-pos-args = 3
- ignore =
-     # Allow f-strings
-     SFS301,
-     # Allow print statements
-     T001,
-     # Allow implicitly concatenated string literals in one line
-     ISC001,
-     # Allow percent operator in string
-     SFS101,
-     # Allow more than one # for comments
-     E266,
-     # Allow line break before binary operator
-     W503
uv.lock ADDED
The diff for this file is too large to render. See raw diff