Real-Time Flood Claims Analysis
In this tutorial, we’ll create a real-time analysis workflow to monitor flood-impacted properties in England. We'll integrate live data from an API, filter property boundaries within flood alert zones, and visualize the results on a map.
By the end of this tutorial, you will have:
✅ Accessed real-time flood data from an API
✅ Built a workflow to analyze at-risk properties
✅ Scheduled a daily email and map update about at-risk properties
Let's get started!
To access the data that you need:
Asset locations: you can follow our example by downloading Properties_england.csv from our GitHub, which lists all properties sold in England in 2023. Alternatively, use your own asset data or some POI data from our Data Observatory.
Flood alert areas: these are the areas produced by England's Environment Agency which can be linked to live flood alerts. You can download a simplified version from our GitHub (flood_alert_areas.geojson), or access the original Environment Agency data here.
That's all you need for now - let's get going!
Sign in to CARTO at app.carto.com
Head to the Data Explorer tab and click Import data. Follow the instructions to import each of the above tables into CARTO Data Warehouse > Organization > Private. Alternatively, you can use your own data warehouse if you have one connected. When you get to the Schema Preview window, deselect "Let CARTO automatically define the schema" and check that the variables have been defined correctly: any column called "geom" should have the type GEOGRAPHY, and the "value" column in Properties_england.csv should be INT64.
Head to the Workflows tab and select + New Workflow. If you are connected to multiple data warehouses, you will be prompted to select a connection - please choose the one to which you have added your data. Give your workflow a name like "Real-time flood alerts."
In the Sources panel on the left of the window, expand connections and find where you loaded your data to (for us, that's CARTO Data Warehouse > Organization > Private). Drag the two tables onto the canvas. The flood alert areas in particular may take a couple of minutes to load as the geography is very complex.
First, let’s access real-time flood alerts from the Environment Agency.
Head to the left-hand side of your workflow, and switch to the Components panel. From here, find the HTTP request component and drag it onto the canvas. Copy the below URL into the URL box:
Now, add a Custom SQL Select component to the right of the existing component, and connect the output of the HTTP request to the top input (node "a", which is referenced in the code below). Copy and paste the below SQL into the SQL box - this will format the API response into a table with the fields severity_level, river_or_sea, flood_area_id, notation and description. You can reference the API documentation for a full list of available fields if you'd like to adapt this.
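If it helps to see the shape of that query, here is a minimal sketch of what the Custom SQL Select might look like. This assumes the HTTP request output is available via the `$a` input placeholder, with the raw JSON response in a column we're calling `data`, and that the Environment Agency response nests alerts in an `items` array - check your actual column and field names in the Data preview before relying on it:

```sql
-- Sketch only: `data` is an assumed column name for the raw JSON response,
-- and the field paths assume the Environment Agency's response layout.
SELECT
  JSON_EXTRACT_SCALAR(item, '$.severityLevel')        AS severity_level,
  JSON_EXTRACT_SCALAR(item, '$.floodArea.riverOrSea') AS river_or_sea,
  JSON_EXTRACT_SCALAR(item, '$.floodAreaID')          AS flood_area_id,
  JSON_EXTRACT_SCALAR(item, '$.floodArea.notation')   AS notation,
  JSON_EXTRACT_SCALAR(item, '$.description')          AS description
FROM `$a`,
  UNNEST(JSON_EXTRACT_ARRAY(data, '$.items')) AS item
```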
Your workflow should look a little like the below - hit Run! Note we've added an annotation box to this section of our analysis to help keep our analysis organized - you can do this through the Aa button at the top of the screen.
Now, it's time to make this spatial!
Add a Join component to the right of the previous component. Connect the Custom SQL Select output to the top Join input, and the flood_alert_polygons source to the bottom. The respective join columns should be flood_area_id and fws_tacode. Use an inner join type, so we retain only fields which are present in each table. It should look a bit like the screenshot below.
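For readers who prefer SQL, the Join component is roughly equivalent to the following sketch, using the same `$a`/`$b` input placeholders as Custom SQL components ( `$a` being the Custom SQL Select output and `$b` the flood alert areas):

```sql
-- An inner join keeps only rows whose key appears in both tables,
-- so alerts without a matching polygon (and vice versa) are dropped.
SELECT a.*, b.geom
FROM `$a` AS a
INNER JOIN `$b` AS b
  ON a.flood_area_id = b.fws_tacode
```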
If you open the Data preview at the bottom of the screen, you'll see a table containing the live flood alert data. Note the row count will likely be lower for the Join component than for the Custom SQL Select component - this is because the API serves both flood alerts and flood warnings, and only the alerts match the flood alert area polygons.
Depending on the day you're doing this analysis, you will see a different number - we're running this on the 6th December 2024 and have 131 alerts.
Optional: if you are running this on a flooding-free day...
If you're running this analysis on a day that happens to have zero flood alerts, you can download a snapshot of flood alerts for the 31st December 2023 from our GitHub (flood_alerts_20231231.geojson). Drag and drop this file directly into your workflow and use it in place of everything we've just done. However, please note you won't be able to benefit from any of the real-time functionality we're about to go through.
Whether you’re using real-time flood alerts - or historic floods and just pretending they’re real-time - you should now have a component which contains flood boundaries.
Now let’s work out which property boundaries fall in areas with flood alerts. Add a Spatial filter component, and connect the Properties_england.csv source to the top input, and either the Join component (real-time) or flood_alerts_20231231.geojson (historic) to the bottom.
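Conceptually, the Spatial Filter performs an intersection test between the two inputs. A rough SQL equivalent, assuming `$a` is the properties table and `$b` the flood boundaries, would be:

```sql
-- Keep a property when it intersects at least one alert polygon.
-- EXISTS avoids duplicating a property that falls in several polygons.
SELECT p.*
FROM `$a` AS p
WHERE EXISTS (
  SELECT 1
  FROM `$b` AS f
  WHERE ST_INTERSECTS(p.geom, f.geom)
)
```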
Let’s run our workflow again!
Now, connect this to a Send by Email component. Make sure you use the top-most (positive) output of your filter! Enter your email address and a subject line, check the Include data checkbox at the bottom of the panel, and hit Run - you should receive the results by email!
Altogether, your workflow should look something like this:
If you aren’t using the real-time version of this data, now is the time to fully suspend disbelief and pretend you are… because, wouldn’t it be great to get a daily report of which assets may be impacted by floods? We can!
We just need to adjust a couple of settings in our Workflow.
In workflow Settings (two icons to the left of Run), uncheck Use cached results. This means that every time you run your workflow, the entire thing will be re-run.
To the right of Settings, open the Schedule Workflow window (the clock icon). Set this to run once a day.
And that’s it! You will now get daily alerts as to which properties may be impacted by floods (you may want to turn this off at some point to avoid spamming yourself!).
Now, for the final flourish...
Finally, let's turn these results into something a bit more engaging than a table. First, we’ll aggregate them into an H3 frequency grid.
Before doing the below, you may want to briefly disconnect the Send by Email component so you don’t end up with loads of emails from yourself every time you run the workflow!
Connect the top (positive) output of the Spatial Filter to an H3 from Geopoint component to create a column with an H3 hexagonal grid cell reference, and change the resolution to 9, which has an edge length of about 200 meters. Run the workflow.
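Behind the scenes, this step amounts to computing an H3 index for each property point. A sketch of the equivalent SQL, using the CARTO Analytics Toolbox (the exact function name can vary by data warehouse, so treat this as illustrative):

```sql
-- Adds an `h3` column holding the resolution-9 cell each point falls in.
SELECT
  *,
  carto.H3_FROMGEOGPOINT(geom, 9) AS h3
FROM `$a`
```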
Connect this to a Group by component. Group by the column H3 and set up the following three aggregations:
H3 (COUNT)
Value (SUM)
Value (AVG)
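The Group by step above can be expressed as SQL like the following sketch - one row per H3 cell, with a count of properties plus the total and average property value (column names assumed from the earlier steps):

```sql
-- Collapse per-property rows into one row per hexagonal cell.
SELECT
  h3,
  COUNT(h3)  AS h3_count,
  SUM(value) AS value_sum,
  AVG(value) AS value_avg
FROM `$a`
GROUP BY h3
```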
Finally, connect this to a Create Column component. Call the new column date and paste in the function CAST(CURRENT_DATE() AS STRING). This will help your users know exactly which data they are looking at.
Every component's output is saved as a temporary table. To commit this output, connect the Create Column component to a Save as Table component, and save it back in CARTO Data Warehouse > Organization > Private, calling the table "flood_alerts_daily." This will overwrite the table every time your workflow is run - or you can check the option to append the results to the existing table to build up results over time.
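The difference between the two save modes is easiest to see as SQL. A sketch, using a hypothetical fully-qualified table name (substitute your own project and dataset):

```sql
-- Overwrite mode: the table is rebuilt from scratch on every run.
CREATE OR REPLACE TABLE `my-project.private.flood_alerts_daily` AS
SELECT * FROM `$a`;

-- Append mode: each run's rows are added to the existing table,
-- which (combined with the date column) gives you a running history.
INSERT INTO `my-project.private.flood_alerts_daily`
SELECT * FROM `$a`;
```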
❗Now would be a good time to reconnect the Send by Email component to the Spatial Filter.
Your final workflow should be looking something like this:
Now let's turn this into something a bit more visual!
Select the Save as Table component. Open the Map preview at the bottom of the screen and select Create Map. This will take you to a fresh CARTO Builder map with your data pre-loaded - select the map name (top left of the screen) to rename the map "Live flood alerts."
In the Layer panel, click on Layer 1 to rename the layer "Assets in flood alert areas" and style your data. We’d recommend removing the stroke and changing the fill color to be determined by the SUM(h3_count) variable to show the number of potentially impacted assets in each H3 cell. Expand the fill styling options and change the color scale to Quantize.
Head to the Legend panel (to the right of Layers) to ensure the names used in the legend are clear (for instance we've changed h3_count to "Number of assets").
To the right of the Layer panel, switch to the Widgets panel, to add a couple of dashboard elements to help your users understand your map. We’d recommend:
Formula widget: SUM, h3_count - to show the total number of properties in flood alert areas.
Formula widget: SUM, value_sum - to show the total value of properties in flood alert areas.
Category widget: SUM, h3_count, aggregation column: date. This will allow your users to see the most recent date that the data was updated.
For each of these widgets, scroll to the bottom of the Widget panel and change the behavior from Global to Viewport, and watch as the values change as you pan and zoom.
Finally, in the Sources panel (bottom left of your screen), set the Data freshness from Default to 24 hours. This will ensure your data is updated daily.
Now, Share your map (top right of the screen) with your Organization or the public. Grab the shareable link from the share window, and head back to your workflow. Change the email body to:
Now, every day, your lucky email recipients will receive both a full report of the assets in flood alert areas and an interactive dashboard to explore the results.
Want to take this further? Try changing the basemap, adding pop-ups and adding the filtered asset geometries in as an additional layer that appears as you zoom further in. Here's what our final version looks like (frozen on 06/12/2024):
Looking for tips? Head to the Data Visualization section of the Academy!