Real-Time Flood Claims Analysis


In this tutorial, we’ll create a real-time analysis workflow to monitor flood-impacted properties in England. We'll integrate live data from an API, filter property boundaries within flood alert zones, and visualize the results on a map.

By the end of this tutorial, you will have:

✅ Accessed real-time flood data from an API

✅ Built and scheduled a workflow to analyze at-risk properties

✅ Scheduled a daily email and map update about at-risk properties

Let's get started!

You'll need...

To access the data that you need:

  • Asset locations: you can follow our example by downloading Properties_england.csv from our github; it's a list of all properties sold in England in 2023. Alternatively, why not use your own asset data, or some POI data from our Data Observatory?

  • Flood alert areas: these are the areas produced by England's Environment Agency which can be linked to live flood alerts. You can download a simplified version from github (flood_alert_areas.geojson), or access the original Environment Agency data here.

You'll also need to sign in to CARTO at app.carto.com.

That's all you need for now - let's get going!


Step 1: Setting up your workflow

  1. Head to the Data Explorer tab and click Import data. In turn, follow the instructions to import each of the above tables into CARTO Data Warehouse > Organization > Private. Alternatively, you can use your own data warehouse if you have one connected. When you get to the Schema Preview window, deselect "Let CARTO automatically define the schema" and ensure the variables have been defined correctly; any column called "geom" should be of type GEOGRAPHY, and the "value" column in Properties_england.csv should be INT64. (If you'd like to sanity-check the types in SQL, see the sketch after this list.)

  2. Head to the Workflows tab and select + New Workflow. If you are connected to multiple data warehouses, you will be prompted to select a connection - please choose the one to which you have added your data. Give your workflow a name like "Real-time flood alerts."

  3. In the Sources panel on the left of the window, expand connections and find where you loaded your data to (for us, that's CARTO Data Warehouse > Organization > Private). Drag the two tables onto the canvas. The flood alert areas in particular may take a couple of minutes to load as the geography is very complex.
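If you'd rather verify the schema directly in your data warehouse, the expected types look roughly like this. This is a minimal sketch: the table path is a placeholder for wherever you actually imported the file.

```sql
-- Illustrative only: the column types the tutorial expects after import.
-- Replace the table path with wherever you loaded the data.
CREATE TABLE IF NOT EXISTS `carto-dw.private.properties_england` (
  geom GEOGRAPHY,  -- property location; must be GEOGRAPHY, not STRING
  value INT64      -- property value; must be INT64, not STRING
);
```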


Step 2: Call real-time flood alerts

First, let’s access real-time flood alerts from the Environment Agency.

  1. Head to the left-hand side of your workflow, and switch to the Components panel. From here, find the HTTP request component and drag it onto the canvas. Copy the below URL into the URL box:

https://environment.data.gov.uk/flood-monitoring/id/floods

  2. Now, add a Custom SQL Select component to the right of the HTTP request component, and connect the output of the HTTP request to the input of the Custom SQL Select (make sure you use the top output node, which is node "a", as referenced in the code below). Copy and paste the below SQL into the SQL box - this will format the API response into a table with the fields severity_level, river_or_sea, flood_area_id, notation and description. You can reference the API documentation for a full list of available fields if you'd like to adapt this.

```sql
WITH json_data AS (
  SELECT response_data AS json_response
  FROM $a
),

formatted_data AS (
  SELECT
    CAST(JSON_EXTRACT_SCALAR(item, '$.severityLevel') AS INT64) AS severity_level,
    JSON_EXTRACT_SCALAR(item, '$.floodArea.riverOrSea') AS river_or_sea,
    JSON_EXTRACT_SCALAR(item, '$.floodAreaID') AS flood_area_id,
    JSON_EXTRACT_SCALAR(item, '$.floodArea.notation') AS notation,
    JSON_EXTRACT_SCALAR(item, '$.description') AS description
  FROM json_data,
  UNNEST(JSON_EXTRACT_ARRAY(json_response, '$.items')) AS item
)

SELECT *
FROM formatted_data
```
  3. Your workflow should look a little like the below - hit Run! Note we've added an annotation box to this section of the workflow to help keep the analysis organized - you can do this through the Aa button at the top of the screen.
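If you'd like to understand what the Custom SQL Select is doing before wiring it up, you can test the same JSON_EXTRACT pattern against a hand-written literal. This is purely illustrative: the values below are made up, and the real API response contains many more fields.

```sql
-- A made-up, single-item response used to illustrate the parsing pattern;
-- the live API returns the same structure under the "items" key.
WITH json_data AS (
  SELECT '{"items": [{"severityLevel": 3, "floodAreaID": "ABC123", "description": "Example river"}]}' AS json_response
)
SELECT
  CAST(JSON_EXTRACT_SCALAR(item, '$.severityLevel') AS INT64) AS severity_level,
  JSON_EXTRACT_SCALAR(item, '$.floodAreaID') AS flood_area_id,
  JSON_EXTRACT_SCALAR(item, '$.description') AS description
FROM json_data,
UNNEST(JSON_EXTRACT_ARRAY(json_response, '$.items')) AS item
```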

Now, it's time to make this spatial!

  4. Add a Join component to the right of the previous component. Connect the Custom SQL Select output to the top Join input, and the flood_alert_polygons source to the bottom. The respective join columns should be flood_area_id and fws_tacode. Use an inner join type, so we retain only rows which are present in both tables. It should look a bit like the screenshot below.

If you open the Data preview at the bottom of the screen, you'll be able to see a table containing the live flood alert data. Note the row count will likely be lower for the Join component than for the Custom SQL Select component - this is because the API serves both flood alerts and flood warnings.

Depending on the day you're doing this analysis, you will see a different number - we're running this on the 6th December 2024 and have 131 alerts.
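For reference, the Join component is doing something roughly equivalent to the SQL below. This is a sketch, not the component's actual implementation: `flood_alerts` stands in for the Custom SQL Select output and `flood_alert_polygons` for your polygon source.

```sql
-- Roughly what the inner Join produces (illustrative table names):
SELECT alerts.*, polygons.geom
FROM flood_alerts AS alerts
INNER JOIN flood_alert_polygons AS polygons
  ON alerts.flood_area_id = polygons.fws_tacode  -- only areas with a live alert are kept
```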

Optional: if you are running this on a flooding-free day...

If you're running this analysis on a day that happens to have zero flood alerts, you can download a snapshot of flood alerts for the 31st December 2023 from our github (flood_alerts_20231231.geojson). You can drag and drop this file directly into your workflow and use it in place of everything we've just done. However, please note you won't be able to benefit from any of the real-time functionality we're about to go through.


Step 3: Report on assets in flood alert areas

Whether you're using real-time flood alerts - or historic floods and just pretending they're real-time - you should now have a component which contains flood boundaries.

  1. Now let's work out which property boundaries fall in areas with flood alerts. Add a Spatial Filter component, connect the Properties_england.csv source to the top input, and connect either the Join component (real-time) or flood_alerts_20231231.geojson (historic) to the bottom. (For what this filter is doing under the hood, see the SQL sketch after this list.)

  2. Let’s run our workflow again!

  3. Now, connect this to a Send by Email component. Make sure you use the top-most (positive) output of your filter! Enter your email address and a subject line, check the Include data checkbox at the bottom of the panel, and hit Run - you should receive the results by email!
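Under the hood, the Spatial Filter's positive output is roughly equivalent to the query below. Again, this is a sketch under assumed table names (`properties` and `flood_areas`), not the component's exact implementation.

```sql
-- Illustrative only: keep properties whose geometry intersects any flood alert area.
SELECT properties.*
FROM properties
WHERE EXISTS (
  SELECT 1
  FROM flood_areas
  WHERE ST_INTERSECTS(properties.geom, flood_areas.geom)
)
```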

Altogether, your workflow should look something like this:

If you aren’t using the real-time version of this data, now is the time to fully suspend disbelief and pretend you are… because, wouldn’t it be great to get a daily report of which assets may be impacted by floods? We can!

We just need to adjust a couple of settings in our Workflow.

  1. In the workflow Settings (two icons to the left of Run), uncheck Use cached results. This means that every time your workflow runs, the entire thing will be re-run.

  2. To the right of Settings, open the Schedule Workflow window (the clock icon). Set this to run once a day.

And that’s it! You will now get daily alerts as to which properties may be impacted by floods (you may want to turn this off at some point to avoid spamming yourself!).

Now, for the final flourish...


Step 4: Sharing a live dashboard

Finally, let's turn these results into something a bit more engaging than a table. First, we'll aggregate them into an H3 frequency grid.

Before doing the below, you may want to briefly disconnect the Send by Email component so you don't end up with loads of emails from yourself every time you run the workflow!

  1. Connect the top (positive) output of the Spatial Filter to an H3 from Geopoint component to create a column with an H3 hexagonal grid cell reference, and change the resolution to 9, which has an edge length of about 200 meters. Run the workflow. (See the SQL sketch after this list for roughly what steps 1-3 compute.)

  2. Connect this to a Group by component. Group by the column H3 and set up the following three aggregations:

    1. H3 (COUNT)

    2. Value (SUM)

    3. Value (AVG)

  3. Finally, connect this to a Create Column component. Call the new column date and paste in the expression CAST(CURRENT_DATE() AS STRING). This will help your users know exactly which data they are looking at.

  4. Every component's output is saved as a temporary table. To commit this output, connect the Create Column component to a Save as Table component, and save it back in CARTO Data Warehouse > Organization > Private, calling the table "flood_alerts_daily." This will overwrite the table every time your workflow is run - or you can check the option to append the results to the existing table to add results over time.
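Taken together, steps 1-3 compute something close to the query below. This is a sketch: `filtered_properties` stands in for the Spatial Filter output, it assumes point geometries, and H3_FROMGEOGPOINT is the Analytics Toolbox function we believe the H3 from Geopoint component relies on in BigQuery.

```sql
-- Illustrative only: aggregate filtered assets into a resolution-9 H3 grid.
WITH gridded AS (
  SELECT
    `carto-un`.carto.H3_FROMGEOGPOINT(geom, 9) AS h3,  -- ~200 m edge-length cells
    value
  FROM filtered_properties
)
SELECT
  h3,
  COUNT(h3) AS h3_count,                  -- number of assets per cell
  SUM(value) AS value_sum,                -- total asset value per cell
  AVG(value) AS value_avg,                -- average asset value per cell
  CAST(CURRENT_DATE() AS STRING) AS date  -- tells users when the data was generated
FROM gridded
GROUP BY h3
```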

❗Now would be a good time to reconnect the Send by Email component to the Spatial Filter.

Your final workflow should be looking something like this:

Now let's turn this into something a bit more visual!

  1. Select the Save as Table component. Open the Map preview at the bottom of the screen and select Create Map. This will take you to a fresh CARTO Builder map with your data pre-loaded - select the map name (top left of the screen) to rename the map "Live flood alerts."

  2. In the Layer panel, click on Layer 1 to rename the layer "Assets in flood alert areas" and style your data. We'd recommend removing the stroke and having the fill color determined by the SUM(h3_count) variable to show the number of potentially impacted assets in each H3 cell. Expand the fill styling options and change the color scale to Quantize. Looking for styling tips? Head to the Data Visualization section of the Academy!

  3. Head to the Legend panel (to the right of Layers) to ensure the names used in the legend are clear (for instance we've changed h3_count to "Number of assets").

  4. To the right of the Layer panel, switch to the Widgets panel to add a few dashboard elements that help your users understand your map. We'd recommend:

    1. Formula widget: SUM, H3_Count - to show the total number of properties in flood alert areas.

    2. Formula widget: SUM, Value_sum - to show the total value of properties in flood alert areas.

    3. Category widget: SUM, H3_Count, aggregation column: date. This will allow your users to see the most recent date that the data was updated.

For each of these widgets, scroll to the bottom of the Widget panel and change the behaviour from global to viewport, and watch as the values change as you pan and zoom.

  5. Finally, in the Sources panel (bottom left of your screen), set the Data freshness from Default to 24 hours. This will ensure your data is updated daily.

  6. Now, Share your map (top right of the screen) with your Organization or the public. Grab the shareable link from the share window, head back to your workflow, and change the email body in the Send by Email component to:

Explore the data [here](copy_your_link_here).

Now every day, your lucky email recipients will receive both a full report of the assets in flood alert areas, as well as an interactive dashboard to explore the results.

Want to take this further? Try changing the basemap, adding pop-ups, and adding the filtered asset geometries as an additional layer that appears as you zoom further in. Here's what our final version looks like (frozen on 06/12/2024):
