Mapping the inferno: Harnessing Sentinel-2 satellites and Python to build a wildfire perimeter

Last Updated on November 3, 2024 by Editorial Team

Author(s): Ruiz Rivera

Originally published on Towards AI.

Photo by Husky Haven

In Canada’s westernmost province, British Columbia, the uptick in the severity and occurrence of wildfires has caused substantial damage to the province’s social and ecological systems. For instance, in 2023 alone, 2,245 wildfires burned in the province, sending 2.84 million hectares of land up in smoke (BC Public Service, 2023). Although British Columbia is a large region with diverse terrain, the total area burnt was truly an anomaly: it more than doubled the previous record of 1.215 million hectares set in 2018 (BC Public Service, 2023). As a result of these wildfires, tens of thousands of people were forced to evacuate their homes. While the fortunate were able to flee the crisis, plenty of other living creatures didn’t have the means, ability, or luck to escape such an agonizing fate. And so far, no estimates have been released on the amount of wildlife that perished in those flames.

Whether a wildfire is caused by natural means like lightning strikes or by humans, detecting it early and estimating its perimeter is one of the most challenging, yet impactful, ways to minimize its destruction. One reason early detection can be so impactful is that fires that spark in remote forests can grow to unmanageable proportions before they spread close enough to an urban centre for locals to report them. In a province as large as British Columbia, with vast stretches of dense forest, rocky mountains, and dry grassland, communities on the ground may not detect a wildfire that starts in a remote area until it has already consumed hundreds, or even thousands, of acres along with the wildlife within them. As an example, on May 12th, 2023, a lightning strike in Donnie Creek, an area about 136 km southeast of the nearest urban centre, Fort Nelson, lit the fuse for a wildfire that burned about 5,700 square km before it was finally contained (Kulkarni, 2023).

However, with recent advances in satellite technology and remote sensing techniques, we can now combine raw images from orbiting satellites with maps to build estimates of a wildfire perimeter. In this article, we’ll use raw images from the Sentinel-2 satellite, stored in the Google Cloud Earth Engine data catalogue, to build a wildfire perimeter that we can overlay on a map, giving us a reference point to urban centres and other significant landmarks. By accessing image data from different satellite sources, we can highlight the strengths and weaknesses of each approach and build a broader foundation in wildfire detection and remote sensing more generally.

Let’s begin.

import ee
import folium
import geemap.core as geemap
import numpy as np
import pandas as pd
import pprint
import pytz
import matplotlib.pyplot as plt
from IPython.display import Image
from datetime import datetime

Initializing access to the Google Cloud Earth Engine

First, we’ll need to import the libraries above and then authenticate our Google account so that we can access the images and computing power associated with this task. For our teaching purposes, we should be able to access the resources we need for free. For those who don’t yet have a Google Cloud project set up to access the Earth Engine API, here’s a quick guide on how to get started. We’ll need a project name associated with the Earth Engine (ee) library to access the compute and data required for the rest of our code to run.

Now, enter your Google Cloud project name in the project argument to authenticate your Earth Engine account.

# Trigger the authentication flow.
ee.Authenticate()

# Initialize the library.
ee.Initialize(project="enter-project-name")

With our libraries and packages set up, let’s now zoom in on a particular area of interest. One major fire we can study is the tragic Lytton Creek wildfire, which began on June 30th, 2021 and raged for several weeks (Lindsay & Dickson, 2021). The fire burned down the entire village, leaving about 2,000 residents without a home and killing two people in the process (Lindsay & Dickson, 2021).

The fire stemmed from a recipe of extreme climate events that severely elevated the risk of natural disasters in the area. For starters, just the day before the wildfire started, Lytton set an all-time record for the highest temperature ever recorded in Canada, with a high of 49.6 °C. Under those extreme drought conditions, and with winds travelling up to 71 km per hour that day, the fire spread incredibly fast, leaving residents and firefighters little time to prepare.

To get a first-hand glance at the severity of the Lytton Creek wildfire, let’s start by pulling images from the Sentinel-2 satellite. For the rest of this analysis, we’ll use the latitude and longitude coordinates of the old Lytton library as the central point of our satellite data, and a time range between 2021-06-15 and 2021-07-15 so that we have a view of Lytton before and after the fire.

# Getting coordinates of the point of interest
# which is the Lytton library as the poi for the Lytton Creek wildfire that started on 2021-06-30
lat = 50.23124506328952
lon = -121.58154057521354

# start date of range to filter for
start_date = "2021-06-15"

# end date
end_date = "2021-07-15"

# radius of interest in meters
radius_of_interest_meters = 40000

# point of interest as an ee.Geometry
poi = ee.Geometry.Point([lon, lat]).buffer(radius_of_interest_meters)

datetime_format = "%Y-%m-%d %H:%M:%S"

Extracting Sentinel-2 satellite data

Before we go further, let’s do a brief overview of what Sentinel-2 is and how it differs from other satellites. Sentinel-2 is a set of twin satellites deployed by the European Space Agency with the stated mission of gaining a “new perspective of our land and vegetation” through its 13 spectral bands (European Space Agency, 2024).

For those new to satellite imaging or remote sensing, spectral bands refer to the wavelength of light that is either reflected or absorbed from the Earth’s surface and can be detected through satellite sensors. Examples of bands we’ll be using here include short-wave infrared (SWIR) bands, near-infrared (NIR) bands, and red, green, and blue bands. In the context of remote sensing, we can analyze the light from these bands to make inferences about the state of the Earth’s surface. Not only can we use this information to determine the location of wildfires, but we can also use it to ascertain information about an area’s water quality or quantify its amount of forest cover.
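For quick reference, here is a small mapping we’ve put together of the Sentinel-2 band names used later in this article, with approximate central wavelengths (the band numbers follow the Sentinel-2 naming that Earth Engine exposes; the wavelengths are rounded, so treat them as indicative rather than exact):

# Sentinel-2 bands used in this article, with approximate central wavelengths
S2_BANDS = {
    "B2": "blue (~490 nm)",
    "B3": "green (~560 nm)",
    "B4": "red (~665 nm)",
    "B8": "near-infrared (NIR, ~842 nm)",
    "B12": "short-wave infrared (SWIR, ~2190 nm)",
}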

Sentinel-2 gathers data by orbiting the Earth and taking snapshots of the surface approximately every 5 days (European Space Agency, 2024). Each image covers a swath about 290 km wide, which is great because it allows us to monitor large stretches of land with a single image, provided the view is clear and largely free from clouds or smoke blocking the surface.

In addition to the image data, we’ll also make use of some functions written by Justin Braaten (2022) and his team at Google to aid our analysis. The first is the mask_s2_clouds() function, which masks cloudy pixels to give us clearer imagery (Braaten, 2022). The second is add_ee_layer(), which lets us overlay satellite images on top of a Google-Maps-like interface from the folium package, so that we're able to cross-reference our raw image data with all the symbols and markers we'd typically see on a map (Braaten, 2022).

# A Google function that allows us to mask clouds from our satellite images
def mask_s2_clouds(image):
    """Masks clouds in a Sentinel-2 image using the QA band.

    Args:
        image (ee.Image): A Sentinel-2 image.

    Returns:
        ee.Image: A cloud-masked Sentinel-2 image.
    """
    qa = image.select("QA60")

    # Bits 10 and 11 are clouds and cirrus, respectively.
    cloud_bit_mask = 1 << 10
    cirrus_bit_mask = 1 << 11

    # Both flags should be set to zero, indicating clear conditions.
    mask = (
        qa.bitwiseAnd(cloud_bit_mask)
        .eq(0)
        .And(qa.bitwiseAnd(cirrus_bit_mask).eq(0))
    )

    return image.updateMask(mask).divide(10000)



# A Google function that allows ee layers on folium
def add_ee_layer(self, ee_image_object, vis_params, name):
    """Adds a method for displaying Earth Engine image tiles to a folium map."""
    map_id_dict = ee.Image(ee_image_object).getMapId(vis_params)
    folium.raster_layers.TileLayer(
        tiles=map_id_dict["tile_fetcher"].url_format,
        attr='Map Data &copy; <a href="https://earthengine.google.com/">Google Earth Engine</a>',
        name=name,
        overlay=True,
        control=True,
    ).add_to(self)

# Add the Earth Engine drawing method to folium
folium.Map.add_ee_layer = add_ee_layer

With our date and geocoordinate parameters, we should extract about 27 Sentinel-2 satellite images from the Google Earth Engine data catalogue and store them in our s2_dataset object.

s2_dataset = (
    ee.ImageCollection("COPERNICUS/S2_SR_HARMONIZED")
    .filterDate(start_date, end_date)
    .filterBounds(poi)
    .filter(ee.Filter.lt("CLOUDY_PIXEL_PERCENTAGE", 20))
    .map(mask_s2_clouds)
)

# How many images did we get?
print("Total number:", s2_dataset.size().getInfo())

# Getting a feel for the resulting data structure
s2_dataset.first().getInfo()

If we look at each of the 27 images pulled from the Earth Engine bucket, we’ll find that the image quality of most isn’t great, as clouds or smoke often block our view of Lytton. Therefore, we’ve manually selected the ones with the best image quality, stored the list in our interesting_images object, and displayed them to get a visual of the raw images we'll be working with.
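As an aside, if you’d rather rank scenes programmatically than eyeball them, one option (a sketch on our part, reusing the same CLOUDY_PIXEL_PERCENTAGE scene property we filtered on earlier) is to sort the collection by cloudiness and keep the clearest scenes:

# Optional alternative to manual selection: sort by the scene-level
# CLOUDY_PIXEL_PERCENTAGE property and keep the clearest images.
clearest = s2_dataset.sort("CLOUDY_PIXEL_PERCENTAGE").limit(7)
print(clearest.aggregate_array("CLOUDY_PIXEL_PERCENTAGE").getInfo())

Note that scene-level cloudiness is computed over the whole tile, so a low value doesn’t guarantee our specific area of interest is clear, which is why we stick with the hand-picked list below.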

s2_params = {
    "bands": ["B4", "B3", "B2"],  # True color (RGB)
    "min": 0,
    "max": 0.3,  # Adjust min and max values as needed
    "dimensions": 512,
    "region": poi,
}

# Generating an image list for the s2 data
s2_image_list = s2_dataset.toList(s2_dataset.size())

# Selecting the images with decent quality
# To access all the scraped images, replace the following object with:
# interesting_images = range(s2_dataset.size().getInfo())
interesting_images = [5, 9, 11, 12, 15, 18, 26]

for i in interesting_images:
    try:
        # Extract the timestamp string
        s2_property = s2_image_list.get(i).getInfo()
        timestamp_str = s2_property["properties"]["system:index"].split("_")[0]

        # Convert to a datetime object
        datetime_obj = datetime.strptime(timestamp_str, "%Y%m%dT%H%M%S")

        # Make the UTC datetime timezone-aware
        dt_utc = pytz.utc.localize(datetime_obj)

        # Convert to Pacific Time
        pacific_tz = pytz.timezone("America/Los_Angeles")
        dt_pacific = dt_utc.astimezone(pacific_tz).strftime(datetime_format)

        print(f"Image #{i} / Date: {dt_pacific} PST/PDT")

        s2_image = ee.Image(s2_image_list.get(i))
        s2_url = s2_image.getThumbUrl(s2_params)
        display(Image(url=s2_url))

    except Exception:
        # Skip any image that fails to load or parse
        pass

Here’s an example of one of the images that should have been extracted:

Image #18 / Date: 2021-07-09 12:09:19 PST/PDT

Transforming the data through NBR

Now that we’ve extracted our raw images, we can see how difficult it is to spot fire activity with the naked eye. This is where spectral bands come into play. Since Sentinel-2 gathers data from short-wave infrared (SWIR) and near-infrared (NIR) bands, we can use a combination of the two to transform our image data and get a clearer picture of the area that’s been burnt (or is burning).

What we’re describing here is the normalized burn ratio (NBR), which uses the following formula to identify the burned areas in a particular region and visually quantify the severity of the burn:

NBR = (NIR − SWIR) / (NIR + SWIR)

The NBR formula uses the difference between NIR and SWIR reflectance, normalized by their sum, to quantify burned areas based on the numerical values we observe from the NIR and SWIR bands (United Nations, n.d.). Without going too deep into the maths, let’s briefly describe the purpose of each underlying band.

Near-infrared (NIR) light is reflected more strongly the healthier and greener the vegetation on a given land surface. In areas that have been burnt, we’ll see less of this wavelength reflected from the Earth’s surface (United Nations, n.d.).

On the other hand, short-wave infrared (SWIR) light is reflected according to the moisture content of the underlying surface. Areas with high moisture, and presumably healthier vegetation, tend to absorb this wavelength, which means less of it is reflected and picked up by our satellites (United Nations, n.d.). In contrast, drier areas reflect more SWIR light since there is less moisture to absorb it.

The contrast between these two bands helps us easily distinguish areas of healthy vegetation from burnt ones, as well as gauge the severity of the burn (United Nations, n.d.). The resulting NBR values range from -1 to 1 (see the quick numeric check after this list), where:

  • Negative values often indicate bare ground or recently burned areas
  • Values close to zero indicate urban or water areas
  • Positive values typically represent vegetated areas
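To build intuition, here’s a quick plain-Python sanity check of the formula using hypothetical reflectance values (the numbers are illustrative on our part, not taken from our imagery):

# Hypothetical reflectances: healthy vegetation reflects much more NIR than SWIR
nir, swir = 0.45, 0.15
print((nir - swir) / (nir + swir))  # ~0.50 -> vegetated

# Hypothetical reflectances: a fresh burn scar reflects little NIR and more SWIR
nir, swir = 0.10, 0.35
print((nir - swir) / (nir + swir))  # ~-0.56 -> recently burned

With that intuition in hand, here’s the Earth Engine version we’ll map over our image collection: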
# Function to calculate NBR
def calculate_nbr(image):
    """
    Calculate the Normalized Burn Ratio (NBR) for a given satellite image using
    the Near-Infrared (NIR) and Short-Wave Infrared (SWIR) bands, typically
    used for burn severity assessment in wildfire studies.

    NBR = (NIR - SWIR) / (NIR + SWIR)

    The resulting values typically range from -1 to 1, where:
    * Negative values often indicate bare ground or recently burned areas
    * Values close to zero indicate urban or water areas
    * Positive values typically represent vegetated areas
    """
    nbr = image.normalizedDifference(["B8", "B12"]).rename("NBR")
    return image.addBands(nbr)

Visualizing the spatial data through Folium

Now that we have a better understanding of NBR, let’s apply it to each of the β€œinteresting images” we’ve extracted from Sentinel-2 and layer them over an open-source map.

# Apply NBR calculation
s2_nbr = s2_dataset.map(calculate_nbr)

# Convert the NBR collection to a list so we can index individual images
s2nbr_list = s2_nbr.toList(s2_nbr.size())

s2_vis_params = {
    # B4 = Red band - useful for calculating vegetation indices
    # B8 = Near-Infrared (NIR) band - also useful for calculating vegetation and burn severity
    # B12 = Short-Wave Infrared (SWIR) band - great for detecting fires and hot spots
    "bands": ["B12", "B8", "B4"],  # Ash's bands: ["B12", "B11", "B9"]
    "min": 0.0,
    "max": 0.3,
    "gamma": 1.4,
}


# Create a map
wildfire_map = folium.Map(location=[lat, lon], zoom_start=10)

# Add a layer for each satellite image of interest (before, during, and after)
for i in interesting_images:

    # Extract the timestamp string
    s2nbr_list_property = s2nbr_list.get(i).getInfo()
    timestamp_str = s2nbr_list_property["properties"]["system:index"].split("_")[0]

    # Convert to a datetime object
    s2_datetime = datetime.strptime(timestamp_str, "%Y%m%dT%H%M%S")

    # Make the UTC datetime timezone-aware
    dt_utc = pytz.utc.localize(s2_datetime)

    # Convert to Pacific Time
    pacific_tz = pytz.timezone("America/Los_Angeles")
    s2_datetime_pst = dt_utc.astimezone(pacific_tz).strftime(datetime_format)

    # Image title
    title = f"Sentinel-2 SWIR Image #{i} / Date: {s2_datetime_pst} PST/PDT"

    # Extract the image from the list
    s2_nbr_image = ee.Image(s2nbr_list.get(i))

    # Add the image layer to the map
    wildfire_map.add_ee_layer(s2_nbr_image, s2_vis_params, name=title)

# Add a layer control panel to the map
folium.LayerControl(collapsed=False).add_to(wildfire_map)

# Display the map
display(wildfire_map)

And if everything goes as expected, our code should generate the following result:

The output of our NBR-transformed Sentinel-2 images layered over our Folium map

From the NBR-transformed satellite imagery we collected of the area surrounding Lytton between June 15th and July 15th, we can see distinct areas that were ravaged by the fires in 2021. Removing all the layers by de-selecting the images gives us a bearing on where the town of Lytton sits in the satellite image.

However, if we select image layers #9, 11, 12, and 26, we’ll find another fire perimeter northeast of Lytton, one that looks slightly larger. Because Sentinel-2 images cover such a large area, a little further research led us to a surprising discovery: the Sparks Lake wildfire, another natural disaster that was unfolding at the same time as the Lytton Creek wildfire, having started on June 29th, 2021 (Watson & Lindsay, 2021). The Sparks Lake wildfire also turned out to be one of B.C.’s most devastating fires in history, fuelled by the heat dome the province experienced that year. The fire appears to have been human-caused, linked to a nearby marijuana grow operation, and burned about 960 square kilometres over 69 days before firefighters finally got it under control.

Drawbacks

While the data from Sentinel-2 does a fine job of identifying and mapping burnt areas of a region, it suffers a heavy drawback: it cannot provide real-time information to emergency responders and members of the public. Because Sentinel-2 needs time to orbit and image other areas of the globe, there is a 2–5 day lag between snapshots of a particular point of interest (POI), depending on how close it is to the equator (European Space Agency, 2024). In situations where a natural disaster is unfolding rapidly and emergency responders need prompt information for high-stakes decisions, Sentinel-2 data will not be sufficient, so we must turn to other tools to fill the gap.

An additional concern with Sentinel-2 imagery is the presence of clouds or smoke over our POI, which can massively reduce image quality by limiting the amount of light the Earth’s surface reflects back to the satellite’s sensors. Sentinel-2 is also limited to operating during the day due to its reliance on sunlight, so gathering data at night is out of the question.

Conclusion

By harnessing the power of Sentinel-2 satellite technology, we’ve explored approaches to wildfire monitoring that can significantly enhance our ability to protect communities, wildlife, and ecosystems. As we continue to face the growing threat of wildfires in an era of climate change, these remote sensing techniques represent a crucial step forward in mitigating the destructive force of these natural disasters by bridging the gap between space-based observation and on-the-ground firefighting efforts. With this workflow, we can generate insights about other wildfires and ecosystem characteristics simply by swapping out the geocoordinates of our POI and the date range of the event we’re interested in analyzing.
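For example, re-running the notebook for a different event only requires changing the handful of parameters we defined at the top. The coordinates and dates below are illustrative placeholders we’ve made up, not verified values for any particular fire:

# Hypothetical example: point the same workflow at a different fire.
# These coordinates and dates are placeholders; substitute your own POI.
lat, lon = 57.0, -121.5
start_date, end_date = "2023-05-01", "2023-07-31"
poi = ee.Geometry.Point([lon, lat]).buffer(radius_of_interest_meters)

Everything downstream, from the image pull to the NBR layers on the folium map, stays the same.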

Thank you for reading to the end of the article with us! In addition to the references included below, we invite you to visit the research released on the BC Government’s Wildfire Predictive Services’ (WPS) GitHub page, aimed at helping others get started on their wildfire research or remote sensing journey.

References


Published via Towards AI
