

From Ashes to Algorithms: How GOES Satellites and Python Can Protect Wildlife and Communities

Author(s): Ruiz Rivera

Originally published on Towards AI.

Photo by BBC News

Introduction

Imagine what it must be like to be a creature on a hot, dry summer day, living in a remote forest within a dense mountainous region you’ve called home for as long as you can remember. Imagine you’re a small, less mobile creature. Maybe you’re thinking of a pup, a cub, a fawn, or a mouse. Take your pick.

So far, nothing about this day seems any different from the last. That is, until you smell an unfamiliar scent that’s difficult to inhale at first. You’re not sure what it is, but the scent grows more potent, and it’s at this point that your instincts tell you to flee. You start running in a direction where you sense the air isn’t as thick as before. Unfortunately, your small stature allows you to travel neither very far nor very quickly. What’s worse, the scent is now overpowering. It’s nauseating. Choking. Stinging your eyes. And the temperature around you is rising to the point that you find it unbearable.

You look back and see something menacing approaching: the orange hue of what we know to be flames swallowing the surrounding trees. You have never encountered anything like this before, but your brain is frantically screaming at your legs to move, to escape. Yet all your senses are impaired, either by the scorch of the flames or the lack of oxygen from the smoke. Either way, you feel the heat of the fire surrounding you as you desperately struggle to breathe, see, or flee to safety.

And then it begins. The flames make contact with your skin, and every pore of your body experiences a searing, unimaginable pain. Tears flood your eyes and you scream in agony as your flesh blackens in the inferno for what feels like an eternity.

Suddenly, you experience a moment of tranquility like the kind you feel before falling into a deep, long, peaceful sleep. The pain has disappeared. Key memories you hold dear then start flashing rapidly as the world around you fades.

While this may only be an approximation of what a creature with limited mobility experiences in its final moments during a wildfire, it doesn’t take much reasoning to conclude that countless creatures once inhabiting a fire-ravaged forest undergo some version of this excruciating ending. There’s possibly no worse ending imaginable than writhing in anguish while being burnt alive.

As elaborate as it was, this exposition is meant to illustrate how consequential it is to detect and respond to a wildfire as early as possible, since doing so can be the difference between life and death for many of the creatures inhabiting the forest. With that purpose in mind, the work of data analytics professionals, wildfire researchers, and open-source developers who can bridge various domains to detect and forecast wildfires has never been more important in an age where mass summer burns are the norm. With tools such as open-source access to near real-time satellite monitoring systems, developers can give emergency responders, First Nations leaders, government agencies, and community stakeholders an advantage in controlling the damage that wildfires cause. Thanks to the countless scientists and engineers who developed the hardware for such systems and the open-source algorithms to detect environmental anomalies, the tools to keep our ecosystems and communities safe have never been more accessible! In the following sections, we’ll explore how to access NOAA’s GOES-16/17 satellites using nothing but Python and Google’s Earth Engine API to build near real-time fire detection capabilities.

Scoping GOES-16 and GOES-17

In a previous article, we introduced the basics of remote sensing using data captured by the Sentinel-2 satellites, highlighting their strengths and weaknesses, particularly for the use case of building a wildfire perimeter. Luckily, we are not limited to a single point of failure, as we have other systems to shore up Sentinel-2’s vulnerabilities, such as the aforementioned GOES-16 and GOES-17 satellites.

Before we go further, let’s take a closer look at how these satellites work and how they differ from others currently in orbit. The Geostationary Operational Environmental Satellites (GOES) are a set of geostationary satellites that capture high-temporal-resolution images every 5–15 minutes, with each pixel having a resolution of about 0.5 to 2 km (NOAA & NASA, 2024). When we refer to a satellite as geostationary, we mean that it orbits roughly 35,800 km above the equator in the same direction as the Earth’s rotation and at a matching angular speed, so that from the perspective of a ground-bound observer, the satellite appears nearly stationary. Of the two satellites we mentioned earlier, GOES-16 does the majority of the image capture over the North and South American continents, while GOES-17 functions as a ready spare when necessary (NOAA & NASA, 2024).
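To make that “matching speed” claim concrete, here is a quick back-of-the-envelope check in Python (a sketch using standard physical constants, not part of the original article): solving Kepler’s third law for the altitude at which the orbital period equals one sidereal day lands almost exactly on the GOES orbit.

import math

# Kepler's third law: T^2 = 4 * pi^2 * a^3 / GM, solved for the semi-major
# axis a at which the orbital period T equals one sidereal day.
GM = 3.986004418e14   # Earth's gravitational parameter (m^3 / s^2)
T = 86164.1           # one sidereal day (s)
R_EARTH_KM = 6371.0   # mean Earth radius (km)

a_m = (GM * T**2 / (4 * math.pi**2)) ** (1 / 3)
print(f"Geostationary altitude: {a_m / 1000 - R_EARTH_KM:,.0f} km")  # roughly 35,800 km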

On board each GOES satellite is the Advanced Baseline Imager (ABI), an instrument that images the Earth’s weather, oceans, and environment through its 16 spectral bands (NOAA & NASA, n.d.). While tracking the spread of wildfires is the use case we’re most interested in, these satellites can also provide independent data sources for monitoring cloud formation, land surface temperature, ocean dynamics, volcanic ash plumes, vegetative health, and more. Because the GOES satellites can take snapshots every 5–15 minutes, decision-makers can rely on the monitoring and fire perimeters we build from this data to inform their emergency response. In contrast to Sentinel-2, the GOES satellites are also capable of gathering data 24/7 through their thermal infrared bands, which do not rely on sunlight (NOAA & NASA, n.d.). They can also work around cloud cover by snapping images during windows when the cover is less dense (NOAA & NASA, n.d.).

Now that we’ve gotten a brief overview of the GOES-16/17 satellites out of the way, let’s start extracting data again from the Earth Engine Data Catalog, using the same parameters we used earlier to locate the Lytton Creek wildfire at the end of June 2021. As we can see below, we extracted over 4,000 images from each satellite thanks to their lightning-quick 5–15 minute capture intervals.

import ee
import folium
import geemap.core as geemap
import numpy as np
import pandas as pd
import pprint
import pytz
import matplotlib.pyplot as plt
from IPython.display import Image
from datetime import datetime

# Initialize the Earth Engine client (run ee.Authenticate() first if needed)
ee.Initialize()

# Point of interest and date range. These are assumed, approximate values for
# the Lytton Creek wildfire (British Columbia, late June 2021); the original
# parameters were defined in the previous article in this series.
poi = ee.Geometry.Point([-121.58, 50.23]).buffer(30000)
start_date = "2021-06-28"
end_date = "2021-07-04"

# Gathering satellite data
goes_16 = ee.ImageCollection("NOAA/GOES/16/FDCF").filterDate(start_date, end_date).filterBounds(poi)

goes_17 = ee.ImageCollection("NOAA/GOES/17/FDCF").filterDate(start_date, end_date).filterBounds(poi)

# Print the number of images in each collection
print(f"Number of GOES-16 images: {goes_16.size().getInfo()}")
print(f"Number of GOES-17 images: {goes_17.size().getInfo()}")

# Getting a feel for the data we've extracted from the Earth Engine dataset
pprint.pp(goes_17.first().getInfo())

Let’s also load the map_from_mask_codes_to_confidence_values() and apply_scale_factors() functions that the team at Google provided to process our data.

def map_from_mask_codes_to_confidence_values(image):
    # fire_mask_codes, confidence_values, and default_confidence_value
    # are defined later in the article, just before we apply this function.
    return image.clip(poi).remap(fire_mask_codes, confidence_values, default_confidence_value)

# Applies scaling factors. (The "SR_B." and "ST_B.*" patterns follow Landsat
# Collection 2 band naming, so this helper targets Landsat-style imagery.)
def apply_scale_factors(image):
    optical_bands = image.select("SR_B.").multiply(0.0000275).add(-0.2)
    thermal_bands = image.select("ST_B.*").multiply(0.00341802).add(149.0)
    return image.addBands(optical_bands, None, True).addBands(
        thermal_bands, None, True
    )

Overview of the Fire Detection and Characterization (FDC) Algorithm

Now that we’ve talked a little about the satellites used to generate the data, let’s discuss how to detect the presence of wildfires in these images. Luckily for us, Google makes this easy by giving developers ready access to the Fire Detection and Characterization (FDC) algorithm, which was developed by a research team at the University of Wisconsin-Madison.

The primary objective of the FDC algorithm is to return the likelihood of a fire based on the pixel data of an input image (Restif & Hoffman, 2020). For those interested, below is a brief overview of the steps the FDC algorithm takes to accomplish this objective:

1) First, the algorithm takes the data from the thermal infrared (TIR) band of the satellite sensor (band 14), as well as the shortwave infrared (SWIR) band (band 7), and converts the brightness of each pixel to a temperature;

2) Next, it flags certain TIR pixels based on whether they exceed a certain threshold. Examples of such thresholds include:

  • Absolute threshold based on a set temperature;
  • Relative threshold based on the delta between a pixel’s temperature and its neighbour’s exceeding a set amount.

3) If a pixel is flagged, the algorithm checks for false positives by evaluating the temperature of its neighbouring pixels, just as in the previous step. When checking neighbouring pixels, we can apply a different threshold from the one in step 2 if we wish, and in our code example below, we do just that by applying a relative threshold instead.

4) If the neighbouring pixels also exceed the threshold, the algorithm applies one last check for false positives by evaluating whether the difference between the pixel temperatures produced by the SWIR (band 7) and TIR (band 14) bands exceeds a relative threshold.

5) If the difference between the SWIR and TIR pixel temperatures exceeds our relative threshold, the algorithm returns a 1, or True, confirming that the pixel in question is indeed a fire pixel.

Our code below is a simplified demonstration of steps 1–5 of the FDC algorithm. Note that our explanation only covers detecting the presence of a fire from pixel brightness, so the final result of our simplified FDC algorithm is a binary True/False value.

# Fire Detection and Characterization (FDC) algorithm - simplified example implementation

# Simulated satellite image data
def create_simulated_data(width=50, height=50):
    # Create background temperature (avg 290 Kelvin, or 16.85 degrees Celsius)
    background = np.random.normal(290, 2, (height, width))

    # Add some hotter spots (potential fires) with temperatures between 310
    # and 330 Kelvin (i.e. 36.85 to 56.85 degrees Celsius)
    num_hotspots = 5
    for _ in range(num_hotspots):
        x, y = np.random.randint(0, width), np.random.randint(0, height)
        hotspot_temp = np.random.uniform(310, 330)
        background[y, x] = hotspot_temp

    return background

# Simplified FDC algorithm - our absolute threshold is 310 K (36.85 degrees Celsius)
def simplified_fdc(image_4um, image_11um, absolute_threshold=310, relative_threshold=10):
    height, width = image_4um.shape
    fire_mask = np.zeros((height, width), dtype=bool)

    for i in range(1, height - 1):
        for j in range(1, width - 1):
            # Step 1: Check absolute threshold
            if image_4um[i, j] > absolute_threshold:
                # Step 2: Calculate background from the 3x3 neighbourhood
                background = np.mean(image_4um[i - 1:i + 2, j - 1:j + 2])

                # Step 3: Check relative threshold
                if image_4um[i, j] - background > relative_threshold:
                    # Step 4: Multi-channel confirmation
                    if image_4um[i, j] - image_11um[i, j] > 10:
                        fire_mask[i, j] = True

    return fire_mask

# Create simulated data
image_4um = create_simulated_data()
image_11um = image_4um - np.random.normal(10, 2, image_4um.shape) # 11um channel is typically cooler
# Apply simplified FDC algorithm
fire_detections = simplified_fdc(image_4um, image_11um)

# Visualize results
fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(12, 5))

im1 = ax1.imshow(image_4um, cmap="hot")
ax1.set_title("Simulated 4ΞΌm Channel")
plt.colorbar(im1, ax=ax1, label="Temperature (K)")

ax2.imshow(image_4um, cmap="gray")
ax2.imshow(fire_detections, cmap="Reds", alpha=0.5)
ax2.set_title("FDC Algorithm Fire Detections")

plt.tight_layout()
plt.show()

print(f"Number of fire pixels detected: {np.sum(fire_detections)}")
Source: Image by the author

Number of fire pixels detected: 4

# Visualize results
fig1, (ax3, ax4) = plt.subplots(1, 2, figsize=(12, 5))

im2 = ax3.imshow(image_11um, cmap="hot")
ax3.set_title("Simulated 11ΞΌm Channel")
plt.colorbar(im2, ax=ax3, label="Temperature (K)")

ax4.imshow(image_11um, cmap="gray")
ax4.imshow(fire_detections, cmap="Reds", alpha=0.5)
ax4.set_title("FDC Algorithm Fire Detections")

plt.tight_layout()
plt.show()

print(f"Number of fire pixels detected: {np.sum(fire_detections)}")
Source: Image by the author

Number of fire pixels detected: 4

Applying the Fire Detection and Characterization (FDC) Algorithm

There are additional steps in the algorithm, such as estimating the fire radiative power (FRP), which represents the brightness or intensity of the fire in a confirmed pixel. From there, the algorithm assigns a confidence value to the probability that the pixel reflects an actual fire and plots it on a map to build a fire perimeter.
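As a brief, hedged aside: the per-pixel FRP estimate is exposed by the FDCF product as a "Power" band (band name per the Earth Engine data catalog; this snippet is a sketch rather than part of the article’s original pipeline), and we can reduce it over time much like we do with the fire mask below.

# Sketch: pull the per-pixel fire radiative power (FRP, in megawatts) from
# the FDCF "Power" band and keep each pixel's maximum over the time range.
# Band name per the Earth Engine catalog; not part of the original pipeline.
goes_16_max_frp = (
    goes_16.select(["Power"])
    .map(lambda image: image.clip(poi))
    .reduce(ee.Reducer.max())
)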

For the sake of brevity, we’ll cover the complexities behind these confidence values in a future article, so for now, take these explanations at face value. At this point in the code, we assign confidence_values between 10–100% to the mask codes produced by the algorithm. For a single output, if the algorithm returns a mask value of 15, it is classifying the pixel as a low-probability fire pixel at 10% confidence; in contrast, if it returns a value of 10, there’s a near 100% probability that it is a processed fire pixel (Restif & Hoffman, 2020). The resulting values from this process are captured in the goes_16_confidence and goes_17_confidence objects in the following code.

# Conversion from mask codes to confidence values.
fire_mask_codes = [10, 30, 11, 31, 12, 32, 13, 33, 14, 34, 15, 35]
confidence_values = [1.0, 1.0, 0.9, 0.9, 0.8, 0.8, 0.5, 0.5, 0.3, 0.3, 0.1, 0.1]
default_confidence_value = 0

# Processing the GOES-16 satellite images
goes_16_confidence = goes_16.select(["Mask"]).map(map_from_mask_codes_to_confidence_values)
goes_16_max_confidence = goes_16_confidence.reduce(ee.Reducer.max())

# Processing the GOES-17 satellite images
goes_17_confidence = goes_17.select(["Mask"]).map(map_from_mask_codes_to_confidence_values)
goes_17_max_confidence = goes_17_confidence.reduce(ee.Reducer.max())

Data Visualization

One more thing before we map our results. Since the satellites collect data over a specific time range, the probability of a fire in a given pixel may vary greatly as the on-ground event evolves. Although the temporal aspect of the data contains plenty of valuable information, in this instance we’re more concerned with generating a broad outline of the fire boundary. To do so, we can use the ee.Reducer.max() function to return the highest confidence value of each pixel within the specified time range (Restif & Hoffman, 2020). We'll apply this to both the goes_16_confidence and the goes_17_confidence objects before overlaying the pixel plots on our map below.

# We can visualize this initial data processing step from each satellite using:
affected_area_palette = ["white", "yellow", "orange", "red", "purple"]

earth_engine_viz = {
    "opacity": 0.3,
    "min": 0,
    "max": 1,
    "palette": affected_area_palette
}

# Create a map.
Map = geemap.Map()
Map.centerObject(poi, 9)
Map.addLayer(poi, {"color": "green"}, "Area of interest", True, 0.2)
Map.addLayer(goes_16_max_confidence, earth_engine_viz, "GOES-16 maximum confidence")
Map.addLayer(goes_17_max_confidence, earth_engine_viz, "GOES-17 maximum confidence")
Map
Source: Image by the author

From our initial results, we can see the two iterations of the FDC algorithm layered on top of each other on the map. We can combine the results of our two satellites to tighten our wildfire perimeter using the ee.Reducer.min() function, which returns the lesser of the two confidence values where the layers intersect, filtering out detections that only one satellite registered (Restif & Hoffman, 2020).

# Combine the confidence values from both GOES-16 and GOES-17 using the minimum reducer
combined_confidence = ee.ImageCollection([goes_16_max_confidence, goes_17_max_confidence]).reduce(ee.Reducer.min())

# Create a map
Map = geemap.Map()
Map.centerObject(poi, 9)
Map.addLayer(poi, {"color": "green"}, "Area of interest", True, 0.2)
Map.addLayer(combined_confidence, earth_engine_viz, "Combined confidence")

# Display the map
Map
Source: Image by the author

With the results of our two satellites combined, notice how the generated boundary is still highly pixelated due to the coarse resolution of the satellite imagery. One last refinement we can make is to smooth the boundaries between the combined fire masks using the ee.Image.reduceNeighborhood() function.

# Define the kernel for smoothing
kernel = ee.Kernel.square(2000, "meters", True)

# Apply the smoothing using reduceNeighborhood with the mean reducer
smoothed_confidence = combined_confidence.reduceNeighborhood(
    reducer=ee.Reducer.mean(),
    kernel=kernel,
    optimization="boxcar"
)

# Create a map
Map = geemap.Map()
Map.centerObject(poi, 9)
Map.addLayer(poi, {"color": "green"}, "Area of interest", True, 0.2)
Map.addLayer(smoothed_confidence, earth_engine_viz, "Smoothed confidence")

# Display the map
Map
Source: Image by the author

There you have it! A near real-time wildfire boundary built with Python by deploying the FDC algorithm on GOES-16 and GOES-17 satellite images from Google’s Earth Engine Data Catalog. However, as with most technologies, using the FDC on GOES-16/17 images doesn’t come without weaknesses, which we’ll discuss so we have a better understanding of the situations where other technologies would be more appropriate.

One risk of using the FDC algorithm on GOES-16/17 images is its tendency to detect false positives within an image. For example, reflective surfaces such as buildings in urban areas, lakes, or dry vegetation in a forest may be misconstrued as fire.
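One hedged way to cut down on such false positives (a sketch that assumes the FDCF product’s "DQF" data-quality band, where a value of 0 marks good-quality fire pixels per the Earth Engine catalog; it is not part of this article’s pipeline) is to mask out pixels the algorithm itself flags as low quality before mapping mask codes to confidence values:

# Sketch: keep only pixels whose data-quality flag (DQF) equals 0, i.e.
# good-quality fire pixels per the Earth Engine catalog (an assumption,
# not part of the original pipeline).
def keep_good_quality_fire_pixels(image):
    return image.updateMask(image.select("DQF").eq(0))

goes_16_quality_filtered = goes_16.map(keep_good_quality_fire_pixels)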

Additionally, the image resolution of the GOES-16/17 satellites is poorer than that of other data collection techniques. We saw this first-hand in the pixelated fire perimeter we produced in our initial application of the FDC algorithm. The reason the wildfire perimeter was so pixelated is that each pixel covers anywhere between 4 and 36 square kilometers, depending on how far the area is from the centre of the image. Due to the spherical shape of the Earth and the satellite’s position, the farther an area is from the centre of an image, the lower its image quality. For wildfire detection, this means that fire activity smaller than the pixel size may be mischaracterized or missed completely.

Another aspect to consider is the terrain of the area of interest. This risk mostly concerns mountainous terrain, where the leeward side of a mountain may obscure the satellite’s view of that area.

To mitigate these risks, we must use other imaging techniques and technologies alongside GOES-16/17 data to gain a clearer understanding of the ground situation. As we’ve previously discussed, high-resolution data from the Sentinel-2 and Landsat satellites can be highly complementary when available, allowing us to cross-validate our resulting wildfire boundaries. On top of that, ground observations and aerial drone surveys add another layer of validation to a highly dynamic event.

There’s little doubt that executing the FDC algorithm on GOES-16/17 data can be a powerful asset in helping us build wildfire perimeters in near real-time as part of a broader mitigation strategy alongside other sensing techniques.

Thank you for taking the time to read through our work! If you’re interested in learning more, please feel free to check out our open-source repository, where we continue to research ways to improve the Government of British Columbia’s (Canada) detection of and response to wildfires across the province. Additionally, feel free to access the notebook associated with this article if you would like to run the code in its entirety.

See you in our next post ✌


Published via Towards AI
