
SATELLITE IMAGE PROCESSING

Everyone knows what a satellite is, or at least has some idea of what satellites are used for. For example, SpaceX's Starlink constellation and Bharti Airtel's OneWeb constellation are trying to provide high-speed internet across the world, and many other satellites are used for communications, weather forecasting, astronomy and so on.
 

Apart from this, numerous satellites collect images of all the land and water on the planet. After these digital images are processed with various statistical methods, different kinds of discrete pixel information are produced. The information extracted from the pixel values lets you understand the details of a location.

Figure-1

These analyses have numerous applications: they can be used to map water bodies and land cover, vegetation, urban development, mining activity and more. They can further help us prepare for droughts and detect deforestation, vegetation regrowth and forest growth in near real time.

At the micro scale, they can help us find the traffic congestion at a location on a particular day, support precision agriculture based on soil moisture and other available metrics, and even count the cars or people at a site. At the macro scale, they can be used to estimate the solar energy potential of a site to support solar plant construction, assist in mineral exploration and inform urban planning.

What is a Multiband Image?

The sensors on satellites scan the earth in different bands of the electromagnetic spectrum and sense the energy radiated or reflected by the earth's surface. These data are then transmitted to a satellite ground station in digital form. The received images consist of several layers of bands, hence the term multiband images. After further pre-processing, the images are converted into understandable imagery.

Most images are 3-band, i.e. they have red, green and blue bands, together called the RGB bands. Another common format is the 4-band image: red, green and blue plus a near-infrared (NIR) band. Overlaying the red, green and blue bands gives the visible image our eyes can process; these are called true-color images.

Figure-2
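To make the idea concrete, here is a minimal sketch (with made-up 2x2 band values, not taken from any real satellite) showing how individual band layers stack into a multiband image as NumPy arrays:

import numpy as np

# Illustrative 2x2 bands; real band arrays come from a satellite sensor
red   = np.array([[255,   0], [  0,   0]], dtype=np.uint8)
green = np.array([[  0, 255], [  0,   0]], dtype=np.uint8)
blue  = np.array([[  0,   0], [255,   0]], dtype=np.uint8)
nir   = np.array([[  0,   0], [  0, 255]], dtype=np.uint8)

# A 3-band true-color image is simply the red, green and blue layers stacked
rgb = np.dstack((red, green, blue))
print(rgb.shape)    # (2, 2, 3) -> rows, columns, bands

# A 4-band image adds near-infrared as a fourth layer
rgbn = np.dstack((red, green, blue, nir))
print(rgbn.shape)   # (2, 2, 4)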

What are Pixels?

Any digital image is a two-dimensional array of integers. Every element in the array has a discrete value and refers to the smallest area of the image, located by its horizontal (x) and vertical (y) distances. Each element is called a picture element, or pixel. The smaller the pixels, the more information the image contains. A full high-definition (full HD) image contains 1920*1080 pixels, while a standard 480p image contains 720*480 pixels. Hence, the smaller the pixel size, the higher the resolution of the image.

The pixels in a band are indexed by the row and column of the matrix, and the brightness is given by an integer value. In a red band, each pixel represents a spatial location in the image and its value represents the intensity of red over that pixel's area. In most digital images the pixel values are stored as 8-bit numbers, which gives 256 levels of gray scale: the value 0 means no illumination (black) and the value 255 means maximum illumination (white).
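As a quick illustration (with arbitrary values, not from any real image), an 8-bit band is just a NumPy array whose entries range from 0 (black) to 255 (white):

import numpy as np

# A tiny 4x4 "band" of 8-bit pixel values: 0 is black, 255 is white
band = np.array([[  0,  64, 128, 255],
                 [ 32,  96, 160, 224],
                 [ 16,  80, 144, 208],
                 [  8,  72, 136, 200]], dtype=np.uint8)

print(band.shape)              # (4, 4) -> rows, columns
print(band.min(), band.max())  # 0 255
print(band[0, 3])              # pixel in row 0, column 3 -> 255 (white)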

A Real-World Case Study:

In this case study, we try to find the vegetation in one part of the world, a region enclosing San Francisco and Oakland. We do this by collecting imagery of the region and then analyzing the image pixels.

Step 1: Downloading the imagery:

In the first step, we download the dataset from the internet. I'm using the Planet site to download my image, but there are numerous other platforms for downloading imagery. NASA's Landsat series and the European Space Agency's Sentinel missions provide a lot of open data. You can also use Google Earth Engine to download data, and other sources include the USA's USGS.gov, Europe's Copernicus and India's Bhuvan portal. All of these sites let you download data for your area of interest (AoI). This is the imagery I'm using for the case study.

Figure-3
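As one illustration of the Google Earth Engine route mentioned above, here is a rough sketch, assuming you have installed the earthengine-api package and authenticated; the collection ID, coordinates and dates are placeholders, and the actual download is typically done through an export task or the web interface:

import ee

ee.Initialize()  # assumes you have already run `earthengine authenticate`

# Placeholder area of interest: a point roughly between San Francisco and Oakland
aoi = ee.Geometry.Point([-122.3, 37.8])

# Sentinel-2 surface reflectance, filtered to the AoI and a sample date range,
# sorted so the least cloudy scene comes first
collection = (ee.ImageCollection('COPERNICUS/S2_SR')
              .filterBounds(aoi)
              .filterDate('2021-06-01', '2021-06-30')
              .sort('CLOUDY_PIXEL_PERCENTAGE'))

print(collection.size().getInfo())                    # how many scenes matched
print(collection.first().get('system:id').getInfo())  # ID of the least cloudy scene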

Step 2: Loading the Data:

import rasterio                        # reads geospatial raster data (GeoTIFFs)
import numpy as np                     # array operations on the bands
from matplotlib import pyplot as plt   # displaying the images

imagery = rasterio.open("4band location.tif")   # open the downloaded 4-band imagery

We use the installed 'rasterio' package to read the imagery as an N-dimensional array, matplotlib to display images in later steps and NumPy for the array operations that follow.

The downloaded imagery is opened with rasterio.open and stored as 'imagery'.
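Before going further, it can help to inspect the opened dataset; this short check (the attribute names are standard rasterio dataset properties) confirms the band count, size and data type:

# Basic metadata of the opened dataset
print(imagery.count)                  # number of bands -> 4 for this imagery
print(imagery.width, imagery.height)  # raster size in pixels
print(imagery.dtypes)                 # data type of each band, e.g. ('uint16', ...)
print(imagery.crs)                    # coordinate reference system of the imagery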

Step 3: Loading the bands:

# Read each band into its own NumPy array (rasterio band indices start at 1)
blue = imagery.read(1)
green = imagery.read(2)
red = imagery.read(3)
nir = imagery.read(4)

Using rasterio's read function, we load the bands into NumPy arrays. As this is a 4-band image, we assign four individual bands. Note that although the usual 3-band combination is called 'RGB', the bands in this file are stored in blue-green-red-NIR order, so we assign them accordingly as blue, green, red and near-infrared (nir).
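Equivalently, and as a quick sanity check, rasterio can read every band in one call and return a single array shaped (bands, rows, columns); this sketch just confirms the individual band arrays match the stack:

# Read all bands at once: shape is (bands, rows, columns)
all_bands = imagery.read()
print(all_bands.shape)

# The individual band arrays are slices of this stack
assert np.array_equal(all_bands[0], blue)
assert np.array_equal(all_bands[3], nir)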

Step 4: Visualizing the bands:

display = plt.imshow(blue)
plt.axis('off')
plt.show()

Figure-4

display = plt.imshow(green)
display.set_cmap('gist_earth')
plt.axis('off')
plt.show()

Figure-5

display = plt.imshow(red)
display.set_cmap('inferno')
plt.axis('off')
plt.show()

Figure-6

Using matplotlib's imshow, the individual bands are displayed. Since each band is just an array of integers, imshow would render every band with the same default colormap, so we apply pseudocolor maps such as 'gist_earth' and 'inferno' to tell the green and red bands apart.
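If you prefer to see all four bands at once, here is an optional sketch (the colormap choices are purely illustrative) that plots them side by side with matplotlib subplots:

# Display all four bands side by side with different pseudocolor maps
bands = {'Blue': blue, 'Green': green, 'Red': red, 'NIR': nir}
cmaps = {'Blue': 'Blues', 'Green': 'gist_earth', 'Red': 'inferno', 'NIR': 'RdYlGn'}

fig, axes = plt.subplots(1, 4, figsize=(16, 4))
for ax, (name, band) in zip(axes, bands.items()):
    ax.imshow(band, cmap=cmaps[name])
    ax.set_title(name)
    ax.axis('off')
plt.show()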

Step 5: Vegetation Visualization:

rgb = np.dstack((red, green, blue))
plt.axis('off')
plt.imshow(rgb)

Figure-7

The above image is the true-color image of the region, obtained by stacking the red, green and blue bands.
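Depending on the product, the raw band values may be 12- or 16-bit, while matplotlib expects RGB data in the 0-1 (float) or 0-255 (integer) range. If the composite looks washed out or dark, a simple min-max rescale like the sketch below (an assumption about the data, not part of the original workflow) usually helps:

def normalize(band):
    """Rescale a band to the 0-1 range for display."""
    band = band.astype('float64')
    return (band - band.min()) / (band.max() - band.min())

rgb_norm = np.dstack((normalize(red), normalize(green), normalize(blue)))
plt.axis('off')
plt.imshow(rgb_norm)
plt.show()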

nrg = np.dstack((nir,red,green))
plt.axis('off')
plt.imshow(nrg)

Figure-8

By stacking the NIR, red and green bands, the vegetation is displayed in red. This is because the internal structure of healthy plants strongly reflects near-infrared (NIR) wavelengths, so in this false-color composite healthy plants appear red instead of the green we normally see from their chlorophyll. The redder the image, the denser the vegetation, and this NIR response is also more useful for monitoring plant health than the reflected green light.
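A common quantitative follow-up to this idea, though not part of the original walkthrough, is the Normalized Difference Vegetation Index (NDVI), which compares the NIR and red bands directly; a minimal sketch:

# NDVI = (NIR - Red) / (NIR + Red); values near +1 indicate dense, healthy vegetation
nir_f = nir.astype('float64')
red_f = red.astype('float64')
denominator = nir_f + red_f
denominator[denominator == 0] = 1e-10   # avoid division by zero on empty pixels
ndvi = (nir_f - red_f) / denominator

plt.imshow(ndvi, cmap='RdYlGn', vmin=-1, vmax=1)
plt.colorbar(label='NDVI')
plt.axis('off')
plt.show()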

End note:

The uses of satellite image processing are not limited to finding vegetation; in fact, this is its most basic and simplest application. Many researchers and scholars are working to extract better information from such imagery to better serve humanity in times of crisis and for many other purposes.

To understand the code further or to download it, here is the GitHub link:

