SATELLITE IMAGE PROCESSING AND CLASSIFICATION

 

by XIAOWEI WU

 

 

 

INTRODUCTION

 

 

 

A. WEATHER SATELLITE INTRODUCTION

 

Our everyday view of the atmosphere is from the bottom looking up and around. Our field of view is limited since most of us can see only a few kilometers in any direction. At the same time, the systems that dominate our weather can be hundreds or even thousands of kilometers across. Weather maps and radar have extended our views, but it is the weather satellite that gives us a completely different perspective on weather. Orbiting satellites are platforms from which the atmosphere and surfaces below can be observed from the outside. By looking down on weather, we can see that fair and stormy weather are somehow related. Clear areas and giant swirls of clouds fit together. In the continually changing atmosphere we can observe evidence of predictability through the order and evolution of weather systems.

 

The sensors onboard the satellites react to two basic types of radiant energy. Visible light is produced by the sun and reflected off Earth surfaces and clouds, back up to the satellite. These images appear the same as black-and-white television pictures. All clouds look white to the sensor, as they do to our eyes. Darker ground surfaces and water bodies in clear areas reflect little sunlight back up to space and therefore appear dark gray or black. Visible images from the current geostationary weather satellites can resolve objects such as clouds that are as small as one kilometer in width.

 

The second main type of sensor detects infrared or heat energy given off by surfaces with temperatures in the range of the Earth's land and water surfaces and cloud tops. The intensity of the infrared energy is related to the specific temperature of the emitting surface. In this way, infrared (IR) images are temperature maps of the Earth view. Because the Earth and atmosphere emit heat day and night, infrared images are always available. The infrared sensor on the geostationary weather satellites can distinguish areas as small as four kilometers in width.

 

 

University of Arkansas at Little Rock weather satellite dish for the GPS, Remote Sensing, and GIS Lab

 

 

 

B. SATELLITE IMAGES INTRODUCTION

 

(1) Visible Satellite Images

 

·         Visible satellite images are views produced from reflected sunlight. Thus, these pictures look similar to pictures made with an ordinary camera.

·         On visible satellite imagery, clouds appear white and the ground and water surfaces are dark gray or black. Since this imagery is produced by sunlight, it is only available during daylight hours.

·         Low clouds and fog are usually distinguishable from nearby land surfaces. In addition, the hazy conditions associated with air pollution can be tracked.

·         The shadows of thunderstorm clouds can be seen cast on lower clouds in the late afternoon. Snow cover can be monitored because it does not move as clouds do. Land features, such as streams, can be visible.

 

 

(2) Infrared Satellite Images

 

·         Infrared satellite images are produced by the infrared (heat) energy Earth radiates to space. Since Earth is always radiating heat, infrared images are available day and night.

·         On infrared images, warm land and water surfaces appear dark gray or black. The cold tops of high clouds are white and lower-level clouds, being warmer, are gray. Low clouds and fog are difficult to detect in the infrared when their temperatures are nearly the same as the nearby Earth surfaces.

·         An additional advantage of infrared imagery is that it can be processed to produce enhanced views. The data from the usual infrared pictures are specially treated to emphasize temperature details or structure by assigning contrasting shades of gray or color to narrow temperature ranges. Such imagery, often seen color-coded, appears regularly on television weathercasts and computer displays.

·         The enhanced images make it possible to keep track of land and oceanic surface temperatures. These surface temperatures play major roles in making and modifying weather. The high, cold clouds associated with severe weather are also easily monitored.

·         Enhanced imagery can be interpreted to produce rainfall rate estimates. This information is used in flash flood forecasting.

 

(3) Water Vapor Images

 

·         Solid, liquid and vapor forms of water interact with specific ranges of infrared energy. Specially tuned geostationary weather satellite sensors can detect water vapor in the atmosphere, in addition to clouds.

·         The water vapor sensors aboard weather satellites reveal regions of high atmospheric water vapor concentration in the troposphere between altitudes of 3 and 7 km. These regions, sometimes resembling gigantic swirls or plumes, can be seen to flow within and through broad scale weather patterns.

·         Recent studies suggest that, at any one time, atmospheric water vapor may be found concentrated in several large flowing streams forming the equivalent of "rivers in the sky".

 

(4) Weather Features in Satellite Imagery

 

·         Hurricanes look like pinwheels of clouds. More often than not, the beginnings of hurricanes are detected from satellite views, because they occur over broad expanses of oceans.

·         Large comma-shaped cloud shields give shape and form to mid-latitude low-pressure systems.

·         Clouds from which showers fall can look like grains of sand, especially on visible satellite pictures. Thunderstorms appear as "blobs" or "chains of blobs". Their high tops spread downwind from them as wispy cirrus clouds. They may have neighboring lower clouds appearing as tiny curved "tails" to the southwest. Such "tails" can also be indicators of the possibility of tornadoes.

·         Movements of cloud patterns, detected by viewing sequential satellite images, indicate the circulations of broad-scale weather systems. Wind speeds can be estimated at different levels, and even upper-air jet streams can be identified.

·         Meteorologists use satellite images to determine cloud shapes, heights, and types. Changes in these cloud properties, along with cloud movement, provide valuable information to weather forecasters to determine what is happening and what is likely to happen to the weather in the hours and days ahead.

·         Visible, infrared, and water vapor satellite imagery complement one another. There are weather features that can be clearly seen in one kind of image that are difficult to see in the others.

 

C. GOES SATELLITE INTRODUCTION

 

The National Oceanic and Atmospheric Administration’s (NOAA) operational environmental satellite system is composed of: geostationary operational environmental satellites (GOES) for short-range warning and “now-casting” and polar-orbiting environmental satellites (POES) for longer-term forecasting. Both kinds of satellites are necessary for providing a complete global weather monitoring system. The satellites carry search and rescue instruments, and have helped save the lives of about 10,000 people to date. The satellites are also used to support aviation safety (volcanic ash detection), and maritime/shipping safety (ice monitoring and prediction).

 

(1) History

 

Since the early 1960s, meteorological, hydrological, and oceanographic data from satellites have had a major impact on environmental analysis, weather forecasting, and atmospheric research in the United States and throughout the world. NASA research and development fostered the GOES program within NOAA. Five spin-stabilized satellites were built and launched, introducing a new era of satellite service: NASA's demonstration of two Synchronous Meteorological Satellites (SMS) began with the launch of SMS-1 in May 1974, and NOAA's operation of a GOES series followed with the launch of GOES-1 in October 1975. The Visible and Infrared Spin Scan Radiometer (VISSR) provided imagery from these original SMS and GOES satellites.

GOES significantly advanced our ability to observe weather systems by providing frequent interval visible and infrared imagery of the earth surface, atmospheric moisture, and cloud cover. GOES data soon became a critical part of National Weather Service (NWS) operations by providing unique information about existing and emerging storm systems both day and night. Subsequently, more spectral bands were added to the VISSR, enabling the GOES system to acquire multispectral measurements from which atmospheric temperature and humidity sounding could be derived: the VISSR Atmospheric Sounder (VAS) was introduced on GOES-4 in 1981.

 

(2) GOES Application

 

GOES satellites orbit the earth at the same speed as the earth rotates, thus continually watching over the same area. The geosynchronous plane is about 35,800 km (22,300 miles) above the Earth, high enough to allow the satellites a full-disc view of the Earth. GOES satellites are a mainstay of weather forecasting in the United States. They provide data for severe storm evaluation, information on cloud cover, winds, ocean currents, fog distribution, storm circulation and snow melt, using visual and infrared imagery. The satellites also receive transmissions from free-floating balloons, buoys and remote automatic data collection stations around the world. The weather data gathered by GOES satellites, combined with data from Doppler radars and automated surface observing systems, aids weather forecasters greatly in providing warnings of thunderstorms, winter storms, flash floods, hurricanes, and other severe weather. These warnings help to save lives, preserve property, and benefit commercial interests.

 

D. ARCVIEW – TOOL FOR REMOTE SENSING IMAGE PROCESSING

 

ArcView is part of ArcGIS, a product of ESRI: a scalable system of software for geographic data that serves every organization, from an individual to a globally distributed network of people.

 

GIS is expanding into new applications and user communities to meet the challenge of providing data and services to a geographically literate world. Strong editing, analysis, and modeling, along with cutting-edge data models and management, continue to distinguish the ArcGIS software family as the leading GIS software.

Users can deploy multiple ArcGIS clients (ArcReader, ArcView, ArcEditor, ArcInfo), mobile clients (ArcPad), and ArcGIS servers (ArcSDE and ArcIMS) to meet their needs for scalable GIS solutions.

 

With the ArcView Image Analysis extension, we can perform tasks that range from simply displaying images to performing detailed spectral analysis and detecting temporal change. The extension provides tools to:

(1) Import and incorporate raster imagery into ArcView GIS.

(2) Categorize an image into a number of classes corresponding to land cover types like vegetation.

(3) Evaluate images at different time periods to identify areas of change.

(4) Identify and automatically map a land cover type with a single click.

(5) Find areas of dense and thriving vegetation in an image.

(6) Enhance the appearance of an image by adjusting contrast and brightness or by applying histogram stretches.

(7) Align an image to a map coordinate system for precise area location.
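Item (6) in the list above is essentially a per-pixel transform. The following Python sketch of a linear min-max histogram stretch is illustrative only (it is not the ArcView Image Analysis implementation); the gray values are hypothetical:

```python
def stretch(pixels, out_min=0, out_max=255):
    # Linear min-max stretch: map the occupied gray range [lo, hi]
    # onto the full output range [out_min, out_max].
    lo, hi = min(pixels), max(pixels)
    if hi == lo:                      # flat image: nothing to stretch
        return [out_min] * len(pixels)
    scale = (out_max - out_min) / (hi - lo)
    return [round(out_min + (p - lo) * scale) for p in pixels]

# A low-contrast band that occupies only gray levels 100-140:
print(stretch([100, 110, 120, 130, 140]))  # -> [0, 64, 128, 191, 255]
```

After the stretch, a dull low-contrast band spans the display's full dynamic range, which is what makes subtle features visible.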


DESCRIPTION OF WORK

 

The work involves the following two topics:

 

A. Image processing using ArcView

 

Here are the results of some processing applied to a satellite infrared image, using the ArcView software.

 

Original image

Adjusting the brightness and contrast

Sharpening

Smoothing


 

Image mosaicking

Edge detection

Feature extraction

Image categorization

Figure 1 Image processing

 

Note:   

--Original image is from GOES satellite IR2 channel.

--In feature extraction, seed radius is 10.

--In unsupervised categorization, the image is divided into 4 classes, whose colors are defined as red, green, blue and cyan.
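Several of the operations in Figure 1 (smoothing, edge detection) are neighborhood filters at heart. The pure-Python sketch below uses a toy single-band raster with hypothetical values, and is not how ArcView implements these filters; it shows a 3x3 mean smoother and a simple gradient edge detector:

```python
# Toy single-band raster (hypothetical gray values): a bright square
# on a dark background.
img = [
    [0, 0, 0, 0, 0],
    [0, 9, 9, 9, 0],
    [0, 9, 9, 9, 0],
    [0, 9, 9, 9, 0],
    [0, 0, 0, 0, 0],
]

def smooth(img):
    """3x3 mean filter (interior pixels only; border copied as-is)."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            out[i][j] = sum(img[i + di][j + dj]
                            for di in (-1, 0, 1)
                            for dj in (-1, 0, 1)) / 9
    return out

def edges(img):
    """Magnitude of a simple horizontal + vertical central gradient."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            gx = img[i][j + 1] - img[i][j - 1]
            gy = img[i + 1][j] - img[i - 1][j]
            out[i][j] = abs(gx) + abs(gy)
    return out
```

Smoothing suppresses pixel-to-pixel fluctuation (useful against sensor noise), while the gradient responds only where gray values change, i.e., along the square's border.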


B. Image classification using k-medoid method

 

(1) Theoretic Basis

 

Image classification is the process of partitioning an image into related regions; it is a kind of clustering. The goal of image classification is to analyze remote sensing data to identify and measure regions of interest.

 

Clustering is the process of grouping a set of physical or abstract objects into classes of similar objects. The purpose of clustering is to divide samples into k clusters, striving for a high degree of similarity among elements in the same cluster and a high degree of dissimilarity among elements in different clusters. When we choose “distance” to measure the degree of similarity, a good clustering is one where the sum of distances between objects in the same cluster (intra-cluster distance) is minimized, while the distance between different clusters (inter-cluster distance) is maximized. This objective can be written as:

Min Σ dij, where dij is the distance between object i and object j in the same cluster.

Max Σ Dij, where Dij is the distance between cluster i and cluster j.

The figure below illustrates clustering.

 

Figure 2 Illustration of clustering
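As a toy numeric check of this objective, the short Python sketch below (hand-picked illustrative values, with the distance between cluster means standing in for Dij) computes the intra-cluster and inter-cluster distance sums for two 1-D clusters:

```python
from itertools import combinations

# Two hand-picked 1-D clusters (hypothetical values).
cluster_a = [1.0, 1.2, 0.8]
cluster_b = [5.0, 5.3, 4.9]

def intra(cluster):
    # Sum of pairwise distances dij between objects in one cluster.
    return sum(abs(i - j) for i, j in combinations(cluster, 2))

def inter(c1, c2):
    # One common choice for Dij: the distance between cluster means.
    m1, m2 = sum(c1) / len(c1), sum(c2) / len(c2)
    return abs(m1 - m2)

# A good clustering keeps the first number small and the second large.
print(intra(cluster_a) + intra(cluster_b))
print(inter(cluster_a, cluster_b))
```

Here the total intra-cluster distance is about 1.6 while the inter-cluster distance exceeds 4, so this grouping satisfies the objective.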

 

There are two kinds of clustering methods, partitioning and hierarchical methods.

 

Partitioning methods include:

 

·         K-means method (n objects to k clusters)

 

Cluster similarity is measured with respect to the mean value of the objects in a cluster (the cluster's center of gravity). The whole process is:

 

--Select k points at random (call them means)

--Assign each object to nearest mean

--Compute new mean for each cluster

--Repeat until criterion function converges

 

 

 

 

 

The criterion function is:

E = Σi=1..k Σp∈Ci |p − mi|², where mi is the mean of cluster Ci.

We try to minimize the squared error criterion: Min(E).

 

This method is sensitive to outliers.
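The four steps above can be sketched in Python for 1-D data (toy values; a real implementation would work on multi-band pixel vectors):

```python
import random

def kmeans_1d(points, k, iters=100, seed=0):
    random.seed(seed)
    # 1. Select k points at random (call them means).
    means = random.sample(points, k)
    clusters = []
    for _ in range(iters):
        # 2. Assign each object to its nearest mean.
        clusters = [[] for _ in range(k)]
        for p in points:
            clusters[min(range(k), key=lambda i: abs(p - means[i]))].append(p)
        # 3. Compute the new mean of each (non-empty) cluster.
        new_means = [sum(c) / len(c) if c else means[i]
                     for i, c in enumerate(clusters)]
        # 4. Repeat until the criterion function converges.
        if new_means == means:
            break
        means = new_means
    # Squared-error criterion E = sum over clusters of |p - mi|^2.
    E = sum((p - means[i]) ** 2 for i, c in enumerate(clusters) for p in c)
    return clusters, means, E

# Two tight 1-D "blobs" (toy data, illustrative only).
data = [1.0, 1.1, 0.9, 5.0, 5.1, 4.9]
clusters, means, E = kmeans_1d(data, k=2)
print(sorted(sorted(c) for c in clusters))  # -> [[0.9, 1.0, 1.1], [4.9, 5.0, 5.1]]
```

On this well-separated data any choice of two distinct initial means converges to the same partition; the sensitivity to outliers mentioned above appears when a point far from both blobs drags a mean toward it.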

 

·         K-medoids method

 

The K-medoids method follows the same process as the K-means method, except that it takes a medoid (the most centrally located object in a cluster) instead of the mean.

 

The K-medoids method is somewhat more complex than K-means, but it overcomes some of K-means' problems. The most significant advantage is improved noise handling due to the use of medoids instead of centroids: outlying data points tend to form their own clusters rather than distorting the centers of the larger ones.
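A minimal 1-D Python sketch of this idea follows (toy data; a full PAM-style K-medoids also considers swaps with non-medoid objects more systematically). The only change from plain K-means is the update step, which picks an actual object instead of computing a mean:

```python
import random

def kmedoids_1d(points, k, iters=100, seed=0):
    random.seed(seed)
    # Select k objects at random as the initial medoids.
    medoids = random.sample(points, k)
    clusters = []
    for _ in range(iters):
        # Assign each object to its nearest medoid.
        clusters = [[] for _ in range(k)]
        for p in points:
            clusters[min(range(k), key=lambda i: abs(p - medoids[i]))].append(p)
        # New medoid: the most centrally located OBJECT of each cluster
        # (minimum total distance to its members) -- not the mean.
        new_medoids = [min(c, key=lambda m: sum(abs(m - q) for q in c))
                       if c else medoids[i]
                       for i, c in enumerate(clusters)]
        if new_medoids == medoids:
            break
        medoids = new_medoids
    return clusters, medoids

# Toy data with one outlier (25.0): a mean would be dragged toward it,
# but every medoid remains an actual data point.
data = [1.0, 1.1, 0.9, 5.0, 5.1, 25.0]
clusters, medoids = kmedoids_1d(data, k=2)
```

Depending on initialization, the outlier either joins the nearer blob without moving its medoid off a real observation, or ends up in a cluster of its own, which is exactly the noise behavior described above.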

 

The figures below illustrate the K-means and K-medoids process.

 

Select k points at random as the initial seeds

Assign each object to the nearest center, and compute the new center of each cluster

Repeat until the criterion function converges; these are the final centers

Figure 3 Illustration of the K-means and K-medoids process

 

(2) Image classification using K-medoids method

 

Spectral pattern recognition categorizes each image pixel on the basis of the set of spectral radiance measurements obtained in the various wavelength bands for that pixel. Spatial pattern recognition involves the categorization of image pixels on the basis of their spatial relationship with the pixels surrounding them.

 

Because of noise, the target image often shows obvious fluctuations in the gray values of adjacent pixels, which cannot be recognized correctly by spectral classification alone. An important decision in image classification is therefore how to strike a balance between spectral and spatial recognition. A weighted combination of contextual and non-contextual data can then provide the best pollution contours, particularly in the presence of noise.

 

Let’s suppose that k=1 in the K-medoids method, which means that there is only one center or source in our image, as shown in figure 4(a). We consider the difference in distance from the center point to a pixel and to a potential representative pixel, |di-dj|, as the contextual part of the formulation, and the difference between gray values, |fi-fj|, as the non-contextual part of the formulation. The combination of spectral and spatial data can be accomplished through weights. These weights range from 0 to 1, and sum to unity.

 

The cost of assigning a pixel i to representative pixel j is w|fi-fj| + (1-w)|di-dj|, where 0≤w≤1. We can adjust the weight w to obtain the best classification result, one that largely eliminates the gray-value fluctuation caused by noise.
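This cost can be sketched directly in Python. The 6x6 gray values, the center location, and the two representative pixels below are illustrative assumptions loosely modeled on Figure 4(a), not the report's actual data:

```python
import math

# Hypothetical 6x6 gray-value image with a bright source near the
# center and one noisy pixel (value 5).
image = [
    [0, 0, 0, 0, 0, 1],
    [0, 2, 2, 2, 1, 0],
    [1, 1, 4, 4, 2, 0],
    [0, 2, 5, 3, 2, 0],
    [0, 2, 1, 2, 1, 1],
    [0, 0, 0, 1, 0, 0],
]
center = (2, 2)              # assumed location of the single source (k=1)
reps = [(0, 0), (2, 2)]      # assumed representatives: background, source

def cost(i, j, rep, w):
    # w*|fi - fj| is the non-contextual (gray-value) part;
    # (1-w)*|di - dj| is the contextual (distance-from-center) part.
    fi, fj = image[i][j], image[rep[0]][rep[1]]
    di, dj = math.dist((i, j), center), math.dist(rep, center)
    return w * abs(fi - fj) + (1 - w) * abs(di - dj)

def classify(w):
    # Assign every pixel to the representative with the lowest cost.
    return [[min(range(len(reps)), key=lambda r: cost(i, j, reps[r], w))
             for j in range(6)] for i in range(6)]

purely_spectral = classify(w=1.0)   # gray values only (position ignored)
balanced = classify(w=0.5)          # blends gray value and position
```

The weight w trades off gray-value similarity against distance-from-center similarity; sweeping w between 0 and 1 reproduces the spectrum from purely spatial to purely spectral classification.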

 

(a) Raw image:

0 0 0 0 0 1
0 2 2 2 1 0
1 1 4 4 2 0
0 2 5 3 2 0
0 2 1 2 1 1
0 0 0 1 0 0

(b) Spectral pattern-recognition (w=1):

0 0 0 0 0 1
0 2 2 2 1 0
1 1 4 4 2 0
0 2 5 3 2 0
0 2 1 2 1 1
0 0 0 1 0 0

(c) Spectral and spatial pattern-recognition (0≤w≤0.5):

0 0 0 0 0 0
0 1 1 1 1 0
0 1 2 2 1 0
0 1 2 2 1 0
0 1 1 1 1 0
0 0 0 0 0 0

Figure 4 Illustration of spectral and spatial pattern-recognition

 

For programming convenience, here we calculate only the difference in distance from the center point to each pixel as the contextual part of the formulation, and the difference in gray value between the center point and each pixel as the non-contextual part.

 

Figure 5 is the result of this method.

 

Original image

Class 1, w=0.5

Class 1, w=1

Class 2, w=0.5

Class 2, w=1

Class 3, w=0.5

Class 3, w=1

Figure 5 Classification using K-medoids method

 

By contrasting the two groups of results (w=0.5 and w=1), we can see that the K-medoids method is effective for combined spectral and spatial recognition.

 

REFERENCES

 

[1] Chan, Y. (2001). Location Theory and Decision Analysis. ITP/South-Western.

 

[2] Chan, Y. Location, transport and land-use: Modeling spatial-temporal information. Heidelberg, Germany: Springer-Verlag.

 

[3] Craig M. Wittenbrink, Glen Langdon, Jr., and Gabriel Fernandez (1999). Feature Extraction of Clouds from GOES Satellite Data for Integrated Model Measurement Visualization. Working paper.

 

[4] Raymond T. Ng and Jiawei Han. Efficient and Effective Clustering Methods for Spatial Data Mining. Proceedings of the 20th VLDB Conference, Santiago, Chile, 1994.

 

[5] Osmar R. Zaiane, Andrew Foss, Chi-Hoon Lee, and Weinan Wang. On Data Clustering Analysis: Scalability, Constraints and Validation. Working paper.

 

[6] Gerald J. Dittberner (2001), NOAA’s GOES Satellite System – Status and Plans

 

[7] Weather Satellites Teacher's Guide. Environment Canada, 2001. ISBN 0-662-31474-3; Cat. No. En56-172/2001E-IN.

 

[8] ArcView user’s manual

 

[9] Websites:

     http://goes2.gsfc.nasa.gov

     http://www.osd.noaa.gov/sats/goes.htm

     http://rsd.gsfc.nasa.gov/goes/

     http://gtielectronics.com