Lab Exercises

CE671A: Lab-2
Exploring Google Earth
Submission Date- 05 Aug 2014 (Tuesday by 5:00 PM)
CE-671 Exercise 1
Session 2012-13
Objective: Exploring Geoinformatics using Google Earth
You will learn:
a. Google Earth and its use in daily life
b. Effect of spatial resolution of satellite data
c. Image interpretation
d. Effect of different radiometric characteristics of images
e. Effect of error in geometric correction
1. Download the Google Earth installation programme from the TAs' ftp or from the internet.
2. Install Google Earth on your machine.
3. Play with Google Earth and find the following:
a. Save an image of the capital of your state.
b. Choose one historical place of your choice in the state of Uttar Pradesh. Create a polygon
around this and save this polygon as .kmz file. Submit this file along with your report so
we can visit this site.
c. Save an image of Kanpur with IIT Kanpur included.
d. Determine the longitude, latitude, and elevation of Kanpur Central Railway Station, the IIT Kanpur Gate, and the IIT Kanpur Library.
e. Follow the boundary between any two images (where they are joined; you can see this at low resolution as the image appears to be made of patches). Do you see any anomaly? Comment on and list the anomalies that you observe.
f. Click on the supplied AOI files. For AOI 1 and 5, comment on the possible reason for the change of colour of the images around the boundaries.
g. Interpret the feature present within AOI 2, 3, 4, and 6. Write what the feature within each AOI could be and why you think so.
h. Interpret the feature at AOI 7 and comment on the possible cause of the anomaly, if any.
i. Many applications of Google Earth are possible and are being attempted. Search the
internet and list 5 applications where Google Earth is being used.
j. Now, think hard and come up with one INNOVATIVE application that you think is possible with Google Earth and would be useful.
k. What are the other image servers similar to Google Earth? List these.
4. Make a report; type it in MS Word; wherever asked, write the answer and paste the image, if applicable.
CE671A: Lab-2
Familiarizing with IRS 1C-L3 data with ILWIS and Matlab
Submission Date- 11 Aug 2014 (Monday by 5:00 PM)
• Objective:
o To import IRS 1C L-3 data in ILWIS and learn about all the tools (facilities) that are available in the Viewer.
o To import images in ‘tiff’ format in Matlab and learn to visualize the color composites.
• You will learn:
a. Importing satellite image for processing
b. Understanding the header of an image
c. Display tools and their use in ILWIS and Matlab
[A]- Working with ILWIS
1. Open the ftp site (as given on the white-board) and download ILWIS and the patch file from the ftp; also download the data files from the ftp site onto your machine. Alternatively, download the ILWIS setup and files from http://www.ilwis.org/open_source_gis_ilwis_download.htm
2. Install ILWIS, put the patch file in the same folder where ILWIS is installed, and then double click on the patch file. Then start ILWIS.
The first step required is to convert or bring the satellite data into a workable format.
3. Click on Import -> General Raster to import the IRS1C–L3 data in the required format. (For importing the IRS1C-L3 data you can either use “Import -> General Raster” from the FILE option or the “Import/Export” option of the Operation Tree on the left side.) Browse to where you downloaded the data from the ftp. Within PRODUCT1 there is a file Imagery.L-3: select this. This is the file which contains the actual data. Give an appropriate directory and file name where you would like to import it. Press OK.
4. A list of files will appear on the right side (Map List) of the ILWIS window, under the name you gave.
5. Make a note of the file parameters shown during the import process, e.g. bands, size, etc. These will be required in your report. The data will be imported into the file directed by you.
6. Visualization: At the top of the Map List you can see 5 icons, the first of which is “Open as Color Composite”; use this to view the data. You can now see the image in a new window called the Map Window. To view only a single band at a time, just double click on the single imagery file in the map list. The single band image will be shown in the Map Window.
7. Exporting the Image: In the ‘file’ menu select the ‘Export..’ option. An ‘Export’ window will
open. Explore this window for various options. What features do you believe are missing in this
window? Try exporting one of the bands in ‘tiff’ format.
8. At home, read about the ‘Geotiff’ format and observe the difference in your exported image.
[B] Working with Matlab
In this section you will be reading and viewing the images in Matlab. In general, the simplest way to work is from the command window; however, this is not recommended, as it becomes tedious when large and complex operations are involved. Hence we will be working with scripts and functions. For this exercise you will write a script to read and view the color composite of a LISS-IV IRS image. The images are provided in ‘tiff’ format (exported from ILWIS) for use.
1. Open Matlab. In the Home tab, select the first option, ‘New Script’. You can dock this editor by selecting the ‘dock’ option that appears after clicking the down arrow before the ‘x’, i.e. the close button. Select the ‘Editor’ tab, if you are not already there, and save your script as rollno_lab2.m.
2. Now think how you can read the image in Matlab. Try dragging and dropping the image into the command window (however, this is not the sophisticated way). In the command window type ‘help imread’. Use this function to read the images and save the three bands into three variables. To run the script you can select the ‘run’ option in the ‘Editor’ tab.
3. Once the image is read, it should show in the workspace with the variable name you saved it in. What is the size of the image that you see in the workspace? Why does this variable have 3 dimensions?
4. Select the first variable in the workspace. Now click on the ‘Plots tab’. Explore the given options
and differentiate between them. You can use the help command in command window for their
details, for e.g. type ‘help imagesc’ in command window. Similarly differentiate between the
other plotting commands. Now write the plotting commands in your script for displaying the
image. To use multiple windows use the command ‘figure’.
5. After displaying the image in the figure window, explore the data cursor tool. Note the anomaly in the image and define this anomaly. Use the data cursor to identify the RGB of this area, and note the RGB of other areas. What do you understand from this?
6. You can already note that each variable in the workspace consists of three identical matrices. Now, to generate a color composite, we need one of these matrices from each of the images. To simplify, let us save the first layer of each of the variables into a new variable CC, as below:
CC(:,:,1) = Im1(:,:,1);
CC(:,:,2) = Im2(:,:,1);
CC(:,:,3) = Im3(:,:,1);
imshow(CC)
7. Similarly, generate one more color composite as part of your submission. In this color composite identify the different classes of features (water, buildings, urban, river, etc.) and mark them (you can use snapshots or encircle these areas in Paint or any other tool). Find the IITK airstrip. To export an image from the figure window select ‘File’ then ‘Save As’.
8. Your Matlab code should have the following three sections:
a. Reading Images
b. Displaying Images
c. Generating Color Composites
9. Your submission should have:
a. The report with images and explanations.
b. Matlab code.
Note: Keep the images and your Matlab codes with you, as they may be required/ used in further lab exercises.
CE671A: Lab- 3
Statistical Analysis and Image Cropping and Masking
Submission Date- 19 Aug 2014 (Tuesday by 5:00 PM)

Objective:
o Statistical Analysis of an Image
o Cropping and Masking an Image
o Measuring on an Image

You will learn:
a. Statistical parameters for Image Analysis
b. Subsetting an image, highlighting a region of interest.
c. Effect of contrast variation in image display.
[A]- Working with ILWIS
1. Stretching
1. Go to Image Processing and then select the Stretch option; stretch band 3 from 0 to 255.
2. Observe the changes between the stretched image and the normal image. Do a statistical comparison of both and report the reasons for the change in the two images.
3. Check the histograms in both cases and compare them.
2. Statistical Analysis
1. Go to Statistics -> MapList -> Variance-Covariance and select the map. Observe the Variance-Covariance matrix. What does this matrix indicate? Is this matrix a symmetric matrix? Is this matrix symmetric in all cases? If yes/no, then why?
2. In a similar way, observe the correlation matrix. Which combination of 2 bands gives you maximum information, and why?
3. Are these matrices bound to change when computed from the stretched images? Why?
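The variance-covariance and correlation computations above can be sketched outside ILWIS. The following Python fragment is illustrative only; the band values are hypothetical 1-D samples standing in for image bands. It also shows why the matrix is symmetric by construction.

```python
# Variance-covariance and correlation matrices from band samples.
# cov(a, b) == cov(b, a), so the matrix is symmetric for any data.

def mean(v):
    return sum(v) / len(v)

def cov(a, b):
    ma, mb = mean(a), mean(b)
    return sum((x - ma) * (y - mb) for x, y in zip(a, b)) / len(a)

def cov_matrix(bands):
    return [[cov(a, b) for b in bands] for a in bands]

def corr_matrix(bands):
    c = cov_matrix(bands)
    n = len(bands)
    return [[c[i][j] / (c[i][i] ** 0.5 * c[j][j] ** 0.5) for j in range(n)]
            for i in range(n)]

band2 = [10, 20, 30, 40]      # hypothetical DN samples
band3 = [12, 19, 33, 41]
band4 = [40, 30, 20, 10]      # varies opposite to band2

C = cov_matrix([band2, band3, band4])
R = corr_matrix([band2, band3, band4])
```

Note that the correlation matrix additionally has ones on its diagonal, and a strongly negative off-diagonal entry signals redundant (inverted) information between two bands.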
[B] Working with Matlab
In this section you will be creating a submap by cropping a section from an image. You will learn to
highlight an object or area by masking it. You will also learn to generate a mask for a specific region of
interest. You will be using the ‘tiff’ images for the LISS-IV scanner given in previous exercise.
1. In the image (the color composite CC from the previous exercise), identify the IIT Kanpur campus area. Now crop a section of about 1500x1500 pixels such that the campus area lies near the middle. To crop from an image you can use the command ‘imcrop’. Save the cropped section as a new variable Imcrp. This command can be used in two ways:
a. You can use the interactive interface.
b. You can specify the pixel coordinates of the bounding box for cropping the image.
c. Describe the applications of both.
2. Now you will generate a mask for the IIT Kanpur campus area. Use the ‘roi’ tool in Matlab to create a mask. The mask is generated in a ‘logical’ data type; change it to match the image.
3. Apply the mask to Imcrp to achieve the following:
a. Remove everything except the IIT Kanpur Campus area.
b. Highlight the IIT Kanpur Campus area in Imcrp.
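The two masking effects can be sketched on a toy example. This Python fragment uses a hypothetical 3x3 "image" and mask (the lab itself uses a logical ROI mask in Matlab):

```python
# Masking a toy image: (a) keep only the ROI, (b) highlight the ROI.

image = [[10, 20, 30],
         [40, 50, 60],
         [70, 80, 90]]

# Hypothetical mask: 1 inside the region of interest, 0 outside
mask = [[0, 0, 0],
        [0, 1, 1],
        [0, 1, 1]]

# (a) Remove everything except the ROI: multiply pixel-wise by the mask
removed = [[px * m for px, m in zip(irow, mrow)]
           for irow, mrow in zip(image, mask)]

# (b) Highlight the ROI: brighten masked pixels, keep the rest unchanged
highlighted = [[min(px + 60, 255) if m else px for px, m in zip(irow, mrow)]
               for irow, mrow in zip(image, mask)]
```

In Matlab the same two effects follow from element-wise multiplication with the (type-converted) mask, or from adding an offset only where the mask is true.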
4. Now use the ‘imtool’ command with the color composite. Explore the tools in the menu
bar.
5. Measure the length of the airstrip in meters, given that the resolution of the image is 5.8 m.
Your submission should have:
a. The report with images and explanations.
b. Matlab code.
c. Detailed answers to the questions in the Appendix.
CE671A: Lab- 4
Understanding Linear Stretching and Color Representations.
Submission Date- 26 Aug 2014 (Tuesday by 5:00 PM)
Objective: To understand color composites, color representation, LUT’s.
You will learn:
a. How colors are represented in an image
b. Role of DNs in different bands and color composites
c. Linear stretching
d. Understanding of the Look Up Table
1. Use the Imagery L-4 which was used in last lab.
2. Display the image with RGB as layers 321, i.e. the standard FCC in ILWIS.
3. We hope that you have all observed how the image is represented when it is shown in the image window. Just try it once again: display the image with RGB as layers 321; beside each color band you can see a range of values. Each of the bands is stretched between these values. The stretching used is linear stretching, which is given by the following function:
I_N = ((I − Min) / (Max − Min)) × (newMax − newMin) + newMin

where I_N is the new value of the pixel, I is the value to be stretched, newMax and newMin define the new range (0-255), and Max and Min are the range of the current values/image.
4. Implement the above method for linear stretching in matlab using the other image provided. Please
prepare the following –
a) Matlab function ‘linstretch’
b) Demo script for the function ‘linstretch’
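As a language-neutral sketch of the stretching formula (the lab asks for a Matlab function; this Python version operates on a plain list of hypothetical DNs):

```python
def linstretch(values, new_min=0, new_max=255):
    """Linear stretch: I_N = (I - Min) / (Max - Min) * (newMax - newMin) + newMin."""
    lo, hi = min(values), max(values)
    scale = (new_max - new_min) / (hi - lo)
    return [(v - lo) * scale + new_min for v in values]

band = [30, 40, 50, 60, 70]   # hypothetical DNs occupying only 30-70
stretched = linstretch(band)  # now occupies the full 0-255 range
```

The minimum of the band maps to 0 and the maximum to 255; intermediate values are scaled proportionally, which is exactly what ILWIS does between the displayed range values.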
5. Now, select any five pixels of this image in different areas (say the top of the workshop roof, a water body, vegetation, a road, a concrete floor). Go to these pixels by zooming in till the image gets pixelated. Make a table like the following for these pixels in MS Word in landscape mode, and note down the values as required for each of the pixels. “Color from the Image” is a screenshot of the color of the pixel; “RGB values from Image” is the RGB value given by clicking on the pixel; use these values in the blender which you can find at http://www.rapidtables.com/web/color/RGB_Color.htm
6. The table should have the following columns:
(1) Pixel
(2) Line
(3) Type of land use/cover
(4) Color from the Image
(5) RGB values from the Image (R, G, B)
(6) Color from the Blender
(7) RGB values calculated from stretching (Modified) (R, G, B)
(8) Color for the new RGB values from the blender
7. Having done this, for each of the pixels calculate the RGB values using the formula for stretching and note the new RGB values; now use these values in the blender. This is the “Look Up Table” (LUT) value, which is what we are actually seeing.
8. The colors in columns 4 and 6 will be seen as different, whereas the colors from columns 4 and 8 will be seen as the same. The idea here is to understand why ILWIS is using a different representation than that of true RGB; here you will have to rack your brain and understand “Look Up Tables” (LUT). The color representation by ILWIS is done using Look Up Tables.
9. Understand the concept of the “LUT”, and report in short the use of LUTs.
Objective: To understand the image enhancement techniques and visualize their effect.
You will learn: Image Enhancement Techniques and their effect
1. Image Enhancement
a. Histogram Equalization
• Use the images given in Lab4 (“L3-NG44I02-099-053-14oct08-BANDx.tif”, x = 2, 3, 4)
• Perform histogram equalization using the ‘histeq’ function in Matlab.
• Display the original and equalized image color composites using ‘subplot’.
• What is the effect of varying the discrete gray levels? Explain and show with the help of figures (different color composites) from the Matlab code. Hint: for an 8-bit image the number of discrete gray levels is 256.
• Compute the mean, standard deviation, and variance of the bands. Show the histograms also.
• Do this for each of the following transformations also.
Tip: Save them in a new variable. Export them in .xls, .csv, or any other format of your choice. See ‘xlswrite’, ‘csvwrite’, ‘dlmwrite’. (Not compulsory; try it at the end.)
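For intuition about what ‘histeq’ computes, here is a simplified Python sketch of the standard CDF-based mapping (Matlab's histeq additionally matches a target histogram, so its output differs in detail; the DN values below are hypothetical):

```python
# Histogram equalization for one 8-bit band: map each gray level
# through the normalized cumulative histogram (CDF).

def hist_equalize(pixels, levels=256):
    n = len(pixels)
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    cdf = []
    total = 0
    for h in hist:
        total += h
        cdf.append(total)
    # Scale the CDF to the full output range [0, levels-1]
    return [round((levels - 1) * cdf[p] / n) for p in pixels]

band = [52, 55, 61, 59, 79, 61, 76, 61]   # hypothetical DNs
eq = hist_equalize(band)                   # spread over 0-255
```

The narrow input range is spread across the full gray scale, which is why an equalized composite shows more contrast than the original.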
b. Inverse Transformation
• If L3 represents the original image, use the following equation for the inverse transformation:
L3_inv = 255 − L3
• Display the original and transformed image color composites using ‘subplot’.
c. Log Transformation
• Use the following equation:
L3_logt = C · log(1 + L3)
where C is a constant that should be selected so that the result does not exceed the maximum value of 255.
• Apply this transformation to the images and observe the results by varying the value of C (visualize the color composite).
• Hint: when performing a logarithmic transformation, it is often desirable to bring the result back to valid image data. Use ‘mat2gray’ and ‘im2uint8’.
• How does the value of C affect the transformation? What should be the maximum value of C?
d. Power Law Transformation
• Use the following equation:
L3_gamma = C · (L3)^γ
where C and γ are constants. Apply this transformation to the images and observe the results by varying the value of C.
• Apply this gamma transformation to:
o L3 with γ < 1 and γ > 1
o the inverse transformed image from section ‘b’ with γ < 1 and γ > 1
• How do the values of C and γ affect the transformation?
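The three point transformations above can be sketched in a few lines. In this Python fragment the 8-bit DNs are hypothetical, and C for the log transform is chosen at its largest admissible value, 255 / log(256):

```python
import math

# Point transformations on a single DN band (hypothetical 8-bit data).

def inverse_t(band):
    return [255 - v for v in band]              # b. inverse: 255 - DN

def log_t(band, c):
    return [c * math.log(1 + v) for v in band]  # c. log: C * log(1 + DN)

def gamma_t(band, c, gamma):
    return [c * (v ** gamma) for v in band]     # d. power law: C * DN^gamma

band = [0, 63, 127, 255]
c_max = 255 / math.log(256)        # largest C keeping log output <= 255
inv = inverse_t(band)
logt = log_t(band, c_max)
# gamma < 1 brightens mid-tones (band normalized to [0, 1] first)
bright = gamma_t([v / 255 for v in band], 255, 0.5)
```

A larger C scales the whole output up, so C above c_max would push bright pixels past 255; gamma below 1 lifts the mid-tones while gamma above 1 darkens them.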
e. ‘imadjust’
• Check the imadjust function and use it with the given data.
• Is the (gamma) transformation different now?
• How is this function different from the one you developed?
2. Gray Level Slicing
• Convert the RGB image to gray by using ‘rgb2gray’.
• Perform gray level slicing with respect to the following classes, filling in the Band-2 intensity range for each:

Class      | Band-2
River      |
Vegetation |
Urban      |
Sand       |
• Visualize the image and select the ‘data cursor’ tool. Click on the pixels of a particular class and decide the intensity range for that particular class. Select some logical values and give reasons for the values you select.
• View the pixel values of the first band for the different classes.
• Hint: to set a specific range of gray values in an image to a constant value:
L3glsn1(L3gls1>=75 & L3gls1<=90) = 100; % Vegetation
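Gray-level slicing, as in the Matlab hint above, amounts to a range test per pixel. Here is a Python sketch on a hypothetical Band-2 sample; the 75-90 range mirrors the hint and is not a prescribed class range:

```python
# Gray-level slicing: pixels whose DN falls inside a class's intensity
# range are replaced by a constant label value; others are untouched.

def slice_class(band, lo, hi, label):
    return [label if lo <= v <= hi else v for v in band]

band2 = [80, 12, 150, 85, 200, 90]            # hypothetical DNs
vegetation = slice_class(band2, 75, 90, 100)  # mirrors the Matlab hint
```

The Matlab hint does the same thing in one vectorized statement via logical indexing.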
Guidelines:
• Before mathematically manipulating the images, do not forget to convert them to double.
• In the report provide the following:
o A short description of each of the above-mentioned image transformations
o Figures of the original images and transformed images
o Your published code, attached as an Appendix in the report.
CE671A: Lab- 4
Understanding Image Filtering and Edge Detection
Submission Date- 16 Sep 2014 (Tuesday by 5:00 PM)
Objectives: To understand filtering in images
You will learn:
1. Low & High Pass filters, their effect on images and working
2. Edge Detection in Images
Data Used: IRS P6, LISS IV Imagery
Guidelines
1. Import the Image in Matlab as done in previous exercises.
2. Construct a 3x3 filter f which performs an averaging operation:

f = (1/9) * [1 1 1; 1 1 1; 1 1 1]
3. Each pixel in the resulting image after filtering, has a value equal to the sum of the pixel
values of the original image I multiplied by the filter mask f when the mask is centered on
that pixel. Apply this filter to the histogram equalized image and the linear stretched image
using the predefined function ‘imfilter’. Zoom into some area in the original and resulting
image to observe the changes. Experiment with different options available in ‘fspecial’.
a. Which kind of a structure element in the filter would you consider for filtering
astronomical images of a star?
b. How would you highlight the boundaries of peppers in the example given in the
Matlab help document for ‘imfilter’?
Explore the options in ‘imfilter’. Search for ‘imfilter’ in the documentation search bar at the top right and read about it. How would varying the boundary options affect the resultant image? Explain with an example. What happens if we reverse the order of the parameters in the function? In your report explain the detailed operation of ‘imfilter’.
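The averaging operation that ‘imfilter’ performs at each pixel can be sketched at a single location. This Python fragment uses a toy 3x3 image with a hypothetical bright spike and computes the mask-weighted sum at the centre pixel:

```python
# Apply a 3x3 mask at one pixel: sum of image values times mask weights,
# with the mask centred on that pixel (borders ignored in this toy case).

def filter_at(image, r, c, mask):
    acc = 0.0
    for i in range(3):
        for j in range(3):
            acc += image[r - 1 + i][c - 1 + j] * mask[i][j]
    return acc

f = [[1 / 9] * 3 for _ in range(3)]         # averaging mask
img = [[10, 10, 10],
       [10, 100, 10],                        # bright noise spike at centre
       [10, 10, 10]]
smoothed_centre = filter_at(img, 1, 1, f)    # spike pulled toward neighbours
```

The 100-valued spike is averaged with its eight neighbours of 10, illustrating why a low-pass filter suppresses isolated noise while blurring edges.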
4. Use the Histogram Equalized Image (stretched in full range [0 255]) for this section.
Generate and apply the following
a. Horizontal Low Pass Filter (summation along x)
b. Horizontal High Pass Filter (gradient along x)
c. Vertical Low Pass Filter (summation along y)
d. Vertical High Pass Filter (gradient along y)
Note: Generate these as just 2-element filters that perform the summation and gradient operations in the sequence given above. Observe the results in the individual bands also. To simplify the Matlab coding, make use of functions.
Delineate your results. Compare the input image and the output generated by the various filters in terms of success in edge detection, effect of noise, effect of edge direction on the filters, and effect of the filter on low-frequency areas, and report it.
5. Explore the various filters using ‘fspecial’ and apply them to the given image. Delineate your
results. Compare the input image and the output generated.
6. Generate the Edge Strength and the Edge Direction images:

Edge Strength = sqrt( (∇x f)^2 + (∇y f)^2 ),  Edge Direction = tan^(-1)( (∇y f) / (∇x f) )

where ∇x f = ∂f/∂x and ∇y f = ∂f/∂y.
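The edge strength and direction formulas can be sketched at one pixel using simple forward differences (toy 2x2 image; the lab's 2-element gradient filters do the same across the whole band):

```python
import math

# Edge strength and direction at one pixel from forward differences.

def gradients(image, r, c):
    gx = image[r][c + 1] - image[r][c]   # horizontal difference (d/dx)
    gy = image[r + 1][c] - image[r][c]   # vertical difference (d/dy)
    return gx, gy

img = [[10, 50],
       [10, 50]]                          # vertical edge: change along x only

gx, gy = gradients(img, 0, 0)
strength = math.hypot(gx, gy)             # sqrt(gx^2 + gy^2)
direction = math.degrees(math.atan2(gy, gx))
```

For this vertical edge the gradient points purely along x, so the direction is 0 degrees and the strength equals the brightness step.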
7. Explore the following predefined functions in Matlab and apply them on the image and
compare your previous results.
a. ‘edge’
b. ‘imgradient’
c. ‘imgradientxy’
8. In your report, tabulate the results in the following manner (prepare separately as an appendix to your report, in landscape layout):

S.No. | Filter | Input Image | Output Image | Response | Attributes
CE671A: Lab-07
Georeferencing and Image to Image Registration with ILWIS
Submission Date- 7th Oct 2014 (Tuesday by 5:00 PM)
Objective: Georeferencing the satellite data
You will learn:
a. To establish correspondence between the image coordinates and the Earth coordinates
(Georeferencing)
b. Connecting satellite image to a Georeferenced map (Image to Image Registration)
1. Introduction
Georeferencing is a process where we establish a mathematical model between the Earth coordinates
and the image coordinates. The mathematical model could be a least squares model or a model based
on the sensor parameters of the remote sensing satellite.
Image to Image registration is a process in which a mathematical model is established between the
coordinates of an ordinary image and a georeferenced image.
When we fit a model to the existing data, each of the data points contributes some error to the model. The Root Mean Square Error (RMSE) is calculated by summing the squares of these errors, taking their average, and finally taking the square root. For two-dimensional spatial data, the RMSE is calculated by summing the squares of the errors in both directions, x and y, then taking the mean, and then the square root.
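The two-dimensional RMSE described above can be written directly. A Python sketch, with hypothetical tie-point residuals in pixel units:

```python
# 2-D RMSE: square the x and y residuals at each tie point,
# average over all points, then take the square root.

def rmse_2d(residuals):
    # residuals: list of (ex, ey) errors at each tie point
    mean_sq = sum(ex * ex + ey * ey for ex, ey in residuals) / len(residuals)
    return mean_sq ** 0.5

errors = [(0.3, -0.4), (0.0, 0.5), (-0.3, 0.4)]   # hypothetical pixel errors
sigma = rmse_2d(errors)
```

This is the quantity ILWIS reports as "sigma" while you add tie points; the lab's target is to keep it below 0.5 pixels.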
2. Importing the map: Select File -> Import -> Via Geogateway. Select the map downloaded from
the ftp (63B7). Select “convert to ILWIS format” and “show”.
A new window titled Foreign Collection "63b7" will appear on the screen
To visualize the imported file, double click on the Map List named 63b7, and click on the first icon
on the left.
3. Create a maplist of the map imported just now. Click on File > Create> Maplist
Select the files 63b7__1, 63b7__2 and 63b7__3 one by one and click the ">" button. You will see that the files also appear on the right side. Give the name of the maplist as map. A new map list named map will be created in ILWIS.
4. Georeferencing the map: Click File > Create > Georeference
Put the name of the georeference as map_geo and select the option "Georef Tiepoints". Select the
coordinate system as "LatLonWGS84". Select the map list map which you just created a few
minutes ago.
As expected, since no tie points have been selected yet, the window shows “Not enough points”.
On the toolbar there is an option named “Transformation”; do note the transformation used.
We will now begin adding points to the model to build a solution. We suggest you add points from the corners first and then the internal points.
Select the tool named “Normal” to add the points with known coordinates.
Similarly add the other tie-points (corner points for which the coordinates are known). After adding three points, as you select the fourth point it will show you the coordinates automatically, as the number of points required for the “Affine” transformation is now sufficient. This gives you the georeferenced map.
5. Image to Image registration: Import “imagery.L4” as usual. To georeference the image follow
the steps:
a. File > New > Create Georeference.
b. Name the file as imgeoref. Select the option Georef Tiepoints and select the coordinate
system as LatLonWGS84.
c. Further, open the already georeferenced map from the ILWIS Main Window.
d. Focus on the shoe shaped object in the image and the map.
e. Zoom on the shoe shaped object in both the windows.
f. Now click the arrow, and select one point on one of the corners of the shoe shaped object
on the FCC. Click on the corresponding point on the Georeferenced map.
6. A model is thus formed when we select more than three points where the image coordinates are
associated with the earth coordinates. The satellite image is thus georeferenced. The user can see
the geographic coordinates being displayed at the bottom of the window.
7. Do have a look at the item named “sigma” in the toolbar; it shows you the error. Select 15-16 points and aim for the error to be a minimum (less than 0.5 pixels).
8. If you are unaware of the “Affine” transformation, Google it to find out how it works. Give an example of the same in your report using an affine transformation with three coefficients per coordinate, for example:
X = Ax + By + C
Y = Dx + Ey + F
Use the following data for the example and find the Earth coordinates of points P and Q.

X        Y        x     y
12000    12000    100   100
12000    20000    100   500
20000    20000    500   500

P: x = 250, y = 250
Q: x = 100, y = 150
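One way to work this example is to solve the three-point affine system directly. The Python sketch below (Cramer's rule for X = Ax + By + C and Y = Dx + Ey + F, using the tie points from the table; interpretation of P and Q as image-coordinate points is an assumption) then transforms P and Q:

```python
# Solve a 3-point affine transform and apply it to two query points.

def solve_affine(pts):
    # pts: three ((x, y), (X, Y)) pairs; Cramer's rule on the 3x3 system
    (x1, y1), (X1, Y1) = pts[0]
    (x2, y2), (X2, Y2) = pts[1]
    (x3, y3), (X3, Y3) = pts[2]
    det = x1 * (y2 - y3) - y1 * (x2 - x3) + (x2 * y3 - x3 * y2)

    def coeffs(v1, v2, v3):
        a = (v1 * (y2 - y3) - y1 * (v2 - v3) + (v2 * y3 - v3 * y2)) / det
        b = (x1 * (v2 - v3) - v1 * (x2 - x3) + (x2 * v3 - x3 * v2)) / det
        c = (x1 * (y2 * v3 - y3 * v2) - y1 * (x2 * v3 - x3 * v2)
             + v1 * (x2 * y3 - x3 * y2)) / det
        return a, b, c

    return coeffs(X1, X2, X3), coeffs(Y1, Y2, Y3)

tie_points = [((100, 100), (12000, 12000)),
              ((100, 500), (12000, 20000)),
              ((500, 500), (20000, 20000))]
(A, B, C), (D, E, F) = solve_affine(tie_points)

def transform(x, y):
    return A * x + B * y + C, D * x + E * y + F

P = transform(250, 250)
Q = transform(100, 150)
```

With these particular tie points the transform reduces to a uniform scale of 20 plus an offset of 10000 in each coordinate.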
CE671A: Lab- 8
Understanding Image Classification
Submission Date- 14th Oct 2014 (Tuesday by 5:00 PM)
Objective: Training data collection and supervised classification with ILWIS.
You will learn:
a. Collection of training data for supervised classification
b. Classification methods and thematic map generation
1. Create a Georeference Image of “Imagery.L4” as you did in the last exercise
2. Collecting Sample Set: Click on File > Create > Sample Set
• Create a new Domain
• Name the domain as superviseddomain
• Click OK
• A domain class definition window opens up. We can add the names of the classes here.
• In this window, click the second icon indicated on the toolbar to add a class. An example of adding the water class has been shown here.
As an exercise, add the rest of the classes to the domain.
• Create the classes as: shallow (turbid) water body, deep (clear) water body, barren land and sand, only forest, only urban, mixed urban and forest
• After all the classes have been added, close this window. Click OK in the window.
• A map list window will open up. Click OK.
• A window like the one in the figure above will open up, showing the sample statistics. We are now ready to select samples. Zoom in to the desired area and click on the "Pointing Finger" icon.
• Select the sample, right click on it, and click on Edit.
• An “Edit” dialog box will appear. Select the appropriate class as per the sample. Click on OK. The sample statistics for water can be seen in the window.
3. Repeat the procedure for collecting samples of the other classes. See to it that the samples are spread over the whole image.
• We can also verify whether we are selecting proper samples by clicking on the Feature Space icon in the toolbar.
• Select the two bands as im_1 and im_3. Click OK. The feature space diagram appears.
Feature space diagram for three classes
• Close the sample set window.
• In the main window, right click on the file supervisedsample and select Classify … Select the Box classifier, set the multiplication factor as 1.732, and set the output file name as classif_sup_box. Click on Show.
• Click OK on the dialog box. The classified map will be displayed.
4. Repeat the above exercise with the Minimum Distance and Maximum Likelihood classifiers.
• Compare their outputs.
CE671A: Lab- 09
Understanding Unsupervised Image Classification and Classification Accuracy Assessment
Submission Date- 21st Oct 2014 (Tuesday by 5:00 PM)
Objective: Analyzing classification accuracy
You will learn:
a. Unsupervised classification
b. To calculate the classification accuracy of a classified image
1. For unsupervised classification, go to Operations -> Clustering. Clustering is one method of unsupervised classification.
2. Give the number of clusters as six, then repeat this procedure by changing the number of clusters and try to come up with the optimum number of clusters that the image can have.
3. To calculate the classification accuracy of a classified image, use the classified image and
the Georeferenced image of the last lab.
4. Creating the Test Set or Raster Map:
• Check the properties of the classified image to know which georeference and domain are used
• Check the georeference of the background map or georeferenced image (imageryL4); it should have the same georeference and domain
• From the File menu of the map window: File -> Create -> Raster Map
• In the Create Raster Map dialog box: type a new name for the ground truth map, accept the georeference of the classified image, and select the same domain as used by the classified image
• The pixel editor is opened automatically and the background map is displayed. When you zoom in, you can start selecting and giving names to pixels in your ground truth map. Make sure that the ground truth/test set raster map does not contain the same pixels as the sample set raster map from the training phase
5. Perform a CROSS operation with your ground truth map and the classified image to
obtain a cross table
6. To start the Cross operation:
• from the Operations menu in the Main window, select Raster Operations and then the Cross command, or
• double-click the Cross item in the Operation-list, or
• use the right mouse button on your test set raster map in the Catalog and choose Cross from the context menu
7. In the Cross dialog box:
• for First Map, select the test set map,
• accept the default selection of the Ignore Undefs check box for the first map,
• for Second Map, select the classified image,
• clear the Ignore Undefs check box for the second map,
• type a name for the output cross table, and
• select the Show check box to directly display the output cross table in a table window
8. In the table window displaying the cross table, open the View menu and choose Confusion Matrix. In the Confusion Matrix dialog box:
• for First Column, select the test set column in the table,
• for Second Column, select the classified image column in the table,
• for Frequency, select the NPix column in the table.
9. When you click OK, the confusion matrix is displayed in a matrix window.
10. For more information about interpreting the matrix, search for “confusion matrix” in the Help window.
11. Compute the confusion matrix for the images classified with the different methods and compare the accuracies of the various methods.
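What the confusion matrix summarizes can be sketched with lists of class labels. In this Python fragment (hypothetical classes and pixel labels), rows are ground truth, columns are the classified result, and overall accuracy is the diagonal sum over the total:

```python
# Build a confusion matrix from ground-truth and classified labels,
# then compute overall accuracy (correct pixels / total pixels).

def confusion_matrix(truth, classified, classes):
    idx = {c: i for i, c in enumerate(classes)}
    m = [[0] * len(classes) for _ in classes]
    for t, c in zip(truth, classified):
        m[idx[t]][idx[c]] += 1
    return m

def overall_accuracy(m):
    correct = sum(m[i][i] for i in range(len(m)))
    total = sum(sum(row) for row in m)
    return correct / total

classes = ["water", "urban", "forest"]                   # hypothetical labels
truth      = ["water", "water", "urban", "forest", "forest"]
classified = ["water", "urban", "urban", "forest", "water"]
cm = confusion_matrix(truth, classified, classes)
acc = overall_accuracy(cm)
```

Off-diagonal entries show which classes get confused with which, which is exactly the information ILWIS derives from the CROSS table via the NPix frequencies.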
CE671A: Lab- 10
Landsat 8 - Land Cover Analysis with Spectral Indices
Submission Date- 28th Oct 2014 (Tuesday by 5:00 PM)
Objective: Land Cover Analysis with Spectral Indices.
You will learn:
a. Land Cover Spectral Analysis with Landsat8
b. To calculate the classification accuracy of a classified image
Theoretical Background:
1. Landsat Bands
Landsat 8 carries the Operational Land Imager (OLI) and the Thermal Infrared Sensor (TIRS); it was launched on February 11, 2013.

Band                                Wavelength (micrometers)   Resolution (meters)
Band 1 - Coastal aerosol            0.43 - 0.45                30
Band 2 - Blue                       0.45 - 0.51                30
Band 3 - Green                      0.53 - 0.59                30
Band 4 - Red                        0.64 - 0.67                30
Band 5 - Near Infrared (NIR)        0.85 - 0.88                30
Band 6 - SWIR 1                     1.57 - 1.65                30
Band 7 - SWIR 2                     2.11 - 2.29                30
Band 8 - Panchromatic               0.50 - 0.68                15
Band 9 - Cirrus                     1.36 - 1.38                30
Band 10 - Thermal Infrared (TIRS) 1 10.60 - 11.19              100* (30)
Band 11 - Thermal Infrared (TIRS) 2 11.50 - 12.51              100* (30)

* TIRS bands are acquired at 100 meter resolution but are resampled to 30 meters in the delivered data product.
2. DN to Reflectance Conversion
For land cover analysis it is important to use reflectance images rather than the DNs directly, since during sensor calibration the DN recorded for a given incident energy is influenced largely by the point spread function of the sensor. The standard Landsat 8 products provided by the USGS EROS Center consist of quantized and calibrated scaled Digital Numbers (DN) representing multispectral image data acquired by both the Operational Land Imager (OLI) and the Thermal Infrared Sensor (TIRS). The metadata provided with each Landsat 8 dataset is a text file, e.g. ‘LC81470382013332LGN00_MTL.txt’; it includes the coefficients required to estimate the reflectance images and for other transformations. Follow this link to understand the mathematical relations for the conversion: http://landsat.usgs.gov/Landsat8_Using_Product.php
Write a MATLAB script to convert the Landsat 8 images to TOA reflectance and radiance images and to at-satellite brightness temperature, and display the results. Note again: all the requisite coefficients are provided in the ‘_MTL.txt’ file.
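The rescaling relations on the USGS page above can be sketched as follows. The lab asks for a MATLAB script; this is an equivalent Python sketch of the same formulas, where the mult/add coefficients and the K1/K2 thermal constants come from the scene's ‘_MTL.txt’ file (the numeric values below are placeholders for illustration, not from a real scene):

```python
import math

def toa_reflectance(dn, mult, add, sun_elev_deg):
    """TOA reflectance with sun-angle correction:
    rho' = M_rho * Qcal + A_rho;  rho = rho' / sin(theta_SE).
    mult/add are REFLECTANCE_MULT/ADD_BAND_x from the _MTL.txt."""
    return (mult * dn + add) / math.sin(math.radians(sun_elev_deg))

def toa_radiance(dn, mult, add):
    """TOA spectral radiance: L = M_L * Qcal + A_L
    (RADIANCE_MULT/ADD_BAND_x from the _MTL.txt)."""
    return mult * dn + add

def brightness_temperature(radiance, k1, k2):
    """At-satellite brightness temperature (kelvin) for the TIRS bands:
    T = K2 / ln(K1/L + 1), with K1/K2 from the _MTL.txt."""
    return k2 / math.log(k1 / radiance + 1.0)

# Placeholder coefficients for illustration only -- read the real ones
# from the scene's '_MTL.txt' metadata file.
rho = toa_reflectance(dn=10000, mult=2.0e-5, add=-0.1, sun_elev_deg=45.0)
L = toa_radiance(dn=10000, mult=1.0e-2, add=0.1)
T = brightness_temperature(L, k1=774.89, k2=1321.08)
print(rho, L, T)
```

In the MATLAB version the same three expressions would be applied element-wise to the full DN array of each band.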
3. Land Cover Analysis
Estimate and delineate the following spectral indices for studying the geographical and geophysical properties of the land cover.
1.1 NDVI - Normalized Difference Vegetation Index
Differencing the spectral bands on either side of the red edge highlights vegetated areas: chlorophyll absorbs strongly in the red while the leaf structure reflects strongly in the near infra-red. NDVI values lie between -1 and +1.

NDVI = (NIR - R) / (NIR + R)

• Explain the range of NDVI values per land cover type, i.e. for which land cover types it would be close to +1, 0 and -1.
• Display the NDVI map. Give the statistical distribution of the values and the histogram.
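The per-pixel NDVI computation can be sketched as follows; a minimal Python illustration (the lab uses MATLAB), with hypothetical reflectance values for Landsat 8 band 5 (NIR) and band 4 (red):

```python
def ndvi(nir, red):
    """NDVI = (NIR - R) / (NIR + R), guarding against a zero denominator."""
    denom = nir + red
    return 0.0 if denom == 0 else (nir - red) / denom

# Hypothetical reflectance pairs for illustration:
print(ndvi(0.45, 0.08))  # dense vegetation: high positive value
print(ndvi(0.20, 0.18))  # bare soil: near 0
print(ndvi(0.02, 0.05))  # water: negative
```

Applied to every pixel of the reflectance images, this produces the NDVI map whose histogram and statistics the exercise asks for.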
1.2 NDWI - Normalized Difference Water Index
This index is useful for mapping water areas, displaying differences in turbidity and the vegetal content of water and erratic soils, or for measuring the water content of vegetation. It uses the green and near infra-red spectral bands (the NIR increases the spectral response of soil moisture, rocks and plants, while water begins to absorb radiation from the surface layer).

NDWI = (G - NIR) / (G + NIR)

• Fill in the blanks corresponding to the given data:
o The dark colours (values close to ____) represent clear water, the light colours (values close to ____) represent dry land, and intermediate colours (values close to ____) represent land with an intermediate humidity content.
• Display the NDWI map. Give the statistical distribution of the values and the histogram.
1.3 NDMI - Normalized Difference Moisture Index
With this index, light colours represent an excess of humidity and dark colours a low humidity. It evaluates the differing moisture content of landscape elements, especially soils, rocks and vegetation (an excellent indicator of dryness).

NDMI = (NIR - SWIR) / (NIR + SWIR)

• Fill in the blanks corresponding to the given data:
o Values higher than ___, symbolized by light colours, signal a ____ humidity level. Low values (close to ___), symbolized by dark colours, represent a ____ humidity level.
1.4 NDBaI - Normalized Difference Bareness Index
This index is based on significant differences in spectral signature between bare soil and the background in the shortwave infrared (Band 6). However, it shows little difference between built-up areas and bare-soil areas in Band 6.

NDBaI1 = (OLI7 - TIRS) / (OLI7 + TIRS)
NDBaI2 = (OLI6 - TIRS) / (OLI6 + TIRS)
NDBaI3 = (OLI5 - TIRS) / (OLI5 + TIRS)
NDBaI4 = (OLI4 - TIRS) / (OLI4 + TIRS)
NDBaI5 = (OLI3 - TIRS) / (OLI3 + TIRS)
NDBaI6 = (TOA6 - TOA5) / (TOA6 + TOA5)
NDBaI7 = (TOA5 - TOA4) / (TOA5 + TOA4)

• Generate different colour composites from the above indices and compare them. In your report write a detailed analysis.
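All of the indices in this lab share the same normalized-difference form, so the NDBaI variants can be generated in a loop. A Python sketch with hypothetical single-pixel band values (TOA reflectance for the OLI bands and a rescaled TIRS value, for illustration only):

```python
def norm_diff(a, b):
    """Generic normalized difference (A - B) / (A + B), the form shared by
    NDVI, NDWI, NDMI and the NDBaI variants; returns 0 where A + B = 0."""
    s = a + b
    return 0.0 if s == 0 else (a - b) / s

# Hypothetical single-pixel band values for illustration only:
bands = {"OLI3": 0.09, "OLI4": 0.11, "OLI5": 0.25,
         "OLI6": 0.30, "OLI7": 0.28, "TIRS": 0.35}

# NDBaI1..NDBaI5 pair one OLI band (7 down to 3) against the TIRS band:
ndbai = {f"NDBaI{i}": norm_diff(bands[f"OLI{8 - i}"], bands["TIRS"])
         for i in range(1, 6)}
# NDBaI6 and NDBaI7 pair adjacent TOA reflectance bands:
ndbai["NDBaI6"] = norm_diff(bands["OLI6"], bands["OLI5"])
ndbai["NDBaI7"] = norm_diff(bands["OLI5"], bands["OLI4"])
print(ndbai)
```

Stacking any three of the resulting index maps as R, G and B channels gives the colour composites the exercise asks you to compare.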
CE671A: Lab- 11
Landsat8- Change Detection of snow cover
Submission Date- 4th November 2014 (Tuesday by 5:00 PM)
Objective: Analyzing changes in the snow cover from bi-temporal Landsat-8 OLI Imagery
You will learn:
a. Snow Cover Mapping
b. Change Detection
Study Area: Beas River Basin
1. Conversion of DNs to Reflectance Images
• The given images are cropped from the original (whose metadata is provided to you) using the bounding box [3550 1390 499 499]. In the images, the first band is B4, the second band is B5 and the third band is B7.
• Apply path radiance correction by the dark pixel method: locate the lowest DN in each band of both images and subtract it from the entire band. This eliminates the effect of path radiance. Comment on which band shows the maximum effect of path radiance. Is this as expected?
• Use the script developed in the previous lab exercise to generate reflectance images for Landsat 8 bands 4, 5 and 7.
2. Snow Cover Mapping with the S3 Index
• Derive the Normalized Difference Snow Index for the two images from the DNs (NDSI1 and NDSI2, respectively). The index is defined as follows:

NDSI = (R - SWIR) / (R + SWIR)

• Difference the two NDSI maps (DNDSI = NDSI2 - NDSI1).
• Derive the snow cover maps (S3_1 and S3_2, respectively) for the two given datasets using the S3 index, defined as follows:

S3 = NIR * (Red - SWIR) / ((NIR + Red) * (NIR + SWIR))
Optional
• Use B7 to generate a cloud and shadow mask and apply it to the corresponding image. Ideally you should do this for both images; however, due to time limitations, work with the second image only.
• NDSI does not discriminate well between snow and glacial runoff or rivers. Use B4 in the first image to discriminate these from snow. Identify the reflectance of pixels in the runoff regions as a percentage of the snow identified by NDSI. Set a criterion, say:
o If the reflectance of a pixel in B4 is > x% and its NDSI value is >= y%, the pixel is identified as snow.
o If the reflectance of a pixel in B4 is < x% and its NDSI value is ~ the mean NDSI of the snow-covered pixels (~y%), the pixel is identified as water.
o If the NDSI value is < the mean NDSI of the snow-covered pixels (~y%), the pixel is identified as a land/shadow area.
o Based on the above, develop snow cover maps for the two images (SCM1, SCM2).
o Export them as ‘tif’ images using ‘imwrite’.
• Difference the two S3 maps (DS3 = S3_2 - S3_1).
• Compare the DNDSI and DS3.
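The per-pixel index and differencing steps above can be sketched as follows. A Python illustration (the lab uses MATLAB) with hypothetical reflectances for one pixel on the two dates; the thresholds x and y from the optional criterion are deliberately left out, since choosing them is part of the exercise:

```python
def ndsi(red, swir):
    """NDSI as defined in this lab: (R - SWIR) / (R + SWIR)."""
    return (red - swir) / (red + swir)

def s3(nir, red, swir):
    """S3 snow index: NIR * (R - SWIR) / ((NIR + R) * (NIR + SWIR))."""
    return nir * (red - swir) / ((nir + red) * (nir + swir))

# Hypothetical per-pixel reflectances for the two dates (B4, B5, B7):
date1 = {"red": 0.80, "nir": 0.75, "swir": 0.10}  # snow-covered pixel
date2 = {"red": 0.25, "nir": 0.30, "swir": 0.20}  # snow has receded

# Change maps at this pixel: DNDSI = NDSI2 - NDSI1, DS3 = S3_2 - S3_1
dndsi = ndsi(date2["red"], date2["swir"]) - ndsi(date1["red"], date1["swir"])
ds3 = (s3(date2["nir"], date2["red"], date2["swir"])
       - s3(date1["nir"], date1["red"], date1["swir"]))
print(dndsi, ds3)  # both negative here: snow loss at this pixel
```

Comparing the sign and magnitude of DNDSI against DS3 pixel by pixel is exactly the comparison the last bullet asks for.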
3. Registration of Snow Cover Maps
• Import the SCM1 and SCM2 ‘.tif’ images into ILWIS as ‘General Raster’. Use the S3 outputs as the SCMs.
• Note the coordinates of the image corners from the metadata included with the given datasets (‘_MTL.txt’). The coordinates are given in UTM zone 43 and in lat/long. They are required for georeferencing one of the snow cover maps (SCM1). Use an affine transform to determine the coordinates of the corners of the cropped section.
• Convert them to the DMS system at the following: http://www.rcn.montana.edu/resources/converter.aspx
• Georeference the image in ILWIS using the above coordinates. To select the corners, zoom fully into the image and scroll to a corner, then click the ‘add point’ tool followed by another click at the appropriate corner.
• If you were to import the GeoTIFF format directly into ILWIS, it would take the georeference information automatically; however, since we exported the snow cover maps from MATLAB, the georeferencing is lost.
• Using this georeferenced snow cover map, register SCM2.
4. Change Detection
a. Differencing
• Take the difference of the two co-registered snow cover maps in MATLAB. In ILWIS, simply type ‘ChangeMap = SCM2 - SCM1’ in the command line to do this.
• Perform unsupervised classification on the generated difference in ILWIS.
b. Post-Classification Analysis
• Perform supervised classification on the co-registered snow cover maps with the following classes:
o Class 1: dry snow
o Class 2: wet snow
o Class 3: water
o Class 4: land
• Difference the classified maps. Compare the statistics of the classified maps and the difference.
CE671A: Lab- 4
Understanding Image Filtering and Edge Detection
Submission Date- 16 Sep 2014 (Tuesday by 5:00 PM)
Objective: To understand feature representation in SAR multi-look images
You will learn:
1. Understanding of SAR images
2. Feature mapping from SAR images
Data Used:
TerraSAR-X (Vishakhapatnam), single polarization (VV), spotlight mode
Software
Matlab
Guidelines
1. Backscatter Type
• A JERS-1 SAR image is given below along with an IRS FCC image of Mumbai (Figure 2, courtesy IIT Bombay). Write the type of reflection/scattering the SAR signal undergoes at the locations marked in the image, using the IRS-P6 image as reference.
• For theoretical information refer to the attached file ‘Overview_SAR_Basics’ or http://usgif.org/system/uploads/2545/original/Overview_SAR_Basics.pdf
2. Visualization and Interpretation
• Read the given image in MATLAB. Recalling the image-enhancement lab exercise, apply an appropriate enhancement technique. Hint: use the pre-defined function in MATLAB.
• Identify the path of the satellite with respect to the image. Save the image as ‘jpeg’ and annotate on it the range and azimuth directions along with the rough path.
• Speckle filtering: MATLAB does not have predefined filters based on general statistical neighbourhood processing. It does provide very efficient wavelet tools, but they are out of your scope; thus you may use the Wiener filter for this.
• Estimate the image entropy and an entropy image based on the local entropy of the pixels. Hint: see help ‘entropy’ in MATLAB.
• Along with the image, another file titled ‘Metadata’ is provided; it has the scene specifications needed for the computations. SAR images are side-looking radar images; convert the given image to a nadir-look image.
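The entropy measures above (what MATLAB's entropy and entropyfilt compute) can be sketched in pure Python; this assumes the image is a small list-of-lists of integer grey levels, for illustration only:

```python
import math
from collections import Counter

def entropy(values):
    """Shannon entropy in bits of a list of pixel values:
    H = -sum(p * log2(p)) over the grey-level histogram."""
    n = len(values)
    counts = Counter(values)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def local_entropy(image, radius=1):
    """Entropy image: per-pixel entropy of the (2r+1)x(2r+1) neighbourhood,
    with the window clipped at the image borders (MATLAB's entropyfilt
    uses a similar sliding window)."""
    rows, cols = len(image), len(image[0])
    out = []
    for i in range(rows):
        row = []
        for j in range(cols):
            block = [image[a][b]
                     for a in range(max(0, i - radius), min(rows, i + radius + 1))
                     for b in range(max(0, j - radius), min(cols, j + radius + 1))]
            row.append(entropy(block))
        out.append(row)
    return out

# A tiny hypothetical 'speckled' patch for illustration:
img = [[10, 10, 200, 10],
       [10, 200, 10, 10],
       [200, 10, 10, 200],
       [10, 10, 200, 10]]
print(entropy([p for r in img for p in r]))  # global image entropy
print(local_entropy(img)[1][1])              # one pixel of the entropy image
```

High local entropy flags heterogeneous (e.g. strongly speckled or textured) regions, which is why the entropy image is useful alongside the speckle-filtered result.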
3. Geometrical Distortions
• The geometrical distortions in a SAR image are as shown in the following figure.
• For the features given in ‘Product KMZ’, identify which of these distortions are present.
• To estimate this, calculate the slope angle using the length and height of the feature from Google Earth. From the scattering pattern this should actually be evident.
Figure 1: Distortions in a SAR image (courtesy of ESA)
4. Feature Mapping
o Refer to http://www.pf.bgu.tum.de/isprs/pia03/pub/pia03_s2p3.pdf and https://drive.google.com/file/d/0B7Uc9z9_eTn2R1pHb0pWQjZCdzg/edit
o Given is a KMZ file, ‘Product KMZ’. Open it in Google Earth and observe the marked locations in the SAR image.
o Use the ‘imtool’ option to measure pixel distances in the image.
o Use the metadata to convert the layover to ground height.
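The layover-to-height conversion in the last bullet can be sketched from the imaging geometry: for a vertical feature the top appears displaced toward the sensor by roughly h·cos(θ) in slant range (h/tan(θ) in ground range), where θ is the local incidence angle. A Python sketch with hypothetical numbers (the real pixel spacing and incidence angle come from the scene metadata):

```python
import math

def height_from_layover(pixels, pixel_spacing_m, incidence_deg, slant=True):
    """Estimate feature height from its measured layover displacement:
    h = d_slant / cos(theta)  for a slant-range image, or
    h = d_ground * tan(theta) for a ground-range image,
    where d is the displacement (pixels * pixel spacing) and theta is
    the local incidence angle from the metadata."""
    d = pixels * pixel_spacing_m
    theta = math.radians(incidence_deg)
    return d / math.cos(theta) if slant else d * math.tan(theta)

# Hypothetical numbers: a 12-pixel layover measured with imtool, at
# 1.0 m slant-range pixel spacing and a 35-degree incidence angle.
print(height_from_layover(12, 1.0, 35.0, slant=True))
```

Measure the displacement in pixels with imtool, then multiply by the pixel spacing from the metadata before applying the angle factor.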
Figure 2: JERS-1 L-band SAR image (left) and IRS-P6 LISS-III image (right) of Mumbai, with locations 1-8 marked in both images.