Fundamentals of Digital Image Processing and Basic Concept of Classification
URL: www.pakinsight.com
1 Department of Urban and Regional Planning, Bangladesh University of Engineering and Technology, Bangladesh
2 Medical Officer, Popular Hospital, Dhaka, Bangladesh
3 Principal Scientific Officer, Bangladesh Space Research and Remote Sensing Organization, Bangladesh
ABSTRACT
We have to classify and analyze digital images for different studies and purposes. Digital images are obtained from sources such as cameras, satellites, and aircraft. Data obtained from satellites or aircraft, i.e., space-based and remote sensing data, need to be corrected, as they are usually geometrically distorted by the acquisition system and the movements of the platform. Processing and pre-processing are necessary for such correction prior to image classification. Image processing is a technique used to enhance raw images received from cameras, satellites, space probes, aircraft, etc., and digital image processing is the technique of processing images in the form of discrete digital brightness values using digital circuits or digital computers. Image analysis and classification begin when processing and pre-processing end.
© 2014 Pak Publishing Group. All Rights Reserved.
Contribution/ Originality
This paper contributes to the concepts of digital image, digitization, image processing, and classification. It provides an original compilation of the stages of digital image processing, its applications, and its necessity.
DOI: 10.18488/journal.65/2014.1.6/65.6.98.108
ISSN(e): 2313-0776 / ISSN(p): 2313-2558
International Journal of Chemical and Process Engineering Research, 2014, 1(6): 98-108

1. INTRODUCTION
Digital image processing (DIP) can be defined as the transformation of a distorted image into a modified one that helps people detect salient features without difficulty, a step necessary for image analysis in different studies. It is electronic data processing on a 2-D array of numbers known as pixels, which form the numeric representation of an image [4]. The output of image processing can be an image or a set of characteristics related to the image. Digital image processing techniques are used to manipulate digital images using computers; an image processing system treats images as 2-D signals [5]. Such a system consists of a source of image data, a processing element, and a destination for the processed results. The source of image data may be a camera, a sensor, a satellite, a scanner, a mathematical equation, statistical data, the Web, a SONAR system, etc. The processing element is a computer, and the destination for the processed result may be a display monitor.
2. OBJECTIVE
The objectives can be summarized as follows:
- Image correction: to remove distortion, errors, and noise introduced during data acquisition
- Image enhancement: to modify images or increase the visual appearance and interpretability of imagery
Hardware: computer monitor; CD-ROM and ink-jet printer; disk drive; Internet connection; scanner and digital camera.
Software: Photoshop; ERDAS Imagine; IDRISI; ArcView; MATLAB; Visual C++; ENVI; ER Mapper; etc.
A real image is formed on a sensor when emitted energy strikes it with sufficient intensity to create a sensor output [1]. A digital image can be described as a 2-D function, f(x, y), where x and y are spatial coordinates, and the amplitude of f at any pair of coordinates (x, y) is called the intensity or grey level of the image at that point. When x, y, and the amplitude values of f are all finite, discrete quantities, we call the image a digital image; it is obtained by digitization of an analog image [1].
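The f(x, y) definition above can be sketched as a small array (a hypothetical example using NumPy; the values are arbitrary):

```python
import numpy as np

# A digital image is a 2-D array f(x, y): each entry is the
# grey level (intensity) at spatial coordinates (x, y).
f = np.array([[ 12,  40,  40],
              [ 80, 200, 120],
              [255,  90,  10]], dtype=np.uint8)

rows, cols = f.shape          # finite, discrete spatial extent
print(f[1, 2])                # intensity at coordinates (1, 2) -> 120
print(f.min(), f.max())       # grey levels lie in the finite range 0..255
```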
Digitization
Digitization implies that a digital image is an approximation of a real scene. A pixel's brightness value typically represents a gray level, color, height, opacity, etc. [1]
Digitization Accuracy
Resolution expresses digitization accuracy. It is of two types:
– pixel resolution
– brightness resolution (color resolution)
Sampling means measuring the value of an image at a finite number of points; it normally corresponds to the number of pixels in the vertical and horizontal directions [1]. Quantization is the representation of the measured value at each sampled point by an integer. The number of gray levels in the equally spaced gray scale is called the quantization or gray-scale resolution of the system [1].
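Sampling and quantization can be sketched as follows (a minimal illustration on a synthetic 8-bit image; the helper names `sample` and `quantize` are ours, not from the text):

```python
import numpy as np

def sample(image, step):
    """Sampling: keep the image value at a finite number of points
    (every `step`-th pixel in each direction)."""
    return image[::step, ::step]

def quantize(image, levels):
    """Quantization: represent each sampled value by one of
    `levels` equally spaced integer grey levels."""
    q = np.floor(image / 256.0 * levels)       # map 0..255 -> 0..levels-1
    return np.clip(q, 0, levels - 1).astype(np.uint8)

img = np.arange(64, dtype=np.uint8).reshape(8, 8) * 4   # synthetic 8x8 image
small = sample(img, 2)       # 4x4: coarser pixel resolution
coarse = quantize(img, 4)    # only 4 grey levels: coarser brightness resolution
print(small.shape)           # (4, 4)
print(np.unique(coarse))     # [0 1 2 3]
```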
In the degradation model g(x, y) = H[f(x, y)] + η(x, y), f is the original image, g is a degraded/noisy version of the original image, and f̂ is a restored version of f. Image restoration removes a known degradation [3].
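As a minimal sketch of this idea, assuming a simple additive-noise degradation and a plain 3×3 mean filter as the restoration step (the text does not prescribe a specific filter):

```python
import numpy as np

rng = np.random.default_rng(0)
f = np.full((32, 32), 100.0)              # original image f
eta = rng.normal(0, 10, f.shape)          # known additive noise eta
g = f + eta                               # degraded image g = f + eta

# With zero-mean noise, local averaging yields a restored estimate f_hat.
pad = np.pad(g, 1, mode='edge')
f_hat = np.zeros_like(g)
for i in range(g.shape[0]):
    for j in range(g.shape[1]):
        f_hat[i, j] = pad[i:i+3, j:j+3].mean()

# Restoration reduces the mean absolute error with respect to f.
print(np.abs(g - f).mean() > np.abs(f_hat - f).mean())   # True
```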
The distortion may be specified by locating control points in the distorted image and identifying their corresponding control points in an ideal reference. The distortion model then defines a transformation between these control points to generate a spatial warping function, which allows the output (warped) image to be built pixel by pixel [4].
Scene Illumination
In satellite remote sensing, imagery acquired at different times of the year may be required (e.g., to study the phenological cycle). Such imagery may require a sun elevation correction and an earth-sun distance correction.
Solar elevation angle: the angular elevation of the sun above the horizon.
Solar zenith angle: the angular deviation of the sun from directly overhead (the complement of the elevation angle).
Corrections: DN / sin(elevation angle) or DN / cos(zenith angle) [2].
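The two DN corrections above are equivalent, which can be checked numerically (a small sketch; the DN and angle values are made up):

```python
import math

def sun_elevation_correction(dn, elevation_deg):
    """Normalize a digital number (DN) for solar illumination:
    DN / sin(elevation angle)."""
    return dn / math.sin(math.radians(elevation_deg))

dn = 120
elev = 30.0
zen = 90.0 - elev                       # zenith is the complement of elevation
a = sun_elevation_correction(dn, elev)
b = dn / math.cos(math.radians(zen))    # equivalent zenith-angle form
print(round(a, 6) == round(b, 6))       # True: both corrections agree
print(round(a, 6))                      # 240.0, since sin(30 deg) = 0.5
```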
Rectification Method
This method involves the selection of ground control points (GCPs) that can be located on both the map and the image. The unknown coefficients of the transformation equations are solved by determining the coordinates of a set of known locations, the GCPs. They should be well defined, spatially small, and well distributed over the entire image [2].
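One common form of the transformation equations is a first-order (affine) polynomial; the sketch below fits its six unknown coefficients to hypothetical GCPs by least squares (the coordinates are invented for illustration):

```python
import numpy as np

# Each GCP pairs image (row, col) coordinates with map (x, y) coordinates.
image_pts = np.array([[0, 0], [0, 100], [100, 0], [100, 100]], dtype=float)
map_pts   = np.array([[500, 2000], [700, 2000], [500, 1800], [700, 1800]], dtype=float)

# First-order polynomial: x = a0 + a1*col + a2*row (and likewise for y).
# Build the design matrix and solve for the six unknowns by least squares.
A = np.column_stack([np.ones(len(image_pts)), image_pts[:, 1], image_pts[:, 0]])
coeffs, *_ = np.linalg.lstsq(A, map_pts, rcond=None)

# Predicted map coordinates for the GCPs should match the known ones.
pred = A @ coeffs
print(np.allclose(pred, map_pts))   # True
```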
Resampling
This process calculates new pixel values from the original digital pixel values in the raw, uncorrected image. There are three common methods for resampling images:
- nearest neighbor: assigns each corrected output pixel the value of the nearest input pixel
- bilinear interpolation: calculates the new output pixel value by interpolating from the four closest input pixels
- cubic convolution: interpolates a new pixel value from a larger neighborhood of input pixels (typically 16)
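The first two resampling methods can be sketched for a single output location (a toy example; `nearest_neighbor` and `bilinear` are our own helper names):

```python
import numpy as np

def nearest_neighbor(img, y, x):
    """Assign the value of the nearest input pixel."""
    return img[int(round(y)), int(round(x))]

def bilinear(img, y, x):
    """Interpolate from the four closest input pixels."""
    y0, x0 = int(np.floor(y)), int(np.floor(x))
    y1, x1 = y0 + 1, x0 + 1
    dy, dx = y - y0, x - x0
    return (img[y0, x0] * (1 - dy) * (1 - dx) + img[y0, x1] * (1 - dy) * dx +
            img[y1, x0] * dy * (1 - dx) + img[y1, x1] * dy * dx)

img = np.array([[ 0.0, 10.0],
                [20.0, 30.0]])
print(nearest_neighbor(img, 0.4, 0.4))   # 0.0  (nearest input pixel is (0, 0))
print(bilinear(img, 0.5, 0.5))           # 15.0 (average of the four pixels)
```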
Image Registration
This process applies techniques similar to rectification for image-to-image and image-to-map overlays. It includes edge detection, which is used to create image outlines by emphasizing areas with strong intensity contrasts. An edge-detected image preserves the important structural properties while filtering out unwanted detail [4].
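As an illustration of edge detection, a sketch using the common Sobel gradient operator (the text does not name a specific operator):

```python
import numpy as np

def sobel_edges(img):
    """Edge detection: gradient magnitude from Sobel kernels, which
    responds strongly where intensity contrast is high."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    h, w = img.shape
    mag = np.zeros((h - 2, w - 2))
    for i in range(h - 2):
        for j in range(w - 2):
            patch = img[i:i+3, j:j+3]
            gx = (patch * kx).sum()          # horizontal gradient
            gy = (patch * ky).sum()          # vertical gradient
            mag[i, j] = np.hypot(gx, gy)
    return mag

# A vertical step edge: the response peaks along the intensity boundary
# and is zero in the flat region.
img = np.zeros((5, 6))
img[:, 3:] = 100.0
edges = sobel_edges(img)
print(edges.max() > 0 and edges[:, 0].max() == 0)   # True
```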
4.5. Segmentation
This process is used to subdivide an image into its component regions or objects. Segmentation algorithms are generally based on one of the following two basic properties of intensity values:
- discontinuity: partitioning an image based on sharp changes in intensity (such as edges)
- similarity: partitioning an image into regions that are similar according to a set of predefined criteria [4].
Segmentation should stop when the objects of interest in an application have been isolated.
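A minimal sketch of similarity-based segmentation, using a global intensity threshold as the predefined criterion (the image and threshold are made up):

```python
import numpy as np

def threshold_segment(img, t):
    """Similarity-based segmentation: partition pixels into two regions
    according to a predefined intensity criterion (a global threshold)."""
    return (img > t).astype(np.uint8)   # 1 = object region, 0 = background

img = np.array([[10,  12, 200],
                [11, 220, 210],
                [ 9,  13, 205]])
mask = threshold_segment(img, 100)
print(mask.sum())   # 4: the four bright "object" pixels are isolated
```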
Image Mosaic

Supervised classification requires training sets (pixels) that are representative of all classes of interest. This training dataset forms the basis for the classification of the total image data. The algorithm classifies pixels of unknown identity using samples with known values. The user starts the process by selecting and naming areas on the image corresponding to the classes of interest, i.e., the information classes. The classification algorithm then finds all other areas of the same class [3].
Procedure
– Display the image in a single band or a three-band combination.
– Acquire training sets.
– Choose the classifier type.
– Perform classification.
– Refine training sets.
– Derive the accuracy assessment measures [3].
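One simple classifier type that fits this procedure is a minimum-distance (nearest-mean) classifier; the sketch below uses invented training means for three hypothetical classes:

```python
import numpy as np

# Hypothetical training sets summarized as mean spectral values per class.
# A minimum-distance classifier assigns each unknown pixel to the class
# whose training mean is closest in spectral space.
train = {
    "water":      np.array([20.0, 15.0]),   # band 1, band 2 means
    "vegetation": np.array([40.0, 90.0]),
    "urban":      np.array([80.0, 70.0]),
}

def classify(pixel):
    names = list(train)
    dists = [np.linalg.norm(pixel - train[n]) for n in names]
    return names[int(np.argmin(dists))]

print(classify(np.array([22.0, 18.0])))   # water
print(classify(np.array([45.0, 85.0])))   # vegetation
```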
Spectral values of a given cover type are close together, whereas data of different classes are well separated. After classification, the analyst labels all identified spectral clusters. Object recognition and representation is the final stage, which gives the result of digital image processing [3].
5. CONCLUSION
Image processing is necessary for different types of work, such as land use studies, numerical weather prediction, and mapping. Digital image processing is needed for digital images such as space-based and remote sensing data. It includes data operations that normally precede interpretation and further manipulation, analyzing the image data to extract specific information of interest. These operations aim to correct distorted or degraded image data to create a more faithful representation of the original scene, which helps the user or analyst detect information easily and reliably. Preprocessing commonly comprises a series of sequential operations, including atmospheric correction, normalization, radiometric and geometric correction, and image registration and representation.
6. ACKNOWLEDGEMENT
The authors would like to express their gratitude to the officials of Bangladesh Space Research
and Remote Sensing Organization (SPARRSO), especially to Md. Mozammel Haque Sarker. The
library of SPARRSO has also been very helpful in this research.
REFERENCES
[1] B. M. Namee, "Digital image processing: Digital image fundamentals." Available:
http://www.comp.dit.ie/bmacnamee/materials/dip/lectures/ImageProcessing2-
ImageProcessingFundamentals.ppt [Accessed 13 August, 2014], n.d.
[2] D. DiBiase, The nature of geographic information. John A. Dutton E-Education Institute. Available:
https://www.e-education.psu.edu/natureofgeoinfo/c8_p18.html [Accessed 13 August, 2014], n.d.
[3] J. Campbell, "Digital image classification: 4354-Remote sensing, Ini Chitoz, Erdas." Available:
http://www.scribd.com/doc/130172996/ERDAS-Digital-Image-Classification-Geography-4354-
Remote-Sensing [Accessed 13 August, 2014], 2001.
[4] C. Solomon and T. Breckon, Fundamentals of digital image processing: A practical approach with
examples in Matlab. Wiley-Blackwell. Available: www.fundipbook.com [Accessed 13 August,
2014], n.d.
[5] Engineers Garage, "Introduction to image processing." Available:
http://www.engineersgarage.com/articles/image-processing-tutorial-applications [Accessed 12
September, 2014], n.d.