Design and Analysis Introduction


An algorithm is a finite sequence of steps for solving a problem, used to perform calculation,
data processing, and automated reasoning tasks. An algorithm is an effective method
that can be expressed within a finite amount of time and space.
An algorithm is a way of representing the solution to a particular problem in a
simple and efficient manner. If we have an algorithm for a specific problem, we can
implement it in any programming language; in other words, the algorithm is
independent of any particular programming language.

Algorithm Design
Algorithm design is concerned with creating an algorithm that solves a problem
efficiently, using as little time and space as possible.
Different approaches can be followed to solve a problem. Some of them can be
efficient with respect to time consumption, whereas others may be more memory
efficient. However, one has to keep in mind that time consumption and memory
usage often cannot be optimized simultaneously. If we require an algorithm to run in less
time, we usually have to spend more memory, and if we require an algorithm to run with less
memory, we usually need more time.
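A small illustration of this trade-off, sketched in Python (the function names fib_slow and fib_fast are only illustrative): the first version stores nothing and recomputes its subproblems, so it is memory-light but very slow; the second spends memory on a table of previously computed values to run in linear time.

def fib_slow(n):
    # Stores no intermediate results (memory-light), but recomputes the same
    # subproblems over and over, taking exponential time.
    if n < 2:
        return n
    return fib_slow(n - 1) + fib_slow(n - 2)

def fib_fast(n, memo=None):
    # Spends extra memory on a memo table in exchange for linear running time.
    if memo is None:
        memo = {}
    if n < 2:
        return n
    if n not in memo:
        memo[n] = fib_fast(n - 1, memo) + fib_fast(n - 2, memo)
    return memo[n]

print(fib_slow(25))    # slow, but uses little extra memory
print(fib_fast(200))   # fast, at the cost of a table of about 200 stored values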

Problem Development Steps


The following steps are involved in solving computational problems.

 Problem definition
 Development of a model
 Specification of an Algorithm
 Designing an Algorithm
 Checking the correctness of an Algorithm
 Analysis of an Algorithm
 Implementation of an Algorithm
 Program testing
 Documentation

Characteristics of Algorithms
The main characteristics of algorithms are as follows −
 Algorithms must have a unique name
 Algorithms should have an explicitly defined set of inputs and outputs
 Algorithms are well-ordered with unambiguous operations
 Algorithms must halt in a finite amount of time; an algorithm should not run forever,
i.e., it must end at some point

Pseudocode
Pseudocode gives a high-level description of an algorithm without the ambiguity
associated with plain text but also without the need to know the syntax of a particular
programming language.
The running time can be estimated in a more general manner by using pseudocode to
represent the algorithm as a set of fundamental operations, which can then be counted.

Difference between Algorithm and Pseudocode


An algorithm is a formal definition with specific characteristics that describes a
process which could be executed by a Turing-complete machine to perform a
specific task. Generally, the word "algorithm" can be used to describe any high-level
task in computer science.
Pseudocode, on the other hand, is an informal and often rudimentary human-readable
description of an algorithm that leaves out many of its granular details. Writing pseudocode
imposes no restrictions on style; its only objective is to describe the high-level steps of the
algorithm in natural language, closer to how it would actually be implemented.
For example, the following is an algorithm for Insertion Sort.
Algorithm: Insertion-Sort
Input: A list L of integers of length n
Output: A sorted list L1 containing those integers present in L
Step 1: Keep a sorted list L1 which starts off empty
Step 2: Perform Step 3 for each element in the original list L
Step 3: Insert it into the correct position in the sorted list L1.
Step 4: Return the sorted list
Step 5: Stop
Here is pseudocode describing how the high-level abstract process given above in the
algorithm Insertion-Sort could be expressed more concretely (assuming a zero-indexed array A):
for i <- 1 to length(A) - 1
    x <- A[i]
    j <- i
    while j > 0 and A[j-1] > x
        A[j] <- A[j-1]
        j <- j - 1
    A[j] <- x
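As one possible realization of this pseudocode, a direct Python translation looks like the following (the function name insertion_sort is only illustrative):

def insertion_sort(A):
    # Sorts the list A in place, mirroring the pseudocode above.
    for i in range(1, len(A)):
        x = A[i]              # element to insert into the sorted prefix A[0..i-1]
        j = i
        while j > 0 and A[j - 1] > x:
            A[j] = A[j - 1]   # shift the larger element one position to the right
            j -= 1
        A[j] = x              # place x in its correct position
    return A

print(insertion_sort([5, 2, 9, 1, 5, 6]))   # prints [1, 2, 5, 5, 6, 9]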
Algorithm Analysis

In the theoretical analysis of algorithms, it is common to estimate their complexity in the
asymptotic sense, i.e., to estimate the complexity function for arbitrarily large input. The
term "analysis of algorithms" was coined by Donald Knuth.
Algorithm analysis is an important part of computational complexity theory, which
provides a theoretical estimate of the resources an algorithm requires to solve a
specific computational problem. Most algorithms are designed to work with inputs of
arbitrary length. Analysis of algorithms determines the amount of time and
space resources required to execute them.
Usually, the efficiency or running time of an algorithm is stated as a function relating the
input length to the number of steps, known as time complexity, or to the volume of memory,
known as space complexity.
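For instance, one way to obtain such a function empirically is to instrument the code and count its fundamental operations. The Python sketch below (the name insertion_sort_count is only illustrative) counts the comparisons made by insertion sort; on a reverse-sorted input of length n it performs n(n-1)/2 comparisons, i.e., a quadratic number of steps.

def insertion_sort_count(A):
    # Sorts A and returns the number of comparisons A[j-1] > x performed.
    comparisons = 0
    for i in range(1, len(A)):
        x = A[i]
        j = i
        while j > 0:
            comparisons += 1          # one comparison of A[j-1] with x
            if A[j - 1] > x:
                A[j] = A[j - 1]
                j -= 1
            else:
                break
        A[j] = x
    return comparisons

for n in (10, 20, 40):
    print(n, insertion_sort_count(list(range(n, 0, -1))))   # 45, 190, 780 = n(n-1)/2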

The Need for Analysis


Algorithms are often quite different from one another, even though their objective is the
same. For example, we know that a set of numbers can be sorted using different
algorithms. The number of comparisons performed by one algorithm may differ from that
of another for the same input. Hence, the time complexity of those algorithms may
differ. At the same time, we need to consider the memory space required by each
algorithm.
Analysis of an algorithm is the process of evaluating its problem-solving capability in
terms of the time and space it requires (the amount of memory needed for storage during
execution). However, the main concern of analysis of algorithms is the required
time, or performance. Generally, we perform the following types of analysis −
 Worst-case − the maximum number of steps taken on any instance of size n.
 Best-case − the minimum number of steps taken on any instance of size n.
 Average case − the average number of steps taken over all instances of size n.
 Amortized − the cost of a sequence of operations applied to an input of size n,
averaged over the sequence.
To solve a problem, we need to consider space complexity as well as time complexity,
because the program may run on a system where memory is limited but adequate time is
available, or vice versa. In this context, compare bubble sort and merge sort:
bubble sort does not require significant additional memory, whereas merge sort requires
additional space. Although the time complexity of bubble sort is higher than that of merge
sort, we may still need to use bubble sort if the program has to run in an environment where
memory is very limited, as the sketch below illustrates.
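To see where merge sort's extra space comes from, here is a simplified Python sketch (the name merge_sort is only illustrative): each call copies its two halves and builds a new merged list, so the auxiliary space grows in proportion to the input size, whereas bubble sort only swaps elements within the original array.

def merge_sort(A):
    # Runs in O(n log n) time but allocates new lists while splitting and merging,
    # so it uses O(n) additional space.
    if len(A) <= 1:
        return A
    mid = len(A) // 2
    left = merge_sort(A[:mid])     # slicing copies the halves: the extra space
    right = merge_sort(A[mid:])
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 9, 1, 5, 6]))   # prints [1, 2, 5, 5, 6, 9]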

Asymptotic Notations
Asymptotic notations are the mathematical notations used to describe the
running time of an algorithm when the input tends towards a particular value
or a limiting value.

For example, in bubble sort, when the input array is already sorted, the time
taken by the algorithm is linear, i.e., the best case.

But when the input array is in reverse order, the algorithm takes the
maximum (quadratic) time to sort the elements, i.e., the worst case.

When the input array is neither sorted nor in reverse order, it takes
average time. These durations are described using asymptotic notations.
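The linear best case mentioned above corresponds to the common optimized variant of bubble sort that stops as soon as a pass makes no swaps; a minimal Python sketch of that variant (the name bubble_sort is only illustrative) is:

def bubble_sort(A):
    # Optimized bubble sort: stops as soon as a full pass makes no swaps,
    # so an already-sorted input needs only one pass (linear best case).
    n = len(A)
    for i in range(n - 1):
        swapped = False
        for j in range(n - 1 - i):
            if A[j] > A[j + 1]:
                A[j], A[j + 1] = A[j + 1], A[j]
                swapped = True
        if not swapped:       # no swaps: the array is already sorted
            break
    return A

print(bubble_sort([1, 2, 3, 4, 5]))   # sorted input: finishes after one pass
print(bubble_sort([5, 4, 3, 2, 1]))   # reverse order: worst case, quadratic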

There are mainly three asymptotic notations:

 Big-O notation

 Omega notation

 Theta notation

Big-O Notation (O-notation)


Big-O notation represents the upper bound of the running time of an
algorithm. Thus, it gives the worst-case complexity of an algorithm.
Big-O gives the upper bound of a function
O(g(n)) = { f(n): there exist positive constants c and n0 such that 0 ≤ f(n) ≤ cg(n)
for all n ≥ n0 }
Since it gives an upper bound on the running time of an algorithm, it is widely used to
analyze algorithms, as we are usually most interested in the worst-case scenario.
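For example, the function f(n) = 3n + 2 is in O(n): choosing c = 4 and n0 = 2 gives
0 ≤ 3n + 2 ≤ 4n for all n ≥ 2.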

Omega Notation (Ω-notation)


Omega notation represents the lower bound of the running time of an
algorithm. Thus, it provides the best case complexity of an algorithm.
Omega gives the lower bound of a function
Ω(g(n)) = { f(n): there exist positive constants c and n0 such that 0 ≤ cg(n) ≤ f(n)
for all n ≥ n0 }
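For example, f(n) = 3n + 2 is in Ω(n): choosing c = 3 and n0 = 1 gives
0 ≤ 3n ≤ 3n + 2 for all n ≥ 1.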

Theta Notation (Θ-notation)


Theta notation encloses the function from above and below. Since it
represents both the upper and the lower bound of the running time of an algorithm,
it describes a tight bound and is often used for analyzing the average-case complexity of an algorithm.
Theta bounds the function within constant factors
Θ(g(n)) = { f(n): there exist positive constants c1, c2 and n0 such that 0 ≤ c1 *
g(n) ≤ f(n) ≤ c2 * g(n) for all n ≥ n0 }
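Continuing the example above, f(n) = 3n + 2 is in Θ(n): choosing c1 = 3, c2 = 4 and n0 = 2 gives
0 ≤ 3n ≤ 3n + 2 ≤ 4n for all n ≥ 2.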
