
Name: M. Dawood

Roll No.: CS-1810

Submitted To: Prof. Ihteshaam

Overview of Computers and Programming

Computers

Computers were originally invented in the 1930s–40s to carry out numerical calculations. They
were gradually developed to process all kinds of data, such as numbers, text, and other
types of media. A computer system consists of hardware and software. Hardware is
the equipment used to perform the necessary computations. Software consists
of programs that control the hardware to process data.

Hardware

Major hardware components of a computer include memory, the processing unit, and input/output
devices (monitor, keyboard, mouse, printer, and so on). Memory is the place where
programs and data are stored. It can be imagined as an ordered sequence of storage locations
called memory cells. Each cell has a unique address, which is like a serial number of the cell in
the memory. The data stored in a memory cell are called the contents of the cell. A program is
treated as a special type of data. A memory cell contains a sequence of binary digits, or bits. Each
bit is either a 0 or a 1. All kinds of data are eventually represented by sequences of bits. A
sequence of 8 bits is usually called a byte, which can represent a character, such as the ones on
a keyboard.

In a computer, there are several types of memory. There is a distinction
between main memory and secondary memory: the former is faster and smaller, while the
latter is cheaper and often removable. At present, the former is usually built from silicon
chips, while the latter uses hard disks, flash drives, CDs/DVDs, and so on. There are two types of
main memory: RAM (random access memory) and ROM (read-only memory). Their major
difference is that the contents of RAM can be modified. In the following, "main memory" means
RAM.

In a computer, most of the operations are performed by a CPU (central processing unit),
and some computers have multiple CPUs. At present, a CPU fits on a single integrated
circuit (IC), also called a chip. The CPU follows the instructions contained in a program (written in a
computer-understandable language). In each step, the CPU fetches (i.e., retrieves) an
instruction, interprets its content to decide what to do, and then does it, which may mean
moving data from one place to another or changing data in a certain way. Other common
operations include addition, subtraction, multiplication, division, comparison, and so on. A CPU
usually executes instructions one after another, but it can also jump to another memory cell
according to an instruction.
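The fetch-interpret-execute cycle described above can be sketched as a toy machine in Python; the three-instruction set (LOAD, ADD, HALT) and the accumulator design are invented here purely for illustration:

```python
# A toy machine illustrating the fetch-interpret-execute cycle.
# The instruction set (LOAD, ADD, HALT) is invented for illustration.

def run(memory):
    """Execute the instructions stored in memory and return the result."""
    acc = 0   # accumulator: holds the data being worked on
    pc = 0    # program counter: address of the next instruction
    while pc < len(memory):
        opcode, operand = memory[pc]   # fetch the instruction at address pc
        pc += 1                        # normally proceed to the next cell
        if opcode == "LOAD":           # interpret, then execute
            acc = operand
        elif opcode == "ADD":
            acc += operand
        elif opcode == "HALT":         # stop instead of continuing in sequence
            break
    return acc

# A three-instruction program stored in memory cells 0, 1, 2.
program = [("LOAD", 5), ("ADD", 3), ("HALT", 0)]
print(run(program))  # 8
```

Note how the program counter normally advances one cell at a time, matching the "one after another" execution order described above.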

A computer uses its input/output (I/O) devices to communicate with human users and other
computers. For a human user, the usual input devices are a keyboard and a mouse, and the usual
output devices are a monitor (display screen) and a printer. Human-computer interaction
(HCI) can happen either through a command-line interface or a graphical user interface (GUI).

Software

All software is eventually executed by the CPU in a machine language, in which each line of a
program is an instruction to the hardware. Since programs in this language are not easily
understood by human users, the same program is usually also described in other, more
human-readable languages. One type is assembly language, in which the instructions
are represented by symbols and numbers. Other languages are more human-oriented,
called "high-level languages": they are closer to mathematical languages and natural
languages (such as English), as well as machine-independent.

Typical examples of high-level language include FORTRAN, ALGOL, COBOL, BASIC, Pascal, LISP,
Prolog, Perl, C, C++, and Java.

The translation from high-level languages and assembly languages into machine language is
accomplished by special programs: compilers, interpreters, and assemblers.
A compiler translates a source program in a high-level language into an object program in the
machine language. An interpreter interprets and executes a program in a high-level language
line by line. An assembler translates a source program in an assembly language into
an object program in the machine language.
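Python itself illustrates this division of labor: its runtime first compiles source code into bytecode instructions and then interprets them. As a sketch (the source string and the "<example>" filename label are arbitrary), the standard `dis` module can show the "object program" the compiler produced:

```python
# Inspecting the bytecode that Python's compiler produces from one
# line of source code. The filename label "<example>" is arbitrary.
import dis

source = "result = 2 + 3"
code_obj = compile(source, "<example>", "exec")  # the compile step
opnames = [ins.opname for ins in dis.get_instructions(code_obj)]
print(opnames)  # opcode names such as 'STORE_NAME' (exact list varies by version)
```

Each opcode name corresponds to one low-level instruction the interpreter will later execute, much as an assembler's symbolic instructions correspond to machine instructions.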

A high-level language usually comes with many ready-made common programs, so the user can
include them in programs rather than rewriting them. The program responsible for this is called
a "linker": it links user object programs with related "library programs" and produces
executable programs.
There are software packages called integrated development environments (IDEs) that
organize all the related tools (e.g., editor, compiler, linker, loader, debugger) together to
support software development.

When a program in machine language runs, it typically gets some input data from memory,
processes it according to a predetermined procedure, then stores some output data into
memory and displays some information to the user.
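As a minimal sketch of this input-process-output pattern (the data and the averaging task are invented for illustration):

```python
# Input-process-output: read data, process it, output the results.
def process(numbers):
    """Process step: compute the total and the average of the input."""
    total = sum(numbers)
    return total, total / len(numbers)

data = [10, 20, 30, 40]         # input data (hard-coded for the sketch)
total, average = process(data)  # process step
print(total, average)           # output step: 100 25.0
```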

In a computer, one software product occupies a special position: the operating
system (OS). Most other software products are application software, which are managed and
supported by the OS.

When a computer is turned on, it starts by executing the part of the OS that is stored in ROM,
which then loads the rest of the OS from the hard disk and starts it. This process is called "booting".

When running, an OS has the following main responsibilities:

 communicating with the user,
 allocating resources (CPU time, memory space, printer usage, ...),
 connecting I/O devices with running programs,
 transferring data between main and secondary memory.

In summary, we often say that the OS manages processes and resources.

At present, the most often used OSes include Unix/Linux, Microsoft Windows, and
macOS. It is possible for a computer to have more than one OS stored in its memory, but
usually only one can be used at a time.

Application software uses the computer to accomplish a specific task. It is usually
purchased on CD or DVD, or downloaded from the Internet, and installed on the computer (so
that it is stored in memory and known to the OS) before it can be used.

Software development

Software development, also called programming, is a problem-solving process. It usually consists
of the following major steps:

1. Specify the problem: to state the problem clearly and unambiguously.
2. Analyze the problem: to identify the inputs and corresponding outputs.
3. Design an algorithm: to develop a list of steps, called an algorithm, that will start with
the input and stop with the output.
4. Implement the algorithm: to write a program in a programming language according to
the algorithm.
5. Test the program: to verify that the program indeed produces the desired results in
selected testing cases.
6. Maintain the program: to update the program according to new requirements.

Very often, steps in the above procedure need to be repeated to fix the errors found in the
process.
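The six steps above can be walked through on a deliberately tiny, invented problem, temperature conversion, with steps 4 and 5 appearing as actual code:

```python
# Step 1 (specify): convert a temperature from Celsius to Fahrenheit.
# Step 2 (analyze): the input is a number c; the output is c * 9/5 + 32.
# Step 3 (design):  multiply by 9/5, then add 32.

# Step 4 (implement):
def celsius_to_fahrenheit(c):
    return c * 9 / 5 + 32

# Step 5 (test): verify the program on selected test cases.
assert celsius_to_fahrenheit(0) == 32.0
assert celsius_to_fahrenheit(100) == 212.0
print("all tests passed")

# Step 6 (maintain): update the function if requirements change,
# e.g. adding a Kelvin conversion later.
```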

Basic algorithms and problem solving

There are many ways to write an algorithm. Some are very informal, some are quite formal and
mathematical in nature, and some are quite graphical. The instructions for connecting a DVD
player to a television are an algorithm. A mathematical formula such as πR² is a special case of
an algorithm. The form is not particularly important as long as it provides a good way to
describe and check the logic of the plan.
The development of an algorithm (a plan) is a key step in solving a problem. Once we have an
algorithm, we can translate it into a computer program in some programming language. Our
algorithm development process consists of five major steps.
Step 1: Obtain a description of the problem. This step is much more difficult than it appears.
Step 2: Analyze the problem.
Step 3: Develop a high-level algorithm.
Step 4: Refine the algorithm by adding more detail.
Step 5: Review the algorithm.

Step 1: Obtain a description of the problem.


This step is much more difficult than it appears. In the following discussion, the
word client refers to someone who wants to find a solution to a problem, and the
word developer refers to someone who finds a way to solve the problem. The developer must
create an algorithm that will solve the client's problem.

Step 2: Analyze the problem.


The purpose of this step is to determine both the starting and ending points for solving the
problem. This process is analogous to a mathematician determining what is given and what
must be proven. A good problem description makes it easier to perform this step.
When determining the starting point, we should start by seeking answers to the following
questions:
 What data are available?
 Where is that data?
 What formulas pertain to the problem?
 What rules exist for working with the data?
 What relationships exist among the data values?

Step 3: Develop a high-level algorithm.


An algorithm is a plan for solving a problem, but plans come in several levels of detail. It's
usually better to start with a high-level algorithm that includes the major part of a solution, but
leaves the details until later. We can use an everyday example to demonstrate a high-level
algorithm.
Problem: I need to send a birthday card to my brother, Mark.
Analysis: I don't have a card. I prefer to buy a card rather than make one myself.
High-level algorithm:
Go to a store that sells greeting cards
Select a card
Purchase a card
Mail the card

Step 4: Refine the algorithm by adding more detail.


A high-level algorithm shows the major steps that need to be followed to solve a problem. Now
we need to add details to these steps, but how much detail should we add? Unfortunately, the
answer to this question depends on the situation. We have to consider who (or what) is going
to implement the algorithm and how much that person (or thing) already knows how to do. If
someone is going to purchase Mark's birthday card on my behalf, my instructions have to be
adapted to whether or not that person is familiar with the stores in the community and how
well the purchaser knows my brother's taste in greeting cards.

Step 5: Review the algorithm.


The final step is to review the algorithm. What are we looking for? First, we need to work
through the algorithm step by step to determine whether or not it will solve the original
problem. Once we are satisfied that the algorithm does provide a solution to the problem, we
start to look for other things. The following questions are typical of ones that should be asked
whenever we review an algorithm. Asking these questions and seeking their answers is a good
way to develop skills that can be applied to the next problem.
 Does this algorithm solve a very specific problem or does it solve a more general problem?
If it solves a very specific problem, should it be generalized?
For example, an algorithm that computes the area of a circle having radius 5.2 meters
(formula π × 5.2²) solves a very specific problem, but an algorithm that computes the area of
any circle (formula π × R²) solves a more general problem.
 Can this algorithm be simplified?
One formula for computing the perimeter of a rectangle is:

length + width + length + width

A simpler formula would be:

2.0 * (length + width)
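Both observations can be checked mechanically; the sketch below (the function names are ours) verifies that the simplified perimeter formula agrees with the long form, and generalizes the circle-area computation to any radius:

```python
import math

def area(radius):
    """General algorithm: area of any circle (formula pi * R**2)."""
    return math.pi * radius ** 2

def perimeter(length, width):
    """Simplified rectangle perimeter: 2.0 * (length + width)."""
    return 2.0 * (length + width)

# The simplified formula agrees with the long form...
assert perimeter(3, 4) == 3 + 4 + 3 + 4
# ...and the general area algorithm also covers the radius-5.2 case.
print(round(area(5.2), 2))  # 84.95
```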

Development of algorithm

The development of an algorithm (a plan) is a key step in solving a problem. Once we have
an algorithm, we can translate it into a computer program in some programming language.

Analysis of algorithm

In the theoretical analysis of algorithms, it is common to estimate their complexity in the
asymptotic sense, i.e., to estimate the complexity function for arbitrarily large input. The
term "analysis of algorithms" was coined by Donald Knuth. Algorithm analysis is an important
part of computational complexity theory, which provides theoretical estimates of the
resources an algorithm requires to solve a specific computational problem. Most algorithms
are designed to work with inputs of arbitrary length. Analysis of algorithms is the
determination of the amount of time and space resources required to execute an algorithm.
Usually, the efficiency or running time of an algorithm is stated as a function relating the
input length to the number of steps, known as time complexity, or to the volume of memory,
known as space complexity.
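Time complexity as "a function relating the input length to the number of steps" can be made concrete by instrumenting an algorithm to count its steps. The sketch below uses linear search, whose worst-case step count equals the input length:

```python
# Counting steps to estimate time complexity: linear search inspects
# up to n elements, so its worst-case running time grows linearly.
def linear_search(items, target):
    """Return (index, steps); index is -1 if target is absent."""
    steps = 0
    for i, value in enumerate(items):
        steps += 1
        if value == target:
            return i, steps
    return -1, steps

# Worst case (target absent): every element is inspected.
_, steps_10 = linear_search(list(range(10)), -1)
_, steps_100 = linear_search(list(range(100)), -1)
print(steps_10, steps_100)  # 10 100
```

Doubling the input length doubles the worst-case step count, which is exactly what "linear time complexity" expresses.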

The Need for Analysis


In this chapter, we will discuss the need for analysis of algorithms and how to choose a better
algorithm for a particular problem, since one computational problem can be solved by different
algorithms. By considering an algorithm for a specific problem, we can begin to develop
pattern recognition, so that similar types of problems can be solved with the help of this
algorithm. Algorithms are often quite different from one another, even though their objective
is the same. For example, we know that a set of numbers can be sorted
using different algorithms. The number of comparisons performed by one algorithm may differ
from that of another for the same input; hence, the time complexity of those algorithms may
differ. At the same time, we need to calculate the memory space required by each algorithm.
Analysis of an algorithm is the process of analyzing its problem-solving capability in terms of
the time and size required (the amount of memory for storage during implementation).
However, the main concern of analysis of algorithms is the required time, or performance.
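The point that the number of comparisons may vary between algorithms for the same input can be demonstrated directly; the sketch below instruments two simple sorts (the implementations are ours) to count their comparisons:

```python
# Two algorithms sorting the same input can perform different numbers
# of comparisons, and hence have different running costs in practice.
def bubble_sort_comparisons(items):
    """Bubble sort; returns (sorted list, number of comparisons)."""
    a, comparisons = list(items), 0
    for i in range(len(a)):
        for j in range(len(a) - 1 - i):
            comparisons += 1
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return a, comparisons

def insertion_sort_comparisons(items):
    """Insertion sort; returns (sorted list, number of comparisons)."""
    a, comparisons = list(items), 0
    for i in range(1, len(a)):
        j = i
        while j > 0:
            comparisons += 1
            if a[j - 1] > a[j]:
                a[j - 1], a[j] = a[j], a[j - 1]
                j -= 1
            else:
                break  # already in place; stop scanning leftward
    return a, comparisons

data = [5, 1, 4, 2, 3]
print(bubble_sort_comparisons(data)[1], insertion_sort_comparisons(data)[1])  # 10 9
```

Both produce the same sorted output, yet on this input insertion sort needs fewer comparisons, which is exactly the kind of difference algorithm analysis quantifies.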

Design of algorithm
An algorithm is a series of instructions often referred to as a “process,” which is to be followed
when solving a particular problem. While technically not restricted by definition, the word is
almost invariably associated with computers, since computer-processed algorithms can tackle
much larger problems than a human, much more quickly. Since modern computing uses
algorithms much more frequently than at any other point in human history, a field has grown
up around their design, analysis, and refinement. The field of algorithm design requires a strong
mathematical background, with computer science degrees being particularly sought-after
qualifications. It offers a growing number of highly compensated career options, as the need for
more (as well as more sophisticated) algorithms continues to increase.

Conceptual Design

At their simplest level, algorithms are fundamentally just a set of instructions required to
complete a task. The development of algorithms, though they generally weren’t called that, has
been a popular habit and a professional pursuit for all of recorded history. Long before the
dawn of the modern computer age, people established set routines for how they would go
about their daily tasks, often writing down lists of steps to take to accomplish important goals,
reducing the risk of forgetting something important. This, essentially, is what an algorithm is.
Designers take a similar approach to the development of algorithms for computational
purposes: first, they look at a problem. Then, they outline the steps that would be required to
resolve it. Finally, they develop a series of mathematical operations to accomplish those steps.

Testing Algorithm

This section provides an introduction to software testing and the testing of Artificial Intelligence
algorithms. It introduces software testing and focuses on a type of testing relevant to
algorithms called unit testing, provides a specific example of an algorithm with a prepared
suite of unit tests, and offers some rules of thumb for testing algorithms in general.

Software Testing

Software testing in the field of Software Engineering is a process in the life-cycle of a software
project that verifies that the product or service meets quality expectations and validates that
software meets the requirements specification. Software testing is intended to locate defects in
a program, although a given testing method cannot guarantee to locate all defects. As such, it is
common for an application to be subjected to a range of testing methodologies throughout the
software life-cycle, such as unit testing during development, integration testing once modules
and systems are completed, and user acceptance testing to allow the stakeholders to
determine if their needs have been met.

Unit testing is a type of software testing that involves the preparation of well-defined
procedural tests of discrete functionality of a program, providing confidence that a module or
function behaves as intended. Unit tests are referred to as 'white-box' tests (contrasted with
'black-box' tests) because they are written with full knowledge of the internal structure of the
functions and modules under test. Unit tests are typically prepared by the developer who
wrote the code under test and are commonly automated, themselves written as small
programs that are executed by a unit testing framework (such as JUnit for Java or the Test
framework in Ruby). The objective is not to test every path of execution within a unit (called
complete-test or complete-code coverage), but instead to focus tests on areas of risk,
uncertainty, or criticality. Each test focuses on one aspect of the code (test one thing), and
tests are commonly organized into suites by commonality.
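A minimal unit test in the style described, written with Python's unittest framework (analogous to the JUnit example above); the `clamp` function and its test cases are invented for illustration:

```python
import unittest

def clamp(value, low, high):
    """Function under test: restrict value to the range [low, high]."""
    return max(low, min(value, high))

class ClampTest(unittest.TestCase):
    """Each method checks one aspect of clamp ("test one thing")."""

    def test_within_range(self):
        self.assertEqual(clamp(5, 0, 10), 5)

    def test_below_range(self):
        self.assertEqual(clamp(-3, 0, 10), 0)

    def test_above_range(self):
        self.assertEqual(clamp(42, 0, 10), 10)

if __name__ == "__main__":
    # argv/exit arguments keep the runner self-contained in a script.
    unittest.main(argv=["example"], exit=False)
```

Because the tests know `clamp`'s internal logic (the min/max structure), they are white-box tests, and the whole suite can be rerun automatically after any change as a regression check.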

Some of the benefits of unit testing include:

 Documentation: The preparation of a suite of tests for a given system provides a type of
programming documentation, highlighting the expected behavior of functions and
modules and providing examples of how to interact with key components.
 Readability: Unit testing encourages a programming style of small modules with clear
inputs and outputs and fewer inter-component dependencies. Code written for ease of
testing (testability) may be easier to read and follow.
 Regression: Together, the suite of tests can be executed as a regression test of the
system. The automation of the tests means that any defects caused by changes to the
code can easily be identified. When a defect that slipped through is found, a new test
can be written to ensure it will be identified in the future.
