Module 1 Study Guide
I. INTRODUCTION
Programming is the art and science of writing instructions, composed of arithmetic and logical
operations, that a computer can follow to accomplish certain tasks. These tasks can be as trivial as
playing a game or surfing the web, or as critical as controlling the processes in an industrial plant.
Before we can learn how to write programs, however, we should understand what a program is and
how a computer understands and executes it. This module discusses the definition of computers
and traces their development from the earliest calculating machines up to the computers we use
today. Concurrent with the evolution of computers is the development of programming languages
that cater to the growing and ever-changing needs of society and of computer-driven technologies
such as the internet, which makes information immediately accessible at the touch of a button.
At the end of this module, you should be able to:
1. identify the important people and milestones in the evolution of computers and
programming;
2. define terms associated with the computer and the internet; and
3. discuss the impact computers have in today’s setting and what we can expect them to
offer in the future.
You can download the PDF version of the Module 1 study guide here.
In order to trace the history of computers, we must first define what it is and what it does. This
would allow us to correctly associate the relevant inventions that led to the development of modern
day computers.
Dey (2013) cites the Oxford Dictionary for the definition of a computer as “an automatic electronic
apparatus for making calculations or controlling operations that are expressible in numerical or
logical terms”.
Wassberg (2020) points out an example of a machine that does not fit this definition: “we can rule
out the Jacquard machine, which was the automated loom invented in the early years of the 19th
century. These looms could be programmed using punch cards, but they produced woven silk,
which, of course, is not the result of an arithmetic or logical operation”. Although Jacquard’s
invention does not fit the definition of a computer, his idea of using punch cards was borrowed by
Babbage, who is considered the father of the computer.
The Oxford definition clearly categorizes the computer as an electronic apparatus although Dey
(2013) emphasized that the first computers were actually mechanical and electro-mechanical
apparatuses. He pointed out that the definition also highlights “the two major areas of computer
application: data processing and computer-assisted controls or operations”.
Dey (2013) further describes the computer as a data processor. “It can accept input, which may be
either data or instructions or both. The computer remembers the input by storing it in memory
cells. It then processes the stored input by performing calculations or by making logical
comparisons or both. It gives out the result of the arithmetic or logical computations as output
information. The computer accepts input and outputs data in an alphanumeric form. Internally it
converts the input data to meaningful binary digits, performs the instructed operations on the
binary data, and transforms the data from binary digit form to understandable alphanumeric form”.
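The input-process-output cycle Dey describes can be sketched in a few lines of Python. This is only a toy illustration of the idea, of course, not of how real hardware handles binary data:

```python
# A toy sketch of Dey's data-processor model: accept alphanumeric input,
# represent it internally as binary digits, perform the instructed
# operation on the binary data, and output the result in readable form.

a_text, b_text = "6", "7"                  # input in alphanumeric (string) form

a_bits = format(int(a_text), "b")          # internal binary form: '110'
b_bits = format(int(b_text), "b")          # internal binary form: '111'

result = int(a_bits, 2) * int(b_bits, 2)   # arithmetic on the binary data

print(str(result))                         # output in alphanumeric form: 42
```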
Wassberg (2020) describes a program as “a set of instructions that the computer can execute”
and that these programs are written by a programmer using one or more programming languages.
We will discuss how exactly a program works and the parts of the computer involved in the
process in the next module.
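As a minimal illustration of Wassberg’s definition, the short Python program below is nothing more than a sequence of instructions that the computer executes in order (the data values are arbitrary examples):

```python
# A program is simply a set of instructions the computer can execute,
# written by a programmer in a programming language (here, Python).

numbers = [3, 1, 4, 1, 5]   # instruction: store data in memory
total = sum(numbers)        # instruction: perform an arithmetic operation
is_large = total > 10       # instruction: make a logical comparison
print(total, is_large)      # instruction: output the result
```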
Reflection question:
Now that we have defined what a computer is and what it does, in this era of smart devices can we
consider the following as computers: a smartphone? A Roomba? How about a 3D printer?
TOPIC 1: Evolution of Computers
The previous topic pointed out that a computer is a programmable electronic machine that performs
arithmetic or logical operations. This definition alludes to the use of electricity and electronic
devices; hence such machines are classified under the electrical era. The predecessors of
electrical-era computers were mechanical devices that were programmable and performed mostly
arithmetic calculations.
As a side note, Wikipedia also offers the following definition: “A computer is a machine that can be
instructed to carry out sequences of arithmetic or logical operations automatically via computer
programming.” Note how it left out the word “electronic”.
Way before the groundbreaking inventions of the early pioneers of mechanical computers,
humans needed tools to perform simple mathematical calculations. Das (2018) credited the
abacus, made in Babylonia around 2000 B.C., as the device that marked the beginning of the
development of the computer. He writes that the abacus “comprised a number of rings or beads
that could be moved along rods enclosed by a wooden frame. This calculator could be used only
for addition and subtraction, which were adequate for the age”. He continues that “the next major
leap in computing occurred with the invention of the logarithm by John Napier in the early 17th
century. Napier established that multiplication and division could be achieved by adding and
subtracting, respectively, the logarithm of the numbers. Napier’s invention soon led to the
development of the slide rule. The 17th century also saw the emergence of the gear-based digital
mechanical calculator. (The slide rule is actually an analog device.) Blaise Pascal was the first to
create one but it could perform only addition and subtraction. The stepped reckoner developed by
Gottfried Leibniz improved upon Pascal’s creation by adding the features of multiplication and
division. Both devices were based on the decimal system, but Leibniz recommended the use of
the binary system, which was to be adopted centuries later”. It is important to note that these
devices lacked the one characteristic that would qualify them as computers: they were not
programmable.
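Napier’s insight quoted above, that multiplication and division can be reduced to adding and subtracting logarithms, is easy to verify with a few lines of Python (a modern sketch of the principle, not of how Napier’s tables or the slide rule were actually used):

```python
import math

a, b = 7.0, 12.0

# log(a*b) = log(a) + log(b): multiply by adding logarithms
product = math.exp(math.log(a) + math.log(b))

# log(a/b) = log(a) - log(b): divide by subtracting logarithms
quotient = math.exp(math.log(a) - math.log(b))

print(round(product, 6))    # 84.0
print(round(quotient, 6))   # 0.583333
```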
While there are a number of ways to classify computers, we will rely on the definition in the
previous topic as our basis for identifying which machines are “computers”. It should be noted that
computers can be classified as analog or digital. Wassberg (2020) explains that “the difference
between an analog and a digital computer is that the former is a mechanical machine that works
with an analog input such as voltage, temperature, or pressure. In comparison, a digital computer
works with input that can be represented by numbers”. As we trace the history of computers, we
will see that it was not until World War II that real digital computers were born.
In this topic we break down the discussion of the history of programmable computers into the
mechanical era and electrical era. In the latter, computers are further classified according to
generation. The generation a computer belongs to is determined by the technology it uses.
Learning Activity 1
Read 1: 1.2 Evolution of Computers and 1.3 Generations of Computers (pages 1-3)
A. Mechanical Era
The video in the link discusses some of the early mechanical-era and electrical-era computers.
Instructor’s Remarks: From the video, we can see that modern computers owe their existence to
the work of inventors such as Blaise Pascal with his Arithmetic Engine, Charles Babbage (and
Augusta Ada Byron, a.k.a. Ada Lovelace, generally accepted as the first programmer) with the
Difference Engine and Analytical Engine, and Konrad Zuse with his electromechanical machine, the
Z3.
B. Electrical Era
Electrical-era computers are grouped according to the technology they use. Currently there are five
generations of computers. To learn more about the key features and an in-depth discussion of each
computer generation, please read Section 1.4 of Computer Fundamentals & C Programming by
Das, S. (2018).
Instructor’s remarks:
Das (2018) sums it up perfectly when he notes that each generation of computers is “faster,
smaller and more powerful than its counterpart of the preceding generation. Consequently,
advances in hardware have also resulted in the development of powerful and user-friendly
programming languages”.
Watch 2: The next video discusses the characteristics of modern computers and their evolution.
Instructor’s Remarks: The video first discusses the characteristics that distinguish computers from
other machines: speed, accuracy, versatility, diligence, and high storage capacity.
The video also identifies the computer’s limitations: no heuristics and no IQ.
It presents the five generations of computers, each classified by the technology it uses, and
discusses at length the advantages and disadvantages of each one.
Reflection Questions:
In the previous topic, we saw how computers evolved from big, clunky mechanical calculators into
electrical-era computers that can now easily fit in our pockets. In the early days of computing,
computers were relatively few, which meant people had to travel to wherever the computers were
in order to use them. The engineers and scientists working on these computers sought ways to
address this inconvenience, and this likely brought about the creation of the internet.
The concept of the internet may actually have been imagined much earlier than modern
computers, as Andrews (2019) noted that “Nikola Tesla toyed with the idea of a ‘world wireless
system’ in the early 1900s, and visionary thinkers like Paul Otlet and Vannevar Bush conceived of
mechanized, searchable storage systems of books and media in the 1930s and 1940s”. Today we
use computers and devices that are most likely connected to the internet, which allows us to
connect to other computers over vast distances and to acquire information on the web.
In this module, we will learn that the internet was not invented by one person; rather, it is the
product of the research and work of many scientists and engineers, and it grew along with
advancements in digital technology. The module will also discuss a few relevant terms associated
with networking and the internet.
Learning Activity 2
Read 4: This website details the timeline of the internet. View the important milestones in each
decade by clicking on the tabs.
Read 5: The website in the following link lists the common terminologies related to the internet.
Top 20 Internet Terms for Beginners: https://www.lifewire.com/top-internet-terms-for-beginners-2483381
Watch 3: The video in the link was created by CERN and explains the history of the World Wide
Web, created by Tim Berners-Lee after he joined the institution in 1980.
Reflection Questions:
What is the difference between the internet and the world wide web?
Short Quiz
You have twelve (12) minutes to complete this 10-question multiple-choice quiz.
1. Read the instructions carefully, and then click “Take the Quiz.”
2. Complete the quiz according to instructions.
3. Click “Next” to move from one question to another.
4. Once through with all the questions, click “Submit Quiz”.
5. You will need to get 60% score to pass the quiz and unlock the next module.
6. You may attempt the quiz three (3) times. Your highest score will be recorded.
Quiz results are protected for this quiz and correct answers are not visible to you until the quiz
closes.
Good luck!
Assignment Guide: Assignment 1
Task
Drawing from the module reading materials, videos, quiz and your own research, submit a one-page
synthesis paper that covers the following topic:
How the computer/internet can change how we live in the future. Name one aspect of our lives
today (e.g. transportation, medicine, food) and explain in your own words how it will be heavily
influenced, improved, or controlled by computers or the internet.
Specific Guidelines
Synthesis writing is basically combining the ideas of more than one source with your own and
explaining them in your own words. In your paper:
Report information from the sources using different phrases and sentences;
Organize so that readers can immediately see where information from the sources overlaps;
Make sense of the sources and help the reader understand them in greater depth.
Rubrics
Below is the rubric that will be used to rate the output you submit. Scoring is based on the extent
to which the given criteria are displayed/exhibited. Please take note of the different areas and
their corresponding weights. You can be rated with one of only four scores per area, and this score
is multiplied by the corresponding weight.
Supporting Details
1 point: Essay includes no examples to support the student’s viewpoint.
2 points: Essay includes a few examples to support the student’s viewpoint.
3 points: Essay includes some examples to support the student’s viewpoint.
4 points: Essay includes numerous examples to support the student’s viewpoint.

Demonstrated Understanding of the Assignment
1 point: Essay demonstrates little understanding of the assignment by synthesizing concepts from class discussions.
2 points: Essay demonstrates some understanding of the assignment by synthesizing concepts from class discussions.
3 points: Essay demonstrates a good understanding of the assignment by synthesizing concepts from class discussions.
4 points: Essay demonstrates a thorough understanding of the assignment by synthesizing concepts from class discussions.

Format, Grammar, Spelling, Mechanics, and Sentence Structure
1 point: Format not followed; not college-level writing; essay has more than two grammatical or spelling errors.
2 points: Format not followed; essay is adequate; maximum of two grammatical or spelling errors.
3 points: Format followed; essay is highly polished; maximum of one grammatical or spelling error.
4 points: Format followed; essay is highly polished; no grammatical or spelling errors.
The instructor may provide holistic feedback on the exceptional practice(s) you have demonstrated
in the output and on what else you can do or incorporate to improve it in the future.
Now that we have learned about computers and the internet, can you share some interesting trivia
related to computers, programming or the internet? You can also share useful tips and tricks in using
the computer or the internet.
Examples:
About computer bugs: Did you know that the first actual case of a bug being found was recorded
by the brainiacs at Harvard in 1945? The engineers who found the moth were the first to literally
"debug" a machine.
Source: https://www.nationalgeographic.org/thisday/sep9/worlds-first-computer-bug/
When searching the web, prefix a phrase with the – (minus sign) to hide web pages containing that phrase from the results.
Source: https://www.techsettle.com/best-internet-tricks-and-hacks/
1. Respect everyone. Express your thoughts properly and argue constructively.
2. You are required to post at least once in this discussion forum to mark this activity as
completed.
3. Interact as much as possible. Do not hesitate to ask questions or reply to the posts.
4. Posts with substantive contents are highly encouraged, such as the following:
posting a question about the module
sharing an additional resource or insight backed with a reliable reference
replying to a post, either agreeing and then adding substantive content, or respectfully
disagreeing with the original post backed with reference(s)
5. Posts that contain only affirmation or gratitude, e.g. "thank you for this post", "I agree", are
always welcome to foster a friendly learning environment.
6. Each student must create a discussion and must reply to at least 3 discussions.
Module Feedback
Congratulations on reaching this far into the module! Kindly tell me about your learning experience
with Module 1 by answering the module feedback on the eLearn course site. You can also reach the
form through this link.
V. SUMMARY
A computer is defined as ‘an automatic electronic apparatus for making calculations or
controlling operations that are expressible in numerical or logical terms’.
Starting from the days of the abacus, the concept of developing a computing machine has
led to the development of the modern electronic computing machine.
There are five generations of computers, classified according to the technology they use,
from vacuum tubes and transistors to integrated circuits and microprocessors.
Hardware changes went hand-in-hand with the development of programming languages.
A program is a set of instructions that the computer can execute.
The internet is the networking infrastructure that connects devices together.
The World Wide Web is a way of accessing information through the medium of the internet.
VI. REFERENCES
Dey, P., & Ghosh, M. (2013). Computer Fundamentals and Programming in C (2nd ed.). New
Delhi, India: Oxford University Press.
Wassberg, J. (2020). Computer Programming for Absolute Beginners. Birmingham, UK: Packt
Publishing Ltd.
Das, S. (2018). Computer Fundamentals & C Programming. Chennai, India: McGraw Hill
Education (India) Private Limited.
Science + Media Museum (2020, December 3). A Short History of the Internet. Science + Media
Museum. Accessed: https://www.scienceandmediamuseum.org.uk/objects-and-stories/short-
history-internet
Andrews, E. (updated 2019, October 28). Who Invented the Internet? History. Accessed:
https://www.history.com/news/who-invented-the-internet
Gil, P. (updated 2021, June 30). Top 20 Internet Terms for Beginners. Lifewire. Accessed:
https://www.lifewire.com/top-internet-terms-for-beginners-2483381
CERN. (2019, March 8). A brief history of the World Wide Web. [Video]. Accessed:
https://www.youtube.com/watch?v=sSqZ_hJu9zA
Computer History Museum. (2021). INTERNET HISTORY 1962 TO 1992. Computer History
Museum. Accessed: https://www.computerhistory.org/internethistory/