COA MIDTERMS REVIEWER


HISTORY
Prehistoric calculating machines: Fingers, stones
Antikythera mechanism: earliest known mechanical calculating machine (an ancient Greek astronomical calculator)
1642: Blaise Pascal, 19, invented a gear-based machine for addition and subtraction
1672: Leibniz added multiplication and division to the machine
Charles Babbage (1791 - 1871) created the Difference Engine (addition and subtraction only) and hired the first
programmer, Ada Lovelace, daughter of Lord Byron. The Ada programming language is named after her.
Babbage’s Analytical Engine had 4 parts: 1) Store (memory), 2) mill (CPU), 3) input (card reader), 4) output (card
punch)
The Analytical Engine was the first fully programmable mechanical computer: it read instructions from punched
cards and executed them. Designed by Babbage; Ada Lovelace wrote programs for it.
First Electronic Computer
Similar to Babbage’s Analytical Engine
Konrad Zuse built the electromechanical calculating machine Z1 in 1936 using telephone relays and punched
tape; it was destroyed in WW2. The Z2 followed in 1939 and the Z3, a working programmable machine, in 1941.
Zuse founded the world's first computer startup company.
John Atanasoff invented a machine called ABC (Atanasoff-Berry Computer) with binary arithmetic and capacitor-
based memory (DRAM!) in 1939.
George Stibitz's Complex Number Calculator worked in 1940. Howard Aiken rebuilt Babbage's Analytical Engine
design using electromechanical relays in 1944 (the Harvard Mark I).
Von Neumann Computer: John von Neumann was one of the world's foremost mathematicians.
1945: EDVAC proposal for storing the program along with the data in memory, not on punched tape.
1952: IAS completed at Princeton: the stored program computer.
The architecture for almost all modern machines follows the IAS Design
Birmingham’s KDF-9
- 8K 48-bit words of main memory
- 8 µs (≈125 kHz) cycle time
- 2 magnetic tape decks and paper-tape I/O; no printers, disks, or other I/O devices
- First upgrade: machine envy set in by 1968. The machine was then upgraded with a 4M-word disk weighing
over a ton, a punched-card reader and punch, and a high-speed printer.
- Birmingham’s First Mini. The KDF-9 was later replaced by a PDP-8 minicomputer.
The Integrated Circuit: Third Generation Revolution
- 2G computers used thousands of discrete transistors
- Jack Kilby and others invented the integrated circuit (IC), which put many transistors on a single
microchip.
- This led to cheaper computers in the 1970s
- The IBM 360 became a workhorse for both business and scientific computing
- The 360 allowed multiprogramming

Very Large Scale Integrated Circuit

- IC-based minis were still large, cupboard-sized
- VLSI technology pushed the number of components per chip up from 100s or 1000s to 10,000s and
eventually 10,000,000s.
- This sparked the home computer revolution of microcomputers (Zilog Z80, Intel 8080, MOS Technology 6502).
- Early home computers cost one or two hundred pounds and used a TV as monitor
- Mainly used for word-processing, spreadsheets, and games.
IBM PC arrived in 1981.
- Intel 8088 CPU (16-bit internally, with an 8-bit external bus)
- MS-DOS operating system
- Windows came much later (after Apple introduced the Lisa with a GUI, then Mac)
- IBM made design public so others could build add-ons
- Clones were sold at lower cost
- Microsoft and Intel eventually gained control of the market created by IBM
Modern VLSI Microprocessors
- 2G machines of the 60s had 1000s of transistors
- Early VLSI chips had a few 10,000s of transistors
- A modern AMD Athlon FX has 105,000,000 transistors on one microchip of area 1.4 cm sq
Scales
- Modern processors are about 1cm x 1cm in size
- Transistors are about 0.1 µm across, so 100,000 fit along one edge of a processor chip.
- If a chip were the size of Birmingham, a transistor would be the size of a sheet of A4 paper
- Original transistors were about the same size as whole microchips are now.

Moore’s Law
- Transistor density doubles every 18-24 months
- This has held true over the last 35 years
- This means transistor size decreases by 30% every 18-24 months, or 15-20% every year
- CPU speed is inversely proportional to the length of the circuit paths so CPU speeds increased by 10%
or more every year
- Eventually physics means we will need something else. Quantum effects interfere with reliability at
small scales.
- Quantum computing is possible but very very far off. More likely first to move to photonics from
electronics.
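The arithmetic behind the 30% figure can be sketched quickly. This is an illustrative calculation only, assuming a 2-year doubling period: if density doubles, each transistor's area halves, so its linear size shrinks by a factor of 1/√2.

```python
import math

# If transistor density doubles, each transistor's area halves, so its
# linear size shrinks by 1/sqrt(2) ~ 0.71 -- roughly the 30% decrease
# per doubling quoted above.
shrink_per_doubling = 1 - 1 / math.sqrt(2)
print(f"size shrink per doubling: {shrink_per_doubling:.0%}")  # ~29%

def density_growth(years, doubling_period=2.0):
    """Density multiplier after the given number of years."""
    return 2 ** (years / doubling_period)

print(f"density growth over 10 years: {density_growth(10):.0f}x")  # 32x
```

Over a decade at that rate, density grows by 2^5 = 32x, which matches the historical trend the notes describe.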
COMPUTER ORGANIZATION AND ARCHITECTURE
Computer Organization and Architecture is a field of study that delves into the internal workings of computers,
encompassing both their visible attributes to programmers (Computer Architecture) and the operational units
along with their interconnections (Computer Organization).
Terminologies:
Computer Architecture: This refers to the attributes of a computer system that are visible to the
programmer. It deals with the design decisions regarding instruction sets, addressing modes, data types,
and more, which influence programming techniques and optimizations.
Computer Organization: This pertains to the operational units and their interconnections within a
computer system. It deals with the physical components and their arrangement, including memory,
registers, ALU (Arithmetic Logic Unit), control units, and I/O interfaces.
Examples:
The distinction between architecture and organization can be illustrated by considering whether a computer
system has a multiply instruction (architecture) and how the multiplication operation is executed, either through a
specialized multiply unit or via repeated use of the add unit (organization).
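The multiply example above can be made concrete with a small sketch. Both functions below realize the same architectural operation (multiply); the difference between them is purely organizational. The function names are invented for illustration, and real hardware would of course implement this in logic rather than Python.

```python
# Two "organizations" of the same architectural multiply operation.

def multiply_dedicated(a: int, b: int) -> int:
    """Stands in for a dedicated hardware multiply unit."""
    return a * b

def multiply_by_repeated_add(a: int, b: int) -> int:
    """Reuses the add unit b times instead of a multiplier circuit."""
    result = 0
    for _ in range(b):
        result += a
    return result

# The programmer sees the same result either way:
assert multiply_dedicated(6, 7) == multiply_by_repeated_add(6, 7) == 42
```

A programmer writing against the instruction set cannot tell which organization is underneath; only speed and cost differ.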
Hierarchy of Structure and Function:
Definition:
In computer systems, the hierarchical nature is crucial for both design and description, with subsystems
interrelated and organized in a hierarchical manner.
Terminologies:
Structure: This refers to the way components are interrelated within a system.
Function: It signifies the operation of individual components within the structure.
Function:
The fundamental functions of a computer system include processing data, storing data (both short-term and long-
term), moving data between the system and external devices, and controlling these operations.
Data Movement:
Definition:
Data movement involves transferring data within a computer system or between the system and external
devices.
Terminologies:
Input-Output (I/O): Refers to the process of moving data between the computer and its peripherals.
Data Communications: This involves the transfer of data between the computer and remote devices.
Basic Functions:
The primary functions of a computer include data movement, data storage, and data processing, with operations
categorized into various types such as processing from/to storage and I/O operations.
Structure at Different Levels:
Definition:
Computer structure encompasses various levels, from the top-level system architecture to the internal
components within the CPU.
Terminologies:
Central Processing Unit (CPU): This component controls the computer's operation, executes
instructions, and houses the ALU, control unit, and registers.
Main Memory: It serves as a storage location for data and instructions.
I/O: Handles data transfer between the computer and external devices.
System Interconnection: This facilitates communication among CPU, memory, and I/O devices,
commonly achieved through a system bus.
Structure within the CPU:
The CPU consists of major components like the control unit, ALU, registers, and interconnection mechanisms,
which collectively execute instructions and perform data processing operations.
Review of Chapter 2: Central Processing Unit
Terminologies and Definitions:
Central Processing Unit (CPU):
The CPU is the core component of a digital computer responsible for executing instructions
fetched from memory, performing arithmetic and logic operations, and controlling data flow.
Processor Unit:
This part of the CPU comprises the Arithmetic Logic Unit (ALU), registers, and internal
buses, facilitating data transfer between registers and the ALU.
Control Unit:
The control unit, a crucial element of the CPU, coordinates instruction execution, manages
data movement between memory and the ALU, and controls the operation of the ALU and
other components. It interprets instructions and directs the necessary data flow.
Registers:
Registers serve as temporary storage locations within the CPU, facilitating quick access to
data for processing. They include user-visible registers and control/status registers, each
serving specific functions within the CPU's operation.
Instruction Cycle:
The basic instruction cycle comprises fetch, execute, and potentially interrupt sub-cycles,
ensuring the sequential execution of instructions stored in memory.
Arithmetic and Logic Unit (ALU):
The ALU is a combinational circuit within the CPU responsible for performing arithmetic and
logical operations on data. It operates based on simple digital logic devices and includes
arithmetic and logical sections.
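The ALU's role as a combinational circuit can be sketched as a pure function: control signals select an operation, and the result depends only on the current inputs. The opcode names below are invented for illustration, not taken from any real instruction set.

```python
# Minimal ALU sketch: an opcode selects an arithmetic or logical
# operation applied to two inputs. No state is kept -- like a
# combinational circuit, the output depends only on the inputs.

def alu(op: str, a: int, b: int) -> int:
    if op == "ADD":
        return a + b          # arithmetic section
    if op == "SUB":
        return a - b
    if op == "AND":
        return a & b          # logical section
    if op == "OR":
        return a | b
    raise ValueError(f"unknown ALU op {op!r}")

assert alu("ADD", 5, 3) == 8
assert alu("AND", 0b1100, 0b1010) == 0b1000
```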
Instruction Formats:
Instructions stored in memory come in various formats, including three-address, two-
address, one-address, and zero-address formats, each specifying different ways of
specifying operands and operations. The operation code (opcode) field defines the
operation to be performed, while other fields designate addresses or modes.
CPU: The central processing unit (CPU) is the brain of a computer. It performs most of the
data processing tasks. Here's a breakdown of its parts and functions:
CPU Parts:
● Processor Unit (PU): This unit does the actual calculations (addition, subtraction,
etc.) and data manipulation (moving, comparing, etc.) It has two main parts:
○ Arithmetic Logic Unit (ALU): Performs arithmetic and logical operations on
data.
○ Registers: Small, temporary storage locations within the CPU that hold data
for the ALU to work on.
● Control Unit (CU): Fetches instructions from memory, decodes them, and tells the
other parts of the CPU what to do. It's like the conductor of an orchestra, directing
the flow of data and instructions.
How it Works:
1. The CU fetches an instruction from memory.
2. The CU decodes the instruction to understand what needs to be done.
3. The CU sends signals to the PU:
○ It might tell the registers to provide data to the ALU.
○ It might tell the ALU what operation to perform (add, subtract, etc.).
4. The PU performs the operation and stores the result.
5. The CU might store the result in memory or a register.
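The five steps above can be sketched as a toy fetch-decode-execute loop. The instruction set (LOAD/ADD/STORE/HALT), the one-address accumulator style, and the memory layout are all invented for illustration; a real CPU encodes instructions in binary.

```python
# Toy fetch-decode-execute loop mirroring steps 1-5 above.
memory = [
    ("LOAD", 10),    # acc = memory[10]
    ("ADD", 11),     # acc += memory[11]
    ("STORE", 12),   # memory[12] = acc
    ("HALT", None),
] + [0] * 6 + [4, 5, 0]    # data: memory[10]=4, memory[11]=5, memory[12]=0

acc = 0   # accumulator register
pc = 0    # program counter

while True:
    opcode, addr = memory[pc]    # 1. CU fetches an instruction
    pc += 1
    if opcode == "HALT":         # 2. CU decodes it
        break
    elif opcode == "LOAD":       # 3-4. CU signals the PU, which executes
        acc = memory[addr]
    elif opcode == "ADD":
        acc += memory[addr]
    elif opcode == "STORE":      # 5. result stored back to memory
        memory[addr] = acc

print(memory[12])  # 9
```

Running it computes 4 + 5 and leaves 9 at memory location 12, exactly the sequence of CU/PU interactions described above.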
Types of instructions:
● Instructions come in different formats, specifying how many memory locations
(addresses) they need to reference their data.
○ Three-address instructions: Specify two operands (data to be used) and a
result location (where to store the answer). (Less common)
○ Two-address instructions: Most common type. Specify one operand and a
result location (the other operand is usually in a register).
○ One-address instruction: Use an implied register (accumulator) for one
operand and a memory location for the other. (Used in older machines)
○ Zero-address instructions: Don't use any address fields in the instruction
itself. (For specific operations like push/pop to a stack)
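To make the four formats concrete, here is one hypothetical computation, X = (A + B) * C, written in each style (the mnemonics are invented for illustration), followed by a runnable sketch of the zero-address (stack) case:

```python
# The same computation X = (A + B) * C in each format (hypothetical syntax):
#   three-address:  ADD T, A, B    MUL X, T, C
#   two-address:    MOVE T, A      ADD T, B     MUL T, C    MOVE X, T
#   one-address:    LOAD A         ADD B        MUL C       STORE X
#   zero-address:   PUSH A  PUSH B  ADD  PUSH C  MUL  POP X

def run_stack_machine(program, env):
    """Evaluate a zero-address program against named variables in env."""
    stack = []
    for op, arg in program:
        if op == "PUSH":
            stack.append(env[arg])
        elif op == "ADD":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "MUL":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        elif op == "POP":
            env[arg] = stack.pop()
    return env

env = run_stack_machine(
    [("PUSH", "A"), ("PUSH", "B"), ("ADD", None),
     ("PUSH", "C"), ("MUL", None), ("POP", "X")],
    {"A": 2, "B": 3, "C": 4},
)
print(env["X"])  # (2 + 3) * 4 = 20
```

Note how the zero-address version needs no address fields on ADD and MUL: the operands are implicitly the top two stack entries.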

CHAPTER 3 CONTROL UNIT


Chapter 3 of the book delves into the intricacies of the Control Unit in computer
organization and architecture. It explores various aspects of control memory,
microprogramming, addressing sequencing, symbolic microinstructions, and the design of
control units.
1. Control Memory:
● Definition: Control Memory is a storage unit within the control unit responsible for
storing microprograms.
● Hardwired Control Unit: Generates control signals using conventional logic design
techniques.
● Microprogramming: Offers an elegant and systematic method for controlling
microoperation sequences.
● Bus-Organized Systems: Control signals are groups of bits selecting paths in
multiplexers, decoders, and arithmetic logic units.
● Microprogrammed Control Unit: Control variables stored in memory, known as a
microprogram.
● Control Memory Types: ROM or Writable Control Memory (dynamic
microprogramming).
2. Addressing Sequencing:
● Microinstruction Storage: Stored in control memory, specifying routines for
executing instructions.
● Address Sequencing: Process of determining microinstruction addresses based on
instruction codes.
3. Conditional Branching:
● Branch Logic: Provides decision-making capabilities based on status conditions.
● Unconditional/Branch Instructions: Alter control flow based on specified
conditions.
● Mapping of Instructions: Convert operation code bits to microinstruction
addresses.
4. Symbolic Microinstructions:
● Usage: Allows specifying microinstructions in symbolic form similar to assembly
language.
● Translation: Symbols are translated into binary equivalents using a microprogram
assembler.
5. Microinstruction Format:
● Structure: Divided into functional parts specifying microoperations, conditions, and
branching.
6. Control Unit Operation:
● Microoperations: Atomic operations of CPU execution, essential for each
instruction cycle.
● Functions: Sequencing and execution of microoperations using control signals.
7. Design of Control Unit:
● Hardwired Implementation: Combinational circuit generating control signals based
on inputs.
● Micro-programmed Implementation: Sequences of instructions controlling
microoperations stored in control memory.
8. Microprogram Sequencer:
● Components: Control memory and sequencer determining next microinstruction
address.
● Input Logic: Determines operation types based on microinstruction fields.
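The ideas in sections 1-8 fit together in a small sketch: control memory is a table of microinstructions, each holding control-signal bits plus branch information, and the sequencer picks the next address in the control address register. All field names and signals here are invented for illustration.

```python
# Sketch of a microprogrammed control unit: control memory holds
# microinstructions, and the sequencer chooses the next address.
# Each entry: (control_signals, branch_condition, branch_target)
control_memory = [
    ({"MAR<-PC"},          None,     None),  # 0: start instruction fetch
    ({"MDR<-mem", "PC+1"}, None,     None),  # 1: read word, bump PC
    ({"IR<-MDR"},          "always", 0),     # 2: branch back to fetch
]

car = 0      # control address register (CAR)
trace = []
for _ in range(6):                       # run a few microcycles
    signals, cond, target = control_memory[car]
    trace.append(car)                    # (signals would drive the datapath)
    # sequencer: take the branch if its condition holds, else fall through
    if cond == "always":
        car = target
    else:
        car += 1

print(trace)  # [0, 1, 2, 0, 1, 2]
```

The fetch routine loops forever here; a fuller sketch would map the opcode in IR to the address of that instruction's execute routine, as described under "Mapping of Instructions".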