Computer Science Syllabus
https://www.tutorialspoint.com/basics_of_computer_science/basics_of_computer_science_fundamental_concept
https://www.tutorialspoint.com/computer_logical_organization/cpu_architecture.htm
Programming fundamentals
https://www.tutorialspoint.com/object_oriented_analysis_design/index.htm
https://www.tutorialspoint.com/data_structures_algorithms/
Software Eng.
https://www.tutorialspoint.com/software_engineering/
Compiler Structure
https://www.tutorialspoint.com/compiler_design/
Automata theory.
https://www.tutorialspoint.com/automata_theory/automata_theory_pdf_version.htm
https://www.tutorialspoint.com/data_communication_computer_network/
https://www.tutorialspoint.com/operating_system/
Database Systems
https://www.tutorialspoint.com/dbms/
https://www.tutorialspoint.com/dip/
https://www.tutorialspoint.com/php/
24-- oop, algo & ds, S.E, Networking, D.B, compiler construction, oo paradigm, O.S, design patterns
John Napier
Napier was a Scottish mathematician who invented logarithms.
Further, Napier also invented a computing device consisting of sticks with numbers
imprinted on them. He called the sticks 'bones' because they were made of bone or
ivory.
Blaise Pascal
Pascal was a French mathematician who invented a machine based on gear wheels,
which helped greatly in calculation.
Charles Babbage
Babbage was an English polymath: a mathematician, mechanical engineer, philosopher,
and inventor. In 1822, he developed a machine capable of calculating the successive
differences of expressions and prepared tables that helped him in his calculations.
John Atanasoff
With the assistance of Clifford Berry, John Atanasoff developed the Atanasoff-Berry
Computer (better known as the ABC) in 1937. It marked the beginning of the development
of electronic digital computers.
Maurice V. Wilkes
In 1949, Wilkes (at Cambridge University) designed the Electronic Delay Storage
Automatic Calculator (EDSAC). It was one of the first computers to run on the
stored-program concept.
Association vs Aggregation or Composition
Association depicts the relationship between objects of one or more classes. A link can be
defined as an instance of an association.
Aggregation is referred to as a "has-a" relationship, with the ability to navigate
from the whole to its parts. An aggregate object is composed of one or more other
objects, and those parts can exist independently of the whole.
Composition is a stronger "part-of" relationship, in which the parts cannot exist
without the whole: when the whole is destroyed, its parts are destroyed with it.
In the relationship "a car has-a motor", the car is the whole and the motor is the part.
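The distinction can be sketched in Python (the class names are illustrative, not from the notes): in composition the whole creates and owns its part, while in aggregation the part is created outside the whole and merely referenced by it.

```python
class Engine:
    def __init__(self, horsepower):
        self.horsepower = horsepower

class Driver:
    def __init__(self, name):
        self.name = name

class Car:
    def __init__(self, horsepower, driver):
        # Composition: the Car creates and owns its Engine;
        # the Engine's lifetime is tied to the Car's.
        self.engine = Engine(horsepower)
        # Aggregation: the Driver exists independently of the Car
        # and is only referenced by it.
        self.driver = driver

alice = Driver("Alice")
car = Car(120, alice)
```

If `car` is discarded, its engine goes with it, but `alice` lives on; that asymmetry is the composition-versus-aggregation difference.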
Interpolation Search:
Interpolation search is applied on a sorted list, just like binary search, but it
assumes the data is uniformly distributed, e.g. 2, 4, 6, 8, 10.
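Instead of probing the midpoint, interpolation search estimates where the target should sit from its value. A minimal Python sketch:

```python
def interpolation_search(arr, target):
    """Search a sorted, roughly uniformly distributed list.

    Returns the index of target, or -1 if absent.
    """
    lo, hi = 0, len(arr) - 1
    while lo <= hi and arr[lo] <= target <= arr[hi]:
        if arr[hi] == arr[lo]:
            return lo if arr[lo] == target else -1
        # Probe position estimated from the value, not the midpoint.
        pos = lo + (target - arr[lo]) * (hi - lo) // (arr[hi] - arr[lo])
        if arr[pos] == target:
            return pos
        if arr[pos] < target:
            lo = pos + 1
        else:
            hi = pos - 1
    return -1
```

On the example list 2, 4, 6, 8, 10 the very first probe for 8 lands exactly on index 3, which is why uniform distribution matters.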
Hash Table:
A hash table stores data in an array-like structure where each data value has its own
unique index. Access to the data becomes very fast if we know the index of the desired
data.
Key   Hash (key % 20)   Index
1     1 % 20 = 1        1
2     2 % 20 = 2        2
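A minimal Python sketch of such a table, using key % 20 as the hash function as in the rows above; linear probing for collisions is an added assumption, not from the notes:

```python
class HashTable:
    """Tiny hash table: index = key % size, linear probing on collision.

    A sketch only: it assumes integer keys and that the table never fills up.
    """

    def __init__(self, size=20):
        self.size = size
        self.slots = [None] * size

    def _probe(self, key):
        i = key % self.size                 # hash: index = key % 20
        while self.slots[i] is not None and self.slots[i][0] != key:
            i = (i + 1) % self.size         # linear probing on collision
        return i

    def put(self, key, value):
        self.slots[self._probe(key)] = (key, value)

    def get(self, key):
        slot = self.slots[self._probe(key)]
        return slot[1] if slot else None
```

Keys 1 and 21 both hash to index 1 here, so 21 is placed in the next free slot by the probe loop.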
S.E
Software Product:
8 stages
Requirements, System Analysis, System Design, Code Design, Testing, Deployment,
Maintenance, Updates.
Software Development Life Cycle:
6 Stages
Planning, Analysis, Design, Implementation, Testing and Integration, Maintenance
A more detailed activity breakdown: Communication, Requirement Gathering, Feasibility
Study, System Analysis, Software Design, Coding, Testing, Integration, Implementation,
Operation and Maintenance, Disposition.
Software Evolution:
5 stages
Change Request, Impact Analysis, Release Planning, System Update, System Release
Waterfall model:
All the phases of the SDLC function one after another in a linear manner.
Iterative Model:
This model leads the software development process in iterations: it carries out the
development in a cyclic manner, repeating every step of the SDLC after each cycle.
Spiral Model:
The spiral model combines the iterative model with one of the other SDLC models: a
standard SDLC model is chosen and wrapped in a cyclic (iterative) process that includes
risk analysis. In the fourth phase of each cycle, the plan for the next iteration is
prepared.
V-Model:
The V-model provides a means of testing the software at each development stage, pairing
every design phase with a corresponding testing phase. It overcomes the drawback of the
waterfall model, where it is hard to go back if something is found wrong.
Big bang Model:
This is the simplest model of all. It requires little planning, lots of programming,
and lots of funds.
Software Project Management:
Management Activities:
Project planning, Scope Management, Project Estimation
Project Estimation Techniques:
Decomposition Technique:
Line of Code, Function Points
Empirical Estimation Technique:
Putnam Model -- maps time and effort
COCOMO (Constructive Cost Model) divides software projects into three categories:
organic, semi-detached, and embedded.
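Basic COCOMO turns size in KLOC into effort (person-months) and duration (months). A sketch using the standard basic-model coefficients; the 10 KLOC project below is an invented example:

```python
# Basic COCOMO coefficients (a, b, c): effort = a * KLOC**b (person-months),
# duration = 2.5 * effort**c (months).
COCOMO = {
    "organic":       (2.4, 1.05, 0.38),
    "semi-detached": (3.0, 1.12, 0.35),
    "embedded":      (3.6, 1.20, 0.32),
}

def basic_cocomo(kloc, mode="organic"):
    a, b, c = COCOMO[mode]
    effort = a * kloc ** b          # person-months
    duration = 2.5 * effort ** c    # calendar months
    return effort, duration

effort, duration = basic_cocomo(10, "organic")  # ~27 person-months, ~8.7 months
```

Note how the exponent grows from organic to embedded: the less familiar and more constrained the project, the faster effort rises with size.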
Pert Chart:
A PERT (Program Evaluation and Review Technique) chart is a tool that depicts a project
as a network diagram. It can graphically represent the main events of a project in both
parallel and consecutive ways. Events that occur one after another show the dependency
of the later event on the previous one.
Resource Histogram:
This is a graphical tool containing bars representing the number of resources (usually
skilled staff) required over time for a project event (or phase). The resource histogram
is an effective tool for staff planning and coordination.
Logical DFD - This type of DFD concentrates on the system process and the flow of data
in the system; for example, how data moves between different entities in a banking
software system.
Physical DFD - This type of DFD shows how the data flow is actually implemented in the
system. It is more specific and close to the implementation.
HIPO Diagram:
A HIPO (Hierarchical Input Process Output) diagram combines two organized methods to
analyze the system and provide a means of documentation.
HIPO diagram represents the hierarchy of modules in the software system. Analyst uses
HIPO diagram in order to obtain high-level view of system functions. It decomposes
functions into sub-functions in a hierarchical manner. It depicts the functions performed by
system.
Halstead Complexity:
“A computer program is an implementation of an algorithm considered to be a collection of
tokens which can be classified as either operators or operands”.
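From those operator/operand counts, Halstead derives metrics such as vocabulary, length, and volume. A sketch in Python; the token lists in the example are illustrative, not taken from a real program:

```python
import math

def halstead(operators, operands):
    """Basic Halstead metrics from flat lists of operator and operand tokens.

    n1, n2 = distinct operators/operands; N1, N2 = total occurrences.
    """
    n1, n2 = len(set(operators)), len(set(operands))
    N1, N2 = len(operators), len(operands)
    vocabulary = n1 + n2                       # n = n1 + n2
    length = N1 + N2                           # N = N1 + N2
    volume = length * math.log2(vocabulary)    # V = N * log2(n)
    return {"vocabulary": vocabulary, "length": length, "volume": volume}

# Illustrative tokens, roughly from: a = b + c; a = a
m = halstead(["=", "+", "="], ["a", "b", "c", "a"])
```

Here the tokenization itself (deciding what counts as an operator vs an operand) is the hard part in practice; the arithmetic on top of it is simple.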
Cyclomatic Complexity Measure:
The cyclomatic complexity measure is used to quantify the complexity of a given piece of
software. It is a graph-driven model based on the decision-making constructs of a
program, such as if-else, do-while, repeat-until, switch-case, and goto statements.
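For a control-flow graph with E edges, N nodes, and P connected components, cyclomatic complexity is V(G) = E - N + 2P. A tiny sketch; the example graph sizes are illustrative:

```python
def cyclomatic_complexity(edges, nodes, components=1):
    """V(G) = E - N + 2P for a control-flow graph."""
    return edges - nodes + 2 * components

# A single if-else: nodes {if, then, else, merge}, edges
# if->then, if->else, then->merge, else->merge, so V(G) = 4 - 4 + 2 = 2.
v = cyclomatic_complexity(edges=4, nodes=4)
```

V(G) also equals the number of decision points plus one, which is why one if-else yields 2: two independent paths through the code.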
Testing Approaches:
Tests can be conducted based on two approaches –
Functionality testing
Implementation testing
Black-box testing
It is carried out to test functionality of the program. It is also called ‘Behavioral’ testing.
The tester, in this case, has a set of input values and the respective desired results.
On providing an input, if the output matches the desired result, the program is tested
'ok', and problematic otherwise.
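A sketch of this input/expected-output style of testing in Python; `leap_year` is a hypothetical function under test, whose internals the black-box tester does not look at:

```python
def leap_year(y):
    # The implementation under test; a black-box tester sees only its behavior.
    return y % 4 == 0 and (y % 100 != 0 or y % 400 == 0)

# Black-box test: pairs of (input, desired result), checked against the output.
cases = [(2000, True), (1900, False), (2024, True), (2023, False)]
for inp, expected in cases:
    assert leap_year(inp) == expected, f"failed for {inp}"
```

The test knows nothing about the modulo arithmetic inside; it would pass or fail identically against any other implementation of the same specification, which is exactly the point of behavioral testing.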
White-box testing
It is conducted to test program and its implementation, in order to improve code efficiency
or structure. It is also known as ‘Structural’ testing.
In this testing method, the design and structure of the code are known to the tester.
Programmers of the code conduct this test on the code.
Testing Levels:
Unit testing, Integration testing, System Testing, Acceptance testing, Regression testing.
Cost of Maintenance:
Maintenance Activities:
Tree Topology
Also known as hierarchical topology, this is the most common form of network topology
in use presently. This topology can be seen as an extended star topology and inherits
properties of the bus topology.
Daisy chain vs Ring Topology:
In a daisy chain, the two end nodes are not connected to each other; connecting them
turns the daisy chain into a ring topology.
OSI Model:
Application Layer: This layer is responsible for providing interface to the application user.
This layer encompasses protocols which directly interact with the user.
Presentation Layer: This layer defines how data in the native format of remote host should
be presented in the native format of host.
Session Layer: This layer maintains sessions between remote hosts. For example, once
user/password authentication is done, the remote host maintains this session for a while and
does not ask for authentication again in that time span.
Transport Layer: This layer is responsible for end-to-end delivery between hosts.
Network Layer: This layer is responsible for address assignment and uniquely addressing
hosts in a network.
Data Link Layer: This layer is responsible for reading and writing data from and onto the line.
Link errors are detected at this layer.
Physical Layer: This layer defines the hardware, cabling wiring, power output, pulse rate etc.
Internet Model
The Internet uses the TCP/IP protocol suite, also known as the Internet suite. This
defines the Internet Model, which has a four-layered architecture. The OSI Model is a
general communication model, while the Internet Model is what the Internet uses for all
its communication. The Internet is independent of its underlying network architecture,
and so is its model. This model has the following layers:
Application Layer: This layer defines the protocols that enable the user to interact
with the network, for example FTP, HTTP, etc.
Transport Layer: This layer defines how data should flow between hosts. Major protocol at
this layer is Transmission Control Protocol (TCP). This layer ensures data delivered between
hosts is in-order and is responsible for end-to-end delivery.
Internet Layer: Internet Protocol (IP) works on this layer. This layer facilitates host addressing
and recognition. This layer defines routing.
Link Layer: This layer provides the mechanism for sending and receiving actual data.
Unlike its OSI Model counterpart, this layer is independent of the underlying network
architecture and hardware.
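The transport layer's in-order, end-to-end delivery can be seen directly through Python's socket API. A minimal loopback echo sketch (the OS picks a free port; the echo logic is an invented example):

```python
import socket
import threading

# TCP echo over loopback: the socket API here sits at the transport layer
# (SOCK_STREAM = TCP); the IP addressing beneath it belongs to the internet layer.
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind(("127.0.0.1", 0))          # port 0: let the OS choose a free port
srv.listen(1)
port = srv.getsockname()[1]

def serve_once():
    conn, _ = srv.accept()
    conn.sendall(conn.recv(1024))   # echo the received bytes straight back
    conn.close()

t = threading.Thread(target=serve_once)
t.start()

cli = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
cli.connect(("127.0.0.1", port))
cli.sendall(b"hello")
reply = cli.recv(1024)              # TCP delivers the same bytes, in order
cli.close()
t.join()
srv.close()
```

Neither side mentions IP routing or the link hardware: the layers below the transport layer are invisible to this code, which is the layering idea in miniature.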