
UNIT - 2

What is Project?
A project is a group of tasks that must be completed to reach a clear result. A project can also be defined as a set of inputs and outputs required to achieve a goal. Projects can vary from simple to complex and can be handled by one person or by a hundred.

Projects are usually described and approved by a project manager or team executive. They set the expectations and objectives, and it is up to the team to handle the logistics and complete the project on time. For good project development, some teams split the project into specific tasks so they can manage responsibility and utilize team strengths.

What is software project management (SPM)?

Software project management is the art and discipline of planning and supervising software projects. It is a sub-discipline of project management in which software projects are planned, implemented, monitored, and controlled.

It is a procedure of managing, allocating, and timing resources to develop computer software that fulfils requirements.

In software project management, the client and the developers need to know the size, duration, and cost of the project.

Prerequisites of software project management


There are three needs for software project management. These are:

1. Time
2. Cost
3. Quality

It is an essential part of the software organization to deliver a quality product, keep the cost within the client's budget, and deliver the project on schedule. There are various factors, both external and internal, which may impact these three factors. Any one of the three factors can severely affect the other two.
Project Manager
A project manager is a person who has the overall responsibility for the planning, design, execution, monitoring, controlling, and closure of a project. A project manager plays an essential role in the success of a project.

A project manager is responsible for making decisions on both large and small projects. The project manager manages risk and minimizes uncertainty. Every decision the project manager makes must directly benefit the project.

Role of a Project Manager:


1. Leader:

A project manager must lead his team and provide them direction so that they understand what is expected of them.

2. Medium:

The project manager is a medium between his clients and his team. He must coordinate and transfer all the appropriate information from the clients to his team and report to senior management.

3. Mentor:

He should be there to guide his team at each step and make sure that the team stays united. He provides recommendations to his team and points them in the right direction.

Responsibilities of a Project Manager:


1. Managing risks and issues.
2. Creating the project team and assigning tasks to the team members.
3. Activity planning and sequencing.
4. Monitoring and reporting progress.
5. Modifying the project plan to deal with changing situations.
Software Project Planning
A software project is the complete procedure of software development, from requirements gathering through testing and support, carried out according to the execution methodologies within a specified period to achieve the intended software product.

Need of Software Project Management


Software development is a relatively new stream in world business, and there is very little experience in building software products. Most software products are tailor-made to fit the customer's requirements. Most significantly, the underlying technology changes and advances so frequently and rapidly that experience with one product may not be applicable to another. All such business and environmental constraints bring risk into software development; hence, it is essential to manage software projects efficiently.

Software Project Manager


A software project manager is responsible for planning and scheduling project development. They manage the work to ensure that it is completed to the required standard. They monitor progress to check that development is on time and within budget. The project plan must incorporate the major issues such as size and cost estimation, scheduling, project monitoring, personnel selection and evaluation, and risk management. To plan a successful software project, we must understand:
 The scope of work to be completed
 The risks to be analysed
 The resources required
 The project to be accomplished
 The record to be followed
Software project planning starts before technical work starts. The various planning activities are described below.
The size is the crucial parameter for the estimation of other activities. Resource requirements are estimated on the basis of cost and development time. The project schedule may prove to be very useful for controlling and monitoring the progress of the project; it is dependent on resources and development time.
The list of activities is as follows:
 Project planning and Tracking
 Project Resource Management
 Scope Management
 Estimation Management
 Project Risk Management
 Scheduling Management
 Project Communication Management
 Configuration Management
Now we will discuss all these activities -
1. Project Planning: It is a set of multiple processes, or we can say that it is a task performed before the construction of the product starts.
2. Scope Management: It describes the scope of the project. Scope management is important because it clearly defines what will be done and what will not. Scope management constrains the project to a limited and quantifiable set of tasks, which can easily be documented and which in turn helps avoid cost and time overruns.
3. Estimation Management: This is not only about cost estimation; whenever we start to develop software, we also estimate its size (lines of code), effort, and time, as well as cost.
If we talk about size, the lines of code depend upon the user or software requirements.
If we talk about effort, we should know the size of the software, because based on the size we can quickly estimate how big a team is required to produce the software.
If we talk about time, once size and effort are estimated, the time required to develop the software can easily be determined.

And if we talk about cost, it includes all the elements such as:
 Size of software
 Quality
 Hardware
 Communication
 Training
 Additional Software and tools
 Skilled manpower
4. Scheduling Management: Scheduling management in software refers to completing all the activities in the specified order and within the time slotted to each activity. Project managers define multiple tasks and arrange them keeping various factors in mind.

For scheduling, it is necessary to:
 Find out multiple tasks and correlate them.
 Divide time into units.
 Assign the respective number of work-units for every job.
 Calculate the total time from start to finish.
 Break down the project into modules.
5. Project Resource Management: In software development, all the elements used are referred to as resources for the project. These can be human resources, productive tools, and libraries.

Resource management includes:

 Creating a project team and assigning responsibilities to every team member
 Developing a resource plan derived from the project plan
 Adjusting resources as needed
6. Project Risk Management: Risk management consists of all the activities of identifying, analysing, and preparing plans for predictable and unpredictable risks in the project.
Several situations give rise to risks in a project:
 Experienced team members leave the project and new members join it.
 Changes in requirement.
 Change in technologies and the environment.
 Market competition.
7. Project Communication Management: Communication is an essential factor in the success of the project. It is a bridge between the client, the organization, the team members, and other stakeholders of the project, such as hardware suppliers.
From the planning to closure, communication plays a vital role. In all the phases,
communication must be clear and understood. Miscommunication can create a big blunder in
the project.

8. Project Configuration Management: Configuration management is about controlling the changes in software, such as changes to requirements, design, and development of the product.

The primary goal is to increase productivity with fewer errors.

Some reasons that show the need for configuration management:
 Several people work on software that is continually being updated.
 It helps build coordination among suppliers.
 Changes in requirements, budget, and schedule need to be accommodated.
 The software should run on multiple systems.

Tasks performed in configuration management:

 Identification
 Baseline
 Change Control
 Configuration Status Accounting
 Configuration Audits and Reviews
METRICS FOR PROJECT SIZE ESTIMATION
As already mentioned, accurate estimation of project size is central to satisfactory estimation of all other project parameters such as effort, completion time, and total project cost. Before discussing the available metrics to estimate the size of a project, let us examine what the term "project size" exactly means.
The size of a project is obviously not the number of bytes that the source code occupies, nor is it the size of the executable code.
The project size is a measure of the problem complexity in terms of the effort and time required to develop the product. Currently, two metrics are popularly used to measure size:
1. Lines of code (LOC)
2. Function point (FP)
Each of these metrics has its own advantages and disadvantages. These are discussed in the following subsections. Based on their relative advantages, one metric may be more appropriate than the other in a particular situation.

1. Lines of Code (LOC): LOC is possibly the simplest among all metrics available to measure project size. Consequently, this metric is extremely popular. This metric measures the size of a project by counting the number of source instructions in the developed program. Obviously, while counting the number of source instructions, comment lines and header lines are ignored.
Determining the LOC count at the end of a project is very simple. However, accurate estimation of the LOC count at the beginning of a project is a very difficult task. One can possibly estimate the LOC count at the start of a project only by using some form of systematic guess work.
Systematic guessing typically involves the following. The project manager divides the problem into modules, and each module into sub-modules and so on, until the leaf-level modules are small enough for their LOC to be predicted sufficiently accurately from experience with similar modules.
LOC is a measure of coding activity alone. A good problem size measure should consider the total effort needed to carry out various life cycle activities (i.e. specification, design, coding, testing, etc.).
The implicit assumption made by the LOC metric, that the overall product development effort is determined solely by the coding effort, is flawed.
Even for the same programming problem, different programmers might come up with programs having very different LOC counts. This situation does not improve even if language tokens are counted instead of lines of code.

Advantages of LOC
1. Simple to measure

Disadvantages of LOC
1. It is defined on the code. For example, it cannot measure the size of the specification.
2. It characterizes only one specific view of size, namely length, and takes no account of functionality or complexity.
3. Bad software design may lead to an excessive number of lines of code.
4. It is language dependent.
5. Users cannot easily understand it
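The counting rule described above (count source instructions while ignoring blank lines and comment lines) can be sketched in a few lines of Python. This is only an illustrative sketch; the comment prefix handled here is an assumption and would differ per programming language.

# Minimal sketch of the LOC counting rule: count source lines,
# skipping blank lines and full-line comments.

def count_loc(source_text: str, comment_prefix: str = "#") -> int:
    loc = 0
    for line in source_text.splitlines():
        stripped = line.strip()
        if not stripped:                          # ignore blank lines
            continue
        if stripped.startswith(comment_prefix):   # ignore comment lines
            continue
        loc += 1                                  # count the source instruction
    return loc

example = """# compute factorial
def fact(n):
    # base case
    if n == 0:
        return 1
    return n * fact(n - 1)
"""
print(count_loc(example))  # prints 4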

Function Point (FP) Metric

Function point metric was proposed by Albrecht and Gaffney in 1983. This metric
overcomes many of the shortcomings of the LOC metric. Since its inception, function
point metric has steadily gained popularity. Function point metric has several
advantages over LOC metric. One of the important advantages of the function point
metric over the LOC metric is that it can easily be computed from the problem
specification itself.

Using the LOC metric, on the other hand, the size can accurately be determined only
after the code has been fully written.

The conceptual idea behind the function point metric is the following. The size of a
software product is directly dependent on the number of different high-level functions
or features it supports. This assumption is reasonable, since each feature would take
additional effort to implement.
Measurement parameters and examples:

1. Number of External Inputs (EI): input screens and tables
2. Number of External Outputs (EO): output screens and reports
3. Number of External Inquiries (EQ): prompts and interrupts
4. Number of Internal Logical Files (ILF): databases and directories
5. Number of External Interface Files (EIF): shared databases and shared routines

1. FP characterizes the complexity of the software system and hence can be used to estimate the project time and the manpower requirement.

2. The effort required to develop the project depends on what the software does.

3. FP is programming language independent.

4. The FP method is used for data processing systems and business systems, such as information systems.

5. The five parameters mentioned above are also known as information domain characteristics.

6. All the parameters mentioned above are assigned weights that have been determined experimentally and are shown in the table below.
Weights of the 5 FP Attributes

Measurement Parameter                      Low   Average   High
1. Number of external inputs (EI)           3       4        6
2. Number of external outputs (EO)          4       5        7
3. Number of external inquiries (EQ)        3       4        6
4. Number of internal files (ILF)           7      10       15
5. Number of external interfaces (EIF)      5       7       10

The count for each parameter is multiplied by the corresponding weight, and the values are added up to determine the UFP (Unadjusted Function Point) count of the subsystem.

Here, the weighting factor chosen for a measurement parameter is low (simple), average, or high (complex), depending on its functional complexity.

The unadjusted function point (UFP) count is thus calculated with the following simplified formula:

UFP = (Number of inputs) × 4 + (Number of outputs) × 5 + (Number of inquiries) × 4 + (Number of files) × 10 + (Number of interfaces) × 10

UFP computation
The unadjusted function point (UFP) count is computed as the weighted sum of five characteristics of a product, as shown in the expression above. The weights associated with the five characteristics were determined empirically by Albrecht through data gathered from many projects.
1. Number of inputs: Each data item input by the user is counted. However, it should be noted
that data inputs are considered different from user inquiries.
2. Number of outputs: The outputs considered include reports printed, screen outputs, error
messages produced, etc. While computing the number of outputs, the individual data items
within a report are not considered; but a set of related data items is counted as just a single
output.
3. Number of inquiries: An inquiry is a user command (without any data input) and only
requires some actions to be performed by the system. Thus, the total number of inquiries is
essentially the number of distinct interactive queries (without data input) which can be made
by the users. Examples of such inquiries are print account balance, print all student grades,
display rank holders’ names, etc.
4. Number of files: The files referred to here are logical files. A logical file represents a group
of logically related data. Logical files include data structures as well as physical files.
5. Number of interfaces: Here the interfaces denote the different mechanisms that are used to
exchange information with other systems. Examples of such interfaces are data files on tapes,
disks, communication links with other systems, etc.
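As a worked illustration, the simplified UFP formula quoted above can be evaluated directly once the five counts are known. The counts used below are made-up illustrative values, not taken from the text.

# Sketch of the simplified UFP formula given above
# (weights 4, 5, 4, 10, 10 for inputs, outputs, inquiries, files, interfaces).

WEIGHTS = {"inputs": 4, "outputs": 5, "inquiries": 4, "files": 10, "interfaces": 10}

def unadjusted_function_points(counts: dict) -> int:
    # Weighted sum of the five information domain characteristics
    return sum(counts[name] * weight for name, weight in WEIGHTS.items())

# Hypothetical product: 30 inputs, 60 outputs, 24 inquiries, 8 files, 2 interfaces
counts = {"inputs": 30, "outputs": 60, "inquiries": 24, "files": 8, "interfaces": 2}
print(unadjusted_function_points(counts))   # 120 + 300 + 96 + 80 + 20 = 616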

PROJECT ESTIMATION TECHNIQUES

Estimation of various project parameters is an important project planning activity. The different
parameters of a project that need to be estimated include—

 Project size: estimate the size of the development product.
 Effort required to complete the project: estimate the effort in person-months or person-hours.
 Project duration: estimate the schedule in calendar time.
 Cost: estimate the project cost in the agreed currency.

Accurate estimation of these parameters is important, since these not only help in quoting an
appropriate project cost to the customer, but also form the basis for resource planning and scheduling.
A large number of estimation techniques have been proposed by researchers. These can broadly be
classified into three main categories:

▪ Empirical estimation techniques

▪ Heuristic techniques

▪ Analytical estimation techniques


In the following subsections, we provide an overview of the different categories of estimation techniques.

1. Empirical Estimation Technique –


Empirical estimation is a technique or model in which empirically derived formulas are used for predicting the data that are a required and essential part of the software project planning step. These techniques are usually based on data collected previously from projects, as well as on some guesses, prior experience with the development of similar types of projects, and assumptions. They use the size of the software to estimate the effort.

In this technique, an educated guess of the project parameters is made, so these models are based on common sense. However, since many activities are involved, empirical estimation techniques have been formalized over time. Examples are the Delphi technique and the expert judgement technique.

2. Heuristic Technique –
The word heuristic is derived from a Greek word meaning "to discover". A heuristic technique is a technique or model used for solving problems, learning, or discovery using practical methods aimed at achieving immediate goals. These techniques are flexible and simple, allowing quick decisions through shortcuts and good-enough calculations, most often when working with complex data. However, the decisions made using this technique are not necessarily optimal.

In this technique, the relationship among different project parameters is expressed using mathematical equations. The most popular heuristic technique is the Constructive Cost Model (COCOMO). This technique is also used to speed up analysis and investment decisions.

3. Analytical Estimation Technique –


Analytical estimation is a type of technique that is used to measure work. In this technique, the task is first divided or broken down into its basic component operations or elements for analysis. Second, if standard times are available from some other source, they are applied to each element or component of the work.

Third, if no such standard times are available, the work is estimated based on experience with similar work. In this technique, results are derived by making certain basic assumptions about the project. Hence, the analytical estimation technique has a scientific basis. Halstead's software science is based on an analytical estimation model.

EMPIRICAL ESTIMATION TECHNIQUES


Two popular empirical estimation techniques are—Expert judgement and Delphi estimation
techniques.
Expert Judgement
Expert judgement is a widely used size estimation technique. In this technique, an expert makes an
educated guess about the problem size after analysing the problem thoroughly. Usually, the expert
estimates the cost of the different components (i.e. modules or subsystems) that would make up the
system and then combines the estimates for the individual modules to arrive at the overall estimate.
However, this technique suffers from several shortcomings. The outcome of the expert judgement
technique is subject to human errors and individual bias. Also, it is possible that an expert may
overlook some factors inadvertently. Further, an expert making an estimate may not have relevant
experience and knowledge of all aspects of a project. For example, he may be conversant with the
database and user interface parts, but may not be very knowledgeable about the computer
communication part. Due to these factors, the size estimation arrived at by the judgement of a single expert may be far from accurate.

Delphi Cost Estimation


Delphi cost estimation technique tries to overcome some of the shortcomings of the expert judgement
approach. Delphi estimation is carried out by a team comprising a group of experts and a co-ordinator.
In this approach, the co-ordinator provides each estimator with a copy of the software requirements
specification (SRS) document and a form for recording his cost estimate. Estimators complete their
individual estimates anonymously and submit them to the co-ordinator. In their estimates, the
estimators mention any unusual characteristic of the product which has influenced their estimations.
The co-ordinator prepares the summary of the responses of all the estimators, and also includes any
unusual rationale noted by any of the estimators. The prepared summary information is distributed to
the estimators. Based on this summary, the estimators re-estimate. This process is iterated for several
rounds. However, no discussion among the estimators is allowed during the entire estimation process. The purpose behind this restriction is that if any discussion were allowed among the estimators, then many estimators might easily get influenced by the rationale of an estimator who may be more experienced or senior. After the completion of several iterations of estimation, the co-ordinator takes the responsibility of compiling the results and preparing the final estimate. The Delphi estimation technique, though it consumes more time and effort, overcomes an important shortcoming of the expert judgement technique in that the results cannot unjustly be influenced by overly assertive and senior members.

COCOMO: COCOMO (Constructive Cost Model) is a regression model based on LOC, i.e. the number of lines of code. It is a procedural cost estimation model for software projects and is often used as a process for reliably predicting the various parameters associated with a project, such as size, effort, cost, time, and quality. It was proposed by Barry Boehm in 1981 and is based on the study of 63 projects, which makes it one of the best-documented models.
The key parameters which define the quality of any software product, and which are also an outcome of COCOMO, are primarily Effort and Schedule:
Effort: the amount of labour that will be required to complete a task. It is measured in person-month units.
Schedule: simply the amount of time required for the completion of the job, which is, of course, proportional to the effort put in. It is measured in units of time such as weeks or months.
Different models of Cocomo have been proposed to predict the cost estimation at different levels,
based on the amount of accuracy and correctness required. All of these models can be applied to a
variety of projects, whose characteristics determine the value of constant to be used in subsequent
calculations. These characteristics pertaining to different system types are mentioned below.
Boehm’s definition of organic, semidetached, and embedded systems:

 Organic – A software project is said to be an organic type if the team size required is
adequately small, the problem is well understood and has been solved in the past and also
the team members have a nominal experience regarding the problem.
 Semi-detached – A software project is said to be a Semi-detached type if the vital
characteristics such as team-size, experience, knowledge of the various programming
environment lie in between that of organic and Embedded. The projects classified as
Semi-Detached are comparatively less familiar and difficult to develop compared to the
organic ones and require more experience and better guidance and creativity. Eg:
Compilers or different Embedded Systems can be considered of Semi-Detached type.
 Embedded – A software project requiring the highest level of complexity, creativity, and experience falls under this category. Such software requires a larger team size than the other two models, and the developers need to be sufficiently experienced and creative to develop such complex products.
All the above system types utilize different values of the constants used in Effort Calculations.

Types of Models: COCOMO consists of a hierarchy of three increasingly detailed and accurate forms. Any of the three forms can be adopted according to our requirements. The types of COCOMO models are:
1. Basic COCOMO Model
2. Intermediate COCOMO Model
3. Complete COCOMO Model
4. COCOMO 2

1. Basic COCOMO Model: The basic COCOMO model gives an approximate first estimate of the project parameters. The following expressions give the basic COCOMO estimation:
Effort = a1 × (KLOC)^a2 PM
Tdev = b1 × (Effort)^b2 months

Where

KLOC is the estimated size of the software product expressed in Kilo Lines of Code,

a1, a2, b1, b2 are constants for each group of software products,

Tdev is the estimated time to develop the software, expressed in months,

Effort is the total effort required to develop the software product, expressed in person months
(PMs).

Some insight into the basic COCOMO model can be obtained by plotting the estimated characteristics for different software sizes. A plot of estimated effort versus product size shows that the effort is somewhat superlinear in the size of the software product. Thus, the effort required to develop a product increases very rapidly with project size.

When the development time is plotted against the product size in KLOC, it can be observed that the development time is a sublinear function of the size of the product, i.e. when the size of the product increases by two times, the time to develop the product does not double but rises moderately. This can be explained by the fact that for larger products, a larger number of activities that can be carried out concurrently can be identified. These parallel activities can be carried out simultaneously by the engineers, which reduces the time to complete the project. Further, it can be observed that the development time is roughly the same for all three categories of products. For example, a 60 KLOC program can be developed in approximately 18 months, regardless of whether it is of organic, semidetached, or embedded type.

From the effort estimation, the project cost can be obtained by multiplying the required effort
by the manpower cost per month. But, implicit in this project cost computation is the
assumption that the entire project cost is incurred on account of the manpower cost alone. In
addition to manpower cost, a project would incur costs due to hardware and software required
for the project and the company overheads for administration, office space, etc.

It is important to note that the effort and the duration estimations obtained using the COCOMO
model are called a nominal effort estimate and nominal duration estimate. The term nominal
implies that if anyone tries to complete the project in a time shorter than the estimated duration,
then the cost will increase drastically. But, if anyone completes the project over a longer period
of time than the estimated, then there is almost no decrease in the estimated cost value.
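A short sketch, under the assumption of the commonly quoted Boehm constants for the three project classes (the same a/b values tabulated later for intermediate COCOMO), shows how the nominal effort and development time follow from the two expressions above. The 32 KLOC figure is an arbitrary illustrative input.

# Sketch of the basic COCOMO expressions:
#   Effort = a1 * (KLOC)^a2  (person-months)
#   Tdev   = b1 * (Effort)^b2 (months)

BASIC_COCOMO = {
    # project class: (a1, a2, b1, b2) -- commonly quoted Boehm values
    "organic":      (2.4, 1.05, 2.5, 0.38),
    "semidetached": (3.0, 1.12, 2.5, 0.35),
    "embedded":     (3.6, 1.20, 2.5, 0.32),
}

def basic_cocomo(kloc: float, project_class: str = "organic"):
    a1, a2, b1, b2 = BASIC_COCOMO[project_class]
    effort = a1 * kloc ** a2       # nominal effort in person-months
    tdev = b1 * effort ** b2       # nominal development time in months
    return effort, tdev

effort, tdev = basic_cocomo(32, "organic")       # hypothetical 32 KLOC product
print(f"Effort = {effort:.1f} PM, Tdev = {tdev:.1f} months")

# Multiplying the effort by the manpower cost per month gives a first cut at the
# project cost; hardware, software, and overhead costs are extra, as noted above.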

2. Intermediate COCOMO Model: The basic COCOMO model assumes that the effort is only a function of the number of lines of code and some constants determined by the type of software system. However, in reality, effort and schedule cannot be accurately computed from lines of code alone. The intermediate COCOMO model recognizes this fact and refines the initial estimates obtained through the basic COCOMO model by using a set of 15 cost drivers based on various attributes of software engineering.

Classification of Cost Drivers and their attributes:

(i) Product attributes
o Required software reliability extent
o Size of the application database
o The complexity of the product

(ii) Hardware attributes
o Run-time performance constraints
o Memory constraints
o The volatility of the virtual machine environment
o Required turnaround time

(iii) Personnel attributes
o Analyst capability
o Software engineering capability
o Applications experience
o Virtual machine experience
o Programming language experience

(iv) Project attributes
o Use of software tools
o Application of software engineering methods
o Required development schedule

Intermediate COCOMO equations:

E = ai × (KLOC)^bi × EAF
D = ci × (E)^di

Coefficients for intermediate COCOMO:

Project          ai     bi     ci     di
Organic          2.4    1.05   2.5    0.38
Semidetached     3.0    1.12   2.5    0.35
Embedded         3.6    1.20   2.5    0.32
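A brief sketch of how the refinement is applied in practice follows. EAF (the effort adjustment factor) is taken here as the product of the multipliers chosen for the rated cost drivers; the two multiplier values used below are illustrative assumptions (roughly the commonly quoted ratings for high required reliability and low analyst capability).

# Sketch of the intermediate COCOMO refinement:
#   E = ai * (KLOC)^bi * EAF,   D = ci * (E)^di

COEFFS = {
    "organic":      (2.4, 1.05, 2.5, 0.38),
    "semidetached": (3.0, 1.12, 2.5, 0.35),
    "embedded":     (3.6, 1.20, 2.5, 0.32),
}

def intermediate_cocomo(kloc, project_class, cost_driver_multipliers):
    ai, bi, ci, di = COEFFS[project_class]
    eaf = 1.0
    for m in cost_driver_multipliers:   # product of the selected cost driver ratings
        eaf *= m
    effort = ai * kloc ** bi * eaf      # adjusted effort (person-months)
    duration = ci * effort ** di        # development time (months)
    return effort, duration

# Illustrative run: high required reliability (1.15), low analyst capability (1.19),
# all other cost drivers nominal (multiplier 1.0).
print(intermediate_cocomo(32, "semidetached", [1.15, 1.19]))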

Complete COCOMO
A major shortcoming of both the basic and the intermediate COCOMO models is that they
consider a software product as a single homogeneous entity. However, most large systems are
made up of several smaller sub-systems. These sub-systems often have widely different
characteristics. For example, some sub-systems may be considered as organic type, some
semidetached, and some even embedded. Not only may the inherent development complexity
of the subsystems be different, but for some subsystem the reliability requirements may be
high, for some the development team might have no previous experience of similar
development, and so on.
The complete COCOMO model considers these differences in characteristics of the
subsystems and estimates the effort and development time as the sum of the estimates for the
individual sub-systems.
In other words, the cost to develop each sub-system is estimated separately, and the complete system cost is determined as the sum of the subsystem costs. This approach reduces the margin of error in the final estimate.
Let us consider the following development project as an example application of the complete COCOMO model. A distributed management information system (MIS) product for an organisation having offices at several places across the country can have the following sub-components:
• Database part
• Graphical user interface (GUI) part
• Communication part
Of these, the communication part can be considered as embedded software. The database part
could be semi-detached software, and the GUI part organic software. The costs for these three
components can be estimated separately, and summed up to give the overall cost of the
system.
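A minimal sketch of this summation, assuming basic COCOMO effort expressions and purely illustrative KLOC figures for the three sub-systems of the MIS example:

# Complete COCOMO idea: estimate each sub-system with the constants of its
# own class (organic / semidetached / embedded) and sum the results.

COCOMO_CONSTANTS = {
    "organic":      (2.4, 1.05),
    "semidetached": (3.0, 1.12),
    "embedded":     (3.6, 1.20),
}

def subsystem_effort(kloc: float, project_class: str) -> float:
    a, b = COCOMO_CONSTANTS[project_class]
    return a * kloc ** b                      # effort in person-months

subsystems = {
    "database (semi-detached)":  (15, "semidetached"),   # illustrative KLOC values
    "GUI (organic)":             (10, "organic"),
    "communication (embedded)":  (5,  "embedded"),
}

total = sum(subsystem_effort(kloc, cls) for kloc, cls in subsystems.values())
print(f"Total estimated effort = {total:.1f} PM")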
To further improve the accuracy of the results, the different parameter values of the model
can be fine-tuned and validated against an organisation’s historical project database to obtain
more accurate estimations. Estimation models such as COCOMO are not totally accurate and
lack a full scientific justification. Still, software cost estimation models such as COCOMO
are required for an engineering approach to software project management. Companies
consider computed cost estimates to be satisfactory if these are within about 80 per cent of the final cost. Although these estimates are gross approximations, without such models one has only subjective judgements to rely on.
COCOMO 2
Since the time that COCOMO estimation model was proposed in the early 1980s, the software
development paradigms as well as the characteristics of development projects have undergone a sea
change. The present day software projects are much larger in size and reuse of existing software to
develop new products has become pervasive. For example, component-based development and
service-oriented architectures (SoA) have become very popular.
COCOMO 2 provides three models to arrive at increasingly accurate cost estimations. These can be used to estimate project costs at different phases of the software product; as the project progresses, the models can be applied at the different stages of the same project.
Application composition model: as the name suggests, this can be used to estimate the cost of prototype development. A prototype is usually developed to resolve user interface issues.
Early design model: this supports estimation of cost at the architectural design stage.
Post-architecture model: this provides cost estimation during the detailed design and coding stages. The post-architecture model can be considered an update of the original COCOMO.
The other two models help consider the following two factors. Nowadays every software product is interactive and GUI-driven, and GUI development constitutes a significant part of the overall development effort. The second factor concerns several issues that affect productivity, such as the extent of reuse.

Staffing Level Estimation: Norden's Work

The project manager has to determine the staffing requirement after the effort required to develop the software has been estimated.

Norden investigated the staffing pattern of R&D projects.

Norden's estimation:

He studied the staffing patterns of R&D projects and proposed that staffing level patterns can be approximated by the Rayleigh distribution curve, which specifies the relationship between applied effort and delivery time for a software project. It is also called the Putnam-Norden-Rayleigh curve, or PNR curve.

He represented the Rayleigh curve by the equation:

E = (K / td^2) × t × e^(-t^2 / (2 × td^2))

Here E is the effort (staffing level of engineers and staff) required at time t,

K = the area under the curve (the total project effort),

td = the time at which the curve attains its maximum value.
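A small sketch of the curve, assuming illustrative values of K and td, shows the staffing build-up and tail-off that Norden observed:

import math

# Rayleigh staffing curve: E(t) = (K / td^2) * t * exp(-t^2 / (2 * td^2))
# K  = total effort (area under the curve), td = time of peak staffing.

def rayleigh_staffing(t: float, K: float, td: float) -> float:
    # Instantaneous effort (staffing level) at time t
    return (K / td ** 2) * t * math.exp(-t ** 2 / (2 * td ** 2))

K, td = 100.0, 12.0            # e.g. 100 PM of total effort, peak at month 12
for month in (3, 6, 12, 18, 24):
    print(month, round(rayleigh_staffing(month, K, td), 2))
# Staffing rises to its maximum at t = td and then falls off gradually.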

Putnam Resource Allocation Model


The Lawrence Putnam model describes the time and effort required to finish a software project of a specified size. Putnam makes use of the Norden/Rayleigh curve to estimate project effort, schedule, and defect rate.
Putnam noticed that software staffing profiles followed the well-known Rayleigh distribution.
Putnam used his observation about productivity levels to derive the software equation:

L = Ck × K^(1/3) × td^(4/3)

The various terms of this expression are as follows:

K is the total effort expended (in PM) in product development, and L is the product size estimate in KLOC.

td corresponds to the time of system and integration testing; therefore, td can be approximately considered as the time required to develop the product.

Ck is the state of technology constant and reflects constraints that impede the progress of the program.
Typical values of Ck = 2 for poor development environment

Ck= 8 for good software development environment

Ck = 11 for an excellent environment (in addition to following software engineering


principles, automated tools and techniques are used).

The exact value of Ck for a specific task can be computed from the historical data of the
organization developing it.

Putnam proposed that the optimal staff build-up on a project should follow the Rayleigh curve. Only a small number of engineers are required at the beginning of a project to carry out planning and specification tasks. As the project progresses and more detailed work becomes necessary, the number of engineers reaches a peak. After implementation and unit testing, the number of project staff falls.

Effect of a Schedule change on Cost


Putnam derived the following expression from the software equation:

K = L^3 / (Ck^3 × td^4)

where K is the total effort expended (in PM) in the product development,

L is the product size in KLOC,

td corresponds to the time of system and integration testing, and

Ck is the state of technology constant and reflects constraints that impede the progress of the program.

Now, using the above expression, it follows that for the same product size, K = C / td^4, where C = L^3 / Ck^3 is a constant.

(Project development effort is directly proportional to project development cost.)

From this, it can easily be observed that when the schedule of a project is compressed, the required development effort, as well as the project development cost, increases in proportion to the fourth power of the degree of compression. This means that a relatively small compression in the delivery schedule can result in a substantial penalty in terms of human effort as well as development cost.
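A worked sketch of this fourth-power trade-off, using made-up illustrative numbers:

# For a fixed product size, K = C / td^4 with C = L^3 / Ck^3, so compressing the
# schedule from td to td_new scales the required effort by (td / td_new)^4.

def compressed_effort(original_effort_pm: float, td: float, td_new: float) -> float:
    return original_effort_pm * (td / td_new) ** 4

# Compressing a 14-month, 60 PM project to 12 months:
print(round(compressed_effort(60, 14, 12), 1))   # about 111.2 PM, nearly double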

Halstead's Software Metrics

According to Halstead, "A computer program is an implementation of an algorithm considered to be a collection of tokens which can be classified as either operators or operands."

Token Count

In these metrics, a computer program is considered to be a collection of tokens, which may be classified as either operators or operands. All software science metrics can be defined in terms of these basic symbols. These symbols are called tokens.

The basic measures are

n1 = count of unique operators.


n2 = count of unique operands.
N1 = count of total occurrences of operators.
N2 = count of total occurrences of operands.

Halstead metrics are:

Program Volume (V)

The unit of measurement of volume is the standard unit for size "bits." It is the actual size of
a program if a uniform binary encoding for the vocabulary is used.

V=N*log2n
Program Difficulty

The difficulty level or error-proneness (D) of the program is proportional to the number of unique operators in the program.

D= (n1/2) * (N2/n2)

Programming Effort (E)

The unit of measurement of E is elementary mental discriminations.

E=D*V

Estimated Program Length

According to Halstead, the first hypothesis of software science is that the length of a well-structured program is a function only of the number of unique operators and operands.

N=N1+N2

And estimated program length is denoted by N^

N^ = n1 log2(n1) + n2 log2(n2)

Potential Minimum Volume

The potential minimum volume V* is defined as the volume of the shortest program in which a problem can be coded.

V* = (2 + n2*) × log2(2 + n2*)

Here, n2* is the count of unique input and output parameters.

Size of Vocabulary (n)

The size of the vocabulary of a program, which consists of the number of unique tokens used
to build a program, is defined as:

n=n1+n2
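Putting these formulas together, a short sketch computes the metrics from the four basic counts; the counts at the bottom are made-up illustrative values.

import math

# Halstead metrics from the basic measures n1, n2, N1, N2.

def halstead_metrics(n1: int, n2: int, N1: int, N2: int) -> dict:
    n = n1 + n2                                        # vocabulary size
    N = N1 + N2                                        # program length
    N_hat = n1 * math.log2(n1) + n2 * math.log2(n2)    # estimated program length
    V = N * math.log2(n)                               # program volume (bits)
    D = (n1 / 2) * (N2 / n2)                           # difficulty
    E = D * V                                          # effort (elementary mental discriminations)
    return {"n": n, "N": N, "N^": round(N_hat, 1), "V": round(V, 1),
            "D": round(D, 1), "E": round(E, 1)}

print(halstead_metrics(n1=10, n2=12, N1=40, N2=33))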

Schedule:
The Software schedule is directly correlated to the size of the project, efforts and
costs involved.
Scheduling is done for 3 primary reasons as listed below:
 To commit to the timeliness of the project.
 To estimate the resources required for the project execution.
 To Estimate the cost of the project in order to allocate funds and get approval.
Software Scheduling - Features:
 Scheduling is based on experience with similar projects.
 Software scheduling is done to ensure that critical milestone and dependency dates are achieved.
 The assumptions made for scheduling are well documented.
 The schedule is usually shared with the stakeholders, agreed upon, and signed off before kick-starting the actual development process.

Organization and Team Structures:


Personnel planning deals with staffing. Staffing deals with appointing personnel to the positions that are identified by the organizational structure.

It involves:

o Defining the requirements for personnel
o Recruiting (identifying, interviewing, and selecting candidates)
o Compensating
o Developing and promoting personnel

For personnel planning and scheduling, it is helpful to have effort and schedule estimates for the subsystems and the basic components in the system.

At planning time, when the system design has not been completed, the planner can at best expect to know the major subsystems in the system and possibly the major modules in these subsystems.

Once the project plan is estimated, and the effort and schedule of the various phases and tasks are known, staff requirements can be determined.

From the cost and overall duration of the project, the average staff size for the project can be determined by dividing the total effort (in person-months) by the overall project duration (in months).
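A one-line sketch of this rule, with illustrative figures (roughly the nominal basic COCOMO outputs for a medium-sized organic product):

# Average staff size = total effort (person-months) / project duration (months)

def average_staff_size(total_effort_pm: float, duration_months: float) -> float:
    return total_effort_pm / duration_months

print(round(average_staff_size(91.0, 14.0), 1))   # about 6.5 engineers on average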

Typically, the staff required for the project is small during requirement and design, the
maximum during implementation and testing, and drops again during the last stage of
integration and testing.

Using the COCOMO model, the average staff requirement for the various phases can be calculated once the effort and schedule for each phase are known.

When the schedule and average staff level for every activity are known, the overall personnel allocation for the project can be planned.
This plan will indicate how many people will be required for different activities at different
times for the duration of the project.

The total effort for each month and the total effort for each step can easily be calculated from
this plan.

Team Structure (Staffing):
Team structure addresses the issue of organization of the individual project teams. There are several possible ways in which the different project teams can be organized. There are primarily three formal team structures: chief programmer, ego-less or democratic, and mixed (controlled decentralized) team organizations, though several other variations of these structures are possible. Problems of various complexities and sizes often need different team structures for their optimal solution.

Ego-Less or Democratic Teams


Ego-less teams consist of a small team of programmers. The objectives of the group are set by consensus, and input from each member is taken for significant decisions. Group leadership rotates among the group members. Due to this nature, ego-less teams are also known as democratic teams.

This structure allows input from all members, which can lead to better decisions on difficult problems. This suggests that this structure is well suited for long-term, research-type projects that do not have time constraints.
Chief Programmer Team
A chief-programmer team, in contrast to the ego-less team, has a hierarchy. It consists of a
chief-programmer, who has a backup programmer, a program librarian, and some
programmers.

The chief programmer is responsible for all major technical decisions of the project.

He does most of the design, and he assigns coding of the different parts of the design to the programmers.

The backup programmer helps the chief programmer make technical decisions and takes over as chief programmer if the chief programmer falls sick or leaves.

The program librarian is responsible for maintaining the documentation and other communication-related work.

This structure considerably reduces the number of interpersonal communication paths in the team.

Controlled Decentralized Team


(Hierarchical Team Structure)
A third team structure known as the controlled decentralized team tries to combine the strength
of the democratic and chief programmer teams.

It consists of a project leader who has a group of senior programmers under him, while under every senior programmer is a group of junior programmers.

The group formed by a senior programmer and his junior programmers behaves like an ego-less team, but communication among different groups occurs only through the senior programmers of the groups.

The senior programmer also communicates with the project leader.

Such a team has fewer communication paths than a democratic team but more paths compared
to a chief programmer team.

This structure works best for large projects that are reasonably straightforward. It is not well
suited for simple projects or research-type projects.

What is Risk?
"Tomorrow problems are today's risk." Hence, a clear definition of a "risk" is a problem that
could cause some loss or threaten the progress of the project, but which has not happened yet.
These potential issues might harm cost, schedule or technical success of the project and the
quality of our software device, or project team morale.

Risk Management is the system of identifying addressing and eliminating these problems
before they can damage the project.

We need to differentiate risks, as potential issues, from the current problems of the project.

Different methods are required to address these two kinds of issues.

For example, a staff shortage because we have not been able to select people with the right technical skills is a current problem, but the threat of our technical people being hired away by the competition is a risk.

Risk Management
A software project can be exposed to a large variety of risks. In order to be able to systematically identify the significant risks which might affect a software project, it is essential to classify risks into different classes. The project manager can then check which risks from each class are relevant to the project.

There are three main classifications of risks which can affect a software project:

1. Project risks
2. Technical risks
3. Business risks

1. Project risks: Project risks concern various forms of budgetary, schedule, personnel, resource, and customer-related problems. A vital project risk is schedule slippage. Since software is intangible, it is very difficult to monitor and control a software project. It is very difficult to control something which cannot be seen. For any manufacturing project, such as the manufacturing of cars, the project manager can see the product taking shape.

2. Technical risks: Technical risks concern potential design, implementation, interfacing, testing, and maintenance issues. They also include ambiguous specifications, incomplete specifications, changing specifications, technical uncertainty, and technical obsolescence. Most technical risks appear due to the development team's insufficient knowledge about the project.

3. Business risks: This type of risk includes the risk of building an excellent product that no one needs, losing budgetary or personnel commitments, etc.

Other risk categories

1. Known risks: Those risks that can be uncovered after careful assessment of the project plan, the business and technical environment in which the project is being developed, and other reliable information sources (e.g., an unrealistic delivery date).
2. Predictable risks: Those risks that are extrapolated from previous project experience (e.g., staff turnover).
3. Unpredictable risks: Those risks that can and do occur, but are extremely difficult to identify in advance.

Principle of Risk Management


1. Global Perspective: In this, we review the bigger system description, design, and
implementation. We look at the chance and the impact the risk is going to have.
2. Take a forward-looking view: Consider the threats that may appear in the future and create plans for directing future events.
3. Open Communication: This is to allow the free flow of communication between the client and the team members so that they have clarity about the risks.
4. Integrated management: In this method risk management is made an integral part of
project management.
5. Continuous process: In this phase, the risks are tracked continuously throughout the
risk management paradigm.

Software Configuration Management


When we develop software, the product (software) undergoes many changes during its maintenance phase; we need to handle these changes effectively.

Several individuals (programmers) work together to achieve these common goals. These individuals produce several work products (SC items), e.g., intermediate versions of modules, test data used during debugging, and parts of the final product.

The elements that comprise all information produced as a part of the software process are collectively called a software configuration.

As software development progresses, the number of software configuration items (SCIs) grows rapidly.

These are handled and controlled by SCM. This is where we require software
configuration management.

A configuration of the product refers not only to the product's constituents but also to a particular version of each component.

Therefore, SCM is the discipline which:

o Identifies changes
o Monitors and controls changes
o Ensures the proper implementation of changes made to an item
o Audits and reports on the changes made

Configuration Management (CM) is a technique of identifying, organizing, and controlling modifications to the software being built by a programming team.

The objective is to maximize productivity by minimizing mistakes (errors).

CM is essential due to the inventory management, library management, and update management of the items needed for the project.

Why do we need Configuration Management?


Multiple people work on software which is constantly being updated. It may be a situation where multiple versions, branches, and authors are involved in a software project, and the team is geographically distributed and works concurrently. Changes in user requirements, policy, budget, and schedule need to be accommodated.

Importance of SCM
It is practical in controlling and managing access to the various SCIs, e.g., by preventing two members of a team from checking out the same component for modification at the same time.

It provides the tools to ensure that changes are being properly implemented.

It has the capability of describing and storing the various constituents of the software.

SCM is useful in keeping a system in a consistent state by automatically producing derived versions upon modification of a component.
