Software Process and Project Metrics
Chapter 4
Software metrics
Measurement can be applied to the software process with the intent of improving it; to assist in estimation, quality control, productivity assessment, and project control; to help assess the quality of technical work products; and to assist in tactical decision making as a project proceeds.
Process indicators
enable a software engineering organization to gain insight into the efficacy of an existing process (i.e., the paradigm, software engineering tasks, work products, and milestones). They enable managers and practitioners to assess what works and what doesn't. Process metrics are collected across all projects and over long periods of time. Their intent is to provide indicators that lead to long-term software process improvement.
Project indicators
enable a software project manager to:
- assess the status of an ongoing project
- track potential risks
- uncover problem areas before they "go critical"
- adjust work flow or tasks
- evaluate the project team's ability to control the quality of software engineering work products
Process Metrics and Software Process Improvement
The only rational way to improve any process is to:
- measure specific attributes of the process
- develop a set of meaningful metrics based on these attributes
- use the metrics to provide indicators that will lead to a strategy for improvement
[Figure: Determinants of software quality and organizational effectiveness. The process sits at the center, connected to people, product, and technology, and is influenced by customer characteristics, business conditions, and the development environment; outcomes flow from the process.]
We measure the efficacy of a software process indirectly, based on the outcomes that can be derived from the process. Outcomes include:
- measures of errors uncovered before release of the software
- defects delivered to and reported by end users
- work products delivered
- human effort expended
- calendar time expended
- schedule conformance
We derive process metrics by measuring the characteristics of specific software engineering tasks.
- measure the effort and time spent performing the umbrella activities
- measure the generic software engineering activities
Private metric
There are "private" and "public" uses for different types of process data. Data private to the individual:
serve as an indicator for the individual only.
Examples of metrics private to the individual:
- defect rates (by individual)
- defect rates (by module)
- errors found during development
The Personal Software Process (PSP) is a structured set of process descriptions, measurements, and methods that can help engineers improve their personal performance. Some process metrics are private to the software project team but public to all team members:
- defects reported for major software functions
- errors found during formal technical reviews
- lines of code or function points per module and function
Public metrics
Public metrics assimilate information that originally was private to individuals and teams. Project-level defect rates, effort, calendar times, and related data are collected and evaluated in an attempt to uncover indicators that can improve organizational process performance.
[Figure: Causes of defects and their origin - a breakdown of defect causes (e.g., logic 20%, software interface 6%) with a fishbone for specification defects: missing, ambiguous, wrong customer queried, customer gave wrong information, inadequate inquiries, used outdated information, incorrect, changes.]
Project Metrics
Used for tactical purposes by a project manager and a software team to adapt project work flow and technical activities.
Production rates: pages of documentation, review hours, function points, delivered source lines.
SOFTWARE MEASUREMENT
Direct measures (e.g., the length of a bolt) and indirect measures (e.g., the "quality" of bolts produced, measured by counting rejects).
Direct measures:
- lines of code (LOC)
- execution speed
- memory size
- defects reported over some set period of time
Indirect measures:
- functionality, quality, complexity, efficiency, reliability, maintainability, and many other "-abilities"
Size-Oriented Metrics
Derived by normalizing quality and/or productivity measures by considering the "size" of the software.
A set of simple size-oriented metrics can be developed for each project:
- errors per KLOC (thousand lines of code)
- defects per KLOC
- $ per LOC
- pages of documentation per KLOC
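The size-oriented metrics above reduce to simple ratios over the delivered LOC. The project numbers below are hypothetical, chosen only to show the arithmetic:

```python
# Hypothetical project data -- illustrative values, not from the chapter.
loc = 12_100        # delivered lines of code
errors = 134        # errors found before release
defects = 29        # defects reported after release
cost = 168_000      # total project cost in dollars
doc_pages = 365     # pages of documentation produced

kloc = loc / 1000   # normalize by thousands of lines

print(f"errors per KLOC:      {errors / kloc:.2f}")
print(f"defects per KLOC:     {defects / kloc:.2f}")
print(f"$ per LOC:            {cost / loc:.2f}")
print(f"doc pages per KLOC:   {doc_pages / kloc:.2f}")
```

Collected across many projects, ratios like these become the baseline against which a new project is judged.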
Opponents claim:
- LOC measures are programming language dependent
- they penalize well-designed but shorter programs
- they cannot easily accommodate nonprocedural languages
- their use in estimation requires a level of detail that may be difficult to achieve (i.e., the planner must estimate the LOC to be produced long before analysis and design have been completed)
Function-Oriented Metrics
Use a measure of the functionality delivered by the application as a normalization value. Function-oriented metrics were first proposed by Albrecht.
Function points
To compute function points, five measurement parameters are counted: number of user inputs, number of user outputs, number of user inquiries, number of files, and number of external interfaces. Each count is multiplied by a weighting factor, and the weighted counts are summed to give the count-total.
Number of user inputs: each user input that provides distinct application-oriented data to the software is counted. Inputs should be distinguished from inquiries, which are counted separately.
Number of user outputs: each user output that provides application-oriented information to the user is counted. In this context, output refers to reports, screens, error messages, and so on. Individual data items within a report are not counted separately.
Number of user inquiries: an inquiry is defined as an on-line input that results in the generation of some immediate software response in the form of an on-line output. Each distinct inquiry is counted.
Number of files: each logical master file (i.e., a logical grouping of data that may be one part of a large database or a separate file) is counted.
Number of external interfaces: all machine-readable interfaces (e.g., data files on tape or disk) that are used to transmit information to another system are counted.
ITU DEPARTMENT OF COMPUTER ENGINEERING SOFTWARE ENGINEERING
Each of the 14 complexity adjustment questions below is rated on a scale of 0 to 5: 0 = no influence, 1 = incidental, 2 = moderate, 3 = average, 4 = significant, 5 = essential.
1. Does the system require reliable backup and recovery?
2. Are data communications required?
3. Are there distributed processing functions?
4. Is performance critical?
5. Will the system run in an existing, heavily utilized operational environment?
6. Does the system require on-line data entry?
7. Does the on-line data entry require the input transaction to be built over multiple screens or operations?
8. Are the master files updated on-line?
9. Are the inputs, outputs, files, or inquiries complex?
10. Is the internal processing complex?
11. Is the code designed to be reusable?
12. Are conversion and installation included in the design?
13. Is the system designed for multiple installations in different organizations?
14. Is the application designed to facilitate change and ease of use by the user?
Function Point
Once function points have been calculated, they are used in a manner analogous to LOC to normalize measures of software productivity, quality, and other attributes:
- errors per FP
- defects per FP
- $ per FP
- pages of documentation per FP
- FP per person-month
Function                                        Estimated LOC
User interface and control facilities (UICF)        2,300
Two-dimensional geometric analysis (2DGA)           5,300
Three-dimensional geometric analysis (3DGA)         6,800
Database management (DBM)                           3,350
Computer graphics display facilities (CGDF)         4,950
Peripheral control (PC)                             2,100
Design analysis modules (DAM)                       8,400
Estimated lines of code                            33,200

Each function's LOC is a three-point expected value. For 3DGA: optimistic 4,600; most likely 6,900; pessimistic 8,600; expected = (4,600 + 4 x 6,900 + 8,600) / 6 = 6,800 LOC.

AN EXAMPLE OF FP-BASED ESTIMATION
Information domain value        opt.  likely  pess.  est. count  weight  FP count
Number of inputs                 20     24     30        24         4       96
Number of outputs                12     15     22        16         5       80
Number of inquiries              16     22     28        22         4       88
Number of files                   4      4      5         4        10       40
Number of external interfaces     2      2      3         2         7       14
Count-total                                                                318
Values of the complexity adjustment factors Fi for the 14 questions: 4, 2, 0, 4, 3, 4, 5, 3, 5, 5, 4, 3, 5, 5. Their sum is 52, giving an adjustment multiplier of 0.65 + 0.01 x 52 = 1.17.
The estimated number of FP is derived:
FPestimated = count-total x [0.65 + 0.01 x sum(Fi)] = 318 x 1.17 = 372
Historical data normalized using function points indicate that the organizational average productivity for systems of this type is 6.5 FP/pm. With a burdened labor rate of $8,000 per month, the cost per FP is approximately $1,230. Based on the FP estimate and the historical productivity data, the total estimated project cost is $457,000 and the estimated effort is 58 person-months.
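The whole FP-based estimate can be reproduced in a few lines. The counts, weights, and Fi values come from the example; the function and variable names are my own:

```python
def expected(opt, most_likely, pess):
    """Three-point (beta) estimate: (opt + 4*most_likely + pess) / 6, rounded."""
    return round((opt + 4 * most_likely + pess) / 6)

# (optimistic, most likely, pessimistic, weight) per information-domain value
domain = {
    "inputs":              (20, 24, 30, 4),
    "outputs":             (12, 15, 22, 5),
    "inquiries":           (16, 22, 28, 4),
    "files":               (4, 4, 5, 10),
    "external interfaces": (2, 2, 3, 7),
}

count_total = sum(expected(o, m, p) * w for o, m, p, w in domain.values())
fi = [4, 2, 0, 4, 3, 4, 5, 3, 5, 5, 4, 3, 5, 5]   # answers to the 14 questions
fp = count_total * (0.65 + 0.01 * sum(fi))         # 318 * 1.17

productivity = 6.5   # FP per person-month (historical average)
labor_rate = 8000    # burdened $ per person-month

print(f"count-total = {count_total}")                    # 318
print(f"FP estimated = {fp:.0f}")                        # 372
print(f"cost per FP = ${labor_rate / productivity:,.0f}")
print(f"effort = {fp / productivity:.1f} person-months")
print(f"total cost = ${fp * labor_rate / productivity:,.0f}")
```

The effort works out to about 57.2 person-months, which the slide rounds up to 58, and the total cost to roughly $458,000, matching the slide's $457,000 to within rounding.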
To compute the feature point, information domain values are again counted and weighted as described previously. In addition, the feature point metric counts a new software characteristic: algorithms. An algorithm is defined as "a bounded computational problem that is included within a specific computer program."
3D Function Point
3D function points integrate the data dimension of software with the functional and control dimensions.
Data dimension: counts of retained data (the internal program data structure, e.g., files) and external data (inputs, outputs, inquiries, and external references).
Functional dimension: measured by considering "the number of internal operations required to transform input to output data."
Transformation: a series of processing steps that are constrained by a set of semantic statements. A transformation is accomplished with an algorithm that results in a fundamental change to input data as it is processed to become output data. For example, acquiring data from a file and simply placing that data into program memory does not qualify as a transformation, because the data are not fundamentally changed.
The level of complexity assigned to each transformation is a function of:
- the number of processing steps
- the number of semantic statements that control the processing steps

Processing steps      Semantic statements
                      1-5       6-10      11+
1-10                  low       low       low
11-20                 low       average   high
21+                   average   high      high
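The complexity rating can be expressed as a small lookup function. The bucket boundaries follow the table; the function name is illustrative:

```python
def transformation_complexity(steps: int, statements: int) -> str:
    """Rate a transformation low/average/high from its processing-step and
    semantic-statement counts, per the 3D function point complexity table."""
    step_row = 0 if steps <= 10 else 1 if steps <= 20 else 2
    stmt_col = 0 if statements <= 5 else 1 if statements <= 10 else 2
    table = [
        ["low",     "low",     "low"],    # 1-10 processing steps
        ["low",     "average", "high"],   # 11-20 processing steps
        ["average", "high",    "high"],   # 21+ processing steps
    ]
    return table[step_row][stmt_col]

print(transformation_complexity(8, 3))    # low
print(transformation_complexity(15, 12))  # high
```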
Control Dimension
Measured by counting the number of transitions between states.
A state represents some externally observable mode of behavior, and a transition occurs as a result of some event that causes the software or system to change its mode of behavior. For example, cellular phone software moves from a resting state to a dialing state when an auto-dial event occurs.
To compute 3D function points:
index = I + O + Q + F + E + T + R
where I, O, Q, F, E, T, and R represent the complexity-weighted values for inputs, outputs, inquiries, internal data structures, external files, transformations, and transitions, respectively. Each complexity-weighted value is computed using the following relationship:
complexity weighted value = Nil x Wil + Nia x Wia + Nih x Wih
where Nil, Nia, and Nih represent the number of occurrences of element i (e.g., outputs) for each level of complexity (low, average, high), and Wil, Wia, and Wih are the corresponding weights.
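A minimal sketch of the index computation, assuming the standard low/average/high FP weights; the occurrence counts below are invented for illustration:

```python
# (low, average, high) weights per element type -- standard FP weight values
WEIGHTS = {
    "inputs":          (3, 4, 6),
    "outputs":         (4, 5, 7),
    "inquiries":       (3, 4, 6),
    "data_structures": (7, 10, 15),
    "external_files":  (5, 7, 10),
    "transformations": (7, 10, 15),
}

def weighted_value(counts, weights):
    """Nil*Wil + Nia*Wia + Nih*Wih for one element type."""
    return sum(n * w for n, w in zip(counts, weights))

# (low, average, high) occurrence counts -- illustrative, not from the slides
counts = {
    "inputs": (5, 3, 1), "outputs": (4, 2, 0), "inquiries": (3, 1, 0),
    "data_structures": (2, 1, 0), "external_files": (1, 0, 0),
    "transformations": (2, 2, 1),
}
transitions = 11   # transitions are counted directly, with no complexity weighting

index = sum(weighted_value(counts[k], WEIGHTS[k]) for k in WEIGHTS) + transitions
print(index)   # 161
```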
Function Point
The function point, like the LOC measure, is controversial.
Proponents claim:
- FP is programming language independent, making it ideal for applications using conventional and nonprocedural languages
- it is based on data that are more likely to be known early in the evolution of a project
Opponents claim:
- the method requires some "sleight of hand" in that computation is based on subjective, rather than objective, data
- counts of the information domain (and other dimensions) can be difficult to collect after the fact
- FP has no direct physical meaning: it's just a number
Weights used to compute 3D function points, by complexity level:

Element                     low   average   high
Internal data structures      7     10       15
External interface files      5      7       10
Inputs                        3      4        6
Outputs                       4      5        7
Inquiries                     3      4        6
Transformations               7     10       15
Transitions                  (counted directly; no complexity weighting)
Software Productivity
Five important factors that influence software productivity:
- People factors: the size and expertise of the development organization.
- Problem factors: the complexity of the problem to be solved and the number of changes in design constraints or requirements.
- Process factors: analysis and design techniques that are used, languages and CASE tools available, and review techniques.
- Product factors: reliability and performance of the computer-based system.
- Resource factors: availability of CASE tools and hardware and software resources.
Measuring Quality
Correctness: the degree to which the software performs its required function. A common measure is defects per KLOC, where a defect is defined as a verified lack of conformance to requirements.
Maintainability: the ease with which a program can be corrected if an error is encountered, adapted if its environment changes, or enhanced if the customer desires a change in requirements. A simple time-oriented metric is mean-time-to-change (MTTC): the time it takes to analyze the change request, design an appropriate modification, implement the change, test it, and distribute the change to all users.
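MTTC reduces to a mean over per-change turnaround times. A sketch with invented values:

```python
# Elapsed days from change request to distribution, one entry per change
# request -- illustrative data, not from the chapter.
change_times_days = [4.5, 7.0, 3.0, 9.5, 6.0]

# Mean-time-to-change: the average turnaround across all change requests
mttc = sum(change_times_days) / len(change_times_days)
print(f"MTTC = {mttc:.1f} days")   # MTTC = 6.0 days
```

A falling MTTC over successive releases suggests the program is becoming easier to maintain.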
Measuring Quality
Integrity: measures a system's ability to withstand attacks (both accidental and intentional) on its security. Attacks can be made on all three components of software: programs, data, and documents. To measure integrity, two additional attributes must be defined, threat and security.
Threat: the probability that an attack of a specific type will occur within a given time.
Security: the probability that an attack of a specific type will be repelled.
integrity = sum of [1 - threat x (1 - security)]
where threat and security are summed over each type of attack.
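The integrity formula is straightforward to evaluate. The threat and security probabilities below are illustrative, not measured values:

```python
def integrity(attacks):
    """Sum of [1 - threat * (1 - security)] over attack types, per the slide.
    Each attack type is a (threat, security) probability pair."""
    return sum(1 - t * (1 - s) for t, s in attacks)

# One attack type: 25% chance of attack, 95% chance it is repelled
value = integrity([(0.25, 0.95)])
print(round(value, 4))   # 0.9875
```

With a single attack type the result stays close to 1: even though attacks are fairly likely, nearly all are repelled.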
Measuring Quality
Usability: "user friendliness." If a program is not user friendly, it is often doomed to failure, even if the functions it performs are valuable. User friendliness can be measured in terms of four characteristics:
- the physical and/or intellectual skill required to learn the system
- the time required to become moderately efficient in the use of the system
- the net increase in productivity measured when the system is used by someone who is moderately efficient
- a subjective assessment of users' attitudes toward the system
Defect removal efficiency (DRE) for a software engineering activity i:
DREi = Ei / (Ei + Ei+1)
where Ei = number of errors found during software engineering activity i, and Ei+1 = number of errors found during software engineering activity i+1 that are traceable to errors that were not discovered in software engineering activity i.
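DRE is a single ratio; a sketch with invented error counts:

```python
def dre(e_i: int, e_next: int) -> float:
    """Defect removal efficiency: E_i / (E_i + E_{i+1}).
    e_i: errors found during activity i.
    e_next: errors found in activity i+1 that are traceable to activity i."""
    return e_i / (e_i + e_next)

# e.g., 45 errors caught in design review, 5 design errors that leaked
# into coding -- illustrative numbers
print(f"{dre(45, 5):.2f}")   # 0.90
```

A DRE approaching 1.0 means the activity's filtering (e.g., its reviews) catches nearly all errors before they propagate downstream.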
By using measurement to establish a project baseline, each of these issues becomes more manageable.
Answers to these questions can be determined if metrics have been collected and used as a technical guide.
A Listserv mailing list that addresses function point metrics has been established. To subscribe, send mail to: [email protected]
SUBJECT: (this field must be empty)
CONTENT: SUB FUNCTION.POINT.LIST "Your name"
An up-to-date list of World Wide Web references for software process metrics can be found at: http://www.rspa.com