A Module On Introduction and Measuring Errors in Numerical Methods
Precision
Refers to how closely individual computed or measured values agree with each other.
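For instance, a minimal sketch in Python (the readings below are invented for illustration and are not data from the module):

```python
# Precision viewed as the spread of repeated values: the closer the individual
# values are to each other, the smaller the spread.
readings = [9.81, 9.79, 9.80, 9.82, 9.80]   # hypothetical repeated measurements

spread = max(readings) - min(readings)
print(f"spread = {spread:.2f}")              # a small spread indicates high precision
```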
Errors are intrinsic to the understanding and effective use of numerical methods.
Numerical Errors - arise from the use of approximations to represent exact mathematical operations and quantities.
Examples:
1. Truncation Error: keeping only a finite number of terms of an infinite series, e.g.
   $e^x = 1 + \frac{x}{1!} + \frac{x^2}{2!} + \frac{x^3}{3!} + \cdots$
2. Round-off Error: when approximate numbers are used to represent exact numbers
   (π ≈ 3.1416). Both examples are sketched in code below.
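A minimal sketch in Python of both examples (the helper name exp_taylor and the choices x = 1 and four terms are assumptions for illustration, not part of the module):

```python
import math

# 1) Truncation error: keep only the first n terms of the series for e^x.
def exp_taylor(x, n_terms):
    return sum(x**k / math.factorial(k) for k in range(n_terms))

x, n = 1.0, 4
approx = exp_taylor(x, n)
print(f"truncation error with {n} terms: {math.exp(x) - approx:.2e}")   # ~5.16e-02

# 2) Round-off error: a rounded value of pi stands in for the exact number.
pi_rounded = 3.1416
print(f"round-off error for pi: {math.pi - pi_rounded:.2e}")            # ~-7.35e-06
```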
Why measure errors?
1) To determine the accuracy of numerical results.
2) To develop stopping criteria for iterative algorithms (see the sketch below).
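For the second point, a minimal sketch of a stopping criterion, assuming an iterative square-root routine (Heron's method) and a tolerance on the approximate relative error between successive iterates; these specifics are illustrative and not taken from the module:

```python
def sqrt_iterative(a, tol=1e-8, max_iter=50):
    """Heron's iteration for sqrt(a); stop when the approximate relative
    error between successive iterates falls below the tolerance."""
    x = a                                   # initial guess
    for i in range(1, max_iter + 1):
        x_new = 0.5 * (x + a / x)
        approx_rel_error = abs((x_new - x) / x_new)
        x = x_new
        if approx_rel_error < tol:          # stopping criterion
            break
    return x, i

root, iterations = sqrt_iterative(2.0)
print(f"sqrt(2) ~= {root:.10f} after {iterations} iterations")
```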
True Error - defined as the difference between the true value in a calculation and the approximate value found using a numerical method, etc.
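As a formula (the symbol $E_t$ is a conventional label, not quoted from this excerpt), using the rounded value of π from the round-off example above:

$$E_t = \text{True Value} - \text{Approximate Value}, \qquad \text{e.g. } E_t = \pi - 3.1416 \approx -7.3 \times 10^{-6}$$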