Deflection Temperature of Plastics Under Flexural Load in The Edgewise Position
This standard has been approved for use by agencies of the Department of Defense.
1 This test method is under the jurisdiction of ASTM Committee D20 on Plastics and is the direct responsibility of Subcommittee D20.30 on Thermal Properties (Section D20.30.07). Current edition approved August 10, 2001. Published October 2001. Originally published as D 648 – 41 T. Last previous edition D 648 – 00a.
2 Annual Book of ASTM Standards, Vol 08.01.
3 Discontinued; see 1997 Annual Book of ASTM Standards, Vol 08.01.
4 Annual Book of ASTM Standards, Vol 08.03.
5 Annual Book of ASTM Standards, Vol 14.03.
6 Annual Book of ASTM Standards, Vol 14.02.
7 Available from American National Standards Institute, 11 W. 42nd St., 13th Floor, New York, NY 10036.
8 Mangum, B. W., "Platinum Resistance Thermometer Calibration," NBS Special Publication 250-22, 1987. Available from National Institute of Standards and Technology, Gaithersburg, MD.
D 648
edgewise position as a simple beam with the load applied at its center to give maximum fiber stresses of 0.455 MPa (66 psi) or 1.82 MPa (264 psi) (Note 3). The specimen is immersed under load in a heat-transfer medium provided with a means of raising the temperature at 2 ± 0.2°C/min. The temperature of the medium is measured when the test bar has deflected 0.25 mm (0.010 in.). This temperature is recorded as the deflection temperature under flexural load of the test specimen.

NOTE 3—A round robin has been conducted that showed that there is no advantage to using higher loads when measuring deflection temperature of present-day plastics with present-day instruments.

5. Significance and Use

5.1 This test is particularly suited to control and development work. Data obtained by this test method may not be used to predict the behavior of plastic materials at elevated temperatures except in applications in which the factors of time, temperature, method of loading, and fiber stress are similar to those specified in this test method. The data are not intended for use in design or predicting endurance at elevated temperatures.

6. Interferences

6.1 The results of the test may depend on the rate of heat transfer between the fluid and the specimen and the thermal conductivity of the fluid.

6.2 The results of this test may depend on the measured width and depth of the specimen and the final deflection at which the deflection temperature is determined.

6.3 The type of mold and the molding process used to produce test specimens affect the results obtained in this test. Molding conditions shall be in accordance with the standard for that material or shall be agreed upon by the cooperating laboratories.

6.4 Results of testing may be affected by the design of the test equipment. The test span (either 100 mm or 101.6 mm) will influence the resultant measurement. Instrumentation equipped with metal clips or other types of auxiliary supports designed to maintain specimens perpendicular to the applied load may affect the test results if the pressure is sufficient to restrict the downward motion of the specimen at its center.

7. Apparatus

7.1 The apparatus shall be constructed essentially as shown in Fig. 1 and shall consist of the following:

7.1.1 Specimen Supports, metal supports, allowing the load to be applied on top of the specimen vertically and midway between the supports, which shall be separated by a distance defined in 7.1.1.1 or 7.1.1.2. The contact edges of the supports and of the piece by which load is applied shall be rounded to a radius of 3 ± 0.2 mm (0.118 ± 0.008 in.).

7.1.1.1 Method A—101.6 ± 0.5 mm (4.0 ± 0.02 in.).
7.1.1.2 Method B—100.0 ± 0.5 mm (3.937 ± 0.020 in.).

NOTE 4—A test should be made on each apparatus using a test bar made of a material having a low coefficient of expansion.9 The temperature range to be used should be covered and a correction factor determined for each temperature. If this factor is 0.013 mm (0.0005 in.) or greater, its algebraic sign should be noted and the factor should be applied to each test by adding it algebraically to the reading of apparent deflection of the test specimen.

7.1.2 Immersion Bath—A suitable liquid heat-transfer medium (Note 5) in which the specimen shall be immersed. It shall be well-stirred during the test and shall be provided with a means of raising the temperature at a uniform rate of 2 ± 0.2°C/min. This heating rate shall be considered to be met if, over every 5-min interval during the test, the temperature of the bath rises 10 ± 1°C at each specimen location.

NOTE 5—A liquid heat-transfer medium shall be chosen which will not affect the specimen. Mineral oil is considered safe from ignition to 115°C. Silicone oils may be heated to about 260°C for short periods of time. For still higher temperatures, special heat-transfer media should be used. Improved performance with longer oil life may be obtained by the use of CO2 or other inert gas to isolate the oil surface from the atmosphere.

NOTE 6—A circulating air oven may be used if it can be shown that equivalent results are obtained.

7.1.3 Deflection Measurement Device, suitable for measuring specimen deflection of at least 0.25 mm (0.010 in.). It shall be readable to 0.01 mm (0.0005 in.) or better. The device may be a dial gage or any other indicating or recording device, including electric displacement sensing apparatus.

7.1.4 Weights—A set of weights of suitable sizes so that the specimen can be loaded to a fiber stress of 0.455 MPa (66 psi) ± 2.5 % or 1.82 MPa (264 psi) ± 2.5 %. The mass of the rod that applies the testing force shall be determined and included as part of the total load. If a dial gage is used, the force exerted by its spring shall be determined and shall be included as part of the load (Note 8). Calculate the testing force and the mass that must be added to achieve the desired stress as follows:

F = 2Sbd²/3L   (1)
F1 = F/9.80665
mw = (F – Fs)/9.80665 – mr

where:
F = load, N,
F1 = load, kgf,
S = fiber stress in the specimen (0.455 MPa or 1.82 MPa),
b = width of specimen, mm,
d = depth of specimen, mm,
L = distance between supports (101.6 mm—Method A, or 100 mm—Method B), see 7.1.1.1 and 7.1.1.2,
mw = added mass, kg,
Fs = force exerted by any spring-loaded component involved, N; this is a positive value if the thrust of the spring is towards the test specimen (downwards), a negative value if the thrust of the spring is opposing the descent of the rod, or zero if no such component is involved, and
mr = mass of the rod that applies the testing force to the specimen, kg.

NOTE 7—In some designs of this apparatus, the spring force of the dial gage is directed upward (opposite the direction of specimen loading), which reduces the net force applied to the specimen. In other designs, the spring force of the dial gage acts downward (in the direction of specimen loading), which increases the net force applied to the specimen. The mass applied to the loading rod must be adjusted accordingly (increased for upward dial force and decreased for downward dial force) to compensate. Since the force exerted by the spring in certain dial gages varies considerably over the stroke, this force should be measured in that part of the stroke that is to be used. Suggested procedures to determine the total load required to correct for the force of the dial gage spring are given in Appendix X1 and Appendix X2. Other procedures may be used if equivalent results are obtained. Appendix X3 provides a method of determining the spring force, the uniformity of the force in the gage's test measurement range, and whether the gage is contaminated and sticking.

7.1.5 Temperature Measurement System—Consisting of a thermocouple, thermometer, resistance thermometer, thermistor, etc., as the sensor, together with its associated conditioning and readout instrumentation to cover a suitable range. The thermometer shall be one of the following, or its equivalent, as prescribed in Specification E 1: Thermometer 1°C or 2°C, having ranges from –20 to 150°C or –5 to 300°C respectively, whichever temperature range is most suitable. Mercury-in-glass thermometers shall be calibrated for the depth of immersion in accordance with Test Method E 77. Thermocouples shall comply with the requirements of Specification E 608 and shall be calibrated in accordance with Test Method E 220. Resistance thermometers shall comply with the requirements of Test Methods E 644 and Specification E 1137. Thermistors shall comply with the requirements of Specification E 879 and be calibrated in accordance with NIST Special Publication 250-22.

7.2 Micrometers shall meet the requirements of Test Methods D 5947 and be calibrated in accordance with that test method.

8. Sampling

8.1 Unless otherwise specified, sampling shall be in accordance with the sampling procedure prescribed in Practice D 1898. Adequate statistical sampling shall be considered an acceptable alternative.

9. Test Specimen

9.1 At least two test specimens shall be used to test each sample at each fiber stress. The specimen shall be 127 mm (5 in.) in length, 13 mm (1⁄2 in.) in depth, by any width from 3 mm (1⁄8 in.) to 13 mm (1⁄2 in.). Tolerances on dimensions (for highly reproducible work) should be of the order of ±0.13 mm (0.005 in.) over the length of the specimen.

NOTE 8—The test results obtained on specimens approaching 13 mm in width may be 2 to 4°C above those obtained from 4-mm or narrower test specimens because of poor heat transfer through the specimen.

9 Invar or borosilicate glass have been found suitable for this purpose.
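For laboratories that script the 7.1.4 calculation, Eq 1 and the added-mass relation can be transcribed directly. The Python sketch below is illustrative only and not part of this test method; the function names and the example specimen values are our own.

```python
# Illustrative transcription of the 7.1.4 load calculation; not part of D 648.
STANDARD_GRAVITY = 9.80665  # conversion constant used in 7.1.4, m/s^2

def required_load_N(S_MPa, b_mm, d_mm, L_mm):
    """Eq 1: F = 2Sbd^2/3L. With S in MPa (N/mm^2) and b, d, L in mm, F is in N."""
    return 2.0 * S_MPa * b_mm * d_mm ** 2 / (3.0 * L_mm)

def added_mass_kg(F_N, Fs_N, mr_kg):
    """mw = (F - Fs)/9.80665 - mr; Fs is positive when the spring thrust is downward."""
    return (F_N - Fs_N) / STANDARD_GRAVITY - mr_kg

# Example: Method A span (101.6 mm), 4-mm wide by 13-mm deep bar at 1.82 MPa.
F = required_load_N(1.82, 4.0, 13.0, 101.6)   # ~8.07 N
mw = added_mass_kg(F, Fs_N=0.0, mr_kg=0.050)  # mass to add for a 50-g rod, no spring
```

With these example inputs the required load is about 8.07 N, so roughly 0.77 kg would be added to a hypothetical 50-g rod with no dial-gage spring force.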
9.2 The specimens shall have smooth flat surfaces free from saw cuts, excessive sink marks, or flash.

9.3 Molding conditions shall be in accordance with the specification for that material or shall be agreed upon by the cooperating laboratories. Discrepancies in test results due to variations in molding conditions may be minimized by annealing the test specimens before the test. Since different materials require different annealing conditions, annealing procedures shall be employed only if required by the material standard or if agreed upon by the cooperating laboratories.

10. Preparation of Apparatus

10.1 The apparatus shall be arranged so that the deflection of the specimen at midspan is measured by the deflection measurement device described in 7.1.3. The apparatus may be arranged to shut off the heat automatically and sound an alarm or record the temperature when the specified deflection has been reached. Sufficient heat-transfer liquid shall be used to cover the thermometers to the point specified in their calibration, or 76 mm (3 in.) in the case of the ASTM thermometers referred to in 7.1.5.

NOTE 9—It is desirable to have a means to cool the bath in order to reduce the time required to lower the temperature of the bath after the test has been completed. This may be accomplished by using a cooling coil installed in the bath, or an external heat-transfer system that passes the hot oil through it. If the rate of temperature rise of the oil is adversely affected by the presence of residual coolant in the coils, the coolant should be purged prior to starting the next test.

11. Conditioning

11.1 Conditioning—Condition the test specimens at 23 ± 2°C (73.4 ± 3.6°F) and 50 ± 5 % relative humidity for not less than 40 h prior to test in accordance with Procedure A of Practice D 618 unless otherwise specified in the material standard or contract between interested parties. In cases of disagreement, the tolerances shall be ±1°C (±1.8°F) and ±2 % relative humidity.

NOTE 10—Shorter conditioning periods may be used when it is shown that they do not affect the results of this test. Longer conditioning times may be required for some materials that continue to change with time.

12. Procedure

12.1 Measure the width and depth of each specimen with a suitable micrometer (as described in 7.2) at several points along the span. Average these respective readings to obtain the nominal width and depth values for the specimen. These values are used to determine the amount of applied force necessary to produce the specified fiber stress in each specimen (see 7.1.4).

12.2 Position the test specimens edgewise in the apparatus and ensure that they are properly aligned on the supports so that the direction of the testing force is perpendicular to the direction of the molding flow. If the specimen support unit has metal clips or auxiliary supports on it to hold the specimen perpendicular to the load and to prevent the specimen from being displaced by the circulating oil, only one surface of the clip or auxiliary support may touch the specimen at any one time. The presence of any clip or auxiliary support shall not impede the deflection of the specimen or place additional force on the specimen that would result in more load having to be applied to achieve deflection.

NOTE 11—Holding the specimens upright on the specimen supports by the use of clips or auxiliary supports that apply pressure to the specimen has been shown to alter the deflection temperature when testing at the 0.45 MPa stress level.

12.3 The thermometer bulb or sensitive part of the temperature measuring device shall be positioned as close as possible to the test specimen (within 10 mm) without touching it. The stirring of the liquid heat-transfer medium shall be sufficient to ensure that the temperature of the medium is within 1.0°C at any point within 10 mm of the specimen. If stirring is not sufficient to meet the 1.0°C requirement, then the temperature measuring device shall be placed at the same level as the specimen and within 10 mm of the point at which the specimen is loaded.

12.4 Ascertain that the temperature of the bath is suitable. The bath temperature shall be at ambient temperature at the start of the test unless previous tests have shown that, for the particular material under test, no error is introduced by starting at a higher temperature.

12.5 Carefully apply the loaded rod to the specimen and lower the assembly into the bath.

12.6 Adjust the load so that the desired stress of 0.455 MPa (66 psi) or 1.82 MPa (264 psi) is obtained.

NOTE 12—Verification of the load should be made on all new equipment, after replacement of dial gages, or following any other change that could affect the loading. Verification of the load should also be performed periodically to ensure that the equipment is within calibration (see Appendix X1, Appendix X2, and Appendix X3). Depending on the type of deflection measurement device used, it may be necessary to adjust the device such that it records the deflection in the displacement range of the device where the test is to be made.

12.7 Five minutes after applying the load, adjust the deflection measurement device to zero or record its starting position. Heat the liquid heat-transfer medium at a rate of 2.0 ± 0.2°C/min.

NOTE 13—The 5-min waiting period is provided to compensate partially for the creep exhibited by some materials at room temperature when subjected to the specified nominal surface stress. That part of the creep that occurs in the initial 5 min is usually a significant fraction of that which occurs in the first 30 min.

12.8 Record the temperature of the liquid heat-transfer medium at which the specimen has deflected the specified amount at the specified fiber stress.

NOTE 14—Continuous reading of the deflection versus temperature even beyond the standard deflection might be useful in special situations.

13. Report

13.1 Report the following information:
13.1.1 Full identification of the material tested,
13.1.2 Method of test specimen preparation,
13.1.3 Conditioning procedure,
13.1.4 Test method, reported as D 648 Method A or D 648 Method B,
13.1.5 The width and depth of the specimen, measured to 0.025 mm,
13.1.6 The standard deflection, the deflection temperature, and the resultant maximum fiber stress for each specimen,
13.1.7 The immersion medium, the temperature at the start of the test, and the actual heating rate,
13.1.8 Average deflection temperature,
13.1.9 Any nontypical characteristics of the specimen noted during the test or after removal from the apparatus (such as twisting, nonuniform bending, discoloration, swelling), and
13.1.10 Type of apparatus: automated or manual.

14. Precision and Bias

14.1 Precision—An interlaboratory test program10 was carried out with seven laboratories participating, utilizing both manual and automated instruments. Four polymers were included in the program. Statistical information is summarized in Table 1. The critical difference limits are the limits beyond which observed differences should be considered suspect.

TABLE 2 Precision, Deflection Temperature
Units Expressed in °C

Material             | Average | Sr(A) | SR(B) | r(C) | R(D)
ABS, 1.8 MPa         | 81.6    | 1.15  | 1.67  | 3.21 | 4.68
PP natural, 0.45 MPa | 83.8    | 3.11  | 4.71  | 8.70 | 13.20
PP filled, 0.45 MPa  | 114.7   | 2.16  | 4.62  | 6.06 | 12.92

(A) Sr = within-laboratory standard deviation for the indicated material, obtained by pooling the within-laboratory standard deviations of the test results from all of the participating laboratories: Sr = [[(S1)² + (S2)² + ··· + (Sn)²]/n]^(1/2)
(B) SR = between-laboratories reproducibility, expressed as standard deviation: SR = [Sr² + SL²]^(1/2), where SL = standard deviation of laboratory means.
(C) r = within-laboratory critical interval between two test results = 2.8 × Sr.
(D) R = between-laboratories critical interval between two test results = 2.8 × SR.
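The statistics defined in the table footnotes can be reproduced in a few lines. The sketch below is illustrative only; the per-laboratory standard deviations are invented values, not the interlaboratory data set.

```python
import math

def pooled_Sr(lab_sds):
    """Sr = [[(S1)^2 + ... + (Sn)^2]/n]^(1/2): pooled within-laboratory std deviation."""
    return math.sqrt(sum(s * s for s in lab_sds) / len(lab_sds))

def SR(sr, SL):
    """SR = [Sr^2 + SL^2]^(1/2), where SL is the std deviation of laboratory means."""
    return math.sqrt(sr * sr + SL * SL)

# Invented per-laboratory values for illustration:
sr = pooled_Sr([1.0, 1.2, 1.3])
sR = SR(sr, SL=1.2)
r, R = 2.8 * sr, 2.8 * sR  # within- and between-laboratory critical intervals
```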
ANNEX
(Mandatory Information)
A1.1 If the unit in operation is of the type that has only one temperature probe in the bath, and this probe is monitored to record the deflection temperature of the specimen at all the stations in the unit, then the following calibration and checks must be undertaken to ensure comparable results with units that have a temperature probe at each station.

A1.2 This procedure must be performed annually as a minimum to ensure proper temperature distribution and accuracy of probe and display.

A1.3 Calibration will require the use of a temperature meter and probe traceable to NIST, with accuracy and display resolution of 0.1°C or better, a stopwatch, and any tools needed to open and adjust the unit.

A1.3.1 Low-temperature calibration of the unit is accomplished by placing the NIST-traceable probe within 10 mm of specimen height, at three different points in the bath. The three points will be at the center and the left and right ends of the bath. Start with the station closest to the centralized probe, while the unit is programmed to maintain a constant temperature between 20 and 50°C, with all stirrers operating. Allow the bath to stabilize for a minimum of 5 min. Read and record the readout of the calibrated probe and the unit's internal temperature display to the nearest 0.1°C. Make any necessary adjustments to the unit's temperature controller to bring the bath to ±0.1°C of the bath set point, allowing a stabilization time of a minimum of 5 min between adjustment(s) and readings. Once the calibrated probe indicates the bath is at the set point, make adjustments to the centralized probe's display as necessary.

A1.3.1.1 Move the NIST-traceable probe to the other two points, maintaining the probe within 10 mm of specimen height. Read and record the temperatures at these points, after allowing the probe to stabilize a minimum of 5 min.

A1.3.2 High-temperature calibration will be accomplished by programming the unit to maintain an elevated temperature near, but not exceeding, the highest temperature allowed by the heat-transfer media. All covers and stations must be in place and stirrer motors operating. Place the NIST probe within 10 mm of specimen height at the station closest to the centralized probe, and allow the bath to stabilize for a minimum of 5 min. Read and record the readout of the calibrated probe and the unit's internal temperature display to the nearest 0.1°C. Make any necessary adjustments to the unit's temperature controller to bring the bath to ±0.1°C of the bath set point, allowing a stabilization time of a minimum of 5 min between adjustment(s) and readings. Once the calibrated probe indicates the bath is at the set point, make adjustments to the centralized probe's display as necessary.

A1.3.2.1 Move the NIST-traceable probe to the other two points, maintaining the probe within 10 mm of specimen height. Read and record the temperatures at these points, after allowing the probe to stabilize a minimum of 5 min.

A1.3.3 Evaluate the data from each of the three points in the bath at both low and high temperature. If any point is greater than ±0.5°C from the set point, have the unit serviced or repaired to correct this error. If it is not possible to correct the bath uniformity to less than 0.5°C, then a thermal sensing device must be placed at each station and used to record the temperature of the bath at the time of deflection while running tests. The unit may be electronically modified, or glass thermometers (as outlined in 7.1.5) may be placed at each station and manually read and recorded at the moment of specimen deflection.

A1.3.4 If the preceding steps have been taken and successfully completed, cool the bath down to a normal start temperature and allow the bath to stabilize. Place the NIST probe at the point in the bath where the preceding gathered data shows the greatest error. Start a test at 120°C/h. Read and record the temperature of both the unit's display and the readout of the NIST probe. An offset of 10 to 15 s between the two readings is acceptable as long as this interval is maintained throughout this test. Start the stopwatch when the first temperature is recorded. Read and record the temperature of the unit's display and the NIST probe, maintaining any delay interval, if used, every 5 min for 1 h.

A1.3.5 Evaluate the data acquired during the preceding test. Ensure that the temperature of the bath is rising at the correct rate, as outlined in 7.1.2, at both the centralized probe and the other selected test point. If either is outside the limits for the rate of rise, the unit must be serviced and rechecked before further use. If a unit fails to pass this calibration test, the unit must be serviced or replaced. Placing a temperature sensing device at each station will not correct the problem observed in A1.3.4, as the unit's rate of rise is outside the tolerances of this test method.
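The A1.3.5 evaluation of the readings collected in A1.3.4 (one reading every 5 min for 1 h at a nominal 120°C/h) can be automated against the 7.1.2 rate-of-rise limits of 10 ± 1°C over every 5-min interval. The sketch below is illustrative only; the function name is our own.

```python
def rise_rate_ok(temps_C, interval_min=5.0):
    """Check that every interval rises 10 ± 1 °C (2 ± 0.2 °C/min per 7.1.2).

    temps_C: bath temperatures recorded every `interval_min` minutes.
    """
    expected = 2.0 * interval_min  # 10 °C per 5-min interval
    tol = 0.2 * interval_min       # ±1 °C per 5-min interval
    rises = [b - a for a, b in zip(temps_C, temps_C[1:])]
    return all(abs(r - expected) <= tol for r in rises)

# A run tracking 120 °C/h passes; a stalled interval fails.
print(rise_rate_ok([30.0, 40.1, 50.0, 59.8]))  # True
print(rise_rate_ok([30.0, 40.0, 45.0]))        # False
```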
APPENDIXES
(Nonmandatory Information)
X1. PROCEDURE FOR DETERMINATION OF CORRECT SPECIMEN LOADING UTILIZING EQUILIBRIUM WEIGHING
OF THE LOADING ROD
X2. PROCEDURE FOR DETERMINATION OF CORRECT SPECIMEN LOADING BY WEIGHING THE APPLIED LOAD
WITH A TENSION-TESTING MACHINE
support platform and adjust the loading rod support so that the
tip of the loading rod is 12.7 mm (1⁄2 in.) from the top of the
specimen supports.
X2.2.4 Lubricate the rod and guide hole surfaces with light
oil.
X2.2.5 Adjust the dial gage so that it reads zero, then turn
the nut on top of the loading rod clockwise until the deflector
arm almost makes contact with the contact arm on top of the
dial gage.
X2.2.6 Start the lower crosshead in the up direction at the
rate of 0.51 mm (0.02 in.)/min. This in effect causes the loading
rod to move down as in an actual test. When the pointer on the
dial gage shows movement, activate the chart drive at the rate
of 1 in./min.
X2.2.7 Record the force, in grams, at 0.89 6 0.05-mm
(0.035 6 0.002-in.) deflection.
X2.2.8 Adjust the weight of the loading rod required to give
the desired maximum fiber stress in accordance with Eq 1.
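As a numerical companion to X2.2.7 and X2.2.8 (illustrative only; the helper names and example values are our own), the force recorded in grams can be converted to newtons and compared with the Eq 1 target under the ±2.5 % tolerance of 7.1.4:

```python
G = 9.80665  # standard gravity, m/s^2

def grams_to_newtons(grams):
    """Convert a chart reading in grams-force to newtons."""
    return grams / 1000.0 * G

def within_tolerance(measured_N, target_N, tol=0.025):
    """7.1.4 requires the applied stress, hence the force, to be within ±2.5 %."""
    return abs(measured_N - target_N) <= tol * target_N

# Hypothetical example: chart shows 810 g at 0.89 mm against an 8.07 N target.
measured = grams_to_newtons(810.0)       # ~7.94 N
print(within_tolerance(measured, 8.07))  # True
```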
X3. PROCEDURE FOR DETERMINATION OF CORRECT SPECIMEN LOADING BY WEIGHING THE APPLIED LOAD IN
SITU
X3.2.1 The apparatus shall be constructed essentially as shown in Fig. X3.1 and shall consist of the following:

X3.2.1.1 Electronic Weighing System with Load Cell (for example, digital scale or tensile testing machine), single-pan balance, or equal-arm laboratory balance, with a minimum capacity of 2000 g and a sensitivity of 0.1 g.

X3.2.1.2 Platform Assembly, for supporting the scale or balance above the deflection temperature bath unit.

X3.2.1.3 Mass Support Unit, to hold the loading rod and mass in position while the force measurement is determined.

X3.2.1.4 Adjustment Fitting, for connection of the mass support to the load cell or balance. This fitting should facilitate adjusting the test fixture so that the loading force can be measured at the desired position.

X3.3.5 If a scale or balance is used, position the platform assembly on top of the deflection temperature bath unit and level it. Place the scale or balance on top of the platform assembly and verify that it is level.

X3.3.6 Attach the adjustment fitting to the bottom of the load cell or balance.

X3.3.7 Attach the mass support to the bottom of the adjustment fitting.

X3.3.8 If a load cell is used, allow it to warm up before making the measurements. Tare out the weight due to the mass support and adjustment fitting.

X3.3.9 Position the mass support so that it bears the weight of the loading rod and mass.

X3.3.10 Verify that the load cell or balance, adjustment fitting, mass support, and loading rod are uniaxially aligned. It is very important to ensure that the test setup does not introduce any off-center loading into the system that will result in incorrect force measurements.

X3.3.11 Use the adjustment fitting to position the loading assembly so that it corresponds to the zero deflection position. Zero the deflection measurement device of the machine, if necessary. Dial gages should be adjusted in accordance with Appendix X5.

X3.3.12 Record the indicated load at the zero deflection position to the nearest 0.1 g.

X3.3.13 Use the adjustment fitting to lower the loading assembly to the final deflection position, typically 0.25 mm.

X3.3.14 Record the indicated load at the final deflection point to the nearest 0.1 g.

NOTE X3.2—These force measurements may be made with the bath at any convenient temperature. The effect of temperature on the buoyancy force over the usable range of the machine is generally negligible for commonly used silicone fluids and loading assembly designs. The decrease in the oil density is offset by the increased volume of oil dispersed. If desired, the user may perform this load verification procedure at two different temperatures to confirm the condition.

X3.3.15 Based on these measurements, adjust the mass so that the applied force corresponds to the calculated force of X3.3.1.

X3.3.16 The difference between the force measurement at the zero deflection position (0.00 mm) and the force measurement at the final deflection position (typically 0.25 mm) should be within the ±2.5 % tolerance as specified in 7.1.4.

NOTE X3.3—If the force differential is excessive over the deflection measuring range, the user should attempt to identify the component responsible for the deviation, implement the necessary corrections, and repeat this procedure to ensure that the proper adjustments have been made. It may be possible to adjust the machine so that the calculated load is achieved at an intermediate position (for example, 0.12 mm), thereby permitting the load at the zero deflection position (0.00 mm) and the final deflection position (typically 0.25 mm) to fall within the allowable tolerance.
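One way to script the X3.3.16 acceptance check (an illustrative reading of the requirement, with our own function name and invented example values) is to compare the X3.3.12 and X3.3.14 readings and require their difference to stay within 2.5 % of the target load:

```python
def force_differential_ok(load_zero_g, load_final_g, target_g, tol=0.025):
    """X3.3.16 sketch: zero-to-final force difference within the ±2.5 % band."""
    return abs(load_zero_g - load_final_g) <= tol * target_g

# Hypothetical readings: 818 g at 0.00 mm, 806 g at 0.25 mm, 812 g target.
print(force_differential_ok(818.0, 806.0, 812.0))  # True: 12 g <= 20.3 g
```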
X4. PROCEDURE FOR VERIFYING THE CALIBRATION OF PENETRATION MEASURING DEVICES USING GAGE
BLOCKS
X4.1 This procedure is intended to provide a method of verifying the calibration of penetration measuring devices typically found on DTUL measuring instruments. It is not a calibration method. If the user finds that the measuring device on one or more of the test frames is out of calibration, the manufacturer of the instrument, or a qualified calibration service company, should be consulted to have the problem corrected. This procedure may be used for dial indicator, LVDT, and encoder-type penetration measurement devices.

X4.2 Remove the test frame from the bath. Wipe excess heat transfer medium from the frames and place on a sturdy, level surface. If it is not possible to remove the test frame from the machine, the frame may be positioned on top of the instrument, providing the frame is level during the verification procedure so that the loading rod will apply its full load as it would during a test. Verification should be made using the minimum load that may be encountered during testing.

X4.3 Thoroughly clean the loading nose and the anvils where the specimen is normally positioned.

X4.4 Select a minimum of two gage blocks that, when stacked together, are comparable in height to a typical test specimen. At least one of the gage blocks should be a 1.00-mm block. If a 1.00-mm gage block is not available, a 0.040-in. (1.016-mm) gage block can be substituted.

X4.5 Place the stacked gage blocks in the test frame where the specimen is normally positioned. Lower the loading rod onto the gage blocks in such a way that the loading nose rests in the middle of the block. Add the required weight to the rod to apply force to the block, simulating test conditions. Zero the indicator or record the reading on the display.

NOTE X4.1—Care must be taken to avoid damaging the gage blocks when using heavy loads.

X4.6 Lift the loading rod and carefully remove the 1.00-mm block from beneath the rod without changing the position of the remaining block. Lower the rod onto the remaining gage block. Record the reading on the indicator. The reading should be equal to 1.00 ± 0.02 mm.

X4.7 Repeat the procedure at least twice to ensure repeatability. Intermediate readings can be verified in a similar manner by using different gage blocks.

X4.8 Repeat the procedure on all of the instrument's test frames.
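The X4.6 criterion reduces to a one-line check. The sketch below is illustrative only, with an invented function name:

```python
def indicator_ok(reading_mm, nominal_mm=1.00, tol_mm=0.02):
    """X4.6 sketch: after removing the 1.00-mm block, the indicator should
    read the removed height within ±0.02 mm."""
    return abs(reading_mm - nominal_mm) <= tol_mm

print(indicator_ok(1.015))  # True: within 1.00 ± 0.02 mm
print(indicator_ok(1.03))   # False: out of tolerance
```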
FIG. X5.2 Load Versus Deflection Curve for Gage With No Current Problems
FIG. X5.3 Load Versus Deflection Curve for Gage With Problems
SUMMARY OF CHANGES
This section identifies the location of selected changes to this test method. For the convenience of the user,
Committee D20 has highlighted those changes that may impact the use of this test method. This section may also
include descriptions of the changes or reasons for the changes, or both.
(1) Title Change.
(2) Revised 1.4.
(3) Revised Section 3, Terminology.
(4) Revised 4.1, changing units for fiber stress.
(5) Added new Section 6, Interferences.
(6) Revised 7.1.1, Apparatus, adding tolerances on the required span and on the radius of the load points.
(7) Revised Note 4.
(8) Revised 7.1.4, clarifying calculations of fiber stress.
(9) Revised 7.1.5, Temperature Measurement System, clarifying requirements for temperature measurement.
(10) Revised Section 8, Sampling.
(11) Added Note 9.
(12) Added 9.3.
(13) Revised Section 10.
(14) Added Note 10.
(15) Revised Section 11, Conditioning.
(16) Added Note 12.
(17) Revised Section 12, Procedure, to clarify testing procedure, and added Note 13.
(18) Revised Section 13, Report.
(19) Added Appendix X4 and Appendix X5.
(20) Added new Section 13, Precision and Bias.
(21) Deleted Note 12 regarding bias between manual and instrumented units.
(22) Added new Note 13, Caution.

D 648 – 00:
(1) Added Methods A and B (see 7.1.1.1 and 7.1.1.2).

D 648 – 00a:
(1) 12.2—Added additional requirements to the end of the paragraph.
(2) Added Note 11 and renumbered subsequent notes.

D 648 – 01:
(1) Added Section 6.4.
ASTM International takes no position respecting the validity of any patent rights asserted in connection with any item mentioned
in this standard. Users of this standard are expressly advised that determination of the validity of any such patent rights, and the risk
of infringement of such rights, are entirely their own responsibility.
This standard is subject to revision at any time by the responsible technical committee and must be reviewed every five years and
if not revised, either reapproved or withdrawn. Your comments are invited either for revision of this standard or for additional standards
and should be addressed to ASTM International Headquarters. Your comments will receive careful consideration at a meeting of the
responsible technical committee, which you may attend. If you feel that your comments have not received a fair hearing you should
make your views known to the ASTM Committee on Standards, at the address shown below.
This standard is copyrighted by ASTM International, 100 Barr Harbor Drive, PO Box C700, West Conshohocken, PA 19428-2959,
United States. Individual reprints (single or multiple copies) of this standard may be obtained by contacting ASTM at the above
address or at 610-832-9585 (phone), 610-832-9555 (fax), or [email protected] (e-mail); or through the ASTM website
(www.astm.org).