General circulation models (GCMs) are extensively used to estimate the influence of clouds on the global energy budget and other aspects of climate. Because the radiative transfer computations in GCMs are costly, it is typical to account for absorption but not scattering by clouds in the longwave (LW) spectral bands. In this study, the flux and heating rate biases caused by neglecting the scattering of LW radiation by clouds are quantified using advanced cloud optical property models together with satellite data from the Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation (CALIPSO), CloudSat, Clouds and the Earth's Radiant Energy System (CERES), and Moderate Resolution Imaging Spectroradiometer (MODIS) merged products (CCCM). From these products, information about the atmosphere and clouds (microphysical and bulk optical properties, and cloud-top and cloud-base heights) is used to simulate fluxes and heating rates. One-year global simulations for 2010 show that LW scattering decreases the top-of-atmosphere (TOA) upward flux and increases the surface downward flux by 2.6 and 1.2 W/m2, respectively, corresponding to approximately 10% and 5% of the TOA and surface LW cloud radiative effects. Regional TOA upward flux biases are as large as 5% of the globally averaged outgoing longwave radiation (OLR). LW scattering causes approximately 0.018 K/d of cooling at the tropopause and about 0.028 K/d of heating at the surface. Furthermore, over 40% of the total OLR bias for ice clouds occurs in the 350–500 cm−1 band. Overall, the radiative effects associated with neglecting LW scattering are comparable to those of doubling atmospheric CO2 under clear-sky conditions.
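As a compact restatement of the flux numbers above (the notation here is chosen for illustration and is not necessarily that of the study), the scattering biases can be written as differences between absorption-only and scattering-inclusive flux calculations and compared with the LW cloud radiative effect (CRE), taken as the clear-sky minus all-sky flux difference:

\[
\Delta F^{\uparrow}_{\mathrm{TOA}} = F^{\uparrow,\mathrm{abs}}_{\mathrm{TOA}} - F^{\uparrow,\mathrm{scat}}_{\mathrm{TOA}} \approx 2.6\ \mathrm{W/m^{2}} \approx 0.10\,\mathrm{CRE}^{\mathrm{LW}}_{\mathrm{TOA}},
\qquad
\Delta F^{\downarrow}_{\mathrm{sfc}} = F^{\downarrow,\mathrm{scat}}_{\mathrm{sfc}} - F^{\downarrow,\mathrm{abs}}_{\mathrm{sfc}} \approx 1.2\ \mathrm{W/m^{2}} \approx 0.05\,\mathrm{CRE}^{\mathrm{LW}}_{\mathrm{sfc}},
\]

where \(F^{\mathrm{abs}}\) and \(F^{\mathrm{scat}}\) denote fluxes computed with absorption-only and scattering-inclusive cloud treatments, \(\mathrm{CRE}^{\mathrm{LW}}_{\mathrm{TOA}} = F^{\uparrow,\mathrm{clr}}_{\mathrm{TOA}} - F^{\uparrow,\mathrm{all}}_{\mathrm{TOA}}\), and \(\mathrm{CRE}^{\mathrm{LW}}_{\mathrm{sfc}} = F^{\downarrow,\mathrm{all}}_{\mathrm{sfc}} - F^{\downarrow,\mathrm{clr}}_{\mathrm{sfc}}\); both differences are defined so that they are positive.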