The recent technique of local backlight dimming has a significant impact on the quality of images displayed on LCD screens with LED local dimming. It therefore represents a necessary step in the quality assessment chain, independent of the other processes applied to the images. This paper investigates the modeling of one of the major spatial artifacts produced by local dimming: leakage. Leakage appears in dark areas when the backlight level is too high for the LC cells to block sufficiently, so the final displayed brightness is higher than it should be.
A subjective quality experiment was run on videos displayed on an LCD TV with local backlight dimming, viewed from 0° and 15° angles. The subjective results are then compared with objective data computed using different leakage models (constant over the whole display or horizontally varying) and three leakage factors (no leakage, and leakage as measured at 0° and at 15°, respectively). Results show that, for dark sequences, accounting for the leakage artifact in the display model is a definite improvement. Approximating leakage as constant over the screen appears valid when viewing from a 15° angle, while a horizontally varying model may prove useful for 0° viewing.
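As a rough illustration of how such a leakage term enters a display model, the following Python sketch simulates the displayed luminance of a dark row of pixels under a too-high backlight, for both a constant and a horizontally varying leakage profile. All names and the sample leakage factors are illustrative assumptions, not the measured values from the experiment.

```python
import numpy as np

def displayed_luminance(target, backlight, leak):
    """Simulate the luminance shown on screen for one pixel row.

    target    : intended relative luminance in [0, 1]
    backlight : local backlight level in [0, 1] (same shape as target)
    leak      : leakage factor(s); a scalar for the constant model or a
                per-column array for the horizontally varying model
    """
    # The LC cell cannot attenuate below the leakage floor, so the
    # effective transmittance is clipped from below by `leak`.
    transmittance = np.clip(np.divide(target, backlight,
                                      out=np.ones_like(target),
                                      where=backlight > 0), 0.0, 1.0)
    return backlight * np.maximum(transmittance, leak)

target = np.linspace(0.0, 0.05, 8)      # a dark gradient
backlight = np.full_like(target, 0.8)   # backlight too high for the content

const_leak = displayed_luminance(target, backlight, leak=0.002)
varying_leak = displayed_luminance(target, backlight,
                                   leak=np.linspace(0.001, 0.004, 8))
print(const_leak)    # dark pixels come out brighter than intended
print(varying_leak)  # leakage grows toward one side of the screen
```

Fitting the constant model amounts to choosing a single `leak` per viewing angle, while the horizontally varying model replaces it with a per-column profile.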
Liquid Crystal Displays (LCDs) with Light Emitting Diode (LED) backlights are a very popular display technology, used for instance in television sets, monitors and mobile phones. This paper presents a new backlight dimming algorithm that exploits characteristics of the target image, such as the local histograms and the average pixel intensity of each backlight segment, to reduce the power consumption of the backlight and enhance image quality. The local histogram of the pixels within each backlight segment is calculated and, based on the segment's average intensity, an adaptive quantile value is extracted. A classification into three classes based on the average luminance value is performed and, depending on the image luminance class, the extracted information on the local histogram determines the corresponding backlight value. The proposed method has been applied to two modeled screens: one with a high-resolution direct-lit backlight, and the other with 16 edge-lit backlight segments arranged in two columns and eight rows. We have compared the proposed algorithm against several known backlight dimming algorithms in simulations, and the results show that the proposed algorithm provides a better trade-off between power consumption and image quality preservation than the other algorithms, which represent the state of the art among feature-based backlight dimming algorithms.
KEYWORDS: LCDs, Light emitting diodes, Point spread functions, Image segmentation, Optimization (mathematics), Transmittance, High dynamic range imaging, Detection and tracking algorithms, LED backlight, Image resolution
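A minimal sketch of the histogram-and-classification step described above might look as follows. The class thresholds and per-class quantile values are illustrative placeholders, not the adaptive values derived in the paper, and the 8×2 segment layout matches the 16-segment modeled screen.

```python
import numpy as np

def segment_backlight(segment_pixels):
    """Pick a backlight level in [0, 1] for one backlight segment."""
    mean = segment_pixels.mean()
    # Three-way classification on the segment's average luminance.
    if mean < 0.25:       # dark segment: dim aggressively
        quantile = 0.90
    elif mean < 0.60:     # mid-tone segment
        quantile = 0.95
    else:                 # bright segment: preserve highlights
        quantile = 0.99
    # The chosen quantile of the local histogram becomes the backlight level.
    return float(np.quantile(segment_pixels, quantile))

rng = np.random.default_rng(0)
frame = rng.random((64, 64))  # normalized luma frame
# Split the frame into eight rows by two columns of backlight segments.
segments = [blk for row in np.array_split(frame, 8, axis=0)
                for blk in np.array_split(row, 2, axis=1)]
levels = [segment_backlight(s) for s in segments]
print(np.round(levels, 3))
```

Dimming a segment to a quantile below 1.0 saves power at the cost of clipping the brightest pixels in that segment, which is exactly the trade-off the classification step is meant to control.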
Local backlight dimming in Liquid Crystal Displays (LCDs) is a technique for reducing power consumption while simultaneously increasing the contrast ratio to provide High Dynamic Range (HDR) image reproduction. Several backlight dimming algorithms exist that focus on reducing power consumption, while other algorithms aim at enhancing contrast, with power savings as a side effect. In our earlier work, we modeled backlight dimming as a linear programming problem, where the target is to minimize a cost function measuring the distance between the ideal and the actual output. In this paper, we propose a version of the aforementioned algorithm that speeds up execution by decreasing the number of input variables. This is done by using a subset of the input pixels, selected among those experiencing leakage or clipping distortion; the optimization problem is then solved on this subset. Sample reduction can also be beneficial in conjunction with other approaches, such as an algorithm based on gradient descent, also presented here. All the proposals have been compared against other known approaches on simulated edge-lit and direct-lit displays, and the results show that the optimal distortion level can be reached using a subset of pixels, with a significantly reduced computational load compared to the optimal algorithm operating on the full image.
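The following sketch illustrates the sample-reduction idea in simplified form: identify the pixels that actually exhibit clipping or leakage distortion under an initial guess, then optimize the LED levels on that subset only. For brevity it uses projected (sub)gradient descent on a hinge-style surrogate cost rather than the paper's linear programming formulation; the influence matrix `W`, the leakage factor, and the step size are assumptions for illustration, not measured display parameters.

```python
import numpy as np

rng = np.random.default_rng(1)
n_pixels, n_leds, leak = 1000, 16, 0.005
W = np.abs(rng.normal(size=(n_pixels, n_leds)))  # stand-in PSF weights
W /= W.sum(axis=1, keepdims=True)                # backlight = W @ u in [0, 1]
target = rng.random(n_pixels) ** 3               # mostly dark content

def distortion_terms(u, t, Wm):
    B = Wm @ u
    clip_err = np.maximum(t - B, 0.0)        # backlight too low: clipping
    leak_err = np.maximum(leak * B - t, 0.0) # leakage floor above target
    return B, clip_err, leak_err

# 1) Identify the distorted pixel subset under an initial guess.
u = np.full(n_leds, 0.5)
_, c0, l0 = distortion_terms(u, target, W)
subset = (c0 > 0) | (l0 > 0)
t_s, W_s = target[subset], W[subset]
print(f"optimizing on {subset.sum()} of {n_pixels} pixels")

# 2) Projected (sub)gradient descent on the reduced problem.
step = 0.5
for _ in range(200):
    _, c, l = distortion_terms(u, t_s, W_s)
    grad = W_s.T @ (-2.0 * c + 2.0 * leak * l)
    u = np.clip(u - step * grad / len(t_s), 0.0, 1.0)

_, c_full, l_full = distortion_terms(u, target, W)
print("total squared distortion:",
      float((c_full ** 2).sum() + (l_full ** 2).sum()))
```

The key observation is that pixels with zero clipping and zero leakage contribute nothing to the cost or its gradient, so dropping them leaves the optimum unchanged while shrinking the problem size.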
Traditionally, algorithm-based (objective) image and video quality assessment methods operate on the numerical representation of the signal and do not take the characteristics of the actual output device into account. This is a reasonable approach when quality assessment is needed to evaluate signal quality distortion related directly to digital signal processing, such as compression. However, the physical characteristics of the display device also have a significant impact on the overall perception. In order to facilitate image quality assessment on modern liquid crystal displays (LCDs) using light emitting diode (LED) backlights with local dimming, we present the essential considerations and guidelines for modeling the characteristics of displays with high dynamic range (HDR) and locally adjustable backlight segments. The representation of the image generated by the model can be assessed using traditional objective metrics, so the proposed approach is useful for assessing the performance of different backlight dimming algorithms in terms of resulting quality and power consumption in a simulated environment. We have implemented the proposed model in C++ and compared the visual results produced by the model against the respective images displayed on a real display with locally controlled backlight units.
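A hedged sketch of such a display model is shown below: the segment backlight is spread with a point spread function, combined with the LC transmittance (including a leakage floor), and the simulated output is scored with a conventional metric. The Gaussian PSF, the leakage factor, and the 8×2 segment layout are assumptions for illustration; a real model would use measured display characteristics.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def render(frame, led_levels, grid=(8, 2), leak=0.005, psf_sigma=12.0):
    """Return the simulated on-screen luminance for a normalized frame."""
    h, w = frame.shape
    # Upsample the coarse LED grid to pixel resolution, then blur it with
    # a Gaussian as a stand-in for a measured PSF.
    bl = np.kron(led_levels, np.ones((h // grid[0], w // grid[1])))
    bl = gaussian_filter(bl, sigma=psf_sigma)
    # LC transmittance needed to hit the target, limited to [leak, 1].
    trans = np.clip(np.divide(frame, bl, out=np.ones_like(frame),
                              where=bl > 0), leak, 1.0)
    return bl * trans

def psnr(ref, test):
    mse = np.mean((ref - test) ** 2)
    return 10.0 * np.log10(1.0 / mse)

rng = np.random.default_rng(2)
frame = rng.random((256, 128)) ** 2
leds = np.full((8, 2), 0.7)   # e.g., the output of a dimming algorithm
simulated = render(frame, leds)
print(f"PSNR vs. ideal reproduction: {psnr(frame, simulated):.2f} dB")
print(f"relative backlight power: {leds.mean():.2f}")
```

Because the model turns any set of LED levels into a simulated image, different dimming algorithms can be compared directly on quality (e.g., PSNR) versus backlight power, as the abstract describes.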