Automated lip-reading for improved speech intelligibility

M McClain, K Brady, M Brandstein, T Quatieri - 2004 IEEE International Conference on Acoustics, Speech, and …, 2004 - ieeexplore.ieee.org
Various psycho-acoustical experiments have concluded that visual features strongly affect the perception of speech. This contribution is most pronounced in noisy environments, where the intelligibility of audio-only speech degrades quickly. This paper explores the effectiveness of using extracted visual features, such as lip height and width, to improve speech intelligibility in noisy environments. The intelligibility content of these extracted visual features is investigated through an intelligibility test on an animated rendition of the video generated from the extracted visual features, as well as on the original video. These experiments demonstrate that the extracted video features do contain important aspects of intelligibility that may be utilized to augment speech enhancement and coding applications. Alternatively, these extracted visual features can be transmitted in a bandwidth-effective way to augment speech coders.
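For illustration only: the two visual features the abstract names, lip width and lip height, could be derived from tracked mouth landmarks roughly as sketched below. The landmark layout and the function name are assumptions for this sketch, not the authors' tracking pipeline.

```python
# Hypothetical sketch: computing lip width and height from four mouth
# landmarks. Landmark keys and coordinate conventions are assumed here.

def lip_features(landmarks):
    """Return (width, height) in pixels from four (x, y) mouth landmarks.

    landmarks: dict with keys 'left' and 'right' (mouth corners) and
    'top' and 'bottom' (mid upper/lower lip), each an (x, y) tuple.
    """
    lx, ly = landmarks["left"]
    rx, ry = landmarks["right"]
    tx, ty = landmarks["top"]
    bx, by = landmarks["bottom"]
    # Euclidean distances: corner-to-corner and upper-to-lower lip.
    width = ((rx - lx) ** 2 + (ry - ly) ** 2) ** 0.5
    height = ((bx - tx) ** 2 + (by - ty) ** 2) ** 0.5
    return width, height

# Example: a mouth 40 px wide and 15 px tall.
w, h = lip_features({"left": (100, 200), "right": (140, 200),
                     "top": (120, 192), "bottom": (120, 207)})
print(w, h)  # 40.0 15.0
```

Per-frame values like these form a low-rate feature stream, which is what makes the abstract's bandwidth-efficient transmission argument plausible: two scalars per video frame cost far less than the video itself.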