Fine-tuning Handwriting Recognition systems with Temporal Dropout

E. Chammas, C. Mokbel - arXiv preprint arXiv:2102.00511, 2021 - arxiv.org
This paper introduces a novel method to fine-tune handwriting recognition systems based on Recurrent Neural Networks (RNNs). Long Short-Term Memory (LSTM) networks are good at modeling long sequences, but they tend to overfit over time. To improve the system's ability to model sequences, we propose to drop information at random positions in the sequence. We call our approach Temporal Dropout (TD). We apply TD both at the image level and to internal network representations. We show that TD improves the results on two different datasets. Our method outperforms the previous state of the art on the Rodrigo dataset.
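
As a rough illustration of the idea (not the authors' implementation), temporal dropout can be sketched in PyTorch as a module that samples one Bernoulli mask per time step, shared across feature channels, so whole positions in the sequence are removed at once. The class name TemporalDropout, the rate p, and the choice not to rescale surviving steps are assumptions for this sketch; the abstract only says that information is dropped at random positions in the sequence.

```python
import torch
import torch.nn as nn

class TemporalDropout(nn.Module):
    """Zero out entire time steps at random positions of a sequence.

    Expects tensors shaped (batch, time, features). A hypothetical
    sketch of the idea in the abstract; the paper's exact masking
    scheme and dropout rates may differ.
    """
    def __init__(self, p: float = 0.1):
        super().__init__()
        self.p = p  # probability of dropping each time step

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if not self.training or self.p == 0.0:
            return x
        # One Bernoulli draw per time step, broadcast over features,
        # so the whole position in the sequence is dropped at once.
        keep = (torch.rand(x.size(0), x.size(1), 1,
                           device=x.device) >= self.p).to(x.dtype)
        return x * keep
```

Under the same assumptions, the module can be applied at both levels the abstract mentions: to pixel columns of the text-line image (treating width as the time axis) and to the feature sequence between recurrent layers:

```python
td = TemporalDropout(p=0.1).train()

# Image level: treat width as time and drop whole pixel columns.
img = torch.randn(4, 1, 64, 512)                    # (batch, channels, height, width)
cols = img.permute(0, 3, 1, 2).reshape(4, 512, -1)  # (batch, width, channels*height)
img = td(cols).reshape(4, 512, 1, 64).permute(0, 2, 3, 1)

# Internal representation: drop random time steps before an LSTM layer.
feats = torch.randn(4, 128, 256)                    # (batch, time steps, channels)
lstm = nn.LSTM(256, 256, batch_first=True)
out, _ = lstm(td(feats))
```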