
Long Short-Term Memory (LSTM)
By: Benamara Icherak and Messar Aya
01 Introduction: what LSTM is
02 Overview
03 The architecture of an LSTM network
04 Fake news detection
Introduction
Long Short-Term Memory (LSTM) is a deep learning, sequential neural network architecture that allows information to persist. It is a special type of Recurrent Neural Network (RNN) capable of handling the vanishing gradient problem faced by RNNs. LSTM was designed to resolve the problems caused by traditional RNNs and other machine learning algorithms. An LSTM model can be implemented in Python using the Keras library, as sketched below.
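A minimal sketch of what that looks like in Keras (the layer sizes and the input shape of 10 timesteps with 1 feature are illustrative assumptions, not values from the slides):

```python
# Minimal Keras LSTM sketch; all hyperparameters are illustrative assumptions.
from tensorflow.keras import Input, Sequential
from tensorflow.keras.layers import LSTM, Dense

model = Sequential([
    Input(shape=(10, 1)),  # sequences of 10 timesteps, 1 feature each
    LSTM(32),              # 32 hidden units; returns the final hidden state
    Dense(1),              # e.g. predict the next value in the sequence
])
model.compile(optimizer="adam", loss="mse")
model.summary()
```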
What is LSTM?

LSTM (Long Short-Term Memory) is a recurrent neural network (RNN) architecture widely used in deep learning. It excels at capturing long-term dependencies, making it ideal for sequence prediction tasks.

Unlike traditional neural networks, LSTM incorporates feedback connections, allowing it to process entire sequences of data, not just individual data points. This makes it highly effective in understanding and predicting patterns in sequential data like time series, text, and speech.
LSTM Architecture

An LSTM cell works in three parts. The first part chooses whether the information coming from the previous timestamp is to be remembered or is irrelevant and can be forgotten. In the second part, the cell tries to learn new information from the input to this cell. Finally, in the third part, the cell passes the updated information from the current timestamp to the next timestamp. This one cycle of the LSTM is considered a single time step. These three parts correspond to the forget gate, the input gate, and the output gate, written out below.
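In equation form (the standard LSTM formulation; $\sigma$ is the logistic sigmoid, $\odot$ is element-wise multiplication, and $x_t$ is the input at timestamp $t$):

$f_t = \sigma(W_f x_t + U_f h_{t-1} + b_f)$  (forget gate: part one)
$i_t = \sigma(W_i x_t + U_i h_{t-1} + b_i)$, $\tilde{c}_t = \tanh(W_c x_t + U_c h_{t-1} + b_c)$  (input gate and candidate memory: part two)
$c_t = f_t \odot c_{t-1} + i_t \odot \tilde{c}_t$  (cell state update)
$o_t = \sigma(W_o x_t + U_o h_{t-1} + b_o)$, $h_t = o_t \odot \tanh(c_t)$  (output gate: part three)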
LSTM Architecture

Just like a simple RNN, an LSTM also has a hidden state, where H(t-1) represents the hidden state of the previous timestamp and H(t) is the hidden state of the current timestamp. In addition, an LSTM has a cell state, represented by C(t-1) and C(t) for the previous and current timestamps, respectively. Here the hidden state is known as short-term memory, and the cell state is known as long-term memory.
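One full cycle can be sketched in plain NumPy. This is an illustrative single time step, not a production implementation; the weight matrices W, U and biases b are hypothetical placeholders that would normally be learned during training:

```python
# One LSTM time step in NumPy (illustrative sketch only).
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, c_prev, W, U, b):
    """h_prev and c_prev are H(t-1) and C(t-1) from the text above."""
    f = sigmoid(W["f"] @ x_t + U["f"] @ h_prev + b["f"])      # forget gate
    i = sigmoid(W["i"] @ x_t + U["i"] @ h_prev + b["i"])      # input gate
    c_hat = np.tanh(W["c"] @ x_t + U["c"] @ h_prev + b["c"])  # candidate memory
    c_t = f * c_prev + i * c_hat     # cell state C(t): long-term memory
    o = sigmoid(W["o"] @ x_t + U["o"] @ h_prev + b["o"])      # output gate
    h_t = o * np.tanh(c_t)           # hidden state H(t): short-term memory
    return h_t, c_t
```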

FAKE NEWS DETECTION

Sequential Input
Each word in a news article is converted into a word embedding, a numerical representation that captures the word's meaning. These embeddings are fed into the LSTM network one at a time, in the order they appear in the text (see the preprocessing sketch below).

Long-Term Memory
LSTM's architecture includes a memory cell that can maintain information over long sequences. This allows the network to remember important information from earlier in the text.

Contextual Understanding
As the LSTM processes each word, it updates its internal state based on the current input and the previous state.

Classification
After processing the entire news article, the final state of the LSTM is used as input to a classification layer, which predicts whether the article is real or fake based on the patterns learned during training.

Training and Evaluation
During training, the LSTM adjusts its weights and biases based on the prediction error. This allows the network to learn to distinguish between real and fake news articles.
