IE643 Lecture2 2020aug18
IE 643
Lecture 2
1 Perceptron: Recap
4 Multi-layer Perceptron
7 Other Topics
Perceptron
Biological Motivation
Perceptron
Key Assumptions
Stimuli which are similar will tend to form pathways to some sets of
response cells.
Perceptron
Supervised Learning
- Inputs and corresponding outputs are known during learning
- e.g. classification (binary, multi-class, multi-label)
Unsupervised Learning
- Input objects are generally not labeled
- e.g. clustering, Principal Component Analysis
Semi-supervised Learning
- Learning from a few labeled data points (along with many unlabeled ones)
Transfer Learning
- Transferring a model learned on task T1 to a new task T2
- e.g. transfer from image captioning to video captioning
Binary Classification
Recall: e-mail Spam Classification
Generally, many input/output pairs are given for training the machine
learning model.
Binary Classification
Feature Extraction
Training
Input: Training data D = {(x_i, y_i)}_{i=1}^{n}
Aim: Learn a model h : X → Y
Testing
Given x̂, predict ŷ = h(x̂)
P. Balamurugan Deep Learning - Theory and Practice August 18, 2020. 23 / 65
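The training/testing workflow above (learn a model h : X → Y from D, then predict ŷ = h(x̂)) can be sketched in code. The 1-nearest-neighbour rule below is only a placeholder for h, not the model used in the lecture:

```python
import numpy as np

def train(D):
    """Learn a model h : X -> Y from training data D = {(x_i, y_i)}.
    Here h is a 1-nearest-neighbour rule, used only as a stand-in."""
    X = np.array([x for x, _ in D])
    Y = np.array([y for _, y in D])

    def h(x_hat):
        # Predict the label of the training point closest to x_hat.
        i = np.argmin(np.linalg.norm(X - np.asarray(x_hat), axis=1))
        return Y[i]

    return h

# Toy training data: two labeled points per class (hypothetical values).
D = [((0.0, 0.0), -1), ((0.5, 0.2), -1), ((5.0, 5.0), +1), ((4.5, 5.2), +1)]
h = train(D)

# Testing: given x_hat, predict y_hat = h(x_hat).
y_hat = h((4.8, 4.9))
```

The returned closure h plays the role of the learned model: training fixes its parameters, and testing only calls it on new points.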
Perceptron and Learning
Perceptron
Perceptron - Geometry
Perceptron
[Diagram: a perceptron. Inputs x_1, x_2, ..., x_d with weights w_1, w_2, ..., w_d, together with a constant input 1 carrying the bias weight w_0, feed the weighted sum ⟨w̃, x̃⟩ = Σ w_i x_i, which passes through the activation function sign(⟨w̃, x̃⟩) to produce the output ŷ.]
[Figure: the hyperplane ⟨w, x⟩ = 0 separating the half-space ⟨w, x⟩ > 0 from the half-space ⟨w, x⟩ < 0.]
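The three regions of this geometry can be checked numerically. A minimal sketch, with a weight vector chosen arbitrarily for illustration:

```python
import numpy as np

w = np.array([1.0, -2.0])  # arbitrary weight vector, for illustration only

def region(x):
    """Report which side of the hyperplane <w, x> = 0 the point x lies on."""
    s = np.dot(w, x)
    if s == 0:
        return "boundary"
    return "positive" if s > 0 else "negative"

region((4.0, 1.0))  # <w, x> = 2,  lies in the half-space <w, x> > 0
region((1.0, 2.0))  # <w, x> = -3, lies in the half-space <w, x> < 0
region((2.0, 1.0))  # <w, x> = 0,  lies on the hyperplane itself
```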
Perceptron - Training
Homework:
Suppose now a new point x^t = (−1, −1) with label −1 comes up.
How will the weights change?
Suppose a different new point x^t = (−2, 3) with label +1 comes up.
How will the weights change?
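As a starting point for the homework, the standard perceptron update (change the weights only when y·⟨w̃, x̃⟩ ≤ 0, in which case w̃ ← w̃ + y·x̃) can be sketched as below. The initial weights here are hypothetical placeholders, not the values from the lecture:

```python
import numpy as np

def perceptron_update(w, x, y):
    """One perceptron step on the augmented input x (last entry 1 for the bias).
    If the point is misclassified, i.e. y * <w, x> <= 0, move w towards y * x."""
    if y * np.dot(w, x) <= 0:
        return w + y * x
    return w

# Hypothetical current weights (w1, w2, w0); the lecture's values may differ.
w = np.array([0.5, -1.0, 0.0])

# Homework point 1: x^t = (-1, -1) with label -1 (augmented with constant 1).
# Here y * <w, x> = -0.5 <= 0, so the weights update to [1.5, 0.0, -1.0].
w = perceptron_update(w, np.array([-1.0, -1.0, 1.0]), -1)

# Homework point 2: x^t = (-2, 3) with label +1.
# Now y * <w, x> = -4 <= 0, so the weights update again, to [-0.5, 3.0, 0.0].
w = perceptron_update(w, np.array([-2.0, 3.0, 1.0]), +1)
```

After each update, the point that triggered it is classified correctly by the new weights; whether a given point causes an update at all depends on the current weights, which is exactly what the homework asks you to work out.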
Perceptron and Learning
Perceptron - Convergence
Proof
Will be discussed later...
Perceptron - Caveat
Multi-layer Perceptron
CNN
Sequential outputs
[Diagram: a recurrent network with input, hidden, and output layers producing the sequential outputs y_{t−1}, y_t, y_{t+1}.]
Other Topics
Optimization algorithms
ACKNOWLEDGMENTS
Some content borrowed from various open-access resources for
Lecture 1 and Lecture 2 slides.
- Blogs
- Tutorials
- Free e-books
- Open-access papers
- YouTube videos
- Scribe notes from my students
Home Work
Programming Exercise
Write Python code to generate 100 two-dimensional points from a normal
distribution with mean (−5, −5) and covariance matrix [[1, 0], [0, 1]]
(the identity). Show the generated points in a plot.
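One possible solution sketch for this exercise, using NumPy and Matplotlib (the seed is an arbitrary choice for reproducibility):

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)  # arbitrary seed, for reproducibility

mean = np.array([-5.0, -5.0])
cov = np.eye(2)  # identity covariance matrix, as in the exercise

# Draw 100 two-dimensional samples from N(mean, cov); shape (100, 2).
points = rng.multivariate_normal(mean, cov, size=100)

# Scatter plot of the generated points.
plt.scatter(points[:, 0], points[:, 1])
plt.title("100 points from N((-5, -5), I)")
plt.xlabel("x1")
plt.ylabel("x2")
plt.show()
```

With identity covariance the two coordinates are independent with unit variance, so the cloud of points should be roughly circular and centred near (−5, −5).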