
Effective Teaching of “Machine Learning”

(A Module in the CSEDU – Certificate Program in CS Education)

Objective:

The aim of this module is to help college/university teachers (attendees) improve their
teaching of the Machine Learning course, as per its prescribed AICTE syllabus (given at the end
of this document). By attending this module, attendees will improve their teaching of this
course, leading to improved learning by their students and thereby to more students achieving
the learning outcomes of the AICTE course. This module is part of the Certificate Program in CS
Education initiative (csedu.iiitd.ac.in).

The main learning outcomes of this module are (at the end of the module, an attendee will):

● Have a clearer understanding of the importance of this course in a CSE program, the
desired learning outcomes established in the AICTE course, and the course syllabus
● For each topic in the AICTE syllabus, have a deeper understanding of the concepts, what is
important, how to teach it, what types of assignments to give, etc.
● Appreciate how some contemporary teaching tools and techniques could be brought
into the classroom while teaching Machine Learning

The module will focus on delivering the Essential Learning Outcomes of the AICTE course
syllabus. Some advanced topics may also be discussed, based on inputs from attendees
towards the end of this module.

Requirements for Module Attendees

The attendees for this module should:

● Have taught machine learning in the past, or be planning to take it up in the near future
● Have sufficient background knowledge in topics covered in the AICTE ML course
● Have access to a good laptop (or desktop) and internet
● Commit to spending at least 5 hrs per week (avg) for the module
● Have familiarity with Python

Module Syllabus

Each weekly session will discuss one of the key topics in the AICTE syllabus and an
appropriate teaching methodology that suits that topic.

The week-wise syllabus for this module is given below. It may be revised based on how the
module progresses and on attendee feedback:
Each entry lists the week, the module (M) of the AICTE syllabus, the topics to be discussed,
the teaching approach (illustrative), and the self-work for the week.

Week 1 (M1)
Topics:
● Introduction to this module
● What is Machine Learning, and how is it related to other areas of computer science?
● How is ML related to terms like AI, DL, DS, etc.?
● Brief introduction to practical and pedagogical challenges in teaching
● Our course plan and teaching styles
Approach: Lecture
Self-work: Quick recap of Python and Jupyter notebooks

Week 2 (M1)
Topics:
● A quick tour of the ML curriculum/course and different ways of sequencing/organizing it
● Recap of key concepts in ML
● Success stories / case studies (industry use cases) and the ML pipeline
● Data, representation, and visualization
● Appreciate data and visualize it using the Python notebooks provided
Approach: Primarily lecture and Q&A. May show some small video clips on an industrial use
case (how to use external videos to strengthen teaching). Show a Python notebook (preview).
Self-work: Recap of basic ML concepts (play videos). Make everyone comfortable with the
keyword space.

Week 3 (M1)
Topics:
● Machine learning problems and formulations
● Popular paradigms of machine learning
● The problems of classification and regression
● Details: linear regression, and how an ML topic could be planned
● Use of Python notebooks in teaching and learning
● Popular tools and resources for effective teaching
Approach: How to use modern tools to make teaching effective (a tutorial could introduce
Python notebooks and how to run them). Contrast classical teaching with using tools.
Self-work: Python notebooks; running demos and showing plots; use of the cloud for teaching
and educational credits (factual); Pandas and data handling (students need to learn it;
teachers may not).
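A notebook cell for the linear regression discussion could be as small as the following sketch, which fits a line by least squares with NumPy; the synthetic data and all numbers are purely illustrative:

```python
import numpy as np

# Synthetic data from y = 3x + 2 plus noise (illustrative only)
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=50)
y = 3.0 * x + 2.0 + rng.normal(0, 0.5, size=50)

# Design matrix with a bias column, solved by least squares
X = np.column_stack([x, np.ones_like(x)])
slope, intercept = np.linalg.lstsq(X, y, rcond=None)[0]
print(f"slope={slope:.2f}, intercept={intercept:.2f}")
```

A plot of the data and the fitted line can then be added in the same notebook, which is often the first place students see code and theory side by side.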
Week 4 (M2)
Topics:
● Dimensionality reduction and PCA
● How to teach PCA?
● What will students find difficult?
● How to prepare questions/homework?
Approach: How to demonstrate teaching with a full life cycle (end-to-end teaching). How to use
online resources for planning a full lecture/topic. How to start preparing for a lecture (look at
popular videos, blogs, and online courses; figure out the full breadth of the topic). Which are
the popular sources? How to structure the lesson (short videos, a mix of theory and hands-on,
recall-based questions). How to create theory assignments (what are some good sources from
which questions can be curated?). How to create programming assignments (starter code,
datasets, variations).
Self-work: One lab experiment involving PCA
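The PCA lab could start from a sketch like the following, which computes the principal directions from the SVD of the centered data rather than calling a library routine, so students see the linear algebra directly (the toy data is illustrative):

```python
import numpy as np

# Toy 2D data stretched far more along one axis than the other (illustrative)
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2)) @ np.array([[3.0, 0.0], [0.0, 0.5]])

# PCA via SVD of the centered data matrix: rows of Vt are principal directions
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = S**2 / np.sum(S**2)   # fraction of variance per component
print("principal directions:\n", np.round(Vt, 3))
print("explained variance ratio:", np.round(explained, 3))
```

A natural follow-up exercise is to project the data onto the first direction and plot the reconstruction, making the "directions of maximum variance" idea concrete.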
Week 5 (M2)
Topics:
● Decision trees for classification
● The problem of overfitting
● The idea of regularization
Approach: Full cycle; setting homework, questions, and exams
Self-work: One lab experiment involving decision trees and overfitting
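One way to make overfitting visible in this lab is to contrast an unrestricted tree with a depth-limited one on noisy data. A sketch, assuming scikit-learn is available (the dataset and depth choices are illustrative):

```python
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Noisy two-class data where a very deep tree will memorize the noise
X, y = make_moons(n_samples=400, noise=0.3, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# An unrestricted tree versus a depth-limited (regularized) one
full = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)
shallow = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_tr, y_tr)

print(f"unrestricted: train={full.score(X_tr, y_tr):.2f}, "
      f"test={full.score(X_te, y_te):.2f}")
print(f"max_depth=3:  train={shallow.score(X_tr, y_tr):.2f}, "
      f"test={shallow.score(X_te, y_te):.2f}")
```

The unrestricted tree reaches perfect training accuracy while the gap to its test accuracy exposes the overfitting that depth limiting (a form of regularization) reduces.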

Week 6 (M2)
Topics:
● The notions of training, validation, and testing
● Generalization
Approach: Full cycle; creating programming exercises (starting from some library)
Self-work: Demonstrate the role of validation data and error estimates
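The role of validation data can be demonstrated with a sketch like this one, which uses a held-out split to choose a polynomial degree (synthetic data; the degrees and split sizes are illustrative):

```python
import numpy as np

# Synthetic regression data generated from a quadratic (illustrative)
rng = np.random.default_rng(2)
x = rng.uniform(-3, 3, 120)
y = 1.0 + 0.5 * x - x**2 + rng.normal(0, 0.4, 120)

# Hold out a validation set and use it to pick the polynomial degree
x_tr, y_tr, x_val, y_val = x[:80], y[:80], x[80:], y[80:]
val_err = {}
for deg in range(1, 9):
    coeffs = np.polyfit(x_tr, y_tr, deg)
    val_err[deg] = float(np.mean((np.polyval(coeffs, x_val) - y_val) ** 2))
best = min(val_err, key=val_err.get)
print("validation MSE per degree:", {d: round(e, 3) for d, e in val_err.items()})
print("degree chosen by validation:", best)
```

Degree 1 underfits badly on the validation set, while validation error stays low once the true degree is reached, making the point that validation error, not training error, should drive model selection.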

Week 7 (M3)
Topics:
● The K-means algorithm
● K-means: unsupervised learning and the associated optimization problems
● More on K-means
● Formulating an ML optimization problem (in-class exercise)
Approach: How to strengthen the analytical background of students in ML. Use of boards
(including electronic ones). Peer and group learning. Moving from simple questions to detailed
and fundamental ones. Why is theory important? Motivating students to appreciate theory:
what are some tricks, and what are some motivating examples where practice will not be
effective without knowing the theory? How to train students to read technical notation (a case
study could be SVMs, which have a lot of notation but also geometric intuition).
Self-work: Technical writing (maybe briefly introduce LaTeX)
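For the in-class optimization discussion, Lloyd's algorithm for K-means can be written out in a few lines of NumPy, so the two alternating steps of the optimization are explicit rather than hidden behind a library call (the blob data and K = 3 are illustrative):

```python
import numpy as np

# Three well-separated Gaussian blobs (illustrative toy data)
rng = np.random.default_rng(3)
centers = np.array([[0.0, 0.0], [6.0, 0.0], [3.0, 6.0]])
X = np.vstack([c + rng.normal(0, 0.5, (50, 2)) for c in centers])

# Lloyd's algorithm: alternate the assignment and centroid-update steps,
# each of which decreases the K-means objective (the "inertia")
mu = X[rng.choice(len(X), 3, replace=False)]   # centroids from random points
for _ in range(20):
    labels = np.argmin(((X[:, None, :] - mu[None, :, :]) ** 2).sum(-1), axis=1)
    # Keep the old centroid if a cluster happens to become empty
    mu = np.array([X[labels == k].mean(0) if np.any(labels == k) else mu[k]
                   for k in range(3)])
inertia = float(((X - mu[labels]) ** 2).sum())
print("centroids:\n", np.round(mu, 2))
print("inertia:", round(inertia, 2))
```

Asking students to prove that each of the two steps cannot increase the inertia is exactly the kind of bridge between practice and theory this week is about.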

Week 8 (M3)
Topics:
● Conceptual introduction to SVMs
● Formulation and objective functions
● The beauty and elegance of SVMs, connecting theoretical claims to practical utility
Approach: Inverted classroom
Self-work: Lab exercise on a linear SVM:
● Visualize the support vectors in toy 2D data
● Use kernels without much explanation
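The support-vector visualization part of this lab could start from a sketch like the following, assuming scikit-learn is available; the 2D blobs are illustrative, and a matplotlib scatter plot of `clf.support_vectors_` would complete the exercise:

```python
import numpy as np
from sklearn.svm import SVC

# Toy linearly separable 2D data (illustrative)
rng = np.random.default_rng(4)
X = np.vstack([rng.normal([-2, -2], 0.7, (40, 2)),
               rng.normal([2, 2], 0.7, (40, 2))])
y = np.array([0] * 40 + [1] * 40)

# Linear SVM; the support vectors are the few points that define the margin
clf = SVC(kernel="linear", C=1.0).fit(X, y)
print("support vectors per class:", clf.n_support_)
print("a few support vectors:\n", np.round(clf.support_vectors_[:5], 2))
```

Seeing that only a handful of the 80 points are support vectors is the geometric hook for the later formulation in terms of the margin.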

Week 9 (M3)
Topics:
● A probabilistic view of ML; recap of basic probability terms
● The Bayesian perspective
● Logistic regression
Approach: Lecture
Self-work: Use a naive Bayes classifier in a lab
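The naive Bayes lab can be kept very short so attention stays on the probabilistic ideas rather than on data wrangling; one possible sketch, assuming scikit-learn and its bundled Iris dataset:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

# A ready-made dataset keeps the lab focused on the classifier itself
X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Gaussian naive Bayes: class priors plus per-feature Gaussian likelihoods
clf = GaussianNB().fit(X_tr, y_tr)
print(f"test accuracy: {clf.score(X_te, y_te):.2f}")
```

Inspecting `clf.class_prior_` and the per-class means afterwards connects the code back to Bayes' rule from the lecture.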

Week 10 (M4)
Topics:
● Introduction to neuron models
● Multi-layer perceptrons
● MLPs for classification and regression
● Loss functions and regularization
Approach: Parallel or different views of a specific topic (decision boundary, feature
composition, classifiers). Notebook/programming/code/demo-centric teaching.
Self-work: Run a simple MLP in PyTorch
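The self-work suggests PyTorch; where PyTorch is not installed, the same idea can be previewed with scikit-learn's MLPClassifier, as in this illustrative sketch on data that no linear model can separate:

```python
from sklearn.datasets import make_circles
from sklearn.neural_network import MLPClassifier

# Concentric circles: a dataset a linear classifier cannot separate
X, y = make_circles(n_samples=400, noise=0.1, factor=0.4, random_state=0)

# One small hidden layer is enough to learn the circular decision boundary
mlp = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
mlp.fit(X, y)
print(f"training accuracy: {mlp.score(X, y):.2f}")
```

Plotting the learned decision boundary over the data gives exactly the "different views of a topic" contrast this week calls for.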

Week 11 (M4)
Topics:
● Gradient descent optimization
● The backpropagation algorithm
● The chain rule and the derivation
Approach: Notebook/programming/code/demo-centric teaching; also as an inverted classroom
Self-work: Show how backpropagation works on a toy MLP
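A toy MLP for this purpose can be small enough that every chain-rule step appears as one line of NumPy. The sketch below trains a 2-3-1 sigmoid network on XOR with hand-coded backpropagation; the architecture, learning rate, and iteration count are illustrative choices:

```python
import numpy as np

# A 2-3-1 sigmoid MLP trained on XOR with backpropagation written out by
# hand, so each application of the chain rule is visible
def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(0)
W1, b1 = rng.normal(0, 1, (2, 3)), np.zeros((1, 3))
W2, b2 = rng.normal(0, 1, (3, 1)), np.zeros((1, 1))

lr, losses = 0.5, []
for _ in range(10000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    losses.append(float(((out - y) ** 2).mean()))
    # Backward pass: chain rule, layer by layer (squared-error loss)
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(0, keepdims=True)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(0, keepdims=True)

print(f"loss: {losses[0]:.3f} -> {losses[-1]:.3f}")
```

Having students verify one of the hand-written gradients numerically (finite differences) is a natural follow-up exercise before moving to autograd frameworks.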

Week 12 (M4)
Topics:
● Introduction to deep learning
● Introduction to CNNs
● Why do CNNs and DL yield impressive results?
● Practical issues in working with DL
Approach: Notebook/programming/code/demo-centric teaching
Self-work: A CNN classifier and some visualization
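Before introducing a full CNN framework, the core convolution operation itself can be demystified with a few lines of NumPy. This sketch slides a hand-made vertical-edge kernel over a tiny synthetic image (all values illustrative):

```python
import numpy as np

# What a CNN filter computes: a small kernel slid over a 2D image
def conv2d(img, kernel):
    kh, kw = kernel.shape
    oh, ow = img.shape[0] - kh + 1, img.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

# A vertical edge in a tiny image, detected by a vertical-edge kernel
img = np.zeros((6, 6))
img[:, 3:] = 1.0
kernel = np.array([[1, -1], [1, -1]], dtype=float)
response = conv2d(img, kernel)
print(response)
```

The response is zero everywhere except at the column where the edge sits, which motivates the idea that a CNN learns such kernels from data instead of hand-designing them.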

Weeks 13-15
Topics: Advances or special topics, based on specific participant requests
Approach: Lecture

Schedule

The module will meet online once a week. In addition, a weekly help session to clear doubts and
to help with the assignments will be provided through TAs. Details about joining these sessions
will be provided later.

● Weekly session: Thursday, 4:30 pm to 6:00 pm (Friday, 4:30 pm to 6:00 pm, if Thursday is a holiday)
● Weekly help session: Saturday, 4:00 pm to 5:30 pm (or 4:30 pm to 6:00 pm)

Text to be used for the Module

The textbook suggested in the AICTE syllabus will be used as the basis of this module.

Resources to be provided to attendees

● A curated online resource aligned with the AICTE course
● Lecture notes / slides for the different topics in the course
● Some sample assignments for each of the major modules
● Slides/material used for teaching these modules

Post Module Support

● A mailing list will be created for discussions
● An online symposium (2 hours) every quarter, to maintain connections and learn from each
other, with:
○ A brief talk
○ Panel/discussion/sharing of experiences
○ Q&A


Course code: CSXXX | Course: Machine Learning | 3L:1T:0P | Credits: 4 | Pre-Reqs:

Learning outcomes of the course (i.e., statements on the understanding and skills the student
shall have at the end of the course):

Essential (<=6):
1. Understand popular ML algorithms and their associated mathematical foundations
2. Be capable of implementing basic algorithms using basic (Python) libraries; have hands-on
experience in applying ML to problems encountered in various domains
3. Be aware of the role of data in the future of computing and in solving real-world
problems
4. Be able to connect/map real-world problems to the appropriate ML algorithm(s) to
solve them
5. Have a solid mathematical understanding of the popular ML algorithms
6. Have exposure to high-level ML libraries or frameworks such as TensorFlow and PyTorch
7. Be aware of the importance of core CS principles, such as algorithmic thinking
and systems design, in ML

Desirable/Advanced (<= 3):

● Nil

Detailed contents for Essential Learning Outcomes:

Each module below lists its approximate duration (in weeks) and topics; pedagogy/teaching
suggestions and the nature of labs/assignments/practice are as developed in the weekly
sessions above.
Module 1: Introduction to ML (3-4 weeks)
(i) Motivation and the role of machine learning in computer science and problem solving
(ii) Representation (features), linear transformations; appreciate linear transformations and
matrix-vector operations in the context of data and representation
(iii) Problem formulations (classification and regression)
(iv) Appreciate probability distributions in the context of data; prior probabilities and
Bayes' rule
(v) Paradigms of learning (supervised, unsupervised, and a brief overview of others)

Module 2: Fundamentals of ML (3-4 weeks)
(i) PCA and dimensionality reduction
(ii) Nearest neighbours and KNN
(iii) Decision tree classifiers
(iv) Generalization and overfitting
(v) The notions of training, validation, and testing

Module 3: Selected Algorithms
(i) Ensembling and random forests
(ii) Linear SVM
(iii) K-means
(iv) GMM
(v) EM
(vi) Naive Bayes

Module 4: NN Learning
(i) The role of loss functions and optimization
(ii) Gradient descent and perceptron/delta learning
(iii) MLP
(iv) Backpropagation
(v) MLP for classification and regression
(vi) Regularization, early stopping
(vii) Introduction to deep learning

Suggested text books / Online lectures or tutorials: To be added soon

Suggested reference books / Online resources: To be added soon
