AI Fellowship Syllabus LATAM


www.fuse.ai

Latin America

A 6-month course to train, educate, and create advanced AI career opportunities
About the program:

Fusemachines: Democratizing AI Education

The Microdegree™ program in Artificial Intelligence is an accelerated learning program created by leading US university faculty members and AI industry experts. It is specifically designed to upskill engineers and domain experts with AI and Data Science expertise.

Program Outcomes

Participants will gain data science and ML skills with hands-on experience on real-world problems
Participants will develop a solid understanding of Artificial Intelligence, Machine Learning, and Deep Learning algorithms, along with the underlying math and programming practices
Select and implement appropriate algorithms, libraries, frameworks, and techniques for different problems
Run experiments to assess performance, and evaluate and compare different models to design and deploy an end-to-end pipeline


Course Schedule:

24 weeks: 20 weeks of course work + 4 weeks of project work

Method of Teaching:

Pace of course: 12 hours/week
Lectures: 2 hours/week
Self-study: 10 hours/week

Blended Learning:

Most of the content will be available online to be viewed at the student's own pace
Online live discussion by academics and industry experts weekly
Programming assignments
Quizzes
Final examination
Group project
Paper reading sessions

Course Prerequisites:

Engineering students, IT students [final year], graduates, or professionals who:
have taken courses on Linear Algebra, Probability, Statistics, and Basic Calculus
have taken some programming and computer science courses
are able to program in Python
Soft skills: fluent in English, good communication skills, teamwork, learning attitude
There will be an eligibility test to check whether students have enough foundation skills to take the course.

Aptitude test:

Programming: Python programming
Maths: Linear Algebra, Calculus, Probability, and Statistics
Computer Science topics: DSA, Algorithms, Databases
Software Development skills: Git, REST, SQL
IQ and problem-solving
Behavioral questions

Assessment Grading:

Online (quizzes, assignments)
In-class (exams, projects, class activities)

Instructors / TAs:

Academics and industry experts from various parts of the world
Fusemachines AI Engineers
Fusemachines employees
Program Features

Live Online Lectures and Guest Lectures
Weekly Lecture Sessions
Recordings of live classes available
Access to AI-Enabled Online Classroom Platform:
Course Videos
Reading Materials
Programming Notebooks
Practice Quizzes
Graded Programming Assignments
Proctored Exams
Monthly newsletter [Announcements, reminders, news]
Case Studies and Paper Reading
Group Projects
Kaggle Competitions
Student Community: Forum, Fuse Classroom platform, and Discord forum
Access on Mobile and Desktop
Certificate of Completion
Enrollment Steps

Step 1: Online Application
Fill out and submit the Application Form with relevant details.

Step 2: Online Proctored Entrance Test
After your application is approved, you will sit for an online entrance test designed to evaluate your foundational skills for enrollment in the program.

Step 3: Interview of Shortlisted Candidates
Shortlisted candidates who passed the entrance test will be called for an interview.

Step 4: Enrollment and Onboarding
After a successful interview, you will be onboarded and enrolled in the program!


Syllabus

Week 1
Introduction to AI/ML and Data Science

After completing this module, students should be able to:

Show an understanding of artificial intelligence, machine learning, and deep learning and other relevant terminologies
Describe different categories of AI, the types of machine learning, and their applications
Understand the concept of agents, such as search agents, adversarial agents, and so on
Write a search agent to solve search problems

Assignment

Solve the boat-crossing puzzle using search agents
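As an illustration of the search-agent idea above, here is a minimal breadth-first search sketch for the classic wolf-goat-cabbage boat-crossing puzzle; the state encoding and puzzle variant are illustrative assumptions, not the course's required formulation.

    # BFS search agent for the wolf-goat-cabbage boat-crossing puzzle.
    # A state is the frozenset of items currently on the right bank.
    from collections import deque

    ITEMS = ("farmer", "wolf", "goat", "cabbage")
    START = frozenset()           # everyone starts on the left bank
    GOAL = frozenset(ITEMS)       # everyone must reach the right bank

    def is_safe(right):
        left = set(ITEMS) - right
        for bank in (left, right):
            # a bank without the farmer must not pair wolf+goat or goat+cabbage
            if "farmer" not in bank and ({"wolf", "goat"} <= bank or {"goat", "cabbage"} <= bank):
                return False
        return True

    def successors(state):
        side = state if "farmer" in state else set(ITEMS) - state
        for cargo in [None, *(x for x in side if x != "farmer")]:
            moved = {"farmer"} | ({cargo} if cargo else set())
            new = frozenset(state ^ moved)   # symmetric difference moves them across
            if is_safe(new):
                yield new

    def bfs(start, goal):
        frontier, parents = deque([start]), {start: None}
        while frontier:
            state = frontier.popleft()
            if state == goal:
                path = []
                while state is not None:
                    path.append(sorted(state))
                    state = parents[state]
                return path[::-1]
            for nxt in successors(state):
                if nxt not in parents:
                    parents[nxt] = state
                    frontier.append(nxt)

    print(bfs(START, GOAL))      # prints the right-bank contents at each step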

Week 2
Data Wrangling

After completing this module, students should be able to:

Recognize different data types and data attributes
Scrape data from the internet
Clean data quality issues in datasets
Apply data normalization and scaling
Deal with data outliers and anomalies
Visualize datasets using different plots

Assignment

Web scraping: scrape a Wikipedia page


Exploratory Data Analysis (EDA): Analyze and visualize a dataset to gain insights and
discover patterns.
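A minimal sketch of this week's workflow, assuming pandas (with lxml installed) and a publicly available Wikipedia table; the URL and the specific checks are illustrative, not part of the course materials.

    import pandas as pd

    # Scrape the first HTML table from a Wikipedia page
    url = "https://en.wikipedia.org/wiki/List_of_countries_by_population_(United_Nations)"
    df = pd.read_html(url)[0]          # requires lxml

    # Quick exploratory checks: structure, summary statistics, data-quality issues
    print(df.head())
    print(df.describe(include="all"))
    print(df.isna().sum())             # missing values per column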

Week 3
Regression & Classification

After completing this module, students should be able to:

Define basic machine learning terminologies


Create a linear regression model for predicting continuous values from data
Create a logistic regression model for predicting classes
Write the gradient descent algorithm (stochastic and batch gradient descent) to train the linear regression and logistic regression models
Evaluate the performance of regression and classification models
Identify and improve overfit or underfit models
Implement regularization to improve model performance
Build, train, and evaluate regression and classification models with scikit-learn

Assignment

Build a regression model to predict students' final grades


Customer Churn Prediction: Build a classification model to predict customer churn based
on historical data.
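A minimal scikit-learn sketch of the classification workflow described above, using a built-in dataset as a stand-in for the student-grades and churn data (which are not included here).

    from sklearn.datasets import load_breast_cancer
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split

    X, y = load_breast_cancer(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # L2 regularization (scikit-learn's default) helps control overfitting
    model = LogisticRegression(max_iter=5000).fit(X_train, y_train)
    print("Test accuracy:", accuracy_score(y_test, model.predict(X_test)))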

Week 4
Popular ML Models

After completing this module, students should be able to:

Interpret and visualize the decision surface of an overfitted decision tree.


Analyze how noise can cause the decision tree to go unnecessarily deep
Learn how to handle categorical and continuous features when using Naive Bayes.
Understand the concept of prior and posterior probabilities in Naive Bayes and their role in
the classification process
Describe the margin classifier and slack variables, and implement an SVM with slack variables
Understand linearly separable and non-linearly separable data
Search for the nearest neighbor using the K-D Tree algorithm
Implement the K-NN algorithm for classification and recognize the effect of varying the K value
Benchmark models

Assignment

Build a model that can predict multiple genres of a movie based on its plot summary or
other relevant features.
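A minimal sketch of the effect of K in K-NN, one of the models listed above; the Iris dataset is a stand-in for the movie-genre data.

    from sklearn.datasets import load_iris
    from sklearn.model_selection import cross_val_score
    from sklearn.neighbors import KNeighborsClassifier

    X, y = load_iris(return_X_y=True)
    for k in (1, 5, 15):
        knn = KNeighborsClassifier(n_neighbors=k)
        # cross-validated accuracy shows how the choice of K changes the model
        print(f"k={k}: accuracy={cross_val_score(knn, X, y, cv=5).mean():.3f}")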

Week 5
Clustering

After completing this module, students should be able to:

Examine the effect of centroid initialization on convergence and describe the various initialization methods in K-means clustering
Explain and implement hierarchical clustering algorithms (DIANA, agglomerative clustering, BIRCH)
Explain density-based clustering and exemplify terminologies such as ϵ-neighborhood, density, core points, boundary points, outliers, density reachability, and connectivity
Explain the statistical cluster validation methods

Assignment

Market segmentation: Cluster customers based on their purchasing behavior or demographic data to identify distinct customer groups.
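A minimal K-means sketch for the segmentation assignment, comparing centroid initializations; the synthetic blobs and k=4 are illustrative assumptions.

    from sklearn.cluster import KMeans
    from sklearn.datasets import make_blobs
    from sklearn.metrics import silhouette_score

    X, _ = make_blobs(n_samples=500, centers=4, random_state=0)  # stand-in for customer data

    for init in ("k-means++", "random"):
        km = KMeans(n_clusters=4, init=init, n_init=10, random_state=0).fit(X)
        # inertia reflects convergence quality; silhouette validates the clusters
        print(init, "inertia:", round(km.inertia_, 1),
              "silhouette:", round(silhouette_score(X, km.labels_), 3))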

Week 6
Ensemble Methods

After completing this module, students should be able to:

Decompose errors into bias, variance, and noise, and identify their causes in a model's poor
performance.
Explain ensemble methods and understand why they work, emphasizing the importance of
diversity and accuracy within ensembling
Explain bagging and bootstrapping, understand why averaging reduces variance, and list the
advantages of bagging
Understand random forest and differentiate it from bagging, listing the steps involved in
creating a random forest. 
Distinguish boosting from bagging, explain boosting as a sequential weighted averaging
technique, and discuss its working with an algorithm. 
Understand the working of and use XGBoost, CatBoost and LightGBM

Assignment

Airbnb guest arrival prediction using tree-based methods
Random Forest classifier: Use the Random Forest algorithm to build an ensemble of decision trees for classification tasks
Gradient Boosting regression: Implement Gradient Boosting to create an ensemble of weak learners for regression tasks.
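A minimal scikit-learn sketch of the two ensemble assignments, using built-in datasets as stand-ins for the Airbnb data.

    from sklearn.datasets import load_breast_cancer, load_diabetes
    from sklearn.ensemble import GradientBoostingRegressor, RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    # Bagging-style ensemble: Random Forest for classification
    Xc, yc = load_breast_cancer(return_X_y=True)
    rf = RandomForestClassifier(n_estimators=200, random_state=0)
    print("Random Forest accuracy:", cross_val_score(rf, Xc, yc, cv=5).mean())

    # Boosting-style ensemble: Gradient Boosting for regression
    Xr, yr = load_diabetes(return_X_y=True)
    gb = GradientBoostingRegressor(random_state=0)
    print("Gradient Boosting R^2:", cross_val_score(gb, Xr, yr, cv=5).mean())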

Week 7
Neural Networks

After completing this module, students should be able to:

Describe the basics of neural networks and relate them to biological neurons
Examine and recognize the problems where the use of a neural network is appropriate
Explain how the Perceptron Learning Algorithm helps to learn the parameters of the perceptron
Explain why multi-layer perceptrons are universal approximators
Use TensorFlow / PyTorch to perform basic tensor computations

Assignment

Implement artificial neural networks to predict the number of sales for a company based
on the advertisement platforms like TV, Radio and Newspaper.
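A minimal PyTorch sketch of a small feed-forward network for the advertising regression task; the synthetic data stands in for the TV/Radio/Newspaper dataset.

    import torch
    import torch.nn as nn

    X = torch.rand(200, 3)      # TV, Radio, Newspaper spend (synthetic)
    y = X @ torch.tensor([[3.0], [1.5], [0.5]]) + 0.1 * torch.randn(200, 1)

    model = nn.Sequential(nn.Linear(3, 16), nn.ReLU(), nn.Linear(16, 1))
    opt = torch.optim.Adam(model.parameters(), lr=0.01)
    loss_fn = nn.MSELoss()

    for epoch in range(500):
        opt.zero_grad()
        loss = loss_fn(model(X), y)     # forward pass + loss
        loss.backward()                 # backpropagation
        opt.step()

    print("Final training MSE:", loss.item())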

Week 8
Image Processing, Feature Detection & Matching

After completing this module, students should be able to:

Understand the principles of image formation, camera geometry, and digital camera components
Apply projective transformations, perspective transformations, and multi-view geometry concepts to perform translation, scaling, rotation, and projections in both 2D and 3D spaces
Perform various image processing operations, including pixel transformations, histogram analysis, noise removal, filtering, sharpening, deblurring, resizing, and morphology
Use algorithms and filters for edge detection and for feature detection and matching techniques, including corner detection using the Harris Corner Detector and scale-invariant feature detection using SIFT

Assignment

Edge detection: Implement algorithms like Canny edge detection or Sobel operator to
detect edges in images.
Image segmentation using clustering: Apply clustering techniques like K-means or Mean
Shift to segment images into different regions or objects.
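A minimal OpenCV sketch for the edge-detection assignment; "input.jpg" is a placeholder path rather than a course-provided image, and the Canny thresholds are illustrative.

    import cv2

    img = cv2.imread("input.jpg", cv2.IMREAD_GRAYSCALE)
    blurred = cv2.GaussianBlur(img, (5, 5), 0)      # remove noise before edge detection
    edges = cv2.Canny(blurred, threshold1=100, threshold2=200)
    cv2.imwrite("edges.jpg", edges)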

Week 9
CNN & Transfer Learning

After completing this module, students should be able to:

Visualize the convolution operation in a CNN and point out the distinguishing features of a CNN in comparison with an ANN
Explain the architecture and working principles of CNNs, including convolutional layers, pooling layers, and fully connected layers
Explain different seminal CNN architectures (VGG, ResNet, InceptionNet) and analyze their importance
Identify, select, and fine-tune appropriate pre-trained CNN models for different image analysis tasks
Apply CNNs in computer vision

Assignment

Image classification with pre-trained models: Fine-tune a pre-trained CNN model like
VGG or ResNet on a new dataset for a specific classification task.
Object detection: Use a pre-trained CNN model like YOLO or SSD to detect and localize
objects in images or videos.
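A minimal transfer-learning sketch with torchvision (the 0.13+ weights API is assumed): freeze a pre-trained ResNet backbone and replace its head; the 5-class target is an illustrative assumption.

    import torch.nn as nn
    from torchvision import models

    model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
    for param in model.parameters():
        param.requires_grad = False                   # freeze the pre-trained backbone
    model.fc = nn.Linear(model.fc.in_features, 5)     # new head for 5 target classes
    # ...then train only model.fc on the new dataset with a standard training loop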

Week 10
Deploying ML models

After completing this module, students should be able to:

Build a machine learning pipeline: create an end-to-end pipeline that includes data ingestion, preprocessing, model training, and deployment in a production environment
Implement a RESTful API using a web framework, and handle incoming requests and route them appropriately
Determine whether the API should be public or private, and implement authentication and authorization mechanisms
Document API endpoints and functionality, and consider rate limiting and other usage control measures
Select appropriate deployment strategies for a given use case: Recreate, Shadow, Canary, Blue/Green

Assignment

Deploy a machine learning model behind a REST API or using Streamlit
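A minimal deployment sketch using FastAPI as the web framework (one possible choice; the course does not mandate a specific one). The model file name and feature schema are placeholders.

    import pickle

    from fastapi import FastAPI
    from pydantic import BaseModel

    app = FastAPI()
    model = pickle.load(open("model.pkl", "rb"))     # placeholder: a trained scikit-learn model

    class Features(BaseModel):
        values: list[float]

    @app.post("/predict")
    def predict(features: Features):
        return {"prediction": model.predict([features.values]).tolist()}

    # run with: uvicorn app:app --host 0.0.0.0 --port 8000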

Week 11
Object Detection and Segmentation

After completing this module, students should be able to:

Develop a comprehensive understanding of object recognition and localization concepts, including image classification, object detection, and the use of bounding boxes and performance metrics
Understand the computer vision problems addressed by segmentation and differentiate between types of segmentation techniques such as the watershed algorithm, K-means clustering, and mean shift clustering
Use object detection architectures such as Faster R-CNN, YOLO (You Only Look Once), and SSD (Single Shot MultiBox Detector), and segmentation architectures such as U-Net, Mask R-CNN, and DeepLab

Assignment

Perform object detection and segmentation on a popular dataset
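A minimal sketch of running a pre-trained detector from torchvision (0.13+ weights API assumed); the random tensor stands in for a real image.

    import torch
    from torchvision.models.detection import (FasterRCNN_ResNet50_FPN_Weights,
                                               fasterrcnn_resnet50_fpn)

    model = fasterrcnn_resnet50_fpn(weights=FasterRCNN_ResNet50_FPN_Weights.DEFAULT).eval()
    image = torch.rand(3, 480, 640)                 # one RGB image, values in [0, 1]
    with torch.no_grad():
        prediction = model([image])[0]              # dict with 'boxes', 'labels', 'scores'
    print(prediction["boxes"].shape, prediction["scores"][:5])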

Week 12
Recurrent Neural Networks and Transformers

After completing this module, students should be able to:

Understand the fundamentals of RNNs, including the structure of recurrent units, the computational graph, and backpropagation in RNNs
Gain knowledge of seminal architectures such as Long Short-Term Memory networks (LSTMs) and Gated Recurrent Units (GRUs)
Understand the concept of attention in neural networks, including the motivation behind using attention mechanisms to address the limitations of sequence-to-sequence architectures
Learn the different types of attention, explain transformers, and analyze how they surpass architectures like LSTMs and GRUs

Assignment

Language translation: Build a sequence-to-sequence model using a recurrent neural network or transformer architecture to translate sentences between languages.
Text generation: Train a recurrent neural network model to generate text, such as writing
poetry or generating dialogue.
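A minimal sketch of the translation assignment using a pre-trained transformer via the Hugging Face pipeline API; the t5-small checkpoint is one commonly used example, not one prescribed by the syllabus.

    from transformers import pipeline

    translator = pipeline("translation_en_to_fr", model="t5-small")
    result = translator("Machine learning is changing the world.")
    print(result[0]["translation_text"])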

Week 13
Natural Language Processing

After completing this module, students should be able to:

Understand the relationship between language and knowledge, delve into morphology and tagging, syntax and parsing, and explore lexical semantics using resources like WordNet
Clean, transform, and preprocess text data
Describe the TF-IDF model and implement it using scikit-learn
Describe the Naive Bayes classifier in the context of text classification
Analyze the problems with using RNNs for long sentences
State the difference between using an RNN and a CNN for the same NLP task

Assignment

Implementation of POS Tagging from Scratch
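A minimal sketch of TF-IDF text classification with a Naive Bayes classifier in scikit-learn; the toy corpus is illustrative, not a course dataset.

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.naive_bayes import MultinomialNB
    from sklearn.pipeline import make_pipeline

    texts = ["great movie, loved it", "terrible plot and acting",
             "what a wonderful film", "boring and far too long"]
    labels = [1, 0, 1, 0]                                  # 1 = positive, 0 = negative

    clf = make_pipeline(TfidfVectorizer(), MultinomialNB())
    clf.fit(texts, labels)
    print(clf.predict(["an absolutely wonderful experience"]))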

Week 14
Language Models & LLMs

After completing this module, students should be able to:

Explain Markov models and Markov assumptions, and find out when to use Markov models
Discuss n-gram models, apply MLE to estimate n-gram probabilities, and discuss generalization issues in n-grams
Evaluate language models using perplexity
Understand how neural networks are used in language modeling, in contrast to n-gram language modeling
Appraise the semantic properties of word embeddings: analogy reasoning with the classic example of king - man + woman ≈ queen
Understand Large Language Models and pre-training of LLMs
Understand the current issues and limitations of LLMs: hallucinations, inconsistency, model drift, size, and training time
Use prompt engineering for in-context learning at inference time
Decide between using a pre-trained model and pre-training and fine-tuning a custom model for specific use cases

Assignment

Transfer learning of an LLM for text summarization, open-domain chatbots, etc.
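A minimal sketch of a bigram language model with MLE estimates, illustrating the Markov assumption on a toy corpus (the corpus and example probability are illustrative).

    from collections import Counter

    corpus = "the cat sat on the mat the cat ate".split()
    bigrams = Counter(zip(corpus, corpus[1:]))
    unigrams = Counter(corpus)

    def prob(word, prev):
        # P(word | prev) estimated by maximum likelihood: count(prev, word) / count(prev)
        return bigrams[(prev, word)] / unigrams[prev]

    print(prob("cat", "the"))   # 2/3: "the" is followed by "cat" in 2 of its 3 occurrences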

Week 15
Deep Unsupervised Learning

After completing this module, students should be able to:

Classify different types of generative models into a taxonomy
Learn about explicit and implicit density estimation and Restricted Boltzmann Machines (RBMs)
Recall autoencoders, extend their idea, and intuitively understand and explain variational autoencoders (VAEs) and the reparameterization trick
Understand and train latent variable models
Understand normalizing flows and implement models with normalizing flows (RealNVP, NICE, Glow)

Assignment

Autoencoders for image denoising: Implement an autoencoder model to denoise images corrupted with noise or artifacts.
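A minimal PyTorch sketch of a denoising autoencoder; the randomly generated vectors stand in for flattened images, and the architecture sizes are illustrative.

    import torch
    import torch.nn as nn

    clean = torch.rand(256, 784)                       # stand-in for flattened 28x28 images
    noisy = clean + 0.2 * torch.randn_like(clean)      # corrupt the inputs with Gaussian noise

    model = nn.Sequential(
        nn.Linear(784, 64), nn.ReLU(),                 # encoder
        nn.Linear(64, 784), nn.Sigmoid())              # decoder
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)

    for epoch in range(100):
        opt.zero_grad()
        loss = nn.functional.mse_loss(model(noisy), clean)   # reconstruct the clean input
        loss.backward()
        opt.step()

    print("Reconstruction MSE:", loss.item())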

Week 16
GANs and Diffusion Models

After completing this module, students should be able to:

Visualize how mode collapse occurs during the training of GANs
Explain how poor and strong discriminator behavior affects the training process in GANs
Interpret why tracking the performance of GANs is difficult
Explore Generative Adversarial Networks (GANs), analyze their limitations, and understand different types of GANs (CGAN, WGAN, DCGAN)
Explore the cutting-edge world of diffusion-based generative AI
Gain deep familiarity with the diffusion process and the models driving it

Assignment

Paper reading and implementation of Generative Adversarial Nets


Create your own diffusion model from scratch.

Week 17
Foundational Models and Generative AI

After completing this module, students should be able to:

Understand the concept of foundation models and in-context learning
Learn the principles of transfer learning and how pre-trained models can be leveraged as foundation models
Adapt and fine-tune foundation models to suit the requirements of specific tasks
Identify limitations, potential biases, and ethical considerations associated with using foundation models.

Assignment

Fine-tune a foundation model for sentiment analysis on a dataset of customer reviews


NER: Train a foundation model to identify and classify named entities in a text corpus
Fine-tune a foundation model to classify images into different categories.

Week 18
Reinforcement Learning 

After completing this module, students should be able to:

Discuss the importance of RL and the type of problems to be solved using Reinforcement Learning
Understand the k-armed bandit problem, the Markov property, and Policy Iteration and Value Iteration for solving MDPs
Solve the Bellman equations for small MRPs to determine the values of states
Solve the Bellman equations for small MDPs
Formulate various prediction and control algorithms based on Monte Carlo methods
Explain the Q-Learning algorithm
Explain Expected Sarsa and its relationship to Sarsa and Q-Learning

Assignment

Tic-Tac-Toe Game Agent
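A minimal sketch of the tabular Q-learning update covered in this module; the environment interface (reset/step/actions) is a Gym-style assumption, not course-provided code, and would need to be adapted to the Tic-Tac-Toe agent.

    import random
    from collections import defaultdict

    def q_learning(env, episodes=1000, alpha=0.1, gamma=0.99, epsilon=0.1):
        Q = defaultdict(float)                          # Q[(state, action)] -> value
        for _ in range(episodes):
            state, done = env.reset(), False
            while not done:
                actions = env.actions(state)
                if random.random() < epsilon:           # epsilon-greedy exploration
                    action = random.choice(actions)
                else:
                    action = max(actions, key=lambda a: Q[(state, a)])
                next_state, reward, done = env.step(action)
                best_next = max((Q[(next_state, a)] for a in env.actions(next_state)),
                                default=0.0)
                # Q-learning bootstraps from the greedy action in the next state
                Q[(state, action)] += alpha * (reward + gamma * best_next - Q[(state, action)])
                state = next_state
        return Q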



Week 19
Reinforcement Learning  (continued)

After completing this module, students should be able to:

Distinguish methods for continuous problems from tabular methods, and explain approaches such as coarse coding and tile coding
Understand Monte Carlo methods and model-free learning, and how Temporal Difference (TD) learning combines the Monte Carlo (MC) method and Dynamic Programming (DP)
Understand the DQN architecture, Double DQN, Dueling DQN, and different policy gradient algorithms, along with their advantages and pitfalls
Understand the workings of Actor-Critic methods and the problem with continuous action spaces, and explain how DDPG solves it
Use the Gym ecosystem and RLlib

Assignment

Stock market prediction with Double DQN

Week 20
ML as a Service

After completing this module, students should be able to:

Understand the concept of ML as a Service
Use APIs to access ML services from different providers: OpenAI, AWS, GCP, Azure
Use AWS SageMaker, Google Cloud Machine Learning Engine, and Microsoft Azure Machine Learning
Train and deploy ML models with cloud providers

Assignment

Develop an ML product with cloud services
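A minimal sketch of calling a hosted ML service, using the OpenAI Python SDK (v1-style client) purely as one example of the provider APIs listed above; the model name is illustrative and the API key is read from the environment.

    from openai import OpenAI

    client = OpenAI()   # reads OPENAI_API_KEY from the environment
    response = client.chat.completions.create(
        model="gpt-4o-mini",                             # illustrative model name
        messages=[{"role": "user",
                   "content": "Summarize what MLOps means in one sentence."}],
    )
    print(response.choices[0].message.content)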



Week 21
MLOps

After completing this module, students should be able to:

Understand ML in production, its operational workflow, and its components; discuss knowing, labeling, and validating data
Understand feature engineering and preprocessing, and discuss their various techniques in detail
Discuss filter, wrapper, and embedded feature selection methods
Understand the fundamental concepts and definitions of Neural Architecture Search (NAS) and AutoML
Monitor models and data for model stability and drift
Configure logging, monitoring, and triggers for deployments

Assignment

Detecting data drift and model drift
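A minimal sketch of a data-drift check: compare a feature's training distribution against recent production data with a two-sample Kolmogorov-Smirnov test; the synthetic data and the 0.05 threshold are illustrative assumptions.

    import numpy as np
    from scipy.stats import ks_2samp

    train_feature = np.random.normal(0.0, 1.0, size=5000)   # stand-in for training data
    live_feature = np.random.normal(0.3, 1.0, size=1000)    # stand-in for production data

    stat, p_value = ks_2samp(train_feature, live_feature)
    if p_value < 0.05:
        print(f"Possible drift detected (KS={stat:.3f}, p={p_value:.4f})")
    else:
        print("No significant drift detected")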

Week 22 - Week 24
Project Work, Project Presentation, and Exams
Kickstart your AI career today!
Course duration: 6 months

Full Scholarship

Job Placement

AI Certification

Apply now for:

Latin America

Visit www.fuse.ai to learn more
