## Courses

### Udemy

**Source:** Udemy

**Course Name:** Machine Learning A-Z™: Hands-On Python & R In Data Science

**Link:** Machine Learning A-Z™: Hands-On Python & R In Data Science

**Course Details :**

Part 1 – Data Preprocessing

Part 2 – Regression: Simple Linear Regression, Multiple Linear Regression, Polynomial Regression, SVR, Decision Tree Regression, Random Forest Regression

Part 3 – Classification: Logistic Regression, K-NN, SVM, Kernel SVM, Naive Bayes, Decision Tree Classification, Random Forest Classification

Part 4 – Clustering: K-Means, Hierarchical Clustering

Part 5 – Association Rule Learning: Apriori, Eclat

Part 6 – Reinforcement Learning: Upper Confidence Bound, Thompson Sampling

Part 7 – Natural Language Processing: Bag-of-words model and algorithms for NLP

Part 8 – Deep Learning: Artificial Neural Networks, Convolutional Neural Networks

Part 9 – Dimensionality Reduction: PCA, LDA, Kernel PCA

Part 10 – Model Selection & Boosting: k-fold Cross Validation, Parameter Tuning, Grid Search, XGBoost

**Detailed course:**

**Section: 1**

Welcome to the course!

1. Applications of Machine Learning

2. Why Machine Learning is the Future

3. Important notes, tips & tricks for this course

4. Installing Python and Anaconda (Mac, Linux & Windows)

5. Update: Recommended Anaconda Version

6. Installing R and R Studio (Mac, Linux & Windows)

7. BONUS: Meet your instructors

**Section: 2**

Part 1: Data Preprocessing

8. Welcome to Part 1 – Data Preprocessing

9. Get the dataset

10. Importing the Libraries

11. Importing the Dataset

12. For Python learners, summary of Object-oriented programming: classes & objects

13. Missing Data

14. Categorical Data

15. WARNING – Update

16. Splitting the Dataset into the Training set and Test set

17. Feature Scaling

18. And here is our Data Preprocessing Template!

Quiz 1: Data Preprocessing
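
The preprocessing lectures above (importing the dataset, missing data, categorical data, the train/test split, feature scaling) map onto a short scikit-learn workflow. Below is a minimal, hedged sketch of that kind of template; the column names and toy values are invented for illustration and are not the course's files.

```python
# Minimal preprocessing sketch: impute missing values, one-hot encode a categorical
# column, split into training/test sets, then scale features.
import numpy as np
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.impute import SimpleImputer
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import OneHotEncoder, StandardScaler

# Invented toy data: one categorical column, two numeric columns (with gaps), one target.
dataset = pd.DataFrame({
    "Country": ["France", "Spain", "Germany", "Spain", "France", "Germany"],
    "Age": [44, 27, 30, np.nan, 35, 48],
    "Salary": [72000, 48000, 54000, 61000, np.nan, 79000],
    "Purchased": [0, 1, 0, 0, 1, 1],
})
X = dataset.drop(columns="Purchased")
y = dataset["Purchased"].values

preprocess = ColumnTransformer(
    transformers=[
        ("cat", OneHotEncoder(handle_unknown="ignore"), ["Country"]),
        ("num", SimpleImputer(strategy="mean"), ["Age", "Salary"]),
    ],
    sparse_threshold=0.0,  # force a dense array so StandardScaler can center it
)
X = preprocess.fit_transform(X)

# Hold out a test set, then scale features (fit the scaler on the training set only).
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
scaler = StandardScaler()
X_train = scaler.fit_transform(X_train)
X_test = scaler.transform(X_test)
print(X_train.shape, X_test.shape)
```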

**Section: 3**

Part 2: Regression

19. Welcome to Part 2 – Regression

**Section: 4**

Simple Linear Regression

20. How to get the dataset

21. Dataset + Business Problem Description

22. Simple Linear Regression Intuition – Step 1

23. Simple Linear Regression Intuition – Step 2

24. Simple Linear Regression in Python – Step 1

25. Simple Linear Regression in Python – Step 2

26. Simple Linear Regression in Python – Step 3

27. Simple Linear Regression in Python – Step 4

28. Simple Linear Regression in R – Step 1

29. Simple Linear Regression in R – Step 2

30. Simple Linear Regression in R – Step 3

31. Simple Linear Regression in R – Step 4

Quiz 2: Simple Linear Regression
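
As a rough companion to the Simple Linear Regression steps, here is a minimal scikit-learn sketch on synthetic data; the generated salary-like numbers are illustrative, not the course's dataset.

```python
# Simple linear regression on synthetic "experience vs. salary"-style data.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.uniform(1, 10, size=(100, 1))                              # single feature
y = 30_000 + 9_000 * X.ravel() + rng.normal(0, 5_000, size=100)    # noisy linear target

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

model = LinearRegression().fit(X_train, y_train)
print("intercept:", model.intercept_)
print("slope:", model.coef_[0])
print("R^2 on the test set:", model.score(X_test, y_test))
```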

**Section: 5**

Multiple Linear Regression

32. How to get the dataset

33. Dataset + Business Problem Description

34. Multiple Linear Regression Intuition – Step 1

35. Multiple Linear Regression Intuition – Step 2

36. Multiple Linear Regression Intuition – Step 3

37. Multiple Linear Regression Intuition – Step 4

38. Prerequisites: What is the P-Value?

39. Multiple Linear Regression Intuition – Step 5

40. Multiple Linear Regression in Python – Step 1

41. Multiple Linear Regression in Python – Step 2

42. Multiple Linear Regression in Python – Step 3

43. Multiple Linear Regression in Python – Backward Elimination – Preparation

44. Multiple Linear Regression in Python – Backward Elimination – HOMEWORK !

45. Multiple Linear Regression in Python – Backward Elimination – Homework Solution

46. Multiple Linear Regression in Python – Automatic Backward Elimination

47. Multiple Linear Regression in R – Step 1

48. Multiple Linear Regression in R – Step 2

49. Multiple Linear Regression in R – Step 3

50. Multiple Linear Regression in R – Backward Elimination – HOMEWORK !

51. Multiple Linear Regression in R – Backward Elimination – Homework Solution

52. Multiple Linear Regression in R – Automatic Backward Elimination

Quiz 3: Multiple Linear Regression
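
The Backward Elimination lectures prune predictors by their p-values from an OLS fit. Below is a minimal sketch of that recipe with statsmodels on synthetic data, assuming the usual 0.05 significance threshold; it is an illustration of the idea, not the course's exact code.

```python
# Backward elimination sketch: refit OLS and drop the predictor with the highest
# p-value until every remaining p-value is below the significance level.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
X = rng.normal(size=(n, 4))                                   # four candidate predictors
y = 3.0 + 2.0 * X[:, 0] - 1.5 * X[:, 2] + rng.normal(scale=0.5, size=n)  # only two matter

X_opt = sm.add_constant(X)          # statsmodels needs an explicit intercept column
significance_level = 0.05           # assumed threshold

while True:
    model = sm.OLS(y, X_opt).fit()
    worst = int(np.argmax(model.pvalues))
    if model.pvalues[worst] <= significance_level:
        break
    X_opt = np.delete(X_opt, worst, axis=1)   # drop the least significant column and refit

print(model.summary())
```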

**Section: 6**

Polynomial Regression

53. Polynomial Regression Intuition

54. How to get the dataset

55. Polynomial Regression in Python – Step 1

56. Polynomial Regression in Python – Step 2

57. Polynomial Regression in Python – Step 3

58. Polynomial Regression in Python – Step 4

59. Python Regression Template

60. Polynomial Regression in R – Step 1

61. Polynomial Regression in R – Step 2

62. Polynomial Regression in R – Step 3

63. Polynomial Regression in R – Step 4

64. R Regression Template

**Section: 7**

Support Vector Regression (SVR)

65. How to get the dataset

66. SVR Intuition

67. SVR in Python

68. SVR in R

**Section: 8**

Decision Tree Regression

69. Decision Tree Regression Intuition

70. How to get the dataset

71. Decision Tree Regression in Python

72. Decision Tree Regression in R

**Section: 9**

Random Forest Regression

73. Random Forest Regression Intuition

74. How to get the dataset

75. Random Forest Regression in Python

76. Random Forest Regression in R

**Section: 10**

Evaluating Regression Models Performance

77. R-Squared Intuition

78. Adjusted R-Squared Intuition

79. Evaluating Regression Models Performance – Homework’s Final Part

80. Interpreting Linear Regression Coefficients

81. Conclusion of Part 2 – Regression

**Section: 11**

Part 3: Classification

82. Welcome to Part 3 – Classification

**Section: 12**

Logistic Regression

83. Logistic Regression Intuition

84. How to get the dataset

85. Logistic Regression in Python – Step 1

86. Logistic Regression in Python – Step 2

87. Logistic Regression in Python – Step 3

88. Logistic Regression in Python – Step 4

89. Logistic Regression in Python – Step 5

90. Python Classification Template

91. Logistic Regression in R – Step 1

92. Logistic Regression in R – Step 2

93. Logistic Regression in R – Step 3

94. Logistic Regression in R – Step 4

95. Logistic Regression in R – Step 5

96. R Classification Template

Quiz 4: Logistic Regression
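
Here is a minimal sketch of the kind of classification template this section builds, using logistic regression with feature scaling and a confusion matrix on a bundled scikit-learn dataset; the course's own dataset is not assumed.

```python
# Classification-template sketch: scale features, fit logistic regression,
# then inspect the confusion matrix and accuracy.
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, confusion_matrix
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

scaler = StandardScaler()
X_train = scaler.fit_transform(X_train)
X_test = scaler.transform(X_test)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
y_pred = clf.predict(X_test)

print(confusion_matrix(y_test, y_pred))
print("accuracy:", accuracy_score(y_test, y_pred))
```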

**Section: 13**

K-Nearest Neighbors (K-NN)

97. K-Nearest Neighbor Intuition

98. How to get the dataset

99. K-NN in Python

100. K-NN in R

Quiz 5: K-Nearest Neighbor

**Section: 14**

Support Vector Machine (SVM)

101. SVM Intuition

102. How to get the dataset

103. SVM in Python

104. SVM in R

**Section: 15**

Kernel SVM

105. Kernel SVM Intuition

106. Mapping to a higher dimension

107. The Kernel Trick

108. Types of Kernel Functions

109. How to get the dataset

110. Kernel SVM in Python

111. Kernel SVM in R

**Section: 16**

Naive Bayes

112. Bayes Theorem

113. Naive Bayes Intuition

114. Naive Bayes Intuition (Challenge Reveal)

115. Naive Bayes Intuition (Extras)

116. How to get the dataset

117. Naive Bayes in Python

118. Naive Bayes in R

**Section: 17**

Decision Tree Classification

119. Decision Tree Classification Intuition

120. How to get the dataset

121. Decision Tree Classification in Python

122. Decision Tree Classification in R

**Section: 18**

Random Forest Classification

123. Random Forest Classification Intuition

124. How to get the dataset

125. Random Forest Classification in Python

126. Random Forest Classification in R

**Section: 19**

Evaluating Classification Models Performance

127. False Positives & False Negatives

128. Confusion Matrix

129. Accuracy Paradox

130. CAP Curve

131. CAP Curve Analysis

132. Conclusion of Part 3 – Classification

**Section: 20**

Part 4: Clustering

133. Welcome to Part 4 – Clustering

**Section: 21**

K-Means Clustering

134. K-Means Clustering Intuition

135. K-Means Random Initialization Trap

136. K-Means Selecting The Number Of Clusters

137. How to get the dataset

138. K-Means Clustering in Python

139. K-Means Clustering in R

Quiz 6: K-Means Clustering
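
Lectures 135–136 cover the random initialization trap and choosing the number of clusters; the usual remedies are k-means++ initialization and the elbow method, sketched below on synthetic blobs (cluster counts and data are illustrative).

```python
# K-Means sketch: k-means++ initialization mitigates the random initialization trap;
# plotting inertia (within-cluster sum of squares) against k gives the "elbow".
import matplotlib.pyplot as plt
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

X, _ = make_blobs(n_samples=500, centers=4, random_state=42)

wcss = []
for k in range(1, 11):
    km = KMeans(n_clusters=k, init="k-means++", n_init=10, random_state=42).fit(X)
    wcss.append(km.inertia_)

plt.plot(range(1, 11), wcss, marker="o")
plt.xlabel("number of clusters k")
plt.ylabel("within-cluster sum of squares")
plt.show()
```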

**Section: 22**

Hierarchical Clustering

140. Hierarchical Clustering Intuition

141. Hierarchical Clustering How Dendrograms Work

142. Hierarchical Clustering Using Dendrograms

143. How to get the dataset

144. HC in Python – Step 1

145. HC in Python – Step 2

146. HC in Python – Step 3

147. HC in Python – Step 4

148. HC in Python – Step 5

149. HC in R – Step 1

150. HC in R – Step 2

151. HC in R – Step 3

152. HC in R – Step 4

153. HC in R – Step 5

Quiz 7: Hierarchical Clustering

154. Conclusion of Part 4 – Clustering

**Section: 23**

Part 5: Association Rule Learning

155. Welcome to Part 5 – Association Rule Learning

**Section: 24**

Apriori

156. Apriori Intuition

157. How to get the dataset

158. Apriori in R – Step 1

159. Apriori in R – Step 2

160. Apriori in R – Step 3

161. Apriori in Python – Step 1

162. Apriori in Python – Step 2

163. Apriori in Python – Step 3

**Section: 25**

Eclat

164. Eclat Intuition

165. How to get the dataset

166. Eclat in R

**Section: 26**

Part 6: Reinforcement Learning

167. Welcome to Part 6 – Reinforcement Learning

**Section: 27**

Upper Confidence Bound (UCB)

168. The Multi-Armed Bandit Problem

169. Upper Confidence Bound (UCB) Intuition

170. How to get the dataset

171. Upper Confidence Bound in Python – Step 1

172. Upper Confidence Bound in Python – Step 2

173. Upper Confidence Bound in Python – Step 3

174. Upper Confidence Bound in Python – Step 4

175. Upper Confidence Bound in R – Step 1

176. Upper Confidence Bound in R – Step 2

177. Upper Confidence Bound in R – Step 3

178. Upper Confidence Bound in R – Step 4
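
The UCB lectures implement the algorithm step by step; the sketch below is the standard UCB1 form on simulated Bernoulli arms, which may differ in details from the exact variant coded in the course.

```python
# UCB1 sketch on simulated Bernoulli bandit arms.
import math
import random

random.seed(0)
true_rates = [0.10, 0.25, 0.15, 0.30]        # hypothetical click-through rate per arm
n_arms, n_rounds = len(true_rates), 10_000
counts = [0] * n_arms                         # times each arm has been pulled
sums = [0.0] * n_arms                         # total reward per arm

for t in range(1, n_rounds + 1):
    if t <= n_arms:
        arm = t - 1                           # play every arm once to initialize
    else:
        # choose the arm with the largest average reward plus exploration bonus
        ucb = [sums[i] / counts[i] + math.sqrt(2 * math.log(t) / counts[i])
               for i in range(n_arms)]
        arm = max(range(n_arms), key=lambda i: ucb[i])
    reward = 1.0 if random.random() < true_rates[arm] else 0.0
    counts[arm] += 1
    sums[arm] += reward

print("pulls per arm:", counts)               # the best arm (index 3) should dominate
```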

**Section: 28**

Thompson Sampling

179. Thompson Sampling Intuition

180. Algorithm Comparison: UCB vs Thompson Sampling

181. How to get the dataset

182. Thompson Sampling in Python – Step 1

183. Thompson Sampling in Python – Step 2

184. Thompson Sampling in R – Step 1

185. Thompson Sampling in R – Step 2

**Section: 29**

Part 7: Natural Language Processing

186. Welcome to Part 7 – Natural Language Processing

187. Natural Language Processing Intuition

188. How to get the dataset

189. Natural Language Processing in Python – Step 1

190. Natural Language Processing in Python – Step 2

191. Natural Language Processing in Python – Step 3

192. Natural Language Processing in Python – Step 4

193. Natural Language Processing in Python – Step 5

194. Natural Language Processing in Python – Step 6

195. Natural Language Processing in Python – Step 7

196. Natural Language Processing in Python – Step 8

197. Natural Language Processing in Python – Step 9

198. Natural Language Processing in Python – Step 10

199. Homework Challenge

200. Natural Language Processing in R – Step 1

201. Natural Language Processing in R – Step 2

202. Natural Language Processing in R – Step 3

203. Natural Language Processing in R – Step 4

204. Natural Language Processing in R – Step 5

205. Natural Language Processing in R – Step 6

206. Natural Language Processing in R – Step 7

207. Natural Language Processing in R – Step 8

208. Natural Language Processing in R – Step 9

209. Natural Language Processing in R – Step 10

210. Homework Challenge

**Section: 30**

Part 8: Deep Learning

211. Welcome to Part 8 – Deep Learning

212. What is Deep Learning?

**Section: 31**

Artificial Neural Networks

213. Plan of attack

214. The Neuron

215. The Activation Function

216. How do Neural Networks work?

217. How do Neural Networks learn?

218. Gradient Descent

219. Stochastic Gradient Descent

220. Backpropagation

221. How to get the dataset

222. Business Problem Description

223. ANN in Python – Step 1 – Installing Theano, Tensorflow and Keras

224. ANN in Python – Step 2

225. ANN in Python – Step 3

226. ANN in Python – Step 4

227. ANN in Python – Step 5

228. ANN in Python – Step 6

229. ANN in Python – Step 7

230. ANN in Python – Step 8

231. ANN in Python – Step 9

232. ANN in Python – Step 10

233. ANN in R – Step 1

234. ANN in R – Step 2

235. ANN in R – Step 3

236. ANN in R – Step 4 (Last step)
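
Steps 1–10 build an ANN for binary classification with Keras. A minimal hedged sketch of that kind of model follows; the synthetic data, architecture, and hyperparameters are illustrative defaults rather than the course's exact choices.

```python
# Minimal Keras ANN for binary classification on synthetic tabular data.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from tensorflow import keras
from tensorflow.keras import layers

X, y = make_classification(n_samples=2000, n_features=11, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
scaler = StandardScaler()
X_train, X_test = scaler.fit_transform(X_train), scaler.transform(X_test)

model = keras.Sequential([
    keras.Input(shape=(X_train.shape[1],)),
    layers.Dense(6, activation="relu"),       # first hidden layer
    layers.Dense(6, activation="relu"),       # second hidden layer
    layers.Dense(1, activation="sigmoid"),    # output probability
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X_train, y_train, batch_size=32, epochs=20, verbose=0)

loss, acc = model.evaluate(X_test, y_test, verbose=0)
print("test accuracy:", acc)
```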

**Section: 32**

Convolutional Neural Networks

237. Plan of attack

238. What are convolutional neural networks?

239. Step 1 – Convolution Operation

240. Step 1(b) – ReLU Layer

241. Step 2 – Pooling

242. Step 3 – Flattening

243. Step 4 – Full Connection

244. Summary

245. Softmax & Cross-Entropy

246. How to get the dataset

247. CNN in Python – Step 1

248. CNN in Python – Step 2

249. CNN in Python – Step 3

250. CNN in Python – Step 4

251. CNN in Python – Step 5

252. CNN in Python – Step 6

253. CNN in Python – Step 7

254. CNN in Python – Step 8

255. CNN in Python – Step 9

256. CNN in Python – Step 10

257. CNN in R

**Section: 33**

Part 9: Dimensionality Reduction

258. Welcome to Part 9 – Dimensionality Reduction

**Section: 34**

Principal Component Analysis (PCA)

259. Principal Component Analysis (PCA) Intuition

260. How to get the dataset

261. PCA in Python – Step 1

262. PCA in Python – Step 2

263. PCA in Python – Step 3

264. PCA in R – Step 1

265. PCA in R – Step 2

266. PCA in R – Step 3

**Section: 35**

Linear Discriminant Analysis (LDA)

267. Linear Discriminant Analysis (LDA) Intuition

268. How to get the dataset

269. LDA in Python

270. LDA in R

**Section: 36**

Kernel PCA

271. How to get the dataset

272. Kernel PCA in Python

273. Kernel PCA in R

**Section: 37**

Part 10: Model Selection & Boosting

274. Welcome to Part 10 – Model Selection & Boosting

**Section: 38**

Model Selection

275. How to get the dataset

276. k-Fold Cross Validation in Python

277. k-Fold Cross Validation in R

278. Grid Search in Python – Step 1

279. Grid Search in Python – Step 2

280. Grid Search in R
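
Here is a minimal sketch of what the k-Fold Cross Validation and Grid Search lectures do with scikit-learn, tuning a kernel SVM; the grid values are illustrative.

```python
# Model-selection sketch: 10-fold cross-validation plus a grid search over SVM hyperparameters.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import GridSearchCV, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)
pipeline = make_pipeline(StandardScaler(), SVC())

# k-fold cross-validation of a fixed model (k = 10).
scores = cross_val_score(pipeline, X, y, cv=10)
print("mean CV accuracy:", scores.mean(), "+/-", scores.std())

# Grid search over C, kernel, and gamma (parameter names follow the pipeline step "svc").
param_grid = [
    {"svc__C": [0.25, 0.5, 1.0], "svc__kernel": ["linear"]},
    {"svc__C": [0.25, 0.5, 1.0], "svc__kernel": ["rbf"], "svc__gamma": [0.1, 0.5, 0.9]},
]
grid = GridSearchCV(pipeline, param_grid, scoring="accuracy", cv=10, n_jobs=-1).fit(X, y)
print("best accuracy:", grid.best_score_)
print("best parameters:", grid.best_params_)
```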

**Section: 39**

XGBoost

281. How to get the dataset

282. XGBoost in Python – Step 1

283. XGBoost in Python – Step 2

284. XGBoost in R
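
The Python lectures use XGBoost's scikit-learn-style wrapper; a minimal hedged sketch (assuming the xgboost package is installed) looks like this.

```python
# XGBoost via its scikit-learn-compatible wrapper.
from sklearn.datasets import load_breast_cancer
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

clf = XGBClassifier(n_estimators=200, max_depth=3, learning_rate=0.1)
clf.fit(X_train, y_train)
print("test accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```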

**Source:** Udemy

**Course Name:** Python for Data Science and Machine Learning Bootcamp

**Link:** Python for Data Science and Machine Learning Bootcamp

**Course Details :**

**What will I learn?**

Use Python for Data Science and Machine Learning

Use Spark for Big Data Analysis

Implement Machine Learning Algorithms

Learn to use NumPy for Numerical Data

Learn to use Pandas for Data Analysis

Learn to use Matplotlib for Python Plotting

Learn to use Seaborn for statistical plots

Use Plotly for interactive dynamic visualizations

Use SciKit-Learn for Machine Learning Tasks

K-Means Clustering

Logistic Regression

Linear Regression

Random Forest and Decision Trees

Natural Language Processing and Spam Filters

Neural Networks

Support Vector Machines

**Curriculum For This Course**

Course Introduction

Environment Set-Up

Jupyter Overview

Python Crash Course

Python for Data Analysis – NumPy

Python for Data Analysis – Pandas

Python for Data Analysis – Pandas Exercises

Python for Data Visualization – Matplotlib

Python for Data Visualization – Seaborn

Python for Data Visualization – Pandas Built-in Data Visualization

Python for Data Visualization – Plotly and Cufflinks

Python for Data Visualization – Geographical Plotting

Data Capstone Project

Introduction to Machine Learning

Linear Regression

Cross Validation and Bias-Variance Trade-Off

Logistic Regression

K Nearest Neighbors

Decision Trees and Random Forests

Support Vector Machines

K Means Clustering

Principal Component Analysis

Recommender Systems

Natural Language Processing

Big Data and Spark with Python

Neural Nets and Deep Learning

APPENDIX: OLD TENSORFLOW VIDEOS (Version 0.8)

BONUS: DISCOUNT COUPONS FOR OTHER COURSES

**Source:** Udemy

**Course Name:** Machine Learning with Scikit-learn

**Link:** Machine Learning with Scikit-learn

**Course Details :**

**What Will I Learn?**

Load data into scikit-learn; run many machine learning algorithms for both supervised and unsupervised data

Assess model accuracy and performance

Decide which model is best for each scenario

**Requirements**

Some Python and statistics knowledge is required: you should be able to code loops, functions, and classes in Python.

You should also understand what random variables are, what a Gaussian distribution is, and the concepts underlying linear regression.

**Curriculum For This Course**

27 Lectures

Introduction to Scikit-learn

Supervised methods

Unsupervised methods

**Detailed Curriculum For This Course**

**Introduction to Scikit-learn**

- Introduction
- Installing scikit-learn
- Data manipulation: from Pandas to scikit-learn
- Creating synthetic data

**Supervised methods**

- Naive Bayes: Bernoulli – Multinomial
- Detecting spam in real SMS Kaggle data
- Linear Support Vector Machines (SVM): SVM and LinearSVC
- Linear Support Vector Machines (SVM): NuSVM
- Quiz on SVM
- Logistic regression
- Predicting if income >50k using real US Census Data
- Isotonic regression
- Linear regression – Lasso – Ridge
- Quiz on Lasso – Ridge
- Decision trees
- Introduction to ensemble methods
- Averaging ensemble methods – Part 1: Bagging
- Averaging ensemble methods – Part 2: Random forests
- Digit Classification via Random Forests
- Boosting ensemble methods
- Grid Search Cross Validation
- Predicting real house prices in the US using ExtraTreesRegressor

**Unsupervised methods**

- Density Estimation
- Principal Components
- Principal Components
- K-Means
- Preview
- DBScan
- Clustering
- Clustering and PCA on real countries data from Kaggle
- Outlier detection
- Novelty detection

**Source:** Udemy

**Course Name:** A Gentle Introduction to Machine Learning Using SciKit-Learn

**Link:** A Gentle Introduction to Machine Learning Using SciKit-Learn

**Course Details :**

**What Will I Learn?**

At the end of the course you’ll understand how to create an end-to-end model using Python’s SciKit-Learn.

You’ll understand the nomenclature and process involved in creating a solution in SciKit-Learn.

You will also have a Jupyter Notebook that’s annotated with all the important points in the course.

You will also receive a completed Jupyter Notebook filled with models and references.

**Curriculum For This Course**

**Introduction**

- What is our Goal?
- Predictive Modeling
- Why use scikit-learn?
- Installing Python and SciKit Learn
- Terminology
- Jupyter Notebook Anatomy
- Course Downloads
- Summary
- Quiz

**Building Our Model**

- An End to End Model Walk through – Part 1
- An End to End Model Walk through – Part 2
- All Canned Data is Clean
- Building the Model – Part 1
- Building the Model – Part 2
- Building the Model – Part 3
- Building the Model – Part 4
- Summary
- Quiz

**Source:** Udemy

**Course Name:** Introduction to Natural Language Processing (NLP)

**Link:** Introduction to Natural Language Processing (NLP)

**Course Details :**

**What Will I Learn?**

Work with text data using the Natural Language Toolkit.

Load and manipulate custom text data.

Analyze text to discover sentiment, important keywords, and statistics.

**Curriculum For This Course**

**Course Introduction**

- Course Intro and Outline

**Setup**

- Windows Setup
- OS X Setup

**Python Refresher**

- Lists
- Dictionaries
- Loops and Conditionals
- Functions

**NLTK and the Basics**

- Overview – The Natural Language Toolkit
- Counting Text
- Example – Words Per Sentence Trends
- Frequency Distribution
- Conditional Frequency Distribution
- Example – Informative Words
- Bigrams
- Overview – Regular Expressions
- Regular Expression Practice

**Tokenization, Tagging, Chunking**

- Overview – Tokenization
- Tokenization
- Normalizing
- Part of Speech Tagging
- Example – Multiple Parts of Speech
- Example – Choices
- Chunking
- Named Entity Recognition

**Custom Sources**

- Overview – Character Encoding
- Text File
- HTML
- URL
- CSV File
- Exporting
- NLTK Resources
- Example – Remove Stopwords

**Projects**

- Sentiment Analysis Intro
- Basic Sentiment Analysis
- Gender Prediction Intro
- Gender Prediction
- TF-IDF Intro
- TF-IDF

**Appendix**

- Additional NLP Resources
- Learning Python
- Future Course Content

### EdX

**Source:** EdX

**Course Name:** Machine Learning Fundamentals

**Link:** Machine Learning Fundamentals

**Course Details :**

In this course, part of the Data Science MicroMasters program, you will learn a variety of supervised and unsupervised learning algorithms, and the theory behind those algorithms.

Using real-world case studies, you will learn how to classify images, identify salient topics in a corpus of documents, partition people according to personality profiles, and automatically capture the semantic structure of words and use it to categorize documents.

Armed with the knowledge from this course, you will be able to analyze many different types of data and to build descriptive and predictive models. All programming examples and assignments will be in Python, using Jupyter notebooks.

**What you’ll learn**

Classification, regression, and conditional probability estimation

Generative and discriminative models

Linear models and extensions to nonlinearity using kernel methods

Ensemble methods: boosting, bagging, random forests

Representation learning: clustering, dimensionality reduction, autoencoders, deep nets

**Source:** EdX

**Course Name:** Machine Learning

**Link:** Machine Learning

**Course Details :**

**What you’ll learn**

Supervised learning techniques for regression and classification

Unsupervised learning techniques for data modeling and analysis

Probabilistic versus non-probabilistic viewpoints

Optimization and inference algorithms for model learning

**Course Syllabus**

Week 1: maximum likelihood estimation, linear regression, least squares

Week 2: ridge regression, bias-variance, Bayes rule, maximum a posteriori inference

Week 3: Bayesian linear regression, sparsity, subset selection for linear regression

Week 4: nearest neighbor classification, Bayes classifiers, linear classifiers, perceptron

Week 5: logistic regression, Laplace approximation, kernel methods, Gaussian processes

Week 6: maximum margin, support vector machines, trees, random forests, boosting

Week 7: clustering, k-means, EM algorithm, missing data

Week 8: mixtures of Gaussians, matrix factorization

Week 9: non-negative matrix factorization, latent factor models, PCA and variations

Week 10: Markov models, hidden Markov models

Week 11: continuous state-space models, association analysis

Week 12: model selection, next steps

**Source:** University of Texas at Austin Computer Science

**Course Name:** Machine Learning

**Link:** Machine Learning

**Course Details :**

**Chapter Outline:**

**Introduction**

Definition of learning systems. Goals and applications of machine learning. Aspects of developing a learning system: training data, concept representation, function approximation.

**Concept Learning and the General-to-Specific Ordering**

Concept Learning and the General-to-Specific Ordering. The concept learning task. Concept learning as search through a hypothesis space. General-to-specific ordering of hypotheses. Finding maximally specific hypotheses. Version spaces and the candidate elimination algorithm. Learning conjunctive concepts. The importance of inductive bias.

**Decision Tree Learning**

Decision Tree Learning. Representing concepts as decision trees. Recursive induction of decision trees. Picking the best splitting attribute: entropy and information gain. Searching for simple trees and computational complexity. Occam’s razor. Overfitting, noisy data, and pruning.

**Artificial Neural Networks**

Artificial Neural Networks. Neurons and biological motivation. Linear threshold units. Perceptrons: representational limitation and gradient descent training. Multilayer networks and backpropagation. Hidden layers and constructing intermediate, distributed representations. Overfitting, learning network structure, recurrent networks.

**Evaluating Hypotheses**

**Bayesian Learning**

**Computational Learning Theory**

Computational Learning Theory. Models of learnability: learning in the limit; probably approximately correct (PAC) learning. Sample complexity: quantifying the number of examples needed to PAC learn. Computational complexity of training. Sample complexity for finite hypothesis spaces. PAC results for learning conjunctions, kDNF, and kCNF. Sample complexity for infinite hypothesis spaces, Vapnik-Chervonenkis dimension.

**Instance-Based Learning**

**Genetic Algorithms**

**Learning Sets of Rules**

**Analytical Learning**

**Combining Inductive and Analytical Learning**

**Reinforcement Learning**

**Ensemble Learning**

Using committees of multiple hypotheses. Bagging, boosting, and DECORATE. Active learning with ensembles.

**Experimental Evaluation of Learning Algorithms**

Evaluating Hypotheses. Measuring the accuracy of learned hypotheses. Comparing learning algorithms: cross-validation, learning curves, and statistical hypothesis testing.

**Rule Learning: Propositional and First-Order**

Learning Sets of Rules. Translating decision trees into rules. Heuristic rule induction using separate and conquer and information gain. First-order Horn-clause induction (Inductive Logic Programming) and Foil. Learning recursive rules. Inverse resolution, Golem, and Progol.

**Support Vector Machines**

Maximum margin linear separators. Quadratic programming solution to finding maximum margin separators. Kernels for learning non-linear functions.

**Bayesian Learning**

Bayesian Learning. Probability theory and Bayes rule. Naive Bayes learning algorithm. Parameter smoothing. Generative vs. discriminative training. Logistic regression. Bayes nets and Markov nets for representing dependencies.

**Instance-Based Learning**

Instance-Based Learning. Constructing explicit generalizations versus comparing to past specific examples. k-Nearest-neighbor algorithm. Case-based learning.

**Text Classification**

Bag of words representation. Vector space model and cosine similarity. Relevance feedback and Rocchio algorithm. Versions of nearest neighbor and Naive Bayes for text.

**Clustering and Unsupervised Learning**

Learning from unclassified data. Clustering. Hierarchical Agglomerative Clustering. k-means partitional clustering. Expectation maximization (EM) for soft clustering. Semi-supervised learning with EM using labeled and unlabeled data.

**Language Learning**

Classification problems in language: word-sense disambiguation, sequence labeling. Hidden Markov models (HMMs). Viterbi algorithm for determining most-probable state sequences. Forward-backward EM algorithm for training the parameters of HMMs. Use of HMMs for speech recognition, part-of-speech tagging, and information extraction. Conditional random fields (CRFs). Probabilistic context-free grammars (PCFGs). Parsing and learning with PCFGs. Lexicalized PCFGs.
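
The Text Classification chapter above rests on the bag-of-words representation and cosine similarity in the vector space model; the following is a minimal worked sketch of those two ideas with scikit-learn (the example documents are invented).

```python
# Bag-of-words vectors and cosine similarity between short documents.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.metrics.pairwise import cosine_similarity

docs = [
    "the cat sat on the mat",
    "the dog sat on the log",
    "machine learning classifies text documents",
]
X = CountVectorizer().fit_transform(docs)   # rows are documents, columns are word counts
print(cosine_similarity(X))                  # 3x3 matrix of pairwise cosine similarities
```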

### Coursera

**Source:** Coursera

**Course Name:** Machine Learning

**Link:** Machine Learning

**Course Details :**

**WEEK 1 :**

Introduction:

Supervised Learning

Unsupervised Learning

Linear Regression with One Variable

Model Representation

Cost Function

Gradient Descent

Gradient Descent For Linear Regression

Linear Algebra Review

Matrices and Vectors

Addition and Scalar Multiplication

Matrix Vector Multiplication

Inverse and Transpose
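
Week 1 centers on the cost function and gradient descent for one-variable linear regression. Below is a minimal NumPy sketch of batch gradient descent on synthetic data; the learning rate and iteration count are illustrative.

```python
# Batch gradient descent for one-variable linear regression (mean squared error cost).
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 100)
y = 4.0 + 3.0 * x + rng.normal(0, 1.0, 100)   # true intercept 4, true slope 3

theta0, theta1 = 0.0, 0.0                      # parameters to learn
alpha = 0.01                                   # learning rate

for _ in range(5000):
    error = theta0 + theta1 * x - y
    # simultaneous update: theta_j := theta_j - alpha * dJ/dtheta_j
    theta0 -= alpha * error.mean()
    theta1 -= alpha * (error * x).mean()

print("learned parameters:", theta0, theta1)   # should approach (4, 3)
```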

**WEEK 2 :**

Linear Regression with Multiple Variables

Setting Up Your Programming Assignment Environment

Octave/Matlab Tutorial

Vectorization

Plotting Data

**WEEK 3 :**

Logistic Regression

Classification

Hypothesis Representation

Decision Boundary

Simplified Cost Function and Gradient Descent

Multiclass Classification: One-vs-all

Regularization

The Problem of Overfitting

Regularized Linear Regression

**WEEK 4 : Neural Networks: Representation**

Non-Linear Hypothesis

Neurons and the Brain

Multiclass Classification

Multi-class Classification and Neural Networks

**WEEK 5 : Neural Networks: Learning**

Backpropagation Algorithm

Implementation Note: Unrolling Parameters

Gradient Checking

Random Initialization

**WEEK 6 :**

Advice for Applying Machine Learning

Evaluating a Hypothesis

Model Selection and Train/Validation/Test Sets

Diagnosing Bias vs. Variance

Regularization and Bias/Variance

Learning Curves

Deciding What to Do Next Revisited

Machine Learning System Design

Error Analysis

Error Metrics for Skewed Classes

Trading Off Precision and Recall

Data For Machine Learning

**WEEK 7 : Support Vector Machines**

Optimization Objective

Large Margin Intuition

Mathematics Behind Large Margin Classification

Kernels

Using An SVM

**WEEK 8 :**

Unsupervised Learning

K-Means Algorithm

Optimization Objective

Choosing the Number of Clusters

Dimensionality Reduction

Principal Component Analysis Problem Formulation

Choosing the Number of Principal Components

Advice for Applying PCA

**WEEK 9 :**

Anomaly Detection

Content Based Recommendations

Collaborative Filtering

Recommender Systems

Collaborative Filtering Algorithm

Vectorization: Low Rank Matrix Factorization

Implementational Detail: Mean Normalization

**WEEK 10 : Large Scale Machine Learning**

Stochastic Gradient Descent

Mini-Batch Gradient Descent

Stochastic Gradient Descent Convergence

Map Reduce and Data Parallelism

**WEEK 11 : Application Example: Photo OCR**

Getting Lots of Data and Artificial Data

Ceiling Analysis: What Part of the Pipeline to Work on Next

**Source:** Coursera

**Course Name:** Neural Networks and Deep Learning

**Link:** Neural Networks and Deep Learning

**Course Details :**

**WEEK 1 : Introduction to deep learning**

What is a neural network?

Supervised Learning with Neural Networks

**WEEK 2 : Neural Networks Basics**

Logistic Regression

Binary Classification

Logistic Regression Cost Function

Gradient Descent

Computation graph

Logistic Regression Gradient Descent

Vectorization

Broadcasting in Python

Python Basics with numpy

**WEEK 3 : Shallow neural networks**

Neural Network Representation

Computing a Neural Network’s Output

Vectorized Implementations

Non-linear activation functions

Backpropagation intuition

Planar data classification with a hidden layer

**WEEK 4 : Deep Neural Networks**

Forward Propagation in a Deep Network

Forward and Backward Propagation

Parameters vs Hyperparameters

**Source:** Coursera

**Course Name:** Convolutional Neural Networks

**Link:** Convolutional Neural Networks

**Course Details :**

**WEEK 1 : Foundations of Convolutional Neural Networks**

Edge Detection

Padding

Strided Convolutions

Convolutions Over Volume

One Layer of a Convolutional Network

Pooling Layers

**WEEK 2 : Deep convolutional models: case studies**

Classic Networks

ResNets

Inception Network Motivation

Using Open-Source Implementation

Transfer Learning

Data Augmentation

State of Computer Vision

Keras Tutorial – The Happy House (not graded)

Residual Networks

**WEEK 3 : Object detection**

Landmark Detection

Object Detection

Convolutional Implementation of Sliding Windows

Bounding Box Predictions

Intersection Over Union

Non-max Suppression

Anchor Boxes

YOLO Algorithm

**WEEK 4 : Special applications: Face recognition & Neural style transfer**

Siamese Network

Triplet Loss

Face Verification and Binary Classification

Cost Function

Style Cost Function

1D and 3D Generalization

### NPTEL

**Source:** NPTEL course offered July 2020

**Course Name:** Deep Learning

**Link:** Deep Learning

**Course Details :**

**Week 1 :**

(Partial) History of Deep Learning, Deep Learning Success Stories, McCulloch Pitts Neuron, Thresholding Logic, Perceptrons, Perceptron Learning Algorithm

**Week 2 :**

Multilayer Perceptrons (MLPs), Representation Power of MLPs, Sigmoid Neurons, Gradient Descent, Feedforward Neural Networks, Representation Power of Feedforward Neural Networks

**Week 3 :**

FeedForward Neural Networks, Backpropagation

**Week 4 :**

Gradient Descent (GD), Momentum Based GD, Nesterov Accelerated GD, Stochastic GD, AdaGrad, RMSProp, Adam, Eigenvalues and eigenvectors, Eigenvalue Decomposition, Basis
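
Week 4 surveys gradient-descent variants; as one concrete example, here is a minimal NumPy sketch of momentum-based gradient descent on an ill-conditioned quadratic (the step size and momentum coefficient are illustrative).

```python
# Momentum-based gradient descent on an ill-conditioned quadratic f(w) = 0.5 * w^T A w.
import numpy as np

A = np.diag([1.0, 25.0])       # curvature differs strongly between the two directions
w = np.array([2.0, 2.0])       # starting point
v = np.zeros_like(w)           # velocity (exponentially weighted gradient history)
lr, beta = 0.03, 0.9           # learning rate and momentum coefficient

for _ in range(200):
    grad = A @ w               # gradient of 0.5 * w^T A w
    v = beta * v + lr * grad
    w = w - v                  # momentum update
print("final w:", w)           # should be close to the minimum at (0, 0)
```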

**Week 5 :**

Principal Component Analysis and its interpretations, Singular Value Decomposition

**Week 6 :**

Autoencoders and relation to PCA, Regularization in autoencoders, Denoising autoencoders, Sparse autoencoders, Contractive autoencoders

**Week 7 :**

Regularization: Bias Variance Tradeoff, L2 regularization, Early stopping, Dataset augmentation, Parameter sharing and tying, Injecting noise at input, Ensemble methods, Dropout

**Week 8 :**

Greedy Layerwise Pre-training, Better activation functions, Better weight initialization methods, Batch Normalization

**Week 9 :**

Learning Vectorial Representations Of Words

**Week 10 :**

Convolutional Neural Networks, LeNet, AlexNet, ZF-Net, VGGNet, GoogLeNet, ResNet, Visualizing Convolutional Neural Networks, Guided Backpropagation, Deep Dream, Deep Art, Fooling Convolutional Neural Networks

**Week 11 :**

Recurrent Neural Networks, Backpropagation through time (BPTT), Vanishing and Exploding Gradients, Truncated BPTT, GRU, LSTMs

**Week 12 :**

Encoder Decoder Models, Attention Mechanism, Attention over images

**SUGGESTED READING MATERIALS:**

Ian Goodfellow, Yoshua Bengio, and Aaron Courville. Deep Learning. MIT Press. http://www.deeplearningbook.org

**Source:** NPTEL course offered July 2020

**Course Name:** Artificial Intelligence Search Methods for Problem Solving

**Link:** Artificial Intelligence Search Methods for Problem Solving

**Course Details :**

**Week Topics**

Week 1: Introduction: Overview and Historical Perspective, Turing Test, Physical Symbol Systems and the scope of Symbolic AI, Agents

Week 2: State Space Search: Depth First Search, Breadth First Search, DFID

Week 3: Heuristic Search: Best First Search, Hill Climbing, Beam Search

Week 4: Traveling Salesman Problem, Tabu Search, Simulated Annealing

Week 5: Population Based Search: Genetic Algorithms, Ant Colony Optimization

Week 6: Branch & Bound, Algorithm A*, Admissibility of A*

Week 7: Monotone Condition, IDA*, RBFS, Pruning OPEN and CLOSED in A*

Week 8: Problem Decomposition, Algorithm AO*, Game Playing

Week 9: Game Playing: Algorithms Minimax, AlphaBeta, SSS*

Week 10: Rule Based Expert Systems, Inference Engine, Rete Algorithm

Week 11: Planning: Forward/Backward Search, Goal Stack Planning, Sussman’s Anomaly

Week 12: Plan Space Planning, Algorithm Graphplan

The following topics are not part of the evaluation for this course and are included for the interested student. They will be covered in detail in two follow-up courses, “AI: Knowledge Representation and Reasoning” and “AI: Constraint Satisfaction Problems”.

A1 Constraint Satisfaction Problems, Algorithm AC-1, Knowledge Based Systems

A2 Propositional Logic, Resolution Refutation Method

A3 Reasoning in First Order Logic, Backward Chaining, Resolution Method

Text Book (Chapters 1-8): Deepak Khemani, A First Course in Artificial Intelligence, McGraw Hill (India), 2013

**Source:** NPTEL course offered July 2020

**Course Name:** Introduction to Machine Learning

**Link:** Introduction to Machine Learning

**Course Details :**

**Week 1:**

Introduction: Basic definitions, types of learning, hypothesis space and inductive bias, evaluation, cross-validation

**Week 2:**

Linear regression, Decision trees, overfitting

**Week 3:**

Instance based learning, Feature reduction, Collaborative filtering based recommendation

**Week 4:**

Probability and Bayes learning

**Week 5:**

Logistic Regression, Support Vector Machine, Kernel function and Kernel SVM

**Week 6:**

Neural network: Perceptron, multilayer network, backpropagation, introduction to deep neural network

**Week 7:**

Computational learning theory, PAC learning model, Sample complexity, VC Dimension, Ensemble learning

**Week 8:**

Clustering: k-means, adaptive hierarchical clustering, Gaussian mixture model

**SUGGESTED READING**

Machine Learning. Tom Mitchell. First Edition, McGraw-Hill, 1997.

Introduction to Machine Learning, Second Edition, by Ethem Alpaydin.

**Source:** NPTEL course offered July 2020

**Course Name:** Scalable Data Science

**Link:** Scalable Data Science

**Course Details :**

**Week 1 : Background**

Introduction (30 mins)

Probability: Concentration inequalities (30 mins)

Linear algebra: PCA, SVD (30 mins)

Optimization: Basics, Convex, GD (30 mins)

Machine Learning: Supervised, generalization, feature learning, clustering (30 mins)

**Week 2 : Memory-efficient data structures**

Hash functions, universal / perfect hash families (30 min)

Bloom filters (30 mins)

Sketches for distinct count (1 hr)

Misra-Gries sketch. (30 min)
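
Week 2 covers Bloom filters among the memory-efficient structures; the following is a minimal pure-Python sketch of the idea (the bit-array size and number of hash functions are illustrative, not tuned).

```python
# Minimal Bloom filter: k hash functions set k bits per inserted item;
# membership queries can return false positives but never false negatives.
import hashlib

class BloomFilter:
    def __init__(self, size_bits=10_000, num_hashes=5):
        self.size = size_bits
        self.num_hashes = num_hashes
        self.bits = [False] * size_bits

    def _positions(self, item):
        # Derive num_hashes bit positions from salted SHA-256 digests of the item.
        for i in range(self.num_hashes):
            digest = hashlib.sha256(f"{i}:{item}".encode()).hexdigest()
            yield int(digest, 16) % self.size

    def add(self, item):
        for pos in self._positions(item):
            self.bits[pos] = True

    def __contains__(self, item):
        return all(self.bits[pos] for pos in self._positions(item))

bf = BloomFilter()
for word in ["spark", "hadoop", "sketch"]:
    bf.add(word)
print("spark" in bf)     # True
print("bandit" in bf)    # False with high probability (small chance of a false positive)
```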

**Week 3 : Memory-efficient data structures (contd.)**

Count Sketch, Count-Min Sketch (1 hr)

Approximate near neighbors search: Introduction, kd-trees etc (30 mins)

LSH families, MinHash for Jaccard, SimHash for L2 (1 hr)

**Week 4 : Approximate near neighbors search**

Extensions e.g. multi-probe, b-bit hashing, Data dependent variants (1.5 hr)

Randomized Numerical Linear Algebra: Random projection (1 hr)

**Week 5 :**

Randomized Numerical Linear Algebra: CUR Decomposition (1 hr)

Sparse RP, Subspace RP, Kitchen Sink (1.5 hr)

**Week 6 :**

Map-reduce and related paradigms: MapReduce programming examples (PageRank, k-means, matrix multiplication) (1 hr)

Big data: computation goes to data. + Hadoop ecosystem (1.5 hrs)

**Week 7 :**

Map-reduce and related paradigms (Contd.) Scala + Spark (1 hr)

Distributed Machine Learning and Optimization: Introduction (30 mins)

SGD + Proof (1 hr)

**Week 8 : Distributed Machine Learning and Optimization**

ADMM + applications (1 hr)

Clustering (1 hr)

Conclusion (30 mins)

**SUGGESTED READING MATERIALS:**

J. Leskovec, A. Rajaraman and JD Ullman. Mining of Massive Datasets. Cambridge University Press, 2nd Ed.

Muthukrishnan, S. (2005). Data streams: Algorithms and applications. Foundations and Trends® in Theoretical Computer Science, 1(2), 117-236.

Woodruff, David P. “Sketching as a tool for numerical linear algebra.” Foundations and Trends in Theoretical Computer Science 10.1–2 (2014): 1-157.

Mahoney, Michael W. “Randomized algorithms for matrices and data.” Foundations and Trends in Machine Learning 3.2 (2011): 123-224.

**Source:** NPTEL course offered July 2020

**Course Name:** The Joy of Computing using Python

**Link:** The Joy of Computing using Python

**Course Details :**

Motivation for Computing

Welcome to Programming!!

Variables and Expressions : Design your own calculator

Loops and Conditionals : Hopscotch once again

Lists, Tuples and Conditionals : Let’s go on a trip

Abstraction Everywhere : Apps in your phone

Counting Candies : Crowd to the rescue

Birthday Paradox : Find your twin

Google Translate : Speak in any Language

Currency Converter : Count your foreign trip expenses

Monty Hall : 3 doors and a twist

Sorting : Arrange the books

Searching : Find in seconds

Substitution Cipher : What’s the secret !!

Sentiment Analysis : Analyse your Facebook data

20 questions game : I can read your mind

Permutations : Jumbled Words

Spot the similarities : Dobble game

Count the words : Hundreds, Thousands or Millions.

Rock, Paper and Scissor : Cheating not allowed !!

Lie detector : No lies, only TRUTH

Calculation of the Area : Don’t measure.

Six degrees of separation : Meet your favourites

Image Processing : Fun with images

Tic tac toe : Let’s play

Snakes and Ladders : Down the memory lane.

Recursion : Tower of Hanoi

Page Rank : How Google Works !!