AI in a nutshell
Prototypes are how the human brain learns and are closely related to linear classification.


This method is also known as the Nearest Centroid Classifier (NCC).
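As a minimal sketch (assuming toy 2-D data and using scikit-learn's NearestCentroid), the classifier simply stores one mean vector (prototype) per class and assigns new points to the closest mean:

```python
# Nearest Centroid Classifier sketch on assumed toy data.
import numpy as np
from sklearn.neighbors import NearestCentroid

# Two classes of 2-D points (hypothetical example data).
X = np.array([[1.0, 1.0], [1.2, 0.8], [0.9, 1.1],   # class 0
              [4.0, 4.2], [4.1, 3.9], [3.8, 4.0]])  # class 1
y = np.array([0, 0, 0, 1, 1, 1])

clf = NearestCentroid()          # stores one mean (prototype) per class
clf.fit(X, y)
print(clf.centroids_)            # the learned prototypes
print(clf.predict([[1.0, 0.9], [4.0, 4.0]]))  # assign to nearest prototype
```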
___________________________________________
Another method for linear classification: the Perceptron
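A minimal sketch of the classic Perceptron update rule (toy data and labels in {-1, +1} are assumed; this is not necessarily the exact formulation used in the lecture):

```python
# Perceptron sketch: update the weights only on misclassified points.
import numpy as np

X = np.array([[1.0, 1.0], [1.2, 0.8], [4.0, 4.2], [3.8, 4.0]])  # assumed toy data
y = np.array([-1, -1, 1, 1])

w = np.zeros(X.shape[1])
b = 0.0
for _ in range(10):                      # a few passes over the data
    for x_i, y_i in zip(X, y):
        if y_i * (w @ x_i + b) <= 0:     # misclassified -> push boundary towards x_i
            w += y_i * x_i
            b += y_i

print(np.sign(X @ w + b))                # reproduces y on this separable toy data
```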




Problems with Nearest Centroid Classification

Since NCC is a linear classifier, it is bound to have problems with non-linear data and correlated data.
Correlated data makes prediction more difficult; however, we have a tool called LDA for dealing with the correlation and decorrelating the data.
Supervised Linear Classification with Fisher's LDA
Decorrelate correlated data and measure class separability.




LDA first decorrelates the data and then uses the Nearest Centroid classification method.
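A sketch using scikit-learn's LinearDiscriminantAnalysis on assumed correlated toy data (the decorrelation via the within-class covariance happens inside fit):

```python
# Fisher's LDA sketch: handle correlated features, then classify linearly.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
cov = [[1.0, 0.8], [0.8, 1.0]]                 # correlated features (assumed)
X0 = rng.multivariate_normal([0, 0], cov, 50)  # class 0
X1 = rng.multivariate_normal([2, 2], cov, 50)  # class 1
X = np.vstack([X0, X1])
y = np.array([0] * 50 + [1] * 50)

lda = LinearDiscriminantAnalysis()
lda.fit(X, y)                                  # accounts for within-class covariance
print(lda.predict([[0.0, 0.0], [2.0, 2.0]]))
```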

Cross Validation
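A small k-fold cross-validation sketch (dataset, classifier, and the choice of 5 folds are all assumptions for illustration):

```python
# k-fold cross validation sketch: estimate generalization from held-out folds.
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import NearestCentroid

X, y = load_iris(return_X_y=True)
scores = cross_val_score(NearestCentroid(), X, y, cv=5)  # 5 train/test splits
print(scores, scores.mean())
```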

Regression


Example of the simplest form of linear regression: Ordinary Least Squares (OLS), which minimizes the squared error.
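A minimal OLS sketch on assumed noisy 1-D data, solving the least-squares problem directly with numpy:

```python
# Ordinary Least Squares sketch: fit y ~ w*x + b by minimizing squared error.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 2.0 * x + 1.0 + rng.normal(0, 1, 50)      # assumed noisy linear data

X = np.column_stack([x, np.ones_like(x)])     # add a bias column
w, b = np.linalg.lstsq(X, y, rcond=None)[0]   # least-squares solution
print(w, b)                                   # close to the true 2.0 and 1.0
```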
Comparison of Supervised Algorithms

Ridge Regression





Linear regression is a generic framework for prediction: it straightforwardly extends to vector labels, can model nonlinear dependencies between data and labels, and can be made more robust (Ridge Regression).
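A sketch of Ridge Regression (least squares plus an L2 penalty on the weights; the data and the value alpha=1.0 below are arbitrary assumptions):

```python
# Ridge Regression sketch: least squares with an L2 penalty for robustness.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 5))                  # assumed toy features
w_true = np.array([1.0, 0.0, -2.0, 0.0, 0.5])
y = X @ w_true + rng.normal(0, 0.1, 50)

model = Ridge(alpha=1.0)       # alpha controls the strength of the penalty
model.fit(X, y)
print(model.coef_)             # weights shrunk towards zero compared to plain OLS
```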
Kernel Methods
A trick for classifying data that is not linearly separable.
The calculations are done in a higher-dimensional space: the data is projected into this space, compared there (looking for linear relationships), and this knowledge is used to classify it. The kernel is a measure of the similarity between data points.
Non-linear problems become linear in kernel space.
Popular kernels are the linear kernel, the polynomial kernel, and the Gaussian (RBF) kernel.
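A sketch with a kernelized classifier (an SVM with the Gaussian/RBF kernel on assumed ring-shaped data that no linear classifier can separate):

```python
# Kernel method sketch: the RBF kernel separates data that is not linearly separable.
from sklearn.datasets import make_circles
from sklearn.svm import SVC

X, y = make_circles(n_samples=200, factor=0.3, noise=0.05, random_state=0)

linear = SVC(kernel="linear").fit(X, y)   # fails on the concentric rings
rbf = SVC(kernel="rbf").fit(X, y)         # linear in the implicit kernel space
print(linear.score(X, y), rbf.score(X, y))
```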






Neural Networks - Multilayer Neural Networks


Multilayer Networks!
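A small multilayer network sketch (scikit-learn's MLPClassifier on assumed two-moons data; the single hidden layer of 16 units is an arbitrary choice):

```python
# Multilayer neural network sketch: a hidden layer learns a non-linear boundary.
from sklearn.datasets import make_moons
from sklearn.neural_network import MLPClassifier

X, y = make_moons(n_samples=200, noise=0.1, random_state=0)
net = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
net.fit(X, y)
print(net.score(X, y))   # training accuracy on the toy data
```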


Unsupervised Learning:

Deep Learning


Unsupervised Learning Methods
Principal Component Analysis (PCA)
PCA for Maximizing Variance
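A PCA sketch (scikit-learn, on assumed correlated 2-D data): keeping one component means projecting onto the direction of maximum variance.

```python
# PCA sketch: project correlated 2-D data onto its direction of maximum variance.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.multivariate_normal([0, 0], [[3.0, 2.0], [2.0, 2.0]], 200)  # assumed data

pca = PCA(n_components=1)
Z = pca.fit_transform(X)                  # 1-D projection with maximal variance
print(Z[:3])                              # first projected points
print(pca.components_)                    # the principal direction
print(pca.explained_variance_ratio_)      # fraction of variance it captures
```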



Non-Negative Matrix Factorization (NMF)
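An NMF sketch (scikit-learn, on an assumed non-negative data matrix): the data is factored into two non-negative matrices W and H.

```python
# NMF sketch: factor non-negative X into non-negative W (codes) and H (parts).
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)
X = np.abs(rng.normal(size=(20, 10)))     # assumed non-negative data matrix

nmf = NMF(n_components=3, init="random", random_state=0, max_iter=500)
W = nmf.fit_transform(X)                  # 20 x 3 codes
H = nmf.components_                       # 3 x 10 parts
print(np.linalg.norm(X - W @ H))          # reconstruction error
```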

Clustering
Similar in spirit to NCC, but with multiple classes (clusters) found without labels!
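A clustering sketch using k-means, which, like NCC, represents each group by a centroid but finds the groups without labels (the blob data and k=3 are assumptions):

```python
# k-means clustering sketch: centroids like NCC, but learned without labels.
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

X, _ = make_blobs(n_samples=150, centers=3, random_state=0)  # labels unused
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
print(km.cluster_centers_)     # one prototype per cluster
print(km.labels_[:10])         # cluster assignment per point
```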

