Topics, ordered by difficulty:
Intro:
- Regression (linear, logistic)
- KNN, k-means, decision trees
- Artificial neural networks (MLP/CNN/LSTM)
- Backpropagation, forward propagation
- Max pooling, activation functions
- RL: bandits/MDP/DP/TD learning/MC
- Coding:
- Python (NumPy, Matplotlib, scikit-learn, OpenCV, PyTorch, TensorFlow, etc.)
- Good to have: GPU programming (CUDA C++ etc)
- Good to have: Parallel Computing
- Good to have: Matlab, R, Julia
- Git, Unix environment
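To make one of the intro topics concrete, here is a from-scratch k-means in NumPy — a minimal sketch on made-up toy data, not code from any course:

```python
import numpy as np

def kmeans(X, k, iters=20, seed=0):
    """Plain k-means: alternate nearest-centroid assignment and mean update."""
    rng = np.random.default_rng(seed)
    # Start from k distinct data points chosen at random.
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # Distance from every point to every centroid, shape (n, k).
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):        # leave empty clusters in place
                centroids[j] = X[labels == j].mean(axis=0)
    return centroids, labels

# Toy data: two well-separated Gaussian blobs around (0, 0) and (10, 10).
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.5, (20, 2)),
               rng.normal(10, 0.5, (20, 2))])
centroids, labels = kmeans(X, k=2)
```

With blobs this well separated, the two recovered centroids land near (0, 0) and (10, 10) regardless of the random initialization.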
Beginner:
- Bayesian optimization, Bayesian linear regression/logistic regression
- FFT
- NLP: Transformer/BERT/GPT
- GMM/HMM
- EM/Baum-Welch/Viterbi/Kalman filter (UKF)
- Graphical models/belief propagation
- Junction tree
- SVM/kernel methods/RKHS
- Model-based RL/policy gradient/actor-critic
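Some of the beginner topics fit in a few lines. Here is a log-space Viterbi decoder for a toy two-state HMM — all the probabilities are illustrative assumptions, not from any dataset:

```python
import numpy as np

# Toy 2-state HMM (states 0=Rainy, 1=Sunny; observations 0=walk, 1=shop, 2=clean).
start = np.log(np.array([0.6, 0.4]))
trans = np.log(np.array([[0.7, 0.3],
                         [0.4, 0.6]]))
emit = np.log(np.array([[0.1, 0.4, 0.5],    # emissions from Rainy
                        [0.6, 0.3, 0.1]]))  # emissions from Sunny

def viterbi(obs):
    """Most likely state sequence via dynamic programming in log space."""
    T, S = len(obs), len(start)
    delta = np.zeros((T, S))        # best log-prob of any path ending in each state
    back = np.zeros((T, S), int)    # backpointers to the best previous state
    delta[0] = start + emit[:, obs[0]]
    for t in range(1, T):
        # scores[prev, cur] = delta[prev] + log P(cur | prev)
        scores = delta[t - 1][:, None] + trans
        back[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) + emit[:, obs[t]]
    # Trace back from the best final state.
    path = [int(delta[-1].argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t][path[-1]]))
    return path[::-1]

print(viterbi([0, 1, 2]))  # → [1, 0, 0]
```

Working in log space avoids the underflow that multiplying many small probabilities would cause on longer sequences.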
Medium: missing, because I am still learning
Hard: missing, because I am still far away, haha
Intro: Calculus, linear algebra, probability theory
Beginner: Convex optimization (e.g. constrained optimization/KKT conditions), graph theory, high-dimensional probability theory (e.g. the normal distribution and its linear transforms, conditional probability), Bayesian inference, stochastic processes
Medium: Probability theory (exponential-family distributions, GLM, measure theory, skewness, kurtosis), convex optimization, statistical mechanics (spin glass models), functional analysis
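As a taste of the Bayesian-inference item above, the simplest conjugate pair — a Beta prior on a coin's bias updated by coin-flip counts — needs no integration at all (toy numbers, purely illustrative):

```python
# Conjugate Beta-Binomial update: with a Beta(a, b) prior on the bias and
# h observed heads and t tails, the posterior is Beta(a + h, b + t).
a, b = 1.0, 1.0          # Beta(1, 1) = uniform prior over the coin bias
heads, tails = 7, 3      # toy data

a_post, b_post = a + heads, b + tails
post_mean = a_post / (a_post + b_post)            # posterior mean of the bias
post_map = (a_post - 1) / (a_post + b_post - 2)   # posterior mode (MAP)
print(post_mean, post_map)  # 8/12 ≈ 0.667 and 7/10 = 0.7
```

The posterior mean pulls the raw frequency 0.7 slightly toward the prior mean 0.5, which is exactly the regularizing effect of the prior.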
Deep Learning Enterprise Integration
- Code packaging
- Rules for creating files & directories
- Management software
https://github.com/youwei1-sudo/MachineLearning-Study-Path/wiki/Enterprise-Integration(代码管理)
https://jeffmacaluso.github.io/post/DeepLearningRulesOfThumb/
https://github.com/google-research/tuning_playbook
Machine Learning
10-601, Spring 2015
Carnegie Mellon University
Tom Mitchell and Maria-Florina Balcan http://www.cs.cmu.edu/~ninamf/courses/601sp15/lectures.shtml

