Hyperparameter Tuning and Best Model Selection Using Various Methods
In this code I have looked into:
1) how to tune the hyperparameters of a machine learning model
2) how to choose the best model for a given machine learning problem
First, I compare the traditional train_test_split approach with k-fold cross-validation. Then I use GridSearchCV to run k-fold cross-validation through its convenient API. I also use RandomizedSearchCV as a replacement for GridSearchCV.
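The comparison above can be sketched as follows. This is a minimal illustration, not the repository's exact code; the choice of LogisticRegression as the estimator is an assumption.

```python
# Compare a single train/test split with k-fold cross-validation on the
# digits dataset. The estimator (LogisticRegression) is illustrative.
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split, cross_val_score

digits = load_digits()
X, y = digits.data, digits.target

# Single split: the score depends on which samples land in the test set.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)
model = LogisticRegression(max_iter=5000)
model.fit(X_train, y_train)
single_score = model.score(X_test, y_test)

# 5-fold CV: averages scores over 5 different splits, reducing the
# variance that comes from one lucky (or unlucky) split.
cv_scores = cross_val_score(LogisticRegression(max_iter=5000), X, y, cv=5)
print(single_score, cv_scores.mean())
```

The cross-validated mean is a more stable estimate of generalization performance than any single split's score.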
GridSearchCV finds the parameter combination that gives the best cross-validated performance.
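A minimal GridSearchCV sketch, assuming an SVM classifier and a small illustrative parameter grid (the grid values are not taken from the repository):

```python
# Exhaustive grid search: every combination of C and kernel is evaluated
# with 5-fold cross-validation (6 combinations x 5 folds = 30 fits).
from sklearn.datasets import load_digits
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

digits = load_digits()
grid = GridSearchCV(
    SVC(gamma='auto'),
    {'C': [1, 10, 20], 'kernel': ['rbf', 'linear']},  # illustrative grid
    cv=5,
    return_train_score=False,
)
grid.fit(digits.data, digits.target)
print(grid.best_params_, grid.best_score_)
```

After fitting, `best_params_` holds the winning combination and `cv_results_` holds the full per-combination scores.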
RandomizedSearchCV is another class in the sklearn library that does the same job as GridSearchCV, but it samples a fixed number of parameter combinations instead of running an exhaustive search, which saves computation time and resources.
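A sketch of the same search with RandomizedSearchCV, again with an assumed SVM and parameter space; `n_iter` caps how many combinations are tried:

```python
# Randomized search: only n_iter of the 6 possible combinations are
# sampled and cross-validated, trading thoroughness for speed.
from sklearn.datasets import load_digits
from sklearn.model_selection import RandomizedSearchCV
from sklearn.svm import SVC

digits = load_digits()
rs = RandomizedSearchCV(
    SVC(gamma='auto'),
    {'C': [1, 10, 20], 'kernel': ['rbf', 'linear']},  # illustrative space
    cv=5,
    n_iter=3,          # try only 3 of the 6 combinations
    random_state=42,   # make the sampled combinations reproducible
)
rs.fit(digits.data, digits.target)
print(rs.best_params_, rs.best_score_)
```

On large parameter spaces this can be dramatically cheaper than a full grid while still finding a near-optimal combination.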
Finally, I use GridSearchCV to choose the best model among several classification algorithms.
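The model-selection step can be sketched as a loop that grid-searches each candidate and collects the best cross-validated score; the particular models and parameter grids below are illustrative assumptions:

```python
# Compare several classifiers by running GridSearchCV on each and
# tabulating the best cross-validated score per model.
import pandas as pd
from sklearn.datasets import load_digits
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

model_params = {  # illustrative candidates and grids
    'svm': {
        'model': SVC(gamma='auto'),
        'params': {'C': [1, 10], 'kernel': ['rbf', 'linear']},
    },
    'random_forest': {
        'model': RandomForestClassifier(),
        'params': {'n_estimators': [5, 10]},
    },
    'logistic_regression': {
        'model': LogisticRegression(max_iter=5000),
        'params': {'C': [1, 5]},
    },
}

digits = load_digits()
scores = []
for name, mp in model_params.items():
    clf = GridSearchCV(mp['model'], mp['params'], cv=5)
    clf.fit(digits.data, digits.target)
    scores.append({'model': name,
                   'best_score': clf.best_score_,
                   'best_params': clf.best_params_})

df = pd.DataFrame(scores)
print(df)
```

The resulting table makes it easy to read off which algorithm, with which parameters, performed best on the dataset.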
I used the digits dataset, which is already available in sklearn.datasets.
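Loading the dataset is a one-liner:

```python
# The digits dataset ships with scikit-learn: 1797 8x8 grayscale images
# of handwritten digits, flattened to 64 features each.
from sklearn.datasets import load_digits

digits = load_digits()
print(digits.data.shape)    # feature matrix
print(digits.target.shape)  # labels 0-9
```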
Repository: AmitJha2403/ML-Hyperparameter_Tuning-Model_Selection