Classical Machine Learning Algorithms

This repository contains well-structured Jupyter notebooks that implement and demonstrate a variety of classical machine learning algorithms using Python and Scikit-learn. It is designed for learners and practitioners who want to understand how traditional models work and how to apply them effectively to real-world datasets.


📁 Repository Structure

| Notebook | Description |
|----------|-------------|
| 01_Perceptron_and_Adaline.ipynb | Implementation of the Perceptron and Adaptive Linear Neuron (Adaline) |
| 02_classification_algorithms.ipynb | Classification models: Logistic Regression, SVM, Decision Trees, KNN |
| 03_data_preprocessing.ipynb | Data cleaning, normalization, encoding, and splitting |
| 04_data_compression.ipynb | Dimensionality reduction using PCA and LDA |
| 05_model_evaluation.ipynb | Model validation and performance metrics |
| 06_ensemble_methods.ipynb | Ensemble methods: Random Forest, Bagging, Boosting, XGBoost |
| 07_regression_analysis.ipynb | Regression using Linear Regression and RANSAC |
| 08_clustering_analysis.ipynb | Unsupervised learning: K-Means, DBSCAN, Hierarchical Clustering |
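
Taken together, the first three notebooks follow a standard workflow: load a dataset, preprocess it, and fit a simple linear classifier. The sketch below illustrates that flow, assuming scikit-learn's `Perceptron` and the Iris dataset as stand-ins; the notebooks themselves may use different data or a from-scratch implementation.

```python
# Minimal sketch of the load -> preprocess -> fit workflow (illustrative only).
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import Perceptron
from sklearn.metrics import accuracy_score

# Load a small benchmark dataset and hold out a stratified test split.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=1, stratify=y)

# Standardize features (fit the scaler on training data only).
scaler = StandardScaler().fit(X_train)
X_train_std = scaler.transform(X_train)
X_test_std = scaler.transform(X_test)

# Fit a Perceptron and report test accuracy.
ppn = Perceptron(eta0=0.1, random_state=1)
ppn.fit(X_train_std, y_train)
print("Test accuracy:", accuracy_score(y_test, ppn.predict(X_test_std)))
```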

🔍 Features Covered

  • Dimensionality Reduction (sketch below)

    • Principal Component Analysis (PCA)
    • Linear Discriminant Analysis (LDA)
  • Classification Algorithms (sketch below)

    • Logistic Regression
    • Support Vector Machine (SVM)
    • Decision Trees
    • K-Nearest Neighbors (KNN)
  • Ensemble Methods (sketch below)

    • Random Forest
    • Bagging
    • AdaBoost & Gradient Boosting
    • XGBoost
  • Regression Analysis (sketch below)

    • Linear Regression
    • RANSAC Regressor
  • Clustering Algorithms (sketch below)

    • K-Means
    • DBSCAN
    • Hierarchical Clustering
  • Model Evaluation Techniques (sketch below)

    • K-Fold Cross-Validation
    • Confusion Matrix
    • ROC Curve and AUC
    • Precision, Recall, F1 Score
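
Dimensionality reduction: a minimal sketch projecting the Iris features onto two components with PCA (unsupervised) and LDA (supervised). The dataset and component count are illustrative assumptions, not necessarily what the notebook uses.

```python
# PCA vs. LDA on the same feature matrix (illustrative dataset and settings).
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

X, y = load_iris(return_X_y=True)
X_pca = PCA(n_components=2).fit_transform(X)        # directions of maximal variance
X_lda = LinearDiscriminantAnalysis(n_components=2).fit_transform(X, y)  # class-separating axes
print(X_pca.shape, X_lda.shape)
```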
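
Classification: the four listed model families fitted on one train/test split. The dataset and hyperparameters are illustrative assumptions.

```python
# Fit several classifiers and compare test accuracy (illustrative settings).
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=1, stratify=y)

# Scale-sensitive models get a StandardScaler in front; the tree does not need one.
models = {
    "Logistic Regression": make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000)),
    "SVM (RBF kernel)": make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0)),
    "Decision Tree": DecisionTreeClassifier(max_depth=4, random_state=1),
    "KNN (k=5)": make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5)),
}
for name, model in models.items():
    acc = model.fit(X_train, y_train).score(X_test, y_test)
    print(f"{name}: test accuracy = {acc:.3f}")
```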
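
Ensemble methods: the listed ensemble learners compared with 5-fold cross-validation. The Breast Cancer dataset and hyperparameters are illustrative; the XGBoost line assumes the `xgboost` package is installed (it is listed under Technologies Used).

```python
# Compare ensemble classifiers with cross-validation (illustrative settings).
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.ensemble import (RandomForestClassifier, BaggingClassifier,
                              AdaBoostClassifier, GradientBoostingClassifier)
from xgboost import XGBClassifier

X, y = load_breast_cancer(return_X_y=True)
ensembles = {
    "Random Forest": RandomForestClassifier(n_estimators=200, random_state=1),
    "Bagging": BaggingClassifier(n_estimators=100, random_state=1),
    "AdaBoost": AdaBoostClassifier(n_estimators=100, random_state=1),
    "Gradient Boosting": GradientBoostingClassifier(random_state=1),
    "XGBoost": XGBClassifier(n_estimators=200, random_state=1),
}
for name, model in ensembles.items():
    scores = cross_val_score(model, X, y, cv=5)  # 5-fold CV accuracy
    print(f"{name}: {scores.mean():.3f} ± {scores.std():.3f}")
```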
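
Regression: ordinary least squares next to a RANSAC fit on synthetic data with injected outliers, showing how RANSAC recovers the inlier trend. The data is generated purely for illustration.

```python
# OLS vs. RANSAC on noisy data with outliers (synthetic, illustrative).
import numpy as np
from sklearn.linear_model import LinearRegression, RANSACRegressor

rng = np.random.default_rng(1)
X = rng.uniform(0, 10, size=(100, 1))
y = 3.0 * X.ravel() + 2.0 + rng.normal(scale=1.0, size=100)
y[:10] += 30  # inject outliers that pull the OLS fit upward

ols = LinearRegression().fit(X, y)
ransac = RANSACRegressor(random_state=1).fit(X, y)  # LinearRegression is the default base model

print("OLS slope:   ", ols.coef_[0])
print("RANSAC slope:", ransac.estimator_.coef_[0])  # fitted on inliers only
```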
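
Clustering: K-Means, DBSCAN, and hierarchical (agglomerative) clustering on the same synthetic blobs. Parameters such as DBSCAN's `eps` are illustrative and data-dependent.

```python
# Three clustering algorithms on one synthetic dataset (illustrative parameters).
from sklearn.datasets import make_blobs
from sklearn.cluster import KMeans, DBSCAN, AgglomerativeClustering

X, _ = make_blobs(n_samples=300, centers=3, cluster_std=0.8, random_state=1)

kmeans_labels = KMeans(n_clusters=3, n_init=10, random_state=1).fit_predict(X)
dbscan_labels = DBSCAN(eps=0.7, min_samples=5).fit_predict(X)   # -1 marks noise points
hier_labels = AgglomerativeClustering(n_clusters=3, linkage="ward").fit_predict(X)

print(set(kmeans_labels), set(dbscan_labels), set(hier_labels))
```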
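
Model evaluation: 10-fold cross-validation plus the listed metrics for a binary classifier. The model and dataset are illustrative stand-ins for whatever the notebook evaluates.

```python
# K-fold CV, confusion matrix, ROC AUC, precision/recall/F1 (illustrative model).
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import StratifiedKFold, cross_val_score, train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import (confusion_matrix, roc_auc_score,
                             precision_score, recall_score, f1_score)

X, y = load_breast_cancer(return_X_y=True)
clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))

# Stratified 10-fold cross-validation.
cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=1)
print("10-fold CV accuracy:", cross_val_score(clf, X, y, cv=cv).mean())

# Hold-out metrics on a single split.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1, stratify=y)
clf.fit(X_tr, y_tr)
y_pred = clf.predict(X_te)
y_proba = clf.predict_proba(X_te)[:, 1]
print("Confusion matrix:\n", confusion_matrix(y_te, y_pred))
print("ROC AUC:", roc_auc_score(y_te, y_proba))
print("Precision:", precision_score(y_te, y_pred),
      "Recall:", recall_score(y_te, y_pred),
      "F1:", f1_score(y_te, y_pred))
```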

🧰 Technologies Used

  • Python
  • NumPy, Pandas
  • Matplotlib, Seaborn
  • Scikit-learn
  • XGBoost
