Welcome to my Deep Learning Practice portfolio!
This repository documents my progressive, hands-on journey through deep learning. Each notebook builds upon the last, introducing new concepts, techniques, and practical applications. This guide is tailored for recruiters and reviewers, highlighting the learning flow, time investment, and the aim of each file.
```mermaid
graph TD
    A["Perceptron & Basics"] --> B["ANN & Backpropagation"]
    B --> C["Image Classification (CNNs)"]
    C --> D["Advanced CNNs & Transfer Learning"]
    D --> E["Sequence Models (RNN, LSTM, GRU)"]
    E --> F["Autoencoders & Unsupervised Learning"]
    F --> G["GANs & Advanced Topics"]
```
Each file below includes a direct link, a brief description, the main technique learned, and the approximate time spent on the topic before moving to the next one (to show progression and depth).
- Aim: Introduce and implement the perceptron, the foundational unit of neural networks.
- Technique: Perceptron from scratch for binary classification using NumPy.
- Duration after upload: 1 week spent mastering basics and mathematical intuition.
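The perceptron's learning rule fits in a few lines of NumPy. The sketch below is illustrative rather than a copy of the notebook: the AND-gate data, learning rate, and epoch count are all assumed choices.

```python
import numpy as np

def train_perceptron(X, y, lr=0.1, epochs=20):
    """Train a perceptron for binary classification (labels in {0, 1})."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            # Step activation: predict 1 if w.x + b > 0
            pred = int(np.dot(w, xi) + b > 0)
            # Perceptron rule: nudge weights toward misclassified points
            w += lr * (yi - pred) * xi
            b += lr * (yi - pred)
    return w, b

# Example: learn the (linearly separable) AND function
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 0, 0, 1])
w, b = train_perceptron(X, y)
preds = (X @ w + b > 0).astype(int)   # recovers [0, 0, 0, 1]
```

The update only fires on mistakes, which is why the perceptron converges on any linearly separable dataset.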
- Aim: Build and train multi-layer Artificial Neural Networks (ANN) with backpropagation.
- Technique: Implemented an MLP with backpropagation, both from scratch and in Keras; explored activation functions.
- Duration after upload: 1 week deepening understanding of deep networks.
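The from-scratch side of this notebook boils down to the chain rule applied layer by layer. Here is a minimal sketch of a one-hidden-layer network trained on XOR with sigmoid activations and MSE loss; the seed, hidden size, learning rate, and iteration count are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# XOR: not linearly separable, so a hidden layer is required
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0.0], [1.0], [1.0], [0.0]])

# One hidden layer of 4 units
W1 = rng.normal(0, 1, (2, 4)); b1 = np.zeros(4)
W2 = rng.normal(0, 1, (4, 1)); b2 = np.zeros(1)

losses = []
lr = 1.0
for _ in range(2000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    losses.append(float(np.mean((out - y) ** 2)))
    # Backward pass: chain rule through MSE and both sigmoid layers
    d_out = 2 * (out - y) / len(X) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)
```

Each gradient is the upstream error multiplied by the local derivative (`out * (1 - out)` for sigmoid), which is exactly what Keras automates.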
- Aim: Classify images with Convolutional Neural Networks (CNNs) on the MNIST dataset.
- Technique: Designed and trained a CNN in Keras, visualized learned filters and feature maps.
- Duration after upload: 1 week learning convolution, pooling, and feature extraction.
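The notebook itself uses Keras, but the convolution and pooling operations it relies on can be sketched directly in NumPy. The edge-detector kernel and toy image below are assumed for illustration, not taken from the notebook.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D cross-correlation, the core operation of a CNN layer."""
    kh, kw = kernel.shape
    out_h = image.shape[0] - kh + 1
    out_w = image.shape[1] - kw + 1
    out = np.zeros((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            out[i, j] = np.sum(image[i:i+kh, j:j+kw] * kernel)
    return out

def max_pool(fmap, size=2):
    """Non-overlapping max pooling: downsample while keeping strong activations."""
    h, w = fmap.shape[0] // size, fmap.shape[1] // size
    return fmap[:h*size, :w*size].reshape(h, size, w, size).max(axis=(1, 3))

# A vertical-edge kernel applied to an image that is dark left, bright right
image = np.zeros((6, 6)); image[:, 3:] = 1.0
kernel = np.array([[-1.0, 1.0], [-1.0, 1.0]])
edges = conv2d(image, kernel)   # responds only at the dark-to-bright boundary
pooled = max_pool(edges)        # the edge response survives downsampling
```

Stacking many learned kernels like this one, with pooling in between, is what produces the filters and feature maps visualized in the notebook.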
- Aim: Use pre-trained CNNs for image recognition tasks through transfer learning.
- Technique: Fine-tuned VGG/ResNet on custom datasets, applied data augmentation.
- Duration after upload: 2 weeks mastering transfer learning and advanced model usage.
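The freeze-and-fine-tune idea behind transfer learning can be shown without loading VGG or ResNet. In this toy sketch a frozen random projection stands in for a pretrained convolutional base (an assumption made purely for illustration), and only a new classification head is trained on the target task.

```python
import numpy as np

rng = np.random.default_rng(4)

# Stand-in "pretrained" feature extractor: a frozen random ReLU projection.
# In the notebook this role is played by a VGG/ResNet convolutional base.
W_frozen = rng.normal(0, 0.3, (20, 8))
extract = lambda X: np.maximum(X @ W_frozen, 0)

# New-task data whose labels depend on the frozen features
X = rng.normal(0, 1, (300, 20))
true_w = rng.normal(0, 1, 8)
y = (extract(X) @ true_w > 0).astype(float)

# Train only the new head (logistic regression) on top of frozen features
feats = extract(X)
head_w = np.zeros(8)
for _ in range(500):
    p = 1 / (1 + np.exp(-(feats @ head_w)))
    head_w -= 0.1 * feats.T @ (p - y) / len(X)   # gradient of logistic loss

acc = np.mean((feats @ head_w > 0) == (y == 1))
```

Because the extractor's weights never change, training is cheap and needs little data; full fine-tuning would additionally unfreeze some of the base's layers at a small learning rate.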
- Aim: Model sequential text data using RNNs, LSTMs, and GRUs.
- Technique: Built and compared vanilla RNN, LSTM, GRU for text generation and sentiment analysis.
- Duration after upload: 2 weeks focused on handling sequential data and vanishing gradients.
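A vanilla RNN cell, and the vanishing-gradient problem that motivates LSTMs and GRUs, can both be sketched in NumPy. The weight scales and sequence length below are assumptions chosen to make the effect visible (deliberately small recurrent weights), not parameters from the notebook.

```python
import numpy as np

rng = np.random.default_rng(1)

def rnn_forward(x_seq, W_x, W_h, b):
    """Vanilla RNN: h_t = tanh(W_x x_t + W_h h_{t-1} + b)."""
    h = np.zeros(W_h.shape[0])
    states = []
    for x_t in x_seq:
        h = np.tanh(W_x @ x_t + W_h @ h + b)
        states.append(h)
    return states

T, d = 30, 4
W_x = rng.normal(0, 0.5, (d, 2))
W_h = rng.normal(0, 0.1, (d, d))   # small recurrent weights
b = np.zeros(d)
x_seq = rng.normal(0, 1, (T, 2))
states = rnn_forward(x_seq, W_x, W_h, b)

# Gradient of h_T w.r.t. h_0 is a product of T Jacobians; each step
# multiplies by diag(1 - h^2) @ W_h, so its norm shrinks exponentially.
grad = np.eye(d)
for h in reversed(states):
    grad = (np.diag(1 - h**2) @ W_h) @ grad
```

After 30 steps the gradient norm is vanishingly small, which is why plain RNNs struggle with long-range dependencies; LSTM and GRU gates create paths where this product stays close to 1.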
- Aim: Perform dimensionality reduction and anomaly detection with autoencoders.
- Technique: Trained basic and variational autoencoders; visualized latent spaces; used for anomaly detection.
- Duration after upload: 2 weeks on unsupervised learning and generative models.
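The compression-and-reconstruction objective of an autoencoder can be illustrated with the simplest possible case: a linear autoencoder squeezing 3-D data that lies near a 1-D line through a single latent unit. The synthetic data, learning rate, and iteration count are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Data lying near a 1-D line in 3-D space: compressible to one latent unit
t = rng.normal(0, 1, (200, 1))
X = t @ np.array([[1.0, 2.0, -1.0]]) + 0.05 * rng.normal(0, 1, (200, 3))

# Linear autoencoder: encoder W_e (3 -> 1), decoder W_d (1 -> 3)
W_e = rng.normal(0, 0.1, (3, 1))
W_d = rng.normal(0, 0.1, (1, 3))

errors = []
lr = 0.01
for _ in range(500):
    z = X @ W_e          # encode into the 1-D latent space
    X_hat = z @ W_d      # decode back to 3-D
    err = X_hat - X
    errors.append(float(np.mean(err ** 2)))
    # Gradients of the reconstruction MSE w.r.t. decoder and encoder
    W_d -= lr * z.T @ err / len(X)
    W_e -= lr * X.T @ (err @ W_d.T) / len(X)
```

Points far from the learned subspace reconstruct poorly, which is exactly the signal used for anomaly detection; variational autoencoders add a probabilistic latent space on top of this idea.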
- Aim: Generate synthetic images using Generative Adversarial Networks (GANs).
- Technique: Implemented a DCGAN for image generation; explored generator/discriminator dynamics.
- Duration after upload: 3 weeks advancing into complex generative modeling.
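The generator/discriminator dynamics explored in the DCGAN notebook can be reduced to a one-dimensional toy GAN: a generator that shifts noise by a single parameter, and a logistic discriminator, trained adversarially. Everything here (data distribution, learning rates, the non-saturating generator loss) is an assumed illustration, not the notebook's DCGAN.

```python
import numpy as np

rng = np.random.default_rng(3)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

# Real data: 1-D Gaussian centered at 3; generator shifts noise by theta
real = rng.normal(3.0, 0.5, 256)
theta = 0.0                      # generator parameter (output mean)
w, b = 0.1, 0.0                  # discriminator: D(x) = sigmoid(w*x + b)

g_means = []
for _ in range(2000):
    z = rng.normal(0, 0.5, 256)
    fake = z + theta             # generator output

    # Discriminator ascent step: maximize log D(real) + log(1 - D(fake))
    d_real = sigmoid(w * real + b)
    d_fake = sigmoid(w * fake + b)
    w += 0.05 * (np.mean((1 - d_real) * real) - np.mean(d_fake * fake))
    b += 0.05 * (np.mean(1 - d_real) - np.mean(d_fake))

    # Generator ascent step: maximize log D(fake) (non-saturating loss)
    d_fake = sigmoid(w * fake + b)
    theta += 0.05 * np.mean((1 - d_fake) * w)
    g_means.append(theta)
```

The generator's mean drifts toward the real data's mean because the only way to fool the discriminator is to match the real distribution; a DCGAN plays the same game with convolutional networks over image pixels.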
- Structured Progression: Each file represents a new, mastered concept with increasing complexity.
- Breadth & Depth: Covers fundamentals through advanced generative modeling.
- Practical Focus: All notebooks are runnable, well-commented, and include outputs/visualizations.
- Growth Mindset: Repository grows as I learn; new topics and techniques are continuously added.
Aditya
Aspiring Deep Learning Engineer | GitHub Profile
Thank you for exploring my portfolio! Feedback and opportunities are welcome.