
Advances in Deep Learning

This repository contains my work for the Advances in Deep Learning class, where I explore and implement cutting-edge deep learning techniques and models.

Topics Covered

  • 4-bit Quantization + LoRA-Linear + FP16 for BigNet

    • Implemented efficient model compression and fine-tuning techniques for large neural networks using low-bit quantization, LoRA (Low-Rank Adaptation) linear layers, and half-precision (FP16) training.
  • Autoregressive Image Generator (SuperTuxKart)

    • Trained an autoregressive model from scratch to generate images, using the SuperTuxKart dataset as a benchmark for generative modeling.
  • LLMs (SmolLM2) for Unit Conversions

    • Developed and fine-tuned a small language model (SmolLM2) to perform unit conversions using In-Context Learning (ICL), Supervised Fine-Tuning (SFT), and Reinforcement Fine-Tuning (RFT).
  • Fine-tuned VLM and CLIP on SuperTuxKart Dataset

    • Adapted and fine-tuned a Vision-Language Model (VLM) and CLIP (Contrastive Language-Image Pretraining) on the SuperTuxKart dataset for improved multimodal understanding.
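The compression recipe in the first topic can be sketched in a few lines. This is a minimal NumPy illustration, assuming group-wise absmax 4-bit quantization and a standard LoRA update; the function names, group size, and layer dimensions are illustrative and not taken from the course code:

```python
import numpy as np

def quantize_4bit(w, group_size=16):
    """Group-wise absmax quantization: each group of weights maps to signed
    integers in [-8, 7] with one FP16 scale per group."""
    flat = w.reshape(-1, group_size)
    scale = np.abs(flat).max(axis=1, keepdims=True) / 7.0
    scale[scale == 0] = 1.0  # avoid divide-by-zero for all-zero groups
    q = np.clip(np.round(flat / scale), -8, 7).astype(np.int8)
    return q, scale.astype(np.float16), w.shape

def dequantize_4bit(q, scale, shape):
    return (q.astype(np.float32) * scale.astype(np.float32)).reshape(shape)

rng = np.random.default_rng(0)
out_features, in_features, rank = 8, 32, 4
W = rng.standard_normal((out_features, in_features)).astype(np.float32)
q, scale, shape = quantize_4bit(W)  # frozen base weight: 4-bit ints + FP16 scales

# Trainable LoRA factors; B starts at zero so the adapter is initially a no-op.
A = (0.01 * rng.standard_normal((rank, in_features))).astype(np.float32)
B = np.zeros((out_features, rank), dtype=np.float32)

def lora_forward(x, alpha=16.0):
    base = x @ dequantize_4bit(q, scale, shape).T  # frozen quantized weight
    update = (x @ A.T) @ B.T * (alpha / rank)      # low-rank trainable update
    return base + update

x = rng.standard_normal((2, in_features)).astype(np.float32)
y = lora_forward(x)
print(y.shape)
```

During fine-tuning only A and B would receive gradients, which is what makes the combination memory-efficient: the base weight stays frozen in 4 bits while the adapter trains in higher precision.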
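The autoregressive generator factorizes the image as a product of per-token conditionals, p(x) = ∏ p(x_t | x_<t), and samples one token at a time. The sketch below shows only the sampling loop; the `toy_logits` function is a hypothetical stand-in for the trained model, which in the actual project would be a network producing next-token logits over tokenized SuperTuxKart frames:

```python
import numpy as np

VOCAB = 8  # toy vocabulary of "image tokens" (illustrative size)

def toy_logits(prefix):
    """Stand-in for a learned model: next-token logits given the prefix.
    This toy version simply favors (last_token + 1) mod VOCAB."""
    last = prefix[-1] if prefix else 0
    logits = np.full(VOCAB, -2.0)
    logits[(last + 1) % VOCAB] = 2.0
    return logits

def sample_image(n_tokens, temperature=1.0, seed=0):
    """Autoregressive sampling: draw x_t ~ p(x_t | x_<t) one token at a time."""
    rng = np.random.default_rng(seed)
    tokens = []
    for _ in range(n_tokens):
        logits = toy_logits(tokens) / temperature
        probs = np.exp(logits - logits.max())
        probs /= probs.sum()
        tokens.append(int(rng.choice(VOCAB, p=probs)))
    return tokens

img_tokens = sample_image(16)
print(img_tokens)
```

Swapping `toy_logits` for a transformer's forward pass turns this loop into the actual generation procedure; temperature controls the sharpness of the sampling distribution.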
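Of the three techniques used for the unit-conversion task, In-Context Learning is the simplest to illustrate: the model is shown a few worked examples in the prompt and asked to continue the pattern, with no weight updates. The example conversions and Q/A format below are illustrative, not the course's exact prompt; the resulting string would be fed to SmolLM2 for completion:

```python
# Few-shot examples for in-context learning (illustrative, not the course's
# exact prompt format).
EXAMPLES = [
    ("Convert 2 km to m.", "2 km = 2000 m"),
    ("Convert 5 kg to g.", "5 kg = 5000 g"),
    ("Convert 3 h to min.", "3 h = 180 min"),
]

def build_icl_prompt(question):
    """Concatenate worked examples, then pose the new question for the
    model to complete after the final 'A:'."""
    shots = "\n".join(f"Q: {q}\nA: {a}" for q, a in EXAMPLES)
    return f"{shots}\nQ: {question}\nA:"

prompt = build_icl_prompt("Convert 4 m to cm.")
print(prompt)
```

SFT would instead train on such Q/A pairs directly, and RFT would further optimize the model with a reward on answer correctness; both modify the weights rather than the prompt.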
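CLIP-style fine-tuning optimizes a symmetric contrastive (InfoNCE) objective: matched image/text pairs in a batch are positives and all other pairings are negatives. A minimal NumPy sketch of that loss, assuming precomputed embeddings (the embedding sizes and temperature are illustrative):

```python
import numpy as np

def clip_contrastive_loss(img_emb, txt_emb, temperature=0.07):
    """Symmetric InfoNCE loss: row i of img_emb and row i of txt_emb are a
    positive pair; every other pairing in the batch is a negative."""
    img = img_emb / np.linalg.norm(img_emb, axis=1, keepdims=True)
    txt = txt_emb / np.linalg.norm(txt_emb, axis=1, keepdims=True)
    logits = img @ txt.T / temperature  # (N, N) cosine-similarity matrix
    labels = np.arange(len(img))

    def xent(l):
        # Cross-entropy with the diagonal (matched pair) as the target class.
        l = l - l.max(axis=1, keepdims=True)
        logp = l - np.log(np.exp(l).sum(axis=1, keepdims=True))
        return -logp[labels, labels].mean()

    # Average the image->text and text->image directions.
    return 0.5 * (xent(logits) + xent(logits.T))

rng = np.random.default_rng(0)
emb = rng.standard_normal((4, 16))
matched_loss = clip_contrastive_loss(emb, emb)
mismatched_loss = clip_contrastive_loss(emb, rng.standard_normal((4, 16)))
print(matched_loss, mismatched_loss)
```

When image and text embeddings agree, the diagonal dominates the similarity matrix and the loss is low; fine-tuning on SuperTuxKart pulls frame embeddings toward their caption embeddings under this objective.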

Getting Started

Each homework and module contains its own README.md and requirements.txt for setup and instructions. Please refer to those files for details on running specific experiments.


This repository demonstrates practical skills in model compression, generative modeling, language model fine-tuning, and multimodal learning, with a focus on real-world datasets and tasks.
