I'm an engineer with a diverse technical background: embedded systems during undergrad, computer vision in graduate school, and quantitative development in the finance industry. This space documents both recent and past explorations, aiming to revisit core ideas and extract deeper insights through hands-on experimentation. 🚀
DeepANI: Learning ANI Approximation via Weakly Supervised Siamese Networks
Deep learning meets genome comparison: a scalable alternative to exact ANI computation via weak supervision.
This approach combines pretrained sequence embeddings and sketch-based methods to enable accurate estimation of inter-species genomic similarity.
Status: Early-stage collaboration with @jiehua1995, a biology PhD student at LMU Munich. Open-sourcing will be considered based on ongoing development and validation.
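Since the project is not yet public, the exact architecture is not documented here; the snippet below is only a minimal PyTorch sketch of the general idea, assuming each genome is reduced to a fixed-length sketch/embedding vector and weak ANI labels (e.g., from a fast sketch-based estimator) supervise a shared-encoder Siamese regressor. All names and dimensions are illustrative.

```python
import torch
import torch.nn as nn

class SiameseANI(nn.Module):
    """Illustrative sketch (not the project's actual code): a shared
    encoder maps each genome's fixed-length sketch/embedding vector to a
    latent space; a small head regresses an ANI-like score for the pair."""
    def __init__(self, in_dim=1024, hid_dim=256):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(in_dim, hid_dim), nn.ReLU(),
            nn.Linear(hid_dim, hid_dim),
        )
        self.head = nn.Sequential(
            nn.Linear(2 * hid_dim, hid_dim), nn.ReLU(),
            nn.Linear(hid_dim, 1), nn.Sigmoid(),  # ANI scaled to [0, 1]
        )

    def forward(self, x_a, x_b):
        z_a, z_b = self.encoder(x_a), self.encoder(x_b)
        # Symmetric pair features, so the prediction does not depend
        # on the order in which the two genomes are presented.
        pair = torch.cat([torch.abs(z_a - z_b), z_a * z_b], dim=-1)
        return self.head(pair).squeeze(-1)

model = SiameseANI()
x_a, x_b = torch.randn(8, 1024), torch.randn(8, 1024)
weak_ani = torch.rand(8)  # weak labels, e.g. from a sketch-based tool
loss = nn.functional.mse_loss(model(x_a, x_b), weak_ani)
```

The symmetric pair features (absolute difference plus elementwise product) are one common choice for Siamese regression; the actual project may use a different comparison head.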
Transformer Copy Task
A basic implementation of the Transformer model with attention visualization support.
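As a rough illustration of what a copy task involves (assumed setup, not the repository's exact code): the model is trained to reproduce random token sequences, which makes both correctness and the resulting attention maps easy to verify.

```python
import torch
import torch.nn as nn

# Minimal copy-task sketch with torch.nn.Transformer: the target
# sequence is the source sequence itself, trained with teacher forcing.
VOCAB, D_MODEL, SEQ_LEN = 12, 64, 10

embed = nn.Embedding(VOCAB, D_MODEL)
model = nn.Transformer(d_model=D_MODEL, nhead=4,
                       num_encoder_layers=2, num_decoder_layers=2,
                       batch_first=True)
out_proj = nn.Linear(D_MODEL, VOCAB)

src = torch.randint(2, VOCAB, (32, SEQ_LEN))   # random token sequences
tgt_in, tgt_out = src[:, :-1], src[:, 1:]      # shift for teacher forcing
mask = model.generate_square_subsequent_mask(tgt_in.size(1))

logits = out_proj(model(embed(src), embed(tgt_in), tgt_mask=mask))
loss = nn.functional.cross_entropy(logits.reshape(-1, VOCAB),
                                   tgt_out.reshape(-1))
```

On a copy task, the trained decoder's cross-attention typically concentrates near the diagonal, which is what makes it a useful sanity check for attention visualization.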
Transformer Attention Visualization: Customized Layer
A customized layer that extracts multi-head attention weights from both the encoder and decoder, built by inheriting from PyTorch's encoder and decoder layers so the weights are exposed rather than discarded.
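One common way to do this, and presumably what inheriting from PyTorch's layers refers to here (the repository's code may differ), is to subclass `nn.TransformerEncoderLayer` and override its self-attention block so the weights returned by `nn.MultiheadAttention` are stored instead of dropped. Note that `_sa_block` is a private method whose signature can shift between PyTorch versions.

```python
import torch
import torch.nn as nn

class AttnEncoderLayer(nn.TransformerEncoderLayer):
    """Sketch: keep the per-head attention maps for later visualization."""
    def _sa_block(self, x, attn_mask, key_padding_mask, is_causal=False):
        out, weights = self.self_attn(
            x, x, x,
            attn_mask=attn_mask,
            key_padding_mask=key_padding_mask,
            need_weights=True,            # request the attention weights...
            average_attn_weights=False,   # ...one map per head, not averaged
        )
        self.attn_weights = weights.detach()  # (batch, heads, seq, seq)
        return self.dropout1(out)

layer = AttnEncoderLayer(d_model=64, nhead=4, batch_first=True)
_ = layer(torch.randn(2, 10, 64))
print(layer.attn_weights.shape)  # torch.Size([2, 4, 10, 10])
```

The same pattern extends to `nn.TransformerDecoderLayer`, which has separate private blocks for self-attention and encoder-decoder cross-attention.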
Transformer Attention Visualization Experiment: Pig Latin Seq2Seq Task
A sequence-to-sequence task translating English to Pig Latin. Visualizations of various encoder and decoder attention patterns are generated using a semantically meaningful dataset.
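Pig Latin is a useful probe precisely because the target is derived from the source by a simple positional rule, so the cross-attention maps have an expected shape to compare against. A minimal version of the standard rule (the dataset's actual generation scheme is an assumption here):

```python
import re

def to_pig_latin(word: str) -> str:
    """Common Pig Latin rule: words starting with a vowel get 'way';
    otherwise the leading consonant cluster moves to the end, plus 'ay'."""
    if word[0] in "aeiou":
        return word + "way"
    onset = re.match(r"[^aeiou]+", word).group(0)
    return word[len(onset):] + onset + "ay"

pairs = [(w, to_pig_latin(w)) for w in "the quick brown fox".split()]
# [('the', 'ethay'), ('quick', 'uickqay'), ('brown', 'ownbray'), ('fox', 'oxfay')]
```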
- to be organized...
| Tech Stack | Project | Language/Tools | Year |
|---|---|---|---|
| Transformer | Transformer Attention Visualization | Python / PyTorch | 2025 |
| Transformer | Pig Latin Seq2Seq | Python / PyTorch | 2025 |
| Transformer | Transformer Copy Task | Python / PyTorch | 2025 |
- Languages: Python, C++
- Frameworks: PyTorch
- Interests: Computer vision and graph neural networks, with a current focus on Transformer models and attention mechanisms.