ajegorovs/aj_python_tool_lib

what this repo is:

  • a sorted collection of templates that show different data processing methods.

data_processing:
* Many of the methods examined in this folder are taken from the book "Data-Driven Science and Engineering: Machine Learning, Dynamical Systems, and Control" by Brunton and Kutz.
* The morphology, graph and image-processing scripts are snippets of code from my past experience.
* Plenty of examples of regression and optimization.

multiprocessing:
* A few templates for CPU/GPU parallelization.
* The CPU templates use the multiprocessing library: a pool of async workers is created and a shared memory region is used to pass data between processes (see the sketch after this list).
* The GPU templates borrow from pytorch. Its functional methods are disconnected from the neural-network environment and can be used for image processing or linear algebra operations.
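Below is a minimal sketch of the CPU pattern described above (illustrative, not the repo's actual template): a `Pool` of async workers, each attaching to a shared-memory block and writing its result in place, so the array itself is never pickled between processes. The names (`fill_row`, the toy workload) are made up.

```python
import numpy as np
from multiprocessing import Pool, shared_memory

def fill_row(shm_name, shape, dtype, row):
    # Re-attach to the shared block inside the worker and write one row in place.
    shm = shared_memory.SharedMemory(name=shm_name)
    arr = np.ndarray(shape, dtype=dtype, buffer=shm.buf)
    arr[row] = row ** 2              # placeholder "work"
    del arr                          # release the buffer before closing the view
    shm.close()

if __name__ == "__main__":
    shape, dtype = (4, 3), np.float64
    nbytes = int(np.prod(shape)) * np.dtype(dtype).itemsize
    shm = shared_memory.SharedMemory(create=True, size=nbytes)
    arr = np.ndarray(shape, dtype=dtype, buffer=shm.buf)
    arr[:] = 0.0

    with Pool(processes=2) as pool:
        jobs = [pool.apply_async(fill_row, (shm.name, shape, dtype, r))
                for r in range(shape[0])]
        for j in jobs:
            j.get()                  # wait for every async worker to finish

    print(arr.copy())                # rows filled by separate processes
    del arr
    shm.close()
    shm.unlink()
```

And a small, equally illustrative example of the GPU/pytorch point: `torch.nn.functional` operations used as plain array operations (here a box blur via `conv2d`), with no model or training loop involved.

```python
import torch
import torch.nn.functional as F

img = torch.rand(1, 1, 64, 64)             # fake grayscale image, NCHW layout
kernel = torch.full((1, 1, 3, 3), 1 / 9)   # 3x3 averaging kernel
blurred = F.conv2d(img, kernel, padding=1) # convolution as pure image processing
print(blurred.shape)                       # torch.Size([1, 1, 64, 64])
```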

data_processing/neural_networks:
* Working examples of simple neural networks.
* XNN_from_scratch - an exercise in writing a deep NN and a convolutional NN using only numpy.
* Hopfield_Networks - an interesting approach to storing and restoring incomplete information by turning the original info (memories) into a one-to-all connected dense network (a small recall sketch follows this list).
* autoencoder - a network for compression and restoration of data. It can be modified for de-noising or retrieval of missing data.
* Variational autoencoder - a modification that changes how the compressed state (latent space) is viewed: it is treated as a probability distribution from which new data can be created by sampling.
* GAN - somewhat similar to an autoencoder (especially the variational type), except the network has a critic that learns whether generated data looks authentic.
* DNN_solve_ODE - the model learns to advance a trajectory one step at a time.
* PINN_physics-informed-NN - the model's learning is guided by a loss function that includes an ordinary differential equation (ODE), or the ODE parameters are recovered from the training data.
* RNN_recurrent-NN - a classic recurrent network cell, implemented from scratch and with the pytorch module.
* LSTM_Long_Short_Term_Memory - an upgrade of the RNN with long-term memory. A forward pass remade from scratch (learns badly) and a pytorch module implementation for reference.
* Graph_neural_networks - implements graph convolutional network (GCN) and graph attention network (GAT) forward passes and solves a problem (a GCN forward pass is sketched after this list).
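As a rough illustration of the Hopfield idea above (a sketch, not the notebook's exact code): memories are written into a dense symmetric weight matrix with the Hebbian rule, and a corrupted pattern is pulled back towards the closest stored memory.

```python
import numpy as np

patterns = np.array([[ 1, -1,  1, -1,  1, -1],
                     [ 1,  1,  1, -1, -1, -1]])   # two +/-1 "memories"
n = patterns.shape[1]

W = np.zeros((n, n))
for p in patterns:                     # Hebbian rule: W += p p^T
    W += np.outer(p, p)
np.fill_diagonal(W, 0)                 # no self-connections

state = np.array([1, -1, -1, -1, 1, -1])          # pattern 0 with one bit flipped
for _ in range(5):                     # iterate the update until it settles
    state = np.sign(W @ state).astype(int)

print(state)                           # recovers pattern 0: [ 1 -1  1 -1  1 -1]
```

And a similarly rough sketch of a single GCN layer forward pass, H' = ReLU(D^-1/2 (A + I) D^-1/2 H W): normalized neighbourhood averaging followed by a learnable linear map. The graph, feature sizes and weights here are made up.

```python
import numpy as np

A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])          # adjacency of a 3-node path graph
H = np.random.rand(3, 4)              # node features
W = np.random.rand(4, 2)              # layer weights

A_hat = A + np.eye(3)                 # add self-loops
D_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
H_next = np.maximum(0, D_inv_sqrt @ A_hat @ D_inv_sqrt @ H @ W)   # ReLU of normalized aggregation
print(H_next.shape)                   # (3, 2)
```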

plots/misc_tools:
* assorted methods that do not fit into any single category.

Tutorial videos:

In VS Code, Jupyter sets its working directory to the notebook file's directory, so you cannot import modules from the workspace root. Change this: jup_repo_dir
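A common workaround (a sketch; the repo's jup_repo_dir presumably does something similar) is to append the workspace root to `sys.path` at the top of the notebook before importing local modules:

```python
import os, sys

# Adjust ".." to however deep the notebook sits relative to the repo root.
repo_root = os.path.abspath(os.path.join(os.getcwd(), ".."))
if repo_root not in sys.path:
    sys.path.append(repo_root)
```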

For future:
* logging: https://www.youtube.com/watch?v=9L77QExPmI0
* entropy: https://towardsdatascience.com/but-what-is-entropy-ae9b2e7c2137
* Neural Ordinary Differential Equations
