README.md: 2 additions & 22 deletions
```diff
@@ -22,28 +22,8 @@ A simple, hand-rolled neural network project for testing and exploration.
 Gradually increases the learning rate at the start of training to stabilize early updates. Configure the warmup steps and scaling factors to improve convergence.
 
 
-**Robust Configuration Management with Pydantic:**
-Employs Pydantic-based configuration (in the `NeuralNetworkConfig`) that validates fields, ensures parameter correctness, and simplifies hyperparameter management.
-### Features
```
```diff
-**Flexible Architecture Configuration:**
-
-Easily define custom network layer dimensions via the `layer_dims` setting. The network supports both shallow and deeper architectures.
```
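The removed blurb above describes driving the whole architecture from a single `layer_dims` list. A minimal sketch of how such a setting typically maps to parameter shapes, assuming a plain NumPy implementation; `init_params` and the He-style scaling are illustrative, not this repo's code:

```python
import numpy as np

def init_params(layer_dims, seed=0):
    """Create one weight matrix and bias vector per consecutive pair in layer_dims."""
    rng = np.random.default_rng(seed)
    params = {}
    for l in range(1, len(layer_dims)):
        # He-style scaling keeps activation variance stable as depth grows.
        params[f"W{l}"] = rng.standard_normal((layer_dims[l], layer_dims[l - 1])) * np.sqrt(2.0 / layer_dims[l - 1])
        params[f"b{l}"] = np.zeros((layer_dims[l], 1))
    return params

# A shallow and a deeper architecture differ only in this one list.
shallow = init_params([30, 16, 1])
deeper = init_params([30, 64, 32, 16, 1])
```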
```diff
-**Classification and Regression Tasks:**
-
-Train and evaluate the network on various tasks:
-**Classification:** Tested on datasets like the Breast Cancer and Titanic datasets, providing binary or multi-class classification capabilities.
-**Regression:** Successfully applied to tasks like predicting apartment rents from the StreetEasy dataset.
```
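Of the datasets named above, Breast Cancer ships with scikit-learn, so a training run could plausibly start like this. Only the data loading is real scikit-learn API; the `NeuralNetwork` class and its methods are hypothetical stand-ins for whatever this repo actually exposes:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split

# Binary classification data: 30 features, labels in {0, 1}.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# Hypothetical usage; constructor and method names are assumptions:
# net = NeuralNetwork(config)
# net.fit(X_train, y_train)
# print(net.score(X_test, y_test))
```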
```diff
-**Multiple Optimizers:**
-
-Switch between optimizers like **SGD** and **Adam** without changing your code logic, allowing you to experiment with different optimization strategies easily.
```
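Swapping optimizers without touching the training loop usually means hiding them behind one shared `step(params, grads)` interface. A sketch under that assumption; these classes are illustrative, not the repo's implementations, though the Adam update follows the standard Kingma and Ba formulation:

```python
import numpy as np

class SGD:
    def __init__(self, lr=0.01):
        self.lr = lr

    def step(self, params, grads):
        for k in params:
            params[k] -= self.lr * grads[k]

class Adam:
    def __init__(self, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
        self.lr, self.beta1, self.beta2, self.eps = lr, beta1, beta2, eps
        self.m, self.v, self.t = {}, {}, 0

    def step(self, params, grads):
        self.t += 1
        for k in params:
            # Exponential moving averages of the gradient and its square.
            self.m[k] = self.beta1 * self.m.get(k, 0.0) + (1 - self.beta1) * grads[k]
            self.v[k] = self.beta2 * self.v.get(k, 0.0) + (1 - self.beta2) * grads[k] ** 2
            # Bias correction compensates for the zero initialization of m and v.
            m_hat = self.m[k] / (1 - self.beta1 ** self.t)
            v_hat = self.v[k] / (1 - self.beta2 ** self.t)
            params[k] -= self.lr * m_hat / (np.sqrt(v_hat) + self.eps)

# The training loop stays identical; only this line picks the strategy:
optimizer = Adam(lr=1e-3)  # or SGD(lr=1e-2)
```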
```diff
-**Dropout Regularization:**
-
-Incorporate dropout layers to combat overfitting. Control the dropout probability and leverage inverted dropout scaling for consistent training behavior.
```
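Inverted dropout, as referenced above, scales the surviving activations by 1/keep_prob at training time so expected activations match at inference and no test-time rescaling is needed. A minimal NumPy sketch; the function name and signature are illustrative:

```python
import numpy as np

def dropout_forward(a, drop_prob=0.5, training=True, rng=None):
    """Zero out units with probability drop_prob; rescale survivors (inverted dropout)."""
    if not training or drop_prob == 0.0:
        return a, None
    rng = rng or np.random.default_rng()
    keep_prob = 1.0 - drop_prob
    mask = (rng.random(a.shape) < keep_prob) / keep_prob  # entries are 0 or 1/keep_prob
    return a * mask, mask  # reuse the mask to gate gradients in backprop
```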
```diff
-**Warmup Learning Rate Schedules:**
-
-Gradually increase the learning rate at the start of training to stabilize early updates. Configure the warmup steps and scaling factors to improve convergence.
```
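The warmup schedule described above is typically a linear ramp. A sketch, with `base_lr` and `warmup_steps` as assumed parameter names:

```python
def warmup_lr(step, base_lr=0.01, warmup_steps=100):
    """Ramp linearly from near 0 to base_lr over warmup_steps, then hold steady."""
    scale = min(1.0, (step + 1) / warmup_steps)
    return base_lr * scale

# step 0 -> 0.0001, step 49 -> 0.005, step >= 99 -> 0.01
```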
```diff
-**Robust Configuration Management with Pydantic:**
-Employ Pydantic-based configuration (e.g., `NeuralNetworkConfig`) that validates fields, ensures parameter correctness, and simplifies hyperparameter management.
+
+Employs Pydantic-based configuration (in the `NeuralNetworkConfig`) that validates fields, ensures parameter correctness, and simplifies hyperparameter management.
```
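The surviving line keeps the Pydantic angle, and `NeuralNetworkConfig` is the one identifier the README confirms. A guess at its shape using the Pydantic v2 API; every field name and bound here is an assumption:

```python
from typing import List
from pydantic import BaseModel, Field, field_validator

class NeuralNetworkConfig(BaseModel):
    layer_dims: List[int]                     # e.g. [30, 16, 1]
    learning_rate: float = Field(0.01, gt=0)  # must be positive
    drop_prob: float = Field(0.0, ge=0, lt=1) # must be a valid probability
    warmup_steps: int = Field(0, ge=0)
    optimizer: str = "adam"                   # e.g. "adam" or "sgd"

    @field_validator("layer_dims")
    @classmethod
    def needs_input_and_output(cls, v):
        if len(v) < 2:
            raise ValueError("layer_dims needs at least an input and an output size")
        return v

# Invalid values fail fast with a clear error instead of corrupting training:
config = NeuralNetworkConfig(layer_dims=[30, 16, 1], learning_rate=0.05)
```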