
Features

  • 🧩 Layers: Dense, Dropout, Flatten, Pooling, Max-Pooling (forward-pass sketch after this list)
  • ⚡ Activations: ReLU, Sigmoid, Tanh, Softmax, ELU, Leaky ReLU, Linear, GeLU
  • 📉 Loss Functions: Mean Squared Error, Binary Cross-Entropy, Focal Loss, etc.
  • 🚀 Optimizers: SGD, Adam, RMSprop (update-rule sketch after this list)
  • 🔧 Preprocessing: Label Encoding, One-Hot Encoding, Standard Scaler, Min-Max Scaler
  • 🛡️ Regularizers: L1, L2, Combined L1-L2
  • ✅ Test Coverage: Comprehensive unit tests for all modules
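
To make the layer, activation, and loss features above concrete, here is a minimal NumPy sketch of the math they implement: a Dense (affine) forward pass, a ReLU activation, and a mean-squared-error loss. Function and variable names are illustrative only and are not the library's actual API.

```python
# Illustrative sketch of a Dense layer forward pass with ReLU and MSE loss.
# Names here are assumptions for demonstration, not the library's API.
import numpy as np

rng = np.random.default_rng(0)

def dense_forward(x, W, b):
    # Affine transform: y = xW + b
    return x @ W + b

def relu(z):
    # Element-wise ReLU activation
    return np.maximum(0.0, z)

def mse_loss(y_pred, y_true):
    # Mean squared error averaged over all elements
    return np.mean((y_pred - y_true) ** 2)

# Toy batch: 4 samples, 3 input features, 2 output units
x = rng.normal(size=(4, 3))
W = rng.normal(size=(3, 2)) * 0.1
b = np.zeros(2)
y_true = rng.normal(size=(4, 2))

y_pred = relu(dense_forward(x, W, b))
print("MSE:", mse_loss(y_pred, y_true))
```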
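Similarly, the optimizer and regularizer features boil down to parameter update rules. The following sketch shows one plain SGD step with an L2 penalty folded into the gradient (weight decay); again, the names are assumptions for illustration, not the library's API.

```python
# Illustrative sketch of one SGD update with L2 regularization.
# Parameter names are assumptions, not the library's API.
import numpy as np

def sgd_step(W, grad_W, lr=0.01, l2=1e-4):
    # L2 regularization adds l2 * W to the gradient before the step
    return W - lr * (grad_W + l2 * W)

W = np.ones((3, 2))
grad_W = np.full((3, 2), 0.5)
W = sgd_step(W, grad_W)
print(W)
```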