
Core Functions in PyDeepFlow

PyDeepFlow provides the building blocks necessary to create and train neural networks. This section covers the core functions available in the framework.

Multi_Layer_ANN

This class defines a multi-layer artificial neural network; a minimal instantiation sketch follows the parameter list below.

  • Constructor:
    __init__(self, X_train, Y_train, hidden_layers, activations, loss='categorical_crossentropy', use_gpu=False)
    • X_train: Input training data (NumPy or CuPy array).
    • Y_train: Ground truth labels.
    • hidden_layers: List of integers representing the number of neurons in each hidden layer.
    • activations: List of activation functions for each layer (e.g., relu, tanh).
    • loss: Loss function name (e.g., binary_crossentropy, mse); defaults to categorical_crossentropy.
    • use_gpu: Whether to use GPU for computations (requires CuPy).
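
A minimal instantiation might look like the sketch below. The import path and the data-shape conventions shown here are assumptions and may need adjusting for your version of the library.

```python
# A minimal sketch; import path and array shapes are assumptions
import numpy as np
from pydeepflow.model import Multi_Layer_ANN

# Toy binary-classification data: 4 samples with 3 features each
X_train = np.array([[0.1, 0.2, 0.7],
                    [0.9, 0.1, 0.0],
                    [0.4, 0.4, 0.2],
                    [0.2, 0.3, 0.5]])
Y_train = np.array([[1], [0], [1], [0]])

# Two hidden layers (8 and 4 neurons) with ReLU activations
model = Multi_Layer_ANN(X_train, Y_train,
                        hidden_layers=[8, 4],
                        activations=['relu', 'relu'],
                        loss='binary_crossentropy',
                        use_gpu=False)
```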

Forward Propagation

def forward_propagation(self, X)

Performs forward propagation through the network and returns the per-layer activations together with the pre-activation values (Z-values).
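
For intuition, each layer's forward step is a linear transform followed by an activation. The standalone NumPy sketch below illustrates that computation; it is not the library's internal code.

```python
import numpy as np

def forward_step(A_prev, W, b, activation=np.tanh):
    """One layer of forward propagation: linear step, then activation."""
    Z = A_prev @ W + b      # pre-activation value (the "Z-value")
    A = activation(Z)       # activation passed on to the next layer
    return A, Z
```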

Backpropagation

def backpropagation(self, X, y, activations, Z_values, learning_rate)

Computes the gradients of the loss with respect to the weights and biases and applies the corresponding updates.
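
Conceptually, the update applied to each layer is plain gradient descent: W := W - learning_rate * dW, and likewise for the biases. A minimal illustration of that update rule (not the library's internal code):

```python
def gradient_descent_update(W, b, dW, db, learning_rate):
    """Apply one gradient-descent step to a single layer's parameters."""
    W = W - learning_rate * dW   # dW: gradient of the loss w.r.t. the weights
    b = b - learning_rate * db   # db: gradient of the loss w.r.t. the biases
    return W, b
```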

Training the Model

def fit(self, epochs, learning_rate)

Trains the model for a specified number of epochs with a given learning rate.
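
Continuing the instantiation sketch above, a training run might look like this:

```python
# Train for 100 epochs with a base learning rate of 0.01
model.fit(epochs=100, learning_rate=0.01)
```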

Prediction Functions

  • predict(self, X): Predicts the class labels for the given input data.
  • predict_prob(self, X): Returns the predicted probabilities.
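
Continuing the sketch above (X_test is a hypothetical held-out array with the same feature layout as X_train):

```python
# Class labels and class probabilities for new data
labels = model.predict(X_test)
probabilities = model.predict_prob(X_test)
```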

Learning Rate Scheduling

def adjust_learning_rate(self, epoch)

Adjusts the learning rate during training based on user-defined schedules.
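
As an example of the kind of schedule this can implement, a simple exponential decay is sketched below; the formula and parameter names are illustrative assumptions, not the library's built-in schedule.

```python
def exponential_decay(initial_lr, epoch, decay_rate=0.95):
    """Shrink the learning rate by a fixed factor every epoch."""
    return initial_lr * (decay_rate ** epoch)

# With initial_lr=0.01, the rate after 10 epochs is roughly 0.006
```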
