This repository provides a basic implementation of a neural network from scratch in MATLAB, written as a personal reference. It serves as supplementary material for the CODECERDAS-6G project at Telkom University, Bandung, Indonesia.
```
├── func/
│   ├── activation/     % Activation functions
│   ├── backprop/       % Forward/backward propagation
│   ├── utils/          % Helper functions
│   └── weight_init/    % Weight initialization methods
├── xor_operation_train_no_bias.m
├── xor_operation_eval_no_bias.m
├── xor_operation_train.m
└── xor_operation_eval.m
```
## Layer Structure
```matlab
% Define layers (example: 2 inputs → 2 hidden → 1 output)
input_size = N;  % N = number of inputs
dense1 = H;      % H = number of hidden neurons
dense2 = M;      % M = number of outputs
```
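For the XOR scripts in this repository that means two inputs and one output; the hidden width of 2 below is an illustrative choice, not something the repository prescribes:

```matlab
% Concrete sizing for XOR: 2 inputs → 2 hidden neurons → 1 output
input_size = 2;  % two binary inputs
dense1 = 2;      % hidden layer width (any positive integer works)
dense2 = 1;      % single output neuron
```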
## Matrix Dimensions
```matlab
% Weight matrices
W1 = [H × N]   % Hidden layer weights
W2 = [M × H]   % Output layer weights

% Bias vectors
WB1 = [H × 1]  % Hidden layer bias
WB2 = [M × 1]  % Output layer bias

% Data format
INPUT = [samples × N]  % Each row is one sample
```
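A minimal initialization consistent with these shapes could look like the sketch below. The repository ships its own methods under `func/weight_init/`, so `randn` and the `0.5` scale here are stand-in assumptions:

```matlab
N = 2; H = 2; M = 1;           % sizes from the XOR example above

W1  = randn(H, N) * 0.5;       % hidden weights,  [H × N]
W2  = randn(M, H) * 0.5;       % output weights,  [M × H]
WB1 = zeros(H, 1);             % hidden bias,     [H × 1]
WB2 = zeros(M, 1);             % output bias,     [M × 1]

INPUT = [0 0; 0 1; 1 0; 1 1];  % [samples × N], one XOR sample per row
```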
## General Rules
- Hidden layer size: `dense` can be any positive integer
- Weight matrix rows = number of neurons in the current layer
- Weight matrix columns = number of inputs to that layer
- Bias vector length = number of neurons in that layer
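These shape rules can be sanity-checked with simple assertions; continuing the XOR sketch above (illustrative only, not part of the repository):

```matlab
% Verify the shape rules for the 2-2-1 example
assert(isequal(size(W1), [dense1, input_size]));  % rows = neurons, cols = inputs
assert(isequal(size(W2), [dense2, dense1]));
assert(numel(WB1) == dense1);                     % bias length = neurons in layer
assert(numel(WB2) == dense2);
```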
## Activation Functions
Activation functions and their derivatives are passed as function handles:

- activation: `@func_activation`
- derivative: `@func_activation_deriv`
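The handle names above are placeholders for whatever lives in `func/activation/`. As an assumption, a sigmoid pair in that style might look like:

```matlab
function y = func_sigmoid(x)
    % Logistic sigmoid, applied element-wise
    y = 1 ./ (1 + exp(-x));
end

function d = func_sigmoid_deriv(y)
    % Sigmoid derivative, written in terms of the sigmoid output y
    d = y .* (1 - y);
end
```

These would then be passed as `@func_sigmoid` and `@func_sigmoid_deriv` in the forward and backward passes below.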
## Network Flow
```matlab
% Forward pass pattern
layer = func_forward_pass(input, W, b, WB, dense, @activation);

% Calculate error
next_e = target - layer;

% Backward pass pattern
% For the last layer (output), set next_b = 0
[W, e] = func_backward_pass(layer, next_b, input, W, WB, b, next_e, l_rate, @activation_deriv);
```
Note: when using bias, make sure to include the bias input and its weight in both the forward and backward passes (compare `xor_operation_train.m` with `xor_operation_train_no_bias.m`).
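Putting the pieces together, one possible training loop for the two-layer XOR network is sketched below. It reuses the `func_forward_pass`/`func_backward_pass` signatures shown above together with the sigmoid sketch; `TARGET`, `epochs`, `l_rate`, and `b` are assumed names, and the hidden layer's `next_b` argument is assumed to be the next layer's bias weights (following the "set `next_b = 0` for the last layer" convention), so check the actual `xor_operation_train*.m` scripts before relying on it:

```matlab
% Sketch of a training loop (assumed names, not the repository's exact code)
TARGET = [0; 1; 1; 0];  % XOR truth table, one row per sample in INPUT
epochs = 10000;         % assumed iteration count
l_rate = 0.1;           % assumed learning rate
b      = 1;             % bias input value (see the no-bias scripts to omit it)

for epoch = 1:epochs
    % Forward pass through both layers
    hidden = func_forward_pass(INPUT,  W1, b, WB1, dense1, @func_sigmoid);
    output = func_forward_pass(hidden, W2, b, WB2, dense2, @func_sigmoid);

    % Output error
    e2 = TARGET - output;

    % Backward pass, output layer first; next_b = 0 because it is the last layer
    [W2, e1] = func_backward_pass(output, 0,   hidden, W2, WB2, b, e2, l_rate, @func_sigmoid_deriv);
    % Hidden layer: next_b assumed to be the next layer's bias weights
    [W1, ~]  = func_backward_pass(hidden, WB2, INPUT,  W1, WB1, b, e1, l_rate, @func_sigmoid_deriv);
end
```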
For any inquiries or feedback, you can reach me at moefqy@rocketmail.com.