This repository provides an example of deploying an MLP (Multi-Layer Perceptron) model with ONNX Runtime in C++, using pre-scaled data.
The implementation supports both regression and classification.
To train and export an MLP model to ONNX format, use one of the following Python repositories:
👉 MLP_for_Regression
👉 MLP_for_Classification
Training with either repository generates the ONNX model and scaler files, which can be used directly here.
For testing purposes, this repository already includes sample models trained on Kaggle datasets:
👉 Student Performance Dataset
👉 Mobile Price Classification Dataset
- Install ONNX Runtime (C++ API)
  You can build it or download prebuilt binaries from the official repo:
  👉 ONNX Runtime
- Update the path to your ONNX Runtime installation in `CMakeLists.txt`:
  ```cmake
  set(ONNXRUNTIME_DIR "/path/to/onnxruntime")
  ```
- Build the project:
  ```bash
  mkdir build && cd build
  cmake ..
  make
  ```
- Run the executable:
  ```bash
  ./mlp_main
  ```
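If the build or the run fails, it can help to first confirm that ONNX Runtime itself is installed and linked correctly, independently of this project. The standalone sketch below is not part of this repository; it only loads an ONNX model and prints its input/output counts, and the file name `model.onnx` is a placeholder.

```cpp
// Minimal ONNX Runtime smoke test (standalone sketch, not part of this repo).
#include <onnxruntime_cxx_api.h>
#include <iostream>

int main() {
    // Create the runtime environment and open a session on a model file.
    Ort::Env env(ORT_LOGGING_LEVEL_WARNING, "smoke_test");
    Ort::SessionOptions options;
    // "model.onnx" is a placeholder; on Windows the path must be a wide string (ORTCHAR_T).
    Ort::Session session(env, "model.onnx", options);

    std::cout << "Loaded model with " << session.GetInputCount()
              << " input(s) and " << session.GetOutputCount()
              << " output(s)." << std::endl;
    return 0;
}
```

If this compiles and runs, the `ONNXRUNTIME_DIR` path and the library linkage are fine, and any remaining issue is more likely in the model or scaler files.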
In main.cpp, you can switch between Regression and Classification modes.
- Regression mode requires scaler_y.txt (target scaler).
- Classification mode ignores scaler_y.txt even if it is provided.
Simply comment or uncomment the corresponding block depending on the type of model you want to use:
```cpp
// ================= Regression Mode =================
mlp = new MLPModel(model_path, scaler_x_path, scaler_y_path, Mode::REGRESSION);
float reg_result = mlp->PredictRegression(input_data);
std::cout << "Regression result: " << reg_result << std::endl;

// ================= Classification Mode =================
mlp = new MLPModel(model_path, scaler_x_path, scaler_y_path, Mode::CLASSIFICATION);
int cls_result = mlp->PredictClassification(input_data);
std::cout << "Classification result: class " << cls_result << std::endl;
```