
MLP_for_CPP_Deployment

This repository provides an example of deploying an MLP (Multi-Layer Perceptron) model with ONNX Runtime in C++, using pre-scaled data.
The implementation is designed to support both regression and classification.

To train an MLP model and export it to ONNX format, use one of the following Python repositories:
👉 MLP_for_Regression
👉 MLP_for_Classification

Training with either repository generates the ONNX model and scaler files, which can be used here directly.

For testing purposes, this repository already includes sample models trained on Kaggle datasets:
👉 Student Performance Dataset
👉 Mobile Price Classification Dataset


🔧 Build Instructions

  1. Install ONNX Runtime (C++ API)
    You can build it from source or download prebuilt binaries from the official repository (a quick sanity check is sketched after these steps):
    👉 ONNX Runtime

  2. Update the path to your ONNX Runtime installation in CMakeLists.txt:

    set(ONNXRUNTIME_DIR "/path/to/onnxruntime")
  3. Build the project:

    mkdir build && cd build
    cmake ..
    make
  4. Run the executable:

    ./mlp_main
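
A quick way to confirm that the ONNX Runtime installation and the path set in CMakeLists.txt are picked up correctly is a small standalone check like the sketch below. It is not part of this repository's sources; the model path is a placeholder, and it should be compiled and linked against the same ONNX Runtime installation.

// Standalone sanity check (not part of this repository):
// verifies that the ONNX Runtime headers and library are found and usable.
#include <iostream>
#include <onnxruntime_cxx_api.h>

int main() {
    // Version string of the ONNX Runtime build that actually got linked.
    std::cout << "ONNX Runtime version: "
              << OrtGetApiBase()->GetVersionString() << std::endl;

    // Creating a session also verifies that an exported .onnx file loads.
    Ort::Env env(ORT_LOGGING_LEVEL_WARNING, "sanity_check");
    Ort::SessionOptions options;
    Ort::Session session(env, "model.onnx", options);  // placeholder path
    std::cout << "Model loaded, input count: " << session.GetInputCount() << std::endl;
    return 0;
}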

⚡ Usage Notes

In main.cpp, you can switch between Regression and Classification modes.

  • Regression mode requires scaler_y.txt (target scaler).
  • Classification mode ignores scaler_y.txt even if it is provided.

Simply comment or uncomment the corresponding block depending on the type of model you want to use:

// ================= Regression Mode =================
mlp = new MLPModel(model_path, scaler_x_path, scaler_y_path, Mode::REGRESSION);
float reg_result = mlp->PredictRegression(input_data);
std::cout << "Regression result: " << reg_result << std::endl;

// ================= Classification Mode =================
mlp = new MLPModel(model_path, scaler_x_path, scaler_y_path, Mode::CLASSIFICATION);
int cls_result = mlp->PredictClassification(input_data);
std::cout << "Classification result: class " << cls_result << std::endl;
