This repository contains the implementation of DANets (Deep Abstract Networks) using PyTorch Lightning.
## Challenge

DANets have demonstrated the ability to outperform popular gradient-boosting models such as LGBM, XGBoost, and CatBoost. However, training deep learning models presents its own challenges, including the need for flexible loss functions and the often tedious process of training and tuning. This repository addresses these challenges by providing a template that helps data scientists expedite their experiments and fully explore the potential of DANets.
## Features

- **Custom loss and metric integration:** a flexible framework for plugging in custom loss functions and metrics, allowing more tailored and effective training strategies.
- **Flexible training strategy:** built on PyTorch Lightning, this implementation provides a robust, adaptable training loop aimed at achieving accuracy beyond traditional gradient-boosting models such as LGBM, XGBoost, and CatBoost.
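As an illustration of the custom-loss integration described above, here is a minimal sketch of a loss function that such a template could accept in place of standard cross-entropy. The `focal_loss` function below is a hypothetical example, not part of this repository's API; it shows the kind of callable (logits, targets → scalar tensor) a flexible training loop can consume.

```python
import torch
import torch.nn.functional as F

def focal_loss(logits: torch.Tensor, targets: torch.Tensor, gamma: float = 2.0) -> torch.Tensor:
    """Example custom loss: focal loss down-weights easy, well-classified examples.

    This is a sketch of the kind of custom loss the template can integrate;
    the name and signature are illustrative, not part of the repository.
    """
    # Per-sample cross-entropy (no reduction, so we can reweight each sample)
    ce = F.cross_entropy(logits, targets, reduction="none")
    pt = torch.exp(-ce)  # model's probability for the true class
    # Scale each sample's loss by (1 - pt)^gamma, then average
    return ((1.0 - pt) ** gamma * ce).mean()

# Tiny usage example on a 2-sample, 2-class batch
logits = torch.tensor([[2.0, 0.5], [0.1, 1.5]])
targets = torch.tensor([0, 1])
loss = focal_loss(logits, targets)
```

Because the loss is just a callable returning a scalar tensor, it can be passed into a LightningModule's `training_step` without changing the rest of the training code.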
## Original Implementation

For more details, see the original implementation: https://github.com/WhatAShot/DANet