This is the official implementation of our ICML 2024 paper "MultiMax: Sparse and Multi-Modal Attention Learning".
This repository features hands-on Jupyter notebooks covering everything from fundamental concepts to advanced neural network architectures.
We introduce two novel hybrid activation functions: S3 (Sigmoid-Softsign) and its improved version, S4 (Smoothed S3).
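The exact S3 and S4 formulas are defined in the linked work; purely as an illustration, one hypothetical way to hybridize sigmoid and softsign (not necessarily the authors' formulation) is a weighted blend of the two:

```python
import numpy as np

def sigmoid(x):
    """Logistic sigmoid: squashes inputs into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

def softsign(x):
    """Softsign: squashes inputs into (-1, 1) with polynomial tails."""
    return x / (1.0 + np.abs(x))

def hybrid_activation(x, alpha=0.5):
    """Hypothetical sigmoid-softsign blend (illustrative only; the real
    S3/S4 definitions are given in the paper and repository)."""
    return alpha * sigmoid(x) + (1.0 - alpha) * softsign(x)
```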
Fake and AI-generated image detection using transfer learning.
Linear and non-linear activation functions: Linear, ReLU, Sigmoid, Softmax, Tanh.
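All five of these have standard closed forms; a minimal NumPy sketch, independent of any particular repository listed here:

```python
import numpy as np

def linear(x):
    """Identity activation: passes inputs through unchanged."""
    return x

def relu(x):
    """Rectified Linear Unit: zeroes out negative inputs."""
    return np.maximum(0.0, x)

def sigmoid(x):
    """Logistic sigmoid: squashes inputs into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    """Hyperbolic tangent: squashes inputs into (-1, 1)."""
    return np.tanh(x)

def softmax(x):
    """Converts a vector of logits into a probability distribution.
    Subtracting the max first keeps the exponentials numerically stable."""
    exp = np.exp(x - np.max(x))
    return exp / np.sum(exp)
```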
This is a custom-built neural network that detects handwritten digits from image inputs. It uses ReLU activations in the hidden layers and a softmax output layer for classification. The model is trained with backpropagation to minimize a loss function, achieving over 99% accuracy when predicting handwritten digits.
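The described architecture (ReLU hidden layers feeding a softmax output) is a standard pattern; below is a minimal forward-pass sketch in NumPy with hypothetical layer sizes (784 input pixels, one 128-unit hidden layer, 10 digit classes), not the repository's actual code:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: 784 flattened pixels -> 128 hidden units -> 10 digits.
W1 = rng.normal(0.0, 0.01, (784, 128))
b1 = np.zeros(128)
W2 = rng.normal(0.0, 0.01, (128, 10))
b2 = np.zeros(10)

def forward(x):
    """Forward pass: ReLU hidden layer, softmax output over the 10 digits."""
    h = np.maximum(0.0, x @ W1 + b1)   # ReLU hidden activations
    logits = h @ W2 + b2
    logits = logits - logits.max()      # shift for numerical stability
    probs = np.exp(logits) / np.exp(logits).sum()
    return probs

x = rng.random(784)           # placeholder for a flattened 28x28 image
print(forward(x).argmax())    # index of the predicted digit class
```

Training would add a cross-entropy loss and backpropagated gradient updates on W1, b1, W2, b2, which the sketch omits for brevity.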
Graphs plotted with matplotlib.
A Benchmark for Activation Function Exploration for Neural Architecture Search (NAS)