Official repository of the paper "Copula Density Neural Estimation".
CODINE is a neural estimator that can approximate any copula density by maximizing a variational lower bound on the $f$-divergence.
Please refer to the paper for a precise description of all the results.
The main purpose of CODINE is to estimate the copula density. In the following, we present copula density estimation results in three different settings.
CODINE can also be adapted for use as a mutual information (MI) estimator.
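For scalar components, the mutual information can be written as the expectation of the log copula density under the copula itself, $I(X;Y)=\mathbb{E}[\log c_U(\mathbf{U})]$, so an estimated copula density yields an MI estimate via a Monte Carlo average over the pseudo-observations. A minimal sketch is given below; `copula_density` is a hypothetical stand-in for a trained CODINE model, not part of this repository's API:

```python
import numpy as np

def mi_from_copula(copula_density, u_obs):
    """Monte Carlo estimate of I(X; Y) = E_c[log c(U)] for scalar X and Y.

    copula_density: callable mapping an (n, 2) array of points in [0,1]^2 to
                    estimated copula density values (hypothetical stand-in
                    for a trained CODINE network).
    u_obs:          (n, 2) array of pseudo-observations of (X, Y); by
                    construction these are samples from the copula density.
    """
    c_vals = np.asarray(copula_density(u_obs))
    # clip to avoid log(0) when the estimated density is very small
    return float(np.mean(np.log(np.clip(c_vals, 1e-12, None))))
```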
Once the copula density is estimated with CODINE, we show how to generate data by sampling from the estimated copula with Gibbs sampling.
We show the generated digits and the architecture of the decoder we use for generation.
Since the data is generated through a decoder, we compare the outputs obtained by feeding the decoder with samples drawn from CODINE's estimated copula (referred to as CODINE generation) against those obtained with samples drawn uniformly at random (random generation).
We compare the generated data for different dimensions of the latent space.
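As a rough illustration of the Gibbs-based generation step mentioned above, the sketch below draws one sample from an estimated copula by cycling over coordinates and sampling each from a grid-discretized conditional; `copula_density` is again a hypothetical placeholder for a trained CODINE model, and the grid resolution and number of sweeps are arbitrary choices:

```python
import numpy as np

def gibbs_sample_copula(copula_density, dim, n_sweeps=200, grid_size=100, rng=None):
    """Draw one sample from the estimated copula with single-site Gibbs sampling.

    copula_density: callable mapping an (n, dim) array in [0,1]^dim to density
                    values (hypothetical stand-in for a trained CODINE model).
    """
    rng = np.random.default_rng() if rng is None else rng
    u = rng.uniform(size=dim)                        # arbitrary starting point
    grid = (np.arange(grid_size) + 0.5) / grid_size  # grid over (0, 1)
    for _ in range(n_sweeps):
        for i in range(dim):
            # Evaluate the joint density along the i-th coordinate,
            # keeping the other coordinates fixed.
            candidates = np.tile(u, (grid_size, 1))
            candidates[:, i] = grid
            weights = np.clip(np.asarray(copula_density(candidates)), 1e-12, None)
            weights = weights / weights.sum()
            # Sample the i-th coordinate from the discretized conditional.
            u[i] = rng.choice(grid, p=weights)
    return u
```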
Depending on your preferred library for neural implementations, please refer to the corresponding folder:
- `CODINE_Keras` when using Keras
- `CODINE_PyTorch` when using PyTorch
The paper presents a copula density estimation method, denoted as CODINE. CODINE is a neural network trained to estimate the copula density (and thus the pdf) associated with any data. By design, it works with pseudo-observations (data mapped to the uniform probability space; a sketch of this transformation follows the list below). It can be used for:
- Density estimation
- Dependence measures
- Mutual information estimation
- Data generation
- More...
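The pseudo-observations mentioned above are obtained by pushing each coordinate of the data through its marginal CDF; in practice the empirical CDF (i.e., normalized ranks) can be used. A minimal sketch of this transformation:

```python
import numpy as np

def pseudo_observations(x):
    """Map data to the uniform probability space via empirical marginal CDFs.

    x: (n_samples, d) array of observations.
    Returns an (n_samples, d) array in (0, 1) whose columns are approximately
    uniform (the pseudo-observations).
    """
    n = x.shape[0]
    # Rank of each sample within its column (1..n); dividing by n + 1 keeps
    # the values strictly inside the unit interval.
    ranks = np.argsort(np.argsort(x, axis=0), axis=0) + 1
    return ranks / (n + 1)
```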
The copula density is estimated by maximizing the following objective function with respect to $T$:

$$\mathcal{J}_{f}(T) = \mathbb{E}_{\mathbf{u} \sim c_{U}(\mathbf{u})}\bigl[T(\mathbf{u})\bigr] - \mathbb{E}_{\mathbf{u} \sim \pi_{U}(\mathbf{u})}\bigl[f^{*}\bigl(T(\mathbf{u})\bigr)\bigr],$$

where the first expectation is taken over the pseudo-observations (which are distributed according to the copula density $c_{U}$), $\pi_{U}$ is the uniform (independence) copula density on $[0,1]^{d}$, and $f^{*}$ is the Fenchel conjugate of the convex generator $f$.
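The sketch below illustrates how such an objective can be optimized in PyTorch, using the KL generator $f(x) = x\log x$, whose Fenchel conjugate is $f^*(t) = e^{t-1}$; the discriminator architecture, batch handling, and choice of generator are illustrative assumptions rather than the exact setup of the paper (see the `CODINE_Keras` and `CODINE_PyTorch` folders for the reference implementations):

```python
import torch

def codine_loss(T_net, u_data):
    """Negative variational lower bound for the KL generator f(x) = x log x,
    whose Fenchel conjugate is f*(t) = exp(t - 1).

    T_net:  network mapping points in [0,1]^d to scalars (illustrative architecture).
    u_data: (batch, d) tensor of pseudo-observations, i.e. samples from c_U.
    """
    u_pi = torch.rand_like(u_data)   # fresh samples from the independence copula pi_U
    t_c = T_net(u_data)              # T evaluated on copula samples
    t_pi = T_net(u_pi)               # T evaluated on independence samples
    lower_bound = t_c.mean() - torch.exp(t_pi - 1.0).mean()
    return -lower_bound              # minimize the negative of the bound
```

With this generator, a copula density estimate can be read off the learned maximizer as $\hat{c}_U(\mathbf{u}) = e^{\hat{T}(\mathbf{u}) - 1}$, since the density ratio with respect to $\pi_U = 1$ equals $(f^*)'(\hat{T})$.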
If you use the code for your research, please cite our paper:
@article{letizia2022copula,
title={Copula density neural estimation},
author={Letizia, Nunzio A and Novello, Nicola and Tonello, Andrea M},
journal={arXiv preprint arXiv:2211.15353},
year={2022}
}
The implementation is based on / inspired by: