
Avoid Error When There Is No Positive Class In Certain Bins for KL Divergence Calculation #311

Merged 1 commit into uber:master on Mar 13, 2021

Conversation

@lleiou (Contributor) commented Mar 9, 2021

Proposed changes

For KL divergence, we first calculate the probability of conversion in each bin in _GetNodeSummary(). To do that, we need the number of positive samples under each treatment or control group. The code currently gets that number as follows:

n_1 = results[treatment_group_key][1]

However, in rare cases a bin contains no positive samples for a given treatment or control group, and the line above raises a KeyError. I propose changing it to the following so that n_1 falls back to 0 when the bin has no positive samples:

n_1 = results[treatment_group_key].get(1, 0)
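A minimal sketch of the failure and the fix. The shape of `results` here (a per-group dict mapping class label to count) is an assumption for illustration; it mirrors the per-bin class counts that `_GetNodeSummary()` builds.

```python
# Hypothetical per-bin class counts, keyed by group, then by class label.
# This bin's treatment group has 4 negatives and no positives at all.
treatment_group_key = 'treatment'
results = {'treatment': {0: 4}, 'control': {0: 3, 1: 2}}

# Old code: raises KeyError because key 1 (the positive class) is absent.
try:
    n_1 = results[treatment_group_key][1]
except KeyError:
    n_1 = None  # the original code would crash at this point

# Proposed fix: dict.get with a default, so an all-negative bin yields 0.
n_1 = results[treatment_group_key].get(1, 0)
print(n_1)  # 0
```

The fix is purely defensive: when the positive class is present, `.get(1, 0)` returns exactly the same count as indexing did.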

Types of changes

What types of changes does your code introduce to CausalML?
Put an x in the boxes that apply

  • Bugfix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • Documentation Update (if none of the other choices apply)


@lleiou changed the title from "Avoid Error When There Is No Positive Class In Certain Bins" to "Avoid Error When There Is No Positive Class In Certain Bins for KL Divergence Calculation" on Mar 9, 2021
Collaborator

@paullo0106 paullo0106 left a comment


LGTM. I was able to reproduce the issue when the number of bins is large and the data size is small, and verified the fix.

Thanks for the contribution!

@paullo0106 paullo0106 merged commit 0036d58 into uber:master Mar 13, 2021
@lleiou (Contributor, Author) commented Mar 13, 2021

Thank you!
