BUG: Series.map fails when keys are tuples of different lengths (#7333) #7336
Conversation
}
df['labels'] = df['a'].map(label_mappings)
# All labels should be filled now
self.assertFalse(any(df['labels'].isnull()))
Explicitly construct the Series that you want and compare the result with the expected using tm.assert_series_equal(result, expected).
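A sketch of how the test could look with an explicitly constructed expected Series, per the comment above. The data is illustrative, and pandas.testing is used here as the public home of assert_series_equal (the PR-era code would import pandas.util.testing as tm):

```python
import pandas as pd
import pandas.testing as tm  # PR-era equivalent: pandas.util.testing


def test_map_dict_with_tuple_keys():
    # Dict keys are tuples of different lengths; before the fix the shorter
    # keys failed to map because the lookup Series was built with a MultiIndex.
    label_mappings = {(1,): 'A', (2,): 'B', (3, 4): 'A', (5, 6): 'B'}
    df = pd.DataFrame({'a': [(1,), (2,), (3, 4), (5, 6)]})

    result = df['a'].map(label_mappings)
    expected = pd.Series(['A', 'B', 'A', 'B'], name='a')

    # Compare against the explicitly constructed expected Series instead of
    # only checking for nulls.
    tm.assert_series_equal(result, expected)
```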
this seems like a bug in the Series constructor
probably related to the change to use multiindex for tuple keys by default
hm nevermind ... i guess the constructor behavior makes sense from a mi point of view
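For context, a small illustration (not from the PR, hypothetical data) of the constructor behaviour being discussed:

```python
import pandas as pd

# A dict with equal-length tuple keys: the Series constructor turns the keys
# into a MultiIndex, which is the "mi point of view" mentioned above.
s = pd.Series({('a', 1): 10, ('a', 2): 20, ('b', 1): 30})
print(type(s.index))  # <class 'pandas.core.indexes.multi.MultiIndex'>

# With tuple keys of *different* lengths (the #7333 case), that implicit
# MultiIndex no longer lines up with the original tuple values, which is
# what caused Series.map to return NaN for the shorter keys.
```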
pls add a release note to v0.14.1
Explicitly use keys as index to prevent conversion to MultiIndex
Add test for mapping with tuple-keyed dict
Specify expected output for the test and use assert_series_equal
Add note to v0.14.1 release notes
Alright, I've added to the release notes and it seems like everything's good, so I've squashed into a single commit. Should be okay to merge if people are OK with it.
Series.map fails when keys are tuples of different lengths (#7333)
@cpcloud looks ok to you? (i'll clear conflict on merging)
Yep.
@onesandzeroes thanks!
closes #7333

Seems to be due to the new behaviour of Series, which will automatically create a MultiIndex out of the dict's keys when creating a Series to map the values. The MultiIndex doesn't match the values for the shorter tuples, so they fail to map. The fix is fairly simple: just pass the dict keys explicitly as the index.
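A minimal reproduction of the reported behaviour, with illustrative data (before this fix the shorter tuple keys came back as NaN; with the fix, every value maps):

```python
import pandas as pd

label_mappings = {(1,): 'A', (2,): 'B', (3, 4): 'A', (5, 6): 'B'}
s = pd.Series([(1,), (2,), (3, 4), (5, 6)])

# Before the fix, map() built its lookup Series without an explicit index,
# so the tuple keys were coerced into a MultiIndex and the shorter keys
# ((1,) and (2,)) failed to match, producing NaN.
result = s.map(label_mappings)
print(result)
# With the fix (dict keys passed explicitly as the index), all values map:
# 0    A
# 1    B
# 2    A
# 3    B
# dtype: object
```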
I've added a test case for this specific issue, but can add more if others see related issues arising from this. I'll try to finalize the PR (squashing into a single commit, rebasing if necessary) tomorrow if it looks good.