When using a FloatFormatter with learn_rounding_scheme=True, we expect the transformer to learn the maximum number of significant digits. In practice, we see the following:
- If the data has 0-14 digits, the transformer learns the rounding scheme [Working as intended]
- If the data has 15+ digits, the transformer learns 0 digits, producing whole numbers instead [Bug]
We expect case 2 to work. Or, as a fallback, at least stop enforcing rounding if there are already a large number of digits.
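A likely culprit (an assumption on my part, not confirmed against RDT's source) is float64 precision: a double carries only about 15-17 significant decimal digits, so a learner that searches for the smallest decimal count that exactly round-trips the data can never find one past that limit. The sketch below illustrates this, along with the requested fallback; `learn_decimals` and `reverse_value` are hypothetical helpers, not RDT APIs.

```python
# Hypothetical sketch of a rounding-digit learner, NOT RDT's actual code.
# It illustrates why an exact round-trip check fails past ~14 decimals.

def learn_decimals(value, max_decimals=14):
    """Return the smallest decimal count that preserves `value` exactly,
    or None when no rounding up to `max_decimals` reproduces it."""
    for decimals in range(max_decimals + 1):
        if round(value, decimals) == value:
            return decimals
    return None

def reverse_value(value, decimals):
    # Requested fallback: when no scheme could be learned, leave the
    # value untouched instead of rounding to 0 digits.
    return value if decimals is None else round(value, decimals)

print(learn_decimals(1.12345678901234))    # → 14 (fits within float64)
print(learn_decimals(1.1234567890123456))  # → None (no exact round-trip)
```

With a fallback like `reverse_value`, the 16-digit case would pass through unrounded instead of collapsing to 1.0.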
Steps to reproduce
```python
import pandas as pd

from rdt import HyperTransformer
from rdt.transformers.numerical import FloatFormatter

# create test data with 16 digits
test_data = pd.DataFrame(data={
    'column': [1.1234567890123456]
})

ht = HyperTransformer()
ht.set_config({
    'sdtypes': {'column': 'numerical'},
    'transformers': {'column': FloatFormatter(learn_rounding_scheme=True)}
})

t = ht.fit_transform(test_data)
ht.reverse_transform(t)
```
Output: 1.0 (no digits learned)