Fix failing tests #185
Conversation
Standing issues.
```python
config['skip_animations'] = any([config['save_last_frame'],
                                 config['from_animation_number'],
                                 default.get_boolean('skip_animations')])
```
I don't know if it makes sense, but it could work.
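For what it's worth, a minimal sketch of how that expression would behave; the config values below are made up for illustration, and `False` stands in for `default.get_boolean('skip_animations')`:

```python
# Hypothetical values: saving only the last frame already implies that
# the intermediate animation frames are not needed.
config = {'save_last_frame': True, 'from_animation_number': 0}

# Skip rendering if any of the flags is truthy.
skip = any([config['save_last_frame'],
            config['from_animation_number'],
            False])  # stand-in for default.get_boolean('skip_animations')
print(skip)  # True
```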
Argh, mathematicians, always complicated...
See manim/tests/test_cli/test_cli.py, lines 8 to 11, in 7a088cf.
It's a bit worrying that 1/14th of 2/100ths of a difference, which appears to be within the margin of error of float operations in Python, would cause a test failure. Still, we can see whether the tests are robust enough to handle modifications later on.
I think we have tests again ;)
There are three issues that I fixed:
1: pytest rootdir
The first issue was a quick one: since the rootdir of pytest was set to manim/, the .npy files could not be found, as their path was resolved to tests_data/etc instead of tests/tests_data/etc.
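One hedged way to make the data path independent of pytest's rootdir would be to resolve it relative to the test file itself; the helper name `get_test_data_path` below is hypothetical, and the actual fix in this PR may simply adjust the relative paths:

```python
import os

# Resolve tests_data relative to this file rather than pytest's rootdir, so
# the .npy files are found no matter which directory pytest is invoked from.
TESTS_DIR = os.path.dirname(os.path.abspath(__file__))

def get_test_data_path(filename):
    # Hypothetical helper; the real fix may differ.
    return os.path.join(TESTS_DIR, "tests_data", filename)
```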
I think that this was an intentional move by @leotrs, who wanted the tests to use a custom manim.cfg. Sadly, we can't currently use one when not rendering a scene from the command line. See #184.
2: skip_animations not enabled for tests: a t-values issue
The second issue that these failing tests uncovered was a problem with t values.
@leotrs changed the config that the tests use a little bit; basically, he didn't add the -s flag. I agree this should not be a problem, as the last frame should be the same whether or not we skip the rendering process of the animations, but they are different. See #183. TL;DR: this is a problem with the t values. When using -s, there is only one t value generated by manim, which is 1.0 (the time when the animation ends). When skip_animations is False, there are, let's say, 15 values of t generated (at 15 fps), which go from 0 not to 1 but to 0.93333; in other terms, the animation stops at 93%. I fixed THE TESTS (not this issue) by simply adding file_write_config['skip_animations'] = True to testing_utils.py.
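To illustrate where 0.93333 comes from, here is a sketch of the arithmetic (not manim's actual render loop) for a one-second animation at 15 fps:

```python
fps = 15
run_time = 1.0  # a one-second animation
dt = 1.0 / fps

# Without -s, the t values at 15 fps are 0/15, 1/15, ..., 14/15:
ts = [i * dt / run_time for i in range(int(fps * run_time))]
print(ts[-1])  # 0.9333... -- the loop never reaches t = 1.0

# With -s, manim only produces the final frame, i.e. t = 1.0,
# which is why the two last frames can differ.
```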
3: frame_width slightly changed
I didn't have to fix this; let me explain.
frame_width is computed at line 62 of config.py. Because of how Python handles floats, this value may differ depending on the values of config['pixel_width'] and config['pixel_height'].
If config['pixel_width'] is not set in a manim.cfg (as is the case in the tests, see testing_utils.py, l. 48), the default value of pixel_width is used. There is the problem: although config['frame_width'] should be the same in both cases (set in manim.cfg and not set), since the default values and the values used in the tests have the same ratio, they slightly differ (something like 14.211111111111 and 14.23333333). That's why there is a slight pixel-related difference.
As we don't use a manim.cfg for the tests right now, this is OK. But when we do, we will have to render a new set of tests_data.
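A quick numeric sketch of the effect, assuming frame_width is derived as frame_height * pixel_width / pixel_height; the pixel sizes below are assumptions for illustration, not the actual config values:

```python
frame_height = 8.0

# Two resolutions that look like "the same ratio":
print(frame_height * 854 / 480)    # 14.233333333333333
print(frame_height * 1280 / 720)   # 14.222222222222221

# The integer pairs need not reduce to the same fraction, and float
# division then rounds to slightly different values, shifting frame_width.
```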
Issues to be fixed:
NameError: name 'FRAME_X_RADIUS' is not defined
Not sure how to fix that. The king @leotrs surely knows.
FIXED
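A guess at what the fix might look like, assuming FRAME_X_RADIUS was a module-level constant equal to half the frame width that went away in the config refactor; the config value below is illustrative:

```python
# Hypothetical shim; `config` stands for manim's global config.
config = {'frame_width': 14.222222222222221}  # illustrative value
FRAME_X_RADIUS = config['frame_width'] / 2
print(FRAME_X_RADIUS)  # 7.111...
```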
UnicodeEncodeError: 'charmap' codec can't encode character '\u2764' in position 2905: character maps to <undefined>
This was recently added to the -h flag output: "Made with ❤ by the manim community dev".
Sadly, '\u2764' (the heart) seems not to be supported by the decoding done by subprocess. To quickly fix this major issue, we could make the biggest change in manim's history: replacing ❤ with <3.
FIXED
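Another possible workaround, sketched under the assumption that the test captures the -h output via subprocess and the failure comes from Windows' cp1252 ('charmap') default encoding; the exact manim invocation here is a guess:

```python
import os
import subprocess

# The UnicodeEncodeError happens when the child Python process prints the
# heart to a pipe whose encoding defaults to cp1252 on Windows. Forcing the
# child's stdio to UTF-8 is one workaround besides swapping the heart for <3.
env = dict(os.environ, PYTHONIOENCODING="utf-8")
result = subprocess.run(
    ["python", "-m", "manim", "-h"],
    capture_output=True,
    env=env,
)
help_text = result.stdout.decode("utf-8")
print(help_text)
```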