Description
I am thinking about a test process, and have some questions:
- How to actually test the videos. In How to test videos? #34, we discussed something like this:

  ```python
  class Test(Scene):
      def construct(self):
          square = Circle()
          self.play(ShowCreation(square))

  a = Test()
  b = a.get_frame()
  d = np.load('test.npy')
  print((b == d).all())
  ```

  This is far from optimal, since only the last frame is tested. My idea is to test a frame every n seconds, where n would be set in a test config file (something like 0.2 s, or less). This would require modifying `self.play` a bit.
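A minimal sketch of the sampling idea, independent of manim (`sample_frames`, `frame_rate`, and `sample_interval` are names I made up for illustration, not manim's API): convert the interval in seconds into a frame stride and keep every n-th frame.

```python
import numpy as np

def sample_frames(frames, frame_rate, sample_interval):
    """Keep one frame every `sample_interval` seconds from a rendered frame sequence."""
    stride = max(1, int(round(sample_interval * frame_rate)))
    return frames[::stride]

# 60 dummy frames at 30 fps, each a tiny "image" array filled with its index
frames = [np.full((2, 2), i) for i in range(60)]
sampled = sample_frames(frames, frame_rate=30, sample_interval=0.2)
print(len(sampled))  # stride of 6 over 60 frames -> 10 sampled frames
```

With a 0.2 s interval at 30 fps the stride is 6, so a 2-second clip yields 10 comparison points instead of one final frame.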
- How to store and compare these frames. We could store every previously rendered frame (an np array) in its own `.npy` file and compare them one by one with the corresponding tested frames. But my idea is to use hashes: the previously rendered frames are hashed and only the hashes are stored. When testing a frame, it gets hashed and compared with the stored hash. Is this good?
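One way the hashing could look (a sketch, not an existing manim helper): hash the frame's raw bytes, folding in shape and dtype so two arrays with the same bytes but different layouts do not collide. The trade-off is that a stored hash tells you *that* a frame changed, but not *where*, whereas stored `.npy` frames can be diffed pixel by pixel.

```python
import hashlib
import numpy as np

def hash_frame(frame):
    """Hash a frame's raw bytes; shape and dtype are mixed in to avoid collisions."""
    h = hashlib.sha256()
    h.update(str(frame.shape).encode())
    h.update(str(frame.dtype).encode())
    h.update(frame.tobytes())
    return h.hexdigest()

frame = np.zeros((4, 4), dtype=np.uint8)
same = np.zeros((4, 4), dtype=np.uint8)
diff = np.ones((4, 4), dtype=np.uint8)

print(hash_frame(frame) == hash_frame(same))  # True
print(hash_frame(frame) == hash_frame(diff))  # False
```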
- The format of the tests: I think we can test each manim module separately, like testing all the creation animations in a `test_creation.py`, and then do that for every module.
- The format of the test files: My idea was to do something like this:
In `test_module.py`:

```python
class Test_AnAnimation(Scene):
    def construct(self):
        mobject = Mobject()
        hashes = self.play(AnAnimation(mobject))
        compare_with_prev_rendered(hashes)

class Test_AnotherThing(Scene):
    def construct(self):
        ...

class test_module():
    Test_AnAnimation(TEST_CONFIG)
    Test_AnotherThing(TEST_CONFIG)
    ...
```
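The `compare_with_prev_rendered` step above could be sketched like this, without manim (the function signature, the JSON store, and the file name are all assumptions of mine, not an existing convention): each test scene produces a list of frame hashes, and the helper checks them against a stored reference.

```python
import json

def compare_with_prev_rendered(hashes, store_path):
    """Compare a list of frame hashes against a stored JSON reference file.

    `store_path` and the JSON layout are hypothetical choices for this sketch.
    """
    with open(store_path) as f:
        expected = json.load(f)
    if len(hashes) != len(expected):
        return False
    return all(h == e for h, e in zip(hashes, expected))

# Write a reference once, then check a matching and a mismatching run.
with open('test_module_hashes.json', 'w') as f:
    json.dump(['abc', 'def'], f)

print(compare_with_prev_rendered(['abc', 'def'], 'test_module_hashes.json'))  # True
print(compare_with_prev_rendered(['abc', 'xyz'], 'test_module_hashes.json'))  # False
```

Regenerating the reference file would then be the mechanism for intentionally accepting a visual change.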
Thoughts?