Testing process #97

@huguesdevimeux

Description

I am thinking about a testing process, and I have a few questions:

  1. How to actually test the videos. In How to test videos? #34, we discussed something like this:
import numpy as np

class Test(Scene):
    def construct(self):
        circle = Circle()          # (was named `square` by mistake)
        self.play(ShowCreation(circle))

a = Test()
b = a.get_frame()                  # last frame of this render, as a np array
d = np.load('test.npy')            # previously saved reference frame
print((b == d).all())              # True only if every pixel matches

This is far from optimal, since only the last frame is tested. My idea is to test a frame every n seconds, where n would be set in a test config file (something like 0.2 s, or less). This would require modifying self.play a bit.
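The sampling step could be sketched as below; `frames_to_capture`, its signature, and the idea of reading the interval from a test config are assumptions for illustration, not existing manim code:

```python
def frames_to_capture(duration, frame_rate, interval):
    """Return the frame indices to test for an animation lasting
    `duration` seconds rendered at `frame_rate` fps, sampling one
    frame every `interval` seconds (the value from the test config)."""
    total_frames = int(duration * frame_rate)
    step = max(1, int(interval * frame_rate))
    return list(range(0, total_frames, step))

# e.g. a 1 s animation at 15 fps, sampled every 0.2 s
print(frames_to_capture(1.0, 15, 0.2))  # [0, 3, 6, 9, 12]
```

A modified self.play would then only capture (and hash or store) the frames at these indices instead of every frame.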

  2. How to store and compare these frames: we could store every previously-rendered frame (a np array) in its own .npy file and compare each one with the corresponding frame from the test run. But my idea is to do something with hashes: the previously-rendered frames (np arrays) are hashed and the hashes are stored. When testing a frame, it gets hashed and compared with the stored hash. Is this a good approach?
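A minimal sketch of the hashing idea, assuming frames arrive as NumPy arrays (`hash_frame` is a hypothetical helper, and SHA-256 is just one reasonable choice of hash):

```python
import hashlib
import numpy as np

def hash_frame(frame):
    """Hash a rendered frame (np array) into a short hex digest.
    The dtype and shape are mixed in so that arrays with identical
    bytes but different layouts do not collide."""
    h = hashlib.sha256()
    h.update(str(frame.dtype).encode())
    h.update(str(frame.shape).encode())
    h.update(frame.tobytes())
    return h.hexdigest()

# identical frames hash identically; a single changed pixel does not
a = np.zeros((4, 4, 3), dtype=np.uint8)
b = a.copy()
b[0, 0, 0] = 1
print(hash_frame(a) == hash_frame(a.copy()))  # True
print(hash_frame(a) == hash_frame(b))         # False
```

Storing only digests keeps the reference data tiny compared to full .npy frames, at the cost of not being able to show *where* two frames differ when a test fails.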

  3. The format of the tests: I think we can test each manim module separately, e.g. testing all the creation animations in a test_creation.py, and do the same for every module.

  4. The format of the test files: my idea was to do something like this:
    In test_module.py

class Test_AnAnimation(Scene):
    def construct(self):
        mobject = Mobject()
        hashes = self.play(AnAnimation(mobject))  # self.play returns the frame hashes
        compare_with_prev_rendered(hashes)

class Test_AnotherThing(Scene):
    def construct(self):
        ...

def test_module():
    Test_AnAnimation(TEST_CONFIG)
    Test_AnotherThing(TEST_CONFIG)
    ...
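For illustration, `compare_with_prev_rendered` might look like the sketch below; the JSON storage format, the signature, and the reference file name are all assumptions, not anything settled in this issue:

```python
import json
from pathlib import Path

def compare_with_prev_rendered(hashes, reference_file):
    """Sketch of the comparison step: load the stored hashes of the
    previously-rendered frames and check the new run against them.
    `reference_file` is assumed to hold a JSON list of hex digests,
    one per sampled frame."""
    expected = json.loads(Path(reference_file).read_text())
    return hashes == expected

# e.g. with a reference file written by an earlier, trusted render
Path("anim_ref.json").write_text(json.dumps(["abc123", "def456"]))
print(compare_with_prev_rendered(["abc123", "def456"], "anim_ref.json"))  # True
print(compare_with_prev_rendered(["abc123", "badbad"], "anim_ref.json"))  # False
```

In a real test this would raise an assertion error on mismatch rather than return a bool, so the test runner reports the failing animation by name.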

Thoughts ?
