
fix: Solve CUDA AV1 decoding #448


Merged
merged 12 commits on Jan 13, 2025
Re-enable AV1 test on CUDA
scotts committed Jan 10, 2025
commit d49fde543e9cbd7dff543a5eeca66a3a8b26e7fd
7 changes: 3 additions & 4 deletions test/decoders/test_video_decoder.py

@@ -416,10 +416,9 @@ def test_get_frames_at_fails(self, device):
         with pytest.raises(RuntimeError, match="Expected a value of type"):
             decoder.get_frames_at([0.3])
 
-    def test_get_frame_at_av1(self):
-        # We don't parametrize with CUDA because the current GPUs on CI do not
-        # support AV1:
-        decoder = VideoDecoder(AV1_VIDEO.path, device="cpu")
+    @pytest.mark.parametrize("device", cpu_and_cuda())
+    def test_get_frame_at_av1(self, device):
+        decoder = VideoDecoder(AV1_VIDEO.path, device=device)
         ref_frame10 = AV1_VIDEO.get_frame_data_by_index(10)
         ref_frame_info10 = AV1_VIDEO.get_frame_info(10)
         decoded_frame10 = decoder.get_frame_at(10)
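The new parametrization relies on a `cpu_and_cuda()` helper from the test utilities. Its exact implementation is not shown in this diff; a minimal sketch of what such a helper typically looks like in pytest suites (the `needs_cuda` mark name is an assumption, not confirmed by this diff) is:

```python
import pytest


def cpu_and_cuda():
    # Hypothetical sketch: always test on CPU, and include CUDA as a
    # parametrize value carrying a custom mark so CI can skip it on
    # machines without a suitable GPU. The real helper may differ.
    return ("cpu", pytest.param("cuda", marks=pytest.mark.needs_cuda))
```

Each value returned here becomes one test instance under `@pytest.mark.parametrize("device", cpu_and_cuda())`, so re-enabling the AV1 test on CUDA is just a matter of letting the `cuda` parameter through instead of hardcoding `device="cpu"`.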