
Conversation

@abstract-base-method

👋 Hi there, love the concept of the project. Just a quick PR I figured might be helpful: it should add concurrent download support (the GIL may still be an issue for yt-dlp in the backend) and standardize logging through slog.

If this is helpful, I'm happy to submit some other refactoring nice-to-haves!
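For readers skimming the thread, here is a minimal sketch of what a slog-instrumented concurrent download worker pool can look like; the `DownloadJob` type, the JSON handler choice, and the worker count are illustrative assumptions, not taken from the actual diff:

```go
// Illustrative sketch only: a slog-based worker pool along the lines the PR describes.
package main

import (
	"log/slog"
	"os"
	"sync"
)

// DownloadJob is a hypothetical unit of work holding the video to fetch.
type DownloadJob struct {
	VideoID string
}

func main() {
	// Standardize all logging on slog with a JSON handler as the default logger.
	slog.SetDefault(slog.New(slog.NewJSONHandler(os.Stdout, nil)))

	jobs := make(chan DownloadJob, 16)
	var wg sync.WaitGroup

	const workers = 3 // assumed value; the PR later makes this configurable
	for i := 0; i < workers; i++ {
		wg.Add(1)
		go func(id int) {
			defer wg.Done()
			for job := range jobs {
				slog.Info("starting download", "worker", id, "videoId", job.VideoID)
				// The real worker would shell out to yt-dlp here (placeholder).
			}
		}(i)
	}

	jobs <- DownloadJob{VideoID: "example-id"}
	close(jobs)
	wg.Wait()
}
```

The buffered channel decouples enqueueing from downloading, and every worker logs through the shared slog default so the output format stays consistent.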

@andrewarrow
Owner

Cool! I ran this through AI, so take this with a grain of salt, but it came up with:

- The JSON tag in `Api` is `json:"videoId:omitempty"`; I think it should be `json:"videoId,omitempty"` (see the sketch after this list).
- In `NewDownloadQueue`, `for idx := range workers` won't compile on Go versions before 1.22, which added ranging over an int; if the module targets an older version, it should be a standard indexed loop.
- `min(time.Duration, time.Duration)` is used but not defined; unless we're on Go 1.21+, where the builtin `min` covers ordered types like `time.Duration`, we'll need a small helper.
- `/youtube` no longer writes a clear success response body, so clients might not know the enqueue succeeded.
- Deduplication logic is gone; we may now enqueue the same video multiple times. Was that intentional?
- `GetQueueStatus` still references `isRunning`, but that field no longer exists in the new model.
- `len(queue)` on a channel only reports the items currently sitting in the buffer, not the total of queued plus in-progress items.
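A quick sketch of the struct tag, worker loop, and min helper fixes above; the `Api` struct, `NewDownloadQueue` signature, and `DownloadQueue` fields are reconstructed from the comments and are assumptions, not the PR's actual code:

```go
package queue

import "time"

// Api mirrors the request payload; note the comma (not colon) before omitempty.
type Api struct {
	VideoID string `json:"videoId,omitempty"`
}

// DownloadQueue is a stand-in for the PR's queue type.
type DownloadQueue struct {
	jobs chan string
}

// NewDownloadQueue starts the requested number of workers with an indexed loop,
// which compiles on any Go version (ranging over an int needs Go 1.22+).
func NewDownloadQueue(workers int) *DownloadQueue {
	q := &DownloadQueue{jobs: make(chan string, 64)}
	for idx := 0; idx < workers; idx++ {
		go q.worker(idx)
	}
	return q
}

func (q *DownloadQueue) worker(idx int) {
	for videoID := range q.jobs {
		_ = videoID // the download logic for worker idx would go here
	}
}

// minDuration is a helper for Go versions before 1.21, where the builtin min
// (which covers ordered types like time.Duration) is not available.
func minDuration(a, b time.Duration) time.Duration {
	if a < b {
		return a
	}
	return b
}
```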

Also, me personally, I kinda like it doing just one video at a time, NOT in parallel. I don't want multiple ffmpeg processes running and eating up more CPU, and YouTube will rate limit you if you download more than one at a time, so you end up with the same effective download rate anyway.

@abstract-base-method
Author

Thanks @andrewarrow, I did some additional refactoring to make the queue depth and concurrency configurable. CPU impact will vary across environments (I ran the backend on a server built specifically for transcoding workloads 😅).

I can confirm this still compiles, and I made the response on /youtube return some metadata about the state of the system. To really answer "have I queued/downloaded this video ID before?", we'd either need to keep manually enqueueing/dequeueing and searching the queue, search the filesystem for the video ID, or bring in a SQLite DB so the program can persist some state on disk. Also happy to close my PR if this isn't helpful 🙏
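As a rough illustration of the ideas in that last paragraph, here is a sketch of a `/youtube` response carrying queue metadata plus a filesystem check for a previously downloaded video ID; the handler shape, JSON fields, query parameter name, and download directory are all assumptions, not the PR's actual code:

```go
package main

import (
	"encoding/json"
	"net/http"
	"os"
	"path/filepath"
)

// EnqueueResponse is a hypothetical payload carrying queue metadata back to the client.
type EnqueueResponse struct {
	VideoID   string `json:"videoId"`
	Queued    bool   `json:"queued"`
	QueueLen  int    `json:"queueLen"`  // items currently buffered in the channel
	Duplicate bool   `json:"duplicate"` // true if a matching file already exists on disk
}

// alreadyDownloaded checks the download directory for any file whose name
// contains the video ID; a crude stand-in for a SQLite-backed history.
func alreadyDownloaded(dir, videoID string) bool {
	matches, err := filepath.Glob(filepath.Join(dir, "*"+videoID+"*"))
	return err == nil && len(matches) > 0
}

func youtubeHandler(jobs chan string, downloadDir string) http.HandlerFunc {
	return func(w http.ResponseWriter, r *http.Request) {
		videoID := r.URL.Query().Get("videoId") // assumed parameter name
		resp := EnqueueResponse{VideoID: videoID}

		if alreadyDownloaded(downloadDir, videoID) {
			resp.Duplicate = true
		} else {
			// A worker pool (as sketched earlier) would consume from jobs.
			jobs <- videoID
			resp.Queued = true
		}
		resp.QueueLen = len(jobs)

		w.Header().Set("Content-Type", "application/json")
		json.NewEncoder(w).Encode(resp)
	}
}

func main() {
	jobs := make(chan string, 64)
	http.HandleFunc("/youtube", youtubeHandler(jobs, os.TempDir()))
	http.ListenAndServe(":8080", nil) // error ignored for brevity in this sketch
}
```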

