This project provides a self-hosted service to convert YouTube channels into podcast feeds. It automatically downloads the latest videos from specified YouTube channels, converts them to audio, and generates RSS feeds that can be used with any podcast client.
1. Clone the repository:

   ```shell
   git clone https://github.com/SriviharReddy/podqueue.git
   cd podqueue
   ```

2. Export YouTube cookies (highly recommended):
   - Install the "Get cookies.txt locally" Chrome extension.
   - Log into your YouTube account in Chrome.
   - Use the extension to export your cookies as `cookies.txt`.
   - Place the `cookies.txt` file in the project root directory (`podqueue/cookies.txt`).
   - This helps avoid YouTube bot-detection issues.
3. Run the setup script:

   ```shell
   ./setup.sh
   ```

4. Start the Web UI:

   ```shell
   ./webui/start.sh
   ```

5. Open your browser and go to `http://localhost:8501`.
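If you want to sanity-check the cookie export from the step above before running anything, note that `yt-dlp` reads Netscape-format cookie files, which typically begin with a recognizable header line. The snippet below is just a convenience check, not part of podqueue's scripts:

```shell
# Check that cookies.txt looks like a Netscape-format export (the format yt-dlp reads).
# This is a convenience check, not part of the project itself.
if [ -f cookies.txt ] && head -n 1 cookies.txt | grep -qi "Netscape HTTP Cookie File"; then
  echo "cookies.txt looks OK"
else
  echo "cookies.txt is missing or not in Netscape format"
fi
```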
To access the Web UI from another device when running on a remote server:
1. Start the Web UI with network access:

   ```shell
   cd webui
   streamlit run app.py --server.address 0.0.0.0 --server.port 8501
   ```

2. Configure your server's firewall to allow connections on port 8501.

3. Access the Web UI from any device on the same network using `http://YOUR_SERVER_IP:8501`.
For production use, consider:
- Using a reverse proxy (like Nginx) with SSL encryption
- Setting up authentication to secure the Web UI
- Using a process manager (like systemd or supervisor) to keep the Web UI running
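As a sketch of the process-manager option, a minimal systemd unit might look like the following. The unit name, user, and paths are assumptions, not part of the project; adapt them to your install:

```ini
# /etc/systemd/system/podqueue-webui.service  (hypothetical name and paths)
[Unit]
Description=PodQueue Web UI
After=network.target

[Service]
User=podqueue
WorkingDirectory=/path/to/podqueue/webui
ExecStart=/usr/bin/streamlit run app.py --server.address 0.0.0.0 --server.port 8501
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

After installing the unit file, `systemctl enable --now podqueue-webui` would start it and keep it running across reboots.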
The service consists of two main components:
- Downloader (`downloader.sh`): A shell script that uses `yt-dlp` to download the latest videos from the YouTube channels specified in `scripts/channels.json`. It converts the videos to M4A audio files and stores them in the `downloads` directory.
- RSS Generator (`rss_generator.py`): A Python script that generates RSS feeds for each channel. It reads the downloaded audio files and their metadata to create the feeds in the `feeds` directory.
The service is designed to be run on a server and can be automated with cron jobs.
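The downloader's core call can be pictured roughly as below. The flags are an illustration of how `yt-dlp` is commonly used for audio extraction with an episode limit and cookies, not a copy of what `downloader.sh` actually runs; the command is echoed rather than executed so it can be inspected safely:

```shell
CHANNEL_URL="https://www.youtube.com/channel/CHANNEL_ID"  # placeholder channel
LIMIT=5                                                   # hypothetical per-channel episode cap

# Echoed, not executed: illustrates a typical yt-dlp audio-download invocation.
echo yt-dlp \
  --cookies cookies.txt \
  --extract-audio --audio-format m4a \
  --playlist-end "$LIMIT" \
  -o "downloads/%(channel)s/%(title)s.%(ext)s" \
  "$CHANNEL_URL"
```

Remove the leading `echo` to actually run it (with `yt-dlp` installed and a real channel URL).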
This project also includes a Streamlit-based web interface (webui/) for easier management of your podcast channels and downloads.
1. Prerequisites: Python 3 (the RSS generator runs with `python3`) and `yt-dlp` (used by the downloader).

2. Clone the repository:

   ```shell
   git clone https://github.com/SriviharReddy/podqueue.git
   cd podqueue
   ```

3. Install Python dependencies:

   ```shell
   pip install -r scripts/requirements.txt
   ```
4. Configure the channels:
   - Copy `scripts/channels.json.example` to `scripts/channels.json`.
   - Edit `scripts/channels.json` to add the YouTube channels you want to follow. Each entry should have an `id`, `url`, and `limit` (the maximum number of episodes to keep).
   - You can add playlists directly; channels with @username URLs (e.g., `https://www.youtube.com/@channelname`) will be automatically converted to channel ID URLs when using the Web UI.
   - The `url` in `channels.json` should be in the format `https://www.youtube.com/channel/CHANNEL_ID` for direct channel URLs.
5. Set up the base directory:
   - The scripts expect to be run from a specific base directory. You will need to edit `downloader.sh` and `rss_generator.py` to set the `BASE_DIR` variable to the absolute path of the project directory.
6. (Recommended) Cookies:
   - It is highly recommended to provide a `cookies.txt` file in the root of the project to avoid bot-detection issues with YouTube. The `downloader.sh` script will automatically use it.
   - To easily export cookies from your browser, you can use the "Get cookies.txt locally" Chrome extension.
   - Without cookies, you may encounter errors when trying to access certain YouTube content that requires authentication, or when YouTube detects automated access.
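As an illustration of the channel-configuration step above, a `channels.json` entry might look like the following. The channel name and ID are invented, and the exact top-level shape should be taken from `scripts/channels.json.example`; treat this as a sketch:

```json
[
  {
    "id": "example-channel",
    "url": "https://www.youtube.com/channel/CHANNEL_ID",
    "limit": 5
  }
]
```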
- Use the provided start script:

  ```shell
  ./webui/start.sh
  ```

Or run it manually:

1. Navigate to the webui directory:

   ```shell
   cd webui
   ```

2. Install the required Python packages:

   ```shell
   pip install -r requirements.txt
   ```

3. Run the Streamlit app:

   ```shell
   streamlit run app.py
   ```

4. Access the web interface at `http://localhost:8501`.
1. Run the downloader:

   ```shell
   ./scripts/downloader.sh
   ```

   This will download the latest videos from the configured channels.

2. Run the RSS generator:

   ```shell
   python3 scripts/rss_generator.py
   ```

   This will generate the RSS feeds in the `feeds` directory.

3. Serve the feeds:
   - The generated feeds are located in the `feeds` directory. You will need to serve this directory with a web server (e.g., Nginx, Apache) to access them from your podcast client. The `BASE_URL` in `rss_generator.py` should be set to the public URL of your server.
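A minimal Nginx sketch for serving the feeds might look like this. The hostname and filesystem path are assumptions; `BASE_URL` in `rss_generator.py` would then point at the corresponding public URL (e.g., `http://podcasts.example.com/feeds/`):

```nginx
server {
    listen 80;
    server_name podcasts.example.com;        # assumption: your own hostname

    location /feeds/ {
        alias /path/to/your/project/feeds/;  # absolute path to the feeds directory
        default_type application/rss+xml;    # serve the feeds with an RSS MIME type
    }
}
```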
You can automate the process of downloading and generating feeds using cron jobs. For example, to run the downloader every hour and the RSS generator every two hours, you could add the following to your crontab:
```shell
0 * * * * /path/to/your/project/scripts/downloader.sh
0 */2 * * * python3 /path/to/your/project/scripts/rss_generator.py
```
Pull requests are welcome. For major changes, please open an issue first to discuss what you would like to change.