From 2d28528033e6032c2f212973641b604467f65860 Mon Sep 17 00:00:00 2001
From: Nicolai Rosdahl Tellefsen
Date: Sun, 1 Oct 2023 16:05:24 +0200
Subject: [PATCH] Fix crawl command using docker-compose

---
 README.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/README.md b/README.md
index 658a633..e009e00 100644
--- a/README.md
+++ b/README.md
@@ -14,7 +14,7 @@ Heavily based on services provided by [Webrecorder project](https://github.com/w
 
 To capture a new collection run this, replacing `<url>` with the page you want to start crawling
 
-`docker-compose run -rm crawler crawl --url <url> --generateWACZ --collection my-new-collection`
+`docker-compose run crawler crawl --url <url> --generateWACZ --collection my-new-collection`
 
 This is essentially the same steps as [Browsertrix crawler - getting started](https://github.com/webrecorder/browsertrix-crawler#getting-started), but with our default volumes
 
@@ -55,4 +55,4 @@ Add additional repositories or other sources there.
 1. Add one or more custom configs to the `browsertrix-crawler/configs` folder
 2. Add cron-job that runs this container with docker-compose and a command like `crawl --config /app/configs/my-crawl-config.yaml`
 
-Full example command (without scheduling): `docker-compose run -rm crawler crawl --config /app/configs/my-crawl-config.yaml`
+Full example command (without scheduling): `docker-compose run crawler crawl --config /app/configs/my-crawl-config.yaml`
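
Note on the fix: the flag the patch drops was the invalid single-dash `-rm`; the valid docker-compose spelling is `--rm`, which removes the container once the crawl exits. A minimal sketch of the fixed command with that cleanup flag added, where `https://example.com/` is a stand-in for the page you want to start crawling:

    # Fixed crawl command; --rm (two dashes) cleans up the container afterwards.
    # https://example.com/ is a placeholder for the actual start page.
    docker-compose run --rm crawler crawl --url https://example.com/ --generateWACZ --collection my-new-collection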
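
For the scheduled-crawl steps in the second hunk, a minimal sketch of what a config such as `/app/configs/my-crawl-config.yaml` could contain, assuming browsertrix-crawler's YAML config format in which keys mirror its CLI flags; the seed URL and collection name here are placeholders, not from this patch:

    # Hypothetical crawl config; keys correspond to the CLI flags used above.
    seeds:
      - https://example.com/
    generateWACZ: true
    collection: my-scheduled-collection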
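
And a sketch of the cron-job from step 2; the schedule and the repository path `/opt/archive` are assumptions, not from this patch:

    # Hypothetical crontab entry: run the crawl every Sunday at 03:00.
    # /opt/archive stands in for the directory holding this repo's docker-compose.yml.
    0 3 * * 0  cd /opt/archive && docker-compose run crawler crawl --config /app/configs/my-crawl-config.yaml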