Description
Describe the bug
We use docker-compose to deploy Scrapyd and ScrapydWeb, and ScrapydWeb is configured to reach Scrapyd via its internal Docker hostname:
- SCRAPYD_SERVERS=scrapyd:6800
As a result, every Log URL in ScrapydWeb starts with scrapyd:6800, which is only resolvable inside the Docker network. Is there a way to configure a different host for the Log URLs and other Scrapyd-related links?
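A minimal compose sketch of the setup described above (the image names and port mappings are assumptions for illustration, not taken from the actual deployment):

```yaml
# docker-compose.yml (sketch): ScrapydWeb resolves Scrapyd by its compose
# service name, so all generated Log URLs begin with scrapyd:6800, a
# hostname that a browser outside the Docker network cannot resolve.
version: "3"
services:
  scrapyd:
    image: vimagick/scrapyd            # assumed image; any Scrapyd image applies
    ports:
      - "6800:6800"                    # publishing the port makes logs reachable from the host
  scrapydweb:
    image: chinaclark1203/scrapydweb   # assumed image
    environment:
      - SCRAPYD_SERVERS=scrapyd:6800   # internal hostname -> appears in every Log URL
    ports:
      - "5000:5000"
    depends_on:
      - scrapyd
```

One possible workaround (an assumption, not a confirmed ScrapydWeb feature) is to set SCRAPYD_SERVERS to a hostname that resolves both inside and outside the compose network, e.g. the host machine's address with port 6800 published as above.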
To Reproduce
Steps to reproduce the behavior:
- Go to '...'
- Click on '....'
- See error
Expected behavior
A clear and concise description of what you expected to happen.
Logs
Add logs of ScrapydWeb and Scrapyd (optional) when reproducing the bug.
(It's recommended to run ScrapydWeb with argument '--verbose' if its version >= 1.0.0)
Screenshots
If applicable, add screenshots to help explain your problem.
Environment (please complete the following information):
- Operating system: [e.g. Win 10, macOS 10.14, Ubuntu 18, CentOS 7.6, Debian 9.6 or Fedora 29]
- Python version: [e.g. 2.7 or 3.7]
- ScrapydWeb version: [e.g. 1.4.0 or latest code on GitHub]
- ScrapydWeb related settings: [e.g. 'ENABLE_AUTH = True']
- Scrapyd version: [e.g. 1.2.1 or latest code on GitHub]
- Scrapyd amount: [e.g. 1 or 5]
- Scrapy version: [e.g. 1.8.0, 2.0.0 or latest code on GitHub]
- Browser: [e.g. Chrome 71, Firefox 64 or Safari 12]
Additional context
Add any other context about the problem here.