Hurl will only run first test #593

Closed
@bdmorin

Description

uname -a                                                                                   
Darwin dredl0ck 21.5.0 Darwin Kernel Version 21.5.0: Tue Apr 26 21:08:22 PDT 2022; root:xnu-8020.121.3~4/RELEASE_X86_64 x86_64

hurl -V
hurl 1.5.0 libcurl/7.79.1 (SecureTransport) LibreSSL/3.3.6 zlib/1.2.11 1.45.1

I feel like this is user error, but for the life of me I can't figure out what I'm doing wrong. I'm trying to write assertions for various domains' robots.txt files, and Hurl is only running the first test in my hurl file.

robots.hurl

GET https://domain.com/robots.txt

HTTP/* 200
[Asserts]
status < 300

GET https://staging.domain.com/robots.txt

HTTP/* 200
[Asserts]
status < 300

GET https://development.domain.com/robots.txt

HTTP/* 200
[Asserts]
status < 300

GET https://domain.com/robots.txt

HTTP/* 200
[Asserts]
status < 300

GET https://staging.domain.com/robots.txt

HTTP/* 200
[Asserts]
status < 300

GET https://development.domain.com/robots.txt

HTTP/* 200
[Asserts]
status < 300
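
To see whether the later entries execute at all, one option (a sketch, assuming Hurl's --verbose log writes curl-style ">"-prefixed request lines to stderr) is to count the outgoing requests directly:

# Count outgoing request lines in the verbose log.
# 2>&1 >/dev/null routes stderr into the pipe and discards
# response bodies on stdout, so grep only sees the log.
hurl --verbose robots.hurl 2>&1 >/dev/null | grep -c '^> GET'

If this prints 6, all entries are running and only the summary is counting differently.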

Execution

hurl --test --summary robots.hurl
robots.hurl: RUNNING [1/1]
robots.hurl: SUCCESS
--------------------------------------------------------------------------------
Executed:  1
Succeeded: 1 (100.0%)
Failed:    0 (0.0%)
Duration:  1536ms

So it's only running the first one.

I tried taking out all the tests and just running the URLs.

robots2.hurl
GET https://domain.com/robots.txt
GET https://staging.domain.com/robots.txt
GET https://development.domain.com/robots.txt
GET https://domain.com/robots.txt
GET https://staging.domain.com/robots.txt
GET https://development.domain.com/robots.txt
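
As a cross-check independent of Hurl (a sketch; domain.com stands in for the real hosts here), each URL can be fetched with curl to confirm that the endpoints themselves respond:

# Fetch each robots.txt directly and print the HTTP status per host.
for host in domain.com staging.domain.com development.domain.com; do
  curl -s -o /dev/null -w "%{http_code}  https://$host/robots.txt\n" \
    "https://$host/robots.txt"
done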


hurl --progress robots2.hurl
robots2.hurl: RUNNING [1/1]
robots2.hurl: SUCCESS
# Block all web crawlers from scanning the contact page.

user-agent: *
disallow: /contact
disallow: /contact*.php

# Handle multiple sitemaps:
# - 1 static marketing sitemap (excludes blog content).
# - 1 dynamic blog sitemap.
# Note: this command is only supported by Google, Ask, Bing, and Yahoo search engines.

sitemap: https://www.domain.com/sitemap_index.xml

Again, it's only running the first one.

I'd swear this is user error, but it looks like a bug, since the docs say you can just list requests one after another.

Let me know if I can provide more info.

Metadata

Labels: documentation, enhancement
