A job is an independently running task. Whenever you submit a job through an HTTP request, a job id is returned that you can later use to query the job's state. A job can contain multiple commands/scripts.

Submitting a Job (POST /v1/jobs)

To submit a new job, do:

POST http://localhost:2112/v1/jobs
{
	"Executions": [
		{
			"Executable": {
				"Command": "echo",
				"Args": ["shhttp is awesome"]
			}
		},
		{
			"Executable": {
				"Command": "echo",
				"Args": ["\"winner\nwinner\nchicken\ndinner\"", "|", "grep", "chicken"],
				"Shell": true
			}
		}
	]
}

This will return:

{
    "id": "1549620684621883000-ddf85eb0-e2f8-4ff9-9012-c314dec75cf2"
}
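
The same request can be made from the command line. A minimal sketch, assuming the daemon is listening on its default local port and that curl and jq are installed (jq is only used to pull the id out of the response; the Content-Type header is included as a common convention and may not be strictly required):

curl -s -X POST http://localhost:2112/v1/jobs \
    -H "Content-Type: application/json" \
    -d '{"Executions": [{"Executable": {"Command": "echo", "Args": ["shhttp is awesome"]}}]}' \
    | jq -r '.id'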

Getting Job Status (GET /v1/jobs/{id})

To get the status of a job, do:

GET http://localhost:2112/v1/jobs/1549620684621883000-ddf85eb0-e2f8-4ff9-9012-c314dec75cf2

This will return:

{
    "Id": "1549620684621883000-ddf85eb0-e2f8-4ff9-9012-c314dec75cf2",
    "Executions": [
        {
            "Executable": {
                "Command": "echo",
                "Args": [
                    "shhttp is awesome"
                ],
                "ExecPath": "",
                "Stdin": "",
                "Shell": false
            },
            "Stdout": "shhttp is awesome\n",
            "Stderr": "",
            "ExitCode": 0,
            "Start": 1549620684,
            "End": 1549620684
        },
        {
            "Executable": {
                "Command": "echo",
                "Args": [
                    "\"winner\nwinner\nchicken\ndinner\"",
                    "|",
                    "grep",
                    "chicken"
                ],
                "ExecPath": "",
                "Stdin": "",
                "Shell": true
            },
            "Stdout": "chicken\n",
            "Stderr": "",
            "ExitCode": 0,
            "Start": 1549620684,
            "End": 1549620684
        }
    ],
    "Status": "DONE",
    "Created": 1549620684,
    "LastModified": 1549620684,
    "IgnoreErrors": false
}

The commands/scripts in a job are executed in sequence.
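
Since jobs run asynchronously, a caller typically polls this endpoint until the job reaches a terminal state. A minimal polling sketch, assuming curl and jq are available; note that DONE is the only status shown on this page, so a job that fails presumably reports a different status, which this loop does not handle:

ID=1549620684621883000-ddf85eb0-e2f8-4ff9-9012-c314dec75cf2
# Poll once per second until the job's Status becomes DONE.
while [ "$(curl -s http://localhost:2112/v1/jobs/$ID | jq -r '.Status')" != "DONE" ]; do
    sleep 1
done
# Print the stdout of every execution in the finished job.
curl -s http://localhost:2112/v1/jobs/$ID | jq -r '.Executions[].Stdout'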

Ignoring Errors

A job will fail if any of its commands/scripts returns a non-zero exit code. In that case, the job fails at that command/script and the subsequent commands/scripts are not executed. To prevent this, you can ignore errors by setting IgnoreErrors to true:

{
	"Executions": [
		{
			"Executable": {
				"Command": "echoecho",
				"Args": ["shhttp is awesome"]
			}
		},
		{
			"Executable": {
				"Command": "echo",
				"Args": ["\"winner\nwinner\nchicken\ndinner\"", "|", "grep", "chicken"],
				"Shell": true
			}
		}
	],
	"IgnoreErrors": true
}
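
With IgnoreErrors set, the job continues past a failing command, so the per-execution ExitCode fields in the status response are where failures show up. A small sketch, again assuming curl and jq, that prints the command and exit code of every execution in a job (replace the placeholder with a real job id):

ID="<job id returned by POST /v1/jobs>"
# Print each execution's command and its exit code.
curl -s http://localhost:2112/v1/jobs/$ID \
    | jq -r '.Executions[] | "\(.Executable.Command): \(.ExitCode)"'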

Get All Job Ids (GET /v1/jobs)

To get all job ids, do:

GET http://localhost:2112/v1/jobs

This will return a list of all the job ids:

{
    "Ids": [
        "1549621190200519000-0185f474-7689-4bf3-854d-9d8569e4ca30",
        "1549621269788136000-d87ae1a1-a7e4-41d9-a6a8-34d68f4a8c60",
        "1549621271388652000-24a8104e-0a09-46aa-84fb-30090f7dda3a"
    ]
}
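
A short sketch, assuming curl and jq as before, that combines this endpoint with GET /v1/jobs/{id} to print the current status of every known job:

# Print "<id>: <status>" for every job the daemon knows about.
for ID in $(curl -s http://localhost:2112/v1/jobs | jq -r '.Ids[]'); do
    echo "$ID: $(curl -s http://localhost:2112/v1/jobs/$ID | jq -r '.Status')"
done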

Finished Jobs Cleanup

The job history is maintained in a file-based store, which can clutter up and consume disk space over time. To clean up finished (completed or failed) jobs, run shhttp with the -clean-interval flag, e.g.

shhttp -clean-interval 5

The -clean-interval flag takes an integer specifying the number of hours after which finished jobs are purged from the job history.

Jobs Revival on Restart

If the daemon is killed, some jobs may have been queued or in progress at the time. On startup, the daemon can either revive those jobs or simply consider the interrupted jobs as failed. This is configured with the -revive flag. To make the daemon revive interrupted jobs, set -revive to true:

shhttp -revive true
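
Assuming the two flags can be combined like ordinary command-line flags, a daemon that revives interrupted jobs and purges finished ones every 24 hours could be started with:

shhttp -clean-interval 24 -revive true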