Description
...since it doesn't generate a unique workflow job name. The Apache Beam job does; maybe just borrow its logic, or do something similar to generate a unique suffix?
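For illustration, a minimal sketch of the idea, assuming the merge-headers pipeline builds its options from the pipeline args; the helper name `_with_unique_job_name` and the suffix scheme are placeholders, not the project's actual code:

```python
import uuid

from apache_beam.options.pipeline_options import GoogleCloudOptions, PipelineOptions


def _with_unique_job_name(pipeline_args, base_name='merge-vcf-headers'):
  # Hypothetical helper: append a short random suffix so concurrent runs
  # don't collide on the Dataflow job name (names must be unique among
  # active jobs in a project/region).
  options = PipelineOptions(pipeline_args)
  options.view_as(GoogleCloudOptions).job_name = '{}-{}'.format(
      base_name, uuid.uuid4().hex[:8])
  return options
```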
Example failure:
Traceback (most recent call last):
  File "/usr/lib/python2.7/runpy.py", line 162, in _run_module_as_main
    "__main__", fname, loader, pkg_name)
  File "/usr/lib/python2.7/runpy.py", line 72, in _run_code
    exec code in run_globals
  File "/home/color/gcp-variant-transforms/local/lib/python2.7/site-packages/gcp_variant_transforms/vcf_to_bq.py", line 284, in <module>
    run()
  File "/home/color/gcp-variant-transforms/local/lib/python2.7/site-packages/gcp_variant_transforms/vcf_to_bq.py", line 206, in run
    _merge_headers(known_args, pipeline_args, pipeline_mode)
  File "/home/color/gcp-variant-transforms/local/lib/python2.7/site-packages/gcp_variant_transforms/vcf_to_bq.py", line 183, in _merge_headers
    known_args.representative_header_file = temp_merged_headers_file_path
  File "/home/color/gcp-variant-transforms/local/local/lib/python2.7/site-packages/apache_beam/pipeline.py", line 389, in __exit__
    self.run().wait_until_finish()
  File "/home/color/gcp-variant-transforms/local/local/lib/python2.7/site-packages/apache_beam/pipeline.py", line 369, in run
    self.to_runner_api(), self.runner, self._options).run(False)
  File "/home/color/gcp-variant-transforms/local/local/lib/python2.7/site-packages/apache_beam/pipeline.py", line 382, in run
    return self.runner.run_pipeline(self)
  File "/home/color/gcp-variant-transforms/local/local/lib/python2.7/site-packages/apache_beam/runners/dataflow/dataflow_runner.py", line 324, in run_pipeline
    self.dataflow_client.create_job(self.job), self)
  File "/home/color/gcp-variant-transforms/local/local/lib/python2.7/site-packages/apache_beam/utils/retry.py", line 180, in wrapper
    return fun(*args, **kwargs)
  File "/home/color/gcp-variant-transforms/local/local/lib/python2.7/site-packages/apache_beam/runners/dataflow/internal/apiclient.py", line 475, in create_job
    return self.submit_job_description(job)
  File "/home/color/gcp-variant-transforms/local/local/lib/python2.7/site-packages/apache_beam/utils/retry.py", line 180, in wrapper
    return fun(*args, **kwargs)
  File "/home/color/gcp-variant-transforms/local/local/lib/python2.7/site-packages/apache_beam/runners/dataflow/internal/apiclient.py", line 523, in submit_job_description
    response = self._client.projects_locations_jobs.Create(request)
  File "/home/color/gcp-variant-transforms/local/local/lib/python2.7/site-packages/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py", line 643, in Create
    config, request, global_params=global_params)
  File "/home/color/gcp-variant-transforms/local/local/lib/python2.7/site-packages/apitools/base/py/base_api.py", line 722, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "/home/color/gcp-variant-transforms/local/local/lib/python2.7/site-packages/apitools/base/py/base_api.py", line 728, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "/home/color/gcp-variant-transforms/local/local/lib/python2.7/site-packages/apitools/base/py/base_api.py", line 599, in __ProcessHttpResponse
    http_response, method_config=method_config, request=request)
apitools.base.py.exceptions.HttpConflictError: HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/analytics-color-prod/locations/us-central1/jobs?alt=json>: response: <{'status': '409', 'content-length': '302', 'x-xss-protection': '1; mode=block', 'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'vary': 'Origin, X-Origin, Referer', 'server': 'ESF', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Tue, 12 Jun 2018 21:22:19 GMT', 'x-frame-options': 'SAMEORIGIN', 'alt-svc': 'quic=":443"; ma=2592000; v="43,42,41,39,35"', 'content-type': 'application/json; charset=UTF-8'}>, content <{
  "error": {
    "code": 409,
    "message": "(fda40ffa4ddce383): The workflow could not be created. Causes: (c73c90de56636af5): There is already an active job named merge-vcf-headers. If you want to submit a second job, try again by setting a different name.",
    "status": "ALREADY_EXISTS"
  }
}
Thanks in advance. I'm loving 0.4.0, by the way; thanks for all the new features!