
Source Asana: Sync token invalid or too old #33842

Open · 1 task
Valdri opened this issue Dec 29, 2023 · 0 comments

Connector Name

Asana

Connector Version

0.6.1

What step did the error happen in?

During the sync

Relevant information

Failure reason: Sync token invalid or too old. If you are attempting to keep resources in sync, you must fetch the full dataset for this query now and use the new sync token for the next sync.
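For context, this is the documented behavior of Asana's Events API: an expired or invalid `sync` token produces a 412 Precondition Failed, but the error body still carries a fresh `sync` token (visible in the log output below as `"sync":"5a888cbba4065641bffe1b44dced2fcf:0"`). A caller is expected to do one full re-read of the resource and then resume incremental reads with the new token. The sketch below illustrates that contract with a hypothetical helper (`handle_events_response` is not part of the connector; it only models the 412-vs-200 branching):

```python
def handle_events_response(status_code: int, body: dict):
    """Interpret an Asana GET /events response body.

    Returns (events, sync_token, needs_full_refetch). On a 412 the
    stored token has expired; Asana's error body still includes a
    fresh "sync" token, so the caller should persist it, re-read the
    full dataset once, and only then resume incremental event reads.
    """
    if status_code == 412:
        # No events are returned with a 412 -- only the replacement token.
        return [], body.get("sync"), True
    # Normal 200 response: events in "data", next token in "sync".
    return body.get("data", []), body.get("sync"), False
```

A connector that silently discards the expired token (instead of storing the replacement and triggering a full refresh of the `events` stream) will fail every subsequent attempt with this same 412, which matches the repeated-retry pattern in the log below.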

Relevant log output

2023-12-29 15:26:30 source > SourceAsana runtimes:
Syncing stream attachments 0:00:04.471807
Syncing stream attachments_compact 0:03:06.216297
Syncing stream custom_fields 0:00:01.028436
2023-12-29 15:26:30 source > {"errors":[{"message":"Sync token invalid or too old. If you are attempting to keep resources in sync, you must fetch the full dataset for this query now and use the new sync token for the next sync."}],"sync":"5a888cbba4065641bffe1b44dced2fcf:0"}
2023-12-29 15:26:30 source > Encountered an exception while reading stream events
Traceback (most recent call last):
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/abstract_source.py", line 116, in read
    stream_is_available, reason = stream_instance.check_availability(logger, self)
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/core.py", line 263, in check_availability
    return self.availability_strategy.check_availability(self, logger, source)
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/availability_strategy.py", line 56, in check_availability
    is_available, reason = self.handle_http_error(stream, logger, source, error)
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/availability_strategy.py", line 85, in handle_http_error
    raise error
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/availability_strategy.py", line 50, in check_availability
    get_first_record_for_slice(stream, stream_slice)
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/utils/stream_helper.py", line 40, in get_first_record_for_slice
    return next(records_for_slice)
  File "/airbyte/integration_code/source_asana/streams.py", line 204, in read_records
    yield from super().read_records(*args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/http.py", line 482, in read_records
    yield from self._read_pages(
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/http.py", line 498, in _read_pages
    request, response = self._fetch_next_page(stream_slice, stream_state, next_page_token)
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/http.py", line 524, in _fetch_next_page
    response = self._send_request(request, request_kwargs)
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/http.py", line 422, in _send_request
    return backoff_handler(user_backoff_handler)(request, request_kwargs)
  File "/usr/local/lib/python3.9/site-packages/backoff/_sync.py", line 105, in retry
    ret = target(*args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/backoff/_sync.py", line 105, in retry
    ret = target(*args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/http.py", line 381, in _send
    raise exc
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/http.py", line 378, in _send
    response.raise_for_status()
  File "/usr/local/lib/python3.9/site-packages/requests/models.py", line 1021, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 412 Client Error: Precondition Failed for url: https://app.asana.com/api/1.0/events?limit=100&opt_fields=user.gid%2Cresource.gid%2Ctype%2Caction%2Cparent.gid%2Ccreated_at%2Cchange.gid&resource=1203897763724806
2023-12-29 15:26:30 source > Marking stream events as STOPPED
2023-12-29 15:26:31 replication-orchestrator > Unable to update stream status for event ReplicationAirbyteMessageEvent(airbyteMessageOrigin=SOURCE, airbyteMessage=io.airbyte.protocol.models.AirbyteMessage@8e23ff9[type=TRACE,log=<null>,spec=<null>,connectionStatus=<null>,catalog=<null>,record=<null>,state=<null>,trace=io.airbyte.protocol.models.AirbyteTraceMessage@38c3cc4a[type=STREAM_STATUS,emittedAt=1.70386359099763E12,error=<null>,estimate=<null>,streamStatus=io.airbyte.protocol.models.AirbyteStreamStatusTraceMessage@2abc6571[streamDescriptor=io.airbyte.protocol.models.StreamDescriptor@606aeaa0[name=events,namespace=<null>,additionalProperties={}],status=INCOMPLETE,additionalProperties={}],analytics=<null>,additionalProperties={}],control=<null>,additionalProperties={}], replicationContext=ReplicationContext[isReset=false, connectionId=7bebcf18-8c6e-4302-adf8-f293f99a31cc, sourceId=cedcd770-a135-40d3-9102-c98e92d3aebd, destinationId=01aa19cd-16da-42af-9164-152c50d9057c, jobId=7055518, attempt=4, workspaceId=1e504d10-d6ae-4a12-a42e-14b665369fc5], incompleteRunCause=null)
io.airbyte.workers.internal.exception.StreamStatusException: Invalid stream status transition to INCOMPLETE (origin = SOURCE, context = ReplicationContext[isReset=false, connectionId=7bebcf18-8c6e-4302-adf8-f293f99a31cc, sourceId=cedcd770-a135-40d3-9102-c98e92d3aebd, destinationId=01aa19cd-16da-42af-9164-152c50d9057c, jobId=7055518, attempt=4, workspaceId=1e504d10-d6ae-4a12-a42e-14b665369fc5], stream = null:events)
	at io.airbyte.workers.internal.bookkeeping.StreamStatusTracker.handleStreamIncomplete-zkXUZaI(StreamStatusTracker.kt:249) ~[io.airbyte-airbyte-commons-worker-dev-f5eb5d18e5.jar:?]
	at io.airbyte.workers.internal.bookkeeping.StreamStatusTracker.handleStreamStatus(StreamStatusTracker.kt:107) ~[io.airbyte-airbyte-commons-worker-dev-f5eb5d18e5.jar:?]
	at io.airbyte.workers.internal.bookkeeping.StreamStatusTracker.track(StreamStatusTracker.kt:60) ~[io.airbyte-airbyte-commons-worker-dev-f5eb5d18e5.jar:?]
	at io.airbyte.workers.internal.bookkeeping.events.AirbyteStreamStatusMessageEventListener.onApplicationEvent(AirbyteStreamStatusMessageEventListener.kt:18) ~[io.airbyte-airbyte-commons-worker-dev-f5eb5d18e5.jar:?]
	at io.airbyte.workers.internal.bookkeeping.events.AirbyteStreamStatusMessageEventListener.onApplicationEvent(AirbyteStreamStatusMessageEventListener.kt:15) ~[io.airbyte-airbyte-commons-worker-dev-f5eb5d18e5.jar:?]
	at io.micronaut.context.event.ApplicationEventPublisherFactory.notifyEventListeners(ApplicationEventPublisherFactory.java:262) ~[micronaut-inject-3.10.1.jar:3.10.1]
	at io.micronaut.context.event.ApplicationEventPublisherFactory.access$200(ApplicationEventPublisherFactory.java:60) ~[micronaut-inject-3.10.1.jar:3.10.1]
	at io.micronaut.context.event.ApplicationEventPublisherFactory$2.publishEvent(ApplicationEventPublisherFactory.java:229) ~[micronaut-inject-3.10.1.jar:3.10.1]
	at io.airbyte.workers.internal.bookkeeping.events.ReplicationAirbyteMessageEventPublishingHelper.publishStatusEvent(ReplicationAirbyteMessageEventPublishingHelper.kt:67) ~[io.airbyte-airbyte-commons-worker-dev-f5eb5d18e5.jar:?]
	at io.airbyte.workers.general.ReplicationWorkerHelper.internalProcessMessageFromSource(ReplicationWorkerHelper.kt:334) ~[io.airbyte-airbyte-commons-worker-dev-f5eb5d18e5.jar:?]
	at io.airbyte.workers.general.ReplicationWorkerHelper.processMessageFromSource(ReplicationWorkerHelper.kt:400) ~[io.airbyte-airbyte-commons-worker-dev-f5eb5d18e5.jar:?]
	at io.airbyte.workers.general.DefaultReplicationWorker.lambda$readFromSrcAndWriteToDstRunnable$8(DefaultReplicationWorker.java:350) ~[io.airbyte-airbyte-commons-worker-dev-f5eb5d18e5.jar:?]
	at java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804) ~[?:?]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) ~[?:?]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ~[?:?]
	at java.lang.Thread.run(Thread.java:1583) ~[?:?]
2023-12-29 15:26:31 source > Finished syncing events
2023-12-29 15:26:31 source > SourceAsana runtimes:
Syncing stream attachments 0:00:04.471807
Syncing stream attachments_compact 0:03:06.216297
Syncing stream custom_fields 0:00:01.028436
Syncing stream events 0:00:00.619239
2023-12-29 15:26:31 source > 412 Client Error: Precondition Failed for url: https://app.asana.com/api/1.0/events?limit=100&opt_fields=user.gid%2Cresource.gid%2Ctype%2Caction%2Cparent.gid%2Ccreated_at%2Cchange.gid&resource=1203897763724806
Traceback (most recent call last):
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/abstract_source.py", line 116, in read
    stream_is_available, reason = stream_instance.check_availability(logger, self)
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/core.py", line 263, in check_availability
    return self.availability_strategy.check_availability(self, logger, source)
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/availability_strategy.py", line 56, in check_availability
    is_available, reason = self.handle_http_error(stream, logger, source, error)
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/availability_strategy.py", line 85, in handle_http_error
    raise error
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/availability_strategy.py", line 50, in check_availability
    get_first_record_for_slice(stream, stream_slice)
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/utils/stream_helper.py", line 40, in get_first_record_for_slice
    return next(records_for_slice)
  File "/airbyte/integration_code/source_asana/streams.py", line 204, in read_records
    yield from super().read_records(*args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/http.py", line 482, in read_records
    yield from self._read_pages(
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/http.py", line 498, in _read_pages
    request, response = self._fetch_next_page(stream_slice, stream_state, next_page_token)
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/http.py", line 524, in _fetch_next_page
    response = self._send_request(request, request_kwargs)
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/http.py", line 422, in _send_request
    return backoff_handler(user_backoff_handler)(request, request_kwargs)
  File "/usr/local/lib/python3.9/site-packages/backoff/_sync.py", line 105, in retry
    ret = target(*args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/backoff/_sync.py", line 105, in retry
    ret = target(*args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/http.py", line 381, in _send
    raise exc
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/http.py", line 378, in _send
    response.raise_for_status()
  File "/usr/local/lib/python3.9/site-packages/requests/models.py", line 1021, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 412 Client Error: Precondition Failed for url: https://app.asana.com/api/1.0/events?limit=100&opt_fields=user.gid%2Cresource.gid%2Ctype%2Caction%2Cparent.gid%2Ccreated_at%2Cchange.gid&resource=1203897763724806

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/airbyte/integration_code/main.py", line 13, in <module>
    launch(source, sys.argv[1:])
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/entrypoint.py", line 209, in launch
    for message in source_entrypoint.run(parsed_args):
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/entrypoint.py", line 116, in run
    yield from map(AirbyteEntrypoint.airbyte_message_to_string, self.read(source_spec, config, config_catalog, state))
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/entrypoint.py", line 158, in read
    yield from self.source.read(self.logger, config, catalog, state)
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/abstract_source.py", line 141, in read
    raise AirbyteTracedException.from_exception(e, message=display_message) from e
airbyte_cdk.utils.traced_exception.AirbyteTracedException: 412 Client Error: Precondition Failed for url: https://app.asana.com/api/1.0/events?limit=100&opt_fields=user.gid%2Cresource.gid%2Ctype%2Caction%2Cparent.gid%2Ccreated_at%2Cchange.gid&resource=1203897763724806
2023-12-29 15:26:31 replication-orchestrator > Source has no more messages, closing connection.
2023-12-29 15:26:31 replication-orchestrator > (pod: jobs / source-asana-read-7055518-4-wtqrx) - Closed all resources for pod
2023-12-29 15:26:31 replication-orchestrator > Attempt 0 to update stream status incomplete null:attachments_compact
2023-12-29 15:26:31 replication-orchestrator > Attempt 0 to update stream status incomplete null:custom_fields
2023-12-29 15:26:31 replication-orchestrator > Attempt 0 to update stream status incomplete null:attachments
2023-12-29 15:26:32 destination > INFO i.a.i.b.FailureTrackingAirbyteMessageConsumer(close):80 Airbyte message consumer: succeeded.
2023-12-29 15:26:32 destination > INFO i.a.i.d.m.MongodbRecordConsumer(close):90 Migration finished with no explicit errors. Copying data from tmp tables to permanent
2023-12-29 15:26:33 destination > null
2023-12-29 15:26:33 destination > INFO i.a.i.d.m.MongodbRecordConsumer(close):107 Removing tmp collections...
2023-12-29 15:26:34 destination > INFO i.a.i.d.m.MongodbRecordConsumer(close):110 Finishing destination process...completed
2023-12-29 15:26:34 destination > INFO i.a.i.b.IntegrationRunner(runInternal):197 Completed integration: io.airbyte.integrations.destination.mongodb.MongodbDestinationStrictEncrypt
2023-12-29 15:26:34 destination > INFO i.a.i.d.m.MongodbDestinationStrictEncrypt(main):54 completed destination: class io.airbyte.integrations.destination.mongodb.MongodbDestinationStrictEncrypt
2023-12-29 15:26:34 replication-orchestrator > (pod: jobs / destination-mongodb-strict-encrypt-write-7055518-4-crner) - Closed all resources for pod
2023-12-29 15:26:34 replication-orchestrator > thread status... timeout thread: false , replication thread: true
2023-12-29 15:26:34 replication-orchestrator > Sync worker failed.
java.util.concurrent.ExecutionException: io.airbyte.workers.internal.exception.SourceException: Source didn't exit properly - check the logs!
	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:396) ~[?:?]
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:2073) ~[?:?]
	at io.airbyte.workers.general.DefaultReplicationWorker.replicate(DefaultReplicationWorker.java:213) ~[io.airbyte-airbyte-commons-worker-dev-f5eb5d18e5.jar:?]
	at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:143) ~[io.airbyte-airbyte-commons-worker-dev-f5eb5d18e5.jar:?]
	at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:63) ~[io.airbyte-airbyte-commons-worker-dev-f5eb5d18e5.jar:?]
	at io.airbyte.container_orchestrator.orchestrator.ReplicationJobOrchestrator.runJob(ReplicationJobOrchestrator.java:126) ~[io.airbyte-airbyte-container-orchestrator-dev-f5eb5d18e5.jar:?]
	at io.airbyte.container_orchestrator.Application.run(Application.java:78) ~[io.airbyte-airbyte-container-orchestrator-dev-f5eb5d18e5.jar:?]
	at io.airbyte.container_orchestrator.Application.main(Application.java:38) ~[io.airbyte-airbyte-container-orchestrator-dev-f5eb5d18e5.jar:?]
	Suppressed: io.airbyte.workers.exception.WorkerException: Source process exit with code 1. This warning is normal if the job was cancelled.
		at io.airbyte.workers.internal.DefaultAirbyteSource.close(DefaultAirbyteSource.java:158) ~[io.airbyte-airbyte-commons-worker-dev-f5eb5d18e5.jar:?]
		at io.airbyte.workers.general.DefaultReplicationWorker.replicate(DefaultReplicationWorker.java:161) ~[io.airbyte-airbyte-commons-worker-dev-f5eb5d18e5.jar:?]
		at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:143) ~[io.airbyte-airbyte-commons-worker-dev-f5eb5d18e5.jar:?]
		at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:63) ~[io.airbyte-airbyte-commons-worker-dev-f5eb5d18e5.jar:?]
		at io.airbyte.container_orchestrator.orchestrator.ReplicationJobOrchestrator.runJob(ReplicationJobOrchestrator.java:126) ~[io.airbyte-airbyte-container-orchestrator-dev-f5eb5d18e5.jar:?]
		at io.airbyte.container_orchestrator.Application.run(Application.java:78) ~[io.airbyte-airbyte-container-orchestrator-dev-f5eb5d18e5.jar:?]
		at io.airbyte.container_orchestrator.Application.main(Application.java:38) ~[io.airbyte-airbyte-container-orchestrator-dev-f5eb5d18e5.jar:?]
Caused by: io.airbyte.workers.internal.exception.SourceException: Source didn't exit properly - check the logs!
	at io.airbyte.workers.general.DefaultReplicationWorker.lambda$readFromSrcAndWriteToDstRunnable$8(DefaultReplicationWorker.java:367) ~[io.airbyte-airbyte-commons-worker-dev-f5eb5d18e5.jar:?]
	at java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804) ~[?:?]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) ~[?:?]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ~[?:?]
	at java.lang.Thread.run(Thread.java:1583) ~[?:?]
Caused by: io.airbyte.workers.exception.WorkerException: Source process exit with code 1. This warning is normal if the job was cancelled.
	at io.airbyte.workers.internal.DefaultAirbyteSource.close(DefaultAirbyteSource.java:158) ~[io.airbyte-airbyte-commons-worker-dev-f5eb5d18e5.jar:?]
	at io.airbyte.workers.general.DefaultReplicationWorker.lambda$readFromSrcAndWriteToDstRunnable$8(DefaultReplicationWorker.java:365) ~[io.airbyte-airbyte-commons-worker-dev-f5eb5d18e5.jar:?]
	at java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804) ~[?:?]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) ~[?:?]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ~[?:?]
	at java.lang.Thread.run(Thread.java:1583) ~[?:?]
2023-12-29 15:26:34 replication-orchestrator > sync summary: {
  "status" : "failed",
  "recordsSynced" : 0,
  "bytesSynced" : 0,
  "startTime" : 1703863387814,
  "endTime" : 1703863594850,
  "totalStats" : {
    "bytesCommitted" : 0,
    "bytesEmitted" : 29535,
    "destinationStateMessagesEmitted" : 0,
    "destinationWriteEndTime" : 1703863594498,
    "destinationWriteStartTime" : 1703863387824,
    "meanSecondsBeforeSourceStateMessageEmitted" : 0,
    "maxSecondsBeforeSourceStateMessageEmitted" : 0,
    "maxSecondsBetweenStateMessageEmittedandCommitted" : 0,
    "meanSecondsBetweenStateMessageEmittedandCommitted" : 0,
    "recordsEmitted" : 16,
    "recordsCommitted" : 0,
    "replicationEndTime" : 0,
    "replicationStartTime" : 1703863387814,
    "sourceReadEndTime" : 0,
    "sourceReadStartTime" : 1703863394436,
    "sourceStateMessagesEmitted" : 0
  },
  "streamStats" : [ {
    "streamName" : "attachments",
    "stats" : {
      "bytesCommitted" : 0,
      "bytesEmitted" : 28636,
      "recordsEmitted" : 8,
      "recordsCommitted" : 0
    }
  }, {
    "streamName" : "attachments_compact",
    "stats" : {
      "bytesCommitted" : 0,
      "bytesEmitted" : 899,
      "recordsEmitted" : 8,
      "recordsCommitted" : 0
    }
  } ]
}
2023-12-29 15:26:34 replication-orchestrator > failures: [ {
  "failureOrigin" : "source",
  "failureType" : "system_error",
  "internalMessage" : "412 Client Error: Precondition Failed for url: https://app.asana.com/api/1.0/events?limit=100&opt_fields=user.gid%2Cresource.gid%2Ctype%2Caction%2Cparent.gid%2Ccreated_at%2Cchange.gid&resource=1203897763724806",
  "externalMessage" : "Sync token invalid or too old. If you are attempting to keep resources in sync, you must fetch the full dataset for this query now and use the new sync token for the next sync.",
  "metadata" : {
    "attemptNumber" : 4,
    "jobId" : 7055518,
    "from_trace_message" : true,
    "connector_command" : "read"
  },
  "stacktrace" : "Traceback (most recent call last):\n  File \"/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/abstract_source.py\", line 116, in read\n    stream_is_available, reason = stream_instance.check_availability(logger, self)\n  File \"/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/core.py\", line 263, in check_availability\n    return self.availability_strategy.check_availability(self, logger, source)\n  File \"/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/availability_strategy.py\", line 56, in check_availability\n    is_available, reason = self.handle_http_error(stream, logger, source, error)\n  File \"/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/availability_strategy.py\", line 85, in handle_http_error\n    raise error\n  File \"/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/availability_strategy.py\", line 50, in check_availability\n    get_first_record_for_slice(stream, stream_slice)\n  File \"/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/utils/stream_helper.py\", line 40, in get_first_record_for_slice\n    return next(records_for_slice)\n  File \"/airbyte/integration_code/source_asana/streams.py\", line 204, in read_records\n    yield from super().read_records(*args, **kwargs)\n  File \"/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/http.py\", line 482, in read_records\n    yield from self._read_pages(\n  File \"/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/http.py\", line 498, in _read_pages\n    request, response = self._fetch_next_page(stream_slice, stream_state, next_page_token)\n  File \"/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/http.py\", line 524, in _fetch_next_page\n    response = self._send_request(request, request_kwargs)\n  File \"/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/http.py\", line 422, in 
_send_request\n    return backoff_handler(user_backoff_handler)(request, request_kwargs)\n  File \"/usr/local/lib/python3.9/site-packages/backoff/_sync.py\", line 105, in retry\n    ret = target(*args, **kwargs)\n  File \"/usr/local/lib/python3.9/site-packages/backoff/_sync.py\", line 105, in retry\n    ret = target(*args, **kwargs)\n  File \"/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/http.py\", line 381, in _send\n    raise exc\n  File \"/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/http.py\", line 378, in _send\n    response.raise_for_status()\n  File \"/usr/local/lib/python3.9/site-packages/requests/models.py\", line 1021, in raise_for_status\n    raise HTTPError(http_error_msg, response=self)\nrequests.exceptions.HTTPError: 412 Client Error: Precondition Failed for url: https://app.asana.com/api/1.0/events?limit=100&opt_fields=user.gid%2Cresource.gid%2Ctype%2Caction%2Cparent.gid%2Ccreated_at%2Cchange.gid&resource=1203897763724806\n",
  "timestamp" : 1703863590999
}, {
  "failureOrigin" : "source",
  "internalMessage" : "Source didn't exit properly - check the logs!",
  "externalMessage" : "Something went wrong within the source connector",
  "metadata" : {
    "attemptNumber" : 4,
    "jobId" : 7055518,
    "connector_command" : "read"
  },
  "stacktrace" : "io.airbyte.workers.internal.exception.SourceException: Source didn't exit properly - check the logs!\n\tat io.airbyte.workers.general.DefaultReplicationWorker.lambda$readFromSrcAndWriteToDstRunnable$8(DefaultReplicationWorker.java:367)\n\tat java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)\n\tat java.base/java.lang.Thread.run(Thread.java:1583)\nCaused by: io.airbyte.workers.exception.WorkerException: Source process exit with code 1. This warning is normal if the job was cancelled.\n\tat io.airbyte.workers.internal.DefaultAirbyteSource.close(DefaultAirbyteSource.java:158)\n\tat io.airbyte.workers.general.DefaultReplicationWorker.lambda$readFromSrcAndWriteToDstRunnable$8(DefaultReplicationWorker.java:365)\n\t... 4 more\n",
  "timestamp" : 1703863591489
} ]
2023-12-29 15:26:34 replication-orchestrator > Returning output...
2023-12-29 15:26:34 replication-orchestrator > 
2023-12-29 15:26:34 replication-orchestrator > ----- END REPLICATION -----
2023-12-29 15:26:34 replication-orchestrator > 
2023-12-29 15:26:34 replication-orchestrator > Writing async status SUCCEEDED for KubePodInfo[namespace=jobs, name=orchestrator-repl-job-7055518-attempt-4, mainContainerInfo=KubeContainerInfo[image=airbyte/container-orchestrator:dev-f5eb5d18e5, pullPolicy=IfNotPresent]]...
2023-12-29 15:26:35 INFO i.a.a.SegmentAnalyticsClient(close):226 - Closing Segment analytics client...
2023-12-29 15:26:35 INFO i.a.a.BlockingShutdownAnalyticsPlugin(waitForFlush):281 - Waiting for Segment analytic client to flush enqueued messages...
2023-12-29 15:26:35 INFO i.a.a.BlockingShutdownAnalyticsPlugin(waitForFlush):293 - Segment analytic client flush complete.
2023-12-29 15:26:35 INFO i.a.a.SegmentAnalyticsClient(close):230 - Segment analytics client closed.  No new events will be accepted.
2023-12-29 15:26:37 platform > State Store reports orchestrator pod orchestrator-repl-job-7055518-attempt-4 succeeded
2023-12-29 15:26:39 platform > Retry State: RetryManager(completeFailureBackoffPolicy=BackoffPolicy(minInterval=PT10S, maxInterval=PT30M, base=3), partialFailureBackoffPolicy=null, successiveCompleteFailureLimit=5, totalCompleteFailureLimit=5, successivePartialFailureLimit=1000, totalPartialFailureLimit=10, successiveCompleteFailures=5, totalCompleteFailures=5, successivePartialFailures=0, totalPartialFailures=0)
 Backoff before next attempt: 13 minutes 30 seconds
2023-12-29 15:26:39 platform > Failing job: 7055518, reason: Job failed after too many retries for connection 7bebcf18-8c6e-4302-adf8-f293f99a31cc

Contribute

  • Yes, I want to contribute