Destination Redshift: using S3 staging bucket path is mandatory #16325

Open
hellmund opened this issue Sep 5, 2022 · 0 comments
Labels
area/databases · community · connectors/destination/redshift · frozen (Not being actively worked on) · team/destinations (Destinations team's backlog) · type/bug (Something isn't working)

Comments


hellmund commented Sep 5, 2022

Environment

  • Airbyte version: 0.40.4
  • OS Version / Instance: Amazon Linux 2 AMI, EC2 t3a.large
  • Deployment: Docker
  • Source Connector and version: N/A
  • Destination Connector and version: Redshift 0.3.49
  • Step where error happened: Sync job, Setup new connection

Current Behavior

Since a recent update, the Redshift destination with S3 staging configured fails with an error. The cause appears to be that the "S3 Bucket Path (Optional)" field is now effectively mandatory. Entering a path resolved the error and got the connections working again.
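
For illustration, the error shown in the Logs section below can be reproduced directly against the AWS SDK for Java v1: passing an empty object key to doesObjectExist fails client-side validation before any request reaches S3. This is a minimal standalone sketch, not the connector's code; the class and bucket names are placeholders.

import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;

public class EmptyObjectNameRepro {
    public static void main(String[] args) {
        AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();
        // An empty object key is rejected by the SDK's client-side argument
        // validation (ValidationUtils.assertStringNotEmpty), matching the
        // "objectName cannot be empty" failure in the logs:
        s3.doesObjectExist("my-staging-bucket", "");
    }
}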

Expected Behavior

When the "S3 Bucket Path (Optional)" field is left empty, the expected behavior is that staging files are uploaded to the bucket's root directory, as stated in the field description.
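
As an illustration only (not the connector's actual code), a guard along these lines would let an empty path fall back to the bucket root instead of producing an empty object key; the helper name resolveStagingKey and the file names are hypothetical.

public class StagingPrefixSketch {
    // Hypothetical helper: a null or blank "S3 Bucket Path" maps to the
    // bucket root rather than yielding an empty key for doesObjectExist.
    static String resolveStagingKey(String bucketPath, String fileName) {
        if (bucketPath == null || bucketPath.isBlank()) {
            return fileName;                                  // bucket root
        }
        String prefix = bucketPath.endsWith("/") ? bucketPath : bucketPath + "/";
        return prefix + fileName;
    }

    public static void main(String[] args) {
        System.out.println(resolveStagingKey("", "test_object.txt"));         // test_object.txt
        System.out.println(resolveStagingKey("staging", "test_object.txt"));  // staging/test_object.txt
    }
}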

Logs

2022-09-05 08:22:46 INFO i.a.s.p.j.LoggingJobErrorReportingClient(reportJobFailureReason):23 - Report Job Error -> workspaceId: b03b3214-2dab-4949-b52e-5d5887257125, dockerImage: airbyte/destination-redshift:0.3.49, failureReason: io.airbyte.config.FailureReason@252f7b39[failureOrigin=destination,failureType=system_error,internalMessage=java.lang.IllegalArgumentException: objectName cannot be empty,externalMessage=Something went wrong in the connector. See the logs for more details.,metadata=io.airbyte.config.Metadata@628cc407[additionalProperties={attemptNumber=null, jobId=null, from_trace_message=true, connector_command=check}],stacktrace=java.lang.IllegalArgumentException: objectName cannot be empty
	at com.amazonaws.util.ValidationUtils.assertStringNotEmpty(ValidationUtils.java:89)
	at com.amazonaws.services.s3.AmazonS3Client.doesObjectExist(AmazonS3Client.java:1421)
	at io.airbyte.integrations.destination.s3.S3StorageOperations.createBucketObjectIfNotExists(S3StorageOperations.java:101)
	at io.airbyte.integrations.destination.s3.S3Destination.attemptWriteAndDeleteS3Object(S3Destination.java:149)
	at io.airbyte.integrations.destination.s3.S3Destination.attemptS3WriteAndDelete(S3Destination.java:139)
	at io.airbyte.integrations.destination.s3.S3Destination.attemptS3WriteAndDelete(S3Destination.java:129)
	at io.airbyte.integrations.destination.redshift.RedshiftStagingS3Destination.check(RedshiftStagingS3Destination.java:67)
	at io.airbyte.integrations.destination.jdbc.copy.SwitchingDestination.check(SwitchingDestination.java:56)
	at io.airbyte.integrations.base.IntegrationRunner.runInternal(IntegrationRunner.java:121)
	at io.airbyte.integrations.base.IntegrationRunner.run(IntegrationRunner.java:97)
	at io.airbyte.integrations.destination.redshift.RedshiftDestination.main(RedshiftDestination.java:64)
,retryable=<null>,timestamp=1662366165874], metadata: {workspace_id=b03b3214-2dab-4949-b52e-5d5887257125, airbyte_version=0.40.4, connector_definition_id=f7a7d195-377f-cf5b-70a5-be6b819019dc, failure_origin=destination, connector_repository=airbyte/destination-redshift, connector_release_stage=beta, job_id=2fa68ce9-49a1-4977-b04a-e6b8112618c5, workspace_url=http://localhost:8000/workspaces/b03b3214-2dab-4949-b52e-5d5887257125, failure_type=system_error, connector_command=check, connector_name=Redshift, deployment_mode=OSS}

Steps to Reproduce

  1. Have or create a Redshift destination
  2. Choose "S3 Staging" as Uploading Method
  3. Leave the "S3 Bucket Path (Optional)" field empty

Are you willing to submit a PR?

No

@hellmund hellmund added needs-triage type/bug Something isn't working labels Sep 5, 2022
@grishick grishick added the team/destinations Destinations team's backlog label Sep 27, 2022
@marcosmarxm marcosmarxm changed the title Redshift destination with S3 staging: bucket path mandatory Destination Redshift: using S3 staging bucket path is mandatory Nov 30, 2022
@bleonard bleonard added the frozen Not being actively worked on label Mar 22, 2024