🎉 Source and Destination Snowflake: Add jdbc_url_params support for optional JDBC parameters #9623
Conversation
…nowflake and Destination Snowflake
/test connector=source-snowflake |
Hi @noahkawasakigoogle, thank you very much for this contribution!
I made minor suggestions, let me know what you think.
Could you also please bump the version of the connectors you edited:
- In the Dockerfiles
- In
airbyte-config/init/src/main/resources/seed/source_definitions.yaml
- In
airbyte-config/init/src/main/resources/seed/destination_definitions.yaml
- In
airbyte-config/init/src/main/resources/config/STANDARD_SOURCE_DEFINITION/e2d65910-8c8b-40a1-ae7d-ee2416b2bfa2.json
- In
airbyte-config/init/src/main/resources/config/STANDARD_DESTINATION_DEFINITION/424892c4-daac-4491-b35d-c6688ba547ba.json
I'll run the integration test in our CI and send you the results.
Thanks for reviewing! For some reason I thought the rest of the Docker tag bumps happened in some automatic workflow; just made them. Also removed the LICENSE file in the docs directory; will open a separate PR for that. I had some responses to the JDBC nits, let me know what you think.
I confirm acceptance tests pass for the source connector. I'll keep you updated about the destination as the run is longer.
Acceptance tests passed for both connectors. @misteryeo / @sherifnada, could you please give a final review, since the spec.json was changed?
Can we make sure to approve and merge that one before this PR, if approved? I am only supposed to contribute to non-Elastic-licensed files.
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
To make it possible to enter a value like `query_tag={'TOOL':'AIRBYTE','CUSTID':'1001'}`, could URL encoding be added too? An example (from the attached file):

```java
import java.io.UnsupportedEncodingException;
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

// ...
if (config.has("jdbc_url_params")) {
  jdbcUrl.append(encodeValue(config.get("jdbc_url_params").asText()));
}
// ...

// Percent-encode the whole parameter string, then restore the '=' and '&'
// separators so the key=value&key=value structure is preserved.
private static String encodeValue(final String value) {
  try {
    return URLEncoder.encode(value, StandardCharsets.UTF_8.toString())
        .replace("%3D", "=")
        .replace("%26", "&");
  } catch (final UnsupportedEncodingException ex) {
    throw new RuntimeException(ex);
  }
}
```
Hey @noahkawasakigoogle! Thanks for creating the PR. It's actually my first time seeing the epic linked to this PR, so I asked a bunch of questions on the main issue, #9501. Apologies for being a little behind you on this. Feel free to chime in on those questions as well.
@misteryeo do you have thoughts on the UX questions in that issue?
@sherifnada No problem! I replied with some thoughts.

@pjleiwa6g Hm, yeah, this could happen. Do you think it would be possible to just put the URL-encoded characters directly in the connection, though? Snowflake documents this example:

My concern is just that Airbyte doing string manipulation behind the scenes makes it more vague to end users what's going on, and that there could be more and more characters to add search-and-replace logic for. This approach is more transparent: what you put in will get sent to Snowflake. I'm open to either choice, though. Also, to answer your question on the other ticket: QUERY_TAG works here; I tested that one specifically. Any of the Snowflake parameters can also be set this way.
@noahkawasakigoogle I think you are right that it makes it more vague for end users. I tested it by entering an encoded string and it works fine. You can forget my comment.
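For anyone following along, the "encode it yourself" approach settled on above can be sketched like this; a minimal illustration using only the JDK's `URLEncoder`, with the hypothetical `query_tag` value from the discussion, not connector code:

```java
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

public class EncodeParamDemo {
    public static void main(String[] args) {
        // Hypothetical QUERY_TAG value from the discussion above.
        String raw = "{'TOOL':'AIRBYTE','CUSTID':'1001'}";
        // Encode only the value, then assemble the key=value pair yourself,
        // so the '=' separator stays intact.
        String encoded = URLEncoder.encode(raw, StandardCharsets.UTF_8);
        System.out.println("query_tag=" + encoded);
        // → query_tag=%7B%27TOOL%27%3A%27AIRBYTE%27%2C%27CUSTID%27%3A%271001%27%7D
    }
}
```

The resulting string can then be pasted directly into the `jdbc_url_params` field, with no search-and-replace logic needed on the connector side.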
Just added one comment but otherwise LGTM!
Yes please!
On Wed, Jan 26, 2022 at 3:49 PM, Noah Kawasaki wrote:

> In airbyte-integrations/connectors/destination-snowflake/src/main/resources/spec.json:
>
>     "jdbc_url_params": {
>       "description": "Additional properties to pass to the jdbc url string when connecting to the database formatted as 'key=value' pairs separated by the symbol '&'. (example: key1=value1&key2=value2&key3=value3).",
>
> Hey @misteryeo, @alafanechere said the same thing here and I replied; would you prefer me to just update the capitalization everywhere?
I see this in CI:

I'm assuming this is expected from my fork since this hasn't merged?
Yes exactly, I was running final tests after your latest commit on a side branch. They are passing, so I'm going to publish the connectors now.
Thank you @noahkawasakigoogle for this contribution! I re-bumped the version myself because of some conflicts with the destination connectors. |
What

Add support for the Snowflake Source and Destination connectors to take a "JDBC URL Params" input specifying any additional JDBC parameters for the connection.

Solves Snowflake for #9501

Eventually I would like to have this for all connectors that support it: #9598

How

Add a `jdbc_url_params` field for Snowflake in the config JsonNode, and optionally append it to the JDBC string if it exists.

Also, I'm unsure how I am supposed to run the integration tests myself without the secrets. Is it expected that I should create my own personal Snowflake account for this? Or is there an assumption that I am the original Snowflake connector developer and already have these credentials?
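The "optionally append" step described above can be sketched roughly as follows. This is a simplified illustration, not the connector's actual code: a plain `Map` stands in for the Jackson `JsonNode` config, and the URL prefix and host are hypothetical.

```java
import java.util.Map;

public class JdbcUrlSketch {
    // Simplified stand-in: the real connector reads a Jackson JsonNode config;
    // here a Map plays that role for illustration.
    static String buildJdbcUrl(String host, Map<String, String> config) {
        StringBuilder jdbcUrl = new StringBuilder("jdbc:snowflake://")
            .append(host).append("/?");
        // ...required connection parameters would be appended here...
        String extra = config.get("jdbc_url_params");
        if (extra != null && !extra.isBlank()) {
            // Append the user-supplied key=value&key=value string verbatim,
            // keeping the behavior transparent to end users.
            jdbcUrl.append(extra);
        }
        return jdbcUrl.toString();
    }

    public static void main(String[] args) {
        System.out.println(buildJdbcUrl("acct.snowflakecomputing.com",
            Map.of("jdbc_url_params", "key1=value1&key2=value2")));
        // → jdbc:snowflake://acct.snowflakecomputing.com/?key1=value1&key2=value2
    }
}
```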
Recommended reading order
Very simple, any file
🚨 User Impact 🚨

Pre-merge Checklist

Updating a connector

Community member or Airbyter

- Secrets in the connector's spec are annotated with `airbyte_secret`
- Integration tests passing (`./gradlew :airbyte-integrations:connectors:<name>:integrationTest`)
- Connector's `README.md` and `bootstrap.md` updated. See description and examples
- `docs/integrations/<source or destination>/<name>.md` updated, including changelog. See changelog example

Airbyter

If this is a community PR, the Airbyte engineer reviewing this PR is responsible for the below items.

- `/test connector=connectors/<name>` command is passing
- Connector published with the `/publish` command described here