[BEAM-582] Allow usage of the new GCP service account JSON key #879
alexvanboxel wants to merge 1 commit into apache:master from alexvanboxel:feature/BEAM-582-json-gcp-service-account
Conversation
```
@@ -137,13 +138,27 @@ public static Credential getCredential(GcpOptions options)
private static Credential getCredentialFromFile(
    String keyFile, String accountId, Collection<String> scopes)
```
Mark keyFile as @Nullable based on your change above.
accountId is @Nullable depending on the type of key (keyFile is not @Nullable in getCredentialFromFile).
Thanks, this is interesting!

That's a fast review. Will fix it tomorrow.

I think the Travis build failure is unrelated. As soon as it's approved I'll submit a back-port to Cloud Dataflow.
```java
if (keyFile == null) {
  throw new IOException("If accountName is given, also supply a keyFile");
}
```
Hi Alex, I'm clearly missing something here. It looks like from the code at L141 that keyFile MUST be non-null to enter this function. But the change you made at L111 makes it okay to call this function if keyFile is null.
Should L111 maybe just be if (keyFile != null) ? Then we can remove @Nullable from the parameter here.
Note that all these options are given via the command line, so we can expect nulls (and we want to check all the permutations in this function). What is valid:
- P12 keyFile + accountName
- JSON keyFile: NO accountName

The rest is invalid. (I've added all the error cases to the test.)
So the scenario we're handling here is: you have given an accountName, but an accountName alone is not enough; you also need a keyFile:
```java
@Test
public void testAccountNameWithoutKeyFile() throws Exception {
  GcpOptions options = PipelineOptionsFactory.as(GcpOptions.class);
  options.setServiceAccountName("test@gcloud");
  thrown.expect(IOException.class);
  thrown.expectMessage(new StringContains("also supply a keyFile"));
  GcpCredentialFactory.fromOptions(options).getCredential();
}
```
Hope it makes sense?!
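The valid and invalid keyFile/accountName permutations above can be sketched as a small predicate. This is a simplified model for illustration, not the actual Beam code, and the `.json` suffix check stands in for real key-type detection:

```java
// Hypothetical model of the permutation rules from the discussion:
//   valid:   P12 keyFile + accountName, or JSON keyFile with no accountName
//   invalid: everything else (including accountName without a keyFile)
public class KeyPermutations {
  static boolean isValid(String keyFile, String accountName) {
    if (keyFile == null) {
      // No key file: only valid if no accountName was given either
      // (the pipeline then falls back to default credentials).
      return accountName == null;
    }
    boolean isJson = keyFile.endsWith(".json"); // stand-in for key-type sniffing
    return isJson ? accountName == null : accountName != null;
  }

  public static void main(String[] args) {
    System.out.println(isValid("key.p12", "test@gcloud"));  // true
    System.out.println(isValid("key.json", null));          // true
    System.out.println(isValid("key.json", "test@gcloud")); // false
    System.out.println(isValid(null, "test@gcloud"));       // false
  }
}
```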
Okay, that makes sense.
For visual clarity, I'd prefer if we refactored something like this around L111:

```java
if (keyFile != null) {
  // get the credentials
} else if (accountName != null) {
  throw new IOException("If accountName is given, also supply a keyFile");
}
```

Then we can remove the @Nullable from the keyFile parameter to this function. I think it will make the code generally easier to read and understand.
That doesn't cover all scenarios. But I think I pushed something that covers all remarks:
In:
- getCredential(GcpOptions options) -> keyFile or accountName is given -> check at least that the keyFile is given (required for both JSON and P12)
- getCredentialFromFile -> check the JSON case (no accountName allowed) or the P12 case (accountName required)

Because keyFile is checked in getCredential, @Nullable could be removed from keyFile in getCredentialFromFile.
Global recap (because the comment on the changed code is hidden): getCredential(GcpOptions options) -> keyFile or accountName is given -> check at least that the keyFile is given (required for both JSON and P12). Because keyFile is checked in getCredential, @Nullable could be removed from keyFile in getCredentialFromFile.
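The two-method split described in the recap can be sketched as follows. This is a hypothetical sketch: the method names follow the discussion, the `.json` suffix check stands in for real key-type detection, the returned strings are placeholders for Credential objects, and the real Beam code differs in detail:

```java
import java.io.IOException;

public class CredentialCheckSplit {
  // getCredential: checks only that a keyFile accompanies any accountName,
  // so getCredentialFromFile can take a non-@Nullable keyFile.
  static String getCredential(String keyFile, String accountName) throws IOException {
    if (keyFile == null) {
      if (accountName != null) {
        throw new IOException("If accountName is given, also supply a keyFile");
      }
      return "application-default"; // placeholder for default credentials
    }
    return getCredentialFromFile(keyFile, accountName);
  }

  // getCredentialFromFile: keyFile is guaranteed non-null here; enforce the
  // JSON (no accountName) vs P12 (accountName required) rules.
  static String getCredentialFromFile(String keyFile, String accountName) throws IOException {
    if (keyFile.endsWith(".json")) { // stand-in for key-type detection
      if (accountName != null) {
        throw new IOException("accountName is not used with a JSON keyFile");
      }
      return "json-credential"; // placeholder
    }
    if (accountName == null) {
      throw new IOException("A P12 keyFile requires an accountName");
    }
    return "p12-credential"; // placeholder
  }
}
```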
gcloud-java-core would subsume this logic and more; opened an issue against gcloud-java-core for exposing default project/default auth: googleapis/google-cloud-java#1207
@lukecwik if you mean "default" auth, I'm thinking of the credentials set by the gcloud command on a local machine. My scenario is this: I'm running a lot of GCP services via Airflow (Dataproc, BigQuery, Storage and Dataflow). They are all running in a Kubernetes cluster using a service key generated via the console (a JSON file). The problem I have is that the containers don't have the gcloud command installed, so I need to pass the service-key (JSON) file as a parameter to the jar (I'm always packing everything in a single JAR), but currently there is only support for P12 (the old keyfile format). Hence the PR.
We already support using JSON files via default auth. Integrating with gcloud-java-core would add the benefit of allowing us to get the project id from the credentials when using service accounts, and not need to rely on users to specify it.
@lukecwik: thanks for clarifying. Learned something new today. I'm going to try playing with the env variable. But does this block this PR? Also remember that accountName != projectId. The accountName needs to be specified with P12 keys (not at all required with JSON files). Currently I don't have the bandwidth to dive into gcloud-java-core, so I'll probably abandon the PR for now and experiment with setting GOOGLE_APPLICATION_CREDENTIALS in the Apache Airflow DataFlowJavaOperator.
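The default-auth lookup mentioned above can be modeled roughly as follows. This is a hypothetical sketch, not the real implementation (the actual lookup lives in Google's auth libraries); taking the environment as a map is purely for testability:

```java
import java.io.File;
import java.util.Map;

// Rough model of how Application Default Credentials find an explicit JSON key:
// the GOOGLE_APPLICATION_CREDENTIALS environment variable points at the file.
public class DefaultAuthLookup {
  // Returns the key path if the variable is set and the file exists, else null
  // (in which case the real libraries fall through to other credential sources).
  static String resolveKeyPath(Map<String, String> env) {
    String path = env.get("GOOGLE_APPLICATION_CREDENTIALS");
    if (path == null || !new File(path).isFile()) {
      return null;
    }
    return path;
  }
}
```

In the container scenario described above, one would set GOOGLE_APPLICATION_CREDENTIALS to the mounted key file's path (a hypothetical path) before launching the bundled jar, so no gcloud install is needed.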
Yes, I understand that account name != project id. I was just saying that integrating with gcloud-java-core would also improve our experience, since they expose more methods for credentials.
R: @lukecwik (just updating metadata)
@alexvanboxel could you please close this PR?
Abandoning... |
Allow the usage of the more modern JSON service account key files on GCP. This is required for the seamless integration planned in Apache Airflow 1.8.