
Error when running verification job with "command=start_gcs_import" #1

Open
omahjoub opened this issue Feb 22, 2018 · 2 comments

@omahjoub

Hi @datancoffee ,

First, I would like to thank you for this great example.
I'm trying to run the project on Google Cloud Platform, but I'm facing some issues.

Version : 0.6.4

I followed the README and everything worked until the "Run a verification job" section.

When I publish the message command=start_gcs_import to my Pub/Sub topic, I see that it is processed by the control pipeline, which tries to launch the indexer pipeline. That launch fails with this error:

exception: java.lang.NoSuchMethodError: org.apache.beam.sdk.common.runner.v1.RunnerApi$FunctionSpec$Builder.setPayload(Lcom/google/protobuf/ByteString;)Lorg/apache/beam/sdk/common/runner/v1/RunnerApi$FunctionSpec$Builder;
    at org.apache.beam.runners.dataflow.repackaged.org.apache.beam.runners.core.construction.WindowingStrategyTranslation.toProto(WindowingStrategyTranslation.java:224)
    at org.apache.beam.runners.dataflow.repackaged.org.apache.beam.runners.core.construction.WindowingStrategyTranslation.toProto(WindowingStrategyTranslation.java:299)
    at org.apache.beam.runners.dataflow.repackaged.org.apache.beam.runners.core.construction.WindowingStrategyTranslation.toProto(WindowingStrategyTranslation.java:285)
    at org.apache.beam.runners.dataflow.DataflowPipelineTranslator.serializeWindowingStrategy(DataflowPipelineTranslator.java:129)
    at org.apache.beam.runners.dataflow.DataflowPipelineTranslator.access$1500(DataflowPipelineTranslator.java:114)
    at org.apache.beam.runners.dataflow.DataflowPipelineTranslator$5.groupByKeyHelper(DataflowPipelineTranslator.java:806)
    at org.apache.beam.runners.dataflow.DataflowPipelineTranslator$5.translate(DataflowPipelineTranslator.java:784)
    at org.apache.beam.runners.dataflow.DataflowPipelineTranslator$5.translate(DataflowPipelineTranslator.java:781)
    at org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator.visitPrimitiveTransform(DataflowPipelineTranslator.java:442)
    at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:663)
    at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:655)
    at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:655)
    at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:655)
    at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:655)
    at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:655)
    at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:655)
    at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:655)
    at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$600(TransformHierarchy.java:311)
    at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:245)
    at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:446)
    at org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator.translate(DataflowPipelineTranslator.java:386)
    at org.apache.beam.runners.dataflow.DataflowPipelineTranslator.translate(DataflowPipelineTranslator.java:173)
    at org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:537)
    at org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:170)
    at org.apache.beam.sdk.Pipeline.run(Pipeline.java:303)
    at org.apache.beam.sdk.Pipeline.run(Pipeline.java:289)
    at com.google.cloud.dataflow.examples.opinionanalysis.ControlPipeline$ProcessCommand.startDocumentImportPipeline(ControlPipeline.java:270)
    at com.google.cloud.dataflow.examples.opinionanalysis.ControlPipeline$ProcessCommand.processElement(ControlPipeline.java:172)

Consequently, I'm not seeing any output in my BigQuery tables.
Any help on this?
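For reference, the command message was published along these lines (a sketch only; the project and topic names below are placeholders, not taken from the repository):

```python
# Sketch of publishing the control command; "my-project" and
# "control-topic" are placeholders, not names from the repo.
command = "command=start_gcs_import"
data = command.encode("utf-8")  # Pub/Sub message payloads are bytes

# With google-cloud-pubsub installed and GCP credentials configured,
# the publish itself would look roughly like this:
# from google.cloud import pubsub_v1
# publisher = pubsub_v1.PublisherClient()
# topic_path = publisher.topic_path("my-project", "control-topic")
# publisher.publish(topic_path, data)

print(data)
```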

PS: Launching only the indexer pipeline, as described in the version 0.6.4 Release Notes, works fine.

Thanks.

@datancoffee
Contributor

Yeah, the ControlPipeline is broken at the moment (it hasn't been brought up to the latest version of the Dataflow SDK yet), and I am working on fixing it. Sorry about that.
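(For anyone hitting the same NoSuchMethodError: it is typically a symptom of mixing Beam SDK artifact versions on the classpath. A hypothetical Maven fragment that pins everything to one version, purely illustrative and not from this repo's pom.xml:)

```xml
<!-- Hypothetical sketch: keep all Beam artifacts on one version so the
     SDK and the Dataflow runner agree; 2.x.y is a placeholder. -->
<properties>
  <beam.version>2.x.y</beam.version>
</properties>
<dependencies>
  <dependency>
    <groupId>org.apache.beam</groupId>
    <artifactId>beam-sdks-java-core</artifactId>
    <version>${beam.version}</version>
  </dependency>
  <dependency>
    <groupId>org.apache.beam</groupId>
    <artifactId>beam-runners-google-cloud-dataflow-java</artifactId>
    <version>${beam.version}</version>
  </dependency>
</dependencies>
```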

For the time being, launch the IndexerPipeline directly, using the example in the Release Notes for version 0.6.4.

@omahjoub
Author

Thank you for your reply.
That's exactly what I did: I launched only the indexer pipeline.
I hope you'll find the time to fix this soon :)
Thank you for your efforts.
