
Kafka protocol support #196

Closed
davidwmartines opened this issue Nov 5, 2022 · 4 comments · Fixed by #197
@davidwmartines (Contributor)

Are there any plans for Kafka protocol support in this sdk?

I would primarily be interested in using binary mode with Kafka, with Avro serialization for the data. (Although structured mode for JSON would probably need to be supported, too.)

It seems the current HTTP CloudEvent does most of what is needed, except:

  1. It needs a property to store the message Key.
  2. For binary mode, the headers need to be prefixed with ce_ (not ce-), with the exception of content-type.

I would not expect any specific serialization support, as I would rather be able to pass in any serializer of my choice such as the AvroSerializer from the confluent-kafka-python package, as a callable.
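The two changes described above (a message key, ce_-prefixed headers with content-type left unprefixed, and a caller-supplied serializer) could be sketched roughly like this. This is a hypothetical illustration using plain dicts, not the SDK's actual API; the function and parameter names are made up:

```python
import json
from typing import Any, Callable, Dict, Tuple


def to_kafka_binary(
    attributes: Dict[str, str],
    data: Any,
    key: str,
    # Any serializer callable works here, e.g. an Avro serializer from
    # confluent-kafka-python; JSON is just a placeholder default.
    data_marshaller: Callable[[Any], bytes] = lambda d: json.dumps(d).encode(),
) -> Tuple[Dict[str, bytes], bytes, str]:
    """Return (headers, value, key) for a binary-mode Kafka message."""
    headers = {}
    for name, value in attributes.items():
        if name == "datacontenttype":
            # content-type is the one attribute that is NOT prefixed
            headers["content-type"] = value.encode()
        else:
            # Kafka binary mode uses the ce_ prefix (not ce- as in HTTP)
            headers["ce_" + name] = value.encode()
    return headers, data_marshaller(data), key


headers, value, key = to_kafka_binary(
    {"id": "1", "source": "/orders", "type": "order.created",
     "specversion": "1.0", "datacontenttype": "application/json"},
    {"order_id": 42},
    key="order-42",
)
```

Here `headers["ce_id"]` is `b"1"` while the content type lands in the unprefixed `content-type` header, and the body bytes come from whatever marshaller the caller passed in.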

I'd be willing to help with this, but I'm not sure where to begin in this repo as it seems very http-centric at the moment.

@xSAVIKx (Member) commented Nov 5, 2022

Hey @davidwmartines, thx for opening this issue.

Yes, as you mentioned, the repo is very http-centric, unfortunately :-(

I'm looking forward to having some free time to bring Protobuf support; hopefully that will bring some clarity regarding adding more formatting options.
Also, we now have a CloudEvent abstract base, so technically you can subclass it and add any required properties.

I personally don't have Kafka experience, but I'm curious how this complies with the CloudEvents spec. I need to check.

And yeah, PRs are very welcome.
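The subclassing idea mentioned above could look roughly like this. The base class here is a minimal hypothetical stand-in, not the SDK's actual abstract.CloudEvent; the added property follows the CloudEvents Kafka binding's use of a partition key as the message key:

```python
from abc import ABC, abstractmethod
from typing import Any, Dict, Optional


class CloudEvent(ABC):
    """Minimal stand-in for an abstract CloudEvent base (illustrative only)."""

    @abstractmethod
    def get_attributes(self) -> Dict[str, str]: ...

    @abstractmethod
    def get_data(self) -> Any: ...


class KafkaCloudEvent(CloudEvent):
    """Hypothetical subclass adding the Kafka message key."""

    def __init__(self, attributes: Dict[str, str], data: Any,
                 partition_key: Optional[str] = None) -> None:
        self._attributes = dict(attributes)
        self._data = data
        # The Kafka-specific property the issue asks for: the message key.
        self.partition_key = partition_key

    def get_attributes(self) -> Dict[str, str]:
        return self._attributes

    def get_data(self) -> Any:
        return self._data


event = KafkaCloudEvent(
    {"id": "1", "source": "/orders", "type": "order.created",
     "specversion": "1.0"},
    data=b'{"order_id": 42}',
    partition_key="order-42",
)
```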

davidwmartines mentioned this issue Nov 9, 2022
@davidwmartines (Contributor, author)

@xSAVIKx I started exploring this and came up with a draft of something that seems to work.

I'm not sure if it's the right approach, though. For one thing, the kafka.CloudEvent class I created is basically a copy of http.CloudEvent. If we are creating protocol-specific CloudEvent classes, then we might need a more functional base class inheriting from abstract.CloudEvent, with the common behavior implemented for the protocol-specific implementations to build upon (if needed).

Alternatively, I could simply not create a kafka.CloudEvent, with the expectation that consumers would use http.CloudEvent. The kafka module would then only need to provide the protocol-specific conversion functions. This actually seems more in the spirit of CloudEvents: a common, protocol-agnostic event structure, with extensions provided for converting to/from the native protocol formats.
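The conversion-functions-only approach amounts to a pair of free functions like the one sketched here, the inverse direction that rebuilds attributes and data from a received message. Names are hypothetical, with plain dicts standing in for the event class:

```python
from typing import Any, Callable, Dict, Tuple


def from_kafka_binary(
    headers: Dict[str, bytes],
    value: bytes,
    # Caller-supplied deserializer; pass-through by default.
    data_unmarshaller: Callable[[bytes], Any] = lambda v: v,
) -> Tuple[Dict[str, str], Any]:
    """Rebuild (attributes, data) from a binary-mode Kafka message."""
    attributes = {}
    for name, raw in headers.items():
        if name == "content-type":
            attributes["datacontenttype"] = raw.decode()
        elif name.startswith("ce_"):
            # Strip the ce_ prefix to recover the attribute name.
            attributes[name[len("ce_"):]] = raw.decode()
    return attributes, data_unmarshaller(value)


attrs, data = from_kafka_binary(
    {"ce_id": b"1", "ce_source": b"/orders", "ce_type": b"order.created",
     "ce_specversion": b"1.0", "content-type": b"application/json"},
    b'{"order_id": 42}',
)
```

Because the functions only read and write attribute/data pairs, they stay agnostic about which CloudEvent class (if any) the consumer uses.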

Your thoughts?

(BTW, I researched more and discovered the CloudEvents and Knative connection, so the HTTP focus of all this now makes more sense.)

@xSAVIKx (Member) commented Nov 9, 2022

Yeah, I guess Kafka is just a conversion format and there's no need for an extra CloudEvent class. The Pydantic approach is a bit different; the abstract base was created for framework-specific implementations.

@davidwmartines (Contributor, author)

OK, that makes sense. I have updated my PR to remove the kafka.CloudEvent class, and the kafka conversion functions now work with AnyCloudEvent (both http and pydantic work, at least in my simple testing). I still need to add some unit tests. I think I like this approach.
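Working against AnyCloudEvent essentially means the conversion functions accept any event implementation with the expected shape. A sketch of that idea using a structural `Protocol` (hypothetical typing, not necessarily how the SDK defines AnyCloudEvent):

```python
from typing import Any, Dict, Optional, Protocol


class AnyCloudEvent(Protocol):
    """Structural type: any object exposing attributes and data qualifies."""

    def get_attributes(self) -> Dict[str, str]: ...
    def get_data(self) -> Any: ...


def partition_key(event: AnyCloudEvent) -> Optional[str]:
    """Derive the Kafka message key from the 'partitionkey' extension, if set."""
    return event.get_attributes().get("partitionkey")


class SimpleEvent:
    """Any event class works, HTTP- or Pydantic-based, as long as it fits."""

    def __init__(self, attributes: Dict[str, str], data: Any) -> None:
        self._attributes = dict(attributes)
        self._data = data

    def get_attributes(self) -> Dict[str, str]:
        return self._attributes

    def get_data(self) -> Any:
        return self._data
```

A conversion function written this way needs no kafka.CloudEvent class at all; the http and pydantic event classes both satisfy the protocol.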
