Add support for HTTP Basic Auth #440


Merged
merged 14 commits into master from basic_auth on Dec 4, 2018

Conversation

rnpridgeon
Contributor

No description provided.

@rnpridgeon rnpridgeon requested a review from edenhill August 18, 2018 06:49
Contributor

@edenhill edenhill left a comment


Good stuff, some minor comments.
Also missing an integration test.

@rnpridgeon
Contributor Author

Now with integration tests!

Contributor

@edenhill edenhill left a comment


Almost there, good stuff!

@rnpridgeon
Contributor Author

@edenhill, mind giving this another pass?

@@ -94,9 +152,15 @@ def _send_request(self, url, method='GET', body=None, headers={}):
_headers.update(headers)

response = self._session.request(method, url, headers=_headers, json=body)

# Returned by Jetty not SR so the payload is not json encoded
Contributor


Might be better to wrap the json parsing in a try-block for all 4xx

Contributor


Any comment on this? Still seems like a valid concern

Contributor Author


Sorry, I missed this one. I like this idea; I did something similar in the CCloud CLI as well.
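The suggested approach might look something like the following sketch (a hypothetical helper, not the PR's actual code): wrap the JSON decode in a try-block and fall back to the raw text whenever the body is not JSON.

```python
import json

def parse_registry_response(text, status_code):
    """Hypothetical helper: decode a Schema Registry response body,
    tolerating non-JSON payloads (e.g. a Jetty-generated 4xx page)."""
    try:
        body = json.loads(text)
    except ValueError:
        # Jetty (not the Schema Registry itself) produced this payload,
        # so it is not JSON encoded; surface it as a plain error dict.
        body = {"error_code": status_code, "message": text}
    return body, status_code
```

With this, a plain-text 401 page comes back as an error dict instead of raising `JSONDecodeError` in the caller.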

@rnpridgeon
Contributor Author

@edenhill, mind giving this another go? I have fast-forwarded to rebase the PR on master and addressed all of your comments.

Contributor

@edenhill edenhill left a comment


Almost there, buddy!
Would be nice with an avro example, with auth support (maybe commented out), in examples/
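Such an example might configure the producer roughly like this sketch; the exact `schema.registry.basic.auth.*` key names and values are assumptions based on this PR, so check the merged docs before copying them:

```python
# Hypothetical AvroProducer configuration with optional Basic Auth for
# the Schema Registry; the basic.auth.* key names are assumptions.
config = {
    'bootstrap.servers': 'localhost:9092',
    'schema.registry.url': 'http://localhost:8081',
    # Uncomment to authenticate against a protected Schema Registry:
    # 'schema.registry.basic.auth.credentials.source': 'USER_INFO',
    # 'schema.registry.basic.auth.user.info': 'registry_user:registry_pass',
}
# producer = AvroProducer(config, default_value_schema=value_schema)
```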

@@ -94,9 +152,15 @@ def _send_request(self, url, method='GET', body=None, headers={}):
_headers.update(headers)

response = self._session.request(method, url, headers=_headers, json=body)

# Returned by Jetty not SR so the payload is not json encoded
Contributor


Any comment on this? Still seems like a valid concern

@rnpridgeon
Contributor Author

@edenhill, hopefully this cleans up the rest.

Deletes the specified subject and its associated compatibility level if registered.
It is recommended to use this API only when a topic needs to be recycled or in development environments

@:param: subject: subject name
Contributor


I'm not sure about this doxygen style; make sure the docs are generated correctly.

Contributor Author


It follows the conventions within the rest of the file. Would you like me to correct all of them or just this one?

It is recommended to use this API only when a topic needs to be recycled or in development environments

@:param: subject: subject name
@:returns: version (int) - version of the schema deleted under this subject
Contributor


there is an '@rtype' for specifying the return type

Contributor Author


ditto
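In Sphinx's default field-list style, the docstring under discussion would look roughly like this (a sketch with a hypothetical standalone function; the real method lives on the client class):

```python
def delete_subject(subject):
    """
    Deletes the specified subject and its associated compatibility level
    if registered. It is recommended to use this API only when a topic
    needs to be recycled or in development environments.

    :param str subject: subject name
    :returns: version of the schema deleted under this subject
    :rtype: int
    """
    raise NotImplementedError  # body omitted in this sketch
```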

@@ -0,0 +1 @@
__import__('pkg_resources').declare_namespace(__name__)
Contributor


what's this for?

Contributor Author


avoid conflicts between confluent_kafka.avro and examples.avro

@rnpridgeon
Contributor Author

rnpridgeon commented Nov 27, 2018

@edenhill The docs build fine, although CachedSchemaRegistryClient appears to be omitted by autodoc. This has actually always been the case, though, so I'd like to request that we move that into another PR.

#495

@@ -0,0 +1,180 @@
#!/usr/bin/env python
Contributor


nit: the examples/seriali../avro.. path may be overdoing it (but it is fine)

Contributor


Having faced this again, I really think it should be in examples/ with the other examples; I don't see a reason for the separation.

Contributor

@edenhill edenhill left a comment


Smaller nits, almost there!

""")


# User stores the deserialized user Avro record.
Contributor


These should be doc strings

self.favorite_number = favorite_number
self.favorite_color = favorite_color

# The Avro Python library does not support code generation.
Contributor


docstring
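Converting the leading comment into a docstring would look like this sketch (the `User` fields come from the surrounding example code):

```python
class User(object):
    """User stores the deserialized user Avro record."""

    def __init__(self, name=None, favorite_number=None, favorite_color=None):
        self.name = name
        self.favorite_number = favorite_number
        self.favorite_color = favorite_color
```

Unlike a `#` comment, the docstring is picked up by `help()` and by autodoc.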

record.favorite_number = int(input("Enter favorite number: "))
record.favorite_color = input("Enter favorite color: ")

producer.produce(topic=topic, partition=0, value=record.to_dict(), callback=on_delivery)
Contributor


suggest not specifying the partition, is atypical, and not needed with subscribe()

Contributor Author


ack, leftover from usage of assign

Contributor Author


we could use this opportunity to demonstrate how you would do externalized custom partitioning.

Contributor


But the Python client does not support pluggable partitioners, and it doesn't seem relevant for showcasing Avro.

Contributor Author


I meant how you could still use a custom partitioner externally (i.e. explicitly set the partition with the result of some hashing function). You are right, though, that it is out of place here.
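An "external" partitioner along those lines might be sketched like this (a hypothetical helper; `partition_for` is not part of the client):

```python
import zlib

def partition_for(key, num_partitions):
    """Hypothetical external partitioner: derive a stable partition
    from the message key, then pass it explicitly to produce()."""
    return zlib.crc32(key.encode('utf-8')) % num_partitions

# Usage sketch:
# producer.produce(topic=topic, value=record.to_dict(),
#                  partition=partition_for(record.name, 3),
#                  callback=on_delivery)
```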

Contributor

@edenhill edenhill left a comment


one small change with brackets, then good to go!

@rnpridgeon rnpridgeon mentioned this pull request Dec 3, 2018
@edenhill
Contributor

edenhill commented Dec 3, 2018

LGTM!

@edenhill
Contributor

edenhill commented Dec 3, 2018

Might want to consider fixing the history of recent commit-review-ping-pong commits.

@rnpridgeon rnpridgeon merged commit df649a4 into master Dec 4, 2018
@rnpridgeon rnpridgeon deleted the basic_auth branch December 4, 2018 08:12
@ADDale

ADDale commented Jan 3, 2019

I am still getting the below error. Please confirm whether this has been fixed, and can you share a working example?
Running macOS, Python 3.6.5.
Installed packages with:
pip install confluent-kafka
pip install confluent-kafka[avro]

I ran the example Avro producer script from https://github.com/confluentinc/confluent-kafka-python/blob/master/README.md, edited for my broker, schema registry, and topic name.

error stack is:
Traceback (most recent call last):
File "/Users/07HO38/Documents/Python/OEHCS/kafka_produce_avro.py", line 43, in <module>
avroProducer.produce(topic='idcs-5bea815f69d54c6daa5701a57b5914fa-JDBCUSERS', value=value, key=key)
File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/confluent_kafka/avro/__init__.py", line 80, in produce
value = self._serializer.encode_record_with_schema(topic, value_schema, value)
File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/confluent_kafka/avro/serializer/message_serializer.py", line 98, in encode_record_with_schema
schema_id = self.registry_client.register(subject, schema)
File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/confluent_kafka/avro/cached_schema_registry_client.py", line 141, in register
result, code = self._send_request(url, method='POST', body=body)
File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/confluent_kafka/avro/cached_schema_registry_client.py", line 97, in _send_request
return response.json(), response.status_code
File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/requests/models.py", line 892, in json
return complexjson.loads(self.text, **kwargs)
File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/json/__init__.py", line 354, in loads
return _default_decoder.decode(s)
File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/json/decoder.py", line 339, in decode
obj, end = self.raw_decode(s, idx=_w(s, 0).end())
File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/json/decoder.py", line 357, in raw_decode
raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)
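The last frame shows `response.json()` choking on a body that is not JSON at all, which typically means the registry (or something in front of it, like a proxy or a wrong `schema.registry.url` pointing at the broker) returned plain text or HTML. A minimal reproduction of that failure mode, with a hypothetical response body:

```python
import json

raw_body = "<html>401 Unauthorized</html>"  # hypothetical non-JSON response body
try:
    json.loads(raw_body)
except json.JSONDecodeError as exc:
    # This is the same "Expecting value: line 1 column 1 (char 0)" error
    # seen in the traceback above.
    print("non-JSON payload:", exc)
```

Checking the raw response from the registry URL (e.g. with curl) usually reveals what the server actually sent.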
