
Commit d7d3d89

Update the README
1 parent 35ec0f2 commit d7d3d89

File tree

2 files changed: +15 -21 lines changed

README.md

Lines changed: 13 additions & 21 deletions
@@ -115,13 +115,10 @@ A client must be initialized with at least one Kafka broker, from which the enti
 ```ruby
 require "kafka"
 
-kafka = Kafka.new(
-  # At least one of these nodes must be available:
-  seed_brokers: ["kafka1:9092", "kafka2:9092"],
-
-  # Set an optional client id in order to identify the client to Kafka:
-  client_id: "my-application",
-)
+# The first argument is a list of "seed brokers" that will be queried for the full
+# cluster topology. At least one of these *must* be available. `client_id` is
+# used to identify this client in logs and metrics. It's optional but recommended.
+kafka = Kafka.new(["kafka1:9092", "kafka2:9092"], client_id: "my-application")
 ```
 
 ### Producing Messages to Kafka
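
In practice the change amounts to the comparison below. Note that the keyword form removed from the README keeps working, since the `lib/kafka.rb` hunk further down dispatches on whether a positional broker list was given. A minimal sketch:

```ruby
require "kafka"

# New style (what the README now shows): seed brokers passed positionally.
kafka = Kafka.new(["kafka1:9092", "kafka2:9092"], client_id: "my-application")

# Old style: still accepted, per the lib/kafka.rb change in this commit.
kafka = Kafka.new(seed_brokers: ["kafka1:9092", "kafka2:9092"], client_id: "my-application")
```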
@@ -430,10 +427,7 @@ require "kafka"
 
 # Configure the Kafka client with the broker hosts and the Rails
 # logger.
-$kafka = Kafka.new(
-  seed_brokers: ["kafka1:9092", "kafka2:9092"],
-  logger: Rails.logger,
-)
+$kafka = Kafka.new(["kafka1:9092", "kafka2:9092"], logger: Rails.logger)
 
 # Set up an asynchronous producer that delivers its buffered messages
 # every ten seconds:
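
The Rails example is cut off just before the producer is created; assuming the `async_producer` API described elsewhere in the README (not part of this hunk), the continuation would look roughly like this:

```ruby
# Hypothetical continuation: an async producer that flushes its buffered
# messages every ten seconds (delivery_interval is in seconds).
$kafka_producer = $kafka.async_producer(delivery_interval: 10)
```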
@@ -473,7 +467,7 @@ Consuming messages from a Kafka topic with ruby-kafka is simple:
 ```ruby
 require "kafka"
 
-kafka = Kafka.new(seed_brokers: ["kafka1:9092", "kafka2:9092"])
+kafka = Kafka.new(["kafka1:9092", "kafka2:9092"])
 
 kafka.each_message(topic: "greetings") do |message|
   puts message.offset, message.key, message.value
@@ -496,7 +490,7 @@ Using the API is simple:
 ```ruby
 require "kafka"
 
-kafka = Kafka.new(seed_brokers: ["kafka1:9092", "kafka2:9092"])
+kafka = Kafka.new(["kafka1:9092", "kafka2:9092"])
 
 # Consumers with the same group id will form a Consumer Group together.
 consumer = kafka.consumer(group_id: "my-consumer")
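
The hunk ends right after the consumer is created; for context, a typical consumer-group loop, assuming the `subscribe`/`each_message` API documented elsewhere in the README, looks like this:

```ruby
# Subscribe the group to a topic and process messages as they arrive.
consumer.subscribe("greetings")

consumer.each_message do |message|
  puts message.topic, message.partition, message.offset, message.key, message.value
end
```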
@@ -884,10 +878,7 @@ By enabling SSL encryption you can have some confidence that messages can be sen
 In this case you just need to pass a valid CA certificate as a string when configuring your `Kafka` client:
 
 ```ruby
-kafka = Kafka.new(
-  ssl_ca_cert: File.read('my_ca_cert.pem'),
-  # ...
-)
+kafka = Kafka.new(["kafka1:9092"], ssl_ca_cert: File.read('my_ca_cert.pem'))
 ```
 
 Without passing the CA certificate to the client it would be impossible to protect against [man-in-the-middle attacks](https://en.wikipedia.org/wiki/Man-in-the-middle_attack).
@@ -898,10 +889,7 @@ If you want to use the CA certs from your system's default certificate store, yo
 can use:
 
 ```ruby
-kafka = Kafka.new(
-  ssl_ca_certs_from_system: true
-  # ...
-)
+kafka = Kafka.new(["kafka1:9092"], ssl_ca_certs_from_system: true)
 ```
 
 This configures the store to look up CA certificates from the system default certificate store on an as needed basis. The location of the store can usually be determined by:
@@ -913,6 +901,7 @@ In order to authenticate the client to the cluster, you need to pass in a certif
 
 ```ruby
 kafka = Kafka.new(
+  ["kafka1:9092"],
   ssl_ca_cert: File.read('my_ca_cert.pem'),
   ssl_client_cert: File.read('my_client_cert.pem'),
   ssl_client_cert_key: File.read('my_client_cert_key.pem'),
@@ -935,6 +924,7 @@ In order to authenticate using GSSAPI, set your principal and optionally your ke
 
 ```ruby
 kafka = Kafka.new(
+  ["kafka1:9092"],
   sasl_gssapi_principal: 'kafka/kafka.example.com@EXAMPLE.COM',
   sasl_gssapi_keytab: '/etc/keytabs/kafka.keytab',
   # ...
@@ -946,6 +936,7 @@ In order to authenticate using PLAIN, you must set your username and password wh
 
 ```ruby
 kafka = Kafka.new(
+  ["kafka1:9092"],
   ssl_ca_cert: File.read('/etc/openssl/cert.pem'), # Optional but highly recommended
   sasl_plain_username: 'username',
   sasl_plain_password: 'password'
@@ -960,6 +951,7 @@ Since 0.11 kafka supports [SCRAM](https://kafka.apache.org/documentation.html#se
 
 ```ruby
 kafka = Kafka.new(
+  ["kafka1:9092"],
   sasl_scram_username: 'username',
   sasl_scram_password: 'password',
   sasl_scram_mechanism: 'sha256',

lib/kafka.rb

Lines changed: 2 additions & 0 deletions
@@ -244,6 +244,8 @@ class FailedScramAuthentication < SaslScramError
   # @see Client#initialize
   # @return [Client]
   def self.new(seed_brokers = nil, **options)
+    # We allow `seed_brokers` to be passed in either as a positional _or_ as a
+    # keyword argument.
     if seed_brokers.nil?
       Client.new(**options)
     else
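
These two comment lines document the dispatch that keeps the old keyword style working. A condensed sketch of the whole method, assuming the `else` branch simply forwards the positional list to `Client.new` as the `seed_brokers:` keyword:

```ruby
module Kafka
  # We allow `seed_brokers` to be passed in either as a positional _or_ as a
  # keyword argument.
  def self.new(seed_brokers = nil, **options)
    if seed_brokers.nil?
      # Keyword style: Kafka.new(seed_brokers: [...], ...)
      Client.new(**options)
    else
      # Positional style: Kafka.new([...], ...)
      Client.new(seed_brokers: seed_brokers, **options)
    end
  end
end
```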
