New `azure_event_hubs` sink #2434
I'm interested in Azure Event Hubs sink support. More specifically, from an end-to-end perspective, I'm looking to scrape an application's Prometheus metrics endpoint, convert the metrics into the Application Insights JSON format (something like this: http://apmtips.com/blog/2017/10/27/send-metric-to-application-insights/), and then send them in compressed batches to an Azure Event Hub (preferred, or to Application Insights directly). It appears the first part is supported. Let me know if I can help, e.g. by QAing this.
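For reference, here is a rough, untested sketch of what that pipeline could look like with Vector's existing components: the `prometheus_scrape` source, the `metric_to_log` and `remap` transforms to build an Application Insights-style envelope, and the `kafka` sink pointed at the Event Hubs Kafka endpoint (working SASL settings for that endpoint appear later in this thread). The envelope field names are illustrative assumptions, not the official Application Insights schema, and all placeholder values need replacing.

```toml
# Sketch only: scrape Prometheus metrics, wrap them in an App Insights-style
# JSON envelope, and ship them to Event Hubs over its Kafka interface.

[sources.app_metrics]
type = "prometheus_scrape"
endpoints = ["http://localhost:9090/metrics"]   # your app's metrics endpoint

# Convert metric events into log events so they can be reshaped as JSON.
[transforms.metrics_as_logs]
type = "metric_to_log"
inputs = ["app_metrics"]

[transforms.to_app_insights]
type = "remap"
inputs = ["metrics_as_logs"]
source = '''
# Illustrative envelope; the real Application Insights schema differs.
. = {
  "name": "Microsoft.ApplicationInsights.Metric",
  "time": now(),
  "data": { "baseType": "MetricData", "baseData": . }
}
'''

[sinks.event_hub]
type = "kafka"
inputs = ["to_app_insights"]
bootstrap_servers = "<EVENT_HUB_NAMESPACE>.servicebus.windows.net:9093"
topic = "<EVENT_HUB_NAME>"
compression = "gzip"            # batches are compressed on the wire
encoding.codec = "json"
librdkafka_options."security.protocol" = "sasl_ssl"

[sinks.event_hub.sasl]
enabled = true
mechanism = "PLAIN"
username = "$$ConnectionString"
password = "<YOUR_SAS_CONNECTION_STRING>"
```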
I can't get it working, so just to be sure: is it possible to use the Kafka sink with Azure Event Hubs? And if so, are there any caveats I need to be aware of?
@martinohansen when we say wrap the kafka sink, we mean adding a sink that reuses the kafka code, customized for Event Hubs. The straight kafka source will not directly work, I suspect.
For those who are also super excited to get Vector to send into Azure Event Hubs (Standard SKU required) as a sink: I was able to get this working this morning by following @joshmue's source configuration (see #4261) and the Microsoft documentation as inspiration. :)
My specific use case below uses the `kubernetes_logs` source to pull Kubernetes logs, but you can adjust the sources to match your needs:

```yaml
sinks:
  event_hub:
    type: kafka
    inputs: ["kubernetes_logs"]
    bootstrap_servers: "<EVENT_HUB_NAMESPACE>.servicebus.windows.net:9093"
    group_id: '$$Default'
    topic: "<EVENT_HUB_NAME>"
    encoding:
      codec: "json"
    healthcheck:
      enabled: true
    sasl:
      enabled: true
      mechanism: PLAIN
      username: "$$ConnectionString"
      password: "<YOUR_SAS_CONNECTION_STRING>"
    librdkafka_options:
      "security.protocol": sasl_ssl
```

The key part that got it working for me is those last two lines. Make sure the key is surrounded with quotes, as it is expecting a string, not a map:

```yaml
librdkafka_options:
  "security.protocol": sasl_ssl
```

All that said, I'd love to give this a shot and implement this as a specific `azure_event_hubs` sink.
Here's a functional setup for both sink and source:

```toml
## Kafka sink
[sinks.out_azure_events]
type = "kafka"
inputs = ["parse_log_json"]
encoding.codec = "json"
bootstrap_servers = "myhub.servicebus.windows.net:9093" # [hub_namespace].servicebus.windows.net:9093
topic = "vector" # event hub name
compression = "zstd"
librdkafka_options."security.protocol" = "sasl_ssl"

[sinks.out_azure_events.sasl]
enabled = true
mechanism = "PLAIN"
username = "$$ConnectionString"
password = "Endpoint=sb://myhub.servicebus.windows.net/;SharedAccessKeyName=vector-producer;SharedAccessKey=dummy;EntityPath=vector" # SAS connection string

## Kafka source
[sources.in_azure_events]
type = "kafka"
bootstrap_servers = "myhub.servicebus.windows.net:9093" # [hub_namespace].servicebus.windows.net:9093
topics = ["vector"] # event hub name
group_id = '$$Default'
librdkafka_options."security.protocol" = "sasl_ssl"

[sources.in_azure_events.sasl]
enabled = true
mechanism = "PLAIN"
username = "$$ConnectionString"
password = "Endpoint=sb://myhub.servicebus.windows.net/;SharedAccessKeyName=vector-producer;SharedAccessKey=dummy;EntityPath=vector" # SAS connection string

[transforms.azure_events_parse]
type = "remap"
inputs = ["in_azure_events"]
source = '''
. = parse_json!(string!(.message))
'''

[sinks.console]
type = "console"
inputs = ["azure_events_parse"]
target = "stdout"

[sinks.console.encoding]
codec = "json"
```
@fpytloun is that a suggestion for how it might look, or is the feature implemented? I could not find it in the documentation, source, or commits.
@dxlr8r that will work now if you are running a version of Azure Event Hubs that supports the Kafka interface (I believe the basic tier does not).
Yes, in my opinion there's nothing to do under this issue; no code change is needed, since it's regular Kafka. Maybe just extend the kafka sink documentation with an example of how to set up output to the Azure Event Hubs Kafka API?
Agreed, we do plan to add this to the documentation. However, the basic tier of Azure Event Hubs doesn't support connecting via Kafka, so we plan to leave this open until we can add support for AMQP 1.0, which the basic tier does support.
It looks like Azure has been creating an SDK for Event Hub here: |
Azure Event Hubs is like a managed Kafka service. This suggests that we could wrap the existing `kafka` sink. I'd prefer that we wrap it, since it is technically a different service, and we also get the marketing benefit (guides, etc.) when we create a new sink.
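To make the wrapping idea concrete, here is a purely speculative sketch of what a dedicated `azure_event_hubs` sink's configuration could look like: the user supplies Event Hubs terminology, and the sink translates it into the Kafka/SASL settings shown in the working examples earlier in the thread. None of these option names exist in Vector today.

```toml
# Hypothetical azure_event_hubs sink that wraps the kafka sink.
# Every option name below is an assumption, not part of Vector.
[sinks.event_hub]
type = "azure_event_hubs"                      # hypothetical sink name
inputs = ["kubernetes_logs"]
namespace = "myhub"                            # -> bootstrap_servers = "myhub.servicebus.windows.net:9093"
event_hub = "vector"                           # -> topic = "vector"
connection_string = "<SAS_CONNECTION_STRING>"  # -> SASL PLAIN with username "$ConnectionString"
encoding.codec = "json"
```

Under the hood, such a sink would only need to fill in `bootstrap_servers`, `topic`, the `sasl` block, and `librdkafka_options."security.protocol" = "sasl_ssl"` before delegating to the existing kafka sink.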