Description
Would you have any advice on how to create a custom streaming data source for message brokers other than Kafka, such as ZeroMQ or RabbitMQ? There is no real partitioning involved, and messages have to be queued/buffered between successive Spark read operations. I also noticed that Spark's `DataSourceStreamReader` class doesn't seem to play well with multi-threading.
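One pattern that sidesteps the multi-threading issue is to keep the broker consumer (e.g. a RabbitMQ/ZeroMQ callback thread) and the reader on opposite sides of a thread-safe, offset-indexed buffer: the consumer thread only appends, and `latestOffset()`/`read()`/`commit()` only take snapshots under a lock. Below is a minimal sketch of such a buffer in plain Python (the class name `OffsetBuffer` and its methods are my own, not part of any Spark API); a single-partition `DataSourceStreamReader` could delegate to it, reporting `latest()` from `latestOffset()`, replaying `read(start, end)` per micro-batch, and trimming via `commit(end)`:

```python
import threading


class OffsetBuffer:
    """Thread-safe buffer that assigns a monotonically increasing offset to
    each message, so a single-partition stream reader can replay any
    (start, end) range between micro-batches."""

    def __init__(self):
        self._lock = threading.Lock()
        self._messages = []
        self._base = 0  # offset of _messages[0]

    def append(self, msg):
        # Called from the broker consumer thread (e.g. a RabbitMQ callback).
        with self._lock:
            self._messages.append(msg)

    def latest(self):
        # Offset just past the newest buffered message.
        with self._lock:
            return self._base + len(self._messages)

    def read(self, start, end):
        # Snapshot of messages in [start, end); safe to call from the
        # reader while the consumer keeps appending.
        with self._lock:
            lo = max(start - self._base, 0)
            hi = max(end - self._base, 0)
            return list(self._messages[lo:hi])

    def commit(self, end):
        # Drop messages below `end` once Spark has committed that offset.
        with self._lock:
            drop = min(max(end - self._base, 0), len(self._messages))
            del self._messages[:drop]
            self._base += drop
```

Because the reader never touches the broker client directly, the client library's threading constraints stay confined to the consumer thread, and the reader sees only immutable snapshots keyed by offset.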