Discussion thread for 《3.3 ReceiverTraker, ReceivedBlockTracker 详解.md》 #8
Comments
Hello, some of the images in the article are broken. Could you please re-upload them?
@romantic123
@lw-lin Hello, I have a question. I'm currently pulling data from Flume and running a Spark Streaming program. I set spark.streaming.blockInterval to 1000ms, but it doesn't seem to take effect. Have you run into this? Thanks.
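For context on the question above: spark.streaming.blockInterval only applies to receiver-based input; the receiver cuts incoming records into one block per block interval, and each block becomes one partition of the batch's RDD. The sketch below is a minimal plain-Python simulation of that relationship (the helper names `blocks_per_batch` and `cut_into_blocks` are hypothetical, not Spark APIs):

```python
# Minimal simulation (plain Python, NOT Spark code) of how a receiver-based
# stream cuts records into blocks: one block per blockInterval, and
# batchInterval / blockInterval blocks per batch. In Spark each block
# becomes one RDD partition, so this ratio bounds the parallelism of the
# first stage per receiver.

def blocks_per_batch(batch_interval_ms, block_interval_ms):
    """Number of blocks (and hence RDD partitions) produced per batch."""
    return batch_interval_ms // block_interval_ms

def cut_into_blocks(record_timestamps_ms, block_interval_ms):
    """Group record timestamps by the block interval they fall into."""
    blocks = {}
    for ts in record_timestamps_ms:
        blocks.setdefault(ts // block_interval_ms, []).append(ts)
    return blocks

# With a 2000 ms batch and the default 200 ms blockInterval,
# each batch holds 2000 / 200 = 10 blocks.
print(blocks_per_batch(2000, 200))   # -> 10

# With blockInterval = 1000 ms (as in the question), a 2000 ms batch
# holds only 2 blocks, i.e. 2 partitions per receiver.
print(blocks_per_batch(2000, 1000))  # -> 2
```

One thing worth checking when the setting "doesn't seem to take effect" is whether the job is actually using a receiver-based source; direct (receiver-less) sources ignore blockInterval entirely.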
Hello, I've also read the source code. It looks like the BlockGenerator is started before the Receiver, whereas your article says the Receiver is started first. Am I reading that right?
@lw-lin
After it returns false directly, the job that distributed that receiver ends normally; ReceiverTracker then launches a new job to distribute the receiver again. Hope it helps!
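The restart behavior described in that reply can be sketched as a simplified loop (plain Python with hypothetical names, not the actual Spark implementation): the tracker keeps resubmitting a "distribution job" for the receiver as long as the receiver is still expected to run.

```python
# Simplified sketch (NOT Spark code) of ReceiverTracker's restart behavior:
# when the job wrapping a receiver finishes normally but the tracker has not
# been stopped, a new job is submitted to distribute the receiver again.

def run_receiver_until_stopped(submit_job, should_restart, max_rounds=100):
    """Keep resubmitting the receiver's distribution job until it should stop.

    submit_job:     callable running one 'distribution job' for the receiver
    should_restart: callable returning True while the receiver should keep running
    max_rounds:     safety bound so the sketch always terminates
    """
    rounds = 0
    while should_restart() and rounds < max_rounds:
        submit_job()  # the job may end normally (e.g. startup returned false)
        rounds += 1
    return rounds

# Usage sketch: the job "ends normally" three times, then the tracker stops.
attempts = []
restarts_left = [3]

def fake_job():
    attempts.append("job")

def fake_should_restart():
    restarts_left[0] -= 1
    return restarts_left[0] >= 0

print(run_receiver_until_stopped(fake_job, fake_should_restart))  # -> 3
```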
In the third paragraph under "ReceiverTracker 分发和监控 Receiver" in this article, the sentence reads awkwardly; please consider revising it.
Hello, my Receiver currently reads data from a Redis zset. Suppose there are 100,000 keys and I've started 100 receivers
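The comment above appears truncated, but for illustration: a common way to split a large key space across many receivers is a stable hash-modulo assignment, so each receiver reads only its own share of the keys. This is a sketch with hypothetical names, not a Spark or Redis API:

```python
import zlib

# Sketch: deterministically assign each key to one of n_receivers, so each
# receiver only reads its own share of the key space. crc32 is used because
# Python's built-in hash() for str is salted per process, which would make
# the assignment unstable across receiver restarts.

def receiver_for_key(key, n_receivers):
    """Stable hash-based assignment of a key to a receiver index."""
    return zlib.crc32(key.encode("utf-8")) % n_receivers

def partition_keys(keys, n_receivers):
    """Group a key list into one bucket per receiver."""
    buckets = [[] for _ in range(n_receivers)]
    for k in keys:
        buckets[receiver_for_key(k, n_receivers)].append(k)
    return buckets

keys = [f"zset:key:{i}" for i in range(100_000)]  # stand-in for the 100k keys
buckets = partition_keys(keys, 100)               # 100 receivers, as in the comment
assert sum(len(b) for b in buckets) == 100_000    # every key assigned exactly once
```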
This is the discussion thread for 《3.3 ReceiverTraker, ReceivedBlockTracker 详解.md》.
If you need to paste code, please copy the following template and modify it:
Thanks!