Fix for finagle-chirper locking problem #168

Merged
farquet merged 2 commits into master from finagle-chirper-lock-fix on Jul 1, 2019
Conversation

@farquet (Collaborator) commented on Jul 1, 2019

This solves #164. Throughput is fairly close to the previous version, and seems slightly better on all the VMs I tested.

@farquet (Collaborator, Author) commented on Jul 1, 2019

However, I hit #106 a lot locally on my MacBook, so that issue was apparently not fully resolved (cc @Fithos).

It is independent of this PR's changes, though, because I hit the problem just as often on this commit as on the previous one. It occurs in almost every execution of 90 iterations (the default), roughly once per execution.

Resetting master, feed map size: 5000
Exception in thread "Thread-351" Failure(Address already in use: localhost/127.0.0.1:49954 at remote address: localhost/127.0.0.1:49954. Remote Info: Not Available, flags=0x08) with RemoteInfo -> Upstream Address: Not Available, Upstream id: Not Available, Downstream Address: localhost/127.0.0.1:49954, Downstream label: :49954, Trace Id: 06b6367f50db45df.06b6367f50db45df<:06b6367f50db45df with Service -> :49954
Caused by: com.twitter.finagle.ConnectionFailedException: Address already in use: localhost/127.0.0.1:49954 at remote address: localhost/127.0.0.1:49954. Remote Info: Not Available
	at com.twitter.finagle.netty4.ConnectionBuilder$$anon$1.operationComplete(ConnectionBuilder.scala:99)
	at com.twitter.finagle.netty4.ConnectionBuilder$$anon$1.operationComplete(ConnectionBuilder.scala:78)
	at io.netty.util.concurrent.DefaultPromise.notifyListener0(DefaultPromise.java:511)
	at io.netty.util.concurrent.DefaultPromise.notifyListenersNow(DefaultPromise.java:485)
	at io.netty.util.concurrent.DefaultPromise.notifyListeners(DefaultPromise.java:424)
	at io.netty.util.concurrent.DefaultPromise.tryFailure(DefaultPromise.java:121)
	at io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.connect(AbstractNioChannel.java:290)
	at io.netty.channel.DefaultChannelPipeline$HeadContext.connect(DefaultChannelPipeline.java:1366)
	at io.netty.channel.AbstractChannelHandlerContext.invokeConnect(AbstractChannelHandlerContext.java:545)
	at io.netty.channel.AbstractChannelHandlerContext.connect(AbstractChannelHandlerContext.java:530)
	at io.netty.channel.ChannelDuplexHandler.connect(ChannelDuplexHandler.java:50)
	at io.netty.channel.AbstractChannelHandlerContext.invokeConnect(AbstractChannelHandlerContext.java:545)
	at io.netty.channel.AbstractChannelHandlerContext.connect(AbstractChannelHandlerContext.java:530)
	at io.netty.channel.CombinedChannelDuplexHandler$DelegatingChannelHandlerContext.connect(CombinedChannelDuplexHandler.java:497)
	at io.netty.channel.ChannelOutboundHandlerAdapter.connect(ChannelOutboundHandlerAdapter.java:47)
	at io.netty.channel.CombinedChannelDuplexHandler.connect(CombinedChannelDuplexHandler.java:298)
	at io.netty.channel.AbstractChannelHandlerContext.invokeConnect(AbstractChannelHandlerContext.java:545)
	at io.netty.channel.AbstractChannelHandlerContext.connect(AbstractChannelHandlerContext.java:530)
	at io.netty.channel.AbstractChannelHandlerContext.connect(AbstractChannelHandlerContext.java:512)
	at io.netty.channel.DefaultChannelPipeline.connect(DefaultChannelPipeline.java:1024)
	at io.netty.channel.AbstractChannel.connect(AbstractChannel.java:259)
	at io.netty.bootstrap.Bootstrap$3.run(Bootstrap.java:252)
	at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
	at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:404)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:466)
	at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:897)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at com.twitter.finagle.util.BlockingTimeTrackingThreadFactory$$anon$1.run(BlockingTimeTrackingThreadFactory.scala:23)
	at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
	at java.lang.Thread.run(Thread.java:748)
Caused by: io.netty.channel.AbstractChannel$AnnotatedSocketException: Address already in use: localhost/127.0.0.1:49954
	at sun.nio.ch.Net.connect0(Native Method)
	at sun.nio.ch.Net.connect(Net.java:454)
	at sun.nio.ch.Net.connect(Net.java:446)
	at sun.nio.ch.SocketChannelImpl.connect(SocketChannelImpl.java:648)
	at io.netty.util.internal.SocketUtils$3.run(SocketUtils.java:83)
	at io.netty.util.internal.SocketUtils$3.run(SocketUtils.java:80)
	at java.security.AccessController.doPrivileged(Native Method)
	at io.netty.util.internal.SocketUtils.connect(SocketUtils.java:80)
	at io.netty.channel.socket.nio.NioSocketChannel.doConnect(NioSocketChannel.java:312)
	at io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.connect(AbstractNioChannel.java:254)
	... 24 more
Caused by: java.net.BindException: Address already in use
	... 34 more

The port that is already in use is either the master port or one of the cache ports, depending on the execution.

@farquet requested a review from @Fithos on July 1, 2019 at 10:10
@tkrodriguez (Collaborator) commented

Can't you drop the lock object as well?

@axel22 (Member) commented on Jul 1, 2019

The lock object seems unused and can be removed.

@@ -225,12 +226,11 @@ class FinagleChirper extends RenaissanceBenchmark {
   class Cache(val index: Int, val service: Service[Request, Response])
   extends Service[Request, Response] {
-    val lock = new AnyRef
-    val cache = new mutable.HashMap[String, Buf]
     var count = 0
+    val cache = new concurrent.TrieMap[String, Buf]
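For readers less familiar with Scala's concurrent collections, here is a minimal sketch of the pattern the diff points at. This is illustrative code, not the benchmark's actual Cache: the fetch parameter and the apply body are hypothetical stand-ins for the work the cache avoids.

import scala.collection.concurrent

// Sketch: concurrent.TrieMap is thread-safe, so the explicit lock object
// and the synchronized blocks that guarded the old mutable.HashMap are no
// longer needed.
class CacheSketch[K, V](fetch: K => V) {
  val cache = new concurrent.TrieMap[K, V]

  def apply(key: K): V =
    cache.get(key) match {
      case Some(v) => v          // lock-free read
      case None =>
        val v = fetch(key)       // compute on a miss
        cache.put(key, v)        // thread-safe insert, no external lock
        v
    }
}

Note that the lookup and the insert above are two separate operations, which is the kind of benign race the review comment below refers to.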
A repository Member commented on this change:
Technically speaking, there is a race condition in accessing the cache, but that's probably not very important for the benchmark overall.
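The race is the check-then-act gap between the miss and the insert: two threads can both miss on the same key, compute the value twice, and the later put overwrites the earlier one. As the comment notes, this is likely harmless for the benchmark; at worst a value is computed twice and one write wins. A hedged sketch of how TrieMap's atomic operations could close that gap if it ever mattered (again illustrative, not the benchmark's code):

import scala.collection.concurrent

class AtomicCacheSketch[K, V](fetch: K => V) {
  val cache = new concurrent.TrieMap[K, V]

  // getOrElseUpdate on concurrent.TrieMap is atomic: at most one value is
  // ever stored per key, although fetch may still be evaluated more than
  // once if two threads race into the miss path simultaneously.
  def apply(key: K): V = cache.getOrElseUpdate(key, fetch(key))

  // Equivalent with putIfAbsent, keeping whichever value won the race.
  def applyViaPutIfAbsent(key: K): V =
    cache.get(key).getOrElse {
      val v = fetch(key)
      cache.putIfAbsent(key, v).getOrElse(v)
    }
}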

@farquet merged commit e643e39 into master on Jul 1, 2019
@farquet deleted the finagle-chirper-lock-fix branch on July 2, 2019 at 15:21