
WIP netty: use composite cumulator for perf flamegraph tests #7

Draft

sergiitk wants to merge 1 commit into master from netty-adaptive-cumulator-wip-perf
Conversation

@sergiitk (Owner) commented Oct 5, 2020

No description provided.
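Since the PR carries no description, here is a minimal sketch of what the title refers to, assuming the change follows Netty's standard `ByteToMessageDecoder` API. `COMPOSITE_CUMULATOR` chains incoming buffers into a `CompositeByteBuf` instead of copying them into one contiguous buffer as the default `MERGE_CUMULATOR` does; that copy is the kind of cost the flamegraph tests would measure. The class and method below are illustrative, not the PR's actual diff.

```java
import io.netty.handler.codec.ByteToMessageDecoder;

// Illustrative only: shows the Netty API the PR title names, not the PR's code.
final class CompositeCumulatorExample {
  static void useCompositeCumulator(ByteToMessageDecoder decoder) {
    // COMPOSITE_CUMULATOR accumulates incoming ByteBufs into a
    // CompositeByteBuf, avoiding the memory copy that the default
    // MERGE_CUMULATOR performs when it consolidates buffers.
    decoder.setCumulator(ByteToMessageDecoder.COMPOSITE_CUMULATOR);
  }
}
```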

@sergiitk force-pushed the netty-adaptive-cumulator-wip-perf branch from 2b0c726 to a9de162 on October 7, 2020 at 19:53
@sergiitk changed the title from "netty: use composite cumulator for perf flamegraph tests" to "WIP netty: use composite cumulator for perf flamegraph tests" on Oct 7, 2020
sergiitk pushed a commit that referenced this pull request Jul 5, 2023
The PipeSocket was convenient and avoided real I/O, but the tests that shut down or close the socket while it is connecting or handshaking were triggering a Socket bug in Java (https://bugs.openjdk.org/browse/JDK-8278326). Using a real socket doesn't trigger the bug because the test no longer shares state with the code under test.

Fixes grpc#10228

```
==================
WARNING: ThreadSanitizer: data race (pid=4528)
  Write of size 1 at 0x0000cfb9d5f4 by thread T36 (mutexes: write M0):
    #0 java.net.Socket.setCreated()V Socket.java:687
    #1 java.net.AbstractPlainSocketImpl.create(Z)V AbstractPlainSocketImpl.java:149
    #2 java.net.Socket.createImpl(Z)V Socket.java:477
    #3 java.net.Socket.getImpl()Ljava/net/SocketImpl; Socket.java:540
    #4 java.net.Socket.setTcpNoDelay(Z)V Socket.java:998
    #5 io.grpc.okhttp.OkHttpServerTransport.startIo(Lio/grpc/internal/SerializingExecutor;)V OkHttpServerTransport.java:164
    #6 io.grpc.okhttp.OkHttpServerTransport.lambda$start$0(Lio/grpc/internal/SerializingExecutor;)V OkHttpServerTransport.java:159
    #7 io.grpc.okhttp.OkHttpServerTransport$$Lambda$56.run()V ??
    #8 io.grpc.internal.SerializingExecutor.run()V SerializingExecutor.java:133
    #9 java.util.concurrent.ThreadPoolExecutor.runWorker(Ljava/util/concurrent/ThreadPoolExecutor$Worker;)V ThreadPoolExecutor.java:1130
    #10 java.util.concurrent.ThreadPoolExecutor$Worker.run()V ThreadPoolExecutor.java:630
    #11 java.lang.Thread.run()V Thread.java:830
    #12 (Generated Stub) <null>

  Previous read of size 1 at 0x0000cfb9d5f4 by thread T35 (mutexes: write M1, write M2):
    #0 java.net.Socket.close()V Socket.java:1512
    #1 io.grpc.okhttp.OkHttpServerTransportTest$PipeSocket.close()V OkHttpServerTransportTest.java:1384
    #2 io.grpc.okhttp.OkHttpServerTransportTest.clientCloseDuringHandshake()V OkHttpServerTransportTest.java:290
```
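For illustration, a minimal hypothetical reduction of the pattern the trace describes (not the actual test code): one thread configures a `Socket`, lazily creating its `SocketImpl` and writing the "created" flag, while another thread closes the same instance and reads that flag, with no lock ordering the two accesses.

```java
import java.net.Socket;

// Hypothetical sketch of the race: two threads touch the creation state of
// one shared Socket with no synchronization, which is what TSAN flagged in
// Socket.setCreated() vs. Socket.close().
public final class SharedSocketRace {
  public static void main(String[] args) throws Exception {
    Socket shared = new Socket(); // unconnected; its impl is created lazily

    Thread transport = new Thread(() -> {
      try {
        // getImpl()/createImpl() run here and write the "created" flag.
        shared.setTcpNoDelay(true);
      } catch (Exception ignored) {
        // the concurrent access, not the outcome, is the point of the sketch
      }
    });
    Thread test = new Thread(() -> {
      try {
        shared.close(); // reads the same creation state concurrently
      } catch (Exception ignored) {
      }
    });

    transport.start();
    test.start();
    transport.join();
    test.join();
  }
}
```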
sergiitk pushed a commit that referenced this pull request Feb 27, 2024
As discovered by TSAN, access to the adsStream field is not synchronized.
```
WARNING: ThreadSanitizer: data race (pid=1625)
  Read of size 4 at 0x00009b66fc88 by thread T23 (mutexes: write M0):
    #0 io.grpc.xds.ControlPlaneClient.isReady()Z ControlPlaneClient.java:203
    #1 io.grpc.xds.ControlPlaneClient.readyHandler()V ControlPlaneClient.java:211
    #2 io.grpc.xds.ControlPlaneClient$AdsStream.onReady()V ControlPlaneClient.java:328
    #3 io.grpc.xds.GrpcXdsTransportFactory$EventHandlerToCallListenerAdapter.onReady()V GrpcXdsTransportFactory.java:145
    #4 io.grpc.PartialForwardingClientCallListener.onReady()V PartialForwardingClientCallListener.java:44
    #5 io.grpc.ForwardingClientCallListener.onReady()V ForwardingClientCallListener.java:23
    #6 io.grpc.ForwardingClientCallListener$SimpleForwardingClientCallListener.onReady()V ForwardingClientCallListener.java:40
    #7 io.grpc.PartialForwardingClientCallListener.onReady()V PartialForwardingClientCallListener.java:44
    #8 io.grpc.ForwardingClientCallListener.onReady()V ForwardingClientCallListener.java:23
    #9 io.grpc.ForwardingClientCallListener$SimpleForwardingClientCallListener.onReady()V ForwardingClientCallListener.java:40
    #10 io.grpc.internal.DelayedClientCall$DelayedListener.onReady()V DelayedClientCall.java:497
    #11 io.grpc.internal.ClientCallImpl$ClientStreamListenerImpl$1StreamOnReady.runInternal()V ClientCallImpl.java:781
    #12 io.grpc.internal.ClientCallImpl$ClientStreamListenerImpl$1StreamOnReady.runInContext()V ClientCallImpl.java:772
    #13 io.grpc.internal.ContextRunnable.run()V ContextRunnable.java:37
    #14 io.grpc.internal.SerializingExecutor.run()V SerializingExecutor.java:133
    #15 java.util.concurrent.ThreadPoolExecutor.runWorker(Ljava/util/concurrent/ThreadPoolExecutor$Worker;)V ThreadPoolExecutor.java:1130
    #16 java.util.concurrent.ThreadPoolExecutor$Worker.run()V ThreadPoolExecutor.java:630
    #17 java.lang.Thread.run()V Thread.java:830
    #18 (Generated Stub) <null>

  Previous write of size 4 at 0x00009b66fc88 by thread T4 (mutexes: write M1, write M2, write M3, write M4, write M5):
    #0 io.grpc.xds.ControlPlaneClient$AdsStream.cleanUp()V ControlPlaneClient.java:424
    #1 io.grpc.xds.ControlPlaneClient$AdsStream.close(Ljava/lang/Exception;)V ControlPlaneClient.java:418
    #2 io.grpc.xds.ControlPlaneClient$1.run()V ControlPlaneClient.java:130
    #3 io.grpc.SynchronizationContext.drain()V SynchronizationContext.java:94
    #4 io.grpc.SynchronizationContext.execute(Ljava/lang/Runnable;)V SynchronizationContext.java:126
    #5 io.grpc.xds.XdsClientImpl.shutdown()V XdsClientImpl.java:207
    #6 io.grpc.xds.SharedXdsClientPoolProvider$RefCountedXdsClientObjectPool.returnObject(Ljava/lang/Object;)Lio/grpc/xds/XdsClient; SharedXdsClientPoolProvider.java:144
    #7 io.grpc.xds.SharedXdsClientPoolProvider$RefCountedXdsClientObjectPool.returnObject(Ljava/lang/Object;)Ljava/lang/Object; SharedXdsClientPoolProvider.java:102
    #8 io.grpc.xds.XdsClientFederationTest.cleanUp()V XdsClientFederationTest.java:86
```
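The report shows a plain read in `isReady()` racing with the write in `cleanUp()` that clears the stream reference. Below is a hedged, simplified sketch of that hazard and one conventional remedy, a `volatile` field read once into a local; the names are stand-ins, and the actual fix in gRPC may take a different form (for example, confining access to a SynchronizationContext).

```java
// Simplified stand-in types; not the real ControlPlaneClient.
final class StreamHolder {
  // volatile gives the reader thread a consistent view of the reference
  // that cleanUp() clears from another thread.
  private volatile AdsStreamStub adsStream;

  boolean isReady() {
    AdsStreamStub local = adsStream; // read once so the null check and use agree
    return local != null && local.isReady();
  }

  void cleanUp() {
    adsStream = null; // written from the shutdown thread
  }

  static final class AdsStreamStub {
    boolean isReady() {
      return true;
    }
  }
}
```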