
Getting SocketTimeoutException on Java producer client during load testing


Sean Stephens
I am getting the following "caused by" stack on an IOException occurring in my send method on the basicPublish call.

Caused by: com.rabbitmq.client.ShutdownSignalException: connection error; reason: java.net.SocketTimeoutException: Timeout during Connection negotiation
        at com.rabbitmq.utility.ValueOrException.getValue(ValueOrException.java:67)
        at com.rabbitmq.utility.BlockingValueOrException.uninterruptibleGetValue(BlockingValueOrException.java:33)
        at com.rabbitmq.client.impl.AMQChannel$BlockingRpcContinuation.getReply(AMQChannel.java:343)
        at com.rabbitmq.client.impl.AMQConnection.start(AMQConnection.java:313)
        ... 87 more
Caused by: java.net.SocketTimeoutException: Timeout during Connection negotiation
        at com.rabbitmq.client.impl.AMQConnection.handleSocketTimeout(AMQConnection.java:566)
        at com.rabbitmq.client.impl.AMQConnection.access$500(AMQConnection.java:59)
        at com.rabbitmq.client.impl.AMQConnection$MainLoop.run(AMQConnection.java:541)

My belief at this point is that my producer load has overrun the connection factory pool in writing to the queues.

I have been unable to locate information on settings related to the connection factory pool to make it larger or change the timeout. When the connections do time out, it doesn't look like the pool catches up unless I kill the Producer application. Anyone have advice on how to get past this issue? Is there a way to throttle the load at the Producer connection, or will it be necessary to just add more nodes to handle the incoming load?

I have not implemented publisher "confirms" yet, because it seems they only address message loss, not connection failures under load. Am I wrong about this?

Thanks in advance for your help.

_______________________________________________
rabbitmq-discuss mailing list
[hidden email]
https://lists.rabbitmq.com/cgi-bin/mailman/listinfo/rabbitmq-discuss

Re: Getting SocketTimeoutException on Java producer client during load testing

Michael Klishin
Sean Stephens:

> My belief at this point is that my producer load has overrun the connection factory pool in writing to the queues.
>
> I have been unable to locate information on settings related to the connection factory pool to make it larger or change the timeout.  When the connections do timeout, it doesn't look like the pool catches up unless I kill the Producer application.  Anyone have advice on how to get past this issue?

ConnectionFactory does not pool connections. Can you take a look at the RabbitMQ log to see if
there are any warnings about alarms and blocked publishers?

> Is there a way to throttle the load at the Producer connection, or will it be necessary to just add more nodes to handle the incoming load?

You can implement throttling logic in your own code, but in general you need enough
consumer capacity to keep up with producers, or RabbitMQ will block connections that publish
(there will be a very visible warning in the log).
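Roughly, a counting semaphore is one way to cap in-flight publishes on the producer side (the limit and all names below are illustrative, not from the Java client API):

```java
import java.util.concurrent.Semaphore;

// Illustrative producer-side throttle: bound the number of
// publishes that may be in flight at once. The class name and
// the permit count are assumptions for this sketch.
public class PublishThrottle {
    private final Semaphore inFlight;

    public PublishThrottle(int maxInFlight) {
        this.inFlight = new Semaphore(maxInFlight);
    }

    // Block the producer thread until a slot frees up.
    public void beforePublish() throws InterruptedException {
        inFlight.acquire();
    }

    // Call once the publish has completed (or failed).
    public void afterPublish() {
        inFlight.release();
    }

    public int availableSlots() {
        return inFlight.availablePermits();
    }
}
```

Wrap each basicPublish in beforePublish()/afterPublish() and producers slow down instead of piling up connections when the broker falls behind.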

MK






Re: Getting SocketTimeoutException on Java producer client during load testing

Sean Stephens
On my last test attempt, this is what I received in the RabbitMQ log:

=WARNING REPORT==== 11-Sep-2013::16:02:40 ===
file descriptor limit alarm set.

********************************************************************
*** New connections will not be accepted until this alarm clears ***
********************************************************************


After running two separate loads with only one thread driving each: on the first, 500 messages ran on the producer and 500 were consumed; on the second, only 230 ran. That totals 730, which gets close to the default connection limit. Apparently I'm doing something wrong with my Producer connections.

I've set the FD limit on the Linux server to something abnormally high (1024000), so it shouldn't be the OS causing the problem.

And just to close the loop: the problem is solved now. I was creating a new connection for each sent message and not closing it. The proper way to do it is to create one shared connection and open new channels on that connection for each send. The thing to remember is to close the connections on client reset.
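Roughly, what I ended up with looks like this against the RabbitMQ Java client (class and method names here are my own, and error handling is stripped down for the sketch):

```java
import com.rabbitmq.client.Channel;
import com.rabbitmq.client.Connection;
import com.rabbitmq.client.ConnectionFactory;

// Sketch of the shared-connection pattern: one Connection for the
// whole producer, a short-lived Channel per send. Names are
// illustrative, not from the thread.
public class SharedConnectionPublisher {
    private final ConnectionFactory factory = new ConnectionFactory();
    private Connection connection; // the single shared connection

    public SharedConnectionPublisher(String host) {
        factory.setHost(host);
    }

    // Lazily open (or reopen) the shared connection.
    private synchronized Connection connection() throws Exception {
        if (connection == null || !connection.isOpen()) {
            connection = factory.newConnection();
        }
        return connection;
    }

    // Channels are cheap; connections are not. Open one per send
    // and always close it.
    public void send(String queue, byte[] body) throws Exception {
        Channel ch = connection().createChannel();
        try {
            ch.queueDeclare(queue, true, false, false, null);
            ch.basicPublish("", queue, null, body);
        } finally {
            ch.close();
        }
    }

    // Close the shared connection on client shutdown/reset.
    public synchronized void close() throws Exception {
        if (connection != null && connection.isOpen()) {
            connection.close();
        }
    }
}
```

With one connection instead of one per message, the broker's file descriptor count stays flat under load.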

Thanks for your help!




Re: Getting SocketTimeoutException on Java producer client during load testing

Fazul
This post has NOT been accepted by the mailing list yet.
Hi,

I am also facing the same issue.

Can you please provide detailed steps for the solution you suggested:

'The proper way to do it is to create one shared connection and open new channels on that connection for each send. The thing to remember is to close the connections on client reset.'

This is very urgent for me.

Thanks a lot in advance.