Home Bug Reports

duplicate threads created by Ice Java ThreadPool

cgeorgel (Qin Ye), Member. Organization: Personal. Project: compare with corba
Dear Ice Developers:
Our project recently ran into a pressing concurrency problem on the Java server side, almost identical to this post: http://www.zeroc.com/forums/help-center/3888-java-client-multiple-connection-sequential-access.html.
However, the differences we found are:
1. After we tried everything we have learned - including using a multi-threaded server thread pool, setting Ice.ThreadPool.Server.Serialize=0, and giving each client application thread a distinct proxy via the proxy method ice_connectionId() -
our server still appeared to be running in serialized mode: at most one server thread was busy dispatching the right servant method, while another thread was busy in
IceInternal.Selector.select(Selector.java:258) (thread state RUNNING, most recent call: sun.nio.ch.EPollArrayWrapper.epollWait(Native Method) - busy-waiting on the NIO selector, we guess?).
No matter how many threads we start initially via Ice.ThreadPool.Server.Size, after a period of time the server thread pool shrinks to 2 threads.
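The per-thread proxy setup mentioned above can be sketched as follows. This is a self-contained illustration, not Ice code: the `Proxy` class below is a minimal stand-in we wrote so the snippet compiles without the Ice runtime; in the real application the object is an `Ice.ObjectPrx` and `ice_connectionId()` is the actual Ice proxy method that binds a proxy to a named connection.

```java
import java.util.HashSet;
import java.util.Set;

public class ConnectionIdDemo {
    // Minimal stand-in for an Ice proxy: ice_connectionId(id) returns a copy
    // of the proxy bound to a named connection, so proxies with different IDs
    // use different connections (mirroring the real Ice.ObjectPrx behavior).
    static final class Proxy {
        final String endpoint, connectionId;
        Proxy(String endpoint, String connectionId) {
            this.endpoint = endpoint;
            this.connectionId = connectionId;
        }
        Proxy ice_connectionId(String id) { return new Proxy(endpoint, id); }
        // A connection is keyed by (endpoint, connection ID), as in Ice.
        String connectionKey() { return endpoint + "#" + connectionId; }
    }

    public static void main(String[] args) {
        Proxy base = new Proxy("tcp -h 127.0.0.1 -p 10000", "");
        Set<String> connections = new HashSet<>();
        // Each worker thread gets a proxy with its own connection ID,
        // so the eight workers end up on eight distinct connections.
        for (int i = 0; i < 8; i++) {
            Proxy perThread = base.ice_connectionId("worker-" + i);
            connections.add(perThread.connectionKey());
        }
        System.out.println(connections.size()); // → 8
    }
}
```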

Below are our Java server's and Java client's configurations:

Server:
# Server configuration
Ice.Admin.ServerId=pcrfServer-0
Ice.Admin.Endpoints=tcp -h 127.0.0.1
Ice.ProgramName=pcrfServer-0
# Server descriptor properties
IceBox.InstanceName=pcrfServer-0
Ice.ThreadPool.Server.Size=8
Ice.ThreadPool.Server.SizeMax=8
Ice.ThreadPool.Client.Size=1
Ice.ThreadPool.Client.SizeMax=8
Ice.ThreadPool.Server.Serialize=0
Ice.ThreadPool.Client.Serialize=0
Ice.ThreadPriority=8
Ice.Trace.ThreadPool=1

Client:
# Server configuration
Ice.Admin.ServerId=pressClientJava
Ice.Admin.Endpoints=tcp -h 127.0.0.1
Ice.ProgramName=pressClientJava
# Server descriptor properties
george.pressThreadNum=8
Ice.ThreadPool.Client.Size=1
Ice.ThreadPool.Client.SizeMax=8
Ice.ThreadPool.Server.Size=1
Ice.ThreadPool.Server.SizeMax=8
Ice.Trace.ThreadPool=1
george.printResponse=0
george.invokeMode=0
Ice.ThreadPriority=6
Ice.ThreadPool.Server.Serialize=0
Ice.ThreadPool.Client.Serialize=0

We are using IceGrid to deploy our application, so I just copied these from the config files generated by the icegridadmin tool.

We have also tried both the synchronous and the asynchronous method invocation modes, and the results are almost the same.


2. We are using VisualVM from Oracle's JDK 1.7.0_u17 to inspect our Ice Java server's threads, and we found something interesting:

there are 2 kinds of Ice Java server thread name patterns:
(1) ${Ice.ProgramName}-Ice.ThreadPool.Server-${threadPoolSequenceNumber}, which appears to be the normal case according to the Ice Java source code: the IceInternal.ThreadPool.EventHandlerThread constructor (we carefully searched the call references of this constructor) and several ThreadPool properties;

(2) ${Ice.ProgramName}-<part of ProgramName>-Ice.ThreadPool.Server-${threadPoolSequenceNumber}, which is very odd. For instance, in our Ice Java server application, the two styles of server thread names are:
pcrfServer-0-Ice.ThreadPool.Server-0
and
pcrfServer-0-pcrfServer-Ice.ThreadPool.Server-0
respectively:

<<screen-shot-here>>
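For clarity, here is a tiny self-contained sketch of how the two observed names decompose. The helper function is ours, written purely for illustration - it is not the actual Ice source; it only shows that pattern (2) looks as if the thread-pool prefix itself already contained part of the program name.

```java
public class ThreadNameDemo {
    // Compose an Ice-style thread-pool thread name from the program name,
    // the thread-pool prefix, and the thread's sequence number.
    static String threadName(String programName, String poolPrefix, int seq) {
        String base = programName.isEmpty() ? poolPrefix
                                            : programName + "-" + poolPrefix;
        return base + "-" + seq;
    }

    public static void main(String[] args) {
        // Pattern (1): the prefix is just "Ice.ThreadPool.Server".
        System.out.println(threadName("pcrfServer-0",
                "Ice.ThreadPool.Server", 0));
        // → pcrfServer-0-Ice.ThreadPool.Server-0

        // Pattern (2): the prefix already carries part of the program name,
        // producing the doubled form we observed.
        System.out.println(threadName("pcrfServer-0",
                "pcrfServer-Ice.ThreadPool.Server", 0));
        // → pcrfServer-0-pcrfServer-Ice.ThreadPool.Server-0
    }
}
```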

We suspect there may be something wrong with Ice's Java ThreadPool implementation that leads to this serialized-like processing problem.

The Ice version we use is 3.4.1.
OS: Fedora 15 X64 Linux localhost 2.6.43.8-1.fc15.x86_64 #1 SMP Mon Jun 4 20:33:44 UTC 2012 x86_64 x86_64 x86_64 GNU/Linux

In addition, we are running on a multi-CPU machine, yet during our stress tests throughput stays low while the system load is far from heavy.

Comments

  • benoit (Benoit Foucher), Rennes, France. Administrators, ZeroC Staff. Organization: ZeroC, Inc. Project: Ice
    Hi,

    Can you upgrade to the latest Ice version (Ice 3.5.0) and try again to see if you experience the same behavior? Please also update your organization/project name accordingly.

    Cheers,
    Benoit.