ICE Communicator Runtime Queue Size (v3.6.1)

medawsonjr (Mark E Dawson) | Organization: Belvedere Trading LLC | Project: Typical threaded client/server application in Financial Services | Member

I have an application that calculates options pricing values for financially traded instruments (C++ with OpenMP). Through exhaustive testing, I've determined that the server hardware in question can handle a specific number of simultaneous instrument option price calculations before it begins to hit CPU microarchitecture and OS limits.

Therefore, I've set both Ice.ThreadPool.Server.Size and Ice.ThreadPool.Server.SizeMax to that number to cap the number of simultaneous calculations. Of course, this causes any extra requests arriving during that time to queue in the Ice Communicator runtime. My question is:
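For reference, a minimal configuration along these lines looks like the following (the value 16 is a placeholder for the measured calculation limit):

```
# Placeholder value -- substitute the limit determined by testing.
Ice.ThreadPool.Server.Size=16
Ice.ThreadPool.Server.SizeMax=16
```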

How can I monitor the size (growing & shrinking activity) of the ICE Communicator runtime queue during operation?

Best Answer

  • benoit (Benoit Foucher), Rennes, France | Organization: ZeroC, Inc. | Project: Ice | ZeroC Staff
    Accepted Answer

    Hi,

    You can't monitor this: when all the threads from the thread pool are busy, there's no thread left to read the requests queued on the TCP/IP sockets.

    Clients might still send requests but the sending of these requests will eventually block once the TCP/IP buffers on the server side and the client side are full. If you are using AMI on the client side, the calls won't actually block but the Ice runtime will queue the requests in memory until there's more room in the client TCP/IP socket buffers to send new requests.

    If you want to control the queuing on the server side, you will need to decouple the computation from the Ice thread pool that receives the requests. Basically, you need to implement a worker thread pool dedicated to the computation: the Ice threads receive the requests and queue them with the worker thread pool. This way, you can measure the number of requests queued with the worker thread pool. Note, however, that this might require implementing flow control, because if the clients submit requests faster than the server can process them, the worker thread pool queue may grow indefinitely.

    Let us know if you need additional information on this. And btw, you should consider upgrading to the latest Ice version :smile:

    Cheers,
    Benoit.

