
Load balancing across threads, processes, and nodes

VoidPointer (Lars Oppermann) — Member
Hi All,

I'm very new to Ice and I'm trying to learn about the load-balancing facilities and how I could use them for various services I'm planning to write...

For all of the services I'm going to write, I would like to support load balancing across several physical hosts. This seems to be easily achieved, for instance, by just running the particular service in IceBox and deploying IceBox on the desired boxes...

What is not yet clear to me is how I would go about controlling the number of service requests that a particular node is handling at the same time. Is there a thread pool that can be configured in IceBox that would limit the maximum number of concurrent requests handled on that node? Would it be possible to set different limits for different services?

Is it possible to run services in external processes? I'm using some 3rd-party functionality that is not thread-safe, so I would need to run concurrent requests in their own processes.

Any pointers would be greatly appreciated....



  • benoit (Benoit Foucher) — ZeroC Staff, Rennes, France
    Hi Lars,

    The number of requests which can be dispatched concurrently for a given Ice communicator is configured with the Ice.ThreadPool.Server.Size property. By default, the Ice communicator server thread pool is configured with only one thread. For more information on the Ice threading model, I recommend reading this section in the Ice manual.
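    For illustration, the server thread pool could be configured in the communicator's properties file like this (the sizes shown are arbitrary examples, not recommendations; `SizeMax` is the optional upper bound the pool may grow to under load):

    ```
    # Dispatch up to 4 requests concurrently in this communicator
    Ice.ThreadPool.Server.Size=4
    # Optionally allow the pool to grow under load, up to 8 threads
    Ice.ThreadPool.Server.SizeMax=8
    ```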

    By default, the IceBox server container creates one Ice communicator per service. This communicator can be configured through the service command line options specified with the IceBox service property in the IceBox configuration file, or in the service configuration file if you use one. For an example of how to configure IceBox, I recommend checking out the demo/IceBox/hello example from your Ice distribution.
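    A minimal sketch of an IceBox configuration along those lines (the service name `Hello`, the entry point `HelloService:create`, and the file names are hypothetical placeholders; the general form is `library:entry-point` followed by arguments passed to the service):

    ```
    # config.icebox (hypothetical file name)
    # Load one service; arguments after the entry point are handed to the
    # service, so per-service properties can be supplied here or via a
    # separate configuration file.
    IceBox.Service.Hello=HelloService:create --Ice.Config=config.hello
    ```

    Because each service gets its own communicator, each service's configuration file can set its own `Ice.ThreadPool.Server.Size`, which answers the question about different limits for different services.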

    Since the default thread pool size is 1, your servants which eventually call into the non-thread-safe library won't need synchronization. You can run multiple IceBox processes if you want to allow concurrent calls to your service (provided by multiple IceBox services, each service being hosted within its own IceBox process). However, since you'll run separate instances of the service, be aware that these instances won't share a common state. So this won't work if your service needs to maintain state that must be accessible by all the service instances.

    Another option to deal with this non-thread-safe 3rd-party library might be to synchronize calls to it. If you use C++, Ice provides some synchronization primitives in the IceUtil namespace; you'll find more information about them in this chapter.
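    A minimal sketch of that approach: a servant-side wrapper that serializes calls into the non-thread-safe library. `std::mutex` is used here so the example is self-contained; with Ice you would typically use `IceUtil::Mutex` with an `IceUtil::Mutex::Lock` for the same RAII locking pattern. The class and function names are hypothetical placeholders.

    ```cpp
    #include <mutex>

    // Hypothetical guard serializing access to a non-thread-safe
    // 3rd-party library. Only one caller at a time can be inside
    // process(), even when the server thread pool has several threads.
    class LegacyLibraryGuard
    {
    public:
        int process(int input)
        {
            std::lock_guard<std::mutex> lock(_mutex); // held until return
            return legacyCompute(input);              // the unsafe call
        }

    private:
        // Stand-in for the real 3rd-party function.
        int legacyCompute(int input) { return input * 2; }

        std::mutex _mutex;
    };
    ```

    The trade-off versus the one-thread default is that unrelated requests can still be dispatched in parallel; only calls that touch the legacy library are serialized.
    
    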

  • VoidPointer (Lars Oppermann) — Member

    Thanks for the detailed reply, these are great starting points. If this works out as I think it should, Ice might indeed save me a lot of the ceremonial coding I would need to do if I implemented my architecture on JMS distributed queues and hand-coded service dispatching, as I originally intended.
