Archived

This forum has been archived. Please start a new discussion on GitHub.

Queueing and QOS for calls

I am still learning Ice, so please excuse this question if it is stupid, but I have been unable to find an answer in the manual myself.

In my case I am looking to have a number of servers running over a number of boxes. The users will connect in via many different clients that I have no control over, and some users' requests are more important than others.

The issues I have not been able to work out are:
1. How to prevent a single user from flooding the system so that other users' requests can't get processed. Ideally it would be good to limit it to one request per user at a time.
2. Is there a way to put some users' requests before others in the queue of requests to process first? E.g. members get processed before non-members? I thought having separate servers would allow this, but that is not ideal, as I can't get maximum utilization out of all the servers then.
3. Is there a way to limit the number of requests a server will process at once? In my case my system will be hitting local databases (MySQL), and I generally find that over 20-ish concurrent requests start to degrade performance more than increase throughput.
4. Is it possible to limit the number of requests per period of time, say 1 request per 3 seconds, for a particular group of users or a particular user? (Not sure if user groups are supported or just users.)

One other thing I should note is that users would be able to log in multiple times with different clients; limiting them across all clients would be great, but I suspect it would not be possible.

Regards,

Chris

Comments

  • benoit (Rennes, France)
    Hi Chris,
    I am still learning Ice, so please excuse this question if it is stupid, but I have been unable to find an answer in the manual myself.

    In my case I am looking to have a number of servers running over a number of boxes. The users will connect in via many different clients that I have no control over, and some users' requests are more important than others.

    The issues I have not been able to work out are:
    1. How to prevent a single user from flooding the system so that other users' requests can't get processed. Ideally it would be good to limit it to one request per user at a time.

    If you have an authorization mechanism that restricts each client to a single Ice connection, you can consider enabling serialization with the Ice.ThreadPool.Server.Serialize property. With this property set, the Ice server thread pool serializes the dispatching of requests on a given connection.
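    In an Ice properties file this might look like (a sketch; the property name is from the Ice manual, the value 1 enables serialization):

```
# Dispatch at most one request at a time per connection
Ice.ThreadPool.Server.Serialize=1
```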
    2. Is there a way to put some users' requests before others in the queue of requests to process first? E.g. members get processed before non-members? I thought having separate servers would allow this, but that is not ideal, as I can't get maximum utilization out of all the servers then.

    Ice has no mechanism to prioritize the dispatching of requests over Ice connections: once a server thread reads a request from an Ice connection, the thread dispatches the request right away.

    3. Is there a way to limit the number of requests a server will process at once? In my case my system will be hitting local databases (MySQL), and I generally find that over 20-ish concurrent requests start to degrade performance more than increase throughput.

    By default the Ice server thread pool size is 1, but you can configure it to have more threads by setting the Size and SizeMax thread pool properties. The SizeMax property lets you limit the number of threads dispatching requests.
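    For instance, to cap concurrent dispatch around the 20 concurrent database requests mentioned above, the service's properties might look like this (the values are illustrative):

```
# Start with 4 dispatch threads; grow to at most 20 under load
Ice.ThreadPool.Server.Size=4
Ice.ThreadPool.Server.SizeMax=20
```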

    4. Is it possible to limit the number of requests per period of time, say 1 request per 3 seconds, for a particular group of users or a particular user? (Not sure if user groups are supported or just users.)

    This isn't possible.
    One other thing I should note is that users would be able to log in multiple times with different clients; limiting them across all clients would be great, but I suspect it would not be possible.


    If you want fine-grained control over the dispatching of requests, one option is to add an additional layer for the dispatching of requests, using a work queue for example. Rather than executing the code directly, your servant would queue work items with a work queue. Those work items could be prioritized based on the user that submitted the request and on whether or not that user already has queued requests. Your work queue could also implement some throttling.
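    A minimal sketch of such a prioritized, throttled work queue (all names are illustrative; a real servant would dispatch asynchronously and complete the callbacks from the worker threads):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.PriorityBlockingQueue;
import java.util.concurrent.atomic.AtomicLong;

// Hypothetical work queue: servants enqueue work items instead of executing
// them directly. Members are dispatched before non-members, a sequence number
// keeps FIFO order within a priority, and a per-user cap provides throttling.
class PrioritizedWorkQueue {
    static class WorkItem implements Comparable<WorkItem> {
        final String user;
        final int priority;  // lower value = dispatched first
        final long seq;
        final Runnable task;

        WorkItem(String user, int priority, long seq, Runnable task) {
            this.user = user;
            this.priority = priority;
            this.seq = seq;
            this.task = task;
        }

        public int compareTo(WorkItem other) {
            int c = Integer.compare(priority, other.priority);
            return c != 0 ? c : Long.compare(seq, other.seq);
        }
    }

    private final PriorityBlockingQueue<WorkItem> queue = new PriorityBlockingQueue<>();
    private final Map<String, Integer> pending = new ConcurrentHashMap<>();
    private final AtomicLong seq = new AtomicLong();

    // Throttle: refuse to queue more than one outstanding item per user.
    boolean submit(String user, boolean member, Runnable task) {
        final int maxPerUser = 1;
        if (pending.merge(user, 1, Integer::sum) > maxPerUser) {
            pending.merge(user, -1, Integer::sum);
            return false; // rejected; the servant could reply with a busy error
        }
        queue.put(new WorkItem(user, member ? 0 : 1, seq.getAndIncrement(), task));
        return true;
    }

    // Worker threads call this in a loop; the number of workers caps how many
    // requests run concurrently. (A real worker would block on take().)
    boolean runOne() {
        WorkItem item = queue.poll();
        if (item == null) {
            return false;
        }
        try {
            item.task.run();
        } finally {
            pending.merge(item.user, -1, Integer::sum);
        }
        return true;
    }
}
```

    Running a fixed number of worker threads over runOne() gives you the per-server concurrency limit from question 3, while the priority and the per-user cap address questions 1 and 2.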

    Cheers,
    Benoit.
  • Thanks, that gives me a great starting point.
  • I am still having issues. I have built a simple test project (attached) just to confirm that I managed to configure the multi-threading correctly, but it seems it only uses one of my 2 servers and does not process concurrent requests. I have no idea if the issue is client config, server config, app config or something else.

    Basically I am just trying to get it to accept multiple concurrent requests into the server so that I can build my own queues on top, etc.
  • Just some extra info on my last question. I am currently focusing on a C# client and server, so both the client and the server would be C#, designed so that all the code can handle concurrent threads. To be exact, my system already does this, but I want to replace my existing transport layer with Ice.

    In my case the client uses synchronous methods but may call the same method multiple times from different threads at the same time. I may convert these over to asynchronous calls later on, but at the minute that would not really offer much of a benefit.

    Basically I need my server to be able to handle many requests at once, and for that number of requests to be configurable per node or per server. I have seen the C# async example where you have a worker pool, and that is almost perfect, except that with that example you can only have 1 sync request and multiple async. I need to allow, say, 10 threads to process requests no matter if they are sync or async. Part of this is due to my client being an open source lib instead of a client application: people could use either sync or async calls in their implementations, so on my side I need to process them the same way, queue them, then execute them as soon as one of many threads becomes available.

    Also, a little off topic, but I know there is load balancing. Is there a way to do the load balancing based on a custom property/value (in my case the number of requests queued), so that the node with the fewest requests queued would be used? In my case the queue would be my own custom-implemented work queue, like in the async example.
  • As a test, to try to work out where the threading issue is, I tried making the client async, but sadly it was a total failure and resulted in the Ice framework throwing exceptions that did not seem to help much, or at least that is how it seemed. I plan to go over the async examples tonight to spot the stupid mistake I have made somewhere. It is probably a config thing or something.
  • bernard (Jupiter, FL)
    Hi Chris,

    I just had a quick look at your attachment. In order for your server (here: the IceBox service) to process requests concurrently, you need to configure several threads in its thread pool.

    Here, you're instantiating the HelloService service template, which does not set Ice.ThreadPool.Server.Size ... so you have just one thread in your server thread pool.

    Cheers,
    Bernard
  • Hi Bernard,
    Just to be clear: the ThreadPool setting I put in the config.Grid was not the correct place to set it for the application, and I should instead have placed that setting in the application.xml (Application-HelloService.xml in my case) file, correct?

    I have attached a modified application config. Could you please confirm that I have added the ThreadPool setting in the correct place? I can't test it till tonight when I get home.

    I assume I got the client config right?

    Regards,

    Chris
  • Still only getting a single thread server side, even after setting Ice.ThreadPool.Server.Size=2 in the application.xml on the IceBox.
  • benoit (Rennes, France)
    Hi,

    By default the IceBox server creates one communicator for each service it is running. Those communicators do not share the configuration of the IceBox server. To configure multiple threads for your service communicator, you need to specify the thread pool property in the service template properties. For more information on IceBox and service configuration, see this link in the Ice manual.

    Another option would be to let the service inherit the IceBox communicator properties using the IceBox.InheritProperties property.
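    In the IceBox server's configuration this would be a single property (a sketch; the property name is from the Ice manual):

```
# Services inherit the IceBox server's properties
IceBox.InheritProperties=1
```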

    Cheers,
    Benoit.
  • Is it possible for you to upload the application.xml with the threads specified for the HelloService only? I had a look at the links you posted, and I think I understand how to get it to inherit from the IceBox, but I am still a little unclear about the threads for just the service itself.

    You have a great product from everything I have read in all the doco, but having so much trouble making it multi-threaded is more than a little confusing to me. I have never dealt with a framework like this that is single-threaded by default and seems so complicated to make multi-threaded. I know it can do it, and I am sure it is there in the doco and I am probably missing something very simple, but I just can't seem to get it right. As far as I can tell my code is fine and it is all config related. I am either looking in the wrong spot or I have missed it, but all the C#/.NET examples are single-threaded apart from the async one, which is really strange for .NET/C#, where virtually nothing is single-threaded. Maybe I am just missing something in my design and doing something wrong, but I just don't get why it is this hard to configure.

    Coding wise, I have not coded much so far, but so far it could not be simpler, and I really like the software. Feature wise it sounds great, and something I would want my workplace to use as well if I can convince them. Configuration and server setup wise, it seems a nightmare. Once it is working I am sure it will work perfectly, and the admin app makes it all look like it will be easy, but then you realise that everything in the admin app is really string property/value pairs, and there is not even IntelliSense for what properties are available where, possible values, or context-based help. If you plan to improve one part of the system, I really hope you work on the setup and admin, because it badly needs it, at least from the point of view of someone trying to learn the system. Based on coding apps and features I would probably give it an 8-9/10, but setting up and configuring about a 1-3/10. (Meant as constructive criticism, not having a go at the system; fingers crossed for more demos/examples and an upgraded admin app whenever you do your next updates.)
  • benoit (Rennes, France)
    Hi,

    Sorry to hear you find the configuration confusing. From a closer look at your XML, it appears that you're using the wrong names for the server thread pool properties. The Ice properties to configure the Ice server thread pool are all prefixed with Ice.ThreadPool.Server; properties with the prefix IceBox.ThreadPool.Server won't be recognized. Ice thread pool properties are documented here.

    In some (rare) scenarios where your server or service has multiple Ice object adapters and wants a separate thread pool for each, it's also possible to configure a thread pool per object adapter using the object adapter thread pool properties (documented here). I doubt this is your case here.

    With respect to IceBox, by default each service has its own set of properties to configure the service's Ice communicator. In order to configure the Ice communicator of your service to use multiple server threads, you therefore need to set the Ice.ThreadPool.Server properties in the right location: the configuration of the service.

    With IceGrid XML deployment descriptors, you have to set the Ice.ThreadPool.Server properties under the service template <properties> element as follows:

      <service-template id="HelloService">
        <parameter name="name"/>
        <service name="${name}" entry="ICE Tests.dll:ICE_Tests.Services.Hello.HelloService">
          <description>A very simple service named after ${name}</description>
          <properties>
            <property name="Hello.Identity" value="hello"/>
            <property name="Ice.ThreadPool.Server.Size" value="2"/>
            <property name="Ice.ThreadPool.Server.SizeMax" value="10"/>
          </properties>
          <adapter name="Hello-${name}" endpoints="default" id="Hello-${name}" replica-group="HelloGroup" server-lifetime="false"/>
        </service>
      </service-template>
    

    I hope this is clearer now.

    Cheers,
    Benoit.
  • Thanks, Benoit, for the example! I had an updated version of the properties starting with Ice instead of IceBox, as IceBox gave errors, but I had them under the IceBox config in the application.xml instead of under the service, and that seems to make all the difference.

    It is still not perfect: the client runs up to 10 requests at once initially, as I have a max of 10 threads, but then only ever runs 2 at a time after that. For example, these are my results:

    warning: bzip2.dll could not be loaded (likely due to 32/64-bit mismatch).
    Thread 16 - 6:41:29 AMTestB on Thread 24 says Hello World!
    Thread 19 - 6:41:29 AMTestB on Thread 26 says Hello World!
    Thread 17 - 6:41:29 AMTestB on Thread 18 says Hello World!
    Thread 15 - 6:41:29 AMTestB on Thread 30 says Hello World!
    Thread 18 - 6:41:29 AMTestB on Thread 29 says Hello World!
    Thread 21 - 6:41:29 AMTestB on Thread 25 says Hello World!
    Thread 22 - 6:41:29 AMTestB on Thread 27 says Hello World!
    Thread 10 - 6:41:29 AMTestB on Thread 28 says Hello World!
    Thread 20 - 6:41:30 AMTestB on Thread 31 says Hello World!
    Thread 27 - 6:41:31 AMTestB on Thread 26 says Hello World!
    Thread 16 - 6:41:31 AMTestB on Thread 19 says Hello World!
    Thread 19 - 6:41:32 AMTestB on Thread 31 says Hello World!
    Thread 17 - 6:41:33 AMTestB on Thread 26 says Hello World!
    Thread 15 - 6:41:33 AMTestB on Thread 19 says Hello World!
    Thread 18 - 6:41:34 AMTestB on Thread 31 says Hello World!
    Thread 21 - 6:41:35 AMTestB on Thread 26 says Hello World!
    Thread 22 - 6:41:35 AMTestB on Thread 19 says Hello World!
    Thread 10 - 6:41:36 AMTestB on Thread 31 says Hello World!
    Thread 20 - 6:41:37 AMTestB on Thread 26 says Hello World!
    Thread 27 - 6:41:37 AMTestB on Thread 19 says Hello World!
    Thread 16 - 6:41:38 AMTestB on Thread 31 says Hello World!
    Thread 19 - 6:41:39 AMTestB on Thread 26 says Hello World!
    Thread 17 - 6:41:39 AMTestB on Thread 19 says Hello World!
    Thread 15 - 6:41:40 AMTestB on Thread 31 says Hello World!
    Thread 18 - 6:41:41 AMTestB on Thread 26 says Hello World!
    Thread 21 - 6:41:41 AMTestB on Thread 19 says Hello World!
    Thread 22 - 6:41:42 AMTestB on Thread 31 says Hello World!
    Thread 10 - 6:41:43 AMTestB on Thread 26 says Hello World!
    Thread 20 - 6:41:43 AMTestB on Thread 19 says Hello World!
    Thread 27 - 6:41:44 AMTestB on Thread 31 says Hello World!
    done
    

    It also only ever uses one of the 2 instances for all requests, but I assume that is either a config thing or because I am doing it all in the same client, as I think I read about connection reuse somewhere in the doco. The issue seems to resolve itself if I run multiple clients at once, so it is not a big issue; I am just wondering if it means I have a problem in my client config.
  • Sorry, please ignore the last post; I resolved it by using ice_connectionId to get a different connection for my threads. I generally would not need this, but I just needed to know how to do it for the few cases where I need to connect to more than one of the servers. That said, if there is a better way to spread the load from a single client over all servers, I would be interested.
  • benoit (Rennes, France)
    Hi,

    You can spread the client requests over several servers by disabling connection caching on proxies. This is achieved by creating new proxies with the ice_connectionCached(false) proxy method, for example:
    // Java
    Ice.ObjectPrx proxy = communicator.stringToProxy("hello:tcp -h HOST1 -p 10000:tcp -h HOST2 -p 10000");
    proxy = proxy.ice_connectionCached(false);
    

    See the Connection Establishment section in the Ice manual for a detailed explanation of how connections and proxies work.

    Cheers,
    Benoit.