IceStorm Oneway Delivery

Our game publishes iPhone input data to an IceStorm topic. Since most of our invocations are smaller than a TCP packet header, we use a batch oneway proxy and flush it manually roughly every three invocations.

When IceStorm receives these three batched invocations, will it forward them to the subscribers immediately, still batched together?

I would like to avoid three separate packets going out to the subscribers, but I also want the invocations delivered immediately.
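
For reference, here is roughly what our publishing side looks like. This is only a sketch in the Python mapping; GameEvents, InputPublisherPrx, inputEvent, readInputSamples, the topic name, and the proxy string are stand-ins for our actual Slice-generated types and input loop.

    import sys
    import Ice
    import IceStorm
    import GameEvents  # stand-in for our Slice-generated module

    communicator = Ice.initialize(sys.argv)
    try:
        manager = IceStorm.TopicManagerPrx.checkedCast(
            communicator.stringToProxy("IceStorm/TopicManager:tcp -p 10000"))
        topic = manager.retrieve("PlayerInput")  # stand-in topic name

        # Publish with a batch oneway proxy: each invocation is queued in the
        # client and nothing goes on the wire until the batch is flushed.
        publisher = GameEvents.InputPublisherPrx.uncheckedCast(
            topic.getPublisher().ice_batchOneway())

        pending = 0
        for sample in readInputSamples():           # stand-in for our input loop
            publisher.inputEvent(sample)            # queued locally, not sent yet
            pending += 1
            if pending >= 3:
                publisher.ice_flushBatchRequests()  # all three go out in one message
                pending = 0
    finally:
        communicator.destroy()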

Thanks,
Pete

Comments

  • matthew (NL, Canada)
    IceStorm itself doesn't know the events are batched, so it cannot do what you describe. You can use IceStorm batching, but unless the batching delay is very short you will introduce the delay you want to avoid. I suspect the simplest option is to coalesce the messages in the sender (see the sketch at the end of this comment).

    Another possibly important detail: is Glacier2 in the picture? Are you using it?
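
    Here is the coalescing sketch I mentioned. It assumes you add a Slice operation on your publisher interface that takes a sequence of samples, for example "void inputEvents(InputSampleSeq samples);". The names are made up and the Python below is only an illustration.

        # Collect a few samples and send them in a single invocation instead of
        # relying on transport-level batching. "publisher" is your typed
        # publisher proxy; with this approach a plain ice_oneway() proxy is
        # enough, since only one message goes on the wire anyway.
        buffered = []

        def onInputSample(sample):
            buffered.append(sample)
            if len(buffered) >= 3:
                publisher.inputEvents(buffered[:])  # hypothetical sequence-taking operation
                del buffered[:]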
  • After reading the docs, I thought this was the case, but I wanted to be sure.

    I am using Glacier2 for sessions, but I don't see why it matters in this scenario. Perhaps it has something that solves my problem?

    Thanks,
    Pete
  • Interesting, Glacier2 can batch requests at the router level. To be honest, the thought never even crossed my mind.

    IceStorm doesn't let you set a flush interval below 100 ms, but Glacier2 does. Since IceStorm should pump out all three invocations it receives from the batch almost immediately, I suppose setting a very small sleep time on the router would let me batch the outgoing messages with a delay that isn't noticeable (see the configuration sketch at the end of this post).

    Is this the best solution other than combining the requests into one invocation?

    Edit: On second thought, I'm not sure this is a great idea. The first invocation would go out immediately, and the following two would be batched. Very interested in hearing your input...

    Thanks,
    Pete
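
    Edit 2: For the record, this is roughly the router configuration I have in mind. I'm writing the property names from memory, so they need checking against the Glacier2 property reference, and whether the Client.* or Server.* direction applies depends on how the subscriber receives events; here I'm assuming it gets them as callbacks through the router.

        # Queue requests going from servers back to the client (callbacks)
        Glacier2.Server.Buffered=1
        # Flush the queue roughly every 50 ms (the value is in milliseconds)
        Glacier2.Server.SleepTime=50
        # Forward queued oneway requests as a single batch
        Glacier2.Server.AlwaysBatch=1

    By comparison, IceStorm's own batch flush interval (IceStorm.Flush.Timeout, if I remember the property name correctly) can't go below 100 ms.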
  • matthew (NL, Canada)
    Yes, exactly. Setting a batch interval at the Glacier2 level would batch any requests that arrive during that time interval, including any other requests destined for the client.

    I'm not sure whether the first message is sent immediately and the remaining two go out in a later batch... I'll need to check a bit further.
  • matthew (NL, Canada)
    I looked into the Glacier2 question you had above regarding the batching.

    There is a single request queue that handles forwarding messages to all clients. If the queue is idle, it behaves as you say above: the first request arrives and is sent immediately, and the remaining two messages go out 100 ms later (assuming a 100 ms flush time). However, this only happens when the queue is idle for all clients; if you have lots of active clients, that will practically never be the case. What you cannot predict is where the flush boundary falls for any single client, since the queue is shared.

    I suspect this is good enough for your use case, assuming you have lots of clients.
  • Our game will have only one active client most of the time, so I'm going to experiment with Glacier2's sleep time and see what results I get.

    Thanks a lot for the help,
    Pete