5.12. Batching
This section is organized as follows:

5.12.1. Overview
5.12.2. Interoperability
5.12.3. Performance
5.12.4. Example Configuration
5.12.1. Overview
Batching refers to a mechanism that collects multiple user data DDS samples so they can be sent in a single network packet, taking advantage of the efficiency of sending larger packets and thus increasing effective throughput.
Connext Micro supports receiving batches of user data DDS samples, but does not support any mechanism to collect and send batches of user data.
Receiving batches of user samples is transparent to the application, which receives them as if they had arrived one at a time. Note, however, that the reception sequence number refers to the individual sample's sequence number, not to the RTPS sequence number used to send the RTPS message; the RTPS sequence number is the sequence number of the entire batch.
A Connext Micro DataReader can receive both batched and non-batched samples.
For a more detailed explanation, please refer to the BATCH QosPolicy section in the RTI Connext DDS Core Libraries User’s Manual.
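As an illustration of this transparency, the following minimal C sketch takes samples from a Connext Micro DataReader; whether a sample arrived in a batch or on its own makes no difference to this code. The HelloWorld type, the reader variable, and the omission of error handling are assumptions made for brevity.

    struct DDS_SampleInfoSeq info_seq = DDS_SEQUENCE_INITIALIZER;
    struct HelloWorldSeq sample_seq = DDS_SEQUENCE_INITIALIZER;
    DDS_ReturnCode_t retcode;
    DDS_Long i;

    /* Take whatever samples are available; batched samples are delivered
     * to the application exactly like non-batched ones. */
    retcode = HelloWorldDataReader_take(reader, &sample_seq, &info_seq,
                                        DDS_LENGTH_UNLIMITED,
                                        DDS_ANY_SAMPLE_STATE,
                                        DDS_ANY_VIEW_STATE,
                                        DDS_ANY_INSTANCE_STATE);
    if (retcode == DDS_RETCODE_OK)
    {
        for (i = 0; i < HelloWorldSeq_get_length(&sample_seq); ++i)
        {
            struct DDS_SampleInfo *info =
                DDS_SampleInfoSeq_get_reference(&info_seq, i);
            if (info->valid_data)
            {
                /* Process one sample; HelloWorld is an assumed example type. */
                HelloWorld *sample = HelloWorldSeq_get_reference(&sample_seq, i);
            }
        }
        HelloWorldDataReader_return_loan(reader, &sample_seq, &info_seq);
    }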
5.12.2. Interoperability
RTI Connext Professional supports both sending and receiving batches, whereas RTI Connext Micro supports only receiving batches. Thus, this feature primarily exists in Connext Micro to interoperate with RTI Connext applications that have enabled batching. A Connext Micro DataReader can receive both batched and non-batched samples.
5.12.3. Performance
The purpose of batching is to increase throughput when writing small DDS samples at a high rate. In such cases, throughput can be increased several-fold, much more closely approaching the physical limits of the underlying network transport.
However, collecting DDS samples into a batch implies that they are not sent on the network immediately when the application writes them; this can potentially increase latency. On the other hand, if the application sends data faster than the network can support, an increased proportion of the network’s available bandwidth will be spent on acknowledgements and DDS sample resends. In this case, reducing that overhead by turning on batching could decrease latency while increasing throughput.
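As a sketch of one way to balance this tradeoff (the DataWriter name and values below are illustrative), max_samples can be combined with max_flush_delay so that a batch is flushed either when it reaches 10 samples or after at most 10 milliseconds, bounding the latency added by batching:

    <datawriter_qos>
        <publication_name>
            <name>HelloWorldDataWriter</name>
        </publication_name>
        <batch>
            <enable>true</enable>
            <max_samples>10</max_samples>
            <max_flush_delay>
                <sec>0</sec>
                <nanosec>10000000</nanosec>
            </max_flush_delay>
        </batch>
    </datawriter_qos>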
5.12.4. Example Configuration
This section includes several examples that explain how to enable batching in RTI Connext Professional. For more detailed and advanced configuration, please refer to the RTI Connext DDS Core Libraries User’s Manual.
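The snippets below show only the <datawriter_qos> element. In practice, they would typically be placed inside a QoS profile in an XML QoS file loaded by the Connext Professional application; the library and profile names in the following sketch are illustrative:

    <dds>
        <qos_library name="BatchingLibrary">
            <qos_profile name="BatchingProfile" is_default_qos="true">
                <datawriter_qos>
                    <!-- batch settings from the examples below -->
                </datawriter_qos>
            </qos_profile>
        </qos_library>
    </dds>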
This configuration ensures that a batch contains at most 10 samples; the batch is flushed and sent as soon as that limit is reached:
    <datawriter_qos>
        <publication_name>
            <name>HelloWorldDataWriter</name>
        </publication_name>
        <batch>
            <enable>true</enable>
            <max_samples>10</max_samples>
        </batch>
    </datawriter_qos>
This configuration ensures that a batch is automatically flushed after the delay specified by max_flush_delay. The delay is measured from the time the first sample in the batch is written by the application:
    <datawriter_qos>
        <publication_name>
            <name>HelloWorldDataWriter</name>
        </publication_name>
        <batch>
            <enable>true</enable>
            <max_flush_delay>
                <sec>1</sec>
                <nanosec>0</nanosec>
            </max_flush_delay>
        </batch>
    </datawriter_qos>
The following configuration ensures that a batch is flushed automatically when max_data_bytes is reached (in this example, 8192 bytes):
    <datawriter_qos>
        <publication_name>
            <name>HelloWorldDataWriter</name>
        </publication_name>
        <batch>
            <enable>true</enable>
            <max_data_bytes>8192</max_data_bytes>
        </batch>
    </datawriter_qos>
Note that max_data_bytes does not include the metadata associated with the batch samples.
Batches must contain whole samples. If a new batch is started and its initial sample causes the serialized size to exceed max_data_bytes, RTI Connext Professional will send the sample in a single batch.