RTI Connext DDS Micro Version 2.4.11
Batching

This user guide describes RTI Connext DDS Micro's support for receiving batched data over RTPS.

Introduction

This user guide is organized as follows: Overview, Interoperability, Performance, and Example Configuration.

Overview

Batching refers to a mechanism that allows Connext DDS to collect multiple user data DDS samples to be sent in a single network packet, to take advantage of the efficiency of sending larger packets and thus increase effective throughput.

RTI Connext DDS Micro supports receiving batches of user data DDS samples, but does not support any mechanism to collect and send batches of user data.

Receiving batches of user samples is transparent to the application, which receives the samples as if they had been received one at a time. Note, however, that the reception sequence number refers to the sample sequence number, not to the RTPS sequence number used to send RTPS messages; the RTPS sequence number is the sequence number of the entire batch.

An RTI Connext DDS Micro DataReader can receive both batched and non-batched samples.
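Because batch reception is transparent, a DataReader uses the same take/read code whether or not the matched DataWriter batches its samples. The following is a minimal C sketch of an on_data_available callback; the HelloWorld type, its rtiddsgen-generated HelloWorldDataReader and HelloWorldSeq, and the listener wiring are illustrative assumptions, not part of this guide.

#include "HelloWorld.h"         /* assumed rtiddsgen output */
#include "HelloWorldSupport.h"  /* assumed rtiddsgen output */

void HelloWorldSubscriber_on_data_available(void *listener_data,
                                            DDS_DataReader *reader)
{
    HelloWorldDataReader *hw_reader = HelloWorldDataReader_narrow(reader);
    struct HelloWorldSeq sample_seq = DDS_SEQUENCE_INITIALIZER;
    struct DDS_SampleInfoSeq info_seq = DDS_SEQUENCE_INITIALIZER;
    struct DDS_SampleInfo *info = NULL;
    HelloWorld *sample = NULL;
    DDS_ReturnCode_t retcode;
    DDS_Long i;

    /* Take whatever is available. Samples that arrived in a batch are
     * delivered here one at a time, exactly like non-batched samples. */
    retcode = HelloWorldDataReader_take(
            hw_reader, &sample_seq, &info_seq, DDS_LENGTH_UNLIMITED,
            DDS_ANY_SAMPLE_STATE, DDS_ANY_VIEW_STATE, DDS_ANY_INSTANCE_STATE);
    if (retcode != DDS_RETCODE_OK)
    {
        return; /* DDS_RETCODE_NO_DATA or an error: nothing to process */
    }

    for (i = 0; i < HelloWorldSeq_get_length(&sample_seq); ++i)
    {
        info = DDS_SampleInfoSeq_get_reference(&info_seq, i);
        if (info->valid_data)
        {
            sample = HelloWorldSeq_get_reference(&sample_seq, i);
            /* ... process sample ... */
        }
    }

    HelloWorldDataReader_return_loan(hw_reader, &sample_seq, &info_seq);
}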

For a more detailed explanation, please refer to the RTI Connext DDS User's Manual.

Interoperability

RTI Connext DDS Pro supports both sending and receiving batches, whereas RTI Connext DDS Micro supports only receiving them. This feature therefore exists in RTI Connext DDS Micro primarily to interoperate with RTI Connext DDS Pro applications that have enabled batching.

Performance

The purpose of batching is to increase throughput when writing small DDS samples at a high rate. In such cases, throughput can be increased several-fold, approaching much more closely the physical limitations of the underlying network transport.

However, collecting DDS samples into a batch means that they are not sent on the network immediately when the application writes them, which can increase latency. On the other hand, if the application sends data faster than the network can support, an increasing proportion of the network's available bandwidth is spent on acknowledgements and DDS sample resends. In that case, reducing this overhead by turning on batching can decrease latency while increasing throughput.

Example Configuration

This section includes several examples that show how to enable batching in RTI Connext DDS Pro. Each example flushes batches based on a different criterion: the number of samples in the batch, the time elapsed since the batch was started, or the amount of serialized data the batch holds. For more detailed and advanced configuration, please refer to the RTI Connext DDS User's Manual.

Flush a batch once it contains 10 samples:

<datawriter_qos>
    <publication_name>
        <name>HelloWorldDataWriter</name>
    </publication_name>
    <batch>
        <enable>true</enable>
        <max_samples>10</max_samples>
    </batch>
</datawriter_qos>

Flush a batch no later than one second after its first sample is written:

<datawriter_qos>
    <publication_name>
        <name>HelloWorldDataWriter</name>
    </publication_name>
    <batch>
        <enable>true</enable>
        <max_flush_delay>
            <sec>1</sec>
            <nanosec>0</nanosec>
        </max_flush_delay>
    </batch>
</datawriter_qos>

Flush a batch when the serialized user data it holds reaches 8192 bytes:

<datawriter_qos>
    <publication_name>
        <name>HelloWorldDataWriter</name>
    </publication_name>
    <batch>
        <enable>true</enable>
        <max_data_bytes>8192</max_data_bytes>
    </batch>
</datawriter_qos>

Note that max_data_bytes does not include the metadata associated with the batch samples.

Batches must contain whole samples. If a new batch is started and its first sample causes the serialized size to exceed max_data_bytes, RTI Connext Pro will send that sample in its own batch.
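
Batching can also be enabled programmatically on the Pro DataWriter instead of through XML. The sketch below is a rough C equivalent of the profiles above, assuming the RTI Connext DDS Pro C API (ndds_c.h); the helper function name and the pre-existing publisher and topic are assumptions, and the QoS field names should be verified against your Pro version. It combines two flush criteria; the batch is flushed when the first configured limit is reached.

#include "ndds/ndds_c.h"  /* RTI Connext DDS Pro C API (assumed available) */

/* Hypothetical helper: create a Pro DataWriter that batches samples and
 * flushes at 10 samples or 8192 bytes of serialized data, whichever
 * comes first. */
DDS_DataWriter *create_batching_writer(DDS_Publisher *publisher, DDS_Topic *topic)
{
    struct DDS_DataWriterQos writer_qos = DDS_DataWriterQos_INITIALIZER;
    DDS_DataWriter *writer = NULL;

    if (DDS_Publisher_get_default_datawriter_qos(publisher, &writer_qos)
            != DDS_RETCODE_OK)
    {
        return NULL;
    }

    /* Collect samples into batches and flush when either limit is hit */
    writer_qos.batch.enable = DDS_BOOLEAN_TRUE;
    writer_qos.batch.max_samples = 10;
    writer_qos.batch.max_data_bytes = 8192;

    writer = DDS_Publisher_create_datawriter(
            publisher, topic, &writer_qos,
            NULL /* listener */, DDS_STATUS_MASK_NONE);

    DDS_DataWriterQos_finalize(&writer_qos);
    return writer;
}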

