Joined: 08/25/2015
Posts: 29


Since we have some big types (and want to subscribe to them dynamically), we need to increase type_code_max_serialized_length/type_object_max_serialized_length. Now my question is: what is the memory impact of increasing this value? Why should I not increase it to, say, 32 KB? The documentation does not say anything about it.
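For context, these limits are fields of the DomainParticipant's RESOURCE_LIMITS QoS policy. In an RTI Connext XML QoS profile they might be raised like this (a sketch; the profile and library names and the 32 KB values are illustrative):

```xml
<!-- Illustrative QoS profile raising the TypeCode/TypeObject limits.
     Profile/library names are placeholders; values are examples only. -->
<qos_library name="MyQosLibrary">
    <qos_profile name="LargeTypesProfile" is_default_qos="true">
        <domain_participant_qos>
            <resource_limits>
                <!-- Maximum serialized size of a TypeCode sent in discovery -->
                <type_code_max_serialized_length>32768</type_code_max_serialized_length>
                <!-- Maximum serialized size of a TypeObject sent in discovery -->
                <type_object_max_serialized_length>32768</type_object_max_serialized_length>
            </resource_limits>
        </domain_participant_qos>
    </qos_profile>
</qos_library>
```

Larger values enlarge the endpoint discovery samples that carry the type description, which is where the memory and bandwidth cost discussed below comes from.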




sara
Joined: 01/16/2013
Posts: 79

Hi Christian,

Basically, the endpoint discovery messages (Data(R) and Data(W) in the Wireshark traces) will be bigger. This could affect the overall performance of your system, but you can easily measure whether it affects your requirements.
However, there are several details to take into account when increasing the size of the TypeCode and TypeObject. You can see them in the following articles:


jcwenger
Joined: 01/21/2015
Posts: 11

Not sure if this is applicable to your case or not, but we experinced this problem, driven by large enumerations.  One compromise soltuion in that case is to declare the enum in the IDL, so you can use the symbolic names for the values, but to declare the member variable in the type you're publishing, not as the enum, but as a matching-size integral type.  This allows RTI to not serialize the names of the enum values, which was the lion share of the size of our typecode.  The downside is that, rather than seeing the e_ENUM_VALUE string in, for example, the DDS Spy utility, you see the integer value instead.  You may also need to do some explicit casting, depending on your language.  There's a forum thread some time ago discussing this.