Hi all,
I found this high-priority-first-flow-controller example (http://community.rti.com/examples/high-priority-first-flow-controller) and tried to increase the value of HPF_MAX_PAYLOAD_SIZE to 2073600, but I am getting the following output on the reader side:
COMMENDFragmentedSampleTable_assertSample:max_fragments_per_sample exceeded
COMMENDFragmentedSampleTable_addFragments:!asserted
COMMENDSrReaderService_onSubmessage:!add fragments
Does anyone know what is missing?
Thanks in advance,
Hi,
Where did you change the value of HPF_MAX_PAYLOAD_SIZE? In the idl file, or in the generated header file?
I just tried changing the IDL and was not able to reproduce the error you observed. I verified that the subscriber reports no problems by changing the hpf.h file first, and also by changing the IDL.
Which version of RTI are you using? Can you upload your edited files?
Hi Sumeet,
First of all, thank you for your prompt reply.
The value I gave you doesn't reproduce the error; the correct one is 6220800. Sorry, my mistake.
I am using the latest version of RTI DDS and you can find the source code that I am using in the attachments.
If you find any error, please let me know.
Thanks in advance.
Hi JoãoMSM,
I was able to reproduce your issue with the larger size. The error is produced because a QoS parameter limits the number of fragments allowed for any given sample. If you add the following line to the DataReader QoS, the error should go away and the data should be received successfully:
Before:
<reader_resource_limits>
<!-- Determines whether the DataReader pre-allocates
storage for storing fragmented samples. This setting
can be used to limit upfront memory allocation costs
in applications which are dealing with large data -->
<dynamically_allocate_fragmented_samples>true</dynamically_allocate_fragmented_samples>
</reader_resource_limits>
After:
<reader_resource_limits>
<!-- Determines whether the DataReader pre-allocates
storage for storing fragmented samples. This setting
can be used to limit upfront memory allocation costs
in applications which are dealing with large data -->
<dynamically_allocate_fragmented_samples>true</dynamically_allocate_fragmented_samples>
<max_fragments_per_sample>1024</max_fragments_per_sample>
</reader_resource_limits>
I've added the new parameter, max_fragments_per_sample, and set its value to 1024 (the default is 512). I've also attached the updated XML. Give it a try and let me know if it works for you.
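For anyone tuning this value for other payload sizes, here is a minimal sketch of the arithmetic behind the limit: a sample is split into roughly sample_size / fragment_payload fragments, and max_fragments_per_sample must be at least that many. The helper below is hypothetical (it is not part of RTI's API), and the 8 KiB fragment payload is purely an illustrative assumption; the actual fragment size depends on your transport configuration (e.g. message_size_max).

```python
import math

# Hypothetical helper (not part of RTI's API): estimate how many
# fragments a single sample will be split into, given the sample
# size and the per-fragment payload size, both in bytes.
def required_fragments(sample_size_bytes, fragment_payload_bytes):
    return math.ceil(sample_size_bytes / fragment_payload_bytes)

# Illustrative only: assume an 8 KiB fragment payload. Your real
# value depends on the transport's message_size_max and overhead.
FRAGMENT_PAYLOAD = 8192

print(required_fragments(2073600, FRAGMENT_PAYLOAD))  # 254
print(required_fragments(6220800, FRAGMENT_PAYLOAD))  # 760
```

Under that assumption, the smaller payload fits within the default limit of 512 fragments, while the larger one does not, which is why raising max_fragments_per_sample makes the reader-side error go away.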
Hi again Sumeet,
It is working perfectly. Thank you very much for your help with this issue.