I have a set of samples sent from a durable DataWriter whose history is (DDS_KEEP_LAST_HISTORY_QOS, depth=1). Messages are sent only when the data changes, without any deadline, so many seconds might pass between updates.
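For concreteness, the writer side looks roughly like the sketch below. This uses the traditional RTI Connext C++ API; the `publisher` and `topic` arguments are placeholders from my application, not anything special.

```cpp
// Sketch only (traditional RTI Connext C++ API): a "last value cache" writer.
#include "ndds/ndds_cpp.h"

DDSDataWriter *create_last_value_writer(DDSPublisher *publisher, DDSTopic *topic)
{
    DDS_DataWriterQos writer_qos;
    publisher->get_default_datawriter_qos(writer_qos);

    // Keep only the most recent sample per instance.
    writer_qos.history.kind  = DDS_KEEP_LAST_HISTORY_QOS;
    writer_qos.history.depth = 1;

    // Durable (transient-local) + reliable, so a newly matched DataReader is
    // sent the last-written sample of each instance, even if it is old.
    writer_qos.durability.kind  = DDS_TRANSIENT_LOCAL_DURABILITY_QOS;
    writer_qos.reliability.kind = DDS_RELIABLE_RELIABILITY_QOS;

    return publisher->create_datawriter(
        topic, writer_qos, NULL /* listener */, DDS_STATUS_MASK_NONE);
}
```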
I have a DataReader that either uses a ContentFilteredTopic, or has had take() called on samples that did not match a QueryCondition's criteria. Either way, I know that, at some point in time, there are some data instances whose last-sent samples did not match the filter and are therefore not in the DataReader's sample cache. While in this state, I change the query expression or parameters. Some of those old samples that are not in the cache would match the new filter. (A sketch of the ContentFilteredTopic case follows.)
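The reader side, in the ContentFilteredTopic case, is roughly the following (again a sketch; the filter expression `value > %0` and the parameter values are made up for illustration):

```cpp
// Sketch only: reader on a ContentFilteredTopic whose parameters are later
// changed in place.  The filter expression and parameter values are made up.
#include "ndds/ndds_cpp.h"

DDSDataReader *create_filtered_reader(DDSDomainParticipant *participant,
                                      DDSSubscriber *subscriber,
                                      DDSTopic *topic,
                                      DDSContentFilteredTopic *&cft_out)
{
    DDS_StringSeq params;
    params.ensure_length(1, 1);
    params[0] = DDS_String_dup("42");

    cft_out = participant->create_contentfilteredtopic(
        "MyFilteredTopic", topic, "value > %0", params);

    // The reader must also be transient-local + reliable to receive the
    // writer's historical (last-value) samples when it first matches.
    DDS_DataReaderQos reader_qos;
    subscriber->get_default_datareader_qos(reader_qos);
    reader_qos.durability.kind  = DDS_TRANSIENT_LOCAL_DURABILITY_QOS;
    reader_qos.reliability.kind = DDS_RELIABLE_RELIABILITY_QOS;

    return subscriber->create_datareader(
        cft_out, reader_qos, NULL /* listener */, DDS_STATUS_MASK_NONE);
}

// Later: widen the filter on the existing reader.  Instances whose last sample
// was filtered out before this call are NOT resent by the DataWriter; the
// reader only sees them again on their next update.
void widen_filter(DDSContentFilteredTopic *cft)
{
    DDS_StringSeq params;
    params.ensure_length(1, 1);
    params[0] = DDS_String_dup("10");
    cft->set_expression_parameters(params);
}
```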
If I created a new DataReader with the same filter expression as my old DataReader, the publication match would trigger the DataWriter to resend its historical samples. The newly created DataReader would end up with a full and consistent set of all the samples that match its criteria, right away.
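Concretely, that recreate path (which does work, but which I would like to avoid) is something like this sketch, reusing the placeholder names above:

```cpp
// Sketch only: the "recreate" alternative.  Deleting the old reader and CFT and
// creating new ones with the updated filter causes the transient-local
// DataWriter to resend its last sample per instance to the new reader.
#include "ndds/ndds_cpp.h"

DDSDataReader *recreate_filtered_reader(DDSDomainParticipant *participant,
                                        DDSSubscriber *subscriber,
                                        DDSTopic *topic,
                                        DDSDataReader *old_reader,
                                        DDSContentFilteredTopic *&cft)
{
    subscriber->delete_datareader(old_reader);     // reader must go before its CFT
    participant->delete_contentfilteredtopic(cft);

    DDS_StringSeq params;
    params.ensure_length(1, 1);
    params[0] = DDS_String_dup("10");              // the new filter parameter

    cft = participant->create_contentfilteredtopic(
        "MyFilteredTopic", topic, "value > %0", params);

    DDS_DataReaderQos reader_qos;
    subscriber->get_default_datareader_qos(reader_qos);
    reader_qos.durability.kind  = DDS_TRANSIENT_LOCAL_DURABILITY_QOS;
    reader_qos.reliability.kind = DDS_RELIABLE_RELIABILITY_QOS;

    return subscriber->create_datareader(
        cft, reader_qos, NULL /* listener */, DDS_STATUS_MASK_NONE);
}
```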
But instead, I want to keep using my existing DataReader with the just-updated filter. However, it won't reach a consistent state until the previously-filtered samples happen to be transmitted again at some point in the future.
Is there any way to force the existing DataReader to explicitly request historical samples, the same way that a new DataReader would, other than destroying and recreating the DataReader itself?
Obviously I could use some sort of reliable command message to prompt the other nodes to manually resend old samples, but then I would have to write code to handle all of that. I'm hoping for a middleware option.
Hello,
I am afraid that there is currently no mechanism that would cause the DataWriter to resend historical samples to a DataReader that changes its filter.
Supporting this is more complex than it would appear at first. The reason is that the DataWriter has already sent samples with sequence numbers beyond the historical ones, and it is not simple to devise a mechanism to resend those samples without confusing the reliability protocol. Similarly, if the samples were "rewritten" with new sequence numbers, then not only would they have to be directed just to the DataReader that changed its filter, they would also arrive at that DataReader "out of order" relative to the samples it already received under the old filter. Finally, there is the question of what to do with samples that were already sent because they passed the old filter and that also pass the new filter: should they be re-sent?
All that said, we have run into this use case before and we are actively considering ways to provide such a mechanism. If you are interested, we can discuss the details with you directly over email; it would be good to get your input on some of the approaches we are considering. It is likely we will implement something in the next 6-12 months.
In the meantime, I think the most practical approach would be one of the following:
Gerardo