I'm using a Persistence Service process to store persistent data to disk. How can I read historical data that is beyond the local history of my publisher?
Both my Publisher and my Subscriber use RELIABLE_RELIABILITY_QOS and PERSISTENT_DURABILITY_QOS with KEEP_LAST_HISTORY_QOS and depth = 20.
Samples are generated and published to each of 10 instances in turn. Samples are given a sequence number which is independent of the instance key. Once I'm in a situation where my instances contain more than 20 samples each, how am I able to retrieve an arbitrary set of samples?
I'd like to use dynamic filters such as "seq_num BETWEEN %0 AND %1"
Or maybe "(seq_num BETWEEN %0 AND %1) AND key_num = %2"
Where %0, %1, and %2 can be modified at run-time to retrieve different sets of data.
Where do I start looking?
Thanks
Dallas
Hi Dallas,
You can find a good code example on how to use content filters here. The RTI Connext Core Libraries and Utilities User's Manual also includes an explanation of how to set up content filters--including the use of parameters that can be changed at run-time--in its Chapter 12 (Section 5.4.3).
Basically, to be able to modify a set of parameters on a filter expression at run-time, you have to create your content filtered topic indicating the sequence of parameters that can be modified.
Then, whenever you want to change the value of those parameters, you have to (1) change their value in the parameters string sequence, and (2) call set_expression_parameters() to update the filter.
Regarding your first question, KEEP_LAST_HISTORY_QOS indicates the number of samples per instance that the middleware stores locally for data writers and data readers. If the data writer of your publisher application (or the Persistence Service data writer) keeps only the last 20 samples of each of your 10 instances, it cannot send data readers more samples per instance than it has stored. On the data reader side, only the samples that pass the filter will be stored.
Therefore, if you want to increase the number of samples you receive for each instance, you have to increase the KEEP_LAST_HISTORY_QOS depth on your data writers (including the Persistence Service data writer, if you use one) and, accordingly, the depth on your data readers.
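As a hedged sketch, that depth change could live in a QoS profile in a USER_QOS_PROFILES.xml file like the one below. The profile name and the depth of 100 are placeholders; choose a depth large enough to cover the oldest samples you still want to be able to read back.

```xml
<qos_profile name="DeepHistoryProfile">
  <datawriter_qos>
    <reliability><kind>RELIABLE_RELIABILITY_QOS</kind></reliability>
    <durability><kind>PERSISTENT_DURABILITY_QOS</kind></durability>
    <history>
      <kind>KEEP_LAST_HISTORY_QOS</kind>
      <depth>100</depth> <!-- per-instance depth; raised from 20 -->
    </history>
  </datawriter_qos>
  <datareader_qos>
    <reliability><kind>RELIABLE_RELIABILITY_QOS</kind></reliability>
    <durability><kind>PERSISTENT_DURABILITY_QOS</kind></durability>
    <history>
      <kind>KEEP_LAST_HISTORY_QOS</kind>
      <depth>100</depth>
    </history>
  </datareader_qos>
</qos_profile>
```

Remember that the same depth increase also has to reach the Persistence Service's own DataWriter if one sits between your applications.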
Fernando.
Hi,
Regarding the depth of the history received by the DataReader, I think it is enough if you increase the history depth on the Persistence Service DataWriter. RTI Connext DDS will automatically combine what is in the Persistence Service DataWriter with the history in the original DataWriter (if it is still around), so a DataReader that specified a DURABILITY of TRANSIENT or PERSISTENT will get the larger history.
Gerardo
Thanks for your responses, guys.
I think that answers about half of my question. The example in Content_Filtered_Topic.zip in Fernando's post is useful.
But in that example, the dynamic changes to the content filtered topic only affect the reception of newly published samples.
I'd like to be able to modify this example so that the filter is on the "count" field so that I can read, for example, samples 10 to 15, then 25 to 30, then back to 5 to 10, and so on, arbitrarily back and forth through the entire history.
I've tried to implement a trivial example of this, without success. Using a DataReaderListener (as in the example) is clearly not the way to go; it will only notify me of "new" samples. I've tried using read() on a reader of a content filtered topic, and I've also tried using read_w_condition() on a reader of a non-filtered topic. Neither seems to work. In both cases, when the filter parameters are set to read samples "older" than the "oldest" sample currently in the DataReader's history, no data is read.
So, my (limited) experience so far indicates that content filters will filter data that is within the DataReader's local history. Once data is too "old" to be contained within this local history it seems to be gone forever.
Obviously, once the data samples are gone from the DataReader's local cache, the only way to get the data again would be for the Persistence Service to send the samples again. Is this even possible?
If this is possible, how would it be done?
Alternatively, is there something I've missed? A different solution altogether maybe?
Thanks again
Dallas
Hello,
Your observation below is exactly right:
"dynamic changes to the content filtered topic only affect the reception of newly published samples"
This is how it works:
In this process the DataWriter never re-evaluates the history of samples that were already filtered for that DataReader. The reason is that it would be quite complicated to define the proper behavior. For instance:
These may or may not be problems in particular situations. But it is very hard to get the behavior to be correct and intuitive in general, so for this reason the DataWriter does not go back and replay its history to the DataReader...
So if you need this old history, you have two options, I think:
The first option works because a DataWriter that is configured with DURABILITY kind TRANSIENT_LOCAL, TRANSIENT, or PERSISTENT will keep samples in its cache for future, to-be-discovered DataReaders. When the DataWriter discovers a new DataReader that also specifies non-VOLATILE DURABILITY, the DataWriter will send the data in its history cache. If the DataWriter's HISTORY QoS has been configured with RefilterQosPolicyKind set to DDS_ALL_REFILTER_QOS, then the DataWriter will filter the historical samples and only send the ones that pass the filter. If RefilterQosPolicyKind is set to DDS_NONE_REFILTER_QOS, then it will just send all the samples and let the DataReader filter them, which is probably not what you are looking for.
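To make that concrete, here is a hedged XML QoS sketch. In the Connext versions I'm familiar with, refilter is a field of the HISTORY policy, but please verify the exact placement and enumerator spelling against your version's documentation; the depth of 100 is again just a placeholder.

```xml
<datawriter_qos>
  <durability><kind>PERSISTENT_DURABILITY_QOS</kind></durability>
  <history>
    <kind>KEEP_LAST_HISTORY_QOS</kind>
    <depth>100</depth>
    <!-- re-filter the stored history for each newly matched DataReader -->
    <refilter>ALL_REFILTER_QOS</refilter>
  </history>
</datawriter_qos>
```

With this in place, deleting and re-creating your DataReader (or creating an additional one) with new filter parameters should cause the DataWriter to re-send the historical samples that pass the new filter, per the discovery behavior described above.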
Let us know if this helps,
Gerardo