Persistence Service Performance


Hi,

I've been using Persistence Service (with file storage) to deliver historical data to applications. One issue I've run into: if there are many samples in the backlog, samples currently being published are not received by the reader until the application "catches up". (One data point: with 3 hours' worth of data in the files, it takes about 30 minutes for my application to catch up to real time.) As a workaround, I've created a second reader that does not use the PERSISTENT_DURABILITY_QOS kind, so it receives current samples directly from the writer, but then I need to detect when the persistent durable reader has caught up so that I can stop it.
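For reference, the dual-reader arrangement can be sketched as two DataReader QoS profiles in an XML QoS file, one durable and one volatile. (The library and profile names below are made up; only the durability kinds matter.)

```xml
<qos_library name="HistoryLib">
  <!-- Reader that replays the persisted backlog from Persistence Service -->
  <qos_profile name="PersistentReader">
    <datareader_qos>
      <durability>
        <kind>PERSISTENT_DURABILITY_QOS</kind>
      </durability>
      <reliability>
        <kind>RELIABLE_RELIABILITY_QOS</kind>
      </reliability>
    </datareader_qos>
  </qos_profile>

  <!-- Second reader that only sees live samples from the writer -->
  <qos_profile name="LiveReader">
    <datareader_qos>
      <durability>
        <kind>VOLATILE_DURABILITY_QOS</kind>
      </durability>
      <reliability>
        <kind>RELIABLE_RELIABILITY_QOS</kind>
      </reliability>
    </datareader_qos>
  </qos_profile>
</qos_library>
```

The volatile reader never matches the persisted history, so it stays current while the persistent reader drains the backlog.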

Is there a better approach that would let me access historical data in a more timely fashion?

Thanks,

Anne Fiore


Hello Anne,

    What is the size of the data that you are persisting?  If the samples are relatively small, you can turn on batching for late-joining readers to make that initial transfer a lot smoother.  Persistence Service has a setting called "late_joiner_read_batch".  Please look into this option to get better efficiency out of the service.
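As a rough sketch, the setting goes in the Persistence Service XML configuration. (The service and group names below are placeholders, and the exact element placement can vary between Connext versions, so check the configuration reference for your release.)

```xml
<persistence_service name="HistoryService">
  <persistence_group name="AllTopics">
    <!-- Number of samples read from storage and sent together to a
         late-joining reader; larger batches reduce per-sample overhead
         when draining a big backlog. The value 1000 is just an example. -->
    <late_joiner_read_batch>1000</late_joiner_read_batch>
    <filter>*</filter>
  </persistence_group>
</persistence_service>
```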

Bert


Hi Bert,

Thanks for this information.  I was able to try "late_joiner_read_batch" in Persistence Service.  Unfortunately, it didn't improve performance for my application, probably because of the amount of data I am testing with.  We have a requirement for our applications to be able to retrieve up to 36 hours of data.

Anne


Hello Anne,

    Would it make more sense to record your data with Recorder and then have your applications pull the recorded data in from that database using SQLite access?
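To illustrate the idea, here is a minimal sketch of querying a recorded database with Python's built-in sqlite3 module. The table and column names ("SensorData", "SampleInfo_reception_timestamp") are assumptions standing in for whatever schema Recorder actually produces for your topics; inspect the real file with the sqlite3 shell's ".schema" command first. An in-memory database is used here so the snippet is self-contained.

```python
import sqlite3

# Stand-in for the recorded database file; in practice you would open
# the .dat file Recorder produced, e.g. sqlite3.connect("recording.dat").
conn = sqlite3.connect(":memory:")

# Hypothetical per-topic table mimicking a recorded schema.
conn.execute(
    "CREATE TABLE SensorData ("
    "  SampleInfo_reception_timestamp INTEGER,"  # e.g. nanoseconds
    "  value REAL)"
)
conn.executemany(
    "INSERT INTO SensorData VALUES (?, ?)",
    [(1000, 1.5), (2000, 2.5), (3000, 3.5)],
)

# Pull only the time window the application needs, instead of replaying
# the entire backlog through a persistent DataReader.
start_ts = 2000
rows = conn.execute(
    "SELECT SampleInfo_reception_timestamp, value "
    "FROM SensorData "
    "WHERE SampleInfo_reception_timestamp >= ? "
    "ORDER BY SampleInfo_reception_timestamp",
    (start_ts,),
).fetchall()
print(rows)  # [(2000, 2.5), (3000, 3.5)]
```

The advantage of this route is that the query does the filtering, so fetching the last N hours is a single indexed range scan rather than a full replay of history.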

Bert


Hi Bert,

We looked into using Recording Service and Realtime Connext, but in both cases, because some of our types use large sequences, we violate database constraints (column width, etc.).  We are using Recording Service (for later replay), but we always have to record the data in serialized form, which makes it inaccessible for our applications to query.

We are still investigating alternatives including a custom database.  I appreciate your insights. Let me know if you have other suggestions.

Thanks,

Anne