Using autopurge_disposed_samples_delay


Hi,

I'd like to use autopurge_disposed_samples_delay to manage reader resources.

Essentially, I'd like the instance to be purged as soon as it becomes NOT_ALIVE_DISPOSED, so I've set the delay to 1 nanosecond (the minimum).
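
For reference, this is a rough sketch of what I mean, using the classic DDS C++ API; the header name is a placeholder and the exact types/constants may differ between vendors:

    // Placeholder include: use your vendor's DDS header here
    // (e.g. the OpenSplice or Connext classic C++ header).
    #include "dds_dcps.h"

    // Assumes a subscriber and topic already exist.
    void create_reader_with_autopurge(DDS::Subscriber* subscriber, DDS::Topic* topic)
    {
        DDS::DataReaderQos reader_qos;
        subscriber->get_default_datareader_qos(reader_qos);

        // Purge disposed instances (NOT_ALIVE_DISPOSED) as soon as possible:
        // the minimum non-zero delay of 1 nanosecond.
        reader_qos.reader_data_lifecycle.autopurge_disposed_samples_delay.sec = 0;
        reader_qos.reader_data_lifecycle.autopurge_disposed_samples_delay.nanosec = 1;

        DDS::DataReader* reader = subscriber->create_datareader(
            topic, reader_qos, NULL /* no listener */, DDS::STATUS_MASK_NONE);
        // ... use the reader ...
    }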

What I'd like to know is whether this could affect performance in some way.

Does it mean there is a task running every nanosecond that goes over all the instances on the DataReader, even if all of them are ALIVE?

What is the best practice when using this setting?


Thanks,

Michael