Performance evaluation

Building the large and complex distributed systems required in Defence poses challenges in ensuring that the functional system specifications for processing and network performance are achieved while the non-functional properties of space, weight and power are optimised. Our current research into performance measurement and prediction models system behaviour and interactions and observes execution on several alternative deployment environments. We use RTI DDS extensively as the middleware for data mining and logging of system executions. Observation of the execution is enhanced by RTI DDS's built-in management topic for publications and its dynamic data features. Data Distribution Service (DDS) middleware enables a distributed system to be modelled and executed on a wide range of hardware, from IBM blade servers through to low-cost single-board computers such as the Raspberry Pi. Such a broad range of deployment environments is essential to our current research into software architectures, model-driven engineering and distributed systems design.

Further information about this research project is available from:
http://blogs.adelaide.edu.au/dig/

The University of Adelaide

Measuring Performance with PerfTest Utility

This video guides users through architecting, building, and running tests with PerfTest, a free utility that measures throughput and latency when using RTI Connext DDS.
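PerfTest's latency test uses a round-trip (echo) pattern: the publisher timestamps a sample, the subscriber echoes it back, and half the round-trip time is taken as the one-way latency. The following is a minimal, DDS-free sketch of that measurement technique; the in-process queues stand in for the Connext DDS transport and are purely illustrative, not the PerfTest implementation.

```python
import queue
import statistics
import threading
import time

def echo_subscriber(req_q, rep_q):
    # Echo each received sample back, like PerfTest's latency-test subscriber.
    while True:
        sample = req_q.get()
        if sample is None:  # shutdown sentinel
            break
        rep_q.put(sample)

def run_latency_test(num_samples=1000, payload_size=100):
    req_q, rep_q = queue.Queue(), queue.Queue()
    t = threading.Thread(target=echo_subscriber, args=(req_q, rep_q))
    t.start()
    payload = b"x" * payload_size
    latencies_us = []
    for _ in range(num_samples):
        start = time.perf_counter()
        req_q.put(payload)              # "publish" the timestamped sample
        rep_q.get()                     # block until the echo comes back
        rtt = time.perf_counter() - start
        latencies_us.append(rtt / 2 * 1e6)  # one-way = half the round trip
    req_q.put(None)
    t.join()
    return statistics.median(latencies_us)

median_us = run_latency_test()
print(f"median one-way latency: {median_us:.1f} us")
```

Reporting the median (or a high percentile) rather than the mean is the usual choice for latency, since occasional scheduling hiccups skew averages.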

This is module #26 of 27 in RTI Connext™ DDS Online Training, part of the RTI eLearning program. Watch other free modules in the RTI eLearning program.

Real-time availability of information is of utmost importance in large-scale distributed interactive simulation over network-centric communication. Information generated by multiple federates must be distributed and made available to interested parties while providing the required QoS for consistent communication. The remainder of this project discusses design alternatives for realizing a high-performance distributed interactive simulation (DIS) application using the OMG Data Distribution Service (DDS), a QoS-enabled publish/subscribe platform standard for time-critical, data-centric, large-scale distributed networks. The considered application, in the civil domain, is used for remote education in driving schools. An experimental design evaluates the bandwidth and latency performance of DDS, and a comparison with High Level Architecture (HLA) performance is given.
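The bandwidth side of such an evaluation typically counts how many samples of a given payload size can be pushed in a fixed time window and converts that to samples/sec and Mbps. As a hedged, DDS-free sketch of that bookkeeping (the fixed buffer copy below stands in for publishing over the network and is purely illustrative):

```python
import time

def measure_throughput(payload_size=1024, duration_s=0.2):
    """Count samples "published" in a fixed window; report samples/s and Mbps."""
    payload = b"x" * payload_size
    buf = bytearray(payload_size)   # fixed-size stand-in for the send buffer
    sent = 0
    deadline = time.perf_counter() + duration_s
    while time.perf_counter() < deadline:
        buf[:] = payload            # "publish" one sample
        sent += 1
    samples_per_s = sent / duration_s
    mbps = samples_per_s * payload_size * 8 / 1e6
    return samples_per_s, mbps

sps, mbps = measure_throughput()
print(f"{sps:.0f} samples/s, {mbps:.1f} Mbps")
```

Sweeping `payload_size` over the message sizes the simulation actually exchanges is what makes such an experiment comparable with the HLA figures.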