An Event-based Capture-and-Compare Approach to Support the Evolution of Systems of Systems

Industrial software systems are often systems of systems (SoS) that evolve continuously to meet new customer requirements or to address technological change. Despite thorough testing of the individual contributing parts, the full behavior of an SoS only emerges at runtime. The systems in an SoS and their interactions thus need to be continuously monitored and checked during operation to determine compliance with requirements. In particular, after changes to one system, it is necessary to check whether the overall SoS still behaves correctly and as intended. Building on an existing monitoring framework, we have developed support for capturing and comparing event traces in SoS. Our approach facilitates and partly automates the identification of differences between event traces, which often indicate undesirable behavior introduced during evolution. In this paper, we motivate the need for monitoring and evolution support in SoS using an industrial example and describe our event-based capture-and-compare approach. We evaluate the applicability and scalability of our tool-supported approach, demonstrating that it can cope with comparing event traces from an industrial SoS. We present our experiences and findings for researchers and practitioners working on the maintenance and evolution of large-scale software systems.
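The core idea of comparing event traces captured before and after a change can be illustrated with sequence alignment. The following is a minimal sketch, not the paper's actual implementation: it assumes events are reduced to plain event-type strings (real traces would carry timestamps and payloads) and uses Python's standard-library `difflib.SequenceMatcher` to align a baseline trace with a trace recorded after an evolution step.

```python
from difflib import SequenceMatcher

def diff_event_traces(baseline, current):
    """Align two event traces and report spans that differ.

    Assumption for illustration: each event is a plain event-type
    string; payloads and timestamps are ignored.
    """
    matcher = SequenceMatcher(a=baseline, b=current, autojunk=False)
    differences = []
    for op, i1, i2, j1, j2 in matcher.get_opcodes():
        if op != "equal":
            # op is "replace", "delete", or "insert"
            differences.append((op, baseline[i1:i2], current[j1:j2]))
    return differences

# Trace captured before a change vs. after: "Ack" was replaced by "Retry".
before = ["Start", "Send", "Ack", "Stop"]
after = ["Start", "Send", "Retry", "Stop"]
print(diff_event_traces(before, after))
# → [('replace', ['Ack'], ['Retry'])]
```

Such an alignment highlights where the post-change trace deviates from the baseline, which an engineer can then inspect to decide whether the deviation reflects intended or undesirable behavior.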
