Architectural Run-time Models for Performance and Privacy Analysis in Dynamic Cloud Applications

Building software systems by composing third-party cloud services promises benefits such as flexibility and scalability. At the same time, it raises major challenges, including limited control over third-party infrastructures and runtime changes that mostly cannot be foreseen during development. While previous research focused on automated adaptation, the increasing complexity and heterogeneity of cloud services, as well as their limited observability, make it evident that we need to allow operators (humans) to engage in the adaptation process. Models are useful for involving humans and for conducting analyses, e.g., of performance and privacy. During operation, the system often drifts away from its design-time models. Run-time models are kept in sync with the underlying system. However, typical run-time models are close to an implementation level of abstraction, which impedes understandability for humans. In this vision paper, we present the iObserve approach to address the aforementioned challenges while considering operation-level adaptation and development-level evolution as two mutually interwoven processes. Central to this perception is an architectural run-time model that is usable for automated adaptation and is simultaneously comprehensible for humans during evolution. The run-time model builds upon a technology-independent monitoring approach. A correspondence model maintains the semantic relationships between monitoring outcomes and architecture models. As an umbrella, a megamodel integrates design-time models, code generation, monitoring, and run-time model update. Currently, iObserve covers the monitoring and analysis phases of the MAPE control loop. We outline a roadmap to include planning and execution activities in iObserve.
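To make the role of the correspondence model more concrete, the following minimal Java sketch shows how technology-level monitoring records (identified here by operation signatures) could be mapped to architecture-level components during run-time model update. All type and method names (CorrespondenceModel, RuntimeModelUpdater, onOperationCall) and the example signatures are hypothetical illustrations and are not taken from the iObserve or Kieker code bases.

import java.util.HashMap;
import java.util.Map;
import java.util.Optional;

// Maps technical identifiers observed by monitoring probes (e.g., fully
// qualified operation signatures) to architecture-level component IDs.
final class CorrespondenceModel {
    private final Map<String, String> operationToComponent = new HashMap<>();

    void register(String operationSignature, String componentId) {
        operationToComponent.put(operationSignature, componentId);
    }

    Optional<String> componentFor(String operationSignature) {
        return Optional.ofNullable(operationToComponent.get(operationSignature));
    }
}

// Consumes monitoring events and aggregates them on the architectural level;
// events without a correspondence entry are ignored.
final class RuntimeModelUpdater {
    private final CorrespondenceModel correspondence;
    private final Map<String, Long> callCountPerComponent = new HashMap<>();

    RuntimeModelUpdater(CorrespondenceModel correspondence) {
        this.correspondence = correspondence;
    }

    void onOperationCall(String operationSignature) {
        correspondence.componentFor(operationSignature)
            .ifPresent(componentId -> callCountPerComponent.merge(componentId, 1L, Long::sum));
    }

    Map<String, Long> usageView() {
        return Map.copyOf(callCountPerComponent);
    }
}

public final class CorrespondenceDemo {
    public static void main(String[] args) {
        CorrespondenceModel correspondence = new CorrespondenceModel();
        correspondence.register("bookstore.CatalogService.getBook(long)", "Catalog");

        RuntimeModelUpdater updater = new RuntimeModelUpdater(correspondence);
        updater.onOperationCall("bookstore.CatalogService.getBook(long)");
        updater.onOperationCall("unmapped.Operation()"); // no correspondence entry, ignored

        System.out.println(updater.usageView()); // prints {Catalog=1}
    }
}

In a full realization, the right-hand side of the mapping would reference elements of an architecture model (e.g., component instances of an architecture description language such as the Palladio Component Model) rather than plain strings, and the aggregated observations would feed analyses such as performance prediction or privacy checking.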
