Performance Benchmarking of Application Monitoring Frameworks
[1] Wilhelm Hasselbring,et al. Kieker: continuous monitoring and on demand visualization of Java software behavior , 2008, ICSE 2008.
[2] Walter F. Tichy,et al. Status of Empirical Research in Software Engineering , 2006, Empirical Software Engineering Issues.
[3] James M. Bieman,et al. Software Metrics: A Rigorous and Practical Approach, Third Edition , 2014 .
[4] William G. Griswold,et al. An Overview of AspectJ , 2001, ECOOP.
[5] Wilhelm Hasselbring,et al. An adaptation framework enabling resource-efficient operation of software systems , 2009 .
[6] S. V. Subrahmanya,et al. Object driven performance testing of Web applications , 2000, Proceedings First Asia-Pacific Conference on Quality Software.
[7] Anthony J. G. Hey,et al. Jim Gray on eScience: a transformed scientific method , 2009, The Fourth Paradigm.
[8] Bobby Woolf,et al. Enterprise Integration Patterns , 2003 .
[9] Ralf H. Reussner,et al. SKaMPI: A Detailed, Accurate MPI Benchmark , 1998, PVM/MPI.
[10] Heiko Koziolek,et al. Measuring Performance Metrics: Techniques and Tools , 2005, Dependability Metrics.
[11] Frank Yellin,et al. The Java Virtual Machine Specification , 1996 .
[12] Virgílio A. F. Almeida,et al. Performance by Design - Computer Capacity Planning By Example , 2004 .
[13] Henry C. Lucas,et al. Performance Evaluation and Monitoring , 1971, CSUR.
[14] Markus Dahm. Byte Code Engineering with the BCEL API , 2007 .
[15] Connie U. Smith,et al. Best Practices for Software Performance Engineering , 2003, Int. CMG Conference.
[16] Allen D. Malony,et al. Performance Measurement Intrusion and Perturbation Analysis, 1992, IEEE Trans. Parallel Distributed Syst.
[17] Petr Tuma,et al. CORBA benchmarking: a course with hidden obstacles , 2003, Proceedings International Parallel and Distributed Processing Symposium.
[18] Michael D. Bond,et al. Continuous path and edge profiling , 2005, 38th Annual IEEE/ACM International Symposium on Microarchitecture (MICRO'05).
[19] Richard Mortier,et al. Using Magpie for Request Extraction and Workload Modelling , 2004, OSDI.
[20] Jin Shao,et al. A Runtime Model Based Monitoring Approach for Cloud , 2010, 2010 IEEE 3rd International Conference on Cloud Computing.
[21] David A. Patterson,et al. How to Have a Bad Career in Research/Academia , 2005 .
[22] David Lilja,et al. Statistical Techniques for Computer Performance Analysis , 2005 .
[23] Lieven Eeckhout,et al. Javana: a system for building customized Java program analysis tools , 2006, OOPSLA '06.
[24] Viktor Mauch,et al. Site specific monitoring of multiple information systems – the HappyFace Project , 2010 .
[25] Mark W. Johnson. Monitoring and Diagnosing Applications with ARM 4.0 , 2004, Int. CMG Conference.
[26] Michael D. Bond,et al. LeakChaser: helping programmers narrow down causes of memory leaks , 2011, PLDI '11.
[27] Radu Grosu,et al. Software monitoring with bounded overhead , 2008, 2008 IEEE International Symposium on Parallel and Distributed Processing.
[28] Tomáš Kalibera. Performance in Software Development Cycle: Regression Benchmarking , 2006 .
[29] Alexander L. Wolf,et al. A Benchmark Suite for Distributed Publish/Subscribe Systems , 2002 .
[30] Thilo Focke. Performance Monitoring von Middleware-basierten Applikationen [Performance monitoring of middleware-based applications], 2006.
[31] Steven P. Reiss,et al. Controlled dynamic performance analysis , 2008, WOSP '08.
[32] Gregor Kiczales,et al. Using aspectC to improve the modularity of path-specific customization in operating system code , 2001, ESEC/FSE-9.
[33] Ed Urban,et al. Ocean data publication cookbook , 2013 .
[34] Wilhelm Hasselbring,et al. Model-Driven Instrumentation for Dynamic Analysis of Legacy Software Systems , 2011, Softwaretechnik-Trends.
[35] Karen L. Karavanic,et al. Trace profiling: Scalable event tracing on high-end parallel systems, 2012, Parallel Comput.
[36] Wilhelm Hasselbring,et al. Ein Vorgehensmodell für Performance-Monitoring von Informationssystemlandschaften [A process model for performance monitoring of information system landscapes], 2006, EAI.
[37] Alexandru Iosup,et al. Benchmarking in the Cloud: What It Should, Can, and Cannot Be , 2012, TPCTC.
[38] Antonia Zhai,et al. Efficient dynamic program monitoring on multi-core systems, 2011, J. Syst. Archit.
[39] Shigeru Chiba,et al. Mostly modular compilation of crosscutting concerns by contextual predicate dispatch , 2010, OOPSLA.
[40] Jens Ehlers,et al. Self-Adaptive Performance Monitoring for Component-Based Software Systems , 2012, Softwaretechnik-Trends.
[41] Susan L. Graham,et al. Gprof: A call graph execution profiler , 1982, SIGPLAN '82.
[42] Tony Field,et al. GILK: A Dynamic Instrumentation Tool for the Linux Kernel , 2002, Computer Performance Evaluation / TOOLS.
[43] Zora Konjovic,et al. Towards performance monitoring overhead reduction , 2013, 2013 IEEE 11th International Symposium on Intelligent Systems and Informatics (SISY).
[44] Matthew Arnold,et al. A concurrent dynamic analysis framework for multicore hardware , 2009, OOPSLA.
[45] Shari Lawrence Pfleeger,et al. Preliminary Guidelines for Empirical Research in Software Engineering, 2002, IEEE Trans. Software Eng.
[46] Christof Ebert,et al. Messung und Bewertung von Software [Measurement and evaluation of software], 2013, Informatik-Spektrum.
[47] Arthur B. Maccabe,et al. Lightweight Online Performance Monitoring and Tuning with Embedded Gossip , 2009, IEEE Transactions on Parallel and Distributed Systems.
[48] Joseph Gil,et al. A microbenchmark case study and lessons learned , 2011, SPLASH Workshops.
[49] Amer Diwan,et al. The DaCapo benchmarks: Java benchmarking development and analysis, 2006, OOPSLA '06.
[50] Wilhelm Hasselbring,et al. Capturing provenance information with a workflow monitoring extension for the Kieker framework , 2012, SWPM@ESWC.
[51] Sören Frey,et al. Conformance checking and simulation-based evolutionary optimization for deployment and reconfiguration of software in the cloud , 2014, Softwaretechnik-Trends.
[52] Felix Magedanz,et al. Dynamic analysis of .NET applications for architecture-based model extraction and test generation , 2011 .
[53] Samuel Kounev. Performance Engineering of Distributed Component-Based Systems - Benchmarking, Modeling and Performance Prediction, Dissertation, Fachbereich Informatik, TU Darmstadt, 2005.
[54] James R. Larus,et al. Exploiting hardware performance counters with flow and context sensitive profiling , 1997, PLDI '97.
[55] Allen D. Malony,et al. Overhead Compensation in Performance Profiling, 2004, Parallel Process. Lett.
[56] Petr Tuma,et al. Repeated results analysis for middleware regression benchmarking , 2005, Perform. Evaluation.
[57] Wilhelm Hasselbring,et al. Workload-intensity-sensitive timing behavior analysis for distributed multi-user software systems , 2010, WOSP/SIPEW '10.
[58] Eelco Visser,et al. A survey of strategies in rule-based program transformation systems, 2005, J. Symb. Comput.
[59] Petr Tuma,et al. Benchmark Precision and Random Initial State , 2005 .
[60] Heiko Koziolek,et al. Performance evaluation of component-based software systems: A survey , 2010, Perform. Evaluation.
[61] Wilhelm Hasselbring,et al. Kieker: a framework for application performance monitoring and dynamic software analysis , 2012, ICPE '12.
[62] Clemens A. Szyperski,et al. Component software - beyond object-oriented programming , 2002 .
[63] Marc Feeley,et al. Portable and Efficient Run-time Monitoring of JavaScript Applications Using Virtual Machine Layering , 2014, ECOOP.
[64] Matthias S. Müller,et al. Developing Scalable Applications with Vampir, VampirServer and VampirTrace , 2007, PARCO.
[65] Karen L. Karavanic,et al. Towards Scalable Event Tracing for High End Systems , 2007, HPCC.
[66] Dimosthenis Kyriazis,et al. A Self-adaptive hierarchical monitoring mechanism for Clouds, 2012, J. Syst. Softw.
[67] Paola Inverardi,et al. Software Performance: state of the art and perspectives , 2003 .
[68] Wilhelm Hasselbring,et al. A Concurrent and Distributed Analysis Framework for Kieker , 2013, KPDAYS.
[69] Claes Wohlin,et al. Empirical Research Methods in Web and Software Engineering , 2006, Web Engineering.
[70] Konrad Hinsen. Caring for Your Data , 2012, Computing in Science & Engineering.
[71] Martin Zloch. Automatisierte Durchführung und Auswertung von Microbenchmarks in Continuous Integration Systemen [Automated execution and evaluation of microbenchmarks in continuous integration systems], 2014.
[72] James Noble,et al. InspectJ: Program Monitoring for Visualisation Using AspectJ , 2003, ACSC.
[73] Mario Piattini,et al. Towards a consistent terminology for software measurement, 2006, Inf. Softw. Technol.
[74] Felix C. Freiling,et al. On Metrics and Measurements , 2005, Dependability Metrics.
[75] Heiko Koziolek,et al. CoCoME - The Common Component Modeling Example , 2007, CoCoME.
[76] Xu Chen,et al. Binary Code Analysis , 2013, Computer.
[77] Vladimir Stantchev,et al. Performance Evaluation of Cloud Computing Offerings , 2009, 2009 Third International Conference on Advanced Engineering Computing and Applications in Sciences.
[78] Todd C. Mowry,et al. Butterfly analysis: adapting dataflow analysis to dynamic parallel monitoring , 2010, ASPLOS XV.
[79] Steve Wilson,et al. Java Platform Performance - Strategies and Tactics , 2000 .
[80] Bernd Hamann,et al. State of the Art of Performance Visualization , 2014, EuroVis.
[81] Raúl Izquierdo,et al. The Runtime Performance of invokedynamic: An Evaluation with a Java Library , 2014, IEEE Software.
[82] Wilhelm Hasselbring,et al. A Comparison of the Influence of Different Multi-core Processors on the Runtime Overhead for Application-Level Monitoring , 2012, MSEPT.
[83] Steffen Becker,et al. Decision support via automated metric comparison for the palladio-based performance blame analysis , 2013, ICPE '13.
[84] R. Peng. Reproducible Research in Computational Science , 2011, Science.
[85] Uwe Hohenstein,et al. Using aspect-orientation in industrial projects: appreciated or damned? , 2009, AOSD '09.
[86] Samuel Kounev,et al. Resilience Benchmarking , 2012, Resilience Assessment and Evaluation of Computing Systems.
[87] Harish Patil,et al. Efficient Run-time Monitoring Using Shadow Processing , 1995, AADEBUG.
[88] David F. Hinnant,et al. Accurate Unix benchmarking: art, science, or black magic? , 1988, IEEE Micro.
[89] Anil Kumar,et al. SPECjbb2013 1.0: an overview , 2014, ICPE.
[90] Jan Waller. Data for: Performance Benchmarking of Application Monitoring Frameworks , 2014 .
[91] Kim M. Hazelwood,et al. SuperPin: Parallelizing Dynamic Instrumentation for Real-Time Performance , 2007, International Symposium on Code Generation and Optimization (CGO'07).
[92] Benjamin Harms. Reverse-Engineering und Analyse einer Plug-in-basierten Java-Anwendung [Reverse engineering and analysis of a plug-in-based Java application], 2013.
[93] Wilhelm Hasselbring,et al. Toward a Generic and Concurrency-Aware Pipes & Filters Framework , 2014, SoSP.
[94] Connie U. Smith. Performance Engineering of Software Systems, 1990, SIGMETRICS Perform. Evaluation Rev.
[95] Jan Waller. Benchmarking the Performance of Application Monitoring Systems , 2013 .
[96] Gregor Kiczales,et al. Aspect-oriented programming , 2001, ESEC/FSE-9.
[97] Robert Heinrich,et al. Model-driven Instrumentation with Kieker and Palladio to Forecast Dynamic Applications , 2013, KPDAYS.
[98] Dirk Grunwald,et al. Shadow Profiling: Hiding Instrumentation Costs with Parallelism , 2007, International Symposium on Code Generation and Optimization (CGO'07).
[100] Steffen Becker,et al. Performance Prediction of Component-Based Systems - A Survey from an Engineering Perspective , 2004, Architecting Systems with Trustworthy Components.
[101] Lieven Eeckhout,et al. Statistically rigorous Java performance evaluation, 2007, OOPSLA.
[102] Wenguang Chen,et al. RACEZ: a lightweight and non-invasive race detection tool for production applications , 2011, 2011 33rd International Conference on Software Engineering (ICSE).
[103] Matthias Hauswirth,et al. Vertical profiling: understanding the behavior of object-oriented applications, 2004, OOPSLA.
[104] Philipp Döhring. Visualisierung von Synchronisationspunkten in Kombination mit der Statik und Dynamik eines Softwaresystems [Visualization of synchronization points in combination with the static and dynamic structure of a software system], 2012.
[105] Jing Li,et al. The Qualitas Corpus: A Curated Collection of Java Code for Empirical Studies , 2010, 2010 Asia Pacific Software Engineering Conference.
[106] Jens Happe,et al. The Performance Cockpit Approach: A Framework For Systematic Performance Evaluations , 2010, 2010 36th EUROMICRO Conference on Software Engineering and Advanced Applications.
[107] Sheng Liang. Java Virtual Machine Profiler Interface, 2000, IBM Syst. J.
[108] Matthias Hauswirth,et al. Using Hardware Performance Monitors to Understand the Behavior of Java Applications , 2004, Virtual Machine Research and Technology Symposium.
[109] Raj Jain,et al. The art of computer systems performance analysis - techniques for experimental design, measurement, simulation, and modeling , 1991, Wiley professional computing.
[110] Gordon Bell,et al. Beyond the Data Deluge , 2009, Science.
[111] Steffen Becker,et al. The Palladio component model for model-driven performance prediction, 2009, J. Syst. Softw.
[112] Sebastian Fischmeister,et al. Runtime Monitoring of Time-Sensitive Systems - [Tutorial Supplement] , 2011, RV.
[113] Wilhelm Hasselbring,et al. Engineering and Continuously Operating Self-Adaptive Software Systems: Required Design Decisions , 2009 .
[114] Albert Flaig. Dynamic Instrumentation in Kieker Using Runtime Bytecode Modification , 2014 .
[115] Lieven Eeckhout,et al. Method-level phase behavior in Java workloads, 2004, OOPSLA.
[116] Wilhelm Hasselbring,et al. The Aspect-Oriented Architecture of the CAPS Framework for Capturing, Analyzing and Archiving Provenance Data , 2014, IPAW.
[117] Lizy K. John,et al. Confusion by All Means , 2006 .
[118] Adrian Mos. A Framework for Adaptive Monitoring and Performance Management of Component-Based Enterprise Applications , 2004 .
[119] Oege de Moor,et al. Making trace monitors feasible , 2007, OOPSLA.
[120] Tomas Kalibera,et al. Rigorous benchmarking in reasonable time , 2013, ISMM '13.
[121] Babak Falsafi,et al. ParaLog: enabling and accelerating online parallel monitoring of multithreaded applications , 2010, ASPLOS XV.
[122] Petr Tuma,et al. Generic Environment for Full Automation of Benchmarking , 2004, SOQUA/TECOS.
[123] Shigeru Chiba,et al. Load-Time Structural Reflection in Java , 2000, ECOOP.
[124] Wolfgang Schröder-Preikschat,et al. AspectC++: an aspect-oriented extension to the C++ programming language , 2002 .
[125] Klaus Schmid,et al. Erhebung von Produkt-Laufzeit-Metriken: Ein Vergleich mit dem SPASS-Meter-Werkzeug [Collecting product runtime metrics: a comparison with the SPASS-Meter tool], 2012.
[126] David J. Kuck,et al. A Supercomputing Performance Evaluation Plan , 1988, ICS.
[127] Ondrej Lhoták,et al. A Staged Static Program Analysis to Improve the Performance of Runtime Monitoring , 2007, ECOOP.
[128] Walter Binder,et al. DiSL: a domain-specific language for bytecode instrumentation , 2012, AOSD.
[129] Clinton L. Jeffery. Program monitoring and visualization - an exploratory approach, 2011.
[130] Carsten Binnig,et al. How is the weather tomorrow?: towards a benchmark for the cloud , 2009, DBTest '09.
[131] Philip J. Fleming,et al. How not to lie with statistics: the correct way to summarize benchmark results , 1986, CACM.
[132] Lieven Eeckhout,et al. Java performance evaluation through rigorous replay compilation , 2008, OOPSLA.
[133] Michele Lanza,et al. Visualizing Software Systems as Cities , 2007, 2007 4th IEEE International Workshop on Visualizing Software for Understanding and Analysis.
[134] Lubomír Bulej. Connector-based Performance Data Collection for Component Applications , 2007 .
[135] Wilhelm Hasselbring,et al. Data for: Including Performance Benchmarks into Continuous Integration to Enable DevOps , 2015 .
[136] M. Roper,et al. Replication of Software Engineering Experiments , 2000 .
[137] Ralf Reussner,et al. Monitor Overhead Measurement with SKaMPI , 1999 .
[138] Matthias Hauswirth,et al. TraceAnalyzer: a system for processing performance traces, 2011, Softw. Pract. Exp.
[139] Walter F. Tichy,et al. Empirische Methodik in der Softwaretechnik im Allgemeinen und bei der Software-Visualisierung im Besonderen [Empirical methodology in software engineering in general and in software visualization in particular], 2007, Software Engineering.
[140] Dorina C. Petriu,et al. The Future of Software Performance Engineering , 2007, Future of Software Engineering (FOSE '07).
[141] Lars Kroll,et al. Performance Monitoring for a Web-based Information System , 2011 .
[142] Wilhelm Hasselbring,et al. Live trace visualization for comprehending large software landscapes: The ExplorViz approach , 2013, 2013 First IEEE Working Conference on Software Visualization (VISSOFT).
[143] Alessandro Orso,et al. A generic instrumentation framework for collecting dynamic information , 2004, SOEN.
[145] Wilhelm Hasselbring,et al. Model Driven Performance Measurement and Assessment with MoDePeMART , 2009, MoDELS.
[146] Petr Tuma,et al. ShadowVM: robust and comprehensive dynamic program analysis for the Java platform, 2014, GPCE '13.
[147] Paolo Bellavista,et al. Java for On-line Distributed Monitoring of Heterogeneous Systems and Services, 2002, Comput. J.
[148] Ross Ihaka, Robert Gentleman. R: A language for data analysis and graphics, 1996, Journal of Computational and Graphical Statistics.
[149] Elaine J. Weyuker,et al. Experience with Performance Testing of Software Systems: Issues, an Approach, and Case Study, 2000, IEEE Trans. Software Eng.
[150] J. N. Amaral,et al. Benchmark Design for Robust Profile-Directed Optimization , 2007 .
[151] Martin D. Thompson,et al. Disruptor: High performance alternative to bounded queues for exchanging data between concurrent threads, 2011.
[152] Wilhelm Hasselbring,et al. Continuous Monitoring of Software Services: Design and Application of the Kieker Framework , 2009 .
[153] Simon Shim,et al. Monitoring software components and component-based software , 2000, Proceedings 24th Annual International Computer Software and Applications Conference. COMPSAC2000.
[154] Thomas Reidemeister,et al. DataMill: rigorous performance evaluation made easy , 2013, ICPE '13.
[155] Walter Binder,et al. Introduction to dynamic program analysis with DiSL , 2013, ICPE '13.
[156] Howard Kim,et al. AspectC#: An AOSD implementation for C# , 2002 .
[157] Walter J. Price. A benchmark tutorial , 1989, IEEE Micro.
[158] Martin Moser,et al. Systematic performance evaluation based on tailored benchmark applications , 2013, ICPE '13.
[159] Byeong-Mo Chang,et al. A thread monitoring system for multithreaded Java programs, 2006, ACM SIGPLAN Notices.
[160] Amjad Nusayr,et al. Using AOP for detailed runtime monitoring instrumentation , 2009, WODA '09.
[161] Will Cappelli. Magic Quadrant for Application Performance Monitoring , 2010 .
[162] Wilhelm Hasselbring,et al. Application Performance Monitoring: Trade-Off between Overhead Reduction and Maintainability , 2014, SoSP.
[163] Eugene Miya,et al. Machine Characterization Based on an Abstract High-level Language Machine , 1990, PERV.
[164] Ian Molyneaux. The Art of Application Performance Testing - Help for Programmers and Quality Assurance , 2009 .
[165] Stefan Voigt,et al. Maintenance of embedded systems: Supporting program comprehension using dynamic analysis , 2012, 2012 Second International Workshop on Software Engineering for Embedded Systems (SEES).
[166] Ralf H. Reussner,et al. Automated Benchmarking of Java APIs , 2010, Software Engineering.
[167] Gerhard Wellein,et al. Performance Patterns and Hardware Metrics on Modern Multicore Processors: Best Practices for Performance Engineering , 2012, Euro-Par Workshops.
[168] J. Nievergelt,et al. Special Feature: Monitoring Program Execution: A Survey , 1981, Computer.
[169] Wilhelm Hasselbring,et al. Synchrovis: 3D visualization of monitoring traces in the city metaphor for analyzing concurrency , 2013, 2013 First IEEE Working Conference on Software Visualization (VISSOFT).
[170] Chinya V. Ravishankar,et al. Monitoring and debugging distributed realtime programs, 1992, Softw. Pract. Exp.
[171] Mary Lou Soffa,et al. Low overhead program monitoring and profiling , 2005, PASTE '05.
[172] Eda Marchetti,et al. Adequate monitoring of service compositions , 2013, ESEC/FSE 2013.
[173] John Murphy,et al. Non-intrusive end-to-end runtime path tracing for J2EE systems , 2006, IEE Proc. Softw..
[174] Thomas R. Gross,et al. Online optimizations driven by hardware performance monitoring , 2007, PLDI '07.
[175] Petr Tuma,et al. Dynamic program analysis - Reconciling developer productivity and tool performance, 2014, Sci. Comput. Program.
[176] Klaus Schmid,et al. Flexible resource monitoring of Java programs, 2014, J. Syst. Softw.
[177] Karl Huppler,et al. The Art of Building a Good Benchmark , 2009, TPCTC.
[178] Samin Ishtiaq,et al. "Can I Implement Your Algorithm?": A Model for Reproducible Research Software , 2014, ArXiv.
[179] Wilhelm Hasselbring,et al. Generating Probabilistic and Intensity-Varying Workload for Web-Based Software Systems , 2008, SIPEW.
[180] Tomás Bures,et al. Eliminating Execution Overhead of Disabled Optional Features in Connectors , 2006, EWSA.
[181] Adrian M. Colyer,et al. Using AspectJ for component integration in middleware , 2003, OOPSLA '03.
[182] Andrew Glover,et al. Continuous Integration: Improving Software Quality and Reducing Risk (The Addison-Wesley Signature Series) , 2007 .
[183] Aamer Jaleel,et al. Analyzing Parallel Programs with PIN , 2010, Computer.
[184] Teemu Kanstrén,et al. An adaptive and dependable distributed monitoring framework , 2011 .
[185] Mathias Meyer,et al. Continuous Integration and Its Tools , 2014, IEEE Software.
[186] Jack Dongarra,et al. Computer benchmarking: paths and pitfalls , 1987 .
[187] Douglas C. Montgomery, George C. Runger. Applied Statistics and Probability for Engineers, Third Edition, 2003.
[188] Shicong Meng,et al. Enhanced Monitoring-as-a-Service for Effective Cloud Management , 2013, IEEE Transactions on Computers.
[189] Radu Grosu,et al. Software monitoring with controllable overhead , 2010, International Journal on Software Tools for Technology Transfer.
[190] R. F. Hitti,et al. Evaluation and performance of computers: application benchmarks: the key to meaningful computer evaluations , 1965, ACM '65.
[191] Jeffrey C. Carver,et al. Replicating software engineering experiments: addressing the tacit knowledge problem , 2002, Proceedings International Symposium on Empirical Software Engineering.
[192] Wilhelm Hasselbring,et al. Automated Source-Level Instrumentation for Dynamic Dependency Analysis of COBOL Systems , 2012, Softwaretechnik-Trends.
[193] Wei Xu,et al. Advances and challenges in log analysis , 2011, Commun. ACM.
[194] Yuqing Zhu,et al. BigDataBench: A big data benchmark suite from internet services , 2014, 2014 IEEE 20th International Symposium on High Performance Computer Architecture (HPCA).
[195] Paul Grünbacher,et al. A Flexible Framework for Runtime Monitoring of System-of-Systems Architectures , 2014, 2014 IEEE/IFIP Conference on Software Architecture.
[196] Virgílio A. F. Almeida,et al. A methodology for workload characterization of E-commerce sites , 1999, EC '99.
[197] Lukáš Marek,et al. Instrumentation and Evaluation for Dynamic Program Analysis , 2014 .
[198] Margo I. Seltzer,et al. The case for application-specific benchmarking , 1999, Proceedings of the Seventh Workshop on Hot Topics in Operating Systems.
[199] Wilhelm Hasselbring,et al. Including Performance Benchmarks into Continuous Integration to Enable DevOps , 2015, SOEN.
[200] Lance M. Berc,et al. Continuous profiling: where have all the cycles gone?, 1997, ACM Trans. Comput. Syst.
[201] Ondrej Lhoták,et al. abc: an extensible AspectJ compiler , 2005, AOSD '05.
[202] Dana Petcu,et al. Multi-Cloud: expectations and current approaches , 2013, MultiCloud '13.
[203] Giovanni Denaro,et al. Early performance testing of distributed software applications , 2004, WOSP '04.
[204] Weng-Fai Wong,et al. PiPA: Pipelined profiling and analysis on multicore systems , 2008, TACO.
[205] André van Hoorn. Model-driven online capacity management for component-based software systems , 2014 .
[206] S. Mohan,et al. Performance Solutions: A Practical Guide to Creating Responsive, Scalable Software [Book Review] , 2003, IEEE Software.
[207] Steffen Becker. Performance-Related Metrics in the ISO 9126 Standard , 2005, Dependability Metrics.
[208] R. Stockton Gaines,et al. Characterizing the Performance Space of Shared Memory Computers Using Micro-Benchmarks , 2012 .
[209] Arie van Deursen,et al. A Systematic Survey of Program Comprehension through Dynamic Analysis , 2008, IEEE Transactions on Software Engineering.
[210] Jason Gait. A probe effect in concurrent programs, 1986, Softw. Pract. Exp.
[211] Martin Gogolla,et al. Aspect-Oriented Monitoring of UML and OCL Constraints , 2007 .
[212] Paola Inverardi,et al. Model-based performance prediction in software development: a survey , 2004, IEEE Transactions on Software Engineering.
[213] Tom Frotscher,et al. Architecture-Based Multivariate Anomaly Detection for Software Systems , 2013 .
[214] Wilhelm Hasselbring,et al. A Benchmark Engineering Methodology to Measure the Overhead of Application-Level Monitoring , 2013, KPDAYS.
[215] Samuel Kounev,et al. Performance Engineering of Distributed Component-Based Systems - Benchmarking, Modeling and Performance Prediction , 2005 .
[216] Vivek Sarkar,et al. The Jikes Research Virtual Machine project: Building an open-source research community, 2005, IBM Syst. J.
[217] Matthew Arnold,et al. Collecting and exploiting high-accuracy call graph profiles in virtual machines , 2005, International Symposium on Code Generation and Optimization.
[218] David Georg Reichelt,et al. Sicherstellung von Performanzeigenschaften durch kontinuierliche Performanztests mit dem KoPeMe Framework [Ensuring performance properties through continuous performance tests with the KoPeMe framework], 2014, Software Engineering.
[219] Steven McCanne,et al. A Randomized Sampling Clock for CPU Utilization Estimation and Code Profiling , 1993, USENIX Winter.
[220] Matthias Hauswirth,et al. Producing wrong data without doing anything obviously wrong! , 2009, ASPLOS.
[221] Jan Waller. Runtime Visualization of Static and Dynamic Architectural Views of a Software System to identify Performance Problems , 2010 .
[222] Matthias Hauswirth,et al. Evaluating the accuracy of Java profilers , 2010, PLDI '10.
[223] John Kunze,et al. Practices, Trends, and Recommendations in Technical Appendix Usage for Selected Data-Intensive Disciplines , 2011 .
[224] Wilhelm Hasselbring,et al. PubFlow: a scientific data publication framework for marine science , 2013 .
[225] Wilhelm Hasselbring,et al. Self-adaptive software system monitoring for performance anomaly localization , 2011, ICAC '11.
[226] Björn Konarski. Ein 3D-Ansatz zur Visualisierung der Kernauslastung in Multiprozessorsystemen [A 3D approach to visualizing core utilization in multiprocessor systems], 2012.
[227] Patricia J. Teller,et al. Towards a cross-platform microbenchmark suite for evaluating hardware performance counter data , 2005, 2005 Richard Tapia Celebration of Diversity in Computing Conference.
[228] Thierry Coupaye,et al. ASM: a code manipulation tool to implement adaptable systems , 2002 .
[229] Connie U. Smith,et al. New Book - Performance Solutions: A Practical Guide to Creating Responsive, Scalable Software , 2001, Int. CMG Conference.
[230] Christoph Heger,et al. AIM: Adaptable Instrumentation and Monitoring for Automated Software Performance Analysis , 2015, 2015 IEEE/ACM 10th International Workshop on Automation of Software Test.
[231] Eric Bodden,et al. Racer: effective race detection using AspectJ, 2008, ISSTA '08.
[232] Wilhelm Hasselbring,et al. Scalable and Live Trace Processing with Kieker Utilizing Cloud Computing , 2013, KPDAYS.
[233] George Cybenko,et al. Supercomputer performance evaluation and the Perfect Benchmarks , 1990, ICS '90.
[234] Wilhelm Hasselbring,et al. Comparing Trace Visualizations for Program Comprehension through Controlled Experiments , 2015, 2015 IEEE 23rd International Conference on Program Comprehension.
[235] Nicholas Merriam,et al. Measurement and tracing methods for timing analysis , 2012, International Journal on Software Tools for Technology Transfer.
[236] Egon Berghout,et al. The Goal/Question/Metric Method: A Practical Guide for Quality Improvement of Software Development, 2000.
[237] Tore Dybå,et al. The Future of Empirical Methods in Software Engineering Research , 2007, Future of Software Engineering (FOSE '07).
[238] Susan Elliott Sim,et al. Using benchmarking to advance research: a challenge to software engineering , 2003, 25th International Conference on Software Engineering, 2003. Proceedings..
[239] Daniela E. Damian,et al. Selecting Empirical Methods for Software Engineering Research , 2008, Guide to Advanced Empirical Software Engineering.
[240] André van Hoorn,et al. Adaptive Instrumentation of Java Applications for Experiment-Based Performance Analysis , 2014 .
[241] Oscar Nierstrasz,et al. Exploiting Dynamic Information in IDEs Improves Speed and Correctness of Software Maintenance Tasks , 2012, IEEE Transactions on Software Engineering.
[242] Elaine J. Weyuker,et al. Performance testing of software systems , 1998, WOSP '98.
[243] Martin D. Westhead,et al. A benchmark suite for high performance Java, 2000, Concurr. Pract. Exp.