Database Benchmarks

Performance measurement tools are essential to both designers and users of Database Management Systems (DBMSs). Designers use performance evaluation to choose among architectural alternatives and, more generally, to validate or refute hypotheses about the actual behavior of a DBMS; it is thus an essential component in the development of well-designed, efficient systems. Users employ performance evaluation either to compare the efficiency of candidate technologies before selecting a DBMS, or to tune an existing system.

Performance evaluation by experimentation on a real system is generally referred to as benchmarking. It consists of performing a series of tests on a given DBMS to estimate its performance in a given setting. A benchmark typically comprises two main elements: a database model (a conceptual schema and its extension) and a workload model (a set of read and write operations) applied to this database following a predefined protocol. Most benchmarks also include a set of simple or composite performance metrics, such as response time, throughput, number of input/output operations, and disk or memory usage.

This article presents an overview of the major families of state-of-the-art database benchmarks, namely relational, object and object-relational, XML, and decision-support benchmarks, and discusses the issues, tradeoffs, and future trends in database benchmarking. We particularly focus on XML and decision-support benchmarks, which are currently the most innovative tools developed in this area.
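The two-part structure described above, a workload model instrumented with metrics and applied to a test database under a fixed protocol, can be sketched as a minimal benchmark harness. This is only an illustrative sketch: the operation names and stand-in workload below are hypothetical, and a real benchmark would issue queries (SQL, OQL, XQuery) against an actual DBMS rather than run local functions.

```python
import time
from statistics import mean

def run_benchmark(workload, repetitions=3):
    """Run each workload operation `repetitions` times and collect two
    common metrics: mean response time per operation, and overall
    throughput (operations completed per second)."""
    latencies = {}  # operation name -> list of elapsed times (seconds)
    start = time.perf_counter()
    for name, operation in workload:
        for _ in range(repetitions):
            t0 = time.perf_counter()
            operation()  # a read or write against the test database
            latencies.setdefault(name, []).append(time.perf_counter() - t0)
    elapsed = time.perf_counter() - start
    total_ops = sum(len(ts) for ts in latencies.values())
    return {
        "response_time": {name: mean(ts) for name, ts in latencies.items()},
        "throughput": total_ops / elapsed,
    }

# Usage with stand-in operations (placeholders for real DBMS queries):
workload = [
    ("q1_scan", lambda: sum(range(10_000))),
    ("q2_lookup", lambda: {i: i for i in range(1_000)}),
]
metrics = run_benchmark(workload)
```

A fuller harness would also separate a warm-up phase from the measured runs and vary the database extension's size, since both strongly affect the metrics a benchmark reports.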
