Towards a comparative performance evaluation of overlays for Networked Virtual Environments

Peer-to-peer overlays for Networked Virtual Environments have recently gained much research interest, resulting in a variety of approaches for spatial information dissemination. Although these systems are designed for the same purpose, the evaluation methodologies used by their authors differ widely, which makes any comparison of existing systems difficult, if not impossible. To overcome this problem, we present a benchmarking methodology that allows for a fair comparison of such systems by defining a common set of workloads and metrics. We demonstrate the feasibility of our approach by benchmarking four typical systems for spatial information dissemination and identifying their specific performance profiles.
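To make the idea of a common metric for spatial information dissemination more concrete, the sketch below shows one way such a metric is often computed for NVE overlays: an area-of-interest (AOI) recall, i.e. the fraction of truly relevant peers that the overlay actually made visible to each peer. This is only an illustrative sketch under assumed definitions; the class, function, and parameter names (Peer, aoi_recall, aoi_radius, known_neighbors) are hypothetical and are not taken from the paper's benchmark specification.

```python
# Illustrative sketch only: a hypothetical area-of-interest (AOI) recall metric,
# of the kind commonly used to evaluate spatial information dissemination overlays.
# All names and parameters here are assumptions, not the paper's definitions.
from dataclasses import dataclass
from math import hypot


@dataclass
class Peer:
    pid: int
    x: float
    y: float
    known_neighbors: set  # peer ids this peer's overlay currently knows about


def aoi_recall(peers, aoi_radius):
    """Mean fraction of peers inside each peer's AOI that the overlay delivered.

    Ground truth is computed from the true positions; 1.0 means the overlay made
    every relevant peer visible, lower values indicate missed neighbors.
    """
    recalls = []
    for p in peers:
        # Ground truth: every other peer whose distance is within the AOI radius.
        relevant = {q.pid for q in peers
                    if q.pid != p.pid and hypot(q.x - p.x, q.y - p.y) <= aoi_radius}
        if not relevant:
            continue  # nothing to discover, skip to avoid division by zero
        found = relevant & p.known_neighbors
        recalls.append(len(found) / len(relevant))
    return sum(recalls) / len(recalls) if recalls else 1.0


# Tiny usage example with a hand-made snapshot of three peers.
if __name__ == "__main__":
    snapshot = [
        Peer(1, 0.0, 0.0, known_neighbors=set()),
        Peer(2, 5.0, 0.0, known_neighbors={1, 3}),
        Peer(3, 9.0, 0.0, known_neighbors={2}),
    ]
    print(f"mean AOI recall: {aoi_recall(snapshot, aoi_radius=6.0):.2f}")
```

In a full benchmark, such a metric would be sampled over time under a common workload (e.g. a shared movement model and churn pattern) so that the resulting performance profiles of different overlays are directly comparable.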
