Benchmarking in virtual desktops for end-to-end performance traceability

Thin-client based virtual desktops offer proven cost and convenience benefits over traditional physical computers for end-user computing. In this paper, we present novel user-interface and methodology extensions to our previously developed VDBench benchmarking toolkit for virtual desktop environments, which is based on the principles of slow-motion benchmarking. We focus on automating the benchmarking process, and describe how we extend end-to-end performance traceability to desktop applications such as Internet Explorer, Media Player and Excel spreadsheets. Our approach avoids invasive modification of thin-client systems, and emulates user behavior with realistic workloads. Our user interface design manages workflows between the benchmarking client and server, enabling easy instrumentation and the generation of comprehensive performance reports for complex environment setups. In a validation study, we deploy the enhanced VDBench toolkit in a real-world virtual desktop testbed that hosts applications rendering 3D visualizations of disaster scenarios for scene understanding and situational awareness. The benchmarking results show how the toolkit provides user QoE assessments of reliable video-event display under different network health conditions and computation resource configurations.
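The core idea of slow-motion benchmarking referenced above is to replay scripted user events one at a time, with deliberate pauses between them, so that the latency of each individual screen update can be measured in isolation rather than being obscured by overlapping events. A minimal sketch of that replay loop is shown below; the `replay` callback, the event names, and the `inter_event_delay` parameter are illustrative assumptions, not part of the VDBench implementation.

```python
import time

def run_slow_motion(events, replay, inter_event_delay=2.0):
    """Replay scripted desktop events one at a time, pausing between them
    so each screen update can be measured in isolation (the core idea of
    slow-motion benchmarking). Returns one latency sample per event."""
    latencies = []
    for event in events:
        start = time.perf_counter()
        replay(event)                  # deliver one event to the thin client (stubbed here)
        latencies.append(time.perf_counter() - start)
        time.sleep(inter_event_delay)  # let the display settle before the next event
    return latencies

# Hypothetical workload: three page-load events with a stubbed replay function.
sample_events = ["load_page_1", "load_page_2", "load_page_3"]
per_event = run_slow_motion(sample_events, replay=lambda e: None, inter_event_delay=0.0)
print(len(per_event))  # one latency sample per event
```

In a real deployment the stubbed `replay` would inject input into the thin-client session and the measurement would be taken from client-side network traffic rather than wall-clock time at the server, but the pacing structure is the same.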
