Replaying artificial or real-world traffic is a common method for testing networking devices. Real-world traffic is preferable because it captures realistic properties such as traffic diversity and complex user behaviors. In this paper, we propose an in-lab replay testing (ILRT) framework, composed of a monitor, a traffic replayer, and a library of real-world traffic traces, which replays captured packet traces against networking devices under test. To demonstrate its effectiveness, a total of 28 WLAN routers were tested to evaluate whether they could operate stably for an extended period. In our experiments, critical (L1) failures were triggered in 53.57% of devices under test (DUTs), and tolerable (L2) failures in 100% of them; 21.43% of DUTs experienced more than one L1 failure, and 100% experienced more than one L2 failure. Even among high-spec DUTs, 25% experienced L1 failures. Furthermore, across 458 test results, the rates of pass, L1 failure, and L2 failure were 2.84%, 7.86%, and 89.30%, respectively. The results show that even devices that have passed traditional lab tests still have a good chance of failing under real-world traffic.
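The outcome rates over the 458 test results can be sanity-checked with a short script. The absolute counts used below (13 passes, 36 L1 failures, 409 L2 failures) are our own inference from the reported percentages, not figures stated in the paper:

```python
# Sanity-check of the reported outcome rates over 458 test results.
# The counts below are hypothetical, back-computed from the percentages
# reported in the abstract (2.84% pass, 7.86% L1, 89.30% L2).
counts = {"pass": 13, "L1 failure": 36, "L2 failure": 409}

total = sum(counts.values())
assert total == 458  # matches the number of test results in the paper

for outcome, n in counts.items():
    print(f"{outcome}: {n}/{total} = {100 * n / total:.2f}%")
# pass: 13/458 = 2.84%
# L1 failure: 36/458 = 7.86%
# L2 failure: 409/458 = 89.30%
```

The three rates sum to 100%, which is consistent with pass, L1, and L2 being mutually exclusive outcomes of each test run.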