T-Morph: revealing buggy behaviors of TinyOS applications via rule mining and visualization

TinyOS applications for Wireless Sensor Networks (WSNs) typically run under a complicated concurrency model, and it is difficult for developers to precisely predict the dynamic execution of a TinyOS application from its static source code. This conceptual gap frequently leads to software bugs, because unknown execution patterns can produce unexpected system behaviors. This paper presents T-Morph (TinyOS application tomography), a novel tool that mines, visualizes, and verifies the execution patterns of TinyOS applications. T-Morph abstracts the dynamic execution of a TinyOS application into simple, structured application behavior models that reflect how the static source code is actually executed, and it visualizes these models in a user-friendly manner. WSN developers can therefore readily see whether their code runs as intended simply by verifying the correctness of the models. Finally, the verified models allow T-Morph to automatically check application behaviors during long-term test executions; the reported model violations can reveal potential bugs and direct developers to suspicious locations in the source code. We have implemented T-Morph and applied it to a series of representative real-life TinyOS applications, finding several bugs, including a new bug in the latest release of TinyOS. These results show that T-Morph provides substantial help in verifying TinyOS applications.
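To make the combination of rule mining and violation checking concrete, below is a minimal sketch, not T-Morph's actual algorithm or trace format, of one common flavor of behavior-model mining: learning "event A is eventually followed by event B" rules from recorded event traces and then flagging violations in a later, long-running trace. The function names, the event names, and the list-of-strings trace encoding are all illustrative assumptions.

    from collections import defaultdict

    def mine_follows_rules(traces, min_support=0.9):
        """Mine hypothetical 'a is eventually followed by b' rules.

        traces: list of event-name sequences, one per observed execution.
        A rule (a, b) is kept if it holds in at least min_support of the
        traces in which event a occurs at all.
        """
        holds = defaultdict(int)   # (a, b) -> #traces where the rule holds
        occurs = defaultdict(int)  # a -> #traces containing a
        events = {e for t in traces for e in t}
        for trace in traces:
            present = set(trace)
            for a in present:
                occurs[a] += 1
                for b in events - {a}:
                    # Rule holds in this trace if every occurrence of a
                    # is followed (anywhere later) by an occurrence of b.
                    if all(b in trace[i + 1:]
                           for i, e in enumerate(trace) if e == a):
                        holds[(a, b)] += 1
        return {r for r, n in holds.items()
                if n / occurs[r[0]] >= min_support}

    def check_trace(trace, rules):
        """Report (position, a, b) triples where a mined rule is violated."""
        violations = []
        for a, b in rules:
            for i, e in enumerate(trace):
                if e == a and b not in trace[i + 1:]:
                    violations.append((i, a, b))
        return violations

    # Toy usage: the mined rule "radioSend -> sendDone" flags a run in
    # which the send-completion event never fires.
    good = [["boot", "radioSend", "sendDone"],
            ["boot", "radioSend", "sendDone", "sleep"]]
    rules = mine_follows_rules(good)
    print(check_trace(["boot", "radioSend", "sleep"], rules))

A real tool would mine richer models (e.g., state machines over split-phase operations) and map each violation back to source locations, but even this two-function sketch shows the workflow the abstract describes: learn patterns from known-good executions, then check a long-term test run against them.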
