A testing methodology for a dataflow based visual programming language

Dataflow-based visual programming languages have become an important research topic in recent years, yielding a variety of research systems and commercial applications. As with any programming language, visual or textual, dataflow programs may contain faults. Thus, to ensure the correct functioning of dataflow programs and to increase confidence in their quality, testing is required. Despite this observation, we find that the testing criteria in the literature mainly address imperative, declarative, and form-based languages; we did not find any discussion that specifically addresses testing criteria for dataflow programs. In this paper, we investigate, from a testing perspective, differences between dataflow and imperative languages. The results reveal opportunities for adapting code-based control-flow testing criteria to test dataflow languages. We show that our proposed testing methodology is well suited to dataflow programs. In particular, the "all-branches" criterion provides important error-detection ability and can be applied to dataflow programs. We implemented a testing system that allows users to visually and empirically investigate the testability of programs written in the visual programming language Prograph. Our empirical results confirm that, analogous to imperative languages, the all-branches criterion cannot detect all the errors in a dataflow program. Thus, to catch those undetected errors, more rigorous testing should be applied.
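To make the "all-branches" criterion concrete, the following is a minimal illustrative sketch in textual Python (the paper itself targets the visual language Prograph; the `branch` instrumentation helper, the `classify` function, and the decision labels `d1`/`d2` are hypothetical examples, not part of the paper's system). A test suite satisfies all-branches only when every outcome of every decision point has been exercised at least once:

```python
# All-branches adequacy: every (decision, outcome) pair must be
# exercised by some test before the suite is considered adequate.

covered = set()  # (decision_id, outcome) pairs observed during testing

def branch(decision_id, condition):
    """Record which outcome a decision point takes, then return it."""
    covered.add((decision_id, bool(condition)))
    return condition

def classify(x):
    # Two decisions -> four branch outcomes to cover in total.
    if branch("d1", x < 0):
        return "negative"
    if branch("d2", x == 0):
        return "zero"
    return "positive"

ALL_BRANCHES = {("d1", True), ("d1", False), ("d2", True), ("d2", False)}

# An inadequate suite: it never drives d2 to True, so a fault on the
# "zero" path could go undetected.
for test_input in (-5, 7):
    classify(test_input)
assert ALL_BRANCHES - covered == {("d2", True)}

# One additional test restores all-branches adequacy.
classify(0)
assert covered == ALL_BRANCHES
```

The same idea carries over to a dataflow program: each control construct (e.g. a Prograph case or control annotation) contributes outcomes that an adequate test suite must exercise, and, as the abstract notes, even full branch coverage leaves some faults undetected.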
