Tutorons: Generating context-relevant, on-demand explanations and demonstrations of online code

Programmers frequently turn to the web to solve problems and find example code. For the sake of brevity, the snippets in online instructions often gloss over the syntax of languages like CSS selectors and Unix commands, and programmers must compensate by consulting external documentation. In this paper, we propose language-specific routines called Tutorons that automatically generate context-relevant, on-demand micro-explanations of code. A Tutoron detects explainable code in a web page, parses it, and generates in-situ natural language explanations and demonstrations. We build Tutorons for CSS selectors, regular expressions, and the Unix command “wget”. We demonstrate techniques for generating natural language explanations through template instantiation, synthesizing code demonstrations by parse tree traversal, and building compound explanations of co-occurring options. Through a qualitative study, we show that Tutoron-generated explanations can reduce the need for reference documentation during code modification tasks.
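
To make the detect-parse-explain pipeline concrete, the following is a minimal sketch, not the authors' implementation, of template instantiation for a single language: a simple CSS selector is matched, decomposed into element, class, and id parts, and a natural language template is filled in. The function name, the regular expression, and the template wording are illustrative assumptions.

```python
import re

# Hypothetical pattern for a simple selector of the form "tag.class#id",
# where each part is optional (e.g. "div.post", "#main").
SELECTOR_PATTERN = re.compile(
    r"^(?P<tag>[a-zA-Z][\w-]*)?"   # optional element name, e.g. "div"
    r"(?P<cls>\.[\w-]+)?"          # optional class, e.g. ".post"
    r"(?P<id>#[\w-]+)?$"           # optional id, e.g. "#main"
)

# Explanation template instantiated from the parsed parts.
TEMPLATE = "Selects {what}{cls_part}{id_part}."

def explain_css_selector(selector: str) -> str:
    """Return a one-sentence explanation of a simple CSS selector."""
    match = SELECTOR_PATTERN.match(selector.strip())
    if not match:
        return "This selector is too complex for this sketch."
    tag, cls, id_ = match.group("tag"), match.group("cls"), match.group("id")
    what = f"every <{tag}> element" if tag else "every element"
    cls_part = f" with class '{cls[1:]}'" if cls else ""
    id_part = f" with id '{id_[1:]}'" if id_ else ""
    return TEMPLATE.format(what=what, cls_part=cls_part, id_part=id_part)

if __name__ == "__main__":
    print(explain_css_selector("div.post"))  # Selects every <div> element with class 'post'.
    print(explain_css_selector("#main"))     # Selects every element with id 'main'.
```

A full Tutoron would additionally locate explainable regions within arbitrary page text and render the generated explanation in situ; this sketch only illustrates the parse-then-fill-template step.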
