PICL: portable in-circuit learner

This paper introduces the PICL, a portable in-circuit learner that brings standalone, low-cost, programming-by-demonstration machine learning to circuit prototyping. To train the PICL, users attach a sensor, demonstrate example input, and then specify the desired output (expressed as a voltage) for that input. The current version of the PICL provides two learning modes: binary classification and linear regression. To streamline training and make it possible to train on highly transient signals (such as those produced by a camera flash or a hand clap), the PICL includes a number of input inference techniques; these allow the PICL to learn from as few as one example. The PICL's behavioural repertoire can be expanded by means of various output adapters, which transform its output in ways useful during prototyping. Collectively, the PICL's capabilities allow users of systems such as the Arduino or the littleBits electronics kit to quickly add basic sensor-based behaviour, with little or no programming required.
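The one-example binary classification described above can be illustrated with a minimal sketch. This is not the authors' implementation; it is a hypothetical illustration that assumes the simplest plausible scheme: the decision threshold is placed midway between a resting (baseline) sensor voltage and the single demonstrated example.

```python
# Hypothetical sketch of one-example binary classification in the
# spirit of the PICL; function names and the thresholding scheme
# are illustrative assumptions, not the paper's actual algorithm.

def train_one_example(baseline_v: float, example_v: float) -> float:
    """Place the decision threshold midway between the resting
    (baseline) voltage and the single demonstrated example."""
    return (baseline_v + example_v) / 2.0

def classify(threshold_v: float, reading_v: float,
             example_above: bool) -> bool:
    """Return True when the reading falls on the same side of the
    threshold as the demonstrated example."""
    if example_above:
        return reading_v >= threshold_v
    return reading_v < threshold_v

# Example: a light sensor rests at 0.5 V; a camera flash spikes it
# to 3.0 V. One demonstration sets the threshold at 1.75 V.
threshold = train_one_example(0.5, 3.0)
flash_detected = classify(threshold, 2.4, example_above=True)
ambient = classify(threshold, 0.6, example_above=True)
```

A transient event like a flash would then be detected whenever a reading crosses to the example's side of the threshold, which is consistent with the paper's goal of training on short-lived signals from a single demonstration.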
