Microbiology Tray and Pipette Tracking as a Proactive Tangible User Interface

Many work environments can benefit from integrated computing devices that provide information to users, record their actions, and prompt them about the next steps in a procedure. We focus on the cell biology laboratory, where previous work on the Labscape project provides a framework for organizing experiment plans and storing data. Existing sensor systems allow the amount and type of materials used in an experiment to be recorded. This paper supplies the last piece: determining where those materials are deposited. Using a camera and projector mounted over a lab bench, vision techniques locate a specially marked well tray and pipette in real time with enough precision to determine which well the pipette tip is over. The projector can then augment the tray with relevant information, such as the next operation to be performed or the current contents of the tray. Without changing the biologist's work practice, the system records these physical interactions and provides readily available status and advice to the user. Preliminary user feedback suggests this system would indeed be a useful addition to the laboratory environment.
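The final step the abstract describes, deciding which well the tracked pipette tip is over, reduces to grid arithmetic once the tip position has been transformed into the tray's coordinate frame. The sketch below illustrates that step only; the function name, the 96-well (8 × 12, 9 mm pitch) geometry, and the tolerance value are illustrative assumptions, not details from the paper.

```python
# Hypothetical sketch: map a pipette-tip position (already expressed in the
# tray's coordinate frame, millimeters, origin at the center of well A1)
# to a (row, col) well index. Dimensions assume a standard 96-well tray.

WELL_PITCH_MM = 9.0      # assumed center-to-center well spacing
N_ROWS, N_COLS = 8, 12   # assumed 8x12 (96-well) layout
TOLERANCE_MM = 3.0       # tip must be within this radius of a well center

def well_under_tip(x_mm: float, y_mm: float):
    """Return (row, col) of the well under the tip, or None if the tip
    is off the tray or ambiguously positioned between wells."""
    col = round(x_mm / WELL_PITCH_MM)
    row = round(y_mm / WELL_PITCH_MM)
    if not (0 <= row < N_ROWS and 0 <= col < N_COLS):
        return None
    # Reject positions too far from the nearest well center.
    dx = x_mm - col * WELL_PITCH_MM
    dy = y_mm - row * WELL_PITCH_MM
    if (dx * dx + dy * dy) ** 0.5 > TOLERANCE_MM:
        return None
    return (row, col)
```

In a full pipeline, the tray-frame coordinates would come from a homography estimated from the tray's visual markers; the tolerance check lets the system ignore frames where the tip is between wells rather than committing to a wrong well.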