The abstraction component represents the core library. Here we specify the interface through which the raw data from a sensor are processed, and thus define an API that abstracts from the specific device implementation. For example, in the case of an accelerometer, we defined methods like getYAcceleration(): float to retrieve the acceleration along the y axis from raw data. We can also define higher-level methods like getRoll(): double or getPitch(): double to retrieve rotations around the y and x axes. The implementation of these methods is completely transparent to the user, who does not need to know how the raw data are processed to obtain the final value. In this way we support device interchangeability and code reuse, because the same code written for, say, the accelerometer of the Nintendo Wiimote will also work for the accelerometer of an iPad (and for any device that is compliant with the HAT specification).

The abstraction toolkit is powerful enough to allow the composition of devices. For example, an accelerometer can be combined with a gyroscope to create a general Inertial Measurement Unit (IMU) component. Presently, the abstraction level supports a range of device types, such as accelerometers, gyroscopes, LEDs, display screens, touch sensors, RGB cameras and depth sensors, among others. On top of this API, different middlewares can be developed that, for example, implement gesture detection from sensor data (the Features layer, which has not yet been developed). At the Application level, software applications can directly exploit the functionalities provided by a specific middleware.

In our framework we will also consider output channels for feedback, while other similar frameworks do not [15]. For example, the speakers can be used as an output channel to give audio feedback to the user, LEDs (Light Emitting Diodes) can be employed to create ambient displays giving visual feedback, and small motors can provide haptic feedback (via a rumble feature). Therefore, we will also provide APIs for defining and managing the output of the interactive system itself, in terms of events perceived in the real world (e.g. an LED blinking) originated by some digital event (e.g. a control value exceeding a threshold), which was in turn caused by a physical event (e.g. the user's hand getting too close to a specific object, an event that can be captured by means of a depth sensor).
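The following Java sketch illustrates one possible shape for such an abstraction layer. Only getYAcceleration(), getRoll() and getPitch() appear in the text above; every other type and method name (Accelerometer, Gyroscope, InertialMeasurementUnit, Led, ProximityFeedback, and so on) is a hypothetical placeholder, not the actual HAT API.

// Device-independent accelerometer API: how raw data are processed is hidden.
interface Accelerometer {
    float getYAcceleration();   // acceleration along the y axis, computed from raw data
    double getRoll();           // higher-level value: rotation around the y axis
    double getPitch();          // higher-level value: rotation around the x axis
}

// Device-independent gyroscope API (hypothetical methods).
interface Gyroscope {
    double getAngularVelocityX();
    double getAngularVelocityY();
    double getAngularVelocityZ();
}

// Composition of devices: an IMU component built from an accelerometer and a gyroscope.
class InertialMeasurementUnit {
    private final Accelerometer accelerometer;
    private final Gyroscope gyroscope;

    InertialMeasurementUnit(Accelerometer accelerometer, Gyroscope gyroscope) {
        this.accelerometer = accelerometer;
        this.gyroscope = gyroscope;
    }

    // Orientation estimate; a real implementation could fuse both sensors.
    double getRoll()  { return accelerometer.getRoll(); }
    double getPitch() { return accelerometer.getPitch(); }
}

// Output channel sketch: a physical event (a hand too close, captured by a depth
// sensor) raises a digital event (threshold exceeded) that produces an output
// event perceived in the real world (an LED blinking).
interface Led {
    void blink(int times);
}

class ProximityFeedback {
    private final Led led;
    private final double thresholdMeters;

    ProximityFeedback(Led led, double thresholdMeters) {
        this.led = led;
        this.thresholdMeters = thresholdMeters;
    }

    void onDepthSample(double distanceMeters) {
        if (distanceMeters < thresholdMeters) {
            led.blink(3);   // visual feedback to the user
        }
    }
}

Application code written against Accelerometer runs unchanged whether the object wraps a Wiimote, an iPad, or any other compliant device, which is the device-interchangeability property described above.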
[1] Gudrun Klinker, et al. A multitouch software architecture. NordiCHI, 2008.
[2] Alessio Malizia, et al. Don't touch me: multi-user annotations on a map in large display environments. AVI, 2010.
[3] Roman Rädle, et al. Squidy: a zoomable design environment for natural user interfaces. CHI Extended Abstracts, 2009.
[4] Scott R. Klemmer, et al. Toolkit Support for Integrating Physical and Digital Interactions. Hum. Comput. Interact., 2009.
[5] Andreas Butz, et al. GISpL: gestures made easy. Tangible and Embedded Interaction, 2012.
[6] Jan Zibuschka, et al. MT4j - A Cross-platform Multi-touch Development Framework. ArXiv, 2010.
[7] Alessio Malizia, et al. TESIS: turn every surface into an interactive surface. ITS '11, 2011.
[8] Russell M. Taylor, et al. VRPN: a device-independent, network-transparent VR peripheral system. VRST '01, 2001.
[9] Ali Mazalek, et al. A nested API structure to simplify cross-device communication. TEI, 2012.
[10] Ross Bencina, et al. reacTIVision: a computer-vision framework for table-based tangible interaction. TEI, 2007.
[11] Rainer Stark, et al. An object-centric interaction framework for tangible interfaces in virtual environments. TEI, 2010.
[12] Jean Vanderdonckt, et al. An open source workbench for prototyping multimodal interactions based on off-the-shelf heterogeneous components. EICS '09, 2009.
[13] Victor L. Wallace, et al. The semantics of graphic input devices. ACM Symposium on Graphic Languages, 1976.