HandSmart is an example of a wearable device that can serve as a user interface for advanced mediaphones; it is based on MARISIL, a Mobile Augmented Reality Interface Sign Interpretation Language. This paper describes the interface and some applications of these new types of personal devices, and briefly discusses the user-centered development methodology at the end. Advances in technology have provided a variety of new opportunities for exploring and discovering virtual 3D worlds. Head-mounted displays and data gloves allow us to interact with, and immerse ourselves far more fully in, artificially generated 3D environments. Such devices have been advertised in the entertainment media and are recognized by the public as symbols of virtual reality (VR). Augmented Reality (AR) is more closely tied to the real world than VR: by overlaying virtual sounds, sensations, or visuals onto our senses within the real world, it can extend our natural experiences. The authors believe that the new generation of mediaphones can embed these new techniques.