Large Scale Game Accessibility: A survey of possible engine independent solutions
This was a keynote presentation at the International Conference on Translation and Accessibility in Video Games and Virtual Worlds.

Computer games use various modalities for interaction, depending on, e.g., the platform or game genre: keys or buttons (mouse, keyboard, controllers, etc.), gestures (joystick, mouse, body), voice commands, speech feedback (samples or a synthesizer), interactive music, haptics, and biofeedback (EMG, EEG, EOG signals), among others. The Sony PlayStation EyeToy and SingStar, the Nintendo Wii, the Apple iPhone and the Microsoft Kinect are all examples of commercially successful multimodal interaction. Multimodality has reached the masses thanks to this development of affordable consumer hardware.

Game accessibility relies on multimodal interaction. Gamers with visual disabilities use, e.g., braille, speech synthesis, voice commands and spatial audio to interact with games. Gamers with mobility or dexterity disabilities use a range of special or modified hardware controllers. Deaf gamers rely on subtitles, closed captioning, visualization of sounds, or modifications that represent audio with haptics. Some solutions are included in operating systems, while others vary in affordability, and some require technical expertise from the gamer or from the game developers to implement. Compared with a PC, game consoles and handhelds, as well as tablets and phones, are harder to adapt because of their more closed system design; on the PC platform, accessibility is less dependent on the original designer.

Game accessibility has been improved by a number of developers and researchers over the years. All of these contributions are important and are often made with little or no funding. However, it has proven hard to create accessibility solutions on a large scale, say for an entire game genre across platforms or for all games on a specific platform. Design guidelines exist, inspired by the W3C guidelines for web accessibility. Game engines may use XML for various purposes, but there is nothing close to a standard markup language across engines and platforms, as there is for the web. This is one of many reasons a generic approach is harder to implement for games. The question, then, is: how may game accessibility be achieved on a large scale, for as many disabled gamers as possible, in the near future?

Based on the above, some conclusions can be made. Game accessibility may benefit from affordable and widespread multimodal consumer products. Other software and hardware available on the PC platform may be used to enhance accessibility, e.g., automated translation, analysis and transformation of content to match the user's needs. User-generated content can be used to improve results where the automated approach fails (e.g., Google Translate with corrections by users). A first large-scale attempt should be made for PCs, which are easier to modify than consoles, tablets or phones.

This paper presents a concept the author calls engine independence (EI): accessibility solutions act in parallel to the game rather than being integrated directly with the game engine. This way, game accessibility solutions may be scaled without the need for standardization.
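As a rough illustration of the engine-independence idea, the sketch below runs entirely outside the game process: it captures a screen region where a game draws its subtitles, extracts the text with OCR, and speaks it through a synthesizer. This is a minimal sketch and not the author's implementation; the libraries used (Pillow, pytesseract, pyttsx3) and the subtitle region coordinates are assumptions chosen for illustration.

```python
# Engine-independent accessibility sketch (illustrative, not from the paper):
# read a game's on-screen subtitles via OCR and speak them aloud, without
# touching the game engine. Assumes Pillow, pytesseract (plus the Tesseract
# binary) and pyttsx3 are installed; the capture region is hypothetical.
import time

import pytesseract          # OCR wrapper (assumed available)
import pyttsx3              # offline speech synthesis (assumed available)
from PIL import ImageGrab   # screen capture

SUBTITLE_REGION = (100, 800, 1820, 1000)  # left, top, right, bottom -- hypothetical

def main() -> None:
    tts = pyttsx3.init()
    last_spoken = ""
    while True:
        # Capture only the region where the game renders subtitles.
        frame = ImageGrab.grab(bbox=SUBTITLE_REGION)
        text = pytesseract.image_to_string(frame).strip()
        # Speak only new, non-empty text to avoid repeating the same line.
        if text and text != last_spoken:
            tts.say(text)
            tts.runAndWait()
            last_spoken = text
        time.sleep(1.0)  # poll roughly once per second

if __name__ == "__main__":
    main()
```

Because such a layer only consumes the game's output (pixels, audio) and re-presents it through the operating system, the same approach could in principle be reused across many games on the platform, which is the scaling argument the abstract makes.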