" How was School Today ... ? " , Evaluating the Personal-Narrative-Telling Prototype : Preliminary Results

Talking about your day at school can be essential for family interaction, and this kind of narrative communication has been highlighted for its role in literacy acquisition. In a one-year feasibility study we explored whether we can support children with complex communication needs (CCN) in creating and telling stories about their school day with the help of a computer tool which uses data from the user's activities during the day. We have built a prototype system that generates draft stories about a child's day at school using interaction and location sensors, pre-stored information about the school day such as the timetable and lunch menu, and recorded voice messages from staff. The child can easily edit the generated stories and tell them through her personal voice output communication aid (VOCA). We will present preliminary results from the evaluation of the system in a special school.

Extended Abstract

Talking about your day at school can be essential for family interaction, and this kind of narrative interaction has been highlighted for its role in literacy acquisition (Peterson, Jesso and McCabe 1999). However, while most children can do this naturally, telling stories about oneself can be a real struggle for people with complex communication needs (CCN): they find it very difficult both to create and to articulate such stories. Currently available augmentative and alternative communication (AAC) technology is generally not designed to support personal narrative. In the EPSRC-funded feasibility study "How was School Today...?" we have successfully built a Personal-Narrative-Telling Prototype which allows children with CCN to use automatically generated narratives to chat about their school day (Black, Waller, Reiter et al. 2008; Reiter, Turner, Alm et al. 2009). The prototype uses data from location and interaction sensors as well as recorded voice messages to generate interactive stories using Natural Language Generation (NLG). We have evaluated the system in a special school environment with a number of children, their parents and school staff over a period of several weeks.

The school environment was adapted to allow the acquisition of location and interaction data from sensors. School staff and other associated people (parents, researchers and visitors) were given special staff cards that could be identified by the system's sensors. During the evaluation the system software was installed on each participant's personal voice output communication aid (VOCA); if necessary, the participant's page setup was transferred to a compatible device to allow the installation of the system. The sensors attached to the VOCA use RFID (radio-frequency identification) technology, which allowed the system to identify the participant's location in the building at any given time and to track the participant's interactions with objects (e.g. learning materials) and people (e.g. a teacher or therapist). An interaction was registered when an RFID card attached to an object or carried by an individual was swiped over the sensor; a simplified illustration of this processing is sketched below.

We will present preliminary results of the evaluation, including video footage of conversations about the school day between participating children and their therapists and family members. Feedback gathered from all involved users will be presented, together with an outlook on possible future developments of the system.
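To make the data flow concrete, the following Python sketch shows how timed RFID readings might be combined with pre-stored timetable information to produce simple draft sentences for the child to edit. It is an illustration only: the names SensorEvent, TimetableSlot and draft_story are hypothetical, the fixed sentence templates stand in for the prototype's actual NLG component, and no part of it is taken from the project's implementation.

from dataclasses import dataclass
from datetime import time
from typing import List, Optional


@dataclass
class SensorEvent:
    """A single RFID reading: where the child was, or which card was swiped."""
    timestamp: time
    kind: str          # "location" or "interaction"
    label: str         # e.g. "music room", "Mrs Smith", "story book"


@dataclass
class TimetableSlot:
    """Pre-stored school-day information: an activity and its time span."""
    start: time
    end: time
    activity: str      # e.g. "music", "lunch"


def slot_for(t: time, timetable: List[TimetableSlot]) -> Optional[TimetableSlot]:
    """Find the timetable slot that covers a given clock time, if any."""
    for slot in timetable:
        if slot.start <= t < slot.end:
            return slot
    return None


def draft_story(events: List[SensorEvent], timetable: List[TimetableSlot]) -> List[str]:
    """Turn the day's sensor events into simple draft sentences.

    The real prototype generates richer text with an NLG pipeline; here each
    event is rendered with a fixed template so the overall idea stays visible.
    """
    sentences = []
    for event in events:
        slot = slot_for(event.timestamp, timetable)
        lesson = f" during {slot.activity}" if slot else ""
        if event.kind == "location":
            sentences.append(f"I was in the {event.label}{lesson}.")
        else:  # interaction: an object or person card swiped over the sensor
            sentences.append(f"I worked with {event.label}{lesson}.")
    return sentences


if __name__ == "__main__":
    timetable = [TimetableSlot(time(9, 0), time(10, 0), "music"),
                 TimetableSlot(time(12, 0), time(13, 0), "lunch")]
    events = [SensorEvent(time(9, 15), "location", "music room"),
              SensorEvent(time(9, 30), "interaction", "Mrs Smith")]
    for sentence in draft_story(events, timetable):
        print(sentence)

In the prototype itself the generated draft is richer than these templates suggest, and the child edits it before telling the story through her VOCA.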
References

R Black, A Waller, E Reiter and R Turner (2008). Supporting Personal Narrative for School Children with Communication Disorders: the Development of a Prototype. Recent Advances in Assistive Technology and Engineering (RAatE), Coventry.
C Peterson, B Jesso and A McCabe (1999). "Encouraging narratives in preschoolers: an intervention study." Journal of Child Language 26: 46-67.
E Reiter, R Turner, N Alm, R Black, M Dempster and A Waller (2009). Using NLG to Help Language-Impaired Users Tell Stories and Participate in Social Dialogues. ENLG 2009, 12th European Workshop on Natural Language Generation, Athens, Greece.