Run Spot Run: Capturing and Tagging Footage of a Race by Crowds of Spectators

There has been massive growth in the number of people who film and upload amateur footage of events to services such as Facebook and YouTube, or even stream it live on services such as Livestream. We present an exploratory study that investigates the potential of these spectators to create footage en masse, in this case during a live trial at a local marathon. We deployed a prototype app, RunSpotRun, as a technology probe to see what kinds of footage spectators would produce. We present an analysis of this footage in terms of its coverage, quality, and contents, and discuss the implications for a) spectators' enjoyment of the race, and b) extracting the stories of individual runners throughout the race. We conclude with a discussion of the challenges that remain for deploying such technology at a larger scale.

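The goal of extracting the stories of individual runners can be illustrated with a minimal sketch, under assumptions the abstract does not specify: suppose each spectator tag records a runner's bib number, a timestamp, and the location of the clip in which the runner was spotted. A runner's "story" is then simply their tags grouped by bib number and ordered in time. The `Tag` fields and the `runner_stories` helper below are hypothetical names used only for illustration, not part of the RunSpotRun system described in the paper.

```python
# Hypothetical sketch (not from the paper): reconstructing a per-runner
# timeline from spectator tags, assuming each tag carries a bib number,
# a timestamp, and the location of the clip containing the sighting.
from dataclasses import dataclass
from collections import defaultdict

@dataclass
class Tag:
    bib: int          # runner's bib number entered by the spectator
    timestamp: float  # seconds since the race start
    lat: float        # where the clip was recorded
    lon: float
    clip_id: str      # identifier of the video clip containing the sighting

def runner_stories(tags):
    """Group tags by bib number and order each runner's sightings in time."""
    by_runner = defaultdict(list)
    for tag in tags:
        by_runner[tag.bib].append(tag)
    return {bib: sorted(sightings, key=lambda t: t.timestamp)
            for bib, sightings in by_runner.items()}

# Example: runner 42 is spotted in two clips at different points on the course.
tags = [
    Tag(42, 1800.0, 52.95, -1.15, "clip_a"),
    Tag(42,  600.0, 52.94, -1.16, "clip_b"),
    Tag(17,  900.0, 52.94, -1.16, "clip_b"),
]
story = runner_stories(tags)[42]
print([t.clip_id for t in story])  # ['clip_b', 'clip_a'] in chronological order
```

In practice such a pipeline would also have to handle mistyped bib numbers and clock skew between spectators' phones, which is part of why the paper discusses footage quality and coverage rather than treating tag aggregation as a solved step.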