Multimer: validating multimodal, cognitive data in the city: towards a model of how the urban environment influences streetscape users

Multimer is a new technology that aims to provide a data-driven understanding of how humans cognitively and physically experience spatial environments. By multimodally measuring biosensor data to model how the built environment and its uses influence cognitive processes, Multimer aims to help space professionals such as architects, workplace strategists, and urban planners make better design interventions. Multimer is perhaps the first spatial technology to collect biosensor data, such as brainwave and heart-rate data, and analyze it with both spatiotemporal and neurophysiological tools. The Multimer mobile app records data from several kinds of commonly available, inexpensive wearable sensors, including EEG, ECG, pedometer, accelerometer, and gyroscope modules. It also captures user-entered information through its interface and micro-surveys, and combines all of this data with each user's geolocation using GPS, beacons, and other location tools. Multimer's study platform displays this data in real time at both the individual and aggregate levels. Multimer validates the data by comparing the collected sensor and sentiment data in spatiotemporal contexts, and then integrates it with other data sets, such as citizen reports, traffic data, and city amenities, to provide actionable insights for the evaluation and redesign of sites and spaces. This report presents preliminary results from the data validation process for a Multimer study of 101 subjects conducted in New York City from August to October 2017. Ultimately, the study aims to prototype a replicable, scalable model of how the built environment and the movement of traffic influence the neurophysiological state of pedestrians, cyclists, and drivers.
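The two operations at the heart of this pipeline — attaching each biosensor sample to the user's most recent geolocation fix, and screening the merged stream for sensor artifacts — can be sketched in Python. The column names, sample values, and the median-absolute-deviation outlier rule below are illustrative assumptions, not Multimer's actual schema or pipeline:

```python
import pandas as pd

# Hypothetical time-stamped EEG band-power samples; 9.90 is a planted artifact.
eeg = pd.DataFrame({
    "time": pd.to_datetime([
        "2017-08-01 10:00:00.0",
        "2017-08-01 10:00:00.5",
        "2017-08-01 10:00:01.0",
        "2017-08-01 10:00:01.5",
    ]),
    "alpha_power": [0.42, 0.47, 9.90, 0.44],
})

# GPS fixes, recorded at a slower rate than the biosensor stream.
gps = pd.DataFrame({
    "time": pd.to_datetime(["2017-08-01 10:00:00.2", "2017-08-01 10:00:01.1"]),
    "lat": [40.7128, 40.7129],
    "lon": [-74.0060, -74.0061],
})

# Attach each sensor sample to the most recent GPS fix (backward as-of join).
# A sample recorded before the first fix gets NaN coordinates.
aligned = pd.merge_asof(eeg, gps, on="time", direction="backward")

# Flag outliers with the median absolute deviation, which stays robust even
# when the artifact itself is in the data (1.4826 scales MAD to a std-dev
# equivalent under normality).
med = aligned["alpha_power"].median()
mad = (aligned["alpha_power"] - med).abs().median()
aligned["outlier"] = (aligned["alpha_power"] - med).abs() > 3 * 1.4826 * mad
```

A backward as-of join is used rather than an exact-timestamp merge because wearable sensors and GPS report at different, unsynchronized rates.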
