Problem setting. Participatory sensing [5] is an integral part of many citizen science projects, which often include community efforts. Examples are the collection of noise samples to prevent the construction of a new airport runway [2] or air quality campaigns in urban environments [4]. In such scenarios, the participants and organizers are interested in sharing, discussing, interpreting, and understanding the objective data measured by a sensing device, complemented by the impressions and subjective context information of the participants. While most sophisticated and flexible IoT platforms, such as Xively or Ubidots, allow for collecting and visualizing objective, typically numeric data (e.g., GPS coordinates, sensor readings), they do not provide the means to (i) process and share the collected data and (ii) collect, visualize, and analyze the contextual or subjective information (e.g., tags, perceptions) of the participants [3]. Without such additional user information, interpreting the data often becomes very difficult. The EveryAware platform, introduced in 2013 as a generic platform for ubiquitous and subjective data [1], has addressed these issues: it allows for processing and visualization of data, sharing of results, and inclusion of subjective data in textual form, but only with a special focus on two domains, air quality [4] and noise pollution [2].

Contribution. In our work, we enhanced EveryAware to accept any type of data in numeric and textual form, together with relations between both, to better support citizen science projects. To flexibly visualize and analyze this data, we developed a shareable, interactive dashboard system that provides users with the means to easily build custom-tailored visualizations. All visualizations are managed with standard widget technology. These widgets can show previously collected data as well as visualize an incoming live stream (see the sketch below). In summary, we provide an easy-to-use tool to collect, visualize, share, and analyze objective and subjective data and data streams in real time.

Future Work. While at the moment only basic visualizations, such as line plots, are available, we are working on more sophisticated widgets, such as plots on maps. An example is given in Figure 1, where sensor data can be plotted on a map with corresponding subjective information in the form of tags. Additionally, an easy-to-use interface for custom-programmed widgets is planned.
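To make the widget-based dashboard architecture described above more concrete, the following is a minimal sketch of how objective readings, subjective annotations, and a live-stream widget could be modeled. All names (e.g., LineChartWidget, AnnotatedSample), the WebSocket endpoint, and the message format are hypothetical assumptions for illustration; they are not the actual EveryAware API.

```typescript
// Hypothetical data model: objective readings, subjective annotations,
// and the relation between them (illustrative only, not the platform's schema).
interface ObjectiveReading {
  sensorId: string;
  timestamp: number;               // Unix epoch, ms
  values: Record<string, number>;  // e.g. { "noise_db": 62.4 }
}

interface SubjectiveAnnotation {
  userId: string;
  timestamp: number;
  tags: string[];                  // e.g. ["loud", "traffic"]
  comment?: string;
}

interface AnnotatedSample {
  reading: ObjectiveReading;
  annotations: SubjectiveAnnotation[];  // relation: annotations attached to a reading
}

// A widget can show previously collected data or consume a live stream.
interface Widget {
  render(container: HTMLElement): void;
  update(sample: AnnotatedSample): void;
}

// Sketch of a simple line-plot widget that accumulates one metric over time.
class LineChartWidget implements Widget {
  private points: { t: number; v: number }[] = [];

  constructor(private metric: string) {}

  render(container: HTMLElement): void {
    // Placeholder rendering; a real widget would draw an actual chart.
    container.textContent = `Live plot of ${this.metric} (${this.points.length} points)`;
  }

  update(sample: AnnotatedSample): void {
    const v = sample.reading.values[this.metric];
    if (v !== undefined) {
      this.points.push({ t: sample.reading.timestamp, v });
    }
  }
}

// Wiring: subscribe a widget to an incoming live stream (URL is an assumption).
function attachToStream(widget: Widget, url: string, container: HTMLElement): void {
  const socket = new WebSocket(url);
  socket.onmessage = (event) => {
    const sample: AnnotatedSample = JSON.parse(event.data);
    widget.update(sample);
    widget.render(container);  // re-render on every new sample
  };
}
```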
[1] Andreas Hotho et al., "A generic platform for ubiquitous and subjective data," UbiComp, 2013.
[2] Vittorio Loreto et al., "Awareness and Learning in Participatory Noise Sensing," PLoS ONE, 2013.
[3] Sasu Tarkoma et al., "A gap analysis of Internet-of-Things platforms," Computer Communications, 2015.
[4] Vittorio Loreto et al., "Participatory Patterns in an International Air Quality Monitoring Initiative," PLoS ONE, 2015.
[5] Mark H. Hansen et al., "Participatory sensing," eScholarship, 2006.