Deep Learning Models for Passability Detection of Flooded Roads
Paolo Garza | Harald Skinnemoen | Laura Lopez-Fuentes | Alessandro Farasin
[1] Bo Zhang et al. Color-based road detection in urban traffic scenes. IEEE Transactions on Intelligent Transportation Systems, 2004.
[2] Sergey Ioffe et al. Rethinking the Inception Architecture for Computer Vision. IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2016.
[3] M. Curtarelli et al. The Use of Optical Remote Sensing for Mapping Flooded Areas, 2013.
[4] Jeffrey Pennington et al. GloVe: Global Vectors for Word Representation. EMNLP, 2014.
[5] C. Ticehurst et al. Using passive microwave and optical remote sensing to monitor flood inundation in support of hydrologic modelling, 2009.
[6] Claudio Rossi et al. Online clustering and classification for real-time event detection in Twitter. ISCRAM, 2018.
[7] J. S. Verkade et al. Probabilistic flood extent estimates from social media flood observations, 2016.
[8] Claudio Rossi et al. Filtering informative tweets during emergencies: a machine learning approach. I-TENDER@CoNEXT, 2017.
[9] Benjamin Bischke et al. The Multimedia Satellite Task at MediaEval 2018: Emergency Response for Flooding Events, 2018.
[10] V. Klemas et al. Remote Sensing of Floods and Flood-Prone Areas: An Overview, 2015.
[11] Joost van de Weijer et al. Multi-modal Deep Learning Approach for Flood Detection. MediaEval, 2017.
[12] Joost van de Weijer et al. Review on computer vision techniques in emergency situations. Multimedia Tools and Applications, 2017.
[13] Jean Ponce et al. General Road Detection From a Single Image. IEEE Transactions on Image Processing, 2010.
[14] Evelina Di Corso et al. Analyzing spatial data from Twitter during a disaster. IEEE International Conference on Big Data (Big Data), 2017.
[15] Vishal M. Patel et al. Learning Deep Features for One-Class Classification. IEEE Transactions on Image Processing, 2018.