Tracking a moving object with optical remote sensing in a dynamic urban environment is a complex problem. Two emerging capabilities can help solve it: adaptive multimodal sensing and model-based forecasting with data assimilation. Adaptive multimodal sensing refers to sensor hardware that can be rapidly reconfigured to collect the most appropriate data as needed. Imaging a moving target implies some ability to forecast where to image next so as to keep the object in the scene. Such forecasts require models, and data assimilation techniques can update executing models with incoming sensor data, dynamically minimizing forecast errors. The direct combination of these two capabilities is powerful, but it does not by itself answer the questions of how or when to change the imaging modality. The Dynamic Data-Driven Applications Systems (DDDAS) paradigm is well suited to this problem: sensing must adapt to a complex, changing environment, and predicting object movement and its interaction with that environment enhances the sensing system's ability to stay focused on the object of interest. Here we describe our work on the creation of a modeling system for optical tracking in complex environments, with a focus on integrating an adaptive imaging sensor within the system framework.
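The forecast-then-assimilate cycle described above can be illustrated with a minimal sketch. This is not the authors' implementation; it is a generic constant-velocity Kalman filter (a simple form of data assimilation), with all matrices and noise levels chosen for illustration: the model forecasts the object's next state, and each sensor measurement is assimilated to correct the forecast and shrink its error covariance.

```python
import numpy as np

dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])  # state transition: (position, velocity)
H = np.array([[1.0, 0.0]])             # sensor observes position only
Q = 0.01 * np.eye(2)                   # model (process) noise covariance
R = np.array([[0.25]])                 # sensor noise covariance

def forecast(x, P):
    """Model step: predict where the object will be next."""
    return F @ x, F @ P @ F.T + Q

def assimilate(x, P, z):
    """Data-assimilation step: correct the forecast with measurement z."""
    S = H @ P @ H.T + R                # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
    x = x + K @ (z - H @ x)            # pull state toward the measurement
    P = (np.eye(2) - K @ H) @ P        # reduced forecast-error covariance
    return x, P

# Track a target moving at velocity 1.0, starting from a poor initial guess.
rng = np.random.default_rng(0)
truth = np.array([0.0, 1.0])
x, P = np.array([5.0, 0.0]), 10.0 * np.eye(2)
for _ in range(20):
    truth = F @ truth
    z = H @ truth + rng.normal(0.0, 0.5, size=1)  # noisy position reading
    x, P = forecast(x, P)
    x, P = assimilate(x, P, z)
```

In a tracking context, the forecast step is what tells the sensor where to point next; assimilating each new frame keeps the forecast errors, and hence the pointing errors, small.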