Through-the-Lens Drone Filming

Aerial filming of action scenes with a drone is difficult for inexperienced pilots because manipulating the remote controller and achieving the desired image composition are two independent yet concurrent tasks. Existing systems rely on wearable GPS-based or infrared-based sensors to track human movement and assist in capturing footage. However, these sensors work only indoors (infrared-based) or only outdoors (GPS-based), but not both. In this paper, we introduce a novel drone filming system that integrates monocular 3D human pose estimation and localization into a drone platform, removing the constraints imposed by wearable-sensor-based solutions. Given the estimated subject position, we further propose a novel drone control scheme, called "through-the-lens drone filming", which allows a cameraman to control the drone conveniently by manipulating a 3D model in the camera preview, closing the gap between flight control and viewpoint design. Our system includes two key enabling techniques: 1) subject localization based on visual-inertial fusion, and 2) through-the-lens camera planning. To our knowledge, this is the first drone camera system that allows users to capture human actions by manipulating the camera in a virtual environment. On the hardware side, we integrate a gimbal camera and two GPUs into the limited space of a drone and demonstrate that the entire system can run onboard with delays small enough for real-time filming. Experimental results, in both simulation and real-world scenarios, demonstrate that our techniques greatly ease camera control and help capture better videos.
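
As a concrete illustration of the through-the-lens idea, the following minimal Python sketch shows how a viewpoint chosen by dragging a virtual camera around the 3D model in the preview might be mapped to a drone position and gimbal orientation that keep the subject centered in the frame. This is an assumption-laden sketch, not the paper's implementation: the function and variable names (viewpoint_to_drone_pose, subject_position, desired_offset) are hypothetical, and the trajectory smoothing and dynamic-feasibility handling performed by a real-time planner are omitted.

# Minimal sketch (assumed, not the authors' code): map a user-chosen virtual
# viewpoint to a drone target pose. Positions are in a world frame, meters.
import numpy as np

def viewpoint_to_drone_pose(subject_position, desired_offset):
    # Place the drone at the chosen offset from the subject.
    drone_position = subject_position + desired_offset

    # Vector from the drone back toward the subject.
    to_subject = subject_position - drone_position
    distance = np.linalg.norm(to_subject)

    # Gimbal yaw: heading of the subject in the horizontal plane.
    yaw = np.arctan2(to_subject[1], to_subject[0])
    # Gimbal pitch: negative when the drone looks down at the subject.
    pitch = np.arcsin(to_subject[2] / distance)

    return drone_position, yaw, pitch

# Example: the subject stands near the origin and the user drags the virtual
# camera 4 m behind, 1 m to the side, and 2 m above the 3D model in the preview.
pos, yaw, pitch = viewpoint_to_drone_pose(
    subject_position=np.array([0.0, 0.0, 1.7]),
    desired_offset=np.array([-4.0, 1.0, 2.0]),
)
print(pos, np.degrees(yaw), np.degrees(pitch))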
