New developments in the fields of virtual reality and unmanned aerial vehicles (UAVs) are creating new
potential use cases. This project aims to create a system for third-person viewing in a real-world environment
using a UAV, a camera, and a head-mounted display. The concept is that the UAV will autonomously
hover a short distance behind the user while recording video which is sent to the user’s head-mounted
display.
The concept is realized using a quadcopter, a UAV with four rotors, which is equipped with a gimbal
and a camera that streams digital video data to the user over a wireless ad-hoc network. The report
demonstrates how a control system that uses the head-mounted display’s sensors to steer the UAV can
be constructed. We present the advantages of using digital video transfer for this type of system. A
Kalman filter, a form of mathematical filter, is constructed and analysed for use in positioning the system.
Theory regarding autonomous positioning using ultrasonic ranging and trilateration is presented but not
implemented. The problems of implementing such an ultrasonic positioning system are presented and
discussed, along with possibilities for future research.
The report is written in Swedish.
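
As an illustration of the kind of filtering summarized above, the following is a minimal sketch of a one-dimensional constant-velocity Kalman filter in Python. The sample period, noise covariances, and measurement model are assumptions chosen for this example, not the configuration used in the report.

# Minimal sketch of a 1D constant-velocity Kalman filter (illustrative only;
# the parameters below are assumptions, not those used in the report).
import numpy as np

dt = 0.1                                 # assumed sample period [s]
F = np.array([[1.0, dt],                 # state transition for [position, velocity]
              [0.0, 1.0]])
H = np.array([[1.0, 0.0]])               # only position is measured
Q = np.diag([1e-3, 1e-2])                # assumed process noise covariance
R = np.array([[0.25]])                   # assumed measurement noise covariance

x = np.zeros((2, 1))                     # initial state estimate
P = np.eye(2)                            # initial estimate covariance

def kalman_step(x, P, z):
    """One predict/update cycle for a scalar position measurement z."""
    x_pred = F @ x                       # predict state
    P_pred = F @ P @ F.T + Q             # predict covariance
    y = z - H @ x_pred                   # innovation
    S = H @ P_pred @ H.T + R             # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)  # Kalman gain
    return x_pred + K @ y, (np.eye(2) - K @ H) @ P_pred

for z in [0.0, 0.11, 0.19, 0.32, 0.41]:  # example noisy position measurements
    x, P = kalman_step(x, P, np.array([[z]]))
print(x.ravel())                         # filtered position and velocity estimate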
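
The trilateration theory mentioned above (presented in the report but not implemented there) can likewise be illustrated with a small 2D example. The beacon positions and measured distances below are made-up example values, not data from the project.

# Minimal 2D trilateration sketch from three ultrasonic range measurements.
# Anchor coordinates and distances are example values only.
import numpy as np

anchors = np.array([[0.0, 0.0],          # assumed beacon positions [m]
                    [4.0, 0.0],
                    [0.0, 3.0]])
d = np.array([2.5, 2.9, 2.2])            # example measured distances [m]

# Subtracting the first circle equation from the others linearizes the problem:
# 2*(x_i - x_1)*x + 2*(y_i - y_1)*y = d_1^2 - d_i^2 + (x_i^2 + y_i^2) - (x_1^2 + y_1^2)
A = 2.0 * (anchors[1:] - anchors[0])
b = d[0]**2 - d[1:]**2 + np.sum(anchors[1:]**2, axis=1) - np.sum(anchors[0]**2)

pos, *_ = np.linalg.lstsq(A, b, rcond=None)   # least-squares position estimate
print(pos)                                    # estimated (x, y) in metres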