OCCLUDED AREA REMOVAL FROM HANDHELD LASER SCANNER DATA DURING 3D BUILDING MODELLING

Abstract. 3D building modelling has become one of the most interesting and active subjects in photogrammetry over the last two decades, and photogrammetry appears to provide the only economic means to acquire truly 3D city data. Most studies have proposed methods for 3D building modelling in LoD2 using aerial images and LIDAR data, with the resulting models enriched by oblique images. An operator is therefore always required to interpret the facade or, in other manual building reconstruction processes, to draw the boundaries that represent the building model, and this process is far too time-consuming for modelling a whole city. Creating building facade models for an entire city requires considerable work, so for decades much research has been dedicated to automating this reconstruction process. Researchers now attempt to devise methods that are flexible enough to model a huge variety of buildings and that address challenges such as irrelevant objects (pedestrians, trees, traffic signs, etc.), occluded areas, and non-homogeneous data. Given the many applications of 3D building models, such as navigation systems, location-based services, and city planning, the demand for adding semantic features (such as windows and doors) is increasing and becoming more essential, so simple blocks are no longer a sufficient representation of 3D buildings. Consequently, 2.5D models, which show facade detail through pixel values, have recently been superseded by LoD3 models. The lack of automation in image-based approaches can be explained by the difficulties of image interpretation. Specifically, factors such as illumination and occlusion cause considerable confusion for machine understanding, and several conditions (relative orientation, feature matching, etc.) must be determined accurately to transfer image pixels to 3D coordinates.
In recent years, terrestrial laser scanning data have proven a valuable source for building facade reconstruction. The point density of stationary laser scanning in urban areas can reach hundreds or thousands of points per square metre, which is high enough to document most details on building facades. Compared with image-based modelling, several steps such as image matching, intersection, and resection are eliminated, and no image interpretation is needed in laser-based reconstruction approaches; nevertheless, these methods face major challenges, such as extracting meaningful structures from a huge amount of data. This paper presents a data-driven algorithm for facade reconstruction using a handheld laser scanner, the Zebedee. The device consists of a 2D laser scanner and an inertial measurement unit mounted on one or two springs and has a 270-degree field of view; its mass of only 210 g makes it easy to carry, and its maximum range is 30 m. The proposed method was implemented on a Zebedee point cloud in order to determine the challenges of ZEB1 data and to verify that the device is practical for 3D reconstruction. Because of obstacles, gross operator errors during data capture, and the arrangement of facade elements, there will always be occluded areas and shadows in the produced data. Occluded areas hamper machine understanding and cause problems for automatic reconstruction algorithms. The proposed method offers a new way to detect occluded areas and to remove the artificial objects they produce. The 3D point cloud covers all facade elements and details, and the image-matching and 3D-data-generation steps are omitted from the process. The proposed workflow is shown in figure 1.
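One common data-driven way to make occluded areas explicit, sketched below under assumptions not stated in the paper, is to project the facade points onto the dominant facade plane, rasterise them into a 2D occupancy grid, and flag empty cells that are enclosed by measured surface as candidate occlusion holes. The function name `occlusion_mask`, the cell size, and the enclosure test are all illustrative choices, not the authors' algorithm:

```python
import numpy as np

def occlusion_mask(facade_uv, cell=0.1):
    """Rasterise facade points (u, v coordinates in metres, already
    projected onto the facade plane) into an occupancy grid and flag
    empty interior cells as candidate occlusions.

    Illustrative sketch only; cell size and the enclosure criterion
    are assumptions, not the method proposed in the paper.
    """
    uv = np.asarray(facade_uv, dtype=float)
    lo = uv.min(axis=0)
    idx = np.floor((uv - lo) / cell).astype(int)
    shape = tuple(idx.max(axis=0) + 1)
    occ = np.zeros(shape, dtype=bool)
    occ[idx[:, 0], idx[:, 1]] = True
    # A cell is a candidate occlusion if it is empty but measured
    # surface exists on all four sides along its row and column,
    # i.e. the hole is enclosed by scanned facade.
    holes = np.zeros_like(occ)
    for i in range(shape[0]):
        for j in range(shape[1]):
            if occ[i, j]:
                continue
            if (occ[:i, j].any() and occ[i + 1:, j].any()
                    and occ[i, :j].any() and occ[i, j + 1:].any()):
                holes[i, j] = True
    return occ, holes
```

Cells flagged in `holes` can then be treated as unobserved rather than as real facade geometry, so later reconstruction steps do not turn them into artificial objects.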
Most research on the detection and reconstruction of roads, buildings, and other objects prioritises ground-point detection in order to decrease data volume and processing time. Therefore, as a pre-processing step, the point cloud is classified into two separate groups: ground and non-ground points.
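A minimal sketch of such a ground/non-ground split, assuming a simple grid-minimum criterion (the paper does not specify its classifier; the function name `split_ground`, the 1 m cell size, and the 0.2 m height tolerance are all hypothetical), could look like this:

```python
import numpy as np

def split_ground(points, cell=1.0, dz=0.2):
    """Split an (N, 3) point cloud into (ground, non-ground) arrays.

    Each point is compared with the lowest point in its XY grid cell;
    points within dz metres of that cell minimum are labelled ground.
    Illustrative sketch only, not the paper's actual classifier.
    """
    pts = np.asarray(points, dtype=float)
    lo = pts[:, :2].min(axis=0)
    idx = np.floor((pts[:, :2] - lo) / cell).astype(int)
    # Flatten 2D cell indices into one key per point.
    keys = idx[:, 0] * (idx[:, 1].max() + 1) + idx[:, 1]
    # Lowest z per occupied cell.
    zmin = {}
    for k, z in zip(keys, pts[:, 2]):
        if k not in zmin or z < zmin[k]:
            zmin[k] = z
    is_ground = np.array([pts[i, 2] - zmin[keys[i]] <= dz
                          for i in range(len(pts))])
    return pts[is_ground], pts[~is_ground]
```

Such a grid-minimum filter is crude on sloped terrain but cheap, which fits its role here: it only has to shrink the data volume before the facade-oriented processing begins.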