The last decade of the twentieth century and the first years of the present one have brought an expansion of technologies for gathering and processing spatial data, which forms the physical basis for the management of spatial development. This has enlarged the spatial data market. New technologies, embodied in computer applications, have greatly expanded the number of users of these products. The philosophy of spatial data collection has also changed: analogue maps and plans printed on paper have been replaced by digital databases, which allow the data to be presented in the way best suited to a particular user and which users can further upgrade themselves. In these circumstances, two aspects of database production and distribution are especially important. First, the users of these databases should be the ones who decide which of the available bases satisfy their requirements, in other words, what level of data quality a given application demands. Second, the visualization of digital databases can mislead, since a rendering may suggest better accuracy than the data actually possess. Users should therefore have access to methods that indicate the quality of the selected data during analysis. Adopted international standards, or specially developed procedures and methodologies (so-called de facto standards), can be used in this data processing to estimate data quality. The development of the Open GIS concept requires the adoption of widely accepted standards for spatial data quality; it is recommended that ISO standards be accepted, above all those of ISO/TC 211, which concern geographic information and geomatics.
The ISO standardization projects are scheduled for completion by 2006, so all producers and users of these databases should be familiar with the project and ready to adapt to its solutions. This paper explains the basic components that define database quality and presents the standardization results obtained so far regarding the procedures and methodology of quality assessment.