Please use this identifier to cite or link to this item:
http://acervodigital.unesp.br/handle/11449/116495
- Title: HTS and HTSn: New shape descriptors based on Hough transform statistics
- Institution: Universidade Estadual Paulista (UNESP)
- ISSN: 1077-3142
- Abstract: With the widespread proliferation of computers, many human activities now rely on automatic image analysis. The basic features used for image analysis include color, texture, and shape. In this paper, we propose a new shape description method, called Hough Transform Statistics (HTS), which uses statistics from the Hough space to characterize the shape of objects or regions in digital images. A modified version of this method, called Hough Transform Statistics neighborhood (HTSn), is also presented. Experiments carried out on three popular public image databases showed that the HTS and HTSn descriptors are robust, since they yielded much better precision-recall results than several other well-known shape description methods. When compared to the Beam Angle Statistics (BAS) method, a shape description method that inspired their development, both the HTS and the HTSn methods presented inferior results on the precision-recall criterion, but superior results on the processing time and multiscale separability criteria. The linear complexity of the HTS and HTSn algorithms, in contrast to BAS, makes them more appropriate for shape analysis in high-resolution image retrieval tasks on very large databases, which are very common nowadays. (C) 2014 Elsevier Inc. All rights reserved.
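The abstract describes HTS only at a high level: statistics computed over the Hough space of a shape. As a rough, non-authoritative sketch of that idea (not the authors' actual method), the Python snippet below accumulates a (rho, theta) Hough space from a shape's contour points and concatenates per-angle mean and standard deviation into a feature vector. The function names, bin counts, and choice of statistics are assumptions made here for illustration only.

import numpy as np

def hough_space(points, n_theta=180, n_rho=128):
    # Accumulate a standard (rho, theta) Hough space from contour points
    # given as an (N, 2) array of (x, y) coordinates.
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    x, y = points[:, 0], points[:, 1]
    rho_max = np.hypot(np.abs(x).max(), np.abs(y).max())
    # rho = x*cos(theta) + y*sin(theta), one value per (point, angle) pair
    rho = np.outer(x, np.cos(thetas)) + np.outer(y, np.sin(thetas))
    bins = np.linspace(-rho_max, rho_max, n_rho + 1)
    acc = np.empty((n_rho, n_theta))
    for j in range(n_theta):
        acc[:, j], _ = np.histogram(rho[:, j], bins=bins)
    return acc

def hough_statistics_descriptor(points, n_theta=180, n_rho=128):
    # Toy "Hough space statistics" descriptor: normalize the accumulator
    # and concatenate per-angle mean and standard deviation. For a fixed
    # number of angles, the cost grows linearly with the contour length.
    acc = hough_space(points, n_theta, n_rho)
    acc = acc / acc.sum()
    return np.concatenate([acc.mean(axis=0), acc.std(axis=0)])

# Usage: descriptor of a synthetic circular contour (360-dimensional here).
t = np.linspace(0.0, 2.0 * np.pi, 400, endpoint=False)
circle = np.stack([100 + 40 * np.cos(t), 100 + 40 * np.sin(t)], axis=1)
print(hough_statistics_descriptor(circle).shape)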
- Issue Date: 1-Oct-2014
- Citation: Computer Vision and Image Understanding. San Diego: Academic Press Inc Elsevier Science, v. 127, p. 43-56, 2014.
- Pages: 43-56
- Publisher: Elsevier B.V.
- Keywords: HTS; HTSn; Image analysis; Shape analysis; Hough transform; Content-based image retrieval; Optical character recognition
- DOI: http://dx.doi.org/10.1016/j.cviu.2014.06.010
- Access rights: Restricted access
- Other
- URI: http://repositorio.unesp.br/handle/11449/116495