
Please use this identifier to cite or link to this item: http://acervodigital.unesp.br/handle/11449/72693
Full metadata record
DC Field | Value | Language
dc.contributor.author | Ponti Jr., Moacir P. | -
dc.contributor.author | Papa, João Paulo | -
dc.date.accessioned | 2014-05-27T11:26:00Z | -
dc.date.accessioned | 2016-10-25T18:34:42Z | -
dc.date.available | 2014-05-27T11:26:00Z | -
dc.date.available | 2016-10-25T18:34:42Z | -
dc.date.issued | 2011-09-26 | -
dc.identifier | http://dx.doi.org/10.1007/978-3-642-21557-5_26 | -
dc.identifier.citation | Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), v. 6713 LNCS, p. 237-248. | -
dc.identifier.issn | 0302-9743 | -
dc.identifier.issn | 1611-3349 | -
dc.identifier.uri | http://hdl.handle.net/11449/72693 | -
dc.identifier.uri | http://acervodigital.unesp.br/handle/11449/72693 | -
dc.description.abstract | The Optimum-Path Forest (OPF) classifier is a recent and promising method for pattern recognition, with a fast training algorithm and good accuracy results. Therefore, the investigation of a combining method for this kind of classifier can be important for many applications. In this paper we report a fast method to combine OPF-based classifiers trained with disjoint training subsets. Given a fixed number of subsets, the algorithm chooses random samples, without replacement, from the original training set. Each subset's accuracy is improved by a learning procedure. The final decision is given by majority vote. Experiments with simulated and real data sets showed that the proposed combining method is more efficient and effective than the naive approach, provided certain conditions are met. It was also shown that the OPF training step runs faster for a series of small subsets than for the whole training set. The combining scheme was also designed to support parallel or distributed processing, speeding up the procedure even more. © 2011 Springer-Verlag. | en
dc.format.extent | 237-248 | -
dc.language.iso | eng | -
dc.source | Scopus | -
dc.subject | distributed combination of classifiers | -
dc.subject | Optimum-Path Forest classifier | -
dc.subject | pasting small votes | -
dc.subject | Combination of classifiers | -
dc.subject | Combining method | -
dc.subject | Combining schemes | -
dc.subject | Fast methods | -
dc.subject | Final decision | -
dc.subject | Fixed numbers | -
dc.subject | Forest classifiers | -
dc.subject | Learning procedures | -
dc.subject | Majority vote | -
dc.subject | Parallel or distributed processing | -
dc.subject | Random sample | -
dc.subject | Real data sets | -
dc.subject | Training algorithms | -
dc.subject | Training sets | -
dc.subject | Training subsets | -
dc.subject | Algorithms | -
dc.subject | Pattern recognition systems | -
dc.subject | Set theory | -
dc.title | Improving accuracy and speed of optimum-path forest classifier using combination of disjoint training subsets | en
dc.type | outro (other) | -
dc.contributor.institution | Universidade de São Paulo (USP) | -
dc.contributor.institution | Universidade Estadual Paulista (UNESP) | -
dc.description.affiliation | Institute of Mathematical and Computer Sciences, University of São Paulo (ICMC/USP), 13560-970 São Carlos, SP | -
dc.description.affiliation | Department of Computing, UNESP - Univ. Estadual Paulista, Bauru, SP | -
dc.description.affiliationUnesp | Department of Computing, UNESP - Univ. Estadual Paulista, Bauru, SP | -
dc.identifier.doi | 10.1007/978-3-642-21557-5_26 | -
dc.rights.accessRights | Acesso restrito (restricted access) | -
dc.relation.ispartof | Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) | -
dc.identifier.scopus | 2-s2.0-80053014556 | -
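
Note: the abstract above describes the combining scheme concretely enough to sketch in code: partition the training set into disjoint random subsets (sampling without replacement), train one OPF classifier per subset, and combine predictions by majority vote. The sketch below illustrates that general scheme only and is not the authors' implementation; a scikit-learn DecisionTreeClassifier stands in for the Optimum-Path Forest classifier (OPF is not part of scikit-learn), the per-subset learning procedure mentioned in the abstract is omitted, and the names train_disjoint_ensemble, majority_vote, and n_subsets are illustrative.

# Minimal sketch (not the authors' implementation) of the combining scheme
# described in the abstract: split the training set into disjoint subsets by
# sampling without replacement, train one classifier per subset, and decide
# test labels by majority vote. DecisionTreeClassifier is only a stand-in for
# the Optimum-Path Forest classifier.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

def train_disjoint_ensemble(X_train, y_train, n_subsets=5, seed=0):
    """Train one classifier per disjoint random subset of the training data."""
    rng = np.random.default_rng(seed)
    order = rng.permutation(len(X_train))          # sampling without replacement
    models = []
    for idx in np.array_split(order, n_subsets):   # disjoint, roughly equal parts
        clf = DecisionTreeClassifier(random_state=seed)  # stand-in for OPF
        clf.fit(X_train[idx], y_train[idx])
        models.append(clf)
    return models

def majority_vote(models, X):
    """Combine the per-subset predictions by simple majority vote."""
    votes = np.stack([m.predict(X) for m in models]).astype(int)  # (n_subsets, n_samples)
    return np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)

if __name__ == "__main__":
    X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    ensemble = train_disjoint_ensemble(X_tr, y_tr, n_subsets=5)
    accuracy = (majority_vote(ensemble, X_te) == y_te).mean()
    print(f"majority-vote ensemble accuracy: {accuracy:.3f}")

Because each subset is trained independently, the per-subset fits can run in parallel, which is the property the abstract highlights for parallel or distributed processing; the "pasting small votes" subject term refers to this style of combination.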
Appears in Collections: Artigos, TCCs, Teses e Dissertações da Unesp

There are no files associated with this item.
 

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.