Please use this identifier to cite or link to this item: http://acervodigital.unesp.br/handle/11449/9651
Title: 
Analog neural nonderivative optimizers
Author(s): 
Institution: 
  • Universidade Estadual Paulista (UNESP)
  • Purdue Univ
ISSN: 
1045-9227
Abstract: 
Continuous-time neural networks for solving convex nonlinear unconstrained programming problems without using gradient information of the objective function are proposed and analyzed. Thus, the proposed networks are nonderivative optimizers. First, networks for optimizing objective functions of one variable are discussed. Then, an existing one-dimensional optimizer is analyzed, and a new line search optimizer is proposed. It is shown that the proposed optimizer network is robust in the sense that it has a disturbance rejection property. The network can be implemented easily in hardware using standard circuit elements. The one-dimensional net is used as a building block in multidimensional networks for optimizing objective functions of several variables. The multidimensional nets implement a continuous version of the coordinate descent method.
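The abstract describes a derivative-free coordinate descent scheme in which a one-dimensional line search unit is applied to each variable in turn. A minimal discrete-time sketch of that idea (not the paper's continuous-time analog circuit) is shown below, using golden-section search as an assumed stand-in for the paper's one-dimensional optimizer network; the function names and the bracket width are illustrative choices, not taken from the paper.

```python
import math

def golden_section(f, a, b, tol=1e-8):
    """Derivative-free 1-D minimizer on [a, b] via golden-section search.
    Plays the role of the paper's one-dimensional line-search network."""
    invphi = (math.sqrt(5) - 1) / 2
    c, d = b - invphi * (b - a), a + invphi * (b - a)
    while abs(b - a) > tol:
        if f(c) < f(d):
            b, d = d, c
            c = b - invphi * (b - a)
        else:
            a, c = c, d
            d = a + invphi * (b - a)
    return (a + b) / 2

def coordinate_descent(f, x0, bracket=10.0, sweeps=50):
    """Minimize a convex f over R^n by cycling through the coordinates,
    solving each 1-D subproblem without derivative information."""
    x = list(x0)
    for _ in range(sweeps):
        for i in range(len(x)):
            # Restrict f to coordinate i, holding the others fixed
            g = lambda t: f(x[:i] + [t] + x[i + 1:])
            x[i] = golden_section(g, x[i] - bracket, x[i] + bracket)
    return x

# Example: convex quadratic with a cross term (minimizer near (1.6, -2.4))
f = lambda x: (x[0] - 1) ** 2 + (x[1] + 2) ** 2 + 0.5 * x[0] * x[1]
print(coordinate_descent(f, [0.0, 0.0]))
```

In the paper, each coordinate update is performed by an analog circuit evolving in continuous time rather than by an iterative search, but the decomposition of an n-dimensional problem into repeated one-dimensional minimizations is the same.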
Issue Date: 
1-Jul-1998
Citation: 
IEEE Transactions on Neural Networks. New York: IEEE-Inst Electrical Electronics Engineers Inc., v. 9, n. 4, p. 629-638, 1998.
Pages: 
629-638
Publisher: 
Institute of Electrical and Electronics Engineers (IEEE)
Keywords: 
  • analog networks
  • coordinate descent
  • derivative free optimization
  • unconstrained optimization
Source: 
http://dx.doi.org/10.1109/72.701176
URI: 
Access Rights: 
Restricted access
Type: 
other
Source:
http://repositorio.unesp.br/handle/11449/9651
Appears in Collections: Artigos, TCCs, Teses e Dissertações da Unesp

There are no files associated with this item.
 

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.