Accession Number : ADA255441


Title :   Recursively Generated Networks and Dynamical Learning


Descriptive Note : Final rept.


Corporate Author : YALE UNIV NEW HAVEN CT DEPT OF COMPUTER SCIENCE


Personal Author(s) : Mjolsness, Eric


Full Text : https://apps.dtic.mil/dtic/tr/fulltext/u2/a255441.pdf


Report Date : Dec 1991


Pagination or Media Count : 90


Abstract : Much of the research has been based on the premise that mathematical methods and notation associated with constrained optimization should be used to specify a neural net, which can then be compiled to diverse implementations. But where does one get such a compiler? And what are the details of this mathematical notation? They have made substantial progress on these research questions: (1) They have developed mathematical methods that can transform one algebraic neural net (NN) description into another, more implementable one. These developments were attained through sustained work in the applied mathematics of neural nets. They can form the basis of a neural compiler because they address most of the major NN compilation and implementation issues, but they do not yet suffice. (2) They have been accumulating these research results in a neural simulator, which can be expanded into a semi-automatic compiler: a neural net design and implementation environment based on mathematical methods. (3) They have developed a mathematical notation (not yet a formal language) for describing complex problem domains in terms of constrained optimization problems; those optimization problems can then be solved by neural nets.
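To make the abstract's central idea concrete, here is a minimal illustrative sketch (not code from the report) of specifying a network by a constrained-optimization objective and "compiling" it into dynamics. A toy winner-take-all problem (pick the cheapest of n options) is written as a penalty-augmented energy function, and gradient descent on that energy plays the role of the derived neural dynamics; all names and parameter values are hypothetical.

```python
# Sketch: a net specified as constrained optimization, then run as dynamics.
# Energy: E(v) = sum_i c_i * v_i + penalty * (sum_i v_i - 1)^2, with v_i in [0, 1].
# The linear term rewards cheap options; the penalty enforces "choose one".

def grad(v, c, penalty=10.0):
    """Gradient of E(v) with respect to each unit activation v_i."""
    s = sum(v)
    return [ci + 2.0 * penalty * (s - 1.0) for ci in c]

def run_network(c, steps=2000, lr=0.01):
    """Descend the energy, clipping each unit to [0, 1] (a crude bounded unit)."""
    n = len(c)
    v = [1.0 / n] * n                      # start from a uniform soft assignment
    for _ in range(steps):
        g = grad(v, c)
        v = [min(1.0, max(0.0, vi - lr * gi)) for vi, gi in zip(v, g)]
    return v

v = run_network([3.0, 1.0, 2.0])
print(v.index(max(v)))                     # the cheapest option (index 1) wins
```

The point of the sketch is the separation the abstract describes: the problem is stated purely as an objective (one algebraic NN description), and a mechanical transformation (here, gradient descent with clipping) yields an implementable dynamical system (another, more implementable description).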


Descriptors :   *MATHEMATICAL MODELS , *COMPILERS , *LEARNING , SIMULATORS , NEURAL NETS , OPTIMIZATION , NETS , AUTOMATIC , ENVIRONMENTS , LANGUAGE , MATHEMATICS , APPLIED MATHEMATICS


Subject Categories : Theoretical Mathematics
      Computer Programming and Software


Distribution Statement : APPROVED FOR PUBLIC RELEASE