Neural Network Computing Architectures of Coupled Associative Memories with Dynamic Attractors.
Final technical report, 14 May 1994 to 14 May 1996.
University of California, Berkeley, Center for Pure and Applied Mathematics
In this period, previous work on the construction of an oscillating neural network computer that could recognize character sequences of a grammar was extended to employ selective attentional control of synchronization, which directs the flow of communication and computation within the architecture. This selective control of synchronization was used to solve a more difficult grammatical inference problem than previously attempted. Further performance improvement was demonstrated by introducing a temporal context hierarchy in the hidden and context units of the architecture. These units form a temporal counting hierarchy that allows representations of the input variations to form at different temporal scales, enabling the learning of sequences with long temporal dependencies. We further explored the analog system-identification capabilities of these systems, in which the output modules take on analog values, and were able to learn a mapping from the acoustic cepstral values of speech to articulatory parameters such as jaw and lip movement. This is a model speech-processing problem that lets us test the usefulness of our systems for speech-recognition preprocessing.
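The idea of a temporal context hierarchy can be illustrated with a simple sketch. The code below is not the report's actual architecture; it is a hypothetical bank of leaky-integrator context units, one per decay constant, where units with decay constants closer to 1 retain input history over longer temporal spans, giving the multiple temporal scales the abstract describes.

```python
import numpy as np

def temporal_context_hierarchy(inputs, decays=(0.2, 0.6, 0.9)):
    """Illustrative sketch only: a bank of leaky-integrator context
    units. Each unit blends its previous state with the new input;
    a decay near 1 yields a long memory, a decay near 0 a short one,
    so the bank represents the input at several temporal scales."""
    decays = np.array(decays)
    contexts = np.zeros(len(decays))
    trace = []
    for x in inputs:
        # leaky integration: keep a decay-weighted fraction of the old
        # state and mix in the new input with the complementary weight
        contexts = decays * contexts + (1.0 - decays) * x
        trace.append(contexts.copy())
    return np.array(trace)  # shape: (timesteps, num_scales)

# An impulse input decays quickly in the fast unit and slowly in the
# slow unit, so long temporal dependencies survive at the slow scale.
trace = temporal_context_hierarchy([1.0, 0.0, 0.0, 0.0])
```

In a recurrent architecture like the one described above, such multi-scale context states would be fed back to the hidden units, so the network can condition its predictions on both recent and distant input history.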
- Electrical and Electronic Equipment
- Computer Hardware