Accession Number:

ADA285098

Title:

Fault Tolerance of Neural Networks

Descriptive Note:

Final rept., Feb 1992-Aug 1993

Corporate Author:

SYRACUSE UNIV NY SCHOOL OF COMPUTER AND INFORMATION SCIENCE

Report Date:

1994-07-01

Pagination or Media Count:

107

Abstract:

This effort studied fault tolerance aspects of artificial neural networks and resulted in the development of neural learning techniques that more effectively exploit the inherent redundancy, i.e., the excess of resources over the minimum required, found in most classically trained networks. Performance evaluation measures were developed and used to quantify network tolerance to faults such as single link failures, multiple node failures, multiple link failures, and small degradations in multiple links or nodes. Several variations of the basic back-propagation algorithm were designed and implemented, yielding improvements in fault tolerance. An Addition-Deletion algorithm was designed to successively modify the size of a network by deleting nodes that do not contribute to fault tolerance and adding new nodes in a way that is assured to improve fault tolerance. The techniques designed in this project were compared to those suggested by others and were found to improve robustness. Also, a Refine algorithm was defined, which takes a robust network that does not satisfy hardware restrictions and transforms it into another network that does.
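The record does not include code; as a rough illustration of the kind of fault-tolerance evaluation measure the abstract describes, the sketch below zeroes each weight of a small feedforward network in turn (a single-link "stuck-at-zero" fault) and records the resulting drop in classification accuracy. The layer sizes, tanh activation, and synthetic data are illustrative assumptions, not the networks or measures actually used in the report.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy two-layer network (hypothetical sizes, not from the report).
    W1 = rng.normal(size=(8, 4))   # hidden-layer weights ("links")
    W2 = rng.normal(size=(2, 8))   # output-layer weights

    def forward(x, W1, W2):
        h = np.tanh(W1 @ x)
        return W2 @ h

    def accuracy(X, y, W1, W2):
        preds = np.array([np.argmax(forward(x, W1, W2)) for x in X])
        return np.mean(preds == y)

    # Synthetic evaluation data (stand-in for a real test set).
    X = rng.normal(size=(100, 4))
    y = np.array([np.argmax(forward(x, W1, W2)) for x in X])  # fault-free labels

    # Single-link failure tolerance: zero each weight in turn and record
    # the average and worst-case accuracy relative to the fault-free network.
    accs = []
    for W in (W1, W2):
        for idx in np.ndindex(W.shape):
            saved = W[idx]
            W[idx] = 0.0            # inject a stuck-at-zero link fault
            accs.append(accuracy(X, y, W1, W2))
            W[idx] = saved          # restore the link

    print(f"fault-free accuracy: {accuracy(X, y, W1, W2):.3f}")
    print(f"mean accuracy under single-link faults:  {np.mean(accs):.3f}")
    print(f"worst accuracy under single-link faults: {np.min(accs):.3f}")

The same loop extends to multiple-link or node faults by zeroing several weights, or an entire row of a weight matrix, before measuring accuracy.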

Subject Categories:

  • Computer Systems

Distribution Statement:

APPROVED FOR PUBLIC RELEASE