Accession Number : AD1006958

Title :   Sparse Distributed Representation and Hierarchy: Keys to Scalable Machine Intelligence

Descriptive Note : Technical Report, 29 Apr 2013 - 30 Nov 2015

Corporate Author : Neurithmic Systems, LLC, Newton, United States

Personal Author(s) : Rinkus, Gerard ; Lesher, Greg ; Leveille, Jasmin ; Layton, Oliver

Report Date : 01 Apr 2016

Pagination or Media Count : 231

Abstract : We developed and tested a cortically inspired model of spatiotemporal pattern learning and recognition called Sparsey. Sparsey is a hierarchical model with an arbitrary number of levels, each consisting of coding modules that represent information, specifically particular spatiotemporal input moments, using sparse distributed representations (SDRs). The modules are called macs because they are proposed as analogs of the canonical cortical processing module known as the macrocolumn. Sparsey differs from mainstream neural models, e.g., Deep Learning, in many ways, including: a) it uses single-trial Hebbian learning rather than incremental, many-trial, gradient-based learning; and b) it multiplicatively combines bottom-up, top-down, and horizontal evidence at every unit (neuron) in every mac at every level on every time step during learning and inference (retrieval). However, Sparsey's greatest distinguishing characteristic is that it performs both learning (storage) and retrieval of the best-matching stored input in time that remains constant regardless of how many patterns (how much information) have been stored. It therefore has excellent potential to scale to Big Data-sized problems. We conducted numerous studies establishing the model's basic properties and capacities, culminating in a demonstration of 67% classification accuracy on the Weizmann data set, achieved with 3.5 minutes of training time, no machine parallelism, and almost no software optimization.
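The two algorithmic claims in the abstract, single-trial Hebbian storage of sparse distributed codes and retrieval whose cost does not grow with the number of stored patterns, can be illustrated with a minimal sketch. This is a hedged toy, not Sparsey's actual algorithm; all names, parameters, and the random code-assignment scheme below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

N_IN, N_CODE, K = 64, 256, 8  # input size, coding-field size, active units per SDR code

# Binary Hebbian weight matrix from input units to coding-field units; starts empty.
W = np.zeros((N_IN, N_CODE))

def assign_code():
    """Pick a sparse distributed code: K active units out of N_CODE."""
    code = np.zeros(N_CODE)
    code[rng.choice(N_CODE, K, replace=False)] = 1.0
    return code

def store(x, code):
    """Single-trial Hebbian learning: one pass sets the weights
    between co-active input and code units (no gradient descent)."""
    np.maximum(W, np.outer(x, code), out=W)

def retrieve(x):
    """Fixed-cost retrieval: one matrix-vector product over the
    weight matrix, then reactivate the K most-driven code units.
    The cost depends only on N_IN * N_CODE, not on how many
    patterns have been stored in W."""
    act = x @ W
    code = np.zeros(N_CODE)
    code[np.argsort(act)[-K:]] = 1.0
    return code

# Store a handful of random binary inputs, each in a single trial.
patterns = [(rng.random(N_IN) < 0.2).astype(float) for _ in range(5)]
codes = [assign_code() for _ in patterns]
for x, c in zip(patterns, codes):
    store(x, c)

# Presenting a stored input reactivates its own sparse code.
assert np.array_equal(retrieve(patterns[0]), codes[0])
```

In this toy, retrieval time is constant because it is a single fixed-size matrix operation; stored patterns are superposed in `W` rather than searched one by one, which is the scaling property the abstract attributes to Sparsey's SDR-based macs.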

Descriptors :   artificial intelligence , artificial neural networks , pattern recognition , information processing , machine learning , video frames , probabilistic models

Distribution Statement : APPROVED FOR PUBLIC RELEASE