Accession Number:

AD1006958

Title:

Sparse Distributed Representation and Hierarchy: Keys to Scalable Machine Intelligence

Descriptive Note:

Technical Report, 29 Apr 2013 - 30 Nov 2015

Corporate Author:

Neurithmic Systems, LLC, Newton, United States

Report Date:

2016-04-01

Pagination or Media Count:

231

Abstract:

We developed and tested a cortically inspired model of spatiotemporal pattern learning and recognition called Sparsey. Sparsey is a hierarchical model, allowing an arbitrary number of levels, consisting of coding modules that represent information (specifically, particular spatiotemporal input moments) using sparse distributed representations (SDRs). The modules are called "macs" because they are proposed as analogs of the canonical cortical processing module known as the macrocolumn. Sparsey differs from mainstream neural models, e.g., Deep Learning, in many ways, including that: (a) it uses single-trial, Hebbian learning rather than incremental, many-trial, gradient-based learning; and (b) it multiplicatively combines bottom-up, top-down, and horizontal evidence at every unit (neuron) in every mac at every level on every time step, during both learning and inference (retrieval). However, Sparsey's greatest distinguishing characteristic is that it does both learning (storage) and retrieval of the best-matching stored input in time that remains constant regardless of how many patterns (how much information) have been stored. Thus, it has excellent scaling potential to Big Data-sized problems. We conducted numerous studies establishing basic properties and capacities, culminating in a demonstration of 67% classification accuracy on the Weizmann data set, accomplished with 3.5 minutes of training time, with no machine parallelism and almost no software optimization.
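As a purely illustrative sketch of these ideas, the Python fragment below models a single mac as Q winner-take-all competitive modules of K cells each, so an SDR code is one winner per module. Evidence terms combine multiplicatively at every cell, and storage is a single Hebbian step. All sizes, names, and the winner-selection rule are assumptions made for illustration; the abstract does not specify Sparsey's actual Code Selection Algorithm.

import numpy as np

# Hypothetical sketch: a "mac" built from Q winner-take-all competitive
# modules of K cells each.  An SDR code is one winner per module (Q active
# cells out of Q*K).  Bottom-up, top-down, and horizontal evidence combine
# multiplicatively at every cell, and learning is single-trial Hebbian:
# weights onto winning cells are set to 1 in one shot.  All names, sizes,
# and the selection rule are assumptions, not Sparsey's actual algorithm.

rng = np.random.default_rng(0)

Q, K = 8, 16        # competitive modules per mac, cells per module
N_IN = 64           # size of the binary bottom-up input vector

# Binary Hebbian weights from each input bit to every cell in the mac.
W_bu = np.zeros((N_IN, Q * K))

def evidence(x, w, eps=1e-3):
    """Normalized overlap of binary input x with each cell's afferent weights."""
    support = x @ w                           # raw input overlap per cell
    return (support + eps) / (x.sum() + eps)  # scaled into (0, 1]

def choose_code(x, d=1.0, h=1.0):
    """Pick one winner per module.  Top-down (d) and horizontal (h) evidence
    multiply the bottom-up term; they default to the neutral value 1.0 here."""
    u = evidence(x, W_bu).reshape(Q, K)       # bottom-up evidence per cell
    total = u * d * h                         # multiplicative combination
    total = total * (1 + 0.01 * rng.random((Q, K)))  # noise breaks ties among novel cells
    return total.argmax(axis=1)               # winner index in each module

def learn(x, code):
    """Single-trial Hebbian step: active input bits -> winning cells, weight 1."""
    for q, k in enumerate(code):
        W_bu[x.astype(bool), q * K + k] = 1.0

# Store one binary pattern, then retrieve it: the stored code wins again.
x = (rng.random(N_IN) < 0.2).astype(float)
code = choose_code(x)
learn(x, code)
assert np.array_equal(choose_code(x), code)

Note that retrieval here is a fixed amount of work over the mac's weights, independent of how many patterns have been stored, which is the fixed-time property the abstract emphasizes.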

Subject Categories:

Distribution Statement:

APPROVED FOR PUBLIC RELEASE