The goal of this project is to uncover the effects of learning and long-term memory storage on synaptic connectivity, thereby creating the basis for quantitative analyses of these fundamental brain functions. To that end, we developed a model of a biologically inspired recurrent neural network capable of storing and reliably retrieving temporal sequences of network states. The model incorporates many basic elements of local connectivity in the mammalian neocortex, including excitatory and inhibitory neuron classes, neuron morphologies, a homeostatic constraint on connection weights, and several types of errors and noise in signal transmission. The model was solved analytically and numerically with the replica and convex-optimization methods. In addition, a perceptron-type learning rule was developed to load associative memory sequences into the network in a biologically plausible, online manner. Our results reveal that when individual neurons are robustly loaded with a near-maximum number of memories they can support, the network develops many structural and dynamical properties consistent with experimental observations.
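To illustrate the general idea of loading a temporal sequence with a perceptron-type rule, the following is a minimal sketch under strong simplifying assumptions that are not part of the model described above: binary ±1 network states, a single random sequence, no excitatory/inhibitory distinction, no homeostatic constraint, and no transmission noise. All parameter names and values here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (not from the model): N neurons, a sequence of P states.
N, P = 50, 10
kappa = 0.1          # robustness margin
eta = 0.05           # learning rate
seq = rng.choice([-1.0, 1.0], size=(P, N))   # random +/-1 network states

W = np.zeros((N, N))  # recurrent weights, trained so that seq[t] maps to seq[t + 1]

# Perceptron-type online rule: for each neuron, nudge its incoming weights
# whenever its input current fails to predict its next state with margin kappa.
for epoch in range(500):
    errors = 0
    for t in range(P - 1):
        pre, post = seq[t], seq[t + 1]
        h = W @ pre                       # input currents
        wrong = post * h <= kappa         # neurons below the margin
        W[wrong] += eta * np.outer(post[wrong], pre)
        errors += int(wrong.sum())
    if errors == 0:                       # every transition stored robustly
        break

# Retrieval: iterate the deterministic dynamics from the first state.
x = seq[0].copy()
recalled = [x]
for t in range(P - 1):
    x = np.sign(W @ x)
    recalled.append(x)
print(np.all(np.array(recalled) == seq))
```

Because the load here (P - 1 transitions per neuron) is far below capacity, the margin perceptron converges and the dynamics replay the sequence exactly; the robustness margin `kappa` is a toy stand-in for the "robust loading" notion discussed above.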