Associative Fine-Tuning of Biologically Inspired Active Neuro-Associative Knowledge Graphs

This paper introduces a new tuning algorithm that improves the associative training of active neuro-associative knowledge graphs (ANAKG). We also extend the definition of synaptic weights with new multiplicative factors. Biological neural networks are sparse and develop through neuronal plasticity processes that adapt them to repeatable combinations of input stimuli. Real neurons connect conditionally, according to neural activity and on demand of biochemical processes; they do not connect to all neurons in subsequent layers, as is usually done in artificial neural networks. For more than a decade, scientists have conducted extensive research on adaptation mechanisms that use sparsely connected neural structures, which can specialize and adapt to training data faster. This approach is also widely used in various learning strategies for deep neural networks. The conditional creation of sparse connections and the fine-tuning of their weights in complex associative ANAKG neuronal structures are the main contributions of this paper. A significant improvement in recalling context-based associations was verified experimentally.
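
As a rough illustration of the principle named above (connections created conditionally between co-activated neurons, with repeated co-activation strengthening existing weights multiplicatively), the following Python sketch builds a toy associative graph over a word sequence. The class name, parameters (window, boost, init_weight) and the example sentence are hypothetical placeholders chosen for illustration only; this is not the ANAKG training or fine-tuning algorithm described in the paper.

```python
from collections import defaultdict


class AssociativeGraph:
    """Toy neuro-associative graph: connections are created only between
    nodes activated close together in time (conditional, sparse wiring),
    and repeated co-activation strengthens an existing weight
    multiplicatively instead of wiring every node to every other node."""

    def __init__(self, window=2, init_weight=0.1, boost=1.5, max_weight=1.0):
        self.window = window              # how many recent activations count as context
        self.init_weight = init_weight    # weight assigned when a connection is first created
        self.boost = boost                # multiplicative strengthening factor
        self.max_weight = max_weight      # saturation limit for weights
        self.weights = defaultdict(dict)  # pre-node -> {post-node: weight}
        self.recent = []                  # recently activated nodes, in order

    def stimulate(self, node):
        """Activate `node`; connect it to nodes activated within the context window."""
        for pre in self.recent[-self.window:]:
            if node not in self.weights[pre]:
                # connection is created on demand, only after actual co-activation
                self.weights[pre][node] = self.init_weight
            else:
                # repeated co-activation strengthens the existing connection
                w = self.weights[pre][node] * self.boost
                self.weights[pre][node] = min(w, self.max_weight)
        self.recent.append(node)

    def recall(self, node):
        """Return successors of `node` ordered by association strength."""
        return sorted(self.weights[node].items(), key=lambda kv: -kv[1])


if __name__ == "__main__":
    g = AssociativeGraph()
    for word in ["monkeys", "like", "bananas", "monkeys", "like", "coconuts"]:
        g.stimulate(word)
    # "like" followed "monkeys" twice, so that connection is the strongest
    print(g.recall("monkeys"))
    print(g.recall("like"))
```

The sketch only captures the sparsity idea: nodes that never co-occur within the context window never get a connection, and frequently repeated associations dominate recall, mirroring the context-based recall improvement the abstract claims for the proposed fine-tuning.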