Tel. +49 (0)541 969-3372
Institute of Cognitive Science,
49090 Osnabrück, Germany
Semi-supervised Conceptors and Conceptor Logic
Conceptors were recently introduced in the reservoir computing framework as a mathematical formalism for accessing the internal representation of concepts in neural networks. Their applications include, but are not restricted to, controlling reservoir dynamics, denoising and classifying network responses, and creating smooth transitions between patterns. On top of this formalism, a quasi-Boolean logic is introduced that allows conceptors to be combined, so that conceptors representing more abstract concepts can be derived from other conceptors rather than from data observation.
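The core computation and the quasi-Boolean operations can be sketched in a few lines of numpy, following the standard definitions from Jaeger's conceptor framework (C = R(R + α⁻²I)⁻¹ with R the state correlation matrix; ¬C = I − C; AND via the inverse formula; OR via de Morgan). The toy state matrix and aperture value below are illustrative assumptions, not part of the original text:

```python
import numpy as np

def conceptor(X, aperture):
    """Conceptor C = R (R + aperture^-2 I)^-1 from a state matrix X (N x T),
    where R = X X^T / T is the correlation matrix of the network states."""
    N, T = X.shape
    R = X @ X.T / T
    return R @ np.linalg.inv(R + aperture**-2 * np.eye(N))

def NOT(C):
    # logical negation: singular values s are mapped to 1 - s
    return np.eye(len(C)) - C

def AND(C, B):
    # logical conjunction; assumes C and B are invertible (full-rank states)
    return np.linalg.inv(np.linalg.inv(C) + np.linalg.inv(B) - np.eye(len(C)))

def OR(C, B):
    # de Morgan's rule: C OR B = NOT(NOT C AND NOT B)
    return NOT(AND(NOT(C), NOT(B)))

rng = np.random.default_rng(0)
X = rng.standard_normal((5, 200))     # toy stand-in for recorded reservoir states
C = conceptor(X, aperture=10.0)
s = np.linalg.svd(C, compute_uv=False)  # all singular values lie in [0, 1)
```

The singular values of a conceptor always lie in the unit interval, which is what makes the soft Boolean interpretation (s ≈ 1: direction "used", s ≈ 0: direction "unused") possible.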
In general, conceptors are a promising way to connect the subsymbolic behavior of a dynamical system to a symbolic representation of the high-level concepts embodied in the underlying dynamics, both in a bottom-up direction, by calculating conceptors from neural responses, and in a top-down direction, by applying conceptors to manipulate the dynamical system.
During this PhD project a number of research questions will be tackled:
1. In the first phase of the project, the main goal is to deepen the understanding of the general behavior of conceptors and their range of possible applications. One important detail is the role of the aperture: originally introduced as an additional parameter controlling regularization, it plays a central role in the formalism, and identifying this role more precisely will most likely further this goal.
2. Of special interest is the possibility of transferring the conceptor formalism to more commonly used classical architectures such as feedforward neural networks. Such a transfer would allow theoretical results to be applied to a broader variety of solutions for real-world problems, and would thus on the one hand help develop the formalism further and on the other hand make its analysis mechanisms available to more applications.
3. Originally, conceptors have only been applied to Echo State Networks (ESNs), in which neither the input weights nor the internal connections of the recurrent layer are trained; both are initialized randomly, and only the weights connecting to the output neurons are learned. Conceptors can identify, from a dynamical perspective, how well suited the reservoir is for a specific task.
4. Instead of using randomly connected networks, frameworks like SORN implement local learning rules to train a recurrent network in an unsupervised manner, and show that this increases the efficiency and performance of the network on a given task. We aim to combine unsupervised learning of recurrent networks with conceptors, thus making the conceptors semi-supervised.
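The ESN setup referenced in points 3 and 4 can be sketched as follows: a fixed random reservoir is driven by a pattern, only a linear readout is trained (here by ridge regression), and a conceptor is computed from the collected states. This is a minimal illustrative sketch under assumed toy parameters (network size, spectral radius, ridge constant, sine input), not the SORN learning rule itself:

```python
import numpy as np

rng = np.random.default_rng(1)
N, T = 50, 500

# fixed random reservoir: input and recurrent weights are never trained
W_in = rng.standard_normal((N, 1)) * 0.5
W = rng.standard_normal((N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # scale spectral radius to 0.9

# drive the reservoir with a toy periodic pattern and record the states
u = np.sin(np.arange(T) * 0.2).reshape(-1, 1)
x = np.zeros(N)
states = np.zeros((T, N))
for t in range(T):
    x = np.tanh(W @ x + W_in @ u[t])
    states[t] = x

# only the output weights are learned, via ridge regression
ridge = 1e-4
W_out = np.linalg.solve(states.T @ states + ridge * np.eye(N), states.T @ u)

# conceptor characterizing the pattern-driven reservoir dynamics
aperture = 10.0
R = states.T @ states / T
C = R @ np.linalg.inv(R + aperture**-2 * np.eye(N))
```

In the semi-supervised setting envisioned in point 4, the random initialization of `W` would be replaced by weights shaped through unsupervised local plasticity before the conceptor is computed.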
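Regarding the role of the aperture raised in point 1: its effect is conveniently isolated by Jaeger's aperture adaptation formula φ(C, γ) = C(C + γ⁻²(I − C))⁻¹, which rescales a conceptor's aperture by γ and maps each singular value s to sγ²/(sγ² + 1 − s). A small sketch with an assumed diagonal toy conceptor:

```python
import numpy as np

def phi(C, gamma):
    """Aperture adaptation phi(C, gamma) = C (C + gamma^-2 (I - C))^-1.
    Each singular value s of C is mapped to s*gamma^2 / (s*gamma^2 + 1 - s)."""
    n = len(C)
    return C @ np.linalg.inv(C + gamma**-2 * (np.eye(n) - C))

# diagonal toy conceptor with singular values 0.1, 0.5, 0.9
C = np.diag([0.1, 0.5, 0.9])
C_hard = phi(C, 100.0)   # large gamma: singular values pushed towards 1
C_soft = phi(C, 0.01)    # small gamma: singular values pushed towards 0
```

The two limits show why the aperture acts as more than a regularization knob: γ → ∞ yields a hard projector onto the excited state directions, while γ → 0 collapses the conceptor towards the zero matrix, interpolating between a subspace description and no constraint at all.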