The generation of abstract mental representations enables considerably more skillful interaction with the environment. How can such representations arise from concrete, uninterpreted sensorimotor activations? How can a system interpret its sensorimotor data in terms of concepts that it has developed entirely on its own, without relying on the semantics in the mind of its developer?
This ability is a prerequisite for general learning in unknown environments. Previous approaches attempt to achieve it in three ways: by simulating a sufficiently complex biological brain (anatomically motivated), by simulating and combining functional modules of the human psyche (psychologically motivated), or by identifying a single basic algorithm that enables different types of learning (holistically motivated).
In this publication, Mark Wernsdorfer follows the third path, drawing inspiration from phenomenology, theories of embodied cognition, and semiotics. He shows that this approach surpasses previous methods of sequence prediction and also allows representations to be dynamically generated and modified at runtime. The author presents and evaluates the possibilities and limitations of the developed algorithm through a series of experiments.