Open Access

Self-Assimilation for Solving Excessive Information Acquisition in Potential Learning


The present paper proposes a new computational method for potential learning to improve generalization and interpretation. Potential learning was introduced to simplify the computational procedures of information maximization and to specify which neurons should fire. However, potential learning often absorbs too much information content on input patterns in the early stage of learning, which tends to degrade generalization performance. This problem can be solved by making potential learning as slow as possible. Accordingly, we propose a procedure called “self-assimilation,” in which connection weights are accentuated by their own characteristics observed at a specific learning step. This makes it possible to predict future connection weights in the early stage of learning. Thus, generalization can be improved by slow learning while, at the same time, the interpretation of connection weights is improved via their enhanced characteristics. The method was applied to an artificial data set, as well as a real data set of counter services at a local government office in the Tokyo metropolitan area. The results show that generalization improved when learning was made as slow as possible. In addition, self-assimilation reduced the number of strong connection weights, allowing for better interpretation.
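The abstract does not give the update equations, but the core idea, accentuating connection weights by characteristics they already exhibit at the current learning step, can be sketched. Below is a minimal, illustrative Python sketch assuming that each hidden neuron's “potentiality” is measured by the normalized variance of its incoming weights raised to a parameter r, and that self-assimilation repeatedly multiplies weights by that potentiality; the names `potentiality`, `self_assimilate`, and the parameter `r` are assumptions for illustration, not the paper's actual notation or method.

```python
import numpy as np

def potentiality(w, r=2.0):
    """Per-neuron 'potentiality': variance of each hidden neuron's
    incoming weights, normalized by the maximum and raised to r.
    This measure is an assumption for illustration; the paper's
    exact definition is not given in the abstract."""
    v = np.var(w, axis=0)            # one variance per hidden neuron
    return (v / v.max()) ** r        # values in [0, 1], sharpened by r

def self_assimilate(w, r=2.0, steps=5):
    """Hypothetical self-assimilation: repeatedly accentuate weights
    by the characteristics (potentiality) observed at the current
    step, keeping strong neurons and suppressing weak ones."""
    for _ in range(steps):
        w = w * potentiality(w, r)   # broadcasts over input dimension
    return w

# Example: 4 inputs -> 3 hidden neurons, random initial weights
rng = np.random.default_rng(0)
w = rng.normal(size=(4, 3))
w_assim = self_assimilate(w, r=2.0, steps=3)
print("strong weights before:", int((np.abs(w) > 0.5).sum()))
print("strong weights after: ", int((np.abs(w_assim) > 0.5).sum()))
```

Under these assumptions, repeated assimilation drives the weights of low-potentiality neurons toward zero while leaving the highest-potentiality neuron's weights nearly unchanged, which is consistent with the abstract's observation that the number of strong connection weights becomes smaller, aiding interpretation.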

eISSN: 2083-2567
Language: English
Publication timeframe: 4 times per year
Journal Subjects: Computer Sciences, Databases and Data Mining, Artificial Intelligence