
Journal of Artificial Intelligence and Soft Computing Research

The Journal of Polish Neural Network Society, the University of Social Sciences in Lodz & Czestochowa University of Technology

4 issues per year

Open Access | Online ISSN: 2083-2567

Self-Assimilation for Solving Excessive Information Acquisition in Potential Learning

Ryotaro Kamimura / Tsubasa Kitago
  • Department of Politics and Economics, Tokai University, 4-1-1 Kitakaname, Hiratsuka, Kanagawa 259-1292, Japan
Published Online: 2017-11-01 | DOI: https://doi.org/10.1515/jaiscr-2018-0001

Abstract

The present paper proposes a new computational method for potential learning that improves generalization and interpretation. Potential learning was introduced to simplify the computational procedures of information maximization and to specify which neurons should fire. However, potential learning often absorbs too much information content about input patterns in the early stages of learning, which tends to degrade generalization performance. This problem can be mitigated by making potential learning as slow as possible. Accordingly, we propose a procedure called "self-assimilation," in which connection weights are accentuated by their own characteristics observed at a specific learning step. This makes it possible to predict future connection weights in the early stages of learning. Thus, generalization can be improved by slow learning while, at the same time, the interpretation of connection weights is improved via their enhanced characteristics. The method was applied to an artificial data set, as well as a real data set of counter services at a local government office in the Tokyo metropolitan area. The results show that generalization improved as learning was made slower. In addition, self-assimilation reduced the number of strong connection weights, permitting easier interpretation.
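The abstract describes self-assimilation only at a high level: weights are accentuated by characteristics observed at a given learning step so that strong connections stand out and future weights can be anticipated. As a rough illustration only, the sketch below assumes one plausible form of such accentuation, scaling each weight by its magnitude relative to the largest magnitude raised to an exponent r; the function name, the magnitude-ratio form, and the parameter r are all assumptions for illustration, not taken from the paper itself.

```python
import numpy as np

def self_assimilate(weights, r=2.0):
    """Accentuate weights by their own relative magnitudes (hypothetical sketch).

    Each weight is multiplied by its magnitude relative to the largest
    magnitude, raised to an assumed assimilation exponent r.  Larger r
    sharpens the contrast between strong and weak weights, leaving fewer
    strong connections to interpret.
    """
    mags = np.abs(weights)
    scale = (mags / mags.max()) ** r
    return weights * scale

# Toy usage: weak weights shrink toward zero, the strongest survive.
rng = np.random.default_rng(0)
W = rng.normal(size=(4, 3))
print(np.round(self_assimilate(W, r=2.0), 3))
```

Under this assumed form, repeated application drives weak weights toward zero while preserving the sign and relative ordering of the strong ones, which is at least consistent with the abstract's claim that fewer strong connection weights remain for interpretation.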

Keywords: neural networks; learning; excessive information acquisition; self-assimilation method


About the article

Received: 2017-03-31

Accepted: 2017-04-19

Published Online: 2017-11-01

Published in Print: 2018-01-01


Citation Information: Journal of Artificial Intelligence and Soft Computing Research, Volume 8, Issue 1, Pages 5–29, ISSN (Online) 2083-2567, DOI: https://doi.org/10.1515/jaiscr-2018-0001.


© 2018 Ryotaro Kamimura et al., published by De Gruyter Open. This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 License (CC BY-NC-ND 4.0).
