
Journal of Artificial Intelligence and Soft Computing Research

The Journal of Polish Neural Network Society, the University of Social Sciences in Lodz & Czestochowa University of Technology

4 Issues per year

Open Access
Online ISSN: 2083-2567

Accumulative Information Enhancement in the Self-Organizing Maps and Its Application to the Analysis of Mission Statements

Ryozo Kitajima
  • Graduate School of Science and Technology, Tokai University, 1117 Kitakaname, Hiratsuka, Kanagawa 259-1292, Japan
Ryotaro Kamimura
  • IT Education Center and Graduate School of Science and Technology, Tokai University, 1117 Kitakaname, Hiratsuka, Kanagawa 259-1292, Japan
Published Online: 2015-09-23 | DOI: https://doi.org/10.1515/jaiscr-2015-0026

Abstract

This paper proposes a new information-theoretic method, based on the information enhancement method, for extracting important input variables. The information enhancement method was developed to detect important components in neural systems. Previous methods focused on detecting only the most important components, and therefore failed to fully incorporate the information contained in the components into the learning processes. In addition, it has been observed that the information enhancement method cannot always extract input information from input patterns. Thus, in this paper a computational method is developed to accumulate information content in the process of information enhancement. The method was applied to an artificial data set and to the analysis of mission statements. The results demonstrate that the symmetric properties of the artificial data set could be extracted explicitly, whereas only one main factor could be extracted from the mission statements, namely, “contribution to society”. Companies with higher profits tended to have mission statements concerning society. These results can be considered a first step toward fully clarifying the importance of mission statements in actual business activities.

About the article

Published Online: 2015-09-23

Published in Print: 2015-07-01

Citation Information: Journal of Artificial Intelligence and Soft Computing Research, ISSN (Online) 2083-2567, DOI: https://doi.org/10.1515/jaiscr-2015-0026.

© Academy of Management (SWSPiZ), Lodz. This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 3.0 License (CC BY-NC-ND 3.0).
