Article title

Accumulative information enhancement in the self-organizing maps and its application to the analysis of mission statements

Content
Identifiers
Title variants
Publication languages
EN
Abstracts
EN
This paper proposes a new information-theoretic method, based on the information enhancement method, to extract important input variables. The information enhancement method was developed to detect important components in neural systems. Previous methods have focused on detecting only the most important components, and have therefore failed to fully incorporate the information contained in the components into learning processes. In addition, it has been observed that the information enhancement method cannot always extract input information from input patterns. Thus, in this paper a computational method is developed to accumulate information content in the process of information enhancement. The method was applied to an artificial data set and to the analysis of mission statements. The results demonstrate that while the symmetric properties of the artificial data set could be extracted explicitly, only one main factor could be extracted from the mission statements, namely, “contribution to society”. Companies with higher profits tended to have mission statements concerned with society. These results can be considered a first step toward fully clarifying the importance of mission statements in actual business activities.
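The abstract above only outlines the computation. As a rough, non-authoritative illustration of the quantities involved, the sketch below (in NumPy) computes the mutual information between competitive units and input patterns when one input variable at a time is enhanced, and then accumulates the per-variable information instead of keeping only the maximum. The Gaussian unit activations, the enhancement factor of 10, the uniform pattern prior, and the accumulated_enhancement helper are illustrative assumptions loosely based on the information enhancement papers cited in the bibliography, not the authors' exact formulation.

    import numpy as np

    def firing_probs(X, W, enhanced_var, sigma=1.0, factor=10.0):
        # p(j|s): probability that unit j fires for pattern s when one
        # input variable is "enhanced", i.e. weighted more heavily in
        # the distance. X: (patterns, variables); W: (units, variables).
        scale = np.ones(X.shape[1])
        scale[enhanced_var] = factor  # assumed enhancement factor
        d = (((X[:, None, :] - W[None, :, :]) ** 2) * scale).sum(axis=2)
        # Subtract the row-wise minimum distance for numerical stability.
        a = np.exp(-(d - d.min(axis=1, keepdims=True)) / (2.0 * sigma ** 2))
        return a / a.sum(axis=1, keepdims=True)

    def mutual_information(P, eps=1e-12):
        # I(units; patterns) from the conditional probabilities P[s, j],
        # assuming a uniform prior p(s) over the input patterns.
        p_j = P.mean(axis=0)
        return float((P * np.log((P + eps) / (p_j + eps))).mean(axis=0).sum())

    def accumulated_enhancement(X, W, sigma=1.0):
        # Accumulate the enhancement information of all input variables,
        # largest first, rather than keeping only the single maximum.
        info = np.array([mutual_information(firing_probs(X, W, k, sigma))
                         for k in range(X.shape[1])])
        return np.cumsum(np.sort(info)[::-1])

    # Toy usage: 50 random patterns with 6 variables, a 3x3 map (9 units).
    rng = np.random.default_rng(0)
    X = rng.normal(size=(50, 6))
    W = rng.normal(size=(9, 6))
    print(accumulated_enhancement(X, W))

Sorting before accumulating makes the resulting curve easy to read: each step shows how much information the next most important variable adds, which is the intuition behind accumulating enhancement information rather than detecting only the most important component.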
Year
Pages
161–176
Physical description
Bibliography: 46 items, figures.
Authors
author
  • Graduate School of Science and Technology, Tokai University, 1117 Kitakaname Hiratsuka Kanagawa 259-1292, Japan
author
  • IT Education Center and Graduate School of Science and Technology, Tokai University, 1117 Kitakaname Hiratsuka Kanagawa 259-1292, Japan
Bibliography
  • [1] R. Linsker, “Self-organization in a perceptual network,” Computer, vol. 21, pp. 105–117, 1988.
  • [2] R. Linsker, “How to generate ordered maps by maximizing the mutual information between input and output,” Neural Computation, vol. 1, pp. 402–411, 1989.
  • [3] R. Linsker, “Local synaptic rules suffice to maximize mutual information in a linear network,” Neural Computation, vol. 4, pp. 691–702, 1992.
  • [4] R. Linsker, “Improved local learning rule for information maximization and related applications,” Neural Networks, vol. 18, pp. 261–265, 2005.
  • [5] Z. Nenadic, “Information discriminant analysis: Feature extraction with an information-theoretic objective,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 29, no. 8, pp. 1394–1407, 2007.
  • [6] K. Torkkola, “Feature extraction by nonparametric mutual information maximization,” Journal of Machine Learning Research, vol. 3, pp. 1415–1438, 2003.
  • [7] J. M. Leiva-Murillo and A. Artes-Rodriguez, “Maximization of mutual information for supervised linear feature extraction,” IEEE Transactions on Neural Networks, vol. 18, no. 5, pp. 1433–1441, 2007.
  • [8] D. E. Rumelhart and D. Zipser, “Feature discovery by competitive learning,” in Parallel Distributed Processing (D. E. Rumelhart, G. E. Hinton, et al., eds.), vol. 1, pp. 151–193, Cambridge: MIT Press, 1986.
  • [9] T. Kohonen, Self-Organization and Associative Memory. New York: Springer-Verlag, 1988.
  • [10] T. Kohonen, Self-Organizing Maps. Springer-Verlag, 1995.
  • [11] R. Kamimura and T. Kamimura, “Structural information and linguistic rule extraction,” in Proceedings of ICONIP-2000, pp. 720–726, 2000.
  • [12] R. Kamimura, T. Kamimura, and O. Uchida, “Flexible feature discovery and structural information control,” Connection Science, vol. 13, no. 4, pp. 323–347, 2001.
  • [13] R. Kamimura, “Information-theoretic competitive learning with inverse Euclidean distance output units,” Neural Processing Letters, vol. 18, pp. 163–184, 2003.
  • [14] R. Kamimura, “Teacher-directed learning: information-theoretic competitive learning in supervised multi-layered networks,” Connection Science, vol. 15, pp. 117–140, 2003.
  • [15] R. Kamimura, “Progressive feature extraction by greedy network-growing algorithm,” Complex Systems, vol. 14, no. 2, pp. 127–153, 2003.
  • [16] R. Kamimura, “Information theoretic competitive learning in self-adaptive multi-layered networks,” Connection Science, vol. 13, no. 4, pp. 323–347, 2003.
  • [17] R. Kamimura, “Feature discovery by enhancement and relaxation of competitive units,” in Intelligent data engineering and automated learning- IDEAL2008(LNCS), vol. LNCS5326, pp. 148–155, Springer, 2008.
  • [18] R. Kamimura, “Information-theoretic enhancement learning and its application to visualization of self-organizing maps,” Neurocomputing, vol. 73, no. 13-15, pp. 2642–2664, 2010.
  • [20] R. Kamimura, “Double enhancement learning for explicit internal representations: unifying self-enhancement and information enhancement to incorporate information on input variables,” Applied Intelligence, pp. 1–23, 2011.
  • [21] R. Kamimura, “Selective information enhancement learning for creating interpretable representations in competitive learning,” Neural Networks, vol. 24, no. 4, pp. 387–405, 2011.
  • [22] B. Bartkus, M. Glassman, and B. McAFEE, “Mission statement quality and financial performance,” European Management Journal, vol. 24, no. 1, pp. 86–94, 2006.
  • [23] B. R. Bartkus, M. Glassman, and R. B. McAfee, “A comparison of the quality of European, Japanese and US mission statements: A content analysis,” European Management Journal, vol. 22, no. 4, pp. 393–401, 2004.
  • [24] E. Oda and H. Mitsuhashi, “Experimental study of management principle and company performance by text mining (in Japanese),” Management Philosophy, vol. 7, no. 2, pp. 22–37, 2010.
  • [25] K. Ryozo and K. Ryotaro, “Company policy analysis,” in Proceedings of the 40th Fuzzy Workshop, pp. 13–14, 2014.
  • [26] R. Kamimura, T. Kamimura, and T. R. Shultz, “Information theoretic competitive learning and linguistic rule acquisition,” Transactions of the Japanese Society for Artificial Intelligence, vol. 16, no. 2, pp. 287–298, 2001.
  • [27] D. E. Rumelhart and D. Zipser, “Feature discovery by competitive learning,” Cognitive Science, vol. 9, pp. 75–112, 1985.
  • [28] M. Van Hulle, “Topographic map formation by maximizing unconditional entropy: a plausible strategy for ’on-line’ unsupervised competitive learning and nonparametric density estimation,” IEEE Transactions on Neural Networks, vol. 7, no. 5, pp. 1299–1305, 1996.
  • [29] M. M. Van Hulle, “The formation of topographic maps that maximize the average mutual information of the output responses to noiseless input signals,” Neural Computation, vol. 9, no. 3, pp. 595–606, 1997.
  • [30] M. M. Van Hulle, “Topology-preserving map formation achieved with a purely local unsupervised competitive learning rule,” Neural Networks, vol. 10, no. 3, pp. 431–446, 1997.
  • [31] M. M. Van Hulle, “Faithful representations with topographic maps,” Neural Networks, vol. 12, no. 6, pp. 803–823, 1999.
  • [32] M. M. Van Hulle, “Entropy-based kernel modeling for topographic map formation,” IEEE Transactions on Neural Networks, vol. 15, no. 4, pp. 850–858, 2004.
  • [33] M. M. Van Hulle, “The formation of topographic maps that maximize the average mutual information of the output responses to noiseless input signals,” Neural Computation, vol. 9, no. 3, pp. 595–606, 1997.
  • [34] S. C. Ahalt, A. K. Krishnamurthy, P. Chen, and D. E. Melton, “Competitive learning algorithms for vector quantization,” Neural Networks, vol. 3, pp. 277–290, 1990.
  • [35] L. Xu, “Rival penalized competitive learning for clustering analysis, RBF net, and curve detection,” IEEE Transactions on Neural Networks, vol. 4, no. 4, pp. 636–649, 1993.
  • [36] A. Luk and S. Lien, “Properties of the generalized lotto-type competitive learning,” in Proceedings of the International Conference on Neural Information Processing, (San Mateo, CA), pp. 1180–1185, Morgan Kaufmann Publishers, 2000.
  • [37] Y. J. Zhang and Z. Q. Liu, “Self-splitting competitive learning: a new on-line clustering paradigm,” IEEE Transactions on Neural Networks, vol. 13, no. 2, pp. 369–380, 2002.
  • [38] H. Xiong, M. N. S. Swamy, and M. O. Ahmad, “Competitive splitting for codebook initialization,” IEEE Signal Processing Letters, vol. 11, pp. 474–477, 2004.
  • [39] J. C. Yen, J. I. Guo, and H. C. Chen, “A new k-winners-take-all neural network and its array architecture,” IEEE Transactions on Neural Networks, vol. 9, no. 5, pp. 901–912, 1998.
  • [40] S. Ridella, S. Rovetta, and R. Zunino, “K-winner machines for pattern classification,” IEEE Transactions on Neural Networks, vol. 12, no. 2, pp. 371–385, 2001.
  • [41] S. Kurohashi and D. Kawata, “JUMAN,” http://nlp.ist.i.kyoto-u.ac.jp/index.php?juman.
  • [42] E. Merényi, K. Tasdemir, and L. Zhang, “Learning highly structured manifolds: harnessing the power of SOMs,” in Similarity-Based Clustering, pp. 138–168, Springer, 2009.
  • [43] K. Tasdemir and E. Merényi, “Exploiting data topology in visualization and clustering of self-organizing maps,” IEEE Transactions on Neural Networks, vol. 20, no. 4, pp. 549–562, 2009.
  • [44] I. Guyon and A. Elisseeff, “An introduction to variable and feature selection,” Journal of Machine Learning Research, vol. 3, pp. 1157–1182, 2003.
  • [45] A. Rakotomamonjy, “Variable selection using SVM-based criteria,” Journal of Machine Learning Research, vol. 3, pp. 1357–1370, 2003.
  • [46] S. Perkins, K. Lacker, and J. Theiler, “Grafting: Fast, incremental feature selection by gradient descent in function space,” Journal of Machine Learning Research, vol. 3, pp. 1333–1356, 2003.
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-8b45e380-2a0b-47e1-bced-0c045ec21cec