Ryotaro Kamimura
2020 – today
- 2024
  - [c121] Ryotaro Kamimura: Forced and Natural Creative-Prototype Learning for Interpreting Multi-Layered Neural Networks. IIAI-AAI 2024: 343-350
- 2023
  - [j58] Ryotaro Kamimura: Contradiction neutralization for interpreting multi-layered neural networks. Appl. Intell. 53(23): 28349-28376 (2023)
  - [j57] Ryotaro Kamimura: Impartial competitive learning in multi-layered neural networks. Connect. Sci. 35(1) (2023)
  - [j56] Ryotaro Kamimura: Destructive computing with winner-lose-all competition in multi-layered neural networks. Int. J. Hybrid Intell. Syst. 19(3): 145-166 (2023)
  - [c120] Ryotaro Kamimura: Repeated Potentiality Augmentation for Multi-layered Neural Networks. FICC (2) 2023: 117-134
- 2022
  - [j55] Ryotaro Kamimura: Multi-level selective potentiality maximization for interpreting multi-layered neural networks. Appl. Intell. 52(12): 13961-13986 (2022)
  - [j54] Ryotaro Kamimura: Cost-forced and repeated selective information minimization and maximization for multi-layered neural networks. Int. J. Hybrid Intell. Syst. 18(1-2): 69-95 (2022)
  - [j53] Ryotaro Kamimura: Cost-forced collective potentiality maximization by complementary potentiality minimization for interpreting multi-layered neural networks. Neurocomputing 480: 234-256 (2022)
  - [c119] Ryotaro Kamimura, Ryozo Kitajima: Min-Max Cost and Information Control in Multi-layered Neural Networks. FTC (1) 2022: 1-17
  - [c118] Ryotaro Kamimura, Ryozo Kitajima: Serially Disentangled Learning for Multi-Layered Neural Networks. IEA/AIE 2022: 669-681
- 2021
  - [j52] Ryotaro Kamimura: Partially black-boxed collective interpretation and its application to SOM-based convolutional neural networks. Neurocomputing 450: 336-353 (2021)
  - [c117] Ryotaro Kamimura, Ryozo Kitajima: Forced Selective Information Reduction for Interpreting Multi-Layered Neural Networks. IIAI-AAI-Winter 2021: 24-40
  - [c116] Ryotaro Kamimura: Selective Information Control and Network Compression in Multi-layered Neural Networks. IntelliSys (1) 2021: 184-204
  - [c115] Ryotaro Kamimura: Selective Information Control and Layer-Wise Partial Collective Compression for Multi-Layered Neural Networks. ISDA 2021: 121-131
- 2020
  - [j51] Ryotaro Kamimura: Minimum interpretation by autoencoder-based serial and enhanced mutual information production. Appl. Intell. 50(8): 2423-2448 (2020)
  - [j50] Ryotaro Kamimura, Haruhiko Takeuchi: Improving collective interpretation by extended potentiality assimilation for multi-layered neural networks. Connect. Sci. 32(2): 174-203 (2020)
  - [j49] Ryotaro Kamimura: Cost-conscious mutual information maximization for improving collective interpretation of multi-layered neural networks. Neurocomputing 409: 259-274 (2020)
  - [c114] Ryotaro Kamimura: Cost-Conscious Internal Information Maximization for Disentangling and Interpreting Multi-layered Neural Networks. HIS 2020: 606-616
  - [c113] Ryotaro Kamimura: Compressing and Interpreting SOM-Based Convolutional Neural Networks. IEA/AIE 2020: 732-744
  - [c112] Ryotaro Kamimura: Disentangled Representations by Pseudo-Maximum Mutual Information for Interpreting Multi-Layered Neural Networks. IIAI-AAI 2020: 436-443
  - [c111] Ryotaro Kamimura: Bidirectional Estimation of Partially Black-Boxed Layers of SOM-Based Convolutional Neural Networks. IntelliSys (1) 2020: 407-427
  - [c110] Ryotaro Kamimura: Connective Potential Information for Collectively Interpreting Multi-Layered Neural Networks. SSCI 2020: 3033-3042
2010 – 2019
- 2019
  - [j48] Ryotaro Kamimura, Haruhiko Takeuchi: Sparse semi-autoencoders to solve the vanishing information problem in multi-layered neural networks. Appl. Intell. 49(7): 2522-2545 (2019)
  - [j47] Ryotaro Kamimura: SOM-based information maximization to improve and interpret multi-layered neural networks: From information reduction to information augmentation approach to create new information. Expert Syst. Appl. 125: 397-411 (2019)
  - [j46] Ryotaro Kamimura: Neural self-compressor: Collective interpretation by compressing multi-layered neural networks into non-layered networks. Neurocomputing 323: 12-36 (2019)
  - [j45] Ryotaro Kamimura: Supposed Maximum Mutual Information for Improving Generalization and Interpretation of Multi-Layered Neural Networks. J. Artif. Intell. Soft Comput. Res. 9(2): 123-147 (2019)
  - [c109] Ryotaro Kamimura: Information Compression for Intelligible Multi-Layered Neural Networks. IIAI-AAI 2019: 635-642
  - [c108] Ryozo Kitajima, Ryotaro Kamimura, Hiroyuki Sakai: Relationship Between Management Policies and Profitability for Second Section-Listed Manufacturing Companies of the Tokyo Stock Exchange (2016 Results). IIAI-AAI 2019: 697-700
  - [c107] Ryotaro Kamimura: Mutual Information Generation for Improving Generalization and Interpretation in Neural Networks. IJCNN 2019: 1-8
  - [c106] Ryotaro Kamimura: Interpreting Collectively Compressed Multi-Layered Neural Networks. CIS/RAM 2019: 95-100
- 2018
  - [j44] Ryotaro Kamimura, Tsubasa Kitago: Self-Assimilation for Solving Excessive Information Acquisition in Potential Learning. J. Artif. Intell. Soft Comput. Res. 8(1): 5-29 (2018)
  - [c105] Ryotaro Kamimura, Haruhiko Takeuchi: Autoencoder-Based Excessive Information Generation for Improving and Interpreting Multi-layered Neural Networks. IIAI-AAI 2018: 518-523
  - [c104] Ryotaro Kamimura, Haruhiko Takeuchi: Excessive, Selective and Collective Information Processing to Improve and Interpret Multi-layered Neural Networks. IntelliSys (1) 2018: 664-675
  - [c103] Ryotaro Kamimura: Autoencoders and Information Augmentation for Improved Generalization and Interpretation in Multi-layered Neural Networks. ISCBI 2018: 52-58
  - [c102] Ryotaro Kamimura, Ryozo Kitajima, Hiroyuki Sakai: Local Selective Learning for Interpreting Multi-Layered Neural Networks. SCIS&ISIS 2018: 115-122
  - [c101] Ryotaro Kamimura, Haruhiko Takeuchi: Correlation-Constrained Mutual Information Maximization for Interpretable Multi-Layered Neural Networks. SCIS&ISIS 2018: 123-130
  - [c100] Ryotaro Kamimura: Information-Theoretic Self-compression of Multi-layered Neural Networks. TPNC 2018: 401-413
- 2017
  - [j43] Ryotaro Kamimura: Collective mutual information maximization to unify passive and positive approaches for improving interpretation and generalization. Neural Networks 90: 56-71 (2017)
  - [c99] Ryotaro Kamimura: Direct Potentiality Assimilation for Improving Multi-Layered Neural Networks. FedCSIS (Position Papers) 2017: 19-23
  - [c98] Ryotaro Kamimura, Haruhiko Takeuchi: Supervised semi-autoencoder learning for multi-layered neural networks. IFSA-SCIS 2017: 1-8
  - [c97] Ryotaro Kamimura: Selective and cooperative potentiality maximization for improving interpretation and generalization. IJCNN 2017: 147-153
  - [c96] Ryotaro Kamimura: Potential layer-wise supervised learning for training multi-layered neural networks. IJCNN 2017: 2568-2575
  - [c95] Ryotaro Kamimura: Mutual information maximization for improving and interpreting multi-layered neural networks. SSCI 2017: 1-7
- 2016
  - [j42] Ryozo Kitajima, Ryotaro Kamimura, Osamu Uchida, Fujio Toriumi: Identifying Important Tweets by Considering the Potentiality of Neurons. IEICE Trans. Fundam. Electron. Commun. Comput. Sci. 99-A(8): 1555-1559 (2016)
  - [c94] Ryotaro Kamimura: Simple and Stable Internal Representation by Potential Mutual Information Maximization. EANN 2016: 309-316
  - [c93] Ryotaro Kamimura: Solving the Vanishing Information Problem with Repeated Potential Mutual Information Maximization. ICONIP (4) 2016: 442-451
  - [c92] Ryotaro Kamimura: Collective Interpretation and Potential Joint Information Maximization. Intelligent Information Processing 2016: 12-21
  - [c91] Ryotaro Kamimura: Repeated potentiality assimilation: Simplifying learning procedures by positive, independent and indirect operation for improving generalization and interpretation. IJCNN 2016: 803-810
- 2015
  - [j41] Ryotaro Kamimura: Improving visualisation and prediction performance of supervised self-organising map by modified contradiction resolution. Connect. Sci. 27(1): 40-67 (2015)
  - [j40] Ryozo Kitajima, Ryotaro Kamimura: Accumulative Information Enhancement In The Self-Organizing Maps And Its Application To The Analysis Of Mission Statements. J. Artif. Intell. Soft Comput. Res. 5(3): 161 (2015)
  - [c90] Ryotaro Kamimura: Pseudo-potentiality maximization for improved interpretation and generalization in neural networks. IWCIA 2015: 21-28
  - [c89] Ryotaro Kamimura: Simplified and gradual information control for improving generalization performance of multi-layered neural networks. IJCNN 2015: 1-7
  - [c88] Ryotaro Kamimura, Ryozo Kitajima: Selective potentiality maximization for input neuron selection in self-organizing maps. IJCNN 2015: 1-8
  - [c87] Ryotaro Kamimura: Self-Organized Mutual Information Maximization Learning for Improved Generalization Performance. SMC 2015: 1613-1618
  - [c86] Ryotaro Kamimura: Self-Organizing Selective Potentiality Learning to Detect Important Input Neurons. SMC 2015: 1619-1626
  - [c85] Ryozo Kitajima, Ryotaro Kamimura, Osamu Uchida, Fujio Toriumi: Neural potential learning for tweets classification and interpretation. SoCPaR 2015: 141-148
- 2014
  - [j39] Ryotaro Kamimura: Input information maximization for improving self-organizing maps. Appl. Intell. 41(2): 421-438 (2014)
  - [c84] Ryotaro Kamimura: Simplified Information Acquisition Method to Improve Prediction Performance: Direct Use of Hidden Neuron Outputs and Separation of Information Acquisition and Use Phase. ANNIIP 2014: 78-87
  - [c83] Ryotaro Kamimura: Explicit knowledge extraction in information-theoretic supervised multi-layered SOM. FOCI 2014: 78-83
  - [c82] Ryotaro Kamimura: Information acquisition performance by supervised information-theoretic self-organizing maps. NaBIC 2014: 151-157
  - [c81] Ryotaro Kamimura: Information-theoretic multi-layered supervised self-organizing maps for improved prediction performance and explicit internal representation. SMC 2014: 953-958
  - [c80] Ryotaro Kamimura, Taeko Kamimura: Embedded information enhancement for neuron selection in self-organizing maps. SMC 2014: 959-965
- 2013
  - [j38] Ryotaro Kamimura: Similarity interaction in information-theoretic self-organizing maps. Int. J. Gen. Syst. 42(3): 239-267 (2013)
  - [j37] Ryotaro Kamimura: Repeated comprehensibility maximization in competitive learning. Neural Comput. Appl. 22(5): 911-932 (2013)
  - [j36] Ryotaro Kamimura: Controlling Relations between the Individuality and Collectivity of Neurons and its Application to Self-Organizing Maps. Neural Process. Lett. 38(2): 177-203 (2013)
  - [c79] Ryotaro Kamimura: Dependent input neuron selection in contradiction resolution. IWCIA 2013: 45-50
  - [c78] Ryotaro Kamimura: Contradiction resolution with explicit and limited evaluation and its application to SOM. IJCNN 2013: 1-8
  - [c77] Ryotaro Kamimura: Contradiction Resolution with Dependent Input Neuron Selection for Self-Organizing Maps. SMC 2013: 1353-1360
- 2012
  - [j35] Ryotaro Kamimura: Double enhancement learning for explicit internal representations: unifying self-enhancement and information enhancement to incorporate information on input variables. Appl. Intell. 36(4): 834-856 (2012)
  - [j34] Ryotaro Kamimura: Comprehensibility maximization and humanly comprehensible representations. Int. J. Gen. Syst. 41(3): 265-287 (2012)
  - [j33] Ryotaro Kamimura: Relative information maximization and its application to the extraction of explicit class structure in SOM. Neurocomputing 82: 37-51 (2012)
  - [c76] Ryotaro Kamimura: Separation and Unification of Individuality and Collectivity and Its Application to Explicit Class Structure in Self-Organizing Maps. ICANN (2) 2012: 387-394
  - [c75] Ryotaro Kamimura: Contradiction Resolution for Foreign Exchange Rates Estimation. IJCCI 2012: 529-535
  - [c74] Ryotaro Kamimura: Interaction of individually and collectively treated neurons for explicit class structure in self-organizing maps. IJCNN 2012: 1-8
  - [c73] Ryotaro Kamimura: Contradiction resolution and its application to self-organizing maps. SMC 2012: 1966-1971
- 2011
  - [j32] Ryotaro Kamimura: Structural enhanced information and its application to improved visualization of self-organizing maps. Appl. Intell. 34(1): 102-115 (2011)
  - [j31] Ryotaro Kamimura: Self-enhancement learning: target-creating learning and its application to self-organizing maps. Biol. Cybern. 104(4-5): 305-338 (2011)
  - [j30] Ryotaro Kamimura: Constrained information maximization by free energy minimization. Int. J. Gen. Syst. 40(7): 701-725 (2011)
  - [j29] Ryotaro Kamimura: Supposed maximum information for comprehensible representations in SOM. Neurocomputing 74(7): 1116-1134 (2011)
  - [j28] Ryotaro Kamimura: Selective information enhancement learning for creating interpretable representations in competitive learning. Neural Networks 24(4): 387-405 (2011)
  - [c72] Ryotaro Kamimura: Relative Information Maximization and Its Application to the Detection of Explicit Class Structure in SOM. FOCI 2011: 74-79
  - [c71] Ryotaro Kamimura: Explicit Class Structure by Weighted Cooperative Learning. ICANN (1) 2011: 109-116
  - [c70] Ryotaro Kamimura: Individually and Collectively Treated Neurons and its Application to SOM. IJCCI (NCTA) 2011: 24-30
  - [c69] Ryotaro Kamimura: Cooperation control and enhanced class structure in self-organizing maps. IJCNN 2011: 689-695
  - [c68] Ryotaro Kamimura: Explicit class structure with closeness and similarity between neurons. NaBIC 2011: 92-98
- 2010
  - [j27] Ryotaro Kamimura: Information enhancement for interpreting competitive learning. Int. J. Gen. Syst. 39(7): 705-728 (2010)
  - [j26] Ryotaro Kamimura: Information-theoretic enhancement learning and its application to visualization of self-organizing maps. Neurocomputing 73(13-15): 2642-2664 (2010)
  - [c67] Ryotaro Kamimura: Generation of Comprehensible Representations by Supposed Maximum Information. ICANN (2) 2010: 333-342
  - [c66] Ryotaro Kamimura: Pseudo-network Growing for Gradual Interpretation of Input Patterns. ICONIP (2) 2010: 375-382
  - [c65] Ryotaro Kamimura: Information-Theoretic Competitive and Cooperative Learning for Self-Organizing Maps. ICONIP (2) 2010: 423-430
  - [c64] Ryotaro Kamimura: Explicit class structure produced by information-theoretic competitive and cooperative learning. NaBIC 2010: 628-633
2000 – 2009
- 2009
  - [j25] Ryotaro Kamimura: An information-theoretic approach to feature extraction in competitive learning. Neurocomputing 72(10-12): 2693-2704 (2009)
  - [j24] Ryotaro Kamimura: Relative Relaxation and Weighted Information Loss to Simplify and Stabilize Feature Detection. J. Adv. Comput. Intell. Intell. Informatics 13(4): 489-498 (2009)
  - [j23] Ryotaro Kamimura: Feature Discovery by Information Loss. J. Comput. 4(10): 943-953 (2009)
  - [j22] Ryotaro Kamimura: Enhancing and Relaxing Competitive Units for Feature Discovery. Neural Process. Lett. 30(1): 37-57 (2009)
  - [c63] Ryotaro Kamimura: Information Enhancement Learning: Local Enhanced Information to Detect the Importance of Input Variables in Competitive Learning. EANN 2009: 86-97
  - [c62] Ryotaro Kamimura: Selective enhancement learning in competitive learning. IJCNN 2009: 1497-1502
  - [c61] Ryotaro Kamimura: Self-enhancement learning: Self-supervised and target-creating learning. IJCNN 2009: 1503-1509
  - [c60] Ryotaro Kamimura: Self-supervised learning by information enhancement: Target-Generating and Spontaneous learning for Competitive Learning. SMC 2009: 113-119
  - [c59] Ryotaro Kamimura: Structural Enhanced Information to Detect Features in Competitive Learning. SMC 2009: 3395-3401
- 2008
  - [c58] Ryotaro Kamimura: Partially Enhanced Competitive Learning. ICONIP (2) 2008: 163-170
  - [c57] Ryotaro Kamimura: Feature Detection by Structural Enhanced Information. ICONIP (2) 2008: 171-178
  - [c56] Ryotaro Kamimura: Enhanced Visualization by Combining SOM and Mixture Models. ICONIP (2) 2008: 268-275
  - [c55] Ryotaro Kamimura: Collective Activations to Generate Self-Organizing Maps. ICONIP (2) 2008: 943-950
  - [c54] Ryotaro Kamimura: Feature Discovery by Enhancement and Relaxation of Competitive Units. IDEAL 2008: 148-155
  - [c53] Ryotaro Kamimura: Conditional information and information loss for flexible feature extraction. IJCNN 2008: 2074-2083
  - [c52] Ryotaro Kamimura: Interpreting and improving multi-layered networks by free energy-based competitive learning. SMC 2008: 1812-1818
  - [c51] Ryotaro Kamimura: Mutual information maximization by free energy-based competitive learning for self-organizing maps. SMC 2008: 1819-1825
- 2007
  - [j21] Ryotaro Kamimura, Osamu Uchida, Seiki Hashimoto: Greedy network-growing algorithm with Minkowski distances. Int. J. Gen. Syst. 36(2): 157-177 (2007)
  - [c50] Ryotaro Kamimura, Fumihiko Yoshida, Ryozo Kitajima: Interpreting cabinet approval ratings by neural networks. Artificial Intelligence and Applications 2007: 72-77
  - [c49] Ryotaro Kamimura: Forced information and information loss in information-theoretic competitive learning. Artificial Intelligence and Applications 2007: 78-84
  - [c48] Ryotaro Kamimura, Fumihiko Yoshida, Yamashita Toshie, Ryozo Kitajima: Information-Theoretic Variable Selection in Neural Networks. FOCI 2007: 222-227
  - [c47] Ryotaro Kamimura: Combining Hard and Soft Competition in Information-Theoretic Learning. FOCI 2007: 578-582
  - [c46] Ryotaro Kamimura: Forced Information and Information Loss for a Student Survey Analysis. FOCI 2007: 630-636
  - [c45] Ryotaro Kamimura: Partially Activated Neural Networks by Controlling Information. ICANN (1) 2007: 480-489
  - [c44] Ryotaro Kamimura: Controlled Competitive Learning: Extending Competitive Learning to Supervised Learning. IJCNN 2007: 1767-1773
  - [c43] Ryotaro Kamimura, Ryozo Kitajima: Forced Information Maximization to Accelerate Information-Theoretic Competitive Learning. IJCNN 2007: 1779-1784
  - [c42] Ryotaro Kamimura: Information loss to extract distinctive features in competitive learning. SMC 2007: 1217-1222
- 2006
  - [j20] Ryotaro Kamimura: Cooperative information maximization with Gaussian activation functions for self-organizing maps. IEEE Trans. Neural Networks 17(4): 909-918 (2006)
  - [c41] Ryotaro Kamimura, Fumihiko Yoshida, Ryozo Kitajima: Collective Information-Theoretic Competitive Learning: Emergence of Improved Performance by Collectively Treated Neurons. ICONIP (1) 2006: 626-633
  - [c40] Ryotaro Kamimura, Fumihiko Yoshida: Automatic Inference of Cabinet Approval Ratings by Information-Theoretic Competitive Learning. ICONIP (2) 2006: 897-908
  - [c39] Ryotaro Kamimura: Self-organizing by Information Maximization: Realizing Self-Organizing Maps by Information-Theoretic Competitive Learning. ICONIP (1) 2006: 925-934
  - [c38] Ryotaro Kamimura: Supervised Information Maximization by Weighted Distance. IJCNN 2006: 1790-1796
  - [c37] Ryotaro Kamimura: Controlling Excessive Information by Surface Information Criterion for Information-Theoretic Self-Organizing Maps. SMC 2006: 5130-5134
  - [c36] Ryotaro Kamimura: Student Survey by Information-Theoretic Competitive Learning. SMC 2006: 5135-5140
- 2005
  - [j19] Ryotaro Kamimura: Improving information-theoretic competitive learning by accentuated information maximization. Int. J. Gen. Syst. 34(3): 219-233 (2005)
  - [j18] Ryotaro Kamimura: Unifying cost and information in information-theoretic competitive learning. Neural Networks 18(5-6): 711-718 (2005)
  - [c35] Ryotaro Kamimura: Modular Structure Generation and Its Application to Feature Extraction. Artificial Intelligence and Applications 2005: 166-171
  - [c34] Ryotaro Kamimura: Extracting Common and Distinctive Features by Cost-Sensitive Information Maximization. Artificial Intelligence and Applications 2005: 172-177
  - [c33] Ryotaro Kamimura, Sachiko Aida-Hyugaji: Maximizing the Ratio of Information to Its Cost in Information Theoretic Competitive Learning. ICANN (2) 2005: 215-222
- 2004
  - [j17] Ryotaro Kamimura: Improving feature extraction performance of greedy network-growing algorithm by inverse euclidean distance. Connect. Sci. 16(2): 129-138 (2004)
  - [j16] Ryotaro Kamimura: Multi-Layered Greedy Network-Growing Algorithm: Extension of Greedy Network-Growing Algorithm to Multi-Layered Networks. Int. J. Neural Syst. 14(1): 9-26 (2004)
  - [j15] Ryotaro Kamimura: Cooperative information control for self-organizing maps. Neurocomputing 62: 225-265 (2004)
  - [c32] Ryotaro Kamimura: One-Epoch Learning for Supervised Information-Theoretic Competitive Learning. ICONIP 2004: 524-529
  - [c31] Ryotaro Kamimura: Teacher-Directed Learning with Gaussian and Sigmoid Activation Functions. ICONIP 2004: 530-536
  - [c30] Ryotaro Kamimura, Osamu Uchida: Cost-Sensitive Greedy Network-Growing Algorithm with Gaussian Activation Functions. ICONIP 2004: 653-658
- 2003
  - [j14] Ryotaro Kamimura: Progressive Feature Extraction with a Greedy Network-growing Algorithm. Complex Syst. 14(2) (2003)
  - [j13] Ryotaro Kamimura: Information theoretic competitive learning in self-adaptive multi-layered networks. Connect. Sci. 15(1): 3-26 (2003)
  - [j12] Ryotaro Kamimura, Fumihiko Yoshida: Teacher-directed learning: information-theoretic competitive learning in supervised multi-layered networks. Connect. Sci. 15(2-3): 117-140 (2003)
  - [j11] Ryotaro Kamimura: Information-Theoretic Competitive Learning with Inverse Euclidean Distance Output Units. Neural Process. Lett. 18(3): 163-204 (2003)
  - [c29] Ryotaro Kamimura: Competitive Learning by Information Maximization: Eliminating Dead Neurons in Competitive Learning. ICANN 2003: 99-106
  - [c28] Ryotaro Kamimura, Haruhiko Takeuchi: Generating Explicit Self-Organizing Maps by Information Maximization. IDEAL 2003: 236-245
  - [c27] Ryotaro Kamimura, Osamu Uchida: Improving Feature Extraction Performance of Greedy Network-Growing Algorithm. IDEAL 2003: 1056-1061
  - [c26] Ryotaro Kamimura: Information-theoretic Competitive Learning. Modelling and Simulation 2003: 359-365
  - [c25] Ryotaro Kamimura, Seiki Hashimoto: Economic Data Analysis and Cooperative Information Control. Modelling and Simulation 2003: 599-605
- 2002
  - [j10] Ryotaro Kamimura, Taeko Kamimura, Haruhiko Takeuchi: Greedy information acquisition algorithm: a new information theoretic approach to dynamic information acquisition in neural networks. Connect. Sci. 14(2): 137-162 (2002)
  - [j9] Ryotaro Kamimura: Controlling internal representations by structural information. Neurocomputing 48(1-4): 705-725 (2002)
- 2001
  - [j8] Ryotaro Kamimura, Taeko Kamimura, Osamu Uchida: Flexible feature discovery and structural information control. Connect. Sci. 13(4): 323-347 (2001)
  - [c24] Ryotaro Kamimura, Taeko Kamimura: Cooperative Information Control to Coordinate Competition and Cooperation. ICANN 2001: 835-842
  - [c23] Ryotaro Kamimura, Taeko Kamimura: Information Maximization and Language Acquisition. ICANN 2001: 1225-1232
- 2000
  - [c22] Ryotaro Kamimura: Conditional Information Analysis. IJCNN (1) 2000: 197-202
  - [c21] Ryotaro Kamimura: Selective Information Acquisition with Application to Pattern Classification. IJCNN (1) 2000: 203-210
  - [c20] Ryotaro Kamimura, Taeko Kamimura: Information theoretic rule discovery in neural networks. SMC 2000: 2569-2574
1990 – 1999
- 1999
  - [c19] Ryotaro Kamimura: Mediated and multi-level information processing. IJCNN 1999: 1397-1402
  - [c18] Ryotaro Kamimura: Controlling simple structural information to improve generalization performance. IJCNN 1999: 1403-1408
- 1998
  - [j7] Ryotaro Kamimura: Minimizing alpha-Information for Generalization and Interpretation. Algorithmica 22(1/2): 173-197 (1998)
  - [c17] Ryotaro Kamimura: Structural Information Control to Improve Generalization. ICONIP 1998: 643-646
  - [c16] Ryotaro Kamimura: Integrated Information Processors with Multi-functional Components. ICONIP 1998: 1501-1506
- 1997
  - [j6] Ryotaro Kamimura: Controlling α-entropy with a Neural α-Feature Detector. Complex Syst. 11(1) (1997)
  - [j5] Ryotaro Kamimura: Constrained Information Maximization to Control Internal Representation. J. Braz. Comput. Soc. 4(1) (1997)
  - [j4] Ryotaro Kamimura: Information Controller to Maximize and Minimize Information. Neural Comput. 9(6): 1357-1380 (1997)
  - [c15] Ryotaro Kamimura: D-entropy minimization: integration of mutual information maximization and minimization. ICNN 1997: 1056-1061
  - [c14] Ryotaro Kamimura: D-entropy controller for interpretation and generalization. ICNN 1997: 1948-1953
- 1996
  - [c13] Ryotaro Kamimura, Shohachiro Nakanishi: Constrained information maximization. ICNN 1996: 740-744
  - [c12] Ryotaro Kamimura, Shohachiro Nakanishi: Kernel feature detector: extracting kernel features by minimizing α-information. ICNN 1996: 2182-2187
  - [c11] Ryotaro Kamimura: Unification of Information Maximization and Minimization. NIPS 1996: 508-514
- 1995
  - [j3] Ryotaro Kamimura, Toshiyuki Takagi, Shohachiro Nakanishi: Improving Generalization Performance by Information Minimization. IEICE Trans. Inf. Syst. 78-D(2): 163-173 (1995)
  - [j2] Ryotaro Kamimura, Shohachiro Nakanishi: Kernel Hidden Unit Analysis - Network Size Reduction by Entropy Minimization. IEICE Trans. Inf. Syst. 78-D(4): 484-489 (1995)
  - [j1] Ryotaro Kamimura, Shohachiro Nakanishi: Feature detectors by autoencoders: Decomposition of input patterns into atomic features by neural networks. Neural Process. Lett. 2(6): 17-22 (1995)
  - [c10] Ryotaro Kamimura, Shohachiro Nakanishi: Minimum α-information strategy for the interpretation of the network behaviors and the improved generalization. ICNN 1995: 974-978
  - [c9] Ryotaro Kamimura, Shohachiro Nakanishi: Information maximization for feature detection and pattern classification by autoencoders. ICNN 1995: 985-989
  - [c8] Ryotaro Kamimura, Taeko Kamimura, Shohachiro Nakanishi: Discovery of Linguistic Rules by Hidden Information Maximization. ICNN 1995: 2987-2991
- 1993
  - [c7] Ryotaro Kamimura: Minimum entropy methods in neural networks: competition and selective responses by entropy minimization. ICNN 1993: 219-225
  - [c6] Ryotaro Kamimura: Generation of Internal Representation by alpha. NIPS 1993: 271-278
- 1992
  - [c5] Ryotaro Kamimura: Competitive Learning by Entropy Minimization. ALT 1992: 111-122
  - [c4] Ryotaro Kamimura: Complexity Term to Generate Explicit Internal Representation in Recurrent Neural Networks. IFIP Congress (1) 1992: 336-342
- 1991
  - [c3] Ryotaro Kamimura: Application of the Recurrent Neural Network to the Problem of Language Acquisition. Conference on Analysis of Neural Network Applications 1991: 14-28
- 1990
  - [c2] Ryotaro Kamimura: Application of temporal supervised learning algorithm to generation of natural language. IJCNN 1990: 201-207
  - [c1] Ryotaro Kamimura: Recognition and restoration of periodic patterns with recurrent neural network. SPDP 1990: 436-440
last updated on 2024-11-08 20:30 CET by the dblp team
all metadata released as open data under CC0 1.0 license