- P. Patrick van der Smagt:
Minimisation methods for training feedforward neural networks.
- Michael E. Hasselmo:
Runaway synaptic modification in models of cortex: Implications for Alzheimer's disease.
- Kunihiko Fukushima, Masato Okada, Kazuhito Hiroshige:
Neocognitron with dual C-cell layers.
- Yoko Yamaguchi, Hiroshi Shimizu:
Pattern recognition with figure-ground separation by generation of coherent oscillations.
- David Hestenes:
Invariant body kinematics: I. Saccadic and compensatory eye movements.
- David Hestenes:
Invariant body kinematics: II. Reaching and neurogeometry.
- Tarek M. Nabhan, Albert Y. Zomaya:
Toward generating neural network structures for function approximation.
- Michael D. Lemmon:
Topologically ordered competitive sampling.
- Juha Karhunen, Jyrki Joutsensalo:
Representation and separation of signals using nonlinear PCA type learning.
- Eric B. Bartlett:
Dynamic node architecture learning: An information theoretic approach.
- John M. DeLaurentis, Fred M. Dickey:
A convexity-based analysis of neural networks.
- Daniel F. McCaffrey, A. Ronald Gallant:
Convergence rates for single hidden layer feedforward networks.
- Gail A. Carpenter:
A distributed outstar network for spatial pattern learning.
- Pierre Courrieu:
Three algorithms for estimating the domain of validity of feedforward neural networks.
- Christian Cachin:
Pedagogical pattern selection strategies.
- Françoise Beaufays, Youssef Abdel-Magid, Bernard Widrow:
Application of neural networks to load-frequency control in power systems.
- Anna Esposito, Salvatore Rampone, Roberto Tagliaferri:
A neural network for error correcting decoding of binary linear codes.
- Qing Hu, David B. Hertz:
An inappropriate use of neural networks for forecasting.
- Kanad Chakraborty, Kishan G. Mehrotra, Chilukuri K. Mohan, Sanjay Ranka:
Response to letter by Q. Hu and D. B. Hertz.
- Stephen Grossberg:
Recognition and segmentation of connected characters with selective attention.
- Kunihiko Fukushima:
Response to letter by S. Grossberg.
- Yukio Hayashi:
Oscillatory neural network and learning of continuously transformed patterns.
- Joshua Chover:
Recall via transient neuronal firing.
- Kaining Wang, Anthony N. Michel:
Robustness and perturbation analysis of a class of artificial neural networks.
- Stefan Wimbauer, Nikolaus Klemmer, J. Leo van Hemmen:
Universality of unlearning.
- Elias B. Kosmatopoulos, Manolis A. Christodoulou:
The Boltzmann g-RHONN: A learning machine for estimating unknown probability distributions.
- Gerald Fahner, Rolf Eckmiller:
Structural adaptation of parsimonious higher-order neural classifiers.
- Zhenni Wang, Christine Di Massimo, Ming T. Tham, A. Julian Morris:
A procedure for determining the topology of multilayer feedforward neural networks.
- Zaiyong Tang, Gary J. Koehler:
Deterministic global optimal FNN training algorithms.
- Danny S. Thomas, Amar Mitiche:
Asymptotic optimality of pattern recognition by regression analysis.
- Bagrat R. Amirikian, Hajime Nishimura:
What size network is good for generalization of a specific task of interest?
- Li Deng, Khaled Hassanein, Mohamed I. Elmasry:
Analysis of the correlation structure for a neural predictive model with application to speech recognition.
- Robert W. Smalz, Michael Conrad:
Combining evolution with credit apportionment: A new learning algorithm for neural nets.
- Sushmita Mitra, Sankar K. Pal:
Logical operation based fuzzy MLP for classification and rule generation.
- Apostolos-Paul Nicholas Refenes, Achileas D. Zapranis, Gavin Francis:
Stock performance modeling using neural networks: A comparative study with regression models.
- Jun Takeuchi, Yukio Kosugi:
Neural network representation of finite element method.
- Salvatore Cavalieri, Antonella Di Stefano, Orazio Mirabella:
Optimal path determination in a graph by Hopfield neural network.
- Paolo Gaudiano, Dimitrij Surmeli, Frank D. M. Wilson:
Gated dipoles for operant conditioning.
- Jennifer L. Raymond, Douglas A. Baxter, Dean V. Buonomano, John H. Byrne:
Response to letter by Gaudiano et al.
- Mohammad Bahrami:
Adaptive control of dynamic systems by back propagation network.
- W. H. Schiffmann:
Response to letter by M. Bahrami.
- Qi Jia, Katsuyuki Hagiwara, Naohiro Toda, Shiro Usui:
Equivalence relation between the back propagation learning process of an FNN and that of an FNNG.
- Masakazu Matsugu, Alan L. Yuille:
Spatiotemporal information storage in a content addressable memory using realistic neurons.
- Nur Arad, Eric L. Schwartz, Zvi Wollberg, Yehezkel Yeshurun:
Acoustic binaural correspondence used for localization of natural acoustic signals.
- Morris W. Hirsch:
Saturation at high gain in discrete time recurrent networks.
- Hidefumi Katsuura, David A. Sprecher:
Computational aspects of Kolmogorov's superposition theorem.
- Robert L. Coultrip, Richard H. Granger:
Sparse random networks with LTP learning rules approximate Bayes classifiers via Parzen's method.
- J. J. Kosowsky, Alan L. Yuille:
The invisible hand algorithm: Solving the assignment problem with statistical physics.
- Kevin S. Van Horn, Tony R. Martinez:
The minimum feature set problem.
- Michael Georgiopoulos, Juxin Huang, Gregory L. Heileman:
Properties of learning in ARTMAP.
- Thomas Martinetz, Klaus Schulten:
Topology representing networks.
- Y. Guan, Trevor G. Clarkson, John G. Taylor, Denise Gorse:
Noisy reinforcement training for pRAM nets.
- Fu-Lai Chung, Tong Lee:
Fuzzy competitive learning.
- Jun Tani, Naohiro Fukumura:
Learning goal-directed sensory-based navigation of a mobile robot.
- Leonid I. Perlovsky:
A model-based neural network for transient signal processing.
- Daniel C. Chin:
A more efficient global optimization algorithm based on Styblinski and Tang.
- Jean-François Vibert, Khashayar Pakdaman, Noureddine Azmy:
Interneural delay modification synchronizes biologically plausible neural networks.
- Lei Xu, Adam Krzyzak, Alan L. Yuille:
On radial basis function nets and kernel regression: Statistical consistency, convergence rates, and receptive field size.
- Jun Wang:
A deterministic annealing neural network for convex programming.
- Sukhan Lee, Rhee Man Kil:
Redundant arm kinematic control with recurrent loop.
- Vicken Kasparian, Celal Batur, H. Zhang, Joseph Padovan:
Davidon least squares-based learning algorithm for feedforward neural networks.
- Leonid I. Perlovsky, John Jaskolski:
Maximum likelihood adaptive neural controller.
- Laura I. Burke:
Neural methods for the traveling salesman problem: Insights from operations research.
- Roberto Battiti, Anna Maria Colla:
Democracy in neural nets: Voting schemes for classification.
- Charles M. Bachmann, Scott A. Musman, Dong Luong, Abraham Schultz:
Unsupervised BCM projection pursuit algorithms for classification of simulated radar presentations.
- Kumpati S. Narendra, Snehasis Mukhopadhyay:
Adaptive control of nonlinear multivariable systems using neural networks.
- Kiyotoshi Matsuoka, Mitsuru Kawamoto:
A neural network that self-organizes to perform three operations related to principal component analysis.
- Luis Gonzalez Sotelino, Marco Saerens, Hugues Bersini:
Classification of temporal trajectories by continuous-time recurrent nets.
- Galina L. Rogova:
Combining the results of several neural network classifiers.
- Ali A. Minai, Ronald D. Williams:
Perturbation response in feedforward networks.
- Keihiro Ochiai, Naohiro Toda, Shiro Usui:
Kick-out learning algorithm to reduce the oscillation of weights.
- Brian A. Telfer, Harold Szu:
Energy functions for minimizing misclassification error with minimum-complexity networks.
- Johan A. K. Suykens, Bart De Moor, Joos Vandewalle:
Static and dynamic stabilizing neural controllers, applicable to transition between equilibrium points.
- Kootala P. Venugopal, Abhijit S. Pandya, Raghavan Sudhakar:
A recurrent neural network controller and learning algorithm for the on-line learning control of autonomous underwater vehicles.
- E. V. Krishnamurthy:
Unsolvability, complexity, and neural networks.
- Kevin T. Judd, Kazuyuki Aihara:
Response to letter by E. V. Krishnamurthy.
1994 Models of Neurodynamics and Behavior
1994 Special Issue
- Gary G. Blasdel, Klaus Obermayer:
Putative strategies of scene segmentation in monkey visual cortex.
- Stephen Grossberg, Steven J. Olson:
Rules for the cortical map of ocular dominance and orientation columns.
- Robert K. Cunningham, Allen M. Waxman:
Diffusion-enhancement bilayer: Realizing long-range apparent motion and spatiotemporal grouping in a neural architecture.
- Stephen R. Jackson, Richard T. Marrocco, Michael I. Posner:
Networks of anatomical areas controlling visuospatial attention.
- Mike W. Oram, David I. Perrett:
Modeling visual recognition from neurobiological constraints.
- Samuel Kaski, Teuvo Kohonen:
Winner-take-all networks for physiological models of competitive learning.
- Bart L. M. Happel, Jacob M. J. Murre:
Design and evolution of modular neural network architectures.
- Daniel L. Alkon, Kim T. Blackwell, Garth S. Barbour, Susan A. Werness, Thomas P. Vogl:
Biological plausibility of synaptic associative memory models.
- Wolfgang Konen, Thomas Maurer, Christoph von der Malsburg:
A fast dynamic link matching algorithm for invariant pattern recognition.
- Theodore W. Berger, Gilbert A. Chauvet, Robert J. Sclabassi:
A biologically based model of functional properties of the hippocampus.
- Neil Burgess, Michael Recce, John O'Keefe:
A model of hippocampal function.
- Ivan A. Bachelder, Allen M. Waxman:
Mobile robot visual mapping and localization: A view-based neurocomputational architecture that emulates hippocampal place learning.
- Daniel Bullock, John C. Fiala, Stephen Grossberg:
A neural model of timed response learning in the cerebellum.
- Kuniharu Arai, Edward L. Keller, Jay A. Edelman:
Two-dimensional neural network model of the primate saccadic system.
- Jim-Shih Liaw, Ananda Weerasuriya, Michael A. Arbib:
Snapping: A paradigm for modeling coordination of motor synergies.
- Paul C. Bressloff, John G. Taylor:
Dynamics of compartmental model neurons.
- Raju S. Bapi, Daniel S. Levine:
Modeling the role of frontal lobes in sequential task performance. I. Basic structure and primacy effects.
- John G. Taylor:
Goals, drives, and consciousness.
- Benedikt K. Humpert:
Improving back propagation with a new error function.
- Hideki Hayakawa, Shinya Nishida, Yasuhiro Wada, Mitsuo Kawato:
A computational model for shape estimation by integration of shading and edge information.
- Naonori Ueda, Ryohei Nakano:
A new competitive learning approach based on an equidistortion principle for designing optimal vector quantizers.
- Stefan Jockusch, Helge Ritter:
Self-organizing maps: Local competition and evolutionary optimization.
- Kunikazu Kobayashi, Toyoshi Torioka, Nobuhiko Ikeda:
Fundamental consideration on self-formation of recognition cells.
- Norio Baba, Yoshio Mogami, Motokazu Kohzaki, Yasuhiro Shiraishi, Yutaka Yoshida:
A hybrid algorithm for finding the global minimum of error function of neural networks and its applications.
- Kenji Araki, Toshimichi Saito:
An associative memory including time-variant self-feedback.
- Ronald R. Yager:
Modeling and formulating fuzzy knowledge bases using neural networks.
- Burkhard Lenze:
How to make sigma-pi neural networks perform perfectly on regular training sets.
- Alexander Shustorovich:
A subspace projection approach to feature extraction: The two-dimensional Gabor transform for character recognition.
- Hiroshi Ohno, Toshihiko Suzuki, Keiji Aoki, Arata Takahasi, Gunji Sugimoto:
Neural network control for automatic braking control system.
- Thomas Wagner, Friedrich G. Boebel:
Testing synergetic algorithms with industrial classification problems.
- Thomas P. Caudell, Scott D. G. Smith, Richard Escobedo, Michael Anderson:
NIRS: Large scale ART-1 neural architectures for engineering design retrieval.
- Brendan L. Rogers:
New neural multiprocess memory model for adaptively regulating associative learning.
- Tao Wang:
Improving recall in associative memories by dynamic threshold.
- Anne-Johan Annema, Klaas Hoen, Hans Wallinga:
Learning behavior and temporary minima of two-layer neural networks.
- Roberto Brunelli:
Training neural nets through stochastic minimization.
- Bill G. Horne, Don R. Hush:
On the node complexity of neural networks.
- Patrick Thiran, Martin Hasler:
Self-organization of a one-dimensional Kohonen network with quantized weights and inputs.
- Bernd Fritzke:
Growing cell structures--A self-organizing network for unsupervised and supervised learning.
- Bernard Ans, Yves Coiton, Jean-Claude Gilhodes, Jean-Luc Velay:
A neural network model for temporal sequence learning and motor programming.
- Yan Qiu Chen, David W. Thomas, Mark S. Nixon:
Generating-shrinking algorithm for learning arbitrary classification.