Songmin Xie

Focus on Bioinformatics and Informatics

Table of contents of the comp.ai.neural-nets FAQ

Part 1: Introduction
    What is this newsgroup for? How shall it be used?
    Where is comp.ai.neural-nets archived?
    What if my question is not answered in the FAQ?
    May I copy this FAQ?
    What is a neural network (NN)?
    Where can I find a simple introduction to NNs?
    Are there any online books about NNs?
    What can you do with an NN, and what can't you do?
    Who is concerned with NNs?
    How many kinds of NNs exist?
    How many kinds of Kohonen networks exist? (And what is k-means?)
      VQ: Vector Quantization and k-means
      SOM: Self-Organizing Map
      LVQ: Learning Vector Quantization
      Other Kohonen networks and references
    How are layers counted?
    What are cases and variables?
    What are the population, sample, training set, design set, validation set, and test set?
    How are NNs related to statistical methods?
Part 2: Learning
    What are combination, activation, error, and objective functions?
    What are batch, incremental, on-line, off-line, deterministic, stochastic, adaptive, instantaneous, pattern, epoch, constructive, and sequential learning?
    What is backprop?
    What learning rate should be used for backprop?
    What are conjugate gradients, Levenberg-Marquardt, etc.?
    How does ill-conditioning affect NN training?
    How should categories be encoded?
    Why not code binary inputs as 0 and 1?
    Why use a bias/threshold?
    Why use activation functions?
    How to avoid overflow in the logistic function?
    What is a softmax activation function?
    What is the curse of dimensionality?
    How do MLPs compare with RBFs?
    What are OLS and subset/stepwise regression?
    Should I normalize/standardize/rescale the data?
    Should I nonlinearly transform the data?
    How to measure importance of inputs?
    What is ART?
    What is PNN?
    What is GRNN?
    What does unsupervised learning learn?
    Help! My NN won't learn! What should I do?
Part 3: Generalization
    How is generalization possible?
    How does noise affect generalization?
    What is overfitting and how can I avoid it?
    What is jitter? (Training with noise)
    What is early stopping?
    What is weight decay?
    What is Bayesian learning?
    How to combine networks?
    How many hidden layers should I use?
    How many hidden units should I use?
    How can generalization error be estimated?
    What are cross-validation and bootstrapping?
    How to compute prediction and confidence intervals (error bars)?
Part 4: Books, data, etc.
    Books and articles about Neural Networks?
    Journals and magazines about Neural Networks?
    Conferences and Workshops on Neural Networks?
    Neural Network Associations?
    Mailing lists, BBS, CD-ROM?
    How to benchmark learning methods?
    Databases for experimentation with NNs?
Part 5: Free software
    Source code on the web?
    Freeware and shareware packages for NN simulation?
Part 6: Commercial software
    Commercial software packages for NN simulation?
Part 7: Hardware and miscellaneous
    Neural Network hardware?
    What are some applications of NNs?
      General
      Agriculture
      Chemistry
      Face recognition
      Finance and economics
      Games, sports, gambling
      Industry
      Materials science
      Medicine
      Music
      Robotics
      Weather forecasting
      Weird
    What to do with missing/incomplete data?
    How to forecast time series (temporal sequences)?
    How to learn an inverse of a function?
    How to get invariant recognition of images under translation, rotation, etc.?
    How to recognize handwritten characters?
    What about pulsed or spiking NNs?
    What about Genetic Algorithms and Evolutionary Computation?
    What about Fuzzy Logic?
    Unanswered FAQs
    Other NN links?
------------------------------------------------------------------------
posted on 2005-02-28 15:28 by Songmin Xie