Put loosely, the law of large numbers (LLN) says that the average of a large number of independent, or nearly independent, random variables is usually close to its mean. For some of the mathematics that typically arises in neural modelling, this simple principle has a natural and rewarding application. In one version of this application, equations for the development of long term memory traces (usually modelled as changes in "synaptic efficacies") are well approximated by more elementary equations, and from these the performance of the model can be more easily anticipated. In a second version, a large system of equations modelling the individual activities of interconnected homogeneous populations of neurons is replaced by a small number of prototype equations which accurately describe the macroscopic dynamics of the network. Models of this latter type might be relevant, for example, to the generation of phrenic nerve activity by the brainstem respiratory centers.
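In symbols: if $X_1, X_2, \ldots$ are independent, identically distributed random variables with common mean $\mu = E[X_1]$, then

\[
\frac{1}{n}\sum_{i=1}^{n} X_i \longrightarrow \mu \qquad \text{as } n \to \infty,
\]

where the convergence is in probability (the weak law) or with probability one (the strong law). The applications sketched here involve variants of this statement, for weakly dependent variables and for averages taken over time rather than over an index.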
What I mean to present is more a point of view than a strict mathematical technique. It is another, simpler, way of looking at models which may be very complex, or even intractable, in their first formulation. For this purpose, I feel that a presentation completely by example will be most effective. A reader interested in a more formal and rigorous development, and a more general context, is referred to [ ], [ ], [ ], and [ ], and the references therein to other authors.
Time averaging: the behavior of models for the development of long term memory. In the three examples of this section, the LLN takes the form of a stochastic "method of averaging" for differential equations, through which the behavior of a complex neural or cognitive model can often be anticipated with surprising ease. The method applies to differential equations in which the solution is slowly varying relative to the other time-dependent terms.
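Schematically, and with illustrative symbols not drawn from the examples themselves, the setting is this: suppose

\[
\frac{d}{dt}\,x^{\varepsilon}(t) = \varepsilon\, F\bigl(x^{\varepsilon}(t), \xi(t)\bigr), \qquad x^{\varepsilon}(0) = x_{0},
\]

where $\varepsilon > 0$ is small, so that $x^{\varepsilon}$ evolves slowly, and $\xi(t)$ is a stationary, rapidly fluctuating random process. Over times of order $1/\varepsilon$ the fluctuations in $\xi$ average out, and with high probability $x^{\varepsilon}$ remains close to the solution $\bar{x}$ of the deterministic averaged equation

\[
\frac{d}{dt}\,\bar{x}(t) = \varepsilon\, \bar{F}\bigl(\bar{x}(t)\bigr), \qquad \bar{F}(x) = E\bigl[F\bigl(x, \xi(t)\bigr)\bigr],
\]

where, by stationarity, $\bar{F}$ does not depend on $t$. This is the LLN at work: the slowly varying solution responds only to the time average, hence the mean, of the rapidly varying terms.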