Associative learning on a continuum in evolved dynamical neural networks

This article extends previous work on evolving learning without synaptic plasticity from discrete tasks to continuous tasks. Continuous-time recurrent neural networks (CTRNNs) without synaptic plasticity are artificially evolved on an associative learning task. The task consists of associating paired stimuli: temperature and food. The temperature to be associated can either be drawn from a discrete set or range over a continuum of values. We address two questions: Can the learning-without-synaptic-plasticity approach be extended to continuous tasks? And if so, how does learning without synaptic plasticity work in the evolved circuits? Analysis of the circuits most successful at learning discrete stimuli reveals finite state machine (FSM)-like internal dynamics. However, when the task is modified to require learning stimuli over the full continuous range, it is not possible to extract an FSM from the internal dynamics; a continuous state machine is extracted instead.
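For readers unfamiliar with the model class, the dynamics of a CTRNN are typically integrated with a simple Euler step; since the networks here have no synaptic plasticity, the weights stay fixed throughout a trial and any "learning" must live in the state variables. The sketch below uses the standard CTRNN formulation (time constants, weights, biases, external input); the function and parameter names are illustrative assumptions, not from the paper.

```python
import math

def sigmoid(x):
    """Standard logistic activation used in CTRNNs."""
    return 1.0 / (1.0 + math.exp(-x))

def ctrnn_step(y, tau, w, theta, I, dt=0.01):
    """One Euler step of the standard CTRNN equations:

        tau_i * dy_i/dt = -y_i + sum_j w_ji * sigmoid(y_j + theta_j) + I_i

    y: neuron states, tau: time constants, w[j][i]: weight from j to i,
    theta: biases, I: external inputs. The weights w are fixed; only the
    states y evolve, so memory is carried in the network's dynamics.
    (Illustrative sketch, not the paper's exact implementation.)
    """
    n = len(y)
    out = [sigmoid(y[j] + theta[j]) for j in range(n)]
    return [
        y[i] + dt * (-y[i] + sum(w[j][i] * out[j] for j in range(n)) + I[i]) / tau[i]
        for i in range(n)
    ]
```

With zero weights and zero input, each state simply decays toward zero at a rate set by its time constant, which is a quick sanity check on the integration.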

Izquierdo, E., Harvey, I. and Beer, R.D. (2008). Associative learning on a continuum in evolved dynamical neural networks. Adaptive Behavior 16:361-384.

