Before the recent breakthroughs in DL, many EC researchers believed that the evolution of neural network (NN) weights could displace backpropagation as a general NN training method. Now, however, the focus has shifted toward hybrid systems in which evolution determines large-scale topological features (such as the number, size, and mesoscale connectivity of hidden layers) along with hyperparameters (such as learning rate, momentum, regularization technique, etc.), while backpropagation tunes the weights within that constellation.
Unfortunately, the merger of these two computationally intensive methods yields systems that can take weeks to run, even on large clusters.
This project seeks to bring these multiply-adaptive systems into a more reasonable computational timeframe by allowing the user to select which network properties to hard-wire and which to evolve. Underlying code will then convert this specification into the appropriate EC genomes, whose fitness will be assessed within a neural network that is implemented in TensorFlow and applied to the problem domain.
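The specification-to-genome conversion described above might be sketched roughly as follows. All names here (the spec dictionary, its fields, and especially the proxy fitness function) are hypothetical placeholders; a real system would build and train a TensorFlow network from the decoded configuration and use its validation performance as fitness.

```python
import random

# Hypothetical user spec: some properties are hard-wired, others marked to evolve.
SPEC = {
    "num_layers":    {"evolve": True,  "range": (1, 5)},
    "layer_size":    {"evolve": True,  "range": (8, 256)},
    "learning_rate": {"evolve": False, "value": 0.01},
}

def random_genome(spec):
    """Build a genome containing genes only for the evolvable properties."""
    return {k: random.randint(*v["range"])
            for k, v in spec.items() if v["evolve"]}

def decode(genome, spec):
    """Merge evolved genes with hard-wired values into a full network config."""
    return {k: genome.get(k, v.get("value")) for k, v in spec.items()}

def fitness(config):
    # Placeholder only: a real system would train a TensorFlow network from
    # `config` and return its validation accuracy. Here we just pretend that
    # 3 layers of size 64 is optimal.
    return -abs(config["num_layers"] - 3) - abs(config["layer_size"] - 64) / 64

def evolve(spec, pop_size=20, generations=30):
    """Simple (mu + lambda)-style loop: keep the top half, mutate one gene each."""
    pop = [random_genome(spec) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda g: fitness(decode(g, spec)), reverse=True)
        parents = pop[:pop_size // 2]
        children = []
        for p in parents:
            child = dict(p)
            gene = random.choice(list(child))      # mutate one evolvable gene
            lo, hi = spec[gene]["range"]
            child[gene] = min(hi, max(lo, child[gene] + random.randint(-8, 8)))
            children.append(child)
        pop = parents + children
    return decode(pop[0], spec)

best = evolve(SPEC)
```

Note that the decoded configuration always carries the hard-wired values (here, the learning rate) untouched, so the evolutionary search space shrinks exactly as much as the user's specification dictates.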
You may choose a standard classification domain or venture into those characteristic of artificial life and evolutionary robotics, wherein an agent struggles for survival in a simulated environment. In the latter case, supervised learning occurs when an agent predicts future states and then compares those expectations to reality, thus deriving an error signal for backpropagation. Regardless of your domain choice, the user will be given the power to control the relative contributions of evolution and supervised learning within that domain.
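The prediction-based error signal mentioned above can be illustrated with a toy one-weight model (the environment dynamics, the linear predictor, and the learning rate are all invented for this sketch; a real agent would backpropagate the same kind of error through a full network):

```python
import random

# Toy environment: the state drifts by an unknown velocity plus small noise.
def environment_step(state, velocity=0.5):
    return state + velocity + random.gauss(0, 0.01)

# Toy predictor: the agent expects next_state = state + w, and adjusts w by
# gradient descent on the squared prediction error (a 1-weight stand-in for
# backpropagation through a full network).
w = 0.0
lr = 0.1
state = 0.0
for _ in range(200):
    prediction = state + w           # agent's expectation of the next state
    next_state = environment_step(state)
    error = prediction - next_state  # expectation vs. reality = error signal
    w -= lr * error                  # gradient step on 0.5 * error**2
    state = next_state
```

No external labels are needed: the environment itself supplies the target at every step, which is exactly why this scheme suits artificial-life domains.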
For more on the general combination of EC and DL, see:
For details of the master-student selection process that Keith Downing uses, please see: