Co-evolution of Fitness Predictors and Deep Neural Networks

Bibliographic Details
Published in: Parallel Processing and Applied Mathematics, Vol. 10777, pp. 555-564
Main Authors: Funika, Włodzimierz; Koperek, Paweł
Format: Book Chapter
Language: English
Published: Switzerland: Springer International Publishing AG, 2018
Series: Lecture Notes in Computer Science
ISBN: 9783319780238, 3319780239
ISSN: 0302-9743, 1611-3349
DOI: 10.1007/978-3-319-78024-5_48

Summary: Deep neural networks have proved to be a very useful and powerful tool with many applications. To achieve good learning results, however, the network architecture has to be carefully designed, which requires considerable experience and knowledge. Using an evolutionary process to develop new network topologies can facilitate this design. The limiting factor is the speed of evaluating a single specimen (a single network architecture), which involves learning on a large dataset. In this paper we propose a new approach which uses subsets of the original training set to approximate the fitness. We describe a co-evolutionary algorithm and discuss its key elements. Finally, we draw conclusions from experiments and outline plans for future work.
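The summary describes the idea only at a high level. The following is a minimal, purely illustrative sketch in plain Python (not the authors' implementation; names such as Specimen, Predictor and evolve are hypothetical) of how co-evolving fitness predictors with candidate solutions can work: predictors are small subsets of the training data, rated by how well they reproduce the exact fitness of a few specimens, while specimens are ranked only on the best predictor's subset. A toy linear model stands in for actual deep-network training.

# Illustrative sketch only (assumption, not the authors' code): a minimal
# co-evolution loop. "Specimen" stands in for a network architecture and
# "Predictor" for a fitness predictor, i.e. a small subset of the training data.
import random

random.seed(0)

# Toy "training set": noisy samples of y = 3x - 1.
FULL_DATASET = [(x, 3.0 * x - 1.0 + random.gauss(0, 0.1)) for x in range(200)]


class Specimen:
    """Candidate solution; here a linear model (a, b) instead of a network topology."""

    def __init__(self, a=None, b=None):
        self.a = a if a is not None else random.uniform(-5, 5)
        self.b = b if b is not None else random.uniform(-5, 5)

    def loss(self, samples):
        return sum((self.a * x + self.b - y) ** 2 for x, y in samples) / len(samples)

    def mutate(self):
        return Specimen(self.a + random.gauss(0, 0.3), self.b + random.gauss(0, 0.3))


class Predictor:
    """Fitness predictor: indices of a small subset of the full training set."""

    def __init__(self, indices=None, size=8):
        self.indices = indices if indices is not None else random.sample(
            range(len(FULL_DATASET)), size)

    def subset(self):
        return [FULL_DATASET[i] for i in self.indices]

    def mutate(self):
        new = list(self.indices)
        new[random.randrange(len(new))] = random.randrange(len(FULL_DATASET))
        return Predictor(new)


def evolve(generations=50, pop=10):
    specimens = [Specimen() for _ in range(pop)]
    predictors = [Predictor() for _ in range(pop)]
    trainers = random.sample(range(pop), 3)  # specimens used to rate predictors

    for _ in range(generations):
        # Predictor fitness: how closely the subset loss matches the exact
        # (full-dataset) loss for a few "trainer" specimens.
        def prediction_error(p):
            return sum(abs(specimens[i].loss(p.subset()) - specimens[i].loss(FULL_DATASET))
                       for i in trainers)

        predictors.sort(key=prediction_error)
        best_predictor = predictors[0]
        predictors = predictors[:pop // 2] + [p.mutate() for p in predictors[:pop // 2]]

        # Specimen fitness: approximated on the best predictor's subset only,
        # which is far cheaper than evaluating on the whole training set.
        specimens.sort(key=lambda s: s.loss(best_predictor.subset()))
        specimens = specimens[:pop // 2] + [s.mutate() for s in specimens[:pop // 2]]

    best = min(specimens, key=lambda s: s.loss(FULL_DATASET))
    print(f"best specimen: a={best.a:.2f}, b={best.b:.2f}, "
          f"full-dataset loss={best.loss(FULL_DATASET):.4f}")


if __name__ == "__main__":
    evolve()

The point this sketch illustrates is the trade-off named in the summary: evaluating a specimen on a small, evolving subset is much cheaper than full training on the whole dataset, at the cost of only approximating its true fitness.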