1. Paper that describes the work;
Evolving Adaptive Neural Network Optimizers for Image Classification

2. Authors Contact Information;
Pedro Carvalho
Departamento de Engenharia Informática
Faculdade de Ciências e Tecnologia, Universidade de Coimbra
Pólo II - Pinhal de Marrocos
3030-290, Coimbra, Portugal
pfcarvalho@dei.uc.pt
+351 239790016

Nuno Lourenço
Departamento de Engenharia Informática
Faculdade de Ciências e Tecnologia, Universidade de Coimbra
Pólo II - Pinhal de Marrocos
3030-290, Coimbra, Portugal
naml@dei.uc.pt
+351 239790016

Penousal Machado
Departamento de Engenharia Informática
Faculdade de Ciências e Tecnologia, Universidade de Coimbra
Pólo II - Pinhal de Marrocos
3030-290, Coimbra, Portugal
machado@dei.uc.pt
+351 239790052

3. Corresponding Author;
Pedro Carvalho
pfcarvalho@dei.uc.pt

4. Abstract of the paper;
The evolution of hardware has enabled Artificial Neural Networks to become a staple solution to many modern Artificial Intelligence problems such as natural language processing and computer vision. The neural network’s effectiveness is highly dependent on the optimizer used during training, which motivated significant research into the design of neural network optimizers. Current research focuses on creating optimizers that perform well across different topologies and network types. While there is evidence that it is desirable to fine-tune optimizer parameters for specific networks, the benefits of designing optimizers specialized for single networks remain mostly unexplored. In this paper, we propose an evolutionary framework called Adaptive AutoLR (ALR) to evolve adaptive optimizers for specific neural networks in an image classification task. The evolved optimizers are then compared with state-of-the-art, human-made optimizers on two popular image classification problems. The results show that some evolved optimizers perform competitively in both tasks, even achieving the best average test accuracy in one dataset.
An analysis of the best evolved optimizer also reveals that it functions differently from human-made approaches. The results suggest ALR can evolve novel, high-quality optimizers, motivating further research and applications of the framework.

5. Which criteria does the work satisfy;
(E) The result is equal to or better than the most recent human-created solution to a long-standing problem for which there has been a succession of increasingly better human-created solutions.
(F) The result is equal to or better than a result that was considered an achievement in its field at the time it was first discovered.

6. Why the results of the work satisfy criteria E and F;
* Our system evolves neural network optimizers from scratch, with no prior knowledge. These solutions perform on the level of human-made optimizers, outperforming them in specific scenarios.
* Specifically, our optimizers were evolved for a specific image classification dataset and network architecture (Fashion-MNIST). This experiment produced the evolved optimizer ADES. ADES (92.87% test accuracy) empirically outperforms Nesterov Momentum (92.82% test accuracy), RMSprop (92.80% test accuracy), and Adam (91.29% test accuracy); three milestone human-created neural network optimizers.
* ADES was also compared with human-created solutions on a different image classification task (CIFAR-10). In this task, ADES remains statistically comparable to all human-created solutions (ADES achieved 81.85% test accuracy, compared to Nesterov Momentum’s 82.03%).

7.
Full Paper Citation;
@InProceedings{10.1007/978-3-031-02056-8_1,
  author="Carvalho, Pedro and Louren{\c{c}}o, Nuno and Machado, Penousal",
  editor="Medvet, Eric and Pappa, Gisele and Xue, Bing",
  title="Evolving Adaptive Neural Network Optimizers for Image Classification",
  booktitle="Genetic Programming",
  year="2022",
  publisher="Springer International Publishing",
  address="Cham",
  pages="3--18",
  isbn="978-3-031-02056-8"
}

8. Prize Money Split;
Any prize money is to be divided equally among the co-authors.

9. Why the work would be the “best” entry;
Researchers have developed and used many optimizers for neural network training. While it is difficult to identify a single “best” human-created solution, optimizers have become increasingly robust and sophisticated since the advent of back-propagation and stochastic gradient descent. Although neural network optimizers have been a research topic of interest for over two decades, our evolutionary approach produces optimizers that compete with and even surpass these human-created solutions backed by years of research. Furthermore, our system creates these optimizers from scratch, without any prior knowledge of how optimizers should work or of the specificities of the networks.
During our validation process, the experiments were designed to ensure that the human-made optimizers were properly tuned and the comparisons were fair. Although we took every measure to get the most out of the human-created solutions, the evolved optimizers achieved equal or superior results in all the experiments.

10. Type of Evolutionary Computation used;
GP (Genetic Programming), SGE (Structured Grammatical Evolution)

11. The date of publication;
13 April 2022
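For readers unfamiliar with the three milestone baselines named in Section 6, the following is a minimal NumPy sketch of their standard update rules (Adam, RMSprop, and SGD with Nesterov momentum), each minimizing the same toy quadratic. The hyperparameter values and the toy objective are our illustrative choices, not taken from the paper, and the evolved update rule of ADES itself is not reproduced here.

```python
import numpy as np

def grad(w):
    # Gradient of the toy objective f(w) = sum(w**2).
    return 2.0 * w

def adam(w, steps=200, lr=0.1, b1=0.9, b2=0.999, eps=1e-8):
    # Adam: first/second moment estimates with bias correction.
    m, v = np.zeros_like(w), np.zeros_like(w)
    for t in range(1, steps + 1):
        g = grad(w)
        m = b1 * m + (1 - b1) * g
        v = b2 * v + (1 - b2) * g**2
        mhat = m / (1 - b1**t)   # bias-corrected first moment
        vhat = v / (1 - b2**t)   # bias-corrected second moment
        w = w - lr * mhat / (np.sqrt(vhat) + eps)
    return w

def rmsprop(w, steps=200, lr=0.05, rho=0.9, eps=1e-8):
    # RMSprop: scale the step by a running average of squared gradients.
    v = np.zeros_like(w)
    for _ in range(steps):
        g = grad(w)
        v = rho * v + (1 - rho) * g**2
        w = w - lr * g / (np.sqrt(v) + eps)
    return w

def nesterov(w, steps=200, lr=0.1, mu=0.9):
    # SGD with Nesterov momentum: gradient taken at the look-ahead point.
    vel = np.zeros_like(w)
    for _ in range(steps):
        g = grad(w + mu * vel)
        vel = mu * vel - lr * g
        w = w + vel
    return w

if __name__ == "__main__":
    for name, opt in [("adam", adam), ("rmsprop", rmsprop), ("nesterov", nesterov)]:
        wf = opt(np.array([3.0, -2.0]))
        print(name, np.linalg.norm(wf))
```

All three drive the toy objective toward its minimum at the origin; the evolved optimizers in the paper are drop-in replacements for update rules of this kind, discovered by the evolutionary process rather than designed by hand.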