1. Complete Title of Papers:
   - "LLaMEA: A Large Language Model Evolutionary Algorithm for Automatically Generating Metaheuristics" (IEEE Transactions on Evolutionary Computation, April 2025)
   - "Optimizing Photonic Structures with Large Language Model Driven Algorithm Discovery" (GECCO Workshop, 2025)

2. Authors:
   - Niki van Stein, Leiden Institute of Advanced Computer Science (LIACS), Leiden University, 2300 RA, The Netherlands, n.van.stein@liacs.leidenuniv.nl
   - Thomas Bäck, LIACS, Leiden University, 2300 RA, The Netherlands, t.h.w.baeck@liacs.leidenuniv.nl
   - Haoran Yin, LIACS, Leiden University, 2300 RA, The Netherlands, h.yin@liacs.leidenuniv.nl
   - Anna V. Kononova, LIACS, Leiden University, 2300 RA, The Netherlands, a.kononova@liacs.leidenuniv.nl

3. Corresponding Author:
   - Niki van Stein (n.van.stein@liacs.leidenuniv.nl)

4. Abstracts:
   - Large language models (LLMs), such as GPT-4, have demonstrated their ability to understand natural language and generate complex code snippets. This article introduces a novel LLM evolutionary algorithm (LLaMEA) framework, leveraging GPT models for the automated generation and refinement of algorithms. Given a set of criteria and a task definition (the search space), LLaMEA iteratively generates, mutates, and selects algorithms based on performance metrics and feedback from runtime evaluations. This framework offers a unique approach to generating optimized algorithms without requiring extensive prior expertise. We show how this framework can be used to automatically generate novel closed-box metaheuristic optimization algorithms for box-constrained, continuous optimization problems. LLaMEA generates multiple algorithms that outperform state-of-the-art optimization algorithms (covariance matrix adaptation evolution strategy and differential evolution) on the 5-D closed-box optimization benchmark (BBOB).
     The algorithms also show competitive performance on the 10- and 20-D instances of the test functions, although they have not seen such instances during the automated generation process. The results demonstrate the feasibility of the framework and identify future directions for the automated generation and optimization of algorithms via LLMs.
   - We study how large language models (LLMs) can be used in combination with evolutionary computation techniques to automatically discover optimization algorithms for the design of photonic structures. Building on the Large Language Model Evolutionary Algorithm (LLaMEA) framework, we introduce structured prompt engineering tailored to multilayer photonic problems such as Bragg mirrors, ellipsometry inverse analysis, and solar-cell antireflection coatings. We systematically explore multiple configurations of Evolution Strategies, including (1+1), (1+5), (2+10), and others, to balance exploration and exploitation. Our experiments show that LLM-generated algorithms, produced using small-scale problem instances, can match or surpass established methods such as Quasi-oppositional Differential Evolution on large-scale, realistic real-world problem instances. Notably, LLaMEA's self-debugging mutation loop, augmented by automatically extracted problem-specific insights, achieves strong anytime performance and reliable convergence across diverse problem scales. This work demonstrates the feasibility of domain-focused LLM prompts and evolutionary approaches in solving optical design tasks, paving the way for rapid, automated photonic inverse design. Such an approach is broadly applicable and can transform design processes across a wide range of scientific and engineering domains.

5. Criteria Claimed:
   - (B) Result is equal to or better than a peer-reviewed scientific result.
   - (D) Result is publishable as a novel scientific result independent of mechanical creation.
   - (E) Result improves upon recent human-created solutions.
   - (G) Result solves problems of indisputable difficulty in its field.
   - (H) The result holds its own or wins a regulated competition involving human contestants (in the form of either live human players or human-written computer programs).

6. Statement of Human-Competitiveness:
   - The LLaMEA framework has automatically generated and refined metaheuristics that surpass state-of-the-art, manually designed optimization algorithms such as CMA-ES and DE on continuous optimization benchmarks, validated using the widely accepted BBOB suite. Moreover, it has successfully addressed complex real-world photonic structure optimization tasks, achieving results competitive with, or superior to, established human-created optimization algorithms. In the real-world problem paper we also show that it could successfully generalize to higher dimensions: an algorithm optimized on a small problem instance solved a higher-dimensional instance of that problem. This demonstrates a clear advancement over existing human-designed solutions, fulfilling criteria B, D, E, and G. In addition, a LLaMEA-generated algorithm won the "Anytime Algorithms for Many-affine BBOB functions" competition at GECCO 2024 (https://iohprofiler.github.io/competitions/mabbob25), fulfilling criterion H.

7. Full Citation:
   - N. van Stein and T. Bäck, "LLaMEA: A Large Language Model Evolutionary Algorithm for Automatically Generating Metaheuristics," IEEE Transactions on Evolutionary Computation, vol. 29, no. 2, pp. 331-345, April 2025, doi: 10.1109/TEVC.2024.3497793.
   - H. Yin, A. V. Kononova, T. Bäck, and N. van Stein, "Optimizing Photonic Structures with Large Language Model Driven Algorithm Discovery," GECCO Companion Proceedings, 2025 (in press).

8. Prize Money Distribution:
   - Prize money, if any, is to be divided equally among the co-authors.

9. Why This Entry is the Best:
   - This entry demonstrates the first successful integration of large language models and evolutionary computation to autonomously generate, evolve, and refine complete, high-performance black-box optimization algorithms. The approach significantly outperforms existing state-of-the-art algorithms on both standard benchmark problems and complex real-world problems (photonic structure optimization), illustrating a groundbreaking advancement in automated algorithm design that substantially reduces human involvement while achieving human-competitive results.

10. General Type of Evolutionary Computation:
   - GA (genetic algorithms), ES (evolution strategies), EP (evolutionary programming)

11. Publication Date:
   - IEEE Transactions on Evolutionary Computation: April 2, 2025
   - GECCO Workshop paper: Accepted and "in press" by the competition deadline; documentation (the acceptance notification email) is attached, and the camera-ready version with copyright statement is attached as the full paper.
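For readers unfamiliar with the framework, the generate-mutate-select loop described in the abstracts can be sketched in a few lines. This is a minimal illustration under stated assumptions, not the published implementation: the LLM mutation step is replaced by a numeric stub, and a candidate "algorithm" is reduced to a single step-size parameter of a toy random search on the sphere function, whereas LLaMEA itself evolves complete algorithm code. All names here (`llamea_one_plus_one`, `evaluate`, `mutate`) are hypothetical.

```python
import random

def llamea_one_plus_one(evaluate, initial_candidate, mutate, budget=30, seed=0):
    """Schematic (1+1)-style loop: keep a single parent candidate, ask a
    generator (in LLaMEA, the LLM; here, a stub) for a mutated variant,
    evaluate it, and keep whichever candidate performs better."""
    rng = random.Random(seed)
    parent = initial_candidate
    parent_score = evaluate(parent)
    for _ in range(budget):
        child = mutate(parent, rng)      # LLM proposes a refined algorithm
        child_score = evaluate(child)    # run it on the benchmark task
        if child_score <= parent_score:  # minimization: keep the better one
            parent, parent_score = child, child_score
    return parent, parent_score

def evaluate(step):
    """Toy benchmark: quality of a random search with the given step size
    on the 2-D sphere function (lower is better), with a fixed seed."""
    rng = random.Random(42)
    x = [5.0, 5.0]
    best = sum(v * v for v in x)
    for _ in range(200):
        y = [v + rng.gauss(0, step) for v in x]
        fy = sum(v * v for v in y)
        if fy < best:
            x, best = y, fy
    return best

def mutate(step, rng):
    """Stub mutation: perturb the step size (stands in for an LLM rewrite)."""
    return max(1e-3, step * rng.uniform(0.5, 2.0))

best_step, best_score = llamea_one_plus_one(evaluate, 1.0, mutate)
print(best_step, best_score)
```

Because the loop only ever replaces the parent with an equal-or-better child, the returned score can never be worse than that of the initial candidate; the real framework adds feedback from runtime errors and performance logs to guide each LLM mutation.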