1. The complete title of one (or more) paper(s) published in the open literature describing the work that the author claims describes a human-competitive result:

Neural Exploratory Landscape Analysis for Meta-Black-Box-Optimization

2. The name, complete physical mailing address, e-mail address, and phone number of EACH author of EACH paper(s):

Zeyuan Ma; Guangzhou Higher Education Mega Centre, Panyu District, Guangzhou, 510006, Guangdong, China; scut.crazynicolas@gmail.com; (+86) 15865394156

Jiacheng Chen; Guangzhou Higher Education Mega Centre, Panyu District, Guangzhou, 510006, Guangdong, China; jackchan9345@gmail.com; (+86) 19864522348

Hongshu Guo; Guangzhou Higher Education Mega Centre, Panyu District, Guangzhou, 510006, Guangdong, China; guohongshu369@gmail.com; (+86) 15815266593

Yue-Jiao Gong; Guangzhou Higher Education Mega Centre, Panyu District, Guangzhou, 510006, Guangdong, China; gongyuejiao@gmail.com; (+86) 18302055632

3. The name of the corresponding author (i.e., the author to whom notices will be sent concerning the competition):

Yue-Jiao Gong; gongyuejiao@gmail.com

4. The abstract of the paper(s):

Recent research in Meta-Black-Box Optimization (MetaBBO) has shown that meta-trained neural networks can effectively guide the design of black-box optimizers, significantly reducing the need for expert tuning and delivering robust performance across complex problem distributions. Despite this success, a paradox remains: MetaBBO still relies on human-crafted Exploratory Landscape Analysis (ELA) features to inform the meta-level agent about the low-level optimization progress. To address this gap, this paper proposes Neural Exploratory Landscape Analysis (NeurELA), a novel framework that dynamically profiles landscape features through a two-stage, attention-based neural network, executed in an entirely end-to-end fashion. NeurELA is pre-trained over a variety of MetaBBO algorithms using a multi-task neuroevolution strategy.
Extensive experiments show that NeurELA achieves consistently superior performance when integrated into different, and even unseen, MetaBBO tasks, and can be efficiently fine-tuned for a further performance boost. This advancement marks a pivotal step toward making MetaBBO algorithms more autonomous and broadly applicable. The source code of NeurELA can be accessed at https://github.com/GMC-DRL/Neur-ELA.

5. A list containing one or more of the eight letters (A, B, C, D, E, F, G, or H) that correspond to the criteria (see above) that the author claims that the work satisfies:

E, F, and G.

6. A statement stating why the result satisfies the criteria that the contestant claims (see examples of statements of human-competitiveness as a guide to aid in constructing this part of the submission):

(G) The result solves a problem of indisputable difficulty in its field. As an emerging topic in learning automated algorithm design that is still in its early exploratory phase, MetaBBO relies heavily on problem-specific feature extraction schemes. More importantly, algorithm design tasks such as dynamic operator selection, algorithm configuration, and algorithm generation require timely optimization-status features at every optimization step, which existing Exploratory Landscape Analysis tools cannot provide. Our NeurELA is the first to address this difficulty.

(F) The result is equal to or better than a result that was considered an achievement in its field at the time it was first discovered. NeurELA provides more useful per-optimization-step landscape features for many existing MetaBBO approaches than the per-instance landscape features provided by the very recent Deep-ELA [1].

[1] Seiler, Moritz Vinzent, Pascal Kerschke, and Heike Trautmann. "Deep-ELA: Deep exploratory landscape analysis with self-supervised pretrained transformers for single- and multi-objective continuous optimization problems." Evolutionary Computation (2025): 1-27.
(E) The result is equal to or better than the most recent human-created solution to a long-standing problem for which there has been a succession of increasingly better human-created solutions. Designing landscape features by hand is not new: for decades, human experts have proposed increasingly refined ELA features to support the recognition and classification of optimization problems with diverse characteristics. Compared with the traditional ELA features summarized in [2], NeurELA is superior in three respects: i) accurate per-step landscape profiling; ii) adaptable tuning for unseen MetaBBO approaches; and iii) efficient computation.

[2] Mersmann, Olaf, et al. "Exploratory landscape analysis." Proceedings of the 13th Annual Conference on Genetic and Evolutionary Computation (GECCO). 2011.

7. A full citation of the paper (that is, author names; title, publication date; name of journal, conference, or book in which article appeared; name of editors, if applicable, of the journal or edited book; publisher name; publisher city; page numbers, if applicable):

Ma, Zeyuan, et al. "Neural Exploratory Landscape Analysis for Meta-Black-Box-Optimization." The Thirteenth International Conference on Learning Representations (ICLR). 2025.

8. A statement either that "any prize money, if any, is to be divided equally among the co-authors" OR a specific percentage breakdown as to how the prize money, if any, is to be divided among the co-authors:

All prize money, if any, will be divided equally among the four authors.

9. A statement stating why the authors expect that their entry would be the "best":

Through comprehensive experimental validation in the paper, we have boosted a large array of existing MetaBBO approaches by replacing their human-crafted landscape feature extraction mechanisms with our pre-trained NeurELA model.
More importantly, the NeurELA model is significantly more computationally efficient than traditional ELA as the problem dimension grows. This should be regarded as a victory of neuroevolution: an ES optimizer can evolve a Transformer for landscape analysis!

10. An indication of the general type of genetic or evolutionary computation used, such as GA (genetic algorithms), GP (genetic programming), ES (evolution strategies), EP (evolutionary programming), LCS (learning classifier systems), GI (genetic improvement):

At the meta level, ES is used to evolve the feature extraction networks. At the low level of the MetaBBO approaches considered in the paper, competitive optimizers such as DE, GA, and ES are used.

11. The date of publication of each paper. If the date of publication is not on or before the deadline for submission, but instead, the paper has been unconditionally accepted for publication and is "in press" by the deadline for this competition, the entry must include a copy of the documentation establishing that the paper meets the "in press" requirement:

24 April 2025
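Appendix (illustrative only). The meta-level training described in item 10 can be sketched as an elitist (1+λ) evolution strategy that evolves the weights of a feature-extraction model. The sketch below is a toy stand-in under loud assumptions: the "extractor" is a tiny linear map and its fitness is a simple regression score, whereas NeurELA evolves an attention-based network whose fitness is the downstream performance of MetaBBO agents. All names and the objective here are illustrative, not the paper's actual setup.

```python
import random

random.seed(0)

# Toy data: the "feature extractor" is a linear map w, and its "usefulness"
# is how well the extracted feature matches a target signal. This regression
# fitness is an assumption for illustration; the real fitness in NeurELA is
# the downstream optimization performance of the MetaBBO agent.
DIM = 4
SAMPLES = [[random.uniform(-1, 1) for _ in range(DIM)] for _ in range(32)]
TRUE_W = [0.5, -1.0, 2.0, 0.25]
TARGETS = [sum(w * x for w, x in zip(TRUE_W, xs)) for xs in SAMPLES]

def fitness(weights):
    """Negative mean squared error of the extracted feature vs. the target."""
    err = 0.0
    for xs, t in zip(SAMPLES, TARGETS):
        pred = sum(w * x for w, x in zip(weights, xs))
        err += (pred - t) ** 2
    return -err / len(SAMPLES)

def evolve(generations=150, offspring=20, sigma=0.1):
    """Elitist (1+lambda) ES: mutate the parent weights with Gaussian noise
    and keep any child that improves fitness."""
    parent = [0.0] * DIM  # initial extractor weights
    parent_fit = fitness(parent)
    for _ in range(generations):
        for _ in range(offspring):
            child = [w + random.gauss(0.0, sigma) for w in parent]
            child_fit = fitness(child)
            if child_fit > parent_fit:
                parent, parent_fit = child, child_fit
    return parent, parent_fit

best, best_fit = evolve()
print(best_fit > fitness([0.0] * DIM))  # the evolved extractor improved
```

The elitist (plus-selection) variant is used here only because it guarantees monotonic improvement in a few lines; distributed ES variants with comma selection are common when the fitness evaluation is itself a noisy, expensive optimization run.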