0. Filenames: EuroGP2022_ADP.txt and EuroGP2022_ADP.pdf

1. Complete title of the paper: Evolving Monotone Conjunctions in Regimes Beyond Proved Convergence

2. Information on the authors:
---
Pantia-Marina Alchirch
Physical mailing address: 3 Pindarou St, Perama (Piraeus), 18863, Greece
E-mail address: marina.alchirch@aueb.gr
Phone number: +30-210-441-2792
---
Dimitrios I. Diochnos
Physical mailing address: 110 W Boyd St, Devon Energy Hall (Room 244), Norman, OK 73019, USA
E-mail address: diochnos@ou.edu
Phone number: +1-405-325-2319
---
Katia Papakonstantinopoulou
Physical mailing address: Kodrigktonos 12, Room 5, Athens, 112 57, Greece
E-mail address: katia@aueb.gr
Phone number: +30-210-820-3537

3. Corresponding author: Dimitrios I. Diochnos (diochnos@ou.edu)

4. Abstract of the paper:
Recently it was shown, using the typical mutation mechanism that is used in evolutionary algorithms, that monotone conjunctions are provably evolvable under a specific set of Bernoulli(p)^n distributions. A natural question is whether this mutation mechanism allows convergence under other distributions as well. Our experiments indicate that the answer to this question is affirmative and that, at the very least, this mechanism converges under Bernoulli(p)^n distributions outside of the known proved regime.

Keywords: Evolvability, Genetic programming, Monotone conjunctions, Distribution-specific learning, Bernoulli(p)^n distributions

5. List with letters: B, D, E

6. Statement why the result satisfies the above criteria:
The paper proposes an extension of a mutation mechanism inspired by the (1+1) Evolutionary Algorithm and shows experimentally that distribution-specific learning of monotone conjunctions is possible for Bernoulli(p)^n distributions characterized by any value of p in the (0, 1) interval. This result extends the mutation mechanism of [R1] so that it can now accommodate values of p, characterizing the underlying distribution, that lie outside of (0, 1/3] U {1/2}. Note that in [R1] convergence of the mechanism is proved rigorously for values of p in (0, 1/3] U {1/2}, whereas this paper shows that, with the proposed extension, experimental convergence occurs for values of p outside of (0, 1/3] U {1/2}. Convergence to a good solution (i.e., one having a low error rate) was achieved in 100% of the experiments, for many different target functions to be learnt, for each value of p = j/10 that was tested, with j = 1, 2, ..., 9.
---
Criteria B and D: Due to the above, the paper satisfies criteria B and D.
Criterion (B): The result is equal to or better than a result that was accepted as a new scientific result at the time when it was published in a peer-reviewed scientific journal.
Criterion (D): The result is publishable in its own right as a new scientific result independent of the fact that the result was mechanically created.
---
Criterion E:
Criterion (E): The result is equal to or better than the most recent human-created solution to a long-standing problem for which there has been a succession of increasingly better human-created solutions.
In order to justify criterion E we need more context. Namely, learning monotone conjunctions in the framework of evolvability, on which the work of this paper builds, has attracted a lot of attention due to the basic idea that more sophisticated algorithms can potentially be built by using, as building blocks, algorithms for simpler classes of functions such as monotone conjunctions. For concreteness, the sketch below illustrates the kind of mutation-based mechanism under discussion, before we turn to the history of the problem.
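The following Python sketch shows one way such a mechanism can operate: a hypothesis (a monotone conjunction, represented as a set of variable indices) is mutated locally, and selection is driven only by aggregate empirical fitness under a Bernoulli(p)^n distribution. This is a minimal sketch under simplifying assumptions of our own; the sample sizes, the uniform choice of a neighbor, and the tolerance-free "not worse" acceptance rule are illustrative choices, not the exact mechanism of [R1] or of the paper.

# Minimal illustrative sketch -- NOT the paper's exact algorithm.
# A (1+1)-style "mutate and keep if not worse" rule evolves a monotone
# conjunction under Bernoulli(p)^n, using only aggregate empirical fitness.
# All names and parameter values here are illustrative assumptions.
import random

n, p = 10, 0.7                        # number of variables; Bernoulli parameter
target = frozenset({0, 1, 2})         # hidden target: x0 AND x1 AND x2
rng = random.Random(0)

def example():
    """Draw x ~ Bernoulli(p)^n and label it with the target conjunction."""
    x = [rng.random() < p for _ in range(n)]
    return x, all(x[i] for i in target)

def fitness(h, samples=500):
    """Empirical agreement of hypothesis h with the target (1 - error)."""
    return sum(all(x[i] for i in h) == y
               for x, y in (example() for _ in range(samples))) / samples

def neighbors(h):
    """Local neighborhood: keep h, add one variable, drop one, or swap one."""
    outside = set(range(n)) - h
    nbrs = [h] + [h | {i} for i in outside] + [h - {i} for i in h]
    nbrs += [(h - {i}) | {j} for i in h for j in outside]
    return nbrs

h, fh = frozenset(), fitness(frozenset())   # start from the empty conjunction
for _ in range(500):
    cand = rng.choice(neighbors(h))         # one random local mutation
    fc = fitness(cand)
    if fc >= fh:                            # keep mutations that are not worse
        h, fh = cand, fc
print("evolved:", sorted(h), " empirical agreement:", fh)

The point of the sketch is that no individual example ever edits the hypothesis directly; only the aggregate fitness signal drives the moves, which is the defining constraint of the evolvability framework discussed below.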
In particular, the problem was first analyzed rigorously by Leslie Valiant in [R2] with a swapping-type mechanism that learns monotone conjunctions under the uniform distribution (that is, the Bernoulli(p)^n distribution obtained for p = 1/2). Subsequently, the analysis of this swapping-type algorithm was simplified in [R3], while in [R4] it was shown rigorously that this swapping-type algorithm converges under Bernoulli(p)^n distributions for any p in the (0, 1) interval. In parallel, Vitaly Feldman showed in [R5] that monotone conjunctions *cannot* be learnt in a distribution-independent way in the framework of evolvability when one uses the Boolean loss, as is the case in the paper that we submit to the "Humies" competition. As a result, one can only hope for distribution-specific results similar to the one in our paper, or for results that hold uniformly over a restricted class of distributions; that is, instead of adapting the evolutionary mechanism to a particular underlying distribution and then converging to a good solution, one uses a single evolutionary mechanism that creates good solutions for every distribution in a restricted class, without actually knowing the underlying distribution (for example, such a result is available in [R1]). Furthermore, Andrei Lissovoi and Pietro Oliveto in [R6] have explored this problem of monotone conjunctions through the lens of genetic programming (with tree-type representations), and some of their results also apply to the framework of evolvability. However, their work is specific to the uniform distribution (that is, the Bernoulli(p)^n distribution obtained for p = 1/2), whereas our work explores experimentally many different values of p in the (0, 1) interval, including the case p = 1/2; therefore the results of this paper supersede the results of [R6] when viewed in the framework of evolvability.

___REFERENCES___
R1. Dimitrios I. Diochnos. On the Evolvability of Monotone Conjunctions with an Evolutionary Mutation Mechanism. J. Artif. Intell. Res. 70: 891-921 (2021). https://www.jair.org/index.php/jair/article/view/12050
R2. Leslie G. Valiant. Evolvability. J. ACM 56(1): 3:1-3:21 (2009)
R3. Dimitrios I. Diochnos, György Turán. On Evolvability: The Swapping Algorithm, Product Distributions, and Covariance. SAGA 2009: 74-88
R4. Dimitrios I. Diochnos. On the Evolution of Monotone Conjunctions: Drilling for Best Approximations. ALT 2016: 98-112
R5. Vitaly Feldman. Distribution-Independent Evolvability of Linear Threshold Functions. COLT 2011: 253-272
R6. Andrei Lissovoi, Pietro S. Oliveto. On the Time and Space Complexity of Genetic Programming for Evolving Boolean Conjunctions. J. Artif. Intell. Res. 66: 655-689 (2019)

7. Full citation of the paper:
Pantia-Marina Alchirch, Dimitrios I. Diochnos, and Katia Papakonstantinopoulou. Evolving monotone conjunctions in regimes beyond proved convergence. In Eric Medvet, Gisele Pappa, and Bing Xue, editors, Genetic Programming - 25th European Conference, EuroGP 2022, Held as Part of EvoStar 2022, Madrid, Spain, April 20-22, 2022, Proceedings, volume 13223 of Lecture Notes in Computer Science, pages 228–244. Springer, 2022.

8. Monetary statement: Prize money, if any, is to be divided equally among the co-authors.

9. Why is the particular entry "best"?
First, we note that outside of the framework of evolvability it is known that monotone conjunctions can in fact be learnt *distribution-independently* using approaches based on genetic algorithms (e.g., [R7]), as well as several other algorithms unrelated to evolutionary approaches (e.g., [R8]). However, such approaches rely on information that can be conveyed by individual examples during the course of learning, and thus they modify the candidate solutions directly so that they better fit the correct labels that need to be predicted. The problem with such approaches is that modifying the representation of a candidate solution in this way is analogous to individuals directly modifying their DNA whenever they are not particularly fit in an environment; such approaches are therefore outside the scope of evolvability, as they are unrealistic with respect to how evolution appears to work in the real world. For this reason we believe that it is important to pursue directions in evolutionary machine learning where learning relies on aggregate information on how well a candidate solution fits a set of training examples, and where it is this aggregate measure of the goodness of fit of the candidate solutions in a neighborhood that drives the evolution.

Furthermore, while we know from a separate result by Vitaly Feldman [R9] that monotone conjunctions are evolvable under any fixed distribution (with an algorithm that adapts the representations to the specific distribution every time), this is nevertheless a simulation result that, as Feldman himself admits, is not necessarily the most intuitive or efficient approach for designing evolutionary algorithms. As a consequence of all the above, we believe that this paper continues a line of work that is important and that will allow us to better understand the limits of learning with elegant mutation mechanisms, while at the same time providing rigorous guarantees on the generated solutions. Along these lines we note that Lev Reyzin [R10], as well as Andrei Lissovoi and Pietro Oliveto [R6], explicitly mention that intuitive evolutionary mechanisms, similar to the one that we explore in this paper, are desirable and sought after. Finally, our experimental work explores a direction that has received very little attention within evolutionary algorithms, in the sense that we generate sequences of hypotheses that satisfy the goal of evolution; ultimately our aim is to motivate future work that rigorously proves the experimental convergence we have been able to observe in our paper. For all the above reasons we believe that our paper is a worthy candidate for the "Humies" awards.

___REFERENCES___
R6. Andrei Lissovoi, Pietro S. Oliveto. On the Time and Space Complexity of Genetic Programming for Evolving Boolean Conjunctions. J. Artif. Intell. Res. 66: 655-689 (2019)
R7. Johannes P. Ros. Learning Boolean Functions with Genetic Algorithms: A PAC Analysis. FOGA 1992: 257-275
R8. Jerome S. Bruner, Jacqueline J. Goodnow, and George A. Austin. A Study of Thinking. John Wiley & Sons, New York, NY, USA, 1957.
R9. Vitaly Feldman. Evolvability from Learning Algorithms. STOC 2008: 619-628
R10. Lev Reyzin. Statistical Queries and Statistical Algorithms: Foundations and Applications. CoRR abs/2004.00557 (2020)

10. Type of genetic or evolutionary computation used: ES. In particular, a (1+1)-Evolutionary Algorithm.

11. Date of publication: April 20-22, 2022.