1. the complete title of one (or more) paper(s) published in the open literature describing the work that the author claims describes a human-competitive result;

Deep Parameter Optimisation

2. the name, complete physical mailing address, e-mail address, and phone number of EACH author of EACH paper(s);

Fan Wu
Address: Department of Computer Science, University College London, Gower Street, London WC1E 6BT, United Kingdom
E-mail: fan.wu.12@ucl.ac.uk
Phone: +44 (0)2076793058

Westley Weimer
Address: Department of Computer Science, University of Virginia, 85 Engineer's Way, Charlottesville, VA 22904-4740, United States
E-mail: weimer@cs.virginia.edu
Phone: +1 4349241021

Mark Harman
Address: Department of Computer Science, University College London, Gower Street, London WC1E 6BT, United Kingdom
E-mail: mark.harman@ucl.ac.uk
Phone: +44 (0)2076791305

Yue Jia
Address: Department of Computer Science, University College London, Gower Street, London WC1E 6BT, United Kingdom
E-mail: yue.jia@ucl.ac.uk
Phone: +44 (0)2076793673

Jens Krinke
Address: Department of Computer Science, University College London, Gower Street, London WC1E 6BT, United Kingdom
E-mail: j.krinke@ucl.ac.uk
Phone: +44 (0)2076797754

3. the name of the corresponding author (i.e., the author to whom notices will be sent concerning the competition);

Fan Wu

4. the abstract of the paper(s);

ABSTRACT
We introduce a mutation-based approach to automatically discover and expose 'deep' (previously unavailable) parameters that affect a program's runtime costs. These discovered parameters, together with existing ('shallow') parameters, form a search space that we tune using search-based optimisation in a bi-objective formulation that optimises both time and memory consumption. We implemented our approach and evaluated it on four real-world programs. The results show that we can improve execution time by 12% or achieve a 21% memory consumption reduction in the best cases. In three subjects, our deep parameter tuning results in a significant improvement over the baseline of shallow parameter tuning, demonstrating the potential value of our deep parameter extraction approach.

5. a list containing one or more of the eight letters (A, B, C, D, E, F, G, or H) that correspond to the criteria (see above) that the author claims that the work satisfies;

A, E.

6. a statement stating why the result satisfies the criteria that the contestant claims (see examples of statements of human-competitiveness as a guide to aid in constructing this part of the submission);

A: In this paper, we propose a new way of optimising existing software by exposing implicit but crucial parameters, which are then tuned in a bi-objective formulation to optimise performance. This is the first time the idea has been proposed and implemented. It has attracted researchers around the world to follow and extend it, and it has also been extended into an in-progress patent application.

E: We optimised dlmalloc, a state-of-the-art memory management library, for four real-world subjects. Memory management is a field in which there has been a succession of increasingly better human-created solutions. Our approach builds on this state-of-the-art memory management technique and improves its performance by about 20%.
7. a full citation of the paper (that is, author names; publication date; name of journal, conference, technical report, thesis, book, or book chapter; name of editors, if applicable, of the journal or edited book; publisher name; publisher city; page numbers, if applicable);

@inproceedings{Wu:2015:DPO:2739480.2754648,
  author    = {Wu, Fan and Weimer, Westley and Harman, Mark and Jia, Yue and Krinke, Jens},
  title     = {Deep Parameter Optimisation},
  booktitle = {Proceedings of the 2015 Annual Conference on Genetic and Evolutionary Computation},
  series    = {GECCO '15},
  year      = {2015},
  isbn      = {978-1-4503-3472-3},
  location  = {Madrid, Spain},
  pages     = {1375--1382},
  numpages  = {8},
  url       = {http://doi.acm.org/10.1145/2739480.2754648},
  doi       = {10.1145/2739480.2754648},
  acmid     = {2754648},
  publisher = {ACM},
  address   = {New York, NY, USA},
  keywords  = {memory allocation, parameter exposure, parameter tuning},
}

8. a statement either that "any prize money, if any, is to be divided equally among the co-authors" OR a specific percentage breakdown as to how the prize money, if any, is to be divided among the co-authors;

Any prize money, if any, is to be divided equally among the co-authors.

9. a statement stating why the authors expect that their entry would be the "best," and

Genetic Improvement (GI) has been shown to be an effective approach to improving software. It has been applied to bug fixing, execution time reduction, memory saving, and more. Because human developers usually focus on implementing the main features of the software, it is difficult for them to optimise other non-functional properties at the same time; fortunately, that is exactly what computers are good at. Existing work on GI automatically modifies the source code and seeks improvement among the "mutated" versions, but it focuses on improving a single objective, such as execution time. This is the first paper to optimise both execution time and memory consumption in a bi-objective formulation, something that is practically impossible for human developers to do while implementing the main features. Our approach exposes "deep" parameters that were previously unavailable; in comparison with developer-specified "shallow" parameters, they may capture information that exceeds even the developers' own knowledge of the software. We applied this approach to a state-of-the-art memory management library and improved execution time by 12% or reduced memory consumption by 21% in the best cases. We also produced a set of solutions that trade off these two properties, which is impossible for any human developer to produce manually. (A minimal sketch of such a bi-objective tuning loop is given after item 10 below.)

10. an indication of the general type of genetic or evolutionary computation used, such as GA (genetic algorithms), GP (genetic programming), ES (evolution strategies), EP (evolutionary programming), LCS (learning classifier systems), GE (grammatical evolution), GEP (gene expression programming), DE (differential evolution), etc.

GA
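For illustration only, the following is a minimal Python sketch of the bi-objective parameter tuning described in items 4 and 9. It is not the authors' implementation: the parameter names, their ranges, and the measure() function are hypothetical placeholders standing in for rebuilding the subject program with its exposed deep parameters and profiling its runs, and the simple non-dominated-front loop stands in for the full multi-objective genetic algorithm used in the paper.

import random

# Hypothetical search space: (name, lower bound, upper bound) for each parameter.
# Shallow parameters are existing configuration options; deep parameters are
# constants exposed from the source code by mutation (all names here are made up).
PARAMS = [
    ("shallow_trim_threshold", 1, 4096),
    ("deep_constant_1",        1, 1024),
    ("deep_constant_2",        1, 1024),
]

def measure(cfg):
    """Placeholder for compiling and running the subject program with the
    parameter values in cfg and measuring (execution time, peak memory)."""
    time_cost = sum(v * (i + 1) for i, v in enumerate(cfg)) / 1000.0
    mem_cost = sum(hi - v for v, (_, _, hi) in zip(cfg, PARAMS)) / 10.0
    return (time_cost, mem_cost)  # both objectives are minimised

def dominates(a, b):
    """Pareto dominance for minimisation of both objectives."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def random_cfg():
    return [random.randint(lo, hi) for _, lo, hi in PARAMS]

def mutate(cfg, rate=0.3):
    return [random.randint(lo, hi) if random.random() < rate else v
            for v, (_, lo, hi) in zip(cfg, PARAMS)]

def bi_objective_search(pop_size=20, generations=50):
    pop = [random_cfg() for _ in range(pop_size)]
    for _ in range(generations):
        scored = [(cfg, measure(cfg)) for cfg in pop]
        # Keep the non-dominated front, then refill the population by mutation.
        front = [c for c, f in scored
                 if not any(dominates(g, f) for _, g in scored)]
        pop = front + [mutate(random.choice(front))
                       for _ in range(pop_size - len(front))]
    # Return the final approximation of the time/memory trade-off front.
    scored = [(cfg, measure(cfg)) for cfg in pop]
    return [(c, f) for c, f in scored
            if not any(dominates(g, f) for _, g in scored)]

if __name__ == "__main__":
    for cfg, (t, m) in bi_objective_search():
        print(dict(zip([n for n, _, _ in PARAMS], cfg)), "time:", t, "memory:", m)

In the actual work the two objectives are measured on real executions of the subject programs, as described in the abstract; this sketch only reproduces the overall structure of the search over shallow and deep parameters.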