As in any other GP approach, the objective is to find an executable program, program fragment, or function, which will achieve a good fitness value for a given objective function. In most published work on GP, a LISP-style tree-structured expression is directly manipulated, whereas GE applies genetic operators to an integer string, subsequently mapped to a program (or similar) through the use of a grammar, which is typically expressed in Backus–Naur form. One of the benefits of GE is that this mapping simplifies the application of search to different programming languages and other structures.
Problem addressed
In type-free, conventional Koza-style GP, the function set must meet the requirement of closure: every function must be able to accept as its arguments the output of any other function in the function set. This is usually implemented by operating on a single data type, such as double-precision floating point. While modern genetic programming frameworks support typing, such type systems have limitations from which grammatical evolution does not suffer.
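To illustrate what closure looks like in practice, the sketch below shows a hypothetical single-type function set in Python, with division "protected" so that every primitive both consumes and returns floating-point values; the names and primitives are illustrative and not drawn from any particular GP framework.

```python
# An illustrative (hypothetical) closed function set over a single type:
# every primitive takes floats and returns a float, so any output can
# legally feed any input.
def protected_div(a, b):
    # Division is "protected" so it still returns a float when b == 0,
    # preserving closure instead of raising an exception.
    return a / b if b != 0 else 1.0

FUNCTION_SET = {
    "+": lambda a, b: a + b,
    "-": lambda a, b: a - b,
    "*": lambda a, b: a * b,
    "/": protected_div,
}
```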
GE's solution
GE offers a solution to the single-type limitation by evolving solutions according to a user-specified grammar, usually a grammar in Backus–Naur form. The search space can therefore be restricted, and domain knowledge of the problem can be incorporated. The inspiration for this approach comes from a desire to separate the "genotype" from the "phenotype": in GP, the objects the search algorithm operates on and the objects the fitness evaluation function interprets are one and the same. In contrast, GE's "genotypes" are ordered lists of integers which code for selecting rules from the provided context-free grammar. The phenotype, however, is the same as in Koza-style GP: a tree-like structure that is evaluated recursively. This model is more in line with how genetics works in nature, where there is a separation between an organism's genotype and the final expression of the phenotype in proteins, etc.
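As a rough sketch of how the mapping works, the Python snippet below uses a small hypothetical grammar and chooses a production for each non-terminal by taking the next codon modulo the number of available productions, with optional "wrapping" when the genome runs out; the grammar, genome, and wrapping limit are illustrative assumptions rather than a standard benchmark setup.

```python
# A minimal sketch of GE's genotype-to-phenotype mapping (illustrative only).

# Context-free grammar as a dict: non-terminal -> list of productions.
GRAMMAR = {
    "<expr>": [["<expr>", "<op>", "<expr>"], ["<var>"]],
    "<op>":   [["+"], ["-"], ["*"]],
    "<var>":  [["x"], ["1.0"]],
}

def map_genotype(genome, start="<expr>", max_wraps=2):
    """Expand the start symbol left to right, using one codon per choice.

    Each codon selects a production via `codon % number_of_productions`.
    If the genome is exhausted, it is reused ("wrapping") up to max_wraps times.
    """
    symbols = [start]          # work list of symbols still to expand
    output = []                # terminals collected so far
    i, wraps = 0, 0
    while symbols:
        sym = symbols.pop(0)
        if sym in GRAMMAR:
            if i == len(genome):            # ran out of codons: wrap or give up
                if wraps == max_wraps:
                    return None             # invalid individual (worst fitness)
                i, wraps = 0, wraps + 1
            choices = GRAMMAR[sym]
            chosen = choices[genome[i] % len(choices)]
            i += 1
            symbols = list(chosen) + symbols  # expand the leftmost non-terminal
        else:
            output.append(sym)                # terminal: emit as-is
    return " ".join(output)

# An arbitrary integer genome maps deterministically to an expression.
print(map_genotype([0, 1, 0, 0, 1, 1]))   # -> "x + 1.0"
```

Here the genome [0, 1, 0, 0, 1, 1] deterministically derives the expression x + 1.0; changing a single codon can change which productions are chosen and hence the resulting phenotype.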
Separating the genotype from the phenotype allows a modular approach. In particular, the search portion of the GE paradigm need not be carried out by any one particular algorithm or method. Because the objects GE searches over are the same as those used in genetic algorithms, any existing genetic algorithm package, such as the popular GAlib, can in principle be used to carry out the search; a developer implementing a GE system need only implement the mapping from a list of integers to a program tree. It is also possible, in principle, to perform the search using some other method, such as particle swarm optimization (see the remark below); the modular nature of GE creates many opportunities for hybrids, as the problem being solved dictates.
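As a sketch of how little the search layer needs to know about the grammar, the following mutation-only evolutionary loop treats individuals purely as lists of integers. It assumes the GRAMMAR and map_genotype from the previous sketch are in scope; the toy fitness function (which uses eval only to keep the sketch short) and all parameter settings are illustrative, not a recommended configuration.

```python
# Assumes GRAMMAR and map_genotype from the previous sketch are already defined.
import random

def fitness(genome):
    phenotype = map_genotype(genome)
    if phenotype is None:
        return float("inf")                   # penalise invalid mappings
    # Toy objective: how close is the expression's value at x = 3 to 10?
    return abs(eval(phenotype, {"x": 3.0}) - 10.0)

def evolve(pop_size=50, genome_len=20, codon_max=255, generations=100):
    # The search side only ever sees integer vectors; any GA library could
    # stand in for this loop.
    pop = [[random.randint(0, codon_max) for _ in range(genome_len)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        parents = pop[: pop_size // 2]        # truncation selection
        children = []
        for p in parents:
            child = p[:]
            k = random.randrange(genome_len)  # point mutation of one codon
            child[k] = random.randint(0, codon_max)
            children.append(child)
        pop = parents + children
    return min(pop, key=fitness)

best = evolve()
print(map_genotype(best), fitness(best))
```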
Brabazon and O'Neill have successfully applied GE to predicting corporate bankruptcy, forecasting stock indices, bond credit ratings, and other financial applications.[citation needed] GE has also been used with a classic predator-prey model to explore the impact of parameters such as predator efficiency, niche number, and random mutations on ecological stability.[2]
It is possible to structure a GE grammar that, for a given function and terminal set, is equivalent to standard genetic programming.
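For example, a Koza-style function set such as {+, -, *, sin} with terminals {x, 1.0} could be expressed as the grammar below, shown in the same Python dict form as the earlier sketch; the chosen primitives are arbitrary examples.

```python
# Hypothetical sketch: a grammar whose derivations mirror GP expression trees
# over the function set {+, -, *, sin} and terminal set {x, 1.0}.
GP_EQUIVALENT_GRAMMAR = {
    "<expr>": [
        ["(", "<expr>", "<op>", "<expr>", ")"],   # binary functions
        ["sin", "(", "<expr>", ")"],              # unary function
        ["<terminal>"],
    ],
    "<op>":       [["+"], ["-"], ["*"]],
    "<terminal>": [["x"], ["1.0"]],
}
```

Every derivation of &lt;expr&gt; then corresponds to an expression tree over those primitives, so GE restricted to this grammar searches essentially the same space of phenotypes as tree-based GP with that function and terminal set.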
Criticism
Despite its successes, GE has been the subject of some criticism. One issue is that, as a result of its mapping operation, GE's genetic operators do not achieve high locality,[3][4] a property of genetic operators that is highly regarded in evolutionary algorithms.[3]
Variants
Although GE was originally described in terms of using an evolutionary algorithm, specifically a genetic algorithm, other variants exist. For example, GE researchers have experimented with using particle swarm optimization to carry out the search instead of a genetic algorithm, an approach referred to as a "grammatical swarm". Using only the basic PSO model, the results were comparable to those of normal GE, suggesting that PSO is probably as capable of carrying out the search process in GE as simple genetic algorithms are. (Although PSO is normally a floating-point search paradigm, it can be discretized for use with GE, e.g., by simply rounding each component of the position vector to the nearest integer.)
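A minimal sketch of that rounding step might look as follows; the helper name is hypothetical.

```python
# Turn a continuous PSO position vector into an integer codon string
# before handing it to the usual GE mapper (illustrative helper).
def position_to_genome(position, codon_max=255):
    # Round each coordinate to the nearest integer and clamp to the codon range.
    return [min(max(int(round(v)), 0), codon_max) for v in position]

print(position_to_genome([3.7, -1.2, 260.4, 17.0]))   # -> [4, 0, 255, 17]
```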
Another variation that has been experimented with in the literature is encoding semantic information in the grammar in order to further bias the search process. Other work has shown that, with biased grammars that leverage domain knowledge, even random search can be used to drive GE.[5]
Related work
GE was originally a combination of the linear representation used by the Genetic Algorithm for Developing Software (GADS)[citation needed] and Backus–Naur form grammars, which had originally been used in tree-based GP by Wong and Leung[6] in 1995 and by Whigham in 1996.[7] Other related work noted in the original GE paper was that of Frédéric Gruau,[8] who used a conceptually similar "embryonic" approach, and that of Keller and Banzhaf,[9] which similarly used linear genomes.
Implementations
There are several implementations of GE.
2. Alfonseca, Manuel; Soler Gil, Francisco José (2 January 2015). "Evolving a predator–prey ecosystem of mathematical expressions with grammatical evolution". Complexity. 20 (3): 66–83. Bibcode:2015Cmplx..20c..66A. doi:10.1002/cplx.21507. hdl:10486/663611.
7. Whigham, P. (1996). "Search Bias, Language Bias and Genetic Programming". S2CID 16631215.
8. Gruau, Frédéric (1994). Neural Network Synthesis Using Cellular Encoding and the Genetic Algorithm. CiteSeerX 10.1.1.29.5939.
9. Keller, Robert E. (1996). "Genetic Programming Using Mutation, Reproduction and Genotype–Phenotype Mapping from Linear Binary Genomes into Linear LALR(1) Phenotypes". S2CID 18095204.