Genetic Algorithms for Optimization
A genetic algorithm is a search heuristic that optimizes candidate solutions according to a set of rules. The algorithm operates on strings of data that encode an object, and its purpose is to select the best-performing output from a programmed environment. A simple example is evolving a string of text characters toward the target "Hello World!". First, an environment is built that can generate strings of random text and store the results. A genetic algorithm is then applied to those random strings, assigning each a fitness value based on how closely it matches "Hello World!". The next step can be performed in several ways, but a simple one is to duplicate the string with the best fitness value and randomly mutate the copies. This produces the next generation of strings, which is in turn scored, replicated, and mutated. Because each new generation is scored and only strings with sufficiently high fitness are replicated, the output steadily comes to resemble "Hello World!".
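The loop described above can be sketched in a few lines of Python. This is a minimal illustration, not a full genetic algorithm: it assumes a character-match fitness function, duplicates only the single best string each generation (with one unmutated copy kept as a safeguard), and the population size and mutation rate are arbitrary choices.

```python
import random
import string

random.seed(42)  # fixed seed so runs are reproducible

TARGET = "Hello World!"
# Pool of characters mutation can draw from (an assumption; any
# superset of the target's characters would work).
CHARS = string.ascii_letters + string.punctuation + " "

def fitness(text: str) -> int:
    """Number of positions where `text` matches the target."""
    return sum(a == b for a, b in zip(text, TARGET))

def mutate(text: str, rate: float = 0.05) -> str:
    """Independently replace each character with probability `rate`."""
    return "".join(random.choice(CHARS) if random.random() < rate else c
                   for c in text)

def evolve(pop_size: int = 100, max_generations: int = 20000):
    # Generation 0: completely random strings of the right length.
    population = ["".join(random.choice(CHARS) for _ in TARGET)
                  for _ in range(pop_size)]
    for generation in range(max_generations):
        best = max(population, key=fitness)
        if best == TARGET:
            return best, generation
        # Keep the best string unchanged and fill the rest of the
        # next generation with mutated copies of it.
        population = [best] + [mutate(best) for _ in range(pop_size - 1)]
    return best, max_generations

best, gen = evolve()
print(f"reached {best!r} after {gen} generations")
```

Because mutation is random, the number of generations needed varies from run to run, which foreshadows a point made later: convergence time for a genetic algorithm is itself unpredictable.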
Genetic algorithms can be used for much more interesting purposes, as long as a specific goal for the output can be well defined. A popular exercise that treats the genetic algorithm as a true evolutionary process is randomly generating virtual creatures. The programmed environment is significantly more complex than in the "Hello World!" example, but the genetic algorithm itself is nearly identical. The environment needs a way to randomly generate geometry, randomly apply movements to the joints that geometry creates, and simulate physics on the geometry accurately (mass, gravity, collisions, forces, etc.). If the user's goal is a creature capable of walking (swimming or flying would also be possible), all the genetic algorithm has to measure is how far each collection of geometry moves from its starting position. For example, if the environment slapped together a collection of cubes that happened to wiggle in a way that pushed it away from where it was created, that collection would be given a high fitness value; by comparison, a collection of cubes that does not travel far would be given a low fitness value. The collections with high fitness values replicate and mutate, producing the next generation of creatures, which is tested in turn. Over many hundreds if not thousands of generations, this process tends to build highly functional walking motions for geometry that can come to resemble real animals.
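The distance-traveled fitness test could be sketched as follows. This is a hypothetical fragment: `population` and `results` stand in for whatever creature genomes and simulated position data the environment actually provides, and the `survivors` count is an arbitrary choice.

```python
import math

def distance_fitness(start, end):
    """Fitness = straight-line distance between the creature's
    spawn point and its position when the simulation ends."""
    dx, dy = end[0] - start[0], end[1] - start[1]
    return math.hypot(dx, dy)

def select_parents(population, results, survivors=10):
    """Keep the creatures that wandered farthest from where they spawned.

    `population` is a list of creature genomes; `results[i]` is the
    (start, end) position pair the physics simulation reported for
    creature i (both are placeholders for the environment's real data).
    """
    ranked = sorted(range(len(population)),
                    key=lambda i: distance_fitness(*results[i]),
                    reverse=True)
    return [population[i] for i in ranked[:survivors]]
```

The survivors returned here would then be duplicated and mutated to form the next generation, exactly as in the text example.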
Although the last example serves a more theoretical, even recreational purpose, the same approach could be modified to produce highly functional engineering designs. For example, to create a more aerodynamically efficient aircraft, all that would be required is a virtual model of the current aircraft and an environment that can apply physics to that model accurately. The genetic algorithm would randomly modify the surfaces of the aircraft, fly the virtual model through simulated air resistance, and check whether each random modification was beneficial. Beneficial modifications would be kept, and the modified model duplicated and further altered in future generations. The best model(s) of each generation would be remembered and stored so that the user could access them whenever they decided the process was finished. Given enough time, this process would produce a speed-optimized model, assuming the physics simulations were accurate. It is worth noting, however, that because this process is random, the number of generations it could take is also unpredictable. To speed up the algorithm, it can be combined with a neural network that finds patterns in which kinds of modifications were beneficial and biases mutation toward them.
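The "remember the best model of any generation" step can be sketched as a simple record keeper. This is an illustrative structure only; the model here is an arbitrary placeholder object, and a real system would persist models to disk rather than hold them in memory.

```python
import copy

class HallOfFame:
    """Remember the best-scoring model seen in any generation, so the
    user can retrieve it whenever they decide the run is finished."""

    def __init__(self):
        self.best_score = float("-inf")
        self.best_model = None
        self.history = []  # (generation, score) each time a record is set

    def update(self, generation, model, score):
        """Record `model` if it beats every model seen so far."""
        if score > self.best_score:
            self.best_score = score
            # Deep-copy so later mutations of `model` don't alter the record.
            self.best_model = copy.deepcopy(model)
            self.history.append((generation, score))

hof = HallOfFame()
hof.update(0, {"wing_sweep": 25.0}, score=10.0)  # placeholder model data
hof.update(1, {"wing_sweep": 27.5}, score=12.0)
hof.update(2, {"wing_sweep": 24.0}, score=8.0)   # worse; not recorded
```

Because worse generations are simply ignored, the stored best model can only improve over time, no matter how erratically the random search behaves.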
Article Written by: Alex Simes