PyEC provides implementations of several well-known optimization methods, with a particular focus on methods from evolutionary and natural computation. If you are just interested in optimizing functions, see the section on Basic Usage below.
For the time being, this code is in an experimental stage and will likely change substantially with the next few minor versions.
PyEC is intended for three main groups of people:
Anyone who needs to optimize functions. PyEC can be used as a drop-in module to perform optimization quickly and accurately. The most common optimization methods have been packaged in an easily accessible form for this purpose, as described in Basic Usage below.
Researchers in Evolutionary Computation. PyEC contains tools to construct, modify, and parameterize optimization methods. It can be used as a toolbox for the researcher in evolutionary computation to construct evolutionary algorithms on complex search domains, or to test new genetic operators or new combinations of existing evolutionary components. These features are not well documented yet, but the class-specific Documentation should help you get started.
Anyone interested in Stochastic Optimization. PyEC efficiently implements the most common methods in stochastic optimization. It can be used to test and explore how different algorithms work and what effects different parameter settings can have.
PyEC is distributed on PyPI and can be installed with setuptools under the name pyec, like so:
$ easy_install pyec
PyEC depends on the linear algebra package NumPy <http://numpy.scipy.org/> and the scientific computing package SciPy <http://www.scipy.org/>.
PyEC provides various optimization methods. If you just want to optimize, the following examples should help. PyEC provides easily accessible optimization routines for Evolutionary Annealing, Differential Evolution, Nelder-Mead, Generating Set Search, CMA-ES, Particle Swarm Optimization, and Simulated Annealing.
Start a Python terminal session as follows:
Python 2.6.5 (r265:79359, Mar 24 2010, 01:32:55)
[GCC 4.0.1 (Apple Inc. build 5493)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> import pyec.optimize
To start with, we are going to optimize Branin’s function, a real-valued function in two dimensions. It is defined as

\(f(x, y) = \left(y - \frac{5.1}{4\pi^2}x^2 + \frac{5}{\pi}x - 6\right)^2 + 10\left(1 - \frac{1}{8\pi}\right)\cos(x) + 10\)

and it has three global optima with \(f(-\pi, 12.275) = f(\pi, 2.275) = f(9.42478, 2.475) = 0.39788735\). Usually, the optimization is constrained by \(-5 < x < 10\) and \(0 < y < 15\):
>>> from numpy import *
>>> def branin(x):
... return (x[1] - (5.1/(4*(pi**2)))*(x[0]**2) + (5./pi)*x[0] - 6.)**2 + 10.*(1. - 1./(8.*pi))*cos(x[0]) + 10.
...
Here are some examples of PyEC on Branin’s function with no constraints, on about 2,500 function evaluations (default):
>>> pyec.optimize.evolutionary_annealing(branin, dimension=2)
(array([ 3.14159266, 2.27500002]), 0.39788735772973816)
>>> pyec.optimize.differential_evolution(branin, dimension=2)
(array([ 3.14100091, 2.25785439]), 0.39819904998361366)
>>> pyec.optimize.cmaes(branin, dimension=2)
(array([ 3.14159266, 2.27500001]), 0.39788735772973816)
>>> pyec.optimize.nelder_mead(branin, dimension=2)
(array([ 3.14159308, 2.27499873]), 0.39788735773149675)
>>> pyec.optimize.generating_set_search(branin, dimension=2)
(array([ 3.12552433, 2.29660443]), 0.39920864246465015)
>>> pyec.optimize.particle_swarm_optimization(branin, dimension=2)
(array([ 3.1282098 , 2.26735578]), 0.39907497646047574)
In these examples, all we had to do was pass our function to one of PyEC’s optimizers along with the dimension (Branin’s function has two input variables), and these methods used default configurations to locate the global minimum (evolutionary_annealing() additionally requires the space_scale parameter for unconstrained optimization). The return value is a tuple with two items: the first element is the best solution found so far, and the second is the function value at that solution. In this case, many of the optimizers were not particularly accurate. Some methods missed the global minimum by an error on the order of 0.01, which is much larger than we would prefer, especially for such a simple problem. Most of these methods can be made more accurate by replacing some of the default parameters:
>>> pyec.optimize.differential_evolution(branin, dimension=2, generations=250, population=50, CR=.2, F=.2)
(array([ 3.14159085, 2.27498739]), 0.39788735794177654)
>>> pyec.optimize.generating_set_search(branin, dimension=2, generations=5000, expansion_factor=1., initial_step=0.5)
(array([ 3.1386125 , 2.27810758]), 0.3979306095569175)
>>> pyec.optimize.particle_swarm_optimization(branin, dimension=2, omega=0.5, phi_p=0.0)
(array([ 3.14159258, 2.27500003]), 0.39788735772976125)
Now all but one of the methods have found the global optimum accurately up to eight decimal places.
If we had wished to find the maximum values rather than the minimum values of the function, we could have passed the parameter minimize=False to any of these optimizers:
>>> pyec.optimize.differential_evolution(branin, dimension=2, minimize=False)
In general, a function \(f(x)\) can be maximized by minimizing the alternate function \(-f(x)\) instead, which is what PyEC does internally when minimize=False.
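The identity is easy to check in plain Python (no PyEC needed); here a toy parabola, not taken from PyEC, stands in for the objective:

```python
# Maximizing f is equivalent to minimizing g(x) = -f(x).
# Toy example: a parabola with its maximum at x = 2.
def f(x):
    return -(x - 2.0) ** 2

def g(x):
    return -f(x)

# Search the same grid both ways; the argmax of f equals the argmin of g.
grid = [i * 0.001 for i in range(4001)]  # points in [0, 4]
x_max = max(grid, key=f)
x_min = min(grid, key=g)
assert x_max == x_min
```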
Branin’s function is relatively easy to optimize; we would like to try a harder function. PyEC ships with several benchmark multimodal functions, many of which are defined in higher dimensions as well. These benchmarks can be referenced by name when calling PyEC’s optimizers. One example is Rastrigin’s function (see <http://en.wikipedia.org/wiki/Rastrigin_function>):
def rastrigin(x):
    return 10 * len(x) + ((x ** 2) - 10 * cos(2 * pi * x)).sum()
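The same benchmark can be written in plain Python (without NumPy) and sanity-checked; by construction its value at the origin is zero:

```python
from math import cos, pi

def rastrigin(x):
    # 10n + sum(x_i^2 - 10 cos(2 pi x_i)) -- same formula as above
    return 10 * len(x) + sum(xi ** 2 - 10 * cos(2 * pi * xi) for xi in x)

assert rastrigin([0.0] * 10) == 0.0   # global minimum at the origin
assert rastrigin([1.0] * 10) > 0.0    # any other point scores worse
```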
Rastrigin’s has a minimum at the origin, where its function value is 0. PyEC’s optimizers can find this minimum with a little tweaking:
>>> pyec.optimize.differential_evolution("rastrigin", dimension=10, generations=2500, population=100, CR=.2, F=.2)
(array([ 3.09226981e-05, 2.19169568e-05, 4.46486498e-06,
         1.50452001e-05, 6.03987807e-05, 6.17905562e-06,
         4.82476074e-05, 1.02580314e-05, 2.07212921e-05,
         9.15748483e-06]), 1.6496982624403245e-06)
PyEC’s optimizers are stochastic; that is, they search the space randomly. No two runs of any optimizer are the same. You can get widely varying results from different runs of the same algorithm, so it’s best to run an algorithm a few times if you’re not satisfied with the results:
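Because results vary across runs, a common pattern is to take the best of several independent restarts. The sketch below illustrates the idea with a toy random-search optimizer rather than a PyEC one, so it has no dependencies:

```python
import random

def sphere(x):
    # simple test objective with its minimum (0) at the origin
    return sum(xi ** 2 for xi in x)

def random_search(f, dim, evals, lo=-5.0, hi=5.0, rng=random):
    """Toy stochastic optimizer: sample points uniformly, keep the best."""
    best_x, best_val = None, float("inf")
    for _ in range(evals):
        x = [rng.uniform(lo, hi) for _ in range(dim)]
        val = f(x)
        if val < best_val:
            best_x, best_val = x, val
    return best_x, best_val

# run five independent restarts and keep the best result overall
rng = random.Random(0)
runs = [random_search(sphere, 2, 500, rng=rng) for _ in range(5)]
best = min(runs, key=lambda r: r[1])
```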
>>> pyec.optimize.cmaes("rastrigin", dimension=10, generations=2500,population=250)
(array([ 1.48258949e-03, 2.25335429e-04, 5.35427662e-04,
         2.74244483e-03, 3.20044246e-03, 4.59549462e-03,
         2.09654701e-03, 9.93491865e-01, 8.95951435e-04,
         9.95219709e-01]), 1.9996060669315057)
>>> pyec.optimize.cmaes("rastrigin", dimension=10, generations=2500,population=250)
(array([ 4.26366209e-04, 7.29513508e-04, 5.97365406e-04,
         9.93842635e-01, 4.47482962e-04, 3.32484925e-03,
         3.98886672e-03, 4.06692711e-04, 1.49134732e-03,
         3.80257643e-03]), 1.0041502969750979)
>>> pyec.optimize.cmaes("rastrigin", dimension=10, generations=2500,population=250)
(array([ 4.24651080e-04, 7.78200373e-04, 4.80037528e-03,
         1.06188871e-03, 2.50392639e-04, 3.00255770e-03,
         9.91998151e-01, 5.52063421e-03, 3.44827888e-03,
         9.97582491e-01]), 2.0081777356006967)
Every optimizer performs best on some set of problems and worse on others. Here, differential_evolution() performs well on rastrigin, whereas CMA-ES is less reliable on this function. The opposite situation can hold true for other optimization problems.
PyEC supports constrained optimization as well. Hyperrectangle boundaries can be imposed using pyec.optimize.Hyperrectangle:
>>> pyec.optimize.evolutionary_annealing(lambda x: x[0]-x[1], dimension=2, constraint=pyec.optimize.Hyperrectangle(center=5.0,scale=2.5))
(array([ 2.5, 7.5]), -5.0)
Other constraints can be implemented by creating a subclass of pyec.optimize.Constraint and passing it to the optimizer via the keyword argument constraint.
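The Hyperrectangle above defines the box \([center - scale, center + scale]\) in each coordinate. A membership test for such a region is straightforward; the helper below is purely illustrative and not part of PyEC:

```python
def in_hyperrectangle(x, center=5.0, scale=2.5):
    """True if every coordinate lies within [center - scale, center + scale]."""
    return all(center - scale <= xi <= center + scale for xi in x)

# the point (2.5, 7.5) found above sits on the boundary of the box
assert in_hyperrectangle([2.5, 7.5])
assert not in_hyperrectangle([0.0, 0.0])
```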
Version 0.3 has substantially changed the internal representation of optimizers with the following goals:
Future versions will offer support for features such as:
If you’re just interested in optimization, then the methods in pyec.optimize should be sufficient. But if you are interested in developing new stochastic iterative optimization algorithms or augmenting those provided by PyEC, then you’ll need to understand the architecture of PyEC.
As discussed in the introduction, an optimization problem consists of a function to be optimized (the objective function, fitness function, or cost function) and a set of inputs to the function that are acceptable as optima (the search domain).
In PyEC, the search domain is an instance of pyec.Space. This object provides information about the type of the function inputs. Much work on optimization distinguishes between the search space (the set of inputs on which the objective function is defined) and the search domain (the portion of the search space that is acceptable as a solution), but PyEC does not make this distinction, so an instance of pyec.Space specifies both the space and the feasible constraints. Theoretically, PyEC is intended to work on topological spaces, and the search domain is the topological restriction of the search space to a particular subset. Practically, when a restricted domain is required, a subclass of the general search space provides the appropriate methods, such as determining whether a point is within the feasible region.
From a theoretical perspective, PyEC regards an optimization method as a conditional probability distribution that probabilistically generates a batch sample of possible solutions (the “population”) given the previously evaluated samples and their scores on the objective function. All optimization methods inherit from pyec.distribution.basic.PopulationDistribution.
Most optimization methods depend on various parameters for their operation. In PyEC, these parameters are wrapped in a pyec.Config object. Every optimization method in PyEC provides its own defaults on a class variable named config. For example, DifferentialEvolution has a crossover rate of 0.2 (CR = 0.2) by default. PyEC provides a means to modify these configuration parameters before instantiation using the [] operator, e.g.:
>>> from pyec.config import Config
>>> from pyec.distribution.de import DifferentialEvolution
>>> DifferentialEvolution.config.CR
0.2
>>> DE_CR_05 = DifferentialEvolution[Config(CR=0.5)]
>>> DE_CR_05.config.CR
0.5
In this way, it is easy to configure algorithms to run as needed. Under the hood, DE_CR_05 is a new class with its own Config object, distinct from the config property of DifferentialEvolution. These are both in turn distinct from the config on PopulationDistribution. So, for example:
>>> PopulationDistribution.config.__properties__
{'stats': <pyec.util.RunStats.RunStats object at 0x10375b610>,
'observer': None, 'space': None, 'minimize': True, 'populationSize': 1,
'history': <class 'pyec.history.SortedMarkovHistory'>}
>>> DifferentialEvolution.config.__properties__
{'F': 0.5, 'space': <pyec.space.Euclidean object at 0x10375bdd0>,
'initial': None, 'CR': 0.2, 'populationSize': 100,
'history': <class 'pyec.history.LocalBestHistory'>}
>>> DE_CR_05.config.__properties__
{'CR': 0.5}
In this case, DE_CR_05 is a subclass of DifferentialEvolution, which is a subclass of PopulationDistribution. So what happens when an operator is instantiated?:
>>> DifferentialEvolution().config.__properties__
{'F': 0.5, 'stats': <pyec.util.RunStats.RunStats object at 0x10375b610>,
'observer': None, 'space': <pyec.space.Euclidean object at 0x10375bdd0>,
'minimize': True, 'initial': None, 'CR': 0.2, 'populationSize': 100,
'history': <class 'pyec.history.LocalBestHistory'>}
>>> DE_CR_05().config.__properties__
{'F': 0.5, 'stats': <pyec.util.RunStats.RunStats object at 0x10375b610>,
'observer': None, 'space': <pyec.space.Euclidean object at 0x10375bdd0>,
'minimize': True, 'initial': None, 'CR': 0.5, 'populationSize': 100,
'history': <class 'pyec.history.LocalBestHistory'>}
Once the algorithm is instantiated, it has its own config, and this config has properties from each of its parent classes. The instance config takes on its default value from the lowest parent class in the chain that defines it. So, CR=0.5 in DE_CR_05 overrides CR=0.2 in DifferentialEvolution.
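The lookup rule can be mimicked with plain dictionaries: merge the __properties__ dicts from the base class down to the most-derived class, letting later entries override earlier ones. This sketch uses made-up dicts shaped like the outputs above, not PyEC itself:

```python
def effective_config(chain):
    """Merge per-class property dicts; later (more derived) entries win."""
    merged = {}
    for props in chain:
        merged.update(props)
    return merged

base = {"minimize": True, "populationSize": 1}       # like PopulationDistribution
de = {"F": 0.5, "CR": 0.2, "populationSize": 100}    # like DifferentialEvolution
de_cr_05 = {"CR": 0.5}                               # like DE_CR_05

config = effective_config([base, de, de_cr_05])
assert config["CR"] == 0.5           # overridden by the most-derived class
assert config["F"] == 0.5            # inherited from the middle class
assert config["populationSize"] == 100
assert config["minimize"] is True    # inherited from the base
```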
Once an optimizer has been configured, it can be run. Running an iterative optimizer means that the optimizer is used to generate a sequence of samples (populations) in the search domain. The optimizer uses the results of previous samples and their scores in order to generate the next sample.
Operationally, PyEC separates the state of an optimization run from the logic used to generate new points for evaluation. The state of a particular optimization run is maintained in a History object. The History object should contain sufficient information for the algorithm to regenerate its state, and the PopulationDistribution defining the optimizer should not contain any state outside the History object that cannot be rebuilt from the History. Each algorithm must specify the History class that it normally uses in its config object. Additionally, each optimizer may override the method compatible that is used to check whether a History instance is compatible with the optimizer. The default history is a MarkovHistory, which stores only the last set of samples and their scores. MarkovHistory is sufficient to implement many genetic algorithms.
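As a rough illustration of what a Markov-style history keeps (a sketch, not PyEC's actual MarkovHistory class), it suffices to retain the last scored population plus the best point seen so far:

```python
class MarkovHistorySketch:
    """Stores only the most recent scored population and the running best."""
    def __init__(self):
        self.last_population = None   # list of (solution, score) pairs
        self.best = None              # (solution, score) with the lowest score

    def update(self, scored_population):
        self.last_population = scored_population
        candidate = min(scored_population, key=lambda pair: pair[1])
        if self.best is None or candidate[1] < self.best[1]:
            self.best = candidate

h = MarkovHistorySketch()
h.update([([1.0], 1.0), ([0.5], 0.25)])
h.update([([2.0], 4.0), ([3.0], 9.0)])   # a worse generation; best is retained
assert h.best == ([0.5], 0.25)
```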
A PyEC optimizer is first updated with a history and a fitness function, and then sampled to select new evaluation points to test. These two steps are repeated indefinitely, so that the optimizer is run like so:
>>> de = DifferentialEvolution()
>>> f = lambda x: sum(x ** 2)
>>> pop = de[None, f]()
>>> history = de.history
>>> for i in xrange(100):
... history.update(pop, f, de.config.space, de)
... pop = de[history, f]()
...
>>> history.minimal()
(array([ 8.33013471e24]), 6.9391144322723016e47)
In this code, an instance of DifferentialEvolution is created to minimize a basic parabola in one dimension. A blank history of the default class is created by passing None along with the objective function into the [] operator, and then the optimizer is immediately called to generate the initial population. The history is extracted, and then updated with the latest population, the objective function, the search domain, and the optimizer. The actual function evaluation (if necessary) occurs in the call to history.update. Finally, the optimizer is updated with the history, and a new population is generated. These last two steps are repeated, and at the end, the minimal point observed during the process is extracted by a call to history.minimal.
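The same update-then-sample loop appears in any ask/tell-style optimizer. Stripped of PyEC's operator syntax, the pattern looks like this (a toy hill climber, not PyEC code):

```python
import random

def run_ask_tell(step, f, initial, iterations, rng):
    """Generic loop: propose a sample, score it, feed the result back in."""
    state = initial
    for _ in range(iterations):
        candidate = step(state, rng)   # "sample": propose from current state
        if f(candidate) < f(state):    # "update": keep it only if it improves
            state = candidate
    return state

# Minimize a 1-D parabola using Gaussian perturbations of the current point.
f = lambda x: (x - 3.0) ** 2
step = lambda x, rng: x + rng.gauss(0.0, 0.5)
rng = random.Random(42)
best = run_ask_tell(step, f, initial=0.0, iterations=2000, rng=rng)
```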
There is a simpler way to run an optimizer, like so:
>>> de100 = (DifferentialEvolution << 100)()
>>> lastPop = de100[None, (lambda x: sum(x**2))]()
>>> de100.history.minimal()
(array([ 1.77413093e23]), 3.1475405527093268e46)
In this second example, the notation DifferentialEvolution << 100 represents an operation called self-convolution, which is equivalent to running DifferentialEvolution for 100 generations and then returning the final sample. Self-convolution is an operation on optimization algorithms. Behind the scenes, the << operation produces a new optimizer class that inherits from SelfConvolution. PyEC offers several operations for creating new optimizers beyond self-convolution. In fact, PyEC provides an algebra for optimizers, discussed next.
Viewed from a certain perspective, the space of all possible optimizers is a normed vector space, similar to the space of real numbers or the space of continuous functions (see Lockett and Miikkulainen, “Measure-Theoretic Analysis of Stochastic Optimization”, 2013). Thus, one can meaningfully speak of adding optimizers, or multiplying them by scalar values. There are also other operations that are useful.
PyEC supports several optimizer operations at both the class and the instance level. The main operations currently implemented are: (1) addition, (2) scalar multiplication, (3) convolution, (4) trajectory truncation, and (5) population splitting.
Addition of two optimizers indicates a probabilistic choice. Thus if we have \(\mathcal{A} + \mathcal{B}\), the result is a pseudo-optimizer that uses \(\mathcal{A}\) to produce the next generation/sample half the time, and uses \(\mathcal{B}\) the other half of the time. This is called a “pseudo-optimizer” because the result is not a probability distribution. To get a probability distribution, we have to normalize the coefficients; that is, \(1 + 1 = 2\), so we choose each optimizer with probability \(\frac{1}{2}\). PyEC keeps track of additive coefficients and normalizes automatically, so that every possible addition is a real optimizer.
Scalar multiplication simply changes the coefficients. If there is only one optimizer involved, this has no effect. But if there are multiple optimizers, then the coefficients express the frequency with which each optimizer is used to generate a distribution. PyEC makes one choice per generation. Note that this is different from the formal theory of Lockett and Miikkulainen, which allows one choice per individual.
So:
>>> from pyec.distribution.de import DifferentialEvolution
>>> from pyec.distribution.cmaes import Cmaes
>>> DE_Plus_CMAES = DifferentialEvolution + Cmaes
>>> DE_Plus_CMAES
<class 'pyec.distribution.convex.DifferentialEvolution_add_qqqCmaesqqq'>
>>> Half_DE_Plus_Half_CMAES = .5 * DifferentialEvolution + .5 * Cmaes
>>> Half_DE_Plus_Half_CMAES
<class 'pyec.distribution.convex.DifferentialEvolution_mult_qqq0_5qqq_add_qqqCmaes_mult_qqq0_5qqqqqq'>
>>> import pyec.optimize
>>> sum([pyec.optimize.optimize(DE_Plus_CMAES, lambda x: (x**2).sum())[1] for i in xrange(10)])/10
2.3966419053038709e20
>>> sum([pyec.optimize.optimize(Half_DE_Plus_Half_CMAES, lambda x: (x**2).sum())[1] for i in xrange(10)])/10
2.898181656813492e20
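Behind this behavior is simple coefficient normalization followed by one draw per generation. This generic sketch (not PyEC internals) shows the mechanism:

```python
import random

def choose_optimizer(optimizers, coefficients, rng):
    """Pick one optimizer per generation, with probability proportional
    to its coefficient (coefficients are normalized automatically)."""
    total = float(sum(coefficients))
    r = rng.random() * total
    acc = 0.0
    for opt, c in zip(optimizers, coefficients):
        acc += c
        if r < acc:
            return opt
    return optimizers[-1]  # guard against floating-point round-off

rng = random.Random(1)
picks = [choose_optimizer(["A", "B"], [1.0, 1.0], rng) for _ in range(10000)]
share_a = picks.count("A") / float(len(picks))
# with coefficients 1 + 1 = 2, each optimizer is chosen about half the time
assert 0.45 < share_a < 0.55
```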
If, rather than probabilistically choosing one optimizer or the other to generate the population, one wishes instead to have a certain percentage of the next population generated by a particular algorithm, then one can use the  operator to split the population:
>>> split_Half_DE_Half_CMAES = .5 * DifferentialEvolution  .5 * Cmaes
>>> sum([pyec.optimize.optimize(split_Half_DE_Half_CMAES, lambda x: (x**2).sum())[1] for i in xrange(10)])/10
An important concept in the theory of genetic algorithms is the convolution operator. A convolution of optimizers is a process in which a population is first chosen according to one optimizer, and the chosen population is then fed into a second optimizer as part of its history. Genetic algorithms can be defined using convolutions, and this is done in PyEC using the “<<” operator, e.g.:
SimpleGeneticAlgorithm = (
    Proportional << ((Proportional >> 1) <<
                     Crossover[_(crosser=OnePointDualCrosser)])
    << Bernoulli
)[_(space=BinaryReal(realDim=5))]
When the second argument to << is an integer, then selfconvolution is implied, and the optimizer is convolved with itself the specified number of times, as shown above.
Trajectory truncation is denoted by >>. If the second argument is an integer, the specified number of populations are truncated off of the population history. If the second argument is an optimizer, then the first optimizer is convolved with the second, but the second optimizer is trajectory truncated by one. Thus the following two are equivalent:
>>> OptimizerA >> OptimizerB
>>> OptimizerA << (OptimizerB >> 1)
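Self-convolution by an integer can be pictured as iterated application of a single-step operator. The toy sketch below (plain Python, not PyEC code) captures what << n does operationally:

```python
def self_convolve(step, n):
    """Return a function that applies `step` n times in sequence,
    analogous to `Optimizer << n` running n generations."""
    def run(state):
        for _ in range(n):
            state = step(state)
        return state
    return run

# one "generation" halves the distance to the target value 10.0
step = lambda x: x + 0.5 * (10.0 - x)
run100 = self_convolve(step, 100)
result = run100(0.0)
assert abs(result - 10.0) < 1e-9  # after 100 steps we are essentially at 10
```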
Each operation creates a new class. These classes are assigned a name that matches the operations that generated them. PyEC tries to reuse classes where possible.
In general, combining optimizers with these operations will not automatically create good optimizers, although in the case of addition and scalar multiplication, the performance change can be smooth as the coefficients are varied (see Lockett, “Measure-Theoretic Analysis of Performance in Evolutionary Algorithms”). However, these constructive operations can save a lot of time on coding and testing if used properly to implement new optimizers.
Copyright (C) 2012 Alan J Lockett
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the “Software”), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED “AS IS”, WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
Apply the Covariance Matrix Adaptation Evolution Strategy (CMA-ES) to optimize a function. See <http://en.wikipedia.org/wiki/CMA-ES>.
Calls optimize().
Extra keyword arguments:
Apply differential evolution (DE) to optimize a function. See <http://en.wikipedia.org/wiki/Differential_evolution>.
Calls optimize().
Extra keyword arguments:
Apply evolutionary annealing to optimize a function. See Chapter 11 of <http://nn.cs.utexas.edu/downloads/papers/lockett.thesis.pdf>.
Calls optimize().
Extra keyword arguments:
Apply a basic generating set search to optimize a function. See <http://smartfields.stanford.edu/documents/080403_kolda.pdf>.
Uses no search heuristic, and uses the d+1 size basis in dimension d.
Calls optimize().
Extra keyword arguments:
Apply the Nelder-Mead method to optimize a function. See <http://en.wikipedia.org/wiki/Nelder-Mead_method>.
Calls optimize().
Extra keyword arguments:
Configure and run an optimizer on a function.
By default the function will be minimized, but maximization can be performed by setting the keyword argument minimize to False.
Benchmark functions can be optimized by name. The following names are supported:
Parameters:

Returns: A tuple (best solution, best value), where the first element is the best solution located during optimization and the second is the value of the function at that solution.
Keyword arguments:
Apply particle swarm optimization to optimize a function. See <http://en.wikipedia.org/wiki/Particle_swarm_optimization>.
Calls optimize().
Extra keyword arguments:
Apply simulated annealing to optimize a function. See <http://en.wikipedia.org/wiki/Simulated_annealing>.
Calls optimize().
Extra keyword arguments:
A configuration object for an optimization algorithm. Each optimizer is created with a configuration object that is used to parameterize the optimizer instance. A configuration object may have arbitrary properties and methods. Default versions of methods used by several optimizers are provided.
The initializer allows arbitrary keywords so that config objects can be quickly created, e.g.
from pyec.config import Config as _
opt = SomeOptimizer[_(p=.5, q=.2)]
Given a second config object, produce a new config object that contains the properties of this config and the argument, with the argument given precedence.
The copy is shallow, so all objects will share any internal state.
Parameters:  cfg (Config) – A second configuration 

Returns:  A third Config object with the desired properties 
A builder to generate configuration objects with certain parameters.
A builder creates a specific configuration object for a specific optimization method. Its __init__ method is used to set the optimization class and the default parameters. The cfg property can then be modified to replace the defaults, and then configure can be called to generate an optimizer with the desired configuration. When an optimizer is instantiated, a copy of the default configuration is used, so that the builder can be reused.
Several default training parameters are placed into the Config object; view the source for details.
Parameters:  algcls (class) – A class object or other generator that produces a PopulationDistribution instance when called. 

Creates an optimizer instance and applies the built configuration to the optimizer.
Parameters: 


Returns:  The configured optimizer instance, usually a PopulationDistribution instance. 
Called by configure to install any final properties into the Config object after it has been copied. Properties that are changed by postConfigure are not shared among different optimizer instances. This method should be used to install any objects that contain state that is specific to an optimizer.
Parameters:  cfg (Config) – The copied Config object. 

An Exception to indicate that an optimizer provided as an argument cannot be run by Trainer.
A simple stats recording tool that can be used to aggregate the amount of time spent in various methods of an optimizer. Keeps track of variables by key names and outputs the time spent between start and stop for each key. For a recorded key, the average time spent between start and stop can be retrieved by [], like so:
def checkStats():
    stats = RunStats()
    # start the timer
    stats.start("test")
    ...
    # stop the timer
    stats.stop("test")
    # print just the key "test"
    print stats["test"]
    # print all
    print stats
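A minimal version of such a timer can be written in a few lines; this sketch (not the actual RunStats implementation) shows the start/stop/average mechanics:

```python
import time

class SimpleRunStats:
    """Accumulate wall-clock time between start/stop calls, keyed by name."""
    def __init__(self):
        self._totals = {}   # key -> total elapsed seconds
        self._counts = {}   # key -> number of completed intervals
        self._open = {}     # key -> start timestamp of an open interval

    def start(self, key):
        self._open[key] = time.time()

    def stop(self, key):
        elapsed = time.time() - self._open.pop(key)
        self._totals[key] = self._totals.get(key, 0.0) + elapsed
        self._counts[key] = self._counts.get(key, 0) + 1

    def __getitem__(self, key):
        # average time per start/stop interval for this key
        return self._totals[key] / self._counts[key]

stats = SimpleRunStats()
stats.start("test")
stats.stop("test")
assert stats["test"] >= 0.0
```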
Tool to run a population-based optimizer on a problem.
Parameters: 


An int indicating how to group the data.
A Distribution that can be sampled.
Parameters:  config (Config) – A set of configuration parameters for the distribution. 

A Gaussian Proposal Distribution. Used as an initial distribution by several algorithms, and as a proposal distribution by Simulated Annealing.
Config parameters:
* sd – The standard deviation for the Gaussian; may be an ndarray or float
* sd_factor – The factor by which to multiply/divide when adjusting the standard deviation dynamically
Params config:  The configuration parameters. 

Mixin.
Find a compatible subhistory for this suboptimizer.
Parameters:  sub (PopulationDistribution) – One of the suboptimizers for this convolution 

Returns:  A compatible History for the suboptimizer 
A distribution governing a populationbased optimizer.
This is the parent class for optimizers in PyEC. Its core methods are batch, inherited from Distribution, and update, which reports the population and the scores after a generation is complete.
Config parameters:
* populationSize – The size of each population, default 1
* history – The class of the History to use, default SortedMarkovHistory (may be overridden)
a callable object that returns a single solution, or None to use the space’s randomizer
which will be called with this instance and the next population when updated, e.g. self.config.report(self, pop). In this case, pop is a scored population, consisting of a list of (solution, score) tuples.
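The callback convention above (a scored population passed as a list of (solution, score) tuples) can be sketched as follows. This callback is an illustration of the calling convention only, not part of PyEC's API:

```python
# A sketch of a report callback: it is called as
# config.report(optimizer, pop), where pop is a list of
# (solution, score) tuples. This one extracts the best-scoring pair.
def report(optimizer, pop):
    best_solution, best_score = max(pop, key=lambda pair: pair[1])
    return best_solution, best_score

# Hypothetical scored population:
pop = [([0.1, 0.2], 3.5), ([0.4, 0.1], 7.2), ([0.9, 0.3], 1.1)]
best = report(None, pop)  # ([0.4, 0.1], 7.2)
```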
Params config:  The configuration parameters. 

Check whether the provided history is acceptable for this optimizer.
Parameters:  history (A History) – The history object that will be used to track the progress of the algorithm 

Returns:  bool – whether the history class is acceptable 
Convert a point to a scorable representation. Deprecated. Use Space.convert instead.
Params x:  The candidate solution (genotype) 

Returns:  The converted candidate (phenotype) 
Whether the optimizer uses the scores at all; used to prevent evaluation when not necessary in convolutions.
Returns:  bool – whether this optimizer needs scores 

Run this PopulationDistribution instance to maximize a fitness function.
After running this method, the property PopulationDistribution.trainer will contain the Trainer object used to optimize the function.
Parameters: 


Returns:  The History for the object 
Update the state of the PopulationDistribution based on the history.
Params history:  A History object with sufficient info to update the state of the optimizer 

Parameters:  fitness (Any callable) – A fitness function 
Metaclass for optimizers; defines class operations that generate new optimizers before instantiation.
A proposal distribution for e.g. simulated annealing.
Standard initialization for population distributions, inserted by PopulationDistributionMeta. Will call the provided initialization method as well.
Parameters: 


Copyright (C) 2012 Alan J Lockett
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the “Software”), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED “AS IS”, WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
RealGeneticAlgorithm uses linear ranking selection, uniform crossover, and Gaussian mutation.
alias of a generated convolution of Tournament selection, Crossover, and Bernoulli mutation
ElitistGeneticAlgorithm shows how to apply elitism; in this case, the top 10% of the population will be preserved for the next generation.
alias of a generated convolution of Ranking selection, Crossover, and Gaussian mutation
GeneticAlgorithm uses tournament selection over the entire population, uniform crossover, and Bernoulli mutation.
alias of a generated convolution of Proportional selection, Crossover, and Bernoulli mutation
Mainly, these files are just examples of how to make a genetic algorithm work. Genetic algorithms are just convolutions of standard components.
Initialize the 1996 version of CMAES.
A (10/3, 100)-ES with intermediate crossover.
alias of a generated convolution of EvolutionStrategySelection and EndogeneousGaussian mutation
A (10, 100)-ES.
alias of a generated convolution of EvolutionStrategySelection and EndogeneousGaussian mutation
A (10/3, 100)-ES with dominant crossover.
alias of a generated convolution of EvolutionStrategySelection, Crossover, and EndogeneousGaussian mutation
A (1+1)-ES; note the use of the EndogeneousProduct space.
Convenience function to configure the 1996 version of CMAES
Base class for annealed selection. See e.g.
Lockett, A. and Miikkulainen, R. Real-space Evolutionary Annealing, 2011.
For a more up-to-date version, see Chapters 11-13 of
<http://nn.cs.utexas.edu/downloads/papers/lockett.thesis.pdf>.
Config parameters (many like simulated annealing):
* schedule – A string, one of (“log”, “linear”, “discount”), or a callable. A callable will be passed the updates value in History and should return a floating point value that will divide the exponent in the Boltzmann distribution. That is, schedule(n) should converge to zero as n goes to infinity.
* divisor – Used by the schedules “log” and “linear”. As a divisor, it divides the temperature but multiplies the exponent.
* temp0 – An initial temperature for the temperature decay in “discount”.
* discount – Scales the rate of decline in the temperature; the temperature is multiplied by discount once each time the update method is called.
* separator – A SeparationAlgorithm for partitioning regions.
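The three named schedules can be sketched as plain functions. These formulas are illustrative interpretations of "log", "linear", and "discount" cooling, not necessarily PyEC's exact implementations; they only respect the stated convention that schedule(n) converges to zero as n grows:

```python
import math

# Illustrative cooling schedules (assumed formulas, not PyEC's code):
# each returns a temperature-like value that converges to zero.
def log_schedule(n, divisor=1.0):
    return 1.0 / (divisor * math.log(n + 2))

def linear_schedule(n, divisor=1.0):
    return 1.0 / (divisor * (n + 1))

def discount_schedule(n, temp0=1.0, discount=0.99):
    return temp0 * discount ** n
```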
Annealing and its descendants pass along a tuple; this class extracts the point for use with other methods.
A History that records all points and uses those points to partition the search domain using a Partition object. Also stores a ScoreTree and an AreaTree.
This class is memory intensive, so be careful about convolving or convexly combining it, since each of those combinations may cause multiple copies of this history to be stored in memory.
Proportional Annealing, as described in
Lockett, A. and Miikkulainen, R. Real-space Evolutionary Annealing (2011).
See Annealing for more details.
Config parameters:
Tournament Annealing, as described in Chapter 11 of
<http://nn.cs.utexas.edu/downloads/papers/lockett.thesis.pdf>
See Annealing for more details.
Select the best member of the prior population.
Implements elitism by replacing the worst member of the population with the best solution seen so far. If the best solution is a member of the current population, nothing is changed.
Elitism can be applied to an algorithm by piping the algorithm with elitism, i.e.:
ElitistVersion = 0.2 * Elitist | 0.8 * SomeAlgorithm
for a new version of SomeAlgorithm that selects the first 20% of the population as the best solutions so far, and fills in the rest of the population with SomeAlgorithm.
Implements standard selection for evolution strategies.
The property config.selection determines the type of selection, either “plus” or “comma”. The config should provide mu; the population size together with mu and the selection type determines lambda.
Config parameters:
* a transformation applied to the fitness function so that any fitness values may be used
* T – A “temperature” that divides the exponent.
Selection probabilities follow the Boltzmann distribution, p(x) = exp(f(x)/T).
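As a concrete illustration of the Boltzmann form above (assuming fitness is maximized, as elsewhere in these docs), the selection probabilities for a population can be computed by normalizing exp(f(x)/T):

```python
import math

# Normalized Boltzmann selection weights, p(x) proportional to
# exp(f(x)/T). Higher fitness gets higher probability; lowering T
# sharpens selection toward the best member.
def boltzmann_probs(scores, T=1.0):
    weights = [math.exp(s / T) for s in scores]
    total = sum(weights)
    return [w / total for w in weights]
```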
Fitness proportional selection with a modulating function.
Config parameters:
Selection is performed on the image of this function: for fitness f, selection operates on modulator(f(x)) rather than f(x). The modulator receives the score first and the index of the element in the population second; the configuration is passed as a third argument. That is, the fitness s = f(x) is computed, then a call is made to modulator(s, i, cfg), and if this causes an exception, modulator(s, i) is used instead.
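The fallback convention described here can be sketched as a small dispatcher. Catching TypeError specifically (rather than any exception) is an assumption of this sketch:

```python
# Dispatch to modulator(s, i, cfg) first; fall back to modulator(s, i)
# if the three-argument call fails. Catching TypeError specifically
# is an assumption; the docs only say "causes an exception".
def apply_modulator(modulator, score, index, config):
    try:
        return modulator(score, index, config)
    except TypeError:
        return modulator(score, index)
```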
Computes the probability of selecting a solution from a population using linear ranking selection.
See e.g. <http://www.pohlheim.com/Papers/mpga_gal95/gal2_3.html>
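A common parameterization of linear ranking (following Baker's scheme with a selection pressure in [1, 2]; a generic sketch, not necessarily the exact formula behind the link above) assigns:

```python
# Linear ranking probabilities for a population of size K, with rank 1
# for the worst and K for the best. pressure in [1, 2] controls how
# strongly better ranks are favored; the probabilities sum to 1.
def linear_ranking_probs(K, pressure=1.5):
    return [((2 - pressure) + 2 * (pressure - 1) * (r - 1) / (K - 1)) / K
            for r in range(1, K + 1)]
```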
Computes the probability of selecting a solution from a population using nonlinear ranking selection.
See e.g. <http://www.pohlheim.com/Papers/mpga_gal95/gal2_3.html>
Fitness proportional selection (roulette wheel selection).
See <http://en.wikipedia.org/wiki/Fitness_proportionate_selection>.
Fitness values must be nonnegative.
This Selection method is just GeneralizedProportional with a modulating function.
A policy to allocate selection probabilities under ranking selection.
Ranking selection, e.g. <http://www.pohlheim.com/Papers/mpga_gal95/gal2_3.html>
Takes a ranking function which weights individuals according to rank; rank is 1 for the lowest and K for the highest in a population of size K.
Note that the ranking function is a class, instantiated with the algorithm’s config. When this method is initialized, the ranking function is instantiated and placed in config.rankerInst, unless config.rankerInst is provided first.
Config parameters
A selection method (Abstract class)
Tournament selection on the entire population.
See <http://en.wikipedia.org/wiki/Tournament_selection>.
This class extends GeneralizedProportional, but only to get the method buildProbabilities (and the particular use of batch). Its modulating function is built-in and is just the rank of the member in the population.
Config parameters: * pressure – The probability of choosing the best member of the randomly subsampled population, for tournament selection over the entire population.
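A sketch of tournament selection with a pressure parameter: the sampled pool is ranked, the best member is chosen with probability pressure, the next-best with the remaining probability, and so on. This illustrates the general technique; the tie-in to buildProbabilities is not shown:

```python
import random

# Tournament selection sketch: sample a pool, sort by score (best
# first), then walk down the ranking, stopping with probability
# `pressure` at each member; the last member is a fallback.
def tournament_select(scored, size=2, pressure=0.8, rng=random):
    pool = sorted(rng.sample(scored, size),
                  key=lambda pair: pair[1], reverse=True)
    for member in pool:
        if rng.random() < pressure:
            return member
    return pool[-1]
```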
Like AreaSensitiveGaussian but with a Bernoulli.
Because a bit space is finite, there is no need to decay, but we still want to make exiting the area less likely, to promote exploration within the selected area.
Config parameters * p – The bit-flipping probability
Gaussian mutation with a variance that depends on the size of a partition of the space (see pyec.util.partitions.Partition).
Config parameters: * jogo2012 – Whether to use the methods from Lockett & Miikkulainen, 2012 * decay – A function decay(n, config) to compute the multiplier that controls the rate of decrease in standard deviation. Faster decay causes faster convergence, but may miss the optimum. Default is 1/generations.
Standard deviation is determined by:
sd = 0.5 * (upper - lower) * (area ** (1/dim)) * decay(n)
where dim is the dimension of the space (config.dim) and area is the volume of the partition region for the object being mutated.
Bernoulli mutation on a TernaryString object. Randomly flips bits on a fully specified binary string.
Config parameters * p – The probability of flipping each bit.
This method uses an iterative algorithm to flip bits in a TernaryString with a given probability. The algorithm essentially uses the binary representation of the bit-flipping probability to convert random byte sampling (p = 0.5) into sampling at any probability, with a resolution of about 1e-16.
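The distribution this implements is ordinary independent bit flipping; a direct (unoptimized) sketch, without the byte-sampling trick:

```python
import random

# Flip each bit of a 0/1 list independently with probability p. The
# iterative byte-sampling algorithm described above produces the same
# distribution more efficiently; this is the direct version.
def bernoulli_mutate(bits, p=0.05, rng=random):
    return [b ^ (rng.random() < p) for b in bits]
```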
Cauchy mutation.
Config parameters * sd – The width parameter of the Cauchy (pi) * p – The probability of mutating each component, 1.0 by default
Implements an old (1996) version of mutation for the Covariance Matrix Adaptation method of Hansen and Ostermeier.
This method is obsolete and is very different from the modern version.
NEEDS A SPACE TO HANDLE PULLING OUT THE SIGNIFICANT DIMENSIONS, see EndogeneousProduct
The 1996 paper says to set cmaCumulation to 1/n and cmaCorrelation to 2/n^2. Defaults assume the algorithm will be run for 1000 generations (n = 1000).
Config Parameters:
 sd
 cmaCumulation
 cmaDamping
 cmaCorrelation
Take an N x N matrix and make an N(N+1)/2 array.
Take an N(N+1)/2 array and make an N x N matrix.
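The packing between an N x N symmetric matrix and an N(N+1)/2 array can be sketched with NumPy's triangular indices. Function names here are illustrative, not PyEC's:

```python
import numpy as np

# Pack the upper triangle (including the diagonal) of a symmetric
# N x N matrix into a flat N(N+1)/2 array, and rebuild the matrix.
def pack_symmetric(mat):
    return mat[np.triu_indices(mat.shape[0])]

def unpack_symmetric(flat, n):
    mat = np.zeros((n, n))
    mat[np.triu_indices(n)] = flat
    return mat + np.triu(mat, 1).T  # mirror the strict upper triangle
```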
Base class to perform recombination on a given list of organisms.
Parameters:  config (Config) – The configuration parameters. 

Performs recombination using a crossover policy given by a Crosser instance.
Config parameters:
* a crossover probability; if a Bernoulli test on this value fails, then no crossover is performed, and the first selected organism is chosen
* order – The number of organisms to cross over at each step
* crosser – A Crosser to perform the crossover
Like DecayedGaussian, but with a Bernoulli distribution instead of a Gaussian; the bit flipping probability decays to zero.
Config parameters * sd – A scale factor on the standard deviation. * decay – A multiplicative decay factor. * decayExp – An exponential decay factor.
Bit flip prob is determined by:
p = p0 * exp(-(generation * decay) ** decayExp)
Gaussian mutation with a variance that decreases over time according to fixed formula.
Config parameters * sd – A scale factor on the standard deviation. * decay – A multiplicative decay factor. * decayExp – An exponential decay factor.
Standard deviation is determined by:
sd = sd0 * exp(-(generation * decay) ** decayExp)
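With the negative exponent made explicit (the deviation must shrink toward zero over time), the decay can be computed directly:

```python
import math

# Decayed standard deviation: sd0 * exp(-(generation * decay) ** decayExp).
# At generation 0 the deviation equals sd0; it shrinks toward zero after.
def decayed_sd(sd0, generation, decay, decay_exp):
    return sd0 * math.exp(-(generation * decay) ** decay_exp)
```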
Cross multiple organisms using generalized uniform crossover.
Gaussian mutation for evolution strategies, where the adaptive mutation parameters are encoded with the organism.
NEEDS A SPACE TO HANDLE PULLING OUT THE SIGNIFICANT DIMENSIONS
Config params * sd – The standard deviation to start with * dim – The number of dimensions being searched
Gaussian mutation. Randomly mutates a random selection of components of a real vector.
Config parameters * sd – The standard deviation to use. * p – The probability of mutating each component; defaults to 1.0
Cross multiple organisms by averaging their values componentwise.
Normally used by evolution strategies.
Base class for mutation operators. Provides a method mutate that performs mutation on a single member of a population.
Cross two organisms at one point and return only one crossed version.
Unlike OnePointDualCrosser, this discards the second variant.
See <http://en.wikipedia.org/wiki/Crossover_(genetic_algorithm)#One-point_crossover>.
One-point crossover mixes two individuals to create two new individuals by splicing them together at a random index.
So:
xxxxxx
oooooo
might become:
xxoooo
ooxxxx
And both items are returned.
See <http://en.wikipedia.org/wiki/Crossover_(genetic_algorithm)#One-point_crossover>.
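The splice diagrammed above is a one-line operation on sequences; the function name is illustrative:

```python
import random

# One-point crossover: splice two equal-length parents at a random
# interior index and return both offspring, as in the diagram above.
def one_point_cross(a, b, rng=random):
    point = rng.randint(1, len(a) - 1)
    return a[:point] + b[point:], b[:point] + a[point:]
```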
Cross two organisms by wrapping them into a circle, picking two indices, and combining the two rings.
See <http://en.wikipedia.org/wiki/Crossover_(genetic_algorithm)#Two-point_crossover>.
Uniform mutation within a partition region. The region is determined by the organism being mutated.
Parameters:  config (Config) – The configuration parameters. 

Uniform crossover.
See <http://en.wikipedia.org/wiki/Crossover_(genetic_algorithm)#Uniform_Crossover_and_Half_Uniform_Crossover>
Implements Differential Evolution (DE) as described by:
Storn, Rainer and Price, Kenneth. Differential Evolution – A simple and efficient adaptive scheme for global optimization over continuous spaces. 1995.
See <http://en.wikipedia.org/wiki/Differential_evolution> for algorithm details.
See <http://www.hvass-labs.org/people/magnus/publications/pedersen10good-de.pdf> for a good discussion of parameter settings for DE.
Config parameters:
Other defaults:
Parameters:  cfg (Config) – The configuration object for Differential Evolution. 

The Covariance Matrix Adaptation Evolution Strategy algorithm as described by:
Hansen, N. and Ostermeier, A. Completely derandomized selfadaptation in evolution strategies. In Evolutionary Computation 9(2), pp 159–195, 2001.
See <http://en.wikipedia.org/wiki/CMA-ES> for details.
A fast-converging population-based optimizer that finds reasonably good solutions with relatively few resources. In general, each population is sampled from a multivariate Gaussian whose parameters are altered to optimize the probability of finding good solutions.
Akimoto (2010) proved that CMAES is an instance of Natural Evolution Strategies (Wierstra, 2008), which implements a gradient-following method at the population level.
Config parameters * mu – The number of solutions from each generation used as parents for the next; by default equal to half the Config.populationSize * restart – Whether to restart when the distribution collapses; defaults to True (see e.g. Auger and Hansen, 2005)
Params cfg:  The configuration object for CMAES. 

A history that stores the parameters for CMAES
A History for Particle Swarm Optimization. Remembers the local best and the velocities.
Particle Swarm Optimization.
Config parameters
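Since the config parameters are not listed here, the following is a generic sketch of the standard PSO update. The inertia w, cognitive weight c1, and social weight c2 are conventional names, not necessarily PyEC's:

```python
import random

# Standard PSO update for one particle: the new velocity blends
# inertia, attraction to the particle's personal best, and attraction
# to the swarm's global best; the position then moves by the velocity.
def pso_step(x, v, pbest, gbest, w=0.7, c1=1.4, c2=1.4, rng=random):
    new_v = [w * vi
             + c1 * rng.random() * (pb - xi)
             + c2 * rng.random() * (gb - xi)
             for xi, vi, pb, gb in zip(x, v, pbest, gbest)]
    new_x = [xi + nvi for xi, nvi in zip(x, new_v)]
    return new_x, new_v
```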
The NelderMead method of optimization for real spaces.
A history that saves the state for the NelderMead algorithm
A proposal distribution for structure searching with a heuristic algorithm such as simulated annealing.
Config params:
branchFactor: the maximum number of parents per variable
the probability of removing a random edge
the probability of adding a random edge
the probability of reversing a random edge
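The three edge moves above (remove, add, reverse) can be sketched as a single proposal step over a DAG edge set. This is an illustrative sketch, not PyEC's implementation: the function name, signature, and the way the probabilities are combined are assumptions, and a full structure proposal would also reject moves that introduce cycles or violate other structural constraints.

```python
import random

def propose(edges, n_vars, branch_factor, p_remove, p_add, p_reverse):
    """One proposal step over a DAG structure, represented as a set of
    (parent, child) edges over variables 0..n_vars-1.

    Hypothetical helper; a real implementation would also check that
    the resulting graph stays acyclic before accepting the move.
    """
    new_edges = set(edges)
    r = random.random()
    if r < p_remove and new_edges:
        # remove a random edge
        new_edges.discard(random.choice(sorted(new_edges)))
    elif r < p_remove + p_add:
        # add a random edge, respecting the branchFactor parent limit
        parent = random.randrange(n_vars)
        child = random.randrange(n_vars)
        in_degree = sum(1 for (p, c) in new_edges if c == child)
        if parent != child and in_degree < branch_factor:
            new_edges.add((parent, child))
    elif r < p_remove + p_add + p_reverse and new_edges:
        # reverse a random edge
        p, c = random.choice(sorted(new_edges))
        new_edges.discard((p, c))
        new_edges.add((c, p))
    return new_edges
```

A simulated annealing driver would call this repeatedly, scoring each proposed structure and accepting or rejecting it according to the annealing schedule.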
A ternary string with three values: True, False, and Unknown.
Params:
base: an object whose bits are treated as a bit string
known: an object whose bits determine whether the value at that bit is known in base
length: the maximum number of bits allowed
This method uses an iterative algorithm to flip bits in a TernaryString with a given probability. The algorithm essentially uses the binary representation of the bit-flipping probability to convert random byte sampling (p = .5) to sampling at an arbitrary probability, with a resolution of 1e-16.
Parameters:
Returns: A TernaryString of the specified length that is 1 in each position with probability p
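The binary-expansion trick described above can be sketched as follows. The name `flip_bits` and its signature are assumptions, and this sketch operates on a plain integer bit mask rather than a TernaryString: processing the binary digits of p from least significant to most, OR-ing in fair random bits for a 1 digit and AND-ing for a 0 digit leaves each position set with probability p.

```python
import random

def flip_bits(length, p, precision=53):
    """Return an integer whose `length` bits are each 1 with probability
    p, built only from fair (p = .5) random bits.

    Illustrative sketch of the binary-expansion technique; not PyEC's
    actual method or signature.
    """
    # binary digits of p, most significant first
    digits = []
    x = p
    for _ in range(precision):
        x *= 2
        if x >= 1:
            digits.append(1)
            x -= 1
        else:
            digits.append(0)
    acc = 0
    for d in reversed(digits):  # least significant digit first
        r = random.getrandbits(length)
        # OR raises the probability toward 1; AND halves it
        acc = (acc | r) if d else (acc & r)
    return acc & ((1 << length) - 1)
```

With precision=53 binary digits, the achievable resolution matches double-precision floats, consistent with the 1e-16 resolution the docs mention.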
Convert a number to its binary representation.
Parameters:
Returns: A string of zeros and ones with the least significant bit at the left.
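A minimal sketch of such a conversion for nonnegative integers, with the least significant bit at the left as described; the function name and the fixed-length signature are illustrative, not PyEC's actual API.

```python
def binary_lsb_first(x, length):
    """Return the binary representation of a nonnegative integer as a
    string of zeros and ones, least significant bit first.

    Hypothetical helper name; padded or truncated to `length` bits.
    """
    return ''.join(str((x >> i) & 1) for i in range(length))
```

For example, 6 (binary 110) rendered over 4 bits with the least significant bit first reads "0110".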
Separation algorithm for Bayesian networks based on structure.
Used to separate spaces defined by a layer hierarchy; originally written for neural networks. Assumes that the space is divided into layered strata (see LayeredSpace). The bottom layer is separated differently from the higher layers, so the config should contain a property named “secondary_separator” that can be used to separate points in the lowest layer of the hierarchy.
Like VectorSeparationAlgorithm, except that it always partitions the longest side. This can be helpful to guarantee that the longest side length of any partition hyperrectangle is reduced evenly.
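The longest-side rule can be sketched for an axis-aligned box. This is an assumption-laden illustration (midpoint split, list-based boxes), not PyEC's partition code, which separates regions around sampled points rather than at midpoints.

```python
def split_longest_side(lower, upper):
    """Split an axis-aligned box [lower, upper] into two halves along
    its longest side.

    Illustrative sketch of the longest-side rule: always cutting the
    longest axis keeps the partition cells from becoming elongated.
    """
    # find the axis with the greatest extent
    axis = max(range(len(lower)), key=lambda i: upper[i] - lower[i])
    mid = (lower[axis] + upper[axis]) / 2.0
    left_upper = list(upper)
    left_upper[axis] = mid
    right_lower = list(lower)
    right_lower[axis] = mid
    return (list(lower), left_upper), (right_lower, list(upper))
```

Repeatedly applying this rule halves the longest side of each resulting cell, which is the even-reduction guarantee mentioned above.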
An algorithm that enables traversal and separation of the partition tree. This class abstracts the space-dependent portions of the partitioning logic.
Parameters: config (Config) – configuration parameters
Check whether this partitioning method is compatible with a given space.
Parameters: space (Space) – the space to check
Insert the point into the partition tree, separating the current partition that contains it.
Parameters:
Do the actual work to partition a region.
Parameters:
Raises: SeparationException – when separation of the node fails