Introduction to PyEC

PyEC provides an implementation of several well-known optimization methods with a particular focus on methods of evolutionary computation or natural computation. If you are just interested in optimizing functions, see the section on Basic Usage below.

For the time being, this code is in an experimental stage and is likely to change substantially over the next few minor versions.

Who is PyEC for?

PyEC is intended for three main groups of people:

Anyone who needs to optimize functions. PyEC can be used as a drop-in module to perform optimization quickly and accurately. The most common optimization methods have been packaged in an easily accessible form for this purpose, as described in Basic Usage below.

Researchers in Evolutionary Computation. PyEC contains tools to construct, modify, and parameterize optimization methods. It can be used as a toolbox for the researcher in evolutionary computation to construct evolutionary algorithms on complex search domains, or to test new genetic operators or new combinations of existing evolutionary components. These features are not well documented yet, but the class-specific Documentation should help you get started.

Anyone interested in Stochastic Optimization. PyEC efficiently implements the most common methods in stochastic optimization. It can be used to test and experiment with how different algorithms work, and to explore what effects different parameter settings can have.

Installation

PyEC is distributed on PyPI and can be installed with setuptools using the name pyec, like so:

$ easy_install pyec

PyEC depends on the linear algebra package NumPy <http://numpy.scipy.org/> and the scientific computing package SciPy <http://www.scipy.org/>.

Basic Usage

PyEC provides various optimization methods. If you just want to optimize, the following examples should help. PyEC provides easily accessible optimization routines for Evolutionary Annealing, Differential Evolution, Nelder-Mead, Generating Set Search, CMA-ES, Particle Swarm Optimization, and Simulated Annealing.

Start a Python terminal session as follows:

Python 2.6.5 (r265:79359, Mar 24 2010, 01:32:55)
[GCC 4.0.1 (Apple Inc. build 5493)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> import pyec.optimize

To start with, we are going to optimize Branin’s function. This is a real function in two dimensions. It is defined as

\[f(x,y) = \left(y-\frac{5.1}{4\pi^2}x^2+\frac{5}{\pi}x-6\right)^2 + 10\left(1-\frac{1}{8\pi}\right)\cos(x) + 10\]

and it has three global optima with \(f(-\pi, 12.275) = f(\pi, 2.275) = f(9.42478, 2.475) = 0.39788735\). Usually, the optimization is constrained by \(-5 < x < 10\) and \(0 < y < 15\):

>>> from numpy import *
>>> def branin(x):
...    return (x[1] - (5.1/(4*(pi**2)))*(x[0]**2) + (5./pi)*x[0] - 6.)**2 + 10 * (1. - 1./(8.*pi))*cos(x[0])+10.
...
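
As a quick sanity check, evaluating branin at one of the known optima recovers the minimum value (rounded here for display):

>>> round(branin(array([pi, 2.275])), 8)
0.39788736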

Here are some examples of PyEC on Branin’s function with no constraints, on about 2,500 function evaluations (default):

>>> pyec.optimize.evolutionary_annealing(branin, dimension=2)
(array([ 3.14159266,  2.27500002]), 0.39788735772973816)
>>> pyec.optimize.differential_evolution(branin, dimension=2)
(array([ 3.14100091,  2.25785439]), 0.39819904998361366)
>>> pyec.optimize.cmaes(branin, dimension=2)
(array([ 3.14159266,  2.27500001]), 0.39788735772973816)
>>> pyec.optimize.nelder_mead(branin, dimension=2)
(array([ 3.14159308,  2.27499873]), 0.39788735773149675)
>>> pyec.optimize.generating_set_search(branin, dimension=2)
(array([ 3.12552433,  2.29660443]), 0.39920864246465015)
>>> pyec.optimize.particle_swarm_optimization(branin, dimension=2)
(array([ 3.1282098 ,  2.26735578]), 0.39907497646047574)

In these examples, all we had to do was pass our function to one of PyEC’s optimizers along with the dimension (Branin’s has two input variables), and these methods used default configurations in order to locate the global minimum (evolutionary_annealing() requires the space_scale parameter for unconstrained optimization). The return value is a tuple with two items: the first element is the best solution found so far, and the second element is the function value at that solution. In this case, many of the optimizers were not particularly accurate. Some methods missed the global minimum by an error on the order of 0.01, which is much larger than we would prefer, especially for this simple problem. Most of these methods can be made more accurate by replacing some of the default parameters:

>>> pyec.optimize.differential_evolution(branin, dimension=2, generations=250, population=50, CR=.2, F=.2)
(array([ 3.14159085,  2.27498739]), 0.39788735794177654)
>>> pyec.optimize.generating_set_search(branin, dimension=2, generations=5000, expansion_factor=1., initial_step=0.5)
(array([ 3.1386125 ,  2.27810758]), 0.3979306095569175)
>>> pyec.optimize.particle_swarm_optimization(branin, dimension=2, omega=-0.5, phi_p=0.0)
(array([ 3.14159258,  2.27500003]), 0.39788735772976125)

Now all but one of the methods have found the global optimum accurately up to eight decimal places.

If we had wished to find the maximum values rather than the minimum values of the function, we could have passed the parameter minimize=False to any of these optimizers:

>>> pyec.optimize.differential_evolution(branin, dimension=2, minimize=False)

In general, a function \(f(x)\) can be maximized by minimizing the alternate function \(-f(x)\) instead, which is what PyEC does internally when minimize=False.
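
For instance, the following call poses the same search problem as minimizing branin directly; the explicit negation is shown only for illustration:

>>> pyec.optimize.differential_evolution(lambda x: -branin(x), dimension=2, minimize=False)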

Branin’s function is relatively easy to optimize; we would like to try a harder function. PyEC ships with several benchmark multimodal functions, many of which are defined in higher dimensions as well. These benchmarks can be referenced by name when calling PyEC’s optimizers. One example is Rastrigin’s function (see <http://en.wikipedia.org/wiki/Rastrigin_function>):

def rastrigin(x):
   return 10 * len(x) + ((x ** 2) - 10 * cos(2 * pi * x)).sum()
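
With rastrigin defined as above (and from numpy import * in effect), the minimum is easy to verify by hand, since cos(0) = 1 makes every term cancel:

>>> rastrigin(zeros(10))
0.0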

Rastrigin’s has a minimum at the origin, where its function value is 0. PyEC’s optimizers can find this minimum with a little tweaking:

>>> pyec.optimize.differential_evolution("rastrigin", dimension=10, generations=2500, population=100, CR=.2, F=.2)
(array([ -3.09226981e-05,   2.19169568e-05,   4.46486498e-06,
         -1.50452001e-05,   6.03987807e-05,  -6.17905562e-06,
          4.82476074e-05,  -1.02580314e-05,  -2.07212921e-05,
          9.15748483e-06]), 1.6496982624403245e-06)

PyEC’s optimizers are stochastic; that is, they search the space randomly. No two runs of any optimizer are the same. You can get widely varying results from different runs of the same algorithm, so it’s best to run an algorithm a few times if you’re not satisfied with the results:

>>> pyec.optimize.cmaes("rastrigin", dimension=10, generations=2500,population=250)
(array([  1.48258949e-03,   2.25335429e-04,  -5.35427662e-04,
         -2.74244483e-03,   3.20044246e-03,  -4.59549462e-03,
         -2.09654701e-03,   9.93491865e-01,   8.95951435e-04,
         -9.95219709e-01]), 1.9996060669315057)
>>> pyec.optimize.cmaes("rastrigin", dimension=10, generations=2500,population=250)
(array([ -4.26366209e-04,  -7.29513508e-04,   5.97365406e-04,
         -9.93842635e-01,   4.47482962e-04,  -3.32484925e-03,
         -3.98886672e-03,   4.06692711e-04,  -1.49134732e-03,
          3.80257643e-03]), 1.0041502969750979)
>>> pyec.optimize.cmaes("rastrigin", dimension=10, generations=2500,population=250)
(array([ -4.24651080e-04,   7.78200373e-04,   4.80037528e-03,
          1.06188871e-03,   2.50392639e-04,  -3.00255770e-03,
         -9.91998151e-01,   5.52063421e-03,  -3.44827888e-03,
         -9.97582491e-01]), 2.0081777356006967)
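
One simple pattern, sketched here, is to keep the best of several independent runs:

>>> results = [pyec.optimize.cmaes("rastrigin", dimension=10, generations=2500, population=250) for i in xrange(5)]
>>> min(results, key=lambda r: r[1])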

Every optimizer performs best on some set of problems, and performs worse on others. Here, differential_evolution() performs well on rastrigin, whereas CMA-ES is less reliable on this function. The opposite situation can hold true for other optimization problems.

Constrained Optimization

PyEC supports constrained optimization as well. Hyperrectangle boundaries can be imposed with pyec.optimize.Hyperrectangle:

>>> pyec.optimize.evolutionary_annealing(lambda x: x[0]-x[1], dimension=2, constraint=pyec.optimize.Hyperrectangle(center=5.0,scale=2.5))
(array([ 2.5,  7.5]), -5.0)

Other constraints can be implemented by creating a subclass of pyec.optimize.Constraint and passing it to the optimizer via the keyword argument constraint.
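
As a purely hypothetical sketch (the method to override below is an assumption; consult the pyec.optimize.Constraint source for the actual interface), a circular feasible region might look something like:

>>> class UnitDisk(pyec.optimize.Constraint):
...    # hypothetical membership test; the real Constraint API may differ
...    def __contains__(self, x):
...       return (x ** 2).sum() <= 1.0
...
>>> pyec.optimize.evolutionary_annealing(branin, dimension=2, constraint=UnitDisk())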

Release Notes

Version 0.3 has substantially changed the internal representation of optimizers with the following goals:

  • Each optimizer should publish its configuration parameter requirements in a code-accessible form.
  • Standardize configuration parameters across components.
  • Better documentation of each method, especially the componential construction of optimizers in PyEC.
  • Basic support for neural networks and neuroevolution.
  • Algebraic operators for optimizers, e.g. SimpleGeneticAlgorithm = ProportionalSelection << (ProportionalSelection >> 1) << OnePointCrossover << GaussianMutation[p=0.1].
  • A new training framework derived from self-convolution.
  • Better support for different search spaces.
  • Better support for complex constraint regions.

Future versions will offer support for features such as:

  • Automatic function-specific optimizer configuration.
  • Improved documentation.

Documentation

If you’re just interested in optimization, then the methods in pyec.optimize should be sufficient. But if you are interested in developing new stochastic iterative optimization algorithms or augmenting those provided by PyEC, then you’ll need to understand the architecture of PyEC.

Basic Concepts

As discussed in the introduction, an optimization problem consists of a function to be optimized (the objective function, fitness function, or cost function) and a set of inputs to the function that are acceptable as optima (the search domain).

In PyEC, the search domain is an instance of pyec.Space. This object provides information about the type of the function inputs. Much work on optimization distinguishes between the search space (the set of inputs on which the objective function is defined) and the search domain (the portion of the search space that is acceptable as a solution), but PyEC does not make this distinction, so an instance of pyec.Space specifies both the space and the feasible constraints. Theoretically, PyEC is intended to work on topological spaces, and the search domain is the topological restriction of the search space to a particular subset. Practically, when a restricted domain is required, a subclass of the general search space provides the appropriate methods, such as determining whether a point is within the feasible region.

From a theoretical perspective, PyEC regards an optimization method as a conditional probability distribution that probabilistically generates a batch sample of possible solutions (the “population”) given the previously evaluated samples and their scores on the objective function. All optimization methods inherit from pyec.distribution.basic.PopulationDistribution.

Configuration

Most optimization methods depend on various parameters for their operation. In PyEC, these parameters are wrapped in a pyec.Config object. Every optimization method in PyEC provides its own defaults on a class variable named config. For example, DifferentialEvolution has a crossover rate of 0.2 (CR = 0.2) by default. PyEC provides a means to modify these configuration parameters before instantiation using the [] operator, e.g.:

>>> from pyec.config import Config
>>> from pyec.distribution.de import DifferentialEvolution
>>> DifferentialEvolution.config.CR
0.2
>>> DE_CR_05 = DifferentialEvolution[Config(CR=0.5)]
>>> DE_CR_05.config.CR
0.5

In this way, it is easy to configure algorithms to run as needed. Under the hood, DE_CR_05 is a new class with its own Config object, distinct from the config property of DifferentialEvolution. These are both in turn distinct from the config on PopulationDistribution. So, for example:

>>> PopulationDistribution.config.__properties__
{'stats': <pyec.util.RunStats.RunStats object at 0x10375b610>,
 'observer': None, 'space': None, 'minimize': True, 'populationSize': 1,
 'history': <class 'pyec.history.SortedMarkovHistory'>}
>>> DifferentialEvolution.config.__properties__
{'F': 0.5, 'space': <pyec.space.Euclidean object at 0x10375bdd0>,
 'initial': None, 'CR': 0.2, 'populationSize': 100,
 'history': <class 'pyec.history.LocalBestHistory'>}
>>> DE_CR_05.config.__properties__
{'CR': 0.5}

In this case, DE_CR_05 is a subclass of DifferentialEvolution, which is a subclass of PopulationDistribution. So what happens when an optimizer is instantiated?:

>>> DifferentialEvolution().config.__properties__
{'F': 0.5, 'stats': <pyec.util.RunStats.RunStats object at 0x10375b610>,
 'observer': None, 'space': <pyec.space.Euclidean object at 0x10375bdd0>,
 'minimize': True, 'initial': None, 'CR': 0.2, 'populationSize': 100,
 'history': <class 'pyec.history.LocalBestHistory'>}
>>> DE_CR_05().config.__properties__
{'F': 0.5, 'stats': <pyec.util.RunStats.RunStats object at 0x10375b610>,
 'observer': None, 'space': <pyec.space.Euclidean object at 0x10375bdd0>,
 'minimize': True, 'initial': None, 'CR': 0.5, 'populationSize': 100,
 'history': <class 'pyec.history.LocalBestHistory'>}

Once the algorithm is instantiated, it has its own config, and this config has properties from each of its parent classes. The instance config takes on its default value from the lowest parent class in the chain that defines it. So, CR=0.5 in DE_CR_05 overrides CR=0.2 in DifferentialEvolution.

Running an Optimizer

Once an optimizer has been configured, it can be run. Running an iterative optimizer means that the optimizer is used to generate a sequence of samples (populations) in the search domain. The optimizer uses the results of previous samples and their scores in order to generate the next sample.

Operationally, PyEC separates the state of an optimization run from the logic used to generate new points for evaluation. The state of a particular optimization run is maintained in a History object. The History object should contain sufficient information for the algorithm to regenerate its state, and the PopulationDistribution defining the optimizer should not contain any state outside the History object that cannot be rebuilt from the History. Each algorithm must specify the History class that it normally uses in its config object. Additionally, each optimizer may override the method compatible that is used to check whether a History instance is compatible with the optimizer. The default history is a MarkovHistory, which stores only the last set of samples and their scores. MarkovHistory is sufficient to implement many genetic algorithms.

A PyEC optimizer is first updated with a history and a fitness function, and then sampled to select new evaluation points to test. These two steps are repeated indefinitely, so that the optimizer is run like so:

>>> de = DifferentialEvolution()
>>> f = lambda x: sum(x ** 2)
>>> pop = de[None, f]()
>>> history = de.history
>>> for i in xrange(100):
...    history.update(pop, f, de.config.space, de)
...    pop = de[history, f]()
...
>>> history.minimal()
(array([  8.33013471e-24]), 6.9391144322723016e-47)

In this code, an instance of DifferentialEvolution is created to minimize a basic parabola in one dimension. A blank history of the default class is created by passing None along with the objective function into the [] operator, and then the optimizer is immediately called to generate the initial population. The history is extracted, and then updated with the latest population, the objective function, the search domain, and the optimizer. The actual function evaluation (if necessary) occurs in the call to history.update. Finally, the optimizer is updated with the history, and a new population is generated. These last two steps are repeated, and at the end, the minimal point observed during the process is extracted by a call to history.minimal.
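
The same loop can be packaged as a small helper. This is just a repackaging of the calls above; the helper name is ours, and the space is read from the optimizer’s config:

def run_optimizer(opt, f, generations=100):
   # generate the initial population from a blank history
   pop = opt[None, f]()
   history = opt.history
   for i in xrange(generations):
      # evaluate the population and fold it into the history
      history.update(pop, f, opt.config.space, opt)
      # sample the next population from the updated optimizer
      pop = opt[history, f]()
   # return the best (solution, score) pair observed
   return history.minimal()

Calling run_optimizer(DifferentialEvolution(), lambda x: sum(x ** 2)) then reproduces the run above.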

There is a simpler way to run an optimizer, like so:

>>> de100 = (DifferentialEvolution << 100)()
>>> lastPop = de100[None, (lambda x: sum(x**2))]()
>>> de100.history.minimal()
(array([  1.77413093e-23]), 3.1475405527093268e-46)

In this second example, the notation DifferentialEvolution << 100 represents an operation called self-convolution, which is equivalent to running DifferentialEvolution for 100 generations and then returning the final sample. Self-convolution is an operation on optimization algorithms. Behind the scenes, the operation << produces a new optimizer class that inherits from SelfConvolution. PyEC offers several operations for creating new optimizers beyond self-convolution. In fact, PyEC provides an algebra for optimizers, discussed next.

Optimizer Algebra

Viewed from a certain perspective, the space of all possible optimizers is a normed vector space, similar to the space of real numbers or the space of continuous functions (see Lockett and Miikkulainen, “Measure-Theoretic Analysis of Stochastic Optimization”, 2013). Thus, one can meaningfully speak of adding optimizers, or multiplying them by scalar values. There are also other operations that are useful.

PyEC supports several optimizer operations at both the class and the instance level. The main operations currently implemented are: (1) addition, (2) scalar multiplication, (3) convolution, (4) trajectory truncation, and (5) population splitting.

Addition of two optimizers indicates a probabilistic choice. Thus if we have \(\mathcal{A} + \mathcal{B}\), the result is a pseudo-optimizer that uses \(\mathcal{A}\) to produce the next generation/sample half the time, and uses \(\mathcal{B}\) the other half of the time. This is called a “pseudo-optimizer” because the result is not a probability distribution. To get a probability distribution, we have to normalize the coefficients; that is, \(1 + 1 = 2\), so we choose each optimizer with probability \(\frac{1}{2}\). PyEC keeps track of additive coefficients and normalizes automatically, so that every possible addition is a real optimizer.

Scalar multiplication simply changes the coefficients. If there is only one optimizer involved, this has no effect. But if there are multiple optimizers, then the coefficients express the frequency with which each optimizer is used to generate a distribution. PyEC makes one choice per generation. Note that this is different from the formal theory of Lockett and Miikkulainen, which allows one choice per individual.

So:

>>> from pyec.distribution.de import DifferentialEvolution
>>> from pyec.distribution.cmaes import Cmaes
>>> DE_Plus_CMAES = DifferentialEvolution + Cmaes
>>> DE_Plus_CMAES
<class 'pyec.distribution.convex.DifferentialEvolution_add_qqqCmaesqqq'>
>>> Half_DE_Plus_Half_CMAES = .5 * DifferentialEvolution + .5 * Cmaes
>>> Half_DE_Plus_Half_CMAES
<class 'pyec.distribution.convex.DifferentialEvolution_mult_qqq0_5qqq_add_qqqCmaes_mult_qqq0_5qqqqqq'>
>>> import pyec.optimize
>>> sum([pyec.optimize.optimize(DE_Plus_CMAES, lambda x: -(x**2).sum())[1] for i in xrange(10)])/10
2.3966419053038709e-20
>>> sum([pyec.optimize.optimize(Half_DE_Plus_Half_CMAES, lambda x: -(x**2).sum())[1] for i in xrange(10)])/10
2.898181656813492e-20

If, rather than probabilistically choosing one optimizer or the other to generate the population, one wishes instead to have a certain percentage of the next population generated by a particular algorithm, then one can use the | operator to split the population:

>>> split_Half_DE_Half_CMAES = .5 * DifferentialEvolution | .5 * Cmaes
>>> sum([pyec.optimize.optimize(split_Half_DE_Half_CMAES, lambda x: -(x**2).sum())[1] for i in xrange(10)])/10

An important concept for the theory of genetic algorithms is the convolution operator. A convolution of optimizers is a process where a population is chosen first according to one optimizer, and then the chosen population is fed into a second optimizer as part of its history. Genetic algorithms can be defined using convolutions, and this is done in PyEC using the << operator, e.g.:

SimpleGeneticAlgorithm = (
   Proportional << ((Proportional >> 1) <<
                     Crossover[_(crosser=OnePointDualCrosser)])
   << Bernoulli
)[_(space=BinaryReal(realDim=5))]

When the second argument to << is an integer, then self-convolution is implied, and the optimizer is convolved with itself the specified number of times, as shown above.

Trajectory truncation is denoted by >>. If the second argument is an integer, the specified number of populations are truncated off of the population history. If the second argument is an optimizer, then the first optimizer is convolved with the second, but the second optimizer is trajectory truncated by one. Thus the following two are equivalent:

>>> OptimizerA >> OptimizerB
>>> OptimizerA << (OptimizerB >> 1)

Each operation creates a new class. These classes are assigned a name that matches the operations that generated them. PyEC tries to reuse classes where possible.

In general, combining optimizers with these operations will not automatically create good optimizers, although in the case of addition and scalar multiplication, the performance change can be smooth as the coefficients are varied (see Lockett, “Measure-Theoretic Analysis of Performance in Evolutionary Algorithms”). However, these constructive operations can save a lot of time on coding and testing if used properly to implement new optimizers.

Optimization

Copyright (C) 2012 Alan J Lockett

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the “Software”), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED “AS IS”, WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

pyec.optimize.cmaes(func, **kwargs)[source]

Apply Covariance Matrix Adaptation Evolution Strategy (CMA-ES) to optimize a function. See <http://en.wikipedia.org/wiki/CMA-ES>.

Calls optimize().

Extra keyword arguments:

  • parents – The percentage of the population to use as parents, default 0.5.
  • variance – The standard deviation for CMA-ES to use during initialization, if Gaussian initialization is used (only unconstrained optimization); default is 1.0.
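
For example (the parameter values here are purely illustrative):

>>> pyec.optimize.cmaes(branin, dimension=2, parents=0.25, variance=2.0)
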
pyec.optimize.de(func, **kwargs)

Apply differential evolution (DE) to optimize a function. See <http://en.wikipedia.org/wiki/Differential_evolution>.

Calls optimize().

Extra keyword arguments:

  • CR – The crossover probability for DE, default 0.2.
  • F – The learning rate for DE, default 0.5.
pyec.optimize.differential_evolution(func, **kwargs)[source]

Apply differential evolution (DE) to optimize a function. See <http://en.wikipedia.org/wiki/Differential_evolution>.

Calls optimize().

Extra keyword arguments:

  • CR – The crossover probability for DE, default 0.2.
  • F – The learning rate for DE, default 0.5.
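
For example, a named benchmark can be combined with custom DE parameters (values illustrative):

>>> pyec.optimize.differential_evolution("rosenbrock", dimension=5, CR=.5, F=.9)
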
pyec.optimize.evoanneal(func, **kwargs)

Apply evolutionary annealing to optimize a function. See Chapter 11 of <http://nn.cs.utexas.edu/downloads/papers/lockett.thesis.pdf>.

Calls optimize().

Extra keyword arguments:

  • learning_rate – A scaling factor controlling the temperature schedule; smaller numbers search more slowly and thoroughly, larger numbers search faster and less thoroughly.
  • variance – The initial standard deviation of the Gaussian mutation distribution, i.e. how local the search is. Defaults to 1.0 and does not usually need to be changed.
  • jogo2012 – Use the parameters from Lockett and Miikkulainen in JOGO
pyec.optimize.evolutionary_annealing(func, **kwargs)[source]

Apply evolutionary annealing to optimize a function. See Chapter 11 of <http://nn.cs.utexas.edu/downloads/papers/lockett.thesis.pdf>.

Calls optimize().

Extra keyword arguments:

  • learning_rate – A scaling factor controlling the temperature schedule; smaller numbers search more slowly and thoroughly, larger numbers search faster and less thoroughly.
  • variance – The initial standard deviation of the Gaussian mutation distribution, i.e. how local the search is. Defaults to 1.0 and does not usually need to be changed.
  • jogo2012 – Use the parameters from Lockett and Miikkulainen in JOGO
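
For example, using the space_scale parameter mentioned under Basic Usage for unconstrained problems (values illustrative):

>>> pyec.optimize.evolutionary_annealing(branin, dimension=2, space_scale=10.0, learning_rate=1.0)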

pyec.optimize.generating_set_search(func, generations=500, population=1, **kwargs)[source]

Apply a basic generating set search to optimize a function. See <http://smartfields.stanford.edu/documents/080403_kolda.pdf>.

Uses no search heuristic, and uses the d+1 size basis in dimension d.

Calls optimize().

Extra keyword arguments:

  • convergence – The tolerance on the simplex before restarting; default 1e-10.
  • penalty_func – A penalty function for the objective.
  • expansion_factor – Multiplicative expansion factor to use when a new best solution is found; default is 1.1.
  • contraction_factor – Multiplicative contraction factor to use when no new best is found; default is 0.95.
  • initial_step – The initial step.
pyec.optimize.gss(func, generations=500, population=1, **kwargs)

Apply a basic generating set search to optimize a function. See <http://smartfields.stanford.edu/documents/080403_kolda.pdf>.

Uses no search heuristic, and uses the d+1 size basis in dimension d.

Calls optimize().

Extra keyword arguments:

  • convergence – The tolerance on the simplex before restarting; default 1e-10.
  • penalty_func – A penalty function for the objective.
  • expansion_factor – Multiplicative expansion factor to use when a new best solution is found; default is 1.1.
  • contraction_factor – Multiplicative contraction factor to use when no new best is found; default is 0.95.
  • initial_step – The initial step.
pyec.optimize.nelder_mead(func, generations=500, population=1, **kwargs)[source]

Apply Nelder-Mead method to optimize a function. See <http://en.wikipedia.org/wiki/Nelder-Mead_method>.

Calls optimize().

Extra keyword arguments:

  • convergence – The tolerance on the simplex before restarting; default 1e-10.
  • alpha, beta, gamma, delta – standard parameters for Nelder-Mead.
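
For example, tightening the restart tolerance (value illustrative):

>>> pyec.optimize.nelder_mead(branin, dimension=2, convergence=1e-12)
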
pyec.optimize.nm(func, generations=500, population=1, **kwargs)

Apply Nelder-Mead method to optimize a function. See <http://en.wikipedia.org/wiki/Nelder-Mead_method>.

Calls optimize().

Extra keyword arguments:

  • convergence – The tolerance on the simplex before restarting; default 1e-10.
  • alpha, beta, gamma, delta – standard parameters for Nelder-Mead.
pyec.optimize.optimize(optimizer, func, dimension=5, population=25, generations=100, **kwargs)[source]

Configure and run an optimizer on a function.

By default the function will be minimized, but maximization can be performed by setting the keyword argument minimize to False.

Benchmark functions can be optimized by name. The following names are supported:

  • ackley – A checker-board like oscillator, minimum is -13.37 in 5 dimensions.
  • ackley2 – Exponentiated and centered version of ackley, minimum is 0 at 0.
  • griewank – Oscillator with large scale, minimum at 0.
  • langerman – Sparse, rough, multi-modal. Minimum is 0.98 in five dimensions.
  • rosenbrock – Standard benchmark.
  • rastrigin – Oscillator. Minimum is 0 at 0.
  • salomon – Ring oscillation. Minimum 0 at 0.
  • schwefel – Deceptive multimodal function. Minimum is -418 on (-512,512).
  • shekel2 – Shekel’s foxholes, modified. Minimum is -10.4 in five dimensions.
  • sphere – A spherical paraboloid, minimum is 0 at 0.
  • whitley – Complex, fractal like shape with small relevant area. Minimum is 0.0.
  • weierstrass – Everywhere continuous, nowhere differentiable; minimum is 0 at 0.
Parameters:
  • optimizer (class) – A PopulationDistribution subclass
  • func (any callable object or str) – The function to be optimized, or a lookup key for a benchmark.
  • dimension (int) – The vector dimension in the search domain
  • population (int) – The population size (sample size) for the optimizer.
  • generations (int) – The number of populations to build (number of samples) during optimization.
Returns:

A tuple (best solution, best value), where the first element is the best solution located during optimization and the second is the value of the function at that solution.

Keyword arguments:

  • minimize – Whether to minimize the function, otherwise maximize; default is True.
  • initial – A callable (no arguments) that returns random starting points for the initial distribution of the optimizer.
  • display – Show progress information once every second.
  • constraint – A Boundary object implementing a constraint region (default is unconstrained).
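
The named routines in this module are wrappers that call this function; it can also be invoked directly with an optimizer class (values illustrative):

>>> from pyec.distribution.de import DifferentialEvolution
>>> pyec.optimize.optimize(DifferentialEvolution, "sphere", dimension=5, population=50, generations=200)
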
pyec.optimize.particle_swarm_optimization(func, generations=100, population=25, **kwargs)[source]

Apply particle swarm optimization to optimize a function. See <http://en.wikipedia.org/wiki/Particle_swarm_optimization>.

Calls optimize().

Extra keyword arguments:

  • omega – The velocity decay.
  • phi_g – The global best influence parameter.
  • phi_p – The local best influence parameter.
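
For example (coefficient values illustrative; compare the tuned call under Basic Usage):

>>> pyec.optimize.particle_swarm_optimization(branin, dimension=2, omega=0.7, phi_p=1.4, phi_g=1.4)
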
pyec.optimize.pso(func, generations=100, population=25, **kwargs)

Apply particle swarm optimization to optimize a function. See <http://en.wikipedia.org/wiki/Particle_swarm_optimization>.

Calls optimize().

Extra keyword arguments:

  • omega – The velocity decay.
  • phi_g – The global best influence parameter.
  • phi_p – The local best influence parameter.
pyec.optimize.sa(func, generations=1000, population=1, **kwargs)

Apply simulated annealing to optimize a function. See <http://en.wikipedia.org/wiki/Simulated_annealing>.

Calls optimize().

Extra keyword arguments:

  • schedule – One of (log, linear) for a logarithmic or linear cooling schedule, or a function T(n) to return the temperature at time n.
  • learning_rate – The temperature will be divided by the learning rate if a logarithmic or linear schedule is used.
  • restart_prob – A probability to restart simulated annealing; 0.001 by default.
pyec.optimize.simulated_annealing(func, generations=1000, population=1, **kwargs)[source]

Apply simulated annealing to optimize a function. See <http://en.wikipedia.org/wiki/Simulated_annealing>.

Calls optimize().

Extra keyword arguments:

  • schedule – One of (log, linear) for a logarithmic or linear cooling schedule, or a function T(n) to return the temperature at time n.
  • learning_rate – The temperature will be divided by the learning rate if a logarithmic or linear schedule is used.
  • restart_prob – A probability to restart simulated annealing; 0.001 by default.
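
For example, a logarithmic schedule with a higher restart probability (values illustrative):

>>> pyec.optimize.simulated_annealing(branin, dimension=2, schedule="log", restart_prob=0.01)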

Basics

Configuration


class pyec.config.Config(**kwargs)[source]

A configuration object for an optimization algorithm. Each optimizer is created with a configuration object that is used to parameterize the optimizer instance. A configuration object may have arbitrary properties and methods. Default versions of methods used by several optimizers are provided.

The initializer allows arbitrary keywords so that config objects can be quickly created, e.g.

from pyec.config import Config as _
opt = SomeOptimizer[_(p=.5, q=.2)]

merge(cfg)[source]

Given a second config object, produce a new config object that contains the properties of this config and the argument, with the argument given precedence.

The copy is shallow, so all objects will share any internal state.

Parameters:cfg (Config) – A second configuration
Returns:A third Config object with the desired properties
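
A minimal sketch of merge in use, assuming attribute-style access on Config as shown elsewhere in this documentation:

>>> from pyec.config import Config
>>> base = Config(CR=0.2, F=0.5)
>>> merged = base.merge(Config(CR=0.9))
>>> merged.CR, merged.F
(0.9, 0.5)
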
class pyec.config.ConfigBuilder(algcls)[source]

A builder to generate configuration objects with certain parameters.

A builder creates a specific configuration object for a specific optimization method. Its __init__ method is used to set the optimization class and the default parameters. The cfg property can then be modified to replace the defaults, and then configure can be called to generate an optimizer with the desired configuration. When an optimizer is instantiated, a copy of the default configuration is used, so that the builder can be reused.

Several default training parameters are placed into the Config object; view the source for details.

Parameters:algcls (class) – A class object or other generator that produces a PopulationDistribution instance when called.
configure(generations, populationSize, dimension=1, function=None)[source]

Creates an optimizer instance and applies the built configuration to the optimizer.

Parameters:
  • generations (int) – The number of generations (samples) to take during optimization
  • populationSize (int) – The size of each population per generation (number of proposed solutions per sample)
  • dimension (int) – The dimension of the object, for vector optimization (binary or real); default 1.
  • function (any callable object) – A function to be maximized; the fitness function or objective. This function will be placed in the Config object so that the optimizer can access it as necessary.
Returns:

The configured optimizer instance, usually a PopulationDistribution instance.

postConfigure(cfg)[source]

Called by configure to install any final properties into the Config object after it has been copied. Properties that are changed by postConfigure are not shared among different optimizer instances. This method should be used to install any objects that contain state that is specific to an optimizer.

Parameters:cfg (Config) – The copied Config object.

Trainer


exception pyec.trainer.BadAlgorithm[source]

An Exception to indicate that an optimizer provided as an argument cannot be run by Trainer.

class pyec.trainer.RunStats[source]

A simple stats recording tool that can be used to aggregate the amount of time spent in various methods of an optimizer. Keeps track of variables by key names and outputs the time spent between start and stop for each key. For a recorded key, the average time spent between start and stop can be retrieved by [], like so:

def checkStats():
   stats = RunStats()
   # start the timer
   stats.start("test")
   # ...
   # stop the timer
   stats.stop("test")
   # print just the key "test"
   print stats["test"]
   # print all
   print stats
start(key)[source]

Start recording time for key.

Parameters:key (str) – A name for the key.
stop(key)[source]

Stop recording time for key.

Parameters:key (str) – The previously started key that is to be stopped.
class pyec.trainer.Trainer(fitness, optimizer, **kwargs)[source]

A tool to run a population-based optimizer on a problem.

Parameters:
  • fitness (any callable object) – A function to be maximized; the fitness function or objective.
  • optimizer (PopulationDistribution) – An optimizer to use to optimize the function.
data = []

An int indicating how to group the data.

train()[source]

Run the optimizer for optimizer.config.generations generations, each with population size optimizer.config.populationSize.

Returns:A tuple with two elements containing the best solution found and the maximal fitness (objective value).

Distributions


class pyec.distribution.basic.Distribution(**kwargs)[source]

A Distribution that can be sampled.

Parameters:config (Config) – A set of configuration parameters for the distribution.
batch(sampleSize)[source]

Get a sample from the distribution.

Parameters:sampleSize (int) – The size of the sample to generate.
sample(**kwargs)[source]

Get a single sample from the distribution.

class pyec.distribution.basic.GaussianProposal(*args, **kwargs)[source]

A Gaussian Proposal Distribution. Used as an initial distribution by several algorithms, and as a proposal distribution by Simulated Annealing.

Config parameters:

  • sd – The standard deviation for the Gaussian; may be an ndarray or float.
  • sd_factor – The factor to multiply or divide by when adjusting the standard deviation dynamically.

Parameters:config (Config) – The configuration parameters.
class pyec.distribution.basic.HistoryMapper(**kwargs)[source]

A mixin for finding compatible subhistories for suboptimizers.

mapHistory(sub, history=None)[source]

Find a compatible subhistory for this suboptimizer.

Parameters:sub (PopulationDistribution) – One of the suboptimizers for this convolution
Returns:A compatible History for the suboptimizer
class pyec.distribution.basic.PopulationDistribution(**kwargs)[source]

A distribution governing a population-based optimizer.

This is the parent class for optimizers in PyEC. Its core methods are batch, inherited from Distribution, and update, which reports the population and the scores after a generation is complete.

Config parameters:

  • populationSize – The size of each population; default 1.
  • history – The class of the History to use; default SortedMarkovHistory.
  • space – The Space to search; default None, must be overridden.
  • initial – The initial distribution; a Distribution, a callable object that returns a single solution, or None to use the space’s randomizer.
  • observer – A view or listener; should have a method report() which will be called with this instance and the next population when updated, e.g. self.config.observer.report(self, pop). In this case, pop is a scored population, consisting of a list of (solution, score) tuples.

Parameters:config (Config) – The configuration parameters.
compatible(history)[source]

Check whether the provided history is acceptable for this optimizer.

Parameters:history (A History) – The history object that will be used to track the progress of the algorithm
Returns:bool – whether the history class is acceptable
convert(x)[source]

Convert a point to a scorable representation. Deprecated; use Space.convert instead.

Parameters:x – The candidate solution (genotype)
Returns:The converted candidate (phenotype)
needsScores()[source]

Whether the optimizer uses the scores at all; used to prevent evaluation when not necessary in convolutions.

Returns:bool – whether this optimizer needs scores
run(fitness=None, history=None, extraArgs=None, **kwargs)[source]

Run this PopulationDistribution instance to maximize a fitness function.

After running this method, the property PopulationDistribution.trainer will contain the Trainer object used to optimize the function.

Parameters:
  • fitness (any callable object with a single argument, or None) – The fitness function (objective) to be maximized. If None, then the function will be looked up from pyec.util.registry.BENCHMARKS based on the function property of the Config.
  • history (History or None) – A history object to extend, or None to create a new history from the class in the history property of the Config object.
  • extraArgs (list) – Any extra args to be passed to pyec.util.registry.BENCHMARKS.
Returns:

The History for the run

update(history, fitness)[source]

Update the state of the PopulationDistribution based on the history.

Parameters:
  • history (History) – A History object with sufficient info to update the state of the optimizer
  • fitness (any callable) – A fitness function
class pyec.distribution.basic.PopulationDistributionMeta[source]

Metaclass for optimizers; defines class operations that generate new optimizers before instantiation.

class pyec.distribution.basic.ProposalDistribution(**kwargs)[source]

A proposal distribution for e.g. simulated annealing.

adjust(rate)[source]

Given the current acceptance rate, alter the distribution as needed.

densityRatio(x1, x2, i=None)[source]

Given two points, give the ratio of the densities p(x1) / p(x2).

pyec.distribution.basic.initStandard(cls, init=None)[source]

Standard initialization for population distributions, inserted by PopulationDistributionMeta. Will call the provided initialization method as well.

Parameters:
  • cls (A class of type PopulationDistributionMeta) – The class being initialized
  • init – The existing initialization method as defined in the class; keyword arguments will be placed in a config and merged in with the class level config. IT IS STILL THE RESPONSIBILITY OF THE USER TO CALL super().__init__ IF __init__ IS OVERRIDDEN! This provides a level of automaticity without preventing the user from doing whatever they want.

Evolutionary Computation


pyec.distribution.ec.ga.GeneticAlgorithm

GeneticAlgorithm uses tournament selection over the entire population, uniform crossover, and Bernoulli mutation.

alias of Tournament_convolve_qqqTournament_truncate_1_convolve_qqqCrossoverqqqqqq_convolve_qqqBernoulliqqq_qqq278371753qqq

pyec.distribution.ec.ga.RealGeneticAlgorithm

RealGeneticAlgorithm uses linear ranking selection, uniform crossover, and Gaussian mutation.

alias of Ranking_convolve_qqqRanking_truncate_1_convolve_qqqCrossoverqqqqqq_convolve_qqqGaussianqqq_qqq278371793qqq

pyec.distribution.ec.ga.SimpleGeneticAlgorithm

SimpleGeneticAlgorithm uses proportional selection, one-point crossover, and Bernoulli mutation.

alias of Proportional_convolve_qqqProportional_truncate_1_convolve_qqqCrossover_qqq278371689qqqqqqqqq_convolve_qqqBernoulliqqq_qqq278371781qqq

pyec.distribution.ec.ga.log = <logging.Logger object at 0x1098eea90>

Mainly, these files are just examples of how to make a genetic algorithm work. Genetic algorithms are just convolutions of standard components.


class pyec.distribution.ec.es.Cmaes1996Initial(**kwargs)[source]

Initialize the 1996 version of CMA-ES.

pyec.distribution.ec.es.ES10_100

A (10,100)-ES

alias of EvolutionStrategySelection_qqq278467577qqq_convolve_qqqEndogeneousGaussian_qqq278467581qqqqqq_qqq278456053qqq

pyec.distribution.ec.es.ES1_1

A (1+1)-ES; note the use of the EndogeneousProduct space

alias of EvolutionStrategySelection_qqq278467557qqq_convolve_qqqEndogeneousGaussian_qqq278467365qqqqqq_qqq278467573qqq

pyec.distribution.ec.es.IntermediateCrossES10Slash3_100

A (10/3,100)-ES with intermediate crossover

alias of EvolutionStrategySelection_qqq278366129qqq_convolve_qqqEvolutionStrategySelection_qqq278456037qqq_truncate_1_convolve_qqqEvolutionStrategySelection_qqq278455977qqq_truncate_2_convolve_qqqCrossover_qqq278456057qqqqqqqqqqqq_convolve_qqqEndogeneousGaussian_qqq278455881qqqqqq_qqq278455961qqq

pyec.distribution.ec.es.log = <logging.Logger object at 0x109913dd0>

pyec.distribution.ec.es.makeCmaesConfig(populationSize, mu, dim, sd)[source]

Convenience function to configure the 1996 version of CMA-ES


class pyec.distribution.ec.evoanneal.Annealing(*args, **kwargs)[source]

Base class for annealed selection. See e.g.

Lockett, A. and Miikkulainen, R. Real-space Evolutionary Annealing, 2011.

For a more up-to-date version, see Chapters 11-13 of

<http://nn.cs.utexas.edu/downloads/papers/lockett.thesis.pdf>.

Config parameters (many like simulated annealing):

  • schedule – The cooling schedule to use. May be any callable function, or a string, one of (“log”, “linear”, “discount”). If it is a callable, then it will be passed the updates value in History and should return a floating point value that will divide the exponent in the Boltzmann distribution. That is, schedule(n) should converge to zero as n goes to infinity.
  • learningRate – A divisor for the cooling schedule, used for the built-in schedules “log” and “linear”. As a divisor, it divides the temperature but multiplies the exponent.
  • temp0 – An initial temperature for the temperature decay in “discount”.
  • divisor – A divisor that divides the updates property of History, scaling the rate of decline in the temperature.
  • discount – The decay factor for the built-in discount schedule; temp0 is multiplied by discount once each time the update method is called.
  • separator – A SeparationAlgorithm for partitioning regions.

sample()[source]

Child classes should override this method in order to select a point from the active segment in pyec.util.partitions.Point.

The actual return should be a partition node.

class pyec.distribution.ec.evoanneal.AreaStripper(**kwargs)[source]

Annealing and its descendants pass along a tuple; this class extracts the point for use with other methods.

class pyec.distribution.ec.evoanneal.PartitionHistory(config)[source]

A History that records all points and uses those points to partition the search domain using a Partition object. Also stores a ScoreTree and an AreaTree.

This class is memory intensive, so be careful about convolving or convexly combining it, since each of those combinations may cause multiple copies of this history to be stored in memory.

internalUpdate(population)[source]

Insert all new points into the partition.

class pyec.distribution.ec.evoanneal.ProportionalAnnealing(**kwargs)[source]

Proportional Annealing, as described in

Lockett, A. and Miikkulainen, R. Real-space Evolutionary Annealing (2011).

See Annealing for more details.

Config parameters:

  • taylorCenter – Center for Taylor approximation to annealing densities.
  • taylorDepth – The number of coefficients to use for Taylor approximation of the annealing density.
class pyec.distribution.ec.evoanneal.TournamentAnnealing(**kwargs)[source]

Tournament Annealing, as described in Chapter 11 of

<http://nn.cs.utexas.edu/downloads/papers/lockett.thesis.pdf>

See Annealing for more details.

Evolutionary Operators


class pyec.distribution.ec.selectors.BestSelection(*args, **kwargs)[source]

Select the best member of the prior population.

class pyec.distribution.ec.selectors.Elitist(**kwargs)[source]

Implements elitism by replacing the worst member of the population with the best solution seen so far. If the best solution is a member of the current population, nothing is changed.

Elitism can be applied to an algorithm by piping the algorithm with elitism, i.e.:

ElitistVersion = .2 * Elitist | .8 * SomeAlgorithm

for a new version of SomeAlgorithm that selects the first 20% of the population as the best solutions so far, and fills in the rest of the population with SomeAlgorithm.

class pyec.distribution.ec.selectors.EvolutionStrategySelection(*args, **kwargs)[source]

Implements standard selection for evolution strategies.

The property config.selection determines the type of selection, either “plus” or “comma”. The config should provide \(\mu\); the population size together with \(\mu\) and the selection type determines \(\lambda\).

Config parameters

  • mu – The number of parents to be taken from the prior population
  • selection – Either “plus” selection or “comma”; “comma” is default
class pyec.distribution.ec.selectors.ExponentiatedProportional(**kwargs)[source]
Fitness proportional selection, but with an exponential modulating function so that any fitness values may be used:

\[p(x) \propto \exp\left(\frac{-f(x)}{T}\right)\]

Config parameters:

  • T – A “temperature” that divides the fitness inside the exponent.
class pyec.distribution.ec.selectors.GeneralizedProportional(*args, **kwargs)[source]

Fitness proportional selection with a modulating function.

Config parameters:

  • modulator – A function applied to the fitness; proportional selection is performed on the image of this function. For fitness \(f\), this is \(\text{modulator} \circ f\). The modulator is given the index of the element in the population as its second argument, and the configuration as a third argument; if the third argument causes an exception, only the score and index are passed. That is, the fitness s = f(x) is computed, and then a call is made to modulator(s, i, cfg); if this causes an exception, modulator(s, i) is used.

buildProbabilities(lastPopulation)[source]

Take the last population (with scores) and recompute the sampling vector for proportional selection.

Parameters:lastPopulation (A list of (point, score) tuples) – The last population, of size config.populationSize
class pyec.distribution.ec.selectors.LinearRanker(config)[source]

Computes the probability of selecting a solution from a population using linear ranking selection.

See e.g. <http://www.pohlheim.com/Papers/mpga_gal95/gal2_3.html>

class pyec.distribution.ec.selectors.NonlinearRanker(config)[source]

Computes the probability of selecting a solution from a population using nonlinear ranking selection.

See e.g. <http://www.pohlheim.com/Papers/mpga_gal95/gal2_3.html>

class pyec.distribution.ec.selectors.Proportional(**kwargs)[source]

Fitness proportional selection (roulette wheel selection).

See <http://en.wikipedia.org/wiki/Fitness_proportionate_selection>.

Fitness values must be nonnegative.

This selection method is just GeneralizedProportional with the identity as its modulating function. For example:
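
A roulette wheel in NumPy, for illustration only:

>>> import numpy as np
>>> f = np.array([2.0, 1.0, 7.0])                # nonnegative fitnesses
>>> p = f / f.sum()                              # roulette-wheel probabilities
>>> parents = np.random.choice(len(f), 3, p=p)   # indices of selected parents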

class pyec.distribution.ec.selectors.Ranker[source]

A policy to allocate selection probabilities under ranking selection.

class pyec.distribution.ec.selectors.Ranking(*args, **kwargs)[source]

Ranking selection, e.g. <http://www.pohlheim.com/Papers/mpga_gal95/gal2_3.html>

Takes a ranking function which weights individuals according to rank. Rank is 1 for the lowest and K for the highest in a population of size K.

Note that the ranking function is a class, instantiated with the algorithm’s config. When this method is initialized, the ranking function is instantiated and placed in config.rankerInst, unless config.rankerInst is already provided. (A worked formula for linear ranking is sketched after the parameter list.)

Config parameters

  • pressure – The selection pressure passed to the ranker.

  • ranker – A Ranker subclass for the ranking policy.

  • rankerInst – A Ranker instance for the ranking policy;

    if not provided, then ranker(config) is used.
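
For reference, standard linear ranking (Baker, 1985) allocates probabilities as in the sketch below; whether LinearRanker uses exactly this formula is an assumption here.

import numpy as np

def linear_rank_probs(K, pressure):
    # pressure in [1, 2]; rank 1 is the worst, rank K the best; K >= 2.
    i = np.arange(1, K + 1)
    # Probabilities sum to 1 and grow linearly with rank.
    return (2 - pressure + 2 * (pressure - 1) * (i - 1) / (K - 1)) / K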

class pyec.distribution.ec.selectors.Selection(**kwargs)[source]

A selection method (Abstract class)

class pyec.distribution.ec.selectors.Tournament(*args, **kwargs)[source]

Tournament selection on the entire population.

See <http://en.wikipedia.org/wiki/Tournament_selection>.

This class extends GeneralizedProportional, but only to get the method buildProbabilities (and the particular use of batch). Its modulating function is built-in and is just the rank of the member in the population.

Config parameters:

  • pressure – The probability of choosing the best member of the randomly subsampled population.
  • size – The size of the subsample to select from, or None for tournament selection over the entire population (a sketch of this scheme follows).
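
A sketch of that scheme (illustrative; the PyEC internals work through buildProbabilities as noted above):

import random

def tournament(population, pressure, size=None):
    # population: list of (point, score) pairs; higher score is better.
    pool = random.sample(population, size) if size else list(population)
    pool.sort(key=lambda ind: ind[1], reverse=True)
    for ind in pool:                     # best wins with probability pressure,
        if random.random() < pressure:   # the next with pressure*(1-pressure), ...
            return ind
    return pool[-1]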

class pyec.distribution.ec.mutators.AreaSensitiveBernoulli(**kwargs)[source]

Like AreaSensitiveGaussian but with a Bernoulli.

Because a bit space is finite, there is no need for decay, but exiting the area is still made less likely in order to promote exploration within the selected area.

Config parameters:

  • p – The bit-flipping probability.

class pyec.distribution.ec.mutators.AreaSensitiveGaussian(*args, **kwargs)[source]

Gaussian mutation with a variance that depends on the size of a partition of the space (see pyec.util.partitions.Partition).

Config parameters:

  • jogo2012 – Whether to use the methods from Lockett & Miikkulainen, 2012.
  • decay – A function decay(n, config) to compute the multiplier that controls the rate of decrease in standard deviation. Faster decay causes faster convergence, but may miss the optimum. Default is 1/generations.

Standard deviation is determined by:

sd = .5 * (upper - lower) * decay(n)

where upper and lower bound the partition region for the object being mutated (the dimension of the space is given by config.dim).

mutate(x)[source]

Apply area-sensitive Gaussian mutation. Assumes the passed value is a Point object.

Parameters:x (Point) – The point to mutate
Returns:A numpy.ndarray containing a mutated version of the point property of x.
class pyec.distribution.ec.mutators.Bernoulli(*args, **kwargs)[source]

Bernoulli mutation on a TernaryString object. Randomly flips bits on a fully specified binary string.

Config parameters:

  • p – The probability of flipping each bit.

mutate(x)[source]

This method uses an iterative algorithm to flip bits in a TernaryString with a given probability. The algorithm essentially uses the binary representation of the bit-flipping probability to convert sampling of fair random bits (p=.5) into sampling at an arbitrary probability, with a resolution of 1e-16 (see the sketch below).
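
A sketch of that trick: compare a uniform variate against p one binary digit at a time, deciding a whole word of bits in parallel (illustrative, not the PyEC source):

import random

def bernoulli_word(p, nbits, resolution=53):
    # Returns an int whose bits are each 1 with probability ~p, using
    # only fair random bits; 53 digits gives roughly 1e-16 resolution.
    result, undecided, frac = 0, (1 << nbits) - 1, p
    for _ in range(resolution):
        if not undecided:
            break
        frac *= 2
        digit, frac = (1, frac - 1) if frac >= 1 else (0, frac)
        r = random.getrandbits(nbits)   # one fair digit per position
        if digit:                       # p-digit 1: positions whose uniform
            result |= undecided & ~r    # digit is 0 are decided as "flip"
            undecided &= r
        else:                           # p-digit 0: positions whose uniform
            undecided &= ~r             # digit is 1 are decided as "no flip"
    return result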

class pyec.distribution.ec.mutators.Cauchy(*args, **kwargs)[source]

Cauchy mutation.

Config parameters:

  • sd – The width parameter of the Cauchy (pi).
  • p – The probability of mutating each component; 1.0 by default.

class pyec.distribution.ec.mutators.CorrelatedEndogeneousGaussian(*args, **kwargs)[source]

Implements an old (1996) version of mutation for the Covariance Matrix Adaptation method of Hansen and Ostermeier.

This method is obsolete and is very different from the modern version.

Note: this class needs a space that can handle pulling out the significant dimensions; see EndogeneousProduct.

The 1996 paper says to set cmaCumulation to $\frac{1}{\sqrt{n}}$, cmaDamping to $\frac{1}{n}$, and cmaCorrelation to $\frac{2}{n^2}$.

The defaults assume a run of 1000 generations, i.e. n=1000.

Config Parameters:

  • sd
  • cmaCumulation
  • cmaDamping
  • cmaCorrelation
pack(mat)[source]

Take an N x N matrix and pack it into an array of length N(N+1)/2.

unpack(sig)[source]

Take an array of length N(N+1)/2 and unpack it into an N x N matrix.

class pyec.distribution.ec.mutators.Crosser(config)[source]

Base class to perform recombination on a given list of organisms.

Parameters:config (Config) – The configuration parameters.
class pyec.distribution.ec.mutators.Crossover(*args, **kwargs)[source]

Performs recombination using a crossover policy given by a Crosser instance.

Config parameters

  • crossoverProb – A probability that is sampled; if a Bernoulli test on this value fails, then no crossover is performed and the first selected organism is chosen.

  • order – The number of organisms to crossover at each step

  • crosser – A Crosser to perform the crossover

class pyec.distribution.ec.mutators.DecayedBernoulli(**kwargs)[source]

Like DecayedGaussian, but with a Bernoulli distribution instead of a Gaussian; the bit flipping probability decays to zero.

Config parameters:

  • sd – A scale factor on the standard deviation.
  • decay – A multiplicative decay factor.
  • decayExp – An exponential decay factor.

The bit-flipping probability is determined by:

p = p0 * exp(-(generation * decay) ** decayExp)

class pyec.distribution.ec.mutators.DecayedGaussian(**kwargs)[source]

Gaussian mutation with a variance that decreases over time according to fixed formula.

Config parameters:

  • sd – A scale factor on the standard deviation.
  • decay – A multiplicative decay factor.
  • decayExp – An exponential decay factor.

Standard deviation is determined by:

sd = sd0 * exp(-(generation * decay) ** decayExp)
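
To make the schedule concrete:

>>> from math import exp
>>> sd0, decay, decayExp = 1.0, 0.1, 2.0
>>> [round(sd0 * exp(-(g * decay) ** decayExp), 4) for g in (0, 10, 20, 30)]
[1.0, 0.3679, 0.0183, 0.0001]
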
class pyec.distribution.ec.mutators.DominantCrosser(config)[source]

Cross multiple organisms using generalized uniform crossover.

class pyec.distribution.ec.mutators.EndogeneousGaussian(*args, **kwargs)[source]

Gaussian mutation for evolution strategies, where the adaptive mutation parameters are encoded with the organism.

Note: this class needs a space that can handle pulling out the significant dimensions.

Config parameters:

  • sd – The standard deviation to start with.
  • baseDim – The portion of the genotype encoding the point, i.e. the number of dimensions being searched.

class pyec.distribution.ec.mutators.Gaussian(*args, **kwargs)[source]

Gaussian mutation. Randomly mutates a random selection of components of a real vector.

Config parameters:

  • sd – The standard deviation to use.
  • p – The probability of mutating each component; defaults to 1.0.

class pyec.distribution.ec.mutators.IntermediateCrosser(config)[source]

Cross multiple organisms by averaging their values component-wise.

Normally used by evolution strategies.

class pyec.distribution.ec.mutators.Mutation(**kwargs)[source]

Base class for mutation operators. Provides a method mutate that performs mutation on a single member of a population.

mutate(x)[source]

Perform mutation on a single organism.

Parameters:x (varied) – The organism to mutate.
Returns:The mutated organism.
class pyec.distribution.ec.mutators.OnePointCrosser(config)[source]

Cross two organisms at one point and return only one crossed version.

Unlike OnePointDualCrosser, this discards the second variant.

See <http://en.wikipedia.org/wiki/Crossover_(genetic_algorithm)#One-point_crossover>.

class pyec.distribution.ec.mutators.OnePointDualCrosser(config)[source]

One point crossover, mixes two individuals to create two new individuals by splicing them together at a random index.

So:

xxxxxx
oooooo

might become:

xxoooo
ooxxxx

And both items are returned.

See <http://en.wikipedia.org/wiki/Crossover_(genetic_algorithm)#One-point_crossover>.
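
A minimal sketch of the operation on sequences (illustrative, not the PyEC implementation):

>>> import random
>>> def one_point_dual(a, b):
...     i = random.randrange(1, len(a))      # splice index
...     return a[:i] + b[i:], b[:i] + a[i:]
...
>>> one_point_dual("xxxxxx", "oooooo")       # e.g. when i == 2
('xxoooo', 'ooxxxx')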

class pyec.distribution.ec.mutators.TwoPointCrosser(config)[source]

Cross two organisms by wrapping them into a circle, picking two indices, and combining the two rings.

See <http://en.wikipedia.org/wiki/Crossover_(genetic_algorithm)#Two-point_crossover>.

class pyec.distribution.ec.mutators.UniformArea(**kwargs)[source]

Uniform mutation within a partition region. The region is determined by the organism being mutated.

Parameters:config (Config) – The configuration parameters.
mutate(x)[source]

Apply uniform area mutation. Assumes the passed value is a Point object.

Parameters:x (Point) – The point to mutate
Returns:A numpy.ndarray containing a mutated version of the point property of x.
class pyec.distribution.ec.mutators.UniformCrosser(config)[source]

Uniform crossover.

See <http://en.wikipedia.org/wiki/Crossover_(genetic_algorithm)#Uniform_Crossover_and_Half_Uniform_Crossover>

Natural Computation

class pyec.distribution.de.DifferentialEvolution(**kwargs)[source]

Implements Differential Evolution (DE) as described by:

Storn, Rainer and Price, Kenneth. Differential Evolution – A Simple and Efficient Adaptive Scheme for Global Optimization over Continuous Spaces. 1995.

See <http://en.wikipedia.org/wiki/Differential_evolution> for algorithm details.

See <http://www.hvass-labs.org/people/magnus/publications/pedersen10good-de.pdf> for a good discussion of parameter settings for DE.

Config parameters:

  • CR – crossover probability (default .2)
  • F – the learning rate (default .5)

Other defaults:

  • history – LocalMinimumHistory
  • space – Euclidean
  • populationSize – 100
  • initial – None
Parameters:cfg (Config) – The configuration object for Differential Evolution.
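
The update that CR and F control is the classic DE/rand/1/bin rule; whether PyEC uses exactly this variant is an assumption. A plain-NumPy sketch, not the PyEC internals:

import numpy as np

def de_candidate(pop, i, F=.5, CR=.2):
    # Donor from three distinct other members, then binomial crossover.
    others = [j for j in range(len(pop)) if j != i]
    a, b, c = (pop[j] for j in np.random.choice(others, 3, replace=False))
    donor = a + F * (b - c)
    cross = np.random.rand(len(donor)) < CR
    cross[np.random.randint(len(donor))] = True  # keep at least one donor gene
    return np.where(cross, donor, pop[i])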

class pyec.distribution.cmaes.Cmaes(*args, **kwargs)[source]

The Covariance Matrix Adaptation Evolution Strategy (CMA-ES) algorithm as described by:

Hansen, N. and Ostermeier, A. Completely derandomized self-adaptation in evolution strategies. In Evolutionary Computation 9(2), pp 159–195, 2001.

See <http://en.wikipedia.org/wiki/CMA-ES> for details.

A fast-converging population-based optimizer that finds reasonably good solutions with relatively few resources. In general, each population is sampled from a multi-variate Gaussian whose parameters are altered to optimize the probability of finding good solutions.

Akimoto (2010) proved that CMA-ES is an instance of Natural Evolution Strategies (Wierstra, 2008), which implements a gradient-following method at the population level.

Config parameters:

  • mu – The number of solutions from each generation used as parents for the next; by default equal to half of Config.populationSize.
  • restart – Whether to restart when the distribution collapses; defaults to True (see e.g. Auger and Hansen, 2005).

Parameters:cfg (Config) – The configuration object for CMA-ES.
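
As a caricature of the population-level structure (sample a Gaussian, keep the best mu, re-fit): real CMA-ES additionally adapts the covariance through evolution paths, which this sketch omits entirely.

import numpy as np

def gaussian_es_step(f, mean, cov, lam=20, mu=10):
    X = np.random.multivariate_normal(mean, cov, size=lam)
    best = X[np.argsort([f(x) for x in X])[:mu]]  # keep the mu best (minimizing)
    return best.mean(axis=0), np.cov(best.T) + 1e-8 * np.eye(len(mean))
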
class pyec.distribution.cmaes.CmaesHistory(config)[source]

A history that stores the parameters for CMA-ES

class pyec.distribution.pso.PSOHistory(config)[source]

A History for Particle Swarm Optimization. Remembers the local best positions and the velocities.

class pyec.distribution.pso.ParticleSwarmOptimization(*args, **kwargs)[source]

Particle Swarm Optimization.

Config parameters

  • omega – The decay factor for velocities
  • phig – The global best component in velocity update
  • phip – The local best component in velocity update
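
These parameters enter the velocity update; assuming PyEC follows the standard formulation, it reads:

import numpy as np

def update_velocity(v, x, pbest, gbest, omega, phip, phig):
    # rp, rg are fresh uniform randoms per component.
    rp, rg = np.random.rand(len(x)), np.random.rand(len(x))
    return omega * v + phip * rp * (pbest - x) + phig * rg * (gbest - x)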

class pyec.distribution.eda.Boa(*args, **kwargs)[source]

The Bayesian Optimization Algorithm of Martin Pelikan.

class pyec.distribution.eda.RBoa(**kwargs)[source]

The Real-coded Boa algorithm of Ahn et al.

Direct Search

class pyec.distribution.neldermead.NelderMead(*args, **kwargs)[source]

The Nelder-Mead method of optimization for real spaces.

class pyec.distribution.neldermead.NelderMeadHistory(config)[source]

A history that saves the state for the Nelder-Mead algorithm

class pyec.distribution.gss.GeneratingSetSearch(*args, **kwargs)[source]

Based on Kolda, Lewis, and Torczon, Optimization by Direct Search: New Perspectives on Some Classical and Modern Methods (2003)

Primarily implemented to enable MADS (Audet and Dennis, 2006)

Bayesian Networks

class pyec.distribution.bayes.structure.proposal.StructureProposal(*args, **kwargs)[source]

A proposal distribution for structure searching with a heuristic algorithm such as simulated annealing.

Config parameters:

  • branchFactor – The maximum number of parents per variable.
  • remove_edge_prob – A callable object that takes a network and returns a probability of removing a random edge.
  • add_edge_prob – A callable object that takes a network and returns a probability of adding a random edge.
  • reverse_edge_prob – A callable object that takes a network and returns a probability of reversing a random edge.

adjust(acceptanceRatio)[source]

Called by simulated annealing to update generation statistics

densityRatio(x, y, i)[source]

Called by simulated annealing to adjust for asymmetric densities. The density governing this proposal should be symmetric, though this claim still needs to be verified.

Boltzmann Machines

Utilities

class pyec.util.TernaryString.TernaryString(base, known, length=10000)[source]

A ternary string with three values: True, False, and Unknown

Parameters:
  • base – An object whose bits are treated as a bit string
  • known – An object whose bits determine whether the value at that bit is known in base
  • length (int) – The maximum number of bits allowed
classmethod bernoulli(p=0.5, length=1)[source]

This method uses an iterative algorithm to produce bits that are 1 with a given probability. The algorithm essentially uses the binary representation of that probability to convert sampling of fair random bits (p=.5) into sampling at an arbitrary probability, with a resolution of 1e-16.

Parameters:
  • p (float or numpy.ndarray with length length) – The probability of flipping bits. May be an array with length equal to length so that different positions have different bit flipping probabilities.
  • length (int) – The number of bits to produce
Returns:

A TernaryString with the specified length that is 1 in each position with probability p
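
For example, using the signature above:

>>> from pyec.util.TernaryString import TernaryString
>>> flips = TernaryString.bernoulli(p=0.25, length=16)  # each bit 1 w.p. .25
>>> fair = TernaryString.bernoulli()                    # one fair bit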

distance(x)[source]

Compute the Hamming distance between this string and x.

pyec.util.TernaryString.binary(x, digits=10)[source]

Convert a number to its binary representation.

Parameters:
  • x (int or long) – The number to convert
  • digits (int) – The number of binary digits to convert, from right to left
Returns:

A string of zeros and ones with the least significant bits at the left.
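
For example, since the least significant bits come first, the call below should produce the string shown:

>>> from pyec.util.TernaryString import binary
>>> binary(5, digits=8)   # 5 is 101 in binary, reversed and padded
'10100000'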

class pyec.util.partitions.BayesSeparationAlgorithm(config)[source]

Separation algorithm for Bayesian networks based on structure.

class pyec.util.partitions.LayeredSeparationAlgorithm(config)[source]

Used to separate spaces defined by a layer hierarchy; written for neural networks initially. Assumes that the space is divided into layered strata; see LayeredSpace. The bottom layer is separated differently from the higher layers, and so the config should contain a property named “secondary_separator” that can be used to separate points in the lowest rung of the hierarchy.

class pyec.util.partitions.LongestSideVectorSeparationAlgorithm(config)[source]

Like VectorSeparationAlgorithm, except that it always partitions the longest side. This can be helpful to guarantee that the longest side length of any partition hyperrectangle is reduced evenly.
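
A sketch of the longest-side rule on a single hyperrectangle (illustrative; the PyEC version works through the partition tree):

import numpy as np

def split_longest_side(lower, upper, point):
    # Split the box [lower, upper] on its longest axis at the new point.
    axis = int(np.argmax(upper - lower))
    left_upper, right_lower = upper.copy(), lower.copy()
    left_upper[axis] = right_lower[axis] = point[axis]
    return (lower, left_upper), (right_lower, upper)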

class pyec.util.partitions.SeparationAlgorithm(config)[source]

An algorithm that enables traversal and separation of the partition tree. This class abstracts the space-dependent portions of the partitioning logic.

Parameters:config (Config) – Configuration parameters
compatible(space)[source]

Check whether this partitioning method is compatible with a given space

Parameters:space (Space) – The space to check
separate(tree, point)[source]

Insert the point into the partition tree, separating the current partition that contains it.

Parameters:
  • tree (Partition) – The partition tree
  • point (config.space.type) – The point to insert
  • config (Config) – A Config object
  • stats (RunStats) – A RunStats object
split(point, node)[source]

Do the actual work to partition a region.

Parameters:
  • point (config.space.type) – The point that resulted in the split
  • node (PartitionTreeNode) – A node from the partition tree corresponding to a region of the space
Throws:

SeparationException – when separation of the node fails

class pyec.util.benchmark.mix(code, *args)[source]

Take a convex combination of two benchmarks, e.g.

.2_shekel2_langerman

Only works if the benchmarks have the same center and scale.
