Optimization Beyond Gradients

Alan J. Lockett's Research Pages

What This Site is About

[Image: Isaac Newton]

Optimization Beyond Gradients is about what to do when you need to optimize functions of a post-Newtonian character. Derivatives were the cutting edge of mathematics back in the 1700s, when the mere suggestion of non-analytic functions was enough to make any Oxford don shake in his stockings and ruffle his powdered coif. Nowadays, functions whose optima can be found analytically are a rare treat. 

In research disciplines such as machine learning or robotics, the functions being optimized may be non-smooth, non-convex, multimodal, discontinuous, or discrete. They might operate on binary codes, graphs, computer programs, or even neural networks. In these settings, taking derivatives may not be the best way to solve your problem, if the derivatives even exist. A wide variety of non-gradient-based optimization methods can be used in these situations. In the mathematical optimization community, these methods are referred to as heuristic search. The term is a misnomer: algorithms like differential evolution or evolutionary annealing have been proven to converge to the global optimum under certain conditions. These methods are no more heuristic than gradient descent, which can fail catastrophically on functions such as the Weierstrass function, which is continuous everywhere yet differentiable nowhere. By contrast, evolutionary annealing can be configured to locate the optima of the Weierstrass function reliably.

[Image: Weierstrass function]
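
To make the contrast concrete, here is a minimal sketch of optimizing a truncated Weierstrass series with a derivative-free method. It uses SciPy's differential evolution routine rather than evolutionary annealing, and the parameters a, b, the number of terms, and the search bounds are illustrative assumptions, not settings taken from this site's research.

    # Minimal sketch: minimize a truncated Weierstrass series without derivatives.
    # Uses SciPy's differential evolution, not evolutionary annealing; all
    # parameters below are illustrative choices.
    import numpy as np
    from scipy.optimize import differential_evolution

    def weierstrass(x, a=0.5, b=3.0, terms=25):
        # Partial sum of W(x) = sum_n a^n cos(b^n pi x); the infinite series is
        # continuous everywhere but differentiable nowhere.
        n = np.arange(terms)
        return float(np.sum(a ** n * np.cos(b ** n * np.pi * x)))

    # Differential evolution only queries function values, never gradients.
    result = differential_evolution(lambda v: weierstrass(v[0]),
                                    bounds=[(-2.0, 2.0)], seed=0)
    print(result.x, result.fun)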

This site is about non-gradient methods: how to define them, how to configure them, and how to determine the best optimization method for your problem. In the Research section, you'll find a discussion of how optimization methods form a vector space, how optimization problems can be modeled as a distribution over functions, and how the best optimization method for your problem can be derived from the interplay of these two features. There are also pages describing Martingale Optimization, especially Evolutionary Annealing, a new optimization method for non-smooth, multimodal functions.
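
As a rough illustration of the second idea, the sketch below scores two derivative-free optimizers by their average result over randomly sampled test problems. This is only a Monte Carlo stand-in for "expected performance under a distribution over functions"; the shifted Rastrigin-style problem family, the choice of optimizers, and the trial count are assumptions made for this example, not the analysis developed in the Research section.

    # Sketch: treat "the problem" as a distribution over functions and score an
    # optimizer by its expected outcome under that distribution. The problem
    # family and optimizers here are illustrative assumptions.
    import numpy as np
    from scipy.optimize import differential_evolution, minimize

    rng = np.random.default_rng(0)

    def sample_problem():
        # Draw a randomly shifted, Rastrigin-style multimodal function.
        shift = rng.uniform(-2.0, 2.0, size=2)
        return lambda x: float(np.sum((x - shift) ** 2
                                      - np.cos(3 * np.pi * (x - shift)) + 1.0))

    def expected_result(optimizer, trials=30):
        # Monte Carlo estimate of the optimizer's mean final value.
        return np.mean([optimizer(sample_problem()) for _ in range(trials)])

    run_de = lambda f: differential_evolution(f, [(-5.0, 5.0)] * 2, seed=1).fun
    run_nm = lambda f: minimize(f, x0=np.zeros(2), method="Nelder-Mead").fun

    print("differential evolution:", expected_result(run_de))
    print("Nelder-Mead:           ", expected_result(run_nm))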

The Software section contains links to open-source software that implements the methods and ideas discussed on this site, especially PyEC, a software package for evolutionary optimization in the Python language. Links to published research papers are provided on the Publications page, and you can find out more about who I am on the About page.

Enjoy the site! Send any comments or contact requests to Alan Lockett at alan.lockett@gmail.com.

About Me

[Photo: Alan J. Lockett]

My research focuses on how to optimize efficiently in complex spaces, especially for training deep neural networks for robotic control. I have a Ph.D. from the University of Texas. I recently received an NSF grant to study at IDSIA in Switzerland starting in summer 2013. See my About page for contact information and more.