The goal of numerical optimization is to find the values of a function’s input variables that minimize (or maximize) the function. There are many calculus-based deterministic techniques available; however, some difficult problems, especially in the areas of machine learning and artificial intelligence, can’t be solved easily using classical optimization techniques. In such situations, alternatives such as amoeba method optimization can be of value. Amoeba method optimization, which I’ll discuss in this article, is similar in some respects to particle swarm optimization, which I described in the August 2011 issue of MSDN Magazine (msdn.microsoft.com/magazine/hh335067).

The best way to understand what amoeba method optimization is and to see where I’m headed is to examine **Figure 1**, which shows a demo program in action. The goal in this program is to find the values of x and y that minimize a relatively simple standard benchmark problem called Rosenbrock’s function. This function has a known solution at x = 1.0 and y = 1.0, where the value of the function is 0.0.

**Figure 1 Amoeba Method Optimization Demo**

The demo program creates a virtual amoeba that contains three random potential solutions. The best of these initial solutions isn’t very good, at x = -0.66 and y = 5.43, yielding a function value of 2499.52. The demo program calls a solve method and, behind the scenes, it uses the amoeba method to iteratively find better and better solutions. After 50 iterations, the algorithm succeeds in finding the optimal solution.

In the sections that follow I present and explain the complete source code for the demo program. The code is available at archive.msdn.microsoft.com/mag201306TestRun. This article assumes you have at least intermediate-level programming skills with a modern procedural language. I coded the demo using C# but you shouldn’t have too much trouble refactoring the demo to another language, such as Visual Basic .NET or Python.

## The Amoeba Method Algorithm

The amoeba method optimization algorithm I present here is based on the 1965 research paper, “A Simplex Method for Function Minimization,” by J.A. Nelder and R. Mead.

The key parts of this algorithm are illustrated in **Figure 2**. At any point in time, there are several possible solutions. In most cases, amoeba optimization uses three solutions, colored red in the figure. Each solution has an associated objective function value, so there will be a worst solution (the highest function value because the goal is to minimize), a best solution (the smallest function value) and other(s).

**Figure 2 Amoeba Optimization Primitive Operations**

The algorithm is an iterative process. At each step, it attempts to replace the worst solution with a new, better solution from among three candidates: a reflected point, an expanded point and a contracted point. Each of these candidates lies along a line from the worst point through the centroid—a point that’s in the middle of all points except the worst point. In the usual case with three solutions, the centroid will lie halfway between the best point and the other (non-worst) point.
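In code, the three candidate points are simple vector arithmetic. The following Python sketch (illustrative only; the demo program itself is C#) uses the reflection, expansion and contraction coefficients alpha, gamma and beta that appear later in the Amoeba class:

```python
def reflected(c, w, alpha=1.0):
    # r = c + alpha * (c - w): move away from the worst point w, past the centroid c
    return [(1 + alpha) * cj - alpha * wj for cj, wj in zip(c, w)]

def expanded(r, c, gamma=2.0):
    # e = c + gamma * (r - c): push even farther along the reflection direction
    return [gamma * rj + (1 - gamma) * cj for rj, cj in zip(r, c)]

def contracted(c, w, beta=0.5):
    # v = c + beta * (w - c): pull back partway toward the worst point
    return [beta * wj + (1 - beta) * cj for wj, cj in zip(w, c)]

w = [0.0, 0.0]          # Worst point
c = [2.0, 2.0]          # Centroid of the remaining points
r = reflected(c, w)     # [4.0, 4.0]
e = expanded(r, c)      # [6.0, 6.0]
v = contracted(c, w)    # [1.0, 1.0]
```

Notice that all three candidates lie on the line through the worst point and the centroid; only the distance along that line differs.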

If neither the reflected point, nor the expanded point, nor the contracted point is better than the current worst solution, the amoeba shrinks itself by moving all points, except for the best point, halfway toward the best point. Some research papers call this process multiple contraction.

When graphed over time, if the three current solution points are connected by a line (as with the black dashed line in **Figure 2**), the solutions form a triangle, and their movement resembles that of an amoeba crawling through its environment. In mathematical terms, a triangle on a plane is called a simplex, so this optimization algorithm, in addition to being called the amoeba method or the Nelder-Mead method, is sometimes called the simplex method.

## Overall Program Structure

I coded the amoeba optimization demo program as a single C# console application. I used Visual Studio 2010 (any version of Visual Studio should work) and created a new project named AmoebaOptimization. After the project loaded, in the Solution Explorer window I renamed file Program.cs to the more descriptive AmoebaProgram.cs, which automatically renamed class Program. I deleted all unneeded template-generated using statements except for the single statement that references the top-level System namespace.

The entire program structure, with some comments and WriteLine statements removed, is listed in **Figure 3**.

**Figure 3 Amoeba Optimization Program Structure**

```csharp
using System;
namespace AmoebaOptimization
{
  class AmoebaProgram
  {
    static void Main(string[] args)
    {
      try
      {
        Console.WriteLine("\nBegin amoeba method optimization demo\n");
        int dim = 2;         // Problem dimension
        int amoebaSize = 3;  // Number of potential solutions
        double minX = -10.0;
        double maxX = 10.0;
        int maxLoop = 50;

        Console.WriteLine("Creating amoeba with size = " + amoebaSize);
        Amoeba a = new Amoeba(amoebaSize, dim, minX, maxX, maxLoop);
        Console.WriteLine("\nInitial amoeba is: \n");
        Console.WriteLine(a.ToString());

        Solution sln = a.Solve();

        Console.WriteLine("Final amoeba is: \n");
        Console.WriteLine(a.ToString());
        Console.WriteLine("\nBest solution found: \n");
        Console.WriteLine(sln.ToString());
        Console.WriteLine("\nEnd amoeba method optimization demo\n");
      }
      catch (Exception ex)
      {
        Console.WriteLine(ex.Message);
      }
    }

    public static double ObjectiveFunction(double[] vector, object dataSource) { . . }
  }

  public class Solution : IComparable<Solution> { . . }

  public class Amoeba { . . }
}
```

## The Objective Function

Amoeba method optimization is most often used to solve a numerical minimization problem. The function to minimize is generally called a cost function or objective function. The demo program in **Figure 1** solves a standard mathematical benchmark problem called Rosenbrock’s function. The function has two input variables, x and y, and is defined as f(x, y) = 100 * (y - x^2)^2 + (1 - x)^2. The function has a known solution of x = 1.0, y = 1.0, which gives a value of 0.0. **Figure 4** shows a three-dimensional plot of Rosenbrock’s function.

**Figure 4 Graph of the Objective Function**
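Rosenbrock’s function is easy to verify directly. This short Python sketch (an illustrative translation; the demo itself is C#) evaluates the function and confirms the known minimum:

```python
def rosenbrock(x, y):
    """Rosenbrock's benchmark function; minimum value 0.0 at x = 1.0, y = 1.0."""
    return 100.0 * (y - x * x) ** 2 + (1.0 - x) ** 2

print(rosenbrock(1.0, 1.0))  # 0.0 -- the known solution
print(rosenbrock(0.0, 0.0))  # 1.0
```

The narrow, curved valley visible in the plot is what makes the function a useful benchmark: many optimization algorithms find the valley quickly but then crawl slowly along it toward (1.0, 1.0).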

In real-world situations, amoeba optimization is used to find the solution to difficult problems that are based on data. For example, suppose you’re trying to predict stock market prices. You might come up with a list of factors you believe are predictors, and an equation, but you need to determine a set of numeric constants for the equation that minimize the error on a set of training data that has known results.

The objective function for the demo program is defined in the main program class as:

```csharp
public static double ObjectiveFunction(double[] vector, object dataSource)
{
  double x = vector[0];
  double y = vector[1];
  return 100.0 * Math.Pow((y - x * x), 2) + Math.Pow(1 - x, 2);
}
```

I use a dummy input parameter named dataSource to indicate that in most situations the objective function depends on some external data source, such as a text file or SQL table. Because the function is declared using the public and static modifiers, the function is visible to all code in the demo program.

## The Solution Class

Amoeba optimization maintains a collection of possible solutions, defined in a Solution class:

```csharp
public class Solution : IComparable<Solution>
{
  public double[] vector;
  public double value;
  static Random random = new Random(1);

  public Solution(int dim, double minX, double maxX) { . . }
  public Solution(double[] vector) { . . }
  public int CompareTo(Solution other) { . . }
  public override string ToString() { . . }
}
```

The class implements the generic IComparable&lt;Solution&gt; interface so that Solution objects can be automatically sorted. A Solution object has just two significant fields: one is an array of double named vector that holds the numeric values of the solution, and the other is the value of the objective function at that point. I use public-scope member fields and remove all error checking for simplicity. The static Random object allows the code to generate random solutions.

The first Solution constructor creates a random solution:

```csharp
public Solution(int dim, double minX, double maxX)
{
  this.vector = new double[dim];
  for (int i = 0; i < dim; ++i)
    this.vector[i] = (maxX - minX) * random.NextDouble() + minX;
  this.value = AmoebaProgram.ObjectiveFunction(this.vector, null);
}
```

The constructor accepts a problem dimension, and limits for each vector component. The dimension for the demo program is 2 because Rosenbrock’s function has two input variables, x and y. After allocating space for member field vector, the constructor assigns random values between minX and maxX to the vector array, and then calls the globally accessible objective function to compute the value field.

The second Solution constructor creates a solution from a specified array of double:

```csharp
public Solution(double[] vector)
{
  this.vector = new double[vector.Length];
  Array.Copy(vector, this.vector, vector.Length);
  this.value = AmoebaProgram.ObjectiveFunction(this.vector, null);
}
```

Because the Solution class implements the IComparable&lt;Solution&gt; interface, the class must implement a CompareTo method. CompareTo is defined so that Solution objects will be automatically sorted from best (smaller) to worst (larger) values of the objective function:

```csharp
public int CompareTo(Solution other)
{
  if (this.value < other.value) return -1;
  else if (this.value > other.value) return 1;
  else return 0;
}
```

For visualization and debugging purposes, the Solution class defines a simple ToString method built with string concatenation; its listing is omitted here to save space.

## The Amoeba Class

The Amoeba class is essentially an array of Solution objects plus a Solve method that uses the amoeba method algorithm. The structure of the Amoeba class is listed in **Figure 5**.

**Figure 5 The Amoeba Class**

```csharp
public class Amoeba
{
  public int amoebaSize;        // Number of solutions
  public int dim;               // Problem dimension
  public Solution[] solutions;  // Potential solutions (vector + value)
  public double minX;
  public double maxX;
  public double alpha;          // Reflection
  public double beta;           // Contraction
  public double gamma;          // Expansion
  public int maxLoop;           // Limits main solving loop

  public Amoeba(int amoebaSize, int dim, double minX, double maxX, int maxLoop) { . . }
  public Solution Centroid() { . . }
  public Solution Reflected(Solution centroid) { . . }
  public Solution Expanded(Solution reflected, Solution centroid) { . . }
  public Solution Contracted(Solution centroid) { . . }
  public void Shrink() { . . }
  public void ReplaceWorst(Solution newSolution) { . . }
  public bool IsWorseThanAllButWorst(Solution reflected) { . . }
  public Solution Solve() { . . }
  public override string ToString() { . . }
}
```

I declare all fields and methods using public scope for simplicity and for easier debugging during development. The amoebaSize field specifies the number of potential solutions in the Amoeba object. By far the most common value is 3, but you may want to experiment with larger values. The dim field represents the number of variables in the objective function that must be solved for—two in the case of Rosenbrock’s function.

Array solutions holds the potential Solution objects. Although it’s not clear from the declaration, array solutions must be sorted at all times, from best solution (smallest value field) to worst solution. Fields minX and maxX constrain the initial values in each Solution object. These values will vary from problem to problem.

Fields alpha, beta and gamma are constants that are used by helper methods called by the Solve method. Field maxLoop limits the number of iterations of the processing loop in Solve.

The single Amoeba constructor creates an array of amoebaSize Solution objects, each of which has a vector field of size dim. All of the work is performed by method Solve; all the other methods in class Amoeba are helper methods.

The Amoeba constructor is defined as:

```csharp
public Amoeba(int amoebaSize, int dim, double minX, double maxX, int maxLoop)
{
  this.amoebaSize = amoebaSize;
  this.dim = dim;
  this.minX = minX;
  this.maxX = maxX;
  this.alpha = 1.0;
  this.beta = 0.5;
  this.gamma = 2.0;
  this.maxLoop = maxLoop;
  this.solutions = new Solution[amoebaSize];
  for (int i = 0; i < solutions.Length; ++i)
    solutions[i] = new Solution(dim, minX, maxX);
  Array.Sort(solutions);
}
```

Fields alpha, beta and gamma control the behavior of the Solve method and are assigned hardcoded values of 1.0, 0.5 and 2.0, respectively. Research has shown that these values generally give good results, but you might want to experiment. After array solutions has been allocated, a random Solution object is assigned to each cell. The Array.Sort method sorts the solutions from best value to worst.

The Amoeba class has a simple ToString method for visualization and easier debugging; its listing is omitted to save space.

## The Algorithm Primitives

A key aspect of the amoeba optimization algorithm is that the current worst solution is replaced—if it leads to a better set of solutions—by a so-called reflected point, expanded point or contracted point.

Amoeba class helper method Centroid creates a Solution object that is in some sense a middle solution between all solutions in the amoeba except for the worst solution (the worst solution is the one with the largest solution value because the goal is to minimize the objective function, and it will be located at index amoebaSize-1):

```csharp
public Solution Centroid()
{
  double[] c = new double[dim];
  for (int i = 0; i < amoebaSize - 1; ++i)
    for (int j = 0; j < dim; ++j)
      c[j] += solutions[i].vector[j];  // Accumulate sum of each component
  for (int j = 0; j < dim; ++j)
    c[j] = c[j] / (amoebaSize - 1);
  Solution s = new Solution(c);
  return s;
}
```

Helper method Reflected creates a Solution object that’s in the general direction of better solutions. Constant alpha, typically set to 1.0, controls how far away from the centroid to move to yield the reflected solution. Larger values of alpha generate reflected points that are farther from the centroid:

```csharp
public Solution Reflected(Solution centroid)
{
  double[] r = new double[dim];
  double[] worst = this.solutions[amoebaSize - 1].vector;  // Convenience only
  for (int j = 0; j < dim; ++j)
    r[j] = ((1 + alpha) * centroid.vector[j]) - (alpha * worst[j]);
  Solution s = new Solution(r);
  return s;
}
```

Helper method Expanded creates a Solution object that’s even farther from the centroid than the reflected solution. Constant gamma, typically set to 2.0, controls how far the expanded point is from the centroid:

```csharp
public Solution Expanded(Solution reflected, Solution centroid)
{
  double[] e = new double[dim];
  for (int j = 0; j < dim; ++j)
    e[j] = (gamma * reflected.vector[j]) + ((1 - gamma) * centroid.vector[j]);
  Solution s = new Solution(e);
  return s;
}
```

Helper method Contracted creates a Solution object that’s roughly in between the worst solution and the centroid. Constant beta, typically set to 0.50, controls how close to the worst solution the contracted point is:

```csharp
public Solution Contracted(Solution centroid)
{
  double[] v = new double[dim];  // Didn't want to reuse 'c' from Centroid routine
  double[] worst = this.solutions[amoebaSize - 1].vector;  // Convenience only
  for (int j = 0; j < dim; ++j)
    v[j] = (beta * worst[j]) + ((1 - beta) * centroid.vector[j]);
  Solution s = new Solution(v);
  return s;
}
```

Helper method ReplaceWorst replaces the current worst solution, located at index amoebaSize-1, with a different solution (the reflected, expanded or contracted point):

```csharp
public void ReplaceWorst(Solution newSolution)
{
  for (int j = 0; j < dim; ++j)
    solutions[amoebaSize - 1].vector[j] = newSolution.vector[j];
  solutions[amoebaSize - 1].value = newSolution.value;
  Array.Sort(solutions);
}
```

If neither the reflected, nor the expanded, nor the contracted point gives a better set of solutions, the amoeba algorithm shrinks the current solution set. Every solution point, except for the best point at index 0, is moved halfway from its current location toward the best point:

```csharp
public void Shrink()
{
  for (int i = 1; i < amoebaSize; ++i)  // Start at [1]; the best solution stays put
  {
    for (int j = 0; j < dim; ++j)
      solutions[i].vector[j] = (solutions[i].vector[j] + solutions[0].vector[j]) / 2.0;
    solutions[i].value = AmoebaProgram.ObjectiveFunction(solutions[i].vector, null);
  }
  Array.Sort(solutions);
}
```

## The Solve Method

The amoeba optimization solve algorithm is given in **Figure 6**, in high-level pseudo-code.

**Figure 6 The Amoeba Optimization Solve Algorithm**

```
generate amoebaSize random solutions
while not done loop
  compute centroid
  compute reflected
  if reflected is better than best solution then
    compute expanded
    replace worst solution with better of reflected, expanded
  else if reflected is worse than all but worst then
    if reflected is better than worst solution then
      replace worst solution with reflected
    end if
    compute contracted
    if contracted is worse than worst then
      shrink the amoeba
    else
      replace worst solution with contracted
    end if
  else
    replace worst solution with reflected
  end if
end loop
return best solution found
```
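For comparison, the pseudocode in Figure 6 can be condensed into a short, self-contained Python sketch. This is an illustrative translation, not the article’s C# demo, and a few details are simplified (for example, the contracted point here is always computed against the iteration’s original worst point, even after a replacement):

```python
import random

def solve(f, dim=2, amoeba_size=3, min_x=-10.0, max_x=10.0, max_loop=200, seed=0):
    """Minimal amoeba-method sketch; f maps a list of floats to a float.
    Returns the best point found after max_loop iterations."""
    rnd = random.Random(seed)
    pts = [[rnd.uniform(min_x, max_x) for _ in range(dim)]
           for _ in range(amoeba_size)]
    pts.sort(key=f)  # Best (smallest f) first, worst last
    alpha, beta, gamma = 1.0, 0.5, 2.0  # Reflection, contraction, expansion
    for _ in range(max_loop):
        worst = pts[-1]
        c = [sum(p[j] for p in pts[:-1]) / (amoeba_size - 1)
             for j in range(dim)]                            # Centroid, worst excluded
        r = [(1 + alpha) * c[j] - alpha * worst[j] for j in range(dim)]
        if f(r) < f(pts[0]):                                 # Reflected beats the best
            e = [gamma * r[j] + (1 - gamma) * c[j] for j in range(dim)]
            pts[-1] = e if f(e) < f(pts[0]) else r
        elif all(f(r) > f(p) for p in pts[:-1]):             # Worse than all but worst
            if f(r) <= f(worst):
                pts[-1] = r
            v = [beta * worst[j] + (1 - beta) * c[j] for j in range(dim)]
            if f(v) > f(pts[-1]):                            # Contraction failed: shrink
                pts = [pts[0]] + [[(p[j] + pts[0][j]) / 2.0 for j in range(dim)]
                                  for p in pts[1:]]
            else:
                pts[-1] = v
        else:
            pts[-1] = r
        pts.sort(key=f)
    return pts[0]

rosen = lambda p: 100.0 * (p[1] - p[0] * p[0]) ** 2 + (1.0 - p[0]) ** 2
print(solve(rosen))  # Should approach the known solution near [1.0, 1.0]
```

The sketch re-evaluates f freely rather than caching values, which keeps the code close to the pseudocode at the cost of redundant function calls.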

Even though the algorithm is short, it’s a bit trickier than it first appears and you’ll likely have to examine it very closely if you want to modify it for some reason. Helper method IsWorseThanAllButWorst makes the Solve method quite a bit neater. The helper examines a Solution object and returns true only if the Solution object (always the reflected solution in the algorithm) is worse (has a greater objective function value) than all other solutions in the amoeba, except for possibly the worst solution (located at index amoebaSize-1):

```csharp
public bool IsWorseThanAllButWorst(Solution reflected)
{
  for (int i = 0; i < amoebaSize - 1; ++i)
  {
    if (reflected.value <= solutions[i].value)  // Reflected isn't worse than this one
      return false;
  }
  return true;
}
```

With all the helper methods in place, the Solve method, listed in **Figure 7**, is rather short. The processing loop in Solve exits after maxLoop iterations. In general, a good value of maxLoop varies from problem to problem and must be determined through trial and error. An alternative or additional stopping condition is to exit the processing loop when the average error of the solutions in the amoeba drops below some problem-dependent value.
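The alternative stopping condition can be sketched like this (Python, purely illustrative; the helper name is mine, not part of the demo):

```python
def average_error(values):
    """Average distance of each solution's objective value from the best
    (smallest) value; a result near zero means the amoeba has collapsed."""
    best = min(values)
    return sum(v - best for v in values) / len(values)

# Inside the solve loop, an additional exit test might read:
#   if average_error([s.value for s in solutions]) < 1.0e-6:
#       break
```

The threshold (1.0e-6 above) is problem-dependent, just as maxLoop is.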

**Figure 7 The Solve Method**

```csharp
public Solution Solve()
{
  int t = 0;  // Loop counter
  while (t < maxLoop)
  {
    ++t;
    Solution centroid = Centroid();
    Solution reflected = Reflected(centroid);
    if (reflected.value < solutions[0].value)
    {
      Solution expanded = Expanded(reflected, centroid);
      if (expanded.value < solutions[0].value)
        ReplaceWorst(expanded);
      else
        ReplaceWorst(reflected);
      continue;
    }
    if (IsWorseThanAllButWorst(reflected) == true)
    {
      if (reflected.value <= solutions[amoebaSize - 1].value)
        ReplaceWorst(reflected);
      Solution contracted = Contracted(centroid);
      if (contracted.value > solutions[amoebaSize - 1].value)
        Shrink();
      else
        ReplaceWorst(contracted);
      continue;
    }
    ReplaceWorst(reflected);
  }  // Loop
  return solutions[0];  // Best solution
}
```

## Customizing the Code

The example code and explanation presented in this article should get you up and running if you want to experiment with or use amoeba method optimization in a software system. There are several modifications you might want to consider. In the main algorithm loop, before computing the centroid and reflected solutions, I often compute a purely random solution and check to see if this random solution is better than the current worst solution. This approach helps prevent the optimization algorithm from getting stuck in a local minimum solution.
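That random-injection modification can be sketched as follows (Python, illustrative; the helper name and signature are mine, not the demo’s). At the top of each iteration, a random point replaces the worst solution if it scores better:

```python
import random

def maybe_inject_random(pts, f, dim, lo, hi, rnd):
    """If a freshly generated random point beats the current worst solution,
    adopt it and re-sort (pts is kept ordered best-to-worst by f)."""
    candidate = [rnd.uniform(lo, hi) for _ in range(dim)]
    if f(candidate) < f(pts[-1]):  # Better than the current worst
        pts[-1] = candidate
        pts.sort(key=f)
    return pts

# At the top of each solve iteration, one might call:
#   pts = maybe_inject_random(pts, f, dim, -10.0, 10.0, random.Random(0))
```

Because the injected point is unrelated to the simplex geometry, it gives the amoeba a chance to jump out of a basin it would otherwise crawl around in.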

Another customization possibility is to compute multiple reflected points. Instead of a single reflected point that lies on the line between the current worst solution and the centroid, you can compute additional reflected points that lie on different lines. This approach also helps avoid local minima traps.

**Dr. James McCaffrey** *works for Volt Information Sciences Inc., where he manages technical training for software engineers working at the Microsoft Redmond, Wash., campus. He has worked on several Microsoft products including Internet Explorer and MSN Search. He’s the author of “.NET Test Automation Recipes” (Apress, 2006), and can be reached at jammc@microsoft.com.*

Thanks to the following technical experts for reviewing this article: Darren Gehring (Microsoft) and Mark Marron (Microsoft)
