Optimization In One Dimension in IronPython QuickStart Sample
Illustrates the use of the Brent and Golden Section optimizer classes in the Numerics.NET.Optimization namespace for one-dimensional optimization in IronPython.
This sample is also available in: C#, Visual Basic, F#.
Overview
This QuickStart sample demonstrates how to find the minimum or maximum of a function in one dimension using Numerics.NET’s optimization capabilities.
The sample shows two different optimization algorithms:
- Brent's algorithm - A sophisticated and efficient method that combines golden section search with parabolic interpolation. It is the recommended method for most one-dimensional optimization problems.
- Golden Section Search - A simpler but slower algorithm based on the golden ratio, implemented by the GoldenSectionOptimizer class.
The sample demonstrates how to:
- Create and configure a BrentOptimizer
- Find an interval containing an extremum using FindBracket
- Locate the precise minimum or maximum
- Access results and error estimates
Both methods are demonstrated on two test functions:
- A cubic polynomial f(x) = x³ - 2x - 5
- An exponential function f(x) = 1/exp(x² - 0.7x + 0.2)
The sample illustrates proper error handling, accessing optimization results, and comparing estimated versus actual errors. It also shows how to use lambda expressions to define objective functions.
The code
import numerics
from math import exp, sqrt
# The optimization classes reside in the
# Extreme.Mathematics.Optimization namespace.
from Extreme.Mathematics.Optimization import *
# Function delegates reside in the Extreme.Mathematics
# namespace.
from Extreme.Mathematics import *
# Illustrates the use of the Brent and Golden Section optimizers
# in the Extreme.Mathematics.Optimization namespace of Numerics.NET.
# Several algorithms exist for optimizing functions
# in one variable. The most common one is
# Brent's algorithm.
# The function we are trying to minimize is called the
# objective function and must be provided as a Func<double, double>.
# For more information about this delegate, see the
# FunctionDelegates QuickStart sample.
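# In IronPython, any callable that takes and returns a float can
# serve as the objective function. As a minimal sketch, a named
# function equivalent to the lambda used below would also work:
def objective(x):
    return x ** 3 - 2 * x - 5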
#
# Brent's algorithm
#
# Now let's create the BrentOptimizer object.
optimizer = BrentOptimizer()
# Set the objective function:
optimizer.ObjectiveFunction = lambda x: x ** 3 - 2 * x - 5
# Optimizers can find either a minimum or a maximum.
# Which of the two is found is specified by the
# ExtremumType property:
optimizer.ExtremumType = ExtremumType.Minimum
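# To search for a maximum instead, set the property accordingly
# (ExtremumType.Maximum is assumed here, mirroring Minimum):
# optimizer.ExtremumType = ExtremumType.Maximum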
# The first phase is to find an interval that contains
# a local minimum. This is done by the FindBracket method.
optimizer.FindBracket(0, 3)
# You can verify that an interval was found from the
# IsBracketValid property:
if not optimizer.IsBracketValid:
    raise Exception("An interval containing a minimum was not found.")
# Finally, we can run the optimizer by calling the FindExtremum method:
optimizer.FindExtremum()
print "Function 1: x^3 - 2x - 5"
# The Status property indicates
# the result of running the algorithm.
print " Status:", optimizer.Status
# The result is available through the
# Result property.
print " Minimum:", optimizer.Result
exactResult = sqrt(2/3.0)
result = optimizer.Extremum
print " Exact minimum:", exactResult
# You can find out the estimated error of the result
# through the EstimatedError property:
print " Estimated error:", optimizer.EstimatedError
print " Actual error:", result - exactResult
print " # iterations:", optimizer.IterationsNeeded
print "Function 2: 1/Exp(x*x - 0.7*x +0.2)"
f2 = lambda x: 1/exp(x**2 - 0.7*x +0.2)
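# The exponent x^2 - 0.7x + 0.2 is smallest where its derivative
# 2x - 0.7 vanishes, so f2 attains its maximum at x = 0.35.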
# You can also perform these calculations more directly
# using the FindMinimum or FindMaximum methods. This implicitly
# calls the FindBracket method.
result = optimizer.FindMaximum(f2, 0)
print " Maximum:", result
print " Actual maximum:", 0.35
print " Estimated error:", optimizer.EstimatedError
print " Actual error:", result - 0.35
print " # iterations:", optimizer.IterationsNeeded
#
# Golden section search
#
# A slower but simpler algorithm for finding an extremum
# is the golden section search. It is implemented by the
# GoldenSectionOptimizer class:
optimizer2 = GoldenSectionOptimizer()
# Maximum at x = 0.35
print "Using Golden Section optimizer:"
result = optimizer2.FindMaximum(f2, 0)
print " Maximum:", result
print " Actual maximum:", 0.35
print " Estimated error:", optimizer2.EstimatedError
print " Actual error:", result - 0.35
print " # iterations:", optimizer2.IterationsNeeded