This is a quick guide to begin solving optimization problems with pyOpt.
pyOpt is designed to solve general constrained nonlinear optimization problems of the form:
min   f(x)
 x

s.t.  g_j(x)  = 0,    j = 1, ..., m_e
      g_j(x) <= 0,    j = m_e + 1, ..., m
      x_i_L <= x_i <= x_i_U,    i = 1, ..., n
where:
x      is the vector of design variables
f(x)   is a nonlinear function
g(x)   is a linear or nonlinear function
n      is the number of design variables
m_e    is the number of equality constraints
m      is the total number of constraints (the number of inequality constraints is m_i = m - m_e)
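For example, min f(x) = x1^2 + x2^2 subject to x1 + x2 >= 1 fits this form with n = 2, m = 1, and m_e = 0, once the inequality is rewritten as g_1(x) = 1 - x1 - x2 <= 0. This problem is used as a running sketch in the examples below.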
Instantiating an Optimization Problem:
opt_prob = Optimization(name, obj_fun, var_set={}, obj_set={}, con_set={})
where:
name     : name of the problem (e.g. 'name')
obj_fun  : objective function
var_set  : dict containing the variables
obj_set  : dict containing the objectives
con_set  : dict containing the constraints
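A minimal sketch (the problem name is illustrative; obj_fun must already be defined, following the template in the next section):

from pyOpt import Optimization

# instantiate the problem container for the running example
opt_prob = Optimization('Quickstart Problem', obj_fun)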
Objective Function Template:

def obj_fun(x, *args, **kwargs):
    f = <your_function>(x, *args, **kwargs)
    g = <your_function>(x, *args, **kwargs)
    fail = 0
    return f, g, fail
where:
f    : objective value
g    : list of constraint values
  - If the optimization problem is unconstrained, g must be an empty list: g = []
  - Inequality constraints are handled as <=.
fail : 0 for a successful function evaluation,
       1 for an unsuccessful function evaluation (the test must be provided by the user)
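For instance, a minimal sketch of an objective function for the running example (min x1^2 + x2^2 subject to x1 + x2 >= 1):

def obj_fun(x):
    # objective: f(x) = x1^2 + x2^2
    f = x[0]**2 + x[1]**2
    # constraint x1 + x2 >= 1, rewritten in <= 0 form
    g = [1.0 - x[0] - x[1]]
    fail = 0
    return f, g, fail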
opt_prob.addObj('name', value=0.0, optimum=0.0)
opt_prob.addVar('name', type='c', value=0.0, lower=-inf, upper=inf, choices=listofchoices)
opt_prob.addVarGroup('name', numberinGroup, type='c', value=value, lower=lb, upper=up, choices=listofchoices)
where:
value, lb, ub : float, int or list
- Supported types:
  'c' : continuous design variable
  'i' : integer design variable
  'd' : discrete design variable (based on choices, e.g.: list/dict of materials)
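Continuing the running sketch (initial values and bounds are illustrative):

opt_prob.addObj('f')
opt_prob.addVar('x1', type='c', value=0.5, lower=-10.0, upper=10.0)
opt_prob.addVar('x2', type='c', value=0.5, lower=-10.0, upper=10.0)

or equivalently, as a group of two variables sharing the same bounds:

opt_prob.addVarGroup('x', 2, type='c', value=0.5, lower=-10.0, upper=10.0)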
opt_prob.addCon('name', type='i', lower=-inf, upper=inf, equal=0.0)
opt_prob.addConGroup('name', numberinGroup, type='i', lower=lb, upper=up, equal=eq)
where:
lb, ub, eq : float, int or list
- Supported types:
  'i' : inequality constraint
  'e' : equality constraint
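The running sketch has a single inequality constraint:

opt_prob.addCon('g1', type='i')

Several related constraints can also be added at once, e.g. opt_prob.addConGroup('g', 3, type='i').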
Instantiating an Optimizer (e.g., SNOPT):
opt = pySNOPT.SNOPT()
Optimizer options can be set during instantiation:
opt = pySNOPT.SNOPT(options={'name':value,...})
or one by one:
opt.setOption('name', value)

The current value of an option can be retrieved with:

opt.getOption('name')

and all attributes of the optimizer instance can be listed with:

opt.ListAttributes()
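As a concrete sketch with the SLSQP optimizer bundled with pyOpt (used in place of SNOPT here; the option names 'MAXIT' and 'ACC' are assumed from the SLSQP wrapper, and available options differ per optimizer):

from pyOpt import pySLSQP

# option names are optimizer-specific; these are assumed SLSQP options
opt = pySLSQP.SLSQP(options={'MAXIT': 100})
opt.setOption('ACC', 1.0e-8)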
Solving the Optimization Problem:

opt(opt_prob, sens_type='FD', disp_opts=False, sens_mode='', *args, **kwargs)
where:
sens_type  : sensitivity type
  'FD' : finite differences
  'CS' : complex step
  or a user-provided gradient function, with the format:
    grad_function = lambda x, f, g: (g_obj, g_con, fail)
  (see the Objective Function Template section)
disp_opts  : flag for displaying the options in the solution output
sens_mode  : parallel sensitivity flag ('serial', 'pgc', 'parallel')
- Additional arguments and keyword arguments (e.g.: parameters) can be passed to the objective function, as shown in the sketch below.
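Putting the running sketch together, a minimal solve with finite-difference sensitivities (reusing the opt instance from above):

# solve the problem; the solution is stored on opt_prob
opt(opt_prob, sens_type='FD')
print(opt_prob._solutions[0])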
- Print the optimization problem with the initial values:
print(opt_prob)
- Print a specific solution of the optimization problem:
print(opt_prob._solutions[key])
where:
key : index in the order of optimizer calls
opt_prob.write2file(outfile='', disp_sols=False, solutions=[])
where:
outfile    : filename or file instance (default: derived from the opt_prob name, with a .txt extension)
disp_sols  : if True, displays all the stored solutions
solutions  : list of indices of the stored solutions to display
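For example (the filename is illustrative):

opt_prob.write2file(outfile='quickstart.txt', disp_sols=True)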
The solution can be used directly as an optimization problem for refinement by the same or a new optimizer:
optimizer(opt_prob._solutions[key])
where:
key : index in the order of optimizer calls
The new solution will be stored as a sub-solution of the previous solution:
print(opt_prob._solutions[key]._solutions[nkey])
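A minimal refinement sketch, reusing the opt instance from above:

# refine the first stored solution with a second optimizer call
opt(opt_prob._solutions[0], sens_type='FD')

# the refined result is stored as a sub-solution
print(opt_prob._solutions[0]._solutions[0])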
History and Hot Start:

The history flag stores all function evaluations from an optimizer in binary format, in a .bin and a .cue file:
optimizer(opt_prob, store_hst=True)
where:
store_hst :
  True   : use the default file name for the history
  string : custom filename
The binary history file can be used to hot start the optimizer if the optimization was interrupted. The hot_start flag takes the filename of the history file (True uses the default name):
optimizer(opt_prob, store_hst=True, hot_start=True)
If the store history flag is set to the same file name as the hot start flag, a temporary file is created during the run and the original file is overwritten at the end.
For hot start to work properly all options must be the same as when the history was created.
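A sketch of storing the history and hot starting from it (the filename is illustrative):

# first run: store all function evaluations in quickstart_hst.bin/.cue
opt(opt_prob, store_hst='quickstart_hst')

# after an interruption, restart from the stored history; since the same
# name is given to both flags, a temporary file is used during the run
# and quickstart_hst is overwritten at the end
opt(opt_prob, store_hst='quickstart_hst', hot_start='quickstart_hst')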