Commit a0c192e

Polish changelog for release 0.5.0 (#525)
1 parent caeb370 commit a0c192e

File tree

- CHANGES.md
- README.md
- docs/source/explanation/internal_optimizers.md
- pyproject.toml

4 files changed: +92 -128 lines

CHANGES.md

Lines changed: 68 additions & 9 deletions
```diff
@@ -10,31 +10,71 @@ we drop the official support for Python 3.9.
 
 ## 0.5.0
 
-This is a major release with several breaking changes and deprecations. On a high level,
-the major changes are:
+This is a major release with several breaking changes and deprecations. In this
+release we started implementing two major enhancement proposals and renamed the package
+from estimagic to optimagic (while keeping the `estimagic` namespace for the estimation
+capabilities).
 
-- Implement EP-02: Static typing
-- Implement EP-03: Alignment with SciPy
-- Rename the package from `estimagic` to `optimagic` (while keeping the `estimagic`
-  namespace for the estimation capabilities).
+- [EP-02: Static typing](https://estimagic.org/en/latest/development/ep-02-typing.html)
+- [EP-03: Alignment with SciPy](https://estimagic.org/en/latest/development/ep-03-alignment.html)
 
+The implementation of the two enhancement proposals is not complete and will likely
+take until version `0.6.0`. However, all breaking changes and deprecations (with the
+exception of a minor change in benchmarking) are already implemented, so updating
+to version `0.5.0` is future-proof.
 
 - {gh}`500` removes the dashboard, the support for simopt optimizers and the
   `derivative_plot` ({ghuser}`janosg`)
+- {gh}`502` renames estimagic to optimagic ({ghuser}`janosg`)
 - {gh}`504` aligns `maximize` and `minimize` more closely with scipy. All related
   deprecations and breaking changes are listed below. As a result, scipy code that uses
   minimize with the arguments `x0`, `fun`, `jac` and `method` will run without changes
   in optimagic. Similarly, the `OptimizeResult` gets some aliases so it behaves more
   like SciPy's.
+- {gh}`506` introduces the new `Bounds` object and deprecates `lower_bounds`,
+  `upper_bounds`, `soft_lower_bounds` and `soft_upper_bounds` ({ghuser}`janosg`)
+- {gh}`507` updates the infrastructure so we can make parallel releases under the names
+  `optimagic` and `estimagic` ({ghuser}`timmens`)
+- {gh}`508` introduces the new `ScalingOptions` object and deprecates the
+  `scaling_options` argument of `maximize` and `minimize` ({ghuser}`timmens`)
+- {gh}`512` implements the new interface for objective functions and derivatives
+  ({ghuser}`janosg`)
+- {gh}`513` implements the new `optimagic.MultistartOptions` object and deprecates the
+  `multistart_options` argument of `maximize` and `minimize` ({ghuser}`timmens`)
+- {gh}`514` and {gh}`516` introduce the `NumdiffResult` object that is returned from
+  `first_derivative` and `second_derivative`. They also fix several bugs in the
+  pytree handling in `first_derivative` and `second_derivative` and deprecate
+  Richardson Extrapolation and the `key` argument ({ghuser}`timmens`)
+- {gh}`517` introduces the new `NumdiffOptions` object for configuring numerical
+  differentiation during optimization or estimation ({ghuser}`timmens`)
+- {gh}`519` rewrites the logging code and introduces new `LogOptions` objects
+  ({ghuser}`schroedk`)
+- {gh}`521` introduces the new internal algorithm interface
+  ({ghuser}`janosg` and {ghuser}`mpetrosian`)
+- {gh}`522` introduces the new `Constraint` objects and deprecates passing
+  dictionaries or lists of dictionaries as constraints ({ghuser}`timmens`)
 
 ### Breaking changes
 
 - When providing a path for the argument `logging` of the functions
   `maximize` and `minimize` and the file already exists, the default
   behavior is to raise an error now. Replacement or extension
   of an existing file must be explicitly configured.
-- The argument `if_table_exists` has no effect anymore and a
+- The argument `if_table_exists` in `log_options` has no effect anymore and a
   corresponding warning is raised.
+- `OptimizeResult.history` is now an `optimagic.History` object instead of a
+  dictionary. Dictionary-style access is implemented but deprecated. Other dictionary
+  methods might not work.
+- The result of `first_derivative` and `second_derivative` is now an
+  `optimagic.NumdiffResult` object instead of a dictionary. Dictionary-style access is
+  implemented but other dictionary methods might not work.
+- The dashboard is removed.
+- The `derivative_plot` is removed.
+- Optimizers from Simopt are removed.
+- Passing callables with the old internal algorithm interface as `algorithm` to
+  `minimize` and `maximize` is not supported anymore. Use the new
+  `Algorithm` objects instead. For examples see: https://tinyurl.com/24a5cner
 
 
 ### Deprecations
```
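
The last breaking change above replaces old-style algorithm callables with `Algorithm` objects. For illustration, a minimal sketch of a call in the new style, assuming `om.algos.scipy_lbfgsb` is one of the new `Algorithm` objects and that it accepts a `stopping_maxfun` option; neither name is confirmed by this diff (see the linked examples):

```python
import numpy as np
import optimagic as om


def sphere(x):
    # Simple smooth test function: f(x) = sum of squares.
    return x @ x


# Hypothetical Algorithm object replacing an old-style algorithm callable.
algo = om.algos.scipy_lbfgsb(stopping_maxfun=1_000)

res = om.minimize(fun=sphere, params=np.arange(3.0), algorithm=algo)
print(res.params)  # approximately [0.0, 0.0, 0.0]
```
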
```diff
@@ -62,10 +102,29 @@ the major changes are:
 - `convergence_scaled_gradient_tolerance` -> `convergence_gtol_scaled`
 - `stopping_max_criterion_evaluations` -> `stopping_maxfun`
 - `stopping_max_iterations` -> `stopping_maxiter`
-- The `log_options` argument of `minimize` and `maximize` is deprecated,
-  an according FutureWarning is raised.
+- The arguments `lower_bounds`, `upper_bounds`, `soft_lower_bounds` and
+  `soft_upper_bounds` are deprecated and replaced by `optimagic.Bounds`. This affects
+  `maximize`, `minimize`, `estimate_ml`, `estimate_msm`, `slice_plot` and several
+  other functions.
+- The `log_options` argument of `minimize` and `maximize` is deprecated. Instead,
+  `LogOptions` objects can be passed under the `logging` argument.
 - The class `OptimizeLogReader` is deprecated and redirects to
   `SQLiteLogReader`.
+- The `scaling_options` argument of `maximize` and `minimize` is deprecated. Instead, a
+  `ScalingOptions` object can be passed under the `scaling` argument, which previously
+  accepted only a bool.
+- Objective functions that return a dictionary with the special keys "value",
+  "contributions" and "root_contributions" are deprecated. Instead, likelihood and
+  least-squares functions are marked with a `mark.likelihood` or `mark.least_squares`
+  decorator. There is a detailed how-to guide that shows the new behavior. This affects
+  `maximize`, `minimize`, `slice_plot` and other functions that work with objective
+  functions.
+- The `multistart_options` argument of `minimize` and `maximize` is deprecated. Instead,
+  a `MultistartOptions` object can be passed under the `multistart` argument.
+- Richardson Extrapolation is deprecated in `first_derivative` and `second_derivative`.
+- The `key` argument is deprecated in `first_derivative` and `second_derivative`.
+- Passing dictionaries or lists of dictionaries as `constraints` to `maximize` or
+  `minimize` is deprecated. Use the new `Constraint` objects instead.
 
 ## 0.4.7
 
```
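
Taken together, these deprecations move configuration from loose arguments and raw dictionaries to dedicated objects. For illustration, a sketch of a new-style call; only `Bounds`, `ScalingOptions`, `MultistartOptions` and `mark.least_squares` themselves appear in the changelog above, while the specific option fields and the `scipy_ls_trf` algorithm name are assumptions:

```python
import numpy as np
import optimagic as om


@om.mark.least_squares
def residuals(params):
    # Replaces the deprecated dict output with the special key
    # "root_contributions": the function now simply returns residuals.
    return params - np.array([1.0, 2.0, 3.0])


res = om.minimize(
    fun=residuals,
    params=np.full(3, 0.5),
    algorithm="scipy_ls_trf",  # assumed algorithm name
    # Bounds replaces lower_bounds / upper_bounds / soft_*_bounds.
    bounds=om.Bounds(lower=np.full(3, -5.0), upper=np.full(3, 5.0)),
    # ScalingOptions replaces the scaling_options dict; the field is an assumption.
    scaling=om.ScalingOptions(method="start_values"),
    # MultistartOptions replaces the multistart_options dict; the field is an assumption.
    multistart=om.MultistartOptions(n_samples=10),
)
```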

README.md

Lines changed: 12 additions & 6 deletions
````diff
@@ -28,10 +28,20 @@ tools, parallel numerical derivatives and more.
 perform statistical inference on estimated parameters. *estimagic* is now a subpackage
 of *optimagic*.
 
+## Documentation
+
+The documentation is hosted at https://optimagic.readthedocs.io
+
 ## Installation
 
-The package can be installed via conda. To do so, type the following commands in a
-terminal:
+The package can be installed via pip or conda. To do so, type the following commands in
+a terminal:
+
+```bash
+pip install optimagic
+```
+
+or
 
 ```bash
 $ conda config --add channels conda-forge
@@ -67,10 +77,6 @@ To enable all algorithms at once, do the following:
 
 `pip install fides>=0.7.4 (Make sure you have at least 0.7.4)`
 
-## Documentation
-
-The documentation is hosted ([on rtd](https://estimagic.readthedocs.io/en/latest/#))
-
 ## Citation
 
 If you use optimagic for your research, please do not forget to cite it.
````

docs/source/explanation/internal_optimizers.md

Lines changed: 3 additions & 109 deletions
````diff
@@ -23,49 +23,10 @@ transformed problem.
 
 ## The internal optimizer interface
 
-An internal optimizer is a function that minimizes a criterion function and fulfills a
-few conditions. In our experience, it is not hard to wrap any optimizer into this
-interface. The mandatory conditions for an internal optimizer function are:
-
-1. It is decorated with the `mark_minimizer` decorator and thus carries information that
-   tells optimagic how to use the internal optimizer.
-
-1. It uses the standard names for the arguments that describe the optimization problem:
-
-   - criterion: for the criterion function
-   - x: for the start parameters in form of a 1d numpy array
-   - derivative: for the first derivative of the criterion function
-   - criterion_and_derivative: for a function that evaluates the criterion and its first
-     derivative jointly
-   - lower_bounds: for lower bounds in form of a 1d numpy array
-   - upper_bounds: for upper bounds in form of a 1d numpy array
-   - nonlinear_constraints: for nonlinear constraints in form of a list of dictionaries
-
-   Of course, algorithms that do not need a certain argument (e.g. unbounded or
-   derivative free ones) do not need those arguments at all.
-
-1. All other arguments have default values.
-
-(internal-optimizer-output)=
+(to be written)
 
 ## Output of internal optimizers
 
-After convergence or when another stopping criterion is achieved the internal optimizer
-should return a dictionary with the following entries:
-
-- solution_x: The best parameter achieved so far
-- solution_criterion: The value of the criterion at solution_x. This can be a scalar or
-  dictionary.
-- n_fun_evals: The number of criterion evaluations.
-- n_jac_evals: The number of derivative evaluations.
-- n_iterations: The number of iterations.
-- success: True if convergence was achieved.
-- message: A string with additional information.
-
-If some of the entries are missing, they will automatically be filled with `None` and no
-errors are raised. Nevertheless, you should try to return as much information as
-possible.
 
 (naming-conventions)=
 
 ## Naming conventions for algorithm specific arguments
````
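
For reference, a minimal sketch of the old interface whose description is removed above, reconstructed from the deleted lines. The `mark_minimizer` import path and decorator signature are assumptions, and decorator options besides `name` are omitted:

```python
import numpy as np
from estimagic import mark_minimizer  # import path is an assumption


@mark_minimizer(name="naive_random_search")
def naive_random_search(criterion, x, lower_bounds, upper_bounds, n_draws=1_000):
    # Uses the standard argument names from the removed description; the
    # remaining argument (n_draws) has a default value, as required.
    rng = np.random.default_rng(seed=0)
    candidates = rng.uniform(lower_bounds, upper_bounds, size=(n_draws, len(x)))
    values = [criterion(candidate) for candidate in candidates]
    best = int(np.argmin(values))
    # Output dictionary as documented in the removed lines; missing entries
    # would be filled with None automatically.
    return {
        "solution_x": candidates[best],
        "solution_criterion": values[best],
        "n_fun_evals": n_draws,
        "n_iterations": 1,
        "success": True,
        "message": "Fixed budget of random draws exhausted.",
    }
```
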
````diff
@@ -89,75 +50,8 @@ the exact meaning of all options for all optimizers.
 
 ## Algorithms that parallelize
 
-Algorithms can evaluate the criterion function in parallel. To make such a parallel
-algorithm fully compatible with optimagic (including history collection and benchmarking
-functionality), the following conditions need to be fulfilled:
-
-- The algorithm has an argument called `n_cores` which determines how many cores are
-  used for the parallelization.
-- The algorithm has an argument called `batch_evaluator` and all parallelization is done
-  using a built-in or user provided batch evaluator.
-
-Moreover, we strongly suggest to comply with the following convention:
-
-- The algorithm has an argument called `batch_size` which is an integer that is greater
-  or equal to `n_cores`. Setting the `batch_size` larger than `n_cores` allows to
-  simulate how the algorithm would behave with `n_cores=batch_size` but only uses
-  `n_cores` cores. This allows to simulate / benchmark the parallelizability of an
-  algorithm even if no parallel hardware is available.
-
-If the mandatory conditions are not fulfilled, the algorithm should disable all history
-collection by using `mark_minimizer(..., disable_history=True)`.
+(to be written)
 
 ## Nonlinear constraints
 
-optimagic can pass nonlinear constraints to the internal optimizer. The internal
-interface for nonlinear constraints is as follows.
-
-A nonlinear constraint is a `list` of `dict`'s, where each `dict` represents a group of
-constraints. In each group the constraint function can potentially be multi-dimensional.
-We distinguish between equality and inequality constraints, which is signalled by a dict
-entry `type` that takes values `"eq"` and `"ineq"`. The constraint function, which takes
-as input an internal parameter vector, is stored under the entry `fun`, while the
-Jacobian of that function is stored at `jac`. The tolerance for the constraints is
-stored under `tol`. At last, the number of constraints in each group is specified under
-`n_constr`. An example list with one constraint that would be passed to the internal
-optimizer is given by
-
-```
-constraints = [
-    {
-        "type": "ineq",
-        "n_constr": 1,
-        "tol": 1e-5,
-        "fun": lambda x: x**3,
-        "jac": lambda x: 3 * x**2,
-    }
-]
-```
-
-**Equality.** Internal equality constraints assume that the constraint is met when the
-function is zero. That is
-
-$$
-0 = g(x) \in \mathbb{R}^m .
-$$
-
-**Inequality.** Internal inequality constraints assume that the constraint is met when
-the function is greater or equal to zero. That is
-
-$$
-0 \leq g(x) \in \mathbb{R}^m .
-$$
-
-## Other conventions
-
-- Internal optimizers are functions and should thus adhere to python naming conventions
-  for functions (i.e. only consist of lowercase letters and individual words should be
-  separated by underscores). For optimizers that are implemented in many packages (e.g.
-  Nelder Mead or BFGS), the name of the original package in which it was implemented has
-  to be part of the name.
-- All arguments of an internal optimizer should actually be used. In particular, if an
-  optimizer does not support bounds it should not have `lower_bounds` and `upper_bounds`
-  as arguments; derivative free optimizers should not have `derivative` or
-  `criterion_and_derivative` as arguments, etc.
+(to be written)
````
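
For reference, a sketch of the parallelization convention described by the deleted lines: all parallel criterion evaluations go through the injected batch evaluator, and `batch_size >= n_cores` candidate points are evaluated per iteration. The `joblib_batch_evaluator` import path is an assumption, and the surrounding search logic is invented for illustration:

```python
import numpy as np
from estimagic.batch_evaluators import joblib_batch_evaluator  # assumed path


def random_search_step(
    criterion, x, n_cores=1, batch_evaluator=joblib_batch_evaluator, batch_size=None
):
    # Convention from the deleted text: batch_size >= n_cores, so that
    # n_cores=batch_size behavior can be simulated on fewer cores.
    batch_size = n_cores if batch_size is None else max(batch_size, n_cores)
    rng = np.random.default_rng(seed=0)
    candidates = [x + rng.normal(size=len(x)) for _ in range(batch_size)]
    # All parallelization goes through the batch evaluator so that history
    # collection and benchmarking keep working.
    values = batch_evaluator(criterion, arguments=candidates, n_cores=n_cores)
    best = int(np.argmin(values))
    return candidates[best], values[best]
```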

pyproject.toml

Lines changed: 9 additions & 4 deletions
```diff
@@ -14,23 +14,28 @@ dependencies = [
     "pybaum>=0.1.2",
     "scipy>=1.2.1",
     "sqlalchemy>=1.3",
+    "annotated-types",
+    "typing-extensions",
+    "nlopt",
 ]
 dynamic = ["version"]
 keywords = [
-    "econometrics",
+    "nonlinear optimization",
+    "optimization",
+    "derivative free optimization",
+    "global optimization",
+    "parallel optimization",
     "statistics",
     "estimation",
     "extremum estimation",
-    "optimization",
     "inference",
     "numerical differentiation",
     "finite differences",
-    "derivative free optimization",
     "method of simulated moments",
     "maximum likelihood",
 ]
 classifiers = [
-    "Development Status :: 4 - Beta",
+    "Development Status :: 5 - Production/Stable",
     "Intended Audience :: Science/Research",
     "License :: OSI Approved :: MIT License",
     "Operating System :: MacOS :: MacOS X",
```
