@@ -63,9 +63,8 @@ can be simplified given that the optimization package is already loaded:
r = densratio(x_nu, x_de, KLIEP())
```
- Different implementations of the same estimator are loaded using the
- [Requires.jl](https://github.com/MikeInnes/Requires.jl) package, and
- the keyword argument `optlib` can be any of:
+ Different implementations of the same estimator are loaded using package extensions,
+ and the keyword argument `optlib` can be any of:
* `JuliaLib` - Pure Julia implementation
* `OptimLib` - [Optim.jl](https://github.com/JuliaNLSolvers/Optim.jl) implementation
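To illustrate the backend selection, here is a minimal sketch of picking the Optim.jl implementation via `optlib`. The sample arrays are made up, and it assumes DensityRatioEstimation.jl and Optim.jl are installed:

```julia
# Sketch only: assumes DensityRatioEstimation.jl and Optim.jl are installed.
# Loading Optim makes the `OptimLib` backend available; `optlib` selects it.
using DensityRatioEstimation, Optim

x_nu = randn(100)        # hypothetical samples from the numerator density
x_de = randn(200) .+ 1   # hypothetical samples from the denominator density

r = densratio(x_nu, x_de, KLIEP(), optlib=OptimLib)
```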
@@ -95,14 +94,14 @@ r = densratiofunc(x_nu, x_de, KLIEP())
### Hyperparameter tuning
Methods like `KLIEP` are equipped with tuning strategies, and their hyperparameters
- can be found using the following line:
+ can be found using the following code:
```julia
dre = fit(KLIEP, x_nu, x_de, LCV((σ=[1.,2.,3.], b=[100])))
```
The function returns a `KLIEP` instance with parameters optimized for the samples.
- In this case, the line uses likelihood cross-validation `LCV` as the tuning
+ In this case, the code uses likelihood cross-validation `LCV` as the tuning
strategy. It accepts a named tuple with the hyperparameter ranges for `KLIEP`,
the kernel width `σ` and the number of basis functions `b`. Currently, the
following tuning strategies are implemented:
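Putting the tuning step together with estimation, a hedged end-to-end sketch under the API shown above. The sample arrays and hyperparameter ranges are illustrative:

```julia
# Sketch only: assumes DensityRatioEstimation.jl is installed.
using DensityRatioEstimation

x_nu = randn(200)          # hypothetical samples from the numerator density
x_de = randn(500) .+ 0.5   # hypothetical samples from the denominator density

# Tune KLIEP's kernel width σ and number of basis functions b with
# likelihood cross-validation, then reuse the tuned instance directly.
dre = fit(KLIEP, x_nu, x_de, LCV((σ=[1., 2., 3.], b=[100])))
r = densratio(x_nu, x_de, dre)
```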