piximi/autotuner
autotuner

Hyperparameter search module for tensorflow.js (sequential) models.
The following hyperparameters are autotuned: the loss function, the optimizer algorithm, the batch size, and the number of epochs.

Usage

import { TensorflowlModelAutotuner } from '@piximi/autotuner';

Getting Started

Initialize the autotuner by specifying metrics, a dataset and the number of categories.

var autotuner = new TensorflowlModelAutotuner(['accuracy'], dataset, numberOfCategories);

Adding a model to the autotuner

// create some uncompiled sequential tensorflow model
const testModel = await createModel();

const parameters = { lossfunction: [LossFunction.categoricalCrossentropy], optimizerAlgorithm: [tensorflow.train.adadelta()], batchSize: [10], epochs: [5,10] };
autotuner.addModel('testModel', testModel, parameters);
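Each array in the parameters object spans one axis of the search space, and a grid search evaluates every combination. As a rough illustration (the `cartesianProduct` helper below is hypothetical, not part of the autotuner API), the example above yields 1 × 1 × 1 × 2 = 2 candidate configurations:

```typescript
// Hypothetical helper: expand arrays of candidate values into all combinations.
// This mirrors what a grid search enumerates conceptually; it is not the library's code.
function cartesianProduct<T>(axes: T[][]): T[][] {
  return axes.reduce<T[][]>(
    (acc, axis) => acc.flatMap((combo) => axis.map((value) => [...combo, value])),
    [[]]
  );
}

// Search space from the example: 1 loss × 1 optimizer × 1 batch size × 2 epoch counts
const grid = cartesianProduct<number | string>([
  ["categoricalCrossentropy"],
  ["adadelta"],
  [10],
  [5, 10],
]);

console.log(grid.length); // 2 combinations: one with 5 epochs, one with 10
```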

Tuning the hyperparameters

Specify the optimization algorithm: the hyperparameters can be tuned either by Bayesian optimization or by a simple grid search.

autotuner.bayesianOptimization();
autotuner.gridSearchOptimizytion();

The objective function of the optimization can be specified (either 'error' or 'accuracy'):

autotuner.gridSearchOptimizytion('accuracy');

The acquisition function of the optimization can be specified. Either expected improvement (i.e. 'ei') or upper confidence bound (i.e. 'ucb'):

autotuner.gridSearchOptimizytion('accuracy', 'ei');

A model can be evaluated using cross validation, enabled by the optional third parameter:

autotuner.gridSearchOptimizytion('accuracy', 'ei', true);
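Cross validation repeatedly holds out part of the data for evaluation, so each candidate's score is averaged over several train/validation splits. A minimal sketch of a k-fold index split (illustrative only; the autotuner's internal cross-validation procedure may differ):

```typescript
// Split n sample indices into k roughly equal folds; each fold serves once
// as the validation set while the remaining folds form the training set.
function kFoldIndices(n: number, k: number): number[][] {
  const folds: number[][] = Array.from({ length: k }, () => []);
  for (let i = 0; i < n; i++) {
    folds[i % k].push(i);
  }
  return folds;
}

const folds = kFoldIndices(10, 5);
console.log(folds.map((f) => f.length)); // five folds of 2 samples each
```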

When doing Bayesian optimization, the maximum number of domain points to be evaluated can be specified as an optional parameter:

autotuner.bayesianOptimization('accuracy', 'ei', true, 0.8);

In the example above the optimization stops after 80% of the domain points have been evaluated. By default this value is set to 0.75.
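Assuming the cutoff is applied as a simple fraction of the domain size (an assumption about the implementation, with rounding down), the budget works out as follows:

```typescript
// Hypothetical: cap on evaluated points given a domain size and a stopping fraction.
// The default fraction of 0.75 comes from the text above; flooring is assumed.
function maxEvaluatedPoints(domainSize: number, fraction: number = 0.75): number {
  return Math.floor(domainSize * fraction);
}

console.log(maxEvaluatedPoints(100, 0.8)); // 80 of 100 domain points
console.log(maxEvaluatedPoints(100));      // 75 with the default fraction
```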

Evaluate the best hyperparameters

The best hyperparameters found in the optimization run can be evaluated on the test set. Specify the objective and whether or not to use cross validation.

autotuner.evaluateBestParameter('error', true);

Example

An example usage can be found here:

tests/runExampleAutotuner.ts

Development

Pull and initialize:

git clone https://github.com/piximi/autotuner.git
cd autotuner
npm install

To run tests:

npm run test
npm run runExampleAutotuner

To compile the code and check for type errors:

npm run build
