

github-actions edited this page Nov 26, 2023 · 1 revision

# Interface: Hyperparams

FineTunes.FineTune.Hyperparams

The hyperparameters used for the fine-tuning job. See the fine-tuning guide for more details.

## Table of contents

### Properties

- batch_size
- classification_n_classes
- classification_positive_class
- compute_classification_metrics
- learning_rate_multiplier
- n_epochs
- prompt_loss_weight

## Properties

### batch_size

• **batch_size**: `number`

The batch size to use for training. The batch size is the number of training examples used in a single forward and backward pass.

#### Defined in

node_modules/openai/resources/fine-tunes.d.ts:116


### classification_n_classes

• `Optional` **classification_n_classes**: `number`

The number of classes to use for computing classification metrics.

#### Defined in

node_modules/openai/resources/fine-tunes.d.ts:133


### classification_positive_class

• `Optional` **classification_positive_class**: `string`

The positive class to use for computing classification metrics.

#### Defined in

node_modules/openai/resources/fine-tunes.d.ts:137


### compute_classification_metrics

• `Optional` **compute_classification_metrics**: `boolean`

Whether to compute classification metrics using the validation dataset at the end of every epoch.

#### Defined in

node_modules/openai/resources/fine-tunes.d.ts:142
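The classification-related fields above travel together: they only apply when the fine-tune is a classification task. A hedged sketch of the two usual configurations, using a local fragment of the interface as documented on this page; the label value and class count are illustrative assumptions, not recommendations:

```typescript
// Fragment of Hyperparams covering only the classification-related fields.
interface ClassificationHyperparams {
  classification_n_classes?: number;
  classification_positive_class?: string;
  compute_classification_metrics?: boolean;
}

// Binary classification: classification_positive_class selects which
// label counts as "positive" when computing metrics such as precision
// and recall. The label string here is made up for illustration.
const binary: ClassificationHyperparams = {
  compute_classification_metrics: true,
  classification_positive_class: " yes",
};

// Multi-class classification: report the number of classes instead.
const multiClass: ClassificationHyperparams = {
  compute_classification_metrics: true,
  classification_n_classes: 3,
};
```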


### learning_rate_multiplier

• **learning_rate_multiplier**: `number`

The learning rate multiplier to use for training.

#### Defined in

node_modules/openai/resources/fine-tunes.d.ts:120


### n_epochs

• **n_epochs**: `number`

The number of epochs to train the model for. An epoch refers to one full cycle through the training dataset.

#### Defined in

node_modules/openai/resources/fine-tunes.d.ts:125


### prompt_loss_weight

• **prompt_loss_weight**: `number`

The weight to use for loss on the prompt tokens.

#### Defined in

node_modules/openai/resources/fine-tunes.d.ts:129
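Taken together, the properties documented above describe the shape sketched below. The interface is mirrored locally so the snippet is self-contained; the concrete values are illustrative assumptions, not recommended settings:

```typescript
// Local mirror of FineTunes.FineTune.Hyperparams as documented above.
interface Hyperparams {
  batch_size: number;
  learning_rate_multiplier: number;
  n_epochs: number;
  prompt_loss_weight: number;
  classification_n_classes?: number;
  classification_positive_class?: string;
  compute_classification_metrics?: boolean;
}

// Hyperparameters as they might be reported for a completed job.
// All values are made up for illustration.
const hyperparams: Hyperparams = {
  batch_size: 4,
  learning_rate_multiplier: 0.1,
  n_epochs: 4,
  prompt_loss_weight: 0.01,
};

// The optional classification fields are simply absent for
// non-classification jobs.
console.log(hyperparams.compute_classification_metrics); // → undefined
```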
