Uses Vite for development:

```sh
yarn install
yarn dev
```

Create a network:

```ts
const network = new Network(
  // Inputs
  2,
  // Output
  { neurons: 1, activationFunction: ActivationFunction.Sigmoid },
  // Hidden layers
  [
    { neurons: 3, activationFunction: ActivationFunction.Relu, bias: 1 },
    { neurons: 3, bias: 0.5 } // Default function is Sigmoid
  ]
)
```

Create a backpropagation trainer and define the training samples:

```ts
const trainer = new Backpropagation(network, 0.3 /* Learning Rate */)

const xorSamples: TrainingSample[] = [
  { inputs: [0, 0], outputs: [0] },
  { inputs: [0, 1], outputs: [1] },
  { inputs: [1, 0], outputs: [1] },
  { inputs: [1, 1], outputs: [0] },
];
```

Train on a single sample:

```ts
trainer.train(xorSamples[0])
```

Train on a batch of samples:

```ts
trainer.trainBatch(xorSamples)
```

Compute the network's outputs for a set of inputs:

```ts
network.compute(xorSamples[0].inputs)
```
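For a full run, the calls above can be combined into a plain training loop. This is only a sketch: the epoch count is arbitrary, and it assumes `trainBatch` performs a single pass over the samples and `compute` returns an array of output values, as the snippets above suggest.

```ts
// Sketch of an XOR training run; the epoch count is arbitrary.
for (let epoch = 0; epoch < 10_000; epoch++) {
  trainer.trainBatch(xorSamples)
}

// Check the trained network against each sample.
for (const sample of xorSamples) {
  const [prediction] = network.compute(sample.inputs)
  console.log(`[${sample.inputs}] -> ${prediction.toFixed(3)} (expected ${sample.outputs[0]})`)
}
```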
Exporting the network preserves all weights, activation functions, and biases, so importing it again restores the same network.

```ts
const networkExport = network.export()
const networkJson = JSON.stringify(networkExport)
```

Import a previously exported network:

```ts
const networkExport = JSON.parse(networkJson)
const network = Network.fromNetworkExport(networkExport)
```
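Since the export round-trips through JSON, it can be persisted anywhere a string fits. Below is a minimal sketch using `localStorage` in the browser; the storage key is made up for the example, and `export`, `JSON.stringify`/`JSON.parse`, and `Network.fromNetworkExport` are the only library calls used.

```ts
// Illustration only: persist the exported network across page loads.
const STORAGE_KEY = "xor-network" // arbitrary key for this example

// Save
localStorage.setItem(STORAGE_KEY, JSON.stringify(network.export()))

// Restore (only if something was saved before)
const saved = localStorage.getItem(STORAGE_KEY)
const restored = saved ? Network.fromNetworkExport(JSON.parse(saved)) : undefined
```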
Render a network graph:

```ts
const renderer = new Renderer(document.getElementById("graph")!)
renderer.draw(network)
```

Sample XOR Network with 2 hidden layers
Currently, the graph library doesn't do a great job of fitting the whole network, so some of the coordinates in the renderer need to be adjusted by hand. Rendering larger networks isn't really feasible.
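The renderer snippet above assumes an element with id `graph` already exists in the page. As a hedged sketch (the example only shows an `HTMLElement` being passed to `Renderer`; whether a specific element type is required isn't shown), the container can also be created from TypeScript:

```ts
// Assumption: Renderer accepts a plain container element, as implied by
// the getElementById example above.
let container = document.getElementById("graph")
if (!container) {
  container = document.createElement("div")
  container.id = "graph"
  document.body.append(container)
}

const renderer = new Renderer(container)
renderer.draw(network)
```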
