Optimisers

Optimisers to use with the NPU API for training.
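For orientation, the sketch below shows how an optimiser is constructed and handed to a training call. The npu.api token step and the npu.train argument names are assumptions for illustration; check your version of the library for the exact interface.

    import npu

    npu.api('YOUR_API_TOKEN')  # assumed authentication step

    # `model`, `x_train`, and `y_train` are assumed to be defined elsewhere.
    trained_model = npu.train(
        model,
        train_data=[x_train, y_train],
        optim=npu.optim.SGD(lr=0.01, momentum=0.9),  # any optimiser from this page
        batch_size=128,
        epochs=2,
    )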

npu.optim.Adam(lr=0.001)

Adam (adaptive moment estimation) Optimiser.

Parameters
  • lr (float) – Learning rate

  • beta1 (float) – Exponential decay rate for the first-moment (mean) gradient estimate

  • beta2 (float) – Exponential decay rate for the second-moment (squared-gradient) estimate

  • epsilon (float) – Small constant added to the denominator for numerical stability
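To make the roles of these parameters concrete, the following self-contained NumPy sketch performs one standard Adam update step. The values shown are the conventional Adam defaults, not necessarily this library's, and the NPU service's internal implementation may differ.

    import numpy as np

    def adam_step(w, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, epsilon=1e-08):
        """One standard Adam update for parameters w, given gradient grad at step t >= 1."""
        m = beta1 * m + (1 - beta1) * grad       # first-moment (mean) estimate, decayed by beta1
        v = beta2 * v + (1 - beta2) * grad ** 2  # second-moment estimate, decayed by beta2
        m_hat = m / (1 - beta1 ** t)             # bias-correct both estimates
        v_hat = v / (1 - beta2 ** t)
        return w - lr * m_hat / (np.sqrt(v_hat) + epsilon), m, v  # epsilon guards the division

Starting from m = v = 0 and stepping t = 1, 2, …, this is the update applied on every batch.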

npu.optim.RMS(lr=0.001, decay=0)

RMS (RMSprop) Optimiser.

Parameters
  • lr (float) – Learning rate

  • decay (float) – Decay factor; the default of 0 applies no decay
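As a sketch of the kind of update this optimiser performs, here is a standard RMSprop-style step in NumPy. The rho and epsilon constants are internal to the sketch (npu.optim.RMS does not expose them), and treating decay as a per-step learning-rate decay is an assumption; the NPU service's implementation may differ.

    import numpy as np

    def rms_step(w, grad, sq_avg, t, lr=0.001, decay=0.0, rho=0.9, epsilon=1e-08):
        """One RMSprop-style update; rho and epsilon are sketch-internal constants."""
        lr_t = lr / (1 + decay * t)                    # assumption: decay shrinks lr over steps
        sq_avg = rho * sq_avg + (1 - rho) * grad ** 2  # moving average of squared gradients
        return w - lr_t * grad / (np.sqrt(sq_avg) + epsilon), sq_avg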

npu.optim.SGD(lr=0.001, momentum=0)

SGD (stochastic gradient descent) Optimiser.

Parameters
  • lr (float) – Learning rate

  • momentum (float) – Momentum factor; the default of 0 gives plain, momentum-free SGD
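The momentum parameter accumulates a decaying history of past gradients so that updates keep moving along persistent descent directions. A self-contained sketch of one standard momentum-SGD step follows (the NPU service's internal implementation may differ):

    def sgd_step(w, grad, velocity, lr=0.001, momentum=0.0):
        """One momentum-SGD update; momentum=0 reduces to plain gradient descent."""
        velocity = momentum * velocity - lr * grad  # decaying running sum of scaled gradients
        return w + velocity, velocity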