Follow these steps to get up and running with the NPU API.
Goals of this tutorial:

- Understand the core functionality of the NPU Python library for accelerating your AI.
- Get access to our API dashboard.
- Install our NPU Python library.
- Train your first model with our API.
Create an account in the Dashboard
The dashboard lets you view everything you are running with our API in one place. You cannot use our API without it, so create an account in the Dashboard to access all of the API's functionality.
You can learn more about the Dashboard and its functionality on its dedicated page.
Install the Python library
Using Python 3 in your environment, run:

```shell
pip3 install npu
```
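Since the package targets Python 3, it can save time to confirm which interpreter your environment uses before installing. A quick check (plain standard-library Python, independent of the npu package):

```python
import sys

# Verify that this environment runs Python 3 before installing the package.
assert sys.version_info.major == 3, "Python 3 is required"
print(sys.version.split()[0])  # prints the interpreter version, e.g. "3.10.12"
```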
Train your first model
You will now see how simple it is to train your model.
First, we import the NPU library, the model, and the dataset. In this tutorial we will use resnet18 and the CIFAR10 dataset.
```python
import npu
from npu.vision.models import resnet18
from npu.vision.datasets import CIFAR10
```
Under its vision package, the NPU library provides a range of models (fresh and pre-trained) and datasets, without requiring you to have them on your local machine. We call these global models and global datasets.
You can learn more about the vision package on its dedicated page.
Next, after importing, we enable API access. To access our accelerator cards on our cloud remotely, you need an API token, which is provided on your dashboard: you can find it on the home page or under your account. Your token will be different from the one shown below, but it should look similar.
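Rather than hard-coding the token in a script, you may prefer to read it from an environment variable. This is a minimal sketch of that pattern; the variable name `NPU_API_TOKEN` is just a convention chosen for this example, not something the library requires:

```python
import os

# Keep the API token out of source control by reading it from an
# environment variable (the name "NPU_API_TOKEN" is our own choice here).
token = os.environ.get("NPU_API_TOKEN", "")

if token:
    # npu.api(token)  # enable API access once a token is present
    pass
```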
We take the token and pass it as an argument to the API access call:
```python
import npu
from npu.vision.models import resnet18
from npu.vision.datasets import CIFAR10

npu.api('qO8teIJLqmVFtGvP1_yaBHIVXOLrf9FJezpW9thstyU')
```
We are now ready to train our first model. We specify the training and validation data, the loss, the optimiser, the batch size, and the number of epochs.
```python
import npu
from npu.vision.models import resnet18
from npu.vision.datasets import CIFAR10

npu.api('qO8teIJLqmVFtGvP1_yaBHIVXOLrf9FJezpW9thstyU')

model_trained = npu.train(
    resnet18(pretrained=True),
    train_data=CIFAR10.train,
    val_data=CIFAR10.val,
    loss=npu.loss.SparseCrossEntropyLoss,
    optim=npu.optim.SGD(lr=0.01),
    batch_size=256,
    epochs=2,
)
```
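To get a feel for how long the task will run, you can work out how many optimisation steps these settings imply. CIFAR-10 has 50,000 training images, so with `batch_size=256` and `epochs=2` (plain arithmetic, no npu code involved):

```python
import math

# CIFAR-10 training set size and the settings used above.
train_images = 50_000
batch_size = 256
epochs = 2

# Each epoch processes every image once, in batches of 256.
steps_per_epoch = math.ceil(train_images / batch_size)  # 196
total_steps = steps_per_epoch * epochs                  # 392
print(steps_per_epoch, total_steps)
```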
If you run this script, you can watch on the dashboard how the training evolves: the loss is minimised, the accuracy increases, and much more. Go to the tasks section of the dashboard to view your first training task.
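As an aside, the `optim=npu.optim.SGD(lr=0.01)` argument configures stochastic gradient descent. A single SGD update is simple enough to sketch in plain Python with toy numbers (this illustrates the general rule, not the library's internals):

```python
# One SGD step: move each weight against its gradient,
# scaled by the learning rate.
lr = 0.01
weight = 1.0
gradient = 0.5  # hypothetical gradient of the loss w.r.t. this weight

weight = weight - lr * gradient  # 1.0 - 0.01 * 0.5 = 0.995
print(weight)
```

A smaller learning rate means smaller, more cautious steps per batch.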
- Follow the tutorials to learn all the features of the NPU library.
- Check the NPU library reference page to learn more about each of the functions.
- Check the Dashboard page to learn more about its functionality.