Contents

  1. Introduction
  2. Classical-to-quantum transfer learning
  3. General setup
  4. Setting of the main hyper-parameters of the model
  5. Dataset loading
  6. Variational quantum circuit
  7. Dressed quantum circuit
  8. Hybrid classical-quantum model
  9. Training and results
  10. Visualizing the model predictions
  11. References
  12. About the author

Quantum transfer learning

Andrea Mari

Published: December 18, 2019. Last updated: September 22, 2025.

In this tutorial we apply a machine learning method, known as transfer learning, to an image classifier based on a hybrid classical-quantum network.

This example follows the general structure of the PyTorch tutorial on transfer learning by Sasank Chilamkurthy, with the crucial difference of using a quantum circuit to perform the final classification task.

More details on this topic can be found in the research paper [1] (Mari et al. (2019)).

Introduction

Transfer learning is a well-established technique for training artificial neural networks (see e.g., Ref. [2]), which is based on the general intuition that if a pre-trained network is good at solving a given problem, then, with just a bit of additional training, it can be used to also solve a different but related problem.

As discussed in Ref. [1], this idea can be formalized in terms of two abstract networks \(A\) and \(B,\) independently from their quantum or classical physical nature.


[Figure: transfer_general — general scheme of transfer learning from a pre-trained network A to a new trainable network B.]

As sketched in the above figure, one can give the following general definition of the transfer learning method:

  1. Take a network \(A\) that has been pre-trained on a dataset \(D_A\) and for a given task \(T_A.\)

  2. Remove some of the final layers. In this way, the resulting truncated network \(A'\) can be used as a feature extractor.

  3. Connect a new trainable network \(B\) at the end of the pre-trained network \(A'.\)

  4. Keep the weights of \(A'\) constant, and train the final block \(B\) with a new dataset \(D_B\) and/or for a new task of interest \(T_B.\)
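
For concreteness, the four steps above can be sketched in a few lines of purely classical (CC) PyTorch code. This is only an illustrative sketch: the backbone and the number of classes of the new task are placeholder choices, not the model built later in this tutorial.

import torch
import torch.nn as nn
import torchvision

num_new_classes = 2  # hypothetical number of classes of the new task T_B

# Step 1: take a network A pre-trained on ImageNet (the dataset D_A).
network_A = torchvision.models.resnet18(
    weights=torchvision.models.ResNet18_Weights.IMAGENET1K_V1
)

# Step 4 (freezing): keep the weights of the pre-trained part constant.
for param in network_A.parameters():
    param.requires_grad = False

# Steps 2-3: replace the final layer, so the remaining network acts as the
# feature extractor A', with a new trainable block B connected at its end.
network_A.fc = nn.Linear(network_A.fc.in_features, num_new_classes)

# Step 4 (training): only the parameters of B are passed to the optimizer
# and trained on the new dataset D_B.
optimizer = torch.optim.Adam(network_A.fc.parameters(), lr=1e-3)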

When dealing with hybrid systems, depending on the physical nature (classical or quantum) of the networks \(A\) and \(B,\) one can have different implementations of transfer learning, as summarized in the following table:


Network A  | Network B  | Transfer learning scheme
-----------|------------|-------------------------------------------------------
Classical  | Classical  | CC - Standard classical method. See e.g., Ref. [2].
Classical  | Quantum    | CQ - Hybrid model presented in this tutorial.
Quantum    | Classical  | QC - Model studied in Ref. [1].
Quantum    | Quantum    | QQ - Model studied in Ref. [1].

Classical-to-quantum transfer learning

We focus on the CQ transfer learning scheme discussed in the previous section and give a specific example.

  1. As pre-trained network \(A\) we use ResNet18, a deep residual neural network introduced by Microsoft in Ref. [3], which is pre-trained on the ImageNet dataset.

  2. After removing its final layer we obtain \(A',\) a pre-processing block which maps any input high-resolution image into 512 abstract features.

  3. Such features are classified by a 4-qubit “dressed quantum circuit” \(B,\) i.e., a variational quantum circuit sandwiched between two classical layers.

  4. The hybrid model is trained, keeping \(A'\) constant, on the Hymenoptera dataset (a small subclass of ImageNet) containing images of ants and bees.

A graphical representation of the full data processing pipeline is given in the figure below.

[Figure: transfer_c2q — classical-to-quantum (CQ) transfer learning pipeline: ResNet18 feature extraction followed by the dressed quantum circuit.]

General setup

Note

To use the PyTorch interface in PennyLane, you must first install PyTorch.

In addition to PennyLane, we will also need some standard PyTorch libraries and the plotting library matplotlib.

# Some parts of this code are based on the Python script:
# https://github.com/pytorch/tutorials/blob/master/beginner_source/transfer_learning_tutorial.py
# License: BSD

import time
import os
import copy
import urllib.request
import shutil

# PyTorch
import torch
import torch.nn as nn
import torch.optim as optim
from torch.optim import lr_scheduler
import torchvision
from torchvision import datasets, transforms

# Pennylane
import pennylane as qml
from pennylane import numpy as np

torch.manual_seed(42)
np.random.seed(42)

# Plotting
import matplotlib.pyplot as plt

# OpenMP: number of parallel threads.
os.environ["OMP_NUM_THREADS"] = "1"

Setting of the main hyper-parameters of the model

Note

To reproduce the results of Ref. [1], num_epochs should be set to 30, which may take a long time. We suggest first trying num_epochs=1 and, if everything runs smoothly, increasing it to a larger value.

n_qubits = 4                # Number of qubits
step = 0.0004               # Learning rate
batch_size = 4              # Number of samples for each training step
num_epochs = 3              # Number of training epochs
q_depth = 6                 # Depth of the quantum circuit (number of variational layers)
gamma_lr_scheduler = 0.1    # Learning rate reduction applied every 10 epochs.
q_delta = 0.01              # Initial spread of random quantum weights
start_time = time.time()    # Start of the computation timer

We initialize a PennyLane device with a default.qubit backend.

dev = qml.device("default.qubit", wires=n_qubits)

We configure PyTorch to use CUDA only if available. Otherwise the CPU is used.

device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")

Dataset loading

Note

The dataset containing images of ants and bees can be downloaded here and should be extracted in the subfolder ../_data/hymenoptera_data.

This is a very small dataset (roughly 250 images), too small to train a classical or quantum model from scratch; however, it is enough when using a transfer learning approach.

The PyTorch packages torchvision and torch.utils.data are used for loading the dataset and performing standard preliminary image operations: resize, center, crop, normalize, etc.

data_transforms = {
    "train": transforms.Compose(
        [
            # transforms.RandomResizedCrop(224),     # uncomment for data augmentation
            # transforms.RandomHorizontalFlip(),     # uncomment for data augmentation
            transforms.Resize(256),
            transforms.CenterCrop(224),
            transforms.ToTensor(),
            # Normalize input channels using mean values and standard deviations of ImageNet.
            transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
        ]
    ),
    "val": transforms.Compose(
        [
            transforms.Resize(256),
            transforms.CenterCrop(224),
            transforms.ToTensor(),
            transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
        ]
    ),
}

data_dir = "hymenoptera_data"
if not os.path.exists(data_dir):
    urllib.request.urlretrieve(
        "https://download.pytorch.org/tutorial/hymenoptera_data.zip", f"{data_dir}.zip"
    )
    shutil.unpack_archive(f"{data_dir}.zip")

image_datasets = {
    x if x == "train" else "validation": datasets.ImageFolder(
        os.path.join(data_dir, x), data_transforms[x]
    )
    for x in ["train", "val"]
}
dataset_sizes = {x: len(image_datasets[x]) for x in ["train", "validation"]}
class_names = image_datasets["train"].classes

# Initialize dataloader
dataloaders = {
    x: torch.utils.data.DataLoader(image_datasets[x], batch_size=batch_size, shuffle=True)
    for x in ["train", "validation"]
}

# function to plot images
def imshow(inp, title=None):
    """Display image from tensor."""
    inp = inp.numpy().transpose((1, 2, 0))
    # Inverse of the initial normalization operation.
    mean = np.array([0.485, 0.456, 0.406])
    std = np.array([0.229, 0.224, 0.225])
    inp = std * inp + mean
    inp = np.clip(inp, 0, 1)
    plt.imshow(inp)
    if title is not None:
        plt.title(title)

Let us show a batch of the validation data, just to get an idea of the classification problem.

# Get a batch of validation data
inputs, classes = next(iter(dataloaders["validation"]))

# Make a grid from batch
out = torchvision.utils.make_grid(inputs)

imshow(out, title=[class_names[x] for x in classes])

['bees', 'ants', 'bees', 'bees']

Variational quantum circuit

We first define some quantum layers that will compose the quantum circuit.

def H_layer(nqubits):
    """Layer of single-qubit Hadamard gates.
    """
    for idx in range(nqubits):
        qml.Hadamard(wires=idx)


def RY_layer(w):
    """Layer of parametrized qubit rotations around the y axis.
    """
    for idx, element in enumerate(w):
        qml.RY(element, wires=idx)


def entangling_layer(nqubits):
    """Layer of CNOTs followed by another shifted layer of CNOT.
    """
    # In other words it should apply something like :
    # CNOT  CNOT  CNOT  CNOT...  CNOT
    #   CNOT  CNOT  CNOT...  CNOT
    for i in range(0, nqubits - 1, 2):  # Loop over even indices: i=0,2,...N-2
        qml.CNOT(wires=[i, i + 1])
    for i in range(1, nqubits - 1, 2):  # Loop over odd indices:  i=1,3,...N-3
        qml.CNOT(wires=[i, i + 1])

Now we define the quantum circuit through the PennyLane qnode decorator.

The structure is that of a typical variational quantum circuit:

  • Embedding layer: All qubits are first initialized in a balanced superposition of up and down states, then they are rotated according to the input parameters (local embedding).

  • Variational layers: A sequence of trainable rotation layers and constant entangling layers is applied.

  • Measurement layer: For each qubit, the local expectation value of the \(Z\) operator is measured. This produces a classical output vector, suitable for additional post-processing.

@qml.qnode(dev)
def quantum_net(q_input_features, q_weights_flat):
    """
    The variational quantum circuit.
    """

    # Reshape weights
    q_weights = q_weights_flat.reshape(q_depth, n_qubits)

    # Start from state |+> , unbiased w.r.t. |0> and |1>
    H_layer(n_qubits)

    # Embed features in the quantum node
    RY_layer(q_input_features)

    # Sequence of trainable variational layers
    for k in range(q_depth):
        entangling_layer(n_qubits)
        RY_layer(q_weights[k])

    # Expectation values in the Z basis
    exp_vals = [qml.expval(qml.PauliZ(position)) for position in range(n_qubits)]
    return tuple(exp_vals)
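
Since the circuit structure can be hard to visualize from the code alone, one can print it with PennyLane's qml.draw. The snippet below is a quick inspection sketch; the feature and weight values are random and chosen only for illustration.

# Draw the circuit for arbitrary input features and weights (illustration only).
sample_features = np.random.uniform(0, np.pi, n_qubits)
sample_weights = q_delta * np.random.randn(q_depth * n_qubits)
print(qml.draw(quantum_net)(sample_features, sample_weights))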

Dressed quantum circuit

We can now define a custom torch.nn.Module representing a dressed quantum circuit.

This is a concatenation of:

  • A classical pre-processing layer (nn.Linear).

  • A classical activation function (torch.tanh).

  • A constant np.pi/2.0 scaling.

  • The previously defined quantum circuit (quantum_net).

  • A classical post-processing layer (nn.Linear).

The input of the module is a batch of vectors with 512 real features, and the output is a batch of two-dimensional vectors (associated with the two classes of images: ants and bees).

class DressedQuantumNet(nn.Module):
    """
    Torch module implementing the *dressed* quantum net.
    """

    def __init__(self):
        """
        Definition of the *dressed* layout.
        """

        super().__init__()
        self.pre_net = nn.Linear(512, n_qubits)
        self.q_params = nn.Parameter(q_delta * torch.randn(q_depth * n_qubits))
        self.post_net = nn.Linear(n_qubits, 2)

    def forward(self, input_features):
        """
        Defining how tensors are supposed to move through the *dressed* quantum
        net.
        """

        # obtain the input features for the quantum circuit
        # by reducing the feature dimension from 512 to 4
        pre_out = self.pre_net(input_features)
        q_in = torch.tanh(pre_out) * np.pi / 2.0

        # Apply the quantum circuit to each element of the batch and append to q_out
        q_out = torch.Tensor(0, n_qubits)
        q_out = q_out.to(device)
        for elem in q_in:
            q_out_elem = torch.hstack(quantum_net(elem, self.q_params)).float().unsqueeze(0)
            q_out = torch.cat((q_out, q_out_elem))

        # return the two-dimensional prediction from the postprocessing layer
        return self.post_net(q_out)
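
As a quick sanity check of the shapes involved, one can feed a random batch of 512-dimensional feature vectors through the dressed circuit; the input values below are arbitrary and used only for illustration.

# Sketch: a random batch of "features" mapped to two-dimensional predictions.
dressed_net = DressedQuantumNet().to(device)
dummy_features = torch.randn(batch_size, 512).to(device)
print(dressed_net(dummy_features).shape)  # torch.Size([4, 2]) with batch_size = 4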

Hybrid classical-quantum model

We are finally ready to build our full hybrid classical-quantum network. We follow the transfer learning approach:

  1. First load the classical pre-trained network ResNet18 from the torchvision.models zoo.

  2. Freeze all the weights since they should not be trained.

  3. Replace the last fully connected layer with our trainable dressed quantum circuit (DressedQuantumNet).

Note

The ResNet18 model is automatically downloaded by PyTorch and it may take several minutes (only the first time).

weights = torchvision.models.ResNet18_Weights.IMAGENET1K_V1
model_hybrid = torchvision.models.resnet18(weights=weights)

for param in model_hybrid.parameters():
    param.requires_grad = False


# Notice that model_hybrid.fc is the last layer of ResNet18
model_hybrid.fc = DressedQuantumNet()

# Use CUDA or CPU according to the "device" object.
model_hybrid = model_hybrid.to(device)
Downloading: "https://download.pytorch.org/models/resnet18-f37072fd.pth" to /home/runner/.cache/torch/hub/checkpoints/resnet18-f37072fd.pth

100.0%
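
As a quick check that the freezing worked as intended, one can count how many parameters of the hybrid model are actually trainable; only the dressed quantum circuit should require gradients.

# Only the dressed quantum circuit is trainable: pre_net (512*4 + 4 = 2052),
# q_params (q_depth * n_qubits = 24) and post_net (4*2 + 2 = 10), i.e. 2086 parameters.
trainable_params = sum(p.numel() for p in model_hybrid.parameters() if p.requires_grad)
total_params = sum(p.numel() for p in model_hybrid.parameters())
print(f"Trainable parameters: {trainable_params} / {total_params}")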

Training and results

Before training the network we need to specify the loss function.

As is usual for classification problems, we use the cross-entropy loss, which is directly available within torch.nn.

criterion = nn.CrossEntropyLoss()

We also initialize the Adam optimizer which is called at each training step in order to update the weights of the model.

optimizer_hybrid = optim.Adam(model_hybrid.fc.parameters(), lr=step)

We schedule the learning rate to be reduced by a factor of gamma_lr_scheduler every 10 epochs.

exp_lr_scheduler = lr_scheduler.StepLR(
    optimizer_hybrid, step_size=10, gamma=gamma_lr_scheduler
)
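
Concretely, since scheduler.step() is called once per epoch in the training loop below, the learning rate used during (zero-based) epoch e is step * gamma_lr_scheduler ** (e // 10). The short sketch below simply prints this schedule for a few epochs, purely for illustration.

# Illustration only: effective learning rate per epoch under the StepLR schedule.
for epoch in [0, 9, 10, 19, 20]:
    effective_lr = step * gamma_lr_scheduler ** (epoch // 10)
    print(f"epoch {epoch + 1:2d}: lr = {effective_lr:.6f}")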

What follows is a training function that will be called later. This function should return a trained model that can be used to make predictions (classifications).

def train_model(model, criterion, optimizer, scheduler, num_epochs):
    since = time.time()
    best_model_wts = copy.deepcopy(model.state_dict())
    best_acc = 0.0
    best_loss = 10000.0  # Large arbitrary number
    best_acc_train = 0.0
    best_loss_train = 10000.0  # Large arbitrary number
    print("Training started:")

    for epoch in range(num_epochs):

        # Each epoch has a training and validation phase
        for phase in ["train", "validation"]:
            if phase == "train":
                # Set model to training mode
                model.train()
            else:
                # Set model to evaluate mode
                model.eval()
            running_loss = 0.0
            running_corrects = 0

            # Iterate over data.
            n_batches = dataset_sizes[phase] // batch_size
            it = 0
            for inputs, labels in dataloaders[phase]:
                since_batch = time.time()
                batch_size_ = len(inputs)
                inputs = inputs.to(device)
                labels = labels.to(device)
                optimizer.zero_grad()

                # Track/compute gradient and make an optimization step only when training
                with torch.set_grad_enabled(phase == "train"):
                    outputs = model(inputs)
                    _, preds = torch.max(outputs, 1)
                    loss = criterion(outputs, labels)
                    if phase == "train":
                        loss.backward()
                        optimizer.step()

                # Print iteration results
                running_loss += loss.item() * batch_size_
                batch_corrects = torch.sum(preds == labels.data).item()
                running_corrects += batch_corrects
                print(
                    "Phase: {} Epoch: {}/{} Iter: {}/{} Batch time: {:.4f}".format(
                        phase,
                        epoch + 1,
                        num_epochs,
                        it + 1,
                        n_batches + 1,
                        time.time() - since_batch,
                    ),
                    end="\r",
                    flush=True,
                )
                it += 1

            # Print epoch results
            epoch_loss = running_loss / dataset_sizes[phase]
            epoch_acc = running_corrects / dataset_sizes[phase]
            print(
                "Phase: {} Epoch: {}/{} Loss: {:.4f} Acc: {:.4f}        ".format(
                    "train" if phase == "train" else "validation  ",
                    epoch + 1,
                    num_epochs,
                    epoch_loss,
                    epoch_acc,
                )
            )

            # Check if this is the best model wrt previous epochs
            if phase == "validation" and epoch_acc > best_acc:
                best_acc = epoch_acc
                best_model_wts = copy.deepcopy(model.state_dict())
            if phase == "validation" and epoch_loss < best_loss:
                best_loss = epoch_loss
            if phase == "train" and epoch_acc > best_acc_train:
                best_acc_train = epoch_acc
            if phase == "train" and epoch_loss < best_loss_train:
                best_loss_train = epoch_loss

            # Update learning rate
            if phase == "train":
                scheduler.step()

    # Print final results
    model.load_state_dict(best_model_wts)
    time_elapsed = time.time() - since
    print(
        "Training completed in {:.0f}m {:.0f}s".format(time_elapsed // 60, time_elapsed % 60)
    )
    print("Best test loss: {:.4f} | Best test accuracy: {:.4f}".format(best_loss, best_acc))
    return model

We are ready to perform the actual training process.

model_hybrid = train_model(
    model_hybrid, criterion, optimizer_hybrid, exp_lr_scheduler, num_epochs=num_epochs
)
Training started:
Phase: train Epoch: 1/3 Loss: 0.6990 Acc: 0.5246
Phase: validation   Epoch: 1/3 Loss: 0.6429 Acc: 0.6536
Phase: train Epoch: 2/3 Iter: 1/62 Batch time: 0.1954
Phase: train Epoch: 2/3 Iter: 2/62 Batch time: 0.1966
Phase: train Epoch: 2/3 Iter: 3/62 Batch time: 0.1970
Phase: train Epoch: 2/3 Iter: 4/62 Batch time: 0.1968
Phase: train Epoch: 2/3 Iter: 5/62 Batch time: 0.1985
Phase: train Epoch: 2/3 Iter: 6/62 Batch time: 0.1957
Phase: train Epoch: 2/3 Iter: 7/62 Batch time: 0.1989
Phase: train Epoch: 2/3 Iter: 8/62 Batch time: 0.1969
Phase: train Epoch: 2/3 Iter: 9/62 Batch time: 0.1965
Phase: train Epoch: 2/3 Iter: 10/62 Batch time: 0.1977
Phase: train Epoch: 2/3 Iter: 11/62 Batch time: 0.2007
Phase: train Epoch: 2/3 Iter: 12/62 Batch time: 0.1973
Phase: train Epoch: 2/3 Iter: 13/62 Batch time: 0.1971
Phase: train Epoch: 2/3 Iter: 14/62 Batch time: 0.1989
Phase: train Epoch: 2/3 Iter: 15/62 Batch time: 0.1979
Phase: train Epoch: 2/3 Iter: 16/62 Batch time: 0.1975
Phase: train Epoch: 2/3 Iter: 17/62 Batch time: 0.1967
Phase: train Epoch: 2/3 Iter: 18/62 Batch time: 0.1978
Phase: train Epoch: 2/3 Iter: 19/62 Batch time: 0.1970
Phase: train Epoch: 2/3 Iter: 20/62 Batch time: 0.1966
Phase: train Epoch: 2/3 Iter: 21/62 Batch time: 0.1972
Phase: train Epoch: 2/3 Iter: 22/62 Batch time: 0.2147
Phase: train Epoch: 2/3 Iter: 23/62 Batch time: 0.2073
Phase: train Epoch: 2/3 Iter: 24/62 Batch time: 0.2040
Phase: train Epoch: 2/3 Iter: 25/62 Batch time: 0.2008
Phase: train Epoch: 2/3 Iter: 26/62 Batch time: 0.2017
Phase: train Epoch: 2/3 Iter: 27/62 Batch time: 0.1987
Phase: train Epoch: 2/3 Iter: 28/62 Batch time: 0.1974
Phase: train Epoch: 2/3 Iter: 29/62 Batch time: 0.1976
Phase: train Epoch: 2/3 Iter: 30/62 Batch time: 0.1974
Phase: train Epoch: 2/3 Iter: 31/62 Batch time: 0.1965
Phase: train Epoch: 2/3 Iter: 32/62 Batch time: 0.1971
Phase: train Epoch: 2/3 Iter: 33/62 Batch time: 0.1972
Phase: train Epoch: 2/3 Iter: 34/62 Batch time: 0.1971
Phase: train Epoch: 2/3 Iter: 35/62 Batch time: 0.1980
Phase: train Epoch: 2/3 Iter: 36/62 Batch time: 0.1993
Phase: train Epoch: 2/3 Iter: 37/62 Batch time: 0.1970
Phase: train Epoch: 2/3 Iter: 38/62 Batch time: 0.1977
Phase: train Epoch: 2/3 Iter: 39/62 Batch time: 0.1977
Phase: train Epoch: 2/3 Iter: 40/62 Batch time: 0.1965
Phase: train Epoch: 2/3 Iter: 41/62 Batch time: 0.1981
Phase: train Epoch: 2/3 Iter: 42/62 Batch time: 0.1973
Phase: train Epoch: 2/3 Iter: 43/62 Batch time: 0.1973
Phase: train Epoch: 2/3 Iter: 44/62 Batch time: 0.1976
Phase: train Epoch: 2/3 Iter: 45/62 Batch time: 0.1978
Phase: train Epoch: 2/3 Iter: 46/62 Batch time: 0.1974
Phase: train Epoch: 2/3 Iter: 47/62 Batch time: 0.1970
Phase: train Epoch: 2/3 Iter: 48/62 Batch time: 0.1971
Phase: train Epoch: 2/3 Iter: 49/62 Batch time: 0.1976
Phase: train Epoch: 2/3 Iter: 50/62 Batch time: 0.2201
Phase: train Epoch: 2/3 Iter: 51/62 Batch time: 0.2009
Phase: train Epoch: 2/3 Iter: 52/62 Batch time: 0.1966
Phase: train Epoch: 2/3 Iter: 53/62 Batch time: 0.1972
Phase: train Epoch: 2/3 Iter: 54/62 Batch time: 0.1968
Phase: train Epoch: 2/3 Iter: 55/62 Batch time: 0.1971
Phase: train Epoch: 2/3 Iter: 56/62 Batch time: 0.1969
Phase: train Epoch: 2/3 Iter: 57/62 Batch time: 0.1969
Phase: train Epoch: 2/3 Iter: 58/62 Batch time: 0.1965
Phase: train Epoch: 2/3 Iter: 59/62 Batch time: 0.1968
Phase: train Epoch: 2/3 Iter: 60/62 Batch time: 0.1966
Phase: train Epoch: 2/3 Iter: 61/62 Batch time: 0.1982
Phase: train Epoch: 2/3 Loss: 0.6134 Acc: 0.7008
Phase: validation Epoch: 2/3 Iter: 1/39 Batch time: 0.1578
Phase: validation Epoch: 2/3 Iter: 2/39 Batch time: 0.1560
Phase: validation Epoch: 2/3 Iter: 3/39 Batch time: 0.1553
Phase: validation Epoch: 2/3 Iter: 4/39 Batch time: 0.1559
Phase: validation Epoch: 2/3 Iter: 5/39 Batch time: 0.1584
Phase: validation Epoch: 2/3 Iter: 6/39 Batch time: 0.1557
Phase: validation Epoch: 2/3 Iter: 7/39 Batch time: 0.1562
Phase: validation Epoch: 2/3 Iter: 8/39 Batch time: 0.1551
Phase: validation Epoch: 2/3 Iter: 9/39 Batch time: 0.1554
Phase: validation Epoch: 2/3 Iter: 10/39 Batch time: 0.1551
Phase: validation Epoch: 2/3 Iter: 11/39 Batch time: 0.1554
Phase: validation Epoch: 2/3 Iter: 12/39 Batch time: 0.1566
Phase: validation Epoch: 2/3 Iter: 13/39 Batch time: 0.1550
Phase: validation Epoch: 2/3 Iter: 14/39 Batch time: 0.1557
Phase: validation Epoch: 2/3 Iter: 15/39 Batch time: 0.1556
Phase: validation Epoch: 2/3 Iter: 16/39 Batch time: 0.1559
Phase: validation Epoch: 2/3 Iter: 17/39 Batch time: 0.1562
Phase: validation Epoch: 2/3 Iter: 18/39 Batch time: 0.1560
Phase: validation Epoch: 2/3 Iter: 19/39 Batch time: 0.1557
Phase: validation Epoch: 2/3 Iter: 20/39 Batch time: 0.1554
Phase: validation Epoch: 2/3 Iter: 21/39 Batch time: 0.1559
Phase: validation Epoch: 2/3 Iter: 22/39 Batch time: 0.1556
Phase: validation Epoch: 2/3 Iter: 23/39 Batch time: 0.1573
Phase: validation Epoch: 2/3 Iter: 24/39 Batch time: 0.1558
Phase: validation Epoch: 2/3 Iter: 25/39 Batch time: 0.1563
Phase: validation Epoch: 2/3 Iter: 26/39 Batch time: 0.1553
Phase: validation Epoch: 2/3 Iter: 27/39 Batch time: 0.1580
Phase: validation Epoch: 2/3 Iter: 28/39 Batch time: 0.1549
Phase: validation Epoch: 2/3 Iter: 29/39 Batch time: 0.1557
Phase: validation Epoch: 2/3 Iter: 30/39 Batch time: 0.1555
Phase: validation Epoch: 2/3 Iter: 31/39 Batch time: 0.1564
Phase: validation Epoch: 2/3 Iter: 32/39 Batch time: 0.1559
Phase: validation Epoch: 2/3 Iter: 33/39 Batch time: 0.1563
Phase: validation Epoch: 2/3 Iter: 34/39 Batch time: 0.1568
Phase: validation Epoch: 2/3 Iter: 35/39 Batch time: 0.1562
Phase: validation Epoch: 2/3 Iter: 36/39 Batch time: 0.1562
Phase: validation Epoch: 2/3 Iter: 37/39 Batch time: 0.1563
Phase: validation Epoch: 2/3 Iter: 38/39 Batch time: 0.1560
Phase: validation Epoch: 2/3 Iter: 39/39 Batch time: 0.0440
Phase: validation   Epoch: 2/3 Loss: 0.5389 Acc: 0.8235
Phase: train Epoch: 3/3 Iter: 1/62 Batch time: 0.1936
Phase: train Epoch: 3/3 Iter: 2/62 Batch time: 0.1962
Phase: train Epoch: 3/3 Iter: 3/62 Batch time: 0.1952
Phase: train Epoch: 3/3 Iter: 4/62 Batch time: 0.1978
Phase: train Epoch: 3/3 Iter: 5/62 Batch time: 0.1963
Phase: train Epoch: 3/3 Iter: 6/62 Batch time: 0.1957
Phase: train Epoch: 3/3 Iter: 7/62 Batch time: 0.1994
Phase: train Epoch: 3/3 Iter: 8/62 Batch time: 0.1981
Phase: train Epoch: 3/3 Iter: 9/62 Batch time: 0.1967
Phase: train Epoch: 3/3 Iter: 10/62 Batch time: 0.1976
Phase: train Epoch: 3/3 Iter: 11/62 Batch time: 0.1969
Phase: train Epoch: 3/3 Iter: 12/62 Batch time: 0.1964
Phase: train Epoch: 3/3 Iter: 13/62 Batch time: 0.1973
Phase: train Epoch: 3/3 Iter: 14/62 Batch time: 0.1968
Phase: train Epoch: 3/3 Iter: 15/62 Batch time: 0.1972
Phase: train Epoch: 3/3 Iter: 16/62 Batch time: 0.1965
Phase: train Epoch: 3/3 Iter: 17/62 Batch time: 0.1966
Phase: train Epoch: 3/3 Iter: 18/62 Batch time: 0.1973
Phase: train Epoch: 3/3 Iter: 19/62 Batch time: 0.1972
Phase: train Epoch: 3/3 Iter: 20/62 Batch time: 0.1969
Phase: train Epoch: 3/3 Iter: 21/62 Batch time: 0.1967
Phase: train Epoch: 3/3 Iter: 22/62 Batch time: 0.1973
Phase: train Epoch: 3/3 Iter: 23/62 Batch time: 0.1969
Phase: train Epoch: 3/3 Iter: 24/62 Batch time: 0.1985
Phase: train Epoch: 3/3 Iter: 25/62 Batch time: 0.1970
Phase: train Epoch: 3/3 Iter: 26/62 Batch time: 0.1986
Phase: train Epoch: 3/3 Iter: 27/62 Batch time: 0.1976
Phase: train Epoch: 3/3 Iter: 28/62 Batch time: 0.1967
Phase: train Epoch: 3/3 Iter: 29/62 Batch time: 0.1982
Phase: train Epoch: 3/3 Iter: 30/62 Batch time: 0.1963
Phase: train Epoch: 3/3 Iter: 31/62 Batch time: 0.1970
Phase: train Epoch: 3/3 Iter: 32/62 Batch time: 0.1970
Phase: train Epoch: 3/3 Iter: 33/62 Batch time: 0.1974
Phase: train Epoch: 3/3 Iter: 34/62 Batch time: 0.1978
Phase: train Epoch: 3/3 Iter: 35/62 Batch time: 0.1969
Phase: train Epoch: 3/3 Iter: 36/62 Batch time: 0.1970
Phase: train Epoch: 3/3 Iter: 37/62 Batch time: 0.1980
Phase: train Epoch: 3/3 Iter: 38/62 Batch time: 0.1988
Phase: train Epoch: 3/3 Iter: 39/62 Batch time: 0.1971
Phase: train Epoch: 3/3 Iter: 40/62 Batch time: 0.1973
Phase: train Epoch: 3/3 Iter: 41/62 Batch time: 0.1978
Phase: train Epoch: 3/3 Iter: 42/62 Batch time: 0.1980
Phase: train Epoch: 3/3 Iter: 43/62 Batch time: 0.1994
Phase: train Epoch: 3/3 Iter: 44/62 Batch time: 0.1986
Phase: train Epoch: 3/3 Iter: 45/62 Batch time: 0.1968
Phase: train Epoch: 3/3 Iter: 46/62 Batch time: 0.1977
Phase: train Epoch: 3/3 Iter: 47/62 Batch time: 0.1977
Phase: train Epoch: 3/3 Iter: 48/62 Batch time: 0.1983
Phase: train Epoch: 3/3 Iter: 49/62 Batch time: 0.1971
Phase: train Epoch: 3/3 Iter: 50/62 Batch time: 0.1971
Phase: train Epoch: 3/3 Iter: 51/62 Batch time: 0.1974
Phase: train Epoch: 3/3 Iter: 52/62 Batch time: 0.2003
Phase: train Epoch: 3/3 Iter: 53/62 Batch time: 0.2003
Phase: train Epoch: 3/3 Iter: 54/62 Batch time: 0.1984
Phase: train Epoch: 3/3 Iter: 55/62 Batch time: 0.1980
Phase: train Epoch: 3/3 Iter: 56/62 Batch time: 0.1975
Phase: train Epoch: 3/3 Iter: 57/62 Batch time: 0.1974
Phase: train Epoch: 3/3 Iter: 58/62 Batch time: 0.1965
Phase: train Epoch: 3/3 Iter: 59/62 Batch time: 0.1992
Phase: train Epoch: 3/3 Iter: 60/62 Batch time: 0.1972
Phase: train Epoch: 3/3 Iter: 61/62 Batch time: 0.1977
Phase: train Epoch: 3/3 Loss: 0.5652 Acc: 0.7418
Phase: validation Epoch: 3/3 Iter: 1/39 Batch time: 0.1579
Phase: validation Epoch: 3/3 Iter: 2/39 Batch time: 0.1557
Phase: validation Epoch: 3/3 Iter: 3/39 Batch time: 0.1567
Phase: validation Epoch: 3/3 Iter: 4/39 Batch time: 0.1567
Phase: validation Epoch: 3/3 Iter: 5/39 Batch time: 0.1559
Phase: validation Epoch: 3/3 Iter: 6/39 Batch time: 0.1565
Phase: validation Epoch: 3/3 Iter: 7/39 Batch time: 0.1563
Phase: validation Epoch: 3/3 Iter: 8/39 Batch time: 0.1563
Phase: validation Epoch: 3/3 Iter: 9/39 Batch time: 0.1566
Phase: validation Epoch: 3/3 Iter: 10/39 Batch time: 0.1565
Phase: validation Epoch: 3/3 Iter: 11/39 Batch time: 0.1561
Phase: validation Epoch: 3/3 Iter: 12/39 Batch time: 0.1562
Phase: validation Epoch: 3/3 Iter: 13/39 Batch time: 0.1562
Phase: validation Epoch: 3/3 Iter: 14/39 Batch time: 0.1564
Phase: validation Epoch: 3/3 Iter: 15/39 Batch time: 0.1564
Phase: validation Epoch: 3/3 Iter: 16/39 Batch time: 0.1565
Phase: validation Epoch: 3/3 Iter: 17/39 Batch time: 0.1569
Phase: validation Epoch: 3/3 Iter: 18/39 Batch time: 0.1561
Phase: validation Epoch: 3/3 Iter: 19/39 Batch time: 0.1560
Phase: validation Epoch: 3/3 Iter: 20/39 Batch time: 0.1561
Phase: validation Epoch: 3/3 Iter: 21/39 Batch time: 0.1563
Phase: validation Epoch: 3/3 Iter: 22/39 Batch time: 0.1564
Phase: validation Epoch: 3/3 Iter: 23/39 Batch time: 0.1560
Phase: validation Epoch: 3/3 Iter: 24/39 Batch time: 0.1572
Phase: validation Epoch: 3/3 Iter: 25/39 Batch time: 0.1568
Phase: validation Epoch: 3/3 Iter: 26/39 Batch time: 0.1573
Phase: validation Epoch: 3/3 Iter: 27/39 Batch time: 0.1567
Phase: validation Epoch: 3/3 Iter: 28/39 Batch time: 0.1567
Phase: validation Epoch: 3/3 Iter: 29/39 Batch time: 0.1576
Phase: validation Epoch: 3/3 Iter: 30/39 Batch time: 0.1568
Phase: validation Epoch: 3/3 Iter: 31/39 Batch time: 0.1563
Phase: validation Epoch: 3/3 Iter: 32/39 Batch time: 0.1561
Phase: validation Epoch: 3/3 Iter: 33/39 Batch time: 0.1568
Phase: validation Epoch: 3/3 Iter: 34/39 Batch time: 0.1562
Phase: validation Epoch: 3/3 Iter: 35/39 Batch time: 0.1562
Phase: validation Epoch: 3/3 Iter: 36/39 Batch time: 0.1564
Phase: validation Epoch: 3/3 Iter: 37/39 Batch time: 0.1561
Phase: validation Epoch: 3/3 Iter: 38/39 Batch time: 0.1565
Phase: validation Epoch: 3/3 Iter: 39/39 Batch time: 0.0439
Phase: validation   Epoch: 3/3 Loss: 0.4484 Acc: 0.8497
Training completed in 0m 60s
Best test loss: 0.4484 | Best test accuracy: 0.8497
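
At this point the hybrid model has been fine-tuned and, as reported above, reaches a best validation accuracy of about 0.85. If you want to reuse the trained weights later without repeating the optimization, standard PyTorch serialization can be used. Below is a minimal sketch; the file name model_hybrid.pt is only an illustrative choice, and the snippet reuses the model_hybrid object defined earlier in the tutorial.

# Save the trained parameters of the dressed quantum circuit and classical layers.
torch.save(model_hybrid.state_dict(), "model_hybrid.pt")  # hypothetical file name

# Later, after rebuilding the same architecture, restore the trained parameters.
model_hybrid.load_state_dict(torch.load("model_hybrid.pt"))
model_hybrid.eval()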

Visualizing the model predictions

We first define a visualization function that displays a batch of images from the validation set together with the class names predicted by the model.

def visualize_model(model, num_images=6, fig_name="Predictions"):
    images_so_far = 0
    _fig = plt.figure(fig_name)
    # Switch to evaluation mode so that layers such as batch normalization
    # behave deterministically during inference.
    model.eval()
    with torch.no_grad():
        for _i, (inputs, labels) in enumerate(dataloaders["validation"]):
            inputs = inputs.to(device)
            labels = labels.to(device)
            # Forward pass: the predicted class is the output with the highest score.
            outputs = model(inputs)
            _, preds = torch.max(outputs, 1)
            for j in range(inputs.size()[0]):
                images_so_far += 1
                ax = plt.subplot(num_images // 2, 2, images_so_far)
                ax.axis("off")
                ax.set_title("[{}]".format(class_names[preds[j]]))
                imshow(inputs.cpu().data[j])
                # Stop once the requested number of images has been plotted.
                if images_so_far == num_images:
                    return

Finally, we can run the previous function to see a batch of images with the corresponding predictions.

visualize_model(model_hybrid, num_images=batch_size)
plt.show()
[ants], [ants], [ants], [ants]
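
The visual check above only covers a single batch. To confirm the accuracy reported during training quantitatively, one can loop over the full validation set. The following is a minimal sketch that reuses the model_hybrid, dataloaders and device objects defined earlier in the tutorial.

correct = 0
total = 0
model_hybrid.eval()
with torch.no_grad():
    for inputs, labels in dataloaders["validation"]:
        inputs = inputs.to(device)
        labels = labels.to(device)
        # The predicted class is the index of the largest output score.
        preds = model_hybrid(inputs).argmax(dim=1)
        correct += (preds == labels).sum().item()
        total += labels.size(0)

print("Validation accuracy: {:.4f}".format(correct / total))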

References

[1] Andrea Mari, Thomas R. Bromley, Josh Izaac, Maria Schuld, and Nathan Killoran. Transfer learning in hybrid classical-quantum neural networks. arXiv:1912.08278 (2019).

[2] Rajat Raina, Alexis Battle, Honglak Lee, Benjamin Packer, and Andrew Y. Ng. Self-taught learning: transfer learning from unlabeled data. Proceedings of the 24th International Conference on Machine Learning, 759–766 (2007).

[3] Kaiming He, Xiangyu Zhang, Shaoqing Ren, and Jian Sun. Deep residual learning for image recognition. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 770–778 (2016).

[4] Ville Bergholm, Josh Izaac, Maria Schuld, Christian Gogolin, Carsten Blank, Keri McKiernan, and Nathan Killoran. PennyLane: Automatic differentiation of hybrid quantum-classical computations. arXiv:1811.04968 (2018).

About the author

Andrea Mari

Andrea obtained a PhD in quantum information theory from the University of Potsdam (Germany). He worked as a postdoc at Scuola Normale Superiore (Pisa, Italy) and as a remote researcher at Xanadu. Since 2020 he has been a Member of Technical Staff at Unitary ...

Total running time of the script: (1 minute 6.130 seconds)
