PyTorch freeze part of the layers

Jimmy (xiaoke) Shen
5 min read · Jun 17, 2020

In PyTorch, we can freeze a layer by setting the requires_grad attribute of its parameters to False. Freezing weights is helpful when we want to apply a pretrained model and keep part of it fixed.

Here I’d like to explore this process.
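To see the basic mechanism in isolation (a minimal sketch using a standalone linear layer, not yet the toy model built below), setting requires_grad to False on a parameter stops autograd from computing and accumulating gradients for it:

import torch
import torch.nn as nn

layer = nn.Linear(2, 4)

# Turn off gradient tracking for every parameter of this layer
for param in layer.parameters():
    param.requires_grad = False

# The input still requires grad so that a backward graph exists at all
x = torch.randn(5, 2, requires_grad=True)
layer(x).sum().backward()

print(layer.weight.grad)  # None: the frozen weight accumulated no gradient
print(x.grad.shape)       # torch.Size([5, 2]): the input still received a gradient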

Build a toy model

import torch.nn as nn
from torch.autograd import Variable
import torch.optim as optim

# A tiny 2 -> 4 -> 3 -> 1 feed-forward network with a sigmoid output
class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(2, 4)
        self.fc2 = nn.Linear(4, 3)
        self.out = nn.Linear(3, 1)
        self.out_act = nn.Sigmoid()
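The class definition continues with a forward pass. A minimal sketch of how it might be completed and then partially frozen (assuming a plain feed-forward pass through the three layers; the freezing loop and the optimizer filter are my own additions, not necessarily the author's exact code):

    def forward(self, x):
        # Chain the layers and squash the output to (0, 1)
        a1 = self.fc1(x)
        a2 = self.fc2(a1)
        a3 = self.out(a2)
        return self.out_act(a3)

net = Net()

# Freeze fc1: its weight and bias will stay at their current values
for param in net.fc1.parameters():
    param.requires_grad = False

# Pass only the parameters that still require gradients to the optimizer
optimizer = optim.Adam(filter(lambda p: p.requires_grad, net.parameters()), lr=0.01)

Filtering the parameter list is not strictly required, but it makes the freezing intent explicit at the optimizer as well.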
