Date: 2021-06-19 | Category: Python code
Python code
for i, para in enumerate(self._net.module.features.parameters()):
    if i < 16:
        para.requires_grad = False
    else:
        para.requires_grad = True

# Solver.
# self._solver = torch.optim.SGD(
#     self._net.parameters(), lr=self._options['base_lr'],
#     momentum=0.9, weight_decay=self._options['weight_decay'])
self._solver = torch.optim.SGD(
    self._net.module.parameters(), lr=self._options['base_lr'],
    momentum=0.9, weight_decay=self._options['weight_decay'])
Analysis
The for loop sets the requires_grad attribute to False on the parameter tensors of the layers to be frozen (here, the first 16 parameter tensors of features). With requires_grad = False, autograd never computes gradients for those tensors, so even though they are still handed to the SGD optimizer, their .grad stays None and the optimizer skips them during the update step.
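The same freezing pattern can be shown in a minimal, self-contained sketch. The network below is a hypothetical stand-in for the article's self._net.module (whose backbone lives under .features); the cutoff index and hyperparameters are illustrative, not taken from the original training setup. A common variant, also shown here, is to pass only the trainable parameters to the optimizer so frozen ones never reach it at all:

```python
import torch
import torch.nn as nn

# Hypothetical small backbone standing in for self._net.module.features.
features = nn.Sequential(
    nn.Conv2d(3, 8, kernel_size=3), nn.ReLU(),
    nn.Conv2d(8, 16, kernel_size=3), nn.ReLU(),
)

# Freeze the first two parameter tensors (the first conv's weight and bias),
# mirroring the article's index-based cutoff (`i < 16` in the original).
for i, para in enumerate(features.parameters()):
    para.requires_grad = (i >= 2)

# Variant: hand the optimizer only the parameters that still require grad.
solver = torch.optim.SGD(
    (p for p in features.parameters() if p.requires_grad),
    lr=0.01, momentum=0.9, weight_decay=1e-4)

frozen = [i for i, p in enumerate(features.parameters())
          if not p.requires_grad]
print(frozen)  # indices of the frozen parameter tensors
```

Filtering the parameters is optional: passing frozen parameters to SGD is harmless because their .grad remains None, but filtering makes the intent explicit and avoids weight decay ever touching them.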