# Collect only the parameters whose names pass the TRAIN_WEIGHT_LIST check, and hand just those to the optimizer.
params_to_optimize = [param for name, param in self.model.named_parameters()
                      if self.check_name_validity(TRAIN_WEIGHT_LIST, name)]
self.opt = optim.AdamW(params_to_optimize, lr=config['lr'], weight_decay=config['w_decay'])
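The snippet above relies on a check_name_validity helper and a TRAIN_WEIGHT_LIST defined elsewhere in the class. As a rough, self-contained sketch of the same idea, the following assumes a hypothetical substring-based check_name_validity and a toy two-layer model; it is illustrative only, not the original implementation.

# Minimal sketch: train only the parameters whose names match TRAIN_WEIGHT_LIST.
# check_name_validity, TRAIN_WEIGHT_LIST contents, and the toy model are assumptions for illustration.
import torch
import torch.nn as nn
import torch.optim as optim

TRAIN_WEIGHT_LIST = ['head']  # hypothetical: only parameters whose names contain 'head' are trained

def check_name_validity(valid_keywords, name):
    # Assumed behavior: a parameter name is trainable if it contains any of the keywords.
    return any(keyword in name for keyword in valid_keywords)

model = nn.Sequential()
model.add_module('backbone', nn.Linear(16, 16))
model.add_module('head', nn.Linear(16, 4))

# Freeze everything that is not selected, so gradients are not computed for it.
for name, param in model.named_parameters():
    param.requires_grad = check_name_validity(TRAIN_WEIGHT_LIST, name)

# Mirror the snippet above: pass only the selected parameters to AdamW.
params_to_optimize = [param for name, param in model.named_parameters()
                      if check_name_validity(TRAIN_WEIGHT_LIST, name)]
opt = optim.AdamW(params_to_optimize, lr=1e-3, weight_decay=1e-2)

# One dummy step: only the 'head.*' parameters receive updates.
x, target = torch.randn(8, 16), torch.randn(8, 4)
loss = nn.functional.mse_loss(model(x), target)
loss.backward()
opt.step()

Passing only the selected subset to AdamW keeps the optimizer state, and its decoupled weight decay, restricted to the weights that are actually meant to change.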