Optimiser
These configs (generally) configure PyTorch Optimizer objects; the exception is Lion, whose config targets the third-party lion_pytorch package.
Note

To use an alternative optimiser config, use a command like:

workshop train optimiser=<OPTIMISER_NAME> encoder=gvp dataset=cath task=inverse_folding trainer=cpu
# or
python proteinworkshop/train.py optimiser=<OPTIMISER_NAME> encoder=gvp dataset=cath task=inverse_folding trainer=cpu # or trainer=gpu

where <OPTIMISER_NAME> is the name of the optimiser config.
Note

To change the learning rate, use a command like:

workshop train optimiser.optimizer.lr=0.0001 encoder=gvp dataset=cath task=inverse_folding trainer=cpu
# or
python proteinworkshop/train.py optimiser.optimizer.lr=0.0001 encoder=gvp dataset=cath task=inverse_folding trainer=cpu # or trainer=gpu

where 0.0001 is the new learning rate. (Note the full override path is optimiser.optimizer.lr, matching the example usage shown for each config below.)
ADAM (adam)
# Example usage:
python proteinworkshop/train.py ... optimiser=adam optimiser.optimizer.lr=0.0001 ...
optimizer:
  _target_: torch.optim.Adam
  _partial_: true
  lr: 0.001
  weight_decay: 0.0
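The _partial_: true flag tells Hydra not to construct the optimizer immediately: torch.optim.Adam also needs the model's parameters, which only exist at runtime. Below is a minimal standalone sketch of this mechanism, using Hydra's hydra.utils.instantiate and a stand-in torch.nn.Linear model; neither the snippet nor the variable names are taken from the Protein Workshop code itself.

import torch
from hydra.utils import instantiate
from omegaconf import OmegaConf

# The adam config above, recreated in code for illustration.
cfg = OmegaConf.create({
    "optimizer": {
        "_target_": "torch.optim.Adam",
        "_partial_": True,
        "lr": 0.001,
        "weight_decay": 0.0,
    }
})

model = torch.nn.Linear(16, 4)  # stand-in for a real encoder

# With _partial_: true, instantiate returns
# functools.partial(torch.optim.Adam, lr=0.001, weight_decay=0.0).
optimizer_fn = instantiate(cfg.optimizer)

# Binding the partial to the model's parameters yields a ready-to-use optimizer.
optimizer = optimizer_fn(model.parameters())
print(type(optimizer))  # <class 'torch.optim.adam.Adam'>

In the training code, a call like the last line is what binds the configured optimiser to the model's parameters.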
ADAM-W (adamw)
# Example usage:
python proteinworkshop/train.py ... optimiser=adamw optimiser.optimizer.lr=0.0001 ...
optimizer:
  _target_: torch.optim.AdamW
  _partial_: true
  lr: 0.001
  weight_decay: 0.0
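With the default weight_decay: 0.0, AdamW behaves identically to Adam; the two differ only in that torch.optim.AdamW applies decoupled weight decay (Loshchilov & Hutter) rather than an L2 penalty folded into the gradient. To enable it, the same override pattern as in the examples above should work; the value 0.01 here is purely illustrative:

python proteinworkshop/train.py optimiser=adamw optimiser.optimizer.weight_decay=0.01 encoder=gvp dataset=cath task=inverse_folding trainer=cpu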
Lion (lion)
# Example usage:
python proteinworkshop/train.py ... optimiser=lion optimiser.optimizer.lr=0.0001 ...
optimizer:
  _target_: lion_pytorch.Lion
  _partial_: true
  lr: 0.0001
  weight_decay: 0.0
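Lion is not part of PyTorch: the _target_ above points at the lion_pytorch package, so it must be importable in your environment. Assuming the standard PyPI distribution name, it can be installed with:

pip install lion-pytorch

Note also that the default lr here (0.0001) is an order of magnitude lower than in the adam and adamw configs above, in line with the common guidance to pair Lion with a smaller learning rate than Adam-family optimisers.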