# Pytorch-PCGrad

A PyTorch implementation of "Gradient Surgery for Multi-Task Learning" (PCGrad) using multiprocessing.

## Usage

```python
import torch
import torch.nn as nn
import torch.optim as optim
from ppcgrad import PPCGrad

net = ...        # your multi-task model (an nn.Module)
num_tasks = ...  # number of tasks

# wrap your favorite optimizer
optimizer = PPCGrad(optim.Adam(net.parameters()))
losses = [...]  # a list of per-task losses
assert len(losses) == num_tasks
optimizer.pc_backward(losses)  # compute per-task gradients and apply PCGrad's gradient surgery
optimizer.step()  # apply the modified gradients
```
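
For reference, the core operation from the PCGrad paper works as follows: whenever two task gradients conflict (their inner product is negative), the conflicting component of one is projected out before the gradients are merged. The single-process sketch below only illustrates that rule on flattened gradient tensors; `pcgrad_project` and the flat-tensor representation are illustrative assumptions, not this repository's code, and PPCGrad additionally distributes the per-task work across processes.

```python
import random
import torch

def pcgrad_project(grads):
    """Merge per-task gradients using PCGrad's conflict projection.

    grads: list of 1-D (flattened) per-task gradient tensors.
    Illustrative sketch of the paper's rule, not this repo's internals.
    """
    projected = [g.clone() for g in grads]
    for i, gi in enumerate(projected):
        order = [j for j in range(len(grads)) if j != i]
        random.shuffle(order)  # the paper samples the other tasks in random order
        for j in order:
            gj = grads[j]
            dot = torch.dot(gi, gj)
            if dot < 0:  # conflict: remove the component of gi along gj
                gi -= (dot / gj.norm().pow(2)) * gj
    # the summed projected gradients give the final update direction
    return torch.stack(projected).sum(dim=0)

# tiny demo on random gradients for three tasks
merged = pcgrad_project([torch.randn(10) for _ in range(3)])
```

Randomizing the projection order, as in the paper, keeps the merged update from systematically favoring any one task.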
