code2seq

PyTorch implementation of the code2seq model.

Installation

You can install the package via pip:

pip install code2seq

Usage

A minimal code example to train the model:

from argparse import ArgumentParser

from omegaconf import DictConfig, OmegaConf
from pytorch_lightning import Trainer

from code2seq.data.path_context_data_module import PathContextDataModule
from code2seq.model import Code2Seq


def train(config: DictConfig):
    # Load data module
    data_module = PathContextDataModule(config.data_folder, config.data)
    data_module.prepare_data()
    data_module.setup()

    # Load model
    model = Code2Seq(
        config.model,
        config.optimizer,
        data_module.vocabulary,
        config.train.teacher_forcing
    )

    # Train the model with PyTorch Lightning
    trainer = Trainer(max_epochs=config.hyper_parameters.n_epochs)
    trainer.fit(model, datamodule=data_module)


if __name__ == "__main__":
    arg_parser = ArgumentParser()
    arg_parser.add_argument("config", help="Path to YAML configuration file", type=str)
    args = arg_parser.parse_args()

    config = OmegaConf.load(args.config)
    train(config)
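
If you save this snippet as a script (the filename train.py below is just an example), launch it with the path to a YAML config:

python train.py config.yaml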

See the config directory for example configurations. If you have any questions, feel free to open an issue.
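
For orientation, here is a minimal sketch of such a config, covering only the fields the snippet above reads; the section and key names come from the code, while the values are illustrative placeholders rather than tested defaults:

data_folder: path/to/dataset     # root folder with the preprocessed dataset

data: {}        # dataset options passed to PathContextDataModule
model: {}       # model hyperparameters passed to Code2Seq
optimizer: {}   # optimizer settings passed to Code2Seq

train:
  teacher_forcing: 1.0           # teacher forcing ratio used during training

hyper_parameters:
  n_epochs: 10                   # maximum number of training epochs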
