PatentTransformer is our codename for "Augmented Inventing." The ultimate goal of this project is to help inventors conceive better inventions and draft higher-quality patents. We leverage Transformer-based models, such as GPT-2 and BERT, for patent text generation and measurement. Our source code will be released soon.
- PatentBERT: Patent Classification with Fine-Tuning a pre-trained BERT Model (Special Issue: Artificial Intelligence (AI) for Intellectual Property (IP), World Patent Information Journal, 2020-06-01)
- Patent Claim Generation by Fine-Tuning OpenAI GPT-2 (to be published in the World Patent Information Journal)
- Measuring Patent Claim Generation by Span Relevancy (Proceedings of the Thirteenth International Workshop on Juris-informatics (JURISIN 2019), hosted by JSAI-isAI2019)
- Personalized Patent Claim Generation and Measurement (Best Doctoral Consortium paper at the 32nd International Conference on Legal Knowledge and Information Systems (JURIX 2019))
- Measuring and Controlling Text Generation by Semantic Search: A Textual Similarity Approach for PatentTransformer (PhD Symposium of The Web Conference 2020 (formerly known as WWW conference))
- Controlling Patent Text Generation by Structural Metadata (under review)