The official GitHub page for the survey paper "A Survey of Large Language Models".
Use PEFT or Full-parameter to finetune 400+ LLMs (Qwen2.5, InternLM3, GLM4, Llama3.3, Mistral, Yi1.5, Baichuan2, DeepSeek3, ...) and 150+ MLLMs (Qwen2-VL, Qwen2-Audio, Llama3.2-Vision, Llava, InternVL2.5, MiniCPM-V-2.6, GLM4v, Xcomposer2.5, Yi-VL, DeepSeek-VL2, Phi3.5-Vision, GOT-OCR2, ...).
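For readers unfamiliar with what PEFT-style finetuning involves, here is a minimal sketch using the Hugging Face `transformers` and `peft` libraries; the base model name and LoRA hyperparameters are illustrative assumptions, not settings taken from the repo above.

```python
# Minimal LoRA finetuning sketch (assumption: Hugging Face transformers + peft).
# Model name and hyperparameters are illustrative, not prescriptive.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

model_name = "Qwen/Qwen2.5-0.5B"  # hypothetical choice of a small base model
model = AutoModelForCausalLM.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)

# LoRA trains small low-rank update matrices instead of all model weights.
lora_config = LoraConfig(
    r=8,                                   # rank of the low-rank update
    lora_alpha=16,                         # scaling factor
    target_modules=["q_proj", "v_proj"],   # attention projections to adapt
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # typically well under 1% of total parameters
```

From here the wrapped model can be trained with any standard training loop or `Trainer`; only the LoRA parameters receive gradients.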
Data processing for and with foundation models! 🍎 🍋 🌽 ➡️ ➡️🍸 🍹 🍷
Open Source Pre-training Model Framework in PyTorch & Pre-trained Model Zoo
Papers about pretraining and self-supervised learning on Graph Neural Networks (GNN).
An open-sourced knowledgeable large language model framework.
Tencent Pre-training framework in PyTorch & Pre-trained Model Zoo
Oscar and VinVL
Pre-training of Deep Bidirectional Transformers for Language Understanding: pre-train TextCNN
Research code for ECCV 2020 paper "UNITER: UNiversal Image-TExt Representation Learning"
[ICLR 2024] Sheared LLaMA: Accelerating Language Model Pre-training via Structured Pruning
[NeurIPS 2020] "Graph Contrastive Learning with Augmentations" by Yuning You, Tianlong Chen, Yongduo Sui, Ting Chen, Zhangyang Wang, Yang Shen
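The objective shared by graph contrastive methods such as the one above is an NT-Xent (InfoNCE) loss between two augmented views of each graph. Below is a minimal PyTorch sketch of that loss, assuming row-aligned batches of view embeddings and an illustrative temperature; it is a sketch of the general technique, not the paper's exact code.

```python
import torch
import torch.nn.functional as F

def nt_xent_loss(z1: torch.Tensor, z2: torch.Tensor,
                 temperature: float = 0.5) -> torch.Tensor:
    """NT-Xent contrastive loss between two views z1, z2 of shape (batch, dim).

    Row i of z1 and row i of z2 embed two augmentations of the same graph
    (a positive pair); all other rows in the batch act as negatives.
    """
    z1 = F.normalize(z1, dim=1)
    z2 = F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / temperature  # (batch, batch) cosine-similarity matrix
    labels = torch.arange(z1.size(0), device=z1.device)  # positives on the diagonal
    # Symmetrize: each view predicts its counterpart in the other view.
    return 0.5 * (F.cross_entropy(logits, labels)
                  + F.cross_entropy(logits.t(), labels))
```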
Code for KDD'20 "Generative Pre-Training of Graph Neural Networks"
Multi-modality pre-training
The repository of ET-BERT, a network traffic classification model for encrypted traffic. The work was accepted at The Web Conference (WWW) 2022.
[NeurIPS D&B 2024] Generative AI for Math: MathPile
Code for our SIGKDD'22 paper "Pre-training-Enhanced Spatial-Temporal Graph Neural Network for Multivariate Time Series Forecasting".
GCC: Graph Contrastive Coding for Graph Neural Network Pre-Training @ KDD 2020
[Survey] Masked Modeling for Self-supervised Representation Learning on Vision and Beyond (https://arxiv.org/abs/2401.00897)
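Masked modeling, the family this survey covers, trains a model to reconstruct randomly hidden parts of its input. Here is a minimal BERT-style token-masking sketch; the 15% masking rate and 80/10/10 replacement split follow the original BERT recipe, while everything else is illustrative.

```python
import torch

def mask_tokens(input_ids: torch.Tensor, mask_token_id: int, vocab_size: int,
                mask_prob: float = 0.15):
    """BERT-style masking: select ~15% of tokens as prediction targets,
    replacing 80% of them with [MASK], 10% with a random token, 10% kept."""
    labels = input_ids.clone()
    probs = torch.full(input_ids.shape, mask_prob, device=input_ids.device)
    masked = torch.bernoulli(probs).bool()
    labels[~masked] = -100  # ignore unmasked positions in the loss

    input_ids = input_ids.clone()
    replace = torch.bernoulli(
        torch.full(input_ids.shape, 0.8, device=input_ids.device)).bool() & masked
    input_ids[replace] = mask_token_id

    random_tok = torch.bernoulli(
        torch.full(input_ids.shape, 0.5, device=input_ids.device)).bool() & masked & ~replace
    input_ids[random_tok] = torch.randint(
        vocab_size, input_ids.shape, device=input_ids.device)[random_tok]
    # Remaining masked positions keep their original token.
    return input_ids, labels
```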
The official repo for [NeurIPS'23] "SAMRS: Scaling-up Remote Sensing Segmentation Dataset with Segment Anything Model"