diff --git a/README.md b/README.md
index d227281..3ac8e0e 100644
--- a/README.md
+++ b/README.md
@@ -1,6 +1,6 @@
 # Adapting Multimodal Large Language Models to Domains via Post-Training
-This repository provides an implementation preview of our paper: [On Domain-Specific Post-Training for Multimodal Large Language Models](https://huggingface.co/papers/2411.19930).
+This repository provides an implementation preview of our paper: [On Domain-Specific Post-Training for Multimodal Large Language Models](https://arxiv.org/abs/2411.19930).
 
 We investigate domain adaptation of MLLMs through post-training, focusing on data synthesis, training pipelines, and task evaluation.
 
 **(1) Data Synthesis**: Using open-source models, we develop a visual instruction synthesizer that effectively generates diverse visual instruction tasks from domain-specific image-caption pairs. **Our synthetic tasks surpass those generated by manual rules, GPT-4, and GPT-4V in enhancing the domain-specific performance of MLLMs.**
@@ -16,7 +16,10 @@ We investigate domain adaptation of MLLMs through post-training, focusing on dat

-# License
+### Updates
+- **[2024/11/29]** Released our paper.
+
+## License
 ```text
 LICENSE AGREEMENT
@@ -30,3 +33,15 @@ You are granted the right to use the code and/or Database under the following te
 · You warrant that you have the authorization to enter into this License Agreement.
 · You comply with the terms enforced by the corporates whose products were used in collecting the code and/or data. The terms unanimously enforce, including but not limited to, restricting the use of the code and/or data to non-commercial academic research.
 ```
+
+## Citation
+If you find our work helpful, please cite us:
+
+```bibtex
+@article{adamllm,
+  title={On Domain-Specific Post-Training for Multimodal Large Language Models},
+  author={Cheng, Daixuan and Huang, Shaohan and Zhu, Ziyu and Zhang, Xintong and Zhao, Wayne Xin and Luan, Zhongzhi and Dai, Bo and Zhang, Zhenliang},
+  journal={arXiv preprint arXiv:2411.19930},
+  year={2024}
+}
+```
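
To make the data-synthesis step described in the README concrete, here is a minimal illustrative sketch of how domain-specific image-caption pairs might be turned into visual instruction tasks with an off-the-shelf open-source model. The model name, prompt wording, file paths, and output format are all assumptions for illustration; this is not the paper's actual synthesizer pipeline.

```python
# Illustrative sketch only: synthesize one visual-instruction task per
# domain-specific image-caption pair using an open-source LLM. The model
# choice, prompt, and example data below are hypothetical.
from transformers import pipeline

# Any open-source instruction-tuned LLM could stand in here (assumed choice).
generator = pipeline("text-generation", model="Qwen/Qwen2-7B-Instruct")

PROMPT = (
    "Below is a caption describing an image from a specialized domain.\n"
    "Caption: {caption}\n\n"
    "Write one question about the image and its answer, grounded in the caption.\n"
)

def synthesize_task(image_path: str, caption: str) -> dict:
    """Generate one (question, answer) instruction sample for an image-caption pair."""
    out = generator(PROMPT.format(caption=caption), max_new_tokens=128,
                    return_full_text=False)
    return {"image": image_path, "caption": caption, "task": out[0]["generated_text"]}

# Hypothetical domain data (medical imaging, for example).
pairs = [("images/xray_001.png", "A chest X-ray showing mild cardiomegaly.")]
tasks = [synthesize_task(img, cap) for img, cap in pairs]
print(tasks[0]["task"])
```

Under this reading, the synthesized tasks would then feed the post-training stage the README mentions, pairing each generated question-answer sample with its source image.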