---
language: code
license: apache-2.0
---
# UniXcoder

## Model description

UniXcoder is a unified cross-modal pre-trained model for programming languages that supports both code understanding and code generation tasks. It covers six languages: Java, Ruby, Python, PHP, JavaScript, and Go.

Paper: [UniXcoder: Unified Cross-Modal Pre-training for Code Representation](https://arxiv.org/abs/2203.03850). Daya Guo, Shuai Lu, Nan Duan, Yanlin Wang, Ming Zhou, Jian Yin.

Code: [GitHub](https://github.com/microsoft/CodeBERT/tree/master/UniXcoder#unixcoder)

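As a minimal sketch of using the model for feature extraction, the checkpoint can be loaded with the `transformers` library. The model id `microsoft/unixcoder-base` and the mean-pooling step are assumptions for illustration; the official GitHub repository also ships its own `UniXcoder` wrapper class with task-specific helpers.

```python
# Hedged sketch: extracting a code embedding with UniXcoder via `transformers`.
# The checkpoint id `microsoft/unixcoder-base` is an assumption; adjust it to
# the actual model id of this card if it differs.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("microsoft/unixcoder-base")
model = AutoModel.from_pretrained("microsoft/unixcoder-base")

code = "def max(a, b): return a if a > b else b"
inputs = tokenizer(code, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Mean-pool the per-token hidden states into one vector per input snippet.
embedding = outputs.last_hidden_state.mean(dim=1)
print(embedding.shape)  # (1, hidden_size)
```

The pooled vector can then be compared across snippets (e.g. with cosine similarity) for code-search-style tasks.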
## Citation

If you use UniXcoder, please consider citing the following paper:

```
@article{guo2022unixcoder,
  title={UniXcoder: Unified Cross-Modal Pre-training for Code Representation},
  author={Guo, Daya and Lu, Shuai and Duan, Nan and Wang, Yanlin and Zhou, Ming and Yin, Jian},
  journal={arXiv preprint arXiv:2203.03850},
  year={2022}
}
```