Instructions to use CofeAI/Tele-FLM with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use CofeAI/Tele-FLM with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("feature-extraction", model="CofeAI/Tele-FLM", trust_remote_code=True)
```

```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("CofeAI/Tele-FLM", trust_remote_code=True, dtype="auto")
```
- Notebooks
- Google Colab
- Kaggle
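
The feature-extraction pipeline above returns per-token hidden states rather than a single vector, so a common follow-up step is mean-pooling the token axis into one sentence embedding. The sketch below uses a random array as a stand-in for the real pipeline output; the shapes are illustrative assumptions, not Tele-FLM's actual hidden size:

```python
import numpy as np

# Dummy stand-in for the pipeline's output: 1 sequence, 5 tokens, hidden size 8.
# A real call would look like: features = np.array(pipe("Hello, Tele-FLM!"))
rng = np.random.default_rng(0)
features = rng.standard_normal((1, 5, 8))

# Mean-pool over the token axis to get one fixed-size vector per input sequence.
sentence_embedding = features.mean(axis=1)
print(sentence_embedding.shape)  # (1, 8)
```

Mean-pooling is only one choice; taking the last token's hidden state is another common option for decoder-style models.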
Update README.md
README.md CHANGED:

````diff
@@ -154,12 +154,15 @@ The parallel training setup for Tele-FLM is configured as follows: tensor parall
 ## Citation
 If you find our work helpful, please consider citing it.
 ```
-@
-
-
-
-
-
-
+@article{tele-flm-2024,
+  author = {Xiang Li and Yiqun Yao and Xin Jiang and Xuezhi Fang and Chao Wang and Xinzhang Liu and Zihan Wang and Yu Zhao and Xin Wang and Yuyao Huang and Shuangyong Song and Yongxiang Li and Zheng Zhang and Bo Zhao and Aixin Sun and Yequan Wang and Zhongjiang He and Zhongyuan Wang and Xuelong Li and Tiejun Huang},
+  title = {Tele-FLM Technical Report},
+  journal = {CoRR},
+  volume = {abs/2404.16645},
+  year = {2024},
+  url = {https://doi.org/10.48550/arXiv.2404.16645},
+  doi = {10.48550/ARXIV.2404.16645},
+  eprinttype = {arXiv},
+  eprint = {2404.16645},
 }
 ```
````