Datasets: ZhouChuYue / Cursor
committed on · Commit 98dbb6e
1 Parent(s): a00e231
Update README: Add UltraData-Math-L2 dataset description
README.md
CHANGED
@@ -64,6 +64,7 @@ Experiments show that on the MiniCPM-1B architecture, ***UltraData-Math*** achie
 ***UltraData-Math*** has been applied to the mathematical pre-training of the [MiniCPM Series](https://huggingface.co/collections/openbmb/minicpm-4-6841ab29d180257e940baa9b) models.

 - **[UltraData-Math-L1](https://huggingface.co/datasets/openbmb/UltraData-Math)**: Large-scale high-quality mathematical pre-training dataset, containing 170.5B tokens of web mathematical corpus. (**<-- you are here**)
+- **[UltraData-Math-L2](https://huggingface.co/datasets/openbmb/UltraData-Math-L2)**: High-quality mathematical pre-training dataset selected by the quality model, containing 33.7B tokens of high-quality web mathematical corpus.
 - **[UltraData-Math-L3](https://huggingface.co/datasets/openbmb/UltraData-Math-L3)**: High-quality synthetic mathematical dataset, containing 88B tokens of multi-format synthetic data (Q&A, multi-turn dialogues, knowledge textbooks, etc.).

 ## 🏗️ Data Processing Pipeline