# TVP Training Data — Thinking with Visual Primitives

Training data for the Thinking with Visual Primitives PyTorch implementation.
## Overview

This dataset contains all training data for the multi-stage TVP pipeline:

| Split | File | Samples | Description |
|---|---|---|---|
| Pretrain | `pretrain/grounding.jsonl` | 146K | COCO-based grounding (label + bbox) |
| SFT | `sft/grounding/sft_grounding.jsonl` | 30K | Grounding with structured thinking + negatives (15%) |
| SFT | `sft/counting/counting_data.jsonl` | 8K | Counting with bbox grounding in CoT |
| SFT | `sft/spatial/spatial_data.jsonl` | 3K | CLEVR-style spatial reasoning |
| SFT | `sft/maze/maze_data.jsonl` | 5K | Procedural maze navigation (point primitives) |
| SFT | `sft/path/path_data.jsonl` | 3K | Path tracing (point sequences) |
## Data Format

All files are JSONL. Coordinates are normalized integers in [0, 999].
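The exact rounding convention is not documented beyond the [0, 999] range; as a minimal sketch, a pixel-space box can be mapped into it like this (the helper name `normalize_box` is illustrative, not part of the repo's scripts):

```python
def normalize_box(box, width, height):
    """Map a pixel-space [x1, y1, x2, y2] box to integer coordinates in [0, 999]."""
    x1, y1, x2, y2 = box
    return [
        round(x1 / width * 999),
        round(y1 / height * 999),
        round(x2 / width * 999),
        round(y2 / height * 999),
    ]

# On a 640x480 image:
print(normalize_box([160, 120, 640, 480], 640, 480))  # -> [250, 250, 999, 999]
```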
### Pretrain Grounding

```json
{
  "image": "images/000000000009.jpg",
  "label": "person",
  "boxes": [[480, 201, 720, 850]],
  "points": [],
  "normalized": true
}
```
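Since each line is a standalone JSON object, the files can be read with nothing but the standard library. A minimal sketch (`load_jsonl` is our name, not a repo utility):

```python
import json

def load_jsonl(path):
    """Read a JSONL file into a list of dicts, one record per line."""
    with open(path, encoding="utf-8") as f:
        return [json.loads(line) for line in f if line.strip()]

# records = load_jsonl("pretrain/grounding.jsonl")
# records[0]["boxes"]  # e.g. [[480, 201, 720, 850]]
```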
### SFT Grounding (with structured thinking)

```json
{
  "image": "images/000000000009.jpg",
  "question": "Locate the person in the image.",
  "thinking": "1. **Analyzing the request**\nThe user asks me to locate the person in this image.\n2. **Object grounding**\nI see a <|ref|>person<|/ref|><|box|>[[480,201,720,850]]<|/box|>.\n3. **Conclusion**\nThe person is located at the specified coordinates.",
  "answer": "The person is located at [[480,201,720,850]].",
  "boxes": [[480, 201, 720, 850]],
  "points": []
}
```
### SFT Counting

```json
{
  "image": "images/000000000025.jpg",
  "question": "How many people are in this image?",
  "thinking": "1. **Analyzing the request**\nThe user asks me to count the person in this image.\n2. **Object grounding**\nI see 2 instance(s) of <|ref|>person<|/ref|><|box|>[[338,121,630,923],[634,154,888,945]]<|/box|>.\n3. **Conclusion**\nThere are 2 person in this image.",
  "count": 2,
  "boxes": [[338, 121, 630, 923], [634, 154, 888, 945]]
}
```
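In the counting records, the `count` field restates the number of grounded boxes, so a quick consistency check is possible (a sketch under that assumption; `count_matches_boxes` is our name):

```python
def count_matches_boxes(record):
    """True if the stated count agrees with the number of grounded boxes."""
    return record.get("count") == len(record.get("boxes", []))

record = {
    "image": "images/000000000025.jpg",
    "count": 2,
    "boxes": [[338, 121, 630, 923], [634, 154, 888, 945]],
}
print(count_matches_boxes(record))  # -> True
```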
### Maze / Path (point primitives)

```json
{
  "image": "images/maze_00001.png",
  "question": "Navigate from start to end in this maze.",
  "thinking": "... DFS exploration with <|point|>[[x,y]]<|/point|> waypoints ...",
  "answer": "...",
  "points": [[100, 200], [150, 250], [200, 300]]
}
```
## Visual Primitives

```text
# Bounding box
<|ref|>cat<|/ref|><|box|>[[x1,y1,x2,y2]]<|/box|>

# Multiple boxes
<|ref|>person<|/ref|><|box|>[[130,50,400,800],[500,60,750,790]]<|/box|>

# Point sequence
<|point|>[[100,200],[150,250],[200,300]]<|/point|>
```
## Generation Scripts

The `scripts/` folder contains all data generation code:

| Script | Purpose |
|---|---|
| `prepare_all_data.py` | One-command pipeline (downloads COCO + generates all data) |
| `generate_sft_grounding_data.py` | Grounding with negatives + diverse prompt templates |
| `generate_maze_data.py` | Procedural maze generation with DFS solutions |
| `generate_path_data.py` | Path tracing data generation |
### Regenerate from scratch

```bash
# Full pipeline (downloads COCO 2017 val, ~1GB)
python scripts/prepare_all_data.py \
    --output_dir data --coco_split val --coco_subset 5000

# Generate grounding with negatives
python scripts/generate_sft_grounding_data.py \
    --coco_jsonl data/pretrain/grounding.jsonl \
    --image_root data/coco/val \
    --output data/sft/grounding/sft_grounding.jsonl \
    --neg_ratio 0.15 --max_samples 30000
```
## Source Images

The JSONL files reference COCO 2017 images. Download them separately:

- Train: COCO 2017 Train (18GB)
- Val: COCO 2017 Val (1GB)

For the maze/spatial/path tasks, images are procedurally generated by the scripts.
## Related

- GitHub Repository — Full training code and pipeline
- TVP-OPD-Qwen2VL-2B — Final distilled model
- TVP-SFTBox-Qwen2VL-2B — Box expert
- TVP-SFTPoint-Qwen2VL-2B — Point expert
- TVP-Pretrain-Qwen2VL-2B — Pretrained base
## Citation

```bibtex
@software{wang2026tvp_pytorch,
  title={Thinking with Visual Primitives — PyTorch Implementation},
  author={Wang, Weishan},
  url={https://github.com/vra/Thinking-with-Visual-Primitives-pytorch},
  year={2026}
}
```
## License

MIT