---
license: mit
task_categories:
- text-classification
tags:
- code
- vulnerability-detection
- embeddings
- codebert
- positive-unlabeled-learning
language:
- code
size_categories:
- 100K<n<1M
---

# PrimeVul CodeBERT Embeddings

Pre-extracted [CLS] token embeddings from microsoft/codebert-base for every function in the PrimeVul v0.1 vulnerability detection dataset.

## What is this?

Each .npz file contains frozen CodeBERT embeddings (768-dimensional vectors) for C/C++ functions, along with their labels and CWE type annotations. The embeddings were extracted once with a frozen CodeBERT model, so downstream PU (positive-unlabeled) learning experiments can run on them without GPU access.

## Files

| File | Functions | Vulnerable | Shape |
|------|-----------|------------|-------|
| train.npz | 175,797 | 4,862 (2.77%) | (175797, 768) |
| valid.npz | 23,948 | 593 (2.48%) | (23948, 768) |
| test.npz | 24,788 | 549 (2.21%) | (24788, 768) |
| test_paired.npz | 870 | 435 (50%) | (870, 768) |

## Arrays in each .npz

- embeddings: (N, 768) float32 -- CodeBERT [CLS] token vectors
- labels: (N,) int32 -- 0 = benign, 1 = vulnerable
- cwe_types: (N,) object -- CWE category string (e.g., "CWE-119") or "unknown"
- idxs: (N,) int64 -- original PrimeVul record index for traceability

## How to use

```python
import numpy as np

# allow_pickle=True is required because cwe_types is an object-dtype array
data = np.load("train.npz", allow_pickle=True)
X = data["embeddings"]    # (175797, 768) float32
y = data["labels"]        # (175797,) int32
cwes = data["cwe_types"]  # (175797,) object
```
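For the PU experiments mentioned above, a typical setup treats the known vulnerable functions as the positive set and everything else as unlabeled. A minimal numpy sketch of that split (synthetic arrays stand in for train.npz here so it runs standalone; the 50% labeling fraction is an arbitrary illustration, not a value from the original experiments):

```python
import numpy as np

# Synthetic stand-ins for the train.npz arrays described above
# (real usage: data = np.load("train.npz", allow_pickle=True)).
rng = np.random.default_rng(0)
N, D = 1000, 768
embeddings = rng.standard_normal((N, D)).astype(np.float32)
labels = (rng.random(N) < 0.03).astype(np.int32)  # ~3% vulnerable, like the real split

# PU condition: only a fraction of the true positives carry a label;
# the hidden positives are mixed into the unlabeled pool with the benign rows.
P_all = embeddings[labels == 1]
n_labeled = max(1, len(P_all) // 2)  # illustrative 50% labeling fraction
perm = rng.permutation(len(P_all))
P = P_all[perm[:n_labeled]]                               # labeled positives
hidden = P_all[perm[n_labeled:]]                          # positives without labels
U = np.concatenate([embeddings[labels == 0], hidden])     # unlabeled pool

print(P.shape, U.shape)  # every row accounted for: len(P) + len(U) == N
```

A PU classifier is then trained to separate P from U, with U's label noise handled by the chosen PU method rather than by the split itself.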

## Extraction details

- Model: microsoft/codebert-base (RoBERTa architecture, 125M parameters)
- Extraction: frozen model, [CLS] token from the final hidden layer
- Tokenization: max_length=512, truncation=True, padding="max_length"
- Source data: PrimeVul v0.1 (chronological train/valid/test splits)
- Hardware: Google Colab, A100 GPU, ~23 minutes for all splits

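The [CLS]-pooling step above can be illustrated without loading the model: the final layer returns a hidden-state tensor of shape (batch, seq_len, 768), and the stored embedding is simply the vector at sequence position 0. A numpy sketch with a mock tensor (the real model call, shown only in comments, would require torch and transformers):

```python
import numpy as np

# Real pipeline (sketch, not run here):
#   tok = AutoTokenizer.from_pretrained("microsoft/codebert-base")
#   model = AutoModel.from_pretrained("microsoft/codebert-base").eval()
#   with torch.no_grad():
#       out = model(**tok(funcs, max_length=512, truncation=True,
#                         padding="max_length", return_tensors="pt"))
#   last_hidden_state = out.last_hidden_state.numpy()

# Mock final-layer hidden states: (batch, seq_len, hidden) = (4, 512, 768)
rng = np.random.default_rng(0)
last_hidden_state = rng.standard_normal((4, 512, 768)).astype(np.float32)

# The [CLS] token sits at position 0 of every sequence; slicing it out
# yields one 768-dim vector per function -- the rows stored in the .npz files.
cls_embeddings = last_hidden_state[:, 0, :]
print(cls_embeddings.shape)  # (4, 768)
```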

## Citation

If you use these embeddings, please cite the PrimeVul dataset:

```bibtex
@article{ding2024primevul,
  title={Vulnerability Detection with Code Language Models: How Far Are We?},
  author={Ding, Yangruibo and Fu, Yanjun and Ibrahim, Omniyyah and Sitawarin, Chawin and Chen, Xinyun and Alomair, Basel and Wagner, David and Ray, Baishakhi and Chen, Yizheng},
  journal={arXiv preprint arXiv:2403.18624},
  year={2024}
}
```

## License

MIT (same as PrimeVul)