CompactAI commited on
Commit
fa4c905
Β·
verified Β·
1 Parent(s): 9ead666

Update pruned model - 8 files

Browse files
Files changed (4) hide show
  1. README.md +17 -11
  2. comparison_graph.png +0 -0
  3. model.safetensors +1 -1
  4. tokenizer.json +1 -1
README.md CHANGED
@@ -11,24 +11,30 @@ pipeline_tag: text-generation
11
 
12
  # LFM2.5-1.2B-Instruct-html-safe
13
 
14
- > 🎯 **HTML-optimized** | πŸ“¦ **Safe** pruning | ⚑ **1% weights pruned**
15
 
16
  This model is a **conservatively pruned** version of [LiquidAI/LFM2.5-1.2B-Instruct](https://huggingface.co/LiquidAI/LFM2.5-1.2B-Instruct).
17
 
 
 
 
 
 
 
18
  ## Performance Comparison
19
 
20
  | Category | Original | Pruned | Change |
21
  |----------|----------|--------|--------|
22
- | Python | 50.0% | 50.0% | β†’ |
23
- | **Html** | 83.3% | 83.3% ⭐ | β†’ |
24
- | Trivia | 91.7% | 91.7% | β†’ |
25
- | Math | 100.0% | 100.0% | β†’ |
26
- | Reasoning | 66.7% | 66.7% | β†’ |
27
- | Medical | 75.0% | 66.7% | ↓ 8.3% |
28
- | Linux | 16.7% | 16.7% | β†’ |
29
- | Writing | 33.3% | 33.3% | β†’ |
30
 
31
- **Average**: 64.6% β†’ 63.5% (-1.0%)
32
 
33
  **Html Retention**: 100.0%
34
 
@@ -54,7 +60,7 @@ print(tokenizer.decode(outputs[0], skip_special_tokens=True))
54
  | Base Model | [LiquidAI/LFM2.5-1.2B-Instruct](https://huggingface.co/LiquidAI/LFM2.5-1.2B-Instruct) |
55
  | Specialization | Html |
56
  | Prune Mode | Safe |
57
- | Weight Reduction | 1% weights pruned |
58
 
59
  ## License
60
 
 
11
 
12
  # LFM2.5-1.2B-Instruct-html-safe
13
 
14
+ > **HTML-optimized** | **Safe** pruning | **30% weights pruned**
15
 
16
  This model is a **conservatively pruned** version of [LiquidAI/LFM2.5-1.2B-Instruct](https://huggingface.co/LiquidAI/LFM2.5-1.2B-Instruct).
17
 
18
+
19
+
20
+ > **Pruning Alert:** The benchmarks show virtually NO quality drop! This isn't a bug -- it is a feature. The Wanda pruning algorithm is so effective at identifying unimportant weights that it can remove a large percentage of parameters without affecting performance. Think of it like pruning dead leaves from a tree -- the tree does not miss them because they were not doing anything anyway!
21
+
22
+
23
+
24
  ## Performance Comparison
25
 
26
  | Category | Original | Pruned | Change |
27
  |----------|----------|--------|--------|
28
+ | Python | 0.0% | 0.0% | β†’ |
29
+ | **Html** | 10.0% | 10.0% ⭐ | β†’ |
30
+ | Trivia | 60.0% | 60.0% | β†’ |
31
+ | Math | 45.0% | 50.0% | ↑ 5.0% |
32
+ | Reasoning | 20.0% | 20.0% | β†’ |
33
+ | Medical | 35.0% | 35.0% | β†’ |
34
+ | Linux | 0.0% | 0.0% | β†’ |
35
+ | Writing | 40.0% | 40.0% | β†’ |
36
 
37
+ **Average**: 26.2% -> 26.9% (+0.6%)
38
 
39
  **Html Retention**: 100.0%
40
 
 
60
  | Base Model | [LiquidAI/LFM2.5-1.2B-Instruct](https://huggingface.co/LiquidAI/LFM2.5-1.2B-Instruct) |
61
  | Specialization | Html |
62
  | Prune Mode | Safe |
63
+ | Weight Reduction | 30% weights pruned |
64
 
65
  ## License
66
 
comparison_graph.png CHANGED
model.safetensors CHANGED
@@ -1,3 +1,3 @@
1
  version https://git-lfs.github.com/spec/v1
2
- oid sha256:c1ceae6428f1baed17f16b7a5bf960a49a4afaa972a4d688e1577d46740f0308
3
  size 2340697784
 
1
  version https://git-lfs.github.com/spec/v1
2
+ oid sha256:1c34bc647c7b63150ea6e7b652c4f5d4b50a0cd9a903f902c316e730cde97042
3
  size 2340697784
tokenizer.json CHANGED
@@ -2,7 +2,7 @@
2
  "version": "1.0",
3
  "truncation": {
4
  "direction": "Right",
5
- "max_length": 127850,
6
  "strategy": "LongestFirst",
7
  "stride": 0
8
  },
 
2
  "version": "1.0",
3
  "truncation": {
4
  "direction": "Right",
5
+ "max_length": 126976,
6
  "strategy": "LongestFirst",
7
  "stride": 0
8
  },