Shaheer1 committed on
Commit ada0966 · 1 Parent(s): e3a62b8

Updated README.md file
---
license: mit
---

# Query Complexity Classifier

This model classifies user queries by their **complexity level** so they can be routed to an appropriate Large Language Model (LLM).

The model predicts three classes:

* **Simple**
* **Medium**
* **Complex**

It can be used as a **pre-routing layer** in AI systems where different LLMs handle different levels of query complexity.

---

## Model

* **Base model:** DistilBERT
* **Task:** Text classification (3 classes)

---

## Download and Use

You can download and load the model directly from Hugging Face using the `transformers` library.

### Install dependencies

```bash
pip install transformers torch
```
35
+
36
+ ### Load the model
37
+
38
+ ```python
39
+ from transformers import AutoTokenizer, AutoModelForSequenceClassification
40
+ import torch
41
+
42
+ model_name = "Shaheer001/Query-Complexity-Classifier"
43
+
44
+ tokenizer = AutoTokenizer.from_pretrained(model_name)
45
+ model = AutoModelForSequenceClassification.from_pretrained(model_name)
46
+ ```

### Run inference

```python
text = "Explain how Kubernetes architecture works."

inputs = tokenizer(text, return_tensors="pt", truncation=True)

with torch.no_grad():  # no gradient tracking needed at inference time
    outputs = model(**inputs)

prediction = torch.argmax(outputs.logits, dim=1).item()

# The label order must match the model's id2label mapping
labels = ["Simple", "Medium", "Complex"]
print("Predicted Complexity:", labels[prediction])
```
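If you also want a confidence score for the routing decision, the logits can be converted to probabilities with a softmax. A minimal sketch using standalone example logits; in practice you would use `outputs.logits` from the snippet above:

```python
import torch
import torch.nn.functional as F

# Example logits as a classifier head might produce them (shape: [batch, 3]).
# Replace with `outputs.logits` when running the real model.
logits = torch.tensor([[-1.2, 0.3, 2.1]])

probs = F.softmax(logits, dim=1)           # convert logits to probabilities
confidence, prediction = probs.max(dim=1)  # top class and its probability

labels = ["Simple", "Medium", "Complex"]
print(f"{labels[prediction.item()]} ({confidence.item():.2f})")
```

A high-confidence prediction can be routed directly, while low-confidence queries could fall back to a stronger model.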

---

## Example

Input:

```
Explain Kubernetes architecture
```

Output:

```
Complex
```

---

## Use Case

This model can be used to build **LLM routing systems** where queries are automatically sent to different language models depending on their complexity.

Example workflow:

User Query → Complexity Classifier → LLM Router → Selected LLM
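The workflow above can be sketched as a small router. This is only an illustration: the model names in `ROUTES` are hypothetical placeholders, and `classify` uses a word-count heuristic as a stand-in for the real classifier inference shown earlier:

```python
# Minimal sketch of an LLM router built on top of the classifier.
# The route targets below are hypothetical placeholders, not real models.
ROUTES = {
    "Simple": "small-fast-llm",
    "Medium": "mid-size-llm",
    "Complex": "large-reasoning-llm",
}

def classify(text: str) -> str:
    """Stand-in for the DistilBERT classifier; a crude word-count heuristic."""
    words = len(text.split())
    if words < 4:
        return "Simple"
    if words < 12:
        return "Medium"
    return "Complex"

def route(text: str) -> str:
    """User Query -> Complexity Classifier -> LLM Router -> Selected LLM."""
    complexity = classify(text)
    return ROUTES[complexity]

print(route("Hi"))  # short query -> "small-fast-llm"
print(route("Explain how Kubernetes architecture works in detail, "
            "covering etcd, the scheduler, and controllers."))  # -> "large-reasoning-llm"
```

In a production router, `classify` would call the inference code from this model card, and the routing table would map each class to a real model endpoint.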