---

pipeline_tag: text-classification
library_name: transformers
license: mit
---


# Query Complexity Classifier

This model classifies user queries based on their **complexity level** so they can be routed to an appropriate Large Language Model (LLM).

The model predicts three classes:

* **Simple**
* **Medium**
* **Complex**

It can be used as a **pre-routing layer** in AI systems where different LLMs handle different levels of query complexity.

---

## Model

* **Base model:** DistilBERT
* **Task:** Text classification (3 classes)

---

## Download and Use

You can download and load the model directly from Hugging Face using the `transformers` library.

### Install dependencies

```bash
pip install transformers torch
```

### Load the model

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

model_name = "Shaheer001/Query-Complexity-Classifier"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)
```

### Run inference

```python
text = "Explain how Kubernetes architecture works."

inputs = tokenizer(text, return_tensors="pt", truncation=True)

# Disable gradient tracking for inference
with torch.no_grad():
    outputs = model(**inputs)

prediction = torch.argmax(outputs.logits, dim=-1).item()

labels = ["Simple", "Medium", "Complex"]
print("Predicted Complexity:", labels[prediction])
```

---

## Example

Input:

```
Explain Kubernetes architecture
```

Output:

```
Complex
```

---

## Use Case

This model can be used to build **LLM routing systems** where queries are automatically sent to different language models depending on their complexity.

Example workflow:

User Query → Complexity Classifier → LLM Router → Selected LLM
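
The router step above can be as simple as a lookup from the predicted label to a target model. A minimal sketch (the target model names here are illustrative placeholders, not part of this model):

```python
# Minimal LLM-router sketch keyed on the classifier's predicted label.
# The target model names are illustrative placeholders.
ROUTES = {
    "Simple": "small-llm",
    "Medium": "mid-llm",
    "Complex": "large-llm",
}

def route_query(predicted_label: str, default: str = "mid-llm") -> str:
    """Return the name of the LLM that should handle the query."""
    return ROUTES.get(predicted_label, default)

print(route_query("Complex"))  # -> large-llm
```

In practice the returned name would select an API client or endpoint; falling back to a mid-tier model on an unknown label keeps the router safe against label-mapping changes.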