Text Ranking
Transformers
Safetensors
sentence-transformers
Chinese
English
minicpm
text-classification
custom_code
Instructions to use openbmb/MiniCPM-Reranker-Light with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use openbmb/MiniCPM-Reranker-Light with Transformers:
```python
# Load model directly
from transformers import AutoModelForSequenceClassification

model = AutoModelForSequenceClassification.from_pretrained(
    "openbmb/MiniCPM-Reranker-Light",
    trust_remote_code=True,
    dtype="auto",
)
```
- sentence-transformers
How to use openbmb/MiniCPM-Reranker-Light with sentence-transformers:
```python
from sentence_transformers import CrossEncoder

model = CrossEncoder("openbmb/MiniCPM-Reranker-Light", trust_remote_code=True)

query = "Which planet is known as the Red Planet?"
passages = [
    "Venus is often called Earth's twin because of its similar size and proximity.",
    "Mars, known for its reddish appearance, is often referred to as the Red Planet.",
    "Jupiter, the largest planet in our solar system, has a prominent red spot.",
    "Saturn, famous for its rings, is sometimes mistaken for the Red Planet.",
]

scores = model.predict([(query, passage) for passage in passages])
print(scores)
```
- Notebooks
- Google Colab
- Kaggle
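The cross-encoder above returns one relevance score per (query, passage) pair; turning those scores into a reranked list is then a plain sort. A minimal sketch of that post-processing step, using made-up placeholder scores rather than real model output:

```python
# Passages from the example above.
passages = [
    "Venus is often called Earth's twin because of its similar size and proximity.",
    "Mars, known for its reddish appearance, is often referred to as the Red Planet.",
    "Jupiter, the largest planet in our solar system, has a prominent red spot.",
    "Saturn, famous for its rings, is sometimes mistaken for the Red Planet.",
]

# Placeholder values standing in for model.predict(...) output, NOT real scores.
scores = [0.12, 0.97, 0.31, 0.08]

# Sort passages by descending score to obtain the reranked order.
ranked = sorted(zip(scores, passages), key=lambda pair: pair[0], reverse=True)
for score, passage in ranked:
    print(f"{score:.2f}  {passage}")
```

With these placeholder scores the Mars passage comes out first, which is the behavior a reranker is meant to produce for this query.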
Model configuration (config.json, 2,943 bytes):
{
"_name_or_path": "openbmb/MiniCPM-Reranker-Light",
"architectures": [
"MiniCPMForSequenceClassification"
],
"attention_bias": false,
"attention_dropout": 0.0,
"auto_map": {
"AutoConfig": "configuration_minicpm.MiniCPMConfig",
"AutoModel": "modeling_minicpm.MiniCPMModel",
"AutoModelForCausalLM": "modeling_minicpm.MiniCPMForCausalLM",
"AutoModelForSeq2SeqLM": "modeling_minicpm.MiniCPMForCausalLM",
"AutoModelForSequenceClassification": "modeling_minicpm.MiniCPMForSequenceClassification"
},
"bos_token_id": 1,
"dim_model_base": 256,
"eos_token_id": 2,
"hidden_act": "silu",
"hidden_size": 1536,
"id2label": {
"0": "LABEL_0"
},
"initializer_range": 0.1,
"intermediate_size": 3840,
"is_causal": false,
"label2id": {
"LABEL_0": 0
},
"max_position_embeddings": 4096,
"model_type": "minicpm",
"num_attention_heads": 24,
"num_hidden_layers": 52,
"num_key_value_heads": 8,
"pretraining_tp": 1,
"rms_norm_eps": 1e-05,
"rope_scaling": {
"long_factor": [
1.0004360675811768,
1.0668443441390991,
1.1631425619125366,
1.3025742769241333,
1.5040205717086792,
1.7941505908966064,
2.2101221084594727,
2.802666664123535,
3.6389970779418945,
4.804192543029785,
6.39855432510376,
8.527148246765137,
11.277542114257812,
14.684998512268066,
18.69317054748535,
23.13019371032715,
27.72362518310547,
32.1606559753418,
36.168827056884766,
39.57627868652344,
42.32667541503906,
44.45526885986328,
46.04962921142578,
47.21482849121094,
48.05115509033203,
48.64370346069336,
49.05967712402344,
49.34980392456055,
49.551246643066406,
49.69068145751953,
49.78697967529297,
49.85338592529297
],
"original_max_position_embeddings": 4096,
"short_factor": [
1.0004360675811768,
1.0668443441390991,
1.1631425619125366,
1.3025742769241333,
1.5040205717086792,
1.7941505908966064,
2.2101221084594727,
2.802666664123535,
3.6389970779418945,
4.804192543029785,
6.39855432510376,
8.527148246765137,
11.277542114257812,
14.684998512268066,
18.69317054748535,
23.13019371032715,
27.72362518310547,
32.1606559753418,
36.168827056884766,
39.57627868652344,
42.32667541503906,
44.45526885986328,
46.04962921142578,
47.21482849121094,
48.05115509033203,
48.64370346069336,
49.05967712402344,
49.34980392456055,
49.551246643066406,
49.69068145751953,
49.78697967529297,
49.85338592529297
],
"type": "longrope"
},
"rope_theta": 10000.0,
"scale_depth": 1.4,
"scale_emb": 12,
"torch_dtype": "bfloat16",
"transformers_version": "4.37.2",
"use_cache": false,
"vocab_size": 73440
}
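A few of the config values above can be cross-checked by arithmetic, and the 32-entry `long_factor`/`short_factor` lists line up with the model's RoPE dimensions. The sketch below assumes the common LongRoPE convention of dividing each RoPE inverse frequency by its per-dimension factor; the model's actual implementation lives in `modeling_minicpm.py` and may differ in detail:

```python
# Values copied from the config above.
hidden_size = 1536
num_attention_heads = 24
rope_theta = 10000.0

# Per-head dimension: 1536 / 24 = 64.
head_dim = hidden_size // num_attention_heads

# RoPE assigns one inverse frequency per pair of head dimensions:
# 64 / 2 = 32 values, matching the 32 entries in long_factor/short_factor.
inv_freq = [rope_theta ** (-2.0 * i / head_dim) for i in range(head_dim // 2)]

# First three long_factor entries from the config (list truncated here).
long_factor = [1.0004360675811768, 1.0668443441390991, 1.1631425619125366]

# Assumed LongRoPE-style scaling: divide each inverse frequency by its factor,
# which slows the rotation and stretches the usable context.
scaled = [f / s for f, s in zip(inv_freq, long_factor)]
```

The factors grow from roughly 1.0 to roughly 49.85 across the 32 dimensions, so the lowest-frequency rotations are stretched the most, which is the usual LongRoPE pattern.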