Instructions for using ChatterjeeLab/PepMLM-650M with the Transformers library.
How to use ChatterjeeLab/PepMLM-650M with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="ChatterjeeLab/PepMLM-650M")

# Load model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("ChatterjeeLab/PepMLM-650M")
model = AutoModelForMaskedLM.from_pretrained("ChatterjeeLab/PepMLM-650M")
```
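As a sketch of how the fill-mask interface could be applied here: PepMLM generates peptide binders conditioned on a target protein sequence via masked language modeling, so the input to the model is the target sequence followed by masked positions for the candidate peptide. The helper below (`build_masked_input` is a hypothetical name, and the exact concatenation scheme is an assumption based on the model's description; confirm against the PepMLM paper or repository) only constructs that masked input string:

```python
# Sketch, assuming PepMLM expects the target protein sequence followed by
# a fully masked candidate peptide. The peptide length is a design choice;
# the concatenation format is an assumption, not confirmed API behavior.

def build_masked_input(target_seq: str, peptide_len: int,
                       mask_token: str = "<mask>") -> str:
    """Append `peptide_len` mask tokens to the target protein sequence."""
    return target_seq + mask_token * peptide_len

# Example: a short (truncated) target sequence with a 5-residue peptide slot.
masked = build_masked_input("MKTAYIAKQR", peptide_len=5)
# The fill-mask pipeline (or model.forward) would then be used to predict
# an amino acid for each masked position.
```

The resulting string would be passed to the `pipe` or tokenized for `model` as loaded above.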
Commit fae05f7 (parent 4a09c97): Update README.md
```diff
@@ -1,6 +1,9 @@
 ---
 license: mit
 ---
+
+
+
 **PepMLM: Target Sequence-Conditioned Generation of Peptide Binders via Masked Language Modeling**

 Target proteins that lack accessible binding pockets and conformational stability have posed increasing challenges for drug development.
```