---
license: apache-2.0
datasets:
- pietrolesci/pubmed-200k-rct
metrics:
- accuracy
base_model:
- openai-community/gpt2
tags:
- medical
- biology
- research
- pubmed
model-index:
- name: MedGPT
  results: []
demo:
- name: Try in Space
  url: https://huggingface.co/spaces/devmanpreet/Medical-GPT2-Classifier
---

# MedGPT — GPT-2 Fine-Tuned on PubMed RCT

MedGPT is a GPT-2 model fine-tuned on the `pubmed-200k-rct` dataset. It classifies individual sentences from biomedical abstracts into one of five standard sections:
- Background
- Objective
- Methods
- Results
- Conclusion

This makes the model useful as a preprocessing step for tasks that need a structured view of scientific literature, such as segmenting unstructured abstracts or preparing section-aware summaries.
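Sentence classification with the checkpoint might look like the sketch below. This is a minimal example, not the card's official usage: the `MODEL_ID` repo path is hypothetical, and it assumes the checkpoint was saved with a 5-way sequence-classification head loadable via `AutoModelForSequenceClassification`.

```python
LABELS = ["Background", "Objective", "Methods", "Results", "Conclusion"]

def logits_to_label(scores):
    """Map a list of five class scores to its section label."""
    return LABELS[max(range(len(scores)), key=lambda i: scores[i])]

def classify(sentence: str, model_id: str = "devmanpreet/MedGPT") -> str:
    # model_id above is a hypothetical repo path; substitute the real one.
    # Heavy imports are kept local so the label helper stays dependency-free.
    import torch
    from transformers import AutoModelForSequenceClassification, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForSequenceClassification.from_pretrained(model_id)
    inputs = tokenizer(sentence, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = model(**inputs).logits[0]
    return logits_to_label(logits.tolist())

# Example (requires the transformers library and the published checkpoint):
# classify("We randomized 120 patients into two groups.")
```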

## Training Details

- Base Model: `gpt2` (124M parameters)
- Dataset: `pietrolesci/pubmed-200k-rct`
- Task: Sentence classification (5-way)
- Labels: Background, Objective, Methods, Results, Conclusion
- Epochs: 1 (partial training)
- Loss Function: Cross-entropy
- Optimizer: AdamW
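A setup along these lines can be sketched with the `Trainer` API. This is an assumed reconstruction, not the authors' actual training script: the dataset column names, batch size, sequence length, and output directory are illustrative, and the dataset's label strings may need remapping to the five labels above.

```python
LABELS = ["Background", "Objective", "Methods", "Results", "Conclusion"]
label2id = {label: i for i, label in enumerate(LABELS)}
id2label = {i: label for label, i in label2id.items()}

def build_trainer():
    # Heavy imports kept local; requires `transformers` and `datasets`.
    from datasets import load_dataset
    from transformers import (AutoModelForSequenceClassification,
                              AutoTokenizer, Trainer, TrainingArguments)

    dataset = load_dataset("pietrolesci/pubmed-200k-rct")
    tokenizer = AutoTokenizer.from_pretrained("gpt2")
    tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token

    model = AutoModelForSequenceClassification.from_pretrained(
        "gpt2", num_labels=5, label2id=label2id, id2label=id2label)
    model.config.pad_token_id = tokenizer.pad_token_id

    def tokenize(batch):
        # "text" is an assumed column name; check the dataset schema.
        return tokenizer(batch["text"], truncation=True, max_length=128)

    dataset = dataset.map(tokenize, batched=True)
    args = TrainingArguments(output_dir="medgpt",
                             num_train_epochs=1,          # partial training, as above
                             per_device_train_batch_size=16)
    # AdamW is the Trainer default optimizer; cross-entropy is the
    # default loss for sequence classification.
    return Trainer(model=model, args=args, train_dataset=dataset["train"])
```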