Update README.md
README.md
thumbnail: >-
  https://cdn-avatars.huggingface.co/v1/production/uploads/66ea90952fef8317d8fba8ec/SgCp3hhhVu3HCuHt8sYn5.png
---

## About Us

Meet **LiGHT**, the [Laboratory for Intelligent Global Health & Humanitarian Response Technologies](https://www.light-laboratory.org/), based at EPFL (School of Computer Science), Ariadne Labs (Harvard), and the Koita Centre for Digital Health at [Ashoka University](https://www.ashoka.edu.in/).

<p align="center">
  <img src="https://cdn-avatars.huggingface.co/v1/production/uploads/66ea90952fef8317d8fba8ec/SgCp3hhhVu3HCuHt8sYn5.png" alt="LiGHT Logo" />
</p>

We build clinically grounded, auditable AI for medicine, co-designed and evaluated with clinicians, humanitarian responders, and data scientists worldwide.

---

## Fully Open Meditron

Our latest release, **Fully Open Meditron**, is the first end-to-end auditable pipeline for clinical LLMs: open weights, open data, an open training recipe, and clinician-vetted corpus construction.

**Apertus-70B-MeditronFO** establishes a new state of the art among fully open medical LLMs, and **Gemma-3-27B-MeditronFO** outperforms MedGemma on HealthBench Hard despite being trained from a fully open pipeline.

Browse the [**MeditronFO Collection**](https://huggingface.co/collections/EPFLiGHT/meditronfo): six models across four families (Apertus, OLMo-2, EuroLLM, Gemma-3), plus the [Fully Open Meditron Corpus](https://huggingface.co/datasets/EPFLiGHT/fully-open-meditron) (~601k clinician-audited examples).

---

## Previous Work

- [**Meditron-3**](https://openreview.net/pdf?id=ZcD35zKujO): an open medical LLM suite spanning Llama-3, Phi-4, Qwen2.5, and Gemma-2 bases. See the [Meditron-3 Collection](https://huggingface.co/collections/EPFLiGHT/meditron-3).
- [**MEDITRON**](https://arxiv.org/pdf/2311.16079): the original 7B/70B medical specialist models, trained with curated clinical guidelines.

---

#### Stay connected

- **Prof:** [Annie Hartley](https://www.linkedin.com/in/annie-hartley-324832210/)
- **Lab:** [light-laboratory.org](https://www.light-laboratory.org/)