SupraLabs

non-profit

AI & ML interests

Train, fine-tune and explore small models that deliver good results, aiming to revolutionize small AI by making it accessible to everyone

Welcome to SupraLabs!

🤝 Who we are

We are @AxionLab-official and @LH-Tech-AI, and we're creating small open-source models for everyone.

🎯 What we do

We train, fine-tune and explore small models that deliver good results, aiming to revolutionize small AI by making it accessible to everyone!

🚫 What we do NOT do

We do not make bad, trashy or unclean models, and we do not release models that are only half open-source: everything we publish is completely open-source for you!

🤖 Models

  • Supra Mini 0.1M - Trained on Kaggle 2x T4, 100k parameters, compared against models 10x its size
  • Supra Mini v2 0.1M - the second version of the Supra Mini series.
  • Supra Mini v3 0.5M - the third version of the Supra Mini series.
  • Supra Mini v4 2M - the fourth version of the Supra Mini series. Improved. More powerful. With context understanding.
  • MicroSupra 1k - Trained on GTX 750 Ti 4GB, a scaling laws experiment.
  • StorySupra-10M - Trained on RTX 5060 Ti 16GB for 10 minutes, already coherent.
  • DistillSupra-0.2M - Trained on GTX 750 Ti 4GB for 30 minutes, still incoherent, but the first step of our distillation research.
  • More coming soon, check back later! (For a quick way to try the models above, see the loading sketch below.)
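
If you want to try one of these models, a minimal sketch with 🤗 Transformers might look like this; the repo id used here is only an assumption for illustration, so check the model pages above for the exact names.

```python
# Minimal sketch for trying a SupraLabs model with 🤗 Transformers.
# NOTE: the repo id below is a hypothetical placeholder; see the model list above
# for the real repository names.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "SupraLabs/Supra-Mini-v4-2M"  # hypothetical repo id

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

inputs = tokenizer("Once upon a time", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```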

🏆 Competing with other creators

We are competing with @CompactAI-O and @LH-Tech-AI (we know it's funny to compete against your own founder, but anyway 🤣😂).
See all our and their tiny models here: https://lh-tech.de/ai/compare-tiny-models.html

🏗️ Future roadmap

  • Supra-10M - Base, Chat, Reasoning - Trained on RTX 5060 Ti 16GB, with Nvidia technologies and CUDA
  • Supra-1M - Base, Chat, Reasoning - Trained on GTX 750 Ti 4GB, with Nvidia technologies and optimizations (a rough memory estimate for these sizes follows below)
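
As a rough sanity check that these sizes fit the cards listed under Hardware, here is a back-of-the-envelope estimate; it assumes plain fp32 AdamW training (weights plus gradients plus two optimizer states, roughly 16 bytes per parameter) and ignores activations, so treat it as a sketch rather than a measurement.

```python
# Back-of-the-envelope VRAM estimate for training, assuming fp32 AdamW:
# weights (4 B) + gradients (4 B) + two optimizer states (8 B) = ~16 bytes/param.
# Activations, buffers and framework overhead come on top of this.
BYTES_PER_PARAM = 16

for name, params in [("Supra-1M", 1_000_000), ("Supra-10M", 10_000_000)]:
    mib = params * BYTES_PER_PARAM / 1024**2
    print(f"{name}: ~{mib:.0f} MiB for weights + optimizer state")

# Supra-1M:  ~15 MiB  (tiny next to a GTX 750 Ti's 4 GB)
# Supra-10M: ~153 MiB (tiny next to an RTX 5060 Ti's 16 GB)
```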

💻 Hardware

  • RTX 5060 Ti 16GB (LH-Tech AI)
  • GTX 750 Ti 4GB (AxionLab)

📢 Blog

https://huggingface.co/spaces/SupraLabs/Blog

🫶 Feedback and Support

Feedback and support are very welcome, and feel free to ask to join our organization if you want.
