Burmese Coder ๐Ÿน๐Ÿ”ฅ (SF-FT-BASE)

Model Name: Burmese Coder (Gemma-3 4B)
Author: Dr. Wai Yan Nyein Naing (waiyan.nn18@gmail.com)
Model Type: Autoregressive Language Model (Causal LM)
Base Model: Google gemma-3-4b
Languages: Burmese (my), English (en)

📖 Model Description

Burmese Coder is a fine-tuned large language model optimized for software development and programming assistance in the Burmese language. Built on top of the Gemma-3 4B architecture, this model bridges the gap for Myanmar developers by providing accurate, conversational, and culturally nuanced technical explanations without language barriers.

The model underwent two training phases, starting with Supervised Fine-Tuning (SFT) on an enriched MBPP (Mostly Basic Python Problems) dataset translated and expanded with step-by-step Burmese explanations. To enforce linguistic purity and eliminate multilingual hallucinations, the model was then hardened using Direct Preference Optimization (DPO) with targeted on-policy rejections.

🎯 Intended Uses & Limitations

โš ๏ธ Disclaimer: This model is released strictly for educational purposes and academic testing. It is NOT production-ready and should NOT be used for commercial purposes or integrated into mission-critical applications.

Best-Suited Uses

  • Educational Exploration: Learning and experimenting with fine-tuned Small Language Models (SLMs) tailored for the Burmese language.
  • Code Generation & Prompt Testing: Evaluating the model's ability to write scripts and algorithms based on Burmese instructions in a controlled environment.
  • Academic Research: Serving as a baseline or case study for localized, non-English programming assistants.
  • Local Prototyping: Optimized for edge deployment and local inference testing via Ollama / GGUF on consumer hardware (macOS/Windows/Linux).

Out-of-Scope & Limitations

  • Not for Production or Commercial Use: The model is an experimental research prototype. Its outputs must not be relied upon for production environments or commercial software systems.
  • Domain Restriction: The model's primary focus strictly remains on programming and software engineering. General-purpose conversations outside technical domains may not be robust or highly coherent.

🛠️ Training Details

Training Paradigm

  1. Supervised Fine-Tuning (SFT): Initial instruction fine-tuning to teach the model structured technical problem-solving and accurate Burmese translation.
  2. Preference Alignment (DPO): Phase 4 hardening using Direct Preference Optimization (beta = 0.5). This phase used custom-generated hallucination datasets to heavily penalize language drift and reinforce strict Burmese linguistic consistency.
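
For reference, the per-pair DPO objective used in such a phase can be sketched as a scalar function of sequence log-probabilities (a simplified illustration with hypothetical values, not the actual training code; beta = 0.5 as above):

```python
import math

def dpo_loss(policy_chosen_logp, policy_rejected_logp,
             ref_chosen_logp, ref_rejected_logp, beta=0.5):
    """Per-pair DPO loss: -log sigmoid(beta * (chosen margin - rejected margin))."""
    chosen_margin = policy_chosen_logp - ref_chosen_logp
    rejected_margin = policy_rejected_logp - ref_rejected_logp
    logits = beta * (chosen_margin - rejected_margin)
    return math.log1p(math.exp(-logits))  # == -log(sigmoid(logits))

# The policy prefers the pure-Burmese (chosen) answer more strongly than the
# reference model does, so the loss falls below log(2) (~0.693), the value at
# indifference:
loss = dpo_loss(-10.0, -30.0, -12.0, -25.0)
```

Driving this loss down pushes probability mass toward consistent Burmese completions and away from the mixed-language rejections described below.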

Dataset

  • MBPP Enriched: The 974-problem MBPP (Mostly Basic Python Problems) set, augmented with detailed, step-by-step Burmese explanations.
  • Hallucination Rejection DPO: A curated dataset constructed to identify and reject mixed-language outputs.
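
Each preference pair can be pictured as a prompt/chosen/rejected triple in which the rejected completion exhibits the language drift being penalized. The record below is an illustrative, hypothetical sample, not an item from the actual dataset:

```python
# Hypothetical shape of one DPO preference record (illustrative only).
preference_record = {
    "prompt": "Write an is_even function in Python and explain it in Burmese.",
    "chosen": (
        "def is_even(n):\n"
        "    return n % 2 == 0\n"
        "# ...followed by a fully Burmese step-by-step explanation"
    ),
    "rejected": (
        "def is_even(n):\n"
        "    return n % 2 == 0\n"
        "# ...followed by an explanation that drifts between languages"
    ),
}
```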

💬 Example Output

Here is a sample interaction showcasing the model's ability to seamlessly blend technical programming logic with professional Burmese explanations.

User Prompt:

"Python မှာ Function တစ်ခု Error တက်သွားရင် သတ်မှတ်ထားတဲ့ အကြိမ်အရေအတွက်အတိုင်း ပြန်ကြိုးစားပေးမယ့် (Retry) Decorator တစ်ခုကို delay အချိန်နဲ့တကွ ရေးပြပါ။"

Burmese Coder Response: (The model generates both the executable code and the step-by-step explanation)

import time
from functools import wraps

def retry_on_failure(max_retries=3, delay=2):
    """
    Error တက်ခဲ့ပါက သတ်မှတ်ထားသော အကြိမ်အရေအတွက်အတိုင်း စောင့်ဆိုင်း၍ ပြန်ကြိုးစားမည့် Decorator
    """
    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            for attempt in range(1, max_retries + 1):
                try:
                    return func(*args, **kwargs)
                except Exception as e:
                    print(f"[Error] Attempt {attempt} failed: {e}")
                    if attempt == max_retries:
                        raise e  # နောက်ဆုံးအကြိမ်တွင် Error ကို ပြန်လွှင့်မည်
                    time.sleep(delay)
        return wrapper
    return decorator

@retry_on_failure(max_retries=3, delay=2)
def unstable_api_call():
    # ကျပန်း Error ထွက်စေမည့် API Call အတု
    raise ConnectionError("Server Timeout!")

ရှင်းလင်းချက် (Explanation):

ဒီ code က အရည်အသွေးမြင့်တဲ့ Custom Decorator တစ်ခုကို `functools` ထဲက `wraps` နဲ့ တွဲဖက်တည်ဆောက်ထားတာ ဖြစ်ပါတယ်။ `retry_on_failure` ဆိုတဲ့ အပြင်ဆုံး function က `max_retries` နဲ့ `delay` parameter တွေကို လက်ခံပြီး အတွင်းပိုင်း logic ကို ထိန်းချုပ်ပေးပါတယ်။

`wrapper` အထဲမှာတော့ `try-except block` ကို အသုံးပြုပြီး Error တက်/မတက် စစ်ဆေးပါတယ်။ Error တက်ခဲ့ရင် `time.sleep(delay)` နဲ့ သတ်မှတ်ထားတဲ့ အချိန်စောင့်ပြီး နောက်တစ်ကြိမ် ပြန်ကြိုးစားပါတယ်။ အကယ်လို့ နောက်ဆုံးအကြိမ်အထိ Error ထပ်တက်နေသေးရင်တော့ မူလ Error ကိုပဲ `raise e` ဆိုပြီး ပြန်လွှင့်ပေးလိုက်ပါတယ်။
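
To see the retry flow end to end, the decorator can be exercised with a function that fails twice and then succeeds. This is a self-contained demo (the decorator is repeated so the snippet runs on its own); `flaky_call` and `delay=0` are illustrative choices, not part of the model's output:

```python
import time
from functools import wraps

def retry_on_failure(max_retries=3, delay=2):
    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            for attempt in range(1, max_retries + 1):
                try:
                    return func(*args, **kwargs)
                except Exception as e:
                    print(f"[Error] Attempt {attempt} failed: {e}")
                    if attempt == max_retries:
                        raise e  # re-raise the original error on the final attempt
                    time.sleep(delay)
        return wrapper
    return decorator

attempts = []

@retry_on_failure(max_retries=3, delay=0)  # delay=0 so the demo runs instantly
def flaky_call():
    attempts.append(1)
    if len(attempts) < 3:
        raise ConnectionError("Server Timeout!")
    return "ok"

result = flaky_call()  # fails twice, then succeeds on the third attempt
```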

🚀 How to Use (Local Inference)

This model is exported to GGUF format; running it locally with Ollama is recommended for optimal memory efficiency and speed.

Via Ollama

  1. Create the Modelfile:

     FROM ./burmese_coder_v4.gguf
     # Add specific system prompts or parameters here if needed

  2. Initialize and Run:

     ollama create burmese_coder -f Modelfile
     ollama run burmese_coder
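
Once registered, the model can also be queried programmatically through Ollama's local REST API. The sketch below uses only the standard library and assumes the default server address `http://localhost:11434` and the `burmese_coder` model name created above; the prompt is a hypothetical example, and the actual network call is left commented out so no running server is required:

```python
import json
import urllib.request

# Build a non-streaming generate request for the local Ollama server.
payload = {
    "model": "burmese_coder",
    "prompt": "Python မှာ Factorial function ရေးပြပါ။",  # hypothetical prompt
    "stream": False,
}
body = json.dumps(payload).encode("utf-8")

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=body,
    headers={"Content-Type": "application/json"},
)

# Uncomment when an Ollama server is running locally:
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read())["response"])
```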

Via llama.cpp or text-generation-webui

Download the .gguf file and load it using the standard llama.cpp pipeline. Choose the quantization level (e.g., Q4_K_M or Q8_0) that best fits your available VRAM.

⚖️ License

This model is released under the Gemma License due to its base model heritage. Please adhere to the usage guidelines outlined by Google for Gemma derivatives.

🤝 Acknowledgments

  • Creator: Engineered and trained by Dr. Wai Yan Nyein Naing.
  • Initiative: Part of the Burmese Coding Assistant project.
  • Mission: Empowering the technology and developer community in Myanmar with localized, open-source AI tools.