Building GhostLM, a decoder-only transformer LLM trained from scratch in PyTorch on cybersecurity corpora (CVEs, MITRE ATT&CK, CTFtime, Exploit-DB). Interested in domain-specialized pretraining, efficient training on consumer hardware, and the intersection of offensive security and language models. Scale ladder: ghost-tiny (14.7M) → ghost-small (45M, current) → ghost-base (350M) → ghost-1B.
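
For context, here is a minimal sketch of the kind of decoder-only transformer the ladder refers to, written in PyTorch. This is not the GhostLM code; every hyperparameter (vocab size, d_model, n_head, n_layer, context length) is an illustrative guess, and the toy config below will not reproduce the 14.7M/45M parameter counts listed above.

```python
# Minimal decoder-only transformer sketch in PyTorch.
# NOT the actual GhostLM implementation; all hyperparameters are assumptions.
import torch
import torch.nn as nn

class Block(nn.Module):
    def __init__(self, d_model: int, n_head: int):
        super().__init__()
        self.ln1 = nn.LayerNorm(d_model)
        self.attn = nn.MultiheadAttention(d_model, n_head, batch_first=True)
        self.ln2 = nn.LayerNorm(d_model)
        self.mlp = nn.Sequential(
            nn.Linear(d_model, 4 * d_model),
            nn.GELU(),
            nn.Linear(4 * d_model, d_model),
        )

    def forward(self, x):
        # Causal mask: each token may attend only to itself and earlier tokens.
        T = x.size(1)
        mask = torch.triu(torch.ones(T, T, dtype=torch.bool, device=x.device), diagonal=1)
        h = self.ln1(x)
        attn_out, _ = self.attn(h, h, h, attn_mask=mask, need_weights=False)
        x = x + attn_out
        x = x + self.mlp(self.ln2(x))
        return x

class TinyDecoderLM(nn.Module):
    # Toy config; the real ghost-* configs are not public in this blurb.
    def __init__(self, vocab_size=32000, d_model=384, n_head=6, n_layer=6, max_len=1024):
        super().__init__()
        self.tok_emb = nn.Embedding(vocab_size, d_model)
        self.pos_emb = nn.Embedding(max_len, d_model)
        self.blocks = nn.ModuleList(Block(d_model, n_head) for _ in range(n_layer))
        self.ln_f = nn.LayerNorm(d_model)
        self.head = nn.Linear(d_model, vocab_size, bias=False)

    def forward(self, idx):
        pos = torch.arange(idx.size(1), device=idx.device)
        x = self.tok_emb(idx) + self.pos_emb(pos)
        for block in self.blocks:
            x = block(x)
        return self.head(self.ln_f(x))

if __name__ == "__main__":
    model = TinyDecoderLM()
    n_params = sum(p.numel() for p in model.parameters())
    # Rough scale check only; these toy values won't match the ladder figures above.
    print(f"params: {n_params / 1e6:.1f}M")
    logits = model(torch.randint(0, 32000, (2, 128)))
    print(logits.shape)  # (2, 128, 32000)
```

The parameter counts in the ladder are dominated by d_model, n_layer, and vocabulary size; a sketch like this makes it easy to sanity-check a target size before committing a consumer GPU to a training run.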