# dobetter-norge-v2
Domain-specific LLM fine-tuned for Norwegian legal text. Built on Qwen 2.5 7B with QLoRA 4-bit quantization, trained on 2,847 legislative documents and 7,603 expert Q&A pairs.
## Model Overview
dobetter-norge-v2 is a domain-specific large language model purpose-built for Norwegian legal text analysis. The model was fine-tuned by the Blue Note Logic engineering team using QLoRA (Quantized Low-Rank Adaptation) at 4-bit precision on the Qwen 2.5 7B base architecture.
## Training Data
- 2,847 Norwegian legal documents — laws, regulations, Supreme Court decisions, and parliamentary preparatory works (forarbeider)
- 7,603 expert Q&A pairs — curated question-answer pairs covering statutory interpretation, case law application, and regulatory compliance
- 31,842 training examples generated through structured data augmentation
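The augmentation pipeline itself is not documented here; as a purely illustrative sketch (the question templates and expansion factor below are assumptions, not the team's actual method), one common way to grow curated Q&A pairs into a larger training set is template-based paraphrasing of the question side:

```python
# Illustrative sketch of structured data augmentation for Q&A pairs.
# The templates and expansion factor are hypothetical; the actual
# pipeline behind the 31,842 training examples is not described here.

def augment_qa_pair(question: str, answer: str) -> list[dict]:
    """Expand one curated Q&A pair into several training examples."""
    templates = [
        "{q}",                                    # original phrasing
        "Kort forklart: {q}",                     # "Briefly explained: ..."
        "Hva sier norsk rett om følgende? {q}",   # "What does Norwegian law say ...?"
        "En klient spør: {q} Hva svarer du?",     # client-style reformulation
    ]
    return [{"prompt": t.format(q=question), "completion": answer} for t in templates]

examples = augment_qa_pair(
    "Når foreldes et pengekrav etter foreldelsesloven?",
    "Den alminnelige foreldelsesfristen er tre år, jf. foreldelsesloven § 2.",
)
print(len(examples))  # 4 examples from one curated pair
```

Each variant keeps the expert answer unchanged, so the augmentation multiplies coverage of question phrasings without diluting answer quality.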
## Architecture & Quantization
| Parameter | Value |
|---|---|
| Base Model | Qwen 2.5 7B |
| Fine-Tuning | QLoRA 4-bit |
| Output Format | GGUF Q5_K_M |
| File Size | 5.1 GB |
| Context Window | 4,096 tokens |
| Training Loss | 0.1319 |
| Evaluation Loss | 0.1673 |
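QLoRA keeps the 4-bit base weights frozen and trains only low-rank adapters, which is why a 7B model fits on modest hardware. A rank-r adapter on a weight matrix of shape (d_out, d_in) adds only r·(d_in + d_out) trainable parameters. A back-of-envelope sketch (the rank, target projections, and simplified layer shapes are illustrative assumptions, not the published training config):

```python
# Back-of-envelope LoRA adapter size for a Qwen 2.5 7B-scale model.
# Rank and target projections are illustrative assumptions, not the
# actual dobetter-norge-v2 training configuration.

def lora_params(shape: tuple[int, int], rank: int) -> int:
    """A rank-r adapter on W (d_out x d_in) adds r*(d_in + d_out) params."""
    d_out, d_in = shape
    return rank * (d_in + d_out)

hidden = 3584   # Qwen 2.5 7B hidden size
layers = 28     # Qwen 2.5 7B decoder layers
rank = 16       # hypothetical LoRA rank

# Adapt only the attention q/k/v/o projections, treated here as
# hidden x hidden (a simplification: Qwen 2.5 uses grouped-query
# attention, so the k/v projections are actually smaller).
per_layer = 4 * lora_params((hidden, hidden), rank)
total = layers * per_layer
print(f"{total:,} trainable adapter parameters")  # ~12.8M vs ~7.6B frozen base
```

Even under these rough assumptions the adapters come to roughly 0.2% of the base parameter count, which is the core economy of the QLoRA approach.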
## Compatibility
The GGUF Q5_K_M format is compatible with the following inference engines:
- Ollama — `ollama run dobetter-norge-v2`
- llama.cpp — Direct GGUF loading via the `-m` flag
- LM Studio — Import via model browser or drag-and-drop
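All three engines expect a well-formed GGUF file, which begins with the four ASCII bytes `GGUF` followed by a little-endian uint32 format version. A minimal stdlib sketch for sanity-checking a download before pointing an inference engine at it (the simulated header below stands in for real file bytes):

```python
# Minimal GGUF header check: the magic bytes "GGUF" followed by a
# little-endian uint32 version. Handy for verifying a download
# completed correctly before loading it into an inference engine.
import struct

def check_gguf_header(data: bytes) -> int:
    """Return the GGUF format version, or raise ValueError if invalid."""
    if len(data) < 8 or data[:4] != b"GGUF":
        raise ValueError("not a GGUF file")
    (version,) = struct.unpack("<I", data[4:8])
    return version

# Simulated header bytes for illustration (version 3 is current).
header = b"GGUF" + struct.pack("<I", 3)
print(check_gguf_header(header))  # 3
```

In practice you would pass the first 8 bytes of the model file, e.g. `check_gguf_header(open(path, "rb").read(8))`.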
## Intended Use
This model is designed for Norwegian legal professionals, compliance teams, and legal-tech applications requiring accurate interpretation of Norwegian legislation and case law. It is not intended as a replacement for qualified legal counsel.
## Related Resources
- **Claude Model Family Reference**: Comprehensive comparison of Anthropic's Claude model tiers: Opus 4.6, Sonnet 4.6, and Haiku 4.5. Context windows, capabilities, pricing, and recommended use cases from BNL's production experience.
- **Qwen 2.5 7B Base Model**: The base model architecture behind dobetter-norge-v2. Alibaba's Qwen 2.5 7B offers strong multilingual performance and serves as an excellent foundation for domain-specific fine-tuning.