DeepSeek MCQ Trainer (LoRA) — by Mohammed Sayeeduddin
Model Details
- Model name: psdba/deepseek-mcq-trainer-mohammedsayeeduddin-lora
- Developed by: Mohammed Sayeeduddin
- Model type: LoRA adapter (PEFT) fine-tuned for structured MCQ generation
- Base model: deepseek-ai/deepseek-coder-1.3b-instruct
- Primary use: Instructor-grade MCQ generation in strict JSON format for IT training
- Language(s): English
- License: This repository contains LoRA adapter weights only. Usage is subject to the license of the base model (deepseek-ai/deepseek-coder-1.3b-instruct). Please review and comply with that license before any commercial use or redistribution.
Model Description
This is a specialist training model designed to generate multiple-choice questions (MCQs) in a strict, machine-readable JSON schema.
The adapter was fine-tuned to produce:
- 4 options (A/B/C/D)
- exactly 1 correct answer
- short explanation
- JSON-only responses (no markdown, no extra commentary)
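Because the adapter is expected to return JSON only, it is worth validating every response before ingesting it. The sketch below checks the constraints listed above; the exact field names (`question`, `options`, `answer`, `explanation`) are assumptions and should be adjusted to the adapter's actual output schema.

```python
import json

def validate_mcq(raw: str) -> dict:
    """Parse a model response and check the structural constraints:
    four options keyed A-D, exactly one correct answer, an explanation.
    Field names are illustrative assumptions, not the confirmed schema."""
    mcq = json.loads(raw)  # raises ValueError if the model emitted non-JSON
    options = mcq["options"]
    if sorted(options) != ["A", "B", "C", "D"]:
        raise ValueError(f"expected options A-D, got {sorted(options)}")
    if mcq["answer"] not in options:
        raise ValueError(f"answer {mcq['answer']!r} is not one of the options")
    if not mcq.get("explanation", "").strip():
        raise ValueError("missing explanation")
    return mcq

# Example response in the assumed schema
sample = json.dumps({
    "question": "Which HTTP method does @app.get() handle in FastAPI?",
    "options": {"A": "POST", "B": "GET", "C": "PUT", "D": "DELETE"},
    "answer": "B",
    "explanation": "@app.get() registers a handler for HTTP GET requests.",
})
mcq = validate_mcq(sample)
```

Rejecting malformed responses early keeps downstream pipelines (CSV export, LMS upload) free of half-parsed questions.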
Intended Use
Direct Use
- Corporate training MCQs (FastAPI, Docker/Linux, Python Core, LLM/RAG)
- Classroom quizzes and practice tests
- Building MCQ datasets for LMS/Excel ingestion
Downstream Use
- Integration into training platforms (MCQ generators, exam portals)
- Dataset generation pipelines (JSON → CSV → LMS)
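The JSON → CSV step of such a pipeline can be sketched with the standard library alone. Column names and MCQ field names below are illustrative assumptions; map them to whatever your LMS import format expects.

```python
import csv
import io

def mcqs_to_csv(mcqs: list[dict]) -> str:
    """Flatten validated MCQ dicts into CSV text for LMS/Excel ingestion.
    The column layout is an assumed example, not a fixed specification."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["question", "A", "B", "C", "D", "answer", "explanation"])
    for m in mcqs:
        writer.writerow([
            m["question"],
            m["options"]["A"], m["options"]["B"],
            m["options"]["C"], m["options"]["D"],
            m["answer"], m["explanation"],
        ])
    return buf.getvalue()

mcqs = [{
    "question": "Which command lists running Docker containers?",
    "options": {"A": "docker ps", "B": "docker images",
                "C": "docker logs", "D": "docker build"},
    "answer": "A",
    "explanation": "docker ps shows containers that are currently running.",
}]
csv_text = mcqs_to_csv(mcqs)
```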
Out-of-Scope Use
- Medical / legal / financial advice
- Open-domain chat or creative writing
- High-stakes decisions without human validation
Bias, Risks, and Limitations
- MCQs may still contain imperfections or ambiguous distractors.
- Always validate questions before real exams or certifications.
- The model can hallucinate if prompts are unclear or request topics outside its training scope.
How to Get Started
Install
pip install -U transformers peft accelerate
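After installing the dependencies, the adapter can be loaded on top of the base model with PEFT. This is a minimal sketch: the function names and the prompt wording below are assumptions (the card does not specify the fine-tuning prompt format), while `PeftModel.from_pretrained` is the standard peft API for attaching a LoRA adapter.

```python
BASE = "deepseek-ai/deepseek-coder-1.3b-instruct"
ADAPTER = "psdba/deepseek-mcq-trainer-mohammedsayeeduddin-lora"

def load_mcq_model():
    # Imported here because this call downloads the 1.3B base model weights.
    from transformers import AutoModelForCausalLM, AutoTokenizer
    from peft import PeftModel

    tokenizer = AutoTokenizer.from_pretrained(BASE)
    base = AutoModelForCausalLM.from_pretrained(BASE)
    model = PeftModel.from_pretrained(base, ADAPTER)  # attach LoRA adapter
    return tokenizer, model

def build_prompt(topic: str) -> str:
    """Illustrative prompt; adapt to the format used during fine-tuning."""
    return (
        f"Generate one multiple-choice question about {topic}. "
        "Respond with JSON only: four options (A-D), exactly one correct "
        "answer, and a short explanation."
    )

prompt = build_prompt("Docker networking")
```

Typical usage would be `tokenizer, model = load_mcq_model()`, then tokenizing `prompt` and calling `model.generate(...)`, feeding the decoded output through a JSON validator before use.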