# ManifoldGL - Checkpoint 3000

Geometric IGBundle adapter weights for the Neurosymbolic Manifold LLM project.
## Architecture
- Base model: `Qwen/Qwen2.5-7B-Instruct` (NF4 4-bit quantization)
- Adapter: `GeometricIGBundleAdapter` - fiber bundle geometry with a Poincaré manifold, Fisher-Rao natural gradient, and Hamiltonian dynamics
- Vision: SigLIP (`google/siglip-so400m-patch14-384`) for multimodal input
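The Poincaré manifold gives the adapter hyperbolic geodesic distances rather than Euclidean ones. A minimal sketch of the standard unit-ball distance formula, for intuition only (this is not the project's `hyperbolic.py` implementation):

```python
import math

def poincare_distance(x, y):
    """Geodesic distance between two points inside the unit Poincare ball
    (constant curvature -1): d(x, y) = arcosh(1 + 2|x-y|^2 / ((1-|x|^2)(1-|y|^2)))."""
    sq = lambda v: sum(vi * vi for vi in v)
    diff2 = sq([a - b for a, b in zip(x, y)])
    denom = (1.0 - sq(x)) * (1.0 - sq(y))
    return math.acosh(1.0 + 2.0 * diff2 / denom)

# Distances grow rapidly near the ball's boundary - the property that
# makes hyperbolic embeddings attractive for hierarchical structure.
print(poincare_distance((0.0, 0.0), (0.5, 0.0)))  # ln(3) ~ 1.0986
print(poincare_distance((0.0, 0.0), (0.9, 0.0)))  # much larger, ~ 2.94
```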
## Checkpoint Details
| Metric | Value |
|---|---|
| Training step | 3000 |
| Sectional curvature (K) | -5.72 |
| Manifold entropy (S) | ~0.85 (target 1.39) |
| Adapter parameters | ~524K |
| Adapter file size | ~56 MB (FP32) |
## Usage
```python
import torch
from transformers import AutoModelForCausalLM
from igbundle.integrations.hf_patch import wrap_hf_candidate

# Load base model (4-bit)
model = AutoModelForCausalLM.from_pretrained(
    "Qwen/Qwen2.5-7B-Instruct", load_in_4bit=True
)

# Inject geometric adapter (adapter_config: your GeometricIGBundleAdapter config)
model = wrap_hf_candidate(model, adapter_config)
model.load_state_dict(torch.load("adapter_weights.pt"), strict=False)
```
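`strict=False` matters here because the checkpoint holds only the ~524K adapter parameters, not the frozen base model. A self-contained toy illustrating the loading semantics (the `Toy` module and key names are hypothetical, for illustration only):

```python
import torch
import torch.nn as nn

# Toy stand-in: a frozen "base" plus an "adapter" head
class Toy(nn.Module):
    def __init__(self):
        super().__init__()
        self.base = nn.Linear(4, 4)
        self.adapter = nn.Linear(4, 4)

toy = Toy()
# A checkpoint containing only adapter weights, like adapter_weights.pt
ckpt = {"adapter.weight": torch.zeros(4, 4), "adapter.bias": torch.zeros(4)}
result = toy.load_state_dict(ckpt, strict=False)

# strict=False tolerates the absent base.* keys: they are reported as
# missing_keys and keep their current values rather than raising an error.
print(result.missing_keys)
```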
## Training History
Trained via `train_odyssey.py` with Phase A entropy-unfreeze fixes:

- arcosh NaN gradient clamp (`hyperbolic.py`)
- Norm-based Poincaré ball projection
- Logit-space diversity loss + squared entropy loss
- Differentiable `fiber_update_net` path
- Conformal factor ceiling raised (0.1 → 0.5 tanh)
- Gradient attenuation reduced (0.001 → 0.1)
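The arcosh clamp fix addresses a standard hyperbolic-geometry pitfall: d/dx arcosh(x) = 1/sqrt(x² − 1) diverges as x → 1, so distances between nearly coincident points produce inf/NaN gradients. A sketch of the idea (the `safe_arcosh` name and `eps` value are assumptions, not the actual `hyperbolic.py` code):

```python
import torch

def safe_arcosh(x, eps=1e-7):
    # Clamp the input slightly above 1 so backward() through acosh
    # yields finite gradients instead of inf at the domain boundary.
    return torch.acosh(torch.clamp(x, min=1.0 + eps))

x = torch.tensor([1.0], requires_grad=True)  # worst case: exactly at the boundary
y = safe_arcosh(x)
y.backward()
# x.grad is finite (clamp zeroes the gradient below the floor),
# whereas torch.acosh(x) alone would backprop inf here.
```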