path: root/data/longlamp.py
author    YurenHao0426 <Blackhao0426@gmail.com>  2026-04-10 14:50:22 -0500
committer YurenHao0426 <Blackhao0426@gmail.com>  2026-04-10 14:50:22 -0500
commit    112c5d354f36d6ea6e8049cf1aeaebeb9944aa02 (patch)
tree      7b56f786ee1450aa545caf565f039e7144ed6b7c /data/longlamp.py
parent    26c899101dbb192981cc67d73fc00a2d158b503e (diff)
Fix two bugs: PEFT cleanup model corruption and K=16 OOM
Bug 1: PEFTBaseline.cleanup() corrupted wrapper.model after the LoRA unload, causing "'Qwen2Model' has no attribute 'prepare_inputs_for_generation'" for subsequent methods. Fix: save a reference to the original model before wrapping, and restore it directly in cleanup() instead of relying on unload().

Bug 2: fit_theta hit OOM at K=16 due to large logit chunks (128 × 151936 vocab). Fix: reduce CHUNK_SIZE from 128 to 32 (~4x less memory per chunk).

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
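A minimal sketch of both fixes. Only the names PEFTBaseline, wrapper.model, cleanup(), and CHUNK_SIZE come from the commit message; the wrapper structure, attach_lora(), and chunked_log_probs() are hypothetical stand-ins for the real code in the repo.

```python
import torch

class PEFTBaseline:
    """Sketch of the Bug 1 fix: keep a direct reference to the base model."""

    def __init__(self, wrapper):
        self.wrapper = wrapper
        # Fix: remember the original model BEFORE any LoRA wrapping,
        # so cleanup() can restore it without trusting unload().
        self._original_model = wrapper.model

    def attach_lora(self, peft_model):
        # Hypothetical helper: swap in the LoRA-wrapped model.
        self.wrapper.model = peft_model

    def cleanup(self):
        # Restore the saved reference directly instead of assigning the
        # result of unload(), which had corrupted wrapper.model.
        self.wrapper.model = self._original_model


# Bug 2 fix: smaller chunks over the (rows x vocab) logit matrix.
CHUNK_SIZE = 32  # was 128; 32 uses ~4x less memory per chunk

def chunked_log_probs(logits, targets, chunk_size=CHUNK_SIZE):
    """Per-token log-probs over a large vocab, computed chunk by chunk."""
    out = []
    for start in range(0, logits.shape[0], chunk_size):
        chunk = logits[start:start + chunk_size]   # (chunk, vocab)
        tgt = targets[start:start + chunk_size]    # (chunk,)
        logp = torch.log_softmax(chunk.float(), dim=-1)
        out.append(logp.gather(-1, tgt.unsqueeze(-1)).squeeze(-1))
    return torch.cat(out)
```

The cleanup pattern sidesteps PEFT's unload path entirely: since the pre-wrap reference is saved up front, restoring it cannot depend on what unload() returns.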
Diffstat (limited to 'data/longlamp.py')
0 files changed, 0 insertions, 0 deletions