/u/yurenh2/.local/lib/python3.9/site-packages/transformers/utils/hub.py:110: FutureWarning: Using `TRANSFORMERS_CACHE` is deprecated and will be removed in v5 of Transformers. Use `HF_HOME` instead.
warnings.warn(
`torch_dtype` is deprecated! Use `dtype` instead!
Loading checkpoint shards: 0%| | 0/4 [00:00<?, ?it/s]
Loading checkpoint shards: 25%|██▌ | 1/4 [02:34<07:43, 154.66s/it]
Loading checkpoint shards: 50%|█████ | 2/4 [03:10<02:49, 84.99s/it]
Loading checkpoint shards: 75%|███████▌ | 3/4 [03:41<01:00, 60.27s/it]
Loading checkpoint shards: 100%|██████████| 4/4 [04:00<00:00, 60.08s/it]
/var/spool/slurmd/job14367333/slurm_script: line 39: 3292815 Killed python scripts/benchmark_inference.py --mode transformers --model $MODEL_8B -n 10
[2025-12-29T04:01:31.106] error: Detected 1 oom_kill event in StepId=14367333.batch. Some of the step tasks have been OOM Killed.