# One-shot Entropy Minimization
[Paper (arXiv)](https://arxiv.org/abs/2505.20282) · [Model (Hugging Face)](https://huggingface.co/zgao3186/qwen25math7b-one-shot-em/) · [Project page (Notion)](https://www.notion.so/One-shot-Entropy-Minimization-202606db813b80639773f850f39246a5)
### Installation
```bash
pip install torch transformers==4.47.1 accelerate deepspeed psutil pandas numpy wandb
```
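A quick way to confirm the environment before launching training (this check is only a sketch and not part of the official setup; the pinned `transformers==4.47.1` and a visible GPU are the things worth verifying):

```python
# Environment sanity check (illustrative only).
import torch
import transformers

print("transformers:", transformers.__version__)   # expected 4.47.1 per the pin above
print("CUDA available:", torch.cuda.is_available())
```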
---
### Reproducing One-shot EM Training (SOTA)
```bash
accelerate launch train.py \
--model_name Qwen2.5-Math-7B \
--model_path /path/to/Qwen2.5-Math-7B \
--train_data dataset/1shot_rlvr/pi1_r1280.parquet \
--effective_batch 64 \
--micro_batch_size 2 \
--temperature 0.5 \
--learning_rate 2e-5 \
--max_steps 50 \
--log_steps 1 \
--save_steps 1 \
--run_name one_shot \
--wandb_project one-shot-em
```
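For intuition, entropy minimization trains the model to sharpen its own output distribution: the loss is the mean per-token entropy of the model's predictions over the generated tokens, and gradient descent pushes it down. The snippet below is a minimal sketch of such a loss; the function name, the masking convention, and the way `--temperature` enters the logits are illustrative assumptions, not the exact implementation in `train.py`.

```python
import torch
import torch.nn.functional as F

def entropy_loss(logits: torch.Tensor, mask: torch.Tensor, temperature: float = 0.5) -> torch.Tensor:
    """Mean per-token entropy of the model's output distribution.

    logits: (batch, seq_len, vocab) raw model outputs
    mask:   (batch, seq_len), 1 for generated tokens to count, 0 elsewhere
    Minimizing this value sharpens the predictive distribution (entropy minimization).
    """
    mask = mask.float()
    log_probs = F.log_softmax(logits / temperature, dim=-1)
    probs = log_probs.exp()
    token_entropy = -(probs * log_probs).sum(dim=-1)            # (batch, seq_len)
    return (token_entropy * mask).sum() / mask.sum().clamp(min=1.0)
```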
---
### Reproducing Multi-shot EM Training
```bash
accelerate launch train.py \
--model_name Qwen2.5-Math-7B \
--model_path /path/to/Qwen2.5-Math-7B \
--train_data dataset/numina/numina_00.parquet \
--effective_batch 64 \
--micro_batch_size 2 \
--temperature 0.5 \
--learning_rate 2e-5 \
--max_steps 50 \
--log_steps 1 \
--save_steps 1 \
--run_name multi_shot \
--wandb_project one-shot-em
```
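The training files are parquet datasets; if you want to inspect one before launching a run, a short pandas snippet is enough (the column layout is not guaranteed here, so check your copy of the file; reading parquet also requires an engine such as `pyarrow`):

```python
import pandas as pd

# Peek at the multi-shot training split; adjust the path to your checkout.
df = pd.read_parquet("dataset/numina/numina_00.parquet")
print(df.shape)
print(df.columns.tolist())   # actual column names depend on the dataset release
print(df.head(2))
```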
---
### Evaluation
```bash
cd Qwen2.5-Eval/evaluation
bash sh/eval_all_math.sh
```
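For a quick qualitative check outside the benchmark harness, the released checkpoint can be loaded directly with `transformers`. The prompt below is a plain illustration; for faithful benchmark numbers, use the prompt templates and scripts in `Qwen2.5-Eval` as shown above.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "zgao3186/qwen25math7b-one-shot-em"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto", device_map="auto")

prompt = "Solve: If 3x + 5 = 20, what is x?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```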
---
### Acknowledgements
Our data and evaluation setup reference and build upon the following open-source projects:
- [NuminaMath-CoT](https://huggingface.co/datasets/AI-MO/NuminaMath-CoT)
- [DeepScaler](https://github.com/agentica-project/deepscaler)
- [One-shot RLVR](https://github.com/ypwang61/One-Shot-RLVR/) – for data selection strategies
- [Qwen2.5-Eval](https://github.com/QwenLM/Qwen2.5-Math/) – for evaluation benchmarks
We sincerely thank the authors and maintainers of these projects for their excellent contributions to the research community!
---
### Citation
```bibtex
@misc{gao2025oneshotentropyminimization,
      title={One-shot Entropy Minimization},
      author={Zitian Gao and Lynx Chen and Haoming Luo and Joey Zhou and Bryan Dai},
      year={2025},
      eprint={2505.20282},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      url={https://arxiv.org/abs/2505.20282},
}
```