============================================================
SCALED REGULARIZATION GRID SEARCH - DEPTH 12
Job ID: 15427305 | Node: gpub063
Start: Tue Jan  6 22:02:22 CST 2026
============================================================
Grid: λ_reg=[0.0005, 0.001, 0.002, 0.005] × reg_type=[mult_linear, mult_log]
Total: 8 experiments
============================================================
NVIDIA A40, 46068 MiB
============================================================
======================================================================
SCALED REGULARIZATION GRID SEARCH
======================================================================
Depth: 12
Epochs: 100
Device: cuda
GPU: NVIDIA A40
======================================================================

Grid: 4 λ_reg × 2 reg_types = 8 experiments
λ_reg values: [0.0005, 0.001, 0.002, 0.005]
reg_types: ['mult_linear', 'mult_log']

Loading CIFAR-100...
Train: 50000, Test: 10000

============================================================
Config: depth=12, reg_type=mult_linear, λ_reg=0.0005
============================================================
  Training Vanilla...
    Epoch  10: test=0.265
    Epoch  20: test=0.341
    Epoch  30: test=0.399
    Epoch  40: test=0.435
    Epoch  50: test=0.409
    Epoch  60: test=0.445
    Epoch  70: test=0.446
    Epoch  80: test=0.446
    Epoch  90: test=0.456
    Epoch 100: test=0.452
  Training Lyapunov (mult_linear, λ_reg=0.0005)...
    Epoch  10: test=0.010 λ=1.717
    Epoch  20: test=0.010 λ=1.716
    Epoch  30: test=0.010 λ=1.687
    Epoch  40: test=0.010 λ=1.671
    Epoch  50: test=0.011 λ=1.669
    Epoch  60: test=0.010 λ=1.650
    Epoch  70: test=0.010 λ=1.651
    Epoch  80: test=0.011 λ=1.620
    Epoch  90: test=0.011 λ=1.623
    Epoch 100: test=0.010 λ=1.621
  Result: Vanilla=0.456, Lyap=0.011, Δ=-0.445

============================================================
Config: depth=12, reg_type=mult_log, λ_reg=0.0005
============================================================
  Training Vanilla...
    Epoch  10: test=0.224
    Epoch  20: test=0.332
    Epoch  30: test=0.388
    Epoch  40: test=0.421
    Epoch  50: test=0.448
    Epoch  60: test=0.458
    Epoch  70: test=0.465
    Epoch  80: test=0.456
    Epoch  90: test=0.477
    Epoch 100: test=0.464
  Training Lyapunov (mult_log, λ_reg=0.0005)...
    Epoch  10: test=0.010 λ=1.726
    Epoch  20: test=0.013 λ=1.629
    Epoch  30: test=0.010 λ=1.602
    Epoch  40: test=0.010 λ=1.596
    Epoch  50: test=0.010 λ=1.606
    Epoch  60: test=0.010 λ=1.577
    Epoch  70: test=0.010 λ=1.570
    Epoch  80: test=0.010 λ=1.572
    Epoch  90: test=0.010 λ=1.563
    Epoch 100: test=0.010 λ=1.566
  Result: Vanilla=0.477, Lyap=0.013, Δ=-0.464

============================================================
Config: depth=12, reg_type=mult_linear, λ_reg=0.001
============================================================
  Training Vanilla...
    Epoch  10: test=0.249
    Epoch  20: test=0.338
    Epoch  30: test=0.378
    Epoch  40: test=0.429
    Epoch  50: test=0.452
    Epoch  60: test=0.447
    Epoch  70: test=0.484
    Epoch  80: test=0.482
    Epoch  90: test=0.489
    Epoch 100: test=0.478
  Training Lyapunov (mult_linear, λ_reg=0.001)...
    Epoch  10: test=0.012 λ=1.695
    Epoch  20: test=0.011 λ=1.636
    Epoch  30: test=0.010 λ=1.673
    Epoch  40: test=0.009 λ=1.658
    Epoch  50: test=0.010 λ=1.653
    Epoch  60: test=0.009 λ=1.622
    Epoch  70: test=0.008 λ=1.618
    Epoch  80: test=0.010 λ=1.611
    Epoch  90: test=0.012 λ=1.615
    Epoch 100: test=0.011 λ=1.604
  Result: Vanilla=0.489, Lyap=0.012, Δ=-0.477

============================================================
Config: depth=12, reg_type=mult_log, λ_reg=0.001
============================================================
  Training Vanilla...
    Epoch  10: test=0.182
    Epoch  20: test=0.285
    Epoch  30: test=0.326
    Epoch  40: test=0.375
    Epoch  50: test=0.392
    Epoch  60: test=0.407
    Epoch  70: test=0.406
    Epoch  80: test=0.427
    Epoch  90: test=0.414
    Epoch 100: test=0.438
  Training Lyapunov (mult_log, λ_reg=0.001)...
    Epoch  10: test=0.018 λ=1.729
    Epoch  20: test=0.010 λ=1.665
    Epoch  30: test=0.010 λ=1.635
    Epoch  40: test=0.010 λ=1.640
    Epoch  50: test=0.011 λ=1.638
    Epoch  60: test=0.010 λ=1.616
    Epoch  70: test=0.010 λ=1.597
    Epoch  80: test=0.010 λ=1.606
    Epoch  90: test=0.010 λ=1.613
    Epoch 100: test=0.010 λ=1.602
  Result: Vanilla=0.438, Lyap=0.018, Δ=-0.420

============================================================
Config: depth=12, reg_type=mult_linear, λ_reg=0.002
============================================================
  Training Vanilla...
    Epoch  10: test=0.242
    Epoch  20: test=0.345
    Epoch  30: test=0.387
    Epoch  40: test=0.418
    Epoch  50: test=0.458
    Epoch  60: test=0.462
    Epoch  70: test=0.469
    Epoch  80: test=0.474
    Epoch  90: test=0.481
    Epoch 100: test=0.485
  Training Lyapunov (mult_linear, λ_reg=0.002)...
    Epoch  10: test=0.015 λ=1.689
    Epoch  20: test=0.011 λ=1.704
    Epoch  30: test=0.010 λ=1.671
    Epoch  40: test=0.010 λ=1.629
    Epoch  50: test=0.010 λ=1.615
    Epoch  60: test=0.010 λ=1.603
    Epoch  70: test=0.010 λ=1.602
    Epoch  80: test=0.013 λ=1.598
    Epoch  90: test=0.010 λ=1.594
    Epoch 100: test=0.009 λ=1.596
  Result: Vanilla=0.485, Lyap=0.015, Δ=-0.470

============================================================
Config: depth=12, reg_type=mult_log, λ_reg=0.002
============================================================
  Training Vanilla...
    Epoch  10: test=0.181
    Epoch  20: test=0.296
    Epoch  30: test=0.388
    Epoch  40: test=0.424
    Epoch  50: test=0.428
    Epoch  60: test=0.444
    Epoch  70: test=0.444
    Epoch  80: test=0.458
    Epoch  90: test=0.469
    Epoch 100: test=0.466
  Training Lyapunov (mult_log, λ_reg=0.002)...
    Epoch  10: test=0.010 λ=1.717
    Epoch  20: test=0.010 λ=1.650
    Epoch  30: test=0.010 λ=1.643
    Epoch  40: test=0.011 λ=1.682
    Epoch  50: test=0.010 λ=1.652
    Epoch  60: test=0.010 λ=1.623
    Epoch  70: test=0.010 λ=1.632
    Epoch  80: test=0.010 λ=1.631
    Epoch  90: test=0.010 λ=1.613
    Epoch 100: test=0.010 λ=1.613
  Result: Vanilla=0.469, Lyap=0.011, Δ=-0.458

============================================================
Config: depth=12, reg_type=mult_linear, λ_reg=0.005
============================================================
  Training Vanilla...
    Epoch  10: test=0.177
    Epoch  20: test=0.270
    Epoch  30: test=0.385
    Epoch  40: test=0.422
    Epoch  50: test=0.438
    Epoch  60: test=0.426
    Epoch  70: test=0.448
    Epoch  80: test=0.467
    Epoch  90: test=0.468
    Epoch 100: test=0.469
  Training Lyapunov (mult_linear, λ_reg=0.005)...
    Epoch  10: test=0.013 λ=1.662
    Epoch  20: test=0.010 λ=1.668
    Epoch  30: test=0.009 λ=1.604
    Epoch  40: test=0.010 λ=1.588
    Epoch  50: test=0.010 λ=1.620
    Epoch  60: test=0.010 λ=1.589
    Epoch  70: test=0.010 λ=1.594
    Epoch  80: test=0.010 λ=1.596
    Epoch  90: test=0.010 λ=1.600
    Epoch 100: test=0.010 λ=1.600
  Result: Vanilla=0.469, Lyap=0.013, Δ=-0.456

============================================================
Config: depth=12, reg_type=mult_log, λ_reg=0.005
============================================================
  Training Vanilla...
    Epoch  10: test=0.219
    Epoch  20: test=0.357
    Epoch  30: test=0.388
    Epoch  40: test=0.382
    Epoch  50: test=0.399
    Epoch  60: test=0.428
    Epoch  70: test=0.430
    Epoch  80: test=0.460
    Epoch  90: test=0.454
    Epoch 100: test=0.463
  Training Lyapunov (mult_log, λ_reg=0.005)...
    Epoch  10: test=0.017 λ=1.706
    Epoch  20: test=0.010 λ=1.681
    Epoch  30: test=0.009 λ=1.616
    Epoch  40: test=0.010 λ=1.610
    Epoch  50: test=0.010 λ=1.639
    Epoch  60: test=0.010 λ=1.610
    Epoch  70: test=0.010 λ=1.600
    Epoch  80: test=0.010 λ=1.632
    Epoch  90: test=0.010 λ=1.607
    Epoch 100: test=0.010 λ=1.602
  Result: Vanilla=0.463, Lyap=0.017, Δ=-0.446

======================================================================
SUMMARY: DEPTH = 12
======================================================================
reg_type            λ_reg  Vanilla Lyapunov        Δ  Final λ
----------------------------------------------------------------------
mult_linear        0.0005    0.456    0.011   -0.445    1.621
mult_log           0.0005    0.477    0.013   -0.464    1.566
mult_linear         0.001    0.489    0.012   -0.477    1.604
mult_log            0.001    0.438    0.018   -0.420    1.602
mult_linear         0.002    0.485    0.015   -0.470    1.596
mult_log            0.002    0.469    0.011   -0.458    1.613
mult_linear         0.005    0.469    0.013   -0.456    1.600
mult_log            0.005    0.463    0.017   -0.446    1.602
----------------------------------------------------------------------
BEST: mult_log, λ_reg=0.001 → 0.018 (Δ=-0.420)

Results saved to: ./runs/scaled_grid/depth12_results.json
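(Illustrative note, not part of the original log: a minimal sketch of loading the saved results file and reprinting the per-config summary. The JSON field names used here are assumptions, not the confirmed schema; adjust them to the real file layout.)

    # Hypothetical sketch: inspect ./runs/scaled_grid/depth12_results.json.
    # Keys "reg_type", "lambda_reg", "vanilla_acc", "lyapunov_acc" are assumed.
    import json

    with open("./runs/scaled_grid/depth12_results.json") as f:
        results = json.load(f)

    for entry in results:
        # Delta is the Lyapunov-regularized accuracy minus the vanilla accuracy,
        # matching the sign convention of the summary table above.
        delta = entry["lyapunov_acc"] - entry["vanilla_acc"]
        print(f'{entry["reg_type"]:<12} lambda_reg={entry["lambda_reg"]:<7} '
              f'vanilla={entry["vanilla_acc"]:.3f} lyap={entry["lyapunov_acc"]:.3f} '
              f'delta={delta:+.3f}')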

======================================================================
GRID SEARCH COMPLETE
======================================================================
============================================================
Finished: Thu Jan  8 08:44:00 CST 2026
============================================================