path: root/runs/slurm_logs/15427304_scaled_grid_d8.out
============================================================
SCALED REGULARIZATION GRID SEARCH - DEPTH 8
Job ID: 15427304 | Node: gpub083
Start: Tue Jan  6 22:01:52 CST 2026
============================================================
Grid: λ_reg=[0.0005, 0.001, 0.002, 0.005] × reg_type=[mult_linear, mult_log]
Total: 8 experiments
============================================================
NVIDIA A40, 46068 MiB
============================================================
======================================================================
SCALED REGULARIZATION GRID SEARCH
======================================================================
Depth: 8
Epochs: 100
Device: cuda
GPU: NVIDIA A40
======================================================================

Grid: 4 λ_reg × 2 reg_types = 8 experiments
λ_reg values: [0.0005, 0.001, 0.002, 0.005]
reg_types: ['mult_linear', 'mult_log']
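
The 4 x 2 grid above is a plain nested sweep over lambda_reg and reg_type at a fixed depth and epoch budget. A minimal sketch of that enumeration, assuming Python; enumerate_grid is an illustrative helper for this log, not the script's real driver:

from itertools import product

LAMBDA_REGS = [0.0005, 0.001, 0.002, 0.005]
REG_TYPES = ["mult_linear", "mult_log"]

def enumerate_grid(depth=8, epochs=100):
    # 4 lambda_reg values x 2 reg_types = 8 configurations, in the order the log runs them
    return [
        {"depth": depth, "epochs": epochs, "lambda_reg": lam, "reg_type": rt}
        for lam, rt in product(LAMBDA_REGS, REG_TYPES)
    ]

if __name__ == "__main__":
    for cfg in enumerate_grid():
        print(cfg)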

Loading CIFAR-100...
Train: 50000, Test: 10000
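
The 50000/10000 split matches the standard CIFAR-100 train/test partition. A minimal loading sketch, assuming torchvision; the data root, batch sizes, and normalization statistics are illustrative choices, not taken from the script:

import torch
from torchvision import datasets, transforms

transform = transforms.Compose([
    transforms.ToTensor(),
    # commonly used CIFAR-100 per-channel statistics (assumed, not from the script)
    transforms.Normalize((0.5071, 0.4865, 0.4409), (0.2673, 0.2564, 0.2762)),
])

train_set = datasets.CIFAR100("./data", train=True, download=True, transform=transform)
test_set = datasets.CIFAR100("./data", train=False, download=True, transform=transform)

train_loader = torch.utils.data.DataLoader(train_set, batch_size=128, shuffle=True, num_workers=4)
test_loader = torch.utils.data.DataLoader(test_set, batch_size=256, shuffle=False)

print(f"Train: {len(train_set)}, Test: {len(test_set)}")  # Train: 50000, Test: 10000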

============================================================
Config: depth=8, reg_type=mult_linear, λ_reg=0.0005
============================================================
  Training Vanilla...
    Epoch  10: test=0.435
    Epoch  20: test=0.515
    Epoch  30: test=0.559
    Epoch  40: test=0.567
    Epoch  50: test=0.578
    Epoch  60: test=0.590
    Epoch  70: test=0.593
    Epoch  80: test=0.597
    Epoch  90: test=0.593
    Epoch 100: test=0.598
  Training Lyapunov (mult_linear, λ_reg=0.0005)...
    Epoch  10: test=0.021 λ=1.726
    Epoch  20: test=0.048 λ=1.659
    Epoch  30: test=0.074 λ=1.649
    Epoch  40: test=0.056 λ=1.666
    Epoch  50: test=0.096 λ=1.671
    Epoch  60: test=0.094 λ=1.668
    Epoch  70: test=0.090 λ=1.677
    Epoch  80: test=0.111 λ=1.666
    Epoch  90: test=0.121 λ=1.669
    Epoch 100: test=0.112 λ=1.667
  Result: Vanilla=0.598, Lyap=0.121, Δ=-0.477
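
(Δ is the Lyapunov model's accuracy minus the vanilla accuracy: 0.121 - 0.598 = -0.477.) The log does not show how the regularizer itself is built, only that each regularized model tracks an exponent estimate λ and that the penalty is weighted by λ_reg. One possible reading, sketched below purely as an assumption; the estimator, both penalty forms, and the function name are illustrative, not the script's actual code:

import torch
import torch.nn.functional as F

def regularized_loss(logits, targets, lyap_estimate, reg_type, lambda_reg):
    # lyap_estimate: scalar tensor holding the model's current Lyapunov-exponent estimate
    task_loss = F.cross_entropy(logits, targets)
    if reg_type == "mult_linear":
        penalty = lyap_estimate                           # penalize the exponent directly
    elif reg_type == "mult_log":
        penalty = torch.log1p(torch.relu(lyap_estimate))  # compress the penalty for large exponents
    else:
        raise ValueError(f"unknown reg_type: {reg_type}")
    return task_loss + lambda_reg * penalty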

============================================================
Config: depth=8, reg_type=mult_log, λ_reg=0.0005
============================================================
  Training Vanilla...
    Epoch  10: test=0.429
    Epoch  20: test=0.517
    Epoch  30: test=0.556
    Epoch  40: test=0.564
    Epoch  50: test=0.584
    Epoch  60: test=0.575
    Epoch  70: test=0.596
    Epoch  80: test=0.594
    Epoch  90: test=0.600
    Epoch 100: test=0.596
  Training Lyapunov (mult_log, λ_reg=0.0005)...
    Epoch  10: test=0.150 λ=2.466
    Epoch  20: test=0.023 λ=2.422
    Epoch  30: test=0.021 λ=2.654
    Epoch  40: test=0.028 λ=2.725
    Epoch  50: test=0.018 λ=2.763
    Epoch  60: test=0.022 λ=2.787
    Epoch  70: test=0.011 λ=2.807
    Epoch  80: test=0.011 λ=2.827
    Epoch  90: test=0.013 λ=2.828
    Epoch 100: test=0.012 λ=2.828
  Result: Vanilla=0.600, Lyap=0.150, Δ=-0.450

============================================================
Config: depth=8, reg_type=mult_linear, λ_reg=0.001
============================================================
  Training Vanilla...
    Epoch  10: test=0.425
    Epoch  20: test=0.538
    Epoch  30: test=0.560
    Epoch  40: test=0.576
    Epoch  50: test=0.599
    Epoch  60: test=0.600
    Epoch  70: test=0.608
    Epoch  80: test=0.608
    Epoch  90: test=0.609
    Epoch 100: test=0.614
  Training Lyapunov (mult_linear, λ_reg=0.001)...
    Epoch  10: test=0.030 λ=1.726
    Epoch  20: test=0.042 λ=1.748
    Epoch  30: test=0.026 λ=1.624
    Epoch  40: test=0.015 λ=1.667
    Epoch  50: test=0.011 λ=1.659
    Epoch  60: test=0.013 λ=1.662
    Epoch  70: test=0.015 λ=1.667
    Epoch  80: test=0.014 λ=1.641
    Epoch  90: test=0.012 λ=1.642
    Epoch 100: test=0.010 λ=1.642
  Result: Vanilla=0.614, Lyap=0.042, Δ=-0.572

============================================================
Config: depth=8, reg_type=mult_log, λ_reg=0.001
============================================================
  Training Vanilla...
    Epoch  10: test=0.443
    Epoch  20: test=0.530
    Epoch  30: test=0.563
    Epoch  40: test=0.579
    Epoch  50: test=0.602
    Epoch  60: test=0.595
    Epoch  70: test=0.609
    Epoch  80: test=0.605
    Epoch  90: test=0.608
    Epoch 100: test=0.614
  Training Lyapunov (mult_log, λ_reg=0.001)...
    Epoch  10: test=0.028 λ=1.675
    Epoch  20: test=0.016 λ=1.666
    Epoch  30: test=0.026 λ=1.652
    Epoch  40: test=0.018 λ=1.666
    Epoch  50: test=0.015 λ=1.676
    Epoch  60: test=0.012 λ=1.719
    Epoch  70: test=0.011 λ=1.744
    Epoch  80: test=0.011 λ=1.735
    Epoch  90: test=0.015 λ=1.742
    Epoch 100: test=0.014 λ=1.742
  Result: Vanilla=0.614, Lyap=0.028, Δ=-0.586

============================================================
Config: depth=8, reg_type=mult_linear, λ_reg=0.002
============================================================
  Training Vanilla...
    Epoch  10: test=0.439
    Epoch  20: test=0.524
    Epoch  30: test=0.554
    Epoch  40: test=0.573
    Epoch  50: test=0.577
    Epoch  60: test=0.587
    Epoch  70: test=0.591
    Epoch  80: test=0.590
    Epoch  90: test=0.596
    Epoch 100: test=0.601
  Training Lyapunov (mult_linear, λ_reg=0.002)...
    Epoch  10: test=0.010 λ=1.595
    Epoch  20: test=0.010 λ=1.539
    Epoch  30: test=0.010 λ=1.560
    Epoch  40: test=0.010 λ=1.541
    Epoch  50: test=0.012 λ=1.584
    Epoch  60: test=0.010 λ=1.543
    Epoch  70: test=0.010 λ=1.541
    Epoch  80: test=0.010 λ=1.545
    Epoch  90: test=0.010 λ=1.546
    Epoch 100: test=0.010 λ=1.551
  Result: Vanilla=0.601, Lyap=0.012, Δ=-0.589

============================================================
Config: depth=8, reg_type=mult_log, λ_reg=0.002
============================================================
  Training Vanilla...
    Epoch  10: test=0.437
    Epoch  20: test=0.528
    Epoch  30: test=0.559
    Epoch  40: test=0.584
    Epoch  50: test=0.588
    Epoch  60: test=0.597
    Epoch  70: test=0.608
    Epoch  80: test=0.605
    Epoch  90: test=0.615
    Epoch 100: test=0.609
  Training Lyapunov (mult_log, λ_reg=0.002)...
    Epoch  10: test=0.034 λ=1.688
    Epoch  20: test=0.027 λ=1.653
    Epoch  30: test=0.022 λ=1.661
    Epoch  40: test=0.026 λ=1.644
    Epoch  50: test=0.022 λ=1.591
    Epoch  60: test=0.017 λ=1.589
    Epoch  70: test=0.019 λ=1.606
    Epoch  80: test=0.011 λ=1.641
    Epoch  90: test=0.011 λ=1.631
    Epoch 100: test=0.011 λ=1.650
  Result: Vanilla=0.615, Lyap=0.034, Δ=-0.581

============================================================
Config: depth=8, reg_type=mult_linear, λ_reg=0.005
============================================================
  Training Vanilla...
    Epoch  10: test=0.464
    Epoch  20: test=0.525
    Epoch  30: test=0.550
    Epoch  40: test=0.556
    Epoch  50: test=0.563
    Epoch  60: test=0.590
    Epoch  70: test=0.581
    Epoch  80: test=0.584
    Epoch  90: test=0.581
    Epoch 100: test=0.584
  Training Lyapunov (mult_linear, λ_reg=0.005)...
    Epoch  10: test=0.013 λ=1.609
    Epoch  20: test=0.010 λ=1.550
    Epoch  30: test=0.013 λ=1.549
    Epoch  40: test=0.009 λ=1.551
    Epoch  50: test=0.010 λ=1.541
    Epoch  60: test=0.010 λ=1.540
    Epoch  70: test=0.012 λ=1.541
    Epoch  80: test=0.010 λ=1.549
    Epoch  90: test=0.010 λ=1.551
    Epoch 100: test=0.010 λ=1.552
  Result: Vanilla=0.590, Lyap=0.013, Δ=-0.577

============================================================
Config: depth=8, reg_type=mult_log, λ_reg=0.005
============================================================
  Training Vanilla...
    Epoch  10: test=0.439
    Epoch  20: test=0.523
    Epoch  30: test=0.570
    Epoch  40: test=0.583
    Epoch  50: test=0.583
    Epoch  60: test=0.599
    Epoch  70: test=0.588
    Epoch  80: test=0.607
    Epoch  90: test=0.604
    Epoch 100: test=0.610
  Training Lyapunov (mult_log, λ_reg=0.005)...
    Epoch  10: test=0.012 λ=1.576
    Epoch  20: test=0.014 λ=1.536
    Epoch  30: test=0.010 λ=1.531
    Epoch  40: test=0.010 λ=1.528
    Epoch  50: test=0.011 λ=1.532
    Epoch  60: test=0.010 λ=1.541
    Epoch  70: test=0.010 λ=1.554
    Epoch  80: test=0.014 λ=1.572
    Epoch  90: test=0.010 λ=1.550
    Epoch 100: test=0.010 λ=1.581
  Result: Vanilla=0.610, Lyap=0.014, Δ=-0.596

======================================================================
SUMMARY: DEPTH = 8
======================================================================
reg_type            λ_reg  Vanilla Lyapunov        Δ  Final λ
----------------------------------------------------------------------
mult_linear        0.0005    0.598    0.121   -0.477    1.667
mult_log           0.0005    0.600    0.150   -0.450    2.828
mult_linear         0.001    0.614    0.042   -0.572    1.642
mult_log            0.001    0.614    0.028   -0.586    1.742
mult_linear         0.002    0.601    0.012   -0.589    1.551
mult_log            0.002    0.615    0.034   -0.581    1.650
mult_linear         0.005    0.590    0.013   -0.577    1.552
mult_log            0.005    0.610    0.014   -0.596    1.581
----------------------------------------------------------------------
BEST: mult_log, λ_reg=0.0005 → 0.150 (Δ=-0.450)

Results saved to: ./runs/scaled_grid/depth8_results.json
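
To rebuild the summary with unrounded λ_reg values from the saved file, a minimal sketch follows; the JSON field names ("reg_type", "lambda_reg", "vanilla", "lyapunov", "final_lambda") are assumptions about the schema, not confirmed by the log:

import json

with open("./runs/scaled_grid/depth8_results.json") as f:
    results = json.load(f)

for r in results:
    delta = r["lyapunov"] - r["vanilla"]  # negative when the regularized model underperforms
    print(f'{r["reg_type"]:<18} {r["lambda_reg"]:>7.4f} '
          f'{r["vanilla"]:>8.3f} {r["lyapunov"]:>8.3f} {delta:>8.3f} {r["final_lambda"]:>8.3f}')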

======================================================================
GRID SEARCH COMPLETE
======================================================================
============================================================
Finished: Wed Jan  7 20:52:01 CST 2026
============================================================