============================================================
SCALED REGULARIZATION GRID SEARCH - DEPTH 12
Job ID: 15427305 | Node: gpub063
Start: Tue Jan 6 22:02:22 CST 2026
============================================================
Grid: λ_reg=[0.0005, 0.001, 0.002, 0.005] × reg_type=[mult_linear, mult_log]
Total: 8 experiments
============================================================
NVIDIA A40, 46068 MiB
============================================================

======================================================================
SCALED REGULARIZATION GRID SEARCH
======================================================================
Depth: 12
Epochs: 100
Device: cuda
GPU: NVIDIA A40
======================================================================
Grid: 4 λ_reg × 2 reg_types = 8 experiments
λ_reg values: [0.0005, 0.001, 0.002, 0.005]
reg_types: ['mult_linear', 'mult_log']
Loading CIFAR-100...
Train: 50000, Test: 10000

============================================================
Config: depth=12, reg_type=mult_linear, λ_reg=0.0005
============================================================
Training Vanilla...
  Epoch  10: test=0.265
  Epoch  20: test=0.341
  Epoch  30: test=0.399
  Epoch  40: test=0.435
  Epoch  50: test=0.409
  Epoch  60: test=0.445
  Epoch  70: test=0.446
  Epoch  80: test=0.446
  Epoch  90: test=0.456
  Epoch 100: test=0.452
Training Lyapunov (mult_linear, λ_reg=0.0005)...
  Epoch  10: test=0.010 λ=1.717
  Epoch  20: test=0.010 λ=1.716
  Epoch  30: test=0.010 λ=1.687
  Epoch  40: test=0.010 λ=1.671
  Epoch  50: test=0.011 λ=1.669
  Epoch  60: test=0.010 λ=1.650
  Epoch  70: test=0.010 λ=1.651
  Epoch  80: test=0.011 λ=1.620
  Epoch  90: test=0.011 λ=1.623
  Epoch 100: test=0.010 λ=1.621
Result: Vanilla=0.456, Lyap=0.011, Δ=-0.445

============================================================
Config: depth=12, reg_type=mult_log, λ_reg=0.0005
============================================================
Training Vanilla...
  Epoch  10: test=0.224
  Epoch  20: test=0.332
  Epoch  30: test=0.388
  Epoch  40: test=0.421
  Epoch  50: test=0.448
  Epoch  60: test=0.458
  Epoch  70: test=0.465
  Epoch  80: test=0.456
  Epoch  90: test=0.477
  Epoch 100: test=0.464
Training Lyapunov (mult_log, λ_reg=0.0005)...
  Epoch  10: test=0.010 λ=1.726
  Epoch  20: test=0.013 λ=1.629
  Epoch  30: test=0.010 λ=1.602
  Epoch  40: test=0.010 λ=1.596
  Epoch  50: test=0.010 λ=1.606
  Epoch  60: test=0.010 λ=1.577
  Epoch  70: test=0.010 λ=1.570
  Epoch  80: test=0.010 λ=1.572
  Epoch  90: test=0.010 λ=1.563
  Epoch 100: test=0.010 λ=1.566
Result: Vanilla=0.477, Lyap=0.013, Δ=-0.464

============================================================
Config: depth=12, reg_type=mult_linear, λ_reg=0.001
============================================================
Training Vanilla...
  Epoch  10: test=0.249
  Epoch  20: test=0.338
  Epoch  30: test=0.378
  Epoch  40: test=0.429
  Epoch  50: test=0.452
  Epoch  60: test=0.447
  Epoch  70: test=0.484
  Epoch  80: test=0.482
  Epoch  90: test=0.489
  Epoch 100: test=0.478
Training Lyapunov (mult_linear, λ_reg=0.001)...
  Epoch  10: test=0.012 λ=1.695
  Epoch  20: test=0.011 λ=1.636
  Epoch  30: test=0.010 λ=1.673
  Epoch  40: test=0.009 λ=1.658
  Epoch  50: test=0.010 λ=1.653
  Epoch  60: test=0.009 λ=1.622
  Epoch  70: test=0.008 λ=1.618
  Epoch  80: test=0.010 λ=1.611
  Epoch  90: test=0.012 λ=1.615
  Epoch 100: test=0.011 λ=1.604
Result: Vanilla=0.489, Lyap=0.012, Δ=-0.477

============================================================
Config: depth=12, reg_type=mult_log, λ_reg=0.001
============================================================
Training Vanilla...
  Epoch  10: test=0.182
  Epoch  20: test=0.285
  Epoch  30: test=0.326
  Epoch  40: test=0.375
  Epoch  50: test=0.392
  Epoch  60: test=0.407
  Epoch  70: test=0.406
  Epoch  80: test=0.427
  Epoch  90: test=0.414
  Epoch 100: test=0.438
Training Lyapunov (mult_log, λ_reg=0.001)...
  Epoch  10: test=0.018 λ=1.729
  Epoch  20: test=0.010 λ=1.665
  Epoch  30: test=0.010 λ=1.635
  Epoch  40: test=0.010 λ=1.640
  Epoch  50: test=0.011 λ=1.638
  Epoch  60: test=0.010 λ=1.616
  Epoch  70: test=0.010 λ=1.597
  Epoch  80: test=0.010 λ=1.606
  Epoch  90: test=0.010 λ=1.613
  Epoch 100: test=0.010 λ=1.602
Result: Vanilla=0.438, Lyap=0.018, Δ=-0.420

============================================================
Config: depth=12, reg_type=mult_linear, λ_reg=0.002
============================================================
Training Vanilla...
  Epoch  10: test=0.242
  Epoch  20: test=0.345
  Epoch  30: test=0.387
  Epoch  40: test=0.418
  Epoch  50: test=0.458
  Epoch  60: test=0.462
  Epoch  70: test=0.469
  Epoch  80: test=0.474
  Epoch  90: test=0.481
  Epoch 100: test=0.485
Training Lyapunov (mult_linear, λ_reg=0.002)...
  Epoch  10: test=0.015 λ=1.689
  Epoch  20: test=0.011 λ=1.704
  Epoch  30: test=0.010 λ=1.671
  Epoch  40: test=0.010 λ=1.629
  Epoch  50: test=0.010 λ=1.615
  Epoch  60: test=0.010 λ=1.603
  Epoch  70: test=0.010 λ=1.602
  Epoch  80: test=0.013 λ=1.598
  Epoch  90: test=0.010 λ=1.594
  Epoch 100: test=0.009 λ=1.596
Result: Vanilla=0.485, Lyap=0.015, Δ=-0.470

============================================================
Config: depth=12, reg_type=mult_log, λ_reg=0.002
============================================================
Training Vanilla...
  Epoch  10: test=0.181
  Epoch  20: test=0.296
  Epoch  30: test=0.388
  Epoch  40: test=0.424
  Epoch  50: test=0.428
  Epoch  60: test=0.444
  Epoch  70: test=0.444
  Epoch  80: test=0.458
  Epoch  90: test=0.469
  Epoch 100: test=0.466
Training Lyapunov (mult_log, λ_reg=0.002)...
  Epoch  10: test=0.010 λ=1.717
  Epoch  20: test=0.010 λ=1.650
  Epoch  30: test=0.010 λ=1.643
  Epoch  40: test=0.011 λ=1.682
  Epoch  50: test=0.010 λ=1.652
  Epoch  60: test=0.010 λ=1.623
  Epoch  70: test=0.010 λ=1.632
  Epoch  80: test=0.010 λ=1.631
  Epoch  90: test=0.010 λ=1.613
  Epoch 100: test=0.010 λ=1.613
Result: Vanilla=0.469, Lyap=0.011, Δ=-0.458

============================================================
Config: depth=12, reg_type=mult_linear, λ_reg=0.005
============================================================
Training Vanilla...
  Epoch  10: test=0.177
  Epoch  20: test=0.270
  Epoch  30: test=0.385
  Epoch  40: test=0.422
  Epoch  50: test=0.438
  Epoch  60: test=0.426
  Epoch  70: test=0.448
  Epoch  80: test=0.467
  Epoch  90: test=0.468
  Epoch 100: test=0.469
Training Lyapunov (mult_linear, λ_reg=0.005)...
  Epoch  10: test=0.013 λ=1.662
  Epoch  20: test=0.010 λ=1.668
  Epoch  30: test=0.009 λ=1.604
  Epoch  40: test=0.010 λ=1.588
  Epoch  50: test=0.010 λ=1.620
  Epoch  60: test=0.010 λ=1.589
  Epoch  70: test=0.010 λ=1.594
  Epoch  80: test=0.010 λ=1.596
  Epoch  90: test=0.010 λ=1.600
  Epoch 100: test=0.010 λ=1.600
Result: Vanilla=0.469, Lyap=0.013, Δ=-0.456

============================================================
Config: depth=12, reg_type=mult_log, λ_reg=0.005
============================================================
Training Vanilla...
  Epoch  10: test=0.219
  Epoch  20: test=0.357
  Epoch  30: test=0.388
  Epoch  40: test=0.382
  Epoch  50: test=0.399
  Epoch  60: test=0.428
  Epoch  70: test=0.430
  Epoch  80: test=0.460
  Epoch  90: test=0.454
  Epoch 100: test=0.463
Training Lyapunov (mult_log, λ_reg=0.005)...
  Epoch  10: test=0.017 λ=1.706
  Epoch  20: test=0.010 λ=1.681
  Epoch  30: test=0.009 λ=1.616
  Epoch  40: test=0.010 λ=1.610
  Epoch  50: test=0.010 λ=1.639
  Epoch  60: test=0.010 λ=1.610
  Epoch  70: test=0.010 λ=1.600
  Epoch  80: test=0.010 λ=1.632
  Epoch  90: test=0.010 λ=1.607
  Epoch 100: test=0.010 λ=1.602
Result: Vanilla=0.463, Lyap=0.017, Δ=-0.446

======================================================================
SUMMARY: DEPTH = 12
======================================================================
reg_type        λ_reg  Vanilla  Lyapunov        Δ  Final λ
----------------------------------------------------------------------
mult_linear    0.0005    0.456     0.011   -0.445    1.621
mult_log       0.0005    0.477     0.013   -0.464    1.566
mult_linear    0.0010    0.489     0.012   -0.477    1.604
mult_log       0.0010    0.438     0.018   -0.420    1.602
mult_linear    0.0020    0.485     0.015   -0.470    1.596
mult_log       0.0020    0.469     0.011   -0.458    1.613
mult_linear    0.0050    0.469     0.013   -0.456    1.600
mult_log       0.0050    0.463     0.017   -0.446    1.602
----------------------------------------------------------------------
BEST: mult_log, λ_reg=0.001 → 0.018 (Δ=-0.420)

Results saved to: ./runs/scaled_grid/depth12_results.json

======================================================================
GRID SEARCH COMPLETE
======================================================================

============================================================
Finished: Thu Jan 8 08:44:00 CST 2026
============================================================
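For reference, the summary block can be regenerated offline from the per-config "Result:" lines above. A minimal sketch follows; the tuple layout and column widths are assumptions (the actual script presumably reads ./runs/scaled_grid/depth12_results.json, whose schema is not shown in this log), and λ_reg is printed at full precision rather than the two decimal places the original table used, which truncated 0.0005 to 0.00.

```python
# Per-config results transcribed from the "Result:" lines of this log:
# (reg_type, lambda_reg, vanilla_acc, lyapunov_acc, final_lambda)
results = [
    ("mult_linear", 0.0005, 0.456, 0.011, 1.621),
    ("mult_log",    0.0005, 0.477, 0.013, 1.566),
    ("mult_linear", 0.001,  0.489, 0.012, 1.604),
    ("mult_log",    0.001,  0.438, 0.018, 1.602),
    ("mult_linear", 0.002,  0.485, 0.015, 1.596),
    ("mult_log",    0.002,  0.469, 0.011, 1.613),
    ("mult_linear", 0.005,  0.469, 0.013, 1.600),
    ("mult_log",    0.005,  0.463, 0.017, 1.602),
]

# Rebuild the summary table; Δ = Lyapunov accuracy − Vanilla accuracy.
print(f"{'reg_type':<12} {'λ_reg':>7} {'Vanilla':>8} {'Lyapunov':>9} {'Δ':>7} {'Final λ':>8}")
for reg_type, lam, van, lyap, final_lam in results:
    print(f"{reg_type:<12} {lam:>7.4g} {van:>8.3f} {lyap:>9.3f} "
          f"{lyap - van:>7.3f} {final_lam:>8.3f}")

# "BEST" in the log selects the config with the highest Lyapunov test accuracy.
best = max(results, key=lambda r: r[3])
print(f"BEST: {best[0]}, λ_reg={best[1]} → {best[3]:.3f} (Δ={best[3] - best[2]:.3f})")
```

Under these assumptions the selection rule reproduces the log's verdict: mult_log at λ_reg=0.001 wins on Lyapunov accuracy (0.018), even though every Lyapunov run collapses to near-chance accuracy on CIFAR-100 (chance is 0.010 for 100 classes).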