============================================================
SCALED REGULARIZATION GRID SEARCH - DEPTH 4
Job ID: 15348084 | Node: gpub054
Start: Mon Jan 5 13:00:47 CST 2026
============================================================
Grid: λ_reg=[0.01, 0.05, 0.1, 0.3] × reg_type=[mult_linear, mult_log]
Total: 8 experiments
============================================================
NVIDIA A40, 46068 MiB
============================================================
======================================================================
SCALED REGULARIZATION GRID SEARCH
======================================================================
Depth: 4
Epochs: 100
Device: cuda
GPU: NVIDIA A40
======================================================================
Grid: 4 λ_reg × 2 reg_types = 8 experiments
λ_reg values: [0.01, 0.05, 0.1, 0.3]
reg_types: ['mult_linear', 'mult_log']
Loading CIFAR-100...
Train: 50000, Test: 10000
============================================================
Config: depth=4, reg_type=mult_linear, λ_reg=0.01
============================================================
Training Vanilla...
Epoch 10: test=0.501
Epoch 20: test=0.558
Epoch 30: test=0.585
Epoch 40: test=0.594
Epoch 50: test=0.609
Epoch 60: test=0.615
Epoch 70: test=0.624
Epoch 80: test=0.629
Epoch 90: test=0.631
Epoch 100: test=0.635
Training Lyapunov (mult_linear, λ_reg=0.01)...
Epoch 10: test=0.478 λ=1.879
Epoch 20: test=0.538 λ=1.853
Epoch 30: test=0.549 λ=1.838
Epoch 40: test=0.571 λ=1.841
Epoch 50: test=0.578 λ=1.821
Epoch 60: test=0.582 λ=1.826
Epoch 70: test=0.594 λ=1.828
Epoch 80: test=0.595 λ=1.825
Epoch 90: test=0.592 λ=1.828
Epoch 100: test=0.594 λ=1.827
Result: Vanilla=0.635, Lyap=0.595, Δ=-0.039
============================================================
Config: depth=4, reg_type=mult_log, λ_reg=0.01
============================================================
Training Vanilla...
Epoch 10: test=0.487
Epoch 20: test=0.550
Epoch 30: test=0.580
Epoch 40: test=0.590
Epoch 50: test=0.598
Epoch 60: test=0.607
Epoch 70: test=0.618
Epoch 80: test=0.622
Epoch 90: test=0.622
Epoch 100: test=0.618
Training Lyapunov (mult_log, λ_reg=0.01)...
Epoch 10: test=0.497 λ=1.881
Epoch 20: test=0.561 λ=1.850
Epoch 30: test=0.582 λ=1.844
Epoch 40: test=0.588 λ=1.837
Epoch 50: test=0.602 λ=1.842
Epoch 60: test=0.607 λ=1.853
Epoch 70: test=0.613 λ=1.854
Epoch 80: test=0.616 λ=1.861
Epoch 90: test=0.616 λ=1.862
Epoch 100: test=0.619 λ=1.860
Result: Vanilla=0.622, Lyap=0.619, Δ=-0.003
============================================================
Config: depth=4, reg_type=mult_linear, λ_reg=0.05
============================================================
Training Vanilla...
Epoch 10: test=0.485
Epoch 20: test=0.556
Epoch 30: test=0.587
Epoch 40: test=0.601
Epoch 50: test=0.608
Epoch 60: test=0.613
Epoch 70: test=0.618
Epoch 80: test=0.616
Epoch 90: test=0.625
Epoch 100: test=0.627
Training Lyapunov (mult_linear, λ_reg=0.05)...
Epoch 10: test=0.075 λ=1.435
Epoch 20: test=0.033 λ=1.436
Epoch 30: test=0.016 λ=1.440
Epoch 40: test=0.019 λ=1.445
Epoch 50: test=0.019 λ=1.444
Epoch 60: test=0.017 λ=1.448
Epoch 70: test=0.019 λ=1.452
Epoch 80: test=0.018 λ=1.454
Epoch 90: test=0.018 λ=1.454
Epoch 100: test=0.019 λ=1.456
Result: Vanilla=0.627, Lyap=0.075, Δ=-0.552
============================================================
Config: depth=4, reg_type=mult_log, λ_reg=0.05
============================================================
Training Vanilla...
Epoch 10: test=0.487
Epoch 20: test=0.553
Epoch 30: test=0.576
Epoch 40: test=0.593
Epoch 50: test=0.606
Epoch 60: test=0.616
Epoch 70: test=0.616
Epoch 80: test=0.623
Epoch 90: test=0.623
Epoch 100: test=0.628
Training Lyapunov (mult_log, λ_reg=0.05)...
Epoch 10: test=0.134 λ=1.473
Epoch 20: test=0.027 λ=1.442
Epoch 30: test=0.028 λ=1.447
Epoch 40: test=0.024 λ=1.453
Epoch 50: test=0.017 λ=1.452
Epoch 60: test=0.024 λ=1.457
Epoch 70: test=0.027 λ=1.454
Epoch 80: test=0.020 λ=1.455
Epoch 90: test=0.018 λ=1.457
Epoch 100: test=0.022 λ=1.458
Result: Vanilla=0.628, Lyap=0.134, Δ=-0.494
============================================================
Config: depth=4, reg_type=mult_linear, λ_reg=0.1
============================================================
Training Vanilla...
Epoch 10: test=0.496
Epoch 20: test=0.552
Epoch 30: test=0.578
Epoch 40: test=0.597
Epoch 50: test=0.604
Epoch 60: test=0.610
Epoch 70: test=0.615
Epoch 80: test=0.617
Epoch 90: test=0.621
Epoch 100: test=0.622
Training Lyapunov (mult_linear, λ_reg=0.1)...
Epoch 10: test=0.102 λ=1.477
Epoch 20: test=0.015 λ=1.473
Epoch 30: test=0.025 λ=1.478
Epoch 40: test=0.026 λ=1.480
Epoch 50: test=0.027 λ=1.482
Epoch 60: test=0.029 λ=1.489
Epoch 70: test=0.040 λ=1.490
Epoch 80: test=0.043 λ=1.492
Epoch 90: test=0.041 λ=1.490
Epoch 100: test=0.039 λ=1.491
Result: Vanilla=0.622, Lyap=0.102, Δ=-0.521
============================================================
Config: depth=4, reg_type=mult_log, λ_reg=0.1
============================================================
Training Vanilla...
Epoch 10: test=0.499
Epoch 20: test=0.560
Epoch 30: test=0.583
Epoch 40: test=0.601
Epoch 50: test=0.605
Epoch 60: test=0.608
Epoch 70: test=0.614
Epoch 80: test=0.621
Epoch 90: test=0.623
Epoch 100: test=0.622
Training Lyapunov (mult_log, λ_reg=0.1)...
Epoch 10: test=0.108 λ=1.444
Epoch 20: test=0.042 λ=1.436
Epoch 30: test=0.032 λ=1.445
Epoch 40: test=0.030 λ=1.447
Epoch 50: test=0.037 λ=1.452
Epoch 60: test=0.031 λ=1.455
Epoch 70: test=0.022 λ=1.457
Epoch 80: test=0.029 λ=1.463
Epoch 90: test=0.029 λ=1.466
Epoch 100: test=0.028 λ=1.464
Result: Vanilla=0.623, Lyap=0.108, Δ=-0.515
============================================================
Config: depth=4, reg_type=mult_linear, λ_reg=0.3
============================================================
Training Vanilla...
Epoch 10: test=0.507
Epoch 20: test=0.559
Epoch 30: test=0.582
Epoch 40: test=0.595
Epoch 50: test=0.609
Epoch 60: test=0.612
Epoch 70: test=0.621
Epoch 80: test=0.622
Epoch 90: test=0.624
Epoch 100: test=0.624
Training Lyapunov (mult_linear, λ_reg=0.3)...
Epoch 10: test=0.014 λ=1.526
Epoch 20: test=0.016 λ=1.498
Epoch 30: test=0.016 λ=1.440
Epoch 40: test=0.018 λ=1.437
Epoch 50: test=0.011 λ=1.446
Epoch 60: test=0.024 λ=1.447
Epoch 70: test=0.024 λ=1.447
Epoch 80: test=0.034 λ=1.446
Epoch 90: test=0.034 λ=1.444
Epoch 100: test=0.031 λ=1.442
Result: Vanilla=0.624, Lyap=0.034, Δ=-0.590
============================================================
Config: depth=4, reg_type=mult_log, λ_reg=0.3
============================================================
Training Vanilla...
Epoch 10: test=0.491
Epoch 20: test=0.557
Epoch 30: test=0.583
Epoch 40: test=0.600
Epoch 50: test=0.607
Epoch 60: test=0.611
Epoch 70: test=0.623
Epoch 80: test=0.628
Epoch 90: test=0.626
Epoch 100: test=0.630
Training Lyapunov (mult_log, λ_reg=0.3)...
Epoch 10: test=0.038 λ=1.509
Epoch 20: test=0.026 λ=1.527
Epoch 30: test=0.012 λ=1.516
Epoch 40: test=0.020 λ=1.500
Epoch 50: test=0.016 λ=1.500
Epoch 60: test=0.015 λ=1.498
Epoch 70: test=0.021 λ=1.505
Epoch 80: test=0.020 λ=1.503
Epoch 90: test=0.018 λ=1.513
Epoch 100: test=0.018 λ=1.514
Result: Vanilla=0.630, Lyap=0.038, Δ=-0.593
======================================================================
SUMMARY: DEPTH = 4
======================================================================
reg_type        λ_reg   Vanilla   Lyapunov        Δ   Final λ
----------------------------------------------------------------------
mult_linear      0.01     0.635      0.595   -0.039     1.827
mult_log         0.01     0.622      0.619   -0.003     1.860
mult_linear      0.05     0.627      0.075   -0.552     1.456
mult_log         0.05     0.628      0.134   -0.494     1.458
mult_linear      0.10     0.622      0.102   -0.521     1.491
mult_log         0.10     0.623      0.108   -0.515     1.464
mult_linear      0.30     0.624      0.034   -0.590     1.442
mult_log         0.30     0.630      0.038   -0.593     1.514
----------------------------------------------------------------------
BEST: mult_log, λ_reg=0.01 → 0.619 (Δ=-0.003)
Results saved to: ./runs/scaled_grid/depth4_results.json
======================================================================
GRID SEARCH COMPLETE
======================================================================
============================================================
Finished: Tue Jan 6 00:32:27 CST 2026
============================================================
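
Appendix A. For reference, a minimal sketch of how the 4 λ_reg × 2 reg_type grid announced in the header could be enumerated. This is not the actual job script; the variable and key names are illustrative assumptions, and only the values ([0.01, 0.05, 0.1, 0.3], ['mult_linear', 'mult_log'], depth 4, 8 experiments) come from the log above.

# Grid-enumeration sketch (assumed names; values taken from the log header).
from itertools import product

LAMBDA_REG_VALUES = [0.01, 0.05, 0.1, 0.3]    # λ_reg values from the header
REG_TYPES = ["mult_linear", "mult_log"]       # regularizer variants from the header

# Iterating λ_reg in the outer position and reg_type in the inner position
# reproduces the run order seen in the log (mult_linear then mult_log per λ_reg).
experiments = [
    {"depth": 4, "reg_type": reg_type, "lambda_reg": lam}
    for lam, reg_type in product(LAMBDA_REG_VALUES, REG_TYPES)
]
assert len(experiments) == 8  # matches "Total: 8 experiments"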
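
Appendix B. A sketch of how the summary table and the BEST line could be rebuilt from the saved results file. The JSON schema assumed here (a list of per-config records with reg_type, lambda_reg, vanilla_acc, lyap_acc, and final_lambda fields) is an illustrative assumption; the real layout of depth4_results.json may differ.

# Post-hoc summary sketch (assumed schema for depth4_results.json).
import json

def summarize(path="./runs/scaled_grid/depth4_results.json"):
    # Assumed schema: a list of dicts, one per (reg_type, λ_reg) config, e.g.
    # {"reg_type": "mult_log", "lambda_reg": 0.01, "vanilla_acc": 0.622,
    #  "lyap_acc": 0.619, "final_lambda": 1.860}
    with open(path) as f:
        results = json.load(f)

    header = f"{'reg_type':<12} {'λ_reg':>6} {'Vanilla':>8} {'Lyapunov':>9} {'Δ':>7} {'Final λ':>8}"
    print(header)
    print("-" * 70)
    for r in results:
        # Δ = Lyapunov best test accuracy minus vanilla best test accuracy
        delta = r["lyap_acc"] - r["vanilla_acc"]
        print(f"{r['reg_type']:<12} {r['lambda_reg']:>6.2f} {r['vanilla_acc']:>8.3f} "
              f"{r['lyap_acc']:>9.3f} {delta:>7.3f} {r['final_lambda']:>8.3f}")
    print("-" * 70)

    # "BEST" in the log is the config with the highest Lyapunov accuracy,
    # which in this grid is also the one with the smallest accuracy drop.
    best = max(results, key=lambda r: r["lyap_acc"])
    best_delta = best["lyap_acc"] - best["vanilla_acc"]
    print(f"BEST: {best['reg_type']}, λ_reg={best['lambda_reg']} → "
          f"{best['lyap_acc']:.3f} (Δ={best_delta:.3f})")

if __name__ == "__main__":
    summarize()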