============================================================
SCALED REGULARIZATION GRID SEARCH - DEPTH 8
Job ID: 15427304 | Node: gpub083
Start: Tue Jan 6 22:01:52 CST 2026
============================================================
Grid: λ_reg=[0.0005, 0.001, 0.002, 0.005] × reg_type=[mult_linear, mult_log]
Total: 8 experiments
============================================================
NVIDIA A40, 46068 MiB
============================================================
======================================================================
SCALED REGULARIZATION GRID SEARCH
======================================================================
Depth: 8
Epochs: 100
Device: cuda
GPU: NVIDIA A40
======================================================================
Grid: 4 λ_reg × 2 reg_types = 8 experiments
λ_reg values: [0.0005, 0.001, 0.002, 0.005]
reg_types: ['mult_linear', 'mult_log']

Loading CIFAR-100...
Train: 50000, Test: 10000

============================================================
Config: depth=8, reg_type=mult_linear, λ_reg=0.0005
============================================================
Training Vanilla...
  Epoch 10: test=0.435
  Epoch 20: test=0.515
  Epoch 30: test=0.559
  Epoch 40: test=0.567
  Epoch 50: test=0.578
  Epoch 60: test=0.590
  Epoch 70: test=0.593
  Epoch 80: test=0.597
  Epoch 90: test=0.593
  Epoch 100: test=0.598
Training Lyapunov (mult_linear, λ_reg=0.0005)...
  Epoch 10: test=0.021 λ=1.726
  Epoch 20: test=0.048 λ=1.659
  Epoch 30: test=0.074 λ=1.649
  Epoch 40: test=0.056 λ=1.666
  Epoch 50: test=0.096 λ=1.671
  Epoch 60: test=0.094 λ=1.668
  Epoch 70: test=0.090 λ=1.677
  Epoch 80: test=0.111 λ=1.666
  Epoch 90: test=0.121 λ=1.669
  Epoch 100: test=0.112 λ=1.667
Result: Vanilla=0.598, Lyap=0.121, Δ=-0.477

============================================================
Config: depth=8, reg_type=mult_log, λ_reg=0.0005
============================================================
Training Vanilla...
  Epoch 10: test=0.429
  Epoch 20: test=0.517
  Epoch 30: test=0.556
  Epoch 40: test=0.564
  Epoch 50: test=0.584
  Epoch 60: test=0.575
  Epoch 70: test=0.596
  Epoch 80: test=0.594
  Epoch 90: test=0.600
  Epoch 100: test=0.596
Training Lyapunov (mult_log, λ_reg=0.0005)...
  Epoch 10: test=0.150 λ=2.466
  Epoch 20: test=0.023 λ=2.422
  Epoch 30: test=0.021 λ=2.654
  Epoch 40: test=0.028 λ=2.725
  Epoch 50: test=0.018 λ=2.763
  Epoch 60: test=0.022 λ=2.787
  Epoch 70: test=0.011 λ=2.807
  Epoch 80: test=0.011 λ=2.827
  Epoch 90: test=0.013 λ=2.828
  Epoch 100: test=0.012 λ=2.828
Result: Vanilla=0.600, Lyap=0.150, Δ=-0.450

============================================================
Config: depth=8, reg_type=mult_linear, λ_reg=0.001
============================================================
Training Vanilla...
  Epoch 10: test=0.425
  Epoch 20: test=0.538
  Epoch 30: test=0.560
  Epoch 40: test=0.576
  Epoch 50: test=0.599
  Epoch 60: test=0.600
  Epoch 70: test=0.608
  Epoch 80: test=0.608
  Epoch 90: test=0.609
  Epoch 100: test=0.614
Training Lyapunov (mult_linear, λ_reg=0.001)...
  Epoch 10: test=0.030 λ=1.726
  Epoch 20: test=0.042 λ=1.748
  Epoch 30: test=0.026 λ=1.624
  Epoch 40: test=0.015 λ=1.667
  Epoch 50: test=0.011 λ=1.659
  Epoch 60: test=0.013 λ=1.662
  Epoch 70: test=0.015 λ=1.667
  Epoch 80: test=0.014 λ=1.641
  Epoch 90: test=0.012 λ=1.642
  Epoch 100: test=0.010 λ=1.642
Result: Vanilla=0.614, Lyap=0.042, Δ=-0.572

============================================================
Config: depth=8, reg_type=mult_log, λ_reg=0.001
============================================================
Training Vanilla...
  Epoch 10: test=0.443
  Epoch 20: test=0.530
  Epoch 30: test=0.563
  Epoch 40: test=0.579
  Epoch 50: test=0.602
  Epoch 60: test=0.595
  Epoch 70: test=0.609
  Epoch 80: test=0.605
  Epoch 90: test=0.608
  Epoch 100: test=0.614
Training Lyapunov (mult_log, λ_reg=0.001)...
  Epoch 10: test=0.028 λ=1.675
  Epoch 20: test=0.016 λ=1.666
  Epoch 30: test=0.026 λ=1.652
  Epoch 40: test=0.018 λ=1.666
  Epoch 50: test=0.015 λ=1.676
  Epoch 60: test=0.012 λ=1.719
  Epoch 70: test=0.011 λ=1.744
  Epoch 80: test=0.011 λ=1.735
  Epoch 90: test=0.015 λ=1.742
  Epoch 100: test=0.014 λ=1.742
Result: Vanilla=0.614, Lyap=0.028, Δ=-0.586

============================================================
Config: depth=8, reg_type=mult_linear, λ_reg=0.002
============================================================
Training Vanilla...
  Epoch 10: test=0.439
  Epoch 20: test=0.524
  Epoch 30: test=0.554
  Epoch 40: test=0.573
  Epoch 50: test=0.577
  Epoch 60: test=0.587
  Epoch 70: test=0.591
  Epoch 80: test=0.590
  Epoch 90: test=0.596
  Epoch 100: test=0.601
Training Lyapunov (mult_linear, λ_reg=0.002)...
  Epoch 10: test=0.010 λ=1.595
  Epoch 20: test=0.010 λ=1.539
  Epoch 30: test=0.010 λ=1.560
  Epoch 40: test=0.010 λ=1.541
  Epoch 50: test=0.012 λ=1.584
  Epoch 60: test=0.010 λ=1.543
  Epoch 70: test=0.010 λ=1.541
  Epoch 80: test=0.010 λ=1.545
  Epoch 90: test=0.010 λ=1.546
  Epoch 100: test=0.010 λ=1.551
Result: Vanilla=0.601, Lyap=0.012, Δ=-0.589

============================================================
Config: depth=8, reg_type=mult_log, λ_reg=0.002
============================================================
Training Vanilla...
  Epoch 10: test=0.437
  Epoch 20: test=0.528
  Epoch 30: test=0.559
  Epoch 40: test=0.584
  Epoch 50: test=0.588
  Epoch 60: test=0.597
  Epoch 70: test=0.608
  Epoch 80: test=0.605
  Epoch 90: test=0.615
  Epoch 100: test=0.609
Training Lyapunov (mult_log, λ_reg=0.002)...
  Epoch 10: test=0.034 λ=1.688
  Epoch 20: test=0.027 λ=1.653
  Epoch 30: test=0.022 λ=1.661
  Epoch 40: test=0.026 λ=1.644
  Epoch 50: test=0.022 λ=1.591
  Epoch 60: test=0.017 λ=1.589
  Epoch 70: test=0.019 λ=1.606
  Epoch 80: test=0.011 λ=1.641
  Epoch 90: test=0.011 λ=1.631
  Epoch 100: test=0.011 λ=1.650
Result: Vanilla=0.615, Lyap=0.034, Δ=-0.581

============================================================
Config: depth=8, reg_type=mult_linear, λ_reg=0.005
============================================================
Training Vanilla...
  Epoch 10: test=0.464
  Epoch 20: test=0.525
  Epoch 30: test=0.550
  Epoch 40: test=0.556
  Epoch 50: test=0.563
  Epoch 60: test=0.590
  Epoch 70: test=0.581
  Epoch 80: test=0.584
  Epoch 90: test=0.581
  Epoch 100: test=0.584
Training Lyapunov (mult_linear, λ_reg=0.005)...
  Epoch 10: test=0.013 λ=1.609
  Epoch 20: test=0.010 λ=1.550
  Epoch 30: test=0.013 λ=1.549
  Epoch 40: test=0.009 λ=1.551
  Epoch 50: test=0.010 λ=1.541
  Epoch 60: test=0.010 λ=1.540
  Epoch 70: test=0.012 λ=1.541
  Epoch 80: test=0.010 λ=1.549
  Epoch 90: test=0.010 λ=1.551
  Epoch 100: test=0.010 λ=1.552
Result: Vanilla=0.590, Lyap=0.013, Δ=-0.577

============================================================
Config: depth=8, reg_type=mult_log, λ_reg=0.005
============================================================
Training Vanilla...
  Epoch 10: test=0.439
  Epoch 20: test=0.523
  Epoch 30: test=0.570
  Epoch 40: test=0.583
  Epoch 50: test=0.583
  Epoch 60: test=0.599
  Epoch 70: test=0.588
  Epoch 80: test=0.607
  Epoch 90: test=0.604
  Epoch 100: test=0.610
Training Lyapunov (mult_log, λ_reg=0.005)...
  Epoch 10: test=0.012 λ=1.576
  Epoch 20: test=0.014 λ=1.536
  Epoch 30: test=0.010 λ=1.531
  Epoch 40: test=0.010 λ=1.528
  Epoch 50: test=0.011 λ=1.532
  Epoch 60: test=0.010 λ=1.541
  Epoch 70: test=0.010 λ=1.554
  Epoch 80: test=0.014 λ=1.572
  Epoch 90: test=0.010 λ=1.550
  Epoch 100: test=0.010 λ=1.581
Result: Vanilla=0.610, Lyap=0.014, Δ=-0.596

======================================================================
SUMMARY: DEPTH = 8
======================================================================
reg_type        λ_reg   Vanilla  Lyapunov        Δ  Final λ
----------------------------------------------------------------------
mult_linear    0.0005     0.598     0.121   -0.477    1.667
mult_log       0.0005     0.600     0.150   -0.450    2.828
mult_linear    0.0010     0.614     0.042   -0.572    1.642
mult_log       0.0010     0.614     0.028   -0.586    1.742
mult_linear    0.0020     0.601     0.012   -0.589    1.551
mult_log       0.0020     0.615     0.034   -0.581    1.650
mult_linear    0.0050     0.590     0.013   -0.577    1.552
mult_log       0.0050     0.610     0.014   -0.596    1.581
----------------------------------------------------------------------
BEST: mult_log, λ_reg=0.0005 → 0.150 (Δ=-0.450)

Results saved to: ./runs/scaled_grid/depth8_results.json
======================================================================
GRID SEARCH COMPLETE
======================================================================
============================================================
Finished: Wed Jan 7 20:52:01 CST 2026
============================================================
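For reference, the grid enumeration and the selection behind the BEST line can be sketched as below. This is a minimal reconstruction, not the actual training script: the `results` dict simply replays the final accuracies from the Result lines above, and the loop structure (itertools.product over λ_reg × reg_type, best = highest Lyapunov accuracy) is an assumption consistent with the log's ordering.

```python
from itertools import product

# Grid from the log header (assumed ordering: λ_reg outer, reg_type inner)
lam_regs = [0.0005, 0.001, 0.002, 0.005]
reg_types = ["mult_linear", "mult_log"]

# Final accuracies replayed from the Result lines above:
# (reg_type, λ_reg) -> (vanilla_acc, lyapunov_acc)
results = {
    ("mult_linear", 0.0005): (0.598, 0.121),
    ("mult_log",    0.0005): (0.600, 0.150),
    ("mult_linear", 0.001):  (0.614, 0.042),
    ("mult_log",    0.001):  (0.614, 0.028),
    ("mult_linear", 0.002):  (0.601, 0.012),
    ("mult_log",    0.002):  (0.615, 0.034),
    ("mult_linear", 0.005):  (0.590, 0.013),
    ("mult_log",    0.005):  (0.610, 0.014),
}

# Enumerate the 4 × 2 grid in log order and compute Δ = Lyapunov - Vanilla
rows = []
for lam, rt in product(lam_regs, reg_types):
    vanilla, lyap = results[(rt, lam)]
    rows.append((rt, lam, vanilla, lyap, lyap - vanilla))

# BEST = config with the highest Lyapunov test accuracy
best = max(rows, key=lambda r: r[3])
print(f"BEST: {best[0]}, λ_reg={best[1]} → {best[3]:.3f} (Δ={best[4]:.3f})")
# → BEST: mult_log, λ_reg=0.0005 → 0.150 (Δ=-0.450)
```

Printing λ_reg with a `%.4f`-style format (rather than the `%.2f` used for the summary table, which truncated 0.0005 to 0.00) keeps the sub-millesimal grid values distinguishable.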