path: root/runs/slurm_logs/14360854_test.out
============================================================
Quick Test: SNN with Lyapunov Regularization
Job ID: 14360854 | Node: gpub040
============================================================
NVIDIA A40, 46068 MiB
============================================================
Test 1: Model and Lyapunov computation...
  Logits shape: torch.Size([8, 10])
  Lyapunov exponent: 0.3758
  Spikes shape: torch.Size([8, 50, 64])
  PASSED
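
Note: a minimal sketch of what Test 1 appears to exercise: a forward pass returning
logits and spike trains with the shapes above, plus a divergence-based Lyapunov
estimate. The single-layer LIF model, the estimator, and every name below are
assumptions for illustration, not the repository's code.

    # Sketch only: model structure and Lyapunov estimator are assumed, not taken from the repo.
    import torch
    import torch.nn as nn

    torch.manual_seed(0)
    B, T, D, H, C = 8, 50, 100, 64, 10            # batch, time steps, input dim, hidden, classes
    fc_in, fc_out, beta = nn.Linear(D, H), nn.Linear(H, C), 0.9

    def run_snn(x):                               # x: [B, T, D] -> (logits [B, C], spikes [B, T, H])
        v, spikes = torch.zeros(x.shape[0], H), []
        for t in range(x.shape[1]):
            v = beta * v + fc_in(x[:, t])         # leaky integrate
            s = (v > 1.0).float()                 # hard threshold; fine for a forward-only check
            v = v - s                             # soft reset
            spikes.append(s)
        spikes = torch.stack(spikes, 1)
        return fc_out(spikes.mean(1)), spikes

    def lyap_estimate(x, eps=1e-3):
        """Crude largest-Lyapunov estimate: average log divergence rate of a perturbed run."""
        _, s0 = run_snn(x)
        _, s1 = run_snn(x + eps * torch.randn_like(x))
        d = (s1 - s0).flatten(1).norm(dim=1).clamp_min(1e-8)
        return (torch.log(d / eps) / x.shape[1]).mean()

    x = torch.randn(B, T, D)
    logits, spikes = run_snn(x)
    print(logits.shape)                           # torch.Size([8, 10])
    print(float(lyap_estimate(x)))                # a scalar estimate (the log above shows 0.3758)
    print(spikes.shape)                           # torch.Size([8, 50, 64])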

Test 2: Training loop with Lyapunov regularization...
  Step 1: loss=5.7394, lyap=0.4977
  Step 2: loss=4.2242, lyap=0.4870
  Step 3: loss=3.0435, lyap=0.4873
  Step 4: loss=2.5410, lyap=0.4870
  Step 5: loss=2.1885, lyap=0.4874
  PASSED
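
Note: a minimal sketch of a training step consistent with Test 2, assuming the
regularizer adds lambda_reg times a penalty on the Lyapunov estimate to the
cross-entropy loss. The surrogate-gradient spike, the penalty form, and all names
are assumptions, not the script's actual implementation.

    # Sketch only: loss form (CE + lambda_reg * penalty on the Lyapunov estimate),
    # surrogate-gradient spike, and every name here are assumptions.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    def spike_fn(v, thr=1.0):
        """Heaviside spike with a sigmoid straight-through surrogate gradient."""
        hard = (v > thr).float()
        soft = torch.sigmoid(5.0 * (v - thr))
        return soft + (hard - soft).detach()

    class TinySNN(nn.Module):
        def __init__(self, in_dim=100, hidden=64, n_classes=10, beta=0.9):
            super().__init__()
            self.fc_in = nn.Linear(in_dim, hidden)
            self.fc_out = nn.Linear(hidden, n_classes)
            self.beta = beta

        def forward(self, x):                                # x: [B, T, in_dim]
            v = torch.zeros(x.shape[0], self.fc_in.out_features, device=x.device)
            spikes = []
            for t in range(x.shape[1]):
                v = self.beta * v + self.fc_in(x[:, t])
                s = spike_fn(v)
                v = v - s                                    # soft reset
                spikes.append(s)
            spikes = torch.stack(spikes, 1)                  # [B, T, hidden]
            return self.fc_out(spikes.mean(1)), spikes

    def lyap_estimate(model, x, eps=1e-3):
        """Differentiable divergence-based estimate of the largest Lyapunov exponent."""
        _, s0 = model(x)
        _, s1 = model(x + eps * torch.randn_like(x))
        d = (s1 - s0).flatten(1).norm(dim=1).clamp_min(1e-8)
        return (torch.log(d / eps) / x.shape[1]).mean()

    torch.manual_seed(0)
    model, lambda_reg = TinySNN(), 0.1                       # lambda_reg matches the log's 0.1
    opt = torch.optim.Adam(model.parameters(), lr=1e-2)
    x, y = torch.randn(8, 50, 100), torch.randint(0, 10, (8,))
    for step in range(1, 6):
        logits, _ = model(x)
        lyap = lyap_estimate(model, x)
        loss = F.cross_entropy(logits, y) + lambda_reg * torch.relu(lyap)  # one plausible penalty
        opt.zero_grad()
        loss.backward()
        opt.step()
        print(f"Step {step}: loss={loss.item():.4f}, lyap={lyap.item():.4f}")

Penalizing only a positive exponent (via relu) is just one plausible choice; the
actual script may regularize the raw estimate or its square instead.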

Test 3: Depth comparison (2 epochs, depths 1,2,4)...
======================================================================
Experiment: Vanilla vs Lyapunov-Regularized SNN
======================================================================
Depths: [1, 2, 4]
Hidden dim: 64
Epochs: 2
Lambda_reg: 0.1
Device: cuda

Using SYNTHETIC data for quick testing
Data: T=50, D=100, classes=10
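
Note: one plausible way to build synthetic spike data with T=50, D=100, and 10
classes, assuming class-dependent Poisson firing rates; the generator name,
rate range, and dataset size are illustrative assumptions.

    # Sketch only: hypothetical synthetic spike dataset matching T=50, D=100, 10 classes.
    import torch

    def make_synthetic_spikes(n_samples=1000, T=50, D=100, n_classes=10, seed=0):
        """Poisson spike trains whose firing-rate pattern depends on the class label."""
        g = torch.Generator().manual_seed(seed)
        labels = torch.randint(0, n_classes, (n_samples,), generator=g)
        base = torch.rand(n_classes, D, generator=g) * 0.2          # per-class rates in [0, 0.2)
        rates = base[labels].unsqueeze(1).expand(n_samples, T, D)   # [N, T, D]
        spikes = (torch.rand(n_samples, T, D, generator=g) < rates).float()
        return spikes, labels

    x, y = make_synthetic_spikes()
    print(x.shape, y.shape)   # torch.Size([1000, 50, 100]) torch.Size([1000])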

==================================================
Depth = 1 layers
==================================================

  Training VANILLA...
    Final: loss=0.1148 acc=0.979 val_acc=0.998 λ=N/A ∇=0.99

  Training LYAPUNOV...
    Final: loss=0.1152 acc=0.979 val_acc=1.000 λ=-0.063 ∇=0.99

==================================================
Depth = 2 layers
==================================================

  Training VANILLA...
    Final: loss=0.0306 acc=0.999 val_acc=1.000 λ=N/A ∇=0.24

  Training LYAPUNOV...
    Final: loss=0.0424 acc=1.000 val_acc=1.000 λ=0.285 ∇=0.28

==================================================
Depth = 4 layers
==================================================

  Training VANILLA...
    Final: loss=0.7594 acc=0.758 val_acc=0.826 λ=N/A ∇=1.03

  Training LYAPUNOV...
    Final: loss=0.7975 acc=0.774 val_acc=0.828 λ=0.638 ∇=1.02

======================================================================
SUMMARY: Final Validation Accuracy by Depth
======================================================================
Depth    Vanilla         Lyapunov        Difference     
----------------------------------------------------------------------
1        0.998           1.000           +0.002         
2        1.000           1.000           0.000          
4        0.826           0.828           +0.002         
======================================================================

Gradient Norm Analysis (final epoch):
----------------------------------------------------------------------
Depth    Vanilla ∇       Lyapunov ∇     
----------------------------------------------------------------------
1        0.99            0.99           
2        0.24            0.28           
4        1.03            1.02           
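
Note: the ∇ columns report a gradient norm. A common way to compute a global L2
norm over all parameter gradients after backward() is sketched below; whether the
script computes it exactly this way is an assumption.

    # Sketch only: global L2 norm over all parameter gradients, a common way to report ∇.
    import torch

    def global_grad_norm(model: torch.nn.Module) -> float:
        total = 0.0
        for p in model.parameters():
            if p.grad is not None:
                total = total + p.grad.detach().norm() ** 2
        return float(total ** 0.5)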

Results saved to runs/test_output/20251227-055553

============================================================
All tests PASSED
============================================================