author    YurenHao0426 <blackhao0426@gmail.com>    2026-01-13 23:50:59 -0600
committer YurenHao0426 <blackhao0426@gmail.com>    2026-01-13 23:50:59 -0600
commit    00cf667cee7ffacb144d5805fc7e0ef443f3583a (patch)
tree      77d20a3adaecf96bf3aff0612bdd3b5fa1a7dc7e /docs
parent    c53c04aa1d6ff75cb478a9498c370baa929c74b6 (diff)
parent    cd99d6b874d9d09b3bb87b8485cc787885af71f1 (diff)
Merge master into main
Diffstat (limited to 'docs')
-rw-r--r--  docs/description-from-rainer.md  66
-rw-r--r--  docs/temp-notes.md                0
2 files changed, 66 insertions, 0 deletions
diff --git a/docs/description-from-rainer.md b/docs/description-from-rainer.md
new file mode 100644
index 0000000..2304e7e
--- /dev/null
+++ b/docs/description-from-rainer.md
@@ -0,0 +1,66 @@
+# Improving Spiking Network Training using Dynamical Systems
+
+## Goal
+The goal is to overcome a key obstacle in training brain-inspired **Spiking Neural Networks (SNNs)**.
+While SNNs are promising, their all-or-nothing “spikes” are **non-differentiable**, clashing with standard training algorithms.
+The common workaround, **surrogate gradients**, often fails on temporal tasks due to **unstable error signals** that either **explode or vanish**.
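+
+To make the workaround concrete: the forward pass keeps the hard threshold, while the backward pass substitutes a smooth derivative. A minimal sketch in PyTorch (the fast-sigmoid surrogate shape and the `beta` value are illustrative choices, not this project's reference code):
+
+```python
+import torch
+
+class SpikeSurrogate(torch.autograd.Function):
+    """Heaviside spike forward; smooth fast-sigmoid derivative backward."""
+
+    @staticmethod
+    def forward(ctx, v):
+        ctx.save_for_backward(v)
+        return (v > 0.0).float()                    # all-or-nothing spike
+
+    @staticmethod
+    def backward(ctx, grad_output):
+        (v,) = ctx.saved_tensors
+        beta = 10.0                                 # surrogate sharpness (illustrative)
+        surrogate = 1.0 / (1.0 + beta * v.abs()) ** 2
+        return grad_output * surrogate              # smooth stand-in for the spike's derivative
+
+spike = SpikeSurrogate.apply                        # drop-in differentiable spike op
+```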
+
+This project reframes the problem using **dynamical systems theory**.
+You will evaluate a novel technique called **surrogate gradient flossing**, which measures the stability of the training process by calculating **surrogate Lyapunov exponents**.
+By actively tuning the network to control these exponents, you will stabilize learning and enable SNNs to tackle challenging **temporal credit assignment** problems.
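+
+A hedged sketch of how such exponents can be estimated: propagate an orthonormal tangent basis through the Jacobians of the surrogate-smoothed state update and re-orthonormalize with QR (Benettin's classic method). Here `step` stands in for the network's one-step state map and `h0` for a flat state vector; both names are placeholders, not this project's reference code.
+
+```python
+import torch
+
+def surrogate_lyapunov_exponents(step, h0, n_steps, k=5):
+    """Estimate the k leading Lyapunov exponents of the map h -> step(h)."""
+    h = h0.clone()
+    Q, _ = torch.linalg.qr(torch.randn(h.numel(), k, dtype=h.dtype))  # tangent basis
+    log_stretch = torch.zeros(k, dtype=h.dtype)
+    for _ in range(n_steps):
+        # Jacobian of the state update; spikes inside `step` use the surrogate
+        # derivative. create_graph=True keeps the graph so the exponents can
+        # later be optimized directly (needed for flossing).
+        J = torch.autograd.functional.jacobian(step, h, create_graph=True)
+        h = step(h)
+        Q, R = torch.linalg.qr(J @ Q)               # re-orthonormalize
+        log_stretch = log_stretch + R.diagonal().abs().log()
+    return log_stretch / n_steps                    # lambda_i per time step
+```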
+
+---
+
+## Project Rationale
+The **surrogate gradient flossing** idea has been implemented and has shown initial promise, but it requires **rigorous evaluation** and a **deeper theoretical understanding** of its mechanics.
+
+The central question is whether **explicitly controlling the stability of the gradient flow** is a viable and robust strategy for improving learning in SNNs.
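+
+Concretely, flossing (Engelken, 2023) turns that stability into an auxiliary objective: penalize the exponents' distance from zero so the tangent dynamics neither explode nor vanish. A sketch building on the estimator above; `alpha` and the schedule (pretraining versus interleaving with the task loss) are exactly the open design choices:
+
+```python
+# Flossing: pull the leading surrogate Lyapunov exponents toward zero.
+lambdas = surrogate_lyapunov_exponents(step, h0, n_steps=200, k=5)
+floss_loss = (lambdas ** 2).sum()
+
+# task_loss: the usual task objective; alpha: illustrative trade-off weight.
+total_loss = task_loss + alpha * floss_loss
+total_loss.backward()
+```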
+
+This project offers a chance to explore the **fundamental link between system dynamics and learning capacity**.
+
+---
+
+## Key Outcomes
+
+- ✅ **Train a simple SNN** with surrogate gradient flossing and perform gradient analysis (see the sketch after this list).
+- 🚀 **(Stretch)**: Compare performance on **benchmark temporal tasks** such as *spiking spoken-digit classification* and analyze the resulting network dynamics.
+- 🧠 **(Stretch)**: Implement a **multi-layer SNN** and quantify gradients and performance **with and without** surrogate gradient flossing.
+- 🔧 **(Stretch)**: **Optimize the gradient flossing schedule** to improve stability and convergence.
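+
+For the gradient analysis in the first outcome, one simple diagnostic is the norm of the loss gradient with respect to each hidden state across time, which makes vanishing or exploding temporal credit assignment directly visible. A minimal sketch, assuming the hidden states were collected during the forward pass (the function name is illustrative):
+
+```python
+import torch
+
+def gradients_through_time(loss, hidden_states):
+    """Return ||dL/dh_t|| for each retained state h_t: a roughly flat profile
+    indicates stable credit assignment; rapid decay or growth signals trouble."""
+    grads = torch.autograd.grad(loss, hidden_states, retain_graph=True)
+    return [g.norm().item() for g in grads]
+```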
+
+---
+
+## Project Profile
+
+| Category | Intensity |
+|--------------------|------------|
+| Analytical Intensity | ⭐⭐⭐ |
+| Coding Intensity | ⭐⭐⭐ |
+| Computational Neuroscience | ⭐⭐⭐ |
+| Machine Learning | ⭐⭐⭐ |
+
+**Starter Toolkit:** Reference implementations are available in **Julia** and **Python**.
+
+---
+
+## Literature
+
+1. **Neftci, E. O., Mostafa, H., & Zenke, F. (2019).**
+ *Surrogate gradient learning in spiking neural networks.*
+   _IEEE Signal Processing Magazine, 36(6), 51–63._
+
+2. **Engelken, R. (2023).**
+ *Gradient Flossing: Improving Gradient Descent through Dynamic Control of Jacobians.*
+ _NeurIPS._
+ [https://doi.org/10.48550/arXiv.2312.17306](https://doi.org/10.48550/arXiv.2312.17306)
+
+3. **Gygax, J., & Zenke, F. (2025).**
+ *Elucidating the theoretical underpinnings of surrogate gradient learning in spiking neural networks.*
+   _Neural Computation, 37(5), 886–925._
+
+4. **Rossbroich, J., Gygax, J., & Zenke, F. (2022).**
+ *Fluctuation-driven initialization for spiking neural network training.*
+ _Neuromorphic Computing and Engineering, 2(4), 044016._
+
+5. **Yik, J., et al. (2025).**
+ *The Neurobench framework for benchmarking neuromorphic computing algorithms and systems.*
+ _Nature Communications, 16(1), 1545._
diff --git a/docs/temp-notes.md b/docs/temp-notes.md
new file mode 100644
index 0000000..e69de29
--- /dev/null
+++ b/docs/temp-notes.md