Enabling Deep Spiking Neural Networks with Hybrid Conversion and Spike Timing Dependent Backpropagation

This repository contains the code for the paper "Enabling Deep Spiking Neural Networks with Hybrid Conversion and Spike Timing Dependent Backpropagation", published at ICLR 2020.

Training Methodology

The training is performed in the following two steps:

  • Train an ANN ('ann.py')
  • Convert the ANN to an SNN and fine-tune it with spike-based backpropagation ('main.py'); a minimal conversion sketch is shown after this list
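For context, the conversion step in hybrid training typically copies the trained ANN weights into a spiking network and sets each layer's firing threshold from the ANN's activation statistics, after which spike-based backpropagation fine-tunes the result. The sketch below illustrates this threshold-balancing idea in PyTorch; the helper name `balance_thresholds`, the percentile choice, and the hook-based activation recording are illustrative assumptions and do not reproduce the exact interface of 'ann.py' or 'main.py'.

```python
import torch

@torch.no_grad()
def balance_thresholds(ann, snn, loader, device, percentile=99.7):
    """Illustrative ANN-to-SNN conversion: copy trained weights and derive a
    per-layer firing threshold from ANN activation statistics.
    (Hypothetical helper -- the repository's own conversion code may differ.)"""
    # 1. Copy the trained ANN weights into the matching SNN layers.
    snn.load_state_dict(ann.state_dict(), strict=False)

    # 2. Record the output of each ReLU layer on a few calibration batches.
    acts, hooks = {}, []
    for name, module in ann.named_modules():
        if isinstance(module, torch.nn.ReLU):
            hooks.append(module.register_forward_hook(
                lambda m, inp, out, n=name: acts.setdefault(n, []).append(out.flatten())))
    for i, (x, _) in enumerate(loader):
        ann(x.to(device))
        if i == 4:  # a handful of batches is usually enough for calibration
            break
    for h in hooks:
        h.remove()

    # 3. Use a high percentile of the recorded activations as the threshold,
    #    which is then assigned to the corresponding spiking layer.
    return {n: torch.quantile(torch.cat(v), percentile / 100.0)
            for n, v in acts.items()}
```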

Trained ANN models

Trained SNN models

Issues

  • Sometimes the 'STDB' activation becomes unstable during training, leading to a drop in accuracy. The solution is to adjust the alpha and beta parameters or change the activation to 'Linear' in 'main.py'.
  • Another cause of accuracy drop can be the leak parameter. Setting 'leak_mem=1.0' in 'main.py' changes the leaky-integrate-and-fire (LIF) neuron to an integrate-and-fire (IF) neuron. (See the sketch after this list.)
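Both issues come down to two knobs in the spiking neuron model: the membrane leak and the alpha/beta parameters of the STDB surrogate gradient. The snippet below is a minimal sketch of the underlying equations, assuming a membrane update of the form `mem = leak_mem * mem + input` and an STDB-style gradient that decays as `alpha * exp(-beta * Δt)`; the function names and default values are illustrative and may not match 'main.py'.

```python
import torch

def lif_step(mem, inp, v_th, leak_mem):
    """One leaky-integrate-and-fire step. With leak_mem=1.0 the membrane keeps
    its full value between timesteps, i.e. the neuron reduces to plain
    integrate-and-fire (IF)."""
    mem = leak_mem * mem + inp        # leak_mem < 1.0 -> LIF, leak_mem == 1.0 -> IF
    spike = (mem >= v_th).float()     # fire where the threshold is crossed
    mem = mem - spike * v_th          # soft reset after a spike
    return mem, spike

def stdb_surrogate_grad(t_since_spike, alpha=0.3, beta=0.01):
    """STDB-style surrogate gradient: decays exponentially with the time elapsed
    since the neuron last spiked, scaled by alpha and beta. Lowering alpha (or
    raising beta) damps the gradient if training becomes unstable.
    Default values here are illustrative only."""
    return alpha * torch.exp(-beta * t_since_spike)
```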

Citation

If you use this code in your work, please cite the following paper:

@inproceedings{Rathi2020Enabling,
  title={Enabling Deep Spiking Neural Networks with Hybrid Conversion and Spike Timing Dependent Backpropagation},
  author={Nitin Rathi and Gopalakrishnan Srinivasan and Priyadarshini Panda and Kaushik Roy},
  booktitle={International Conference on Learning Representations},
  year={2020},
  url={https://openreview.net/forum?id=B1xSperKvH}
}