Welcome to SpikingJelly's documentation
SpikingJelly is a PyTorch-based framework for deep learning with Spiking Neural Networks (SNNs).
Version Notes
Since version 0.0.0.0.14, modules including clock_driven and event_driven have been renamed. Please refer to the tutorial Migrate From Old Versions.
Docs for different versions (latest is the developing version):
Installation
Note that SpikingJelly is based on PyTorch. Please make sure that PyTorch is installed in your environment before installing spikingjelly.
Odd version numbers are developing versions, continuously updated with the GitHub/OpenI repositories. Even version numbers are stable versions, available from PyPI.
Install the latest stable version from PyPI:
pip install spikingjelly
Install the latest developing version from the source code:
From GitHub:
git clone https://github.com/fangwei123456/spikingjelly.git
cd spikingjelly
python setup.py install
From OpenI:
git clone https://git.openi.org.cn/OpenI/spikingjelly.git
cd spikingjelly
python setup.py install
Modules Docs
Indices and tables
Citation and Publications
If you use SpikingJelly in your work, please cite it as follows:
@article{
doi:10.1126/sciadv.adi1480,
author = {Wei Fang and Yanqi Chen and Jianhao Ding and Zhaofei Yu and Timothée Masquelier and Ding Chen and Liwei Huang and Huihui Zhou and Guoqi Li and Yonghong Tian },
title = {SpikingJelly: An open-source machine learning infrastructure platform for spike-based intelligence},
journal = {Science Advances},
volume = {9},
number = {40},
pages = {eadi1480},
year = {2023},
doi = {10.1126/sciadv.adi1480},
URL = {https://www.science.org/doi/abs/10.1126/sciadv.adi1480},
eprint = {https://www.science.org/doi/pdf/10.1126/sciadv.adi1480},
abstract = {Spiking neural networks (SNNs) aim to realize brain-inspired intelligence on neuromorphic chips with high energy efficiency by introducing neural dynamics and spike properties. As the emerging spiking deep learning paradigm attracts increasing interest, traditional programming frameworks cannot meet the demands of the automatic differentiation, parallel computation acceleration, and high integration of processing neuromorphic datasets and deployment. In this work, we present the SpikingJelly framework to address the aforementioned dilemma. We contribute a full-stack toolkit for preprocessing neuromorphic datasets, building deep SNNs, optimizing their parameters, and deploying SNNs on neuromorphic chips. Compared to existing methods, the training of deep SNNs can be accelerated 11×, and the superior extensibility and flexibility of SpikingJelly enable users to accelerate custom models at low costs through multilevel inheritance and semiautomatic code generation. SpikingJelly paves the way for synthesizing truly energy-efficient SNN-based machine intelligence systems, which will enrich the ecology of neuromorphic computing. Motivation and introduction of the software framework SpikingJelly for spiking deep learning.}}
Publications using SpikingJelly can be found in Publications using SpikingJelly.
About
The Multimedia Learning Group, Institute of Digital Media, Peking University, and the Peng Cheng Laboratory are the main developers of SpikingJelly.
The list of developers can be found at Contributors.
Related Links
Welcome to SpikingJelly’s documentation
SpikingJelly is an open-source deep learning framework for Spiking Neural Networks (SNNs) based on PyTorch.
Notification
Since version 0.0.0.0.14, modules including clock_driven and event_driven have been renamed. Please refer to the tutorial Migrate From Old Versions.
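For code that only uses the renamed packages, updating the imports is typically the main change. The sketch below assumes the renaming reflected in the Modules Docs on this page (clock_driven to activation_based, and event_driven to timing_based):

# Old imports (before version 0.0.0.0.14):
# from spikingjelly.clock_driven import neuron, functional, surrogate

# New imports after the rename:
from spikingjelly.activation_based import neuron, functional, surrogate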
Docs for different versions (latest is the developing version):
Installation
Note that SpikingJelly is based on PyTorch. Please make sure that you have installed PyTorch before you install SpikingJelly.
Odd version numbers are developing versions, continuously updated with the GitHub/OpenI repositories. Even version numbers are stable versions, available from PyPI.
Install the latest stable version from PyPI:
pip install spikingjelly
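To install a specific even (stable) release instead, for example the 0.0.0.0.14 release mentioned above (assuming that release is published on PyPI), pin the version explicitly:
pip install spikingjelly==0.0.0.0.14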
Install the latest developing version from the source code:
From GitHub:
git clone https://github.com/fangwei123456/spikingjelly.git
cd spikingjelly
python setup.py install
From OpenI:
git clone https://git.openi.org.cn/OpenI/spikingjelly.git
cd spikingjelly
python setup.py install
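After installation, a minimal sketch along the following lines can be used to check that the package imports and runs. It assumes the default single-step mode and simply accumulates the output spikes of a one-layer LIF network over a few time steps:

import torch
import torch.nn as nn
from spikingjelly.activation_based import neuron, functional

# A tiny SNN: one linear layer followed by LIF spiking neurons
net = nn.Sequential(nn.Linear(28 * 28, 10), neuron.LIFNode())

x = torch.rand(4, 28 * 28)     # a batch of 4 flattened inputs
out_spikes = 0.
for t in range(8):             # run the network for T = 8 time steps
    out_spikes += net(x)       # accumulate the output spikes over time
functional.reset_net(net)      # reset neuron states before the next sample
print(out_spikes.shape)        # torch.Size([4, 10])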
- Migrate From Old Versions
- Basic Conception
- Container
- Neuron
- Surrogate Gradient Method
- Monitor
- Single Fully Connected Layer SNN to Classify MNIST
- Convolutional SNN to Classify FMNIST
- Neuromorphic Datasets Processing
- Classify DVS Gesture
- Recurrent Connection and Stateful Synapse
- Train large-scale SNNs
- STDP Learning
- ANN2SNN
- Legacy Tutorials
- Implement CUPY Neuron
- Convert to Lava for Loihi Deployment
Modules Docs
Indices and tables
Citation
If you use SpikingJelly in your work, please cite it as follows:
@article{
doi:10.1126/sciadv.adi1480,
author = {Wei Fang and Yanqi Chen and Jianhao Ding and Zhaofei Yu and Timothée Masquelier and Ding Chen and Liwei Huang and Huihui Zhou and Guoqi Li and Yonghong Tian },
title = {SpikingJelly: An open-source machine learning infrastructure platform for spike-based intelligence},
journal = {Science Advances},
volume = {9},
number = {40},
pages = {eadi1480},
year = {2023},
doi = {10.1126/sciadv.adi1480},
URL = {https://www.science.org/doi/abs/10.1126/sciadv.adi1480},
eprint = {https://www.science.org/doi/pdf/10.1126/sciadv.adi1480},
abstract = {Spiking neural networks (SNNs) aim to realize brain-inspired intelligence on neuromorphic chips with high energy efficiency by introducing neural dynamics and spike properties. As the emerging spiking deep learning paradigm attracts increasing interest, traditional programming frameworks cannot meet the demands of the automatic differentiation, parallel computation acceleration, and high integration of processing neuromorphic datasets and deployment. In this work, we present the SpikingJelly framework to address the aforementioned dilemma. We contribute a full-stack toolkit for preprocessing neuromorphic datasets, building deep SNNs, optimizing their parameters, and deploying SNNs on neuromorphic chips. Compared to existing methods, the training of deep SNNs can be accelerated 11×, and the superior extensibility and flexibility of SpikingJelly enable users to accelerate custom models at low costs through multilevel inheritance and semiautomatic code generation. SpikingJelly paves the way for synthesizing truly energy-efficient SNN-based machine intelligence systems, which will enrich the ecology of neuromorphic computing. Motivation and introduction of the software framework SpikingJelly for spiking deep learning.}}
Publications using SpikingJelly are recorded in Publications using SpikingJelly. If you use SpikingJelly in your paper, you can also add it to this table by pull request.
About
Multimedia Learning Group, Institute of Digital Media (NELVT), Peking University and Peng Cheng Laboratory are the main developers of SpikingJelly.
The list of developers can be found at contributors.
- spikingjelly.activation_based package
- Subpackages
- spikingjelly.activation_based.ann2snn package
- spikingjelly.activation_based.examples package
- Subpackages
- Submodules
- spikingjelly.activation_based.examples.A2C module
- spikingjelly.activation_based.examples.DQN_state module
- spikingjelly.activation_based.examples.PPO module
- spikingjelly.activation_based.examples.Spiking_A2C module
- spikingjelly.activation_based.examples.Spiking_DQN_state module
- spikingjelly.activation_based.examples.Spiking_PPO module
- spikingjelly.activation_based.examples.cifar10_r11_enabling_spikebased_backpropagation module
- spikingjelly.activation_based.examples.classify_dvsg module
- spikingjelly.activation_based.examples.conv_fashion_mnist module
- spikingjelly.activation_based.examples.dqn_cart_pole module
- spikingjelly.activation_based.examples.lif_fc_mnist module
- spikingjelly.activation_based.examples.rsnn_sequential_fmnist module
- spikingjelly.activation_based.examples.speechcommands module
- spikingjelly.activation_based.examples.spiking_lstm_sequential_mnist module
- spikingjelly.activation_based.examples.spiking_lstm_text module
- Module contents
- spikingjelly.activation_based.model package
- Submodules
- spikingjelly.activation_based.model.parametric_lif_net module
- spikingjelly.activation_based.model.sew_resnet module
- spikingjelly.activation_based.model.spiking_resnet module
- spikingjelly.activation_based.model.spiking_vgg module
- spikingjelly.activation_based.model.train_classify module
- spikingjelly.activation_based.model.train_imagenet module
- Module contents
- Module contents
- Subpackages
- spikingjelly.datasets package
- Submodules
- spikingjelly.datasets.asl_dvs module
- spikingjelly.datasets.cifar10_dvs module
- spikingjelly.datasets.dvs128_gesture module
- spikingjelly.datasets.es_imagenet module
- spikingjelly.datasets.n_caltech101 module
- spikingjelly.datasets.n_mnist module
- spikingjelly.datasets.nav_gesture module
- spikingjelly.datasets.shd module
- cal_fixed_frames_number_segment_index_shd()
- integrate_events_segment_to_frame_shd()
- integrate_events_by_fixed_frames_number_shd()
- integrate_events_file_to_frames_file_by_fixed_frames_number_shd()
- integrate_events_by_fixed_duration_shd()
- integrate_events_file_to_frames_file_by_fixed_duration_shd()
- custom_integrate_function_example()
- SpikingHeidelbergDigits
- SpikingSpeechCommands
- spikingjelly.datasets.speechcommands module
- Module contents
- save_every_frame_of_an_entire_DVS_dataset()
- save_as_pic()
- play_frame()
- load_aedat_v3()
- load_ATIS_bin()
- load_npz_frames()
- integrate_events_segment_to_frame()
- cal_fixed_frames_number_segment_index()
- integrate_events_by_fixed_frames_number()
- integrate_events_file_to_frames_file_by_fixed_frames_number()
- integrate_events_by_fixed_duration()
- integrate_events_file_to_frames_file_by_fixed_duration()
- save_frames_to_npz_and_print()
- create_same_directory_structure()
- split_to_train_test_set()
- pad_sequence_collate()
- padded_sequence_mask()
- NeuromorphicDatasetFolder
- NeuromorphicDatasetFolder.set_root_when_train_is_none()
- NeuromorphicDatasetFolder.resource_url_md5()
- NeuromorphicDatasetFolder.downloadable()
- NeuromorphicDatasetFolder.extract_downloaded_files()
- NeuromorphicDatasetFolder.create_events_np_files()
- NeuromorphicDatasetFolder.get_H_W()
- NeuromorphicDatasetFolder.load_events_np()
- random_temporal_delete()
- RandomTemporalDelete
- create_sub_dataset()
- spikingjelly.timing_based package
- spikingjelly.visualizing package