spikingjelly.clock_driven.neuron package

Module contents

spikingjelly.clock_driven.neuron.check_backend(backend: str)[source]
class spikingjelly.clock_driven.neuron.BaseNode(v_threshold: float = 1.0, v_reset: float = 0.0, surrogate_function: Callable = Sigmoid(alpha=4.0, spiking=True), detach_reset: bool = False)[source]

Bases: MemoryModule

Parameters
  • v_threshold (float) – threshold voltage of the neurons

  • v_reset (float) – reset voltage of the neurons. If not None, the voltage of neurons that just fired a spike will be set to v_reset; if None, v_threshold will be subtracted from their voltage instead

  • surrogate_function (Callable) – surrogate function used to replace the gradient of the spiking function during back-propagation

  • detach_reset (bool) – whether to detach the computation graph of the reset process

This class is the base class of differentiable spiking neurons.

abstract neuronal_charge(x: Tensor)[source]

Define the charging difference equation of the neuron. Subclasses must implement this function.

neuronal_fire()[source]

Calculate the output spikes of the neurons from their current membrane potential and threshold voltage.

neuronal_reset(spike)[source]

Reset the membrane potential according to the neurons' output spikes.

extra_repr()[source]
forward(x: Tensor)[source]

Parameters

x (torch.Tensor) – increment of voltage input to the neurons

Returns

output spikes of the neurons

Return type

torch.Tensor

Forward by the order of neuronal_charge, neuronal_fire, and neuronal_reset.
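
A minimal sketch (an illustration, not part of the library) of defining a new neuron: subclass BaseNode, implement neuronal_charge, and the inherited forward runs charge, fire, and reset in order. The class name SimpleDecayNode and its charge equation are made up for this example.

import torch
from spikingjelly.clock_driven import neuron, functional

class SimpleDecayNode(neuron.BaseNode):
    def __init__(self, decay: float = 0.9, **kwargs):
        super().__init__(**kwargs)
        self.decay = decay

    def neuronal_charge(self, x: torch.Tensor):
        # hypothetical charge equation: V[t] = decay * V[t-1] + X[t]
        self.v = self.decay * self.v + x

node = SimpleDecayNode()
for t in range(4):
    spike = node(torch.rand(8))   # each call runs charge -> fire -> reset for one time-step
functional.reset_net(node)        # clear the state before the next sample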

training: bool
class spikingjelly.clock_driven.neuron.AdaptiveBaseNode(v_threshold: float = 1.0, v_reset: float = 0.0, v_rest: float = 0.0, w_rest: float = 0, tau_w: float = 2.0, a: float = 0.0, b: float = 0.0, surrogate_function: Callable = Sigmoid(alpha=4.0, spiking=True), detach_reset: bool = False)[source]

Bases: BaseNode

neuronal_adaptation(spike)[source]
extra_repr()[source]
forward(x: torch.Tensor)

training: bool
class spikingjelly.clock_driven.neuron.IFNode(v_threshold: float = 1.0, v_reset: float = 0.0, surrogate_function: Callable = Sigmoid(alpha=4.0, spiking=True), detach_reset: bool = False, cupy_fp32_inference=False)[source]

Bases: BaseNode

Parameters
  • v_threshold (float) – threshold voltage of the neurons

  • v_reset (float) – reset voltage of the neurons. If not None, the voltage of neurons that just fired a spike will be set to v_reset; if None, v_threshold will be subtracted from their voltage instead

  • surrogate_function (Callable) – surrogate function used to replace the gradient of the spiking function during back-propagation

  • detach_reset (bool) – whether to detach the computation graph of the reset process

  • cupy_fp32_inference (bool) – if True and this module is in eval mode, uses float32, runs on a GPU, and cupy is installed, then this module will use cupy to accelerate

The Integrate-and-Fire neuron, which can be seen as an ideal integrator: its voltage stays constant when there is no input and does not decay as that of the LIF neuron. Its subthreshold neural dynamics are:

\[V[t] = V[t-1] + X[t]\]
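
A minimal usage sketch (assuming the usual helpers from spikingjelly.clock_driven): a single-step IFNode is called once per time-step, keeps its membrane potential in .v, and is reset before the next sample.

import torch
from spikingjelly.clock_driven import neuron, functional

if_node = neuron.IFNode(v_threshold=1.0, v_reset=0.0)
x = torch.rand(8) * 0.6              # voltage increments for 8 neurons
spikes = []
for t in range(4):                   # 4 time-steps, one call per step
    spikes.append(if_node(x))
print(torch.stack(spikes).shape)     # torch.Size([4, 8])
print(if_node.v)                     # membrane potential after the last step
functional.reset_net(if_node)        # reset the state before the next sample
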
neuronal_charge(x: Tensor)[source]
forward(x: Tensor)[source]
training: bool
class spikingjelly.clock_driven.neuron.MultiStepIFNode(v_threshold: float = 1.0, v_reset: float = 0.0, surrogate_function: Callable = Sigmoid(alpha=4.0, spiking=True), detach_reset: bool = False, backend='torch', lava_s_cale=64)[source]

Bases: IFNode

Parameters
  • v_threshold (float) – threshold voltage of the neurons

  • v_reset (float) – reset voltage of the neurons. If not None, the voltage of neurons that just fired a spike will be set to v_reset; if None, v_threshold will be subtracted from their voltage instead

  • surrogate_function (Callable) – surrogate function used to replace the gradient of the spiking function during back-propagation

  • detach_reset (bool) – whether to detach the computation graph of the reset process

  • backend (str) – which backend to use, 'torch' or 'cupy'. 'cupy' is faster but only supports GPU

The multi-step version of spikingjelly.clock_driven.neuron.IFNode.

Tip

The input of a multi-step neuron is x_seq with x_seq.shape = [T, *]. The membrane potential and spike at time-step t = T - 1 are available as .v and .spike, and the membrane potentials and spikes at all T time-steps are available as .v_seq and .spike_seq.

Tip

Read Propagation Pattern for more details about single-step and multi-step propagation.
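
A usage sketch of the multi-step node (shapes and attribute names follow the tips above): the whole sequence is passed in one call, and the per-step results are read from .spike_seq and .v_seq.

import torch
from spikingjelly.clock_driven import neuron, functional

T, N = 4, 8
ms_node = neuron.MultiStepIFNode(v_threshold=1.0, v_reset=0.0, backend='torch')
x_seq = torch.rand(T, N) * 0.6            # x_seq.shape = [T, *]
spike_seq = ms_node(x_seq)                # all T time-steps in one call
print(spike_seq.shape)                    # torch.Size([4, 8])
print(ms_node.v_seq.shape)                # membrane potentials at all T time-steps
print(torch.allclose(ms_node.v, ms_node.v_seq[-1]))  # .v holds the t = T - 1 potential
functional.reset_net(ms_node)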

forward(x_seq: Tensor)[source]
extra_repr()[source]
to_lava()[source]
reset()[source]
training: bool
class spikingjelly.clock_driven.neuron.LIFNode(tau: float = 2.0, decay_input: bool = True, v_threshold: float = 1.0, v_reset: float = 0.0, surrogate_function: Callable = Sigmoid(alpha=4.0, spiking=True), detach_reset: bool = False, cupy_fp32_inference=False)[source]

Bases: BaseNode

Parameters
  • tau (float) – membrane time constant

  • decay_input (bool) – whether the input will decay

  • v_threshold (float) – threshold voltage of the neurons

  • v_reset (float) – reset voltage of the neurons. If not None, the voltage of neurons that just fired a spike will be set to v_reset; if None, v_threshold will be subtracted from their voltage instead

  • surrogate_function (Callable) – surrogate function used to replace the gradient of the spiking function during back-propagation

  • detach_reset (bool) – whether to detach the computation graph of the reset process

  • cupy_fp32_inference (bool) – if True and this module is in eval mode, uses float32, runs on a GPU, and cupy is installed, then this module will use cupy to accelerate

The Leaky Integrate-and-Fire neuron, which can be seen as a leaky integrator. Its subthreshold neural dynamics are:

If decay_input == True:

\[V[t] = V[t-1] + \frac{1}{\tau}(X[t] - (V[t-1] - V_{reset}))\]

If decay_input == False:

\[V[t] = V[t-1] - \frac{1}{\tau}(V[t-1] - V_{reset}) + X[t]\]

Tip

If this module is in eval mode, uses float32, runs on a GPU, and cupy is installed, it will automatically use cupy to accelerate.
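
A small consistency check (my own sketch, not from the library docs) reproducing one sub-threshold step of the two update equations above:

import torch
from spikingjelly.clock_driven import neuron

x = torch.tensor([0.3])
tau, v_reset = 2.0, 0.0

lif = neuron.LIFNode(tau=tau, decay_input=True, v_reset=v_reset)
v0 = lif.v                                   # initial potential, equals v_reset
lif(x)                                       # one sub-threshold step, no spike fired
expected = v0 + (x - (v0 - v_reset)) / tau   # decay_input == True equation
print(torch.allclose(lif.v, expected))       # True

lif = neuron.LIFNode(tau=tau, decay_input=False, v_reset=v_reset)
v0 = lif.v
lif(x)
expected = v0 - (v0 - v_reset) / tau + x     # decay_input == False equation
print(torch.allclose(lif.v, expected))       # True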

extra_repr()[source]
neuronal_charge(x: Tensor)[source]
forward(x: Tensor)[source]
training: bool
class spikingjelly.clock_driven.neuron.MultiStepLIFNode(tau: float = 2.0, decay_input: bool = True, v_threshold: float = 1.0, v_reset: float = 0.0, surrogate_function: Callable = Sigmoid(alpha=4.0, spiking=True), detach_reset: bool = False, backend='torch', lava_s_cale=64)[source]

Bases: LIFNode

Parameters
  • tau (float) – membrane time constant

  • decay_input (bool) – whether the input will decay

  • v_threshold (float) – threshold voltage of the neurons

  • v_reset (float) – reset voltage of the neurons. If not None, the voltage of neurons that just fired a spike will be set to v_reset; if None, v_threshold will be subtracted from their voltage instead

  • surrogate_function (Callable) – surrogate function used to replace the gradient of the spiking function during back-propagation

  • detach_reset (bool) – whether to detach the computation graph of the reset process

  • backend (str) – which backend to use, 'torch' or 'cupy'. 'cupy' is faster but only supports GPU

The multi-step version of spikingjelly.clock_driven.neuron.LIFNode.

Tip

The input of a multi-step neuron is x_seq with x_seq.shape = [T, *]. The membrane potential and spike at time-step t = T - 1 are available as .v and .spike, and the membrane potentials and spikes at all T time-steps are available as .v_seq and .spike_seq.

Tip

Read Propagation Pattern for more details about single-step and multi-step propagation.
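
A sketch (my own consistency check) showing that one MultiStepLIFNode call over x_seq produces the same spikes as stepping a single-step LIFNode with the same hyper-parameters through the sequence:

import torch
from spikingjelly.clock_driven import neuron, functional

T, N = 4, 8
x_seq = torch.rand(T, N)

single = neuron.LIFNode(tau=2.0)
multi = neuron.MultiStepLIFNode(tau=2.0, backend='torch')

single_spikes = torch.stack([single(x_seq[t]) for t in range(T)])
multi_spikes = multi(x_seq)
print(torch.allclose(single_spikes, multi_spikes))  # True

functional.reset_net(single)
functional.reset_net(multi)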

forward(x_seq: Tensor)[source]
extra_repr()[source]
to_lava()[source]
reset()[source]
training: bool
class spikingjelly.clock_driven.neuron.ParametricLIFNode(init_tau: float = 2.0, decay_input: bool = True, v_threshold: float = 1.0, v_reset: float = 0.0, surrogate_function: Callable = Sigmoid(alpha=4.0, spiking=True), detach_reset: bool = False)[source]

Bases: BaseNode

Parameters
  • init_tau (float) – the initial value of the membrane time constant

  • decay_input (bool) – whether the input will decay

  • v_threshold (float) – threshold voltage of the neurons

  • v_reset (float) – reset voltage of the neurons. If not None, the voltage of neurons that just fired a spike will be set to v_reset; if None, v_threshold will be subtracted from their voltage instead

  • surrogate_function (Callable) – surrogate function used to replace the gradient of the spiking function during back-propagation

  • detach_reset (bool) – whether to detach the computation graph of the reset process

The Parametric Leaky Integrate-and-Fire (PLIF) neuron, proposed by Incorporating Learnable Membrane Time Constant to Enhance Learning of Spiking Neural Networks, which can be seen as a leaky integrator. Its subthreshold neural dynamics are:

If decay_input == True:

\[V[t] = V[t-1] + \frac{1}{\tau}(X[t] - (V[t-1] - V_{reset}))\]

If decay_input == False:

\[V[t] = V[t-1] - \frac{1}{\tau}(V[t-1] - V_{reset}) + X[t]\]

where \(\frac{1}{\tau} = {\rm Sigmoid}(w)\) and \(w\) is a learnable parameter.
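
A sketch of the learnable time constant (assuming the weight \(w\) is exposed through named_parameters(); the loss here is arbitrary and only used to show that gradients reach it):

import torch
from spikingjelly.clock_driven import neuron, functional

plif = neuron.ParametricLIFNode(init_tau=2.0)
x = torch.rand(8)

loss = plif(x).sum() + plif.v.sum()   # any differentiable function of the output and state
loss.backward()                       # surrogate gradients flow back to the time constant

for name, p in plif.named_parameters():
    # the single parameter defines tau through 1 / tau = Sigmoid(w)
    print(name, p.data, p.grad)
functional.reset_net(plif)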

extra_repr()[source]
neuronal_charge(x: Tensor)[source]
training: bool
class spikingjelly.clock_driven.neuron.MultiStepParametricLIFNode(init_tau: float = 2.0, decay_input: bool = True, v_threshold: float = 1.0, v_reset: float = 0.0, surrogate_function: Callable = Sigmoid(alpha=4.0, spiking=True), detach_reset: bool = False, backend='torch')[source]

Bases: ParametricLIFNode

Parameters
  • init_tau (float) – the initial value of the membrane time constant

  • decay_input (bool) – whether the input will decay

  • v_threshold (float) – threshold voltage of the neurons

  • v_reset (float) – reset voltage of the neurons. If not None, the voltage of neurons that just fired a spike will be set to v_reset; if None, v_threshold will be subtracted from their voltage instead

  • surrogate_function (Callable) – surrogate function used to replace the gradient of the spiking function during back-propagation

  • detach_reset (bool) – whether to detach the computation graph of the reset process

  • backend (str) – which backend to use, 'torch' or 'cupy'. 'cupy' is faster but only supports GPU

The multi-step Parametric Leaky Integrate-and-Fire (PLIF) neuron, proposed by Incorporating Learnable Membrane Time Constant to Enhance Learning of Spiking Neural Networks, which can be seen as a leaky integrator. Its subthreshold neural dynamics are:

\[V[t] = V[t-1] + \frac{1}{\tau}(X[t] - (V[t-1] - V_{reset}))\]

where \(\frac{1}{\tau} = {\rm Sigmoid}(w)\) and \(w\) is a learnable parameter.

Tip

The input of a multi-step neuron is x_seq with x_seq.shape = [T, *]. The membrane potential and spike at time-step t = T - 1 are available as .v and .spike, and the membrane potentials and spikes at all T time-steps are available as .v_seq and .spike_seq.

Tip

Read Propagation Pattern for more details about single-step and multi-step propagation.
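
A brief training sketch (the target rate and loss are made up for illustration) that optimizes the learnable time constant from a spike-rate loss over a whole sequence:

import torch
from spikingjelly.clock_driven import neuron, functional

T, N = 4, 8
plif = neuron.MultiStepParametricLIFNode(init_tau=2.0, backend='torch')
optimizer = torch.optim.SGD(plif.parameters(), lr=0.1)

x_seq = torch.rand(T, N)
target_rate = torch.full((N,), 0.5)       # hypothetical firing-rate target

spike_seq = plif(x_seq)                   # shape [T, N]
loss = ((spike_seq.mean(0) - target_rate) ** 2).mean()
loss.backward()                           # surrogate gradients reach the tau parameter
optimizer.step()
functional.reset_net(plif)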

forward(x_seq: Tensor)[source]
extra_repr()[source]
training: bool
class spikingjelly.clock_driven.neuron.QIFNode(tau: float = 2.0, v_c: float = 0.8, a0: float = 1.0, v_threshold: float = 1.0, v_rest: float = 0.0, v_reset: float = -0.1, surrogate_function: Callable = Sigmoid(alpha=4.0, spiking=True), detach_reset: bool = False)[source]

Bases: BaseNode

Parameters
  • tau (float) – membrane time constant

  • v_c (float) – critical voltage

  • a0 (float) –

  • v_threshold (float) – threshold voltage of the neurons

  • v_rest (float) – resting potential

  • v_reset (float) – reset voltage of the neurons. If not None, the voltage of neurons that just fired a spike will be set to v_reset; if None, v_threshold will be subtracted from their voltage instead

  • surrogate_function (Callable) – surrogate function used to replace the gradient of the spiking function during back-propagation

  • detach_reset (bool) – whether to detach the computation graph of the reset process

The Quadratic Integrate-and-Fire neuron, a nonlinear integrate-and-fire model that also approximates the Exponential Integrate-and-Fire model. Its subthreshold neural dynamics are:

\[V[t] = V[t-1] + \frac{1}{\tau}(X[t] + a_0 (V[t-1] - V_{rest})(V[t-1] - V_c))\]
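
A small sketch (my own check) reproducing one sub-threshold charging step with the equation above; note that the membrane potential starts at v_reset (-0.1 by default):

import torch
from spikingjelly.clock_driven import neuron

tau, v_c, a0, v_rest, v_reset = 2.0, 0.8, 1.0, 0.0, -0.1
qif = neuron.QIFNode(tau=tau, v_c=v_c, a0=a0, v_rest=v_rest, v_reset=v_reset)

x = torch.tensor([0.2])
v0 = qif.v                                # initial potential, equals v_reset
qif(x)                                    # one step, stays below threshold
expected = v0 + (x + a0 * (v0 - v_rest) * (v0 - v_c)) / tau
print(torch.allclose(qif.v, expected))    # True
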
extra_repr()[source]
neuronal_charge(x: Tensor)[source]
training: bool
class spikingjelly.clock_driven.neuron.EIFNode(tau: float = 2.0, delta_T: float = 1.0, theta_rh: float = 0.8, v_threshold: float = 1.0, v_rest: float = 0.0, v_reset: float = -0.1, surrogate_function: Callable = Sigmoid(alpha=4.0, spiking=True), detach_reset: bool = False)[source]

Bases: BaseNode

Parameters
  • tau (float) – membrane time constant

  • delta_T (float) – sharpness parameter

  • theta_rh (float) – rheobase threshold

  • v_threshold (float) – threshold voltage of the neurons

  • v_rest (float) – resting potential

  • v_reset (float) – reset voltage of the neurons. If not None, the voltage of neurons that just fired a spike will be set to v_reset; if None, v_threshold will be subtracted from their voltage instead

  • surrogate_function (Callable) – surrogate function used to replace the gradient of the spiking function during back-propagation

  • detach_reset (bool) – whether to detach the computation graph of the reset process

The Exponential Integrate-and-Fire neuron, a nonlinear integrate-and-fire model and a one-dimensional model derived from the Hodgkin-Huxley model. It degenerates to the LIF model when \(\Delta_T\to 0\). Its subthreshold neural dynamics are:

\[V[t] = V[t-1] + \frac{1}{\tau}\left(X[t] - (V[t-1] - V_{rest}) + \Delta_T\exp\left(\frac{V[t-1] - \theta_{rh}}{\Delta_T}\right)\right)\]
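
A small sketch (my own check) reproducing one sub-threshold charging step with the equation above:

import math
import torch
from spikingjelly.clock_driven import neuron

tau, delta_T, theta_rh, v_rest, v_reset = 2.0, 1.0, 0.8, 0.0, -0.1
eif = neuron.EIFNode(tau=tau, delta_T=delta_T, theta_rh=theta_rh,
                     v_rest=v_rest, v_reset=v_reset)

x = torch.tensor([0.2])
v0 = eif.v                                # initial potential, equals v_reset
eif(x)                                    # one step, stays below threshold
expected = v0 + (x - (v0 - v_rest)
                 + delta_T * math.exp((v0 - theta_rh) / delta_T)) / tau
print(torch.allclose(eif.v, expected))    # True
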
extra_repr()[source]
neuronal_charge(x: Tensor)[source]
training: bool
class spikingjelly.clock_driven.neuron.MultiStepEIFNode(tau: float = 2.0, delta_T: float = 1.0, theta_rh: float = 0.8, v_threshold: float = 1.0, v_rest: float = 0.0, v_reset: float = -0.1, surrogate_function: Callable = Sigmoid(alpha=4.0, spiking=True), detach_reset: bool = False, backend='torch')[source]

Bases: EIFNode

Parameters
  • tau (float) – membrane time constant

  • delta_T (float) – sharpness parameter

  • theta_rh (float) – rheobase threshold

  • v_threshold (float) – threshold voltage of the neurons

  • v_rest (float) – resting potential

  • v_reset (float) – reset voltage of the neurons. If not None, the voltage of neurons that just fired a spike will be set to v_reset; if None, v_threshold will be subtracted from their voltage instead

  • surrogate_function (Callable) – surrogate function used to replace the gradient of the spiking function during back-propagation

  • detach_reset (bool) – whether to detach the computation graph of the reset process

  • backend (str) – which backend to use, 'torch' or 'cupy'. 'cupy' is faster but only supports GPU

The multi-step version of spikingjelly.clock_driven.neuron.EIFNode.

Tip

The input of a multi-step neuron is x_seq with x_seq.shape = [T, *]. The membrane potential and spike at time-step t = T - 1 are available as .v and .spike, and the membrane potentials and spikes at all T time-steps are available as .v_seq and .spike_seq.

Tip

Read Propagation Pattern for more details about single-step and multi-step propagation.

forward(x_seq: Tensor)[source]
extra_repr()[source]
training: bool
class spikingjelly.clock_driven.neuron.GeneralNode(a: float, b: float, c: float = 0.0, v_threshold: float = 1.0, v_reset: float = 0.0, surrogate_function: Callable = Sigmoid(alpha=4.0, spiking=True), detach_reset: bool = False)[source]

Bases: BaseNode

neuronal_charge(x: Tensor)[source]
training: bool
class spikingjelly.clock_driven.neuron.MultiStepGeneralNode(a: float, b: float, c: float, v_threshold: float = 1.0, v_reset: float = 0.0, surrogate_function: Callable = Sigmoid(alpha=4.0, spiking=True), detach_reset: bool = False, backend='torch')[source]

Bases: GeneralNode

forward(x_seq: Tensor)[source]
extra_repr()[source]
training: bool
class spikingjelly.clock_driven.neuron.LIAFNode(act: Callable, threshold_related: bool, *args, **kwargs)[source]

Bases: LIFNode

Parameters
  • act (Callable) – the activation function

  • threshold_related (bool) – whether the neuron uses the threshold-related (TR) mode. If True, y = act(h - v_th); otherwise y = act(h)

Other parameters in *args and **kwargs are the same as those of LIFNode.

The LIAF neuron proposed in LIAF-Net: Leaky Integrate and Analog Fire Network for Lightweight and Efficient Spatiotemporal Information Processing.

Warning

The outputs of this neuron are not binary spikes.
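
A brief sketch illustrating the warning: with a ReLU activation and TR mode off (both chosen arbitrarily for this example), the outputs are real-valued activations rather than binary spikes.

import torch
from spikingjelly.clock_driven import neuron, functional

liaf = neuron.LIAFNode(act=torch.nn.ReLU(), threshold_related=False)
x = torch.rand(8)

y = liaf(x)
print(y)                                       # real-valued outputs, generally not in {0, 1}
print(set(y.unique().tolist()) <= {0.0, 1.0})  # usually False
functional.reset_net(liaf)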

training: bool
forward(x: Tensor)[source]