no code implementations • 22 Mar 2024 • Yoshihide Sawada, Ryuji Saiin, Kazuma Suetake
Recently, the number of parameters in DNNs has increased explosively, as exemplified by large language models (LLMs), making inference on small-scale computers increasingly difficult.
no code implementations • 4 Oct 2023 • Ryuji Saiin, Tomoya Shirakawa, Sota Yoshihara, Yoshihide Sawada, Hiroyuki Kusumoto
Our proposed method solves these problems: SAF halves the number of operations during the forward process, and it can be proven theoretically that SAF is consistent with both the Spike Representation and OTTT.
no code implementations • 3 Feb 2023 • Kazuma Suetake, Takuya Ushimaru, Ryuji Saiin, Yoshihide Sawada
Spiking neural networks (SNNs) are energy-efficient neural networks because they communicate via sparse, binary spikes.
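To make the spiking mechanism concrete, the sketch below shows a generic discrete-time leaky integrate-and-fire (LIF) neuron, the standard building block of SNNs: activations are 0/1 spikes, so downstream computation reduces to sparse additions, which is the source of the energy efficiency mentioned above. All names and constants here are illustrative assumptions, not the model from any of the listed papers.

```python
import numpy as np

def lif_step(v, x, decay=0.9, threshold=1.0):
    """One discrete-time update of a layer of LIF neurons.

    v: membrane potentials; x: weighted input current.
    Returns (binary spikes, updated potentials); a neuron that
    spikes has its potential hard-reset to zero.
    """
    v = decay * v + x                          # leaky integration
    spikes = (v >= threshold).astype(x.dtype)  # fire where threshold crossed
    v = v * (1.0 - spikes)                     # hard reset after a spike
    return spikes, v

# Drive four neurons with random input for a few time steps.
rng = np.random.default_rng(0)
v = np.zeros(4)
for _ in range(5):
    s, v = lif_step(v, rng.uniform(0.0, 0.5, size=4))
```

Because the output `s` is binary, a downstream layer can replace each multiply-accumulate with an accumulate only where a spike occurred, which is the usual argument for SNN power savings.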
no code implementations • 3 Mar 2022 • Shin-ichi Ikegawa, Ryuji Saiin, Yoshihide Sawada, Naotake Natori
Biologically inspired spiking neural networks (SNNs) are widely used to achieve ultra-low power consumption.
no code implementations • 26 Jan 2022 • Kazuma Suetake, Shin-ichi Ikegawa, Ryuji Saiin, Yoshihide Sawada
To solve these problems, we propose a single-step spiking neural network (S$^3$NN), an energy-efficient neural network with low computational cost and high precision.
1 code implementation • 23 May 2021 • Takashi Furuya, Kazuma Suetake, Koichi Taniguchi, Hiroyuki Kusumoto, Ryuji Saiin, Tomohiro Daimon
Recurrent neural networks (RNNs) are a class of neural networks used for sequential tasks.
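For readers unfamiliar with the class, a minimal Elman-style recurrent cell illustrates how an RNN carries state across a sequence via the update h_t = tanh(x_t W_x + h_{t-1} W_h + b). This is a generic textbook sketch; the weight names and sizes are illustrative assumptions, not the architecture studied in the paper above.

```python
import numpy as np

def rnn_step(h, x, W_x, W_h, b):
    """One recurrent update: new hidden state from input x and previous h."""
    return np.tanh(x @ W_x + h @ W_h + b)

# Illustrative dimensions: 3-dim inputs, 5-dim hidden state.
rng = np.random.default_rng(0)
d_in, d_h = 3, 5
W_x = rng.normal(scale=0.1, size=(d_in, d_h))
W_h = rng.normal(scale=0.1, size=(d_h, d_h))
b = np.zeros(d_h)

h = np.zeros(d_h)
for t in range(4):                   # unroll over a short input sequence
    h = rnn_step(h, rng.normal(size=d_in), W_x, W_h, b)
```

The same weights are reused at every time step, which is what lets the network process sequences of arbitrary length with a fixed parameter count.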