2 code implementations • Cancer Imaging 2024 • Elmira Yazdani, Najme Karamzadeh-Ziarati, Seyyed Saeid Cheshmi, Mahdi Sadeghi, Parham Geramifar, Habibeh Vosoughi, Mahmood Kazemi Jahromi, Saeed Reza Kheradpisheh
Prostate-specific membrane antigen (PSMA) PET/CT imaging is widely used for quantitative image analysis, especially in radioligand therapy (RLT) for metastatic castration-resistant prostate cancer (mCRPC).
no code implementations • 17 Aug 2023 • Neda Rahimpour Anaraki, Alireza Azadbakht, Maryam Tahmasbi, Hadi Farahani, Saeed Reza Kheradpisheh, Alireza Javaheri
In this paper, we focus on deep learning and propose three geometric post-processing methods that improve the quality of the results.
no code implementations • 7 Jun 2023 • Arsham Gholamzadeh Khoee, Alireza Javaheri, Saeed Reza Kheradpisheh, Mohammad Ganjtabesh
The human brain constantly learns and rapidly adapts to new situations by integrating acquired knowledge and experiences into memory.
1 code implementation • 23 Oct 2022 • Alireza Azadbakht, Saeed Reza Kheradpisheh, Ismail Khalfaoui-Hassani, Timothée Masquelier
However, most SOTA networks are too large for edge computing.
1 code implementation • 27 Sep 2021 • Saeed Reza Kheradpisheh, Maryam Mirsadeghi, Timothée Masquelier
By treating an IF neuron with rate coding as an approximation of ReLU, we backpropagate the SNN's error through the proxy ANN to update the shared weights, simply by replacing the ANN's final output with that of the SNN.
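The proxy-learning step described in this excerpt can be sketched in plain NumPy. This is a hedged illustration, not the paper's implementation: `if_rate`, the layer sizes, and the learning rate are stand-ins chosen only to make the idea concrete.

```python
import numpy as np

rng = np.random.default_rng(0)

def if_rate(I, T=100, theta=1.0):
    """Firing rate of non-leaky IF neurons driven by a constant current I
    for T steps; approximates relu(I) for I in [0, 1] and saturates at
    one spike per step for larger currents."""
    v = np.zeros_like(I, dtype=float)
    s = np.zeros_like(I, dtype=float)
    for _ in range(T):
        v = v + I                       # integrate the input current
        fired = v >= theta              # spike when the threshold is reached
        s += fired
        v = np.where(fired, v - theta, v)  # reset by subtraction
    return s / T

# Shared weights used by both the SNN and its proxy ANN (toy sizes).
W1 = rng.normal(scale=0.3, size=(8, 4))
W2 = rng.normal(scale=0.3, size=(3, 8))
x = rng.uniform(0.0, 1.0, 4)
target = np.array([1.0, 0.0, 0.0])

# Proxy ANN forward pass (ReLU)...
h_ann = np.maximum(W1 @ x, 0.0)
# ...and the SNN forward pass with the same weights (rate-coded IF).
h_snn = if_rate(W1 @ x)
y_snn = if_rate(W2 @ h_snn)

# Replace the ANN's final output with the SNN's, then backpropagate the
# resulting error through the ANN graph to update the shared weights.
err = y_snn - target
dW2 = np.outer(err, h_ann)
dh = (W2.T @ err) * (h_ann > 0)         # ReLU derivative in the proxy
dW1 = np.outer(dh, x)
lr = 0.1
W1 -= lr * dW1
W2 -= lr * dW2
```

The key design point is that the spiking network never needs its own gradient: the ANN supplies the differentiable graph, and the SNN supplies the output error.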
1 code implementation • 12 Sep 2021 • Hafez Ghaemi, Erfan Mirzaei, Mahbod Nouri, Saeed Reza Kheradpisheh
Brain-inspired computation and information processing alongside compatibility with neuromorphic hardware have made spiking neural networks (SNN) a promising method for solving learning tasks in machine learning (ML).
no code implementations • 31 Aug 2021 • Maryam Mirsadeghi, Majid Shalchian, Saeed Reza Kheradpisheh, Timothée Masquelier
To do so, we consider a convolutional SNN (CSNN) with two sets of weights: real-valued weights, which are updated in the backward pass, and their signs (binary weights), which are used in the feedforward pass.
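The two-weight-set scheme amounts to a straight-through binarization loop. A minimal sketch, with the caveat that the ReLU forward pass here merely stands in for the actual spiking dynamics, and the sizes and learning rate are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

# Real-valued weights: kept and updated in the backward pass only.
W_real = rng.normal(scale=0.5, size=(3, 5))
x = rng.uniform(size=5)
target = np.array([0.0, 1.0, 0.0])

for _ in range(20):
    # Forward pass uses only the signs of the weights (binary weights).
    W_bin = np.sign(W_real)
    y = np.maximum(W_bin @ x, 0.0)       # stand-in for the spiking forward pass
    # Backward pass (straight-through): the gradient computed with the
    # binary forward pass is applied to the real-valued weights.
    err = y - target
    grad = np.outer(err * (y > 0), x)
    W_real -= 0.1 * grad
```

The real-valued weights accumulate small updates, and the binary weights only change when an accumulated update flips a sign, which keeps the feedforward pass cheap on binary hardware.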
Ranked #11 on Image Classification on Fashion-MNIST
1 code implementation • 8 Jul 2020 • Saeed Reza Kheradpisheh, Maryam Mirsadeghi, Timothée Masquelier
We recently proposed the S4NN algorithm, essentially an adaptation of backpropagation to multilayer spiking neural networks that use simple non-leaky integrate-and-fire neurons and a form of temporal coding known as time-to-first-spike coding.
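The time-to-first-spike coding mentioned above maps each input intensity to a single spike latency, with stronger inputs firing earlier. A minimal sketch (the rounding to a discrete time grid of `T` steps is an assumption for illustration):

```python
import numpy as np

def ttfs_encode(x, T=100):
    """Time-to-first-spike coding: each input value in [0, 1] becomes a
    single spike whose latency decreases with intensity, so the strongest
    inputs fire first (t = 0) and the weakest fire last (t = T - 1)."""
    return np.round((1.0 - np.asarray(x, dtype=float)) * (T - 1)).astype(int)

times = ttfs_encode([1.0, 0.5, 0.0])   # latencies for strong, medium, weak inputs
```

Because each input contributes at most one spike, the downstream network's activity — and hence its energy cost — stays very sparse.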
no code implementations • 9 Nov 2019 • Aref Moqadam Mehr, Saeed Reza Kheradpisheh, Hadi Farahani
Biological neurons use spikes to process and learn temporally dynamic inputs in an energy- and computationally efficient way.
1 code implementation • 21 Oct 2019 • Saeed Reza Kheradpisheh, Timothée Masquelier
In particular, in the readout layer, the first neuron to fire determines the class of the stimulus.
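The first-to-fire readout can be sketched with non-leaky IF output neurons that integrate single input spikes given as spike times. This is a hedged illustration of the decision rule, not the paper's code; ties within a time step are broken here by the highest potential, which is an assumption:

```python
import numpy as np

def first_spike_readout(spike_times, W, T=100, theta=1.0):
    """Integrate single input spikes (given as integer spike times) with
    non-leaky IF output neurons; the first neuron to reach the threshold
    theta determines the class. Returns -1 if no neuron fires within T."""
    v = np.zeros(W.shape[0])
    spike_times = np.asarray(spike_times)
    for t in range(T):
        v += W @ (spike_times == t)      # add weights of inputs spiking at t
        if (v >= theta).any():
            return int(np.argmax(v))     # ties broken by highest potential
    return -1

# Toy usage: input 0 spikes early (t=10), input 1 late (t=89); with
# identity weights, output neuron 0 crosses threshold first.
W = np.eye(2)
pred = first_spike_readout([10, 89], W, theta=0.5)
```

Since the decision is made at the first output spike, classification latency adapts to the input: easy stimuli are decided early, without waiting for the full simulation window.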
2 code implementations • 22 Apr 2018 • Amirhossein Tavanaei, Masoud Ghodrati, Saeed Reza Kheradpisheh, Timothee Masquelier, Anthony S. Maida
In this approach, a deep (multilayer) artificial neural network (ANN) is trained in a supervised manner using backpropagation.
no code implementations • 1 Mar 2018 • Timothée Masquelier, Saeed Reza Kheradpisheh
Here we investigated how a single spiking neuron can optimally respond to one given pattern (localist coding), or to any one of several patterns (distributed coding, i.e., the neuron's response alone is ambiguous, but the pattern's identity can be inferred from the responses of multiple neurons), but not to random inputs.
no code implementations • 25 May 2017 • Milad Mozafari, Saeed Reza Kheradpisheh, Timothée Masquelier, Abbas Nowzari-Dalini, Mohammad Ganjtabesh
In the highest layers, each neuron was assigned to an object category, and the stimulus category was assumed to be that of the first neuron to fire.
no code implementations • 29 Mar 2017 • Matin N. Ashtiani, Saeed Reza Kheradpisheh, Timothée Masquelier, Mohammad Ganjtabesh
This means that low-frequency information is sufficient for the superordinate level, but not for the basic and subordinate levels.
1 code implementation • 4 Nov 2016 • Saeed Reza Kheradpisheh, Mohammad Ganjtabesh, Simon J. Thorpe, Timothée Masquelier
Coding was very sparse, with only a few thousand spikes per image, and in some cases the object category could be inferred reasonably well from the activity of a single higher-order neuron.
no code implementations • 21 Apr 2016 • Saeed Reza Kheradpisheh, Masoud Ghodrati, Mohammad Ganjtabesh, Timothée Masquelier
This feed-forward architecture has inspired a new generation of bio-inspired computer vision systems called deep convolutional neural networks (DCNNs), which are currently the best-performing algorithms for object recognition in natural images.
no code implementations • 17 Aug 2015 • Saeed Reza Kheradpisheh, Masoud Ghodrati, Mohammad Ganjtabesh, Timothée Masquelier
Deep convolutional neural networks (DCNNs) have attracted much attention recently and have been shown to recognize thousands of object categories in natural image databases.
no code implementations • 15 Apr 2015 • Saeed Reza Kheradpisheh, Mohammad Ganjtabesh, Timothée Masquelier
The retinal image of surrounding objects varies tremendously due to changes in position, size, pose, illumination conditions, background context, occlusion, noise, and nonrigid deformations.