Author:

Zongxia Wang | Naigong Yu | Firdaous Essaf

Abstract:

The neuron model serves as the foundation for building a neural network. The goal of neuron modeling is to strike a tradeoff between biological meaningfulness and implementation cost, so as to build a bridge between brain science knowledge and brain-like neuromorphic computing. Unlike previous neuron models with linear static synapses, the focus of this research is to model neurons with relatively detailed nonlinear dynamic synapses. First, a universal soma-synapses neuron (SSN) is proposed. It contains a soma represented by a leaky integrate-and-fire neuron and multiple excitatory and inhibitory synapses based on ion channel dynamics. Short-term plasticity and spike-timing-dependent plasticity linked to biological microscopic mechanisms are also included in the synaptic models. Then, the SSN is implemented on a field-programmable gate array (FPGA). The performance of each component in the SSN is analyzed and evaluated. Finally, a neural network (SSNN) composed of SSNs is deployed on the FPGA and used for testing. Experimental results show that the stimulus-response characteristics of the SSN are consistent with electrophysiological findings from biological neurons, and the activities of the SSNN show promise. We provide a prototype for embedded neuromorphic computing with a small number of relatively detailed neuron models.
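The abstract gives no equations or code; the sketch below is only an illustrative software analogue (not the authors' SSN or its FPGA design) of the ingredients it names: a leaky integrate-and-fire soma driven by one conductance-based excitatory synapse with Tsodyks-Markram-style short-term plasticity. All parameter values, the regular 100 Hz input train, and the forward-Euler integration are assumptions made for this example.

```python
# Illustrative sketch only: LIF soma + conductance-based synapse with
# short-term plasticity (Tsodyks-Markram style). Parameters are assumed,
# not taken from the paper or its FPGA implementation.

dt = 0.1            # integration step (ms)
T = 300.0           # simulated time (ms)
tau_m = 20.0        # membrane time constant (ms)
v_rest = -70.0      # resting / reset potential (mV)
v_th = -54.0        # firing threshold (mV)
e_exc = 0.0         # excitatory reversal potential (mV)
tau_g = 5.0         # synaptic conductance decay (ms)
g_inc = 8.0         # conductance jump per presynaptic spike (relative to leak)

U = 0.3             # baseline utilization of synaptic resources
tau_rec = 100.0     # recovery from depression (ms)
tau_fac = 50.0      # decay of facilitation (ms)

steps = int(T / dt)
pre_every = round(10.0 / dt)   # regular 100 Hz presynaptic spike train

v, g, x, u = v_rest, 0.0, 1.0, U
efficacies, out_spikes = [], []

for i in range(steps):
    t = i * dt
    # Short-term plasticity state relaxes between presynaptic spikes
    x += dt * (1.0 - x) / tau_rec      # resources recover toward 1
    u += dt * (U - u) / tau_fac        # utilization decays toward U
    if i % pre_every == 0:             # presynaptic spike arrives
        u += U * (1.0 - u)             # facilitation
        efficacies.append(u * x)       # effective release on this spike
        g += g_inc * u * x             # conductance scaled by available resources
        x -= u * x                     # resources consumed (depression)
    g -= dt * g / tau_g                # exponential conductance decay

    # LIF soma: leak plus conductance-based synaptic current (forward Euler)
    v += dt * (-(v - v_rest) - g * (v - e_exc)) / tau_m
    if v >= v_th:
        out_spikes.append(t)
        v = v_rest

print("synaptic efficacy over the train:", [round(e, 3) for e in efficacies[:6]])
print("output spike times (ms):", [round(s, 1) for s in out_spikes])
```

Running this shows the frequency-dependent behavior the synapse model is meant to capture: the per-spike efficacy u*x drops along the train (depression), so the soma responds strongly at the start of the stimulus and then adapts.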

Keyword:

short-term plasticity; FPGA; ion channel; LIF; neuromorphic computing; spike-timing-dependent plasticity

Reprint Author's Address:

Email:

Related Keywords:

Source:

Concurrency and Computation: Practice and Experience

ISSN: 1532-0626

Year: 2023

Issue: 27

Volume: 35

Page: n/a-n/a

2.000 (JCR@2022)

Cited Count:

WoS CC Cited Count: 0

SCOPUS Cited Count:

ESI Highly Cited Papers on the List: 0

WanFang Cited Count: -1

Chinese Cited Count:

30 Days PV: 3

Affiliated Colleges:
