Analysis and Design of a k-Winners-Take-All Model With a Single State Variable and the Heaviside Step Activation Function

Cited by: 76
Author
Wang, Jun [1,2]
Affiliations
[1] Chinese Univ Hong Kong, Dept Mech & Automat Engn, Shatin, Hong Kong, Peoples R China
[2] Shanghai Jiao Tong Univ, Dept Comp Sci & Engn, Shanghai 200052, Peoples R China
Source
IEEE TRANSACTIONS ON NEURAL NETWORKS, 2010, Vol. 21, No. 9
Keywords
Global stability; k-winners-take-all; optimization; recurrent neural network; RECURRENT NEURAL-NETWORK; RANK-ORDER; CIRCUIT; COMPUTATION; KWTA; ARCHITECTURES; DYNAMICS;
DOI
10.1109/TNN.2010.2052631
Chinese Library Classification
TP18 [Theory of Artificial Intelligence]
Discipline Classification Code
081104; 0812; 0835; 1405
Abstract
This paper presents a k-winners-take-all (kWTA) neural network with a single state variable and a hard-limiting activation function. First, following several kWTA problem formulations, related existing kWTA networks are reviewed. Then, the kWTA model with a single state variable and a Heaviside step activation function is described, and its global stability and finite-time convergence are proven, with upper and lower bounds derived. In addition, the initial state estimation and a discrete-time version of the kWTA model are discussed. Furthermore, two selected applications of the kWTA model, to parallel sorting and rank-order filtering, are discussed. Finally, simulation results demonstrate the effectiveness and performance of the kWTA model.
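The abstract does not reproduce the model equations, but its description (a single scalar state, Heaviside step outputs, a discrete-time version, and selection of the k largest inputs) admits a minimal sketch. The Python code below assumes an illustrative single-state form in which a scalar threshold x is driven by the mismatch between the number of active outputs and k, with outputs y_i = step(u_i - x); the function name kwta_single_state, the scaling constant epsilon, the Euler step size, and the stopping rule are assumptions for illustration, not the paper's exact formulation.

import numpy as np

def heaviside(z):
    # Hard-limiting (Heaviside step) activation: 1 where z > 0, 0 elsewhere.
    return (z > 0).astype(float)

def kwta_single_state(u, k, x0=0.0, epsilon=1.0, dt=1e-4, max_iter=100000):
    # Euler-discretized single-state kWTA dynamics (assumed illustrative form):
    #   y_i = step(u_i - x),   epsilon * dx/dt = sum_i y_i - k
    # The scalar x acts as an adaptive threshold: it rises while more than k
    # outputs are active and falls while fewer are active. Inputs are assumed
    # distinct and separated by more than one Euler step.
    u = np.asarray(u, dtype=float)
    x = float(x0)
    for _ in range(max_iter):
        y = heaviside(u - x)
        active = int(np.sum(y))
        if active == k:
            break                      # exactly k winners: x has settled
        x += dt * (active - k) / epsilon
    return heaviside(u - x), x

# Usage sketch: rank-order selection of the k = 3 largest inputs.
rng = np.random.default_rng(0)
u = rng.uniform(0.0, 1.0, size=10)
y, x = kwta_single_state(u, k=3, x0=float(np.mean(u)))
print("kWTA winners :", np.flatnonzero(y))            # indices with y_i = 1
print("sorting check:", np.sort(np.argsort(u)[-3:]))  # reference via full sort

On convergence, x settles between the k-th and (k+1)-th largest inputs, which is consistent with the parallel sorting and rank-order filtering applications mentioned in the abstract.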
Pages: 1496 - 1506
Number of pages: 11
Related Papers
15 entries in total
  • [1] Analysis and design of a distributed k-winners-take-all model
    Zhang, Yinyan
    Li, Shuai
    Xu, Bin
    Yang, Yong
    AUTOMATICA, 2020, 115
  • [2] Single-state distributed k-winners-take-all neural network model
    Zhang, Yinyan
    Li, Shuai
    Zhou, Xuefeng
    Weng, Jian
    Geng, Guanggang
    INFORMATION SCIENCES, 2023, 647
  • [3] Design of a K-Winners-Take-All Model With a Binary Spike Train
    Tymoshchuk, Pavlo V.
    Wunsch, Donald C., II
    IEEE TRANSACTIONS ON CYBERNETICS, 2019, 49 (08) : 3131 - 3140
  • [4] ON THE ROBUST DESIGN OF K-WINNERS-TAKE-ALL NETWORKS
    Perfetti, R.
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS II-ANALOG AND DIGITAL SIGNAL PROCESSING, 1995, 42 (01): 55 - 58
  • [5] Parametric Sensitivity and Scalability of k-Winners-Take-All Networks with a Single State Variable and Infinity-Gain Activation Functions
    Wang, Jun
    Guo, Zhishan
    ADVANCES IN NEURAL NETWORKS - ISNN 2010, PT 1, PROCEEDINGS, 2010, 6063 : 77 - 85
  • [6] A Recurrent Neural Network with a Tunable Activation Function for Solving K-Winners-Take-All
    Miao Peng
    Shen Yanjun
    Hou Jianshu
    Shen Yi
    2014 33RD CHINESE CONTROL CONFERENCE (CCC), 2014: 4957 - 4962
  • [7] An Analog Circuit Design for k-Winners-Take-All Operations
    Liu, Xiaoyang
    Wang, Jun
    NEURAL INFORMATION PROCESSING (ICONIP 2018), PT VII, 2018, 11307 : 666 - 675
  • [8] A model of analogue K-winners-take-all neural circuit
    Tymoshchuk, Pavlo V.
    NEURAL NETWORKS, 2013, 42 : 44 - 61
  • [9] Two k-winners-take-all networks with discontinuous activation functions
    Liu, Qingshan
    Wang, Jun
    NEURAL NETWORKS, 2008, 21 (2-3) : 406 - 413
  • [10] A Distributed k-Winners-Take-All Model With Binary Consensus Protocols
    Wang, Xiaoxuan
    Yang, Shaofu
    Guo, Zhenyuan
    Ge, Quanbo
    Wen, Shiping
    Huang, Tingwen
    IEEE TRANSACTIONS ON CYBERNETICS, 2024, 54 (05) : 3327 - 3337