An Analog-Digital Hardware for Parzen-Based Nonparametric Probability Density Estimation

Times Cited: 0
Authors
Stankovic, Djordje [1 ,2 ]
Draganic, Andjela [1 ]
Lekic, Nedjeljko [1 ]
Ioana, Cornel [2 ]
Orovic, Irena [1 ,3 ]
Affiliations
[1] Univ Montenegro, Fac Elect Engn, Podgorica 81400, Montenegro
[2] Grenoble Inst Technol Grenoble INP, GIPSA Lab, F-38402 Grenoble, France
[3] Univ Lusofona, COPELABS, P-1700097 Lisbon, Portugal
Keywords
Estimation; Hardware; Probability density function; Histograms; Kernel; Computational modeling; Data models; Analog hardware; non-parametric approach; Parzen window; probability density estimation;
DOI
10.1109/ACCESS.2024.3446370
Chinese Library Classification
TP [Automation technology; computer technology];
Discipline Classification Code
0812
Abstract
Probability estimation measures the likelihood of different outcomes in a statistical context. It commonly involves estimating either the parameters or the entire distribution of a random variable. Parametric approaches, which assume a specific functional form for the data distribution, have been used in various fields, particularly in computational statistics for modeling and simulating physical phenomena. However, non-parametric methods have gained prominence, especially in machine learning and signal processing. These methods estimate or model probability density functions without relying on predefined parametric forms, which becomes crucial when faced with unknown or complex distributions, especially if parametric assumptions do not hold. This paper deals with a non-parametric method based on the Parzen window for probability density function estimation, a versatile approach applicable to univariate and multivariate data. Given a sufficient amount of data, this method provides reliable estimates while remaining well suited to implementation. Considering the advantages of hardware implementations over software solutions, this paper introduces analog-digital hardware for the Parzen approach. The proposed solution avoids the need for sorting operations, which are typically challenging to implement in hardware. Simulation in PSpice (OrCAD version 22.1) shows that the required processing time is under 420 ns.
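For orientation, the Parzen-window density estimate described in the abstract can be sketched in a few lines of software. This is a minimal, illustrative Python implementation with a Gaussian kernel; the function name, bandwidth value, and test data are assumptions for illustration and do not reflect the paper's hardware design.

```python
import math
import random

def parzen_estimate(x, samples, h):
    """Parzen-window PDF estimate at point x.

    Averages a Gaussian kernel of bandwidth h centered on each sample:
    f_hat(x) = (1 / (n * h)) * sum_i K((x - x_i) / h).
    """
    n = len(samples)
    total = sum(math.exp(-0.5 * ((x - s) / h) ** 2) for s in samples)
    return total / (n * h * math.sqrt(2.0 * math.pi))

# Illustrative usage: estimate a standard-normal density from drawn samples.
random.seed(0)
data = [random.gauss(0.0, 1.0) for _ in range(5000)]
est = parzen_estimate(0.0, data, h=0.3)  # should be near the true peak value
```

Note that the software version above evaluates every sample for every query point; the appeal of the hardware architecture discussed in the paper is that such kernel accumulation can be parallelized without the sorting step that histogram-style estimators often require.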
Pages: 116226-116237
Page count: 12
Related Papers
25 in total
[1]   The 2016 Data Challenge of the American Statistical Association [J].
Amjadi, Roya ;
Martinez, Wendy .
COMPUTATIONAL STATISTICS, 2021, 36 (03) :1553-1560
[2]  
Batcher K. E., 1968, SPRING JOINT COMP C, P307, DOI 10.1145/1468075.1468121
[3]   Using Neural Networks with Routine Health Records to Identify Suicide Risk: Feasibility Study [J].
DelPozo-Banos, Marcos ;
John, Ann ;
Petkov, Nicolai ;
Berridge, Damon Mark ;
Southern, Kate ;
Lloyd, Keith ;
Jones, Caroline ;
Spencer, Sarah ;
Manuel Travieso, Carlos .
JMIR MENTAL HEALTH, 2018, 5 (02)
[4]   Efficient Hardware Architectures for Accelerating Deep Neural Networks: Survey [J].
Dhilleswararao, Pudi ;
Boppu, Srinivas ;
Manikandan, M. Sabarimalai ;
Cenkeramaddi, Linga Reddy .
IEEE ACCESS, 2022, 10 :131788-131828
[5]   A Parzen-Window-Kernel-Based CFAR Algorithm for Ship Detection in SAR Images [J].
Gao, Gui .
IEEE GEOSCIENCE AND REMOTE SENSING LETTERS, 2011, 8 (03) :557-561
[6]   A review of Hidden Markov models and Recurrent Neural Networks for event detection and localization in biomedical signals [J].
Khalifa, Yassin ;
Mandic, Danilo ;
Sejdic, Ervin .
INFORMATION FUSION, 2021, 69 :52-72
[7]   Adaptive gradient-based analog hardware architecture for 2D under-sampled signals reconstruction [J].
Lekic, Nedjeljko ;
Zaric, Maja Lakicevic ;
Orovic, Irena ;
Stankovic, Srdjan .
MICROPROCESSORS AND MICROSYSTEMS, 2018, 62 :72-78
[8]   Comprehensive Graph Gradual Pruning for Sparse Training in Graph Neural Networks [J].
Liu, Chuang ;
Ma, Xueqi ;
Zhan, Yibing ;
Ding, Liang ;
Tao, Dapeng ;
Du, Bo ;
Hu, Wenbin ;
Mandic, Danilo P. .
IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (10) :14903-14917
[9]  
Londhe A. R., 1980, Nonparametric density estimation using kernels with variable size windows, DOI 10.31274/rtd-180813-3561
[10]  
Mandic J., 2001, Recurrent Neural Networks for Prediction: Learning Algorithms, Architectures and Stability, DOI 10.1002/047084535X.11Y