A method for estimating coding gain of an orthogonal wavelet transform considering higher-order statistics

Cited by: 0
Authors
Yokota, Y [1 ]
Usui, S
Affiliations
[1] Gifu Univ, Fac Engn, Dept Elect & Comp Engn, Gifu 50111, Japan
[2] Toyohashi Univ Technol, Dept Informat & Comp Sci, Toyohashi, Aichi 441, Japan
Source
ELECTRONICS AND COMMUNICATIONS IN JAPAN PART III-FUNDAMENTAL ELECTRONIC SCIENCE | 1999, Vol. 82, No. 1
Keywords
subband coding; orthogonal wavelet transform; coding gain; higher-order statistics; generalized Gaussian distribution;
DOI
10.1002/(SICI)1520-6440(199901)82:1<58::AID-ECJC7>3.0.CO;2-3
Chinese Library Classification
TM [Electrical Engineering]; TN [Electronic and Communication Technology]
Discipline codes
0808; 0809
Abstract
In applications such as subband coding, the coding gain is a useful measure of how much a subband filter reduces the bit rate required for the signal being coded, and it is widely used both to estimate coding performance and to design subband filters. Estimating the coding gain requires knowledge of the probability density functions of the signal to be coded and of the decomposed subband series; identifying these density functions, however, is generally difficult. Consequently, an approximation is often used in which the probability density functions of the signal and of all subband series are assumed to be identical. For data such as images, where this assumption does not hold, the estimated coding gain therefore contains an error. This paper observes that the probability density of a subband-decomposed image is well approximated by a generalized Gaussian distribution, and proposes a method for more accurately estimating the coding gain of the orthogonal wavelet transform realized by a variety of subband filters, using higher-order statistics of the signal to be coded. The coding gain is estimated for several coding targets and compared with actually measured values, showing that the proposed method greatly improves estimation accuracy over the conventional approximation-based method. (C) 1998 Scripta Technica.
Pages: 58-67
Page count: 10
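
For readers who want to experiment with the quantities the abstract refers to, the following is a minimal sketch, not the authors' estimator: it computes the classical variance-based coding gain of an orthogonal wavelet decomposition (weighted arithmetic mean of subband variances over their weighted geometric mean) and fits a generalized Gaussian shape parameter to each subband by matching its sample kurtosis, the kind of higher-order statistic the paper exploits. PyWavelets (pywt), the db4 filter, the Laplacian test signal, and the helper names are illustrative assumptions, not taken from the paper.

```python
import numpy as np
import pywt                         # PyWavelets: orthogonal wavelet decomposition
from scipy.special import gamma
from scipy.optimize import brentq

def ggd_kurtosis(nu):
    """Kurtosis of a zero-mean generalized Gaussian with shape parameter nu
    (nu = 2 is Gaussian, nu = 1 is Laplacian)."""
    return gamma(5.0 / nu) * gamma(1.0 / nu) / gamma(3.0 / nu) ** 2

def fit_ggd_shape(x):
    """Estimate the GGD shape parameter of subband coefficients by matching
    their sample kurtosis (a fourth-order statistic)."""
    x = np.asarray(x, dtype=float)
    k = np.mean((x - x.mean()) ** 4) / np.var(x) ** 2
    # GGD kurtosis decreases monotonically in nu; bracket and solve numerically.
    return brentq(lambda nu: ggd_kurtosis(nu) - k, 0.2, 10.0)

def subband_coding_gain(subbands):
    """Classical coding gain of an orthogonal subband decomposition:
    weighted arithmetic mean of subband variances divided by their weighted
    geometric mean, with weights equal to the relative subband sizes."""
    var = np.array([np.var(s) for s in subbands])
    n = np.array([s.size for s in subbands], dtype=float)
    w = n / n.sum()
    return np.sum(w * var) / np.prod(var ** w)

# Example: 3-level orthogonal wavelet decomposition of a heavy-tailed signal.
rng = np.random.default_rng(0)
x = rng.laplace(size=4096)
coeffs = pywt.wavedec(x, "db4", level=3)   # [cA3, cD3, cD2, cD1]

print("coding gain (variance only):", subband_coding_gain(coeffs))
print("GGD shape per subband      :", [round(fit_ggd_shape(c), 2) for c in coeffs])
```

A fitted shape parameter near 2 indicates a roughly Gaussian subband, while values well below 2 indicate heavy-tailed coefficients; it is in the latter case that the conventional same-distribution approximation criticized in the abstract becomes inaccurate.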