An Empirical Bayes Optimal Discovery Procedure Based on Semiparametric Hierarchical Mixture Models

Cited by: 0
Authors
Noma, Hisashi [1 ]
Matsui, Shigeyuki [1 ]
Affiliations
[1] Inst Stat Math, Dept Data Sci, Tachikawa, Tokyo 1908562, Japan
Funding
Japan Society for the Promotion of Science
Keywords
DIFFERENTIAL GENE-EXPRESSION; MAXIMUM-LIKELIHOOD; MICROARRAY DATA; INFERENCE; SELECTION; RATES; POWER; SIZE;
DOI
10.1155/2013/568480
Chinese Library Classification (CLC)
Q [Biological Sciences]
Subject classification codes
07; 0710; 09
Abstract
Multiple testing has been widely adopted in genome-wide studies such as microarray experiments. For effective gene selection in these studies, Storey (Journal of the Royal Statistical Society, Series B, vol. 69, no. 3, pp. 347-368, 2007) developed the optimal discovery procedure (ODP), which maximizes the expected number of true positives for each fixed expected number of false positives, as a multiple-testing extension of the most powerful test for a single hypothesis. In this paper, we develop an empirical Bayes method for implementing the ODP based on a semiparametric hierarchical mixture model using the "smoothing-by-roughening" approach. Under the semiparametric hierarchical mixture model, (i) the prior distribution can be modeled flexibly, (ii) the ODP test statistic and the posterior distribution are analytically tractable, and (iii) the computations are easy to implement. In addition, we provide a significance rule based on the false discovery rate (FDR) within the empirical Bayes framework. Applications to two clinical studies are presented.
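A minimal sketch, in our own notation, of the ODP statistic described above and of one standard empirical Bayes plug-in under a two-group hierarchical mixture; the symbols (f_{0i}, f_{1i}, pi_0, G) and the particular plug-in estimator are assumptions for illustration, and the paper's semiparametric specification may differ.

% ODP statistic (Storey, 2007): with true null densities f_{0i} and true
% alternative densities f_{1i}, every test i applies the same statistic to
% its own data y_i and is called significant when it exceeds a threshold lambda.
\[
S_{\mathrm{ODP}}(y) = \frac{\sum_{i:\,H_i\ \mathrm{false}} f_{1i}(y)}{\sum_{i:\,H_i\ \mathrm{true}} f_{0i}(y)},
\qquad \text{reject } H_i \text{ if } S_{\mathrm{ODP}}(y_i) \ge \lambda .
\]
% One standard empirical Bayes plug-in (an assumption here, not necessarily
% the paper's exact estimator): under the two-group hierarchical mixture
% f(y) = pi_0 f_0(y) + (1 - pi_0) \int f(y | theta) dG(theta), with G
% estimated semiparametrically (e.g., by smoothing-by-roughening),
\[
\widehat{S}(y) = \frac{(1-\widehat{\pi}_0)\int f(y\mid\theta)\, d\widehat{G}(\theta)}{\widehat{\pi}_0\, f_0(y)},
\]
% which is the estimated posterior odds that a test with data y is non-null.
% A Bayesian FDR estimate for the rule "reject if S-hat >= lambda" averages
% the posterior null probabilities over the rejected tests:
\[
\widehat{\mathrm{FDR}}(\lambda) = \frac{\sum_{i:\,\widehat{S}(y_i)\ge\lambda} \Pr(H_i\ \mathrm{null}\mid y_i)}{\#\{\,i:\ \widehat{S}(y_i)\ge\lambda\,\}} .
\]

Roughly, under an exchangeable hierarchical prior the ODP ranking is monotone in these posterior odds, so estimating pi_0, f_0, and G is all that is needed to compute the statistic and apply the FDR-based significance rule.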
Pages: 9
Related articles (21 in total)
  • [1] Empirical Bayes ranking and selection methods via semiparametric hierarchical mixture models in microarray studies
    Noma, Hisashi
    Matsui, Shigeyuki
    STATISTICS IN MEDICINE, 2013, 32 (11) : 1904 - 1916
  • [2] The optimal discovery procedure in multiple significance testing: an empirical Bayes approach
    Noma, Hisashi
    Matsui, Shigeyuki
    STATISTICS IN MEDICINE, 2012, 31 (02) : 165 - 176
  • [3] Empirical Bayes estimators in hierarchical models with mixture priors
    Rosenkranz, Gerd K.
    JOURNAL OF APPLIED STATISTICS, 2018, 45 (16) : 2958 - 2980
  • [4] An empirical Bayes procedure for the selection of Gaussian graphical models
    Donnet, Sophie
    Marin, Jean-Michel
    STATISTICS AND COMPUTING, 2012, 22 (05) : 1113 - 1123
  • [5] IDENTIFIABILITY OF NONPARAMETRIC MIXTURE MODELS AND BAYES OPTIMAL CLUSTERING
    Aragam, Bryon
    Dan, Chen
    Xing, Eric P.
    Ravikumar, Pradeep
    ANNALS OF STATISTICS, 2020, 48 (04) : 2277 - 2302
  • [6] Hierarchical Bayes based Adaptive Sparsity in Gaussian Mixture Model
    Wang, Binghui
    Lin, Chuang
    Fan, Xin
    Jiang, Ning
    Farina, Dario
    PATTERN RECOGNITION LETTERS, 2014, 49 : 238 - 247
  • [7] Empirical Bayes estimation utilizing finite Gaussian Mixture Models
    Orellana, Rafael
    Carvajal, Rodrigo
    Aguero, Juan C.
2019 IEEE CHILEAN CONFERENCE ON ELECTRICAL, ELECTRONICS ENGINEERING, INFORMATION AND COMMUNICATION TECHNOLOGIES (CHILECON), 2019
  • [8] A two-step estimation procedure for semiparametric mixture cure models
    Musta, Eni
    Patilea, Valentin
    Van Keilegom, Ingrid
    SCANDINAVIAN JOURNAL OF STATISTICS, 2024, 51 (03) : 987 - 1011
  • [9] Quasi-Bayes properties of a procedure for sequential learning in mixture models
    Fortini, Sandra
    Petrone, Sonia
    JOURNAL OF THE ROYAL STATISTICAL SOCIETY SERIES B-STATISTICAL METHODOLOGY, 2020, 82 (04) : 1087 - 1114
  • [10] Comparative Analysis of Empirical Bayes and Bayesian Hierarchical Models in Hotspot Identification
    Guo, Xiaoyu
    Wu, Lingtao
    Zou, Yajie
    Fawcett, Lee
    TRANSPORTATION RESEARCH RECORD, 2019, 2673 (07) : 111 - 121