Estimation Under Model Misspecification With Fake Features

Cited by: 10
Authors
Hellkvist, Martin [1 ]
Ozcelikkale, Ayca [1 ]
Ahlen, Anders [1 ]
Affiliations
[1] Uppsala Univ, Dept Elect Engn, S-75121 Uppsala, Sweden
Funding
Swedish Research Council;
Keywords
Estimation; Data models; Dictionaries; Covariance matrices; Noise measurement; Noise level; Focusing; Model uncertainty; model mismatch; robustness; LINEAR-MODEL; REGULARIZATION; REGRESSION; BOUNDS;
DOI
10.1109/TSP.2023.3237174
Chinese Library Classification
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology];
Discipline Classification Codes
0808; 0809;
Abstract
We consider estimation under model misspecification, where there is a mismatch between the underlying system, which generates the data, and the model used during estimation. We propose a model misspecification framework that enables a joint treatment of two types of misspecification: the presence of fake features and incorrect covariance assumptions on the unknowns and the noise. Here, fake features are features that are included in the model but are not present in the underlying system. We present a decomposition of the output error into components that relate to different subsets of the model parameters, corresponding to underlying, fake, and missing features. Under this framework, we characterize the estimation performance and reveal trade-offs between the number of samples, the number of fake features, and the possibly incorrect noise level assumption. In contrast to existing work focusing on incorrect covariance assumptions or missing features, fake features are a central component of our framework. Our results show that fake features can significantly improve the estimation performance, even though they are not correlated with the features in the underlying system. In particular, we show that the estimation error can be decreased by including more fake features in the model, even to the point where the model is overparametrized, i.e., the model contains more unknowns than observations.
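The closing claim, that adding fake features can reduce the output error even past the point of overparametrization, can be probed with a small simulation. The sketch below is an illustration under assumed standard Gaussian features and a minimum-norm least-squares estimator, not the paper's analytical framework; the dimensions, noise level, and variable names are all illustrative choices.

```python
import numpy as np

# Data from an underlying linear system with p_true features;
# the model's dictionary additionally contains k "fake" columns
# that are statistically independent of the system.
rng = np.random.default_rng(0)
n, p_true = 30, 20                                  # observations, underlying features
X_true = rng.standard_normal((n, p_true))
w_true = rng.standard_normal(p_true) / np.sqrt(p_true)
y = X_true @ w_true + 0.5 * rng.standard_normal(n)  # noisy system output

def output_error(k, n_test=2000):
    """Mean-squared output error of the minimum-norm least-squares
    estimate when k fake features are appended to the model."""
    X_fake = rng.standard_normal((n, k))            # fake columns, uncorrelated with X_true
    X = np.hstack([X_true, X_fake])
    w_hat = np.linalg.pinv(X) @ y                   # min-norm LS estimate
    Xt_true = rng.standard_normal((n_test, p_true)) # fresh system features
    Xt_fake = rng.standard_normal((n_test, k))      # fresh fake features
    y_sys = Xt_true @ w_true                        # noiseless system output
    y_hat = Xt_true @ w_hat[:p_true] + Xt_fake @ w_hat[p_true:]
    return float(np.mean((y_sys - y_hat) ** 2))

for k in (0, 10, 50, 200):                          # k = 10 hits p_true + k == n
    print(f"fake features k={k:3d}  output error = {output_error(k):.3f}")
```

In such toy runs the error typically peaks near the interpolation threshold (total unknowns equal to observations) and then decreases again as more fake features are added, in the spirit of the trade-offs the abstract describes; the paper derives these effects analytically, including the role of incorrect covariance and noise-level assumptions.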
Pages: 47-60
Page count: 14