50 records in total
- [1] Multi-Armed Recommender System Bandit Ensembles. RecSys 2019: 13th ACM Conference on Recommender Systems, 2019: 432-436
- [2] Scaling Multi-Armed Bandit Algorithms. KDD '19: Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 2019: 1449-1459
- [3] A Multi-Armed Bandit Model Selection for Cold-Start User Recommendation. Proceedings of the 25th Conference on User Modeling, Adaptation and Personalization (UMAP '17), 2017: 32-40
- [4] Characterizing Truthful Multi-Armed Bandit Mechanisms. 10th ACM Conference on Electronic Commerce (EC 2009), 2009: 79-88
- [6] A Bayesian Multi-armed Bandit Approach for Identifying Human Vulnerabilities. Decision and Game Theory for Security (GameSec 2018), 2018, vol. 11199: 521-539
- [8] Repeated Dollar Auctions: A Multi-Armed Bandit Approach. AAMAS '16: Proceedings of the 2016 International Conference on Autonomous Agents & Multiagent Systems, 2016: 579-587
- [9] Interface Design Optimization as a Multi-Armed Bandit Problem. 34th Annual CHI Conference on Human Factors in Computing Systems (CHI 2016), 2016: 4142-4153