Hierarchical Schema Representation for Text-to-SQL Parsing With Decomposing Decoding

Cited by: 3
Authors
Song, Meina [1]
Zhan, Zecheng [1]
E, Haihong [1]
Affiliations
[1] Beijing Univ Posts & Telecommun, Sch Comp Sci, Beijing 100876, Peoples R China
Keywords
Semantic parsing; SQL generation; deep learning; neural network; graph encoder; natural language processing
DOI
10.1109/ACCESS.2019.2931464
CLC number (Chinese Library Classification)
TP [Automation Technology, Computer Technology]
Discipline code
0812
Abstract
Most existing studies on parsing natural language (NL) into structured query language (SQL) consider neither the complex structure of the database schema nor the gap between NL and the SQL query. In this paper, we propose a schema-aware neural network with a decomposing architecture, namely HSRNet, which aims to address the complex, cross-domain Text-to-SQL generation task. HSRNet models the relationships within the database schema as a hierarchical schema graph and employs a graph network to encode this information into the sentence representation. Instead of generating end to end, HSRNet decomposes the generation process into three phases. Given an input question and schema, we first choose the column candidates and then generate the sketch grammar of the SQL query. Finally, a detail completion module fills in the details based on the column candidates and the corresponding sketch. We demonstrate the effectiveness of our hierarchical schema representation by incorporating the information into different baselines. We further show that the decomposing architecture significantly improves the performance of our model. Evaluation on the Spider benchmark shows that the hierarchical schema representation and the decomposing architecture improve our parser's results by 14.5% and 4.3%, respectively.
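The three-phase decomposition described above can be illustrated with a minimal structural sketch. Everything below is a hypothetical stand-in, not the authors' code: the real HSRNet replaces each function with a neural module (a graph encoder over the hierarchical schema, a sketch decoder, and a detail-completion decoder), whereas here simple string heuristics stand in for each phase.

```python
# Illustrative sketch of HSRNet's decomposed Text-to-SQL pipeline.
# Function names, the slot tokens, and the toy schema are all assumptions
# made for this example; only the three-phase structure follows the abstract.

def select_column_candidates(question, schema):
    # Phase 1 (stand-in): keep schema columns whose names appear in the question.
    # HSRNet instead scores columns using the graph-encoded schema representation.
    return [col for col in schema["columns"] if col in question.lower()]

def generate_sketch(question, candidates):
    # Phase 2 (stand-in): emit a coarse SQL sketch with slots for the details.
    return "SELECT _COL_ FROM _TAB_ WHERE _COL_ _OP_ _VAL_"

def complete_details(sketch, candidates, schema):
    # Phase 3 (stand-in): fill the sketch's slots from the column candidates.
    col = candidates[0]
    return (sketch.replace("_COL_", col, 1)
                  .replace("_TAB_", schema["table"])
                  .replace("_COL_ _OP_ _VAL_", f"{col} = ?"))

def text_to_sql(question, schema):
    candidates = select_column_candidates(question, schema)
    sketch = generate_sketch(question, candidates)
    return complete_details(sketch, candidates, schema)

schema = {"table": "singer", "columns": ["name", "age", "country"]}
print(text_to_sql("What is the name of the oldest singer?", schema))
# → SELECT name FROM singer WHERE name = ?
```

The point of the decomposition is that each phase conditions on the previous one's output, so the final decoder only fills slots instead of generating a full query token by token.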
Pages: 103706-103715
Page count: 10