Daya Guo
Verified email at mail2.sysu.edu.cn - Homepage
Title
Cited by
Year
CodeBERT: A pre-trained model for programming and natural languages
Z Feng, D Guo, D Tang, N Duan, X Feng, M Gong, L Shou, B Qin, T Liu, ...
arXiv preprint arXiv:2002.08155, 2020
1723 · 2020
GraphCodeBERT: Pre-training code representations with data flow
D Guo, S Ren, S Lu, Z Feng, D Tang, S Liu, L Zhou, N Duan, ...
arXiv preprint arXiv:2009.08366, 2020
799* · 2020
CodeXGLUE: A machine learning benchmark dataset for code understanding and generation
S Lu, D Guo, S Ren, J Huang, A Svyatkovskiy, A Blanco, C Clement, ...
arXiv preprint arXiv:2102.04664, 2021
688* · 2021
UniXcoder: Unified cross-modal pre-training for code representation
D Guo, S Lu, N Duan, Y Wang, M Zhou, J Yin
arXiv preprint arXiv:2203.03850, 2022
271 · 2022
CodeBLEU: A method for automatic evaluation of code synthesis
S Ren, D Guo, S Lu, L Zhou, S Liu, D Tang, N Sundaresan, M Zhou, ...
arXiv preprint arXiv:2009.10297, 2020
224 · 2020
Graph-based reasoning over heterogeneous external knowledge for commonsense question answering
S Lv, D Guo, J Xu, D Tang, N Duan, M Gong, L Shou, D Jiang, G Cao, ...
Proceedings of the AAAI Conference on Artificial Intelligence 34 (05), 8449-8456, 2020
182 · 2020
Baize: An open-source chat model with parameter-efficient tuning on self-chat data
C Xu, D Guo, N Duan, J McAuley
arXiv preprint arXiv:2304.01196, 2023
128 · 2023
Dialog-to-action: Conversational question answering over a large-scale knowledge base
D Guo, D Tang, N Duan, M Zhou, J Yin
Advances in Neural Information Processing Systems 31, 2018
128 · 2018
Multi-task learning for conversational question answering over a large-scale knowledge base
T Shen, X Geng, T Qin, D Guo, D Tang, N Duan, G Long, D Jiang
arXiv preprint arXiv:1910.05069, 2019
87 · 2019
Automating code review activities by large-scale pre-training
Z Li, S Lu, D Guo, N Duan, S Jannu, G Jenks, D Majumder, J Green, ...
Proceedings of the 30th ACM Joint European Software Engineering Conference …, 2022
77* · 2022
Question generation from SQL queries improves neural semantic parsing
D Guo, Y Sun, D Tang, N Duan, J Yin, H Chi, J Cao, P Chen, M Zhou
arXiv preprint arXiv:1808.06304, 2018
54 · 2018
ReACC: A retrieval-augmented code completion framework
S Lu, N Duan, H Han, D Guo, S Hwang, A Svyatkovskiy
arXiv preprint arXiv:2203.07722, 2022
52 · 2022
Coupling retrieval and meta-learning for context-dependent semantic parsing
D Guo, D Tang, N Duan, M Zhou, J Yin
arXiv preprint arXiv:1906.07108, 2019
52 · 2019
Syntax-enhanced pre-trained model
Z Xu, D Guo, D Tang, Q Su, L Shou, M Gong, W Zhong, X Quan, N Duan, ...
arXiv preprint arXiv:2012.14116, 2020
46 · 2020
Learning to complete code with sketches
D Guo, A Svyatkovskiy, J Yin, N Duan, M Brockschmidt, M Allamanis
arXiv preprint arXiv:2106.10158, 2021
42 · 2021
LaPraDoR: Unsupervised pretrained dense retriever for zero-shot text retrieval
C Xu, D Guo, N Duan, J McAuley
arXiv preprint arXiv:2203.06169, 2022
36 · 2022
AR-LSAT: Investigating analytical reasoning of text
W Zhong, S Wang, D Tang, Z Xu, D Guo, J Wang, J Yin, M Zhou, N Duan
arXiv preprint arXiv:2104.06598, 2021
18* · 2021
Multi-modal representation learning for short video understanding and recommendation
D Guo, J Hong, B Luo, Q Yan, Z Niu
2019 IEEE International Conference on Multimedia & Expo Workshops (ICMEW …, 2019
13 · 2019
Soft-Labeled Contrastive Pre-training for Function-level Code Representation
X Li, D Guo, Y Gong, Y Lin, Y Shen, X Qiu, D Jiang, W Chen, N Duan
arXiv preprint arXiv:2210.09597, 2022
12 · 2022
LongCoder: A Long-Range Pre-trained Language Model for Code Completion
D Guo, C Xu, N Duan, J Yin, J McAuley
arXiv preprint arXiv:2306.14893, 2023
8 · 2023
Articles 1–20