KDD 2023 | Large Language Model Paper Collection

The ACM SIGKDD Conference (International Conference on Knowledge Discovery and Data Mining, KDD), first held in 1989, is the longest-running and largest top-tier international conference in data mining. It was also the first conference to introduce concepts such as big data, data science, predictive analytics, and crowdsourcing, and it attracts large numbers of researchers and practitioners in data mining, machine learning, big data, and artificial intelligence every year.
Using AI techniques, AMiner has categorized the papers accepted at KDD 2023; today's post shares papers on the large-model theme. (Due to space constraints, only a selection of papers is shown here; see the KDD conference page linked at the end for the full list.)
1.WebGLM: Towards An Efficient Web-Enhanced Question Answering System with Human Preferences
https://www.aminer.cn/pub/64893b17d68f896efa9826b7/
2.Pre-training Antibody Language Models for Antigen-Specific Computational Antibody Design
https://www.aminer.cn/pub/64af9a063fda6d7f065a6c00/
3.LightToken: A Task and Model-agnostic Lightweight Token Embedding Framework for Pre-trained Language Models
https://www.aminer.cn/pub/64af99fd3fda6d7f065a62f2/
4.JiuZhang 2.0: A Unified Chinese Pre-trained Language Model for Multi-task Mathematical Problem Solving
https://www.aminer.cn/pub/64af9a043fda6d7f065a6a45/
5.BERT4CTR: An Efficient Framework to Combine Pre-trained Language Model with Non-textual Features for CTR Prediction
https://www.aminer.cn/pub/64af9a043fda6d7f065a69f8/
6.RecruitPro: A Pretrained Language Model with Skill-Aware Prompt Learning for Intelligent Recruitment
https://www.aminer.cn/pub/64af9a053fda6d7f065a6b09/
7.QUERT: Continual Pre-training of Language Model for Query Understanding in Travel Domain Search
https://www.aminer.cn/pub/6487e9fad68f896efa482b50/
8.Automated 3D Pre-Training for Molecular Property Prediction
https://www.aminer.cn/pub/64893b17d68f896efa982588/
9.GLM-Dialog: Noise-tolerant Pre-training for Knowledge-grounded Dialogue Generation
https://www.aminer.cn/pub/63fec3cd90e50fcafdd70322/
10.CodeGeeX: A Pre-Trained Model for Code Generation with Multilingual Benchmarking on HumanEval-X
https://www.aminer.cn/pub/64264f7b90e50fcafd68e145/
11.Graph-Aware Language Model Pre-Training on a Large Graph Corpus Can Help Multiple Graph Applications
https://www.aminer.cn/pub/647eaf51d68f896efad41cdb/
12.Text Is All You Need: Learning Language Representations for Sequential Recommendation
https://www.aminer.cn/pub/646d863cd68f896efa09f2e5/
KDD conference page: https://www.aminer.cn/conf/5ea1b22bedb6e7d53c00c41b/KDD2023